## About migrating from GitLab with GitHub Actions Importer

The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate GitLab pipelines to {% data variables.product.prodname_actions %}.

### Prerequisites

* A GitLab account or organization with pipelines and jobs that you want to convert to {% data variables.product.prodname_actions %} workflows.
* Access to create a GitLab {% data variables.product.pat_generic %} for your account or organization.
{% data reusables.actions.actions-importer-prerequisites %}

### Limitations

There are some limitations on migrating processes automatically from GitLab pipelines to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}.

* Automatic caching in between jobs of different workflows is not supported.
* The `audit` command is only supported when using an organization account. However, the `dry-run` and `migrate` commands can be used with an organization or user account.

#### Manual tasks

Certain GitLab constructs must be migrated manually. These include:

* Masked project or group variable values
* Artifact reports

For more information on manual migrations, see [AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-gitlab-cicd-to-github-actions).

## Installing the {% data variables.product.prodname_actions_importer %} CLI extension

{% data reusables.actions.installing-actions-importer %}

## Configuring credentials

The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with GitLab and {% data variables.product.prodname_dotcom %}.

1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see [AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic).

   Your token must have the `workflow` scope. After creating the token, copy it and save it in a safe location for later use.
1. Create a GitLab {% data variables.product.pat_generic %}. For more information, see [{% data variables.product.pat_generic_caps_plural %}](https://docs.gitlab.com/ee/user/profile/personal_access_tokens.html#create-a-personal-access-token) in the GitLab documentation.

   Your token must have the `read_api` scope. After creating the token, copy it and save it in a safe location for later use.
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:

   ```shell
   gh actions-importer configure
   ```

   The `configure` command will prompt you for the following information:

   * For "Which CI providers are you configuring?", use the arrow keys to select `GitLab`, press `Space` to select it, then press `Enter`.
   * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press `Enter`.
   * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for {% data variables.location.product_location_enterprise %}, and press `Enter`.{% else %}press `Enter` to accept the default value (`https://github.com`).{% endif %}
   * For "Private token for GitLab", enter the value for the GitLab {% data variables.product.pat_generic %} that you created earlier, and press `Enter`.
   * For "Base url of the GitLab instance", enter the URL of your GitLab instance, and press `Enter`.

   An example of the output of the `configure` command is shown below.

   ```shell
   $ gh actions-importer configure
   ✔ Which CI providers are you configuring?: GitLab
   Enter the following values (leave empty to omit):
   ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
   ✔ Base url of the GitHub instance: https://github.com
   ✔ Private token for GitLab: ***************
   ✔ Base url of the GitLab instance: http://localhost
   Environment variables successfully updated.
   ```
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:

   ```shell
   gh actions-importer update
   ```
   The output of the command should be similar to below:

   ```shell
   Updating ghcr.io/actions-importer/cli:latest...
   ghcr.io/actions-importer/cli:latest up-to-date
   ```

## Perform an audit of GitLab

You can use the `audit` command to get a high-level view of all pipelines in a GitLab server.

The `audit` command performs the following steps:

1. Fetches all of the projects defined in a GitLab server.
1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
1. Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.

### Prerequisites for the audit command

In order to use the `audit` command, you must have a {% data variables.product.pat_generic %} configured with a GitLab organization account.

### Running the audit command

To perform an audit of a GitLab server, run the following command in your terminal, replacing `my-gitlab-namespace` with the namespace or group you are auditing:

```shell
gh actions-importer audit gitlab --output-dir tmp/audit --namespace my-gitlab-namespace
```

### Inspecting the audit results

{% data reusables.actions.gai-inspect-audit %}

## Forecast potential build runner usage

You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your GitLab server.

### Running the forecast command

To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal, replacing `my-gitlab-namespace` with the namespace or group you are forecasting. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.

```shell
gh actions-importer forecast gitlab --output-dir tmp/forecast --namespace my-gitlab-namespace
```

### Forecasting an entire namespace

To forecast an entire namespace and all of its subgroups, you must specify each subgroup in the `--namespace` argument or `NAMESPACE` environment variable. For example:

```shell
gh actions-importer forecast gitlab --namespace my-gitlab-namespace my-gitlab-namespace/subgroup-one my-gitlab-namespace/subgroup-two ...
```

### Inspecting the forecast report

The `forecast_report.md` file in the specified output directory contains the results of the forecast.

Listed below are some key terms that can appear in the forecast report:

* The **job count** is the total number of completed jobs.
* The **pipeline count** is the number of unique pipelines used.
* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
  * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
* **Concurrent jobs** metrics describe the amount of jobs running at any given time. This metric can be used to define the number of runners you should configure.

Additionally, these metrics are defined for each queue of runners in GitLab. This is especially useful if there is a mix of hosted or self-hosted runners, or high or low spec machines, so you can see metrics specific to different types of runners.
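To make the **execution time** metric concrete, here is a small arithmetic sketch. The per-minute rate used below is a hypothetical placeholder, not a published price; use the pricing calculator linked above for real figures.

```python
# Sketch: translate forecasted execution time into an estimated spend.
# The rate is a hypothetical placeholder; actual per-minute pricing depends
# on runner OS and hardware (see the GitHub Actions pricing calculator).

def estimate_cost(execution_minutes: int, rate_per_minute: float) -> float:
    """Estimated spend = total runner minutes x per-minute rate."""
    return execution_minutes * rate_per_minute

# 12,000 forecasted minutes at a hypothetical $0.008/minute
print(f"${estimate_cost(12_000, 0.008):.2f}")  # -> $96.00
```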
## Perform a dry-run migration of a GitLab pipeline

You can use the `dry-run` command to convert a GitLab pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.

### Running the dry-run command
You can use the `dry-run` command to convert a GitLab pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.

To perform a dry run of migrating your GitLab pipelines to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `my-gitlab-project` with your GitLab project slug, and `my-gitlab-namespace` with the namespace or group (full group path for subgroups, for example `my-org/my-team`) you are performing a dry run for.

```shell
gh actions-importer dry-run gitlab --output-dir tmp/dry-run --namespace my-gitlab-namespace --project my-gitlab-project
```

### Inspecting the converted workflows

You can view the logs of the dry run and the converted workflow files in the specified output directory.

{% data reusables.actions.gai-custom-transformers-rec %}

## Perform a production migration of a GitLab pipeline

You can use the `migrate` command to convert a GitLab pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.

### Running the migrate command

To migrate a GitLab pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the following values:

* `target-url` value with the URL for your {% data variables.product.github %} repository
* `my-gitlab-project` with your GitLab project slug
* `my-gitlab-namespace` with the namespace or group you are migrating (full path for subgroups, for example `my-org/my-team`)

```shell
gh actions-importer migrate gitlab --target-url https://github.com/:owner/:repo --output-dir tmp/migrate --namespace my-gitlab-namespace --project my-gitlab-project
```

The command's output includes the URL to the pull request that adds the converted workflow to your repository.

An example of a successful output is similar to the following:

```shell
$ gh actions-importer migrate gitlab --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --namespace octo-org --project monas-project
[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
```

{% data reusables.actions.gai-inspect-pull-request %}

## Reference

This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from GitLab.

### Using environment variables

{% data reusables.actions.gai-config-environment-variables %}

{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your GitLab instance:

* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires the `workflow` scope).
* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
* `GITLAB_ACCESS_TOKEN`: The GitLab {% data variables.product.pat_generic %} used to view GitLab resources.
* `GITLAB_INSTANCE_URL`: The URL of the GitLab instance.
* `NAMESPACE`: The namespaces or groups that contain the GitLab pipelines.

These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.

### Using optional arguments

{% data reusables.actions.gai-optional-arguments-intro %}

#### `--source-file-path`

You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead. For example:

```shell
gh actions-importer dry-run gitlab --output-dir output/ --namespace my-gitlab-namespace --project my-gitlab-project --source-file-path path/to/.gitlab-ci.yml
```

If you would like to supply multiple source files when running the `forecast` subcommand, you can use pattern matching in the file path value. The following example supplies {% data variables.product.prodname_actions_importer %} with any source files that match the `./tmp/previous_forecast/jobs/*.json` file path.

```shell
gh actions-importer forecast gitlab --output-dir output/ --namespace my-gitlab-namespace --project my-gitlab-project --source-file-path ./tmp/previous_forecast/jobs/*.json
```
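The wildcard in the pattern above is ordinary shell-style globbing, nothing importer-specific. As a quick sketch of which paths such a pattern selects (the file names here are made up for illustration):

```python
# Sketch: which file paths a shell-style pattern like *.json selects.
# fnmatch implements the same *, ?, and [...] wildcards the shell uses.
from fnmatch import fnmatch

paths = [
    "tmp/previous_forecast/jobs/100.json",
    "tmp/previous_forecast/jobs/101.json",
    "tmp/previous_forecast/jobs/notes.txt",
]

matched = [p for p in paths if fnmatch(p, "tmp/previous_forecast/jobs/*.json")]
print(matched)  # the two .json paths match; notes.txt does not
```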
#### `--config-file-path`

You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead. The `--config-file-path` argument can also be used to specify which repository a converted reusable workflow should be migrated to.

##### Audit example

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.

```shell
gh actions-importer audit gitlab --output-dir path/to/output/ --namespace my-gitlab-namespace --config-file-path path/to/gitlab/config.yml
```

To audit a GitLab instance using a configuration file, the file must be in the following format, and each `repository_slug` value must be unique:

```yaml
source_files:
  - repository_slug: namespace/project-name
    path: path/to/.gitlab-ci.yml
  - repository_slug: namespace/some-other-project-name
    path: path/to/.gitlab-ci.yml
```

##### Dry run example

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.

The pipeline is selected by matching the `repository_slug` in the configuration file to the value of the `--namespace` and `--project` options. The `path` is then used to pull the specified source file.

```shell
gh actions-importer dry-run gitlab --namespace my-gitlab-namespace --project my-gitlab-project-name --output-dir ./output/ --config-file-path ./path/to/gitlab/config.yml
```

##### Specify the repository of converted reusable workflows

{% data variables.product.prodname_actions_importer %} uses the YAML file provided to the `--config-file-path` argument to determine the repository that converted reusable workflows are migrated to.

To begin, you should run an audit without the `--config-file-path` argument:

```shell
gh actions-importer audit gitlab --output-dir ./output/
```

The output of this command will contain a file named `config.yml` that contains a list of all the composite actions that were converted by {% data variables.product.prodname_actions_importer %}. For example, the `config.yml` file may have the following contents:

```yaml
reusable_workflows:
  - name: my-reusable-workflow.yml
    target_url: https://github.com/octo-org/octo-repo
    ref: main
```

You can use this file to specify which repository and ref a reusable workflow or composite action should be added to.

You can then use the `--config-file-path` argument to provide the `config.yml` file to {% data variables.product.prodname_actions_importer %}. For example, you can use this file when running a `migrate` command to open a pull request for each unique repository defined in the config file:

```shell
gh actions-importer migrate gitlab --project my-project-name --output-dir output/ --config-file-path config.yml --target-url https://github.com/my-org/my-repo
```

### Supported syntax for GitLab pipelines

The following table shows the type of properties {% data variables.product.prodname_actions_importer %} is currently able to convert. For more details about how GitLab pipeline syntax aligns with {% data variables.product.prodname_actions %}, see [AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-gitlab-cicd-to-github-actions).

| GitLab Pipelines | GitHub Actions | Status |
| :--- | :--- | :--- |
| `after_script` | `jobs.<job_id>.steps` | Supported |
| `auto_cancel_pending_pipelines` | `concurrency` | Supported |
| `before_script` | `jobs.<job_id>.steps` | Supported |
| `build_timeout` or `timeout` | `jobs.<job_id>.timeout-minutes` | Supported |
| `default` | Not applicable | Supported |
| `image` | `jobs.<job_id>.container` | Supported |
| `job` | `jobs.<job_id>` | Supported |
| `needs` | `jobs.<job_id>.needs` | Supported |
| `only_allow_merge_if_pipeline_succeeds` | `on.pull_request` | Supported |
| `resource_group` | `jobs.<job_id>.concurrency` | Supported |
| `schedule` | `on.schedule` | Supported |
| `script` | `jobs.<job_id>.steps` | Supported |
| `stages` | `jobs` | Supported |
| `tags` | `jobs.<job_id>.runs-on` | Supported |
| `variables` | `env`, `jobs.<job_id>.env` | Supported |
| Run pipelines for new commits | `on.push` | Supported |
| Run pipelines manually | `on.workflow_dispatch` | Supported |
| `environment` | `jobs.<job_id>.environment` | Partially supported |
| `include` | Files referenced in an `include` statement are merged into a single job graph before being transformed. | Partially supported |
| `only` or `except` | `jobs.<job_id>.if` | Partially supported |
| `parallel` | `jobs.<job_id>.strategy` | Partially supported |
| `rules` | `jobs.<job_id>.if` | Partially supported |
| `services` | `jobs.<job_id>.services` | Partially supported |
| `workflow` | `if` | Partially supported |

For information about supported GitLab constructs, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/gitlab/index.md).

### Environment variables syntax

{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default GitLab environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
| GitLab | GitHub Actions |
| :--- | :--- |
| `CI_API_V4_URL` | {% raw %}`${{ github.api_url }}`{% endraw %} |
| `CI_BUILDS_DIR` | {% raw %}`${{ github.workspace }}`{% endraw %} |
| `CI_COMMIT_BRANCH` | {% raw %}`${{ github.ref }}`{% endraw %} |
| `CI_COMMIT_REF_NAME` | {% raw %}`${{ github.ref }}`{% endraw %} |
| `CI_COMMIT_REF_SLUG` | {% raw %}`${{ github.ref }}`{% endraw %} |
| `CI_COMMIT_SHA` | {% raw %}`${{ github.sha }}`{% endraw %} |
| `CI_COMMIT_SHORT_SHA` | {% raw %}`${{ github.sha }}`{% endraw %} |
| `CI_COMMIT_TAG` | {% raw %}`${{ github.ref }}`{% endraw %} |
| `CI_JOB_ID` | {% raw %}`${{ github.job }}`{% endraw %} |
| `CI_JOB_MANUAL` | {% raw %}`${{ github.event_name == 'workflow_dispatch' }}`{% endraw %} |
| `CI_JOB_NAME` | {% raw %}`${{ github.job }}`{% endraw %} |
| `CI_JOB_STATUS` | {% raw %}`${{ job.status }}`{% endraw %} |
| `CI_JOB_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %} |
| `CI_JOB_TOKEN` | {% raw %}`${{ github.token }}`{% endraw %} |
| `CI_NODE_INDEX` | {% raw %}`${{ strategy.job-index }}`{% endraw %} |
| `CI_NODE_TOTAL` | {% raw %}`${{ strategy.job-total }}`{% endraw %} |
| `CI_PIPELINE_ID` | {% raw %}`${{ github.repository }}/${{ github.workflow }}`{% endraw %} |
| `CI_PIPELINE_IID` | {% raw %}`${{ github.workflow }}`{% endraw %} |
| `CI_PIPELINE_SOURCE` | {% raw %}`${{ github.event_name }}`{% endraw %} |
| `CI_PIPELINE_TRIGGERED` | {% raw %}`${{ github.actions }}`{% endraw %} |
| `CI_PIPELINE_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %} |
| `CI_PROJECT_DIR` | {% raw %}`${{ github.workspace }}`{% endraw %} |
| `CI_PROJECT_ID` | {% raw %}`${{ github.repository }}`{% endraw %} |
| `CI_PROJECT_NAME` | {% raw %}`${{ github.event.repository.name }}`{% endraw %} |
| `CI_PROJECT_NAMESPACE` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
| `CI_PROJECT_PATH_SLUG` | {% raw %}`${{ github.repository }}`{% endraw %} |
| `CI_PROJECT_PATH` | {% raw %}`${{ github.repository }}`{% endraw %} |
| `CI_PROJECT_ROOT_NAMESPACE` | {% raw %}`${{ github.repository_owner }}`{% endraw %} |
| `CI_PROJECT_TITLE` | {% raw %}`${{ github.event.repository.full_name }}`{% endraw %} |
| `CI_PROJECT_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
| `CI_REPOSITORY_URL` | {% raw %}`${{ github.event.repository.clone_url }}`{% endraw %} |
| `CI_RUNNER_EXECUTABLE_ARCH` | {% raw %}`${{ runner.os }}`{% endraw %} |
| `CI_SERVER_HOST` | {% raw %}`${{ github.server_url }}`{% endraw %} |
| `CI_SERVER_URL` | {% raw %}`${{ github.server_url }}`{% endraw %} |
| `CI_SERVER` | {% raw %}`${{ github.actions }}`{% endraw %} |
| `GITLAB_CI` | {% raw %}`${{ github.actions }}`{% endraw %} |
| `GITLAB_USER_EMAIL` | {% raw %}`${{ github.actor }}`{% endraw %} |
| `GITLAB_USER_ID` | {% raw %}`${{ github.actor }}`{% endraw %} |
| `GITLAB_USER_LOGIN` | {% raw %}`${{ github.actor }}`{% endraw %} |
| `GITLAB_USER_NAME` | {% raw %}`${{ github.actor }}`{% endraw %} |
| `TRIGGER_PAYLOAD` | {% raw %}`${{ github.event_path }}`{% endraw %} |
| `CI_MERGE_REQUEST_ASSIGNEES` | {% raw %}`${{ github.event.pull_request.assignees }}`{% endraw %} |
| `CI_MERGE_REQUEST_ID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
| `CI_MERGE_REQUEST_IID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
| `CI_MERGE_REQUEST_LABELS` | {% raw %}`${{ github.event.pull_request.labels }}`{% endraw %} |
| `CI_MERGE_REQUEST_MILESTONE` | {% raw %}`${{ github.event.pull_request.milestone }}`{% endraw %} |
| `CI_MERGE_REQUEST_PROJECT_ID` | {% raw %}`${{ github.repository }}`{% endraw %} |
| `CI_MERGE_REQUEST_PROJECT_PATH` | {% raw %}`${{ github.repository }}`{% endraw %} |
| `CI_MERGE_REQUEST_PROJECT_URL` | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
| `CI_MERGE_REQUEST_REF_PATH` | {% raw %}`${{ github.ref }}`{% endraw %} |
| `CI_MERGE_REQUEST_SOURCE_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.head.ref }}`{% endraw %} |
| `CI_MERGE_REQUEST_SOURCE_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.head.sha }}`{% endraw %} |
| `CI_MERGE_REQUEST_SOURCE_PROJECT_ID` | {% raw %}`${{ github.event.pull_request.head.repo.full_name }}`{% endraw %} |
| `CI_MERGE_REQUEST_SOURCE_PROJECT_PATH` | {% raw %}`${{ github.event.pull_request.head.repo.full_name }}`{% endraw %} |
| `CI_MERGE_REQUEST_SOURCE_PROJECT_URL` | {% raw %}`${{ github.event.pull_request.head.repo.url }}`{% endraw %} |
| `CI_MERGE_REQUEST_TARGET_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %} |
| `CI_MERGE_REQUEST_TARGET_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.base.sha }}`{% endraw %} |
| `CI_MERGE_REQUEST_TITLE` | {% raw %}`${{ github.event.pull_request.title }}`{% endraw %} |
| `CI_EXTERNAL_PULL_REQUEST_IID` | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
| `CI_EXTERNAL_PULL_REQUEST_SOURCE_REPOSITORY` | {% raw %}`${{ github.event.pull_request.head.repo.full_name }}`{% endraw %} |
| `CI_EXTERNAL_PULL_REQUEST_TARGET_REPOSITORY` | {% raw %}`${{ github.event.pull_request.base.repo.full_name }}`{% endraw %} |
| `CI_EXTERNAL_PULL_REQUEST_SOURCE_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.head.ref }}`{% endraw %} |
| `CI_EXTERNAL_PULL_REQUEST_SOURCE_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.head.sha }}`{% endraw %} |
| `CI_EXTERNAL_PULL_REQUEST_TARGET_BRANCH_NAME` | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %} |
| `CI_EXTERNAL_PULL_REQUEST_TARGET_BRANCH_SHA` | {% raw %}`${{ github.event.pull_request.base.sha }}`{% endraw %} |

## Legal notice

{% data reusables.actions.actions-importer-legal-notice %}
## About migrating from Azure DevOps with GitHub Actions Importer The instructions below will guide you through configuring your environment to use {% data variables.product.prodname\_actions\_importer %} to migrate Azure DevOps pipelines to {% data variables.product.prodname\_actions %}. ### Prerequisites \* An Azure DevOps account or organization with projects and pipelines that you want to convert to {% data variables.product.prodname\_actions %} workflows. \* Access to create an Azure DevOps {% data variables.product.pat\_generic %} for your account or organization. {% data reusables.actions.actions-importer-prerequisites %} ### Limitations There are some limitations when migrating from Azure DevOps to {% data variables.product.prodname\_actions %} with {% data variables.product.prodname\_actions\_importer %}: \* {% data variables.product.prodname\_actions\_importer %} requires version 5.0 of the Azure DevOps API, available in either Azure DevOps Services or Azure DevOps Server 2019. Older versions of Azure DevOps Server are not compatible. \* Tasks that are implicitly added to an Azure DevOps pipeline, such as checking out source code, may be added to a {% data variables.product.prodname\_actions\_importer %} audit as a GUID name. To find the friendly task name for a GUID, you can use the following URL: `https://dev.azure.com/:organization/\_apis/distributedtask/tasks/:guid`. #### Manual tasks Certain Azure DevOps constructs must be migrated manually from Azure DevOps into {% data variables.product.prodname\_actions %} configurations. 
These include: \* Organization, repository, and environment secrets \* Service connections such as OIDC Connect, {% data variables.product.prodname\_github\_apps %}, and {% data variables.product.pat\_generic\_plural %} \* Unknown tasks \* Self-hosted agents \* Environments \* Pre-deployment approvals For more information on manual migrations, see [AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-azure-pipelines-to-github-actions). #### Unsupported tasks {% data variables.product.prodname\_actions\_importer %} does not support migrating the following tasks: \* Pre-deployment gates \* Post-deployment gates \* Post-deployment approvals \* Some resource triggers ## Installing the {% data variables.product.prodname\_actions\_importer %} CLI extension {% data reusables.actions.installing-actions-importer %} ## Configuring credentials The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname\_actions\_importer %} when working with Azure DevOps and {% data variables.product.prodname\_dotcom %}. 1. Create a {% data variables.product.prodname\_dotcom %} {% data variables.product.pat\_v1 %}. For more information, see [AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic). Your token must have the `workflow` scope. After creating the token, copy it and save it in a safe location for later use. 1. Create an Azure DevOps {% data variables.product.pat\_generic %}. For more information, see [Use {% data variables.product.pat\_generic\_plural %}](https://learn.microsoft.com/en-us/azure/devops/organizations/accounts/use-personal-access-tokens-to-authenticate?view=azure-devops&tabs=Windows#create-a-pat) in the Azure DevOps documentation. 
   The token must have the following scopes:

   * Agent Pools: `Read`
   * Build: `Read`
   * Code: `Read`
   * Release: `Read`
   * Service Connections: `Read`
   * Task Groups: `Read`
   * Variable Groups: `Read`

   After creating the token, copy it and save it in a safe location for later use.
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:

   ```shell
   gh actions-importer configure
   ```

   The `configure` command will prompt you for the following information:

   * For "Which CI providers are you configuring?", use the arrow keys to select `Azure DevOps`, press `Space` to select it, then press `Enter`.
   * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press `Enter`.
   * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for {% data variables.location.product_location_enterprise %}, and press `Enter`.{% else %}press `Enter` to accept the default value (`https://github.com`).{% endif %}
   * For "{% data variables.product.pat_generic_caps %} for Azure DevOps", enter the value for the Azure DevOps {% data variables.product.pat_generic %} that you created earlier, and press `Enter`.
   * For "Base url of the Azure DevOps instance", press `Enter` to accept the default value (`https://dev.azure.com`).
   * For "Azure DevOps organization name", enter the name for your Azure DevOps organization, and press `Enter`.
   * For "Azure DevOps project name", enter the name for your Azure DevOps project, and press `Enter`.

   An example of the `configure` command is shown below:

   ```shell
   $ gh actions-importer configure
   ✔ Which CI providers are you configuring?: Azure DevOps
   Enter the following values (leave empty to omit):
   ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
   ✔ Base url of the GitHub instance: https://github.com
   ✔ {% data variables.product.pat_generic_caps %} for Azure DevOps: ***************
   ✔ Base url of the Azure DevOps instance: https://dev.azure.com
   ✔ Azure DevOps organization name: :organization
   ✔ Azure DevOps project name: :project
   Environment variables successfully updated.
   ```

1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to the {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:

   ```shell
   gh actions-importer update
   ```

   The output of the command should be similar to below:

   ```shell
   Updating ghcr.io/actions-importer/cli:latest...
   ghcr.io/actions-importer/cli:latest up-to-date
   ```

## Perform an audit of Azure DevOps

You can use the `audit` command to get a high-level view of all projects in an Azure DevOps organization.

The `audit` command performs the following steps:

1. Fetches all of the projects defined in an Azure DevOps organization.
1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
1. Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.

### Running the audit command

To perform an audit of an Azure DevOps organization, run the following command in your terminal:

```shell
gh actions-importer audit azure-devops --output-dir tmp/audit
```

### Inspecting the audit results

{% data reusables.actions.gai-inspect-audit %}

## Forecast potential {% data variables.product.prodname_actions %} usage

You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in Azure DevOps.

### Running the forecast command

To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.

```shell
gh actions-importer forecast azure-devops --output-dir tmp/forecast_reports
```

### Inspecting the forecast report

The `forecast_report.md` file in the specified output directory contains the results of the forecast.

Listed below are some key terms that can appear in the forecast report:

* The **job count** is the total number of completed jobs.
* The **pipeline count** is the number of unique pipelines used.
* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners. This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
* **Concurrent jobs** metrics describe the number of jobs running at any given time. This metric can be used to define the number of runners you should configure.

Additionally, these metrics are defined for each queue of runners in Azure DevOps. This is especially useful if there is a mix of hosted or self-hosted runners, or high or low spec machines, so you can see metrics specific to different types of runners.

## Perform a dry-run migration

You can use the `dry-run` command to convert an Azure DevOps pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.

{% data reusables.actions.gai-custom-transformers-rec %}

### Running the dry-run command for a build pipeline

To perform a dry run of migrating your Azure DevOps build pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `pipeline_id` with the ID of the pipeline you are converting.

```shell
gh actions-importer dry-run azure-devops pipeline --pipeline-id :pipeline_id --output-dir tmp/dry-run
```

You can view the logs of the dry run and the converted workflow files in the specified output directory.

### Running the dry-run command for a release pipeline

To perform a dry run of migrating your Azure DevOps release pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `pipeline_id` with the ID of the pipeline you are converting.

```shell
gh actions-importer dry-run azure-devops release --pipeline-id :pipeline_id --output-dir tmp/dry-run
```

You can view the logs of the dry run and the converted workflow files in the specified output directory.

## Perform a production migration

You can use the `migrate` command to convert an Azure DevOps pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.
### Running the migrate command for a build pipeline

To migrate an Azure DevOps build pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `pipeline_id` with the ID of the pipeline you are converting.

```shell
gh actions-importer migrate azure-devops pipeline --pipeline-id :pipeline_id --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate
```

The command's output includes the URL of the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:

```shell
$ gh actions-importer migrate azure-devops pipeline --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --azure-devops-project my-azure-devops-project
[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
```

### Running the migrate command for a release pipeline

To migrate an Azure DevOps release pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `pipeline_id` with the ID of the pipeline you are converting.

```shell
gh actions-importer migrate azure-devops release --pipeline-id :pipeline_id --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate
```

The command's output includes the URL of the pull request that adds the converted workflow to your repository.
An example of a successful output is similar to the following:

```shell
$ gh actions-importer migrate azure-devops release --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --azure-devops-project my-azure-devops-project
[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
```

{% data reusables.actions.gai-inspect-pull-request %}

## Reference

This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Azure DevOps.

### Configuration environment variables

{% data reusables.actions.gai-config-environment-variables %}

{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Azure DevOps instance:

* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires the `workflow` scope).
* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
* `AZURE_DEVOPS_ACCESS_TOKEN`: The {% data variables.product.pat_generic %} used to authenticate with your Azure DevOps instance. This token requires the following scopes:
  * Build: `Read`
  * Agent Pools: `Read`
  * Code: `Read`
  * Release: `Read`
  * Service Connections: `Read`
  * Task Groups: `Read`
  * Variable Groups: `Read`
* `AZURE_DEVOPS_PROJECT`: The project name or GUID to use when migrating a pipeline. If you'd like to perform an audit on all projects, this is optional.
* `AZURE_DEVOPS_ORGANIZATION`: The organization name of your Azure DevOps instance.
* `AZURE_DEVOPS_INSTANCE_URL`: The URL to the Azure DevOps instance, such as `https://dev.azure.com`.

These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.

### Optional arguments

{% data reusables.actions.gai-optional-arguments-intro %}

#### `--source-file-path`

You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
For example:

```shell
gh actions-importer dry-run azure-devops pipeline --output-dir ./output/ --source-file-path ./path/to/azure_devops/pipeline.yml
```

#### `--config-file-path`

You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead. The `--config-file-path` argument can also be used to specify which repository a converted reusable workflow or composite action should be migrated to.

##### Audit example

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform an audit.

```shell
gh actions-importer audit azure-devops pipeline --output-dir ./output/ --config-file-path ./path/to/azure_devops/config.yml
```

To audit an Azure DevOps instance using a configuration file, the configuration file must be in the following format and each `repository_slug` must be unique:

```yaml
source_files:
  - repository_slug: azdo-project/1
    path: file.yml
  - repository_slug: azdo-project/2
    path: path.yml
```

You can generate the `repository_slug` for a pipeline by combining the Azure DevOps organization name, project name, and the pipeline ID. For example, `my-organization-name/my-project-name/42`.

##### Dry run example

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.

The pipeline is selected by matching the `repository_slug` in the configuration file to the value of the `--azure-devops-organization` and `--azure-devops-project` option. The `path` is then used to pull the specified source file.
```shell
gh actions-importer dry-run azure-devops pipeline --output-dir ./output/ --config-file-path ./path/to/azure_devops/config.yml
```

##### Specify the repository of converted reusable workflows and composite actions

{% data variables.product.prodname_actions_importer %} uses the YAML file provided to the `--config-file-path` argument to determine the repository that converted reusable workflows and composite actions are migrated to.

To begin, you should run an audit without the `--config-file-path` argument:

```shell
gh actions-importer audit azure-devops --output-dir ./output/
```

The output of this command will contain a file named `config.yml` that contains a list of all the reusable workflows and composite actions that were converted by {% data variables.product.prodname_actions_importer %}.

For example, the `config.yml` file may have the following contents:

```yaml
reusable_workflows:
  - name: my-reusable-workflow.yml
    target_url: https://github.com/octo-org/octo-repo
    ref: main

composite_actions:
  - name: my-composite-action.yml
    target_url: https://github.com/octo-org/octo-repo
    ref: main
```

You can use this file to specify which repository and ref a reusable workflow or composite action should be added to. You can then use the `--config-file-path` argument to provide the `config.yml` file to {% data variables.product.prodname_actions_importer %}. For example, you can use this file when running a `migrate` command to open a pull request for each unique repository defined in the config file:

```shell
gh actions-importer migrate azure-devops pipeline --config-file-path config.yml --target-url https://github.com/my-org/my-repo
```

### Supported syntax for Azure DevOps pipelines

The following table shows the type of properties that {% data variables.product.prodname_actions_importer %} is currently able to convert.
| Azure Pipelines       | {% data variables.product.prodname_actions %} | Status              |
| :-------------------- | :-------------------------------------------- | :------------------ |
| condition             | `jobs.<job_id>.if` <br> `jobs.<job_id>.steps[*].if` | Supported     |
| container             | `jobs.<job_id>.container` <br> `jobs.<job_id>.name` | Supported     |
| continuousIntegration | `on.push.branches` <br> `on.push.tags` <br> `on.push.paths` | Supported |
| job                   | `jobs.<job_id>`                               | Supported           |
| pullRequest           | `on.pull_request.branches` <br> `on.pull_request.paths` | Supported |
| stage                 | `jobs`                                        | Supported           |
| steps                 | `jobs.<job_id>.steps`                         | Supported           |
| strategy              | `jobs.<job_id>.strategy.fail-fast` <br> `jobs.<job_id>.strategy.max-parallel` <br> `jobs.<job_id>.strategy.matrix` | Supported |
| timeoutInMinutes      | `jobs.<job_id>.timeout-minutes`               | Supported           |
| variables             | `env` <br> `jobs.<job_id>.env` <br> `jobs.<job_id>.steps.env` | Supported |
| manual deployment     | `jobs.<job_id>.environment`                   | Partially supported |
| pool                  | `runners` <br> `self hosted runners`          | Partially supported |
| services              | `jobs.<job_id>.services`                      | Partially supported |
| strategy              | `jobs.<job_id>.strategy`                      | Partially supported |
| triggers              | `on`                                          | Partially supported |
| pullRequest           | `on.pull_request`                             | Unsupported         |
| schedules             | `on.schedule` <br> `on.workflow_run`          | Unsupported         |
| triggers              | `on.<event_name>.types`                       | Unsupported         |

For more information about supported Azure DevOps tasks, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/azure_devops/index.md).

### Environment variable mapping

{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Azure DevOps environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
| Azure Pipelines | {% data variables.product.prodname_actions %} |
| :------------------------------------------ | :-------------------------------------------------- |
| {% raw %}`$(Agent.BuildDirectory)`{% endraw %} | {% raw %}`${{ runner.workspace }}`{% endraw %} |
| {% raw %}`$(Agent.HomeDirectory)`{% endraw %} | {% raw %}`${{ env.HOME }}`{% endraw %} |
| {% raw %}`$(Agent.JobName)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
| {% raw %}`$(Agent.OS)`{% endraw %} | {% raw %}`${{ runner.os }}`{% endraw %} |
| {% raw %}`$(Agent.ReleaseDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(Agent.RootDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(Agent.ToolsDirectory)`{% endraw %} | {% raw %}`${{ runner.tool_cache }}`{% endraw %} |
| {% raw %}`$(Agent.WorkFolder)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(Build.ArtifactStagingDirectory)`{% endraw %} | {% raw %}`${{ runner.temp }}`{% endraw %} |
| {% raw %}`$(Build.BinariesDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(Build.BuildId)`{% endraw %} | {% raw %}`${{ github.run_id }}`{% endraw %} |
| {% raw %}`$(Build.BuildNumber)`{% endraw %} | {% raw %}`${{ github.run_number }}`{% endraw %} |
| {% raw %}`$(Build.DefinitionId)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
| {% raw %}`$(Build.DefinitionName)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
| {% raw %}`$(Build.PullRequest.TargetBranch)`{% endraw %} | {% raw %}`${{ github.base_ref }}`{% endraw %} |
| {% raw %}`$(Build.PullRequest.TargetBranch.Name)`{% endraw %} | {% raw %}`${{ github.base_ref }}`{% endraw %} |
| {% raw %}`$(Build.QueuedBy)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
| {% raw %}`$(Build.Reason)`{% endraw %} | {% raw %}`${{ github.event_name }}`{% endraw %} |
| {% raw %}`$(Build.Repository.LocalPath)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(Build.Repository.Name)`{% endraw %} | {% raw %}`${{ github.repository }}`{% endraw %} |
| {% raw %}`$(Build.Repository.Provider)`{% endraw %} | {% raw %}`GitHub`{% endraw %} |
| {% raw %}`$(Build.Repository.Uri)`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
| {% raw %}`$(Build.RequestedFor)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
| {% raw %}`$(Build.SourceBranch)`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
| {% raw %}`$(Build.SourceBranchName)`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
| {% raw %}`$(Build.SourceVersion)`{% endraw %} | {% raw %}`${{ github.sha }}`{% endraw %} |
| {% raw %}`$(Build.SourcesDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(Build.StagingDirectory)`{% endraw %} | {% raw %}`${{ runner.temp }}`{% endraw %} |
| {% raw %}`$(Pipeline.Workspace)`{% endraw %} | {% raw %}`${{ runner.workspace }}`{% endraw %} |
| {% raw %}`$(Release.DefinitionEnvironmentId)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
| {% raw %}`$(Release.DefinitionId)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
| {% raw %}`$(Release.DefinitionName)`{% endraw %} | {% raw %}`${{ github.workflow }}`{% endraw %} |
| {% raw %}`$(Release.Deployment.RequestedFor)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
| {% raw %}`$(Release.DeploymentID)`{% endraw %} | {% raw %}`${{ github.run_id }}`{% endraw %} |
| {% raw %}`$(Release.EnvironmentId)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
| {% raw %}`$(Release.EnvironmentName)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
| {% raw %}`$(Release.Reason)`{% endraw %} | {% raw %}`${{ github.event_name }}`{% endraw %} |
| {% raw %}`$(Release.RequestedFor)`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
| {% raw %}`$(System.ArtifactsDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(System.DefaultWorkingDirectory)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$(System.HostType)`{% endraw %} | {% raw %}`build`{% endraw %} |
| {% raw %}`$(System.JobId)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
| {% raw %}`$(System.JobName)`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
| {% raw %}`$(System.PullRequest.PullRequestId)`{% endraw %} | {% raw %}`${{ github.event.number }}`{% endraw %} |
| {% raw %}`$(System.PullRequest.PullRequestNumber)`{% endraw %} | {% raw %}`${{ github.event.number }}`{% endraw %} |
| {% raw %}`$(System.PullRequest.SourceBranch)`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
| {% raw %}`$(System.PullRequest.SourceRepositoryUri)`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
| {% raw %}`$(System.PullRequest.TargetBranch)`{% endraw %} | {% raw %}`${{ github.event.base.ref }}`{% endraw %} |
| {% raw %}`$(System.PullRequest.TargetBranchName)`{% endraw %} | {% raw %}`${{ github.event.base.ref }}`{% endraw %} |
| {% raw %}`$(System.StageAttempt)`{% endraw %} | {% raw %}`${{ github.run_number }}`{% endraw %} |
| {% raw %}`$(System.TeamFoundationCollectionUri)`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}`{% endraw %} |
| {% raw %}`$(System.WorkFolder)`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |

### Templates

You can transform Azure DevOps templates with {% data variables.product.prodname_actions_importer %}.

#### Limitations

{% data variables.product.prodname_actions_importer %} is able to transform Azure DevOps templates with some limitations.

* Azure DevOps templates used under the `stages`, `deployments`, and `jobs` keys are converted into reusable workflows in {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/using-workflows/reusing-workflows).
* Azure DevOps templates used under the `steps` key are converted into composite actions. For more information, see [AUTOTITLE](/actions/creating-actions/creating-a-composite-action).
* If you currently have job templates that reference other job templates, {% data variables.product.prodname_actions_importer %} converts the templates into reusable workflows. Because reusable workflows cannot reference other reusable workflows, this is invalid syntax in {% data variables.product.prodname_actions %}. You must manually correct nested reusable workflows.
* If a template references an external Azure DevOps organization or {% data variables.product.prodname_dotcom %} repository, you must use the `--credentials-file` option to provide credentials to access this template.
For more information, see [AUTOTITLE](/actions/migrating-to-github-actions/automated-migrations/supplemental-arguments-and-settings#using-a-credentials-file-for-authentication).
* You can dynamically generate YAML using `each` expressions with the following caveats:
  * Nested `each` blocks are not supported and cause the parent `each` block to be unsupported.
  * `each` and contained `if` conditions are evaluated at transformation time, because {% data variables.product.prodname_actions %} does not support this style of insertion.
  * `elseif` blocks are unsupported. If this functionality is required, you must manually correct them.
  * Nested `if` blocks are supported, but `if/elseif/else` blocks nested under an `if` condition are not.
  * `if` blocks that use predefined Azure DevOps variables are not supported.

#### Supported templates

{% data variables.product.prodname_actions_importer %} supports the templates listed in the table below.

| Azure Pipelines               | {% data variables.product.prodname_actions %} | Status              |
| :---------------------------- | :------------------------------------ | :------------------ |
| Extending from a template     | `Reusable workflow`                   | Supported           |
| Job templates                 | `Reusable workflow`                   | Supported           |
| Stage templates               | `Reusable workflow`                   | Supported           |
| Step templates                | `Composite action`                    | Supported           |
| Task groups in classic editor | Varies                                | Supported           |
| Templates in a different Azure DevOps organization, project, or repository | Varies | Supported |
| Templates in a {% data variables.product.prodname_dotcom %} repository | Varies | Supported |
| Variable templates            | `env`                                 | Supported           |
| Conditional insertion         | `if` conditions on job/steps          | Partially supported |
| Iterative insertion           | Not applicable                        | Partially supported |
| Templates with parameters     | Varies                                | Partially supported |

#### Template file path names

{% data variables.product.prodname_actions_importer %} can extract templates with relative or dynamic file paths with
variable, parameter, and iterative expressions in the file name. However, there must be a default value set.

##### Variable file path name example

```yaml
# File: azure-pipelines.yml
variables:
  - template: 'templates/vars.yml'

steps:
  - template: "./templates/{% raw %}${{ variables.one }}{% endraw %}"
```

```yaml
# File: templates/vars.yml
variables:
  one: 'simple_step.yml'
```

##### Parameter file path name example

```yaml
parameters:
  - name: template
    type: string
    default: simple_step.yml

steps:
  - template: "./templates/{% raw %}${{ parameters.template }}{% endraw %}"
```

##### Iterative file path name example

```yaml
parameters:
  - name: steps
    type: object
    default:
      - build_step
      - release_step

steps:
  - {% raw %}${{ each step in parameters.steps }}{% endraw %}:
      - template: "{% raw %}${{ step }}{% endraw %}-variables.yml"
```

#### Template parameters

{% data variables.product.prodname_actions_importer %} supports the parameters listed in the table below.

| Azure Pipelines | {% data variables.product.prodname_actions %} | Status |
| :-------------- | :----------------------------------------- | :------------------ |
| string          | `inputs.string`                            | Supported           |
| number          | `inputs.number`                            | Supported           |
| boolean         | `inputs.boolean`                           | Supported           |
| object          | `inputs.string` with `fromJSON` expression | Partially supported |
| step            | `step`                                     | Partially supported |
| stepList        | `step`                                     | Partially supported |
| job             | `job`                                      | Partially supported |
| jobList         | `job`                                      | Partially supported |
| deployment      | `job`                                      | Partially supported |
| deploymentList  | `job`                                      | Partially supported |
| stage           | `job`                                      | Partially supported |
| stageList       | `job`                                      | Partially supported |

> [!NOTE]
> A template used under the `step` key with this parameter type is only serialized as a composite action if the steps are used at the beginning or end of the template steps. A template used under the `stage`, `deployment`, or `job` keys with this parameter type is not transformed into a reusable workflow, and is instead serialized as a standalone workflow.

## Legal notice

{% data reusables.actions.actions-importer-legal-notice %}
## About migrating from Travis CI with GitHub Actions Importer

The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Travis CI pipelines to {% data variables.product.prodname_actions %}.

### Prerequisites

* A Travis CI account or organization with pipelines and jobs that you want to convert to {% data variables.product.prodname_actions %} workflows.
* Access to create a Travis CI API access token for your account or organization.

{% data reusables.actions.actions-importer-prerequisites %}

### Limitations

There are some limitations when migrating from Travis CI pipelines to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}.

#### Manual tasks

Certain Travis CI constructs must be migrated manually. These include:

* Secrets
* Unknown job properties

For more information on manual migrations, see [AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-travis-ci-to-github-actions).

#### Travis CI project languages

{% data variables.product.prodname_actions_importer %} transforms Travis CI project languages by adding a set of preconfigured build tools and a default build script to the transformed workflow. If no language is explicitly declared, {% data variables.product.prodname_actions_importer %} assumes the project language is Ruby.

For a list of the project languages supported by {% data variables.product.prodname_actions_importer %}, see [Supported project languages](#supported-project-languages).
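Because an undeclared language falls back to Ruby, declaring the language explicitly in your pipeline before migrating can save cleanup in the converted workflow. A minimal, hypothetical `.travis.yml` (the language version and script are placeholders):

```yaml
# Hypothetical .travis.yml: an explicit language keeps the importer from
# assuming Ruby and lets it pick matching build tools for the workflow.
language: node_js
node_js:
  - "20"
script:
  - npm test
```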
## Installing the {% data variables.product.prodname_actions_importer %} CLI extension

{% data reusables.actions.installing-actions-importer %}

## Configuring credentials

The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Travis CI and {% data variables.product.prodname_dotcom %}.

1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see [AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic).

   Your token must have the `workflow` scope.

   After creating the token, copy it and save it in a safe location for later use.
1. Create a Travis CI API access token. For more information, see [Get your Travis CI API token](https://docs.travis-ci.com/user/migrate/travis-migrate-to-apps-gem-guide/#4-get-your-travis-ci-api-token) in the Travis CI documentation.

   After creating the token, copy it and save it in a safe location for later use.
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:

   ```shell
   gh actions-importer configure
   ```

   The `configure` command will prompt you for the following information:

   * For "Which CI providers are you configuring?", use the arrow keys to select `Travis CI`, press `Space` to select it, then press `Enter`.
   * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press `Enter`.
   * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for {% data variables.location.product_location_enterprise %}, and press `Enter`.{% else %}press `Enter` to accept the default value (`https://github.com`).{% endif %}
   * For "{% data variables.product.pat_generic_caps %} for Travis CI", enter the value for the Travis CI API access token that you created earlier, and press `Enter`.
   * For "Base url of the Travis CI instance", enter the URL of your Travis CI instance, and press `Enter`.
   * For "Travis CI organization name", enter the name of your Travis CI organization, and press `Enter`.

   An example of the output of the `configure` command is shown below.

   ```shell
   $ gh actions-importer configure
   ✔ Which CI providers are you configuring?: Travis CI
   Enter the following values (leave empty to omit):
   ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
   ✔ Base url of the GitHub instance: https://github.com
   ✔ {% data variables.product.pat_generic_caps %} for Travis CI: ***************
   ✔ Base url of the Travis CI instance: https://travis-ci.com
   ✔ Travis CI organization name: actions-importer-labs
   Environment variables successfully updated.
   ```
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure that the container image is updated to the latest version:
   ```shell
   gh actions-importer update
   ```

   The output of the command should be similar to below:

   ```shell
   Updating ghcr.io/actions-importer/cli:latest...
   ghcr.io/actions-importer/cli:latest up-to-date
   ```

## Perform an audit of Travis CI

You can use the `audit` command to get a high-level view of all pipelines in a Travis CI server.

The `audit` command performs the following steps:

1. Fetches all of the projects defined in a Travis CI server.
1. Converts each pipeline to its equivalent {% data variables.product.prodname_actions %} workflow.
1. Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.

### Running the audit command

To perform an audit of a Travis CI server, run the following command in your terminal:

```shell
gh actions-importer audit travis-ci --output-dir tmp/audit
```

### Inspecting the audit results

{% data reusables.actions.gai-inspect-audit %}

## Forecast potential build runner usage

You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your Travis CI server.

### Running the forecast command

To perform a forecast of potential {% data variables.product.prodname_actions %} usage, run the following command in your terminal. By default, {% data variables.product.prodname_actions_importer %} includes the previous seven days in the forecast report.
```shell
gh actions-importer forecast travis-ci --output-dir tmp/forecast
```

### Inspecting the forecast report

The `forecast_report.md` file in the specified output directory contains the results of the forecast.

Listed below are some key terms that can appear in the forecast report:

* The **job count** is the total number of completed jobs.
* The **pipeline count** is the number of unique pipelines used.
* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
  * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
* **Queue time** metrics describe the amount of time a job spent waiting for a runner to be available to execute it.
* **Concurrent jobs** metrics describe the amount of jobs running at any given time. This metric can be used to define the number of runners you should configure.

Additionally, these metrics are defined for each queue of runners in Travis CI. This is especially useful if there is a mix of hosted or self-hosted runners, or high or low spec machines, so you can see metrics specific to different types of runners.

## Perform a dry-run migration of a Travis CI pipeline

You can use the `dry-run` command to convert a Travis CI pipeline to an equivalent {% data variables.product.prodname_actions %} workflow. A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.
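For orientation, the kind of workflow a dry run might emit for a simple pipeline looks roughly like the following. This is a hand-written sketch, not actual importer output; the name, trigger, and action versions are illustrative only, and real output depends entirely on your `.travis.yml`:

```yaml
# Illustrative only: roughly what a converted workflow for a hypothetical
# node_js Travis CI pipeline could contain.
name: travis-ci-conversion-sketch
on:
  push:
    branches:
      - main
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: "20"
      - run: npm test
```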
### Running the dry-run command

To perform a dry run of migrating your Travis CI pipelines to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `my-travis-ci-repository` with the name of your Travis CI repository.

```shell
gh actions-importer dry-run travis-ci --travis-ci-repository my-travis-ci-repository --output-dir tmp/dry-run
```

You can view the logs of the dry run and the converted workflow files in the specified output directory.
{% data reusables.actions.gai-custom-transformers-rec %}

## Perform a production migration of a Travis CI pipeline

You can use the `migrate` command to convert a Travis CI pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.

### Running the migrate command

To migrate a Travis CI pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the `target-url` value with the URL for your {% data variables.product.prodname_dotcom %} repository, and `my-travis-ci-repository` with the name of your Travis CI repository.

```shell
gh actions-importer migrate travis-ci --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --travis-ci-repository my-travis-ci-repository
```

The command's output includes the URL to the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:

```shell
$ gh actions-importer migrate travis-ci --target-url https://github.com/octo-org/octo-repo --output-dir tmp/migrate --travis-ci-repository my-travis-ci-repository
[2022-08-20 22:08:20] Logs: 'tmp/migrate/log/actions-importer-20220916-014033.log'
[2022-08-20 22:08:20] Pull request: 'https://github.com/octo-org/octo-repo/pull/1'
```

{% data reusables.actions.gai-inspect-pull-request %}

## Reference

This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Travis CI.
### Using environment variables

{% data reusables.actions.gai-config-environment-variables %}

{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Travis CI instance:

* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a converted workflow (requires the `workflow` scope).
* `GITHUB_INSTANCE_URL`: The URL to the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
* `TRAVIS_CI_ACCESS_TOKEN`: The Travis CI API access token used to view Travis CI resources.
* `TRAVIS_CI_ORGANIZATION`: The organization name of your Travis CI instance.
* `TRAVIS_CI_INSTANCE_URL`: The URL of the Travis CI instance.
* `TRAVIS_CI_SOURCE_GITHUB_ACCESS_TOKEN`: (Optional) The {% data variables.product.pat_generic %} used to authenticate with your source GitHub instance. If not provided, `GITHUB_ACCESS_TOKEN` will be used instead.
* `TRAVIS_CI_SOURCE_GITHUB_INSTANCE_URL`: (Optional) The URL to the source GitHub instance, such as `https://github.com`. If not provided, `GITHUB_INSTANCE_URL` will be used instead.

These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} when it is run.

### Using optional arguments

{% data reusables.actions.gai-optional-arguments-intro %}

#### `--source-file-path`

You can use the `--source-file-path` argument with the `forecast`, `dry-run`, or `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead.
For example:

```shell
gh actions-importer dry-run travis-ci --output-dir ./path/to/output/ --travis-ci-repository my-travis-ci-repository --source-file-path ./path/to/.travis.yml
```

#### `--allow-inactive-repositories`

You can use this argument to specify whether {% data variables.product.prodname_actions_importer %} should include inactive repositories in an audit. If this option is not set, inactive repositories are not included in audits.

```shell
gh actions-importer dry-run travis-ci --output-dir ./path/to/output/ --travis-ci-repository my-travis-ci-repository --allow-inactive-repositories
```

#### `--config-file-path`

You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands.

By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from source control. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.

##### Audit example

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.

```shell
gh actions-importer audit travis-ci --output-dir ./path/to/output/ --config-file-path ./path/to/travis-ci/config.yml
```

To audit a Travis CI instance using a configuration file, the file must be in the following format and each `repository_slug` value must be unique:
```yaml
source_files:
  - repository_slug: travis-org-name/travis-repo-name
    path: path/to/.travis.yml
  - repository_slug: travis-org-name/some-other-travis-repo-name
    path: path/to/.travis.yml
```

##### Dry run example

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file as the source file to perform a dry run.

The pipeline is selected by matching the `repository_slug` in the configuration file to the value of the `--travis-ci-repository` option. The `path` is then used to pull the specified source file.

```shell
gh actions-importer dry-run travis-ci --travis-ci-repository travis-org-name/travis-repo-name --output-dir ./output/ --config-file-path ./path/to/travis-ci/config.yml
```

### Supported project languages

{% data variables.product.prodname_actions_importer %} supports migrating Travis CI projects in the following languages:

* `android`
* `bash`
* `c`
* `clojure`
* `c++`
* `crystal`
* `c#`
* `d`
* `dart`
* `elixir`
* `erlang`
* `generic`
* `go`
* `groovy`
* `haskell`
* `haxe`
* `java`
* `julia`
* `matlab`
* `minimal`
* `nix`
* `node_js`
* `objective-c`
* `perl`
* `perl6`
* `php`
* `python`
* `r`
* `ruby`
* `rust`
* `scala`
* `sh`
* `shell`
* `smalltalk`
* `swift`

### Supported syntax for Travis CI pipelines

The following table shows the type of properties {% data variables.product.prodname_actions_importer %} is currently able to convert.
For more details about how Travis CI pipeline syntax aligns with {% data variables.product.prodname_actions %}, see [AUTOTITLE](/actions/migrating-to-github-actions/manually-migrating-to-github-actions/migrating-from-travis-ci-to-github-actions).

| Travis CI | GitHub Actions | Status |
| :------------------ | :--------------------------------- | :------------------ |
| branches | `on..` | Supported |
| build_pull_requests | `on.` | Supported |
| env | `env`, `jobs..env`, `jobs..steps.env` | Supported |
| if | `jobs..if` | Supported |
| job | `jobs.`, `jobs..name` | Supported |
| matrix | `jobs..strategy`, `jobs..strategy.fail-fast`, `jobs..strategy.matrix` | Supported |
| os & dist | `runners` | Supported |
| scripts | `jobs..steps` | Supported |
| stages | `jobs` | Supported |
| env | `on` | Partially supported |
| branches | `on..`, `on..paths` | Unsupported |
| build_pull_requests | `on..`, `on..`, `on..paths` | Unsupported |
| cron triggers | `on.schedule`, `on.workflow_run` | Unsupported |
| env | `jobs..timeout-minutes`, `on..types` | Unsupported |
| job | `jobs..container` | Unsupported |
| os & dist | `self hosted runners` | Unsupported |

For information about supported Travis CI constructs, see the [`github/gh-actions-importer` repository](https://github.com/github/gh-actions-importer/blob/main/docs/travis_ci/index.md).

### Environment variables syntax

{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Travis CI environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.
| Travis CI | GitHub Actions |
| :---------------------------- | :------------------------------------------------------------------------------------ |
| {% raw %}`$CONTINUOUS_INTEGRATION`{% endraw %} | {% raw %}`$CI`{% endraw %} |
| {% raw %}`$USER`{% endraw %} | {% raw %}`${{ github.actor }}`{% endraw %} |
| {% raw %}`$HOME`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$TRAVIS_BRANCH`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
| {% raw %}`$TRAVIS_BUILD_DIR`{% endraw %} | {% raw %}`${{ github.workspace }}`{% endraw %} |
| {% raw %}`$TRAVIS_BUILD_ID`{% endraw %} | {% raw %}`${{ github.run_number }}`{% endraw %} |
| {% raw %}`$TRAVIS_BUILD_NUMBER`{% endraw %} | {% raw %}`${{ github.run_id }}`{% endraw %} |
| {% raw %}`$TRAVIS_COMMIT`{% endraw %} | {% raw %}`${{ github.sha }}`{% endraw %} |
| {% raw %}`$TRAVIS_EVENT_TYPE`{% endraw %} | {% raw %}`${{ github.event_name }}`{% endraw %} |
| {% raw %}`$TRAVIS_PULL_REQUEST_BRANCH`{% endraw %} | {% raw %}`${{ github.base_ref }}`{% endraw %} |
| {% raw %}`$TRAVIS_PULL_REQUEST`{% endraw %} | {% raw %}`${{ github.event.number }}`{% endraw %} |
| {% raw %}`$TRAVIS_PULL_REQUEST_SHA`{% endraw %} | {% raw %}`${{ github.head.sha }}`{% endraw %} |
| {% raw %}`$TRAVIS_PULL_REQUEST_SLUG`{% endraw %} | {% raw %}`${{ github.repository }}`{% endraw %} |
| {% raw %}`$TRAVIS_TAG`{% endraw %} | {% raw %}`${{ github.ref }}`{% endraw %} |
| {% raw %}`$TRAVIS_OS_NAME`{% endraw %} | {% raw %}`${{ runner.os }}`{% endraw %} |
| {% raw %}`$TRAVIS_JOB_ID`{% endraw %} | {% raw %}`${{ github.job }}`{% endraw %} |
| {% raw %}`$TRAVIS_REPO_SLUG`{% endraw %} | {% raw %}`${{ github.repository_owner/github.repository }}`{% endraw %} |
| {% raw %}`$TRAVIS_BUILD_WEB_URL`{% endraw %} | {% raw %}`${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}`{% endraw %} |

## Legal notice

{% data reusables.actions.actions-importer-legal-notice %}
## About migrating from Bitbucket Pipelines with GitHub Actions Importer

The instructions below will guide you through configuring your environment to use {% data variables.product.prodname_actions_importer %} to migrate Bitbucket Pipelines to {% data variables.product.prodname_actions %}.

### Prerequisites

{% data reusables.actions.actions-importer-prerequisites %}

### Limitations

There are some limitations when migrating from Bitbucket Pipelines to {% data variables.product.prodname_actions %} with {% data variables.product.prodname_actions_importer %}.

* Images in a private AWS ECR are not supported.
* The Bitbucket Pipelines option `size` is not supported. {% ifversion fpt or ghec %}If additional runner resources are required in {% data variables.product.prodname_actions %}, consider using {% data variables.actions.hosted_runner %}s. For more information, see [AUTOTITLE](/actions/using-github-hosted-runners/about-larger-runners).{% endif %}
* Metrics detailing the queue time of jobs are not supported by the `forecast` command.
* Bitbucket [after-scripts](https://support.atlassian.com/bitbucket-cloud/docs/step-options/#After-script) are supported using {% data variables.product.prodname_actions %} `always()` in combination with checking the `steps..conclusion` of the previous step. For more information, see [AUTOTITLE](/actions/learn-github-actions/contexts#steps-context).

  The following is an example of using `always()` with `steps..conclusion`.

  ```yaml
  - name: After Script 1
    run: |-
      echo "I'm after the script ran!"
      echo "We should be grouped!"
    id: after-script-1
    if: "{% raw %}${{ always() }}{% endraw %}"
  - name: After Script 2
    run: |-
      echo "this is really the end"
      echo "goodbye, for now!"
    id: after-script-2
    if: "{% raw %}${{ steps.after-script-1.conclusion == 'success' && always() }}{% endraw %}"
  ```

### Manual tasks

Certain Bitbucket Pipelines constructs must be migrated manually.
These include:

* Secured repository, workspace, and deployment variables
* SSH keys

## Installing the {% data variables.product.prodname_actions_importer %} CLI extension

{% data reusables.actions.installing-actions-importer %}

## Configuring credentials

The `configure` CLI command is used to set required credentials and options for {% data variables.product.prodname_actions_importer %} when working with Bitbucket Pipelines and {% data variables.product.prodname_dotcom %}.

1. Create a {% data variables.product.prodname_dotcom %} {% data variables.product.pat_v1 %}. For more information, see [AUTOTITLE](/authentication/keeping-your-account-and-data-secure/managing-your-personal-access-tokens#creating-a-personal-access-token-classic).

   Your token must have the `workflow` scope.

   After creating the token, copy it and save it in a safe location for later use.
1. Create a Workspace Access Token for Bitbucket Pipelines. For more information, see [Workspace Access Token permissions](https://support.atlassian.com/bitbucket-cloud/docs/workspace-access-token-permissions/) in the Bitbucket documentation.

   Your token must have the `read` scope for pipelines, projects, and repositories.
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `configure` CLI command:

   ```shell
   gh actions-importer configure
   ```

   The `configure` command will prompt you for the following information:

   * For "Which CI providers are you configuring?", use the arrow keys to select `Bitbucket`, press `Space` to select it, then press `Enter`.
   * For "{% data variables.product.pat_generic_caps %} for GitHub", enter the value of the {% data variables.product.pat_v1 %} that you created earlier, and press `Enter`.
   * For "Base url of the GitHub instance", {% ifversion ghes %}enter the URL for {% data variables.location.product_location_enterprise %}, and press `Enter`.{% else %}press `Enter` to accept the default value (`https://github.com`).{% endif %}
   * For "{% data variables.product.pat_generic_caps %} for Bitbucket", enter the Workspace Access Token that you created earlier, and press `Enter`.
   * For "Base url of the Bitbucket instance", enter the URL for your Bitbucket instance, and press `Enter`.

   An example of the `configure` command is shown below:

   ```shell
   $ gh actions-importer configure
   ✔ Which CI providers are you configuring?: Bitbucket
   Enter the following values (leave empty to omit):
   ✔ {% data variables.product.pat_generic_caps %} for GitHub: ***************
   ✔ Base url of the GitHub instance: https://github.com
   ✔ {% data variables.product.pat_generic_caps %} for Bitbucket: ********************
   ✔ Base url of the Bitbucket instance: https://bitbucket.example.com
   Environment variables successfully updated.
   ```
1. In your terminal, run the {% data variables.product.prodname_actions_importer %} `update` CLI command to connect to {% data variables.product.prodname_registry %} {% data variables.product.prodname_container_registry %} and ensure
that the container image is updated to the latest version:

   ```shell
   gh actions-importer update
   ```

   The output of the command should be similar to below:

   ```shell
   Updating ghcr.io/actions-importer/cli:latest...
   ghcr.io/actions-importer/cli:latest up-to-date
   ```

## Perform an audit of the Bitbucket instance

You can use the `audit` command to get a high-level view of pipelines in a Bitbucket instance.

The `audit` command performs the following steps:

1. Fetches all of the pipelines for a workspace.
1. Converts each pipeline to its equivalent GitHub Actions workflow.
1. Generates a report that summarizes how complete and complex of a migration is possible with {% data variables.product.prodname_actions_importer %}.

### Running the audit command

To perform an audit, run the following command in your terminal, replacing `:workspace` with the name of the Bitbucket workspace to audit.

```bash
gh actions-importer audit bitbucket --workspace :workspace --output-dir tmp/audit
```

Optionally, a `--project-key` option can be provided to the audit command to limit the results to only pipelines associated with a project. In the command below, `:project_key` should be replaced with the key of the project that should be audited. Project keys can be found in Bitbucket on the workspace projects page.
```bash
gh actions-importer audit bitbucket --workspace :workspace --project-key :project_key --output-dir tmp/audit
```

### Inspecting the audit results

{% data reusables.actions.gai-inspect-audit %}

## Forecasting usage

You can use the `forecast` command to forecast potential {% data variables.product.prodname_actions %} usage by computing metrics from completed pipeline runs in your Bitbucket instance.

### Running the forecast command

To perform a forecast of potential GitHub Actions usage, run the following command in your terminal, replacing `:workspace` with the name of the Bitbucket workspace to forecast. By default, GitHub Actions Importer includes the previous seven days in the forecast report.

```shell
gh actions-importer forecast bitbucket --workspace :workspace --output-dir tmp/forecast_reports
```

### Forecasting a project

To limit the forecast to a project, you can use the `--project-key` option. Replace `:project_key` with the project key for the project to forecast.

```shell
gh actions-importer forecast bitbucket --workspace :workspace --project-key :project_key --output-dir tmp/forecast_reports
```

### Inspecting the forecast report

The `forecast_report.md` file in the specified output directory contains the results of the forecast.

Listed below are some key terms that can appear in the forecast report:

* The **job count** is the total number of completed jobs.
* The **pipeline count** is the number of unique pipelines used.
* **Execution time** describes the amount of time a runner spent on a job. This metric can be used to help plan for the cost of {% data variables.product.prodname_dotcom %}-hosted runners.
  * This metric is correlated to how much you should expect to spend in {% data variables.product.prodname_actions %}. This will vary depending on the hardware used for these minutes. You can use the [{% data variables.product.prodname_actions %} pricing calculator](https://github.com/pricing/calculator) to estimate the costs.
* **Concurrent jobs** metrics describe the amount of jobs running at any given time.

## Performing a dry-run migration

You can use the `dry-run` command to convert a Bitbucket pipeline to equivalent {% data variables.product.prodname_actions %} workflows. A dry-run creates the output files in a specified directory, but does not open a pull request to migrate the pipeline.

### Running the dry-run command

To perform a dry run of migrating a Bitbucket pipeline to {% data variables.product.prodname_actions %}, run the following
### Running the dry-run command

To perform a dry run of migrating a Bitbucket pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing `:workspace` with the name of the workspace and `:repo` with the name of the repository in Bitbucket.

```bash
gh actions-importer dry-run bitbucket --workspace :workspace --repository :repo --output-dir tmp/dry-run
```

### Inspecting the converted workflows

You can view the logs of the dry run and the converted workflow files in the specified output directory.

{% data reusables.actions.gai-custom-transformers-rec %}

## Performing a production migration

You can use the `migrate` command to convert a Bitbucket pipeline and open a pull request with the equivalent {% data variables.product.prodname_actions %} workflow.

### Running the migrate command

To migrate a Bitbucket pipeline to {% data variables.product.prodname_actions %}, run the following command in your terminal, replacing the following values.

* Replace the `target-url` value with the URL for your {% data variables.product.company_short %} repository.
* Replace `:repo` with the name of the repository in Bitbucket.
* Replace `:workspace` with the name of the workspace.

```bash
gh actions-importer migrate bitbucket --workspace :workspace --repository :repo --target-url https://github.com/:owner/:repo --output-dir tmp/dry-run
```

The command's output includes the URL of the pull request that adds the converted workflow to your repository. An example of a successful output is similar to the following:

```bash
gh actions-importer migrate bitbucket --workspace actions-importer --repository custom-trigger --target-url https://github.com/valet-dev-testing/demo-private --output-dir tmp/bitbucket
[2023-07-18 09:56:06] Logs: 'tmp/bitbucket/log/valet-20230718-165606.log'
[2023-07-18 09:56:24] Pull request: 'https://github.com/valet-dev-testing/demo-private/pull/55'
```

{% data reusables.actions.gai-inspect-pull-request %}

## Reference

This section contains reference information on environment variables, optional arguments, and supported syntax when using {% data variables.product.prodname_actions_importer %} to migrate from Bitbucket Pipelines.

### Using environment variables

{% data reusables.actions.gai-config-environment-variables %}

{% data variables.product.prodname_actions_importer %} uses the following environment variables to connect to your Bitbucket instance.

* `GITHUB_ACCESS_TOKEN`: The {% data variables.product.pat_v1 %} used to create pull requests with a transformed workflow (requires the `repo` and `workflow` scopes).
* `GITHUB_INSTANCE_URL`: The URL of the target {% data variables.product.prodname_dotcom %} instance (for example, `https://github.com`).
* `BITBUCKET_ACCESS_TOKEN`: The workspace access token with read scopes for pipeline, project, and repository.

These environment variables can be specified in a `.env.local` file that is loaded by {% data variables.product.prodname_actions_importer %} at run time. The distribution archive contains a `.env.local.template` file that you can use to create these files.

### Optional arguments

{% data reusables.actions.gai-optional-arguments-intro %}

#### `--source-file-path`

You can use the `--source-file-path` argument with the `dry-run` or `migrate` subcommands. By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bitbucket instance. The `--source-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source file path instead. For example:

```bash
gh actions-importer dry-run bitbucket --workspace :workspace --repository :repo --output-dir tmp/dry-run --source-file-path path/to/my/pipeline/file.yml
```

#### `--config-file-path`

You can use the `--config-file-path` argument with the `audit`, `dry-run`, and `migrate` subcommands. By default, {% data variables.product.prodname_actions_importer %} fetches pipeline contents from the Bitbucket instance. The `--config-file-path` argument tells {% data variables.product.prodname_actions_importer %} to use the specified source files instead.

### Audit example

In this example, {% data variables.product.prodname_actions_importer %} uses the specified YAML configuration file to perform an audit.

```bash
gh actions-importer audit bitbucket --workspace :workspace --output-dir tmp/audit --config-file-path "path/to/my/bitbucket/config.yml"
```

To audit a Bitbucket instance using a config file, the config file must be in the following format, and each `repository_slug` value must be unique:

```yaml
source_files:
  - repository_slug: repo_name
    path: path/to/one/source/file.yml
  - repository_slug: another_repo_name
    path: path/to/another/source/file.yml
```
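For reference, a `.env.local` file corresponding to the variables in the "Using environment variables" section might look like the following. This is a minimal sketch, and every value shown is a placeholder to substitute with your own credentials:

```shell
# Placeholder values - replace with your own credentials before use
GITHUB_ACCESS_TOKEN=ghp_your_token_here        # needs repo and workflow scopes
GITHUB_INSTANCE_URL=https://github.com
BITBUCKET_ACCESS_TOKEN=your_workspace_access_token
```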
## Supported syntax for Bitbucket Pipelines

The following table shows the type of properties that {% data variables.product.prodname_actions_importer %} is currently able to convert.

| Bitbucket            | GitHub Actions                               | Status       |
| :------------------- | :------------------------------------------- | -----------: |
| `after-script`       | `jobs.<job_id>.steps[*]`                     | Supported    |
| `artifacts`          | `actions/upload-artifact` & `download-artifact` | Supported |
| `caches`             | `actions/cache`                              | Supported    |
| `clone`              | `actions/checkout`                           | Supported    |
| `condition`          | `jobs.<job_id>.steps[*].run`                 | Supported    |
| `deployment`         | `jobs.<job_id>.environment`                  | Supported    |
| `image`              | `jobs.<job_id>.container`                    | Supported    |
| `max-time`           | `jobs.<job_id>.steps[*].timeout-minutes`     | Supported    |
| `options.docker`     | None                                         | Supported    |
| `options.max-time`   | `jobs.<job_id>.steps[*].timeout-minutes`     | Supported    |
| `parallel`           | `jobs.<job_id>`                              | Supported    |
| `pipelines.branches` | `on.push`                                    | Supported    |
| `pipelines.custom`   | `on.workflow_dispatch`                       | Supported    |
| `pipelines.default`  | `on.push`                                    | Supported    |
| `pipelines.pull-requests` | `on.pull_request`                       | Supported    |
| `pipelines.tags`     | `on.tags`                                    | Supported    |
| `runs-on`            | `jobs.<job_id>.runs-on`                      | Supported    |
| `script`             | `jobs.<job_id>.steps[*].run`                 | Supported    |
| `services`           | `jobs.<job_id>.services`                     | Supported    |
| `stage`              | `jobs.<job_id>`                              | Supported    |
| `step`               | `jobs.<job_id>.steps[*]`                     | Supported    |
| `trigger`            | `on.workflow_dispatch`                       | Supported    |
| `fail-fast`          | None                                         | Unsupported  |
| `oidc`               | None                                         | Unsupported  |
| `options.size`       | None                                         | Unsupported  |
| `size`               | None                                         | Unsupported  |

### Environment variable mapping

{% data variables.product.prodname_actions_importer %} uses the mapping in the table below to convert default Bitbucket environment variables to the closest equivalent in {% data variables.product.prodname_actions %}.

| Bitbucket                              | GitHub Actions                                          |
| :------------------------------------- | :------------------------------------------------------ |
| `CI`                                   | {% raw %}`true`{% endraw %}                             |
| `BITBUCKET_BUILD_NUMBER`               | {% raw %}`${{ github.run_number }}`{% endraw %}         |
| `BITBUCKET_CLONE_DIR`                  | {% raw %}`${{ github.workspace }}`{% endraw %}          |
| `BITBUCKET_COMMIT`                     | {% raw %}`${{ github.sha }}`{% endraw %}                |
| `BITBUCKET_WORKSPACE`                  | {% raw %}`${{ github.repository_owner }}`{% endraw %}   |
| `BITBUCKET_REPO_SLUG`                  | {% raw %}`${{ github.repository }}`{% endraw %}         |
| `BITBUCKET_REPO_UUID`                  | {% raw %}`${{ github.repository_id }}`{% endraw %}      |
| `BITBUCKET_REPO_FULL_NAME`             | {% raw %}`${{ github.repository_owner }}`{% endraw %}/{% raw %}`${{ github.repository }}`{% endraw %} |
| `BITBUCKET_BRANCH`                     | {% raw %}`${{ github.ref }}`{% endraw %}                |
| `BITBUCKET_TAG`                        | {% raw %}`${{ github.ref }}`{% endraw %}                |
| `BITBUCKET_PR_ID`                      | {% raw %}`${{ github.event.pull_request.number }}`{% endraw %} |
| `BITBUCKET_PR_DESTINATION_BRANCH`      | {% raw %}`${{ github.event.pull_request.base.ref }}`{% endraw %} |
| `BITBUCKET_GIT_HTTP_ORIGIN`            | {% raw %}`${{ github.event.repository.clone_url }}`{% endraw %} |
| `BITBUCKET_GIT_SSH_ORIGIN`             | {% raw %}`${{ github.event.repository.ssh_url }}`{% endraw %} |
| `BITBUCKET_EXIT_CODE`                  | {% raw %}`${{ job.status }}`{% endraw %}                |
| `BITBUCKET_STEP_UUID`                  | {% raw %}`${{ job.github_job }}`{% endraw %}            |
| `BITBUCKET_PIPELINE_UUID`              | {% raw %}`${{ github.workflow }}`{% endraw %}           |
| `BITBUCKET_PROJECT_KEY`                | {% raw %}`${{ github.repository_owner }}`{% endraw %}   |
| `BITBUCKET_PROJECT_UUID`               | {% raw %}`${{ github.repository_owner }}`{% endraw %}   |
| `BITBUCKET_STEP_TRIGGERER_UUID`        | {% raw %}`${{ github.actor_id }}`{% endraw %}           |
| `BITBUCKET_SSH_KEY_FILE`               | {% raw %}`${{ github.workspace }}/.ssh/id_rsa`{% endraw %} |
| `BITBUCKET_STEP_OIDC_TOKEN`            | No Mapping                                              |
| `BITBUCKET_DEPLOYMENT_ENVIRONMENT`     | No Mapping                                              |
| `BITBUCKET_DEPLOYMENT_ENVIRONMENT_UUID` | No Mapping                                             |
| `BITBUCKET_BOOKMARK`                   | No Mapping                                              |
| `BITBUCKET_PARALLEL_STEP`              | No Mapping                                              |
| `BITBUCKET_PARALLEL_STEP_COUNT`        | No Mapping                                              |

### System Variables

System variables used in tasks are transformed to the equivalent bash shell variable and are assumed to be available. For example, `${system.<variable_name>}` will be transformed to `$variable_name`. We recommend you verify this to ensure proper operation of the workflow.
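For instance, a Bitbucket step that references `${system.deploy_env}` (a hypothetical variable name) would come through as a plain shell variable after conversion. This is a hand-written sketch, not literal {% data variables.product.prodname_actions_importer %} output:

```yaml
# Bitbucket Pipelines (before)
- step:
    script:
      - echo "Deploying to ${system.deploy_env}"
```

```yaml
# GitHub Actions (after) - $deploy_env is assumed to exist in the job's environment
- name: Deploy
  run: echo "Deploying to $deploy_env"
```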
## Legal notice

{% data reusables.actions.actions-importer-legal-notice %}
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to build and test a Swift package.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with preinstalled software, and the Ubuntu and macOS runners include the dependencies for building Swift packages. For a full list of up-to-date software and the preinstalled versions of Swift and Xcode, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-software).

## Prerequisites

You should already be familiar with YAML syntax and how it's used with {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions).

We recommend that you have a basic understanding of Swift packages. For more information, see [Swift Packages](https://developer.apple.com/documentation/xcode/swift-packages) in the Apple developer documentation.

## Using a Swift workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a workflow template for Swift that should work for most Swift projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "swift".
1. Filter the selection of workflows by clicking **Continuous integration**.
1. On the "Swift" workflow, click **Configure**.

   {%- ifversion ghes %}

   If you don't find the "Swift" workflow template, copy the following workflow code to a new file called `swift.yml` in the `.github/workflows` directory of your repository.

   ```yaml copy
   name: Swift

   on:
     push:
       branches: [ "main" ]
     pull_request:
       branches: [ "main" ]

   jobs:
     build:
       runs-on: macos-latest

       steps:
         - uses: {% data reusables.actions.action-checkout %}
         - name: Build
           run: swift build -v
         - name: Run tests
           run: swift test -v
   ```

   {%- endif %}
1. Edit the workflow as required. For example, change the branch on which the workflow will run.
1. Click **Commit changes**.

   {% ifversion fpt or ghec %}

   The `swift.yml` workflow file is added to the `.github/workflows` directory of your repository.

   {% endif %}

## Specifying a Swift version

To use a specific preinstalled version of Swift on a {% data variables.product.prodname_dotcom %}-hosted runner, use the `swift-actions/setup-swift` action. This action finds a specific version of Swift from the tools cache on the runner and adds the necessary binaries to `PATH`. These changes will persist for the remainder of a job. For more information, see the [`swift-actions/setup-swift`](https://github.com/marketplace/actions/setup-swift) action.

If you are using a self-hosted runner, you must install your desired Swift versions and add them to `PATH`.

The examples below demonstrate using the `swift-actions/setup-swift` action.

### Using multiple Swift versions

You can configure your job to use multiple versions of Swift in a matrix.

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

{% data reusables.actions.actions-use-sha-pinning-comment %}

name: Swift

on: [push]

jobs:
  build:
    name: {% raw %}Swift ${{ matrix.swift }} on ${{ matrix.os }}{% endraw %}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest]
        swift: ["5.2", "5.3"]
    runs-on: {% raw %}${{ matrix.os }}{% endraw %}
    steps:
      - uses: swift-actions/setup-swift@65540b95f51493d65f5e59e97dcef9629ddf11bf
        with:
          swift-version: {% raw %}${{ matrix.swift }}{% endraw %}
      - uses: {% data reusables.actions.action-checkout %}
      - name: Build
        run: swift build
      - name: Run tests
        run: swift test
```

### Using a single specific Swift version

You can configure your job to use a single specific version of Swift, such as `5.3.3`.

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

steps:
  - uses: swift-actions/setup-swift@65540b95f51493d65f5e59e97dcef9629ddf11bf
    with:
      swift-version: "5.3.3"
  - name: Get swift version
    run: swift --version # Swift 5.3.3
```
## Building and testing your code

You can use the same commands that you use locally to build and test your code using Swift. This example demonstrates how to use `swift build` and `swift test` in a job:

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: swift-actions/setup-swift@65540b95f51493d65f5e59e97dcef9629ddf11bf
    with:
      swift-version: "5.3.3"
  - name: Build
    run: swift build
  - name: Run tests
    run: swift test
```
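If the test suite grows large, a step can also run a subset of tests: `swift test --filter` accepts a regular expression matched against test names. A sketch, where the target name `MyPackageTests` is hypothetical:

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Run a subset of tests
    run: swift test --filter MyPackageTests
```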
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to create a workflow that performs continuous integration (CI) for your Java project using the Maven software project management tool. The workflow you create will allow you to see when commits to a pull request cause build or test failures against your default branch; this approach can help ensure that your code is always healthy. You can extend your CI workflow to cache files and upload artifacts from a workflow run.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with pre-installed software, which includes Java Development Kits (JDKs) and Maven. For a list of software and the pre-installed versions for JDK and Maven, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-software).

## Prerequisites

You should be familiar with YAML and the syntax for {% data variables.product.prodname_actions %}. For more information, see:

* [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions)
* [AUTOTITLE](/actions/learn-github-actions)

We recommend that you have a basic understanding of Java and the Maven framework. For more information, see the [Maven Getting Started Guide](https://maven.apache.org/guides/getting-started/index.html) in the Maven documentation.

{% data reusables.actions.enterprise-setup-prereq %}

## Using a Maven workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a workflow template for Maven that should work for most Java with Maven projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "Java with Maven".
1. On the "Java with Maven" workflow, click **Configure**.

   {%- ifversion ghes %}

   If you don't find the "Java with Maven" workflow template, copy the following workflow code to a new file called `maven.yml` in the `.github/workflows` directory of your repository.

   ```yaml copy
   {% data reusables.actions.actions-not-certified-by-github-comment %}

   name: Java CI with Maven

   on:
     push:
       branches: [ "main" ]
     pull_request:
       branches: [ "main" ]

   jobs:
     build:
       runs-on: ubuntu-latest

       steps:
         - uses: {% data reusables.actions.action-checkout %}
         - name: Set up JDK 17
           uses: {% data reusables.actions.action-setup-java %}
           with:
             java-version: '17'
             distribution: 'temurin'
             cache: maven
         - name: Build with Maven
           run: mvn -B package --file pom.xml

         # Optional: Uploads the full dependency graph to GitHub to improve the quality of Dependabot alerts this repository can receive
         - name: Update dependency graph
           uses: advanced-security/maven-dependency-submission-action@571e99aab1055c2e71a1e2309b9691de18d6b7d6
   ```

   {%- endif %}
1. Edit the workflow as required. For example, change the Java version.
1. Click **Commit changes**.

   {% ifversion fpt or ghec %}

   The `maven.yml` workflow file is added to the `.github/workflows` directory of your repository.

   {% endif %}

{% data reusables.actions.java-jvm-architecture %}

## Building and testing your code

You can use the same commands that you use locally to build and test your code. The workflow template will run the `package` target by default. In the default Maven configuration, this command will download dependencies, build classes, run tests, and package classes into their distributable format, for example, a JAR file.

If you use different commands to build your project, or you want to use a different target, you can specify those. For example, you may want to run the `verify` target that's configured in a `pom-ci.xml` file.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-java %}
    with:
      java-version: '17'
      distribution: 'temurin'
  - name: Run the Maven verify phase
    run: mvn --batch-mode --update-snapshots verify
```

## Caching dependencies

You can cache your dependencies to speed up your workflow runs. After a successful run, your local Maven repository will be stored in a cache. In future workflow runs, the cache will be restored so that dependencies don't need to be downloaded from remote Maven repositories.
You can cache dependencies using the [`setup-java` action](https://github.com/marketplace/actions/setup-java-jdk), or you can use the [`cache` action](https://github.com/actions/cache) for custom and more advanced configuration.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Set up JDK 17
    uses: {% data reusables.actions.action-setup-java %}
    with:
      java-version: '17'
      distribution: 'temurin'
      cache: maven
  - name: Build with Maven
    run: mvn --batch-mode --update-snapshots verify
```

This workflow will save the contents of your local Maven repository, located in the `.m2` directory of the runner's home directory. The cache key will be the hashed contents of `pom.xml`, so changes to `pom.xml` will invalidate the cache.

## Packaging workflow data as artifacts

After your build has succeeded and your tests have passed, you may want to upload the resulting Java packages as a build artifact. This will store the built packages as part of the workflow run, and allow you to download them. Artifacts can help you test and debug pull requests in your local environment before they're merged. For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

Maven will usually create output files like JARs, EARs, or WARs in the `target` directory. To upload those as artifacts, you can copy them into a new directory that contains artifacts to upload. For example, you can create a directory called `staging`. Then you can upload the contents of that directory using the `upload-artifact` action.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-java %}
    with:
      java-version: '17'
      distribution: 'temurin'
  - run: mvn --batch-mode --update-snapshots verify
  - run: mkdir staging && cp target/*.jar staging
  - uses: {% data reusables.actions.action-upload-artifact %}
    with:
      name: Package
      path: staging
```
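A later job in the same workflow can then retrieve the uploaded package with the `download-artifact` action. This is a sketch that assumes the uploading job is named `build` and the artifact keeps the name `Package` from the example above:

```yaml
  download:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: Package
          path: staging
      - run: ls staging
```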
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to build, test, and publish a Go package.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with preinstalled software, which includes the dependencies for Go. For a full list of up-to-date software and the preinstalled versions of Go, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#preinstalled-software).

## Prerequisites

You should already be familiar with YAML syntax and how it's used with {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions).

We recommend that you have a basic understanding of the Go language. For more information, see [Getting started with Go](https://golang.org/doc/tutorial/getting-started).

## Using a Go workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a Go workflow template that should work for most Go projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "go".
1. Filter the selection of workflows by clicking **Continuous integration**.
1. On the "Go - by {% data variables.product.prodname_actions %}" workflow, click **Configure**.

   ![Screenshot of the "Choose a workflow" page. The "Configure" button on the "Go" workflow is highlighted with an orange outline.](/assets/images/help/actions/starter-workflow-go.png)

   {%- ifversion ghes %}

   If you don't find the "Go - by {% data variables.product.prodname_actions %}" workflow template, copy the following workflow code to a new file called `go.yml` in the `.github/workflows` directory of your repository.

   ```yaml copy
   name: Go

   on:
     push:
       branches: [ "main" ]
     pull_request:
       branches: [ "main" ]

   jobs:
     build:
       runs-on: self-hosted
       steps:
         - uses: {% data reusables.actions.action-checkout %}
         - name: Set up Go
           uses: {% data reusables.actions.action-setup-go %}
           with:
             go-version: '1.20'
         - name: Build
           run: go build -v ./...
         - name: Test
           run: go test -v ./...
   ```

   {%- endif %}
1. Edit the workflow as required. For example, change the version of Go.
1. Click **Commit changes**.

   {% ifversion fpt or ghec %}

   The `go.yml` workflow file is added to the `.github/workflows` directory of your repository.

   {% endif %}

## Specifying a Go version

The easiest way to specify a Go version is by using the `setup-go` action provided by {% data variables.product.prodname_dotcom %}. For more information, see the [`setup-go` action](https://github.com/actions/setup-go/).

To use a preinstalled version of Go on a {% data variables.product.prodname_dotcom %}-hosted runner, pass the relevant version to the `go-version` property of the `setup-go` action. This action finds a specific version of Go from the tools cache on each runner, and adds the necessary binaries to `PATH`. These changes will persist for the remainder of the job.

The `setup-go` action is the recommended way of using Go with {% data variables.product.prodname_actions %}, because it helps ensure consistent behavior across different runners and different versions of Go. If you are using a self-hosted runner, you must install Go and add it to `PATH`.

### Using multiple versions of Go

```yaml copy
name: Go

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        go-version: [ '1.19', '1.20', '1.21.x' ]

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup Go {% raw %}${{ matrix.go-version }}{% endraw %}
        uses: {% data reusables.actions.action-setup-go %}
        with:
          go-version: {% raw %}${{ matrix.go-version }}{% endraw %}
      # You can test your matrix by printing the current Go version
      - name: Display Go version
        run: go version
```
version run: go version ``` ### Using a specific Go version You can configure your job to use a specific version of Go, such as `1.20.8`. Alternatively, you can use semantic version syntax to get the latest minor release. This example uses the latest patch release of Go 1.21: ```yaml copy - name: Setup Go 1.21.x uses: {% data reusables.actions.action-setup-go %} with: # Semantic version range syntax or exact version of Go go-version: '1.21.x' ``` ## Installing dependencies You can use `go get` to install dependencies: ```yaml copy steps: - uses: {% data reusables.actions.action-checkout %} - name: Setup Go uses: {% data reusables.actions.action-setup-go %} with: go-version: '1.21.x' - name: Install dependencies run: | go get . go get example.com/octo-examplemodule go get example.com/octo-examplemodule@v1.3.4 ``` ### Caching dependencies You can cache and restore dependencies using the [`setup-go` action](https://github.com/actions/setup-go). By default, caching is enabled when using the `setup-go` action. The `setup-go` action searches for the dependency file, `go.sum`, in the repository root and uses the hash of the dependency file as a part of the cache key. You can use the `cache-dependency-path` parameter for cases when multiple dependency files are used, or when they are located in different subdirectories. ```yaml copy - name: Setup Go uses: {% data reusables.actions.action-setup-go %} with: go-version: '1.17' cache-dependency-path: subdir/go.sum ``` If you have a custom requirement or need finer controls for caching, you can use the [`cache` action](https://github.com/marketplace/actions/cache). For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows). ## Building and testing your code You can use the same commands that you use locally to build and test your code. 
This example workflow demonstrates how to use `go build` and `go test` in a job:

```yaml copy
name: Go
on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup Go
        uses: {% data reusables.actions.action-setup-go %}
        with:
          go-version: '1.21.x'
      - name: Install dependencies
        run: go get .
      - name: Build
        run: go build -v ./...
      - name: Test with the Go CLI
        run: go test
```

## Packaging workflow data as artifacts

After a workflow completes, you can upload the resulting artifacts for analysis. For example, you may need to save log files, core dumps, test results, or screenshots. The following example demonstrates how you can use the `upload-artifact` action to upload test results.

For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

```yaml copy
name: Upload Go test results
on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        go-version: [ '1.19', '1.20', '1.21.x' ]

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup Go
        uses: {% data reusables.actions.action-setup-go %}
        with:
          go-version: {% raw %}${{ matrix.go-version }}{% endraw %}
      - name: Install dependencies
        run: go get .
      - name: Test with Go
        run: go test -json > TestResults-{% raw %}${{ matrix.go-version }}{% endraw %}.json
      - name: Upload Go test results
        uses: {% data reusables.actions.action-upload-artifact %}
        with:
          name: Go-results-{% raw %}${{ matrix.go-version }}{% endraw %}
          path: TestResults-{% raw %}${{ matrix.go-version }}{% endraw %}.json
```
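The uploaded test results can then be retrieved in a later job of the same workflow with the `download-artifact` action. The following is a minimal sketch, not part of the official example: the `report` job name is an assumption, and the artifact name is hard-coded to one matrix version (`1.21.x`); adjust it to the version you want to inspect.

```yaml
  report:
    runs-on: ubuntu-latest
    needs: build
    steps:
      # Download one of the result files uploaded by the build job above
      - name: Download Go test results
        uses: {% data reusables.actions.action-download-artifact %}
        with:
          name: Go-results-1.21.x
      # Print a quick summary of failing tests from the JSON event stream
      - name: List failing tests
        run: grep '"Action":"fail"' TestResults-1.21.x.json || echo "No failures"
```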
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to build, test, and publish a Python package.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with pre-installed software, which includes Python and PyPy. You don't have to install anything! For a full list of up-to-date software and the pre-installed versions of Python and PyPy, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-software).

## Prerequisites

You should be familiar with YAML and the syntax for {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/learn-github-actions).

We recommend that you have a basic understanding of Python, and pip. For more information, see:

* [Getting started with Python](https://www.python.org/about/gettingstarted/)
* [Pip package manager](https://pypi.org/project/pip/)

{% data reusables.actions.enterprise-setup-prereq %}

## Using a Python workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a workflow template for Python that should work if your repository already contains at least one `.py` file. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "Python application".
1. On the "Python application" workflow, click **Configure**.

   {%- ifversion ghes %}

   If you don't find the "Python application" workflow template, copy the following workflow code to a new file called `python-app.yml` in the `.github/workflows` directory of your repository.
   ```yaml copy
   name: Python application

   on:
     push:
       branches: [ "main" ]
     pull_request:
       branches: [ "main" ]

   permissions:
     contents: read

   jobs:
     build:
       runs-on: ubuntu-latest
       steps:
         - uses: {% data reusables.actions.action-checkout %}
         - name: Set up Python 3.13
           uses: {% data reusables.actions.action-setup-python %}
           with:
             python-version: "3.13"
         - name: Install dependencies
           run: |
             python -m pip install --upgrade pip
             pip install ruff pytest
             if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
         - name: Lint and format Python code with ruff
           run: |
             # Lint with the default set of ruff rules with GitHub Annotations
             ruff check --output-format=github --target-version=py39
             # Verify the code is properly formatted
             ruff format --diff --target-version=py39
         - name: Test with pytest
           run: |
             pytest
   ```

   {%- endif %}
1. Edit the workflow as required. For example, change the Python version.
1. Click **Commit changes**.

   {% ifversion fpt or ghec %}
   The `python-app.yml` workflow file is added to the `.github/workflows` directory of your repository.
   {% endif %}

## Specifying a Python version

To use a pre-installed version of Python or PyPy on a {% data variables.product.prodname_dotcom %}-hosted runner, use the `setup-python` action. This action finds a specific version of Python or PyPy from the tools cache on each runner and adds the necessary binaries to `PATH`, which persists for the rest of the job. If a specific version of Python is not pre-installed in the tools cache, the `setup-python` action will download and set up the appropriate version from the [`python-versions`](https://github.com/actions/python-versions) repository.

Using the `setup-python` action is the recommended way of using Python with {% data variables.product.prodname_actions %} because it ensures consistent behavior across different runners and different versions of Python. If you are using a self-hosted runner, you must install Python and add it to `PATH`.
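On a self-hosted runner you can confirm that a suitable interpreter is on `PATH` before relying on it. The following is a minimal sketch, not from the official guide; it assumes a Linux self-hosted runner registered with the default `self-hosted` label.

```yaml
jobs:
  check-python:
    runs-on: self-hosted
    steps:
      # Fails fast if no python3 is on PATH for this runner
      - name: Verify Python is available
        run: |
          which python3
          python3 --version
```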
For more information, see the [`setup-python` action](https://github.com/marketplace/actions/setup-python).

The table below describes the locations for the tools cache in each {% data variables.product.prodname_dotcom %}-hosted runner.

{% rowheaders %}

|| Ubuntu | Mac | Windows |
|------|-------|------|----------|
|**Tool Cache Directory** |`/opt/hostedtoolcache/*`|`/Users/runner/hostedtoolcache/*`|`C:\hostedtoolcache\windows\*`|
|**Python Tool Cache**|`/opt/hostedtoolcache/Python/*`|`/Users/runner/hostedtoolcache/Python/*`|`C:\hostedtoolcache\windows\Python\*`|
|**PyPy Tool Cache**|`/opt/hostedtoolcache/PyPy/*`|`/Users/runner/hostedtoolcache/PyPy/*`|`C:\hostedtoolcache\windows\PyPy\*`|

{% endrowheaders %}

If you are using a self-hosted runner, you can configure the runner to use the `setup-python` action to manage your dependencies. For more information, see [using setup-python with a self-hosted runner](https://github.com/actions/setup-python#using-setup-python-with-a-self-hosted-runner) in the `setup-python` README.

{% data variables.product.prodname_dotcom %} supports semantic versioning syntax. For more information, see [Using semantic versioning](https://docs.npmjs.com/about-semantic-versioning#using-semantic-versioning-to-specify-update-types-your-package-can-accept) and the [Semantic versioning specification](https://semver.org/).

### Using multiple Python versions

The following example uses a matrix for the job to set up multiple Python versions. For more information, see [AUTOTITLE](/actions/using-jobs/using-a-matrix-for-your-jobs).

```yaml copy
name: Python package

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["pypy3.10", "3.9", "3.10", "3.11", "3.12", "3.13"]

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Set up Python {% raw %}${{ matrix.python-version }}{% endraw %}
        uses: {% data reusables.actions.action-setup-python %}
        with:
          python-version: {% raw %}${{ matrix.python-version }}{% endraw %}
      # You can test your matrix by printing the current Python version
      - name: Display Python version
        run: python -c "import sys; print(sys.version)"
```

### Using a specific Python version

You can configure a specific version of Python. For example, 3.12.
Alternatively, you can use semantic version syntax to get the latest minor release. This example uses the latest minor release of Python 3.

```yaml copy
name: Python package

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Set up Python
        # This is the version of the action for setting up Python, not the Python version.
        uses: {% data reusables.actions.action-setup-python %}
        with:
          # Semantic version range syntax or exact version of a Python version
          python-version: '3.x'
          # Optional - x64 or x86 architecture, defaults to x64
          architecture: 'x64'
      # You can test your matrix by printing the current Python version
      - name: Display Python version
        run: python -c "import sys; print(sys.version)"
```

### Excluding a version

If you specify a version of Python that is not available, `setup-python` fails with an error such as: `##[error]Version 3.7 with arch x64 not found`. The error message includes the available versions.

You can also use the `exclude` keyword in your workflow if there is a configuration of Python that you do not wish to run. For more information, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstrategy).

```yaml copy
name: Python package

on: [push]

jobs:
  build:
    runs-on: {% raw %}${{ matrix.os }}{% endraw %}
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
        python-version: ["3.9", "3.11", "3.13", "pypy3.10"]
        exclude:
          - os: macos-latest
            python-version: "3.11"
          - os: windows-latest
            python-version: "3.11"
```

### Using the default Python version

We recommend using `setup-python` to configure the version of Python used in your workflows because it helps make your dependencies explicit. If you don't use `setup-python`, the default version of Python set in `PATH` is used in any shell when you call `python`.
The default version of Python varies between {% data variables.product.prodname_dotcom %}-hosted runners, which may cause unexpected changes or use an older version than expected.

| {% data variables.product.prodname_dotcom %}-hosted runner | Description |
|----|----|
| Ubuntu | Ubuntu runners have multiple versions of system Python installed under `/usr/bin/python` and `/usr/bin/python3`. The Python versions that come packaged with Ubuntu are in addition to the versions that {% data variables.product.prodname_dotcom %} installs in the tools cache. |
| Windows | Excluding the versions of Python that are in the tools cache, Windows does not ship with an equivalent version of system Python. To maintain consistent behavior with other runners and to allow Python to be used out-of-the-box without the `setup-python` action, {% data variables.product.prodname_dotcom %} adds a few versions from the tools cache to `PATH`. |
| macOS | The macOS runners have more than one version of system Python installed, in addition to the versions that are part of the tools cache. The system Python versions are located in the `/usr/local/Cellar/python/*` directory. |

## Installing dependencies

{% data variables.product.prodname_dotcom %}-hosted runners have the pip package manager installed. You can use pip to install dependencies from the PyPI package registry before building and testing your code. For example, the YAML below installs or upgrades the `pip` package installer and the `setuptools` and `wheel` packages.

You can also cache dependencies to speed up your workflow. For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Set up Python
    uses: {% data reusables.actions.action-setup-python %}
    with:
      python-version: '3.x'
  - name: Install dependencies
    run: python -m pip install --upgrade pip setuptools wheel
```

### Requirements file

After you update `pip`, a typical next step is to install dependencies from `requirements.txt`. For more information, see [pip](https://pip.pypa.io/en/stable/cli/pip_install/#example-requirements-file).

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Set up Python
    uses: {% data reusables.actions.action-setup-python %}
    with:
      python-version: '3.x'
  - name: Install dependencies
    run: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
```

### Caching Dependencies

You can cache and restore the dependencies using the [`setup-python` action](https://github.com/actions/setup-python).
The following example caches dependencies for pip.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-python %}
    with:
      python-version: '3.12'
      cache: 'pip'
  - run: pip install -r requirements.txt
  - run: pytest
```

By default, the `setup-python` action searches for the dependency file (`requirements.txt` for pip, `Pipfile.lock` for pipenv or `poetry.lock` for poetry) in the whole repository. For more information, see [Caching packages dependencies](https://github.com/actions/setup-python#caching-packages-dependencies) in the `setup-python` README.

If you have a custom requirement or need finer controls for caching, you can use the [`cache` action](https://github.com/marketplace/actions/cache). Pip caches dependencies in different locations, depending on the operating system of the runner. The path you'll need to cache may differ from the Ubuntu example above, depending on the operating system you use. For more information, see [Python caching examples](https://github.com/actions/cache/blob/main/examples.md#python---pip) in the `cache` action repository.

## Testing your code

You can use the same commands that you use locally to build and test your code.

### Testing with pytest and pytest-cov

This example installs or upgrades `pytest` and `pytest-cov`. Tests are then run and output in JUnit format while code coverage results are output in Cobertura. For more information, see [JUnit](https://junit.org/junit5/) and [Cobertura](https://cobertura.github.io/cobertura/).
```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Set up Python
    uses: {% data reusables.actions.action-setup-python %}
    with:
      python-version: '3.x'
  - name: Install dependencies
    run: |
      python -m pip install --upgrade pip
      pip install -r requirements.txt
  - name: Test with pytest
    run: |
      pip install pytest pytest-cov
      pytest tests.py --doctest-modules --junitxml=junit/test-results.xml --cov=com --cov-report=xml --cov-report=html
```

### Using Ruff to lint and/or format code

The following example installs or upgrades `ruff` and uses it to lint all files. For more information, see [Ruff](https://docs.astral.sh/ruff).

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Set up Python
    uses: {% data reusables.actions.action-setup-python %}
    with:
      python-version: '3.x'
  - name: Install the code linting and formatting tool Ruff
    run: pipx install ruff
  - name: Lint code with Ruff
    run: ruff check --output-format=github --target-version=py39
  - name: Check code formatting with Ruff
    run: ruff format --diff --target-version=py39
    continue-on-error: true
```

The formatting step has `continue-on-error: true` set. This will keep the workflow from failing if the formatting step doesn't succeed. Once you've addressed all of the formatting errors, you can remove this option so the workflow will catch new issues.

### Running tests with tox

With {% data variables.product.prodname_actions %}, you can run tests with tox and spread the work across multiple jobs. You'll need to invoke tox using the `-e py` option to choose the version of Python in your `PATH`, rather than specifying a specific version. For more information, see [tox](https://tox.readthedocs.io/en/latest/).

```yaml copy
name: Python package

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python: ["3.9", "3.11", "3.13"]

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup Python
        uses: {% data reusables.actions.action-setup-python %}
        with:
          python-version: {% raw %}${{ matrix.python }}{% endraw %}
      - name: Install tox and any other packages
        run: pip install tox
      - name: Run tox
        # Run tox using the version of Python in `PATH`
        run: tox -e py
```

## Packaging workflow data as artifacts

You can upload artifacts to view after a workflow completes. For example, you may need to save log files, core dumps, test results, or screenshots. For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

The following example demonstrates how you can use the `upload-artifact` action to archive test results from running `pytest`. For more information, see the [`upload-artifact` action](https://github.com/actions/upload-artifact).
```yaml copy
name: Python package

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.9", "3.10", "3.11", "3.12", "3.13"]

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup Python # Set Python version
        uses: {% data reusables.actions.action-setup-python %}
        with:
          python-version: {% raw %}${{ matrix.python-version }}{% endraw %}
      # Install pip and pytest
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install pytest
      - name: Test with pytest
        run: pytest tests.py --doctest-modules {% raw %}--junitxml=junit/test-results-${{ matrix.python-version }}.xml{% endraw %}
      - name: Upload pytest test results
        uses: {% data reusables.actions.action-upload-artifact %}
        with:
          name: {% raw %}pytest-results-${{ matrix.python-version }}{% endraw %}
          path: {% raw %}junit/test-results-${{ matrix.python-version }}.xml{% endraw %}
        # Use always() to always run this step to publish test results when there are test failures
        if: {% raw %}${{ always() }}{% endraw %}
```

## Publishing to PyPI

You can configure your workflow to publish your Python package to PyPI once your CI tests pass. This section demonstrates how you can use {% data variables.product.prodname_actions %} to upload your package to PyPI each time you publish a release. For more information, see [AUTOTITLE](/repositories/releasing-projects-on-github/managing-releases-in-a-repository).

The example workflow below uses [Trusted Publishing](https://docs.pypi.org/trusted-publishers/) to authenticate with PyPI, eliminating the need for a manually configured API token.
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

{% data reusables.actions.actions-use-sha-pinning-comment %}

name: Upload Python Package

on:
  release:
    types: [published]

permissions:
  contents: read

jobs:
  release-build:
    runs-on: ubuntu-latest

    steps:
      - uses: {% data reusables.actions.action-checkout %}

      - uses: {% data reusables.actions.action-setup-python %}
        with:
          python-version: "3.x"

      - name: Build release distributions
        run: |
          # NOTE: put your own distribution build steps here.
          python -m pip install build
          python -m build

      - name: Upload distributions
        uses: {% data reusables.actions.action-upload-artifact %}
        with:
          name: release-dists
          path: dist/

  pypi-publish:
    runs-on: ubuntu-latest
    needs:
      - release-build
    permissions:
      # IMPORTANT: this permission is mandatory for trusted publishing
      id-token: write

    # Dedicated environments with protections for publishing are strongly recommended.
    environment:
      name: pypi
      # OPTIONAL: uncomment and update to include your PyPI project URL in the deployment status:
      # url: https://pypi.org/p/YOURPROJECT

    steps:
      - name: Retrieve release distributions
        uses: {% data reusables.actions.action-download-artifact %}
        with:
          name: release-dists
          path: dist/

      - name: Publish release distributions to PyPI
        uses: pypa/gh-action-pypi-publish@6f7e8d9c0b1a2c3d4e5f6a7b8c9d0e1f2a3b4c5d
```

{% ifversion not ghes %}
For more information about this workflow, including the PyPI settings needed, see [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/configuring-openid-connect-in-pypi).
{% endif %}
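One optional hardening step is to validate the built distributions before they are published. The following step is a sketch, not part of the official workflow; it assumes `twine` is acceptable as an extra build-time dependency, and it would slot into the `release-build` job after `python -m build`.

```yaml
      # Sanity-check the metadata and long description of the built sdist/wheel
      - name: Check distribution metadata
        run: |
          python -m pip install twine
          python -m twine check dist/*
```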
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to build, test, and publish a Rust package.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with preinstalled software, which includes the dependencies for Rust. For a full list of up-to-date software and the preinstalled versions of Rust, see [AUTOTITLE](/actions/using-github-hosted-runners/using-github-hosted-runners/about-github-hosted-runners#preinstalled-software).

## Prerequisites

You should already be familiar with YAML syntax and how it's used with {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions).

We recommend that you have a basic understanding of the Rust language. For more information, see [Getting started with Rust](https://www.rust-lang.org/learn).

## Using a Rust workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a Rust workflow template that should work for most basic Rust projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "Rust".
1. Filter the selection of workflows by clicking **Continuous integration**.
1. On the "Rust - by {% data variables.product.prodname_actions %}" workflow, click **Configure**.

   ![Screenshot of the "Choose a workflow" page.
The "Configure" button on the "Rust" workflow is highlighted with an orange outline.](/assets/images/help/actions/starter-workflow-rust.png)

   {%- ifversion ghes %}

   If you don't find the "Rust - by {% data variables.product.prodname_actions %}" workflow template, copy the following workflow code to a new file called `rust.yml` in the `.github/workflows` directory of your repository.

   ```yaml copy
   name: Rust

   on:
     push:
       branches: [ "main" ]
     pull_request:
       branches: [ "main" ]

   env:
     CARGO_TERM_COLOR: always

   jobs:
     build:
       runs-on: ubuntu-latest
       steps:
         - uses: {% data reusables.actions.action-checkout %}
         - name: Build
           run: cargo build --verbose
         - name: Run tests
           run: cargo test --verbose
   ```

   {%- endif %}
1. Edit the workflow as required. For example, change the version of Rust.
1. Click **Commit changes**.

   {% ifversion fpt or ghec %}
   The `rust.yml` workflow file is added to the `.github/workflows` directory of your repository.
   {% endif %}

## Specifying a Rust version

{% data variables.product.prodname_dotcom %}-hosted runners include a recent version of the Rust toolchain. You can use rustup to report on the version installed on a runner, override the version, and install different toolchains. For more information, see [The rustup book](https://rust-lang.github.io/rustup/).

This example shows steps you could use to set up your runner environment to use the nightly build of Rust and to report the version.

```yaml copy
- name: Temporarily modify the rust toolchain version
  run: rustup override set nightly
- name: Output rust version for educational purposes
  run: rustup --version
```

### Caching dependencies

You can cache and restore dependencies using the Cache action. This example assumes that your repository contains a `Cargo.lock` file.
```yaml copy
- name: Cache
  uses: {% data reusables.actions.action-cache %}
  with:
    path: |
      ~/.cargo/registry
      ~/.cargo/git
      target
    key: {% raw %}${{ runner.os }}-cargo-${{ hashFiles('**/Cargo.lock') }}{% endraw %}
```

If you have custom requirements or need finer controls for caching, you should explore other configuration options for the [`cache` action](https://github.com/marketplace/actions/cache). For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

## Building and testing your code

You can use the same commands that you use locally to build and test your code. This example workflow demonstrates how to use `cargo build` and `cargo test` in a job:

```yaml copy
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        BUILD_TARGET: [release] # refers to a cargo profile
    outputs:
      release_built: {% raw %}${{ steps.set-output.outputs.release_built }}{% endraw %}
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Build binaries in "{% raw %}${{ matrix.BUILD_TARGET }}{% endraw %}" mode
        run: cargo build --profile {% raw %}${{ matrix.BUILD_TARGET }}{% endraw %}
      - name: Run tests in "{% raw %}${{ matrix.BUILD_TARGET }}{% endraw %}" mode
        run: cargo test --profile {% raw %}${{ matrix.BUILD_TARGET }}{% endraw %}
```

The `release` keyword used in this example corresponds to a cargo profile. You can use any [profile](https://doc.rust-lang.org/cargo/reference/profiles.html) you have defined in your `Cargo.toml` file.

## Publishing your package or library to crates.io

Once you have set up your workflow to build and test your code, you can use a secret to log in to [crates.io](https://crates.io/) and publish your package.

```yaml copy
- name: Login into crates.io
  run: cargo login {% raw %}${{ secrets.CRATES_IO }}{% endraw %}
- name: Build binaries in "release" mode
  run: cargo build -r
- name: "Package for crates.io"
  run: cargo package # publishes a package as a tarball
- name: "Publish to crates.io"
  run: cargo publish # publishes your crate as a library that can be added as a dependency
```

If there are any errors building and packaging the crate, check the metadata in your manifest, the `Cargo.toml` file; see [The Manifest Format](https://doc.rust-lang.org/cargo/reference/manifest.html). You should also check your `Cargo.lock` file; see [Cargo.toml vs Cargo.lock](https://doc.rust-lang.org/cargo/guide/cargo-toml-vs-cargo-lock.html).

## Packaging workflow data as artifacts

After a workflow completes, you can upload the resulting artifacts for analysis or to use in another workflow. You could add these example steps to the workflow to upload an application for use by another workflow.
```yaml copy
- name: Upload release artifact
  uses: {% data reusables.actions.action-upload-artifact %}
  with:
    name: {% raw %}{% endraw %}
    path: {% raw %}target/${{ matrix.BUILD_TARGET }}/{% endraw %}
```

To use the uploaded artifact in a different job, ensure your workflows have the right permissions for the repository, see [AUTOTITLE](/actions/security-for-github-actions/security-guides/automatic-token-authentication). You could use these example steps to download the app created in the previous workflow and publish it on {% data variables.product.github %}.

```yaml copy
- uses: {% data reusables.actions.action-checkout %}
- name: Download release artifact
  uses: {% data reusables.actions.action-download-artifact %}
  with:
    name: {% raw %}{% endraw %}
    path: ./{% raw %}{% endraw %}
- name: Publish built binary to {% data variables.product.github %} releases
  run: |
    gh release create --generate-notes ./{% raw %}/#{% endraw %}
```
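Before the real `cargo publish` step shown earlier, it can be useful to rehearse the release. The following step is a sketch, not part of the official guide; `cargo publish --dry-run` performs the packaging and verification steps but skips the upload to crates.io.

```yaml
- name: Rehearse the release without uploading
  # --dry-run packages and verifies the crate but does not upload it to crates.io
  run: cargo publish --dry-run
```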
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to create a workflow that performs continuous integration (CI) for your Java project using the Gradle build system. The workflow you create will allow you to see when commits to a pull request cause build or test failures against your default branch; this approach can help ensure that your code is always healthy. You can extend your CI workflow to cache files and upload artifacts from a workflow run.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with pre-installed software, which includes Java Development Kits (JDKs) and Gradle. For a list of software and the pre-installed versions for JDK and Gradle, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-software).

## Prerequisites

You should be familiar with YAML and the syntax for {% data variables.product.prodname_actions %}. For more information, see:

* [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions)
* [AUTOTITLE](/actions/learn-github-actions)

We recommend that you have a basic understanding of Java and the Gradle framework. For more information, see the [Gradle User Manual](https://docs.gradle.org/current/userguide/userguide.html).

{% data reusables.actions.enterprise-setup-prereq %}

## Using a Gradle workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a workflow template for Gradle that should work for most Java with Gradle projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "Java with Gradle".
1.
On the "Java with Gradle" workflow, click **Configure**.

{%- ifversion ghes %}

   If you don't find the "Java with Gradle" workflow template, copy the following workflow code to a new file called `gradle.yml` in the `.github/workflows` directory of your repository.

   ```yaml copy
   {% data reusables.actions.actions-not-certified-by-github-comment %}

   name: Java CI with Gradle

   on:
     push:
       branches: [ "main" ]
     pull_request:
       branches: [ "main" ]

   permissions:
     contents: read

   jobs:
     build:
       runs-on: ubuntu-latest

       steps:
       - uses: {% data reusables.actions.action-checkout %}
       - name: Set up JDK 17
         uses: {% data reusables.actions.action-setup-java %}
         with:
           java-version: '17'
           distribution: 'temurin'
       - name: Setup Gradle
         uses: gradle/actions/setup-gradle@017a9effdb900e5b5b2fddfb590a105619dca3c3 # v4.4.2
       - name: Build with Gradle
         run: ./gradlew build
   ```

{%- endif %}

{% data reusables.actions.gradle-workflow-steps %}
1. The "Build with Gradle" step executes the `build` task using the [Gradle Wrapper](https://docs.gradle.org/current/userguide/gradle_wrapper.html).
1. Edit the workflow as required. For example, change the Java version.

   {% indented_data_reference reusables.actions.third-party-actions spaces=3 %}
1. Click **Commit changes**.

{% ifversion fpt or ghec %} The `gradle.yml` workflow file is added to the `.github/workflows` directory of your repository. {% endif %}

{% data reusables.actions.java-jvm-architecture %}

## Building and testing your code

You can use the same commands that you use locally to build and test your code. The workflow template will run the `build` task by default. In the default Gradle configuration, this command will download dependencies, build classes, run tests, and package classes into their distributable format, for example, a JAR file.

If you use different commands to build your project, or you want to use a different task, you can specify those. For example, you may want to run the `package` task that's configured in your `ci.gradle` file.
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-java %}
    with:
      java-version: '17'
      distribution: 'temurin'
  - name: Setup Gradle
    uses: gradle/actions/setup-gradle@017a9effdb900e5b5b2fddfb590a105619dca3c3 # v4.4.2
  - name: Build with Gradle
    run: ./gradlew -b ci.gradle package
```
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/java-with-gradle.md
## Caching dependencies

Your build dependencies can be cached to speed up your workflow runs. After a successful run, `gradle/actions/setup-gradle` caches important parts of the Gradle user home directory. In future jobs, the cache will be restored so that build scripts won't need to be recompiled and dependencies won't need to be downloaded from remote package repositories.

Caching is enabled by default when using the `gradle/actions/setup-gradle` action. For more information, see [`gradle/actions/setup-gradle`](https://github.com/gradle/actions/blob/main/setup-gradle/README.md#caching-build-state-between-jobs).

## Packaging workflow data as artifacts

After your build has succeeded and your tests have passed, you may want to upload the resulting Java packages as a build artifact. This will store the built packages as part of the workflow run, and allow you to download them. Artifacts can help you test and debug pull requests in your local environment before they're merged. For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

Gradle will usually create output files like JARs, EARs, or WARs in the `build/libs` directory. You can upload the contents of that directory using the `upload-artifact` action.

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-java %}
    with:
      java-version: '17'
      distribution: 'temurin'
  - name: Setup Gradle
    uses: gradle/actions/setup-gradle@017a9effdb900e5b5b2fddfb590a105619dca3c3 # v4.4.2
  - name: Build with Gradle
    run: ./gradlew build
  - name: Upload build artifacts
    uses: {% data reusables.actions.action-upload-artifact %}
    with:
      name: Package
      path: build/libs
```
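The caching behavior described above can also be tuned per job. For example, jobs that should consume the cache without writing updates back to it can set the action's `cache-read-only` input. Treat this sketch as an assumption and check the setup-gradle README for the current input names:

```yaml
steps:
  - name: Setup Gradle (cache read-only)
    uses: gradle/actions/setup-gradle@017a9effdb900e5b5b2fddfb590a105619dca3c3 # v4.4.2
    with:
      # Restore the Gradle user home cache, but don't save updates after the job
      cache-read-only: true
```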
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/java-with-gradle.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to create a continuous integration (CI) workflow that builds and tests a Ruby application. If your CI tests pass, you may want to deploy your code or publish a gem.

## Prerequisites

We recommend that you have a basic understanding of Ruby, YAML, workflow configuration options, and how to create a workflow file. For more information, see:

* [Learn {% data variables.product.prodname_actions %}](/actions/learn-github-actions)
* [Ruby in 20 minutes](https://www.ruby-lang.org/en/documentation/quickstart/)

## Using a Ruby workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a workflow template for Ruby that should work for most Ruby projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "ruby".
1. Filter the selection of workflows by clicking **Continuous integration**.
1. On the "Ruby" workflow, click **Configure**.

{%- ifversion ghes %}

   If you don't find the "Ruby" workflow template, copy the following workflow code to a new file called `ruby.yml` in the `.github/workflows` directory of your repository.
   ```yaml copy
   {% data reusables.actions.actions-not-certified-by-github-comment %}

   name: Ruby

   on:
     push:
       branches: [ "main" ]
     pull_request:
       branches: [ "main" ]

   permissions:
     contents: read

   jobs:
     test:
       runs-on: ubuntu-latest
       strategy:
         matrix:
           ruby-version: ['2.6', '2.7', '3.0']

       steps:
       - uses: {% data reusables.actions.action-checkout %}
       - name: Set up Ruby
       # To automatically get bug fixes and new Ruby versions for ruby/setup-ruby,
       # change this to (see https://github.com/ruby/setup-ruby#versioning):
       # uses: ruby/setup-ruby@v1
         uses: ruby/setup-ruby@55283cc23133118229fd3f97f9336ee23a179fcf # v1.146.0
         with:
           ruby-version: {% raw %}${{ matrix.ruby-version }}{% endraw %}
           bundler-cache: true # runs 'bundle install' and caches installed gems automatically
       - name: Run tests
         run: bundle exec rake
   ```

{%- endif %}

1. Edit the workflow as required. For example, change the Ruby versions you want to use.

   {% indented_data_reference reusables.actions.third-party-actions spaces=3 %}
1. Click **Commit changes**.

{% ifversion fpt or ghec %} The `ruby.yml` workflow file is added to the `.github/workflows` directory of your repository. {% endif %}

## Specifying the Ruby version

The easiest way to specify a Ruby version is by using the `ruby/setup-ruby` action provided by the Ruby organization on GitHub. The action adds any supported Ruby version to `PATH` for each job run in a workflow. For more information and available Ruby versions, see [`ruby/setup-ruby`](https://github.com/ruby/setup-ruby).

Using Ruby's `ruby/setup-ruby` action is the recommended way of using Ruby with GitHub Actions because it ensures consistent behavior across different runners and different versions of Ruby. The `setup-ruby` action takes a Ruby version as an input and configures that version on the runner.
```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}

steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
  with:
    ruby-version: '3.1' # Not needed with a .ruby-version file
- run: bundle install
- run: bundle exec rake
```

Alternatively, you can check a `.ruby-version` file into the root of your repository and `setup-ruby` will use the version defined in that file.

## Testing with multiple versions of Ruby

You can add a matrix strategy to run your workflow with more than one version of Ruby. For example, you can test your code against the latest patch releases of versions 3.1, 3.0, and 2.7.

{% raw %}

```yaml
strategy:
  matrix:
    ruby-version: ['3.1', '3.0', '2.7']
```

{% endraw %}

Each version of Ruby specified in the `ruby-version` array creates a job that runs the same steps. The {% raw %}`${{ matrix.ruby-version }}`{% endraw %} context is used to access the current job's version. For more information about matrix strategies and contexts, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions) and [AUTOTITLE](/actions/learn-github-actions/contexts).
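As noted above, when the repository contains a `.ruby-version` file you can omit the `ruby-version` input entirely. A minimal sketch (the action SHA matches the examples above; `bundler-cache` is the input shown in the workflow template earlier):

```yaml
steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
  # No ruby-version input: setup-ruby reads the version from .ruby-version
  with:
    bundler-cache: true
- run: bundle exec rake
```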
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/ruby.md
The full updated workflow with a matrix strategy could look like this:

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}
name: Ruby CI

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  test:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        ruby-version: ['3.1', '3.0', '2.7']
    steps:
    - uses: {% data reusables.actions.action-checkout %}
    - name: {% raw %}Set up Ruby ${{ matrix.ruby-version }}{% endraw %}
      uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
      with:
        ruby-version: {% raw %}${{ matrix.ruby-version }}{% endraw %}
    - name: Install dependencies
      run: bundle install
    - name: Run tests
      run: bundle exec rake
```

## Installing dependencies with Bundler

The `setup-ruby` action will automatically install Bundler for you. The version is determined by your `Gemfile.lock` file. If no version is present in your lockfile, then the latest compatible version will be installed.

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}

steps:
- uses: {% data reusables.actions.action-checkout %}
- uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
  with:
    ruby-version: '3.1'
- run: bundle install
```

### Caching dependencies

The `setup-ruby` action provides a method to automatically handle the caching of your gems between runs. To enable caching, set the following.
{% raw %}

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}

steps:
- uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
  with:
    bundler-cache: true
```

{% endraw %}

This will configure Bundler to install your gems to `vendor/cache`. For each successful run of your workflow, this folder will be cached by {% data variables.product.prodname_actions %} and re-downloaded for subsequent workflow runs. A hash of your `Gemfile.lock` and the Ruby version are used as the cache key. If you install any new gems, or change a version, the cache will be invalidated and Bundler will do a fresh install.

**Caching without setup-ruby**

For greater control over caching, you can use the `actions/cache` action directly. For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

```yaml
steps:
- uses: {% data reusables.actions.action-cache %}
  with:
    path: vendor/bundle
    key: {% raw %}${{ runner.os }}-gems-${{ hashFiles('**/Gemfile.lock') }}{% endraw %}
    restore-keys: |
      {% raw %}${{ runner.os }}-gems-{% endraw %}
- name: Bundle install
  run: |
    bundle config path vendor/bundle
    bundle install --jobs 4 --retry 3
```

If you're using a matrix build, you will want to include the matrix variables in your cache key.
For example, if you have a matrix strategy for different Ruby versions (`matrix.ruby-version`) and different operating systems (`matrix.os`), your workflow steps might look like this:

```yaml
steps:
- uses: {% data reusables.actions.action-cache %}
  with:
    path: vendor/bundle
    key: {% raw %}bundle-use-ruby-${{ matrix.os }}-${{ matrix.ruby-version }}-${{ hashFiles('**/Gemfile.lock') }}{% endraw %}
    restore-keys: |
      {% raw %}bundle-use-ruby-${{ matrix.os }}-${{ matrix.ruby-version }}-{% endraw %}
- name: Bundle install
  run: |
    bundle config path vendor/bundle
    bundle install --jobs 4 --retry 3
```

## Matrix testing your code

The following example matrix tests all stable releases and head versions of MRI, JRuby, and TruffleRuby on Ubuntu and macOS.

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}
name: Matrix Testing

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  test:
    runs-on: {% raw %}${{ matrix.os }}-latest{% endraw %}
    strategy:
      fail-fast: false
      matrix:
        os: [ubuntu, macos]
        ruby: [2.5, 2.6, 2.7, head, debug, jruby, jruby-head, truffleruby, truffleruby-head]
    continue-on-error: {% raw %}${{ endsWith(matrix.ruby, 'head') || matrix.ruby == 'debug' }}{% endraw %}
    steps:
    - uses: {% data reusables.actions.action-checkout %}
    - uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
      with:
        ruby-version: {% raw %}${{ matrix.ruby }}{% endraw %}
    - run: bundle install
    - run: bundle exec rake
```
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/ruby.md
## Linting your code

The following example installs `rubocop` and uses it to lint all files. For more information, see [RuboCop](https://github.com/rubocop-hq/rubocop). You can [configure RuboCop](https://docs.rubocop.org/rubocop/configuration.html) to decide on the specific linting rules.

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}
name: Linting

on: [push]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
    - uses: {% data reusables.actions.action-checkout %}
    - uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
      with:
        ruby-version: '2.6'
    - run: bundle install
    - name: Rubocop
      run: rubocop -f github
```

Specifying `-f github` means that the RuboCop output will be in {% data variables.product.prodname_dotcom %}'s annotation format. Any linting errors will show inline in the **Files changed** tab of the pull request that introduces them.

## Publishing Gems

You can configure your workflow to publish your Ruby package to any package registry you'd like when your CI tests pass. You can store any access tokens or credentials needed to publish your package using repository secrets. The following example creates and publishes a package to `GitHub Package Registry` and `RubyGems`.

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}
{% data reusables.actions.actions-use-sha-pinning-comment %}
name: Ruby Gem

on:
  # Manually publish
  workflow_dispatch:
  # Alternatively, publish whenever changes are merged to the `main` branch.
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:
  build:
    name: Build + Publish
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read

    steps:
    - uses: {% data reusables.actions.action-checkout %}
    - name: Set up Ruby 2.6
      uses: ruby/setup-ruby@ec02537da5712d66d4d50a0f33b7eb52773b5ed1
      with:
        ruby-version: '2.6'
    - run: bundle install

    - name: Publish to GPR
      run: |{% raw %}
        mkdir -p $HOME/.gem
        touch $HOME/.gem/credentials
        chmod 0600 $HOME/.gem/credentials
        printf -- "---\n:github: ${GEM_HOST_API_KEY}\n" > $HOME/.gem/credentials
        gem build *.gemspec
        gem push --KEY github --host https://rubygems.pkg.github.com/${OWNER} *.gem
      env:
        GEM_HOST_API_KEY: "Bearer ${{secrets.GITHUB_TOKEN}}"
        OWNER: ${{ github.repository_owner }}

    - name: Publish to RubyGems
      run: |
        mkdir -p $HOME/.gem
        touch $HOME/.gem/credentials
        chmod 0600 $HOME/.gem/credentials
        printf -- "---\n:rubygems_api_key: ${GEM_HOST_API_KEY}\n" > $HOME/.gem/credentials
        gem build *.gemspec
        gem push *.gem
      env:
        GEM_HOST_API_KEY: "${{secrets.RUBYGEMS_AUTH_TOKEN}}"{% endraw %}
```
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/ruby.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to create a workflow that performs continuous integration (CI) for your Xamarin project. The workflow you create will allow you to see when commits to a pull request cause build or test failures against your default branch; this approach can help ensure that your code is always healthy.

For a full list of available Xamarin SDK versions on the {% data variables.product.prodname_actions %}-hosted macOS runners, see the README file for the version of macOS you want to use in the [{% data variables.product.prodname_actions %} Runner Images repository](https://github.com/actions/runner-images/tree/main/images/macos).

## Prerequisites

We recommend that you have a basic understanding of Xamarin, .NET Core SDK, YAML, workflow configuration options, and how to create a workflow file. For more information, see:

* [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions)
* [Getting started with .NET](https://dotnet.microsoft.com/learn)
* [Learn Xamarin](https://dotnet.microsoft.com/learn/xamarin)

{% ifversion ghec %} To use the examples in the guide, you will need a repository on {% data variables.product.prodname_dotcom_the_website %}. {% data reusables.actions.macos-unavailable-ghecom %} {% endif %}

## Building Xamarin.iOS apps

The example below demonstrates how to change the default Xamarin SDK versions and build a Xamarin.iOS application.
```yaml
name: Build Xamarin.iOS app

on: [push]

jobs:
  build:
    runs-on: macos-latest

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Set default Xamarin SDK versions
        run: |
          $VM_ASSETS/select-xamarin-sdk-v2.sh --mono=6.12 --ios=14.10
      - name: Set default Xcode 12.3
        run: |
          XCODE_ROOT=/Applications/Xcode_12.3.0.app
          echo "MD_APPLE_SDK_ROOT=$XCODE_ROOT" >> $GITHUB_ENV
          sudo xcode-select -s $XCODE_ROOT
      - name: Setup .NET Core SDK 5.0.x
        uses: {% data reusables.actions.action-setup-dotnet %}
        with:
          dotnet-version: '5.0.x'
      - name: Install dependencies
        run: nuget restore
      - name: Build
        run: msbuild /p:Configuration=Debug /p:Platform=iPhoneSimulator /t:Rebuild
```

## Building Xamarin.Android apps

The example below demonstrates how to change default Xamarin SDK versions and build a Xamarin.Android application.

```yaml
name: Build Xamarin.Android app

on: [push]

jobs:
  build:
    runs-on: macos-latest

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Set default Xamarin SDK versions
        run: |
          $VM_ASSETS/select-xamarin-sdk-v2.sh --mono=6.10 --android=10.2
      - name: Setup .NET Core SDK 5.0.x
        uses: {% data reusables.actions.action-setup-dotnet %}
        with:
          dotnet-version: '5.0.x'
      - name: Install dependencies
        run: nuget restore
      - name: Build
        run: msbuild /t:PackageForAndroid /p:Configuration=Debug
```

## Specifying a .NET version

To use a preinstalled version of the .NET Core SDK on a {% data variables.product.prodname_dotcom %}-hosted runner, use the `setup-dotnet` action. This action finds a specific version of .NET from the tools cache on each runner, and adds the necessary binaries to `PATH`. These changes will persist for the remainder of the job.

The `setup-dotnet` action is the recommended way of using .NET with {% data variables.product.prodname_actions %}, because it ensures consistent behavior across different runners and different versions of .NET. If you are using a self-hosted runner, you must install .NET and add it to `PATH`.
For more information, see the [`setup-dotnet`](https://github.com/marketplace/actions/setup-net-core-sdk) action.
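Like the Ruby and Gradle matrix examples elsewhere in these guides, `setup-dotnet` can be combined with a matrix strategy to validate a build against several .NET SDK versions. A minimal sketch (the version list is illustrative, not a recommendation):

```yaml
jobs:
  build:
    runs-on: macos-latest
    strategy:
      matrix:
        dotnet-version: ['3.1.x', '5.0.x']
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup .NET Core SDK {% raw %}${{ matrix.dotnet-version }}{% endraw %}
        uses: {% data reusables.actions.action-setup-dotnet %}
        with:
          dotnet-version: {% raw %}${{ matrix.dotnet-version }}{% endraw %}
      - name: Install dependencies
        run: nuget restore
```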
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/xamarin-apps.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to create a workflow that performs continuous integration (CI) for your Java project using the Ant build system. The workflow you create will allow you to see when commits to a pull request cause build or test failures against your default branch; this approach can help ensure that your code is always healthy. You can extend your CI workflow to upload artifacts from a workflow run.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with pre-installed software, which includes Java Development Kits (JDKs) and Ant. For a list of software and the pre-installed versions for JDK and Ant, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-software).

## Prerequisites

You should be familiar with YAML and the syntax for {% data variables.product.prodname_actions %}. For more information, see:

* [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions)
* [AUTOTITLE](/actions/learn-github-actions)

We recommend that you have a basic understanding of Java and the Ant framework. For more information, see the [Apache Ant Manual](https://ant.apache.org/manual/).

{% data reusables.actions.enterprise-setup-prereq %}

## Using an Ant workflow template

{% data reusables.actions.workflow-templates-get-started %}

{% data variables.product.prodname_dotcom %} provides a workflow template for Ant that should work for most Java with Ant projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "Java with Ant".
1. On the "Java with Ant" workflow, click **Configure**.
{%- ifversion ghes %}

   If you don't find the "Java with Ant" workflow template, copy the following workflow code to a new file called `ant.yml` in the `.github/workflows` directory of your repository.

   ```yaml copy
   name: Java CI

   on:
     push:
       branches: [ $default-branch ]
     pull_request:
       branches: [ $default-branch ]

   jobs:
     build:
       runs-on: ubuntu-latest

       steps:
       - uses: {% data reusables.actions.action-checkout %}
       - name: Set up JDK 11
         uses: {% data reusables.actions.action-setup-java %}
         with:
           java-version: '11'
           distribution: 'temurin'
       - name: Build with Ant
         run: ant -noinput -buildfile build.xml
   ```

{%- endif %}

1. Edit the workflow as required. For example, change the Java version.
1. Click **Commit changes**.

{% ifversion fpt or ghec %} The `ant.yml` workflow file is added to the `.github/workflows` directory of your repository. {% endif %}

{% data reusables.actions.java-jvm-architecture %}

## Building and testing your code

You can use the same commands that you use locally to build and test your code. The workflow template will run the default target specified in your `build.xml` file. Your default target will commonly be set to build classes, run tests, and package classes into their distributable format, for example, a JAR file.

If you use different commands to build your project, or you want to run a different target, you can specify those. For example, you may want to run the `jar` target that's configured in your `build-ci.xml` file.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-java %}
    with:
      java-version: '17'
      distribution: 'temurin'
  - name: Run the Ant jar target
    run: ant -noinput -buildfile build-ci.xml jar
```

## Packaging workflow data as artifacts

After your build has succeeded and your tests have passed, you may want to upload the resulting Java packages as a build artifact. This will store the built packages as part of the workflow run, and allow you to download them.
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/java-with-ant.md
Artifacts can help you test and debug pull requests in your local environment before they're merged. For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

Ant will usually create output files like JARs, EARs, or WARs in the `build/jar` directory. You can upload the contents of that directory using the `upload-artifact` action.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-java %}
    with:
      java-version: '17'
      distribution: 'temurin'
  - run: ant -noinput -buildfile build.xml
  - uses: {% data reusables.actions.action-upload-artifact %}
    with:
      name: Package
      path: build/jar
```
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/java-with-ant.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to use PowerShell for CI. It describes how to use Pester, install dependencies, test your module, and publish to the PowerShell Gallery.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with pre-installed software, which includes PowerShell and Pester. For a full list of up-to-date software and the pre-installed versions of PowerShell and Pester, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-software).

## Prerequisites

You should be familiar with YAML and the syntax for {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/learn-github-actions).

We recommend that you have a basic understanding of PowerShell and Pester. For more information, see:

* [Getting started with PowerShell](https://docs.microsoft.com/powershell/scripting/learn/ps101/01-getting-started)
* [Pester](https://pester.dev)

{% data reusables.actions.enterprise-setup-prereq %}

## Adding a workflow for Pester

To automate your testing with PowerShell and Pester, you can add a workflow that runs every time a change is pushed to your repository. In the following example, `Test-Path` is used to check that a file called `resultsfile.log` is present.

This example workflow file must be added to your repository's `.github/workflows/` directory:

```yaml
name: Test PowerShell on Ubuntu

on: push

jobs:
  pester-test:
    name: Pester test
    runs-on: ubuntu-latest
    steps:
      - name: Check out repository code
        uses: {% data reusables.actions.action-checkout %}
      - name: Perform a Pester test from the command-line
        shell: pwsh
        run: Test-Path resultsfile.log | Should -Be $true
      - name: Perform a Pester test from the Tests.ps1 file
        shell: pwsh
        run: |
          Invoke-Pester Unit.Tests.ps1 -Passthru
```

* `shell: pwsh` - Configures the job to use PowerShell when running the `run` commands.
* `run: Test-Path resultsfile.log` - Checks whether a file called `resultsfile.log` is present in the repository's root directory.
* `Should -Be $true` - Uses Pester to define an expected result. If the result is unexpected, then {% data variables.product.prodname_actions %} flags this as a failed test. For example:

  ![Screenshot of a workflow run failure for a Pester test. Test reports "Expected $true, but got $false" and "Error: Process completed with exit code 1."](/assets/images/help/repository/actions-failed-pester-test-updated.png)

* `Invoke-Pester Unit.Tests.ps1 -Passthru` - Uses Pester to execute tests defined in a file called `Unit.Tests.ps1`. For example, to perform the same test described above, the `Unit.Tests.ps1` will contain the following:

  ```powershell
  Describe "Check results file is present" {
      It "Check results file is present" {
          Test-Path resultsfile.log | Should -Be $true
      }
  }
  ```

## PowerShell module locations

The table below describes the locations for various PowerShell modules in each {% data variables.product.prodname_dotcom %}-hosted runner.

{% rowheaders %}

| | Ubuntu | macOS | Windows |
|------|-------|------|----------|
|**PowerShell system modules** |`/opt/microsoft/powershell/7/Modules/*`|`/usr/local/microsoft/powershell/7/Modules/*`|`C:\program files\powershell\7\Modules\*`|
|**PowerShell add-on modules**|`/usr/local/share/powershell/Modules/*`|`/usr/local/share/powershell/Modules/*`|`C:\Modules\*`|
|**User-installed modules**|`/home/runner/.local/share/powershell/Modules/*`|`/Users/runner/.local/share/powershell/Modules/*`|`C:\Users\runneradmin\Documents\PowerShell\Modules\*`|

{% endrowheaders %}

> [!NOTE]
> On Ubuntu runners, Azure PowerShell modules are stored in `/usr/share/` instead of the default location of PowerShell add-on modules (i.e. `/usr/local/share/powershell/Modules/`).

## Installing dependencies

{% data variables.product.prodname_dotcom %}-hosted runners have PowerShell 7 and Pester installed.
You can use `Install-Module` to install additional dependencies from the PowerShell Gallery before building and testing your code.

> [!NOTE]
> The pre-installed packages (such as Pester) used by {% data variables.product.prodname_dotcom %}-hosted runners are regularly updated, and can introduce significant changes. As a result, it is recommended that you always specify the required package versions by using `Install-Module` with `-MaximumVersion`.

You can also cache dependencies to speed up your workflow. For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

For example, the following job installs the `SqlServer` and `PSScriptAnalyzer` modules:

```yaml
jobs:
  install-dependencies:
    name: Install dependencies
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Install from PSGallery
        shell: pwsh
        run: |
          Set-PSRepository PSGallery -InstallationPolicy Trusted
          Install-Module SqlServer, PSScriptAnalyzer
```

> [!NOTE]
> By default, no repositories are trusted by PowerShell. When installing modules from the PowerShell Gallery, you
must explicitly set the installation policy for `PSGallery` to `Trusted`.

### Caching dependencies

You can cache PowerShell dependencies using a unique key, which allows you to restore the dependencies for future workflows with the [`cache`](https://github.com/marketplace/actions/cache) action. For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

PowerShell caches its dependencies in different locations, depending on the runner's operating system. For example, the `path` location used in the following Ubuntu example will be different for a Windows operating system.

```yaml
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Setup PowerShell module cache
    id: cacher
    uses: {% data reusables.actions.action-cache %}
    with:
      path: "~/.local/share/powershell/Modules"
      key: {% raw %}${{ runner.os }}-SqlServer-PSScriptAnalyzer{% endraw %}
  - name: Install required PowerShell modules
    if: steps.cacher.outputs.cache-hit != 'true'
    shell: pwsh
    run: |
      Set-PSRepository PSGallery -InstallationPolicy Trusted
      Install-Module SqlServer, PSScriptAnalyzer -ErrorAction Stop
```

## Testing your code

You can use the same commands that you use locally to build and test your code.

### Using PSScriptAnalyzer to lint code

The following example installs `PSScriptAnalyzer` and uses it to lint all `ps1` files in the repository. For more information, see [PSScriptAnalyzer on GitHub](https://github.com/PowerShell/PSScriptAnalyzer).
```yaml
lint-with-PSScriptAnalyzer:
  name: Install and run PSScriptAnalyzer
  runs-on: ubuntu-latest
  steps:
    - uses: {% data reusables.actions.action-checkout %}
    - name: Install PSScriptAnalyzer module
      shell: pwsh
      run: |
        Set-PSRepository PSGallery -InstallationPolicy Trusted
        Install-Module PSScriptAnalyzer -ErrorAction Stop
    - name: Lint with PSScriptAnalyzer
      shell: pwsh
      run: |
        Invoke-ScriptAnalyzer -Path *.ps1 -Recurse -Outvariable issues
        $errors   = $issues.Where({$_.Severity -eq 'Error'})
        $warnings = $issues.Where({$_.Severity -eq 'Warning'})
        if ($errors) {
            Write-Error "There were $($errors.Count) errors and $($warnings.Count) warnings total." -ErrorAction Stop
        } else {
            Write-Output "There were $($errors.Count) errors and $($warnings.Count) warnings total."
        }
```

## Packaging workflow data as artifacts

You can upload artifacts to view after a workflow completes. For example, you may need to save log files, core dumps, test results, or screenshots. For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

The following example demonstrates how you can use the `upload-artifact` action to archive the test results received from `Invoke-Pester`. For more information, see the [`upload-artifact` action](https://github.com/actions/upload-artifact).

```yaml
name: Upload artifact from Ubuntu

on: [push]

jobs:
  upload-pester-results:
    name: Run Pester and upload results
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Test with Pester
        shell: pwsh
        run: Invoke-Pester Unit.Tests.ps1 -Passthru | Export-CliXml -Path Unit.Tests.xml
      - name: Upload test results
        uses: {% data reusables.actions.action-upload-artifact %}
        with:
          name: ubuntu-Unit-Tests
          path: Unit.Tests.xml
        if: {% raw %}${{ always() }}{% endraw %}
```

The `always()` function configures the job to continue processing even if there are test failures. For more information, see [AUTOTITLE](/actions/learn-github-actions/contexts#always).
## Publishing to PowerShell Gallery

You can configure your workflow to publish your PowerShell module to the PowerShell Gallery when your CI tests pass. You can use secrets to store any tokens or credentials needed to publish your package. For more information, see [AUTOTITLE](/actions/security-guides/using-secrets-in-github-actions).

The following example creates a package and uses `Publish-Module` to publish it to the PowerShell Gallery:

```yaml
name: Publish PowerShell Module

on:
  release:
    types: [created]

jobs:
  publish-to-gallery:
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Build and publish
        env:
          NUGET_KEY: {% raw %}${{ secrets.NUGET_KEY }}{% endraw %}
        shell: pwsh
        run: |
          ./build.ps1 -Path /tmp/samplemodule
          Publish-Module -Path /tmp/samplemodule -NuGetApiKey $env:NUGET_KEY -Verbose
```
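The example above invokes a repository-specific `./build.ps1` script that this guide does not show. As a rough sketch (not part of the original example), such a script might stage the module files into the folder that `Publish-Module` expects; the `SampleModule` name and file layout below are assumptions:

```powershell
# Hypothetical build.ps1 for the publish job above.
# Stages the module into a folder whose name matches the module name,
# because Publish-Module expects the directory name to equal the module name.
param(
    [Parameter(Mandatory)]
    [string]$Path
)

$moduleDir = Join-Path $Path 'SampleModule'
New-Item -ItemType Directory -Path $moduleDir -Force | Out-Null

# Copy the module source and manifest into the staging folder.
Copy-Item -Path ./SampleModule/SampleModule.psm1, ./SampleModule/SampleModule.psd1 -Destination $moduleDir
```

Any equivalent staging logic works; the only firm requirement is that `-Path` points at a folder containing a valid module manifest when `Publish-Module` runs.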
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to create a continuous integration (CI) workflow that builds and tests Node.js code. If your CI tests pass, you may want to deploy your code or publish a package.

## Prerequisites

We recommend that you have a basic understanding of Node.js, YAML, workflow configuration options, and how to create a workflow file. For more information, see:

* [AUTOTITLE](/actions/learn-github-actions)
* [Getting started with Node.js](https://nodejs.org/en/docs/guides/getting-started-guide/)

{% data reusables.actions.enterprise-setup-prereq %}

## Using a Node.js workflow template

{% data reusables.actions.workflow-templates-get-started %} {% data variables.product.prodname_dotcom %} provides a workflow template for Node.js that should work for most Node.js projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "Node.js".
1. Filter the selection of workflows by clicking **Continuous integration**.
1. On the "Node.js" workflow, click **Configure**.

{%- ifversion ghes %}

   If you don't find the "Node.js" workflow template, copy the following workflow code to a new file called `node.js.yml` in the `.github/workflows` directory of your repository.
```yaml copy
name: Node.js CI

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [18.x, 20.x]
        # See supported Node.js release schedule at https://nodejs.org/en/about/releases/

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Use Node.js {% raw %}${{ matrix.node-version }}{% endraw %}
        uses: {% data reusables.actions.action-setup-node %}
        with:
          node-version: {% raw %}${{ matrix.node-version }}{% endraw %}
          cache: 'npm'
      - run: npm ci
      - run: npm run build --if-present
      - run: npm test
```

{%- endif %}
1. Edit the workflow as required. For example, change the Node versions you want to use.
1. Click **Commit changes**.

{% ifversion fpt or ghec %} The `node.js.yml` workflow file is added to the `.github/workflows` directory of your repository. {% endif %}

## Specifying the Node.js version

The easiest way to specify a Node.js version is by using the `setup-node` action provided by {% data variables.product.prodname_dotcom %}. For more information, see [`setup-node`](https://github.com/actions/setup-node/).

The `setup-node` action takes a Node.js version as an input and configures that version on the runner. The `setup-node` action finds a specific version of Node.js from the tools cache on each runner and adds the necessary binaries to `PATH`, which persists for the rest of the job. Using the `setup-node` action is the recommended way of using Node.js with {% data variables.product.prodname_actions %} because it ensures consistent behavior across different runners and different versions of Node.js. If you are using a self-hosted runner, you must install Node.js and add it to `PATH`.

The workflow template includes a matrix strategy that builds and tests your code with the Node.js versions listed in `node-version`. The 'x' in the version number is a wildcard character that matches the latest minor and patch release available for a version.
Each version of Node.js specified in the `node-version` array creates a job that runs the same steps. Each job can access the value defined in the matrix `node-version` array using the `matrix` context. The `setup-node` action uses the context as the `node-version` input. The `setup-node` action configures each job with a different Node.js version before building and testing code. For more information about matrix strategies and contexts, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idstrategymatrix) and [AUTOTITLE](/actions/learn-github-actions/contexts).

```yaml copy
strategy:
  matrix:
    node-version: ['18.x', '20.x']

steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Use Node.js {% raw %}${{ matrix.node-version }}{% endraw %}
    uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: {% raw %}${{ matrix.node-version }}{% endraw %}
```

Alternatively,
you can build and test with exact Node.js versions.

```yaml copy
strategy:
  matrix:
    node-version: ['10.17.0', '17.9.0']
```

Or, you can build and test using a single version of Node.js too.

```yaml copy
name: Node.js CI

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Use Node.js
        uses: {% data reusables.actions.action-setup-node %}
        with:
          node-version: '20.x'
      - run: npm ci
      - run: npm run build --if-present
      - run: npm test
```

If you don't specify a Node.js version, {% data variables.product.prodname_dotcom %} uses the environment's default Node.js version. For more information, see [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners#supported-software).

## Installing dependencies

{% data variables.product.prodname_dotcom %}-hosted runners have npm and Yarn dependency managers installed. You can use npm and Yarn to install dependencies in your workflow before building and testing your code. The Windows and Linux {% data variables.product.prodname_dotcom %}-hosted runners also have Grunt, Gulp, and Bower installed.

You can also cache dependencies to speed up your workflow. For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

### Example using npm

This example installs the versions in the `package-lock.json` or `npm-shrinkwrap.json` file and prevents updates to the lock file.
Using `npm ci` is generally faster than running `npm install`. For more information, see [`npm ci`](https://docs.npmjs.com/cli/ci.html) and [Introducing `npm ci` for faster, more reliable builds](https://blog.npmjs.org/post/171556855892/introducing-npm-ci-for-faster-more-reliable).

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Use Node.js
    uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20.x'
  - name: Install dependencies
    run: npm ci
```

Using `npm install` installs the dependencies defined in the `package.json` file. For more information, see [`npm install`](https://docs.npmjs.com/cli/install).

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Use Node.js
    uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20.x'
  - name: Install dependencies
    run: npm install
```

### Example using Yarn

This example installs the dependencies defined in the `yarn.lock` file and prevents updates to the `yarn.lock` file. For more information, see [`yarn install`](https://yarnpkg.com/en/docs/cli/install).

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Use Node.js
    uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20.x'
  - name: Install dependencies
    run: yarn --frozen-lockfile
```

Alternatively, you can install the dependencies defined in the `package.json` file.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Use Node.js
    uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20.x'
  - name: Install dependencies
    run: yarn
```

### Example using a private registry and creating the .npmrc file

{% data reusables.actions.setup-node-intro %}

To authenticate to your private registry, you'll need to store your npm authentication token as a secret. For example, create a repository secret called `NPM_TOKEN`.
For more information, see [AUTOTITLE](/actions/security-guides/using-secrets-in-github-actions).

In the example below, the secret `NPM_TOKEN` stores the npm authentication token. The `setup-node` action configures the `.npmrc` file to read the npm authentication token from the `NODE_AUTH_TOKEN` environment variable. When using the `setup-node` action to create an `.npmrc` file, you must set the `NODE_AUTH_TOKEN` environment variable with the secret that contains your npm authentication token.

Before installing dependencies, use the `setup-node` action to create the `.npmrc` file. The action has two input parameters. The `node-version` parameter sets the Node.js version, and the `registry-url` parameter sets the default registry. If your package registry uses scopes, you must use the `scope` parameter. For more information, see [`npm-scope`](https://docs.npmjs.com/misc/scope).

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Use Node.js
    uses: {% data reusables.actions.action-setup-node %}
    with:
      always-auth: true
      node-version: '20.x'
      registry-url: https://registry.npmjs.org
      scope: '@octocat'
  - name: Install dependencies
    run: npm ci
    env:
      NODE_AUTH_TOKEN: {% raw %}${{ secrets.NPM_TOKEN }}{% endraw %}
```

The example above creates an `.npmrc` file with the following contents:

```shell
//registry.npmjs.org/:_authToken=${NODE_AUTH_TOKEN}
@octocat:registry=https://registry.npmjs.org/
always-auth=true
```

### Example caching dependencies

You can cache and restore the dependencies using the [`setup-node` action](https://github.com/actions/setup-node).

The following example caches dependencies for npm.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20'
      cache: 'npm'
  - run: npm install
  - run: npm test
```

The following example caches dependencies for Yarn.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20'
      cache: 'yarn'
  - run: yarn
  - run: yarn test
```

The following example caches dependencies for pnpm (v6.10+).
```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

# NOTE: pnpm caching support requires pnpm version >= 6.10.0

steps:
  - uses: {% data reusables.actions.action-checkout %}
  - uses: pnpm/action-setup@0609f0983b7a228f052f81ef4c3d6510cae254ad
    with:
      version: 6.10.0
  - uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20'
      cache: 'pnpm'
  - run: pnpm install
  - run: pnpm test
```

If you have a custom requirement or need finer controls for caching, you can use the [`cache` action](https://github.com/marketplace/actions/cache). For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

## Building and testing your code

You can use the same commands that you use locally to build and test your code. For example, if you run `npm run build` to run build steps defined in your `package.json` file and `npm test` to run your test suite, you would add those commands in your workflow file.

```yaml copy
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Use Node.js
    uses: {% data reusables.actions.action-setup-node %}
    with:
      node-version: '20.x'
  - run: npm install
  - run: npm run build --if-present
  - run: npm test
```

## Packaging workflow data as artifacts

You can save artifacts from your build and test steps to view after a job completes. For example, you may need to save log files, core dumps, test results, or screenshots. For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

## Publishing to package registries

You can configure your workflow to publish your Node.js package to a package registry after your CI tests pass. For more information about publishing to npm and {% data variables.product.prodname_registry %}, see [AUTOTITLE](/actions/publishing-packages/publishing-nodejs-packages).
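As a sketch of what such a publish workflow can look like (the `NPM_TOKEN` secret name is an assumption; the guide linked above covers publishing in full):

```yaml copy
name: Publish package to npm

on:
  release:
    types: [created]

jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - uses: {% data reusables.actions.action-setup-node %}
        with:
          node-version: '20.x'
          registry-url: https://registry.npmjs.org
      - run: npm ci
      - run: npm publish
        env:
          NODE_AUTH_TOKEN: {% raw %}${{ secrets.NPM_TOKEN }}{% endraw %}
```

Setting `registry-url` makes `setup-node` write an `.npmrc` that reads the token from `NODE_AUTH_TOKEN`, so the secret never appears in the workflow file itself.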
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you how to build, test, and publish a .NET package.

{% data variables.product.prodname_dotcom %}-hosted runners have a tools cache with preinstalled software, which includes the .NET Core SDK. For a full list of up-to-date software and the preinstalled versions of .NET Core SDK, see [software installed on {% data variables.product.prodname_dotcom %}-hosted runners](/actions/using-github-hosted-runners/about-github-hosted-runners).

## Prerequisites

You should already be familiar with YAML syntax and how it's used with {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions).

We recommend that you have a basic understanding of the .NET Core SDK. For more information, see [Getting started with .NET](https://dotnet.microsoft.com/learn).

## Using a .NET workflow template

{% data reusables.actions.workflow-templates-get-started %} {% data variables.product.prodname_dotcom %} provides a workflow template for .NET that should work for most .NET projects. The subsequent sections of this guide give examples of how you can customize this workflow template.

{% data reusables.repositories.navigate-to-repo %}
{% data reusables.repositories.actions-tab %}
{% data reusables.actions.new-starter-workflow %}
1. The "Choose a workflow" page shows a selection of recommended workflow templates. Search for "dotnet".
1. On the ".NET" workflow, click **Configure**.

{%- ifversion ghes %}

   If you don't find the ".NET" workflow template, copy the following workflow code to a new file called `dotnet.yml` in the `.github/workflows` directory of your repository.
```yaml copy
name: .NET

on:
  push:
    branches: [ "main" ]
  pull_request:
    branches: [ "main" ]

jobs:
  build:
    runs-on: ubuntu-latest

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup .NET
        uses: {% data reusables.actions.action-setup-dotnet %}
        with:
          dotnet-version: 6.0.x
      - name: Restore dependencies
        run: dotnet restore
      - name: Build
        run: dotnet build --no-restore
      - name: Test
        run: dotnet test --no-build --verbosity normal
```

{%- endif %}
1. Edit the workflow as required. For example, change the .NET version.
1. Click **Commit changes**.

{% ifversion fpt or ghec %} The `dotnet.yml` workflow file is added to the `.github/workflows` directory of your repository. {% endif %}

## Specifying a .NET version

To use a preinstalled version of the .NET Core SDK on a {% data variables.product.prodname_dotcom %}-hosted runner, use the `setup-dotnet` action. This action finds a specific version of .NET from the tools cache on each runner, and adds the necessary binaries to `PATH`. These changes will persist for the remainder of the job.

The `setup-dotnet` action is the recommended way of using .NET with {% data variables.product.prodname_actions %}, because it ensures consistent behavior across different runners and different versions of .NET. If you are using a self-hosted runner, you must install .NET and add it to `PATH`. For more information, see the [`setup-dotnet`](https://github.com/marketplace/actions/setup-net-core-sdk) action.
### Using multiple .NET versions

```yaml
name: dotnet package

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        dotnet-version: [ '3.1.x', '6.0.x' ]

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup dotnet {% raw %}${{ matrix.dotnet-version }}{% endraw %}
        uses: {% data reusables.actions.action-setup-dotnet %}
        with:
          dotnet-version: {% raw %}${{ matrix.dotnet-version }}{% endraw %}
      # You can test your matrix by printing the current dotnet version
      - name: Display dotnet version
        run: dotnet --version
```

### Using a specific .NET version

You can configure your job to use a specific version of .NET, such as `6.0.22`. Alternatively, you can use semantic version syntax to get the latest minor release. This example uses the latest minor release of .NET 6.

```yaml
- name: Setup .NET 6.x
  uses: {% data reusables.actions.action-setup-dotnet %}
  with:
    # Semantic version range syntax or exact version of a dotnet version
    dotnet-version: '6.x'
```

## Installing dependencies

{% data variables.product.prodname_dotcom %}-hosted runners have the NuGet package manager installed. You can use the dotnet CLI to install dependencies from the NuGet package registry
before building and testing your code. For example, the YAML below installs the `Newtonsoft` package.

```yaml
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Setup dotnet
    uses: {% data reusables.actions.action-setup-dotnet %}
    with:
      dotnet-version: '6.0.x'
  - name: Install dependencies
    run: dotnet add package Newtonsoft.Json --version 12.0.1
```

### Caching dependencies

You can cache NuGet dependencies for future workflows using the optional `cache` input. For example, the YAML below caches the NuGet `global-packages` folder, and then installs the `Newtonsoft` package. A second optional input, `cache-dependency-path`, can be used to specify the path to a dependency file: `packages.lock.json`. For more information, see [AUTOTITLE](/actions/using-workflows/caching-dependencies-to-speed-up-workflows).

```yaml
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Setup dotnet
    uses: {% data reusables.actions.action-setup-dotnet %}
    with:
      dotnet-version: '6.x'
      cache: true
  - name: Install dependencies
    run: dotnet add package Newtonsoft.Json --version 12.0.1
```

> [!NOTE]
> Depending on the number of dependencies, it may be faster to use the dependency cache. Projects with many large dependencies should see a performance increase as it cuts down the time required for downloading. Projects with fewer dependencies may not see a significant performance increase and may even see a slight decrease due to how NuGet installs cached dependencies. The performance varies from project to project.
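If your lock file does not sit at the repository root, the `cache-dependency-path` input mentioned above tells `setup-dotnet` which file to hash for the cache key. The following is a sketch, not part of the original example; the `MyApp/packages.lock.json` path is a hypothetical location:

```yaml
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Setup dotnet
    uses: {% data reusables.actions.action-setup-dotnet %}
    with:
      dotnet-version: '6.x'
      cache: true
      cache-dependency-path: MyApp/packages.lock.json
  - name: Install dependencies
    run: dotnet restore --locked-mode
```

Running `dotnet restore --locked-mode` makes the restore fail rather than silently update the lock file, which keeps the cache key stable across runs.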
## Building and testing your code

You can use the same commands that you use locally to build and test your code. This example demonstrates how to use `dotnet build` and `dotnet test` in a job:

```yaml
steps:
  - uses: {% data reusables.actions.action-checkout %}
  - name: Setup dotnet
    uses: {% data reusables.actions.action-setup-dotnet %}
    with:
      dotnet-version: '6.0.x'
  - name: Install dependencies
    run: dotnet restore
  - name: Build
    run: dotnet build --no-restore
  - name: Test with the dotnet CLI
    run: dotnet test --no-build
```

## Packaging workflow data as artifacts

After a workflow completes, you can upload the resulting artifacts for analysis. For example, you may need to save log files, core dumps, test results, or screenshots. The following example demonstrates how you can use the `upload-artifact` action to upload test results. For more information, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

```yaml
name: dotnet package

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        dotnet-version: [ '3.1.x', '6.0.x' ]

    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - name: Setup dotnet
        uses: {% data reusables.actions.action-setup-dotnet %}
        with:
          dotnet-version: {% raw %}${{ matrix.dotnet-version }}{% endraw %}
      - name: Install dependencies
        run: dotnet restore
      - name: Test with dotnet
        run: dotnet test --no-restore --logger trx --results-directory {% raw %}"TestResults-${{ matrix.dotnet-version }}"{% endraw %}
      - name: Upload dotnet test results
        uses: {% data reusables.actions.action-upload-artifact %}
        with:
          name: {% raw %}dotnet-results-${{ matrix.dotnet-version }}{% endraw %}
          path: {% raw %}TestResults-${{ matrix.dotnet-version }}{% endraw %}
        # Use always() to always run this step to publish test results when there are test failures
        if: {% raw %}${{ always() }}{% endraw %}
```

## Publishing to package registries

You can configure your workflow to publish your .NET package to a package registry when your CI tests pass.
You can use repository secrets to store any tokens or credentials needed to publish your binary. The following example creates and publishes a package to {% data variables.product.prodname_registry %} using `dotnet core cli`.

```yaml
name: Upload dotnet package

on:
  release:
    types: [created]

jobs:
  deploy:
    runs-on: ubuntu-latest
    permissions:
      packages: write
      contents: read
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - uses: {% data reusables.actions.action-setup-dotnet %}
        with:
          dotnet-version: '6.0.x' # SDK Version to use.
          source-url: https://nuget.pkg.github.com//index.json
        env:
          NUGET_AUTH_TOKEN: {% raw %}${{secrets.GITHUB_TOKEN}}{% endraw %}
      - run: dotnet build --configuration Release
      - name: Create the package
        run: dotnet pack --configuration Release
      - name: Publish the package to GPR
        run: dotnet nuget push /bin/Release/*.nupkg
```
https://github.com/github/docs/blob/main//content/actions/tutorials/build-and-test-code/net.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This tutorial demonstrates how to use the {% data variables.product.prodname_cli %} in a workflow to label newly opened or reopened issues. For example, you can add the `triage` label every time an issue is opened or reopened. Then, you can see all issues that need to be triaged by filtering for issues with the `triage` label.

The {% data variables.product.prodname_cli %} allows you to easily use the {% data variables.product.prodname_dotcom %} API in a workflow. In the tutorial, you will first make a workflow file that uses the {% data variables.product.prodname_cli %}. Then, you will customize the workflow to suit your needs.

## Creating the workflow

1. {% data reusables.actions.choose-repo %}
1. {% data reusables.actions.make-workflow-file %}
1. Copy the following YAML contents into your workflow file.

   ```yaml copy
   name: Label issues
   on:
     issues:
       types:
         - reopened
         - opened
   jobs:
     label_issues:
       runs-on: ubuntu-latest
       permissions:
         issues: write
       steps:
         - run: gh issue edit "$NUMBER" --add-label "$LABELS"
           env:
             GH_TOKEN: {% raw %}${{ secrets.GITHUB_TOKEN }}{% endraw %}
             GH_REPO: {% raw %}${{ github.repository }}{% endraw %}
             NUMBER: {% raw %}${{ github.event.issue.number }}{% endraw %}
             LABELS: triage
   ```

1. Customize the `env` values in your workflow file:
   * The `GH_TOKEN`, `GH_REPO`, and `NUMBER` values are automatically set using the `github` and `secrets` contexts. You do not need to change these.
   * Change the value for `LABELS` to the list of labels that you want to add to the issue. The label(s) must exist for your repository. Separate multiple labels with commas. For example, `help wanted,good first issue`. For more information about labels, see [AUTOTITLE](/issues/using-labels-and-milestones-to-track-work/managing-labels#applying-labels-to-issues-and-pull-requests).
1. {% data reusables.actions.commit-workflow %}

## Testing the workflow

Every time an issue in your repository is opened or reopened, this workflow will add the labels that you specified to the issue. Test out your workflow by creating an issue in your repository.

1. Create an issue in your repository. For more information, see [AUTOTITLE](/issues/tracking-your-work-with-issues/creating-an-issue).
1. To see the workflow run that was triggered by creating the issue, view the history of your workflow runs. For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/viewing-workflow-run-history).
1. When the workflow completes, the issue that you created should have the specified labels added.

## Next steps

* To learn more about additional things you can do with the {% data variables.product.prodname_cli %}, see the [GitHub CLI manual](https://cli.github.com/manual/).
* To learn more about different events that can trigger your workflow, see [AUTOTITLE](/actions/using-workflows/events-that-trigger-workflows#issues).
* [Search GitHub](https://github.com/search?q=path%3A.github%2Fworkflows+gh+issue+edit&type=code) for examples of workflows using `gh issue edit`.
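As a sketch of the `LABELS` customization described above, the `env` block might look like this when adding two comma-separated labels (the label names are illustrative and must already exist in your repository):

```yaml
env:
  GH_TOKEN: {% raw %}${{ secrets.GITHUB_TOKEN }}{% endraw %}
  GH_REPO: {% raw %}${{ github.repository }}{% endraw %}
  NUMBER: {% raw %}${{ github.event.issue.number }}{% endraw %}
  LABELS: help wanted,good first issue
```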
https://github.com/github/docs/blob/main//content/actions/tutorials/manage-your-work/add-labels-to-issues.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This tutorial demonstrates how to use the [`actions/stale` action](https://github.com/marketplace/actions/close-stale-issues) to comment on and close issues that have been inactive for a certain period of time. For example, you can comment if an issue has been inactive for 30 days to prompt participants to take action. Then, if no additional activity occurs after 14 days, you can close the issue.

In the tutorial, you will first make a workflow file that uses the [`actions/stale` action](https://github.com/marketplace/actions/close-stale-issues). Then, you will customize the workflow to suit your needs.

## Creating the workflow

1. {% data reusables.actions.choose-repo %}
1. {% data reusables.actions.make-workflow-file %}
1. Copy the following YAML contents into your workflow file.

   ```yaml copy
   name: Close inactive issues
   on:
     schedule:
       - cron: "30 1 * * *"

   jobs:
     close-issues:
       runs-on: ubuntu-latest
       permissions:
         issues: write
         pull-requests: write
       steps:
         - uses: {% data reusables.actions.action-stale %}
           with:
             days-before-issue-stale: 30
             days-before-issue-close: 14
             stale-issue-label: "stale"
             stale-issue-message: "This issue is stale because it has been open for 30 days with no activity."
             close-issue-message: "This issue was closed because it has been inactive for 14 days since being marked as stale."
             days-before-pr-stale: -1
             days-before-pr-close: -1
             repo-token: {% raw %}${{ secrets.GITHUB_TOKEN }}{% endraw %}
   ```

1. Customize the parameters in your workflow file:
   * Change the value for `on.schedule` to dictate when you want this workflow to run. In the example above, the workflow will run every day at 1:30 UTC. For more information about scheduled workflows, see [AUTOTITLE](/actions/using-workflows/events-that-trigger-workflows#scheduled-events).
   * Change the value for `days-before-issue-stale` to the number of days without activity before the `actions/stale` action labels an issue. If you never want this action to label issues, set this value to `-1`.
   * Change the value for `days-before-issue-close` to the number of days without activity before the `actions/stale` action closes an issue. If you never want this action to close issues, set this value to `-1`.
   * Change the value for `stale-issue-label` to the label that you want to apply to issues that have been inactive for the amount of time specified by `days-before-issue-stale`.
   * Change the value for `stale-issue-message` to the comment that you want to add to issues that are labeled by the `actions/stale` action.
   * Change the value for `close-issue-message` to the comment that you want to add to issues that are closed by the `actions/stale` action.
1. {% data reusables.actions.commit-workflow %}

## Expected results

Based on the `schedule` parameter (for example, every day at 1:30 UTC), your workflow will find issues that have been inactive for the specified period of time and will add the specified comment and label. Additionally, your workflow will close any previously labeled issues if no additional activity has occurred for the specified period of time.

> [!NOTE]
> {% data reusables.actions.schedule-delay %}

You can view the history of your workflow runs to see this workflow run periodically. For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/viewing-workflow-run-history).

This workflow will only label and/or close 30 issues at a time in order to avoid exceeding a rate limit. You can configure this with the `operations-per-run` setting. For more information, see the [`actions/stale` action documentation](https://github.com/marketplace/actions/close-stale-issues).

## Next steps

* To learn more about additional things you can do with the `actions/stale` action, like closing inactive pull requests, ignoring issues with certain labels or milestones, or only checking issues with certain labels, see the [`actions/stale` action documentation](https://github.com/marketplace/actions/close-stale-issues).
* [Search GitHub](https://github.com/search?q=%22uses%3A+actions%2Fstale%22&type=code) for examples of workflows using this action.
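The `operations-per-run` setting mentioned earlier is passed as another input to the action. A hedged sketch, with the value `100` chosen purely for illustration (check the action's documentation for appropriate limits):

```yaml
- uses: {% data reusables.actions.action-stale %}
  with:
    days-before-issue-stale: 30
    days-before-issue-close: 14
    # Raise the per-run budget above the default of 30 operations
    operations-per-run: 100
    repo-token: {% raw %}${{ secrets.GITHUB_TOKEN }}{% endraw %}
```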
https://github.com/github/docs/blob/main//content/actions/tutorials/manage-your-work/close-inactive-issues.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This tutorial demonstrates how to use the {% data variables.product.prodname_cli %} to comment on an issue when a specific label is applied. For example, when the `help wanted` label is added to an issue, you can add a comment to encourage contributors to work on the issue. For more information about {% data variables.product.prodname_cli %}, see [AUTOTITLE](/actions/using-workflows/using-github-cli-in-workflows).

In the tutorial, you will first make a workflow file that uses the `gh issue comment` command to comment on an issue. Then, you will customize the workflow to suit your needs.

## Creating the workflow

1. {% data reusables.actions.choose-repo %}
1. {% data reusables.actions.make-workflow-file %}
1. Copy the following YAML contents into your workflow file.

   ```yaml copy
   name: Add comment
   on:
     issues:
       types:
         - labeled
   jobs:
     add-comment:
       if: github.event.label.name == 'help wanted'
       runs-on: ubuntu-latest
       permissions:
         issues: write
       steps:
         - name: Add comment
           run: gh issue comment "$NUMBER" --body "$BODY"
           env:
             GH_TOKEN: {% raw %}${{ secrets.GITHUB_TOKEN }}{% endraw %}
             GH_REPO: {% raw %}${{ github.repository }}{% endraw %}
             NUMBER: {% raw %}${{ github.event.issue.number }}{% endraw %}
             BODY: >
               This issue is available for anyone to work on.
               **Make sure to reference this issue in your pull request.**
               :sparkles: Thank you for your contribution! :sparkles:
   ```

1. Customize the parameters in your workflow file:
   * Replace `help wanted` in `if: github.event.label.name == 'help wanted'` with the label that you want to act on. If you want to act on more than one label, separate the conditions with `||`. For example, `if: github.event.label.name == 'bug' || github.event.label.name == 'fix me'` will comment whenever the `bug` or `fix me` labels are added to an issue.
   * Change the value for `BODY` to the comment that you want to add. GitHub flavored markdown is supported. For more information about markdown, see [AUTOTITLE](/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax).
1. {% data reusables.actions.commit-workflow %}

## Testing the workflow

Every time an issue in your repository is labeled, this workflow will run. If the label that was added is one of the labels that you specified in your workflow file, the `gh issue comment` command will add the comment that you specified to the issue. Test your workflow by applying your specified label to an issue.

1. Open an issue in your repository. For more information, see [AUTOTITLE](/issues/tracking-your-work-with-issues/creating-an-issue).
1. Label the issue with the specified label in your workflow file. For more information, see [AUTOTITLE](/issues/using-labels-and-milestones-to-track-work/managing-labels#applying-labels-to-issues-and-pull-requests).
1. To see the workflow run triggered by labeling the issue, view the history of your workflow runs. For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/viewing-workflow-run-history).
1. When the workflow completes, the issue that you labeled should have a comment added.

## Next steps

* To learn more about additional things you can do with the GitHub CLI, like editing existing comments, visit the [GitHub CLI Manual](https://cli.github.com/manual/).
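The `||` condition described above is ordinary boolean logic. As a rough illustration only (this is plain JavaScript, not GitHub's expression engine; the helper name is hypothetical), the same decision can be modeled as a membership check:

```javascript
// Hypothetical helper mirroring the workflow's `if:` expression:
// comment only when the added label is one of the labels we care about.
function shouldComment(labelName, watchedLabels) {
  return watchedLabels.includes(labelName);
}

// Mirrors: if: github.event.label.name == 'bug' || github.event.label.name == 'fix me'
console.log(shouldComment('bug', ['bug', 'fix me']));     // true
console.log(shouldComment('wontfix', ['bug', 'fix me'])); // false
```

Each `||` branch in the workflow expression corresponds to one entry in the watched-labels list.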
https://github.com/github/docs/blob/main//content/actions/tutorials/manage-your-work/add-comments-with-labels.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This tutorial demonstrates how to use the {% data variables.product.prodname_cli %} to create an issue on a regular basis. For example, you can create an issue each week to use as the agenda for a team meeting. For more information about {% data variables.product.prodname_cli %}, see [AUTOTITLE](/actions/using-workflows/using-github-cli-in-workflows).

In the tutorial, you will first make a workflow file that uses the {% data variables.product.prodname_cli %}. Then, you will customize the workflow to suit your needs.

## Creating the workflow

1. {% data reusables.actions.choose-repo %}
1. {% data reusables.actions.make-workflow-file %}
1. Copy the following YAML contents into your workflow file.

   ```yaml copy
   name: Weekly Team Sync
   on:
     schedule:
       - cron: 20 07 * * 1

   jobs:
     create_issue:
       name: Create team sync issue
       runs-on: ubuntu-latest
       permissions:
         issues: write
       steps:
         - name: Create team sync issue
           run: |
             if [[ $CLOSE_PREVIOUS == true ]]; then
               previous_issue_number=$(gh issue list \
                 --label "$LABELS" \
                 --json number \
                 --jq '.[0].number')
               if [[ -n $previous_issue_number ]]; then
                 gh issue close "$previous_issue_number"
                 gh issue unpin "$previous_issue_number"
               fi
             fi
             new_issue_url=$(gh issue create \
               --title "$TITLE" \
               --assignee "$ASSIGNEES" \
               --label "$LABELS" \
               --body "$BODY")
             if [[ $PINNED == true ]]; then
               gh issue pin "$new_issue_url"
             fi
           env:
             GH_TOKEN: {% raw %}${{ secrets.GITHUB_TOKEN }}{% endraw %}
             GH_REPO: {% raw %}${{ github.repository }}{% endraw %}
             TITLE: Team sync
             ASSIGNEES: monalisa,doctocat,hubot
             LABELS: weekly sync,docs-team
             BODY: |
               ### Agenda

               - [ ] Start the recording
               - [ ] Check-ins
               - [ ] Discussion points
               - [ ] Post the recording

               ### Discussion Points

               Add things to discuss below

               - [Work this week](https://github.com/orgs/github/projects/3)
             PINNED: false
             CLOSE_PREVIOUS: false
   ```

1. Customize the parameters in your workflow file:
   * Change the value for `on.schedule` to dictate when you want this workflow to run. In the example above, the workflow will run every Monday at 7:20 UTC. For more information about scheduled workflows, see [AUTOTITLE](/actions/using-workflows/events-that-trigger-workflows#scheduled-events).
   * Change the value for `ASSIGNEES` to the list of {% data variables.product.prodname_dotcom %} usernames that you want to assign to the issue.
   * Change the value for `LABELS` to the list of labels that you want to apply to the issue.
   * Change the value for `TITLE` to the title that you want the issue to have.
   * Change the value for `BODY` to the text that you want in the issue body. The `|` character allows you to use a multi-line value for this parameter.
   * If you want to pin this issue in your repository, set `PINNED` to `true`. For more information about pinned issues, see [AUTOTITLE](/issues/tracking-your-work-with-issues/pinning-an-issue-to-your-repository).
   * If you want to close the previous issue generated by this workflow each time a new issue is created, set `CLOSE_PREVIOUS` to `true`. The workflow will close the most recent issue that has the labels defined in the `labels` field. To avoid closing the wrong issue, use a unique label or combination of labels.
1. {% data reusables.actions.commit-workflow %}

## Expected results

Based on the `schedule` parameter (for example, every Monday at 7:20 UTC), your workflow will create a new issue with the assignees, labels, title, and body that you specified. If you set `PINNED` to `true`, the workflow will pin the issue to your repository. If you set `CLOSE_PREVIOUS` to `true`, the workflow will close the most recent issue with matching labels.

> [!NOTE]
> {% data reusables.actions.schedule-delay %}

You can view the history of your workflow runs to see this workflow run periodically.
For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/viewing-workflow-run-history).

## Next steps

* To learn more about additional things you can do with the {% data variables.product.prodname_cli %}, like using an issue template, see the [`gh issue create` documentation](https://cli.github.com/manual/gh_issue_create).
* [Search {% data variables.product.prodname_marketplace %}](https://github.com/marketplace?category=&type=actions&verification=&query=schedule+issue) for actions related to scheduled issues.
https://github.com/github/docs/blob/main//content/actions/tutorials/manage-your-work/schedule-issue-creation.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you workflow examples that configure a service container using the Docker Hub `postgres` image. The workflow runs a script that connects to the PostgreSQL service, creates a table, and then populates it with data. To test that the workflow creates and populates the PostgreSQL table, the script prints the data from the table to the console.

{% data reusables.actions.docker-container-os-support %}

## Prerequisites

{% data reusables.actions.service-container-prereqs %}

You may also find it helpful to have a basic understanding of YAML, the syntax for {% data variables.product.prodname_actions %}, and PostgreSQL. For more information, see:

* [AUTOTITLE](/actions/learn-github-actions)
* [PostgreSQL tutorial](https://www.postgresqltutorial.com/) in the PostgreSQL documentation

## Running jobs in containers

{% data reusables.actions.container-jobs-intro %}

{% data reusables.actions.copy-workflow-file %}

```yaml copy
name: PostgreSQL service example
on: push

jobs:
  # Label of the container job
  container-job:
    # Containers must run in Linux based operating systems
    runs-on: ubuntu-latest
    # Docker Hub image that `container-job` executes in
    container: node:20-bookworm-slim

    # Service containers to run with `container-job`
    services:
      # Label used to access the service container
      postgres:
        # Docker Hub image
        image: postgres
        # Provide the password for postgres
        env:
          POSTGRES_PASSWORD: postgres
        # Set health checks to wait until postgres has started
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      # Downloads a copy of the code in your repository before running CI tests
      - name: Check out repository code
        uses: {% data reusables.actions.action-checkout %}

      # Performs a clean installation of all dependencies in the `package.json` file
      # For more information, see https://docs.npmjs.com/cli/ci.html
      - name: Install dependencies
        run: npm ci

      - name: Connect to PostgreSQL
        # Runs a script that creates a PostgreSQL table, populates
        # the table with data, and then retrieves the data.
        run: node client.js
        # Environment variables used by the `client.js` script to create a new PostgreSQL table.
        env:
          # The hostname used to communicate with the PostgreSQL service container
          POSTGRES_HOST: postgres
          # The default PostgreSQL port
          POSTGRES_PORT: 5432
```

### Configuring the runner job for jobs in containers

{% data reusables.actions.service-container-host %}

{% data reusables.actions.postgres-label-description %}

```yaml copy
jobs:
  # Label of the container job
  container-job:
    # Containers must run in Linux based operating systems
    runs-on: ubuntu-latest
    # Docker Hub image that `container-job` executes in
    container: node:20-bookworm-slim

    # Service containers to run with `container-job`
    services:
      # Label used to access the service container
      postgres:
        # Docker Hub image
        image: postgres
        # Provide the password for postgres
        env:
          POSTGRES_PASSWORD: postgres
        # Set health checks to wait until postgres has started
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
```

### Configuring the steps for jobs in containers

{% data reusables.actions.service-template-steps %}

```yaml copy
steps:
  # Downloads a copy of the code in your repository before running CI tests
  - name: Check out repository code
    uses: {% data reusables.actions.action-checkout %}

  # Performs a clean installation of all dependencies in the `package.json` file
  # For more information, see https://docs.npmjs.com/cli/ci.html
  - name: Install dependencies
    run: npm ci

  - name: Connect to PostgreSQL
    # Runs a script that creates a PostgreSQL table, populates
    # the table with data, and then retrieves the data.
    run: node client.js
    # Environment variable used by the `client.js` script to create
    # a new PostgreSQL client.
    env:
      # The hostname used to communicate with the PostgreSQL service container
      POSTGRES_HOST: postgres
      # The default PostgreSQL port
      POSTGRES_PORT: 5432
```

{% data reusables.actions.postgres-environment-variables %}

The hostname of the PostgreSQL service is the label you configured in your workflow, in this case, `postgres`. Because Docker containers on the same user-defined bridge network open all ports by default, you'll be able to access the service container on the default PostgreSQL port 5432.

## Running jobs directly on the runner machine

When you run a job directly on the runner machine, you'll need to map the ports on the service container to ports on the Docker host. You can access service containers from the Docker host using `localhost` and the Docker host port number.

{% data reusables.actions.copy-workflow-file %}

```yaml copy
name: PostgreSQL Service Example
on: push

jobs:
  # Label of the runner job
  runner-job:
    # You must use a Linux environment when using service containers or container jobs
    runs-on: ubuntu-latest

    # Service containers to run with `runner-job`
    services:
      # Label used to access the service container
      postgres:
        # Docker Hub image
        image: postgres
        # Provide the password for postgres
        env:
          POSTGRES_PASSWORD: postgres
        # Set health checks to wait until postgres has started
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          # Maps tcp port 5432 on service container to the host
          - 5432:5432

    steps:
      # Downloads a copy of the code in your repository before running CI tests
      - name: Check out repository code
        uses: {% data reusables.actions.action-checkout %}

      # Performs a clean installation of all dependencies in the `package.json` file
      # For more information, see https://docs.npmjs.com/cli/ci.html
      - name: Install dependencies
        run: npm ci

      - name: Connect to PostgreSQL
        # Runs a script that creates a PostgreSQL table, populates
        # the table with data, and then retrieves the data
        run: node client.js
        # Environment variables used by the `client.js` script to create
        # a new PostgreSQL table.
        env:
          # The hostname used to communicate with the PostgreSQL service container
          POSTGRES_HOST: localhost
          # The default PostgreSQL port
          POSTGRES_PORT: 5432
```

### Configuring the runner job for jobs directly on the runner machine

{% data reusables.actions.service-container-host-runner %}

{% data reusables.actions.postgres-label-description %}

The workflow maps port 5432 on the PostgreSQL service container to the Docker host. For more information about the `ports` keyword, see [AUTOTITLE](/actions/using-containerized-services/about-service-containers#mapping-docker-host-and-service-container-ports).

```yaml copy
jobs:
  # Label of the runner job
  runner-job:
    # You must use a Linux environment when using service containers or container jobs
    runs-on: ubuntu-latest

    # Service containers to run with `runner-job`
    services:
      # Label used to access the service container
      postgres:
        # Docker Hub image
        image: postgres
        # Provide the password for postgres
        env:
          POSTGRES_PASSWORD: postgres
        # Set health checks to wait until postgres has started
        options: >-
          --health-cmd pg_isready
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          # Maps tcp port 5432 on service container to the host
          - 5432:5432
```

### Configuring the steps for jobs directly on the runner machine

{% data reusables.actions.service-template-steps %}

```yaml copy
steps:
  # Downloads a copy of the code in your repository before running CI tests
  - name: Check out repository code
    uses: {% data reusables.actions.action-checkout %}

  # Performs a clean installation of all dependencies in the `package.json` file
  # For more information, see https://docs.npmjs.com/cli/ci.html
  - name: Install dependencies
    run: npm ci

  - name: Connect to PostgreSQL
    # Runs a script that creates a PostgreSQL table, populates
    # the table with data, and then retrieves the data
    run: node client.js
    # Environment variables used by the `client.js` script to create
    # a new PostgreSQL table.
    env:
      # The hostname used to communicate with the PostgreSQL service container
      POSTGRES_HOST: localhost
      # The default PostgreSQL port
      POSTGRES_PORT: 5432
```

{% data reusables.actions.postgres-environment-variables %}
{% data reusables.actions.service-container-localhost %}

## Testing the PostgreSQL service container

You can test your workflow using the following script, which connects to the PostgreSQL service and adds a new table with some placeholder data. The script then prints the values stored in the PostgreSQL table to the terminal. Your script can use any language you'd like, but this example uses Node.js and the `pg` npm module. For more information, see the [npm pg module](https://www.npmjs.com/package/pg).

You can modify _client.js_ to include any PostgreSQL operations needed by your workflow. In this example, the script connects to the PostgreSQL service, adds a table to the `postgres` database, inserts some placeholder data, and then retrieves the data.

{% data reusables.actions.service-container-add-script %}

```javascript copy
const { Client } = require('pg');

const pgclient = new Client({
  host: process.env.POSTGRES_HOST,
  port: process.env.POSTGRES_PORT,
  user: 'postgres',
  password: 'postgres',
  database: 'postgres'
});

pgclient.connect();

const table = 'CREATE TABLE student(id SERIAL PRIMARY KEY, firstName VARCHAR(40) NOT NULL, lastName VARCHAR(40) NOT NULL, age INT, address VARCHAR(80), email VARCHAR(40))'
const text = 'INSERT INTO student(firstname, lastname, age, address, email) VALUES($1, $2, $3, $4, $5) RETURNING *'
const values = ['Mona the', 'Octocat', 9, '88 Colin P Kelly Jr St, San Francisco, CA 94107, United States', 'octocat@github.com']

pgclient.query(table, (err, res) => {
  if (err) throw err
});

pgclient.query(text, values, (err, res) => {
  if (err) throw err
});

pgclient.query('SELECT * FROM student', (err, res) => {
  if (err) throw err
  console.log(err, res.rows) // Print the data in student table
  pgclient.end()
});
```

The script creates a new connection to the PostgreSQL service, and uses the `POSTGRES_HOST` and `POSTGRES_PORT` environment variables to specify the PostgreSQL service IP address and port. If `host` and `port` are not defined, the default host is `localhost` and the default port is 5432.

The script creates a table and populates it with placeholder data. To test that the `postgres` database contains the data, the script prints the contents of the table to the console log.

When you run this workflow, you should see the following output in the "Connect to PostgreSQL" step, which confirms that you successfully created the PostgreSQL table and added data:

```text
null [ { id: 1,
    firstname: 'Mona the',
    lastname: 'Octocat',
    age: 9,
    address: '88 Colin P Kelly Jr St, San Francisco, CA 94107, United States',
    email: 'octocat@github.com' } ]
```
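The defaulting behavior described above (falling back to `localhost` and port 5432 when the environment variables are unset) can also be made explicit in your own script. A small sketch, with a hypothetical helper name, that does not require a running database:

```javascript
// Illustrative helper: fall back to the documented defaults when the
// POSTGRES_HOST / POSTGRES_PORT environment variables are unset.
function connectionConfig(env) {
  return {
    host: env.POSTGRES_HOST || 'localhost',
    // Number(undefined) is NaN, which is falsy, so the default applies
    port: Number(env.POSTGRES_PORT) || 5432,
  };
}

console.log(connectionConfig({}));
// → { host: 'localhost', port: 5432 }
console.log(connectionConfig({ POSTGRES_HOST: 'postgres', POSTGRES_PORT: '5432' }));
```

In a workflow, passing the returned object to `new Client(...)` from the `pg` module would reproduce the connection setup shown above.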
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

In this guide, you'll learn about the basic components needed to create and use a packaged Docker container action. To focus this guide on the components needed to package the action, the functionality of the action's code is minimal. The action prints "Hello World" in the logs or "Hello [who-to-greet]" if you provide a custom name.

Once you complete this project, you should understand how to build your own Docker container action and test it in a workflow.

{% data reusables.actions.self-hosted-runner-reqs-docker %}

{% data reusables.actions.context-injection-warning %}

## Prerequisites

* You must create a repository on {% data variables.product.github %} and clone it to your workstation. For more information, see [AUTOTITLE](/repositories/creating-and-managing-repositories/creating-a-new-repository) and [AUTOTITLE](/repositories/creating-and-managing-repositories/cloning-a-repository).
* If your repository uses {% data variables.large_files.product_name_short %}, you must include the objects in archives of your repository. For more information, see [AUTOTITLE](/enterprise-cloud@latest/repositories/managing-your-repositorys-settings-and-features/managing-repository-settings/managing-git-lfs-objects-in-archives-of-your-repository).
* You may find it helpful to have a basic understanding of {% data variables.product.prodname_actions %}, environment variables, and the Docker container filesystem. For more information, see [AUTOTITLE](/actions/learn-github-actions/variables) and [AUTOTITLE](/enterprise-cloud@latest/actions/using-github-hosted-runners/about-github-hosted-runners#docker-container-filesystem).

## Creating a Dockerfile

In your new `hello-world-docker-action` directory, create a new `Dockerfile` file. Make sure that your filename is capitalized correctly (use a capital `D` but not a capital `f`) if you're having issues.
For more information, see [AUTOTITLE](/actions/creating-actions/dockerfile-support-for-github-actions).

**Dockerfile**

```dockerfile copy
# Container image that runs your code
FROM alpine:3.10

# Copies your code file from your action repository to the filesystem path `/` of the container
COPY entrypoint.sh /entrypoint.sh

# Code file to execute when the docker container starts up (`entrypoint.sh`)
ENTRYPOINT ["/entrypoint.sh"]
```

## Creating an action metadata file

Create a new `action.yml` file in the `hello-world-docker-action` directory you created above. For more information, see [AUTOTITLE](/actions/creating-actions/metadata-syntax-for-github-actions).

{% raw %}

**action.yml**

```yaml copy
# action.yml
name: 'Hello World'
description: 'Greet someone and record the time'
inputs:
  who-to-greet: # id of input
    description: 'Who to greet'
    required: true
    default: 'World'
outputs:
  time: # id of output
    description: 'The time we greeted you'
runs:
  using: 'docker'
  image: 'Dockerfile'
  args:
    - ${{ inputs.who-to-greet }}
```

{% endraw %}

This metadata defines one `who-to-greet` input and one `time` output parameter. To pass inputs to the Docker container, you should declare the input using `inputs` and pass the input in the `args` keyword. Everything you include in `args` is passed to the container, but for better discoverability for users of your action, we recommend using inputs.

{% data variables.product.prodname_dotcom %} will build an image from your `Dockerfile`, and run commands in a new container using this image.

## Writing the action code

You can choose any base Docker image and, therefore, any language for your action. The following shell script example uses the `who-to-greet` input variable to print "Hello [who-to-greet]" in the log file.

Next, the script gets the current time and sets it as an output variable that actions running later in a job can use.
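To see how items in `args` reach the entrypoint, you can simulate it locally: the runner starts the container with each `args` item as a positional parameter, so the first item becomes `$1` inside the script. This is a local sketch, not the runner itself.

```shell
# Local simulation (not run by Actions) of how each item in `args`
# becomes a positional parameter for the container's entrypoint.
set -- "Mona the Octocat"   # stands in for the resolved who-to-greet input
echo "Hello $1"             # → Hello Mona the Octocat
```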
In order for {% data variables.product.prodname_dotcom %} to recognize output variables, you must write them to the `$GITHUB_OUTPUT` environment file: `echo "<name>=<value>" >> $GITHUB_OUTPUT`. For more information, see [AUTOTITLE](/actions/using-workflows/workflow-commands-for-github-actions#setting-an-output-parameter).

1. Create a new `entrypoint.sh` file in the `hello-world-docker-action` directory.
1. Add the following code to your `entrypoint.sh` file.

   **entrypoint.sh**

   ```shell copy
   #!/bin/sh -l

   echo "Hello $1"
   time=$(date)
   echo "time=$time" >> $GITHUB_OUTPUT
   ```

   If `entrypoint.sh` executes without any errors, the action's status is set to `success`. You can also explicitly set exit codes in your action's code to provide an action's status. For more information, see [AUTOTITLE](/actions/creating-actions/setting-exit-codes-for-actions).

1. Make your `entrypoint.sh` file executable. Git provides a way to explicitly change the permission mode of a file so that it doesn't get reset every time
there is a clone/fork.

   ```shell copy
   git add entrypoint.sh
   git update-index --chmod=+x entrypoint.sh
   ```

1. Optionally, to check the permission mode of the file in the git index, run the following command.

   ```shell copy
   git ls-files --stage entrypoint.sh
   ```

   An output like `100755 e69de29bb2d1d6434b8b29ae775ad8c2e48c5391 0 entrypoint.sh` means the file has the executable permission. In this example, `755` denotes the executable permission.

## Creating a README

To let people know how to use your action, you can create a README file. A README is most helpful when you plan to share your action publicly, but is also a great way to remind you or your team how to use the action.

In your `hello-world-docker-action` directory, create a `README.md` file that specifies the following information:

* A detailed description of what the action does.
* Required input and output arguments.
* Optional input and output arguments.
* Secrets the action uses.
* Environment variables the action uses.
* An example of how to use your action in a workflow.

**README.md**

```markdown copy
# Hello world docker action

This action prints "Hello World" or "Hello" + the name of a person to greet to the log.

## Inputs

## `who-to-greet`

**Required** The name of the person to greet. Default `"World"`.

## Outputs

## `time`

The time we greeted you.

## Example usage

uses: actions/hello-world-docker-action@v2
with:
  who-to-greet: 'Mona the Octocat'
```

## Commit, tag, and push your action

From your terminal, commit your `action.yml`, `entrypoint.sh`, `Dockerfile`, and `README.md` files.
It's best practice to also add a version tag for releases of your action. For more information on versioning your action, see [AUTOTITLE](/actions/creating-actions/about-custom-actions#using-release-management-for-actions).

```shell copy
git add action.yml entrypoint.sh Dockerfile README.md
git commit -m "My first action is ready"
git tag -a -m "My first action release" v1
git push --follow-tags
```

## Testing out your action in a workflow

Now you're ready to test your action out in a workflow.

* When an action is in a private repository, you can control who can access it. For more information, see [AUTOTITLE](/repositories/managing-your-repositorys-settings-and-features/enabling-features-for-your-repository/managing-github-actions-settings-for-a-repository#allowing-access-to-components-in-a-private-repository).
* {% ifversion ghes or ghec %}When an action is in an internal repository, you can control who can access it. For more information, see [AUTOTITLE](/repositories/managing-your-repositorys-settings-and-features/enabling-features-for-your-repository/managing-github-actions-settings-for-a-repository#allowing-access-to-components-in-an-internal-repository).{% else %}When an action is in an internal repository, the action can only be used in workflows in the same repository.{% endif %}
* Public actions can be used by workflows in any repository.

{% data reusables.actions.enterprise-marketplace-actions %}

### Example using a public action

The following workflow code uses the completed _hello world_ action in the public [`actions/hello-world-docker-action`](https://github.com/actions/hello-world-docker-action) repository. Copy the following workflow example code into a `.github/workflows/main.yml` file, but replace `actions/hello-world-docker-action` with your repository and action name. You can also replace the `who-to-greet` input with your name.
{% ifversion fpt or ghec %}Public actions can be used even if they're not published to {% data variables.product.prodname_marketplace %}. For more information, see [AUTOTITLE](/actions/creating-actions/publishing-actions-in-github-marketplace#publishing-an-action).{% endif %}

**.github/workflows/main.yml**

```yaml copy
on: [push]

jobs:
  hello_world_job:
    runs-on: ubuntu-latest
    name: A job to say hello
    steps:
      - name: Hello world action step
        id: hello
        uses: actions/hello-world-docker-action@v2
        with:
          who-to-greet: 'Mona the Octocat'
      # Use the output from the `hello` step
      - name: Get the output time
        run: echo "The time was {% raw %}${{ steps.hello.outputs.time }}"{% endraw %}
```

### Example using a private action

Copy the following example workflow code
into a `.github/workflows/main.yml` file in your action's repository. You can also replace the `who-to-greet` input with your name. {% ifversion fpt or ghec %}This private action can't be published to {% data variables.product.prodname_marketplace %}, and can only be used in this repository.{% endif %}

**.github/workflows/main.yml**

```yaml copy
on: [push]

jobs:
  hello_world_job:
    runs-on: ubuntu-latest
    name: A job to say hello
    steps:
      # To use this repository's private action,
      # you must check out the repository
      - name: Checkout
        uses: {% data reusables.actions.action-checkout %}
      - name: Hello world action step
        uses: ./ # Uses an action in the root directory
        id: hello
        with:
          who-to-greet: 'Mona the Octocat'
      # Use the output from the `hello` step
      - name: Get the output time
        run: echo "The time was {% raw %}${{ steps.hello.outputs.time }}"{% endraw %}
```

{% data reusables.actions.test-private-action-example %}

## Accessing files created by a container action

When a container action runs, it will automatically map the default working directory (`GITHUB_WORKSPACE`) on the runner with the `/github/workspace` directory on the container. Any files added to this directory on the container will be available to any subsequent steps in the same job. For example, if you have a container action that builds your project, and you would like to upload the build output as an artifact, you can use the following steps.

**workflow.yml**

```yaml copy
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: {% data reusables.actions.action-checkout %}
      # Output build artifacts to /github/workspace on the container.
      - name: Containerized Build
        uses: ./.github/actions/my-container-action
      - name: Upload Build Artifacts
        uses: {% data reusables.actions.action-upload-artifact %}
        with:
          name: workspace_artifacts
          path: {% raw %}${{ github.workspace }}{% endraw %}
```

For more information about uploading build output as an artifact, see [AUTOTITLE](/actions/using-workflows/storing-workflow-data-as-artifacts).

## Example Docker container actions on {% data variables.product.prodname_dotcom_the_website %}

You can find many examples of Docker container actions on {% data variables.product.prodname_dotcom_the_website %}.

* [github/issue-metrics](https://github.com/github/issue-metrics)
* [microsoft/infersharpaction](https://github.com/microsoft/infersharpaction)
* [microsoft/ps-docs](https://github.com/microsoft/ps-docs)
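The artifact uploaded by the `build` job earlier could also be retrieved in a dependent job. The fragment below is a sketch, not from the original guide; the `retrieve` job name is hypothetical, and it assumes `actions/download-artifact` is available.

```yaml
  # Hypothetical follow-on job: retrieves the artifact uploaded by `build`.
  retrieve:
    runs-on: ubuntu-latest
    needs: build
    steps:
      - name: Download build artifacts
        uses: actions/download-artifact@v4
        with:
          name: workspace_artifacts
```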
{% data reusables.actions.enterprise-github-hosted-runners %}

## Introduction

This guide shows you workflow examples that configure a service container using the Docker Hub `redis` image. The workflow runs a script to create a Redis client and populate the client with data. To test that the workflow creates and populates the Redis client, the script prints the client's data to the console.

{% data reusables.actions.docker-container-os-support %}

## Prerequisites

{% data reusables.actions.service-container-prereqs %}

You may also find it helpful to have a basic understanding of YAML, the syntax for {% data variables.product.prodname_actions %}, and Redis. For more information, see:

* [AUTOTITLE](/actions/learn-github-actions)
* [Getting Started with Redis](https://redis.io/learn/howtos/quick-start) in the Redis documentation

## Running jobs in containers

{% data reusables.actions.container-jobs-intro %}

{% data reusables.actions.copy-workflow-file %}

```yaml copy
name: Redis container example
on: push

jobs:
  # Label of the container job
  container-job:
    # Containers must run in Linux based operating systems
    runs-on: ubuntu-latest
    # Docker Hub image that `container-job` executes in
    container: node:20-bookworm-slim

    # Service containers to run with `container-job`
    services:
      # Label used to access the service container
      redis:
        # Docker Hub image
        image: redis
        # Set health checks to wait until redis has started
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5

    steps:
      # Downloads a copy of the code in your repository before running CI tests
      - name: Check out repository code
        uses: {% data reusables.actions.action-checkout %}

      # Performs a clean installation of all dependencies in the `package.json` file
      # For more information, see https://docs.npmjs.com/cli/ci.html
      - name: Install dependencies
        run: npm ci

      - name: Connect to Redis
        # Runs a script that creates a Redis client, populates
        # the client with data, and retrieves
        # data
        run: node client.js
        # Environment variable used by the `client.js` script to create a new Redis client.
        env:
          # The hostname used to communicate with the Redis service container
          REDIS_HOST: redis
          # The default Redis port
          REDIS_PORT: 6379
```

### Configuring the container job

{% data reusables.actions.service-container-host %}

{% data reusables.actions.redis-label-description %}

```yaml copy
jobs:
  # Label of the container job
  container-job:
    # Containers must run in Linux based operating systems
    runs-on: ubuntu-latest
    # Docker Hub image that `container-job` executes in
    container: node:20-bookworm-slim

    # Service containers to run with `container-job`
    services:
      # Label used to access the service container
      redis:
        # Docker Hub image
        image: redis
        # Set health checks to wait until redis has started
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
```

### Configuring the steps for the container job

{% data reusables.actions.service-template-steps %}

```yaml copy
steps:
  # Downloads a copy of the code in your repository before running CI tests
  - name: Check out repository code
    uses: {% data reusables.actions.action-checkout %}

  # Performs a clean installation of all dependencies in the `package.json` file
  # For more information, see https://docs.npmjs.com/cli/ci.html
  - name: Install dependencies
    run: npm ci

  - name: Connect to Redis
    # Runs a script that creates a Redis client, populates
    # the client with data, and retrieves data
    run: node client.js
    # Environment variable used by the `client.js` script to create a new Redis client.
    env:
      # The hostname used to communicate with the Redis service container
      REDIS_HOST: redis
      # The default Redis port
      REDIS_PORT: 6379
```

{% data reusables.actions.redis-environment-variables %}

The hostname of the Redis service is the label you configured in your workflow, in this case, `redis`.
Because Docker containers on the same user-defined bridge network open all ports by default, you'll be able to access the service container on the default Redis port 6379.

## Running jobs directly on the runner machine

When you run a job directly on the runner machine, you'll need to
map the ports on the service container to ports on the Docker host. You can access service containers from the Docker host using `localhost` and the Docker host port number.

{% data reusables.actions.copy-workflow-file %}

```yaml copy
name: Redis runner example
on: push

jobs:
  # Label of the runner job
  runner-job:
    # You must use a Linux environment when using service containers or container jobs
    runs-on: ubuntu-latest

    # Service containers to run with `runner-job`
    services:
      # Label used to access the service container
      redis:
        # Docker Hub image
        image: redis
        # Set health checks to wait until redis has started
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          # Maps port 6379 on service container to the host
          - 6379:6379

    steps:
      # Downloads a copy of the code in your repository before running CI tests
      - name: Check out repository code
        uses: {% data reusables.actions.action-checkout %}

      # Performs a clean installation of all dependencies in the `package.json` file
      # For more information, see https://docs.npmjs.com/cli/ci.html
      - name: Install dependencies
        run: npm ci

      - name: Connect to Redis
        # Runs a script that creates a Redis client, populates
        # the client with data, and retrieves data
        run: node client.js
        # Environment variable used by the `client.js` script to create
        # a new Redis client.
        env:
          # The hostname used to communicate with the Redis service container
          REDIS_HOST: localhost
          # The default Redis port
          REDIS_PORT: 6379
```

### Configuring the runner job

{% data reusables.actions.service-container-host-runner %}

{% data reusables.actions.redis-label-description %}

The workflow maps port 6379 on the Redis service container to the Docker host. For more information about the `ports` keyword, see [AUTOTITLE](/actions/using-containerized-services/about-service-containers#mapping-docker-host-and-service-container-ports).

```yaml copy
jobs:
  # Label of the runner job
  runner-job:
    # You must use a Linux environment when using service containers or container jobs
    runs-on: ubuntu-latest

    # Service containers to run with `runner-job`
    services:
      # Label used to access the service container
      redis:
        # Docker Hub image
        image: redis
        # Set health checks to wait until redis has started
        options: >-
          --health-cmd "redis-cli ping"
          --health-interval 10s
          --health-timeout 5s
          --health-retries 5
        ports:
          # Maps port 6379 on service container to the host
          - 6379:6379
```

### Configuring the steps for the runner job

{% data reusables.actions.service-template-steps %}

```yaml copy
steps:
  # Downloads a copy of the code in your repository before running CI tests
  - name: Check out repository code
    uses: {% data reusables.actions.action-checkout %}

  # Performs a clean installation of all dependencies in the `package.json` file
  # For more information, see https://docs.npmjs.com/cli/ci.html
  - name: Install dependencies
    run: npm ci

  - name: Connect to Redis
    # Runs a script that creates a Redis client, populates
    # the client with data, and retrieves data
    run: node client.js
    # Environment variable used by the `client.js` script to create
    # a new Redis client.
    env:
      # The hostname used to communicate with the Redis service container
      REDIS_HOST: localhost
      # The default Redis port
      REDIS_PORT: 6379
```

{% data reusables.actions.redis-environment-variables %}

{% data reusables.actions.service-container-localhost %}

## Testing the Redis service container

You can test your workflow using the following script, which creates a Redis client and populates the client with some placeholder data. The script then prints the values stored in the Redis client to the terminal. Your script can use any language you'd like, but this example uses Node.js and the
`redis` npm module. For more information, see the [npm redis module](https://www.npmjs.com/package/redis).

You can modify _client.js_ to include any Redis operations needed by your workflow. In this example, the script creates the Redis client instance, adds placeholder data, then retrieves the data.

{% data reusables.actions.service-container-add-script %}

```javascript copy
const redis = require("redis");

// Creates a new Redis client
// If REDIS_HOST is not set, the default host is localhost
// If REDIS_PORT is not set, the default port is 6379
const redisClient = redis.createClient({
  url: `redis://${process.env.REDIS_HOST}:${process.env.REDIS_PORT}`
});

redisClient.on("error", (err) => console.log("Error", err));

(async () => {
  await redisClient.connect();

  // Sets the key "octocat" to a value of "Mona the octocat"
  const setKeyReply = await redisClient.set("octocat", "Mona the Octocat");
  console.log("Reply: " + setKeyReply);
  // Sets a key to "species", field to "octocat", and "value" to "Cat and Octopus"
  const SetFieldOctocatReply = await redisClient.hSet("species", "octocat", "Cat and Octopus");
  console.log("Reply: " + SetFieldOctocatReply);
  // Sets a key to "species", field to "dinotocat", and "value" to "Dinosaur and Octopus"
  const SetFieldDinotocatReply = await redisClient.hSet("species", "dinotocat", "Dinosaur and Octopus");
  console.log("Reply: " + SetFieldDinotocatReply);
  // Sets a key to "species", field to "robotocat", and "value" to "Cat and Robot"
  const SetFieldRobotocatReply = await redisClient.hSet("species", "robotocat", "Cat and Robot");
  console.log("Reply: " + SetFieldRobotocatReply);

  try {
    // Gets all fields in "species" key
    const replies = await
 redisClient.hKeys("species");
    console.log(replies.length + " replies:");
    replies.forEach((reply, i) => {
      console.log("    " + i + ": " + reply);
    });
    await redisClient.quit();
  } catch (err) {
    // statements to handle any exceptions
  }
})();
```

The script creates a new Redis client using the `createClient` method, which accepts a `host` and `port` parameter. The script uses the `REDIS_HOST` and `REDIS_PORT` environment variables to set the client's IP address and port. If `host` and `port` are not defined, the default host is `localhost` and the default port is 6379.

The script uses the `set` and `hset` methods to populate the database with some keys, fields, and values. To confirm that the Redis client contains the data, the script prints the contents of the database to the console log.

When you run this workflow, you should see the following output in the "Connect to Redis" step confirming you created the Redis client and added data:

```shell
Reply: OK
Reply: 1
Reply: 1
Reply: 1
3 replies:
    0: octocat
    1: dinotocat
    2: robotocat
```
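If you want the localhost/6379 fallback to be explicit in your own script rather than relying on client defaults, you can hand-roll it when building the connection URL. This is an illustrative sketch, not part of the `redis` module's API:

```javascript
// Sketch: fall back to localhost:6379 when the environment
// variables are unset (hand-rolled, for illustration only).
const host = process.env.REDIS_HOST || "localhost";
const port = process.env.REDIS_PORT || 6379;
const url = `redis://${host}:${port}`;
console.log(url);
```

With `REDIS_HOST` and `REDIS_PORT` unset, this prints `redis://localhost:6379`.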
## Communicating with Docker service containers

Service containers are Docker containers that provide a simple and portable way for you to host services that you might need to test or operate your application in a workflow. For example, your workflow might need to run integration tests that require access to a database and memory cache.

You can configure service containers for each job in a workflow. {% data variables.product.prodname_dotcom %} creates a fresh Docker container for each service configured in the workflow, and destroys the service container when the job completes. Steps in a job can communicate with all service containers that are part of the same job. However, you cannot create and use service containers inside a composite action.

{% data reusables.actions.docker-container-os-support %}

You can configure jobs in a workflow to run directly on a runner machine or in a Docker container. Communication between a job and its service containers is different depending on whether a job runs directly on the runner machine or in a container.

### Running jobs in a container

When you run jobs in a container, {% data variables.product.prodname_dotcom %} connects service containers to the job using Docker's user-defined bridge networks. For more information, see [Bridge network driver](https://docs.docker.com/engine/network/drivers/bridge/) in the Docker documentation.

Running the job and services in a container simplifies network access. You can access a service container using the label you configure in the workflow. The hostname of the service container is automatically mapped to the label name. For example, if you create a service container with the label `redis`, the hostname of the service container is `redis`.

You don't need to configure any ports for service containers. By default, all containers that are part of the same Docker network expose all ports to each other, and no ports are exposed outside of the Docker network.
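For example, a job step could reach a service labeled `redis` by that hostname directly. The step below is a sketch, not from the original article; it assumes the job runs in a container on the same Docker network and that `redis-cli` is installed in the job container.

```yaml
      # Hypothetical step: the service label `redis` resolves as a hostname
      # on the shared Docker network; assumes redis-cli is installed.
      - name: Ping the Redis service by its label
        run: redis-cli -h redis -p 6379 ping
```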
### Running jobs on the runner machine

When running jobs directly on the runner machine, you can access service containers using `localhost:<port>` or `127.0.0.1:<port>`. {% data variables.product.prodname_dotcom %} configures the container network to enable communication from the service container to the Docker host.

When a job runs directly on a runner machine, the service running in the Docker container does not expose its ports to the job on the runner by default. You need to map ports on the service container to the Docker host. For more information, see [AUTOTITLE](/actions/using-containerized-services/about-service-containers#mapping-docker-host-and-service-container-ports).

## Creating service containers

You can use the `services` keyword to create service containers that are part of a job in your workflow. For more information, see [`jobs.<job_id>.services`](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idservices).

This example creates a service called `redis` in a job called `container-job`. The Docker host in this example is the `node:16-bullseye` container.

{% raw %}

```yaml copy
name: Redis container example
on: push

jobs:
  # Label of the container job
  container-job:
    # Containers must run in Linux based operating systems
    runs-on: ubuntu-latest
    # Docker Hub image that `container-job` executes in
    container: node:16-bullseye
    # Service containers to run with `container-job`
    services:
      # Label used to access the service container
      redis:
        # Docker Hub image
        image: redis
```

{% endraw %}

## Mapping Docker host and service container ports

If your job runs in a Docker container, you do not need to map ports on the host or the service container. If your job runs directly on the runner machine, you'll need to map any required service container ports to ports on the host runner machine.

You can map service container ports to the Docker host using the `ports` keyword.
For more information, see [`jobs.<job_id>.services`](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idservices).

| Value of `ports` | Description |
|------------------|-------------|
| `8080:80` | Maps TCP port 80 in the container to port 8080 on the Docker host. |
| `8080:80/udp` | Maps UDP port 80 in the container to port 8080 on the Docker host. |
| `8080/udp` | Maps a randomly chosen port on the Docker host to UDP port 8080 in the container. |

When you map ports using the `ports` keyword, {% data variables.product.prodname_dotcom %} uses the `--publish` command to publish the container's ports to the Docker host. For more information, see [Docker container networking](https://docs.docker.com/config/containers/container-networking/) in the Docker documentation.

When you specify the container port but not the Docker host port, the container port is randomly assigned to a free port. {% data variables.product.prodname_dotcom %} sets the assigned container port in the service container context. For example, for a `redis` service container, if you configured the Docker host port 5432, you can access the corresponding container port using the `job.services.redis.ports[5432]` context. For more information, see [AUTOTITLE](/actions/learn-github-actions/contexts#job-context).

### Example mapping Redis ports

This example maps the service container `redis` port 6379 to the Docker host port 6379.
{% raw %}

```yaml copy
name: Redis Service Example
on: push

jobs:
  # Label of the container job
  runner-job:
    # You must use a Linux environment when using service containers or container jobs
    runs-on: ubuntu-latest

    # Service containers to run with `runner-job`
    services:
      # Label used to access the service container
      redis:
        # Docker Hub image
        image: redis
        ports:
          # Opens tcp port 6379 on the host and service container
          - 6379:6379
```

{% endraw %}

## Authenticating with image registries

You can specify credentials for your service containers in case you need to authenticate with an image registry. This allows you to use images from private registries or to [increase your DockerHub rate limit](https://www.docker.com/increase-rate-limits/).

Here's an example of authenticating with Docker Hub and the {% data variables.product.prodname_dotcom %} {% data variables.product.prodname_container_registry %}:

{% raw %}

```yaml copy
jobs:
  build:
    services:
      redis:
        # Docker Hub image
        image: redis
        ports:
          - 6379:6379
        credentials:
          username: ${{ secrets.dockerhub_username }}
          password: ${{ secrets.dockerhub_password }}
      db:
        # Private registry image
        image: ghcr.io/octocat/testdb:latest
        credentials:
          username: ${{ github.repository_owner }}
          password: ${{ secrets.ghcr_password }}
```

{% endraw %}

## Further reading

* [AUTOTITLE](/actions/using-containerized-services/creating-redis-service-containers)
* [AUTOTITLE](/actions/using-containerized-services/creating-postgresql-service-containers)
You can [contact {% data variables.contact.github_support %}](/support/contacting-github-support) for assistance with {% data variables.product.prodname_actions %}.

## Providing diagnostic and troubleshooting information

The contents of private and internal repositories are not visible to {% data variables.contact.github_support %}, so {% data variables.contact.github_support %} may request additional information to understand the complete context of your inquiry and reproduce any unexpected behavior. You can accelerate the resolution of your inquiry by providing this information when you initially raise a ticket with {% data variables.contact.github_support %}. Some information that {% data variables.contact.github_support %} will request can include, but is not limited to, the following:

* The URL of the workflow run. {% ifversion ghes %}For example: `https://DOMAIN/ORG/REPO/actions/runs/0123456789`{% else %}For example: `https://github.com/ORG/REPO/actions/runs/0123456789`{% endif %}
* The workflow `.yml` file(s) attached to the ticket as `.txt` files. For more information about workflows, see [AUTOTITLE](/actions/using-workflows/about-workflows#about-workflows).
* A copy of your workflow run logs for an example workflow run failure. For more information about workflow run logs, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/using-workflow-run-logs#downloading-logs).
* {% ifversion ghes %}A copy of your runner logs, {% else %}If you are running this workflow on a self-hosted runner, self-hosted runner logs, {% endif %}which can be found under the `_diag` folder within the runner. For more information about self-hosted runners, see [AUTOTITLE](/actions/hosting-your-own-runners/managing-self-hosted-runners/monitoring-and-troubleshooting-self-hosted-runners#reviewing-the-self-hosted-runner-application-log-files).
Self-hosted runner log file names are formatted as `Runner_YYYY####-xxxxxx-utc.log` and `Worker_YYYY####-xxxxxx-utc.log`.

> [!NOTE]
> Attach files to your support ticket by changing the file's extension to `.txt` or `.zip`. If you include textual data such as log or workflow file snippets inline in your ticket, ensure they are formatted correctly as Markdown code blocks. For more information about proper Markdown formatting, see [AUTOTITLE](/get-started/writing-on-github/getting-started-with-writing-and-formatting-on-github/basic-writing-and-formatting-syntax#quoting-code).
>
> If the information you provide is unreadable due to the loss of formatting caused by improper Markdown syntax, {% data variables.contact.github_support %} may request that you resubmit the information either as an attachment or with the correct Markdown formatting.

> [!WARNING]
> Ensure all files and text provided to {% data variables.contact.github_support %} have been properly redacted to remove sensitive information such as tokens and other secrets.

{% ifversion ghes %}

Depending on the nature of your inquiry, {% data variables.contact.github_support %} may also request that you generate and upload a support bundle for further review and analysis. For more information about providing data to {% data variables.contact.github_support %} and support bundles, see [AUTOTITLE](/support/contacting-github-support/providing-data-to-github-support).

{% endif %}

### Ephemeral Runner Application Log Files

{% data variables.contact.github_support %} may request the runner application log files from ephemeral runners. {% data variables.product.prodname_dotcom %} expects and recommends that you have implemented a mechanism to forward and preserve the runner application log files from self-hosted ephemeral runners.
For more information about runner application log files and troubleshooting self-hosted runners, see [AUTOTITLE](/actions/hosting-your-own-runners/managing-self-hosted-runners/monitoring-and-troubleshooting-self-hosted-runners#reviewing-the-self-hosted-runner-application-log-files).

### {% data variables.product.prodname_actions_runner_controller %}

If you are using {% data variables.product.prodname_actions_runner_controller %} (ARC), {% data variables.contact.github_support %} may ask you to submit the complete logs for the controller, listeners, and runner pods. For more information about collecting {% data variables.product.prodname_actions_runner_controller %}'s logs, see [AUTOTITLE](/actions/hosting-your-own-runners/managing-self-hosted-runners-with-actions-runner-controller/troubleshooting-actions-runner-controller-errors#checking-the-logs-of-the-controller-and-runner-set-listener). For more information about the scope of support for {% data variables.product.prodname_actions_runner_controller %}, see [AUTOTITLE](/actions/hosting-your-own-runners/managing-self-hosted-runners-with-actions-runner-controller/about-support-for-actions-runner-controller).

### {% data variables.product.prodname_codeql %} and {% data variables.product.prodname_actions %}

If you are requesting assistance with a {% data variables.code-scanning.codeql_workflow %}, {% data variables.contact.github_support %} may request a copy of the {% data variables.product.prodname_codeql %} debugging artifacts. For more information about debugging artifacts for a {% data variables.code-scanning.codeql_workflow %}, see [AUTOTITLE](/code-security/code-scanning/troubleshooting-code-scanning/logs-not-detailed-enough#creating-codeql-debugging-artifacts).
https://github.com/github/docs/blob/main//content/actions/how-tos/get-support.md
To provide the debugging artifacts to {% data variables.contact.github_support %}, please download the {% data variables.product.prodname_codeql %} debugging artifacts from a sample workflow run and attach it to your ticket as a `.zip` file. For more information on downloading workflow artifacts, see [AUTOTITLE](/actions/managing-workflow-runs/downloading-workflow-artifacts).

If the {% data variables.product.prodname_codeql %} debugging artifacts `.zip` file is too large to upload to the ticket, please advise {% data variables.contact.github_support %}, and we will work with you to determine the next steps.

## Scope of support

{% data reusables.support.scope-of-support %}
{% data reusables.actions.enterprise-github-hosted-runners %}

## Initial troubleshooting suggestions

There are several ways you can troubleshoot failed workflow runs.

{% ifversion copilot %}

> [!NOTE]
> If you are on a {% data variables.copilot.copilot_free %} subscription, this will count towards your monthly chat message limit.

### Using {% data variables.product.prodname_copilot %}

To open a chat with {% data variables.product.prodname_copilot %} about a failed workflow run, you can either:

* Next to the failed check in the merge box, click **{% octicon "kebab-horizontal" aria-hidden="true" aria-label="kebab-horizontal" %}**, then click **{% octicon "copilot" aria-hidden="true" aria-label="copilot" %} Explain error**.
* In the merge box, click on the failed check. At the top of the workflow run summary page, click **{% octicon "copilot" aria-hidden="true" aria-label="copilot" %} Explain error**.

This opens a chat window with {% data variables.product.prodname_copilot %}, where it will provide instructions to resolve the issue.

{% endif %}

### Using workflow run logs

Each workflow run generates activity logs that you can view, search, and download. For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/using-workflow-run-logs).

### Enabling debug logging

If the workflow logs do not provide enough detail to diagnose why a workflow, job, or step is not working as expected, you can enable additional debug logging. For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/enabling-debug-logging).

If your workflow uses specific tools or actions, enabling their debug or verbose logging options can help generate more detailed output for troubleshooting. For example, you can use `npm install --verbose` for npm or `GIT_TRACE=1 GIT_CURL_VERBOSE=1 git ...` for Git.
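As a sketch, the tool-level verbose options mentioned above could be wired into workflow steps like this (the job and step names are illustrative, not part of any required configuration):

```yaml
jobs:
  diagnose:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Install dependencies with verbose npm logging
        run: npm install --verbose
      - name: Fetch with Git tracing enabled
        env:
          GIT_TRACE: "1"        # trace high-level Git operations
          GIT_CURL_VERBOSE: "1" # trace HTTP traffic for network debugging
        run: git fetch --all
```

The extra output appears in the step's log in the workflow run, alongside any debug logging you have enabled.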
{% ifversion fpt or ghec %}

## Reviewing billing errors

Actions usage includes runner minutes and storage for [workflow artifacts](/actions/writing-workflows/choosing-what-your-workflow-does/storing-and-sharing-data-from-a-workflow). For more information, see [AUTOTITLE](/billing/managing-billing-for-your-products/managing-billing-for-github-actions/about-billing-for-github-actions).

### Setting a budget

Setting an Actions budget may help immediately unblock workflows failing due to billing or storage errors. It will allow further minutes and storage usage to be billed up to the set budget amount. To learn more, see [AUTOTITLE](/billing/managing-your-billing/preventing-overspending).

{% endif %}

{% ifversion actions-metrics %}

## Reviewing {% data variables.product.prodname_actions %} activity with metrics

To analyze the efficiency and reliability of your workflows using metrics, see [AUTOTITLE](/actions/administering-github-actions/viewing-github-actions-metrics).

{% endif %}

## Troubleshooting workflow triggers

You can review your workflow's `on:` field to understand what is expected to trigger the workflow. For more information, see [AUTOTITLE](/actions/writing-workflows/choosing-when-your-workflow-runs/triggering-a-workflow). For a full list of available events, see [AUTOTITLE](/actions/writing-workflows/choosing-when-your-workflow-runs/events-that-trigger-workflows).

### Triggering event conditions

Some triggering events only run from the default branch (for example, `issues` and `schedule`). Workflow file versions that exist outside of the default branch will not trigger on these events.

Workflows will not run on `pull_request` activity if the pull request has a merge conflict.

Workflows that would otherwise be triggered on `push` or `pull_request` activity will be skipped if the commit message contains a skip annotation. For more information, see [AUTOTITLE](/actions/managing-workflow-runs-and-deployments/managing-workflow-runs/skipping-workflow-runs).
### Scheduled workflows running at unexpected times

Scheduled events can be delayed during periods of high load of {% data variables.product.prodname_actions %} workflow runs. High-load times include the start of every hour. If the load is sufficiently high, some queued jobs may be dropped. To decrease the chance of delay, schedule your workflow to run at a different time of the hour. For more information, see [AUTOTITLE](/actions/writing-workflows/choosing-when-your-workflow-runs/events-that-trigger-workflows#schedule).

### Filtering and diff limits

Specific events allow for filtering by branch, tag, and/or paths you can customize. Workflow run creation will be skipped if the filter conditions filter out the workflow. You can use special characters with filters. For more information, see [AUTOTITLE](/actions/writing-workflows/workflow-syntax-for-github-actions#filter-pattern-cheat-sheet).

For path filtering, evaluating diffs is limited to the first 300 files. If there are files changed that are not matched in the first 300 files returned by the filter, the workflow will not be run.
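A trigger sketch that applies both tips above — an off-the-hour `cron` schedule and branch/path filters. The specific cron expression, branch, and path are illustrative:

```yaml
on:
  schedule:
    # Run at 23 minutes past the hour to avoid the top-of-the-hour rush
    - cron: '23 * * * *'
  push:
    branches:
      - main
    paths:
      # Only run when source files change; remember that diff
      # evaluation covers only the first 300 changed files
      - 'src/**'
```

Filtered-out pushes do not create a workflow run at all, which is why they do not appear as skipped runs in the Actions tab.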
https://github.com/github/docs/blob/main//content/actions/how-tos/troubleshoot-workflows.md
For more information, see [AUTOTITLE](/actions/writing-workflows/workflow-syntax-for-github-actions#git-diff-comparisons).

## Troubleshoot workflow execution

Workflow execution involves any issues seen after the workflow was triggered and a workflow run has been created.

{% ifversion fpt or ghec %}

### Debugging job conditions

If a job was skipped unexpectedly, or ran when you expected it to be skipped, you can view the expression evaluation to understand why:

1. Click on the job in the workflow run.
1. Download the log archive from the job's menu.
1. Open the `JOB-NAME/system.txt` file.
1. Look for the `Evaluating`, `Expanded`, and `Result` lines.

The `Expanded` line shows the actual runtime values that were substituted into your `if` condition, making it clear why the expression evaluated to `true` or `false`. For more information, see [AUTOTITLE](/actions/how-tos/monitor-workflows/view-job-condition-logs).

{% endif %}

### Canceling Workflows

If standard cancellation through the [UI](/actions/managing-workflow-runs-and-deployments/managing-workflow-runs/canceling-a-workflow) or [API](/rest/actions/workflow-runs?apiVersion=2022-11-28#cancel-a-workflow-run) does not process as expected, there may be a conditional statement configured for your running workflow job(s) that causes it to not cancel. In these cases, you can leverage the API to force cancel the run. For more information, see [AUTOTITLE](/rest/actions/workflow-runs?apiVersion=2022-11-28#force-cancel-a-workflow-run).
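Assuming the {% data variables.product.prodname_cli %} is installed and authenticated, a force-cancel call against the REST endpoint referenced above could look like this (`OWNER`, `REPO`, and `RUN_ID` are placeholders for your own values):

```shell
# Force cancel a workflow run that ignores normal cancellation,
# bypassing conditions in running jobs that keep them alive
gh api --method POST \
  /repos/OWNER/REPO/actions/runs/RUN_ID/force-cancel
```

The run ID is the numeric segment at the end of the workflow run's URL.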
A common cause can be using the `always()` [status check function](/actions/writing-workflows/choosing-what-your-workflow-does/evaluate-expressions-in-workflows-and-actions#status-check-functions), which returns `true` even on cancellation. An alternative is to use the inverse of the `cancelled()` function, `{% raw %}${{ !cancelled() }}{% endraw %}`. For more information, see [AUTOTITLE](/actions/writing-workflows/choosing-when-your-workflow-runs/using-conditions-to-control-job-execution) and [AUTOTITLE](/actions/managing-workflow-runs-and-deployments/managing-workflow-runs/canceling-a-workflow#steps-github-takes-to-cancel-a-workflow-run).

## Troubleshooting runners

### Defining runner labels

{% data variables.product.github %}-hosted runners leverage [preset labels](/actions/using-github-hosted-runners/using-github-hosted-runners/about-github-hosted-runners#standard-github-hosted-runners-for-public-repositories) maintained through the [`actions/runner-images`](https://github.com/actions/runner-images?tab=readme-ov-file#available-images) repository. We recommend using unique label names for larger and self-hosted runners. If a label matches any of the existing preset labels, there can be runner assignment issues where there is no guarantee which matching runner option the job will run on.

### Self-hosted runners

If you use self-hosted runners, you can view their activity and diagnose common issues. For more information, see [AUTOTITLE](/actions/how-tos/manage-runners/self-hosted-runners/monitor-and-troubleshoot).

## Networking troubleshooting suggestions

Our support is limited for network issues that involve:

* Your networks
* External networks
* Third-party systems
* General internet connectivity

To view {% data variables.product.github %}'s realtime platform status, check [{% data variables.product.github %} Status](https://githubstatus.com/).
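A minimal sketch of a cleanup job that runs after an earlier job, even on failure, but still honors cancellation — the job names and `make` targets are illustrative:

{% raw %}

```yaml
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - run: make build
  cleanup:
    needs: build
    # Runs even if build failed, but NOT if the run was cancelled.
    # Using always() here instead would keep this job running through
    # a cancellation and block the run from stopping.
    if: ${{ !cancelled() }}
    runs-on: ubuntu-latest
    steps:
      - run: make clean
```

{% endraw %}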
For other network-related issues, review your organization's network settings and verify the status of any third-party services you're accessing. If problems persist, consider reaching out to your network administrators for further assistance. If you're unsure about the issue, contact {% data variables.contact.github_support %}. For details on how to contact support, see [AUTOTITLE](/support/contacting-github-support).

### DNS

Issues may occur from Domain Name System (DNS) configuration, resolution, or resolver problems. We recommend you review available logs, vendor documentation, or consult with your administrators for additional assistance.

### Firewalls

Activities may become blocked by firewalls. If this occurs, you may want to review available logs, vendor documentation, or consult with your administrators for additional assistance.

### Proxies

Activities could fail when using a proxy for communications. It's good practice to review available logs, vendor documentation, or consult with your administrators for additional assistance. Refer to [AUTOTITLE](/actions/how-tos/manage-runners/self-hosted-runners/use-proxy-servers) for information about configuring the runner application to utilize a proxy.
### Subnets

It is possible to encounter issues with subnets in use or overlaps with an existing network, such as within virtual cloud provider or Docker networks. In such cases, we recommend you review your network topology and subnets in use.

### Certificates

Issues may occur from self-signed or custom certificate chains and certificate stores. You can check that a certificate in use has not expired and is currently trusted. Certificates may be inspected with `curl` or similar tools. You can also review available logs, vendor documentation, or consult with your administrators for additional assistance.

### IP lists

IP allow or deny lists may disrupt expected communications. If there is a problem, you should review available logs, vendor documentation, or consult with your administrators for additional assistance.

{% ifversion ghec %}

If your {% data variables.product.github %} account is configured with an IP allowlist, workflows will fail if a runner uses an IP address that isn't included in the allowlist. To resolve this, verify that the runner's IP addresses are added to your organization's or enterprise's allowlist. For more details, see [AUTOTITLE](/organizations/keeping-your-organization-secure/managing-security-settings-for-your-organization/managing-allowed-ip-addresses-for-your-organization) and/or [AUTOTITLE](/admin/configuring-settings/hardening-security-for-your-enterprise/restricting-network-traffic-to-your-enterprise-with-an-ip-allow-list).

{% endif %}

{% ifversion fpt or ghec %}

For information on {% data variables.product.github %}'s IP addresses, such as those used by {% data variables.product.github %}-hosted runners, see [AUTOTITLE](/authentication/keeping-your-account-and-data-secure/about-githubs-ip-addresses). Static IP addresses are available for use with {% data variables.product.github %}-hosted larger runners. See [AUTOTITLE](/actions/how-tos/manage-runners/larger-runners/manage-larger-runners) for more information.
{% endif %}

### Operating systems and software applications

In addition to firewalls or proxies, customizations performed to {% data variables.product.github %}-hosted runners, such as installing additional software packages, may result in communication disruptions. For information about available customization options, see [AUTOTITLE](/actions/how-tos/manage-runners/github-hosted-runners/customize-runners).

* For self-hosted runners, learn more about necessary endpoints in [AUTOTITLE](/actions/reference/runners/self-hosted-runners).
* For help configuring WireGuard, see [AUTOTITLE](/actions/how-tos/manage-runners/github-hosted-runners/connect-to-a-private-network/connect-with-wireguard).
* For details about configuring OpenID Connect (OIDC), see [AUTOTITLE](/actions/how-tos/manage-runners/github-hosted-runners/connect-to-a-private-network/connect-with-oidc).

{% ifversion fpt or ghec %}

### Azure private networking for {% data variables.product.github %}-hosted runners

Issues may arise from the use of {% data variables.product.github %}-hosted runners within your configured Azure Virtual Networks (VNETs) settings. For troubleshooting advice, see [AUTOTITLE](/organizations/managing-organization-settings/troubleshooting-azure-private-network-configurations-for-github-hosted-runners-in-your-organization) or {% ifversion ghec %}[AUTOTITLE](/admin/configuring-settings/configuring-private-networking-for-hosted-compute-products/troubleshooting-azure-private-network-configurations-for-github-hosted-runners-in-your-enterprise){% else %}[AUTOTITLE](/enterprise-cloud@latest/admin/configuring-settings/configuring-private-networking-for-hosted-compute-products/troubleshooting-azure-private-network-configurations-for-github-hosted-runners-in-your-enterprise) in the {% data variables.product.prodname_ghe_cloud %} docs{% endif %}.

{% endif %}
## Prerequisites

Before starting this guide, you should be familiar with:

* The usage and security benefits of artifact attestations. See [AUTOTITLE](/actions/concepts/security/artifact-attestations).
* Generating artifact attestations. See [AUTOTITLE](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
* Writing and using reusable workflows. See [AUTOTITLE](/actions/using-workflows/reusing-workflows).

## Step 1: Configuring your builds

First, we need to build with both artifact attestations and a reusable workflow.

### Building with a reusable workflow

If you aren't already using reusable workflows to build your software, you'll need to take your build steps and move them into a reusable workflow.

### Building with artifact attestations

The reusable workflow you use to build your software must also generate artifact attestations to establish build provenance. When you use a reusable workflow to generate artifact attestations, both the calling workflow and the reusable workflow need to have the following permissions:

```yaml copy
permissions:
  attestations: write
  contents: read
  id-token: write
```

If you are building container images, you will also need to include the `packages: write` permission.

## Step 2: Verifying artifact attestations built with a reusable workflow

To verify the artifact attestations generated with your builds, you can use [`gh attestation verify`](https://cli.github.com/manual/gh_attestation_verify) from the GitHub CLI. The `gh attestation verify` command requires either the `--owner` or `--repo` flag. These flags do two things:

* They tell `gh attestation verify` where to fetch the attestation from. This will always be your caller workflow.
* They tell `gh attestation verify` where the workflow that did the signing came from.
This will always be the workflow that uses the [`attest-build-provenance` action](https://github.com/actions/attest-build-provenance), which may be a reusable workflow.

You can use optional flags with the `gh attestation verify` command:

* If your reusable workflow is not in the same repository as the caller workflow, use the `--signer-repo` flag to specify the repository that contains the reusable workflow.
* If you would like to require an artifact attestation to be signed with a specific workflow, use the `--signer-workflow` flag to indicate the workflow file that should be used.

For example, if your calling workflow is `ORGANIZATION_NAME/REPOSITORY_NAME/.github/workflows/calling.yml` and it uses `REUSABLE_ORGANIZATION_NAME/REUSABLE_REPOSITORY_NAME/.github/workflows/reusable.yml`, you could run:

```bash copy
gh attestation verify -o ORGANIZATION_NAME --signer-repo REUSABLE_ORGANIZATION_NAME/REUSABLE_REPOSITORY_NAME PATH/TO/YOUR/BUILD/ARTIFACT-BINARY
```

Or, if you want to specify the exact workflow:

```bash copy
gh attestation verify -o ORGANIZATION_NAME --signer-workflow REUSABLE_ORGANIZATION_NAME/REUSABLE_REPOSITORY_NAME/.github/workflows/reusable.yml PATH/TO/YOUR/BUILD/ARTIFACT-BINARY
```
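The caller side described in Step 1 might be wired up like this sketch — the workflow name, repository names, and `@main` ref are placeholders, matching the examples above:

```yaml
name: Build with provenance
on: push

jobs:
  build:
    # Call the reusable build workflow that runs the
    # attest-build-provenance action and signs the artifact
    uses: REUSABLE_ORGANIZATION_NAME/REUSABLE_REPOSITORY_NAME/.github/workflows/reusable.yml@main
    permissions:
      attestations: write
      contents: read
      id-token: write
```

Both this caller and the reusable workflow it calls need the permissions shown, plus `packages: write` when building container images.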
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/use-artifact-attestations/increase-security-rating.md
## Prerequisites

Before starting this guide, you should be generating artifact attestations for your builds. See [AUTOTITLE](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).

## Step 1: Download attestation bundle

First, get the attestation bundle from the attestation API. You can do so with the following command from a machine that is online:

```bash copy
gh attestation download PATH/TO/YOUR/BUILD/ARTIFACT-BINARY -R ORGANIZATION_NAME/REPOSITORY_NAME
```

Here is example output from that command:

```bash
Wrote attestations to file sha256:ae57936def59bc4c75edd3a837d89bcefc6d3a5e31d55a6fa7a71624f92c3c3b.jsonl.
Any previous content has been overwritten

The trusted metadata is now available at sha256:ae57936def59bc4c75edd3a837d89bcefc6d3a5e31d55a6fa7a71624f92c3c3b.jsonl
```

## Step 2: Download trusted roots

Next, get the key material from the trusted roots. Artifact attestations uses the Sigstore public good instance for public repositories, and GitHub's Sigstore instance for private repositories. You can use one command to get both trusted roots:

```bash copy
gh attestation trusted-root > trusted_root.jsonl
```

### Updating trusted root information in an offline environment

It's best practice to generate a new `trusted_root.jsonl` file any time you are importing new signed material into your offline environment. The key material in `trusted_root.jsonl` does not have a built-in expiration date, so anything signed before you generate the trusted root file will continue to successfully verify. Anything signed after the file is generated will verify until that Sigstore instance rotates its key material, which typically happens a few times per year. You will not know if key material has been revoked since you last generated the trusted root file.

## Step 3: Perform offline verification

Now, you are ready to verify the artifact offline.
You should import into your offline environment:

* {% data variables.product.prodname_cli %}
* Your artifact
* The bundle file
* The trusted root file

You can then perform offline verification with the following command:

```bash copy
gh attestation verify PATH/TO/YOUR/BUILD/ARTIFACT-BINARY -R ORGANIZATION_NAME/REPOSITORY_NAME --bundle sha256:ae57936def59bc4c75edd3a837d89bcefc6d3a5e31d55a6fa7a71624f92c3c3b.jsonl --custom-trusted-root trusted_root.jsonl
```
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/use-artifact-attestations/verify-attestations-offline.md
> [!NOTE]
> Before proceeding, ensure you have enabled build provenance for container images, including setting the `push-to-registry` attribute in the [`attest-build-provenance` action](https://github.com/actions/attest-build-provenance) as documented in [Generating build provenance for container images](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds#generating-build-provenance-for-container-images). This is required for the Policy Controller to verify the attestation.

## Getting started with Kubernetes admission controller

To set up an admission controller for enforcing GitHub artifact attestations, you need to:

1. [Deploy the Sigstore Policy Controller](#deploy-the-sigstore-policy-controller).
1. [Add the GitHub `TrustRoot` and a `ClusterImagePolicy` to your cluster](#add-the-github-trustroot-and-a-clusterimagepolicy).
1. [Enable the policy in your namespace](#enable-the-policy-in-your-namespace).

### Deploy the Sigstore Policy Controller

The Sigstore Policy Controller has been packaged and made available via a [Helm chart](https://github.com/sigstore/helm-charts). Before you begin, ensure you have the following prerequisites:

* A Kubernetes cluster with version 1.27 or later
* [Helm](https://helm.sh/docs/intro/install/) 3.0 or later
* [kubectl](https://kubernetes.io/docs/tasks/tools/)

First, install the Helm chart that deploys the Sigstore Policy Controller:

```bash copy
helm upgrade policy-controller --install --atomic \
  --create-namespace --namespace artifact-attestations \
  oci://ghcr.io/sigstore/helm-charts/policy-controller \
  --version 0.10.5
```

This installs the Policy Controller into the `artifact-attestations` namespace. At this point, no policies have been configured, and it will not enforce any attestations.

### Add the GitHub `TrustRoot` and a `ClusterImagePolicy`

Once the Policy Controller has been deployed, you need to add the GitHub `TrustRoot` and a `ClusterImagePolicy` to your cluster.
Use the Helm chart we provide to do this. Make sure to replace `MY-ORGANIZATION` with your GitHub organization's name (e.g., `github` or `octocat-inc`).

```bash copy
helm upgrade trust-policies --install --atomic \
  --namespace artifact-attestations \
  oci://ghcr.io/github/artifact-attestations-helm-charts/trust-policies \
  --version v0.7.0 \
  --set policy.enabled=true \
  --set policy.organization=MY-ORGANIZATION
```

You've now installed the GitHub trust root and an artifact attestation policy into your cluster. This policy will reject artifacts that have not originated from within your GitHub organization.

### Enable the policy in your namespace

> [!WARNING]
> This policy will not be enforced until you specify which namespaces it should apply to.

Each namespace in your cluster can independently enforce policies. To enable enforcement in a namespace, you can add the following label to the namespace:

```yaml
metadata:
  labels:
    policy.sigstore.dev/include: "true"
```

After the label is added, the GitHub artifact attestation policy will be enforced in the namespace.

Alternatively, you may run:

```bash copy
kubectl label namespace MY-NAMESPACE policy.sigstore.dev/include=true
```

### Matching images

By default, the policy installed with the `trust-policies` Helm chart will verify attestations for all images before admitting them into the cluster. If you only intend to enforce attestations for a subset of images, you can use the Helm values `policy.images` and `policy.exemptImages` to specify a list of images to match against. These values can be set to a list of glob patterns that match the image names. The globbing syntax uses Go [filepath](https://pkg.go.dev/path/filepath#Match) semantics, with the addition of `**` to match any character sequence, including slashes.
For example, to enforce attestations for images that match the pattern `ghcr.io/MY-ORGANIZATION/*` and admit `busybox` without a valid attestation, you can run:

```bash copy
helm upgrade trust-policies --install --atomic \
  --namespace artifact-attestations \
  oci://ghcr.io/github/artifact-attestations-helm-charts/trust-policies \
  --version v0.7.0 \
  --set policy.enabled=true \
  --set policy.organization=MY-ORGANIZATION \
  --set-json 'policy.exemptImages=["index.docker.io/library/busybox**"]' \
  --set-json 'policy.images=["ghcr.io/MY-ORGANIZATION/**"]'
```

All patterns must use the fully-qualified name, even if the images originate from Docker Hub. In this example, if we want to exempt the image `busybox`, we must provide the full name including the domain and double-star glob to match all image versions: `index.docker.io/library/busybox**`.

Note that any image you intend to admit _must_ have a matching glob pattern in the `policy.images` list. If an image does not match any pattern, it will be rejected. Additionally, if an image matches both `policy.images` and `policy.exemptImages`, it will be rejected.
{% ifversion ghec %}

If your GitHub Enterprise account has a subdomain on {% data variables.enterprise.data_residency_site %}, you must specify a value for the GitHub trust domain. This value is used to fetch the trusted materials associated with the data residency region that hosts your GitHub Enterprise account. This value can be found by logging into your enterprise account with the `gh` CLI tool and running the following command:

```bash copy
gh api meta --jq .domains.artifact_attestations.trust_domain
```

This value must be added when installing the `trust-policies` chart, like so:

```bash copy
--set-json 'policy.trust.githubTrustDomain="YOUR-GHEC-TRUST-DOMAIN"'
```

{% endif %}

### Advanced usage

To see the full set of options you may configure with the Helm chart, you can run either of the following commands.

For policy controller options:

```bash copy
helm show values oci://ghcr.io/sigstore/helm-charts/policy-controller --version 0.10.5
```

For trust policy options:

```bash copy
helm show values oci://ghcr.io/github/artifact-attestations-helm-charts/trust-policies --version v0.7.0
```

For more information on the Sigstore Policy Controller, see the [Sigstore Policy Controller documentation](https://docs.sigstore.dev/policy-controller/overview/).
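As an alternative to repeating `--set` flags on every upgrade, the same options can be kept in a values file and passed to Helm with `-f`. A minimal sketch — the filename is a placeholder, and the keys simply mirror the `--set` and `--set-json` flags used above:

```yaml
# trust-policies-values.yaml (illustrative filename)
policy:
  enabled: true
  organization: MY-ORGANIZATION
  # Globs of images that must carry a valid attestation
  images:
    - "ghcr.io/MY-ORGANIZATION/**"
  # Globs of images admitted without an attestation
  exemptImages:
    - "index.docker.io/library/busybox**"
```

Keeping the policy in a version-controlled values file makes it easier to review changes to the enforcement rules over time.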
## Prerequisites

Before you start generating artifact attestations, you need to understand what they are and when you should use them. See [AUTOTITLE](/actions/concepts/security/artifact-attestations).

## Generating artifact attestations for your builds

You can use {% data variables.product.prodname_actions %} to generate artifact attestations that establish build provenance for artifacts such as binaries and container images.

To generate an artifact attestation, you must:

* Ensure you have the appropriate permissions configured in your workflow.
* Include a step in your workflow that uses the [`attest-build-provenance` action](https://github.com/actions/attest-build-provenance).

When you run your updated workflows, they will build your artifacts and generate an artifact attestation that establishes build provenance. You can view attestations in your repository's **Actions** tab. For more information, see the [`attest-build-provenance`](https://github.com/actions/attest-build-provenance) repository.

### Generating build provenance for binaries

1. In the workflow that builds the binary you would like to attest, add the following permissions.

   ```yaml
   permissions:
     id-token: write
     contents: read
     attestations: write
   ```

1. After the step where the binary has been built, add the following step.

   ```yaml
   - name: Generate artifact attestation
     uses: actions/attest-build-provenance@v3
     with:
       subject-path: 'PATH/TO/ARTIFACT'
   ```

   The value of the `subject-path` parameter should be set to the path to the binary you want to attest.

### Generating build provenance for container images

1. In the workflow that builds the container image you would like to attest, add the following permissions.

   ```yaml
   permissions:
     id-token: write
     contents: read
     attestations: write
     packages: write
   ```

1. After the step where the image has been built, add the following step.
   ```yaml
   - name: Generate artifact attestation
     uses: actions/attest-build-provenance@v3
     with:
       subject-name: {% raw %}${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}{% endraw %}
       subject-digest: 'sha256:fedcba0...'
       push-to-registry: true
   ```

   The value of the `subject-name` parameter should specify the fully-qualified image name. For example, `ghcr.io/user/app` or `acme.azurecr.io/user/app`. Do not include a tag as part of the image name.

   The value of the `subject-digest` parameter should be set to the SHA256 digest of the subject for the attestation, in the form `sha256:HEX_DIGEST`. If your workflow uses `docker/build-push-action`, you can use the [`digest`](https://github.com/docker/build-push-action?tab=readme-ov-file#outputs) output from that step to supply the value. For more information on using outputs, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idoutputs).

## Generating an attestation for a software bill of materials (SBOM)

You can generate signed SBOM attestations for workflow artifacts.

To generate an attestation for an SBOM, you must:

* Ensure you have the appropriate permissions configured in your workflow.
* Create an SBOM for your artifact. For more information, see [`anchore-sbom-action`](https://github.com/marketplace/actions/anchore-sbom-action) in the {% data variables.product.prodname_marketplace %}.
* Include a step in your workflow that uses the [`attest-sbom` action](https://github.com/actions/attest-sbom).

When you run your updated workflows, they will build your artifacts and generate an SBOM attestation. You can view attestations in your repository's **Actions** tab. For more information, see the [`attest-sbom` action](https://github.com/actions/attest-sbom) repository.

### Generating an SBOM attestation for binaries

1. In the workflow that builds the binary you would like to attest, add the following permissions.

   ```yaml
   permissions:
     id-token: write
     contents: read
     attestations: write
   ```

1.
After the step where the binary has been built, add the following step.

   ```yaml
   - name: Generate SBOM attestation
     uses: actions/attest-sbom@v2
     with:
       subject-path: 'PATH/TO/ARTIFACT'
       sbom-path: 'PATH/TO/SBOM'
   ```

   The value of the `subject-path` parameter should be set to the path of the binary the SBOM describes.

   The value of the `sbom-path` parameter should be set to the path of the SBOM file you generated.
### Generating an SBOM attestation for container images

1. In the workflow that builds the container image you would like to attest, add the following permissions.

   ```yaml
   permissions:
     id-token: write
     contents: read
     attestations: write
     packages: write
   ```

1. After the step where the image has been built, add the following step.

   ```yaml
   - name: Generate SBOM attestation
     uses: actions/attest-sbom@v2
     with:
       subject-name: {% raw %}${{ env.REGISTRY }}/PATH/TO/IMAGE{% endraw %}
       subject-digest: 'sha256:fedcba0...'
       sbom-path: 'sbom.json'
       push-to-registry: true
   ```

   The value of the `subject-name` parameter should specify the fully-qualified image name. For example, `ghcr.io/user/app` or `acme.azurecr.io/user/app`. Do not include a tag as part of the image name.

   The value of the `subject-digest` parameter should be set to the SHA256 digest of the subject for the attestation, in the form `sha256:HEX_DIGEST`. If your workflow uses `docker/build-push-action`, you can use the [`digest`](https://github.com/docker/build-push-action?tab=readme-ov-file#outputs) output from that step to supply the value. For more information on using outputs, see [AUTOTITLE](/actions/using-workflows/workflow-syntax-for-github-actions#jobsjob_idoutputs).

   The value of the `sbom-path` parameter should be set to the path to the JSON-formatted SBOM file you want to attest.

## Uploading artifacts to the {% data variables.product.virtual_registry %}

We recommend uploading attested assets to your organization's {% data variables.product.virtual_registry %}. This page displays artifacts' build history, deployment records, and storage details. You can use this data to prioritize security alerts or quickly connect vulnerable artifacts to their owning team, source code, and build run. For more information, see [AUTOTITLE](/code-security/concepts/supply-chain-security/linked-artifacts).
{% data reusables.actions.attestation-virtual-registry %}

For an example workflow, see [AUTOTITLE](/code-security/how-tos/secure-your-supply-chain/establish-provenance-and-integrity/upload-linked-artifacts#generating-an-attestation).

## Verifying artifact attestations with the {% data variables.product.prodname_cli %}

You can validate artifact attestations for binaries and container images and validate SBOM attestations using the {% data variables.product.prodname_cli %}. For more information, see the [`attestation`](https://cli.github.com/manual/gh_attestation) section of the {% data variables.product.prodname_cli %} manual.

> [!NOTE]
> These commands assume you are in an online environment. If you are in an offline or air-gapped environment, see [AUTOTITLE](/actions/security-guides/verifying-attestations-offline).

### Verifying an artifact attestation for binaries

To verify artifact attestations for **binaries**, use the following {% data variables.product.prodname_cli %} command.

```bash copy
gh attestation verify PATH/TO/YOUR/BUILD/ARTIFACT-BINARY -R ORGANIZATION_NAME/REPOSITORY_NAME
```

### Verifying an artifact attestation for container images

To verify artifact attestations for **container images**, you must provide the image's FQDN prefixed with `oci://` instead of the path to a binary. You can use the following {% data variables.product.prodname_cli %} command.

```bash copy
docker login ghcr.io
gh attestation verify oci://ghcr.io/ORGANIZATION_NAME/IMAGE_NAME:test -R ORGANIZATION_NAME/REPOSITORY_NAME
```

### Verifying an attestation for SBOMs

To verify SBOM attestations, you have to provide the `--predicate-type` flag to reference a non-default predicate. For more information, see [Vetted predicates](https://github.com/in-toto/attestation/tree/main/spec/predicates#vetted-predicates) in the `in-toto/attestation` repository.
For example, the [`attest-sbom` action](https://github.com/actions/attest-sbom) currently supports either SPDX or CycloneDX SBOM predicates. To verify an SBOM attestation in the SPDX format, you can use the following {% data variables.product.prodname_cli %} command.

```bash copy
gh attestation verify PATH/TO/YOUR/BUILD/ARTIFACT-BINARY \
  -R ORGANIZATION_NAME/REPOSITORY_NAME \
  --predicate-type https://spdx.dev/Document/v2.3
```

To view more information on the attestation, add the `--format json` flag. This can be especially helpful when reviewing SBOM attestations.

```bash copy
gh attestation verify PATH/TO/YOUR/BUILD/ARTIFACT-BINARY \
  -R ORGANIZATION_NAME/REPOSITORY_NAME \
  --predicate-type https://spdx.dev/Document/v2.3 \
  --format json \
  --jq '.[].verificationResult.statement.predicate'
```

## Next steps

To keep your attestations relevant and manageable, you should delete attestations that are no longer needed. See [AUTOTITLE](/actions/how-tos/security-for-github-actions/using-artifact-attestations/managing-the-lifecycle-of-artifact-attestations).

You can also generate release attestations to help consumers verify the integrity and origin of your releases. For more information, see [AUTOTITLE](/code-security/supply-chain-security/understanding-your-software-supply-chain/immutable-releases).
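The provenance and SBOM sections above both note that, when your workflow builds images with `docker/build-push-action`, its `digest` step output can supply `subject-digest`. The following fragment sketches that wiring end to end; the step id `push`, the env names, and the `latest` tag are illustrative choices, not prescribed by the actions involved.

{% raw %}

```yaml
    steps:
      - name: Build and push image
        id: push   # an explicit id is needed to reference this step's outputs
        uses: docker/build-push-action@v6
        with:
          push: true
          tags: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}:latest

      - name: Generate artifact attestation
        uses: actions/attest-build-provenance@v3
        with:
          subject-name: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
          subject-digest: ${{ steps.push.outputs.digest }}
          push-to-registry: true
```

{% endraw %}

The same {% raw %}`${{ steps.push.outputs.digest }}`{% endraw %} expression can supply `subject-digest` to an `attest-sbom` step.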
{% data reusables.actions.lifecycle-of-attestations %}

## Finding attestations

1. Navigate to the repository where the attestation was produced.
{% data reusables.repositories.actions-tab %}
1. In the left sidebar, under "Management," click **{% octicon "verified" aria-hidden="true" aria-label="verified" %} Attestations**.
1. The attestations are sorted by creation date, newest first. Use the "Search or filter" bar to search for an attestation or filter the results.

### Searching and filtering

Enter **free text** to search by subject name. This returns all attestations with subject names that partially match your search string. Multiple attestations can have the same subject name.

Use the `created` filter to filter by creation date. To enter a custom date range, click today's date then edit the default query.

* For example: `created:<2025-04-03`.
* Supported operators: `> <`.

Use the `predicate` filter to filter by the kind of attestation. A predicate is the type of claim that an attestation makes about an artifact, such as "this artifact was built during a particular workflow run and originates from this repository."

* Provenance attestations were created with the `attest-build-provenance` action.
* SBOM attestations were created with the `attest-sbom` action.
* Custom predicate type patterns are **not** supported in the search field, but are supported by the API.

## Deleting attestations

Before deleting an attestation, we recommend downloading a copy of it. Once the attestation is deleted, consumers with a verification process in place will **no longer be able to use the associated artifact**, and you will no longer be able to find the attestation on {% data variables.product.github %}.

1. In the list of attestations, select the checkbox next to the attestations you want to delete. You can select multiple attestations at a time.
1. Click **{% octicon "trash" aria-hidden="true" aria-label="trash" %} Delete**.
1.
Read the message, then confirm by clicking **Delete attestations**.

## Managing attestations with the API

To manage attestations in bulk with the REST API, see [AUTOTITLE](/rest/users/attestations).
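As one illustration of reading this data programmatically, the `gh` CLI can call the attestations API to list the attestations recorded for a given subject digest. This is a sketch only — `MY-USER` and the digest are placeholders, and the exact endpoint shapes (including organization-scoped variants) are defined in the REST reference linked above:

```bash copy
gh api /users/MY-USER/attestations/sha256:HEX_DIGEST
```

The response includes each attestation's signed bundle, which you can save before deleting an attestation you may need to verify later.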
{% data reusables.actions.enterprise-github-hosted-runners %}

## Overview

OpenID Connect (OIDC) allows your {% data variables.product.prodname_actions %} workflows to access resources in Google Cloud Platform (GCP), without needing to store the GCP credentials as long-lived {% data variables.product.prodname_dotcom %} secrets.

This guide gives an overview of how to configure GCP to trust {% data variables.product.prodname_dotcom %}'s OIDC as a federated identity, and includes a workflow example for the [`google-github-actions/auth`](https://github.com/google-github-actions/auth) action that uses tokens to authenticate to GCP and access resources.

## Prerequisites

{% data reusables.actions.oidc-link-to-intro %}

{% data reusables.actions.oidc-security-notice %}

{% data reusables.actions.oidc-on-ghecom %}

{% ifversion ghes %}

{% data reusables.actions.oidc-endpoints %}

  > [!NOTE]
  > Google Cloud Platform does not have fixed IP ranges defined for these endpoints.

* Make sure that the value of the issuer claim that's included with the JSON Web Token (JWT) is set to a publicly routable URL. For more information, see [AUTOTITLE](/enterprise-server@latest/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect).

{% endif %}

## Adding a Google Cloud Workload Identity Provider

To configure the OIDC identity provider in GCP, you will need to perform the following configuration. For instructions on making these changes, refer to [the GCP documentation](https://github.com/google-github-actions/auth).

1. Create a new identity pool.
1. Configure the mapping and add conditions.
1. Connect the new pool to a service account.

Additional guidance for configuring the identity provider:

* For security hardening, make sure you've reviewed [Configuring the OIDC trust with the cloud](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#configuring-the-oidc-trust-with-the-cloud).
  For an example, see [Configuring the subject in your cloud provider](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#configuring-the-subject-in-your-cloud-provider).
* For the service account to be available for configuration, it needs to be assigned to the `roles/iam.workloadIdentityUser` role. For more information, see [the GCP documentation](https://cloud.google.com/iam/docs/workload-identity-federation?_ga=2.114275588.-285296507.1634918453#conditions).
* The Issuer URL to use: {% ifversion ghes %}`https://HOSTNAME/_services/token`{% else %}`https://token.actions.githubusercontent.com`{% endif %}

## Updating your {% data variables.product.prodname_actions %} workflow

To update your workflows for OIDC, you will need to make two changes to your YAML:

1. Add permissions settings for the token.
1. Use the [`google-github-actions/auth`](https://github.com/google-github-actions/auth) action to exchange the OIDC token (JWT) for a cloud access token.

{% data reusables.actions.oidc-deployment-protection-rules %}

### Adding permissions settings

{% data reusables.actions.oidc-permissions-token %}

### Requesting the access token

The `google-github-actions/auth` action receives a JWT from the {% data variables.product.prodname_dotcom %} OIDC provider, and then requests an access token from GCP. For more information, see the GCP [documentation](https://github.com/google-github-actions/auth).

This example has a job called `Get_OIDC_ID_token` that uses actions to request a list of services from GCP.

* `WORKLOAD-IDENTITY-PROVIDER`: Replace this with the path to your identity provider in GCP. For example, `projects/example-project-id/locations/global/workloadIdentityPools/name-of-pool/providers/name-of-provider`
* `SERVICE-ACCOUNT`: Replace this with the name of your service account in GCP.
This action exchanges a {% data variables.product.prodname_dotcom %} OIDC token for a Google Cloud access token, using [Workload Identity Federation](https://cloud.google.com/iam/docs/workload-identity-federation).

{% raw %}

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

name: List services in GCP

on:
  pull_request:
    branches:
      - main

permissions:
  id-token: write

jobs:
  Get_OIDC_ID_token:
    runs-on: ubuntu-latest
    steps:
      - id: 'auth'
        name: 'Authenticate to GCP'
        uses: 'google-github-actions/auth@f1e2d3c4b5a6f7e8d9c0b1a2c3d4e5f6a7b8c9d0'
        with:
          create_credentials_file: 'true'
          workload_identity_provider: 'WORKLOAD-IDENTITY-PROVIDER'
          service_account: 'SERVICE-ACCOUNT'
      - id: 'gcloud'
        name: 'gcloud'
        run: |-
          gcloud auth login --brief --cred-file="${{ steps.auth.outputs.credentials_file_path }}"
          gcloud services list
```

{% endraw %}

## Further reading

{% data reusables.actions.oidc-further-reading %}
## Overview

OpenID Connect (OIDC) allows your {% data variables.product.prodname_actions %} workflows to authenticate with [JFrog](https://jfrog.com/) to download and publish artifacts without storing JFrog passwords, tokens, or API keys in {% data variables.product.company_short %}.

This guide gives an overview of how to configure JFrog to trust {% data variables.product.prodname_dotcom %}'s OIDC as a federated identity, and demonstrates how to use this configuration in a {% data variables.product.prodname_actions %} workflow.

For an example {% data variables.product.prodname_actions %} workflow, see [Sample {% data variables.product.prodname_actions %} Integration](https://jfrog.com/help/r/jfrog-platform-administration-documentation/sample-github-actions-integration) in the JFrog documentation.

For an example {% data variables.product.prodname_actions %} workflow using the JFrog CLI, see [`build-publish.yml`](https://github.com/jfrog/jfrog-github-oidc-example/blob/main/.github/workflows/build-publish.yml) in the `jfrog-github-oidc-example` repository.

## Prerequisites

{% data reusables.actions.oidc-link-to-intro %}

{% data reusables.actions.oidc-security-notice %}

{% data reusables.actions.oidc-on-ghecom %}

* To be secure, you need to set a Claims JSON in JFrog when configuring identity mappings. For more information, see [AUTOTITLE](https://jfrog.com/help/r/jfrog-platform-administration-documentation/configure-identity-mappings) and [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#customizing-the-token-claims).

  For example, you can set `iss` to `https://token.actions.githubusercontent.com`, and the `repository` to something like `octo-org/octo-repo`. This will ensure only Actions workflows from the specified repository will have access to your JFrog platform.

  The following is an example Claims JSON when configuring identity mappings.
{% data reusables.actions.jfrog-json-configuring-identity-mappings %}

## Adding the identity provider to JFrog

To use OIDC with JFrog, establish a trust relationship between {% data variables.product.prodname_actions %} and the JFrog platform. For more information about this process, see [OpenID Connect Integration](https://jfrog.com/help/r/jfrog-platform-administration-documentation/openid-connect-integration) in the JFrog documentation.

1. Sign in to your JFrog Platform.
1. Configure trust between JFrog and your {% data variables.product.prodname_actions %} workflows.
1. Configure identity mappings.

## Updating your {% data variables.product.prodname_actions %} workflow

### Authenticating with JFrog using OIDC

In your {% data variables.product.prodname_actions %} workflow file, ensure you are using the provider name and audience you configured in the JFrog Platform. The following example uses the placeholders `YOUR_PROVIDER_NAME` and `YOUR_AUDIENCE`.

{% raw %}

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}

permissions:
  id-token: write
  contents: read

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - name: Set up JFrog CLI with OIDC
        id: setup-jfrog-cli
        uses: jfrog/setup-jfrog-cli@29fa5190a4123350e81e2a2e8d803b2a27fed15e
        with:
          JF_URL: ${{ env.JF_URL }}
          oidc-provider-name: 'YOUR_PROVIDER_NAME'
          oidc-audience: 'YOUR_AUDIENCE' # This is optional

      - name: Upload artifact
        run: jf rt upload "dist/*.zip" my-repo/
```

{% endraw %}

> [!TIP]
> When OIDC authentication is used, the `setup-jfrog-cli` action automatically provides `oidc-user` and `oidc-token` as step outputs.
> These can be used for other integrations that require authentication with JFrog.
> To reference these outputs, ensure the step has an explicit `id` defined (for example `id: setup-jfrog-cli`).
### Using OIDC Credentials in other steps

{% raw %}

```yaml
{% data reusables.actions.actions-not-certified-by-github-comment %}

- name: Sign in to Artifactory Docker registry
  uses: docker/login-action@v3
  with:
    registry: ${{ env.JF_URL }}
    username: ${{ steps.setup-jfrog-cli.outputs.oidc-user }}
    password: ${{ steps.setup-jfrog-cli.outputs.oidc-token }}
```

{% endraw %}

## Further reading

* [OpenID Connect Integration](https://jfrog.com/help/r/jfrog-platform-administration-documentation/openid-connect-integration) in the JFrog documentation
* [Identity Mappings](https://jfrog.com/help/r/jfrog-platform-administration-documentation/identity-mappings) in the JFrog documentation
* [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect)
{% data reusables.actions.enterprise-github-hosted-runners %}

## Overview

OpenID Connect (OIDC) allows your {% data variables.product.prodname_actions %} workflows to authenticate with a HashiCorp Vault to retrieve secrets.

This guide gives an overview of how to configure HashiCorp Vault to trust {% data variables.product.prodname_dotcom %}'s OIDC as a federated identity, and demonstrates how to use this configuration in the [hashicorp/vault-action](https://github.com/hashicorp/vault-action) action to retrieve secrets from HashiCorp Vault.

## Prerequisites

{% data reusables.actions.oidc-link-to-intro %}

{% data reusables.actions.oidc-security-notice %}

{% data reusables.actions.oidc-on-ghecom %}

## Adding the identity provider to HashiCorp Vault

To use OIDC with HashiCorp Vault, you will need to add a trust configuration for the {% data variables.product.prodname_dotcom %} OIDC provider. For more information, see the HashiCorp Vault [documentation](https://www.vaultproject.io/docs/auth/jwt).

To configure your Vault server to accept JSON Web Tokens (JWT) for authentication:

1. Enable the JWT `auth` method, and use `write` to apply the configuration to your Vault. For `oidc_discovery_url` and `bound_issuer` parameters, use {% ifversion ghes %}`https://HOSTNAME/_services/token`{% else %}`https://token.actions.githubusercontent.com`{% endif %}. These parameters allow the Vault server to verify the received JSON Web Tokens (JWT) during the authentication process.
   ```shell copy
   vault auth enable jwt
   ```

   ```shell copy
   vault write auth/jwt/config \
     bound_issuer="{% ifversion ghes %}https://HOSTNAME/_services/token{% else %}https://token.actions.githubusercontent.com{% endif %}" \
     oidc_discovery_url="{% ifversion ghes %}https://HOSTNAME/_services/token{% else %}https://token.actions.githubusercontent.com{% endif %}"
   ```

   {% ifversion ghec %}

   > [!NOTE]
   > If a unique issuer URL for an enterprise was set using the REST API (as described in [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#switching-to-a-unique-token-url)), the values for `bound_issuer` and `oidc_discovery_url` must match that unique URL. For example, for an enterprise named `octocat` that uses the unique issuer URL, `bound_issuer` and `oidc_discovery_url` must be set to `https://token.actions.githubusercontent.com/octocat`.

   {% endif %}

1. Configure a policy that only grants access to the specific paths your workflows will use to retrieve secrets. For more advanced policies, see the HashiCorp Vault [Policies documentation](https://www.vaultproject.io/docs/concepts/policies).

   ```shell copy
   vault policy write myproject-production - <<EOF
   # Illustrative policy body: grant read-only access to the production secrets path
   path "secret/data/production/*" {
     capabilities = [ "read" ]
   }
   EOF
   ```

> [!NOTE]
> When the `permissions` key is used, all unspecified permissions are set to _no access_, with the exception of the metadata scope, which always gets _read_ access. As a result, you may need to add other permissions, such as `contents: read`. See [Automatic token authentication](/actions/security-guides/automatic-token-authentication) for more information.

### Requesting the access token

The `hashicorp/vault-action` action receives a JWT from the {% data variables.product.prodname_dotcom %} OIDC provider, and then requests an access token from your HashiCorp Vault instance to retrieve secrets. For more information, see the HashiCorp Vault GitHub Action [documentation](https://github.com/hashicorp/vault-action).
This example demonstrates how to create a job that requests a secret from HashiCorp Vault.

* `VAULT-URL`: Replace this with the URL of your HashiCorp Vault.
* `VAULT-NAMESPACE`: Replace this with the Namespace you've set in HashiCorp Vault. For example: `admin`.
* `ROLE-NAME`: Replace this with the role you've set in the HashiCorp Vault trust relationship.
* `SECRET-PATH`: Replace this with the path to the secret you're retrieving from HashiCorp Vault. For example: `secret/data/production/ci npmToken`.

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
jobs:
  retrieve-secret:
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
    steps:
      - name: Retrieve secret from Vault
        uses: hashicorp/vault-action@9a8b7c6d5e4f3a2b1c0d9e8f7a6b5c4d3e2f1a0b
        with:
          method: jwt
          url: VAULT-URL
          namespace: VAULT-NAMESPACE # HCP Vault and Vault Enterprise only
          role: ROLE-NAME
          secrets: SECRET-PATH
      - name: Use secret from Vault
        run: |
          # This step has access to the secret retrieved above; see hashicorp/vault-action for more details.
```

> [!NOTE]
> * If your Vault server is not accessible from the public network, consider using a self-hosted runner with other available Vault [auth methods](https://www.vaultproject.io/docs/auth). For more information, see [AUTOTITLE](/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners).
> * `VAULT-NAMESPACE` must be set for a Vault Enterprise (including HCP Vault) deployment. For more information, see [Vault namespace](https://www.vaultproject.io/docs/enterprise/namespaces).

### Revoking the access token

By default, the Vault server will automatically revoke access tokens when their TTL is expired, so you don't have to manually revoke the access tokens. However, if you do want to revoke access tokens immediately after your job has completed or failed, you can manually revoke the issued token using the [Vault API](https://www.vaultproject.io/api/auth/token#revoke-a-token-self).

1. Set the `exportToken` option to `true` (default: `false`). This exports the issued Vault access token as an environment variable: `VAULT_TOKEN`.
1. Add a step to call the [Revoke a Token (Self)](https://www.vaultproject.io/api/auth/token#revoke-a-token-self) Vault API to revoke the access token.

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
jobs:
  retrieve-secret:
    runs-on: ubuntu-latest
    permissions:
      id-token: write
      contents: read
    steps:
      - name: Retrieve secret from Vault
        uses: hashicorp/vault-action@9a8b7c6d5e4f3a2b1c0d9e8f7a6b5c4d3e2f1a0b
        with:
          exportToken: true
          method: jwt
          url: VAULT-URL
          role: ROLE-NAME
          secrets: SECRET-PATH
      - name: Use secret from Vault
        run: |
          # This step has access to the secret retrieved above; see hashicorp/vault-action for more details.
      - name: Revoke token
        # This step always runs at the end regardless of the previous steps result
        if: always()
        run: |
          curl -X POST -sv -H "X-Vault-Token: {% raw %}${{ env.VAULT_TOKEN }}{% endraw %}" \
            VAULT-URL/v1/auth/token/revoke-self
```

## Further reading

{% data reusables.actions.oidc-further-reading %}
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/security-harden-deployments/oidc-in-hashicorp-vault.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## About reusable workflows

Rather than copying and pasting deployment jobs from one workflow to another, you can create a reusable workflow that performs the deployment steps. A reusable workflow can be used by another workflow if it meets one of the access requirements described in [AUTOTITLE](/actions/using-workflows/reusing-workflows#access-to-reusable-workflows).

You should be familiar with the concepts described in [AUTOTITLE](/actions/using-workflows/reusing-workflows) and [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect).

## Defining the trust conditions

When combined with OpenID Connect (OIDC), reusable workflows let you enforce consistent deployments across your repository, organization, or enterprise. You can do this by defining trust conditions on cloud roles based on reusable workflows. The available options will vary depending on your cloud provider:

* **Using `job_workflow_ref`:**
  * To create trust conditions based on reusable workflows, your cloud provider must support custom claims for `job_workflow_ref`. This allows your cloud provider to identify which repository the job originally came from.
  * For clouds that only support the standard claims (audience (`aud`) and subject (`sub`)), you can use the API to customize the `sub` claim to include `job_workflow_ref`. For more information, see [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#customizing-the-token-claims). Support for custom claims is currently available for Google Cloud Platform and HashiCorp Vault.
* **Customizing the token claims:**
  * You can configure more granular trust conditions by customizing the {% ifversion ghec %}issuer (`iss`) and {% endif %}subject (`sub`) claim{% ifversion ghec %}s that are{% else %} that's{% endif %} included with the JWT.
For more information, see [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#customizing-the-token-claims).

## How the token works with reusable workflows

During a workflow run, {% data variables.product.prodname_dotcom %}'s OIDC provider presents an OIDC token to the cloud provider which contains information about the job. If that job is part of a reusable workflow, the token will include the standard claims that contain information about the calling workflow, and will also include a custom claim called `job_workflow_ref` that contains information about the called workflow.

For example, the following OIDC token is for a job that was part of a called workflow. The `workflow`, `ref`, and other attributes describe the caller workflow, while `job_workflow_ref` refers to the called workflow:

```yaml copy
{
  "typ": "JWT",
  "alg": "RS256",
  "x5t": "example-thumbprint",
  "kid": "example-key-id"
}
{
  "jti": "example-id",
  "sub": "repo:octo-org/octo-repo:environment:prod",
  "aud": "{% ifversion ghes %}https://HOSTNAME{% else %}https://github.com{% endif %}/octo-org",
  "ref": "refs/heads/main",
  "sha": "example-sha",
  "repository": "octo-org/octo-repo",
  "repository_owner": "octo-org",
  "actor_id": "12",
  "repository_id": "74",
  "repository_owner_id": "65",
  "run_id": "example-run-id",
  "run_number": "10",
  "run_attempt": "2",
  "actor": "octocat",
  "workflow": "example-workflow",
  "head_ref": "",
  "base_ref": "",
  "event_name": "workflow_dispatch",
  "ref_type": "branch",
  "job_workflow_ref": "octo-org/octo-automation/.github/workflows/oidc.yml@refs/heads/main",
  "iss": "{% ifversion ghes %}https://HOSTNAME/_services/token{% else %}https://token.actions.githubusercontent.com{% endif %}",
  "nbf": 1632492967,
  "exp": 1632493867,
  "iat": 1632493567
}
```

If your reusable workflow performs deployment steps, then it will typically need access to a specific cloud role, and you might want to allow any repository in your
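Claims like `job_workflow_ref` travel in the base64url-encoded payload segment of the JWT. As a sketch of what a cloud provider (or a debugging script) does with that segment — using a hypothetical unsigned token, and skipping the signature verification that any real consumer must perform:

```shell
# Decode the payload (the second dot-separated segment) of a JWT.
decode_jwt_payload() {
  seg=$(printf '%s' "$1" | cut -d. -f2 | tr '_-' '/+')
  # Restore the base64 padding that JWT encoding strips.
  while [ $(( ${#seg} % 4 )) -ne 0 ]; do seg="${seg}="; done
  printf '%s' "$seg" | base64 -d
}

# Hypothetical token: "e30" is an empty JSON header and "sig" a placeholder signature.
payload='{"job_workflow_ref":"octo-org/octo-automation/.github/workflows/oidc.yml@refs/heads/main"}'
token="e30.$(printf '%s' "$payload" | base64 | tr -d '=\n' | tr '+/' '-_').sig"

decode_jwt_payload "$token"   # prints the JSON claims back out
```

The trust conditions described in the next sections are matched against the decoded claim values, never against the raw encoded token.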
organization to call that reusable workflow. To permit this, you'll create the trust condition that allows any repository and any caller workflow, and then filter on the organization and the called workflow. See the next section for some examples.

## Examples

**Filtering for reusable workflows within a specific repository**

You can configure a custom claim that filters for any reusable workflow in a specific repository. In this example, the workflow run must have originated from a job defined in a reusable workflow in the `octo-org/octo-automation` repository, and in any repository that is owned by the `octo-org` organization.

* **Subject:**
  * Syntax: `repo:ORG_NAME/*`
  * Example: `repo:octo-org/*`
* **Custom claim:**
  * Syntax: `job_workflow_ref:ORG_NAME/REPO_NAME`
  * Example: `job_workflow_ref:octo-org/octo-automation@*`

**Filtering for a specific reusable workflow at a specific ref**

You can configure a custom claim that filters for a specific reusable workflow. In this example, the workflow run must have originated from a job defined in the reusable workflow `octo-org/octo-automation/.github/workflows/deployment.yml`, and in any repository that is owned by the `octo-org` organization.

* **Subject:**
  * Syntax: `repo:ORG_NAME/*`
  * Example: `repo:octo-org/*`
* **Custom claim:**
  * Syntax: `job_workflow_ref:ORG_NAME/REPO_NAME/.github/workflows/WORKFLOW_FILE@ref`
  * Example: `job_workflow_ref:octo-org/octo-automation/.github/workflows/deployment.yml@10040c56a8c0253d69db7c1f26a0d227275512e2`
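Providers that accept these filters typically apply glob-style matching to the claim value. A minimal sketch of that kind of matching in shell (the patterns and claim value below are illustrative, and each provider documents its own wildcard semantics):

```shell
# Return success if a claim value matches a glob-style trust condition.
matches_claim() {
  case "$1" in
    $2) return 0 ;;
    *)  return 1 ;;
  esac
}

claim="octo-org/octo-automation/.github/workflows/deployment.yml@refs/heads/main"

matches_claim "$claim" "octo-org/octo-automation/*" && echo "repository filter matches"
matches_claim "$claim" "octo-org/other-repo/*"      || echo "different repository rejected"
```

Note that a narrower pattern (one pinned to a workflow file and ref, as in the second example above) rejects strictly more tokens than a repository-wide wildcard, which is why pinning to a SHA is the most restrictive option.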
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/security-harden-deployments/oidc-with-reusable-workflows.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Overview

OpenID Connect (OIDC) allows your {% data variables.product.prodname_actions %} workflows to access resources in Amazon Web Services (AWS), without needing to store the AWS credentials as long-lived {% data variables.product.prodname_dotcom %} secrets.

This guide explains how to configure AWS to trust {% data variables.product.prodname_dotcom %}'s OIDC as a federated identity, and includes a workflow example for the [`aws-actions/configure-aws-credentials`](https://github.com/aws-actions/configure-aws-credentials) action that uses tokens to authenticate to AWS and access resources.

{% data reusables.actions.oidc-custom-claims-aws-restriction %}

## Prerequisites

{% data reusables.actions.oidc-link-to-intro %}

{% data reusables.actions.oidc-security-notice %}

{% data reusables.actions.oidc-on-ghecom %}

{% ifversion ghes %}

{% data reusables.actions.oidc-endpoints %}

> [!NOTE]
> You can restrict access to the OIDC endpoints by allowing only [AWS IP address ranges](https://docs.aws.amazon.com/vpc/latest/userguide/aws-ip-ranges.html).

> [!NOTE]
> {% data variables.product.prodname_dotcom %} does not natively support AWS session tags.

{% endif %}

## Adding the identity provider to AWS

To add the {% data variables.product.prodname_dotcom %} OIDC provider to IAM, see the [AWS documentation](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_providers_create_oidc.html).

* For the provider URL: Use {% ifversion ghes %}`https://HOSTNAME/_services/token`{% else %}`https://token.actions.githubusercontent.com`{% endif %}
* For the "Audience": Use `sts.amazonaws.com` if you are using the [official action](https://github.com/aws-actions/configure-aws-credentials).
### Configuring the role and trust policy

To configure the role and trust in IAM, see the AWS documentation [Configure AWS Credentials for GitHub Actions](https://github.com/aws-actions/configure-aws-credentials#configure-aws-credentials-for-github-actions) and [Configuring a role for GitHub OIDC identity provider](https://docs.aws.amazon.com/IAM/latest/UserGuide/id_roles_create_for-idp_oidc.html#idp_oidc_Create_GitHub).

> [!NOTE]
> AWS Identity and Access Management (IAM) recommends that users evaluate the IAM condition key, `token.actions.githubusercontent.com:sub`, in the trust policy of any role that trusts {% data variables.product.prodname_dotcom %}'s OIDC identity provider (IdP). Evaluating this condition key in the role trust policy limits which {% data variables.product.prodname_dotcom %} actions are able to assume the role.

Edit the trust policy, adding the `sub` field to the validation conditions. For example:

```json copy
"Condition": {
  "StringEquals": {
    "{% ifversion ghes %}HOSTNAME/_services/token{% else %}token.actions.githubusercontent.com{% endif %}:aud": "sts.amazonaws.com",
    "{% ifversion ghes %}HOSTNAME/_services/token{% else %}token.actions.githubusercontent.com{% endif %}:sub": "repo:octo-org/octo-repo:ref:refs/heads/octo-branch"
  }
}
```

If you use a workflow with an environment, the `sub` field must reference the environment name: `repo:ORG-NAME/REPO-NAME:environment:ENVIRONMENT-NAME`. For more information, see [AUTOTITLE](/actions/reference/openid-connect-reference#filtering-for-a-specific-environment).
{% data reusables.actions.oidc-deployment-protection-rules %}

```json copy
"Condition": {
  "StringEquals": {
    "{% ifversion ghes %}HOSTNAME/_services/token{% else %}token.actions.githubusercontent.com{% endif %}:aud": "sts.amazonaws.com",
    "{% ifversion ghes %}HOSTNAME/_services/token{% else %}token.actions.githubusercontent.com{% endif %}:sub": "repo:octo-org/octo-repo:environment:prod"
  }
}
```

In the following example, `StringLike` is used with a wildcard operator (`*`) to allow any branch, pull request merge branch, or environment from the `octo-org/octo-repo` organization and repository to assume a role in AWS.

```json copy
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Federated": "arn:aws:iam::123456123456:oidc-provider/token.actions.githubusercontent.com"
      },
      "Action": "sts:AssumeRoleWithWebIdentity",
      "Condition": {
        "StringLike": {
          "token.actions.githubusercontent.com:sub": "repo:octo-org/octo-repo:*"
        },
        "StringEquals": {
          "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
        }
      }
    }
  ]
}
```

## Updating your {% data variables.product.prodname_actions %} workflow

To update your workflows for OIDC, you will need to make two changes to your YAML:

1. Add permissions settings for the token.
1. Use the [`aws-actions/configure-aws-credentials`](https://github.com/aws-actions/configure-aws-credentials) action to exchange the OIDC token (JWT) for a cloud access token.

### Adding permissions settings

{% data reusables.actions.oidc-permissions-token %}

### Requesting the access token

The `aws-actions/configure-aws-credentials` action receives a JWT from the {% data variables.product.prodname_dotcom %} OIDC provider, and then requests an access token from AWS. For more information, see the AWS [documentation](https://github.com/aws-actions/configure-aws-credentials).

* `BUCKET-NAME`: Replace this with the name of your S3 bucket.
* `AWS-REGION`: Replace this with the name of your AWS region.
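The `sub` values these conditions match against follow the fixed formats shown above. As a quick sketch of composing them (the repository, branch, and environment names are the illustrative ones from this guide):

```shell
# Compose the sub claim values that an IAM trust policy matches against.
sub_for_branch()      { printf 'repo:%s:ref:refs/heads/%s' "$1" "$2"; }
sub_for_environment() { printf 'repo:%s:environment:%s' "$1" "$2"; }

sub_for_branch "octo-org/octo-repo" "octo-branch"; echo
sub_for_environment "octo-org/octo-repo" "prod"; echo
```

Comparing the two outputs against the `StringEquals` conditions above makes clear why a trust policy written for a branch will not match a run that uses an environment, and vice versa.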
* `ROLE-TO-ASSUME`: Replace this with your AWS role. For example, `arn:aws:iam::1234567890:role/example-role`.

```yaml copy
# Sample workflow to access AWS resources when workflow is tied to branch
# The workflow creates a static website using Amazon S3
{% data reusables.actions.actions-not-certified-by-github-comment %}
name: AWS example workflow
on:
  push
env:
  BUCKET_NAME: "BUCKET-NAME"
  AWS_REGION: "AWS-REGION"
# permission can be added at job level or workflow level
permissions:
  id-token: write # This is required for requesting the JWT
  contents: read # This is required for actions/checkout
jobs:
  S3PackageUpload:
    runs-on: ubuntu-latest
    steps:
      - name: Git clone the repository
        uses: {% data reusables.actions.action-checkout %}
      - name: configure aws credentials
        uses: aws-actions/configure-aws-credentials@e3dd6a429d7300a6a4c196c26e071d42e0343502
        with:
          role-to-assume: ROLE-TO-ASSUME
          role-session-name: samplerolesession
          aws-region: {% raw %}${{ env.AWS_REGION }}{% endraw %}
      # Upload a file to AWS s3
      - name: Copy index.html to s3
        run: |
          aws s3 cp ./index.html s3://{% raw %}${{ env.BUCKET_NAME }}{% endraw %}/
```

## Further reading

{% data reusables.actions.oidc-further-reading %}
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/security-harden-deployments/oidc-in-aws.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Overview

OpenID Connect (OIDC) allows your {% data variables.product.prodname_actions %} workflows to access resources in your cloud provider, without having to store any credentials as long-lived {% data variables.product.prodname_dotcom %} secrets.

To use OIDC, you will first need to configure your cloud provider to trust {% data variables.product.prodname_dotcom %}'s OIDC as a federated identity, and must then update your workflows to authenticate using tokens.

## Prerequisites

{% data reusables.actions.oidc-link-to-intro %}

{% data reusables.actions.oidc-security-notice %}

{% data reusables.actions.oidc-on-ghecom %}

## Updating your {% data variables.product.prodname_actions %} workflow

To update your workflows for OIDC, you will need to make two changes to your YAML:

1. Add permissions settings for the token.
1. Use the official action from your cloud provider to exchange the OIDC token (JWT) for a cloud access token.

If your cloud provider doesn't yet offer an official action, you can update your workflows to perform these steps manually.

{% data reusables.actions.oidc-deployment-protection-rules %}

### Adding permissions settings

{% data reusables.actions.oidc-permissions-token %}

### Using official actions

If your cloud provider has created an official action for using OIDC with {% data variables.product.prodname_actions %}, it will allow you to easily exchange the OIDC token for an access token. You can then update your workflows to use this token when accessing cloud resources.

For example, Alibaba Cloud created [`aliyun/configure-aliyun-credentials-action`](https://github.com/aliyun/configure-aliyun-credentials-action) to integrate using OIDC with {% data variables.product.prodname_dotcom %}.
## Using custom actions

If your cloud provider doesn't have an official action, or if you prefer to create custom scripts, you can manually request the JSON Web Token (JWT) from {% data variables.product.prodname_dotcom %}'s OIDC provider.

If you're not using an official action, then {% data variables.product.prodname_dotcom %} recommends that you use the Actions core toolkit. Alternatively, you can use the following environment variables to retrieve the token: `ACTIONS_ID_TOKEN_REQUEST_TOKEN`, `ACTIONS_ID_TOKEN_REQUEST_URL`.

To update your workflows using this approach, you will need to make three changes to your YAML:

1. Add permissions settings for the token.
1. Add code that requests the OIDC token from {% data variables.product.prodname_dotcom %}'s OIDC provider.
1. Add code that exchanges the OIDC token with your cloud provider for an access token.

### Requesting the JWT using the Actions core toolkit

The following example demonstrates how to use `actions/github-script` with the `core` toolkit to request the JWT from {% data variables.product.prodname_dotcom %}'s OIDC provider. For more information, see [AUTOTITLE](/actions/creating-actions/creating-a-javascript-action#adding-actions-toolkit-packages).

```yaml
jobs:
  job:
    environment: Production
    runs-on: ubuntu-latest
    steps:
      - name: Install OIDC Client from Core Package
        run: npm install @actions/core@1.6.0 @actions/http-client
      - name: Get Id Token
        uses: {% data reusables.actions.action-github-script %}
        id: idtoken
        with:
          script: |
            const coredemo = require('@actions/core')
            let id_token = await coredemo.getIDToken()
            coredemo.setOutput('id_token', id_token)
```

### Requesting the JWT using environment variables

The following example demonstrates how to use environment variables to request a JSON Web Token.

For your deployment job, you will need to define the token settings, using `actions/github-script` with the `core` toolkit.
For more information, see [AUTOTITLE](/actions/creating-actions/creating-a-javascript-action#adding-actions-toolkit-packages). For example:

```yaml
jobs:
  job:
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-github-script %}
        id: script
        timeout-minutes: 10
        with:
          debug: true
          script: |
            const token = process.env['ACTIONS_ID_TOKEN_REQUEST_TOKEN']
            const runtimeUrl = process.env['ACTIONS_ID_TOKEN_REQUEST_URL']
            core.setOutput('TOKEN', token.trim())
            core.setOutput('IDTOKENURL', runtimeUrl.trim())
```

You can then use `curl` to retrieve a JWT from the {% data variables.product.prodname_dotcom %} OIDC provider. For example:

```yaml
    - run: |
        IDTOKEN=$(curl -H "Authorization: Bearer {% raw %}${{steps.script.outputs.TOKEN}}" ${{steps.script.outputs.IDTOKENURL}} {% endraw %} -H "Accept: application/json; api-version=2.0" -H "Content-Type: application/json" -d "{}" | jq -r '.value')
        echo $IDTOKEN
        jwtd() {
          if [[ -x $(command -v jq) ]]; then
            jq -R 'split(".") | .[0],.[1] | @base64d | fromjson' <<< "${1}"
            echo "Signature: $(echo "${1}" | awk -F'.' '{print $3}')"
          fi
        }
        jwtd $IDTOKEN
        echo "idToken=${IDTOKEN}" >> $GITHUB_OUTPUT
      id: tokenid
```

### Getting the access token from the cloud provider

You will need to present the OIDC JSON web token to your cloud provider in order to obtain an access token.

For each deployment, your workflows must use cloud login actions (or custom scripts) that fetch the OIDC token and present it to your cloud provider. The cloud provider then validates the claims in the token; if successful, it provides a cloud access token that is available only to that job run. The provided access token can then be used by subsequent actions in the job to connect to the cloud and deploy to its resources.

The steps for exchanging the OIDC token for an access token will vary for each cloud provider.

### Accessing resources in your cloud provider

Once you've obtained the access token, you can use specific cloud actions or scripts to authenticate to the cloud provider and deploy to its resources. These steps could differ for each cloud provider. For example, Alibaba Cloud maintains their own instructions for OIDC authentication. For more information, see [Overview of OIDC-based SSO](https://www.alibabacloud.com/help/en/ram/user-guide/overview-of-oidc-based-sso) in the Alibaba Cloud documentation.

In addition, the default expiration time of this access token could vary between each cloud and can be configurable at the cloud provider's side.

## Further reading

{% data reusables.actions.oidc-further-reading %}
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/security-harden-deployments/oidc-in-cloud-providers.md
{% data reusables.actions.enterprise-github-hosted-runners %}

## Overview

OpenID Connect (OIDC) allows your {% data variables.product.prodname_actions %} workflows to access resources in Azure, without needing to store the Azure credentials as long-lived {% data variables.product.prodname_dotcom %} secrets.

This guide gives an overview of how to configure Azure to trust {% data variables.product.prodname_dotcom %}'s OIDC as a federated identity, and includes a workflow example for the [`azure/login`](https://github.com/Azure/login) action that uses tokens to authenticate to Azure and access resources.

## Prerequisites

{% data reusables.actions.oidc-link-to-intro %}

{% data reusables.actions.oidc-security-notice %}

{% data reusables.actions.oidc-on-ghecom %}

{% ifversion ghes %}

{% data reusables.actions.oidc-endpoints %}

> [!NOTE]
> Microsoft Entra ID (previously known as Azure AD) does not have fixed IP ranges defined for these endpoints.

* Make sure that the value of the issuer claim that's included with the JSON Web Token (JWT) is set to a publicly routable URL. For more information, see [AUTOTITLE](/enterprise-server@latest/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect).

{% endif %}

## Adding the federated credentials to Azure

{% data variables.product.prodname_dotcom %}'s OIDC provider works with Azure's workload identity federation. For an overview, see Microsoft's documentation at [Workload identity federation](https://docs.microsoft.com/en-us/azure/active-directory/develop/workload-identity-federation).

To configure the OIDC identity provider in Azure, you will need to perform the following configuration. For instructions on making these changes, refer to [the Azure documentation](https://docs.microsoft.com/en-us/azure/developer/github/connect-from-azure).
{% ifversion fpt or ghec %}In the following procedure, you will create an application for Microsoft Entra ID (previously known as Azure AD).{% endif %}

1. Create an Entra ID application and a service principal.
1. Add federated credentials for the Entra ID application.
1. Create {% data variables.product.prodname_dotcom %} secrets for storing Azure configuration.

Additional guidance for configuring the identity provider:

* For security hardening, make sure you've reviewed [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#configuring-the-oidc-trust-with-the-cloud). For an example, see [AUTOTITLE](/actions/deployment/security-hardening-your-deployments/about-security-hardening-with-openid-connect#configuring-the-subject-in-your-cloud-provider).
* For the `audience` setting, `api://AzureADTokenExchange` is the recommended value, but you can also specify other values here.

## Updating your {% data variables.product.prodname_actions %} workflow

To update your workflows for OIDC, you will need to make two changes to your YAML:

1. Add permissions settings for the token.
1. Use the [`azure/login`](https://github.com/Azure/login) action to exchange the OIDC token (JWT) for a cloud access token.

{% data reusables.actions.oidc-deployment-protection-rules %}

### Adding permissions settings

{% data reusables.actions.oidc-permissions-token %}

### Requesting the access token

The [`azure/login`](https://github.com/Azure/login) action receives a JWT from the {% data variables.product.prodname_dotcom %} OIDC provider, and then requests an access token from Azure. For more information, see the [`azure/login`](https://github.com/Azure/login) documentation.

The following example exchanges an OIDC ID token with Azure to receive an access token, which can then be used to access cloud resources.
{% raw %}

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}
name: Run Azure Login with OIDC
on: [push]

permissions:
  id-token: write
  contents: read

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - name: 'Az CLI login'
        uses: azure/login@8c334a195cbb38e46038007b304988d888bf676a
        with:
          client-id: ${{ secrets.AZURE_CLIENT_ID }}
          tenant-id: ${{ secrets.AZURE_TENANT_ID }}
          subscription-id: ${{ secrets.AZURE_SUBSCRIPTION_ID }}

      - name: 'Run az commands'
        run: |
          az account show
          az group list
```

{% endraw %}

## Further reading

{% data reusables.actions.oidc-further-reading %}
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/security-harden-deployments/oidc-in-azure.md
## Overview

OpenID Connect (OIDC) allows your {% data variables.product.prodname_actions %} workflows to authenticate with [PyPI](https://pypi.org) to publish Python packages.

This guide gives an overview of how to configure PyPI to trust {% data variables.product.prodname_dotcom %}'s OIDC as a federated identity, and demonstrates how to use this configuration in the [`pypa/gh-action-pypi-publish`](https://github.com/marketplace/actions/pypi-publish) action to publish packages to PyPI (or other Python package repositories) without any manual API token management.

## Prerequisites

{% data reusables.actions.oidc-link-to-intro %}

{% data reusables.actions.oidc-security-notice %}

{% data reusables.actions.oidc-on-ghecom %}

## Adding the identity provider to PyPI

To use OIDC with PyPI, add a trust configuration that links each project on PyPI to each repository and workflow combination that's allowed to publish for it.

1. Sign in to PyPI and navigate to the trusted publishing settings for the project you'd like to configure. For a project named `myproject`, this will be at `https://pypi.org/manage/project/myproject/settings/publishing/`.
1. Configure a trust relationship between the PyPI project and a {% data variables.product.prodname_dotcom %} repository (and workflow within the repository). For example, if your {% data variables.product.prodname_dotcom %} repository is at `myorg/myproject` and your release workflow is defined in `release.yml` with an environment of `release`, you should use the following settings for your trusted publisher on PyPI.

   > [!NOTE]
   > Enter these values carefully. Giving the incorrect user, repository, or workflow the ability to publish to your PyPI project is equivalent to sharing an API token.
   * Owner: `myorg`
   * Repository name: `myproject`
   * Workflow name: `release.yml`
   * (Optionally) a {% data variables.product.prodname_actions %} environment name: `release`

## Updating your {% data variables.product.prodname_actions %} workflow

Once your trusted publisher is registered on PyPI, you can update your release workflow to use trusted publishing.

{% data reusables.actions.oidc-deployment-protection-rules %}

The [`pypa/gh-action-pypi-publish`](https://github.com/marketplace/actions/pypi-publish) action has built-in support for trusted publishing, which can be enabled by giving its containing job the `id-token: write` permission and omitting `username` and `password`.

The following example uses the `pypa/gh-action-pypi-publish` action to exchange an OIDC token for a PyPI API token, which is then used to upload a package's release distributions to PyPI.

```yaml copy
{% data reusables.actions.actions-not-certified-by-github-comment %}

jobs:
  release-build:
    runs-on: ubuntu-latest
    steps:
      - uses: {% data reusables.actions.action-checkout %}
      - uses: {% data reusables.actions.action-setup-python %}
        with:
          python-version: "3.x"
      - name: build release distributions
        run: |
          # NOTE: put your own distribution build steps here.
          python -m pip install build
          python -m build
      - name: upload dists
        uses: {% data reusables.actions.action-upload-artifact %}
        with:
          name: release-dists
          path: dist/

  pypi-publish:
    runs-on: ubuntu-latest
    needs:
      - release-build
    permissions:
      id-token: write
    steps:
      - name: Retrieve release distributions
        uses: {% data reusables.actions.action-download-artifact %}
        with:
          name: release-dists
          path: dist/
      - name: Publish release distributions to PyPI
        uses: pypa/gh-action-pypi-publish@3e4f5a6b7c8d9e0f1a2b3c4d5e6f7a8b9c0d1e2f
```
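The workflow above is shown without a trigger. As a sketch, one common choice (the trigger is an assumption here, not part of the original example) is to run the workflow when a release is published, with a manual fallback:

```yaml
name: Publish to PyPI

on:
  # Publish whenever a GitHub release is published.
  release:
    types: [published]
  # Allow maintainers to re-run the publish manually if needed.
  workflow_dispatch:
```

Triggering on `release` keeps publishing tied to an explicit, reviewable event rather than every push.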
https://github.com/github/docs/blob/main//content/actions/how-tos/secure-your-work/security-harden-deployments/oidc-in-pypi.md
## Prerequisites

You should be familiar with the syntax for {% data variables.product.prodname_actions %}. For more information, see [AUTOTITLE](/actions/learn-github-actions).

## Triggering your deployment

You can use a variety of events to trigger your deployment workflow. Some of the most common are `pull_request`, `push`, and `workflow_dispatch`.

For example, a workflow with the following triggers runs whenever:

* There is a push to the `main` branch.
* A pull request targeting the `main` branch is opened, synchronized, or reopened.
* Someone manually triggers it.

```yaml
on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main
  workflow_dispatch:
```

For more information, see [AUTOTITLE](/actions/using-workflows/events-that-trigger-workflows).

## Using environments

{% data reusables.actions.about-environments %}

You can configure environments with protection rules and secrets. When a workflow job references an environment, the job won't start until all of the environment's protection rules pass. A job also cannot access secrets that are defined in an environment until all the deployment protection rules pass. To learn more, see [Using custom deployment protection rules](#using-custom-deployment-protection-rules) in this article.

## Using concurrency

Concurrency ensures that only a single job or workflow using the same concurrency group will run at a time. You can use concurrency so that an environment has a maximum of one deployment in progress and one deployment pending at a time. For more information about concurrency, see [AUTOTITLE](/actions/using-jobs/using-concurrency).

> [!NOTE]
> `concurrency` and `environment` are not connected. The concurrency value can be any string; it does not need to be an environment name. Additionally, if another workflow uses the same environment but does not specify concurrency, that workflow will not be subject to any concurrency rules.
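The "Using environments" section above can be sketched as a job that references an environment and reads an environment secret. This is a minimal sketch; the environment name, secret name, and deploy script are illustrative assumptions, not from the original:

```yaml
jobs:
  deploy:
    runs-on: ubuntu-latest
    # The job waits for all of the "production" environment's protection
    # rules to pass before starting; only then can it read the
    # environment's secrets.
    environment: production
    steps:
      - name: deploy
        # DEPLOY_KEY is a hypothetical secret defined on the environment.
        run: ./deploy.sh
        env:
          DEPLOY_KEY: {% raw %}${{ secrets.DEPLOY_KEY }}{% endraw %}
```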
For example, when the following workflow runs, it will be paused with the status `pending` if any job or workflow that uses the `production` concurrency group is in progress. It will also cancel any job or workflow that uses the `production` concurrency group and has the status `pending`. This means that there will be a maximum of one running and one pending job or workflow that uses the `production` concurrency group.

```yaml
name: Deployment

concurrency: production

on:
  push:
    branches:
      - main

jobs:
  deployment:
    runs-on: ubuntu-latest
    environment: production
    steps:
      - name: deploy
        # ...deployment-specific steps
```

You can also specify concurrency at the job level. This will allow other jobs in the workflow to proceed even if the concurrent job is `pending`.

```yaml
name: Deployment

on:
  push:
    branches:
      - main

jobs:
  deployment:
    runs-on: ubuntu-latest
    environment: production
    concurrency: production
    steps:
      - name: deploy
        # ...deployment-specific steps
```

You can also use `cancel-in-progress` to cancel any currently running job or workflow in the same concurrency group.

```yaml
name: Deployment

concurrency:
  group: production
  cancel-in-progress: true

on:
  push:
    branches:
      - main

jobs:
  deployment:
    runs-on: ubuntu-latest
    environment: production
    steps:
      - name: deploy
        # ...deployment-specific steps
```

For guidance on writing deployment-specific steps, see [Finding deployment examples](#finding-deployment-examples).

## Viewing deployment history

When a {% data variables.product.prodname_actions %} workflow deploys to an environment, the environment is displayed on the main page of the repository. For more information about viewing deployments to environments, see [AUTOTITLE](/actions/deployment/managing-your-deployments/viewing-deployment-history).

{% ifversion virtual-registry %}

Your organization can collect deployment records for all your builds in a single place by uploading data to the {% data variables.product.virtual_registry %}.
See [AUTOTITLE](/code-security/concepts/supply-chain-security/linked-artifacts).

{% endif %}

## Monitoring workflow runs

Every workflow run generates a real-time graph that illustrates the run progress. You can use this graph to monitor and debug deployments. For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/using-the-visualization-graph).

You can also view the logs of each workflow run and the history of workflow runs. For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/viewing-workflow-run-history).

## Using required reviews in workflows

Jobs that reference an environment configured with required
reviewers will wait for an approval before starting. While a job is awaiting approval, it has a status of "Waiting". If a job is not approved within 30 days, it will automatically fail.

For more information about environments and required approvals, see [AUTOTITLE](/actions/deployment/targeting-different-environments/managing-environments-for-deployment). For information about how to review deployments with the REST API, see [AUTOTITLE](/rest/actions/workflow-runs).

## Using custom deployment protection rules

{% data reusables.actions.custom-deployment-protection-rules-beta-note %}

{% data reusables.actions.about-custom-deployment-protection-rules %}

Custom deployment protection rules are powered by {% data variables.product.prodname_github_apps %} and run based on webhooks and callbacks. Approval or rejection of a workflow job is based on consumption of the `deployment_protection_rule` webhook. For more information, see [AUTOTITLE](/webhooks-and-events/webhooks/webhook-events-and-payloads#deployment_protection_rule) and [Approving or rejecting deployments](/actions/how-tos/managing-workflow-runs-and-deployments/managing-deployments/creating-custom-deployment-protection-rules#approving-or-rejecting-deployments).

Once you have created a custom deployment protection rule and installed it on your repository, the rule will automatically be available for all environments in the repository.
Deployments to an environment can be approved or rejected based on conditions defined in an external service, such as an approved ticket in an IT Service Management (ITSM) system, vulnerability scan results on dependencies, or stable health metrics of a cloud resource. The decision to approve or reject deployments is at the discretion of the integrating third-party application and the gating conditions you define in it.

The following are a few use cases for which you can create a deployment protection rule.

* ITSM and security operations: you can check for service readiness by validating quality, security, and compliance processes that verify deployment readiness.
* Observability systems: you can consult monitoring or observability systems (asset performance management systems, logging aggregators, cloud resource health verification systems, and so on) to verify safety and deployment readiness.
* Code quality and testing tools: you can check for automated tests on CI builds which need to be deployed to an environment.

Alternatively, you can write your own protection rules for any of the above use cases, or you can define any custom logic to safely approve or reject deployments from pre-production to production environments.

## Tracking deployments through apps

{% ifversion fpt or ghec %}

If your personal account or organization on {% data variables.product.github %} is integrated with Microsoft Teams or Slack, you can track deployments that use environments through Microsoft Teams or Slack. For example, you can receive notifications through the app when a deployment is pending approval, when a deployment is approved, or when the deployment status changes. For more information about integrating Microsoft Teams or Slack, see [AUTOTITLE](/get-started/exploring-integrations/github-extensions-and-integrations#team-communication-tools).

{% endif %}

You can also build an app that uses deployment and deployment status webhooks to track deployments.
{% data reusables.actions.environment-deployment-event %} For more information, see [AUTOTITLE](/apps) and [AUTOTITLE](/webhooks-and-events/webhooks/webhook-events-and-payloads#deployment).

## Choosing a runner

You can run your deployment workflow on {% data variables.product.company_short %}-hosted runners or on self-hosted runners. Traffic from {% data variables.product.company_short %}-hosted runners can come from a [wide range of network addresses](/rest/meta/meta#get-github-meta-information). If you are deploying to an internal environment and your company restricts external traffic into private networks, {% data variables.product.prodname_actions %} workflows running on {% data variables.product.company_short %}-hosted runners may not be able to communicate with your internal services or resources. To overcome this, you can host your own runners. For more information, see [AUTOTITLE](/actions/hosting-your-own-runners/managing-self-hosted-runners/about-self-hosted-runners) and [AUTOTITLE](/actions/using-github-hosted-runners/about-github-hosted-runners).

## Displaying a status badge

You can use a status badge to display the status of
your deployment workflow. {% data reusables.repositories.actions-workflow-status-badge-intro %} For more information, see [AUTOTITLE](/actions/monitoring-and-troubleshooting-workflows/adding-a-workflow-status-badge).

## Finding deployment examples

This article demonstrated features of {% data variables.product.prodname_actions %} that you can add to your deployment workflows. {% data reusables.actions.cd-templates-actions %}
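The status badge described above can be embedded in a README with markdown like the following sketch, where `OWNER`, `REPOSITORY`, and the workflow file name `deploy.yml` are placeholders:

```markdown
![Deployment status](https://github.com/OWNER/REPOSITORY/actions/workflows/deploy.yml/badge.svg)
```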
https://github.com/github/docs/blob/main//content/actions/how-tos/deploy/configure-and-manage-deployments/control-deployments.md
## Prerequisites

{% data reusables.actions.custom-deployment-protection-rules-beta-note %}

For general information about deployment protection rules, see [AUTOTITLE](/actions/concepts/use-cases/deploying-with-github-actions#using-custom-deployment-protection-rules).

## Creating a custom deployment protection rule with {% data variables.product.prodname_github_apps %}

1. Create a {% data variables.product.prodname_github_app %}. For more information, see [AUTOTITLE](/apps/creating-github-apps/creating-github-apps/creating-a-github-app). Configure the {% data variables.product.prodname_github_app %} as follows.
   1. Optionally, in the **Callback URL** text field under "Identifying and authorizing users," enter the callback URL. For more information, see [AUTOTITLE](/apps/creating-github-apps/creating-github-apps/about-the-user-authorization-callback-url).
   1. Under "Permissions," select **Repository permissions**.
   1. To the right of "Actions," click the drop-down menu and select **Access: Read-only**.

      ![Screenshot of the "Repository permissions" section for a new GitHub App. The Actions permission shows "Read-only" and is outlined in orange.](/assets/images/help/actions/actions-repo-permissions-read-only.png)
   1. To the right of "Deployments," click the drop-down menu and select **Access: Read and write**.

      ![Screenshot of the "Repository permissions" section for a new GitHub App. The Deployments permission shows "Read and write" and is outlined in orange.](/assets/images/help/actions/actions-deployments-repo-permissions-read-and-write.png)
   1. Under "Subscribe to events," select **Deployment protection rule**.

      ![Screenshot of the "Subscribe to events" section for a new GitHub App. The checkbox for the Deployment protection rule is outlined in orange.](/assets/images/help/actions/actions-subscribe-to-events-deployment-protection-rules.png)
1. Install the custom deployment protection rule in your repositories and enable it for use.
For more information, see [AUTOTITLE](/actions/deployment/protecting-deployments/configuring-custom-deployment-protection-rules).

## Approving or rejecting deployments

Once a workflow reaches a job that references an environment that has the custom deployment protection rule enabled, {% data variables.product.company_short %} sends a `POST` request to a URL you configure containing the `deployment_protection_rule` payload. You can write your deployment protection rule to automatically send REST API requests that approve or reject the deployment based on the `deployment_protection_rule` payload. Configure your REST API requests as follows.

1. Validate the incoming `POST` request. For more information, see [AUTOTITLE](/webhooks-and-events/webhooks/securing-your-webhooks#validating-payloads-from-github).
1. Use a JSON Web Token to authenticate as a {% data variables.product.prodname_github_app %}. For more information, see [AUTOTITLE](/apps/creating-github-apps/authenticating-with-a-github-app/authenticating-as-a-github-app#about-authentication-as-a-github-app).
1. Using the installation ID from the `deployment_protection_rule` webhook payload, generate an installation access token. For more information, see [AUTOTITLE](/developers/apps/building-github-apps/authenticating-with-github-apps#authenticating-as-a-github-app).

   ```shell
   curl --request POST \
     --url "{% data variables.product.rest_url %}/app/installations/INSTALLATION_ID/access_tokens" \
     --header "Accept: application/vnd.github+json" \
     --header "Authorization: Bearer {jwt}" \
     --header "Content-Type: application/json" \
     --data \
     '{
       "repository_ids": [321],
       "permissions": {
         "deployments": "write"
       }
     }'
   ```

1. Optionally, to add a status report to {% data variables.product.prodname_dotcom %} without taking any other action, send a `POST` request to `/repos/OWNER/REPO/actions/runs/RUN_ID/deployment_protection_rule`. In the request body, omit the `state`.
   For more information, see [AUTOTITLE](/rest/actions/workflow-runs#review-custom-deployment-protection-rules-for-a-workflow-run). You can post a status report on the same deployment up to 10 times. Status reports support Markdown formatting and can be up to 1024 characters long.

1. To approve or reject a request, send a `POST` request to `/repos/OWNER/REPO/actions/runs/RUN_ID/deployment_protection_rule`. In the request body, set the `state` property to either `approved` or `rejected`. For more information, see [AUTOTITLE](/rest/actions/workflow-runs#review-custom-deployment-protection-rules-for-a-workflow-run).
1. Optionally, request the status of an approval for a workflow run by sending a `GET` request to `/repos/OWNER/REPOSITORY_ID/actions/runs/RUN_ID/approvals`. For more information, see [AUTOTITLE](/rest/actions/workflow-runs#get-the-review-history-for-a-workflow-run).
1. Optionally, review the deployment on {% data variables.product.prodname_dotcom %}. For more information, see [AUTOTITLE](/actions/managing-workflow-runs/reviewing-deployments).

{% ifversion fpt or ghec %}

## Publishing custom deployment protection rules in the {% data variables.product.prodname_marketplace %}

You can publish your {% data variables.product.prodname_github_app %} to the {% data variables.product.prodname_marketplace %} to allow developers to discover suitable protection rules and install them across their {% data variables.product.company_short %} repositories. Or you can browse existing custom deployment protection rules to suit your needs. For more information, see [AUTOTITLE](/apps/publishing-apps-to-github-marketplace/github-marketplace-overview/about-github-marketplace) and [AUTOTITLE](/apps/publishing-apps-to-github-marketplace/listing-an-app-on-github-marketplace).

{% endif %}
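The approval step described above can be sketched as a `curl` call. This is a template, not runnable as-is: `OWNER`, `REPO`, `RUN_ID`, and the installation access token are placeholders, and the `environment_name` and `comment` fields are assumptions beyond the `state` property named in the steps above.

```shell
# Approve a pending deployment from your protection rule's webhook handler.
curl --request POST \
  --url "{% data variables.product.rest_url %}/repos/OWNER/REPO/actions/runs/RUN_ID/deployment_protection_rule" \
  --header "Accept: application/vnd.github+json" \
  --header "Authorization: Bearer INSTALLATION_ACCESS_TOKEN" \
  --header "Content-Type: application/json" \
  --data \
  '{
    "environment_name": "production",
    "state": "approved",
    "comment": "Gating checks passed"
  }'
```

To reject instead, set `state` to `rejected` in the same request body.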
https://github.com/github/docs/blob/main//content/actions/how-tos/deploy/configure-and-manage-deployments/create-custom-protection-rules.md