id | text | source
---|---|---|
127c5a5c690b-2 | + [Installing the AWS CodeStar app on Bitbucket and creating a connection](#action-reference-CodestarConnectionSource-auth)
+ [See also](#action-reference-CodestarConnectionSource-links) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
8a74fe091c7a-0 | + Category: `Source`
+ Owner: `AWS`
+ Provider: `CodeStarSourceConnection`
+ Version: `1` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
2f55d6fab429-0 | **ConnectionArn**
Required: Yes
The connection ARN that is configured and authenticated for the source provider\.
**FullRepositoryId**
Required: Yes
The owner and name of the repository where source changes are to be detected\.
Example: `some-user/my-repo`
**BranchName**
Required: Yes
The name of the branch where source changes are to be detected\.
**OutputArtifactFormat**
Required: No
Specifies the output artifact format\. Can be either `CODEBUILD_CLONE_REF` or `CODE_ZIP`\. If unspecified, the default is `CODE_ZIP`\.
The `CODEBUILD_CLONE_REF` option can only be used by CodeBuild downstream actions\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
41ddf6eb1bef-0 | + **Number of Artifacts:** `0`
+ **Description:** Input artifacts do not apply for this action type\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
940f92031ffb-0 | + **Number of Artifacts:** `1`
+ **Description:** The artifacts generated from the repository are the output artifacts for the `CodeStarSourceConnection` action\. The source code commit ID is displayed in CodePipeline as the source revision for the triggered pipeline execution\. You can configure the output artifact of this action as one of the following:
+ A ZIP file that contains the contents of the configured repository and branch at the commit specified as the source revision for the pipeline execution\.
+ A JSON file that contains a URL reference to the repository so that downstream actions can perform Git commands directly\.
**Important**
This option can only be used by CodeBuild downstream actions\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
e62f9ea9f36c-0 | ------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
794f460fada9-0 | ```
Name: Source
Actions:
  - InputArtifacts: []
    ActionTypeId:
      Version: '1'
      Owner: AWS
      Category: Source
      Provider: CodeStarSourceConnection
    OutputArtifacts:
      - Name: SourceArtifact
    RunOrder: 1
    Configuration:
      ConnectionArn: "arn:aws:codestar-connections:region:account-id:connection/connection-id"
      FullRepositoryId: "some-user/my-repo"
      BranchName: "master"
      OutputArtifactFormat: "CODE_ZIP"
    Name: ApplicationSource
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
28200c474880-0 | ```
{
"Name": "Source",
"Actions": [
{
"InputArtifacts": [],
"ActionTypeId": {
"Version": "1",
"Owner": "AWS",
"Category": "Source",
"Provider": "CodeStarSourceConnection"
},
"OutputArtifacts": [
{
"Name": "SourceArtifact"
}
],
"RunOrder": 1,
"Configuration": {
"ConnectionArn": "arn:aws:codestar-connections:region:account-id:connection/connection-id",
"FullRepositoryId": "some-user/my-repo",
"BranchName": "master",
"OutputArtifactFormat": "CODE_ZIP"
},
"Name": "ApplicationSource"
}
]
},
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
3c5ae0e08b6c-0 | The first time you use the console to add a new connection to a Bitbucket repository, you must authorize CodePipeline access to your repositories\. You choose or create an installation app that helps you connect to the account where you have created your third\-party code repository\.
When you use the AWS CLI or an AWS CloudFormation template, you must provide the connection ARN of a Bitbucket connection that has already gone through the installation handshake\. Otherwise, the pipeline is not triggered\.
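As a rough CLI sketch (the connection name and ARN below are placeholders, not values from this guide), you might create the connection and confirm its status before referencing its ARN in a pipeline:
```
# Create a Bitbucket connection; it starts in PENDING status.
aws codestar-connections create-connection \
    --provider-type Bitbucket \
    --connection-name my-bitbucket-connection

# After completing the installation handshake in the console, confirm that
# the connection status is AVAILABLE before using its ARN in a pipeline.
aws codestar-connections get-connection \
    --connection-arn arn:aws:codestar-connections:us-west-2:111111111111:connection/example-connection-id
```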
**Note**
Most source actions in CodePipeline, such as GitHub, require either a configured change detection resource \(such as a webhook or CloudWatch Events rule\) or use the option to poll the repository for source changes\. For pipelines with a Bitbucket Cloud source action, you do not have to set up a webhook or default to polling\. The connections action manages your source change detection for you\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
7141094fbf5c-0 | The following related resources can help you as you work with this action\.
+ [AWS CodeStar Connections API Reference](https://docs.aws.amazon.com/codestar-connections/latest/APIReference/Welcome.html) – The AWS CodeStar Connections API Reference provides reference information for the available connections actions\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-CodestarConnectionSource.md |
33d3676bbbe8-0 | You should use Amazon CloudWatch Events to detect source code changes and trigger the start of your pipeline\. If your pipeline has an Amazon S3 source, you must create an AWS CloudTrail trail to log write events to objects in your Amazon S3 source bucket\.
AWS CloudTrail is a service that logs and filters events on your Amazon S3 source bucket\. The trail sends the filtered source changes to the Amazon CloudWatch Events rule\. The Amazon CloudWatch Events rule detects the source change and then starts your pipeline\.
**Note**
For pipelines with an Amazon S3 source, an Amazon CloudWatch Events rule detects source changes and then starts your pipeline when changes occur\. When you use the console to create or change a pipeline, the rule and all associated resources are created for you\. If you create or change a pipeline with an Amazon S3 source in the CLI or AWS CloudFormation, you must create the Amazon CloudWatch Events rule, IAM role, and AWS CloudTrail trail manually\.
**Requirements:**
+ If you are not creating a trail, use an existing AWS CloudTrail trail for logging events in your Amazon S3 source bucket and sending filtered events to the Amazon CloudWatch Events rule\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/create-cloudtrail-S3-source.md |
33d3676bbbe8-1 | + Create or use an existing S3 bucket where AWS CloudTrail can store its log files\. AWS CloudTrail must have the permissions required to deliver log files to an Amazon S3 bucket\. The bucket cannot be configured as a [Requester Pays](https://docs.aws.amazon.com/AmazonS3/latest/dev/RequesterPaysBuckets.html) bucket\. When you create an Amazon S3 bucket as part of creating or updating a trail in the console, AWS CloudTrail attaches the required permissions to a bucket for you\. For more information, see [Amazon S3 Bucket Policy for CloudTrail](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/create-s3-bucket-policy-for-cloudtrail.html)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/create-cloudtrail-S3-source.md |
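A minimal CLI sketch of the trail setup described above might look like the following; the trail name, log bucket, source bucket, and object key are placeholders, and the CloudWatch Events rule and IAM role mentioned in the note still have to be created separately:
```
# Create a trail that delivers log files to an existing log bucket.
aws cloudtrail create-trail \
    --name codepipeline-source-trail \
    --s3-bucket-name my-cloudtrail-log-bucket

# Log write-only data events for the source object key in the source bucket.
aws cloudtrail put-event-selectors \
    --trail-name codepipeline-source-trail \
    --event-selectors '[{"ReadWriteType":"WriteOnly","IncludeManagementEvents":false,"DataResources":[{"Type":"AWS::S3::Object","Values":["arn:aws:s3:::my-source-bucket/SampleApp.zip"]}]}]'

# Start logging on the trail.
aws cloudtrail start-logging --name codepipeline-source-trail
```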
cdf0a52c4d69-0 | This section provides an overview of the way CodePipeline processes a set of changes\. CodePipeline tracks each pipeline execution that starts when a change is made to the source code\. CodePipeline also tracks the way each execution progresses through the pipeline, including whether it is superseded by another execution\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
39912f070a77-0 | You can trigger an execution when you change your source code or manually start the pipeline\. You can also trigger an execution through an Amazon CloudWatch Events rule that you schedule\. For example, when a source code change is pushed to a repository configured as the pipeline's source action, the pipeline detects the change and starts an execution\.
**Note**
If a pipeline contains multiple source actions, all of them run again, even if a change is detected for one source action only\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
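For reference, a manual start from the AWS CLI is a single call; the pipeline name is a placeholder:
```
# Start a new execution of the pipeline; the command returns the execution ID.
aws codepipeline start-pipeline-execution --name MyFirstPipeline
```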
08ff93ace18a-0 | To use the console to stop a pipeline execution, you can choose **Stop execution** on the pipeline visualization page, on the execution history page, or on the detailed history page\. To use the CLI to stop a pipeline execution, you use the `stop-pipeline-execution` command\. For more information, see [Stop a pipeline execution in CodePipeline](pipelines-stop.md)\.
There are two ways to stop a pipeline execution:
+ **Stop and wait:** All in\-progress action executions are allowed to complete, and subsequent actions are not started\. The pipeline execution does not continue to subsequent stages\. You cannot use this option on an execution that is already in a `Stopping` state\.
+ **Stop and abandon:** All in\-progress action executions are abandoned and do not complete, and subsequent actions are not started\. The pipeline execution does not continue to subsequent stages\. You can use this option on an execution that is already in a `Stopping` state\.
**Note**
This option can lead to failed tasks or out\-of\-sequence tasks\.
Each option results in a different sequence of pipeline and action execution phases, as follows\.
**Option 1: Stop and wait**
When you choose to stop and wait, the selected execution continues until in\-progress actions are completed\. For example, the following pipeline execution was stopped while the build action was in progress\.
1. In the pipeline view, the success message banner is displayed, and the build action continues until it is completed\. The pipeline execution status is **Stopping**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/stop-exec-wait-vis-1.png)
In the history view, the status for in\-progress actions, such as the build action, is **In progress** until the build action is completed\. While actions are in progress, the pipeline execution status is **Stopping**\.
1. The execution stops when the stopping process is complete\. If the build action is completed successfully, its status is **Succeeded**, and the pipeline execution shows a status of **Stopped**\. Subsequent actions do not start\. The **Retry** button is enabled\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
08ff93ace18a-2 | ![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/stop-exec-wait-vis-2.png)
In the history view, the execution status is **Stopped** after the in\-progress action is completed\.
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/stop-exec-wait-hist-1.png)
**Option 2: Stop and abandon**
When you choose to stop and abandon, the selected execution does not wait for in\-progress actions to complete\. The actions are abandoned\. For example, the following pipeline execution was stopped and abandoned while the build action was in progress\.
1. In the pipeline view, the success banner message is displayed, the build action shows a status of **In progress**, and the pipeline execution shows a status of **Stopping**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
08ff93ace18a-3 | ![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/stop-exec-abandon-vis-1.png)
1. After the pipeline execution stops, the build action shows a status of **Abandoned**, and the pipeline execution shows a status of **Stopped**\. Subsequent actions do not start\. The **Retry** button is enabled\.
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/stop-exec-abandon-vis-2.png)
1. In the history view, the execution status is **Stopped**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/stop-exec-abandon-hist-1.png) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
**Use cases for stopping a pipeline execution**
We recommend that you use the stop and wait option to stop a pipeline execution\. This option is safer because it avoids possible failed or out\-of\-sequence tasks in your pipeline\. When an action is abandoned in CodePipeline, the action provider continues any tasks related to the action\. In the case of an AWS CloudFormation action, the deployment action in the pipeline is abandoned, but the stack update might continue and result in a failed update\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
08ff93ace18a-5 | As an example of abandoned actions that can result in out\-of\-sequence tasks, if you are deploying a large file \(1GB\) through an S3 deployment action, and you choose to stop and abandon the action while the deployment is already in progress, the action is abandoned in CodePipeline, but continues in Amazon S3\. Amazon S3 does not encounter any instruction to cancel the upload\. Next, if you start a new pipeline execution with a very small file, there are now two deployments in progress\. Because the file size of the new execution is small, the new deployment completes while the old deployment is still uploading\. When the old deployment completes, the new file is overwritten by the old file\.
You might want to use the stop and abandon option when you have a custom action\. For example, you can abandon a custom action whose work does not need to finish before you start a new execution that contains a bug fix\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
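As a sketch, both behaviors use the same CLI command described earlier; the `--abandon` flag selects stop and abandon, and the pipeline name and execution ID are placeholders:
```
# Stop and wait: in-progress actions are allowed to complete.
aws codepipeline stop-pipeline-execution \
    --pipeline-name MyFirstPipeline \
    --pipeline-execution-id example-execution-id \
    --reason "Stopping to wait for in-progress actions"

# Stop and abandon: in-progress actions are abandoned and do not complete.
aws codepipeline stop-pipeline-execution \
    --pipeline-name MyFirstPipeline \
    --pipeline-execution-id example-execution-id \
    --abandon \
    --reason "Abandoning so a bug fix can start"
```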
767ee1bc8d73-0 | An execution consists of a set of changes picked up and processed by the execution\. Pipelines can process multiple executions at the same time\. Each execution is run through the pipeline separately\. The pipeline processes each execution in order and might supersede an earlier execution with a later one\. The following rules are used to process executions in a pipeline\.
**Rule 1: Stages are locked when an execution is being processed**
Because each stage can process only one execution at a time, the stage is locked while in progress\. When the execution completes a stage, it transitions to the next stage in the pipeline\.
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/Promotion.png)
**Rule 2: Subsequent executions wait for the stage to be unlocked**
While a stage is locked, waiting executions are held in front of the locked stage\. All actions configured for a stage must be completed successfully before the stage is considered complete\. A failure releases the lock on the stage\. When an execution is stopped, the execution does not continue in a stage and the stage is unlocked\.
**Note** | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
Before you stop an execution, we recommend that you disable the transition in front of the stage\. This way, when the stage is unlocked due to the stopped execution, the stage does not accept a subsequent pipeline execution\.
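A rough sketch of that recommendation with the CLI; the pipeline and stage names are placeholders:
```
# Disable the inbound transition so the stage does not accept a new execution
# after the stopped execution releases its lock.
aws codepipeline disable-stage-transition \
    --pipeline-name MyFirstPipeline \
    --stage-name Deploy \
    --transition-type Inbound \
    --reason "Stopping the current execution"
```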
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/Waiting.png)
**Rule 3: Waiting executions are superseded by more recent executions**
Executions are superseded only in between stages\. A locked stage holds one execution at the front of the stage, waiting for the stage to complete\. A more recent execution overtakes a waiting execution and continues to the next stage as soon as the stage is unlocked\. The superseded execution does not continue\. In this example, Execution 2 has been superseded by Execution 3 while waiting for the locked stage\. Execution 3 enters the stage next\.
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/Batching.png) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
8ee12551cbfe-0 | The flow of pipeline executions can be controlled by:
+ A *transition*, which controls the flow of executions into the stage\. Transitions can be enabled or disabled\. After you enable the transition, any execution waiting to enter the stage moves into the stage and locks it\. Similar to executions awaiting a locked stage, when a transition is disabled, the execution waiting to enter the stage can still be superseded by a new execution\. When a disabled transition is re\-enabled, the latest execution, including any that superseded older executions while the transition was disabled, enters the stage\.
+ An *approval action*, which prevents a pipeline from transitioning to the next action until permission is granted \(for example, through manual approval from an authorized IAM user\)\. You might use an approval action when you want to control the time at which a pipeline transitions to a final **Production** stage, for example\.
**Note**
A stage with an approval action is locked until the approval action is approved or rejected or has timed out\. A timed\-out approval action is processed in the same way as a failed action\.
+ A *failure*, when an action in a stage does not complete successfully\. The revision does not transition to the next action in the stage or the next stage in the pipeline\. The following can occur:
+ You manually retry the stage that contains the failed actions\. This resumes the execution \(it retries failed actions and, if they succeed, continues in the stage/pipeline\)\.
+ Another execution enters the failed stage and supersedes the failed execution\. At this point, the failed execution cannot be retried\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
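For example, a manual retry of the failed actions in a stage might look like the following CLI call; the names and execution ID are placeholders:
```
# Retry only the failed actions in the stage for the given execution.
aws codepipeline retry-stage-execution \
    --pipeline-name MyFirstPipeline \
    --stage-name Deploy \
    --pipeline-execution-id example-execution-id \
    --retry-mode FAILED_ACTIONS
```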
2474f86edea0-0 | When deciding how a code change should flow through your pipeline, it is best to group related actions within a stage so that, when the stage locks, the actions all process the same execution\. You might create a stage for each application environment, AWS Region, or Availability Zone, and so on\. A pipeline with too many stages \(that is, too granular\) can allow too many concurrent changes, while a pipeline with many actions in a large stage \(too coarse\) can take too long to release a change\.
As an example, a test action after a deployment action in the same stage is guaranteed to test the same change that was deployed\. In this example, a change is deployed to a Test environment and then tested, and then the latest change from the test environment is deployed to a Production environment\. In the recommended example, the Test environment and the Prod environment are separate stages\.
![\[Image NOT FOUND\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/structure-example-recommended-notrecommended.png) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/concepts-how-it-works.md |
c23988ac1126-0 | You can use the AWS CLI to edit the webhook for your repository\.
+ If you use the console to edit the GitHub source action for your pipeline, the webhook is updated for you \(and re\-registered, if appropriate\)\.
+ If you are not updating the webhook name, and you are not changing the GitHub repository, you can use the AWS CLI to update the webhook\. See Example 1\.
+ If you are changing the webhook name or GitHub repository name, you must edit the source action in the console or delete and recreate the webhook in the CLI\. After you create the webhook, you also register it\. See Example 2\.
**Example 1: To update a webhook secret**
1. In a text editor, edit the JSON file for the webhook you want to update\. This example modifies the sample file that was used to create the webhook in [Create a webhook for a GitHub source](pipelines-webhooks-create.md)\. This sample changes the secret token of the webhook named `"my-webhook"`\.
```
{"webhook":
{"name": "my-webhook",
"targetPipeline": "pipeline_name",
"targetAction": "source_action_name",
"filters": [
{
"jsonPath": "$.ref", | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-update.title.md |
"matchEquals": "refs/heads/{Branch}"
}
],
"authentication": "GITHUB_HMAC",
"authenticationConfiguration": {"SecretToken":"new_secret"}
}
}
```
1. Call the put\-webhook command and include the `--cli-input-json` and `--region` parameters\.
The following sample command updates the webhook with the modified `webhook_json.json` file\.
```
aws codepipeline put-webhook --cli-input-json file://webhook_json.json --region "eu-central-1"
```
1. The output returns the webhook details and the new secret\.
**Note**
You can edit the GitHub source action in the console\. This allows CodePipeline to manage webhooks for you\.
**Example 2: To update a webhook name or GitHub repository**
1. Use the steps in [Delete the webhook for your GitHub source](pipelines-webhooks-delete.md) to deregister and delete the existing webhook that is associated with the old webhook name or GitHub repository\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-update.title.md |
c23988ac1126-2 | 1. Use the steps in [Create a webhook for a GitHub source](pipelines-webhooks-create.md) to recreate the webhook\.
**Note**
You can edit the GitHub source action in the console\. This allows CodePipeline to manage webhooks for you\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-update.title.md |
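The Example 2 sequence, sketched with the CLI; the webhook names and JSON file are placeholders, and the linked topics describe the full steps:
```
# Deregister the existing webhook from GitHub, then delete it in CodePipeline.
aws codepipeline deregister-webhook-with-third-party --webhook-name my-webhook
aws codepipeline delete-webhook --name my-webhook

# Recreate the webhook from an updated JSON definition, then register it again.
aws codepipeline put-webhook --cli-input-json file://webhook_json.json --region "eu-central-1"
aws codepipeline register-webhook-with-third-party --webhook-name my-new-webhook
```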
c9baa5b013cb-0 | AWS AppConfig is a capability of AWS Systems Manager\. AppConfig supports controlled deployments to applications of any size and includes built\-in validation checks and monitoring\. You can use AppConfig with applications hosted on Amazon EC2 instances, AWS Lambda, containers, mobile applications, or IoT devices\.
The `AppConfig` deploy action is an AWS CodePipeline action that deploys configurations stored in your pipeline source location to a specified AppConfig *application*, *environment*, and *configuration* profile\. It uses the preferences defined in an AppConfig *deployment strategy*\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
aedd76afa57f-0 | + Category: `Deploy`
+ Owner: `AWS`
+ Provider: `AppConfig`
+ Version: `1` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
5518bdb5fd0f-0 | **Application**
Required: Yes
The ID of the AWS AppConfig application with the details for your configuration and deployment\.
**Environment**
Required: Yes
The ID of the AWS AppConfig environment where the configuration is deployed\.
**ConfigurationProfile**
Required: Yes
The ID of the AWS AppConfig configuration profile to deploy\.
**InputArtifactConfigurationPath**
Required: Yes
The file path of the configuration data within the input artifact to deploy\.
**DeploymentStrategy**
Required: No
The AWS AppConfig deployment strategy to use for deployment\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
7fa5e230f5f4-0 | + **Number of Artifacts:** `1`
+ **Description:** The input artifact for the deploy action\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
7a95aca6f3f5-0 | Not applicable\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
f93487eb4928-0 | ------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
606190b7c99e-0 | ```
name: Deploy
actions:
  - name: Deploy
    actionTypeId:
      category: Deploy
      owner: AWS
      provider: AppConfig
      version: '1'
    runOrder: 1
    configuration:
      Application: 2s2qv57
      ConfigurationProfile: PvjrpU
      DeploymentStrategy: frqt7ir
      Environment: 9tm27yd
      InputArtifactConfigurationPath: /
    outputArtifacts: []
    inputArtifacts:
      - name: SourceArtifact
    region: us-west-2
    namespace: DeployVariables
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
9dd6279250b2-0 | ```
{
"name": "Deploy",
"actions": [
{
"name": "Deploy",
"actionTypeId": {
"category": "Deploy",
"owner": "AWS",
"provider": "AppConfig",
"version": "1"
},
"runOrder": 1,
"configuration": {
"Application": "2s2qv57",
"ConfigurationProfile": "PvjrpU",
"DeploymentStrategy": "frqt7ir",
"Environment": "9tm27yd",
"InputArtifactConfigurationPath": "/"
},
"outputArtifacts": [],
"inputArtifacts": [
{
"name": "SourceArtifact"
}
],
"region": "us-west-2",
"namespace": "DeployVariables"
}
]
}
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
0444d9b8987b-0 | The following related resources can help you as you work with this action\.
+ [AWS AppConfig](https://docs.aws.amazon.com/systems-manager/latest/userguide/appconfig.html) – For information about AWS AppConfig deployments, see the *AWS Systems Manager User Guide*\.
+ [Tutorial: Create a pipeline that uses AWS AppConfig as a deployment provider](tutorials-AppConfig.md) – This tutorial gets you started setting up simple deployment configuration files and AppConfig resources, and shows you how to use the console to create a pipeline with an AWS AppConfig deployment action\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-AppConfig.md |
2797cdd21c76-0 | This tutorial helps you create a complete, end\-to\-end continuous deployment \(CD\) pipeline for Amazon ECS with CodePipeline\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
68d89178ed19-0 | There are a few resources that you must have in place before you can use this tutorial to create your CD pipeline\. Here are the things you need to get started:
**Note**
All of these resources should be created within the same AWS Region\.
+ A source control repository \(this tutorial uses CodeCommit\) with your Dockerfile and application source\. For more information, see [Create a CodeCommit Repository](https://docs.aws.amazon.com/codecommit/latest/userguide/how-to-create-repository.html) in the *AWS CodeCommit User Guide*\.
+ A Docker image repository \(this tutorial uses Amazon ECR\) that contains an image you have built from your Dockerfile and application source\. For more information, see [Creating a Repository](https://docs.aws.amazon.com/AmazonECR/latest/userguide/repository-create.html) and [Pushing an Image](https://docs.aws.amazon.com/AmazonECR/latest/userguide/docker-push-ecr-image.html) in the *Amazon Elastic Container Registry User Guide*\.
+ An Amazon ECS task definition that references the Docker image hosted in your image repository\. For more information, see [Creating a Task Definition](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create-task-definition.html) in the *Amazon Elastic Container Service Developer Guide*\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
68d89178ed19-1 | + An Amazon ECS cluster that is running a service that uses your previously mentioned task definition\. For more information, see [Creating a Cluster](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create_cluster.html) and [Creating a Service](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/create-service.html) in the *Amazon Elastic Container Service Developer Guide*\.
After you have satisfied these prerequisites, you can proceed with the tutorial and create your CD pipeline\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
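If you still need to build and push the initial image from your Dockerfile, a rough sketch of the Amazon ECR steps looks like this; the account ID, Region, and repository name are placeholders:
```
# Authenticate the Docker CLI to your Amazon ECR registry.
aws ecr get-login-password --region us-west-2 | \
    docker login --username AWS --password-stdin 012345678910.dkr.ecr.us-west-2.amazonaws.com

# Build, tag, and push the image referenced by your task definition.
docker build -t hello-world .
docker tag hello-world:latest 012345678910.dkr.ecr.us-west-2.amazonaws.com/hello-world:latest
docker push 012345678910.dkr.ecr.us-west-2.amazonaws.com/hello-world:latest
```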
6f8a8970407c-0 | This tutorial uses CodeBuild to build your Docker image and push the image to Amazon ECR\. Add a `buildspec.yml` file to your source code repository to tell CodeBuild how to do that\. The example build specification below does the following:
+ Pre\-build stage:
+ Log in to Amazon ECR\.
+ Set the repository URI to your ECR image and add an image tag with the first seven characters of the Git commit ID of the source\.
+ Build stage:
+ Build the Docker image and tag the image both as `latest` and with the Git commit ID\.
+ Post\-build stage:
+ Push the image to your ECR repository with both tags\.
+ Write a file called `imagedefinitions.json` in the build root that has your Amazon ECS service's container name and the image and tag\. The deployment stage of your CD pipeline uses this information to create a new revision of your service's task definition, and then it updates the service to use the new task definition\. The `imagedefinitions.json` file is required for the ECS job worker\.
```
version: 0.2
phases:
  install:
    runtime-versions:
      docker: 19
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - aws --version
      - $(aws ecr get-login --region $AWS_DEFAULT_REGION --no-include-email)
      - REPOSITORY_URI=012345678910.dkr.ecr.us-west-2.amazonaws.com/hello-world
      - COMMIT_HASH=$(echo $CODEBUILD_RESOLVED_SOURCE_VERSION | cut -c 1-7)
      - IMAGE_TAG=${COMMIT_HASH:=latest}
  build:
    commands:
      - echo Build started on `date`
      - echo Building the Docker image...
      - docker build -t $REPOSITORY_URI:latest .
      - docker tag $REPOSITORY_URI:latest $REPOSITORY_URI:$IMAGE_TAG
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker images...
      - docker push $REPOSITORY_URI:latest
      - docker push $REPOSITORY_URI:$IMAGE_TAG
      - echo Writing image definitions file...
      - printf '[{"name":"hello-world","imageUri":"%s"}]' $REPOSITORY_URI:$IMAGE_TAG > imagedefinitions.json
artifacts:
  files: imagedefinitions.json
```
The build specification was written for the following task definition, used by the Amazon ECS service for this tutorial\. The `REPOSITORY_URI` value corresponds to the `image` repository \(without any image tag\), and the `hello-world` value near the end of the file corresponds to the container name in the service's task definition\.
```
{
"ipcMode": null,
"executionRoleArn": "role_ARN",
"containerDefinitions": [
{
"dnsSearchDomains": null,
"environmentFiles": null,
"logConfiguration": {
"logDriver": "awslogs",
"secretOptions": null,
"options": {
"awslogs-group": "/ecs/hello-world",
"awslogs-region": "us-west-2",
"awslogs-stream-prefix": "ecs"
}
},
"entryPoint": null,
"portMappings": [
{
"hostPort": 80,
"protocol": "tcp",
"containerPort": 80
}
],
"command": null, | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
"linuxParameters": null,
"cpu": 0,
"environment": [],
"resourceRequirements": null,
"ulimits": null,
"dnsServers": null,
"mountPoints": [],
"workingDirectory": null,
"secrets": null,
"dockerSecurityOptions": null,
"memory": null,
"memoryReservation": 128,
"volumesFrom": [],
"stopTimeout": null,
"image": "image_name",
"startTimeout": null,
"firelensConfiguration": null,
"dependsOn": null,
"disableNetworking": null,
"interactive": null,
"healthCheck": null,
"essential": true,
"links": null,
"hostname": null,
"extraHosts": null,
"pseudoTerminal": null,
"user": null,
"readonlyRootFilesystem": null,
"dockerLabels": null,
"systemControls": null, | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
"privileged": null,
"name": "hello-world"
}
],
"placementConstraints": [],
"memory": "2048",
"taskRoleArn": null,
"compatibilities": [
"EC2",
"FARGATE"
],
"taskDefinitionArn": "ARN",
"family": "hello-world",
"requiresAttributes": [],
"pidMode": null,
"requiresCompatibilities": [
"FARGATE"
],
"networkMode": "awsvpc",
"cpu": "1024",
"revision": 1,
"status": "ACTIVE",
"inferenceAccelerators": null,
"proxyConfiguration": null,
"volumes": []
}
```
**To add a `buildspec.yml` file to your source repository**
1. Open a text editor and then copy and paste the build specification above into a new file\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
1. Replace the `REPOSITORY_URI` value \(`012345678910.dkr.ecr.us-west-2.amazonaws.com/hello-world`\) with your Amazon ECR repository URI \(without any image tag\) for your Docker image\. Replace `hello-world` with the container name in your service's task definition that references your Docker image\.
1. Commit and push your `buildspec.yml` file to your source repository\.
1. Add the file\.
```
git add .
```
1. Commit the change\.
```
git commit -m "Adding build specification."
```
1. Push the commit\.
```
git push
``` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
2deeb7bbb865-0 | Use the CodePipeline wizard to create your pipeline stages and connect your source repository to your ECS service\.
**To create your pipeline**
1. Open the CodePipeline console at [https://console\.aws\.amazon\.com/codepipeline/](https://console.aws.amazon.com/codepipeline/)\.
1. On the **Welcome** page, choose **Create pipeline**\.
If this is your first time using CodePipeline, an introductory page appears instead of **Welcome**\. Choose **Get Started Now**\.
1. On the **Step 1: Name** page, for **Pipeline name**, type the name for your pipeline and choose **Next**\. For this tutorial, the pipeline name is **hello\-world**\.
1. On the **Step 2: Add source stage** page, for **Source provider**, choose **AWS CodeCommit**\.
1. For **Repository name**, choose the name of the CodeCommit repository to use as the source location for your pipeline\.
1. For **Branch name**, choose the branch to use and choose **Next**\.
1. On the **Step 3: Add build stage** page, for **Build provider** choose **AWS CodeBuild**, and then choose **Create project**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
2deeb7bbb865-1 | 1. For **Project name**, choose a unique name for your build project\. For this tutorial, the project name is **hello\-world**\.
1. For **Environment image**, choose **Managed image**\.
1. For **Operating system**, choose **Amazon Linux 2**\.
1. For **Runtime\(s\)**, choose **Standard**\.
1. For **Image**, choose **aws/codebuild/amazonlinux2\-x86\_64\-standard:2\.0**\.
1. For **Image version** and **Environment type**, use the default values\.
1. Select **Enable this flag if you want to build Docker images or want your builds to get elevated privileges**\.
1. Deselect **CloudWatch logs**\. You might need to expand **Advanced**\.
1. Choose **Continue to CodePipeline**\.
1. Choose **Next**\.
**Note**
The wizard creates a CodeBuild service role for your build project, called **codebuild\-*build\-project\-name*\-service\-role**\. Note this role name, as you add Amazon ECR permissions to it later\.
1. On the **Step 4: Add deploy stage** page, for **Deployment provider**, choose **Amazon ECS**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
1. For **Cluster name**, choose the Amazon ECS cluster in which your service is running\. For this tutorial, the cluster is **default**\.
1. For **Service name**, choose the service to update and choose **Next**\. For this tutorial, the service name is **hello\-world**\.
1. On the **Step 5: Review** page, review your pipeline configuration and choose **Create pipeline** to create the pipeline\.
**Note**
Now that the pipeline has been created, it attempts to run through the different pipeline stages\. However, the default CodeBuild role created by the wizard does not have permissions to execute all of the commands contained in the `buildspec.yml` file, so the build stage fails\. The next section adds the permissions for the build stage\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
5f04b6e40de5-0 | The CodePipeline wizard created an IAM role for the CodeBuild build project, called **codebuild\-*build\-project\-name*\-service\-role**\. For this tutorial, the name is **codebuild\-hello\-world\-service\-role**\. Because the `buildspec.yml` file makes calls to Amazon ECR API operations, the role must have a policy that allows permissions to make these Amazon ECR calls\. The following procedure helps you attach the proper permissions to the role\.
**To add Amazon ECR permissions to the CodeBuild role**
1. Open the IAM console at [https://console\.aws\.amazon\.com/iam/](https://console.aws.amazon.com/iam/)\.
1. In the left navigation pane, choose **Roles**\.
1. In the search box, type **codebuild\-** and choose the role that was created by the CodePipeline wizard\. For this tutorial, the role name is **codebuild\-hello\-world\-service\-role**\.
1. On the **Summary** page, choose **Attach policies**\.
1. Select the box to the left of the **AmazonEC2ContainerRegistryPowerUser** policy, and choose **Attach policy**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
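If you prefer the AWS CLI, the same permission can be attached with a single call; the role name shown assumes the wizard-created role from this tutorial:
```
# Attach the Amazon ECR power user managed policy to the CodeBuild service role.
aws iam attach-role-policy \
    --role-name codebuild-hello-world-service-role \
    --policy-arn arn:aws:iam::aws:policy/AmazonEC2ContainerRegistryPowerUser
```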
cda52fbbc1b0-0 | Your pipeline now has everything it needs to run an end\-to\-end native AWS continuous deployment\. Test its functionality by pushing a code change to your source repository\.
**To test your pipeline**
1. Make a code change to your configured source repository, commit, and push the change\.
1. Open the CodePipeline console at [https://console\.aws\.amazon\.com/codepipeline/](https://console.aws.amazon.com/codepipeline/)\.
1. Choose your pipeline from the list\.
1. Watch the pipeline progress through its stages\. Your pipeline should complete and your Amazon ECS service runs the Docker image that was created from your code change\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/ecs-cd-pipeline.md |
800acbf23035-0 | By default, IAM users and roles don't have permission to create or modify CodePipeline resources\. They also can't perform tasks using the AWS Management Console, AWS CLI, or AWS API\. An IAM administrator must create IAM policies that grant users and roles permission to perform specific API operations on the specified resources they need\. The administrator must then attach those policies to the IAM users or groups that require those permissions\.
To learn how to create an IAM identity\-based policy using these example JSON policy documents, see [Creating Policies on the JSON Tab](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_create.html#access_policies_create-json-editor) in the *IAM User Guide*\.
**Topics**
+ [Policy best practices](security_iam_service-with-iam-policy-best-practices.md)
+ [Viewing resources in the console](security-iam-resources-console.md)
+ [Allow users to view their own permissions](security_iam_id-based-policy-examples-view-own-permissions.md)
+ [Identity\-based policies \(IAM\) examples](security-iam-id-policies-examples.md)
+ [Using tags to control access to CodePipeline resources](tag-based-access-control.md)
+ [Permissions required to use the CodePipeline console](security-iam-permissions-console.md) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security_iam_id-based-policy-examples.md |
+ [AWS managed \(predefined\) policies for CodePipeline](managed-policies.md)
+ [CodePipeline managed policies and notifications](#notifications-permissions)
+ [Customer managed policy examples](customer-managed-policies.md) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security_iam_id-based-policy-examples.md |
1d58d0a35b05-0 | CodePipeline supports notifications, which can notify users of important changes to pipelines\. Managed policies for CodePipeline include policy statements for notification functionality\. For more information, see [What are notifications?](https://docs.aws.amazon.com/codestar-notifications/latest/userguide/welcome.html)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security_iam_id-based-policy-examples.md |
72890c653158-0 | The `AWSCodePipelineFullAccess` managed policy includes the following statements to allow full access to notifications\. Users with this managed policy applied can also create and manage Amazon SNS topics for notifications, subscribe and unsubscribe users to topics, list topics to choose as targets for notification rules, and list AWS Chatbot clients configured for Slack\.
```
{
"Sid": "CodeStarNotificationsReadWriteAccess",
"Effect": "Allow",
"Action": [
"codestar-notifications:CreateNotificationRule",
"codestar-notifications:DescribeNotificationRule",
"codestar-notifications:UpdateNotificationRule",
"codestar-notifications:DeleteNotificationRule",
"codestar-notifications:Subscribe",
"codestar-notifications:Unsubscribe"
],
"Resource": "*",
"Condition" : {
"StringLike" : {"codestar-notifications:NotificationsForResource" : "arn:aws:codepipeline:*"}
}
},
{
"Sid": "CodeStarNotificationsListAccess",
"Effect": "Allow",
"Action": [
"codestar-notifications:ListNotificationRules",
"codestar-notifications:ListTargets", | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security_iam_id-based-policy-examples.md |
"codestar-notifications:ListTagsforResource",
"codestar-notifications:ListEventTypes"
],
"Resource": "*"
},
{
"Sid": "CodeStarNotificationsSNSTopicCreateAccess",
"Effect": "Allow",
"Action": [
"sns:CreateTopic",
"sns:SetTopicAttributes"
],
"Resource": "arn:aws:sns:*:*:codestar-notifications*"
},
{
"Sid": "SNSTopicListAccess",
"Effect": "Allow",
"Action": [
"sns:ListTopics"
],
"Resource": "*"
},
{
"Sid": "CodeStarNotificationsChatbotAccess",
"Effect": "Allow",
"Action": [
"chatbot:DescribeSlackChannelConfigurations"
],
"Resource": "*"
}
``` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security_iam_id-based-policy-examples.md |
4866ce42b42b-0 | The `AWSCodePipelineReadOnlyAccess` managed policy includes the following statements to allow read\-only access to notifications\. Users with this policy applied can view notifications for resources, but cannot create, manage, or subscribe to them\.
```
{
"Sid": "CodeStarNotificationsPowerUserAccess",
"Effect": "Allow",
"Action": [
"codestar-notifications:DescribeNotificationRule"
],
"Resource": "*",
"Condition" : {
"StringLike" : {"codestar-notifications:NotificationsForResource" : "arn:aws:codepipeline:*"}
}
},
{
"Sid": "CodeStarNotificationsListAccess",
"Effect": "Allow",
"Action": [
"codestar-notifications:ListNotificationRules",
"codestar-notifications:ListEventTypes",
"codestar-notifications:ListTargets"
],
"Resource": "*"
}
```
For more information about IAM and notifications, see [Identity and Access Management for AWS CodeStar Notifications](https://docs.aws.amazon.com/codestar-notifications/latest/userguide/security-iam.html)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security_iam_id-based-policy-examples.md |
3d1591e07269-0 | Conditions in IAM user policy statements are part of the syntax that you use to specify permissions to resources required by CodePipeline actions\. Using tags in conditions is one way to control access to resources and requests\. For information about tagging CodePipeline resources, see [Tagging resources](tag-resources.md)\. This topic discusses tag\-based access control\.
When you design IAM policies, you might be setting granular permissions by granting access to specific resources\. As the number of resources that you manage grows, this task becomes more difficult\. Tagging resources and using tags in policy statement conditions can make this task easier\. You grant access in bulk to any resource with a certain tag\. Then you repeatedly apply this tag to relevant resources, during creation or later\.
Tags can be attached to the resource or passed in the request to services that support tagging\. In CodePipeline, resources can have tags, and some actions can include tags\. When you create an IAM policy, you can use tag condition keys to control:
+ Which users can perform actions on a pipeline resource, based on tags that it already has\.
+ Which tags can be passed in an action's request\.
+ Whether specific tag keys can be used in a request\.
For the complete syntax and semantics of tag condition keys, see [Controlling Access Using Tags](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_tags.html) in the *IAM User Guide*\.
The following examples demonstrate how to specify tag conditions in policies for CodePipeline users\.
**Example 1: Limit actions based on tags in the request** | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tag-based-access-control.md |
The `AWSCodePipelineFullAccess` managed user policy gives users unlimited permission to perform any CodePipeline action on any resource\.
The following policy limits this power and denies unauthorized users permission to create pipelines for specific projects\. To do that, it denies the `CreatePipeline` action if the request specifies a tag named `Project` with one of the values `ProjectA` or `ProjectB`\. \(The `aws:RequestTag` condition key is used to control which tags can be passed in an IAM request\.\) In addition, the policy prevents these unauthorized users from tampering with the resources by using the `aws:TagKeys` condition key to not allow tag modification actions to include these same tag values or to completely remove the `Project` tag\. A customer's administrator must attach this IAM policy to unauthorized IAM users, in addition to the managed user policy\.
```
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Deny",
"Action": [
"codepipeline:CreatePipeline",
"codepipeline:TagResource"
],
"Resource": "*",
"Condition": {
"StringEquals": {
"aws:RequestTag/Project": ["ProjectA", "ProjectB"] | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tag-based-access-control.md |
}
}
},
{
"Effect": "Deny",
"Action": [
"codepipeline:UntagResource"
],
"Resource": "*",
"Condition": {
"ForAllValues:StringEquals": {
"aws:TagKeys": ["Project"]
}
}
}
]
}
```
**Example 2: Limit actions based on resource tags**
The `AWSCodePipelineFullAccess` managed user policy gives users unlimited permission to perform any CodePipeline action on any resource\.
The following policy limits this power and denies unauthorized users permission to perform actions on specified project pipelines\. To do that, it denies some actions if the resource has a tag named `Project` with one of the values `ProjectA` or `ProjectB`\. \(The `aws:ResourceTag` condition key is used to control access to the resources based on the tags on those resources\.\) A customer's administrator must attach this IAM policy to unauthorized IAM users, in addition to the managed user policy\.
```
{
"Version": "2012-10-17",
"Statement": [
{ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tag-based-access-control.md |
"Effect": "Deny",
"Action": [
"codepipeline:TagResource",
"codepipeline:UntagResource",
"codepipeline:UpdatePipeline",
"codepipeline:DeletePipeline",
"codepipeline:ListTagsForResource"
],
"Resource": "*",
"Condition": {
"StringEquals": {
"aws:ResourceTag/Project": ["ProjectA", "ProjectB"]
}
}
}
]
}
```
**Example 3: Allow actions based on tags in the request**
The following policy grants users permission to create development pipelines in CodePipeline\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tag-based-access-control.md |
To do that, it allows the `CreatePipeline` and `TagResource` actions if the request specifies a tag named `Project` with the value `ProjectA`\. \(The `aws:RequestTag` condition key is used to control which tags can be passed in an IAM request\.\) The `aws:TagKeys` condition ensures tag key case sensitivity\. This policy is useful for IAM users who don't have the `AWSCodePipelineFullAccess` managed user policy attached\. The managed policy gives users unlimited permission to perform any CodePipeline action on any resource\.
```
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"codepipeline:CreatePipeline",
"codepipeline:TagResource"
],
"Resource": "*",
"Condition": {
"StringEquals": {
"aws:RequestTag/Project": "ProjectA"
},
"ForAllValues:StringEquals": {
"aws:TagKeys": ["Project"]
}
}
}
]
}
``` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tag-based-access-control.md |
**Example 4: Allow actions based on resource tags**
The following policy grants users permission to perform actions on, and get information about, project pipelines in CodePipeline\.
To do that, it allows specific actions if the pipeline has a tag named `Project` with the value `ProjectA`\. \(The `aws:ResourceTag` condition key is used to control access to the resources based on the tags on those resources\.\) The `aws:TagKeys` condition ensures tag key case sensitivity\. This policy is useful for IAM users who don't have the `AWSCodePipelineFullAccess` managed user policy attached\. The managed policy gives users unlimited permission to perform any CodePipeline action on any resource\.
```
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"codepipeline:UpdatePipeline",
"codepipeline:DeletePipeline",
"codepipeline:ListPipelines"
],
"Resource": "*",
"Condition": {
"StringEquals": {
"aws:ResourceTag/Project": "ProjectA"
},
"ForAllValues:StringEquals": { | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tag-based-access-control.md |
"aws:TagKeys": ["Project"]
}
}
}
]
}
``` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tag-based-access-control.md |
fc8afe04ce65-0 | The following sections provide links to blog posts, articles, and community\-provided examples\.
**Note**
These links are provided for informational purposes only, and should not be considered either a comprehensive list or an endorsement of the content of the examples\. AWS is not responsible for the content or accuracy of external content\.
**Topics**
+ [Integration examples: Blog posts](integrations-community-blogposts.md)
+ [Integration examples: Videos](integrations-community-videos.md) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-community.md |
6d2d8e6b47dd-0 | In this tutorial, you configure a pipeline that continuously delivers files using Amazon S3 as the deployment action provider in your deployment stage\. The completed pipeline detects changes when you make a change to the source files in your source repository\. The pipeline then uses Amazon S3 to deploy the files to your bucket\. Each time you modify, add, or delete your website files in your source location, the deployment creates the website with your latest files\. This tutorial provides two options:
+ Create a pipeline that deploys a static website to your S3 public bucket\. This example creates a pipeline with an AWS CodeCommit source action and an Amazon S3 deployment action\. See [Option 1: Deploy static website files to Amazon S3](#tutorials-s3deploy-acc)\.
+ Create a pipeline that compiles sample TypeScript code into JavaScript and deploys the CodeBuild output artifact to your S3 bucket for archive\. This example creates a pipeline with an Amazon S3 source action, a CodeBuild build action, and an Amazon S3 deployment action\. See [Option 2: Deploy built archive files to Amazon S3 from an S3 source bucket](#tutorials-s3deploy-s3source)\.
**Important**
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline\. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline\. For example, if you create your pipeline in the US East \(Ohio\) Region, your CodeCommit repository must be in the US East \(Ohio\) Region\.
You can add cross\-region actions when you create your pipeline\. AWS resources for cross\-region actions must be in the same AWS Region where you plan to execute the action\. For more information, see [Add a cross\-Region action in CodePipeline](actions-create-cross-region.md)\.
In this example, you download the sample static website template file, upload the files to your AWS CodeCommit repository, create your bucket, and configure it for hosting\. Next, you use the AWS CodePipeline console to create your pipeline and specify an Amazon S3 deployment configuration\.
You must already have the following:
+ A CodeCommit repository\. You can use the AWS CodeCommit repository you created in [Tutorial: Create a simple pipeline \(CodeCommit repository\)](tutorials-simple-codecommit.md)\.
+ Source files for your static website\. Use this link to download a [sample static website](samples/sample-website.zip)\. The sample\-website\.zip download produces the following files:
+ An `index.html` file
+ A `main.css` file
+ A `graphic.jpg` file
+ An S3 bucket configured for website hosting\. See [Hosting a static website on Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html)\. Make sure you create your bucket in the same Region as the pipeline\.
**Note**
To host a website, your bucket must have public read access, which gives everyone read access\. With the exception of website hosting, you should keep the default access settings that block public access to S3 buckets\.
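If you need a starting point for that public read access, the following is a sketch of a typical bucket policy for static website hosting; the bucket name is a placeholder, and you should confirm the details against the Amazon S3 hosting documentation linked above\.
```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "PublicReadGetObject",
      "Effect": "Allow",
      "Principal": "*",
      "Action": "s3:GetObject",
      "Resource": "arn:aws:s3:::my-website-bucket/*"
    }
  ]
}
```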
In this section, you push your source files to the repository that the pipeline uses for your source stage\.
**To push files to your CodeCommit repository**
1. Extract the downloaded sample files\. Do not upload the ZIP file to your repository\.
1. Push or upload the files to your CodeCommit repository\. These files are the source artifact created by the **Create Pipeline** wizard for your deployment action in CodePipeline\. Your files should look like this in your local directory:
```
index.html
main.css
graphic.jpg
```
1. You can use Git or the CodeCommit console to upload your files:
1. To use the Git command line from a cloned repository on your local computer:
1. Run the following command to stage all of your files at once:
```
git add -A
```
1. Run the following command to commit the files with a commit message:
```
git commit -m "Added static website files"
```
1. Run the following command to push the files from your local repo to your CodeCommit repository:
```
git push
```
1. To use the CodeCommit console to upload your files:
1. Open the CodeCommit console, and choose your repository from the **Repositories** list\.
1. Choose **Add file**, and then choose **Upload file**\.
1. Select **Choose file**, and then browse for your file\. Commit the change by entering your user name and email address\. Choose **Commit changes**\.
1. Repeat this step for each file you want to upload\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials-s3deploy.md |
In this section, you create a pipeline with the following actions:
+ A source stage with a CodeCommit action where the source artifacts are the files for your website\.
+ A deployment stage with an Amazon S3 deployment action\.
**To create a pipeline with the wizard**
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**\.
1. In **Step 1: Choose pipeline settings**, in **Pipeline name**, enter **MyS3DeployPipeline**\.
1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM\.
1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**\.
1. In **Step 2: Add source stage**, in **Source provider**, choose **AWS CodeCommit**\. In **Repository name**, choose the name of the CodeCommit repository you created in [Step 1: Create a CodeCommit repository](tutorials-simple-codecommit.md#codecommit-create-repository)\. In **Branch name**, choose the name of the branch that contains your latest code update\. Unless you created a different branch on your own, only `master` is available\.
After you select the repository name and branch, the Amazon CloudWatch Events rule to be created for this pipeline is displayed\.
Choose **Next**\.
1. In **Step 3: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again\.
Choose **Next**\.
1. In **Step 4: Add deploy stage**:
1. In **Deploy provider**, choose **Amazon S3**\.
1. In **Bucket**, enter the name of your public bucket\.
1. Select **Extract file before deploy**\.
**Note**
The deployment fails if you do not select **Extract file before deploy**\. This is because the AWS CodeCommit action in your pipeline zips source artifacts and your file is a ZIP file\.
When **Extract file before deploy** is selected, **Deployment path** is displayed\. Enter the name of the path you want to use\. This creates a folder structure in Amazon S3 to which the files are extracted\. For this tutorial, leave this field blank\. \(A sketch of the corresponding deploy action configuration appears after this procedure\.\)
![\[The Step 4: Deploy page for an S3 deploy action with an AWS CodeCommit source\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-stage-codecommit.png)
1. \(Optional\) In **Canned ACL**, you can apply a set of predefined grants, known as a [canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl), to the uploaded artifacts\.
1. \(Optional\) In **Cache control**, enter the caching parameters\. You can set this to control caching behavior for requests/responses\. For valid values, see the [Cache\-Control](http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9) header field for HTTP operations\.
1. Choose **Next**\.
1. In **Step 5: Review**, review the information, and then choose **Create pipeline**\.
![\[The completed pipeline for an Amazon S3 deploy action with an AWS CodeCommit source\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-pipeline-codecommit.png)
1. After your pipeline runs successfully, open the Amazon S3 console and verify that your files appear in your public bucket as shown:
```
index.html
main.css
graphic.jpg
```
1. Access your endpoint to test the website\. Your endpoint follows this format: `http://bucket-name.s3-website-region.amazonaws.com/`\.
Example endpoint: `http://my-bucket.s3-website-us-west-2.amazonaws.com/`\.
The sample appears as shown here\.
![\[Sample website\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-pipeline-website.png)
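For reference, the deploy action that the wizard creates in this option corresponds roughly to the following action declaration, shown here in the JSON pipeline\-structure style used elsewhere in this guide\. This is a sketch based on the choices above, not the exact wizard output: the bucket and artifact names are placeholders, and your actual pipeline \(for example, as shown by `get-pipeline`\) may use different field casing and include additional fields\.
```
{
  "Name": "Deploy",
  "ActionTypeId": {
    "Category": "Deploy",
    "Owner": "AWS",
    "Provider": "S3",
    "Version": "1"
  },
  "RunOrder": 1,
  "Configuration": {
    "BucketName": "my-website-bucket",
    "Extract": "true"
  },
  "InputArtifacts": [
    {
      "Name": "SourceArtifact"
    }
  ]
}
```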
Make a change to your source files and then push the change to your repository\. This triggers your pipeline to run\. Verify that your website is updated\.
In this option, the build commands in your build stage compile TypeScript code into JavaScript code and deploy the output to your S3 target bucket under a separate timestamped folder\. First, you create TypeScript code and a buildspec\.yml file\. After you combine the source files in a ZIP file, you upload the source ZIP file to your S3 source bucket and use a CodeBuild stage to deploy a built application ZIP file to your S3 target bucket\. The compiled code is retained as an archive in your target bucket\.
You must already have the following:
+ An S3 source bucket\. You can use the bucket you created in [Tutorial: Create a simple pipeline \(S3 bucket\)](tutorials-simple-s3.md)\.
+ An S3 target bucket\. See [Hosting a static website on Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/dev/WebsiteHosting.html)\. Make sure you create your bucket in the same AWS Region as the pipeline you want to create\.
**Note**
This example demonstrates deploying files to a private bucket\. Do not enable your target bucket for website hosting or attach any policies that make the bucket public\.
In this section, you create and upload your source files to the bucket that the pipeline uses for your source stage\. This section provides instructions for creating the following source files:
+ A `buildspec.yml` file, which is used for CodeBuild build projects\.
+ An `index.ts` file\.
**To create a buildspec\.yml file**
+ Create a file named `buildspec.yml` with the following contents\. These build commands install TypeScript and use the TypeScript compiler to rewrite the code in `index.ts` to JavaScript code\.
```
version: 0.2
phases:
install:
commands:
- npm install -g typescript
build:
commands:
- tsc index.ts
artifacts:
files:
- index.js
```
**To create an index\.ts file**
+ Create a file named `index.ts` with the following contents\.
```
interface Greeting {
message: string;
}
class HelloGreeting implements Greeting {
message = "Hello!";
}
function greet(greeting: Greeting) {
console.log(greeting.message);
}
let greeting = new HelloGreeting();
greet(greeting);
```
**To upload files to your S3 source bucket**
1. Your files should look like this in your local directory:
```
buildspec.yml
index.ts
```
Zip the files and name the file `source.zip`\.
1. In the Amazon S3 console, for your source bucket, choose **Upload**\. Choose **Add files**, and then browse for the ZIP file you created\.
1. Choose **Upload**\. These files are the source artifact created by the **Create Pipeline** wizard for your deployment action in CodePipeline\. Your file should look like this in your bucket:
```
source.zip
```
In this section, you create a pipeline with the following actions:
+ A source stage with an Amazon S3 action where the source artifacts are the files for your downloadable application\.
+ A build stage with an AWS CodeBuild action where the TypeScript code is compiled to JavaScript\.
+ A deployment stage with an Amazon S3 deployment action\.
**To create a pipeline with the wizard**
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
1. On the **Welcome** page, **Getting started** page, or **Pipelines** page, choose **Create pipeline**\.
1. In **Step 1: Choose pipeline settings**, in **Pipeline name**, enter **MyS3DeployPipeline**\.
1. In **Service role**, choose **New service role** to allow CodePipeline to create a service role in IAM\.
1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**\.
1. In **Step 2: Add source stage**, in **Source provider**, choose **Amazon S3**\. In **Bucket**, choose the name of your source bucket\. In **S3 object key**, enter the name of your source ZIP file\. Make sure you include the \.zip file extension\.
Choose **Next**\.
1. In **Step 3: Add build stage**:
1. In **Build provider**, choose **AWS CodeBuild**\.
1. Choose **Create build project**\. On the **Create project** page:
1. In **Project name**, enter a name for this build project\.
1. In **Environment**, choose **Managed image**\. For **Operating system**, choose **Ubuntu**\.
1. For **Runtime**, choose **Standard**\. For **Runtime version**, choose **aws/codebuild/standard:1\.0**\.
1. In **Image version**, choose **Always use the latest image for this runtime version**\.
1. For **Service role**, choose your CodeBuild service role, or create one\.
1. For **Build specifications**, choose **Use a buildspec file**\.
1. Choose **Continue to CodePipeline**\. A message is displayed if the project was created successfully\.
1. Choose **Next**\.
1. In **Step 4: Add deploy stage**:
1. In **Deploy provider**, choose **Amazon S3**\.
1. In **Bucket**, enter the name of your S3 target bucket\.
1. Make sure that **Extract file before deploy** is cleared\.
When **Extract file before deploy** is cleared, **S3 object key** is displayed\. Enter the name of the path you want to use: `js-application/{datetime}.zip`\.
This creates a `js-application` folder in Amazon S3 to which the build output ZIP file is deployed\. In this folder, the `{datetime}` variable adds a timestamp to each output file name when your pipeline runs\. \(A sketch of the corresponding deploy action configuration appears after this procedure\.\)
![\[The Step 4: Deploy page for an Amazon S3 deploy action with an Amazon S3 source\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-stage-s3source.png)
1. \(Optional\) In **Canned ACL**, you can apply a set of predefined grants, known as a [canned ACL](https://docs.aws.amazon.com/AmazonS3/latest/dev/acl-overview.html#canned-acl), to the uploaded artifacts\.
1. \(Optional\) In **Cache control**, enter the caching parameters\. You can set this to control caching behavior for requests/responses\. For valid values, see the [Cache\-Control](http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9) header field for HTTP operations\.
1. Choose **Next**\.
1. In **Step 5: Review**, review the information, and then choose **Create pipeline**\.
![\[The completed pipeline for an Amazon S3 deploy action with an Amazon S3 source\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-pipeline-s3source.png)
1. After your pipeline runs successfully, view your bucket in the Amazon S3 console\. Verify that your deployed ZIP file is displayed in your target bucket under the `js-application` folder\. The JavaScript file contained in the ZIP file should be `index.js`\. The `index.js` file contains the following output:
```
var HelloGreeting = /** @class */ (function () {
function HelloGreeting() {
this.message = "Hello!";
}
return HelloGreeting;
}());
function greet(greeting) {
console.log(greeting.message);
}
var greeting = new HelloGreeting();
greet(greeting);
```
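For comparison with the first option, the deploy action configured in this procedure corresponds roughly to the following declaration, again a sketch in the pipeline\-structure style rather than exact wizard output\. The bucket and artifact names are placeholders, and `Extract` is `false` because the built ZIP file is archived rather than unpacked; run `get-pipeline` to confirm your actual configuration\.
```
{
  "Name": "Deploy",
  "ActionTypeId": {
    "Category": "Deploy",
    "Owner": "AWS",
    "Provider": "S3",
    "Version": "1"
  },
  "RunOrder": 1,
  "Configuration": {
    "BucketName": "my-target-bucket",
    "Extract": "false",
    "ObjectKey": "js-application/{datetime}.zip"
  },
  "InputArtifacts": [
    {
      "Name": "BuildArtifact"
    }
  ]
}
```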
Make a change to your source files and then upload them to your source bucket\. This triggers your pipeline to run\. View your target bucket and verify that the deployed output files are available in the `js-application` folder as shown:
![\[Sample ZIP download\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/tutorial-s3deploy-pipeline-appzip.png)
This walkthrough shows you how to use the AWS CloudFormation console to create infrastructure that includes a pipeline connected to an Amazon S3 source bucket\. In this tutorial, you use the provided sample template file to create your resource stack, which includes your source bucket, artifact store, pipeline, and change\-detection resources, such as your Amazon CloudWatch Events rule and CloudTrail trail\. After you create your resource stack in AWS CloudFormation, you can view your pipeline in the AWS CodePipeline console\. The pipeline is a two\-stage pipeline with an Amazon S3 source stage and a CodeDeploy deployment stage\.
**Prerequisites:**
You must have the following resources to use with the AWS CloudFormation sample template:
+ You must have created the Amazon EC2 instances and installed the CodeDeploy agent on them\. You must also have created a CodeDeploy application and deployment group\. Use the Amazon EC2 and CodeDeploy resources you created in [Tutorial: Create a simple pipeline \(CodeCommit repository\)](tutorials-simple-codecommit.md)\.
+ Choose the following links to download the sample AWS CloudFormation template files for creating a pipeline with an Amazon S3 source:
+ Download the sample template for your pipeline: [YAML](samples/codepipeline-s3-events-yaml.zip) \| [JSON](samples/codepipeline-s3-events-json.zip)
+ Download the sample template for your CloudTrail bucket and trail: [YAML](samples/codepipeline-s3-cloudtrail-yaml.zip) \| [JSON](samples/codepipeline-s3-cloudtrail-json.zip)
+ Unzip the files and place them on your local computer\.
+ Download the sample application from [SampleApp\_Linux\.zip](samples/SampleApp_Linux.zip)\.
Save the \.zip file on your local computer\. You upload the \.zip file after the stack is created\.
**Create your pipeline in AWS CloudFormation**
1. Open the AWS CloudFormation console, and choose **Create Stack**\.
1. In **Choose a template**, choose **Upload a template to Amazon S3**\. Choose **Browse**, and then select the template file from your local computer\. Choose **Next**\.
1. In **Stack name**, enter a name for your pipeline\. Parameters specified by the sample template are displayed\. Enter the following parameters:
1. In **ApplicationName**, enter the name of your CodeDeploy application\. You can replace the `DemoApplication` default name\.
1. In **BetaFleet**, enter the name of your CodeDeploy deployment group\. You can replace the `DemoFleet` default name\.
1. In **SourceObjectKey**, enter `SampleApp_Linux.zip`\. You upload this file to your bucket after the template creates the bucket and pipeline\.
1. Choose **Next**\. Accept the defaults on the following page, and then choose **Next**\.
1. In **Capabilities**, select **I acknowledge that AWS CloudFormation might create IAM resources**, and then choose **Create**\.
1. After your stack creation is complete, view the event list to check for any errors\.
**Troubleshooting**
The IAM user who is creating the pipeline in AWS CloudFormation might require additional permissions to create resources for the pipeline\. The following permissions are required in the IAM user's policy to allow AWS CloudFormation to create the required Amazon CloudWatch Events resources for the Amazon S3 pipeline:
```
{
"Effect": "Allow",
"Action": [
"events:PutRule",
"events:PutEvents",
"events:PutTargets",
"events:DeleteRule",
"events:RemoveTargets",
"events:DescribeRule"
],
"Resource": "*"
}
```
1. In AWS CloudFormation, in the **Resources** tab for your stack, view the resources that were created for your stack\.
Choose the S3 bucket with a `sourcebucket` label in the name, such as `s3-cfn-codepipeline-sourcebucket-y04EXAMPLE`\. Do not choose the pipeline artifact bucket\.
The source bucket is empty because the resource is newly created by AWS CloudFormation\. Open the Amazon S3 console and locate your `sourcebucket` bucket\. Choose **Upload**, and follow the instructions to upload your `SampleApp_Linux.zip` file\.
**Note**
When Amazon S3 is the source provider for your pipeline, you must upload to your bucket all source files packaged as a single \.zip file\. Otherwise, the source action fails\.
1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console\.aws\.amazon\.com/codepipeline/](https://console.aws.amazon.com/codepipeline/)\.
Under **Pipelines**, choose your pipeline, and then choose **View**\. The diagram shows your pipeline source and deployment stages\.
1. Complete the steps in the following procedure to create your AWS CloudTrail resources\.
**Create your AWS CloudTrail resources in AWS CloudFormation**
1. Open the AWS CloudFormation console, and choose **Create Stack**\.
1. In **Choose a template**, choose **Upload a template to Amazon S3**\. Choose **Browse**, and then select the template file for the AWS CloudTrail resources from your local computer\. Choose **Next**\.
1. In **Stack name**, enter a name for your resource stack\. Parameters specified by the sample template are displayed\. Enter the following parameters:
1. In **SourceObjectKey**, accept the default for the sample application's zip file\.
1. Choose **Next**\. Accept the defaults on the following page, and then choose **Next**\.
1. In **Capabilities**, select **I acknowledge that AWS CloudFormation might create IAM resources**, and then choose **Create**\.
1. After your stack creation is complete, view the event list to check for any errors\.
The following permissions are required in the IAM user's policy to allow AWS CloudFormation to create the required CloudTrail resources for the Amazon S3 pipeline:
```
{
"Effect": "Allow",
"Action": [
"cloudtrail:CreateTrail",
"cloudtrail:DeleteTrail",
"cloudtrail:StartLogging", | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials-cloudformation-s3.md |
54d0e4261a06-5 | "cloudtrail:DeleteTrail",
"cloudtrail:StartLogging",
"cloudtrail:StopLogging",
"cloudtrail:PutEventSelectors"
],
"Resource": "*"
}
```
1. Sign in to the AWS Management Console and open the CodePipeline console at [https://console\.aws\.amazon\.com/codepipeline/](https://console.aws.amazon.com/codepipeline/)\.
Under **Pipelines**, choose your pipeline, and then choose **View**\. The diagram shows your pipeline source and deployment stages\.
1. Upload a change to your source bucket \(for example, a new version of the `SampleApp_Linux.zip` file\)\. Your change\-detection resources pick up the change, and your pipeline starts\.
Triggers the pipeline when a new commit is made on the configured GitHub repository and branch\.
To integrate with GitHub, CodePipeline uses an OAuth application or a personal access token for your pipeline\. If you use the console to create or edit your pipeline, CodePipeline creates a GitHub webhook that starts your pipeline when a change occurs in the repository\.
You must have already created a GitHub account and repository before you connect the pipeline through a GitHub action\.
If you want to limit the access CodePipeline has to repositories, create a GitHub account and grant the account access only to those repositories you want to integrate with CodePipeline\. Use that account when you configure CodePipeline to use GitHub repositories for source stages in pipelines\.
For more information, see the [GitHub developer documentation](https://developer.github.com) on the GitHub website\.
**Topics**
+ [Action type](#action-reference-GitHub-type)
+ [Configuration parameters](#action-reference-GitHub-config)
+ [Input artifacts](#action-reference-GitHub-input)
+ [Output artifacts](#action-reference-GitHub-output)
+ [Output variables](#action-reference-GitHub-variables)
+ [Action declaration \(GitHub example\)](#action-reference-GitHub-example)
+ [Connecting to GitHub \(OAuth\)](#action-reference-GitHub-auth)
+ [See also](#action-reference-GitHub-links)
+ Category: `Source`
+ Owner: `ThirdParty`
+ Provider: `GitHub`
+ Version: `1`
**Owner**
Required: Yes
The name of the GitHub user or organization who owns the GitHub repository\.
**Repo**
Required: Yes
The name of the repository where source changes are to be detected\.
**Branch**
Required: Yes
The name of the branch where source changes are to be detected\.
**OAuthToken**
Required: Yes
Represents the GitHub authentication token that allows CodePipeline to perform operations on your GitHub repository\. The entry is always displayed as a mask of four asterisks\. It represents one of the following values:
+ When you use the console to create the pipeline, CodePipeline uses an OAuth token to register the GitHub connection\.
+ When you use the AWS CLI to create the pipeline, you can pass your GitHub personal access token in this field\. Replace the asterisks \(\*\*\*\*\) with your personal access token copied from GitHub\. When you run `get-pipeline` to view the action configuration, the four\-asterisk mask is displayed for this value\.
+ When you use an AWS CloudFormation template to create the pipeline, you must first store the token as a secret in AWS Secrets Manager\. You include the value for this field as a dynamic reference to the stored secret in Secrets Manager, such as `{{resolve:secretsmanager:MyGitHubSecret:SecretString:token}}`\.
For more information about GitHub authentication tokens for your pipeline, see [Connecting to GitHub \(OAuth\)](#action-reference-GitHub-auth)\. For more information about GitHub scopes, see the [GitHub Developer API Reference](https://developer.github.com/v3/oauth/#scopes) on the GitHub website\.
**PollForSourceChanges**
Required: No
`PollForSourceChanges` controls whether CodePipeline polls the GitHub repository for source changes\. We recommend that you use webhooks to detect source changes instead\. For more information about configuring webhooks, see [Update pipelines for push events \(GitHub source\) \(CLI\)](update-change-detection.md#update-change-detection-cli-github) or [Update pipelines for push events \(GitHub source\) \(AWS CloudFormation template\)](update-change-detection.md#update-change-detection-cfn-github)\.
If you intend to configure webhooks, you must set `PollForSourceChanges` to `false` to avoid duplicate pipeline executions\.
Valid values for this parameter:
+ `True`: If set, CodePipeline polls your repository for source changes\.
**Note**
If you omit `PollForSourceChanges`, CodePipeline defaults to polling your repository for source changes\. This behavior is the same as if `PollForSourceChanges` is set to `true`\.
+ `False`: If set, CodePipeline does not poll your repository for source changes\. Use this setting if you intend to configure a webhook to detect source changes\.
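To see how these parameters fit together, the following is a minimal sketch of a GitHub source action declaration in the JSON pipeline\-structure style used elsewhere in this guide\. The owner, repository, and artifact names are placeholders, and the `OAuthToken` value is shown masked, as it appears in `get-pipeline` output; see [Action declaration \(GitHub example\)](#action-reference-GitHub-example) for the documented example\.
```
{
  "Name": "Source",
  "ActionTypeId": {
    "Category": "Source",
    "Owner": "ThirdParty",
    "Provider": "GitHub",
    "Version": "1"
  },
  "RunOrder": 1,
  "Configuration": {
    "Owner": "my-github-org",
    "Repo": "my-repo",
    "Branch": "master",
    "OAuthToken": "****",
    "PollForSourceChanges": "false"
  },
  "OutputArtifacts": [
    {
      "Name": "SourceArtifact"
    }
  ]
}
```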