You can always edit a pipeline to change its functionality, but you might decide you want to delete it instead\. You can use the AWS CodePipeline console or the delete\-pipeline command in the AWS CLI to delete a pipeline\.
**Topics**
+ [Delete a pipeline \(console\)](#pipelines-delete-console)
+ [Delete a pipeline \(CLI\)](#pipelines-delete-cli)
**To delete a pipeline**
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
The names and status of all pipelines associated with your AWS account are displayed\.
1. In **Name**, choose the name of the pipeline you want to delete\.
1. On the pipeline details page, choose **Edit**\.
1. On the **Edit** page, choose **Delete**\.
1. Type **delete** in the field to confirm, and then choose **Delete**\.
**Important**
This action cannot be undone\.
To use the AWS CLI to manually delete a pipeline, use the [delete\-pipeline](http://docs.aws.amazon.com/cli/latest/reference/codepipeline/delete-pipeline.html) command\.
**Important**
Deleting a pipeline is irreversible\. There is no confirmation dialog box\. After the command is run, the pipeline is deleted, but none of the resources used in the pipeline are deleted\. This makes it easier to create a new pipeline that uses those resources to automate the release of your software\.
**To delete a pipeline**
1. Open a terminal \(Linux, macOS, or Unix\) or command prompt \(Windows\) and use the AWS CLI to run the delete\-pipeline command, specifying the name of the pipeline you want to delete\. For example, to delete a pipeline named *MyFirstPipeline*:
```
aws codepipeline delete-pipeline --name MyFirstPipeline
```
This command returns nothing\.
1. Delete any resources you no longer need\.
**Note**
Deleting a pipeline does not delete the resources used in the pipeline, such as the CodeDeploy or Elastic Beanstalk application you used to deploy your code, or, if you created your pipeline from the CodePipeline console, the Amazon S3 bucket CodePipeline created to store the artifacts of your pipelines\. Make sure that you delete resources that are no longer required so that you are not charged for them in the future\. For example, when you use the console to create a pipeline for the first time, CodePipeline creates one Amazon S3 bucket to store all artifacts for all of your pipelines\. If you have deleted all of your pipelines, follow the steps in [Deleting a Bucket](https://docs.aws.amazon.com/AmazonS3/latest/user-guide/DeletingaBucket.html)\.
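If you prefer to remove the artifact bucket from the command line, a sketch like the following empties the bucket and then deletes it. The bucket name shown is a placeholder; substitute the name of the artifact bucket CodePipeline created in your account, and run these commands only if no other pipelines still use that bucket.

```
aws s3 rm s3://codepipeline-us-east-2-1234567890123 --recursive
aws s3 rb s3://codepipeline-us-east-2-1234567890123
```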
The easiest way to create a pipeline is to use the **Create pipeline** wizard in the AWS CodePipeline console\.
In this tutorial, you create a two\-stage pipeline that uses a versioned S3 bucket and CodeDeploy to release a sample application\.
**Note**
When Amazon S3 is the source provider for your pipeline, you may zip your source file or files into a single \.zip and upload the \.zip to your source bucket\. You may also upload a single unzipped file; however, downstream actions that expect a \.zip file will fail\.
After you create this simple pipeline, you add another stage and then disable and enable the transition between stages\.
**Important**
Many of the actions you add to your pipeline in this procedure involve AWS resources that you need to create before you create the pipeline\. AWS resources for your source actions must always be created in the same AWS Region where you create your pipeline\. For example, if you create your pipeline in the US East \(Ohio\) Region, your CodeCommit repository must be in the US East \(Ohio\) Region\.
You can add cross\-region actions when you create your pipeline\. AWS resources for cross\-region actions must be in the same AWS Region where you plan to execute the action\. For more information, see [Add a cross\-Region action in CodePipeline](actions-create-cross-region.md)\.
Before you begin, you should complete the prerequisites in [Getting started with CodePipeline](getting-started-codepipeline.md)\.
**Topics**
+ [Step 1: Create an S3 bucket for your application](#s3-create-s3-bucket)
+ [Step 2: Create Amazon EC2 Windows instances and install the CodeDeploy agent](#S3-create-instances)
+ [Step 3: Create an application in CodeDeploy](#S3-create-deployment)
+ [Step 4: Create your first pipeline in CodePipeline](#s3-create-pipeline)
+ [\(Optional\) Step 5: Add another stage to your pipeline](#s3-add-stage)
+ [\(Optional\) Step 6: Disable and enable transitions between stages in CodePipeline](#s3-configure-transitions)
+ [Step 7: Clean up resources](#s3-clean-up)
You can store your source files or applications in any versioned location\. In this tutorial, you create an S3 bucket for the sample applications and enable versioning on that bucket\. After you have enabled versioning, you copy the sample applications to that bucket\.
**To create an S3 bucket**
1. Sign in to the AWS Management Console and open the Amazon S3 console at [https://console\.aws\.amazon\.com/s3/](https://console.aws.amazon.com/s3/)\.
1. Choose **Create bucket**\.
1. In **Bucket name**, enter a name for your bucket \(for example, **awscodepipeline\-demobucket\-example\-date**\)\.
**Note**
Because all bucket names in Amazon S3 must be unique, use one of your own, not the name shown in the example\. You can change the example name just by adding the date to it\. Make a note of this name because you need it for the rest of this tutorial\.
In **Region**, choose the Region where you intend to create your pipeline, such as **US West \(Oregon\)**, and then choose **Create bucket**\.
1. After the bucket is created, a success banner displays\. Choose **Go to bucket details**\.
1. On the **Properties** tab, choose **Versioning**\. Choose **Enable versioning**, and then choose **Save**\.
When versioning is enabled, Amazon S3 saves every version of every object in the bucket\.
1. On the **Permissions** tab, leave the defaults\. For more information about S3 bucket and object permissions, see [Specifying Permissions in a Policy](https://docs.aws.amazon.com/AmazonS3/latest/dev/using-with-s3-actions.html)\.
1. Next, download a sample and save it into a folder or directory on your local computer\.
1. Choose one of the following\. Choose `SampleApp_Windows.zip` if you want to follow the steps in this tutorial for Windows Server instances\.
+ If you want to deploy to Amazon Linux instances using CodeDeploy, download the sample application here: [SampleApp\_Linux\.zip](samples/SampleApp_Linux.zip)\.
+ If you want to deploy to Windows Server instances using CodeDeploy, download the sample application here: [SampleApp\_Windows\.zip](samples/SampleApp_Windows.zip)\.
1. Download the compressed \(zipped\) file\. Do not unzip the file\.
1. In the Amazon S3 console, for your bucket, upload the file:
1. Choose **Upload**\.
1. Drag and drop the file or choose **Add files** and browse for the file\.
1. Choose **Upload**\.
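If you prefer the AWS CLI, a sketch of the same steps follows: create the bucket, enable versioning, and upload the sample. The bucket name and Region are placeholders from this tutorial; substitute your own unique bucket name and the Region where you plan to create your pipeline.

```
aws s3 mb s3://awscodepipeline-demobucket-example-date --region us-west-2
aws s3api put-bucket-versioning --bucket awscodepipeline-demobucket-example-date \
    --versioning-configuration Status=Enabled
aws s3 cp SampleApp_Windows.zip s3://awscodepipeline-demobucket-example-date/SampleApp_Windows.zip
```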
**Note**
This tutorial provides sample steps for creating Amazon EC2 Windows instances\. For sample steps to create Amazon EC2 Linux instances, see [Step 3: Create an EC2 Linux instance and install the CodeDeploy agent](tutorials-simple-codecommit.md#codecommit-create-deployment)\. When prompted for the number of instances to create, specify **2** instances\.
In this step, you create the Windows Server Amazon EC2 instances to which you will deploy a sample application\. As part of this process, you install the CodeDeploy agent on the instances\. The CodeDeploy agent is a software package that enables an instance to be used in CodeDeploy deployments\.
**To create an instance role**
1. Open the IAM console at [https://console\.aws\.amazon\.com/iam/](https://console.aws.amazon.com/iam/)\.
1. From the console dashboard, choose **Roles**\.
1. Choose **Create role**\.
1. Under **Select type of trusted entity**, select **AWS service**\. Under **Choose a use case**, select **EC2**, and then choose **Next: Permissions**\.
1. Search for and select the policy named **AmazonEC2RoleforAWSCodeDeploy**, and then choose **Next: Tags**\.
1. Choose **Next: Review**\. Enter a name for the role \(for example, **EC2InstanceRole**\)\.
**Note**
Make a note of your role name for the next step\. You choose this role when you are creating your instance\.
Choose **Create role**\.
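If you prefer to create the same role with the AWS CLI, a hedged sketch follows. The file name `ec2-trust.json` and the role name **EC2InstanceRole** are assumptions for this example, and the managed policy ARN is the usual location for **AmazonEC2RoleforAWSCodeDeploy**; verify it in your account before relying on it. Unlike the console, the CLI requires that you also create an instance profile for the role.

```
# Trust policy that lets EC2 instances assume the role (written to a local file)
cat > ec2-trust.json <<'EOF'
{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Principal": { "Service": "ec2.amazonaws.com" }, "Action": "sts:AssumeRole" }
  ]
}
EOF

# Create the role, attach the managed policy, and expose the role as an instance profile
aws iam create-role --role-name EC2InstanceRole --assume-role-policy-document file://ec2-trust.json
aws iam attach-role-policy --role-name EC2InstanceRole \
    --policy-arn arn:aws:iam::aws:policy/service-role/AmazonEC2RoleforAWSCodeDeploy
aws iam create-instance-profile --instance-profile-name EC2InstanceRole
aws iam add-role-to-instance-profile --instance-profile-name EC2InstanceRole --role-name EC2InstanceRole
```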
**To launch instances**
1. Open the Amazon EC2 console at [https://console\.aws\.amazon\.com/ec2/](https://console.aws.amazon.com/ec2/)\.
1. From the console dashboard, choose **Launch instance**, and select **Launch instance** from the options that pop up\.
1. On the **Step 1: Choose an Amazon Machine Image \(AMI\)** page, locate the **Microsoft Windows Server 2019 Base** option, and then choose **Select**\. \(This AMI is labeled "Free tier eligible" and can be found at the top of the list\.\)
1. On the **Step 2: Choose an Instance Type** page, choose the free tier eligible `t2.micro` type as the hardware configuration for your instance, and then choose **Next: Configure Instance Details**\.
1. On the **Step 3: Configure Instance Details** page, do the following:
+ In **Number of instances**, enter `2`\.
+ In **Auto\-assign Public IP**, choose **Enable**\.
+ In **IAM role**, choose the IAM role you created in the previous procedure \(for example, **EC2InstanceRole**\)\.
+ Expand **Advanced Details**, and in **User data**, with **As text** selected, enter the following:
```
<powershell>
New-Item -Path c:\temp -ItemType "directory" -Force
powershell.exe -Command Read-S3Object -BucketName bucket-name/latest -Key codedeploy-agent.msi -File c:\temp\codedeploy-agent.msi
Start-Process -Wait -FilePath c:\temp\codedeploy-agent.msi -WindowStyle Hidden
</powershell>
```
*bucket\-name* is the name of the S3 bucket that contains the CodeDeploy Resource Kit files for your Region\. For example, for the US West \(Oregon\) Region, replace *bucket\-name* with `aws-codedeploy-us-west-2`\. For a list of bucket names, see [Resource Kit Bucket Names by Region](https://docs.aws.amazon.com/codedeploy/latest/userguide/resource-kit.html#resource-kit-bucket-names)\.
This code installs the CodeDeploy agent on your instance as it is created\. This script is written for Windows instances only\.
+ Leave the rest of the items on the **Step 3: Configure Instance Details** page unchanged\. Choose **Next: Add Storage**\.
1. Leave the **Step 4: Add Storage** page unchanged, and then choose **Next: Add Tags**\.
1. On the **Add Tags** page, choose **Add Tag**\. Enter **Name** in the **Key** field, enter `MyCodePipelineDemo` in the **Value** field, and then choose **Next: Configure Security Group**\.
**Important**
The **Key** and **Value** boxes are case sensitive\.
1. On the **Configure Security Group** page, allow port 80 communication so you can access the public instance endpoint\.
1. Choose **Review and Launch**\.
1. On the **Review Instance Launch** page, choose **Launch**\. When prompted for a key pair, choose **Proceed without a key pair**\.
**Note**
For the purposes of this tutorial, you can proceed without a key pair\. To use SSH to connect to your instances, create or use a key pair\.
When you are ready, select the acknowledgment check box, and then choose **Launch Instances**\.
1. Choose **View Instances** to close the confirmation page and return to the console\.
1. You can view the status of the launch on the **Instances** page\. When you launch an instance, its initial state is `pending`\. After the instance starts, its state changes to `running`, and it receives a public DNS name\. \(If the **Public DNS** column is not displayed, choose the **Show/Hide** icon, and then select **Public DNS**\.\)
1. It can take a few minutes for the instance to be ready for you to connect to it\. Check that your instance has passed its status checks\. You can view this information in the **Status Checks** column\.
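If you want to confirm from the command line that both instances are running, a sketch like the following filters on the **Name** tag used in this tutorial; the `--query` expression is only one way to shape the output.

```
aws ec2 describe-instances \
    --filters "Name=tag:Name,Values=MyCodePipelineDemo" "Name=instance-state-name,Values=running" \
    --query "Reservations[].Instances[].[InstanceId,PublicDnsName,State.Name]" \
    --output table
```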
In CodeDeploy, an *application* is an identifier, in the form of a name, for the code you want to deploy\. CodeDeploy uses this name to ensure the correct combination of revision, deployment configuration, and deployment group are referenced during a deployment\. You select the name of the CodeDeploy application you create in this step when you create your pipeline later in this tutorial\.
**To create an application in CodeDeploy**
1. Open the CodeDeploy console at [https://console\.aws\.amazon\.com/codedeploy](https://console.aws.amazon.com/codedeploy)\.
1. If the **Applications** page does not appear, on the AWS CodeDeploy menu, choose **Applications**\.
1. Choose **Create application**\.
1. In **Application name**, enter `MyDemoApplication`\.
1. In **Compute Platform**, choose **EC2/On\-premises**\.
1. Choose **Create application**\.
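The equivalent AWS CLI call is a one-liner; this is a sketch, and `--compute-platform Server` corresponds to the **EC2/On-premises** option shown in the console.

```
aws deploy create-application --application-name MyDemoApplication --compute-platform Server
```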
**To create a deployment group in CodeDeploy**
1. On the page that displays your application, choose **Create deployment group**\.
1. In **Deployment group name**, enter **MyDemoDeploymentGroup**\.
1. In **Service Role**, choose a service role that trusts AWS CodeDeploy with, at minimum, the trust and permissions described in [Create a Service Role for CodeDeploy](https://docs.aws.amazon.com/codedeploy/latest/userguide/getting-started-create-service-role.html)\. To get the service role ARN, see [Get the Service Role ARN \(Console\)](https://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-create-service-role.html#getting-started-get-service-role-console)\.
1. Under **Deployment type**, choose **In\-place**\.
1. Under **Environment configuration**, choose **Amazon EC2 Instances**\. Choose **Name** in the **Key** field, and in the **Value** field, enter **MyCodePipelineDemo**\.
**Important**
You must choose the same value for the **Name** key here that you assigned to your EC2 instances when you created them\. If you tagged your instances with something other than **MyCodePipelineDemo**, be sure to use it here\.
1. Under **Deployment configuration**, choose `CodeDeployDefault.OneAtATime`\.
1. Under **Load Balancer**, clear **Enable load balancing**\. You do not need to set up a load balancer or choose a target group for this example\.
1. In the **Advanced** section, leave the defaults\.
1. Choose **Create deployment group**\.
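If you script this step instead, a hedged sketch follows. The service role ARN is a placeholder for the CodeDeploy service role in your account, and the tag filter matches the **Name**/**MyCodePipelineDemo** tag applied to the instances earlier.

```
aws deploy create-deployment-group \
    --application-name MyDemoApplication \
    --deployment-group-name MyDemoDeploymentGroup \
    --deployment-config-name CodeDeployDefault.OneAtATime \
    --ec2-tag-filters Key=Name,Value=MyCodePipelineDemo,Type=KEY_AND_VALUE \
    --service-role-arn arn:aws:iam::111122223333:role/CodeDeployServiceRole
```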
In this part of the tutorial, you create the pipeline\. The sample runs automatically through the pipeline\.
**To create a CodePipeline automated release process**
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
1. On the **Welcome** page, **Getting started** page, or the **Pipelines** page, choose **Create pipeline**\.
1. In **Step 1: Choose pipeline settings**, in **Pipeline name**, enter **MyFirstPipeline**\.
**Note**
If you choose another name for your pipeline, be sure to use that name instead of **MyFirstPipeline** for the rest of this tutorial\. After you create a pipeline, you cannot change its name\. Pipeline names are subject to some limitations\. For more information, see [Quotas in AWS CodePipeline](limits.md)\.
1. In **Service role**, do one of the following:
+ Choose **New service role** to allow CodePipeline to create a new service role in IAM\. In **Role name**, the role and policy name both default to this format: AWSCodePipelineServiceRole\-*region*\-*pipeline\_name*\. For example, this is the service role created for this tutorial: AWSCodePipelineServiceRole\-eu\-west\-2\-MyFirstPipeline\.
+ Choose **Existing service role** to use a service role already created in IAM\. In **Role name**, choose your service role from the list\.
1. Leave the settings under **Advanced settings** at their defaults, and then choose **Next**\.
1. In **Step 2: Add source stage**, in **Source provider**, choose **Amazon S3**\. In **Bucket**, enter the name of the S3 bucket you created in [Step 1: Create an S3 bucket for your application](#s3-create-s3-bucket)\. In **S3 object key**, enter the object key with or without a file path, and remember to include the file extension\. For example, for `SampleApp_Windows.zip`, enter the sample file name as shown in this example:
```
SampleApp_Windows.zip
```
Choose **Next step**\.
![\[Source stage settings in the Create pipeline wizard\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-wizard-source-pol.png)
Under **Change detection options**, leave the defaults\. This allows CodePipeline to use Amazon CloudWatch Events to detect changes in your source bucket\.
Choose **Next**\.
1. In **Step 3: Add build stage**, choose **Skip build stage**, and then accept the warning message by choosing **Skip** again\. Choose **Next**\.
1. In **Step 4: Add deploy stage**, in **Deploy provider**, choose **AWS CodeDeploy**\. The **Region** field defaults to the same AWS Region as your pipeline\. In **Application name**, enter `MyDemoApplication`, or choose the **Refresh** button, and then choose the application name from the list\. In **Deployment group**, enter **MyDemoDeploymentGroup**, or choose it from the list, and then choose **Next**\.
![\[Deploy stage settings in the Create pipeline wizard\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-wizard-deploy-pol.png)
**Note**
The name Deploy is the name given by default to the stage created in the **Step 4: Add deploy stage** step, just as Source is the name given to the first stage of the pipeline\.
1. In **Step 5: Review**, review the information, and then choose **Create pipeline**\.
1. The pipeline starts to run\. You can view progress and success and failure messages as the CodePipeline sample deploys a webpage to each of the Amazon EC2 instances in the CodeDeploy deployment\.
Congratulations\! You just created a simple pipeline in CodePipeline\. The pipeline has two stages:
+ A source stage named **Source**, which detects changes in the versioned sample application stored in the S3 bucket and pulls those changes into the pipeline\.
+ A **Deploy** stage that deploys those changes to EC2 instances with CodeDeploy\.
Now, verify the results\.
**To verify your pipeline ran successfully**
1. View the initial progress of the pipeline\. The status of each stage changes from **No executions yet** to **In Progress**, and then to either **Succeeded** or **Failed**\. The pipeline should complete the first run within a few minutes\.
1. After **Succeeded** is displayed for the action status, in the status area for the **Deploy** stage, choose **Details**\. This opens the AWS CodeDeploy console\.
1. In the **Deployment group** tab, under **Deployment lifecycle events**, choose an instance ID\. This opens the EC2 console\.
1. On the **Description** tab, in **Public DNS**, copy the address, and then paste it into the address bar of your web browser\. View the index page for the sample application you uploaded to your S3 bucket\.
The following page is the sample application you uploaded to your S3 bucket\.
![\[Success page for the deployed sample application\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-demo-success-message.png)
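You can also watch the run from the command line. The following sketch summarizes the status of each stage; the `--query` expression is just one way to trim the output.

```
aws codepipeline get-pipeline-state --name MyFirstPipeline \
    --query "stageStates[].[stageName,latestExecution.status]" --output table
```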
For more information about stages, actions, and how pipelines work, see [CodePipeline concepts](concepts.md)\.
Now add another stage to the pipeline to deploy from staging servers to production servers using CodeDeploy\. First, you create another deployment group in the MyDemoApplication application in CodeDeploy\. Then you add a stage that includes an action that uses this deployment group\. To add the stage, you can use the CodePipeline console, or you can use the AWS CLI to retrieve and manually edit the structure of the pipeline in a JSON file and then run the update\-pipeline command to update the pipeline with your changes\.
**Topics**
+ [Create a second deployment group in CodeDeploy](#s3-add-stage-part-1)
+ [Add the deployment group as another stage in your pipeline](#s3-add-stage-part-2)
**Note**
In this part of the tutorial, you create a second deployment group, but deploy to the same Amazon EC2 instances as before\. This is for demonstration purposes only\. It is purposely designed to fail to show you how errors are displayed in CodePipeline\.
**To create a second deployment group in CodeDeploy**
1. Open the CodeDeploy console at [https://console\.aws\.amazon\.com/codedeploy](https://console.aws.amazon.com/codedeploy)\.
1. Choose **Applications**, and in the list of applications, choose `MyDemoApplication`\.
1. Choose the **Deployment groups** tab, and then choose **Create deployment group**\.
1. On the **Create deployment group** page, in **Deployment group name**, enter a name for the second deployment group \(for example, **CodePipelineProductionFleet**\)\.
1. In **Service Role**, choose the same CodeDeploy service role you used for the initial deployment \(not the CodePipeline service role\)\.
1. Under **Deployment type**, choose **In\-place**\.
1. Under **Environment configuration**, choose **Amazon EC2 Instances**\. Choose **Name** in the **Key** box, and in the **Value** box, choose `MyCodePipelineDemo` from the list\. Leave the default configuration for **Deployment settings**\.
1. Under **Deployment configuration**, choose `CodeDeployDefault.OneAtATime`\.
1. Under **Load Balancer**, clear **Enable load balancing**\.
1. Choose **Create deployment group**\.
Now that you have another deployment group, you can add a stage that uses this deployment group to deploy to the same EC2 instances you used earlier\. You can use the CodePipeline console or the AWS CLI to add this stage\.
**Topics**
+ [Create a third stage \(console\)](#s3-add-stage-part-2-console)
+ [Create a third stage \(CLI\)](#s3-add-stage-part-2-cli)
You can use the CodePipeline console to add a new stage that uses the new deployment group\. Because this deployment group is deploying to the EC2 instances you've already used, the deploy action in this stage fails\.
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
1. In **Name**, choose the name of the pipeline you created, MyFirstPipeline\.
1. On the pipeline details page, choose **Edit**\.
1. On the **Edit** page, choose **\+ Add stage** to add a stage immediately after the Deploy stage\.
![\[Edit page for the pipeline\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/edit-pipeline-console-pol.png)
1. In **Add stage**, in **Stage name**, enter **Production**\. Choose **Add stage**\.
1. In the new stage, choose **\+ Add action group**\.
1. In **Edit action**, in **Action name**, enter **Deploy\-Second\-Deployment**\. In **Action provider**, under **Deploy**, choose **AWS CodeDeploy**\.
1. In the CodeDeploy section, in **Application name**, choose `MyDemoApplication` from the drop\-down list, as you did when you created the pipeline\. In **Deployment group**, choose the deployment group you just created, **CodePipelineProductionFleet**\. In **Input artifacts**, choose the input artifact from the source action\. Choose **Save**\.
1. On the **Edit** page, choose **Save**\. In **Save pipeline changes**, choose **Save**\.
1. Although the new stage has been added to your pipeline, a status of **No executions yet** is displayed because no changes have triggered another run of the pipeline\. You must manually rerun the last revision to see how the edited pipeline runs\. On the pipeline details page, choose **Release change**, and then choose **Release** when prompted\. This runs the most recent revision available in each source location specified in a source action through the pipeline\.
Alternatively, to use the AWS CLI to rerun the pipeline, from a terminal on your local Linux, macOS, or Unix machine, or a command prompt on your local Windows machine, run the start\-pipeline\-execution command, specifying the name of the pipeline\. This runs the application in your source bucket through the pipeline for a second time\.
```
aws codepipeline start-pipeline-execution --name MyFirstPipeline
```
This command returns a `pipelineExecutionId` object\.
1. Return to the CodePipeline console and in the list of pipelines, choose **MyFirstPipeline** to open the view page\.
The pipeline shows three stages and the state of the artifact running through those three stages\. It might take up to five minutes for the pipeline to run through all stages\. You see the deployment succeeds on the first two stages, just as before, but the **Production** stage shows the **Deploy\-Second\-Deployment** action failed\.
![\[Pipeline view showing the failed Deploy-Second-Deployment action in the Production stage\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-failed-third-stage.png)
1. In the **Deploy\-Second\-Deployment** action, choose **Details**\. You are redirected to the page for the CodeDeploy deployment\. In this case, the failure is the result of the first instance group deploying to all of the EC2 instances, leaving no instances for the second deployment group\.
**Note**
This failure is by design, to demonstrate what happens when there is a failure in a pipeline stage\.
Although using the AWS CLI to add a stage to your pipeline is more complex than using the console, it provides more visibility into the structure of the pipeline\.
**To create a third stage for your pipeline**
1. Open a terminal session on your local Linux, macOS, or Unix machine, or a command prompt on your local Windows machine, and run the get\-pipeline command to display the structure of the pipeline you just created\. For **MyFirstPipeline**, you would type the following command:
```
aws codepipeline get-pipeline --name "MyFirstPipeline"
```
This command returns the structure of MyFirstPipeline\. The first part of the output should look similar to the following:
```
{
"pipeline": {
"roleArn": "arn:aws:iam::80398EXAMPLE:role/AWS-CodePipeline-Service",
"stages": [
...
```
The final part of the output includes the pipeline metadata and should look similar to the following:
```
...
],
"artifactStore": {
"type": "S3"
"location": "codepipeline-us-east-2-250656481468",
},
"name": "MyFirstPipeline",
"version": 4 | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials-simple-s3.md |
ef5f87f7d4a2-1 | },
"name": "MyFirstPipeline",
"version": 4
},
"metadata": {
"pipelineArn": "arn:aws:codepipeline:us-east-2:80398EXAMPLE:MyFirstPipeline",
"updated": 1501626591.112,
"created": 1501626591.112
}
}
```
1. Copy and paste this structure into a plain\-text editor, and save the file as **pipeline\.json**\. For convenience, save this file in the same directory where you run the aws codepipeline commands\.
**Note**
You can pipe the JSON directly into a file with the get\-pipeline command as follows:
```
aws codepipeline get-pipeline --name MyFirstPipeline >pipeline.json
```
1. Copy the **Deploy** stage section and paste it after the first two stages\. Because it is a deploy stage, just like the **Deploy** stage, you use it as a template for the third stage\.
1. Change the name of the stage and the deployment group details\.
The following example shows the JSON you add to the pipeline\.json file after the **Deploy** stage\. Edit the emphasized elements with new values\. Remember to include a comma to separate the **Deploy** and **Production** stage definitions\.
```
,
{
"name": "Production",
"actions": [
{
"inputArtifacts": [
{
"name": "MyApp"
}
],
"name": "Deploy-Second-Deployment",
"actionTypeId": {
"category": "Deploy",
"owner": "AWS",
"version": "1",
"provider": "CodeDeploy"
},
"outputArtifacts": [],
"configuration": {
"ApplicationName": "CodePipelineDemoApplication",
"DeploymentGroupName": "CodePipelineProductionFleet"
},
"runOrder": 1
}
]
}
```
1. If you are working with the pipeline structure retrieved using the get\-pipeline command, you must remove the `metadata` lines from the JSON file\. Otherwise, the update\-pipeline command cannot use it\. Remove the `"metadata": { }` lines and the `"created"`, `"pipelineArn"`, and `"updated"` fields\. \(A sketch of an alternative that strips the metadata automatically appears after this procedure\.\)
For example, remove the following lines from the structure:
```
"metadata": { | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials-simple-s3.md |
ef5f87f7d4a2-3 | For example, remove the following lines from the structure:
```
"metadata": {
"pipelineArn": "arn:aws:codepipeline:region:account-ID:pipeline-name",
"created": "date",
"updated": "date"
}
```
Save the file\.
1. Run the update\-pipeline command, specifying the pipeline JSON file, similar to the following:
```
aws codepipeline update-pipeline --cli-input-json file://pipeline.json
```
This command returns the entire structure of the updated pipeline\.
**Important**
Be sure to include `file://` before the file name\. It is required in this command\.
1. Run the start\-pipeline\-execution command, specifying the name of the pipeline\. This runs the application in your source bucket through the pipeline for a second time\.
```
aws codepipeline start-pipeline-execution --name MyFirstPipeline
```
This command returns a `pipelineExecutionId` object\.
1. Open the CodePipeline console and choose **MyFirstPipeline** from the list of pipelines\.
The pipeline shows three stages and the state of the artifact running through those three stages\. It might take up to five minutes for the pipeline to run through all stages\. Although the deployment succeeds on the first two stages, just as before, the **Production** stage shows that the **Deploy\-Second\-Deployment** action failed\.
![\[Pipeline view showing the failed Deploy-Second-Deployment action in the Production stage\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-failed-third-stage.png)
1. In the **Deploy\-Second\-Deployment** action, choose **Details** to see details of the failure\. You are redirected to the details page for the CodeDeploy deployment\. In this case, the failure is the result of the first instance group deploying to all of the EC2 instances, leaving no instances for the second deployment group\.
**Note**
This failure is by design, to demonstrate what happens when there is a failure in a pipeline stage\.
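As an alternative to deleting the `metadata` section by hand (the step in the preceding procedure about removing the `metadata` lines), you can strip it when you retrieve the pipeline and then pass the result to the `--pipeline` parameter instead of `--cli-input-json`. This is a sketch; the global `--query` option filters the CLI output to just the `pipeline` structure.

```
aws codepipeline get-pipeline --name MyFirstPipeline --query 'pipeline' > pipeline.json
# ...edit pipeline.json to add the Production stage as shown earlier...
aws codepipeline update-pipeline --pipeline file://pipeline.json
```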
You can enable or disable the transition between stages in a pipeline\. Disabling the transition between stages allows you to manually control transitions between one stage and another\. For example, you might want to run the first two stages of a pipeline, but disable transitions to the third stage until you are ready to deploy to production, or while you troubleshoot a problem or failure with that stage\.
**To disable and enable transitions between stages in a CodePipeline pipeline**
1. Open the CodePipeline console and choose **MyFirstPipeline** from the list of pipelines\.
1. On the details page for the pipeline, choose the **Disable transition** button between the second stage \(**Deploy**\) and the third stage that you added in the previous section \(**Production**\)\.
1. In **Disable transition**, enter a reason for disabling the transition between the stages, and then choose **Disable**\.
The arrow between stages displays an icon and color change, and the **Enable transition** button\.
![\[Pipeline view showing a disabled transition between the Deploy and Production stages\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-disabled-transition-pol.png)
1. Upload your sample again to the S3 bucket\. Because the bucket is versioned, this change starts the pipeline\. For information, see [](#getting-started-upload-s3)\.
1. Return to the details page for your pipeline and watch the status of the stages\. The pipeline view changes to show progress and success on the first two stages, but no changes occur on the third stage\. This process might take a few minutes\.
1. Enable the transition by choosing the **Enable transition** button between the two stages\. In the **Enable transition** dialog box, choose **Enable**\. The stage starts running in a few minutes and attempts to process the artifact that has already been run through the first two stages of the pipeline\.
**Note**
If you want this third stage to succeed, edit the CodePipelineProductionFleet deployment group before you enable the transition, and specify a different set of EC2 instances where the application is deployed\. For more information about how to do this, see [Change deployment group settings](http://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-change-deployment-group-settings.html)\. If you create more EC2 instances, you might incur additional costs\.
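You can also disable and enable the same transition from the AWS CLI. The following sketch targets the **Production** stage added in this tutorial; the reason text is only an example.

```
aws codepipeline disable-stage-transition --pipeline-name MyFirstPipeline \
    --stage-name Production --transition-type Inbound \
    --reason "Pausing deployments while troubleshooting"
aws codepipeline enable-stage-transition --pipeline-name MyFirstPipeline \
    --stage-name Production --transition-type Inbound
```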
You can use some of the resources you created in this tutorial for the [Tutorial: Create a four\-stage pipeline](tutorials-four-stage-pipeline.md)\. For example, you can reuse the CodeDeploy application and deployment\. You can configure a build action with a provider such as CodeBuild, which is a fully managed build service in the cloud\. You can also configure a build action that uses a provider with a build server or system, such as Jenkins\.
However, after you complete this and any other tutorials, you should delete the pipeline and the resources it uses, so that you are not charged for the continued use of those resources\. First, delete the pipeline, then the CodeDeploy application and its associated EC2 instances, and finally, the S3 bucket\.
**To clean up the resources used in this tutorial**
1. To clean up your CodePipeline resources, follow the instructions in [Delete a pipeline in AWS CodePipeline](pipelines-delete.md)\.
1. To clean up your CodeDeploy resources, follow the instructions in [To clean up resources \(console\)](https://docs.aws.amazon.com/codedeploy/latest/userguide/tutorials-wordpress-clean-up.html#tutorials-wordpress-clean-up-console)\.
1. To delete the S3 bucket, follow the instructions in [Deleting or emptying a bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/delete-or-empty-bucket.html)\. If you do not intend to create more pipelines, delete the S3 bucket created for storing your pipeline artifacts\. For more information about this bucket, see [CodePipeline concepts](concepts.md)\.
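If you prefer to clean up from the command line, a hedged sketch follows. The instance IDs are placeholders for the two instances you launched, and `aws s3 rb --force` empties the bucket before removing it; double-check each name before you run these commands.

```
aws codepipeline delete-pipeline --name MyFirstPipeline
aws deploy delete-deployment-group --application-name MyDemoApplication --deployment-group-name MyDemoDeploymentGroup
aws deploy delete-application --application-name MyDemoApplication
aws ec2 terminate-instances --instance-ids i-1234567890abcdef0 i-0abcdef1234567890
aws s3 rb s3://awscodepipeline-demobucket-example-date --force
```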
This section is a reference only\. For information about creating variables, see [Working with variables](actions-variables.md)\.
Variables allow you to configure your pipeline actions with values that are determined at the time of the action execution\. Variables can be produced by an action execution or be implicitly available at the start of each pipeline execution\.
Some action providers produce a defined set of variables\. You choose from default variable keys for that action provider, such as commit ID\.
To see step\-by\-step examples of using variables:
+ For a tutorial with a Lambda action that uses variables from an upstream action \(CodeCommit\) and generates output variables, see [Tutorial: Using variables with Lambda invoke actions](tutorials-lambda-variables.md)\.
+ For a tutorial with a AWS CloudFormation action that references stack output variables from an upstream CloudFormation action, see [Tutorial: Create a pipeline that uses variables from AWS CloudFormation deployment actions](tutorials-cloudformation-action.md)\.
+ For an example manual approval action with message text that references output variables that resolve to the CodeCommit commit ID and commit message, see [Example: Use variables in manual approvals](actions-variables.md#actions-variables-examples-approvals)\.
+ For an example CodeBuild action with an environment variable that resolves to the GitHub branch name, see [Example: Use a BranchName variable with CodeBuild environment variables](actions-variables.md#actions-variables-examples-env-branchname)\.
**Variable Limits**
For limit information, see [Quotas in AWS CodePipeline](limits.md)\.
**Note**
When you enter output variable syntax in the action configuration fields, do not exceed the 1000\-character limit for the configuration fields\. A validation error is returned when this limit is exceeded\.
**Topics**
+ [Concepts](#reference-variables-concepts)
+ [Configuring variables](#reference-variables-workflow)
+ [Variable resolution](#reference-variables-resolution)
+ [Rules for variables](#reference-variables-rules)
+ [Variables available for pipeline actions](#reference-variables-list)
This section lists key terms and concepts related to variables and namespaces\.
Variables are key\-value pairs that can be used to dynamically configure actions in your pipeline\. There are currently two ways these variables are made available:
+ There is a set of variables that are implicitly available at the start of each pipeline execution\. This set currently includes `PipelineExecutionId`, the ID of the current pipeline execution\.
+ There are action types that produce sets of variables when they are executed\. You can see the variables produced by an action by inspecting the `outputVariables` field that is part of the [ListActionExecutions](https://docs.aws.amazon.com/codepipeline/latest/APIReference/API_ListActionExecutions.html) API\. To see which variables each action type produces, see the CodePipeline [Action structure reference](action-reference.md)\.
To reference these variables in your action configuration, you must use the variable reference syntax with the correct namespace\.
For a list of available key names by action provider, see [Variables available for pipeline actions](#reference-variables-list)\. For an example variable workflow, see [Configuring variables](#reference-variables-workflow)\.
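To see the variables an action produced in a specific run, you can call the API from the AWS CLI. This is a sketch; the execution ID is a placeholder taken from your own pipeline's history, and the `--query` expression is only one way to trim the output.

```
aws codepipeline list-action-executions --pipeline-name MyFirstPipeline \
    --filter pipelineExecutionId=0ab1c2d3-example-execution-id \
    --query "actionExecutionDetails[].[actionName,output.outputVariables]"
```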
To ensure that variables can be uniquely referenced, they must be assigned to a namespace\. After you have a set of variables assigned to a namespace, they can be referenced in an action configuration by using the namespace and variable key with the following syntax:
```
#{namespace.variable_key}
```
There are two types of namespaces under which variables can be assigned:
+ **The codepipeline reserved namespace**
This is the namespace assigned to the set of implicit variables available at the start of each pipeline execution\. This namespace is `codepipeline`\. Example variable reference:
```
#{codepipeline.PipelineExecutionId}
```
+ **Action assigned namespace**
This is a namespace that you assign to an action\. All variables produced by the action fall under this namespace\. To make the variables produced by an action available for use in a downstream action configuration, you must configure the producing action with a namespace\. Namespaces must be unique across the pipeline definition and cannot conflict with any artifact names\. Here is an example variable reference for an action configured with a namespace of `SourceVariables`\.
```
#{SourceVariables.VersionId}
```
You configure an action to produce variables by declaring a namespace for the action\. The action must already be one of the action providers that generates variables\. Otherwise, the only variables available are the pipeline\-level variables\.
You declare the namespace either by:
+ On the **Edit action** page of the console, entering a namespace in **Variable namespace**\.
+ Entering a namespace in the `namespace` parameter field in the JSON pipeline structure\.
In this example, you add the `namespace` parameter to the CodeCommit source action with the name `SourceVariables`\. This configures the action to produce the variables available for that action provider, such as `CommitId`\.
```
{
"name": "Source",
"actions": [
{
"outputArtifacts": [
{
"name": "SourceArtifact"
}
],
"name": "Source",
"namespace": "SourceVariables",
"configuration": {
"RepositoryName": "MyRepo",
"BranchName": "mainline",
"PollForSourceChanges": "false"
},
"inputArtifacts": [],
"region": "us-west-2",
"actionTypeId": { | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/reference-variables.md |
c6f17c775451-1 | "inputArtifacts": [],
"region": "us-west-2",
"actionTypeId": {
"provider": "CodeCommit",
"category": "Source",
"version": "1",
"owner": "AWS"
},
"runOrder": 1
}
]
},
```
Next, you configure the downstream action to use the variables produced by the previous action\. You do this by:
+ On the **Edit action** page of the console, entering the variable syntax \(for the downstream action\) in the action configuration fields\.
+ Entering the variable syntax \(for the downstream action\) in the action configuration fields in the JSON pipeline structure\.
In this example, the build action's configuration field shows environment variables that are updated upon the action execution\. The example specifies the namespace and variable for the execution ID with `#{codepipeline.PipelineExecutionId}` and the namespace and variable for the commit ID with `#{SourceVariables.CommitId}`\.
```
{
"name": "Build",
"actions": [
{
"outputArtifacts": [
{
"name": "BuildArtifact"
}
],
"name": "Build",
"configuration": {
"EnvironmentVariables": "[{\"name\":\"Release_ID\",\"value\":\"#{codepipeline.PipelineExecutionId}\",\"type\":\"PLAINTEXT\"},{\"name\":\"Commit_ID\",\"value\":\"#{SourceVariables.CommitId}\",\"type\":\"PLAINTEXT\"}]",
"ProjectName": "env-var-test"
},
"inputArtifacts": [
{
"name": "SourceArtifact"
}
],
"region": "us-west-2",
"actionTypeId": {
"provider": "CodeBuild",
"category": "Build",
"version": "1",
"owner": "AWS"
},
"runOrder": 1
}
]
},
```
Each time an action is executed as part of a pipeline execution, the variables it produces are available for use in any action that is guaranteed to occur after the producing action\. To use these variables in a consuming action, you can add them to the consuming action's configuration using the syntax shown in the previous example\. Before it initiates a consuming action, CodePipeline resolves all of the variable references present in that action's configuration\.
![\[Example: Variables for multiple actions\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/variables-workflow-example.png)
The following rules help you with the configuration of variables:
+ You specify the namespace and variable for an action through a new action property or by editing an action\.
+ When you use the pipeline creation wizard, the console generates a namespace for each action created with the wizard\.
+ If the namespace isn't specified, the variables produced by that action cannot be referenced in any action configuration\.
+ To reference variables produced by an action, the referencing action must occur after the action that produces the variables\. This means it is either in a later stage than the action producing the variables, or in the same stage but at a higher run order\.
The action provider determines which variables can be generated by the action\.
Unlike a namespace which you can choose, most variable keys cannot be edited\. For example, for the Amazon S3 action provider, only the `ETag` and `VersionId` variable keys are available\.
For CodeBuild, AWS CloudFormation, and Lambda actions, the variable keys are configured by the user\.
Each execution also has a set of CodePipeline\-generated pipeline variables that contain data about the execution, such as the pipeline release ID\. These variables can be consumed by any action in the pipeline\.
[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/reference-variables.html)
For step\-by\-step procedures for managing variables, see [Working with variables](actions-variables.md)\.
You can use the CodePipeline console to view a list of all of the pipelines in your account\. You can also view details for each pipeline, including when actions last ran in the pipeline, whether a transition between stages is enabled or disabled, whether any actions have failed, and other information\. You can also view a history page that shows details for all pipeline executions for which history has been recorded\. Execution history is retained for up to 12 months\.
**Note**
Detailed execution history is available for executions run on or after February 21, 2019\.
**Topics**
+ [View pipeline \(console\)](#pipelines-list-console)
+ [View pipeline execution history \(console\)](#pipelines-executions-console)
+ [View execution status \(console\)](#pipelines-executions-status-console)
+ [View pipeline execution source revisions \(console\)](#pipelines-source-revisions-console)
+ [View action executions \(console\)](#pipelines-action-executions-console)
+ [View action artifacts and artifact store information \(console\)](#pipelines-action-artifacts-console)
+ [View the pipeline ARN and service role ARN \(console\)](#pipelines-settings-console)
You can view status, transitions, and artifact updates for a pipeline\.
**Note**
After an hour, the detailed view of a pipeline stops refreshing automatically in your browser\. To view current information, refresh the page\.
**To view a pipeline**
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
The names and creation date of all pipelines associated with your AWS account are displayed, along with links to view execution history\.
1. To see details for a single pipeline, in **Name**, choose the pipeline\. You can also select the pipeline, and then choose **View pipeline**\. A detailed view of the pipeline, including the state of each action in each stage and the state of the transitions, is displayed\.
![\[The console view of the pipeline indicates the state of each action in each stage and the state of the transitions.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/codepipeline-view-pipeline-pol.png)
The graphical view displays the following information for each stage:
+ The stage name\.
+ Every action configured for the stage\.
+ The state of transitions between stages \(enabled or disabled\), as indicated by the state of the arrow between stages\. An enabled transition is indicated by an arrow with a **Disable transition** button next to it\. A disabled transition is indicated by an arrow with a strikeout under it and an **Enable transition** button next to it\.
+ A color bar to indicate the status of the stage:
+ Gray: No executions yet
+ Blue: In progress
+ Green: Succeeded
+ Red: Failed
The graphical view also displays the following information about actions in each stage:
+ The name of the action\.
+ The provider of the action, such as CodeDeploy\.
+ When the action was last run\.
+ Whether the action succeeded or failed\.
+ Links to other details about the last run of the action, where available\.
+ Details about the source revisions that are running through the latest pipeline execution in the stage or, for CodeDeploy deployments, the latest source revisions that were deployed to target instances\.
1. To see the configuration details for an action in a stage of a pipeline, choose the information icon next to the action\.
1. To view the details of the provider of the action, choose the provider\. For example, in the preceding example pipeline, if you choose CodeDeploy in either the Staging or Production stages, the CodeDeploy console page for the deployment group configured for that stage is displayed\.
1. To see the progress details for an action in a stage, choose **Details** when it is displayed next to an action in progress \(indicated by an **In Progress** message\)\. If the action is in progress, you see the incremental progress and the steps or actions as they occur\.
**Note**
Details are available for source actions that retrieve content from GitHub repositories, but not those that retrieve content from Amazon S3 buckets or CodeCommit repositories\.
1. To approve or reject actions that have been configured for manual approval, choose **Review**\.
1. To retry actions in a stage that were not completed successfully, choose **Retry**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
48bffbe4d496-3 | 1. To get more information about errors or failures for a completed action in a stage, choose **Details**\. Details from the last time the action ran, including the results of that action \(**Succeeded** or **Failed**\), are displayed\.
1. To view details about source artifacts \(output artifact that originated in the first stage of a pipeline\) that are used in the latest pipeline execution for a stage, click in the details information area at the bottom of the stage\. You can view details about identifiers, such as commit IDs, check\-in comments, and the time since the artifact was created or updated\.
1. To view details about the most recent executions for the pipeline, choose **View history**\. For past executions, you can view revision details associated with source artifacts, such as execution IDs, status, start and end times, duration, and commit IDs and messages\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
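If you prefer the AWS CLI, you can retrieve much of the same information without the console\. The following is a minimal sketch that assumes a pipeline named *MyFirstPipeline*; replace the name with your own pipeline\.

```
aws codepipeline get-pipeline-state --name MyFirstPipeline
```

The output lists each stage and action with its most recent status, which corresponds to the state shown in the graphical view\.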
81fa582dee48-0 | You can use the console to view the history of executions in a pipeline, including status, source revisions, and timing details for each execution\.
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
The names of all pipelines associated with your AWS account are displayed, along with their status\.
1. In **Name**, choose the name of the pipeline\.
1. Choose **View history**\.
1. View the status, source revisions, change details, and triggers related to each execution for your pipeline\.
![\[View execution history.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/view-execution-history.png)
1. Choose an execution\. The detail view shows execution details, the **Timeline** tab, and the **Visualization** tab\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
81fa582dee48-1 | ![\[View execution details.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/view-execution-detail.png) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
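You can also list recent executions from the AWS CLI\. This is a minimal sketch that assumes the pipeline name *MyFirstPipeline*:

```
aws codepipeline list-pipeline-executions --pipeline-name MyFirstPipeline --max-items 5
```

Each summary in the output includes the execution ID, status, and start and end times that appear on the execution history page\.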
0640d8f90843-0 | You can view the pipeline status in **Status** on the execution history page\. Choose an execution ID link, and then view the action status\.
The following are valid states for pipelines, stages, and actions:
**Pipeline\-level states**
| Pipeline state | Description |
| --- | --- |
| InProgress | The pipeline execution is currently running\. |
| Stopping | The pipeline execution is stopping due to a request to either stop and wait or stop and abandon the pipeline execution\. |
| Stopped | The stopping process is complete, and the pipeline execution is stopped\. |
| Succeeded | The pipeline execution was completed successfully\. |
| Superseded | While this pipeline execution was waiting for the next stage to be completed, a newer pipeline execution advanced and continued through the pipeline instead\. |
| Failed | The pipeline execution was not completed successfully\. |
**Stage\-level states**
| Stage state | Description |
| --- | --- |
| InProgress | The stage is currently running\. |
| Stopping | The stage execution is stopping due to a request to either stop and wait or stop and abandon the pipeline execution\. |
| Stopped | The stopping process is complete, and the stage execution is stopped\. |
| Succeeded | The stage was completed successfully\. |
| Failed | The stage was not completed successfully\. |
**Action\-level states**
| Action state | Description | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
0640d8f90843-1 | | --- | --- |
| InProgress | The action is currently running\. |
| Abandoned | The action is abandoned due to a request to stop and abandon the pipeline execution\. |
| Succeeded | The action was completed successfully\. |
| Failed | For approval actions, the FAILED state means the action was either rejected by the reviewer or failed due to an incorrect action configuration\. | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
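These states also appear in AWS CLI output\. The following sketch assumes a pipeline named *MyFirstPipeline*; the execution ID is a placeholder that you replace with an ID from your own execution history\.

```
# The execution ID below is a placeholder; use an ID returned by list-pipeline-executions.
aws codepipeline get-pipeline-execution --pipeline-name MyFirstPipeline \
    --pipeline-execution-id 12345678-abcd-1234-abcd-123456789012
```

The `status` field in the response contains one of the pipeline\-level states listed above\.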
1cc2092aa27c-0 | You can view details about source artifacts \(output artifact that originated in the first stage of a pipeline\) that are used in an execution of a pipeline\. The details include identifiers, such as commit IDs, check\-in comments, and, when you use the CLI, version numbers of pipeline build actions\. For some revision types, you can view and open the URL of the commit\. Source revisions are made up of the following:
+ **Summary**: Summary information about the most recent revision of the artifact\. For GitHub and AWS CodeCommit repositories, the commit message\. For Amazon S3 buckets or actions, the user\-provided content of a codepipeline\-artifact\-revision\-summary key specified in the object metadata\.
+ **revisionUrl**: The revision URL for the artifact revision \(for example, the external repository URL\)\.
+ **revisionId**: The revision ID for the artifact revision\. For example, for a source change in a CodeCommit or GitHub repository, this is the commit ID\. For artifacts stored in GitHub or CodeCommit repositories, the commit ID is linked to a commit details page\.
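For Amazon S3 source actions, one way to provide the **Summary** value is to set the `codepipeline-artifact-revision-summary` metadata key when you upload the source object\. The following is a minimal sketch; the file name, bucket, object key, and summary text are placeholders\.

```
# The bucket, key, and summary text below are placeholders.
aws s3 cp SampleApp.zip s3://my-source-bucket/SampleApp.zip \
    --metadata codepipeline-artifact-revision-summary="Updated the sample application"
```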
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
The names of all pipelines associated with your AWS account will be displayed\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
1cc2092aa27c-1 | 1. Choose the name of the pipeline for which you want to view source revision details, and then do one of the following:
+ Choose **View history**\. In **Source revisions**, the source change for each execution is listed\.
![\[Information about a revision can be viewed on the execution history page\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/view-execution-history.png)
+ Locate an action for which you want to view source revision details, and then find the revision information at the bottom of its stage: | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
1cc2092aa27c-2 | ![\[Information about a revision can be viewed at the bottom of a stage in the CodePipeline console.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/view-changes-console-3.png)
Choose **View current revisions** to view source information\. With the exception of artifacts stored in Amazon S3 buckets, identifiers such as commit IDs in this information detail view are linked to source information pages for the artifacts\.
![\[View source revisions.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/view-changes-console-4.png) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
9f39e06e27c2-0 | You can view action details for a pipeline, such as action execution ID, input artifacts, output artifacts, and status\. You can view action details by choosing a pipeline in the console and then choosing an execution ID\.
**Note**
Detailed execution history is available for executions run on or after February 21, 2019\.
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
The names of all pipelines associated with your AWS account are displayed\.
1. Choose the name of the pipeline for which you want to view action details, and then choose **View history**\.
1. In **Execution ID**, choose the execution ID for which you want to view action execution details\.
1. You can view the following information on the **Timeline** tab:
1. In **Action name**, choose the link to open a details page for the action where you can view status, stage name, action name, configuration data, and artifact information\.
1. In **Provider**, choose the link to view the action provider details\. For example, in the preceding example pipeline, if you choose CodeDeploy in either the Staging or Production stages, the CodeDeploy console page for the CodeDeploy application configured for that stage is displayed\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
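The same action\-level details are available from the AWS CLI\. This minimal sketch assumes the pipeline name *MyFirstPipeline*; the execution ID is a placeholder\.

```
# Use an execution ID from your own pipeline's history in place of the placeholder.
aws codepipeline list-action-executions --pipeline-name MyFirstPipeline \
    --filter pipelineExecutionId=12345678-abcd-1234-abcd-123456789012
```

Each entry in the output includes the action execution ID, status, and the input and output artifacts for that action\.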
f55a853fa456-0 | You can view input and output artifact details for an action\. You can also choose a link that takes you to the artifact information for that action\. Because the artifact store uses versioning, each action execution has a unique input and output artifact location\.
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
The names of all pipelines associated with your AWS account are displayed\.
1. Choose the name of the pipeline for which you want to view action details, and then choose **View history**\.
1. In **Execution ID**, choose the execution ID for which you want to view action details\.
1. On the **Timeline** tab, in **Action name**, choose the link to open a details page for the action\.
1. On the details page, in **Execution summary**, view the status and timing of the action execution\.
1. In **Action details**, view the action provider and AWS Region where the execution runs\. In **Action configuration**, view the resource configuration for the action \(for example, the CodeBuild build project name\)\.
1. In **Artifacts**, view the artifact details in **Artifact type** and **Artifact provider**\. Choose the link under **Artifact name** to view the artifacts in the artifact store\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
f55a853fa456-1 | ![\[View action details.\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/view-action-details.png) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
0f0b13e609aa-0 | You can use the console to view pipeline settings, such as the pipeline ARN, the service role ARN, and the pipeline artifact store\.
1. Sign in to the AWS Management Console and open the CodePipeline console at [http://console\.aws\.amazon\.com/codesuite/codepipeline/home](http://console.aws.amazon.com/codesuite/codepipeline/home)\.
The names of all pipelines associated with your AWS account will be displayed\.
1. Choose the name of your pipeline, and then choose **Settings** in the left\-hand navigation pane\. The page shows the following:
+ The pipeline name
+ The pipeline Amazon Resource Name \(ARN\)
+ The CodePipeline service role ARN for your pipeline
+ The pipeline version
+ The name and location of the artifact store for the pipeline | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-view-console.md |
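You can retrieve the same settings with the AWS CLI\. A minimal sketch, assuming the pipeline name *MyFirstPipeline*:

```
aws codepipeline get-pipeline --name MyFirstPipeline
```

In the output, the `metadata` section contains the pipeline ARN, and the pipeline structure contains the `roleArn` and `artifactStore` values\.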
69854aad98cb-0 | Before you set up a rule in CloudWatch Events, you must create an AWS CloudTrail trail\. For more information, see [Creating a Trail in the Console](https://docs.aws.amazon.com/awscloudtrail/latest/userguide/cloudtrail-create-a-trail-using-the-console-first-time.html)\.
**Important**
If you use the console to create or edit your pipeline, your CloudWatch Events rule and AWS CloudTrail trail are created for you\.
**To create a trail**
1. Open the AWS CloudTrail console\.
1. In the navigation pane, choose **Trails**\.
1. Choose **Create Trail**\. For **Trail name**, enter a name for your trail\.
1. For **Apply trail to all regions**, choose **No**\.
1. Under **Data events**, make sure **S3** is selected\. Specify an Amazon S3 bucket and the object prefix \(folder name\) to log data events for all objects in the folder\. For each trail, you can add up to 250 Amazon S3 objects\.
1. For **Read/Write events**, choose **None**\.
1. Choose **Write**\. The trail records Amazon S3 object\-level API activity \(for example, `GetObject` and `PutObject`\) on the specified bucket and prefix\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/create-cloudtrail-S3-source-console.md |
69854aad98cb-1 | 1. Under **Storage location**, create or specify the bucket to be used to store the log files\. By default, Amazon S3 buckets and objects are private\. Only the resource owner \(the AWS account that created the bucket\) can access the bucket and its objects\. The bucket must have a resource policy that allows AWS CloudTrail permissions to access the objects in the bucket\.
1. If you're satisfied with the trail, choose **Create**\.
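If you script this setup, you can create a comparable trail with the AWS CLI\. The following is a minimal sketch; the trail name, log bucket, and the source bucket and object key are placeholders, and the log bucket must already exist with a policy that allows CloudTrail to write to it\.

```
# All names below are placeholders.
aws cloudtrail create-trail --name my-pipeline-trail --s3-bucket-name my-cloudtrail-logs
aws cloudtrail put-event-selectors --trail-name my-pipeline-trail --event-selectors \
    '[{"ReadWriteType": "WriteOnly", "IncludeManagementEvents": false,
       "DataResources": [{"Type": "AWS::S3::Object", "Values": ["arn:aws:s3:::my-bucket/my-key"]}]}]'
aws cloudtrail start-logging --name my-pipeline-trail
```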
**To create a CloudWatch Events rule that targets your pipeline with an S3 source**
1. Open the CloudWatch console at [https://console\.aws\.amazon\.com/cloudwatch/](https://console.aws.amazon.com/cloudwatch/)\.
1. In the navigation pane, choose **Events**\.
1. Choose **Event Pattern**, and then choose **Build event pattern to match events by service**\.
1. Under **Event source**, from **Service Name**, choose **Simple Storage Service \(S3\)**\.
1. From **Event Type**, choose **Object Level Operations**\.
1. Choose **Specific operation\(s\)**, and then choose **CompleteMultipartUpload**, **CopyObject**, and **PutObject**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/create-cloudtrail-S3-source-console.md |
69854aad98cb-2 | 1. Above the **Event Pattern Preview** pane, choose **Edit**\. Edit the event pattern to add the bucket name and S3 object key as `requestParameters`, as shown in this example for a bucket named `my-bucket`\. When you use the **Edit** window to specify resources, your rule is updated to use a custom event pattern\.
![\[S3 source state change rule\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/images/cloudwatch-rule-event-pattern-S3-source.png)
The following is a sample event pattern to copy and paste:
```
{
"source": [
"aws.s3"
],
"detail-type": [
"AWS API Call via CloudTrail"
],
"detail": {
"eventSource": [
"s3.amazonaws.com"
],
"eventName": [
"CopyObject",
"CompleteMultipartUpload",
"PutObject"
], | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/create-cloudtrail-S3-source-console.md |
69854aad98cb-3 |     "requestParameters": {
"bucketName": [
"my-bucket"
],
"key": [
"my-key"
]
}
}
}
```
1. In **Targets**, choose **CodePipeline**\.
1. Enter the pipeline ARN for the pipeline to be started when triggered by this rule\.
**Note**
To get the pipeline ARN, run the get\-pipeline command\. The pipeline ARN appears in the output\. It is constructed in this format:
arn:aws:codepipeline:*region*:*account*:*pipeline\-name*
Sample pipeline ARN:
arn:aws:codepipeline:us\-east\-2:80398EXAMPLE:MyFirstPipeline
1. To create or specify an IAM service role that grants Amazon CloudWatch Events permissions to invoke the target associated with your Amazon CloudWatch Events rule \(in this case, the target is CodePipeline\):
+ Choose **Create a new role for this specific resource** to create a service role that gives Amazon CloudWatch Events permissions to start your pipeline executions when triggered\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/create-cloudtrail-S3-source-console.md |
69854aad98cb-4 | + Choose **Use existing role** to enter a service role that gives Amazon CloudWatch Events permissions to start your pipeline executions when triggered\.
1. Review your rule to make sure it meets your requirements, and then choose **Configure details**\.
1. On the **Configure rule details** page, enter a name and description for the rule, and then choose **State** to enable the rule\.
1. If you're satisfied with the rule, choose **Create rule**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/create-cloudtrail-S3-source-console.md |
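If you prefer the AWS CLI, you can create an equivalent rule from the sample event pattern shown earlier\. This is a minimal sketch; it assumes the pattern is saved in a file named `eventpattern.json`, and the rule name, role name, and pipeline ARN are placeholders based on the examples in this topic\.

```
# The rule name, file name, pipeline ARN, and role ARN below are placeholders.
aws events put-rule --name MyS3SourceRule --event-pattern file://eventpattern.json
aws events put-targets --rule MyS3SourceRule \
    --targets "Id"="1","Arn"="arn:aws:codepipeline:us-east-2:80398EXAMPLE:MyFirstPipeline","RoleArn"="arn:aws:iam::80398EXAMPLE:role/MyS3SourceRuleRole"
```

The role must allow CloudWatch Events to call `codepipeline:StartPipelineExecution` on the target pipeline\.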
f12039c797ee-0 | To use AWS CloudFormation to create a webhook, update your template as described here\.<a name="proc-cfn-webhook-github"></a>
**To add parameters and create a webhook in your template**
We strongly recommend that you use AWS Secrets Manager to store your credentials\. If you use Secrets Manager, you must have already configured and stored your secret parameters in Secrets Manager\. This example uses dynamic references to AWS Secrets Manager for the GitHub credentials for your webhook\. For more information, see [ Using Dynamic References to Specify Template Values](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/dynamic-references.html#dynamic-references-secretsmanager)\.
**Important**
When passing secret parameters, do not enter the value directly into the template\. The value is rendered as plaintext and is therefore readable\. For security reasons, do not use plaintext in your AWS CloudFormation template to store your credentials\.
When you use the CLI or AWS CloudFormation to create a pipeline and add a webhook, you must disable periodic checks\.
**Note** | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
f12039c797ee-1 | To disable periodic checks, you must explicitly add the `PollForSourceChanges` parameter and set it to false, as detailed in the final procedure below\. Otherwise, for a pipeline created with the CLI or AWS CloudFormation, `PollForSourceChanges` defaults to true and does not display in the pipeline structure output\. For more information about `PollForSourceChanges` defaults, see [Default settings for the PollForSourceChanges parameter](reference-pipeline-structure.md#PollForSourceChanges-defaults)\.
1. In the template, under `Resources`, add your parameters:
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
4d5a1a9350fc-0 | ```
Parameters:
GitHubOwner:
Type: String
...
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
9b0cba4862c7-0 | ```
{
"Parameters": {
"BranchName": {
"Description": "GitHub branch name",
"Type": "String",
"Default": "master"
},
"GitHubOwner": {
"Type": "String"
},
...
```
------
1. Use the `AWS::CodePipeline::Webhook` AWS CloudFormation resource to add a webhook\.
**Note**
The `TargetAction` you specify must match the `Name` property of the source action defined in the pipeline\.
If `RegisterWithThirdParty` is set to `true`, make sure the user associated to the `OAuthToken` can set the required scopes in GitHub\. The token and webhook require the following GitHub scopes:
+ `repo` \- used for full control to read and pull artifacts from public and private repositories into a pipeline\.
+ `admin:repo_hook` \- used for full control of repository hooks\.
Otherwise, GitHub returns a 404\. For more information about the 404 returned, see [https://help.github.com/articles/about-webhooks](https://help.github.com/articles/about-webhooks)\.
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
097f20c036d3-0 | ```
AppPipelineWebhook:
Type: AWS::CodePipeline::Webhook
Properties:
Authentication: GITHUB_HMAC
AuthenticationConfiguration:
SecretToken: {{resolve:secretsmanager:MyGitHubSecret:SecretString:token}}
Filters:
-
JsonPath: "$.ref"
MatchEquals: refs/heads/{Branch}
TargetPipeline: !Ref AppPipeline
TargetAction: SourceAction
Name: AppPipelineWebhook
TargetPipelineVersion: !GetAtt AppPipeline.Version
RegisterWithThirdParty: true
...
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
b58090d7dd92-0 | ```
"AppPipelineWebhook": {
"Type": "AWS::CodePipeline::Webhook",
"Properties": {
"Authentication": "GITHUB_HMAC",
"AuthenticationConfiguration": {
"SecretToken": "{{resolve:secretsmanager:MyGitHubSecret:SecretString:token}}"
},
"Filters": [
{
"JsonPath": "$.ref",
"MatchEquals": "refs/heads/{Branch}"
}
],
"TargetPipeline": {
"Ref": "AppPipeline"
},
"TargetAction": "SourceAction",
"Name": "AppPipelineWebhook",
"TargetPipelineVersion": {
"Fn::GetAtt": [
"AppPipeline",
"Version"
]
},
"RegisterWithThirdParty": true
}
},
...
```
------
1. Save the updated template to your local computer, and then open the AWS CloudFormation console\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
b58090d7dd92-1 | 1. Choose your stack, and then choose **Create Change Set for Current Stack**\.
1. Upload the template, and then view the changes listed in AWS CloudFormation\. These are the changes to be made to the stack\. You should see your new resources in the list\.
1. Choose **Execute**\.<a name="proc-cfn-flag-github"></a>
**To edit your pipeline's PollForSourceChanges parameter**
**Important**
When you create a pipeline with this method, the `PollForSourceChanges` parameter defaults to true if it is not explicitly set to false\. When you add event\-based change detection, you must add the parameter to your output and set it to false to disable polling\. Otherwise, your pipeline starts twice for a single source change\. For details, see [Default settings for the PollForSourceChanges parameter](reference-pipeline-structure.md#PollForSourceChanges-defaults)\.
+ In the template, change `PollForSourceChanges` to `false`\. If you did not include `PollForSourceChanges` in your pipeline definition, add it and set it to false\.
**Why am I making this change? ** Changing this parameter to `false` turns off periodic checks so you can use event\-based change detection only\.
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
eed37476c092-0 | ```
Name: Source
Actions:
-
Name: SourceAction
ActionTypeId:
Category: Source
Owner: ThirdParty
Version: 1
Provider: GitHub
OutputArtifacts:
- Name: SourceOutput
Configuration:
Owner: !Ref GitHubOwner
Repo: !Ref RepositoryName
Branch: !Ref BranchName
OAuthToken: {{resolve:secretsmanager:MyGitHubSecret:SecretString:token}}
PollForSourceChanges: false
RunOrder: 1
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
e14dde2a3ff7-0 | ```
{
"Name": "Source",
"Actions": [
{
"Name": "SourceAction",
"ActionTypeId": {
"Category": "Source",
"Owner": "ThirdParty",
"Version": 1,
"Provider": "GitHub"
},
"OutputArtifacts": [
{
"Name": "SourceOutput"
}
],
"Configuration": {
"Owner": {
"Ref": "GitHubOwner"
},
"Repo": {
"Ref": "RepositoryName"
},
"Branch": {
"Ref": "BranchName"
},
"OAuthToken": "{{resolve:secretsmanager:MyGitHubSecret:SecretString:token}}",
"PollForSourceChanges": false
},
"RunOrder": 1
}
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-webhooks-create-cfn.md |
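After the stack update is complete, you can confirm that the webhook was created and registered\. A minimal check from the AWS CLI:

```
aws codepipeline list-webhooks
```

The output lists each webhook defined in your account, including its URL and target pipeline\.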
f06475a1e44b-0 | **Topics**
CodePipeline provides a number of security features to consider as you develop and implement your own security policies\. The following best practices are general guidelines and don’t represent a complete security solution\. Because these best practices might not be appropriate or sufficient for your environment, treat them as helpful considerations rather than prescriptions\.
Use encryption and authentication for the source repositories that connect to your pipelines\. These are the CodePipeline best practices for security:
+ If you create a pipeline that uses an S3 source bucket, configure server\-side encryption for artifacts stored in Amazon S3 for CodePipeline by managing AWS KMS managed keys \(SSE\-KMS\), as described in [Configure server\-side encryption for artifacts stored in Amazon S3 for CodePipeline](S3-artifact-encryption.md)\.
+ If you create a pipeline that uses a GitHub source repository, configure GitHub authentication\. You can use an AWS managed OAuth token or a customer managed personal access token, as described in [Configure GitHub authentication](GitHub-authentication.md)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security-best-practices.md |
f06475a1e44b-1 | + If you use a Jenkins build provider for your pipeline’s build or test action, install Jenkins on an EC2 instance and configure a separate EC2 instance profile\. Make sure that the instance profile grants Jenkins only the AWS permissions required to perform tasks for your project, such as retrieving files from Amazon S3\. To learn how to create the role for your Jenkins instance profile, see the steps in [Create an IAM role to use for Jenkins integration](tutorials-four-stage-pipeline.md#tutorials-four-stage-pipeline-prerequisites-jenkins-iam-role)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security-best-practices.md |
8de1e329737d-0 | You can use logging features in AWS to determine the actions users have taken in your account and the resources that were used\. The log files show:
+ The time and date of actions\.
+ The source IP address for an action\.
+ Which actions failed due to inadequate permissions\.
Logging features are available in the following AWS services:
+ AWS CloudTrail can be used to log AWS API calls and related events made by or on behalf of an AWS account\. For more information, see [Logging CodePipeline API calls with AWS CloudTrail](monitoring-cloudtrail-logs.md)\.
+ Amazon CloudWatch Events can be used to monitor your AWS Cloud resources and the applications you run on AWS\. You can create alerts in Amazon CloudWatch Events based on metrics that you define\. For more information, see [Detect and react to changes in pipeline state with Amazon CloudWatch Events](detect-state-changes-cloudwatch-events.md)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/incident-response.md |
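For example, you can use the AWS CLI to review recent CodePipeline API activity recorded by CloudTrail\. This is a minimal sketch:

```
aws cloudtrail lookup-events \
    --lookup-attributes AttributeKey=EventSource,AttributeValue=codepipeline.amazonaws.com \
    --max-results 10
```

Each returned event includes the event time, the user name, and the full CloudTrail event record for the API call\.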
7fc9b5ed1fff-0 | You can use CodePipeline to help you automatically build, test, and deploy your applications in the cloud\. Specifically, you can:
+ **Automate your release processes**: CodePipeline fully automates your release process from end to end, starting from your source repository through build, test, and deployment\. You can prevent changes from moving through a pipeline by including a manual approval action in any stage except a Source stage\. You can release when you want, in the way you want, on the systems of your choice, across one instance or multiple instances\.
+ **Establish a consistent release process**: Define a consistent set of steps for every code change\. CodePipeline runs each stage of your release according to your criteria\.
+ **Speed up delivery while improving quality**: You can automate your release process to allow your developers to test and release code incrementally and speed up the release of new features to your customers\.
+ **Use your favorite tools**: You can incorporate your existing source, build, and deployment tools into your pipeline\. For a full list of AWS services and third\-party tools currently supported by CodePipeline, see [Product and service integrations with CodePipeline](integrations.md)\.
+ **View progress at a glance**: You can review real\-time status of your pipelines, check the details of any alerts, retry failed actions, view details about the source revisions used in the latest pipeline execution in each stage, and manually rerun any pipeline\.
+ **View pipeline history details**: You can view details about executions of a pipeline, including start and end times, run duration, and execution IDs\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/welcome-what-can-I-do.md |
a759a7c55ee1-0 | This short video \(3:06\) describes how CodePipeline builds, tests, and deploys your code every time there is a code change, based on the release process models you define\.
[Watch the video on YouTube](http://www.youtube.com/watch?v=YxcIj_SLflw) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/welcome-what-can-I-do.md |
51e4f2eb5e94-0 | There are two ways to configure server\-side encryption for Amazon S3 artifacts:
+ CodePipeline creates an S3 artifact bucket and default AWS managed SSE\-KMS encryption keys when you create a pipeline using the Create Pipeline wizard\. The master key is encrypted along with object data and managed by AWS\.
+ You can create and manage your own customer managed SSE\-KMS keys\.
**Important**
CodePipeline only supports symmetric customer master keys \(CMKs\)\. Do not use an asymmetric CMK to encrypt the data in your S3 bucket\.
If you are using the default S3 key, you cannot change or delete this AWS managed key\. If you are using a customer managed key in AWS KMS to encrypt or decrypt artifacts in the S3 bucket, you can change or rotate this key as necessary\.
Amazon S3 supports bucket policies that you can use if you require server\-side encryption for all objects that are stored in your bucket\. For example, the following bucket policy denies upload object \(`s3:PutObject`\) permission to everyone if the request does not include the `x-amz-server-side-encryption` header requesting server\-side encryption with SSE\-KMS\.
```
{
"Version": "2012-10-17",
"Id": "SSEAndSSLPolicy",
"Statement": [
{
"Sid": "DenyUnEncryptedObjectUploads", | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/S3-artifact-encryption.md |
51e4f2eb5e94-1 |             "Effect": "Deny",
"Principal": "*",
"Action": "s3:PutObject",
"Resource": "arn:aws:s3:::codepipeline-us-west-2-89050EXAMPLE/*",
"Condition": {
"StringNotEquals": {
"s3:x-amz-server-side-encryption": "aws:kms"
}
}
},
{
"Sid": "DenyInsecureConnections",
"Effect": "Deny",
"Principal": "*",
"Action": "s3:*",
"Resource": "arn:aws:s3:::codepipeline-us-west-2-89050EXAMPLE/*",
"Condition": {
"Bool": {
"aws:SecureTransport": "false"
}
}
}
]
}
``` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/S3-artifact-encryption.md |
51e4f2eb5e94-2 | For more information about server\-side encryption and AWS KMS, see [Protecting Data Using Server\-Side Encryption](https://docs.aws.amazon.com/AmazonS3/latest/dev/serv-side-encryption.html) and [Protecting Data Using Server\-Side Encryption with CMKs Stored in AWS Key Management Service \(SSE\-KMS\)](https://docs.aws.amazon.com/AmazonS3/latest/dev/UsingKMSEncryption.html)\.
For more information about AWS KMS, see the [AWS Key Management Service Developer Guide](https://docs.aws.amazon.com/kms/latest/developerguide/)\.
**Topics**
+ [View your default Amazon S3 SSE\-KMS encryption keys](#S3-view-default-keys)
+ [Configure server\-side encryption for S3 buckets using AWS CloudFormation or the AWS CLI](#S3-rotate-customer-key) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/S3-artifact-encryption.md |
5aaf94a2bc3e-0 | When you use the **Create Pipeline** wizard to create your first pipeline, an S3 bucket is created for you in the same Region you created the pipeline\. The bucket is used to store pipeline artifacts\. When a pipeline runs, artifacts are put into and retrieved from the S3 bucket\. By default, CodePipeline uses server\-side encryption with the AWS KMS\-managed keys \(SSE\-KMS\) using the default key for Amazon S3 \(the `aws/s3` key\)\. This key is created and stored in your AWS account\. When artifacts are retrieved from the S3 bucket, CodePipeline uses the same SSE\-KMS process to decrypt the artifact\.
**To view information about your default AWS KMS key**
1. Sign in to the AWS Management Console and open the IAM console at [https://console\.aws\.amazon\.com/iam/](https://console.aws.amazon.com/iam/)\.
1. In the service navigation pane, choose **Encryption Keys**\. \(If a welcome page appears, choose **Get Started Now**\.\)
1. In **Filter**, choose the Region for your pipeline\. For example, if the pipeline was created in `us-east-2`, make sure that the filter is set to US East \(Ohio\)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/S3-artifact-encryption.md |
5aaf94a2bc3e-1 | For more information about the Regions and endpoints available for CodePipeline, see [AWS CodePipeline endpoints and quotas](https://docs.aws.amazon.com/general/latest/gr/codepipeline.html)\.
1. In the list of encryption keys, choose the key with the alias used for your pipeline \(by default, **aws/s3**\)\. Basic information about the key is displayed\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/S3-artifact-encryption.md |
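You can also view the same key from the AWS CLI by referring to its alias\. A minimal sketch:

```
aws kms describe-key --key-id alias/aws/s3
```

In the output, the `KeyManager` field indicates that the key is managed by AWS\.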
b5dbb12e0ebb-0 | When you use AWS CloudFormation or the AWS CLI to create a pipeline, you must configure server\-side encryption manually\. Use the sample bucket policy above, and then create your own customer managed SSE\-KMS encryption keys\. You can also use your own keys instead of the default Amazon S3 key\. Some advantages to using your own key include:
+ You want to rotate the key on a schedule to meet business or security requirements for your organization\.
+ You want to create a pipeline that uses resources associated with another AWS account\. This requires the use of a customer managed key\. For more information, see [Create a pipeline in CodePipeline that uses resources from another AWS account](pipelines-create-cross-account.md)\.
Cryptographic best practices discourage extensive reuse of encryption keys\. As a best practice, rotate your key on a regular basis\. To create new cryptographic material for your AWS Key Management Service \(AWS KMS\) customer master keys \(CMKs\), you can create CMKs, and then change your applications or aliases to use the new CMKs\. Or, you can enable automatic key rotation for an existing CMK\.
To rotate your SSE\-KMS customer master key, see [Rotating Customer Master Keys](https://docs.aws.amazon.com/kms/latest/developerguide/rotate-keys.html)\.
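For example, you can turn on automatic rotation for a customer managed CMK from the AWS CLI\. This is a minimal sketch; the key ID is a placeholder, and the command applies only to customer managed keys, not to the default `aws/s3` key\.

```
# The key ID below is a placeholder for your customer managed CMK.
aws kms enable-key-rotation --key-id 1234abcd-12ab-34cd-56ef-1234567890ab
```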
**Important**
CodePipeline only supports symmetric customer master keys \(CMKs\)\. Do not use an asymmetric CMK to encrypt the data in your S3 bucket\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/S3-artifact-encryption.md |
f340cd72f5c5-0 | After you complete the steps in [Getting started with CodePipeline](getting-started-codepipeline.md), you can try one of the AWS CodePipeline tutorials in this user guide:
| | |
| --- | --- |
| I want to use the wizard to create a pipeline that uses CodeDeploy to deploy a sample application from an Amazon S3 bucket to Amazon EC2 instances running Amazon Linux\. After using the wizard to create my two\-stage pipeline, I want to add a third stage\. | See [Tutorial: Create a simple pipeline \(S3 bucket\)](tutorials-simple-s3.md)\. |
| I want to create a two\-stage pipeline that uses CodeDeploy to deploy a sample application from a CodeCommit repository to an Amazon EC2 instance running Amazon Linux\. | See [Tutorial: Create a simple pipeline \(CodeCommit repository\)](tutorials-simple-codecommit.md)\. |
| I want to add a build stage to the three\-stage pipeline I created in the first tutorial\. The new stage uses Jenkins to build my application\. | See [Tutorial: Create a four\-stage pipeline](tutorials-four-stage-pipeline.md)\. |
| I want to set up a CloudWatch Events rule that sends notifications whenever there are changes to the execution state of my pipeline, stage, or action\. | See [Tutorial: Set up a CloudWatch Events rule to receive email notifications for pipeline state changes](tutorials-cloudwatch-sns-notifications.md)\. | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials.md |
f340cd72f5c5-1 | | I want to create a pipeline with a GitHub source that builds and tests an Android app with CodeBuild and AWS Device Farm\. | See [Tutorial: Create a pipeline that builds and tests your Android app when a commit is pushed to your GitHub repository](tutorials-codebuild-devicefarm.md)\. |
| I want to create a pipeline with an Amazon S3 source that tests an iOS app with AWS Device Farm\. | See [Tutorial: Create a pipeline that tests your iOS app after a change in your S3 bucket](tutorials-codebuild-devicefarm-S3.md)\. |
| I want to create a pipeline that deploys my product template to AWS Service Catalog\. | See [Tutorial: Create a pipeline that deploys to AWS Service Catalog](tutorials-S3-servicecatalog.md)\. |
| I want to use sample templates to create a simple pipeline \(with an Amazon S3, CodeCommit, or GitHub source\) using the AWS CloudFormation console\. | See [Tutorial: Create a pipeline with AWS CloudFormation](tutorials-cloudformation.md)\. |
| I want to create a two\-stage pipeline that uses CodeDeploy and Amazon ECS for blue/green deployment of an image from an Amazon ECR repository to an Amazon ECS cluster and service\. | See [Tutorial: Create a pipeline with an Amazon ECR source and ECS\-to\-CodeDeploy deployment](tutorials-ecs-ecr-codedeploy.md)\. | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials.md |
f340cd72f5c5-2 | | I want to create a pipeline that continuously publishes my serverless application to the AWS Serverless Application Repository\. | See [Tutorial: Create a pipeline that publishes your serverless application to the AWS Serverless Application Repository](tutorials-serverlessrepo-auto-publish.md)\. |
The following tutorials in other user guides provide guidance for integrating other AWS services into your pipelines:
+ [Create a pipeline that uses CodeBuild](https://docs.aws.amazon.com/codebuild/latest/userguide/how-to-create-pipeline.html#pipelines-create-console) in *[AWS CodeBuild User Guide](https://docs.aws.amazon.com/codebuild/latest/userguide/)*
+ [Using CodePipeline with AWS OpsWorks Stacks](https://docs.aws.amazon.com/opsworks/latest/userguide/other-services-cp.html) in *[AWS OpsWorks User Guide](https://docs.aws.amazon.com/opsworks/latest/userguide/)*
+ [Continuous Delivery with CodePipeline](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/continuous-delivery-codepipeline.html) in *[AWS CloudFormation User Guide](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/)* | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials.md |
f340cd72f5c5-3 | + [Getting started using Elastic Beanstalk](https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/GettingStarted.html) in *[AWS Elastic Beanstalk Developer Guide](https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/)*
+ [Set Up a Continuous Deployment Pipeline Using CodePipeline](https://aws.amazon.com/getting-started/tutorials/continuous-deployment-pipeline/) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/tutorials.md |
ccaef63bd038-0 | You can set up a rule in Amazon CloudWatch Events to start a pipeline on a schedule\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-schedule.md |
a943aee23bcb-0 | **To create a CloudWatch Events rule with a schedule as the event source**
1. Open the CloudWatch console at [https://console\.aws\.amazon\.com/cloudwatch/](https://console.aws.amazon.com/cloudwatch/)\.
1. In the navigation pane, choose **Events**\.
1. Choose **Create rule**, and then under **Event Source**, choose **Schedule**\.
1. Set up the schedule using a fixed rate or expression\. For information, see [Schedule Expression for Rules](https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/ScheduledEvents.html)\.
1. In **Targets**, choose **CodePipeline**\.
1. Enter the pipeline ARN for the pipeline execution that starts when triggered by this schedule\.
**Note**
You can find the pipeline ARN under **Settings** in the console\. See [View the pipeline ARN and service role ARN \(console\)](pipelines-view-console.md#pipelines-settings-console)\.
1. Choose one of the following to create or specify an IAM service role that gives Amazon CloudWatch Events permissions to invoke the target associated with your Amazon CloudWatch Events rule \(in this case, the target is CodePipeline\)\.
+ Choose **Create a new role for this specific resource** to create a service role that grants Amazon CloudWatch Events permissions to start your pipeline executions when triggered\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-schedule.md |
a943aee23bcb-1 | + Choose **Use existing role** to enter a service role that grants Amazon CloudWatch Events permissions to start your pipeline executions when triggered\.
1. Choose **Configure details**\.
1. On the **Configure rule details** page, enter a name and description for the rule, and then choose **State** to enable the rule\.
1. If you're satisfied with the rule, choose **Create rule**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-schedule.md |
71b51942c6ed-0 | To use the AWS CLI to create a rule, call the put\-rule command, specifying:
+ A name that uniquely identifies the rule you are creating\. This name must be unique across the CloudWatch Events rules associated with your AWS account\.
+ The schedule expression for the rule\.
**To create a CloudWatch Events rule with a schedule as the event source**
1. Call the put\-rule command and include the `--name` and `--schedule-expression` parameters\.
Examples:
The following sample command uses \-\-schedule\-expression to create a rule called `MyRule2` that filters CloudWatch Events on a schedule\.
```
aws events put-rule --schedule-expression 'cron(15 10 ? * 6L 2002-2005)' --name MyRule2
```
1. Grant permissions for Amazon CloudWatch Events to use CodePipeline to start your pipeline when the rule is triggered\. For more information, see [Using Resource\-Based Policies for Amazon CloudWatch Events](http://docs.aws.amazon.com/AmazonCloudWatch/latest/events/resource-based-policies-cwe.html)\.
1. Use the following sample to create the trust policy to allow Amazon CloudWatch Events to assume the service role\. Name it `trustpolicyforCWE.json`\.
```
{
"Version": "2012-10-17",
"Statement": [
{ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-schedule.md |
71b51942c6ed-1 |       "Effect": "Allow",
"Principal": {
"Service": "events.amazonaws.com"
},
"Action": "sts:AssumeRole"
}
]
}
```
1. Use the following command to create the `Role-for-MyRule` role and attach the trust policy\.
```
aws iam create-role --role-name Role-for-MyRule --assume-role-policy-document file://trustpolicyforCWE.json
```
1. Create the permissions policy JSON as shown in this sample for the pipeline named `MyFirstPipeline`\. Name the permissions policy `permissionspolicyforCWE.json`\.
```
{
"Version": "2012-10-17",
"Statement": [
{
"Effect": "Allow",
"Action": [
"codepipeline:StartPipelineExecution"
],
"Resource": [
"arn:aws:codepipeline:us-west-2:80398EXAMPLE:MyFirstPipeline"
]
} | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-schedule.md |
    ]
}
```
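To finish the setup, attach the permissions policy to the role, and then add the pipeline as the target of the rule\. The following is a minimal sketch that reuses the names from the previous steps; the policy name is a placeholder\.

```
# The policy name below is a placeholder.
aws iam put-role-policy --role-name Role-for-MyRule \
    --policy-name CodePipeline-Permissions-Policy-for-CWE \
    --policy-document file://permissionspolicyforCWE.json
aws events put-targets --rule MyRule2 \
    --targets "Id"="1","Arn"="arn:aws:codepipeline:us-west-2:80398EXAMPLE:MyFirstPipeline","RoleArn"="arn:aws:iam::80398EXAMPLE:role/Role-for-MyRule"
```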