```
{
"datasetArn": "string"
}
```
If the action is successful, the service sends back an HTTP 200 response\.
The following data is returned in JSON format by the service\.
** [datasetArn](#API_CreateDataset_ResponseSyntax) ** <a name="personalize-CreateDataset-response-datasetArn"></a>
The ARN of the dataset\.
Type: String
Length Constraints: Maximum length of 256\.
Pattern: `arn:([a-z\d-]+):personalize:.*:.*:.+`
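The ARN constraints above can be checked client\-side before calling the API\. The following is a minimal sketch using Python's `re` module; the sample ARN is hypothetical and only illustrates the documented pattern and length limit:

```python
import re

# Pattern from the datasetArn constraints above
ARN_PATTERN = re.compile(r"arn:([a-z\d-]+):personalize:.*:.*:.+")

def is_valid_personalize_arn(arn: str) -> bool:
    """Return True if the ARN satisfies the documented length and pattern constraints."""
    return len(arn) <= 256 and ARN_PATTERN.fullmatch(arn) is not None

# Hypothetical dataset ARN for illustration
sample = "arn:aws:personalize:us-west-2:123456789012:dataset/my-group/INTERACTIONS"
print(is_valid_personalize_arn(sample))  # True
```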
**InvalidInputException**
Provide a valid value for the field or parameter\.
HTTP Status Code: 400
**LimitExceededException**
The limit on the number of requests per second has been exceeded\.
HTTP Status Code: 400
**ResourceAlreadyExistsException**
The specified resource already exists\.
HTTP Status Code: 400
**ResourceInUseException**
The specified resource is in use\.
HTTP Status Code: 400
**ResourceNotFoundException**
Could not find the specified resource\.
HTTP Status Code: 400
For more information about using this API in one of the language\-specific AWS SDKs, see the following:
+ [AWS Command Line Interface](https://docs.aws.amazon.com/goto/aws-cli/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for \.NET](https://docs.aws.amazon.com/goto/DotNetSDKV3/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for C\+\+](https://docs.aws.amazon.com/goto/SdkForCpp/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for Go](https://docs.aws.amazon.com/goto/SdkForGoV1/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for Java](https://docs.aws.amazon.com/goto/SdkForJava/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for JavaScript](https://docs.aws.amazon.com/goto/AWSJavaScriptSDK/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for PHP V3](https://docs.aws.amazon.com/goto/SdkForPHPV3/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for Python](https://docs.aws.amazon.com/goto/boto3/personalize-2018-05-22/CreateDataset)
+ [AWS SDK for Ruby V3](https://docs.aws.amazon.com/goto/SdkForRubyV3/personalize-2018-05-22/CreateDataset)
You can get real\-time recommendations from Amazon Personalize with a campaign\. For example, suppose you have a campaign that is designed to give movie recommendations\. You can use the following operations to give real\-time movie recommendations to users signed into your application or website\. For an example using the AWS CLI, see [Step 4: Get Recommendations](getting-started-cli.md#gs-test)\.
**Topics**
+ [GetRecommendations](#recommendations)
+ [GetPersonalizedRanking](#rankings)
To get recommendations, call the [GetRecommendations](API_RS_GetRecommendations.md) API\. Supply either the user ID or item ID, depending on the recipe type used to create the solution that the campaign is based on\.
To get contextual recommendations, you can also include contextual metadata on your user\. For instance, you might include information on the user's current location or device \(desktop, mobile, tablet\) so that Amazon Personalize can get recommendations based on that user's previous situational behavior\. Any metadata context fields must be included in the schema of the campaign's user\-item interaction dataset\.
**Note**
The solution backing the campaign must have been created using a recipe of type USER\_PERSONALIZATION or RELATED\_ITEMS\. For more information, see [Choosing a Recipe](working-with-predefined-recipes.md)\.
**How Scoring Works**
Models that are based on USER\_PERSONALIZATION recipes score all of the items in your Items dataset relative to each other on a scale from 0 to 1 \(both inclusive\), so that the total of all scores equals 1\. For example, if you're getting movie recommendations for a user and there are three movies in the Items dataset, their scores might be `0.6`, `0.3`, and `0.1`\. Similarly, if you have 1,000 movies in your inventory, the highest\-scoring movies might have very small scores \(the average score would be `0.001`\), but, because scoring is relative, the recommendations are still valid\.
In mathematical terms, the score for each user\-item pair \(u,i\) is computed as a softmax over every item in the Items dataset, where exp is the exponential function, w̅u is the user embedding, wj is the embedding of item j, and the sum in the denominator runs over all items in the Items dataset:

`score(u,i) = exp(w̅u · wi) / Σj exp(w̅u · wj)`
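The relative scoring described above behaves like a softmax over the item set\. The following sketch uses toy embeddings \(not the service's actual model\) to show that the scores always sum to 1:

```python
import math

def softmax_scores(user_emb, item_embs):
    """Score each item for a user via exp(u.i) / sum_j exp(u.j), as described above."""
    logits = [sum(u * i for u, i in zip(user_emb, item)) for item in item_embs]
    total = sum(math.exp(x) for x in logits)
    return [math.exp(x) / total for x in logits]

user = [0.2, 0.5]                                 # toy user embedding
items = [[1.0, 0.3], [0.4, 0.9], [-0.2, 0.1]]     # three toy movies
scores = softmax_scores(user, items)

print(abs(sum(scores) - 1.0) < 1e-9)  # True: relative scores always sum to 1
```

With 1,000 items instead of three, each individual score shrinks, but the ordering \(and so the recommendation quality\) is unaffected\.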
**Note**
Scores aren't shown for SIMS or Popularity\-Count\-based models\.
**Get Recommendations Using the AWS Python SDK**
Use the following code to get a recommendation\. Change the value of `userId` to a user ID that is in the data that you used to train the solution\. A list of recommended items for the user is displayed\.
```
import boto3

personalizeRt = boto3.client('personalize-runtime')

response = personalizeRt.get_recommendations(
    campaignArn = 'Campaign ARN',
    userId = 'User ID')

print("Recommended items")
for item in response['itemList']:
    print(item['itemId'])
```
Use the following code to get a recommendation based on contextual metadata\. Change the value of the `context` key\-value pair to that of a metadata field that is in your training data\. A list of recommended items for the user is displayed\.
```
import boto3

personalizeRt = boto3.client('personalize-runtime')

response = personalizeRt.get_recommendations(
    campaignArn = 'Campaign ARN',
    userId = 'User ID',
    context = {
        'key': 'value'
    })

print("Recommended items")
for item in response['itemList']:
    print(item['itemId'])
```
A personalized ranking is a list of recommended items that are re\-ranked for a specific user\. To get personalized rankings, call the [GetPersonalizedRanking](API_RS_GetPersonalizedRanking.md) API\.
**Note**
The solution backing the campaign must have been created using a recipe of type PERSONALIZED\_RANKING\. For more information, see [Choosing a Recipe](working-with-predefined-recipes.md)\.
**How Scoring Works**
Like the scores returned by the `GetRecommendations` operation, `GetPersonalizedRanking` scores sum to 1, but because the list of considered items is much smaller than your full Items dataset, recommendation scores tend to be higher\.
Mathematically, the scoring function for `GetPersonalizedRanking` is identical to `GetRecommendations`, except that it considers only the input items\. This means that scores closer to 1 become more likely, because there are fewer other choices to divide up the score:

`score(u,i) = exp(w̅u · wi) / Σj∈inputList exp(w̅u · wj)`
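The effect of the smaller candidate set can be seen with toy numbers\. The sketch below \(random stand\-in logits, not real model output\) compares the top softmax score over a 1,000\-item catalog with the top score over a two\-item `inputList`:

```python
import math
import random

def softmax(logits):
    """Normalize logits into scores that sum to 1."""
    total = sum(math.exp(x) for x in logits)
    return [math.exp(x) / total for x in logits]

random.seed(0)
full_catalog = [random.gauss(0, 1) for _ in range(1000)]  # stand-in logits for a full Items dataset
ranking_input = full_catalog[:2]                          # only the items passed in inputList

top_full = max(softmax(full_catalog))
top_ranked = max(softmax(ranking_input))
print(top_ranked > top_full)  # True: fewer candidates means higher scores
```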
**Get Personalized Rankings Using the AWS Python SDK**
Use the following code to get a personalized ranking\. Change the value of `userId` and `inputList` to a user ID and list of item IDs that are in the data that you used to train the solution\. A list of ranked recommendations is displayed\. Amazon Personalize considers the first item in the list of most interest to the user\.
```
import boto3

personalizeRt = boto3.client('personalize-runtime')

response = personalizeRt.get_personalized_ranking(
    campaignArn = "Campaign arn",
    userId = "UserID",
    inputList = ['ItemID1', 'ItemID2'])

print("Personalized Ranking")
for item in response['personalizedRanking']:
    print(item['itemId'])
```
Use the following code to get a personalized ranking based on contextual metadata\. Change the value of the `context` key\-value pair to that of a metadata field that is in your training data\. Amazon Personalize considers the first item in the list of most interest to the user\.
```
import boto3

personalizeRt = boto3.client('personalize-runtime')

response = personalizeRt.get_personalized_ranking(
    campaignArn = "Campaign arn",
    userId = "UserID",
    inputList = ['ItemID1', 'ItemID2'],
    context = {
        'key': 'value'
    })

print("Personalized Ranking")
for item in response['personalizedRanking']:
    print(item['itemId'])
```
For a sample Jupyter notebook that shows how to use the Personalized\-Ranking recipe, see [Personalize Ranking Example](https://github.com/aws-samples/amazon-personalize-samples/blob/master/next_steps/core_use_cases/personalized_ranking/personalize_ranking_example.ipynb)\.
Returns the list of datasets contained in the given dataset group\. The response provides the properties for each dataset, including the Amazon Resource Name \(ARN\)\. For more information on datasets, see [CreateDataset](API_CreateDataset.md)\.
```
{
  "datasetGroupArn": "string",
  "maxResults": number,
  "nextToken": "string"
}
```
The request accepts the following data in JSON format\.
** [datasetGroupArn](#API_ListDatasets_RequestSyntax) ** <a name="personalize-ListDatasets-request-datasetGroupArn"></a>
The Amazon Resource Name \(ARN\) of the dataset group that contains the datasets to list\.
Type: String
Length Constraints: Maximum length of 256\.
Pattern: `arn:([a-z\d-]+):personalize:.*:.*:.+`
Required: No
** [maxResults](#API_ListDatasets_RequestSyntax) ** <a name="personalize-ListDatasets-request-maxResults"></a>
The maximum number of datasets to return\.
Type: Integer
Valid Range: Minimum value of 1\. Maximum value of 100\.
Required: No
** [nextToken](#API_ListDatasets_RequestSyntax) ** <a name="personalize-ListDatasets-request-nextToken"></a>
A token returned from the previous call to `ListDatasets` for getting the next set of datasets \(if they exist\)\.
Type: String
Length Constraints: Maximum length of 1300\.
Required: No
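The `nextToken` flow above follows the standard AWS pagination pattern: keep calling until the response omits the token\. The sketch below uses a stand\-in for the `ListDatasets` call \(the stub, its pages, and the dataset names are illustrative, not real API output\):

```python
def list_all_datasets(list_page):
    """Collect every dataset by following nextToken until it is absent."""
    datasets, token = [], None
    while True:
        page = list_page(nextToken=token, maxResults=100)
        datasets.extend(page["datasets"])
        token = page.get("nextToken")
        if not token:
            return datasets

# Stand-in for personalize.list_datasets, returning two pages
_pages = {
    None: {"datasets": [{"name": "interactions"}], "nextToken": "t1"},
    "t1": {"datasets": [{"name": "items"}]},
}

def fake_list_datasets(nextToken=None, maxResults=100):
    return _pages[nextToken]

print([d["name"] for d in list_all_datasets(fake_list_datasets)])  # ['interactions', 'items']
```

With boto3, `list_page` would be the real `personalize.list_datasets` call \(dropping `nextToken=None` on the first request\)\.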
```
{
  "datasets": [
    {
      "creationDateTime": number,
      "datasetArn": "string",
      "datasetType": "string",
      "lastUpdatedDateTime": number,
      "name": "string",
      "status": "string"
    }
  ],
  "nextToken": "string"
}
```
If the action is successful, the service sends back an HTTP 200 response\.
The following data is returned in JSON format by the service\.
** [datasets](#API_ListDatasets_ResponseSyntax) ** <a name="personalize-ListDatasets-response-datasets"></a>
An array of `Dataset` objects\. Each object provides metadata information\.
Type: Array of [DatasetSummary](API_DatasetSummary.md) objects
Array Members: Maximum number of 100 items\.
** [nextToken](#API_ListDatasets_ResponseSyntax) ** <a name="personalize-ListDatasets-response-nextToken"></a>
A token for getting the next set of datasets \(if they exist\)\.
Type: String
Length Constraints: Maximum length of 1300\.
**InvalidInputException**
Provide a valid value for the field or parameter\.
HTTP Status Code: 400
**InvalidNextTokenException**
The token is not valid\.
HTTP Status Code: 400
For more information about using this API in one of the language\-specific AWS SDKs, see the following:
+ [AWS Command Line Interface](https://docs.aws.amazon.com/goto/aws-cli/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for \.NET](https://docs.aws.amazon.com/goto/DotNetSDKV3/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for C\+\+](https://docs.aws.amazon.com/goto/SdkForCpp/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for Go](https://docs.aws.amazon.com/goto/SdkForGoV1/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for Java](https://docs.aws.amazon.com/goto/SdkForJava/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for JavaScript](https://docs.aws.amazon.com/goto/AWSJavaScriptSDK/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for PHP V3](https://docs.aws.amazon.com/goto/SdkForPHPV3/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for Python](https://docs.aws.amazon.com/goto/boto3/personalize-2018-05-22/ListDatasets)
+ [AWS SDK for Ruby V3](https://docs.aws.amazon.com/goto/SdkForRubyV3/personalize-2018-05-22/ListDatasets)
Hyperparameters are used to optimize the trained model and are set before training begins\. This contrasts with model parameters, whose values are determined during the training process\.
Hyperparameters are specified using the `algorithmHyperParameters` key that is part of the [SolutionConfig](API_SolutionConfig.md) object that is passed to the [CreateSolution](API_CreateSolution.md) operation\.
Different recipes use different hyperparameters\. For the available hyperparameters, see the individual recipes in [Choosing a Recipe](working-with-predefined-recipes.md)\.
Hyperparameter optimization \(HPO\), or tuning, is the task of choosing optimal hyperparameters for a specific learning objective\. The optimal hyperparameters are determined by running many training jobs using different values from the specified ranges of possibilities\. By default, Amazon Personalize does not perform HPO\. To use HPO, set `performHPO` to `true`, and include the `hpoConfig` object\.
Hyperparameters can be categorical, continuous, or integer\-valued\. The `hpoConfig` object has keys that correspond to each of these types, where you specify the hyperparameters and their ranges\. Note that not all hyperparameters can be tuned \(see the recipe tables\)\.
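The three range types can be sketched as a small validation helper\. The field names follow the `hpoConfig` keys described above; the helper itself is illustrative, not part of the service:

```python
def in_range(name, value, ranges):
    """Check a candidate hyperparameter value against hpoConfig-style ranges."""
    for r in ranges.get("categoricalHyperParameterRanges", []):
        if r["name"] == name:
            return value in r["values"]
    for key in ("integerHyperParameterRanges", "continuousHyperParameterRanges"):
        for r in ranges.get(key, []):
            if r["name"] == name:
                return r["minValue"] <= value <= r["maxValue"]
    return False  # hyperparameter not declared in any range

ranges = {
    "categoricalHyperParameterRanges": [
        {"name": "recency_mask", "values": ["true", "false"]}
    ],
    "integerHyperParameterRanges": [
        {"name": "bptt", "minValue": 20, "maxValue": 40}
    ],
}
print(in_range("bptt", 30, ranges))               # True
print(in_range("recency_mask", "maybe", ranges))  # False
```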
The following is a partial example of a `CreateSolution` request using the [HRNN](native-recipe-hrnn.md) recipe\.
```
{
  "performAutoML": false,
  "recipeArn": "arn:aws:personalize:::recipe/aws-hrnn",
  "performHPO": true,
  "solutionConfig": {
    "algorithmHyperParameters": {
      "hidden_dimension": "55"
    },
    "hpoConfig": {
      "algorithmHyperParameterRanges": {
        "categoricalHyperParameterRanges": [
          {
            "name": "recency_mask",
            "values": [ "true", "false" ]
          }
        ],
        "integerHyperParameterRanges": [
          {
            "name": "bptt",
            "minValue": 20,
            "maxValue": 40
          }
        ]
      },
      "hpoResourceConfig": {
        "maxNumberOfTrainingJobs": "4",
        "maxParallelTrainingJobs": "2"
      }
    }
  }
}
```
Once training is complete, you can view the hyperparameters of the best performing model by calling the [DescribeSolutionVersion](API_DescribeSolutionVersion.md) operation\. The following sample shows a condensed `DescribeSolutionVersion` output with the optimized hyperparameters displayed in the `tunedHPOParams` object\.
```
{
  "solutionVersion": {
    "creationDateTime": 1562191944.745,
    "datasetGroupArn": "arn:aws:personalize:us-west-2:000000000000:dataset-group/hpo",
    "lastUpdatedDateTime": 1562194465.075,
    "performAutoML": false,
    "performHPO": true,
    "recipeArn": "arn:aws:personalize:::recipe/aws-hrnn",
    "solutionArn": "arn:aws:personalize:us-west-2:000000000000:solution/hpo",
    "solutionVersionArn": "arn:aws:personalize:us-west-2:000000000000:solution/hpo/5a515609",
    "status": "ACTIVE",
    "tunedHPOParams": {
      "algorithmHyperParameters": {
        "hidden_dimension": "58",
        "recency_mask": "false"
      }
    }
  }
}
```
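The tuned values can be pulled out of the describe response with plain dictionary access\. The sketch below reuses the condensed output above \(the response dict is abbreviated to the relevant keys\):

```python
# Condensed DescribeSolutionVersion response from above (abbreviated)
response = {
    "solutionVersion": {
        "performHPO": True,
        "tunedHPOParams": {
            "algorithmHyperParameters": {
                "hidden_dimension": "58",
                "recency_mask": "false",
            }
        },
    }
}

# Extract the hyperparameters of the best performing model
tuned = response["solutionVersion"]["tunedHPOParams"]["algorithmHyperParameters"]
print(tuned["hidden_dimension"])  # 58
```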
For more information, see [Automatic Model Tuning](https://docs.aws.amazon.com/sagemaker/latest/dg/automatic-model-tuning.html)\.
Describes a filter's properties\.
```
{
  "filterArn": "string"
}
```
The request accepts the following data in JSON format\.
** [filterArn](#API_DescribeFilter_RequestSyntax) ** <a name="personalize-DescribeFilter-request-filterArn"></a>
The ARN of the filter to describe\.
Type: String
Length Constraints: Maximum length of 256\.
Pattern: `arn:([a-z\d-]+):personalize:.*:.*:.+`
Required: Yes
```
{
  "filter": {
    "creationDateTime": number,
    "datasetGroupArn": "string",
    "failureReason": "string",
    "filterArn": "string",
    "filterExpression": "string",
    "lastUpdatedDateTime": number,
    "name": "string",
    "status": "string"
  }
}
```
If the action is successful, the service sends back an HTTP 200 response\.
The following data is returned in JSON format by the service\.
** [filter](#API_DescribeFilter_ResponseSyntax) ** <a name="personalize-DescribeFilter-response-filter"></a>
The filter's details\.
Type: [Filter](API_Filter.md) object
**InvalidInputException**
Provide a valid value for the field or parameter\.
HTTP Status Code: 400
**ResourceNotFoundException**
Could not find the specified resource\.
HTTP Status Code: 400
For more information about using this API in one of the language\-specific AWS SDKs, see the following:
+ [AWS Command Line Interface](https://docs.aws.amazon.com/goto/aws-cli/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for \.NET](https://docs.aws.amazon.com/goto/DotNetSDKV3/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for C\+\+](https://docs.aws.amazon.com/goto/SdkForCpp/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for Go](https://docs.aws.amazon.com/goto/SdkForGoV1/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for Java](https://docs.aws.amazon.com/goto/SdkForJava/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for JavaScript](https://docs.aws.amazon.com/goto/AWSJavaScriptSDK/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for PHP V3](https://docs.aws.amazon.com/goto/SdkForPHPV3/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for Python](https://docs.aws.amazon.com/goto/boto3/personalize-2018-05-22/DescribeFilter)
+ [AWS SDK for Ruby V3](https://docs.aws.amazon.com/goto/SdkForRubyV3/personalize-2018-05-22/DescribeFilter)
Gets a list of the batch inference jobs that have been created from a solution version\.
```
{
  "maxResults": number,
  "nextToken": "string",
  "solutionVersionArn": "string"
}
```
The request accepts the following data in JSON format\.
** [maxResults](#API_ListBatchInferenceJobs_RequestSyntax) ** <a name="personalize-ListBatchInferenceJobs-request-maxResults"></a>
The maximum number of batch inference job results to return in each page\. The default value is 100\.
Type: Integer
Valid Range: Minimum value of 1\. Maximum value of 100\.
Required: No
** [nextToken](#API_ListBatchInferenceJobs_RequestSyntax) ** <a name="personalize-ListBatchInferenceJobs-request-nextToken"></a>
The token to request the next page of results\.
Type: String
Length Constraints: Maximum length of 1300\.
Required: No
** [solutionVersionArn](#API_ListBatchInferenceJobs_RequestSyntax) ** <a name="personalize-ListBatchInferenceJobs-request-solutionVersionArn"></a>
The Amazon Resource Name \(ARN\) of the solution version from which the batch inference jobs were created\.
Type: String
Length Constraints: Maximum length of 256\.
Pattern: `arn:([a-z\d-]+):personalize:.*:.*:.+`
Required: No
```
{
  "batchInferenceJobs": [
    {
      "batchInferenceJobArn": "string",
      "creationDateTime": number,
      "failureReason": "string",
      "jobName": "string",
      "lastUpdatedDateTime": number,
      "solutionVersionArn": "string",
      "status": "string"
    }
  ],
  "nextToken": "string"
}
```
If the action is successful, the service sends back an HTTP 200 response\.
The following data is returned in JSON format by the service\.
** [batchInferenceJobs](#API_ListBatchInferenceJobs_ResponseSyntax) ** <a name="personalize-ListBatchInferenceJobs-response-batchInferenceJobs"></a>
A list containing information on each job that is returned\.
Type: Array of [BatchInferenceJobSummary](API_BatchInferenceJobSummary.md) objects
Array Members: Maximum number of 100 items\.
** [nextToken](#API_ListBatchInferenceJobs_ResponseSyntax) ** <a name="personalize-ListBatchInferenceJobs-response-nextToken"></a>
The token to use to retrieve the next page of results\. The value is `null` when there are no more results to return\.
Type: String
Length Constraints: Maximum length of 1300\.
**InvalidInputException**
Provide a valid value for the field or parameter\.
HTTP Status Code: 400
**InvalidNextTokenException**
The token is not valid\.
HTTP Status Code: 400
For more information about using this API in one of the language\-specific AWS SDKs, see the following:
+ [AWS Command Line Interface](https://docs.aws.amazon.com/goto/aws-cli/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for \.NET](https://docs.aws.amazon.com/goto/DotNetSDKV3/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for C\+\+](https://docs.aws.amazon.com/goto/SdkForCpp/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for Go](https://docs.aws.amazon.com/goto/SdkForGoV1/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for Java](https://docs.aws.amazon.com/goto/SdkForJava/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for JavaScript](https://docs.aws.amazon.com/goto/AWSJavaScriptSDK/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for PHP V3](https://docs.aws.amazon.com/goto/SdkForPHPV3/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for Python](https://docs.aws.amazon.com/goto/boto3/personalize-2018-05-22/ListBatchInferenceJobs)
+ [AWS SDK for Ruby V3](https://docs.aws.amazon.com/goto/SdkForRubyV3/personalize-2018-05-22/ListBatchInferenceJobs)
For the latest AWS terminology, see the [AWS glossary](https://docs.aws.amazon.com/general/latest/gr/glos-chap.html) in the *AWS General Reference*\.
You can use an Amazon ECS action to deploy an Amazon ECS service and task set\. An Amazon ECS service is a container application that is deployed to an Amazon ECS cluster\. An Amazon ECS cluster is a collection of instances that host your container application in the cloud\. The deployment requires a task definition that you create in Amazon ECS and an image definitions file that CodePipeline uses to deploy the image\.
Before you create your pipeline, you must have already created the Amazon ECS resources, tagged and stored the image in your image repository, and uploaded the BuildSpec file to your file repository\.
**Note**
This reference topic describes the Amazon ECS standard deployment action for CodePipeline\. For reference information about Amazon ECS to CodeDeploy blue/green deployment actions in CodePipeline, see [Amazon Elastic Container Service and CodeDeploy Blue\-Green](action-reference-ECSbluegreen.md)\.
**Topics**
+ [Action Type](#action-reference-ECS-type)
+ [Configuration Parameters](#action-reference-ECS-config)
+ [Input Artifacts](#action-reference-ECS-input)
+ [Output Artifacts](#action-reference-ECS-output)
+ [Action Declaration](#action-reference-ECS-example)
+ [See Also](#action-reference-ECS-links)
+ Category: `Deploy`
+ Owner: `AWS`
+ Provider: `ECS`
+ Version: `1`
**ClusterName**
Required: Yes
The Amazon ECS cluster that you created in Amazon ECS\.
**ServiceName**
Required: Yes
The Amazon ECS service that you created in Amazon ECS\.
**FileName**
Required: No
The name of your image definitions file, the JSON file that describes your service's container name and the image and tag\. You use this file for ECS standard deployments\. For more information, see [Input Artifacts](#action-reference-ECS-input) and [imagedefinitions\.json file for Amazon ECS standard deployment actions](file-reference.md#pipelines-create-image-definitions)\.
**DeploymentTimeout**
Required: No
The Amazon ECS deployment action timeout in minutes\. The timeout is configurable up to the maximum default timeout for this action\. For example:
```
"DeploymentTimeout": "15"
```
+ **Number of Artifacts:** `1`
+ **Description:** The action looks for an `imagedefinitions.json` file in the source file repository for the pipeline\. An image definitions document is a JSON file that describes your Amazon ECS container name and the image and tag\. CodePipeline uses the file to retrieve the image from your image repository such as Docker Hub or Amazon ECR\. You can manually add an `imagedefinitions.json` file for a pipeline where the action is not automated\. For information about the `imagedefinitions.json` file, see [imagedefinitions\.json file for Amazon ECS standard deployment actions](file-reference.md#pipelines-create-image-definitions)\.
The action requires an existing image that has already been pushed to your image repository\. Because the image mapping is provided by the `imagedefinitions.json` file, the action does not require that the Amazon ECR source be included as a source action in the pipeline\.
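The image definitions document described above is a JSON array of container\-name/image pairs, and it is often generated in a build step\. The following is a minimal sketch; the container name and image URI are placeholders, not values from the original text:

```python
import json

def write_image_definitions(path, container_name, image_uri):
    """Write the imagedefinitions.json document used by ECS standard deployment actions."""
    doc = [{"name": container_name, "imageUri": image_uri}]
    with open(path, "w") as f:
        json.dump(doc, f)
    return doc

# Placeholder container name and image URI for illustration
doc = write_image_definitions(
    "imagedefinitions.json",
    "sample-app",
    "111111111111.dkr.ecr.us-west-2.amazonaws.com/sample-app:latest",
)
print(doc[0]["name"])  # sample-app
```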
+ **Number of Artifacts:** `0`
+ **Description:** Output artifacts do not apply for this action type\.
------
```
Name: DeployECS
ActionTypeId:
  Category: Deploy
  Owner: AWS
  Provider: ECS
  Version: '1'
RunOrder: 2
Configuration:
  ClusterName: my-ecs-cluster
  ServiceName: sample-app-service
  FileName: imagedefinitions.json
  DeploymentTimeout: '15'
OutputArtifacts: []
InputArtifacts:
  - Name: my-image
```
------
```
{
  "Name": "DeployECS",
  "ActionTypeId": {
    "Category": "Deploy",
    "Owner": "AWS",
    "Provider": "ECS",
    "Version": "1"
  },
  "RunOrder": 2,
  "Configuration": {
    "ClusterName": "my-ecs-cluster",
    "ServiceName": "sample-app-service",
    "FileName": "imagedefinitions.json",
    "DeploymentTimeout": "15"
  },
  "OutputArtifacts": [],
  "InputArtifacts": [
    {
      "Name": "my-image"
    }
  ]
},
```
------
80e6c78a8b7b-0 | The following related resources can help you as you work with this action\.
+ [Tutorial: Continuous Deployment with CodePipeline](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/ecs-cd-pipeline.html) – This tutorial shows you how to create a Dockerfile that you store in a source file repository such as CodeCommit\. Next, the tutorial shows you how to incorporate a CodeBuild BuildSpec file that builds and pushes your Docker image to Amazon ECR and creates your imagedefinitions\.json file\. Finally, you create an Amazon ECS service and task definition, and then you create your pipeline with an Amazon ECS deployment action\.
**Note**
This topic and tutorial describe the Amazon ECS standard deployment action for CodePipeline\. For information about Amazon ECS to CodeDeploy blue/green deployment actions in CodePipeline, see [Tutorial: Create a pipeline with an Amazon ECR source and ECS\-to\-CodeDeploy deployment](tutorials-ecs-ecr-codedeploy.md)\.
+ *Amazon Elastic Container Service Developer Guide* – For information about working with Docker images and containers, Amazon ECS services and clusters, and Amazon ECS task sets, see [What Is Amazon ECS?](https://docs.aws.amazon.com/AmazonECS/latest/developerguide/) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/action-reference-ECS.md |
87ed25ca0b90-0 | The integrations information in this topic is organized by CodePipeline action type\.
**Topics**
+ [Source action integrations](#integrations-source)
+ [Build action integrations](#integrations-build)
+ [Test action integrations](#integrations-test)
+ [Deploy action integrations](#integrations-deploy)
+ [Approval action integrations](#integrations-approval)
+ [Invoke action integrations](#integrations-invoke) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
e5b8f2da9749-0 | The following information is organized by CodePipeline action type and can help you configure CodePipeline to integrate with the products and services you use\.
| Integration | Description |
| --- | --- |
| Amazon Simple Storage Service \(Amazon S3\) | [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/user-guide/) is storage for the internet\. You can use Amazon S3 to store and retrieve any amount of data at any time, from anywhere on the web\. You can configure CodePipeline to use a versioned Amazon S3 bucket as the source stage for your code\. Create the bucket and enable versioning on it\. Then you can create a pipeline that uses the bucket as part of a source action in a stage\. Amazon S3 can also be included in a pipeline as a deploy action\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
e5b8f2da9749-1 | | AWS CodeCommit | [CodeCommit](https://docs.aws.amazon.com/codecommit/latest/userguide/) is a version control service that you can use to privately store and manage assets \(such as documents, source code, and binary files\) in the cloud\. You can configure CodePipeline to use a branch in a CodeCommit repository as the source stage for your code\. Create the repository and associate it with a working directory on your local machine\. Then you can create a pipeline that uses the branch as part of a source action in a stage\. You can connect to the CodeCommit repository by either creating a pipeline or editing an existing one\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
e5b8f2da9749-2 | | GitHub | You can configure CodePipeline to use a [GitHub](https://github.com/) repository as the source stage for your code\. You must have previously created a GitHub account and at least one GitHub repository\. You can connect to the GitHub repository by either creating a pipeline or editing an existing one\. CodePipeline integration with GitHub Enterprise is not supported\. For action parameters and definitions, see the action structure reference page for [GitHub](action-reference-GitHub.md)\. The first time you add a GitHub repository to a pipeline, you are prompted to authorize CodePipeline access to your repositories\. To integrate with GitHub, CodePipeline creates an OAuth application for your pipeline\. If you create or edit your pipeline in the console, CodePipeline creates a GitHub webhook that starts your pipeline when a change occurs in the repository\. The token and webhook require the following GitHub scopes: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) For more information about GitHub scopes, see the [GitHub Developer API Reference](https://developer.github.com/v3/oauth/#scopes)\. Access for CodePipeline is configured for all repositories to which that GitHub account has access\. It cannot currently be configured for individual repositories\. To revoke this access from GitHub, choose **Settings**, and then choose **Applications**\. Under **Authorized applications**, find CodePipeline in the list of authorized applications, and then choose **Revoke**\. Revoking access immediately prevents CodePipeline from accessing any GitHub repositories previously configured for access with that GitHub account\. If you want to limit the access CodePipeline has to repositories, create a GitHub account, grant that account access only to those repositories you want to integrate with CodePipeline, and then use that account when you configure CodePipeline to use GitHub repositories for source stages in pipelines\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
e5b8f2da9749-7 | | Amazon ECR | [Amazon ECR](https://docs.aws.amazon.com/AmazonECR/latest/userguide/) is an AWS Docker image repository service\. You use Docker push and pull commands to upload Docker images to your repository\. An Amazon ECR repository URI and image are used in Amazon ECS task definitions to reference source image information\.Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| AWS CodeStar Connections \(Connections to Bitbucket\) | You can set up resources called connections to allow your pipelines to access third\-party code repositories like Bitbucket\. When you create a connection, you install the AWS CodeStar app with your third\-party code repository, and then associate it with your connection\.Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
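To make the table concrete, here is a sketch of what an Amazon S3 source action might look like in a pipeline definition, in the same YAML form used for the examples in this guide; the bucket name, object key, and artifact name are placeholders:

```
Name: SourceFromS3
ActionTypeId:
  Category: Source
  Owner: AWS
  Provider: S3
  Version: '1'
OutputArtifacts:
  - Name: SourceOutput
Configuration:
  S3Bucket: my-source-bucket
  S3ObjectKey: sample-app.zip
  PollForSourceChanges: 'false'
```

Setting `PollForSourceChanges` to `false` is typical when change detection is handled by an Amazon CloudWatch Events rule instead of polling\.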
0f040488642f-0 | | Integration | Description |
| --- | --- |
| [AWS CodeBuild](action-reference-CodeBuild.md) | [CodeBuild](http://aws.amazon.com/codebuild/) is a fully managed build service that compiles your source code, runs unit tests, and produces artifacts that are ready to deploy\. You can add CodeBuild as a build action to the build stage of a pipeline\. CodeBuild can also be included in a pipeline as a test action, with or without a build output\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| CloudBees | You can configure CodePipeline to use [CloudBees](http://www.cloudbees.com) to build or test your code in one or more actions in a pipeline\.Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
0f040488642f-1 | | Jenkins | You can configure CodePipeline to use [Jenkins CI](https://jenkins-ci.org/) to build or test your code in one or more actions in a pipeline\. You must have previously created a Jenkins project and installed and configured the CodePipeline Plugin for Jenkins for that project\. You can connect to the Jenkins project by either creating a new pipeline or editing an existing one\. Access for Jenkins is configured on a per\-project basis\. You must install the CodePipeline Plugin for Jenkins on every Jenkins instance you want to use with CodePipeline\. You must also configure CodePipeline access to the Jenkins project\. Secure your Jenkins project by configuring it to accept HTTPS/SSL connections only\. If your Jenkins project is installed on an Amazon EC2 instance, consider providing your AWS credentials by installing the AWS CLI on each instance\. Then configure an AWS profile on those instances with the IAM user profile and AWS credentials you want to use for connections\. This is an alternative to adding and storing them through the Jenkins web interface\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
0f040488642f-2 | | TeamCity | You can configure CodePipeline to use [TeamCity](https://www.jetbrains.com/teamcity/) to build and test your code in one or more actions in a pipeline\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
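As a sketch, a CodeBuild build action in a pipeline definition mainly needs the name of an existing build project; the project and artifact names below are placeholders:

```
Name: BuildApp
ActionTypeId:
  Category: Build
  Owner: AWS
  Provider: CodeBuild
  Version: '1'
InputArtifacts:
  - Name: SourceOutput
OutputArtifacts:
  - Name: BuildOutput
Configuration:
  ProjectName: my-build-project
```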
069fcebfc7c9-0 | | Integration | Description |
| --- | --- |
| [AWS CodeBuild](action-reference-CodeBuild.md) | [CodeBuild](http://aws.amazon.com/codebuild/) is a fully managed build service in the cloud\. CodeBuild compiles your source code, runs unit tests, and produces artifacts that are ready to deploy\. You can add CodeBuild to a pipeline as a test action\. For more information, see the CodePipeline Action Configuration Reference for [AWS CodeBuild](action-reference-CodeBuild.md)\. CodeBuild can also be included in a pipeline as a build action, with a mandatory build output artifact\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
069fcebfc7c9-1 | | AWS Device Farm | [AWS Device Farm](http://aws.amazon.com/devicefarm/) is an app testing service that you can use to test and interact with your Android, iOS, and web applications on real, physical phones and tablets\. You can configure CodePipeline to use AWS Device Farm to test your code in one or more actions in a pipeline\. AWS Device Farm allows you to upload your own tests or use built\-in, script\-free compatibility tests\. Because testing is performed in parallel, tests on multiple devices begin in minutes\. A test report that contains high\-level results, low\-level logs, pixel\-to\-pixel screenshots, and performance data is updated as tests are completed\. AWS Device Farm supports testing of native and hybrid Android, iOS, and Fire OS apps, including those created with PhoneGap, Titanium, Xamarin, Unity, and other frameworks\. It supports remote access of Android apps, which allows you to interact directly with test devices\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| BlazeMeter | You can configure CodePipeline to use [BlazeMeter](https://blazemeter.com/) to test your code in one or more actions in a pipeline\. Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
069fcebfc7c9-2 | | Ghost Inspector | You can configure CodePipeline to use [Ghost Inspector](https://ghostinspector.com/) to test your code in one or more actions in a pipeline\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| Micro Focus StormRunner Load | You can configure CodePipeline to use [Micro Focus StormRunner Load](https://software.microfocus.com/en-us/products/stormrunner-load-agile-cloud-testing/overview) in one or more actions in a pipeline\.Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| Nouvola | You can configure CodePipeline to use [Nouvola](http://www.nouvola.com/aws-codepipeline-plugin/) to test your code in one or more actions in a pipeline\.Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
069fcebfc7c9-3 | | Runscope | You can configure CodePipeline to use [Runscope](https://www.runscope.com/) to test your code in one or more actions in a pipeline\.Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
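A CodeBuild test action is configured much like a build action, except that the category is `Test` and a build output artifact is optional; the names below are placeholders:

```
Name: TestApp
ActionTypeId:
  Category: Test
  Owner: AWS
  Provider: CodeBuild
  Version: '1'
InputArtifacts:
  - Name: SourceOutput
Configuration:
  ProjectName: my-test-project
```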
cefe289f7459-0 | | Integration | Description |
| --- | --- |
| Amazon Simple Storage Service \(Amazon S3\) | [Amazon S3](https://docs.aws.amazon.com/AmazonS3/latest/user-guide/) is storage for the internet\. You can use Amazon S3 to store and retrieve any amount of data at any time, from anywhere on the web\. You can add an action to a pipeline that uses Amazon S3 as a deployment provider\. Amazon S3 can also be included in a pipeline as a source action\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| AWS AppConfig | AWS AppConfig is a capability of AWS Systems Manager to create, manage, and quickly deploy application configurations\. You can use AppConfig with applications hosted on EC2 instances, AWS Lambda, containers, mobile applications, or IoT devices\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
cefe289f7459-1 | | [AWS CloudFormation](action-reference-CloudFormation.md) | [AWS CloudFormation](https://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/) gives developers and systems administrators an easy way to create and manage a collection of related AWS resources, using templates to provision and update those resources\. You can use the service’s sample templates or create your own\. Templates describe the AWS resources and any dependencies or runtime parameters required to run your application\. The AWS Serverless Application Model \(AWS SAM\) extends AWS CloudFormation to provide a simplified way to define and deploy serverless applications\. AWS SAM supports Amazon API Gateway APIs, AWS Lambda functions, and Amazon DynamoDB tables\. You can use CodePipeline with AWS CloudFormation and the AWS SAM to continuously deliver your serverless applications\. You can add an action to a pipeline that uses AWS CloudFormation as a deployment provider\. When you use AWS CloudFormation as a deployment provider, you can take action on AWS CloudFormation stacks and change sets as part of a pipeline execution\. AWS CloudFormation can create, update, replace, and delete stacks and change sets when a pipeline runs\. As a result, AWS and custom resources can be created, provisioned, updated, or terminated during a pipeline execution according to the specifications you provide in AWS CloudFormation templates and parameter definitions\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
cefe289f7459-2 | | AWS CodeDeploy | [CodeDeploy](https://docs.aws.amazon.com/codedeploy/latest/userguide/) coordinates application deployments to Amazon EC2 instances, on\-premises instances, or both\. You can configure CodePipeline to use CodeDeploy to deploy your code\. You can create the CodeDeploy application, deployment, and deployment group to use in a deploy action in a stage either before you create the pipeline or when you use the **Create Pipeline** wizard\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| Amazon Elastic Container Service | Amazon ECS is a highly scalable, high performance container management service that allows you to run container\-based applications in the AWS Cloud\. When you create a pipeline, you can select Amazon ECS as a deployment provider\. A change to code in your source control repository triggers your pipeline to build a new Docker image, push it to your container registry, and then deploy the updated image to Amazon ECS\. You can also use the **ECS \(Blue/Green\)** provider action in CodePipeline to route and deploy blue/green deployments to Amazon ECS with CodeDeploy\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
cefe289f7459-3 | | AWS Elastic Beanstalk | [Elastic Beanstalk](https://docs.aws.amazon.com/elasticbeanstalk/latest/dg/) is a service for deploying and scaling web applications and services developed with Java, \.NET, PHP, Node\.js, Python, Ruby, Go, and Docker on familiar servers such as Apache, Nginx, Passenger, and IIS\. You can configure CodePipeline to use Elastic Beanstalk to deploy your code\. You can create the Elastic Beanstalk application and environment to use in a deploy action in a stage either before you create the pipeline or when you use the **Create Pipeline** wizard\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
cefe289f7459-4 | | AWS OpsWorks Stacks | AWS OpsWorks is a configuration management service that helps you configure and operate applications of all shapes and sizes using Chef\. Using AWS OpsWorks Stacks, you can define the application’s architecture and the specification of each component including package installation, software configuration and resources such as storage\. You can configure CodePipeline to use AWS OpsWorks Stacks to deploy your code in conjunction with custom Chef cookbooks and applications in AWS OpsWorks\. [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) Before you create the pipeline, you create the AWS OpsWorks stack and layer\. You can create the AWS OpsWorks application to use in a deploy action in a stage either before you create the pipeline or when you use the **Create Pipeline** wizard\. CodePipeline support for AWS OpsWorks is currently available in the US East \(N\. Virginia\) Region \(us\-east\-1\) only\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
cefe289f7459-5 | | AWS Service Catalog | [AWS Service Catalog](https://docs.aws.amazon.com/servicecatalog/latest/dg/) enables organizations to create and manage catalogs of products that are approved for use on AWS\. You can configure CodePipeline to deploy updates and versions of your product templates to AWS Service Catalog\. You can create the AWS Service Catalog product to use in a deployment action and then use the **Create Pipeline** wizard to create the pipeline\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| Alexa Skills Kit | [Amazon Alexa Skills Kit](https://developer.amazon.com/docs/custom-skills/use-the-alexa-skills-kit-samples.html) lets you build and distribute cloud\-based skills to users of Alexa\-enabled devices\. You can add an action to a pipeline that uses Alexa Skills Kit as a deployment provider\. Source changes are detected by your pipeline, and then your pipeline deploys updates to your Alexa skill in the Alexa service\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
cefe289f7459-6 | | XebiaLabs | You can configure CodePipeline to use [XebiaLabs](https://xebialabs.com/) to deploy your code in one or more actions in a pipeline\. Learn more:[\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
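For comparison with the table above, a CodeDeploy deploy action references an existing application and deployment group; the names below are placeholders:

```
Name: DeployApp
ActionTypeId:
  Category: Deploy
  Owner: AWS
  Provider: CodeDeploy
  Version: '1'
InputArtifacts:
  - Name: BuildOutput
Configuration:
  ApplicationName: my-codedeploy-app
  DeploymentGroupName: my-deployment-group
```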
9161a4947a7e-0 | | Integration | Description |
| --- | --- |
| Amazon Simple Notification Service | [Amazon SNS](https://docs.aws.amazon.com/sns/latest/gsg/) is a fast, flexible, fully managed push notification service that lets you send individual messages or to fan out messages to large numbers of recipients\. Amazon SNS makes it simple and cost effective to send push notifications to mobile device users, email recipients or even send messages to other distributed services\. When you create a manual approval request in CodePipeline, you can optionally publish to a topic in Amazon SNS so that all IAM users subscribed to it are notified that the approval action is ready to be reviewed\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
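A manual approval action that publishes to an Amazon SNS topic can be sketched as follows; the topic ARN and message text are placeholders:

```
Name: ApproveDeploy
ActionTypeId:
  Category: Approval
  Owner: AWS
  Provider: Manual
  Version: '1'
Configuration:
  NotificationArn: arn:aws:sns:us-west-2:111222333444:MyApprovalTopic
  CustomData: Review the staging deployment before approving.
```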
9cc5134ab4b3-0 | | Integration | Description |
| --- | --- |
| [AWS Lambda](action-reference-Lambda.md) | [Lambda](https://docs.aws.amazon.com/lambda/latest/dg/) lets you run code without provisioning or managing servers\. You can configure CodePipeline to use Lambda functions to add flexibility and functionality to your pipelines\. You can create the Lambda function to add as an action in a stage either before you create the pipeline or when you use the **Create Pipeline** wizard\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) |
| [AWS Step Functions](action-reference-StepFunctions.md) | [Step Functions](https://docs.aws.amazon.com/step-functions/latest/dg/) lets you create and configure state machines\. You can configure CodePipeline to use Step Functions invoke actions to trigger state machine executions\. Learn more: [\[See the AWS documentation website for more details\]](http://docs.aws.amazon.com/codepipeline/latest/userguide/integrations-action-type.html) | | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/integrations-action-type.md |
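As a sketch, a Lambda invoke action names the function to run and can pass user parameters to it; the function name and parameter value below are placeholders:

```
Name: InvokeNotifier
ActionTypeId:
  Category: Invoke
  Owner: AWS
  Provider: Lambda
  Version: '1'
RunOrder: 1
Configuration:
  FunctionName: my-pipeline-function
  UserParameters: https://example.com
```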
9f1bd9222aea-0 | You can attach policies to IAM identities\. For example, you can do the following:
+ **Attach a permissions policy to a user or a group in your account** – To grant a user permissions to view pipelines in the CodePipeline console, you can attach a permissions policy to a user or group that the user belongs to\.
+ **Attach a permissions policy to a role \(grant cross\-account permissions\)** – You can attach an identity\-based permissions policy to an IAM role to grant cross\-account permissions\. For example, the administrator in Account A can create a role to grant cross\-account permissions to another AWS account \(for example, Account B\) or an AWS service as follows:
1. Account A administrator creates an IAM role and attaches a permissions policy to the role that grants permissions on resources in Account A\.
1. Account A administrator attaches a trust policy to the role identifying Account B as the principal who can assume the role\.
1. Account B administrator can then delegate permissions to assume the role to any users in Account B\. Doing this allows users in Account B to create or access resources in Account A\. The principal in the trust policy can also be an AWS service principal if you want to grant an AWS service permissions to assume the role\.
For more information about using IAM to delegate permissions, see [Access Management](https://docs.aws.amazon.com/IAM/latest/UserGuide/access.html) in the *IAM User Guide*\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security-iam-id-policies-examples.md |
9f1bd9222aea-1 | The following shows an example of a permissions policy that allows a user to enable and disable all stage transitions in the pipeline named `MyFirstPipeline` in the `us-west-2` region:
```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "codepipeline:EnableStageTransition",
        "codepipeline:DisableStageTransition"
      ],
      "Resource": [
        "arn:aws:codepipeline:us-west-2:111222333444:MyFirstPipeline"
      ]
    }
  ]
}
``` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security-iam-id-policies-examples.md |
The following example shows a policy in the 111222333444 account that allows users to view, but not change, the pipeline named `MyFirstPipeline` in the CodePipeline console\. This policy is based on the `AWSCodePipelineReadOnlyAccess` managed policy, but because it is specific to the `MyFirstPipeline` pipeline, it cannot use the managed policy directly\. If you do not want to restrict the policy to a specific pipeline, consider using one of the managed policies created and maintained by CodePipeline\. For more information, see [Working with Managed Policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/access_policies_managed-using.html)\. You must attach this policy to an IAM role you create for access, for example, a role named `CrossAccountPipelineViewers`:
```
{
  "Statement": [
    {
      "Action": [
        "codepipeline:GetPipeline",
        "codepipeline:GetPipelineState",
        "codepipeline:GetPipelineExecution",
        "codepipeline:ListPipelineExecutions",
        "codepipeline:ListActionTypes",
        "codepipeline:ListPipelines",
        "iam:ListRoles",
        "s3:GetBucketPolicy",
        "s3:GetObject",
        "s3:ListAllMyBuckets",
        "s3:ListBucket",
        "codecommit:ListBranches",
        "codecommit:ListRepositories",
        "codedeploy:GetApplication",
        "codedeploy:GetDeploymentGroup",
        "codedeploy:ListApplications",
        "codedeploy:ListDeploymentGroups",
        "elasticbeanstalk:DescribeApplications",
        "elasticbeanstalk:DescribeEnvironments",
        "lambda:GetFunctionConfiguration",
        "lambda:ListFunctions",
        "opsworks:DescribeApps",
        "opsworks:DescribeLayers",
        "opsworks:DescribeStacks"
      ],
      "Effect": "Allow",
      "Resource": "arn:aws:codepipeline:us-west-2:111222333444:MyFirstPipeline"
    }
  ],
  "Version": "2012-10-17"
}
``` | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security-iam-id-policies-examples.md |
After you create this policy, create the IAM role in the 111222333444 account and attach the policy to that role\. In the role's trust relationships, you must add the AWS account that will assume this role\. The following example shows a policy that allows users from the *111111111111* AWS account to assume roles defined in the 111222333444 account:
```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "AWS": "arn:aws:iam::111111111111:root"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
```
The following example shows a policy created in the *111111111111* AWS account that allows users to assume the role named *CrossAccountPipelineViewers* in the 111222333444 account:
```
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::111222333444:role/CrossAccountPipelineViewers"
    }
  ]
}
```
You can create IAM policies to restrict the calls and resources that users in your account have access to, and then attach those policies to IAM users\. For more information about how to create IAM roles and to explore example IAM policy statements for CodePipeline, see [Customer managed policy examples](customer-managed-policies.md)\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/security-iam-id-policies-examples.md |
55dc983b7243-0 | To use AWS CloudFormation to create a rule, update your template as shown here\.<a name="proc-cfn-event-codecommit"></a>
**To update your pipeline AWS CloudFormation template and create CloudWatch Events rule**
1. In the template, under `Resources`, use the `AWS::IAM::Role` AWS CloudFormation resource to configure the IAM role that allows your event to start your pipeline\. This entry creates a role that uses two policies:
+ The first policy allows the role to be assumed\.
+ The second policy provides permissions to start the pipeline\.
**Why am I making this change?** Adding the `AWS::IAM::Role` resource enables AWS CloudFormation to create permissions for CloudWatch Events\. This resource is added to your AWS CloudFormation stack\.
------
```
AmazonCloudWatchEventRole:
  Type: AWS::IAM::Role
  Properties:
    AssumeRolePolicyDocument:
      Version: 2012-10-17
      Statement:
        -
          Effect: Allow
          Principal:
            Service:
              - events.amazonaws.com
          Action: sts:AssumeRole
    Path: /
    Policies:
      -
        PolicyName: cwe-pipeline-execution
        PolicyDocument:
          Version: 2012-10-17
          Statement:
            -
              Effect: Allow
              Action: codepipeline:StartPipelineExecution
              Resource: !Join [ '', [ 'arn:aws:codepipeline:', !Ref 'AWS::Region', ':', !Ref 'AWS::AccountId', ':', !Ref AppPipeline ] ]
```
------
```
"AmazonCloudWatchEventRole": {
  "Type": "AWS::IAM::Role",
  "Properties": {
    "AssumeRolePolicyDocument": {
      "Version": "2012-10-17",
      "Statement": [
        {
          "Effect": "Allow",
          "Principal": {
            "Service": [
              "events.amazonaws.com"
            ]
          },
          "Action": "sts:AssumeRole"
        }
      ]
    },
    "Path": "/",
    "Policies": [
      {
        "PolicyName": "cwe-pipeline-execution",
        "PolicyDocument": {
          "Version": "2012-10-17",
          "Statement": [
            {
              "Effect": "Allow",
              "Action": "codepipeline:StartPipelineExecution",
              "Resource": {
                "Fn::Join": [
                  "",
                  [
                    "arn:aws:codepipeline:",
                    {
                      "Ref": "AWS::Region"
                    },
                    ":",
                    {
                      "Ref": "AWS::AccountId"
                    },
                    ":",
                    {
                      "Ref": "AppPipeline"
                    }
                  ]
                ]
              }
            }
          ]
        }
      }
    ]
  }
},
```
------
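At deployment time, AWS CloudFormation resolves the `Fn::Join`/`Ref` expressions above into a concrete pipeline ARN\. The following sketch resolves that expression by hand; the region, account ID, and pipeline name are made\-up example values \(`Ref` on an `AWS::CodePipeline::Pipeline` resource returns the pipeline name\):

```
def resolve(value, context):
    """Resolve a tiny subset of CloudFormation intrinsics (Ref, Fn::Join)
    against a dict of parameter/pseudo-parameter values. Illustration only;
    CloudFormation supports many more intrinsic functions."""
    if isinstance(value, dict):
        if "Ref" in value:
            return context[value["Ref"]]
        if "Fn::Join" in value:
            delimiter, parts = value["Fn::Join"]
            return delimiter.join(resolve(p, context) for p in parts)
    return value

resource = {
    "Fn::Join": [
        "",
        [
            "arn:aws:codepipeline:",
            {"Ref": "AWS::Region"},
            ":",
            {"Ref": "AWS::AccountId"},
            ":",
            {"Ref": "AppPipeline"},
        ],
    ]
}

# Example values only; your stack supplies the real ones.
context = {
    "AWS::Region": "us-west-2",
    "AWS::AccountId": "111222333444",
    "AppPipeline": "MyFirstPipeline",  # logical ID resolves to the pipeline name
}

print(resolve(resource, context))
# arn:aws:codepipeline:us-west-2:111222333444:MyFirstPipeline
```

The resulting ARN is the only resource the `cwe-pipeline-execution` policy allows `StartPipelineExecution` against, which is why the role cannot start any other pipeline\.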
1. In the template, under `Resources`, use the `AWS::Events::Rule` AWS CloudFormation resource to add a CloudWatch Events rule\. This event pattern creates an event that monitors push changes to your repository\. When CloudWatch Events detects a repository state change, the rule invokes `StartPipelineExecution` on your target pipeline\.
**Why am I making this change?** Adding the `AWS::Events::Rule` resource enables AWS CloudFormation to create the event\. This resource is added to your AWS CloudFormation stack\.
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-repo-changes-cfn.md |
```
AmazonCloudWatchEventRule:
  Type: AWS::Events::Rule
  Properties:
    EventPattern:
      source:
        - aws.codecommit
      detail-type:
        - 'CodeCommit Repository State Change'
      resources:
        - !Join [ '', [ 'arn:aws:codecommit:', !Ref 'AWS::Region', ':', !Ref 'AWS::AccountId', ':', !Ref RepositoryName ] ]
      detail:
        event:
          - referenceCreated
          - referenceUpdated
        referenceType:
          - branch
        referenceName:
          - master
    Targets:
      -
        Arn:
          !Join [ '', [ 'arn:aws:codepipeline:', !Ref 'AWS::Region', ':', !Ref 'AWS::AccountId', ':', !Ref AppPipeline ] ]
        RoleArn: !GetAtt AmazonCloudWatchEventRole.Arn
        Id: codepipeline-AppPipeline
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-repo-changes-cfn.md |
```
"AmazonCloudWatchEventRule": {
  "Type": "AWS::Events::Rule",
  "Properties": {
    "EventPattern": {
      "source": [
        "aws.codecommit"
      ],
      "detail-type": [
        "CodeCommit Repository State Change"
      ],
      "resources": [
        {
          "Fn::Join": [
            "",
            [
              "arn:aws:codecommit:",
              {
                "Ref": "AWS::Region"
              },
              ":",
              {
                "Ref": "AWS::AccountId"
              },
              ":",
              {
                "Ref": "RepositoryName"
              }
            ]
          ]
        }
      ],
      "detail": {
        "event": [
          "referenceCreated",
          "referenceUpdated"
        ],
        "referenceType": [
          "branch"
        ],
        "referenceName": [
          "master"
        ]
      }
    },
    "Targets": [
      {
        "Arn": {
          "Fn::Join": [
            "",
            [
              "arn:aws:codepipeline:",
              {
                "Ref": "AWS::Region"
              },
              ":",
              {
                "Ref": "AWS::AccountId"
              },
              ":",
              {
                "Ref": "AppPipeline"
              }
            ]
          ]
        },
        "RoleArn": {
          "Fn::GetAtt": [
            "AmazonCloudWatchEventRole",
            "Arn"
          ]
        },
        "Id": "codepipeline-AppPipeline"
      }
    ]
  }
},
```
------
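To see why the rule above starts the pipeline only for pushes to the `master` branch, the following sketch applies a simplified version of the CloudWatch Events matching rules to two hypothetical events\. \(Real EventBridge matching supports additional operators such as prefix and anything\-but; this is an illustration only\.\)

```
def matches(pattern, event):
    """Simplified CloudWatch Events pattern match: every key in the
    pattern must exist in the event, nested dicts recurse, and the
    event's value must appear in the pattern's list of allowed values."""
    for key, allowed in pattern.items():
        if key not in event:
            return False
        if isinstance(allowed, dict):
            if not matches(allowed, event[key]):
                return False
        elif event[key] not in allowed:
            return False
    return True

# The pattern from the rule above.
pattern = {
    "source": ["aws.codecommit"],
    "detail-type": ["CodeCommit Repository State Change"],
    "detail": {
        "event": ["referenceCreated", "referenceUpdated"],
        "referenceType": ["branch"],
        "referenceName": ["master"],
    },
}

# Two hypothetical (abbreviated) CodeCommit events.
push_to_master = {
    "source": "aws.codecommit",
    "detail-type": "CodeCommit Repository State Change",
    "detail": {
        "event": "referenceUpdated",
        "referenceType": "branch",
        "referenceName": "master",
    },
}
push_to_feature = {
    **push_to_master,
    "detail": {**push_to_master["detail"], "referenceName": "feature-x"},
}

print(matches(pattern, push_to_master))   # True
print(matches(pattern, push_to_feature))  # False
```

A push to any other branch produces an event whose `referenceName` is not in the pattern's list, so the rule never invokes the target\.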
1. Save the updated template to your local computer, and then open the AWS CloudFormation console\.
1. Choose your stack, and then choose **Create Change Set for Current Stack**\.
1. Upload the template, and then view the changes listed in AWS CloudFormation\. These are the changes to be made to the stack\. You should see your new resources in the list\.
1. Choose **Execute**\.<a name="proc-cfn-flag-codecommit"></a>
**To edit your pipeline's PollForSourceChanges parameter**
**Important**
In many cases, the `PollForSourceChanges` parameter defaults to true when you create a pipeline\. When you add event\-based change detection, you must add the parameter to your output and set it to false to disable polling\. Otherwise, your pipeline starts twice for a single source change\. For details, see [Default settings for the PollForSourceChanges parameter](reference-pipeline-structure.md#PollForSourceChanges-defaults)\.
+ In the template, change `PollForSourceChanges` to `false`\. If you did not include `PollForSourceChanges` in your pipeline definition, add it and set it to `false`\.
**Why am I making this change?** Changing this parameter to `false` turns off periodic checks so you can use event\-based change detection only\.
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-repo-changes-cfn.md |
```
Name: Source
Actions:
  -
    Name: SourceAction
    ActionTypeId:
      Category: Source
      Owner: AWS
      Version: 1
      Provider: CodeCommit
    OutputArtifacts:
      - Name: SourceOutput
    Configuration:
      BranchName: !Ref BranchName
      RepositoryName: !Ref RepositoryName
      PollForSourceChanges: false
    RunOrder: 1
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-repo-changes-cfn.md |
```
{
  "Name": "Source",
  "Actions": [
    {
      "Name": "SourceAction",
      "ActionTypeId": {
        "Category": "Source",
        "Owner": "AWS",
        "Version": 1,
        "Provider": "CodeCommit"
      },
      "OutputArtifacts": [
        {
          "Name": "SourceOutput"
        }
      ],
      "Configuration": {
        "BranchName": {
          "Ref": "BranchName"
        },
        "RepositoryName": {
          "Ref": "RepositoryName"
        },
        "PollForSourceChanges": false
      },
      "RunOrder": 1
    }
  ]
},
```
------ | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-trigger-source-repo-changes-cfn.md |
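If you maintain many templates, the edit above is easy to automate\. The following sketch \(illustrative only; the stage fragment mirrors the example above with made\-up configuration values\) sets `PollForSourceChanges` to false on every source action in a parsed stage definition, adding the key if it is missing:

```
def disable_polling(stage):
    """Set PollForSourceChanges to false on every source action in a
    parsed stage definition, adding the key if it is absent -- the same
    edit described in the procedure above."""
    for action in stage.get("Actions", []):
        if action.get("ActionTypeId", {}).get("Category") == "Source":
            action.setdefault("Configuration", {})["PollForSourceChanges"] = False
    return stage

stage = {
    "Name": "Source",
    "Actions": [
        {
            "Name": "SourceAction",
            "ActionTypeId": {
                "Category": "Source",
                "Owner": "AWS",
                "Version": 1,
                "Provider": "CodeCommit",
            },
            # PollForSourceChanges intentionally omitted, as in many templates.
            "Configuration": {"BranchName": "master", "RepositoryName": "MyRepo"},
            "RunOrder": 1,
        }
    ],
}

disable_polling(stage)
print(stage["Actions"][0]["Configuration"]["PollForSourceChanges"])  # False
```

Running a check like this before deploying helps you avoid the double\-start behavior described in the Important note above\.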
You might want to create a pipeline that uses resources created or managed by another AWS account\. For example, you might want to use one account for your pipeline and another for your CodeDeploy resources\.
**Note**
When you create a pipeline with actions from multiple accounts, you must configure your actions so that they can still access artifacts within the limitations of cross\-account pipelines\. The following limitations apply to cross\-account actions:
+ In general, an action can only consume an artifact if:
  + The action is in the same account as the pipeline account, OR
  + The artifact was created in the pipeline account for an action in another account, OR
  + The artifact was produced by a previous action in the same account as the action\.

  In other words, you cannot pass an artifact from one account to another if neither account is the pipeline account\.
+ Cross\-account actions are not supported for the following action types:
  + Jenkins build actions
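The artifact rules above reduce to a three\-way check\. This illustrative helper \(not an AWS API\) encodes them:

```
def can_consume(action_account, artifact_account, pipeline_account):
    """Encode the artifact rules for cross-account pipelines:
    an action can consume an artifact if the action runs in the pipeline
    account, if the artifact was produced in the pipeline account, or if
    the artifact was produced in the action's own account."""
    return (
        action_account == pipeline_account
        or artifact_account == pipeline_account
        or artifact_account == action_account
    )

# An artifact produced in account B cannot be consumed by an action in
# account C when the pipeline lives in account A -- neither side is the
# pipeline account.
print(can_consume("C", "B", "A"))  # False
print(can_consume("B", "A", "A"))  # True: the artifact came from the pipeline account
print(can_consume("B", "B", "A"))  # True: produced and consumed in the same account
```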
For this example, you must create an AWS Key Management Service \(AWS KMS\) key to use, add the key to the pipeline, and set up account policies and roles to enable cross\-account access\. For an AWS KMS key, you can use the key ID, the key ARN, or the alias ARN\.
**Note**
To specify a customer master key \(CMK\) in a different AWS account, you must use the key ARN or alias ARN\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
In this walkthrough and its examples, *AccountA* is the account originally used to create the pipeline\. It has access to the Amazon S3 bucket used to store pipeline artifacts and the service role used by AWS CodePipeline\. *AccountB* is the account originally used to create the CodeDeploy application, deployment group, and service role used by CodeDeploy\.
For *AccountA* to edit a pipeline to use the CodeDeploy application created by *AccountB*, *AccountA* must:
+ Request the ARN or account ID of *AccountB* \(in this walkthrough, the *AccountB* ID is *012ID\_ACCOUNT\_B*\)\.
+ Create or use an AWS KMS customer managed key in the Region for the pipeline, and grant permissions to use that key to the service role \(*AWS\-CodePipeline\-Service*\) and *AccountB*\.
+ Create an Amazon S3 bucket policy that grants *AccountB* access to the Amazon S3 bucket \(for example, *codepipeline\-us\-east\-2\-1234567890*\)\.
+ Create a policy that allows *AccountA* to assume a role configured by *AccountB*, and attach that policy to the service role \(*AWS\-CodePipeline\-Service*\)\.
+ Edit the pipeline to use the customer managed AWS KMS key instead of the default key\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
For *AccountB* to allow access to its resources to a pipeline created in *AccountA*, *AccountB* must:
+ Request the ARN or account ID of *AccountA* \(in this walkthrough, the *AccountA* ID is *012ID\_ACCOUNT\_A*\)\.
+ Create a policy applied to the [Amazon EC2 instance role](https://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-create-iam-instance-profile.html) configured for CodeDeploy that allows access to the Amazon S3 bucket \(*codepipeline\-us\-east\-2\-1234567890*\)\.
+ Create a policy applied to the [Amazon EC2 instance role](https://docs.aws.amazon.com/codedeploy/latest/userguide/how-to-create-iam-instance-profile.html) configured for CodeDeploy that allows access to the AWS KMS customer managed key used to encrypt the pipeline artifacts in *AccountA*\.
+ Configure and attach an IAM role \(*CrossAccount\_Role*\) with a trust relationship policy that allows *AccountA* to assume the role\.
+ Create a policy that allows access to the deployment resources required by the pipeline and attach it to *CrossAccount\_Role*\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
+ Create a policy that allows access to the Amazon S3 bucket \(*codepipeline\-us\-east\-2\-1234567890*\) and attach it to *CrossAccount\_Role*\.
**Topics**
+ [Prerequisite: Create an AWS KMS encryption key](#pipelines-create-cross-account-create-key)
+ [Step 1: Set up account policies and roles](#pipelines-create-cross-account-setup)
+ [Step 2: Edit the pipeline](#pipelines-create-cross-account-create) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
Customer managed keys are specific to a Region, as are all AWS KMS keys\. You must create your customer managed AWS KMS key in the same Region where the pipeline was created \(for example, `us-east-2`\)\.
**To create a customer managed key in AWS KMS**
1. Sign in to the AWS Management Console with *AccountA* and open the AWS KMS console\.
1. On the left, choose **Customer managed keys**\.
1. Choose **Create key**\. In **Configure key**, leave the **Symmetric** default selected and choose **Next**\.
1. In **Alias**, enter an alias to use for this key \(for example, *PipelineName\-Key*\)\. Optionally, provide a description and tags for this key, and then choose **Next**\.
1. In **Define Key Administrative Permissions**, choose your IAM user and any other users or groups you want to act as administrators for this key, and then choose **Next**\.
1. In **Define Key Usage Permissions**, under **This Account**, select the name of the service role for the pipeline \(for example, AWS\-CodePipeline\-Service\)\. Under **Other AWS accounts**, choose **Add another AWS account**\. Enter the account ID for *AccountB* to complete the ARN, and then choose **Next**\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
1. In **Review and edit key policy**, review the policy, and then choose **Finish**\.
1. From the list of keys, choose the alias of your key and copy its ARN \(for example, ***arn:aws:kms:us\-east\-2:012ID\_ACCOUNT\_A:key/2222222\-3333333\-4444\-556677EXAMPLE***\)\. You will need this when you edit your pipeline and configure policies\. | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
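The key ARN you copied follows the standard format `arn:partition:kms:region:account-id:key/key-id`\. The following sketch splits such an ARN into its parts so you can confirm the Region and account before pasting it into a policy\. \(Illustration only; production code should also handle alias ARNs of the form `arn:aws:kms:region:account-id:alias/alias-name`\.\)

```
def parse_key_arn(arn):
    """Split a KMS key ARN into its components.
    Format: arn:partition:kms:region:account-id:key/key-id."""
    _, partition, service, region, account, resource = arn.split(":", 5)
    resource_type, _, resource_id = resource.partition("/")
    return {
        "partition": partition,
        "service": service,
        "region": region,
        "account": account,
        "resource_type": resource_type,
        "resource_id": resource_id,
    }

# The example ARN from the step above (012ID_ACCOUNT_A is the placeholder
# account ID used throughout this walkthrough).
parts = parse_key_arn(
    "arn:aws:kms:us-east-2:012ID_ACCOUNT_A:key/2222222-3333333-4444-556677EXAMPLE"
)
print(parts["region"], parts["resource_type"])  # us-east-2 key
```

Checking that the Region in the ARN matches the pipeline's Region catches the most common cross\-account key mistake early\.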
55f4de7d5846-0 | After you create the AWS KMS key, you must create and attach policies that will enable the cross\-account access\. This requires actions from both *AccountA* and *AccountB*\.
**Topics**
+ [Configure policies and roles in the account that will create the pipeline \(*AccountA*\)](#pipelines-create-cross-account-setup-accounta)
+ [Configure policies and roles in the account that owns the AWS resource \(*AccountB*\)](#pipelines-create-cross-account-setup-accountb) | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
To create a pipeline that uses CodeDeploy resources associated with another AWS account, *AccountA* must configure policies for both the Amazon S3 bucket used to store artifacts and the service role for CodePipeline\.
**To create a policy for the Amazon S3 bucket that grants access to AccountB \(console\)**
1. Sign in to the AWS Management Console with *AccountA* and open the Amazon S3 console at [https://console\.aws\.amazon\.com/s3/](https://console.aws.amazon.com/s3/)\.
1. In the list of Amazon S3 buckets, choose the Amazon S3 bucket where artifacts for your pipelines are stored\. This bucket is named `codepipeline-region-1234567EXAMPLE`, where *region* is the AWS Region in which you created the pipeline and *1234567EXAMPLE* is a ten\-digit random number that ensures the bucket name is unique \(for example, *codepipeline\-us\-east\-2\-1234567890*\)\.
1. On the detail page for the Amazon S3 bucket, choose **Properties**\.
1. In the properties pane, expand **Permissions**, and then choose **Add bucket policy**\.
**Note** | https://github.com/siagholami/aws-documentation/tree/main/documents/aws-codepipeline-user-guide/doc_source/pipelines-create-cross-account.md |
If a policy is already attached to your Amazon S3 bucket, choose **Edit bucket policy**\. You can then add the statements in the following example to the existing policy\. To add a new policy, choose the link, and follow the instructions in the AWS Policy Generator\. For more information, see [Overview of IAM Policies](https://docs.aws.amazon.com/IAM/latest/UserGuide/policies_overview.html)\.
1. In the **Bucket Policy Editor** window, enter the following policy\. This allows *AccountB* access to the pipeline artifacts, and gives *AccountB* the ability to add output artifacts if an action, such as a custom source or build action, creates them\.
In the following example, the account ID for *AccountB* is *012ID\_ACCOUNT\_B*, and the Amazon S3 bucket is *codepipeline\-us\-east\-2\-1234567890*\. Replace these values with the account ID for the account you want to allow access and the name of your Amazon S3 bucket:
```
{
    "Version": "2012-10-17",
    "Id": "SSEAndSSLPolicy",
    "Statement": [
        {
            "Sid": "DenyUnEncryptedObjectUploads",
            "Effect": "Deny",