| Field | Type | Min | Max |
|---|---|---|---|
| issue_owner_repo | listlengths | 2 | 2 |
| issue_body | stringlengths | 0 | 262k |
| issue_title | stringlengths | 1 | 1.02k |
| issue_comments_url | stringlengths | 53 | 116 |
| issue_comments_count | int64 | 0 | 2.49k |
| issue_created_at | stringdate | 1999-03-17 02:06:42 | 2025-06-23 11:41:49 |
| issue_updated_at | stringdate | 2000-02-10 06:43:57 | 2025-06-23 11:43:00 |
| issue_html_url | stringlengths | 34 | 97 |
| issue_github_id | int64 | 132 | 3.17B |
| issue_number | int64 | 1 | 215k |
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**

Some events going to OpenSearch should have a document id (`_id`) set to a field within a sub-object. For example, `info/id` rather than `id`. Currently, Data Prepper's `document_id_field` property only supports a field which is directly in the root of the event.

**Describe the solution you'd like**

Support the JSON Pointer syntax that Data Prepper uses throughout in the `document_id_field` property. Example configuration:

```
sink:
  - opensearch:
      hosts: ["https://my-opensearch"]
      document_id_field: info/id
```

Given the following event, Data Prepper will produce a document with `_id` set to `json001`:

```
{
  "fieldA" : "arbitrary value",
  "info" : {
    "id" : "json001",
    "fieldA" : "xyz",
    "fieldB" : "def"
  }
}
```
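The nested lookup requested above can be sketched as a small helper that walks a slash-delimited path through a nested event. This is an illustrative Python sketch under the request's assumptions (Data Prepper itself is Java); `resolve_path` is a hypothetical name, not a real Data Prepper API.

```python
def resolve_path(event: dict, path: str):
    """Walk a slash-delimited path (e.g. 'info/id') through nested dicts.
    A leading slash is tolerated; returns None if any segment is missing."""
    current = event
    for segment in path.strip("/").split("/"):
        if not isinstance(current, dict) or segment not in current:
            return None
        current = current[segment]
    return current

event = {
    "fieldA": "arbitrary value",
    "info": {"id": "json001", "fieldA": "xyz", "fieldB": "def"},
}
doc_id = resolve_path(event, "info/id")  # -> "json001"
```

With this lookup in place, the sink would use the resolved value as the document `_id` instead of only supporting root-level fields.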
Complex documentId fields in OpenSearch sink
https://api.github.com/repos/opensearch-project/data-prepper/issues/1456/comments
1
2022-06-01T11:55:17Z
2022-11-03T16:54:35Z
https://github.com/opensearch-project/data-prepper/issues/1456
1,255,799,852
1,456
[ "opensearch-project", "data-prepper" ]
Decompress GZip S3 Objects to Data Prepper Event.
Load GZip compressed S3 objects
https://api.github.com/repos/opensearch-project/data-prepper/issues/1435/comments
0
2022-05-31T06:39:12Z
2022-06-13T22:45:03Z
https://github.com/opensearch-project/data-prepper/issues/1435
1,253,399,720
1,435
[ "opensearch-project", "data-prepper" ]
Convert uncompressed S3 Objects to Data Prepper Event.
Load uncompressed S3 objects
https://api.github.com/repos/opensearch-project/data-prepper/issues/1434/comments
0
2022-05-31T06:39:08Z
2022-06-03T18:46:56Z
https://github.com/opensearch-project/data-prepper/issues/1434
1,253,399,680
1,434
[ "opensearch-project", "data-prepper" ]
Download S3 objects from SQS queue.
Add S3 Event handling
https://api.github.com/repos/opensearch-project/data-prepper/issues/1433/comments
0
2022-05-31T06:39:05Z
2022-06-08T21:02:40Z
https://github.com/opensearch-project/data-prepper/issues/1433
1,253,399,623
1,433
[ "opensearch-project", "data-prepper" ]
## CVE-2022-29361 - High Severity Vulnerability

**Vulnerable Library:** `Werkzeug-1.0.1-py2.py3-none-any.whl`

The comprehensive WSGI web application library.

- Library home page: https://files.pythonhosted.org/packages/cc/94/5f7079a0e00bd6863ef8f1da638721e9da21e5bacee597595b318f71d62e/Werkzeug-1.0.1-py2.py3-none-any.whl
- Path to dependency file: /examples/trace-analytics-sample-app/sample-app/requirements.txt
- Path to vulnerable library: /examples/trace-analytics-sample-app/sample-app/requirements.txt

Dependency Hierarchy:

- opentelemetry_instrumentation_flask-0.19b0-py3-none-any.whl (Root Library)
  - Flask-1.1.4-py2.py3-none-any.whl
    - :x: **Werkzeug-1.0.1-py2.py3-none-any.whl** (Vulnerable Library)

Found in HEAD commit: [90bdaa7e7833bdd504c817e49d4434b4d8880f56](https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56)
Found in base branch: **main**

**Vulnerability Details**

Improper parsing of HTTP requests in Pallets Werkzeug v2.1.0 and below allows attackers to perform HTTP Request Smuggling using a crafted HTTP request with multiple requests included inside the body.

- Publish Date: 2022-05-25
- URL: [CVE-2022-29361](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-29361)

**CVSS 3 Score Details (9.8)**

Base Score Metrics:

- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: High
  - Availability Impact: High

For more information on CVSS3 Scores, see https://www.first.org/cvss/calculator/3.0.

**Suggested Fix**

- Type: Upgrade version
- Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-29361
- Release Date: 2022-05-25
- Fix Resolution: Werkzeug - 2.1.1
CVE-2022-29361 (High) detected in Werkzeug-1.0.1-py2.py3-none-any.whl - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1432/comments
1
2022-05-27T06:04:13Z
2022-06-07T20:45:15Z
https://github.com/opensearch-project/data-prepper/issues/1432
1,250,378,641
1,432
[ "opensearch-project", "data-prepper" ]
## WS-2019-0379 - Medium Severity Vulnerability

**Vulnerable Library:** `commons-codec-1.11.jar`

The Apache Commons Codec package contains simple encoder and decoders for various formats such as Base64 and Hexadecimal. In addition to these widely used encoders and decoders, the codec package also maintains a collection of phonetic encoding utilities.

- Path to dependency file: /data-prepper-plugins/dynamodb-source-coordination-store/build.gradle
- Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.11/3acb4705652e16236558f0f4f2192cc33c3bd189/commons-codec-1.11.jar (same path repeated for each consuming module)

Dependency Hierarchy:

- httpclient-4.5.13.jar (Root Library)
  - :x: **commons-codec-1.11.jar** (Vulnerable Library)

Found in HEAD commit: [ebd3e757c341c1d9c1352431bbad7bf5db2ea939](https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939)
Found in base branch: **main**

**Vulnerability Details**

Apache commons-codec before version "commons-codec-1.13-RC1" is vulnerable to information disclosure due to improper input validation.

- Publish Date: 2019-05-20
- URL: [WS-2019-0379](https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113)

**CVSS 3 Score Details (6.5)**

Base Score Metrics:

- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: Low
  - Integrity Impact: Low
  - Availability Impact: None

For more information on CVSS3 Scores, see https://www.first.org/cvss/calculator/3.0.

**Suggested Fix**

- Type: Upgrade version
- Release Date: 2019-05-20
- Fix Resolution (commons-codec:commons-codec): 1.13
- Direct dependency fix Resolution (org.apache.httpcomponents:httpclient): 4.5.14

***

<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
WS-2019-0379 (Medium) detected in commons-codec-1.11.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/1428/comments
2
2022-05-25T16:56:42Z
2023-04-20T14:46:02Z
https://github.com/opensearch-project/data-prepper/issues/1428
1,248,400,075
1,428
[ "opensearch-project", "data-prepper" ]
1. Use the SQS ReceiveMessage API to receive messages from SQS.
2. For each Message from SQS, it will:
   - Parse the Message as an S3Event.
   - Download the S3 Object which the S3Event indicates was created.
   - Decompress the object if configured to do so.
   - Parse the decompressed file using the configured codec into a list of Log Event objects.
   - Write the Log objects into the Data Prepper buffer.
3. Perform a DeleteMessageBatch with all of the messages which were successfully processed.
4. Repeat.
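The "Parse the Message as an S3Event" step above can be sketched as a small parser over the standard S3 event notification JSON carried in the SQS message body. This is an illustrative Python sketch, not the actual (Java) source plugin; `object_keys_from_sqs_message` is a hypothetical helper name.

```python
import json

def object_keys_from_sqs_message(message_body: str):
    """Extract (bucket, key) pairs from an SQS message body carrying an
    S3 event notification. Field names follow the standard S3 event
    notification schema ('Records' -> 's3' -> 'bucket'/'object')."""
    event = json.loads(message_body)
    pairs = []
    for record in event.get("Records", []):
        s3 = record.get("s3", {})
        bucket = s3.get("bucket", {}).get("name")
        key = s3.get("object", {}).get("key")
        if bucket and key:
            pairs.append((bucket, key))
    return pairs

# Minimal example body in the S3 event notification shape:
body = json.dumps({
    "Records": [
        {"s3": {"bucket": {"name": "my-bucket"},
                "object": {"key": "logs/app.log.gz"}}}
    ]
})
keys = object_keys_from_sqs_message(body)  # -> [("my-bucket", "logs/app.log.gz")]
```

Each extracted (bucket, key) pair would then drive the download, decompress, and codec-parse steps before deleting the processed messages.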
Add basic SQS interactions
https://api.github.com/repos/opensearch-project/data-prepper/issues/1425/comments
1
2022-05-24T18:03:00Z
2022-06-13T22:44:25Z
https://github.com/opensearch-project/data-prepper/issues/1425
1,246,948,269
1,425
[ "opensearch-project", "data-prepper" ]
Support authentication using a provided STS role ARN, or fall back to the DefaultCredentialsProvider to obtain credentials.
Add authentication for AWS configuration using STS role ARN
https://api.github.com/repos/opensearch-project/data-prepper/issues/1424/comments
0
2022-05-24T18:02:56Z
2022-05-25T16:49:57Z
https://github.com/opensearch-project/data-prepper/issues/1424
1,246,948,233
1,424
[ "opensearch-project", "data-prepper" ]
Add project setup for s3 source.
Add s3 source boilerplate
https://api.github.com/repos/opensearch-project/data-prepper/issues/1423/comments
0
2022-05-24T18:02:53Z
2022-05-24T18:10:05Z
https://github.com/opensearch-project/data-prepper/issues/1423
1,246,948,183
1,423
[ "opensearch-project", "data-prepper" ]
Presently the minimum version for running Data Prepper is Java 8. This issue proposes making Java 11 the new minimum. There are a few possible benefits:

* The plugin redesign (#321) could make better use of Java modules, which were introduced in Java 9.
* The OpenSearch project has updated to Java 11 as the minimum, including for official clients.
* Fewer versions of Java to test against.
* New language features become available to contributors and maintainers.

Is running Data Prepper on Java 8 important for you? Please provide feedback.
Make Java 11 the baseline for Data Prepper
https://api.github.com/repos/opensearch-project/data-prepper/issues/1422/comments
0
2022-05-24T16:17:50Z
2022-09-09T13:52:47Z
https://github.com/opensearch-project/data-prepper/issues/1422
1,246,798,935
1,422
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**

Currently Data Prepper reserves `serviceName` as its single common tag key, with its value configurable through an environment variable. This tag is attached to all Data Prepper plugin metrics. Users are restricted to configuring that single tag value for Data Prepper monitoring.

**Describe the solution you'd like**

We should allow users to customize the common metric tag key-value pairs in `data-prepper-config.yaml` as follows:

```
commonMetricTags:
  - key1: value1
  - key2: value2
  - ...
```

`commonMetricTags` is an optional field that defaults to an empty list. The number of key-value pairs after deduplication will be restricted to a maximum limit because, for the CloudWatch/Embedded Metrics Format meter registry, tags are converted into dimensions. The maximum allowed number of dimensions for a metric is 9, and we still want to reserve room for individual plugins to add special tags for their metrics (e.g. peer-forwarder has `endpoint` as a tag for some of its metrics). The initial maximum limit will be set to 3, a conservative number that we can increase later. For backward compatibility, `serviceName` will remain a reserved tag key that must be configured via environment variable until its full removal in the 2.0 release.
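The deduplication and max-limit rule proposed above can be sketched as a small validation step. This is an illustrative Python sketch of the proposal, not Data Prepper's actual (Java) configuration code; `validate_common_metric_tags` is a hypothetical name and the limit of 3 is taken from the proposal text.

```python
MAX_COMMON_TAGS = 3  # initial limit proposed above

def validate_common_metric_tags(tag_entries):
    """Merge a list of single-pair dicts (the YAML list form above),
    deduplicating by key with last-value-wins, then enforce the limit."""
    merged = {}
    for entry in tag_entries:
        for key, value in entry.items():
            merged[key] = value
    if len(merged) > MAX_COMMON_TAGS:
        raise ValueError(
            f"commonMetricTags allows at most {MAX_COMMON_TAGS} unique keys, "
            f"got {len(merged)}")
    return merged

tags = validate_common_metric_tags(
    [{"team": "search"}, {"env": "prod"}, {"team": "search"}])
# -> {"team": "search", "env": "prod"}
```

A fourth unique key would raise, keeping room under the 9-dimension CloudWatch ceiling for plugin-specific tags.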
Support customized common metric tags in Data Prepper configuration
https://api.github.com/repos/opensearch-project/data-prepper/issues/1415/comments
2
2022-05-20T16:29:25Z
2022-05-27T00:27:35Z
https://github.com/opensearch-project/data-prepper/issues/1415
1,243,382,333
1,415
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**

The out-of-box third-party meter registries provided by Micrometer fail to facilitate connecting stream-processing services (e.g. Kinesis Data Streams/Firehose, Apache Kafka) to Data Prepper runtime metrics. For example, [LoggingMeterRegistry](https://www.javadoc.io/doc/io.micrometer/micrometer-core/1.1.2/io/micrometer/core/instrument/logging/LoggingMeterRegistry.html) produces metrics logs but mixes them with the Data Prepper runtime event logs. Also, its format does not facilitate extracting core values.

Data Prepper metric log:

```
2022-04-25T15:55:59,951 [logging-metrics-publisher] INFO io.micrometer.core.instrument.logging.LoggingMeterRegistry - armeria.executor.active{name=blockingTaskExecutor} value=0 threads
```

Data Prepper event log:

```
2022-04-26T20:29:46,421 [raw-pipeline-processor-worker-3-thread-1] INFO com.amazon.dataprepper.pipeline.ProcessWorker - raw-pipeline Worker: No records received from buffer
```

**Describe the solution you'd like**

We need to build an in-house [StepMeterRegistry](https://www.javadoc.io/doc/io.micrometer/micrometer-core/1.1.2/io/micrometer/core/instrument/step/StepMeterRegistry.html) (EMFLoggingMeterRegistry) that publishes Data Prepper metrics as logs in a standard format/schema. Why [Embedded Metrics Format](https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format_Specification.html)?

* It is a standardized JSON specification used to instruct CloudWatch Logs to automatically extract metric values embedded in structured log events. One can use CloudWatch to graph and create alarms on the extracted metric values.
* There is an SDK library for serialization and publishing: https://github.com/awslabs/aws-embedded-metrics-java

**Describe alternatives you've considered (Optional)**

A customized serializable metrics format could also work, but would take more development and maintenance effort.
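To make the proposed log shape concrete, here is a minimal sketch of a single EMF-structured log line following the public Embedded Metrics Format specification. It is in Python for brevity (real publishing would go through aws-embedded-metrics-java), and the namespace, metric, and dimension names are hypothetical examples.

```python
import json
import time

def emf_log_line(namespace, metric_name, value, unit="Count", dimensions=None):
    """Build one EMF JSON log line: the '_aws' block declares the metric
    and dimension schema; actual values live at the document root."""
    dimensions = dimensions or {}
    doc = {
        "_aws": {
            "Timestamp": int(time.time() * 1000),
            "CloudWatchMetrics": [{
                "Namespace": namespace,
                "Dimensions": [list(dimensions.keys())],
                "Metrics": [{"Name": metric_name, "Unit": unit}],
            }],
        },
        metric_name: value,
        **dimensions,
    }
    return json.dumps(doc)

line = emf_log_line("DataPrepper", "recordsProcessed", 42,
                    dimensions={"serviceName": "data-prepper"})
```

Unlike the free-form `LoggingMeterRegistry` lines quoted above, such a line is machine-extractable by CloudWatch Logs without any custom parsing.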
Support Embedded Metrics Format (EMF) log publishing for Data Prepper monitoring
https://api.github.com/repos/opensearch-project/data-prepper/issues/1404/comments
1
2022-05-17T20:49:15Z
2022-05-20T15:03:08Z
https://github.com/opensearch-project/data-prepper/issues/1404
1,239,142,969
1,404
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**

The `http` source does not support compressed gzip data.

**Describe the solution you'd like**

* The `http` source should read gzip-encoded data, decompress it, and translate it into individual `Events`. This logic will be repeated in the `s3` source, and potentially other sources. It makes sense to have a plugin that can be reused by different sources to perform this logic.

**Additional context**

* Another request for the `otel-trace-source` to support compressed data was made in https://github.com/opensearch-project/data-prepper/issues/1152
* The `s3` source design outlines support for `gzip` encoded data (https://github.com/opensearch-project/data-prepper/issues/251)
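The shared decompression step described above can be sketched in a few lines. This is an illustrative Python sketch (Data Prepper plugins are Java); `decompress_if_gzip` is a hypothetical helper, and gzip detection here uses the two-byte magic number rather than the HTTP `Content-Encoding` header.

```python
import gzip

def decompress_if_gzip(payload: bytes) -> bytes:
    """Decompress a request body when it is gzip-encoded (detected by the
    gzip magic number 0x1f 0x8b); otherwise return it unchanged."""
    if payload[:2] == b"\x1f\x8b":
        return gzip.decompress(payload)
    return payload

compressed = gzip.compress(b'{"log": "hello"}')
body = decompress_if_gzip(compressed)  # -> b'{"log": "hello"}'
```

Factoring this into a reusable component is what lets the `http`, `s3`, and `otel-trace-source` plugins share one implementation.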
Support gzip encoded content in http source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1399/comments
1
2022-05-16T15:36:50Z
2023-05-17T16:24:33Z
https://github.com/opensearch-project/data-prepper/issues/1399
1,237,342,360
1,399
[ "opensearch-project", "data-prepper" ]
Update `THIRD-PARTY` file.
Update THIRD-PARTY file for Data Prepper 1.4.0
https://api.github.com/repos/opensearch-project/data-prepper/issues/1396/comments
0
2022-05-13T18:31:55Z
2022-05-13T22:32:08Z
https://github.com/opensearch-project/data-prepper/issues/1396
1,235,560,448
1,396
[ "opensearch-project", "data-prepper" ]
Update Data Prepper documentation at: https://github.com/opensearch-project/documentation-website
Update Data Prepper documentation on OpenSearch.org for 1.4.0
https://api.github.com/repos/opensearch-project/data-prepper/issues/1395/comments
0
2022-05-13T18:31:21Z
2022-05-17T19:48:03Z
https://github.com/opensearch-project/data-prepper/issues/1395
1,235,559,966
1,395
[ "opensearch-project", "data-prepper" ]
Create Data Prepper 1.4.0 Release Notes.

All changes should be available at: https://github.com/opensearch-project/data-prepper/milestone/5?closed=1

Please note that we need to clarify in the Release Notes which OTel Metric types are currently supported and which are not.
Create Data Prepper 1.4.0 Release Notes
https://api.github.com/repos/opensearch-project/data-prepper/issues/1394/comments
0
2022-05-13T18:31:03Z
2022-05-17T14:58:41Z
https://github.com/opensearch-project/data-prepper/issues/1394
1,235,559,712
1,394
[ "opensearch-project", "data-prepper" ]
Create Data Prepper 1.4.0 Changelog.

The Changelog is a detailed overview of all the changes made to Data Prepper in this release. It needs to be generated from Git history. See #1195 for the previous release's instructions.
Create Data Prepper 1.4.0 Changelog
https://api.github.com/repos/opensearch-project/data-prepper/issues/1393/comments
1
2022-05-13T18:28:49Z
2022-05-17T18:37:16Z
https://github.com/opensearch-project/data-prepper/issues/1393
1,235,556,143
1,393
[ "opensearch-project", "data-prepper" ]
## CVE-2022-22970 - Medium Severity Vulnerability

**Vulnerable Library:** `spring-beans-5.3.18.jar`

Spring Beans

- Path to dependency file: /data-prepper-expression/build.gradle
- Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-beans/5.3.18/3f0ea6598a5a1eae0a672f025a33a0b7e0d6dfd3/spring-beans-5.3.18.jar

Dependency Hierarchy:

- spring-context-5.3.18.jar (Root Library)
  - :x: **spring-beans-5.3.18.jar** (Vulnerable Library)

Found in HEAD commit: [90bdaa7e7833bdd504c817e49d4434b4d8880f56](https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56)
Found in base branch: **main**

**Vulnerability Details**

In Spring Framework versions prior to 5.3.20+, 5.2.22+, and old unsupported versions, applications that handle file uploads are vulnerable to a DoS attack if they rely on data binding to set a MultipartFile or javax.servlet.Part to a field in a model object.

- Publish Date: 2022-05-12
- URL: [CVE-2022-22970](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22970)

**CVSS 3 Score Details (5.3)**

Base Score Metrics:

- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: High
  - Privileges Required: Low
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High

For more information on CVSS3 Scores, see https://www.first.org/cvss/calculator/3.0.

**Suggested Fix**

- Type: Upgrade version
- Origin: https://tanzu.vmware.com/security/cve-2022-22970
- Release Date: 2022-05-12
- Fix Resolution (org.springframework:spring-beans): 5.3.20
- Direct dependency fix Resolution (org.springframework:spring-context): 5.3.20

***

<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
CVE-2022-22970 (Medium) detected in spring-beans-5.3.18.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/1390/comments
0
2022-05-13T14:45:01Z
2022-06-15T15:10:11Z
https://github.com/opensearch-project/data-prepper/issues/1390
1,235,339,139
1,390
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**

The `format` processor described in #697 was not yet implemented.

**Describe the solution you'd like**

Implement the `format` processor. It should be able to construct a string from a format string which includes both literal values and other keys. The following is a conceptual example; the exact format may vary.

```
"messageString" : "${/http_method} ${/http_path} response: ${status_code}"
```

With the following Event:

```
"http_method" : "POST",
"http_path" : "/hello",
"status_code" : 200
```

Will modify the Event to be:

```
"http_method" : "POST",
"http_path" : "/hello",
"status_code" : 200,
"messageString" : "POST /hello response: 200"
```
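The conceptual substitution described above can be sketched as a regex-driven replacement. This is an illustrative Python sketch of the requested behavior, not the actual processor (which would be a Java plugin); `apply_format` is a hypothetical name, and it accepts both `${/key}` and `${key}` tokens since the issue notes the exact format may vary.

```python
import re

def apply_format(event: dict, fmt: str) -> str:
    """Replace ${/key} or ${key} tokens with the corresponding event
    values; tokens whose key is missing are left in place."""
    def lookup(match):
        key = match.group(1).lstrip("/")
        value = event.get(key)
        return str(value) if value is not None else match.group(0)
    return re.sub(r"\$\{([^}]+)\}", lookup, fmt)

event = {"http_method": "POST", "http_path": "/hello", "status_code": 200}
event["messageString"] = apply_format(
    event, "${/http_method} ${/http_path} response: ${status_code}")
# event["messageString"] -> "POST /hello response: 200"
```

Leaving unresolved tokens intact (rather than substituting an empty string) makes missing-key bugs visible in the output; the real processor might choose different semantics.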
Support the format String Manipulation Processor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1380/comments
1
2022-05-10T22:53:26Z
2023-05-05T02:09:31Z
https://github.com/opensearch-project/data-prepper/issues/1380
1,231,784,137
1,380
[ "opensearch-project", "data-prepper" ]
## CVE-2022-24823 - Medium Severity Vulnerability

**Vulnerable Library:** `netty-common-4.1.74.Final.jar`

- Library home page: https://netty.io/
- Path to dependency file: /data-prepper-plugins/drop-events-processor/build.gradle
- Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-common/4.1.74.Final/891b8ad3206469762b20c73f45d0d2e24cff3dd2/netty-common-4.1.74.Final.jar (same path repeated for each consuming module)

Dependency Hierarchy:

- sts-2.17.264.jar (Root Library)
  - netty-nio-client-2.17.264.jar
    - :x: **netty-common-4.1.74.Final.jar** (Vulnerable Library)

Found in HEAD commit: [90bdaa7e7833bdd504c817e49d4434b4d8880f56](https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56)
Found in base branch: **main**

**Vulnerability Details**

Netty is an open-source, asynchronous event-driven network application framework. The package `io.netty:netty-codec-http` prior to version 4.1.77.Final contains an insufficient fix for CVE-2021-21290. When Netty's multipart decoders are used, local information disclosure can occur via the local system temporary directory if temporarily storing uploads on the disk is enabled. This only impacts applications running on Java version 6 and lower. Additionally, this vulnerability impacts code running on Unix-like systems, and very old versions of Mac OSX and Windows, as they all share the system temporary directory between all users. Version 4.1.77.Final contains a patch for this vulnerability. As a workaround, specify one's own `java.io.tmpdir` when starting the JVM or use `DefaultHttpDataFactory.setBaseDir(...)` to set the directory to something that is only readable by the current user.

- Publish Date: 2022-05-06
- URL: [CVE-2022-24823](https://www.mend.io/vulnerability-database/CVE-2022-24823)

**CVSS 3 Score Details (5.5)**

Base Score Metrics:

- Exploitability Metrics:
  - Attack Vector: Local
  - Attack Complexity: Low
  - Privileges Required: Low
  - User Interaction: None
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: High
  - Integrity Impact: None
  - Availability Impact: None

For more information on CVSS3 Scores, see https://www.first.org/cvss/calculator/3.0.

**Suggested Fix**

- Type: Upgrade version
- Origin: https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-24823
- Release Date: 2022-05-06
- Fix Resolution: io.netty:netty-all;io.netty:netty-common - 4.1.77.Final
CVE-2022-24823 (Medium) detected in netty-common-4.1.74.Final.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1379/comments
2
2022-05-08T19:55:50Z
2022-12-22T00:41:34Z
https://github.com/opensearch-project/data-prepper/issues/1379
1,228,973,384
1,379
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Project Resources -> Configuration Reference in README.md links to `https://opensearch.org/docs/latest/monitoring-plugins/trace/data-prepper-reference/` which doesn't seem to exist and just reroutes to the root of the docs site. **To Reproduce** 1. Visit the link in the section "Project Resources -> Configuration Reference" in README.md **Expected behavior** I would expect the link to go to a known-valid configuration reference for data prepper. **Screenshots** ![image](https://user-images.githubusercontent.com/96254688/166339483-f0a32141-85d6-447c-a721-799b0aa84bfd.png)
[BUG] README links to non existent page
https://api.github.com/repos/opensearch-project/data-prepper/issues/1375/comments
2
2022-05-02T22:55:45Z
2022-05-05T14:19:50Z
https://github.com/opensearch-project/data-prepper/issues/1375
1,223,441,410
1,375
[ "opensearch-project", "data-prepper" ]
**Describe the bug** The AggregateProcessor has had intermittent test failures. I've seen failures related to AggregateProcessor in the past, but am unsure of their causes. ``` com.amazon.dataprepper.plugins.processor.aggregate.AggregateProcessorIT > repetition 2 of 2 FAILED java.lang.AssertionError at AggregateProcessorIT.java:160 ``` [769](https://github.com/opensearch-project/data-prepper/runs/6243816891?check_suite_focus=true#step:4:769) Please add to this issue when a test fails in AggregateProcessor, to help track down the issue(s).
[BUG] Possible issue in AggregateProcessor or in tests
https://api.github.com/repos/opensearch-project/data-prepper/issues/1374/comments
5
2022-05-02T16:52:49Z
2023-11-10T14:04:14Z
https://github.com/opensearch-project/data-prepper/issues/1374
1,223,111,027
1,374
[ "opensearch-project", "data-prepper" ]
Update Data Prepper to use the opensearch-java client instead of the Rest High Level Client for bulk requests. This client should be compatible with both OpenSearch 1.x clusters and OpenSearch 2.x clusters. Additionally, it supports Java 8. Originally, Data Prepper was going to update to use the Rest High Level Client version `2.0.0-rc1`. However, that version requires Java 11.
Update to use the opensearch-java client for _bulk requests
https://api.github.com/repos/opensearch-project/data-prepper/issues/1347/comments
2
2022-04-29T16:16:37Z
2022-05-13T14:39:42Z
https://github.com/opensearch-project/data-prepper/issues/1347
1,221,185,246
1,347
[ "opensearch-project", "data-prepper" ]
This is a task to implement Conditional Routing as proposed in #1007. - [x] #1665 - [x] Conditional routing in Data Prepper core - [x] Updated support for plugin model to make routes available to all sinks
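The routing work above can be pictured with a pipeline sketch. The `route` section and per-sink `routes` key below illustrate the proposal in #1007 and are not necessarily the final syntax:

```yaml
log-pipeline:
  source:
    http:
  route:
    # named routes, each a Data Prepper conditional expression
    - application-logs: '/log_type == "application"'
    - error-logs: '/severity == "ERROR"'
  sink:
    - opensearch:
        hosts: ["https://my-opensearch"]
        index: application-logs
        routes: [application-logs]   # only events matching this route reach the sink
    - opensearch:
        hosts: ["https://my-opensearch"]
        index: error-logs
        routes: [error-logs]
```

A sink with no `routes` key would presumably continue to receive all events, preserving backward compatibility.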
Implement Conditional Routing for Sinks
https://api.github.com/repos/opensearch-project/data-prepper/issues/1337/comments
0
2022-04-26T20:30:02Z
2022-10-06T00:12:54Z
https://github.com/opensearch-project/data-prepper/issues/1337
1,216,444,138
1,337
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem?** Humans have to remember to document features, or at least to open issues in documentation-website for new features/changes to be documented. This is a manual process, e.g. https://github.com/opensearch-project/OpenSearch/issues/1711#issuecomment-1060893992 **What solution would you like?** Add a workflow for creating issues in the `documentation-website` repo whenever a label `needs-documentation` is added on a PR. Refer: https://github.com/opensearch-project/OpenSearch/pull/2929 **What alternatives have you considered?** Create issues manually in the `documentation-website` whenever a feature needs to be documented. **Do you have any additional context?** Add any other context or screenshots about the feature request here.
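A hedged sketch of such a workflow, modeled on the referenced OpenSearch PR; the label name, target repo, and `DOC_REPO_TOKEN` secret are assumptions for illustration:

```yaml
name: Create documentation issue
on:
  pull_request_target:
    types: [labeled]
jobs:
  create-doc-issue:
    if: github.event.label.name == 'needs-documentation'
    runs-on: ubuntu-latest
    steps:
      - uses: actions/github-script@v6
        with:
          # assumed: a token with permission to open issues in the documentation repo
          github-token: ${{ secrets.DOC_REPO_TOKEN }}
          script: |
            await github.rest.issues.create({
              owner: 'opensearch-project',
              repo: 'documentation-website',
              title: `[Doc] ${context.payload.pull_request.title}`,
              body: `Documentation is needed for ${context.payload.pull_request.html_url}`,
            });
```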
Add a workflow for documenting features
https://api.github.com/repos/opensearch-project/data-prepper/issues/1326/comments
1
2022-04-25T18:27:07Z
2022-05-03T14:21:39Z
https://github.com/opensearch-project/data-prepper/issues/1326
1,214,881,780
1,326
[ "opensearch-project", "data-prepper" ]
Just want to know how to perform a health check for Data Prepper. Thanks!
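One possible approach, sketched below; the ports and the `/list` path are assumptions based on the Data Prepper server APIs (port 4900 serves endpoints such as `/list` and `/metrics/prometheus`) and should be checked against your configuration:

```yaml
containers:
  - name: data-prepper
    image: opensearchproject/data-prepper:latest
    ports:
      - containerPort: 21890   # otel_trace_source
      - containerPort: 4900    # core server API
    livenessProbe:
      httpGet:
        path: /list            # returns the running pipelines
        port: 4900
      initialDelaySeconds: 30
      periodSeconds: 10
    readinessProbe:
      tcpSocket:
        port: 21890            # the source port only accepts traffic once the pipeline is up
      initialDelaySeconds: 10
      periodSeconds: 5
```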
Kubernetes deployment readiness probe and liveness probe
https://api.github.com/repos/opensearch-project/data-prepper/issues/1314/comments
5
2022-04-18T02:23:29Z
2022-04-26T14:52:13Z
https://github.com/opensearch-project/data-prepper/issues/1314
1,206,583,724
1,314
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Data Prepper silently stopped processing traces, and I found the log below in the pod log. I suspect this is the cause, but do not know how to fix it: 2022-04-12T15:03:07,141 [raw-pipeline-sink-worker-6-thread-1] WARN com.amazon.dataprepper.pipeline.Pipeline - Pipeline [raw-pipeline] - Workers did not terminate in time, forcing termination After this point, the traces disappear in my OpenSearch tracing GUI, and the otel-collector log started to show that it could not connect to Data Prepper ("connection refused" appeared in its log). **To Reproduce** Steps to reproduce the behavior: 1. Just regular usage: OTel Java agent --- otel-collector pod --- Data Prepper pod --- OpenSearch **Expected behavior** Data Prepper should keep working normally instead of stopping silently. **Environment (please complete the following information):** image: opensearchproject/data-prepper:1.3.0
[BUG] Pipeline [raw-pipeline] - Workers did not terminate in time, forcing termination
https://api.github.com/repos/opensearch-project/data-prepper/issues/1313/comments
3
2022-04-18T02:20:34Z
2022-04-26T14:53:10Z
https://github.com/opensearch-project/data-prepper/issues/1313
1,206,582,257
1,313
[ "opensearch-project", "data-prepper" ]
## CVE-2022-22968 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-context-5.3.18.jar</b></p></summary> <p>Spring Context</p> <p>Path to dependency file: /data-prepper-expression/build.gradle</p> <p>Path to vulnerable library: /e/caches/modules-2/files-2.1/org.springframework/spring-context/5.3.18/34f6683d9dbe6edb02ad9393df3d3211b5484622/spring-context-5.3.18.jar</p> <p> Dependency Hierarchy: - :x: **spring-context-5.3.18.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Spring Framework versions 5.3.0 - 5.3.18, 5.2.0 - 5.2.20, and older unsupported versions, the patterns for disallowedFields on a DataBinder are case sensitive which means a field is not effectively protected unless it is listed with both upper and lower case for the first character of the field, including upper and lower case for the first character of all nested fields within the property path. 
<p>Publish Date: 2022-04-14 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22968>CVE-2022-22968</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22968">https://tanzu.vmware.com/security/cve-2022-22968</a></p> <p>Release Date: 2022-04-14</p> <p>Fix Resolution: 5.3.19</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
CVE-2022-22968 (Medium) detected in spring-context-5.3.18.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/1312/comments
0
2022-04-15T01:06:41Z
2022-06-15T15:10:11Z
https://github.com/opensearch-project/data-prepper/issues/1312
1,205,150,966
1,312
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** OpenSearch 2.0 is releasing in May. Data Prepper should support OpenSearch 2.0. https://opensearch.org/blog/partners/2022/02/roadmap-proposal/ **Describe the solution you'd like** Verify that OpenSearch 2.0 works as a sink and make any necessary changes. **Tasks** - [x] #1139 - Data Prepper never used mapping types - [x] ~#877~ Not needed to support OpenSearch 2.0 - [x] #593 - [x] #1347
Support OpenSearch 2.0
https://api.github.com/repos/opensearch-project/data-prepper/issues/1311/comments
2
2022-04-12T18:45:04Z
2022-05-13T18:29:47Z
https://github.com/opensearch-project/data-prepper/issues/1311
1,202,255,373
1,311
[ "opensearch-project", "data-prepper" ]
PR #242 introduced OTEL metrics support for Data-Prepper. However, due to scope constraints and an [outdated Java OTEL protocol library](https://github.com/opensearch-project/data-prepper/issues/1269) not all features could be fully supported. With the upgraded library the following [metrics features](https://github.com/open-telemetry/opentelemetry-proto/tree/main/opentelemetry/proto) can be implemented and should thus be supported by Data Prepper: - Exponential Histograms - Flags - SchemaUrl - ScopeMetrics - Exemplars
Support latest OTEL metrics features
https://api.github.com/repos/opensearch-project/data-prepper/issues/1310/comments
1
2022-04-12T13:42:10Z
2022-11-04T15:59:59Z
https://github.com/opensearch-project/data-prepper/issues/1310
1,201,872,645
1,310
[ "opensearch-project", "data-prepper" ]
**Describe the bug** I created a test application with OTLP data and I can't get any documents into the ServiceMap index. Steps to reproduce the behavior: In a c# .net core console, add OpenTelemetry package and opentelemetry.exporter.opentelemetryprotocol Add : ``` var MyActivitySource = new ActivitySource(serviceName); using var activity = MyActivitySource.StartActivity("SayHello"); activity?.SetTag("baz", new int[] { 1, 2, 3 }); ``` Set up the OtelCollector to send data to Data Prepper: Config: ``` receivers: otlp: protocols: grpc: #endpoint: 0.0.0.0:55680 exporters: otlp/2: endpoint: localhost:21890 tls: insecure: true insecure_skip_verify: true logging: service: pipelines: traces: receivers: [otlp] exporters: [logging, otlp/2] ``` When the trace reaches the server, it is consumed and sent to OpenSearch, but only the raw-pipeline seems to send data. **Expected behavior** The service-map-pipeline should send documents to OpenSearch. **Environment (please complete the following information):** - OS: Windows - Data Prepper is running on the latest docker image - Version: latest as of 11 April 2022 Not sure if it is a bug or something wrong in my configuration. Is there any configuration to set to get verbose logging that can help track why the service-map pipeline is not sending data?
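For reference, trace analytics typically splits ingestion into a raw pipeline and a service-map pipeline; a hedged sketch of the service-map side (plugin and option names follow the common trace-analytics example and may differ in your setup). Note also that service-map documents are derived from relationships between spans across services, so a single standalone span may not produce any:

```yaml
entry-pipeline:
  source:
    otel_trace_source:
      ssl: false
  sink:
    - pipeline:
        name: raw-pipeline
    - pipeline:
        name: service-map-pipeline
service-map-pipeline:
  source:
    pipeline:
      name: entry-pipeline
  processor:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: ["https://localhost:9200"]
        trace_analytics_service_map: true
```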
[BUG] NoService document
https://api.github.com/repos/opensearch-project/data-prepper/issues/1308/comments
2
2022-04-11T16:41:56Z
2022-04-14T10:40:34Z
https://github.com/opensearch-project/data-prepper/issues/1308
1,200,240,409
1,308
[ "opensearch-project", "data-prepper" ]
**Describe the bug** In my application, if I set a tag with a array of int as value, the data-prepper is unable to process the ressourcespan. **To Reproduce** Steps to reproduce the behavior: In a c# .net core console, add OpenTelemetry package and opentelemetry.exporter.opentelemetryprotocol Add : ``` var MyActivitySource = new ActivitySource(serviceName); using var activity = MyActivitySource.StartActivity("SayHello"); activity?.SetTag("baz", new int[] { 1, 2, 3 }); ``` Setup the OtelCollector to send data to the Data-prepper: Config: ``` receivers: otlp: protocols: grpc: #endpoint: 0.0.0.0:55680 exporters: otlp/2: endpoint: localhost:21890 tls: insecure: true insecure_skip_verify: true logging: service: pipelines: traces: receivers: [otlp] exporters: [logging, otlp/2] ``` when the trace reach the server, it can't be consumed as the "bar" attributes is set multiples times (Array data). **Expected behavior** No errors. **Error Logs** ``` 2022-04-11T15:20:06,952 [raw-pipeline-processor-worker-5-thread-1] ERROR com.amazon.dataprepper.plugins.prepper.oteltrace.OTelTraceRawPrepper - Unable to process invalid ResourceSpan resource { data-prepper | attributes { data-prepper | key: "service.name" data-prepper | value { data-prepper | string_value: "MDS.TestSerilog.Test" data-prepper | } data-prepper | } data-prepper | attributes { data-prepper | key: "service.version" data-prepper | value { data-prepper | string_value: "1.0.0" data-prepper | } data-prepper | } data-prepper | attributes { data-prepper | key: "service.instance.id" data-prepper | value { data-prepper | string_value: "be537731-e6c8-48b8-8202-2d50301b61c9" data-prepper | } data-prepper | } data-prepper | } data-prepper | instrumentation_library_spans { data-prepper | instrumentation_library { data-prepper | name: "MDS.TestSerilog.Test" data-prepper | } data-prepper | spans { data-prepper | trace_id: "\355\273\256\213\nc9D\253\211\324\240lD\220s" data-prepper | span_id: "\307) R\276\247\244A" data-prepper | 
name: "SayHello" data-prepper | kind: SPAN_KIND_INTERNAL data-prepper | start_time_unix_nano: 1649690403941588600 data-prepper | end_time_unix_nano: 1649690403942298100 data-prepper | attributes { data-prepper | key: "foo" data-prepper | value { data-prepper | int_value: 1 data-prepper | } data-prepper | } data-prepper | attributes { data-prepper | key: "bar" data-prepper | value { data-prepper | string_value: "Hello, World32!" data-prepper | } data-prepper | } data-prepper | attributes { data-prepper | key: "baz" data-prepper | value { data-prepper | int_value: 1 data-prepper | } data-prepper | } data-prepper | attributes { data-prepper | key: "baz" data-prepper | value { data-prepper | int_value: 2 data-prepper | } data-prepper | } data-prepper | attributes { data-prepper | key: "baz" data-prepper | value { data-prepper | int_value: 3 data-prepper | } data-prepper | } data-prepper | status { data-prepper | } data-prepper | } data-prepper | } data-prepper | : data-prepper | java.lang.IllegalStateException: Duplicate key span.attributes.baz (attempted merging values 1 and 2) ``` **Environment (please complete the following information):** - OS: Windows / linux - Data-preper is running on latest docker image - Version: latest as of 11 April 2022 Thanks for the help!
[BUG] OTEL - Activity Tag: Array is not supported
https://api.github.com/repos/opensearch-project/data-prepper/issues/1307/comments
4
2022-04-11T15:53:33Z
2023-07-23T16:25:03Z
https://github.com/opensearch-project/data-prepper/issues/1307
1,200,179,735
1,307
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper is currently creating legacy templates for indices. This approach is useful because it allows Data Prepper to support a wider range of engine versions. However, legacy templates are not applied if any composable index template matches. Users who configure OpenSearch with a default index pattern for all indices will not have the correct trace templates applied to their indices in this case. **Describe the solution you'd like** Support composable index templates as well as legacy templates. Two broad approaches: 1. Data Prepper can detect the engine version and use the most appropriate type. 2. Pipeline authors could configure which to use via a pipeline configuration. To avoid any potentially breaking changes, I propose that Data Prepper support this with a configuration value. There would be three options: * `v1` - The current v1 (i.e., legacy) template * `index-template` - New composable templates * ~`auto` - Let Data Prepper decide based on the engine version.~ The default value would be `v1`. In a major version release, the default would change to `auto`. **Additional context** See the issue that occurred in #1215.
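A hedged sketch of how the proposed option might appear in a sink configuration; the option name `template_type` and its values are illustrative of the proposal, not a committed API:

```yaml
sink:
  - opensearch:
      hosts: ["https://my-opensearch"]
      index: my-index
      template_file: /usr/share/data-prepper/index-template.json
      # proposed values: v1 (default) or index-template
      template_type: index-template
```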
Support Composable Index Templates
https://api.github.com/repos/opensearch-project/data-prepper/issues/1275/comments
2
2022-04-06T18:40:02Z
2023-06-07T23:31:16Z
https://github.com/opensearch-project/data-prepper/issues/1275
1,194,996,319
1,275
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper trace ingestion now supports `record_type: event`, which uses Event as the data model and flow unit processed by the trace ingestion plugins. The classic record type `record_type: otlp`, which uses ExportTraceServiceRequest as the data model, will be removed in 2.0.
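For context, the event model can already be selected on the trace source; a hedged sketch (after removal, the `record_type` setting would presumably either disappear or only accept `event`):

```yaml
entry-pipeline:
  source:
    otel_trace_source:
      record_type: event   # the Event data model; "otlp" is the classic ExportTraceServiceRequest type
  sink:
    - pipeline:
        name: raw-pipeline
```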
Remove ExportTraceServiceRequest record type as Data Prepper supported internal data type.
https://api.github.com/repos/opensearch-project/data-prepper/issues/1272/comments
4
2022-04-05T21:57:56Z
2022-09-27T18:52:08Z
https://github.com/opensearch-project/data-prepper/issues/1272
1,193,762,047
1,272
[ "opensearch-project", "data-prepper" ]
Some tests break when building Data Prepper with non-US locale settings. **Affected tests:** - DefaultPluginFactoryIT (locale specific validation error) - DateProcessorTests (locale specific date patterns)
[BUG] Some tests fail when building Data Prepper with non-US locale settings
https://api.github.com/repos/opensearch-project/data-prepper/issues/1270/comments
3
2022-04-05T17:27:22Z
2022-11-03T16:35:19Z
https://github.com/opensearch-project/data-prepper/issues/1270
1,193,503,149
1,270
[ "opensearch-project", "data-prepper" ]
The currently used OTLP protocol implementation is outdated (1.0.1-alpha). Although it is still marked as alpha, it would be great if a more recent OTLP protobuf implementation were used, since some OTEL features (metrics ExponentialHistogram, etc.) have changed since then and could consequently be supported by Data Prepper. **Note**: the OTLP proto implementation got new Maven coordinates. See the [release notes of v1.9.0](https://github.com/open-telemetry/opentelemetry-java/releases/tag/v1.9.0) for more details.
Update Java Bindings for the OpenTelemetry Protocol
https://api.github.com/repos/opensearch-project/data-prepper/issues/1269/comments
0
2022-04-05T17:14:07Z
2022-09-09T01:50:44Z
https://github.com/opensearch-project/data-prepper/issues/1269
1,193,489,025
1,269
[ "opensearch-project", "data-prepper" ]
## WS-2022-0107 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>spring-beans-5.3.16.jar</b>, <b>spring-beans-5.3.15.jar</b></p></summary> <p> <details><summary><b>spring-beans-5.3.16.jar</b></p></summary> <p>Spring Beans</p> <p>Library home page: <a href="https://github.com/">https://github.com/</a></p> <p>Path to dependency file: /data-prepper-core/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-beans/5.3.16/15decec5cea7a91423272daaae6f5d050c23cf3b/spring-beans-5.3.16.jar</p> <p> Dependency Hierarchy: - spring-context-5.3.16.jar (Root Library) - :x: **spring-beans-5.3.16.jar** (Vulnerable Library) </details> <details><summary><b>spring-beans-5.3.15.jar</b></p></summary> <p>Spring Beans</p> <p>Path to dependency file: /data-prepper-expression/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-beans/5.3.15/a88e2ccfe8b131bcff2e643b90d52f6d928e7369/spring-beans-5.3.15.jar</p> <p> Dependency Hierarchy: - spring-context-5.3.15.jar (Root Library) - :x: **spring-beans-5.3.15.jar** (Vulnerable Library) </details> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Spring Framework before 5.2.20 and 5.3.x before 5.3.18 are vulnerable due to a vulnerability in Spring-beans which allows attackers under certain circumstances to achieve remote code execution, this vulnerability is also known as ״Spring4Shell״ or ״SpringShell״. The current POC related to the attack is done by creating a specially crafted request which manipulates ClassLoader to successfully achieve RCE (Remote Code Execution). 
Please note that the ease of exploitation may diverge by the code implementation. Currently, the exploit requires JDK 9 or higher, Apache Tomcat as the Servlet container, the application Packaged as WAR, and dependency on spring-webmvc or spring-webflux. Spring Framework 5.3.18 and 5.2.20 have already been released. WhiteSource’s research team is carefully observing developments and researching the case. We will keep updating this page and our WhiteSource resources with updates. This is a temporary WhiteSource ID until an official CVE ID will be released. <p>Publish Date: 2022-03-30 <p>URL: <a href=https://www.cyberkendra.com/2022/03/springshell-rce-0-day-vulnerability.html>WS-2022-0107</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://spring.io/blog/2022/03/31/spring-framework-rce-early-announcement">https://spring.io/blog/2022/03/31/spring-framework-rce-early-announcement</a></p> <p>Release Date: 2022-03-30</p> <p>Fix Resolution: org.springframework:spring-beans:5.2.20.RELEASE,5.3.18</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.3.16","packageFilePaths":["/data-prepper-core/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework:spring-context:5.3.16;org.springframework:spring-beans:5.3.16","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-beans:5.2.20.RELEASE,5.3.18","isBinary":false},{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.3.15","packageFilePaths":["/data-prepper-expression/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework:spring-context:5.3.15;org.springframework:spring-beans:5.3.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-beans:5.2.20.RELEASE,5.3.18","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2022-0107","vulnerabilityDetails":"Spring Framework before 5.2.20 and 5.3.x before 5.3.18 are vulnerable due to a vulnerability in Spring-beans which allows attackers under certain circumstances to achieve remote code execution, this vulnerability is also known as ״Spring4Shell״ or ״SpringShell״.\n\nThe current POC related to the attack is done by creating a specially crafted request which manipulates ClassLoader to successfully achieve RCE (Remote Code 
Execution).\nPlease note that the ease of exploitation may diverge by the code implementation.\n\nCurrently, the exploit requires JDK 9 or higher, Apache Tomcat as the Servlet container, the application Packaged as WAR, and dependency on spring-webmvc or spring-webflux.\nSpring Framework 5.3.18 and 5.2.20 have already been released.\n\nWhiteSource’s research team is carefully observing developments and researching the case. We will keep updating this page and our WhiteSource resources with updates.\nThis is a temporary WhiteSource ID until an official CVE ID will be released.","vulnerabilityUrl":"https://www.cyberkendra.com/2022/03/springshell-rce-0-day-vulnerability.html","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
WS-2022-0107 (High) detected in spring-beans-5.3.16.jar, spring-beans-5.3.15.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1249/comments
1
2022-03-31T17:02:21Z
2022-03-31T21:35:06Z
https://github.com/opensearch-project/data-prepper/issues/1249
1,188,382,081
1,249
[ "opensearch-project", "data-prepper" ]
## CVE-2022-22965 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-beans-5.3.15.jar</b></p></summary> <p>Spring Beans</p> <p>Path to dependency file: /data-prepper-expression/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-beans/5.3.15/a88e2ccfe8b131bcff2e643b90d52f6d928e7369/spring-beans-5.3.15.jar</p> <p> Dependency Hierarchy: - spring-context-5.3.15.jar (Root Library) - :x: **spring-beans-5.3.15.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/37fa594796a7285c58043fbf9951b6408acb342e">37fa594796a7285c58043fbf9951b6408acb342e</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Spring MVC or Spring WebFlux application running on JDK 9+ may be vulnerable to remote code execution (RCE) via data binding. The specific exploit requires the application to run on Tomcat as a WAR deployment. If the application is deployed as a Spring Boot executable jar, i.e. the default, it is not vulnerable to the exploit. However, the nature of the vulnerability is more general, and there may be other ways to exploit it. 
<p>Publish Date: 2022-04-01 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22965>CVE-2022-22965</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://spring.io/blog/2022/03/31/spring-framework-rce-early-announcement">https://spring.io/blog/2022/03/31/spring-framework-rce-early-announcement</a></p> <p>Release Date: 2022-04-01</p> <p>Fix Resolution: org.springframework:spring-beans:5.2.20.RELEASE,5.3.18</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-beans","packageVersion":"5.3.15","packageFilePaths":["/data-prepper-expression/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework:spring-context:5.3.15;org.springframework:spring-beans:5.3.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-beans:5.2.20.RELEASE,5.3.18","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2022-22965","vulnerabilityDetails":"A Spring MVC or Spring WebFlux application running on JDK 9+ may be vulnerable to remote code execution (RCE) via data binding. 
The specific exploit requires the application to run on Tomcat as a WAR deployment. If the application is deployed as a Spring Boot executable jar, i.e. the default, it is not vulnerable to the exploit. However, the nature of the vulnerability is more general, and there may be other ways to exploit it.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22965","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
CVE-2022-22965 (High) detected in spring-beans-5.3.15.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1248/comments
3
2022-03-31T17:02:19Z
2022-04-20T14:20:32Z
https://github.com/opensearch-project/data-prepper/issues/1248
1,188,382,040
1,248
[ "opensearch-project", "data-prepper" ]
## CVE-2022-22950 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-expression-5.3.15.jar</b></p></summary> <p>Spring Expression Language (SpEL)</p> <p>Path to dependency file: /data-prepper-expression/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.springframework/spring-expression/5.3.15/362f36bbc4c4b46cc2e4f219df22d08945000c2/spring-expression-5.3.15.jar</p> <p> Dependency Hierarchy: - spring-context-5.3.15.jar (Root Library) - :x: **spring-expression-5.3.15.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/37fa594796a7285c58043fbf9951b6408acb342e">37fa594796a7285c58043fbf9951b6408acb342e</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> n Spring Framework versions 5.3.0 - 5.3.16 and older unsupported versions, it is possible for a user to provide a specially crafted SpEL expression that may cause a denial of service condition. <p>Publish Date: 2022-04-01 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22950>CVE-2022-22950</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://tanzu.vmware.com/security/cve-2022-22950">https://tanzu.vmware.com/security/cve-2022-22950</a></p> <p>Release Date: 2022-04-01</p> <p>Fix Resolution: org.springframework:spring-expression:5.2.20,5.3.17</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.springframework","packageName":"spring-expression","packageVersion":"5.3.15","packageFilePaths":["/data-prepper-expression/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.springframework:spring-context:5.3.15;org.springframework:spring-expression:5.3.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.springframework:spring-expression:5.2.20,5.3.17","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2022-22950","vulnerabilityDetails":"n Spring Framework versions 5.3.0 - 5.3.16 and older unsupported versions, it is possible for a user to provide a specially crafted SpEL expression that may cause a denial of service condition.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-22950","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
CVE-2022-22950 (Medium) detected in spring-expression-5.3.15.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1247/comments
3
2022-03-30T20:53:02Z
2022-04-20T14:20:35Z
https://github.com/opensearch-project/data-prepper/issues/1247
1,186,976,977
1,247
[ "opensearch-project", "data-prepper" ]
Add in a new processor, ConvertEventProcessor. Its intent is to take an entry in a record and try to convert it to a desired type. It should at least support converting to boolean, integer, float, and string.
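A minimal sketch of the conversion behavior such a processor could implement (illustrative Python, not the Data Prepper implementation; `convert_entry` and the keep-original-on-failure policy are assumptions):

```python
def convert_entry(value, target_type):
    """Best-effort conversion of one event entry to a desired type (sketch)."""
    converters = {
        "integer": int,
        "float": float,
        "string": str,
        # Assumed boolean semantics: common truthy spellings map to True.
        "boolean": lambda v: str(v).strip().lower() in ("true", "1", "yes"),
    }
    try:
        return converters[target_type](value)
    except (ValueError, TypeError):
        return value  # leave the entry unchanged if conversion fails

event = {"status": "200", "success": "true"}
event["status"] = convert_entry(event["status"], "integer")
event["success"] = convert_entry(event["success"], "boolean")
```

A real processor would also need to decide how failures are surfaced (drop, tag, or pass through); the sketch simply passes the original value through.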
Mutate Event Processor - ConvertEventProcessor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1240/comments
0
2022-03-24T16:44:04Z
2022-03-24T16:44:15Z
https://github.com/opensearch-project/data-prepper/issues/1240
1,179,769,613
1,240
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper logs the main Data Prepper port number (default 4900). But, the OTel and Log sources do not log the ports they open. These ports have default values and thus there is no way to know what port they are running on beyond looking at the documentation. **Describe the solution you'd like** Log the port when each source with a port starts up. **Describe alternatives you've considered (Optional)** Perhaps the Data Prepper management endpoints could provide something to help here. But that is more complicated. Logging is easy. **Additional context** N/A
Log ports when starting Data Prepper
https://api.github.com/repos/opensearch-project/data-prepper/issues/1238/comments
0
2022-03-23T20:43:58Z
2022-06-23T15:16:46Z
https://github.com/opensearch-project/data-prepper/issues/1238
1,178,645,177
1,238
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper trace indices use a sequential index pattern. Some situations warrant a date-time index pattern instead. **Describe the solution you'd like** Support date-time patterns in trace index names. **Additional context** Data Prepper supports date-time patterns on trace indices in #767 . This is somewhat related to #1221 in that Data Prepper would need to support configurable index names.
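Resolving a date-time pattern into a concrete index name could look roughly like this (a Python sketch; the `%{yyyy.MM.dd}` pattern syntax and the directive subset are assumptions, not Data Prepper's actual implementation):

```python
import re
from datetime import datetime, timezone

# Small, illustrative subset of Java-style date directives mapped to strftime.
_DIRECTIVES = {"yyyy": "%Y", "MM": "%m", "dd": "%d", "HH": "%H"}

def resolve_index_name(pattern, now=None):
    """Expand a pattern like 'otel-v1-apm-span-%{yyyy.MM.dd}' at write time."""
    now = now or datetime.now(timezone.utc)

    def expand(match):
        fmt = match.group(1)
        for java_directive, strftime_directive in _DIRECTIVES.items():
            fmt = fmt.replace(java_directive, strftime_directive)
        return now.strftime(fmt)

    return re.sub(r"%\{([^}]+)\}", expand, pattern)
```

For example, `resolve_index_name("otel-v1-apm-span-%{yyyy.MM.dd}", datetime(2022, 3, 21))` yields `otel-v1-apm-span-2022.03.21`; names without a pattern pass through unchanged.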
Support Date-Time Patterns in Trace Indices
https://api.github.com/repos/opensearch-project/data-prepper/issues/1228/comments
0
2022-03-21T16:51:39Z
2022-06-08T19:10:43Z
https://github.com/opensearch-project/data-prepper/issues/1228
1,175,661,555
1,228
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** The current docker-compose example projects are built and tested on macOS/Linux. **Describe the solution you'd like** Add manual or automated testing to verify all docker-compose files are windows compatible or add windows specific instructions. - Windows Linux containers - Windows containers - Reference to running using WSL/WSL2 - Reference using a VM (Virtual Box) - Windows docker network configuration Add windows specific documentation for getting started with data prepper (if needed) Add documentation for troubleshooting windows / docker networking issues - Group policy - Registry settings - Network settings (list of network devices in windows, what network is docker connecting to) **Describe alternatives you've considered (Optional)** 1. Not supporting windows 2. Only supporting VM (Virtual Box) **Additional context** Opening this issue based on questions from [community forum](https://discuss.opendistrocommunity.dev/c/data-prepper)
Add docker-compose examples for windows
https://api.github.com/repos/opensearch-project/data-prepper/issues/1227/comments
0
2022-03-21T16:42:45Z
2022-04-19T18:27:04Z
https://github.com/opensearch-project/data-prepper/issues/1227
1,175,651,608
1,227
[ "opensearch-project", "data-prepper" ]
## CVE-2021-44906 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>minimist-1.2.5.tgz</b></p></summary> <p>parse argument options</p> <p>Library home page: <a href="https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz">https://registry.npmjs.org/minimist/-/minimist-1.2.5.tgz</a></p> <p>Path to dependency file: /release/staging-resources-cdk/package.json</p> <p>Path to vulnerable library: /release/staging-resources-cdk/node_modules/minimist/package.json</p> <p> Dependency Hierarchy: - ts-jest-26.5.6.tgz (Root Library) - json5-2.2.0.tgz - :x: **minimist-1.2.5.tgz** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Minimist <=1.2.5 is vulnerable to Prototype Pollution via file index.js, function setKey() (lines 69-95). <p>Publish Date: 2022-03-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44906>CVE-2021-44906</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/substack/minimist/issues/164">https://github.com/substack/minimist/issues/164</a></p> <p>Release Date: 2022-03-17</p> <p>Fix Resolution: minimist - 1.2.6</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"minimist","packageVersion":"1.2.5","packageFilePaths":["/release/staging-resources-cdk/package.json"],"isTransitiveDependency":true,"dependencyTree":"ts-jest:26.5.6;json5:2.2.0;minimist:1.2.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"minimist - 1.2.6","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-44906","vulnerabilityDetails":"Minimist \u003c\u003d1.2.5 is vulnerable to Prototype Pollution via file index.js, function setKey() (lines 69-95).","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44906","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
CVE-2021-44906 (High) detected in minimist-1.2.5.tgz - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1226/comments
2
2022-03-19T01:15:51Z
2022-03-26T00:07:01Z
https://github.com/opensearch-project/data-prepper/issues/1226
1,174,135,210
1,226
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Like many other organizations, we leverage OpenSearch across multiple groups, who have different role-based access to the system. In our case, most access fits a model where group1 has access to indices group1-\*, group2 has access to indices group2-\*, and so on. The trace analytics stack (data-prepper and observability plugins) requires the use of indexes otel-v1-apm-span-\* and otel-v1-apm-service-map\*, which does not fit RBAC policies locked down by index. **Describe the solution you'd like** Please have observability and data-prepper teams work together to support other indexes for traces and services for the trace analytics stack. **Describe alternatives you've considered** N/A **Additional context** N/A ref: https://github.com/opensearch-project/dashboards-observability/issues/15
trace analytics - please support other indexes
https://api.github.com/repos/opensearch-project/data-prepper/issues/1221/comments
0
2022-03-18T16:46:37Z
2023-01-03T17:50:14Z
https://github.com/opensearch-project/data-prepper/issues/1221
1,173,799,363
1,221
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** As a user who wants to use the string processors like `substitute`, `trim`, `uppercase`, etc., I would prefer to not have to manually use these processors on individual elements of an array of strings one by one. **Describe the solution you'd like** I would like support for key values that are of type `List<String>`, and have the action (e.g. substitute, trim) be completed on all elements of the array.
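The requested behavior amounts to mapping the existing scalar string action over list values; a hedged Python sketch (names illustrative, not the actual mutate-string processor code):

```python
def apply_string_action(value, action):
    """Apply a string action to a scalar, or to every element of a list."""
    if isinstance(value, list):
        return [apply_string_action(element, action) for element in value]
    # Non-string elements (e.g. numbers mixed into the array) pass through.
    return action(value) if isinstance(value, str) else value

event = {"tags": ["  web  ", "  db "], "host": "  node-1 "}
event["tags"] = apply_string_action(event["tags"], str.strip)
event["host"] = apply_string_action(event["host"], str.strip)
```

The recursion also covers nested arrays for free; whether non-string elements should pass through or raise is a policy choice the real processor would have to make explicit.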
Support arrays of Strings in the Mutate String Processors
https://api.github.com/repos/opensearch-project/data-prepper/issues/1217/comments
0
2022-03-17T21:54:14Z
2022-04-19T19:08:30Z
https://github.com/opensearch-project/data-prepper/issues/1217
1,172,957,745
1,217
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Saving trace data to OpenSearch is producing the following error: > [illegal_argument_exception] Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [traceGroup] in order to load field data by uninverting the inverted index. Note that this can use significant memory. **Environment (please complete the following information):** - OS: Docker image - Version 1.2.1 - OpenSearch: 1.0 - OTel Collector 0.37.0
[BUG] Unoptimized text field used for traceGroup
https://api.github.com/repos/opensearch-project/data-prepper/issues/1215/comments
3
2022-03-17T20:29:49Z
2022-04-07T08:11:50Z
https://github.com/opensearch-project/data-prepper/issues/1215
1,172,880,737
1,215
[ "opensearch-project", "data-prepper" ]
Create Data Prepper 1.3 Changelog The Changelog is a detailed overview of all the changes made to Data Prepper in this release. It needs to be generated from Git history.
Create Data Prepper 1.3.0 Changelog
https://api.github.com/repos/opensearch-project/data-prepper/issues/1195/comments
2
2022-03-15T16:04:36Z
2022-03-23T22:00:51Z
https://github.com/opensearch-project/data-prepper/issues/1195
1,169,876,806
1,195
[ "opensearch-project", "data-prepper" ]
Currently data-prepper docker images are still using `amazoncorretto:15-al2-full` which is a year old and not even on the support list anymore. I'm not sure which version of Java data-prepper now supports. Besides that, switching to the alpine versions is possible, since I don't see any other use than executing `java`. That would also decrease the image size. https://github.com/opensearch-project/data-prepper/blob/0dbdddac25b4a281bc5bedea484827beedf34f37/release/docker/Dockerfile#L1 https://github.com/opensearch-project/data-prepper/blob/26526f6d3011d01e74f648709c0c3b9c00edd151/examples/dev/trace-analytics-sample-app/Dockerfile#L6 If you wish I could also come up with a PR for it.
Update to newer version amazoncorretto
https://api.github.com/repos/opensearch-project/data-prepper/issues/1194/comments
3
2022-03-15T15:38:02Z
2022-04-28T06:14:08Z
https://github.com/opensearch-project/data-prepper/issues/1194
1,169,841,939
1,194
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Today [JacksonEvent.get()](https://github.com/opensearch-project/data-prepper/blob/299a638da0f45d10551e15fd3ebd59b7e0511e96/data-prepper-api/src/main/java/com/amazon/dataprepper/model/event/JacksonEvent.java#L161) checks if the key matches the pattern `^[A-Za-z0-9]+([A-Za-z0-9.-_][A-Za-z0-9])*$`. If the key does not match, an `IllegalArgumentException` is thrown by [JacksonEvent.checkKey()](https://github.com/opensearch-project/data-prepper/blob/299a638da0f45d10551e15fd3ebd59b7e0511e96/data-prepper-api/src/main/java/com/amazon/dataprepper/model/event/JacksonEvent.java#L287). Internally [JacksonEvent.get()](https://github.com/opensearch-project/data-prepper/blob/299a638da0f45d10551e15fd3ebd59b7e0511e96/data-prepper-api/src/main/java/com/amazon/dataprepper/model/event/JacksonEvent.java#L161) uses a Jackson JsonPointer for key lookups. **Describe the solution you'd like** Update [JacksonEvent.checkKey()](https://github.com/opensearch-project/data-prepper/blob/299a638da0f45d10551e15fd3ebd59b7e0511e96/data-prepper-api/src/main/java/com/amazon/dataprepper/model/event/JacksonEvent.java#L287) to support the full set of characters supported by [JsonPointer](https://fasterxml.github.io/jackson-core/javadoc/2.9/com/fasterxml/jackson/core/JsonPointer.html). **Describe alternatives you've considered (Optional)** Supporting a subset of [JsonPointer](https://fasterxml.github.io/jackson-core/javadoc/2.9/com/fasterxml/jackson/core/JsonPointer.html) supported characters. **Additional context** Creating JacksonEvents using a wider set of key characters will need additional testing, not all characters may work with the existing model.
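The gap described above can be demonstrated directly with the quoted pattern (a Python sketch; `key_allowed` is illustrative, not the actual `checkKey` code):

```python
import re

# The key pattern quoted in the issue. Note that the unescaped '-' between
# '.' and '_' makes '.-_' a character range here; the intended class was
# probably '[A-Za-z0-9._-]'.
KEY_PATTERN = re.compile(r"^[A-Za-z0-9]+([A-Za-z0-9.-_][A-Za-z0-9])*$")

def key_allowed(key):
    """Return True if the key passes the current checkKey-style validation."""
    return KEY_PATTERN.fullmatch(key) is not None

# JSON Pointer reference tokens may contain nearly any character ('~' and '/'
# are escaped as '~0' and '~1'), so keys like these are addressable through a
# JsonPointer even though the current check rejects them:
rejected = [k for k in ("field name", "user$id", "log.level") if not key_allowed(k)]
```

Here `log.level` passes while `field name` and `user$id` are rejected, even though all three are valid JSON Pointer reference tokens.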
Add support for additional character in JacksonEvent get by key methods
https://api.github.com/repos/opensearch-project/data-prepper/issues/1190/comments
1
2022-03-14T20:03:09Z
2022-04-19T19:09:14Z
https://github.com/opensearch-project/data-prepper/issues/1190
1,168,843,815
1,190
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Today `DropEventsProcessor` requires the annotation `@SingleThread` and `ConditionalExpressionEvaluator` requires `@Scope(ConfigurableBeanFactory.SCOPE_PROTOTYPE)`. `DropEventsProcessor` could be a shared instance processor if `ConditionalExpressionEvaluator` had support for concurrent execution. **Describe the solution you'd like** Enhance ConditionalExpressionEvaluator to support concurrent execution. A few options that could work: 1. Modify ConditionalExpressionEvaluator.evaluate(...) to return a future that contains a scoped instance of ParseTreeEvaluatorListener 2. Add a synchronous queue of requests to ConditionalExpressionEvaluator 3. Create a ParseTreeEvaluatorListener per evaluate request **Describe alternatives you've considered (Optional)** Expect `ConditionalExpressionEvaluator` to handle concurrency. **Additional context** n/a
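Of the options listed, creating a `ParseTreeEvaluatorListener` per evaluate request (option 3) can be sketched as follows (illustrative Python; the class and method names only mirror the issue, not the real ANTLR-based code):

```python
from concurrent.futures import ThreadPoolExecutor

class ParseTreeEvaluatorListener:
    """Toy stand-in for the listener: holds per-evaluation mutable state."""
    def __init__(self, event):
        self.event = event
        self.stack = []

    def equals(self, key, expected):
        result = self.event.get(key) == expected
        self.stack.append(result)
        return result

class ConditionalExpressionEvaluator:
    """A fresh listener is built for every evaluate() call, so the evaluator
    itself holds no per-request state and one instance can be shared across
    processor threads without the @SingleThread restriction."""
    def evaluate(self, key, expected, event):
        listener = ParseTreeEvaluatorListener(event)
        return listener.equals(key, expected)

evaluator = ConditionalExpressionEvaluator()
events = [{"status": i % 2} for i in range(100)]
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(lambda e: evaluator.evaluate("status", 1, e), events))
```

Because the mutable stack lives in the per-call listener rather than in the shared evaluator, concurrent `evaluate()` calls cannot corrupt each other's state.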
Data Prepper Expression add concurrency support
https://api.github.com/repos/opensearch-project/data-prepper/issues/1189/comments
0
2022-03-14T19:49:15Z
2022-09-29T23:39:18Z
https://github.com/opensearch-project/data-prepper/issues/1189
1,168,831,446
1,189
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Data Prepper Expression strings whose first character is `/` are evaluated as an escaped JSON Pointer. **To Reproduce** Steps to reproduce the behavior: Using the following configuration: ```yaml simple-pipeline: workers: 1 source: http: thread_count: 1 prepper: - drop_events: drop_when: '/message == "/Hello"' sink: - file: path: /tmp/sink.log ``` 1. Start Data Prepper 2. `curl -k -H "Content-Type: application/json" -d '{"message": "/Hello"}' 'http://localhost:2021/log/ingest` 3. Verify in `/tmp/sink.log` that the event was not dropped. **Expected behavior** Support for strings starting with a forward slash '/' **Screenshots** <img width="1195" alt="image" src="https://user-images.githubusercontent.com/8837988/158248313-7e92c69e-43ee-49d2-b549-8b26f113401c.png"> **Environment (please complete the following information):** - OS: MacOS 12.2.1 - Version: Data Prepper 1.3.0 **Additional context** n/a
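One way the expression language could disambiguate the two cases is to classify any quoted token as a string literal regardless of its leading character, and only treat unquoted `/`-prefixed tokens as JSON Pointers. A Python sketch of that idea (not the ANTLR grammar Data Prepper actually uses):

```python
import re

# One token per operand: double- or single-quoted text is a string literal;
# an unquoted token starting with '/' is a JSON Pointer.
TOKEN_RE = re.compile(r'"(?P<dq>[^"]*)"|\'(?P<sq>[^\']*)\'|(?P<ptr>/[\w/]+)')

def classify_operand(token):
    match = TOKEN_RE.fullmatch(token)
    if match is None:
        raise ValueError(f"unrecognized operand: {token!r}")
    if match.group("ptr") is not None:
        return ("json_pointer", match.group("ptr"))
    text = match.group("dq") if match.group("dq") is not None else match.group("sq")
    return ("string", text)
```

Under this rule, `"/Hello"` in the reproduction above classifies as the string `/Hello`, while the unquoted `/message` remains a pointer.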
[BUG] Data Prepper Expression string starting in / is considered escaped Json Pointer
https://api.github.com/repos/opensearch-project/data-prepper/issues/1188/comments
2
2022-03-14T19:40:37Z
2025-04-24T01:46:34Z
https://github.com/opensearch-project/data-prepper/issues/1188
1,168,824,066
1,188
[ "opensearch-project", "data-prepper" ]
## CVE-2020-36518 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.13.1.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /data-prepper-plugins/http-source/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.13.1/698b2d2b15d9a1b7aae025f1d9f576842285e7f6/jackson-databind-2.13.1.jar,/e/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.13.1/698b2d2b15d9a1b7aae025f1d9f576842285e7f6/jackson-databind-2.13.1.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.13.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/e04884680bf334075d12fd96e68d0c928a5585da">e04884680bf334075d12fd96e68d0c928a5585da</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects. WhiteSource Note: After conducting further research, WhiteSource has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518. <p>Publish Date: 2022-03-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36518>CVE-2020-36518</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-36518">https://nvd.nist.gov/vuln/detail/CVE-2020-36518</a></p> <p>Release Date: 2022-03-11</p> <p>Fix Resolution: jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.13.1","packageFilePaths":["/data-prepper-plugins/http-source/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.13.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jackson-databind-2.10 - 2.10.1;com.fasterxml.jackson.core.jackson-databind - 2.6.2.v20161117-2150","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-36518","vulnerabilityDetails":"jackson-databind before 2.13.0 allows a Java StackOverflow exception and denial of service via a large depth of nested objects.\n WhiteSource Note: After conducting further research, WhiteSource has determined that all versions of com.fasterxml.jackson.core:jackson-databind up to version 2.13.2 are vulnerable to CVE-2020-36518.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36518","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
CVE-2020-36518 (High) detected in jackson-databind-2.13.1.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/1182/comments
0
2022-03-14T14:47:13Z
2022-03-14T18:37:57Z
https://github.com/opensearch-project/data-prepper/issues/1182
1,168,486,336
1,182
[ "opensearch-project", "data-prepper" ]
null
Update the Release process to include deploying Maven artifacts to the artifact page
https://api.github.com/repos/opensearch-project/data-prepper/issues/1180/comments
0
2022-03-11T18:50:23Z
2022-03-14T19:15:36Z
https://github.com/opensearch-project/data-prepper/issues/1180
1,166,747,880
1,180
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** By default, OpenTelemetry Collector compresses its request body (and sets the Content-Encoding header accordingly). Otel-trace-source does not support this (it responds with `INTERNAL: Invalid protobuf byte sequence`), so compression needs to be turned off in the OpenTelemetry Collector configuration. **Describe the solution you'd like** Data Prepper should accept the Content-Encoding header and adjust to it. **Describe alternatives you've considered (Optional)** Alternatively, it can let users configure which compression they intend to use; however, that requires coordinated changes on both Collector and Prepper, so following HTTP header guidance is preferable. **Additional context** Found while reporting #1152
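Honoring the header on the server side amounts to dispatching on Content-Encoding before the payload is parsed; a minimal sketch of the idea (Python, not the actual Armeria/gRPC wiring Data Prepper would use):

```python
import gzip
import zlib

def decode_body(body, content_encoding):
    """Decompress an HTTP request body according to its Content-Encoding."""
    if content_encoding in (None, "identity"):
        return body
    if content_encoding == "gzip":
        return gzip.decompress(body)
    if content_encoding == "deflate":
        return zlib.decompress(body)
    raise ValueError(f"unsupported Content-Encoding: {content_encoding}")

payload = b'{"resourceSpans": []}'
roundtrip = decode_body(gzip.compress(payload), "gzip")
```

A real server would additionally advertise the supported codings and return 415 for unknown ones; gRPC transports negotiate this separately via the `grpc-encoding` header.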
Support compression in OTel sources (trace, logs, metrics)
https://api.github.com/repos/opensearch-project/data-prepper/issues/1176/comments
6
2022-03-11T06:29:07Z
2023-05-17T16:24:34Z
https://github.com/opensearch-project/data-prepper/issues/1176
1,166,053,113
1,176
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper's existing Trace Analytics source and processors operate on `ExportTraceServiceRequest` records. Each of these `ExportTraceServiceRequest` objects contains multiple spans. With the new Event model, the Trace Analytics source and processors should operate on spans instead. The current work to migrate to spans changes the way that the buffer behaves. With this change, the buffer will need to hold many more records. Each record is smaller, so the overall buffer size in MB should be the same, but the buffer count does need to change. If Data Prepper releases with the new Event model, then existing pipelines will experience a performance degradation which we have measured to be about half the throughput. Pipeline authors must reconfigure their pipeline to get back the original throughput. Because this is somewhat of a breaking change, Data Prepper should have a migration path toward using the new model. **Describe the solution you'd like** Provide a migration path for existing Data Prepper pipelines to move to the new model.
* Add a new `otel_trace_raw` processor which operates on the new Event model.
* Keep `otel_trace_raw_prepper` on the old model.
* Add a `record_type` field to `otel_trace_source` to allow it to send out Events instead of the old model.

Here is a conceptual pipeline (shortened for brevity):
```
otel-trace-pipeline:
  source:
    otel_trace_source:
      record_type: event
  buffer:
    bounded_blocking:
      # Original recommendation of 512, multiplied by 20
      buffer_size: 10240
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "otel-trace-pipeline"
  buffer:
    bounded_blocking:
      # Configure the same value as in otel-trace-pipeline
      buffer_size: 10240
  processor:
    # This used to be the otel_trace_raw_prepper.
    - otel_trace_raw:
  sink: ...
```
**Describe alternatives you've considered (Optional)** 1) Require pipeline authors to update their pipeline. This would not be a good experience and I don't think we should take this approach. 2) Force the migration in Data Prepper 2.0. Pipeline authors would need to make changes to support Data Prepper 2.0. I think having a migration path is more ideal. **Additional context** Related issue: #546 Existing work: https://github.com/opensearch-project/data-prepper/tree/maint/546-migrate-trace-analytics-to-event-model
Provide a migration path to using Events for Trace data
https://api.github.com/repos/opensearch-project/data-prepper/issues/1158/comments
3
2022-03-08T20:10:46Z
2022-04-07T14:31:56Z
https://github.com/opensearch-project/data-prepper/issues/1158
1,163,072,296
1,158
[ "opensearch-project", "data-prepper" ]
**Describe the bug** OTel Trace Source [claims](https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/otel-trace-source#otel-trace-source) to support OTLP Protocol using both OTLP/grpc and OTLP/HTTP. I am not able to make the latter work. (It's a limitation of my runtime environment that I can only use HTTP for now.) **To Reproduce**
1. Use `jaeger-hotrod` example from the repository
2. Edit `otel-collector-config.yml`:
   1. Change exporter from GRPC to HTTP: `otlp` to `otlphttp` on lines 7 and 16
   2. Add HTTP protocol to the endpoint: `data-prepper:21890` to `http://data-prepper:21890` on line 8
3. Run docker compose: `docker-compose up -d --build`
4. Go to http://localhost:8080/
5. Click a few buttons
6. Go to http://localhost:5601/app/observability-dashboards#/trace_analytics/home
7. See that no traces are coming through
8. Run `docker logs otel-collector`
9. See 404 errors:
```
2022-03-08T07:24:35.632Z info exporterhelper/queued_retry.go:314 Exporting failed. Will retry the request after interval. {"kind": "exporter", "name": "otlphttp/2", "error": "error exporting items, request to http://data-prepper:21890/v1/traces responded with HTTP Status Code 404", "interval": "17.391133947s"}
```
10. (Optional) clean-up: `docker-compose rm -s -v`

**Expected behavior** I expected the messages to come through, just as they do when using unmodified `otel-collector-config.yml`. **Environment (please complete the following information):**
- OS: Ubuntu 20.04.4 LTS
- Version: 1.2.1-linux-64

**Additional context** I can't see any implementation for OTLP/HTTP in https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/otel-trace-source/src/main/java/com/amazon/dataprepper/plugins/source/oteltrace
[BUG] otel-trace-source is not accepting traces in OTLP format via HTTP
https://api.github.com/repos/opensearch-project/data-prepper/issues/1152/comments
6
2022-03-08T08:08:27Z
2022-03-10T17:59:45Z
https://github.com/opensearch-project/data-prepper/issues/1152
1,162,345,425
1,152
[ "opensearch-project", "data-prepper" ]
Create a Data Prepper release job which creates artifacts:
* Archive files (tar.gz)
* Docker image

Part of #977.
Create Data Prepper build job to create artifacts
https://api.github.com/repos/opensearch-project/data-prepper/issues/1149/comments
0
2022-03-08T02:00:09Z
2022-03-09T14:53:42Z
https://github.com/opensearch-project/data-prepper/issues/1149
1,162,129,361
1,149
[ "opensearch-project", "data-prepper" ]
Data Prepper 1.3 is adding a number of new processors that need to be performance tested. These processors include:
* key-value processor
* date processor
* aggregate processor
* mutate-event processors
* mutate-string processors
Performance Testing for Data Prepper 1.3
https://api.github.com/repos/opensearch-project/data-prepper/issues/1148/comments
0
2022-03-08T00:21:54Z
2022-03-21T19:56:03Z
https://github.com/opensearch-project/data-prepper/issues/1148
1,162,075,177
1,148
[ "opensearch-project", "data-prepper" ]
As part of the v2.0 release, `mapping types` are being removed from the OpenSearch engine. Below are the changes in the opensearch-engine:
- Removal of type from rest end-points
- Removal of the include_type_name parameter from API requests

As part of this issue, please verify whether the type removal change on the OpenSearch engine impacts this repository. If yes, then please remove the type references/usage from this plugin. Related: [OpenSearch-engine meta issue](https://github.com/opensearch-project/opensearch/issues/1940)
Remove mapping types
https://api.github.com/repos/opensearch-project/data-prepper/issues/1139/comments
2
2022-03-02T18:57:52Z
2022-04-13T20:15:02Z
https://github.com/opensearch-project/data-prepper/issues/1139
1,157,572,739
1,139
[ "opensearch-project", "data-prepper" ]
Currently it does not respect the SRP (Single Responsibility Principle). Refactor it to be more sustainable as more and more processors get added.
Refactor MutateMapper to follow better coding principles
https://api.github.com/repos/opensearch-project/data-prepper/issues/1138/comments
0
2022-03-02T17:21:05Z
2022-03-16T15:58:28Z
https://github.com/opensearch-project/data-prepper/issues/1138
1,157,481,310
1,138
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Today, if a JsonPointer resolves to a null value, an exception is thrown. Using [Data Prepper Expression Syntax](https://github.com/opensearch-project/data-prepper/issues/1005), null values cannot be hard-coded. As of Data Prepper 1.3.0, no operators are null safe. **Describe the solution you'd like** Add support for a `null` keyword. Example:
```
/status_code in {null, 400, 404}
```
**Describe alternatives you've considered (Optional)** Handling null values with exceptions. **Additional context** n/a
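The requested behavior can be sketched in Python, where `None` stands in for the proposed `null` keyword and a missing field resolves to null instead of throwing. The helper names and the simplified pointer resolution below are illustrative, not Data Prepper's actual implementation:

```python
def resolve(event, pointer):
    """Resolve a simplified JSON Pointer; a missing path yields None (null)."""
    node = event
    for part in pointer.strip("/").split("/"):
        if not isinstance(node, dict) or part not in node:
            return None
        node = node[part]
    return node

def in_set(event, pointer, candidates):
    # With None in the candidate set, absent fields match the `null` keyword
    return resolve(event, pointer) in candidates
```

With this sketch, `/status_code in {null, 400, 404}` matches both an event whose status code is 404 and an event with no status code at all.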
Add null support to DataPrepperExpressions
https://api.github.com/repos/opensearch-project/data-prepper/issues/1136/comments
2
2022-03-02T16:49:09Z
2022-11-14T21:27:13Z
https://github.com/opensearch-project/data-prepper/issues/1136
1,157,443,804
1,136
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Some services call external services which are not annotated. Data Prepper users still want to see these external services in the service map. **Describe the solution you'd like** Include external service calls in the service map. This issue still needs additional analysis for the solution. **Additional context** Related issue in the OpenSearch Observability plugin: https://github.com/opensearch-project/dashboards-observability/issues/90 Also, see #628 for a related Data Prepper issue which is somewhat the opposite problem.
Support external services in the service map
https://api.github.com/repos/opensearch-project/data-prepper/issues/1131/comments
2
2022-03-01T16:54:57Z
2024-11-05T20:52:31Z
https://github.com/opensearch-project/data-prepper/issues/1131
1,155,574,974
1,131
[ "opensearch-project", "data-prepper" ]
Update the `aggregate` Processor with a configuration which allows a pipeline author to conclude a group before the group duration ends based on a condition. The new property key will be `conclude_when` and it will take a Data Prepper Expression string value. Example:
```
aggregate:
  conclude_when: "/event/type == 'CLOSED'"
```
See the *Conclusion Conditions* section of #699 for details on the behavior.
Add the `conclude_when` feature to conclude a group when a conditional is met
https://api.github.com/repos/opensearch-project/data-prepper/issues/1130/comments
1
2022-03-01T16:37:05Z
2022-09-16T14:40:31Z
https://github.com/opensearch-project/data-prepper/issues/1130
1,155,556,559
1,130
[ "opensearch-project", "data-prepper" ]
Data Prepper only distributes .tar.gz archive files to keep the distribution list streamlined for users. Update the Gradle task(s) which build and upload archive files so that it no longer produces .zip files.
Update Gradle project to produce only tar.gz archives
https://api.github.com/repos/opensearch-project/data-prepper/issues/1129/comments
1
2022-03-01T16:16:13Z
2022-03-02T16:45:21Z
https://github.com/opensearch-project/data-prepper/issues/1129
1,155,533,950
1,129
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper Expressions today have limited support for precise floating-point operations. Examples of expressions that give unexpected results:
```
2000000000 == 2000000000.1
0.1 == 0.10000000000001
```
**Describe the solution you'd like**
- Add support for a tolerance value in floating-point operations.
- Add Double support
- Add Long support

**Describe alternatives you've considered (Optional)** n/a **Additional context** n/a
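A tolerance-based comparison, as requested above, can be sketched with a symmetric relative tolerance (the same scheme Python's `math.isclose` uses); the function name and default tolerances are illustrative assumptions:

```python
def approx_equal(a, b, rel_tol=1e-9, abs_tol=0.0):
    # Two numbers are "equal" when their difference is within a tolerance
    # scaled to the larger magnitude (or within an absolute floor).
    return abs(a - b) <= max(rel_tol * max(abs(a), abs(b)), abs_tol)

approx_equal(2_000_000_000, 2_000_000_000.1)  # True under relative tolerance
approx_equal(0.1, 0.10000000000001)           # True
0.1 == 0.10000000000001                       # False with exact ==
```

The two problem expressions from the description both evaluate to true under this comparison while remaining false under exact `==`.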
Enhance Data Prepper Expression Numeric Support
https://api.github.com/repos/opensearch-project/data-prepper/issues/1109/comments
0
2022-02-28T19:17:12Z
2022-04-19T19:10:57Z
https://github.com/opensearch-project/data-prepper/issues/1109
1,154,455,821
1,109
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper currently does not provide any control logic within pipelines. **Describe the solution you'd like** This proposal is to add basic conditional support for identifying what data is dropped by the DropEventsProcessor. A new `drop_when` configuration setting can be added following the [Data Prepper Expression Syntax](https://github.com/opensearch-project/data-prepper/issues/1005) to specify which events will be dropped and which continue to the next processor. The `drop_when` expression will evaluate once for each event. If the expression evaluates to *true*, the event will be dropped. If an exception occurs while evaluating an event, by default a warning will be logged and the event will not be dropped. Alternative behaviors can be configured with a new `handle_failed_events` configuration setting. The following options are supported:

| Setting | Description | Data Prepper Version |
|---------------|------------------------------------------------------------|----------------------|
| skip | The event *will not* be dropped. A warning will be logged | 1.3.0 |
| skip_silently | The event *will not* be dropped. No warning will be logged | 1.3.0 |
| drop | The event *will* be dropped. A warning will be logged | 1.3.0 |
| drop_silently | The event *will* be dropped. No warning will be logged | 1.3.0 |

## Sample Pipeline Configuration
```
drop-pipeline:
  source:
    file:
      path: "/full/path/to/logs_json.log"
      record_type: "event"
      format: "json"
  processor:
    - drop_events:
        drop_when: '/status_code < 400'
        handle_failed_events: skip_silently
  sink:
    - stdout:
```
**Describe alternatives you've considered (Optional)** Additional support for conditionals will be added by [#1007](https://github.com/opensearch-project/data-prepper/issues/1007) **Additional context**
[Data Prepper Expression Parser](https://github.com/opensearch-project/data-prepper/issues/976)
[Data Prepper Expression Evaluator](https://github.com/opensearch-project/data-prepper/issues/1003)
[Data Prepper Expression Syntax](https://github.com/opensearch-project/data-prepper/issues/1005)
[Data Prepper Conditionals](https://github.com/opensearch-project/data-prepper/issues/522)
[Routers](https://github.com/opensearch-project/data-prepper/issues/1007)
Duplicate of closed issue [#1063](https://github.com/opensearch-project/data-prepper/issues/1063)
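The `drop_when` / `handle_failed_events` semantics in the table can be sketched as follows; the function name and the callable-as-condition are illustrative, not the actual Data Prepper code:

```python
import logging

def should_drop(event, drop_when, handle_failed_events="skip"):
    """Return True when the event should be dropped.

    On an evaluation failure, the four handle_failed_events modes decide
    whether the event is kept or dropped and whether a warning is logged.
    """
    try:
        return bool(drop_when(event))
    except Exception:
        if handle_failed_events in ("skip", "drop"):
            logging.warning("drop_when failed to evaluate for event %s", event)
        return handle_failed_events in ("drop", "drop_silently")

condition = lambda e: e["status_code"] < 400
should_drop({"status_code": 200}, condition)  # True -> dropped
should_drop({"status_code": 500}, condition)  # False -> kept
should_drop({}, condition)                    # raises KeyError; "skip" keeps it
```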
Add conditional to DropEventsProcessor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1105/comments
3
2022-02-28T17:34:32Z
2022-03-16T16:50:58Z
https://github.com/opensearch-project/data-prepper/issues/1105
1,154,367,621
1,105
[ "opensearch-project", "data-prepper" ]
Run:
```
./gradlew generateThirdPartyReport
```
Commit `THIRD-PARTY`.
Update THIRD-PARTY file for Data Prepper 1.3
https://api.github.com/repos/opensearch-project/data-prepper/issues/1104/comments
0
2022-02-28T16:59:48Z
2022-03-01T16:10:21Z
https://github.com/opensearch-project/data-prepper/issues/1104
1,154,330,262
1,104
[ "opensearch-project", "data-prepper" ]
Create Data Prepper 1.3 Release Notes. All changes should be available at: https://github.com/opensearch-project/data-prepper/milestone/2?closed=1 The release notes should provide a high-level overview of the changes we made to Data Prepper. They should include at least:
* New Features
* Enhancements
* Bug Fixes
* Migrations - Any migrations customers should make
* Breaking Changes - Any breaking changes, even if minor.
Create Data Prepper 1.3.0 Release Notes
https://api.github.com/repos/opensearch-project/data-prepper/issues/1103/comments
0
2022-02-28T16:58:00Z
2022-03-16T16:08:02Z
https://github.com/opensearch-project/data-prepper/issues/1103
1,154,328,523
1,103
[ "opensearch-project", "data-prepper" ]
Update Data Prepper documentation at: https://github.com/opensearch-project/documentation-website We will have a staging branch in the above repository: `data-prepper-1.3` where we can make PRs for upcoming changes.
Update Data Prepper documentation on OpenSearch.org for 1.3.
https://api.github.com/repos/opensearch-project/data-prepper/issues/1102/comments
2
2022-02-28T16:49:31Z
2022-03-23T18:01:42Z
https://github.com/opensearch-project/data-prepper/issues/1102
1,154,320,460
1,102
[ "opensearch-project", "data-prepper" ]
Provide an SNS Source which can subscribe to an SNS topic (or use an existing subscription) to receive SNS notifications over HTTP/S. This will be similar to the existing HTTP source plugin. However, it should also support auto-confirmation of subscription requests: when a request arrives for an expected topic name, visit the `SubscribeURL` per the SNS protocol. This can support auto-subscription either by a list of topics or using a topic name pattern. Provide two options for topic auto-confirmation:
* `topics` - A list of topics to auto-confirm
* `topics_regex` - A list of regex patterns. Any topic matching will auto-confirm.

Provide configuration options just like the `http` source:
* `port`
* `path`
* `request_timeout`
* `thread_count`
* `max_connection_count`
* `max_pending_requests`
* `ssl`
* `ssl_certificate_file`
* `ssl_key_file`
* `use_acm_certificate_for_ssl`
* `acm_certificate_arn`
* `acm_private_key_password`
* `acm_certificate_timeout_millis`
* `aws_region`
Support SNS HTTPS as a Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1083/comments
1
2022-02-22T17:49:20Z
2024-10-04T15:04:44Z
https://github.com/opensearch-project/data-prepper/issues/1083
1,147,202,755
1,083
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Some pipeline authors want to retrieve events from Amazon Kinesis Data Streams. **Describe the solution you'd like** Create a `kinesis_data_streams` source plugin. The [Kinesis Client Library](https://docs.aws.amazon.com/streams/latest/dev/shared-throughput-kcl-consumers.html) (KCL) can manage much of the client needs. So I propose that the Data Prepper source use KCL for reading from Kinesis. KCL uses DynamoDB to coordinate consumers. Because KCL uses DynamoDB and Kinesis presumes an AWS account anyway, I propose that Data Prepper uses DynamoDB for consumer coordination. Data Prepper should support configuring the AWS resources and access to the AWS resources that KCL needs, and also configuring the Kinesis stream name. Example configuration:
```
source:
  kinesis_data_streams:
    stream_name: MyStream
    coordination_table_name: MyDynamoDbTable
```
**Additional context** https://javadoc.io/doc/software.amazon.kinesis/amazon-kinesis-client/latest/index.html
Support AWS Kinesis Data Streams as a Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1082/comments
4
2022-02-22T17:49:02Z
2024-09-19T22:23:04Z
https://github.com/opensearch-project/data-prepper/issues/1082
1,147,202,513
1,082
[ "opensearch-project", "data-prepper" ]
Some Events will have a single line from a comma-separated value (CSV) or tab-separated value (TSV) file. Data Prepper should be able to parse an individual CSV line and add fields to the Event for that CSV line. Pipeline authors should have the option to have the each CSV/TSV line output either: * Specific key names from column indices (e.g. "column 2 maps to status code") * Output the columns into an array based on the index - [x] #1613 - [x] #1614 - [x] #1615 - [x] #1616 - [x] #1617 - [x] #1619
Parse CSV or TSV content in Events
https://api.github.com/repos/opensearch-project/data-prepper/issues/1081/comments
9
2022-02-22T17:16:06Z
2022-09-13T16:54:17Z
https://github.com/opensearch-project/data-prepper/issues/1081
1,147,169,832
1,081
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Follow-up to #854, to include different default `index` values for `elasticsearch`, `OpenSearch` and `amazon_es` output plugins. There might be other plugins needing this feature where Logstash default is different from the Data Prepper default. **Describe the solution you'd like** Add `defaultSettings` property to the Logstash Mapping YAML files setting the default Data Prepper attribute values. **Describe alternatives you've considered (Optional)** Add multiple Mapper classes depending on the number of default values with a common Base class. Potential redundant logic. **Additional context** N/A
LogstashConfigConverter: Support default values for attributes in mapping files
https://api.github.com/repos/opensearch-project/data-prepper/issues/1080/comments
0
2022-02-22T16:24:13Z
2022-03-03T22:34:24Z
https://github.com/opensearch-project/data-prepper/issues/1080
1,147,115,829
1,080
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Many processors have configuration options available for specifying durations (for example, the Aggregate Processor `group_duration` and the Service Map `window_duration`). This can lead to a worse user experience, as the user does not immediately know the unit of the duration, and they have to calculate based on that unit depending on the value they want to provide. **Describe the solution you'd like** A standard approach to Duration configurations in Data Prepper. This would include support for two separate types of Duration strings:
1. Simple strings to specify Durations in milliseconds, seconds, minutes, hours, or days. This will be easier for some to use as it is simple to read and easier to create without knowledge of ISO 8601. Here are some examples of the units we will support with this approach:
```
"100ms" // 100 milliseconds
"60s"   // 60 seconds
"15min" // 15 minutes
"2hr"   // 2 hours
"3day"  // 3 days
```
2. ISO 8601 notation (Ex: `P3DT4H59M` or `PT2H30M`)

The validation and building of these durations should be done within `data-prepper-core`/`data-prepper-api`, if possible. For example, a configuration for `group_duration` could look like the following:
```
@ValidDurationString
private String durationString;
```
and then build the actual duration with an API call of:
```
Duration duration = getDurationFromString(durationString);
```
**Describe alternatives you've considered (Optional)** Keep things the way they are now and only support one unit of time for configurations, while mentioning the unit they use (ex: `group_duration_seconds`).
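Parsing the simple-unit strings listed above can be sketched as a small table-driven parser; the function name, unit table, and error handling are illustrative, not the proposed Data Prepper API (which would presumably build a Java `Duration`):

```python
import re

# Multipliers from each supported unit suffix to milliseconds
UNIT_MILLIS = {"ms": 1, "s": 1000, "min": 60_000, "hr": 3_600_000, "day": 86_400_000}

def duration_millis(text):
    """Parse strings like "100ms", "60s", "15min", "2hr", "3day" into milliseconds."""
    match = re.fullmatch(r"(\d+)(ms|s|min|hr|day)", text.strip())
    if not match:
        raise ValueError(f"invalid duration: {text!r}")
    value, unit = match.groups()
    return int(value) * UNIT_MILLIS[unit]

duration_millis("100ms")  # 100
duration_millis("2hr")    # 7200000
```

A central helper like this, shared by all processors, is what keeps each plugin from inventing its own unit convention.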
Standard Duration configuration for Data Prepper
https://api.github.com/repos/opensearch-project/data-prepper/issues/1079/comments
7
2022-02-21T21:13:40Z
2022-02-25T18:13:54Z
https://github.com/opensearch-project/data-prepper/issues/1079
1,146,231,769
1,079
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** The Aggregate Processor stores state in memory, and if the available memory is used up, it will throw an Out of Memory error and Data Prepper will crash. **Describe the solution you'd like** In order to avoid the Out of Memory error, the AggregateProcessor should make space by concluding groups whenever a certain percentage of memory is used. Whenever a Data Prepper instance reaches a threshold of using 90% or more of its available heap memory, the Aggregate Processor should conclude the `N` oldest groups until memory use is brought back below 70%. **Describe alternatives you've considered (Optional)**
* Store state on disk - instead of just concluding groups, the better long-term solution will be to buffer the state of the Aggregate Processor to disk. This is something that will be thought about holistically for Data Prepper in the future
* Throttle the ingestion of data - currently, Data Prepper does not support throttling (in the case of either a full buffer or of no available memory)
* Make the concluding strategy configurable - choosing the groups to conclude when memory is nearing full could be configurable. For example, concluding the oldest groups could be set by the following config:
```yaml
processor:
  - aggregate:
      out_of_memory_strategy: conclude_oldest
```
Another strategy with this approach could be to not allow the creation of new groups.

**Additional context** The Service Map processor also has this issue of throwing an OOM error when no space is available
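The high-water/low-water eviction described above can be sketched with an insertion-ordered map, where insertion order stands in for group age. The 90%/70% thresholds come from the description; the function names and the callable heap gauge are assumptions, not Data Prepper APIs:

```python
from collections import OrderedDict

HIGH_WATER, LOW_WATER = 0.90, 0.70  # thresholds from the proposal above

def relieve_memory_pressure(groups, heap_used_fraction, conclude):
    """Conclude the oldest groups once heap use crosses the high-water mark.

    groups: OrderedDict keyed by group id; insertion order == age.
    heap_used_fraction: callable returning current heap use in [0, 1].
    conclude: callback invoked with each concluded group.
    """
    if heap_used_fraction() < HIGH_WATER:
        return
    while heap_used_fraction() > LOW_WATER and groups:
        _, oldest = groups.popitem(last=False)  # FIFO: oldest group first
        conclude(oldest)
```

Evicting down to a lower watermark, rather than just below the trigger, avoids thrashing where every new group immediately re-triggers eviction.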
Handling Out of Memory errors in the Aggregate Processor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1078/comments
1
2022-02-21T20:14:33Z
2022-04-19T19:11:08Z
https://github.com/opensearch-project/data-prepper/issues/1078
1,146,191,758
1,078
[ "opensearch-project", "data-prepper" ]
Tried to see the otel-v1-apm-span-* indexes created by Data Prepper in OpenSearch Dashboards -> Observability -> Event analytics, but a simple request like ```source = otel-v1-apm-span-000001 | fields name``` returns ```Index does not contain a valid time field```. A similar request for indexes created by Jaeger, like ```source = jaeger-span-2022-02-21 | fields operationName```, shows results - so why? I guess the reason is the ```startTime``` field format - in ```jaeger-span-*``` indexes it looks like an integer (UNIX epoch?), but in ```otel-v1-apm-span-*``` indexes it looks like "2022-02-21T06:04:04.898483Z". Another possible issue is field names like ```"span.attributes.ci@pipeline@run@url"``` in ```otel-v1-apm-span-*``` indexes, which look less usable than similar data in ```jaeger-span-*``` indexes.
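The suspected mismatch - an ISO-8601 string in `otel-v1-apm-span-*` versus an epoch integer in `jaeger-span-*` - can be illustrated by converting one representation into the other; the helper name is ours, not part of either index mapping:

```python
from datetime import datetime, timezone

def iso_to_epoch_micros(iso):
    """Convert an ISO-8601 UTC timestamp string to epoch microseconds."""
    # datetime.fromisoformat only accepts a trailing "Z" from Python 3.11 on,
    # so rewrite it as an explicit offset first
    parsed = datetime.fromisoformat(iso.replace("Z", "+00:00"))
    return int(parsed.timestamp()) * 1_000_000 + parsed.microsecond

iso_to_epoch_micros("2022-02-21T06:04:04.898483Z")
```

Whether a reformatting like this belongs in the index template, the ingest path, or the PPL engine is exactly what this bug report leaves open.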
[BUG] Indexes otel-v1-apm-span-* are not suitable for Event analytics
https://api.github.com/repos/opensearch-project/data-prepper/issues/1077/comments
1
2022-02-21T06:21:36Z
2022-02-23T22:48:46Z
https://github.com/opensearch-project/data-prepper/issues/1077
1,145,373,649
1,077
[ "opensearch-project", "data-prepper" ]
Data Prepper can't run until OpenSearch is ready:
```
data-prepper | 2022-02-19T13:26:13,300 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initializing OpenSearch sink
data-prepper | 2022-02-19T13:26:13,785 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the trust all strategy
data-prepper | 2022-02-19T13:26:14,274 [main] ERROR com.amazon.dataprepper.plugin.PluginCreator - Encountered exception while instantiating the plugin OpenSearchSink
data-prepper | java.lang.reflect.InvocationTargetException: null
data-prepper | Caused by: java.lang.RuntimeException: Connection refused
data-prepper | Caused by: java.net.ConnectException: Connection refused
data-prepper | Caused by: java.net.ConnectException: Connection refused
data-prepper | 2022-02-19T13:26:14,304 [main] ERROR com.amazon.dataprepper.parser.PipelineParser - Construction of pipeline components failed, skipping building of pipeline [service-map-pipeline] and its connected pipelines
data-prepper | com.amazon.dataprepper.model.plugin.PluginInvocationException: Exception throw from the plugin'OpenSearchSink'.
data-prepper | Caused by: java.lang.reflect.InvocationTargetException
data-prepper | Caused by: java.lang.RuntimeException: Connection refused
data-prepper | Caused by: java.net.ConnectException: Connection refused
data-prepper | Caused by: java.net.ConnectException: Connection refused
data-prepper | 2022-02-19T13:26:14,314 [main] ERROR com.amazon.dataprepper.DataPrepper - No valid pipeline is available for execution, exiting
```
Why not wait, as Dashboards does?
[BUG] Reconnect on opensearch connection refused
https://api.github.com/repos/opensearch-project/data-prepper/issues/1076/comments
1
2022-02-19T14:02:40Z
2022-02-21T14:58:11Z
https://github.com/opensearch-project/data-prepper/issues/1076
1,144,718,890
1,076
[ "opensearch-project", "data-prepper" ]
It looks like there is no way to transmit OTEL traces to a simple output like stdout or a file. With a configuration like this:
```
entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  sink:
    - file:
        path: /data/data-prepper/prepper.json
```
I see an error:
```
data-prepper | 2022-02-19T08:39:16,313 [entry-pipeline-prepper-worker-1-thread-1] INFO com.amazon.dataprepper.pipeline.ProcessWorker - entry-pipeline Worker: Processing 1 records from buffer
data-prepper | 2022-02-19T08:39:16,336 [entry-pipeline-sink-worker-2-thread-1] ERROR com.amazon.dataprepper.pipeline.common.PipelineThreadPoolExecutor - Pipeline [entry-pipeline] process worker encountered a fatal exception, cannot proceed further
data-prepper | java.util.concurrent.ExecutionException: java.lang.ClassCastException: class io.opentelemetry.proto.collector.trace.v1.ExportTraceServiceRequest cannot be cast to class java.lang.String (io.opentelemetry.proto.collector.trace.v1.ExportTraceServiceRequest is in unnamed module of loader 'app'; java.lang.String is in module java.base of loader 'bootstrap')
```
With:
```
entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  sink:
    - stdout:
```
the error is even more helpful:
```
data-prepper | 2022-02-19T08:46:06,315 [entry-pipeline-prepper-worker-1-thread-1] INFO com.amazon.dataprepper.pipeline.ProcessWorker - entry-pipeline Worker: Processing 1 records from buffer
data-prepper | 2022-02-19T08:46:06,323 [entry-pipeline-sink-worker-2-thread-1] ERROR com.amazon.dataprepper.pipeline.common.PipelineThreadPoolExecutor - Pipeline [entry-pipeline] process worker encountered a fatal exception, cannot proceed further
data-prepper | java.util.concurrent.ExecutionException: java.lang.RuntimeException: Invalid record type. StdOutSink only supports String and Events
```
So, maybe it is possible to dump traces as JSON?
[BUG] Can't dump OTEL traces to stdout or file
https://api.github.com/repos/opensearch-project/data-prepper/issues/1075/comments
1
2022-02-19T08:49:37Z
2022-03-15T14:44:38Z
https://github.com/opensearch-project/data-prepper/issues/1075
1,144,617,113
1,075
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Building plugins requires defining components of a plugin (core feature, Logstash mapping) in multiple parts of the code base. Users build their plugins in their specific plugin [module](https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins) in the data-prepper-plugins package. Here they implement the interface (source, prepper, sink) for the plugin type they are developing and define the configuration. To support Logstash conversion, plugin authors must create separate classes in a separate [module](https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-logstash-configuration/src/main/java/org/opensearch/dataprepper/logstash/mapping) in the data-prepper-logstash-configuration package. This current approach scatters plugin concepts across the code base. **Describe the solution you'd like** Ideally, as a plugin author I would like to define my mapping and implementation details in the same package. **Describe alternatives you've considered (Optional)** I am open to alternatives... **Additional context** I think this is an important feature to consider as we build support for the community to bring their own plugins.
Centralizing Plugin Conversion with Plugin Code
https://api.github.com/repos/opensearch-project/data-prepper/issues/1066/comments
3
2022-02-18T20:56:30Z
2022-04-19T22:21:51Z
https://github.com/opensearch-project/data-prepper/issues/1066
1,143,762,585
1,066
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** The new `drop` processor drops all Events, but Data Prepper currently doesn't support any form of conditionals. Thus, the `drop` processor probably has little value in 1.3.0. **Describe the solution you'd like** The upcoming #1007 Conditional Routing feature is targeted for 1.4.0 and may only cover Sinks then. However, expression parsing ( #1003 ) will be available in 1.3.0. The `drop` processor should have its own custom conditional support for dropping events. I propose adding a `drop_when` condition that only drops events meeting a given condition. For example, to drop all successful HTTP responses:
```
drop:
  drop_when: "/status_code < 400"
```
**Describe alternatives you've considered (Optional)** Using the #522 conditional logic would be reasonable. However, Data Prepper efforts are being re-oriented toward Conditional Routing of sinks. And we may reconsider the approach taken for processors.
Drop Events only when certain conditions are met
https://api.github.com/repos/opensearch-project/data-prepper/issues/1063/comments
2
2022-02-18T17:36:04Z
2022-02-28T17:57:36Z
https://github.com/opensearch-project/data-prepper/issues/1063
1,143,464,983
1,063
[ "opensearch-project", "data-prepper" ]
## Description JCenter has been turned off and should be removed as a referenced repository as soon as possible to prevent interruptions of builds. Please remove any direct dependency on `jcenter()` within a `repositories` block in Gradle files. Related: https://github.com/opensearch-project/opensearch-build/issues/1456
Remove jcenter repository
https://api.github.com/repos/opensearch-project/data-prepper/issues/1062/comments
1
2022-02-18T17:33:18Z
2022-02-18T17:35:06Z
https://github.com/opensearch-project/data-prepper/issues/1062
1,143,460,798
1,062
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** As a user that wants to run performance tests on my pipeline configuration, setting up performance testing is a tedious manual process. As a developer that wants to test the performance of a new plugin that I have created, setting up and running the performance tests is a tedious manual process. **Describe the solution you'd like**
* Automate the deployment of the performance testing environment (Data Prepper installation, Gatling, Prometheus, Grafana) so that anyone can deploy a separate instance of this environment with a couple of simple commands (this means multiple users could run their performance tests independently and in parallel)
* After the environment is deployed, make it easy to run the performance test suites with different Data Prepper configurations. Ideally, this should be as simple as editing the configuration and then clicking "Run Tests".
* Running performance tests should only cost resources when tests are actually being run (just deploying the environment doesn't cost money, and billing stops once a test finishes)
Automate Performance Testing
https://api.github.com/repos/opensearch-project/data-prepper/issues/1058/comments
0
2022-02-17T20:22:23Z
2022-04-19T19:11:28Z
https://github.com/opensearch-project/data-prepper/issues/1058
1,141,827,501
1,058
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** For filters other than `mutate`, they have the option to have `add_field` and `remove_field`. Currently, these two are not being converted into their respective `add_entries` and `delete_entries` processors. **Describe the solution you'd like** Add a conversion of the common options `add_field` and `remove_field` to the Logstash converter.
Convert Add/Remove Fields in filters other than Mutate
https://api.github.com/repos/opensearch-project/data-prepper/issues/1057/comments
0
2022-02-17T18:55:07Z
2022-06-08T19:09:33Z
https://github.com/opensearch-project/data-prepper/issues/1057
1,141,729,308
1,057
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
We are trying to use the Data Prepper OpenSearch sink with a minimal set of permissions, i.e. index permissions only. The destination index is managed by another process. Currently the sink attempts an operation that requires `admin/get`, a permission we don't want to assign to our indexing user.

**Describe the solution you'd like**
A config option to disable any template or ISM management.

**Describe alternatives you've considered (Optional)**
We have considered granting the user more permissions, but this is not something we want to do for our use case.
[Request] Opensearch sink should have option to disable index management
https://api.github.com/repos/opensearch-project/data-prepper/issues/1051/comments
5
2022-02-17T03:12:50Z
2022-06-23T12:47:05Z
https://github.com/opensearch-project/data-prepper/issues/1051
1,140,823,929
1,051
[ "opensearch-project", "data-prepper" ]
```
log-pipeline:
  source:
    sqs:
      queues:
        - url: "https://sqs.us-east-1.amazonaws.com/myQueue1"
          polling_frequency: 5m
          batch_size: 10
          number_of_threads: 2
        - url: "https://sqs.us-east-1.amazonaws.com/myQueue2"
          polling_frequency: 1m
          batch_size: 10
          number_of_threads: 3
```
The data from the queue will be sent to the buffer as:
```
{
  "message" : "{messageBody}"
}
```
It should also include queue metadata:
```
{
  "queueUrl" : "https://sqs.us-east-1.amazonaws.com/myQueue1",
  "sentTimestamp" : 1720679777
}
```
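A minimal sketch of the body-to-event mapping described above (Python for illustration only; any field beyond `message`, `queueUrl`, and `sentTimestamp` would be an assumption, so none are added):

```python
import time

def sqs_message_to_event(queue_url, message_body, sent_timestamp=None):
    """Wrap an SQS message body into the event shape described above.

    Sketch only -- the real source would read SentTimestamp from the
    SQS message attributes; here it falls back to the current time.
    """
    return {
        "message": message_body,
        "queueUrl": queue_url,
        "sentTimestamp": sent_timestamp or int(time.time()),
    }
```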
Support SQS as a Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1049/comments
4
2022-02-16T20:50:44Z
2025-03-11T19:49:51Z
https://github.com/opensearch-project/data-prepper/issues/1049
1,140,573,546
1,049
[ "opensearch-project", "data-prepper" ]
Some pipeline authors would like to save Events to S3. Some teams using OpenSearch for observability have looked for an ability to store all their events (all logs, traces, and metrics) in S3. This can be a more cost-effective storage solution for data that may not be very important.

To fully support this use-case, Data Prepper would need to store the objects in S3 in a form that can later be played back. Users could then run a pipeline which loads the data from S3 using the existing S3 source and sends it into OpenSearch later.

Data Prepper should have a sink which saves Events to S3 as objects. Likely, an object should contain multiple events. The S3 sink should support the following:

* Configurations for the bucket name, key path, and key pattern. The key pattern should support timestamps such as `logs-${YYYY.mm}`.
* The key pattern should support the time at which it was written, using a similar format to the OpenSearch sink's index pattern.
* The sink will collect objects (ideally in a local file to handle faults) before sending them to S3 as a large object.
* Configurations to determine thresholds for writing the S3 objects. These can be any of: 1) how many events; 2) how many bytes; or 3) how long events should be collected before writing the S3 object.
* The ability to encode events using a concept similar to #1532 for sink-based codecs.
* The pipeline author can configure the output codec that they wish to use (e.g. newline, JSON, CSV). This will be a pipeline configuration option on the S3 sink.
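To make the requirements concrete, a hypothetical pipeline configuration for such a sink might look like the following. Every option name below is an assumption for illustration, since the sink does not exist yet:

```yaml
sink:
  - s3:
      bucket: my-log-archive            # hypothetical option names throughout
      key_pattern: logs-${YYYY.mm}/     # timestamped key pattern, per the requirements
      codec: newline                    # pipeline-author-selected output codec
      threshold:                        # any one threshold triggers an object write
        event_count: 10000
        byte_capacity: 50mb
        event_collect_timeout: 60s
```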
Support S3 as a Sink
https://api.github.com/repos/opensearch-project/data-prepper/issues/1048/comments
1
2022-02-16T20:49:12Z
2023-06-05T21:38:35Z
https://github.com/opensearch-project/data-prepper/issues/1048
1,140,572,333
1,048
[ "opensearch-project", "data-prepper" ]
null
Integration testing for the Aggregate Processor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1038/comments
0
2022-02-15T16:07:41Z
2022-02-16T21:40:58Z
https://github.com/opensearch-project/data-prepper/issues/1038
1,138,898,194
1,038
[ "opensearch-project", "data-prepper" ]
[SpotBugs](https://spotbugs.github.io/) is a static analysis tool that finds bugs in Java code; it is the actively maintained successor to [FindBugs](http://findbugs.sourceforge.net/). Adding the SpotBugs analysis to our Gradle build will make it easy to quickly catch issues in the code.
Add SpotBugs to the Data Prepper Gradle Build
https://api.github.com/repos/opensearch-project/data-prepper/issues/1032/comments
1
2022-02-14T21:57:55Z
2022-04-19T20:27:36Z
https://github.com/opensearch-project/data-prepper/issues/1032
1,137,895,374
1,032
[ "opensearch-project", "data-prepper" ]
# Introduction
A Data Prepper pipeline can contain multiple sources, processors, and sinks of the same type. Presently, these cannot be distinguished.

# Proposed Solution
Data Prepper should assign a unique identifier to each pipeline component. The scope of the Id is within the current pipeline. There will be a fully-qualified Id, which is discussed at the end of this issue description. For most of this discussion, the Id is unique only within a single pipeline.

Additionally, pipeline authors may wish to configure some component Ids. This can help them debug their pipelines and make them more readable. The following example shows how a pipeline author can configure the Id using an `id` property:

```
log-pipeline:
  source:
    http:
  prepper:
    - grok:
        id: extract-apache-logs
        match:
          log: [ "%{COMMONAPACHELOG}" ]
  sink:
    - opensearch:
        id: opensearch-a
        hosts: [ "https://opensearch-host-a" ]
    - opensearch:
        id: opensearch-b
        hosts: [ "https://opensearch-host-b" ]
```

Pipeline authors do not need to configure the `id`. Data Prepper will produce a default value.

## Id Generation
The default Id generation should be deterministic. This will allow the peer-forwarder to use the `id` of a component and consistently supply Events to the correct component in a peer node. The default Id generation can be:

```
${pluginType}${incrementedCount > 1 ? incrementedCount : ''}
```

The `incrementedCount` will be a number which is incremented for each component type individually. It can thus be stored in a map: `Map<String, Integer> typeToIncrementedCount`. The count will be incremented before applying the function above, so the first of any given type has `incrementedCount == 1`. This approach allows pipelines without duplicates to continue to use the `pluginType` without a trailing `1`.
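The default Id generation rule above can be modeled with a short sketch (Python for brevity; Data Prepper itself is Java):

```python
def assign_default_ids(plugin_types):
    """Deterministically assign component ids, as described above.

    The first occurrence of a type keeps the bare type name; later
    occurrences of the same type get an incrementing numeric suffix.
    """
    type_to_count = {}
    ids = []
    for plugin_type in plugin_types:
        # Increment the per-type count before applying the naming rule.
        count = type_to_count.get(plugin_type, 0) + 1
        type_to_count[plugin_type] = count
        ids.append(plugin_type if count == 1 else f"{plugin_type}{count}")
    return ids
```

Applied to the components of the "No Configured Ids" example below, this yields `http`, `grok`, `opensearch`, `opensearch2`.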
## Examples

### No Configured Ids
```
log-pipeline:
  source:
    http:
  prepper:
    - grok:
        match:
          log: [ "%{COMMONAPACHELOG}" ]
  sink:
    - opensearch:
        hosts: [ "https://opensearch-host-a" ]
    - opensearch:
        hosts: [ "https://opensearch-host-b" ]
```
The Ids are:
* `http`
* `grok`
* `opensearch`
* `opensearch2`

### Some Configured Ids
```
log-pipeline:
  source:
    http:
  prepper:
    - grok:
        id: extract-apache-logs
        match:
          log: [ "%{COMMONAPACHELOG}" ]
  sink:
    - opensearch:
        id: opensearch-a
        hosts: [ "https://opensearch-host-a" ]
    - opensearch:
        hosts: [ "https://opensearch-host-b" ]
```
The Ids are:
* `http`
* `extract-apache-logs`
* `opensearch-a`
* `opensearch2`

# Alternatives

## Duplicates Always Have Count Suffix
Another approach is to identify any plugin type that has more than one instance in the pipeline. Only those types with multiple instances get a suffix. This can be nice because each plugin of the same type has a more consistent name. The disadvantage is that it may be more complicated to support. Is the improvement to the name really worthwhile here? Pipeline authors who want better names can already control the `id`.

```
log-pipeline:
  source:
    http:
  prepper:
    - grok:
        match:
          log: [ "%{COMMONAPACHELOG}" ]
  sink:
    - opensearch:
        hosts: [ "https://opensearch-host-a" ]
    - opensearch:
        hosts: [ "https://opensearch-host-b" ]
```
The Ids are:
* `http`
* `grok`
* `opensearch1`
* `opensearch2`

## Count Across Components
Data Prepper could increment a universal count. The disadvantage is that even when there is only one processor of a type, it still gets a number appended.

```
log-pipeline:
  source:
    http:
  prepper:
    - grok:
        match:
          log: [ "%{COMMONAPACHELOG}" ]
  sink:
    - opensearch:
        hosts: [ "https://opensearch-host-a" ]
    - opensearch:
        hosts: [ "https://opensearch-host-b" ]
```
The Ids are:
* `http1`
* `grok2`
* `opensearch3`
* `opensearch4`

# Fully Scoped Ids
Data Prepper will generate and validate plugin Ids only within a single pipeline. Additionally, Data Prepper will support fully qualified component Ids.
A fully-qualified plugin Id will be unique across all pipelines. The format will be:
```
{pipelineName}.{pluginId}
```
This format is based on the current convention for plugin metrics. Data Prepper currently defines metrics by:
```
{pipelineName}.{pluginType}.{metricName}
```

# Tasks
- [ ] #3995
- [ ] Allow users to set component Ids
- [ ] Update metrics to include the configured component Ids
Pipeline Component Ids
https://api.github.com/repos/opensearch-project/data-prepper/issues/1025/comments
3
2022-02-10T19:09:40Z
2024-11-19T20:38:09Z
https://github.com/opensearch-project/data-prepper/issues/1025
1,130,715,075
1,025
[ "opensearch-project", "data-prepper" ]
**Describe the bug**
The service-map pipeline continues to suffer from a memory leak, although this was previously tackled by https://github.com/opendistro-for-elasticsearch/data-prepper/pull/671. The issue needs to be revisited.

**To Reproduce**
Steps to reproduce the behavior:
1. Pipeline definition:
```
entry-pipeline:
  source:
    otel_trace_source:
      ssl: false
      unframed_requests: true
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
#        trace_flush_interval: 6
#    - otel_trace_group_prepper:
#        hosts: [ "https://node-0.example.com:9200" ]
#        username: "admin"
#        password: "admin"
  sink:
    - opensearch:
        hosts: [ "https://node-0.example.com:9200" ]
        username: "admin"
        password: "admin"
        trace_analytics_raw: true
service-map-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: ["https://node-0.example.com:9200"]
        username: "admin"
        password: "admin"
        trace_analytics_service_map: true
```
2. Run Data Prepper in Docker with an OpenSearch backend.
3. Run a load generator to send `ExportTraceServiceRequest`s of batch size 20 to otel-trace-source.
4. See example error:
```
2022-02-09T19:21:07,474 [service-map-pipeline-prepper-worker-3-thread-1] ERROR com.amazon.dataprepper.pipeline.common.PipelineThreadPoolExecutor - Pipeline [service-map-pipeline] process worker encountered a fatal exception, cannot proceed further
java.util.concurrent.ExecutionException: java.lang.OutOfMemoryError: Java heap space
    at java.util.concurrent.FutureTask.report(FutureTask.java:122) ~[?:?]
    at java.util.concurrent.FutureTask.get(FutureTask.java:191) ~[?:?]
    at com.amazon.dataprepper.pipeline.common.PipelineThreadPoolExecutor.afterExecute(PipelineThreadPoolExecutor.java:70) [data-prepper.jar:1.2.1]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1131) [?:?]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) [?:?]
    at java.lang.Thread.run(Thread.java:832) [?:?]
Caused by: java.lang.OutOfMemoryError: Java heap space
    at com.fasterxml.jackson.core.io.ContentReference.construct(ContentReference.java:101) ~[data-prepper.jar:1.2.1]
    at com.fasterxml.jackson.core.JsonFactory._createContentReference(JsonFactory.java:1990) ~[data-prepper.jar:1.2.1]
    at com.fasterxml.jackson.core.JsonFactory.createGenerator(JsonFactory.java:1316) ~[data-prepper.jar:1.2.1]
    at com.fasterxml.jackson.databind.ObjectMapper.createGenerator(ObjectMapper.java:1188) ~[data-prepper.jar:1.2.1]
    at com.fasterxml.jackson.databind.ObjectMapper.writeValueAsString(ObjectMapper.java:3822) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.plugins.prepper.ServiceMapStatefulPrepper.lambda$iteratePrepperState$7(ServiceMapStatefulPrepper.java:235) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.plugins.prepper.ServiceMapStatefulPrepper$$Lambda$907/0x000000080118bbe8.accept(Unknown Source) ~[?:?]
    at java.util.Iterator.forEachRemaining(Iterator.java:133) ~[?:?]
    at com.amazon.dataprepper.plugins.prepper.ServiceMapStatefulPrepper.iteratePrepperState(ServiceMapStatefulPrepper.java:207) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.plugins.prepper.ServiceMapStatefulPrepper.evaluateEdges(ServiceMapStatefulPrepper.java:184) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.plugins.prepper.ServiceMapStatefulPrepper.doExecute(ServiceMapStatefulPrepper.java:131) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.model.prepper.AbstractPrepper.lambda$execute$0(AbstractPrepper.java:44) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.model.prepper.AbstractPrepper$$Lambda$712/0x00000008010d3e10.get(Unknown Source) ~[?:?]
    at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:57) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.model.prepper.AbstractPrepper.execute(AbstractPrepper.java:44) ~[data-prepper.jar:1.2.1]
    at com.amazon.dataprepper.pipeline.ProcessWorker.run(ProcessWorker.java:62) ~[data-prepper.jar:1.2.1]
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
    at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) ~[?:?]
    ... 2 more
```

**Expected behavior**
- n/a

**Environment (please complete the following information):**
- OS: [e.g. Ubuntu 20.04 LTS]
- Version [e.g. 22]
[BUG] OutofMemory error in service-map pipeline
https://api.github.com/repos/opensearch-project/data-prepper/issues/1021/comments
7
2022-02-09T21:11:03Z
2022-04-19T20:26:38Z
https://github.com/opensearch-project/data-prepper/issues/1021
1,129,025,120
1,021
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
As a user who prefers JSON over YAML, I would like the option to configure Data Prepper using a JSON configuration file.

**Describe the solution you'd like**
Data Prepper supports both YAML and JSON configuration files.

**Additional context**
This is a relatively minor change, as Jackson can easily convert JSON to YAML internally. However, it may be necessary to provide JSON configuration examples along with the YAML examples in the documentation. Since YAML is the main way to configure Data Prepper, this may not be completely necessary. Is there a concern that having 50% of users on YAML and 50% on JSON would cause more confusion than necessary? Or does the option to use JSON outweigh this concern?

### Tasks
- [ ] Unit or integration tests for pipeline JSON
- [ ] Include `.json` extension when scanning the `pipelines/` directory
- [ ] Read each `.json` and `.yaml` file independently when constructing the pipeline
- [ ] Documentation updates
Support JSON configuration files
https://api.github.com/repos/opensearch-project/data-prepper/issues/1020/comments
5
2022-02-09T19:23:04Z
2024-02-15T19:50:41Z
https://github.com/opensearch-project/data-prepper/issues/1020
1,128,934,807
1,020
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Create automated tests to verify Prometheus and CloudWatch metrics are available at `/metrics/prometheus` and `/metrics/sys` if configured. **Describe the solution you'd like** Add an automated e2e test to verify. **Describe alternatives you've considered (Optional)** - enhance existing e2e test - less focused test - create a smoke test - not really the place to catch bugs **Additional context** A bug where configured metrics are not available was discovered in main by @graytaylor0
Automated testing for /metrics/*
https://api.github.com/repos/opensearch-project/data-prepper/issues/1018/comments
0
2022-02-09T17:42:11Z
2022-04-19T19:12:30Z
https://github.com/opensearch-project/data-prepper/issues/1018
1,128,842,244
1,018
[ "opensearch-project", "data-prepper" ]
# Background
Data Prepper pipelines currently do not support conditionals or routing. Thus all events in Data Prepper must flow through all sinks and processors in a pipeline. Many users require the ability to route events to different sinks and processors depending on the specific event. The following diagram outlines a common scenario: users need to route data to different sinks depending on some property of the event.

![ConditionalRouting-SinkExampleByType](https://user-images.githubusercontent.com/293424/153069171-67a3467b-ccaf-4538-9d0f-85edc43df753.png)

# Proposal
This RFC introduces the concept of a router to Data Prepper. Pipeline authors can define named routes in the router. Data Prepper will apply routes to individual Events before sending them to Sinks. This GitHub issue focuses on using routing to route sinks. See the #522 RFC for a proposal for routing through a processor chain. The following diagram outlines where the router will sit and what it will perform.

![ConditionalRouting-Simple](https://user-images.githubusercontent.com/293424/155431217-e3fc609f-0fcf-4d3c-8255-c9a8e2fa0068.png)

## Design
Data Prepper will introduce a new `router` component to the pipeline. This is at the same level of the YAML as the `prepper` and `sink`. The router will run after the processor chain and before the sinks. Data Prepper would evaluate these routes directly before passing the Events into the sinks.

```
log-pipeline:
  source:
    http:
  processor:
  router:
    - application-logs: '/log_type == "application"'
    - http-logs: '/log_type == "apache"'
  sink:
    - opensearch:
        hosts: [ "https://opensearch:9200" ]
        index: application_logs
        routes: [application-logs]
    - opensearch:
        hosts: [ "https://opensearch:9200" ]
        index: http_logs
        routes: [http-logs]
    - opensearch:
        hosts: [ "https://opensearch:9200" ]
        index: all_logs
```

Any `sink` with the `routes` property will only accept Events which match at least one of the routes. In the example above, `application-logs` is a named route.
Data Prepper will only route events with the `application-logs` route to the first `opensearch` sink. By default, Data Prepper will route all Events to a sink which does not define a route. Thus, in the example above, all Events will go into the third `opensearch` sink.

# Alternatives
See the comments below for alternatives.
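The delivery rule — a sink with `routes` accepts only Events matching at least one of its named routes, while a sink without `routes` accepts everything — can be modeled with a small Python sketch (a simplified illustration, not the proposed implementation):

```python
def route_events(events, sink_routes):
    """Distribute events to sinks by named route.

    events: list of (event, set_of_matched_route_names) pairs,
            where route matching has already been evaluated
    sink_routes: mapping of sink name -> list of route names
                 (an empty list means the sink accepts all events)
    """
    delivered = {sink: [] for sink in sink_routes}
    for event, matched_routes in events:
        for sink, routes in sink_routes.items():
            # No configured routes => default sink, receives everything.
            if not routes or matched_routes.intersection(routes):
                delivered[sink].append(event)
    return delivered
```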
[RFC] Conditional Routing
https://api.github.com/repos/opensearch-project/data-prepper/issues/1007/comments
19
2022-02-08T22:02:08Z
2022-09-08T15:19:52Z
https://github.com/opensearch-project/data-prepper/issues/1007
1,127,804,176
1,007
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
As part of a larger feature (#522) to support complex condition statements in Data Prepper, there is a need to define a syntax for conditional statements. A conditional statement is a String that is evaluated at runtime and may reference fields within a record.

**Describe the solution you'd like**
_Terms used throughout this document are defined in the Definitions section._

## Supported Operators
In order of evaluation priority _(top to bottom, left to right)_:

| Operator | Description | Data Prepper Version |
|----------------------|--------------------------|----------------------|
| `{}` | Set Initializer | 1.4.0 |
| `()` | Priority Expression | 1.3.0 |
| `not` | Not Operator | 1.3.0 |
| `in`, `not in` | Set Operators | 1.4.0 |
| `<`, `<=`, `>`, `>=` | Relational Operators | 1.3.0 |
| `=~`, `!~` | Regex Equality Operators | TBD |
| `==`, `!=` | Equality Operators | 1.3.0 |
| `and`, `or` | Conditional Expression | 1.3.0 |
| `,` | Set Value Delimiter | 1.4.0 |

## Reserved for possible future functionality
Reserved symbol set: `^`, `*`, `/`, `%`, `+`, `-`, `xor`, `=`, `+=`, `-=`, `*=`, `/=`, `%=`, `++`, `--`, `${<text>}`

## Set Initializer
Defines a set of terms and/or expressions.

Examples
```
{1, 2, 3}
{"a", "b", "c"}
{/people/0/name, /status_code}
```

## Priority Expression
Identifies an expression that will be evaluated at the highest priority level. A priority expression must contain an expression or value; empty parentheses are not supported.

Examples
```
/is_cool == (/name == "Steven")
```

## Set Operators
Test whether a value is in/not in a set. Note, the right-hand side operand must be a set.

Syntax
```
<Expression> in <Set>
<Expression> not in <Set>
```

Examples
```
/status_code in {200, 202}
/status_code not in {400, 404, 500}
```

## Relational Operators
Test the relationship of two numeric values. Note, the operands must be a number or a Json Pointer that will resolve to a number.
Syntax
```
<Number | Json Pointer> < <Number | Json Pointer>
<Number | Json Pointer> <= <Number | Json Pointer>
<Number | Json Pointer> > <Number | Json Pointer>
<Number | Json Pointer> >= <Number | Json Pointer>
```

Examples
```
/status_code >= 200 and /status_code < 300
```

## Regex Equality Operators
Used to test whether a String value matches/does not match a regular expression. Note, the left-hand side operand must be a String or a Json Pointer that resolves to a String. The right-hand side operand must be a String that contains a regular expression, or a Json Pointer that resolves to such a String.

Syntax
```
<String | Json Pointer> =~ <Regex String | Json Pointer>
<String | Json Pointer> !~ <Regex String | Json Pointer>
```

Examples
```
/string_property =~ "^[A-Za-z\s]*$"
"Hello!" !~ /event/regex_matcher
```

## Equality Operators
Used to test whether two values are/are not equivalent.

Syntax
```
<Any> == <Any>
<Any> != <Any>
```

Examples
```
/is_cool == true
3.14 != /status_code
{1, 2} == /event/set_property
```

## Conditional Expression
Used to chain together multiple expressions and/or values.

Syntax
```
<Any> and <Any>
<Any> or <Any>
not <Any>
```

Examples
```
/status_code == 200 and /message == "Hello world"
/status_code == 200 or /status_code == 202
not /status_code in {200, 202}
```

# Definitions

### Literal
A fundamental value that has no children:
- Float _(supports values from 3.40282347 x 10^38 to 1.40239846 x 10^-45)_
- Integer _(supports values from -2147483648 to 2147483647)_
- Boolean _(supports true or false)_
- Json Pointer _(see the Json Pointer section for details)_
- String _(supports valid Java String characters)_

### Expression String
The String that will be parsed for evaluation. The Expression String is the highest level of a Data Prepper expression. Only one Expression String is supported, resulting in a single return value. Note, an _Expression String_ is not the same as an _Expression_.
### Statement
The highest-level component of the Expression String.

### Expression
A generic component that contains a _Primary_ or an _Operator_. Expressions may contain other expressions. An expression's immediate children can contain 0-1 _Operators_.

### Primary
- _Set_
- _Priority Expression_
- _Literal_

### Operator
A hard-coded token that identifies the operation used in an _Expression_.

### Json Pointer
A Literal used to reference a value within the Event provided as context for the _Expression String_. Json Pointers are identified by a leading `/` followed by alphanumeric characters or underscores, delimited by `/`. Json Pointers can use an extended character set if wrapped in double quotes (`"`) using the escape character `\`. Note, Json Pointers require any `~` or `/` that is part of the path (rather than a delimiter) to be escaped:
- `~0` representing `~`
- `~1` representing `/`

Shorthand Syntax (Regex, `\w` = `[A-Za-z_]`)
```
/\w+(/\w+)*
```

Shorthand Example
```
/Hello/World/0
```

Escaped Syntax
```
"/<Valid String Characters | Escaped Character>(/<Valid String Characters | Escaped Character>)*"
```

Escaped Example
```
# Path
# { "Hello - 'world/" : [{ "\"JsonPointer\"": true }] }
"/Hello - 'world\//0/\"JsonPointer\""
```

## White Space

### Operators
White space is **optional** surrounding Relational Operators, Regex Equality Operators, Equality Operators, and commas. White space is **required** surrounding Set Initializers, Priority Expressions, Set Operators, and Conditional Expressions.
### Reference Table

| Operator | Description | White Space Required | ✅ Valid Examples | ❌ Invalid Examples |
|----------------------|--------------------------|----------------------|----------------------------------------------------------------|---------------------------------------|
| `{}` | Set Initializer | Yes | `/status in {200}` | `/status in{200}` |
| `()` | Priority Expression | Yes | `/a==(/b==200)`<br>`/a in ({200})` | `/status in({200})` |
| `in`, `not in` | Set Operators | Yes | `/a in {200}`<br>`/a not in {400}` | `/a in{200, 202}`<br>`/a not in{400}` |
| `<`, `<=`, `>`, `>=` | Relational Operators | No | `/status < 300`<br>`/status>=300` | |
| `=~`, `!~` | Regex Equality Operators | No | `/msg =~ "^\w*$"`<br>`/msg=~"^\w*$"` | |
| `==`, `!=` | Equality Operators | No | `/status == 200`<br>`/status_code==200` | |
| `and`, `or`, `not` | Conditional Operators | Yes | `/a<300 and /b>200` | `/b<300and/b>200` |
| `,` | Set Value Delimiter | No | `/a in {200, 202}`<br>`/a in {200,202}`<br>`/a in {200 , 202}` | `/a in {200,}` |
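As an illustration of the Json Pointer rules above — including the `~0`/`~1` escapes — shorthand pointer resolution could be modeled in Python roughly as follows. This is a sketch only, not the proposed implementation, and it does not cover the quoted escaped-string form:

```python
def resolve_pointer(event, pointer):
    """Resolve a shorthand JSON Pointer against a nested event.

    Unescapes ~1 before ~0 so that a literal "~1" sequence is not
    turned into "/" by mistake.
    """
    current = event
    for token in pointer.lstrip("/").split("/"):
        token = token.replace("~1", "/").replace("~0", "~")
        if isinstance(current, list):
            current = current[int(token)]  # numeric tokens index into arrays
        else:
            current = current[token]
    return current
```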
[RFC] Data Prepper Expression Syntax
https://api.github.com/repos/opensearch-project/data-prepper/issues/1005/comments
10
2022-02-08T17:55:30Z
2023-03-30T06:45:40Z
https://github.com/opensearch-project/data-prepper/issues/1005
1,127,581,413
1,005
[ "opensearch-project", "data-prepper" ]
null
Add metrics for Aggregate Processor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1004/comments
0
2022-02-08T17:23:10Z
2022-02-10T18:10:35Z
https://github.com/opensearch-project/data-prepper/issues/1004
1,127,544,117
1,004
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
- [ ] Evaluate parsed Data Prepper expressions
- [ ] Handle parsing and evaluation exceptions
- [ ] Use dependency injection to instantiate evaluation classes
- [ ] Add an e2e test for parsing -> evaluation

**Describe the solution you'd like**

## UML Diagram
![when-processor-Updated UML (5)](https://user-images.githubusercontent.com/8837988/155194080-2923ace3-e766-49f7-9595-7aa7ea8952b9.png)

## Sequence Diagram
![when-processor-Control Flow (2)](https://user-images.githubusercontent.com/8837988/155193439-14a5ea70-0b78-4dac-b751-fae771e95273.png)

## Class Descriptions

### _interface_ ExpressionEvaluator
Primary interface of the `org.opensearch.dataprepper.expression` package. Exposes the expression evaluation API to other packages. All other interfaces and classes are package-private.

### _class_ ConditionalExpressionEvaluator
Default implementation of the `ExpressionEvaluator` interface. Controls the flow of execution in the following order:
1. Parse the statement using the `Parser` service
2. Evaluate the statement using the `Evaluator` service
3. Coerce the result using the `CourcionService`
4. Return a boolean representing the evaluated result

### _interface_ Parser
Describes `Parser` functionality.

### _class_ ParseTreeParser
Default implementation of the `Parser` interface. Converts a statement String to a `ParseTree` object using ANTLR-generated classes.

### _interface_ Evaluator
Describes `Evaluator` functionality.

### _class_ ParseTreeEvaluator
Default implementation of the `Evaluator` interface. Uses an ANTLR `ParseTreeWalker` to traverse a `ParseTree`. The `ParseTreeWalker` will call corresponding functions on a `ParseTreeEvaluatorListener` instance on entering or exiting a node and on visiting a leaf (terminal node).

### _class_ ParseTreeEvaluationListener
Implements `DataPrepperExpressionListener` (extends `ParseTreeEvaluatorListener` and generated by ANTLR).
Handles all tree traversal events, using an operand stack and an argument stack to track visited nodes. On exit-rule events, both stacks are queried to determine whether an operation strategy should be triggered. All available operation strategies are registered when the `ParseTreeEvaluationListener` is constructed.

### _class_ CourcionService
Service containing object type detection, argument count verification, and type conversion (casting).

### _interface_ Operation
Interface every operation must implement to be registered as an evaluation strategy for the `ParseTreeEvaluationListener`.

**Describe alternatives you've considered (Optional)**
n/a

**Additional context**
Related to RFC #522
Depends on #1001
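As a rough mental model of the listener mechanics described above — operation strategies registered at construction, then applied via a stack as operators are encountered — consider this toy Python sketch. It is an illustration only: the real design walks an ANTLR parse tree and supports the full operator set, whereas this consumes a pre-flattened postfix token stream:

```python
class TinyEvaluator:
    """Toy model of the evaluate-and-coerce steps: binary operation
    strategies live in a registry, operands accumulate on a stack,
    and the final result is coerced to a boolean."""

    def __init__(self):
        # Registered "operation strategies" (a tiny subset for illustration).
        self.operations = {
            "==": lambda left, right: left == right,
            "and": lambda left, right: left and right,
            "in": lambda left, right: left in right,
        }

    def evaluate(self, tokens):
        # tokens are in postfix order, e.g. [200, 200, "=="]
        stack = []
        for token in tokens:
            if isinstance(token, str) and token in self.operations:
                right = stack.pop()
                left = stack.pop()
                stack.append(self.operations[token](left, right))
            else:
                stack.append(token)  # operand: push for a later operator
        return bool(stack.pop())  # coercion step
```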
Data Prepper Expression Evaluator
https://api.github.com/repos/opensearch-project/data-prepper/issues/1003/comments
0
2022-02-08T16:41:05Z
2022-03-11T23:13:21Z
https://github.com/opensearch-project/data-prepper/issues/1003
1,127,496,928
1,003
[ "opensearch-project", "data-prepper" ]
## CVE-2021-44832 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.11.2.jar</b></p></summary> <p>The Apache Log4j Implementation</p> <p>Path to dependency file: /performance-test/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.2/6c2fb3f5b7cd27504726aef1b674b542a0c9cf53/log4j-core-2.11.2.jar</p> <p> Dependency Hierarchy: - zinc_2.12-1.3.5.jar (Root Library) - zinc-compile-core_2.12-1.3.5.jar - util-logging_2.12-1.3.0.jar - :x: **log4j-core-2.11.2.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/022b333dc9be3548b8eb8bb73d0337fd26425056">022b333dc9be3548b8eb8bb73d0337fd26425056</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache Log4j2 versions 2.0-beta7 through 2.17.0 (excluding security fix releases 2.3.2 and 2.12.4) are vulnerable to a remote code execution (RCE) attack when a configuration uses a JDBC Appender with a JNDI LDAP data source URI when an attacker has control of the target LDAP server. This issue is fixed by limiting JNDI data source names to the java protocol in Log4j2 versions 2.17.1, 2.12.4, and 2.3.2. 
<p>Publish Date: 2021-12-28 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44832>CVE-2021-44832</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://logging.apache.org/log4j/2.x/security.html">https://logging.apache.org/log4j/2.x/security.html</a></p> <p>Release Date: 2021-12-28</p> <p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.3.2,2.12.4,2.17.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.logging.log4j","packageName":"log4j-core","packageVersion":"2.11.2","packageFilePaths":["/performance-test/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.scala-sbt:zinc_2.12:1.3.5;org.scala-sbt:zinc-compile-core_2.12:1.3.5;org.scala-sbt:util-logging_2.12:1.3.0;org.apache.logging.log4j:log4j-core:2.11.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.logging.log4j:log4j-core:2.3.2,2.12.4,2.17.1","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-44832","vulnerabilityDetails":"Apache Log4j2 versions 2.0-beta7 through 2.17.0 (excluding security fix releases 2.3.2 and 2.12.4) are vulnerable to a remote code 
execution (RCE) attack when a configuration uses a JDBC Appender with a JNDI LDAP data source URI when an attacker has control of the target LDAP server. This issue is fixed by limiting JNDI data source names to the java protocol in Log4j2 versions 2.17.1, 2.12.4, and 2.3.2.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44832","cvss3Severity":"medium","cvss3Score":"6.6","cvss3Metrics":{"A":"High","AC":"High","PR":"High","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
CVE-2021-44832 (Medium) detected in log4j-core-2.11.2.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/999/comments
1
2022-02-07T19:39:37Z
2022-02-09T16:04:23Z
https://github.com/opensearch-project/data-prepper/issues/999
1,126,417,997
999
[ "opensearch-project", "data-prepper" ]
## CVE-2021-44228 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.11.2.jar</b></p></summary> <p>The Apache Log4j Implementation</p> <p>Path to dependency file: /performance-test/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.logging.log4j/log4j-core/2.11.2/6c2fb3f5b7cd27504726aef1b674b542a0c9cf53/log4j-core-2.11.2.jar</p> <p> Dependency Hierarchy: - zinc_2.12-1.3.5.jar (Root Library) - zinc-compile-core_2.12-1.3.5.jar - util-logging_2.12-1.3.0.jar - :x: **log4j-core-2.11.2.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/022b333dc9be3548b8eb8bb73d0337fd26425056">022b333dc9be3548b8eb8bb73d0337fd26425056</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache Log4j2 2.0-beta9 through 2.15.0 (excluding security releases 2.12.2, 2.12.3, and 2.3.1) JNDI features used in configuration, log messages, and parameters do not protect against attacker controlled LDAP and other JNDI related endpoints. An attacker who can control log messages or log message parameters can execute arbitrary code loaded from LDAP servers when message lookup substitution is enabled. From log4j 2.15.0, this behavior has been disabled by default. From version 2.16.0 (along with 2.12.2, 2.12.3, and 2.3.1), this functionality has been completely removed. Note that this vulnerability is specific to log4j-core and does not affect log4net, log4cxx, or other Apache Logging Services projects. 
<p>Publish Date: 2021-12-10 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44228>CVE-2021-44228</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>10.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://logging.apache.org/log4j/2.x/security.html">https://logging.apache.org/log4j/2.x/security.html</a></p> <p>Release Date: 2021-12-10</p> <p>Fix Resolution: org.apache.logging.log4j:log4j-core:2.3.1,2.12.2,2.15.0;org.ops4j.pax.logging:pax-logging-log4j2:1.11.10,2.0.11</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.logging.log4j","packageName":"log4j-core","packageVersion":"2.11.2","packageFilePaths":["/performance-test/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.scala-sbt:zinc_2.12:1.3.5;org.scala-sbt:zinc-compile-core_2.12:1.3.5;org.scala-sbt:util-logging_2.12:1.3.0;org.apache.logging.log4j:log4j-core:2.11.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.logging.log4j:log4j-core:2.3.1,2.12.2,2.15.0;org.ops4j.pax.logging:pax-logging-log4j2:1.11.10,2.0.11","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-44228","vulnerabilityDetails":"Apache Log4j2 
2.0-beta9 through 2.15.0 (excluding security releases 2.12.2, 2.12.3, and 2.3.1) JNDI features used in configuration, log messages, and parameters do not protect against attacker controlled LDAP and other JNDI related endpoints. An attacker who can control log messages or log message parameters can execute arbitrary code loaded from LDAP servers when message lookup substitution is enabled. From log4j 2.15.0, this behavior has been disabled by default. From version 2.16.0 (along with 2.12.2, 2.12.3, and 2.3.1), this functionality has been completely removed. Note that this vulnerability is specific to log4j-core and does not affect log4net, log4cxx, or other Apache Logging Services projects.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-44228","cvss3Severity":"high","cvss3Score":"10.0","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Changed","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
CVE-2021-44228 (High) detected in log4j-core-2.11.2.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/998/comments
1
2022-02-07T19:39:35Z
2022-02-09T16:04:20Z
https://github.com/opensearch-project/data-prepper/issues/998
1,126,417,971
998
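Both log4j advisories above name fixed `log4j-core` versions (2.17.1 for CVE-2021-44832; 2.15.0/2.16.0-era releases for CVE-2021-44228) but the vulnerable jar arrives transitively through `zinc_2.12`, so it cannot be bumped by editing a direct dependency. A minimal sketch of one way to pin the patched version in a Gradle (Groovy DSL) build such as the referenced `performance-test/build.gradle` — the exact build-file placement is an assumption, and 2.17.1 is taken from the CVE-2021-44832 fix resolution:

```groovy
// Force every configuration to resolve the patched log4j-core named in the
// advisories, overriding the transitive 2.11.2 pulled in via zinc.
configurations.all {
    resolutionStrategy {
        force 'org.apache.logging.log4j:log4j-core:2.17.1'
    }
}
```

`resolutionStrategy.force` overrides whatever version transitive resolution would otherwise select; a dependency-constraint block with `strictly` would be an equivalent choice.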
[ "opensearch-project", "data-prepper" ]
## CVE-2020-15250 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>junit-4.13.jar</b></p></summary> <p>JUnit is a unit testing framework for Java, created by Erich Gamma and Kent Beck.</p> <p>Library home page: <a href="http://junit.org">http://junit.org</a></p> <p>Path to dependency file: /data-prepper-plugins/key-value-processor/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/junit/junit/4.13/e49ccba652b735c93bd6e6f59760d8254cf597dd/junit-4.13.jar</p> <p> Dependency Hierarchy: - junit-vintage-engine-5.7.2.jar (Root Library) - :x: **junit-4.13.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/022b333dc9be3548b8eb8bb73d0337fd26425056">022b333dc9be3548b8eb8bb73d0337fd26425056</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In JUnit4 from version 4.7 and before 4.13.1, the test rule TemporaryFolder contains a local information disclosure vulnerability. On Unix-like systems, the system's temporary directory is shared between all users on that system. Because of this, when files and directories are written into this directory they are, by default, readable by other users on that same system. This vulnerability does not allow other users to overwrite the contents of these directories or files. This is purely an information disclosure vulnerability. This vulnerability impacts you if the JUnit tests write sensitive information, like API keys or passwords, into the temporary folder, and the JUnit tests execute in an environment where the OS has other untrusted users. Because certain JDK file system APIs were only added in JDK 1.7, this fix is dependent upon the version of the JDK you are using. For Java 1.7 and higher users: this vulnerability is fixed in 4.13.1. For Java 1.6 and lower users: no patch is available, you must use the workaround below. If you are unable to patch, or are stuck running on Java 1.6, specifying the `java.io.tmpdir` system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability. 
For more information, including an example of vulnerable code, see the referenced GitHub Security Advisory. <p>Publish Date: 2020-10-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15250>CVE-2020-15250</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp">https://github.com/junit-team/junit4/security/advisories/GHSA-269g-pwp5-87pp</a></p> <p>Release Date: 2020-10-12</p> <p>Fix Resolution: junit:junit:4.13.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"junit","packageName":"junit","packageVersion":"4.13","packageFilePaths":["/data-prepper-plugins/key-value-processor/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.junit.vintage:junit-vintage-engine:5.7.2;junit:junit:4.13","isMinimumFixVersionAvailable":true,"minimumFixVersion":"junit:junit:4.13.1","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-15250","vulnerabilityDetails":"In JUnit4 from version 4.7 and before 4.13.1, the test rule TemporaryFolder contains a local information disclosure vulnerability. 
On Unix like systems, the system\u0027s temporary directory is shared between all users on that system. Because of this, when files and directories are written into this directory they are, by default, readable by other users on that same system. This vulnerability does not allow other users to overwrite the contents of these directories or files. This is purely an information disclosure vulnerability. This vulnerability impacts you if the JUnit tests write sensitive information, like API keys or passwords, into the temporary folder, and the JUnit tests execute in an environment where the OS has other untrusted users. Because certain JDK file system APIs were only added in JDK 1.7, this this fix is dependent upon the version of the JDK you are using. For Java 1.7 and higher users: this vulnerability is fixed in 4.13.1. For Java 1.6 and lower users: no patch is available, you must use the workaround below. If you are unable to patch, or are stuck running on Java 1.6, specifying the `java.io.tmpdir` system environment variable to a directory that is exclusively owned by the executing user will fix this vulnerability. For more information, including an example of vulnerable code, see the referenced GitHub Security Advisory.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15250","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Local","I":"None"},"extraData":{}}</REMEDIATE> -->
CVE-2020-15250 (Medium) detected in junit-4.13.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/997/comments
1
2022-02-07T19:39:33Z
2022-02-28T22:40:29Z
https://github.com/opensearch-project/data-prepper/issues/997
1,126,417,946
997
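The junit advisory above names two mitigations: upgrade to 4.13.1 (the vulnerable 4.13 arrives transitively through `junit-vintage-engine`), or point `java.io.tmpdir` at a directory owned solely by the executing user. A minimal sketch of both in a Gradle (Groovy DSL) build — the plugin's `build.gradle` is the assumed location, and the tmpdir path shown is illustrative, not taken from this repository:

```groovy
// Override the transitive junit 4.13 from junit-vintage-engine with the
// fixed release named in the advisory.
configurations.all {
    resolutionStrategy {
        force 'junit:junit:4.13.1'
    }
}

// Defense in depth: run test JVMs with a per-build temp directory instead of
// the world-readable shared /tmp that TemporaryFolder would otherwise use.
tasks.withType(Test).configureEach {
    systemProperty 'java.io.tmpdir', "${buildDir}/tmp"
}
```

Gradle creates `$buildDir/tmp` under the build output, which is owned by the invoking user, so TemporaryFolder's files are no longer created in a directory shared with other local accounts.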