issue_owner_repo listlengths 2 2 | issue_body stringlengths 0 262k ⌀ | issue_title stringlengths 1 1.02k | issue_comments_url stringlengths 53 116 | issue_comments_count int64 0 2.49k | issue_created_at stringdate 1999-03-17 02:06:42 2025-06-23 11:41:49 | issue_updated_at stringdate 2000-02-10 06:43:57 2025-06-23 11:43:00 | issue_html_url stringlengths 34 97 | issue_github_id int64 132 3.17B | issue_number int64 1 215k |
|---|---|---|---|---|---|---|---|---|---|
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
DataPrepper recently added Kafka Source support. The Kafka source should expose KafkaConsumer metrics and other metrics to DataPrepper users via the PluginMetrics API.
**Describe the solution you'd like**
Types of metrics to support
- global (KafkaSource level) metrics
- topic level metrics
* KafkaConsumer metrics (obtained by using the `metrics()` API of KafkaConsumer)
* Internal metrics that track the functioning of the KafkaConsumer code. Examples of these metrics are the number of positive/negative acknowledgements received, the number of auth exceptions, etc.
Since Kafka Source allows creation of multiple threads per topic, we need a way to efficiently collect and expose metrics.
Proposal
- Each thread maintains local counters and exports them periodically to a common/shared data structure (shared by all threads consuming from the same topic)
- Periodically, per-topic metrics are calculated by aggregating the metrics from each thread, and a `PluginMetrics.gauge()` is created. Per-topic metric names will have a `topic.` prefix, followed by the topic name and a `.`, before the metric name. Kafka metric names containing `-` will also be replaced by their camel-case equivalents. For example, the per-topic KafkaConsumer metric `records-consumed` will be `topic.topicName.recordsConsumed`.
- Global metrics are aggregated across all threads (irrespective of topic)
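A sketch of the naming convention and the periodic aggregation (illustrative Python; this is not the actual Data Prepper implementation, and all names here are made up):

```python
from collections import Counter, defaultdict
import threading

def to_camel_case(kafka_metric_name: str) -> str:
    """Convert a Kafka metric name like 'records-consumed' to 'recordsConsumed'."""
    head, *rest = kafka_metric_name.split("-")
    return head + "".join(part.capitalize() for part in rest)

def per_topic_metric_name(topic: str, kafka_metric_name: str) -> str:
    """Build the exposed name: topic.<topicName>.<camelCaseMetric>."""
    return f"topic.{topic}.{to_camel_case(kafka_metric_name)}"

class TopicMetrics:
    """Shared per-topic structure; each consumer thread exports a snapshot of
    its local counters here periodically instead of synchronizing on every
    increment, which is the efficiency argument made in the proposal."""
    def __init__(self):
        self._lock = threading.Lock()
        self._per_thread = defaultdict(Counter)  # thread id -> latest snapshot

    def export(self, thread_id, local_counters: Counter):
        # Called periodically by each consumer thread.
        with self._lock:
            self._per_thread[thread_id] = Counter(local_counters)

    def aggregate(self) -> Counter:
        # Called periodically to compute the values behind each gauge.
        with self._lock:
            total = Counter()
            for counters in self._per_thread.values():
                total.update(counters)
            return total
```

For example, `per_topic_metric_name("topicName", "records-consumed")` yields `topic.topicName.recordsConsumed`, matching the convention above.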
**Describe alternatives you've considered (Optional)**
An alternative is to use synchronized blocks or atomic counters to increment the metrics inline. I think this method will be very inefficient when the number of topics or threads is high.
**Additional context**
Add any other context or screenshots about the feature request here.
| Metrics in Kafka Source | https://api.github.com/repos/opensearch-project/data-prepper/issues/3112/comments | 1 | 2023-08-04T15:10:36Z | 2023-08-10T18:38:44Z | https://github.com/opensearch-project/data-prepper/issues/3112 | 1,836,929,936 | 3,112 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The OpenSearch sink currently supports the `create` and `index` bulk actions. As a user, I would also like to utilize both the `delete` and `update` actions (as well as the `upsert` action, which uses the update bulk action) in the OpenSearch sink, but they are not supported at this time (https://opensearch.org/docs/latest/api-reference/document-apis/bulk/). Additionally, I would like to choose the bulk action to use (create, index, update, and delete) based on what each Event looks like, through conditional expressions.
**Describe the solution you'd like**
A deprecation of the existing `action` parameter of the OpenSearch sink
```
action: "index"
```
and a rework to the following format with an `actions` parameter. The simple config above would look like this with the new parameter:
```
actions:
- type: "index"
```
and more complex cases may look like this, including conditional expressions:
```
actions:
- type: "create"
when: "/some_key == CREATE"
- type: "index"
when: "/some_key == INDEX"
- type: "upsert"
when: "/some_key == UPSERT"
- type: "update"
when: "/some_key == UPDATE"
- type: "delete"
when: "/some_key == DELETE"
# default case
- type: "index"
```
This would be read like a switch statement, where the order of the list matters:
```
if (/some_key == CREATE) {
// use bulk create action
} else if (/some_key == INDEX) {
// use bulk index action
} else if (/some_key == UPSERT) {
// perform upsert (update action with doc_as_upsert set to true)
} else if (/some_key == UPDATE) {
// use bulk update action
} else if (/some_key == DELETE) {
// use bulk delete action
} else {
// use bulk index action
}
```
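The first-match semantics of the ordered `actions` list can be sketched as follows (illustrative Python; conditions are modeled as callables standing in for expressions like `/some_key == CREATE`):

```python
def resolve_action(actions, event):
    """Return the bulk action type for the first entry whose condition
    matches the event; an entry without a condition is the default case."""
    for entry in actions:
        when = entry.get("when")
        if when is None or when(event):
            return entry["type"]
    raise ValueError("no action matched and no default entry was configured")

# Conditions stand in for conditional expressions evaluated per Event.
actions = [
    {"type": "create", "when": lambda e: e.get("some_key") == "CREATE"},
    {"type": "delete", "when": lambda e: e.get("some_key") == "DELETE"},
    {"type": "index"},  # default case, like the trailing entry above
]
```

Because evaluation stops at the first match, placing the unconditional `index` entry last reproduces the `else` branch of the pseudocode above.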
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| Support remaining bulk actions and conditional expressions to determine bulk action in the OpenSearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/3109/comments | 4 | 2023-08-03T15:28:35Z | 2023-10-06T00:54:39Z | https://github.com/opensearch-project/data-prepper/issues/3109 | 1,835,285,927 | 3,109 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Other projects in the OpenSearch project have a `RELEASING.md` file.
Data Prepper has the instructions in a different location.
https://github.com/opensearch-project/data-prepper/blob/main/release/README.md
**Describe the solution you'd like**
Move the instructions to `RELEASING.md` and update for the new release process underway in #2122.
**Describe alternatives you've considered (Optional)**
One alternative is to move the file now; however, since we are in the process of updating the release process, I'll just do it all at once.
| Providing a RELEASING.md file | https://api.github.com/repos/opensearch-project/data-prepper/issues/3108/comments | 0 | 2023-08-03T14:18:18Z | 2023-08-25T21:04:45Z | https://github.com/opensearch-project/data-prepper/issues/3108 | 1,835,160,368 | 3,108 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently the cloudwatch_logs sink does not support pushing log events to a DLQ (dead-letter queue) if an error occurs.
**Describe the solution you'd like**
To solve this issue, an S3 or file DLQ can be passed into the sink to allow errored-out log events to be saved for later viewing or retransmission.
**Additional context**
The addition of the DLQ would require the extension of the existing cloudwatch_logs sink, including its configuration.
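A configuration shape consistent with the DLQ convention used elsewhere in Data Prepper might look like this (a sketch only; the `cloudwatch_logs` options and the `dlq` block shown here are illustrative, not an implemented feature):

```yaml
sink:
  - cloudwatch_logs:
      log_group: "my-log-group"        # illustrative existing sink options
      log_stream: "my-log-stream"
      dlq:
        s3:
          bucket: "my-dlq-bucket"
          key_path_prefix: "cloudwatch-logs-dlq/"
          region: "us-east-1"
```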
| Addition of DLQ inside the CloudWatch Logs Sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/3107/comments | 0 | 2023-08-03T00:44:37Z | 2023-08-09T21:09:22Z | https://github.com/opensearch-project/data-prepper/issues/3107 | 1,834,094,476 | 3,107 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently the cloudwatch_logs sink publishes log events with the time at which the event is transmitted from the sink. There is no support for extracting an already-existing timestamp from the event and using it as the event's timestamp.
**Describe the solution you'd like**
A solution would be to extend the current plugin with a configurable timestamp-extraction option.
**Additional context**
The timestamp could potentially be extracted in one of the following ways:
- Through a user-entered date configuration, which the sink would try to pull and use during transmission.
- Through the extraction of a Data-Prepper standard timestamp (via the Date Processor).
Both of these methods would also require the final date to be in UTC prior to being transmitted. | Adding Timestamp Extraction to CloudWatch Logs Sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/3106/comments | 0 | 2023-08-03T00:39:44Z | 2023-08-09T21:08:15Z | https://github.com/opensearch-project/data-prepper/issues/3106 | 1,834,089,017 | 3,106 |
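The extraction and UTC normalization described in the issue above can be sketched as follows (illustrative Python; the `@timestamp` key and the ISO-8601 assumption are inventions for the example, not part of the proposal):

```python
from datetime import datetime, timezone

def extract_timestamp_millis(event: dict, timestamp_key: str = "@timestamp") -> int:
    """Pull an ISO-8601 timestamp out of the event and return it as
    epoch milliseconds in UTC, the form CloudWatch Logs expects."""
    raw = event[timestamp_key]
    # Replace a trailing 'Z' so the string parses on older interpreters
    # where fromisoformat does not yet accept it.
    parsed = datetime.fromisoformat(raw.replace("Z", "+00:00"))
    if parsed.tzinfo is None:
        parsed = parsed.replace(tzinfo=timezone.utc)  # assume UTC if naive
    return int(parsed.astimezone(timezone.utc).timestamp() * 1000)
```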
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data-Prepper can receive log events but cannot publish them in Embedded Metric Format (EMF) to take advantage of metric extraction by the CloudWatch Logs service.
**Describe the solution you'd like**
To solve this, a processor can be made to format the log event into EMF. This processor would be in charge of generating the root metric document and inserting the metrics to be extracted (configurable via the plugin).
**Additional context**
Reference to EMF format: https://docs.aws.amazon.com/AmazonCloudWatch/latest/monitoring/CloudWatch_Embedded_Metric_Format_Specification.html#CloudWatch_Embedded_Metric_Format_Specification_structure_metricdirective
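A minimal sketch of what such a processor would emit, following the linked specification: the `_aws` metric directive plus the metric and dimension values at the document root (the namespace and field names below are placeholders):

```python
import json

def to_emf(event: dict, namespace: str, metric_names: list,
           dimension_keys: list, timestamp_ms: int) -> str:
    """Wrap selected event fields in an EMF envelope so CloudWatch Logs
    can extract them as metrics."""
    doc = {
        "_aws": {
            "Timestamp": timestamp_ms,
            "CloudWatchMetrics": [
                {
                    "Namespace": namespace,
                    "Dimensions": [dimension_keys],
                    "Metrics": [{"Name": name} for name in metric_names],
                }
            ],
        },
    }
    # Per the spec, metric and dimension values live at the document root.
    for key in metric_names + dimension_keys:
        doc[key] = event[key]
    return json.dumps(doc)
```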
| Addition of EMF Processor to format log events for CloudWatch Logs Metric Extraction | https://api.github.com/repos/opensearch-project/data-prepper/issues/3105/comments | 0 | 2023-08-03T00:25:28Z | 2023-08-09T21:06:14Z | https://github.com/opensearch-project/data-prepper/issues/3105 | 1,834,072,546 | 3,105 |
[
"opensearch-project",
"data-prepper"
] | Data Prepper `pipelines.yaml`:
```yaml
otel-opensearch-pipeline:
workers: 1
delay: "5000"
source:
otel_logs_source:
ssl: false
sink:
- opensearch:
hosts: [ "https://es-host:9200" ]
index: "test-index-%{yyyy.MM.dd}"
username: admin
password: <redacted>
```
This config receives OpenTelemetry logs and forwards them to OpenSearch. Attributes that are sent to OpenSearch are all prefixed with `log.attributes`, resulting in OpenSearch docs that look like this:
```json
{
"_index": "test-index-2023.07.31",
"_type": "_doc",
"_id": "1237rIkBtBas0TDtiznV",
"_version": 1,
"_score": null,
"_source": {
<rest of source redacted>
"log.attributes.my_string": "TEST",
"resource.attributes.telemetry@sdk@language": "dotnet",
"log.attributes.dotnet@ilogger@category": "LoggingApp.Program",
"log.attributes.my_int": 123
}
}
```
If these prefixes are not used for anything useful in OpenSearch, is there a sensible way in Data Prepper to strip this `log.attributes` prefix off of the messages? The desired source would be like this:
```json
"_source": {
"my_string": "TEST",
"resource.attributes.telemetry@sdk@language": "dotnet",
"dotnet@ilogger@category": "LoggingApp.Program",
"my_int": 123
}
```
And the log attributes are not known ahead of time. Thanks. | Receive OTLP and index into OpenSearch without `log.attributes` prefix | https://api.github.com/repos/opensearch-project/data-prepper/issues/3098/comments | 2 | 2023-08-01T11:08:07Z | 2024-08-28T22:37:03Z | https://github.com/opensearch-project/data-prepper/issues/3098 | 1,831,029,991 | 3,098 |
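The transformation requested in the issue above, stripping a fixed `log.attributes.` prefix from keys that are not known ahead of time, can be sketched as follows (illustrative Python, not a built-in Data Prepper processor):

```python
def strip_prefix(doc: dict, prefix: str = "log.attributes.") -> dict:
    """Return a copy of the document with the given prefix removed from
    any top-level key that starts with it; other keys are untouched."""
    return {
        (key[len(prefix):] if key.startswith(prefix) else key): value
        for key, value in doc.items()
    }
```

Because the renaming is driven by the prefix rather than a list of known keys, it works even when the log attributes are not known ahead of time.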
[
"opensearch-project",
"data-prepper"
] | See PR comments in #2989. | PR feedback improvements from #2989 | https://api.github.com/repos/opensearch-project/data-prepper/issues/3080/comments | 1 | 2023-07-31T14:07:34Z | 2023-08-14T15:30:14Z | https://github.com/opensearch-project/data-prepper/issues/3080 | 1,829,273,528 | 3,080 |
[
"opensearch-project",
"data-prepper"
] | Hello,
I tried many different syntaxes for rollover_alias, but somehow, when data-prepper creates the template in OpenSearch, the setting gets lost.
I have a simple pipeline with a sink:
```
sink:
- opensearch:
hosts: [ "https://127.0.0.1:9200" ]
index: "test_logs"
template_type: "index-template"
template_file: "/opt/data-prepper/pipelines/log.template"
ism_policy_file: "/opt/data-prepper/pipelines/log.policy"
```
The template_file looks like this:
```
{
"version": 3,
"priority": 1,
"template": {
"settings": {
"index": {
"number_of_shards": "1",
"number_of_replicas": "0",
"plugins": {
"index_state_management": {
"rollover_alias": "test_logs"
}
}
}
}
}
}
```
The shards and replicas settings are applied. However, the rollover_alias always gets lost. I also tried the older syntax "opendistro.index_state_management.rollover_alias" but this setting gets lost too.
When the template is created, I can modify it without a problem and add the rollover_alias. I just can't get it to work via data-prepper's template_file.
Does anybody have a clue how to do it?
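One thing worth ruling out is a nested-versus-flattened syntax mismatch: OpenSearch generally accepts index settings both as the nested object above and as a single dotted key such as `index.plugins.index_state_management.rollover_alias`. A small sketch of that equivalence (illustrative Python, useful for checking which keys a template actually carries):

```python
def flatten_settings(settings: dict, parent: str = "") -> dict:
    """Flatten nested index settings into dotted keys, the other
    commonly accepted form for OpenSearch settings."""
    flat = {}
    for key, value in settings.items():
        path = f"{parent}.{key}" if parent else key
        if isinstance(value, dict):
            flat.update(flatten_settings(value, path))
        else:
            flat[path] = value
    return flat
```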
| Question: How can I add a rollover_alias in the index_template file? | https://api.github.com/repos/opensearch-project/data-prepper/issues/3077/comments | 2 | 2023-07-28T14:04:44Z | 2023-10-16T06:22:12Z | https://github.com/opensearch-project/data-prepper/issues/3077 | 1,826,491,182 | 3,077 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
In the `index` parameter of the OpenSearch sink, the following format is supported for composing the index name from multiple keys or expression results in the Event:
```
index: "${key-one}-${key-two}-${getMetadata(\"some_metadata_key\")}"
```
but `document_id_field` has never supported the format option, so it has to either be
```
document_id_field: "some_key"
```
or
```
document_id_field: "getMetadata(\"some_metadata_key\")"
```
**Describe the solution you'd like**
To allow the same formatting as `index` for `document_id_field`, but without breaking the existing behavior. (If a temporary `document_id_field_format` option is needed until it is possible to make a breaking change, then that works as well; it may be possible to do this now without breaking the existing functionality, though.)
```
document_id_field: "${key-one}-${key-two}-${getMetadata(\"some_metadata_key\")}"
```
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
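The `${...}` resolution requested here can be sketched as a simple substitution over the event (illustrative Python; real Data Prepper format strings can also contain expressions such as `getMetadata(...)` inside the braces, which this sketch does not evaluate):

```python
import re

_PLACEHOLDER = re.compile(r"\$\{([^}]+)\}")

def resolve_format(fmt: str, event: dict) -> str:
    """Replace each ${key} placeholder with the corresponding event value."""
    def substitute(match: re.Match) -> str:
        key = match.group(1)
        if key not in event:
            raise KeyError(f"event has no key '{key}'")
        return str(event[key])
    return _PLACEHOLDER.sub(substitute, fmt)
```

With this, `index` and `document_id_field` could share one resolution path, e.g. `resolve_format("${key-one}-${key-two}", event)`.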
| Support formatting of the document_id_field in the OpenSearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/3074/comments | 0 | 2023-07-28T00:04:00Z | 2023-08-16T17:27:31Z | https://github.com/opensearch-project/data-prepper/issues/3074 | 1,825,409,723 | 3,074 |
[
"opensearch-project",
"data-prepper"
] | ## WS-2023-0236 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jetty-xml-11.0.12.jar</b></p></summary>
<p>The jetty xml utilities.</p>
<p>Library home page: <a href="https://eclipse.org/jetty">https://eclipse.org/jetty</a></p>
<p>Path to dependency file: /data-prepper-plugins/s3-source/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.eclipse.jetty/jetty-xml/11.0.12/c47d1eb5032141b7ebd5f83f317a07a4fcad6612/jetty-xml-11.0.12.jar</p>
<p>
Dependency Hierarchy:
- wiremock-3.0.0-beta-8.jar (Root Library)
- jetty-webapp-11.0.12.jar
- :x: **jetty-xml-11.0.12.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/8bb96ddcf23859e0e7b55c3a9add5d77eddbccb0">8bb96ddcf23859e0e7b55c3a9add5d77eddbccb0</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
XmlParser is vulnerable to an XML external entity (XXE) vulnerability.
XmlParser is used when parsing Jetty’s XML configuration files. An attacker might exploit this vulnerability in order to achieve SSRF or cause a denial of service. One possible scenario is importing a (remote) malicious WAR into a Jetty server, where the WAR includes a malicious web.xml. The vulnerability is patched in versions 10.0.16, 11.0.16, and 12.0.0.
<p>Publish Date: 2023-07-10
<p>URL: <a href=https://github.com/eclipse/jetty.project/commit/9a05c75ad28ebad4abbe624fa432664c59763747>WS-2023-0236</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/eclipse/jetty.project/security/advisories/GHSA-58qw-p7qm-5rvh">https://github.com/eclipse/jetty.project/security/advisories/GHSA-58qw-p7qm-5rvh</a></p>
<p>Release Date: 2023-07-10</p>
<p>Fix Resolution: org.eclipse.jetty:jetty-xml:10.0.16,11.0.16,12.0.0</p>
</p>
</details>
<p></p>
| WS-2023-0236 (Low) detected in jetty-xml-11.0.12.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/3072/comments | 6 | 2023-07-27T15:53:10Z | 2023-10-26T18:28:47Z | https://github.com/opensearch-project/data-prepper/issues/3072 | 1,824,676,808 | 3,072 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-37920 - Critical Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>certifi-2022.12.7-py3-none-any.whl</b></p></summary>
<p>Python package for providing Mozilla's CA Bundle.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/71/4c/3db2b8021bd6f2f0ceb0e088d6b2d49147671f25832fb17970e9b583d742/certifi-2022.12.7-py3-none-any.whl">https://files.pythonhosted.org/packages/71/4c/3db2b8021bd6f2f0ceb0e088d6b2d49147671f25832fb17970e9b583d742/certifi-2022.12.7-py3-none-any.whl</a></p>
<p>Path to dependency file: /release/smoke-tests/otel-span-exporter/requirements.txt</p>
<p>Path to vulnerable library: /release/smoke-tests/otel-span-exporter/requirements.txt,/release/smoke-tests/otel-span-exporter/requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **certifi-2022.12.7-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Certifi is a curated collection of Root Certificates for validating the trustworthiness of SSL certificates while verifying the identity of TLS hosts. Certifi prior to version 2023.07.22 recognizes "e-Tugra" root certificates. e-Tugra's root certificates were subject to an investigation prompted by reporting of security issues in their systems. Certifi 2023.07.22 removes root certificates from "e-Tugra" from the root store.
<p>Publish Date: 2023-07-25
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-37920>CVE-2023-37920</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/certifi/python-certifi/security/advisories/GHSA-xqr8-7jwr-rhp7">https://github.com/certifi/python-certifi/security/advisories/GHSA-xqr8-7jwr-rhp7</a></p>
<p>Release Date: 2023-07-25</p>
<p>Fix Resolution: certifi - 2023.7.22</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-37920 (Critical) detected in certifi-2022.12.7-py3-none-any.whl | https://api.github.com/repos/opensearch-project/data-prepper/issues/3070/comments | 1 | 2023-07-26T19:11:42Z | 2023-09-07T22:13:41Z | https://github.com/opensearch-project/data-prepper/issues/3070 | 1,823,003,802 | 3,070 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-38493 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>armeria-1.22.1.jar</b>, <b>armeria-1.15.0.jar</b></p></summary>
<p>
<details><summary><b>armeria-1.22.1.jar</b></p></summary>
<p>Asynchronous HTTP/2 RPC/REST client/server library built on top of Java 8, Netty, Thrift and gRPC (armeria)</p>
<p>Library home page: <a href="https://armeria.dev/">https://armeria.dev/</a></p>
<p>Path to dependency file: /data-prepper-plugins/otel-logs-source/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches
/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.22.1/9e34f008f55d4095f01f00ac90edf05e8c9f711a/armeria-1.22.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **armeria-1.22.1.jar** (Vulnerable Library)
</details>
<details><summary><b>armeria-1.15.0.jar</b></p></summary>
<p>Asynchronous HTTP/2 RPC/REST client/server library built on top of Java 8, Netty, Thrift and gRPC (armeria)</p>
<p>Library home page: <a href="https://armeria.dev/">https://armeria.dev/</a></p>
<p>Path to dependency file: /data-prepper-plugins/otel-logs-source/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.linecorp.armeria/armeria/1.15.0/6c26d009aa047e14edb8b99926772d441ab75cf0/armeria-1.15.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **armeria-1.15.0.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Armeria is a microservice framework. Spring supports matrix variables. When Spring integration is used, Armeria calls Spring controllers via `TomcatService` or `JettyService` with a path that may contain matrix variables. Prior to version 1.24.3, the Armeria decorators might not be invoked because of the matrix variables. If an attacker sends a specially crafted request, the request may bypass the authorizer. Version 1.24.3 contains a patch for this issue.
<p>Publish Date: 2023-07-25
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-38493>CVE-2023-38493</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
| CVE-2023-38493 (High) detected in armeria-1.22.1.jar, armeria-1.15.0.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/3069/comments | 0 | 2023-07-26T19:11:38Z | 2023-09-20T13:35:44Z | https://github.com/opensearch-project/data-prepper/issues/3069 | 1,823,003,710 | 3,069 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-3635 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>okio-2.8.0.jar</b></p></summary>
<p>A modern I/O API for Java</p>
<p>Library home page: <a href="https://github.com/square/okio/">https://github.com/square/okio/</a></p>
<p>Path to dependency file: /release/archives/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.squareup.okio/okio/2.8.0/49b64e09d81c0cc84b267edd0c2fd7df5a64c78c/okio-jvm-2.8.0.jar</p>
<p>
Dependency Hierarchy:
- kafka-plugins-2.7.0-SNAPSHOT (Root Library)
- schema-registry-serde-1.1.15.jar
- wire-schema-3.7.1.jar
- :x: **okio-2.8.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
GzipSource does not handle an exception that might be raised when parsing a malformed gzip buffer. This may lead to denial of service of the Okio client when handling a crafted GZIP archive, by using the GzipSource class.
<p>Publish Date: 2023-07-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-3635>CVE-2023-3635</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2023-3635">https://www.cve.org/CVERecord?id=CVE-2023-3635</a></p>
<p>Release Date: 2023-07-12</p>
<p>Fix Resolution: com.squareup.okio:okio-jvm:3.4.0</p>
</p>
</details>
<p></p>
| CVE-2023-3635 (High) detected in okio-2.8.0.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/3068/comments | 3 | 2023-07-26T19:11:36Z | 2024-03-07T15:38:26Z | https://github.com/opensearch-project/data-prepper/issues/3068 | 1,823,003,668 | 3,068 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2021-39194 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kaml-0.20.0.jar</b></p></summary>
<p>YAML support for kotlinx.serialization</p>
<p>Library home page: <a href="https://github.com/">https://github.com/</a></p>
<p>Path to dependency file: /data-prepper-main/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.charleskorn.kaml/kaml/0.20.0/ba87fe23d666195fd7586a9806a945bccc681ec4/kaml-0.20.0.jar</p>
<p>
Dependency Hierarchy:
- kafka-plugins-2.5.0-SNAPSHOT (Root Library)
- schema-registry-serde-1.1.15.jar
- wire-compiler-3.7.1.jar
- :x: **kaml-0.20.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/548b5e0946ecf29631cc9b391031f3291ca23804">548b5e0946ecf29631cc9b391031f3291ca23804</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
kaml is an open source implementation of the YAML format with support for kotlinx.serialization. In affected versions attackers that could provide arbitrary YAML input to an application that uses kaml could cause the application to endlessly loop while parsing the input. This could result in resource starvation and denial of service. This only affects applications that use polymorphic serialization with the default tagged polymorphism style. Applications using the property polymorphism style are not affected. YAML input for a polymorphic type that provided a tag but no value for the object would trigger the issue. Version 0.35.3 or later contain the fix for this issue.
<p>Publish Date: 2021-09-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-39194>CVE-2021-39194</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/charleskorn/kaml/security/advisories/GHSA-fmm9-3gv8-58f4">https://github.com/charleskorn/kaml/security/advisories/GHSA-fmm9-3gv8-58f4</a></p>
<p>Release Date: 2021-09-07</p>
<p>Fix Resolution: com.charleskorn.kaml:kaml:0.35.3</p>
</p>
</details>
<p></p>
| CVE-2021-39194 (Medium) detected in kaml-0.20.0.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/3064/comments | 2 | 2023-07-25T22:57:39Z | 2023-09-26T18:40:01Z | https://github.com/opensearch-project/data-prepper/issues/3064 | 1,821,285,278 | 3,064 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently the `index_type` defaults to `management_disabled` when `distribution_version` is set to es6. This is due to the old index mapping schema in ES 6:
```
org.opensearch.client.opensearch._types.OpenSearchException: Request failed: [illegal_argument_exception] Malformed [mappings] section for type [dynamic_templates], should include an inner object describing the mapping
at org.opensearch.client.transport.aws.AwsSdk2Transport.parseResponse(AwsSdk2Transport.java:492) ~[opensearch-java-2.5.0.jar:?]
```
| [Enhancement] Support custom index template for ES 6 in opensearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/3060/comments | 0 | 2023-07-25T17:06:53Z | 2023-07-26T21:05:54Z | https://github.com/opensearch-project/data-prepper/issues/3060 | 1,820,763,343 | 3,060 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper now supports the new `_index_template` template type. This is preferred over the `v1` style.
**Describe the solution you'd like**
Change the default behavior of the Data Prepper `opensearch` sink to use the newer composable index templates. Make this change in a major version change.
By default:
```
sink:
  - opensearch:
      template_type: index-template
```
**Describe alternatives you've considered (Optional)**
Keeping the default behavior. This is not ideal for many users.
Changing the default in a minor version. However, this may cause unexpected impacts and thus could be considered breaking.
**Additional context**
See some related issues: #1275, #1215, #3052
| Change the default OpenSearch template type to index_template | https://api.github.com/repos/opensearch-project/data-prepper/issues/3059/comments | 0 | 2023-07-25T15:23:48Z | 2023-07-25T15:25:03Z | https://github.com/opensearch-project/data-prepper/issues/3059 | 1,820,589,121 | 3,059 |
[
"opensearch-project",
"data-prepper"
] | Requesting RPM&DEB packages for data-prepper in order to ease the barrier of entry to its use. | Request RPM&DEB packages for data-prepper | https://api.github.com/repos/opensearch-project/data-prepper/issues/3057/comments | 2 | 2023-07-25T12:58:23Z | 2023-07-31T22:20:56Z | https://github.com/opensearch-project/data-prepper/issues/3057 | 1,820,288,597 | 3,057 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
When using a sink with the parameters template_file and ism_policy_file, an OpenSearch template is automatically created. However, it is not visible in the dashboard because it uses the old path _template instead of the new path _index_template.
This is probably more of an enhancement to stay compatible with newer versions.
**To Reproduce**
Steps to reproduce the behavior:
1. Configure a sink, e.g.
```
sink:
- opensearch:
hosts: [ "https://127.0.0.1:9200" ]
index: "test_logs"
number_of_shards: 1
number_of_replicas: 0
template_file: "/opt/data-prepper/pipelines/log.template"
ism_policy_file: "/opt/data-prepper/pipelines/log.policy"
```
2. The log.template looks like this:
```
{
"version": 1,
"settings": {
"index": {
"opendistro": {
"index_state_management": {
"rollover_alias": "test_logs"
}
},
"number_of_shards": "1",
"number_of_replicas": "0"
}
},
"mappings": {
"properties": {
"date": {
"type": "float"
},
"level": {
"type": "keyword"
},
"log": {
"type": "keyword"
},
"logger": {
"type": "text"
},
"message": {
"type": "text"
},
"time": {
"type": "date"
}
}
}
}
```
3. GET _template shows the new template, created by data-prepper. Deletion only works in the path _template/ and not in _index_template/.
**Screenshots**
**Environment (please complete the following information):**
- OS: Ubuntu 22.04 LTS
- Data-prepper: 2.3.2
- OpenSearch 2.8.0
**Additional context**
--
| [BUG] Data-Prepper is still using the deprecated _template path when creating ES templates | https://api.github.com/repos/opensearch-project/data-prepper/issues/3052/comments | 4 | 2023-07-22T13:24:44Z | 2023-07-26T07:16:53Z | https://github.com/opensearch-project/data-prepper/issues/3052 | 1,816,764,759 | 3,052 |
[
"opensearch-project",
"data-prepper"
] | ### Background
Data Prepper has an existing metric naming convention of:
```
{pipelineName}.{pluginType}.{metricName}
```
The `{metricName}` part is custom to each plugin. The `{pipelineName}.{pluginType}` is determined by the `PluginMetrics` class and is standard to all Data Prepper metrics.
### Problems
There are a few problems with this convention:
1. Some plugin types have the same name for different component types. There is an `s3` source and sink now. And there will be a `kafka` source and sink as well.
2. Having multiple plugins of the same type does not distinguish between the metrics.
### Proposal
Update our plugin metric naming. I'd like to suggest that we use consistent names for any plugin type. And then we would use tags/dimensions to disambiguate pipelines and pluginIds.
New metric name:
```
{componentType}.{metricName}
```
New tags per metric:
```
pipelineName={pipelineName}
pluginId={pluginId}
```
The `componentType` is the type of pipeline component represented. This would be `source`, `sink`, `processor`, or `buffer`.
The `pluginId` is the plugin Id which would be added by #1025.
### Expanded Metric Proposal
Also, we currently disallow pipelines named `core` or `data-prepper` in order to reserve these names.
Thus, all Data Prepper metrics will have the following form.
```
{scopeIdentifier}.{metrics}
```
The `scopeIdentifier` can be one of the following:
* `core`
* `data-prepper`
* A component type - e.g. `source`, `processor`.
If the `scopeIdentifier` is a pipeline component, then the plugin metric convention above applies. For `core` and `data-prepper`, the plugin metric convention does not apply and it will depend on the specific metrics.
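A minimal, self-contained sketch of how the proposed convention could compose names and tags (the component, pipeline, and plugin identifiers below are illustrative, not actual Data Prepper values):

```java
import java.util.Map;

public class MetricNamingV2 {
    // Proposed v2 metric name: {componentType}.{metricName}
    static String metricName(final String componentType, final String metricName) {
        return componentType + "." + metricName;
    }

    // Proposed disambiguating tags: pipelineName and pluginId
    static Map<String, String> metricTags(final String pipelineName, final String pluginId) {
        return Map.of("pipelineName", pipelineName, "pluginId", pluginId);
    }

    public static void main(String[] args) {
        System.out.println(metricName("source", "recordsIn")); // prints "source.recordsIn"
        System.out.println(metricTags("log-pipeline", "s3-source-1"));
    }
}
```

In practice these would map onto the metric registry's name/tag API rather than plain strings, but the naming rule itself is this simple concatenation.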
### Migration
This new plugin metric definition is a breaking change. Thus, we can offer a flag to enable these metrics and remove the flag in a major version bump.
In `data-prepper-core.yaml`, provide a new property named `metric_naming`. It will have two options:
* `v1` - The current naming; the default
* `v2` - The new naming convention
```
metric_naming: v2
```
### Dependencies
* #1025
### Tasks
- [ ] Refactor `PluginMetrics` to an interface
- [ ] Provide a different `PluginMetrics` depending on the naming convention
- [ ] Create a setting to configure the different options
| Consistent metric naming convention | https://api.github.com/repos/opensearch-project/data-prepper/issues/3051/comments | 2 | 2023-07-21T18:06:16Z | 2024-11-18T17:39:12Z | https://github.com/opensearch-project/data-prepper/issues/3051 | 1,816,191,180 | 3,051 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently the opensearch sink defaults to `management_disabled` for index management. Since support for composable index templates was merged, we can expand index management support for serverless ingestion.
| [Enhancement] Support serverless compatible index template in opensearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/3048/comments | 1 | 2023-07-20T17:39:23Z | 2023-10-11T16:28:59Z | https://github.com/opensearch-project/data-prepper/issues/3048 | 1,814,495,298 | 3,048 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
We have deployed TLS-enabled Data Prepper version 2.3.2 on a Kubernetes cluster and are using the opensearch sink with it. We use Data Prepper for distributed tracing via OpenTelemetry. We noticed that once the pod starts up, the Data Prepper container always gets stuck at "Submitting request to initiate the pipeline processing" and never comes out of it.
**To Reproduce**
Steps to reproduce the behavior:
1. Deploy data prepper 2.3.2 with attached pipeline configuration
**Expected behavior**
Data prepper starts up
**Attachments**
Data prepper console logs, data prepper configuration
**Environment (please complete the following information):**
- Kubernetes cluster (EKS)
[data-prepper-logs.txt](https://github.com/opensearch-project/data-prepper/files/12093613/data-prepper-logs.txt)
[data-prepper-tracing-pipeline.txt](https://github.com/opensearch-project/data-prepper/files/12093621/data-prepper-tracing-pipeline.txt)
| [BUG] Data prepper 2.3.2 is stuck in Submitting request to initiate the pipeline processing | https://api.github.com/repos/opensearch-project/data-prepper/issues/3043/comments | 2 | 2023-07-19T09:55:14Z | 2023-07-20T05:40:04Z | https://github.com/opensearch-project/data-prepper/issues/3043 | 1,811,598,786 | 3,043 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
DataPrepper supports `create` and `index` actions in the OpenSearch sink. Depending on which action is specified, the BulkRequest populates the respective parameter. This is accounted for in some places of the code, such as the [JavaClientUncompressedAccumulatingBulkRequest](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/bulk/JavaClientAccumulatingUncompressedBulkRequest.java#L75-L78), but is not accounted for in the DLQ serialization logic:
https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/opensearch/src/main/java/org/opensearch/dataprepper/plugins/sink/opensearch/dlq/FailedBulkOperationConverter.java#L65
This causes an exception that shuts down DataPrepper:
```
ERROR org.opensearch.dataprepper.pipeline.common.FutureHelper - FutureTask failed due to:
java.util.concurrent.ExecutionException: java.lang.IllegalStateException: Cannot get 'Index' variant: current variant is 'Create'.
at java.util.concurrent.FutureTask.report(FutureTask.java:122) ~[?:?]
at java.util.concurrent.FutureTask.get(FutureTask.java:191) ~[?:?]
at org.opensearch.dataprepper.pipeline.common.FutureHelper.awaitFuturesIndefinitely(FutureHelper.java:29) ~[data-prepper-core-2.3.1.jar:?]
at org.opensearch.dataprepper.pipeline.ProcessWorker.postToSink(ProcessWorker.java:140) ~[data-prepper-core-2.3.1.jar:?]
at org.opensearch.dataprepper.pipeline.ProcessWorker.doRun(ProcessWorker.java:121) ~[data-prepper-core-2.3.1.jar:?]
at org.opensearch.dataprepper.pipeline.ProcessWorker.run(ProcessWorker.java:50) ~[data-prepper-core-2.3.1.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) ~[?:?]
at java.lang.Thread.run(Thread.java:829) [?:?]
Caused by: java.lang.IllegalStateException: Cannot get 'Index' variant: current variant is 'Create'.
at org.opensearch.client.util.TaggedUnionUtils.get(TaggedUnionUtils.java:47) ~[opensearch-java-2.5.0.jar:?]
at org.opensearch.client.opensearch.core.bulk.BulkOperation.index(BulkOperation.java:139) ~[opensearch-java-2.5.0.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.dlq.FailedBulkOperationConverter.convertDocumentToGenericMap(FailedBulkOperationConverter.java:65) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.dlq.FailedBulkOperationConverter.convertToDlqObject(FailedBulkOperationConverter.java:40) ~[opensearch-2.3.1.jar:?]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) ~[?:?]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) ~[?:?]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) ~[?:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.logFailureForBulkRequests(OpenSearchSink.java:328) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy.handleFailures(BulkRetryStrategy.java:334) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy.handleFailures(BulkRetryStrategy.java:252) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy.handleRetriesAndFailures(BulkRetryStrategy.java:245) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy.handleRetry(BulkRetryStrategy.java:269) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy.execute(BulkRetryStrategy.java:191) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.lambda$flushBatch$6(OpenSearchSink.java:314) ~[opensearch-2.3.1.jar:?]
at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141) ~[micrometer-core-1.10.5.jar:1.10.5]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.flushBatch(OpenSearchSink.java:311) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.doOutput(OpenSearchSink.java:283) ~[opensearch-2.3.1.jar:?]
at org.opensearch.dataprepper.model.sink.AbstractSink.lambda$output$0(AbstractSink.java:64) ~[data-prepper-api-2.3.1.jar:?]
at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141) ~[micrometer-core-1.10.5.jar:1.10.5]
at org.opensearch.dataprepper.model.sink.AbstractSink.output(AbstractSink.java:64) ~[data-prepper-api-2.3.1.jar:?]
at org.opensearch.dataprepper.pipeline.Pipeline.lambda$publishToSinks$5(Pipeline.java:336) ~[data-prepper-core-2.3.1.jar:?]
```
There may be other areas where the assumption that action is `index` is made. The code should be audited to ensure all document accesses account for both supported actions.
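To illustrate the shape of the fix, here is a minimal, self-contained sketch using a simplified stand-in for the tagged-union bulk operation (this is a model of the variant behavior, not the real opensearch-java API):

```java
import java.util.Map;

// Simplified stand-in for a tagged-union bulk operation; not the real opensearch-java API.
class BulkOp {
    private final String kind; // "index" or "create"
    private final Map<String, Object> document;

    BulkOp(final String kind, final Map<String, Object> document) {
        this.kind = kind;
        this.document = document;
    }

    boolean isIndex() { return "index".equals(kind); }
    boolean isCreate() { return "create".equals(kind); }

    // Mimics the real client: accessing the wrong variant throws IllegalStateException.
    Map<String, Object> index() {
        if (!isIndex()) throw new IllegalStateException("current variant is '" + kind + "'");
        return document;
    }

    Map<String, Object> create() {
        if (!isCreate()) throw new IllegalStateException("current variant is '" + kind + "'");
        return document;
    }
}

public class DlqConverterSketch {
    // The fix: check which variant is present instead of assuming 'index'.
    static Map<String, Object> getDocument(final BulkOp op) {
        if (op.isIndex()) {
            return op.index();
        } else if (op.isCreate()) {
            return op.create();
        }
        throw new UnsupportedOperationException("Unsupported bulk action");
    }

    public static void main(String[] args) {
        final BulkOp create = new BulkOp("create", Map.of("message", "hello"));
        System.out.println(getDocument(create).get("message")); // prints "hello"
    }
}
```

The unconditional `operation.index()` call in the converter is exactly the pattern that throws here when the variant is `create`.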
**Expected behavior**
DLQ serialization works regardless of action
| [BUG] DLQ Serialization Doesn't Support Create Action | https://api.github.com/repos/opensearch-project/data-prepper/issues/3040/comments | 0 | 2023-07-18T18:35:17Z | 2023-07-19T21:13:01Z | https://github.com/opensearch-project/data-prepper/issues/3040 | 1,810,484,422 | 3,040 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Race condition in DataPrepper sources using e2e acknowledgements.
DataPrepper sources using e2e acknowledgements are doing something like this
```
bufferAccumulator.add(eventRecord);
if(Objects.nonNull(acknowledgementSet)){
acknowledgementSet.add(eventRecord.getData());
}
```
This may result in the eventRecord getting processed (and possibly even dropped, if a drop processor is used) before the eventRecord is added to the acknowledgement set.
**Expected behavior**
Expected behavior is that the event record is added to the acknowledgement set before it is processed further by the pipeline.
Fix is to swap the lines as shown below
```
if(Objects.nonNull(acknowledgementSet)){
acknowledgementSet.add(eventRecord.getData());
}
bufferAccumulator.add(eventRecord);
```
| [BUG] Race condition in DataPrepper sources using e2e acknowledgements | https://api.github.com/repos/opensearch-project/data-prepper/issues/3038/comments | 0 | 2023-07-18T18:05:51Z | 2023-07-18T22:53:42Z | https://github.com/opensearch-project/data-prepper/issues/3038 | 1,810,445,690 | 3,038 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
I'd like to push metrics to Prometheus.
**Describe the solution you'd like**
Provide a `prometheus` sink which can support push.
```
sink:
- prometheus:
model: push
endpoint: http://localhost:2090
```
**Additional context**
This is complementary to #1744.
| Support Prometheus as a Sink storage (remote-write model) | https://api.github.com/repos/opensearch-project/data-prepper/issues/3028/comments | 0 | 2023-07-13T17:26:23Z | 2025-04-17T14:08:22Z | https://github.com/opensearch-project/data-prepper/issues/3028 | 1,803,465,348 | 3,028 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The error message provided by STS when assuming a role may not always be the most useful message to present.
**Describe the solution you'd like**
Allow configuration of the AWS plugin to change the error message provided when assuming a role.
This message can use some formatting to make it dynamic.
e.g.
```
Unable to assume role ${stsRoleArn} from Amazon STS.
```
This may yield:
```
Unable to assume role arn:aws:iam::123456789012:role/MyRole from Amazon STS.
```
Configuration in data-prepper-config.yaml:
```
aws:
sts_errors:
default_error: Unable to assume role ${stsRoleArn} from Amazon STS.
```
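A minimal sketch of the placeholder substitution this feature implies (the variable name `stsRoleArn` is taken from the example above; the method is illustrative, not an existing Data Prepper API):

```java
import java.util.Map;

public class StsErrorMessageFormatter {
    // Replace ${name} placeholders in the configured message with known values.
    static String format(final String template, final Map<String, String> variables) {
        String result = template;
        for (final Map.Entry<String, String> entry : variables.entrySet()) {
            result = result.replace("${" + entry.getKey() + "}", entry.getValue());
        }
        return result;
    }

    public static void main(String[] args) {
        System.out.println(format(
                "Unable to assume role ${stsRoleArn} from Amazon STS.",
                Map.of("stsRoleArn", "arn:aws:iam::123456789012:role/MyRole")));
        // prints: Unable to assume role arn:aws:iam::123456789012:role/MyRole from Amazon STS.
    }
}
```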
**Additional context**
This relates to the ongoing work in the AWS plugin in #2570. And it requires that extension plugins can add new configurations to data-prepper-config.yaml which is part of the extensions work in #2588.
| Support configurable error message from STS | https://api.github.com/repos/opensearch-project/data-prepper/issues/3018/comments | 0 | 2023-07-12T19:32:13Z | 2023-07-12T21:07:33Z | https://github.com/opensearch-project/data-prepper/issues/3018 | 1,801,628,706 | 3,018 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Please address the review comments in PR https://github.com/opensearch-project/data-prepper/pull/2925
Issue: the downloadReady flag and its global usage.
**To Reproduce**
Review comments to be addressed
| [BUG] GeoIP - Download ready flag | https://api.github.com/repos/opensearch-project/data-prepper/issues/3015/comments | 2 | 2023-07-12T17:10:08Z | 2023-07-13T15:35:00Z | https://github.com/opensearch-project/data-prepper/issues/3015 | 1,801,405,566 | 3,015 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
The `s3` source no longer validates the bucket ownership, which can lead to bucket sniping.
**To Reproduce**
Steps to reproduce the behavior:
1. Run Data Prepper with cross-account SQS/S3
**Expected behavior**
The bucket is not owned by the same account, Data Prepper should not read from this S3 bucket.
**Additional context**
This was a regression introduced in 2.3 via:
https://github.com/opensearch-project/data-prepper/pull/2727/files#diff-93680a6369c6fc1d125fc6244e536b254234123d702ea7750812a7df24ec96eaR63-R65
| [BUG] Data Prepper S3 source does not validate bucket ownership | https://api.github.com/repos/opensearch-project/data-prepper/issues/3005/comments | 0 | 2023-07-11T19:14:04Z | 2023-07-12T17:36:11Z | https://github.com/opensearch-project/data-prepper/issues/3005 | 1,799,604,355 | 3,005 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The current opensearch sink lacks support for Elasticsearch 6 due to the type removal breaking change in opensearch:
```
Request failed: [action_request_validation_exception] Validation Failed: 1: type is missing;2: type is missing;3: type is missing;4: type is missing;
```
**Describe the solution you'd like**
We can fix the missing type parameter by introducing default {index} and {type} path parameters in the bulk URI: https://www.elastic.co/guide/en/elasticsearch/reference/6.8/docs-bulk.html
| Support ES 6.8 in opensearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/3003/comments | 0 | 2023-07-11T16:39:50Z | 2023-07-28T23:32:55Z | https://github.com/opensearch-project/data-prepper/issues/3003 | 1,799,331,892 | 3,003 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Race condition in S3 source when E2E acknowledgements enabled.
The main thread and the ack callback thread may execute in a different order, causing `waitingForAcknowledgements` to not be populated correctly. This can cause SQS messages to not be deleted.
**Expected behavior**
Expected behavior is for the callback thread to run only after `waitingForAcknowledgements` are fully populated by the main thread.
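One way to guarantee this ordering is to gate the callback thread on a signal that population has finished. A minimal, self-contained sketch with stand-in types (not the actual S3 source code):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;

public class AckOrderingSketch {
    // Returns the number of entries still waiting after the callback ran.
    static int run() throws InterruptedException {
        final List<String> waitingForAcknowledgements = new ArrayList<>();
        final CountDownLatch populated = new CountDownLatch(1);

        final Thread callback = new Thread(() -> {
            try {
                populated.await(); // block until the main thread finishes populating
                synchronized (waitingForAcknowledgements) {
                    waitingForAcknowledgements.clear(); // e.g. delete the SQS messages
                }
            } catch (final InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        callback.start();

        synchronized (waitingForAcknowledgements) {
            waitingForAcknowledgements.add("sqs-message-1");
        }
        populated.countDown(); // population complete; the callback may now proceed

        callback.join();
        synchronized (waitingForAcknowledgements) {
            return waitingForAcknowledgements.size();
        }
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run()); // prints 0
    }
}
```

Without the latch, the callback could observe an empty `waitingForAcknowledgements` list and skip the deletion, which is the reported symptom.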
| [BUG] Race condition in S3 source when E2E acknowledgements enabled | https://api.github.com/repos/opensearch-project/data-prepper/issues/3000/comments | 0 | 2023-07-11T07:29:13Z | 2023-07-11T22:09:28Z | https://github.com/opensearch-project/data-prepper/issues/3000 | 1,798,300,903 | 3,000 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
_Network delays when sending alert messages._
During tests of sending alert messages, messages are sometimes not sent and an exception occurs. I suspect network delays or the firewall cause these errors randomly. On the same machine, there is no problem with the same mailbox when using Alertmanager.
`[2023-07-06T22:12:19,684][INFO ][o.o.n.s.SendMessageActionHelper] [kubernetes-master-vm-0] notifications:getAllConfigs-get [wncJLYkBTr_U9fu7o619, v3QALYkBTr_U9fu7Sq-d]
[2023-07-06T22:12:19,707][INFO ][o.o.n.c.c.DestinationSmtpClient] [kubernetes-master-vm-0] EmailException javax.mail.MessagingException: Exception reading response;
nested exception is:
java.net.SocketException: Connection reset
[2023-07-06T22:12:19,707][INFO ][o.o.n.s.SendMessageActionHelper] [kubernetes-master-vm-0] notifications:sendMessage:statusCode=424, statusText=sendEmail Error, status:Exception reading response
[2023-07-06T22:12:19,708][INFO ][o.o.n.s.SendMessageActionHelper] [kubernetes-master-vm-0] notifications:wncJLYkBTr_U9fu7o619:statusCode=424, statusText=sendEmail Error, status:Exception reading response
[2023-07-06T22:12:19,708][WARN ][o.o.n.a.PluginBaseAction ] [kubernetes-master-vm-0] notifications:OpenSearchStatusException:
org.opensearch.OpenSearchStatusException: {"event_status_list": [{"config_id":"iJOwKYgBPMWZN1s6JSE4","config_type":"email","config_name":"login_errors","email_recipient_status":[{"recipient":"user@mail.com","delivery_status":{"status_code":"424","status_text":"sendEmail Error, status:Exception reading response"}}],"delivery_status":{"status_code":"424","status_text":"sendEmail Error, status:Exception reading response"}}]}
at org.opensearch.notifications.send.SendMessageActionHelper.executeRequest(SendMessageActionHelper.kt:99) ~[?:?]
at org.opensearch.notifications.send.SendMessageActionHelper$executeRequest$1.invokeSuspend(SendMessageActionHelper.kt) ~[?:?]
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:33) [kotlin-stdlib-1.6.10.jar:1.6.10-release-923(1.6.10)]
at kotlinx.coroutines.internal.ScopeCoroutine.afterResume(Scopes.kt:32) [kotlinx-coroutines-core-jvm-1.4.3.jar:?]
at kotlinx.coroutines.AbstractCoroutine.resumeWith(AbstractCoroutine.kt:113) [kotlinx-coroutines-core-jvm-1.4.3.jar:?]
at kotlin.coroutines.jvm.internal.BaseContinuationImpl.resumeWith(ContinuationImpl.kt:46) [kotlin-stdlib-1.6.10.jar:1.6.10-release-923(1.6.10)]
at kotlinx.coroutines.DispatchedTask.run(DispatchedTask.kt:106) [kotlinx-coroutines-core-jvm-1.4.3.jar:?]
at kotlinx.coroutines.scheduling.CoroutineScheduler.runSafely(CoroutineScheduler.kt:571) [kotlinx-coroutines-core-jvm-1.4.3.jar:?]
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.executeTask(CoroutineScheduler.kt:750) [kotlinx-coroutines-core-jvm-1.4.3.jar:?]
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.runWorker(CoroutineScheduler.kt:678) [kotlinx-coroutines-core-jvm-1.4.3.jar:?]
at kotlinx.coroutines.scheduling.CoroutineScheduler$Worker.run(CoroutineScheduler.kt:665) [kotlinx-coroutines-core-jvm-1.4.3.jar:?]
[2023-07-06T22:12:19,709][ERROR][o.o.n.a.SendTestNotificationAction] [kubernetes-master-vm-0] notifications:SendTestNotificationAction-send Error:OpenSearchStatusException[{"event_status_list": [{"config_id":"iJOwKYgBPMWZN1s6JSE4","config_type":"email","config_name":"login_errors","email_recipient_status":[{"recipient":"user@mail.com","delivery_status":{"status_code":"424","status_text":"sendEmail Error, status:Exception reading response"}}],"delivery_status":{"status_code":"424","status_text":"sendEmail Error, status:Exception reading response"}}]}]`
**To Reproduce**
Steps to reproduce the behavior:
1. Turn on firewall
2. Create a channel with SMTP sender
3. Send test message multiple times
**Expected behavior**
Messages are always sent.
**Environment (please complete the following information):**
- OS: [Ubuntu 20.04.5 LTS]
- Version [v 2.8.0] | [BUG] Network delays when sending alert messages. | https://api.github.com/repos/opensearch-project/data-prepper/issues/2990/comments | 0 | 2023-07-07T10:31:34Z | 2023-07-11T08:31:05Z | https://github.com/opensearch-project/data-prepper/issues/2990 | 1,793,274,518 | 2,990 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Rename StartTime and Time fields of Aggregate processor to "AggregationStartTime" and "AggregationEndTime"
**Describe the solution you'd like**
Rename StartTime and Time fields of Aggregate processor to "AggregationStartTime" and "AggregationEndTime"
| Rename StartTime and Time fields of Aggregate processor | https://api.github.com/repos/opensearch-project/data-prepper/issues/2988/comments | 1 | 2023-07-07T00:48:10Z | 2023-08-16T22:28:14Z | https://github.com/opensearch-project/data-prepper/issues/2988 | 1,792,517,969 | 2,988 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Customers would like to allowlist some data to OpenSearch and send all discarded data to the S3 sink. Right now there is no easy way to do this, so it would be nice to add `include_keys` and `exclude_keys` options under the sink.
**Describe the solution you'd like**
This additional options must be supported in OpenSearch sink and S3 sink, and can be extended to others as well.
The minimal usage is as below:
```
sink:
- opensearch:
...
include_keys: [ "key-one", "key-two" ]
- opensearch:
...
exclude_keys: [ "key-three" ]
- s3:
...
exclude_keys: [ "key-one", "key-two" ]
```
The assumption is that customers should not use both in the same sink, so if `include_keys` is defined, `exclude_keys` will be ignored.
For the OpenSearch sink, this list of keys is resolved under `document_root_key` if `document_root_key` is defined.
It would be nice to define a named list of keys that can be used as an allowlist in one sink and reused, with negation, as a deny list in another sink. For example:
```
sink:
- opensearch:
include_keys:
waf_logs_desired: [ "key-one", "key-two" ] <<< This defines a list of keys with the name "waf_logs_desired" so the same name can be reused in a deny list.
- s3:
exclude_keys: waf_logs_desired
```
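A minimal sketch of the filtering semantics described above, where `include_keys` wins when both are set (key names are illustrative, and this models only top-level keys):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class KeyFilterSketch {
    // Apply include_keys/exclude_keys semantics: if include_keys is set, exclude_keys is ignored.
    static Map<String, Object> filter(final Map<String, Object> document,
                                      final List<String> includeKeys,
                                      final List<String> excludeKeys) {
        final Map<String, Object> result = new LinkedHashMap<>();
        if (includeKeys != null && !includeKeys.isEmpty()) {
            for (final String key : includeKeys) {
                if (document.containsKey(key)) {
                    result.put(key, document.get(key));
                }
            }
            return result; // exclude_keys is ignored when include_keys is defined
        }
        for (final Map.Entry<String, Object> entry : document.entrySet()) {
            if (excludeKeys == null || !excludeKeys.contains(entry.getKey())) {
                result.put(entry.getKey(), entry.getValue());
            }
        }
        return result;
    }

    public static void main(String[] args) {
        final Map<String, Object> doc = Map.of("key-one", 1, "key-two", 2, "key-three", 3);
        System.out.println(filter(doc, List.of("key-one"), null).keySet()); // prints [key-one]
        System.out.println(filter(doc, null, List.of("key-three")).size()); // prints 2
    }
}
```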
| Add include_keys and exclude_keys options under the sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/2975/comments | 22 | 2023-07-03T03:04:22Z | 2023-08-14T15:28:52Z | https://github.com/opensearch-project/data-prepper/issues/2975 | 1,785,195,111 | 2,975 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper logs usually contain an array of JSON objects, and I would like to add entries inside each object of the array.
**Describe the solution you'd like**
For Example:
Incoming Event JSON :
```
{
"collection": [
{
"entry_one": "value_one"
},
{
"entry_two": "value_two"
}
]
}
```
I would like to add an entry `"new_entry" : "new_value"` inside each object of `collection` array.
Ouput JSON would look like this:
```
{
"collection": [
{
"entry_one": "value_one",
"new_entry": "new_value"
},
{
"entry_two": "value_two",
"new_entry": "new_value"
}
]
}
```
The following yaml config can work for the above scenario:
```
mypipeline:
source:
http:
processor:
- add_entries:
      iterate_on: "collection"
      entries:
        - key: "new_entry"
          value: "new_value"
sink:
- stdout:
```
Currently the `translate` processor has an `iterate_on` option for iterating over a JSON array. We can use the same option name for the `add_entries` processor as well.
In the above config, the `iterate_on` option is configured with the JSON array field `collection`. The entries specified under `entries` are added to each object inside `collection`.
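The behavior being requested can be sketched like this (hypothetical helper, Python for illustration only; the actual add_entries processor is Java and supports more entry options):

```python
def add_entries_iterate_on(event, iterate_on, entries):
    """Add each configured entry to every object inside the target array."""
    for obj in event.get(iterate_on, []):
        for entry in entries:
            obj[entry["key"]] = entry["value"]
    return event

event = {"collection": [{"entry_one": "value_one"}, {"entry_two": "value_two"}]}
result = add_entries_iterate_on(
    event, "collection", [{"key": "new_entry", "value": "new_value"}]
)
```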
**Additional context**
Similar issue : https://github.com/opensearch-project/data-prepper/issues/2853
| Support for iterating over an array in add_entries processor | https://api.github.com/repos/opensearch-project/data-prepper/issues/2957/comments | 0 | 2023-06-30T17:24:49Z | 2023-07-05T20:36:02Z | https://github.com/opensearch-project/data-prepper/issues/2957 | 1,782,886,331 | 2,957 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
When the substitute string processor encounters an exception while processing, it throws that exception and shuts down the process worker thread. The rest of the pipeline remains active but unable to make progress on incoming data. Example exception that shut down a process worker thread:
```
2023-06-30T07:11:01.606 [s3-pipeline-processor-worker-1-thread-1] ERROR org.opensearch.dataprepper.pipeline.ProcessWorker - Encountered exception during pipeline s3-pipeline processing
java.lang.IllegalArgumentException: key _<redacted> must contain only alphanumeric chars with .-_ and must follow JsonPointer (ie. 'field/to/key')
```
**To Reproduce**
Steps to reproduce the behavior:
1. Create a pipeline with the substitute string processor
2. Ingest data that causes an exception to be thrown, example: invalid JSON pointer
**Expected behavior**
The entire pipeline should shut down rather than remaining partially alive but unable to process data
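For illustration, the expected failure handling could look like the sketch below. The `Pipeline` and worker names here are hypothetical stand-ins; the real ProcessWorker and Pipeline classes are more involved:

```python
class Pipeline:
    """Minimal stand-in for a pipeline that can be shut down."""
    def __init__(self):
        self.running = True

    def shutdown(self):
        self.running = False


def run_worker(pipeline, process_batch):
    """Worker loop: on a fatal processor exception, shut down the whole
    pipeline instead of letting only this thread die."""
    try:
        while pipeline.running:
            process_batch()
    except Exception:
        pipeline.shutdown()  # take the entire pipeline down, not just the worker
        raise


pipeline = Pipeline()
worker_died = False

def failing_batch():
    raise ValueError("key must follow JsonPointer (ie. 'field/to/key')")

try:
    run_worker(pipeline, failing_batch)
except ValueError:
    worker_died = True
```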
| [BUG] Exception in substitute string processor shuts down processor work but not pipeline | https://api.github.com/repos/opensearch-project/data-prepper/issues/2956/comments | 4 | 2023-06-30T15:13:17Z | 2023-12-12T19:45:46Z | https://github.com/opensearch-project/data-prepper/issues/2956 | 1,782,704,485 | 2,956 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
My Data Prepper pipeline seems to be sending much larger requests than the `bulk_size` configured in the pipeline. Here's the relevant snippet of my pipeline configuration:
```
....
processor:
- add_entries:
entries:
- key: "document_id"
value_expression: "getMetadata(\"document_id\")"
- key: "index"
value_expression: "getMetadata(\"index\")"
sink:
- opensearch:
hosts: ["<redacted>"]
username: "<redacted>"
password: "<redacted>"
index: "${index}"
document_id_field: "document_id"
bulk_size: 4
```
Running Data Prepper with this pipeline results in an exception that suggests that the size of the `bulk` request is much larger than what is configured:
```
...
WARN org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy - Bulk Operation Failed. Number of retries 5. Retrying...
org.opensearch.client.ResponseException: method [POST], host [<redacted>], URI [/_bulk], status line [HTTP/1.1 413 Request Entity Too Large]
{"Message":"Request size exceeded 10485760 bytes"}
at org.opensearch.client.RestClient.convertResponse(RestClient.java:375) ~[opensearch-rest-client-2.7.0.jar:?]
at org.opensearch.client.RestClient.performRequest(RestClient.java:345) ~[opensearch-rest-client-2.7.0.jar:?]
at org.opensearch.client.RestClient.performRequest(RestClient.java:320) ~[opensearch-rest-client-2.7.0.jar:?]
at org.opensearch.client.transport.rest_client.RestClientTransport.performRequest(RestClientTransport.java:143) ~[opensearch-java-2.5.0.jar:?]
at org.opensearch.client.opensearch.OpenSearchClient.bulk(OpenSearchClient.java:217) ~[opensearch-java-2.5.0.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.lambda$doInitializeInternal$1(OpenSearchSink.java:202) ~[opensearch-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy.handleRetry(BulkRetryStrategy.java:267) ~[opensearch-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.BulkRetryStrategy.execute(BulkRetryStrategy.java:191) ~[opensearch-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.lambda$flushBatch$6(OpenSearchSink.java:319) ~[opensearch-2.4.0-SNAPSHOT.jar:?]
at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141) ~[micrometer-core-1.10.5.jar:1.10.5]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.flushBatch(OpenSearchSink.java:316) ~[opensearch-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.doOutput(OpenSearchSink.java:288) ~[opensearch-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.model.sink.AbstractSink.lambda$output$0(AbstractSink.java:64) ~[data-prepper-api-2.4.0-SNAPSHOT.jar:?]
at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141) ~[micrometer-core-1.10.5.jar:1.10.5]
at org.opensearch.dataprepper.model.sink.AbstractSink.output(AbstractSink.java:64) ~[data-prepper-api-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.pipeline.Pipeline.lambda$publishToSinks$5(Pipeline.java:336) ~[data-prepper-core-2.4.0-SNAPSHOT.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
```
This seems to be the opposite of https://github.com/opensearch-project/data-prepper/issues/2852
**To Reproduce**
See description above
**Expected behavior**
The size of the bulk request should be equal to or under the configured `bulk_size` value.
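The kind of accounting expected can be sketched like this. This is a rough illustration only, not the actual OpenSearch sink code, and it ignores bulk-action metadata lines; the function name is hypothetical:

```python
import json

def should_flush(pending_docs, next_doc, bulk_size_mb):
    """Return True when adding next_doc would push the serialized bulk
    request past the configured bulk_size (in MiB)."""
    limit_bytes = bulk_size_mb * 1024 * 1024
    pending_bytes = sum(len(json.dumps(d).encode("utf-8")) for d in pending_docs)
    next_bytes = len(json.dumps(next_doc).encode("utf-8"))
    return pending_bytes + next_bytes > limit_bytes

small = should_flush([], {"field": "value"}, 4)                          # far under 4 MiB
large = should_flush([{"blob": "x" * (5 * 1024 * 1024)}], {"f": 1}, 4)   # already over 4 MiB
```

With a check like this in place, a 4 MiB `bulk_size` could never produce the 10 MB+ request seen in the error above.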
**Screenshots**
N/A
**Environment (please complete the following information):**
- OS: MacOS 12.6.6
- Version: Data Prepper 2.4.0-SNAPSHOT
**Additional context**
N/A
| [BUG] bulk_size is underestimated | https://api.github.com/repos/opensearch-project/data-prepper/issues/2954/comments | 2 | 2023-06-29T22:20:48Z | 2023-07-26T22:15:30Z | https://github.com/opensearch-project/data-prepper/issues/2954 | 1,781,586,746 | 2,954 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
While trying to set field_delimiter_regex, I see the following error `Caused by: java.lang.IllegalArgumentException: field_delimiter_regex and field_split_characters cannot both be defined.`, I am not setting field_split_characters, I assume it is using the default one.
**To Reproduce**
1. Use the configuration mentioned in the [documentation](https://opensearch.org/docs/latest/data-prepper/pipelines/configuration/processors/key-value/)
`field_delimiter_regex: "&\\{2\\}"`
2. Start the docker container
```
docker run --rm --name data-prepper -p 2021:2021 -v $(pwd)/pipeline.yaml:/usr/share/data-prepper/pipelines/pipelines.yaml -v $(pwd)/data-prepper-config.yaml:/usr/share/data-prepper/config/data-prepper-config.yaml -v data-prepper-out:/usr/share/log opensearchproject/data-prepper:latest
```
3. See error
```
Caused by: java.lang.IllegalArgumentException: field_delimiter_regex and field_split_characters cannot both be defined.
at org.opensearch.dataprepper.plugins.processor.keyvalue.KeyValueProcessor.<init>(KeyValueProcessor.java:45) ~[key-value-processor-2.2.1.jar:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:480) ~[?:?]
at org.opensearch.dataprepper.plugin.PluginCreator.newPluginInstance(PluginCreator.java:40) ~[data-prepper-core-2.2.1.jar:?]
... 35 more
```
**Pipeline config**
```
log-pipeline:
source:
http:
processor:
- key_value:
field_delimiter_regex: "&\\{2\\}"
source: "log"
sink:
- file:
path: /usr/share/log/output-file
```
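The root cause is presumably that the mutual-exclusion check compares against the already-populated default of `field_split_characters` instead of checking whether the user set it explicitly. The distinction can be sketched like this (hypothetical Python sketch, not the actual KeyValueProcessor code; the default value shown is made up):

```python
_UNSET = object()  # sentinel distinguishing "not configured" from a default value

def resolve(field_delimiter_regex=_UNSET, field_split_characters=_UNSET):
    """Raise only when the user explicitly configured both options; apply the
    default split characters only when no regex was given."""
    if field_delimiter_regex is not _UNSET and field_split_characters is not _UNSET:
        raise ValueError(
            "field_delimiter_regex and field_split_characters cannot both be defined.")
    if field_delimiter_regex is not _UNSET:
        return {"field_delimiter_regex": field_delimiter_regex}
    if field_split_characters is _UNSET:
        field_split_characters = "&"  # hypothetical default
    return {"field_split_characters": field_split_characters}

regex_only = resolve(field_delimiter_regex="&\\{2\\}")
try:
    resolve(field_delimiter_regex="a", field_split_characters="b")
    both_raises = False
except ValueError:
    both_raises = True
```

With a sentinel like this, setting only `field_delimiter_regex` (as in the pipeline above) would no longer trip the exclusivity check.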
| [BUG] : unable to set field_delimiter_regex | https://api.github.com/repos/opensearch-project/data-prepper/issues/2946/comments | 4 | 2023-06-27T18:47:18Z | 2024-04-03T17:15:54Z | https://github.com/opensearch-project/data-prepper/issues/2946 | 1,777,554,077 | 2,946 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Hello! We're working on changing the [opensearch-migrations](https://github.com/opensearch-project/opensearch-migrations) tooling so our data migration implementation uses Data Prepper instead of Logstash. We will be leveraging the newly minted OpenSearch/ElasticSearch source plugin for this.
Since this is a pull-based plugin, there is a finite set of data that needs to be ingested. Once all of the data has been processed, our expectation is that the Data Prepper pipeline would shut itself down based on a signal from the source plugin. This is similar to how pull-based plugins function in Logstash.
However, Data Prepper does not currently operate this way. The pipeline/process continues to stay alive (though the source plugin is not pulling any more data) until the caller terminates it or shuts it down via APIs.
**Describe the solution you'd like**
Once a pull-based source plugin has completed ingesting all data, it should signal to the Data Prepper pipeline, and the pipeline should shut itself down.
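The desired lifecycle could be modeled like this (hypothetical names; Data Prepper's actual Source interface would need some way to report completion):

```python
class FiniteSource:
    """Stand-in for a pull-based source over a finite dataset that can
    report when all data has been ingested (hypothetical interface)."""
    def __init__(self, records):
        self._records = iter(records)
        self.exhausted = False

    def poll(self):
        try:
            return next(self._records)
        except StopIteration:
            self.exhausted = True  # signal completion to the pipeline
            return None


def run_pipeline(source, sink):
    """Drain the source; once it signals completion, stop instead of idling."""
    while not source.exhausted:
        record = source.poll()
        if record is not None:
            sink.append(record)
    return "SHUTDOWN"


sink = []
status = run_pipeline(FiniteSource(["doc-1", "doc-2", "doc-3"]), sink)
```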
**Describe alternatives you've considered (Optional)**
An alternative approach would be to have the pipeline / source plugin signal externally (to the caller) that all data has been processed. The caller can then invoke the Data Prepper shutdown API to stop the process.
**Additional context**
N/A
| [Feature Request] Allow the OpenSearch source plugin to shut down the Data Prepper pipeline | https://api.github.com/repos/opensearch-project/data-prepper/issues/2944/comments | 6 | 2023-06-27T17:47:34Z | 2023-07-13T21:52:33Z | https://github.com/opensearch-project/data-prepper/issues/2944 | 1,777,457,389 | 2,944 |
[
"opensearch-project",
"data-prepper"
] | ## WS-2023-0116 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jose4j-0.7.9.jar</b></p></summary>
<p>The jose.4.j library is a robust and easy to use open source implementation of JSON Web Token (JWT) and the JOSE specification suite (JWS, JWE, and JWK).
It is written in Java and relies solely on the JCA APIs for cryptography.
Please see https://bitbucket.org/b_c/jose4j/wiki/Home for more info, examples, etc..</p>
<p>Library home page: <a href="https://bitbucket.org/b_c/jose4j/">https://bitbucket.org/b_c/jose4j/</a></p>
<p>Path to dependency file: /data-prepper-main/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bitbucket.b_c/jose4j/0.7.9/b44a2235728ab1cad9ffd06013500f09a5f1d241/jose4j-0.7.9.jar</p>
<p>
Dependency Hierarchy:
- kafka-plugins-2.5.0-SNAPSHOT (Root Library)
- kafka-schema-registry-7.3.3.jar
- kafka_2.13-7.3.3-ccs.jar
- :x: **jose4j-0.7.9.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/8bb96ddcf23859e0e7b55c3a9add5d77eddbccb0">8bb96ddcf23859e0e7b55c3a9add5d77eddbccb0</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
RSA1_5 in jose4j is susceptible to chosen ciphertext attacks. The
attack allows to decrypt RSA1_5 or RSA_OAEP encrypted ciphertexts. It may be feasible to sign with affected keys.
<p>Publish Date: 2023-04-27
<p>URL: <a href=https://bitbucket.org/b_c/jose4j/commits/14e62a8dee9decb4ff6e0625aedc5724601bfdb6>WS-2023-0116</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-jgvc-jfgh-rjvv">https://github.com/advisories/GHSA-jgvc-jfgh-rjvv</a></p>
<p>Release Date: 2023-04-27</p>
<p>Fix Resolution: org.bitbucket.b_c:jose4j:0.9.3</p>
</p>
</details>
<p></p>
| WS-2023-0116 (Medium) detected in jose4j-0.7.9.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/2943/comments | 3 | 2023-06-27T17:30:18Z | 2023-09-26T18:40:10Z | https://github.com/opensearch-project/data-prepper/issues/2943 | 1,777,432,633 | 2,943 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Traces coming from the otel-trace-pipeline and forwarded to the service-map-pipeline don't propagate some items. It would be a great improvement if custom items could be added in the service-map-pipeline.
**Describe the solution you'd like**
The trace I receive at the moment from the service-map-pipeline is the following:
```
{
  "_index" : "otel-v1-apm-service-map",
  "_type" : "_doc",
  "_id" : "4tvYCIifyVbGYfH/9iaP/g==",
  "_score" : 1.0,
  "_source" : {
    "serviceName" : "order",
    "kind" : "SPAN_KIND_INTERNAL",
    "destination" : {
      "resource" : "prepareOrderManifest",
      "domain" : "inventory"
    },
    "target" : null,
    "traceGroupName" : "createOrder",
    "hashId" : "4tvYCIifyVbGYfH/9iaP/g=="
  }
}
```
This trace should be manipulated such that a "tenant" item is added. "tenant" is already available as a resource attribute within the trace in the otel-trace-pipeline:
```
{
  "_index" : "otel-v1-apm-service-map",
  "_type" : "_doc",
  "_id" : "4tvYCIifyVbGYfH/9iaP/g==",
  "_score" : 1.0,
  "_source" : {
    "serviceName" : "order",
    "kind" : "SPAN_KIND_INTERNAL",
    "destination" : {
      "resource" : "prepareOrderManifest",
      "domain" : "inventory"
    },
    "target" : null,
    "traceGroupName" : "createOrder",
    "hashId" : "4tvYCIifyVbGYfH/9iaP/g==",
    "tenant" : "test1"
  }
}
```
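The requested enrichment amounts to copying selected resource attributes from the originating span onto the service-map document. A hypothetical sketch (Python for illustration only; the real service-map processor is Java):

```python
def enrich_service_map(doc, span_resource_attributes, keys):
    """Copy selected resource attributes from the originating span onto the
    service-map document."""
    for key in keys:
        if key in span_resource_attributes:
            doc[key] = span_resource_attributes[key]
    return doc

doc = {"serviceName": "order", "hashId": "4tvYCIifyVbGYfH/9iaP/g=="}
enriched = enrich_service_map(doc, {"tenant": "test1", "host": "h1"}, ["tenant"])
```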
| Adding custom items to service-map traces | https://api.github.com/repos/opensearch-project/data-prepper/issues/2941/comments | 1 | 2023-06-27T08:40:36Z | 2024-12-11T13:24:19Z | https://github.com/opensearch-project/data-prepper/issues/2941 | 1,776,393,264 | 2,941 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Firstly, thanks for merging in [support for the scroll API for the ElasticSearch accessor](https://github.com/opensearch-project/data-prepper/pull/2930)! When I attempted to use this plugin with a test ElasticSearch OSS source cluster, I ran into the following error:
```
....
2023-06-26T22:30:19,634 [historical-data-migration-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [historical-data-migration] Sink is ready, starting source...
2023-06-26T22:30:19,641 [historical-data-migration-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.plugins.source.opensearch.worker.client.OpenSearchClientFactory - Using username and password for auth for the OpenSearch source
2023-06-26T22:30:19,642 [historical-data-migration-sink-worker-2-thread-1] ERROR org.opensearch.dataprepper.pipeline.common.PipelineThreadPoolExecutor - Pipeline [historical-data-migration] process worker encountered a fatal exception, cannot proceed further
java.util.concurrent.ExecutionException: java.lang.IllegalArgumentException: Username may not be null
at java.util.concurrent.FutureTask.report(FutureTask.java:122) ~[?:?]
at java.util.concurrent.FutureTask.get(FutureTask.java:191) ~[?:?]
at org.opensearch.dataprepper.pipeline.common.PipelineThreadPoolExecutor.afterExecute(PipelineThreadPoolExecutor.java:70) [data-prepper-core-2.4.0-SNAPSHOT.jar:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1137) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
Caused by: java.lang.IllegalArgumentException: Username may not be null
at org.apache.http.util.Args.notNull(Args.java:54) ~[httpcore-4.4.16.jar:4.4.16]
at org.apache.http.auth.UsernamePasswordCredentials.<init>(UsernamePasswordCredentials.java:81) ~[httpclient-4.5.14.jar:4.5.14]
at org.opensearch.dataprepper.plugins.source.opensearch.worker.client.OpenSearchClientFactory.attachUsernamePassword(OpenSearchClientFactory.java:175) ~[opensearch-source-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.source.opensearch.worker.client.OpenSearchClientFactory.createOpenSearchRestClient(OpenSearchClientFactory.java:137) ~[opensearch-source-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.source.opensearch.worker.client.OpenSearchClientFactory.provideOpenSearchClient(OpenSearchClientFactory.java:75) ~[opensearch-source-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.source.opensearch.worker.client.SearchAccessorStrategy.getSearchAccessor(SearchAccessorStrategy.java:61) ~[opensearch-source-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.source.opensearch.OpenSearchSource.startProcess(OpenSearchSource.java:49) ~[opensearch-source-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.plugins.source.opensearch.OpenSearchSource.start(OpenSearchSource.java:41) ~[opensearch-source-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.pipeline.Pipeline.startSourceAndProcessors(Pipeline.java:210) ~[data-prepper-core-2.4.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.pipeline.Pipeline.lambda$execute$2(Pipeline.java:251) ~[data-prepper-core-2.4.0-SNAPSHOT.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) ~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) ~[?:?]
... 2 more
2023-06-26T22:30:19,654 [historical-data-migration-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [historical-data-migration] - Received shutdown signal with processor shutdown timeout PT30S and sink shutdown timeout PT30S. Initiating the shutdown process
...
```
It looks like the plugin is expecting `username` and `password` configurations in the pipeline even when the source cluster does not have security enabled (which mine does not, since I'm using the OSS distro and security is an X-Pack plugin).
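A plausible shape of the fix, sketched in Python for illustration (the real `OpenSearchClientFactory` is Java, and the names below are hypothetical): only attach basic-auth credentials when both values are actually configured.

```python
def build_client_config(hosts, username=None, password=None):
    """Attach basic-auth credentials only when both are provided, so
    clusters without security enabled need no dummy values."""
    config = {"hosts": hosts}
    if username is not None and password is not None:
        config["basic_auth"] = (username, password)
    return config

no_auth = build_client_config(["http://localhost:9200"])
with_auth = build_client_config(["http://localhost:9200"], "admin", "secret")
```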
**To Reproduce**
Steps to reproduce the behavior:
1. Start/use an ElasticSearch OSS source cluster (I'm using [ES OSS 7.10.2](https://www.elastic.co/downloads/past-releases/elasticsearch-oss-7-10-2)) without any auth
2. Configure a Data Prepper pipeline with the following source:
```
test-pipeline:
source:
opensearch:
hosts: ["<es-oss-host>"]
```
3. Start Data Prepper using this pipeline and observe the error
**Expected behavior**
The above source configuration should work without errors
**Screenshots**
N/A, stack-trace provided above
**Environment (please complete the following information):**
- OS: MacOS 12.6.6
- Version: Data Prepper 2.4.0-SNAPSHOT
**Additional context**
I am able to work around this problem by providing dummy values for username and password in the config:
```
test-pipeline:
source:
opensearch:
hosts: ["<es-oss-host>"]
username: "dummy"
password: "dummy"
```
The ES source cluster ignores/does not parse these values so the API call goes through.
| [BUG] OpenSearch source plugin requires auth configuration even when the source cluster does not need any | https://api.github.com/repos/opensearch-project/data-prepper/issues/2939/comments | 0 | 2023-06-26T22:41:21Z | 2023-06-29T16:37:37Z | https://github.com/opensearch-project/data-prepper/issues/2939 | 1,775,794,357 | 2,939 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Pipeline users want to send messages to an Amazon Simple Notification Service (SNS) topic.
**Describe the solution you'd like**
Create a new sink in Data Prepper which outputs data to an SNS topic using a codec:
```
sink:
- sns:
topic_name: "mytopic"
id: << String>>
aws:
region: us-east-1
sts_role_arn: "arn:aws:sns:us-east-1:1234567:hello"
codec:
ndjson:
```
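For the `ndjson` codec, the sink would serialize a batch of events to one JSON document per line before publishing it as the SNS message body. A rough sketch of that serialization step (illustrative only; a real sink would also need to respect SNS message size limits):

```python
import json

def to_ndjson(events):
    """Serialize a batch of events with an ndjson codec: one compact JSON
    document per line."""
    return "\n".join(json.dumps(event, separators=(",", ":")) for event in events)

message = to_ndjson([{"level": "INFO", "msg": "a"}, {"level": "WARN", "msg": "b"}])
```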
**Additional context**
- https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/examples-simple-notification-service.html
| SNS as Sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/2938/comments | 2 | 2023-06-26T22:13:42Z | 2023-09-20T14:51:56Z | https://github.com/opensearch-project/data-prepper/issues/2938 | 1,775,756,946 | 2,938 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-35165 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>aws-cdk-lib-2.13.0.tgz</b></p></summary>
<p>Version 2 of the AWS Cloud Development Kit library</p>
<p>Library home page: <a href="https://registry.npmjs.org/aws-cdk-lib/-/aws-cdk-lib-2.13.0.tgz">https://registry.npmjs.org/aws-cdk-lib/-/aws-cdk-lib-2.13.0.tgz</a></p>
<p>Path to dependency file: /release/staging-resources-cdk/package.json</p>
<p>Path to vulnerable library: /release/staging-resources-cdk/node_modules/aws-cdk-lib/package.json</p>
<p>
Dependency Hierarchy:
- :x: **aws-cdk-lib-2.13.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
AWS Cloud Development Kit (AWS CDK) is an open-source software development framework to define cloud infrastructure in code and provision it through AWS CloudFormation. In the packages `aws-cdk-lib` 2.0.0 until 2.80.0 and `@aws-cdk/aws-eks` 1.57.0 until 1.202.0, `eks.Cluster` and `eks.FargateCluster` constructs create two roles, `CreationRole` and `default MastersRole`, that have an overly permissive trust policy.
The first, referred to as the `CreationRole`, is used by lambda handlers to create the cluster and deploy Kubernetes resources (e.g `KubernetesManifest`, `HelmChart`, ...) onto it. Users with CDK version higher or equal to 1.62.0 (including v2 users) may be affected.
The second, referred to as the `default MastersRole`, is provisioned only if the `mastersRole` property isn't provided and has permissions to execute `kubectl` commands on the cluster. Users with CDK version higher or equal to 1.57.0 (including v2 users) may be affected.
The issue has been fixed in `@aws-cdk/aws-eks` v1.202.0 and `aws-cdk-lib` v2.80.0. These versions no longer use the account root principal. Instead, they restrict the trust policy to the specific roles of lambda handlers that need it. There is no workaround available for CreationRole. To avoid creating the `default MastersRole`, use the `mastersRole` property to explicitly provide a role.
<p>Publish Date: 2023-06-23
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-35165>CVE-2023-35165</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-rx28-r23p-2qc3">https://github.com/advisories/GHSA-rx28-r23p-2qc3</a></p>
<p>Release Date: 2023-06-23</p>
<p>Fix Resolution: 2.80.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-35165 (Medium) detected in aws-cdk-lib-2.13.0.tgz | https://api.github.com/repos/opensearch-project/data-prepper/issues/2933/comments | 0 | 2023-06-26T02:39:09Z | 2023-06-29T14:59:05Z | https://github.com/opensearch-project/data-prepper/issues/2933 | 1,773,728,554 | 2,933 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Customers would like to use Data Prepper to sync the data in Amazon DynamoDB (as source) to a destination such as OpenSearch. The sync will include a full historical data dump and/or incremental change capture.
**Describe the solution you'd like**
Using DynamoDB as source will need to support:
1. Data Export: Using DynamoDB point in time export for historical data.
2. Change Data Capture: This includes but not limited to the use of DynamoDB Streams.
Also, it would be nice to have flexible configurations to support different run types:
1. Data Export only
2. Streams Only
3. Data Export (historical) + Streams (CDC)
For DynamoDB data export, the data will be stored in S3, so it would be nice if this source could also trigger the S3 scan job to run.
> Note that DynamoDB using Kinesis Data Streams will not be in scope; the Kinesis Data Source should be used for that instead.
| Support DynamoDB as source | https://api.github.com/repos/opensearch-project/data-prepper/issues/2932/comments | 2 | 2023-06-26T02:14:32Z | 2023-10-03T16:31:24Z | https://github.com/opensearch-project/data-prepper/issues/2932 | 1,773,675,922 | 2,932 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
As a user with events that have keys with epoch milli timestamps, e.g. `{ "timestamp": 1687549774219 }`, I would like to use the date processor to convert this timestamp to a date.
**Describe the solution you'd like**
A new concept of built-in patterns for the date processor. Instead of just `DateTimeFormatter` patterns, the date processor will have a list of built-in patterns such as `epoch_milli`, `epoch_second`, or `epoch_nano`. These patterns will work exactly the same way as the regular date-time patterns do. For example, the config will look like this:
```
processor:
- date:
match:
key: "timestamp"
patterns: [ "epoch_milli", "epoch_second", "yyyy.MM.dd.hh.ss" ]
```
The date processor would first check whether the timestamp is epoch milli and, if so, convert it to a date-time. If it isn't an epoch milli timestamp, it will then check for epoch second, and finally the `DateTimeFormatter` pattern.
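The try-in-order matching can be sketched like this (hypothetical Python sketch; the processor is Java and would use `DateTimeFormatter` rather than `strptime`, and real epoch detection may be stricter):

```python
from datetime import datetime, timezone

BUILT_IN = {
    "epoch_milli": lambda v: datetime.fromtimestamp(int(v) / 1000, tz=timezone.utc),
    "epoch_second": lambda v: datetime.fromtimestamp(int(v), tz=timezone.utc),
}

def parse_timestamp(value, patterns):
    """Try each configured pattern in order: built-in epoch patterns when
    listed, otherwise a format pattern; return the first successful parse."""
    for pattern in patterns:
        try:
            if pattern in BUILT_IN:
                return BUILT_IN[pattern](value)
            return datetime.strptime(str(value), pattern)
        except (ValueError, TypeError, OverflowError, OSError):
            continue  # this pattern did not match; try the next one
    return None

millis = parse_timestamp(1687549774219, ["epoch_milli", "%Y.%m.%d"])
formatted = parse_timestamp("2023.06.23", ["epoch_milli", "%Y.%m.%d"])
```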
| Date processor to convert from epoch_milli or epoch_second | https://api.github.com/repos/opensearch-project/data-prepper/issues/2929/comments | 1 | 2023-06-23T19:54:34Z | 2023-12-20T19:58:59Z | https://github.com/opensearch-project/data-prepper/issues/2929 | 1,772,041,123 | 2,929 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-34462 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>netty-all-4.1.86.Final.jar</b>, <b>netty-handler-4.1.86.Final.jar</b></p></summary>
<p>
<details><summary><b>netty-all-4.1.86.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: /release/maven/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-all/4.1.86.Final/a6395c3d2f8699e8dc4fd1e38171f82045f4af7b/netty-all-4.1.86.Final.jar</p>
<p>
Dependency Hierarchy:
- data-prepper-plugins-2.4.0-SNAPSHOT (Root Library)
- parquet-codecs-2.4.0-SNAPSHOT
- hadoop-mapreduce-client-core-3.3.5.jar
- :x: **netty-all-4.1.86.Final.jar** (Vulnerable Library)
</details>
<details><summary><b>netty-handler-4.1.86.Final.jar</b></p></summary>
<p></p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: /data-prepper-plugins/sqs-source/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-handler/4.1.86.Final/bcb65230218286e6456b5d085cb42e67776eb70/netty-handler-4.1.86.Final.jar</p>
<p>
Dependency Hierarchy:
- sts-2.17.264.jar (Root Library)
- netty-nio-client-2.17.264.jar
- :x: **netty-handler-4.1.86.Final.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers & clients. The `SniHandler` can allocate up to 16MB of heap for each channel during the TLS handshake. When the handler or the channel does not have an idle timeout, it can be used to make a TCP server using the `SniHandler` to allocate 16MB of heap. The `SniHandler` class is a handler that waits for the TLS handshake to configure a `SslHandler` according to the indicated server name by the `ClientHello` record. For this matter it allocates a `ByteBuf` using the value defined in the `ClientHello` record. Normally the value of the packet should be smaller than the handshake packet but there are not checks done here and the way the code is written, it is possible to craft a packet that makes the `SslClientHelloHandler`. This vulnerability has been fixed in version 4.1.94.Final.
<p>Publish Date: 2023-06-22
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-34462>CVE-2023-34462</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-6mjq-h674-j845">https://github.com/advisories/GHSA-6mjq-h674-j845</a></p>
<p>Release Date: 2023-06-07</p>
<p>Fix Resolution (io.netty:netty-handler): 5.0.0.Alpha1</p>
<p>Direct dependency fix Resolution (software.amazon.awssdk:sts): 2.17.265</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-34462 (Medium) detected in netty-all-4.1.86.Final.jar, netty-handler-4.1.86.Final.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2924/comments | 0 | 2023-06-22T17:07:12Z | 2023-06-29T14:59:06Z | https://github.com/opensearch-project/data-prepper/issues/2924 | 1,770,086,128 | 2,924 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
I would like to be able to see and review a metric's exemplars: data points that represent samples of the aggregated values used to build the metric
**Describe the solution you'd like**
In order to navigate from metrics (which are an aggregation of data points) to the traces/logs that those data-point measurements come from, we would like to store the data-point exemplars alongside the metrics themselves (along the @time dimension)
**Additional context**
- https://github.com/opensearch-project/dashboards-observability/issues/560
- https://github.com/opensearch-project/sql/issues/1750
- https://opentelemetry.io/docs/specs/otel/metrics/sdk/#exemplar
| Support Metrics Exemplar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2921/comments | 0 | 2023-06-21T18:43:55Z | 2023-07-05T20:44:50Z | https://github.com/opensearch-project/data-prepper/issues/2921 | 1,768,199,878 | 2,921 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
As a user, I would like to filter the keys in my Event with an explicit allowlist, and delete all other keys.
**Describe the solution you'd like**
An additional parameter in the `delete_entries` processor
```
delete_entries:
  delete_all_except: ["key-one", "key-two", "key-three"]
```
This would result in the Event only containing the three keys above; all other keys would be deleted. We could consider having each element be a regex pattern instead of a plain string, but this would impact performance, since each key would have to be matched against every pattern until one matched, and keys that match no pattern would be checked against every pattern in the list. Given that, I am proposing not to support regex yet.
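The intended semantics of the proposal can be sketched as follows (a minimal Python illustration, not actual Data Prepper code; the Event is modeled as a plain dict):

```python
def delete_all_except(event: dict, allowlist: list) -> dict:
    """Keep only the allowlisted top-level keys; delete every other key."""
    allowed = set(allowlist)  # O(1) membership check per key, no per-pattern scans
    return {key: value for key, value in event.items() if key in allowed}

event = {"key-one": 1, "key-two": 2, "key-three": 3, "status": 200}
print(delete_all_except(event, ["key-one", "key-two", "key-three"]))
# {'key-one': 1, 'key-two': 2, 'key-three': 3}
```

With plain strings each key costs a single set lookup, which is what keeps the non-regex variant cheap compared to matching every key against every pattern.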
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| Allowlist support for deleting entries | https://api.github.com/repos/opensearch-project/data-prepper/issues/2920/comments | 2 | 2023-06-21T16:42:17Z | 2024-06-25T07:03:21Z | https://github.com/opensearch-project/data-prepper/issues/2920 | 1,767,998,440 | 2,920 |
[
"opensearch-project",
"data-prepper"
] | ## WS-2023-0178 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>aws-cdk-lib-2.13.0.tgz</b></p></summary>
<p>Version 2 of the AWS Cloud Development Kit library</p>
<p>Library home page: <a href="https://registry.npmjs.org/aws-cdk-lib/-/aws-cdk-lib-2.13.0.tgz">https://registry.npmjs.org/aws-cdk-lib/-/aws-cdk-lib-2.13.0.tgz</a></p>
<p>Path to dependency file: /release/staging-resources-cdk/package.json</p>
<p>Path to vulnerable library: /release/staging-resources-cdk/node_modules/aws-cdk-lib/package.json</p>
<p>
Dependency Hierarchy:
- :x: **aws-cdk-lib-2.13.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
AWS CDK EKS creates overly permissive trust policies.
The AWS Cloud Development Kit (CDK) allows for the definition of Amazon Elastic Container Service for Kubernetes (EKS) clusters. eks.Cluster and eks.FargateCluster constructs create two roles that have an overly permissive trust policy.
The first, referred to as the CreationRole, is used by lambda handlers to create the cluster and deploy Kubernetes resources (e.g KubernetesManifest, HelmChart, ...) onto it. Users with CDK version higher or equal to 1.62.0 (including v2 users) will be affected.
The second, referred to as the default MastersRole, is provisioned only if the mastersRole property isn't provided and has permissions to execute kubectl commands on the cluster. Users with CDK version higher or equal to 1.57.0 (including v2 users) will be affected.
Both these roles use the account root principal in their trust policy, which allows any identity in the account with the appropriate sts:AssumeRole permissions to assume it. For example, this can happen if another role in your account has sts:AssumeRole permissions on Resource: "*".
The issue has been fixed in versions v1.202.0, v2.80.0.
<p>Publish Date: 2023-06-19
<p>URL: <a href=https://github.com/aws/aws-cdk/commit/51f0193bf34cca8254743561a1176e3ca5d83a74>WS-2023-0178</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-rx28-r23p-2qc3">https://github.com/advisories/GHSA-rx28-r23p-2qc3</a></p>
<p>Release Date: 2023-06-19</p>
<p>Fix Resolution: 2.80.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| WS-2023-0178 (Medium) detected in aws-cdk-lib-2.13.0.tgz - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/2919/comments | 1 | 2023-06-21T15:17:57Z | 2023-06-26T02:39:13Z | https://github.com/opensearch-project/data-prepper/issues/2919 | 1,767,848,717 | 2,919 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-25883 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semver-7.5.1.tgz</b></p></summary>
<p></p>
<p>Library home page: <a href="https://registry.npmjs.org/semver/-/semver-7.5.1.tgz">https://registry.npmjs.org/semver/-/semver-7.5.1.tgz</a></p>
<p>
Dependency Hierarchy:
- aws-cdk-lib-2.80.0.tgz (Root Library)
- :x: **semver-7.5.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Versions of the package semver before 7.5.2 are vulnerable to Regular Expression Denial of Service (ReDoS) via the function new Range, when untrusted user data is provided as a range.
<p>Publish Date: 2023-06-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25883>CVE-2022-25883</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-c2qf-rxjj-qqgw">https://github.com/advisories/GHSA-c2qf-rxjj-qqgw</a></p>
<p>Release Date: 2023-06-21</p>
<p>Fix Resolution: semver - 5.7.2,6.3.1,7.5.2;org.webjars.npm:semver:7.5.2</p>
</p>
</details>
<p></p>
| CVE-2022-25883 (High) detected in semver-7.5.1.tgz - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/2918/comments | 1 | 2023-06-21T15:17:52Z | 2023-08-31T18:45:23Z | https://github.com/opensearch-project/data-prepper/issues/2918 | 1,767,848,478 | 2,918 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-33201 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bcprov-jdk15on-1.70.jar</b>, <b>bcprov-jdk15on-1.69.jar</b></p></summary>
<p>
<details><summary><b>bcprov-jdk15on-1.70.jar</b></p></summary>
<p>The Bouncy Castle Crypto package is a Java implementation of cryptographic algorithms. This jar contains JCE provider and lightweight API for the Bouncy Castle Cryptography APIs for JDK 1.5 and up.</p>
<p>Library home page: <a href="https://www.bouncycastle.org/java.html">https://www.bouncycastle.org/java.html</a></p>
<p>Path to dependency file: /data-prepper-plugins/anomaly-detector-processor/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bouncycastle/bcprov-jdk15on/1.70/4636a0d01f74acaf28082fb62b317f1080118371/bcprov-jdk15on-1.70.jar</p>
<p>
Dependency Hierarchy:
- :x: **bcprov-jdk15on-1.70.jar** (Vulnerable Library)
</details>
<details><summary><b>bcprov-jdk15on-1.69.jar</b></p></summary>
<p>The Bouncy Castle Crypto package is a Java implementation of cryptographic algorithms. This jar contains JCE provider and lightweight API for the Bouncy Castle Cryptography APIs for JDK 1.5 and up.</p>
<p>Library home page: <a href="https://www.bouncycastle.org/java.html">https://www.bouncycastle.org/java.html</a></p>
<p>Path to dependency file: /data-prepper-plugins/otel-logs-source/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.bouncycastle/bcprov-jdk15on/1.69/91e1628251cf3ca90093ce9d0fe67e5b7dab3850/bcprov-jdk15on-1.69.jar</p>
<p>
Dependency Hierarchy:
- :x: **bcprov-jdk15on-1.69.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Bouncy Castle For Java before 1.74 is affected by an LDAP injection vulnerability. The vulnerability only affects applications that use an LDAP CertStore from Bouncy Castle to validate X.509 certificates. During the certificate validation process, Bouncy Castle inserts the certificate's Subject Name into an LDAP search filter without any escaping, which leads to an LDAP injection vulnerability.
<p>Publish Date: 2023-07-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-33201>CVE-2023-33201</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2023-07-05</p>
<p>Fix Resolution: org.bouncycastle:bcprov-ext-jdk18on:1.74, org.bouncycastle:bcprov-jdk18on:1.74, org.bouncycastle:bcprov-debug-jdk18on:1.74, org.bouncycastle:bcprov-ext-debug-jdk18on:1.74, org.bouncycastle:bcprov-ext-jdk15to18:1.74, org.bouncycastle:bcprov-jdk15to18:1.74, org.bouncycastle:bcprov-debug-jdk14:1.74, org.bouncycastle:bcprov-debug-jdk15to18:1.74, org.bouncycastle:bcprov-ext-debug-jdk14:1.74, org.bouncycastle:bcprov-ext-debug-jdk15to18:1.74, org.bouncycastle:bcprov-jdk14:1.74</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-33201 (Medium) detected in bcprov-jdk15on-1.70.jar, bcprov-jdk15on-1.69.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2917/comments | 0 | 2023-06-21T15:17:50Z | 2023-09-06T13:42:54Z | https://github.com/opensearch-project/data-prepper/issues/2917 | 1,767,848,390 | 2,917 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
We tried to launch it on AWS EKS to send logs from an ALB to AWS OpenSearch.
We wanted to use the service account role, but Data Prepper always tries to assume the instance role instead of the service account role.
**To Reproduce**
Steps to reproduce the behavior:
1. Create an IRSA role on AWS
2. Launch Data Prepper on AWS EKS and assign the IRSA role to the service account
3. Set up the config to send logs to AWS OpenSearch
4. See the error
**Expected behavior**
The IRSA role should be assumed instead of the instance role
**Screenshots**

**Environment (please complete the following information):**
- EKS 1.27
- data-prepper 2.3.1
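One quick check while debugging this is to confirm that the two environment variables the EKS IRSA webhook injects are actually present in the Data Prepper pod; if they are missing, the AWS SDK default credential chain falls through to the instance role. The helper below is just an illustration (the variable names are the standard IRSA ones):

```python
import os

IRSA_VARS = ("AWS_ROLE_ARN", "AWS_WEB_IDENTITY_TOKEN_FILE")

def irsa_configured(env: dict) -> bool:
    """True if the pod has the variables the SDK's web-identity provider needs."""
    return all(env.get(name) for name in IRSA_VARS)

print(irsa_configured(dict(os.environ)))
```

If this prints `False` inside the pod, the problem is the IRSA setup (service account annotation / webhook), not Data Prepper's sink configuration.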
| [BUG] Does not assume service account role | https://api.github.com/repos/opensearch-project/data-prepper/issues/2914/comments | 0 | 2023-06-21T07:27:22Z | 2023-07-05T20:48:28Z | https://github.com/opensearch-project/data-prepper/issues/2914 | 1,766,934,686 | 2,914 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-34455 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snappy-java-1.1.7.jar</b>, <b>snappy-java-1.1.8.4.jar</b>, <b>snappy-java-1.1.9.1.jar</b>, <b>snappy-java-1.1.8.3.jar</b></p></summary>
<p>
<details><summary><b>snappy-java-1.1.7.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/kafka-plugins/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.7/ef9d28a3515f0704ee9b930b7370051bd26720f/snappy-java-1.1.7.jar</p>
<p>
Dependency Hierarchy:
- curator-test-5.5.0.jar (Root Library)
- :x: **snappy-java-1.1.7.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.8.4.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/kafka-plugins/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.8.4/66f0d56454509f6e36175f2331572e250e04a6cc/snappy-java-1.1.8.4.jar</p>
<p>
Dependency Hierarchy:
- kafka-clients-7.3.3-ccs.jar (Root Library)
- :x: **snappy-java-1.1.8.4.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.9.1.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Path to dependency file: /release/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.9.1/96770312ae05ca8b59e565909973dbc2ea71bb91/snappy-java-1.1.9.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **snappy-java-1.1.9.1.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.8.3.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/parquet-codecs/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.8.3/2c58fa5745831afa3a4290c2ce15553ff13ad0ab/snappy-java-1.1.8.3.jar</p>
<p>
Dependency Hierarchy:
- hadoop-common-3.3.5.jar (Root Library)
- :x: **snappy-java-1.1.8.3.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
snappy-java is a fast compressor/decompressor for Java. Due to use of an unchecked chunk length, an unrecoverable fatal error can occur in versions prior to 1.1.10.1.
The code in the function hasNextChunk in the file SnappyInputStream.java checks if a given stream has more chunks to read. It does that by attempting to read 4 bytes. If it wasn’t possible to read the 4 bytes, the function returns false. Otherwise, if 4 bytes were available, the code treats them as the length of the next chunk.
In the case that the `compressed` variable is null, a byte array is allocated with the size given by the input data. Since the code doesn’t test the legality of the `chunkSize` variable, it is possible to pass a negative number (such as 0xFFFFFFFF which is -1), which will cause the code to raise a `java.lang.NegativeArraySizeException` exception. A worse case would happen when passing a huge positive value (such as 0x7FFFFFFF), which would raise the fatal `java.lang.OutOfMemoryError` error.
Version 1.1.10.1 contains a patch for this issue.
<p>Publish Date: 2023-06-15
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-34455>CVE-2023-34455</a></p>
</p>
</details>
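The unchecked chunk-length flow described above can be illustrated in isolation. The following standalone Java sketch is hypothetical (the class and method names are not snappy-java APIs); it only reproduces the failure mode of reading 4 attacker-controlled bytes as a signed length and allocating with it unchecked:

```java
// Hypothetical standalone sketch of the unchecked chunk-length pattern
// described in CVE-2023-34455; not actual snappy-java code.
public class ChunkLengthSketch {
    // Interpret 4 header bytes as a big-endian signed int, as a stream
    // reader would when treating them as the next chunk's length.
    public static int readChunkSize(byte[] header) {
        return ((header[0] & 0xFF) << 24) | ((header[1] & 0xFF) << 16)
             | ((header[2] & 0xFF) << 8)  |  (header[3] & 0xFF);
    }

    public static void main(String[] args) {
        byte[] crafted = {(byte) 0xFF, (byte) 0xFF, (byte) 0xFF, (byte) 0xFF};
        int chunkSize = readChunkSize(crafted);
        System.out.println(chunkSize); // -1: 0xFFFFFFFF read as a signed int
        try {
            byte[] compressed = new byte[chunkSize]; // allocation with untested size
        } catch (NegativeArraySizeException e) {
            System.out.println("NegativeArraySizeException");
        }
    }
}
```

With a huge positive value such as 0x7FFFFFFF instead, the allocation itself would attempt ~2 GB and can end in `java.lang.OutOfMemoryError`, matching the advisory.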
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/xerial/snappy-java/security/advisories/GHSA-qcwq-55hx-v3vh">https://github.com/xerial/snappy-java/security/advisories/GHSA-qcwq-55hx-v3vh</a></p>
<p>Release Date: 2023-06-15</p>
<p>Fix Resolution: org.xerial.snappy:snappy-java:1.1.10.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-34455 (High) detected in multiple libraries | https://api.github.com/repos/opensearch-project/data-prepper/issues/2904/comments | 0 | 2023-06-19T17:17:52Z | 2023-06-29T14:59:06Z | https://github.com/opensearch-project/data-prepper/issues/2904 | 1,763,917,115 | 2,904 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-34453 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snappy-java-1.1.7.jar</b>, <b>snappy-java-1.1.8.4.jar</b>, <b>snappy-java-1.1.9.1.jar</b>, <b>snappy-java-1.1.8.3.jar</b></p></summary>
<p>
<details><summary><b>snappy-java-1.1.7.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/kafka-plugins/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.7/ef9d28a3515f0704ee9b930b7370051bd26720f/snappy-java-1.1.7.jar</p>
<p>
Dependency Hierarchy:
- curator-test-5.5.0.jar (Root Library)
- :x: **snappy-java-1.1.7.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.8.4.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/kafka-plugins/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.8.4/66f0d56454509f6e36175f2331572e250e04a6cc/snappy-java-1.1.8.4.jar</p>
<p>
Dependency Hierarchy:
- kafka-clients-7.3.3-ccs.jar (Root Library)
- :x: **snappy-java-1.1.8.4.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.9.1.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Path to dependency file: /release/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.9.1/96770312ae05ca8b59e565909973dbc2ea71bb91/snappy-java-1.1.9.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **snappy-java-1.1.9.1.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.8.3.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/parquet-codecs/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.8.3/2c58fa5745831afa3a4290c2ce15553ff13ad0ab/snappy-java-1.1.8.3.jar</p>
<p>
Dependency Hierarchy:
- hadoop-common-3.3.5.jar (Root Library)
- :x: **snappy-java-1.1.8.3.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
snappy-java is a fast compressor/decompressor for Java. Due to unchecked multiplications, an integer overflow may occur in versions prior to 1.1.10.1, causing a fatal error.
The function `shuffle(int[] input)` in the file `BitShuffle.java` receives an array of integers and applies a bit shuffle on it. It does so by multiplying the length by 4 and passing it to the natively compiled shuffle function. Since the length is not tested, the multiplication by four can cause an integer overflow and become a smaller value than the true size, or even zero or negative. In the case of a negative value, a `java.lang.NegativeArraySizeException` exception will raise, which can crash the program. In a case of a value that is zero or too small, the code that afterwards references the shuffled array will assume a bigger size of the array, which might cause exceptions such as `java.lang.ArrayIndexOutOfBoundsException`.
The same issue exists also when using the `shuffle` functions that receive a double, float, long and short, each using a different multiplier that may cause the same issue.
Version 1.1.10.1 contains a patch for this vulnerability.
<p>Publish Date: 2023-06-15
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-34453>CVE-2023-34453</a></p>
</p>
</details>
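The unchecked multiplication can be shown in isolation. This standalone sketch is hypothetical (not snappy-java code); it only demonstrates how computing `length * 4` in 32-bit arithmetic wraps for large inputs:

```java
// Hypothetical sketch of the unchecked multiplication described in
// CVE-2023-34453; not actual snappy-java code.
public class ShuffleOverflowSketch {
    // Shuffling an int[] needs length * 4 bytes; computing that in 32-bit
    // arithmetic without a range check can overflow.
    public static int byteSizeForInts(int length) {
        return length * 4; // wraps for length > Integer.MAX_VALUE / 4
    }

    public static void main(String[] args) {
        System.out.println(byteSizeForInts(Integer.MAX_VALUE)); // -4
        // A negative size would raise NegativeArraySizeException on allocation;
        // a small positive wrap (e.g. length 0x40000001 -> 4) instead yields a
        // buffer too small for the data, as the advisory notes.
        System.out.println(byteSizeForInts(0x40000001)); // 4
    }
}
```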
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/xerial/snappy-java/security/advisories/GHSA-pqr6-cmr2-h8hf">https://github.com/xerial/snappy-java/security/advisories/GHSA-pqr6-cmr2-h8hf</a></p>
<p>Release Date: 2023-06-15</p>
<p>Fix Resolution: org.xerial.snappy:snappy-java:1.1.10.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-34453 (Medium) detected in multiple libraries | https://api.github.com/repos/opensearch-project/data-prepper/issues/2903/comments | 0 | 2023-06-19T17:17:50Z | 2023-06-29T14:59:07Z | https://github.com/opensearch-project/data-prepper/issues/2903 | 1,763,917,074 | 2,903 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-34454 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snappy-java-1.1.7.jar</b>, <b>snappy-java-1.1.8.4.jar</b>, <b>snappy-java-1.1.9.1.jar</b>, <b>snappy-java-1.1.8.3.jar</b></p></summary>
<p>
<details><summary><b>snappy-java-1.1.7.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/kafka-plugins/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.7/ef9d28a3515f0704ee9b930b7370051bd26720f/snappy-java-1.1.7.jar</p>
<p>
Dependency Hierarchy:
- curator-test-5.5.0.jar (Root Library)
- :x: **snappy-java-1.1.7.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.8.4.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/kafka-plugins/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.8.4/66f0d56454509f6e36175f2331572e250e04a6cc/snappy-java-1.1.8.4.jar</p>
<p>
Dependency Hierarchy:
- kafka-clients-7.3.3-ccs.jar (Root Library)
- :x: **snappy-java-1.1.8.4.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.9.1.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Path to dependency file: /release/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.9.1/96770312ae05ca8b59e565909973dbc2ea71bb91/snappy-java-1.1.9.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **snappy-java-1.1.9.1.jar** (Vulnerable Library)
</details>
<details><summary><b>snappy-java-1.1.8.3.jar</b></p></summary>
<p>snappy-java: A fast compression/decompression library</p>
<p>Library home page: <a href="https://github.com/xerial/snappy-java">https://github.com/xerial/snappy-java</a></p>
<p>Path to dependency file: /data-prepper-plugins/parquet-codecs/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.xerial.snappy/snappy-java/1.1.8.3/2c58fa5745831afa3a4290c2ce15553ff13ad0ab/snappy-java-1.1.8.3.jar</p>
<p>
Dependency Hierarchy:
- hadoop-common-3.3.5.jar (Root Library)
- :x: **snappy-java-1.1.8.3.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
snappy-java is a fast compressor/decompressor for Java. Due to unchecked multiplications, an integer overflow may occur in versions prior to 1.1.10.1, causing an unrecoverable fatal error.
The function `compress(char[] input)` in the file `Snappy.java` receives an array of characters and compresses it. It does so by multiplying the length by 2 and passing the result to the `rawCompress` function.
Since the length is not tested, the multiplication by two can cause an integer overflow and become negative. The rawCompress function then uses the received length and passes it to the natively compiled maxCompressedLength function, using the returned value to allocate a byte array.
Since the maxCompressedLength function treats the length as an unsigned integer, it doesn’t care that it is negative, and it returns a valid value, which is cast to a signed integer by the Java engine. If the result is negative, a `java.lang.NegativeArraySizeException` exception will be raised while trying to allocate the array `buf`. On the other hand, if the result is positive, the `buf` array will successfully be allocated, but its size might be too small to use for the compression, causing a fatal Access Violation error.
The same issue exists also when using the `compress` functions that receive double, float, int, long and short, each using a different multiplier that may cause the same issue. The issue most likely won’t occur when using a byte array, since creating a byte array of size 0x80000000 (or any other negative value) is impossible in the first place.
Version 1.1.10.1 contains a patch for this issue.
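The overflow described above is plain 32-bit integer arithmetic, which can be illustrated outside of Java. The sketch below simulates Java's signed 32-bit multiplication in Python and reproduces both outcomes the advisory describes; the input sizes are illustrative, not taken from the advisory.

```python
def java_int_mult(a, b):
    # Multiply as Java 32-bit signed ints: keep the low 32 bits of the
    # product and reinterpret them as a signed value.
    n = (a * b) & 0xFFFFFFFF
    return n - 0x100000000 if n >= 0x80000000 else n

# char[] path: Snappy multiplies the element count by 2 (bytes per char).
# For 0x40000000 (~1 billion) chars the product wraps to a negative length,
# which is the NegativeArraySizeException case.
assert java_int_mult(0x40000000, 2) == -0x80000000

# double[] path (multiplier 8): the product can wrap to a tiny *positive*
# size, so the buffer allocates successfully but is far too small --
# the fatal access-violation case.
assert java_int_mult(0x20000001, 8) == 8
```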
<p>Publish Date: 2023-06-15
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-34454>CVE-2023-34454</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/xerial/snappy-java/security/advisories/GHSA-fjpj-2g6w-x25r">https://github.com/xerial/snappy-java/security/advisories/GHSA-fjpj-2g6w-x25r</a></p>
<p>Release Date: 2023-06-15</p>
<p>Fix Resolution: org.xerial.snappy:snappy-java:1.1.10.1</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-34454 (Medium) detected in multiple libraries | https://api.github.com/repos/opensearch-project/data-prepper/issues/2902/comments | 0 | 2023-06-19T17:17:47Z | 2023-06-29T14:59:07Z | https://github.com/opensearch-project/data-prepper/issues/2902 | 1,763,917,030 | 2,902 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-2976 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>guava-31.1-jre.jar</b>, <b>guava-30.1.1-jre.jar</b></p></summary>
<p>
<details><summary><b>guava-31.1-jre.jar</b></p></summary>
<p>Guava is a suite of core and expanded libraries that include
utility classes, Google's collections, I/O classes, and
much more.</p>
<p>Path to dependency file: /data-prepper-plugins/otel-trace-raw-processor/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.guava/guava/31.1-jre/60458f877d055d0c9114d9e1a2efb737b4bc282c/guava-31.1-jre.jar</p>
<p>
Dependency Hierarchy:
- :x: **guava-31.1-jre.jar** (Vulnerable Library)
</details>
<details><summary><b>guava-30.1.1-jre.jar</b></p></summary>
<p>Guava is a suite of core and expanded libraries that include
utility classes, Google's collections, I/O classes, and
much more.</p>
<p>Library home page: <a href="https://github.com/google/guava">https://github.com/google/guava</a></p>
<p>Path to dependency file: /data-prepper-plugins/user-agent-processor/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.guava/guava/30.1.1-jre/87e0fd1df874ea3cbe577702fe6f17068b790fd8/guava-30.1.1-jre.jar</p>
<p>
Dependency Hierarchy:
- checkstyle-8.45.1.jar (Root Library)
- :x: **guava-30.1.1-jre.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Use of Java's default temporary directory for file creation in `FileBackedOutputStream` in Google Guava versions 1.0 to 31.1 on Unix systems and Android Ice Cream Sandwich allows other users and apps on the machine with access to the default Java temporary directory to be able to access the files created by the class.
Even though the security vulnerability is fixed in version 32.0.0, we recommend using version 32.0.1 as version 32.0.0 breaks some functionality under Windows.
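The underlying weakness is writing buffered files straight into the shared system temp directory, where other local users may read them. As an illustration of the same class of hardening the fixed versions apply (not Guava's actual patch code), buffered files can be placed inside a freshly created private directory:

```python
import os
import stat
import tempfile

# tempfile.mkdtemp() creates a directory that is readable, writable, and
# searchable only by the creating user (mode 0700 on POSIX), so files
# placed inside it are not exposed to other local users.
private_dir = tempfile.mkdtemp(prefix="buffer-")
spill_path = os.path.join(private_dir, "spill.bin")
with open(spill_path, "wb") as f:
    f.write(b"buffered data")

# On POSIX, the directory carries no group/other permission bits.
mode = stat.S_IMODE(os.stat(private_dir).st_mode)
assert mode & (stat.S_IRWXG | stat.S_IRWXO) == 0
```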
<p>Publish Date: 2023-06-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-2976>CVE-2023-2976</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2023-2976">https://www.cve.org/CVERecord?id=CVE-2023-2976</a></p>
<p>Release Date: 2023-06-14</p>
<p>Fix Resolution: 32.0.1-jre</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2023-2976 (Medium) detected in guava-31.1-jre.jar, guava-30.1.1-jre.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2901/comments | 0 | 2023-06-19T17:17:45Z | 2023-09-29T16:28:34Z | https://github.com/opensearch-project/data-prepper/issues/2901 | 1,763,916,994 | 2,901 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Some users have requested the ability to add to existing documents in OpenSearch instead of overwriting them.
**Describe the solution you'd like**
Ideally, the user could configure an option to allow this behavior. | Support partial updates on existing documents | https://api.github.com/repos/opensearch-project/data-prepper/issues/2893/comments | 0 | 2023-06-16T15:38:04Z | 2023-07-05T20:49:49Z | https://github.com/opensearch-project/data-prepper/issues/2893 | 1,760,870,684 | 2,893 |
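One way such an option could map onto OpenSearch is the existing `_bulk` `update` action, which merges a partial document into the stored one (and, with `doc_as_upsert`, creates it when absent). A minimal sketch of building that NDJSON payload — the function name and field values here are illustrative, not part of any Data Prepper API:

```python
import json

def bulk_update_line(index, doc_id, partial_doc, upsert=True):
    # One _bulk "update" action: the partial doc is merged into the stored
    # document instead of replacing it; doc_as_upsert creates it if missing.
    action = json.dumps({"update": {"_index": index, "_id": doc_id}})
    body = json.dumps({"doc": partial_doc, "doc_as_upsert": upsert})
    return action + "\n" + body + "\n"

payload = bulk_update_line("logs", "42", {"status": "resolved"})
action_line, body_line = payload.strip().split("\n")
assert json.loads(action_line) == {"update": {"_index": "logs", "_id": "42"}}
assert json.loads(body_line)["doc"] == {"status": "resolved"}
```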
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Some S3 source and sink metrics share the same names, so the values reported by both plugins are aggregated into a single metric.
**To Reproduce**
Steps to reproduce the behavior:
1. Create a pipeline with an S3 source and sink
2. Observe that only one metric is emitted, tracking the sum of both the source and sink metrics
**Expected behavior**
There should be separate `s3ObjectsSucceeded`, `s3ObjectsFailed` and `s3ObjectSizeBytes` metrics for each plugin.
| [BUG] S3 sink metric names conflict with S3 sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/2887/comments | 0 | 2023-06-16T01:31:21Z | 2023-06-16T13:48:46Z | https://github.com/opensearch-project/data-prepper/issues/2887 | 1,759,769,655 | 2,887 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
A user reported a lack of OpenSearch sink initialization error logs when Data Prepper is spinning up.
```
2023-06-12T22:52:21,382 [entry-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [entry-pipeline] - sink is not ready for execution, retrying
2023-06-12T22:52:21,382 [service-map-pipeline-sink-worker-4-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [service-map-pipeline] - sink is not ready for execution, retrying
2023-06-12T22:52:21,383 [entry-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [entry-pipeline] Waiting for Sink to be ready
2023-06-12T22:52:21,383 [service-map-pipeline-sink-worker-4-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initializing OpenSearch sink
2023-06-12T22:52:21,383 [raw-pipeline-sink-worker-6-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [raw-pipeline] - sink is not ready for execution, retrying
2023-06-12T22:52:21,383 [raw-pipeline-sink-worker-6-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initializing OpenSearch sink
2023-06-12T22:52:21,407 [main] WARN org.opensearch.dataprepper.pipeline.server.HttpServerProvider - Creating Data Prepper server without TLS. This is not secure.
2023-06-12T22:52:21,407 [main] WARN org.opensearch.dataprepper.pipeline.server.HttpServerProvider - In order to set up TLS for the Data Prepper server, go here: https://github.com/opensearch-project/data-prepper/blob/main/docs/configuration.md#server-configuration
2023-06-12T22:52:21,409 [raw-pipeline-sink-worker-6-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - aws_sigv4 is set, will sign requests using AWSRequestSigningApacheInterceptor
2023-06-12T22:52:21,409 [service-map-pipeline-sink-worker-4-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - aws_sigv4 is set, will sign requests using AWSRequestSigningApacheInterceptor
2023-06-12T22:52:21,425 [main] INFO org.opensearch.dataprepper.pipeline.server.DataPrepperServer - Data Prepper server running at :4900
2023-06-12T22:52:21,883 [service-map-pipeline-sink-worker-4-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the trust all strategy
2023-06-12T22:52:21,883 [raw-pipeline-sink-worker-6-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the trust all strategy
2023-06-12T22:52:22,383 [entry-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [entry-pipeline] - sink is not ready for execution, retrying
2023-06-12T22:52:23,383 [entry-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [entry-pipeline] - sink is not ready for execution, retrying
2023-06-12T22:52:24,384 [entry-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [entry-pipeline] - sink is not ready for execution, retrying
2023-06-12T22:52:25,384 [entry-pipeline-sink-worker-2-thread-1] INFO org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [entry-pipeline] - sink is not ready for execution, retrying
```
This issue still needs to be reproduced.
| [BUG] lack of opensearch sink initialization error logs when data-prepper is spinning up | https://api.github.com/repos/opensearch-project/data-prepper/issues/2877/comments | 2 | 2023-06-14T18:59:03Z | 2023-10-27T07:36:50Z | https://github.com/opensearch-project/data-prepper/issues/2877 | 1,757,475,850 | 2,877 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
A user found this bug in the OpenSearch sink:
```
2023-06-12T22:47:20,056 [service-map-pipeline-sink-worker-4-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - aws_sigv4 is set, will sign requests using AWSRequestSigningApacheInterceptor
2023-06-12T22:47:20,056 [raw-pipeline-sink-worker-6-thread-1] INFO org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - aws_sigv4 is set, will sign requests using AWSRequestSigningApacheInterceptor
2023-06-12T22:47:20,068 [main] INFO org.opensearch.dataprepper.pipeline.server.DataPrepperServer - Data Prepper server running at :4900
2023-06-12T22:47:20,509 [service-map-pipeline-sink-worker-4-thread-1] WARN org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink - Failed to initialize OpenSearch sink with a retryable exception.
java.util.ConcurrentModificationException: null
at java.util.HashMap.computeIfAbsent(HashMap.java:1221) ~[?:?]
at org.opensearch.dataprepper.plugins.aws.CredentialsCache.getOrCreate(CredentialsCache.java:25) ~[aws-plugin-2.3.0.jar:?]
at org.opensearch.dataprepper.plugins.aws.DefaultAwsCredentialsSupplier.getProvider(DefaultAwsCredentialsSupplier.java:23) ~[aws-plugin-2.3.0.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration.attachSigV4(ConnectionConfiguration.java:268) ~[opensearch-2.3.0.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration.createClient(ConnectionConfiguration.java:245) ~[opensearch-2.3.0.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.doInitializeInternal(OpenSearchSink.java:164) ~[opensearch-2.3.0.jar:?]
at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.doInitialize(OpenSearchSink.java:144) ~[opensearch-2.3.0.jar:?]
at org.opensearch.dataprepper.model.sink.AbstractSink.initialize(AbstractSink.java:49) ~[data-prepper-api-2.3.0.jar:?]
at org.opensearch.dataprepper.pipeline.Pipeline.isReady(Pipeline.java:195) ~[data-prepper-core-2.3.0.jar:?]
at org.opensearch.dataprepper.pipeline.Pipeline.lambda$execute$2(Pipeline.java:243) ~[data-prepper-core-2.3.0.jar:?]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:539) [?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) [?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136) [?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635) [?:?]
at java.lang.Thread.run(Thread.java:833) [?:?]
```
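The stack trace points at `HashMap.computeIfAbsent` being called from two sink worker threads at once. A minimal sketch of the usual fix — this is not the actual `CredentialsCache` code — is to back the cache with `ConcurrentHashMap`, whose `computeIfAbsent` is atomic per key:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.atomic.AtomicInteger;

// Sketch of a thread-safe get-or-create cache; not the real CredentialsCache.
class SafeCredentialsCache {
    private final Map<String, Object> cache = new ConcurrentHashMap<>();
    final AtomicInteger creations = new AtomicInteger();

    Object getOrCreate(String key) {
        // ConcurrentHashMap runs the mapping function at most once per absent key,
        // unlike plain HashMap, which can throw ConcurrentModificationException here.
        return cache.computeIfAbsent(key, k -> {
            creations.incrementAndGet();
            return new Object();
        });
    }
}
```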
| [BUG] ConcurrentModificationException in CredentialsCache::getOrCreate | https://api.github.com/repos/opensearch-project/data-prepper/issues/2875/comments | 0 | 2023-06-14T15:19:55Z | 2023-06-15T02:24:09Z | https://github.com/opensearch-project/data-prepper/issues/2875 | 1,757,149,157 | 2,875 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The OpenSearch sink currently supports using values from an Event in the `index` and `document_id_field`
```
index: "${key_one}-${key_two}-%{yyyy.mm.hh}"
document_id_field: "/key_three"
```
As a user, I would like the ability to set the index and document_id_field from Data Prepper expressions, most notably the functions available. For example, I would like to use the `getMetadata` function to set these values without having to keep them in the Event.
**Describe the solution you'd like**
A way to use Data Prepper expressions along with formatting in these parameters. We should do this in a way that can be kept consistent in any location in the pipeline config that needs a value
For the `add_entries` processor, we created a new field for each of these options
```
add_entries:
  entries:
    - key: "/my_key"
      value: "static_value"
      format: "${key_one}-${key_two}"
      value_expression: "length(/message)"
```
Ideally, we would not repeat this pattern in every location, which in this case would mean
```
index: "my_index" # This currently supports format as well with ${}
index_expression: "getMetadata(\"metadata_key\")"
document_id_field: "/my_field_from_event"
document_id_field_expression: "getMetadata(\"metadata_key_two\")"
document_id_field_format: "${key_one}-${key_two}"
```
Instead of this, we should try to consolidate as mentioned in #2719. So we should have only the `index` and `document_id_field` parameters, and they will support all of the above options.
The best way to do this will be to use ${} syntax to escape both keys and expressions. For example,
```
index: "prefix-${/some/key}-${getMetadata(\"metadata_key\")}-%{yyyy.mm.hh}"
document_id_field: "prefix-${/some/key}-${length(/message)}"
```
Alternatively, we could consider a different escape character between key values and expressions if we find it difficult to detect keys vs expressions
```
index: "prefix-${/some/key}-@{getMetadata(\"metadata_key\")}-%{yyyy.mm.hh}"
document_id_field: "prefix-${/some/key}-@{length(/message)}"
```
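Either escape style requires the sink to classify each escaped segment. A rough sketch of that classification over `${...}` segments — a hypothetical helper, not the actual Data Prepper expression engine — using the proposal's heuristic that JSON-pointer-style keys start with `/` and anything else is an expression:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: classify ${...} segments in a format string as KEY or EXPRESSION.
class FormatSegmentClassifier {
    private static final Pattern SEGMENT = Pattern.compile("\\$\\{([^}]+)}");

    static List<String> classify(String format) {
        List<String> out = new ArrayList<>();
        Matcher m = SEGMENT.matcher(format);
        while (m.find()) {
            String inner = m.group(1);
            // Heuristic: JSON-pointer-style keys start with '/';
            // everything else is treated as a Data Prepper expression.
            out.add(inner.startsWith("/") ? "KEY:" + inner : "EXPRESSION:" + inner);
        }
        return out;
    }
}
```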
**Additional context**
Consolidating the expression use in parameters of Data Prepper pipeline configurations (https://github.com/opensearch-project/data-prepper/issues/2719)
| Ability to use Data Prepper expressions in the OpenSearch sink index and document_id_field | https://api.github.com/repos/opensearch-project/data-prepper/issues/2864/comments | 2 | 2023-06-13T17:05:41Z | 2023-08-02T16:44:12Z | https://github.com/opensearch-project/data-prepper/issues/2864 | 1,755,327,849 | 2,864 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Add end-to-end acknowledgement support to Stdout and File Sinks.
The stdout sink is used for testing, and tests fail when end-to-end acknowledgements are exercised with stdout as the sink.
**Describe the solution you'd like**
Release the event handles in Stdout and File sinks
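Sketched with toy types (not the actual Data Prepper `EventHandle` API), the change amounts to: after a sink writes an event, it releases the event's handle with a success flag so the acknowledgement can propagate back.

```java
import java.util.ArrayList;
import java.util.List;

// Toy sketch (not the real Data Prepper API): a sink releases each
// event's handle after writing so end-to-end acknowledgements propagate.
class AckSinkSketch {
    static class EventHandle {
        boolean released;
        boolean success;
        void release(boolean success) { this.released = true; this.success = success; }
    }

    static class Event {
        final String data;
        final EventHandle handle = new EventHandle();
        Event(String data) { this.data = data; }
    }

    static List<String> output(List<Event> events) {
        List<String> written = new ArrayList<>();
        for (Event event : events) {
            written.add(event.data); // stand-in for printing/writing the event
            // The fix proposed here: release the handle with a success flag.
            event.handle.release(true);
        }
        return written;
    }
}
```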
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| Add end-to-end acknowledgement support to Stdout and File Sinks | https://api.github.com/repos/opensearch-project/data-prepper/issues/2859/comments | 0 | 2023-06-12T15:49:21Z | 2023-06-12T18:18:19Z | https://github.com/opensearch-project/data-prepper/issues/2859 | 1,753,053,029 | 2,859 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The BufferAccumulator class is being reused between the opensearch source and s3 source. It may be useful for future sources as well.
**Describe the solution you'd like**
Move the BufferAccumulator class to a `buffer-common` module that can be depended on and used by any source or plugin that needs it.
| Consolidate BufferAccumulator class for reuse between sources | https://api.github.com/repos/opensearch-project/data-prepper/issues/2855/comments | 0 | 2023-06-10T06:06:24Z | 2023-06-12T18:29:57Z | https://github.com/opensearch-project/data-prepper/issues/2855 | 1,750,819,597 | 2,855 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
As a user of Data Prepper who has logs containing a list of JSON objects, I would like to run mutate event processors on every value in the list.
**Describe the solution you'd like**
Given a log with the following structure
```
root_key: [
  {
    "key_one": "value_one",
    "key_two": "value_two"
  },
  {
    "key_one": "value_three",
    "key_two": "value_four"
  }
]
```
I would like to be able to manipulate the elements of this list in place by having the mutate event processors iterate over the elements, especially for add_entries and delete_entries.
For example, I would like the following output
```
root_key: [
  {
    "new_key": "value_one:value_two"
  },
  {
    "new_key": "value_three:value_four"
  }
]
```
If this is supported, the following configuration would work for the above case
```
- add_entries:
    entries:
      - iterate_on: "root_key" # this is the array to iterate over
        key: "new_key"
        format: "${key_one}:${key_two}"
- delete_entries:
    iterate_on: "root_key"
    with_keys: [ "key_one", "key_two" ]
```
The `iterate_on` parameter name was used for the `translate` processor, but I am fine with another name as long as we are consistent across all mutate processors.
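What `iterate_on` would do, sketched over plain maps with a hypothetical helper (not a real processor), matching the desired output above:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch: iterate over a list of objects, add a formatted key, drop the source keys.
class IterateOnSketch {
    static List<Map<String, Object>> addAndDelete(List<Map<String, Object>> elements) {
        List<Map<String, Object>> out = new ArrayList<>();
        for (Map<String, Object> element : elements) {
            Map<String, Object> copy = new LinkedHashMap<>(element);
            // add_entries with format "${key_one}:${key_two}"
            copy.put("new_key", copy.get("key_one") + ":" + copy.get("key_two"));
            // delete_entries with_keys [key_one, key_two]
            copy.remove("key_one");
            copy.remove("key_two");
            out.add(copy);
        }
        return out;
    }
}
```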
**Additional context**
Similar concept for mutate string processors #1217
| Support for iterating over a list and manipulating each element, such as renaming a certain key in the element | https://api.github.com/repos/opensearch-project/data-prepper/issues/2853/comments | 0 | 2023-06-09T20:06:43Z | 2025-06-13T01:33:02Z | https://github.com/opensearch-project/data-prepper/issues/2853 | 1,750,437,042 | 2,853 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
The OpenSearch sink allows a bulk request size to be configured. The internal calculation for size is overestimating the true size of the bulk request. With the bulk size parameter set to 20MB, bulk requests are averaging 11-14MB.
**Expected behavior**
The bulk request size parameter should reflect the actual bulk sizes.
**Proposed Solution**
Migrate the size calculation logic to use the size calculator from OpenSearch's BulkRequest object. Example:
```
BulkRequest::estimatedSizeInBytes
```
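As context for validating whichever estimator is chosen: for index/create operations, the `_bulk` request body is NDJSON, so its exact byte size is the sum of each action line and source line plus a newline after each. A sketch of that ground-truth calculation (a hypothetical helper, not the sink's current accounting):

```java
import java.nio.charset.StandardCharsets;
import java.util.List;

// Sketch: compute the exact NDJSON payload size of a bulk body so an
// estimator can be validated against it. Assumes index/create operations,
// where every action line is followed by a source line.
class BulkSizeSketch {
    static long ndjsonBytes(List<String> actionLines, List<String> sourceLines) {
        long total = 0;
        for (int i = 0; i < actionLines.size(); i++) {
            // action line + '\n' + source line + '\n'
            total += actionLines.get(i).getBytes(StandardCharsets.UTF_8).length + 1;
            total += sourceLines.get(i).getBytes(StandardCharsets.UTF_8).length + 1;
        }
        return total;
    }
}
```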
| [BUG] BulkSizes are overestimated | https://api.github.com/repos/opensearch-project/data-prepper/issues/2852/comments | 1 | 2023-06-09T19:02:20Z | 2023-06-30T15:28:53Z | https://github.com/opensearch-project/data-prepper/issues/2852 | 1,750,367,579 | 2,852 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Data Prepper throws a SigV4 error when the OpenSearch URL in **hosts** has a trailing "/".
**To Reproduce**
Steps to reproduce the behavior:
1. Go to pipeline.yaml
2. Click on Sinks
3. Under the `opensearch` sink, set `hosts` to a URL with a trailing `/`, e.g. "https://mydomain-us-east-1.es.amazonaws.com/" (the same URL without the trailing slash, "https://mydomain-us-east-1.es.amazonaws.com", works fine)
4. See error
Caused by: org.opensearch.client.opensearch._types.OpenSearchException: Request failed: [security_exception] The request signature we calculated does not match the signature you provided. Check your AWS Secret Access Key and signing method. Consult the service documentation for details.
**Expected behavior**
Connection to OpenSearch should succeed with or without a trailing "/" on the URL.
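The symptom suggests the trailing `/` ends up in the canonical request path and breaks the SigV4 signature. Until the root cause is fixed, one defensive sketch (a hypothetical helper, not current Data Prepper code) is to normalize the configured host before the client is built:

```java
// Sketch: strip a trailing '/' from a configured host URL so the signed
// request path matches what the service computes.
class HostNormalizer {
    static String normalize(String host) {
        String trimmed = host.trim();
        while (trimmed.endsWith("/")) {
            trimmed = trimmed.substring(0, trimmed.length() - 1);
        }
        return trimmed;
    }
}
```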
| [BUG] Sigv4 error when hosts has "/ " at the end of URL | https://api.github.com/repos/opensearch-project/data-prepper/issues/2843/comments | 2 | 2023-06-07T15:45:48Z | 2023-10-19T22:37:56Z | https://github.com/opensearch-project/data-prepper/issues/2843 | 1,746,203,682 | 2,843 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Add support for writing tags along with events to Sink.
Some events may be tagged within Data Prepper; there needs to be a way to write these tags to the sink along with the event data.
**Describe the solution you'd like**
Provide a sink-level config option to write tags along with events. When this option is used, the tags on the events should be written to the sink under the key provided by the config option. For example:
```
opensearch:
  ...
  tags_key_name: "tags"
```
The implementation should support all sinks without having to write sink-specific configuration support.
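Sketched over plain maps with toy types (not the actual Event API), the sink-agnostic behavior would be: just before serialization, copy the event's tags into the outgoing document under the configured key.

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.Set;

// Sketch: attach an event's tags to its serialized form under a configured key.
class TagWriterSketch {
    static Map<String, Object> withTags(Map<String, Object> document,
                                        Set<String> tags,
                                        String tagsKeyName) {
        Map<String, Object> out = new LinkedHashMap<>(document);
        if (tagsKeyName != null && !tags.isEmpty()) {
            out.put(tagsKeyName, tags);
        }
        return out;
    }
}
```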
| Add support for writing tags along with events to Sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/2827/comments | 0 | 2023-06-05T20:47:19Z | 2023-06-27T15:42:29Z | https://github.com/opensearch-project/data-prepper/issues/2827 | 1,742,595,535 | 2,827 |
[
"opensearch-project",
"data-prepper"
] | null | Allow injecting configurations into the pipeline_configurations | https://api.github.com/repos/opensearch-project/data-prepper/issues/2826/comments | 0 | 2023-06-05T14:52:40Z | 2023-08-30T17:31:08Z | https://github.com/opensearch-project/data-prepper/issues/2826 | 1,741,952,730 | 2,826 |
[
"opensearch-project",
"data-prepper"
] | I want to create an extension that uses another extension.
For example, I want to create a new plugin that uses the `AwsCredentialsSupplier` class, which is provided by the AWS Plugin extension.
```
public class NewExtensionsPlugin implements ExtensionPlugin {
    @DataPrepperPluginConstructor
    public NewExtensionsPlugin(AwsCredentialsSupplier awsCredentialsSupplier) {
    }
}
```
Currently, this is not supported.
See the following code where this is not yet allowed:
https://github.com/opensearch-project/data-prepper/blob/affe0b217744e7d52155860115fb1f8b8e233dbe/data-prepper-core/src/main/java/org/opensearch/dataprepper/plugin/ExtensionLoader.java#L54-L62
Proposed solution:
Allow `ExtensionPlugin` classes to add an annotation which defines what components they provide. This can allow the plugin framework to create a dependency tree.
```
@ExtensionProvides(value = {ClassSuppliedOne.class, ClassSuppliedTwo.class})
```
For example, in the `AwsPlugin`:
```
@ExtensionProvides(value = {AwsCredentialsSupplier.class})
public class AwsPlugin implements ExtensionPlugin {
...
@Override
public void apply(final ExtensionPoints extensionPoints) {
extensionPoints.addExtensionProvider(new AwsExtensionProvider(defaultAwsCredentialsSupplier));
}
}
```
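The ordering requirement ("if Extension B depends on Extension A, A loads first") is a topological sort over the provides/requires graph that the `@ExtensionProvides` declarations describe. A sketch with hypothetical names, using Kahn's algorithm:

```java
import java.util.ArrayDeque;
import java.util.ArrayList;
import java.util.Deque;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Sketch: Kahn's algorithm over "extension -> extensions it depends on".
class ExtensionOrderSketch {
    static List<String> loadOrder(Map<String, List<String>> dependsOn) {
        Map<String, Integer> remaining = new HashMap<>();
        Map<String, List<String>> dependents = new HashMap<>();
        for (Map.Entry<String, List<String>> e : dependsOn.entrySet()) {
            remaining.put(e.getKey(), e.getValue().size());
            for (String dep : e.getValue()) {
                dependents.computeIfAbsent(dep, k -> new ArrayList<>()).add(e.getKey());
            }
        }
        Deque<String> ready = new ArrayDeque<>();
        for (Map.Entry<String, Integer> e : remaining.entrySet()) {
            if (e.getValue() == 0) ready.add(e.getKey());
        }
        List<String> order = new ArrayList<>();
        while (!ready.isEmpty()) {
            String next = ready.remove();
            order.add(next);
            for (String dependent : dependents.getOrDefault(next, List.of())) {
                // decrement the dependent's unmet-dependency count
                if (remaining.merge(dependent, -1, Integer::sum) == 0) ready.add(dependent);
            }
        }
        if (order.size() != dependsOn.size()) {
            throw new IllegalStateException("cycle among extensions");
        }
        return order;
    }
}
```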
What needs to happen:
- [ ] Load extensions in the necessary order. If Extension B depends on Extension A, then Extension A must load first.
- [ ] Update the ExtensionLoader to get extensions and inject them. | Allow extensions to depend upon other extensions | https://api.github.com/repos/opensearch-project/data-prepper/issues/2825/comments | 0 | 2023-06-05T14:36:35Z | 2025-03-11T19:50:04Z | https://github.com/opensearch-project/data-prepper/issues/2825 | 1,741,924,415 | 2,825 |
[
"opensearch-project",
"data-prepper"
] | null | Allow extracting configurations which were injected into data-prepper-config.yaml into plugins via the DataPrepperPluginConstructor. | https://api.github.com/repos/opensearch-project/data-prepper/issues/2824/comments | 0 | 2023-06-05T14:34:51Z | 2023-08-30T21:14:18Z | https://github.com/opensearch-project/data-prepper/issues/2824 | 1,741,921,433 | 2,824 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The `convert_entry_type` processor may need to be used several times with the same configuration. It might be nice to consolidate this into a single `keys` value.
For example, see this sample for VPC Flow Logs:
```
processor:
  - convert_entry_type:
      key: dstport
      type: integer
  - convert_entry_type:
      key: srcport
      type: integer
  - convert_entry_type:
      key: protocol
      type: integer
  - convert_entry_type:
      key: bytes
      type: integer
  - convert_entry_type:
      key: packets
      type: integer
```
**Describe the solution you'd like**
Allow defining multiple `keys` in the `convert_entry_type` processor.
Each of the `convert_entry_type` processors could be combined:
```
processor:
  - convert_entry_type:
      keys: [ 'dstport', 'srcport', 'protocol', 'bytes', 'packets' ]
      type: integer
```
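What the consolidated option would do, sketched over a plain map (a hypothetical helper, not the processor itself):

```java
import java.util.List;
import java.util.Map;

// Sketch: convert every listed key's value to an Integer, in place.
class ConvertKeysSketch {
    static void convertToInteger(Map<String, Object> event, List<String> keys) {
        for (String key : keys) {
            Object value = event.get(key);
            if (value instanceof String) {
                event.put(key, Integer.parseInt((String) value));
            }
        }
    }
}
```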
**Additional context**
This also becomes much more helpful when combined with #2822. I have the following sample:
```
processor:
  - delete_entries:
      with_keys:
        - dstport
      delete_when: '/dstport == "-"'
  - convert_entry_type:
      key: dstport
      type: integer
  - delete_entries:
      with_keys:
        - srcport
      delete_when: '/srcport == "-"'
  - convert_entry_type:
      key: srcport
      type: integer
  - delete_entries:
      with_keys:
        - protocol
      delete_when: '/protocol == "-"'
  - convert_entry_type:
      key: protocol
      type: integer
  - delete_entries:
      with_keys:
        - bytes
      delete_when: '/bytes == "-"'
  - convert_entry_type:
      key: bytes
      type: integer
  - delete_entries:
      with_keys:
        - packets
      delete_when: '/packets == "-"'
  - convert_entry_type:
      key: packets
      type: integer
```
This could simply become:
```
processor:
  - convert_entry_type:
      keys: [ 'dstport', 'srcport', 'protocol', 'bytes', 'packets' ]
      type: integer
      null_values: [ '-' ]
```
| Define multiple keys for type conversion | https://api.github.com/repos/opensearch-project/data-prepper/issues/2823/comments | 0 | 2023-06-05T14:21:07Z | 2023-07-05T02:42:09Z | https://github.com/opensearch-project/data-prepper/issues/2823 | 1,741,894,957 | 2,823 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The `convert_entry_type` processor requires that the input match the expected type. In some pipelines, the input may contain a placeholder character that cannot be converted.
For example, in VPC Flow Logs, reading the CSV file using the `csv` codec may leave some fields, such as `srcport` (source port), as `-` when there is no value.
I have to delete this entry to provide conversion.
```
processor:
  - delete_entries:
      with_keys:
        - srcport
      delete_when: '/srcport == "-"'
  - convert_entry_type:
      key: srcport
      type: integer
```
**Describe the solution you'd like**
Provide a new configuration in the `convert_entry_type` processor which accepts a list of strings that are explicitly treated as `null` (that is, removed).
```
processor:
  - convert_entry_type:
      key: srcport
      type: integer
      null_values: [ '-' ]
```
The `convert_entry_type` processor will know to treat any value in the `null_values` list as `null`.
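Sketched behavior for `null_values` with a hypothetical helper (not the processor itself): if the raw string matches a configured null marker, drop the key instead of attempting the conversion.

```java
import java.util.List;
import java.util.Map;

// Sketch: treat configured marker strings (e.g. "-") as null during conversion.
class NullValuesSketch {
    static void convertToInteger(Map<String, Object> event, String key, List<String> nullValues) {
        Object value = event.get(key);
        if (value instanceof String) {
            if (nullValues.contains(value)) {
                event.remove(key); // marker means "no value": remove rather than fail
            } else {
                event.put(key, Integer.parseInt((String) value));
            }
        }
    }
}
```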
**Describe alternatives you've considered (Optional)**
An alternative solution would be to update the CSV processor and codec to have a concept of `null` strings. But, these are not mutually exclusive solutions. Not all pipelines will go through CSV.
| Define null characters in convert processor | https://api.github.com/repos/opensearch-project/data-prepper/issues/2822/comments | 3 | 2023-06-05T14:16:32Z | 2023-06-13T15:49:29Z | https://github.com/opensearch-project/data-prepper/issues/2822 | 1,741,886,811 | 2,822 |
[
"opensearch-project",
"data-prepper"
] | ## WS-2016-7057 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>plexus-utils-2.0.6.jar</b></p></summary>
<p>A collection of various utility classes to ease working with strings, files, command lines, XML and more.</p>
<p>Path to dependency file: /release/maven/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.codehaus.plexus/plexus-utils/2.0.6/3a20c424a712a7c02b02af61dcad5f001b29a9fd/plexus-utils-2.0.6.jar</p>
<p>
Dependency Hierarchy:
- data-prepper-plugins-2.4.0-SNAPSHOT (Root Library)
- opensearch-source-2.4.0-SNAPSHOT
- maven-artifact-3.0.3.jar
- :x: **plexus-utils-2.0.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Plexus-utils before 3.0.24 are vulnerable to Directory Traversal
<p>Publish Date: 2016-05-07
<p>URL: <a href=https://github.com/codehaus-plexus/plexus-utils/commit/33a2853df8185b4519b1b8bfae284f03392618ef>WS-2016-7057</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2016-05-07</p>
<p>Fix Resolution: 3.0.24</p>
</p>
</details>
<p></p>
| WS-2016-7057 (Medium) detected in plexus-utils-2.0.6.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2820/comments | 1 | 2023-06-03T00:51:40Z | 2023-06-08T20:52:43Z | https://github.com/opensearch-project/data-prepper/issues/2820 | 1,739,004,884 | 2,820 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-4244 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>plexus-utils-2.0.6.jar</b></p></summary>
<p>A collection of various utility classes to ease working with strings, files, command lines, XML and more.</p>
<p>Path to dependency file: /release/maven/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.codehaus.plexus/plexus-utils/2.0.6/3a20c424a712a7c02b02af61dcad5f001b29a9fd/plexus-utils-2.0.6.jar</p>
<p>
Dependency Hierarchy:
- data-prepper-plugins-2.4.0-SNAPSHOT (Root Library)
- opensearch-source-2.4.0-SNAPSHOT
- maven-artifact-3.0.3.jar
- :x: **plexus-utils-2.0.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
CVE-2022-4244 codehaus-plexus: Directory Traversal
<p>Publish Date: 2022-12-01
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-4244>CVE-2022-4244</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-12-01</p>
<p>Fix Resolution: org.codehaus.plexus:plexus-utils:3.0.24</p>
</p>
</details>
<p></p>
| CVE-2022-4244 (Medium) detected in plexus-utils-2.0.6.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2819/comments | 0 | 2023-06-03T00:51:38Z | 2023-06-08T20:52:44Z | https://github.com/opensearch-project/data-prepper/issues/2819 | 1,739,004,870 | 2,819 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-4245 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>plexus-utils-2.0.6.jar</b></p></summary>
<p>A collection of various utility classes to ease working with strings, files, command lines, XML and more.</p>
<p>Path to dependency file: /release/maven/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.codehaus.plexus/plexus-utils/2.0.6/3a20c424a712a7c02b02af61dcad5f001b29a9fd/plexus-utils-2.0.6.jar</p>
<p>
Dependency Hierarchy:
- data-prepper-plugins-2.4.0-SNAPSHOT (Root Library)
- opensearch-source-2.4.0-SNAPSHOT
- maven-artifact-3.0.3.jar
- :x: **plexus-utils-2.0.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
CVE-2022-4245 codehaus-plexus: XML External Entity (XXE) Injection
<p>Publish Date: 2022-12-01
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-4245>CVE-2022-4245</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugzilla.suse.com/show_bug.cgi?id=1205930">https://bugzilla.suse.com/show_bug.cgi?id=1205930</a></p>
<p>Release Date: 2022-12-01</p>
<p>Fix Resolution: org.codehaus.plexus:plexus-utils:3.0.24</p>
</p>
</details>
<p></p>
| CVE-2022-4245 (Medium) detected in plexus-utils-2.0.6.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2818/comments | 0 | 2023-06-03T00:51:35Z | 2023-06-08T20:52:44Z | https://github.com/opensearch-project/data-prepper/issues/2818 | 1,739,004,861 | 2,818 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2017-1000487 - Critical Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>plexus-utils-2.0.6.jar</b></p></summary>
<p>A collection of various utility classes to ease working with strings, files, command lines, XML and more.</p>
<p>Path to dependency file: /release/maven/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.codehaus.plexus/plexus-utils/2.0.6/3a20c424a712a7c02b02af61dcad5f001b29a9fd/plexus-utils-2.0.6.jar</p>
<p>
Dependency Hierarchy:
  - data-prepper-plugins-2.4.0-SNAPSHOT (Root Library)
    - opensearch-source-2.4.0-SNAPSHOT
      - maven-artifact-3.0.3.jar
        - :x: **plexus-utils-2.0.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Plexus-utils before 3.0.16 is vulnerable to command injection because it does not correctly process the contents of double quoted strings.
<p>Publish Date: 2018-01-03
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-1000487>CVE-2017-1000487</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-1000487">https://nvd.nist.gov/vuln/detail/CVE-2017-1000487</a></p>
<p>Release Date: 2018-01-03</p>
<p>Fix Resolution: 3.0.16</p>
</p>
</details>
<p></p>
| CVE-2017-1000487 (Critical) detected in plexus-utils-2.0.6.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2817/comments | 4 | 2023-06-03T00:51:33Z | 2023-06-08T20:52:45Z | https://github.com/opensearch-project/data-prepper/issues/2817 | 1,739,004,851 | 2,817 |
[
"opensearch-project",
"data-prepper"
] | ## WS-2016-7062 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>plexus-utils-2.0.6.jar</b></summary>
<p>A collection of various utility classes to ease working with strings, files, command lines, XML and more.</p>
<p>Path to dependency file: /release/maven/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.codehaus.plexus/plexus-utils/2.0.6/3a20c424a712a7c02b02af61dcad5f001b29a9fd/plexus-utils-2.0.6.jar (same path listed 8 times)</p>
<p>
Dependency Hierarchy:
  - data-prepper-plugins-2.4.0-SNAPSHOT (Root Library)
    - opensearch-source-2.4.0-SNAPSHOT
      - maven-artifact-3.0.3.jar
        - :x: **plexus-utils-2.0.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/c2d7767742f1f998fd4d787c44b6d7638e3a34db">c2d7767742f1f998fd4d787c44b6d7638e3a34db</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Security vulnerability found in plexus-utils before 3.0.24. XML injection found in XmlWriterUtil.java.
<p>Publish Date: 2016-05-07
<p>URL: <a href=https://github.com/codehaus-plexus/plexus-utils/commit/f933e5e78dc2637e485447ed821fe14904f110de>WS-2016-7062</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2016-05-07</p>
<p>Fix Resolution: 3.0.24</p>
</p>
</details>
<p></p>
| WS-2016-7062 (Medium) detected in plexus-utils-2.0.6.jar | https://api.github.com/repos/opensearch-project/data-prepper/issues/2816/comments | 0 | 2023-06-03T00:51:31Z | 2023-06-08T20:52:42Z | https://github.com/opensearch-project/data-prepper/issues/2816 | 1,739,004,842 | 2,816 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The class referenced here (https://github.com/opensearch-project/data-prepper/pull/2806/commits/91fabf404797995858ccbcae7413245efa444692#r1213996924) is copy-pasted between the OpenSearch sink and OpenSearch source for SigV4 signing, and it does not contain any unit tests.
**Describe the solution you'd like**
Move this class to the `aws-plugin-api` via an extension point for reuse between the opensearch source and sink, and anywhere else it is needed in the future. Also add unit tests once this class is moved to the `aws-plugin-api`
| Reuse sigv4 signing interceptor between opensearch source and sink, include unit tests | https://api.github.com/repos/opensearch-project/data-prepper/issues/2809/comments | 1 | 2023-06-02T15:53:39Z | 2023-09-05T18:45:21Z | https://github.com/opensearch-project/data-prepper/issues/2809 | 1,738,440,498 | 2,809 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Add a function in the DataPrepper Expressions to check if a string contains another string. This is useful to check if a field in the event has a particular prefix/suffix/substring in it.
**Describe the solution you'd like**
The proposal is to add a `containsSubstring` function to Data Prepper expressions, which can take a string literal or a JSON pointer and check whether one string contains another as its substring.
The function signature is as follows
```
boolean containsSubstring(String string1, String string2)
```
It returns true if `string2` is a substring of `string1`; otherwise it returns false.
It can be used in the following four ways:
```
containsSubstring(/jsonPointer, "stringLiteral")
containsSubstring("stringLiteral", /jsonPointer)
containsSubstring("stringLiteral1", "stringLiteral2")
containsSubstring(/jsonPointer1, /jsonPointer2)
```
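For illustration, here is a hypothetical pipeline snippet (the pipeline name, route name, and `/message` field are made up) showing how the proposed function could drive conditional routing once it exists:

```yaml
log-pipeline:
  source:
    http:
  route:
    # Route only events whose /message field contains "ERROR"
    - error-logs: 'containsSubstring(/message, "ERROR")'
  sink:
    - opensearch:
        hosts: [ "https://localhost:9200" ]
        routes: [ "error-logs" ]
```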
| Add a function in the DataPrepper Expressions to check if a string contains another string | https://api.github.com/repos/opensearch-project/data-prepper/issues/2804/comments | 1 | 2023-06-02T00:30:49Z | 2023-06-05T16:13:30Z | https://github.com/opensearch-project/data-prepper/issues/2804 | 1,737,279,854 | 2,804 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
As a user of the grok processor, I would like to tag the events that time out.
**Describe the solution you'd like**
An example of tagging on match failure already exists (https://github.com/opensearch-project/data-prepper/blob/3d8d01babf1e4822926f260fa24ca95f69ea6d9f/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessor.java#L263).
A tag on timeout could similarly be applied here (https://github.com/opensearch-project/data-prepper/blob/3d8d01babf1e4822926f260fa24ca95f69ea6d9f/data-prepper-plugins/grok-processor/src/main/java/org/opensearch/dataprepper/plugins/processor/grok/GrokProcessor.java#L135).
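A sketch of the proposed configuration (the `tags_on_timeout` option does not exist yet; `timeout_millis` is the existing grok timeout setting):

```yaml
processor:
  - grok:
      match:
        log: [ "%{COMMONAPACHELOG}" ]
      timeout_millis: 1000
      # Proposed option: tags applied when pattern matching times out
      tags_on_timeout: [ "grok_timeout" ]
```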
| Grok tags_on_timeout parameter for timing out on pattern matching | https://api.github.com/repos/opensearch-project/data-prepper/issues/2802/comments | 1 | 2023-06-01T18:35:35Z | 2024-03-12T19:47:12Z | https://github.com/opensearch-project/data-prepper/issues/2802 | 1,736,870,566 | 2,802 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
S3 can send event notifications over Amazon EventBridge. These notifications come in a different format than those sent directly to SQS or via SNS.
**Describe the solution you'd like**
Support detecting the Amazon EventBridge [format](https://docs.aws.amazon.com/AmazonS3/latest/userguide/ev-events.html) and using it.
**Describe alternatives you've considered (Optional)**
Using the same work as #2788. However, there needs to be some difference: for generic Amazon EventBridge, the `s3` source needs to read the `detail-type` property to check that the object was created (along with other similar event types that are important).
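For reference, an abridged sketch of the EventBridge notification shape the source would need to detect (field values are illustrative; see the AWS documentation linked above for the authoritative format):

```json
{
  "version": "0",
  "detail-type": "Object Created",
  "source": "aws.s3",
  "region": "us-east-1",
  "detail": {
    "bucket": { "name": "example-bucket" },
    "object": { "key": "example-key", "size": 5 }
  }
}
```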
**Additional context**
See #2788 for Amazon Security Lake and #2604 for SQS via SNS.
| Support Amazon EventBridge messages from S3 | https://api.github.com/repos/opensearch-project/data-prepper/issues/2789/comments | 1 | 2023-05-31T21:39:07Z | 2023-06-13T22:53:19Z | https://github.com/opensearch-project/data-prepper/issues/2789 | 1,735,068,133 | 2,789 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
[Amazon Security Lake](https://docs.aws.amazon.com/security-lake/latest/userguide/what-is-security-lake.html) provides events over Amazon EventBridge. The format is different from S3 directly to SQS or via SNS as shown in [this sample configuration](https://docs.aws.amazon.com/security-lake/latest/userguide/subscriber-data-access.html#sample-notification).
**Describe the solution you'd like**
Automatically detect events from Amazon Security Lake and read the S3 object accordingly.
**Additional context**
See #2604 for another feature that was able to parse SQS messages which came via SNS.
| Support Amazon Security Lake data events in S3 source with SQS queue | https://api.github.com/repos/opensearch-project/data-prepper/issues/2788/comments | 1 | 2023-05-31T20:56:22Z | 2023-06-13T22:53:20Z | https://github.com/opensearch-project/data-prepper/issues/2788 | 1,735,008,670 | 2,788 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Update RandomCutForest Maven version. The new RCF version has fixes to reduce noise.
**To Reproduce**
In some cases, RCF generates anomalies with a low grade (i.e., noise).
**Expected behavior**
Less or no noise.
| [BUG] Update RandomCutForest Maven version | https://api.github.com/repos/opensearch-project/data-prepper/issues/2783/comments | 0 | 2023-05-30T20:47:34Z | 2023-05-31T19:22:38Z | https://github.com/opensearch-project/data-prepper/issues/2783 | 1,732,989,402 | 2,783 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
In the current state, user credential variables such as username and password are hardcoded in pipelines.yaml. For better security, we could support AWS secrets as an extension plugin, e.g.
data-prepper-config.yaml
```
aws:
credentials:
default:
sts_role_arn: arn:aws:iam::99123456789:role/OsiDataPlaneRole
region: us-east-1
my-custom-role:
sts_role_arn:
region:
secret: # secret extension
my_es_secret_configuration:
name: ...
authentication: my-custom-role
```
Then our pipeline configuration can reference the key-value pairs stored in an AWS secret.
pipelines.yaml
```
...
sink:
- opensearch:
...
username: ${{ aws_secret:my_es_secret_configuration:USERNAME }}
password: ${{ aws_secret:my_es_secret_configuration:PASSWORD }}
```
**Describe the solution you'd like**
We can reuse aws credential extensions when configuring the credentials for AWS secrets.
**Describe alternatives you've considered (Optional)**
Alternatively, we can make AWS secret extension independent of credentials plugin at the price of duplication in configuration:
```
aws:
credentials:
default:
sts_role_arn: arn:aws:iam::99123456789:role/OsiDataPlaneRole
region: us-east-1
my-custom-role:
sts_role_arn:
region:
aws_secrets:
secrets:
my-secret1:
sts_role_arn: arn:aws:iam::99123456789:role/OsiDataPlaneRole
region: us-east-1
```
| Support AWS secrets in configuring Data Prepper pipelines | https://api.github.com/repos/opensearch-project/data-prepper/issues/2780/comments | 3 | 2023-05-30T14:54:04Z | 2024-05-01T15:04:45Z | https://github.com/opensearch-project/data-prepper/issues/2780 | 1,732,443,278 | 2,780 |
[
"opensearch-project",
"data-prepper"
] | Hey guys, I hope you're well!
**Describe the bug**
I need to have more than 1000 fields on my otel indices; this is a must-have for my solution.
I've tried to add it to the data-prepper template_file but it's not working as expected.
I want to have the following command automatically applied to each index created by dataprepper:
```
PUT otel-v1-apm-span-*/_settings
{
"index.mapping.total_fields.limit": 2500
}
```
*References:*
- https://opensearch.org/docs/2.7/data-prepper/pipelines/configuration/sinks/opensearch/
- https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/opensearch/src/main/resources/otel-v1-apm-span-index-template.json
- https://stackoverflow.com/questions/55372330/what-does-limit-of-total-fields-1000-in-index-has-been-exceeded-means-in
**To Reproduce**
*pipelines.yaml*
```
sink:
- opensearch:
hosts: [ "https://node-0.example.com:9200" ]
#index_type: trace-analytics-raw
index_type: custom
template_file: "/usr/share/data-prepper/templates/index-template.json"
cert: "/usr/share/data-prepper/root-ca.pem"
username: "admin"
password: "admin"
socket_timeout: 0
connect_timeout: 0
```
*index-template.json*
```
{
"version": 1,
"settings": {
"index": { "mapping": { "total_fields": { "limit": "2500" } } }
},
"mappings": {
// ...
}
}
```
*GET otel-v1-apm-span-***/_settings*
```
{
"otel-v1-apm-span-": {
"settings": {
"index": {
"number_of_shards": "1",
"provided_name": "otel-v1-apm-span-",
"creation_date": "1685351383685",
"number_of_replicas": "1",
"uuid": "P_ss8jsCRj2kreesYZWPbw",
"version": {
"created": "136277827"
}
}
}
}
}
```
I am required to run the following command for each index created to have my index settings as desired:
```
PUT otel-v1-apm-span-*/_settings
{
"index.mapping.total_fields.limit": 2500
}
```
Result:
```
{
"otel-v1-apm-span-": {
"settings": {
"index": {
"mapping": {
"total_fields": {
"limit": "2500"
}
},
"number_of_shards": "1",
"provided_name": "otel-v1-apm-span-",
"creation_date": "1685351383685",
"number_of_replicas": "1",
"uuid": "P_ss8jsCRj2kreesYZWPbw",
"version": {
"created": "136277827"
}
}
}
}
}
```
**Expected behavior**
Data Prepper should apply the index settings from my template_file, or provide some other way to configure its index settings.
**Screenshots**
My setting is not used by data-prepper when creating the index:

After I applied it manually:

**Environment (please complete the following information):**
- OS: Manjaro Latest and Ubuntu Latest
**Additional context**
No
| [DOCS] How to Increase "index.mapping.total_fields.limit" for new indices | https://api.github.com/repos/opensearch-project/data-prepper/issues/2779/comments | 6 | 2023-05-30T08:38:55Z | 2024-09-04T10:23:36Z | https://github.com/opensearch-project/data-prepper/issues/2779 | 1,731,793,187 | 2,779 |
[
"opensearch-project",
"data-prepper"
Problem Statement:
* Right now, there is no support for sending log information to CloudWatch Logs (CWL), making it difficult for users of the Data Prepper (DP) library who want to take advantage of CWL features such as log monitoring, querying, and alarming. In order to extend the flexibility of DP to reach CWL users, DP would have to extend its sink functionality to reach the CWL service endpoint.
Solution:
* A solution to this would be the addition of a DP sink as a plugin. This plugin would function like any of the other sinks, with its own configuration parameters that can be tuned to fit the use case. It should integrate seamlessly and continue to work with the event structure that DP follows.
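As a rough illustration only (the plugin name and all options below are hypothetical), such a sink might be configured like this:

```yaml
sink:
  - cloudwatch_logs:                   # hypothetical plugin name
      log_group: "/my/app/logs"        # CWL log group to publish to
      log_stream: "dp-ingest"
      aws:
        region: "us-east-1"
        sts_role_arn: "arn:aws:iam::123456789012:role/ExampleCwlRole"
      buffer:
        batch_size: 100                # flush after 100 events
```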
The CWL-Sink should support:
* The handling of AWS credentials for account verification and R/W access.
* The Log-Group endpoint to which DP should publish logs to. (This would be defined in the yaml file)
* The Sink should buffer logs to provide reliable data transfer. (in the meantime should support JSON) | Support CloudWatch Logs as a Sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/2778/comments | 0 | 2023-05-29T20:58:10Z | 2023-06-06T17:30:55Z | https://github.com/opensearch-project/data-prepper/issues/2778 | 1,731,228,391 | 2,778 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Right now, to load files from S3 in `pipelines.yaml`, the pipeline author has to provide AWS credentials in each plugin. This makes the pipeline configuration noisy. Currently we support loading an ISM policy (`ism_policy_file`) and an index template (`template_file`) from S3.
**Describe the solution you'd like**
Leverage the [extensions](https://github.com/opensearch-project/data-prepper/issues/2570) work and create an `s3_configuration` extension in `data-prepper-config.yaml`.
```
aws:
# New configuration
s3_configurations:
authentication: my-custom-role
credentials:
default:
sts_role_arn: arn:aws:iam::99123456789:role/OsiDataPlaneRole
region: us-east-1
my-custom-role:
sts_role_arn:
region:
```
Now, the pipeline author doesn't have to configure credentials in the pipeline and can just use an S3 URI path to load files from S3.
```
log-pipeline:
s3:
aws:
sts_role_arn: arn:aws:iam::123456789012:role/MyRole
region: us-east-1
sink:
- opensearch:
hosts: [""]
ism_policy_file: "s3://filepath"
```
| Add AWS credentials for loading S3 files in pipeline.yaml to data-prepper-config.yaml | https://api.github.com/repos/opensearch-project/data-prepper/issues/2771/comments | 0 | 2023-05-26T17:36:47Z | 2023-05-26T18:01:18Z | https://github.com/opensearch-project/data-prepper/issues/2771 | 1,728,000,735 | 2,771 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Currently, the OpenSearch sink checks whether an exception is retryable before retrying initialization. If the exception is considered non-retryable, initialization is not retried. This is not ideal because some non-retryable errors can be fixed by correcting permissions, the OpenSearch configuration, etc.
**To Reproduce**
If the trust relationship is not properly set on the OpenSearch domain, STS AssumeRole errors occur, which makes the OpenSearch sink initialization non-retryable.
**Expected behavior**
Let the OpenSearch sink initialization code retry in all exception cases.
| [BUG] OpenSearch Sink should continue to try to initialize even after non-retryable exceptions | https://api.github.com/repos/opensearch-project/data-prepper/issues/2769/comments | 0 | 2023-05-26T17:00:12Z | 2023-05-31T19:15:21Z | https://github.com/opensearch-project/data-prepper/issues/2769 | 1,727,957,776 | 2,769 |
[
"opensearch-project",
"data-prepper"
] | null | Update the S3 sink to use AWS Plugin for credentials | https://api.github.com/repos/opensearch-project/data-prepper/issues/2767/comments | 0 | 2023-05-26T16:22:58Z | 2023-06-02T19:30:49Z | https://github.com/opensearch-project/data-prepper/issues/2767 | 1,727,910,686 | 2,767 |
[
"opensearch-project",
"data-prepper"
] | null | Update the S3 source to use AWS Plugin for credentials | https://api.github.com/repos/opensearch-project/data-prepper/issues/2766/comments | 0 | 2023-05-26T16:22:56Z | 2023-06-01T15:07:41Z | https://github.com/opensearch-project/data-prepper/issues/2766 | 1,727,910,632 | 2,766 |
[
"opensearch-project",
"data-prepper"
] | null | Update the OpenSearch sink and Trace Group processor to use AWS Plugin for credentials | https://api.github.com/repos/opensearch-project/data-prepper/issues/2765/comments | 0 | 2023-05-26T16:22:53Z | 2023-06-02T17:49:10Z | https://github.com/opensearch-project/data-prepper/issues/2765 | 1,727,910,573 | 2,765 |
[
"opensearch-project",
"data-prepper"
] | Update existing usage of loading credentials for pipeline components to use the AWS Plugin instead.
## Tasks
- [x] #2765
- [x] #2766
- [x] #2767 | Use the new AWS Plugin for loading AWS credentials | https://api.github.com/repos/opensearch-project/data-prepper/issues/2764/comments | 0 | 2023-05-26T16:21:45Z | 2023-06-02T19:31:11Z | https://github.com/opensearch-project/data-prepper/issues/2764 | 1,727,908,903 | 2,764 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
While uploading one of the revisions, PR #2497 dropped core logic from the tail sampler action.
**To Reproduce**
The basic functionality of allowing events that match the error condition and percent-sampling non-error events is broken.
**Expected behavior**
All events matching the error condition should be allowed.
All events not matching the error condition should go through the probabilistic sampling logic, which allows only some of them through.
| [BUG] Tail Sampler action in Aggregate processor broken | https://api.github.com/repos/opensearch-project/data-prepper/issues/2760/comments | 0 | 2023-05-26T04:02:49Z | 2023-06-07T21:22:37Z | https://github.com/opensearch-project/data-prepper/issues/2760 | 1,726,867,678 | 2,760 |
[
"opensearch-project",
"data-prepper"
] | **Describe the issue**
When running the [data prepper log ingestion demo](https://github.com/opensearch-project/data-prepper/tree/565538323d80643d87781a3d8149279ee5f9dda4/examples/log-ingestion), after starting the data prepper container the documentation states there should be specific log output to confirm data prepper is running correctly:
https://github.com/opensearch-project/data-prepper/blob/565538323d80643d87781a3d8149279ee5f9dda4/examples/log-ingestion/README.md?plain=1#L57-L61
However, there is no additional log output after data prepper is launched. This is the last line of the log:
`2023-05-25T18:37:01,392 [main] INFO org.opensearch.dataprepper.pipeline.server.DataPrepperServer - Data Prepper server running at :4900`
The demo documentation also states there should be similar log output after data prepper successfully processes logs from FluentBit:
https://github.com/opensearch-project/data-prepper/blob/565538323d80643d87781a3d8149279ee5f9dda4/examples/log-ingestion/README.md?plain=1#L118-L122
This also does not display in the log output.
**To Reproduce**
Steps to reproduce the behavior:
1. Follow the [data prepper log ingestion demo](https://github.com/opensearch-project/data-prepper/tree/565538323d80643d87781a3d8149279ee5f9dda4/examples/log-ingestion).
2. View the log output after running the command to the run the data prepper container:
`docker run --name data-prepper -v ${PWD}/log_pipeline.yaml:/usr/share/data-prepper/pipelines/log_pipeline.yaml --network "data-prepper_opensearch-net" opensearchproject/data-prepper:2`
3. Also view the log output after lines have been added to the log file and processed by FluentBit.
**Additional context**
The Data Prepper log output for the process worker was removed after the demo guide was created, because a Data Prepper instance receiving a lot of data would spam the log.
**Proposed change**
Update the log ingestion demo guide documentation to remove the steps that tell the user to expect log output from the data prepper after processing data.
| Data Prepper Log Ingestion Demo Guide documentation needs updating | https://api.github.com/repos/opensearch-project/data-prepper/issues/2756/comments | 0 | 2023-05-25T20:18:22Z | 2023-05-25T21:52:24Z | https://github.com/opensearch-project/data-prepper/issues/2756 | 1,726,436,127 | 2,756 |
[
"opensearch-project",
"data-prepper"
] | null | Support shared AWS credentials across pipeline components | https://api.github.com/repos/opensearch-project/data-prepper/issues/2751/comments | 0 | 2023-05-24T21:29:37Z | 2023-05-31T21:30:49Z | https://github.com/opensearch-project/data-prepper/issues/2751 | 1,724,763,469 | 2,751 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2023-32681 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>requests-2.26.0-py2.py3-none-any.whl</b></summary>
<p>Python HTTP for Humans.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/92/96/144f70b972a9c0eabbd4391ef93ccd49d0f2747f4f6a2a2738e99e5adc65/requests-2.26.0-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/92/96/144f70b972a9c0eabbd4391ef93ccd49d0f2747f4f6a2a2738e99e5adc65/requests-2.26.0-py2.py3-none-any.whl</a></p>
<p>Path to dependency file: /release/smoke-tests/otel-span-exporter/requirements.txt</p>
<p>Path to vulnerable library: /release/smoke-tests/otel-span-exporter/requirements.txt (listed twice)</p>
<p>
Dependency Hierarchy:
- :x: **requests-2.26.0-py2.py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Unintended leak of Proxy-Authorization header in requests from 2.3.0 before 2.31.0. Requests has been vulnerable to potentially leaking Proxy-Authorization headers to destination servers, specifically during redirects to an HTTPS origin. For HTTP connections sent through the proxy, the proxy will identify the header in the request itself and remove it prior to forwarding to the destination server. However when sent over HTTPS, the Proxy-Authorization header must be sent in the CONNECT request as the proxy has no visibility into further tunneled requests. This results in Requests forwarding the header to the destination server unintentionally, allowing a malicious actor to potentially exfiltrate those credentials.
<p>Publish Date: 2023-05-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-32681>CVE-2023-32681</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-j8r2-6x86-q33q">https://github.com/advisories/GHSA-j8r2-6x86-q33q</a></p>
<p>Release Date: 2023-05-11</p>
<p>Fix Resolution: requests -2.31.0</p>
</p>
</details>
<p></p>
| CVE-2023-32681 (Medium) detected in requests-2.26.0-py2.py3-none-any.whl - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/2749/comments | 1 | 2023-05-24T16:04:48Z | 2023-05-24T21:19:29Z | https://github.com/opensearch-project/data-prepper/issues/2749 | 1,724,329,914 | 2,749 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Add support to tag events when `parse_json` fails to parse. This is a sub-issue of issue #629.
**Describe the solution you'd like**
The solution is to add an option to the `parse_json` processor that adds tags on failure. The proposed config is:
```
processor:
- parse_json:
source: "log"
tags_on_failure: ["json_parse_fail", "tag2"]
```
This config would add the specified tags to the event upon parse failure.
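For example, an event whose `log` field is not valid JSON might come out tagged like this (illustrative sketch; the field values are made up):

```json
{
  "log": "{not valid json",
  "tags": [ "json_parse_fail", "tag2" ]
}
```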
**Additional context**
The added tags may be used in the pipeline for conditional routing or conditional processing.
| Add support to tag events when parse_json fails to parse | https://api.github.com/repos/opensearch-project/data-prepper/issues/2744/comments | 0 | 2023-05-24T03:44:43Z | 2023-05-24T21:21:51Z | https://github.com/opensearch-project/data-prepper/issues/2744 | 1,723,126,611 | 2,744 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Nested configurations in a pipeline configuration will not run the validations of the nested class unless the nested field is annotated with `@Valid`.
For example, the `aws` parameter gets validated in the s3 source because it has the `@Valid` annotation (https://github.com/opensearch-project/data-prepper/blob/d0d7ba01d179b60d153ff54865c7924a133a7358/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/S3SourceConfig.java#L42)
While the sqs options do not get validated (https://github.com/opensearch-project/data-prepper/blob/d0d7ba01d179b60d153ff54865c7924a133a7358/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/S3SourceConfig.java#L38)
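For illustration (a sketch, not the exact source), Jakarta Bean Validation only cascades into a nested object when the field carries `@Valid`, so a fix would look roughly like:

```java
public class S3SourceConfig {
    @JsonProperty("sqs")
    @Valid   // without this, constraints inside SqsOptions (e.g. a @NotNull queue URL) never run
    private SqsOptions sqsOptions;

    @JsonProperty("aws")
    @NotNull
    @Valid   // already present today, which is why the aws block is validated
    private AwsAuthenticationOptions awsAuthenticationOptions;
}
```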
**To Reproduce**
Steps to reproduce the behavior:
Run a pipeline configuration like the following and you will get an NPE at runtime instead of a validation error; the validation only fires once the `@Valid` annotation is added.
```
s3-scan-pipeline:
source:
s3:
sqs:
queue_url: null
codec:
newline:
aws:
region: "us-west-2"
sts_role_arn: "arn:aws:iam::870201406020:role/s3-to-osis-role"
sink:
- stdout:
```
**Expected behavior**
All nested configurations should be validated at startup.
| [BUG] Nested configuration validation annotations are not getting triggered | https://api.github.com/repos/opensearch-project/data-prepper/issues/2743/comments | 1 | 2023-05-23T18:38:40Z | 2023-05-24T21:13:15Z | https://github.com/opensearch-project/data-prepper/issues/2743 | 1,722,592,598 | 2,743 |
[
"opensearch-project",
"data-prepper"
] | Currently, Data Prepper has end-user documentation in plugin repos and the docs folder. Much of this information is already in the opensearch.org [documentation Git repo](https://github.com/opensearch-project/documentation-website/tree/main/_data-prepper) and thus is duplicated.
Update all plugin README.md files to have the following structure:
```
# plugin_name
<Brief introduction>
<Link to the opensearch.org documentation>
## Developer guide
<Any specific instructions for developing within this plugin>
```
Also, update the end-user documentation in the `docs` directory to simply redirect users to the opensearch.org documentation.
```
# Original title
This documentation has moved to <link to the opensearch.org documentation>.
```
### Tasks
- [ ] Collect the plugins that need to be updated and list them below.
#### Plugins
- [ ] s3-source
#### Docs directory
| Remove all end-user documentation from Data Prepper repo | https://api.github.com/repos/opensearch-project/data-prepper/issues/2740/comments | 0 | 2023-05-23T16:16:36Z | 2024-04-16T19:58:10Z | https://github.com/opensearch-project/data-prepper/issues/2740 | 1,722,402,249 | 2,740 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Configuring `range` for S3 scan has no effect: the parameter is read but never used.
The `range` value read in https://github.com/opensearch-project/data-prepper/blob/d0d7ba01d179b60d153ff54865c7924a133a7358/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/S3ScanService.java#L74 should be used by the partition supplier (https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/s3-source/src/main/java/org/opensearch/dataprepper/plugins/source/S3ScanPartitionCreationSupplier.java)
**Expected behavior**
The `range` parameter should bound the scan window: from `start_time` forward by `range`, or from `end_time - range` up to `end_time`.
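The expected combination could be sketched as follows (assumed semantics, since the issue does not fully specify how `range` interacts with `start_time` and `end_time`; `scan_window` is an invented helper, not Data Prepper code):

```python
from datetime import datetime, timedelta

def scan_window(start_time=None, end_time=None, scan_range=None):
    """Assumed semantics: `range` extends the window forward from
    start_time, or backward from end_time; otherwise the explicit
    start/end bounds are used as-is."""
    if start_time is not None and scan_range is not None:
        return start_time, start_time + scan_range
    if end_time is not None and scan_range is not None:
        return end_time - scan_range, end_time
    return start_time, end_time

# start_time plus a 7-day range yields the same window as
# end_time minus a 7-day range:
w = scan_window(start_time=datetime(2023, 5, 1), scan_range=timedelta(days=7))
```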
| [BUG] S3 Scan Range Parameter is unused in the code | https://api.github.com/repos/opensearch-project/data-prepper/issues/2739/comments | 1 | 2023-05-23T15:14:09Z | 2023-06-07T16:45:18Z | https://github.com/opensearch-project/data-prepper/issues/2739 | 1,722,288,093 | 2,739 |