Dataset columns:
issue_owner_repo: list (length 2)
issue_body: string (0–262k chars)
issue_title: string (1–1.02k chars)
issue_comments_url: string (53–116 chars)
issue_comments_count: int64 (0–2.49k)
issue_created_at: date (1999-03-17 02:06:42 to 2025-06-23 11:41:49)
issue_updated_at: date (2000-02-10 06:43:57 to 2025-06-23 11:43:00)
issue_html_url: string (34–97 chars)
issue_github_id: int64 (132–3.17B)
issue_number: int64 (1–215k)
[ "opensearch-project", "data-prepper" ]
## CVE-2022-1471 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snakeyaml-1.33.jar</b>, <b>snakeyaml-1.32.jar</b></p></summary> <p> <details><summary><b>snakeyaml-1.33.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p> <p>Path to dependency file: /data-prepper-plugins/otel-trace-group-processor/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.33/2cd0a87ff7df953f810c344bdf2fe3340b954c69/snakeyaml-1.33.jar</p> <p> Dependency Hierarchy: - common-2.3.0-SNAPSHOT (Root Library) - jackson-dataformat-yaml-2.14.2.jar - :x: **snakeyaml-1.33.jar** (Vulnerable Library) </details> <details><summary><b>snakeyaml-1.32.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p> <p>Path to dependency file: /e2e-test/log/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.32/e80612549feb5c9191c498de628c1aa80693cf0b/snakeyaml-1.32.jar</p> <p> Dependency Hierarchy: - opensearch-rest-high-level-client-1.3.8.jar (Root Library) - opensearch-1.3.8.jar - opensearch-x-content-1.3.8.jar - :x: **snakeyaml-1.32.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> SnakeYaml's Constructor() class does not restrict types which can be instantiated during deserialization. Deserializing yaml content provided by an attacker can lead to remote code execution. We recommend using SnakeYaml's SafeConstructor when parsing untrusted content to restrict deserialization. We recommend upgrading to version 2.0 and beyond.
<p>Publish Date: 2022-12-01 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-1471>CVE-2022-1471</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/561/cve-2022-1471-vulnerability-in#comment-64634374">https://bitbucket.org/snakeyaml/snakeyaml/issues/561/cve-2022-1471-vulnerability-in#comment-64634374</a></p> <p>Release Date: 2022-12-01</p> <p>Fix Resolution (org.yaml:snakeyaml): 2.0</p> <p>Direct dependency fix Resolution (org.opensearch.client:opensearch-rest-high-level-client): 2.0.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
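The suggested fix above (upgrade to SnakeYAML 2.0) can be forced on transitive dependencies in a Gradle build; a minimal sketch, assuming the build files named in the report and that the consumers (e.g. jackson-dataformat-yaml) are compatible with SnakeYAML 2.x:

```groovy
// build.gradle sketch: force the patched SnakeYAML everywhere it is
// pulled in transitively. Verify downstream compatibility before adopting.
configurations.all {
    resolutionStrategy {
        force 'org.yaml:snakeyaml:2.0'
    }
}
```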
CVE-2022-1471 (High) detected in snakeyaml-1.33.jar, snakeyaml-1.32.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/2092/comments
1
2022-12-21T01:08:13Z
2023-05-04T15:42:49Z
https://github.com/opensearch-project/data-prepper/issues/2092
1,505,554,931
2,092
[ "opensearch-project", "data-prepper" ]
When using DataPrepper to ingest metrics into OpenSearch, it is possible to crash the metrics ingestion with invalid data. If data points without a value are sent, the null check in the OtelMetricsRawProcessor throws a NullPointerException. This crashes the current ingestion thread. Once all threads crash, no more data ingestion is possible. DataPrepper does not restart failed threads nor does it crash itself. Additional monitoring of the buffers or logs is required to detect this dysfunctional state. Restarting DataPrepper resolves the issue until the next invalid data point is ingested. During restarts all data in the buffers is lost. **To Reproduce** We encountered the issue using the OpenTelemetry Collector to scrape Istio metrics from its Prometheus endpoint. Since the issue only happens occasionally and could not be forced, we provide an alternative approach that generates the invalid metrics directly. Nevertheless, keep in mind that the issue happened in a production setup. Steps to reproduce the behavior: 1. Run DataPrepper with a default metrics pipeline, e.g.: ``` metrics-pipeline: source: otel_metrics_source: ssl: false processor: - otel_metrics_raw_processor: ``` 2. Store the following OTEL metric as JSON in a file `metric.json`. Note that there is no "value" field in the data point. ``` { "resourceMetrics": [ { "resource": { "attributes": [ { "key": "service.name", "value": { "stringValue": "basic-metric-service" } } ], "droppedAttributesCount": 0 }, "scopeMetrics": [ { "metrics": [ { "description": "Example of a Counter", "name": "requests", "gauge": { "dataPoints": [ { "attributes": [], "startTimeUnixNano": 1660736598000000000, "timeUnixNano": 1660736598000001000 } ] }, "unit": "1" } ], "scope": { "name": "example-exporter-collector", "version": "" } } ] } ] } ``` 3. Prepare an otel-collector docker image, e.g. 
`docker pull otel/opentelemetry-collector:0.67.0` and point it to the DataPrepper instance using the following configuration: ``` receivers: otlp: protocols: grpc: http: cors: allowed_origins: - "*" processors: batch: send_batch_size: 50 timeout: 1s exporters: logging: logLevel: debug otlp/metrics: endpoint: "<DataPrepper Endpoint>" tls: insecure_skip_verify: true service: pipelines: metrics: receivers: [otlp] processors: [batch] exporters: [logging, otlp/metrics] ``` 4. Run the Docker image with: ``` docker run -v "${PWD}/otelcol-config-sample.yaml":/otelcol-config-sample.yaml -p 4318:4318 otel/opentelemetry-collector:0.67.0 --config otelcol-config-sample.yaml ``` 5. Send the invalid metric to the otel-collector: ``` curl -X POST -H "Content-Type: application/json" -d @metric.json http://localhost:4318/v1/metrics ``` The OpenTelemetry Collector should send the invalid metric to DataPrepper, causing the NPE (stacktrace from DataPrepper 1.5): ``` 2022-11-23T16:23:08,402 [metrics-pipeline-processor-worker-7-thread-1] ERROR com.amazon.dataprepper.pipeline.ProcessWorker - Encountered exception during pipeline metrics-pipeline processing java.lang.NullPointerException: value cannot be null at com.google.common.base.Preconditions.checkNotNull(Preconditions.java:907) ~[data-prepper.jar:1.5.0] at com.amazon.dataprepper.model.metric.ParameterValidator.lambda$validate$2(ParameterValidator.java:33) ~[data-prepper.jar:1.5.0] at java.util.Collections$SingletonList.forEach(Collections.java:4933) ~[?:?] 
at com.amazon.dataprepper.model.metric.ParameterValidator.validate(ParameterValidator.java:31) ~[data-prepper.jar:1.5.0] at com.amazon.dataprepper.model.metric.JacksonGauge$Builder.build(JacksonGauge.java:80) ~[data-prepper.jar:1.5.0] at com.amazon.dataprepper.plugins.processor.otelmetrics.OTelMetricsRawProcessor.lambda$mapGauge$0(OTelMetricsRawProcessor.java:89) ~[data-prepper.jar:1.5.0] at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?] at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?] at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) ~[?:?] at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) ~[?:?] at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:913) ~[?:?] at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?] at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:578) ~[?:?] at com.amazon.dataprepper.plugins.processor.otelmetrics.OTelMetricsRawProcessor.mapGauge(OTelMetricsRawProcessor.java:91) ~[data-prepper.jar:1.5.0] at com.amazon.dataprepper.plugins.processor.otelmetrics.OTelMetricsRawProcessor.doExecute(OTelMetricsRawProcessor.java:54) ~[data-prepper.jar:1.5.0] at com.amazon.dataprepper.model.processor.AbstractProcessor.lambda$execute$0(AbstractProcessor.java:55) ~[data-prepper.jar:1.5.0] at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:65) ~[data-prepper.jar:1.5.0] at com.amazon.dataprepper.model.processor.AbstractProcessor.execute(AbstractProcessor.java:55) ~[data-prepper.jar:1.5.0] at com.amazon.dataprepper.pipeline.ProcessWorker.run(ProcessWorker.java:62) ~[data-prepper.jar:1.5.0] at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?] at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?] at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) ~[?:?] 
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) ~[?:?] at java.lang.Thread.run(Thread.java:832) ~[?:?] ``` **Expected behavior** DataPrepper should not lose a worker thread to faulty input data. The metric should be dropped, incrementing an appropriate counter in the metrics. Optionally, a log entry can be issued at DEBUG or INFO level. **Environment (please complete the following information):** - OS: [e.g. Ubuntu 20.04 LTS] - DataPrepper versions 1.5 and 2.0 were tested, both with local binaries and the provided Docker images. **Additional context** The issue is caused by declaring the value field as "REQUIRED_NON_NULL_VALUES" in the JacksonGauge: https://github.com/opensearch-project/data-prepper/blob/01228357aa9f5c6f8099887b3c90d15e892c769c/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/metric/JacksonGauge.java#L25-L29. Since the validation exceptions are not caught and handled, the worker thread crashes. It would be nice if a different behaviour on validation errors were implemented, as indicated in the expected behaviour.
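The expected drop-and-count behaviour described above can be sketched without Data Prepper's internals; the class, method, and counter names below are illustrative (not the project's actual API), assuming data points are represented as maps and a missing `"value"` key stands in for the failed validation:

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.atomic.AtomicLong;
import java.util.stream.Collectors;

// Sketch: instead of letting a missing "value" field throw and kill the
// worker thread, drop the offending data point and increment a counter
// that can be surfaced through the pipeline's metrics.
public class GaugeValidationSketch {
    public final AtomicLong droppedDataPoints = new AtomicLong();

    public List<Map<String, Object>> filterValid(List<Map<String, Object>> dataPoints) {
        return dataPoints.stream()
                .filter(dp -> {
                    if (dp.get("value") == null) {
                        droppedDataPoints.incrementAndGet(); // count the drop
                        return false;                        // drop, don't throw
                    }
                    return true;
                })
                .collect(Collectors.toList());
    }
}
```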
[BUG] Invalid OpenTelemetry data can cause ingestion to be stopped
https://api.github.com/repos/opensearch-project/data-prepper/issues/2089/comments
4
2022-12-20T15:12:07Z
2023-01-27T01:03:02Z
https://github.com/opensearch-project/data-prepper/issues/2089
1,504,727,642
2,089
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** It would be nice to have a rate limiting capability in aggregator actions, to be able to limit the number of events while aggregating. This is a sub-issue of issue #2015. **Describe the solution you'd like** Provide a new action `rate_limiter` that would limit the number of events per second, as shown below ``` processor: aggregate: identification_keys: - # ... Identification keys for the metric ... action: rate_limiter: events_per_second: 10 drop_when_exceeds: true ``` The above config would limit the number of events per second to 10 and drop any excess events rejected by the rate limiter. Setting `drop_when_exceeds` to `false` would result in the rate limiter blocking when an event cannot be allowed to pass. **Additional context** This action, when combined with the `when` condition of aggregation, can be used to rate limit events matching a specific condition. Rate limiting is one of the tail sampling policies provided by OTEL - https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/processor/tailsamplingprocessor/README.md
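The `drop_when_exceeds: true` semantics proposed above could be implemented as a simple token bucket; a minimal sketch with illustrative names (not Data Prepper's API), using an injectable nanosecond clock so the behaviour is testable:

```java
import java.util.function.LongSupplier;

// Token-bucket sketch of the proposed rate_limiter action with
// drop_when_exceeds=true: tryAcquire() returning false means "drop".
// A blocking variant (drop_when_exceeds=false) would wait instead.
public class RateLimiterSketch {
    private final long eventsPerSecond;
    private final LongSupplier nanoClock; // injectable for deterministic tests
    private double tokens;
    private long lastRefillNanos;

    public RateLimiterSketch(long eventsPerSecond, LongSupplier nanoClock) {
        this.eventsPerSecond = eventsPerSecond;
        this.nanoClock = nanoClock;
        this.tokens = eventsPerSecond;        // start with a full bucket
        this.lastRefillNanos = nanoClock.getAsLong();
    }

    public synchronized boolean tryAcquire() {
        long now = nanoClock.getAsLong();
        // Refill proportionally to elapsed time, capped at bucket size.
        double refill = (now - lastRefillNanos) / 1_000_000_000.0 * eventsPerSecond;
        tokens = Math.min(eventsPerSecond, tokens + refill);
        lastRefillNanos = now;
        if (tokens >= 1.0) {
            tokens -= 1.0;
            return true;   // event may pass
        }
        return false;      // event rejected (dropped)
    }
}
```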
New aggregate processor action - rate_limiter - to limit the number of events passed through aggregator in a given time period
https://api.github.com/repos/opensearch-project/data-prepper/issues/2088/comments
1
2022-12-20T07:57:52Z
2023-03-02T02:41:16Z
https://github.com/opensearch-project/data-prepper/issues/2088
1,504,135,152
2,088
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** It would be nice to have metrics for ACM and AWS CloudMap related API calls, to monitor success, latency, and errors for those calls, which can be used to create alarms. **Describe the solution you'd like** With the AWS SDK for Java 2.x, you can collect metrics about the service clients in your application, analyze the output in Amazon CloudWatch, and then act on it. https://docs.aws.amazon.com/sdk-for-java/latest/developer-guide/metrics.html AWS SDK v2 doesn't support a custom publisher yet: https://github.com/aws/aws-sdk-java-v2/issues/3311#issuecomment-1194487912 **Describe alternatives you've considered (Optional)** Instrument the API calls explicitly when the calls to ACM and AWS CloudMap are made, using the existing metrics collector.
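The "instrument the API calls explicitly" alternative above can be sketched as a wrapper that records success, error, and latency around each call. The metric sink here is plain counters for illustration; in Data Prepper the real sink would be the existing metrics collector, and all names below are ours:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.atomic.AtomicLong;

// Sketch: wrap an ACM / Cloud Map call and record success, error, and
// cumulative latency. Real code would emit these through the pipeline's
// metrics registry rather than raw AtomicLongs.
public class ApiCallInstrumentation {
    public final AtomicLong successCount = new AtomicLong();
    public final AtomicLong errorCount = new AtomicLong();
    public final AtomicLong totalLatencyNanos = new AtomicLong();

    public <T> T record(Callable<T> apiCall) throws Exception {
        long start = System.nanoTime();
        try {
            T result = apiCall.call();
            successCount.incrementAndGet();
            return result;
        } catch (Exception e) {
            errorCount.incrementAndGet();
            throw e; // propagate after counting
        } finally {
            totalLatencyNanos.addAndGet(System.nanoTime() - start);
        }
    }
}
```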
Emit Metrics for ACM and AWS CloudMap related calls
https://api.github.com/repos/opensearch-project/data-prepper/issues/2087/comments
2
2022-12-20T02:06:33Z
2023-02-07T20:04:14Z
https://github.com/opensearch-project/data-prepper/issues/2087
1,503,878,930
2,087
[ "opensearch-project", "data-prepper" ]
Make use of Java's `ScheduledExecutorService` to read the RSS feed URL at the interval of `pollingFrequency`
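The periodic read described above can be sketched with `ScheduledExecutorService` directly; the class name and the placeholder `Runnable` standing in for the actual RSS fetch are illustrative:

```java
import java.time.Duration;
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

// Sketch: re-run the feed-reading task every pollingFrequency.
// The real source would fetch and parse the RSS document in readFeed.
public class FeedPollerSketch {
    private final ScheduledExecutorService scheduler =
            Executors.newSingleThreadScheduledExecutor();

    public void start(Runnable readFeed, Duration pollingFrequency) {
        // Initial delay 0: read immediately, then at the fixed interval.
        scheduler.scheduleAtFixedRate(
                readFeed, 0, pollingFrequency.toMillis(), TimeUnit.MILLISECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```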
Add Polling to enable reading the feed URL periodically
https://api.github.com/repos/opensearch-project/data-prepper/issues/2081/comments
0
2022-12-16T16:05:55Z
2022-12-16T16:06:51Z
https://github.com/opensearch-project/data-prepper/issues/2081
1,500,488,723
2,081
[ "opensearch-project", "data-prepper" ]
Hello Team, I need some advice to implement a new environment with the winlogbeat agent (7.12.1) sending event logs to OpenSearch through Data Prepper. I used these settings: logging.to_files: true logging.files: path: C:\ProgramData\winlogbeat\Logs logging.level: info output.elasticsearch: hosts: ["localhost:9200"] enabled: false ssl.certificate: "/etc/pki/client/cert.pem" ssl.key: "/etc/pki/client/cert.key" output.logstash: hosts: ["192.168.107.1:7104"] enabled: true ssl.enabled: true The root CA is installed in the OS root CA store, but I have this issue in the logs: 2022-12-15T17:20:25.594+0100 ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: write tcp 192.168.0.66:59140->192.168.107.1:7104: wsasend: An existing connection was forcibly closed by the remote host. 2022-12-15T17:20:25.594+0100 INFO [publisher_pipeline_output] pipeline/output.go:143 Connecting to backoff(async(tcp://192.168.107.1:7104)) 2022-12-15T17:20:25.594+0100 INFO [publisher] pipeline/retry.go:219 retryer: send unwait signal to consumer I also tried to create the index in OpenSearch using winlogbeat.template.json with this method: curl --insecure -s -H 'Content-Type: application/json' -XPUT https://opensearch-node1:9200/_index_template/winlogbeat-7.12.1 -u 'admin:XXXXX' --data-binary "@winlogbeat.template.json"; but I get this error: {"error":{"root_cause":[{"type":"invalid_index_template_exception","reason":"index_template [winlogbeat-8.5.3] invalid, cause [Validation Failed: 1: unknown setting [index.lifecycle.name] please check that any required plugins are installed, or check the breaking changes documentation for removed settings;2: expected [index.lifecycle.name] to be private but it was not;]"}],"type":"invalid_index_template_exception","reason":"index_template [winlogbeat-8.5.3] invalid, cause [Validation Failed: 1: unknown setting [index.lifecycle.name] please check that any required plugins are installed, or check the breaking changes documentation for removed settings;2: expected [index.lifecycle.name] to be private but it was not;]"},"status":400} thanks for your help
[advice] winlogbeat 7.12.1 + dataprepper + opensearch implementation
https://api.github.com/repos/opensearch-project/data-prepper/issues/2086/comments
1
2022-12-16T10:46:09Z
2023-01-09T17:47:38Z
https://github.com/opensearch-project/data-prepper/issues/2086
1,503,784,812
2,086
[ "opensearch-project", "data-prepper" ]
## CVE-2022-41915 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.74.Final.jar</b></p></summary> <p></p> <p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p> <p>Path to dependency file: /data-prepper-main/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.74.Final/73c7bd6341cb59feab6f56200b1e2d908b054fd4/netty-codec-http-4.1.74.Final.jar</p> <p> Dependency Hierarchy: - sts-2.17.264.jar (Root Library) - netty-nio-client-2.17.264.jar - :x: **netty-codec-http-4.1.74.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty project is an event-driven asynchronous network application framework. In versions prior to 4.1.86.Final, when calling `DefaultHttpHeaders.set` with an _iterator_ of values, header value validation was not performed, allowing malicious header values in the iterator to perform HTTP Response Splitting. This issue has been patched in version 4.1.86.Final. Integrators can work around the issue by changing the `DefaultHttpHeaders.set(CharSequence, Iterator<?>)` call into a `remove()` call followed by `add()` in a loop over the iterator of values. <p>Publish Date: 2022-12-13 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41915>CVE-2022-41915</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-12-13</p> <p>Fix Resolution (io.netty:netty-codec-http): 5.0.0.Alpha1</p> <p>Direct dependency fix Resolution (software.amazon.awssdk:sts): 2.17.265</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
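The workaround described above (replace the bulk `set(name, iterator)` with `remove()` plus per-value `add()` so every value is validated) can be sketched against a plain map-of-lists stand-in for Netty's `DefaultHttpHeaders`; the class and validation rule here are illustrative, not Netty's actual implementation:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.Iterator;
import java.util.List;
import java.util.Map;

// Sketch of the remove()+add() workaround shape: add() validates each
// header value (rejecting CR/LF, which is what blocks HTTP response
// splitting), unlike the vulnerable bulk set(name, iterator).
public class HeaderWorkaroundSketch {
    private final Map<String, List<String>> headers = new HashMap<>();

    public void add(String name, String value) {
        if (value.indexOf('\r') >= 0 || value.indexOf('\n') >= 0) {
            throw new IllegalArgumentException("invalid header value: " + value);
        }
        headers.computeIfAbsent(name, k -> new ArrayList<>()).add(value);
    }

    // remove() followed by a validated add() loop, in place of bulk set()
    public void setValidated(String name, Iterator<String> values) {
        headers.remove(name);
        while (values.hasNext()) {
            add(name, values.next());
        }
    }

    public List<String> get(String name) {
        return headers.getOrDefault(name, List.of());
    }
}
```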
CVE-2022-41915 (Medium) detected in netty-codec-http-4.1.74.Final.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/2077/comments
1
2022-12-14T19:23:16Z
2022-12-22T00:41:36Z
https://github.com/opensearch-project/data-prepper/issues/2077
1,497,249,500
2,077
[ "opensearch-project", "data-prepper" ]
## CVE-2022-3510 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>protobuf-java-3.21.1.jar</b>, <b>protobuf-java-3.19.4.jar</b></p></summary> <p> <details><summary><b>protobuf-java-3.21.1.jar</b></p></summary> <p>Core Protocol Buffers library. Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.</p> <p>Library home page: <a href="https://developers.google.com/protocol-buffers/">https://developers.google.com/protocol-buffers/</a></p> <p>Path to dependency file: /data-prepper-plugins/armeria-common/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java/3.21.1/2e396173a5b6ab549d790eba21c1d125bfe92912/protobuf-java-3.21.1.jar</p> <p> Dependency Hierarchy: - armeria-grpc-1.19.0.jar (Root Library) - :x: **protobuf-java-3.21.1.jar** (Vulnerable Library) </details> <details><summary><b>protobuf-java-3.19.4.jar</b></p></summary> <p>Core Protocol Buffers library. 
Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.</p> <p>Library home page: <a href="https://developers.google.com/protocol-buffers/">https://developers.google.com/protocol-buffers/</a></p> <p>Path to dependency file: /data-prepper-plugins/aggregate-processor/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java/3.19.4/748e4e0b9e4fa6b9b1fe65690aa04a9db56cfc4d/protobuf-java-3.19.4.jar</p> <p> Dependency Hierarchy: - opentelemetry-proto-0.16.0-alpha.jar (Root Library) - :x: **protobuf-java-3.19.4.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing issue similar to CVE-2022-3171, but with Message-Type Extensions in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6 and 3.16.3 can lead to a denial of service attack. Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields causes objects to be converted back-n-forth between mutable and immutable forms, resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above. 
<p>Publish Date: 2022-12-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3510>CVE-2022-3510</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-4gg5-vx3j-xwc7">https://github.com/advisories/GHSA-4gg5-vx3j-xwc7</a></p> <p>Release Date: 2022-12-12</p> <p>Fix Resolution: com.google.protobuf:protobuf-java:3.21.7,3.20.3,3.19.6,3.16.3</p> </p> </details> <p></p>
CVE-2022-3510 (High) detected in protobuf-java-3.21.1.jar, protobuf-java-3.19.4.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/2076/comments
1
2022-12-14T19:23:14Z
2022-12-19T18:23:54Z
https://github.com/opensearch-project/data-prepper/issues/2076
1,497,249,459
2,076
[ "opensearch-project", "data-prepper" ]
## CVE-2022-41881 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-haproxy-4.1.74.Final.jar</b></p></summary> <p></p> <p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p> <p>Path to dependency file: /e2e-test/trace/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec-haproxy/4.1.74.Final/2491c304000b2c0b6b47f163f826019a4a35a5f0/netty-codec-haproxy-4.1.74.Final.jar</p> <p> Dependency Hierarchy: - data-prepper-main-2.1.0-SNAPSHOT (Root Library) - data-prepper-core-2.1.0-SNAPSHOT - armeria-1.20.3.jar - :x: **netty-codec-haproxy-4.1.74.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> 
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Netty project is an event-driven asynchronous network application framework. In versions prior to 4.1.86.Final, a StackOverflowError can be raised when parsing a malformed crafted message due to an infinite recursion. This issue is patched in version 4.1.86.Final. There is no workaround, except using a custom HaProxyMessageDecoder. <p>Publish Date: 2022-12-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41881>CVE-2022-41881</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-12-12</p> <p>Fix Resolution: io.netty:netty-codec-haproxy:netty-4.1.86.Final</p> </p> </details> <p></p>
CVE-2022-41881 (High) detected in netty-codec-haproxy-4.1.74.Final.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/2075/comments
1
2022-12-14T17:07:17Z
2022-12-22T00:41:31Z
https://github.com/opensearch-project/data-prepper/issues/2075
1,497,053,787
2,075
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem?** No. **Describe the solution you'd like** I'm working on creating stitched and deduplicated Cisco NetFlow configs, and to make this more efficient we need a UDP input and a flow codec, similar to the Logstash netflow codec: https://www.elastic.co/guide/en/logstash/current/plugins-codecs-netflow.html We will be sharing our config in this repo: https://github.com/Cargill/OpenSIEM-Logstash-Parsing
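For reference on what a flow codec would need to decode, here is a minimal Python sketch that parses the fixed 24-byte NetFlow v5 packet header with the standard library. The field layout follows the published NetFlow v5 format; the sample datagram is fabricated for illustration, and this is not Data Prepper code.

```python
import struct

# NetFlow v5 header: 24 bytes, big-endian.
# version, count, sys_uptime, unix_secs, unix_nsecs,
# flow_sequence, engine_type, engine_id, sampling_interval
NETFLOW_V5_HEADER = struct.Struct("!HHIIIIBBH")

def parse_v5_header(datagram: bytes) -> dict:
    """Decode the NetFlow v5 header from the front of a UDP datagram."""
    fields = NETFLOW_V5_HEADER.unpack_from(datagram, 0)
    return {
        "version": fields[0],
        "count": fields[1],          # number of flow records that follow
        "sys_uptime_ms": fields[2],
        "unix_secs": fields[3],
        "flow_sequence": fields[5],
    }

# Fabricated example datagram: version 5, 2 flow records, sequence 42.
sample = NETFLOW_V5_HEADER.pack(5, 2, 123456, 1670000000, 0, 42, 0, 0, 0)
header = parse_v5_header(sample)
```

A UDP source would hand each received datagram to a codec like this, which would then unpack `count` flow records following the header.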
Add UDP input and flow Codec
https://api.github.com/repos/opensearch-project/data-prepper/issues/2074/comments
2
2022-12-14T13:26:35Z
2023-01-12T22:59:53Z
https://github.com/opensearch-project/data-prepper/issues/2074
1,496,630,238
2,074
[ "opensearch-project", "data-prepper" ]
Convert an RSS Feed Item into a `JacksonDocument`
Convert Feed Item into Event
https://api.github.com/repos/opensearch-project/data-prepper/issues/2072/comments
0
2022-12-13T19:12:53Z
2023-01-24T21:26:25Z
https://github.com/opensearch-project/data-prepper/issues/2072
1,494,995,134
2,072
[ "opensearch-project", "data-prepper" ]
**Describe the bug** It seems that using version 2.0.1 the dynamic index name (with index type custom) is not working when trying to create the index name starting from the value of a field inside a trace event. **To Reproduce** Steps to reproduce the behavior: 1. Data prepper config ``` entry-pipeline: delay: "100" source: otel_trace_source: ssl: false sink: - pipeline: name: "raw-pipeline" raw-pipeline: source: pipeline: name: "entry-pipeline" processor: - otel_trace_raw: sink: - opensearch: hosts: ["https://<ENDPOINT>:443"] aws_sigv4: true aws_region: "eu-west-1" index: "otel-v1-apm-span-${resource.attributes.obs-tenant}-%{yyyy.MM.dd}" index_type: trace-analytics-raw trace_analytics_raw: true ``` 2. My trace data inside Opensearch looks like: ``` [ { "_index": "otel-v1-apm-span-${resource.attributes.obs-tenant}-2022.12.12", "_type": "_doc", "_id": "jTtrCIUB2abFRUifipmm", "_score": 0.6931471, "_source": { "traceId": "00000000000000000000e7a86b9eaece", "droppedLinksCount": 0, "kind": "SPAN_KIND_SERVER", "droppedEventsCount": 0, "traceGroupFields": { "endTime": "2022-12-12T22:09:28.160267Z", "durationInNanos": 108000, "statusCode": 0 }, "traceGroup": "HTTP GET /", "serviceName": "frontend", "parentSpanId": "", "spanId": "0000e7a86b9eaece", "traceState": "", "name": "HTTP GET /", "startTime": "2022-12-12T22:09:28.160159Z", "links": [], "endTime": "2022-12-12T22:09:28.160267Z", "droppedAttributesCount": 0, "durationInNanos": 108000, "events": [], "span.attributes.http@url": "/", "resource.attributes.client-uuid": "63d9e116e98c3b2", "resource.attributes.service@name": "frontend", "span.attributes.component": "net/http", "status.code": 0, "span.attributes.sampler@param": true, "span.attributes.http@method": "GET", "resource.attributes.ip": "10.10.10.10", "resource.attributes.opencensus@exporterversion": "Jaeger-Go-2.30.0", "resource.attributes.obs-tenant": "devsecops", "span.attributes.http@status_code": 200, "span.attributes.sampler@type": "const" } } ] ``` 3. 
The trace is then added to an index whose name still contains the literal placeholder, without dynamically resolving the value of **resource.attributes.obs-tenant** from the input data.
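For clarity, the expected behavior can be sketched in Python: resolve each `${field}` placeholder from the (flattened) event before writing. This is an illustrative stand-in, not Data Prepper's actual index-name resolution, and it omits the `%{yyyy.MM.dd}` date suffix.

```python
import re

def resolve_index(pattern: str, event: dict) -> str:
    """Replace ${field.path} placeholders with values from a flattened event.

    Unresolvable placeholders are left as-is (which is what the bug
    report shows happening for every placeholder)."""
    def lookup(match: re.Match) -> str:
        return str(event.get(match.group(1), match.group(0)))
    return re.sub(r"\$\{([^}]+)\}", lookup, pattern)

event = {"resource.attributes.obs-tenant": "devsecops"}
index = resolve_index("otel-v1-apm-span-${resource.attributes.obs-tenant}", event)
# index == "otel-v1-apm-span-devsecops"
```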
[BUG] Opensearch sink, dynamic index not working
https://api.github.com/repos/opensearch-project/data-prepper/issues/2070/comments
1
2022-12-12T22:22:46Z
2023-01-17T15:39:43Z
https://github.com/opensearch-project/data-prepper/issues/2070
1,492,891,448
2,070
[ "opensearch-project", "data-prepper" ]
**Describe the bug** `json` processor is not available as documented in the [docs](https://opensearch.org/docs/2.0/clients/data-prepper/data-prepper-reference/#json). **To Reproduce** pipelines.yaml: ``` log-pipeline: source: http: ssl: false processor: - json: source: "log" sink: - opensearch: ...... ``` **Expected behavior** The error should not occur, and ideally, my json-formatted logs should be parsed. **Error message** ``` 2022-12-08T18:19:24,398 [main] ERROR org.opensearch.dataprepper.parser.PipelineParser - Construction of pipeline components failed, skipping building of pipeline [log-pipeline] and its connected pipelines org.opensearch.dataprepper.model.plugin.NoPluginFoundException: Unable to find a plugin named 'json'. Please ensure that plugin is annotated with appropriate values. ``` **Environment:** - docker host OS: Ubuntu 20.04 LTS - docker image: opensearchproject/data-prepper:latest
[BUG] The json processor is not available
https://api.github.com/repos/opensearch-project/data-prepper/issues/2066/comments
1
2022-12-08T18:44:26Z
2022-12-14T16:47:34Z
https://github.com/opensearch-project/data-prepper/issues/2066
1,485,210,432
2,066
[ "opensearch-project", "data-prepper" ]
## CVE-2022-23491 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>certifi-2021.10.8-py2.py3-none-any.whl</b></p></summary> <p>Python package for providing Mozilla's CA Bundle.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/37/45/946c02767aabb873146011e665728b680884cd8fe70dde973c640e45b775/certifi-2021.10.8-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/37/45/946c02767aabb873146011e665728b680884cd8fe70dde973c640e45b775/certifi-2021.10.8-py2.py3-none-any.whl</a></p> <p>Path to dependency file: /release/smoke-tests/otel-span-exporter/requirements.txt</p> <p>Path to vulnerable library: /release/smoke-tests/otel-span-exporter/requirements.txt,/release/smoke-tests/otel-span-exporter/requirements.txt</p> <p> Dependency Hierarchy: - :x: **certifi-2021.10.8-py2.py3-none-any.whl** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Certifi is a curated collection of Root Certificates for validating the trustworthiness of SSL certificates while verifying the identity of TLS hosts. Certifi 2022.12.07 removes root certificates from "TrustCor" from the root store. These are in the process of being removed from Mozilla's trust store. TrustCor's root certificates are being removed pursuant to an investigation prompted by media reporting that TrustCor's ownership also operated a business that produced spyware. Conclusions of Mozilla's investigation can be found in the linked google group discussion. 
<p>Publish Date: 2022-12-07 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-23491>CVE-2022-23491</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.cve.org/CVERecord?id=CVE-2022-23491">https://www.cve.org/CVERecord?id=CVE-2022-23491</a></p> <p>Release Date: 2022-12-07</p> <p>Fix Resolution: certifi - 2022.12.07</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
CVE-2022-23491 (High) detected in certifi-2021.10.8-py2.py3-none-any.whl - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/2065/comments
1
2022-12-08T18:12:11Z
2023-02-08T21:59:50Z
https://github.com/opensearch-project/data-prepper/issues/2065
1,485,161,520
2,065
[ "opensearch-project", "data-prepper" ]
## CVE-2022-41854 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.31.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p> <p>Path to dependency file: /data-prepper-plugins/otel-metrics-source/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.31/cf26b7b05fef01e7bec00cb88ab4feeeba743e12/snakeyaml-1.31.jar</p> <p> Dependency Hierarchy: - jackson-dataformat-yaml-2.13.4.jar (Root Library) - :x: **snakeyaml-1.31.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Those using Snakeyaml to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stack overflow. This effect may support a denial of service attack. 
<p>Publish Date: 2022-11-11 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41854>CVE-2022-41854</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/531/">https://bitbucket.org/snakeyaml/snakeyaml/issues/531/</a></p> <p>Release Date: 2022-11-11</p> <p>Fix Resolution (org.yaml:snakeyaml): 1.32</p> <p>Direct dependency fix Resolution (com.fasterxml.jackson.dataformat:jackson-dataformat-yaml): 2.14.0</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
CVE-2022-41854 (Medium) detected in snakeyaml-1.31.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/2060/comments
2
2022-12-01T21:58:58Z
2022-12-15T22:54:05Z
https://github.com/opensearch-project/data-prepper/issues/2060
1,472,009,185
2,060
[ "opensearch-project", "data-prepper" ]
Use the [RSS Reader](https://github.com/w3stling/rssreader) to parse the feed URL items.
Parse RSS feed items
https://api.github.com/repos/opensearch-project/data-prepper/issues/2059/comments
0
2022-12-01T21:27:32Z
2023-01-24T21:26:33Z
https://github.com/opensearch-project/data-prepper/issues/2059
1,471,979,743
2,059
[ "opensearch-project", "data-prepper" ]
Add project setup for RSS Source.
Setup RSS Source boilerplate code
https://api.github.com/repos/opensearch-project/data-prepper/issues/2041/comments
1
2022-11-30T15:50:53Z
2023-01-19T21:37:51Z
https://github.com/opensearch-project/data-prepper/issues/2041
1,469,855,803
2,041
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** It would be nice to have an anomaly detector processor that uses ML anomaly detection algorithms like "random cut forest". Events (or aggregated events) may be passed through this processor to identify any anomalies and generate anomaly events. **Describe the solution you'd like** Provide a new processor that can detect anomalies using the user-configurable method/algorithm ``` processor: anomaly_detector: mode: "random-cut-forest.metrics.v1" keys: ["key1", "key2"] ``` The above configuration will create an anomaly detector processor that uses the `random-cut-forest.metrics.v1` algorithm to detect anomalies in the event's key1 and key2 fields. For example, if the events are HTTP log message events with `client-ip` and `latency` as fields (along with others), values for `client-ip` and `latency` can be passed to the anomaly detector to identify any anomalies in the latencies of HTTP requests/responses from the same source IP. ``` sample-pipeline: source: http: ssl: false port: 2021 processor: - grok: match: message: ['%{IPORHOST:clientip} (?:%{WORD:ident}|-) (%{USER:auth}|-) \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{BASE10NUM:latency} "(?:%{WORD:ref}|-)" %{GREEDYDATA:useragent}'] processor: anomaly_detector: mode: "random-cut-forest.metrics.v1" keys: ["clientip", "latency"] ``` This is a new feature request that will accommodate using any supported ML algorithm to detect anomalies. Initially, we will have support for the random cut forest algorithm only. **Additional context** The output event record when an anomaly is detected may have the following fields for the example above ``` {"latency":10.425959275971174,"clientip":"10.10.10.10","deviation_from_expected":9.88751976547253,"grade":1.0,"confidence":0.9744249290428905} ```
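To illustrate the shape of the proposed output, here is a toy Python sketch that flags values far from the recent mean. This is deliberately *not* the random cut forest algorithm — just a simplified stand-in showing how a per-event deviation and grade could be produced; the history values and threshold are fabricated for illustration.

```python
import statistics

def score_anomaly(history: list, value: float, threshold: float = 3.0) -> dict:
    """Toy stand-in for an anomaly detector: flag values more than
    `threshold` standard deviations from the mean of recent history.
    (Illustrative only -- not random cut forest.)"""
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid division by zero
    deviation = value - mean
    is_anomaly = abs(deviation) > threshold * stdev
    return {
        "latency": value,
        "deviation_from_expected": deviation,
        "grade": 1.0 if is_anomaly else 0.0,
    }

# A 10.4s latency against a ~0.55s baseline should be graded anomalous.
result = score_anomaly([0.5, 0.6, 0.55, 0.52, 0.58], 10.4)
```

A real processor would maintain per-key state (e.g. per `clientip`) and emit records like the one in the example above, with a model-derived `confidence` field as well.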
New data-prepper processor to detect Anomalies
https://api.github.com/repos/opensearch-project/data-prepper/issues/2040/comments
0
2022-11-30T04:39:06Z
2023-01-09T21:40:27Z
https://github.com/opensearch-project/data-prepper/issues/2040
1,469,036,445
2,040
[ "opensearch-project", "data-prepper" ]
**Describe the bug** ![Screen Shot 2022-11-28 at 4 13 19 PM](https://user-images.githubusercontent.com/19492223/204578782-f404f3bf-0eca-4aa9-a6fc-cee079a60939.png) **To Reproduce** Steps to reproduce the behavior: 1. Go to 'examples/dev/trace-analytics-sample-app' and run `docker-compose up -d` 2. Open localhost:8089 in the browser and click on 'buttons' 3. Go to the Trace Analytics Dashboard and wait for a few minutes 4. No data shows up **Expected behavior** Trace data should show up in the dashboard. (Verified that examples/trace-analytics-sample-app, the non-dev sample, still works.) **Environment:** - OS: macOS - Version: 12.6.1
[BUG] examples/dev/trace-analytics-sample-app data-prepper failed to receive trace data
https://api.github.com/repos/opensearch-project/data-prepper/issues/2039/comments
1
2022-11-29T16:00:27Z
2023-01-12T18:05:12Z
https://github.com/opensearch-project/data-prepper/issues/2039
1,468,305,579
2,039
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** At the moment, Data Prepper cannot send data to an OpenSearch data stream. **Describe the solution you'd like** A configuration option for the OpenSearch sink to support data streams.
OpenSearch sink config option for data streams
https://api.github.com/repos/opensearch-project/data-prepper/issues/2038/comments
9
2022-11-29T07:42:52Z
2023-07-25T17:47:23Z
https://github.com/opensearch-project/data-prepper/issues/2038
1,467,600,013
2,038
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Cannot send logs to an OpenSearch data stream **To Reproduce** Steps to reproduce the behavior: 1. create an index template for a data stream in OpenSearch 2. use Data Prepper to send logs with an index matching the index pattern of the data stream 3. get the error "WARN com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink - Document [org.opensearch.client.opensearch.core.bulk.BulkOperation@c4aecd] has failure: java.lang.RuntimeException: only write ops with an op_type of create are allowed in data streams" **Expected behavior** Logs are written to the data stream **Environment (please complete the following information):** - Running in Docker (Docker Desktop 4.10.1) - using the latest version of OpenSearch - using the latest version of Data Prepper **Comment** I didn't find an option to set the op_type.
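The error comes from the bulk API: data streams are append-only and accept only `create` bulk operations, not the default `index` op. A minimal Python sketch of the two NDJSON action lines involved (the `logs-app` index name is made up for illustration):

```python
import json

def bulk_line(op_type: str, index: str, doc: dict) -> str:
    """Build one NDJSON action/document pair for the _bulk API."""
    action = json.dumps({op_type: {"_index": index}})
    return action + "\n" + json.dumps(doc) + "\n"

doc = {"@timestamp": "2022-11-24T08:59:17Z", "message": "hi"}

# A data stream rejects this ("only write ops with an op_type of
# create are allowed in data streams"):
index_op = bulk_line("index", "logs-app", doc)

# ...but accepts this:
create_op = bulk_line("create", "logs-app", doc)
```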
[BUG] Data Prepper can not write to Opensearch Datastream
https://api.github.com/repos/opensearch-project/data-prepper/issues/2037/comments
11
2022-11-24T08:59:17Z
2024-01-08T07:19:14Z
https://github.com/opensearch-project/data-prepper/issues/2037
1,462,988,888
2,037
[ "opensearch-project", "data-prepper" ]
Description: Peer Forwarding processor documentation from [this](https://github.com/opensearch-project/data-prepper/issues/700) page suggests use of the below parameters to enable SSL/TLS ```yaml peer_forwarder: ssl: true certificate: file: certificate_path: /usr/share/my/path/public.cert private_key_path: /usr/share/my/path/private.key authentication: unauthenticated: discovery_mode: "dns" domain_name: "data-prepper-cluster.my-domain.net" ``` The configuration listed on the [docs page](https://github.com/opensearch-project/data-prepper/blob/main/docs/peer_forwarder.md) is different, as given below: ```yaml peer_forwarder: ssl: true ssl_certificate_file: "<cert-file-path>" ssl_key_file: "<private-key-file-path>" ``` But unfortunately, neither of the above configurations works. With the above configs, Data Prepper keeps crashing with the below exception: `java.lang.IllegalArgumentException: ssl is enabled, sslKeyCertChainFile can not be empty or null` Note: The full exception is huge, so I will put it in a comment. So, just to try it out, I provided the crt file in the `sslKeyCertChainFile` parameter from the exception message, and Data Prepper came up and ran. Below is the configuration I am using right now, which works without an issue: ```yaml processor: - peer_forwarder: discovery_mode: "dns" domain_name: "data-prepper-headless" ssl: true sslKeyCertChainFile: "/certs/tls.crt" ``` I believe the implementation was changed at some point but the documentation was not updated. I request someone to please confirm whether what I am saying is correct, so that I can create a PR for the documentation
Core Peer Forwarding documentation is not up to date for TLS configurations
https://api.github.com/repos/opensearch-project/data-prepper/issues/2036/comments
5
2022-11-23T09:26:37Z
2022-12-07T11:30:41Z
https://github.com/opensearch-project/data-prepper/issues/2036
1,461,364,255
2,036
[ "opensearch-project", "data-prepper" ]
**Describe the bug** The S3 source does not process all records from compressed files when using `automatic` compression. This is a repeat of #1568 for `automatic` compression. That fix was never applied to that option.
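For illustration, here is a hedged Python sketch (not Data Prepper's actual implementation) of the kind of handling `automatic` mode needs: detect gzip by its two magic bytes and decompress via the stream, so that every record of a large object is read rather than only the first buffer's worth.

```python
import gzip
import io

GZIP_MAGIC = b"\x1f\x8b"

def open_maybe_compressed(raw: bytes):
    """Detect gzip by its magic bytes and wrap the stream accordingly,
    so all records are read regardless of compression."""
    stream = io.BytesIO(raw)
    if raw[:2] == GZIP_MAGIC:
        return gzip.GzipFile(fileobj=stream)
    return stream

payload = b"line1\nline2\nline3\n"
compressed = gzip.compress(payload)
lines = open_maybe_compressed(compressed).read().splitlines()
```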
[BUG] S3 Source fails to load all records for large compressed logs with automatic compression
https://api.github.com/repos/opensearch-project/data-prepper/issues/2026/comments
0
2022-11-19T18:33:47Z
2022-11-21T17:15:07Z
https://github.com/opensearch-project/data-prepper/issues/2026
1,456,613,749
2,026
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** It can be difficult to see how much data the S3 source is processing. Also, some errors are common enough to warrant their own metrics. **Describe the solution you'd like** Provide additional metrics in the S3 source: * `s3ObjectSizeBytes` - (Distribution Summary) The size (in bytes) of an S3 object being processed. For compressed files, this is the compressed size. * `s3ObjectProcessedBytes` - (Distribution Summary) The total number of bytes processed in the S3 source. For compressed files, this is the uncompressed size. * `s3ObjectsEvents` - (Distribution Summary) The number of events/records from S3 objects. * `s3ObjectsNotFound` - (Counter) The number of Not Found errors. * `s3ObjectsAccessDenied` - (Counter) The number of Access Denied errors.
Provide additional metrics for S3 source
https://api.github.com/repos/opensearch-project/data-prepper/issues/2025/comments
1
2022-11-19T18:29:55Z
2022-11-22T20:25:22Z
https://github.com/opensearch-project/data-prepper/issues/2025
1,456,612,522
2,025
[ "opensearch-project", "data-prepper" ]
## CVE-2022-41917 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opensearch-1.3.5.jar</b></p></summary> <p>OpenSearch subproject :server</p> <p>Path to dependency file: /release/maven/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.opensearch/opensearch/1.3.5/b1c5b9898939fd6b42d6d3bbeda632142f9cef9d/opensearch-1.3.5.jar</p> <p> Dependency Hierarchy: - data-prepper-main-2.1.0-SNAPSHOT (Root Library) - 
data-prepper-plugins-2.1.0-SNAPSHOT - otel-trace-group-processor-2.1.0-SNAPSHOT - opensearch-rest-high-level-client-1.3.5.jar - :x: **opensearch-1.3.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> OpenSearch is a community-driven, open source fork of Elasticsearch and Kibana. OpenSearch allows users to specify a local file when defining text analyzers to process data for text analysis. An issue in the implementation of this feature allows certain specially crafted queries to return a response containing the first line of text from arbitrary files. The list of potentially impacted files is limited to text files with read permissions allowed in the Java Security Manager policy configuration. OpenSearch version 1.3.7 and 2.4.0 contain a fix for this issue. Users are advised to upgrade. There are no known workarounds for this issue. <p>Publish Date: 2022-11-16 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41917>CVE-2022-41917</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/opensearch-project/OpenSearch/security/advisories/GHSA-w3rx-m34v-wrqx">https://github.com/opensearch-project/OpenSearch/security/advisories/GHSA-w3rx-m34v-wrqx</a></p> <p>Release Date: 2022-11-16</p> <p>Fix Resolution: org.opensearch:opensearch:2.4.0,org.opensearch.plugin:analysis-nori:2.4.0,org.opensearch.plugin:analysis-kuromoji:2.4.0,org.opensearch.plugin:analysis-icu-client:2.4.0,org.opensearch.plugin:analysis-common:2.4.0</p> </p> </details> <p></p>
CVE-2022-41917 (Medium) detected in opensearch-1.3.5.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/2022/comments
0
2022-11-17T19:11:03Z
2022-11-30T18:57:41Z
https://github.com/opensearch-project/data-prepper/issues/2022
1453844208
2022
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
Support aggregating only those events which meet a certain condition.

**Describe the solution you'd like**
Provide an `aggregate_when` configuration on the `aggregate` processor.

**Describe alternatives you've considered (Optional)**
* Support `when` on actions, but this requires any action needing this to support a `when` configuration.
* Processor-based when statements. This is not currently available.

**Additional context**
This is implemented by #2018
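The behavior requested above can be sketched in Python. This is a hypothetical illustration, not Data Prepper's implementation: events satisfying the `aggregate_when` predicate are grouped by their identification keys, and all other events bypass the aggregation.

```python
from collections import defaultdict

def aggregate(events, identification_keys, aggregate_when=lambda e: True):
    """Group events by their identification keys, but only those passing
    the aggregate_when predicate; all other events are passed through."""
    groups = defaultdict(list)
    passthrough = []
    for event in events:
        if aggregate_when(event):
            key = tuple(event.get(k) for k in identification_keys)
            groups[key].append(event)
        else:
            passthrough.append(event)
    return groups, passthrough

events = [
    {"clientip": "127.0.0.1", "response": 200},
    {"clientip": "127.0.0.1", "response": 503},
    {"clientip": "10.0.0.2", "response": 200},
]
groups, rest = aggregate(events, ["clientip"],
                         aggregate_when=lambda e: e["response"] == 200)
```

Here only the two `response == 200` events enter the aggregation (one group per client IP); the 503 event is passed through untouched.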
Aggregate only some events
https://api.github.com/repos/opensearch-project/data-prepper/issues/2021/comments
0
2022-11-17T17:32:04Z
2022-11-18T21:31:56Z
https://github.com/opensearch-project/data-prepper/issues/2021
1453722215
2021
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
It would be nice to have an aggregate action `count` in the aggregate processor to count the number of events matching the specified identification_keys. The aggregate action should optionally support a condition under which the count action is done.

**Describe the solution you'd like**
Provide a new action for "count" with an optional condition as shown below:

```
processor:
  aggregate:
    identification_keys:
      - # ... Identification keys for the metric ...
    action:
      count:
        [when: <condition>]
```

Examples

1. `count` aggregate action - to count the messages with unique identification_keys, with an optional condition

Example 1 - Count the number of messages with clientip 127.0.0.1

```
processor:
  aggregate:
    identification_keys:
      - # ... Identification keys for the metric ...
    action:
      count:
        when: '/clientip == "127.0.0.1"'
```

Example 2 - Count the number of messages with response code 503 or 404

```
processor:
  aggregate:
    identification_keys:
      - # ... Identification keys for the metric ...
    action:
      count:
        when: '/response == 503 or /response == 404'
```

Example 3 - Count the number of messages with matching identification keys (no condition specified)

```
processor:
  aggregate:
    identification_keys:
      - # ... Identification keys for the metric ...
    action:
      count:
```

**Additional context**
The following options can be added to this action in the future:
- Make the action delete all other fields (other than identification_keys) from the event by default
- New fields like `start_time` and `end_time` may be added to the event when specified as a config option as follows

```
processor:
  aggregate:
    identification_keys:
      - # ... Identification keys for the metric ...
    action:
      count:
        [when: <condition>]
        [add_keys: ["start_time", "end_time"]]
```
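The proposed `count` action can be sketched in Python. This is a hypothetical illustration of the semantics, not the plugin's implementation: count events per unique identification-key combination, optionally only those matching a `when` condition.

```python
from collections import Counter

def count_action(events, identification_keys, when=None):
    """Count events per unique identification_keys combination,
    optionally restricted to events matching the `when` condition."""
    counts = Counter()
    for event in events:
        if when is not None and not when(event):
            continue
        counts[tuple(event.get(k) for k in identification_keys)] += 1
    return counts

events = [
    {"clientip": "127.0.0.1", "response": 503},
    {"clientip": "127.0.0.1", "response": 404},
    {"clientip": "10.0.0.2", "response": 200},
]
# Analogous to: when: '/response == 503 or /response == 404'
by_ip = count_action(events, ["clientip"],
                     when=lambda e: e["response"] in (503, 404))
```

With this input, only the two error responses from 127.0.0.1 are counted; the 200 response from 10.0.0.2 never enters the counter.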
New Aggregate processor actions
https://api.github.com/repos/opensearch-project/data-prepper/issues/2015/comments
3
2022-11-15T06:10:03Z
2023-01-09T21:41:33Z
https://github.com/opensearch-project/data-prepper/issues/2015
1449225282
2015
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
The S3 source validates that an S3 bucket is owned by the same account as the SQS queue. This can protect against reading from buckets in unknown accounts. This approach uses S3's [bucket ownership verification](https://docs.aws.amazon.com/AmazonS3/latest/userguide/bucket-owner-condition.html).

Pipeline authors can disable this by setting `disable_bucket_ownership_validation` to true. This completely disables the bucket ownership validation. There is no way to validate that buckets are owned by specific accounts other than the SQS queue account.

**Describe the solution you'd like**
Provide two options for bucket validation:
* `bucket_owners` - A simple map of bucket name to expected owner.
* `default_bucket_owner` - A scalar value with an accountId to use for any bucket not in the map above. If specified, this will override the SQS accountId.

```
s3:
  sqs:
    queue_url: "https://sqs.us-east-1.amazonaws.com/000000000000/MyQueue"
  bucket_owners:
    my-bucket-01: 123456789012
    my-bucket-02: 99999999999
  default_bucket_owner: 111111111111
```

In the example above, the S3 source will set an expectation that `my-bucket-01` is owned by `123456789012`. It will expect that `my-bucket-02` is owned by `99999999999`. It would expect that any other bucket (say `my-bucket-03`) is owned by `111111111111`. It will never expect any bucket to be owned by the SQS queue account - `000000000000`.

**Describe alternatives you've considered (Optional)**
The existing functionality allows for skipping validation. So it is possible that no additional functionality is needed. But, then there is no bucket validation. However, the S3 documentation recommends validation: *We recommend using bucket owner condition whenever you perform a supported S3 operation and know the account ID of the expected bucket owner.*

**Additional context**
Original PR adding the current functionality: #1526
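The proposed lookup order can be sketched in Python (a hypothetical helper, not the S3 source's code): an explicit `bucket_owners` entry wins, then `default_bucket_owner`, and only if neither is configured does the SQS queue's account apply. In an actual implementation the resolved account would presumably be passed to S3 as the expected-bucket-owner condition.

```python
def expected_bucket_owner(bucket, bucket_owners,
                          default_bucket_owner=None, sqs_account=None):
    """Resolve which account a bucket must belong to: explicit map entry,
    then the default owner, then the SQS queue's account."""
    if bucket in bucket_owners:
        return bucket_owners[bucket]
    if default_bucket_owner is not None:
        return default_bucket_owner
    return sqs_account

owners = {"my-bucket-01": "123456789012", "my-bucket-02": "99999999999"}
mapped = expected_bucket_owner("my-bucket-01", owners,
                               "111111111111", "000000000000")
fallback = expected_bucket_owner("my-bucket-03", owners,
                                 "111111111111", "000000000000")
```

Matching the example configuration above, `mapped` resolves to the explicit owner and `fallback` to the default owner; the SQS account `000000000000` is never used once a default is set.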
Support defining bucket ownership
https://api.github.com/repos/opensearch-project/data-prepper/issues/2012/comments
2
2022-11-12T15:53:10Z
2023-07-26T19:41:31Z
https://github.com/opensearch-project/data-prepper/issues/2012
1446502582
2012
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
Some pipelines have Event values in one type (e.g. string), but want to convert them to another type (e.g. integer).

**Describe the solution you'd like**
Provide a new convert processor along with the other Mutate Event Processors.

```
processor:
  - convert_entries:
      entries:
        - from_key: "mySource"
          to_key: "myTarget"
          type: integer
```

The default value for `to_key` can be the `from_key`. So this could be simplified in some cases:

```
processor:
  - convert_entries:
      entries:
        - from_key: "http_status"
          type: integer
```

**Additional context**
With conditional routing and expressions this can help pipeline authors perform better comparisons. It also allows for sending data to OpenSearch in a more desirable format.

See #2009 for a grok-based solution for a similar problem.
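The proposed semantics can be sketched in Python (a hypothetical illustration; the type names and default-`to_key` behavior follow the proposal above, not an existing implementation):

```python
_CONVERTERS = {
    "integer": int,
    "decimal": float,
    "string": str,
}

def convert_entries(event, entries):
    """Apply type conversions to an event in place.
    to_key defaults to from_key, as proposed."""
    for entry in entries:
        from_key = entry["from_key"]
        to_key = entry.get("to_key", from_key)
        if from_key in event:
            event[to_key] = _CONVERTERS[entry["type"]](event[from_key])
    return event

# In-place conversion: the string "200" becomes the integer 200.
event = convert_entries({"http_status": "200"},
                        [{"from_key": "http_status", "type": "integer"}])

# With an explicit to_key, the source value is left untouched.
copied = convert_entries({"mySource": "7"},
                         [{"from_key": "mySource", "to_key": "myTarget",
                           "type": "integer"}])
```

After conversion, `/http_status < 500`-style numeric comparisons become meaningful.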
Provide a type conversion / cast processor
https://api.github.com/repos/opensearch-project/data-prepper/issues/2010/comments
4
2022-11-11T15:43:26Z
2022-12-22T21:50:03Z
https://github.com/opensearch-project/data-prepper/issues/2010
1445654519
2010
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
The `grok` processor currently creates all Event values as strings. For example, when grokking on an Apache HTTP log, all `response` values are strings. This prevents a pipeline author from creating conditional routing expressions which perform comparisons such as `/response < 500`.

**Describe the solution you'd like**
The `grok` processor can have two options to help pipeline authors.
1. Manual configuration of pattern types.
2. Automatic conversion of pattern types for pre-defined patterns.

*Manual configuration*

Provide a configuration that allows the `grok` processor to convert specific patterns. This new configuration - `conversions` - would take a map of patterns to destination types. For example:

```
grok:
  conversions:
    INT: integer
    NUMBER: decimal
    MY_CUSTOM_NUMBER: integer
```

*Automatic configuration*

Provide a setting that allows the `grok` processor to automatically convert specific patterns which it has pre-included. The `grok` processor has some default patterns like `INT`. Most pipeline authors probably want these to automatically get the correct type. The `grok` processor can automatically convert these known patterns.

This would be a change of behavior. So, I propose that this configuration be disabled by default, but in a future major version we would enable it. Thus, to use it in Data Prepper 2.0:

```
grok:
  disable_automatic_conversion: false
```

But, perhaps in Data Prepper 3.0, the default value here becomes `false`. So pipeline authors no longer have to specify it.

**Describe alternatives you've considered (Optional)**
Ask pipeline authors to use a casting processor as requested in #2010. The solution using grok can be easier for pipeline authors, especially with an automatic conversion.
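The manual `conversions` map could work roughly as follows. This Python sketch is a hypothetical illustration (it assumes the processor knows which grok pattern produced each captured field, which is not part of any current API):

```python
def apply_conversions(captures, pattern_names, conversions):
    """captures: field -> raw string value from grok;
    pattern_names: field -> name of the grok pattern that matched it;
    conversions: pattern name -> destination type, as in the proposal."""
    casters = {"integer": int, "decimal": float}
    out = {}
    for field, raw in captures.items():
        target_type = conversions.get(pattern_names.get(field), "")
        caster = casters.get(target_type, lambda v: v)  # default: keep string
        out[field] = caster(raw)
    return out

converted = apply_conversions(
    {"response": "503", "duration": "0.25", "verb": "GET"},
    {"response": "INT", "duration": "NUMBER", "verb": "WORD"},
    {"INT": "integer", "NUMBER": "decimal"},
)
```

Fields captured by unmapped patterns (here `WORD`) stay strings, while `INT`/`NUMBER` captures come out as numeric types usable in `/response < 500`-style expressions.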
Extract values from Grok with the correct type
https://api.github.com/repos/opensearch-project/data-prepper/issues/2009/comments
2
2022-11-11T15:37:32Z
2024-01-05T20:05:04Z
https://github.com/opensearch-project/data-prepper/issues/2009
1445646161
2009
[ "opensearch-project", "data-prepper" ]
**Describe the bug**
The 2.0.1 docker image declares two ENV variables in [image](https://hub.docker.com/layers/opensearchproject/data-prepper/2.0.1/images/sha256-3e3f836ba107dd322687da4972a64b3d52e8760980f2e0b39b8bc2e6e72aa7a8?context=explore) layers 16 and 17:

ENV_CONFIG_FILEPATH=/usr/share/data-prepper/config/data-prepper-config.yaml
ENV_PIPELINE_FILEPATH=/usr/share/data-prepper/pipelines/pipelines.yaml

It looks like the engine doesn't consider the values from the ENV variables. It works with the 1.5.1 image and doesn't with 2.0.1. I'm aware that the folder structure changed. My expectation is that it should either work or be documented as part of the breaking changes.

**To Reproduce**
Steps to reproduce the behavior: here is my k8s manifest, which works with 1.5.1 and doesn't with 2.0.1

```
apiVersion: apps/v1
kind: Deployment
metadata:
  labels:
    io.kompose.service: data-prepper
  name: data-prepper
spec:
  replicas: 1
  selector:
    matchLabels:
      io.kompose.service: data-prepper
  template:
    metadata:
      labels:
        k8s-app: tracing
        io.kompose.service: data-prepper
    spec:
      containers:
        - image: opensearchproject/data-prepper:2.0.1
          name: data-prepper
          env:
            - name: ENV_PIPELINE_FILEPATH
              value: /usr/share/data-prepper-config/pipelines.yaml
            - name: ENV_CONFIG_FILEPATH
              value: /usr/share/data-prepper-config/data-prepper-config.yaml
          ports:
            - containerPort: 21890
          volumeMounts:
            - mountPath: /usr/share/data-prepper-config
              name: data-prepper-config
      restartPolicy: Always
      volumes:
        - name: data-prepper-config
          configMap:
            name: otel-configmap
            items:
              - key: data-prepper-pipeline
                path: pipelines.yaml
              - key: data-prepper-config
                path: data-prepper-config.yaml
```

**Expected behavior**
It should work, as my ENV variables point to the right config files. But I get:

```
Caused by: org.opensearch.dataprepper.parser.ParseException: Pipelines configuration file not found at /usr/share/data-prepper/pipelines
	at org.opensearch.dataprepper.parser.PipelineParser.mergePipelineConfigurationFiles(PipelineParser.java:137)
	at org.opensearch.dataprepper.parser.PipelineParser.parseConfiguration(PipelineParser.java:84)
	at org.opensearch.dataprepper.DataPrepper.<init>(DataPrepper.java:60)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:77)
	at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.base/java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:499)
	at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:480)
	at org.springframework.beans.BeanUtils.instantiateClass(BeanUtils.java:211)
```

**Screenshots**
If applicable, add screenshots to help explain your problem.

**Environment (please complete the following information):**
 - OS: [e.g. Ubuntu 20.04 LTS]
 - Version: 2.0.1

**Additional context**
Add any other context about the problem here.
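The behavior the reporter expects can be sketched in Python (a hypothetical helper, not Data Prepper's actual startup code): a set environment variable should override the image's baked-in default path.

```python
import os

def resolve_config_path(env_name, default_path, environ=os.environ):
    """Expected resolution order: use the ENV_* variable when set,
    otherwise fall back to the image's default path."""
    return environ.get(env_name, default_path)

# Simulated container environment from the k8s manifest above.
overridden = resolve_config_path(
    "ENV_PIPELINE_FILEPATH",
    "/usr/share/data-prepper/pipelines/pipelines.yaml",
    {"ENV_PIPELINE_FILEPATH": "/usr/share/data-prepper-config/pipelines.yaml"},
)
# With no override set, the default applies.
default = resolve_config_path(
    "ENV_PIPELINE_FILEPATH",
    "/usr/share/data-prepper/pipelines/pipelines.yaml",
    {},
)
```

The reported bug is that 2.0.1 behaves as if the second case always applies, ignoring the override.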
[BUG] Docker image doesn't honor the ENV variables for pipeline and config files.
https://api.github.com/repos/opensearch-project/data-prepper/issues/2008/comments
1
2022-11-10T17:13:16Z
2024-01-23T20:54:15Z
https://github.com/opensearch-project/data-prepper/issues/2008
1444271256
2008
[ "opensearch-project", "data-prepper" ]
**Describe the bug**
A custom auth plugin created with the DataPrepperPluginConstructor annotation and a PluginMetrics parameter fails to start Data Prepper. Data Prepper fails to start because pipelineName is null and is not set when the plugin is initialized.

```
Caused by: java.lang.IllegalArgumentException: PluginSetting.pipelineName must not be null
	at org.opensearch.dataprepper.metrics.PluginMetrics.fromPluginSetting(PluginMetrics.java:27) ~[data-prepper-api-2.0.0-SNAPSHOT.jar:?]
	at org.opensearch.dataprepper.plugin.PluginArgumentsContext.lambda$new$2(PluginArgumentsContext.java:48) ~[data-prepper-core-2.0.0-SNAPSHOT.jar:?]
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
	at java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:948) ~[?:?]
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:484) ~[?:?]
```

**To Reproduce**
Steps to reproduce the behavior:

Custom auth plugin:

```
@DataPrepperPlugin(name = "my_auth", pluginType = ArmeriaHttpAuthenticationProvider.class)
public class MyAuthProvider implements ArmeriaHttpAuthenticationProvider {

    private final PluginMetrics pluginMetrics;

    @DataPrepperPluginConstructor
    public MyAuthProvider(final PluginMetrics pluginMetrics) {
        this.pluginMetrics = pluginMetrics;
    }

    @Override
    public Optional<Function<? super HttpService, ? extends HttpService>> getAuthenticationDecorator() {
        return Optional.of(createDecorator(pluginMetrics));
    }
}
```

Data Prepper log pipeline:

```
log-pipeline:
  source:
    http:
      authentication:
        my_auth:
```

**Expected behavior**
Data Prepper should start and the custom auth plugin should emit metrics.
[BUG] Custom Auth Plugin created with DataPrepperPluginConstructor annotation and PluginMetrics parameter fails to start data prepper
https://api.github.com/repos/opensearch-project/data-prepper/issues/2007/comments
0
2022-11-10T15:52:32Z
2023-03-02T03:01:31Z
https://github.com/opensearch-project/data-prepper/issues/2007
1444147711
2007
[ "opensearch-project", "data-prepper" ]
## CVE-2022-3509 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>protobuf-java-3.21.1.jar</b>, <b>protobuf-java-3.19.4.jar</b></p></summary> <p> <details><summary><b>protobuf-java-3.21.1.jar</b></p></summary> <p>Core Protocol Buffers library. Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.</p> <p>Library home page: <a href="https://developers.google.com/protocol-buffers/">https://developers.google.com/protocol-buffers/</a></p> <p>Path to dependency file: /data-prepper-plugins/armeria-common/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java/3.21.1/2e396173a5b6ab549d790eba21c1d125bfe92912/protobuf-java-3.21.1.jar</p> <p> Dependency Hierarchy: - armeria-grpc-1.19.0.jar (Root Library) - :x: **protobuf-java-3.21.1.jar** (Vulnerable Library) </details> <details><summary><b>protobuf-java-3.19.4.jar</b></p></summary> <p>Core Protocol Buffers library. 
Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.</p> <p>Library home page: <a href="https://developers.google.com/protocol-buffers/">https://developers.google.com/protocol-buffers/</a></p> <p>Path to dependency file: /data-prepper-plugins/aggregate-processor/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java/3.19.4/748e4e0b9e4fa6b9b1fe65690aa04a9db56cfc4d/protobuf-java-3.19.4.jar</p> <p> Dependency Hierarchy: - opentelemetry-proto-0.16.0-alpha.jar (Root Library) - :x: **protobuf-java-3.19.4.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing issue similar to CVE-2022-3171, but with textformat in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6 and 3.16.3 can lead to a denial of service attack. Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields causes objects to be converted back-n-forth between mutable and immutable forms, resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above. 
<p>Publish Date: 2022-12-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3509>CVE-2022-3509</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-3509">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-3509</a></p> <p>Release Date: 2022-12-12</p> <p>Fix Resolution: com.google.protobuf:protobuf-java:3.16.3,3.19.6,3.20.3,3.21.7</p> </p> </details> <p></p>
CVE-2022-3509 (High) detected in protobuf-java-3.21.1.jar, protobuf-java-3.19.4.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/2006/comments
1
2022-11-09T14:23:36Z
2022-12-19T18:23:52Z
https://github.com/opensearch-project/data-prepper/issues/2006
1442197253
2006
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
Plugin authors should be using the [`pluginConfigurationType`](https://github.com/opensearch-project/data-prepper/blob/0d1a378f38937a928653b9754d9e02d72fe3484d/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/annotations/DataPrepperPlugin.java#L48-L57) and use their own POJO models. This approach should be preferred to using `PluginSettings` for plugin authors.

**Describe the solution you'd like**
I recommend that Data Prepper deprecate the `PluginSettings` class and migrate all existing plugins to use a POJO model for plugin configurations.

Path to completion:
- [ ] Add the `@Deprecated` annotation to `PluginSettings`.
- [x] #5246
- [ ] #4838
- [ ] In a breaking release (perhaps 3.0) move `PluginSettings` to data-prepper-core.
- [ ] Any time after that we can refactor `PluginSettings` within data-prepper-core without this being a breaking change.

**Additional context**
Some existing proposals will work best with use of the POJO model. A couple examples are #656 and #475.
Remove PluginSettings from data-prepper-api
https://api.github.com/repos/opensearch-project/data-prepper/issues/2000/comments
0
2022-11-03T16:50:34Z
2025-02-24T17:48:26Z
https://github.com/opensearch-project/data-prepper/issues/2000
1434977627
2000
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
The Data Prepper event model has support for [metadata](https://github.com/opensearch-project/data-prepper/blob/db8ca416b7467e45a599c7681ce72b607fc6fc9e/data-prepper-api/src/main/java/org/opensearch/dataprepper/model/event/EventMetadata.java#L15-L36) which includes type information.

Data Prepper expression syntax allows for accessing data using JSON Pointer syntax. For example, `/status_code`. However, there is no mechanism to access Event Metadata.

Some pipeline authors would like to perform pipeline routing based on event type. Some others might want to filter based on tags (#629).

**Describe the solution you'd like**
Design and implement a syntax that allows for accessing any EventMetadata.

**Additional context**
The discussion around supporting Event tags offers a proposal for `getTags()` in [this comment](https://github.com/opensearch-project/data-prepper/issues/629#issuecomment-1190938897).
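The distinction between data access and metadata access can be sketched in Python. Everything here is hypothetical — the `metadata:` prefix is an invented placeholder syntax, not the proposal's actual design:

```python
class Event:
    """Minimal stand-in for Data Prepper's Event: JSON-like data
    plus metadata (event type, tags)."""
    def __init__(self, data, event_type, tags=()):
        self.data = data
        self.event_type = event_type
        self.tags = set(tags)

def resolve(event, expression):
    """Toy resolver: '/x/y' walks the event data (JSON Pointer style);
    a hypothetical 'metadata:' prefix reaches into metadata instead."""
    if expression.startswith("metadata:"):
        return getattr(event, expression.split(":", 1)[1])
    node = event.data
    for part in expression.strip("/").split("/"):
        node = node[part]
    return node

e = Event({"status_code": 503}, event_type="LOG", tags=["error"])
```

With some such syntax, routes like "event type is TRACE" or "event is tagged `error`" become expressible alongside existing `/status_code`-style data lookups.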
Support for metadata and types in expressions
https://api.github.com/repos/opensearch-project/data-prepper/issues/1998/comments
2
2022-11-02T23:57:26Z
2023-08-16T22:23:41Z
https://github.com/opensearch-project/data-prepper/issues/1998
1433930392
1998
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
Pipeline authors want to fetch data from Prometheus.

**Describe the solution you'd like**
Create a source plugin which will fetch data from Prometheus and generate Data Prepper Events. This plugin should be able to handle the scale and performance targets set while designing the plugin.

**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Remote write example: https://github.com/open-telemetry/opentelemetry-collector-contrib/blob/main/exporter/prometheusremotewriteexporter/README.md
Ingest data from Prometheus as Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1997/comments
0
2022-11-02T01:48:01Z
2022-11-03T17:12:19Z
https://github.com/opensearch-project/data-prepper/issues/1997
1432341597
1997
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. It would be nice to have [...]

**Describe the solution you'd like**
A clear and concise description of what you want to happen.

**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.

**Additional context**
Add any other context or screenshots about the feature request here.
Ingest data from DynamoDB as Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1996/comments
1
2022-11-02T01:46:48Z
2023-10-06T14:55:25Z
https://github.com/opensearch-project/data-prepper/issues/1996
1432339583
1996
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
Pipeline authors want to read data from desired databases. The plugin needs to provide an interface to read/ingest data with a JDBC interface.

**Describe the solution you'd like**
The interface should also provide the ability to read data and run queries periodically. Every row read will be converted to a Data Prepper event. Columns should be mapped as fields in the event.

I would envision JDBC driver libraries to be provided in the yml configuration by the pipeline author. Users should be able to pass the required configuration for drivers under "jdbc_driver_lib". Additionally, for scheduling a periodic run, a cron-like syntax configuration should be passed in the yml.

The plugin should be able to support SigV4 and accept an awsCredentialsProvider, region, and security parameters in the yml configuration, which should include truststore & keystore configurations - trustStoreLocation, trustStoreType, trustStorePassword, keyStoreLocation, keyStoreType, keyStorePassword.

The plugin should include support for multi-node worker partitioning.

```
source:
  - jdbc:
      jdbc_driver_lib: "jdbc-oracle.jar"
      jdbc_driver: "oracle.jdbc.driver.OracleDriver"
      jdbc_connection_string: "jdbc:oracle://127.0.0.1:8080"
      jdbc_user: "user"
      jdbc_schedule: "* * * 3 *"
      sql_query: "SELECT EMPLOYEE_ID FROM EMPLOYEES WHERE LAST_NAME= :LAST_NAME"
      fetchSize: " "
      awsCredentialsProvider: "com.amazonaws.opensearch.sql.jdbc.shadow.com.amazonaws.auth.AWSCredentialsProvider"
```

**Additional context**
https://github.com/opensearch-project/sql-jdbc
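The row-to-event mapping described above can be sketched in Python. This is a hypothetical illustration using SQLite as a stand-in for a JDBC connection, not the proposed plugin itself:

```python
import sqlite3

def rows_to_events(connection, sql_query):
    """Run the configured sql_query and map every row to an event dict,
    with column names becoming event fields (as proposed above)."""
    cursor = connection.execute(sql_query)
    columns = [desc[0] for desc in cursor.description]
    return [dict(zip(columns, row)) for row in cursor.fetchall()]

# In-memory table standing in for the EMPLOYEES example in the request.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE EMPLOYEES (EMPLOYEE_ID INTEGER, LAST_NAME TEXT)")
conn.execute("INSERT INTO EMPLOYEES VALUES (7, 'Smith')")
events = rows_to_events(conn, "SELECT EMPLOYEE_ID, LAST_NAME FROM EMPLOYEES")
```

A scheduler (the proposed `jdbc_schedule` cron expression) would invoke `rows_to_events` periodically and hand the resulting events to the pipeline.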
Ingest data from ODBC/JDBC datasources as Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1995/comments
1
2022-11-02T01:45:26Z
2023-03-13T17:54:48Z
https://github.com/opensearch-project/data-prepper/issues/1995
1432338736
1995
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.**
Pipeline authors want to get logs out of Amazon CloudWatch to send to either OpenSearch or Prometheus.

**Describe the solution you'd like**
Create a CloudWatch source plugin which retrieves logs from CloudWatch. It should put logs into Data Prepper in the current Metrics model, which is based on the OpenTelemetry specification.

```
source:
  cloudwatch:
    namespace: ES/OpenSearchService
    log_source: logspath
    dimensions:
      - name: DomainName
        value: my-domain
```

As this is a polling source, it should also allow pipeline authors to define a poll interval similar to the S3 source.
- It also needs to have AWS configurations. These can follow similar conventions to other AWS-based plugins. This plugin should be able to handle the scale and performance targets set while designing the plugin.
- Should work for multi-node and use the source coordinator
- Use YACE for the metrics batching framework
  - https://aws.amazon.com/blogs/opensource/monitor-aws-services-used-by-kubernetes-with-prometheus-and-promcat/#:~:text=YACE%E2%80%94or%20%E2%80%9CYet%20another%20CloudWatch,of%20tag%20labels%20to%20metrics
  - https://github.com/nerdswords/yet-another-cloudwatch-exporter

**Describe alternatives you've considered (Optional)**

**Additional context**
https://aws.amazon.com/premiumsupport/knowledge-center/cloudwatch-logs-retrieve-data/
https://docs.aws.amazon.com/AmazonCloudWatchLogs/latest/APIReference/Welcome.html
https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/API_GetMetricData.html

```
source:
  cloudwatch:
    logs:
      - log_group_1
      - log_group_2
    metrics:
      - metrics_1
      - metrics_2
```

Sample Metric Data:

Sample query - `aws cloudwatch get-metric-data --cli-input-json file:///data.json`

data.json would be like below:

```
{
  "MetricDataQueries": [
    {
      "Id": "myRequest",
      "MetricStat": {
        "Metric": {
          "Namespace": "AWS/EBS",
          "MetricName": "VolumeReadBytes",
          "Dimensions": [
            {
              "Name": "VolumeId",
              "Value": "vol-a"
            }
          ]
        },
        "Period": 3600,
        "Stat": "Average",
        "Unit": "Bytes"
      },
      "Label": "myRequestLabel",
      "ReturnData": true
    }
  ],
  "StartTime": "2023-06-01T10:40:0000",
  "EndTime": "2023-06-27T14:12:0000"
}
```

Result:

```
{
  "MetricDataResults": [
    {
      "Id": "myRequest",
      "Label": "myRequestLabel",
      "Timestamps": [
        "2023-06-21T13:40:00+00:00",
        "2023-06-21T12:40:00+00:00",
        "2023-06-21T11:40:00+00:00",
        "2023-06-21T10:40:00+00:00",
        "2023-06-21T09:40:00+00:00",
        "2023-06-21T08:40:00+00:00",
        "2023-06-21T07:40:00+00:00",
        "2023-06-21T06:40:00+00:00",
        "2023-06-21T05:40:00+00:00",
        "2023-06-21T04:40:00+00:00",
        "2023-06-21T03:40:00+00:00",
        "2023-06-21T02:40:00+00:00",
        "2023-06-21T01:40:00+00:00"
      ],
      "Values": [
        0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0,
        5939.2, 341.3333333333333, 2184.5333333333333, 26785848.888888888
      ],
      "StatusCode": "Complete"
    }
  ],
  "Messages": []
}
```
CloudWatch Logs as Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1994/comments
0
2022-11-02T01:36:29Z
2023-06-21T14:23:53Z
https://github.com/opensearch-project/data-prepper/issues/1994
1,432,332,514
1,994
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper logs some potentially sensitive data such as Events or incoming messages. These can be sensitive because they may contain log data from upstream systems which should not be visible in the Data Prepper logs. **Describe the solution you'd like** Data Prepper can make use of SLF4J/Log4J markers. A [marker](https://logging.apache.org/log4j/2.x/manual/markers.html) allows for filtering log events based on the Marker assigned to the log event. Once markers are in place, Data Prepper administrators can configure their Log4j appenders to use a [MarkerFilter](https://logging.apache.org/log4j/2.x/manual/filters.html#MarkerFilter) to determine whether or not to display this information. I'd like to propose that Data Prepper adds two new markers: * `SENSITIVE` - A catch-all for any logging that may be sensitive because it may log input data. * `EVENT` - A specific type of logging for logging Data Prepper Events. This could have a parent marker of `SENSITIVE`. By default, Data Prepper would be configured to filter out all sensitive data. Administrators who want this would then need to opt-in to allowing this input into the logs. **Describe alternatives you've considered (Optional)** Data Prepper could have only one Marker - `SENSITIVE`. But, there could be value in allowing `SENSITIVE` values aside from Events, say for example to better understand input data. Another idea I considered was creating an `EventLogger`. This is similar to `RequestLogger` classes in some other projects. Any class that wants to log Events could call that logger. Then a Data Prepper administrator could turn off logging for the `EventLogger`. However, this will not cover the case of source data before making an Event, or sink data which was already converted from an event. **Additional context** In #1989 the solution was to remove the logging data entirely. But, this data can be helpful for debugging.
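If the proposed markers are adopted, opting out of sensitive logging could be a one-line appender filter. A hedged sketch of a `log4j2.xml` fragment — the appender details are illustrative; only the `SENSITIVE` marker name comes from this proposal. Because Log4j marker matching follows marker parentage, denying `SENSITIVE` should also deny the child `EVENT` marker:

```xml
<Appenders>
  <Console name="STDOUT" target="SYSTEM_OUT">
    <PatternLayout pattern="%d{ISO8601} [%t] %-5p %c - %m%n"/>
    <!-- DENY any log event carrying the SENSITIVE marker (or a child marker
         such as EVENT); everything else passes through unchanged. -->
    <MarkerFilter marker="SENSITIVE" onMatch="DENY" onMismatch="NEUTRAL"/>
  </Console>
</Appenders>
```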
[PROPOSAL] Provide Logging Markers to filter sensitive data and events
https://api.github.com/repos/opensearch-project/data-prepper/issues/1990/comments
0
2022-10-31T19:44:36Z
2023-01-26T17:31:28Z
https://github.com/opensearch-project/data-prepper/issues/1990
1,430,395,146
1,990
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Pipeline authors want to get metrics out of Amazon CloudWatch to send to either OpenSearch or Prometheus.

**Describe the solution you'd like** Create a CloudWatch source plugin which retrieves metrics from CloudWatch. It should put those metrics into Data Prepper in the current Metrics model, which is based on the OpenTelemetry specification.

```
source:
  cloudwatch:
    namespace: ES/OpenSearchService
    metric_name: ShardCount
    dimensions:
      - name: DomainName
        value: my-domain
```

As this is a polling source, it should also allow pipeline authors to define a poll interval similar to the S3 source. It also needs to have AWS configurations. These can follow similar conventions to other AWS-based plugins.

**Describe alternatives you've considered (Optional)** An alternative design would be to put the metrics into the buffer in the CloudWatch format. Then pipeline authors could configure a processor which performs the translation. This approach could be valuable if the translation is non-trivial and requires some additional configurations from pipeline authors. It could also be valuable if the translation is costly and we want to have it happen on data in the buffer.

**Additional context** https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/API_GetMetricStatistics.html
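To illustrate the translation work such a source would do: the CloudWatch data APIs return parallel `Timestamps` and `Values` arrays per metric, which need to be flattened into one record per data point before mapping into the Data Prepper metric model. A self-contained sketch — the `DataPoint` shape is hypothetical; a real implementation would build the actual Data Prepper metric events:

```java
import java.time.Instant;
import java.util.ArrayList;
import java.util.List;

// Hypothetical sketch of flattening a CloudWatch result's parallel
// Timestamps/Values arrays into one record per data point.
public class MetricDataFlattener {
    public record DataPoint(String metricName, Instant timestamp, double value) {}

    public static List<DataPoint> flatten(String metricName, List<Instant> timestamps, List<Double> values) {
        if (timestamps.size() != values.size()) {
            throw new IllegalArgumentException("Timestamps and Values must be parallel arrays");
        }
        List<DataPoint> points = new ArrayList<>();
        for (int i = 0; i < timestamps.size(); i++) {
            points.add(new DataPoint(metricName, timestamps.get(i), values.get(i)));
        }
        return points;
    }
}
```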
Poll CloudWatch for metrics as a source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1987/comments
0
2022-10-30T02:10:04Z
2022-10-30T02:10:04Z
https://github.com/opensearch-project/data-prepper/issues/1987
1,428,554,193
1,987
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Some pipeline authors want to send events from Data Prepper into Kafka. This can allow them to use an existing Kafka stream or send data to other destinations via Kafka.

**Describe the solution you'd like** Create a Kafka sink to send events to Kafka. It should batch these events when sending to Kafka.

```
sink:
  - kafka:
      bootstrap_servers:
        - localhost:9092
      topic: my-topic
```

The configuration should also support many of the [producer configs](https://kafka.apache.org/documentation/#producerconfigs). It should certainly support the configurations which the Kafka client will handle completely and do not require anything from Data Prepper beyond passing the configurations. The Kafka producer client supports sending to multiple topics. If there is value in making topics dynamic, the configuration could support a parameterized topic. For example, `topic: my-topic:${/type}`. For connections, this plugin should use the existing mechanisms in Data Prepper for getting SSL certificates, such as from the file, S3, or ACM.

**Additional context** https://kafka.apache.org/33/javadoc/index.html?org/apache/kafka/clients/producer/KafkaProducer.html
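The parameterized-topic idea (`topic: my-topic:${/type}`) could be implemented as a small placeholder-substitution step before producing each record. A self-contained sketch using a plain `Map` in place of a Data Prepper `Event` — class and method names are hypothetical:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: resolve "${/pointer}" placeholders in a configured
// topic template from the data of the event being produced.
public class TopicResolver {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{/([^}]+)}");

    public static String resolve(String topicTemplate, Map<String, String> eventData) {
        Matcher matcher = PLACEHOLDER.matcher(topicTemplate);
        StringBuilder resolved = new StringBuilder();
        while (matcher.find()) {
            String value = eventData.getOrDefault(matcher.group(1), "unknown");
            matcher.appendReplacement(resolved, Matcher.quoteReplacement(value));
        }
        matcher.appendTail(resolved);
        return resolved.toString();
    }
}
```

With `topic: my-topic:${/type}` and an event whose `type` field is `apache`, the record would go to `my-topic:apache`; a static topic string passes through unchanged.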
Support Kafka as a Sink
https://api.github.com/repos/opensearch-project/data-prepper/issues/1986/comments
5
2022-10-30T01:50:24Z
2025-02-18T07:52:35Z
https://github.com/opensearch-project/data-prepper/issues/1986
1,428,547,890
1,986
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Some users have looked for a way to migrate data from one OpenSearch cluster to another. One of the missing components here would be the ability to retrieve data from OpenSearch. Other users have expressed a need to transform data from OpenSearch clusters. **Describe the solution you'd like** Include an OpenSearch source plugin for Data Prepper. It will need some of the following configurations. * Connection configurations similar to the `opensearch` sink. * Source index or indices. This should also support dynamic indices based on date. * Possibly query options to filter down the data. If not specified, all data in the index would be used. * Schedule configurations for reading data. **Additional context** This should be similar to the `logstash-input-opensearch-plugin` provided in the OpenSearch project. https://opensearch.org/blog/community/2022/05/introducing-logstash-input-opensearch-plugin-for-opensearch/ https://github.com/opensearch-project/logstash-input-opensearch
OpenSearch source plugin
https://api.github.com/repos/opensearch-project/data-prepper/issues/1985/comments
4
2022-10-30T01:06:01Z
2023-10-06T16:31:51Z
https://github.com/opensearch-project/data-prepper/issues/1985
1,428,534,357
1,985
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Pipeline authors often want to enrich Events with data from an existing OpenSearch cluster. This allows authors to enrich events with data from other events which were already sent to OpenSearch.

**Describe the solution you'd like** Provide an OpenSearch enrichment processor. It would take some of the following parameters.

* A query template which can perform queries using parameters from the input Event.
* Document to Event mappings.
* The same connection configuration options as available in the `opensearch` sink.

```
processor:
  - opensearch_enrichment:
      query: "requestId:${/requestId}"
      mappings:
        - from_key: "bytes"
          to_key: "bytes"
      hosts: ["https://localhost:9200"]
      cert: path/to/cert
      username: YOUR_USERNAME_HERE
      password: YOUR_PASSWORD_HERE
```

**Context** This plugin would probably have some similarities to an OpenSearch plugin for Logstash, as proposed in the following issues. https://github.com/opensearch-project/OpenSearch/issues/1976 https://github.com/opensearch-project/opensearch-clients/issues/4
OpenSearch Enrichment Processor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1984/comments
0
2022-10-30T00:58:47Z
2022-10-30T00:58:47Z
https://github.com/opensearch-project/data-prepper/issues/1984
1,428,532,096
1,984
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Our documentation is incorrect on the default values for buffers. See #1906 **Solution** Fix in our READMEs and in the opensearch.org documentation.
Correct documentation for the default buffer values
https://api.github.com/repos/opensearch-project/data-prepper/issues/1983/comments
0
2022-10-27T22:19:55Z
2023-02-04T20:04:58Z
https://github.com/opensearch-project/data-prepper/issues/1983
1,426,329,203
1,983
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper may be able to reduce the amount of data it processes by using [S3 Select](https://docs.aws.amazon.com/AmazonS3/latest/userguide/selecting-content-from-objects.html). **Describe the solution you'd like** Provide an optional feature within the `s3` source that allows for loading objects from S3 using S3 Select. **Describe alternatives you've considered (Optional)** Not supporting this feature. Its main benefit is to reduce data and process some of the data on S3 instead of Data Prepper. **Additional context** N/A
Support S3 Select when loading objects from S3 via the S3 source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1971/comments
0
2022-10-25T14:01:53Z
2023-04-14T20:29:09Z
https://github.com/opensearch-project/data-prepper/issues/1971
1,422,520,407
1,971
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** The current S3 source is notified of changes to buckets via SQS. However, some Data Prepper users would like to be able to load objects from S3 that do not have an SQS notification.

**Describe the solution you'd like** Provide a mechanism within the existing S3 source that scans an S3 bucket for objects and downloads these. The pipeline author should be able to specify a bucket name and an optional list of key prefixes. The S3 source will then scan that bucket for each of these key prefixes. If no key prefixes are provided, then the S3 source scans the whole bucket. After scanning, each object is loaded just like it normally is.

Example configuration:

```
s3:
  notification_type: scan
  scan:
    bucket: my-bucket
    key_prefixes:
      - my/prefix/a
      - my/prefix/b
```

This requires some support for saving state about which keys have been scanned already so that Data Prepper does not reread these.

**Describe alternatives you've considered (Optional)** Pipeline authors can perform an S3 copy to re-generate SQS messages. However, this may take somewhat longer and might incur additional costs.

One approach could be to allow the S3 source to scan multiple buckets. This might look like the following.

Example configuration:

```
s3:
  notification_type: scan
  scan:
    buckets:
      - name: my-bucket-1
        key_prefixes:
          - my/prefix/a
          - my/prefix/b
      - name: my-bucket-2
      - name: my-bucket-3
        key_prefixes:
          - my/prefix/for/this/bucket
```

**Additional context** N/A
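The scan bookkeeping described above (prefix matching plus "don't reread" state) can be sketched in a few lines. This is a hypothetical, in-memory illustration only; a real implementation would persist the state and share it across nodes:

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical sketch: filter a bucket listing down to keys that match a
// configured prefix and have not been processed yet.
public class ScanStateTracker {
    private final Set<String> processedKeys = new HashSet<>();

    public List<String> selectNewKeys(List<String> listedKeys, List<String> keyPrefixes) {
        List<String> selected = new ArrayList<>();
        for (String key : listedKeys) {
            boolean matches = keyPrefixes.isEmpty()
                    || keyPrefixes.stream().anyMatch(key::startsWith);
            // Set.add returns false when the key was already processed.
            if (matches && processedKeys.add(key)) {
                selected.add(key);
            }
        }
        return selected;
    }
}
```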
S3 Scan for S3 Source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1970/comments
2
2022-10-25T13:49:33Z
2023-05-02T15:32:58Z
https://github.com/opensearch-project/data-prepper/issues/1970
1,422,501,110
1,970
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Updating OTel versions may break existing features. Data Prepper's end-to-end tests should be able to test against clients using different OTel versions. **Describe the solution you'd like** Allow the end-to-end tests to run with a configurable OTel client version. Make the GitHub actions run a matrix which includes the version used in `2.0` and the version that we are updating to in #1335. It can be provided as a Gradle argument similar to how we pass in a Java argument. The following shows how Data Prepper currently tests multiple Java clients in the end-to-end tests. https://github.com/opensearch-project/data-prepper/blob/43b7d33c28cb8572a92c397d59e90a3d9aba6ebb/.github/workflows/data-prepper-log-analytics-basic-grok-e2e-tests.yml#L30 **Describe alternatives you've considered (Optional)** Update the OTel version in Data Prepper only at major versions. But, this will hinder progress. **Additional context** The work done in #1335 is updating the OTel version. It is passing, but it uses the updated OTel version as the same OTel version in Data Prepper. It is possible that this is a breaking change and it needs to be verified.
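A hedged sketch of what the matrix could look like in such a workflow — the OTel version strings, the `otelVersion` Gradle property, and the task name are illustrative assumptions, not existing build options:

```yaml
strategy:
  matrix:
    java: [11]
    # hypothetical: the 2.0 client version and the version from #1335
    otel-version: [0.9.0-alpha, 0.16.0-alpha]
steps:
  - name: Run basic-grok end-to-end tests
    run: ./gradlew -PotelVersion=${{ matrix.otel-version }} basicLogEndToEndTest
```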
Test against multiple OTel version
https://api.github.com/repos/opensearch-project/data-prepper/issues/1963/comments
1
2022-10-21T14:09:34Z
2023-01-23T21:08:21Z
https://github.com/opensearch-project/data-prepper/issues/1963
1,418,402,129
1,963
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Many pipelines receive sensitive data that needs to be removed or obfuscated. Data Prepper pipeline authors can do this somewhat manually with some of the existing processors like `remove_field`. However, the data can exist in multiple fields so this can be tedious.

**Describe the solution you'd like** I'd like a processor that takes as input a list of fields which contain sensitive data. It uses the values from those fields to obfuscate or remove data from other fields.

```
processor:
  grok:
    match:
      log: ["%{NOTSPACE:some_field} %{NOTSPACE:name} %{NOTSPACE:email_address} %{GREEDYDATA:more_data}"]
  obfuscate:
    source_fields:
      - name
      - email_address
```

Give it a log like the following.

```
"start_login bob bob@example.org the rest of the message includes bob and an email bob@example.org"
```

Grok will update the event to look like the following.

```
log: "start_login bob bob@example.org the rest of the message includes bob and an email bob@example.org"
some_field: "start_login"
name: "bob"
email_address: "bob@example.org"
more_data: "the rest of the message includes bob and an email bob@example.org"
```

Then this event goes into the proposed `obfuscate` processor. It pulls the strings `bob` and `bob@example.org`. Then it looks for them in all fields, replacing those values. Thus, the output Event would look like the following.

```
log: "start_login *** *** the rest of the message includes *** and an email ***"
some_field: "start_login"
more_data: "the rest of the message includes *** and an email ***"
```

This processor could have some options.

* `obfuscation_character` - `*` by default
* `obfuscation_length` - `3` by default
* `unobfuscated_length` - Leave some characters in place. For example, if this value were 8, then we'd get `***mple.org` from `example.org`. Default is `0`.
* `retain_source_fields` - If set to `true`, it would keep the `name` and `email_address` values, unobfuscated. By default it is `false`.

This processor could also have special substitution rules to mask data based on characters. For example, `bob@example.org` could be made to `***@*******.***`. Perhaps combined with other rules we could even generate values such as `***@*******.org`.

**Describe alternatives you've considered (Optional)** Using existing processors for find/replace.
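The replacement behavior proposed here can be sketched in a few lines. A self-contained illustration of the `obfuscation_character`, `obfuscation_length`, and `unobfuscated_length` options — not an actual processor implementation. Note that longer values should be replaced before shorter ones, so that `bob@example.org` is not broken by replacing `bob` first:

```java
import java.util.List;

// Hypothetical sketch: replace each sensitive value everywhere it appears,
// optionally keeping a trailing suffix of the value visible.
public class Obfuscator {
    public static String obfuscate(String text, List<String> sensitiveValues,
                                   char obfuscationCharacter, int obfuscationLength,
                                   int unobfuscatedLength) {
        String result = text;
        // Caller should order sensitiveValues longest-first (e.g. the email
        // before the name) so substrings are not masked prematurely.
        for (String value : sensitiveValues) {
            int keep = Math.min(unobfuscatedLength, value.length());
            String suffix = value.substring(value.length() - keep);
            String mask = String.valueOf(obfuscationCharacter).repeat(obfuscationLength) + suffix;
            result = result.replace(value, mask);
        }
        return result;
    }
}
```

With the defaults (`*`, length 3, keep 0) this turns `start_login bob bob@example.org` into `start_login *** ***`, and with `unobfuscated_length: 8` it turns `example.org` into `***mple.org`, matching the examples above.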
[PROPOSAL] Obfuscate processor
https://api.github.com/repos/opensearch-project/data-prepper/issues/1952/comments
8
2022-10-20T17:56:17Z
2023-05-26T22:41:19Z
https://github.com/opensearch-project/data-prepper/issues/1952
1,417,034,582
1,952
[ "opensearch-project", "data-prepper" ]
**Describe the bug** The default buffer used in Data Prepper is the [BlockingBuffer](https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/blocking-buffer). The plugin name is `bounded_blocking` and yet metrics appear as `<pipeline>.BlockingBuffer.recordsInBuffer.count` **Expected behavior** I would expect the naming to be consistent with plugin name and the metrics that are published. **Additional context** Relevant link to the issue: https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/blocking-buffer/src/main/java/org/opensearch/dataprepper/plugins/buffer/blockingbuffer/BlockingBuffer.java#L69
[BUG] bounded_blocking has a different metric name
https://api.github.com/repos/opensearch-project/data-prepper/issues/1947/comments
2
2022-10-19T19:19:49Z
2022-11-17T23:59:30Z
https://github.com/opensearch-project/data-prepper/issues/1947
1,415,468,240
1,947
[ "opensearch-project", "data-prepper" ]
Hello, I am using AWS OpenSearch as a sink to a data-prepper pipeline and currently using the master user and password to connect to the OpenSearch cluster. I plan to create a dedicated user and role in OpenSearch for data-prepper. What are the required OpenSearch permissions for data-prepper that I should assign to the role? The use cases are trace analytics and metrics via OpenTelemetry Collector => data-prepper => OpenSearch. Thank you.
OpenSearch permissions required for data-prepper
https://api.github.com/repos/opensearch-project/data-prepper/issues/1941/comments
1
2022-10-19T14:14:38Z
2022-10-20T20:23:31Z
https://github.com/opensearch-project/data-prepper/issues/1941
1,415,056,130
1,941
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Trying to run the trace-analytics-sample-app. I tried using the latest Docker image from Docker Hub as well as building it from scratch. The example executes "data-prepper-wait-for-opensearch-and-start.sh" on start and waits for OpenSearch to start. The first error that shows up is "curl not found". I fixed this error by building the image from scratch and adding curl to the Dockerfile. Once I resolved the "curl not found" error, I am getting: data-prepper | Error: Unable to access jarfile data-prepper.jar **To Reproduce** Steps to reproduce the behavior: 1. Go to the 'trace-analytics-sample-app' directory located under the examples subdirectory 2. Run 'docker-compose up' 3. See error **Expected behavior** The example should run without any error **Screenshots** If applicable, add screenshots to help explain your problem. ![image](https://user-images.githubusercontent.com/4293310/196613236-a109f4f9-d290-441b-b8e9-88aace2471a2.png) **Environment (please complete the following information):** - OS: [e.g. Ubuntu 20.04 LTS] - Version [e.g. 22] **Additional context** Add any other context about the problem here.
[BUG]Unable to access jarfile data-prepper.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/1940/comments
6
2022-10-19T06:27:03Z
2023-11-29T21:04:36Z
https://github.com/opensearch-project/data-prepper/issues/1940
1,414,373,988
1,940
[ "opensearch-project", "data-prepper" ]
Hello, what are the default values of the buffer_size and batch_size settings if the bounded_blocking section is omitted? Thanks.
[question] dataprepper buffer
https://api.github.com/repos/opensearch-project/data-prepper/issues/1931/comments
3
2022-10-18T15:05:47Z
2022-10-27T22:20:23Z
https://github.com/opensearch-project/data-prepper/issues/1931
1,413,395,344
1,931
[ "opensearch-project", "data-prepper" ]
**Describe the bug**
VM1 (192.168.12.12:2021): Docker: data-prepper running.
VM2: Fluent Bit docker:

```
/fluent-bit/bin/fluent-bit -i cpu -t cpu -o http -p Host=192.168.12.12 -p Port=2021 -m * -p URI /data-prepper/log/ingest -p Format json
```

See error in fluent-bit log:

```
[2022/10/17 09:52:13] [ warn] [engine] failed to flush chunk '1-1666000332.243777239.flb', retry in 9 seconds: task_id=5, input=cpu.0 > output=http.0 (out_id=0)
[2022/10/17 09:52:14] [error] [output:http:http.0] 192.168.12.12:2021, HTTP status=404 Status: 404 Description: Not Found
```

Also, what is the URI in this example? https://github.com/opensearch-project/data-prepper/blob/main/docs/log_analytics.md (`URI /log/ingest`, `Format json`)

Data Prepper works with 'Host = localhost', but is not working from remote. Telnet is able to connect to ip:2021.
[BUG] fluent-bit docker can't connect to data-prepper http 404
https://api.github.com/repos/opensearch-project/data-prepper/issues/1930/comments
7
2022-10-18T02:10:13Z
2023-01-12T17:27:07Z
https://github.com/opensearch-project/data-prepper/issues/1930
1,412,463,905
1,930
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** The current Data Prepper Authentication implementation of http basic auth requires the username and password to be provided as clear text in the Data Prepper pipeline configuration file. Data Prepper doesn't support use of certificate or docker secrets to set the login/password. It would be nice to read the username/password from some secret manager or other medium instead of having it as plain text in the configuration file. **Describe the solution you'd like** Create implementation of [ArmeriaHttpAuthenticationProvider](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/armeria-common/src/main/java/org/opensearch/dataprepper/armeria/authentication/ArmeriaHttpAuthenticationProvider.java) to read the http basic username and password from a given type of secret manager such as docker secret. **Describe alternatives you've considered (Optional)** * Support reading the credentials from certificate. * Support reading the credentials from secret manager such as [AWS Secret Manager](https://aws.amazon.com/secrets-manager/). **Additional context** Related data prepper opensearch [forum post](https://forum.opensearch.org/t/openserach-how-to-secure-sign-in-best-practices/11208).
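The Docker-secret alternative above amounts to reading the credential from a mounted file when one is configured, and falling back to the plain-text configuration value otherwise. A self-contained, hypothetical sketch of that resolution step (a real implementation would plug into the `ArmeriaHttpAuthenticationProvider` plugin):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Hypothetical sketch: resolve a credential from a mounted secret file
// (e.g. a Docker secret under /run/secrets) or fall back to the
// plain-text value from the pipeline configuration.
public class CredentialResolver {
    public static String resolve(String plainTextValue, Path secretFile) throws IOException {
        if (secretFile != null && Files.exists(secretFile)) {
            // Secret files commonly end with a trailing newline; strip it.
            return Files.readString(secretFile).trim();
        }
        return plainTextValue;
    }
}
```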
Read http basic auth username and password from secret manager
https://api.github.com/repos/opensearch-project/data-prepper/issues/1929/comments
2
2022-10-17T23:26:03Z
2022-11-03T17:10:49Z
https://github.com/opensearch-project/data-prepper/issues/1929
1,412,362,876
1,929
[ "opensearch-project", "data-prepper" ]
**Describe the bug** We have set up a tracing system using OTel, Data Prepper, and AWS OpenSearch. We are able to get traces for different services, including traces which interact with each other, but services are not listed in the service tab and the service map is always empty. Also, in the Data Prepper logs we can see that some data is being processed in the service-map pipeline; however, the service-map index has 0 documents in it. Screenshots of the logs are attached. Please provide suggestions on this. **Expected behavior** Services should be listed in the service tab and the service map should appear. **Screenshots** ![image](https://user-images.githubusercontent.com/53902890/196162168-e3a634ae-ce79-4089-bc2f-0ff10f9fead5.png) ![image](https://user-images.githubusercontent.com/53902890/196162260-28fa6987-000e-4472-83f3-2ca081a0ce55.png) ![image](https://user-images.githubusercontent.com/53902890/196162071-4120b25a-1c04-447c-a499-3c866375013a.png) ![image](https://user-images.githubusercontent.com/53902890/196162849-77a456e5-c529-4012-8617-67c3cbb877ed.png) **Environment (please complete the following information):** - OS: [e.g. Ubuntu 20.04 LTS] - Version [e.g. 22] **Additional context** Add any other context about the problem here.
[BUG] Service map is empty, can not see services listing in Trace analytics
https://api.github.com/repos/opensearch-project/data-prepper/issues/1927/comments
6
2022-10-17T11:12:55Z
2024-11-05T20:52:55Z
https://github.com/opensearch-project/data-prepper/issues/1927
1,411,386,116
1,927
[ "opensearch-project", "data-prepper" ]
**Describe the bug**
The Data Prepper S3 source fails to process `s3:TestEvent` objects. These remain in the SQS queue and are reprocessed over and over again.

**To Reproduce**
Configure the S3 bucket to send notifications to SQS. It will create an SQS record for `s3:TestEvent`. Create an S3 source pipeline in Data Prepper. Run it.

**Expected behavior**
Data Prepper should recognize the `s3:TestEvent` and remove it from the SQS queue. It should also not add this to the buffer. Ideally it would still log the fact that it received it, since this can be useful for whomever is running Data Prepper to check that the SQS queue reads work.

**Actual Output**

```
2022-10-15T17:09:11,275 [Thread-1] ERROR org.opensearch.dataprepper.plugins.source.SqsWorker - Invalid JSON string in message body of bf803036-5fd3-4857-9db2-d608be440d8d
com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException: Unrecognized field "Service" (class org.opensearch.dataprepper.plugins.source.S3EventNotification), not marked as ignorable (one known property: "Records"])
 at [Source: (String)"{"Service":"Amazon S3","Event":"s3:TestEvent","Time":"2022-10-15T16:36:25.510Z","Bucket":"***","RequestId":"***","HostId":"***"}"; line: 1, column: 233] (through reference chain: org.opensearch.dataprepper.plugins.source.S3EventNotification["Service"])
	at com.fasterxml.jackson.databind.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:61) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.DeserializationContext.handleUnknownProperty(DeserializationContext.java:1127) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:2023) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperty(BeanDeserializerBase.java:1700) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.handleUnknownProperties(BeanDeserializerBase.java:1650) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:540) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1405) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:352) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:185) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:323) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4674) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3629) ~[jackson-databind-2.13.4.jar:2.13.4]
	at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3597) ~[jackson-databind-2.13.4.jar:2.13.4]
	at org.opensearch.dataprepper.plugins.source.S3EventNotification.parseJson(S3EventNotification.java:58) ~[s3-source-2.1.0-SNAPSHOT.jar:?]
	at org.opensearch.dataprepper.plugins.source.SqsWorker.convertS3EventMessages(SqsWorker.java:137) ~[s3-source-2.1.0-SNAPSHOT.jar:?]
	at java.util.stream.Collectors.lambda$uniqKeysMapAccumulator$1(Collectors.java:180) ~[?:?]
	at java.util.stream.ReduceOps$3ReducingSink.accept(ReduceOps.java:169) ~[?:?]
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1625) ~[?:?]
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509) ~[?:?]
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?]
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921) ~[?:?]
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682) ~[?:?]
	at org.opensearch.dataprepper.plugins.source.SqsWorker.getS3MessageEventNotificationRecordMap(SqsWorker.java:132) ~[s3-source-2.1.0-SNAPSHOT.jar:?]
	at org.opensearch.dataprepper.plugins.source.SqsWorker.processSqsMessages(SqsWorker.java:99) ~[s3-source-2.1.0-SNAPSHOT.jar:?]
	at org.opensearch.dataprepper.plugins.source.SqsWorker.run(SqsWorker.java:76) ~[s3-source-2.1.0-SNAPSHOT.jar:?]
	at java.lang.Thread.run(Thread.java:833) ~[?:?]
```

**Environment (please complete the following information):**
Data Prepper 2.1.0-SNAPSHOT from `main` build.

**Additional context**
Connecting an S3 Event Subscription to SQS will create this test message.
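A hedged sketch of the kind of guard `SqsWorker` could apply before parsing a message body as an S3 event notification — a real fix would parse the JSON properly (e.g. with Jackson) rather than string matching, and would then delete the message from the queue:

```java
// Hypothetical sketch: detect the s3:TestEvent shape so the message can be
// treated as handled (and deleted) instead of failing parsing forever.
public class TestEventDetector {
    public static boolean isTestEvent(String messageBody) {
        // A TestEvent body has no "Records" array and carries Event=s3:TestEvent.
        return messageBody.contains("\"Event\"") && messageBody.contains("s3:TestEvent");
    }
}
```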
[BUG] S3 TestEvent stuck in SQS queue
https://api.github.com/repos/opensearch-project/data-prepper/issues/1924/comments
3
2022-10-15T17:15:06Z
2022-10-24T21:32:57Z
https://github.com/opensearch-project/data-prepper/issues/1924
1,410,232,517
1,924
[ "opensearch-project", "data-prepper" ]
**Describe the bug**
The Data Prepper S3 source is unable to read from keys with spaces in them.

**To Reproduce**
Simple pipeline:

```
log-pipeline:
  source:
    s3:
      notification_type: "sqs"
      codec:
        newline:
      sqs:
        queue_url: "https://sqs.us-east-1.amazonaws.com/123456789012/MyPipeline"
      aws:
        region: "us-east-1"
  processor:
  sink:
    - stdout:
```

Here is a sample file:

```
cat "has some spaces.log"
This file has spaces. Two to be precise.
```

Upload that file:

```
aws s3 cp "has some spaces.log" s3://s3-source-manual-test/
```

**Expected behavior**
The S3 object should be read just like the one without spaces.

**Actual Outputs**

```
2022-10-15T16:56:08,659 [Thread-1] WARN org.opensearch.dataprepper.plugins.source.SqsWorker - Unable to process S3Object: s3ObjectReference=[bucketName=***, key=has+some+spaces.log]. software.amazon.awssdk.services.s3.model.NoSuchKeyException: The specified key does not exist. (Service: S3, Status Code: 404, Request ID: ***, Extended Request ID: ***)
```

**Environment (please complete the following information):**
Data Prepper Docker build from `main`.

```
docker run -p 4900:4900 -p 2021:2021 \
  -v ${PWD}/pipelines.yaml:/usr/share/data-prepper/pipelines/pipelines.yaml \
  -v ${HOME}/.aws:/root/.aws \
  opensearch-data-prepper:2.1.0-SNAPSHOT
```

Files uploaded via macOS using the AWS CLI.

**Additional context**
The following values come from the S3 Console:
* S3 URI: `s3://***/has some spaces.log`
* Object URL: `https://***.s3.amazonaws.com/has+some+spaces.log`
* ARN: `arn:aws:s3:::***/has some spaces.log`

SQS Body:

```
{"Records":[{"eventVersion":"2.1","eventSource":"aws:s3","awsRegion":"us-east-1","eventTime":"2022-10-15T16:55:22.934Z","eventName":"ObjectCreated:Put","userIdentity":{"principalId":"AWS:***:***"},"requestParameters":{"sourceIPAddress":"x.y.x.y"},"responseElements":{"x-amz-request-id":"***","x-amz-id-2":"***"},"s3":{"s3SchemaVersion":"1.0","configurationId":"SQSSourceTest","bucket":{"name":"***","ownerIdentity":{"principalId":"***"},"arn":"arn:aws:s3:::***"},"object":{"key":"has+some+spaces.log","size":41,"eTag":"f7d5180f521d7cc51b6bfa64d72fca3b","sequencer":"***"}}}]}
```
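The SQS notification reports the key as `has+some+spaces.log`, i.e. URL-encoded, which explains the `NoSuchKeyException`. A likely fix is to decode the key from the event notification before calling `GetObject`; `java.net.URLDecoder` already turns `+` back into a space. A minimal sketch (the class name is hypothetical):

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of the fix: S3 event notifications URL-encode object
// keys, so decode them before issuing the GetObject request.
public class S3KeyDecoder {
    public static String decode(String notificationKey) {
        return URLDecoder.decode(notificationKey, StandardCharsets.UTF_8);
    }
}
```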
[BUG] Unable to read from S3 key with spaces
https://api.github.com/repos/opensearch-project/data-prepper/issues/1923/comments
0
2022-10-15T17:04:04Z
2022-10-24T21:35:38Z
https://github.com/opensearch-project/data-prepper/issues/1923
1,410,229,617
1,923
[ "opensearch-project", "data-prepper" ]
[Kubernetes deployment template](https://github.com/opensearch-project/data-prepper/blob/main/deployment-template/k8s/data-prepper-k8s.yaml) needs an update to reflect the new directory structure in 2.0:

```
containers:
  - args:
      - java
      - -jar
      - /usr/share/data-prepper/data-prepper.jar
      - /etc/data-prepper/pipelines.yaml
      - /etc/data-prepper/data-prepper-config.yaml
    image: opensearchproject/data-prepper:latest
    name: data-prepper
```
Kubernetes deployment template needs an update
https://api.github.com/repos/opensearch-project/data-prepper/issues/1922/comments
6
2022-10-14T22:45:59Z
2022-10-19T22:04:44Z
https://github.com/opensearch-project/data-prepper/issues/1922
1,409,942,856
1,922
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Pipeline components (e.g. processors, sinks) that use keys to access events hold onto a string and then pass that string into `Event::put` and `Event::get`. This has a few problems: * Validation of the JSON pointer does not happen until the call to `put`/`get`. It would be nice to validate these when constructing the plugin. * The string is re-parsed by Jackson each time. This does slow down processing in processors. **Describe the solution you'd like** I propose adding a new model which allows for creating keys and then re-using them. ``` public interface EventKey { String getKey(); } public interface EventKeyFactory { EventKey createEventKey(String key); } ``` Plugins would use these keys somewhat like the following: ``` private final EventKey keyToUse; @DataPrepperPluginConstructor public MyProcessor(MyProcessorConfiguration myProcessorConfiguration, EventKeyFactory eventKeyFactory) { this.keyToUse = eventKeyFactory.createEventKey(myProcessorConfiguration()); } @Override public Collection<Record<Event>> doExecute(final Collection<Record<Event>> records) { for(final Record<Event> record : records) { Object someValue = record.getData().get(keyToUse); // Do whatever record.getData().put(keyToUse, newValue); } } ``` **Describe alternatives you've considered (Optional)** One alternative: Building on #1915, we could include the factory method in `EventFactory`. ``` public interface EventFactory { EventKey createEventKey(String key); // The rest was part of 1915 <T extends Builder<T>> Builder<T> builder(); Event fromMessage(String message); Event copy(Event event); } ``` Second alternative: Add a static method `EventKey createEventKey(String key)` on the `Event` interface. However, this will be harder for plugin developers to test against. And it may not be as easy to extend. **Additional context** The implementation of the `EventKeyFactory` is highly related to the `EventFactory` suggested in #1915. 
This is somewhat related to #1915, and I suggest that we have a single implementation for both interfaces: ``` class JacksonEventFactory implements EventFactory, EventKeyFactory { // ... } ```
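A minimal sketch of what an implementation could look like, repeating the proposed interfaces for self-containment. The `ImmutableEventKey` and `DefaultEventKeyFactory` names are hypothetical, and the `startsWith("/")` check is a simplified stand-in for full JSON-pointer validation — the point is only that validation happens once, at construction time:

```java
interface EventKey { String getKey(); }

interface EventKeyFactory { EventKey createEventKey(String key); }

// Hypothetical implementation: validates the key when the plugin is constructed,
// so a bad key fails fast instead of failing on the first put/get call.
final class ImmutableEventKey implements EventKey {
    private final String key;

    ImmutableEventKey(final String key) {
        // Simplified stand-in for full JSON-pointer (RFC 6901) validation.
        if (key == null || key.isEmpty() || !key.startsWith("/")) {
            throw new IllegalArgumentException("Invalid event key: " + key);
        }
        this.key = key;
    }

    @Override
    public String getKey() {
        return key;
    }
}

final class DefaultEventKeyFactory implements EventKeyFactory {
    @Override
    public EventKey createEventKey(final String key) {
        return new ImmutableEventKey(key);
    }
}
```

A real implementation could also cache the parsed pointer inside the `EventKey` so `Event::get` and `Event::put` never re-parse the string.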
Support an EventKey object
https://api.github.com/repos/opensearch-project/data-prepper/issues/1916/comments
0
2022-10-11T14:41:50Z
2024-06-17T15:40:29Z
https://github.com/opensearch-project/data-prepper/issues/1916
1,404,768,722
1,916
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Creating events using a static factory has some problems: * The types must be known at compile-time for any source that uses it. * Cloning events is fragile and leaks implementation details. This can be [seen in peer forwarder](https://github.com/opensearch-project/data-prepper/blob/892162ae34eabf049769a5ef80f553a53a5d0462/data-prepper-core/src/main/java/org/opensearch/dataprepper/peerforwarder/server/PeerForwarderHttpService.java#L117-L127). * We have to expose implementations like `JacksonEvent`. * Changing the implementation for testing is not easy with the current approach. **Describe the solution you'd like** Create an `EventFactory` class which can provide a more flexible mechanism for creating events. It might look like the following initially: ``` public interface EventFactory { <T extends Builder<T>> Builder<T> builder(); Event fromMessage(String message); Event copy(Event event); } ``` We also might want to change the `builder()` interface to take in required arguments (e.g. type). This makes it easier to catch errors at compile-time rather than runtime. We could eventually have a class which allows for registering new types. I suggest that we follow on with this later. But, to help convey the concept, I put together a possible interface. In this way, sources could register a type using a string and have it implemented with a concrete class. ``` public interface EventTypeRegistry { <T extends Event> void registerType(String type, Class<T> interfaceClass, Class<? extends T> implementationClass); } ``` **Describe alternatives you've considered (Optional)** We could add static builder methods to the `Event` class directly. And we could even support registration on these static methods. Such changes would cross all of Data Prepper, and that is probably acceptable. But, we have a DI framework which can make a factory class easy enough to implement. 
And this solution would not be as easy for clients to use when writing unit tests. ## Tasks - [x] Initial class model with Log model - [ ] Support as test class - [ ] Split Gradle packages for event model and testing - [ ] Support Metric events - [ ] Support Trace events - [ ] Support Document event - [ ] Support OTel Log event - [ ] Update sources to use new EventFactory - [ ] Update codecs to use new EventFactory
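A dependency-free sketch of the proposed shape, showing how an injectable factory makes a test double trivial. The `Event` methods and the map-backed `SimpleEventFactory` are illustrative assumptions for this sketch, not the actual Data Prepper `Event` model:

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-ins for the real interfaces, for illustration only.
interface Event {
    Object get(String key);
    void put(String key, Object value);
    Map<String, Object> toMap();
}

interface EventFactory {
    Event fromMessage(String message);
    Event copy(Event event);
}

// Illustrative test double: backs each event with a plain HashMap, so unit
// tests never need to touch JacksonEvent or any concrete implementation.
final class SimpleEventFactory implements EventFactory {
    static final class SimpleEvent implements Event {
        private final Map<String, Object> data;
        SimpleEvent(final Map<String, Object> data) { this.data = data; }
        @Override public Object get(final String key) { return data.get(key); }
        @Override public void put(final String key, final Object value) { data.put(key, value); }
        @Override public Map<String, Object> toMap() { return new HashMap<>(data); }
    }

    @Override
    public Event fromMessage(final String message) {
        final Map<String, Object> data = new HashMap<>();
        data.put("message", message);
        return new SimpleEvent(data);
    }

    @Override
    public Event copy(final Event event) {
        // Copying through toMap() keeps callers independent of the implementation.
        return new SimpleEvent(new HashMap<>(event.toMap()));
    }
}
```

Because components receive the factory through DI, swapping `SimpleEventFactory` for the production implementation requires no code changes in the component under test.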
Provide an EventFactory
https://api.github.com/repos/opensearch-project/data-prepper/issues/1915/comments
0
2022-10-11T14:27:50Z
2024-02-12T16:57:24Z
https://github.com/opensearch-project/data-prepper/issues/1915
1,404,744,429
1,915
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Some values, such as codes, can be assigned alternate meaning. I'd like to map one value to another. HTTP status codes are a classic example. I might want to map fields with a value of `200` to `OK` and fields with a value of `404` to `Not Found`. **Describe the solution you'd like** Create a new processor which can map values. It should take the following. * A source key * A destination key * A map of values from the source to values to set in the destination. For example: ``` processor: map_value: source_key: /status_code destination_key: /status map: 200: OK 404: Not Found ``` I'd like a better name than `map_value`. Another interesting possibility would be supporting ranges for numbers. Some examples: ``` processor: map_value: source_key: /status_code destination_key: /status range: 200-399: Success 400-499: Client Error 500-599: Server Error ``` And perhaps comma-delimited values: ``` processor: map_value: source_key: /status_code destination_key: /status range: 200,201: Success 401,403: Authentication 500-599: Server Error ```
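A sketch of how the range and comma-delimited specifications above could be parsed and matched. The `RangeSpec` class is hypothetical — the proposed processor does not exist yet — and negative values are out of scope since the motivating example is HTTP status codes:

```java
import java.util.HashSet;
import java.util.Set;
import java.util.function.IntPredicate;

// Hypothetical helper: matches a numeric value against a spec such as
// "200-399" (range), "200,201" (comma list), or "200" (single value).
final class RangeSpec {
    private final IntPredicate predicate;

    RangeSpec(final String spec) {
        if (spec.contains("-")) {
            final String[] bounds = spec.split("-", 2);
            final int low = Integer.parseInt(bounds[0].trim());
            final int high = Integer.parseInt(bounds[1].trim());
            predicate = value -> value >= low && value <= high;
        } else {
            final Set<Integer> values = new HashSet<>();
            for (final String part : spec.split(",")) {
                values.add(Integer.parseInt(part.trim()));
            }
            predicate = values::contains;
        }
    }

    boolean matches(final int value) {
        return predicate.test(value);
    }
}
```

The processor would build one `RangeSpec` per configured key at startup, then evaluate each event's source value against the specs in order and write the first matching destination value.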
Processor to map values to alternate values
https://api.github.com/repos/opensearch-project/data-prepper/issues/1914/comments
6
2022-10-11T02:46:14Z
2024-02-27T18:18:02Z
https://github.com/opensearch-project/data-prepper/issues/1914
1,403,921,829
1,914
[ "opensearch-project", "data-prepper" ]
Increase buffer defaults to larger values now that we have an Event model.
Increase buffer defaults for Event model
https://api.github.com/repos/opensearch-project/data-prepper/issues/1908/comments
1
2022-10-10T17:00:41Z
2022-10-10T17:01:10Z
https://github.com/opensearch-project/data-prepper/issues/1908
1,403,472,167
1,908
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** The grok processor's metric names do not reflect their meaning as documented in the README: ``` * `grokProcessingMatchSuccess`: records the number of Records that found at least one pattern match from the match field * `grokProcessingMatchFailure`: records the number of Records that did not match any of the patterns specified in the match field. ``` **Describe the solution you'd like** Suggested change: * grokProcessingMatchSuccess -> grokProcessingMatch * grokProcessingMatchFailure -> grokProcessingMismatch **Describe alternatives you've considered (Optional)** N/A **Additional context** N/A
Rename grokProcessingMatchFailure and grokProcessingMatchSuccess metrics
https://api.github.com/repos/opensearch-project/data-prepper/issues/1892/comments
1
2022-10-06T17:03:24Z
2022-10-06T22:40:13Z
https://github.com/opensearch-project/data-prepper/issues/1892
1,400,027,343
1,892
[ "opensearch-project", "data-prepper" ]
## CVE-2022-3171 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>protobuf-java-3.21.1.jar</b></p></summary> <p>Core Protocol Buffers library. Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.</p> <p>Library home page: <a href="https://developers.google.com/protocol-buffers/">https://developers.google.com/protocol-buffers/</a></p> <p>Path to dependency file: /release/maven/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java/3.21.1/2e396173a5b6ab549d790eba21c1d125bfe92912/protobuf-java-3.21.1.jar</p> <p> Dependency Hierarchy: - otel-metrics-raw-processor-2.1.0-SNAPSHOT (Root Library) - protobuf-java-util-3.21.1.jar - :x: **protobuf-java-3.21.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing issue with binary data in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6 and 3.16.3 can lead to a denial of service attack. 
Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields causes objects to be converted back-n-forth between mutable and immutable forms, resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above. <p>Publish Date: 2022-10-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-3171>CVE-2022-3171</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-h4h5-3hr4-j3g2">https://github.com/advisories/GHSA-h4h5-3hr4-j3g2</a></p> <p>Release Date: 2022-10-12</p> <p>Fix Resolution: com.google.protobuf:protobuf-java:3.16.3,3.19.6,3.20.3,3.21.7;com.google.protobuf:protobuf-javalite:3.16.3,3.19.6,3.20.3,3.21.7;com.google.protobuf:protobuf-kotlin:3.19.6,3.20.3,3.21.7;com.google.protobuf:protobuf-kotlin-lite:3.19.6,3.20.3,3.21.7;google-protobuf - 3.19.6,3.20.3,3.21.7</p> </p> </details> <p></p>
CVE-2022-3171 (High) detected in protobuf-java-3.21.1.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/1891/comments
0
2022-10-06T15:21:58Z
2022-10-19T17:35:36Z
https://github.com/opensearch-project/data-prepper/issues/1891
1,399,855,412
1,891
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Add documentation for the conditional routing feature added in #1337. **Describe the solution you'd like** Documentation in the repository.
Create documentation for conditional routing in data-prepper repo
https://api.github.com/repos/opensearch-project/data-prepper/issues/1890/comments
0
2022-10-06T00:12:32Z
2022-10-10T15:35:18Z
https://github.com/opensearch-project/data-prepper/issues/1890
1,398,530,620
1,890
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Some use-cases require adding new headers to IAM requests made to STS when performing an `iam:AssumeRole` action. Data Prepper can provide support for pipeline authors to configure these headers. **Describe the solution you'd like** Provide configurations to modify these headers when assuming roles in the `s3` source and the `opensearch` sink. ``` source: s3: aws: sts_role_arn: arn:aws:iam::123456789012:role/MyS3Role sts_header_overrides: my-header-name1: my-header-value1 my-header-name2: my-header-value2 sink: - opensearch: aws_sts_role_arn: arn:aws:iam::123456789012:role/MyOpenSeachRole aws_sts_header_overrides: my-header-name3: my-header-value3 my-header-name4: my-header-value4 ``` **Describe alternatives you've considered (Optional)** Data Prepper could support a global configuration for this. But that is a fairly large change that requires some design.
Support custom headers when assuming AWS IAM roles
https://api.github.com/repos/opensearch-project/data-prepper/issues/1888/comments
0
2022-10-05T15:53:00Z
2023-01-24T23:17:15Z
https://github.com/opensearch-project/data-prepper/issues/1888
1,398,021,997
1,888
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Data Prepper supports creating a pipeline which uses the pipeline sink to send events to multiple downstream pipelines. However, the Event is not being copied. Thus, the same Event object is passed into multiple threads. This produces unexpected behavior. **To Reproduce** Create a pipeline which modifies an Event in a downstream pipeline. **Expected behavior** Modifying an Event in one pipeline should not modify it in others. The `PipelineConnector` should duplicate the Events. **Additional context** The `Event` model contains metadata, making it hard to "copy" an event. We solved this in peer-forwarder with a quick fix which can be seen in [this PR](https://github.com/opensearch-project/data-prepper/pull/1865/files#diff-0e97029518f91e331458a8a579d81237b6d2175edb0041a0c2ec119490db4cf9R116-R127). However, this approach is not highly extensible since it requires a lot of knowledge duplicated throughout the code.
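A dependency-free sketch of the kind of deep copy the `PipelineConnector` would need so downstream pipelines cannot mutate each other's data. This only covers nested map/list data; a real fix also has to copy event metadata, which is the hard part noted above. The `DeepCopy` class is hypothetical:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

final class DeepCopy {
    private DeepCopy() {}

    // Recursively copies nested maps and lists; scalar values are shared,
    // which is safe as long as they are immutable (String, Integer, ...).
    @SuppressWarnings("unchecked")
    static Object of(final Object value) {
        if (value instanceof Map) {
            final Map<String, Object> copy = new LinkedHashMap<>();
            ((Map<String, Object>) value).forEach((k, v) -> copy.put(k, of(v)));
            return copy;
        }
        if (value instanceof List) {
            final List<Object> copy = new ArrayList<>();
            for (final Object element : (List<Object>) value) {
                copy.add(of(element));
            }
            return copy;
        }
        return value;
    }
}
```

With something like this, the connector could hand each downstream pipeline its own copy of the event data instead of a shared reference.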
[BUG] Sending Events to multiple pipelines does not duplicate Events
https://api.github.com/repos/opensearch-project/data-prepper/issues/1886/comments
0
2022-10-05T15:20:38Z
2022-10-24T21:35:08Z
https://github.com/opensearch-project/data-prepper/issues/1886
1,397,978,386
1,886
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** The OpenSearch Java client supports an AWS-based transport (`AwsSdk2Transport`). Investigate using this when sending requests to Amazon OpenSearch Service. **Describe the solution you'd like** Possibly change to use the AWS transport. **Additional context** Suggested by @dblock in https://github.com/opensearch-project/data-prepper/pull/1877#issuecomment-1267023948
Use OpenSearch Java client's AwsSdk2Transport
https://api.github.com/repos/opensearch-project/data-prepper/issues/1881/comments
2
2022-10-05T01:06:54Z
2023-03-28T14:14:23Z
https://github.com/opensearch-project/data-prepper/issues/1881
1,397,037,006
1,881
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Data Prepper originally loaded plugins from the `com.amazon.dataprepper.plugins` package. Data Prepper can now load plugins from any package so long as the plugin configures it. And by default it loads from the `org.opensearch.dataprepper.plugins` package. So the old package is no longer needed by default. Support is coming from: https://github.com/opensearch-project/data-prepper/blob/b99942c0c415dba9e825558f73500f42ebd95efe/data-prepper-main/src/main/resources/META-INF/data-prepper.plugins.properties#L6 https://github.com/opensearch-project/data-prepper/blob/b99942c0c415dba9e825558f73500f42ebd95efe/data-prepper-core/src/main/java/org/opensearch/dataprepper/plugin/PluginPackagesSupplier.java#L97 **Describe the solution you'd like** Remove support for this package by default. As this is close to the 2.0 release, I propose retaining it for 2.x and removing in 3.0.
Remove support for old com.amazon.dataprepper plugin package
https://api.github.com/repos/opensearch-project/data-prepper/issues/1878/comments
0
2022-10-04T13:27:12Z
2022-10-04T13:27:44Z
https://github.com/opensearch-project/data-prepper/issues/1878
1,396,278,574
1,878
[ "opensearch-project", "data-prepper" ]
**Describe the bug** Running the current `main` branch (for 2.0), the OpenSearch sink will not write to an Amazon OpenSearch Service domain. ``` Caused by: software.amazon.awssdk.core.exception.SdkClientException: Multiple HTTP implementations were found on the classpath. To avoid non-deterministic loading implementations, please explicitly provide an HTTP client via the client builders, set the software.amazon.awssdk.http.service.impl system property with the FQCN of the HTTP service to use as the default, or remove all but one HTTP implementation from the classpath at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:102) ~[sdk-core-2.17.264.jar:?] at software.amazon.awssdk.core.internal.http.loader.ClasspathSdkHttpServiceProvider.loadService(ClasspathSdkHttpServiceProvider.java:62) ~[sdk-core-2.17.264.jar:?] at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197) ~[?:?] at java.util.Spliterators$ArraySpliterator.tryAdvance(Spliterators.java:1002) ~[?:?] at java.util.stream.ReferencePipeline.forEachWithCancel(ReferencePipeline.java:129) ~[?:?] at java.util.stream.AbstractPipeline.copyIntoWithCancel(AbstractPipeline.java:527) ~[?:?] at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:513) ~[?:?] at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499) ~[?:?] at java.util.stream.FindOps$FindOp.evaluateSequential(FindOps.java:150) ~[?:?] at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?] at java.util.stream.ReferencePipeline.findFirst(ReferencePipeline.java:647) ~[?:?] at software.amazon.awssdk.core.internal.http.loader.SdkHttpServiceProviderChain.loadService(SdkHttpServiceProviderChain.java:44) ~[sdk-core-2.17.264.jar:?] at software.amazon.awssdk.core.internal.http.loader.CachingSdkHttpServiceProvider.loadService(CachingSdkHttpServiceProvider.java:46) ~[sdk-core-2.17.264.jar:?] 
at software.amazon.awssdk.core.internal.http.loader.DefaultSdkHttpClientBuilder.buildWithDefaults(DefaultSdkHttpClientBuilder.java:40) ~[sdk-core-2.17.264.jar:?] at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.lambda$resolveSyncHttpClient$7(SdkDefaultClientBuilder.java:343) ~[sdk-core-2.17.264.jar:?] at java.util.Optional.orElseGet(Optional.java:364) ~[?:?] at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.resolveSyncHttpClient(SdkDefaultClientBuilder.java:343) ~[sdk-core-2.17.264.jar:?] at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.finalizeSyncConfiguration(SdkDefaultClientBuilder.java:282) ~[sdk-core-2.17.264.jar:?] at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.syncClientConfiguration(SdkDefaultClientBuilder.java:178) ~[sdk-core-2.17.264.jar:?] at software.amazon.awssdk.services.sts.DefaultStsClientBuilder.buildClient(DefaultStsClientBuilder.java:27) ~[sts-2.17.264.jar:?] at software.amazon.awssdk.services.sts.DefaultStsClientBuilder.buildClient(DefaultStsClientBuilder.java:22) ~[sts-2.17.264.jar:?] at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.build(SdkDefaultClientBuilder.java:145) ~[sdk-core-2.17.264.jar:?] at software.amazon.awssdk.services.sts.StsClient.create(StsClient.java:79) ~[sts-2.17.264.jar:?] at org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration.attachSigV4(ConnectionConfiguration.java:240) ~[opensearch-2.0.0-SNAPSHOT.jar:?] at org.opensearch.dataprepper.plugins.sink.opensearch.ConnectionConfiguration.createClient(ConnectionConfiguration.java:206) ~[opensearch-2.0.0-SNAPSHOT.jar:?] at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.initialize(OpenSearchSink.java:99) ~[opensearch-2.0.0-SNAPSHOT.jar:?] at org.opensearch.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:90) ~[opensearch-2.0.0-SNAPSHOT.jar:?] ... 
82 more ``` **To Reproduce** Run Data Prepper against an Amazon OpenSearch Service domain. **Additional context** The AWS Java SDK v2 is providing multiple clients by default. They are conflicting with each other.
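As the exception message itself suggests, one workaround is to pin a single implementation via the `software.amazon.awssdk.http.service.impl` system property before any SDK client is built. The Apache client FQCN below is an assumption — substitute whichever single implementation you intend to keep on the classpath:

```java
// Must run before the first AWS SDK client is constructed.
// The FQCN is an assumption: pick the one HTTP implementation you keep.
final class PinAwsHttpClient {
    static void pin() {
        System.setProperty(
                "software.amazon.awssdk.http.service.impl",
                "software.amazon.awssdk.http.apache.ApacheSdkHttpService");
    }
}
```

The longer-term fix is dependency hygiene: exclude all but one of the SDK's HTTP client modules from the build, or pass an explicitly constructed client to each SDK client builder.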
[BUG] AWS services fail to load
https://api.github.com/repos/opensearch-project/data-prepper/issues/1876/comments
2
2022-10-04T02:28:08Z
2022-10-05T15:34:56Z
https://github.com/opensearch-project/data-prepper/issues/1876
1,395,575,395
1,876
[ "opensearch-project", "data-prepper" ]
## CVE-2022-36944 - Critical Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>scala-library-2.13.6.jar</b></p></summary> <p>Standard library for the Scala Programming Language</p> <p>Library home page: <a href="https://www.scala-lang.org/">https://www.scala-lang.org/</a></p> <p>Path to dependency file: /performance-test/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.scala-lang/scala-library/2.13.6/ed7a2f528c7389ea65746c22a01031613d98ab3d/scala-library-2.13.6.jar</p> <p> Dependency Hierarchy: - zinc_2.13-1.6.1.jar (Root Library) - :x: **scala-library-2.13.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> Scala 2.13.x before 2.13.9 has a Java deserialization chain in its JAR file. On its own, it cannot be exploited. There is only a risk in conjunction with Java object deserialization within an application. In such situations, it allows attackers to erase contents of arbitrary files, make network connections, or possibly run arbitrary code (specifically, Function0 functions) via a gadget chain. 
<p>Publish Date: 2022-09-23 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-36944>CVE-2022-36944</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-09-23</p> <p>Fix Resolution: org.scala-lang:scala-library:2.13.9</p> </p> </details> <p></p>
CVE-2022-36944 (Critical) detected in scala-library-2.13.6.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1869/comments
0
2022-10-03T21:24:54Z
2023-11-07T18:03:45Z
https://github.com/opensearch-project/data-prepper/issues/1869
1,395,351,246
1,869
[ "opensearch-project", "data-prepper" ]
## CVE-2022-42003 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.13.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /performance-test/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.13.4/98b0edfa8e4084078f10b7b356c300ded4a71491/jackson-databind-2.13.4.jar</p> <p> Dependency Hierarchy: - gatling-charts-highcharts-3.8.4.jar (Root Library) - gatling-recorder-3.8.4.jar - :x: **jackson-databind-2.13.4.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/ebd3e757c341c1d9c1352431bbad7bf5db2ea939">ebd3e757c341c1d9c1352431bbad7bf5db2ea939</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In FasterXML jackson-databind before 2.14.0-rc1, resource exhaustion can occur because of a lack of a check in primitive value deserializers to avoid deep wrapper array nesting, when the UNWRAP_SINGLE_VALUE_ARRAYS feature is enabled. 
Additional fix version in 2.13.4.1 and 2.12.17.1 <p>Publish Date: 2022-10-02 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-42003>CVE-2022-42003</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-10-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.12.7.1,2.13.4.1</p> </p> </details> <p></p>
CVE-2022-42003 (High) detected in jackson-databind-2.13.4.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1868/comments
1
2022-10-03T21:24:52Z
2022-12-17T00:06:04Z
https://github.com/opensearch-project/data-prepper/issues/1868
1,395,351,214
1,868
[ "opensearch-project", "data-prepper" ]
## CVE-2022-42004 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.13.3.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: /data-prepper-plugins/parse-json-processor/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.13.3/56deb9ea2c93a7a556b3afbedd616d342963464e/jackson-databind-2.13.3.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.13.3.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/5a73bf31a2a3abdaf2b81c0d0784ba51ed19c122">5a73bf31a2a3abdaf2b81c0d0784ba51ed19c122</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In FasterXML jackson-databind before 2.13.4, resource exhaustion can occur because of a lack of a check in BeanDeserializer._deserializeFromArray to prevent use of deeply nested arrays. An application is vulnerable only with certain customized choices for deserialization. 
<p>Publish Date: 2022-10-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-42004>CVE-2022-42004</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-10-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.13.4</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
CVE-2022-42004 (Medium) detected in jackson-databind-2.13.3.jar
https://api.github.com/repos/opensearch-project/data-prepper/issues/1867/comments
0
2022-10-03T21:24:50Z
2022-10-04T13:28:19Z
https://github.com/opensearch-project/data-prepper/issues/1867
1,395,351,178
1,867
[ "opensearch-project", "data-prepper" ]
Update Data Prepper documentation at: https://github.com/opensearch-project/documentation-website We have a target branch in that project for working changes: https://github.com/opensearch-project/documentation-website/tree/data-prepper-2.0
Update Data Prepper documentation on OpenSearch.org for 2.0.0
https://api.github.com/repos/opensearch-project/data-prepper/issues/1846/comments
1
2022-09-30T13:45:31Z
2022-10-11T20:13:18Z
https://github.com/opensearch-project/data-prepper/issues/1846
1,392,468,772
1,846
[ "opensearch-project", "data-prepper" ]
Create Data Prepper 2.0.0 Changelog The Changelog is a detailed overview of all the changes made to Data Prepper in this release. It needs to be generated from Git history. See https://github.com/opensearch-project/data-prepper/issues/1393 for a previous release's instructions.
Create Data Prepper 2.0.0 Changelog
https://api.github.com/repos/opensearch-project/data-prepper/issues/1845/comments
0
2022-09-30T13:44:11Z
2022-10-08T01:52:40Z
https://github.com/opensearch-project/data-prepper/issues/1845
1,392,467,044
1,845
[ "opensearch-project", "data-prepper" ]
Create Data Prepper 2.0.0 Release Notes All changes should be available at: https://github.com/opensearch-project/data-prepper/milestone/3?closed=1
Create Data Prepper 2.0.0 Release Notes
https://api.github.com/repos/opensearch-project/data-prepper/issues/1844/comments
0
2022-09-30T13:42:59Z
2022-10-07T16:25:08Z
https://github.com/opensearch-project/data-prepper/issues/1844
1,392,465,519
1,844
[ "opensearch-project", "data-prepper" ]
**Describe the bug** The `file` sink fails to write all records to the output file when the pipeline has multiple threads running. **To Reproduce** Create a pipeline with multiple threads and write enough data that multiple threads pick it up. The output file will only have part of the records. **Expected behavior** The file has all the records. **Additional context** I found this while testing conditional routing as part of #1337.
[BUG] File sink does not write correctly with multiple threads
https://api.github.com/repos/opensearch-project/data-prepper/issues/1843/comments
0
2022-09-30T01:30:03Z
2022-10-05T16:04:25Z
https://github.com/opensearch-project/data-prepper/issues/1843
1,391,690,404
1,843
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Core has bumped jackson and snakeyaml dependencies [here](https://github.com/opensearch-project/OpenSearch/pull/4556) to `2.13.4` and `1.32` respectively for versions 2.x, 2.3 , 2.2 , 2.1 , 2.0, 1.3 and 1.x. Plugin and client teams should do the same to align with core. **Describe the solution you'd like** A clear and concise description of what you want to happen. **Describe alternatives you've considered (Optional)** A clear and concise description of any alternative solutions or features you've considered. **Additional context** Add any other context or screenshots about the feature request here.
Bump jackson and snakeyaml dependencies
https://api.github.com/repos/opensearch-project/data-prepper/issues/1839/comments
1
2022-09-29T19:10:31Z
2022-10-04T13:28:39Z
https://github.com/opensearch-project/data-prepper/issues/1839
1,391,364,563
1,839
[ "opensearch-project", "data-prepper" ]
**Describe the bug** I started data prepper with a generic log pipeline that used AWS ACM certificate provider for the HTTP log source. However, I gave an incorrect certificate ARN that data prepper was unable to use. Data prepper still came up successfully however I was unable to send log data because the http source was not running. ``` 2022-09-28T10:27:51.174-05:00 | CopyException retrieving the certificate with arn: arn:aws:acm:us-east-1:123456789012:certificate/someCertificate 2022-09-28T10:27:51.174-05:00 | software.amazon.awssdk.services.acm.model.ResourceNotFoundException: Could not find certificate arn:aws:acm:us-east-1:123456789012:certificate/someCertificate . (Service: Acm, Status Code: 400, Request ID: 73995789-64ce-4e90-9c3c-20649fd6cdac) 2022-09-28T10:27:51.174-05:00 | at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handleErrorResponse(CombinedResponseHandler.java:125) ~[data-prepper-core-1.5.0.jar:1.5.0] ... 2022-09-28T10:27:51.180-05:00 | at com.amazon.dataprepper.plugins.certificate.acm.ACMCertificateProvider.getCertificate(ACMCertificateProvider.java:90) ~[data-prepper-core-1.5.0.jar:1.5.0] 2022-09-28T10:27:51.180-05:00 | at com.amazon.dataprepper.plugins.source.loghttp.HTTPSource.start(HTTPSource.java:83) ~[data-prepper-core-1.5.0.jar:1.5.0] ... 2022-09-28T10:27:51.189-05:00 | Data Prepper server running at :4900 ``` **To Reproduce** PipelineConfiguration.yaml snippet: ``` log-pipeline: source: http: acm_certificate_arn: "arn: arn:aws:acm:us-east-1:123456789012:certificate/someCertificate " aws_region: "us-east-1" port: 21890 use_acm_certificate_for_ssl: true ssl: true authentication: ... processor: ... ``` **Expected behavior** I expected data prepper to fail to start if there is no source. Running data prepper without a source is useless. This may not be different if data prepper would support multiple sources. **Additional context** This is similar to: #936
[BUG] Data Prepper starts without a source
https://api.github.com/repos/opensearch-project/data-prepper/issues/1829/comments
1
2022-09-28T20:44:49Z
2022-11-03T17:00:51Z
https://github.com/opensearch-project/data-prepper/issues/1829
1,389,925,333
1,829
[ "opensearch-project", "data-prepper" ]
Data Prepper 2.0 reads all YAML files from the pipelines directory, so placing an example file there can cause conflicts for users.
Remove example pipeline file from tarball and Docker image
https://api.github.com/repos/opensearch-project/data-prepper/issues/1826/comments
0
2022-09-28T19:35:17Z
2022-09-29T17:21:08Z
https://github.com/opensearch-project/data-prepper/issues/1826
1,389,855,137
1,826
[ "opensearch-project", "data-prepper" ]
I need to create a configuration that can add data to a certain index based on a certain key value. How can I implement it?
I need to create a configuration that can add data to a certain index based on a certain key value.
https://api.github.com/repos/opensearch-project/data-prepper/issues/1824/comments
2
2022-09-28T07:52:41Z
2022-09-30T19:51:04Z
https://github.com/opensearch-project/data-prepper/issues/1824
1,388,898,440
1,824
[ "opensearch-project", "data-prepper" ]
Reserve the following names by not allowing them in pipelines: `core`, `data-prepper`
Reserve pipeline names
https://api.github.com/repos/opensearch-project/data-prepper/issues/1820/comments
1
2022-09-27T20:49:20Z
2022-09-29T18:26:15Z
https://github.com/opensearch-project/data-prepper/issues/1820
1,388,360,129
1,820
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Buffer fill percentage is the primary indicator for horizontal scaling of Data Prepper. However, the metrics reported by Micrometer use a gauge with no concept of the upper bound defined in the pipeline configuration. Determining when to scale and reviewing metrics in a dashboard requires knowledge of the `blockingBuffer` pipeline configuration buffer size values in relation to the current `blockingBuffer.recordsInBuffer` metric value. **Describe the solution you'd like** A new metric, `blockingBuffer.bufferUsage`, which tracks the utilization rate of the buffer based on the number of records in the buffer and the buffer size. **Describe alternatives you've considered (Optional)** - Publishing the buffer size would be an alternative. Users could then perform calculations on their dashboards.
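The proposed metric is just the ratio of the two values discussed above, expressed as a percentage. A minimal sketch, assuming illustrative names (`BufferUsage`, `usagePercent` are not Data Prepper APIs):

```java
// Illustrative only: derives a usage percentage from recordsInBuffer
// and the configured buffer size. Not part of the Data Prepper codebase.
public class BufferUsage {
    static double usagePercent(long recordsInBuffer, long bufferSize) {
        if (bufferSize <= 0) {
            return 0.0; // guard against a misconfigured buffer size
        }
        return 100.0 * recordsInBuffer / bufferSize;
    }
}
```

Publishing this as a gauge would let dashboards alert on a fixed threshold (e.g. 80%) without knowing each pipeline's configured buffer size.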
[Feature Request] Buffer Fill Percentage
https://api.github.com/repos/opensearch-project/data-prepper/issues/1817/comments
4
2022-09-27T18:06:20Z
2022-10-06T15:05:52Z
https://github.com/opensearch-project/data-prepper/issues/1817
1,388,163,453
1,817
[ "opensearch-project", "data-prepper" ]
**Describe the bug** When running Data Prepper via Docker, I am unable to shut it down using Ctrl+C. **To Reproduce** 1) Build Data Prepper Docker locally 2) Create a simple pipeline: ``` simple-test-pipeline: workers: 2 delay: "5000" source: random: sink: - stdout: ``` 3) Run Data Prepper: ``` docker run -p 4900:4900 \ -v ${PWD}/simple-test/pipelines.yaml:/usr/share/data-prepper/pipelines/pipelines.yaml \ opensearch-data-prepper:2.0.0-SNAPSHOT ``` 4) Try to stop with Ctrl+C **Expected behavior** I expect Data Prepper to shut down. However, it does not. If I run with the latest released version, it does shut down: ``` docker run -p 4900:4900 \ -v ${PWD}/simple-test/pipelines.yaml:/usr/share/data-prepper/pipelines.yaml \ opensearchproject/data-prepper:1 ``` **Environment (please complete the following information):** - OS: macOS 12.5 - Docker Desktop 4.9.0 **Additional context** Originally reported in the forums: https://forum.opensearch.org/t/latest-data-prepper-not-working/11055
[BUG] Unable to shutdown Data Prepper Docker with Ctrl+C
https://api.github.com/repos/opensearch-project/data-prepper/issues/1816/comments
1
2022-09-27T17:18:18Z
2022-10-03T22:49:35Z
https://github.com/opensearch-project/data-prepper/issues/1816
1,388,103,449
1,816
[ "opensearch-project", "data-prepper" ]
**Describe the bug** The default peer forwarder port (`21890`) conflicts with the port used by OTel trace source. We should have a default that works well within Data Prepper.
[BUG] Peer Forwarder port conflicts with OTel
https://api.github.com/repos/opensearch-project/data-prepper/issues/1811/comments
1
2022-09-27T01:46:07Z
2022-09-28T14:48:45Z
https://github.com/opensearch-project/data-prepper/issues/1811
1,386,951,718
1,811
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Armeria 1.19.0 was released on Sep 13. https://armeria.dev/release-notes/1.19.0/ **Describe the solution you'd like** Update to Armeria 1.19.0 for all clients.
Update to Armeria 1.19.0
https://api.github.com/repos/opensearch-project/data-prepper/issues/1806/comments
0
2022-09-26T22:25:40Z
2022-09-29T19:43:52Z
https://github.com/opensearch-project/data-prepper/issues/1806
1,386,803,566
1,806
[ "opensearch-project", "data-prepper" ]
There are a few places where the Data Prepper image with the `latest` tag is pulled for e2e tests and demos. This will likely break after the 2.0 release and will need an update.
Update tests and examples after 2.0 Docker image release
https://api.github.com/repos/opensearch-project/data-prepper/issues/1805/comments
3
2022-09-26T16:24:51Z
2023-04-24T16:05:30Z
https://github.com/opensearch-project/data-prepper/issues/1805
1,386,379,796
1,805
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Currently, there are several places within the Data Prepper codebase where error logs contain sensitive information. It would be nice to downgrade some of these to WARN or INFO so that error logs can be used by third parties for debugging if necessary. **Describe the solution you'd like** Go through the codebase and lower the log level of any error messages that contain sensitive information inside the message.
Lower log level in messages that contain sensitive information
https://api.github.com/repos/opensearch-project/data-prepper/issues/1804/comments
2
2022-09-26T15:24:47Z
2022-11-03T17:05:01Z
https://github.com/opensearch-project/data-prepper/issues/1804
1,386,280,403
1,804
[ "opensearch-project", "data-prepper" ]
## CVE-2022-1941 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>protobuf-3.15.6-cp37-cp37m-manylinux1_x86_64.whl</b></p></summary> <p>Protocol Buffers</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/7d/cc/abf8e30629db7a8b15efb79d4c87e235895d2c636ce7a4ac625cfc816f07/protobuf-3.15.6-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/7d/cc/abf8e30629db7a8b15efb79d4c87e235895d2c636ce7a4ac625cfc816f07/protobuf-3.15.6-cp37-cp37m-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /examples/trace-analytics-sample-app/sample-app/requirements.txt</p> <p>Path to vulnerable library: /examples/trace-analytics-sample-app/sample-app/requirements.txt,/examples/trace-analytics-sample-app/sample-app/requirements.txt</p> <p> Dependency Hierarchy: - :x: **protobuf-3.15.6-cp37-cp37m-manylinux1_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing vulnerability for the MessageSet type in the ProtocolBuffers versions prior to and including 3.16.1, 3.17.3, 3.18.2, 3.19.4, 3.20.1 and 3.21.5 for protobuf-cpp, and versions prior to and including 3.16.1, 3.17.3, 3.18.2, 3.19.4, 3.20.1 and 4.21.5 for protobuf-python can lead to out of memory failures. A specially crafted message with multiple key-value per elements creates parsing issues, and can lead to a Denial of Service against services receiving unsanitized input. 
We recommend upgrading to versions 3.18.3, 3.19.5, 3.20.2, 3.21.6 for protobuf-cpp and 3.18.3, 3.19.5, 3.20.2, 4.21.6 for protobuf-python. Versions for 3.16 and 3.17 are no longer updated. <p>Publish Date: 2022-09-22 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-1941>CVE-2022-1941</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cloud.google.com/support/bulletins#GCP-2022-019">https://cloud.google.com/support/bulletins#GCP-2022-019</a></p> <p>Release Date: 2022-09-22</p> <p>Fix Resolution: Google.Protobuf - 3.18.3,3.19.5,3.20.2,3.21.6;protobuf-python - 3.18.3,3.19.5,3.20.2,4.21.6</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
CVE-2022-1941 (Medium) detected in protobuf-3.15.6-cp37-cp37m-manylinux1_x86_64.whl - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1799/comments
1
2022-09-23T18:30:36Z
2022-09-24T14:45:30Z
https://github.com/opensearch-project/data-prepper/issues/1799
1,384,166,106
1,799
[ "opensearch-project", "data-prepper" ]
null
Update samples for directory structure change
https://api.github.com/repos/opensearch-project/data-prepper/issues/1795/comments
0
2022-09-21T22:36:35Z
2022-09-28T19:40:17Z
https://github.com/opensearch-project/data-prepper/issues/1795
1,381,595,874
1,795
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** In Data Prepper v1.5.1 the `otel_metrics_source` cannot be used with the `string_converter` plugin. **Describe the solution you'd like** Documentation on plugin/feature compatibility. **Describe alternatives you've considered (Optional)** Enhance pipeline validation on startup to detect incompatibility issues. ## Additional context [Related community forum question](https://forum.opensearch.org/t/using-otel-metrics-source-to-data-prepper/10991) ### pipelines.yaml ``` entry-pipeline: source: otel_metrics_source: ssl: false sink: - stdout: ``` ### otel-collector-config.yaml ``` receivers: prometheus: config: scrape_configs: - job_name: 'data-prepper' scrape_interval: 10s metrics_path: "/metrics/prometheus" static_configs: - targets: ['host.docker.internal:4900'] exporters: otlp/2: endpoint: data-prepper:21891 tls: insecure: true service: pipelines: metrics: receivers: [prometheus] exporters: [otlp/2] ``` ### prometheus.yaml ``` ``` ### docker-compose.yaml ``` version: "3.9" services: data-prepper: restart: unless-stopped image: opensearch-data-prepper:2.0.0-SNAPSHOT working_dir: /usr/share/data-prepper/ volumes: - ./data-prepper/config/data-prepper-config.yaml:/usr/share/data-prepper/config/data-prepper-config.yaml - ./data-prepper/config/pipelines.yaml:/usr/share/data-prepper/pipelines.yaml - ../../shared-config/log4j2.properties:/usr/share/data-prepper/config/log4j.properties ports: - "2021:2021" otel-collector: restart: unless-stopped image: otel/opentelemetry-collector:0.40.0 command: ["--config=/etc/otel-collector-config.yml"] volumes: - ./opentelemetry-collector/otel-collector-config.yml:/etc/otel-collector-config.yml depends_on: - data-prepper ports: - "4317:4317" prometheus: restart: unless-stopped image: prom/prometheus volumes: - ./prometheus/prometheus.yaml:/etc/prometheus/prometheus.yaml ports: - "9000:9000" depends_on: - otel-collector ``` I recommend using the smoke-tests
docker-compose.yaml as a base for setup.
Improved otel_metrics_source documentation
https://api.github.com/repos/opensearch-project/data-prepper/issues/1791/comments
1
2022-09-21T16:26:30Z
2023-08-28T08:49:32Z
https://github.com/opensearch-project/data-prepper/issues/1791
1,381,198,776
1,791
[ "opensearch-project", "data-prepper" ]
**Is your feature request related to a problem? Please describe.** Currently `PluginMetrics` publishes metrics with prefix `<pipeline-name>.<plugin-name>.<defined-by-plugins>`, which restricts publishing metrics from Data Prepper core, such as peer forwarding or, in some cases, the conditional routing router. **Describe the solution you'd like** Make the pipeline-name component more generic; it can be a pipeline name or a component scope. Something like `<generic-name>.<component-id>.<defined-metric-name>`. So for metrics from the core package we could use a `<component-scope>` prefix instead of a `<pipeline-name>`. Examples: - log-pipeline.buffer.numberOfRecords - log-pipeline.grok.numberOfRecords - log-pipeline.opensearch.numberOfRecords - log-pipeline.router.defined-metric-name - core.peer-forwarder.defined-metric-name Pros: - No need to change any existing code in the `PluginMetrics` class. Cons: - If the pipeline name is `core`, all the metrics will be prefixed with `core`. **Describe alternatives you've considered (Optional)** An alternative could be `<pipeline-name>.<component-scope>.<component-id>.<defined-metric-name>`. Examples: - log-pipeline.buffer.buffer.numberOfRecords - log-pipeline.processor.grok.numberOfRecords - log-pipeline.sink.opensearch.numberOfRecords - log-pipeline.core.peer-forwarder.defined-metric-name - log-pipeline.core.router.defined-metric-name Cons for the alternative approach: - Peer forwarder server-side metrics might not be aware of the pipeline name in case of bad requests or if the request contains an invalid pipeline name. **Additional context** https://github.com/opensearch-project/data-prepper/blob/e3b8a826edfba2c84101fc08b63b7a8891d13e60/data-prepper-api/src/main/java/com/amazon/dataprepper/metrics/PluginMetrics.java#L32-L35
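The proposed naming scheme amounts to joining three dot-separated components. The helper below is hypothetical and only mirrors the `<generic-name>.<component-id>.<defined-metric-name>` pattern from this issue; the real `PluginMetrics` class builds names differently.

```java
// Hypothetical helper mirroring the proposed prefix scheme;
// not the actual PluginMetrics implementation.
public class MetricNames {
    static String metricName(String scope, String componentId, String definedName) {
        return String.join(".", scope, componentId, definedName);
    }
}
```

Under this scheme, core components pass a scope like `"core"` where plugins today pass the pipeline name, which is why a pipeline literally named `core` would collide with core-scoped metrics.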
Support additional context in metrics prefix
https://api.github.com/repos/opensearch-project/data-prepper/issues/1789/comments
3
2022-09-19T22:23:07Z
2022-10-05T16:01:38Z
https://github.com/opensearch-project/data-prepper/issues/1789
1,378,592,176
1,789
[ "opensearch-project", "data-prepper" ]
null
Updated documentation for directory structure change
https://api.github.com/repos/opensearch-project/data-prepper/issues/1785/comments
0
2022-09-19T20:27:33Z
2022-09-20T18:42:35Z
https://github.com/opensearch-project/data-prepper/issues/1785
1,378,486,001
1,785
[ "opensearch-project", "data-prepper" ]
## CVE-2022-37767 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>pebble-3.1.5.jar</b></p></summary> <p>Templating engine for Java.</p> <p>Library home page: <a href="http://pebbletemplates.io">http://pebbletemplates.io</a></p> <p>Path to dependency file: /performance-test/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.pebbletemplates/pebble/3.1.5/bc7d795cef5281a4f75d00df2607d78de0ae4bc/pebble-3.1.5.jar</p> <p> Dependency Hierarchy: - gatling-charts-highcharts-3.8.3.jar (Root Library) - gatling-app-3.8.3.jar - gatling-core-3.8.3.jar - :x: **pebble-3.1.5.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Pebble Templates 3.1.5 allows attackers to bypass a protection mechanism and implement arbitrary code execution with springbok <p>Publish Date: 2022-09-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37767>CVE-2022-37767</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a 
href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p>
CVE-2022-37767 (High) detected in pebble-3.1.5.jar - autoclosed
https://api.github.com/repos/opensearch-project/data-prepper/issues/1781/comments
1
2022-09-19T14:25:27Z
2022-09-19T21:57:37Z
https://github.com/opensearch-project/data-prepper/issues/1781
1,378,048,115
1,781
[ "opensearch-project", "data-prepper" ]
null
Validate server certificates on the client
https://api.github.com/repos/opensearch-project/data-prepper/issues/1775/comments
0
2022-09-16T14:37:49Z
2022-09-17T19:53:30Z
https://github.com/opensearch-project/data-prepper/issues/1775
1,376,044,952
1,775
[ "opensearch-project", "data-prepper" ]
null
Document Peer Forwarding in OpenSearch.org documentation
https://api.github.com/repos/opensearch-project/data-prepper/issues/1773/comments
1
2022-09-15T19:47:43Z
2022-10-06T16:46:33Z
https://github.com/opensearch-project/data-prepper/issues/1773
1,375,003,874
1,773
[ "opensearch-project", "data-prepper" ]
null
Create a guide to using Peer Forwarding (documentation task) in Data Prepper repo
https://api.github.com/repos/opensearch-project/data-prepper/issues/1772/comments
0
2022-09-15T19:47:41Z
2022-09-26T14:31:36Z
https://github.com/opensearch-project/data-prepper/issues/1772
1,375,003,840
1,772
[ "opensearch-project", "data-prepper" ]
The shutdown behavior of a few of the Processors could cause data loss/unintended behavior. **Grok Processor** The Grok Processor [calls shutdown on its local ExecutorService](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/grok-prepper/src/main/java/com/amazon/dataprepper/plugins/prepper/grok/GrokPrepper.java#L146) in the `prepareForShutdown` method. This occurs before the `ProcessWorker`s are shut down. Grok Processor configurations that specify a timeout would encounter an exception before the buffer data is cleared, as no new tasks can be submitted to the Grok Processor's ExecutorService once it has been shut down. **Otel Trace Raw Processor** The Otel Trace Raw Processor sets an `isShuttingDown` boolean to true as part of its `prepareForShutdown` method. This will cause all traces to be flushed, as [shouldGarbageCollect() now evaluates to true](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/otel-trace-raw-processor/src/main/java/com/amazon/dataprepper/plugins/processor/oteltrace/OTelTraceRawProcessor.java#L220). This is the correct behavior when the buffer is empty and the ProcessWorker's run method is in its final iteration. However, if the buffer is not empty, then multiple calls to the Otel Trace Raw Processor will be made with `isShuttingDown` set to true, prematurely flushing traces. **Aggregate Processor** The Aggregate Processor [does not have shutdown behavior implemented](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/aggregate-processor/src/main/java/com/amazon/dataprepper/plugins/processor/aggregate/AggregateProcessor.java#L116-L129), which will result in any partial aggregation data being dropped on pipeline shutdown.
**Service Map Stateful Processor** The Service Map Stateful Processor sets the [previousTimestamp to 0 in its prepareForShutdown method](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/service-map-stateful/src/main/java/com/amazon/dataprepper/plugins/prepper/ServiceMapStatefulPrepper.java#L271). This causes the window to be closed and the data flushed in the next `doExecute()` iteration which is the correct behavior if the buffer is empty. However the `rotateWindow()` method is also invoked which resets the `previousTimestamp`, thereby opening another window. If `doExecute()` must be invoked again as the buffer is not yet empty, then `prepareForShutdown()` method will have been for nothing and the window duration must be observed and flushed before the [isReadyForShutdown method](https://github.com/opensearch-project/data-prepper/blame/main/data-prepper-plugins/service-map-stateful/src/main/java/com/amazon/dataprepper/plugins/prepper/ServiceMapStatefulPrepper.java#L276) will evaluate to true and allow the ProcessWorkers to terminate. If the `processorShutdownTimeout` expires before the new window is flushed, that data will be dropped.
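The contract these four bugs violate can be sketched abstractly: `prepareForShutdown()` only flags the processor, `doExecute()` may still be called while the upstream buffer drains and should flush rather than drop partial state, and `isReadyForShutdown()` should report true only once internal state is empty. The class below is a hedged illustration with names echoing the issue text; it is not the actual Data Prepper Processor interface.

```java
// Hedged illustration of the shutdown lifecycle described above;
// not Data Prepper code.
public class ShutdownAwareProcessor {
    private volatile boolean shuttingDown = false;
    private int bufferedRecords = 0;

    void buffer(int records) {
        bufferedRecords += records;
    }

    void prepareForShutdown() {
        shuttingDown = true; // flag only; do not tear down resources yet
    }

    // May be called repeatedly until the upstream buffer is empty.
    int doExecute() {
        if (shuttingDown && bufferedRecords > 0) {
            final int flushed = bufferedRecords;
            bufferedRecords = 0;
            return flushed; // flush partial state instead of dropping it
        }
        return 0;
    }

    boolean isReadyForShutdown() {
        return bufferedRecords == 0; // true only once everything is flushed
    }
}
```

Measured against this contract: the Grok Processor tears down resources in `prepareForShutdown`, the Otel Trace Raw Processor flushes on every call after the flag is set, and the Aggregate Processor never flushes at all.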
Prevent Processor-Level Data Loss on Pipeline Shutdown
https://api.github.com/repos/opensearch-project/data-prepper/issues/1770/comments
10
2022-09-15T18:56:23Z
2022-09-29T03:52:49Z
https://github.com/opensearch-project/data-prepper/issues/1770
1,374,950,001
1,770
[ "opensearch-project", "data-prepper" ]
This is a follow-up to #1736 where one or more pipeline files can be loaded from the `pipelines/` directory. This moves one step further to enable recursive loading in case the pipeline files are organized into subdirectories.
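A minimal sketch of the recursive discovery step, assuming `.yaml`/`.yml` extensions mark pipeline files; the class and method names are illustrative, not the actual Data Prepper loader.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

// Illustrative sketch, not the real Data Prepper pipeline loader.
public class PipelineFileFinder {
    static boolean isPipelineFile(Path path) {
        final String name = path.getFileName().toString();
        return name.endsWith(".yaml") || name.endsWith(".yml");
    }

    // Walks the pipelines/ directory recursively, collecting every
    // regular file with a YAML extension in deterministic order.
    static List<Path> findPipelineFiles(Path root) throws IOException {
        try (Stream<Path> paths = Files.walk(root)) {
            return paths
                    .filter(Files::isRegularFile)
                    .filter(PipelineFileFinder::isPipelineFile)
                    .sorted()
                    .collect(Collectors.toList());
        }
    }
}
```

`Files.walk` visits subdirectories to arbitrary depth, so grouping pipeline files into per-team subfolders would work without further changes.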
Recursively load pipeline configurations from directory
https://api.github.com/repos/opensearch-project/data-prepper/issues/1769/comments
0
2022-09-15T18:43:48Z
2022-11-03T17:03:26Z
https://github.com/opensearch-project/data-prepper/issues/1769
1,374,937,165
1,769