issue_owner_repo listlengths 2 2 | issue_body stringlengths 0 262k ⌀ | issue_title stringlengths 1 1.02k | issue_comments_url stringlengths 53 116 | issue_comments_count int64 0 2.49k | issue_created_at stringdate 1999-03-17 02:06:42 2025-06-23 11:41:49 | issue_updated_at stringdate 2000-02-10 06:43:57 2025-06-23 11:43:00 | issue_html_url stringlengths 34 97 | issue_github_id int64 132 3.17B | issue_number int64 1 215k |
|---|---|---|---|---|---|---|---|---|---|
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The `trace_raw` plugin should use Core Peer Forwarding.
**Describe the solution you'd like**
Implement the `RequiresPeerForwarding` interface and use `traceId` as the identification key. | Update trace_raw to use Core Peer Forwarding | https://api.github.com/repos/opensearch-project/data-prepper/issues/1766/comments | 0 | 2022-09-14T19:08:06Z | 2022-09-16T15:14:26Z | https://github.com/opensearch-project/data-prepper/issues/1766 | 1,373,460,229 | 1,766 |
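The request above hinges on routing events across peers by an identification key. Data Prepper itself is Java, so the following is only an illustrative Python sketch (hash function and peer list are hypothetical) of how deterministic routing by `traceId` can work:

```python
import hashlib

def peer_for_trace(trace_id, peers):
    """Map a trace ID to one peer deterministically, so every span of a
    given trace is forwarded to the same node for stateful processing."""
    # Hash the identification key and pick a peer by modulo.
    digest = hashlib.sha256(trace_id.encode("utf-8")).digest()
    return peers[int.from_bytes(digest[:4], "big") % len(peers)]

peers = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]
# Spans sharing a trace ID always route to the same peer:
assert peer_for_trace("4bf92f3577b34da6", peers) == peer_for_trace("4bf92f3577b34da6", peers)
```

Any scheme with this property (consistent hashing, for instance) would satisfy the requirement that all events for one trace land on the same node.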
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The `service_map_stateful` plugin should use core peer forwarding.
**Describe the solution you'd like**
Implement the `RequiresPeerForwarding` interface and use the `traceId` as the identification key.
| Update service_map_stateful to use Core Peer Forwarding | https://api.github.com/repos/opensearch-project/data-prepper/issues/1765/comments | 0 | 2022-09-14T19:06:42Z | 2022-09-16T15:14:26Z | https://github.com/opensearch-project/data-prepper/issues/1765 | 1,373,458,770 | 1,765 |
[
"opensearch-project",
"data-prepper"
] | null | Improve Data Prepper assemble task to create a runnable distribution | https://api.github.com/repos/opensearch-project/data-prepper/issues/1762/comments | 0 | 2022-09-14T14:32:50Z | 2022-09-16T20:17:27Z | https://github.com/opensearch-project/data-prepper/issues/1762 | 1,373,094,775 | 1,762 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The Core Peer Forwarding (#700) feature will replace the existing `peer-forwarder`. However, an ideal trace pipeline should forward events prior to the `raw_trace` and `service_map_stateful` processors.
**Describe the solution you'd like**
Create a `trace_peer_forwarder` plugin, so that we can create a pipeline like the following.
```
entry-pipeline:
delay: "100"
source:
otel_trace_source:
processor:
- trace_peer_forwarder:
sink:
- pipeline:
name: "raw-pipeline"
- pipeline:
name: "service-map-pipeline"
raw-pipeline:
source:
pipeline:
name: "entry-pipeline"
processor:
- otel_trace_raw:
sink:
- opensearch:
service-map-pipeline:
delay: "100"
source:
pipeline:
name: "entry-pipeline"
processor:
- service_map_stateful:
sink:
- opensearch:
```
It can forward based on the `traceId`.
**Describe alternatives you've considered (Optional)**
Rely exclusively on `service_map_stateful` and `otel_trace_raw`; but then the events would need to be forwarded twice.
| Provide a trace_peer_forwarder plugin | https://api.github.com/repos/opensearch-project/data-prepper/issues/1759/comments | 2 | 2022-09-13T19:44:44Z | 2022-09-16T15:14:27Z | https://github.com/opensearch-project/data-prepper/issues/1759 | 1,371,927,179 | 1,759 |
[
"opensearch-project",
"data-prepper"
] | null | Support mTLS authentication between peers for core peer-forwarding | https://api.github.com/repos/opensearch-project/data-prepper/issues/1758/comments | 0 | 2022-09-13T19:40:02Z | 2022-09-16T20:19:05Z | https://github.com/opensearch-project/data-prepper/issues/1758 | 1,371,922,414 | 1,758 |
[
"opensearch-project",
"data-prepper"
How do I replace/override the raw-span-policy? I need to include the delete action for `otel-v1-apm-span-*` at Data Prepper setup time.
thanks! | raw-span-policy delete action | https://api.github.com/repos/opensearch-project/data-prepper/issues/1751/comments | 3 | 2022-09-12T05:07:01Z | 2022-10-03T15:00:29Z | https://github.com/opensearch-project/data-prepper/issues/1751 | 1,369,287,196 | 1,751 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
When you attempt `docker run` following the guide, or `docker compose up`, Data Prepper crashes with
data-prepper | 2022-09-10T23:18:28,864 [main] ERROR com.amazon.dataprepper.plugin.PluginCreator - Encountered exception while instantiating the plugin OpenSearchSink
**To Reproduce**
Steps to reproduce the behavior:
1. Clone the repo
2. Go to `examples`
3. Go to `log-ingestion`
4. Follow the instructions
5. As instructed, run the command `docker run --name data-prepper -v /full/path/to/log_pipeline.yaml:/usr/share/data-prepper/pipelines.yaml --network "data-prepper_opensearch-net" opensearchproject/data-prepper:latest`, substituting the network name (in my case, `docker network ls` showed `d61fd085926e log-ingestion_opensearch-net bridge local`)

6. Data-Prepper crashes with exception data-prepper | 2022-09-10T23:18:28,864 [main] ERROR com.amazon.dataprepper.plugin.PluginCreator - Encountered exception while instantiating the plugin OpenSearchSink
(See full log attached)
[CMD.report.txt](https://github.com/opensearch-project/data-prepper/files/9541545/CMD.report.txt)
**Expected behavior**
Data Prepper is running correctly; you should see something similar to the following line as the latest output in your terminal.
```
INFO com.amazon.dataprepper.pipeline.ProcessWorker - log-pipeline Worker: No records received from buffer
```
**Screenshots**

**Environment (please complete the following information):**
- OS: Windows 11
- Docker Desktop 4.12.0
- VSCode
**Additional context**
I also used docker compose to reproduce the issue; **the behavior is the same with the command line used and the full path specified**
[docker-compose.txt](https://github.com/opensearch-project/data-prepper/files/9541543/docker-compose.txt)
| [BUG] Ingesting example doesn't work, crashes with ERROR com.amazon.dataprepper.plugin.PluginCreator - Encountered exception while instantiating the plugin OpenSearchSink | https://api.github.com/repos/opensearch-project/data-prepper/issues/1749/comments | 1 | 2022-09-10T23:35:40Z | 2023-01-12T17:27:01Z | https://github.com/opensearch-project/data-prepper/issues/1749 | 1,368,754,470 | 1,749 |
[
"opensearch-project",
"data-prepper"
] | null | Write events received by server to buffer | https://api.github.com/repos/opensearch-project/data-prepper/issues/1746/comments | 0 | 2022-09-09T04:09:16Z | 2022-09-14T15:18:19Z | https://github.com/opensearch-project/data-prepper/issues/1746 | 1,367,260,310 | 1,746 |
[
"opensearch-project",
"data-prepper"
] | Create a Sink for sending Metrics data to CloudWatch | Support CloudWatch as a Sink for Metrics | https://api.github.com/repos/opensearch-project/data-prepper/issues/1745/comments | 0 | 2022-09-09T02:10:24Z | 2022-09-09T02:10:24Z | https://github.com/opensearch-project/data-prepper/issues/1745 | 1,367,188,457 | 1,745 |
[
"opensearch-project",
"data-prepper"
] | Create a Sink which provides metrics to Prometheus as an API so that Prometheus can scrape Data Prepper to get metrics. | Support Prometheus as a Sink (pull/scrape) | https://api.github.com/repos/opensearch-project/data-prepper/issues/1744/comments | 3 | 2022-09-09T02:06:44Z | 2025-03-04T20:59:59Z | https://github.com/opensearch-project/data-prepper/issues/1744 | 1,367,186,045 | 1,744 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The time for processors to flush their data downstream and shutdown is [hardcoded to 10 seconds](https://github.com/opensearch-project/data-prepper/blob/66e21adef6566c742c9d60ae1c91ef06ed27f232/data-prepper-core/src/main/java/org/opensearch/dataprepper/pipeline/Pipeline.java#L169). This may or may not be enough time for a processor to clear its data and prevent data loss.
**Describe the solution you'd like**
Make the shutdown timeout configurable via the `data-prepper-config.yaml`
**Describe alternatives you've considered (Optional)**
Adding the timeout as a parameter to the shutdown API.
| Make ProcessorTimeout in Shutdown API Configurable | https://api.github.com/repos/opensearch-project/data-prepper/issues/1742/comments | 2 | 2022-09-08T21:47:27Z | 2022-09-15T19:16:25Z | https://github.com/opensearch-project/data-prepper/issues/1742 | 1,367,022,803 | 1,742 |
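The configurability requested above is straightforward to sketch. Data Prepper is Java and the actual config key is not yet defined, so the key name `processorShutdownTimeout` below is hypothetical and Python is used purely for illustration:

```python
DEFAULT_PROCESSOR_SHUTDOWN_TIMEOUT_SECONDS = 10  # today's hardcoded value

def processor_shutdown_timeout(config):
    """Read the shutdown timeout from a parsed data-prepper-config.yaml
    dict, falling back to the current hardcoded default."""
    # Key name is a hypothetical placeholder, not an existing setting.
    value = config.get("processorShutdownTimeout",
                       DEFAULT_PROCESSOR_SHUTDOWN_TIMEOUT_SECONDS)
    timeout = int(value)
    if timeout <= 0:
        raise ValueError("processorShutdownTimeout must be positive")
    return timeout
```

With a default in place, existing deployments keep today's 10-second behavior unless they opt in to a different value.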
[
"opensearch-project",
"data-prepper"
] | This will make Data Prepper run all pipelines from a pipelines directory.
For example:
```
data-prepper-$VERSION/
bin/
data-prepper # Shell script to run Data Prepper on Linux/macOS
pipelines/ # New directory for pipelines
trace-analytics.yaml
log-ingest.yaml
... # Other files and directories
```
With this change, users can run Data Prepper by running:
```
bin/data-prepper
```
Data Prepper would then read the `pipelines/` directory for YAML files, parse each one, and start the pipelines defined in all those configurations.
In the example above, Data Prepper would run both a trace-analytics pipeline and a log-ingest pipeline. This will allow users to keep their pipeline definitions distinct and thus more compact and focused.
(Originally from https://github.com/opensearch-project/data-prepper/issues/305#issuecomment-1024635979) | Load pipeline files from the pipelines directory | https://api.github.com/repos/opensearch-project/data-prepper/issues/1736/comments | 0 | 2022-09-08T14:25:36Z | 2022-09-16T19:11:42Z | https://github.com/opensearch-project/data-prepper/issues/1736 | 1,366,447,896 | 1,736 |
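The directory-loading behavior proposed above can be sketched as a simple scan for YAML files (Python for illustration only; the real implementation would be Java):

```python
from pathlib import Path

def find_pipeline_files(pipelines_dir):
    """Return every pipeline definition file in the pipelines/ directory,
    sorted for a deterministic startup order."""
    root = Path(pipelines_dir)
    # Only .yaml/.yml files count as pipeline definitions; other files
    # and subdirectories in the distribution are ignored.
    return sorted(p for p in root.iterdir()
                  if p.is_file() and p.suffix in {".yaml", ".yml"})
```

In the example layout above, this would pick up both `trace-analytics.yaml` and `log-ingest.yaml`, and each file's pipelines would be started independently.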
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
The `otel_trace_raw` processor does not seem to work. The pipeline using this processor fails with the error message below:
```
data-prepper | 2022-09-08T07:01:33,206 [raw-pipeline-processor-worker-5-thread-1] ERROR com.amazon.dataprepper.pipeline.ProcessWorker - Encountered exception during pipeline raw-pipeline processing
data-prepper | java.lang.ClassCastException: class io.opentelemetry.proto.collector.trace.v1.ExportTraceServiceRequest cannot be cast to class com.amazon.dataprepper.model.trace.Span (io.opentelemetry.proto.collector.trace.v1.ExportTraceServiceRequest and com.amazon.dataprepper.model.trace.Span are in unnamed module of loader 'app')
data-prepper | at com.amazon.dataprepper.plugins.processor.oteltrace.OTelTraceRawProcessor.doExecute(OTelTraceRawProcessor.java:77) ~[data-prepper.jar:1.5.1]
data-prepper | at com.amazon.dataprepper.model.processor.AbstractProcessor.lambda$execute$0(AbstractProcessor.java:55) ~[data-prepper.jar:1.5.1]
data-prepper | at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:65) ~[data-prepper.jar:1.5.1]
data-prepper | at com.amazon.dataprepper.model.processor.AbstractProcessor.execute(AbstractProcessor.java:55) ~[data-prepper.jar:1.5.1]
data-prepper | at com.amazon.dataprepper.pipeline.ProcessWorker.run(ProcessWorker.java:62) ~[data-prepper.jar:1.5.1]
data-prepper | at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
data-prepper | at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
data-prepper | at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) ~[?:?]
data-prepper | at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) ~[?:?]
data-prepper | at java.lang.Thread.run(Thread.java:832) ~[?:?]
```
I am using the sample app in [examples/trace-analytics-sample-app](https://github.com/opensearch-project/data-prepper/tree/main/examples/trace-analytics-sample-app) to test this.
I am not able to see any services or traces on the dashboard, though the service map does seem to work.
**To Reproduce**
Steps to reproduce the behavior:
1. Follow instruction in [examples/trace-analytics-sample-app](https://github.com/opensearch-project/data-prepper/tree/main/examples/trace-analytics-sample-app).
2. Check data-prepper logs. You should see the above error.
3. Open opensearch-dashboards, and navigate to `app/observability-dashboards#/trace_analytics/services`.
You should see some information in the service map section, but service and traces sections are empty.
4. In the opensearch-dashboards, check indices in Stack Management page, you should see no docs in the otel-v1-apm-span-000001 index.
**Expected behavior**
The example should work as expected.
**Screenshots**
NA
**Environment (please complete the following information):**
- OS: Amazon Linux
- Version: 2
**Additional context**
[examples/trace-analytics-sample-app](https://github.com/opensearch-project/data-prepper/tree/main/examples/trace-analytics-sample-app) also does not build unless you revert https://github.com/opensearch-project/data-prepper/pull/1691. | [BUG] Encountered exception during pipeline raw-pipeline processing. | https://api.github.com/repos/opensearch-project/data-prepper/issues/1735/comments | 5 | 2022-09-08T07:32:09Z | 2022-10-07T16:37:54Z | https://github.com/opensearch-project/data-prepper/issues/1735 | 1,365,689,561 | 1,735 |
[
"opensearch-project",
"data-prepper"
] | There are still plugins that have "prepper" in their names, e.g., ServiceMapStatefulPrepper, GrokPrepper. They should be changed to use "processor". This is a follow-up to the interface and configuration name change in #619. | Change plugin name from prepper to processor | https://api.github.com/repos/opensearch-project/data-prepper/issues/1730/comments | 0 | 2022-09-07T18:32:07Z | 2022-09-30T15:30:26Z | https://github.com/opensearch-project/data-prepper/issues/1730 | 1,365,044,963 | 1,730 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Unable to start Data Prepper with a http source configured to use ACM
**To Reproduce**
Steps to reproduce the behavior:
data-prepper-config.yaml file
```
ssl: true
keyStoreFilePath: "/Users/smbayer/.config/data-prepper/keystore.p12"
keyStorePassword: "password"
privateKeyPassword: "password"
```
pipelines.yaml file
```
http-source-pipeline:
source:
http:
ssl: true
ssl_key_file: 'not-null'
ssl_certificate_file: 'not-null'
max_connection_count: 2000
request_timeout: 10000
port: 2021
use_acm_certificate_for_ssl: true
aws_region: us-east-1
acm_private_key_password: "5%95WemteNXNPZPn@fbx"
acm_certificate_arn: "acm cert arn"
sink:
- stdout:
```
Then running these command to start data prepper
```
./gradlew :release:archives:buildArchives -Prelease
tar -xzf "${DISTRIBUTIONS}/opensearch-data-prepper-2.0.0-SNAPSHOT-linux-x64.tar.gz" -C "${DISTRIBUTIONS}"
eval \
"${DISTRIBUTIONS}/opensearch-data-prepper-${VERSION}-linux-x64/bin/data-prepper" \
"${PIPELINES_YAML}" \
"${CONFIG_YAML}"
```
On startup, I observe the following error message:
```
2022-09-07T10:45:00,660 [main] WARN com.amazon.dataprepper.plugins.source.loghttp.HTTPSource - Creating http source without authentication. This is not secure.
2022-09-07T10:45:00,663 [main] WARN com.amazon.dataprepper.plugins.source.loghttp.HTTPSource - In order to set up Http Basic authentication for the http source, go here: https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/http-source#authentication-configurations
2022-09-07T10:45:00,745 [main] WARN org.opensearch.dataprepper.pipeline.server.config.DataPrepperServerConfiguration - Creating data prepper server without authentication. This is not secure.
2022-09-07T10:45:00,745 [main] WARN org.opensearch.dataprepper.pipeline.server.config.DataPrepperServerConfiguration - In order to set up Http Basic authentication for the data prepper server, go here: https://github.com/opensearch-project/data-prepper/blob/main/docs/core_apis.md#authentication
2022-09-07T10:45:01,392 [main] INFO com.amazon.dataprepper.plugins.source.loghttp.HTTPSource - Creating http source with SSL/TLS enabled.
2022-09-07T10:45:01,392 [main] INFO com.amazon.dataprepper.plugins.source.loghttp.certificate.CertificateProviderFactory - Using ACM certificate and private key for SSL/TLS.
2022-09-07T10:45:01,704 [main] ERROR org.opensearch.dataprepper.pipeline.Pipeline - Pipeline [http-source-pipeline] encountered exception while starting the source, skipping execution
software.amazon.awssdk.core.exception.SdkClientException: Multiple HTTP implementations were found on the classpath. To avoid non-deterministic loading implementations, please explicitly provide an HTTP client via the client builders, set the software.amazon.awssdk.http.service.impl system property with the FQCN of the HTTP service to use as the default, or remove all but one HTTP implementation from the classpath
at software.amazon.awssdk.core.exception.SdkClientException$BuilderImpl.build(SdkClientException.java:102) ~[sdk-core-2.17.247.jar:?]
at software.amazon.awssdk.core.internal.http.loader.ClasspathSdkHttpServiceProvider.loadService(ClasspathSdkHttpServiceProvider.java:62) ~[sdk-core-2.17.247.jar:?]
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:195) ~[?:?]
at java.util.Spliterators$ArraySpliterator.tryAdvance(Spliterators.java:958) ~[?:?]
at java.util.stream.ReferencePipeline.forEachWithCancel(ReferencePipeline.java:127) ~[?:?]
at java.util.stream.AbstractPipeline.copyIntoWithCancel(AbstractPipeline.java:502) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:488) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:474) ~[?:?]
at java.util.stream.FindOps$FindOp.evaluateSequential(FindOps.java:150) ~[?:?]
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[?:?]
at java.util.stream.ReferencePipeline.findFirst(ReferencePipeline.java:543) ~[?:?]
at software.amazon.awssdk.core.internal.http.loader.SdkHttpServiceProviderChain.loadService(SdkHttpServiceProviderChain.java:44) ~[sdk-core-2.17.247.jar:?]
at software.amazon.awssdk.core.internal.http.loader.CachingSdkHttpServiceProvider.loadService(CachingSdkHttpServiceProvider.java:46) ~[sdk-core-2.17.247.jar:?]
at software.amazon.awssdk.core.internal.http.loader.DefaultSdkHttpClientBuilder.buildWithDefaults(DefaultSdkHttpClientBuilder.java:40) ~[sdk-core-2.17.247.jar:?]
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.lambda$resolveSyncHttpClient$7(SdkDefaultClientBuilder.java:343) ~[sdk-core-2.17.247.jar:?]
at java.util.Optional.orElseGet(Optional.java:362) ~[?:?]
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.resolveSyncHttpClient(SdkDefaultClientBuilder.java:343) ~[sdk-core-2.17.247.jar:?]
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.finalizeSyncConfiguration(SdkDefaultClientBuilder.java:282) ~[sdk-core-2.17.247.jar:?]
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.syncClientConfiguration(SdkDefaultClientBuilder.java:178) ~[sdk-core-2.17.247.jar:?]
at software.amazon.awssdk.services.acm.DefaultAcmClientBuilder.buildClient(DefaultAcmClientBuilder.java:27) ~[acm-2.17.209.jar:?]
at software.amazon.awssdk.services.acm.DefaultAcmClientBuilder.buildClient(DefaultAcmClientBuilder.java:22) ~[acm-2.17.209.jar:?]
at software.amazon.awssdk.core.client.builder.SdkDefaultClientBuilder.build(SdkDefaultClientBuilder.java:145) ~[sdk-core-2.17.247.jar:?]
at com.amazon.dataprepper.plugins.source.loghttp.certificate.CertificateProviderFactory.getCertificateProvider(CertificateProviderFactory.java:47) ~[http-source-2.0.0-SNAPSHOT.jar:?]
at com.amazon.dataprepper.plugins.source.loghttp.HTTPSource.start(HTTPSource.java:86) ~[http-source-2.0.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.pipeline.Pipeline.execute(Pipeline.java:144) ~[data-prepper-core-2.0.0-SNAPSHOT.jar:?]
at org.opensearch.dataprepper.DataPrepper.lambda$execute$0(DataPrepper.java:70) ~[data-prepper-core-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at java.util.LinkedHashMap.forEach(LinkedHashMap.java:723) ~[?:?]
at org.opensearch.dataprepper.DataPrepper.execute(DataPrepper.java:69) ~[data-prepper-core-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
at org.opensearch.dataprepper.DataPrepperExecute.main(DataPrepperExecute.java:26) ~[data-prepper-main-2.0.0-SNAPSHOT.jar:2.0.0-SNAPSHOT]
```
**Expected behavior**
Data Prepper should start with the HTTP source enforcing SSL using the ACM certificate.
**Environment (please complete the following information):**
- OS: MacOS 12.5.1
- Version Main branch (commit 27db8ad2baacb00d1332a69b9350f9b7ba2e8c9e)
| [BUG] Unable to start Data Prepper with a http source configured to use ACM | https://api.github.com/repos/opensearch-project/data-prepper/issues/1729/comments | 0 | 2022-09-07T15:47:23Z | 2022-09-08T15:52:04Z | https://github.com/opensearch-project/data-prepper/issues/1729 | 1,364,869,124 | 1,729 |
[
"opensearch-project",
"data-prepper"
] | This is for Data Prepper to read the config file located at config/data-prepper-config.yaml rather than require it as a command-line argument. | Load configuration files from the config directory | https://api.github.com/repos/opensearch-project/data-prepper/issues/1728/comments | 0 | 2022-09-07T14:41:16Z | 2022-09-13T19:01:34Z | https://github.com/opensearch-project/data-prepper/issues/1728 | 1,364,773,637 | 1,728 |
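The config-directory behavior requested above amounts to a default-path fallback. A minimal sketch (Python for illustration; function name and CLI handling are hypothetical):

```python
from pathlib import Path

# The conventional location described in the request.
DEFAULT_CONFIG = Path("config") / "data-prepper-config.yaml"

def resolve_config_path(cli_arg=None):
    """Prefer an explicit command-line path when given; otherwise fall
    back to config/data-prepper-config.yaml in the working directory."""
    return Path(cli_arg) if cli_arg else DEFAULT_CONFIG
```

This keeps backward compatibility: passing the path on the command line still works, but a plain `bin/data-prepper` invocation finds the config by convention.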
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-38750 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.26.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /e2e-test/log/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar</p>
<p>
Dependency Hierarchy:
- opensearch-rest-high-level-client-1.3.5.jar (Root Library)
- opensearch-1.3.5.jar
- opensearch-x-content-1.3.5.jar
- :x: **snakeyaml-1.26.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38750>CVE-2022-38750</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47027">https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47027</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.31</p>
</p>
</details>
<p></p>
| CVE-2022-38750 (Medium) detected in snakeyaml-1.26.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/1725/comments | 1 | 2022-09-06T17:53:09Z | 2022-12-01T21:59:08Z | https://github.com/opensearch-project/data-prepper/issues/1725 | 1,363,629,146 | 1,725 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-38751 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.26.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /e2e-test/log/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar</p>
<p>
Dependency Hierarchy:
- opensearch-rest-high-level-client-1.3.5.jar (Root Library)
- opensearch-1.3.5.jar
- opensearch-x-content-1.3.5.jar
- :x: **snakeyaml-1.26.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38751>CVE-2022-38751</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47039">https://bugs.chromium.org/p/oss-fuzz/issues/detail?id=47039</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.31</p>
</p>
</details>
<p></p>
| CVE-2022-38751 (Medium) detected in snakeyaml-1.26.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/1724/comments | 1 | 2022-09-06T17:53:07Z | 2022-12-01T21:59:06Z | https://github.com/opensearch-project/data-prepper/issues/1724 | 1,363,629,115 | 1,724 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-38752 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.31.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="https://bitbucket.org/snakeyaml/snakeyaml">https://bitbucket.org/snakeyaml/snakeyaml</a></p>
<p>Path to dependency file: /data-prepper-plugins/otel-metrics-source/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.31/cf26b7b05fef01e7bec00cb88ab4feeeba743e12/snakeyaml-1.31.jar (the same path repeated once per dependent module)</p>
<p>
Dependency Hierarchy:
- jackson-dataformat-yaml-2.13.4.jar (Root Library)
- :x: **snakeyaml-1.31.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38752>CVE-2022-38752</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-9w3m-gqgf-c4p9">https://github.com/advisories/GHSA-9w3m-gqgf-c4p9</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution (org.yaml:snakeyaml): 1.32</p>
<p>Direct dependency fix Resolution (com.fasterxml.jackson.dataformat:jackson-dataformat-yaml): 2.14.0</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2022-38752 (Medium) detected in snakeyaml-1.31.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/1723/comments | 1 | 2022-09-06T17:53:05Z | 2022-12-15T22:54:02Z | https://github.com/opensearch-project/data-prepper/issues/1723 | 1,363,629,092 | 1,723 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-38749 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.26.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /e2e-test/log/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar</p>
<p>
Dependency Hierarchy:
- opensearch-rest-high-level-client-1.3.5.jar (Root Library)
- opensearch-1.3.5.jar
- opensearch-x-content-1.3.5.jar
- :x: **snakeyaml-1.26.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Using snakeYAML to parse untrusted YAML files may be vulnerable to Denial of Service (DoS) attacks. If the parser is running on user-supplied input, an attacker may supply content that causes the parser to crash by stack overflow.
<p>Publish Date: 2022-09-05
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-38749>CVE-2022-38749</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027">https://bitbucket.org/snakeyaml/snakeyaml/issues/526/stackoverflow-oss-fuzz-47027</a></p>
<p>Release Date: 2022-09-05</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.31</p>
</p>
</details>
<p></p>
| CVE-2022-38749 (Medium) detected in snakeyaml-1.26.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/1722/comments | 1 | 2022-09-06T17:53:03Z | 2022-12-01T21:59:11Z | https://github.com/opensearch-project/data-prepper/issues/1722 | 1,363,629,067 | 1,722 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
I'm trying to set up a monitoring solution on macOS using Docker images of OpenSearch, Data Prepper, and the AWS Distro for OpenTelemetry Collector. OpenSearch comes up, but when I start Data Prepper the error below is thrown. I don't see much help on Google, so I'm raising an issue. Please help. I'm attaching my Data Prepper pipeline YAML. For OpenSearch there are no additional configs; I'm just running the plain vanilla Docker image.
Caused by: java.net.ConnectException: Connection refused
at org.opensearch.client.RestClient.extractAndWrapCause(RestClient.java:907) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:301) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:289) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1762) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1728) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1696) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.ClusterClient.getSettings(ClusterClient.java:119) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager.checkISMEnabled(IndexManager.java:149) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager.checkAndCreateIndexTemplate(IndexManager.java:165) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager.setupIndex(IndexManager.java:160) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.initialize(OpenSearchSink.java:105) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:89) ~[data-prepper.jar:1.5.1]
Commands used
=============
opensearch
------------
docker run \
-p 9200:9200 -p 9600:9600 \
-e "discovery.type=single-node" \
-e "plugins.security.disabled=true" \
opensearchproject/opensearch:2.2.0
curl -XGET http://localhost:9200 -u 'admin:admin' --insecure
{
"name" : "307e3f84568c",
"cluster_name" : "opensearch",
"cluster_uuid" : "TjH1gMngR0Oe9a94tZV0cA",
"version" : {
"distribution" : "opensearch",
"number" : "2.2.0",
"build_type" : "tar",
"build_hash" : "b1017fa3b9a1c781d4f34ecee411e0cdf930a515",
"build_date" : "2022-08-09T02:28:05.169390805Z",
"build_snapshot" : false,
"lucene_version" : "9.3.0",
"minimum_wire_compatibility_version" : "7.10.0",
"minimum_index_compatibility_version" : "7.0.0"
},
"tagline" : "The OpenSearch Project: https://opensearch.org/"
}
[pipeline.txt](https://github.com/opensearch-project/data-prepper/files/9484433/pipeline.txt)
[data-prepper.txt](https://github.com/opensearch-project/data-prepper/files/9484434/data-prepper.txt)
data-prepper
------------
docker run --name data-prepper \
-v /Users/vishnu/git/opentelemetry-poc/javaagent-ex/opentelemetry-java-docs/javaagent/pipeline.yaml:/usr/share/data-prepper/pipelines.yaml \
-v /Users/vishnu/git/opentelemetry-poc/javaagent-ex/opentelemetry-java-docs/javaagent/data-prepper.yaml:/usr/share/data-prepper/data-prepper-config.yaml \
opensearchproject/data-prepper:latest

[error.txt](https://github.com/opensearch-project/data-prepper/files/9484436/error.txt)
Running OpenSearch as an HTTP instance.
error:
| [BUG] Opensearch connection refused | https://api.github.com/repos/opensearch-project/data-prepper/issues/1720/comments | 13 | 2022-09-04T09:56:27Z | 2023-10-01T06:01:13Z | https://github.com/opensearch-project/data-prepper/issues/1720 | 1,361,099,622 | 1,720 |
[
"opensearch-project",
"data-prepper"
] | Have a class which is dedicated to the transformation between WireEvent and Event in both directions. The client and server would use this class. It could even have two interfaces so they are not necessarily coupled.
Here is an outline of it:
The server would use:
```
interface WireEventDeserializer {
    Event deserializeEvent(WireEvent wireEvent);
}
```
The client would use:
```
interface WireEventSerializer {
    WireEvent serializeEvent(Event event);
}
```
Then you can have one class which implements both:
```
class WireEventTransformer implements WireEventSerializer, WireEventDeserializer
```
All the logic for this transformation is in one place.
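As a rough sketch of how those pieces could fit together (the `Event` and `WireEvent` classes below are simplified, map-backed stand-ins for the real Data Prepper types, which carry JSON payloads and metadata):

```java
import java.util.HashMap;
import java.util.Map;

// Simplified stand-ins for Data Prepper's real model classes (assumption for
// illustration only; the real Event and WireEvent are richer).
final class Event {
    final Map<String, Object> data;

    Event(final Map<String, Object> data) {
        this.data = data;
    }
}

final class WireEvent {
    final Map<String, Object> payload;

    WireEvent(final Map<String, Object> payload) {
        this.payload = payload;
    }
}

interface WireEventSerializer {
    WireEvent serializeEvent(Event event);
}

interface WireEventDeserializer {
    Event deserializeEvent(WireEvent wireEvent);
}

// One class owns the transformation in both directions, so the server and the
// client can each depend only on the interface they need.
final class WireEventTransformer implements WireEventSerializer, WireEventDeserializer {
    @Override
    public WireEvent serializeEvent(final Event event) {
        return new WireEvent(new HashMap<>(event.data));
    }

    @Override
    public Event deserializeEvent(final WireEvent wireEvent) {
        return new Event(new HashMap<>(wireEvent.payload));
    }
}
```

The server would depend only on `WireEventDeserializer` and the client only on `WireEventSerializer`, while `WireEventTransformer` keeps the transformation logic in a single place.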
_Originally posted by @dlvenable in https://github.com/opensearch-project/data-prepper/pull/1705#discussion_r960057128_ | Add a Dedicated class for transformation between Event and WireEvent | https://api.github.com/repos/opensearch-project/data-prepper/issues/1718/comments | 0 | 2022-09-01T16:25:10Z | 2023-01-09T21:43:12Z | https://github.com/opensearch-project/data-prepper/issues/1718 | 1,359,108,847 | 1,718 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
If Data Prepper is unable to access SQS (say, a permissions issue), it keeps retrying rapidly and produces a lot of error logs.
**Describe the solution you'd like**
I'm interested in either or both of the following:
* Have a backoff strategy when unable to read from the SQS queue.
* Don't log the error each time. If the error is the same, it may not be necessary to repeat the log each time.
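For the backoff item above, a capped exponential backoff between failed SQS polls could look something like this (a sketch only; the class name and thresholds are illustrative assumptions, not existing Data Prepper settings):

```java
import java.time.Duration;

// Capped exponential backoff: the delay doubles after each consecutive
// failure and resets on success. All numbers here are placeholders.
final class SqsBackoff {
    private static final Duration INITIAL = Duration.ofMillis(500);
    private static final Duration MAX = Duration.ofSeconds(30);
    private int consecutiveFailures = 0;

    Duration nextDelay() {
        // Cap the exponent so the multiplication cannot overflow.
        final long multiplier = 1L << Math.min(consecutiveFailures, 10);
        final Duration candidate = INITIAL.multipliedBy(multiplier);
        consecutiveFailures++;
        return candidate.compareTo(MAX) > 0 ? MAX : candidate;
    }

    void reset() {
        consecutiveFailures = 0;
    }
}
```

Each consecutive failure doubles the delay up to the cap, and a successful poll resets it, so a transient permissions or network problem does not turn into a tight retry loop that floods the logs.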
**Describe alternatives you've considered (Optional)**
N/A
**Additional context**
N/A
| Improve S3 Source behavior when SQS is unavailable | https://api.github.com/repos/opensearch-project/data-prepper/issues/1708/comments | 3 | 2022-09-01T01:52:58Z | 2023-03-13T18:41:06Z | https://github.com/opensearch-project/data-prepper/issues/1708 | 1,358,134,130 | 1,708 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-25857 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.26.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /e2e-test/log/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.yaml/snakeyaml/1.26/a78a8747147d2c5807683e76ec2b633e95c14fe9/snakeyaml-1.26.jar</p>
<p>
Dependency Hierarchy:
- opensearch-rest-high-level-client-1.3.5.jar (Root Library)
- opensearch-1.3.5.jar
- opensearch-x-content-1.3.5.jar
- :x: **snakeyaml-1.26.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/opensearch-project/data-prepper/commit/90bdaa7e7833bdd504c817e49d4434b4d8880f56">90bdaa7e7833bdd504c817e49d4434b4d8880f56</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package org.yaml:snakeyaml from 0 and before 1.31 is vulnerable to Denial of Service (DoS) due to a missing nested depth limitation for collections.
<p>Publish Date: 2022-08-30
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-25857>CVE-2022-25857</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-25857</a></p>
<p>Release Date: 2022-08-30</p>
<p>Fix Resolution: org.yaml:snakeyaml:1.31</p>
</p>
</details>
<p></p>
| CVE-2022-25857 (High) detected in snakeyaml-1.26.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/1701/comments | 1 | 2022-08-31T16:48:14Z | 2022-12-01T21:59:03Z | https://github.com/opensearch-project/data-prepper/issues/1701 | 1,357,603,954 | 1,701 |
[
"opensearch-project",
"data-prepper"
] | Currently, core peer forwarder has a limitation: it creates beans with `ssl` set to `true` by default.
Solution 1:
Make ssl false by default.
Solution 2:
Have a configuration with default values for ssl certificate file and key file.
Solution 3:
Make Data Prepper use a configuration file while building beans where ssl is disabled. | Make Peer Forwarding secure by default | https://api.github.com/repos/opensearch-project/data-prepper/issues/1699/comments | 3 | 2022-08-30T15:30:18Z | 2022-10-04T15:29:24Z | https://github.com/opensearch-project/data-prepper/issues/1699 | 1,355,949,218 | 1,699 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The `s3` source includes two codecs in 1.5, and a new codec for CSV processing is coming in 2.0. These populate Events somewhat differently.
* `newline-delimited` -> Each line is saved to the `message` key of the Event as a single string.
* `json` -> The JSON is expanded into `message`. So, if the JSON has a key named `sourceIp`, it is populated in `/message/sourceIp`.
* `csv` -> Each key is expanded directly into the root of the Event (`/`). Thus, if the CSV has a key named `sourceIp`, it is populated in `/sourceIp`.
Also, the `s3` source adds two special keys to all Events: `bucket` and `key`. These indicate the S3 bucket and key, respectively, for the object. The `s3` source populates these, not the codecs.
**Describe the solution you'd like**
First, all codecs should put the data in the same place consistently. Second, we should decide where we want this data to reside (`/message` or `/`). Third, it should avoid conflicting with the `bucket` and `key`.
One possible solution is to change the `s3` source to save the `bucket` and `key` to a top-level object named `s3`. Then the codecs save to the root (`/`). This could lead to conflicts if the actual data has a column or field named `s3`. But, if we make this key configurable, then pipeline authors could potentially avoid this.
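As a toy illustration of that idea (plain maps stand in for Events here, and `metadataRootKey`/`S3EventAssembler` are hypothetical names used only for this sketch):

```java
import java.util.HashMap;
import java.util.Map;

// Toy model: the codec's decoded fields and the source's bucket/key metadata
// are merged into one event map, with the metadata nested under a
// configurable root key (e.g. "s3").
final class S3EventAssembler {
    private final String metadataRootKey; // hypothetical option name

    S3EventAssembler(final String metadataRootKey) {
        this.metadataRootKey = metadataRootKey;
    }

    Map<String, Object> assemble(final Map<String, Object> decodedFields,
                                 final String bucket, final String key) {
        final Map<String, Object> event = new HashMap<>(decodedFields);
        final Map<String, Object> s3Metadata = new HashMap<>();
        s3Metadata.put("bucket", bucket);
        s3Metadata.put("key", key);
        event.put(metadataRootKey, s3Metadata);
        return event;
    }
}
```

Because the root key is configurable, a pipeline whose data really does contain an `s3` column could pick a different key and avoid the collision.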
**Describe alternatives you've considered (Optional)**
An alternative would be more robust support for Event metadata. The bucket and key could be saved as metadata. However, Data Prepper's conditional routing and processors don't support Event metadata presently.
**Additional context**
* #251
* #1081
| S3 Event Decoding Consistency | https://api.github.com/repos/opensearch-project/data-prepper/issues/1687/comments | 2 | 2022-08-23T15:28:35Z | 2022-09-26T22:30:08Z | https://github.com/opensearch-project/data-prepper/issues/1687 | 1,348,140,305 | 1,687 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently, Data Prepper supports SSL certificates from ACM, S3, and file. It'd be nice to have support for AWS Secrets Manager certificate providers in `com.amazon.dataprepper.plugins.certificate`
**Describe the solution you'd like**
A clear and concise description of what you want to happen. | Support AWS secrets manager certificate provider to enable TLS/SSL | https://api.github.com/repos/opensearch-project/data-prepper/issues/1679/comments | 0 | 2022-08-19T06:42:08Z | 2022-11-03T17:02:36Z | https://github.com/opensearch-project/data-prepper/issues/1679 | 1,344,014,511 | 1,679 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
As of now (mandatory in 1.4), we should use the OTel Trace Raw Processor and no longer the otel_trace_raw_prepper.
The otel-trace-group-prepper seems to use the output of otel_trace_raw_prepper.
Is it still relevant for the “OTel Trace Raw Processor”? Are they compatible, or will it become obsolete like otel_trace_raw_prepper in Data Prepper 1.4?
We are building the OTel pipeline and following the documentation's recommendation to use the “OTel Trace Raw Processor”.
We are interested in the "otel-trace-group-prepper" because some of our elements are processed through a RabbitMQ queue. That means that by the time a trace reaches Data Prepper, we will be outside the window. As we understand it, this prepper will help fill the gap by looking up information already stored in OpenSearch to enrich the current set of data before import.
Thanks for the help in clarification, | Does otel-trace-group-prepper will be obsolote in 1.4? Is it compatible with the "OTel Trace Raw Processor" | https://api.github.com/repos/opensearch-project/data-prepper/issues/1673/comments | 2 | 2022-08-17T11:03:20Z | 2022-08-17T15:12:39Z | https://github.com/opensearch-project/data-prepper/issues/1673 | 1,341,589,944 | 1,673 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Core peer forwarder currently stores all the peers in a map as they become available, but it does not remove them from the map as they exit.
**Describe the solution you'd like**
Clean up the resources in shutdown call of `PeerForwardingProcessorDecorator`.
**Additional context**
This is the same behavior even in the peer forwarder plugin [here](https://github.com/opensearch-project/data-prepper/blob/c51ff6dc40531aed272291f77b5c297919cb21b1/data-prepper-plugins/peer-forwarder/src/main/java/com/amazon/dataprepper/plugins/prepper/peerforwarder/PeerClientPool.java#L56).
There is an existing [TODO](https://github.com/opensearch-project/data-prepper/blob/c51ff6dc40531aed272291f77b5c297919cb21b1/data-prepper-plugins/peer-forwarder/src/main/java/com/amazon/dataprepper/plugins/prepper/peerforwarder/PeerForwarder.java#L317) in peer forwarder plugin to do it.
| Support to remove peers from peer client poll as they exit | https://api.github.com/repos/opensearch-project/data-prepper/issues/1672/comments | 1 | 2022-08-16T17:01:45Z | 2023-01-06T21:39:38Z | https://github.com/opensearch-project/data-prepper/issues/1672 | 1,340,638,220 | 1,672 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Add support in Http Source to enable TLS/SSL using certificate and private key from AWS Certificate Manager (ACM) similar to ACM support in [OTel Trace Source](https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/otel-trace-source#ssl)
**Describe the solution you'd like**
Follow a similar approach to the OTel Trace Source.
**Describe alternatives you've considered (Optional)**
None
**Additional context**
We already support ACM integration in OTel Trace Source, so ACM support for Log Source will help keep SSL configuration consistent across different source. | Add support in Http Source to enable TLS/SSL using certificate and private key from AWS Certificate Manager (ACM) | https://api.github.com/repos/opensearch-project/data-prepper/issues/1670/comments | 1 | 2022-08-16T15:37:48Z | 2022-08-22T20:24:42Z | https://github.com/opensearch-project/data-prepper/issues/1670 | 1,340,542,233 | 1,670 |
[
"opensearch-project",
"data-prepper"
] | Create a `PipelineDataFlowModel.Builder` class. This will allow us to remove a constructor and have only one private constructor to be used by the builder and by Jackson for JSON deserialization.
Relatedly, the `PipelineDataFlowModel` should use non-empty serialization for `processors`. It should also return an empty `List<PluginModel>` instead of `null` to provide consistency for clients. | Create a builder for PipelineDataFlowModel and non-empty consistency between different types | https://api.github.com/repos/opensearch-project/data-prepper/issues/1669/comments | 0 | 2022-08-16T15:05:05Z | 2022-10-05T19:22:30Z | https://github.com/opensearch-project/data-prepper/issues/1669 | 1,340,497,070 | 1,669 |
[
"opensearch-project",
"data-prepper"
] | null | Add the router to the Data Prepper pipeline model | https://api.github.com/repos/opensearch-project/data-prepper/issues/1665/comments | 0 | 2022-08-13T21:28:47Z | 2022-08-18T17:51:23Z | https://github.com/opensearch-project/data-prepper/issues/1665 | 1,338,061,216 | 1,665 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
When metrics are exported from OTel to data-prepper, the following 404 error is received:
> 0tel-collector | 2022-08-12T20:18:18.517Z error exporterhelper/queued_retry.go:183 Exporting failed. The error is not retryable. Dropping data. {"kind": "exporter", "data_type": "metrics", "name": "otlp/http", "error": "Permanent error: rpc error: code = Unimplemented desc = unexpected HTTP status code received from server: 404 (Not Found); transport: received unexpected content-type \"text/plain; charset=utf-8\"", "dropped_items": 173}

**To Reproduce**
Steps to reproduce the behavior:
1. I launch data-prepper with OpenSearch as a sink for three pipelines: "raw-pipeline", "service-map-pipeline" and "metrics-pipeline":

2. I launch OTel with the following config:

For metrics, I've tried the exporters otlp, otlp/http, otlp/2 and otlp/data-prepper, all unsuccessfully.
3. I instrument a Java Application

**Expected behavior**
Metrics should be imported into OpenSearch. Just to mention, the traces pipeline works OK.
**Environment (please complete the following information):**
- OS: Ubuntu 20.04 LTS
- otel/opentelemetry-collector-contrib:latest
- opensearchproject/data-prepper:latest
| [BUG] Otel / data-prepper 404 error with metrics pipelines | https://api.github.com/repos/opensearch-project/data-prepper/issues/1664/comments | 6 | 2022-08-12T20:41:38Z | 2022-08-31T21:28:17Z | https://github.com/opensearch-project/data-prepper/issues/1664 | 1,337,689,153 | 1,664 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The `type` property on the `@DataPrepperPlugin` annotation is deprecated and should be removed.
**Describe the solution you'd like**
Remove the `type` property.
| Remove deprecated DataPrepperPlugin type property | https://api.github.com/repos/opensearch-project/data-prepper/issues/1657/comments | 0 | 2022-08-11T01:20:04Z | 2022-08-11T19:34:00Z | https://github.com/opensearch-project/data-prepper/issues/1657 | 1,335,363,069 | 1,657 |
[
"opensearch-project",
"data-prepper"
] | null | Create the data-prepper script which runs Data Prepper | https://api.github.com/repos/opensearch-project/data-prepper/issues/1655/comments | 0 | 2022-08-11T00:16:14Z | 2022-08-11T20:14:56Z | https://github.com/opensearch-project/data-prepper/issues/1655 | 1,335,333,796 | 1,655 |
[
"opensearch-project",
"data-prepper"
] | null | Create the data-prepper script which runs Data Prepper | https://api.github.com/repos/opensearch-project/data-prepper/issues/1654/comments | 0 | 2022-08-11T00:15:26Z | 2022-08-11T00:16:02Z | https://github.com/opensearch-project/data-prepper/issues/1654 | 1,335,333,452 | 1,654 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
When using the event record type for tracing it throws an error in the processor
```
[raw-pipeline-processor-worker-5-thread-3] ERROR com.amazon.dataprepper.pipeline.ProcessWorker - Encountered exception during pipeline raw-pipeline processing
java.lang.ClassCastException: class io.opentelemetry.proto.collector.trace.v1.ExportTraceServiceRequest cannot be cast to class com.amazon.dataprepper.model.trace.Span (io.opentelemetry.proto.collector.trace.v1.ExportTraceServiceRequest and com.amazon.dataprepper.model.trace.Span are in unnamed module of loader 'app')
at com.amazon.dataprepper.plugins.processor.oteltrace.OTelTraceRawProcessor.doExecute(OTelTraceRawProcessor.java:77) ~[data-prepper.jar:1.4.0]
at com.amazon.dataprepper.model.processor.AbstractProcessor.lambda$execute$0(AbstractProcessor.java:55) ~[data-prepper.jar:1.4.0]
at io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:61) ~[data-prepper.jar:1.4.0]
at com.amazon.dataprepper.model.processor.AbstractProcessor.execute(AbstractProcessor.java:55) ~[data-prepper.jar:1.4.0]
at com.amazon.dataprepper.pipeline.ProcessWorker.run(ProcessWorker.java:62) ~[data-prepper.jar:1.4.0]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515) ~[?:?]
at java.util.concurrent.FutureTask.run(FutureTask.java:264) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130) ~[?:?]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630) ~[?:?]
at java.lang.Thread.run(Thread.java:832) ~[?:?]
```
**To Reproduce**
Steps to reproduce the behavior:
Deploy a standard tracing pipeline and send traces from otel collector
Pipeline configuration is
```yaml
entry-pipeline:
workers : 8
delay: "100"
buffer:
bounded_blocking:
buffer_size: 4096
batch_size: 160
source:
otel_trace_source:
ssl: false
sink:
- pipeline:
name: "raw-pipeline"
- pipeline:
name: "service-map-pipeline"
raw-pipeline:
workers : 8
delay: "100"
source:
pipeline:
name: "entry-pipeline"
buffer:
bounded_blocking:
buffer_size: 4096
batch_size: 160
processor:
- otel_trace_raw:
sink:
- opensearch:
hosts: ["host"]
username: redacted
password: redacted
index: otel-v1-apm-span
index_type: custom
service-map-pipeline:
workers : 1
delay: "100"
source:
pipeline:
name: "entry-pipeline"
processor:
- service_map_stateful:
buffer:
bounded_blocking:
buffer_size: 512
batch_size: 8
sink:
- opensearch:
hosts: ["host"]
username: redacted
password: redacted
index: otel-v1-apm-service-map
index_type: custom
```
**Expected behavior**
Pipeline should be successful
**Environment (please complete the following information):**
- OS: Docker image running in kubernetes
- Version: tried with 1.4.0 and 1.5.1
| [BUG] Trace event processor does not work | https://api.github.com/repos/opensearch-project/data-prepper/issues/1653/comments | 3 | 2022-08-10T22:57:22Z | 2022-08-15T20:23:11Z | https://github.com/opensearch-project/data-prepper/issues/1653 | 1,335,284,955 | 1,653 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The OpenSearch sink has two values which were deprecated: `trace_analytics_raw` and `trace_analytics_service_map`. The `index_type` field replaces these.
**Describe the solution you'd like**
Remove `trace_analytics_raw` and `trace_analytics_service_map` from the OpenSearch sink.
**Tasks**
- [x] Remove `trace_analytics_raw` and `trace_analytics_service_map` from Data Prepper (#1690)
- [x] Update OpenSearch documentation website regarding this change (https://github.com/opensearch-project/documentation-website/pull/1029)
| Remove deprecated OpenSearch values | https://api.github.com/repos/opensearch-project/data-prepper/issues/1648/comments | 0 | 2022-08-09T18:29:52Z | 2022-09-07T22:02:54Z | https://github.com/opensearch-project/data-prepper/issues/1648 | 1,333,637,427 | 1,648 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper currently has a `BlockingBuffer` plugin which is used as a buffer. Core Peer Forwarding (#700) requires a similar buffer for `PeerForwarderServer`, but with some minor changes, which is being added as part of #1641. With two buffers that are nearly identical, there is a lot of duplicated code.
**Describe the solution you'd like**
Implement the buffer logic only once so that it can be used by both `BlockingBuffer` and `PeerForwarderReceiveBuffer`.
| Duplicated buffer logic in PeerForwarderReceiveBuffer which is similar to BlockingBuffer | https://api.github.com/repos/opensearch-project/data-prepper/issues/1643/comments | 0 | 2022-08-05T05:01:04Z | 2022-11-03T16:53:11Z | https://github.com/opensearch-project/data-prepper/issues/1643 | 1,329,449,940 | 1,643 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently, plugin configurations use Java validations (`@AssertTrue`, for example) to validate configuration options. We don't support that in the core package for validating `DataPrepperConfiguration` options.
**Describe the solution you'd like**
We need to configure Spring to use the validations if possible. | Configure Spring to use Jakarta Bean validations if possible | https://api.github.com/repos/opensearch-project/data-prepper/issues/1624/comments | 0 | 2022-07-27T21:18:13Z | 2022-11-03T16:51:18Z | https://github.com/opensearch-project/data-prepper/issues/1624 | 1,320,131,724 | 1,624 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently, only pipelines.yaml can be deserialized using [PluginDurationDeserializer](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-core/src/main/java/com/amazon/dataprepper/plugin/PluginDurationDeserializer.java). It doesn't allow `DataPrepperConfiguration` to use `Duration`.
**Describe the solution you'd like**
One solution I could think of is to make `PluginDurationDeserializer` a generic class instead of one just for plugins, and make it public. We can then configure the object mapper of [DataPrepperAppConfiguration](https://github.com/opensearch-project/data-prepper/blob/e1b0e4162a12c11aecef6243188348e271ad9323/data-prepper-core/src/main/java/com/amazon/dataprepper/parser/config/DataPrepperAppConfiguration.java#L50) with the custom deserializer.
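To illustrate, the parsing such a shared deserializer could expose might look like the sketch below. The simple `45s`/`4500ms` format is an assumption based on how the plugin deserializer is described; the real class may differ.

```java
import java.time.Duration;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch of the parsing logic a shared Duration deserializer could expose.
// The "<number>s" / "<number>ms" shorthand is an assumed format; anything
// else falls back to ISO-8601 (e.g. "PT45S"), which Jackson also accepts.
final class SimpleDurationParser {
    private static final Pattern SIMPLE = Pattern.compile("^\\s*(\\d+)\\s*(s|ms)\\s*$");

    static Duration parse(final String value) {
        final Matcher matcher = SIMPLE.matcher(value);
        if (matcher.matches()) {
            final long amount = Long.parseLong(matcher.group(1));
            return matcher.group(2).equals("ms")
                    ? Duration.ofMillis(amount)
                    : Duration.ofSeconds(amount);
        }
        return Duration.parse(value);
    }
}
```

Registering this logic in a Jackson module on the `DataPrepperAppConfiguration` object mapper would then let `DataPrepperConfiguration` fields be declared as `Duration` directly.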
| Add Duration support for Data Prepper Configuration using PluginDurationDeserializer | https://api.github.com/repos/opensearch-project/data-prepper/issues/1623/comments | 2 | 2022-07-27T20:57:30Z | 2022-09-15T19:17:04Z | https://github.com/opensearch-project/data-prepper/issues/1623 | 1,320,114,254 | 1,623 |
[
"opensearch-project",
"data-prepper"
] | Update the documentation in Data Prepper and the OpenSearch [documentation](https://github.com/opensearch-project/documentation-website). | Write documentation for CSV Processor | https://api.github.com/repos/opensearch-project/data-prepper/issues/1619/comments | 0 | 2022-07-25T20:56:28Z | 2022-08-11T21:21:21Z | https://github.com/opensearch-project/data-prepper/issues/1619 | 1,317,359,448 | 1,619 |
[
"opensearch-project",
"data-prepper"
] | Updated Logstash converter to convert CSVProcessor. | Add CSVProcessor Logstash converter | https://api.github.com/repos/opensearch-project/data-prepper/issues/1618/comments | 1 | 2022-07-25T20:55:28Z | 2022-09-13T16:52:01Z | https://github.com/opensearch-project/data-prepper/issues/1618 | 1,317,358,511 | 1,618 |
[
"opensearch-project",
"data-prepper"
] | Add CSV codec to S3 source | Add CSV Codec | https://api.github.com/repos/opensearch-project/data-prepper/issues/1617/comments | 0 | 2022-07-25T20:54:39Z | 2022-08-09T20:40:31Z | https://github.com/opensearch-project/data-prepper/issues/1617 | 1,317,357,758 | 1,617 |
[
"opensearch-project",
"data-prepper"
] | Add integration tests for CSV processor to check if events are updated as expected. | Add CSVProcessor Integration Tests | https://api.github.com/repos/opensearch-project/data-prepper/issues/1616/comments | 0 | 2022-07-25T20:54:33Z | 2022-08-26T20:11:59Z | https://github.com/opensearch-project/data-prepper/issues/1616 | 1,317,357,663 | 1,616 |
[
"opensearch-project",
"data-prepper"
] | Update existing newline codec in s3 source | Modify newline codec to add CSV header auto detection support | https://api.github.com/repos/opensearch-project/data-prepper/issues/1615/comments | 0 | 2022-07-25T20:53:50Z | 2022-08-09T20:38:49Z | https://github.com/opensearch-project/data-prepper/issues/1615 | 1,317,356,940 | 1,615 |
[
"opensearch-project",
"data-prepper"
] | Implement CSV processor logic in `doExecute()` | Implement CSVProcessor parsing | https://api.github.com/repos/opensearch-project/data-prepper/issues/1614/comments | 0 | 2022-07-25T20:53:01Z | 2022-08-04T20:23:48Z | https://github.com/opensearch-project/data-prepper/issues/1614 | 1,317,356,181 | 1,614 |
[
"opensearch-project",
"data-prepper"
] | Add CSV processor configuration and CSVProcessor skeleton | Add CSVProcessor skeleton | https://api.github.com/repos/opensearch-project/data-prepper/issues/1613/comments | 0 | 2022-07-25T20:51:32Z | 2022-07-28T18:47:14Z | https://github.com/opensearch-project/data-prepper/issues/1613 | 1,317,354,872 | 1,613 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently, the HTTP source plugin does not support mTLS verification. This feature has been requested in the forums.
**Describe the solution you'd like**
The HTTP source plugin supports mTLS verification for added security.
The HTTP source should be configured with a tlsCustomizer that configures the SslContextBuilder with a TrustManager.
```
sb.tlsCustomizer(sslCtxBuilder -> {
sslCtxBuilder.trustManager(loadTrustManager(...))
.clientAuth(ClientAuth.REQUIRE);
});
```
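The `loadTrustManager(...)` call above is left unspecified; a minimal pure-JDK sketch of such a helper (parameter names and keystore handling are assumptions, not the eventual implementation) could be:

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.KeyStore;
import javax.net.ssl.TrustManager;
import javax.net.ssl.TrustManagerFactory;

// Hypothetical helper: load TrustManagers from a keystore file, or fall back
// to the JVM's default trust store (cacerts) when no path is given.
class TrustManagerLoader {
    static TrustManager[] loadTrustManager(String trustStorePath, char[] password) {
        try {
            KeyStore trustStore = null;
            if (trustStorePath != null) {
                trustStore = KeyStore.getInstance(KeyStore.getDefaultType());
                try (InputStream in = Files.newInputStream(Paths.get(trustStorePath))) {
                    trustStore.load(in, password);
                }
            }
            TrustManagerFactory factory =
                    TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
            factory.init(trustStore); // a null KeyStore initializes with the default cacerts
            return factory.getTrustManagers();
        } catch (Exception e) {
            throw new IllegalStateException("Unable to load trust managers", e);
        }
    }
}
```

Passing a null path falls back to the JVM's default trust store, which is how the sketch can be exercised without a certificate file.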
**Additional context**
This issue stemmed from a forum post, https://forum.opensearch.org/t/data-prepper-mtls-with-the-http-source-plugin/10362.
| mTLS Support for HTTP Source Plugin | https://api.github.com/repos/opensearch-project/data-prepper/issues/1612/comments | 0 | 2022-07-25T19:54:41Z | 2022-08-09T18:31:34Z | https://github.com/opensearch-project/data-prepper/issues/1612 | 1,317,297,573 | 1,612 |
[
"opensearch-project",
"data-prepper"
] | null | Add end-to-end tests | https://api.github.com/repos/opensearch-project/data-prepper/issues/1610/comments | 0 | 2022-07-25T15:46:38Z | 2022-11-16T23:20:31Z | https://github.com/opensearch-project/data-prepper/issues/1610 | 1,317,034,545 | 1,610 |
[
"opensearch-project",
"data-prepper"
] | Metrics for peer forwarder include:
Gauge:
- peerEndpoints
Counter: (records)
- recordsActuallyProcessedLocally
- recordsToBeProcessedLocally
- recordsToBeForwarded
- recordsFailedForwarding
- recordsSuccessfullyForwarded
- recordsReceivedFromPeers
Counter: (requests)
- requests
- requestsFailed (any request with other than 200 OK response)
- requestsSuccessful
- requests
- requestsTooLarge (buffer)
- requestTimeouts (buffer)
- requestsUnprocessable (buffer not found, NPE)
- badRequests
Timer:
- requestForwardingLatency
- requestProcessingLatency | Add metrics for monitoring PeerForwarder | https://api.github.com/repos/opensearch-project/data-prepper/issues/1609/comments | 3 | 2022-07-25T15:46:36Z | 2022-09-30T14:22:55Z | https://github.com/opensearch-project/data-prepper/issues/1609 | 1,317,034,513 | 1,609 |
[
"opensearch-project",
"data-prepper"
] | null | Inject PeerForwarderServer and start in execute | https://api.github.com/repos/opensearch-project/data-prepper/issues/1608/comments | 0 | 2022-07-25T15:46:34Z | 2022-09-15T15:25:26Z | https://github.com/opensearch-project/data-prepper/issues/1608 | 1,317,034,459 | 1,608 |
[
"opensearch-project",
"data-prepper"
] | null | PeerForwarderProcessorDecorator will take client, server and buffer as input | https://api.github.com/repos/opensearch-project/data-prepper/issues/1607/comments | 0 | 2022-07-25T15:46:32Z | 2022-09-14T19:11:12Z | https://github.com/opensearch-project/data-prepper/issues/1607 | 1,317,034,406 | 1,607 |
[
"opensearch-project",
"data-prepper"
] | null | Add PeerForwarderClient | https://api.github.com/repos/opensearch-project/data-prepper/issues/1606/comments | 0 | 2022-07-25T15:46:30Z | 2022-08-26T22:08:37Z | https://github.com/opensearch-project/data-prepper/issues/1606 | 1,317,034,369 | 1,606 |
[
"opensearch-project",
"data-prepper"
] | null | Add PeerForwarderServer similar to HTTP source | https://api.github.com/repos/opensearch-project/data-prepper/issues/1605/comments | 0 | 2022-07-25T15:46:06Z | 2022-09-09T04:08:11Z | https://github.com/opensearch-project/data-prepper/issues/1605 | 1,317,033,884 | 1,605 |
[
"opensearch-project",
"data-prepper"
] | null | Embed PeerForwarderConfiguration into DataPrepperConfiguration | https://api.github.com/repos/opensearch-project/data-prepper/issues/1604/comments | 1 | 2022-07-25T15:45:15Z | 2022-07-25T15:45:48Z | https://github.com/opensearch-project/data-prepper/issues/1604 | 1,317,032,829 | 1,604 |
[
"opensearch-project",
"data-prepper"
] | Create `PeerForwarderReceiveBuffer` which is used by plugins which implement `RequiresPeerForwarding` interface. Each plugin will have it's own buffer to which `PeerForwarderServer` will write. | Create a thread safe buffer for PeerForwarder server | https://api.github.com/repos/opensearch-project/data-prepper/issues/1603/comments | 0 | 2022-07-25T15:45:01Z | 2022-08-08T03:14:42Z | https://github.com/opensearch-project/data-prepper/issues/1603 | 1,317,032,577 | 1,603 |
[
"opensearch-project",
"data-prepper"
] | null | Embed PeerForwarderConfiguration into DataPrepperConfiguration | https://api.github.com/repos/opensearch-project/data-prepper/issues/1602/comments | 0 | 2022-07-25T15:45:00Z | 2022-09-08T18:24:16Z | https://github.com/opensearch-project/data-prepper/issues/1602 | 1,317,032,558 | 1,602 |
[
"opensearch-project",
"data-prepper"
] | Hi, I'm trying to run Data Prepper with the OpenSearch sink. I've deployed OpenSearch in AWS, and I have the domain endpoint. I created the pipelines.yaml with the following content:
```
entry-pipeline:
  delay: "100"
  source:
    otel_trace_source:
      ssl: false
  sink:
    - pipeline:
        name: "raw-pipeline"
    - pipeline:
        name: "service-map-pipeline"
raw-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - otel_trace_raw_prepper:
  sink:
    - opensearch:
        hosts: [ "https://vpc-logs-dev-<ID>.us-west-1.es.amazonaws.com" ]
        aws_region: "us-west-1"
        aws_sigv4: true
        trace_analytics_raw: true
service-map-pipeline:
  delay: "100"
  source:
    pipeline:
      name: "entry-pipeline"
  prepper:
    - service_map_stateful:
  sink:
    - opensearch:
        hosts: [ "https://vpc-logs-dev-<ID>.us-west-1.es.amazonaws.com" ]
        aws_region: "us-west-1"
        aws_sigv4: true
        trace_analytics_service_map: true
```
I then run the docker image (setting env AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY) but I get the following error stacktrace:
```
Caused by: java.lang.RuntimeException: 30,000 milliseconds timeout on connection http-outgoing-0 [ACTIVE]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:92) ~[data-prepper.jar:1.5.1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[?:?]
at com.amazon.dataprepper.plugin.PluginCreator.newPluginInstance(PluginCreator.java:40) ~[data-prepper.jar:1.5.1]
... 76 more
Caused by: java.net.SocketTimeoutException: 30,000 milliseconds timeout on connection http-outgoing-0 [ACTIVE]
at org.opensearch.client.RestClient.extractAndWrapCause(RestClient.java:892) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:301) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestClient.performRequest(RestClient.java:289) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1762) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1728) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1696) ~[data-prepper.jar:1.5.1]
at org.opensearch.client.ClusterClient.getSettings(ClusterClient.java:119) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager.checkISMEnabled(IndexManager.java:149) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager.checkAndCreateIndexTemplate(IndexManager.java:165) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager.setupIndex(IndexManager.java:160) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.initialize(OpenSearchSink.java:105) ~[data-prepper.jar:1.5.1]
at com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink.<init>(OpenSearchSink.java:89) ~[data-prepper.jar:1.5.1]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) ~[?:?]
at jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:64) ~[?:?]
at jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:?]
at java.lang.reflect.Constructor.newInstanceWithCaller(Constructor.java:500) ~[?:?]
at java.lang.reflect.Constructor.newInstance(Constructor.java:481) ~[?:?]
at com.amazon.dataprepper.plugin.PluginCreator.newPluginInstance(PluginCreator.java:40) ~[data-prepper.jar:1.5.1]
... 76 more
```
So it cannot connect to my OpenSearch domain. Note that this domain has no password. | OpenSearch Connection Timeout | https://api.github.com/repos/opensearch-project/data-prepper/issues/1601/comments | 1 | 2022-07-22T21:09:48Z | 2022-08-12T23:13:06Z | https://github.com/opensearch-project/data-prepper/issues/1601 | 1,315,419,511 | 1,601 |
| OpenSearch Connection Timeout | https://api.github.com/repos/opensearch-project/data-prepper/issues/1601/comments | 1 | 2022-07-22T21:09:48Z | 2022-08-12T23:13:06Z | https://github.com/opensearch-project/data-prepper/issues/1601 | 1,315,419,511 | 1,601 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
A health check against the Log Source or OTel Source returns 401 when authentication is enabled. If Data Prepper is fronted by a load balancer that performs health checks, the health check path should be excluded from authentication.
**To Reproduce**
Steps to reproduce the behavior:
1. Configure the Data Prepper pipeline config with authentication
```
source:
http:
authentication:
http_basic:
username: my-user
password: my_s3cr3t
```
2. Start Data Prepper
3. curl http://localhost:2021/health
**Expected behavior**
Returns HTTP status 200 with response `{}`.
**Screenshots**
```
curl http://localhost:2021/health
401 Unauthorized%
```
**Environment (please complete the following information):**
- OS: Linux
| [BUG] Health Check returns 401 when Auth is enabled | https://api.github.com/repos/opensearch-project/data-prepper/issues/1600/comments | 3 | 2022-07-21T18:52:04Z | 2025-03-17T19:01:49Z | https://github.com/opensearch-project/data-prepper/issues/1600 | 1,313,667,765 | 1,600 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
I'm using the same certificate for the `http` source as for `otel_trace_source`.
With `otel_trace_source`, I receive an "HttpRequestException: The SSL connection could not be established, see inner exception. AuthenticationException: Authentication failed" error when trying to push events.
**To Reproduce**
Here are the way I setup the sources:
```
otel_trace_source:
ssl: true
sslKeyCertChainFile: "certificat.crt"
sslKeyFile: "certificat.key"
# Explicitly disable authentication
authentication:
unauthenticated:
port: 2022
```
```
http:
ssl: true
ssl_certificate_file: "certificat.crt"
ssl_key_file: "certificat.key"
# Explicitly disable authentication
authentication:
unauthenticated:
port: 2021
```
**Expected behavior**
No error on connection
**Environment (please complete the following information):**
- Public Docker image
Is there anything specific to otel_trace_source in terms of certificates?
Thanks, | [BUG] Certificate issue on otel_trace_source | https://api.github.com/repos/opensearch-project/data-prepper/issues/1599/comments | 1 | 2022-07-21T09:52:37Z | 2022-07-22T09:53:27Z | https://github.com/opensearch-project/data-prepper/issues/1599 | 1,312,989,333 | 1,599 |
[
"opensearch-project",
"data-prepper"
Wrap the identified processors in a PeerForwardingProcessorDecorator and pass the updated list of processors to the Pipeline. | Identify processors which implement RequiredPeerForwarding | https://api.github.com/repos/opensearch-project/data-prepper/issues/1597/comments | 0 | 2022-07-19T04:36:01Z | 2022-07-20T17:55:44Z | https://github.com/opensearch-project/data-prepper/issues/1597 | 1,308,966,942 | 1,597 |
[
"opensearch-project",
"data-prepper"
] | ## CVE-2022-31159 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>aws-java-sdk-s3-1.12.257.jar</b></p></summary>
<p>The AWS Java SDK for Amazon S3 module holds the client classes that are used for communicating with Amazon Simple Storage Service</p>
<p>Library home page: <a href="https://aws.amazon.com/sdkforjava">https://aws.amazon.com/sdkforjava</a></p>
<p>Path to dependency file: /data-prepper-core/build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.amazonaws/aws-java-sdk-s3/1.12.257/f3f29e4c9742ecd8a227f3c6a9e69891eaac380/aws-java-sdk-s3-1.12.257.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.amazonaws/aws-java-sdk-s3/1.12.257/f3f29e4c9742ecd8a227f3c6a9e69891eaac380/aws-java-sdk-s3-1.12.257.jar,/e/caches/modules-2/files-2.1/com.amazonaws/aws-java-sdk-s3/1.12.257/f3f29e4c9742ecd8a227f3c6a9e69891eaac380/aws-java-sdk-s3-1.12.257.jar</p>
<p>
Dependency Hierarchy:
- :x: **aws-java-sdk-s3-1.12.257.jar** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The AWS SDK for Java enables Java developers to work with Amazon Web Services. A partial-path traversal issue exists within the `downloadDirectory` method in the AWS S3 TransferManager component of the AWS SDK for Java v1 prior to version 1.12.261. Applications using the SDK control the `destinationDirectory` argument, but S3 object keys are determined by the application that uploaded the objects. The `downloadDirectory` method allows the caller to pass a filesystem object in the object key but contained an issue in the validation logic for the key name. A knowledgeable actor could bypass the validation logic by including a UNIX double-dot in the bucket key. Under certain conditions, this could permit them to retrieve a directory from their S3 bucket that is one level up in the filesystem from their working directory. This issue’s scope is limited to directories whose name prefix matches the destinationDirectory. E.g. for destination directory`/tmp/foo`, the actor can cause a download to `/tmp/foo-bar`, but not `/tmp/bar`. If `com.amazonaws.services.s3.transfer.TransferManager::downloadDirectory` is used to download an untrusted buckets contents, the contents of that bucket can be written outside of the intended destination directory. Version 1.12.261 contains a patch for this issue. As a workaround, when calling `com.amazonaws.services.s3.transfer.TransferManager::downloadDirectory`, pass a `KeyFilter` that forbids `S3ObjectSummary` objects that `getKey` method return a string containing the substring `..` .
<p>Publish Date: 2022-07-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31159>CVE-2022-31159</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/aws/aws-sdk-java/security/advisories/GHSA-c28r-hw5m-5gv3">https://github.com/aws/aws-sdk-java/security/advisories/GHSA-c28r-hw5m-5gv3</a></p>
<p>Release Date: 2022-07-15</p>
<p>Fix Resolution: com.amazonaws:aws-java-sdk-s3:1.12.261</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
| CVE-2022-31159 (Medium) detected in aws-java-sdk-s3-1.12.257.jar - autoclosed | https://api.github.com/repos/opensearch-project/data-prepper/issues/1594/comments | 2 | 2022-07-18T10:48:36Z | 2022-08-10T21:52:58Z | https://github.com/opensearch-project/data-prepper/issues/1594 | 1,307,760,351 | 1,594 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
* As a user of Amazon OpenSearch Service, I would like to configure my `opensearch` sink of Data Prepper with the domain ARN rather than the host endpoint.
**Describe the solution you'd like**
The ability to pass the domain ARN to the opensearch sink plugin. Potential configuration solutions include
1. Keep the parameters the same and allow for the ARN to be passed to the `hosts` field. This is the easiest change to make, but it may be a slightly confusing user experience that the host is equivalent to an ARN
```
- opensearch:
hosts: [ "arn:aws:es:us-west-1:987654321098:domain/test-domain/" ]
aws_region: "us-east-1"
aws_sts_role_arn: "role"
```
2. Make `hosts` conditionally optional based on a `aws_domain_arns` parameter
```
- opensearch:
aws_domain_arns: [ "arn:aws:es:us-west-1:987654321098:domain/test-domain/" ]
aws_region: "us-east-1"
aws_sts_role_arn: "role"
```
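To illustrate option 2, a hedged sketch (a hypothetical helper, not actual plugin code) of pulling the region out of a configured domain ARN, which could let `aws_region` be inferred when only `aws_domain_arns` is supplied:

```java
// Illustrative only: extract the region from an OpenSearch Service domain ARN.
// ARN format: arn:aws:es:<region>:<account>:domain/<name>
class DomainArnSketch {
    static String regionOf(String domainArn) {
        String[] parts = domainArn.split(":");
        if (parts.length < 6 || !"arn".equals(parts[0]) || !"es".equals(parts[2])) {
            throw new IllegalArgumentException("Not an OpenSearch Service domain ARN: " + domainArn);
        }
        return parts[3]; // the fourth colon-separated component is the region
    }
}
```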
| Optional domain ARN parameter for OpenSearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/1593/comments | 2 | 2022-07-14T17:35:24Z | 2022-07-29T14:39:29Z | https://github.com/opensearch-project/data-prepper/issues/1593 | 1,305,098,463 | 1,593 |
[
"opensearch-project",
"data-prepper"
Create the PeerForwarderConfiguration class and embed it into DataPrepperConfiguration. | Add PeerForwarderConfiguration class | https://api.github.com/repos/opensearch-project/data-prepper/issues/1591/comments | 0 | 2022-07-14T16:30:59Z | 2022-08-09T20:39:51Z | https://github.com/opensearch-project/data-prepper/issues/1591 | 1,305,026,016 | 1,591 |
[
"opensearch-project",
"data-prepper"
Add the RequiresPeerForwarding interface, which is implemented by stateful processors. Data Prepper will detect the plugins which implement it and configure the peer forwarder for those processors. | Add RequiresPeerForwarding interface | https://api.github.com/repos/opensearch-project/data-prepper/issues/1590/comments | 0 | 2022-07-14T16:29:29Z | 2022-07-19T02:07:17Z | https://github.com/opensearch-project/data-prepper/issues/1590 | 1,305,024,563 | 1,590 |
[
"opensearch-project",
"data-prepper"
Add PeerForwardingProcessingDecorator, which implements the Processor interface, to Data Prepper core. | Add PeerForwardingProcessingDecorator skeleton | https://api.github.com/repos/opensearch-project/data-prepper/issues/1589/comments | 0 | 2022-07-14T16:28:37Z | 2022-07-19T02:07:17Z | https://github.com/opensearch-project/data-prepper/issues/1589 | 1,305,023,722 | 1,589 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The new plugin framework replaces the static PluginFactory classes. Delete these.
**Describe the solution you'd like**
Delete deprecated plugin classes and related testing code.
| Remove deprecated PluginFactory classes | https://api.github.com/repos/opensearch-project/data-prepper/issues/1584/comments | 0 | 2022-07-08T22:00:27Z | 2022-07-12T02:05:05Z | https://github.com/opensearch-project/data-prepper/issues/1584 | 1,299,474,203 | 1,584 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Large log files in S3 with gzip compression do not have all records come through.
One example log file (from Application Load Balancer) has about 80k lines in it. However, Data Prepper only reports about 2000 of these. | [BUG] S3 Source fails to load all records for large compressed newline-delimited logs | https://api.github.com/repos/opensearch-project/data-prepper/issues/1568/comments | 2 | 2022-07-05T19:25:19Z | 2022-07-08T00:21:02Z | https://github.com/opensearch-project/data-prepper/issues/1568 | 1,294,680,038 | 1,568 |
[
"opensearch-project",
"data-prepper"
] | Tracking #1550 for 1.5.1 | [BUG] S3 Source poll delay will always call Thread.sleep() if poll delay is greater than 0 - 1.5.1 | https://api.github.com/repos/opensearch-project/data-prepper/issues/1567/comments | 0 | 2022-07-05T16:20:34Z | 2022-07-06T17:49:51Z | https://github.com/opensearch-project/data-prepper/issues/1567 | 1,294,509,044 | 1,567 |
[
"opensearch-project",
"data-prepper"
] | Tracking #1544 for 1.5.1 | [BUG] S3 Source stops on S3 error - 1.5.1 | https://api.github.com/repos/opensearch-project/data-prepper/issues/1566/comments | 0 | 2022-07-05T16:19:42Z | 2022-07-06T17:49:44Z | https://github.com/opensearch-project/data-prepper/issues/1566 | 1,294,508,197 | 1,566 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
`PluginSettings` can return null items when fetching configurations. This requires every plugin to perform null checks.
**Describe the solution you'd like**
Null checking is an anti-pattern and can be eliminated through the use of optionals or returning default values.
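As a sketch of the Optional-based direction (class and method names here are hypothetical, not the actual PluginSetting API):

```java
import java.util.Map;
import java.util.Optional;

// Illustrative accessor that returns an Optional (or a default) instead of
// null, so plugin authors no longer need explicit null checks.
class NullSafeSettingsSketch {
    private final Map<String, Object> settings;

    NullSafeSettingsSketch(Map<String, Object> settings) {
        this.settings = settings;
    }

    Optional<String> getStringOrEmpty(String attribute) {
        return Optional.ofNullable(settings.get(attribute)).map(Object::toString);
    }

    String getStringOrDefault(String attribute, String defaultValue) {
        return getStringOrEmpty(attribute).orElse(defaultValue);
    }
}
```

Callers then branch on `isPresent()`/`orElse(...)` rather than on a null reference.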
**Additional context**
Raised from #1560
| Eliminate null checks required by PluginSetting clients. | https://api.github.com/repos/opensearch-project/data-prepper/issues/1563/comments | 4 | 2022-07-05T15:54:36Z | 2022-11-11T23:02:48Z | https://github.com/opensearch-project/data-prepper/issues/1563 | 1,294,482,809 | 1,563 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The S3 Source imports `aws-java-sdk-s3` v1 in order to use the S3 Event Notification model. This adds unnecessary dependencies into Data Prepper.
**Describe the solution you'd like**
Create an internal model representation of the S3 Event Notification structure. Use Jackson's [`@JsonIgnoreProperties`](https://fasterxml.github.io/jackson-annotations/javadoc/2.9/com/fasterxml/jackson/annotation/JsonIgnoreProperties.html#ignoreUnknown()) so that we can model only the parts we need.
| Remove AWS SDK v1 from S3 Source | https://api.github.com/repos/opensearch-project/data-prepper/issues/1562/comments | 3 | 2022-07-05T15:48:18Z | 2022-08-10T21:46:04Z | https://github.com/opensearch-project/data-prepper/issues/1562 | 1,294,476,024 | 1,562 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
When we process a batch of SQS messages, we wait for some time before processing the next batch. The wait time is based on the number of messages processed and the poll delay. If the poll delay is greater than 0 and maximum messages is left at the default (10, which is also the maximum allowed), the SQS worker will almost always incur a delay, because SQS might never return the maximum number of messages.
SQS returns as soon as it gets at least one message, even if maximum messages is set to 10.
`if (messagesProcessed < sqsOptions.getMaximumMessages() && s3SourceConfig.getSqsOptions().getPollDelay().toMillis() > 0) {`
**Expected behavior**
The SQS worker should apply the delay when at least one message was processed, instead of comparing the count against the maximum number of messages.
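A minimal sketch of that corrected guard (names are illustrative; the real worker reads these values from its configuration):

```java
// Apply the poll delay only when at least one message was processed, rather
// than whenever fewer than the configured maximum came back.
class PollDelaySketch {
    static boolean shouldDelay(int messagesProcessed, long pollDelayMillis) {
        return messagesProcessed > 0 && pollDelayMillis > 0;
    }
}
```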
| [BUG] S3 Source poll delay will always call Thread.sleep() if poll delay is greater than 0 | https://api.github.com/repos/opensearch-project/data-prepper/issues/1550/comments | 1 | 2022-06-29T20:28:55Z | 2022-07-05T16:18:23Z | https://github.com/opensearch-project/data-prepper/issues/1550 | 1,289,268,757 | 1,550 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
As a customer using Data Prepper to ingest OTel trace and metrics data over HTTP (without the gRPC wire format), I want OTelTraceSource and OTelMetricsSource to support an HTTP health check, because I want to health-check the source over HTTP rather than gRPC.
**Describe the solution you'd like**
Introduce a new configuration option, ```health_check_service_type```, to OTelTraceSource and OTelMetricsSource. Possible values are ```GRPC``` and ```HTTP```; the default is ```GRPC``` (backward compatible). When ```health_check_service_type``` is ```GRPC```, the existing ```HealthGrpcService``` will be used. When it is ```HTTP```, a ```HealthCheckService``` will be used that responds with HTTP status "200 OK" if the server is healthy and can accept requests, and HTTP status "503 Service Unavailable" if the server is unhealthy and cannot accept requests.
```
source:
otel_trace_source:
health_check_service: true
health_check_service_type: HTTP
proto_reflection_service: true
```
**Describe alternatives you've considered (Optional)**
Another alternative is to support both gRPC and HTTP health checks on OTelTraceSource and OTelMetricsSource. An additional configuration parameter would be added to enable the HTTP health check: the current ```health_check_service``` would enable the gRPC health check, and a new ```http_health_check_service``` would enable the HTTP health check.
| Enable HTTP Health Check for OTelTraceSource and OTelMetricsSource | https://api.github.com/repos/opensearch-project/data-prepper/issues/1546/comments | 4 | 2022-06-28T16:55:51Z | 2022-09-27T17:04:23Z | https://github.com/opensearch-project/data-prepper/issues/1546 | 1,287,634,346 | 1,546 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
When a failure occurred reading from an S3 object in the `s3` source, Data Prepper stopped processing the SQS queue and stopped reading S3 objects.
**To Reproduce**
Steps to reproduce the behavior:
1. Create an IAM role that gives permission to an SQS queue, but not the S3 bucket
2. Configure Data Prepper to use that role and read from the SQS queue
3. Run Data Prepper
4. You see an exception and then Data Prepper stops
**Expected behavior**
Data Prepper should log the exception as it can be useful to the operator. However, it should continue to process from the SQS queue. This will be important when transient issues occur.
Additionally, Data Prepper should probably pause briefly after any error rather than immediately retrying.
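A minimal sketch of that behavior — catch and count the failure (the worker would log it), back off briefly, and keep polling instead of letting the thread die. The interface and names are illustrative, not the actual SqsWorker API:

```java
// Illustrative resilient poll loop: a failed poll increments a counter and
// pauses the loop, but never terminates it.
class ResilientSqsLoopSketch {
    interface Poller {
        void pollOnce() throws Exception;
    }

    static int runWithRecovery(Poller poller, int iterations, long backoffMillis) {
        int failures = 0;
        for (int i = 0; i < iterations; i++) {
            try {
                poller.pollOnce();
            } catch (Exception e) {
                failures++;                      // LOG.error(...) in the real worker
                try {
                    Thread.sleep(backoffMillis); // brief pause before retrying
                } catch (InterruptedException ie) {
                    Thread.currentThread().interrupt();
                    return failures;             // preserve shutdown semantics
                }
            }
        }
        return failures;
    }
}
```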
**Additional context**
Below is the sample stack trace.
```
2022-06-27T14:52:31,322 [Thread-1] ERROR com.amazon.dataprepper.plugins.source.S3ObjectWorker - Error reading from S3 object: s3ObjectReference=[bucketName=***, key=***].
software.amazon.awssdk.services.s3.model.S3Exception: Access Denied (Service: S3, Status Code: 403, Request ID: FURZ9T6YXPTW27BV, Extended Request ID: YGQ9eZoetO8IXMtzIJC69PtycUwwvfGe9/fRXKNDP3W4MVF53qiit1V8+LmYoQnjdtMV8ToXba4=)
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handleErrorResponse(CombinedResponseHandler.java:125) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handleResponse(CombinedResponseHandler.java:82) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handle(CombinedResponseHandler.java:60) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handle(CombinedResponseHandler.java:41) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.HandleResponseStage.execute(HandleResponseStage.java:40) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.HandleResponseStage.execute(HandleResponseStage.java:30) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptTimeoutTrackingStage.execute(ApiCallAttemptTimeoutTrackingStage.java:73) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptTimeoutTrackingStage.execute(ApiCallAttemptTimeoutTrackingStage.java:42) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.TimeoutExceptionHandlingStage.execute(TimeoutExceptionHandlingStage.java:78) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.TimeoutExceptionHandlingStage.execute(TimeoutExceptionHandlingStage.java:40) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptMetricCollectionStage.execute(ApiCallAttemptMetricCollectionStage.java:50) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptMetricCollectionStage.execute(ApiCallAttemptMetricCollectionStage.java:36) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.RetryableStage.execute(RetryableStage.java:81) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.RetryableStage.execute(RetryableStage.java:36) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.StreamManagingStage.execute(StreamManagingStage.java:56) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.StreamManagingStage.execute(StreamManagingStage.java:36) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.executeWithTimer(ApiCallTimeoutTrackingStage.java:80) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.execute(ApiCallTimeoutTrackingStage.java:60) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.execute(ApiCallTimeoutTrackingStage.java:42) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallMetricCollectionStage.execute(ApiCallMetricCollectionStage.java:48) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallMetricCollectionStage.execute(ApiCallMetricCollectionStage.java:31) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ExecutionFailureExceptionReportingStage.execute(ExecutionFailureExceptionReportingStage.java:37) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.pipeline.stages.ExecutionFailureExceptionReportingStage.execute(ExecutionFailureExceptionReportingStage.java:26) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.http.AmazonSyncHttpClient$RequestExecutionBuilderImpl.execute(AmazonSyncHttpClient.java:193) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.invoke(BaseSyncClientHandler.java:103) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.doExecute(BaseSyncClientHandler.java:167) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.lambda$execute$0(BaseSyncClientHandler.java:68) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.measureApiCallSuccess(BaseSyncClientHandler.java:175) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.execute(BaseSyncClientHandler.java:62) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.core.client.handler.SdkSyncClientHandler.execute(SdkSyncClientHandler.java:52) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.execute(AwsSyncClientHandler.java:63) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.services.s3.DefaultS3Client.getObject(DefaultS3Client.java:4483) ~[data-prepper.jar:1.5.0]
at software.amazon.awssdk.services.s3.S3Client.getObject(S3Client.java:7889) ~[data-prepper.jar:1.5.0]
at com.amazon.dataprepper.plugins.source.S3ObjectWorker.doParseObject(S3ObjectWorker.java:100) ~[data-prepper.jar:1.5.0]
at com.amazon.dataprepper.plugins.source.S3ObjectWorker.lambda$parseS3Object$0(S3ObjectWorker.java:84) ~[data-prepper.jar:1.5.0]
at io.micrometer.core.instrument.composite.CompositeTimer.recordCallable(CompositeTimer.java:77) ~[data-prepper.jar:1.5.0]
at com.amazon.dataprepper.plugins.source.S3ObjectWorker.parseS3Object(S3ObjectWorker.java:83) ~[data-prepper.jar:1.5.0]
at com.amazon.dataprepper.plugins.source.S3Service.addS3Object(S3Service.java:53) ~[data-prepper.jar:1.5.0]
at com.amazon.dataprepper.plugins.source.SqsWorker.processS3Objects(SqsWorker.java:155) ~[data-prepper.jar:1.5.0]
at com.amazon.dataprepper.plugins.source.SqsWorker.processSqsMessages(SqsWorker.java:95) ~[data-prepper.jar:1.5.0]
at com.amazon.dataprepper.plugins.source.SqsWorker.run(SqsWorker.java:72) ~[data-prepper.jar:1.5.0]
at java.lang.Thread.run(Thread.java:832) ~[?:?]
Exception in thread "Thread-1" software.amazon.awssdk.services.s3.model.S3Exception: Access Denied (Service: S3, Status Code: 403, Request ID: YGR59TQYXBQW2W1H, Extended Request ID: 4kQ5eeC6tO8IXMtzIJC6X9RIcRw7ZgGe6/fZW8NDI3W0NVN53qiit1V8+QmZ4QPjdtnV9ToxB14=)
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handleErrorResponse(CombinedResponseHandler.java:125)
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handleResponse(CombinedResponseHandler.java:82)
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handle(CombinedResponseHandler.java:60)
at software.amazon.awssdk.core.internal.http.CombinedResponseHandler.handle(CombinedResponseHandler.java:41)
at software.amazon.awssdk.core.internal.http.pipeline.stages.HandleResponseStage.execute(HandleResponseStage.java:40)
at software.amazon.awssdk.core.internal.http.pipeline.stages.HandleResponseStage.execute(HandleResponseStage.java:30)
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptTimeoutTrackingStage.execute(ApiCallAttemptTimeoutTrackingStage.java:73)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptTimeoutTrackingStage.execute(ApiCallAttemptTimeoutTrackingStage.java:42)
at software.amazon.awssdk.core.internal.http.pipeline.stages.TimeoutExceptionHandlingStage.execute(TimeoutExceptionHandlingStage.java:78)
at software.amazon.awssdk.core.internal.http.pipeline.stages.TimeoutExceptionHandlingStage.execute(TimeoutExceptionHandlingStage.java:40)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptMetricCollectionStage.execute(ApiCallAttemptMetricCollectionStage.java:50)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallAttemptMetricCollectionStage.execute(ApiCallAttemptMetricCollectionStage.java:36)
at software.amazon.awssdk.core.internal.http.pipeline.stages.RetryableStage.execute(RetryableStage.java:81)
at software.amazon.awssdk.core.internal.http.pipeline.stages.RetryableStage.execute(RetryableStage.java:36)
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206)
at software.amazon.awssdk.core.internal.http.StreamManagingStage.execute(StreamManagingStage.java:56)
at software.amazon.awssdk.core.internal.http.StreamManagingStage.execute(StreamManagingStage.java:36)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.executeWithTimer(ApiCallTimeoutTrackingStage.java:80)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.execute(ApiCallTimeoutTrackingStage.java:60)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallTimeoutTrackingStage.execute(ApiCallTimeoutTrackingStage.java:42)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallMetricCollectionStage.execute(ApiCallMetricCollectionStage.java:48)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ApiCallMetricCollectionStage.execute(ApiCallMetricCollectionStage.java:31)
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206)
at software.amazon.awssdk.core.internal.http.pipeline.RequestPipelineBuilder$ComposingRequestPipelineStage.execute(RequestPipelineBuilder.java:206)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ExecutionFailureExceptionReportingStage.execute(ExecutionFailureExceptionReportingStage.java:37)
at software.amazon.awssdk.core.internal.http.pipeline.stages.ExecutionFailureExceptionReportingStage.execute(ExecutionFailureExceptionReportingStage.java:26)
at software.amazon.awssdk.core.internal.http.AmazonSyncHttpClient$RequestExecutionBuilderImpl.execute(AmazonSyncHttpClient.java:193)
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.invoke(BaseSyncClientHandler.java:103)
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.doExecute(BaseSyncClientHandler.java:167)
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.lambda$execute$0(BaseSyncClientHandler.java:68)
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.measureApiCallSuccess(BaseSyncClientHandler.java:175)
at software.amazon.awssdk.core.internal.handler.BaseSyncClientHandler.execute(BaseSyncClientHandler.java:62)
at software.amazon.awssdk.core.client.handler.SdkSyncClientHandler.execute(SdkSyncClientHandler.java:52)
at software.amazon.awssdk.awscore.client.handler.AwsSyncClientHandler.execute(AwsSyncClientHandler.java:63)
at software.amazon.awssdk.services.s3.DefaultS3Client.getObject(DefaultS3Client.java:4483)
at software.amazon.awssdk.services.s3.S3Client.getObject(S3Client.java:7889)
at com.amazon.dataprepper.plugins.source.S3ObjectWorker.doParseObject(S3ObjectWorker.java:100)
at com.amazon.dataprepper.plugins.source.S3ObjectWorker.lambda$parseS3Object$0(S3ObjectWorker.java:84)
at io.micrometer.core.instrument.composite.CompositeTimer.recordCallable(CompositeTimer.java:77)
at com.amazon.dataprepper.plugins.source.S3ObjectWorker.parseS3Object(S3ObjectWorker.java:83)
at com.amazon.dataprepper.plugins.source.S3Service.addS3Object(S3Service.java:53)
at com.amazon.dataprepper.plugins.source.SqsWorker.processS3Objects(SqsWorker.java:155)
at com.amazon.dataprepper.plugins.source.SqsWorker.processSqsMessages(SqsWorker.java:95)
at com.amazon.dataprepper.plugins.source.SqsWorker.run(SqsWorker.java:72)
at java.base/java.lang.Thread.run(Thread.java:832)
```
| [BUG] S3 Source stops on S3 error | https://api.github.com/repos/opensearch-project/data-prepper/issues/1544/comments | 0 | 2022-06-27T17:23:24Z | 2022-07-05T16:18:15Z | https://github.com/opensearch-project/data-prepper/issues/1544 | 1,286,118,749 | 1,544 |
[
"opensearch-project",
"data-prepper"
] | With the introduction of #305, each plugin will have its own jar file in a directory structure. Each of these plugins can be loaded with its own Java classloader. This will allow different plugins to have different dependency versions.
Some classes will use a shared classloader:
* Data Prepper API & Core classes
* Java classes | Run plugins in their own classloader | https://api.github.com/repos/opensearch-project/data-prepper/issues/1543/comments | 0 | 2022-06-25T20:31:17Z | 2022-08-18T18:28:26Z | https://github.com/opensearch-project/data-prepper/issues/1543 | 1,284,707,257 | 1,543 |
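The per-plugin classloader approach described in this issue can be sketched roughly as follows. This is an illustrative sketch only, not the actual Data Prepper implementation: the `PluginClassLoaderFactory` class and its method name are hypothetical, and the shared classes (Data Prepper API and core) are assumed to be reachable through the parent classloader.

```java
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;

// Sketch: each plugin directory gets its own URLClassLoader whose parent is
// the shared "core" classloader, so plugins can carry conflicting dependency
// versions while still seeing the shared Data Prepper API classes.
class PluginClassLoaderFactory {
    static ClassLoader createPluginClassLoader(final File pluginDirectory,
                                               final ClassLoader coreClassLoader) throws MalformedURLException {
        final List<URL> jarUrls = new ArrayList<>();
        // Collect every jar in the plugin's directory.
        final File[] jars = pluginDirectory.listFiles((dir, name) -> name.endsWith(".jar"));
        if (jars != null) {
            for (final File jar : jars) {
                jarUrls.add(jar.toURI().toURL());
            }
        }
        // Parent-first delegation: shared API/core classes resolve through the
        // parent; plugin-local dependencies resolve from the plugin's own jars.
        return new URLClassLoader(jarUrls.toArray(new URL[0]), coreClassLoader);
    }
}
```

Because the parent classloader exposes only the shared classes, two plugins can each bundle a different version of the same dependency without conflicting.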
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The great new thing about Logstash and friends was the support for continuous operations/feeding.
**Describe the solution you'd like**
MQTT is a lightweight protocol for IoT devices with NB-IoT modems that have MQTT support in firmware. The devices publish messages to a broker.
The broker supports subscribing to topics, so it would be nice if Data Prepper could support a continuous subscription.
These messages should preferably be in JSON so they map to OpenSearch records.
**Describe alternatives you've considered (Optional)**
An alternative is using legacy Filebeat to read from a logfile produced by a subscription program that writes to it.
This adds operational complexity and risk of failure.
**Additional context**
| MQTT subscribe support | https://api.github.com/repos/opensearch-project/data-prepper/issues/1542/comments | 1 | 2022-06-25T20:19:54Z | 2022-07-05T17:43:15Z | https://github.com/opensearch-project/data-prepper/issues/1542 | 1,284,704,757 | 1,542 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
I'm looking for instructions to set up data-prepper within K8s.
**Describe the solution you'd like**
Helm install would be preferred.
**Describe alternatives you've considered (Optional)**
Alternatively, a good example of yaml file with all K8s resources defined would work too.
**Additional context**
Looking at the example here https://github.com/opensearch-project/data-prepper/blob/main/examples/dev/k8s/data-prepper.yaml, it uses an example-k8s/data-prepper image which I cannot find from a quick Google search. After changing the Docker image to opensearchproject/data-prepper, I got the issue below during container startup. From the example, SSL is already disabled in the pipeline YAML file, but I am not sure if I also have to disable SSL in the Data Prepper configuration YAML file.
WARNING: sun.reflect.Reflection.getCallerClass is not supported. This will impact performance.
2022-06-24T16:39:27,052 [main] INFO com.amazon.dataprepper.parser.config.DataPrepperAppConfiguration - Command line args: /appconfig/trace_analytics_no_ssl.yml
2022-06-24T16:39:27,054 [main] INFO com.amazon.dataprepper.parser.config.DataPrepperArgs - Using /appconfig/trace_analytics_no_ssl.yml configuration file
2022-06-24T16:39:32,921 [main] WARN com.amazon.dataprepper.parser.model.PipelineConfiguration - Prepper configurations are deprecated, processor configurations will be required in Data Prepper 2.0
2022-06-24T16:39:32,925 [main] WARN com.amazon.dataprepper.parser.model.PipelineConfiguration - Prepper configurations are deprecated, processor configurations will be required in Data Prepper 2.0
2022-06-24T16:39:32,926 [main] WARN com.amazon.dataprepper.parser.model.PipelineConfiguration - Prepper configurations are deprecated, processor configurations will be required in Data Prepper 2.0
2022-06-24T16:39:32,928 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building pipeline [entry-pipeline] from provided configuration
2022-06-24T16:39:32,928 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building [otel_trace_source] as source component for the pipeline [entry-pipeline]
2022-06-24T16:39:33,076 [main] WARN com.amazon.dataprepper.plugins.source.oteltrace.OTelTraceSource - Creating otel-trace-source without authentication. This is not secure.
2022-06-24T16:39:33,076 [main] WARN com.amazon.dataprepper.plugins.source.oteltrace.OTelTraceSource - In order to set up Http Basic authentication for the otel-trace-source, go here: https://github.com/opensearch-project/data-prepper/tree/main/data-prepper-plugins/otel-trace-source#authentication-configurations
2022-06-24T16:39:33,077 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building buffer for the pipeline [entry-pipeline]
2022-06-24T16:39:33,089 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building processors for the pipeline [entry-pipeline]
2022-06-24T16:39:33,437 [main] INFO com.amazon.dataprepper.plugins.prepper.peerforwarder.discovery.DnsPeerListProvider - Found endpoints: [Endpoint{data-prepper-headless.opensearch, ipAddr=192.168.4.45, weight=1000}]
2022-06-24T16:39:33,438 [main] INFO com.amazon.dataprepper.plugins.prepper.peerforwarder.HashRing - Building hash ring with endpoints: [192.168.4.45]
2022-06-24T16:39:33,441 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building sinks for the pipeline [entry-pipeline]
2022-06-24T16:39:33,441 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building [pipeline] as sink component
2022-06-24T16:39:33,442 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building [pipeline] as sink component
2022-06-24T16:39:33,443 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building pipeline [service-map-pipeline] from provided configuration
2022-06-24T16:39:33,443 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building [pipeline] as source component for the pipeline [service-map-pipeline]
2022-06-24T16:39:33,444 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building buffer for the pipeline [service-map-pipeline]
2022-06-24T16:39:33,444 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building processors for the pipeline [service-map-pipeline]
2022-06-24T16:39:33,581 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building sinks for the pipeline [service-map-pipeline]
2022-06-24T16:39:33,581 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building [opensearch] as sink component
2022-06-24T16:39:33,588 [main] WARN com.amazon.dataprepper.plugins.sink.opensearch.index.IndexConfiguration - The parameters, trace_analytics_raw and trace_analytics_service_map, are deprecated. Please use index_type parameter instead.
2022-06-24T16:39:33,594 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initializing OpenSearch sink
2022-06-24T16:39:33,600 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the username provided in the config.
2022-06-24T16:39:33,725 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the trust all strategy
2022-06-24T16:39:34,100 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager - Found version 0 for existing index template otel-v1-apm-service-map-index-template
2022-06-24T16:39:34,100 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager - Index template otel-v1-apm-service-map-index-template should not be updated, current version 0 >= existing version 0
2022-06-24T16:39:34,118 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initialized OpenSearch sink
2022-06-24T16:39:34,119 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building pipeline [raw-pipeline] from provided configuration
2022-06-24T16:39:34,119 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building [pipeline] as source component for the pipeline [raw-pipeline]
2022-06-24T16:39:34,119 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building buffer for the pipeline [raw-pipeline]
2022-06-24T16:39:34,120 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building processors for the pipeline [raw-pipeline]
2022-06-24T16:39:34,120 [main] WARN com.amazon.dataprepper.parser.PipelineParser - No plugin of type Processor found for plugin setting: otel_trace_raw_prepper, attempting to find comparable Prepper plugin.
2022-06-24T16:39:34,122 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building sinks for the pipeline [raw-pipeline]
2022-06-24T16:39:34,122 [main] INFO com.amazon.dataprepper.parser.PipelineParser - Building [opensearch] as sink component
2022-06-24T16:39:34,123 [main] WARN com.amazon.dataprepper.plugins.sink.opensearch.index.IndexConfiguration - The parameters, trace_analytics_raw and trace_analytics_service_map, are deprecated. Please use index_type parameter instead.
2022-06-24T16:39:34,124 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initializing OpenSearch sink
2022-06-24T16:39:34,124 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the username provided in the config.
2022-06-24T16:39:34,124 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.ConnectionConfiguration - Using the trust all strategy
2022-06-24T16:39:34,190 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager - Found version 1 for existing index template otel-v1-apm-span-index-template
2022-06-24T16:39:34,190 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.index.IndexManager - Index template otel-v1-apm-span-index-template should not be updated, current version 1 >= existing version 1
2022-06-24T16:39:34,196 [main] INFO com.amazon.dataprepper.plugins.sink.opensearch.OpenSearchSink - Initialized OpenSearch sink
2022-06-24T16:39:34,333 [main] WARN com.amazon.dataprepper.pipeline.server.config.DataPrepperServerConfiguration - Creating data prepper server without authentication. This is not secure.
2022-06-24T16:39:34,333 [main] WARN com.amazon.dataprepper.pipeline.server.config.DataPrepperServerConfiguration - In order to set up Http Basic authentication for the data prepper server, go here: https://github.com/opensearch-project/data-prepper/blob/main/docs/core_apis.md#authentication
2022-06-24T16:39:34,334 [main] INFO com.amazon.dataprepper.pipeline.server.HttpServerProvider - Creating Data Prepper server with TLS
2022-06-24T16:39:34,336 [main] WARN org.springframework.context.support.AbstractApplicationContext - Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'dataPrepperServer' defined in URL [jar:file:/usr/share/data-prepper/data-prepper.jar!/com/amazon/dataprepper/pipeline/server/DataPrepperServer.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'httpServer' defined in class path resource [com/amazon/dataprepper/pipeline/server/config/DataPrepperServerConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.sun.net.httpserver.HttpServer]: Factory method 'httpServer' threw exception; nested exception is java.lang.IllegalStateException: Problem loading keystore to create SSLContext
Exception in thread "main" org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'dataPrepperServer' defined in URL [jar:file:/usr/share/data-prepper/data-prepper.jar!/com/amazon/dataprepper/pipeline/server/DataPrepperServer.class]: Unsatisfied dependency expressed through constructor parameter 0; nested exception is org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'httpServer' defined in class path resource [com/amazon/dataprepper/pipeline/server/config/DataPrepperServerConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.sun.net.httpserver.HttpServer]: Factory method 'httpServer' threw exception; nested exception is java.lang.IllegalStateException: Problem loading keystore to create SSLContext
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:800)
at org.springframework.beans.factory.support.ConstructorResolver.autowireConstructor(ConstructorResolver.java:229)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.autowireConstructor(AbstractAutowireCapableBeanFactory.java:1372)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1222)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.preInstantiateSingletons(DefaultListableBeanFactory.java:955)
at org.springframework.context.support.AbstractApplicationContext.finishBeanFactoryInitialization(AbstractApplicationContext.java:918)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:583)
at com.amazon.dataprepper.ContextManager.<init>(ContextManager.java:48)
at com.amazon.dataprepper.DataPrepperExecute.main(DataPrepperExecute.java:22)
Caused by: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'httpServer' defined in class path resource [com/amazon/dataprepper/pipeline/server/config/DataPrepperServerConfiguration.class]: Bean instantiation via factory method failed; nested exception is org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.sun.net.httpserver.HttpServer]: Factory method 'httpServer' threw exception; nested exception is java.lang.IllegalStateException: Problem loading keystore to create SSLContext
at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:658)
at org.springframework.beans.factory.support.ConstructorResolver.instantiateUsingFactoryMethod(ConstructorResolver.java:638)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.instantiateUsingFactoryMethod(AbstractAutowireCapableBeanFactory.java:1352)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBeanInstance(AbstractAutowireCapableBeanFactory.java:1195)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.doCreateBean(AbstractAutowireCapableBeanFactory.java:582)
at org.springframework.beans.factory.support.AbstractAutowireCapableBeanFactory.createBean(AbstractAutowireCapableBeanFactory.java:542)
at org.springframework.beans.factory.support.AbstractBeanFactory.lambda$doGetBean$0(AbstractBeanFactory.java:335)
at org.springframework.beans.factory.support.DefaultSingletonBeanRegistry.getSingleton(DefaultSingletonBeanRegistry.java:234)
at org.springframework.beans.factory.support.AbstractBeanFactory.doGetBean(AbstractBeanFactory.java:333)
at org.springframework.beans.factory.support.AbstractBeanFactory.getBean(AbstractBeanFactory.java:208)
at org.springframework.beans.factory.config.DependencyDescriptor.resolveCandidate(DependencyDescriptor.java:276)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.doResolveDependency(DefaultListableBeanFactory.java:1391)
at org.springframework.beans.factory.support.DefaultListableBeanFactory.resolveDependency(DefaultListableBeanFactory.java:1311)
at org.springframework.beans.factory.support.ConstructorResolver.resolveAutowiredArgument(ConstructorResolver.java:887)
at org.springframework.beans.factory.support.ConstructorResolver.createArgumentArray(ConstructorResolver.java:791)
... 14 more
Caused by: org.springframework.beans.BeanInstantiationException: Failed to instantiate [com.sun.net.httpserver.HttpServer]: Factory method 'httpServer' threw exception; nested exception is java.lang.IllegalStateException: Problem loading keystore to create SSLContext
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:185)
at org.springframework.beans.factory.support.ConstructorResolver.instantiate(ConstructorResolver.java:653)
... 28 more
Caused by: java.lang.IllegalStateException: Problem loading keystore to create SSLContext
at com.amazon.dataprepper.pipeline.server.SslUtil.createSslContext(SslUtil.java:35)
at com.amazon.dataprepper.pipeline.server.HttpServerProvider.get(HttpServerProvider.java:41)
at com.amazon.dataprepper.pipeline.server.config.DataPrepperServerConfiguration.httpServer(DataPrepperServerConfiguration.java:59)
at com.amazon.dataprepper.pipeline.server.config.DataPrepperServerConfiguration$$EnhancerBySpringCGLIB$$6fac9e.CGLIB$httpServer$1(<generated>)
at com.amazon.dataprepper.pipeline.server.config.DataPrepperServerConfiguration$$EnhancerBySpringCGLIB$$6fac9e$$FastClassBySpringCGLIB$$b3de3ee2.invoke(<generated>)
at org.springframework.cglib.proxy.MethodProxy.invokeSuper(MethodProxy.java:244)
at org.springframework.context.annotation.ConfigurationClassEnhancer$BeanMethodInterceptor.intercept(ConfigurationClassEnhancer.java:331)
at com.amazon.dataprepper.pipeline.server.config.DataPrepperServerConfiguration$$EnhancerBySpringCGLIB$$6fac9e.httpServer(<generated>)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:64)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at org.springframework.beans.factory.support.SimpleInstantiationStrategy.instantiate(SimpleInstantiationStrategy.java:154)
... 29 more
Caused by: java.io.IOException: Is a directory
at java.base/sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at java.base/sun.nio.ch.FileDispatcherImpl.read(FileDispatcherImpl.java:48)
at java.base/sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:276)
at java.base/sun.nio.ch.IOUtil.read(IOUtil.java:245)
at java.base/sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:229)
at java.base/sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65)
at java.base/sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:107)
at java.base/sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:101)
at java.base/java.io.BufferedInputStream.fill(BufferedInputStream.java:244)
at java.base/java.io.BufferedInputStream.read(BufferedInputStream.java:263)
at java.base/sun.security.util.DerValue.init(DerValue.java:383)
at java.base/sun.security.util.DerValue.<init>(DerValue.java:327)
at java.base/sun.security.util.DerValue.<init>(DerValue.java:340)
at java.base/sun.security.pkcs12.PKCS12KeyStore.engineLoad(PKCS12KeyStore.java:1960)
at java.base/sun.security.util.KeyStoreDelegator.engineLoad(KeyStoreDelegator.java:220)
at java.base/java.security.KeyStore.load(KeyStore.java:1472)
at com.amazon.dataprepper.pipeline.server.SslUtil.createSslContext(SslUtil.java:24)
... 41 more
| is there an example to setup data-prepper within kubernetes | https://api.github.com/repos/opensearch-project/data-prepper/issues/1541/comments | 9 | 2022-06-24T16:37:36Z | 2023-11-22T05:44:02Z | https://github.com/opensearch-project/data-prepper/issues/1541 | 1,283,954,315 | 1,541 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Both the [S3 Source](https://github.com/opensearch-project/data-prepper/issues/251) and the HTTP Source use similar concepts of codecs for parsing input data. The S3 Source currently makes these codecs available as plugins, so they can be extended for the S3 Source. But if another source wanted to use these codecs, it would be unable to.
**Describe the solution you'd like**
Create a core concept in Data Prepper of source-based codecs or parsers. These should be generic enough to take any Java `InputStream` and produce events from them.
I propose that we base this concept on the S3 codec. It has a few advantages:
1. It uses an `InputStream`. This is advantageous for large inputs.
2. It has a `Consumer` for each event. This allows the source using the codec to receive `Event` objects and decide, independently of the codec, the best way to handle them.
3. It is not connected to HTTP in any way.
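To make the idea concrete, here is a minimal sketch of what such a generic, `InputStream`-based codec could look like. This is illustrative only: the real S3 codec hands `Record<Event>` objects to its consumer, but this sketch substitutes `String` records (and a trivial newline-delimited implementation) so that it is self-contained.

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.nio.charset.StandardCharsets;
import java.util.function.Consumer;

// Hypothetical generic codec interface, modeled on the S3 source codec:
// it takes any InputStream and hands each parsed record to a Consumer,
// leaving the source free to decide how to handle the records.
interface Codec {
    void parse(InputStream inputStream, Consumer<String> recordConsumer) throws IOException;
}

// A trivial newline-delimited implementation, for illustration only.
class NewlineDelimitedCodec implements Codec {
    @Override
    public void parse(final InputStream inputStream, final Consumer<String> recordConsumer) throws IOException {
        try (BufferedReader reader = new BufferedReader(
                new InputStreamReader(inputStream, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                recordConsumer.accept(line);
            }
        }
    }
}
```

Because the codec never buffers the whole input and never touches HTTP, the same implementation could be reused by the S3 source, the HTTP source, or any future source that can produce an `InputStream`.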
**Describe alternatives you've considered (Optional)**
Data Prepper can have a similar concept for output codecs/parsers. However, I see no reason to force these to be the same concept. (Implementors may choose to pair them together to avoid code duplication).
**Additional context**
S3 Codec interface:
https://github.com/opensearch-project/data-prepper/blob/37c8b09ae8b7b8ce233564f5da7b09d1d41953f2/data-prepper-plugins/s3-source/src/main/java/com/amazon/dataprepper/plugins/source/codec/Codec.java#L19-L27
HTTP Codec interface:
https://github.com/opensearch-project/data-prepper/blob/37c8b09ae8b7b8ce233564f5da7b09d1d41953f2/data-prepper-plugins/http-source/src/main/java/com/amazon/dataprepper/plugins/source/loghttp/codec/Codec.java#L16-L23 | Support generic parsers/codecs | https://api.github.com/repos/opensearch-project/data-prepper/issues/1532/comments | 4 | 2022-06-23T14:37:47Z | 2023-06-03T00:57:50Z | https://github.com/opensearch-project/data-prepper/issues/1532 | 1,282,515,913 | 1,532 |
[
"opensearch-project",
"data-prepper"
] | Create Data Prepper 1.5.0 Changelog
The Changelog is a detailed overview of all the changes made to Data Prepper in this release. It needs to be generated from Git history.
See #1393 for the previous release's instructions. | Create Data Prepper 1.5.0 Changelog | https://api.github.com/repos/opensearch-project/data-prepper/issues/1521/comments | 0 | 2022-06-16T17:14:57Z | 2022-06-23T16:12:01Z | https://github.com/opensearch-project/data-prepper/issues/1521 | 1,273,862,416 | 1,521 |
[
"opensearch-project",
"data-prepper"
] | Create Data Prepper 1.5.0 Release Notes
All changes should be available at:
https://github.com/opensearch-project/data-prepper/milestone/6?closed=1 | Create Data Prepper 1.5.0 Release Notes | https://api.github.com/repos/opensearch-project/data-prepper/issues/1520/comments | 0 | 2022-06-16T17:14:13Z | 2022-06-23T14:23:47Z | https://github.com/opensearch-project/data-prepper/issues/1520 | 1,273,861,739 | 1,520 |
[
"opensearch-project",
"data-prepper"
] | Update Data Prepper documentation at: https://github.com/opensearch-project/documentation-website | Update Data Prepper documentation on OpenSearch.org for 1.5.0 | https://api.github.com/repos/opensearch-project/data-prepper/issues/1519/comments | 0 | 2022-06-16T17:13:04Z | 2022-06-24T00:50:23Z | https://github.com/opensearch-project/data-prepper/issues/1519 | 1,273,860,592 | 1,519 |
[
"opensearch-project",
"data-prepper"
] | Update `THIRD-PARTY` file. | Update THIRD-PARTY file for Data Prepper 1.5.0 | https://api.github.com/repos/opensearch-project/data-prepper/issues/1518/comments | 0 | 2022-06-16T17:12:47Z | 2022-06-17T16:33:30Z | https://github.com/opensearch-project/data-prepper/issues/1518 | 1,273,859,909 | 1,518 |
[
"opensearch-project",
"data-prepper"
] | null | Add the documentation for S3 Source | https://api.github.com/repos/opensearch-project/data-prepper/issues/1515/comments | 0 | 2022-06-15T21:01:53Z | 2022-06-22T20:24:18Z | https://github.com/opensearch-project/data-prepper/issues/1515 | 1,272,735,858 | 1,515 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Dependabot PRs for Antlr were failing in the main builds.
**Describe the solution you'd like**
Make necessary changes to update the Antlr versions to the latest.
| Update Antlr | https://api.github.com/repos/opensearch-project/data-prepper/issues/1513/comments | 1 | 2022-06-15T12:16:27Z | 2022-08-31T20:29:26Z | https://github.com/opensearch-project/data-prepper/issues/1513 | 1,272,146,661 | 1,513 |
[
"opensearch-project",
"data-prepper"
] | # Motivation
Data Prepper has an upcoming directory structure change (#305) which is conducive to having multiple jar files. Along with this, the data-prepper-core project can be split into multiple Jar files. These jar files can even be deployed to Maven Central.
Also, many of the projects have the `data-prepper-` prefix. We may want to consider renaming the artifact and project names.
# Current Structure
```
data-prepper-api/
data-prepper-benchmarks/
data-prepper-core/
data-prepper-expression/
data-prepper-logstash-configuration/
data-prepper-plugins/
  aggregate-processor/
  common/
  grok-processor/
  ...
data-prepper-test-common/
release/
e2e-test/
performance-test/
```
# Proposed Solution
Re-organize the project structure along the lines of the following:
```
common/
  data-prepper-api/
  data-prepper-configuration-converter-api/
core/
  data-prepper-core/
  data-prepper-main/
  data-prepper-expression/
  data-prepper-configuration-converter/
  data-prepper-pipeline/
  ...
plugins/
  (existing plugins)
test/
  data-prepper-test-common/
release/
e2e-test/
performance-test/
```
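As a sketch of how this layout could be wired up in Gradle, a `settings.gradle` along these lines would declare the restructured projects. The exact project set and names are still subject to the open questions below, so treat these entries as assumptions:

```
rootProject.name = 'data-prepper'

include 'common:data-prepper-api'
include 'common:data-prepper-configuration-converter-api'
include 'core:data-prepper-core'
include 'core:data-prepper-main'
include 'core:data-prepper-expression'
include 'test:data-prepper-test-common'
```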
### Common Components
These are components which are used by both Data Prepper Core Components and Plugins. These projects are located in the `common/` directory.
* `data-prepper-api` - The existing project as-is. It will retain the Maven groupId `org.opensearch.dataprepper`.
* `data-prepper-configuration-converter-api` - This project can have interfaces from `data-prepper-logstash-configuration` so that they are available to plugins. It will be in the Maven groupId: `org.opensearch.dataprepper.converter`
### Core Components
Most components here have the Maven groupId: `org.opensearch.dataprepper.core`. They are located in the `core/` directory in the project structure.
* `data-prepper-expression` - The existing project as-is
* `data-prepper-configuration-converter` - The existing project mostly as-is, though interfaces would be moved to `data-prepper-configuration-converter-api`. This project will have a different Maven groupId than other projects here: `org.opensearch.dataprepper.converter`.
* `data-prepper-pipeline` - Code responsible for validating and parsing pipeline configurations
* `data-prepper-server` - Manages the server which runs on port 4900 by default (list pipelines, metrics, etc.)
* `data-prepper-plugin-framework` - The framework for loading plugins
* `data-prepper-core` - Contains everything needed to run Data Prepper, though without a Java main method.
* `data-prepper-main` - The Java `main` method is here, and that is all. It uses `data-prepper-core`. By splitting this, it will be possible to run Data Prepper with a different `main` method, or even to run it programmatically.
There may be more ways to break down the core project. This can happen iteratively as well. But, by having this structure, the project will be in a better position to split them.
### Plugins
This will be the same directory currently named `data-prepper-plugins`, but renamed to just `plugins`. The projects will not deploy to Maven Central.
### Test
Any projects related to common testing libraries. There is only one library now, but we may find value in adding more to make integration testing easier. The Maven groupId is `org.opensearch.dataprepper.test`.
### Other Projects
This proposal currently does not modify: `release`, `e2e-test`, `performance-test`.
## Out of Scope
This restructure does not include any work toward having bundled/core plugins versus optional plugins. Thus, the plugins project is mostly left as-is.
This does not include the location of the scripts to start Data Prepper in the new directory structure. This issue is focused mainly on the Gradle project structure.
## Questions
* ~Is there a better name for the `core/data-prepper` project?~
* Should the Maven artifact names (and project names) start with `data-prepper-`? This is somewhat redundant, but many projects use this convention, so we may wish to continue to follow it.
* Do we want to do anything with the `e2e-tests` and `performance-tests` directories?
* What about `data-prepper-benchmarks`? Do we even need this still?
* Can we move the `shared-configs` to a better location?
# Tasks
- [ ] Create high-level Gradle structure (common, core, test)
- [ ] Extract data-prepper-pipeline
- [ ] Extract data-prepper-main, including current build of uber-jar
- [ ] Extract data-prepper-server
- [ ] data-prepper-plugin-framework
- [ ] Create Maven artifacts for Data Prepper core libraries
- [ ] Create Maven artifacts for Data Prepper test libraries
| [RFC] New Gradle project directory structure | https://api.github.com/repos/opensearch-project/data-prepper/issues/1503/comments | 1 | 2022-06-10T20:08:21Z | 2022-10-25T00:47:43Z | https://github.com/opensearch-project/data-prepper/issues/1503 | 1,267,961,633 | 1,503 |
[
"opensearch-project",
"data-prepper"
] | Provide metrics related to SQS:
* messagesReceived (Counter)
* messagesDeleted (Counter)
* messagesFailed (Counter)
* messageDelay (Timer) - difference between time the message was added to SQS and when it was completed. | Add SQS-related metrics | https://api.github.com/repos/opensearch-project/data-prepper/issues/1501/comments | 0 | 2022-06-10T19:13:17Z | 2022-06-21T14:35:56Z | https://github.com/opensearch-project/data-prepper/issues/1501 | 1,267,913,813 | 1,501 |
[
"opensearch-project",
"data-prepper"
] | The S3 Source currently attempts to write to the buffer and if the buffer is full, will fail and drop Events. The difficulty with retrying writes to this buffer is that the S3 connection may close.
Investigate retrying writes to the Buffer as long as the connection to S3 is open. | Retry failed buffer writes as long as the connection is alive | https://api.github.com/repos/opensearch-project/data-prepper/issues/1500/comments | 0 | 2022-06-10T19:09:47Z | 2022-06-27T18:38:02Z | https://github.com/opensearch-project/data-prepper/issues/1500 | 1,267,911,137 | 1,500 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper has AWS integrations to get SSL certificates from AWS services (S3 or ACM). These are only unit tested, and there are no tests that verify against actual AWS resources.
**Describe the solution you'd like**
I'd like to have a GitHub Action which runs against a known AWS environment. I want to be able to run this locally when performing tests against these specific changes. I would also still like to run the full Data Prepper build (`./gradlew clean build`) locally without having any AWS credentials or resources.
**Describe alternatives you've considered (Optional)**
N/A
**Additional context**
I am doing some work presently for the S3 source to have an integration test. PR #1474 includes an integration test which loads objects from an actual S3 bucket. This command does not run as part of the entire build. Perhaps something similar could work for the SSL certificates.
| Integration/end-to-end tests for AWS certificate integrations | https://api.github.com/repos/opensearch-project/data-prepper/issues/1481/comments | 0 | 2022-06-08T18:55:42Z | 2022-06-08T18:55:49Z | https://github.com/opensearch-project/data-prepper/issues/1481 | 1,265,163,025 | 1,481 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
As a customer, I'd like to understand the end-to-end latency of data ingestion at each stage, so I want to know how far behind Data Prepper is. For example, I have a system that generates logs and outputs them to Data Prepper; each log record has a `time` field which indicates when it happened in my application. Data Prepper processes those logs and ingests them into OpenSearch. As a customer, if I can get the time of the last processed record, I can identify where the system's bottleneck is.
**Describe the solution you'd like**
Allow customers to specify a time field in the data, and expose a delay metric.
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| An overall delay metrics | https://api.github.com/repos/opensearch-project/data-prepper/issues/1478/comments | 0 | 2022-06-08T11:34:54Z | 2022-06-13T19:40:21Z | https://github.com/opensearch-project/data-prepper/issues/1478 | 1,264,604,514 | 1,478 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The customer is using a log framework/method to write logs to local files. It may export a function called `logFramework.log(level, message)`, and the log framework will format the log message automatically. However, some new team members forget to use the log framework and output logs with a native method such as `console.log(xxxx)` instead. This results in some invalid records in the log files.
```
{ "time": "XXXX", "msg": "xxxxx", "level": "INFO"}
{ "time": "XXXX", "msg": "yyyyy", "level": "ERROR"}
xxxxx
{ "time": "XXXX", "msg": "xxxxx", "level": "WARNING"}
{ "time": "XXXX", "msg": "zzzzz", "level": "DEBUG"}
```
I wish the `xxxxx` line could also be saved to OpenSearch as `{"time": "system-time", "message":"xxxxx"}`. I could then filter `level=NONE` to find all invalid records.
**Describe the solution you'd like**
Offer an option to ingest invalid records as-is.
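As an illustration of the option (the `InvalidRecordFallback` class below is hypothetical, not existing Data Prepper code), any line that fails JSON parsing could be wrapped as `{"time": <system time>, "message": <raw line>}` instead of being dropped. The brace check is a naive stand-in for a real JSON parser:

```java
import java.time.Instant;
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch only; this class does not exist in Data Prepper.
// Lines that do not parse as JSON are wrapped with a system timestamp
// instead of being dropped.
public class InvalidRecordFallback {
    public static Map<String, Object> toEvent(final String line) {
        final Map<String, Object> event = new LinkedHashMap<>();
        if (looksLikeJson(line)) {
            // A real implementation would parse the JSON into the event map here.
            event.put("raw", line);
        } else {
            event.put("time", Instant.now().toString());
            event.put("message", line);
        }
        return event;
    }

    private static boolean looksLikeJson(final String line) {
        final String trimmed = line.trim();
        return trimmed.startsWith("{") && trimmed.endsWith("}");
    }
}
```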
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| An option to insert invalid parsed log into OpenSearch as it is | https://api.github.com/repos/opensearch-project/data-prepper/issues/1477/comments | 2 | 2022-06-08T11:27:29Z | 2022-11-03T16:41:00Z | https://github.com/opensearch-project/data-prepper/issues/1477 | 1,264,596,251 | 1,477 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Implement major version tagging in Data Prepper per https://github.com/opensearch-project/opensearch-build/issues/1621. More details are in that issue.
Once completed, users can run the following command to get the latest version within a major-version series.
```
docker pull opensearchproject/data-prepper:1
```
**Out of scope**
This does not include major/minor. If requested, this could be added in future versions.
| Docker Labels for Latest Major Versions | https://api.github.com/repos/opensearch-project/data-prepper/issues/1475/comments | 2 | 2022-06-07T16:29:22Z | 2022-06-23T12:43:52Z | https://github.com/opensearch-project/data-prepper/issues/1475 | 1,263,566,358 | 1,475 |
[
"opensearch-project",
"data-prepper"
] | null | Add S3-related metrics | https://api.github.com/repos/opensearch-project/data-prepper/issues/1464/comments | 0 | 2022-06-02T15:30:43Z | 2022-06-16T14:27:32Z | https://github.com/opensearch-project/data-prepper/issues/1464 | 1,258,350,623 | 1,464 |
[
"opensearch-project",
"data-prepper"
] | null | Ensure that only buckets owned by the expected account are read | https://api.github.com/repos/opensearch-project/data-prepper/issues/1463/comments | 0 | 2022-06-02T15:30:07Z | 2022-06-15T14:55:45Z | https://github.com/opensearch-project/data-prepper/issues/1463 | 1,258,349,899 | 1,463 |
[
"opensearch-project",
"data-prepper"
] | null | Parse JSON S3 objects | https://api.github.com/repos/opensearch-project/data-prepper/issues/1462/comments | 0 | 2022-06-02T15:30:04Z | 2022-06-08T01:44:14Z | https://github.com/opensearch-project/data-prepper/issues/1462 | 1,258,349,838 | 1,462 |
[
"opensearch-project",
"data-prepper"
] | Parse S3 objects which are newline-delimited. | Parse newline-delimited S3 objects | https://api.github.com/repos/opensearch-project/data-prepper/issues/1461/comments | 0 | 2022-06-02T15:30:00Z | 2022-06-03T18:46:56Z | https://github.com/opensearch-project/data-prepper/issues/1461 | 1,258,349,775 | 1,461 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Some use-cases require placing different documents in different dynamic indices. Pipeline authors want to configure a dynamic index name that is derived from a property (or multiple properties) in a Data Prepper event.
**Describe the solution you'd like**
Support dynamic index names using a format string. This format string can use `${}` to signal string interpolation and use JSON Pointer for getting fields from events. For example: `metadata-${metadataType}`.
```
pipeline:
  ...
  sink:
    - opensearch:
        hosts: ["https://opensearch-host"]
        index_type: custom
        index: "metadata-${metadataType}"
```
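To illustrate the interpolation (the `IndexNameResolver` class below is hypothetical, not existing Data Prepper code), resolving `${}` placeholders against top-level event fields could look like this; the full proposal would use JSON Pointer so that nested fields also work:

```java
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch only; this resolver does not exist in Data Prepper.
// Replaces each ${field} placeholder with the event's top-level field value.
public class IndexNameResolver {
    private static final Pattern PLACEHOLDER = Pattern.compile("\\$\\{([^}]+)\\}");

    public static String resolve(final String format, final Map<String, Object> event) {
        final Matcher matcher = PLACEHOLDER.matcher(format);
        final StringBuilder result = new StringBuilder();
        while (matcher.find()) {
            final Object value = event.get(matcher.group(1));
            if (value == null) {
                throw new IllegalArgumentException("Event is missing field: " + matcher.group(1));
            }
            matcher.appendReplacement(result, Matcher.quoteReplacement(value.toString()));
        }
        matcher.appendTail(result);
        return result.toString();
    }
}
```

With the configuration above, an event containing `metadataType: "service"` would be routed to the index `metadata-service`.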
**Describe alternatives you've considered (Optional)**
Using conditional routing could allow for supporting a predefined number of indices. However, this approach has a very practical bound. Using this format string allows for unlimited (from Data Prepper's perspective) indices.
**Additional context**
N/A
| Dynamic Index Name in OpenSearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/1459/comments | 3 | 2022-06-01T17:32:52Z | 2023-03-01T13:24:07Z | https://github.com/opensearch-project/data-prepper/issues/1459 | 1,256,601,326 | 1,459 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
OpenSearch supports a `routing` parameter to route documents to a specific shard in the [Index API](https://opensearch.org/docs/latest/opensearch/rest-api/document-apis/index-document/). Some pipeline authors would like to have Data Prepper use a specific field for the `routing` parameter to route documents to specific shards.
**Describe the solution you'd like**
Provide a new `routing_field` in the OpenSearch sink. This should be similar to the existing `document_id_field`. If present, then Data Prepper will set the value of the `routing` property for that document to the value which the event has for the defined field.
This feature should support JSON Pointer syntax so that nested fields are available, not just top-level fields.
Example:
```
sink:
  - opensearch:
      hosts: ["https://my-opensearch"]
      routing_field: metadata/id
      document_id_field: id
```
Given the following event, Data Prepper will create the document with the routing set to `abcd`:
```
id: "123",
metadata: {
id : "abcd",
fieldA: "valueA"
},
fieldB: "valueB"
```
**Additional context**
This change should support complex/nested fields similar to what is proposed for `document_id_field` in #1456. | Routing of documents to shards in OpenSearch sink | https://api.github.com/repos/opensearch-project/data-prepper/issues/1458/comments | 1 | 2022-06-01T12:18:33Z | 2022-11-03T16:54:41Z | https://github.com/opensearch-project/data-prepper/issues/1458 | 1,255,849,417 | 1,458 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper only uses the `index` action of the [Bulk API](https://opensearch.org/docs/latest/opensearch/rest-api/document-apis/bulk/). In some cases, teams want to create only if the document does not already exist. The OpenSearch `_bulk` API does support this via a `create` action.
**Describe the solution you'd like**
Create a new `action` property of the OpenSearch sink. The default value will be `index` (this is the current behavior). Pipeline authors can define this value to be either `create` or `index`.
Data Prepper will send bulk requests with all events using the configured `action`.
If OpenSearch responds that the document could not be created because it exists, this document will be dropped without any retry or DLQ. This is a reasonable expectation for pipeline authors who use `create` since their goals are to avoid updating existing documents.
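Under this proposal, a pipeline author who wants create-only behavior might configure the sink as follows (note that the `action` property is the proposed addition, not an existing option):

```
sink:
  - opensearch:
      hosts: ["https://my-opensearch"]
      action: create
```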
**Describe alternatives you've considered (Optional)**
Data Prepper could key off of values directly in the document to determine if `create` or `index` should be used. This would be somewhat more complicated since pipeline authors would need to define conditions and fallbacks. This approach could be added later if needed. But, the proposed solution should be simpler for pipeline authors.
**Additional context**
In addition to `index` and `create`, the Bulk API also supports `update` and `delete`. The proposal above would allow us to add these in future iterations.
| Create-only actions in OpenSearch bulk requests | https://api.github.com/repos/opensearch-project/data-prepper/issues/1457/comments | 5 | 2022-06-01T12:10:21Z | 2022-08-08T17:11:34Z | https://github.com/opensearch-project/data-prepper/issues/1457 | 1,255,831,647 | 1,457 |