issue_owner_repo listlengths 2 2 | issue_body stringlengths 0 262k ⌀ | issue_title stringlengths 1 1.02k | issue_comments_url stringlengths 53 116 | issue_comments_count int64 0 2.49k | issue_created_at stringdate 1999-03-17 02:06:42 2025-06-23 11:41:49 | issue_updated_at stringdate 2000-02-10 06:43:57 2025-06-23 11:43:00 | issue_html_url stringlengths 34 97 | issue_github_id int64 132 3.17B | issue_number int64 1 215k |
|---|---|---|---|---|---|---|---|---|---|
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper requires JDK 14 to build. This is not an LTS version. Developers should be able to build on JDK 11.
**Describe the solution you'd like**
Support building on JDK 11. This may require support from OpenSearch since the `opensearch` sink uses OpenSearch Gradle plugins which required (or still require) JDK 11.
**Describe alternatives you've considered (Optional)**
Building on JDK 17 is a better option than 14 since it is an LTS version. But OpenSearch is aiming for JDK 11, and it is a reasonable target since it has been available for a long time. | Build on JDK 11 | https://api.github.com/repos/opensearch-project/data-prepper/issues/665/comments | 0 | 2021-11-30T19:10:16Z | 2022-06-25T20:25:23Z | https://github.com/opensearch-project/data-prepper/issues/665 | 1,067,581,503 | 665 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper (DP) does not utilize any Dependency Injection (DI) framework at this moment, nor is it leveraging an existing framework which supports DI. Without a DI framework there are numerous code smells in DP. Currently, DP couples the construction of dependent objects with the implementation of other objects. DP also has cyclical dependencies and leverages static methods and instances to solve its own singleton management.
**Describe the solution you'd like**
* Introduce dependency injection to the core of Data Prepper.
* Add minimal spring dependencies (Spring Core, javax inject)
**Tasks:**
- [x] Create Spring POC for Spring DI with UberJar and Multi-Jar deployment
- [x] Measure startup impact to DataPrepper
- [x] Add Spring DI framework as dependency and create DataPrepper class from the DI framework
- [x] DataPrepper should inject all dependencies and eliminate singleton implementation
- [x] DataPrepperServer should inject all dependencies
**Related Issues**
- Support Dependency Injection in Plugins [#929](https://github.com/opensearch-project/data-prepper/issues/929)
- Support parsing configuration files with Spring [#932](https://github.com/opensearch-project/data-prepper/issues/932)
**Additional context**
Implements: #519
Dependencies should be injected via a constructor and not via setter methods.
Unit tests should be updated to leverage dependency injection.
| Add Spring Dependency injection to Data Prepper Core | https://api.github.com/repos/opensearch-project/data-prepper/issues/664/comments | 1 | 2021-11-30T15:49:48Z | 2022-02-18T19:47:15Z | https://github.com/opensearch-project/data-prepper/issues/664 | 1,067,393,482 | 664 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The OpenSearch Plugins are still built using Gradle 6.6.x, whereas the Gradle release train has already moved to 7.x. The OpenSearch core is about to be switched to Gradle 7.3 (please see opensearch-project/OpenSearch#1609); it would make sense to switch all plugins to Gradle 7.3 as well.
**Describe the solution you'd like**
Update Gradle to 7.3
**Describe alternatives you've considered (Optional)**
N/A
**Additional context**
Please see opensearch-project/OpenSearch#1246 and opensearch-project/opensearch-plugins#107
| Update to Gradle 7 | https://api.github.com/repos/opensearch-project/data-prepper/issues/662/comments | 6 | 2021-11-29T16:52:44Z | 2022-06-25T20:25:54Z | https://github.com/opensearch-project/data-prepper/issues/662 | 1,066,268,882 | 662 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
When Data Prepper fails due to a user error, Data Prepper outputs a stack trace. The user of Data Prepper must make sense of this stack trace to know how to resolve the error. I am including a sample stack trace as additional context at the bottom of the issue.
**Describe the solution you'd like**
For user errors, return a cleaner message which provides more descriptive context.
```
FAILURE
Where:
Failed to parse the configuration file pipelines.yaml. Unable to parse pipeline, simple-test-pipeline.
Why:
Invalid configuration, at least one sink is required
For more help, visit the project page: https://github.com/opensearch-project/data-prepper
```
**Scope**
Unexpected failures in Data Prepper Core and in plugins should still result in stack trace output. When these unexpected scenarios occur, they are likely bugs and stack traces will help track them down.
**Proposed Approach**
Update the Data Prepper API to include specific exceptions which indicate that a user error has occurred. In Data Prepper Core, catch these types of exceptions and then wrap them in an outer exception which has additional context.
An example is plugin configuration errors. Plugins can throw a standardized exception - perhaps `InvalidPluginConfigurationException`. The pipeline parsing code can then catch that specific exception and wrap it into an exception which includes additional context and indicates that the exception is user-related. Finally, that exception is caught in `DataPrepperExecute` and used to print out the improved user message.
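A minimal sketch of this catch-and-wrap flow. The exception names follow the proposal above; they are not classes that exist in Data Prepper today.

```java
// Sketch only: these exception types are the ones proposed in this issue,
// not existing Data Prepper API.
class InvalidPluginConfigurationException extends RuntimeException {
    InvalidPluginConfigurationException(final String message) { super(message); }
}

// A wrapper that marks an exception as user-caused and adds pipeline context.
class UserErrorException extends RuntimeException {
    UserErrorException(final String message, final Throwable cause) { super(message, cause); }
}

class PipelineParserSketch {
    // Wraps known user errors with context; unexpected errors pass through
    // unchanged so they still produce a full stack trace.
    static RuntimeException wrapUserError(final String pipelineName, final RuntimeException cause) {
        if (cause instanceof InvalidPluginConfigurationException) {
            return new UserErrorException(
                    "Unable to parse pipeline, " + pipelineName + ": " + cause.getMessage(), cause);
        }
        return cause;
    }
}
```

`DataPrepperExecute` would then only need to check for `UserErrorException` to decide between the clean message and a full stack trace.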
**Help Page**
The example output above refers users to a specific web page for help. It may also be valuable to have a single page dedicated to helping users. It might point to the documentation, suggest the forums, and include a link to submit a bug report.
For example:
```
For more help, visit the project page: https://github.com/opensearch-project/data-prepper/docs/help.md
```
**Additional context**
Current output:
```
Exception in thread "main" com.amazon.dataprepper.parser.ParseException: Failed to parse the configuration file pipelines.yaml
at com.amazon.dataprepper.parser.PipelineParser.parseConfiguration(PipelineParser.java:75)
at com.amazon.dataprepper.DataPrepper.execute(DataPrepper.java:129)
at com.amazon.dataprepper.DataPrepperExecute.main(DataPrepperExecute.java:33)
Caused by: com.fasterxml.jackson.databind.exc.ValueInstantiationException: Cannot construct instance of `com.amazon.dataprepper.parser.model.PipelineConfiguration`, problem: Invalid configuration, at least one sink is required
at [Source: (File); line: 6, column: 1] (through reference chain: java.util.LinkedHashMap["simple-test-pipeline"])
at com.fasterxml.jackson.databind.exc.ValueInstantiationException.from(ValueInstantiationException.java:47)
at com.fasterxml.jackson.databind.DeserializationContext.instantiationException(DeserializationContext.java:2047)
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.wrapAsJsonMappingException(StdValueInstantiator.java:587)
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.rewrapCtorProblem(StdValueInstantiator.java:610)
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:293)
at com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
at com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
at com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:518)
at com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1405)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:351)
at com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:184)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer._readAndBindStringKeyMap(MapDeserializer.java:609)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:437)
at com.fasterxml.jackson.databind.deser.std.MapDeserializer.deserialize(MapDeserializer.java:32)
at com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
at com.fasterxml.jackson.databind.ObjectMapper._readMapAndClose(ObjectMapper.java:4675)
at com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:3515)
at com.amazon.dataprepper.parser.PipelineParser.parseConfiguration(PipelineParser.java:58)
... 2 more
Caused by: java.lang.IllegalArgumentException: Invalid configuration, at least one sink is required
at com.amazon.dataprepper.parser.model.PipelineConfiguration.getSinksFromConfiguration(PipelineConfiguration.java:107)
at com.amazon.dataprepper.parser.model.PipelineConfiguration.<init>(PipelineConfiguration.java:45)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
at com.fasterxml.jackson.databind.introspect.AnnotatedConstructor.call(AnnotatedConstructor.java:128)
at com.fasterxml.jackson.databind.deser.std.StdValueInstantiator.createFromObjectWith(StdValueInstantiator.java:291)
... 15 more
```
**Outstanding Questions**
*What exceptions should result in these clear errors?*
I tend to think that Data Prepper should catch JSR-380 exceptions and also have a custom exception that plugins can throw when manual validation is necessary.
* `InvalidPluginConfigurationException` - from Data Prepper API
* `ConstraintViolationException` - from JSR-380
* `ValidationException` - from JSR-380
*Where should the errors go?*
Should Data Prepper log these errors to the logs? Or perhaps stdout or stderr? Perhaps both? Or should it be configurable?
**Tasks**
- [ ] Create a standard exception for invalid plugin configurations - `InvalidPluginConfigurationException`
- [ ] Catch validation errors and extract the relevant error
- [ ] Catch YAML parsing error and extract the relevant error
- [ ] Output the error in a clean format
| Clear and Compact User Error Messages | https://api.github.com/repos/opensearch-project/data-prepper/issues/656/comments | 2 | 2021-11-24T19:55:23Z | 2023-07-23T16:22:07Z | https://github.com/opensearch-project/data-prepper/issues/656 | 1,062,843,213 | 656 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Not a problem. It is a feature supported in other ingestion tools, and Data Prepper should have it. Over time, new indices are created with names indicating the time range each index covers.
**Describe the solution you'd like**
In the OpenSearch Sink configuration, there is an `index` parameter. We want this parameter to support date and time patterns, so indices can be created on OpenSearch hosts according to the pattern configured in the `index` parameter.
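As a sketch of what such pattern substitution could look like. The `%{...}` placeholder syntax here is an assumption for illustration, not a decided format.

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

class IndexNameFormatter {
    // Hypothetical placeholder syntax: logs-%{yyyy.MM.dd} -> logs-2021.11.23
    private static final Pattern DATE_PATTERN = Pattern.compile("%\\{([^}]+)\\}");

    static String format(final String indexAlias, final ZonedDateTime time) {
        final Matcher matcher = DATE_PATTERN.matcher(indexAlias);
        final StringBuffer result = new StringBuffer();
        while (matcher.find()) {
            // The placeholder content is treated as a java.time format pattern.
            matcher.appendReplacement(result,
                    DateTimeFormatter.ofPattern(matcher.group(1)).format(time));
        }
        matcher.appendTail(result);
        return result.toString();
    }
}
```

An index alias without a placeholder passes through unchanged, so existing configurations keep working.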
**Describe alternatives you've considered (Optional)**
NA.
**Additional context**
It's part of our 1.3 goal. | [RFC] Support creating index names with date and time patterns | https://api.github.com/repos/opensearch-project/data-prepper/issues/648/comments | 8 | 2021-11-23T01:00:24Z | 2022-01-20T23:19:44Z | https://github.com/opensearch-project/data-prepper/issues/648 | 1,060,741,002 | 648 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper plugins still implement `Prepper`.
**Describe the solution you'd like**
Update all `Prepper` plugins to implement only `Processor`.
**Additional context**
This is a follow-on task to #311.
| Update existing Preppers to use only Processor | https://api.github.com/repos/opensearch-project/data-prepper/issues/647/comments | 0 | 2021-11-23T00:11:42Z | 2022-08-23T14:51:25Z | https://github.com/opensearch-project/data-prepper/issues/647 | 1,060,711,450 | 647 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Split from #625. We have manually assigned the constant MESSAGE_KEY="message" as the key for creating a JacksonEvent out of a raw string in 3 different sources.
**Describe the solution you'd like**
The JacksonEvent builder should expose an API to cover this case.
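For illustration, the behavior such a builder method could encapsulate. The method name and this standalone class are hypothetical; the real change would go into JacksonEvent's builder in data-prepper-api.

```java
import java.util.HashMap;
import java.util.Map;

class RawStringEventSketch {
    static final String MESSAGE_KEY = "message";

    // Models a fromMessage(...)-style builder method: wrap a raw string
    // under a single well-known key instead of repeating the constant
    // in every source plugin.
    static Map<String, Object> fromMessage(final String rawMessage) {
        final Map<String, Object> data = new HashMap<>();
        data.put(MESSAGE_KEY, rawMessage);
        return data;
    }
}
```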
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| Add API in JacksonEvent builder to support creation from raw text string | https://api.github.com/repos/opensearch-project/data-prepper/issues/646/comments | 2 | 2021-11-22T22:51:18Z | 2022-02-21T16:05:22Z | https://github.com/opensearch-project/data-prepper/issues/646 | 1,060,666,626 | 646 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The Data Prepper Docker image is only available for x86 architectures presently. It is written in Java and should be able to run on ARM.
**Describe the solution you'd like**
Provide an ARM architecture Docker image. This should be done as a Docker multiarch image so that users can follow the same instructions regardless of their architecture.
## Tasks
- [x] #3352
- [ ] #2571
- [ ] Build the ARM image in the GitHub Actions build
- [ ] Update Jenkinsfile to copy ARM image
| Docker image for ARM architectures | https://api.github.com/repos/opensearch-project/data-prepper/issues/640/comments | 10 | 2021-11-22T22:28:54Z | 2025-04-07T10:01:09Z | https://github.com/opensearch-project/data-prepper/issues/640 | 1,060,653,051 | 640 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper has several deployment scripts and examples. Some of these have updated to use OpenSearch, but some are still using OpenDistro for ElasticSearch.
**Describe the solution you'd like**
Update these to run using OpenSearch. Also include updates to the documentation for those scripts.
Or remove any obsolete scripts or examples.
**Additional context**
This is somewhat related to #638.
| Update Deployment Scripts to use OpenSearch or Remove Obsolete Scripts | https://api.github.com/repos/opensearch-project/data-prepper/issues/639/comments | 3 | 2021-11-22T22:15:26Z | 2022-03-01T20:51:06Z | https://github.com/opensearch-project/data-prepper/issues/639 | 1,060,643,690 | 639 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Much of the documentation still refers to OpenDistro for ElasticSearch.
**Describe the solution you'd like**
Update the documentation to use OpenSearch instead of OpenDistro.
| Update Documentation to use OpenSearch | https://api.github.com/repos/opensearch-project/data-prepper/issues/638/comments | 1 | 2021-11-22T22:12:27Z | 2021-11-30T23:11:45Z | https://github.com/opensearch-project/data-prepper/issues/638 | 1,060,641,535 | 638 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper supports HTTP Basic authentication to OpenSearch clusters. However, it does not support [client certificate authentication](https://opensearch.org/docs/latest/security-plugin/configuration/client-auth/). As a client, Data Prepper can authenticate using a client certificate. This can replace HTTP Basic authentication or be combined with HTTP Basic authentication.
**Describe the solution you'd like**
Provide a mechanism for client certificate authentication to OpenSearch clusters. Data Prepper pipeline authors would need to supply as input paths to both a certificate file and a private key file. Additionally, Data Prepper should allow this in conjunction with HTTP Basic authentication, for any cluster which requires both.
```
sink:
- opensearch:
authentication:
mutual_tls:
private_key: -----BEGIN PRIVATE KEY----- ...
```
**Additional context**
This was originally requested in a discussion as part of #310.
| OpenSearch Client Certificate Authentication | https://api.github.com/repos/opensearch-project/data-prepper/issues/633/comments | 1 | 2021-11-22T18:24:39Z | 2025-04-15T07:29:07Z | https://github.com/opensearch-project/data-prepper/issues/633 | 1,060,466,279 | 633 |
[
"opensearch-project",
"data-prepper"
] | Performance testing needs to be done for log analytics | Performance Testing for Log Analytics | https://api.github.com/repos/opensearch-project/data-prepper/issues/630/comments | 0 | 2021-11-19T02:51:27Z | 2021-12-17T16:07:16Z | https://github.com/opensearch-project/data-prepper/issues/630 | 1,058,070,899 | 630 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently, Data Prepper does not support a standard way to dynamically annotate Events with what happened to them in individual sources, processors, and sinks. For example, if the grok processor fails to match, the user should be able to look at an Event and tell that it was not able to match. Otherwise, it can feel like grok is simply not working at all. Another example would be a `json` processor failing to parse json. The user needs to know if the parsing failed in order to pinpoint the problems with their configuration. Additionally, many users would like to check for certain tags when routing, or drop Events with those tags to save space. Lastly, users of OpenSearch would like to query based on tags in Events. Data Prepper needs a concept for easily handling these types of situations for any sink, processor, or source.
**Describe the solution you'd like**
A key that is dedicated to adding information of this sort. For example, if grok fails to match, then the event will add a `tags` key with `[grok_match_failure]` as a value. The value of `tags` will be a `Set`, as it is not helpful to have duplicate tags.
```
{ "message": "a log", "tags": ["grok_match_failure"] }
```
Now a grok user is able to quickly tell that there was no match for this Event. If this Event then goes through a json parser and fails to parse, `json_parse_failure` can be added to the `tags` key, like this.
```
{ "message": "a log", "tags": ["grok_match_failure", "json_parse_failure"] }
```
To make this functionality consistent between plugins, the `Event` class could have a new function
```
void addTag(String tagName);
```
which would add `tagName` to the set of `tags`.
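A minimal standalone sketch of the tagging behavior described above; the real change would live on the `Event` interface rather than a separate class.

```java
import java.util.Collections;
import java.util.LinkedHashSet;
import java.util.Set;

class TaggedEventSketch {
    // A Set, because duplicate tags are not helpful.
    private final Set<String> tags = new LinkedHashSet<>();

    void addTag(final String tagName) { tags.add(tagName); }
    boolean hasTag(final String tagName) { return tags.contains(tagName); }
    Set<String> getTags() { return Collections.unmodifiableSet(tags); }
}
```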
In order to separate tags from the rest of the Event, checking for tags with conditional expressions would look like this:
```
drop:
when: 'event.hasTag("grok_parse_failure")'
```
While I don't believe it to be required for the first iteration of tagging, a processor to control the adding and removing of custom tags could exist. It would look something like this:
```
processor:
- tag_manager:
add_tags: ["tag3", "tag4"]
remove_tags: ["tag1", "tag2"]
```
This processor could also be split into two, with one called `add_tags` and one called `remove_tags`.
**Describe alternatives you've considered (Optional)**
This problem could be solved at the plugin level. So given the same scenario with grok match failure, the Event would become something like
```
{ "message": "a log", "grok_match_failure": true }
```
and then after the json parsing fails, the Event would become something like
```
{ "message": "a log", "grok_match_failure": true, "json_parse_failure": true }
```
As you can tell, this solution doesn't scale as well. You can imagine that with a large amount of sources, processors, and sinks adding their own booleans to an Event, the Event could quickly become cluttered, and the querying for tags in OpenSearch would also become more of a pain.
### Add the tags to the `EventMetadata`
Additionally, the tags could be added to the `EventMetadata` rather than the actual `Event`. This would make the Events cleaner, and the overall tagging options more configurable and extracted from the event data itself. The `EventMetadata` would contain the following:
```
Set<String> getTags();
```
This approach would allow for conditional checks on tags, but it needs a little more implementation to make the tags a part of the sink output. For example, the OpenSearch sink could have a configuration option like the following:
```
opensearch:
host: ["localhost:9200"]
save_tags: false (default would be true)
```
This would give individual sinks the ability to configure tags however they please (they could change the name of the `tags` key or remove certain tags at the sink level)
The one concern with this is that it would result in some unnecessary duplicate code, but it is entirely likely that some sinks would like to have the tags in the Event itself, and some would not. To make some options like the `save_tags` logic reusable, a plugin could be created that would handle the logic for adding the tags from the `EventMetadata` to the `Event` itself before it is shipped to the sink.
**Additional context**
Please provide alternatives to solve this problem if there are other ideas that make more sense than the tagging concept described here.
| Tagging Events in Data Prepper | https://api.github.com/repos/opensearch-project/data-prepper/issues/629/comments | 9 | 2021-11-19T02:47:19Z | 2023-06-05T21:35:26Z | https://github.com/opensearch-project/data-prepper/issues/629 | 1,058,068,879 | 629 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The current StatefulServiceMapProcessor only extracts service-map relationships among internal services whose interactions (API calls) are provisioned. Some users desire that services only invoked by non-provisioned external client calls also show up in the service-map model and be written under the `otel-v1-service-map` index in the opensearch backend, so that they can be displayed by the trace-analytics dashboard.
**Describe the solution you'd like**
We will need to:
1. Modify the model and schema:
   1. `ServiceMapRelationship` model
   2. service-map template: https://github.com/opensearch-project/data-prepper/blob/main/docs/schemas/trace-analytics/otel-v1-apm-service-map-index-template.md
2. Modify the stateful processing business logic in the service-map plugin to adapt to the new model
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
See opensearch-project/dashboards-observability#57 for a related, but distinct issue.
| Support isolated service node extraction in service-map | https://api.github.com/repos/opensearch-project/data-prepper/issues/628/comments | 8 | 2021-11-18T21:15:39Z | 2023-03-20T19:22:55Z | https://github.com/opensearch-project/data-prepper/issues/628 | 1,057,807,324 | 628 |
[
"opensearch-project",
"data-prepper"
] | The Logstash Configuration Converter does not convert from Logstash's nested syntax to Data Prepper's JsonPointer nested syntax.
For example, `[outer_key][inner_key]` should be replaced with `/outer_key/inner_key` in order to properly get to "value" in the nested json below
```
outer_key: {
inner_key: "value"
}
```
This also applies to arrays: `[array][0][key]` should become `/array/0/key` to get to "value" as shown below.
```
array: [
{ "key": "value"},
{ "not_key": "not_value" }
]
```
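A sketch of the conversion itself; the class and method names are illustrative, not the converter's actual API.

```java
import java.util.regex.Pattern;

class LogstashFieldConverter {
    private static final Pattern SEGMENT = Pattern.compile("\\[([^\\]]+)\\]");

    // Converts Logstash's bracketed field references into JSON Pointer
    // paths, e.g. [outer_key][inner_key] -> /outer_key/inner_key.
    // Array indices need no special handling: [array][0][key] -> /array/0/key.
    static String toJsonPointer(final String logstashField) {
        if (!logstashField.startsWith("[")) {
            return "/" + logstashField; // a bare top-level field reference
        }
        return SEGMENT.matcher(logstashField).replaceAll("/$1");
    }
}
```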
| Logstash Configuration Converter for Nested Syntax | https://api.github.com/repos/opensearch-project/data-prepper/issues/627/comments | 0 | 2021-11-18T18:51:28Z | 2022-03-07T15:52:07Z | https://github.com/opensearch-project/data-prepper/issues/627 | 1,057,688,812 | 627 |
[
"opensearch-project",
"data-prepper"
] | Update Data Prepper documentation at: https://github.com/opensearch-project/documentation-website | Update Data Prepper documentation on OpenSearch.org for v1.2 | https://api.github.com/repos/opensearch-project/data-prepper/issues/624/comments | 1 | 2021-11-18T16:05:58Z | 2022-02-28T16:52:24Z | https://github.com/opensearch-project/data-prepper/issues/624 | 1,057,528,945 | 624 |
[
"opensearch-project",
"data-prepper"
] | Integrate with the OpenSearch infrastructure to perform the release process. | Release Data Prepper Docker 1.2.0 using OpenSearch infrastructure | https://api.github.com/repos/opensearch-project/data-prepper/issues/623/comments | 3 | 2021-11-18T15:27:29Z | 2022-02-28T16:52:48Z | https://github.com/opensearch-project/data-prepper/issues/623 | 1,057,487,457 | 623 |
[
"opensearch-project",
"data-prepper"
Data Prepper pipelines are an important concept to understand. A tutorial which guides new users and developers through the process of creating and expanding a pipeline would help them have a good understanding.
This was identified as part of the work for #258 . | Tutorial for creating Data Prepper pipelines | https://api.github.com/repos/opensearch-project/data-prepper/issues/622/comments | 0 | 2021-11-18T14:17:06Z | 2022-04-19T22:23:08Z | https://github.com/opensearch-project/data-prepper/issues/622 | 1,057,409,341 | 622 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Remove ambiguity in pipeline configuration YAML files and Plugins by supporting a single name for Processors/Preppers.
**Describe the solution you'd like**
Remove support for the `Prepper` plugin type and remove support for the prepper name within the pipeline configuration YAML.
**Describe alternatives you've considered (Optional)**
n/a
**Additional context**
This change will be dependent on all DataPrepper provided plugins migrating to the new Processor plugin type.
**Tasks**
- [x] Remove Prepper plugin support in code (PR #1707)
- [x] Update usage of `prepper:` to `processor:` in OpenSearch Documentation website (PR: https://github.com/opensearch-project/documentation-website/pull/1056) | Remove support for deprecated Prepper plugins | https://api.github.com/repos/opensearch-project/data-prepper/issues/619/comments | 7 | 2021-11-17T17:37:06Z | 2022-09-14T20:58:51Z | https://github.com/opensearch-project/data-prepper/issues/619 | 1,056,409,300 | 619 |
[
"opensearch-project",
"data-prepper"
] | Create Data Prepper 1.2 Release Notes
All changes should be available at:
https://github.com/opensearch-project/data-prepper/milestone/1?closed=1
The documentation should also note any breaking changes, even if minor. | Create Data Prepper 1.2 Release Notes | https://api.github.com/repos/opensearch-project/data-prepper/issues/618/comments | 0 | 2021-11-17T13:52:12Z | 2022-02-28T16:52:40Z | https://github.com/opensearch-project/data-prepper/issues/618 | 1,056,156,563 | 618 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The `THIRD-PARTY` file Data Prepper currently has is difficult to maintain. Data Prepper should use automation to generate it.
**Describe the solution you'd like**
Add a Gradle task which will re-generate the `THIRD-PARTY` file. It should be run manually, and require a PR before updating the file in `main`.
The [`com.github.jk1.dependency-license-report`](https://github.com/jk1/Gradle-License-Report) Gradle plugin will generate a good format when using the `TextReportRenderer` renderer. Because Data Prepper is on Gradle 6.6, it must be the 1.x version. | Provide Gradle task to generate the THIRD-PARTY file | https://api.github.com/repos/opensearch-project/data-prepper/issues/615/comments | 0 | 2021-11-16T18:39:13Z | 2021-11-30T23:28:51Z | https://github.com/opensearch-project/data-prepper/issues/615 | 1,055,202,952 | 615 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The Service Map Stateful processor uses the deprecated string model.
**Describe the solution you'd like**
Update `service-map-stateful` to `Event` model. | Update Service Map Stateful Processor to Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/614/comments | 1 | 2021-11-16T18:21:56Z | 2022-01-24T17:55:20Z | https://github.com/opensearch-project/data-prepper/issues/614 | 1,055,186,694 | 614 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The OTel Raw Prepper uses the deprecated string model.
**Describe the solution you'd like**
Update `otel-trace-raw-prepper` to the `Event` model. | OTel Raw Prepper to Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/613/comments | 1 | 2021-11-16T18:20:45Z | 2022-01-24T17:54:45Z | https://github.com/opensearch-project/data-prepper/issues/613 | 1,055,185,444 | 613 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The Otel Group Prepper uses the deprecated string model.
**Describe the solution you'd like**
Update `otel-trace-group-prepper` to the `Event` model. | OTel Group Prepper to Event | https://api.github.com/repos/opensearch-project/data-prepper/issues/612/comments | 1 | 2021-11-16T18:19:57Z | 2022-01-24T17:54:07Z | https://github.com/opensearch-project/data-prepper/issues/612 | 1,055,184,422 | 612 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The OTel Trace Source plugin uses the string model.
**Describe the solution you'd like**
Migrate `otel-trace-source` source plugin to `Event` model. | OTel Trace Source to Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/611/comments | 1 | 2021-11-16T18:18:39Z | 2022-01-24T17:53:39Z | https://github.com/opensearch-project/data-prepper/issues/611 | 1,055,183,374 | 611 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The Peer Forwarder processor uses the string model.
**Describe the solution you'd like**
Migrate `peer-forwarder` to `Event` model.
| Peer Forwarder to Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/610/comments | 0 | 2021-11-16T18:17:20Z | 2022-03-01T16:15:11Z | https://github.com/opensearch-project/data-prepper/issues/610 | 1,055,182,213 | 610 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The Trace Analytics and Log Ingest pipeline components (source, processors, sinks) are currently incompatible. The Trace Analytics pipelines use the `String` model which is deprecated in favor of the `Event` model.
**Describe the solution you'd like**
Update all sources, processors, and sinks to use the `Event` model.
| Update to Event model for all sources, processors, and sinks | https://api.github.com/repos/opensearch-project/data-prepper/issues/609/comments | 1 | 2021-11-16T18:12:40Z | 2021-11-16T18:14:04Z | https://github.com/opensearch-project/data-prepper/issues/609 | 1,055,178,517 | 609 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Follow up on Event model change #538
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| StringPrepper to use Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/608/comments | 0 | 2021-11-16T15:25:06Z | 2021-12-16T21:44:57Z | https://github.com/opensearch-project/data-prepper/issues/608 | 1,055,010,499 | 608 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Follow up on Event model change #538 for the [NoOpPrepper](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/common/src/main/java/com/amazon/dataprepper/plugins/prepper/NoOpPrepper.java)
**Describe the solution you'd like**
This processor is generic and should be moved to using `Record<Event>`
**Additional context**
This plugin will move to `Event`s only when `Record` is fully deprecated in v2.0
| NoOpPrepper to use Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/607/comments | 2 | 2021-11-16T15:24:29Z | 2022-03-01T17:12:45Z | https://github.com/opensearch-project/data-prepper/issues/607 | 1,055,009,827 | 607 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Follow up on Event model change #538 to migrate the [FileSink](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/common/src/main/java/com/amazon/dataprepper/plugins/sink/FileSink.java).
**Describe the solution you'd like**
This sink should be updated to consume events: `... implements Sink<Record<Event>> {`
**Additional context**
This plugin will move to `Event`s only when `Record` is fully deprecated in v2.0
| FileSink to use Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/606/comments | 1 | 2021-11-16T15:23:41Z | 2021-12-16T23:10:52Z | https://github.com/opensearch-project/data-prepper/issues/606 | 1,055,008,927 | 606 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Follow up on Event model change #538
| RandomStringSource to use Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/605/comments | 0 | 2021-11-16T15:22:55Z | 2021-11-23T18:31:45Z | https://github.com/opensearch-project/data-prepper/issues/605 | 1,055,008,040 | 605 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Follow up on Event model change #538
| StdInSource to use Event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/604/comments | 0 | 2021-11-16T15:21:49Z | 2021-11-23T18:31:45Z | https://github.com/opensearch-project/data-prepper/issues/604 | 1,055,006,734 | 604 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The File source can be useful for users who want to run Data Prepper directly on an application host. Using the file source in a production environment as it is now would not work, however, as it only supports reading a single file completely from start to finish.
**Describe the solution you'd like**
The File Source should have support for tailing a file as new data comes into it. This can be done by keeping a file as a mini internal database of currently tracked files. This would allow for Data Prepper to be restarted and still be able to keep track of what has already been read from a file. There are a couple of additional features that are required for using the file source in production, such as rotation/deletion of large files and support for multiple file sources at once. However, this issue should be kept solely for adding tail support to the file.
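A minimal sketch of offset-tracked tailing, assuming a sidecar offset file as the mini internal database; the class name, file layout, and behavior here are illustrative, not Data Prepper's actual implementation:

```java
import java.io.IOException;
import java.io.RandomAccessFile;
import java.nio.charset.StandardCharsets;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Toy tailer: remembers the byte offset it has read up to and persists it to a
// sidecar file, so a restarted process can resume where it left off. For
// simplicity this sketch treats a final unterminated line as complete.
class FileTailer {
    private final Path file;
    private final Path offsetFile; // the "mini internal database" of read position

    FileTailer(final Path file) {
        this.file = file;
        this.offsetFile = file.resolveSibling(file.getFileName() + ".offset");
    }

    private long loadOffset() throws IOException {
        return Files.exists(offsetFile)
                ? Long.parseLong(Files.readString(offsetFile).trim())
                : 0L;
    }

    // Reads any lines appended since the last call, then advances the saved offset.
    List<String> readNewLines() throws IOException {
        final List<String> lines = new ArrayList<>();
        long offset = loadOffset();
        try (RandomAccessFile raf = new RandomAccessFile(file.toFile(), "r")) {
            if (offset > raf.length()) {
                offset = 0; // the file was truncated or rotated; start over
            }
            raf.seek(offset);
            String line;
            while ((line = raf.readLine()) != null) {
                // readLine() zero-extends bytes to chars; re-decode as UTF-8
                lines.add(new String(line.getBytes(StandardCharsets.ISO_8859_1), StandardCharsets.UTF_8));
            }
            Files.writeString(offsetFile, Long.toString(raf.getFilePointer()));
        }
        return lines;
    }
}
```

A production implementation would additionally need to avoid consuming partial lines and to detect rotation more robustly (for example, via inode or file-key tracking).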
| Add tailing support to the File Source | https://api.github.com/repos/opensearch-project/data-prepper/issues/602/comments | 0 | 2021-11-16T03:47:14Z | 2022-04-19T19:36:37Z | https://github.com/opensearch-project/data-prepper/issues/602 | 1,054,408,943 | 602 |
[
"opensearch-project",
"data-prepper"
] | The file source is a good way to get started and test the grok prepper. The recent Event model change #538 broke the connection between the file source and grok, as the file source still uses a `Record<String>` type while grok was updated to `Record<Event>`.
Additionally, the file source is useful for those who just want to run data prepper on an application host, and this issue will also involve refactoring the file source and start to add some improvements to it. | File Source Improvements and update to Event Model | https://api.github.com/repos/opensearch-project/data-prepper/issues/600/comments | 0 | 2021-11-16T03:15:26Z | 2021-11-18T17:49:37Z | https://github.com/opensearch-project/data-prepper/issues/600 | 1,054,394,506 | 600 |
[
"opensearch-project",
"data-prepper"
] | Data Prepper should provide a processor which removes entire events. This processor need not take any special input, but does require conditional support from #522. (Otherwise every event would be dropped).
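For illustration, a pipeline using such a processor might look like the following; the processor name `drop_events` and the `when` syntax are hypothetical, pending #522:

```yaml
prepper:
  - drop_events:
      when: "/http/response/status_code >= 500"
```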
**Outstanding Questions**
What name should this processor have? | Processor to filter out (remove/drop) entire events | https://api.github.com/repos/opensearch-project/data-prepper/issues/598/comments | 2 | 2021-11-15T23:42:07Z | 2022-01-20T23:19:10Z | https://github.com/opensearch-project/data-prepper/issues/598 | 1,054,262,585 | 598 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The OpenSearch plugin's Gradle build applies the `opensearch.build` Gradle plugin.
```
apply plugin: 'opensearch.build'
```
This plugin is applying various OpenSearch constraints and requirements to this build. As a result, this build file is very different from those of the other Data Prepper plugins. Some of the implications:
* The whole project cannot update the Gradle version
* The opensearch project manually forces all versions rather than inherit versions from Data Prepper
* Dependabot updates to this project are not always straightforward
* It appears to cause issues with using JUnit 5
**Describe the solution you'd like**
**Solution 1**
Ideally, Data Prepper would not have any sub-projects which apply the `opensearch.build` Gradle plugin. The two plugins that the opensearch plugin really needs are:
```
apply plugin: 'opensearch.testclusters'
apply plugin: 'opensearch.rest-test'
```
However, `opensearch.rest-test` requires `opensearch.build`. There is a standalone rest-test plugin, but it attempts to create the `test` target itself and thus doesn't work well with the Java plugin.
I've tried to use only the `opensearch.testclusters` plugin, but the integration tests fail with some Lucene-related tests.
**Solution 2**
An alternative may be to split the opensearch plugin into two projects:
* `opensearch`
* `opensearch-integration-test`
The `opensearch-integration-test` project could still use the `opensearch.build` plugin. But, the `opensearch` plugin would be a normal Data Prepper plugin. This can at least keep runtime dependencies in sync. And unit tests could use JUnit 5. But, the Data Prepper build would still be forced to Gradle 6.
**Solution 3**
The approach to testing the OpenSearch sink can change. Only one test requires the OpenSearch test framework.
https://github.com/opensearch-project/data-prepper/blob/cbf1082c88acab85f6d7dbac71c6cd6f5932a8d0/data-prepper-plugins/opensearch/src/test/java/com/amazon/dataprepper/plugins/sink/opensearch/OpenSearchSinkIT.java#L71
Data Prepper could run OpenSearch in a Docker container similar to how the end-to-end tests work.
| Clean up OpenSearch plugin build | https://api.github.com/repos/opensearch-project/data-prepper/issues/593/comments | 2 | 2021-11-15T16:43:15Z | 2022-04-29T15:59:04Z | https://github.com/opensearch-project/data-prepper/issues/593 | 1,053,875,733 | 593 |
[
"opensearch-project",
"data-prepper"
] | The stdout sink is a good way to get started and test the grok prepper. The recent
Event model change #538 broke the connection between the stdout sink and grok, as it still uses a `Record<String>` type while grok was updated to a `Record<Event>` | StdOut Sink to use Event Model | https://api.github.com/repos/opensearch-project/data-prepper/issues/585/comments | 0 | 2021-11-12T23:28:55Z | 2021-11-18T16:31:07Z | https://github.com/opensearch-project/data-prepper/issues/585 | 1,052,476,923 | 585 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The current build cannot run Data Prepper's automated tests on Java 15.
**Describe the solution you'd like**
Data Prepper tests to support running with Java 15
**Describe alternatives you've considered (Optional)**
N/A
**Additional context**
The current build will throw "Unsupported class file major version 59" when running the gradle test task when using Java 15 | Support running automated tests with Java 15 | https://api.github.com/repos/opensearch-project/data-prepper/issues/576/comments | 3 | 2021-11-12T17:44:37Z | 2022-06-08T19:05:23Z | https://github.com/opensearch-project/data-prepper/issues/576 | 1,052,221,789 | 576 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
With this bulkRequestSizeBytes metric, we would have insight into how large each Sink batch is and could adjust the `bulk_size` parameter if needed.
**Describe the solution you'd like**
Having the OpenSearch Sink emit a metric, bulkRequestSizeBytes, right before sending the batch request to OpenSearch.
**Describe alternatives you've considered (Optional)**
NA
**Additional context**
NA | Support Sink metrics: bulkRequestSizeBytes | https://api.github.com/repos/opensearch-project/data-prepper/issues/571/comments | 1 | 2021-11-12T15:11:00Z | 2021-11-18T18:29:14Z | https://github.com/opensearch-project/data-prepper/issues/571 | 1,052,067,194 | 571 |
[
"opensearch-project",
"data-prepper"
] | The Log Analytics Getting Started Guide will be made similar to the [Trace Analytics Guide](https://github.com/opensearch-project/data-prepper/blob/main/docs/trace_analytics.md)
It will focus on log ingestion support being added for Data Prepper 1.2, but will call out future plans for expanding log analytics with Data Prepper through the creation of additional sources, preppers, and sinks.
The focus will be on describing a log ingestion flow using
* FluentBit
* http source
* grok prepper
* opensearch sink
* OpenSearch
It will also link to a guide for setting up a demo of this log ingestion flow by running FluentBit, Data Prepper, and OpenSearch through Docker. | Log Analytics Getting Started Guide | https://api.github.com/repos/opensearch-project/data-prepper/issues/550/comments | 0 | 2021-11-05T21:02:01Z | 2021-11-12T23:24:49Z | https://github.com/opensearch-project/data-prepper/issues/550 | 1,046,241,803 | 550 |
[
"opensearch-project",
"data-prepper"
] | As part of the migration to the new model, `Record`s should be replaced with `Event`s. All interfaces, abstract classes, and plugins should be updated to support only `Event`s. `Record` and `RecordMetadata` should be removed from the code base.
This will be a breaking change and should be introduced in the next major version 2.0.
Related to new Model Proposal: #319 | Deprecate Records | https://api.github.com/repos/opensearch-project/data-prepper/issues/547/comments | 1 | 2021-11-05T15:22:20Z | 2022-02-21T20:13:54Z | https://github.com/opensearch-project/data-prepper/issues/547 | 1,045,968,622 | 547 |
[
"opensearch-project",
"data-prepper"
] | Migrate the existing trace plugins to the event model. Trace analytics plugins should pass either a `Record<Span>` (or a `Record<Event>` in the case of the service mapping plugin). This will eliminate excessive de/serialization between plugins.
Plugins to be updated:
- Peer Forwarder
- otel source
- otel srv group prepper
- otel raw
- service map
- opensearch sink
related to #319
**GitHub Issues**
* #600
* #604
* #605
* #606
* #607
* #608
* #610
* #611
* #612
* #613
* #614
* #758 | Migrate Trace Analytics Plugins to Event Model | https://api.github.com/repos/opensearch-project/data-prepper/issues/546/comments | 1 | 2021-11-05T15:15:46Z | 2022-03-15T14:44:38Z | https://github.com/opensearch-project/data-prepper/issues/546 | 1,045,962,331 | 546 |
[
"opensearch-project",
"data-prepper"
] | Update HTTP Source and Grok Prepper to leverage the new Event model for log ingestion. | Log Ingestion Plugin Event Model Integration | https://api.github.com/repos/opensearch-project/data-prepper/issues/538/comments | 1 | 2021-11-04T20:59:19Z | 2021-11-09T20:29:01Z | https://github.com/opensearch-project/data-prepper/issues/538 | 1,045,198,406 | 538 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
I have an application that sends traces to an OpenTelemetry collector. Traces are collected by the Java instrumentation library https://github.com/open-telemetry/opentelemetry-java-instrumentation, and the collector then sends this data to data-prepper.
It also sends the same data to Jaeger system.
For all these traces I see the errors below in the data-prepper service:
```
{"timeMillis":1635957998134,"thread":"raw-pipeline-prepper-worker-5-thread-1","level":"WARN","loggerName":"com.amazon.dataprepper.plugins.prepper.oteltrace.OTelTraceRawPrepper","message":"Missing trace group for SpanId: 12c47c820fb56ca3","endOfBatch":false,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":26,"threadPriority":5}
{"timeMillis":1635957998134,"thread":"raw-pipeline-prepper-worker-5-thread-1","level":"WARN","loggerName":"com.amazon.dataprepper.plugins.prepper.oteltrace.OTelTraceRawPrepper","message":"Missing trace group for SpanId: 6165987d239ae041","endOfBatch":false,"loggerFqcn":"org.apache.logging.slf4j.Log4jLogger","threadId":26,"threadPriority":5}
```
At the same time, these traces work as expected in Jaeger.
Here is my data-prepper pipeline:
```
entry-pipeline:
delay: "100"
source:
otel_trace_source:
ssl: false
prepper:
- peer_forwarder:
discovery_mode: "dns"
domain_name: "data-prepper-headless"
ssl: false
buffer:
bounded_blocking:
buffer_size: 1024 # max number of records the buffer accepts
batch_size: 256 # max number of records the buffer drains after each read
sink:
- pipeline:
name: "raw-pipeline"
- pipeline:
name: "service-map-pipeline"
raw-pipeline:
source:
pipeline:
name: "entry-pipeline"
prepper:
- otel_trace_raw_prepper:
sink:
- opensearch:
hosts: [ "https://logging-cluster-alfa-os.es.svc.cluster.local:9200" ]
insecure: true
username: "data-prepper"
password: "xxxxxxxxxxxxx"
trace_analytics_raw: true
service-map-pipeline:
delay: "100"
source:
pipeline:
name: "entry-pipeline"
prepper:
- service_map_stateful:
sink:
- opensearch:
hosts: ["https://logging-cluster-alfa-os.es.svc.cluster.local:9200"]
insecure: true
username: "data-prepper"
password: "xxxxxxxxxxxxx"
trace_analytics_service_map: true
```
and OTEL collector pipeline:
````
config:
extensions:
health_check: {}
processors:
batch/traces:
send_batch_size: 50
timeout: 1s
memory_limiter: null
receivers:
otlp:
protocols:
grpc:
http:
jaeger:
protocols:
grpc:
endpoint: 0.0.0.0:14250
thrift_http:
endpoint: 0.0.0.0:14268
zipkin:
endpoint: 0.0.0.0:9411
exporters:
otlp/data-prepper:
endpoint: data-prepper.es.svc.cluster.local:21890
tls:
insecure: true
service:
extensions:
- health_check
pipelines:
traces:
receivers: [otlp,jaeger,zipkin]
processors: [batch/traces]
exporters: [otlp/data-prepper]
telemetry:
logs:
level: "warn"
````
| Missing trace group for SpanId [BUG] | https://api.github.com/repos/opensearch-project/data-prepper/issues/534/comments | 9 | 2021-11-03T17:32:14Z | 2024-08-12T18:16:17Z | https://github.com/opensearch-project/data-prepper/issues/534 | 1,043,880,382 | 534 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Some plugins are using Java 11 APIs, though Data Prepper should only require Java 8.
There may be others, but I have found: `Path.of` and `Files.readString`.
Performing a find usages in an IDE should reveal the problem locations.
Usages in the test code are acceptable since the build requires Java 14 anyway. | [BUG] Some plugins use Java 11 APIs | https://api.github.com/repos/opensearch-project/data-prepper/issues/523/comments | 2 | 2021-11-02T23:00:53Z | 2021-11-12T20:46:12Z | https://github.com/opensearch-project/data-prepper/issues/523 | 1,042,922,371 | 523 |
[
"opensearch-project",
"data-prepper"
] | Data Prepper currently does not provide any control logic within pipelines. This proposal is to add a basic conditional system for processing data in Preppers only when certain conditions are met.
This proposal is to add a new `when` field which is available for all Processors.
When a Prepper contains the `when` field, the condition expressed in the value must be true for the Prepper to process a given event. If the field is not provided, then all events are processed.
The conditional expression should support:
* Equality operators: `==`, `>`, `<`, `>=`, `<=`
* Boolean operators: `and`, `or`, and `not`
* Set operators: `in`, and `not in`
* Regex operators: `=~` and `!~` which check a pattern on the right against the string on the left
* Fields will be accessed using JsonPointer as defined in #450
* Sets defined by `[]` and comma delimited
Thus, an example might be:
```
preppers:
...
- grok:
when: "/http/response/status_code and /http/response/status_code >= 400 and /http/response/status_code < 500"
match: "..."
- grok:
when: "/http/response/status_code in [500, 501]"
match: "..."
```
### Implementation
The `AbstractPrepper` class can support the `when` behavior so that individual Prepper implementations do not need to handle the `when` field.
This will require that `AbstractPrepper` receive `Event` types and not any type. This is ongoing work in #319. Making this change in `AbstractPrepper` is a breaking change though since it does not require the `Event` type currently.
The `AbstractPrepper` will only call `doExecute` for records which meet the conditional expression provided by `when`.
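As an illustration of this gating behavior (not the real grammar implementation, which would use a generated parser such as ANTLR), a toy evaluator for a single numeric comparison might look like:

```java
import java.util.Map;

// Toy evaluator for one comparison from the proposed grammar, e.g.
// "/status_code >= 400". It only handles a single-segment JsonPointer and a
// numeric right-hand side; a missing field makes the condition false, which
// is how AbstractPrepper could decide whether to call doExecute.
class WhenCondition {
    static boolean evaluate(final String expression, final Map<String, Object> event) {
        final String[] parts = expression.trim().split("\\s+");
        final Object raw = event.get(parts[0].substring(1)); // strip the leading '/'
        if (!(raw instanceof Number)) {
            return false; // field absent or non-numeric: do not process the event
        }
        final double left = ((Number) raw).doubleValue();
        final double right = Double.parseDouble(parts[2]);
        switch (parts[1]) {
            case "==": return left == right;
            case ">":  return left > right;
            case "<":  return left < right;
            case ">=": return left >= right;
            case "<=": return left <= right;
            default: throw new IllegalArgumentException("Unsupported operator: " + parts[1]);
        }
    }
}
```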
# Tasks
- [ ] Create ANTLR Parser
- [ ] Create statement evaluator
- [ ] Add when property to Abstract Processor (scope pending)
- [ ] Create logstash config converter
- [ ] Finalize scope for 1.3
| [RFC] Basic Conditional Logic in Processors | https://api.github.com/repos/opensearch-project/data-prepper/issues/522/comments | 11 | 2021-11-02T22:49:49Z | 2023-02-07T16:46:43Z | https://github.com/opensearch-project/data-prepper/issues/522 | 1,042,917,206 | 522 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The existing OTel Trace Source and the new HTTP Source both open up ports when users activate them in their pipelines. Because these are open ports, Data Prepper should encourage users to secure them with HTTP Basic Authentication and with TLS enabled.
**Describe the solution you'd like**
Data Prepper can nudge users to secure configuration by two mechanisms:
1. Update the documentation to encourage configuring both authentication and TLS
2. Output a warning if either of these is disabled. Include a link to the documentation.
| Encourage secure configuration of external sources | https://api.github.com/repos/opensearch-project/data-prepper/issues/521/comments | 1 | 2021-11-02T22:23:00Z | 2021-11-16T18:58:30Z | https://github.com/opensearch-project/data-prepper/issues/521 | 1,042,903,406 | 521 |
[
"opensearch-project",
"data-prepper"
] | ## What kind of business use case are you trying to solve? What are your requirements?
Dependency Injection (DI) is the technique by which objects receive their dependencies. Leveraging this technique and moving instantiation of dependencies outside the scope of an object gives us the following benefits:
* Objects align with the Single Responsibility Principle when they are not responsible for creating their dependencies
* Coupling this technique with interfaces allows an object to support Dependency Inversion and easily swap implementation of dependencies without impacting the object’s implementation.
* Easier unit test mocking. Dependencies can be mocked and injected.
### What is the problem? What is preventing you from meeting the requirements?
Data Prepper (DP) does not utilize any Dependency Injection(DI) frameworks at this moment nor are we leveraging an existing framework which supports DI. Without a DI framework there are numerous code smells in DP. Currently, DP couples construction of dependent objects with implementation of other objects. DP also has cyclical dependencies and leverages static methods and instances to solve its own singleton management.
Other Requirements
* Solution should be extendable to plugin framework and support runtime integration for DP plugins.
* Supports new [Directory Structure Proposal](https://github.com/opensearch-project/data-prepper/issues/305) and proposed plugin architecture
### What are you proposing? What do you suggest we do to solve the problem or improve the existing situation?
We will adopt dependency injection framework to eliminate the current code smells in DP and help drive DP towards a more modular architecture.
### What are remaining open questions?
- What DI framework should we adopt? There are three well-known frameworks: Spring, Guice, and Dagger.
| [RFC] Introducing Dependency Injection | https://api.github.com/repos/opensearch-project/data-prepper/issues/519/comments | 3 | 2021-11-02T21:24:31Z | 2022-01-14T21:37:28Z | https://github.com/opensearch-project/data-prepper/issues/519 | 1,042,854,803 | 519 |
[
"opensearch-project",
"data-prepper"
] | Provide a prepper which can extract dates from fields in events. | Prepper for extracting dates from fields | https://api.github.com/repos/opensearch-project/data-prepper/issues/509/comments | 6 | 2021-11-01T15:24:22Z | 2022-02-28T17:03:16Z | https://github.com/opensearch-project/data-prepper/issues/509 | 1,041,297,085 | 509 |
[
"opensearch-project",
"data-prepper"
] | Data Prepper should provide a Prepper or Preppers to support the following operations:
* Renaming fields
* Adding fields
* Deleting fields
* Copying fields
There should be one processor per action, e.g. a CopyProcessor, RenameProcessor, etc.
There should be one configuration, `skip_if_present`, that is defaulted to `false`. When set to `true`, the rename, add, and copy actions will not do any destructive behavior on existing data
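A toy sketch of these actions operating on an event's underlying map; the class and method names are illustrative, and `skipIfPresent` mirrors the proposed `skip_if_present` configuration:

```java
import java.util.Map;

// Illustrative versions of the four proposed field actions. In the real
// processors these would operate on the Event model rather than a raw Map.
class FieldMutations {
    static void copy(final Map<String, Object> event, final String from, final String to,
                     final boolean skipIfPresent) {
        if (!event.containsKey(from)) return;
        if (skipIfPresent && event.containsKey(to)) return; // non-destructive mode
        event.put(to, event.get(from));
    }

    static void rename(final Map<String, Object> event, final String from, final String to,
                       final boolean skipIfPresent) {
        if (!event.containsKey(from)) return;
        if (skipIfPresent && event.containsKey(to)) return;
        event.put(to, event.remove(from));
    }

    static void add(final Map<String, Object> event, final String key, final Object value,
                    final boolean skipIfPresent) {
        if (skipIfPresent && event.containsKey(key)) return;
        event.put(key, value);
    }

    static void delete(final Map<String, Object> event, final String key) {
        event.remove(key);
    }
}
```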
| Mutate, alter, and delete fields from Events | https://api.github.com/repos/opensearch-project/data-prepper/issues/508/comments | 9 | 2021-11-01T15:24:18Z | 2022-02-24T18:04:18Z | https://github.com/opensearch-project/data-prepper/issues/508 | 1,041,297,007 | 508 |
[
"opensearch-project",
"data-prepper"
] | Data Prepper should have a processor which can parse key-value strings from one field and save the results into another field as a map.
For example, on the input event:
```
{
"query": "key1:value1&key2:value2&key3:value3"
}
```
Could update the event to include a field:
```
{
"query_params" : {
"key1" : "value1",
"key2" : "value2",
"key3" : "value3"
}
}
```
This processor must allow configuration of:
* The source field (`query` in the example above)
* The destination field (`query_params` in the example above)
* The delimiter between fields (`&` in the example above)
* The delimiter between key and value (`:` in the example above)
* non_match value - default `null`, the value to assign a key when it has no match
* prefix - A prefix to all keys. default `""`.
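A minimal sketch of the core parsing logic under these configuration options; the class and parameter names are illustrative, not a final API:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Splits a source string into key-value pairs using configurable delimiters,
// applying a key prefix and a fallback value for keys without a match.
class KeyValueParser {
    private final String fieldDelimiterRegex;
    private final String keyValueDelimiterRegex;
    private final String prefix;
    private final Object nonMatchValue;

    KeyValueParser(final String fieldDelimiterRegex, final String keyValueDelimiterRegex,
                   final String prefix, final Object nonMatchValue) {
        this.fieldDelimiterRegex = fieldDelimiterRegex;
        this.keyValueDelimiterRegex = keyValueDelimiterRegex;
        this.prefix = prefix;
        this.nonMatchValue = nonMatchValue;
    }

    Map<String, Object> parse(final String source) {
        final Map<String, Object> result = new LinkedHashMap<>();
        for (final String pair : source.split(fieldDelimiterRegex)) {
            // Split into at most two parts so values may contain the delimiter.
            final String[] parts = pair.split(keyValueDelimiterRegex, 2);
            final Object value = parts.length == 2 ? parts[1] : nonMatchValue;
            result.put(prefix + parts[0], value);
        }
        return result;
    }
}
```

Because `String.split` takes a regular expression, both delimiter options in this sketch already accept regex patterns.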
It should also support regex expressions for both types of delimiters. | Support parsing messages with key-value strings such as queries and properties | https://api.github.com/repos/opensearch-project/data-prepper/issues/507/comments | 10 | 2021-11-01T15:24:07Z | 2022-01-25T16:18:52Z | https://github.com/opensearch-project/data-prepper/issues/507 | 1,041,296,840 | 507 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The documentation for plugins should be consistent and easy to use for users of Data Prepper.
Documenting plugins should be easy for plugin developers.
**Describe the solution you'd like**
This proposal builds on the proposal from #469.
In #469, I proposed that Data Prepper support POJO models for plugin configurations. Another extension to this approach would be the ability to support dynamically generated documentation for Data Prepper from these models.
Here is an example which builds upon the example I used in #469:
```
class HttpSourceConfig {
@Min(0)
@Max(65535)
@DataPrepperPluginDocumentation(description = "The port on which the HTTP server runs")
private int port = 2021;
@Min(0)
@DataPrepperPluginDocumentation(description = "The request timeout for the HTTP server")
private int requestTimeout = 10_000;
...
}
```
Data Prepper could determine the following dynamically:
* The documentation text
* The requirements stated using JSR 303 validation. This would require an understanding of specific validation annotations though.
* The data type
I also believe it may be possible to also determine:
* The default value (would require instantiating an object, so it might not be ideal)
* The property name. I would really only want to do this if Data Prepper can get Jackson to provide an authoritative answer.
Depending upon the difficulty of the latter two, the annotation could look like the following instead:
```
@DataPrepperPluginDocumentation(key = "request_timeout",
description = "The request timeout for the HTTP server",
defaultValue = "10,000")
```
With the following information, Data Prepper could then support two approaches to documentation:
1. As part of the build process, auto-generate HTML-based documentation
2. Support a command-line help using this documentation.
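A rough sketch of how the command-line help could be generated by reflection; the annotation and config class here are local stand-ins for the proposed types, not existing Data Prepper APIs:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;

// Local stand-in for the proposed documentation annotation.
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.FIELD)
@interface DataPrepperPluginDocumentation {
    String description();
}

// Local stand-in for a plugin configuration class.
class HttpSourceConfig {
    @DataPrepperPluginDocumentation(description = "The port on which the HTTP server runs")
    private int port = 2021;

    @DataPrepperPluginDocumentation(description = "The request timeout for the HTTP server")
    private int requestTimeout = 10_000;
}

// Builds a help string by reading the annotation and data type from each field.
class PluginHelp {
    static String describe(final Class<?> configClass) {
        final StringBuilder help = new StringBuilder();
        for (final Field field : configClass.getDeclaredFields()) {
            final DataPrepperPluginDocumentation doc =
                    field.getAnnotation(DataPrepperPluginDocumentation.class);
            if (doc == null) continue;
            help.append(field.getName())
                .append(" (").append(field.getType().getSimpleName()).append("): ")
                .append(doc.description()).append('\n');
        }
        return help.toString();
    }
}
```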
| [Proposal] Automatic Plugin Documentation | https://api.github.com/repos/opensearch-project/data-prepper/issues/475/comments | 0 | 2021-10-26T18:12:01Z | 2022-04-19T19:36:13Z | https://github.com/opensearch-project/data-prepper/issues/475 | 1,036,599,363 | 475 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
This integration test is failing randomly.
**To Reproduce**
Submit a PR
**Expected behavior**
The integration test should be deterministic in its passing or failing.
**Additional context**
It runs as part of our integration test suite. Here is a run that failed: https://github.com/opensearch-project/data-prepper/actions/runs/1386037215 the only thing that changed was a readme file.
| [BUG] Flaky Tests on Data Prepper Trace Analytics Raw Span End-to-end test with Gradle | https://api.github.com/repos/opensearch-project/data-prepper/issues/474/comments | 3 | 2021-10-26T15:50:30Z | 2023-11-04T16:53:11Z | https://github.com/opensearch-project/data-prepper/issues/474 | 1,036,473,346 | 474 |
[
"opensearch-project",
"data-prepper"
] | Implement trace models interfaces.
As part of #436 | Implement Trace Models | https://api.github.com/repos/opensearch-project/data-prepper/issues/472/comments | 0 | 2021-10-26T14:46:38Z | 2021-11-02T16:23:04Z | https://github.com/opensearch-project/data-prepper/issues/472 | 1,036,393,733 | 472 |
[
"opensearch-project",
"data-prepper"
] | The data prepper server itself needs support for TLS/SSL by default.
This includes creating a default `data-prepper-config.yaml` and `keystore.p12` that can be used in the default `data-prepper-config.yaml`. Both of these will be passed to the Data Prepper docker image, and the default `data-prepper-config.yaml` will be used if no argument for a `data-prepper-config.yaml` is provided to `docker run`.
Documentation for how someone can create their own keystore and have a secure data prepper endpoint will be provided. | Secure Data Prepper server with TLS/SSL by default | https://api.github.com/repos/opensearch-project/data-prepper/issues/471/comments | 0 | 2021-10-25T23:06:12Z | 2021-10-29T21:59:18Z | https://github.com/opensearch-project/data-prepper/issues/471 | 1,035,649,495 | 471 |
[
"opensearch-project",
"data-prepper"
] | # Is your feature request related to a problem? Please describe.
There are a few problems with our plugin code:
* Each plugin has to translate between `PluginSetting` and an internal model. This is mostly tedious work.
* The `PluginSetting` class is somewhat overloaded. This is likely because it is the one class which can be injected into new plugin instances.
* Plugin validation is manual.
* All plugins must take a `PluginSetting` even if it is not used (this is not too big of a deal since it only comes up for smaller plugins).
# Describe the solution you'd like
At a high level, I'd like for plugins to be able to define POJO classes which represent their settings models.
This could be done by adding a new field to the `@DataPrepperPlugin` annotation:
```
Class<?> pluginConfigurationType() default PluginSetting.class;
```
Plugins can still get a `PluginSetting` class (that is the default value). But, they could also have related POJOs which are set in there. Then Data Prepper Core performs a Jackson conversion.
(I have already implemented this behavior and it is working well for simple cases)
Building on top of this design, I'd also like to make a few other changes:
### Split up PluginSetting
The `PluginSetting` class has two methods which can be extracted: `getName()` and `getPipelineName()`. I believe these are really not so much settings as they are metadata which is needed later.
I propose a new interface:
```
public interface PluginDescription {
String getName();
String getPipelineName();
}
```
The existing `PluginSetting` can implement this since it already provides these methods.
I'd like to find code that uses only these two methods and change that code to take in a `PluginDescription` instead. Here is one example:
```
public static PluginMetrics fromPluginSetting(final PluginSetting pluginSetting)
```
It only needs the name and pipelineName. So the signature could become:
```
@Deprecated
public static PluginMetrics fromPluginSetting(final PluginSetting pluginSetting) {
...
}
public static PluginMetrics fromPluginDescription(final PluginDescription pluginDescription) {
...
}
```
I think it would be good to also revisit the `getProcessWorkers()` method. This one makes a little more sense since it is defined by the pipeline configuration. But, it still isn't technically a plugin setting.
### Support Alternate Constructors
Right now plugins must have a constructor which takes only a `PluginSetting` object.
I propose that Data Prepper have a new annotation to indicate which constructor to use. This could be `@DataPrepperPluginConstructor`. I also think it is valuable to consider using JSR-330's `@Inject` annotation rather than creating our own. But, even with our own annotation, it is very simple.
The approach for choosing a constructor would then become:
1. Look for the annotated constructor. Use this if found
2. Look for a constructor which takes `PluginSetting`. Use this if found. This is the existing behavior, so Data Prepper should continue to handle it. I think it should be deprecated behavior in v2.0.
3. Choose the default constructor if none of the others are available. Use it.
For the annotated constructor, Data Prepper should be able to populate any of the following types:
* `PluginDescription`
* `PluginSetting`
* The class defined by `DataPrepperPlugin::pluginConfigurationType`
I would also like to consider supporting passing in the following classes into the constructor:
* `PluginMetrics`
If Data Prepper supports dependency injection, Data Prepper could also pass in other beans which are defined.
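A sketch of this selection order using reflection; the annotation and `PluginSetting` below are local stand-ins for illustration, not the real Data Prepper types:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Constructor;

// Local stand-ins for the proposed annotation and the existing PluginSetting.
@Retention(RetentionPolicy.RUNTIME)
@interface DataPrepperPluginConstructor {}

class PluginSetting {}

// Resolves a plugin's constructor in the proposed order of preference.
class ConstructorResolver {
    static Constructor<?> resolve(final Class<?> pluginClass) throws NoSuchMethodException {
        // 1. Prefer an explicitly annotated constructor.
        for (final Constructor<?> constructor : pluginClass.getDeclaredConstructors()) {
            if (constructor.isAnnotationPresent(DataPrepperPluginConstructor.class)) {
                return constructor;
            }
        }
        // 2. Fall back to the legacy PluginSetting constructor (to be deprecated in 2.0).
        try {
            return pluginClass.getDeclaredConstructor(PluginSetting.class);
        } catch (final NoSuchMethodException ignored) {
        }
        // 3. Otherwise use the default constructor.
        return pluginClass.getDeclaredConstructor();
    }
}

// Illustrative plugins covering the three cases:
class AnnotatedPlugin {
    @DataPrepperPluginConstructor
    AnnotatedPlugin(final PluginSetting setting) {}
    AnnotatedPlugin() {}
}
class LegacyPlugin {
    LegacyPlugin(final PluginSetting setting) {}
}
class DefaultPlugin {}
```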
### Settings Validation
I'd like to add JSR 303 Bean Validation to these model classes.
Plugin authors could make their POJOs look like the following:
```
class HttpSourceConfig {
    @Min(0)
    @Max(65535)
    private int port = 2021;

    @Min(0)
    private int requestTimeout = 10_000;

    ...
}
```
This isn't strictly necessary since constructors could still run validation manually.
Data Prepper could use the Hibernate Validator project which is Apache 2.0 licensed and is the reference implementation for JSR 303.
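The effect of annotation-driven validation can be illustrated with a small stdlib-only sketch. The `@Min`/`@Max` annotations below are hypothetical stand-ins for the real JSR 303 annotations, and the validator is far simpler than Hibernate Validator would be:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Field;
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-ins for the JSR 303 @Min/@Max annotations.
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD) @interface Min { long value(); }
@Retention(RetentionPolicy.RUNTIME) @Target(ElementType.FIELD) @interface Max { long value(); }

public class MiniValidator {
    static List<String> validate(final Object config) {
        final List<String> violations = new ArrayList<>();
        for (final Field field : config.getClass().getDeclaredFields()) {
            field.setAccessible(true);
            final long value;
            try {
                value = ((Number) field.get(config)).longValue();
            } catch (final IllegalAccessException | ClassCastException e) {
                continue; // only numeric fields are checked in this sketch
            }
            final Min min = field.getAnnotation(Min.class);
            if (min != null && value < min.value()) {
                violations.add(field.getName() + " must be >= " + min.value());
            }
            final Max max = field.getAnnotation(Max.class);
            if (max != null && value > max.value()) {
                violations.add(field.getName() + " must be <= " + max.value());
            }
        }
        return violations;
    }

    static class HttpSourceConfig {
        @Min(0) @Max(65535)
        private int port = 2021;
        @Min(0)
        private int requestTimeout = -5; // invalid on purpose
    }

    public static void main(final String[] args) {
        System.out.println(validate(new HttpSourceConfig())); // [requestTimeout must be >= 0]
    }
}
```

In Data Prepper itself, the plugin framework would run this validation once, after deserializing the configuration and before invoking the plugin constructor.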
# Additional context
This largely came up because now I want to inject plugins into plugins. I could add more fields to `PluginSetting`, but this would just continue Data Prepper down this path of stuffing fields onto `PluginSettings`.
# Tasks
* [x] Support a custom plugin configuration type
* [x] Support for `@DataPrepperPluginConstructor`
* [x] JSR-303 bean validation of plugin configuration objects | Improved Plugin Configurations | https://api.github.com/repos/opensearch-project/data-prepper/issues/469/comments | 3 | 2021-10-25T21:32:28Z | 2022-01-24T15:58:42Z | https://github.com/opensearch-project/data-prepper/issues/469 | 1,035,593,700 | 469 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
We want to support the following Logstash plugins:
* `http`
* `grok`
* `elasticsearch`
* `amazon_es`
Are there any others?
**Describe the solution you'd like**
Create mapping files for each of these. They should be created to match the format defined in #466 .
| Support Logstash Configuration mapping for key plugins | https://api.github.com/repos/opensearch-project/data-prepper/issues/467/comments | 0 | 2021-10-22T17:35:48Z | 2021-11-11T18:31:36Z | https://github.com/opensearch-project/data-prepper/issues/467 | 1,033,810,224 | 467 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Beyond just parsing a Logstash configuration grammar, we must also be able map Logstash plugins and settings into Data Prepper plugins and settings.
**Describe the solution you'd like**
Provide a mapping file which can define how a Logstash plugin should map into Data Prepper.
| Provide a Mapping from Logstash Configuration files to Data Prepper configurations | https://api.github.com/repos/opensearch-project/data-prepper/issues/466/comments | 1 | 2021-10-22T17:11:09Z | 2021-11-16T17:45:29Z | https://github.com/opensearch-project/data-prepper/issues/466 | 1,033,792,363 | 466 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
This is part of the Logstash Configuration support. We need to parse Logstash Configuration files.
**Describe the solution you'd like**
Parse Logstash Configuration files using ANTLR.
| Implement Logstash Configuration Parsing | https://api.github.com/repos/opensearch-project/data-prepper/issues/465/comments | 0 | 2021-10-22T17:09:10Z | 2021-11-05T15:16:00Z | https://github.com/opensearch-project/data-prepper/issues/465 | 1,033,790,860 | 465 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper's HTTP endpoint is currently unsecured. Thus, users must either add a proxy on the host, or leave them open to network access.
**Describe the solution you'd like**
Support HTTP Basic Authentication on the HTTP input endpoint. Additionally, we will turn this on by default with a preconfigured username and password. Users will be able to turn off the HTTP security.
**Additional context**
This is related to #314 and #312
| Secure the HTTP input plugin with HTTP Basic Authentication | https://api.github.com/repos/opensearch-project/data-prepper/issues/464/comments | 4 | 2021-10-22T14:45:55Z | 2021-11-05T18:14:19Z | https://github.com/opensearch-project/data-prepper/issues/464 | 1,033,662,703 | 464 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently SSL is enabled by default in otel-trace-source, but we require the user to provide `sslCertificateFile` and `sslKeyFile` in order to run the source.
**Describe the solution you'd like**
We could default sslCertificateFile and sslKeyFile to use demo-data-prepper.crt and demo-data-prepper.key respectively in https://github.com/opensearch-project/data-prepper/tree/main/examples/demo
| Provide demo TLS/SSL cert and key for otel-trace-source | https://api.github.com/repos/opensearch-project/data-prepper/issues/462/comments | 1 | 2021-10-21T18:14:52Z | 2021-11-16T02:23:07Z | https://github.com/opensearch-project/data-prepper/issues/462 | 1,032,791,809 | 462 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently the endpoint of http source is not secured by default
**Describe the solution you'd like**
* set default ssl to true
* Provide demo key and cert files for `ssl_certificate_file` and `ssl_key_file` as default. We could reuse `demo-data-prepper.crt` and `demo-data-prepper.key` in https://github.com/opensearch-project/data-prepper/tree/main/examples/demo
* For testing purpose, on the client side we could turn on tlsNoVerify: https://javadoc.io/doc/com.linecorp.armeria/armeria-javadoc/latest/com/linecorp/armeria/client/ClientFactoryBuilder.html
| Secure endpoint by default on http source | https://api.github.com/repos/opensearch-project/data-prepper/issues/461/comments | 1 | 2021-10-21T17:46:19Z | 2021-11-16T02:19:45Z | https://github.com/opensearch-project/data-prepper/issues/461 | 1,032,763,429 | 461 |
[
"opensearch-project",
"data-prepper"
] | ### What kind of business use case are you trying to solve? What are your requirements?
We want to provide Logstash users an easy way to run their workloads on Data Prepper. By leveraging their existing Logstash configuration file, we will make the transition from Logstash to Data Prepper as seamless as possible.
### What is the problem? What is preventing you from meeting the requirements?
The Pipeline Configuration file required to run a Data Prepper instance (`pipelines.yaml`) is in YAML format and Logstash configuration file is a custom format.
### What are you proposing? What do you suggest we do to solve the problem or improve the existing situation?
The goal is to take existing Logstash configuration files, transform them into Data Prepper configuration files and build a Data Prepper Log Ingestion pipeline from them. The Logstash Configuration Converter will support the following requirements:
* The Converter will transform the Logstash configuration into a Data Prepper pipeline YAML using the respective plugin mappings, and the result will run as a Data Prepper log pipeline. This will be a one-step process to make it user-friendly.
* The Converter will fail for unsupported Logstash plugins and Conditionals (the scope is described below in the **What are your assumptions or prerequisites?** section). However, if the basic supported plugins are present in the configuration, it will process those and throw errors for the unsupported plugins. This can be extended later on when we broaden the scope.
### User Experience
### macOS/Linux
* User has a `logstash.conf`file.
* As of the Data Prepper 1.2 release, the Converter will be part of the Data Prepper uber-jar.
* User will run the uber-jar with the Logstash configuration file and `data-prepper-config.yaml` (to configure Data Prepper server with TLS/SSL):
```
java -jar data-prepper-core-$VERSION.jar logstash.conf data-prepper-config.yaml
```
* The `logstash.yaml` will be generated.
* The Data Prepper log pipeline will read the `logstash.yaml` and run the plugins.
### Docker
* User can pull the Data Prepper for Docker image using:
```
docker pull opensearchproject/data-prepper:latest
```
* A bind-mounted volume can be used to give the Logstash configuration using:
```
docker run --name data-prepper --expose 21890 \
    -v /full/path/to/logstash.conf:/usr/share/data-prepper/pipelines.conf \
    -v /full/path/to/data-prepper-config.yaml:/usr/share/data-prepper/data-prepper-config.yaml \
    opensearchproject/data-prepper:latest pipelines.conf
```
### What are your assumptions or prerequisites?
* Users wish to use existing Logstash configuration with Data prepper.
* Scope of the Converter for the Data Prepper 1.2 release will be supporting the Log HTTP Source plugin as the Input plugin, the Grok Prepper as the Filter plugin, and the OpenSearch Sink as the Output plugin.
* Conditionals and other Logstash plugins are not supported in this Converter for the Data Prepper 1.2 release.
| [RFC] Logstash Configuration Converter | https://api.github.com/repos/opensearch-project/data-prepper/issues/452/comments | 4 | 2021-10-20T17:02:10Z | 2021-11-18T19:04:09Z | https://github.com/opensearch-project/data-prepper/issues/452 | 1,031,634,661 | 452 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
How do we want to support both of these requirements:
* Providing a JSON-path structure for accessing fields within events
* Allowing fields to contain dots in the name e.g. `my.field`.
Raised by this question:
https://github.com/opensearch-project/data-prepper/pull/435/files#r732111116
| Resolve approach for JSON path and dots in key names | https://api.github.com/repos/opensearch-project/data-prepper/issues/450/comments | 15 | 2021-10-19T19:11:47Z | 2021-10-26T15:00:20Z | https://github.com/opensearch-project/data-prepper/issues/450 | 1,030,655,878 | 450 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
As part of the support for Logstash configurations, we need to have a basic model for Logstash configuration files.
**Describe the solution you'd like**
Create a Logstash configuration file data model.
**Describe alternatives you've considered (Optional)**
This was originally proposed to be resolved by using maps, lists, and primitives. However, some parts of the configuration can have a predefined model, and this makes it easier to comprehend.
| New Model for Logstash Configuration files | https://api.github.com/repos/opensearch-project/data-prepper/issues/446/comments | 0 | 2021-10-18T17:31:09Z | 2021-10-25T17:55:15Z | https://github.com/opensearch-project/data-prepper/issues/446 | 1,029,401,801 | 446 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The current code for PipelineConfiguration and PluginSettings is designed for reading from YAML. We also want to serialize YAML using these classes.
**Describe the solution you'd like**
Rework `PipelineConfiguration` and `PluginSettings` so that developers can easily write code which produces Pipeline objects that can be serialized as YAML.
| Refine the Pipeline and Settings model for creating | https://api.github.com/repos/opensearch-project/data-prepper/issues/437/comments | 4 | 2021-10-15T17:26:40Z | 2021-11-04T16:41:17Z | https://github.com/opensearch-project/data-prepper/issues/437 | 1,027,649,834 | 437 |
[
"opensearch-project",
"data-prepper"
] | Implement Log & Trace Models according to their interface.
This is part of #319 | Implement Log & Trace Model | https://api.github.com/repos/opensearch-project/data-prepper/issues/436/comments | 0 | 2021-10-15T14:42:11Z | 2021-11-04T20:56:41Z | https://github.com/opensearch-project/data-prepper/issues/436 | 1,027,514,478 | 436 |
[
"opensearch-project",
"data-prepper"
] | Implement the following interfaces as part of the new model:
- Event
- EventMetadata
This is part of #319 | Implement metadata & event model | https://api.github.com/repos/opensearch-project/data-prepper/issues/434/comments | 0 | 2021-10-15T14:13:20Z | 2021-10-19T19:28:19Z | https://github.com/opensearch-project/data-prepper/issues/434 | 1,027,487,900 | 434 |
[
"opensearch-project",
"data-prepper"
] | Apart from common metrics in [AbstractPrepper](https://github.com/opendistro-for-elasticsearch/data-prepper/blob/main/data-prepper-api/src/main/java/com/amazon/dataprepper/model/prepper/AbstractPrepper.java), grok-prepper introduces the following custom metrics.
Counter
* grokProcessingMatchFailure: records the number of Records that did not match any of the patterns specified in the match field
* grokProcessingMatchSuccess: records the number of Records that found at least one pattern match from the match field
* grokProcessingMatchTimeout: records the total number of Records that took longer than the time set in `timeoutMillis` to match
* grokProcessingErrors: records the total number of processing errors for Records
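These counters map naturally onto `PluginMetrics` counters (Micrometer underneath). The stdlib sketch below only illustrates how the prepper would bump the success/failure counters per record; the `contains` check is a stand-in for real Grok matching, and the metric names match the list above:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.concurrent.atomic.LongAdder;

public class GrokMetricsSketch {
    private final ConcurrentMap<String, LongAdder> counters = new ConcurrentHashMap<>();

    void increment(final String name) {
        counters.computeIfAbsent(name, k -> new LongAdder()).increment();
    }

    long count(final String name) {
        return counters.getOrDefault(name, new LongAdder()).sum();
    }

    // Hypothetical processing of one record; the real prepper would run Grok
    // patterns here and also increment grokProcessingMatchTimeout/grokProcessingErrors.
    void process(final String record) {
        final boolean matched = record.contains("ERROR"); // stand-in for a pattern match
        increment(matched ? "grokProcessingMatchSuccess" : "grokProcessingMatchFailure");
    }

    public static void main(final String[] args) {
        final GrokMetricsSketch metrics = new GrokMetricsSketch();
        metrics.process("ERROR something failed");
        metrics.process("INFO all good");
        System.out.println(metrics.count("grokProcessingMatchSuccess")); // 1
        System.out.println(metrics.count("grokProcessingMatchFailure")); // 1
    }
}
```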
| Metrics for Grok Prepper | https://api.github.com/repos/opensearch-project/data-prepper/issues/422/comments | 0 | 2021-10-13T16:00:04Z | 2021-11-04T20:36:43Z | https://github.com/opensearch-project/data-prepper/issues/422 | 1,025,429,791 | 422 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Building an external plugin requires having data-prepper-api. Presently, this is not available unless it is manually built by a plugin author.
We should deploy Data Prepper artifacts to Maven Central.
Additionally, we deploy to the Maven groupId: `org.opensearch.dataprepper`. Since this project is not currently deployed, this is not a breaking change.
**Describe the solution you'd like**
Deploy aftifacts to Maven Central
**Additional context**
This is important to fully support to #321 .
# Tasks
* [x] Update groupId to `org.opensearch.dataprepper`
* [x] Support publication of Maven artifacts to Maven local
* [x] #1180
* [x] https://github.com/opensearch-project/opensearch-build/issues/1751
| Deploy Artifacts to Maven Central | https://api.github.com/repos/opensearch-project/data-prepper/issues/421/comments | 2 | 2021-10-13T15:33:56Z | 2022-03-23T15:29:57Z | https://github.com/opensearch-project/data-prepper/issues/421 | 1,025,402,492 | 421 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently, Data Prepper pipelines only accept data from one input; I'd like to be able to bring in data from multiple inputs into a single pipeline.
**Describe the solution you'd like**
Data Prepper pipelines support multiple inputs into a single pipeline.
**Describe alternatives you've considered (Optional)**
N/A
**Additional context**
N/A | Support multiple sources for a pipeline | https://api.github.com/repos/opensearch-project/data-prepper/issues/406/comments | 2 | 2021-10-11T15:49:40Z | 2025-03-04T20:55:21Z | https://github.com/opensearch-project/data-prepper/issues/406 | 1,022,870,965 | 406 |
[
"opensearch-project",
"data-prepper"
This is the first task for [#319](https://github.com/opensearch-project/data-prepper/issues/319).
This includes introducing all interfaces for `event`, `span`, `log`, and any other sub component of these three interfaces | New Internal Model Interfaces | https://api.github.com/repos/opensearch-project/data-prepper/issues/405/comments | 0 | 2021-10-11T14:30:50Z | 2021-10-15T14:10:40Z | https://github.com/opensearch-project/data-prepper/issues/405 | 1,022,776,520 | 405 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently the http source is still using the `Buffer::write` API, which means incoming request data could be partially written into the Buffer.
**Describe the solution you'd like**
Use the new `Buffer::writeAll` API in http source
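The `writeAll` signature below is assumed from this issue's intent (write a whole batch or nothing); the in-memory buffer is only a sketch of the all-or-nothing semantics, and it rejects immediately rather than waiting out the timeout:

```java
import java.util.Collection;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.TimeoutException;

// Simplified stand-in for the Data Prepper Buffer interface (signature assumed).
interface Buffer<T> {
    void writeAll(Collection<T> records, int timeoutInMillis) throws Exception;
}

// All-or-nothing in-memory buffer: either every record in the batch fits, or none is written.
class InMemoryBuffer<T> implements Buffer<T> {
    private final BlockingQueue<T> queue;

    InMemoryBuffer(final int capacity) {
        this.queue = new ArrayBlockingQueue<>(capacity);
    }

    @Override
    public synchronized void writeAll(final Collection<T> records, final int timeoutInMillis)
            throws TimeoutException {
        // A real implementation would wait up to timeoutInMillis for capacity.
        if (queue.remainingCapacity() < records.size()) {
            throw new TimeoutException("buffer full; request should be retried");
        }
        queue.addAll(records);
    }

    int size() {
        return queue.size();
    }
}

public class WriteAllDemo {
    public static void main(final String[] args) throws Exception {
        final InMemoryBuffer<String> buffer = new InMemoryBuffer<>(2);
        buffer.writeAll(List.of("log-1", "log-2"), 500); // fits: both records written
        try {
            buffer.writeAll(List.of("log-3"), 500);      // does not fit: nothing written
        } catch (final TimeoutException e) {
            System.out.println("rejected whole batch: " + e.getMessage());
        }
        System.out.println(buffer.size()); // 2
    }
}
```

With this contract, the http source can map a `TimeoutException` to an HTTP 408 response without ever leaving a half-written batch in the buffer.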
| Integrate new Buffer::writeAll API with http source | https://api.github.com/repos/opensearch-project/data-prepper/issues/404/comments | 1 | 2021-10-08T21:06:17Z | 2022-02-21T20:14:25Z | https://github.com/opensearch-project/data-prepper/issues/404 | 1,021,462,012 | 404 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
This set of tests is flaky: it fails on one run and passes on the next, or vice versa.
**To Reproduce**
Steps to reproduce the behavior:
1. Create a PR
2. Watch it fail
3. rerun the action
4. watch it pass
**Expected behavior**
The tests should pass and not fail (as long as the test is correct).
| Flaky Integration Test: Run ./gradlew :data-prepper-core:serviceMapEndToEndTest | https://api.github.com/repos/opensearch-project/data-prepper/issues/399/comments | 2 | 2021-10-08T15:38:54Z | 2021-10-20T19:30:38Z | https://github.com/opensearch-project/data-prepper/issues/399 | 1,021,229,356 | 399 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
This integration test is flaky, alternating between passing and failing.
**To Reproduce**
Steps to reproduce the behavior:
1. Create a PR
2. Let the tests run, see it fail
3. Retry the test
4. watch it pass
**Expected behavior**
The tests should run successfully (and not fail multiple times in a row).
| Flaky integration test: Run ./gradlew :data-prepper-plugins:opensearch:test --tests | https://api.github.com/repos/opensearch-project/data-prepper/issues/398/comments | 1 | 2021-10-08T15:04:29Z | 2023-11-04T16:53:32Z | https://github.com/opensearch-project/data-prepper/issues/398 | 1,021,197,516 | 398 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Presently, the DataPrepper only loads plugins from the `com.amazon.dataprepper.plugins` Java package. We have found that we can add new plugins and make them part of DataPrepper by putting our custom plugins in that same package.
However, as plugin authors, we don't want to write almost all of our code in one Java package - `com.myorg.project` - and then put just the plugins into the `com.amazon.dataprepper.plugins` package.
**Describe the solution you'd like**
DataPrepper should be able to search other packages for plugins.
Provide a mechanism whereby a plugin author can write a plugin in any package name the plugin author desires.
**Additional context**
Copied from ODFE Data Prepper: https://github.com/opendistro-for-elasticsearch/data-prepper/issues/680
| Support additional package names for plugins | https://api.github.com/repos/opensearch-project/data-prepper/issues/379/comments | 1 | 2021-10-07T15:54:59Z | 2022-02-01T16:02:40Z | https://github.com/opensearch-project/data-prepper/issues/379 | 1,020,205,110 | 379 |
[
"opensearch-project",
"data-prepper"
] | This is a subtask of the issue for a grok processor: #256.
The Grok processor needs support for configuration options for `break_on_match`, `overwrite`, `pattern_definitions`, `patterns_dir`, `patterns_files_glob`, and `target` | Grok Processor Additional Features | https://api.github.com/repos/opensearch-project/data-prepper/issues/376/comments | 0 | 2021-10-06T21:20:39Z | 2021-10-18T20:28:55Z | https://github.com/opensearch-project/data-prepper/issues/376 | 1,019,255,491 | 376 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently there is no metrics for monitoring http source.
**Describe the solution you'd like**
We will add the following metrics:
Counter
* Number of received requests: requests received by the http server
* Number of processed requests: request data successfully pushed to the buffer.
* Number of timed-out requests: due to the buffer being full. A surge in this metric indicates the user/operator should increase the Data Prepper buffer size or scale up the number of processWorkers configured at the Data Prepper pipeline level.
* Number of bad requests: due to JSON validation or parsing errors.
* Number of rejected requests: due to throttling. This is essentially a metric on backpressure. A surge in this metric means the user/operator should increase the threads or the queue size in the executor (maxPendingRequests).
Timer
* Request processing time(latency): includes data encoding/decoding, pushing data into buffer
Summary
* Payload size distribution
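The timer above would typically be a Micrometer `Timer` registered via `PluginMetrics`; this stdlib sketch only illustrates recording the latency of the request-handling path (decode, then push to buffer):

```java
import java.util.concurrent.atomic.AtomicLong;
import java.util.concurrent.atomic.LongAdder;
import java.util.function.Supplier;

public class RequestTimerSketch {
    private final LongAdder count = new LongAdder();
    private final AtomicLong totalNanos = new AtomicLong();

    // Time one request-processing step, e.g. decoding the payload and writing to the buffer.
    <T> T record(final Supplier<T> requestHandler) {
        final long start = System.nanoTime();
        try {
            return requestHandler.get();
        } finally {
            totalNanos.addAndGet(System.nanoTime() - start);
            count.increment();
        }
    }

    long requestCount() {
        return count.sum();
    }

    double meanMillis() {
        return count.sum() == 0 ? 0 : totalNanos.get() / 1e6 / count.sum();
    }

    public static void main(final String[] args) {
        final RequestTimerSketch timer = new RequestTimerSketch();
        final String result = timer.record(() -> "pushed to buffer"); // stand-in for real work
        System.out.println(result);
        System.out.println(timer.requestCount()); // 1
    }
}
```

A Micrometer `DistributionSummary` would play the same role for the payload size distribution, recording bytes per request instead of nanoseconds.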
| Adding metrics into http source | https://api.github.com/repos/opensearch-project/data-prepper/issues/374/comments | 1 | 2021-10-06T18:50:03Z | 2021-10-12T14:40:40Z | https://github.com/opensearch-project/data-prepper/issues/374 | 1,018,975,215 | 374 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Recently I am seeing more frequent CI failure on e2e tests due to UnprocessedRequestException when Armeria client sending the requests.
```
Task :data-prepper-core:rawSpanEndToEndTest
com.amazon.dataprepper.integration.EndToEndRawSpanTest > testPipelineEndToEnd FAILED
io.grpc.StatusRuntimeException at EndToEndRawSpanTest.java:156
Caused by: com.linecorp.armeria.client.UnprocessedRequestException at UnprocessedRequestException.java:45
Caused by: io.netty.channel.AbstractChannel$AnnotatedConnectException
Caused by: java.net.ConnectException at Errors.java:155
```
**To Reproduce**
This is produced by github CI.
**Environment (please complete the following information):**
- OS: linux
- Version: github
| Trace analytics ingestion End-to-end tests have been flaky | https://api.github.com/repos/opensearch-project/data-prepper/issues/366/comments | 9 | 2021-10-06T18:32:07Z | 2021-10-20T19:30:55Z | https://github.com/opensearch-project/data-prepper/issues/366 | 1,018,941,170 | 366 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently only local file certificate provider is supported.
**Describe the solution you'd like**
We will expand support to s3 and ACM when necessary
| Support more TLS/SSL certificate providers in http source | https://api.github.com/repos/opensearch-project/data-prepper/issues/365/comments | 1 | 2021-10-06T16:28:02Z | 2022-09-09T01:48:19Z | https://github.com/opensearch-project/data-prepper/issues/365 | 1,018,702,004 | 365 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The TLS/SSL config has not been unified across plugins (otel-trace-source, http, peerforwarder). It should be unified so that all plugins can reuse the same config and CertificateProviderFactory.
**Describe the solution you'd like**
We need to sort out all necessary TLS/SSL config parameters and document them in a separate TLS/SSL module
| Centralize SSL configuration and certificate provider factory | https://api.github.com/repos/opensearch-project/data-prepper/issues/364/comments | 4 | 2021-10-06T16:26:20Z | 2023-10-12T20:05:32Z | https://github.com/opensearch-project/data-prepper/issues/364 | 1,018,698,540 | 364 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently we have the following certificate providers on source plugin (server) and peer forwarder (client)
- local file
- s3
- ACM
where only ACM supports encrypted key file with passphrase.
**Describe the solution you'd like**
We should support encrypted keys with passphrases for all certificate providers, since key file encryption standards are generic.
Approach:
Since the Armeria server provides the API to configure TLS with an encrypted key and passphrase, we only need to modify the Certificate model to accommodate the encrypted key and passphrase.
**Describe alternatives you've considered (Optional)**
Alternatively, we could refactor the [decrypt](https://github.com/opensearch-project/data-prepper/blob/3a669bf4c3d7110db9a5c9994e86b47430f5b2f1/data-prepper-plugins/otel-trace-source/src/main/java/com/amazon/dataprepper/plugins/certificate/acm/ACMCertificateProvider.java#L128) method existing in the ACM provider into a common SSL utility method for reuse. This will work with servers that happen not to support encrypted private keys. We do not have such a use case so far.
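A generic decryption utility of this kind can be built on standard JDK APIs via PKCS#8 `EncryptedPrivateKeyInfo`, independent of any one certificate provider. The round-trip below (encrypt, then decrypt with the passphrase) is for demonstration only; `PBEWithSHA1AndDESede` is just one PBE algorithm the JDK supports, and the sketch assumes an RSA key:

```java
import javax.crypto.Cipher;
import javax.crypto.EncryptedPrivateKeyInfo;
import javax.crypto.SecretKey;
import javax.crypto.SecretKeyFactory;
import javax.crypto.spec.PBEKeySpec;
import java.security.KeyFactory;
import java.security.KeyPair;
import java.security.KeyPairGenerator;
import java.security.PrivateKey;
import java.security.spec.PKCS8EncodedKeySpec;

public class KeyDecryptionSketch {

    // Decrypt a PKCS#8 EncryptedPrivateKeyInfo blob using a passphrase.
    static PrivateKey decrypt(final byte[] encryptedPkcs8, final char[] passphrase) throws Exception {
        final EncryptedPrivateKeyInfo info = new EncryptedPrivateKeyInfo(encryptedPkcs8);
        final SecretKey pbeKey = SecretKeyFactory.getInstance(info.getAlgName())
                .generateSecret(new PBEKeySpec(passphrase));
        final PKCS8EncodedKeySpec keySpec = info.getKeySpec(pbeKey);
        return KeyFactory.getInstance("RSA").generatePrivate(keySpec); // RSA assumed for the sketch
    }

    // Demo-only helper: produce an encrypted PKCS#8 blob from a plain private key.
    static byte[] encryptForDemo(final PrivateKey key, final char[] passphrase) throws Exception {
        final String pbeAlgorithm = "PBEWithSHA1AndDESede";
        final SecretKey pbeKey = SecretKeyFactory.getInstance(pbeAlgorithm)
                .generateSecret(new PBEKeySpec(passphrase));
        final Cipher cipher = Cipher.getInstance(pbeAlgorithm);
        cipher.init(Cipher.ENCRYPT_MODE, pbeKey);
        final byte[] encrypted = cipher.doFinal(key.getEncoded());
        return new EncryptedPrivateKeyInfo(cipher.getParameters(), encrypted).getEncoded();
    }

    public static void main(final String[] args) throws Exception {
        final KeyPairGenerator generator = KeyPairGenerator.getInstance("RSA");
        generator.initialize(2048);
        final KeyPair pair = generator.generateKeyPair();
        final char[] passphrase = "changeit".toCharArray();

        final byte[] blob = encryptForDemo(pair.getPrivate(), passphrase);
        System.out.println(decrypt(blob, passphrase).equals(pair.getPrivate()));
    }
}
```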
| All TLS/SSL certificate provider should accommodate encrypted key | https://api.github.com/repos/opensearch-project/data-prepper/issues/362/comments | 0 | 2021-10-05T22:28:49Z | 2022-04-19T19:35:13Z | https://github.com/opensearch-project/data-prepper/issues/362 | 1,016,968,514 | 362 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The http source only supports insecure http connection. We need to enable https
**Describe the solution you'd like**
As an initial open source release, we will support local certificate file path.
**Describe alternatives you've considered (Optional)**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
| Enable TLS/SSL for http source plugin | https://api.github.com/repos/opensearch-project/data-prepper/issues/358/comments | 1 | 2021-10-05T16:17:48Z | 2021-10-12T14:40:05Z | https://github.com/opensearch-project/data-prepper/issues/358 | 1,016,520,939 | 358 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
We updated ODFE Data Prepper dependencies with known CVEs. This task is to apply that work into OpenSearch Data Prepper.
**Describe the solution you'd like**
Update any dependencies which were not updated by dependabot or other changes.
See https://github.com/opendistro-for-elasticsearch/data-prepper/issues/877 | Update dependencies from ODFE Data Prepper | https://api.github.com/repos/opensearch-project/data-prepper/issues/357/comments | 5 | 2021-10-05T15:21:24Z | 2021-10-20T23:39:15Z | https://github.com/opensearch-project/data-prepper/issues/357 | 1,016,461,422 | 357 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Basic input validation is done on *String* inputs, but values such as ARNs, S3 paths, etc. are not checked against their expected formats until the time of use.
**To Reproduce**
Steps to reproduce the behavior:
Create a YAML configuration file with an incorrectly formatted ARN and run the code.
**Expected behavior**
When Data Prepper parses the file, it should throw errors and prevent Data Prepper startup due to misconfigurations. | Better input validation. | https://api.github.com/repos/opensearch-project/data-prepper/issues/350/comments | 1 | 2021-10-04T14:35:21Z | 2021-10-20T23:39:15Z | https://github.com/opensearch-project/data-prepper/issues/350 | 1,015,263,190 | 350 |
[
"opensearch-project",
"data-prepper"
] | This is a subtask of the issue for a grok processor: #256.
The grok prepper needs a configurable way to specify how long the prepper should attempt to match patterns against a log before it should move on to the next one. This will allow users to improve performance by not spending too much time matching an individual log.
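A common way to bound per-record match time with stdlib primitives is to run the match on an executor and cap it with `Future.get`. The match tasks below are stand-ins for real Grok matching, and a `timeoutMillis` of 0 disables the bound:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class MatchTimeoutSketch {
    private final ExecutorService executor = Executors.newSingleThreadExecutor();

    // Returns the match result, or null if matching exceeded timeoutMillis.
    Boolean matchWithTimeout(final Callable<Boolean> matchTask, final long timeoutMillis) throws Exception {
        if (timeoutMillis == 0) {
            return matchTask.call(); // 0 disables the timeout
        }
        final Future<Boolean> future = executor.submit(matchTask);
        try {
            return future.get(timeoutMillis, TimeUnit.MILLISECONDS);
        } catch (final TimeoutException e) {
            future.cancel(true); // interrupt the runaway match and move on to the next record
            return null;
        }
    }

    void shutdown() {
        executor.shutdownNow();
    }

    public static void main(final String[] args) throws Exception {
        final MatchTimeoutSketch sketch = new MatchTimeoutSketch();
        // Stand-in "matches": one fast, one that never finishes in time.
        System.out.println(sketch.matchWithTimeout(() -> true, 100)); // true
        System.out.println(sketch.matchWithTimeout(() -> {            // null (timed out)
            Thread.sleep(10_000);
            return true;
        }, 50));
        sketch.shutdown();
    }
}
```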
The user should also be able to set the timeout to be disabled, which will be done when the timeout_millis value is set to 0. | Grok Prepper Match Timeout | https://api.github.com/repos/opensearch-project/data-prepper/issues/345/comments | 0 | 2021-10-01T17:10:31Z | 2021-10-20T20:54:25Z | https://github.com/opensearch-project/data-prepper/issues/345 | 1,013,580,959 | 345 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The project is in the opensearch project and the naming should likewise be adjusted.
**Describe the solution you'd like**
`com.amazon` will be replaced with `org.opensearch` in all files
| Update package naming to be org.opensearch.dataprepper from com.amazon.dataprepper | https://api.github.com/repos/opensearch-project/data-prepper/issues/344/comments | 2 | 2021-10-01T16:34:42Z | 2022-10-04T13:18:22Z | https://github.com/opensearch-project/data-prepper/issues/344 | 1,013,546,401 | 344 |
[
"opensearch-project",
"data-prepper"
] | Coming from https://github.com/opensearch-project/project-meta/issues/17
A Developer Certificate of Origin is required on commits in the OpenSearch-Project.
See [doc.yml](https://github.com/opensearch-project/.github/blob/main/workflow/dco.yml) for an example workflow. Ensure CONTRIBUTING.md to has a section on the DCO per the [project template](https://github.com/opensearch-project/.github/blob/main/CONTRIBUTING.md#developer-certificate-of-origin).
- [x] DCO Check Workflow
- [x] CONTRIBUTING.md DCO Section | Ensure DCO Workflow Check | https://api.github.com/repos/opensearch-project/data-prepper/issues/326/comments | 1 | 2021-09-30T17:12:08Z | 2021-10-05T20:59:08Z | https://github.com/opensearch-project/data-prepper/issues/326 | 1,012,423,428 | 326 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Currently the http source plugin does not limit the task queue size when handling new requests; this will lead to memory overflow under high load.
**Describe the solution you'd like**
The source plugin needs to return 429 (Too many requests) once the blockingTaskExecutor queue size reaches max_pending_requests.
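The intended behavior can be sketched with a bounded work queue in front of the blocking worker pool: once `maxPendingRequests` tasks are queued, further submissions are rejected and mapped to HTTP 429. Names and sizes below are illustrative, not the actual Armeria configuration:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class ThrottlingSketch {
    static final int OK = 200;
    static final int TOO_MANY_REQUESTS = 429;

    private final ThreadPoolExecutor executor;

    ThrottlingSketch(final int threads, final int maxPendingRequests) {
        // Bounded queue + AbortPolicy: submissions beyond maxPendingRequests are rejected.
        this.executor = new ThreadPoolExecutor(threads, threads, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(maxPendingRequests),
                new ThreadPoolExecutor.AbortPolicy());
    }

    int handle(final Runnable work) {
        try {
            executor.execute(work);
            return OK;
        } catch (final RejectedExecutionException e) {
            return TOO_MANY_REQUESTS; // queue is full: signal back-pressure to the client
        }
    }

    void shutdown() {
        executor.shutdownNow();
    }

    public static void main(final String[] args) {
        final ThrottlingSketch server = new ThrottlingSketch(1, 1);
        final Runnable slow = () -> {
            try { Thread.sleep(200); } catch (final InterruptedException ignored) { }
        };
        System.out.println(server.handle(slow)); // 200: picked up by the worker
        System.out.println(server.handle(slow)); // 200: queued
        System.out.println(server.handle(slow)); // 429: queue full
        server.shutdown();
    }
}
```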
| Add throttling handler into http source | https://api.github.com/repos/opensearch-project/data-prepper/issues/323/comments | 1 | 2021-09-29T15:16:18Z | 2021-10-12T14:39:37Z | https://github.com/opensearch-project/data-prepper/issues/323 | 1,011,095,432 | 323 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The current plugin classes are static utilities and are inextensible. Supporting additional new sources of plugins, as in #321, will require either changing these static classes or creating a more flexible class structure.
**Describe the solution you'd like**
The goal of this design is to replace the following classes:
* PluginFactory (note, this is being replaced by a new class with the same name)
* PluginRepository
* SourceFactory
* BufferFactory
* PrepperFactory
* SinkFactory
The following two diagrams outline the proposed class design for Data Prepper plugins. These diagrams show Java packages, as well as the module in which those packages exist. Packages within the same module share the same color.
The `com.amazon.dataprepper.plugin` package is split into two different diagrams. The first diagram focuses on what it exposes to other packages and modules.

The following diagram outlines the internal details of the `com.amazon.dataprepper.plugin` package.

* `PluginFactory` - Implementations of this class provide the ability to create new plugin instances. This interface exists in data-prepper-api so that custom plugins can use this class without depending on data-prepper-core.
* `DefaultPluginFactory` - This design anticipates that we will only need one implementation of PluginFactory. This is that single implementation.
* `PluginProvider` - An interface for finding plugin classes.
* `ClasspathPluginProvider` - An implementation of `PluginProvider` which locates plugins in the classpath only.
* `RepositoryPluginProvider` - An implementation of `PluginProvider` which locates plugins from a remote repository. This is not in scope for the initial work.
Additionally, this proposal changes the `@DataPrepperPlugin` annotation:
```
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

@Documented
@Retention(RetentionPolicy.RUNTIME)
@Target({ElementType.TYPE})
public @interface DataPrepperPlugin {
/**
*
* @return Name of the plugin which should be unique for the type
*/
String name();
/**
* @deprecated Remove in favor of {@link DataPrepperPlugin#pluginType()}
* @return The plugin type enum
*/
@Deprecated
PluginType type();
/**
* The class type for this plugin.
*
* @return The Java class
*/
Class<?> pluginType();
}
```
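For illustration, here is how a plugin author might apply the annotation and how a factory could read `pluginType()` reflectively. This is a self-contained sketch: the annotation is a trimmed copy of the proposal, and `Sink`/`MySink` are hypothetical stand-ins, not Data Prepper classes.

```java
import java.lang.annotation.Documented;
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

// Self-contained sketch: a trimmed copy of the proposed annotation, a
// hypothetical plugin class, and reflective lookup of pluginType().
public class AnnotationSketch {
    @Documented
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.TYPE)
    public @interface DataPrepperPlugin {
        String name();
        Class<?> pluginType();
    }

    // Hypothetical plugin interface and implementation.
    public interface Sink {}

    @DataPrepperPlugin(name = "my_sink", pluginType = Sink.class)
    public static class MySink implements Sink {}

    public static void main(String[] args) {
        DataPrepperPlugin meta = MySink.class.getAnnotation(DataPrepperPlugin.class);
        System.out.println(meta.name());                        // my_sink
        System.out.println(meta.pluginType().getSimpleName());  // Sink
    }
}
```

Because `pluginType()` is a `Class<?>` rather than an enum, new plugin types can be introduced without modifying the annotation.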
**Additional Context**
Existing classes will be deprecated and removed in a future major release of Data Prepper. | Plugin Class Refactoring | https://api.github.com/repos/opensearch-project/data-prepper/issues/322/comments | 0 | 2021-09-29T14:09:08Z | 2021-10-20T18:51:54Z | https://github.com/opensearch-project/data-prepper/issues/322 | 1,011,010,710 | 322 |
[
"opensearch-project",
"data-prepper"
] | This RFC proposes a new approach for supporting plugins with Data Prepper. Its main goals are to promote modularity and allow for decentralized plugins.
**What is the problem? What is preventing you from meeting the requirements?**
Data Prepper plugins should be split out so that they do not need to be part of every installation of Data Prepper. Additionally, other teams or individuals should be able to easily create their own plugins for Data Prepper. These remote plugins would be developed and deployed independently of Data Prepper.
Currently, Data Prepper requires that all the plugins be embedded within the jar and in the current classpath.
**What are you proposing? What do you suggest we do to solve the problem or improve the existing situation?**
Data Prepper will support loading plugins from two different sources:
* The Java classpath of Data Prepper
* External plugins from remote repositories
This diagram outlines the division between the two types of plugins.

The Java classpath will be used for two scenarios:
1. Any core plugin which will always be included within Data Prepper
2. Custom Data Prepper distributions. Users of Data Prepper may be running Data Prepper without internet access (e.g. running in an enclosed network), and should be able to install custom versions of Data Prepper with all the plugins they need.
Loading plugins from remote repositories will be used for plugins which are not currently installed in Data Prepper. This proposal uses Maven Central as the mechanism for distribution of these plugins. Additionally, plugins will load within their own dedicated class loader to provide isolation between plugins. A future RFC or proposal will detail this approach.
This diagram shows the concept for supporting remote plugins which are external to Data Prepper.

Remote plugins will be downloaded into their own `plugins` directory within the Data Prepper directory structure.
The following outlines a possible directory structure. In this approach, each plugin has its own uber-jar file with all of its dependencies included. It may also be worth considering expanding a plugin uber jar as an alternative.
```
data-prepper-$VERSION/
bin/
data-prepper # Shell script to run Data Prepper on Linux/macOS
plugins/
my-plugin-a.jar
my-plugin-b.jar
logs/ # Directory for log output
LICENSE
NOTICE
README.md
```
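A hedged sketch of how jars under `plugins/` might each receive an isolated class loader (the directory name follows the proposal above; the loading mechanism shown here is illustrative, not the committed design):

```java
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;

// Illustrative sketch only: give each jar under plugins/ its own
// URLClassLoader so plugin dependencies stay isolated from Data Prepper
// core and from each other.
public class PluginDirectoryLoader {

    public static List<URLClassLoader> loadersFor(File pluginsDir) {
        List<URLClassLoader> loaders = new ArrayList<>();
        File[] jars = pluginsDir.listFiles((dir, name) -> name.endsWith(".jar"));
        if (jars == null) {
            return loaders; // directory missing or unreadable
        }
        for (File jar : jars) {
            try {
                // Parent is the core class loader; each plugin gets a child.
                loaders.add(new URLClassLoader(
                        new URL[] { jar.toURI().toURL() },
                        PluginDirectoryLoader.class.getClassLoader()));
            } catch (MalformedURLException e) {
                throw new IllegalStateException(e);
            }
        }
        return loaders;
    }

    public static void main(String[] args) {
        System.out.println(loadersFor(new File("plugins")).size());
    }
}
```

Keeping the core class loader as the parent lets plugins see the data-prepper-api types while their own dependencies stay private to their child loader.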
**What are your assumptions or prerequisites?**
This work depends on the new directory structure being added as part of [Directory Structure for Data Prepper](https://github.com/opensearch-project/data-prepper/issues/305).
**Tasks**
- [ ] #1543
- [ ] Support loading plugins from a remote repository
| [RFC] Plugin Redesign | https://api.github.com/repos/opensearch-project/data-prepper/issues/321/comments | 0 | 2021-09-29T13:48:52Z | 2022-06-25T20:31:18Z | https://github.com/opensearch-project/data-prepper/issues/321 | 1,010,984,915 | 321 |
[
"opensearch-project",
"data-prepper"
] | ## What kind of business use case are you trying to solve? What are your requirements?
Existing preppers consume and emit serialized JSON strings. This wastes CPU cycles when chaining preppers due to excessive de/serialization. Users of Data Prepper have encountered [runtime exceptions](https://github.com/opendistro-for-elasticsearch/data-prepper/issues/464) due to conflicting data requirements of their preppers. Model definitions are duplicated throughout prepper plugins (e.g. [otel-trace-group-prepper TraceGroup](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/otel-trace-group-prepper/src/main/java/com/amazon/dataprepper/plugins/prepper/oteltracegroup/model/TraceGroup.java) and [otel-trace-raw-prepper](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/otel-trace-raw-prepper/src/main/java/com/amazon/dataprepper/plugins/prepper/oteltrace/model/TraceGroup.java)).
*Requirements:*
1. Extendable - the model should scale beyond the existing trace analytics support.
2. Type Safety - the model should be prescriptive enough to enable type safety checks throughout the pipelines in the future.
3. Eliminate need to duplicate code between plugins
4. Allow preppers to operate on internal data in a generic way
5. Remove excessive serialization
### What is the problem? What is preventing you from meeting the requirements?
Currently, data flows through Data Prepper as a `Collection<Record<T>>`. Records are generic types that allow any type to flow through. Trace events have been defined as a Collection of Records of type String. The strings are serialized representations of JSON objects conforming to the OTel spec.
### What are you proposing? What do you suggest we do to solve the problem or improve the existing situation?
We will deprecate Records and define explicit object models for Traces and Logs. Traces and Logs will implement a new interface called Event. Events will be the new data type flowing through Data Prepper.
Source plugins will be responsible for translating external requests into Events. Sink plugins will be responsible for transforming Events into the correct output schema. Preppers will accept only Events (or subtypes) as inputs and outputs. This will effectively create internal boundaries for our model between sources and sinks.
#### Event
Events will be managed through public putting, deleting, and fetching methods. An additional method for generating a JSON representation is included to support the sinks.
```
public interface Event {

    /**
     * Add or update the key with the given value in the Event
     */
    void put(String key, Object value);

    /**
     * Retrieves the given key from the Event
     */
    <T> T get(String key, Class<T> type);

    /**
     * Deletes the given key from the Event
     */
    void delete(String key);

    /**
     * Generates a JSON representation of the Event
     */
    JsonNode toJsonNode();

    /**
     * Get Metadata
     */
    Metadata getMetadata();
}
```
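For concreteness, a minimal map-backed sketch of the put/get/delete contract. This is not the proposed implementation (which would likely be Jackson-backed, per the open questions below); it omits `toJsonNode()` and metadata, and uses a flat key space for brevity.

```java
import java.util.HashMap;
import java.util.Map;

// Map-backed sketch of the proposed Event put/get/delete contract.
// A real implementation would likely wrap a Jackson tree and support
// nested keys; this sketch keeps a flat key space for brevity.
public class MapBackedEvent {
    private final Map<String, Object> data = new HashMap<>();

    public void put(String key, Object value) {
        data.put(key, value);
    }

    public <T> T get(String key, Class<T> type) {
        // Class.cast gives a typed view without unchecked warnings.
        return type.cast(data.get(key));
    }

    public void delete(String key) {
        data.remove(key);
    }

    public static void main(String[] args) {
        MapBackedEvent event = new MapBackedEvent();
        event.put("status", 200);
        System.out.println(event.get("status", Integer.class)); // 200
        event.delete("status");
        System.out.println(event.get("status", Integer.class)); // null
    }
}
```

Passing `Class<T>` into `get` is what gives preppers type-safe access without each plugin re-serializing the event.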
#### EventMetadata
Will be a class with a slight refactor of the RecordMetadata class. Currently, RecordMetadata maintains a map of attributes and has one required recordType attribute. However, the recordType has historically been ignored. The new model will have required attributes recategorized as POJO fields. The eventType will help preserve the type (e.g. log, span) for casting and type validation. The EventMetadata class will still maintain a mapping for custom metadata defined in attributes.
```
public class EventMetadata {
    private String eventType;
    private long timeReceived;
    private Map<String, Object> attributes;
}
```
#### Span
Span will be a new model to support the traces. It will implement the Event interface and maintain the same attributes as the current [RawSpan Object](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/otel-trace-raw-prepper/src/main/java/com/amazon/dataprepper/plugins/prepper/oteltrace/model/RawSpan.java). This will ensure backwards compatibility with our existing preppers.
#### Phased Approach
This design includes breaking changes and will be delivered in two phases. The first phase builds support for the new model and onboards log ingestion and trace analytics. The second phase deprecates the old model and will be part of the 2.0 release.
### What are your assumptions or prerequisites?
The design and changes to the pipelines to enforce type safety are out of scope and should be addressed in a separate review. However, the output of this design should not hinder, but rather enable, type safety enforcement.
This aligns with the proposal for [Log Ingestion RFC](https://github.com/opensearch-project/data-prepper/issues/306)
### What are remaining open questions?
* Which library should we use to support the underlying interfaces: JsonPath or Jackson? [JsonPath](https://github.com/json-path/JsonPath) is a library for reading and updating JSON documents; it natively supports dot notation. [Jackson](https://github.com/FasterXML/jackson) is a fast JSON library for parsing JSON objects and supports JSON Pointers for managing objects. Both libraries will work; Jackson would be ideal to reduce dependencies.
| [RFC] Internal Model Proposal | https://api.github.com/repos/opensearch-project/data-prepper/issues/319/comments | 5 | 2021-09-28T21:44:28Z | 2022-08-05T20:58:54Z | https://github.com/opensearch-project/data-prepper/issues/319 | 1,010,239,069 | 319 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper running behind proxies is not currently supported.
**Describe the solution you'd like**
Follow the same approach as planned in #300. Each AWS configuration can include a new proxy configuration.
For example, within peer-forwarder:
```
prepper:
- peer_forwarder:
discovery_mode: aws_cloud_map
proxy: http://my-proxy:9000
```
**Describe alternatives you've considered (Optional)**
We may wish to have a default AWS configuration available for users to configure. This could be used for any plugin which uses AWS. However, this would be a much larger change.
**AWS Components Needing Updating**
- [ ] peer-forwarder Cloud Map (AWS SDK v2)
- [ ] opensearch (AWS SDK v2)
- [ ] CloudWatchMeterRegistryProvider (AWS SDK v2)
- [ ] `ACMCertificateProvider` (multiple plugins; AWS SDK v1)
- [ ] `S3CertificateProvider` (multiple plugins; AWS SDK v1) | Proxy support for AWS SDKs | https://api.github.com/repos/opensearch-project/data-prepper/issues/317/comments | 1 | 2021-09-27T23:14:42Z | 2023-06-05T21:09:19Z | https://github.com/opensearch-project/data-prepper/issues/317 | 1,008,784,921 | 317 |
[
"opensearch-project",
"data-prepper"
] | **Describe the bug**
Although configured in ServerBuilder: https://github.com/opensearch-project/data-prepper/blob/1cb44e1094ac7428130f75a5b01fe55aeeb09283/data-prepper-plugins/otel-trace-source/src/main/java/com/amazon/dataprepper/plugins/source/oteltrace/OTelTraceSource.java#L107
the BlockingTaskExecutor is not used by default to execute the gRPC service: https://armeria.dev/docs/server-grpc#blocking-service-implementation
This leads to blocking tasks being executed directly on the main EventLoop thread, which blocks event handling in the Netty framework. As a consequence, the `threads` parameter never actually takes effect in the source plugin.
**To Reproduce**
We do not have a way to reproduce this in the Data Prepper context, but it is expected to severely delay new request handling.
**Expected behavior**
We should enable the blocking task executor in the `GrpcServiceBuilder`:
```
final GrpcServiceBuilder grpcServiceBuilder = GrpcService
.builder()
// explicitly use blocking task executor
.useBlockingTaskExecutor(true)
.addService(new OTelTraceGrpcService(
oTelTraceSourceConfig.getRequestTimeoutInMillis(),
buffer,
pluginMetrics
))
.useClientTimeoutHeader(false);
```
| BlockingTaskExecutor is never used in Otel-Trace-Source | https://api.github.com/repos/opensearch-project/data-prepper/issues/316/comments | 4 | 2021-09-27T22:01:33Z | 2021-12-15T18:12:56Z | https://github.com/opensearch-project/data-prepper/issues/316 | 1,008,682,100 | 316 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
The current buffer's write API only allows writing a single item atomically. Once we migrate to the internal data model mentioned in https://github.com/opensearch-project/data-prepper/issues/306, the existing write API will lead to partially ingested request data when called from a source plugin.
**Describe the solution you'd like**
We will provide a new `writeAll(Collection<T> items)` API that **atomically** writes a collection of items into the buffer.
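An illustrative sketch of the atomicity requirement: either every item in the batch is admitted or none are, so a request can never be half-ingested. Names and the capacity model here are hypothetical; the real Buffer API also involves timeouts and backpressure.

```java
import java.util.Collection;
import java.util.LinkedList;
import java.util.List;
import java.util.Queue;

// Sketch of an all-or-nothing writeAll: capacity is checked for the whole
// batch before any item is enqueued, so the buffer is never left with a
// partial request.
public class AtomicBatchBuffer<T> {
    private final int capacity;
    private final Queue<T> queue = new LinkedList<>();

    public AtomicBatchBuffer(int capacity) {
        this.capacity = capacity;
    }

    /** Returns true if the entire batch was admitted, false if rejected. */
    public synchronized boolean writeAll(Collection<T> items) {
        if (queue.size() + items.size() > capacity) {
            return false; // reject the whole batch; buffer is unchanged
        }
        queue.addAll(items);
        return true;
    }

    public synchronized int size() {
        return queue.size();
    }

    public static void main(String[] args) {
        AtomicBatchBuffer<String> buffer = new AtomicBatchBuffer<>(3);
        System.out.println(buffer.writeAll(List.of("a", "b"))); // true
        System.out.println(buffer.writeAll(List.of("c", "d"))); // false
        System.out.println(buffer.size()); // 2
    }
}
```

The single `synchronized` block is what makes the check and the insert one atomic step; checking capacity and then adding in separate steps would reintroduce partial writes under concurrency.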
| Add and implement writeAll API in Buffer | https://api.github.com/repos/opensearch-project/data-prepper/issues/315/comments | 1 | 2021-09-27T20:37:03Z | 2021-11-18T18:20:56Z | https://github.com/opensearch-project/data-prepper/issues/315 | 1,008,580,229 | 315 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper's OTel endpoint is currently unsecured. Thus, users must either add a proxy on the host or leave it open to network access.
**Describe the solution you'd like**
Support HTTP Basic Authentication on the OTel endpoint. Additionally, we will turn this on by default with a preconfigured username and password. Users will be able to turn off the HTTP security.
This is similar to #312, but covers the OTel endpoint specifically.
| Secure OTel gRPC API with HTTP Basic security | https://api.github.com/repos/opensearch-project/data-prepper/issues/314/comments | 0 | 2021-09-27T18:35:06Z | 2021-11-12T21:27:17Z | https://github.com/opensearch-project/data-prepper/issues/314 | 1,008,467,401 | 314 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
All HTTP methods work on all the core APIs (`shutdown`, `list`, `metrics`).
**Describe the solution you'd like**
Only the following HTTP combinations should be supported:
| Path | Method |
| ----- | ----- |
| shutdown | POST |
| list | GET |
| list | POST |
| metrics/sys | GET |
| metrics/sys | POST |
| metrics/prometheus | GET |
| metrics/prometheus | POST |
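The table above can be expressed as a simple allow-list; any other combination would receive 405 Method Not Allowed. This is a sketch of the check only, not Data Prepper's handler code (paths and status handling are illustrative):

```java
import java.util.Map;
import java.util.Set;

// Sketch: allow-list of HTTP methods per core API path, per the table above.
public class CoreApiMethods {
    private static final Map<String, Set<String>> ALLOWED = Map.of(
            "/shutdown", Set.of("POST"),
            "/list", Set.of("GET", "POST"),
            "/metrics/sys", Set.of("GET", "POST"),
            "/metrics/prometheus", Set.of("GET", "POST"));

    /** Returns 200 for an allowed combination, otherwise 405. */
    public static int statusFor(String path, String method) {
        Set<String> methods = ALLOWED.get(path);
        return methods != null && methods.contains(method) ? 200 : 405;
    }

    public static void main(String[] args) {
        System.out.println(statusFor("/list", "GET"));     // 200
        System.out.println(statusFor("/shutdown", "GET")); // 405
    }
}
```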
| Support correct HTTP methods on core API endpoints | https://api.github.com/repos/opensearch-project/data-prepper/issues/313/comments | 0 | 2021-09-27T17:51:03Z | 2022-08-31T18:27:01Z | https://github.com/opensearch-project/data-prepper/issues/313 | 1,008,426,173 | 313 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper provides a few core APIs such as `shutdown`, `list`, and `metrics/...`. Currently, users of Data Prepper must secure them either through network security or a proxy on the host.
**Describe the solution you'd like**
Add support for HTTP Basic authentication on these endpoints. Additionally, this authentication mechanism will be turned on by default with a predefined username and password. Turning it on should encourage users to set their own username and password. We will support turning off HTTP Basic authentication entirely.
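A hedged sketch of the HTTP Basic check itself (header parsing and comparison only; credential storage, the default credentials, and the on/off switch are out of scope here, and the class name is hypothetical):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.Base64;

// Sketch: validate an "Authorization: Basic <base64>" header value
// against a configured username/password pair.
public class BasicAuthSketch {
    private final byte[] expected;

    public BasicAuthSketch(String username, String password) {
        this.expected = Base64.getEncoder()
                .encode((username + ":" + password).getBytes(StandardCharsets.UTF_8));
    }

    public boolean isAuthorized(String authorizationHeader) {
        if (authorizationHeader == null || !authorizationHeader.startsWith("Basic ")) {
            return false;
        }
        byte[] presented = authorizationHeader
                .substring("Basic ".length())
                .getBytes(StandardCharsets.UTF_8);
        // MessageDigest.isEqual is a constant-time comparison, avoiding
        // timing side channels that String.equals would introduce.
        return MessageDigest.isEqual(presented, expected);
    }

    public static void main(String[] args) {
        BasicAuthSketch auth = new BasicAuthSketch("admin", "admin");
        // "admin:admin" base64-encodes to "YWRtaW46YWRtaW4="
        System.out.println(auth.isAuthorized("Basic YWRtaW46YWRtaW4=")); // true
        System.out.println(auth.isAuthorized("Basic d3Jvbmc="));         // false
    }
}
```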
| Secure core APIs with HTTP Basic security | https://api.github.com/repos/opensearch-project/data-prepper/issues/312/comments | 5 | 2021-09-27T16:59:24Z | 2021-11-10T22:46:07Z | https://github.com/opensearch-project/data-prepper/issues/312 | 1,008,380,937 | 312 |
[
"opensearch-project",
"data-prepper"
] | **Is your feature request related to a problem? Please describe.**
Data Prepper currently supports `prepper` plugins. This term is not entirely clear. Additionally, it is somewhat ambiguous with Data Prepper.
**Describe the solution you'd like**
Rename `prepper` to `processor`.
This can be done in a phased approach. First, we will add `processor` as a supported name within the pipeline YAML configuration. We will continue to support either `processor` or `prepper` in the YAML. However, only one may be present per pipeline.
In a major release version of Data Prepper we will remove `prepper` so that only `processor` is allowed.
Additionally, we will add a new interface `Processor` with the same signature as `Prepper`. The `Prepper` interface will be made to inherit from `Processor` and will be deprecated. In a major release of Data Prepper, we will remove the `Prepper` interface.
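The deprecation path in code form. The generic signature below is trimmed to the essentials and the `Record` type is stubbed, so this is a shape sketch rather than the exact data-prepper-api declaration:

```java
import java.util.Collection;

// Sketch of the proposed rename: Processor carries the real contract,
// and Prepper becomes an empty, deprecated alias that extends it.
public class RenameSketch {
    interface Record<T> {} // stub standing in for the existing Record type

    public interface Processor<InputRecord extends Record<?>, OutputRecord extends Record<?>> {
        Collection<OutputRecord> execute(Collection<InputRecord> records);
    }

    /** @deprecated Use {@link Processor} instead. */
    @Deprecated
    public interface Prepper<InputRecord extends Record<?>, OutputRecord extends Record<?>>
            extends Processor<InputRecord, OutputRecord> {
    }

    public static void main(String[] args) {
        // Any existing Prepper implementation is automatically a Processor.
        System.out.println(Processor.class.isAssignableFrom(Prepper.class)); // true
    }
}
```

Because `Prepper` adds no members of its own, existing plugin implementations keep compiling unchanged until the interface is removed in a major release.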
| Rename Prepper to Processor | https://api.github.com/repos/opensearch-project/data-prepper/issues/311/comments | 9 | 2021-09-27T16:23:28Z | 2021-11-30T17:51:45Z | https://github.com/opensearch-project/data-prepper/issues/311 | 1,008,350,684 | 311 |
[
"opensearch-project",
"data-prepper"
] | ### What kind of business use case are you trying to solve? What are your requirements?
[A roadmap has been released](https://opensearch.org/blog/releases/2021/09/data-prepper-roadmap/) to enable Data Prepper to ingest logs from telemetry data collection agents (such as FluentBit and Telemetry Collector) into OpenSearch. In accordance with this roadmap, we are proposing updates to the [current OpenSearch Sink implementation](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/opensearch/src/main/java/com/amazon/dataprepper/plugins/sink) to support the following use cases.
*Use Cases:*
1. As a user, I want to be able to ingest log data in batches from Data Prepper into OpenSearch.
2. As a developer, I want to refactor OpenSearch Sink plugin to make it easily extensible to support more index types.
### What is the problem? What is preventing you from meeting the requirements?
1. OpenSearch Sink doesn’t officially support ingestion of log data.
2. OpenSearch Sink’s source code needs to be refactored to make it easily extensible to support more index types. The existing code has some anti-pattern implementations. For example, in the [IndexStateManagement](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/opensearch/src/main/java/com/amazon/dataprepper/plugins/sink/opensearch/IndexStateManagement.java) class, if-else statements are used to switch between different index types.
### What are you proposing? What do you suggest we do to solve the problem or improve the existing situation?
We are going to make the following two major updates to the [existing OpenSearch Sink implementation](https://github.com/opensearch-project/data-prepper/blob/main/data-prepper-plugins/opensearch/src/main/java/com/amazon/dataprepper/plugins/sink).
#### Refactoring for Better Extensibility
* Create an IndexType enum list which maintains the complete list of supported index types.
```
public enum IndexType {
TRACE_ANALYTICS_RAW("trace-analytics-raw"),
TRACE_ANALYTICS_SERVICE_MAP("trace-analytics-service-map"),
CUSTOM("custom"); // will be used to handle generic event ingestion, including logs.
private String indexTypeCode; // which will be passed in through the new index_type parameter in configuration.
}
```
* Create IndexManager interface, which is implemented by subclasses corresponding to index types: *TraceAnalyticsRawIndexManager, TraceAnalyticsServiceMapIndexManager, DefaultIndexManager.*
```
interface IndexManager {
boolean checkISMEnabled();
Optional<String> checkAndCreatePolicy();
void checkAndCreateIndexTemplate();
void checkAndCreateIndex();
}
```
* Delete IndexStateManagement class. The implementations in IndexStateManagement class will be moved to relevant IndexManager sub-classes. For example, implementations for Raw Trace Analytics in IndexStateManagement will be moved to TraceAnalyticsRawIndexManager.
* DefaultIndexManager will be used to support general data ingestion, including log ingestion, into OpenSearch.
* Move OpenSearchSink::checkAndCreateIndex() to corresponding IndexManager sub-classes because its implementation is specific to index type.
* Move OpenSearchSink:: checkAndCreateIndexTemplate() to corresponding IndexManager sub-classes because its implementation is related to index type.
* Add a new IndexManagerFactory which produces an IndexManager instance corresponding to the IndexType input.
```
public class IndexManagerFactory {
Map<IndexType, IndexManager> indexManagers = new HashMap<>();
public IndexManagerFactory() {
indexManagers.put(IndexType.CUSTOM, DefaultIndexManager.getInstance());
indexManagers.put(IndexType.TRACE_ANALYTICS_RAW, TraceAnalyticsRawIndexManager.getInstance());
indexManagers.put(IndexType.TRACE_ANALYTICS_SERVICE_MAP, TraceAnalyticsServiceMapIndexManager.getInstance());
//...
}
public IndexManager getIndexManager(IndexType indexType) {
return indexManagers.get(indexType);
}
}
```
* OpenSearchSink class will be independent of any specific index types. It will depend on IndexManagerFactory to get specific IndexManager instance to perform index related operations.
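A compact, runnable distillation of the enum-plus-factory pattern described above. The manager classes are stubs (`describe()` stands in for the real `IndexManager` operations), so this shows the lookup shape only, not the actual sink code:

```java
import java.util.EnumMap;
import java.util.Map;

// Runnable distillation of the proposed design: the sink asks the factory
// for a manager by IndexType and never branches on the type itself.
public class IndexManagerFactorySketch {
    enum IndexType { TRACE_ANALYTICS_RAW, TRACE_ANALYTICS_SERVICE_MAP, CUSTOM }

    interface IndexManager { String describe(); }

    static class DefaultIndexManager implements IndexManager {
        public String describe() { return "custom"; }
    }

    static class TraceAnalyticsRawIndexManager implements IndexManager {
        public String describe() { return "trace-analytics-raw"; }
    }

    static final Map<IndexType, IndexManager> MANAGERS = new EnumMap<>(IndexType.class);
    static {
        MANAGERS.put(IndexType.CUSTOM, new DefaultIndexManager());
        MANAGERS.put(IndexType.TRACE_ANALYTICS_RAW, new TraceAnalyticsRawIndexManager());
    }

    static IndexManager getIndexManager(IndexType type) {
        return MANAGERS.get(type);
    }

    public static void main(String[] args) {
        System.out.println(getIndexManager(IndexType.CUSTOM).describe()); // custom
    }
}
```

Compared with the if-else chains in `IndexStateManagement`, adding a new index type here only requires a new manager class and one map entry.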
#### Support of New Parameters in Configuration
Parameter | Required | Description | Additional Comments
-- | -- | -- | --
index_type | No | Default: custom <br>One string value from the index types that Data Prepper supports: trace-analytics-raw, trace-analytics-service-map, custom. This parameter is used only when "trace_analytics_raw" and "trace_analytics_service_map" are not set. | We consider the two parameters "trace_analytics_raw" and "trace_analytics_service_map" deprecated. If customers are using them, we will show a console warning that the parameter will be removed and that they should use index_type instead. We will remove the two deprecated parameters in the next major version release of Data Prepper, likely in 2022.
timeout | No | Default: 60 <br>Set the timeout, in seconds, for network operations and requests sent to OpenSearch. If a timeout occurs, the request will be retried. |
ism_policy_file | No | Specifies the path to an index state management (ISM) policy file. If not provided, a default one will be used. This is only effective when index state management is enabled on OpenSearch. |
number_of_shards | No | Default: 1 <br>Number of shards for the index | If the parameter is set in configuration, the IndexConfiguration class will use the set value to override the default.
number_of_replicas | No | Default: 1 <br>Number of replicas for the index | If the parameter is set in configuration, the IndexConfiguration class will use the set value to override the default.
#### Example of the OpenSearch Sink Configuration
```
sink:
- opensearch:
hosts: ["https://search-host1.example.com:9200", "https://search-host2.example.com:9200"]
cert: "config/root-ca.pem"
username: "ta-user"
password: "ta-password"
index_type: "custom"
index: "my-service-application-log"
template_file: "/path/to/index_template"
ism_policy_file: "/path/to/ism_policy"
```
### What are your assumptions or prerequisites?
This aligns with the most recent [blog post](https://opensearch.org/blog/releases/2021/09/data-prepper-roadmap/)
### What are remaining open questions?
1. We are considering supporting ingestion to OpenSearch hosts through a reverse proxy.
2. We are considering supporting date and time patterns in index names.
| [RFC] OpenSearch Sink Updates Proposal | https://api.github.com/repos/opensearch-project/data-prepper/issues/310/comments | 4 | 2021-09-24T16:11:12Z | 2021-11-22T18:14:07Z | https://github.com/opensearch-project/data-prepper/issues/310 | 1,006,620,051 | 310 |