pszemraj committed
Commit 85536d9
1 Parent(s): efd03ae

Update README.md

Files changed (1): README.md +5 -1
README.md CHANGED
@@ -10,10 +10,14 @@ pinned: false
 # stacked summaries
 
 
-This organization exists to test and evaluate the (_potential_) benefits of "task-oriented pretraining" as popularized by the [FLAN-t5](https://huggingface.co/google/flan-t5-base) series of models
+This organization exists to test and evaluate the (_potential_) benefits of "task-oriented pretraining" as popularized by the [FLAN-t5](https://huggingface.co/google/flan-t5-base) series of models within the summarization NLP task.
 
 ## mission statement
 
 The idea is to apply a similar concept but adjusted to be more specific w.r.t. the summarization task. Hopefully, this will train models that actually "know" how to condense and distill meaningful information from text rather than learning some naive style transfer of _"this is what the dataset summaries sound like so I will do that with essential words."_
 
 The most apparent augmentation/task is "stacking" summaries that are shorter than `MAX_LENGTH_TOKENS` when combined, so the model has to learn to separate and group summaries for these independent concepts.
+
+## work statement
+
+Unless otherwise noted, work here is completed by "night shift" NLP enthusiasts/researchers in their free time.
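
As a rough illustration of the "stacking" augmentation the README describes, here is a minimal Python sketch of how short summaries might be greedily packed under a token budget. The `t5-small` tokenizer, the `[NEXT_CONCEPT]` separator string, and the 512-token value for `MAX_LENGTH_TOKENS` are all illustrative assumptions, not details taken from this commit.

```python
from transformers import AutoTokenizer

# Illustrative values: the commit does not specify them.
MAX_LENGTH_TOKENS = 512
SEPARATOR = "[NEXT_CONCEPT]"  # hypothetical marker between independent summaries

tokenizer = AutoTokenizer.from_pretrained("t5-small")


def stack_summaries(summaries: list[str]) -> list[str]:
    """Greedily pack short summaries into combined training targets that stay
    under MAX_LENGTH_TOKENS, so one example can hold several independent
    concepts the model must learn to separate and group."""
    stacked, current = [], []
    for summary in summaries:
        candidate = f" {SEPARATOR} ".join(current + [summary])
        if len(tokenizer.encode(candidate)) <= MAX_LENGTH_TOKENS or not current:
            current.append(summary)
        else:
            # Budget exceeded: emit the current stack and start a new one.
            stacked.append(f" {SEPARATOR} ".join(current))
            current = [summary]
    if current:
        stacked.append(f" {SEPARATOR} ".join(current))
    return stacked
```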