librarian-bot committed
Commit 495e265
1 Parent(s): 197cdf3

Librarian Bot: Add base_model information to model


This pull request aims to enrich the metadata of your model by adding [`distilgpt2`](https://huggingface.co/distilgpt2) as a `base_model` field, situated in the `YAML` block of your model's `README.md`.
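Concretely, after merging, the `YAML` block at the top of `README.md` would gain a single line (abridged sketch; only a few of this repo's existing keys are shown for context):

```yaml
---
license: apache-2.0
pipeline_tag: text-generation
base_model: distilgpt2   # the model this checkpoint was fine-tuned from
---
```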

How did we find this information? We performed a regular expression match on your `README.md` file to determine the connection.
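The bot's actual pattern is not shown here, but as a hedged illustration, a README sentence such as "fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2)" could be matched with a regular expression along these lines (the function name and pattern are hypothetical, not Librarian Bot's real implementation):

```python
import re

# Hypothetical sketch: match a Markdown link of the form
# "fine-tuned version of [<model>](https://huggingface.co/<model>)".
BASE_MODEL_RE = re.compile(
    r"fine-?tuned (?:version )?of \[([\w./-]+)\]\(https://huggingface\.co/[\w./-]+\)",
    re.IGNORECASE,
)

def find_base_model(readme_text: str):
    """Return the first base-model name linked in the README, or None."""
    match = BASE_MODEL_RE.search(readme_text)
    return match.group(1) if match else None

readme = (
    "This model is a fine-tuned version of "
    "[distilgpt2](https://huggingface.co/distilgpt2) on the HC3 dataset."
)
print(find_base_model(readme))  # -> distilgpt2
```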

**Why add this?** Enhancing your model's metadata in this way:
- **Boosts Discoverability** - It becomes straightforward to trace the relationships between various models on the Hugging Face Hub.
- **Highlights Impact** - It showcases the contributions and influences different models have within the community.

For a hands-on example of how such metadata can play a pivotal role in mapping model connections, take a look at [librarian-bots/base_model_explorer](https://huggingface.co/spaces/librarian-bots/base_model_explorer).

This PR comes courtesy of [Librarian Bot](https://huggingface.co/librarian-bot). If you have any feedback, queries, or need assistance, please don't hesitate to reach out to [@davanstrien](https://huggingface.co/davanstrien). Your input is invaluable to us!

Files changed (1)
1. README.md +19 -24
README.md CHANGED

```diff
@@ -1,37 +1,33 @@
 ---
+language:
+- en
 license: apache-2.0
+library_name: transformers
 tags:
 - generated_from_trainer
 - chatgpt
+datasets:
+- pszemraj/HC3-textgen-qa
 metrics:
 - accuracy
-model-index:
-- name: distilgpt2-HC3
-  results: []
 widget:
-- text: >-
-    Review: Best cast iron skillet you will ever buy. Is this review positive or
-    negative? <answer>
+- text: 'Review: Best cast iron skillet you will ever buy. Is this review positive
+    or negative? <answer>'
   example_title: Sentiment analysis
-- text: >-
-    Barack Obama nominated Hilary Clinton as his secretary of state on Monday.
+- text: Barack Obama nominated Hilary Clinton as his secretary of state on Monday.
     He chose her because <answer>
   example_title: Coreference resolution
-- text: >-
-    On a shelf, there are five books: a gray book, a red book, a purple book, a
-    blue book, and a black book. Here's the puzzle, <answer>
+- text: 'On a shelf, there are five books: a gray book, a red book, a purple book,
+    a blue book, and a black book. Here''s the puzzle, <answer>'
   example_title: Logic puzzles
-- text: >-
-    The two men running to become New York City's next mayor will face off in
+- text: The two men running to become New York City's next mayor will face off in
     their first debate Wednesday night <answer>
   example_title: Reading comprehension
-- text: >-
-    Is it true that if I have five 5-hour energy drinks in a single 24-hour
-    period, I get 25 hours of energy and spontaneously explode? <answer>
+- text: Is it true that if I have five 5-hour energy drinks in a single 24-hour period,
+    I get 25 hours of energy and spontaneously explode? <answer>
   example_title: 5 hour energy
-- text: >-
-    what happens if you train a smaller model on a dataset of
-    reinforcement-learning optimized model responses? <answer>
+- text: what happens if you train a smaller model on a dataset of reinforcement-learning
+    optimized model responses? <answer>
   example_title: deep learning advice
 inference:
   parameters:
@@ -39,12 +35,11 @@ inference:
     max_length: 96
     no_repeat_ngram_size: 3
     repetition_penalty: 1.5
-datasets:
-- pszemraj/HC3-textgen-qa
-language:
-- en
-library_name: transformers
 pipeline_tag: text-generation
+base_model: distilgpt2
+model-index:
+- name: distilgpt2-HC3
+  results: []
 ---
 
```
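As an illustrative sanity check (not part of the PR itself), the merged front matter can be verified with a few lines of standard-library Python; the `front_matter` helper and the abridged README text below are made up for the example:

```python
# Extract the YAML front matter (the text between the opening and
# closing '---' markers) and confirm the new `base_model` key landed.
def front_matter(readme_text: str) -> str:
    """Return the text between the first pair of '---' markers, or ''."""
    parts = readme_text.split("---")
    return parts[1] if len(parts) >= 3 else ""

merged_readme = """---
license: apache-2.0
pipeline_tag: text-generation
base_model: distilgpt2
---
# distilgpt2-HC3
"""

print("base_model: distilgpt2" in front_matter(merged_readme))  # -> True
```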