Librarian Bot: Add base_model information to model
This pull request aims to enrich the metadata of your model by adding [`distilgpt2`](https://huggingface.co/distilgpt2) as a `base_model` field, situated in the `YAML` block of your model's `README.md`.
**How did we find this information?** We performed a regular expression match on your `README.md` file to determine the connection.
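As a rough illustration of this kind of matching (not the bot's actual implementation; the pattern and sample text below are assumptions), a trainer-generated `README.md` typically contains a line such as "This model is a fine-tuned version of [distilgpt2](https://huggingface.co/distilgpt2)", which a regular expression can pick apart:

```python
import re

# Hypothetical sketch of the README scan described above; the real expression
# used by Librarian Bot is not shown here and may differ.
readme_text = (
    "This model is a fine-tuned version of "
    "[distilgpt2](https://huggingface.co/distilgpt2) on the None dataset."
)

# Capture the repo id from a Hugging Face model link in the usual
# "fine-tuned version of [name](https://huggingface.co/<repo_id>)" phrasing.
pattern = re.compile(
    r"fine-tuned version of \[[^\]]+\]"
    r"\(https://huggingface\.co/([\w.\-]+(?:/[\w.\-]+)?)\)"
)

match = pattern.search(readme_text)
if match:
    print("base_model candidate:", match.group(1))  # distilgpt2
```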
**Why add this?** Enhancing your model's metadata in this way:
- **Boosts Discoverability** - It becomes straightforward to trace the relationships between various models on the Hugging Face Hub.
- **Highlights Impact** - It showcases the contributions and influences different models have within the community.
For a hands-on example of how such metadata can play a pivotal role in mapping model connections, take a look at [librarian-bots/base_model_explorer](https://huggingface.co/spaces/librarian-bots/base_model_explorer).
This PR comes courtesy of [Librarian Bot](https://huggingface.co/librarian-bot). If you have any feedback, queries, or need assistance, please don't hesitate to reach out to [@davanstrien](https://huggingface.co/davanstrien).
If you want to automatically add `base_model` metadata to more of your models, you can use the [Librarian Bot](https://huggingface.co/librarian-bot) [Metadata Request Service](https://huggingface.co/spaces/librarian-bots/metadata_request_service)!
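If you prefer to set the same field yourself, the snippet below is one possible way to do it with the `huggingface_hub` library's `metadata_update` helper (the repo id shown is a placeholder):

```python
from huggingface_hub import metadata_update

# Placeholder repo id; substitute the model repository you want to update.
repo_id = "your-username/distilgpt2-finetune-acl22"

# Merge `base_model: distilgpt2` into the YAML block of the repo's README.md,
# opening a pull request rather than committing to main directly.
metadata_update(
    repo_id,
    {"base_model": "distilgpt2"},
    create_pr=True,
)
```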
The proposed change to the `YAML` block of `README.md`:

@@ -2,22 +2,21 @@
 license: apache-2.0
 tags:
 - generated_from_trainer
+widget:
+- text: Toward Annotator Group Bias in Crowdsourcing. Introduction
+  example_title: Introduction
+- text: Over the last few years, there has been a move towards data
+  example_title: Over the last few years
+- text: We introduce a new language representation
+  example_title: new language representation
+- text: Acknowledgements. This research is supported by the National Science Foundation
+  example_title: Acknowledgements
+- text: 'We hope that our work serves not only to inform the NLP '
+  example_title: We hope that
+base_model: distilgpt2
 model-index:
 - name: distilgpt2-finetune-acl22
   results: []
-widget:
-
-- text: "Toward Annotator Group Bias in Crowdsourcing. Introduction"
-  example_title: "Introduction"
-- text: "Over the last few years, there has been a move towards data"
-  example_title: "Over the last few years"
-- text: "We introduce a new language representation"
-  example_title: "new language representation"
-- text: "Acknowledgements. This research is supported by the National Science Foundation"
-  example_title: "Acknowledgements"
-- text: "We hope that our work serves not only to inform the NLP "
-  example_title: "We hope that"
-
 ---

 <!-- This model card has been generated automatically according to the information the Trainer had access to. You