Instructions to use vikp/instruct_rater with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use vikp/instruct_rater with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="vikp/instruct_rater")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("vikp/instruct_rater")
model = AutoModelForSequenceClassification.from_pretrained("vikp/instruct_rater")
```

- Notebooks
- Google Colab
- Kaggle
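A minimal sketch of scoring a single example with the model loaded directly, as above. The prompt string below is a placeholder assumption, not the model's actual prompt format (which should be substituted in), and the mapping of the two output classes is likewise not specified here:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("vikp/instruct_rater")
model = AutoModelForSequenceClassification.from_pretrained("vikp/instruct_rater")

# Placeholder text -- substitute the model's expected prompt format.
text = "Instruction: ...\nOutput: ..."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Convert raw logits to class probabilities for the binary classifier.
probs = torch.softmax(logits, dim=-1)
print(probs)
```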
From the README.md:

This model judges if a given output is sufficient to recreate a given instruction.

It's useful for filtering data to train a reverse instruct model. It could also have applications around determining if an output/instruction pair is linked, or around quality filtering data (data where the instruction can be recreated from the output might be higher quality).

The model is a binary classifier trained on top of Pythia 410m with 100k examples for 1 epoch. The final validation loss is 0.35. You can see an example of a dataset filtered with this model [here](https://huggingface.co/datasets/vikp/reverse_instruct).

To use it, pass in this prompt format:
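The README describes using the classifier to filter instruction/output pairs for training data. A minimal sketch of that filtering step, assuming each pair has already been scored (e.g. with the `text-classification` pipeline) and that a higher score means the instruction is recoverable from the output — both assumptions, since the label semantics are not spelled out here:

```python
def filter_pairs(pairs, scores, threshold=0.5):
    """Keep (instruction, output) pairs whose classifier score meets
    the threshold. Score semantics are an assumption in this sketch."""
    return [pair for pair, score in zip(pairs, scores) if score >= threshold]

# Hypothetical scores, as might come from the classification pipeline.
pairs = [
    ("Write a haiku about rain", "Rain taps the window..."),
    ("Summarize this article", "4"),
]
scores = [0.91, 0.12]
kept = filter_pairs(pairs, scores)
print(kept)  # only the first pair meets the default threshold
```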