Text Classification · Transformers · PyTorch · English · electra · reward-model · reward_model · RLHF · Inference Endpoints
theblackcat102 committed on
Commit
12a77db
1 Parent(s): 0bde796
Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -35,7 +35,7 @@ All models are train on these dataset with a same split seed across datasets (if
 # How to use
 
 ```
-from transformer import AutoModelForSequenceClassification, AutoTokenizer
+from transformers import AutoModelForSequenceClassification, AutoTokenizer
 reward_name = "OpenAssistant/reward-model-electra-large-discriminator"
 rank_model, tokenizer = AutoModelForSequenceClassification.from_pretrained(reward_name), AutoTokenizer.from_pretrained(reward_name)
 question, answer = "Explain nuclear fusion like I am five", "Nuclear fusion is the process by which two or more protons and neutrons combine to form a single nucleus. It is a very important process in the universe, as it is the source of energy for stars and galaxies. Nuclear fusion is also a key process in the production of energy for nuclear power plants."
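For context, the fix corrects the package name so the README snippet imports from the real `transformers` library. Below is a minimal sketch of how the snippet is typically completed to produce a reward score; the diff ends at the `question, answer` line, so the tokenization call, the `torch.no_grad()` wrapper, and the `score` variable here are illustrative additions rather than part of this commit, and assume the model exposes a single-logit sequence-classification head (as reward models usually do).

```python
# Illustrative usage sketch (not part of this commit):
# score a (question, answer) pair with the reward model.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

reward_name = "OpenAssistant/reward-model-electra-large-discriminator"
rank_model = AutoModelForSequenceClassification.from_pretrained(reward_name)
tokenizer = AutoTokenizer.from_pretrained(reward_name)

question = "Explain nuclear fusion like I am five"
answer = "Nuclear fusion is the process by which two or more protons and neutrons combine to form a single nucleus."

# Encode the pair as one sequence; the scalar logit is read out as the reward score
# (assumes a single-label classification head).
inputs = tokenizer(question, answer, return_tensors="pt")
with torch.no_grad():
    score = rank_model(**inputs).logits[0].item()
print(score)
```

A higher score indicates an answer the reward model ranks as more helpful for the given question, which is how such models are used to compare candidate responses in RLHF pipelines.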