Joetib committed
Commit 51a57c1
1 Parent(s): 65a6f9d

Create README.md

Files changed (1):
  1. README.md +90 -0

README.md (added):
---
license: mit
datasets:
- head_qa
language:
- en
library_name: transformers
---

# ibleducation/ibl-multiple-choice-7B

ibleducation/ibl-multiple-choice-7B is a model finetuned on top of [mistralai/Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1).

The model is finetuned to generate multiple-choice questions.
The output of the model is a JSON object with the following entries:
1. `category`: the topic area of the question
2. `qtext`: the question text
3. `ra`: the `aid` of the correct answer
4. `answers`: a list of possible answer choices, each with an `aid` (answer id) and `atext` (answer text)

## Example Conversations

1. Question: Photosynthesis \
   Answer:
```json
{
  "category": "Photosynthesis",
  "qtext": "The chlorophyll fluorescence measurement technique is based on the emission of fluorescence by the chlorophylls present in the photosynthetic pigmentation:",
  "ra": 4,
  "answers": [
    {"aid": 1, "atext": "It is used to determine the light absorption characteristics of the pigments."},
    {"aid": 2, "atext": "It is used to determine the light emission characteristics of the pigments."},
    {"aid": 3, "atext": "It is used to determine the kinetics of light absorption by the pigments."},
    {"aid": 4, "atext": "It is used to determine the kinetics of light emission by the pigments."},
    {"aid": 5, "atext": "It is used to determine the energy that the pigments emit when they absorb light."}
  ]
}
```

## Model Details

- **Developed by:** [IBL Education](https://ibl.ai)
- **Model type:** [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1)
- **Base Model:** [Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)
- **Language:** English
- **Finetuned from weights:** [Mistral-7B-Instruct-v0.1](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.1)
- **Finetuned on data:**
  - [head_qa](https://huggingface.co/datasets/head_qa)
- **Model License:** MIT

## How to Get Started with the Model

### Install the necessary packages

Requires: [transformers](https://pypi.org/project/transformers/) > 4.35.0
```shell
pip install transformers
pip install accelerate
```

### You can then try the following example code

```python
from transformers import AutoModelForCausalLM, AutoTokenizer
import transformers
import torch

model_id = "ibleducation/ibl-multiple-choice-7B"

# Load the tokenizer and model; device_map="auto" places the weights on the
# available GPU(s) or CPU and requires the accelerate package.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
)
pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    tokenizer=tokenizer,
)

# Prompts follow the Mistral instruct template: <s>[INST] {prompt} [/INST]
prompt = "<s>[INST] Algebra [/INST] "

# The pipeline returns a list with one dict per generated sequence.
response = pipeline(prompt)
print(response[0]['generated_text'])
```
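
With default settings the text-generation pipeline may stop before the full JSON object has been emitted. Generation arguments such as `max_new_tokens` or sampling options can be passed directly to the pipeline call, continuing the example above; the values below are illustrative assumptions, not settings taken from this model card.

```python
# Illustrative generation settings (assumptions, not from the model card).
response = pipeline(
    prompt,
    max_new_tokens=512,  # leave room for the full JSON object
    do_sample=True,
    temperature=0.7,
    top_p=0.95,
)
print(response[0]["generated_text"])
```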

**Important** - Use the prompt template below:
```
<s>[INST] {prompt} [/INST]
```
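
Since the model's output is a JSON object, the completion can be parsed with Python's standard `json` module. The snippet below is a minimal sketch that continues the pipeline example above; it assumes the completion after `[/INST]` is well-formed JSON with the fields described earlier (`category`, `qtext`, `ra`, `answers` with `aid`/`atext`).

```python
import json

# Generate only the completion (return_full_text=False drops the prompt).
output = pipeline(
    "<s>[INST] Photosynthesis [/INST] ",
    max_new_tokens=512,
    return_full_text=False,
)
generated = output[0]["generated_text"]

# Parse the JSON question; guard against malformed output in practice.
try:
    question = json.loads(generated)
except json.JSONDecodeError:
    raise ValueError(f"Model did not return valid JSON: {generated!r}")

print(question["qtext"])
for answer in question["answers"]:
    marker = "*" if answer["aid"] == question["ra"] else " "
    print(f"{marker} {answer['aid']}. {answer['atext']}")
```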