SwastikM committed
Commit 6715a2a
1 Parent(s): a72b0bf

Update README.md

Files changed (1):
  1. README.md +30 -62
README.md CHANGED
@@ -12,7 +12,7 @@ widget:
  ---


- # Model Card for Model ID

  Generate SQL from Natural Language question with a SQL context.

@@ -30,98 +30,66 @@ BART from facebook/bart-large-cnn is fintuned on gretelai/synthetic_text_to_sql
  - **Finetuned from model [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn?text=The+tower+is+324+metres+%281%2C063+ft%29+tall%2C+about+the+same+height+as+an+81-storey+building%2C+and+the+tallest+structure+in+Paris.+Its+base+is+square%2C+measuring+125+metres+%28410+ft%29+on+each+side.+During+its+construction%2C+the+Eiffel+Tower+surpassed+the+Washington+Monument+to+become+the+tallest+man-made+structure+in+the+world%2C+a+title+it+held+for+41+years+until+the+Chrysler+Building+in+New+York+City+was+finished+in+1930.+It+was+the+first+structure+to+reach+a+height+of+300+metres.+Due+to+the+addition+of+a+broadcasting+aerial+at+the+top+of+the+tower+in+1957%2C+it+is+now+taller+than+the+Chrysler+Building+by+5.2+metres+%2817+ft%29.+Excluding+transmitters%2C+the+Eiffel+Tower+is+the+second+tallest+free-standing+structure+in+France+after+the+Millau+Viaduct.)**
  - **Dataset:** [gretelai/synthetic_text_to_sql](https://huggingface.co/datasets/gretelai/synthetic_text_to_sql)

- ## Uses

- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

- ### Direct Use

- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->

- [More Information Needed]

- ### Downstream Use [optional]

- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->

- [More Information Needed]


- ## How to Get Started with the Model

- Use the code below to get started with the model.
-
- [More Information Needed]

  ## Training Details

  ### Training Data

- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->

  [More Information Needed]

  ### Training Procedure

- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->

- #### Preprocessing [optional]

- [More Information Needed]


  #### Training Hyperparameters

- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->

- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]

  ## Evaluation

- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]

  #### Hardware

- [More Information Needed]


  ## Citation
@@ -162,6 +130,6 @@ Use the code below to get started with the model.
  }


- ## Model Card Authors [optional]

- [Swastik Maiti]

  ---


+ # BART (large-sized model), fine-tuned on synthetic_text_to_sql

  Generate SQL from Natural Language question with a SQL context.

  - **Finetuned from model [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn?text=The+tower+is+324+metres+%281%2C063+ft%29+tall%2C+about+the+same+height+as+an+81-storey+building%2C+and+the+tallest+structure+in+Paris.+Its+base+is+square%2C+measuring+125+metres+%28410+ft%29+on+each+side.+During+its+construction%2C+the+Eiffel+Tower+surpassed+the+Washington+Monument+to+become+the+tallest+man-made+structure+in+the+world%2C+a+title+it+held+for+41+years+until+the+Chrysler+Building+in+New+York+City+was+finished+in+1930.+It+was+the+first+structure+to+reach+a+height+of+300+metres.+Due+to+the+addition+of+a+broadcasting+aerial+at+the+top+of+the+tower+in+1957%2C+it+is+now+taller+than+the+Chrysler+Building+by+5.2+metres+%2817+ft%29.+Excluding+transmitters%2C+the+Eiffel+Tower+is+the+second+tallest+free-standing+structure+in+France+after+the+Millau+Viaduct.)**
  - **Dataset:** [gretelai/synthetic_text_to_sql](https://huggingface.co/datasets/gretelai/synthetic_text_to_sql)

+ ## Intended uses & limitations

+ Demonstrates the capability of an LLM fine-tuned on a downstream task: generating SQL from a natural-language question together with its SQL context. Implemented as a personal project.

+ ### How to use

+ # Load model directly
+ from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

+ tokenizer = AutoTokenizer.from_pretrained("SwastikM/bart-large-nl2sql")

+ model = AutoModelForSeq2SeqLM.from_pretrained("SwastikM/bart-large-nl2sql")

+ query_question_with_context = "sql_prompt: Which economic diversification efforts in the 'diversification' table have a higher budget than the average budget for all economic diversification efforts in the 'budget' table? sql_context: CREATE TABLE diversification (id INT, effort VARCHAR(50), budget FLOAT); CREATE TABLE budget (diversification_id INT, diversification_effort VARCHAR(50), amount FLOAT);"

+ # Tokenize the prompt, generate SQL token ids, then decode them back to text
+ inputs = tokenizer(query_question_with_context, return_tensors="pt")
+ sql_ids = model.generate(**inputs)
+ sql = tokenizer.decode(sql_ids[0], skip_special_tokens=True)

+ print(sql)

  ## Training Details

  ### Training Data

+ [gretelai/synthetic_text_to_sql](https://huggingface.co/datasets/gretelai/synthetic_text_to_sql)

  [More Information Needed]
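
For reference, the linked training data can be pulled directly with the `datasets` library. This is a minimal sketch; the split and column names follow the dataset card, and the use of the `train` split here is illustrative:

```python
from datasets import load_dataset

# Load the synthetic text-to-SQL dataset used for fine-tuning.
dataset = load_dataset("gretelai/synthetic_text_to_sql")
print(dataset)                            # available splits and columns
print(dataset["train"][0]["sql_prompt"])  # a natural-language question
```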

  ### Training Procedure

+ Fine-tuned with a custom training loop using Hugging Face Accelerate (a sketch of such a loop follows the hyperparameters below).

+ #### Preprocessing

+ - ***Encoder Input:*** "sql_prompt: " + data['sql_prompt']+" sql_context: "+data['sql_context']
+ - ***Decoder Input:*** data['sql']
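
The two bullets above describe how each record is turned into an encoder input string and a decoder target. A minimal sketch of that mapping, assuming the gretelai/synthetic_text_to_sql column names (`sql_prompt`, `sql_context`, `sql`); the max lengths and the base tokenizer choice are illustrative, not values documented by the author:

```python
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/bart-large-cnn")
dataset = load_dataset("gretelai/synthetic_text_to_sql")

def preprocess(data):
    # Encoder input: "sql_prompt: <question> sql_context: <schema>"
    source = "sql_prompt: " + data["sql_prompt"] + " sql_context: " + data["sql_context"]
    model_inputs = tokenizer(source, max_length=512, truncation=True)
    # Decoder target: the reference SQL query
    labels = tokenizer(text_target=data["sql"], max_length=256, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized_train = dataset["train"].map(preprocess, remove_columns=dataset["train"].column_names)
```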


  #### Training Hyperparameters

+ - **Optimizer:** AdamW
+ - **lr:** 2e-5
+ - **decay:** linear
+ - **num_warmup_steps:** 0
+ - **batch_size:** 8
+ - **num_training_steps:** 12500
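
A sketch of what the Accelerate training loop could look like with the hyperparameters listed above. `tokenizer` and `tokenized_train` are assumed to come from the preprocessing sketch earlier; the data collator and loop structure are illustrative, not the author's exact script:

```python
import torch
from accelerate import Accelerator
from torch.utils.data import DataLoader
from transformers import AutoModelForSeq2SeqLM, DataCollatorForSeq2Seq, get_scheduler

model = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-large-cnn")
collator = DataCollatorForSeq2Seq(tokenizer, model=model)  # pads inputs and labels per batch
train_loader = DataLoader(tokenized_train, batch_size=8, shuffle=True, collate_fn=collator)

optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
num_training_steps = 12500
lr_scheduler = get_scheduler("linear", optimizer=optimizer,
                             num_warmup_steps=0, num_training_steps=num_training_steps)

accelerator = Accelerator()
model, optimizer, train_loader, lr_scheduler = accelerator.prepare(
    model, optimizer, train_loader, lr_scheduler
)

model.train()
completed_steps = 0
while completed_steps < num_training_steps:
    for batch in train_loader:
        loss = model(**batch).loss       # labels in the batch give the seq2seq LM loss
        accelerator.backward(loss)
        optimizer.step()
        lr_scheduler.step()
        optimizer.zero_grad()
        completed_steps += 1
        if completed_steps >= num_training_steps:
            break
```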


  ## Evaluation

+ ***Rouge Score***
+ - **Rouge1:** 55.69
+ - **Rouge2:** 42.99
+ - **RougeL:** 51.43
+ - **RougeLsum:** 51.40
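
The numbers above appear to be ROUGE F-measures scaled to 0-100. A sketch of how such scores could be computed with the `evaluate` library; the evaluation split, generation settings, and any post-processing behind the reported figures are not documented, so the example pair below is purely illustrative:

```python
import evaluate
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("SwastikM/bart-large-nl2sql")
model = AutoModelForSeq2SeqLM.from_pretrained("SwastikM/bart-large-nl2sql")
rouge = evaluate.load("rouge")

def generate_sql(question_with_context):
    inputs = tokenizer(question_with_context, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# One illustrative pair; reported scores would aggregate over a held-out set.
predictions = [generate_sql("sql_prompt: How many users are registered? "
                            "sql_context: CREATE TABLE users (id INT, name TEXT);")]
references = ["SELECT COUNT(*) FROM users;"]

scores = rouge.compute(predictions=predictions, references=references)
print({k: round(v * 100, 2) for k, v in scores.items()})  # rouge1, rouge2, rougeL, rougeLsum
```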

  #### Hardware

+ - **GPU:** P100


  ## Citation

  }


+ ## Model Card Authors

+ Swastik Maiti