wvangils committed on
Commit
4731df7
1 Parent(s): fec7c48

Add new model - bloom 560


The BLOOM 350m checkpoint has been replaced by the BLOOM 560m checkpoint (the new name reflects a parameter count that includes the embedding weights).

Files changed (1)
  1. app.py +2 -2
app.py CHANGED
@@ -2,7 +2,7 @@ import transformers
  from transformers import pipeline
  import gradio as gr
  
- checkpoint_choices = ['wvangils/GPT-Medium-Beatles-Lyrics-finetuned-newlyrics', 'wvangils/GPT-Neo-125m-Beatles-Lyrics-finetuned-newlyrics', 'wvangils/BLOOM-350m-Beatles-Lyrics-finetuned-newlyrics']
+ checkpoint_choices = ['wvangils/GPT-Medium-Beatles-Lyrics-finetuned-newlyrics', 'wvangils/GPT-Neo-125m-Beatles-Lyrics-finetuned-newlyrics', 'wvangils/BLOOM-560m-Beatles-Lyrics-finetuned']
  
  # Create function for generation
  def generate_beatles(checkpoint, input_prompt, temperature, top_p):
@@ -33,7 +33,7 @@ output_box = gr.Textbox(label="Lyrics by The Beatles and chosen language model:"
  title='Beatles lyrics generator'
  description="<p style='text-align: center'>Multiple language models were fine-tuned on lyrics from The Beatles to generate Beatles-like text. Give it a try!</p>"
  article="""<p style='text-align: left'>A couple of data scientists working for <a href='https://cmotions.nl/' target="_blank">Cmotions</a> came together to construct a text generation model that will output Beatles-like text.
- We tried several text generation models that we were able to load in Colab: a general <a href='https://huggingface.co/gpt2-medium' target='_blank'>GPT2-medium</a> model, the Eleuther AI small-sized GPT model <a href='https://huggingface.co/EleutherAI/gpt-neo-125M' target='_blank'>GPT-Neo</a> and the new kid on the block build by the <a href='https://bigscience.notion.site/BLOOM-BigScience-176B-Model-ad073ca07cdf479398d5f95d88e218c4' target='_blank'>Bigscience</a> initiative <a href='bigscience/bloom-350m' target='_blank'>BLOOM 350m</a>.
+ We tried several text generation models that we were able to load in Colab: a general <a href='https://huggingface.co/gpt2-medium' target='_blank'>GPT2-medium</a> model, the Eleuther AI small-sized GPT model <a href='https://huggingface.co/EleutherAI/gpt-neo-125M' target='_blank'>GPT-Neo</a> and the new kid on the block build by the <a href='https://bigscience.notion.site/BLOOM-BigScience-176B-Model-ad073ca07cdf479398d5f95d88e218c4' target='_blank'>Bigscience</a> initiative <a href='https://huggingface.co/bigscience/bloom-560m' target='_blank'>BLOOM 560m</a>.
  Further we've put together a <a href='https://huggingface.co/datasets/cmotions/Beatles_lyrics' target='_blank'> Huggingface dataset</a> containing all known lyrics created by The Beatles. <a href='https://www.theanalyticslab.nl/blogs/' target='_blank'>Read this blog </a> to see how this model was build in a Python notebook using Huggingface.
  The default output contains 100 tokens and has a repetition penalty of 1.0.
  </p>"""
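
For reference, a minimal sketch of how the new checkpoint added in this commit could be loaded outside the Space, using the same transformers pipeline API that app.py imports. The prompt and the generation parameters below are illustrative assumptions, not the values wired into the Gradio app:

```python
from transformers import pipeline

# Load the fine-tuned BLOOM 560m checkpoint referenced by this commit.
generator = pipeline(
    "text-generation",
    model="wvangils/BLOOM-560m-Beatles-Lyrics-finetuned",
)

# Illustrative generation call; temperature/top_p values are assumptions,
# not the defaults used by the app itself.
output = generator(
    "Here comes the sun",
    max_new_tokens=100,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
    repetition_penalty=1.0,
)
print(output[0]["generated_text"])
```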