---
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
widget:
  - text: 10 Meditation tips
    example_title: Health Example
  - text: Cooking red sauce pasta
    example_title: Cooking Example
  - text: Introduction to Keras
    example_title: Technology Example
tags:
  - text-generation
---

ScriptGPT

Pretrained model on the English language using a causal language modeling (CLM) objective. It was introduced in this paper and first released on this page.

Model description

ScriptGPT is a language model trained on a dataset of custom YouTube video scripts. ScriptGPT-small is a causal language transformer that follows the GPT-2 architecture. As a causal language model, it predicts the probability of a sequence of words from the preceding words only: given the previous tokens, it produces a probability distribution over the next word without incorporating future words.
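
As a concrete illustration of the causal objective, the sketch below (not part of the original card) inspects the model's probability distribution over the next token for a short prompt. It assumes the checkpoint loads like a standard GPT-2-style `AutoModelForCausalLM`.

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SRDdev/ScriptGPT-small")
model = AutoModelForCausalLM.from_pretrained("SRDdev/ScriptGPT-small")

inputs = tokenizer("10 Meditation tips", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits            # shape: (batch, seq_len, vocab_size)

# Distribution over the next token, conditioned only on the preceding tokens
next_token_probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(next_token_probs, k=5)
for token_id, prob in zip(top.indices.tolist(), top.values.tolist()):
    print(f"{tokenizer.decode([token_id])!r}: {prob:.3f}")
```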

The goal of ScriptGPT is to generate scripts for AI videos that are coherent, informative, and engaging. This can be useful for content creators who are looking for inspiration or who want to automate the process of generating video scripts. To use ScriptGPT, users can provide a prompt or a starting sentence, and the model will generate a sequence of words that follow the context and style of the training data.

The current model is the smallest one, with 124 million parameters (SRDdev/ScriptGPT-small).

More models are coming soon...

Intended uses

The intended uses of ScriptGPT include generating scripts for videos that explain artificial intelligence concepts, providing inspiration for content creators, and automating the process of generating video scripts.

How to use

You can use this model directly with a pipeline for text generation.

Load Model

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("SRDdev/ScriptGPT-small")
model = AutoModelForCausalLM.from_pretrained("SRDdev/ScriptGPT-small")
```
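
If you prefer not to use a pipeline, the hedged sketch below generates directly with `model.generate()`; the prompt and sampling settings are illustrative, not values recommended by the author.

```python
# Illustrative only: generate directly from the loaded model and tokenizer.
inputs = tokenizer("Introduction to Keras", return_tensors="pt")
output_ids = model.generate(
    **inputs,
    max_length=200,
    do_sample=True,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2-style models define no pad token
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```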

Pipeline

```python
from transformers import pipeline

# Build a text-generation pipeline around the loaded model and tokenizer
generator = pipeline('text-generation', model=model, tokenizer=tokenizer)

context = "This is an introduction to Keras, a high-level neural networks API that is popular among researchers and developers. The video covers the basics of Keras, how to install it, and how to use it to build and train a neural network. The presenter demonstrates building and training a simple binary classification model using Keras."
length_to_generate = 1000

script = generator(context, max_length=length_to_generate, do_sample=True)[0]['generated_text']

print(script)
```

Providing a more detailed context generally results in better outputs.
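
For example, the sketch below (illustrative prompts, not from the original card) contrasts a terse prompt with a more detailed one using the `generator` pipeline defined above.

```python
# Illustrative only: compare a terse prompt with a more detailed one.
short_context = "Keras tutorial"
detailed_context = (
    "This is an introduction to Keras, a high-level neural networks API. "
    "The video covers installation, building a simple binary classification "
    "model, and training it on a small dataset."
)

for context in (short_context, detailed_context):
    script = generator(context, max_length=300, do_sample=True)[0]["generated_text"]
    print(script[:200], "\n---")
```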

Limitations and bias

The model is trained on YouTube scripts and works best in that domain. It may also generate inaccurate or fabricated information, so users should be aware of this and cross-validate the results.

Citations

```
@model{
  name      = {Shreyas Dixit},
  framework = {PyTorch},
  year      = {Jan 2023},
  pipeline  = {text-generation},
  github    = {https://github.com/SRDdev},
  linkedin  = {https://www.linkedin.com/in/srddev}
}
```