Quantization made by Richard Erkhov.
hebrew-bad_wiki-gpt_neo-tiny - bnb 8bits
- Model creator: https://huggingface.co/Norod78/
- Original model: https://huggingface.co/Norod78/hebrew-bad_wiki-gpt_neo-tiny/
Original model description:
language: he
thumbnail: https://avatars1.githubusercontent.com/u/3617152?norod.jpg
widget:
- text: "מתמטיקה:"
- text: "עליית המכונות"
- text: "ויקיפדיה העברית"
- text: "האירוויזיון הוא"
- text: "דוד בן-גוריון היה"
license: mit
hebrew-bad_wiki-gpt_neo-tiny
Table of Contents
- Model Details
- Uses
- Risks, Limitations and Biases
- Training
- Evaluation
- Environmental Impact
- How to Get Started With the Model
Model Details
Model Description:
The model developer describes the model as a "Hebrew nonsense generation model which produces really bad wiki-abstract text."
- Developed by: Doron Adler
- Model Type: Text Generation
- Language(s): Hebrew
- License: MIT
- Resources for more information:
- GitHub Repo
- HuggingFace Space
Uses
Direct Use
This model can be used for text generation.
Misuse and Out-of-scope Use
Risks, Limitations and Biases
CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.
Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).
Training
Training Data
Hebrew Wikipedia Dump (hewiki abstract) from May 2020
Training Procedure
This model was fine-tuned from hebrew-gpt_neo-tiny, which was previously trained using EleutherAI's gpt-neo.
Fine-tuning on the wiki-abstract text was done using @minimaxir's aitextgen.
Evaluation
Configs
The model config for hebrew-gpt_neo-tiny is available in the hebrew-gpt_neo model GitHub repository:
- Activation Function: gelu
- Number_Head: 12
- Number_Vocab: 50257
- Train batch size: 250
- Eval batch size: 64
- Predict batch size: 1
Environmental Impact
Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.
Hardware Type: [More information needed]
Hours used: Unknown
Cloud Provider: GCP tpu-v8s
Compute Region: europe-west4
Carbon Emitted: [More information needed]
How to Get Started With the Model
A Google Colab Notebook is also available here
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the tokenizer and model weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")
model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")
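Once loaded, text can be generated with standard sampling. A minimal sketch using one of the widget prompts from this card; the generation parameters here (`max_length`, `top_k`, `top_p`) are illustrative assumptions, not the author's settings:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")
model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")

# One of the widget prompts from this card ("the Hebrew Wikipedia")
prompt = "ויקיפדיה העברית"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; parameters are illustrative
outputs = model.generate(
    **inputs,
    max_length=100,
    do_sample=True,
    top_k=50,
    top_p=0.95,
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

As the card notes, the output is intentionally low-quality "bad wiki-abstract" Hebrew text.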