
Quantization made by Richard Erkhov.

Github | Discord | Request more models

hebrew-bad_wiki-gpt_neo-tiny - bnb 8bits

Original model description:

language: he

thumbnail: https://avatars1.githubusercontent.com/u/3617152?norod.jpg

widget:

  • text: "מתמטיקה:" ("Mathematics:")
  • text: "עליית המכונות" ("Rise of the Machines")
  • text: "ויקיפדיה העברית" ("The Hebrew Wikipedia")
  • text: "האירוויזיון הוא" ("The Eurovision is")
  • text: "דוד בן-גוריון היה" ("David Ben-Gurion was")

license: mit

hebrew-bad_wiki-gpt_neo-tiny

Model Details

Model Description:

The model developer describes it as a "Hebrew nonsense generation model which produces really bad wiki-abstract text."

Uses

Direct Use

This model can be used for text generation.
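For a quick look at the model's output, a minimal sketch using transformers' `text-generation` pipeline; the prompt is one of the widget examples from the card, and the sampling settings are illustrative rather than the author's:

```python
from transformers import pipeline

# Load the model through the high-level text-generation pipeline
generator = pipeline("text-generation", model="Norod78/hebrew-bad_wiki-gpt_neo-tiny")

# "מתמטיקה:" ("Mathematics:") is one of the card's widget prompts
result = generator("מתמטיקה:", max_length=50, do_sample=True)
print(result[0]["generated_text"])
```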

Misuse and Out-of-scope Use

Risks, Limitations and Biases

CONTENT WARNING: Readers should be aware this section contains content that is disturbing, offensive, and can propagate historical and current stereotypes.

Significant research has explored bias and fairness issues with language models (see, e.g., Sheng et al. (2021) and Bender et al. (2021)).

Training

Training Data

Hebrew Wikipedia Dump (hewiki abstract) from May 2020

Training Procedure

This model was fine-tuned from hebrew-gpt_neo-tiny, which was previously trained using EleutherAI's gpt-neo.

Fine-tuning on the wiki-abstract text was done using @minimaxir's aitextgen.

Evaluation

Configs

The model config for hebrew-gpt_neo-tiny is available in the hebrew-gpt_neo GitHub repository.

  • Activation Function: gelu
  • Number_Head: 12
  • Number_Vocab: 50257
  • Train batch size: 250
  • Eval batch size: 64
  • Predict batch size: 1
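The values above can be checked against the config shipped with the model; a minimal sketch, assuming transformers' standard `AutoConfig` API (field names follow `GPTNeoConfig`):

```python
from transformers import AutoConfig

# Fetch only config.json for the model (no weights are downloaded)
config = AutoConfig.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")

# Fields corresponding to the list above: Number_Vocab, Number_Head,
# and the activation function
print(config.vocab_size, config.num_heads, config.activation_function)
```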

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019). We present the hardware type based on the associated paper.

  • Hardware Type: [More information needed]

  • Hours used: Unknown

  • Cloud Provider: GCP tpu-v8s

  • Compute Region: europe-west4

  • Carbon Emitted: [More information needed]

How to Get Started With the Model

A Google Colab Notebook is also available here.

from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")

model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")
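Generation from the loaded model can be sketched as follows; the prompt is one of the card's widget examples, and the sampling parameters (`top_k`, `top_p`, `max_length`) are illustrative assumptions, not the author's published settings:

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")
model = AutoModelForCausalLM.from_pretrained("Norod78/hebrew-bad_wiki-gpt_neo-tiny")

# One of the card's widget prompts: "ויקיפדיה העברית" ("The Hebrew Wikipedia")
prompt = "ויקיפדיה העברית"
inputs = tokenizer(prompt, return_tensors="pt")

# Sampling settings below are illustrative, not the author's settings
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        do_sample=True,
        top_k=40,
        top_p=0.92,
        max_length=100,
        pad_token_id=tokenizer.eos_token_id,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```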

Model size: 81.9M params (Safetensors) · Tensor types: F32, FP16, I8