
FIDO-GPT: Generative AI behind "Fidonet Cybernetic Immortality" Project

FIDONet is a historic computer network, popular in the 1990s, based on nightly mail exchange between servers over telephone lines. In the FIDONet Cybernetic Immortality Project, we aim to create exhibits that revive the now almost-dead FIDONet by automatically writing correspondence in FIDONet style with generative large language models.

This model generates posts in Russian. If you are looking for the English-language version of FIDO-GPT, take a look at the fido-gpt model.

This model is based on the ruGPT3-small model and was fine-tuned for 1 epoch on a subset of selected FIDONet archives from 2013-2015. The following echo areas were included: ru.anekdot, ru.anekdot.digest, ru.anekdot.filtered, ru.anekdot.talks, ru.anekdot.the.best, ru.computer.humor, ru.cpp, ru.dos, ru.film, ru.java, ru.linguist, ru.linux, ru.moderator, ru.pascal, ru.photo, ru.physics, ru.psychology.
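
The preprocessing pipeline is not published in this card. As a purely hypothetical sketch, each echomail message could be flattened into a single training line prefixed with its echo area, with real newlines escaped as literal \n; this matches the prompt format and the .replace('\\n','\n') step used in the generation example below. The file layout and field names here are assumptions, not part of the card.

# Hypothetical preprocessing sketch: one echomail message becomes one training line,
# prefixed with its echo area; real newlines are escaped as literal "\n".
import glob
import json

def message_to_sample(area: str, body: str) -> str:
    flat_body = body.replace('\n', '\\n')        # escape real newlines
    return f"<s>Area: {area}\\n{flat_body}</s>"  # one message = one line

with open('train.txt', 'w', encoding='utf-8') as out:
    for path in glob.glob('archives/*.jsonl'):   # assumed export format
        with open(path, encoding='utf-8') as f:
            for line in f:
                msg = json.loads(line)           # assumed fields: 'area', 'text'
                out.write(message_to_sample(msg['area'], msg['text']) + '\n')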

The total training dataset size was 552 MB. Training took around 3 hours on an NVIDIA V100 GPU in the Yandex DataSphere service.
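
The training script itself is likewise not included. A minimal fine-tuning sketch with the Hugging Face Trainer, assuming the plain-text file produced above and the publicly available ruGPT3-small base checkpoint (the exact checkpoint id is an assumption), might look like this:

# Minimal fine-tuning sketch (not the author's actual script): one epoch of
# causal-LM training of ruGPT3-small on a plain-text file of FIDONet messages.
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer, TrainingArguments)
from datasets import load_dataset

base_model = 'ai-forever/rugpt3small_based_on_gpt2'   # assumed base checkpoint id
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)
if tokenizer.pad_token is None:
    tokenizer.pad_token = tokenizer.eos_token         # GPT-2-style tokenizers may lack a pad token

dataset = load_dataset('text', data_files={'train': 'train.txt'})

def tokenize(batch):
    return tokenizer(batch['text'], truncation=True, max_length=512)

tokenized = dataset['train'].map(tokenize, batched=True, remove_columns=['text'])
collator = DataCollatorForLanguageModeling(tokenizer, mlm=False)  # causal LM objective

args = TrainingArguments(output_dir='fido-rugpt3-small',
                         num_train_epochs=1,          # matches the 1 epoch stated above
                         per_device_train_batch_size=4,
                         fp16=True)                   # mixed precision on the V100

Trainer(model=model, args=args, train_dataset=tokenized,
        data_collator=collator).train()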

The following code can be used for generation:

from transformers import pipeline, AutoModelForCausalLM, AutoTokenizer
import torch

model_name = 'shwars/fido-rugpt3-small'

# Load the fine-tuned model and its tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained(model_name)
tokenizer = AutoTokenizer.from_pretrained(model_name)

# Build a text-generation pipeline on the GPU
pipe = pipeline(task="text-generation", model=model, tokenizer=tokenizer, device="cuda")

# Prompt with an echo area name; the model emits literal "\n" for line breaks,
# which we convert back into real newlines
result = pipe("<s>Area: ru.anekdot", do_sample=True, max_length=500)[0]['generated_text'].replace('\\n', '\n')
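
For example, to sample several continuations with explicit sampling parameters (the parameter values below are illustrative, not prescribed by the card):

# Generate three samples from ru.anekdot with nucleus sampling (illustrative settings)
outputs = pipe("<s>Area: ru.anekdot",
               do_sample=True, top_p=0.95, temperature=0.9,
               max_length=300, num_return_sequences=3)
for out in outputs:
    # Convert escaped newlines back into real line breaks, as above
    print(out['generated_text'].replace('\\n', '\n'))
    print('-' * 40)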

Project idea and model training: Dmitry Soshnikov
