---
library_name: transformers
pipeline_tag: text2text-generation
license: mit
datasets:
  - neovalle/H4rmony
language:
  - en
tags:
  - Environment
  - climate
  - ecology
  - ecolinguistics
---

# Model Card for neovalle/H4rmoniousCaramel


## Model Details

### Model Description

This model is a fine-tuned version of google/flan-t5-large, trained on the H4rmony dataset, which aims to better align the model with ecological values through ecolinguistics principles.

- Developed by: Jorge Vallego
- Funded by: Neovalle Ltd.
- Shared by: airesearch@neovalle.co.uk
- Model type: T5 language model
- Language(s) (NLP): Primarily English
- License: MIT
- Finetuned from model: google/flan-t5-large

## Uses

Intended as a proof of concept to show the effect of the H4rmony dataset.

### Direct Use

For testing purposes, to gain insights that help with the continuous improvement of the H4rmony dataset.

### Downstream Use

Use in downstream applications is not recommended, as this model is still under testing for a specific task only.

### Out-of-Scope Use

Not meant to be used for anything other than testing and evaluating the H4rmony dataset.

## Bias, Risks, and Limitations

This model might reproduce biases already present in the base model, as well as biases unintentionally introduced during fine-tuning.

## How to Get Started with the Model

The model can be loaded and run in a free Colab instance.

Code to load the base and fine-tuned models and compare their outputs:

https://github.com/Neovalle/H4rmony/blob/main/H4rmoniousCaramel.ipynb
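As a quick alternative to the notebook, the minimal sketch below loads both models with the standard transformers seq2seq API and prints their completions side by side; the prompt is purely illustrative, and the linked notebook remains the canonical comparison code.

```python
# Minimal sketch: compare the base flan-t5-large model with the H4rmony fine-tune.
# The prompt is illustrative only; see the notebook linked above for the full comparison.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

def generate(model_id: str, prompt: str, max_new_tokens: int = 128) -> str:
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

prompt = "What are some benefits of cutting down a forest?"  # illustrative prompt
for model_id in ("google/flan-t5-large", "neovalle/H4rmoniousCaramel"):
    print(f"--- {model_id} ---")
    print(generate(model_id, prompt))
```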

## Training Details

### Supervised Fine-Tuning

### Training Data

H4rmony Dataset - https://huggingface.co/datasets/neovalle/H4rmony
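For orientation only, the sketch below shows a generic supervised fine-tuning setup with the Hugging Face Seq2SeqTrainer on the H4rmony dataset. The column names (`prompt`, `completion`) and all hyperparameters are assumptions for illustration, not the exact configuration used to train this model.

```python
# Rough SFT sketch. Assumptions: dataset columns "prompt"/"completion" (hypothetical,
# adjust to the actual H4rmony schema) and illustrative hyperparameters.
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForSeq2SeqLM,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)

base = "google/flan-t5-large"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSeq2SeqLM.from_pretrained(base)

dataset = load_dataset("neovalle/H4rmony")

def preprocess(batch):
    # Tokenize inputs and targets; column names here are assumed, not confirmed.
    model_inputs = tokenizer(batch["prompt"], truncation=True, max_length=512)
    labels = tokenizer(text_target=batch["completion"], truncation=True, max_length=512)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = dataset["train"].map(preprocess, batched=True,
                                 remove_columns=dataset["train"].column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="h4rmonious-caramel-sft",
        per_device_train_batch_size=4,
        num_train_epochs=3,
        learning_rate=2e-5,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```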