---
license: apache-2.0
language:
  - en
pipeline_tag: summarization
widget:
  - text: What is the peak phase of T-eV?
    example_title: Question Answering
tags:
  - arxiv
---

Table of Contents

  1. TL;DR
  2. Model Description
  3. Usage
  4. Training Data
  5. Citation

TL;DR

phi-chemistry is a Phi-1_5 model fine-tuned on the camel-ai/chemistry dataset. It is intended for research purposes only and should not be used in production settings.

Model Description

  • Model type: Language model
  • Language(s) (NLP): English
  • License: Apache 2.0
  • Related Models: Phi-1_5

Usage

Below is an example script showing how to use the model with the transformers library:

Using the PyTorch model


```python
from transformers import AutoModelForCausalLM, AutoTokenizer, TextStreamer

base_model = "ArtifactAI/phi-chemistry"

# Load the fine-tuned model and tokenizer from the Hugging Face Hub
model = AutoModelForCausalLM.from_pretrained(base_model, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(base_model, trust_remote_code=True)

def generate(prompt):
    # Wrap the question in the instruction template used during fine-tuning
    inputs = tokenizer(
        "Below is an instruction that describes a task. Write a response that appropriately "
        "completes the request. If you are adding additional white spaces, stop writing.\n\n"
        f"### Instruction:\n{prompt}.\n\n### Response:\n",
        return_tensors="pt",
        return_attention_mask=False,
    )
    # Stream the generated tokens to stdout, skipping the echoed prompt
    streamer = TextStreamer(tokenizer, skip_prompt=True)
    _ = model.generate(**inputs, streamer=streamer, max_new_tokens=500)

generate("What is the IUPAC name for the organic compound with the molecular formula C6H12O2?")
```

Training Data

The model was fine-tuned on camel-ai/chemistry, a dataset of chemistry question/answer pairs. The questions were generated with the t5-base model, and the answers were generated with the GPT-3.5-turbo model.
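
For reference, the question/answer pairs can be loaded directly from the Hugging Face Hub and rendered into the instruction template shown in the Usage section. The sketch below assumes the camel-ai column names `message_1` (question) and `message_2` (answer); verify them against the dataset card before use.

```python
from datasets import load_dataset

# Load the camel-ai/chemistry question/answer pairs from the Hugging Face Hub
dataset = load_dataset("camel-ai/chemistry", split="train")

def to_instruction_example(row):
    # Column names message_1 (question) and message_2 (answer) are assumptions
    # based on the published camel-ai dataset schema; check the dataset card.
    prompt = (
        "Below is an instruction that describes a task. Write a response that appropriately "
        "completes the request. If you are adding additional white spaces, stop writing.\n\n"
        f"### Instruction:\n{row['message_1']}.\n\n### Response:\n{row['message_2']}"
    )
    return {"text": prompt}

formatted = dataset.map(to_instruction_example)
print(formatted[0]["text"][:500])
```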

Citation

```bibtex
@misc{phi-chemistry,
    title={phi-chemistry},
    author={Matthew Kenney},
    year={2023}
}
```