---
license: apache-2.0
---

# About this model

This is a fine-tuned version of `mistralai/Mistral-Nemo-Base-2407` for materials property prediction. The model was trained to predict bandgaps on the Materials Project dataset. Please refer to the example snippet below for the input template and usage.

This model is part of the hands-on materials for [DxMT AIMHack 2024](https://dxmt.mext.go.jp/news/1105). For the fine-tuning and data-retrieval scripts, please visit the GitHub repository: [llm4mat-tutorial](https://github.com/resnant/llm4mat-tutorial).

# Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ysuz/Mistral-Nemo-Base-2407-bandgap"

# Load the tokenizer and the model in half precision, spread across available devices
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    torch_dtype=torch.float16,
)

# Example input context: a pymatgen-style text description of the crystal structure.
# (The "Output:" marker is appended by the prompt template below, so it is not
# included here.)
structure_text = """
Reduced Formula: BaSrI4
abc   :   5.807091   5.807091   8.251028
angles:  90.000000  90.000000  90.000000
pbc   :       True       True       True
space group: ('P4/mmm', 123)
Sites (6)
  #  SP      a    b         c    magmom
  0  Ba    0.5  0.5  0              -0
  1  Sr    0    0    0.5            -0
  2  I     0    0.5  0.257945        0
  3  I     0.5  0    0.257945        0
  4  I     0    0.5  0.742055        0
  5  I     0.5  0    0.742055        0
"""

# The instruction wording matches the template used during fine-tuning; keep it verbatim.
prompt = f"Instruction: What is the bandgap value of following material?:\n{structure_text}\n\nOutput:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    tokens = model.generate(
        **inputs,
        max_new_tokens=256,
        do_sample=True,
        temperature=0.5,
        top_p=0.9,
        repetition_penalty=1.05,
    )

generated_text = tokenizer.decode(tokens[0], skip_special_tokens=True)
print(f"Generated raw text:\n{generated_text}\n\n")
```

# Model performance

This model achieves an MAE of 0.33 eV on the test set, randomly split from the Materials Project dataset (113,568 instances for training and 12,618 for testing). This is a promising baseline for materials property prediction with Large Language Models. For comparison, one of the current state-of-the-art models, [CrystalFormer](https://omron-sinicx.github.io/crystalformer/), achieves an MAE of around 0.20 eV using roughly 60k training examples from the Materials Project dataset.
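
The usage snippet above prints the raw generated text, which echoes the prompt and appends the predicted value after the final `Output:` marker. A minimal post-processing sketch follows; the `parse_bandgap` helper and its regex are illustrative assumptions, not part of the released scripts:

```python
import re

def parse_bandgap(generated_text: str):
    """Return the first number after the last 'Output:' marker, or None.

    Illustrative helper (an assumption, not from the official repository):
    assumes the model appends a plain numeric value interpreted as eV.
    """
    _, _, tail = generated_text.rpartition("Output:")
    match = re.search(r"[-+]?\d+(?:\.\d+)?", tail)
    return float(match.group()) if match else None

# Continuing from the usage snippet above
print(f"Predicted bandgap: {parse_bandgap(generated_text)} eV")
```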
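
Given such a parser, an MAE of the kind reported above could be computed roughly as follows. This is a hypothetical sketch, not the actual evaluation script (which lives in the linked GitHub repository); `test_set`, the greedy decoding settings, and the reuse of `parse_bandgap`, `tokenizer`, and `model` from the snippets above are all assumptions:

```python
def evaluate_mae(test_set):
    """Mean absolute error over (structure_text, true_bandgap_eV) pairs.

    Hypothetical sketch: `test_set` stands in for the held-out split,
    greedy decoding is used for determinism, and outputs that cannot be
    parsed into a number are skipped.
    """
    errors = []
    for structure_text, true_gap in test_set:
        prompt = (
            "Instruction: What is the bandgap value of following material?:"
            f"\n{structure_text}\n\nOutput:\n"
        )
        inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
        with torch.no_grad():
            tokens = model.generate(**inputs, max_new_tokens=16, do_sample=False)
        pred = parse_bandgap(tokenizer.decode(tokens[0], skip_special_tokens=True))
        if pred is not None:
            errors.append(abs(pred - true_gap))
    return sum(errors) / len(errors)
```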