# Contract Drafter v1
A fine-tune of Equall/Saul-7B-Instruct-v1 for drafting legal contract clauses.
## Training
- Method: SFT + LoRA (r=16, alpha=32)
- Examples: 126 across 9 contract types (SaaS, MSA, NDA, DPA, SOW, Vendor, Consulting, IP Assignment, Employment)
- Epochs: 3
- Learning rate: 2e-4 with cosine schedule
- Final loss: 0.568 (down from 2.608)
- Token accuracy: 96.9%
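The cosine learning-rate schedule above can be sketched as follows. This is a minimal illustration of the decay from the 2e-4 peak over the training run; it omits warmup and any trainer-specific details, which may differ from the actual configuration:

```python
import math

def cosine_lr(step: int, total_steps: int, peak_lr: float = 2e-4) -> float:
    """Cosine-decayed learning rate: starts at peak_lr and decays to ~0
    by the final step (warmup omitted for brevity)."""
    progress = min(step / total_steps, 1.0)
    return 0.5 * peak_lr * (1.0 + math.cos(math.pi * progress))
```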
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "narcolepticchicken/contract-drafter-v1",
    device_map="auto",
    torch_dtype="auto",
)
tokenizer = AutoTokenizer.from_pretrained("narcolepticchicken/contract-drafter-v1")

prompt = """<s>[INST] You are an expert contract drafter.
Draft a limitation_of_liability clause for a SaaS agreement.
Deal: Enterprise SaaS platform, $200K ACV, SOC 2 Type II.
Constraints: annual billing, 99.9% uptime SLA.
Law: Delaware
Draft ONLY the clause text. [/INST]"""

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
# do_sample=True is required for temperature to take effect
outputs = model.generate(**inputs, max_new_tokens=600, temperature=0.3, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
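For drafting other clause types, a small helper can keep prompts consistent with the `[INST]` template shown above. This helper is hypothetical (not shipped with the model); only the overall template format matters:

```python
def build_prompt(clause_type: str, contract_type: str,
                 deal: str, constraints: str, law: str) -> str:
    """Assemble an instruction prompt in the [INST] format used above.
    This helper is illustrative, not part of the released model."""
    return (
        "<s>[INST] You are an expert contract drafter.\n"
        f"Draft a {clause_type} clause for a {contract_type}.\n"
        f"Deal: {deal}\n"
        f"Constraints: {constraints}\n"
        f"Law: {law}\n"
        "Draft ONLY the clause text. [/INST]"
    )
```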
## Performance
Evaluated against the base model on 7 contract types. See contract-drafts-v1 for full results.
## Files
- `model.safetensors` – merged weights (3.8 GB, 4-bit)
- `lora/` – LoRA adapter for future fine-tuning
- `tokenizer.json` – SaulLM tokenizer (Mistral-based)
## Related
- contract-drafting-assistant Space – interactive drafting UI
- contract-clause-index-v1 – 37,508 real contract clauses
- contract-nli-v1 – legal-domain NLI model