---
license: mit
datasets:
- GainEnergy/gpt-4o-oilandgas-trainingset
base_model:
- qihoo360/TinyR1-32B-Preview
library_name: transformers
tags:
- oil-gas
- drilling-engineering
- retrieval-augmented-generation
- finetuned
- energy-ai
- tiny-r1-32b
- lora
model-index:
- name: OGAI-R1
  results:
  - task:
      type: text-generation
      name: Oil & Gas Engineering AI
    dataset:
      name: GainEnergy GPT-4o Oil & Gas Training Set
      type: custom
    metrics:
    - name: Engineering Calculations Accuracy
      type: accuracy
      value: 94.3
    - name: Technical Document Retrieval Precision
      type: precision
      value: 90.5
    - name: Context Retention
      type: contextual-coherence
      value: High
---

# OGAI-R1: Oil & Gas AI Model for Engineering & Technical Knowledge

![Hugging Face](https://img.shields.io/badge/HuggingFace-OGAI--R1-blue)
[![License: MIT](https://img.shields.io/badge/License-MIT-yellow.svg)](LICENSE)

**OGAI-R1** is a **fine-tuned version of TinyR1-32B** designed specifically for **oil and gas engineering applications**. It is optimized for **engineering calculations, wellbore stability analysis, reservoir management, and document-based retrieval-augmented generation (RAG)**.

The model was trained on **GainEnergy's GPT-4o Oil & Gas Training Set**, which combines expert knowledge, technical formulas, and structured query-response interactions.

## 🏗 **Why Use OGAI-R1?**
- **🚀 Fine-tuned for oil & gas engineering tasks** (drilling, production, reservoir, and refining).
- **💡 Optimized for RAG** – Enhanced document understanding and retrieval.
- **📚 Long-Context Retention** – Handles **up to 32K tokens** for complex engineering workflows.
- **⚡ LoRA Fine-Tuning on TinyR1-32B** – Parameter-efficient adaptation of the base model, keeping the domain fine-tune lightweight (an illustrative configuration sketch follows this list).
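
For reference, a LoRA setup for a model like this can be expressed with the Hugging Face `peft` library. The sketch below is illustrative only: the rank, alpha, and target modules are assumptions, not the configuration used to train OGAI-R1.

```python
# Illustrative LoRA configuration sketch (hyperparameters are assumptions,
# not the settings used to train OGAI-R1).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model

base_id = "qihoo360/TinyR1-32B-Preview"  # base model listed in the card metadata

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Attach low-rank adapters to the attention projections; the 32B base
# weights stay frozen, so only a small fraction of parameters is trained.
lora_config = LoraConfig(
    r=16,                      # adapter rank (assumed)
    lora_alpha=32,             # scaling factor (assumed)
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # reports how few weights are trainable
```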

---

## 🛠 **How to Use OGAI-R1**

### **1️⃣ Install Required Dependencies**
```bash
pip install torch transformers accelerate bitsandbytes
```

### **2️⃣ Load the Model**
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "GainEnergy/OGAI-R1"

# Load tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_id)

# Load model in half precision and let accelerate place it on available GPUs
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Run inference
prompt = "Explain the principles of reservoir simulation in petroleum engineering."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
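
The install step above pulls in `bitsandbytes`, which is only needed if you want to load the model with 4-bit quantization to reduce VRAM usage. A minimal sketch using the standard `transformers` + `bitsandbytes` integration follows; the quantization settings are illustrative, not an official recommendation.

```python
# Optional: load OGAI-R1 in 4-bit to fit on smaller GPUs
# (settings are illustrative, not an official recommendation).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "GainEnergy/OGAI-R1"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",             # NormalFloat4 quantization
    bnb_4bit_compute_dtype=torch.float16,  # compute in half precision
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",
)
```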

---

## 📦 **Model Variants**
| **Model Name** | **Base Model** | **Precision** | **Context Window** | **Use Case** |
|--------------|--------------|--------------|--------------|--------------|
| **OGAI-R1** | TinyR1-32B | FP16 | 32K tokens | **Engineering Calculations & RAG** |
| **OGAI-8x7B** | Mixtral-8x7B | 4-bit | 32K tokens | Oil & Gas AI Assistant |
| **OGAI-Reasoner** | DeepSeek-R1 | FP16 | 128K tokens | Logical Reasoning & AI Simulation |

---

## 📌 **Key Capabilities**
✅ **Engineering Calculations** – Computes reservoir volumes, wellbore stability, mud weight, casing depth, and more.
✅ **Technical Document Understanding** – Trained on oil and gas **technical literature, drilling reports, and engineering manuals**.
✅ **Retrieval-Augmented Generation (RAG)** – Enhances AI-driven document retrieval for faster decision-making (see the prompt-assembly sketch below).
✅ **High-Context Retention (32K tokens)** – Supports **long technical reports, operational workflows, and AI-driven engineering analysis**.
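
To make the RAG workflow concrete, here is a minimal prompt-assembly sketch. It assumes `model` and `tokenizer` are already loaded as shown above; the toy keyword-overlap retriever and the example document chunks stand in for a real vector store and embedding model.

```python
# Minimal RAG-style prompt assembly (illustrative; a production setup would use
# a vector store and embedding model instead of keyword overlap).
docs = [
    "Mud weight must stay between the pore pressure and fracture gradients.",
    "Casing setting depth is selected from the pore pressure / fracture gradient plot.",
    "Reservoir simulation discretizes the reservoir into grid blocks and solves flow equations.",
]

def retrieve(query: str, documents: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive keyword overlap with the query."""
    q_terms = set(query.lower().split())
    ranked = sorted(documents, key=lambda d: -len(q_terms & set(d.lower().split())))
    return ranked[:k]

query = "How do I pick a safe mud weight window?"
context = "\n".join(f"- {chunk}" for chunk in retrieve(query, docs))

prompt = (
    "Use the following excerpts from drilling documents to answer the question.\n"
    f"Excerpts:\n{context}\n\n"
    f"Question: {query}\nAnswer:"
)

# Generate with the model loaded in the usage section above.
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```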

---

## 🚀 **Use Cases**
- **Wellbore Stability & Drilling Optimization**
- **Hydraulics & Fluid Flow Simulations**
- **Reservoir Engineering & Petrophysics Analysis**
- **AI-Powered Document Retrieval & RAG Workflows**
- **Technical Compliance & Regulatory Document Processing**

---

## 📡 **Deployment Options**
| **Platform** | **Compatible?** | **Recommended Setup** |
|-------------|----------------|-----------------------|
| **Hugging Face Inference API** | ✅ Yes | Deploy via `hf.co/GainEnergy/OGAI-R1` (client sketch below) |
| **RunPod.io (Serverless GPU)** | ✅ Yes | `A100-40GB` or `RTX 4090` |
| **AWS EC2 (G5 Instances)** | ✅ Yes | `g5.2xlarge` (8 vCPUs, 32 GB RAM, 1× A10G 24 GB) |
| **Local GPU (Consumer Hardware)** | ✅ Yes | Requires **≥16 GB VRAM with 4-bit quantization** (RTX 3090/4090) |
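
For the hosted option, a client call via `huggingface_hub` might look like the sketch below. It assumes the model is actually served behind the Hugging Face Inference API; a 32B model will typically need a dedicated Inference Endpoint rather than the free serverless tier.

```python
# Query a hosted OGAI-R1 endpoint with the huggingface_hub client
# (assumes the model is deployed; pass token=... if the endpoint is gated).
from huggingface_hub import InferenceClient

client = InferenceClient(model="GainEnergy/OGAI-R1")

response = client.text_generation(
    "List the main factors that control wellbore stability.",
    max_new_tokens=256,
    temperature=0.2,
)
print(response)
```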

---

## ⚠️ **Limitations**
🚧 **Optimized for Oil & Gas Engineering** – Not designed for general-purpose AI tasks.
🚧 **Requires domain-specific expertise** – Outputs should be validated by industry experts.
🚧 **Computational requirements** – Running the full 32B-parameter model in FP16 requires high-end GPUs; consumer cards need 4-bit quantization.

---

## 🔗 **Resources**
- **[GainEnergy AI Platform](https://gain.energy)** – Explore AI-powered drilling automation.
- **[Hugging Face Model Hub](https://huggingface.co/GainEnergy/OGAI-R1)** – Download & deploy the model.

---

## 📚 **Citing OGAI-R1**
```bibtex
@misc{ogai-r1-2025,
  title={OGAI-R1: An AI Model for Oil & Gas Engineering Optimization},
  author={GainEnergy AI Team},
  year={2025},
  publisher={Hugging Face},
  url={https://huggingface.co/GainEnergy/OGAI-R1}
}
```