---
license: llama3
language:
- tr
- en
base_model: meta-llama/Meta-Llama-3-8B-Instruct
model-index:
- name: MARS
  results:
    - task:
        type: text-generation
        name: Text Generation
      dataset:
        name: AI2 Reasoning Challenge TR
        type: ai2_arc
        config: ARC-Challenge
        split: test
        args:
          num_few_shot: 25
      metrics:
        - type: acc
          value: 46.08
          name: accuracy
    - task:
        type: text-generation
        name: Text Generation
      dataset:
        name: MMLU TR
        type: cais/mmlu
        config: all
        split: test
        args:
          num_few_shot: 5
      metrics:
        - type: acc
          value: 47.02
          name: accuracy
    - task:
        type: text-generation
        name: Text Generation
      dataset:
        name: TruthfulQA TR
        type: truthful_qa
        config: multiple_choice
        split: validation
        args:
          num_few_shot: 0
      metrics:
        - type: acc
          name: accuracy
          value: 49.38
    - task:
        type: text-generation
        name: Text Generation
      dataset:
        name: Winogrande TR
        type: winogrande
        config: winogrande_xl
        split: validation
        args:
          num_few_shot: 5
      metrics:
        - type: acc
          value: 53.71
          name: accuracy
    - task:
        type: text-generation
        name: Text Generation
      dataset:
        name: GSM8k TR
        type: gsm8k
        config: main
        split: test
        args:
          num_few_shot: 5
      metrics:
        - type: acc
          value: 53.08
          name: accuracy
---

<p style="align-self: center">
    <img src="MARS-1.0.png" alt="Curiosity MARS model logo" style="border-radius: 1rem">
</p>

<div style="display: flex; justify-content: center; align-items: center; flex-direction: column">
    <h1 style="font-size: 5em; margin-bottom: 0; padding-bottom: 0;">MARS</h1>
    <aside>by <a href="https://curiosity.tech">Curiosity Technology</a></aside>
</div>

MARS is the first iteration of Curiosity Technology models, based on Llama 3 8B.

We trained MARS on an in-house Turkish dataset, as well as several open-source datasets and their Turkish
translations.
We intend to release these Turkish translations in the near future so the community can experiment with them.

MARS was trained for 3 days on 4x A100 GPUs.
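
Below is a minimal inference sketch using the 🤗 Transformers library. The repository id `curiositytech/MARS`, the prompt, and the generation settings are illustrative assumptions; adjust them to the published checkpoint.

```python
# Minimal inference sketch (repo id and generation settings are assumptions).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "curiositytech/MARS"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    # "You are a helpful Turkish assistant." / "Can you write a short paragraph about Istanbul?"
    {"role": "system", "content": "Sen yardımsever bir Türkçe asistansın."},
    {"role": "user", "content": "İstanbul hakkında kısa bir paragraf yazar mısın?"},
]

# MARS is Llama 3 based, so the Llama 3 chat template and <|eot_id|> terminator apply.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(
    input_ids,
    max_new_tokens=256,
    eos_token_id=[tokenizer.eos_token_id, tokenizer.convert_tokens_to_ids("<|eot_id|>")],
    do_sample=True,
    temperature=0.6,
    top_p=0.9,
)
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))
```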

## Model Details

- **Base Model**: Meta Llama 3 8B Instruct
- **Training Dataset**: In-house & Translated Open Source Turkish Datasets
- **Training Method**: LoRA Fine Tuning
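
For reference, the sketch below shows a minimal LoRA setup with the `peft` library on the same base model. The rank, alpha, dropout, and target modules are illustrative assumptions, not the exact configuration used to train MARS.

```python
# Illustrative LoRA configuration sketch with peft
# (hyperparameters are assumptions, not the values used for MARS).
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")

lora_config = LoraConfig(
    r=16,                          # assumed adapter rank
    lora_alpha=32,                 # assumed scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed attention projections
    task_type="CAUSAL_LM",
)

model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()  # only the low-rank adapter weights are trainable
```

With LoRA, only the small adapter matrices are updated during fine-tuning, which is what makes training an 8B-parameter model on a 4x A100 node in a few days practical.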