---
license: llama3
datasets:
- jpgard/t4-full
language:
- en
---

This repository contains TabuLa-8B, a foundation model for prediction (classification and binned regression) on tabular data.

TabuLa-8B is described in the paper "Large Scale Transfer Learning for Tabular Data via Language Modeling."

For more details, see the paper, which includes a Model Card describing the model architecture, training procedure, and evaluation.
TabuLa-8B was trained with [rtfm](https://github.com/mlfoundations/rtfm)
on the [t4 dataset](https://huggingface.co/datasets/mlfoundations/t4-full).

# Usage and Examples

We will add usage examples of the model soon!
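In the meantime, a minimal sketch of loading the model with the Hugging Face `transformers` library might look like the following. The repository id (`mlfoundations/tabula-8b`) and the row-serialization format are placeholder assumptions, not the official TabuLa-8B interface; see the [rtfm](https://github.com/mlfoundations/rtfm) repository for the canonical serialization and inference pipeline.

```python
# Sketch only: the repo id and the prompt format below are placeholder
# assumptions, not the official TabuLa-8B interface; see the rtfm repo
# for the canonical serialization and inference code.

MODEL_ID = "mlfoundations/tabula-8b"  # assumed Hugging Face repo id


def serialize_row(features: dict, target_column: str) -> str:
    """Flatten one tabular row into a text prompt (illustrative format only)."""
    parts = [f"{name} is {value}." for name, value in features.items()]
    return " ".join(parts) + f" What is the value of {target_column}?"


def predict(prompt: str, max_new_tokens: int = 8) -> str:
    """Generate a prediction string from the fine-tuned causal LM."""
    from transformers import AutoModelForCausalLM, AutoTokenizer  # heavy import

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the newly generated tokens, not the echoed prompt.
    return tokenizer.decode(
        out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
    )


# Example (downloads the full 8B-parameter weights):
# prompt = serialize_row({"age": 39, "education": "Bachelors"}, "income")
# print(predict(prompt))
```

Since TabuLa-8B is a fine-tuned Llama-3 causal language model, any standard `transformers` text-generation workflow should apply once the correct row-to-text serialization is used.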

# License and Terms of Use

TabuLa-8B is fine-tuned from the Llama-3 8B model. 
As a result, we release it under the [Llama 3 license](https://llama.meta.com/llama3/license/), 
and by using the model you agree to abide by the [Llama 3 Community License Agreement](https://llama.meta.com/llama3/license/) 
and the Llama 3 [Acceptable Use Policy](https://llama.meta.com/llama3/use-policy/).