---
license: mit
---
<h1>Transformer Encoder for Social Science (TESS)</h1>

TESS is a deep neural network model intended for social science-related NLP tasks. The model was developed by Haosen Ge, In Young Park, Xuancheng Qian, and Grace Zeng.

We demonstrate in two validation tests that TESS outperforms BERT and RoBERTa by 16.7% on average, especially when the number of training samples is limited (fewer than 1,000 training instances). These results demonstrate the advantage of TESS on social science text-processing tasks.

GitHub: [TESS](https://github.com/haosenge/TESS).
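
A minimal usage sketch with the Hugging Face `transformers` library is shown below. The model ID `haosenge/TESS` and the classification head (`num_labels=2`) are assumptions for illustration; check the GitHub repository above for the published checkpoint name and the task head appropriate to your data.

```python
# Hypothetical usage sketch for TESS via Hugging Face transformers.
# The model ID "haosenge/TESS" is an assumption, not a confirmed
# checkpoint name -- consult the repository before running.
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "haosenge/TESS"  # assumed model ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

text = "The agreement reduces tariffs on industrial goods."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
outputs = model(**inputs)
print(outputs.logits)  # raw classification scores, one row per input
```

Fine-tuning follows the standard `transformers` workflow (e.g. `Trainer` or a custom PyTorch loop) once the checkpoint is loaded.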

<h2>Training Corpus</h2>

|     TEXT      |    SOURCE     |
| ------------- | ------------- |
| Preferential Trade Agreements | ToTA |
| Congressional Bills | Kornilova and Eidelman (2019) |
| UNGA Resolutions | UN |
| Firms' Annual Reports | Loughran and McDonald (2016) |
| U.S. Court Opinions | Caselaw Access Project |

The model was trained on 4 NVIDIA A100 GPUs for 120K steps.