---
datasets:
- LevMuchnik/SupremeCourtOfIsrael
language:
- he
base_model:
- avichr/heBERT
pipeline_tag: question-answering
---
# HeBERT Finetuned Legal Clauses

This model is a fine-tuned version of [avichr/heBERT](https://huggingface.co/avichr/heBERT) on the [LevMuchnik/SupremeCourtOfIsrael](https://huggingface.co/datasets/LevMuchnik/SupremeCourtOfIsrael) dataset, for extractive question answering over Hebrew legal documents.


## Training and evaluation data


| Dataset  | Split | # samples |
| -------- | ----- | --------- |
| SupremeCourtOfIsrael | train | 147,946 |
| SupremeCourtOfIsrael | validation | 36,987 |


## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- evaluation_strategy: "epoch"
- learning_rate: 2e-5
- train_batch_size: 16
- eval_batch_size: 16
- num_train_epochs: 3
- weight_decay: 0.01

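The hyperparameters above map onto the standard `Trainer` setup roughly as follows. This is an illustrative sketch, not the original training script: `output_dir` and any arguments not listed above are assumptions.

```python
from transformers import TrainingArguments

# Illustrative only: output_dir is a hypothetical path, and any argument
# not listed in the hyperparameter table above is a Trainer default.
training_args = TrainingArguments(
    output_dir="hebert-finetuned-legal-clauses",
    evaluation_strategy="epoch",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=3,
    weight_decay=0.01,
)
```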

### Framework versions

- Transformers 4.17.0
- Pytorch 1.10.0+cu111
- Datasets 1.18.4
- Tokenizers 0.11.6

### Results

| Metric       | Value    |
| ------------ | -------- |
| **Accuracy** | **0.96** |
| **F1**       | **0.96** |

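The card does not state how these metrics were computed. For extractive QA, a common choice is token-level F1 between the predicted and reference answer strings; the sketch below shows that definition for illustration only, and is not necessarily the metric used here.

```python
from collections import Counter

def token_f1(prediction: str, reference: str) -> float:
    """Token-level F1: harmonic mean of token precision and recall
    between a predicted answer string and a reference answer string."""
    pred_tokens = prediction.split()
    ref_tokens = reference.split()
    # Multiset intersection counts each shared token at most as often
    # as it appears in both strings.
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

print(token_f1("takana 406", "takana 406(b)"))  # → 0.5
```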

## Example Usage

```python
from transformers import pipeline

model_checkpoint = "shay681/HeBERT_finetuned_Legal_Clauses"
qa_pipeline = pipeline(
    "question-answering",
    model=model_checkpoint,
)

predictions = qa_pipeline({
    'context': "讘 讘讬转 讛诪砖驻讟 讛注诇讬讜谉 讘讬专讜砖诇讬诐 专注"讗 2225/01 讘驻谞讬: 讻讘讜讚 讛砖讜驻讟转 讟' 砖讟专住讘专讙-讻讛谉 讛诪讘拽砖转: 砖讬专讜转讬 讘专讬讗讜转 讻诇诇讬转 谞讙讚 讛诪砖讬讘 讛 : 讞谞讬转讛 诪讬讬讟诇住, 注讜"讚 讘拽砖转 专砖讜转 注专注讜专 注诇 讛讞诇讟转 讘讬转 讛诪砖驻讟 讛诪讞讜讝讬 讘转诇-讗讘讬讘-讬驻讜 诪讬讜诐 21.2.01 讘讛"驻 11571/99 讜讛"驻 11243/99 砖谞讬转谞讛 注诇 讬讚讬 讻讘讜讚 讛砖讜驻讟 讗' 住讟专砖谞讜讘 讛讞诇讟讛 1. 讛讜讞诇讟 砖讛讘拽砖讛 诪爪专讬讻讛 转砖讜讘讛. 2. 讛诪砖讬讘讛 转讙讬砖 转砖讜讘转讛 诇讘拽砖讛 讘转讜讱 15 讬诪讬诐 诪诪讜注讚 讛讛诪爪讗讛. 3. 注诇 谞讜住讞 讛转砖讜讘讛 讬讞讜诇讜 讛讜专讗讜转讬讛 砖诇 转拽谞讛 406(讘) 诇转拽谞讜转 住讚专 讛讚讬谉 讛讗讝专讞讬, 转砖诪"讚1984-. 4. 讛转砖讜讘讛 转讜讙砖 讘诪拽讘讬诇, 讛谉 诇讘讬转-诪砖驻讟 讝讛, 讜讛谉 讘诪讬砖专讬谉 诇诪讘拽砖转. 5. 讘转砖讜讘讛 转讬讻诇诇 讛转讬讬讞住讜转 诇讗驻砖专讜转 砖讘讬转-讛诪砖驻讟 讬讘拽砖 诇驻注讜诇 注诇-驻讬 住诪讻讜转讜 诇驻讬 转拽谞讛 410 诇转拽谞讜转 住讚专 讛讚讬谉 讛讗讝专讞讬, 转砖诪"讚1984-. 诪转讘拽砖转 讛转讬讬讞住讜转 诇砖讗诇讛 讗诐 讘诪拽专讛 讻讝讛 谞讬转谉 讬讛讬讛 诇专讗讜转 讘讚讘专讬 讛转讙讜讘讛 住讬讻讜诪讬诐 讘讻转讘. 讛注讚专 讛转讬讬讞住讜转 讻诪讜讛讜 讻讛住讻诪讛. 讛注讚专 转讙讜讘讛 讻诪讜讛讜 讻讗讬-讛转讬讬爪讘讜转, 注诇 讛讻专讜讱 讘讻讱. 谞讬转谞讛 讛讬讜诐, 讬"讙 讘住讬讜讜谉 转砖住"讗 (4.6.01). 砖 讜 驻 讟 转 _________________ 讛注转拽 诪转讗讬诐 诇诪拽讜专 01022250. J01 谞讜住讞 讝讛 讻驻讜祝 诇砖讬谞讜讬讬 注专讬讻讛 讟专诐 驻专住讜诪讜 讘拽讜讘抓 驻住拽讬 讛讚讬谉 砖诇 讘讬转 讛诪砖驻讟 讛注诇讬讜谉 讘讬砖专讗诇. 砖诪专讬讛讜 讻讛谉 - 诪讝讻讬专 专讗砖讬 讘讘讬转 讛诪砖驻讟 讛注诇讬讜谉 驻讜注诇 诪专讻讝 诪讬讚注, 讟诇' 02-6750444",
    # Translation: "Which sections/laws/regulations are mentioned in the document?"
    'question': "讗讬诇讜 住注讬驻讬诐\讞讜拽讬诐\转拽谞讜转 诪爪讜讬讬谞讬诐 讘诪住诪讱 ?"
})

print(predictions)
# output:
# {'score': 0.9999890327453613, 'start': 0, 'end': 7, 'answer': '转拽谞讛 406(讘) 诇转拽谞讜转 住讚专 讛讚讬谉 讛讗讝专讞讬, 转砖诪"讚1984'}
```

### About Me
Created by Shay Doner.
This model is my final project for the Intelligent Systems M.Sc. program at Afeka College in Tel Aviv.
For collaboration inquiries, please email:
shay681@gmail.com