
A Russian NER model fine-tuned on the RURED2 corpus: https://github.com/denis-gordeev/rured2

If you have any questions, message me at https://t.me/nlp_party.

The model can assign multiple labels to a single token (multi-label classification), so use it as in the following code:

import torch
from torch import nn
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_name = "denis-gordeev/rured2-ner-microsoft-mdeberta-v3-base"
model = AutoModelForTokenClassification.from_pretrained(model_name).to("cuda")
model.eval()

tokenizer = AutoTokenizer.from_pretrained(model_name)

def predict(text: str, output_together=True, glue_words=True):
    """Return word-level tokens from `text` together with their predicted labels."""
    sigmoid = nn.Sigmoid()
    tokenized = tokenizer(text)
    input_ids = torch.tensor([tokenized["input_ids"]], dtype=torch.long).to("cuda")
    token_type_ids = torch.tensor([tokenized["token_type_ids"]], dtype=torch.long).to("cuda")
    attention_mask = torch.tensor([tokenized["attention_mask"]], dtype=torch.long).to("cuda")
    with torch.no_grad():
        preds = model(input_ids=input_ids,
                      token_type_ids=token_type_ids,
                      attention_mask=attention_mask)
    # The head is multi-label: apply a sigmoid to every logit and treat each
    # label independently instead of taking a softmax argmax.
    logits = sigmoid(preds.logits)

    output_tokens = []
    output_preds = []
    id_to_label = {int(k): v for k, v in model.config.id2label.items()}
    for i, token in enumerate(input_ids[0]):
        if token <= 3:
            # Skip special tokens (pad/cls/sep/unk).
            continue
        # Take every label whose probability exceeds 0.5; if none does,
        # fall back to the single most probable label.
        class_ids = (logits[0][i] > 0.5).nonzero()
        if class_ids.shape[0] >= 1:
            class_names = [id_to_label[int(cl)] for cl in class_ids]
        else:
            class_names = [id_to_label[int(logits[0][i].argmax())]]
        converted_token = tokenizer.convert_ids_to_tokens([int(token)])[0]
        new_word_bool = converted_token.startswith("▁")
        converted_token = converted_token.replace("▁", "")
        if glue_words and not new_word_bool and output_tokens:
            # Continuation piece: glue it onto the previous word and keep
            # the labels predicted for the word's first piece.
            output_tokens[-1] += converted_token
        else:
            output_tokens.append(converted_token)
            output_preds.append(class_names)
    if output_together:
        return [[output_tokens[t_i], output_preds[t_i]]
                for t_i in range(len(output_tokens))]
    return output_tokens, output_preds
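
A minimal usage example; the sentence and the labels in the comment are illustrative, not actual model output:

print(predict("Газпром увеличил добычу газа в 2023 году."))
# Returns a list of [word, [labels]] pairs, e.g. something like
# [['Газпром', ['B-organization']], ['увеличил', ['O']], ...]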

denis-gordeev/rured2-ner-microsoft-mdeberta-v3-base

This model is a fine-tuned version of microsoft/mdeberta-v3-base on the RURED2 dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0096
  • F1 Micro: 0.5837
  • O F1 Micro: 0.6370
  • O Recall Micro: 0.9242
  • O Precision Micro: 0.4860
  • B-person F1 Micro: 0.9639
  • B-person Recall Micro: 0.9816
  • B-person Precision Micro: 0.9468
  • B-norp F1 Micro: 0.6190
  • B-norp Recall Micro: 0.8667
  • B-norp Precision Micro: 0.4815
  • B-commodity F1 Micro: 0.7553
  • B-commodity Recall Micro: 0.9470
  • B-commodity Precision Micro: 0.6281
  • B-date F1 Micro: 0.8386
  • B-date Recall Micro: 0.8471
  • B-date Precision Micro: 0.8304
  • I-date F1 Micro: 0.6419
  • I-date Recall Micro: 0.9492
  • I-date Precision Micro: 0.4849
  • B-country F1 Micro: 0.6152
  • B-country Recall Micro: 0.9765
  • B-country Precision Micro: 0.4490
  • B-economic Sector F1 Micro: 0.5576
  • B-economic Sector Recall Micro: 0.5897
  • B-economic Sector Precision Micro: 0.5287
  • I-economic Sector F1 Micro: 0.2517
  • I-economic Sector Recall Micro: 0.6667
  • I-economic Sector Precision Micro: 0.1551
  • B-news Source F1 Micro: 0.7988
  • B-news Source Recall Micro: 0.8327
  • B-news Source Precision Micro: 0.7677
  • B-profession F1 Micro: 0.8088
  • B-profession Recall Micro: 0.9464
  • B-profession Precision Micro: 0.7061
  • I-news Source F1 Micro: 0.4808
  • I-news Source Recall Micro: 0.8400
  • I-news Source Precision Micro: 0.3368
  • I-person F1 Micro: 0.3381
  • I-person Recall Micro: 0.996
  • I-person Precision Micro: 0.2036
  • B-organization F1 Micro: 0.8350
  • B-organization Recall Micro: 0.8993
  • B-organization Precision Micro: 0.7794
  • I-profession F1 Micro: 0.2462
  • I-profession Recall Micro: 0.8030
  • I-profession Precision Micro: 0.1454
  • B-event F1 Micro: 0.5658
  • B-event Recall Micro: 0.5436
  • B-event Precision Micro: 0.5899
  • B-city F1 Micro: 0.625
  • B-city Recall Micro: 0.8904
  • B-city Precision Micro: 0.4815
  • B-gpe F1 Micro: 0.6760
  • B-gpe Recall Micro: 0.9380
  • B-gpe Precision Micro: 0.5284
  • I-event F1 Micro: 0.2577
  • I-event Recall Micro: 0.3776
  • I-event Precision Micro: 0.1956
  • B-group F1 Micro: 0.6667
  • B-group Recall Micro: 0.75
  • B-group Precision Micro: 0.6
  • B-ordinal F1 Micro: 0.5306
  • B-ordinal Recall Micro: 0.8125
  • B-ordinal Precision Micro: 0.3939
  • B-product F1 Micro: 0.6683
  • B-product Recall Micro: 0.8232
  • B-product Precision Micro: 0.5625
  • I-organization F1 Micro: 0.3128
  • I-organization Recall Micro: 0.8425
  • I-organization Precision Micro: 0.1921
  • B-money F1 Micro: 0.8530
  • B-money Recall Micro: 0.8947
  • B-money Precision Micro: 0.8151
  • I-money F1 Micro: 0.6259
  • I-money Recall Micro: 0.9644
  • I-money Precision Micro: 0.4632
  • B-currency F1 Micro: 0.7441
  • B-currency Recall Micro: 0.9658
  • B-currency Precision Micro: 0.6052
  • B-percent F1 Micro: 0.8639
  • B-percent Recall Micro: 0.8902
  • B-percent Precision Micro: 0.8391
  • I-percent F1 Micro: 0.6995
  • I-percent Recall Micro: 0.9846
  • I-percent Precision Micro: 0.5424
  • I-group F1 Micro: 0.1844
  • I-group Recall Micro: 0.4836
  • I-group Precision Micro: 0.1139
  • B-cardinal F1 Micro: 0.6903
  • B-cardinal Recall Micro: 0.7358
  • B-cardinal Precision Micro: 0.65
  • B-law F1 Micro: 0.3704
  • B-law Recall Micro: 0.3571
  • B-law Precision Micro: 0.3846
  • I-law F1 Micro: 0.3246
  • I-law Recall Micro: 0.3936
  • I-law Precision Micro: 0.2761
  • B-fac F1 Micro: 0.6910
  • B-fac Recall Micro: 0.6910
  • B-fac Precision Micro: 0.6910
  • I-fac F1 Micro: 0.3007
  • I-fac Recall Micro: 0.7151
  • I-fac Precision Micro: 0.1904
  • B-age F1 Micro: 0.8649
  • B-age Recall Micro: 0.7619
  • B-age Precision Micro: 1.0
  • I-city F1 Micro: 0.1047
  • I-city Recall Micro: 0.6429
  • I-city Precision Micro: 0.0570
  • B-work Of Art F1 Micro: 0.3158
  • B-work Of Art Recall Micro: 0.375
  • B-work Of Art Precision Micro: 0.2727
  • I-work Of Art F1 Micro: 0.3721
  • I-work Of Art Recall Micro: 0.5
  • I-work Of Art Precision Micro: 0.2963
  • B-region F1 Micro: 0.8070
  • B-region Recall Micro: 0.7731
  • B-region Precision Micro: 0.8440
  • I-region F1 Micro: 0.2817
  • I-region Recall Micro: 0.8197
  • I-region Precision Micro: 0.1701
  • I-cardinal F1 Micro: 0.3851
  • I-cardinal Recall Micro: 0.4831
  • I-cardinal Precision Micro: 0.3202
  • I-currency F1 Micro: 0.0
  • I-currency Recall Micro: 0.0
  • I-currency Precision Micro: 0.0
  • B-quantity F1 Micro: 0.7311
  • B-quantity Recall Micro: 0.7311
  • B-quantity Precision Micro: 0.7311
  • I-quantity F1 Micro: 0.4889
  • I-quantity Recall Micro: 0.7989
  • I-quantity Precision Micro: 0.3522
  • B-crime F1 Micro: 0.3736
  • B-crime Recall Micro: 0.4048
  • B-crime Precision Micro: 0.3469
  • I-crime F1 Micro: 0.3245
  • I-crime Recall Micro: 0.5648
  • I-crime Precision Micro: 0.2276
  • B-trade Agreement F1 Micro: 0.7170
  • B-trade Agreement Recall Micro: 0.7037
  • B-trade Agreement Precision Micro: 0.7308
  • B-nationality F1 Micro: 0.0
  • B-nationality Recall Micro: 0.0
  • B-nationality Precision Micro: 0.0
  • B-family F1 Micro: 0.5
  • B-family Recall Micro: 0.8889
  • B-family Precision Micro: 0.3478
  • I-family F1 Micro: 0.0
  • I-family Recall Micro: 0.0
  • I-family Precision Micro: 0.0
  • I-product F1 Micro: 0.2021
  • I-product Recall Micro: 0.6824
  • I-product Precision Micro: 0.1186
  • B-time F1 Micro: 0.6538
  • B-time Recall Micro: 0.6296
  • B-time Precision Micro: 0.68
  • I-time F1 Micro: 0.6118
  • I-time Recall Micro: 0.9811
  • I-time Precision Micro: 0.4444
  • I-commodity F1 Micro: 0.0444
  • I-commodity Recall Micro: 0.1667
  • I-commodity Precision Micro: 0.0256
  • B-application F1 Micro: 0.0
  • B-application Recall Micro: 0.0
  • B-application Precision Micro: 0.0
  • I-application F1 Micro: 0.0
  • I-application Recall Micro: 0.0
  • I-application Precision Micro: 0.0
  • I-country F1 Micro: 0.1695
  • I-country Recall Micro: 0.7895
  • I-country Precision Micro: 0.0949
  • B-award F1 Micro: 0.5455
  • B-award Recall Micro: 0.4615
  • B-award Precision Micro: 0.6667
  • I-award F1 Micro: 0.4459
  • I-award Recall Micro: 0.8049
  • I-award Precision Micro: 0.3084
  • I-gpe F1 Micro: 0.3284
  • I-gpe Recall Micro: 0.9167
  • I-gpe Precision Micro: 0.2
  • B-location F1 Micro: 0.4885
  • B-location Recall Micro: 0.5161
  • B-location Precision Micro: 0.4638
  • I-location F1 Micro: 0.3189
  • I-location Recall Micro: 0.6316
  • I-location Precision Micro: 0.2133
  • I-ordinal F1 Micro: 0.5
  • I-ordinal Recall Micro: 0.4
  • I-ordinal Precision Micro: 0.6667
  • I-trade Agreement F1 Micro: 0.1163
  • I-trade Agreement Recall Micro: 0.3846
  • I-trade Agreement Precision Micro: 0.0685
  • B-religion F1 Micro: 0.0
  • B-religion Recall Micro: 0.0
  • B-religion Precision Micro: 0.0
  • I-age F1 Micro: 0.4324
  • I-age Recall Micro: 0.5714
  • I-age Precision Micro: 0.3478
  • B-investment Program F1 Micro: 0.0
  • B-investment Program Recall Micro: 0.0
  • B-investment Program Precision Micro: 0.0
  • I-investment Program F1 Micro: 0.0
  • I-investment Program Recall Micro: 0.0
  • I-investment Program Precision Micro: 0.0
  • B-borough F1 Micro: 0.7059
  • B-borough Recall Micro: 0.6667
  • B-borough Precision Micro: 0.75
  • B-price F1 Micro: 0.0
  • B-price Recall Micro: 0.0
  • B-price Precision Micro: 0.0
  • I-price F1 Micro: 0.0
  • I-price Recall Micro: 0.0
  • I-price Precision Micro: 0.0
  • B-character F1 Micro: 0.0
  • B-character Recall Micro: 0.0
  • B-character Precision Micro: 0.0
  • I-character F1 Micro: 0.0
  • I-character Recall Micro: 0.0
  • I-character Precision Micro: 0.0
  • B-website F1 Micro: 0.0
  • B-website Recall Micro: 0.0
  • B-website Precision Micro: 0.0
  • B-street F1 Micro: 0.4000
  • B-street Recall Micro: 0.4286
  • B-street Precision Micro: 0.375
  • I-street F1 Micro: 0.3256
  • I-street Recall Micro: 1.0
  • I-street Precision Micro: 0.1944
  • B-village F1 Micro: 0.6667
  • B-village Recall Micro: 0.7
  • B-village Precision Micro: 0.6364
  • I-village F1 Micro: 0.2222
  • I-village Recall Micro: 0.875
  • I-village Precision Micro: 0.1273
  • B-disease F1 Micro: 0.5965
  • B-disease Recall Micro: 0.7083
  • B-disease Precision Micro: 0.5152
  • I-disease F1 Micro: 0.3704
  • I-disease Recall Micro: 0.7812
  • I-disease Precision Micro: 0.2427
  • B-penalty F1 Micro: 0.1579
  • B-penalty Recall Micro: 0.1579
  • B-penalty Precision Micro: 0.1579
  • I-penalty F1 Micro: 0.1674
  • I-penalty Recall Micro: 0.3175
  • I-penalty Precision Micro: 0.1136
  • B-weapon F1 Micro: 0.6715
  • B-weapon Recall Micro: 0.7302
  • B-weapon Precision Micro: 0.6216
  • I-weapon F1 Micro: 0.2455
  • I-weapon Recall Micro: 0.5965
  • I-weapon Precision Micro: 0.1545
  • I-borough F1 Micro: 0.4091
  • I-borough Recall Micro: 0.6923
  • I-borough Precision Micro: 0.2903
  • B-vehicle F1 Micro: 0.6349
  • B-vehicle Recall Micro: 0.5882
  • B-vehicle Precision Micro: 0.6897
  • I-vehicle F1 Micro: 0.4174
  • I-vehicle Recall Micro: 0.7273
  • I-vehicle Precision Micro: 0.2927
  • B-language F1 Micro: 0.0
  • B-language Recall Micro: 0.0
  • B-language Precision Micro: 0.0
  • I-language F1 Micro: 0.0
  • I-language Recall Micro: 0.0
  • I-language Precision Micro: 0.0
  • B-house F1 Micro: 0.0
  • B-house Recall Micro: 0.0
  • B-house Precision Micro: 0.0
  • I-norp F1 Micro: 0.0
  • I-norp Recall Micro: 0.0
  • I-norp Precision Micro: 0.0
  • I-house F1 Micro: 0.0
  • I-house Recall Micro: 0.0
  • I-house Precision Micro: 0.0
  • I-website F1 Micro: 0.0
  • I-website Recall Micro: 0.0
  • I-website Precision Micro: 0.0
  • F1 Macro: 0.3969
  • Recall Macro: 0.5603
  • Precision Macro: 0.3447
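
The per-label scores above treat every label as an independent binary decision per token (hence the separate B-/I-/O rows), and the macro scores average over labels. A minimal sketch of how such numbers can be computed with scikit-learn is shown below; the y_true/y_pred matrices are toy placeholders, and this is not necessarily the exact evaluation script used for the values above.

import numpy as np
from sklearn.metrics import precision_recall_fscore_support

# Binarized gold and predicted labels: one row per token, one column per
# label in model.config.id2label (toy values for illustration only).
y_true = np.array([[1, 0, 0],
                   [0, 1, 0],
                   [0, 1, 1]])
y_pred = np.array([[1, 0, 0],
                   [0, 1, 1],
                   [0, 1, 0]])

# Per-label precision/recall/F1: one entry per label column.
prec, rec, f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average=None, zero_division=0)

# Macro averages over labels ("Precision Macro", "Recall Macro", "F1 Macro").
macro_prec, macro_rec, macro_f1, _ = precision_recall_fscore_support(
    y_true, y_pred, average="macro", zero_division=0)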

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10000
  • mixed_precision_training: Native AMP
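
These settings map roughly onto the TrainingArguments sketch below; the output_dir is hypothetical, and the dataset loading, multi-label loss, and Trainer setup are not reproduced here.

from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="rured2-ner-mdeberta",  # hypothetical output path
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10000,  # as configured; the table below reports epochs 1-5
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    fp16=True,  # "Native AMP" mixed precision
)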

Training results

Only the headline columns of the per-epoch log are shown below; per-label micro F1/recall/precision were also logged at every epoch, and the full per-label values for the reported checkpoint (epoch 2) are listed in the evaluation results above.

| Training Loss | Epoch | Step  | Validation Loss | F1 Micro | F1 Macro | Recall Macro | Precision Macro |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|:--------:|:------------:|:---------------:|
| 0.0033        | 1.0   | 3014  | 0.0092          | 0.5876   | 0.3969   | 0.5513       | 0.3515          |
| 0.0032        | 2.0   | 6028  | 0.0096          | 0.5837   | 0.3969   | 0.5603       | 0.3447          |
| 0.0029        | 3.0   | 9042  | 0.0102          | 0.5857   | 0.3958   | 0.5647       | 0.3398          |
| 0.0026        | 4.0   | 12056 | 0.0105          | 0.5820   | 0.3908   | 0.5625       | 0.3356          |
| 0.0024        | 5.0   | 15070 | 0.0106          | 0.5861   | 0.3916   | 0.5600       | 0.3347          |

Framework versions

  • Transformers 4.35.0
  • Pytorch 2.1.0+cu121
  • Datasets 2.2.2
  • Tokenizers 0.14.1