---
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: bert-base-chinese-finetuning-wallstreetcn-morning-news-vix-sz50-v1
  results: []
language:
- zh
widget:
- text: >-
    A股创业板六年新高;纳指跌落高位,标普又新高,创史上第二大中概IPO和今年美股最大IPO的滴滴首日冲高回落,市值破800亿美元,叮咚买菜次日涨逾60%;美元逾两月新高,金银铜6月大跌,原油半年涨超50%。\n中国6月官方制造业PMI为50.9,价格指数从高位回落。\n央行等六部门:充分发挥信贷等金融子市场合力,增强政策的针对性和可操作性。\n人社部“十四五”发展规划要求,基本养老保险参保率达95%,城镇新增就业逾5000万人。\n沪深交所7月19日起下调基金交易经手费收费标准。\n奈雪的茶赴港上市首日破发,收盘大跌14%,市值跌破300亿港元。\n港股上市倒计时,小鹏汽车定价165港元/股。\n格力2020股东会通过员工持股计划等议案,董明珠称接班人不是我说你行就行,是你能行才行。\n美国6月小非农ADP新增就业高于预期,绝对值较5月有所回落。\n美联储逆回购用量史上首次逼近1万亿美元。\n媒体称拜登最早下周颁布新行政令,限制多个行业的寡头垄断。\n亚马逊称FTC新任主席有偏见,寻求其回避反垄断调查。\n散户最爱平台Robinhood遭FINRA创纪录罚款7000万美元,被指坑害百万客户。
---

# bert-base-chinese-finetuning-wallstreetcn-morning-news-vix-sz50-v1

This model is a fine-tuned version of bert-base-chinese on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.0050
- Accuracy: 0.6538
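A minimal inference sketch with the `transformers` text-classification pipeline. Note the Hub repository id below (`hw2942/...`) is an assumption inferred from the uploader's username and the model name, and the example input mirrors the widget text above:

```python
# Usage sketch -- the Hub repo id is assumed, not confirmed by this card.
from transformers import pipeline

model_id = "hw2942/bert-base-chinese-finetuning-wallstreetcn-morning-news-vix-sz50-v1"
classifier = pipeline("text-classification", model=model_id)

# Classify a morning-news summary in the same style as the widget example.
result = classifier("A股创业板六年新高;纳指跌落高位,标普又新高。")
print(result)
```

The pipeline loads both the fine-tuned weights and the `bert-base-chinese` tokenizer from the same repo, so no separate tokenizer setup is needed.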

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9, 0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 10
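The hyperparameters above map onto a `transformers` `TrainingArguments` configuration roughly as follows (a sketch; the output directory and any settings not listed above are assumptions):

```python
from transformers import TrainingArguments

# Training configuration implied by the listed hyperparameters.
# output_dir is a placeholder; unlisted settings keep Trainer defaults.
training_args = TrainingArguments(
    output_dir="bert-base-chinese-finetuning-wallstreetcn-morning-news-vix-sz50-v1",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```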

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 19   | 0.6986          | 0.5      |
| No log        | 2.0   | 38   | 0.6988          | 0.5      |
| No log        | 3.0   | 57   | 0.7804          | 0.5      |
| No log        | 4.0   | 76   | 0.6912          | 0.5      |
| No log        | 5.0   | 95   | 0.8595          | 0.5192   |
| No log        | 6.0   | 114  | 0.7574          | 0.5962   |
| No log        | 7.0   | 133  | 1.6235          | 0.6154   |
| No log        | 8.0   | 152  | 1.2308          | 0.6346   |
| No log        | 9.0   | 171  | 1.1341          | 0.6923   |
| No log        | 10.0  | 190  | 1.0050          | 0.6538   |

### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3