Model Card for huolongguo10/LLM_detect

This model detects whether a given piece of Chinese text was generated by a large language model (LLM).

Model Details

Model Description

  • Developed by: huolongguo10
  • Model type: BERT (text classification)
  • Language(s) (NLP): Chinese
  • License: [More Information Needed]
  • Finetuned from model: bert-base-chinese

Model Sources

  • Repository: [More Information Needed]
  • Paper: [More Information Needed]
  • Demo: [More Information Needed]

Uses

Direct Use

from transformers import AutoTokenizer, AutoModelForSequenceClassification

# The checkpoint is a detector (classifier), so load it with a sequence
# classification head rather than a masked-LM head.
tokenizer = AutoTokenizer.from_pretrained("huolongguo10/LLM_detect")
model = AutoModelForSequenceClassification.from_pretrained("huolongguo10/LLM_detect")
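
A minimal inference sketch building on the snippet above. The label mapping (index 0 = human-written, index 1 = LLM-generated) is an assumption; check model.config.id2label for the checkpoint's actual classes.

import torch

text = "这段文字可能是由大型语言模型生成的。"
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
probs = torch.softmax(logits, dim=-1)[0]
# Assumed mapping: index 0 = human-written, index 1 = LLM-generated.
print(f"P(LLM-generated) = {probs[1]:.3f}")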

Training Details

Training Data

Training Procedure

Preprocessing

[More Information Needed]

Training Hyperparameters

  • Training regime: fp32 (single precision); see the sketch below
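
A minimal fine-tuning sketch consistent with the fp32 regime above and the bert-base-chinese base model. The toy dataset, column names, and hyperparameters are illustrative assumptions, not the documented training setup.

from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("bert-base-chinese")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-chinese", num_labels=2)

# Toy stand-in for the (undocumented) training data: label 0 = human, 1 = LLM.
train = Dataset.from_dict({
    "text": ["这是人写的一句话。", "这是模型生成的一句话。"],
    "label": [0, 1],
})
train = train.map(lambda b: tokenizer(b["text"], truncation=True, max_length=512), batched=True)

args = TrainingArguments(
    output_dir="llm_detect",
    num_train_epochs=3,
    per_device_train_batch_size=16,
    fp16=False,  # fp32 (single-precision) regime, as stated above
)
Trainer(model=model, args=args, train_dataset=train, tokenizer=tokenizer).train()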

Evaluation

Testing Data, Factors & Metrics

Testing Data

[More Information Needed]

Factors

[More Information Needed]

Metrics

[More Information Needed]

Results

[More Information Needed]

Summary

Model Examination

[More Information Needed]

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019); a back-of-envelope estimate based on the figures below follows the list.

  • Hardware Type: NVIDIA Tesla P100 (GPU)
  • Hours used: 4
  • Cloud Provider: Kaggle
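
A rough estimate from the figures above, under loudly labeled assumptions: a P100 draws about 250 W at its TDP, and a global-average grid intensity of roughly 0.4 kgCO2eq/kWh is assumed; neither figure comes from this card.

# Assumptions (not from this card): ~250 W P100 TDP, ~0.4 kgCO2eq/kWh grid intensity.
energy_kwh = 0.250 * 4      # kW * hours used = 1.0 kWh
co2_kg = energy_kwh * 0.4   # roughly 0.4 kg CO2eq for the 4-hour run
print(f"~{co2_kg:.1f} kg CO2eq")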

Technical Specifications

Model Architecture and Objective

BERT encoder (fine-tuned from bert-base-chinese) with a sequence classification head, trained to distinguish LLM-generated from human-written Chinese text; roughly 102M parameters, stored as float32 safetensors.
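
A quick way to confirm the architecture and the ~102M parameter count from the loaded checkpoint:

from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained("huolongguo10/LLM_detect")
print(model.config.model_type)                     # "bert"
print(sum(p.numel() for p in model.parameters()))  # roughly 102M parameters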

Compute Infrastructure

[More Information Needed]

Hardware

NVIDIA Tesla P100 GPU (via Kaggle)

Software

transformers (Hugging Face)

Citation

BibTeX:

[More Information Needed]

APA:

[More Information Needed]

Glossary

[More Information Needed]

More Information

[More Information Needed]

Model Card Authors

[More Information Needed]

Model Card Contact

[More Information Needed]
