Yvnminc committed on
Commit 299dd17 · verified · 1 Parent(s): c76f380

Update README.md

Files changed (1): README.md +2 -1
README.md CHANGED
@@ -2,6 +2,7 @@
 license: apache-2.0
 task_categories:
 - text-retrieval
+- text-classification
 language:
 - en
 tags:
@@ -37,4 +38,4 @@ NAICS Classification is a fundamental component of enterprise-level GHG emission
 
 Each enterprise description (query) is encoded separately, and matched against NAICS descriptions (corpus) based on the cosine similarity of their embeddings. This methodology leverages a dual-tower architecture, wherein the first tower processes the query (enterprise text) and the second tower processes NAICS descriptions.
 
-We apply machine learning to fine-tune a pre-trained Sentence-BERT model. Zero-shot SBERT models may achieve only around 20% Top-1 accuracy on the 1,000-class sector classification task, whereas contrastive fine-tuning raises this to over 75%. Further preprocessing, such as lowercasing and URL removal, can add incremental gains, pushing Top-1 accuracy beyond 77% and leading to state-of-the-art results.
+We apply machine learning to fine-tune a pre-trained Sentence-BERT model. Zero-shot SBERT models may achieve only around 20% Top-1 accuracy on the 1,000-class sector classification task, whereas contrastive fine-tuning raises this to over 75%. Further preprocessing, such as lowercasing and URL removal, can add incremental gains, pushing Top-1 accuracy beyond 77% and leading to state-of-the-art results.
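The matching step the README describes (encode queries and NAICS descriptions separately, then rank corpus entries by cosine similarity of their embeddings) can be sketched in plain NumPy. The toy 2-D vectors below are illustrative stand-ins for real SBERT embeddings, which this sketch assumes have already been computed:

```python
import numpy as np

def cosine_top1(query_emb: np.ndarray, corpus_embs: np.ndarray) -> int:
    """Return the index of the corpus row most similar to the query.

    query_emb:   (d,) embedding of one enterprise description.
    corpus_embs: (n, d) embeddings of the NAICS code descriptions.
    """
    q = query_emb / np.linalg.norm(query_emb)
    c = corpus_embs / np.linalg.norm(corpus_embs, axis=1, keepdims=True)
    # Cosine similarity of unit vectors reduces to a dot product.
    return int(np.argmax(c @ q))

# Toy stand-ins for SBERT embeddings (illustrative only).
query = np.array([1.0, 0.1])          # one enterprise description
naics = np.array([
    [0.0, 1.0],                       # dissimilar NAICS description
    [0.9, 0.2],                       # similar NAICS description
])
best = cosine_top1(query, naics)      # index of the Top-1 NAICS match
```

In the dual-tower setup, `query_emb` comes from the query tower and `corpus_embs` from the document tower; since the corpus of NAICS descriptions is fixed, its embeddings can be precomputed once and reused for every query.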
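The contrastive fine-tuning mentioned in the added paragraph typically optimizes an in-batch softmax objective (InfoNCE-style, as in multiple-negatives ranking losses): each enterprise embedding is pulled toward its matched NAICS description and pushed away from the other descriptions in the batch. A minimal NumPy sketch of such a loss, assuming row i of each matrix is a matched pair (the exact loss used for this dataset's model is not stated in the diff):

```python
import numpy as np

def contrastive_loss(q_embs: np.ndarray, d_embs: np.ndarray,
                     temperature: float = 0.05) -> float:
    """In-batch contrastive (InfoNCE-style) loss for a dual-tower model.

    q_embs: (n, d) query-tower embeddings (enterprise texts).
    d_embs: (n, d) doc-tower embeddings (NAICS descriptions); row i is
            the positive match for q_embs[i], other rows serve as
            in-batch negatives.
    """
    q = q_embs / np.linalg.norm(q_embs, axis=1, keepdims=True)
    d = d_embs / np.linalg.norm(d_embs, axis=1, keepdims=True)
    logits = (q @ d.T) / temperature
    # Numerically stable log-softmax over each row.
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # Cross-entropy with the diagonal (matched pair) as the target.
    return float(-np.mean(np.diag(log_probs)))
```

Driving this loss down raises the cosine similarity of matched pairs relative to in-batch negatives, which is the mechanism behind the reported jump from roughly 20% zero-shot to over 75% fine-tuned Top-1 accuracy.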