---
pipeline_tag: sentence-similarity
tags:
  - cybersecurity
  - sentence-embedding
  - sentence-similarity
---

# ATT&CK BERT

ATT&CK BERT is a cybersecurity domain-specific language model based on sentence-transformers. It maps sentences describing attack actions to semantically meaningful embedding vectors; embeddings of sentences with similar meanings have a high cosine similarity.

## Usage (Sentence-Transformers)

Using this model is straightforward once you have sentence-transformers installed:

```
pip install -U sentence-transformers
```

Then you can use the model like this:

```python
from sentence_transformers import SentenceTransformer
from sklearn.metrics.pairwise import cosine_similarity

# Two sentences describing the same attack action in different words
sentences = ["Attacker takes a screenshot", "Attacker captures the screen"]

model = SentenceTransformer('basel/ATTACK-BERT')
embeddings = model.encode(sentences)

# Semantically similar sentences should score close to 1.0
print(cosine_similarity([embeddings[0]], [embeddings[1]]))
```