
SLIM-SUMMARY-PHI-3-GGUF

slim-summary-phi-3 is a fine-tune of phi-3 mini (3.8B parameters) that implements a function-calling summarization model, packaged as a 4_K_M quantized GGUF for small, fast, locally-deployable inference. It produces high-quality summaries of complex business documents, with output structured as a Python list of key points.

The model takes as input a text passage, an optional parameter with a focusing phrase or query, and an experimental optional (N) parameter, which guides the model to return a specific number of items in the summary list.
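Because the summary is returned as a Python list of key points, downstream code can parse it directly. A minimal sketch of one way to do that (the sample string below is illustrative, not captured model output):

```python
import ast

def parse_summary(output_text: str) -> list:
    """Parse a response expected to be a Python-style list of key points.

    Falls back to splitting on newlines if the text is not a valid literal.
    """
    try:
        parsed = ast.literal_eval(output_text.strip())
        if isinstance(parsed, list):
            return parsed
    except (ValueError, SyntaxError):
        pass
    # fallback: treat each non-empty line as a key point
    return [line.strip("- ").strip() for line in output_text.splitlines() if line.strip()]

# illustrative output string (not actual model output)
sample = "['Revenue grew 12% year over year', 'Margins compressed', 'Guidance raised']"
points = parse_summary(sample)
```

The fallback keeps the parser usable even if a given inference run returns plain bulleted text rather than a well-formed list literal.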

Please see the usage notes at: slim-summary

To pull the model via API:

from huggingface_hub import snapshot_download           
snapshot_download("llmware/slim-summary-phi-3-gguf", local_dir="/path/on/your/machine/", local_dir_use_symlinks=False)  

Load in your favorite GGUF inference engine, or try with llmware as follows:

from llmware.models import ModelCatalog  

# to load the model and make a basic inference
model = ModelCatalog().load_model("slim-summary-phi-3-gguf")
response = model.function_call(text_sample)   # text_sample is the passage to summarize

# this one line will download the model and run a series of tests
ModelCatalog().tool_test_run("slim-summary-phi-3-gguf", verbose=True)  
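To combine the optional focusing phrase with the experimental (N) parameter described above, a small helper can format the parameter string. Note this helper and the "focus (N)" string convention are assumptions drawn from this card's description of the model inputs, not a documented llmware API:

```python
from typing import Optional

def build_summary_param(focus: str = "key points", n: Optional[int] = None) -> str:
    """Build a summary parameter string such as 'key points (5)'.

    The '(N)' suffix format is an assumption based on this model card's
    description of the experimental (N) parameter.
    """
    return f"{focus} ({n})" if n is not None else focus

# e.g. build_summary_param("financial highlights", 3) -> "financial highlights (3)"
```

The resulting string would then be passed along with the text passage in the function call; consult the slim-summary usage notes linked above for the exact invocation supported by your llmware version.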

Note: please review config.json in the repository for prompt wrapping information, details on the model, and full test set.

Model Card Contact

Darren Oberst & llmware team

Any questions? Join us on Discord

Format: GGUF | Model size: 3.82B params | Architecture: phi3
