
# Fine-Tuned BART Model for CLI Command Generation

This repository contains a fine-tuned BART model that generates CLI commands from human instructions. The model is built on the Hugging Face Transformers library and was fine-tuned on a dataset of human instructions paired with their corresponding CLI commands.

## Model Details

- Model: BART (facebook/bart-large)
- Task: CLI command generation from human instructions
- Tokenizer: BART tokenizer (facebook/bart-large)
- Fine-tuning hyperparameters: num_epochs=2, batch_size=4, accumulation_steps=2, learning_rate=2e-5
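The training script itself is not included in this repository. The sketch below shows one way the listed hyperparameters could map onto the Transformers `Seq2SeqTrainer` API; the example instruction/command pairs, column names, and preprocessing are illustrative assumptions, not the actual training setup.

```python
from datasets import Dataset
from transformers import (
    BartForConditionalGeneration,
    BartTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")
tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")

# Tiny illustrative examples; the real training data is not part of this repo.
pairs = [
    {"instruction": "list all files including hidden ones", "command": "ls -a"},
    {"instruction": "show the current working directory", "command": "pwd"},
]

def tokenize(example):
    model_inputs = tokenizer(example["instruction"], max_length=128, truncation=True)
    labels = tokenizer(text_target=example["command"], max_length=64, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

train_dataset = Dataset.from_list(pairs).map(
    tokenize, remove_columns=["instruction", "command"]
)

# Hyperparameters from the list above.
training_args = Seq2SeqTrainingArguments(
    output_dir="bart-cli-command-generation",
    num_train_epochs=2,
    per_device_train_batch_size=4,
    gradient_accumulation_steps=2,
    learning_rate=2e-5,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=training_args,
    train_dataset=train_dataset,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
    tokenizer=tokenizer,
)
trainer.train()
```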

## Usage

You can use this model to generate CLI commands from human instructions, either by loading it directly for inference or by interacting with it through the provided Gradio interface.

## Inference

To perform inference, load the model and tokenizer with the Transformers library and generate a command from a tokenized instruction, as in the sketch below.
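A minimal sketch, assuming this repository's id is `difinative/AIBuddy` (the id shown on this page); adjust the repo id, generation settings, and device placement for your setup.

```python
from transformers import BartForConditionalGeneration, BartTokenizer

# Assumed repository id; replace with the correct model id if it differs.
model = BartForConditionalGeneration.from_pretrained("difinative/AIBuddy")
tokenizer = BartTokenizer.from_pretrained("difinative/AIBuddy")

instruction = "list all files in the current directory, including hidden ones"
inputs = tokenizer(instruction, return_tensors="pt")

# Generate the CLI command and decode it back to text.
output_ids = model.generate(**inputs, max_length=64, num_beams=4)
command = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(command)
```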

## Gradio Interface

An interactive Gradio interface is provided for easy interaction with the model. Run it by executing `gradio_app.py`; the interface lets you enter a human instruction and returns the generated CLI command.
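The contents of `gradio_app.py` are not reproduced here; the following is a minimal sketch of what such an interface could look like, again assuming the repository id `difinative/AIBuddy`.

```python
import gradio as gr
from transformers import BartForConditionalGeneration, BartTokenizer

# Assumed repository id; replace with the correct model id if it differs.
model = BartForConditionalGeneration.from_pretrained("difinative/AIBuddy")
tokenizer = BartTokenizer.from_pretrained("difinative/AIBuddy")

def generate_command(instruction: str) -> str:
    """Generate a CLI command from a natural-language instruction."""
    inputs = tokenizer(instruction, return_tensors="pt")
    output_ids = model.generate(**inputs, max_length=64, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

demo = gr.Interface(
    fn=generate_command,
    inputs=gr.Textbox(label="Human instruction"),
    outputs=gr.Textbox(label="Generated CLI command"),
    title="CLI Command Generation with BART",
)

if __name__ == "__main__":
    demo.launch()
```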

## Model Files

- `config.json`: model configuration
- `pytorch_model.bin`: model weights
- `tokenizer.json`: tokenizer configuration
- `vocab.txt`: vocabulary file

## Acknowledgments

The base BART model and tokenizer come from the Hugging Face Transformers library (facebook/bart-large).
