---
library_name: transformers
license: llama3.1
base_model: NousResearch/Hermes-3-Llama-3.1-8B
tags:
- llama-factory
- full
- unsloth
- generated_from_trainer
model-index:
- name: kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor
  results: []
---

[![QuantFactory Banner](https://lh7-rt.googleusercontent.com/docsz/AD_4nXeiuCm7c8lEwEJuRey9kiVZsRn2W-b4pWlu3-X534V3YmVuVc2ZL-NXg2RkzSOOS2JXGHutDuyyNAUtdJI65jGTo8jT9Y99tMi4H4MqL44Uc5QKG77B0d6-JfIkZHFaUA71-RtjyYZWVIhqsNZcx8-OMaA?key=xt3VSDoCbmTY7o-cwwOFwQ)](https://hf.co/QuantFactory)

# QuantFactory/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor-GGUF
This is a quantized version of [kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor](https://huggingface.co/kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor) created using llama.cpp.

# Original Model Card

# kimhyeongjun/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor

This is my personal toy project for Chuseok (Korean Thanksgiving Day).

This model is a fine-tuned version of [NousResearch/Hermes-3-Llama-3.1-8B](https://huggingface.co/NousResearch/Hermes-3-Llama-3.1-8B) on the Korean_synthetic_financial_dataset_21K dataset.

## Model description

The entire pipeline ran automatically, with no manual intervention. Starting from finance-related PDF data collected directly from the web, the raw text was cleaned with the 'meta-llama/Meta-Llama-3.1-70B-Instruct-FP8' model (the FP8 variant was a budget choice). Synthetic data was then generated from the cleaned text, and its quality was assessed with the 'meta-llama/Llama-Guard-3-8B' and 'RLHFlow/ArmoRM-Llama3-8B-v0.1' models. Finally, 'Alibaba-NLP/gte-large-en-v1.5' was used to extract embeddings, and Faiss was applied for Jaccard distance-based nearest-neighbor analysis, yielding a diverse and carefully filtered final dataset of 21k examples.

## Task duration

3 days (2024-09-14 to 2024-09-16)

## Evaluation

None yet (I had to take the Chuseok holiday off.)

## Sample

![image/png](https://cdn-uploads.huggingface.co/production/uploads/619d8e31c21bf5feb310bd82/gJ6hnvAV2Qx9774AFFwQe.png)

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu121
- Datasets 2.21.0
- Tokenizers 0.19.1
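
## Near-duplicate filtering sketch

The dataset construction above is described only in prose. As a rough illustration of the final embedding and nearest-neighbor step, here is a minimal Python sketch: it embeds candidate samples with 'Alibaba-NLP/gte-large-en-v1.5' and drops near-duplicates with a Faiss search. This is not the author's actual code; the placeholder texts, the similarity threshold, and the use of cosine similarity (instead of the Jaccard distance mentioned above) are all assumptions made for the sketch.

```python
# Hypothetical sketch of the embedding + nearest-neighbor filtering step.
# NOTE: the card reports Jaccard-distance-based analysis; this sketch swaps in
# cosine similarity (inner product over L2-normalized vectors) for simplicity.
import faiss
import numpy as np
from sentence_transformers import SentenceTransformer

# Placeholder candidate samples; in the real pipeline these would be the
# synthetic Q&A pairs generated from the cleaned PDF text.
texts = [
    "ETF와 인덱스 펀드의 차이점은 무엇인가요?",
    "ETF와 인덱스펀드는 어떻게 다른가요?",
    "연금저축계좌의 세액공제 한도를 알려주세요.",
]

model = SentenceTransformer("Alibaba-NLP/gte-large-en-v1.5", trust_remote_code=True)
emb = model.encode(texts, normalize_embeddings=True).astype(np.float32)

index = faiss.IndexFlatIP(emb.shape[1])  # exact inner-product (cosine) search
index.add(emb)

# k=2: the closest hit is the sample itself, the second is its nearest neighbor.
sims, ids = index.search(emb, 2)

SIM_THRESHOLD = 0.95  # illustrative cutoff, not taken from the card
keep = []
for i in range(len(texts)):
    nearest_sim, nearest_id = sims[i][1], ids[i][1]
    # Keep a sample unless an earlier-kept sample is nearly identical to it.
    if nearest_sim < SIM_THRESHOLD or nearest_id > i:
        keep.append(i)

filtered = [texts[i] for i in keep]
print(f"kept {len(filtered)} of {len(texts)} candidate samples")
```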
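
## Running the GGUF files (sketch)

Since this repository hosts GGUF quantizations, a minimal inference sketch with llama-cpp-python may be useful. The GGUF filename below is a placeholder and the prompts are illustrative; check the repository's file list for the actual quantization you want before running.

```python
# Hypothetical usage sketch with llama-cpp-python.
# The GGUF filename below is a placeholder; pick an actual file (e.g. a Q4_K_M
# or Q8_0 quant) from this repository's file list before running.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="QuantFactory/Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor-GGUF",
    filename="Hermes-3-Llama-3.1-8B-Kor-Finance-Advisor.Q4_K_M.gguf",  # placeholder
)

llm = Llama(model_path=gguf_path, n_ctx=4096)

# llama.cpp applies the chat template stored in the GGUF metadata
# (Hermes-3 uses a ChatML-style format).
out = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "당신은 친절한 한국어 금융 상담 어시스턴트입니다."},
        {"role": "user", "content": "ETF와 펀드의 차이를 설명해 주세요."},
    ],
    max_tokens=512,
)
print(out["choices"][0]["message"]["content"])
```

The same files should also work directly with the llama.cpp CLI (`llama-cli -m <file>.gguf`).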