Difference in inference between llama_cpp and LangChain's LlamaCpp wrapper

#9
by YairFr - opened

I see the same behavior in my implementation as in https://github.com/R3gm/InsightSolver-Colab/blob/main/LLM_Inference_with_llama_cpp_python__Llama_2_13b_chat.ipynb
Inference with llama_cpp gives the desired direct answer:

(screenshot: llama_cpp output returning the requested Python code directly)
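
For reference, this is roughly what the direct call looks like (a minimal sketch, assuming llama-cpp-python and a Llama 2 13B chat GGUF; the model path and prompt are placeholders, not the notebook's exact values):

```python
from llama_cpp import Llama

# Hypothetical local model path
llm = Llama(model_path="./llama-2-13b-chat.Q4_K_M.gguf", n_ctx=2048)

# Llama 2 chat models expect the [INST] ... [/INST] wrapper around the user prompt
prompt = "[INST] Write a Python function that reverses a string. [/INST]"

output = llm(prompt, max_tokens=256, temperature=0.1, echo=False)
print(output["choices"][0]["text"])
```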

But with LangChain we get just a general walkthrough and not the actual Python code:

(screenshot: LangChain output giving a step-by-step walkthrough instead of the code)
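
For comparison, a minimal sketch of the same call through LangChain's LlamaCpp wrapper (same hypothetical model path; parameter names follow the wrapper's API):

```python
from langchain.llms import LlamaCpp

llm = LlamaCpp(
    model_path="./llama-2-13b-chat.Q4_K_M.gguf",  # hypothetical path
    n_ctx=2048,
    max_tokens=256,
    temperature=0.1,
)

# Note: the wrapper passes the prompt through verbatim, so any chat template
# (e.g. the [INST] ... [/INST] wrapper) has to be added by the caller or
# via a PromptTemplate; it is not applied automatically.
print(llm("Write a Python function that reverses a string."))
```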
Why is that, and how can I avoid it when using LangChain's wrapper?
