<0x0A> in the output

#1
by eramax - opened

The output has the <0x0A> issue.

Regards

The output contains /******/ ; I don't know whether this comes from the model itself or from the GGUF conversion.

➜  hf llm run eramax/go-bruins-v2.1.1:Q5_K_M
>>> Recognize this  popular algorithm and make an example of use of it
... w = lambda g: reduce(lambda g, k: [[min(a, g[a][k] + g[k][b]) for b in range(len(g))] for a in range(len(g))], range(len(g)), g)
 /******/<0x0A>This algorithm appears to be implementing the Floyd-Warshall all pairs shortest path algorithm. Here is an^C
➜  hf llm run eramax/go-bruins-v2.1.1:Q5_K_M
>>> give me the output of this code \
... s = "She enrolled in the course at the university." \
... doc = nlp(s) \
... [(t.text, t.dep_, t.head.text) for t in doc]
 /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/ /******/**/ The given code is attempting to extract dependency triples from
a sentence using spaCy, an NLP library in Python. Here's the code with some necessary imports and explanations:<0x0A><0x0A>```python<0x0A>import spacy<0x0A><0x0A>nlp = spacy.load('en_core_web_sm') # load
the spaCy English model<0x0A><0x0A>^C
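
For context, the one-liner in the first prompt is the Floyd-Warshall all-pairs shortest-path recurrence compressed into a `reduce`; a plain-loop sketch of the same idea is below (note the one-liner's `min(a, ...)` looks like a typo for `min(g[a][b], ...)`):

```python
# Readable sketch of the Floyd-Warshall recurrence the one-liner expresses.
def floyd_warshall(g):
    # g is an n x n matrix of edge weights; g[a][b] is the cost of edge a -> b.
    n = len(g)
    d = [row[:] for row in g]              # copy so the input is not mutated
    for k in range(n):                     # allow vertex k as an intermediate
        for a in range(n):
            for b in range(n):
                d[a][b] = min(d[a][b], d[a][k] + d[k][b])
    return d
```

The spaCy snippet in the second prompt is standard usage; a self-contained version, assuming spaCy and its small English model are installed (`pip install spacy` and `python -m spacy download en_core_web_sm`):

```python
import spacy

nlp = spacy.load("en_core_web_sm")         # load the spaCy English model
s = "She enrolled in the course at the university."
doc = nlp(s)

# One (token text, dependency label, head text) triple per token,
# e.g. ("She", "nsubj", "enrolled") for the subject.
print([(t.text, t.dep_, t.head.text) for t in doc])
```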

Please try the original model and let me know whether this problem is carried over from the model itself or was introduced by the GGUF conversion.
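
One way to narrow that down without running full generations is to round-trip a newline through the GGUF tokenizer, for example with llama-cpp-python. This is only a sketch, and the `model_path` below is a placeholder for the local Q5_K_M file, not a confirmed file name:

```python
# Minimal sketch, assuming llama-cpp-python is installed and the GGUF is local.
from llama_cpp import Llama

llm = Llama(model_path="go-bruins-v2.1.1.Q5_K_M.gguf", vocab_only=True)

text = "line one\nline two"
tokens = llm.tokenize(text.encode("utf-8"))
roundtrip = llm.detokenize(tokens).decode("utf-8", errors="replace")

# If the tokenizer metadata in the GGUF is broken, the literal "<0x0A>"
# (the byte token for newline) shows up here instead of an actual newline;
# a clean round trip would point back at the original model instead.
print(repr(roundtrip))
```

If the round trip is clean, the next step would be comparing generations from the original (non-GGUF) model, as asked above.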

This one does not have the issue: https://huggingface.co/21world/go-bruins-v2.1.1.GGUF

This problem was previously reproduced and fixed for the v2 model here: https://huggingface.co/TheBloke/go-bruins-v2-GGUF/discussions/1#6575e27bf9898ed3ab9c3f98
