Huge Table Q/A
Is it possible to use the model to answer questions on a huge table dataset (ex. having 1 million records)?
Hi, were you able to find a workaround for this problem? I'm facing a similar issue.
Hi, I am facing the same issue; I want to query large tables too. Any update or advice would be appreciated.
Hi,
I have 4000 rows and 15 columns, and the model is unable to read the table. It gives a "too many rows" error. By any chance, have you found a solution?
Thanks
The exact error it gives is `ValueError: too many rows`.
The model is designed for 1024 tokens. Even if you feed it more than that, the model won't be able to process it. A workaround would be to use some kind of vector store / similarity-search retriever that feeds at most ~20 of the most relevant rows to the model. (The implementation will require vector-store clients.)
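To illustrate the idea without a vector-store dependency, here is a minimal sketch of the retrieval step using a simple token-overlap score instead of embeddings. The function name and the dict-of-rows table format are my own assumptions, not part of any library; a real setup would replace the scoring with embedding similarity from a vector store.

```python
def retrieve_top_rows(question, rows, k=20):
    """Score each row by word overlap with the question and keep the top k.

    `rows` is assumed to be a list of dicts (one per table row).
    This is a stand-in for a proper embedding-based retriever.
    """
    q_tokens = set(question.lower().split())
    scored = []
    for row in rows:
        # Flatten the row's cell values into one lowercase string.
        row_text = " ".join(str(v) for v in row.values()).lower()
        overlap = len(q_tokens & set(row_text.split()))
        scored.append((overlap, row))
    # Highest-overlap rows first; keep only k of them.
    scored.sort(key=lambda t: t[0], reverse=True)
    return [row for _, row in scored[:k]]


rows = [
    {"city": "Paris", "population_m": 2.1},
    {"city": "Berlin", "population_m": 3.6},
    {"city": "Madrid", "population_m": 3.3},
]
top = retrieve_top_rows("what is the population of Berlin", rows, k=1)
```

The reduced `top` table is then small enough to pass to the model instead of the full dataset.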
Hi All,
How do you get the model to actually compute the results?
For instance, when I ask for the average it is printing an answer like this: {'answer': 'AVERAGE > 75.57781966, 72.31750532, 65.19845111, 47.4168394, 68.44955858, 76.7413947, ...
But it is not actually performing the computation.
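If I understand the output correctly, the model only predicts which cells to select and which aggregation operator (here `AVERAGE`) to apply; the arithmetic has to be done in post-processing. A minimal sketch, assuming the result dict exposes `cells` (cell values as strings) and `aggregator` keys as the Hugging Face table-question-answering pipeline does:

```python
def apply_aggregator(result):
    """Apply the predicted aggregation operator to the selected cells.

    `result` is assumed to look like:
        {"aggregator": "AVERAGE", "cells": ["75.57", "72.31", ...]}
    """
    # Parse the cell strings into numbers (strip thousands separators).
    values = [float(c.replace(",", "")) for c in result["cells"]]
    agg = result.get("aggregator", "NONE")
    if agg == "SUM":
        return sum(values)
    if agg == "AVERAGE":
        return sum(values) / len(values)
    if agg == "COUNT":
        return len(values)
    # "NONE": no aggregation predicted, return the raw values.
    return values


avg = apply_aggregator({"aggregator": "AVERAGE", "cells": ["2", "4"]})  # 3.0
```

This way the `AVERAGE > 75.57..., 72.31..., ...` string becomes a single computed number.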