Discussion about this post

Rajashekar:

Great and meaningful project! As a novice, I tried following your detailed writeup. I faced some obstacles while installing and configuring the dependency packages (mainly sentence-transformers and torch), since Apple silicon Macs need a separate install URL, but somehow I managed to install almost everything. Finally, when I ran Welcome.py and tried to use the chat, I got no response, even though I checked that Ollama is installed at the stated version.

I checked the logs and found the error 'Unexpected chunk format in response stream.' Any quick help is appreciated.
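[Editor's note: for context, here is a minimal sketch of how a client might read Ollama's streaming chat responses while tolerating odd chunks instead of failing, which is one plausible source of an "Unexpected chunk format in response stream" error. The /api/chat endpoint and its newline-delimited JSON stream are Ollama's documented behavior; the model name and the stream_chat helper are assumptions, not the project's actual code.]

```python
# Sketch: stream a chat reply from a local Ollama server and skip malformed
# chunks rather than aborting. Assumes Ollama is running on localhost:11434
# and that a model (here "llama3", an assumption) has been pulled.
import json
import requests

def stream_chat(prompt: str, model: str = "llama3") -> str:
    reply = []
    with requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": model,
            "messages": [{"role": "user", "content": prompt}],
            "stream": True,
        },
        stream=True,
        timeout=120,
    ) as resp:
        resp.raise_for_status()
        for line in resp.iter_lines():
            if not line:
                continue
            try:
                chunk = json.loads(line)
            except json.JSONDecodeError:
                # Log and skip unparseable chunks instead of raising.
                print(f"Skipping unparseable chunk: {line[:80]!r}")
                continue
            # Each streamed object carries a partial message; "done": true ends the stream.
            reply.append(chunk.get("message", {}).get("content", ""))
            if chunk.get("done"):
                break
    return "".join(reply)

if __name__ == "__main__":
    print(stream_chat("Hello!"))
```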

Himanshu Dharm:

Hey! This is a pretty cool project. Thank you!

It got me thinking: since the model goes through a lot of docs (and of course handles them smartly), a nice option would be to also throw in a citation window that indicates the line/paragraph/page number of the doc the answer was fetched from.

Happy to hear your thoughts on this.

Cheers!
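[Editor's note: a minimal sketch of the citation idea above, assuming the indexer keeps the source file, page, and paragraph with each chunk so the chat window can show where an answer came from. The Chunk structure, cite helper, and sample data are hypothetical, not the project's code.]

```python
# Sketch: carry source metadata alongside each retrieved chunk so answers can
# be accompanied by citations. All names here are illustrative.
from dataclasses import dataclass

@dataclass
class Chunk:
    text: str
    source: str   # e.g. "report.pdf"
    page: int     # page the chunk was extracted from
    para: int     # paragraph index within that page

def cite(chunks: list[Chunk]) -> str:
    """Format retrieved chunks as a citation footer for the chat window."""
    return "\n".join(
        f"[{i + 1}] {c.source}, p.{c.page}, para {c.para}"
        for i, c in enumerate(chunks)
    )

retrieved = [
    Chunk("Quarterly revenue grew 12%...", "report.pdf", page=4, para=2),
    Chunk("Growth was driven by the APAC region...", "report.pdf", page=5, para=1),
]
print(cite(retrieved))
# [1] report.pdf, p.4, para 2
# [2] report.pdf, p.5, para 1
```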
