As a temporary measure, I've removed the llama-index package from pyqt-openai, since, as you said, the code targets an old version and llama-index usage has changed significantly.
I will create a tag to track figuring out how to implement the "new" llama-index in this package.
It is a bit late (which is an understatement), but I applied the new llama-index code to the feature/llamaindex branch a couple of days ago. If there are no issues, I will merge it this weekend.
After installing everything from the requirements.txt file, I'm running into the following problem:
```
No module named 'llama_index.response'
```
Is the code set up for an old version of llama_index?
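It looks that way: newer llama-index releases restructured their module layout (the top-level `llama_index.response` path no longer exists in recent versions, which moved most imports under a `llama_index.core` namespace). As a rough sanity check, a sketch like the following can tell you which layout your environment has without crashing on import; the module names probed here are assumptions based on that restructuring, so adjust them for your installed version:

```python
# Hedged sketch: probe which llama-index module layout is installed.
# "llama_index.core" is the assumed post-restructure namespace;
# "llama_index.response" is the legacy path from the error above.
import importlib.util


def has_module(name: str) -> bool:
    """Return True if `name` resolves to an importable module."""
    try:
        return importlib.util.find_spec(name) is not None
    except ModuleNotFoundError:
        # Raised when a parent package (llama_index) is absent entirely.
        return False


if has_module("llama_index.core"):
    layout = "new"      # recent llama-index: imports live under llama_index.core
elif has_module("llama_index.response"):
    layout = "old"      # legacy layout that pyqt-openai's code expects
else:
    layout = "missing"  # llama-index is not installed at all

print(f"llama-index layout: {layout}")
```

Running this in the environment where the error occurred should print `new` or `missing`, confirming that the code was written against the older package layout.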