HTTP request failed with status: 404 when trying to connect to ooba #11

Open
Bibrrr opened this issue Apr 19, 2024 · 7 comments

Bibrrr commented Apr 19, 2024

Hello :)
I have an issue with this app when it comes to connecting to ooba with the OpenAI API.
I changed the port to 5000 in the text-default.json config file to match ooba's port. The bot starts up just fine and also appears to connect to ooba. However, when I message the bot in Discord, I get this:
[screenshot]

I'm trying to figure out whether it has something to do with OpenKlyde or with the OpenAI API extension itself. The extension loads without that /v1 at the end:
[screenshot]


badgids commented Apr 19, 2024 via email


Bibrrr commented Apr 20, 2024

That's a silly typo... but it's still not working.
[screenshot]
I noticed that when you run koboldcpp there's an actual API page when you go to 127.0.0.1:5001/api. I assume there should be something similar at 127.0.0.1:5000/api, but there's nothing.
[screenshot]
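
A quick way to sanity-check whether the OpenAI-compatible API is actually listening, even without a browsable page, is to ask it for its model list. A minimal sketch, assuming the requests package is installed and ooba is serving its API on the default port 5000:

```python
# Probe ooba's OpenAI-compatible API; a 200 here means the API extension is up.
# Assumes the `requests` package and ooba's default API port of 5000.
import requests

resp = requests.get("http://127.0.0.1:5000/v1/models", timeout=5)
print(resp.status_code)  # expect 200 if the OpenAI-compatible API is running
print(resp.json())       # the model list ooba reports
```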


badgids commented Apr 20, 2024

It's been a minute since I used Ooba. Did you launch Textgen-UI with the --listen and --api flags and the --openai extension?

For Ooba the API endpoint is http://127.0.0.1:5000/v1/

It was correct the first go 'round, just missing the trailing / after v1.

https://github.com/oobabooga/text-generation-webui/wiki/12-%E2%80%90-OpenAI-API
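
A quick way to confirm that base URL end to end is to point the official openai Python client at it. A minimal sketch, assuming the openai package (v1.x) is installed and Textgen-UI was started with --api; the api_key and model values below are placeholders, since ooba doesn't validate them by default:

```python
# Talk to ooba's OpenAI-compatible endpoint with the official `openai` client.
# Assumes openai>=1.0 is installed and Textgen-UI was launched with --listen --api.
from openai import OpenAI

client = OpenAI(
    base_url="http://127.0.0.1:5000/v1",  # the endpoint discussed above
    api_key="sk-dummy",                   # placeholder; ignored unless ooba is configured with an API key
)

reply = client.chat.completions.create(
    model="local-model",                  # placeholder; ooba serves whichever model is loaded
    messages=[{"role": "user", "content": "Say hi in one sentence."}],
)
print(reply.choices[0].message.content)
```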


Bibrrr commented Apr 20, 2024

I completely forgot about /docs... sorry about that.
I think I see where the issue is. Your bot is looking for localhost:5000/api/v1/model, while in ooba it's localhost:5000/v1/models.

For text generation, in kobold it's 127.0.0.1:5001/api/v1/generate, while in ooba it's 127.0.0.1:5000/v1/chat/completions. That's why it's throwing 404 errors.

I will look into the code and check whether changing those values helps.
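
To make the mismatch concrete, here is a minimal sketch of the two request shapes side by side, using the requests package; the ports and paths follow the values mentioned in this thread, and the payload fields are trimmed to the essentials:

```python
# Compare the KoboldCpp-style call the bot makes today with ooba's OpenAI-style call.
# Assumes KoboldCpp on port 5001 and ooba's OpenAI-compatible API on port 5000.
import requests

prompt = "Hello there!"

# KoboldCpp: /api/v1/generate with a "prompt" field; text comes back under "results".
kobold = requests.post(
    "http://127.0.0.1:5001/api/v1/generate",
    json={"prompt": prompt, "max_length": 100},
    timeout=60,
)
print(kobold.json()["results"][0]["text"])

# ooba: /v1/chat/completions (no /api prefix) with a "messages" list; text comes back under "choices".
ooba = requests.post(
    "http://127.0.0.1:5000/v1/chat/completions",
    json={"messages": [{"role": "user", "content": prompt}], "max_tokens": 100},
    timeout=60,
)
print(ooba.json()["choices"][0]["message"]["content"])
```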


Bibrrr commented Apr 20, 2024

It's getting somewhere!
I changed the text-default config to the following:
[screenshot]
and it successfully sends a request to ooba and generates text. But there's a new issue... it looks like the bot can't retrieve the data from the OpenAI API response.
I tried multiple things, mostly focused on line 380 in the bot.py file. I changed data = llm_response['results'][0]['text'] to data = llm_response['choices'][0]['object'], as suggested in the ooba docs. Now I get this error:
[screenshot: bot.py error]
I also tried changing line 380 to data = llm_response['choices'][0]['message']['content'], as per the OpenAI API docs. It errors out at message (also included in the 2nd screenshot).

Further changes that I tried...
I changed "generation" in the config to "generation": "chat/completions". It gives HTTP error 422, with no text generation from ooba.
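
For what it's worth, choices[0]['object'] isn't where the text lives: in OpenAI-style responses, "object" is a top-level type tag like "chat.completion", and the generated text sits under choices[0]['message']['content'] for chat/completions or choices[0]['text'] for plain completions. The 422 would also be consistent with chat/completions rejecting a Kobold-style body that has a "prompt" field instead of a "messages" list, though I haven't confirmed that. One way to keep line 380 working against both backends is to branch on whichever key the response actually contains; this is only a sketch of that idea, with the key paths taken from the snippets quoted above:

```python
def extract_text(llm_response: dict) -> str:
    """Pull generated text out of a KoboldCpp- or OpenAI-style response (sketch only)."""
    if "results" in llm_response:
        # KoboldCpp / KoboldAI: {"results": [{"text": ...}]}
        return llm_response["results"][0]["text"]
    choice = llm_response["choices"][0]
    if "message" in choice:
        # OpenAI-style chat/completions: {"choices": [{"message": {"content": ...}}]}
        return choice["message"]["content"]
    # OpenAI-style completions: {"choices": [{"text": ...}]}
    return choice["text"]


# e.g. around line 380 in bot.py:
# data = extract_text(llm_response)
```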


badgids commented Apr 20, 2024

Yeah, it looks like Ooba has changed some things in the API since I first wrote this program last year.

badgids self-assigned this Apr 20, 2024

badgids commented Apr 20, 2024

If you get this working again with Ooba before I have some more time to look into it, put in a PR. If we have to change so much that it breaks the other APIs, we'll just put the Ooba version in its own branch.
