Incorrect endpoint being used for Local LLMs #26

@Saher-Anwar

Description

I've set up Ollama (llama3) on my PC.

.env file has the following fields:

LLM_URL=http://127.0.0.1:11434
LLM_MODEL=ollama
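
For reference, Ollama's OpenAI-compatible routes are served under the /v1 prefix, and the model name passed to the API is the pulled model (here llama3), not the provider name. So if the client appends /chat/completions to LLM_URL, a config along these lines might resolve to the correct endpoint (this is an assumption about how the client joins the URL, not confirmed from the project's code):

LLM_URL=http://127.0.0.1:11434/v1
LLM_MODEL=llama3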

The error:

For more information check: https://developer.mozilla.org/en-US/docs/Web/HTTP/Status/404
00:17:02 - INFO - [362/365] score=0  Software Developer
00:17:02 - INFO - HTTP Request: POST http://127.0.0.1:11434/chat/completions "HTTP/1.1 404 Not Found"
00:17:02 - ERROR - LLM error scoring job 'Senior Cloud Security Engineer': Client error '404 Not Found' for url 'http://127.0.0.1:11434/chat/completions'

The client appears to default to the /chat/completions endpoint for local LLMs, which is what causes the 404: Ollama does not serve anything at that path. Looking at the code, the response format the client assumes also does not appear to match what Ollama returns, so fixing the path alone may not be enough.
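
To illustrate the path mismatch: Ollama exposes a native chat endpoint at /api/chat and an OpenAI-compatible one at /v1/chat/completions, so a bare /chat/completions request 404s. A minimal sketch of building the correct URL (the build_chat_url helper is hypothetical, not from this project's code; the endpoint paths are Ollama's documented ones):

```python
# Ollama's two documented chat endpoints; the bare /chat/completions
# path used in the error log above matches neither, hence the 404.
OLLAMA_NATIVE_CHAT = "/api/chat"
OLLAMA_OPENAI_CHAT = "/v1/chat/completions"

def build_chat_url(base_url: str, openai_compatible: bool = True) -> str:
    """Join the server base URL with the correct chat path."""
    path = OLLAMA_OPENAI_CHAT if openai_compatible else OLLAMA_NATIVE_CHAT
    return base_url.rstrip("/") + path

print(build_chat_url("http://127.0.0.1:11434"))
# http://127.0.0.1:11434/v1/chat/completions
```

With the OpenAI-compatible path, an unmodified OpenAI-style client can talk to Ollama; with the native /api/chat path, the request and response shapes differ and the client's parsing would need to change too.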
