[FEAT] Support for LM Studio/Ollama #26

@traderarpit4

Description

It would be extremely awesome to be able to bring our own locally hosted LLMs to the table. As far as I can tell, most of the features and ways of interacting would stay the same, since Gemini seems to be OpenAI compatible for the most part.

It'd also bring on folks who don't want to upload their entire vaults to Google, or who want the freedom to switch providers as innovation occurs. On top of that, being able to point the plugin at a custom domain (base URL) would be fantastic, even as an experimental feature; see the rough sketch below.
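For context, here's a rough sketch of why this should be cheap to support: Ollama (default `http://localhost:11434/v1`) and LM Studio (default `http://localhost:1234/v1`) both expose OpenAI-compatible endpoints, as does Gemini, so in principle only the base URL and key need to change. The model names, the `chat` helper, and the config shape below are purely illustrative, not actual plugin code:

```ts
// Hypothetical provider configs: only the base URL, key, and model name
// differ between Gemini's OpenAI-compatible endpoint and a local server.
interface ProviderConfig {
  baseUrl: string; // OpenAI-compatible /v1 root
  apiKey: string;  // local servers typically accept any placeholder
  model: string;   // illustrative model names
}

const providers: Record<string, ProviderConfig> = {
  gemini: {
    baseUrl: "https://generativelanguage.googleapis.com/v1beta/openai",
    apiKey: "YOUR_GEMINI_API_KEY",
    model: "gemini-2.0-flash",
  },
  lmstudio: {
    baseUrl: "http://localhost:1234/v1", // LM Studio's default local server
    apiKey: "lm-studio",                 // ignored by the server
    model: "llama-3.1-8b-instruct",
  },
  ollama: {
    baseUrl: "http://localhost:11434/v1", // Ollama's OpenAI-compatible API
    apiKey: "ollama",                     // ignored by the server
    model: "llama3.1",
  },
};

// One chat helper works for all three, because they share the
// OpenAI /chat/completions request and response shape.
async function chat(p: ProviderConfig, prompt: string): Promise<string> {
  const res = await fetch(`${p.baseUrl}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${p.apiKey}`,
    },
    body: JSON.stringify({
      model: p.model,
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`${res.status} ${await res.text()}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

// Swapping providers is just a config change:
chat(providers.ollama, "Summarize this note in one line.").then(console.log);
```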

Labels: enhancement (New feature or request)
