Integration with Local Ollama Server for Custom LLM Models

Current Situation:
Pastai currently ships with a single built-in LLM (Llama 3, 7B), limiting users to the default capabilities provided by the software.

Proposed Feature:
Implement support for connecting to a local Ollama server, allowing users to utilize custom large language models (LLMs) of their choice.
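For reference, Ollama exposes a plain HTTP API (on port 11434 by default), so the integration could be a thin client layer. Below is a minimal sketch of what that might look like, using only the Python standard library; the `list_models`/`generate` helper names and the example prompt are illustrative, not part of Pastai:

```python
import json
import urllib.request

# Base URL of a locally running Ollama server (11434 is Ollama's default port).
OLLAMA_URL = "http://localhost:11434"

def list_models() -> list[str]:
    """Return the names of the models installed on the local Ollama server."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def generate(model: str, prompt: str) -> str:
    """Send a prompt to the chosen model and return the complete response."""
    payload = json.dumps({
        "model": model,
        "prompt": prompt,
        "stream": False,  # ask for one JSON reply instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    # Let the user pick any installed model rather than a hard-coded default.
    print("Available models:", list_models())
    print(generate("llama3", "Summarize this clipboard entry in one sentence."))
```

In practice, Pastai would only need a settings field for the server URL and a model picker populated from `/api/tags`; everything else stays as it is today.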

Rationale:
While Pastai may have initially been designed for users without extensive LLM knowledge, adding this feature would significantly enhance its versatility and broaden its appeal, including to users with LLM expertise. The integration would also align with industry trends, as seen in applications like LM Studio and AnythingLLM.

Status: In Review
Board: 💡 Feature Request
Date: 3 months ago
Author: Steve
