Current Situation:
Pastai currently ships with a single built-in LLM (Llama 3, 7B), limiting users to the default capabilities provided by the software.
Proposed Feature:
Implement support for connecting to a local Ollama server, allowing users to utilize custom large language models (LLMs) of their choice.
Rationale:
While Pastai may have been initially designed for users without extensive LLM knowledge, adding this feature would significantly enhance its versatility and appeal to a broader user base, including those with LLM expertise. This integration aligns with industry trends, as seen in applications like LM Studio and AnythingLLM.
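To illustrate the proposed integration, here is a minimal sketch of how Pastai could talk to a local Ollama server, assuming Ollama's standard REST API on its default port (11434). The endpoint URL, model name, and function names are illustrative only, not existing Pastai settings or code.

import json
import urllib.request

OLLAMA_URL = "http://localhost:11434"  # default Ollama port; would be user-configurable in Pastai

def list_local_models() -> list[str]:
    """Return the names of models available on the local Ollama server (GET /api/tags)."""
    with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags") as resp:
        data = json.load(resp)
    return [m["name"] for m in data.get("models", [])]

def generate(model: str, prompt: str) -> str:
    """Send a single non-streaming generation request (POST /api/generate)."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        f"{OLLAMA_URL}/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]

if __name__ == "__main__":
    print("Available models:", list_local_models())
    print(generate("llama3", "Summarize this clipboard text: ..."))

In practice, Pastai would only need two user-facing settings for this: the server URL and the model name, with the model dropdown populated from the /api/tags listing above.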
In Review
💡 Feature Request
3 months ago
Steve