f58fad9578
feat: Rename old llmcore module to pluginllmcore
2026-03-30 00:50:02 +02:00
a466332822
feat: Add OpenAI Responses API (#282)
...
* feat: Add OpenAI Responses API
* fix: Make temperature optional
* chore: Increase default value of max tokens
2025-12-01 12:14:55 +01:00
695b35b510
feat: Add support for Qwen3-coder model (#221)
...
Add support for Qwen3-coder model
Rename old template
2025-08-18 12:01:34 +02:00
62de53c306
chore: Update copyrights
2025-04-04 18:01:02 +02:00
e66f467214
feat: Add llama.cpp provider and FIM template (#118)
2025-03-09 22:57:33 +01:00
7ba615a72d
feat: Add Google AI provider and template
2025-02-25 14:26:56 +01:00
e924029ec2
feat: Add filter templates for each provider
2025-02-23 01:41:47 +01:00
d96f44d42c
refactor: Rework providers and templates logic
2025-02-22 19:39:28 +01:00
288fefebe5
feat: Add CodeLlama QML FIM
2025-02-01 09:36:14 +01:00
289a19ac1a
feat: Add OpenAI provider and template
2025-01-17 01:22:12 +01:00
511f5b36eb
Upgrade to version 0.4.4
...
* feat: Add attachments for message
* feat: Support QtC color palette for chat view
* feat: Improve code completion from non-FIM models
* refactor: Remove message trimming
* chore: Bump version to 0.4.4
2025-01-08 02:05:25 +01:00
d04e5bc967
Add Claude provider and templates for chat and code (#55)
...
* feat: Add provider settings
* feat: Add Claude provider
* feat: Add Claude templates
* refactor: Setting input sensitivity
* fix: Restore text after reading a code block
* fix: Add missing system message for Ollama FIM
2024-12-23 22:22:01 +01:00
1261f913bb
♻️ refactor: Rework current templates and add new ones
...
Add Alpaca, Llama3, Llama2, ChatML templates
2024-11-26 00:28:27 +01:00