f75a87ac5e
refactor: Remove dead code
2026-04-01 21:12:41 +02:00
928519a636
refactor: Remove validate request function
2026-04-01 01:24:46 +02:00
cd017ae1f2
refactor: Remove duplicated signals
2026-04-01 01:07:37 +02:00
f58fad9578
feat: Rename old llmcore module to pluginllmcore
2026-03-30 00:50:02 +02:00
a466332822
feat: Add OpenAI Responses API ( #282 )
...
* feat: Add OpenAI Responses API
* fix: Make temperature optional
* chore: Increase default value of max tokens
2025-12-01 12:14:55 +01:00
953774aaa8
refactor: Full rework of quick refactor ( #257 )
2025-11-15 14:51:47 +01:00
9ecd285d1d
feat: Improve code suggestions ( #256 )
2025-11-14 17:02:43 +01:00
e7110810f8
fix: Clear connection before cancel
2025-11-01 20:51:01 +01:00
db82fb08e8
feat: Add chat-agent switcher in chat UI ( #247 )
...
* feat: Add chat-agent switcher in chat UI
fix: QML errors
refactor: Change top bar layout
fix: Default value
* fix: Update GitHub action for QtC
2025-10-31 16:09:38 +01:00
ff0f994ec6
feat: Add project-specific rules support
2025-10-14 01:53:44 +02:00
10b924d78a
feat: Add Claude tools support to plugin ( #231 )
...
* feat: Add settings for handling tool use in chat
* feat: Add Claude tools support
* fix: Add ai ignore to read project files list tool
* fix: Add ai ignore to read project file by name tool
* fix: Add ai ignore to read current opened files tool
2025-09-30 23:19:46 +02:00
ac53296e85
fix: Change behavior of cancel request
...
* Cancel request now cancels all requests
2025-09-28 15:28:01 +02:00
ec1b5bdf5f
refactor: Remove non-streaming support ( #229 )
2025-09-17 19:38:27 +02:00
76309be0a6
Refactor LLM providers to use internal http client ( #227 )
...
* refactor: Move http client into provider
* refactor: Rework Ollama provider to work with internal http client
* refactor: Rework LM Studio provider to work with internal http client
* refactor: Rework Mistral AI to work with internal http client
* fix: Replace url and header with QNetworkRequest
* refactor: Rework Google provider to use internal http client
* refactor: Switch OpenAI-compatible providers to internal http client
* fix: Remove m_requestHandler from tests
* refactor: Remove old handleData method
* fix: Remove LLMClientInterfaceTest
2025-09-03 10:56:05 +02:00
c36dffea93
refactor: Add model output settings instead of smartprocessing setting ( #220 )
2025-08-17 22:01:26 +02:00
637a4d9d4c
feat: Add custom providers endpoint ( #188 )
2025-05-17 09:21:06 +02:00
25a6983de0
refactor: Make connection more async ( #182 )
2025-05-01 15:35:33 +02:00
2fe6850a06
refactor: Improve textsuggestion handling
2025-04-24 01:25:45 +02:00
615175bea8
feat: Add list of files to ignore in LLM requests ( #163 )
2025-04-17 09:12:47 +02:00
62de53c306
chore: Update copyrights
2025-04-04 18:01:02 +02:00
9d2d70fc63
feat: Add sharing opened files with code completion requests ( #156 )
2025-04-04 10:38:06 +02:00
79218d8412
refactor: Replace singleton for context manager ( #151 )
2025-04-01 22:29:45 +02:00
d58ff90458
fix: Fix typo in the use of the project name
2025-03-27 00:34:10 +01:00
3d770f91c7
refactor: Reduce dependency on TextDocument in ContextManager ( #128 )
2025-03-10 18:06:19 +01:00
c724bace06
refactor: Move document access out of prepareContext() ( #129 )
2025-03-10 17:54:03 +01:00
719065ebfc
refactor: Extract document reading to separate class ( #127 )
...
This decouples LLMClientInterface from the Qt Creator text editor
implementation and makes it possible to write tests.
2025-03-10 17:42:40 +01:00
a218064a4f
refactor: Introduce base class for RequestHandler ( #125 )
...
This will make it possible to write a mock implementation.
2025-03-10 17:29:45 +01:00
ed59be4199
refactor: Extract performance logging to separate class ( #124 )
...
This should not be the responsibility of LLMClientInterface. Extracting
this class also adds the flexibility to silence logging output in tests.
2025-03-10 17:10:01 +01:00
58c3e26e7f
refactor: Decouple LLMClientInterface from ProvidersManager ( #120 )
...
This will be needed for tests.
2025-03-10 10:40:51 +01:00
98e1047bf1
refactor: Decouple prompt template manager from their users ( #115 )
...
This makes it possible to test the user classes.
2025-03-10 02:13:10 +01:00
c9a3cdaf25
refactor: Reuse extractFilePathFromRequest() more ( #117 )
2025-03-08 16:18:44 +01:00
6c323642e4
refactor: Inject settings into LLMClientInterface ( #114 )
...
This reduces reliance on global state and makes the code easier to test.
2025-03-08 15:08:15 +01:00
44b3b0cc0c
refactor: Don't use global state in ContextManager::isSpecifyCompletion ( #112 )
...
Using global state makes testing much harder.
2025-03-08 10:38:52 +01:00
f94c79a5ff
fix: Improve support for code blocks without language ( #108 )
...
This makes it possible to represent code blocks in models that emit
their suggestion immediately after the ``` characters.
2025-03-07 15:30:22 +01:00
90beebf2ee
Revert "refactor: Move all processing logic to CodeHandler::processText()" ( #109 )
2025-03-07 01:57:13 +01:00
521261e0a3
refactor: Move all processing logic to CodeHandler::processText() ( #107 )
...
This will become useful once more processing modes are available
2025-03-06 18:49:28 +01:00
69a8aa80d9
refactor: Make DocumentContextReader::prepareContext() testable ( #96 )
2025-03-05 20:18:59 +01:00
bcf7b6c226
refactor: Make DocumentContextReader usable outside Qt Creator context ( #89 )
...
This makes it possible to write simple unit tests for it without running
full Qt Creator. Not coupling DocumentContextReader to
TextEditor::TextDocument unnecessarily is also a better design in
general.
2025-03-05 01:53:02 +01:00
61196cae90
chore: Run clang-format over the codebase ( #82 )
...
This commit is a result of the following commands:
clang-format-19 --style=file -i $(git ls-files | fgrep .cpp)
clang-format-19 --style=file -i $(git ls-files | fgrep .hpp)
2025-03-02 22:44:20 +01:00
c8e0f3268e
fix: End completion position in LSP answer
2025-02-26 23:12:26 +01:00
84025ec843
feat: Separate system prompts for FIM and non-FIM models
2025-02-26 22:51:38 +01:00
7ba615a72d
feat: Add Google AI provider and template
2025-02-25 14:26:56 +01:00
d96f44d42c
refactor: Rework providers and templates logic
2025-02-22 19:39:28 +01:00
bd25736a55
refactor: Optimize SystemPrompt for code completion
2025-02-16 16:53:03 +01:00
60936f6d84
refactor: Improve code completion message for Instruct models
2025-02-12 02:05:37 +01:00
7d23d0323f
refactor: Improve system prompt and message
2025-02-12 01:47:52 +01:00
2a0beb6c4c
feat: Add language-specific LLM preset configuration
...
- Add ability to configure separate provider/model/template for specific programming language
- Add UI controls for language preset configuration
- Support custom provider selection per language
- Support custom model selection per language
- Support custom template selection per language
2025-02-02 22:57:18 +01:00
511f5b36eb
Upgrade to version 0.4.4
...
* feat: Add attachments for message
* feat: Support QtC color palette for chat view
* feat: Improve code completion from non-FIM models
* refactor: Remove message trimming
* chore: Bump version to 0.4.4
2025-01-08 02:05:25 +01:00
f27429aa66
refactor: Move context to separate lib
2024-12-24 22:45:20 +01:00
d04e5bc967
Add Claude provider and templates for chat and code ( #55 )
...
* feat: Add provider settings
* feat: Add Claude provider
* feat: Add Claude templates
* refactor: Setting input sensitivity
* fix: Restore text after reading code block
* fix: Add missing system message for Ollama FIM
2024-12-23 22:22:01 +01:00