Compare commits

...

5 Commits

Author SHA1 Message Date
add86d2e67 chore: Upgrade version to 0.4.6 2025-01-21 15:05:48 +01:00
a6c909d34d exp: Add ubuntu 22.04 experimental builds 2025-01-21 14:58:44 +01:00
2814dec3e5 fix: Improve file attachment handling
- Add files to existing list instead of replacing when using attach dialog
- Prevent duplicate files from being added to attachment list
2025-01-21 11:33:13 +01:00
1b86b60de8 Add system prompt configuration to readme 2025-01-20 10:00:36 +01:00
4b7f638731 Add setup OpenAI provider 2025-01-19 20:28:06 +01:00
5 changed files with 81 additions and 93 deletions


@@ -41,6 +41,12 @@ jobs:
           platform: linux_x64,
           cc: "gcc", cxx: "g++"
         }
+      - {
+          name: "Ubuntu 22.04 GCC", artifact: "Linux-x64(Ubuntu-22.04-experimental)",
+          os: ubuntu-22.04,
+          platform: linux_x64,
+          cc: "gcc", cxx: "g++"
+        }
       - {
           name: "macOS Latest Clang", artifact: "macOS-universal",
           os: macos-latest,


@@ -264,10 +264,18 @@ void ChatRootView::showAttachFilesDialog()
     }
     if (dialog.exec() == QDialog::Accepted) {
-        QStringList filePaths = dialog.selectedFiles();
-        if (!filePaths.isEmpty()) {
-            m_attachmentFiles = filePaths;
-            emit attachmentFilesChanged();
+        QStringList newFilePaths = dialog.selectedFiles();
+        if (!newFilePaths.isEmpty()) {
+            bool filesAdded = false;
+            for (const QString &filePath : newFilePaths) {
+                if (!m_attachmentFiles.contains(filePath)) {
+                    m_attachmentFiles.append(filePath);
+                    filesAdded = true;
+                }
+            }
+            if (filesAdded) {
+                emit attachmentFilesChanged();
+            }
         }
     }
 }


@@ -1,7 +1,7 @@
 {
     "Id" : "qodeassist",
     "Name" : "QodeAssist",
-    "Version" : "0.4.5",
+    "Version" : "0.4.6",
     "Vendor" : "Petr Mironychev",
     "VendorId" : "petrmironychev",
     "Copyright" : "(C) ${IDE_COPYRIGHT_YEAR} Petr Mironychev, (C) ${IDE_COPYRIGHT_YEAR} The Qt Company Ltd",

README.md

@@ -15,18 +15,18 @@
 ## Table of Contents
 1. [Overview](#overview)
-2. [Installation for using Ollama](#installation-for-using-Ollama)
-3. [Installation for using Claude](#installation-for-using-Claude)
-3. [Configure Plugin](#configure-plugin)
-4. [Supported LLM Providers](#supported-llm-providers)
-5. [Recommended Models](#recommended-models)
-   - [Ollama](#ollama)
-6. [QtCreator Version Compatibility](#qtcreator-version-compatibility)
-7. [Development Progress](#development-progress)
-8. [Hotkeys](#hotkeys)
-9. [Troubleshooting](#troubleshooting)
-10. [Support the Development](#support-the-development-of-qodeassist)
-11. [How to Build](#how-to-build)
+2. [Install plugin to QtCreator](#install-plugin-to-qtcreator)
+3. [Configure for Anthropic Claude](#configure-for-anthropic-claude)
+4. [Configure for OpenAI](#configure-for-openai)
+5. [Configure for using Ollama](#configure-for-using-ollama)
+6. [System Prompt Configuration](#system-prompt-configuration)
+7. [Template-Model Compatibility](#template-model-compatibility)
+8. [QtCreator Version Compatibility](#qtcreator-version-compatibility)
+9. [Development Progress](#development-progress)
+10. [Hotkeys](#hotkeys)
+11. [Troubleshooting](#troubleshooting)
+12. [Support the Development](#support-the-development-of-qodeassist)
+13. [How to Build](#how-to-build)
 
 ## Overview
@@ -35,7 +35,8 @@
 - Side and Bottom panels
 - Support for multiple LLM providers:
     - Ollama
-    - Claude
+    - OpenAI
+    - Anthropic Claude
     - LM Studio
     - OpenAI-compatible providers(eg. https://openrouter.ai)
 - Extensive library of model-specific templates
@@ -62,11 +63,46 @@
 <img width="326" alt="QodeAssistBottomPanel" src="https://github.com/user-attachments/assets/4cc64c23-a294-4df8-9153-39ad6fdab34b">
 </details>
 
-## Installation for using Ollama
+## Install plugin to QtCreator
 
-1. Install Latest QtCreator
-2. Install [Ollama](https://ollama.com). Make sure to review the system requirements before installation.
-3. Install a language models in Ollama via terminal. For example, you can run:
+1. Install the latest Qt Creator
+2. Download the QodeAssist plugin for your Qt Creator
+3. Launch Qt Creator and install the plugin:
+   - Go to:
+     - MacOS: Qt Creator -> About Plugins...
+     - Windows\Linux: Help -> About Plugins...
+   - Click on "Install Plugin..."
+   - Select the downloaded QodeAssist plugin archive file
+
+## Configure for Anthropic Claude
+1. Open Qt Creator settings and navigate to the QodeAssist section
+2. Go to the Provider Settings tab and configure the Claude API key
+3. Return to the General tab and configure:
+   - Set "Claude" as the provider for code completion and/or chat assistant
+   - Set the Claude URL (https://api.anthropic.com)
+   - Select your preferred model (e.g., claude-3-5-sonnet-20241022)
+   - Choose the Claude template for code completion and/or chat
+<details>
+  <summary>Example of Claude settings: (click to expand)</summary>
+  <img width="823" alt="Claude Settings" src="https://github.com/user-attachments/assets/828e09ea-e271-4a7a-8271-d3d5dd5c13fd" />
+</details>
+
+## Configure for OpenAI
+1. Open Qt Creator settings and navigate to the QodeAssist section
+2. Go to the Provider Settings tab and configure the OpenAI API key
+3. Return to the General tab and configure:
+   - Set "OpenAI" as the provider for code completion and/or chat assistant
+   - Set the OpenAI URL (https://api.openai.com)
+   - Select your preferred model (e.g., gpt-4o)
+   - Choose the OpenAI template for code completion and/or chat
+<details>
+  <summary>Example of OpenAI settings: (click to expand)</summary>
+  <img width="829" alt="OpenAI Settings" src="https://github.com/user-attachments/assets/4716f790-6159-44d0-a8f4-565ccb6eb713" />
+</details>
+
+## Configure for using Ollama
+
+1. Install [Ollama](https://ollama.com). Make sure to review the system requirements before installation.
+2. Install language models in Ollama via terminal. For example, you can run:
 
 For standard computers (minimum 8GB RAM):
 ```
@@ -80,31 +116,6 @@ For high-end systems (32GB+ RAM):
 ```
 ollama run qwen2.5-coder:32b
 ```
-4. Download the QodeAssist plugin for your QtCreator.
-5. Launch Qt Creator and install the plugin:
-   - Go to MacOS: Qt Creator -> About Plugins...
-     Windows\Linux: Help -> About Plugins...
-   - Click on "Install Plugin..."
-   - Select the downloaded QodeAssist plugin archive file
-
-## Installation for using Claude
-1. Install Latest QtCreator
-2. Download the QodeAssist plugin for your QtCreator.
-3. Launch Qt Creator and install the plugin:
-   - Go to MacOS: Qt Creator -> About Plugins...
-     Windows\Linux: Help -> About Plugins...
-   - Click on "Install Plugin..."
-   - Select the downloaded QodeAssist plugin archive file
-4. Select Claude provider
-5. Select Claude api
-6. Fill in api key for Claude
-5. Select Claude templates for code completion and chat
-6. Enjoy!
-
-## Configure Plugin
-
-QodeAssist comes with default settings that should work immediately after installing a language model. The plugin is pre-configured to use Ollama with standard templates, so you may only need to verify the settings.
 
 1. Open Qt Creator settings (Edit > Preferences on Linux/Windows, Qt Creator > Preferences on macOS)
 2. Navigate to the "Qode Assist" tab
@@ -112,51 +123,20 @@ QodeAssist comes with default settings that should work immediately after instal
    - Ollama is selected as your LLM provider
    - The URL is set to http://localhost:11434
    - Your installed model appears in the model selection
-   - The prompt template is Ollama Auto FIM
+   - The prompt template is Ollama Auto FIM, or Ollama Auto Chat for chat assistance. You can specify a template if it does not work correctly
 4. Click Apply if you made any changes
 
 You're all set! QodeAssist is now ready to use in Qt Creator.
 
-## Supported LLM Providers
-
-QodeAssist currently supports the following LLM (Large Language Model) providers:
-- [Ollama](https://ollama.com)
-- [LM Studio](https://lmstudio.ai)
-- [OpenRouter](https://openrouter.ai)
-- OpenAI compatible providers
-
-## Recommended Models:
-
-QodeAssist has been thoroughly tested and optimized for use with the following language models:
-- Qwen2.5-coder
-- CodeLlama
-- StarCoder2
-- DeepSeek-Coder-V2
-
-### Model Types
-FIM models (codellama:7b-code, starcoder2:7b, etc.) - Optimized for code completion and suggestions
-Instruct models (codellama:7b-instruct, starcoder2:instruct, etc.) - Better for chat assistance, explanations, and code review
-For best results, use FIM models with code completion and Instruct models with chat features.
-
-### Ollama:
-### For autocomplete(FIM)
-```
-ollama run codellama:7b-code
-ollama run starcoder2:7b
-ollama run qwen2.5-coder:7b-base
-ollama run deepseek-coder-v2:16b-lite-base-q3_K_M
-```
-
-### For chat and instruct
-```
-ollama run codellama:7b-instruct
-ollama run starcoder2:instruct
-ollama run qwen2.5-coder:7b-instruct
-ollama run deepseek-coder-v2
-```
-
-### Template-Model Compatibility
+<details>
+  <summary>Example of Ollama settings: (click to expand)</summary>
+  <img width="824" alt="Ollama Settings" src="https://github.com/user-attachments/assets/ed64e03a-a923-467a-aa44-4f790e315b53" />
+</details>
+
+## System Prompt Configuration
+
+The plugin comes with default system prompts optimized for chat and instruct models, as these currently provide better results for code assistance. If you prefer using FIM (Fill-in-Middle) models, you can easily customize the system prompt in the settings.
+
+## Template-Model Compatibility
 
 | Template | Compatible Models | Purpose |
 |----------|------------------|----------|
@@ -172,12 +152,6 @@ ollama run deepseek-coder-v2
 | Llama3 | `llama3 model family` | Chat assistance |
 | Ollama Auto Chat | `Any Ollama chat model` | Chat assistance |
 
-> Note:
-> - FIM (Fill-in-Middle) templates are optimized for code completion
-> - Chat templates are designed for interactive dialogue
-> - The Ollama Auto templates automatically adapt to most Ollama models
-> - Custom Template allows you to define your own prompt format
-
 ## QtCreator Version Compatibility
 
 - QtCreator 15.0.0 - 0.4.x


@@ -90,7 +90,7 @@ void PluginUpdater::handleUpdateResponse(QNetworkReply *reply)
 #elif defined(Q_OS_MACOS)
     if (name.contains("macOS"))
 #else
-    if (name.contains("Linux"))
+    if (name.contains("Linux") && !name.contains("experimental"))
 #endif
     {
         info.downloadUrl = asset.toObject()["browser_download_url"].toString();