
QodeAssist - AI-powered coding assistant plugin for Qt Creator


QodeAssist is an AI-powered coding assistant plugin for Qt Creator. It provides intelligent code completion and suggestions for C++ and QML, leveraging large language models through local providers like Ollama. Enhance your coding productivity with context-aware AI assistance directly in your Qt development environment.

Table of Contents

  1. Overview
  2. Installation
  3. Configure Plugin
  4. Supported LLM Providers
  5. Recommended Models
  6. QtCreator Version Compatibility
  7. Development Progress
  8. Hotkeys
  9. Troubleshooting
  10. Support the Development
  11. How to Build

Overview

  • AI-powered code completion
  • Chat functionality:
    • Side and Bottom panels
  • Support for multiple LLM providers (Ollama, LM Studio, OpenAI-compatible)
  • Extensive library of model-specific templates
  • Custom template support
  • Easy configuration and model selection

Screenshots in the repository show the code completion preview, chat with LLM models in the side panels, and chat in the bottom panel.

Installation

  1. Install Latest QtCreator
  2. Install Ollama. Make sure to review the system requirements before installation.
  3. Install a language model in Ollama via the terminal. For example, you can run one of the following (a quick way to verify the download is shown after these steps):

For standard computers (minimum 8GB RAM):

ollama run qwen2.5-coder:7b

For better performance (16GB+ RAM):

ollama run qwen2.5-coder:14b

For high-end systems (32GB+ RAM):

ollama run qwen2.5-coder:32b

  4. Download the QodeAssist plugin build for your version of Qt Creator.
  5. Launch Qt Creator and install the plugin:
    • On macOS, go to Qt Creator -> About Plugins...; on Windows/Linux, go to Help -> About Plugins...
    • Click on "Install Plugin..."
    • Select the downloaded QodeAssist plugin archive file
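
Before moving on to the plugin configuration, it is worth making sure the model download actually finished. A quick check, assuming a default Ollama installation:

# prints the models available locally; the model you pulled in step 3 should be listed
ollama list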

Configure Plugin

QodeAssist comes with default settings that should work immediately after installing a language model. The plugin is pre-configured to use Ollama with standard templates, so you may only need to verify the settings.

  1. Open Qt Creator settings (Edit > Preferences on Linux/Windows, Qt Creator > Preferences on macOS)
  2. Navigate to the "Qode Assist" tab
  3. On the "General" page, verify:
    • Ollama is selected as your LLM provider
    • The URL is set to http://localhost:11434
    • Your installed model appears in the model selection
    • The prompt template is Ollama Auto FIM
  4. Click Apply if you made any changes

You're all set! QodeAssist is now ready to use in Qt Creator.
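
If the model list stays empty or the provider cannot be reached, it can help to confirm that the Ollama server is actually answering at the configured URL. A minimal check from a terminal, assuming the default URL above:

# should return a JSON list of the models installed in Ollama
curl http://localhost:11434/api/tags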

Supported LLM Providers

QodeAssist currently supports the following LLM (Large Language Model) providers:

  • Ollama
  • LM Studio (experimental)
  • OpenAI compatible providers (experimental)

Recommended Models

QodeAssist has been thoroughly tested and optimized for use with the following language models:

  • Qwen2.5-coder
  • CodeLlama
  • StarCoder2
  • DeepSeek-Coder-V2

Ollama:

For autocomplete (FIM)

ollama run codellama:7b-code
ollama run starcoder2:7b
ollama run qwen2.5-coder:7b-base
ollama run deepseek-coder-v2:16b-lite-base-q3_K_M

For chat

ollama run codellama:7b-instruct
ollama run starcoder2:instruct
ollama run qwen2.5-coder:7b-instruct
ollama run deepseek-coder-v2

Template-Model Compatibility

Template            Compatible Models                          Purpose
CodeLlama FIM       codellama:code                             Code completion
DeepSeekCoder FIM   deepseek-coder-v2, deepseek-v2.5           Code completion
Ollama Auto FIM     Any Ollama base model                      Code completion
Qwen FIM            Qwen 2.5 models                            Code completion
StarCoder2 FIM      starcoder2 base model                      Code completion
Alpaca              starcoder2:instruct                        Chat assistance
Basic Chat          Messages without tokens                    Chat assistance
ChatML              Qwen 2.5 models                            Chat assistance
Llama2              llama2 model family, codellama:instruct    Chat assistance
Llama3              llama3 model family                        Chat assistance
Ollama Auto Chat    Any Ollama chat model                      Chat assistance

Note:

  • FIM (Fill-in-Middle) templates are optimized for code completion
  • Chat templates are designed for interactive dialogue
  • The Ollama Auto templates automatically adapt to most Ollama models
  • Custom Template allows you to define your own prompt format

QtCreator Version Compatibility

  • QtCreator 15.0.0: plugin 0.4.x
  • QtCreator 14.0.2: plugin 0.2.3 to 0.3.x
  • QtCreator 14.0.1: plugin 0.2.2 and below

Development Progress

  • Basic plugin with code autocomplete functionality
  • Improve and automate settings
  • Add chat functionality
  • Sharing diff with model
  • Sharing project source with model
  • Support for more providers and models

Hotkeys

  • To manually request a suggestion, use the following shortcut (you can change it in the settings):
    • on Mac: Option + Command + Q
    • on Windows: Ctrl + Alt + Q
  • To insert the full suggestion, you can use the TAB key
  • To insert line by line, you can use the "Move cursor word right" shortcut:
    • On Mac: Option + Right Arrow
    • On Windows: Alt + Right Arrow

Troubleshooting

If QodeAssist is having problems connecting to the LLM provider, please check the following:

  1. Verify the IP address and port:
    • For Ollama, the default is http://localhost:11434
    • For LM Studio, the default is http://localhost:1234

  2. Check the endpoint:

Make sure the endpoint in the settings matches the one required by your provider:
    • For Ollama, it should be /api/generate
    • For LM Studio and OpenAI-compatible providers, it is usually /v1/chat/completions
An example request for checking the Ollama endpoint directly is shown after this list.

  3. Confirm that the selected model and template are compatible:

Ensure you've chosen the correct model in the "Select Models" option, and verify that the selected prompt template matches the model you're using.
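
If requests seem to go nowhere, one way to narrow the problem down is to call the provider endpoint directly and see whether it answers at all. A minimal sketch for Ollama, assuming the default URL and one of the models installed earlier:

# sends one non-streaming completion request straight to Ollama;
# a JSON reply means the server, endpoint, and model name are all reachable
curl http://localhost:11434/api/generate -d '{
  "model": "qwen2.5-coder:7b",
  "prompt": "int main() {",
  "stream": false
}'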

If you're still experiencing issues with QodeAssist, you can try resetting the settings to their default values:

  1. Open Qt Creator settings
  2. Navigate to the "Qode Assist" tab
  3. Pick the settings page you want to reset
  4. Click the "Reset Page to Defaults" button
    • The API key will not be reset
    • You will need to select the model again after the reset

Support the development of QodeAssist

If you find QodeAssist helpful, there are several ways you can support the project:

  1. Report Issues: If you encounter any bugs or have suggestions for improvements, please open an issue on our GitHub repository.

  2. Contribute: Feel free to submit pull requests with bug fixes or new features.

  3. Spread the Word: Star our GitHub repository and share QodeAssist with your fellow developers.

  4. Financial Support: If you'd like to support the development financially, you can make a donation using one of the following:

    • Bitcoin (BTC): bc1qndq7f0mpnlya48vk7kugvyqj5w89xrg4wzg68t
    • Ethereum (ETH): 0xA5e8c37c94b24e25F9f1f292a01AF55F03099D8D
    • Litecoin (LTC): ltc1qlrxnk30s2pcjchzx4qrxvdjt5gzuervy5mv0vy
    • USDT (TRC20): THdZrE7d6epW6ry98GA3MLXRjha1DjKtUx

Every contribution, no matter how small, is greatly appreciated and helps keep the project alive!

How to Build

Create a build directory and run

cmake -DCMAKE_PREFIX_PATH=<path_to_qtcreator> -DCMAKE_BUILD_TYPE=RelWithDebInfo <path_to_plugin_source>
cmake --build .

where:

  • <path_to_qtcreator> is the relative or absolute path to a Qt Creator build directory, to a combined binary and development package (Windows / Linux), or to the Qt Creator.app/Contents/Resources/ directory of a combined binary and development package (macOS)
  • <path_to_plugin_source> is the relative or absolute path to this plugin directory
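
As a concrete illustration, a full build on Linux might look like the following. The paths are placeholders only and depend on where Qt Creator and the plugin sources live on your machine:

# example paths only; substitute your own Qt Creator package and plugin checkout
mkdir build && cd build
cmake -DCMAKE_PREFIX_PATH=~/qtcreator-15.0.0 -DCMAKE_BUILD_TYPE=RelWithDebInfo ~/src/QodeAssist
cmake --build .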

For Contributors

QML code style: please follow the guidelines at https://github.com/Furkanzmc/QML-Coding-Guide (thanks to @Furkanzmc for collecting them).
C++ code style: use the .clang-format file in the project.
