Mirror of https://github.com/Palm1r/QodeAssist.git
# Troubleshooting

## Connection Issues
### 1. Verify provider URL and port
Make sure you're using the correct default URLs:
- Ollama: `http://localhost:11434`
- LM Studio: `http://localhost:1234`
- llama.cpp: `http://localhost:8080`
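A quick way to check all three defaults from a terminal is a minimal curl sketch like the one below. Any HTTP response at all counts as "reachable" here (we only test that something is listening on the port); adjust the URLs if you changed a provider's port.

```shell
#!/bin/sh
# Report whether each default local provider endpoint answers at all.
check_endpoint() {
    name=$1
    url=$2
    # -s: quiet, --max-time 2: fail fast, -o /dev/null: discard the body.
    # curl exits 0 for any HTTP response, non-zero if nothing is listening.
    if curl -s --max-time 2 -o /dev/null "$url"; then
        echo "$name: reachable at $url"
    else
        echo "$name: NOT reachable at $url (is the server running?)"
    fi
}

check_endpoint "Ollama"    "http://localhost:11434"
check_endpoint "LM Studio" "http://localhost:1234"
check_endpoint "llama.cpp" "http://localhost:8080"
```

If an endpoint is unreachable, start the provider (for example `ollama serve`) before retrying from Qt Creator.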
### 2. Check model and template compatibility
- Ensure the correct model is selected in settings
- Verify that the selected prompt template matches your model
- Some models may not support certain features (e.g., tool calling)
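To confirm that the model name selected in settings actually exists on your provider, you can query the provider's model list directly. Ollama exposes installed models at `/api/tags`; LM Studio and llama.cpp serve an OpenAI-compatible `/v1/models` endpoint. A small sketch (URLs assume the default ports above):

```shell
#!/bin/sh
# Print the provider's model list as raw JSON, or a hint if it is down.
list_models() {
    url=$1
    # If curl cannot connect, the || branch prints a short diagnostic instead.
    curl -s --max-time 2 "$url" || echo "provider not reachable at $url"
}

# Ollama: installed models
list_models "http://localhost:11434/api/tags"
# LM Studio / llama.cpp: OpenAI-compatible model list
list_models "http://localhost:1234/v1/models"
```

The model name in QodeAssist settings must match one of the names returned here exactly.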
### 3. Linux compatibility
- Prebuilt binaries support Ubuntu 22.04+
- For other distributions, you may need to build manually
- See issue #48 for known Linux compatibility issues and solutions
## Reset Settings
If issues persist, you can reset settings to their default values:
- Open Qt Creator → Settings → QodeAssist
- Select the settings page you want to reset
- Click the "Reset Page to Defaults" button
Note:
- API keys are preserved during reset
- You will need to re-select your model after reset
## Common Issues
### Plugin doesn't appear after installation
- Restart Qt Creator completely
- Check that the plugin is enabled in About Plugins
- Verify you downloaded the correct version for your Qt Creator
### No suggestions appearing
- Check that code completion is enabled in QodeAssist settings
- Verify your provider/model is running and accessible
- Check Qt Creator's Application Output pane for error messages
- Try the manual suggestion hotkey (⌥⌘Q on macOS, Ctrl+Alt+Q on Windows/Linux)
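If the steps above don't isolate the problem, sending a completion request straight to the provider (bypassing Qt Creator entirely) tells plugin problems apart from provider problems. This sketch targets Ollama's `/api/generate` endpoint; `codellama` is only an example model name, so substitute whatever model you selected in settings:

```shell
#!/bin/sh
# Ask Ollama for one non-streamed completion, independent of Qt Creator.
test_generate() {
    model=$1
    curl -s --max-time 30 http://localhost:11434/api/generate \
        -d "{\"model\": \"$model\", \"prompt\": \"def add(a, b):\", \"stream\": false}" \
        || echo "request failed: is Ollama running on port 11434?"
}

test_generate "codellama"
```

If this returns a completion but the plugin still shows nothing, the issue is on the Qt Creator side (settings, template, or hotkey); if it fails here too, fix the provider first.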
### Chat not responding
- Verify your API key is configured correctly (for cloud providers)
- Check internet connection (for cloud providers)
- Ensure the provider service is running (for local providers)
- Look for error messages in the chat panel
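For cloud providers, you can verify the API key outside Qt Creator by calling the provider's model-list endpoint with the key. The sketch below assumes an OpenAI-compatible provider and an `OPENAI_API_KEY` environment variable; substitute your provider's base URL and key, and never paste a real key into logs or bug reports:

```shell
#!/bin/sh
# Probe an OpenAI-compatible /models endpoint and interpret the HTTP status.
check_api_key() {
    base_url=$1
    key=$2
    # -w "%{http_code}" prints just the status; "000" means no connection.
    status=$(curl -s --max-time 10 -o /dev/null -w "%{http_code}" \
        -H "Authorization: Bearer $key" "$base_url/models")
    case "$status" in
        200)     echo "key accepted (HTTP 200)" ;;
        401|403) echo "key rejected (HTTP $status): re-check the key in settings" ;;
        *)       echo "no usable response (HTTP $status): check network/base URL" ;;
    esac
}

check_api_key "https://api.openai.com/v1" "$OPENAI_API_KEY"
```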
## Getting Help
If you continue to experience issues:
- Check existing GitHub Issues
- Join our Discord Community for support
- Open a new issue with:
  - Qt Creator version
  - QodeAssist version
  - Operating system
  - Provider and model being used
  - Steps to reproduce the problem
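The environment details requested above can be collected into one paste-ready block from a terminal. This assumes `qtcreator` is on your PATH (on macOS it usually is not; use Help → About Qt Creator instead), and the QodeAssist version must still be read from the plugin dialog:

```shell
#!/bin/sh
# Gather version/OS details for a bug report.
report() {
    if command -v qtcreator >/dev/null 2>&1; then
        echo "Qt Creator: $(qtcreator -version 2>&1 | head -n 1)"
    else
        echo "Qt Creator: (qtcreator not on PATH; see Help > About Qt Creator)"
    fi
    echo "QodeAssist: (see the About Plugins dialog)"
    echo "OS: $(uname -srm)"
}
report
```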