# Model Selection Rules

## ALWAYS use these local models:

### For CODING tasks (coding, debugging, scripting, code analysis, file operations):
- **Model**: Qwen3-Coder (local-llama/qwen3-coder)
- **Context**: 16384 tokens

### For CREATIVE/WRITING tasks (writing, storytelling, plot, chapters, character development):
- **Model**: Darkest-Planet (local-llama/darkest-planet)
- **Context**: 131072 tokens

### For ALL OTHER tasks:
- **Default**: Darkest-Planet (local-llama/darkest-planet)

## Model Selection Priority:
1. If the task involves ANY code, use Qwen3-Coder.
2. If the task involves writing, storytelling, or creativity, use Darkest-Planet.
3. Fall back to kilo-auto/free if the local models fail.
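The priority rules above can be sketched as a small routing function. This is a minimal illustration, not part of the spec: the keyword lists and function name are assumptions, and a real router would likely use richer task classification.

```python
def select_model(task: str) -> str:
    """Pick a local model ID for a task, following the priority rules.

    Hypothetical helper; the keyword lists below are illustrative
    assumptions, not an official part of these rules.
    """
    task_lower = task.lower()
    code_keywords = ("code", "coding", "debug", "script", "file operation")
    creative_keywords = ("write", "writing", "story", "plot", "chapter", "character")

    # Rule 1: any coding-related task goes to Qwen3-Coder.
    if any(kw in task_lower for kw in code_keywords):
        return "local-llama/qwen3-coder"
    # Rule 2: writing/storytelling/creative tasks go to Darkest-Planet.
    if any(kw in task_lower for kw in creative_keywords):
        return "local-llama/darkest-planet"
    # Default: all other tasks also use Darkest-Planet.
    # kilo-auto/free is only a runtime fallback, not a routing target.
    return "local-llama/darkest-planet"
```

Note that the kilo-auto/free fallback is deliberately not returned here; it applies only when a selected local model fails at request time.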

## Note:
Local models are served by a llama.cpp server at http://127.0.0.1:9292/v1 (OpenAI-compatible API).
If a local model fails or is unavailable, fall back to kilo-auto/free.
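Since llama.cpp exposes an OpenAI-compatible API at that base URL, a request to a local model can be built against the standard /v1/chat/completions endpoint. A minimal sketch, assuming the usual chat-completion payload shape (the helper name and field choices are illustrative):

```python
LOCAL_BASE_URL = "http://127.0.0.1:9292/v1"

def build_request(model: str, prompt: str) -> tuple[str, dict]:
    """Build the URL and JSON payload for an OpenAI-compatible
    chat-completion call to the local llama.cpp server.

    Hypothetical helper; it constructs the request but does not send it.
    """
    url = f"{LOCAL_BASE_URL}/chat/completions"
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return url, payload
```

If the HTTP call to this URL fails (connection refused, timeout), the client should retry the same prompt against kilo-auto/free per the fallback rule.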
