AI / LLM ENGINE
Waterfall · Ollama · DABI Config
OLLAMA MODELS (LOCAL)
LLM WATERFALL — PRIORITY ORDER
Fallback chain: if a provider fails, the request moves to the next provider in the list. Cost is tracked per 1K tokens.
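The waterfall above can be sketched as a simple loop over a priority-ordered provider list. This is a minimal illustration, not the actual engine: the provider names come from the cost panel, but the per-1K-token rates and the `call_provider` interface are placeholder assumptions.

```python
# Minimal LLM waterfall sketch: try providers in priority order,
# fall back on failure, track cost per 1K tokens.
PROVIDERS = [
    # (name, assumed cost per 1K tokens in USD) -- local/free tiers cost $0
    ("groq", 0.0),
    ("ollama", 0.0),
    ("deepseek", 0.0002),
    ("openai", 0.005),
    ("claude", 0.003),
]

def waterfall(prompt, call_provider):
    """Try each provider in order; return (name, reply, cost) from the
    first one that succeeds, or raise if every provider fails."""
    for name, rate_per_1k in PROVIDERS:
        try:
            reply, tokens_used = call_provider(name, prompt)
        except Exception:
            continue  # provider failed -> next in the priority list
        cost = tokens_used / 1000 * rate_per_1k
        return name, reply, cost
    raise RuntimeError("all providers in the waterfall failed")
```

A caller supplies `call_provider(name, prompt)` returning `(reply, tokens_used)`; raising any exception triggers the fallback to the next provider.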
DABI TEMPERATURE: 0.3 (scale 0 = precise to 1.0 = creative)
MAX TOKENS: 4096 (range 256-8192)
CONFIDENCE THRESHOLD: 80% (range 0-100%)
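The three DABI knobs above could be captured in a small validated config object. A minimal sketch, assuming these field names (only the values and ranges come from the panel):

```python
from dataclasses import dataclass

@dataclass
class DabiConfig:
    """Generation settings for the DABI panel (field names assumed)."""
    temperature: float = 0.3           # 0 = precise ... 1.0 = creative
    max_tokens: int = 4096             # allowed range 256-8192
    confidence_threshold: float = 0.80 # answers below this are low-confidence

    def validate(self) -> None:
        # Reject values outside the slider ranges shown in the panel.
        assert 0.0 <= self.temperature <= 1.0
        assert 256 <= self.max_tokens <= 8192
        assert 0.0 <= self.confidence_threshold <= 1.0
```

Calling `validate()` after loading user input keeps the settings inside the slider ranges.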
LLM COST ESTIMATE (MONTHLY)
GROQ (FREE) ....... $0
DEEPSEEK .......... ~$5
OLLAMA (LOCAL) .... $0
OPENAI ............ ~$20
CLAUDE ............ ~$21
TOTAL/MO .......... ~$46
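The monthly total is simply the sum of the per-provider figures above. A quick check, with the numbers copied from the table:

```python
# Approximate monthly spend per provider, in USD (from the cost panel).
monthly_usd = {
    "groq": 0,      # free tier
    "deepseek": 5,  # approximate
    "ollama": 0,    # local models, no API cost
    "openai": 20,   # approximate
    "claude": 21,   # approximate
}
total = sum(monthly_usd.values())  # matches the ~$46/mo total shown
```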
RINET AI OS — MISSION CONTROL