PromptMySite Admin Dashboard
Manage global system settings, caching, and local inference models.
Inference Architecture & Fallbacks
Drag and drop to set the fallback priority for LLM "Help Me" requests.
1
Mac Studio Local (MiniMax M2.5)
Primary. Zero variable OpEx.
2
Kilocode API (MiniMax M2.5)
Fallback Layer 2. API key retrieved from 1Password.
3
OpenRouter (Any/Anthropic)
Fallback Layer 3. Not configured.
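The priority sequence above can be sketched as a simple fallback chain: try each backend in order and fall through to the next on failure. This is a minimal illustration, not the dashboard's actual implementation; the backend names and call signature are hypothetical.

```python
from typing import Callable, List, Tuple

# A backend takes a prompt and returns a completion (hypothetical signature).
Backend = Callable[[str], str]

def make_chain(backends: List[Tuple[str, Backend]]) -> Callable[[str], str]:
    """Try each backend in priority order; fall through on any exception."""
    def run(prompt: str) -> str:
        errors = []
        for name, backend in backends:
            try:
                return backend(prompt)
            except Exception as exc:
                errors.append(f"{name}: {exc}")
        # Every layer failed, including the last fallback.
        raise RuntimeError("All backends failed: " + "; ".join(errors))
    return run
```

For example, `make_chain([("mac-studio-local", local), ("kilocode", kilo), ("openrouter", router)])` mirrors the three layers shown above: a request only reaches OpenRouter if both the local model and the Kilocode API raise.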
Knowledge Base Cache
84%
Cache Hit Rate
$2.40
OpEx Saved Today
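The two cache metrics above are related: every cache hit avoids one paid inference call, so hit rate and savings can be derived from the same counters. The sketch below shows the arithmetic; the counter values and per-request cost are illustrative assumptions, not the dashboard's real figures.

```python
def cache_stats(hits: int, misses: int, cost_per_request: float):
    """Derive hit rate and OpEx saved from raw cache counters (sketch)."""
    total = hits + misses
    hit_rate = hits / total if total else 0.0
    # Each hit is one inference request the paid/compute path never saw.
    saved = hits * cost_per_request
    return hit_rate, saved

# Illustrative numbers only: 2100 hits, 400 misses, $0.001 per avoided call.
rate, saved = cache_stats(hits=2100, misses=400, cost_per_request=0.001)
```

With these assumed inputs the hit rate works out to 84% and the savings to $2.10; the real dashboard values would come from its own counters and pricing.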
Feature Tiers
Free Tier
Paid Tier