Remote MCP
Use this for ChatGPT Developer Mode and MCP-capable agent clients.
https://mcp.friskydev.com/mcp
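Most MCP-capable clients point at a remote server through a JSON config entry. A minimal sketch, assuming a client that uses the common `mcpServers` shape and a bearer token issued to you (exact key names vary by client; the env-var substitution syntax is illustrative):

```json
{
  "mcpServers": {
    "frisky": {
      "url": "https://mcp.friskydev.com/mcp",
      "headers": {
        "Authorization": "Bearer ${FRISKY_BOT_API_TOKEN}"
      }
    }
  }
}
```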
Frisky Developments Command Layer
A premium operator deck for Frisky Developments: fifteen specialist agents, one shared memory layer, and a deployable MCP surface for ChatGPT, Codex, and the next wave of AI workers.
Frisky Developments Architecture
Cyberpunk clarity without dashboard clutter: every operator, model surface, deployment lane, and memory write lands in a shared control plane.
Cloud Integration Layer
The accelerated Frisky builder ecosystem: one cloud-callable specialist layer for ChatGPT, Codex, Cursor, Antigravity, CLI runners, GPT Actions, Supabase Edge Functions, Neon-backed agents, Telegram bots, and server automations.
Import this schema into GPT Builder, then use bearer auth for live specialist calls.
/openapi.json
Server tools can call Frisky specialists directly with a Frisky OAuth token or bot token.
POST /consult_specialist
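A server-side call to this endpoint can be sketched with the Python standard library. The payload fields (`specialist`, `prompt`) are illustrative assumptions, not the confirmed schema; check `/openapi.json` for the authoritative request shape:

```python
import json
import os
import urllib.request

def build_consult_request(specialist: str, prompt: str) -> urllib.request.Request:
    """Build an authenticated POST to /consult_specialist.

    Reads the bot token (or a Frisky OAuth token) from the runtime
    environment; the body fields here are assumed for illustration.
    """
    token = os.environ["FRISKY_BOT_API_TOKEN"]
    body = json.dumps({"specialist": specialist, "prompt": prompt}).encode()
    return urllib.request.Request(
        "https://mcp.friskydev.com/consult_specialist",
        data=body,
        method="POST",
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
    )

# To send the request from a server tool or agent job:
# with urllib.request.urlopen(build_consult_request("frisky-director", "...")) as resp:
#     print(resp.read().decode())
```

Building the request separately from sending it keeps the token handling testable without touching the network.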
Install the shell CLI on local machines, CI, or Coolify dev containers.
/frisky-mcp-cli.sh
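A typical install flow for a script like this, sketched below; the install path and any flags are assumptions, so review the script before running it:

```shell
# Fetch the CLI script, inspect it, then install it on the PATH.
curl -fsSL https://mcp.friskydev.com/frisky-mcp-cli.sh -o frisky-mcp-cli.sh
less frisky-mcp-cli.sh                      # review before trusting
install -m 0755 frisky-mcp-cli.sh /usr/local/bin/frisky-mcp
```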
Use remote MCP configs for coding sessions, reviews, planning, and turning rough ideas into executable work.
Keep database work server-side and call Frisky specialists from Edge Functions or agent jobs.
Bearer $FRISKY_BOT_API_TOKEN
Infisical stays the vault. Coolify dev receives runtime tokens without putting secrets in browsers or GPT instructions.
vault -> runtime env -> MCP server
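The vault -> runtime env -> MCP server flow above means application code only ever reads the token from its environment; Infisical injects it at deploy time and nothing is hardcoded or shipped to a browser. A minimal sketch, assuming the `FRISKY_BOT_API_TOKEN` variable name from the bearer line above:

```python
import os

def auth_headers() -> dict:
    """Return the Authorization header for Frisky specialist calls.

    The token is injected into the runtime environment (Infisical ->
    Coolify / Edge Function env); failing loudly here catches a missing
    injection before any request is attempted.
    """
    token = os.environ.get("FRISKY_BOT_API_TOKEN")
    if not token:
        raise RuntimeError("FRISKY_BOT_API_TOKEN not present in runtime env")
    return {"Authorization": f"Bearer {token}"}
```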
Frisky operators wired through one MCP surface
Consult Frisky Director: prioritize today's beta work
Consult Frisky Launch Strategy: build a beta rollout checklist
Consult Frisky Codex Engineer: review deployment risk