Quickstart

Get from zero to your first AI call in under 5 minutes. Pick your integration method below.


Direct API Integration

Use standard HTTP requests to call any supported LLM through a single endpoint. Compatible with any language or framework.

1 Get Your API Key

Sign up at monstergaming.ai/signup to get your API key. It starts with mg_test_ (sandbox) or mg_live_ (production).

2 Make Your First Call

curl
# Your first API call
curl -X POST https://api.monstergaming.ai/v1/messages \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "sonnet",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Hello from Monster Gaming!"}
    ]
  }'

Tip: You can use friendly model names like sonnet, opus, haiku, gpt-4o, gemini, or deepseek. We resolve them to the latest version automatically.
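The same request can be assembled in any language. Below is a minimal Python sketch that builds the headers and JSON body from the curl example above; sending it is left to your HTTP client of choice (the endpoint URL and payload shape are taken directly from the example, everything else is illustrative).

```python
import json

API_URL = "https://api.monstergaming.ai/v1/messages"

def build_message_request(api_key: str, prompt: str,
                          model: str = "sonnet", max_tokens: int = 1024):
    """Build the headers and JSON body for a /v1/messages call."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = {
        "model": model,
        "max_tokens": max_tokens,
        "messages": [{"role": "user", "content": prompt}],
    }
    return headers, json.dumps(body)

headers, body = build_message_request("YOUR_API_KEY", "Hello from Monster Gaming!")
print(body)
```

From here, pass `headers` and `body` to any HTTP library (`urllib.request`, `requests`, etc.) as a POST to `API_URL`.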

3 Streaming

For real-time streaming responses (SSE), use the /v1/stream endpoint:

curl
curl -N -X POST https://api.monstergaming.ai/v1/stream \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "sonnet",
    "max_tokens": 1024,
    "messages": [
      {"role": "user", "content": "Write a short game design doc for a puzzle game."}
    ]
  }'
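Streamed responses arrive as server-sent events, one `data:` line per chunk. The exact event schema isn't shown in this guide, so the sketch below assumes Anthropic-style text deltas and parses a captured sample string rather than opening a live connection; treat the field names as hypothetical.

```python
import json

# Captured sample stream (hypothetical event shape, for illustration only).
sample_stream = """\
data: {"type": "content_block_delta", "delta": {"text": "Match-3 puzzle"}}

data: {"type": "content_block_delta", "delta": {"text": " with gravity twists."}}

data: [DONE]
"""

def collect_text(raw: str) -> str:
    """Concatenate text deltas from data: lines, stopping at [DONE]."""
    parts = []
    for line in raw.splitlines():
        if not line.startswith("data: "):
            continue
        payload = line[len("data: "):]
        if payload == "[DONE]":
            break
        event = json.loads(payload)
        if event.get("type") == "content_block_delta":
            parts.append(event["delta"]["text"])
    return "".join(parts)

print(collect_text(sample_stream))  # Match-3 puzzle with gravity twists.
```

In a real client you would feed lines from the open HTTP response into the same loop instead of a fixed string.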

Supported Models

Friendly Name | Provider                    | Tier Required
haiku         | Anthropic Claude Haiku 4.5  | Free+
sonnet        | Anthropic Claude Sonnet 4.6 | Starter+
opus          | Anthropic Claude Opus 4.6   | Pro+
gpt-4o        | OpenAI GPT-4o               | Starter+
gemini        | Google Gemini 3.1 Pro       | Free+ (default)
deepseek      | DeepSeek Chat               | Starter+
mistral       | Mistral Large               | Starter+
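If you want to check an alias against your plan before sending a request, the table above can be encoded as a small lookup. The tier ordering (Free < Starter < Pro) is an assumption for illustration:

```python
# Friendly-name table from the docs, as {alias: minimum tier}.
TIERS = ["Free", "Starter", "Pro"]  # assumed ordering, lowest to highest
MODEL_MIN_TIER = {
    "haiku": "Free", "gemini": "Free",
    "sonnet": "Starter", "gpt-4o": "Starter",
    "deepseek": "Starter", "mistral": "Starter",
    "opus": "Pro",
}

def can_use(model: str, plan: str) -> bool:
    """True if the plan meets the model's minimum tier."""
    if model not in MODEL_MIN_TIER:
        raise ValueError(f"unknown model alias: {model}")
    return TIERS.index(plan) >= TIERS.index(MODEL_MIN_TIER[model])

print(can_use("opus", "Starter"))  # False
```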

Response Format

Responses follow the Anthropic Messages API format:

json
{
  "id": "msg_01XFDUDYJgAACzvnptvVoYEL",
  "type": "message",
  "role": "assistant",
  "content": [
    {"type": "text", "text": "Hello! How can I help with your game?"}
  ],
  "model": "claude-sonnet-4-6-20250514",
  "usage": {"input_tokens": 12, "output_tokens": 15}
}
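Since replies may contain multiple content blocks, pull the text out by joining the text-type blocks. A small sketch against the sample payload above:

```python
# The sample response from the docs, as a Python dict.
response = {
    "id": "msg_01XFDUDYJgAACzvnptvVoYEL",
    "type": "message",
    "role": "assistant",
    "content": [
        {"type": "text", "text": "Hello! How can I help with your game?"}
    ],
    "model": "claude-sonnet-4-6-20250514",
    "usage": {"input_tokens": 12, "output_tokens": 15},
}

def reply_text(message: dict) -> str:
    """Join all text-type content blocks into one string."""
    return "".join(
        block["text"] for block in message["content"] if block["type"] == "text"
    )

print(reply_text(response))  # Hello! How can I help with your game?
```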

Error Handling

All errors return JSON with error, code, and message fields, along with a standard HTTP status code.
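A simple way to handle the documented {error, code, message} shape is to retry only on transient failures. The sketch below assumes the code field carries the HTTP status and that conventional semantics apply (429 and 5xx retryable, other 4xx not); both are assumptions, not documented behavior.

```python
# Hypothetical error payload following the documented field names.
error_body = {
    "error": "rate_limit_exceeded",
    "code": 429,
    "message": "Too many requests; retry later",
}

RETRYABLE_CODES = {429, 500, 502, 503}  # assumed transient statuses

def should_retry(body: dict) -> bool:
    """Retry only on rate limits and transient server errors."""
    return body.get("code") in RETRYABLE_CODES

print(should_retry(error_body))  # True
```

Pair this with exponential backoff in production so repeated 429s don't hammer the endpoint.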

MCP Server Setup

Use Monster Gaming as an MCP server in Claude Code, Cursor, or any MCP-compatible IDE. One config, all models.

1 Claude Code

Add to your ~/.claude/settings.json:

json
{
  "mcpServers": {
    "monster-gaming": {
      "url": "https://api.monstergaming.ai/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

2 Cursor / VS Code

Add to your .cursor/mcp.json or VS Code MCP settings:

json
{
  "servers": {
    "monster-gaming": {
      "type": "streamable-http",
      "url": "https://api.monstergaming.ai/mcp",
      "headers": {
        "Authorization": "Bearer YOUR_API_KEY"
      }
    }
  }
}

3 Verify Connection

Once configured, you should see Monster Gaming tools available in your IDE. Test with:

prompt
# In Claude Code or Cursor, ask:
"Use Monster Gaming to generate a player movement script for Unreal Engine 5"

Note: MCP integration provides tool-use capabilities beyond raw chat — including game asset generation, code review tuned for game engines, and project-aware context.

Monster CLI

A command-line tool for quick interactions, key management, and usage monitoring.

1 Install

bash
# Install via npm (Node 18+)
npm install -g @monstergaming/cli

# Or download the binary
curl -fsSL https://monstergaming.ai/install.sh | sh

2 Login

bash
# Login with your API key
monster login
# Enter your API key when prompted

# Or set it via environment variable
export MONSTER_API_KEY=mg_live_xxxx

3 Ask Questions

bash
# Quick one-shot question
monster ask "How do I implement A* pathfinding in UE5?"

# Choose a specific model
monster ask --model opus "Review this blueprint for performance issues"

# Pipe code for review
cat PlayerController.cpp | monster ask "Find bugs in this code"

4 Manage Keys & Usage

bash
# List your API keys
monster keys

# Check usage for current billing period
monster usage

# Generate a new sandbox key
monster keys create --env sandbox

# Test connectivity
monster test

Tip: Use monster ask --stream for real-time streaming output, just like the API's /v1/stream endpoint.