Up/Down Token Schema for AI Engineers
AI engineers integrate Resolved Markets into agent systems via MCP (Model Context Protocol) to provide autonomous agents with real-time market intelligence. The platform delivers continuous orderbook snapshots from 100+ Polymarket contracts across crypto, sports, economics, and weather with 20Hz capture rates and millisecond precision. MCP integration enables agents to query historical snapshots, stream live updates via WebSocket, analyze bid/ask spreads for sentiment, and trigger trading or research workflows based on market conditions. With structured API responses and rich context about prediction market states, AI agents can build context-aware reasoning about market sentiment, event probabilities, and cross-market correlations without manual prompt engineering.
Up/Down Token Schema is shaped for AI agents. Resolved Markets pipes the 14-column ClickHouse orderbook schema into MCP, REST, and WebSocket so language-model agents can ground responses in real Polymarket orderbook state.
Data challenges AI Engineers run into
Up/Down Token Schema from Resolved Markets is built around the data gaps AI Engineers hit when they try to work with raw Polymarket feeds.
Market data integration complexity reduces agent autonomy and increases latency
Building intelligent agents requires integrating external data sources, but market data APIs are complex: authentication tokens, rate limiting, inconsistent schemas, and error handling. AI engineers waste engineering time building wrapper layers just to fetch data, when they should be focusing on agent reasoning and decision-making. Resolved Markets' MCP integration abstracts all this complexity into simple, agent-friendly tools that return structured data ready for LLM processing.
Unstructured market data requires extensive prompt engineering for reasoning
Market data delivered as raw tickers (BTC: 42000, spread: 50) forces engineers to write detailed prompts explaining financial concepts to LLMs. 'What does a 50-basis-point spread mean for sentiment?' 'How does Polymarket probability compare to consensus?' Engineers spend hours tuning prompts when they could be building agent behaviors. Resolved Markets provides semantically rich context: 'orderbook_imbalance', 'spread_evolution_direction', 'depth_weighted_midpoint'—data that LLMs understand intuitively.
Limited context windows for understanding prediction market dynamics
Agents need sufficient context to reason effectively. Single orderbook snapshots lack history. Did the spread widen or narrow? Is bid/ask concentration increasing (conviction) or decreasing (uncertainty)? Resolved Markets provides context windows of recent snapshots, enabling agents to understand market momentum and trend. This historical context prevents agents from overreacting to momentary noise and enables reasoning about market regime changes.
Difficulty correlating events across multiple market categories
Prediction markets for crypto, sports, economics, and weather operate in silos in most systems. An agent predicting BTC price needs context that FOMC probability just shifted, or that major weather events affect energy prices. Resolved Markets unifies all categories through MCP, enabling agents to discover cross-market correlations (like 'when recession probabilities rise, commodity prices fall'). This multi-category awareness makes agents dramatically smarter about market relationships.
Built for quantitative work on Up/Down Token Schema
Orderbook-level prediction-market data that doesn't exist anywhere else.
MCP integration enables zero-boilerplate market data access in agent systems
With MCP integration, agents access Resolved Markets like any other tool: query orderbooks, stream updates, retrieve historical snapshots. There is no authentication, rate-limit management, or error-handling boilerplate to write. Your agent system treats market data as a first-class citizen, enabling autonomous decision-making workflows: 'If BTC prediction probability spikes above 85%, alert the trading desk' or 'When the FOMC outcome probability crystallizes, analyze correlated markets'. This abstraction unlocks a level of agent autonomy that was previously impractical.
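The 'spikes above 85%' workflow can be sketched as a plain predicate an agent evaluates on each snapshot. This is an illustrative sketch only: it treats the UP token's mid-price as the implied probability, which is an assumption rather than documented API behavior; the `best_bid`/`best_ask` field names follow the schema.

```javascript
// Illustrative agent-side rule: alert when the UP token's implied
// probability (approximated here as the mid-price) crosses a threshold.
// The 0.85 default mirrors the "above 85%" example in the text.
function shouldAlertTradingDesk(snapshot, threshold = 0.85) {
  const impliedProbability = (snapshot.best_bid + snapshot.best_ask) / 2;
  return impliedProbability > threshold;
}
```

An agent would call this on every streamed snapshot and hand a positive result to its alerting tool.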
Structured JSON responses optimize for LLM reasoning without prompt engineering
LLMs reason better with semantically meaningful data. Instead of raw spread values, Resolved Markets' structured responses include 'sentiment_direction: bullish_shift', 'depth_imbalance: buy_side_concentration_75_percent', 'volatility_regime: elevated'. Agents understand these concepts without interpretation layers. This reduces hallucination and improves decision quality. Engineers spend time on agent strategy, not data translation.
Historical context windows enable agents to detect market regime changes
Agents track sentiment evolution, not single snapshots. By providing recent orderbook history through context windows, agents detect 'probability accelerating downward' vs 'stable with noise'. This pattern recognition prevents agents from triggering alerts on momentary volatility. Agents can implement sophisticated logic: 'Trigger alert only if spread widens for 3 consecutive snapshots AND depth concentration increases', reasoning about sustained shifts rather than isolated data points.
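The 'three consecutive snapshots' rule above might look like the following sketch. It assumes `snapshots` is ordered oldest-to-newest, and `depth_concentration` is a hypothetical field name used for illustration.

```javascript
// Hedged sketch of the sustained-shift rule described above. `snapshots`
// is assumed ordered oldest-to-newest; `depth_concentration` is a
// hypothetical field name, not a documented one.
function sustainedShiftAlert(snapshots) {
  if (snapshots.length < 4) return false; // three widenings need four points
  const window = snapshots.slice(-4);
  const spreadWidening = window.every(
    (s, i) => i === 0 || s.spread > window[i - 1].spread
  );
  const depthRising =
    window[3].depth_concentration > window[0].depth_concentration;
  return spreadWidening && depthRising;
}
```

Requiring both conditions over a window is what filters out the momentary noise the paragraph warns about.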
Cross-category market data enables sophisticated correlation reasoning
When agents see BTC prediction probability and FOMC probability and energy price predictions in a unified interface, they discover relationships: 'Recession probability surged 10%, and simultaneously BTC contract shifted bearish while energy prices weakened.' These correlations emerge naturally from multi-category access. Agents reason about complex market dynamics (macro → crypto → commodities) without explicit correlation prompts. Your agents become market-aware without domain expertise hardcoding.
How AI Engineers use Up/Down Token Schema
Seven categories, hundreds of markets
Prediction markets across crypto, sports, economics, weather, and more — live and historical orderbook data, all queryable through one API.
Crypto
BTC, ETH, SOL, XRP — up/down markets every 5m to 1d.
Equities
S&P 500 (SPX) daily open — up or down predictions.
Social
Elon Musk tweet counts — weekly prediction ranges.
Sports
NBA, NFL, EPL — game outcomes and season predictions.
Economics
Fed decisions, jobs reports — FOMC meetings and macro data.
Weather
44 cities daily — temperature, hurricanes, Arctic ice.
Hyperliquid
BTC, ETH, SOL, XRP perp orderbooks — 1/sec sampling.
Tick-level orderbook snapshots
Every snapshot includes full bid/ask depth, mid prices, spreads, and crypto spot price.
| Side | Bid | Size | Ask | Size | Spread |
|---|---|---|---|---|---|
| UP | 0.5400 | 1,240 | 0.5500 | 1,100 | 1.00% |
| UP | 0.5300 | 980 | 0.5600 | 1,450 | 3.00% |
| UP | 0.5200 | 1,560 | 0.5700 | 890 | 5.00% |
| UP | 0.5100 | 2,100 | 0.5800 | 2,300 | 7.00% |
| UP | 0.5000 | 1,800 | 0.5900 | 1,700 | 9.00% |
| UP | 0.4900 | 3,200 | 0.6000 | 3,100 | 11.00% |
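As a sanity check on the schema, the top-of-book fields can be re-derived client-side from the depth arrays. The sketch below uses the first rows of the table above and assumes bids and asks are sorted best-first.

```javascript
// Re-derive the schema's top-of-book fields from the depth arrays.
// Assumes bids are sorted best (highest) first and asks best (lowest)
// first, matching the table above.
function topOfBook(bids, asks) {
  const best_bid = bids[0][0];
  const best_ask = asks[0][0];
  return {
    best_bid,
    best_ask,
    mid_price: (best_bid + best_ask) / 2,
    spread: best_ask - best_bid,
  };
}

// First row of the table: bid 0.5400 x 1,240 vs ask 0.5500 x 1,100.
const top = topOfBook([[0.54, 1240], [0.53, 980]], [[0.55, 1100], [0.56, 1450]]);
```

Here `mid_price` comes out at 0.545 and `spread` at 0.01, matching the snapshot columns.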
| Column | ClickHouse Type | Example |
|---|---|---|
| crypto | LowCardinality(String) | BTC |
| timeframe | LowCardinality(String) | 5m |
| token_side | Enum8('UP','DOWN') | UP |
| timestamp | DateTime64(3) | 2026-05-09 03:14:12.061 |
| crypto_price | Float64 | $80,471.01 |
| best_bid | Float64 | 0.5400 |
| best_ask | Float64 | 0.5500 |
| mid_price | Float64 | 0.5450 |
| spread | Float64 | 0.0100 |
| bids | Array(Tuple(Float64, Float64)) | [(0.54, 1240), ...] |
| asks | Array(Tuple(Float64, Float64)) | [(0.55, 1100), ...] |

Comprehensive market coverage
Prediction markets across multiple categories, captured continuously with high-frequency precision.
Up/Down Token Schema ships with
What AI Engineers build with Up/Down Token Schema
Up and running in minutes
Three steps from signup to live Up/Down Token Schema in your application.
Get Your API Key
Generate a free API key instantly. No credit card. Just click and go.
Explore the API
Browse 11 endpoints with live examples. Test requests directly from the docs.
Start Building
Integrate live Up/Down Token Schema into your research pipeline, trading bot, or analytics platform.
```javascript
fetch('/v1/markets/live', { headers: { 'X-API-Key': key } })
```

```shell
npm install -g resolved-markets-mcp
```

Wiring Up/Down Token Schema into your workflow
AI engineers integrate Up/Down Token Schema primarily through the MCP server, with WebSocket streaming for high-throughput live updates and REST for ad-hoc queries. All three return the same continuous Polymarket capture.
- Native ClickHouse JDBC/ODBC connector
- Snowflake Snowpipe ingest for streaming Up/Down Token Schema
- AWS Glue catalog integration for Up/Down Token Schema Parquet files
Why AI Engineers pick Up/Down Token Schema
- MCP integration provides agents with zero-boilerplate access to 100+ Polymarket orderbooks across all categories
- Structured JSON responses with semantic market context (sentiment, imbalance, volatility) enable LLMs to reason without prompt engineering
- Historical snapshot context windows enable agents to detect market regime changes and momentum shifts
- Unified access to crypto, sports, economics, and weather predictions enables agents to discover cross-category correlations autonomously
Why Up/Down Token Schema matters
Up/Down Token Schema matters for AI engineering because LLM agents need grounded data. Millisecond DateTime64(3) timestamps paired with full depth fields (best_bid, best_ask, mid_price, spread, bids[], asks[]) give autonomous systems a reliable, real-time view of prediction markets.
Up/Down Token Schema in context
AI engineering on prediction markets is converging on MCP. Up/Down Token Schema fits cleanly into that pattern: structured function calls expose live and historical orderbook state to any agent that speaks the protocol.
Frequently asked: Up/Down Token Schema for AI Engineers
- How does MCP integration work with our existing agent framework?
Resolved Markets' MCP tools integrate seamlessly into any agent system supporting MCP (Claude, Anthropic SDK, compatible frameworks). Add our tool definitions to your agent's tool manifest, and agents can call functions like 'get_orderbook_snapshot(market_id)', 'stream_updates(category)', 'query_history(market_id, time_range)'. Return values are JSON-structured for direct LLM processing. No custom wrapper code needed—agents access Resolved Markets like any other tool.
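The answer above names tool functions like 'get_orderbook_snapshot(market_id)'. Under the MCP tool-listing shape (name, description, and a JSON Schema inputSchema), such a definition might look like the following sketch; the exact descriptor Resolved Markets publishes may differ.

```json
{
  "name": "get_orderbook_snapshot",
  "description": "Latest orderbook snapshot for a Polymarket market",
  "inputSchema": {
    "type": "object",
    "properties": {
      "market_id": { "type": "string" }
    },
    "required": ["market_id"]
  }
}
```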
- What structured data does MCP return for market sentiment analysis?
Each orderbook snapshot includes full depth arrays, bid/ask spread, timestamp, and derived metrics: bid_side_quantity_sum, ask_side_quantity_sum (for imbalance), depth_concentration_percentile (conviction level), spread_basis_points, and volume_weighted_midpoint. We also return sentiment_direction (bullish/bearish/neutral) by comparing current spread to historical 1-hour average. These fields enable agents to reason about market conditions without needing raw orderbook processing logic.
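A client-side sketch of the imbalance metrics named in the answer above, computed from the raw depth arrays. The field names follow the answer, but the exact formulas here are assumptions, not the documented server-side definitions.

```javascript
// Client-side sketch of the imbalance metrics named in the answer above.
// Depth arrays are [price, size] pairs per the schema; the formulas here
// are assumptions, not the documented server-side definitions.
function depthMetrics(bids, asks) {
  const bidQty = bids.reduce((sum, [, size]) => sum + size, 0);
  const askQty = asks.reduce((sum, [, size]) => sum + size, 0);
  return {
    bid_side_quantity_sum: bidQty,
    ask_side_quantity_sum: askQty,
    bid_side_share: bidQty / (bidQty + askQty), // 0..1; > 0.5 means buy-side concentration
  };
}
```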
- Can agents use Resolved Markets data to trigger automated trading workflows?
Yes, agents can stream orderbook updates via WebSocket and implement decision logic: 'If BTC probability exceeds 80% AND spread narrows below 10 basis points AND buy_side_concentration > 70%, execute hedge position.' Resolved Markets' real-time delivery (20Hz for crypto) enables sub-second decision latencies. Agents can integrate trading APIs (dYdX, Uniswap, prediction market APIs) and autonomously execute based on market conditions discovered through Resolved Markets insights.
- How do we provide agents with context about multiple prediction markets simultaneously?
Use our 'batch_query' endpoint to retrieve snapshots from multiple markets in a single request, or subscribe to WebSocket streams for multiple markets. Resolved Markets returns structured arrays of orderbooks with metadata (market_name, category, outcome_yes_probability, outcome_no_probability). Agents receive rich context: 'Here are the top 5 crypto price prediction markets, top 5 economics outcomes, top 5 weather events with orderbook snapshots.' This multi-market context enables sophisticated reasoning.
- What are the latency characteristics for agent decision-making?
WebSocket streams deliver orderbook updates with millisecond timestamps (20Hz capture for crypto, variable rates for other categories). End-to-end latency from Polymarket to agent is typically 200-500ms. For decision workflows, agents can subscribe to streaming updates and process them in near real time. Historical queries return results in <100ms. For time-sensitive strategies, agents can maintain local snapshots updated via WebSocket and make decisions against cached data, eliminating query latency.
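The caching pattern in the last sentence can be as small as a map keyed by market, written by the WebSocket handler and read by decision logic. This is a minimal sketch, not an SDK API.

```javascript
// Minimal local snapshot cache, per the pattern in the answer above:
// a WebSocket message handler writes the latest snapshot per market,
// and decision logic reads from memory with no query latency.
class SnapshotCache {
  constructor() {
    this.latest = new Map();
  }
  // Call from the WebSocket 'message' handler.
  onUpdate(marketId, snapshot) {
    this.latest.set(marketId, snapshot);
  }
  // Decision logic reads cached state; undefined if never seen.
  get(marketId) {
    return this.latest.get(marketId);
  }
}
```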
- What AI applications use Up/Down Token Schema?
Autonomous trading systems, sentiment-analysis agents, portfolio-optimization systems, and multi-agent simulations built on best_bid, best_ask, mid_price, spread, bids[], and asks[], plus any LLM workflow that needs grounded prediction-market state.
- How does Up/Down Token Schema ground LLM responses?
Instead of hallucinating market state, agents call the MCP server and receive the latest snapshot from Up/Down Token Schema — with bid/ask depth and millisecond timestamps included.
- Can Up/Down Token Schema be used in RAG pipelines?
Yes. Historical snapshots from Up/Down Token Schema can be indexed into vector stores or queried directly via MCP. Agents pull live and historical data through the same interface.
- Can I use Up/Down Token Schema with dbt?
Yes. Most teams build dbt models that consume Up/Down Token Schema via the ClickHouse connector and derive downstream features (spread, depth imbalance, mid-price velocity).
- Is Up/Down Token Schema compatible with Apache Iceberg or Delta Lake?
Yes. Bulk Parquet exports of Up/Down Token Schema drop directly into Iceberg or Delta tables for time-travel queries and ACID semantics.