Cumulative Depth Snapshots for Academic Researchers
Resolved Markets provides academic researchers with unprecedented access to prediction market microstructure data at scale: 11.4M+ orderbook snapshots spanning 100+ markets across multiple asset categories. The orderbook-level dataset enables rigorous empirical studies of price discovery mechanisms, liquidity provision dynamics, and information aggregation efficiency in prediction markets—areas where public datasets remain severely limited. Researchers can query millisecond-timestamped snapshots via REST API, reconstruct bid/ask spread evolution, analyze order flow patterns during information events (FOMC announcements, sports outcomes), and test market efficiency hypotheses against authentic Polymarket data. ClickHouse-backed infrastructure ensures reproducibility and scalable analysis of 11.4M+ records, supporting statistical inference impossible on smaller datasets while maintaining researcher-friendly API access.
Academic researchers use Cumulative Depth Snapshots as a publication-ready dataset on prediction markets. Resolved Markets captures orderbooks in a documented 14-column ClickHouse schema with DateTime64(3) timestamps and full bid/ask arrays, plus timestamp checksums, enabling reproducible studies of best_bid, best_ask, mid_price, spread, bids[], and asks[].
Data challenges Academic Researchers run into
Cumulative Depth Snapshots from Resolved Markets is built around the data gaps Academic Researchers hit when they try to work with raw Polymarket feeds.
Orderbook depth unavailable in prior academic datasets on Polymarket
Academic prediction market research has historically relied on transaction-level data or aggregated mid-prices, missing orderbook microstructure entirely. Market depth, bid/ask spread dynamics, and liquidity clustering patterns—core mechanisms in financial economics—remain unexplored for Polymarket. Researchers studying information aggregation or price discovery must reconstruct behavior from indirect signals rather than observing actual order placement, limiting causal inference rigor and forcing reliance on theoretical models instead of empirical validation.
Latency between events and observable market reactions difficult to quantify
Event-study methodology requires precise timestamps linking observable news events to market reactions. Without millisecond-granularity orderbook data, researchers cannot distinguish between immediate price impact (first 100ms after news release) and slower-moving liquidity adjustments. Resolved Markets' millisecond timestamps enable researchers to measure exact response lags, quantify price discovery speed vs. other markets, and test whether Polymarket predictions incorporate information faster than traditional betting markets.
Limited historical depth prevents robust statistical power for causal inference
Current Polymarket research relies on coarse daily data or sparse samples, limiting sample size for rigorous statistical testing. With only hundreds or thousands of observations, detecting subtle effects requires implausibly strong signals. Resolved Markets' 11.4M+ snapshots provide statistical power to test hypotheses about spread stationarity, order flow informativeness, and microstructure patterns with tight confidence intervals, enabling publication-grade analysis impossible on sparse datasets.
Fragmented prediction market data limits market microstructure research scope
Reproducibility requires researchers to use identical data and methodologies. Proprietary or limited-access datasets prevent replication studies. Resolved Markets' free tier and documented API enable academic researchers to publish findings with full transparency—peers can independently verify results using the same orderbook snapshots, advancing scientific integrity in prediction market research.
Built for quantitative work on Cumulative Depth Snapshots
Orderbook-level prediction-market data that doesn't exist anywhere else.
Access orderbook-level microstructure data impossible to obtain elsewhere
Orderbook depth arrays reveal how different market participants behave—how much liquidity exists at each price level, how quickly spreads narrow after information shocks, and whether order placement follows predictable patterns. This microstructure data enables original research on market efficiency, participant behavior, and information incorporation in prediction markets. Test whether large BTC price predictions attract deeper liquidity, whether sports outcomes trigger faster revaluation than economics events, and whether weather predictions show herding behavior.
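The liquidity-at-each-level idea above is what a cumulative depth curve captures. A minimal sketch, assuming bids arrive as (price, size) tuples sorted best-first as in the documented bids[] array; the sample values mirror the depth table shown on this page:

```python
def cumulative_depth(levels):
    """Return (price, cumulative_size) pairs, accumulating size level by level."""
    total = 0.0
    curve = []
    for price, size in levels:
        total += size
        curve.append((price, total))
    return curve

bids = [(0.54, 1240), (0.53, 980), (0.52, 1560)]  # best bid first
print(cumulative_depth(bids))
# [(0.54, 1240.0), (0.53, 2220.0), (0.52, 3780.0)]
```

Comparing these curves across markets is one way to test whether, say, large BTC markets attract deeper liquidity than weather markets.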
11.4M+ snapshots provide statistical power for rigorous hypothesis testing
With 11.4M snapshots across 100+ markets, test hypotheses with statistical rigor matching traditional finance research. Conduct cross-category studies—compare price discovery speed in crypto vs. sports vs. economics markets, measure whether information aggregates differently across categories, and identify market-specific microstructure features. Large sample sizes yield tight confidence intervals, publishable effect sizes, and robust conclusions rather than speculative findings from thin datasets.
Millisecond precision enables causal event-study methodology
Millisecond timestamps transform event-study research from coarse daily analysis to precise intra-minute causality estimation. Observe orderbook state at the exact moment FOMC announcements release or sports games conclude. Measure whether Polymarket reacts faster than traditional prediction markets, quantify price impact in first 100ms vs. first 1000ms, and test information efficiency at sub-second granularity previously impossible to study.
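The 100ms-vs-1000ms comparison above amounts to bucketing snapshots by their offset from an event timestamp. A minimal sketch with made-up timestamps; DateTime64(3) values in the `YYYY-MM-DD HH:MM:SS.mmm` format parse directly with `datetime.fromisoformat`:

```python
from datetime import datetime

event = datetime.fromisoformat("2026-05-09 03:14:12.000")  # hypothetical event time

def window_label(ts, event):
    """Classify a snapshot relative to the event: pre-event, 0-100ms, 100-1000ms, later."""
    delta_ms = (ts - event).total_seconds() * 1000.0
    if delta_ms < 0:
        return "pre-event"
    if delta_ms <= 100:
        return "0-100ms"
    if delta_ms <= 1000:
        return "100-1000ms"
    return "post-1s"

snap = datetime.fromisoformat("2026-05-09 03:14:12.061")
print(window_label(snap, event))  # 0-100ms
```

Averaging spread or mid-price changes within each bucket then gives the price-impact profile at sub-second resolution.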
Free tier with API access ensures research reproducibility and transparency
Free tier access eliminates funding barriers for graduate researchers and unfunded scholars. Publish results using publicly documented API, ensuring peers can reproduce findings. Build cumulative knowledge base as researchers contribute complementary studies. Transparent data access increases citation counts and research impact, advancing the field of prediction market microstructure economics faster than proprietary data would permit.
How Academic Researchers use Cumulative Depth Snapshots
Seven categories, hundreds of markets
Prediction markets across crypto, sports, economics, weather, and more — live and historical orderbook data, all queryable through one API.
Crypto
BTC, ETH, SOL, XRP — up/down markets every 5m to 1d.
Equities
S&P 500 (SPX) daily open — up or down predictions.
Social
Elon Musk tweet counts — weekly prediction ranges.
Sports
NBA, NFL, EPL — game outcomes and season predictions.
Economics
Fed decisions, jobs reports — FOMC meetings and macro data.
Weather
44 cities daily — temperature, hurricanes, Arctic ice.
Hyperliquid
BTC, ETH, SOL, XRP perp orderbooks — 1/sec sampling.
Tick-level orderbook snapshots
Every snapshot includes full bid/ask depth, mid prices, spreads, and crypto spot price.
| Side | Bid | Bid Size | Ask | Ask Size | Spread |
|---|---|---|---|---|---|
| UP | 0.5400 | 1,240 | 0.5500 | 1,100 | 1.00% |
| UP | 0.5300 | 980 | 0.5600 | 1,450 | 3.00% |
| UP | 0.5200 | 1,560 | 0.5700 | 890 | 5.00% |
| UP | 0.5100 | 2,100 | 0.5800 | 2,300 | 7.00% |
| UP | 0.5000 | 1,800 | 0.5900 | 1,700 | 9.00% |
| UP | 0.4900 | 3,200 | 0.6000 | 3,100 | 11.00% |
| Column | Type | Example |
|---|---|---|
| crypto | LowCardinality(String) | BTC |
| timeframe | LowCardinality(String) | 5m |
| token_side | Enum8('UP','DOWN') | UP |
| timestamp | DateTime64(3) | 2026-05-09 03:14:12.061 |
| crypto_price | Float64 | $80,471.01 |
| best_bid | Float64 | 0.5400 |
| best_ask | Float64 | 0.5500 |
| mid_price | Float64 | 0.5450 |
| spread | Float64 | 0.0100 |
| bids | Array(Tuple(F64,F64)) | [(0.54,1240),...] |
| asks | Array(Tuple(F64,F64)) | [(0.55,1100),...] |

Comprehensive market coverage
Prediction markets across multiple categories, captured continuously with high-frequency precision.
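The derived columns in the schema (mid_price, spread) are recomputable from the raw ones, which makes a cheap sanity check when loading exports. A minimal sketch using the example row's values from the schema table:

```python
# Example snapshot row; values match the schema example above
row = {
    "best_bid": 0.5400,
    "best_ask": 0.5500,
    "mid_price": 0.5450,
    "spread": 0.0100,
    "bids": [(0.54, 1240)],
    "asks": [(0.55, 1100)],
}

# Recompute derived fields from the raw bid/ask and compare with tolerance
mid = (row["best_bid"] + row["best_ask"]) / 2
spread = row["best_ask"] - row["best_bid"]
assert abs(mid - row["mid_price"]) < 1e-9
assert abs(spread - row["spread"]) < 1e-9
assert row["bids"][0][0] == row["best_bid"]  # top of book matches the depth array
print(mid)
```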
Cumulative Depth Snapshots ships with
What Academic Researchers build with Cumulative Depth Snapshots
Up and running in minutes
Three steps from signup to live Cumulative Depth Snapshots in your application.
Get Your API Key
Generate a free API key instantly. No credit card. Just click and go.
Sign Up Free
Explore the API
Browse 11 endpoints with live examples. Test requests directly from the docs.
API Reference
Start Building
Integrate live Cumulative Depth Snapshots into your research pipeline, trading bot, or analytics platform.
fetch('/v1/markets/live', { headers: { 'X-API-Key': key } })
rm-api download --category crypto --days 180
Wiring Cumulative Depth Snapshots into your workflow
Academic researchers typically bulk-export Cumulative Depth Snapshots via the CLI, load into R or Python, and run analyses against the documented 14-column schema.
- Native ClickHouse JDBC/ODBC connector
- Snowflake Snowpipe ingest for streaming Cumulative Depth Snapshots
- AWS Glue catalog integration for Cumulative Depth Snapshots Parquet files
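The bulk-export-then-analyze loop needs nothing beyond the standard library for a first pass. A hedged sketch: the CSV layout here is assumed for illustration (a real export follows the documented 14-column schema), and the file is simulated in memory:

```python
import csv
import io
import statistics

# Stand-in for a CLI export file; column names follow the documented schema
export = io.StringIO(
    "timestamp,best_bid,best_ask\n"
    "2026-05-09 03:14:12.061,0.54,0.55\n"
    "2026-05-09 03:14:13.061,0.53,0.56\n"
)

# Compute the spread per snapshot and its mean across the export
spreads = [float(r["best_ask"]) - float(r["best_bid"]) for r in csv.DictReader(export)]
print(statistics.mean(spreads))
```

The same loop scales to pandas or R data frames for real exports; the point is that the schema is flat enough to load anywhere.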
Why Academic Researchers pick Cumulative Depth Snapshots
- 11.4M orderbook snapshots enable statistically rigorous market microstructure research at scale
- Millisecond-precision timestamps facilitate causal event-study methodology unavailable elsewhere
- Free tier access ensures reproducible, transparent academic research with citation impact
- Cross-category data (crypto, sports, economics, weather) enables comparative prediction market analysis
Why Cumulative Depth Snapshots matters
Cumulative Depth Snapshots matters for academic research because it's reproducible. Schema, methodology, and timestamps are published, so studies built on Cumulative Depth Snapshots can be replicated by other researchers.
Cumulative Depth Snapshots in context
Empirical research on prediction markets has been gated by data access. Cumulative Depth Snapshots removes that gate: documented, free for academic use, and deep enough for rigorous studies on best_bid, best_ask, mid_price, spread, bids[], asks[].
Frequently asked: Cumulative Depth Snapshots for Academic Researchers
What are the precise fields included in each orderbook snapshot?
Each snapshot includes bid prices with volumes (full depth array), ask prices with volumes (full depth array), market identifier, capture timestamp in milliseconds, and metadata about market state. This enables reconstruction of exact historical bid/ask spreads, spread stationarity tests, and order flow analysis. Query via REST API with flexible filtering by market, date range, or timestamp interval.
How can I use this data for event-study analysis of FOMC announcements?
Query snapshots from 30 minutes before and 2 hours after each FOMC announcement using millisecond timestamps. Measure bid/ask spread evolution, observe liquidity clustering around key price levels, and track orderbook depth changes. Compare BTC/ETH price prediction orderbooks to identify which contracts reacted fastest, quantify price discovery lags, and test information efficiency hypotheses against authentic Polymarket microstructure.
Is the dataset sufficient for publishing peer-reviewed research?
Yes. With 11.4M snapshots and documented methodology, researchers can publish event-study papers, microstructure analysis, and cross-category comparative studies. The large sample size enables tight statistical inference and reproducible findings. Access via documented API ensures peers can independently verify results, meeting open-science standards and maximizing research impact and citations.
Can I perform SQL-style queries on the historical snapshots?
Resolved Markets' ClickHouse backend enables analytical queries. Filter snapshots by market category (crypto, sports, economics, weather), date ranges, or specific contract identifiers. Aggregate orderbook metrics across time windows, compute spread statistics, and extract order flow patterns. REST API handles complex analytical requests, supporting research workflows from exploratory data analysis to formal hypothesis testing.
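The shape of those analytical queries can be illustrated with a toy in-memory SQLite table standing in for the ClickHouse backend; column names mirror the documented schema, and all values are invented:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE snapshots (crypto TEXT, timeframe TEXT, spread REAL)")
con.executemany(
    "INSERT INTO snapshots VALUES (?, ?, ?)",
    [("BTC", "5m", 0.01), ("BTC", "5m", 0.03), ("ETH", "1d", 0.02)],
)

# Aggregate spread within a category/timeframe filter, as in the workflows above
row = con.execute(
    "SELECT crypto, AVG(spread) FROM snapshots WHERE timeframe = '5m' GROUP BY crypto"
).fetchone()
print(row)
```

Against the real backend, the same filter-then-aggregate pattern runs over 11.4M rows instead of three.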
How do the Polymarket orderbooks compare across different market categories?
Compare microstructure metrics across crypto (BTC, ETH, SOL, XRP), sports (NBA, NFL, EPL), economics (FOMC, jobs), and weather markets. Analyze whether spread stationarity, order clustering patterns, and price discovery speed differ systematically. The cross-category dataset enables research on whether prediction market microstructure is category-universal or category-specific, advancing understanding of information aggregation across diverse outcome types.
Is Cumulative Depth Snapshots suitable for academic publication?
Yes. Schema, methodology, and timestamps are documented. ClickHouse exports allow other researchers to replicate studies on the same Cumulative Depth Snapshots dataset.
What research areas does Cumulative Depth Snapshots support?
Market microstructure, herding behavior, price discovery, behavioral finance, and information cascades — all on best_bid, best_ask, mid_price, spread, bids[], asks[].
How do academics access Cumulative Depth Snapshots?
Free API access plus extended history depth for academic researchers. CLI bulk export, REST queries, and ClickHouse all return the same Cumulative Depth Snapshots.
Can I use Cumulative Depth Snapshots with dbt?
Yes. Most teams build dbt models that consume Cumulative Depth Snapshots via the ClickHouse connector and derive downstream features (spread, depth imbalance, mid-price velocity).
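The derived features named in that answer have simple definitions. A sketch in plain Python, using common conventions assumed here rather than taken from any official dbt model: imbalance is (bid depth - ask depth) / total depth, and velocity is the change in mid_price per second:

```python
def depth_imbalance(bids, asks):
    """Signed imbalance in (-1, 1): positive means more resting bid liquidity."""
    bid_depth = sum(size for _, size in bids)
    ask_depth = sum(size for _, size in asks)
    return (bid_depth - ask_depth) / (bid_depth + ask_depth)

def mid_velocity(mid_prev, mid_now, dt_seconds):
    """Mid-price change per second between two snapshots."""
    return (mid_now - mid_prev) / dt_seconds

print(depth_imbalance([(0.54, 1240)], [(0.55, 1100)]))  # slight bid-side excess
print(mid_velocity(0.5450, 0.5475, 1.0))
```

In a dbt model these become window-function expressions over consecutive snapshots; the Python form is just the feature definition.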
Is Cumulative Depth Snapshots compatible with Apache Iceberg or Delta Lake?
Yes. Bulk Parquet exports of Cumulative Depth Snapshots drop directly into Iceberg or Delta tables for time-travel queries and ACID semantics.