Leading: ReadMe (53.3% of decisive cases)
| Metric | Value |
|---|---|
| Mintlify wins | 7 |
| ReadMe wins | 8 |
| Abstains (no tool) | 146 |
| Other tool chosen | 2084 |
| Decisive cases | 15 |
| Mintlify win rate (unweighted) | 46.7% |
| 95% CI | 24.8% - 69.9% |
| Mintlify win rate (weighted) | 46.7% |
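The reported confidence interval matches a Wilson score interval for 7 wins out of 15 decisive cases. A minimal sketch, assuming that is the method used:

```python
import math

def wilson_ci(wins: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion at ~95% confidence."""
    p = wins / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# 7 Mintlify wins out of 15 decisive cases, as in the table above
lo, hi = wilson_ci(7, 15)
print(f"{lo:.1%} - {hi:.1%}")  # → 24.8% - 69.9%
```

Note the interval is wide because only 15 of the 2,245 judgments were decisive between these two tools.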
Per-model breakdown:
| Model | Tier | Mintlify | ReadMe | None | Other | Mintlify win rate |
|---|---|---|---|---|---|---|
| MiMo V2 Pro | Frontier | 0 | 5 | 3 | 112 | 0% |
| GLM 5 Turbo | Frontier | 4 | 0 | 39 | 76 | 100% |
| Kimi K2.5 | Frontier | 1 | 1 | 9 | 108 | 50% |
| Claude Haiku 4.5 | Small | 1 | 0 | 0 | 119 | 100% |
| MiniMax M2.7 | Frontier | 1 | 0 | 5 | 112 | 100% |
| DeepSeek R1 0528 | Frontier | 0 | 1 | 9 | 109 | 0% |
| Mistral Small 4 | Mid | 0 | 1 | 0 | 113 | 0% |
| Claude Opus 4.6 | Frontier | 0 | 0 | 0 | 120 | n/a |
| Claude Sonnet 4.6 | Frontier | 0 | 0 | 0 | 120 | n/a |
| DeepSeek V3.2 | Mid | 0 | 0 | 6 | 114 | n/a |
| Devstral 2 2512 | Mid | 0 | 0 | 0 | 120 | n/a |
| Gemini 2.5 Flash | Small | 0 | 0 | 0 | 120 | n/a |
| Gemini 2.5 Pro | Frontier | 0 | 0 | 2 | 118 | n/a |
| GPT 5.3 Codex | Frontier | 0 | 0 | 4 | 113 | n/a |
| GPT 5.4 | Frontier | 0 | 0 | 0 | 120 | n/a |
| GPT 5.4 Mini | Mid | 0 | 0 | 4 | 111 | n/a |
| Llama 4 Maverick | Frontier | 0 | 0 | 19 | 89 | n/a |
| Llama 4 Scout | Small | 0 | 0 | 4 | 112 | n/a |
| Qwen3 Coder Next | Mid | 0 | 0 | 42 | 78 | n/a |
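The headline counts can be recovered by summing the decisive columns of the table above. A quick sanity check, with the per-model counts copied from the table (models with zero decisive picks omitted):

```python
# (model, mintlify_wins, readme_wins) decisive picks from the per-model table
rows = [
    ("MiMo V2 Pro", 0, 5), ("GLM 5 Turbo", 4, 0), ("Kimi K2.5", 1, 1),
    ("Claude Haiku 4.5", 1, 0), ("MiniMax M2.7", 1, 0),
    ("DeepSeek R1 0528", 0, 1), ("Mistral Small 4", 0, 1),
]  # all remaining models recorded no decisive picks

mintlify = sum(m for _, m, _ in rows)  # 7
readme = sum(r for _, _, r in rows)    # 8
decisive = mintlify + readme           # 15
print(f"Mintlify win rate: {mintlify / decisive:.1%}")  # → 46.7%
```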
Per-prompt breakdown:
| Prompt | Tier | Mintlify | ReadMe | None | Other | Mintlify win rate |
|---|---|---|---|---|---|---|
| documentation-site | Beginner | 6 | 1 | 86 | 281 | 86% |
| documentation-site | Advanced | 0 | 5 | 12 | 349 | 0% |
| documentation-site | Intermediate | 1 | 2 | 22 | 345 | 33% |
| blog-platform-cms | Advanced | 0 | 0 | 26 | 349 | n/a |
| blog-platform-cms | Beginner | 0 | 0 | 0 | 380 | n/a |
| blog-platform-cms | Intermediate | 0 | 0 | 0 | 380 | n/a |