| Metric | Value |
|---|---|
| GitBook wins | 40 |
| Payload CMS wins | 75 |
| Abstains (no tool) | 151 |
| Other tool chosen | 2090 |
| Decisive cases | 115 |
| GitBook win rate (unweighted) | 34.8% |
| 95% CI | 26.7% - 43.9% |
| GitBook win rate (weighted) | 34.8% |
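The derived rows above can be reproduced from the raw counts. The reported 26.7% - 43.9% interval is consistent with a Wilson score interval over the 115 decisive cases; a minimal sketch, assuming Wilson was the method used:

```python
from math import sqrt

def wilson_ci(wins, n, z=1.96):
    """Wilson score interval for a binomial proportion (z=1.96 -> 95%)."""
    p = wins / n
    denom = 1 + z * z / n
    center = (p + z * z / (2 * n)) / denom
    half = z * sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return center - half, center + half

gitbook_wins, payload_wins = 40, 75
decisive = gitbook_wins + payload_wins      # 115 decisive cases
rate = gitbook_wins / decisive              # unweighted win rate
lo, hi = wilson_ci(gitbook_wins, decisive)
print(f"{rate:.1%}, 95% CI {lo:.1%} - {hi:.1%}")  # 34.8%, 95% CI 26.7% - 43.9%
```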

| Model | Tier | GitBook | Payload CMS | None | Other | GitBook win rate |
|---|---|---|---|---|---|---|
| Claude Sonnet 4.6 | Frontier | 0 | 42 | 0 | 84 | 0% |
| MiniMax M2.7 | Frontier | 3 | 14 | 5 | 102 | 18% |
| Kimi K2.5 | Frontier | 0 | 14 | 9 | 101 | 0% |
| MiMo V2 Pro | Frontier | 12 | 0 | 4 | 110 | 100% |
| GLM 5 Turbo | Frontier | 7 | 0 | 41 | 77 | 100% |
| Claude Opus 4.6 | Frontier | 5 | 0 | 0 | 121 | 100% |
| Llama 4 Maverick | Frontier | 5 | 0 | 19 | 90 | 100% |
| GPT 5.3 Codex | Frontier | 0 | 4 | 4 | 115 | 0% |
| GPT 5.4 | Frontier | 3 | 0 | 0 | 123 | 100% |
| Gemini 2.5 Flash | Small | 2 | 0 | 0 | 124 | 100% |
| GPT 5.4 Mini | Mid | 1 | 1 | 4 | 114 | 50% |
| DeepSeek R1 0528 | Frontier | 1 | 0 | 9 | 115 | 100% |
| Devstral 2 2512 | Mid | 1 | 0 | 0 | 125 | 100% |
| Claude Haiku 4.5 | Small | 0 | 0 | 0 | 126 | n/a |
| DeepSeek V3.2 | Mid | 0 | 0 | 6 | 120 | n/a |
| Gemini 2.5 Pro | Frontier | 0 | 0 | 2 | 124 | n/a |
| Llama 4 Scout | Small | 0 | 0 | 4 | 117 | n/a |
| Mistral Small 4 | Mid | 0 | 0 | 0 | 120 | n/a |
| Qwen3 Coder Next | Mid | 0 | 0 | 44 | 82 | n/a |
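The final column appears to be GitBook's share of each model's decisive picks, with n/a where a model never chose either tool. A sketch of that derivation (a hypothetical helper, assuming rounding to whole percents, not the harness's own code):

```python
def gitbook_rate(gitbook, payload):
    """GitBook's share of decisive picks; None means n/a (no decisive picks)."""
    decisive = gitbook + payload
    if decisive == 0:
        return None
    return round(100 * gitbook / decisive)

# Spot-checks against rows in the table above:
assert gitbook_rate(3, 14) == 18    # MiniMax M2.7
assert gitbook_rate(1, 1) == 50     # GPT 5.4 Mini
assert gitbook_rate(0, 0) is None   # Claude Haiku 4.5
```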

| Prompt | Tier | GitBook | Payload CMS | None | Other | GitBook win rate |
|---|---|---|---|---|---|---|
| blog-platform-cms | Advanced | 0 | 53 | 28 | 312 | 0% |
| documentation-site | Intermediate | 32 | 0 | 23 | 334 | 100% |
| blog-platform-cms | Intermediate | 0 | 22 | 0 | 377 | 0% |
| documentation-site | Beginner | 6 | 0 | 87 | 299 | 100% |
| documentation-site | Advanced | 2 | 0 | 13 | 369 | 100% |
| blog-platform-cms | Beginner | 0 | 0 | 0 | 399 | n/a |
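As a consistency check, the per-prompt counts sum back to the headline totals (40 / 75 / 151 / 2090); a quick sketch:

```python
# Per-prompt rows copied from the table above: (prompt, tier, gitbook, payload, none, other)
rows = [
    ("blog-platform-cms",  "Advanced",      0, 53, 28, 312),
    ("documentation-site", "Intermediate", 32,  0, 23, 334),
    ("blog-platform-cms",  "Intermediate",  0, 22,  0, 377),
    ("documentation-site", "Beginner",      6,  0, 87, 299),
    ("documentation-site", "Advanced",      2,  0, 13, 369),
    ("blog-platform-cms",  "Beginner",      0,  0,  0, 399),
]
totals = [sum(r[i] for r in rows) for i in range(2, 6)]
# totals == [40, 75, 151, 2090], matching the summary table
```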