Business
Tested
- Avg AIMAC Debt: 6.63
- Avg Cost: $0.28
- Models Tested: 42
- Zero-Debt Models: 6
- Avg Violations: 11.5
Insights for Business
- Common Issues: elements must meet minimum color contrast ratio thresholds; links must have discernible text; form elements must have labels
- Category Difficulty: Rank #17 of 28 categories (based on average model performance)
- Real-World Comparison: The WebAIM Million audits accessibility across 1 million real websites annually. Although the tooling differs (axe-core here vs WAVE for WebAIM), AI-generated pages showed 78.0% fewer detected issues than real Business websites (11.5 axe-core violations vs 52.6 WAVE errors).
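Two of the common issues above, links without discernible text and form inputs without labels, can be flagged with simple static checks. Below is a minimal stdlib sketch, not axe-core (color contrast is omitted since it requires computed styles); the `A11yChecker` class and its issue strings are illustrative inventions, not part of any benchmark tooling.

```python
# Illustrative sketch: flag two common accessibility issues in an HTML string.
# Not axe-core; a real audit also needs DOM/CSS resolution (e.g. aria-labelledby,
# wrapping <label> elements, computed contrast ratios).
from html.parser import HTMLParser

class A11yChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self._link_issues = []
        self._in_a = False
        self._a_text = ""
        self._labeled_ids = set()          # ids referenced by <label for="...">
        self._inputs = []                  # (id, has_aria_label) per visible input

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "a":
            self._in_a, self._a_text = True, ""
        elif tag == "label" and "for" in a:
            self._labeled_ids.add(a["for"])
        elif tag == "input" and a.get("type") != "hidden":
            self._inputs.append((a.get("id"), "aria-label" in a))

    def handle_data(self, data):
        if self._in_a:
            self._a_text += data

    def handle_endtag(self, tag):
        if tag == "a":
            if not self._a_text.strip():
                self._link_issues.append("link without discernible text")
            self._in_a = False

    def report(self):
        # Labels may appear after their inputs, so resolve them at the end.
        issues = list(self._link_issues)
        for input_id, has_aria in self._inputs:
            if not has_aria and input_id not in self._labeled_ids:
                issues.append("input without label")
        return issues

checker = A11yChecker()
checker.feed('<a href="/pricing"></a>'
             '<label for="email">Email</label><input id="email" type="text">'
             '<input type="text">')
issues = checker.report()
print(issues)  # ['link without discernible text', 'input without label']
```

The labeled input passes while the empty link and the bare `<input>` are flagged, mirroring the "discernible text" and "labels" rules listed above.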
Model Results
AIMAC Debt: the model's accessibility debt for this category (lower = better).
| Model | AIMAC Debt | Cost (USD) | Critical | Serious |
|---|---|---|---|---|
| Claude Haiku 4.5 | 0.00 | $0.07 | 0 | 0 |
| o4 Mini | 0.00 | $0.025 | 0 | 0 |
| Qwen3 Coder Next | 0.00 | $0.016 | 0 | 0 |
| GPT 5.3 Codex | 0.00 | $0.12 | 0 | 0 |
| GPT 5.4 | 0.00 | $0.16 | 0 | 0 |
| GPT 5.4 Mini | 0.00 | $0.03 | 0 | 0 |
| GPT 5.4 Pro | 2.00 | $9.38 | 0 | 1 |
| Olmo 3.1 32B Instruct | 3.05 | $0.004 | 0 | 3 |
| Qwen3 Max | 3.53 | $0.036 | 0 | 5 |
| Trinity Large Preview (free) | 3.53 | $0.00 | 0 | 5 |
| Qwen3 Coder Plus | 3.71 | $0.03 | 0 | 6 |
| gpt oss 120b | 3.85 | $0.004 | 0 | 7 |
| Grok 4.1 Fast | 3.85 | $0.002 | 0 | 7 |
| GPT 5.1 Codex Mini | 3.85 | $0.016 | 0 | 7 |
| Trinity Large Thinking | 4.00 | $0.01 | 0 | 4 |
| Nova 2 Lite | 4.19 | $0.015 | 0 | 10 |
| Gemini 3.1 Pro Preview | 4.19 | $0.13 | 0 | 10 |
| Mistral Medium 3.1 | 4.28 | $0.016 | 0 | 11 |
| Qwen3 Max Thinking | 4.28 | $0.03 | 0 | 11 |
| Qwen3.6 Plus Preview (free) | 4.58 | $0.00 | 0 | 15 |
| Claude Opus 4.6 | 4.70 | $0.83 | 0 | 17 |
| GLM 5 | 4.90 | $0.06 | 0 | 21 |
| Nemotron 3 Super (free) | 4.94 | $0.00 | 0 | 22 |
| MiniMax M2.7 | 5.24 | $0.03 | 0 | 30 |
| Devstral 2 2512 | 6.00 | $0.012 | 0 | 5 |
| GLM 4.7 Flash | 6.05 | $0.004 | 0 | 5 |
| KAT Coder Pro V2 | 6.71 | $0.012 | 0 | 8 |
| R1 0528 | 7.32 | $0.035 | 0 | 8 |
| Mistral Large 3 2512 | 7.87 | $0.017 | 0 | 14 |
| DeepSeek V3.2 | 8.19 | $0.003 | 0 | 14 |
| Qwen3 Coder Flash | 8.44 | $0.011 | 0 | 17 |
| Qwen3.5 Flash | 8.750 | $0.003 | 2 | 0 |
| DeepSeek V3.2 Speciale | 8.752 | $0.006 | 0 | 22 |
| Kimi K2 Thinking | 9.10 | $0.025 | 0 | 30 |
| Codestral 2508 | 10.28 | $0.004 | 0 | 16 |
| o3 | 10.94 | $0.04 | 3 | 0 |
| Kimi K2.5 | 12.37 | $0.04 | 1 | 14 |
| Gemini 3 Flash Preview | 13.03 | $0.02 | 2 | 11 |
| Grok 4.20 | 13.26 | $0.07 | 2 | 14 |
| Qwen3.5 397B A17B | 19.38 | $0.033 | 3 | 23 |
| Claude Sonnet 4.6 | 23.19 | $0.45 | 4 | 48 |
| Mistral Small 4 | 26.25 | $0.01 | 5 | 22 |
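As a sanity check, the summary stats in the Tested block can be recomputed from the table above. A quick sketch (debt and cost values transcribed from the rows in order):

```python
# Recompute the "Tested" summary stats from the 42 Model Results rows.
debts = [0.00] * 6 + [  # six zero-debt models lead the table
    2.00, 3.05, 3.53, 3.53, 3.71, 3.85, 3.85, 3.85, 4.00, 4.19, 4.19, 4.28,
    4.28, 4.58, 4.70, 4.90, 4.94, 5.24, 6.00, 6.05, 6.71, 7.32, 7.87, 8.19,
    8.44, 8.750, 8.752, 9.10, 10.28, 10.94, 12.37, 13.03, 13.26, 19.38,
    23.19, 26.25,
]
costs = [
    0.07, 0.025, 0.016, 0.12, 0.16, 0.03, 9.38, 0.004, 0.036, 0.00, 0.03,
    0.004, 0.002, 0.016, 0.01, 0.015, 0.13, 0.016, 0.03, 0.00, 0.83, 0.06,
    0.00, 0.03, 0.012, 0.004, 0.012, 0.035, 0.017, 0.003, 0.011, 0.003,
    0.006, 0.025, 0.004, 0.04, 0.04, 0.02, 0.07, 0.033, 0.45, 0.01,
]

print(len(debts))                               # 42   (Models Tested)
print(round(sum(debts) / len(debts), 2))        # 6.63 (Avg AIMAC Debt)
print(round(sum(costs) / len(costs), 2))        # 0.28 (Avg Cost, USD)
print(debts.count(0.0))                         # 6    (Zero-Debt Models)
```

Note the single $9.38 outlier (GPT 5.4 Pro) dominates the cost average; the median cost is far below $0.28.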