Science
- Avg AIMAC Debt: 6.31
- Avg Cost: $0.14
- Models Tested: 37
- Zero-Debt Models: 4
- Avg Violations: 13.4
Insights for Science
- Common Issues: Elements must meet minimum color contrast ratio thresholds; Links must have discernible text; Select element must have an accessible name
- Category Difficulty: Rank #17 of 28 categories (based on average model performance)
- Real-World Comparison: The WebAIM Million audits the accessibility of 1 million real websites annually. Although the two measurements come from different automated tools, AI-generated pages averaged 69.7% fewer detected issues than real Science websites (13.4 axe-core violations vs. 44.0 WAVE errors); a sketch of such an axe-core scan follows this list.
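The common issues listed above correspond to standard axe-core rule IDs (color-contrast, link-name, select-name). Below is a minimal sketch of how a single generated page could be scanned for just those rules using Playwright and @axe-core/playwright; the file path and harness setup are illustrative assumptions, not the benchmark's published tooling.

```typescript
// check-page.ts - sketch: scan one generated HTML page with axe-core.
// Assumes Playwright and @axe-core/playwright are installed. The path
// "output/science/example.html" is a placeholder, not the benchmark's layout.
import { chromium } from 'playwright';
import { AxeBuilder } from '@axe-core/playwright';
import * as path from 'path';

async function main() {
  const browser = await chromium.launch();
  const page = await browser.newPage();

  // Load a locally generated page via the file:// scheme.
  const file = path.resolve('output/science/example.html');
  await page.goto(`file://${file}`);

  // Restrict the scan to the three rules named under "Common Issues".
  const results = await new AxeBuilder({ page })
    .withRules(['color-contrast', 'link-name', 'select-name'])
    .analyze();

  // Each violation reports the rule id, its impact, and the offending nodes.
  for (const violation of results.violations) {
    console.log(`${violation.id} (${violation.impact}): ${violation.nodes.length} node(s)`);
  }

  await browser.close();
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The rule IDs and result shape follow the standard axe-core API; the benchmark itself presumably runs the full default rule set rather than only these three.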
Model Results
AIMAC Debt: the model's accessibility debt for this category (lower = better).
| Model | AIMAC Debt (lower = better) | Cost (USD) | Critical Issues | Violations |
|---|---|---|---|---|
| GPT 5 Mini | 0.00 | $0.02 | 0 | 0 |
| KAT Coder Pro V1 | 0.00 | $0.008 | 0 | 0 |
| GPT 5.2 Codex | 0.00 | $0.06 | 0 | 0 |
| GLM 4.7 Flash | 0.00 | $0.0035 | 0 | 0 |
| Qwen3 Coder Plus | 2.66 | $0.03 | 0 | 2 |
| Gemini 3 Flash Preview | 2.66 | $0.02 | 0 | 2 |
| o3 | 2.66 | $0.035 | 0 | 2 |
| Qwen3 Coder 480B A35B | 3.05 | $0.011 | 0 | 3 |
| DeepSeek V3.2 Speciale | 3.32 | $0.006 | 0 | 4 |
| Kimi K2 Thinking | 3.85 | $0.03 | 0 | 7 |
| Grok 4.1 Fast | 3.85 | $0.0024 | 0 | 7 |
| GLM 4.7 | 4.28 | $0.014 | 0 | 11 |
| Kimi K2 0905 | 4.58 | $0.02 | 0 | 15 |
| Gemini 3 Pro Preview | 4.64 | $0.13 | 0 | 16 |
| MiniMax M2.1 | 4.94 | $0.008 | 0 | 22 |
| Mistral Medium 3.1 | 4.94 | $0.014 | 0 | 22 |
| GPT 5.2 Pro | 5.03 | $3.16 | 0 | 24 |
| MiniMax M1 | 5.03 | $0.03 | 0 | 24 |
| GPT 5.2 | 5.14 | $0.24 | 0 | 27 |
| Claude Haiku 4.5 | 5.54 | $0.08 | 0 | 41 |
| Claude Sonnet 4.5 | 5.58 | $0.24 | 0 | 43 |
| gpt oss 120b | 6.53 | $0.004 | 0 | 7 |
| Mistral Large 3 2512 | 7.53 | $0.013 | 0 | 9 |
| Qwen3 Max | 7.71 | $0.04 | 0 | 10 |
| GPT 5.1 Codex Mini | 8.05 | $0.014 | 1 | 3 |
| o4 Mini | 8.05 | $0.024 | 1 | 3 |
| Gemini 2.5 Flash Lite | 8.32 | $0.003 | 1 | 4 |
| DeepSeek V3.2 | 8.41 | $0.003 | 0 | 14 |
| Mistral Small 3.2 24B | 8.90 | $0.0007 | 0 | 20 |
| Codestral 2508 | 9.21 | $0.005 | 0 | 33 |
| Devstral 2 2512 | 9.32 | $0.0036 | 1 | 5 |
| Qwen3 Coder Flash | 9.35 | $0.012 | 0 | 29 |
| Claude Opus 4.5 | 9.75 | $0.77 | 0 | 26 |
| R1 | 12.98 | $0.03 | 1 | 12 |
| Qwen3 235B A22B Instruct 2507 | 13.05 | $0.01 | 2 | 3 |
| GLM 4.5 Air | 16.60 | $0.01 | 2 | 10 |
| Nova 2 Lite | 17.82 | $0.017 | 2 | 23 |
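The category summary at the top of the page can be reproduced as plain means over the 37 rows above. The snippet below is an illustrative sketch with only a few rows filled in; the field names (debt, cost) are placeholders rather than the benchmark's actual schema.

```typescript
// summary.ts - sketch: recompute the category summary from per-model rows.
// Only a few rows are shown; in practice all 37 entries would be included.
interface ModelRow {
  model: string;
  debt: number; // AIMAC Debt (lower = better)
  cost: number; // cost in USD
}

const rows: ModelRow[] = [
  { model: 'GPT 5 Mini', debt: 0.0, cost: 0.02 },
  { model: 'KAT Coder Pro V1', debt: 0.0, cost: 0.008 },
  { model: 'Qwen3 Coder Plus', debt: 2.66, cost: 0.03 },
  // ...remaining rows from the table above
];

const avg = (xs: number[]) => xs.reduce((a, b) => a + b, 0) / xs.length;

const avgDebt = avg(rows.map((r) => r.debt));             // ~6.31 over the full table
const avgCost = avg(rows.map((r) => r.cost));             // ~$0.14 over the full table
const zeroDebt = rows.filter((r) => r.debt === 0).length; // 4 over the full table

console.log({ avgDebt: avgDebt.toFixed(2), avgCost: avgCost.toFixed(2), zeroDebt });
```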