Family and Parenting
- Avg AIMAC Debt: 6.71
- Avg Cost: $0.17
- Models Tested: 42
- Zero-Debt Models: 3
- Avg Violations: 15.4
Insights for Family and Parenting
- Common Issues: elements must meet minimum color-contrast ratio thresholds; links must have discernible text; select elements must have an accessible name
- Category Difficulty: Rank #15 of 28 categories (based on average model performance)
- Real-World Comparison: The WebAIM Million audits accessibility across 1 million real websites annually. Although the two figures come from different automated tools, AI models showed 72.9% fewer detected issues than real Family and Parenting websites (15.4 axe-core violations vs. 56.7 WAVE errors).
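The first of the common issues above, insufficient color contrast, is the most mechanical one to detect. As an illustration, here is a minimal sketch of the WCAG 2.x contrast-ratio computation that automated checkers such as axe-core and WAVE apply (the function names are mine; the formula is the standard WCAG relative-luminance definition):

```python
# WCAG 2.x contrast-ratio check, as applied by automated accessibility tools.

def _channel(c8: int) -> float:
    """Linearize one sRGB channel (0-255) per the WCAG definition."""
    c = c8 / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(hex_color: str) -> float:
    """Relative luminance of a #rrggbb color."""
    h = hex_color.lstrip("#")
    r, g, b = (int(h[i:i + 2], 16) for i in (0, 2, 4))
    return 0.2126 * _channel(r) + 0.7152 * _channel(g) + 0.0722 * _channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """WCAG contrast ratio between two colors, from 1:1 up to 21:1."""
    l1, l2 = sorted((luminance(fg), luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

print(round(contrast_ratio("#000000", "#ffffff"), 1))  # 21.0 (the maximum)
print(contrast_ratio("#777777", "#ffffff") >= 4.5)     # False: fails AA
```

WCAG AA requires a ratio of at least 4.5:1 for normal text (3:1 for large text); #777777 on white falls just short at roughly 4.48:1, which is exactly the kind of near-miss these audits flag.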
Model Results
AIMAC Debt: the model's accessibility debt for this category (lower = better).

| Model | AIMAC Debt | Cost (USD) | Critical Violations | Other Violations |
|---|---|---|---|---|
| GPT 5.1 Codex Mini | 0.00 | $0.014 | 0 | 0 |
| GPT 5.3 Codex | 0.00 | $0.10 | 0 | 0 |
| GPT 5.4 Pro | 0.00 | $4.94 | 0 | 0 |
| GLM 4.7 Flash | 2.00 | $0.004 | 0 | 1 |
| Qwen3 Max Thinking | 2.00 | $0.04 | 0 | 1 |
| Gemini 3.1 Pro Preview | 2.00 | $0.13 | 0 | 1 |
| GPT 5.4 Mini | 3.05 | $0.03 | 0 | 3 |
| Qwen3.5 397B A17B | 3.32 | $0.023 | 0 | 4 |
| Olmo 3.1 32B Instruct | 3.32 | $0.004 | 0 | 4 |
| GLM 5 | 3.53 | $0.07 | 0 | 5 |
| o3 | 3.71 | $0.04 | 0 | 6 |
| GPT 5.4 | 3.71 | $0.16 | 0 | 6 |
| Kimi K2.5 | 4.19 | $0.04 | 0 | 10 |
| Qwen3.6 Plus Preview (free) | 4.37 | $0.00 | 0 | 12 |
| Trinity Large Preview (free) | 4.44 | $0.00 | 0 | 13 |
| Trinity Large Thinking | 4.64 | $0.01 | 0 | 16 |
| Kimi K2 Thinking | 4.70 | $0.024 | 0 | 17 |
| o4 Mini | 4.70 | $0.024 | 0 | 17 |
| MiniMax M2.7 | 5.06 | $0.04 | 0 | 25 |
| Claude Opus 4.6 | 5.58 | $0.58 | 0 | 43 |
| Qwen3 Coder Next | 5.58 | $0.015 | 0 | 43 |
| Nemotron 3 Super (free) | 5.66 | $0.00 | 0 | 4 |
| Claude Sonnet 4.6 | 6.41 | $0.51 | 0 | 103 |
| KAT Coder Pro V2 | 6.53 | $0.017 | 0 | 7 |
| Nova 2 Lite | 6.98 | $0.018 | 0 | 7 |
| Mistral Medium 3.1 | 7.37 | $0.017 | 0 | 8 |
| Mistral Large 3 2512 | 7.53 | $0.018 | 0 | 9 |
| DeepSeek V3.2 Speciale | 7.58 | $0.007 | 0 | 17 |
| gpt oss 120b | 8.05 | $0.003 | 1 | 3 |
| Grok 4.1 Fast | 8.58 | $0.003 | 0 | 19 |
| Codestral 2508 | 8.58 | $0.006 | 0 | 19 |
| Devstral 2 2512 | 8.71 | $0.005 | 0 | 14 |
| Qwen3.5 Flash | 8.75 | $0.0027 | 2 | 0 |
| Qwen3 Coder Flash | 9.21 | $0.01 | 0 | 33 |
| Grok 4.20 | 9.99 | $0.07 | 1 | 23 |
| Gemini 3 Flash Preview | 10.33 | $0.02 | 1 | 33 |
| Claude Haiku 4.5 | 10.71 | $0.08 | 1 | 49 |
| Qwen3 Max | 11.09 | $0.04 | 0 | 15 |
| R1 0528 | 12.75 | $0.023 | 2 | 4 |
| DeepSeek V3.2 | 16.28 | $0.003 | 2 | 9 |
| Mistral Small 4 | 17.07 | $0.006 | 3 | 4 |
| Qwen3 Coder Plus | 23.83 | $0.027 | 8 | 17 |