IaC misconfigurations are consistently among the leading causes of cloud security incidents. Open security groups, public S3 buckets, containers running as root, secrets baked into manifests — these aren't exotic attack techniques. They're mundane configuration errors that slip through code review every day.
The tools to find these issues exist. Checkov, tfsec, Trivy — they're all good. The problem is that raw lists of findings are hard to act on at scale. Is your codebase getting better or worse over time? How do you compare across teams? How do you set a quality gate that doesn't just block every merge?
Today we're launching Misconfig Index to answer those questions.
What it does
Misconfig Index scans Terraform, Kubernetes, CloudFormation, and Dockerfile IaC, applies rule packs, and converts findings into a single weighted Misconfig Score (0–100, graded A–F). Scores are stored over time so you can see your security posture improving — or catch regressions before they reach production.
In one command:
$ pip install misconfig-index
$ misconfig scan --path ./infra
────────────────────────────────────────
Misconfig Score: 76/100 (Grade B)
────────────────────────────────────────
Category breakdown:
networking ████████░░ 80/100
identity ███████░░░ 70/100
storage █████████░ 90/100
workload ███████░░░ 72/100
────────────────────────────────────────
The scoring model
Findings are weighted by severity (critical: 10, high: 5, medium: 2, low: 1) and normalised by the number of files scanned, so a small repo with one finding doesn't automatically score worse than a large one with fifty.
score = clamp(100 − (Σ weights / files_scanned) × 10, 0, 100)
This gives you a number that's stable enough to track week-over-week without panicking when a new rule is added, but sensitive enough to catch real regressions.
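In Python, the formula above amounts to the following sketch — a minimal reimplementation for illustration, not the shipped scanner code:

```python
# Severity weights as published in the scoring model.
SEVERITY_WEIGHTS = {"critical": 10, "high": 5, "medium": 2, "low": 1}

def misconfig_score(findings, files_scanned):
    """Compute a 0-100 score from a list of finding severities.

    findings: iterable of severity strings ("critical", "high", ...)
    files_scanned: number of IaC files the scanner examined
    """
    total_weight = sum(SEVERITY_WEIGHTS[s] for s in findings)
    # Normalise by repo size, then clamp into the 0-100 range.
    raw = 100 - (total_weight / files_scanned) * 10
    return max(0.0, min(100.0, raw))

# One high and one medium finding across 10 files:
# penalty = (5 + 2) / 10 * 10 = 7, so the score is 93.0
print(misconfig_score(["high", "medium"], 10))  # 93.0
```

Because the penalty is divided by file count before scaling, adding more clean files to a repo raises the score, while each new finding lowers it in proportion to its severity.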
CI integration in three steps
The main use case is gating pull requests. Add your API key as a repository secret, drop in a workflow file, and every IaC change gets scored automatically:
- name: Scan IaC
  env:
    MISCONFIG_API_KEY: ${{ secrets.MISCONFIG_API_KEY }}
  run: |
    misconfig ingest \
      --path . \
      --repo "${{ github.repository }}" \
      --branch "${{ github.ref_name }}" \
      --commit "${{ github.sha }}" \
      --min-score 60
The scanner exits 0 on success, 1 if the score drops below your threshold, and 2 on error. Because GitHub Actions fails a step on any non-zero exit code, the check fails automatically.
Live badges
Add a live score badge to your README that updates on every push:

Badges are grade-coloured (green for A/B, yellow for C, red for D/F) and cached for five minutes.
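As a sketch, embedding the badge in a README might look like the snippet below. The URL path shown here is a hypothetical illustration, not the documented endpoint — copy the real embed snippet from your repo's dashboard:

```markdown
[![Misconfig Score](https://misconfig.dev/badge/your-org/your-repo.svg)](https://misconfig.dev/your-org/your-repo)
```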
The benchmark
The dashboard includes an industry benchmark — grade distribution, category averages, and the most common misconfigurations across all repos that have been scanned. As more teams connect, this becomes a real signal: you can see how your posture compares to the field and prioritise accordingly.
Self-hostable
If you can't send IaC data to a third-party service (and many of you can't), the full stack — FastAPI backend, PostgreSQL, nginx — is one docker compose up away. The same scanner, the same API, running in your own infrastructure.
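For a sense of the moving parts, a self-hosted deployment might be sketched like this. The service layout mirrors the stack named above, but the image names and environment variables here are assumptions for illustration — the repository ships its own docker-compose.yml, which is what you should actually use:

```yaml
# Hypothetical sketch of the self-hosted stack; see the repo's real compose file.
services:
  api:
    image: misconfig/api:latest          # hypothetical image name
    environment:
      DATABASE_URL: postgresql://misconfig:misconfig@db:5432/misconfig
    depends_on: [db]
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: misconfig
      POSTGRES_PASSWORD: misconfig
      POSTGRES_DB: misconfig
  nginx:
    image: nginx:alpine
    ports: ["80:80"]                     # reverse-proxies to the api service
    depends_on: [api]
```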
What's next
- More rules — Pulumi, Ansible, Bicep, ARM templates
- Slack / PagerDuty alerts — notify on score drops
- Team features — multi-org, SSO, shared dashboards
- Rule customisation — suppress false positives, write custom rules
- PR comments — post the score directly on the pull request
Try it now
Misconfig Index is free and open source under the MIT license. Self-host it or use the hosted version at misconfig.dev.
pip install misconfig-index
misconfig scan --path ./infra
If you find bugs, have feedback, or want to contribute a rule — open an issue on GitHub. We read everything.