
IoU calculator

Enter two bounding boxes and see the Intersection over Union instantly, with a live canvas visualisation. Useful for spot-checking annotations, evaluating models, and explaining IoU to teammates.

Example (Box A in indigo, Box B in violet, intersection in green):

IoU = 0.1601 · 16.0% — partial overlap

  • Area A: 30000
  • Area B: 35200
  • Intersection (∩): 9000
  • Union (∪): 56200

What is IoU?

Intersection over Union (also called the Jaccard index) is the standard way to measure how well two bounding boxes overlap. The formula:

IoU = area(A ∩ B) / area(A ∪ B)

Range: 0 (no overlap) to 1 (identical boxes).

Common thresholds:

  • IoU ≥ 0.5 — counts as a “true positive” in mAP@0.5 (the PASCAL VOC criterion; COCO’s AP50)
  • IoU ≥ 0.75 — strict match (COCO mAP@0.75)
  • IoU ≥ 0.7 — typical healthy inter-annotator agreement for detection labels
  • IoU ≥ 0.85 — excellent agreement; usually only achievable with strict label guidelines
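As a sketch, the tiers above could be encoded in a small helper for triaging scores (the tier labels are illustrative, not a standard):

```python
def iou_tier(iou: float) -> str:
    """Map an IoU score to an illustrative quality tier."""
    if iou >= 0.85:
        return "excellent agreement"
    if iou >= 0.75:
        return "strict match"
    if iou >= 0.7:
        return "healthy inter-annotator agreement"
    if iou >= 0.5:
        return "true positive at mAP@0.5"
    return "below common thresholds"

print(iou_tier(0.1601))  # → below common thresholds
```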

IoU in code

The same calculation in Python:

def iou(a, b):
    # boxes are (x, y, w, h)
    x1 = max(a[0], b[0])
    y1 = max(a[1], b[1])
    x2 = min(a[0] + a[2], b[0] + b[2])
    y2 = min(a[1] + a[3], b[1] + b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0

# Example
print(iou((60, 50, 200, 150), (160, 110, 220, 160)))  # → 0.1601...
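If your boxes are stored as corners (x1, y1, x2, y2) — the PASCAL VOC-style convention — the same logic applies with the width/height arithmetic swapped out. A sketch, using the same example boxes converted to corner form:

```python
def iou_xyxy(a, b):
    # boxes are (x1, y1, x2, y2) corner coordinates
    ix1 = max(a[0], b[0])
    iy1 = max(a[1], b[1])
    ix2 = min(a[2], b[2])
    iy2 = min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0

# Same boxes as above, expressed as corners
print(iou_xyxy((60, 50, 260, 200), (160, 110, 380, 270)))  # → 0.1601...
```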

For mAP-style evaluation across a whole dataset, the quality-and-IAA (inter-annotator agreement) endpoint computes per-image IoU across annotators and reports the distribution.
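At dataset scale you would typically vectorise the computation rather than loop over pairs. A minimal NumPy sketch of a pairwise IoU matrix for (x, y, w, h) boxes — an illustration under those assumptions, not the endpoint's actual implementation:

```python
import numpy as np

def iou_matrix(A, B):
    """Pairwise IoU between two sets of (x, y, w, h) boxes.

    A: (n, 4) array-like, B: (m, 4) array-like → (n, m) IoU matrix.
    """
    A, B = np.asarray(A, float), np.asarray(B, float)
    # broadcast every box in A against every box in B
    x1 = np.maximum(A[:, None, 0], B[None, :, 0])
    y1 = np.maximum(A[:, None, 1], B[None, :, 1])
    x2 = np.minimum(A[:, None, 0] + A[:, None, 2], B[None, :, 0] + B[None, :, 2])
    y2 = np.minimum(A[:, None, 1] + A[:, None, 3], B[None, :, 1] + B[None, :, 3])
    inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
    union = A[:, None, 2] * A[:, None, 3] + B[None, :, 2] * B[None, :, 3] - inter
    # guard against zero-area unions (degenerate boxes)
    return np.where(union > 0, inter / np.where(union > 0, union, 1), 0.0)

print(iou_matrix([(60, 50, 200, 150)], [(160, 110, 220, 160)]))  # 1×1 matrix, ≈ 0.1601
```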

Sibling tools

IoU at dataset scale — built in.

mSightFlow's quality dashboard computes inter-annotator IoU across your whole project, flags low-agreement images, and ships with class-balance alerts. Free in every tier.