Computer vision for crops, pests, and livestock.
From plant-disease detection to fruit counting to livestock pose, the platform handles the imaging pipeline so you can stay focused on agronomy. Zero-shot detection for novel pests that show up mid-season — no waiting for labelled data, no re-training to ship a working pipeline.
- Pipelines: YOLO · Grounding DINO · SAM · CLIP · pose
- Inputs: Drone · tractor cam · greenhouse · mobile
- Outputs: detections · masks · counts · embeddings
- Round-trip: ~100-200 ms p50
- Free quota: 300 calls / month
Agritech CV is hard for three reasons that don't exist in most other verticals: outdoor lighting varies by hour and season, biological subjects change shape and colour weekly, and connectivity in the field is unreliable. mSightFlow doesn't magic these away — but it does ship the right tool for each: zero-shot detection for novel pests, Albumentations augmentation for lighting robustness, active learning for seasonal drift, and batch APIs for low-connectivity upload-when-you-can workflows.
For brand-new defect or pest classes, zero-shot + SAM gets you a working pipeline today; for production accuracy on stable classes, fine-tune on your farm-specific imagery via auto-labelling + active learning.
Six use cases
Plant disease + pest detection
Detect rust, blight, mildew, leaf-spot, pest infestations. Pre-trained detectors for common diseases, zero-shot (Grounding DINO) for novel pathogens.
Weed identification
Distinguish crop from weed for precision spraying. SAM-assisted segmentation gives per-leaf masks for targeted herbicide application.
Yield estimation + fruit counting
Count fruits per tree / per bunch, estimate ripeness from colour distribution, project per-hectare yield from drone-pass sampling.
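The drone-pass projection above is simple arithmetic once per-tree counts come back from detection: sample average × planting density ÷ fruit per kilo. A minimal sketch; every number here is a made-up illustration value, not real orchard data:

```python
# Project per-hectare yield from a drone-pass sample of trees.
# All values below are illustrative placeholders, not real orchard data.
per_tree_counts = [142, 131, 155, 128, 149]  # ripe-fruit counts from sampled trees
trees_per_hectare = 400                      # planting density (orchard-specific)
mean_fruit_per_kg = 5.5                      # varietal average

avg_count = sum(per_tree_counts) / len(per_tree_counts)
fruit_per_ha = avg_count * trees_per_hectare
yield_kg_per_ha = fruit_per_ha / mean_fruit_per_kg
print(f"~{avg_count:.0f} fruit/tree -> ~{yield_kg_per_ha:,.0f} kg/ha")
```

The sampling error, not the detector, usually dominates here, so sample trees across the whole block rather than one corner.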
Crop health monitoring
Texture-anomaly scoring (/v1/cv-tools/texture_anomaly) flags unusual leaf patterns. CLIP embeddings cluster a field into health zones.
Livestock monitoring
Animal detection + pose estimation (custom skeleton schemas) for lameness, body-condition scoring, fall detection, individual ID via CLIP embeddings.
Insect & pest trap analysis
Photo of a sticky trap → species ID + count. Auto-label drops in pre-filled detections; entomologist verifies. Saves hours of manual counting.
Features that matter for agritech
Zero-shot detection (Grounding DINO)
Detect 'aphids on leaves', 'leaf miner damage', 'rust spots' — without ever training on that class. Critical when a new pest shows up mid-season.
/v1/detect/zero-shot
Object detection (Ultralytics YOLO)
Pre-trained YOLO for common crops, livestock, pests. Fine-tune on your farm-specific imagery for production accuracy.
/v1/detect
SAM interactive segmentation
Per-leaf, per-fruit, per-animal masks. Faster than polygon-drawing by 10× for labelling small-class datasets.
/v1/segment/interactive
Texture anomaly scoring
Sliding-window local-entropy. Detects unusual leaf patches without training. Useful for early-stage disease screening on uniform-canopy crops.
/v1/cv-tools/texture_anomaly
Custom keypoints (livestock pose)
Define animal-specific skeleton schemas. Track posture, gait, fall events. Standard COCO-Person works for human-staffing monitoring.
/v1/pose + /v1/skeletons
CLIP image search
Cluster a drone pass into health zones, find visually similar fields across seasons, identify individual livestock via re-identification.
/v1/embed
Code — zero-shot, count, cluster, livestock
import os, requests
from pathlib import Path

# A new pest shows up mid-season — no labelled data. Zero-shot saves the day.
resp = requests.post(
    "https://api.msightflow.ai/v1/detect/zero-shot",
    headers={"Authorization": f"Bearer {os.environ['MSF_API_KEY']}"},
    files={"image": Path("drone_pass_0042.jpg").read_bytes()},
    data={
        "prompt": "leaf miner damage, aphid colony, powdery mildew, healthy leaf",
        "confidence_threshold": "0.25",
    },
).json()

# Per-detection: class label, bbox, confidence
healthy = [d for d in resp["detections"] if d["label"] == "healthy leaf"]
infected = [d for d in resp["detections"] if d["label"] != "healthy leaf"]
print(f"Healthy: {len(healthy)}  Infected: {len(infected)}")
# Fruit counting on a single tree image.
import requests

det = requests.post(
    "https://api.msightflow.ai/v1/detect",
    headers={"Authorization": f"Bearer {os.environ['MSF_API_KEY']}"},
    files={"image": open("orchard_tree.jpg", "rb")},
    data={"model": "yolo-orchard"},  # pre-trained orchard / fruit model
).json()

fruits_by_class = {}
for d in det["detections"]:
    fruits_by_class[d["label"]] = fruits_by_class.get(d["label"], 0) + 1
print(fruits_by_class)
# → {"apple_ripe": 142, "apple_unripe": 38, "apple_damaged": 7}
# Cluster a field into health zones using CLIP embeddings.
import requests, numpy as np
from pathlib import Path

def embed(p):
    r = requests.post(
        "https://api.msightflow.ai/v1/embed",
        headers={"Authorization": f"Bearer {os.environ['MSF_API_KEY']}"},
        files={"image": p.read_bytes()},
    ).json()
    return np.array(r["vector"], dtype=np.float32)

vectors = {p.stem: embed(p) for p in Path("drone_tiles/").glob("*.jpg")}
# Then run k-means / DBSCAN on the vectors to find visually similar zones.
# Each cluster is a candidate health zone for ground-truth follow-up.
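The clustering step mentioned in the comments can be sketched with a tiny k-means in plain numpy. Synthetic vectors stand in for the real CLIP embeddings; in practice scikit-learn's KMeans or DBSCAN is the usual choice:

```python
import numpy as np

# Synthetic embeddings: two well-separated groups stand in for real CLIP vectors.
rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 0.5, size=(20, 512)).astype(np.float32)
stressed = rng.normal(5.0, 0.5, size=(20, 512)).astype(np.float32)
X = np.vstack([healthy, stressed])

# Minimal k-means (k=2): assign each tile to its nearest centroid, recompute.
centroids = X[[0, -1]].copy()  # seed one centroid in each group for determinism
for _ in range(5):
    dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.stack([X[labels == k].mean(axis=0) for k in range(2)])

print("tiles per zone:", np.bincount(labels).tolist())
# → tiles per zone: [20, 20]
```

Each zone label then maps back to tile coordinates for ground-truth scouting.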
# Custom keypoint schema for cow lameness scoring.
# Define the schema once (/v1/skeletons), then use the pose endpoint with it.
import requests

resp = requests.post(
    "https://api.msightflow.ai/v1/pose",
    headers={"Authorization": f"Bearer {os.environ['MSF_API_KEY']}"},
    files={"image": open("cow_gait_001.jpg", "rb")},
    data={"skeleton_id": "cow-12-point"},  # your custom schema
).json()

for cow in resp["animals"]:
    print(f"bbox={cow['box']}, keypoints={cow['keypoints']}")
# Compute the head-tail angle from the keypoints, then combine across
# video frames for a gait + lameness score.
The three pressures that make agritech CV hard
| Pressure | What goes wrong | How mSightFlow helps |
|---|---|---|
| Outdoor lighting variation | Models trained at noon fail at dawn / dusk / overcast. | Albumentations augmentation pipeline + /v1/cv-tools/clahe_enhance preprocessing + per-source colour normalisation. |
| Seasonal drift | Last year's wheat looks different from this year's — variety, weather, treatment. | Active learning + auto-labelling close the retrain loop in days. Zero-shot detection ships pipelines BEFORE labels exist. |
| Low connectivity | Field cameras can't reliably stream to a cloud API. | Batch upload-when-you-can pattern. Process locally to JPG, upload at base / packing shed. Edge SDK on roadmap. |
What we don't do
One tile in your stack, not the whole stack. Saves a discovery call.
- Satellite / hyperspectral imagery (we handle RGB JPG/PNG only)
- Real-time tractor-mounted inference (cloud REST today; edge SDK on roadmap)
- Spray-control / autonomy loops (we don't actuate equipment)
- Soil chemistry, weather, water-quality (we're image-only)
- Farm-management ERP / record-keeping
Pricing — same as every other vertical
Features that pair with agritech
Zero-shot detection
New pest, no labels? Type the name, get bounding boxes. The fastest possible mid-season response.
Data augmentation
Albumentations server-side — RandomBrightnessContrast, HueSaturationValue, RandomShadow. Lighting robustness.
Active learning
Closed-loop retraining for seasonal drift. Label the 50 images that move accuracy most.
FAQ for agritech teams
We have limited connectivity in the field. How does this work?
mSightFlow is a cloud REST API, so each image needs an internet round-trip. For low-connectivity scenarios, the common pattern is: capture in the field, batch-upload from a base / vehicle / Wi-Fi point, and run inspection centrally overnight. The ~100-200 ms p50 round-trip means you can also run live inspection if you have decent connectivity at a fixed point (greenhouse, packing shed, milking parlour). On-device SDKs for tractors and edge gateways are on the Pro roadmap.
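The upload-when-you-can pattern is essentially a spool directory plus a flush loop. A minimal sketch, with the actual POST and the connectivity check left as stubs you would swap for real calls:

```python
import shutil
import tempfile
from pathlib import Path

def flush_spool(spool, uploaded, upload_fn, have_connectivity):
    """Upload queued field captures whenever a link is available.
    upload_fn(path) does the real POST (stubbed in the demo below);
    have_connectivity() is whatever link check fits your rig."""
    uploaded.mkdir(exist_ok=True)
    sent = 0
    for img in sorted(spool.glob("*.jpg")):
        if not have_connectivity():
            break  # stop; try again at the next base / Wi-Fi stop
        upload_fn(img)  # e.g. POST the image to the detection endpoint
        shutil.move(str(img), uploaded / img.name)  # dequeue only after success
        sent += 1
    return sent

# Demo with a temporary spool directory and a stubbed uploader.
base = Path(tempfile.mkdtemp())
spool = base / "spool"
spool.mkdir()
for i in range(3):
    (spool / f"capture_{i}.jpg").write_bytes(b"fake-jpeg")
sent = flush_spool(spool, base / "uploaded", lambda p: None, lambda: True)
print(f"uploaded {sent} queued captures")
```

Moving a file only after a successful upload makes the loop safe to interrupt mid-flush: anything still in the spool simply goes out next time.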
Crop varieties and pests are seasonal. Won't models drift?
Yes — that's the agritech-specific reality. The right approach is active learning + auto-labelling: re-score your model against this season's images, surface low-confidence cases, label them, retrain. mSightFlow's active-learning endpoint (/v1/label/score-batch) makes that loop fast. For brand-new pest or disease classes mid-season, use zero-shot detection (Grounding DINO) immediately — no waiting for training data.
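Once per-image confidence scores come back, the selection step is just a sort, lowest confidence first. The scores dict here is made-up; in practice you would build it from the /v1/label/score-batch response (its exact shape is an assumption to check against the API reference):

```python
# Pick the N images whose labels will move accuracy most: least-confident first.
# scores maps image name -> the model's max detection confidence (illustrative).
scores = {
    "field_a_001.jpg": 0.91,
    "field_a_002.jpg": 0.34,
    "field_b_101.jpg": 0.58,
    "field_b_102.jpg": 0.22,
    "field_c_201.jpg": 0.87,
}
N = 2
to_label = sorted(scores, key=scores.get)[:N]
print("send to labelling:", to_label)
# → send to labelling: ['field_b_102.jpg', 'field_a_002.jpg']
```

This is the "label the 50 images that move accuracy most" loop in miniature: score, sort, label the bottom of the list, retrain.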
Outdoor lighting kills my detection accuracy. What helps?
Three things, in priority order: (1) capture under consistent conditions when possible (golden hour, overcast, fixed greenhouse lighting) — a bigger accuracy lift than any algorithm. (2) Use Albumentations augmentation in training, particularly RandomBrightnessContrast, HueSaturationValue, RandomShadow. (3) For severe lighting variation, apply /v1/cv-tools/clahe_enhance as a preprocessing step. mSightFlow's augmentation pipeline (Pro tier) handles the full Albumentations library.
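For intuition, a brightness/contrast augmentation boils down to a random scale-and-shift on pixel values. A numpy sketch of the idea (in training you would use the Albumentations library itself, not this):

```python
import numpy as np

def jitter_brightness_contrast(img, rng, limit=0.2):
    """Numpy sketch of a RandomBrightnessContrast-style augmentation:
    scale pixel values (contrast), shift them (brightness), then clip."""
    alpha = 1.0 + rng.uniform(-limit, limit)   # contrast factor
    beta = rng.uniform(-limit, limit) * 255.0  # brightness shift
    out = img.astype(np.float32) * alpha + beta
    return np.clip(out, 0, 255).astype(np.uint8)

rng = np.random.default_rng(42)
leaf = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)  # stand-in image
augmented = jitter_brightness_contrast(leaf, rng)
print("mean before/after:", leaf.mean(), augmented.mean())
```

Applying this at train time with random parameters per image teaches the model that "darker and flatter" is still the same leaf, which is exactly the dawn/dusk/overcast problem.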
Can I count things — fruits, plants, livestock?
Yes. Detection returns bounding boxes; counting is just len(detections). For accuracy across overlapping objects, pair detection with SAM (each detection gets a pixel mask) and use the masks for counting/sizing. For livestock that moves between frames, use /v1/video/track_json (CSRT/KCF tracker) to avoid double-counting the same animal across video.
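To see why naive per-frame summing double-counts, here is a minimal IoU-based association in plain Python: a detection that overlaps a box from the previous frame is treated as the same animal. The tracker endpoint does a more robust version of this server-side; the bounding boxes below are made-up:

```python
# The same cow appears in many frames, so summing per-frame counts overcounts.
# A simple IoU check against the previous frame only counts genuinely new boxes.
def iou(a, b):
    ax1, ay1, ax2, ay2 = a
    bx1, by1, bx2, by2 = b
    ix = max(0, min(ax2, bx2) - max(ax1, bx1))
    iy = max(0, min(ay2, by2) - max(ay1, by1))
    inter = ix * iy
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union else 0.0

frames = [  # illustrative bboxes: two cows drifting slightly between frames
    [(10, 10, 50, 50), (100, 10, 140, 50)],
    [(12, 11, 52, 51), (101, 12, 141, 52)],
    [(14, 12, 54, 52), (102, 13, 142, 53), (200, 10, 240, 50)],  # third cow enters
]
count, prev = 0, []
for boxes in frames:
    for box in boxes:
        if not any(iou(box, p) > 0.5 for p in prev):
            count += 1  # no overlap with the previous frame, so a new animal
    prev = boxes
print("unique animals:", count)
# → unique animals: 3
```

This one-frame-lookback sketch recounts an animal that vanishes for a frame; real trackers (CSRT/KCF) keep per-track state to handle occlusion.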
Do you handle satellite or hyperspectral imagery?
Not directly. mSightFlow handles RGB JPG/PNG. For satellite work, the most common pattern is to render RGB / false-colour tiles from the source data with rasterio or QGIS, then run detection / segmentation on the tiles. Hyperspectral (>3 bands) is out of scope — that's a different problem class and there are specialised tools (HSI-specific frameworks) that fit better.
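The tile-rendering step can be sketched in plain numpy: pick three bands, stretch each to 8-bit, and stack them as RGB. A synthetic band stack stands in for a rasterio read here, and band order is sensor-specific:

```python
import numpy as np

def to_rgb8(bands, idx=(3, 2, 1)):
    """Render an 8-bit false-colour tile from a (bands, H, W) stack.
    idx picks the bands mapped to R, G, B; e.g. NIR/red/green gives a
    vegetation false-colour composite (band order is sensor-specific)."""
    sel = bands[list(idx)].astype(np.float32)
    lo = sel.min(axis=(1, 2), keepdims=True)
    hi = sel.max(axis=(1, 2), keepdims=True)
    scaled = (sel - lo) / np.maximum(hi - lo, 1e-6) * 255.0  # per-band stretch
    return scaled.transpose(1, 2, 0).astype(np.uint8)        # -> (H, W, 3)

# Synthetic 6-band stack stands in for reading real satellite bands.
stack = np.random.default_rng(1).uniform(0, 10_000, size=(6, 32, 32))
tile = to_rgb8(stack)
print(tile.shape, tile.dtype)
```

The resulting uint8 tile is a normal RGB image, so it goes straight into the detection / segmentation endpoints as JPG or PNG.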
Can you train on my private dataset?
Yes on the Pro tier. Upload your labelled dataset (using SAM + auto-label to build it fast), train a custom YOLO / classifier, and mSightFlow hosts the model alongside the built-in ones with the same REST shape. We never train on customer data without an explicit project-level training request.
From drone pass to per-plant verdict, today.
300 free API calls / month. Zero-shot detection for new pests, SAM for leaf masks, CLIP for health-zone clustering — all in one API.