a-eye

Scan. Detect. Protect.

The content moderation API that catches what others miss - CSAM, NSFW, gore, and more.

[Interactive demo: the a-eye Content Scanner classifies sample images, videos, and GIFs live — e.g. SAFE, 94.2% confidence, nsfw_classifier_v3, 12ms — and accepts your own uploads.]

3B+ Images scanned
<500ms Avg latency
99.7% Detection accuracy
5 Classification categories

Features

Everything you need to moderate content at scale, from a single endpoint.

Content Moderation

Classify images, videos, and GIFs as CSAM, porn, gore, bestiality, safe, and more - with confidence scores for each category.

Multi-class classification · Confidence scores · CSAM flagging
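As a sketch of how per-category confidence scores might be consumed downstream — the response field names and thresholds here are illustrative assumptions, not the documented a-eye schema:

```python
# Assumed response shape: one confidence score per category, as the
# feature list implies. Field names and thresholds are placeholders.

def top_category(scores: dict) -> tuple:
    """Return the highest-confidence label and its score."""
    label = max(scores, key=scores.get)
    return label, scores[label]

def should_block(scores: dict, threshold: float = 0.8,
                 csam_threshold: float = 0.01) -> bool:
    """Block on any high-confidence unsafe category; CSAM at a far lower bar."""
    if scores.get("csam", 0.0) >= csam_threshold:
        return True
    return any(v >= threshold for k, v in scores.items() if k != "safe")

scores = {"safe": 0.94, "porn": 0.03, "gore": 0.02, "csam": 0.0}
print(top_category(scores))   # ('safe', 0.94)
print(should_block(scores))   # False
```

The separate, much stricter CSAM threshold reflects the page's point that CSAM is flagged distinctly rather than treated as just another unsafe class.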

Age Detection

Detect faces in images and estimate apparent age and gender - with attention signals to guide deeper analysis.

Face detection · Age estimation · Gender detection · Attention signals
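One way the age-estimation output could feed an attention signal — the per-face fields below are assumptions for illustration, not the real response schema:

```python
# Hypothetical face-endpoint output: a list of detected faces with
# estimated ages. Field names are placeholders, not the a-eye schema.

def needs_review(faces: list, min_adult_age: int = 18) -> bool:
    """Raise an attention signal if any face's estimated age is below the cutoff."""
    return any(f.get("age_estimate", min_adult_age) < min_adult_age
               for f in faces)

faces = [
    {"age_estimate": 34, "gender": "female", "confidence": 0.97},
    {"age_estimate": 15, "gender": "male", "confidence": 0.88},
]
print(needs_review(faces))  # True -- route this item to deeper analysis
```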
Integration

Three lines. That's it.

Integrate content moderation into your app with a single API call.

Supported formats: JPEG, PNG, GIF, WebP, AVIF, MP4, WebM, MOV

Examples in Python, cURL, and Node.js.
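A minimal Python sketch of the single-call integration described above. The endpoint URL, auth scheme, and response fields are placeholders — consult the a-eye API reference for the real ones:

```python
import json
import urllib.request

API_URL = "https://api.a-eye.example/v1/scan"  # placeholder endpoint

def scan(path: str, api_key: str) -> dict:
    """POST an image's raw bytes and return the JSON verdict."""
    with open(path, "rb") as f:
        body = f.read()
    req = urllib.request.Request(
        API_URL,
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",    # auth scheme assumed
            "Content-Type": "application/octet-stream",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# result = scan("beach_photo.jpg", "YOUR_API_KEY")
# A verdict might look like: {"label": "safe", "confidence": 0.942}
```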
Pricing

Simple, usage-based

Pay only for what you use.

$0.10 / 1,000 images
  • No minimum spend
  • Video & GIF support
  • Content moderation & age detection
  • 100,000 free scans every month
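The pricing above works out as follows, assuming the free tier is deducted before metering starts (the bullets imply this but don't state it):

```python
FREE_SCANS = 100_000   # free scans per month
PRICE_PER_1K = 0.10    # USD per 1,000 images beyond the free tier

def monthly_cost(scans: int) -> float:
    """Usage-based bill in USD: first 100k scans free, then $0.10 per 1,000."""
    billable = max(0, scans - FREE_SCANS)
    return round(billable / 1_000 * PRICE_PER_1K, 2)

print(monthly_cost(80_000))     # 0.0  -- entirely inside the free tier
print(monthly_cost(1_100_000))  # 100.0
```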
Get Started

CHAT SCANNING API

Advanced Behaviour Analysis

Every message is scanned in real time. Flag grooming behavior, predatory patterns, and Terms of Service violations - then take action automatically.
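The "take action automatically" step could be wired up along these lines — the flag names, severity field, and action mapping are all hypothetical, not the actual chat-scanning response:

```python
# Sketch of acting on chat-scan flags automatically. Flag types and the
# confidence field are assumptions; the real response may differ.

ACTIONS = {
    "grooming": "ban_and_report",
    "predatory": "ban_and_report",
    "tos_violation": "mute",
}

def dispatch(flags: list, threshold: float = 0.9) -> list:
    """Map high-confidence flags on a message to moderation actions."""
    return [ACTIONS[f["type"]]
            for f in flags
            if f["type"] in ACTIONS and f["confidence"] >= threshold]

flags = [
    {"type": "grooming", "confidence": 0.96},
    {"type": "tos_violation", "confidence": 0.42},  # below threshold: no action
]
print(dispatch(flags))  # ['ban_and_report']
```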

Need custom moderation for your platform? Let's talk.

Contact Us