Consent is now a legal posture
Children's images are biometric data under modern privacy law. Publishing one photo without consent is no longer an oversight; it's a notifiable issue.
On-device photo consent screening
FaceGate screens your photos against an enrolled consent list — entirely on the device. No cloud. No third parties. No surprises.
Photo (source image) → Detect (find faces) → Embed (112 × 112 chip) → Match (cosine score) → Classify (verdict)
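The Embed → Match steps can be sketched in a few lines. This is an illustrative sketch, not the shipped implementation: `cosine_similarity` and `best_match` are hypothetical names, and it assumes the embeddings coming out of the model are already L2-normalised (so a plain dot product gives the cosine score).

```python
from typing import Dict, List, Tuple

def cosine_similarity(a: List[float], b: List[float]) -> float:
    """Dot product of two embeddings. Assuming both are L2-normalised,
    this equals the cosine score used at the Match step."""
    return sum(x * y for x, y in zip(a, b))

def best_match(embedding: List[float],
               enrolled: Dict[str, List[float]]) -> Tuple[str, float]:
    """Compare one face embedding against every enrolled person and
    return the closest name together with its cosine score."""
    name = max(enrolled, key=lambda n: cosine_similarity(embedding, enrolled[n]))
    return name, cosine_similarity(embedding, enrolled[name])
```

For example, `best_match([1.0, 0.0], {"Alex": [1.0, 0.0], "Blake": [0.0, 1.0]})` returns `("Alex", 1.0)`: the face is closest to Alex's enrolled embedding.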
Why this matters
Children's biometric and image data sits squarely inside modern privacy law. A single missed consent isn't an oversight any more — it's a notifiable issue.
A sports day or production can produce hundreds of photos in an afternoon. Eyeballing each one for non-consenting faces is slow, tiring, and error-prone.
Most off-the-shelf face recognition uploads photos to a vendor's servers. Using one means sending children's faces to a third party to solve a privacy problem.
The risk isn’t a bad photo getting through — it’s not knowing you let one through.
The solution
Faces never leave the device. No cloud. No third parties.
Tag people as Include or Exclude. The app handles the rest.
Scans dozens of photos and gives you a verdict per image.
Cover faces with an emoji, export, done. Full audit trail.
Built so you can replace manual review with a structured, auditable workflow.
How it works
Add the people whose consent you're tracking. A few photos each is enough.
Point the app at a folder or take a photo. It detects and recognises every face.
Each photo lands in one of three buckets: Safe, Unsafe, or Review.
Cover non-consenting faces, export, post.
Every step runs locally on the phone — no internet required.
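The three buckets follow from a simple priority rule over the per-face matches. A minimal sketch, assuming hypothetical names and an illustrative threshold (not the shipped default): one confidently matched Excluded face makes the photo Unsafe, any unknown or uncertain face sends it to Review, and a photo is Safe only when every face is a confident Include match.

```python
def photo_verdict(face_matches, include, exclude, t_match=0.60):
    """Bucket a photo from its per-face matches.

    face_matches: list of (best_name, cosine_score) pairs, one per face.
    include / exclude: sets of enrolled names.
    t_match: illustrative confidence threshold, not the tuned default.
    """
    verdict = "Safe"
    for name, score in face_matches:
        confident = score >= t_match
        if confident and name in exclude:
            return "Unsafe"          # one excluded face decides the photo
        if not confident or name not in include:
            verdict = "Review"       # uncertain or unknown: needs a human
    return verdict
```

Unsafe dominates Review, which dominates Safe, so a photo containing both an unknown face and an excluded face still comes back Unsafe.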
See the verdict
Three illustrated photos, three enrolled people. Toggle Include ↔ Exclude and see how the verdict shifts — the same logic the real app applies on-device.

1 face
Safe. Why? All faces are on the consent list as Include.

2 faces
Unsafe. Why? Blake is Excluded and confidently matched.

3 faces
Review. Why? An unknown face was detected; it's flagged for human review.
Consent list
Illustrative. The same logic runs on your device with no network calls.
Alex
Blake
Casey
Unknown
Confidence model
Scores that fall in the uncertain band are routed to human review rather than guessed at. The defaults below are tuned for AuraFace v1, the production model.
Cosine similarity ranges from 0 (different) to 1 (identical). The two handles split it into three bands: confident match, uncertain (human review), and no match.
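The two-handle split described above can be written as a tiny classifier. The handle positions here (0.40 / 0.60) are illustrative stand-ins, not the tuned AuraFace v1 defaults:

```python
def band(score: float, no_match: float = 0.40, confident: float = 0.60) -> str:
    """Map a cosine score in [0, 1] to one of the three bands set by
    the two handles. Handle values are illustrative, not the tuned defaults."""
    if score >= confident:
        return "confident match"
    if score >= no_match:
        return "uncertain (human review)"
    return "no match"
```

Moving either handle only shifts the band boundaries; the routing rule (uncertain goes to a human) stays the same.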
Where it fits
Compliance posture
FaceGate is built against the Australian Privacy Principles and the WA Privacy and Responsible Information Sharing Act 2024. All biometric data stays on-device. No stratified demographic cohort evaluation has been published yet — that’s a Q4 2026 deliverable, and we publish the first model card before any commercial sale.
1 FP / 1568 negatives
Internal 18-person ablation
0 cloud calls / scan
All inference on-device
0.920 F1 (AuraFace v1)
Apache 2.0 commercial model
≈ 5 ms model inference
Lightweight, on a phone
We're working with WA schools on early deployments. Tell us what you need.