AI & decision-support disclaimer

Last updated 2026-05-12

FaceGate is a decision-support tool for consent-aware photo screening. It helps an authorised operator identify whether a photo may contain a person who has been marked as excluded from promotional, publicity, or external publication material. It is not a decision-maker. The final decision to publish, withhold, review, edit, or obscure any photo rests with the operator and their organisation.

What FaceGate does

FaceGate detects faces in photos, creates mathematical face templates, and compares those templates against person profiles configured inside the app. Each person profile can be marked as included or excluded. When an operator later scans photos, FaceGate uses those profiles to classify each photo as Safe, Unsafe, or Review Required.

In this context, Safe means FaceGate has not identified an excluded person above the relevant matching threshold. Unsafe means FaceGate has identified a likely match to an excluded person. Review Required means the result is uncertain and needs human judgment. None of these labels is, by itself, an approval to publish.

Why classifications are probabilistic

Face-recognition systems are statistical. They produce similarity scores, not certainty. A person may be missed, incorrectly matched, or placed into the review queue because of image quality, lighting, angle, occlusion, ageing, similar-looking people, threshold settings, enrolment-photo quality, or other factors. FaceGate is designed to reduce the burden and risk of manual screening, not to remove the need for review.
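To illustrate why a similarity score is evidence rather than certainty, here is a minimal sketch of the kind of comparison a face matcher performs. The vectors, dimensionality, and metric below are purely illustrative assumptions, not FaceGate's actual templates or model:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two face templates (embedding vectors).

    Returns a score in [-1.0, 1.0]; higher means more alike. The score
    is statistical evidence only: two different people can still score
    highly, and the same person can score low under poor conditions.
    """
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy 4-dimensional templates (real templates have hundreds of dimensions).
enrolled = [0.1, 0.8, 0.3, 0.5]
candidate = [0.2, 0.7, 0.4, 0.5]
score = cosine_similarity(enrolled, candidate)
```

A score like this only becomes a match or no-match decision once it is compared against configured thresholds, which is where threshold settings and enrolment-photo quality feed into the outcome.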

The production version of FaceGate uses a commercially available face-recognition model selected for on-device matching. Even with a strong model and conservative thresholds, no face-recognition system can guarantee perfect classification for every individual photo.

How FaceGate manages classification risk

The reliability mechanism is deliberately not “trust the model”. It is:

  • A dual-threshold design. The app distinguishes confident matches, uncertain matches, and no-match outcomes. Uncertain results are routed to review rather than being treated as final publication decisions.
  • Human-in-the-loop review. The operator remains responsible for checking results, especially photos marked Review Required, photos involving children, and photos intended for public or promotional use.
  • Manual override. The operator can correct a detected face, mark a face as unknown, change a classification, or apply an obscuring overlay before saving or publishing a photo.
  • Audit logging. The app records significant actions such as enrolment, classification, review feedback, overrides, settings changes, and biometric deletion so the organisation can understand how a decision was reached.
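The dual-threshold routing described above can be sketched as follows. The threshold values and function are hypothetical, for illustration only, and are not FaceGate's actual configuration:

```python
# Hypothetical thresholds for illustration; FaceGate's real values are
# configuration-dependent and not published in this disclaimer.
UNSAFE_THRESHOLD = 0.80   # at or above: confident match to an excluded person
REVIEW_THRESHOLD = 0.60   # between the two thresholds: uncertain

def classify(best_excluded_score: float) -> str:
    """Map the best similarity score against excluded profiles to a label.

    Anything between the two thresholds is deliberately routed to
    'Review Required' for a human decision, rather than being treated
    as a final publication outcome.
    """
    if best_excluded_score >= UNSAFE_THRESHOLD:
        return "Unsafe"
    if best_excluded_score >= REVIEW_THRESHOLD:
        return "Review Required"
    return "Safe"
```

The design choice here is that the uncertain band is never collapsed into Safe or Unsafe automatically; widening the band routes more photos to the operator, narrowing it routes fewer.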

Operator responsibility

FaceGate applies the configuration entered by the operating organisation. The organisation remains responsible for deciding who may be enrolled, whether it has the required consent or other lawful authority to store images and biometric templates on its own secure device, and whether each person should be marked included or excluded for the relevant publication purpose.

This is particularly important for schools and other organisations that manage photos of children or vulnerable people. FaceGate can help enforce an organisation’s internal publication rules, but it cannot decide whether the organisation is legally permitted to collect, retain, use, disclose, or publish a person’s image.

What FaceGate is not

  • It is not a substitute for consent records or organisational publication policies.
  • It is not a substitute for human review before publication.
  • It is not a substitute for legal, privacy, child-safety, or compliance advice.
  • It is not intended for covert surveillance, public-space monitoring, or law enforcement.
  • It does not guarantee that every excluded person will be detected, or that every detected person has been identified correctly.

Privacy and trust information

For more information about the on-device architecture, local storage model, audit trail, and review workflow, see the Privacy & Trust page and the Privacy Policy.

Product questions

For product or technical questions about FaceGate, email matthew.haskins.mh@gmail.com.