Kenyan Workers Training Meta’s AI Glasses Report Viewing Users’ Intimate Moments

Hidden Human Review: Contractors in Kenya See Users’ Most Private Moments Captured by Meta’s Ray-Ban AI Glasses

Esther Speak - Senior Reporter at Villpress

Kenyan contract workers training Meta’s AI-powered Ray-Ban smart glasses have described reviewing highly sensitive and intimate user footage, including bathroom visits, nudity, sexual activity, and visible bank cards, often captured without the wearers’ full awareness. The revelations, detailed in a March 2026 investigation by Swedish newspapers Svenska Dagbladet and Göteborgs-Posten, highlight ongoing privacy concerns around wearable AI devices that blend always-on recording with human-in-the-loop data annotation.

The footage is routed to Nairobi-based contractors employed by Sama, a Meta subcontractor specializing in AI data labeling. Workers told the outlets they regularly encounter material far more personal than routine scenes: one described a clip where a man placed the glasses on a bedside table, then his partner entered and undressed; others reported sex acts filmed from the wearer’s perspective, pornography playback, or accidental exposures of financial details like debit cards during payments. “In some videos you can see someone going to the toilet, or getting undressed,” one annotator said. “I don’t think they know, because if they knew they wouldn’t be recording.”

Ray-Ban Meta smart glasses with built-in camera (Image: Forbes)

Meta’s Ray-Ban smart glasses (co-developed with EssilorLuxottica) include cameras, microphones, speakers, and Meta AI for voice queries, photo capture, live translation, and contextual assistance. While the company emphasizes on-device processing for many features and automatic face-blurring in shared content, sensitive clips are sent to human reviewers when needed to train or refine AI models, particularly for object recognition, scene understanding, and query accuracy. Meta’s terms of service permit this, including cross-border transfers to partners like Sama for “improving our services.”

The practice raises sharp questions about consent and transparency. Users may not realize that everyday wear (leaving the glasses on a nightstand, recording in private spaces, or capturing intimate moments) can result in footage being viewed by strangers thousands of kilometers away. Workers report discomfort and psychological strain from viewing such material, often in open offices, under strict NDAs, and for low wages (around $2 per hour in some cases). The investigation notes that automated blurring sometimes fails, leaving identifiable details intact.

Privacy experts and EU lawmakers have seized on the report. Under the GDPR, processing personal data (including intimate images) requires explicit safeguards for transfers outside the EU; because Kenya lacks an adequacy decision, standard contractual clauses and supplementary measures are required. Several MEPs have questioned the European Commission on enforcement, with calls for Meta to explain its transparency notices, consent flows, and data minimization. Critics argue the glasses normalize ambient surveillance, turning private lives into training fodder for AI with insufficient user controls.

Meta has not directly responded to the specific allegations as of March 4, 2026, but the company maintains that data annotation is standard for AI development, with strict policies to protect privacy. In past statements on similar moderation issues, Meta has emphasized limited access, data deletion after use, and efforts to minimize human review through on-device AI improvements.

For users in Lagos and across Africa, where Meta’s wearables are gaining traction via partnerships and e-commerce, the story underscores a global privacy tension: devices sold as convenient tools can expose intimate moments to offshore labor, often in low-income regions. As AI glasses push toward always-listening, always-seeing capabilities, the line between helpful assistance and invasive surveillance grows thinner, raising the question of whether users truly understand the trade-offs when they say “Hey Meta.”

Esther Speak is a senior reporter and newsroom strategist at Villpress, where she shapes Africa-focused business, technology, and policy coverage. She works at the intersection of journalism and editorial systems, producing clear, high-impact news that travels globally while staying rooted in African realities.
