Software engineer Vishnu Mohandas decided he would quit Google in more ways than one when he learned that the tech giant had briefly helped the US military develop AI to study drone footage. In 2020 he left his job working on Google Assistant and also stopped backing up all of his images to Google Photos. He feared that his content could be used to train AI systems, even if they weren't specifically tied to the Pentagon project. "I don't control any of the future outcomes that this will enable," Mohandas thought. "So now, shouldn't I be more responsible?"
Mohandas, who taught himself programming and is based in Bengaluru, India, decided he wanted to develop an alternative service for storing and sharing photos that is open source and end-to-end encrypted. Something "more private, wholesome, and trustworthy," he says. The paid service he designed, Ente, is profitable and says it has more than 100,000 users, many of whom are already part of the privacy-obsessed crowd. But Mohandas struggled to articulate to wider audiences why they should reconsider relying on Google Photos, despite all the conveniences it offers.
Then one weekend in May, an intern at Ente came up with an idea: Give people a sense of what some of Google's AI models can learn from studying images. Last month, Ente launched https://Theyseeyourphotos.com, a website and marketing stunt designed to turn Google's technology against itself. People can upload any photo to the website, which is then sent to a Google Cloud computer vision program that writes a startlingly thorough three-paragraph description of it. (Ente prompts the AI model to document small details in the uploaded images.)
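The flow is simple to picture in code. Below is a minimal sketch, assuming Python and Google's google-generativeai client with a Gemini vision model; Ente has not published its exact prompt, model choice, or pipeline, so the model name and prompt text here are illustrative only.

```python
# Illustrative sketch of a Theyseeyourphotos-style flow (not Ente's actual code):
# send an uploaded photo to a Google multimodal model and ask for a detailed,
# three-paragraph description.
import google.generativeai as genai
from PIL import Image

genai.configure(api_key="YOUR_GOOGLE_API_KEY")          # placeholder key
model = genai.GenerativeModel("gemini-1.5-flash")        # assumed model choice

PROMPT = (
    "Describe this photo in three short paragraphs. "
    "Document small details: objects, clothing, visible text, brands, "
    "time of day, and the apparent setting. Keep the tone objective."
)

def describe_photo(path: str) -> str:
    """Return the model's written description of the image at `path`."""
    image = Image.open(path)
    response = model.generate_content([PROMPT, image])
    return response.text

if __name__ == "__main__":
    print(describe_photo("family_selfie.jpg"))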
One of the first photos Mohandas tried uploading was a selfie with his wife and daughter in front of a temple in Indonesia. Google's analysis was exhaustive, even documenting the specific watch model that his wife was wearing, a Casio F-91W. But then, Mohandas says, the AI did something strange: It noted that Casio F-91W watches are commonly associated with Islamic extremists. "We had to tweak the prompts to make it slightly more wholesome but still spooky," Mohandas says. Ente started asking the model to produce short, objective outputs, nothing dark.
The same family photo uploaded to Theyseeyourphotos now returns a more generic result that includes the name of the temple and the "partly cloudy sky and lush greenery" surrounding it. But the AI still makes a number of assumptions about Mohandas and his family, like that their faces are expressing "joint contentment" and the "parents are likely of South Asian descent, middle class." It judges their clothing ("appropriate for sightseeing") and notes that "the woman's watch displays a time of roughly 2 pm, which corroborates with the image metadata."
Google spokesperson Colin Smith declined to comment directly on Ente's project. He directed WIRED to support pages that state uploads to Google Photos are only used to train generative AI models that help people manage their image libraries, like those that analyze the age and location of photo subjects. The company says it doesn't sell the content stored in Google Photos to third parties or use it for advertising purposes. Users can turn off some of the analysis features in Photos, but they can't prevent Google from accessing their images entirely, because the data is not end-to-end encrypted.