Millions of people uploaded photos to the Ever app. Then the company used them to develop facial recognition tools.

May 11, 2019 in News by Slad


Source: www.nbcnews.com
By Olivia Solon and Cyrus Farivar

“The app developers were not clear about their intentions,” one Ever user said. “I believe it’s a huge invasion of privacy.”

SAN FRANCISCO — “Make memories”: That’s the slogan on the website for the photo storage app Ever, accompanied by a cursive logo and an example album titled “Weekend with Grandpa.”

Everything about Ever’s branding is warm and fuzzy: it’s about sharing your “best moments” while freeing up space on your phone.

What isn’t obvious on Ever’s website or app — except for a brief reference that was added to the privacy policy after NBC News reached out to the company in April — is that the photos people share are used to train the company’s facial recognition system, and that Ever then offers to sell that technology to private companies, law enforcement and the military.

In other words, what began in 2013 as another cloud storage app has pivoted toward a far more lucrative business known as Ever AI — without telling the app’s millions of users.

“This looks like an egregious violation of people’s privacy,” said Jacob Snow, a technology and civil liberties attorney at the American Civil Liberties Union of Northern California. “They are taking images of people’s families, photos from a private photo app, and using it to build surveillance technology. That’s hugely concerning.”

Doug Aley, Ever’s CEO, told NBC News that Ever AI does not share the photos or any identifying information about users with its facial recognition customers.

Rather, the billions of images are used to teach an algorithm to identify faces. Every time Ever users enable facial recognition on their photos to group together images of the same people, Ever’s facial recognition technology learns from the matches and trains itself. That knowledge, in turn, powers the company’s commercial facial recognition products.
