What is Privacy, Really?
Last week saw a significant uproar in international media over a popular mobile application called FaceApp. Critics are concerned that the app's Russian developer may at some point have given, or will give, the collected user data to Russia's government. In this briefing, we take a brief look at the claims and counter-claims before delving into the difference between upfront and hidden costs.
After its release in early 2017, FaceApp enjoyed moderate success. The app lets users apply AI filters to their photos, ranging from adding accessories to beautification to ethnicity changes. While it faced repeated criticism over individual filters being offensive, the overall perception remained positive. In July 2019, FaceApp updated and improved a filter that allows users to see aged versions of themselves. Since this taps directly into humanity's morbid fascination with mortality, the filter enjoyed great popularity, with millions of people posting processed pictures to social media.
Due to the buzz, several users and news sites noticed that the app's terms of service grant the developer virtually unlimited rights to its users' uploaded pictures. The terms are extensive but relatively common among startup apps; the goal is to reduce legal risk by covering as many scenarios as possible. However, this does not change the fact that, legally speaking, most photo apps popular right now could advertise with your private photos or even sell them to interested third parties.
That last part is what prompted US Senator Chuck Schumer to call for an FBI investigation into the app: for him, the developers' Russian origin raised concerns that the photos - and thus the biometric data of users - may be sold to or directly harvested by the Russian government.
The developer, in turn, told TechCrunch in an interview that no such sales had taken place or would take place, and that pictures are processed in the US and then deleted.
Part of a bigger problem
From an objective standpoint, it is hard to single out FaceApp among the thousands of similar apps with identical terms of service. It is merely the most popular one at the moment and thus the center of attention. While the terms of service used by online giants like Facebook or Google are slightly more precise and user-friendly, the overall lack of oversight of the actual processing of pictures means that any photo you upload or share can theoretically be abused to build databases of your biometric data. In fact, it often is. Facebook's "is this friend in your picture?" feature and Google Photos' automatic tagging of people work on exactly this basis. But naturally, such data can also be used for much more sinister purposes, such as surveillance or Deepfakes.
The price of data
Ultimately, this issue boils down to a known quirk in human reasoning: We are horrible at accurately assessing the cost and value of a transaction unless it is spelled out for us.
If asked up front for what amount of money they would sell their biometric data for advertising or analysis by any given company or government, most people would quote large figures. Similarly, when polled, people consistently rate their desire for privacy highly. But since privacy and data are abstract topics, all of these concerns are quickly set aside under real-world circumstances.
It turns out that the price we're willing to sell that data for is not thousands or millions of dollars, but a single picture of our aged selves that took the company a thousandth of a cent to process. Unless a major event changes the public's perception of privacy and the value of private data, and radical legislation is passed that forces companies to delete their data stores, the privacy of biometric data is already nothing more than a historical concept. It is safe to assume that the vast majority of people currently alive have pictures of themselves stored with services whose terms of service are as bad as, or only marginally better than, those of FaceApp - whether by uploading them themselves or by having others upload them for them in group photos, or simply by appearing as bystanders in the background of a picture. And as things stand, the vast majority of people don't care - despite claiming the opposite. As technologies like Deepfakes proliferate, this perception may change.