Rory Mir and Katitza Rodriguez* say the virtual reality (VR) and augmented reality (AR) industry has inherited many privacy problems from the big tech companies building the headsets.
If you aren’t an enthusiast, chances are you haven’t used a Virtual Reality (VR) or Augmented Reality (AR) headset.
The hype around this technology, however, is nearly inescapable.
We’re not just talking about dancing with lightsabers; there’s been a lot of talk about how VR/AR will revolutionise entertainment, education, and even activism.
EFF has long been interested in the potential of this technology, and has even developed our own VR experience, Spot the Surveillance, which places users on a street corner amidst police spying technologies.
It’s easy to be swept up in the excitement of a new technology, but utopian visions must not veil the emerging ethical and legal concerns in VR/AR.
The devices are new, but the tech giants behind them aren’t.
Any VR/AR headset you use today is likely made by a handful of corporate giants—Sony, Microsoft, HTC, and Facebook.
As such, this budding industry has inherited a lot of issues from its creators.
VR and AR hardware aren’t household devices quite yet, but if they succeed, there’s a chance they will creep into all of our personal and professional lives guided by the precedents set today.
A step backwards: Requiring Facebook login for Oculus
This is why Oculus’ announcement last week shocked and infuriated many users.
Oculus, acquired by Facebook in 2014, announced that it will require a Facebook account for all users within the next two years.
At the time of the acquisition Oculus offered distressed users an assurance that “[y]ou will not need a Facebook account to use or develop for the Rift [headset].”
There’s good cause to be alarmed.
Eliminating alternative logins can force Oculus users to accept Facebook’s Community Standards, or risk potentially bricking their device.
With this lack of choice, users can no longer freely give meaningful consent and lose the freedom to be anonymous on their own device.
That is because Oculus owners will also need to adopt Facebook’s controversial real name policy.
The policy requires users to register under what Facebook calls their “authentic identity”: the name known by friends and family and found on acceptable documents—in order to use the social network.
Without anonymity, Oculus leaves users in sensitive contexts high and dry, such as activists using VR in Hong Kong or LGBTQ+ users who cannot safely reveal their identity.
Logging in to a Facebook account on an Oculus device already shares data with Facebook, which is used to inform the ads you see.
Facebook already has a vast collection of data, collected from across the web and even your own devices.
Combining this with the sensitive biometric and environmental data detected by Oculus headsets further tramples user privacy.
And Facebook should really know—the company recently agreed to pay $650 million for violating Illinois’ biometric privacy law (BIPA) by collecting user biometric data without consent.
However, for companies like Facebook, which are built on capturing your attention and selling it to advertisers, this is a potential gold mine.
Having eye-tracking data on users, for example, could cement monopolistic power in online advertising—regardless of how effective that data actually is.
They merely need the ad industry to believe Facebook has an advantage.
Facebook violating the trust of users of its acquired companies (like Instagram and WhatsApp) may not be surprising.
After all, it has a long trail of broken promises while paying lip service to privacy concerns.
What’s troubling in this instance, however, is the position of Oculus in the VR/AR industry.
Facebook is poised to shape the medium as a whole and may normalise mass user surveillance, as Google has already done with smartphones.
We must make sure that doesn’t happen.
Defending fundamental human rights in all realities
Strapping these devices to ourselves lets us enter a virtual world, but at a price—these companies enter our lives and gain access to intimate details about us through biometric data.
How we move and interact with the world offers insight, by proxy, to how we think and feel in the moment.
Eye-tracking technology, long used in cognitive science research, is already being developed for these devices, setting the stage for unprecedented privacy and security risks.
If aggregated, those in control of this biometric data may be able to identify patterns which let them more precisely predict (or cause) certain behaviour and even emotions in the virtual world.
It may allow companies to exploit users’ emotional vulnerabilities through strategies that are difficult for the user to perceive and resist.
What makes the collection of this sort of biometric data particularly frightening is that, unlike a credit card or password, it is information about us we cannot change.
Once collected, there is little users can do to mitigate the harm done by leaks or by data being monetised by additional parties.
Threats to our privacy don’t stop there.
A VR/AR setup will also be densely packed with cameras, microphones, and myriad other sensors to help us interact with the real world—or at least not crash into it.
That means information about your home, your office, or even your community is collected, and potentially available to the government.
Even if you personally never use this equipment, sharing a space with someone who does puts your privacy at risk.
Without meaningful user consent and restrictions on collection, a menacing future may take shape where average people using AR further proliferate precise audio and video surveillance in public and private spaces.
It’s not hard to imagine these raw data feeds integrating with the new generations of automatic mass surveillance technology such as face recognition.
Companies like Oculus need to do more than “think about privacy”.
Industry leaders need to commit to the principles of privacy by design, security, transparency, and data minimisation.
By default, only data necessary to core functions of the device or software should be collected; even then, developers should utilise encryption, delete data as soon as reasonably possible, and have this data stay on local devices.
Any collection or use of information beyond this, particularly when shared with additional parties, must be opt-in with specific, freely given user consent.
For consent to be freely given, Facebook should provide an alternative option so the user has the ability to choose.
Effective safeguards must also be in place to ensure companies are honouring their promises to users, and to prevent Cambridge Analytica-style data scandals involving third-party developers.
Companies should, for example, carry out a Data Protection Impact Assessment to help them identify and minimise data protection risks whenever processing is likely to result in a high risk to individuals.
While we encourage these companies to compete on privacy, it seems unlikely most tech giants would do so willingly.
Privacy must also be the default on all devices, not a niche or premium feature.
We all need to keep the pressure on state legislatures and Congress to adopt strong comprehensive consumer privacy laws in the United States to control what big tech can get away with.
These new laws must not preempt stronger state laws, they must provide users with a private right of action, and they should not include “data dividends” or pay-for-privacy schemes.
Antitrust enforcers should also take note of yet another broken promise about privacy, and think twice before allowing Facebook to acquire data-rich companies like Oculus in the future.
Mergers shouldn’t be allowed based on promises to keep the user data from acquired companies separate from Facebook’s other troves of user data when Facebook has broken such promises so many times before.
The future of privacy in VR/AR will depend on swift action now, while the industry is still budding.
Developers need to be critical of the technology and information they utilise, and how they can make their work more secure and transparent.
Enthusiasts and reviewers should prioritise open and privacy-conscious devices now, while they are still only entertainment accessories.
Activists and researchers must create a future where AR and VR work in the best interests of the users, and society overall.
Left unchecked, we fear VR/AR development will follow the trail left by smartphones and IoT.
Developers, users, and government must ensure it does not ride its hype into an inescapable, insecure, proprietary, and privacy-invasive ecosystem.
The hardware and software may go a long way towards fulfilling technology’s long-promised potential, but they must not do so while trampling on our human rights.
*Rory Mir is a Grassroots Advocacy Organiser primarily working on the Electronic Frontier Alliance. Katitza Rodriguez is EFF’s international rights director.
This article first appeared at eff.org