Facebook doesn’t have a strong reputation for respecting privacy; in fact, the company thrives on being as invasive as possible, collecting as much personal data as it can across all of its services. Using Facebook and Instagram hands the company a wealth of information about hobbies and interests, and even people without accounts have their browsing habits recorded whenever they visit any of the 8 million websites that use Facebook’s trackers for analytics. Though this process is undeniably invasive, the company is more than happy to collect the data because it’s extremely lucrative: the information is used to build profiles of users, which advertisers pay to target with personalized ads. It’s so profitable, in fact, that digital advertising made up 98% of Facebook’s $55.8 billion revenue in 2018.

Facebook’s business model relies entirely on users giving up their privacy. It’s surprising, then, that at its own F8 conference this spring, the company declared a newfound commitment to privacy. CEO Mark Zuckerberg stated that “the future is private,” acknowledging a real need for social platforms that value privacy. Changes are already being made: Facebook outlined a plan to encrypt all of its messaging apps, which would prevent the company from viewing any of its users’ text or voice messages, and promised more options for limiting how long data is stored.

While these changes sound great for consumers and would give Facebook the appearance of an ethical company, ethics is far from its primary concern. The company is doing this to save face and, ultimately, to profit from the changes. Facebook’s reputation and stock price have fallen rapidly after a string of scandals and accusations. In 2016, the company was blamed for allowing Russian agencies to push divisive ads on the site during the election, and just last year Facebook was shown to have exposed the personal information of about 87 million people to British consulting firm Cambridge Analytica. Facebook has repeatedly demonstrated that it can’t be trusted with users’ personal data, so of course it’s trying to “change”; there is no other choice. The company doesn’t care about its users’ privacy. It is trying to counter the negative press it rightfully receives by shifting its brand toward something more positive, so that users and stockholders alike will feel better.

It seems pessimistic to assume Facebook is rebranding itself for selfish reasons, but the way it is implementing “privacy” suggests this change in direction is motivated by money rather than altruism. The company has been touting a unified, fully encrypted infrastructure for all of its messaging apps (e.g., letting Instagram users talk to people on Messenger), so that every message passing through its services will be private. This is a welcome step, but it addresses the wrong problem: Facebook never mined personal data from chats in the first place. Algorithms might scan messages to make sure content is safe, but nothing is collected for advertising, because it’s far easier to track users elsewhere. Facebook is encrypting chats because doing so creates the appearance of caring about privacy while barely touching its business model. The change does nothing about the disturbing amount of data the company collects outside its chat platforms, which is where the real problem lies. Encrypting the chat apps also means the company no longer needs to spend money maintaining the algorithms that moderate messages, and merging the infrastructure of the chat services makes it much harder for regulators to split the company up.

It seems disheartening, but there are solutions. There’s nothing inherently evil about advertising, even when it’s personalized; companies just need to be transparent about what data they collect and how it’s used instead of hiding it behind features and marketing. Privacy policies exist, but they should be far more accessible and easier to understand. Websites should tell users explicitly when data is being collected and for what reason, ad targeting should be opt-in, and users deserve the right to access and erase all of their data at any time. The internet should be private by default, and a great step toward that is pushing for federal regulation that forces tech companies to respect personal data. The EU’s data protection law, the General Data Protection Regulation, has certainly improved the situation, and similar regulation at the federal level in the U.S. would be a tremendous step toward privacy everywhere.
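To make “opt-in by default” concrete, here is a minimal sketch of what consent-gated tracking could look like in a site’s front-end code. The names here (ConsentStore, trackEvent, showConsentPrompt) are hypothetical, not any real analytics API; the point is simply that nothing is collected until the user explicitly says yes.

```typescript
// Hypothetical sketch of opt-in, consent-gated tracking.
// ConsentStore, trackEvent, and showConsentPrompt are illustrative names,
// not a real analytics library.

type ConsentChoice = "granted" | "denied" | "unset";

class ConsentStore {
  private choice: ConsentChoice = "unset"; // private by default: no consent assumed

  grant(): void { this.choice = "granted"; }
  deny(): void { this.choice = "denied"; }
  hasConsent(): boolean { return this.choice === "granted"; }
}

const consent = new ConsentStore();

// Every collection point checks consent first; without it, nothing leaves the browser.
function trackEvent(name: string, data: Record<string, string>): void {
  if (!consent.hasConsent()) return; // opt-in: the default is to collect nothing
  console.log(`sending ${name}`, data); // stand-in for a real call to an analytics endpoint
}

// The prompt states plainly what would be collected and why, before any tracking happens.
function showConsentPrompt(): void {
  console.log("We'd like to collect page-view data to personalize ads. Opt in?");
}

showConsentPrompt();
trackEvent("page_view", { page: "/home" }); // no-op until the user calls consent.grant()
```

The design choice is the whole point: the tracking call itself refuses to run without consent, rather than relying on a banner users can dismiss while data flows anyway.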

Despite their claims, Facebook and companies like Google don’t care about user privacy, because user data is how they make money. More awareness of and concern for privacy are desperately needed; far too much trust is placed in companies that know everything about their users.