Facebook has become a leading news source for a number of people across the globe — which may be the actual problem here.
Given its hand in global unrest, from election tampering to full-blown violence, and the lack of transparency surrounding its use of data, it’s clear that something needs to change.
Which raises the question: is Zuckerberg obligated to change the platform, or are lawmakers the ones who must push for change?
It seems that Facebook is trying to change
We kicked off an eventful year with a Facebook algorithm change.
Zuckerberg announced the newsfeed would shift toward a more “social” model, one built on connecting with real people and limiting the presence of paid ads.
Part of this initiative was to address political ads, too. Now, ads must be labeled as sponsored by “X group.”
But 2018 was marred by scandal. Zuckerberg, Sheryl Sandberg, and other execs were called to Congress.
Fake news hasn’t gone away, and there’s still no clear solution to the problem.
Are regulations on the horizon?
Facebook has built this testing and tracking machine — one that has made advertising more effective and specific. The company has explained this as a way to present users with ads that “relate to things they care about.”
Maybe most people don’t mind seeing products they might actually buy. The issue comes when that data machine is used to wield influence by playing into users’ fears or existing biases.
As we saw in the Cambridge Analytica fallout, social media has the power to manipulate people quite easily.
The firm claims to have 5,000 data points on every American, and it used people’s behavioral data to gently nudge them toward changing their behavior.
Cambridge Analytica shared over 100 fake pro-Trump stories, as you may remember, which highlighted a major issue. It’s not that users are dealing with ads; it’s that efforts like these, aimed at manipulating voters, can eat into democracy and corrode trust.
Senators Bob Corker and Chris Coons are pushing for new regulations that address growing concerns about misinformation and privacy. Coons says he expects Congress to come up with a solution comparable to the EU’s GDPR, which rolled out earlier this year.
The GDPR requires companies to obtain user consent and to disclose when they will be collecting personal data.
But if the government does get involved, there’s a lot to consider. Facebook and Twitter have long been promoting their platforms as centers for free speech.
There’s no doubt that internet speech should be protected, and the idea of rules governing that space is not going to sit well with users on either side of the aisle.
One of the biggest problems is that the existing setup limits competition.
Last year, Senator Amy Klobuchar introduced a bill that would prevent corporations from acquiring up-and-coming startups.
So Facebook, moving forward, wouldn’t be able to build up its empire by acquiring a whole bunch of future WhatsApps and Instagrams.
The idea is that we’d eventually have new alternatives cropping up outside the control of this relative handful of tech companies, giving customers more choice over where to take their business.
Facebook is changing the rules for marketing
Earlier this year, Facebook announced it would be shutting down Partner Categories, a feature that let marketers leverage third-party data to create targeted Facebook ads. Those third-party data brokers include some familiar names like Experian, Oracle Data Cloud, and Acxiom.
These companies have access to some of the most relevant consumer insights: incomes, family relationships, the kinds of cars people drive, where they shop, their health status, and more.
Marketers used that data to target, say, people who had just bought a house and were shopping for a TV.
This change is a pretty big deal, actually. That data is a key ingredient for all kinds of companies.
And Facebook’s announcement signals a move toward protecting people’s privacy, though the company will still provide engagement and sales-impact data on the back end of the site.
Will Facebook truly change? The jury is out.