Could Google or Facebook decide an election?
At this writing, it’s Wednesday morning after the U.S. election. None of my friends is sober, and that probably includes my editor.
I originally had a different article scheduled, one that assumed I’d been wrong all along, because that’s what everyone said. The first article in which I mentioned President Trump was posted on Sept. 10, 2015, and covered data analytics in the marijuana industry. Shockingly, both Trump and marijuana won big.
I thought I was being funny. Part of the reason I was sure “President Trump” was a joke was that Facebook kept nagging me to go vote. First, it wanted me to vote early; eventually it wanted me to vote on Election Day. It wasn’t only Facebook—my Android phone kept nagging me to vote. (You’d think it would have noticed that I’d already voted or at least hung out at one of the polling places it offered to find for me, but whatever.)
This made me think. With the ubiquity of Google and Facebook, could they eventually decide elections? Politics are regional. In my state, North Carolina, if you turn out votes in the center of the state it goes Democratic. If you turn out votes in the east and west, it goes Republican. Political operatives have geographically targeted voters in this manner for years, but they have to pay to get in your face. Google and Facebook are already there.
What if, instead of telling everyone to vote, they targeted voters by region? Let’s say Google and Facebook support a fictitious party we’ll call Fuchsia. In districts that swing heavily Fuchsia, they push notifications saying “go vote.” In districts that go for the other guys, they simply don’t send vote notifications or ads, and they provide scant information on polling station locations. That alone could swing some areas.
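To make the mechanism concrete, here is a minimal sketch in Python of how selective turnout nudging by district lean could work. Everything in it is hypothetical and invented for illustration: the district names, the lean scores, the threshold, and the send_vote_reminder and send_polling_place_info helpers; it does not reflect any real Google or Facebook system.

```python
# Hypothetical sketch of selective turnout nudging by district lean.
# All names, fields, and thresholds are invented for illustration only.

from dataclasses import dataclass

@dataclass(frozen=True)
class District:
    name: str
    fuchsia_lean: float  # -1.0 (leans hard to the other party) .. +1.0 (leans hard Fuchsia)

def send_vote_reminder(user_id: str) -> None:
    # Stand-in for a push notification.
    print(f"[notify {user_id}] Don't forget to vote today!")

def send_polling_place_info(user_id: str) -> None:
    print(f"[notify {user_id}] Here is your nearest polling place.")

def nudge_users(users_by_district: dict, threshold: float = 0.2) -> None:
    """Push turnout reminders only where they help the favored (Fuchsia) party."""
    for district, users in users_by_district.items():
        if district.fuchsia_lean >= threshold:
            # Friendly district: maximize turnout.
            for user in users:
                send_vote_reminder(user)
                send_polling_place_info(user)
        elif district.fuchsia_lean <= -threshold:
            # Hostile district: send nothing, not even polling-place info.
            continue
        else:
            # Swing district: would get the targeted messaging discussed next.
            pass

# Example run with made-up districts and users.
nudge_users({
    District("Central-01", fuchsia_lean=0.6): ["alice", "bob"],
    District("Eastern-07", fuchsia_lean=-0.5): ["carol"],
})
```

The point of the sketch is how little logic it takes: one lean score per district and a threshold decide who gets nudged to the polls and who hears nothing at all.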
Targeted notifications could have an even more dramatic effect in districts that could go either way. Google and Facebook collect tons of psychometric data; Facebook even got caught doing it. They don’t only know what you “like”; they know what you hate and what you fear. Existing political operations know this too, but Google and Facebook have it at a much, much more granular level.
To go a step further, what if Facebook manipulated your feed to increase your fear level if fear is the main reason you vote? What if your personalized Google News focused on your candidate’s positives or negatives depending on whether they wanted you to stay home or go to the polls? In fact, by applying search technology to current events and the news, you could even surface articles on other topics that passively mention either your candidate or the candidate you fear.
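Purely as an illustration, the feed-ranking tweak described above could amount to nothing more than a per-user bias term added to each story’s score. Every field and number here (fear_score, turnout_goal, the weights, and so on) is invented for the example; this is a sketch of the idea under those assumptions, not anyone’s actual ranking code.

```python
# Illustrative only: a per-user bias added to a hypothetical news-feed ranking score.
# fear_score, mentions_candidate, sentiment, and turnout_goal are invented fields.

def rank_story(base_relevance: float, story: dict, profile: dict) -> float:
    """Return a ranking score nudged toward whatever the targeting wants this user to do."""
    score = base_relevance
    if profile["votes_on_fear"] and profile["turnout_goal"] == "go_vote":
        # Boost stories that raise fear of the opposing candidate.
        score += 0.3 * story["fear_score"]
    if profile["turnout_goal"] == "stay_home" and story["mentions_candidate"] == profile["own_candidate"]:
        # Bury favorable coverage of the user's own candidate.
        if story["sentiment"] > 0:
            score -= 0.5
    return score

# Example with made-up values: a fear-motivated voter the targeting wants at the polls.
story = {"fear_score": 0.8, "mentions_candidate": "other", "sentiment": -0.4}
profile = {"votes_on_fear": True, "turnout_goal": "go_vote", "own_candidate": "fuchsia"}
print(rank_story(1.0, story, profile))  # prints roughly 1.24
```

Notice that the story itself never changes; only its position in your feed does, which is exactly why this kind of nudge would be so hard to see.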
The point I’m trying to make is that the same technology used to manipulate you into buying stuff can be used to manipulate how, or whether, you vote. We’re still a little way from this, but not far. Even a small amount of targeting could tip a close vote in a key state.
Source: InfoWorld Big Data