For all of its hype, BIG DATA has missed every major crime, every major terrorist action, every big business swing and every social trend.
For all of the billions spent on marketing hype, salesmanship, PR and crappy software, Cisco, Oracle, Palantir, Lucid and all of the other “WE CAN SEE THE FUTURE” analysis products that Silicon Valley foisted on the market have failed miserably.
Whatever a computer tells you people are going to do is almost never what happens.
For example: here is a massive amount of taxpayer time and resources spent on a crime-prevention BIG DATA project that didn’t stop a single crime. Now it is being tossed out the window:
One of the biggest hype stories out of Silicon Valley, FACEBOOK, finally admits that its billions of dollars of computers and software can’t even get it right, and that only real humans can figure out real humans:
Facebook Says You Filter News More Than Its Algorithm Does
A Facebook study of 10 million users shows that your selection of friends holds more sway than filtering algorithms when it comes to seeing news from opposing political viewpoints.
Facebook studied millions of its most political users and determined that while its algorithm tweaks what you see most prominently in your feed, you’re the one really limiting how much news and opinion you take in from people of different political viewpoints.
Eytan Bakshy, a research scientist on Facebook’s data science team and coauthor of the paper, says the group found that Facebook’s News Feed algorithm only slightly decreases users’ exposure to news shared by those with opposing viewpoints.
“In the end, we find individual choices, both in terms of who they choose to be friends with and what they select, matters more than the effect of algorithmic sorting,” he says.
The work comes more than three years after Bakshy and other researchers concluded that while you’re more likely to look at and share information with your closest connections, most of the information you get on Facebook stems from the web of people you’re weakly connected to—refuting the idea that online social networks create “filter bubbles” limiting what we see to what we want to see (see “What Facebook Knows”).
However, Bakshy says, the previous research, published in 2012, didn’t directly measure the extent to which you’re exposed to information from people whose ideological viewpoints are opposite from yours.
In an effort to sort that out, researchers looked at anonymized data for 10.1 million Facebook users who define themselves as liberal or conservative, and seven million URLs for news stories shared on Facebook from July 7, 2014, to January 7, 2015. After using software to identify URLs pointing to “hard” news stories (pieces focused on topics like national news and politics) that were shared by a minimum of 20 users who had a listed political affiliation, researchers labeled each story as being aligned with liberal, neutral, or conservative ideologies, depending on the average political leaning of those who shared the stories.
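The labeling step described above can be sketched in a few lines. This is a minimal illustration, not the study's actual code: the URLs, the numeric leaning scale (liberal = -1, neutral = 0, conservative = +1), and the lowered sharer threshold are all assumptions made for the example (the real study required at least 20 affiliated sharers per URL).

```python
from collections import defaultdict

# Hypothetical share records: (url, sharer_leaning), where leaning is
# -1 (liberal), 0 (neutral) or +1 (conservative). Toy data only.
shares = [
    ("example.com/tax-bill", +1), ("example.com/tax-bill", +1),
    ("example.com/tax-bill", -1),
    ("example.com/climate", -1), ("example.com/climate", -1),
]

MIN_SHARERS = 2  # the actual study used a threshold of 20

def label_stories(shares, min_sharers=MIN_SHARERS):
    """Label each URL by the average political leaning of its sharers."""
    leanings = defaultdict(list)
    for url, leaning in shares:
        leanings[url].append(leaning)

    labels = {}
    for url, vals in leanings.items():
        if len(vals) < min_sharers:
            continue  # too few politically affiliated sharers to classify
        mean = sum(vals) / len(vals)
        if mean > 0:
            labels[url] = "conservative"
        elif mean < 0:
            labels[url] = "liberal"
        else:
            labels[url] = "neutral"
    return labels

print(label_stories(shares))
# {'example.com/tax-bill': 'conservative', 'example.com/climate': 'liberal'}
```

The point of the design is that a story's ideology is inferred entirely from who shares it, not from its text, which is why the 20-sharer minimum matters: with too few sharers the average leaning is noise.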
Researchers found that 24 percent of the “hard” stories shared by liberal Facebook users’ friends were conservatively aligned, while 35 percent of the “hard” stories shared by conservative Facebook users’ friends were liberally aligned, for an average of 29.5 percent exposure, overall, to content from the other side of the political spectrum.
The researchers also looked at the impact of Facebook’s News Feed ranking algorithm on the kind of news you see. Bakshy says that overall, the algorithm reduces users’ exposure to content from friends who have opposing viewpoints by less than 1 percentage point—from 29.5 percent to 28.9 percent.
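The two figures above combine simply; this quick check works through the arithmetic behind the 29.5 percent average and the sub-1-point algorithmic reduction reported in the article.

```python
# Cross-cutting exposure rates reported in the article (percent).
liberal_exposure = 24.0       # conservative stories among liberals' friends' shares
conservative_exposure = 35.0  # liberal stories among conservatives' friends' shares

average_exposure = (liberal_exposure + conservative_exposure) / 2
print(average_exposure)  # 29.5

# Exposure after News Feed ranking, per the study.
after_ranking = 28.9
reduction = round(average_exposure - after_ranking, 1)
print(reduction)  # 0.6 -- i.e. less than 1 percentage point
```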
And when it came down to what users ended up actually reading, researchers report that conservatives were 17 percent less likely to click on liberally aligned articles than other “hard” stories in their news feeds, while liberals were 6 percent less likely to click on conservatively aligned articles presented to them.
Sharad Goel, an assistant professor at Stanford who has studied filter bubbles, says people in the field have talked about this issue for several years but Facebook alone was in a position to explore it. He says one thing worth keeping in mind is that people may get their news from many sources, which can dwarf the impact of what they see on Facebook.
“I do agree with one of their main messages—that the algorithm itself is not driving a lot of polarization,” he says.