

I'm an affiliate at a new UW center working on misinformation, public trust & truth. My role is building community partnerships and public engagement for the center. 

Scroll down for a related editorial I wrote advocating for platforms to confront misinformation.  



Nov 6, 2019 / 3 min read

“I don’t think people want to live in a world where you can only say things that tech companies decide are 100 percent true.”


That was Facebook’s Mark Zuckerberg last month, arguing that the company has no business censoring misleading political ads.


But the truth is, Facebook and other tech companies must accept responsibility for regulating harmful lies spread on their platforms — our lives might literally depend on it.


As an affiliate of the new Center for an Informed Public at the University of Washington, I get a close look at the latest research on how misinformation spreads online. The outlook is scary. As CIP director Jevin West has put it, “we’re drowning in bullshit.”


It’s tempting to call this a technical problem and look for a quick fix. But the myth that a faulty suggested-content algorithm drives the popularity of right-wing misinformation on YouTube was recently refuted. Whatever attracts people to extremist lies shared online, it’s more than just bad code.


And much as we might like to think we’re above the fray of misinformation here in the Pacific Northwest, we’re not.


Our region has long been a hotbed of the anti-vax movement, with island communities like Vashon reporting some of the lowest vaccination rates in the country. The fraudulent medical research that inspired this movement was quickly debunked, but nonetheless it spread like wildfire on Facebook groups, recently culminating in the largest measles outbreak in two decades, right here in our state.


As anyone who dropped their kids off at school this morning knows, another person’s exposure to misinformation has become everyone’s business.


If you search for anti-vax groups on Facebook today, you won’t find much beyond a friendly message referring you to the Centers for Disease Control. When it comes to vaccines, Facebook has openly accepted that they do have a responsibility to regulate potentially harmful lies on their platform.


So why have they stopped at political advertising?


Zuckerberg’s argument that political lies will wither in the open marketplace of ideas doesn’t stand up to recent research findings that politicians actually pay little price for dishonesty.


With a serial liar in the White House, corporate money flooding elections, and climate change denial still dominating U.S. policy, the quest for truth online is perhaps the most critical issue of our time.


As usual, Washington state is leading the fight. Last year the attorney general sued Facebook and Google for running political ads that violated the state’s public disclosure laws. Facebook settled for $238,500 (minuscule next to their $22 billion in profits last year).


Rather than complying with the law, which requires open tracking of ad purchases and their distribution, Facebook issued a company policy to simply stop selling political ads in Washington altogether.


Except they didn’t stop. Some candidates continued requesting ads, and Facebook often obliged, apparently unable or unwilling to enforce their own policy.


This is not the behavior of a company concerned about the public good.


Facebook wants us to believe we’re friends. And they alone have the power to stop the spread of misinformation on their platform. But until they’re made to pay a meaningful cost, they seem to lack the will.