A digital media awash in “truthiness” needs “trustiness.”

That became clear during a high-octane symposium at Harvard University and the Massachusetts Institute of Technology that examined comedian Stephen Colbert’s definition of a truth that is known in the gut and can’t be swayed by facts or logic.

“Truthiness is a rhetorical poker game that the ’net let people play,” said Charles Nesson, founder of the Berkman Center for Internet & Society, in giving a midday “inflection point” of the two-day symposium on “Truthiness in Digital Media,” co-hosted by the Berkman Center and the MIT Center for Civic Media and supported by the Ford Foundation.

On Tuesday, prominent Web activists, researchers, and opinion makers gathered at Harvard to lay out the challenges of truthiness. On Wednesday, they went to the MIT Media Lab for “Hack Day” to try to come up with new technical tools to address those challenges. The challenges were myriad, as indicated by the event’s subtitle: “A symposium that seeks to understand and address propaganda and misinformation in the new media ecosystem,” as in “the Internet spreads lies.”

Participants described a media landscape where roving bots produce near-human conversation on Twitter, where citizens cluster in right- or left-wing echo chambers, and where measured, carefully presented debunking may only reinforce a falsehood because of quirks in human psychology.

That’s the dark side. But the bright side is equally valid, participants noted. Crowdsourcing can effectively debunk falsehoods or sort through confusing public documents such as tax returns.
And if mainstream journalists fail to fact-check, the Internet supports groups that will.

Tuesday’s presentations began with one by someone who admitted he had come over from the dark side: Wendell Potter, a former health industry PR executive turned industry critic. Potter described how the media world has shifted from having skeptical reporters act as gatekeepers to an open field where he could spread information “without being asked a difficult question.”

“All this means the consumer is at a big disadvantage; it is easy for the dark side to spread misinformation,” Potter said, citing as an example the rumors of “death panels” during the health reform debates.

The symposium showed how researchers, such as Yochai Benkler, the Berkman Professor for Entrepreneurial Legal Studies, and Filippo Menczer, professor of informatics and computer science at Indiana University, are studying, tracking, and graphing the spread of information and misinformation. For example, Benkler showed through a series of slides how discussion of federal anti-piracy legislation moved from technical sites to policy sites and eventually countered the impact of heavily funded lobbying groups to derail the legislation. “It’s not about fact-checking, it’s about frame shifting,” Benkler said.

Menczer described how activists used technology to tweet and retweet fake news about President Barack Obama’s health care reform plan, in an effort to target influential Twitter users. Such efforts can be detected, he said, but “by that time the damage is already done. The trick is, can we detect it early, before the damage is done?”

Tim Hwang, chief scientist of the Pacific Social Architecting Corp., discussed his team’s efforts to create social media bots that were so effective that people often engaged them in long conversations.
(A sample bot response: “That’s so interesting. Tell me more.”) To prevent manipulation, “You should treat social networks like computer networks,” he said.

Many of those who spread misinformation are sincere: they actually believe that there is no global warming or that vaccines cause health problems. However, there are some Machiavellis out there. Melanie Sloan, director of Citizens for Responsibility and Ethics in Washington, discussed a media baron who she says created at least 25 nonprofit “education” groups (such as one that debunks the “myth of childhood obesity”) that are staffed by his PR agency. The public, she said, is not aware of the massive funding behind advocacy groups.

Even journalists can be blinded by their own mindsets, said Kai Wright, a writer and editorial director of Colorlines.com, citing the period in 2006 when some people’s fortunes were booming because of the stock market while poorer communities were suffering from predatory lending. Should a reporter then say the economy is good or bad? “The media has done a poor job of reporting the truth because they only look at a narrow set of facts,” he said.

Even if myriad tweets put out the facts, many people may not believe them. Chris Mooney, author of the forthcoming book “The Republican Brain,” admitted that even he had been deceived by the traditional Enlightenment view of reality: the idea that putting out clear, well-reasoned information changes minds. It turns out that the brain doesn’t work that way, he said. Indeed, the smarter you are, he said, the less likely you are to change your mind.

“The more capable you are of coming up with an argument that supports your belief, the less likely you are to change,” he said, even if you are only rationalizing, not reasoning. “There is a science of why we deny science,” Mooney said. “There is a science of truthiness.”

Often, said Brendan Nyhan, assistant professor of government at Dartmouth College, people “double down” on their beliefs.
And if you restate false claims while debunking them (as in a health brochure on vaccines), you may “make the claims more familiar, and now they seem more true.”

But the day was not all gloom. Potter noted that, through a social media groundswell, Bank of America did not implement an unpopular new fee. Hwang said bots could also be deployed to detect other bots. During the lively question-and-answer session, Panagiotis Metaxas, a Wellesley computer science professor, said, “I trust crowds quite a bit. They are too big to be bought.” Through technology, crowds can be formed, he added.

Kathleen Hall Jamieson, communication professor at the Annenberg School, said the public and journalists should be made aware of methods of fear-mongering. Citing her new site, Flackcheck.org, Jamieson urged the audience to push broadcasters to check the veracity of third-party ads before airing them. “You can learn to detect the patterns of deception,” she said.

Video of the symposium will be posted on the Berkman Center’s site.