Information disorder 'creates a chain reaction of harm,' new report finds

Information disorder “creates a chain reaction of harm.” That’s how the Aspen Institute’s Commission on Information Disorder begins its sweeping new report on the subject.
Sixteen commissioners spent the better part of a year studying disinformation and the broader impacts of “a world disordered by lies.” They discussed all sorts of potential solutions. And on Monday they released the results, including fifteen specific recommendations for Big Tech, government regulators, newsrooms, civil society, and other stakeholders.
“To be clear, information disorder is a problem that cannot be completely solved,” co-chairs Katie Couric, Chris Krebs and Rashad Robinson wrote in the report. “Its eradication is not the end goal. Instead, the Commission’s goal is to mitigate misinformation’s worst harms with prioritization for the most vulnerable segments of our society.”
Robinson, the president of Color of Change, discussed the subject on Sunday’s “Reliable Sources.” He made the point that a polluted information environment requires leaders to push for change.
“So much of what we have to do now is call on leadership across government and private sectors to engage in dealing with this problem,” he said, “because every single aspect of life will continue to be impacted and harmed.”
The report is not a partisan screed. (Its only mentions of Donald Trump are in footnoted articles.) It is a good-faith attempt to assess the chaos of the current media marketplace and recommend some ways forward.
For example, the commission recommends a “high-reach content disclosure” requirement to compel social media platforms to share vital background information about viral posts. It also recommends a “content moderation platform disclosure” to require that platforms share details about posts they take down.
Other recommendations are broader in nature, such as a call for new tools and platforms “that are designed to bridge divides, build empathy, and strengthen trust among communities,” rather than tear them down; the promotion of new “accountability norms” so that “superspreaders” of obvious lies aren’t let off the hook; and a “Public Restoration Fund” to support misinformation “countermeasures through education, research, and investment in local institutions.”
The commissioners also urge a “comprehensive federal approach” to the problem, “including a centralized national response strategy, clearly-defined roles and responsibilities across the Executive Branch, and identified gaps in authorities and capabilities.”
They also propose two amendments to Section 230 of the Communications Decency Act, specifically to “withdraw platform immunity for content that is promoted through paid advertising and post promotion,” and to “remove immunity as it relates to the implementation of product features, recommendation engines, and design.”
A version of this article first appeared in the “Reliable Sources” newsletter. You can sign up for free right here.