Over the past century, media has grown from a handful of newspapers and radio broadcasts into an endless digital stream of content available at any time.
What we have now is a perfect storm: people both produce and consume enormous amounts of content, and what they consume is often curated by large platforms whose primary goal is increasing engagement.
This expansion of media has connected humans on a massive scale. However, today anyone, anywhere, can post anything with little to no consequence. This is a problem. It is no longer simply big agencies shaping ideas; in this decade, the problem is that misinformation is given a platform.
Anonymous posting is new: there is no personal risk and no limit on the claims being made. The platform hosting the post is not answerable to an editor, and neither the platform nor any editor faces consequences. The information simply goes on to be consumed. The big platforms should be answerable to an editor; right now, they are not.
In 1996, Section 230 of the Communications Decency Act was put into place.
According to Congress.gov: “The statute generally precludes providers and users from being held liable—that is, legally responsible—for information provided by another person.”
In my view, the statute protects those with money and allows them to grow even bigger.
This act effectively signals that posting misinformation is acceptable. I believe most citizens do not want this mass of unmonitored posts available to everyone.
There is a big difference between posting something as an independent party or journalist and being defended by an organization.
The problem often lies in where responsibility is held. Under this act, nobody is responsible, and that is what needs to change. These massive organizations need an incentive to monitor users' posts. I do not believe Section 230 should be removed entirely; I believe amendments should be made. I want to preserve the First Amendment at almost all costs, but this act needs to be revised. Computer services should be held responsible for the posts their users make. Lawsuits will happen, and they should be brought against the platforms themselves; the platforms need to monitor their users' behavior.
To provide a further remedy, platforms should hire editors to protect themselves against lawsuits. This would also create jobs and allow free speech to continue productively.
As these platforms have grown, they have developed algorithms to keep viewers engaged. Facebook is a great example of that.
In Time magazine, Noah Giansiracusa explains how Facebook's algorithm is designed to maintain your consumption.
“It turns out Facebook engineers have assigned a point value to each type of engagement users can perform on a post,” Giansiracusa said. “For each post you could be shown, these point values are multiplied by the probability that the algorithm thinks you’ll perform that form of engagement.”
In turn, the posts you agree with or have liked are pushed to you more frequently.
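The mechanism Giansiracusa describes can be sketched in a few lines of code. To be clear, the point values and probabilities below are hypothetical illustrations chosen for this sketch, not Facebook's actual numbers; the company's real ranking system is far more complex.

```python
# A minimal sketch of engagement-weighted ranking as Giansiracusa describes it.
# All numbers here are hypothetical, not Facebook's actual values.

# Hypothetical point value assigned to each type of engagement.
POINT_VALUES = {"like": 1, "comment": 15, "share": 30}

def score_post(engagement_probs):
    """Expected engagement: each point value multiplied by the predicted
    probability that the user performs that form of engagement, summed."""
    return sum(POINT_VALUES[action] * p for action, p in engagement_probs.items())

# Two candidate posts, with hypothetical predicted probabilities for one user.
posts = {
    "news_clip": {"like": 0.20, "comment": 0.01, "share": 0.005},
    "hot_take":  {"like": 0.10, "comment": 0.08, "share": 0.04},
}

# Rank posts by expected score; the higher-scoring post is shown first.
ranked = sorted(posts, key=lambda name: score_post(posts[name]), reverse=True)
print(ranked)  # the provocative post outranks the plain news clip
```

Even in this toy version, the incentive is visible: a post likely to provoke comments and shares scores far higher than one that merely earns likes, so the feed naturally favors provocative content.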
Constant exposure to headlines and clips limits the ability to consume critically. It fosters the idea that, because the information comes from an algorithm or a professional-looking source, it must be accurate. This is called algorithmic complacency. It is not entirely the consumer's fault; when a product is responsible for a harmful human behavior, I believe the product needs to be changed.