User:$Lopez04!/Section 230/Bibliography
The two problems Congress was trying to solve with Section 230:
- Congress was worried that harmful content was abundant online.
- Concern was influenced by Stratton Oakmont, Inc. v. Prodigy Services Co., in which an online platform was held liable for its users' content.
"Section 230 was purposefully designed to achieve both these ends by providing online platforms with what are ultimately two complementary forms of protection."
The two forms of protection that most people are familiar with:
- The one that keeps platforms from being held liable for how users use their systems and services.
- This protection also makes it safe for platforms to moderate their services if they choose.
Meaning that:
a. "any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or
b. "any action taken to enable or make available to information content providers or others the technical means to restrict access to material..."
"the wisdom of Section 230 is that it recognizes that to get the best results - the most good content and also the least bad - it needs to ensure platforms can feel safe to do what they can to advance both of these things (i.e., the good and the "not so bad")."
- If social media platforms had to fear liability for how they run their services, there would be no incentive to maintain the "good" and minimize the "not so bad" content.
- Conversely, if those platforms had to fear liability for enabling user activity on their systems, there would be no incentive to enable it at all.
Examples that involve Section 230:
- "Under the Digital Millennium Copyright Act (DMCA), Section 230 is inapplicable and liability protection for platforms is conditional."
- "Platforms are so fearful of copyright liability that it regularly causes them to overly remove lawful, and often beneficial, content."
How are these considered "forms of protection" when they do nothing to minimize harmful activity on the platforms themselves?
- For example, Mark Zuckerberg has appeared before Congress multiple times, and Facebook has repeatedly been sued.
"...if [platforms] had to spend their resources policing content in this way it would come at the expense of policing their content in a way that would be more valuable to the user community and public at large."
- This isn't entirely true; the author writes as if policing platforms would cause more harm than good.
"Section 230 works because it ensures that platforms can be free to devote their resources to being the best platforms they can be to enable the most good and disable the most bad content, instead of having to spend them on activities that are focused only [on] what protects them from liability."
- While Section 230 has its benefits, it still enables significant, and often irreparable, harm.
"When people throw around the imaginary “publisher/platform” distinction as a basis for losing Section 230 protection they are getting at this idea that by exercising editorial discretion over the content appearing on their sites it somehow makes the content become something that the platforms should now be liable for."
- The article discusses what might happen if Section 230 were removed entirely. However, as our social media/digital age has shown, outright removal is not the only way to prevent the harms continuously happening online; a modernized, more specific version of Section 230 could address them as well.
"Section 230 never required platform neutrality as a condition for a platform getting to benefit from its protection. Instead, the question of whether a platform can benefit from its protection against liability in user content has always been contingent on who created that content. So long as the “information content provider” (whoever created the content) is not the “interactive computer service provider” (the platform), then Section 230 applies. Curating, moderating, and even editing that user content to some degree doesn’t change this basic equation. Under Section 230 it is always appropriate to seek to hold responsible whomever created the objectionable content. But it is never ok to hold liable the platform they used to create it, which did not."
- It is important that people be held legally liable, responsible, and accountable for what goes on their platforms, and currently they are not being held accountable enough.
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3225774
- The Zeran decision led to other lawsuits (pp. 11-13):
  - Blumenthal v. AOL (1998): the court sided with AOL, following the Zeran decision.
  - Carafano v. Metrosplash.com (2003): the court sided with Matchmaker.com, following the Zeran decision.
  - Batzel v. Smith (2003): the court sided with Smith and MSM, following the Zeran decision, but with a partial dissent arguing that editing the content made MSM an active participant rather than a passive one.
- Doe v. Internet Brands (pg. 23)
Politicians on both the left and the right believe that large-scale social media platforms have abused Section 230 and its power. They also believe that these platforms should lose Section 230 protections and be able to earn them back by reaching certain platform benchmarks.
- Section 230 allows platforms to host third-party content without facing legal or technical backlash, which gives them more privileges than a traditional publisher would normally have.
Changes to, or repeal of, Section 230 could lead to social media platforms becoming more cautious and preemptively removing content, as exemplified by Craigslist eliminating its personals section, or conversely abandoning moderation entirely, potentially leaving services dominated by extremist material. These outcomes could have global implications, particularly as other nations are already increasing their regulation of online speech.
The twenty-six words in Section 230 of the Communications Decency Act, enacted in 1996, give online platforms immunity from liability for content posted by their users. According to PBS NewsHour, the provision states that "no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider." The history goes back as far as the 1950s, when a case reached the Supreme Court over a bookseller being held responsible for "obscene" books. That ruling recognized the "chilling effect" of holding someone liable for someone else's content. These claims come from Jeff Kosseff, author of "The Twenty-Six Words That Created the Internet," a book about Section 230. While it may read as one short sentence, it holds the power to stop companies from being sued for millions of dollars over content posted on their services, whether the claim is legitimate or not. Both Democratic and Republican lawmakers have criticized the provision, arguing that major social media companies like Facebook and Twitter have misused these protections. PBS NewsHour states that Section 230 also allows these social media
Bibliography
- Kosseff, Jeff (2016-12-01). "The Gradual Erosion of the Law That Shaped the Internet: Section 230's Evolution Over Two Decades".
- This is a peer-reviewed journal article, so it should be a reliable source. It also includes a few cases that might be relevant enough to add to the Section 230 Wikipedia article, along with other information about how Section 230 has interacted with social media over the last decade.