The Korea Herald


[Safiya Noble, Rashad Robinson] Under Elon Musk’s Twitter takeover, who will protect users?

By Korea Herald

Published: May 5, 2022 - 05:30


Elon Musk’s Twitter takeover has triggered widespread criticism. Many people are panicked about the direction Musk will take the social platform. There’s a reason for alarm, but focusing solely on Musk ignores the crisis of monopoly control without accountability that characterizes much of the media in this country.

In recent decades, sustaining a public square -- the space available to debate, contest, experiment with and expand democratic discourse -- has become a struggle fraught with challenges. The tech sector has remade our understanding of who can speak and who should be heard, in both good and troubling ways. That has given rise to algorithmic and automated boosting of everything from evidence-based research and investigative journalism to outright racist propaganda.

Social media has created new monopolists, such as Mark Zuckerberg, who leads Facebook and Instagram. But in reality, we’ve been living in a world of media controlled by a very few private actors -- sometimes single families -- for a very long time. Ownership of communication outlets continues to be consolidated into the hands of a few, which has had an incredibly harmful effect on politics, education and the way we narrate and understand our shared societal challenges.

In announcing his $44 billion deal to buy Twitter, Musk said: “Free speech is the bedrock of a functioning democracy.” Over the years, Twitter has navigated how to handle content moderation, de-platforming Nazis and violent incitement to overthrow governments. As a public company with a board of directors, it has had to face some legal accountability, however limited, to agencies such as the Securities and Exchange Commission.

By taking the company private, Musk will remove this layer of oversight from Twitter. There is no question that abuses on a platform that has already struggled with racism and harassment will become even more difficult to rein in.

The issue is not just that rich people have influence over the public square; it’s that they can dominate and control a wholly privatized square -- they’ve created it, they own it, they shape it around how they can profit from it. So perhaps the real question is whether people will have any space, or be able to engage in any activity, that is not totally dominated by an entity seeking profits.

Technology companies are media companies. They have a responsibility for the way they affect our lives and democracy. Yet when a few unaccountable people control such platforms, that responsibility becomes voluntary and unenforceable. A self-regulated company is a non-regulated company.

Just as we need rules for television and the telecommunications industry designed to protect people, we need rules for technology companies. Frameworks of fairness and accountability for harm are necessary for a just society free from exploitation, anchored in civil and human rights. That’s true of every industry, and media platforms are no exception.

Pundits arguing that “Twitter was great before and now it will be terrible” take us off track from the bigger problems at hand: the lack of rules and accountability. Federal laws and regulations must be crystal clear: The tech sector’s products must be subject to regulatory scrutiny before they are released. Just as drugs are subject to oversight by the Food and Drug Administration, tech products need to pass inspection -- an independent auditing process conducted by civil rights experts that exposes what companies want to hide and demands proof that their products do no harm.

Regulators cannot be allowed to shift the burden and blame to consumers. The lie that we simply need to “put more control in the hands of users” is like holding individuals responsible for the air we breathe, or the pollution that destroys our lives, rather than regulating water and air quality in the best interest of the public.

In the tech world, self-regulation by corporations is essentially complete non-regulation. The real issue is regulating deceptive and manipulative content, consumer exploitation, calls to violence and discriminatory or harmful products. Section 230 of the Communications Decency Act states that no provider or user of an “interactive computer service shall be treated as the publisher or speaker” of any information provided by another content provider. Long hailed as protecting free speech on the internet, the measure shouldn’t be used by tech-media publishers as a shield against demands to protect users from harm, or to nullify 60 years of civil rights and consumer safety law.

The right approach is not complicated: If you make the internet safe for the people most at risk, you make the system safer for everyone. Big Tech puts Black people, people of color, women, queer and disabled people in more danger than anyone else. We need regulations and stronger digital civil rights that will safeguard the public, rather than debates about which billionaire will own the next communications platform.


Safiya Noble & Rashad Robinson
Safiya Noble is a professor at UCLA in the departments of Gender Studies and African American Studies, and a 2021 MacArthur Foundation Fellow. Rashad Robinson is the president of Color of Change, a racial justice organization, and served as the co-chair of the Aspen Institute’s Commission on Information Disorder. They wrote this for the Los Angeles Times. -- Ed.

(Tribune Content Agency)