Facebook’s algorithm monetizes misery
--
“British Ruling Pins Blame on Social Media for Teenager’s Suicide” — NYT
An inquest ruled that harmful online content contributed to the 14-year-old’s death. Ian Russell accused Meta, the owner of Facebook and Instagram, of guiding his daughter on a “demented trail of life-sucking content”, after the landmark ruling raised the regulatory pressure on social media companies. — The Guardian
Molly Russell’s father accuses Facebook of “monetizing misery” after an inquest ruled that harmful online content contributed to his 14-year-old daughter’s death.
How do social media algorithms work? How do Facebook and YouTube profit by sharing disturbing content? How do social media firms fan political violence? Should social media companies be held responsible for the harm they cause by spreading hate and misinformation? How does their greed divide people and polarize politics? What is blood money? What is Section 230? What’s the case before the Supreme Court?
Profiting from anger and misery
The algorithms are designed to maximize the time users spend on the platform, because more time on the platform means more ads can be shown. More ads, more profit. And every time users share posts with others, they create still more opportunities to sell ads. The algorithms pay little attention to the harm the shared content might cause, so long as it keeps users on the platform.
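To make that concrete, here is a rough sketch, in code, of what an engagement-first ranking rule looks like. This is not Facebook's actual system; the post fields, weights, and scoring function below are invented purely for illustration. The point is what the score rewards and what it ignores.

```python
# Hypothetical sketch of an engagement-first feed ranker.
# NOT Facebook's real code: field names and weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_dwell_seconds: float   # how long the model expects the user to linger
    predicted_share_prob: float      # chance the user reshares it (0..1)
    predicted_click_prob: float      # chance the user clicks or comments (0..1)

def engagement_score(post: Post) -> float:
    # The score rewards anything that keeps the user on the platform
    # or spreads the post into more feeds, and nothing else.
    # Note what is missing: no term penalizes harmful or distressing content.
    return (
        1.0 * post.predicted_dwell_seconds
        + 20.0 * post.predicted_share_prob
        + 10.0 * post.predicted_click_prob
    )

def rank_feed(candidates: list[Post]) -> list[Post]:
    # Show the highest-scoring posts first: more time on site, more ads shown.
    return sorted(candidates, key=engagement_score, reverse=True)

if __name__ == "__main__":
    feed = rank_feed([
        Post("cheerful-update", 12.0, 0.01, 0.05),
        Post("outrage-bait", 45.0, 0.20, 0.30),
    ])
    print([p.post_id for p in feed])  # the provocative post ranks first
```

In this toy version, the post predicted to provoke the strongest reaction rises to the top of the feed, whether it is a family photo or something far darker, because the objective being optimized is attention, not wellbeing.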
“Andrew Walker, the senior coroner, said algorithms that curate a social media user’s experience had pushed harmful content to Molly that she had not requested. He said some of the content “romanticized” acts of self-harm and sought to discourage users from seeking professional help. Concluding that it would not be safe to rule Molly’s cause of death as suicide, Walker said some of the sites she had viewed were “not safe” because they allowed access to adult content that should not have been available to a 14-year-old.
“It is likely that the above material viewed by Molly, already suffering with a depressive illness and vulnerable due…