“You are a teenage girl in 2026. You’re going hiking. You’re at the beach. You’re getting glam for a homecoming dance, posing with your friends, enjoying the kinds of moments that high school kids have been memorializing without incident for decades. These are the kinds of wholesome, keepsake memories that have been forever ruined for the three Jane Does in Tennessee who are part of a class-action lawsuit filed in March against xAI, Elon Musk’s A.I. company,” Jessica Grose, a writer for Times Opinion, says in her weekly newsletter.

Jessica continues:

The creation of child sex abuse material, or CSAM, by individuals is already illegal, but in March a jury in New Mexico found Meta liable to the tune of $375 million for misleading users about its safety practices and failing to protect its young users from child predators.

Social media companies were previously able to avoid accountability for their role in enabling the sharing of these images by leaning on Section 230 of the Communications Decency Act of 1996, which, as my newsroom colleague Cecilia Kang has explained, “protects them from liability for what their users post.” Congress has not gotten it together to reform this law, so lawyers have had to file suits in state courts that try out innovative strategies to get justice for children.

New Mexico’s attorney general, Raúl Torrez, pointed to the algorithms built by the social media companies, which are separate from what users individually post. “What is not covered by Section 230 are the design features themselves that are built into the product that make that product inherently dangerous,” Torrez said. He added, “The platforms are really good at connecting people with the things that they are interested in, and if you have an interest in little girls, the platform will be good at connecting you with little girls.”

Read the full piece here, for free, even without a Times subscription.
Originally posted by u/nytopinion on r/ArtificialInteligence
