Over the past decade, Texas Attorney General Ken Paxton has wielded his office’s significant resources to investigate well-known tech giants, including Google and Meta, over how they moderate content and treat rivals. He helped win settlements against Apple for allegedly misleading users and is suing TikTok for allegedly endangering children’s privacy. Now, Paxton’s latest tech investigation includes an expansive list of targets, WIRED has learned.
Rumble, Quora, and WeChat are among the 15 companies from which Texas has demanded answers by next week about their collection and use of data of people under 18 years old. Paxton announced the investigation in a press release last month but named only four of the companies being probed: Character.AI, Reddit, Instagram, and Discord. WIRED obtained the names of additional targeted companies through a public records request. They also include Kick, Kik, Pinterest, Telegram, Twitch, Tumblr, WhatsApp, and Whisper.
Paxton’s office did not respond to requests for comment, including about how it chose which businesses to investigate. But the variety of companies questioned highlights the sprawling reach of a new Texas law aimed at increasing oversight of minors’ use of social media and chat services.
Three experts in youth privacy regulations who have been following Paxton’s enforcement efforts say the new investigation should be taken seriously, and they believe it could result in companies agreeing to improve their practices. The alternative could be up to hundreds of millions of dollars in penalties per company. “When you bring all the statutes together, Texas has a pretty significant hammer,” says Paul Singer, a partner and section chair at the law firm Kelley Drye & Warren.
Spokespeople for Character.AI and Tumblr say they take safety issues seriously. Meta and WeChat declined to comment. Other companies under investigation did not respond to requests for comment.
Paxton launched his probe three days after the families of an 11-year-old girl and a 17-year-old boy in Texas sued chatbot startup Character.AI, claiming the company designed its product in an unsafe way that exposed the kids to sexualized and violent responses. In October, a similar case was filed in Florida by the family of a 14-year-old who shot himself to death, allegedly after conversations with a Character.AI chatbot. Character.AI later introduced an experience aimed at kids and plans to roll out parental controls.
Other services under investigation, including Kik, Instagram, and Discord, have faced public scrutiny over their use by children. But less so WeChat, a messaging app popular among Chinese Americans, and Quora, a forum for crowdsourcing information that has recently expanded into AI chatbots.
Paxton’s interest in Rumble, a YouTube-like website popular among US conservative political commentators, is also perhaps unexpected given his track record of partisan views about social media companies. Rumble has touted itself as a haven for free expression, in contrast to platforms it says engage in heavier content moderation. Paxton, a Republican, has criticized platforms that he says unfairly silence Texans.
The privacy experts who spoke with WIRED described Rumble, Quora, and WeChat as unusual suspects but declined to speculate on the rationale behind their inclusion in the investigation. Josh Golin, executive director of the nonprofit Fairplay, which advocates for digital safety for kids, says concerns aren’t always obvious. Few advocacy groups worried about Pinterest, for example, until the case of a British teen who died from self-harm following exposure to sensitive content on the platform, he says.
Paxton’s press release last month called his new investigation “a critical step toward ensuring that social media and AI companies comply with our laws designed to protect children from exploitation and harm.”
The United States Congress has never passed a comprehensive privacy law, and it hasn’t significantly updated child online safety rules in a quarter century. That has left state lawmakers and regulators to play a big role.
Paxton’s investigation centers on compliance with Texas’ Securing Children Online through Parental Empowerment Act, or SCOPE Act, which went into effect in September. It applies to any website or app that offers social media or chat functions and registers users under the age of 18, making it more expansive than the federal Children’s Online Privacy Protection Act, which covers only services catering to users under 13.
The SCOPE Act requires services to ask for users’ ages and to give parents or guardians power over kids’ account settings and user data. Companies are also barred from selling information gathered about minors without parental permission. In October, Paxton sued TikTok for allegedly violating the law by providing inadequate parental controls and disclosing data without consent. TikTok has denied the allegations.
The investigation announced last month also referenced the Texas Data Privacy and Security Act, or TDPSA, which became effective in July and requires parental consent before processing data about users younger than 13. Paxton’s office has asked the companies being investigated to detail their compliance with both the SCOPE Act and the TDPSA, according to legal demands obtained through the public records request.
In total, companies must answer eight questions by next week, including how many Texas minors they count as users and how many they have barred for registering an inaccurate birthdate. They must also turn over lists of the parties with whom minors’ data is sold or shared. Whether any of the companies have already responded to the demands couldn’t be learned.
Tech company lobbying groups are challenging the constitutionality of the SCOPE Act in court. In August, they secured an initial and partial victory when a federal judge in Austin, Texas, ruled that a provision requiring companies to take steps to prevent minors from seeing self-harm and abusive content was too vague.
But even a complete win might not be a salve for tech companies. States including Maryland and New York are expected to enforce similar laws starting later this year, says Ariel Fox Johnson, an attorney and principal of the consultancy Digital Smarts Law & Policy. And state attorneys general could resort to pursuing narrower cases under their tried-and-true laws barring deceptive business practices. “What we see is often information gets shared or sold or disclosed in ways families didn’t expect or understand,” Johnson says. “As more laws are enacted that create firm requirements, it seems to be becoming more clear that not everybody is in compliance.”