As schools grapple with the rising issues of cyberbullying, sexual abuse imagery, and online exploitation among students, Texas lawmakers are expected to deliberate on significant measures, including potentially banning social media for minors, in the forthcoming legislative session.
In the past ten years, Texas legislators have sought to curtail the harmful effects of social media by implementing laws against cyberbullying and restricting online platforms from gathering data on young individuals. However, these efforts have often encountered legal obstacles from social media companies opposing such regulations.
Although law enforcement and prosecutors have typically been the ones to pursue these digital threats, a lack of resources has shifted much of the responsibility onto educators, who must keep up with how kids use technology on top of their existing teaching demands.
“Almost every child arrives at school today, regardless of their background or economic status, with a smartphone. They frequently have unrestricted access to online content, irrespective of the measures we put in place,” stated Zeph Capo, president of the Texas American Federation of Teachers.
Lawmakers are proposing various strategies to confront the online dangers facing Texas schoolchildren. These include a bill filed by Rep. Jared Patterson that would bar minors from registering for social media accounts and mandate age verification for new users. Additional measures could involve funding for internet crime units within law enforcement agencies, prohibiting the use of individuals’ likenesses to create sexual abuse images, and raising awareness about online risks.
“Social media is currently the most perilous avenue accessible to our children in Texas,” Patterson said in an announcement.
Although there is support for any initiative aiming to lessen harm to children, school officials and cybercrime investigators stress that more substantial steps are necessary to hold social media platforms accountable for ensuring safety.
“We need these companies to act responsibly and limit the tremendously harmful content, especially concerning children,” Capo lamented. “However, they seem reluctant to take such actions.”
Schools have increasingly become environments where these dangers thrive. During an October Senate Committee on State Affairs meeting, witnesses described numerous incidents of social media harming Texas youth: a junior high girl who developed an eating disorder after watching content on TikTok, a boy who became addicted to inappropriate content through YouTube, and a woman who recounted being groomed for sexual exploitation in high school while her images circulated on social media platforms.
Most of these distressing incidents began in schools, where children have regular access to technology and where teachers often lack the ability to oversee their activities due to overwhelming responsibilities. Additionally, students often find ways to evade school internet restrictions, leading to grooming by predators on school property, as noted by Jacquelyn Alutto, president of No Trafficking Zone in Houston, during the hearing.
“Currently, schools serve as a hunting ground,” she asserted.
Efforts to communicate with several school districts, including those in Austin, Round Rock, Katy, and Eanes, regarding online threats went unanswered, and the Plano district declined to comment.
Last year, national organizations including the American Federation of Teachers and the American Psychological Association criticized social media platforms for compromising classroom education, raising costs for school systems, and contributing to the nationwide youth mental health crisis. The criticism followed a report outlining how schools are shouldering the burdens of technology’s predatory reach into their environments.
In response to these challenges, Texas Governor Greg Abbott signed House Bill 18, known as the Securing Children Online through Parental Empowerment Act, aiming to hold social media companies accountable. This legislation requires digital service providers to offer certain protections for minors, prevent access to harmful content, and furnish parents with tools to monitor their child’s online activities. It also mandates that school districts obtain parental consent for most digital applications used for educational purposes and encourages them to seek alternatives to internet-based instruction.
However, many websites and games that are perceived as child-friendly still present risks, as they can attract sexual predators masquerading as children.
“A boy enjoying a game like Roblox during lunch could be targeted by a predator, leading to sexual grooming or exploitation in a matter of weeks or months,” Alutto explained.
Compounding these challenges is the issue of minors sharing explicit images online, prompting some child welfare organizations to advocate for restricting or outright banning social media for young users.
“This environment has also enabled human traffickers to groom and recruit children,” Alutto added.
Recent studies indicate that 95% of individuals aged 13 to 17 use social media, with over 30% admitting to using it “almost constantly.” Furthermore, approximately 40% of children aged 8 to 12 access social media platforms, even though most platforms require users to be at least 13 to create an account.
This trend has resulted in a generation of youth who are almost perpetually online, leaving medical professionals uncertain about the long-term ramifications of such exposure.
While the SCOPE Act aimed to limit children’s exposure to harmful online materials and increase parental oversight, social media companies have made efforts to undermine its provisions.
A federal judge recently temporarily blocked the part of the law related to filtering harmful content, citing First Amendment free speech concerns. Meanwhile, Texas Attorney General Ken Paxton announced a lawsuit against TikTok over claims that its algorithm endangers minors; TikTok has disputed the claims, noting that parents in some states can request the deletion of their teens’ accounts.
This lawsuit is among many similar cases across the nation, leaving Texas lawmakers to evaluate further actions in the upcoming session to enforce accountability among social media companies.
Australia recently banned minors under the age of 16 from social media platforms, raising questions about whether Texas might pursue a similar strategy.
“The state must ensure that technology providers take responsibility for the protection of our children and take decisive actions to halt the distribution of child sexual abuse materials,” remarked Brent Dupre, director of law enforcement for the Texas Attorney General’s office.
Dupre’s department operates alongside two other Internet Crimes Against Children Task Forces in Texas, together covering 134 counties. He reported that his office receives around 2,500 cyber tips monthly from the National Center for Missing and Exploited Children, a volume that far exceeds the capacity of its 11 officers.
The situation is so dire, Dupre said, that during one live training workshop on posing as minors in chat rooms, the trainer noticed an actual adult attempting to solicit the simulated minor.
“We are unable to conduct these proactive investigations as frequently as we would like due to our heavy caseload,” Dupre commented, emphasizing collaboration with other law enforcement agencies facing similar staffing shortages.
Christina Green, chief advancement and external relations officer at Children’s Advocacy Centers of Texas, said her agency assists over 60,000 child victims annually, most in cases linked to social media use in school settings. She emphasized that both law enforcement and her agency need additional resources to safeguard children effectively.
“Given the swift evolution of this field, we need to ensure our tools and resources keep pace,” she advocated.
Echoing sentiments from school officials, Dupre emphasized the necessity for social media firms to impose stricter regulations on minor users. He argued that companies should monitor attempts to upload materials related to sexual exploitation and be held accountable for allowing such materials to remain accessible on their platforms.
He also proposed that legislators mandate companies use advanced technology to detect and block child sexual abuse content proactively.
“In my view, any child attempting to upload explicit content should have their account shut down immediately,” Dupre stated. “Though certain companies successfully scan for such materials, not all of them yet enforce adequate safeguards or restrict offending users’ access.”
Nonetheless, Green concluded that the most effective prevention strategies must begin at home. She advocated for education about online dangers starting as early as the third grade and recommended annual refreshers on the subject.
She also emphasized the importance of extending this educational effort to parents, suggesting they need to ask about the digital environment when their child visits the home of a friend. “Just like you’d inquire about the presence of a pool, such inquiries about technology use should be standard practice,” Green suggested.