Connecticut lawmakers will be pushing to regulate the online activity of minors and better protect the data privacy of residents, Attorney General William Tong and state Sen. James Maroney, D-Milford, announced on Thursday.
During a morning press conference, the duo announced a number of measures that lawmakers will consider in the coming weeks. The discussion also included a review of the past year of enforcement actions taken under the Connecticut Data Privacy Act, a law passed in 2022 that took effect in 2023 and outlines consumer data protections and the responsibilities of certain groups that collect online data.
“If we don’t take action, and we just leave our information open to the dark web, there are very serious consequences for people here in Connecticut,” Tong said.
The press conference offered an early preview of legislation on online safety, data privacy, chatbot protections and more. Online privacy is a legislative priority this year, with several bills already introduced, including two bills put forward by Gov. Ned Lamont.
The latest effort comes as the applications of artificial intelligence technologies are rapidly expanding. State officials say it is necessary to ensure that the technologies and platforms will not cause harm to state residents, particularly children and teens.
“We have to make sure that children are protected under all of these laws, because they engage a great deal with not just companies, but also people online,” Tong said. “We want to make sure that they’re not exposed [to harm].”
It’s a daunting task, one that requires legislators to consider a number of issues, from chatbot use and generative AI prompts to biometric scans and data tracking, among others.
It’s also a highly charged one. Tong was visibly frustrated at times on Thursday as he described the need for better protections for children, criticizing companies that have been hesitant to provide safeguards.
Now, as the state tries to get its arms around the issue, lawmakers say that they must move just as rapidly as the developing technology that they are trying to regulate.
“We recognize that idea of, you know, ‘move fast and break things,’” Maroney said. “But the tech mantra can have real-life consequences.”
CT Data Privacy Act is new, and there’s a lot it doesn’t regulate yet
In the past year, the attorney general’s office received close to 70 complaints under the Connecticut Data Privacy Act, many of them citing challenges residents have faced when trying to get sensitive online data deleted.
But the state was unable to act on many of these complaints. According to an annual enforcement report released by the Office of the Attorney General on Thursday, roughly one-third of CTDPA complaints involved entities or data that are currently exempt from the Data Privacy Act.
In addition to the complaints, Tong’s office also received more than 1,830 data breach notifications from companies. The attorney general issued 63 warning letters to companies that had taken too long to provide notification of breaches to affected users.
The state also entered into a handful of Assurances of Voluntary Compliance, or AVC agreements, which require companies involved in CTDPA violations to provide a clear plan for resolving incidents and ensuring data security. Companies must also pay the state as part of the agreement. In 2025, the Office of the Attorney General entered into five such agreements, collecting more than $750,000 in payments.
The report also notes that the attorney general continues to monitor a number of specific data privacy topics, including consumer health privacy data, personal data like addresses and geolocation, and privacy data for minors.
A focus on AI chatbot regulation and data privacy
Protecting children and teens, Maroney said, will be a top focus for lawmakers this year.
“When the Data Privacy Act was passed in 2022, we hadn’t envisioned what chatbots would mean or how they would be used as companions and how many children would actually be using them,” he said. Maroney cited research that found “75% of teenagers have interacted with a companion chatbot, and we know that we need more protection for that.”
Several bills are already on the table. On the first day of the legislative session, lawmakers filed Senate Bill 4 and Senate Bill 5, which seek to regulate consumer privacy and online safety. The bills are before the General Law Committee, where Maroney serves as a co-chair.
“We’re going to look at making a few changes and updates to our data privacy law,” Maroney said of the legislation, noting that lawmakers will consider banning geolocation data and creating a narrower definition of “publicly available information” to limit what information companies can access.
He also noted that lawmakers will be asked to consider passing a Connecticut version of California’s Delete Act, which allows consumers to submit requests to delete their personal information from websites.
Lamont has also proposed legislation, calling for lawmakers to consider two different bills this year.
The first, Senate Bill 86, would require the Office of Policy and Management to designate a Chief Data Officer responsible for coordinating data policy and data collection between state agencies. The bill would also direct the Department of Economic and Community Development to establish a “regulatory sandbox” for AI-related regulations, and require developers of AI chatbots and other companion models to create protocols for dealing with users showing mental health problems.
The governor’s second bill, House Bill 5037, focuses on the online safety of minors, calling for companies to establish online safety protocols and default settings that would shield minors from harmful information and interactions. The proposal would also limit the accounts that minors can receive messages from, and block notifications on a minor’s account during certain times of day. The legislation would also require social media platforms to track and report to the state the number of minors using social media, the average amount of time these users are active, and the ages and times of day at which minors are using these sites the most.
Connecticut’s efforts to regulate the social media and chatbot use of young users come after research has shown that children and young adults are able to easily get around online safety measures, potentially exposing them to harmful interactions with chatbots that promote sexual and violent content. A CT Insider investigation from last year found that students in the state are entering into relationships with AI companions, and that many of the most popular companion apps do not have many age safeguards despite promoting content that would appeal to young children and teens.
Tong said that as the issue persists, focusing state legislation on minors is critical. “I think everybody just needs to recognize how really powerful these technologies are, and how dangerous they can be for kids,” he said.
Tong and Maroney also said protections need to extend beyond one group, noting that adults, particularly those with mental health needs, could be better protected under state law.
The forthcoming debate on these bills will likely raise new questions about exactly how far state lawmakers are willing to go in regulating AI. Efforts to pass broader AI regulations have failed in recent years, as lawmakers worried that too much regulation could hinder the state’s burgeoning innovation efforts and scare companies away from the state.
Maroney said state lawmakers cannot afford to wait this time around.
“We don’t want to harm innovation, but we’ve seen that in some ways, the entrepreneurs have run with this a little too far and caused some harm, and it’s time to rein them in,” Maroney said. “We want innovation, but don’t rush, because that’s when you make mistakes.”
