South Carolina lawmakers are weighing new regulations on AI chatbots used by children, aiming to prevent emotional dependence and protect privacy as the technology becomes more embedded in teenagers' lives, despite industry pushback.

South Carolina lawmakers are moving to draw firmer lines around artificial intelligence chatbots used by children, as concerns grow over emotional dependence, privacy and the widening role the technology is playing in teenagers’ lives. Speaking in Columbia, Sen. Matt Leber said the state should not leave young users without guardrails as AI becomes more common in everyday settings, from homework help to conversation and support. Pew Research Center survey data indicate that roughly two-thirds of U.S. teens (64%) report using AI chatbots, and that far fewer parents are comfortable with teens using them for casual conversation or emotional support.

The proposed legislation would bar chatbots from offering emotional advice to minors or engaging in extended conversations with them, and would tighten limits on the data AI platforms can collect and sell. Users would have to give explicit permission before a service could store their information or expand its access to it. Leber argued the measures are meant to keep companies from designing systems that are harmful or manipulative, while critics of unregulated chatbot use warn that robotic validation can distort a teenager’s understanding of empathy and real-world relationships.

The legislation, however, is already drawing pushback from business and banking groups that rely on customer-service bots. Kristina Hinson of the South Carolina Retail Association said exceptions are needed so companies can continue to offer fast, efficient service without forcing customers through extra account steps. Lawmakers have not yet voted on either bill, saying they want to refine the language to avoid unintended consequences before the measures return to committee for another hearing.

Source Reference Map

Inspired by headline at: [1]

Sources by paragraph:

  • Paragraph 1: [2], [3]
  • Paragraph 2: [2], [3]
  • Paragraph 3: [2]

Source: Noah Wire Services

Verification / Sources

  • https://www.wrdw.com/2026/04/15/sc-lawmakers-push-regulate-ai-chatbots-used-by-children/ - Please view link - unable to access data
  • https://www.wrdw.com/2026/04/15/sc-lawmakers-push-regulate-ai-chatbots-used-by-children/ - South Carolina lawmakers are proposing legislation to regulate AI chatbots used by children, expressing concerns over potential negative impacts such as emotional dependency and data privacy issues. The proposed bills aim to restrict chatbots from providing emotional advice or engaging in prolonged conversations with minors, and to limit the data AI platforms can collect and sell. Business groups have raised concerns about the impact on customer service chatbots, seeking exemptions from the proposed regulations.
  • https://www.pewresearch.org/internet/2026/02/24/what-parents-say-about-their-teens-ai-use/ - A Pew Research Center survey reveals that 51% of U.S. parents believe their teens use AI chatbots, with 64% of teens reporting usage. Parents are more comfortable with teens using chatbots for information searches (80%) and entertainment (66%), but less so for casual conversations (31%) or emotional support (18%). The study highlights a gap between parental perceptions and teens' actual use, indicating a need for better communication and understanding of AI chatbot usage among adolescents.

Noah Fact Check Pro

The draft above was created using the information available at the time the story first emerged. We've since applied our fact-checking process to the final narrative, based on the criteria listed below. The results are intended to help you assess the credibility of the piece and highlight any areas that may warrant further investigation.

Freshness check

Score: 8

Notes: The article was published on April 15, 2026, and reports on recent legislative actions in South Carolina. Similar legislative efforts in other states, such as Virginia and Florida, have been reported in February and January 2026, respectively. (axios.com) However, no identical narratives were found prior to this date, indicating the content is original and timely.

Quotes check

Score: 7

Notes: The article includes direct quotes from Senator Matt Leber and Kristina Hinson. Searches for these quotes did not yield earlier appearances, suggesting they are original. However, without direct access to the original sources, full verification is not possible.

Source reliability

Score: 6

Notes: The primary source is WRDW, a local news outlet. While it provides timely coverage, its reach and reputation are more limited compared to major national news organizations. The article references a Pew Research Center study from February 2026, which is a reputable source. (pewresearch.org)

Plausibility check

Score: 7

Notes: The article's claims align with known legislative trends and concerns about AI chatbots' impact on children. Similar legislative actions have been reported in other states, such as Virginia and Florida. (axios.com) However, the article lacks specific details about the proposed legislation, such as bill numbers or exact provisions, which makes full verification challenging.

Overall assessment

Verdict (FAIL, OPEN, PASS): PASS

Confidence (LOW, MEDIUM, HIGH): MEDIUM

Summary: The article provides timely and original coverage of South Carolina's legislative actions regarding AI chatbots used by children. While the content is plausible and references reputable sources, the limited reach of the primary source and the lack of specific legislative details reduce the overall confidence in the article's completeness and accuracy. (pewresearch.org)