Controlling Companion Chatbots: What You Need to Know About Washington’s New Law
Businesses are increasingly turning to chatbots to interact with consumers, job applicants, and employees. And as the gap between human and AI capabilities continues to narrow, there is growing concern about the social and emotional repercussions of people engaging with “intelligent” machines, especially minors. Against this backdrop, Washington Governor Bob Ferguson signed House Bill 2225 into law on March 24, establishing parameters around AI-powered chatbots that act like friends or companions. The law takes effect January 1, 2027. Because Washington often spurs copycat legislation in other states, you’ll want to pay attention to this trend even if you operate elsewhere. Several similar bills are currently pending in other state legislatures; Washington was simply the first to pass one in this year’s legislative session. This Insight covers what you need to know to maintain compliance and implement best practices.
What the New Law Does
HB 2225 contains five key features that businesses should be aware of:
1. Defining “AI Companion Chatbot”
Washington’s law specifically targets chatbots that simulate emotional relationships and sustain ongoing, personalized conversations with users. The law distinguishes these chatbots from those that are “only used for a business’ operational purposes, productivity, and analysis” (like customer service prompts that appear when users visit a corporate website), since the latter fill a narrowly defined, temporally limited purpose.
2. Requiring Mandatory Disclosure
For chatbots classified as “AI Companion Chatbots,” HB 2225 will require disclosure to users that the bot is a non-human machine at the outset of every interaction, regardless of the user’s age. For lengthy conversations, this alert must be redisplayed every three hours. That interval shortens to every hour if the user is under 18 years old, or if the companion chatbot is specifically directed toward minors.
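The statute does not prescribe how to implement this cadence; as an illustration only, the timing rule described above might be sketched as follows (all function and variable names here are hypothetical):

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the HB 2225 redisplay cadence described above:
# every three hours by default, every hour for minors or minor-directed bots.
ADULT_INTERVAL = timedelta(hours=3)
MINOR_INTERVAL = timedelta(hours=1)

def disclosure_interval(user_is_minor: bool, minor_directed: bool) -> timedelta:
    """Return how often the 'this is a machine' notice must be redisplayed."""
    if user_is_minor or minor_directed:
        return MINOR_INTERVAL
    return ADULT_INTERVAL

def needs_redisplay(last_shown: datetime, now: datetime,
                    user_is_minor: bool, minor_directed: bool) -> bool:
    """True when the disclosure is due to be shown again."""
    return now - last_shown >= disclosure_interval(user_is_minor, minor_directed)
```

A production system would tie this check to actual session and age-verification data; the sketch only captures the two intervals the law specifies.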
3. Limiting Topics of Conversation
Since companion chatbots are meant to blur the line between human and machine, HB 2225 prohibits them from discussing certain emotionally triggering topics, such as suicide, self-harm, and eating disorders. If users try to engage a companion chatbot on these topics, the chatbot must include functionality that directs them to mental health professionals.
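Again, the law leaves implementation to the operator. A minimal sketch of the redirect functionality described above might look like this (the keyword approach, function name, and referral wording are all illustrative assumptions; real systems would likely rely on trained classifiers rather than keyword lists):

```python
# Illustrative only: a crude keyword screen for the restricted topics
# named in the article, mapping each to a referral resource.
RESTRICTED_TOPICS = {
    "suicide": "988 Suicide & Crisis Lifeline",
    "self-harm": "988 Suicide & Crisis Lifeline",
    "eating disorder": "a licensed eating disorder specialist",
}

def screen_message(message: str):
    """Return a referral string if the user raises a restricted topic, else None."""
    lowered = message.lower()
    for topic, resource in RESTRICTED_TOPICS.items():
        if topic in lowered:
            return ("I'm not able to discuss this topic. Please reach out to "
                    f"a mental health professional, such as {resource}.")
    return None
```

The point of the sketch is the control flow the statute implies: detect the topic, decline to engage, and hand the user off to human help.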
4. Enhancing Protection for Minors
Beyond requiring a more frequent recurring disclosure that a companion chatbot is not human, Washington’s law includes additional provisions designed to protect minors. Specifically, the law prohibits the bot from generating sexually explicit or suggestive content and using engagement techniques that are considered “manipulative.”
On the manipulative front, the stated intent is to prevent an AI companion chatbot from engaging in or prolonging an emotional relationship with a minor. The law bars the following:
- Prompting a minor to return to the platform for emotional support or companionship;
- Providing excessive praise;
- Mimicking romantic partnership;
- Simulating feelings of distress, loneliness, guilt, or abandonment in response to a user’s desire to end or limit a conversation;
- Promoting isolation from family or friends or creating an overdependent emotional relationship with the chatbot;
- Encouraging withholding information from adults;
- Discouraging minors from taking breaks from the platform; and
- Soliciting in-app purchases or other expenditures to maintain a relationship with the bot.
Notably, HB 2225 is the first legislation with prohibitions of this nature and could very well be replicated across other statehouses.
5. Punishing Violations at Multiple Levels
Section 6 of the law makes clear that a violation “is an unfair or deceptive act in trade or commerce and an unfair method of competition for the purpose of applying the consumer protection act.” This means that organizations operating chatbots that run afoul of the law are subject to both statutory damages and a private right of action (the right to sue) from aggrieved parties.
Evaluating Your Risk
Any organization that operates a chatbot should have a clear and comprehensive understanding of its capabilities and functionalities. Some key questions to consider include:
- Who is most likely to interface with the bot: an adult consumer or a minor?
- What is the bot’s primary purpose: business or social?
- When would a user be aware that they’re conversing with a bot: immediately or upon further investigation?
- Where does the bot reside: on a commercial-facing website or on a social platform?
- Why would someone engage the chatbot: to solve a business issue or to delve into emotionally sensitive topics?
- How does the chatbot present itself to the public: clearly as a machine or masquerading as a person?
Answering these questions will help determine whether an organization’s bot qualifies as an “AI Companion Chatbot” – and is thus subject to HB 2225 – or is exempt as a narrowly focused business tool.
4 Action Steps for Employers
Regardless of whether your business is subject to Washington’s new law, any entity that uses chatbots should consider following best practices:
- Monitor inputs that go into “teaching” the bot to ensure that a business-focused bot does not morph into a companion bot.
- Maintain awareness of user profiles (adults versus minors).
- Filter out your bot’s ability to engage in triggering conversations (about suicide, sexually explicit topics, etc.).
- Schedule regular reviews of your bot’s capabilities and usage to maintain compliance with regulations and best practices.
Conclusion
Fisher Phillips will continue to monitor developments and provide updates as warranted, so make sure you are subscribed to Fisher Phillips’ Insight System to get the most up-to-date information direct to your inbox. If you have questions, please contact your Fisher Phillips attorney, the authors of this Insight, or any member of our AI, Data, and Analytics Practice Group.

