The bill itself is really short, about 500 words, a third of which is spent defining what a “chatbot” is[1]. I think the quality of argument would be better if people took two minutes to read the actual bill, then a few minutes to read through sections 6512 and 6513 and articles 131, 133, 135, 136, etc. of the NY education code, and then asked their favorite LLM which of those terms are terms of art.
That said, looking at the text of the bill and the referenced sections of the NY education code, I’m inclined to agree with Zvi, Eliezer, and Dean Ball. As far as I can tell, if a New Yorker describes their dog’s symptoms and asks what home remedies are available, the chatbot answers, and the dog dies, the New Yorker can sue for damages + legal expenses[2].
Disclaimer[3]: I am not a lawyer, and this is not legal advice.
Their definition of “chatbot” is interesting—as far as I can tell, an IVR phone tree which takes voice input (“To help us route your call, please say what you are calling about. For example, if you are calling about the status of your prescription, say ‘Prescription Status’.”) counts as a “chatbot”.
Maybe, depending on whether the violation is “willful”.
A disclaimer which the bill explicitly makes unavailable to the chatbot proprietor if I’m reading correctly: “(B) A PROPRIETOR MAY NOT WAIVE OR DISCLAIM THIS LIABILITY MERELY BY NOTIFYING CONSUMERS THAT THEY ARE INTERACTING WITH A NON-HUMAN CHATBOT SYSTEM.”