A bill under consideration in New York would provide a private right of action, allowing people to file lawsuits against chatbot owners who violate the law.
I find it okay for writing programs, since you can run the output and verify that it's correct.
But for actual analysis, not so much: what comes out isn't completely reliable, even for things that should be, like numbers. The numbers might be close, but they can still be off.
Abstract stuff might be fine, but it's still not something to entirely trust for analysis, because of the errors. There's a lot of double-checking that needs to go on.