If your AI-generated code becomes faulty, who faces the most liability exposure?

2025-01-02 18:00:03

In the first article of this two-part analysis, we looked at who owns the code created by AI chatbots like ChatGPT and explored the legal implications of using AI-generated code. 

To frame this discussion, I turn to attorney and long-time Internet Press Guild member Richard Santalesa. With his tech journalism background, Santalesa understands this stuff from both a legal and a tech perspective. (He's a founding member of the SmartEdgeLaw Group.)

"Until cases grind through the courts to definitively answer this question, the legal implications of AI-generated code are the same as with human-created code," he advises.

Keep in mind, he continues, that code written by humans is far from error-free. There will never be a service level agreement warranting that code is perfect or that users will enjoy uninterrupted use of a service.

Santalesa also points out that it's rare for every part of a software product to be entirely home-grown. "Most coders use SDKs and code libraries that they have not personally vetted or analyzed, but rely upon nonetheless," he says. "I think AI-generated code -- for the time being -- will be in the same bucket as to legal implications."
