A technology-agnostic approach I'd favor would be some regulation roughly along the lines of:
"Every business decision on behalf of a business needs to be signed off by a responsible individual"
If you want automated software doing things on your behalf, sure, but every action it takes needs to be attributable to an accountable individual. Be it an engineer, an executive, or an officer.
Doesn't matter if the "other entity" is an LLM, a smart contract, or an outsourced worker in a sweatshop through 5 levels of subcontracting.
If you make a machine and it causes a mess, that's on you. If others start using your machine unsupervised and it makes their lives a mess, that's on them (and potentially you).
That doesn't work. For that to work, the person needs to understand how that algorithm creates its output, and understand its flaws and vulnerabilities, AND be diligent about interrogating the results with those things in mind. Nobody technically sophisticated enough to do that will also have the domain knowledge to evaluate the most consequential decisions.
For example, sentencing "recommendations" are supposed to be exactly that: recommendations for judges. But judges seem to rubber-stamp the recommendations. I'm sure for some it's a scapegoat in case someone accuses them of not really thinking about it; the more credulous probably assume the algorithm saw something they didn't; and for others, the influence might be more subtle. This is something we should have studied before we started letting these algorithms put people in jail. These are judges. Their most important function is impartiality.
Do you have specifics? A lot of people say things like that when they're toe-to-toe with human psychology, but humanity still has a whole lot of problems that a whole lot of people are pretty heavily incentivized to avoid. I don't see how this would be any different.
The key is that it has to be resistant to subversion by hostile terms of use, similar to how certain provisions in employment contracts aren't enforceable under employment law. Otherwise, literally anything you try to establish here will instantly end up as a waiver in the terms of use for services that regular people can't possibly understand or reasonably opt out of.
(As an example, see the recent lawsuit that Tesla won because the driver used Autopilot on "city streets", where it was advised against somewhere deep in the terms of use.)
"Every business decision on behalf of a business needs to be signed off by a responsible individual"
If you want automated software doing things on your behalf, sure, but every action it takes needs to be attributable to an accountable individual. Be it an engineer, an executive, or an officer.
Doesn't matter if the "other entity" is an LLM, a smart contract, or an outsourced worker in a sweatshop through 5 levels of subcontracting.
If you make a machine and it causes a mess, that's on you. If others start using your machine unsupervised and it makes their lives a mess, that's on them (and potentially you).