> we're handing the power to replace human work over to those who can afford to pay
Consider that this power works by consuming copyright-protected work from unwitting contributors without any opt-in, creating derivative works from it, and charging users without ever acknowledging the authors.
In addition to being illegal, it plainly discourages open information sharing: anything you publish, regardless of license, is automatically consumed and monetized by OpenAI. If people have no reason to read what you write or buy your books when they can just ask an LLM for the same information (information the LLM obtained from your writing), you have no motivation to publish.
When do we start considering this illegal? Not LLMs as such, of course, but for-profit LLMs built by mass-scraping copyright-protected data.
> Google.com adding a single yellow box with an advertisement seemed reasonable, too.
Google acts fairly, though: it directs the searcher to you. Imagine if at some point Google stopped doing that and instead showed you regurgitated, machine-generated content in response to your search, without ever telling you who authored the information. Everyone would be up in arms within a day; so why do we forgive OpenAI and Microsoft when they do essentially that?