
It would be quite hard to make any AI tool that preemptively avoids the wide range of potential issues you've mentioned. If tool makers are forced to always err on the side of caution, the resulting tool is likely to end up disappointing.

Only when published, and when put into the context of the entire work, could a creation be deemed harmful. A tool should not, for example, prevent you from making a bunch of images with ominous poses, from which you select one to use with an article that discusses the history of ominous poses.



Just because it's hard to make a tool that can't be used in negative ways doesn't mean that it's a good idea to make a tool that (charitably) makes specifically negative uses easy and (uncharitably) is deliberately designed for them.

Tool makers do err on the side of caution all the time: we **** out passwords so users don't share them as easily, and we put safety catches on secateurs. "Build in safeguards against the obvious issues" is a basic design step.


- your critique is vague, yet at the same time it touches a sensitive area, implying wrongdoing by the tool authors that can't be easily refuted or fixed. What specifically bothers you? Consider that active Twitter discussions uncover and point out troublesome issues almost faster than the general public can understand and digest them.

- assuming you found an egregious issue, do you also double down on maintaining that the tool is 'deliberately designed to make negative uses easy'? How so?

- I disagree with the 'safety catches' metaphor and would offer the 'hammer' metaphor instead.

- Actually, given the rapid development in this field, I expect that soon anyone will be able to locally prompt for any content, even movies, limited only by their taste and imagination; with that in mind I don't think I will follow up on this discussion, which will surely be outdated in a minute.


Peritract already called out a specific issue. The male and female options come with different sets of selectable poses, and some of the female poses are pornographic in nature. This promotes the objectification of women.


> If tool makers are forced to always err on the side of caution, it's likely that the resulting tool ends up disappointing

I don't disagree with you entirely, but I still have a feeling that this will make a pretty good epitaph for humanity some day.


In that case enjoy our proof of concept:

https://app.engageusers.ai

Everything from realistic faces to realistic posts. We tried to make it as ethical as possible in multiple ways. But ultimately it is designed to spur conversation on topics that need a kickstart in engagement…



