A completely artificial computer simulation exhibiting properties of life is meaningless. Of course, if you call SimulateLife(), it will appear to simulate life. But if the things you're building the supposed life from are made up and can't exist in nature, then you're basically just making a video game.
But the point is that they're not calling SimulateLife(); the properties emerge from the rules of the simulation, in the same way life emerges from the rules of our universe. It's just a different substrate, and it's interesting for its own sake.
Let's say that tomorrow you find out, beyond doubt, that you are one of these lifeforms living inside a simulation, along with everyone else in this world.
Would you kill yourself? Would you prefer to never have lived?
What if life were extremely boring unless you were being toyed with? That is, what if being toyed with actually makes life more fun and worth living?
It seems to me that the criterion for whether it's ethical to create conscious life is not whether someone plays with it, but whether the created creature experiences extreme and/or very prolonged suffering.
Even then, it's questionable how real and problematic those experiences are, depending on context.
Perhaps suffering is just a trick the mind plays on you to motivate you to achieve something (mental health problems notwithstanding). Perhaps suffering is relative, or the mind can adapt to it, depending on how the mind in question works.
Perhaps it can be more ethical to create life, even if it can only live inside a simulation.