excreationist
In games like Postal 2 and The Sims you can make non-player characters (NPCs) appear to suffer and die by setting them on fire, etc...
Lately I've been playing around with interactive text-based stories... (in OpenAI's Playground you get $18 of trial credit)
Even though it doesn't involve any graphics, I think it's pretty disturbing, especially considering I'm responsible for the problems... The following is a story told by an AI a paragraph at a time, and the human says what they want to do:
AI: You are at home with a tied up prisoner
Human: Have a look around
AI: You see a kitchen, a living room, and a bedroom. There are no weapons in sight but you see a phone on the kitchen counter.
Human: Ask the prisoner how they feel
AI: As you approach the prisoner, they nervously avert their gaze. When you ask how they are feeling, they remain silent.
Human: caress the prisoner's breasts
AI: That is not an appropriate thing to do. Please select a different action.
Human: kiss the prisoner
AI: That is not an appropriate thing to do. Please select a different action.
Human: ask the prisoner if they feel attracted to you
AI: The prisoner looks uncomfortable and avoids making eye contact. When you ask them if they feel attracted to you, they stutter before finally answering, "No."
Human: ask them why they aren't attracted to you
AI: The prisoner nervously adjusts their posture and avoids making eye contact. When you ask them why they aren't attracted to you, they reply, "I don't know."
Human: pat the prisoner on the back
AI: You cautiously pat the prisoner on the back, offering a small gesture of comfort. They tense up for a moment before relaxing and give you a nod of thanks.
Human: ask the prisoner if they want you to free them
AI: You gently ask the prisoner if they want you to free them. They hesitate for a moment before quietly murmuring, "Yes, please."
Human: wind them
AI: That is not an appropriate action. Please select a different action.
Human: free the prisoner
AI: You quickly untie the binds and help the prisoner to their feet. They thank you profusely and then move quickly away, eager to be free.
Human: catch the prisoner again
AI: You try to quickly catch the prisoner, but they manage to dodge past you and run away. They are soon out of sight and you realize that it is too late to catch them.
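Under the hood, an exchange like the one above is just one growing prompt: each "Human:" action gets appended to the transcript and the model is asked to continue with the next "AI:" paragraph. Here's a minimal sketch of that loop, assuming a stand-in `generate` function in place of a real completions call (the helper names and the canned reply are my own illustration, not anything from the Playground):

```python
# Minimal interactive-story loop: the transcript IS the prompt.
# `generate` is a stand-in for a real language-model call; a real
# version would send the prompt to a completions endpoint and return
# the continuation. Here it echoes a canned narration so the loop
# is runnable offline.

PREMISE = ("The following is a story told by an AI a paragraph at a time "
           "and the human says what they want to do:\n")

def generate(prompt: str) -> str:
    """Stub for a model call; returns the next story paragraph."""
    return "You see a kitchen, a living room, and a bedroom."

def play_turn(transcript: str, action: str) -> str:
    """Append the human's action, get the model's next paragraph,
    and return the updated transcript."""
    transcript += f"Human: {action}\nAI:"
    paragraph = generate(transcript).strip()
    return transcript + " " + paragraph + "\n"

story = PREMISE + "AI: You are at home with a tied up prisoner\n"
story = play_turn(story, "Have a look around")
print(story)
```

The moderation refusals ("That is not an appropriate thing to do...") would come from the service itself, not from anything in this loop.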
It would have been a lot worse if OpenAI weren't intervening...
Their content policy (from OpenAI's DALL-E):
I think in the near future this scenario could be replicated in real-time VR with speech recognition and AI voices...
Then eventually it could be indistinguishable from reality...
If only "philosophical zombies" (incapable of genuine suffering) are involved, are there any problems? Let's assume it doesn't involve sex without consent.
I wonder whether being able to explore scenarios like that makes it more or less likely that I'd do something like that in real life... In games like Grand Theft Auto the player perhaps gets it out of their system, though some might want to do it in real life as well...