pring99's Nier: Automata (PS4)
[September 28, 2018 12:26:09 AM]
Today I spent two hours playing this game. In this game, every being searches for the value and meaning of its existence. While collecting human data, the artificial people are influenced by that data and gradually discover meaning in their own existence. They were created to kill for their creators, but as these artificial humans evolved, they killed their masters. With their creators dead, their existence became meaningless, so individuals who separated from the machine lifeforms' network began to interpret their own existence in different ways.

As for the question of whether it is moral to keep using artificial people with emotions as tools, I think it can be analyzed through utilitarianism. First, there are several stakeholders: the builders, the merchants, and the humans who use artificial people to accomplish things they want. For a builder, if someone uses the artificial person he or she built, the outcome is positive. For a merchant, if people are willing to buy the artificial people being sold, the outcome is also positive. And for the humans who use artificial people as tools, the outcome is positive as well, because the artificial people help them do things they cannot do themselves; this group should also be the largest. When we add up all these outcomes, the total is positive. By this utilitarian reasoning, the practice I questioned should be moral.
[September 27, 2018 12:50:02 AM]
|
Today I spent one hour playing this game. I was shocked by one plot point: the main character's teammate, also an artificial person, was nearly obliterated by an enemy attack. The main character became very agitated on seeing this and desperately tried to give the teammate first aid, so I think the protagonist has feelings. In the end, however, the two of them self-destructed to complete the mission. Because they are artificial, all their memories can be uploaded to a server and re-implanted into an identical artificial body. This leads me to a question: is it ethical to use artificial people with emotions as tools? I don't think it is. It is like ordering a dog to die for you. Once a creature has feelings, then no matter what kind of existence it was before or what task or mission it was given, it should be a completely independent being, just like a human.
[September 26, 2018 12:33:38 AM]
|
Today I spent half an hour playing Nier: Automata. It is an action (ACT) game in which the player controls an artificial character to complete the main missions. What surprises me about this game is that all the characters are artificial people: both the commanding officer at the base and the protagonist's own teammates. Most of the game's scenes are set in ruins. I guess the artificial people were created many years ago, but in the end something happened that led to the end of humanity, leaving only ruins and artificial people behind. These artificial people also give me an authentic feeling; when some of them die, it feels as if real people have died. This makes me wonder: is it moral to create a nearly perfect, sentient artificial person? From my point of view, it seems likely that these artificial people led to the demise of humanity and the ruin of the world, so perhaps it is unethical to create artificial people.
Current Status: Playing
GameLog started on: Wednesday 26 September, 2018