Wednesday, April 4, 2018
Be Right Back: Where does Martha cross the line?
In Black Mirror's episode "Be Right Back," we are taken through Martha's grieving process after she loses her husband, Ash, to a car accident. At the funeral, one of Martha's friends suggests she sign up for a service that will let her communicate with an artificial intelligence resembling her deceased husband, to help her grieve. At first Martha is hesitant to give the service a shot, but once she tests the waters and finds comfort in the artificial intelligence, she goes all out: she pays for the highest tier the software can provide and buys a robot identical to Ash.
What I want to know is where people think they would draw the line as the software becomes weirder, creepier, and more unhealthy. The service's goal is to resemble your lost loved one as closely as possible by gathering all the information they left behind through technology and the internet.
In my opinion, even allowing the software access to the existing information about the person you've lost just to be able to communicate with them via message is absurd. Psychologists describe stages of grief that one must go through to make peace with the loss of a loved one. How could Martha possibly move on with her life if she keeps trying to feel as if Ash never died and is still around? For me, downloading the software and trying it out is where I would draw the line. I don't think it is healthy to settle for an AI that mimics someone you cared about so deeply. Don't get me wrong, I can see why anyone could easily begin to obsess over this type of technology; who wouldn't want the pain to simmer down for just a minute and be able to communicate with someone they never thought they would talk to, hear, or see again?
After this, Martha continues to feed more and more of Ash's life into the software, which progresses from speaking to "Ash" over the phone, to letting "Ash" see what she sees through her camera, to duplicating Ash's body and bringing his physical presence back to life. To me, everything about the software is unhealthy and creepy, because Ash is dead, and if God, the universe, or whatever you believe in wanted "Ash's" presence impacting the living world, he wouldn't have died to begin with.
I have provided a link for more information about the stages of loss and grief:
So where do you think you would draw the line? Or how many of you are already hiding your robots in the attic?
4 comments:
I think you make some very valid and intelligent points about how adopting an android replica of a loved one who has died would be detrimental to the grieving process. As someone who has lost someone very near to me, I can personally say I would never opt to "replace" them with an android copy. Something about it would feel like "playing God," or even being in flat-out denial over their death. I don't doubt that one day humans will have the option to choose an android replacement. I just hope that most people will realize how unhealthy it would be to engage in such activities.
I agree with this completely, especially when you mention that AI mimicry of someone who has passed is creepy and unhealthy. Grieving is a painful process, but it's simply how we humans cope with the loss of someone. That loss is permanent, and an android replica will only disturb this process. How are you supposed to accept the fact that someone is dead when you see their replica walking about in your house? It truly is an absurd idea, and a scary one as well.
I would draw the line once I started forgetting that Ash was a robot (if I were Martha in this situation), or lying to myself as though he were real.
I am not too sure when, or if, I would even forget at all; it would depend on how capable I am of distinguishing my reality from false beliefs.
(P.S. I am indeed hiding robots in my attic.)