Recap
In the Black Mirror episode "Be Right Back," we see a woman, Martha, who has lost her boyfriend, Ash. Soon after his death, Martha finds out that she's pregnant and will be raising the child alone. At the funeral, her friend Sarah recommends she try a software service that mimics her boyfriend. At first Martha refuses, but she gives in once the software is supplied with files that let it mimic Ash's voice. Martha soon falls in love with the software, using it daily and feeding it more files so it can mimic Ash more closely. She eventually reaches the point where she wants to go deeper with the software, and follows its instructions to buy an android body for it to use. As soon as the software is in the android's body, Martha becomes freaked out by its human-like features that mimic Ash. As my philosophy professor explained, people get weirded out by software mimicking detailed human features because of the "uncanny valley": we are used to artificial intelligence performing human tasks, right up until the point where the software looks like a human performing its daily tasks. Knowing that it's software rather than an actual human makes it even more unsettling. Martha is satisfied with the software until she realizes it cannot mimic Ash 100% perfectly. She becomes upset, starts comparing the software Ash to the real Ash, and then tries to get rid of the android. In one scene, the android is willing to jump off a cliff for Martha, which raises questions about the Three Laws of Robotics created by Isaac Asimov.
At first glance the android seems to be breaking the Third Law, which says a robot must protect its own existence. However, the Third Law yields to the Second Law (a robot must obey human orders), so an android willing to destroy itself at Martha's request is arguably still following Asimov's rules rather than breaking them. After Martha has this conflict with the software, she puts it in the attic with the other leftover memories of the real Ash.
Thoughts
In my opinion, Martha should have moved on and not gotten the software. It's better to move on than to hold on to something: the more you hold on, the more you grieve and stress yourself out. Martha held onto Ash's memories so tightly that she wanted to make a clone of him. Don't get me wrong, death is a challenging thing to cope with, and it's hard to accept that the person is gone. But that doesn't mean you should go out and buy software to recreate the person; you'll never move on, and you'll keep doubting yourself and grieving. Eventually your demands on the software will grow, which will only stress you out more. In the end, Martha began stressing over how the robot wasn't the real Ash and became more overwhelmed the longer she stared at the software in the android body.

It's nice to keep memories of a loved one, but we shouldn't go overboard and buy an android body to revive a dead person's memory. Some people view that as disrespectful, because we want the person to rest in peace, and at that point you are playing with people's religious beliefs. If we lived in a society where we could take people's memories and transfer them into android bodies, the world would become overpopulated and could turn chaotic. Crime rates might even increase, since you could just bring a person back in an android body and restore their memories.
In my opinion, I wouldn't get an android to mimic my family or friends, but if it became a big trend I might reconsider. Having an android mimic a family member would be beyond crazy and awkward. Who would put their mom in the attic when they are done using her? It wouldn't feel right at all. And what happens when you pass away? Would the androids buy another android of my body, or would they just let me rest in peace? At that point, androids would be the new species while humans became extinct.
With androids as the new species, would the laws for robots be changed by then? There are a lot of questions to ask when you imagine androids, or "post-humans," becoming the new dominant race. I also wonder whether robots would be advanced enough to clone other robots, and if so, when they would acknowledge that it's becoming a problem.
5 comments:
I definitely agree with you. I feel like it's okay to want to hear a deceased person's voice again, or to recreate memories with them, but it's not healthy to fully bring a person back to life. I also found it disturbing that she tried so hard to make a robot exactly like Ash, and I saw that it was not doing her any good. To me it seemed like she was stuck in a trance with the robot, and her reality check came when her sister thought she had moved on.
This is a great recap of the episode. I also agree with your opinion. I think that the AI software that Martha tried to use as a coping mechanism ultimately did her more harm than good.
I completely agree with your thoughts; you should be able to let go of a person once they're gone, and you shouldn't try to relive memories with someone who is no longer here. Also, your point about becoming more stressed from holding onto something is very accurate, because something within you tells you it won't work, or that it isn't working. In this case, Martha realized that the robot wasn't really Ash, and she became stressed. In the end, I agree that the program did more harm than good for Martha.
You did a really good summary of the episode, but I do think that at the end she moved on. That is not to say this is something that would work for everyone.