Ch8: Computer Reliability


In the movie Passengers, we see many examples of computer reliability issues: errors leading to system malfunctions, errors leading to system failures, embedded system failures, bias in training data, and other malfunctions.

We see how bias in training data affected the Avalon, the ship in Passengers, when it was struck by an asteroid. The ship's artificial intelligence system allowed the asteroid to hit it. The movie never explains why, but our group theorized that the ship believed its shield could take the asteroid head-on. Real-life AI systems show similar biases when they cannot interpret vague or ambiguous data: AI algorithms sometimes struggle to recognize non-Caucasian faces, fail to detect something that is there, or detect something that is not there [2]. This is similar to the Avalon (which we know is controlled by an AI system, since the ship autonomously slingshots past a star for fuel) allowing an asteroid to hit it.

The asteroid hit the ship's fusion reactor controller, the computer in charge of keeping the reactor cool. This controller was an embedded system, telling the reactor how to handle cooling. Once it was hit, it could no longer perform this duty, and the ship would have exploded if not for Jim, Aurora, and Gus. We see similarities between this and the real-life Boeing 737 MAX crashes. The 737 MAX was a modification of the Boeing 737 with bigger engines; the rest of the aircraft stayed the same, which shifted the plane's balance and made it tend to pitch up. To counter this, Boeing installed an embedded system called MCAS that automatically pushed the nose down. The problem was that MCAS relied on a single angle-of-attack sensor: when that sensor was damaged (in one crash, possibly by a bird strike) or fed MCAS faulty data, the system repeatedly forced the nose down, ultimately crashing the plane [1]. This is similar to what happened on the Avalon, since both are examples of an external threat striking a fragile embedded system and creating fatal or near-fatal danger.
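The single-sensor weakness described above can be sketched in code. This is a hypothetical illustration (the function names, numbers, and voting scheme are my own, not from the essay or from Boeing documentation): a system that trusts one sensor inherits that sensor's faults, while triple redundancy with median voting masks a single bad reading.

```python
# Hypothetical sketch of sensor redundancy. One damaged sensor
# dominates a single-sensor design, but is outvoted when three
# independent readings are combined with a median.

from statistics import median

def single_sensor_pitch(reading):
    """Trusts one sensor: a damaged sensor drives the whole response."""
    return reading

def voted_pitch(readings):
    """Median of three readings masks one faulty sensor."""
    return median(readings)

# One sensor is damaged and reports a wildly wrong angle of attack.
good_a, good_b, faulty = 2.0, 2.1, 40.0

print(single_sensor_pitch(faulty))            # 40.0 -> system overreacts
print(voted_pitch([good_a, faulty, good_b]))  # 2.1  -> fault is outvoted
```

The point of the sketch is only the design principle: redundancy plus voting turns one sensor failure into a masked fault instead of a catastrophe.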

We also see the system trying to compensate by distributing the fusion reactor controller's load across all the other computers on the ship. What the software did not account for, however, was that this would cause those computers to overheat. This leads to system errors such as the ship's cereal machine dispensing far too much cereal and the artificial gravity failing, and these errors nearly become a full system failure when the ship's reactor almost blows up.
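The failover behavior described above can be sketched as code. Everything here is hypothetical (the node names, load numbers, and capacity limit are mine, invented to illustrate the essay's point): a naive failover that splits a dead controller's load evenly, without checking each machine's safe capacity, can push every remaining machine past its thermal limit.

```python
# Hypothetical sketch of naive failover: the failed controller's
# load is split evenly across surviving computers, ignoring each
# machine's safe capacity -- the flaw the essay attributes to the Avalon.

def redistribute(failed_load, nodes):
    """Split the failed load evenly; no capacity check (the bug)."""
    share = failed_load / len(nodes)
    return {name: load + share for name, load in nodes.items()}

nodes = {"nav": 0.6, "gravity": 0.7, "galley": 0.8}
capacity = 1.0  # assume each computer safely handles a load of 1.0

after = redistribute(1.5, nodes)
overheated = [name for name, load in after.items() if load > capacity]
print(overheated)  # ['nav', 'gravity', 'galley'] -> every node overloaded
```

A safer scheme would refuse to assign more than each node's spare capacity and shed non-critical work instead, which is the kind of redundancy planning the essay argues the ship lacked.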

All in all, there were many computer reliability issues on the Avalon. The autopilot software and the ship's lack of redundancy resemble the MCAS system in real life, and those two issues led to much bigger problems, such as the gravity failing and the cereal machine over-dispensing. I think we should take the computer reliability and redundancy issues the director of Passengers portrayed into account as we work on autonomous driving and embedded systems in vehicles.


Sources:
[1] David Normansell, "Lessons from the Boeing 737 MAX 8" (University of Virginia, Spring 2020), libraetd.lib.virginia.edu/downloads/707958249?filename=Normansell_David_STS_research_paper.pdf (accessed 4/25/24)
[2] Ayanna Howard and Jason Borenstein, "Trust and Bias in Robots" (American Scientist, 2019), www.americanscientist.org/article/trust-and-bias-in-robots (accessed 4/26/24)
[3] Columbia Pictures, Passengers, 12/21/16 (accessed 4/25/24)



          Written by Vaansh Mansharamani