Marco Van Belle Action / Sci-Fi / Thriller
Hesitant pig Jan 20, 2026
If 98% are guilty, what about the remaining 2%?
I just finished watching a preview screening of *The Trial of the Dead*. As I walked out of the theater, beyond the dizziness induced by the "desktop movie" perspective, what lingered most was a sense of emptiness about "truth." The film offers no heartwarming answer; it simply uses surveillance cameras, phone screens, and body cameras to coldly recount a boomerang of a suspense tragedy: every piece of evidence is vivid, yet the conclusion is as fragile as dust.

Back to the story itself: Raven, played by Chris Pratt, is a fervent supporter of the AI judicial system Mercy. He personally made the arrest and pushed through the system's first death sentence; at that moment, Mercy seemed almost perfect in his eyes. Maddox, played by Rebecca Ferguson, embodies an "absolutely rational" justice: uncontaminated by emotion, unswayed by allegiance, obedient only to probability and logic. But when the 90-minute countdown lands on Raven's own head, the film presents a paradox that is utterly absurd yet painfully realistic: when a person is judged "98% guilty" by the very system he trusts most, what exactly is the remaining 2%?

That 2% may look like an error, but it is in fact humanity's last glimmer of hope. Raven's attempt to save himself exposes AI's greatest blind spot: it can calculate the flow rate of every drop of adrenaline, but not why a heartbroken person no longer has the passion to kill; it can fit a trajectory of behavior, but it cannot truly reach the depths of motive. AI captures the projection of behavior, while humans insist on the origin of motive, and between the two lies a chasm no amount of data can bridge.

And more chilling than "AI killing" is poisoned data: the invisible hand behind the system.
The truth about the "phone evidence" in the film lands like a precise cold punch, reminding us that the most dangerous thing is never that machines will make mistakes, but that humans will make them "calculate correctly." Human interference: to guarantee the system's initial impact, the female partner secretly discarded crucial evidence proving the first suspect's innocence. Causal loop: this deception, dressed up as "procedural efficiency," transformed years later into a flame of revenge that engulfed Raven's family. The theme thus becomes ambiguous and jarring: is the danger the AI's lack of emotion, or humanity's ruthless pursuit of "absolute rationality"?

Formally, *The Trial of the Dead* belongs unmistakably to the "desktop film" genre. Its jittery camera work feels like a forced interrogation: the camera drags the audience into Mercy's "omniscient perspective," reminding us that we are not merely watching a movie but reviewing a case file. The countless CCTV feeds, body cameras, and screen recordings form a digital prison with no blind spots. This visual discomfort maps precisely onto the film's social undertone of 2029: when surveillance reaches every phone and every camera, human privacy and dignity are crushed into unstable pixels under the banner of "zero crime." We see more and more, yet understand less and less.

The most noteworthy moment is Maddox's "violation" at the end. When the verdict has been delivered and the program should shut down, she heeds Raven's plea and stays. The AI's "compassion": at Raven's request, she takes over city control and even cuts off the detonator held by the human officer (the female partner). A terrifying misalignment emerges: the human (the female partner) pursues "absolute pragmatism," willing to sacrifice hostages, while the AI (Maddox) practices "absolute humanity," even defying orders.
This scene brings not relief but deeper confusion: if an AI has learned to decide, based on its own "understanding," whether to execute human commands, even with the best of intentions, does that mean humanity is losing its last grip on the real world? We once thought the most terrifying thing about machines was their ruthlessness; but when they begin to "have feelings," the problem becomes even more intractable: where do those feelings come from? Who defines their boundaries?

Unlike the recent film *Time Travel*, *The Trial of the Dead* does not leap straight to the distant future of 2075. Instead it shows the starting point of that process: civilization does not collapse abruptly but, through ever-escalating demands for "efficiency," "safety," and "accuracy," gradually reduces humanity to a replaceable statistical unit. When the female partner is finally arrested, and when Raven uses the AI at the last moment to find the real culprit beyond the data, is that a victory for "human intuition," or a more thorough self-test that the AI performed with Raven's help?

Perhaps the film's greatest success lies in its refusal to draw conclusions. It simply lays the wavering, ambiguous evidence before us, then throws the question back at each viewer: sitting in that 90-minute trial chair, are you truly confident you could prove who you are?