Philosophy and theology professors around the world are surely drafting syllabi based on Westworld, the HBO sci-fi epic about robots in a Wild West theme park. Now in its second season, the series brims with suspense and lurid thrills: saloon scenes, gunfights, the occasional scalping, tense evocations of futuristic corporate skullduggery. But the show stands out particularly for the metaphysical questions it raises: What makes us human? What does it mean to choose wrong over right? Is there such a thing as free will? In a world of artificial intelligence and computer-generated illusions, what divides the real from the unreal?
While echoing our ongoing angst about technology, the show often invites us to sympathize with the uncannily realistic androids who people Westworld, the eponymous frontier-style theme park. Human visitors (a.k.a. “guests”) can explore the park, which abounds in breathtaking canyons, picturesque homesteads, and bustling O.K. Corral–style towns. Guests can also abuse, kill, or otherwise interact with the resident androids (“hosts”), who include the beautiful rancher’s daughter Dolores (Evan Rachel Wood) and the savvy brothel madam Maeve (Thandie Newton). The park’s supervisors and staff—including the egoistic narrative-division bigwig Lee Sizemore (Simon Quarterman) and, initially, the programming specialist Bernard (the marvelous Jeffrey Wright)—maintain tight control of park activity. But a devious coder may have written secret behavioral rules for the hosts, who appear to be gaining consciousness and freedom.