Wading through misinformation while navigating current affairs has become a quotidian reality, one fraught with questions of truth and falsity that often straddle a fine line. False news travels faster, and falsehood spreads «deeper and more broadly» on Twitter, as a study by three MIT scholars has shown, casting light on the precariousness of our online communication ecosystem.
While there is little contention that some digital technologies pose assorted political and social perils and ethical quandaries, what opportunities do they bring to artists and filmmakers operating in the field? How can rapidly developing technologies and techniques spur more impact and engagement through storytelling? Are there new ways technology can aid filmmakers in recovering erased histories after systemic power imbalances have virtually wiped out images? And what about the unintended harm these technologies may provoke? During an extensive industry talk, intriguingly dubbed New Tech – Manufacturing the Truth, which took place during IDFA at the end of November, artists and filmmakers including Amer Shomali, Ali Eslami and Francesca Panetta addressed some of these questions and discussed the various technologies they use in their media practice. The talk was moderated by William Uricchio, the founder and principal investigator of the MIT Open Documentary Lab.
While researching his animated documentary The Wanted 18 (2014) alongside his co-director Paul Cowan, Palestinian visual artist Amer Shomali (dir. Theft of Fire, IDFA Forum 2022) faced a striking dearth of available images. The Wanted 18 follows a curious story revolving around a herd of cows, which unfolded in his hometown of Beit Sahour, east of Bethlehem, during the first Intifada in the 1980s. Amid calls to boycott Israeli products and commodities, enterprising Palestinians purchased 18 cows from an Israeli kibbutz and learned the tricks of the trade to achieve dairy self-reliance and produce milk for the local residents. The cows were soon ludicrously declared a threat to Israel’s national security, and a hot pursuit of the 18 «fugitives» ensued, spanning almost four years, with locals working together to move the animals covertly from barn to barn.
Upon kicking off the project, Shomali found only a single photo of the events – a photo of a cow. But how do you make a documentary film when there are no documents at all, when many of the national archives were looted and remain mostly inaccessible? In the absence of images at hand, animation and reenactments became a tool for the director to collect «oral history» from local residents and retroactively recreate the missing personal archives. As the visual artist described, there was a «fundamental problem» with Palestinian films that spoke about that period. The few archival materials gathered and salvaged from that era tended to accentuate the violence that unfurled during the Intifada, such as hurling stones and setting car tyres on fire. Yet what largely constituted the resistance movement at the time were various non-violent acts of civil disobedience – and these were not part of the accessible archives. «So basically, whenever you make another film about the Intifada, you are stuck with these limited archives», the artist detailed. Whatever you produce, you are creating a faulty image of the Intifada, which is then passed on to new generations. «Everything you have in your film is true, but it is not the truth», he added.
During the talk, Shomali also recalled passersby approaching the crew and sharing their recollections of the events as they were filming on the streets of Beit Sahour. Shomali decided to incorporate one passerby’s story and cast the man’s son to play his younger self. The camera then became a kind of portal connecting the two generations, Shomali explained: one that had never seen any footage from the Intifada, and one that never had the chance to be recorded. «Eventually, it is called a reenactment, but that is as close as we can get to the truth and the archival material», he said. In Shomali’s upcoming hybrid film Theft of Fire (dubbed «a documentary about a story that did not happen», presented during this year’s IDFA Forum), archival footage, real-life interviews and scenes, as well as seamlessly built-in deep fakes, all coalesce to conjure up a story in which a Palestinian museum curator attempts a daring plot to break into an Israeli prison in the Negev desert and take back «a trove of looted antiquities» from Israeli military leader Moshe Dayan.
While there are more established protocols around the use of animation/CGI and, more recently, VR in documentary filmmaking, Artificial Intelligence remains a more challenging area in terms of labelling and the ethical frameworks around it. Incoming Director of UAL’s AKO Storytelling Institute and former executive editor of virtual reality at The Guardian, Panetta looked back at her startlingly convincing «deep fake» project In Event of Moon Disaster (2019), which she co-directed with Halsey Burgund. The immersive project (produced by the MIT Center for Advanced Virtuality) invites the audience into an alternative history of the 1969 Apollo 11 Moon mission, reimagining the seminal event using deep fake techniques. The project, which went on to win an Emmy award for interactive documentary, features former US President Richard Nixon delivering a contingency speech on TV that was prepared for him (but never delivered) in the eventuality of the Apollo 11 mission’s failure, with astronauts Neil Armstrong and Buzz Aldrin unable to return to Earth.
The ethical dilemmas that Panetta and Burgund deliberated while making this piece included: «Why are we making the deep fake if we are saying deep fakes are a problem? How are we going to label this, so that moon conspiracy theorists do not take this and use it to their own games? How do we label this so that people are really clear what we are doing?» As Panetta noted, the piece starring Nixon delivering his unheard moon-disaster speech was created expressly to call attention to the (mis)use of deep fakes and their capabilities, as well as to encourage audiences to question some of the information that comes through social media streams and to be aware of just how far synthetic media technologies can bend the truth. Apart from the ethical conundrums such projects involve, there is also the question of the author’s responsibility. While there is an implied contract of truth in documentary filmmaking (as a filmmaker in the audience pointed out during the Q&A), cutting-edge synthetic media tools are not yet publicly well understood, nor do they enjoy well-established protocols. Thus, the responsibility falls on the shoulders of the makers to pull back the curtain: to signal the use of AI to audiences, if that is what is being deployed, and to explain the process (whether after the film or in concert with it), even if doing so may feel somewhat didactic or superfluous.
Ali Eslami, an Iranian-born, Amsterdam-based artist who in 2016 won the IDFA DocLab Award for Best Immersive Non-Fiction for his DeathTolls Experience (VR), spoke about A Stretch of Time, a new chapter of his ongoing project False Mirror, which delves into the depths of post-human life in digital spaces. A Stretch of Time takes the viewer underground, where «a disembodied figure» encounters a vast living archive storing capsules filled with fragments of memories and thoughts. The hero, we are told, must perform a Sisyphean task imposed on him – the only way out. If future humans (or post-humans) were ever to inhabit vast and vibrant virtual spaces completely, what would their lives look like? Eslami grapples with this question in False Mirror by probing the bounds of speculative futurism through active worldbuilding, which is «continually growing and reshaping itself.»
The notion of the body and identity in the virtual context, as discussed by Eslami, seemed to pique Shomali’s interest during the talk, as he flirted with the idea that technology might protect the identity of activists at risk in autocratic regimes. Elaborating on the subject, Eslami reckoned that a virtual body could anonymously channel the voice of the person carrying it, and that this holds «an amazing potential for technology to be able to protect us and amplify our voices.» However, the artist also cautioned about a «tricky» aspect that may cut the other way. «It can sort of fetishise or dilute the voice, in a sense that it becomes a distraction of what the actual voice is», Eslami observed. «Sometimes we get ahead of ourselves. Because it is a very seductive medium, and we can get lost in forms that carry no function or context. So I think being able to keep that balance is a very tricky thing, but it is important.»