It has been said that truth is singular while lies are plural, giving disinformation an unfair numerical advantage. But is truth really singular?
Take the stories of our own lives. Which is the most truthful version? The official account contained in our CV or LinkedIn profile? Or the one we tell ourselves? Or the ones our friends and family tell about us behind our backs? All of them can be simultaneously true, or misleading.
The idea that multiple truths can be drawn from the same material is radiantly explored in the film Eno, which I saw last week. Based on the life of the multitalented music producer Brian Eno, the documentary is auto-generated by a machine and varies every time it is shown.
According to the film's makers, there are 52 quintillion possible versions of it, which would make "a really big box set". This artistic experiment tells us much about the nature of creativity and the plurality of truth in the age of generative media.
To make the film, the producer Gary Hustwit and the creative technologist Brendan Dawes digitised more than 500 hours of Eno's video footage, interviews and recordings. From this archive, spanning 50 years of Eno's creative output working with artists including Talking Heads, David Bowie and U2, two editors created 100 scenes. The filmmakers wrote software that produces introductory and concluding scenes with Eno and outlines a rough three-act structure. They then let the software loose on this digital archive, splicing together different scenes and recordings to generate a 90-minute film.
Critics have generally found the film, or films, to be quirky and compelling, much like Eno himself. It can seem a bit random, Dawes tells me, but audiences have still been able to absorb the parts and assemble a narrative in their heads. In effect, "the audience does the cooking," he adds.
The version I saw was a mesmerising mix of interviews and recordings with some jagged juxtapositions but a clear narrative arc. I was particularly intrigued by one section in which Eno talked about the concept of "scenius". Eno has long resisted the idea that creativity is the output of one lone genius; rather, it is the product of collective societal intelligence, or scenius. "The film is the embodiment of this idea of scenius," says Dawes.
Hustwit and Dawes have now launched a company called Anamorph to apply their generative software to other types of content. Hollywood studios, advertising agencies and sporting franchises are target customers. But Dawes stresses that they are using their own proprietary software to reimagine existing human-generated content in unique ways; they are not using generative AI models, such as OpenAI's GPT-4, to create alternative content.
The increasingly widespread use of generative AI models, however, raises other questions about truthfulness. Much has been written about how buggy models can generate untruths and "hallucinate" facts. That is a significant problem if a user wants to generate a legal brief. But it can be an advantageous feature when creating fictional content.
To investigate how good they are at doing so, Nina Beguš, a researcher at the University of California, Berkeley, commissioned 250 human writers in 2019, and 80 generative AI models last year, to write short stories based on identical prompts. The challenge was to reimagine the Pygmalion myth, in which a human creates an artificial human and falls in love with it.
Beguš tells me that she was surprised that the machine-generated content was more formulaic and less imaginative in structure, but also more "woke". Rather than reinforcing some societal stereotypes, the models appeared to challenge them. In their stories, more of the creators were women and more of the relationships were same-sex, for example.
She suspects that these outputs reflect the way in which the models have been fine-tuned by human programmers, although it is hard to know for sure because the models are opaque. But she suggests we have now reached a "new frontier" in writing, in which human- and non-human-generated content have in effect merged.
That raises concerns about how dominant US AI companies are encoding the values of this new frontier, which may jar with other societies. "Will these hegemonic models and cultures take over everywhere?" she asks.
Whether machines enhance or degrade truthfulness, and what human values they reflect, depends critically on how they are designed, trained and used. We had better pay close attention.