(Roughly) Daily

“If it looks like a duck, walks like a duck, and quacks like a duck, everyone will need to consider that it may not have actually hatched from an egg”*…

Boris Eldagsen won the creative open category at this year’s Sony World Photography Awards with his entry “Pseudomnesia: The Electrician.” He rejected the award after revealing that his submission was generated by AI. (source, and more background)

Emerging technology is being used (as ever it has been) to exploit our reflexive assumptions. Victor R. Lee suggests that it’s time to recalibrate how authenticity is judged…

It turns out that pop stars Drake and The Weeknd didn’t suddenly drop a new track that went viral on TikTok and YouTube in April 2023. The photograph that won an international photography competition that same month wasn’t a real photograph. And the image of Pope Francis sporting a Balenciaga jacket that appeared in March 2023? That was also a fake.

All were made with the help of generative artificial intelligence, the new technology that can generate humanlike text, audio, and images on demand through programs such as ChatGPT, Midjourney, and Bard, among others.

There’s certainly something unsettling about the ease with which people can be duped by these fakes, and I see it as a harbinger of an authenticity crisis that raises some difficult questions.

How will voters know whether a video of a political candidate saying something offensive was real or generated by AI? Will people be willing to pay artists for their work when AI can create something visually stunning? Why follow certain authors when stories in their writing style will be freely circulating on the internet?

I’ve been seeing the anxiety play out all around me at Stanford University, where I’m a professor and also lead a large generative AI and education initiative.

With text, image, audio, and video all becoming easier for anyone to produce through new generative AI tools, I believe people are going to need to reexamine and recalibrate how authenticity is judged in the first place.

Fortunately, social science offers some guidance.

Long before generative AI and ChatGPT rose to the fore, people had been probing what makes something feel authentic…

“Rethinking Authenticity in the Era of Generative AI,” from @VicariousLee in @undarkmag. Eminently worth reading in full.

And to put these issues into a socio-economic context, see Ted Chiang’s “Will A.I. Become the New McKinsey?” (and closer to the theme of the piece above, his earlier “ChatGPT Is a Blurry JPEG of the Web“).

* Victor R. Lee (in the article linked above)

###

As we ruminate on the real, we might send sentient birthday greetings to Oliver Selfridge; he was born on this date in 1926. A mathematician, he became an early– and seminal– computer scientist: a pioneer in artificial intelligence, and “the father of machine perception.”

Marvin Minsky considered Selfridge to be one of his mentors, and with Selfridge organized the 1956 Dartmouth workshop that is considered the founding event of artificial intelligence as a field. Selfridge wrote important early papers on neural networks, pattern recognition, and machine learning; and his “Pandemonium” paper (1959) is generally recognized as a classic in artificial intelligence. In it, Selfridge introduced the notion of “demons” that record events as they occur, recognize patterns in those events, and may trigger subsequent events according to patterns they recognize– which, over time, gave rise to aspect-oriented programming.

source
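Selfridge’s paper was a conceptual description rather than software, but the layered “demons” idea translates naturally into code. Below is a minimal, hypothetical sketch in Python: feature demons that “shout” when they spot something in the input, cognitive demons that weigh those shouts against a pattern they care about, and a decision demon that picks the loudest voice. The names and toy features are illustrative only, not Selfridge’s original formulation.

```python
# A loose, illustrative sketch of Selfridge's Pandemonium idea.
# All names and the toy "vowel vs. consonant" task are hypothetical.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class FeatureDemon:
    """Watches the raw input and shouts when its feature is present."""
    name: str
    detect: Callable[[str], float]  # returns a strength between 0 and 1

    def shout(self, datum: str) -> float:
        return self.detect(datum)


@dataclass
class CognitiveDemon:
    """Listens to the feature demons and shouts in proportion to how well
    their combined evidence matches the pattern it cares about."""
    name: str
    weights: Dict[str, float]  # weight per feature-demon name

    def shout(self, feature_shouts: Dict[str, float]) -> float:
        return sum(w * feature_shouts.get(f, 0.0) for f, w in self.weights.items())


def decision_demon(cognitive_shouts: Dict[str, float]) -> str:
    """Picks whichever cognitive demon is shouting loudest."""
    return max(cognitive_shouts, key=cognitive_shouts.get)


# Toy example: decide whether a single character looks like a vowel.
features = [
    FeatureDemon("is_vowel_letter", lambda c: 1.0 if c.lower() in "aeiou" else 0.0),
    FeatureDemon("is_alphabetic", lambda c: 1.0 if c.isalpha() else 0.0),
]

cognitives = [
    CognitiveDemon("vowel", {"is_vowel_letter": 1.0, "is_alphabetic": 0.5}),
    CognitiveDemon("consonant", {"is_vowel_letter": -1.0, "is_alphabetic": 1.0}),
]

datum = "e"
feature_shouts = {d.name: d.shout(datum) for d in features}
cognitive_shouts = {d.name: d.shout(feature_shouts) for d in cognitives}
print(decision_demon(cognitive_shouts))  # -> "vowel"
```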
