By Robert Farago

Data Dignity AI Movement DOA?

New Yorker Magazine Faces Annihilation


The New Yorker is the cushiest gig in journalism. The mag pays writers handsomely to write exhaustively. Deadlines? Flexible. To be fair, the quality of writing coming out of the New Yorker is stellar. (Dan Baum R.I.P.) The quality of thought – tilting assiduously left – varies. Case in point: Jaron Lanier’s AI screed.


Mr. Lanier comes to us with an impressive resume. According to goodreads.com, “He either coined or popularized the term 'Virtual Reality' and in the early 1980s founded VPL Research, the first company to sell VR products.” Computer scientist, composer, musician, photographer and author – Mr. Lanier is a renaissance man in a world of specialists.


Now that AI has replaced VR as the topic du jour, Mr. Lanier has turned his attention to AI’s threat to democratize and devalue everything he does. His essay, There Is No A.I., argues that “We can work better under the assumption that there is no such thing as A.I… In my view, the most accurate way to understand what we are building today is as an innovative form of social collaboration.”


In short, Mr. Lanier reckons AI isn’t intelligent. It’s a “mashup” of human intelligence. And that means the technology isn’t an existential threat to artists. Well, it wouldn’t be if society recognized his latest rhetorical coinage/popularization: data dignity.

In a world with data dignity, digital stuff would typically be connected with the humans who want to be known for having made it. In some versions of the idea, people could get paid for what they create, even when it is filtered and recombined through big models, and tech hubs would earn fees for facilitating things that people want to do. Some people are horrified by the idea of capitalism online, but this would be a more honest capitalism. The familiar “free” arrangement has been a disaster.

The missing word: copyright. The right to own what you create, and charge for its use. Mr. Lanier would have done well to check the U.S. Copyright Office’s definition of the term.

Copyright is a type of intellectual property that protects original works of authorship as soon as an author fixes the work in a tangible form of expression…
Works are original when they are independently created by a human author and have a minimal degree of creativity. Independent creation simply means that you create it yourself, without copying. The Supreme Court has said that, to be creative, a work must have a “spark” and “modicum” of creativity.

The Copyright Office recently clarified its AI policy: AI output is not copyrightable, because it’s not human-derived. What’s more, “copyright protects expression, and never ideas, procedures, methods, systems, processes, concepts, principles, or discoveries.”


The key question for “data dignity”: whether the unfathomable amount of original human work vacuumed up by ChatGPT and other large language models is protected by copyright even though it’s put through a non-human, make that an inhuman, process.



Mr. Lanier isn’t waiting for the Supreme Court’s ruling on “transformative use” of source material (Andy Warhol Foundation for the Visual Arts, Inc. v. Goldsmith), or arguing for strict legal protection for all AI “contributors.” He has a solution!

A data-dignity approach would trace the most unique and influential contributors when a big model provides a valuable output. For instance, if you ask a model for “an animated movie of my kids in an oil-painting world of talking cats on an adventure,” then certain key oil painters, cat portraitists, voice actors, and writers—or their estates—might be calculated to have been uniquely essential to the creation of the new masterpiece. They would be acknowledged and motivated. They might even get paid.
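Mr. Lanier doesn’t say how any of this would be calculated. The accounting he imagines – if you grant the heroic assumption that influence scores for each contributor could ever be computed – amounts to a pro-rata split of a usage fee among the “key” contributors. A minimal sketch, with invented names, numbers, and threshold:

```python
# Toy sketch of the "data dignity" payout Lanier describes.
# The influence scores are assumed to exist already -- that is the
# hard, unsolved part. All names and figures here are hypothetical.

def data_dignity_payout(fee, influence, threshold=0.05):
    """Split a usage fee pro rata among contributors whose
    influence on an AI output meets a minimum threshold."""
    key = {name: w for name, w in influence.items() if w >= threshold}
    total = sum(key.values())
    return {name: round(fee * w / total, 2) for name, w in key.items()}

fee = 10.00  # fee collected for one generated "masterpiece"
influence = {
    "oil painter's estate": 0.40,
    "cat portraitist": 0.30,
    "voice actor": 0.20,
    "background blogger": 0.01,  # below the cutoff: gets nothing
}
print(data_dignity_payout(fee, influence))
# {"oil painter's estate": 4.44, 'cat portraitist': 3.33, 'voice actor': 2.22}
```

Note who falls out of the dictionary: everyone below the arbitrary cutoff. The arithmetic is trivial; deciding the weights, and who sets the threshold, is the whole fight.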

I’m not surprised that Mr. Lanier – blessed by the not-ghetto-fabulous New Yorker – is coming at AI’s existential threat from an elitist point of view. Mr. Lanier suggests that only “key” artists should enjoy attribution and… wait for it… money. Who decides who’s in and who’s out? Mr. Lanier reckons it’s the artistic Powers That Be.

At first, data dignity might attend only to the small number of special contributors who emerge in a given situation. Over time, though, more people might be included, as intermediate rights organizations—unions, guilds, professional groups, and so on—start to play a role… Whenever possible, the goal should be to at least establish a new creative class instead of a new dependent class.

Of course, any such data dignity system requires proper attribution. Is that even possible? Google’s Bard sucked up over 200 million works branded as copyrighted – and the rest. Current AI large language models (LLMs) generate output from a statistical average of all the data they ingest.


Who gets the hat tip and, more importantly, the financial tip? Currently, no one. In fact, AI wouldn’t exist if it didn’t thumb its collective nose at the very concept of copyright. Don’t get me wrong. I hope the Supremes destroy this model. But I’m not optimistic. The genie is out of the bottle – a trillion-dollar genie that’s already racked up hundreds of millions of users.


There are only two ways to “solve” this problem: AI companies pay everyone whose work feeds their LLMs, or a market emerges for identifiably non-AI output (what Union seeks to create). Again, good luck with the former. And, come to think of it, good luck with the latter.


A sentiment that Mr. Lanier should embrace, given his final thought: “People are the answer to the problems of bits.” 
