So the long and short of it is that I was writing a kinda long essay about what I thought of the video Austin McConnell released about how he used AI to make a preview of his new book, and then I was promptly kidnapped by university. Since then, I think better wordings of the things I wanted to say have come out, and I don't really wanna retread that ground (watch the Philosophy Tube video, NOW). What I do want to do, though, is stress the importance of people's knowledge on the subject and the need for some coherent talking points around AI. Lots of the confusion around AI, and, as in McConnell's video, righteous indignation, comes from the fact that most people know there is something wrong with AI-generated content but can't quite put their finger on why. So instead of addressing this one video, I wanna lay out a piece of rhetoric that I think would be more effective.
One thing you might hear from supporters of AI art is that AI isn't plagiarism because it does just what humans do: take inspiration from many places. However, I think that inspiration, and more generally the idea that AI can "learn," is a bit of a misnomer that personifies the technology more than it deserves. When we as artists take inspiration, we filter it through our human experience of the world; we apply our own context and reinterpret the source material. When we learn about things, we apply our own intellect and reasoning to make general rules, and when given examples to the contrary, we change our model.
On the other hand, AI is only the data it's given; it makes these general rules about the world by brute force and adds nothing new to its dataset. As opposed to being an artist in its own right, AI is more akin to a complicated statistical model that reproduces whatever words or groups of pixels are most likely in a given context. We can see this in action in the way new AI is slowly devolving. Now that the internet is full of AI-generated prompts and images, the dataset that AI has to pull from is leading to worse and worse results. ChatGPT hasn't been retrained since September of 2021, and it likely won't ever be; instead it just googles the information, which is a thing you could do yourself. Newer chatbots like Twitter's *ahem* X's Grok (awful name) have allegedly been caught regurgitating ChatGPT's stock responses like some sort of slop ouroboros. AI is nothing without our content.
I think the best course of action in steering the development of neural nets in a better direction, and in changing the minds of laypeople on the matter, is to change the language that surrounds them. AI does not learn; to say that it does is like saying that a graph or a plot learns. Hell, I don't even think we should call them AIs. When we see a piece of media, we necessarily meta-analyze it and connect to the person who made it, and to accept that a statistical model could wholly replace that person is to miss the communication inherent to art, or to pretend that it's meaningless. It can only really lead to a shittier world IMO, and it's important to me, at least, that we don't let ourselves think that.
So today around noon I went to the museum close to me, and on display along with a lot of other modernist paintings was the aforementioned Convergence. I dunno why, but out of all the Pollocks this one really intrigued me. In general I have mixed feelings about Pollock; the persona he modeled for the painters of the era was one of a cigar-smoking, commie-hating rat bastard. Which I will admit is a mood, besides the commie-hating part. But the thing that irks me about his art is the philosophy of it. In an era dominated by the Greenberg strain of modernism, Pollock and his contemporaries didn't seek to use the elements unique to painting to create any sort of meaning, but instead pursued the art object, the painting for painting's sake. It's a philosophy that lends itself nicely to the image of a capitalist superstar artist, and to a worldview that values making money off art over everything else.
As I stood there, kind of pissed at myself for wanting to stare at this as opposed to anything else in the museum, I realized what it was that caught my eye. There's a small sheen in the black of Convergence (1952), and that's a detail that feels incredibly strange. The black, which appears to have been laid down first, stands out in that it's one of only two colors that stain the unprimed canvas; besides the black and blue, all the other colors are thick, opaque, and imposing in a way that lets you almost feel them with your eyes. The sheen, on the other hand, is more mysterious. I don't know what the sheen in the black of Convergence is, or how it feels, smells, or tastes, but I do know that it lends a ghostly quality to the bones of this otherwise violently material painting. It's almost wet, mimicking what it once was. Like a photograph, it carries the faded memory of the moment it was made.
I began to consider each color in Convergence its own character resting on the spectral scaffolding that Pollock apparently thought was quite shit. The blue occasionally dances with the white, yellow, and orange, connecting them with wiry tendons to the painting's blackened bones; the white stands as some pristine structure battered by the winds in thick, flat fields. It's ironic, because I don't think Pollock thought about any of this, or what it meant to him. But I think it's interesting how a painting whose intent is meaninglessness can educate us on how to make meaning with paint.