Or maybe artists shouldn't have to justify their existence monetarily, and also shouldn't have their art fucking stolen and mangled to generate terrible pseudo-art lmao.
Terrible pseudo-art is what you get from Hollywood and the big music studios right now, for the most part.
Nice of you to think AI won’t just lead to even more formulaic stuff, but maybe from different megacorporations.
This. Art is expression. Wtf is AI art expressing?
Whatever the artist using the AI tool is trying to express?
If I ask a painter to paint a landscape, who’s making art, me or the painter?
Is the painter just a tool?
You can’t really have it both ways.
Is the thing just a machine that follows instructions and synthesizes its training data into different outputs? Then it's a tool.
Is the thing making choices and interpreting your inputs to produce a result? Then it's an artist.
The painter I buy a commission from is an artist. The AI I use to generate a scene is a tool.
Was the “training data” produced by artists or tools?
I mean, yes?
That’s very pithy, but the material used as training data was probably produced by artists attempting to create art using tools (AI and otherwise), as well as more mundane data designed and produced by humans with no AI tools, and some produced by humans with almost exclusively AI tools.
You probably live in a different world than I do.
Don’t chicken/egg this. All of the training data was man-made at some point, until the first LLMs started outputting based on it.
Secondly, the amounts of human-produced content and LLM-produced content in the training data are incomparable, and will continue to be so. Otherwise the models break.
An AI doesn’t understand what human emotion is.
That’s unprovable without some very strict definitions, but if we take it as a given (and for the record I don’t disagree, so we should) then that’s why the AI isn’t the artist. It’s just a tool an artist could use. MS Paint isn’t an artist either, and like AI, neither are many of the people using it, but it can still be used to create art.
Meh, the better approach is to assume it doesn’t understand emotion unless proven otherwise. Does a fork understand what human emotion is? A pillow? You wouldn’t assume that either, I guess.
What about a cat, or a person who’s different from you? It’s just as impossible to prove, and yet…
Welcome to radical constructivism :) The question of whether other people or cats can experience emotions is in fact a problem people have been thinking about quite a lot. The answers are not very satisfactory, but one way to think about it (e.g., some constructivists would do this) is that assuming they do have consciousness simplifies your world model. In the case of “AI”, though, we have good alternative explanations for its behavior and don’t need to assume it can experience anything.
The other important bit is that not assuming some phenomenon exists (e.g., “AI” can experience emotions) unless proven otherwise is the basis of modern (positivistic) science.
assuming they do have consciousness simplifies your world model.
Does it? Feels more like it merely excludes them from your model, since your model cannot explain their consciousness. If that simplifies your model, then you can apply the same thinking to anything you don’t understand by simply saying it is similar to something else you also can’t explain.
not assuming some phenomenon exists (e.g., “AI” can experience emotions) unless proven otherwise
The problem with this isn’t that it’s literally unprovable; it’s that proving it requires defining “can experience emotions” in a way everyone can agree on. Most trivial definitions that include everything we think obviously ought to be included also bring in many things we think ought to be excluded, and many complicated definitions that prune out the things we think ought to be excluded also cut out things we think should be included.
So which of us are p-zombies? We run into the same problem when suggesting that human beings have consciousness or self-awareness, or grasp what qualia are, except we can’t prove that anyone has any of these things. The debate over AI consciousness within its development community is a sorites paradox: large AI models like GPT-4 have more awareness than previous versions, but not as much awareness as humans. Yet GPT-4 does exceed human control subjects in the Turing test.
Mind you, the Turing test is only one of several tests we use to rate how advanced an AI is, but even when an AI can make coffee given a machine and supplies, and assemble flat-packed furniture given the IKEA visual instructions, we can’t be sure that this counts as AGI, or that it is sentient.
Right now, there are artists who use generative AI to create art, and it is as much really art as photography was really art back when illustrators complained that photographers were just using a machine to replicate a real scene. As much as music production and music synthesis are art.
Now yes, I get that AI presents risks of workers losing income and their capacity to survive, but every time we toss our sabots into the gearworks to break the machines, we’re kicking the overthrow of the system down the line, until we’re where we are today: not only looking at the dissolution of our democracy so that industrialists may continue to exist, but also the destruction of our habitat, because we can’t address what makes them money.
So capitalism is going to end you either way, unless you end it first. And I expect that if you actually tried to make a fortune on your art, you would eventually find yourself selling all your rights to one of the big corporate controllers, and they would own everything you did and pay you a pittance for it… unless you’re James Hetfield levels of skilled and lucky. Somehow I doubt you are.
Does a paintbrush?
This is a false equivalency. Unless the paintbrush is stolen I guess. 🙄
Art is aesthetic. It can express something but it doesn’t have to.
Not necessarily. Not all art is visual, and not all art is made to evoke pleasing emotion. Not all of it is made to be beautiful.
It doesn’t have to be a pleasant aesthetic or visual. It could be anything from a full image, to a font used on a resume, to the choice of words in general, to the way an email address sounds if you pronounce it out loud. It can be the sequence of smells, sounds, sights, tastes, and textures of a single dish or a five-course meal.
It can be puppets designed to last generations or an explosion that exists for a brief moment.
It can even be the cleverness of how a message is woven into an otherwise meaningless looking scene.
What is it that humans are expressing when they are arting?
All those poor OCs, dead forever! No human can ever draw them again now that AI ate them!