It seems to me that the ethics of using learning software are nothing more than an intellectual exercise. It is already cheaper to use software to generate reports, presentations, and artwork than to do it "manually". Most of our planet runs on greed capitalism, which means that the solution that makes the most money will eventually be accepted. John Henry does not win -- the steam drill does. (And we're seeing very primitive examples of "AI" so far.)

The introduction of the power loom in the early industrial revolution turned weaving from a cottage industry that employed most of the rural communities in England into a sweatshop industry employing orders of magnitude fewer people. It ruined quite a few lives and arguably made many more worse. People hated it even at the time, but their anger made little difference in the end.

My advice to artists is to roll with the punch and start using the new software. If you don't, your competitor will.

And, in the end, learning software is just a force multiplier. It does nothing on its own. A human still has to decide if the result is useful or not. (This is why making decisions based on AI is so dangerous -- by the time you know if the result was useful, it's probably too late.) So, there will still be jobs for artists, just not as many.

Now, on a side note. This is the outcome programmers have been working toward for most of a century. This is our nirvana -- the machine does the work, with minimal intervention. Programmers are lazy by nature; that's why we're willing to spend days coding a utility that will save a few hours of effort. Be careful what you wish for. 🙂

    The topic is quite provocative…

    My question is: have the descendants of the Arabs who invented our numerals been paid royalties yet? All human culture is profoundly secondary. All the plots in literature were formulated back in Ancient Greece, and ever since we have only repeated them in slightly different interpretations.

    duane Programmers are lazy by nature; that's why we're willing to spend days coding a utility that will save a few hours of effort.

    Right, but the spent days will go to one person, and the hours saved will go to many, resulting in a total gain of years.

      Related to this, and something that many of us may run into without realising: people are using ChatGPT to farm points on Stack Overflow and similar sites by answering as many questions as possible. So there's a growing number of answers that no human has verified -- fake answers that may or may not be correct.

        Kojack I'm not shocked by that at all. I feel like something similar is happening on Steam, where weirdos are trying to farm award points too -- lots of stupid meme reposting in the reviews, among other things. That's more bot behaviour than ChatGPT, though, but it will probably infest that store as well.

          Erich_L I guess what I'm getting at is: what if these models were not built using images scraped from the web? What if instead they were built by robots walking around taking pictures with a camera? What if those robots could also sit down and look at pictures online?

          Well these robots of yours, whatever they are 🙂, can take as many pictures as they like, on and off line, but that's not the real source of their "smarts", is it? The pixel data needs to be tagged in all sorts of ways. That's where the juice is. This is very hard to automate; you'll always need humans to do it. Now you'll say: but ai can teach itself. Sure, but it can only operate inside the constraints of what it has already been "taught" via human-tagged data. Every time a new concept needs to be introduced into an ai system, you'll need humans to format the data. Otherwise the ai system will just grind itself into derivative stagnation. Which is good if you look at it as an automation tool, but bad if you want to hype it up as a milestone towards agi.

          To put it poetically: generative ai is vampiric in nature. It needs a constant influx of fresh human creativity to appear to be "getting smarter", and consequently to stay relevant as a cultural phenomenon. That's why it will never be able to beat humans at the game of concept invention. The real strength of ai is no different from the strength of your regular digital automation: it can munch and burp out a lot of derivative, boring stuff in a small amount of time. This, as always with tech automation, means that people who do a lot of derivative, boring stuff will get a strong incentive to stop doing it. Be it "art" or number crunching.

            duane Programmers are lazy by nature; that's why we're willing to spend days coding a utility that will save a few hours of effort.

            Megalomaniak Who knows, we might be chatting with a GPT in here too.

            We should have a bot member. phpBB had (and maybe still has) that as an add-on. It's open source, so it shouldn't be that hard to port it.

            Personally, I have to admit I am all for it. Though when I had Wombo.art generate my sewage pipe piece, I started with my own crude drawing done in Paint. A few iterations later I had a shaded image I was happy with. I think the trick to using such tools more efficiently is to not rely on the text prompt alone. In game dev, we often know what general shape or angle we want for an image, so you can save a lot of time by using the option to generate an image based on a pre-existing one. Previously I thought (as I assume many do) that it's all about the text prompt. I beg to differ. I say take a page out of the average artist's playbook and start with an image reference.
            @duane I like your take, I assume that no, you don't/wouldn't feel any guilt at all even if starting with a "protected" image.
            @xyz As far as I know, humans labeling data is largely already a thing of the past. "Otherwise the ai system will just grind itself into derivative stagnation." I have not yet experienced this even in the slightest. I think that most of us -- maybe besides our esteemed futurist cyber -- are looking at these models in their capacity to replace human thought rather than to aid it.

              xyz The pixel data needs to be tagged in all sorts of ways.

              Fortunately for the language models, humans are quite willing to do that for free, in our spare time -- as the existence of social media shows. Most of us don't even care how someone might be using our text.

              Tomcat Right, but the spent days will go to one person, and the hours saved will go to many, resulting in a total gain of years.

              That's the best case, but I never thought altruistically when I started banging together a utility, and most of them were forgotten before they ever got used again. If you've got a geeky manager, they'll excuse any delay when you show them a neat piece of code. 🙂

              Erich_L I assume that no, you don't/wouldn't feel any guilt at all even if starting with a "protected" image.

              I probably would feel guilty if I wasn't buffered from the original by the software, but I'd get over it. Anyone who is really worried about it could check this out.

              https://arstechnica.com/information-technology/2023/03/ethical-ai-art-generation-adobe-firefly-may-be-the-answer/

              Copyright doesn't work in the digital age. We've been trying to shoehorn it in for decades, but it's what programmers refer to as a kludge. It might be somewhat functional now, but you know it will break as soon as anything changes. The only reason the concept ever existed is the pre-industrial idea of patronage -- in this case, society would go out of its way to encourage people to pay the author.

              It still exists because wealthy people bought lots of authors' work and want to continue to charge money for them. (*cough* mickey mouse) Make no mistake, the people with the power to make decisions don't care about struggling artists, except in the sense that a farmer cares about a potato plant. However, they will probably attempt to quash learning systems to protect their own portfolio.

              I'd normally say that such an effort is impossible -- the genie's out of the bottle, and it's not going back in -- but what if some hack puts together a legal "AI spotter". The ultimate copyright troll enforcer that tags millions of "offenders" every day. Then someone creates a framework of law that allows "AI judges" to rule on very limited cases of copyright infringement and redirect any attempts at payment to the copyright owner. (Sound familiar?) If you've read Melancholy Elephants, this is worse.

              I can think of much, much nastier things that this software could be (and probably is being) used for, but I don't want to give anyone nightmares tonight. 🙂 Anyway, it's more likely that the results of AI art will be declared original, since the money makers would be happy to dispense with the bothersome human artists and save a few bucks.

              @Erich_L I don't see how a language model can ever "replace" human thought. In its essence it's just a database lookup. GPT, for example, can "say" or "code" only that which humans have already said or coded. It can't, and never will be able to, implement a new algorithm from a textual description that it sees for the first time. It doesn't do programming... or thinking. There's nothing even resembling conceptual thinking there. It's just an intricate statistical analysis of an enormous body of existing text, aggregated thanks to the internet and hordes of humans willingly typing all kinds of stuff into it. And as @duane pointed out, all of that text is implicitly tagged. That's the essence of what we currently call ai.

              Fueled by all the recent hype from companies who want to sell magical ai products to mass consumers, people have shown a strong tendency to project "ghost in the machine" entities into what these software products output. It's a peculiar form of modern superstition. I call it "digital pareidolia". It's OK for common folk to think we're in some kind of B sci-fi movie, but software engineers should know better.

              Erich_L As far as I know humans labeling data is largely already a thing of the past

              I'd disagree here, at least in the case of pixel data for image generators. If you want ai to emulate humans, your best bet is to train it with human tagged data. Ai trained on ai tagged data would only mimic itself. This is suboptimal if your goal is to incrementally improve the emulation.

              @duane I don't think we'll even need an AI spotter. By mere power of hyperproduction, synthesized images will become boring and easily recognizable by anyone, losing all of the allure they currently appear to have. Just like the lens flare effect did in the 90s 🙂

              I tried to add a ChatGPT bot to the forum, but it wasn't working. The developer of the extension said there was a bug he was looking into. But I like the idea, and it should be working in a little bit.
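              For anyone curious what wiring up such a bot might look like, here's a minimal sketch in Python against OpenAI's chat completions API. The helper function, the bot name, and the system prompt are my own invention for illustration -- the actual forum extension almost certainly works differently:

```python
# Hypothetical sketch of a forum reply bot. The build_bot_messages() helper
# is invented for this example; only the commented-out API call at the bottom
# reflects the real OpenAI client interface.

def build_bot_messages(thread_posts, bot_name="forum-bot"):
    """Turn recent thread posts into the message list a chat model expects."""
    messages = [{
        "role": "system",
        "content": f"You are {bot_name}, a friendly member of a game-dev "
                   "forum. Reply briefly and conversationally.",
    }]
    # Feed each post as a user turn so the model sees the discussion context.
    for author, text in thread_posts:
        messages.append({"role": "user", "content": f"{author}: {text}"})
    return messages

# The actual call would then be (requires the `openai` package and an API key):
#
#   from openai import OpenAI
#   client = OpenAI()
#   reply = client.chat.completions.create(
#       model="gpt-3.5-turbo",
#       messages=build_bot_messages(posts),
#   ).choices[0].message.content

if __name__ == "__main__":
    posts = [
        ("duane", "Should we have a bot member?"),
        ("Megalomaniak", "We might be chatting with a GPT in here too."),
    ]
    msgs = build_bot_messages(posts)
    print(len(msgs))  # one system prompt plus one message per post
```

              The interesting design question is how much thread context to feed the model per reply; too little and the bot's answers are generic, too much and every post costs real money in tokens.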

              Something else to consider: how does generated art compare to a musical remix? Are they very similar, and if so, does the human who guides the software deserve the same consideration as a music remixer, or more, or less?

              Remixing has been around for a while now, and usually doesn't generate the same knee-jerk responses that "AI" does. However, learning systems can't make art on their own. Generally, the operator has to guide the end result through feedback over and over again.

              "Make me a painting of Elvis Presley holding a porcupine."
              "No, a younger Elvis."
              "Now have an elephant walking over a VW beetle in the background."
              "Put a squadron of spacecraft flying overhead."
              "No, nix the spaceships and add a flight of dragons."

              It's not very likely that this results in a work that resembles any previous work, even though it borrows heavily from previous works.

              And although the explosion of artwork from non-artists is likely to be mostly noise, some finite amount will have cultural value and will be noticed. In being noticed, it will provide feedback to the learning systems, and a sort of evolution will begin. The language models will not be static, so I don't think it will ever stabilize into something dull, any more than the Internet as a whole has. (Though you might still not like it.)

                duane Remixing has been around for a while now, and usually doesn't generate the same knee-jerk responses that "AI" does.

                It did in the late '80s and early '90s, when people started doing it.

                ethical or not, it's really ugly!

                I don't think I'll have a strong opinion one way or another until computers start making art on their own, without input, just because.
                That would imply computers get the idea of "fun". I don't know if I'm ready for that.

                10 months later

                You can never defeat piracy. So it doesn't matter whether it's ethical or not.
                Companies will use it, no matter the cost.

                The real problem with this AI-generated "art" is... it's boring and inconsistent as hell.
                Don't expect any improvement. It's by design... meaningless, blurry but "fancy".

                Look at image 1: it's a boring door, but it's functional.
                Image 2: think about how to open it. Then you will realize the added lighting and texture improve nothing.
                Image 3: the central symbol is corrupted, which is not consistent with the style of the other parts. And where does the green light come from? How do you open it?
                Image 4: same problem, I can't imagine how to open it.
                The last one: what's the purpose of the grass? The door frame is not cyberpunk at all.

                My conclusion is that AI will replace most cheap and unskilled artists (and developers).
                Big companies will use this trash, and they won't care about the complaints of some consumers, because most people are brain-dead. (Including the CEOs -- they can't see what's wrong with this generated content.)

                But don't ask any questions, don't look at the details, or you'll find this is just the illusion of "art".