No, a better subject for that analogy would be software engineers complaining about ChatGPT's coding abilities. Wagons & cars are both a matter of mechanical engineering. A digital calculator didn't displace pen-and-paper computers any more than power drills displaced carpenters. Transcribing and operating a printing press are both rote procedures where the skill involved achieves accuracy and the materials involved are mostly the same. But operating Stable Diffusion and painting in Photoshop with a stylus require wildly different modes of operation, and enjoying or being good at the latter hardly suggests that one would enjoy or be good at the former.
This sort of image generation (especially extrapolating probable upgrades & improvements over just the next few years) displaces artists without providing an upgrade path. It shouldn't take much empathy to understand how frustrating and scary that must be for them.
I think pxoe's expression of frustration is totally reasonable. The engineers who made this stuff could have focused on using AI to enable new possibilities, instead of undercutting existing possibilities (to create new markets instead of overtaking existing ones). They could have used 100% consensual training data, but instead felt entitled to exploit a loophole/ambiguity in the social contract under which artists have been sharing their work on the internet.
A more appropriate analogy: this sounds a lot like social movements complaining about capitalist co-opting of their symbols, e.g. the use of "communist" rhetoric to build oligarchies, or the sale of "save the earth" mugs made from oil-derived plastics. Even that isn't perfect, since the co-opted output wasn't itself the displaced work-product, but it does a better job of capturing the emotional side, the sense of betrayal. Ultimately, I don't think it's productive to reduce the reaction to the decisions behind Stable Diffusion etc. to a single analogy, and it shouldn't be so hard to say "sorry, you're right, this is bad for you and good for me, and you have every right to express your frustration over that irreversible decision".