Learnings from paying artists royalties for AI-generated art (kapwing.com)
138 points by jenthoven 9 hours ago
petterroea 5 hours ago
> The timing wasn’t right. We depended on artists helping us to promote the platform, and they didn’t.
There's a certain arrogance to believing the timing "simply wasn't right". It looks really bad if you try it with any recent controversy:
* "The timing wasn't right to charge people for heated car seats"
* "The timing wasn't right to make Photoshop a subscription service"
* "The timing wasn't right to increase fees"
It's a way of talking yourself out of confronting the fact that what you are making may, inherently, be disliked. The cited survey even seems to have been read as favourably as possible:
> Surveys consistently showed that consumers believed artists deserved payment when AI generated content in their style.
This doesn't mean people want artists' styles to be generated by AI. It could mean they think it's horrible, but that if it happens the artists should at least be compensated for it. In fact, the quoted survey even says 43% believe companies should ban copying artists' styles. I could make the exact opposite argument with the same data:
"Many consumers believe companies should ban copying styles, and this may be a more common opinion than measured, as most people have no experience with modern AI tools and therefore no chance to have formed an opinion yet. What is known is that the majority believe that if artists were to be copied, they should at least be compensated"
edit: formatting, typo
raincole 3 hours ago
Making Photoshop a subscription service was an extremely successful business decision, so I'm not sure what the comparison is supposed to mean here.
I say this as someone who switched to Krita and canceled CC subscription.
danlitt 3 hours ago
"extremely successful business decision" and "inherently disliked" can both be true. Increasing fees quite often works out for the business too, but consumers don't generally like it.
logifail 2 hours ago
noosphr 3 hours ago
Enshittification kills companies slowly, then all at once.
Terr_ 2 hours ago
Speaking in generalities, we underestimate how many things fail due to circumstances like "the market wasn't ready for it." (In contrast to the more dramatic and common "all great success stories are due to leaders singularly imbued with unique and ineffable Greatness and Genius.")
codemog 5 hours ago
It’s not arrogant to be firm in your beliefs. You’re not arrogant for believing the timing is never right. You may even be 100% right, but you don’t have to belittle or put down the other side. In this case, they already lost, what more do you want?
petterroea 4 hours ago
How is it not arrogant to be firm in your belief, even if signals say otherwise? If I believe it is OK not to shower, and everyone around me complains about it, is it not arrogant of me to ignore the signals because "they just don't understand yet"?
I think a much more useful question is whether some arrogance is necessary to succeed. I personally think it is. But we are discussing a post mortem here, and the author is (in my opinion) clearly beating around the bush and using "the time wasn't right" to hide what may be uncomfortable truths.
Is a post mortem valuable if it doesn't address these face first? I am not the one with all the answers here, but what I am used to in mature tech teams is that the uncomfortable parts are usually the most important in any post mortem.
There are plenty of stories about companies that failed because the timing was wrong, and then see another company succeed in their place later on. That doesn't mean failure simply means "the timing was wrong" - you are putting a lot of weight on society adjusting to your belief. Consider that venture capital often invests in hundreds of founders like this, betting that at least one of them wasn't wrong. That's not statistically in your favor.
It is OK (in fact it is valuable) to fail and conclude that your signals may have been wrong. There's a reason some venture capital funds prefer investing in people who have failed before.
codemog 3 hours ago
antonvs 2 hours ago
If your beliefs are in conflict with reality, then holding them firmly may indeed be arrogant.
wolvesechoes 2 hours ago
> It’s not arrogant to be firm in your beliefs
I mean, if you keep ignoring stuff that undermines your beliefs that's the definition of arrogance.
spudlyo 8 hours ago
> Surveys consistently showed that consumers believed artists deserved payment when AI generated content in their style.
It's interesting that "consumers" are generally for the expansion of IP laws. At the moment, I'm fairly certain that "style" is not something protected by copyright. I personally do not want this, and I'm sure there are likely many like me. Poorly thought out IP laws lead to chilling effects, DRM, stupid and unnecessary litigation, and ultimately a loss of digital freedoms.
> What 325 Cold Emails to Artists Taught Us
I'm surprised 1% didn't respond with "EAT HOT FLAMING DEATH SPAMMER" for sending them unsolicited commercial email. ;)
Gigachad 8 hours ago
Trying to protect a particular style is just unworkable, for obvious reasons. The only solution I can think of is requiring AI companies to license all of the content in their training set, so artists get paid for the training, rather than trying to work out which source material links to which outputs, which is impossible.
spudlyo 7 hours ago
When I buy a book, I don't buy a license to read it; I don't sign an EULA that says I won't scan it, digitize it, or write a program to analyze the word frequencies it contains. Do you want to buy a license to read a book? Because this is how you get there.
tdb7893 6 hours ago
Gigachad 7 hours ago
aerhardt 7 hours ago
throwawaysoxjje 6 hours ago
squokko 7 hours ago
mitthrowaway2 6 hours ago
add-sub-mul-div 7 hours ago
croes 6 hours ago
esafak 6 hours ago
saaaaaam 5 hours ago
numpad0 7 hours ago
The cumulative license fees required to properly compensate all artists are so absurd that paying them would probably burn down the entire global economy. The only solution I can think of is to burn down just the AI, to be revisited later and rebuilt as a tool that doesn't require an absurd amount of training data, and that leaves a lot more to its human operator beyond merely accepting literal categorical descriptions that are fundamentally tangential to the artistic value of the output.
And I think the same could happen to LLMs. If it took all the fossil fuel on Earth just to barely drive a car to a car wash, there's more wrong with the car than with the oil price.
Retric 6 hours ago
palmotea 5 hours ago
franciscop 5 hours ago
maplethorpe 6 hours ago
It's interesting you interpret the consumer's response as a desire for the expansion of IP laws. As an artist whose work exists in many of these training sets, I'm of a different opinion: IP laws can stay the same, but they should have purchased a license to use my art before including it in their training data.
Since they didn't, they should go to jail. The same way I would have gone to jail if I built Sora in my basement and sold it to the public.
JAlexoid 5 hours ago
As an artist your license didn't ban learning from your work. Unless your content was acquired without a license at all - you absolutely gave them permission to use it in training sets.
That is the gap in the legal landscape.
maplethorpe 4 hours ago
visarga 6 hours ago
I thought it was at most a monetary fine; do people go to jail for copyright infringement? But you seem to want to own all the air around your work, and the ground beneath it too. Nothing can exist around it, so a creative person would do better to avert their eyes rather than load up on useless ideas. Why should I install your "furniture" in my brain when I am not allowed to sit on it? In these cases I think authors provide a net negative to society by creating more works that further forbid others from creating in the same space.
Here, for example, any comment is open to read and respond to. On ArXiv any paper can be downloaded, read and cited. Wikipedia contains text from many thousands of editors, building on each other. We like collaboration more than asserting our exclusivity rights. That is why these places provide better quality than work for direct profit or, God forbid, ad revenue, that is where the slop starts flowing.
protocolture 6 hours ago
>IP laws can stay the same, but they should have purchased a license to use my art before including it in their training data.
But including your art in the training data is fair use (or otherwise exempt) by most standards, as no reproduction occurs. You are advocating for a change to IP law to make it more restrictive.
JoshTriplett 4 hours ago
abustamam 6 hours ago
heavyset_go 6 hours ago
throwawaysoxjje 6 hours ago
bluefirebrand 6 hours ago
onion2k 4 hours ago
> It's interesting that "consumers" are generally for the expansion of IP laws. At at the moment, I'm fairly certain that "style" is not something protected by Copyright. I personally do not want this, and I'm sure there are likely many like me. Poorly thought out IP laws lead to chilling-effects, DRM, stupid and unnecessary litigation, and ultimately a loss of digital freedoms.
Kapwing is specifically designed for artists to share IP with other people in an IP-friendly and financially profitable way. A 'consumer' on Kapwing is not the same as an ordinary person browsing for AI generated art, and the fact that people who make money from selling their IP on there are in favour of expanding IP law shouldn't be a surprise.
All this really tells us is that Kapwing's artist community believe protecting their individual art style is more valuable to them than any money they'd earn from licensing it on a per-image basis to Kapwing's AI tool. I'd be willing to bet that if Kapwing changed the offer to a flat-fee-of-$50,000-a-year-plus-per-image-fee they'd find 99% of artists on there changed their minds. As with most things, people feel strongly about their rights all the way up until the price is right.
fennecfoxy 2 hours ago
Yeah as a furry (exposed to commission scene a lot) the number of commissions available for "Disney" or "Pixar" or whatever style art even before the whole AI era really tells you that they're hypocrites.
JAlexoid 5 hours ago
> I'm fairly certain that "style" is not something protected by Copyright
To a degree it is protected, but not by copyright. Design patents are a thing and companies have sued each other over them (Apple vs Samsung during the "smartphone wars" comes to mind)
j16sdiz 5 hours ago
> It's interesting that "consumers" are generally for the expansion of IP laws.
It's not. This totally depends on how you ask it.
Q: Do you think artists deserved payment?
A: YES.
Q: Will you pay for art?
A: MAYBE.
Q: Do you think people should go to jail for not paying for art?
A: NO.
abustamam 6 hours ago
Just out of curiosity, do you believe artists deserve to be compensated when their art is used to generate stuff in their style?
I'm staunchly against expansion of IP laws. But I personally think that when a corporate machine gobbles up an artist's works so that people like me who can't draw can generate silly memes for a few bucks a month, the artist should be compensated. The company is profiting off of other people's work! That's not right.
The mechanism by which compensation is calculated appears to be an unsolved problem currently though.
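Purely as an illustration of one candidate mechanism (all names and numbers here are hypothetical, not anything Kapwing or Tess actually did), a Spotify-style pro-rata split is easy to sketch: take a royalty pool and divide it by how often each artist's style was invoked over the period.

```python
def pro_rata_royalties(pool, style_invocations):
    """Split a royalty pool among artists in proportion to how often
    each artist's style was invoked over the billing period."""
    total = sum(style_invocations.values())
    if total == 0:
        # No invocations this period: nobody earns anything.
        return {artist: 0.0 for artist in style_invocations}
    return {artist: pool * count / total
            for artist, count in style_invocations.items()}

# Hypothetical month: a $1,000 pool split across three artists.
payouts = pro_rata_royalties(
    1000.0, {"artist_a": 700, "artist_b": 250, "artist_c": 50}
)
```

The hard part this glosses over, of course, is producing the invocation counts at all when a generation never names a style.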
csallen 5 hours ago
> The company is profiting off of other people's work! That's not right.
What's wrong with it?
We live in an interconnected world. Every company or individual who profits off anything does so, in very large part, thanks to work left behind by others that they don't directly compensate each other for.
Stated differently, if we look at the other side of the coin: it's one thing to create value, and another thing to capture value. If you are a business (and artists seeking profit are businesses), you create value and then try to capture that value. Creating value and trying to capture it (in the form of profit) is the entire name of the game. But no business captures 100% of the value it creates. If you make a product/artwork/service/whatever and release it to the public, lots of people may use it, view it, be inspired by it, learn from it, and ultimately profit off it in their own way without you necessarily being able to capture some part of that. And what's wrong with that?
Do we really want the entire world to be endlessly full of cookie-licking rent seekers who demand profit every time anyone does anything? Because they failed to capture the value they created, and thus demand a piece of the pie from those who are better at capturing value?
I like the way Thomas Jefferson put it:
> If nature has made any one thing less susceptible than all others of exclusive property, it is the action of the thinking power called an idea, which an individual may exclusively possess as long as he keeps it to himself; but the moment it is divulged, it forces itself into the possession of every one, and the receiver cannot dispossess himself of it. Its peculiar character, too, is that no one possesses the less, because every other possesses the whole of it. He who receives an idea from me, receives instruction himself without lessening mine; as he who lights his taper at mine, receives light without darkening me. That ideas should freely spread from one to another over the globe, for the moral and mutual instruction of man, and improvement of his condition, seems to have been peculiarly and benevolently designed by nature, when she made them, like fire, expansible over all space, without lessening their density in any point, and like the air in which we breathe, move, and have our physical being, incapable of confinement or exclusive appropriation. Inventions then cannot, in nature, be a subject of property. Society may give an exclusive right to the profits arising from them, as an encouragement to men to pursue ideas which may produce utility, but this may or may not be done, according to the will and convenience of the society, without claim or complaint from anybody. Accordingly, it is a fact, as far as I am informed, that England was, until we copied her, the only country on earth which ever, by a general law, gave a legal right to the exclusive use of an idea.
sciencejerk 5 hours ago
pastel8739 5 hours ago
JoshTriplett 3 hours ago
AnthonyMouse 5 hours ago
> It's interesting that "consumers" are generally for the expansion of IP laws.
Don't forget how polling works. Change the wording of the question and you get a different answer.
Try asking them if they think Comcast or Sony should be able to sue individuals for posting memes that don't even contain any copyrighted material.
SpicyLemonZest 7 hours ago
I don't think you can infer consumer positions on IP law from positions on who ought to get paid or how much they should be paid. Many of those same consumers, and indeed many of the artists, feel that fan art of your favorite characters should be legal and unrestricted so long as nobody's making too much money off of it.
spudlyo 7 hours ago
You're right. It's wrong to think that all of those people are busy writing to congress demanding new laws be enacted. The problem is, the vast majority of people (while possessing a vague sense of right and wrong) do not understand how IP law works, and what the tradeoffs vis-a-vis the public good are. I'm sure many among the supposed consumers in this survey think something akin to "there ought to be a law" -- a sentiment sometimes echoed by readers of this very forum.
gedy 7 hours ago
Yes this is where I fear big corps leverage hate for AI into adding even more nonsense copyright rules like protecting "style" which has never been under copyright in the US at least. Not defending AI scraping and training! But this will be abused even if no AI is involved.
nakedgremlin 8 hours ago
I thought this was a great write up on the current state for artists and AI engines. I'm honestly surprised by this nugget:
> A free Tess subscription to use their own model for brainstorming and scaling repetitive work (roughly 1 in 4 artists took advantage of this)
So based on the math I'm seeing... of the 21 artists in the system, only 5 ("1 in 4") opted to use the tool for their own productivity? That seems really low and makes me wonder what the user experience for creation feels like. I would assume if you decided to commit to this endeavor, you would want to see what the derivative results look like.
sciencejerk 5 hours ago
Only 21 artists??
ptmkenny 6 hours ago
I evaluated Tess.design about a year ago for an app I was building. At first I was excited because I wanted a service that compensated artists. However the number of artists was very limited and the blog post said “more will be added soon” but it had already been a year and it seemed like none had been added, not a good sign.
Then I tested out the image generation itself and I was unable to come up with prompts that achieved the kind of images I wanted. My only prior experience at the time was OpenAI API. With OpenAI I usually got what I wanted on the first or second try, but with Tess, I couldn’t get a usable result even after 20 tries.
So in addition to the limited number of artists, I think the quality of outputs vs. competing models was a huge factor. I needed to generate thousands of images, so I couldn’t afford to do dozens of attempts for each one.
Hopefully one day there will be a service that can match the quality of OpenAI Image API and Flux but with compensation for artists.
abustamam 6 hours ago
Yeah this just shows that ergonomics matters. I use Nano Banana and Grok Imagine to generate silly images for my friends and siblings (instead of reaction gifs I do reaction slop). The workflow is quite easy. Just plop in a prompt and usually the first image is good enough to share. Not that my standards are high anyway.
Would I pay extra to ensure that the artists that these models were trained on were compensated fairly? Absolutely! Would I pay extra for that but with degraded ergonomics? Given that this is just a silly hobby, probably not, if I'm being honest.
I think if that problem can be solved, and it's marketed to the correct group, a player in this space could certainly do well.
JAlexoid 5 hours ago
Most people can't even imagine the complexity it would require to actually build a system that correctly tracks down the sources for image generation. Not to mention that each image is generated from literally every single training image, each contributing a very small percentage.
It's not hard when someone inputs "create in style of studio ghibli" to say that studio Ghibli should get a cut. It's very different when you don't specify the source for the origin style.
And if you tried to identify the source material owner, the percentage of the output image that their work contributed would be extremely small, if not infinitesimally so. You'd get minuscule payouts.
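Back-of-the-envelope arithmetic (all numbers hypothetical) makes the dilution concrete: divide even generous attributed revenue evenly across a web-scale training set and each image earns almost nothing.

```python
# Hypothetical: $100M/year of revenue attributed back to training data,
# spread evenly across ~5 billion training images (LAION-scale).
annual_revenue_share = 100_000_000
training_images = 5_000_000_000

payout_per_image = annual_revenue_share / training_images
# Two cents per image per year, before any attribution weighting at all.
```

Any realistic attribution weighting would push most individual images well below even that.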
abustamam an hour ago
supermatt 5 hours ago
> …every image was traceable to a single consenting artist
> …fine-tune a Stable Diffusion base model.
So your entire business proposition was a lie, as you literally used a base model trained on billions of images by other artists too!
kennywinker 8 hours ago
They took a base model, so something trained on stolen work - and then added a veneer of non-stolen work. I too would be skeptical of their legal position.
iso-logi 6 hours ago
I believe a service like this could succeed if the initial base model wasn't Stable Diffusion and wasn't trained on internet scrapes without copyright permission.
Their solution basically just amounts to "ethically sourced styles", which still has all the red tape that a normal text2image model has, because the majority of the data is still unapproved for use in an AI model.
Businesses didn't want to get wrapped up in a pseudolegal model that really has no better legal standing than base SD.
protocolture 6 hours ago
They took a base model, trained on but not reproducing work, so entirely fair with no theft, and then tried to tweak it so it could make money for an artist.
kennywinker 5 hours ago
Except that as soon as it is used to create work, it’s reproducing work that is derived from what it was trained on. Not just the stuff it was TUNED on or asked to derive style from.
ocdtrekkie 7 hours ago
If anything the legal position is probably the opposite: The law is leaning towards AI training being transformative/fair use and AI generated content not getting any copyright protection at all. So something paying artists for style-rips probably was a net positive for artists, because it's very possible it will end up outright legal to have gen AI rip off artists' styles wholesale.
kennywinker 5 hours ago
In which case they would have zero business model.
Kim_Bruning 6 hours ago
Cite one legal case where an AI company trained on a particular work, and the judge ruled that they quote-stole it-unquote.
kdheiwns 6 hours ago
Courts pretty much always rule in favor of rich corps that steal from individuals, and increasingly so. AI companies have money. Artists don't. That makes AI thievery fine, doubly so since AI corps have financially contributed to the government.
JAlexoid 5 hours ago
devonkelley 6 hours ago
The 1 in 4 artists actually using the model for their own work is the most interesting data point here. If you're building a royalty system and 75% of the people being paid don't even want to use the tool themselves, that tells you something about the gap between "this is fair compensation" and "this is actually useful to my creative process." The royalty model might be the right thing ethically but it doesn't solve the adoption problem.
croes 5 hours ago
Or those 75% don’t want to work with that kind of tools no matter the compensation
Hansenq 6 hours ago
I love this writeup--it's one of the refreshing looks into how startup innovation happens on-the-ground. We're inundated with new products and startups so often that it's easy to forget that the people working on the product are taking a bet with no promise of future payoff. In this case, it didn't work out, despite the team putting in their hard work, sweat, and clearly lots of stress.
Startups are not for the weak but the process detailed here is how we've gotten some of the most transformative and innovative products in technology. Props on attempting this unique idea; very sad that it didn't work out, but sometimes the market just can't support certain ideas!
ipaddr 6 hours ago
They failed because they gave advances that were never going to be paid back and expected artists to bring in customers.
The demand to produce something in an artist's style is low. The volume required to make it interesting to artists isn't present.
Pushback against AI adoption is greatest among artists; you would be better off asking them for money to shut AI down.
The tech itself sounds interesting and would love that writeup.
jowsie 6 hours ago
The tech doesn't sound that interesting at all. Every AI degen thread on 4chan and similar sites has included model fine-tuning instructions for a few years now, for the express purpose of cloning an existing artist's style. I also find it interesting that they included a quote from an artist pointing out the hypocrisy of using an existing model, trained on unlicensed material, but never actually discussed that particular issue in the article.
fennecfoxy an hour ago
I think you mean LoRAs more than a full fine tune. Yes, and there are plenty of online resources for training a LoRA as well; on CivitAI you can just give it a bunch of images + labels and it does the rest for you, so the bar is pretty low.
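For anyone unfamiliar, the low bar comes from how small a LoRA actually is: the base weights stay frozen and only a low-rank correction is trained. A toy numpy sketch of the idea (dimensions and scale are illustrative, not any real model's):

```python
import numpy as np

rng = np.random.default_rng(0)

d_out, d_in, rank = 8, 8, 2              # toy sizes; real layers are far larger
W = rng.standard_normal((d_out, d_in))   # frozen base weight
A = rng.standard_normal((rank, d_in))    # trainable down-projection
B = np.zeros((d_out, rank))              # trainable up-projection, zero-initialized
scale = 1.0

def lora_forward(x):
    # Base path plus a low-rank "style" correction; because B starts at
    # zero, an untrained LoRA exactly reproduces the base model's output.
    return W @ x + scale * (B @ (A @ x))

x = rng.standard_normal(d_in)
assert np.allclose(lora_forward(x), W @ x)  # untrained LoRA is a no-op
```

Training only A and B means the adapter is a few percent of the model's size, which is why sharing and stacking style LoRAs is so cheap.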
Terr_ 8 hours ago
Props for a postmortem, much like scientific studies that publish negative results.
john-radio 8 hours ago
really well written and generous with interesting details, too.
s1mon 7 hours ago
This reminds me of the articles I occasionally see in the local newspaper about a restaurant that is closing down. So often it’s one that I’ve never heard of before that. To me, that’s the number one issue. If your likely customer base (or at least an audience member who reads a lot about the industry/market) hasn’t heard about your product, how are you going to have a successful business?
Yizahi an hour ago
I wonder, did they pay the artists whose art they took, without paying or asking, to train the model they are promoting? I guess we know the answer :)
fennecfoxy 2 hours ago
I mean, there's no point; everyone still gets super mad even in cases where models were trained only on content that a company owns or has paid for.
I wish artists would stop with the "it stole our work" bullshit and just be more honest about the "it can do what we do and we're terrified and scared for our future" part.
Because that I can 100% understand, and contrary to previous jobs just disappearing, we do live in "the future" and things like UBI or free cross-training should be available for this sort of thing.
herrfinste 5 hours ago
> One engineer who left Kapwing in fall of 2025 said that the short-lived Tess investment contributed to burnout.
Don’t take this personally.
Even if you told this person to work constantly and they believed in you and the business, it's not totally your fault that they burned out. I say this as someone who has burned out twice, is currently burned out, and blames those I currently and formerly worked for. I know the problem is as much me as them. Yes, employers have a responsibility not to burn out their employees. But even if the employer is in a power position where the employee felt they had no other choice (and I felt that both times), the employee can choose not to work that much or care that much, for almost whatever that means; if you're literally holding a gun it's different, of course.
I know of a developer that committed suicide and the toll that took on the employer. But the employer can’t take on all of that themselves.
I’m sorry that your business failed, but I hope that something good comes out of this.
Also- I’m not saying that any part of your responsibility in burning out this person was ok. Just that not all of it is your fault.
rambambram 4 hours ago
I'm not a native English speaker, but since when did 'lessons' become 'learnings'?
defrost 4 hours ago
As a native (Commonwealth) English speaker of six decades+ .. it only really "appeared" in frequency during the past decade, more heavily in the past five years or so, in central North American settings.
Grammarist will tell us:
Despite being more popular than “lessons” in the corporate setting, “learnings” is still incorrect. It's an erroneous plural form of the colloquial term “learning.”
~ https://grammarist.com/usage/learnings/
As a business-speak buzzword it might fade, or it may end up with a greater global footprint outside of the Biz-speak Babel tower.
billyjobob 37 minutes ago
Probably due to the movie "Borat! Cultural Learnings of America for Make Benefit Glorious Nation of Kazakhstan".
People missed the joke that it was poor English on purpose.
bandrami 8 hours ago
As somebody who occasionally gets tiny ASCAP checks I think an ASCAP/BMI model might work for artists (and maybe even writers?) I guess this is more like SESAC, but maybe that's how this will end up working.
hendry 5 hours ago
Are there successful non-AI artist platforms for works of art?
Papazsazsa 7 hours ago
The individual who figures out how to do this will be both wealthy and beloved.
minimaxir 7 hours ago
The majority of the artist responses were "hard no" in 2024. There's no way the artist demographic such a service would appeal to would be on board with anything even tangential to AI in 2026 (even done ethically), where the professional liability far exceeds the potential revenue.
bluefirebrand 6 hours ago
Most artists I have spoken to don't believe it's possible to do this AI stuff ethically
Maybe they're wrong but I tend to agree. Or even if it is possible to do it ethically, it still never will be done that way because there's just too much money in behaving unethically
JAlexoid 5 hours ago
tcbrah 4 hours ago
the spotify comparison is telling because spotify succeeded by being better than piracy, not by being more ethical. tess was trying to compete on ethics against tools that were just flat out better at the actual job.
i generate hundreds of images weekly for video content and the honest truth is i never think "i want this specific artist's style." i think "i need a documentary still that looks like 1970s film grain" or "i need a character that matches my last 50 frames." consistency and speed matter way more than provenance. the few times i tried artist-specific fine tunes the quality was noticeably worse than just prompting a good base model well.
the 6.5% artist signup rate buried in there is actually the real story. they cold emailed 325 high end editorial artists and got 21. those artists didn't want passive income from AI - they wanted AI to not exist in their market at all. paying someone royalties to automate away their livelihood is a weird value prop no matter how you frame it.
kingkawn 4 hours ago
How about this: few want one particular artist's style reproduced; instead they want what they are vaguely seeing in their head, produced from a cacophony of styles.
shevy-java 5 hours ago
I don't understand why we should pay for AI.
dragonwriter 5 hours ago
Is that because you don't believe we should use AI, or because you do not agree with “if you aren’t paying for the product, you are the product”?
JAlexoid 5 hours ago
Currently it's mostly to pay for running and training the models.
throwaway314155 6 hours ago
This article is bullshit. You can't get a full model from training on just one artist's work. A pretrained model is required. The pretrained model was likely one which was indeed trained on the works of others without consent.
What's more, their reasoning for abandoning the company was to build out another company with a suspiciously similar idea...