Artists are turning to Cara, a new social platform developed by Singaporean photographer and director Jingna Zhang, to shield their artwork from unauthorized use in AI training on mainstream platforms like Facebook and Instagram. Zhang launched Cara in April 2023 as a protective measure against data scraping for AI, and user numbers have been growing amid increasing debate over Facebook and Instagram's contentious data collection policies.
The escalating use of AI art generators such as DALL-E and Midjourney has further fueled the migration. Cara has so far exceeded 300,000 downloads on the US iOS App Store, surpassing major apps such as LinkedIn, Reddit, and Discord in the download rankings.
Zhang experienced IP theft firsthand in 2022, when a Luxembourg painter, Jeff Dieschburg, was convicted of plagiarism for unlawfully copying and profiting from a photoshoot Zhang conducted for Harper’s Bazaar Vietnam.
Cara takes a firm stance against generative AI. The platform will not host AI-generated portfolios until the extensive ethical and data-privacy concerns around training data sets are addressed through regulation. It also does not allow NFTs, to avoid encouraging a quick-profit culture or deceptive practices.
Before Cara, artists sought refuge in platforms like Artful, a website tailored to artists, and Bluesky, a decentralized social media app. However, migration to such platforms tends to peak and then gradually decline, according to Laura Thiele, a 2D artist and illustrator based in Melbourne. Despite the constant effort it takes to protect IP online, artists have been experimenting with Cara, prompting a significant surge in user numbers. Zhang has disclosed spending US$13,500 (AU$20,300) a month to keep the site's servers running.
Thiele believes that while the risk of AI poaching exists as long as art is accessible online, platforms like Cara offer a longer-term solution by bringing artists together, starting conversations, and setting boundaries with entities that intend to exploit them. In her view, a strong artist community is the best means of protecting artists' work from AI misuse.