Seemingly out of nowhere, photo editing app Lensa AI became a huge hit when it launched its Magic Avatar feature, which creates AI portraits in a variety of styles. However, artists already suspicious of AI image generators grew concerned when they noticed what appeared to be signatures scrawled in the corners of many of these AI portraits. Some artists claim this could prove that Lensa AI was trained on stolen work that included the watermarks or signatures artists use to deter theft. One Twitter user, Lauryn Ipsum, pointed out the phenomenon in a now-viral tweet: “These are all Lensa portraits where the mangled remains of an artist's signature are still visible,” wrote Ipsum, with photos attached. “These are the remnants of the signature of one of several artists that the AI stole.” The AI-generated signatures were all illegible and, according to Prisma Labs (the parent company of Lensa AI), do not constitute proof of theft.
“The notion of 'remains of artist signatures' is based on the mistaken idea that neural networks can combine existing images. The actual process is different,” said Andrey Usoltsev, CEO of Prisma Labs. Although neural networks are trained on pre-existing images, once training is complete the AI no longer refers to the vast dataset of images it was trained on. Instead, it has learned to imitate specific styles. According to Usoltsev, the AI learned that a signature is a key feature of the “painting” category, so it generates one of its own.
“On this occasion, AI mimics the paintings, the subset of the images that usually come with signatures. The AI understands the signatures as an inherent style feature and mimics them,” wrote Usoltsev, who added that “the pointed details do not use any existing language and do not represent the particular artist's signature.” According to Usoltsev, this mimicry is not theft, as no particular artist has had their signature distorted or, in Ipsum's words, “mangled.” However, the point may not be that any one artist is being robbed, but that, en masse, artists' work may have been used to train the very technology that threatens to replace them.