Have I Been Trained, a service set up by the Spawning art collective, lets you discover in a few seconds whether your own photos have been used, often without permission, to train an AI image generator. You simply upload a photo to the site's search engine, which then probes LAION-5B, an archive of nearly six billion images that has been widely exploited to improve the performance of various AI systems. Spawning's broader goal is to help artists and photographers protect their digital works as effectively as possible.
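Under the hood, datasets like LAION-5B are indexed by CLIP embeddings, so a reverse lookup of this kind amounts to embedding the query photo and running a nearest-neighbour search. The sketch below is purely illustrative and is not Spawning's implementation: it uses the public CLIP model from the Hugging Face transformers library and compares a query photo against a tiny local set of placeholder images standing in for crawled entries.

```python
# Illustrative sketch of a CLIP-based reverse image lookup, the kind of
# search a service like Have I Been Trained performs against LAION-5B.
# NOT Spawning's code: it embeds a query photo and scores it by cosine
# similarity against a small, hypothetical local set of candidate images.
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

def embed(paths):
    """Return L2-normalised CLIP image embeddings for a list of image paths."""
    images = [Image.open(p).convert("RGB") for p in paths]
    inputs = processor(images=images, return_tensors="pt")
    with torch.no_grad():
        feats = model.get_image_features(**inputs)
    return feats / feats.norm(dim=-1, keepdim=True)

# Hypothetical file names standing in for images scraped into a dataset.
dataset_paths = ["crawled_001.jpg", "crawled_002.jpg", "crawled_003.jpg"]
dataset_embs = embed(dataset_paths)

query_emb = embed(["my_photo.jpg"])               # the photo you upload
scores = (query_emb @ dataset_embs.T).squeeze(0)  # cosine similarities
best = scores.argmax().item()
if scores[best] > 0.9:  # arbitrary illustrative threshold
    print(f"Likely match: {dataset_paths[best]} (score={scores[best]:.3f})")
else:
    print("No close match found in this toy index.")
```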
The AI systems behind increasingly popular image generators such as DALL-E, Google Imagen, Midjourney and Stable Diffusion must be trained on enormous numbers of photographs paired with textual descriptions in order to process users' prompts ever more precisely and quickly. Controversy quickly arose over the origin of the images used for this training, since little attention was apparently paid to whether they were covered by copyright. In response, the Spawning art collective built a portal that lets you find out immediately whether your photos are part of the large databases used for AI training. For now the search covers LAION-5B, which scrapes the web indiscriminately without filtering out much protected content; its sources include Getty Images, Flickr and Pinterest. Stable Diffusion and Google Imagen are among the generators trained on this archive.
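For context, the training material in question consists of image-caption pairs: each LAION-5B record pairs an image URL with the alt-text caption found alongside it, plus metadata such as a CLIP similarity score. The snippet below shows invented placeholder records of that shape, just to make the idea concrete; these are not real dataset entries.

```python
# Invented, illustrative examples of the image-caption records a dataset
# like LAION-5B stores (real entries also carry width, height, language
# and NSFW flags). Image generators learn to map captions to images.
records = [
    {
        "url": "https://example.com/photos/lighthouse.jpg",  # placeholder URL
        "caption": "a lighthouse on a rocky coast at sunset",
        "clip_similarity": 0.34,  # how well caption and image agree, per CLIP
    },
    {
        "url": "https://example.com/art/portrait_oil.jpg",  # placeholder URL
        "caption": "oil painting portrait of a woman in blue",
        "clip_similarity": 0.31,
    },
]

for r in records:
    print(f"{r['caption']!r} -> {r['url']}")
```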
The Have I Been Trained portal lets you search by keyword or, better still, upload one of your own images to find out immediately whether it has been used. "As a digital artist, in this era, we focus on being recognized on the Internet. Right now, when you type in my name, you see more work from the AI than work that I have made myself, and it's terrifying. How long will it take before the AI overwhelms my results and becomes indistinguishable from my own work?" the Polish artist Greg Rutkowski commented to Forbes. The portal's goal is described on the collective's website: "Spawning is building tools for the intellectual property of artists' training data, enabling them to accept or decline the training of large AI models, set permissions on how style and similarity are used, and offer their models to the public. We believe every artist should have the tools to make their own decisions about how their data is used." Once you have searched for an image and found it in the archive, you can decide whether or not to grant permission for it to be used in training. Will that be enough to protect artists whose names can be used as keywords to make an AI generate images that mimic the exact style of their photographs or creations?