Artists advocate for transparency in AI-generated art
Concerns about artists' rights are growing as generative AI becomes more prevalent in the art world. Recent protests by musicians in the UK and an open letter from artists around the world have highlighted the use of copyright-protected work for AI training. The letter called on Christie's auction house to cancel a sale of AI-assisted artwork, arguing that many of the pieces had been created without proper licensing.

Generative AI, which produces content such as images and music from models trained on large data sets, raises questions about creativity and intellectual property. Artists worry that its current use can lead to the unauthorized copying of their work. The debate touches on deeper questions about human creativity and the evolving role of technology in art.

Research by the artist Trevor Paglen focuses on understanding how AI image processing works, including how data sets are compiled and labeled before they are used to train models. Such understanding is crucial for determining how AI shapes our perception of the world. Paglen and fellow artists emphasize the importance of critically engaging with the mechanisms behind AI, rather than simply using it to generate images.

Some artists, such as Holly Herndon and Mat Dryhurst, actively manipulate data sets to reveal how AI produces images. Their project shows that AI can create exaggerated or distorted versions of reality based on its training data. This exposes inconsistencies within AI processes and raises serious ethical concerns, especially in fields like facial recognition and surveillance.

The art community is calling for greater accountability in AI technologies, and there is a growing need for interdisciplinary research that bridges art and the humanities. Such work can help us better understand and engage with a technology that increasingly influences our lives. The work of artists like Paglen and Anadol not only critiques AI but also inspires a more thoughtful approach to its use in art.