Artificial intelligence (AI) is not merely a technical phenomenon; it is a major force reshaping diverse industries, including creative media. For educators, the challenge is twofold: to understand this rapidly evolving field and to prepare the next generation of creative professionals.
AI in the Creative Sector
Futureworks, a specialist provider of creative media courses, recently held the biannual meeting of their Industry Advisory Group (IAG), and surprise, surprise, the agenda item that generated the liveliest discussion was the use of AI in the creative industries. While we got the impression that most had experimented with it for marketing purposes, administrative functions, or simply for the fun of it, the majority of our members said they were not currently using it to generate creative outputs. Given the scale of media attention around tools such as Midjourney, Bard, and Adobe Sensei, this was a bit of a surprise to us. However, we were able to use the meeting to dig deeper into why creative professionals were reluctant to use these disruptive technologies.
Futureworks asked the IAG whether their companies were using AI and, if so, for which aspects of their business. Some were exploring AI’s potential to automate time-consuming tasks, generate new ideas, and enhance creative output. Overall, though, the feedback from our IAG demonstrated a more cautious approach to AI use in the workplace. As academics, this may be the moment to pause and synchronise our adoption of AI with that of the specific media industry we teach. A parallel to this caution about AI in creative fields can be drawn from academic work arguing the case for students having access to the internet during examinations, research which suggests that students are being taught and prepared for an obsolete examination system.
Ethics of AI in Higher Education and Industry
AI will present many ethical and legal issues for industry practice. It offers opportunities for creative roles, but because it can only generate from work that has already been designed and produced by humans, the creative value it adds is limited.
Games companies in the USA have concerns about a potential backlash over AI art. In the “No to AI” protest, artists expressed their disapproval of ArtStation’s decision to allow AI-generated portfolio art by flooding the site with protest images and encouraging artists to tag their work “No AI Art”. Epic Games’ use of AI images in its web portfolio also raised concern, though they noted that scraping others’ art from the web without permission for commercial gain should not become the norm. Earlier this year, US District Court Judge Howell presided over Thaler’s copyright lawsuit against the US Copyright Office over his AI-generated image. Refusing to grant Thaler copyright ownership, she noted that copyright has never been granted to work “absent any guiding human hand”, adding that “human authorship is a bedrock requirement of copyright” (The Verge, 2023). This case demonstrates that the development of a symbiotic relationship between AI and creative media still has a long way to go, especially as AI is being adopted as an essential creative process by students producing work for assessment.
Many in the meeting felt it was a positive development that AI was threatening the traditional assessment of creative work, and that more assessment focus could be placed on demonstrating an understanding of how existing creative production processes are managed. For example, some critics argue that AI-generated content can be difficult to distinguish from human-created content, which raises questions about copyright and plagiarism. In the context of HE programmes, this potentially calls into question the creative authenticity of a candidate’s portfolio work presented at interview and what role AI played in its production. The consensus is that most companies would not encourage candidates to rely on heavy AI generation for work presented in their portfolios, and this is an important message that must be shared with students as they learn how to use AI as their professional co-pilot.
Industry members reported that AI was used for concept art, text production, and in audio software to ‘strip out’ the vocals (or individual instruments) from a commercial music recording, either because the original master tapes are unavailable or to speed up delivery when creatively re-mixing or re-imagining a licensed track. In the CG field, it was reported that AI is generally only used for tasks such as compiling data, training texts, and PBR (Physically Based Rendering) workflows.
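For readers curious what this kind of vocal ‘stripping’ looks like in practice, here is a minimal illustrative sketch using the open-source Demucs source-separation library. It is not a tool any IAG member named, and the file name is hypothetical; it simply shows how a mixed recording can be split into a vocal stem and an accompaniment stem.

# Illustrative sketch only: assumes the open-source Demucs package is
# installed (pip install demucs); "licensed_track.mp3" is a hypothetical file.
import demucs.separate

# Split the mix into two stems (vocals and everything else), writing the
# results to Demucs's default "separated/" output folder.
demucs.separate.main(["--two-stems", "vocals", "licensed_track.mp3"])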
As we move deeper into a world where AI is ubiquitous, the HE sector and industry must work with students to find out what they consider acceptable and ethical use of AI in their academic work, so that its place in the HE curriculum can be understood more effectively.