Chosen topic
- The Evolution of Social Movements in the Age of Social Media
- #MeToo movement
- AI and Ethics: Navigating the Moral Landscape of Machine Learning
- Bringing AI to the health sector
- AI art stealing content
3 social media outlets:
- Pinterest
- TikTok
- Instagram
- LinkedIn
- Twitter
Twitter:
- Repost / analysis of an artist's opinion on AI and its use in content creation
- Repost / analysis of a film star, YouTube creator, politician, or director's opinion on AI in media creation
- My tweet / my opinion of AI, how I think it will affect content creators' jobs, and how we consume media
Instagram:
- One reel / similar to the TikTok video
- A post with an image
- A post with an image
TikTok:
- Video analysis of Sora AI and its effects on content creation
- Companies that have used AI in their adverts
- Recently released movies that are proven to have used AI, and what this means for media production in the future
_____________________________________________________________________________________
Spread awareness that AI is a biased machine: the data it is trained on may not always be ethically representative
Google's AI that tried to combat racism but instead created a lot of historical inaccuracies in its AI images
Explain why companies using AI in media production could result in boring and biased products
Script-writing AI is used in movies to guide the script writers and help finish the story
_____________________________________________________________________________________
References
Blog post: words
Target audience: Young adults and teens interested in media and the introduction of AI into the industry.
Title: Is AI Taking over the media industry?
Over recent years, AI tools have been increasingly implemented in the development and pre-development of many multimedia products. From storyboarding to script writing, and even music, big conglomerates have utilised these generative AI tools in an attempt to save time and money, leaving the workers in these creative fields struggling to pay rent. With all its controversies, public opinion of generative AI tools such as OpenAI's DALL-E is not looking good, especially in the creative media industry, where biased data and "stereotypical cliches" (Tiku, Schaul, Yu Chen, 2023) seem to slip through the cracks of AI-generated content. On top of this, recent developments in AI such as Sora have left filmmakers worried about their creative freedom when writing and producing a product, as professional-looking, realistic videos can now be generated by AI via a simple prompt. So, is AI taking over the media industry, and will your favourite artists and franchises adopt these tools in their work? Or will it shift the direction of content creation for the better?
There are three features of artificial intelligence generated content (AIGC) that I will examine in my critique of the rise in AIGC in media production.
1. The unethical practices in media production.
Since its inception, the idea of AIGC has seemed unattractive and unethical to most people. Fans of movies, video games, music, and shows want to consume media that isn't created by an automated bot, one that is replacing the jobs of the screenwriters and artists who produce the pre-production documents that get a project off the ground. Introducing AI roles into a company, such as "Netflix's AI product manager, with a £900,000 yearly salary" (Code, 2023), leaves a bad taste in the audience's mouth; many would rather not consume content that encourages the unethical business practices that a lot of companies are adopting.
2. It just doesn't work!
With how biased these AIGC tools are, how can they possibly be used to create content directed at a wide demographic of people? When AI tools have a "lack of diversity among those inputting the training data" (Shamim, 2024), they create biased and racist imagery that distorts facial features when trying to generate images of people of colour. However, even when AIGC tools are given more data about people of colour, such as Google's Gemini, they can generate historically inaccurate images that even depict "Nazi-era German soldiers as people of colour" (Robertson, 2024). So until a true balance is found that can represent different cultures without racial stereotypes or historical inaccuracies, should these tools really be used to create media that depicts different social groups or appeals to different demographics? Alongside this, many AIGC tools have proven inconsistent in quality, producing anatomically incorrect faces and hands that just don't look right. You don't even need a keen eye to spot these mistakes; they are often blatantly obvious to the average viewer. Take Nicki Minaj, who used AIGC tools to "churn out boring AI images to promote their newest release" (Yalcinkaya, 2024), 'Gag City', which disappointed many fans and spawned some hilarious reposts and memes.
3. The death of innovation.
Finally, this model of content creation very rarely inspires change, as the data being used can be repetitive and predictable. This makes the generated content seem safe and boring: the lack of human input means that unique and creative ideas are put aside in favour of something that can be marketed to the largest possible audience. This applies to scripts, artwork, and character designs, the areas of production that offer the most opportunities for unique ideas and innovation. Ever consumed a media product and thought, "This seems very familiar"? That's why!
What even is Artificial intelligence generated content?
Generative AI is a relatively new technology that can be incorporated into work to make it easier and faster for employees and employers. AIGC tools are fed data from the internet in order to create "brand new content - based on the training that it is fed" (Reuters, 2023), making them different from your average AI tool that only "categorises or identifies data" (Reuters, 2023).
The extent of these tools' capabilities is constantly evolving, and they have already been used by major artists and conglomerates as a way to create imagery and art while saving time and money.
Take the music industry for example!
AIGC Drake music, created by someone using the 'Jammable tool' (Lopez, 2023), was circulating online in early 2023. Now, the real Drake has utilised the same AIGC tool in his own music to capture both Snoop Dogg's and Tupac's likenesses on one of his most recent tracks, without spending the time or money to get the real thing. Similarly, artists such as Nicki Minaj, Lil Yachty, and others have used AIGC tools to create the artwork released for their music, completely cutting out the very talented artists who would create unique imagery to expand on the musician's work.
Why is this bad for YOU?
While these AIGC tools can be useful for indie artists with a small following and budget for their album releases, massive artists using them to save money and replace the talented artists who help bring the vision of an album to life is simply unethical. Buying an album digitally or physically supports everyone involved in the creation of the project, and people may feel hesitant about buying an album when they know the art was created by an artificial intelligence instead of a hardworking artist.
While on the topic of AIGC album art, a lot of it simply looks bad and is very visibly AI art. In cases such as Lil Yachty's artwork for his album 'Let's Start Here', the obvious AI art is used to tell a story that relates to the music. However, many of Nicki Minaj's most recent releases feature artwork that is "soulless and hyper-realistic", looking like it came straight from a "text to image generator" (Yalcinkaya, 2024).
Similarly, many AIGC tools have been created that allow users to create music completely from scratch, eliminating the need to learn the musical or digital skills that come with creating music in the modern age. Tools like this seem like a great starting point for the average user who wants to experiment. However, conglomerate companies have already begun to use these tools for media production.
"In 2024, writers and studios will likely begin to experiment with artificial intelligence to determine the capabilities of large language models and possible best practices for incorporating them in the screenwriting process." (Shomer, 2024)
I believe that this will eventually make all content seem safe and boring to the average viewer. Imagine if Drake's next album, or the next Marvel movie, were written entirely using these AI tools. Media creation could completely die out if AIGC tools are used only for monetary gain.