Not the same. The more effort you put into CGI the more invisible it becomes. But you can’t prompt your way out of hallucinations and other AI artifacts. AI is a completely different technology from CGI. There is no equivalence between them.
I think they are referring to statements that they have "solved" hallucinations and it won't be a problem anymore (which it obviously isn't yet anyway).
My guess is that post-training has gotten a lot better in the last couple of years, and what people are attributing to better models is actually just traditional (non-LLM) models placed on top of the LLM, which makes it appear that the model has increased in quality (including seemingly fewer hallucinations).
If this is the case, it would be observable with different prompting strategies: you'd see it when you find a prompt which puts more weight on the post-training models.
The story is that I was getting into a new genre of music, namely Japanese City Pop from the 1980s. I was totally unfamiliar with the genre and started listening to it on YouTube. I found one playlist, which I listened to a lot, thinking: “wow, this is very formulaic, and the lyrics are very generic,” but I kind of thought that was just how the genre went. I had finally planned to use it during a small local event, but when I went to find out who the artists were, I embarrassingly found out it was all AI generated.
Thing is, in this instance I knew nothing of the source material. When I went to get actual songs, written by actual people, the difference was stark. Now, 8 months later, I would be able to recognize AI-generated City Pop in an instant. This experience kind of felt like I had been scammed, like my ignorance of the genre had been taken advantage of. It was not pleasant.
I had a very similar experience, looking for music to play during D&D sessions. Not paying close attention to the music, it seemed like it fit the bill. Once I started listening more closely, there were lots of issues that became readily apparent.
My dad has also started sharing links with me on Facebook to pop songs that have been re-arranged in different genres. This was a big area of fun for a number of folks in my family several years ago, as we discovered YouTube artists like Chase Holfelder who put significant effort into making very high quality rearrangements. But I kept noticing these weird issues in the new songs.
I've gotten to where I can identify an AI-generated song almost immediately: there's a weird, high-frequency hiss in the mix that sounds like heavy noise overcoming compression artifacts, even though the source it's supposedly coming from should be clean. There's a general lack of enthusiasm in the vocals and a boring, nonsensical progression to the lyrics on original arrangements. Sometimes the person generating the song tries to hide that last issue by generating instrumentals only, or they use one of those try-too-hard-to-sound-badass Country Rock genres that are popular on TikTok to stick on top of clips from the TV show Yellowstone (WTF is with that?!). But then when I check the details, there's obviously AI-generated cover art for artists I've never heard of. The accounts will be anthologies full of artists that have never existed.
So, I know people keep parroting "a good artist can use any tool". But I've yet to see it. All this "democratizing art" (I didn't know anyone was gatekeeping it to begin with, and I certainly have not seen any lack of talent online in several years) doesn't seem to be producing results. It becomes pretty obvious very quickly that it's all just a pump-and-dump scheme to Get Them Clicks.
You don't understand. I mean content that even now, you don't know it is AI.
Obviously you think the AI content that you can identify is bad. But there is content you've encountered that you think is good and not AI-generated, that actually is AI-generated.
This sounds dangerously close to a No True Scotsman argument. Any example one could provide, you've teed it up nicely to claim that no, you didn't mean that one, obviously, because you could tell. No, it's some other thing that you haven't found yet: that's the AI that passes.
I think it is worse than a No True Scotsman. I think your parent actually made a category mistake here. Survivorship bias does not apply: whether or not I notice or even unknowingly enjoy AI-generated content is not in the same category as how much I notice or enjoy CGI.
The difference is in the authorship. Actual work and skill go into CGI, people generally notice bad CGI, and it generally affects how you judge the art. Sometimes CGI is actually part of the art and you are supposed to notice it, and it is still good (think of how Cher used Auto-Tune in “Believe”). There is no such equivalence with AI.
To further elaborate: bad CGI is often (but not always) used as a cost-cutting measure. Directors (or producers who pressure directors) use it when they want to save money on practical effects, or even to cover up mistakes that happened during shooting and avoid an expensive re-shoot. This can work OK if used sparingly and carefully; however, if it is done a lot and without the needed care, you will notice it, and you will judge the work for it. AI content is kind of like that, except that's kind of all AI is: the author couldn't be bothered to do the work and just prompted an AI to do it for them.
To summarize: AI is not like CGI in general; it is much closer to a strict subset of CGI, the subset that only includes bad CGI.
Same thing is true of AI output.