Here Are Five Tips on How to Tell If a Song Is Real or A.I.
In the current age of technology, advancements are happening at a rapid rate and affecting a multitude of industries, hip-hop included. So much so that it’s unclear whether the rap songs coming out are actually artist-made or created via artificial intelligence, better known as A.I. The landscape and craftsmanship of music are facing an evolution that could be for better or worse, a question that has yet to be answered and will likely linger for some time. Who would have ever thought skepticism would become an innate response to newly released music, or that some sort of verification would be required to listen to songs with ease? Rap fans can navigate this uncharted territory with five ways to tell if a song is real or A.I., courtesy of an expert who knows the landscape.
Before getting into how to tell the difference, there are some dots to connect. Esteemed UMG Chairman and CEO Lucian Grainge, a pillar in the music industry for decades, has witnessed the many transitions within the music space: CDs to streaming and now A.I. Back in January, Grainge shared his thoughts on the latest music-related development with The New Yorker, expressing that “technology has served the industry very well.” However, the longtime music executive also supports the U.S. Congress’ No A.I. Fraud Act, saying the “legislation is critical” because unauthorized A.I. impersonation could be viewed “as a form of identity theft.”
The industry appears to be on both sides of the coin with this one. The CEO of the Recording Academy declared at the Fast Company Innovation Festival 2023 that “A.I. creativity” won’t be awarded. The Recording Academy also met with the U.S. Copyright Office last year to begin discussing the measures that should be taken to protect music creators.
As it pertains to rap specifically, A.I. is single-handedly altering the hip-hop experience. It’s become exceedingly difficult to decipher which songs are real and which are fake. XXL speaks to A.I. expert Daniel Anstandig, CEO and cofounder of Futuri Media, to get to the bottom of how to determine the authenticity of a track. He explains that “detecting A.I.-generated anything (even hip-hop diss songs) is challenging because of how good the technology has become.”
Anstandig does admit that much of the music heard today incorporates A.I. elements in some way. “Also, the line between A.I. and human creation is blurred in modern music production,” he shares. “Nearly all producers utilize DAWs that incorporate A.I.-assisted plugins and processing. Even in songs we might label as entirely ‘human-made,’ there are likely elements that have been shaped or enhanced by A.I. We’re gradually merging with the tech, so it’s challenging to untangle the human from the machine in the creative process.”
There have been rap songs circulating on the internet over the last year that use A.I. to mimic the likenesses of some of rap’s biggest names: Drake, Ye, Lil Wayne, Future and, most recently, Tupac Shakur and Snoop Dogg, the latter seemingly at the hands of Drizzy. Tracks that actually sound like they could be legit lose their validity because of A.I. and should be taken with a grain of salt, especially if the actual artists don’t promote the record themselves.
A perfect example would be the unconfirmed “leaked” Drake diss track aimed at Kendrick Lamar and others in April that people thought was artificially generated. It wasn’t until Drizzy stamped the track with posts in his Instagram Stories and then officially released it on streaming services days later that rap fans knew it was legit. The song, later titled “Push Ups,” is made up of a series of jabs in response to Future, Metro Boomin and Kendrick Lamar’s “Like That” record, which was released weeks prior on Future and Metro’s We Don’t Trust You album.
Then, there was the alleged Kendrick Lamar diss response that rap fans made a similar assumption about until his label confirmed that the record was created via A.I. The presumed Kendrick diss came shortly after Drake launched his fiery missile, “Push Ups,” targeting not only Kendrick, but Future, Metro, Rick Ross and The Weeknd. Following “Push Ups,” Drizzy took a unique approach that some praised and others found perplexing when he delivered “Taylor Made Freestyle” via an Instagram post on April 19, which incorporated A.I. verses from Tupac Shakur and Snoop Dogg.
The golden question is: how does one determine what’s real and what’s fake when it comes to A.I.? Daniel Anstandig offers XXL five tips to tell whether a song is legit or whether technology is playing a major role in this newfound musical deception.
1. Listen for generic or overly common lyrics. Is there a unique style and surprise element?
2. Pay attention to repetitive patterns compared to the artist’s original content. A.I.-generated lyrics—at least when GPT is used—tend to be packed with cliches.
3. Is there emotional depth and authenticity in the performance?
4. Are there subtle imperfections that feel human? This is hard, though, because even A.I. systems are baking in subtle imperfections to “feel more human.”
5. As for A.I. detection services, take them with a grain of salt. While some use sophisticated algorithms, they’re notoriously unreliable.