For more than three decades on the air with Davis Broadcasting Inc., a 100% Black-owned media company based in Columbus, Georgia, I’ve had a front-row seat to Black music—its genius, its evolution and its power. Some might call me a connoisseur. I simply call myself a witness. And what I’m seeing now, in the age of artificial intelligence, feels disturbingly familiar.
AI has “changed the game,” but not necessarily in a way that benefits the people who built the game. Technology now allows non-singers, non-musicians—and potentially people with no cultural connection to Black music—to generate songs using Black voices, Black styles and Black soul. The fear isn’t innovation; it’s exploitation. We’ve seen this movie before.
That concern was laid bare recently by syndicated radio personality Frank Ski, who offered a blunt warning about where this trend is headed. Ski pointed to the rise of AI-generated acts such as Xania Monet, a so-called "Black-generated" artist who has already landed on Billboard's radio airplay charts despite not being a real person.

According to Ski, the implications are alarming. AI makes it possible for people outside the Black community to generate music using Black voices and aesthetics, effectively cutting Black artists out of both the process and the profits. The industry risk is equally troubling. Record labels could use AI to manufacture new "artists" while devaluing the catalogs of real ones, particularly as legendary performers sell off their publishing rights. In recent years, rights to the music of icons such as Patti LaBelle and Prince have been sold, raising a critical question: What happens when AI is turned loose on those catalogs?
Ski explained the mechanics plainly. A poet from Mississippi, Telisha "Nikki" Jones, entered her lyrics into an AI music generator. The app produced a song, then generated an image for a fictional artist. That artist charted. A multimillion-dollar deal followed. When Gayle King pointed out that the "artist" couldn't actually sing, the response was chillingly casual: the AI persona was treated as an extension of the creator. In other words, the voice mattered, but not the lived experience behind it.
That is why this moment matters.
Black music has never been just entertainment. It has been the voice and fabric of our community since slavery—born in spirituals and gospel, carried through jazz, blues, soul, funk and hip-hop. It helped fund and fuel the Civil Rights Movement through artists like Louis Armstrong and Harry Belafonte, and it shaped Black pride and economic independence through James Brown and institutions like Motown Records.
Music is also one of the Black community’s most powerful cultural and economic engines. Now, with nothing more than a laptop and an algorithm, that engine can be mimicked, mass-produced and monetized—without the community that created it.
In the 1950s, white artists and labels stole Black music outright, re-recording songs and reaping the rewards while Black musicians were left behind. Today, the threat isn’t cover songs—it’s code. Different decade. Same danger.
The question before us is not whether AI will be used in music—it already is. The real question is whether Black artists, Black communities and Black-owned institutions will have ownership, protections and a seat at the table as this technology reshapes the industry.
If history teaches us anything, it’s this: when we don’t guard our culture, someone else will gladly monetize it.