It is not true that coal is more radioactive than spent nuclear fuel (SNF). It's very much the opposite: SNF is roughly 10^11 times more radioactive than coal per kilogram, or roughly 10^6 times more radioactive per unit of energy produced.
Per the EPA, US coal has, at the high end, about 10^3 becquerels per kilogram (Bq/kg) of natural radioactivity [0].
Spent nuclear fuel has 3 million curies per tonne (for 33 MWd/kg burnup fuel at the age of 1 year) [1], which works out to roughly 10^14 Bq/kg, i.e. about 10^11 times coal's activity per kilogram. Since 33 MWd/kg is an energy density roughly 10^5 times greater than that of coal (on the order of 24 MJ/kg, or about 3x10^-4 MWd/kg), the normalized ratio of [radioactivity]/[energy] comes out to about 10^6.
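For anyone who wants to sanity-check the arithmetic, here's a quick back-of-the-envelope sketch in Python. The coal energy density of ~24 MJ/kg is my own assumption for typical bituminous coal; the other figures are taken from [0] and [1]:

    # Back-of-the-envelope check of the ratios above.
    CI_TO_BQ = 3.7e10                      # 1 curie = 3.7e10 becquerels
    MJ_PER_MWD = 8.64e4                    # 1 MWd = 86,400 MJ

    coal_activity = 1e3                    # Bq/kg, EPA high-end figure [0]
    snf_activity = 3e6 * CI_TO_BQ / 1e3    # 3 MCi/tonne -> ~1.1e14 Bq/kg [1]

    coal_energy = 24.0 / MJ_PER_MWD        # ~24 MJ/kg -> ~2.8e-4 MWd/kg (assumed)
    snf_energy = 33.0                      # MWd/kg burnup [1]

    per_kg = snf_activity / coal_activity              # ~1e11
    per_energy = per_kg / (snf_energy / coal_energy)   # ~1e6

    print(f"SNF vs. coal, per kilogram:    {per_kg:.1e}")
    print(f"SNF vs. coal, per unit energy: {per_energy:.1e}")

Running this gives about 1.1e11 per kilogram and about 9e5 per unit of energy, consistent with the round numbers above.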
The graph in [1] shows the decay of SNF activity on a log-log scale; the activity falls to the same level as coal's (again, normalized by energy output) after about a million years.
I'm fairly confident I know the origin of this pseudofact that's popular on social media. It's this poorly titled Scientific American article from 2007 [2], which is about the (negligible) amount of radioactivity that nuclear plants release into the environment in the course of routine operation. It is *not* about spent fuel. The article makes a fair point, though a nuanced one that is easy to grossly misunderstand: coal power plants throw all of their pollution into the environment in routine operation, while nuclear plants, by default, contain theirs.
[0] https://www.epa.gov/radiation/tenorm-coal-combustion-residua... ("TENORM: Coal Combustion Residuals")
[1] https://www.researchgate.net/figure/n-situ-radioactivity-for... ("Impact of High Burnup on PWR Spent Fuel Characteristics" (2005))
[2] https://www.scientificamerican.com/article/coal-ash-is-more-... ("Coal Ash Is More Radioactive Than Nuclear Waste [sic]" (2007))
Sure, the spent fuel is considerably more radioactive per kilogram, but how many kilograms of coal does a typical plant burn in a decade, versus how many kilograms of nuclear fuel are spent?