At least it wrote a song, instead of stably diffusing static into entire tracks from its training data. I can take those uninteresting notes, plug them into a DAW, and build something worthwhile. I can only do that with Suno-generated stems after much faffing about with transposing and fixing rhythms, because Suno doesn't know how to write music; it just creates waveforms.
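To make that concrete, here's a minimal sketch of what "transposing" means when you actually have notes to work with. It assumes the generated part has been exported as a MIDI file and uses the mido library; the file names and interval are illustrative, not a record of my actual workflow.

```python
# Transposing symbolic notes is arithmetic, not signal processing.
# Doing the same to a rendered waveform means pitch-shifting audio
# and hoping the artifacts are tolerable.
import mido

SEMITONES = 3  # shift everything up a minor third (illustrative choice)

mid = mido.MidiFile("input.mid")  # hypothetical export of the generated notes
transposed = mido.MidiFile(ticks_per_beat=mid.ticks_per_beat)

for track in mid.tracks:
    new_track = mido.MidiTrack()
    for msg in track:
        if msg.type in ("note_on", "note_off"):
            # Note numbers are plain integers, so we just add and clamp to 0-127.
            msg = msg.copy(note=min(127, max(0, msg.note + SEMITONES)))
        new_track.append(msg)
    transposed.tracks.append(new_track)

transposed.save("transposed.mid")
```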
AI tools are decent at helping with code because they're editing language in a context. AI tools are terrible at helping with art because they operate at entirely the wrong abstraction layer (in this case, waveforms) instead of the languages humans use to create art, and it's supremely difficult to add to that context without destroying it.