If the AI hasn't specifically learned about SeqTracks as part of its training, it's not going to give you useful answers. AI is not a crystal ball.
The problem is its inability to say "I don't know". As soon as you reach the limits of the model's knowledge, it will readily start fabricating answers.