why tho? it's just an alternate alphabet/set of symbols.
Because it's generally expected that models only work 'in distribution', i.e. they work on stuff similar to what they have previously seen.
They almost certainly have never seen regular conversations in Base64 in their training set, so it's weird that it 'just works'.
Does that make sense?
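To see why Base64 text is so far out of distribution, here's a quick sketch using Python's standard `base64` module (the example string is just for illustration):

```python
import base64

# Base64 re-encodes the raw bytes, 3 bytes at a time, into 4 characters
# drawn from a 64-symbol alphabet (A-Z, a-z, 0-9, +, /).
msg = "Does that make sense?"
encoded = base64.b64encode(msg.encode()).decode()
print(encoded)  # RG9lcyB0aGF0IG1ha2Ugc2Vuc2U/
```

Note the output has no word boundaries and no stable mapping from characters to letters — even the spaces get mixed into neighboring characters — so it tokenizes nothing like English text.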