Tokenizing text is a ridiculously small part of the overall computation that goes into serving a request. That said, if you're doing this on petabytes of data, it never hurts to have something faster.
A language that isn’t memory-safe can definitely hurt. AI needs more security, not less.