Same here. I have, however, seen a few out-of-memory cases in the past when given large input files.
It's not the focus and not very performant, but you can have it spill to disk if you run out of memory. I wouldn't suggest building a solution around that, though; the sweet spot is data that fits in memory.
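For anyone curious what "spill to disk" means in practice: here's a minimal sketch of the general pattern, not the tool's actual mechanism. The budget, function name, and record shape are all made up for illustration.

```python
import tempfile

# Arbitrary threshold for this sketch; real tools tune this dynamically.
MEMORY_BUDGET = 1024 * 1024  # 1 MiB

def spill_to_disk(records):
    """Buffer byte records in memory; past the budget, flush the buffer
    to an anonymous temp file and keep going. Returns the in-memory tail
    plus the list of spilled file handles (rewound for reading)."""
    buffer, buffered_bytes, spills = [], 0, []
    for rec in records:
        buffer.append(rec)
        buffered_bytes += len(rec)
        if buffered_bytes > MEMORY_BUDGET:
            f = tempfile.TemporaryFile()
            f.write(b"".join(buffer))
            f.seek(0)
            spills.append(f)
            buffer, buffered_bytes = [], 0
    return buffer, spills
```

The trade-off is exactly the one mentioned above: the disk round-trip keeps you from crashing, but every spilled chunk costs a write and a later read, so throughput drops sharply once you exceed the budget.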
Really? How large? I’ve only managed to crash it with hundreds or thousands of files so far, but I haven’t had many huge files to deal with.