This is quite wrong. If an algorithm requires one memory read, and reading one byte from memory takes (size of data)^(1/3) time steps, then the complexity of that algorithm is O(N^(1/3)), not O(1).
This is a well-known issue in large-number arithmetic. For example, we typically treat a+a as an operation that takes O(1) time, because we are implicitly limiting the analysis to small (machine-word-sized) numbers, even if we have a lot of them. But if the numbers can be arbitrarily large, a+a doesn't take O(1) time, it takes O(log a) time. So an algorithm that performs O(N) additions doesn't take O(N) time, it takes O(N log K) time, where K bounds the magnitude of the numbers involved. And if we are doing something like adding the first N numbers, the complexity is actually O(N log N), not O(N).
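To make that concrete, here's a minimal Python sketch (a rough illustration, not a careful benchmark; the bit sizes and iteration count are arbitrary) showing that adding arbitrary-precision integers costs time roughly proportional to their bit length, i.e. O(log a) in the value:

```python
# Rough illustration: a + a is not O(1) once the numbers no longer fit
# in a machine word. Each doubling of the bit length roughly doubles
# the time for the same number of additions.
import time

for bits in (1_000_000, 2_000_000, 4_000_000, 8_000_000):
    a = (1 << bits) - 1              # an integer with `bits` one-bits
    start = time.perf_counter()
    for _ in range(1_000):
        _ = a + a                    # cost grows roughly linearly with `bits`
    elapsed = time.perf_counter() - start
    print(f"{bits:>10} bits: {elapsed:.4f}s for 1000 additions")
```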
The same thing happens with memory: if we assume that the problem will fit into very fast memory even at the largest sizes, we can indeed approximate memory access as O(1). But if we accept that the more memory we need to store the problem, the slower each access gets, then we need to model the cost of a memory access as a function of problem size - and the author is proposing O(N^(1/3)) as that cost. Under that model, our algorithm for adding the first N numbers needs O(N * N^(1/3)) = O(N^(4/3)) time to read N numbers from memory plus O(N log N) time for the additions, for a total complexity of O(N^(4/3) + N log N) = O(N^(4/3)).
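Under that model the read term dominates fairly quickly. A small Python sketch of the arithmetic, using the O(N^(1/3)) per-access cost proposed above and assuming all constant factors are 1 (it just tallies the two terms, it doesn't measure real hardware):

```python
# Compare the two cost terms from the analysis above as N grows:
#   reads:     N accesses, each costing N^(1/3)  ->  N^(4/3)
#   additions: N additions of O(log N)-bit numbers -> N log N
import math

def model_cost(n: int) -> tuple[float, float]:
    read_cost = n * n ** (1 / 3)
    add_cost = n * math.log2(n)
    return read_cost, add_cost

for n in (10**6, 10**9, 10**12, 10**15):
    reads, adds = model_cost(n)
    print(f"N = {n:>16,}: reads ~ {reads:.3e}, additions ~ {adds:.3e}, "
          f"ratio ~ {reads / adds:.1f}")
```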
Now, this type of analysis is probably not useful if we expect all our numbers to fit into a single computer's memory. But if we are trying to figure out, say, how our algorithm will scale from gigabytes of data to hundreds of petabytes, we need to model the rising cost of memory access as the data set grows. At some point, as N grows toward infinity, even just computing the address of the next byte becomes a complex operation that can take longer than it took to add the first 100 GB of numbers.
O(1) simply means that there's a constant upper bound on the algorithm's running time as n -> ∞.
In English: there is some fixed worst-case time to run the algorithm, and no matter how much larger `n` gets, the running time will never exceed it.
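For reference, that's a paraphrase of the standard formal definition of constant-bounded asymptotic growth (nothing specific to the parent comment), stated for a running time f(n):

```latex
% f(n) = O(1): the running time is bounded by some fixed constant C
% for all sufficiently large n.
f(n) = O(1) \iff \exists\, C > 0,\ \exists\, n_0 \ \text{such that}\ \forall n \ge n_0:\ f(n) \le C
```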