So one of the interesting comparisons between quantum and classical computing in the video: 5 minutes vs 10^25 years. Are there tradeoffs, or specific cases where quantum computing works, or is this generic across "all" computing use cases? If the latter, this would change everything and change the world.
It is specific to cases where a quantum algorithm exists that provides a speedup; it is not at all generic. The complexity class of interest is BQP: https://en.wikipedia.org/wiki/BQP
Also of note: P is in BQP, but it is not proven that BQP != P. Some problems, like factoring, have a known polynomial-time quantum algorithm (Shor's), while the best known classical algorithm is super-polynomial, which is where you see these massive speedups. But we don't know that there isn't a polynomial-time classical factoring algorithm that we simply haven't discovered yet. It is a (widely believed) conjecture that there are hard problems in BQP that are outside P.
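To make the classical cost concrete, here is a toy sketch of factoring by trial division. This is far from the best classical method (the general number field sieve is much faster, though still super-polynomial), but it shows the basic problem: the work grows with sqrt(n), which is exponential in the number of bits of n.

```python
import math

def trial_division(n):
    """Factor n by trial division: up to ~sqrt(n) divisions,
    i.e. roughly 2^(bits/2) steps -- exponential in the bit-length."""
    for d in range(2, math.isqrt(n) + 1):
        if n % d == 0:
            return d, n // d
    return n, 1  # n is prime

print(trial_division(3127))  # → (53, 59)
```

Doubling the bit-length of n squares the number of candidate divisors, which is why even modest RSA moduli are far out of reach for naive classical search.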
It's for a very, very, very narrow set of algorithms, AFAIUI.
Only certain kinds of computing tasks are amenable to an exponential speedup from quantum computing. For many classical algorithms, the best a quantum computer offers is a quadratic speedup via Grover's algorithm: searching an unstructured list of N items in O(sqrt(N)) queries instead of O(N).
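The sqrt(N) behavior is easy to see in a toy classical simulation of Grover's algorithm (simulating the state vector directly, so this gives no actual speedup; it just shows the amplitude amplification). For N = 8 with one marked item, ~pi/4 * sqrt(8) ≈ 2 iterations concentrate most of the probability on the marked index:

```python
import numpy as np

N = 8
marked = 5  # index of the item the oracle recognizes

# Start in the uniform superposition over all N basis states.
state = np.full(N, 1 / np.sqrt(N))

iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))  # 2 for N = 8
for _ in range(iterations):
    state[marked] *= -1           # oracle: flip the sign of the marked amplitude
    state = 2 * state.mean() - state  # diffusion: inversion about the mean

prob = state[marked] ** 2
print(prob)  # → 0.9453125
```

Measuring now returns the marked item with ~95% probability after only 2 oracle calls, versus an expected N/2 = 4 lookups for classical linear search; the gap widens as sqrt(N) for larger N.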
The other tradeoff is that quantum computers are much noisier than classical computers. The error rate of classical hardware is so low that most programmers can go their entire career without ever considering it. But you can see from the figures in this post that even on a state-of-the-art chip, the error rates are of order ~0.03--0.3%. Hopefully this will come down over time, but it's going to be a non-negligible aspect of quantum computing for the foreseeable future.