Of the two, I think Adobe is more responsible for the decline of Flash. Even if smartphones had never entered the picture, laptops (where efficiency is important) were quickly becoming the most common form of PC, which would've eventually made Flash as it existed under Adobe untenable as well. Smartphones just accelerated the timeline.
Honestly, I can't understand the mental calculus that went on in the heads of Adobe execs at the time. Yes, cleaning up the ball of mud that the Flash codebase had become and making it less battery hungry wouldn't have been an easy task, but it would've futureproofed it significantly. Instead they decided to keep tacking on new features, which turned out to be entirely the wrong decision.
EDIT: The constant stream of zero-days certainly didn't help things either. A rewrite would've been worthwhile if only to get a handle on that.
Flash was not particularly battery hungry (my go-to example when HTML5 demos started coming out was rebuilding an HTML5 demo that was pegging one core at 100% into a Flash app that used 5%).
The reason it burned CPU cycles is that non-coders could build programs with it, and in doing so they would produce the world's worst code that nonetheless "worked". The runtime itself was fine (efficiency-wise; not in all the other respects).
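To make the point concrete, here's a minimal sketch (in TypeScript, as a browser-canvas analogy to the old onEnterFrame habit; the names drawScene, stage, and setLabel are hypothetical, not from any real project) of the kind of pattern that burns CPU for no reason: repainting a static scene every frame instead of only when something changes. The runtime is faithfully executing both loops; one of them just does sixty times more work than necessary.

    // Grab a canvas and its 2D context (assumes an element with id="stage").
    const canvas = document.getElementById("stage") as HTMLCanvasElement;
    const ctx = canvas.getContext("2d")!;

    let label = "Hello";
    let dirty = true; // set whenever something on screen actually changes

    function drawScene(): void {
      ctx.clearRect(0, 0, canvas.width, canvas.height);
      ctx.fillText(label, 20, 20);
    }

    // Wasteful: the equivalent of an onEnterFrame handler that repaints a
    // static screen every frame, keeping a core busy doing nothing useful.
    function wastefulLoop(): void {
      drawScene();
      requestAnimationFrame(wastefulLoop);
    }

    // Frugal: only repaint on frames where something actually changed.
    function frugalLoop(): void {
      if (dirty) {
        drawScene();
        dirty = false;
      }
      requestAnimationFrame(frugalLoop);
    }

    // Anything that changes the scene marks it dirty.
    function setLabel(text: string): void {
      label = text;
      dirty = true;
    }

    frugalLoop();

The difference between the two loops is authoring discipline, not runtime quality, which is the distinction the parent comment is drawing.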
I think Apple is more responsible. One of Flash's chief benefits to the customers who paid the big bucks was that it 'just worked' everywhere. Once Apple refused to support Flash on the iPhone, that story was a lot less attractive.
The bugs were definitely Adobe's fault: as with most tech companies, they were far more interested in expanding the feature set than they were in fixing the bugs and stabilizing the platform.