I wasn't specifically referring to the more "sane" people expressing doubts regarding AI.
Hinton, at least, says that other issues in AI should be dealt with; rather than just being an AI doomer who only fears AI takeover, he actually recognizes that there are other current problems as well.
At this point, how many times should we have been dead, according to Eliezer?
Like almost all the other doomers, Eliezer never claimed to know which generation of AIs would undergo a sudden increase in capability resulting in our extinction or some other doom; he never said anything more specific than that it would probably happen sometime in the next few decades unless the AI project is stopped.