I think familiarity is a major factor, but the lower frame rate and slower shutter speed also create motion blur, which makes it easier for the film to look realistic since the details get blurred away. I remember when The Hobbit came out at 48 fps and people complained that the increased clarity made it look obviously fake, like watching a filmed play instead of a movie.
I went out of my way to see The Hobbit in both 24 and 48 fps when it came out, and weirdly liked 48 better. It was strange to behold, but felt like the sort of thing that would be worth getting used to. What I didn't like was the color grading. They clearly didn't have enough time to get all the new Red tech right.
> I remember when The Hobbit came out at 48 fps and people were complaining about how the increased clarity made it look obviously fake, like watching a filmed play instead of a movie.
Curiously, I can already get into this mindset with 24fps videos, and I much, much prefer the clarity of motion that 48fps offers. All the complaining annoyed me, honestly. It reminds me of people complaining about "not being able to see things in dark scenes," which completely hampers the filmmaker's ability to exploit high dynamic range.
Tbf, in both cases the consumer hardware can play a role in making these look bad.