
Why are movies shot in 24fps?

Most movies I watch are in 24fps. They work fine on my 60Hz TV, which makes me wonder: why do filmmakers even shoot movies in 24fps? Wouldn't the movie feel much smoother if they recorded at 60fps, or even better at maybe 120fps? These movies run fine on my TV, but once I download some documentary or series that is 25fps, the picture stutters, even though my TV has built-in options to 'reduce jitter', a special 24fps mode, etc. But why shoot movies in 24fps at all, in this digital age?
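To show what I mean by 'stutter', here's a rough sketch of my (simplified) mental model in Python: a panel locked to 60Hz can only show each source frame for a whole number of refreshes. 24fps maps onto 60Hz as a steady 3:2 repeat pattern, but 25fps can't, so its pattern comes out uneven:

    # How many 60 Hz refreshes does each source frame occupy?
    # (Simplified model of a fixed-60Hz panel that repeats whole frames;
    # real TVs also interpolate or switch to a native 24 Hz mode.)
    def cadence(source_fps, display_hz=60, n_frames=10):
        return [(i + 1) * display_hz // source_fps - i * display_hz // source_fps
                for i in range(n_frames)]

    print(cadence(24))  # [2, 3, 2, 3, 2, 3, ...] steady 3:2 pulldown, looks even
    print(cadence(25))  # [2, 2, 3, 2, 3, 2, 2, 3, 2, 3] uneven pattern = judder
    print(cadence(30))  # [2, 2, 2, 2, ...] perfectly regular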

2 Answers

  • 2 months ago

    Way back in the olden days (think black & white silent movies), movies were shot at between 16 and 20 frames per second, but film projectors standardized on playing back at 24 fps, so that's the speed theaters showed them at. This is why those old movies appear to play back faster than normal.

    Audiences accepted this because they weren't expecting movies to look like real life. When talkies came along, people didn't accept the audio being played back 20% faster (playing footage shot at 20 fps back at 24 fps is a 20% speed-up), so studios began shooting at 24 fps so that what's seen in the movie would match what's being 'said' on the audio track.

    When you play back much below 20 fps, audiences start to notice the individual frames. The cost of film stock put a cap on going much faster, though.

    Nowadays with digital cameras, the cost barrier to higher frame rates is mostly gone. When Peter Jackson filmed The Hobbit, rather than shooting at 24fps he shot at 48fps, now called HFR (high frame rate). This was met with mixed reviews, so it's unknown how well it will catch on.

    Unless your television is an old picture-tube type, it's not limited to 60Hz. LCD, OLED, DLP, and plasma televisions can handle most any frame rate you throw at them (limited mostly by the processor speed inside the television). Even the old tube-style televisions weren't 60 frames per second, but rather 60 fields per second, where two fields (the odd and even scan lines) combine into a single frame, giving you 30 frames per second. This is what is now called 480i ('i' for interlaced).
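    As a toy illustration of that field weaving (a made-up 4x4 'image', not any real video API), here's how two half-height fields recombine into one full frame:

        import numpy as np

        # Two half-height fields (even lines, then odd lines, captured
        # 1/60 s apart) weave into one full-resolution frame, so
        # 60 fields/s becomes 30 frames/s -- i.e. 480i.
        height, width = 4, 4
        frame = np.arange(height * width).reshape(height, width)  # pretend image

        even_field = frame[0::2]  # lines 0 and 2
        odd_field = frame[1::2]   # lines 1 and 3

        woven = np.empty_like(frame)
        woven[0::2] = even_field
        woven[1::2] = odd_field
        assert (woven == frame).all()  # the two fields rebuild the full frame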

  • Parky
    Lv 7
    2 months ago

    24fps is simply what people are used to visually, and getting people to change is hard.

    24fps was arrived at as a compromise. Back in the early days of film, when movie cameras were hand-cranked, film stock was expensive. 24fps was found to be a good balance between not using too much stock and maintaining the illusion of movement. It became a standard and has remained one ever since. In this digital era film stock is no longer an issue, but people still visually associate 24fps with the feel of film. There have been attempts to increase it: The Hobbit movies were released at 48fps. The effect was more realistic, but many people thought it gave the films a video look, like a news or sports broadcast. Ang Lee's recent Gemini Man was shot at 120fps.

    Audiences don't seem to be responding to these experiments, though. It is worth mentioning that if a film contains visual effects, there is a cost associated with a higher frame rate, since the FX elements have to be rendered for more frames (see the sketch below), so it is not entirely without overhead. If the market for higher frame rates grows, the studios will go that way, but right now the motivation isn't there.
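    As a back-of-the-envelope sketch (the two-hour running time is an invented example, not a figure from any particular film), doubling the frame rate doubles the number of frames the effects pipeline must render:

        # Rough VFX render-count arithmetic for a hypothetical 2-hour film.
        running_time_s = 2 * 60 * 60
        for fps in (24, 48, 120):
            print(f"{fps:>3} fps -> {running_time_s * fps:,} frames")
        #  24 fps -> 172,800 frames
        #  48 fps -> 345,600 frames (2x the rendering work of 24 fps)
        # 120 fps -> 864,000 frames (5x)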
