5. Long exposure time makes 24 FPS smooth

If 24 FPS is good enough for film, then surely 30 FPS is good enough for games, right? No one ever complains about movies being choppy.

The reason movies don't appear choppy at 24 FPS is that they are recorded at 24 FPS. At its most basic, a movie is a sequence of photographs, and a film camera takes 24 photographs per second.

If you've ever attempted to photograph a moving subject, you've no doubt seen the resulting blurry mess in the developed picture. And you've probably wondered how professional photographers manage to get those glorious, unblurred action shots. It all comes down to the camera's shutter speed, or the image's exposure time. Essentially, the more milliseconds the camera spends recording an image, the more blurred any motion in it will be.
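The relationship is just distance equals speed times time. Here is a minimal sketch of that arithmetic; the subject speed is an illustrative assumption, while 1/48 of a second is the classic film shutter speed (a 180-degree shutter at 24 FPS) and 1/1000 of a second is a typical fast sports-photography shutter:

```python
def blur_width_px(speed_px_per_s: float, exposure_s: float) -> float:
    """Distance (in pixels) a subject travels while the shutter is open.

    That travel distance is exactly how wide the motion smear appears
    in the final image.
    """
    return speed_px_per_s * exposure_s

# Illustrative subject: something crossing the frame at 2000 px/s.
sports_shutter = blur_width_px(2000, 1 / 1000)  # fast shutter: 2 px smear
film_shutter = blur_width_px(2000, 1 / 48)      # film shutter: ~42 px smear
```

The fast shutter freezes the subject almost perfectly, while the film shutter smears it across dozens of pixels, which is exactly the blur you see in a paused action scene.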

Footage shot at 24 FPS contains a lot of motion blur, and it's this blur that sustains the illusion of a continuous, flowing sequence of images. Take a screenshot of a movie during an action scene and you'll see motion blur everywhere. Take a screenshot of a game during an action scene and you'll see no blur at all, or at most some minimal blur added in post-processing.

A game running at 24 FPS would look choppy, especially when there is a lot of movement on screen, because there is no motion blur to smooth the transitions between frames. So why not just make games render motion blur? Because that requires additional processing power, and if you have the power to render convincing motion blur, you likely have the power to render more frames per second instead, making the motion blur unnecessary!
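To see why true motion blur is expensive, consider the brute-force "accumulation" approach: render several sub-frame samples for each displayed frame and average them, paying for each extra render. This is a toy sketch with a hypothetical one-dimensional "scene" (a single moving dot), purely for illustration:

```python
def render(position: int, width: int = 10) -> list[float]:
    """Toy stand-in for a renderer: a bright dot on a dark 1-D strip."""
    return [1.0 if x == position else 0.0 for x in range(width)]

def motion_blurred_frame(positions: list[int], width: int = 10) -> list[float]:
    """Average several sub-frame renders into one blurred frame.

    Blurring across N sub-frame samples costs N full renders, which is
    why "just add motion blur" is not free.
    """
    frames = [render(p, width) for p in positions]
    return [sum(col) / len(frames) for col in zip(*frames)]

# A dot that moves across pixels 2, 3, and 4 during one frame's exposure:
frame = motion_blurred_frame([2, 3, 4])
# Each visited pixel ends up at 1/3 intensity: a smear instead of a dot.
```

Three sub-frame samples cost three renders for one displayed image, which is the same budget that could have produced three sharp frames at triple the frame rate.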

6. Your display limits your FPS

Okay, this is something you probably already know, but I couldn't let you walk away from this discussion without making sure. More FPS in a video game is always better, right? Wrong.

In gaming, it doesn't matter how powerful your computer or video card is. It doesn't matter if your system can render 200+ FPS. If your monitor operates at a refresh rate of 60 Hz, you're only ever seeing 60 frames per second.

"Refresh rate" is a measure of how many times per second your monitor updates the image it displays. Hz - or hertz - is simply a unit that counts how many times per second something happens. So if your monitor only shows you 60 images per second, but your computer is rendering 120 FPS, then it's dropping 60 frames every second - frames it spent computational power rendering. What a waste!

That said, because there are demanding areas in games where computers tend to "slow down" and frame rates drop, you want to ensure that your lowest frame rate, not just your average, never falls below your monitor's refresh rate if you want a consistently smooth experience. Just keep in mind, when building that monster PC, that you may want to pair it with a monitor with a higher refresh rate as well.

So where do you stand in the frame rate debate? Is 30 FPS enough in video games? Is 60 enough? What about movies? Should we move away from the motion-blurred 24 FPS and move onto greater frame rates? Let us know in the comments!