Extra Credits

Frame Rate - How Does Frame Rate Affect Gameplay? (2015x36)


Air date: Sep 23, 2015

To understand the debates over frame rate that pop up around so many new game releases, we have to start with what frame rate actually means and does. All moving video is really a sequence of still images, or frames, and the rate at which those frames change determines how smooth the motion looks. In film, the standard is 24fps, a relatively slow rate that produces effects like motion blur which our brains have learned to read as cinematic. Video games, however, aim for a minimum of 30fps, because interactivity means a lower frame rate makes the game feel laggy.

While developers sometimes hit higher frame rates, we usually only hear 30fps and 60fps discussed because TVs and computer monitors typically refresh at multiples of 30Hz, although turning off vertical sync (vsync) lets a game run closer to whatever rate the graphics card can actually produce.

So why do developers sometimes cap the frame rate at 30fps? Sometimes the reason is contractual: they're prohibited from making a PC version definitively better than its console equivalent. Sometimes it's architecture: a game programmed to check something every frame does that work more often, and runs worse, as the frame count climbs. And in the end, achieving a higher frame rate means trading off other aspects of the game's performance, and industry research consistently shows that better graphics trump a better frame rate when it comes to sales.
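As a rough illustration of that architectural point (not from the episode itself), here is a minimal Python sketch contrasting game logic that moves a fixed amount every frame with logic scaled by elapsed time. The names and constants (PLAYER_SPEED, simulate, the fixed step of 4.0) are illustrative assumptions, not anything from a real engine.

```python
# Minimal sketch: why frame-tied logic misbehaves when the frame rate changes.
# All names and constants here are illustrative, not from the episode.

PLAYER_SPEED = 120.0   # intended movement in units per second
SIM_SECONDS = 2.0      # length of the simulation

def simulate(fps, frame_tied):
    """Advance a 1-D position for SIM_SECONDS at the given frame rate."""
    dt = 1.0 / fps
    position = 0.0
    for _ in range(int(SIM_SECONDS * fps)):
        if frame_tied:
            # Fixed step per frame: doubling the frame rate doubles the speed.
            position += 4.0
        else:
            # Delta-time step: covers the same distance at any frame rate.
            position += PLAYER_SPEED * dt
    return position

for fps in (30, 60):
    print(f"{fps}fps  frame-tied: {simulate(fps, True):6.1f}"
          f"   delta-time: {simulate(fps, False):6.1f}")
```

Running it, the frame-tied version covers twice the distance at 60fps that it does at 30fps, while the delta-time version stays consistent. That kind of frame-rate dependence is why simply unlocking a game designed around 30fps is harder than it sounds.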

  • Premiered: Jul 2010
  • Episodes: 557
  • Followers: 9
  • Status: Running
  • Network: YouTube
  • Airs: Wednesday