Computer Hardware

Refresh Rate

Definition

Refresh rate is the number of times per second that a display updates the image it shows. It is distinct from frame rate (see Common Misunderstandings below). Refresh rate is measured in hertz (Hz).
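
As a worked example of the unit, the interval between refreshes is simply the reciprocal of the rate. The short Python sketch below (the helper name refresh_interval_ms is purely illustrative) converts a rate in hertz to milliseconds per refresh.

    def refresh_interval_ms(refresh_rate_hz: float) -> float:
        """Time between display refreshes, in milliseconds, for a rate in Hz."""
        return 1000.0 / refresh_rate_hz

    # A 60 Hz display refreshes roughly every 16.7 ms; 240 Hz is down to ~4.2 ms.
    for hz in (60, 144, 240):
        print(f"{hz} Hz -> {refresh_interval_ms(hz):.1f} ms per refresh")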

Why It Matters

A higher refresh rate results in smoother, more fluid motion on the screen. This is especially noticeable and important in fast-paced video games, but it also makes general desktop use, like scrolling, feel more responsive.
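
To make "smoother motion" concrete, consider an object moving across the screen at a fixed speed: the higher the refresh rate, the smaller the jump between the positions the display actually draws. The sketch below uses an arbitrary speed of 1920 pixels per second purely for illustration.

    def per_refresh_jump_px(speed_px_per_s: float, refresh_rate_hz: float) -> float:
        """How far, in pixels, a moving object jumps between consecutive refreshes."""
        return speed_px_per_s / refresh_rate_hz

    # An object crossing a 1920-pixel-wide screen in one second:
    for hz in (60, 144, 240):
        print(f"{hz} Hz: drawn positions are {per_refresh_jump_px(1920, hz):.0f} px apart")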

Contextual Example

A standard monitor has a 60 Hz refresh rate, meaning it redraws the image 60 times per second. A gaming monitor might refresh at 144 Hz or 240 Hz; the smoother, clearer motion can give players a competitive edge in fast-paced games.
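
If you want to check what your own monitor reports, one option is the GLFW windowing library's video-mode query. This is a minimal sketch assuming the Python glfw bindings (pip install glfw) and a desktop session with at least one monitor attached; the field names follow that wrapper's conventions.

    import glfw  # Python bindings for the GLFW windowing library

    if not glfw.init():
        raise RuntimeError("could not initialize GLFW")
    try:
        monitor = glfw.get_primary_monitor()
        mode = glfw.get_video_mode(monitor)  # current resolution, color depth, refresh rate
        print(f"Primary monitor: {mode.size.width}x{mode.size.height} at {mode.refresh_rate} Hz")
    finally:
        glfw.terminate()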

Common Misunderstandings

  • Refresh rate is a property of the monitor. Frame rate is how many frames per second your computer's GPU is rendering. To get the benefit of a high refresh rate, your GPU must be able to produce a frame rate that matches it (see the sketch after this list).
  • Technologies like G-Sync and FreeSync synchronize the monitor's refresh rate with the GPU's frame rate to prevent screen tearing.
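
As a rough illustration of the first point, the sketch below models a fixed-rate display being fed by a slower GPU; perfectly even frame pacing and the 60/144 figures are simplifying assumptions, not measurements.

    import math

    def refreshes_with_new_frame(fps: float, refresh_hz: float, duration_s: float = 1.0) -> int:
        """Count refreshes that show a newly rendered frame on a fixed-rate display,
        assuming perfectly even frame pacing (a deliberate simplification)."""
        new_frames_shown = 0
        last_completed = 0                   # frames the GPU had finished at the previous refresh
        for r in range(1, int(refresh_hz * duration_s) + 1):
            t = r / refresh_hz               # time of this refresh
            completed = math.floor(t * fps)  # frames the GPU has finished by time t
            if completed > last_completed:
                new_frames_shown += 1        # a fresh frame is ready to display
                last_completed = completed
            # otherwise this refresh repeats the previous frame
        return new_frames_shown

    # A 144 Hz monitor fed by a GPU rendering a steady 60 frames per second:
    new = refreshes_with_new_frame(fps=60, refresh_hz=144)
    print(f"{new} of 144 refreshes show a new frame; {144 - new} repeat the last one")

In this simplified model the extra refreshes just repeat the previous frame; with vsync disabled they could instead catch a partially drawn frame and produce tearing, which is what G-Sync and FreeSync avoid by timing each refresh to frame completion.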

Related Terms

  • Frame Rate
  • GPU (Graphics Processing Unit)
  • G-Sync / FreeSync
  • Screen Tearing

Last Updated: December 17, 2025