Video Bitrate Vs. Frame Rate
In addition to resolution -- dimensions measured in pixels -- a digital video's appearance depends on its frame rate and bitrate. Frame rate measures how many still images appear on screen each second, indicating how smooth the video looks. Bitrate serves as a more general indicator of quality: higher resolutions, higher frame rates and lighter compression all lead to an increased bitrate.
Understanding Bitrate
A video's bitrate describes how much data the video contains, measured in megabits per second (Mbps). The bitrate depends partially on video resolution, because higher-resolution video files contain more information. It also varies with the strength of the video's compression -- a heavily compressed video has a lower bitrate than a lightly compressed one. Because heavy compression degrades video quality, bitrate serves as an indicator of both overall quality and file size: when comparing the same recording encoded at two different bitrates, the video with the higher bitrate will have fewer compression artifacts that reduce image clarity, but will take up more space on the computer or disc.
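The relationship between bitrate and file size is simple arithmetic: megabits per second times duration, divided by 8 to convert bits to bytes. A minimal sketch (the function name and example figures are illustrative):

```python
def estimated_size_mb(bitrate_mbps: float, duration_seconds: float) -> float:
    """Approximate file size in megabytes for a given average bitrate.

    Divides by 8 because there are 8 bits in a byte.
    """
    return bitrate_mbps * duration_seconds / 8

# A two-hour movie at a Blu-ray-class 40Mbps versus a 7Mbps HD stream:
two_hours = 2 * 60 * 60  # 7,200 seconds
print(estimated_size_mb(40, two_hours))  # 36000.0 MB, roughly 36 GB
print(estimated_size_mb(7, two_hours))   # 6300.0 MB, roughly 6.3 GB
```

The same two-hour recording, then, occupies nearly six times the space at Blu-ray bitrates as it does at streaming bitrates.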
Common Video Bitrates
Bitrate levels vary depending on the source. Blu-ray discs support video bitrates up to 40Mbps, compared to 9.8Mbps on DVDs. Other home video sources offer far lower bitrates: even in HD, Netflix video only reaches around 7Mbps. Bitrate drops even further for user-made Web videos and mobile videos, where download speed and low data use matter more than video quality. Blurry video and compression artifacts stand out less on small screens, however, so a video that looks unacceptable on your TV can look fine on your smartphone.
Effects of Frame Rate
Frame rate describes the speed at which a video plays. The more frames per second a video plays, the smoother it appears. A high frame rate also increases the bitrate -- independent of the level of compression -- because of the data required to store the additional frames. Televisions with motion interpolation can automatically increase a video's frame rate at playback, but doing so can cause the "soap opera effect," in which the video appears abnormally smooth.
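To see why extra frames raise the bitrate before compression even enters the picture, consider the uncompressed data rate: pixels per frame, times bits per pixel, times frames per second. A short sketch, assuming 24 bits per pixel (8 bits each for red, green and blue) -- an illustrative figure, not a claim about any particular format:

```python
def raw_bitrate_mbps(width: int, height: int, fps: float,
                     bits_per_pixel: int = 24) -> float:
    """Uncompressed video data rate in megabits per second."""
    return width * height * bits_per_pixel * fps / 1_000_000

# 1080p video at film (24fps), NTSC (~30fps) and high (48fps) frame rates:
for fps in (24, 30, 48):
    print(fps, round(raw_bitrate_mbps(1920, 1080, fps), 1))
```

Doubling the frame rate doubles the raw data rate, which is why a 48fps video needs either a higher bitrate or heavier compression than the same footage at 24fps.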
Frame Rate Standards
Unlike bitrate, frame rate follows industry standards. In the United States, most movies run at 24 frames per second, while most TV programs follow the NTSC standard, playing back at about 30fps. In areas that use the PAL standard, such as much of Europe, TV video plays at 25fps instead. There are exceptions, however: in 2012, "The Hobbit" played in some theaters at 48fps, which reduced motion blur but left some viewers finding the image unsettling because of its departure from the familiar 24fps look.
References & Resources
- Journal of Visual Communication and Image Representation: Review of Postprocessing Techniques for Compression Artifact Removal
- Blu-ray.com: Blu-ray FAQ
- Gigaom: Netflix Bets Big On 4K, Strikes Partnerships With Four TV Vendors
- Adobe Media Encoder: About Video and Audio Encoding and Compression
- PCMag: 'The Hobbit' At 48fps: Frame Rates Explained
- CNET: What Is the 'Soap Opera Effect'?