
CCD Image Sensors: What Is Frame Rate?

April 21, 2020 by Robert Keim

This article explores the relationship between CCD readout characteristics and the maximum rate at which images can be generated.

So far in this series on CCDs (charge-coupled devices), we've covered many aspects of how these devices operate, the output signals they produce, and how binning can improve signal-to-noise ratio and frame rate. See the articles below for more information.

  1. Introduction to image sensors
  2. Structure and functionality of CCDs
  3. Types of CCDs (full-frame, interline-transfer, and frame-transfer)
  4. Clocking techniques for CCD readout
  5. CCD output signals
  6. Modifying output signals: sampling, amplifying, and digitizing
  7. Back-illuminated CCDs
  8. CCD binning

Now, we turn our attention to a new concept with CCDs: frame rate.

A CCD (charge-coupled device) is not inherently limited to only photographic or only video applications. The difference between a CCD still-image sensor and a CCD video sensor exists at the implementation level. If integration and readout occur such that one image is produced and then displayed (or stored, or analyzed via image-processing software), the CCD and its supporting components are functioning as a still camera.

If integration and readout occur such that images are produced at regular and relatively brief intervals, the CCD and its supporting components are functioning as a video camera.

With still cameras, the primary concern is usually image quality; resolution, sensitivity, or noise performance becomes the critical parameter, depending on the application. These issues are important in video cameras as well, but now we also have frame rate to consider, and sometimes image quality must be sacrificed in favor of this fundamental parameter of video applications.

 

What Is Frame Rate?

It’s the frequency at which individual images—i.e., frames—are produced. The standard unit is frames per second (fps).

I suppose it’s actually a little more complicated than that if you consider interlaced video, which has both a field rate and a frame rate. One field contains only the odd lines or only the even lines and is therefore half of a frame; thus, interlaced video described as 30 fps has a field rate of 60 fields per second.

Furthermore, the field rate is not a purely technical parameter with no relevance to the user, because interlacing results in a perceived frame rate that is higher than the actual frame rate: the display is refreshed once per field, not once per frame. If you consider each field to be a frame, the effective frame rate is the field rate.
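To make the field/frame distinction concrete, here is a minimal sketch using the 30 fps interlaced example from above:

# Interlaced video: each frame is split into an odd field and an even field,
# so the display is refreshed at twice the frame rate.
frame_rate_fps = 30.0                      # nominal interlaced frame rate
field_rate_fps = 2 * frame_rate_fps        # two fields per frame -> 60 fields/s

frame_period_ms = 1000.0 / frame_rate_fps  # a complete frame every ~33.3 ms
field_period_ms = 1000.0 / field_rate_fps  # the display updates every ~16.7 ms

print(f"frame rate: {frame_rate_fps:.0f} fps (full frame every {frame_period_ms:.1f} ms)")
print(f"field rate: {field_rate_fps:.0f} fields/s (display update every {field_period_ms:.1f} ms)")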

 

This image reveals what we usually don’t notice in interlaced video: the separation of a frame into even and odd fields. Image from Wikimedia.

 

The alternative to interlaced video is called progressive scan. This is the more intuitive type of video, i.e., one frame consists of all the lines produced by the sensor, with none of this awkward even/odd business. I don’t trust interlaced video—always trying to pretend that its frame rate is higher than it really is.

The rest of this article and the next article, which is a continuation of the same frame-rate discussion, were written with progressive scan in mind.

 

Frame Rate Standards

Standardized frame rates include 24 fps, which comes from cinema, and 25 fps and approximately 30 fps, which are used in television broadcasting.

However, if you’re designing a custom imaging device, you can choose whatever frame rate suits the objectives or limitations of your system. There is no law saying that video must have a certain number of frames per second in order to allow for effective viewing or analysis. If you can get the job done with 10 frames per second, all the better—go ahead and make your life easier by using slower clock speeds and relaxing digital storage or throughput requirements.     
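As a rough illustration of how a lower frame rate relaxes the pixel-clock and throughput requirements, here is a small sketch; the sensor resolution and bit depth are hypothetical example values, and overhead such as blanking intervals and dark reference pixels is ignored:

# Approximate pixel-clock and data-rate requirements for a chosen frame rate.
# The resolution and bit depth below are example values, not from the article.
width_px = 1024
height_px = 1024
bits_per_pixel = 12
frame_rate_fps = 10.0                                   # the "10 frames per second" case

pixels_per_frame = width_px * height_px
pixel_rate_hz = pixels_per_frame * frame_rate_fps       # required readout rate
data_rate_mbps = pixel_rate_hz * bits_per_pixel / 1e6   # megabits per second

print(f"pixel rate: {pixel_rate_hz / 1e6:.2f} Mpixel/s")
print(f"data rate:  {data_rate_mbps:.1f} Mbit/s")

Cutting the frame rate in half cuts the required pixel rate and data rate in half as well, which is exactly the kind of relaxation referred to above.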

 

Integration, Readout, and Frame Rate

The frame rate of a CCD-sensor-based imaging system is limited by the amount of time required to capture and read out an image. If a sensor must first integrate for 10 ms—i.e., its exposure time is 1/100th of a second—and then spend 90 ms reading out all the charge packets, the total time required to deliver an image is 100 ms. Thus, the maximum frame rate is (1 frame)/(100 ms) = 10 fps.

What happens, though, if the scene suddenly becomes much darker and the exposure time must be increased to 50 ms?

Now the image-delivery time is 140 ms, and the maximum frame rate is about 7 fps. Thus, the frame-rate capability of the system is influenced by exposure time, which may continually change in response to lighting conditions.
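The arithmetic above can be captured in a short sketch; the 90 ms readout time and the two exposure times come from the examples in the text, and integration and readout are assumed to occur strictly one after the other:

def max_frame_rate_sequential(exposure_ms, readout_ms):
    # Maximum frame rate when integration and readout cannot overlap.
    frame_period_ms = exposure_ms + readout_ms
    return 1000.0 / frame_period_ms

readout_ms = 90.0
for exposure_ms in (10.0, 50.0):
    fps = max_frame_rate_sequential(exposure_ms, readout_ms)
    print(f"exposure {exposure_ms:.0f} ms + readout {readout_ms:.0f} ms -> max {fps:.1f} fps")
# 10 ms exposure -> 10.0 fps; 50 ms exposure -> about 7.1 fps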

This irksome dependency of frame rate on exposure time can be remedied by reading out one frame of pixel data while integrating the next frame. This is possible with frame-transfer and interline-transfer CCDs, with the interline-transfer architecture being preferred because it produces less smearing.
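If readout of one frame can overlap integration of the next, the frame period is set by the longer of the two operations rather than their sum. Here is a minimal sketch of that comparison, reusing the example numbers from above:

def max_frame_rate_overlapped(exposure_ms, readout_ms):
    # Maximum frame rate when readout of frame N overlaps integration of frame N+1.
    frame_period_ms = max(exposure_ms, readout_ms)
    return 1000.0 / frame_period_ms

readout_ms = 90.0
for exposure_ms in (10.0, 50.0):
    fps = max_frame_rate_overlapped(exposure_ms, readout_ms)
    print(f"exposure {exposure_ms:.0f} ms, readout {readout_ms:.0f} ms -> max {fps:.1f} fps")
# Both cases are now limited by the 90 ms readout (about 11.1 fps), so the frame
# rate no longer changes with exposure time until the exposure exceeds the readout.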

 

After light-generated charge is transferred to the vertical shift registers, the photodiodes are free to integrate charge for the next frame even if pixel readout is still occurring. 
 

Then again, full-frame CCDs have their advantages, and maybe the frame rate associated with the system’s longest possible exposure time plus the readout time is adequate for your requirements. This goes back to what I was saying about standard and nonstandard frame rates.

By determining how many frames per second you really need, you might find that you can forgo simultaneous integration and readout, and instead enjoy the benefits of using a full-frame CCD instead of an interline-transfer CCD.

 

Conclusion

Sorry to end this discussion somewhat abruptly, but there’s too much material here for one article. We’re ready to take a detailed look at CCD readout in relation to frame rate, and that’s exactly what we’ll do in the follow-up article.
