
The Design and Engineering of Curiosity


by Emily Lakdawalla


  Table 7.1. Comparison of the capabilities of Curiosity’s cameras. (FHaz/RHaz = front/rear Hazcams; Nav = Navcams; ML/MR = left (34 mm) and right (100 mm) Mastcams; RMI = ChemCam Remote Micro-Imager.)

  CCD detector (pixels): FHaz/RHaz 1024 × 1024; Nav 1024 × 1024; ML/MR 1600 × 1200; RMI 1024 × 1024; MAHLI 1600 × 1200; MARDI 1600 × 1200

  FOV (°): FHaz/RHaz 124 × 124; Nav 45 × 45; ML 20 × 15; MR 6.8 × 5.1; RMI 1.3; MAHLI 34.0–38.5; MARDI 70 × 52

  IFOV at center (mrad/pixel): FHaz/RHaz 2.1; Nav 0.82; ML 0.22; MR 0.074; RMI 0.022; MAHLI 0.402–0.346; MARDI 0.76

  Stereo?: FHaz/RHaz yes; Nav yes; ML/MR yes, but with different resolution/FOV in each eye; RMI no; MAHLI yes, with arm movement; MARDI yes, with rover movement

  Stereo separation (cm): FHaz 16.7; RHaz 10; Nav 42.4; ML/MR 24.5; RMI –; MAHLI arbitrary; MARDI arbitrary

  Depth information from focal depth?: FHaz/RHaz no; Nav no; ML/MR yes; RMI yes; MAHLI yes; MARDI no

  Height above surface (m): FHaz 0.68; RHaz 0.78; Nav 1.9; ML/MR 1.9; RMI 2.1; MAHLI arbitrary; MARDI 0.66

  Spectral bandpass (nm): FHaz/RHaz 600–800; Nav 600–800; ML/MR 395–1100; RMI 450–950; MAHLI 420–690; MARDI 420–690

  Filters: FHaz/RHaz monochrome; Nav monochrome; ML 8 plus Bayer; MR 8 plus Bayer; RMI monochrome; MAHLI Bayer color; MARDI Bayer color

  7.2 MASTCAM

  The Mastcam instrument consists of two camera heads located on the mast, an electronics assembly located in the belly of the rover, and a calibration target on the rover deck. With the Mastcams, the science team investigates geomorphology, stratigraphy, and texture of the landscape, rocks, and sediments around the rover. They also monitor atmospheric and even astronomical phenomena. They support the rover’s engineering activities and provide context images for data from other science instruments. The Mastcams were built by Malin Space Science Systems, San Diego, California. The principal investigator for the Mastcam experiment is Michael Malin of Malin Space Science Systems.

  The Mastcams differ from previous lander cameras in two significant ways. First, nearly all Mastcam views are in full, human-vision-like color. Second, the two camera “eyes” have different focal lengths, which makes stereo imaging more complex than for previous missions. (Read section 1.5.8 for the history of the development of Mastcam that led to the flight of a pair of Mastcams with different focal lengths.) The left Mastcam, or Mastcam-34, has the shorter focal length, lower angular resolution, and wider field of view. The right Mastcam, or Mastcam-100, has the longer focal length, higher angular resolution, and narrower field of view.

  7.2.1 How Mastcam works

  7.2.1.1 Camera heads

  The Mastcams are 2-megapixel color cameras with focusable lenses and filter wheels.1 The heads contain electronics, a detector, a filter wheel assembly, a focus mechanism, and a sunshade/baffle that also serves as a mount (Figure 7.2). Each head contains two stepper motors, one to drive the filter wheel and one to drive the focus mechanism. The two Mastcams have boresights separated by 24.64 centimeters, and they are angled inward by 2.5° (1.25° each) in order to ensure that the smaller field of view of the Mastcam-100 is entirely contained within the wider field of view of the Mastcam-34 for any target located farther than 1.4 meters away from the rover. The boresights cross at a distance of 2.8 meters from the cameras, at a spot on the ground 2 meters away from the rover. The detector is capable of capturing 720p high definition video (1280-by-720 pixels) at a rate of 5 frames per second. Further facts are summarized in Table 7.2.

  Figure 7.2. Parts of the Mastcam instrument. Photos of the Mastcam-100 camera head and digital electronics assembly were taken at Malin Space Science Systems before their delivery to JPL for assembly. Bottom self-portrait taken at the John Klein drill site on sol 177 by MAHLI. Inset self-portrait showing the back of the camera heads and their wire harnesses taken at Okoruso drill site, sol 1338. NASA/JPL-Caltech/MSSS/Emily Lakdawalla.

  Table 7.2. Mastcam facts.

  Boresight height above bottom of wheels: 1.97 m

  Elevation actuator axis height above bottom of wheels: 1.91 m

  Stereo separation: 24.64 cm

  FOV (horizontal, 1600 pixels): Mastcam-34 (Mastcam-L) 20.6°; Mastcam-100 (Mastcam-R) 6.8°

  FOV (vertical, 1200 pixels): Mastcam-34 15°; Mastcam-100 5.1°

  Instantaneous field of view (IFOV): Mastcam-34 218 μrad; Mastcam-100 74 μrad

  Pixel scale at a distance of 2 meters: Mastcam-34 450 μm; Mastcam-100 150 μm

  Pixel scale at a distance of 1 kilometer: Mastcam-34 22 cm; Mastcam-100 7.4 cm

  Focal ratio: Mastcam-34 f/8; Mastcam-100 f/10

  Effective focal length: Mastcam-34 34 mm; Mastcam-100 100 mm

  In-focus range: Mastcam-34 0.4 m to infinity; Mastcam-100 1.6 m to infinity

  Exposure range: 0 to 838.8 s in 0.1 ms increments

  Video frame rate: 5.9 to 7.7 fps at 720p (1280 × 720); 3.9 to 4.7 fps for full frames
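
  The pixel scales in the table follow from the IFOV by the small-angle relation (scale ≈ IFOV × range). A quick sketch in Python, using only the IFOV values quoted above (the function name is mine, not the mission's):

```python
# Small-angle estimate of a pixel's footprint: scale = IFOV (radians) * range (meters).
IFOV_RAD = {"Mastcam-34": 218e-6, "Mastcam-100": 74e-6}  # from Table 7.2

def pixel_scale_m(camera, range_m):
    """Approximate size of one pixel's footprint at the given range, in meters."""
    return IFOV_RAD[camera] * range_m

for cam in IFOV_RAD:
    print(cam,
          f"{pixel_scale_m(cam, 2.0) * 1e6:.0f} um at 2 m,",
          f"{pixel_scale_m(cam, 1000.0) * 100:.1f} cm at 1 km")
# Mastcam-34: ~440 um at 2 m and ~22 cm at 1 km; Mastcam-100: ~150 um and ~7.4 cm,
# close to the rounded values in the table.
```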

  The Mastcams have the same detectors as MAHLI and MARDI and use the same focus mechanism as MAHLI. The detector is a Kodak KAI-2020CM Charge-Coupled Device (CCD), which is 1640 pixels wide by 1200 pixels high. The sides and corners of the images are partly occluded by the baffle and are affected by vignetting. The vignetting exists because the filter wheels, and specifically the shapes of their openings, were built before the descope of zoom capability, at a time when Mastcam only planned to produce 1200-by-1200-pixel subframes. Most images taken for science purposes crop away the sides to an image width of 1344 or 1200 pixels, operationally called a “full frame” (Figure 7.3). On sol 1589, the Mastcam team switched to using 1328-by-1184-pixel “full frames” for more efficient memory management.2 The original full-frame image size used 12.3 blocks in flash memory; the slightly smaller subframe uses just under 12 blocks at virtually no cost to the usefulness of the image.
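
  The flash accounting is easy to reproduce if one assumes 8 bits (1 byte) per pixel and a 128-kibibyte flash block; the block size is my assumption, since the text quotes only the block counts:

```python
# Assumed: 1 byte per pixel and a 128 KiB flash block; only the block counts
# (12.3 blocks and "just under 12") come from the text.
BLOCK_BYTES = 128 * 1024

old_full_frame = 1344 * 1200   # bytes
new_full_frame = 1328 * 1184   # bytes

print(old_full_frame / BLOCK_BYTES)  # ~12.3 blocks
print(new_full_frame / BLOCK_BYTES)  # ~12.0 blocks, just under 12
```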

  Figure 7.3. Size of the left Mastcam frame and common subframe areas. Mastcam image 320ML0010520330107781E01. NASA/JPL-Caltech/MSSS/Emily Lakdawalla.

  7.2.1.2 Color imaging

  Unlike most space cameras, the Mastcams, MAHLI, and MARDI take natural color images like consumer digital cameras. Each pixel is covered with a red, green, or blue filter in a Bayer pattern. A Bayer pattern is a checkerboard of colored pixels; in every 2-by-2 array of pixels, the two pixels on one diagonal are covered by green filters, while the other two are covered by a red filter and a blue filter. Full color comes from interpolating among neighboring pixels to generate complete red, green, and blue images. Interpolation can happen onboard the spacecraft or on Earth.
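
  A minimal sketch of the idea in Python; the RGGB cell layout and array shapes here are illustrative, not a statement of how the flight software is organized:

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate what a Bayer-filtered detector records: one color sample per pixel.

    rgb: float array of shape (rows, cols, 3). Assumes an RGGB 2x2 cell,
    which is illustrative; any Bayer layout works the same way."""
    mosaic = np.zeros(rgb.shape[:2])
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red at one corner of each 2x2 cell
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green, completing the green diagonal
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue
    return mosaic

# Demosaicking then interpolates each color from its sparse samples to every
# pixel, either onboard or on Earth.
```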

  Each Mastcam eye is equipped with an 8-position filter wheel. It may seem odd to add a filter wheel to a camera that already has color filters over its detector, but fortunately for spectroscopists, the Bayer color filters on the Mastcam detectors are “leaky” in infrared wavelengths. During normal color imaging, a broadband filter blocks these infrared wavelengths (Figure 7.4). But the Mastcams can operate like other filter-wheel-equipped space cameras in the near-infrared with six narrowband science filters in each eye, used for spectroscopic imaging (Figure 7.5). The science filters were distributed between the two cameras so that, if one camera fails, the other will still be able to accomplish some of the science objectives. Three of the filters are essentially identical between the two eyes, and three differ, so a total of nine distinct science filters is available for multispectral imaging. Each eye’s filter wheel also has one filter with a neutral-density coating that blocks most light and permits the Mastcams to directly image the Sun through a blue (right eye) and infrared (left eye) filter.

  Figure 7.4. Mastcam detector Bayer filter bandpasses without and with the “clear” infrared cutoff filter. Dark lines show the quantum efficiency of the optics and detector; at wavelengths beyond about 850 nanometers, all three Bayer filters allow an equal amount of light to pass. Brighter lines show the normalized transmission of the three Bayer filters with the clear filter in the optical path, which allows only visible wavelengths (420–690 nanometers) to pass. Data courtesy Jim Bell.

  Figure 7.5. Mastcam narrowband filter transmission. Data courtesy Jim Bell.

  The narrowband filters are usually named by their filter wheel positions (L0, L1, etc.) or referred to using the wavelengths that were requested from the Mastcam filter supplier (440, 525, 675, etc.), but their actual center wavelengths are slightly different from those values. The as-built center wavelengths of the filters on the cameras on Mars are listed in Table 7.3.

  Table 7.3. Mastcam spectral filters and bandpasses as built, given as center wavelength ± bandwidth (nm) with each filter’s nickname. Data from Bell et al. (2012).

  Position 0: left eye 590 ± 88 (Clear); right eye 575 ± 90 (Clear)
    Bayer red: left 640 ± 44; right 638 ± 44
    Bayer green: left 554 ± 38; right 551 ± 39
    Bayer blue: left 495 ± 37; right 493 ± 38

  Position 1: left 527 ± 7 (525, green); right 527 ± 7 (525, green)

  Position 2: left 445 ± 10 (440, blue); right 447 ± 10 (440, blue)

  Position 3: left 751 ± 10 (750); right 805 ± 10 (800)

  Position 4: left 676 ± 10 (675, red); right 908 ± 10 (905)

  Position 5: left 867 ± 10 (865); right 937 ± 10 (935)

  Position 6: left 1012 ± 21 (1035); right 1013 ± 21 (1035)

  Position 7: left 880 ± 10 ND5 (880, solar); right 440 ± 20 ND5 (440, solar)

  As a consequence of the convolution of Bayer and narrowband filters, some narrowband images contain less spatial information than others. In particular, an image taken through L2/R2 (440), L3 (750), R3 (800), L4 (675), or R7 (440 ND) has good signal only in one out of every four pixels (the red ones or blue ones), while L1/R1 (525) has signal in only one of every two pixels (the green ones). JPEG-compressing the full-size versions of these images before transmitting them to Earth would have very strange results. So before converting the shorter-wavelength narrowband images to JPEG, the camera electronics throw out data from the relatively unresponsive pixels and do bilinear interpolation to fill in data from the missing pixels. As an example, for the L2/R2 (blue) images, the electronics throw out the data from the red and green pixels and fill in with values interpolated from the blue pixels. Narrowband filter images that have been JPEG-compressed are returned to Earth as grayscale, with only the luminance (brightness and darkness) channel; the chrominance (color variation) information isn’t provided. Because the longer exposures required to take narrowband filter images accentuate the effects of bad pixels and because the images have intrinsically less spatial information, they tend to look noisier than the broadband color images.
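
  A rough sketch of that onboard step for a blue-filter image, in Python with NumPy/SciPy; the RGGB site layout and the smoothing-based fill are my assumptions, since the flight implementation is not published:

```python
import numpy as np
from scipy import ndimage

def fill_from_blue_sites(raw):
    """Keep only the blue-covered Bayer sites of a frame and fill the other
    three-quarters of the pixels by local (roughly bilinear) interpolation.
    Assumes blue sits at the odd-row, odd-column site of an RGGB cell."""
    blue_site = np.zeros(raw.shape, dtype=bool)
    blue_site[1::2, 1::2] = True
    # uniform_filter gives the 3x3 mean; dividing the mean of (blue-or-zero)
    # by the mean of the blue-site mask yields the average of just the blue
    # samples in each window -- an approximately bilinear fill.
    summed = ndimage.uniform_filter(np.where(blue_site, raw, 0.0), size=3)
    weight = ndimage.uniform_filter(blue_site.astype(float), size=3)
    filled = summed / np.maximum(weight, 1e-9)
    return np.where(blue_site, raw, filled)
```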

  7.2.1.3 Focus

  The Mastcam focus mechanism uses a stepper motor with 16,100 discrete motor positions. To autofocus, a Mastcam starts at a commanded motor position and then takes a set of images, incrementing the motor count by a specified step each time. Usually, the autofocus images are subframes of the full scene. The camera then JPEG-compresses the photos. The file size of each photo measures the complexity of the scene: an out-of-focus scene is blurrier, so it compresses to a smaller file. The camera treats JPEG file size as a function of motor count and fits a parabola to the sizes of the three largest files. The vertex of the parabola is taken to be the best-focus motor count, and Mastcam moves the focus to that position and takes one more image.
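
  A sketch of that search in Python; the camera object, its methods, and the parameter values are placeholders for illustration, not flight software interfaces:

```python
import numpy as np

def autofocus(camera, start_count, step, n_images=8):
    """Illustrative version of the Mastcam autofocus logic described above.

    camera.image_at(count) and camera.jpeg_size(image) stand in for moving
    the focus motor, exposing a subframe, and JPEG-compressing it onboard."""
    counts = [start_count + i * step for i in range(n_images)]
    sizes = [camera.jpeg_size(camera.image_at(c)) for c in counts]

    # Sharper images compress to larger files, so the three largest files
    # bracket best focus; fit a parabola to (count, size) and take its vertex.
    best3 = sorted(range(n_images), key=sizes.__getitem__)[-3:]
    a, b, _c = np.polyfit([counts[i] for i in best3],
                          [sizes[i] for i in best3], deg=2)
    best_count = -b / (2.0 * a)        # parabola vertex = best-focus motor count
    return camera.image_at(int(round(best_count)))
```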

  When a scene has a lot of depth, the autofocus algorithm doesn’t always select the focal depth that scientists want, so it can be better to specify the focal depth for those observations. To save time when capturing landscape mosaics, the Mastcams can be commanded to autofocus one frame and then use the same focus setting for subsequent frames that are expected to be in focus at the same position.

  The motor count associated with an in-focus image is a function of the range to the best-focus features in the image. For the Mastcam-100, the temperature of the instrument also affects the focus. Range can therefore be estimated from the best-focus motor count (and, for the Mastcam-100, the instrument temperature) using empirically calibrated equations.3

  7.2.1.4 Electronics

  Each Mastcam, MAHLI, and MARDI has its own board in the electronics assembly. The following discussion therefore applies to MARDI and MAHLI as well as each Mastcam.

  Each camera’s electronics board has a computer, 128 megabytes of SDRAM, and 8 gigabytes of flash memory, enough to accommodate about 4000 images. The electronics assembly determines autofocus and autoexposure parameters and sends this information to the camera heads. The camera detector captures 12-bit images. After it acquires an image, a camera head sends the data to the RAM inside the digital electronics assembly for further processing and storage. For all images, the camera head electronics create thumbnails 1/8th the size of the originals as they are transferred to the electronics board. (The electronics aren’t capable of downsampling images by any factor other than 8.) Mastcam converts the images from 12- to 8-bit depth to reduce file size. Most commonly, the team commands the instrument to use a square-root look-up table to do the 12-to-8-bit conversion. This allots more of the limited 256 values in the 8-bit image to darker areas, preserving detail in shadows that would otherwise be lost. Images are usually stored raw, without compression (in which case each full-size image takes about 2 megabytes of space, on average).
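
  The square-root companding is simple to sketch; the exact onboard lookup table is not given in the text, so the scaling below (mapping the full 12-bit range onto the full 8-bit range) is an assumption:

```python
import numpy as np

# Assumed scaling: a square root maps 12-bit values 0..4095 onto 8-bit values 0..255.
LUT = np.round(np.sqrt(np.arange(4096)) * 255.0 / np.sqrt(4095)).astype(np.uint8)

def compand_12_to_8(raw12):
    """Apply the square-root lookup table to an integer 12-bit image array."""
    return LUT[raw12]

# The curve spends output codes on the shadows: inputs 0..255 map to roughly
# 0..64, while the brightest 256 inputs (3840..4095) share only about 8 codes.
```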

  The main rover computer maintains a list of the files in storage, and copies requested images to its own memory as commanded before transmitting them to Earth. Thumbnail images get returned to Earth very soon after acquisition, supplying the Mastcam team a visual index to the image data collected on the rover. When the rover computer requests an image from Mastcam, it requests that the image be compressed before transmission, either losslessly or lossily. The electronics board has a lot of options for lossy compression. Mastcam can use Bayer interpolation4 to convert pictures to color, then save them in JPEG format. Usually, the JPEG images are compressed using a method that preserves more detail in an image’s brightness and darkness (luminance) but downsamples the detail in the color variation (chrominance) by a factor of two. Such images are referred to as “JPEG 422”, while JPEG-compressed images that preserve full-resolution chrominance information are referred to as “JPEG 444”.
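
  On the ground, the same luminance/chrominance trade-off can be reproduced with any JPEG encoder. A sketch using the Pillow library, where `subsampling=0` keeps full-resolution chrominance (4:4:4) and `subsampling=1` halves it (4:2:2); the file names and quality value are illustrative:

```python
from PIL import Image

img = Image.open("mastcam_color_frame.png").convert("RGB")  # hypothetical input

# "JPEG 444": full-resolution chrominance
img.save("frame_444.jpg", quality=85, subsampling=0)

# "JPEG 422": chrominance downsampled by a factor of two
img.save("frame_422.jpg", quality=85, subsampling=1)
```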

  Returning space science data in lossy JPEG format is somewhat unusual, although it’s getting more common as camera detector sizes outstrip our ability to transmit all that data to Earth. Even slight JPEG compression produces large savings in file sizes. A JPEG quality of 90 (measured on a scale from 1, lowest, to 100, highest quality) generally produces images with less than half the file size
of an uncompressed image (Figure 7.6).5 The team selects less compression (typically JPEG quality 85) for images intended to support science, and more compression (typically JPEG quality 65) for images taken for documentation purposes.

  Figure 7.6. Relationship between JPEG compression quality and file size for all four Curiosity color cameras. Mastcam-34 images have the largest file sizes because they are usually in focus over most of the image and so contain a high amount of detail, which compresses poorly. MARDI, by contrast, is out of focus, so its images compress much more readily. An uncompressed image has 8 bits per pixel. Compression quality 101 refers to losslessly compressed images. Figure from Bell et al. (2017), based on analysis by Jason Van Beek and Michael Malin.

  The cameras’ large flash memory volume makes it possible to keep raw data onboard and return science images months or even years after they were originally taken. At times when the rover is not capable of doing much science (like during holidays, solar conjunctions, or anomalies that restrict mobility) but is still capable of sending data to orbiters, daily downlinks can be packed with Mastcam data that has been idling on the rover for months, usually losslessly compressed versions of images that had previously been returned lossily.

  7.2.1.5 Artifacts and blemishes

  Several types of artifacts can affect the quality of Mastcam images. Some of these are intrinsic to the camera, some have to do with the way the data are stored or transferred, and some result from how the images are processed either within the camera or on Earth. Each camera has some (but very few) bad pixels: hot pixels that make bright spots, dead pixels that make dark spots, and gray pixels that don’t respond as well as others around them to incoming light. Occasionally, a new hot pixel appears on a camera detector, likely caused by an energetic particle flying from the MMRTG or from space. One such hot pixel appeared on the center right of the right Mastcam on sol 392, and had disappeared again by sol 710. A particularly bright one appeared near the top right of the left Mastcam on sol 834 and has remained ever since (Figure 7.7).
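
  In ground processing, such blemishes can be flagged automatically by comparing each pixel with a local median; a small sketch, with an arbitrary threshold rather than any mission value:

```python
import numpy as np
from scipy import ndimage

def flag_bad_pixels(image, threshold=50.0):
    """Return a boolean mask of hot/dead pixel candidates: pixels that differ
    from the median of their 3x3 neighborhood by more than `threshold` DN."""
    local_median = ndimage.median_filter(image, size=3)
    return np.abs(image.astype(float) - local_median.astype(float)) > threshold
```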

 
