Digital images are everywhere, from the photos we take on our phones to sophisticated satellite images that monitor Earth's changes. This article dives into digital image properties such as pixels, resolution, and channels, and how they relate to the advanced world of satellite imagery.
A “pixel”, short for “picture element”, is the smallest individual point in a digital image. If you zoom into a digital image far enough, you’d eventually see it’s made up of a grid of tiny squares. Each of these squares is a pixel.
Image dimensions describe the pixel width and height of an image, usually expressed as width x height. For example, an image with dimensions of 1920x1080 has 1920 pixels in width and 1080 pixels in height.
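The relationship between dimensions and pixel count can be sketched with a synthetic image in NumPy (note that NumPy orders array shapes as height x width, i.e. rows x columns):

```python
import numpy as np

# Create a synthetic grayscale image: a grid of 1080 rows and 1920 columns.
# NumPy shapes are (height, width), the reverse of the usual "1920x1080".
image = np.zeros((1080, 1920), dtype=np.uint8)

height, width = image.shape
print(f"Dimensions: {width}x{height}")    # → Dimensions: 1920x1080
print(f"Total pixels: {width * height}")  # → Total pixels: 2073600
```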
Resolution is a measure of the detail an image holds. In the digital realm, it’s often quantified as pixels per inch (PPI). A higher PPI indicates higher image quality because there are more pixels in the same physical space. Print images often use a resolution of 300 PPI to achieve sharpness, while screen displays are conventionally described as 72 PPI, though modern displays frequently exceed that figure.
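Because PPI ties pixel counts to physical size, the print size of an image is just pixels divided by resolution. A minimal illustration (`print_size_inches` is a hypothetical helper, not a library function):

```python
def print_size_inches(pixel_width, pixel_height, ppi):
    """Physical print size, in inches, for a given pixel count and PPI."""
    return pixel_width / ppi, pixel_height / ppi

# A 3000x2400 pixel image printed at 300 PPI yields a 10x8 inch print.
w, h = print_size_inches(3000, 2400, 300)
print(f"{w:.0f} x {h:.0f} inches")  # → 10 x 8 inches
```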
An image channel is a grayscale image representing the values of a specific color or feature within the overall image.
Most digital images utilize the RGB color model:
- R (Red) Channel: Represents the red component.
- G (Green) Channel: Represents the green component.
- B (Blue) Channel: Represents the blue component.
Together, these channels produce the full-color image we see. Additionally, some images also have an Alpha Channel, which manages the opacity of the image. A pixel’s value in this channel determines its transparency level.
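The idea that each channel is its own grayscale image is easy to see in code. A tiny sketch using a hand-built 2x2 RGBA array (the pixel values here are arbitrary, chosen only for illustration):

```python
import numpy as np

# Synthetic 2x2 RGBA image: each pixel is a [R, G, B, A] quadruple.
rgba = np.array([
    [[255,   0,   0, 255], [  0, 255,   0, 255]],   # red, green
    [[  0,   0, 255, 255], [255, 255, 255, 128]],   # blue, half-transparent white
], dtype=np.uint8)

# Slicing out one component gives a 2x2 grayscale image per channel.
red, green, blue, alpha = (rgba[..., i] for i in range(4))
print(red)    # the red component of every pixel
print(alpha)  # 128 marks the half-transparent pixel
```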
While digital images often use RGB, print images commonly use the CMYK model:
- C (Cyan)
- M (Magenta)
- Y (Yellow)
- K (Key/Black)
This model corresponds to the ink plates in color printing.
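A naive RGB-to-CMYK conversion shows how the two models relate; the black (K) component absorbs whatever all three inks would share. This is only a sketch of the textbook formula: real print workflows use ICC color profiles rather than this direct arithmetic.

```python
def rgb_to_cmyk(r, g, b):
    """Naive RGB (0-255) to CMYK (0-1) conversion.
    Real print pipelines use ICC color profiles instead."""
    if (r, g, b) == (0, 0, 0):
        return 0.0, 0.0, 0.0, 1.0  # pure black: all Key, no ink mixing needed
    r, g, b = r / 255, g / 255, b / 255
    k = 1 - max(r, g, b)           # shared darkness goes into the Key plate
    c = (1 - r - k) / (1 - k)
    m = (1 - g - k) / (1 - k)
    y = (1 - b - k) / (1 - k)
    return c, m, y, k

print(rgb_to_cmyk(255, 0, 0))  # pure red → (0.0, 1.0, 1.0, 0.0)
```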
There are also other channels based on different color attributes, such as Hue, Saturation, and Lightness (HSL).
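Python’s standard library can convert between RGB and this representation; note that `colorsys` uses the ordering H-L-S (hue, lightness, saturation) on 0–1 scaled values:

```python
import colorsys

# Convert pure red (1.0, 0.0, 0.0) to hue/lightness/saturation.
h, l, s = colorsys.rgb_to_hls(1.0, 0.0, 0.0)
print(f"Hue: {h * 360:.0f} deg, Lightness: {l:.1f}, Saturation: {s:.1f}")
# → Hue: 0 deg, Lightness: 0.5, Saturation: 1.0
```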
In satellite imagery, “channels” or “bands” often denote specific wavelengths or ranges of the electromagnetic spectrum. Different bands capture various features and properties of Earth.
Satellites typically capture light in the visible spectrum (red, green, blue) and often extend into the near-infrared (NIR). The NIR is instrumental in assessing plant health, as healthy vegetation reflects a lot of NIR light.
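This contrast between NIR and red reflectance is the basis of the widely used Normalized Difference Vegetation Index, NDVI = (NIR − Red) / (NIR + Red), which approaches +1 over dense, healthy vegetation. A sketch using toy band values (the reflectance numbers are made up for illustration):

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + 1e-10)  # epsilon avoids divide-by-zero

# Toy 2x2 band rasters (hypothetical values on a 0-255 scale).
nir_band = np.array([[200, 180], [50, 60]], dtype=np.uint8)
red_band = np.array([[40, 50], [45, 55]], dtype=np.uint8)

# High NDVI in the top row suggests vegetation; near-zero below suggests bare ground.
print(np.round(ndvi(nir_band, red_band), 2))
```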
Satellites equipped with advanced sensors can also detect shortwave infrared (SWIR), thermal infrared, and microwave bands. These bands can reveal insights like surface temperature, mineral presence, and surface roughness.
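The spectral regions mentioned above can be summarized as approximate wavelength ranges; the figures below are indicative only, since exact band edges vary from sensor to sensor, and the lookup helper is purely illustrative:

```python
# Approximate wavelength ranges in micrometers for common remote-sensing
# bands (indicative figures; actual band definitions are sensor-specific).
BANDS_UM = {
    "blue":    (0.45, 0.52),
    "green":   (0.52, 0.60),
    "red":     (0.63, 0.69),
    "nir":     (0.75, 1.4),
    "swir":    (1.4, 3.0),
    "thermal": (8.0, 14.0),
}

def band_for_wavelength(um):
    """Return the band name covering a wavelength in micrometers, if any."""
    for name, (lo, hi) in BANDS_UM.items():
        if lo <= um <= hi:
            return name
    return None

print(band_for_wavelength(0.85))  # → nir
print(band_for_wavelength(10.0))  # → thermal
```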
From the basic building blocks of digital images to the advanced spectral bands in satellite imagery, understanding the underlying properties and channels of images allows for more informed viewing, editing, and interpretation. Whether you’re fine-tuning a photograph or analyzing environmental changes from space, these concepts are foundational in the digital imaging world.