Category: Time series and analysis

Harmonization

Harmonization reduces differences between scenes or sensors so values are more comparable across time.

Also known as: radiometric harmonization, normalization

Expanded definition

Harmonization aims to make a dataset consistent enough that changes in pixel values are more likely to reflect real surface change and less likely to reflect processing or acquisition differences.

It can include radiometric normalization, BRDF correction, cross-sensor calibration, incidence-angle normalization (for SAR), and seam removal in mosaics.
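One of the simplest of these techniques is relative radiometric normalization: fit a gain and offset that map a target scene onto a reference scene over pixels assumed to be stable between acquisitions. The sketch below is a minimal illustration of that idea with numpy; the function name, the use of a least-squares linear fit, and the optional stable-pixel mask are illustrative choices, not a fixed recipe.

```python
import numpy as np

def relative_normalization(target, reference, mask=None):
    """Fit gain/offset so the target scene matches the reference scene.

    target, reference: 2-D arrays of the same band from overlapping scenes.
    mask: optional boolean array selecting pseudo-invariant pixels assumed
    stable between acquisitions; if None, all finite pixels are used.
    """
    t = target.ravel().astype(float)
    r = reference.ravel().astype(float)
    keep = np.isfinite(t) & np.isfinite(r)
    if mask is not None:
        keep &= mask.ravel()
    # Least-squares fit: reference ~ gain * target + offset.
    gain, offset = np.polyfit(t[keep], r[keep], 1)
    return gain * target + offset, (gain, offset)

# Toy example: the target differs from the reference by a simulated
# acquisition-dependent gain and offset.
ref = np.linspace(0.05, 0.45, 100).reshape(10, 10)
tgt = 0.8 * ref + 0.02
norm, (g, b) = relative_normalization(tgt, ref)
```

In practice the stable-pixel mask matters as much as the fit: including pixels that genuinely changed between acquisitions biases the gain and offset.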

Harmonization is especially important for time series and machine learning because models are sensitive to small systematic shifts. Without harmonization, time series often show “flicker”: apparent changes and false alerts that line up with acquisition conditions rather than real events.
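The flicker effect can be demonstrated on a toy stack. Below, two dates carry a systematic offset, and a crude scene-level harmonization (aligning each date's median to the stack-wide median) removes most of the date-to-date shift. This is a sketch under simplified assumptions, not a production method; it corrects only whole-scene offsets, not per-pixel effects such as BRDF.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stack: 6 dates x 50 pixels of a stable surface (true value ~0.30).
stack = rng.normal(0.30, 0.01, size=(6, 50))
# Dates 2 and 4 carry a systematic acquisition offset ("flicker").
stack[2] += 0.05
stack[4] -= 0.04

# Align each date's median to the stack-wide median: removes
# scene-level shifts but leaves per-pixel noise untouched.
target = np.median(stack)
harmonized = stack - np.median(stack, axis=1, keepdims=True) + target

# Spread of per-date means, before and after harmonization.
before = np.std(np.mean(stack, axis=1))
after = np.std(np.mean(harmonized, axis=1))
```

After alignment, the per-date means vary far less, so a downstream change detector sees fewer spurious jumps that coincide with acquisition dates.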

Related terms