Category: Time series and analysis
Harmonization
Harmonization reduces differences between scenes or sensors so that pixel values are comparable across acquisitions and over time.
Also known as: radiometric harmonization, normalization
Expanded definition
Harmonization aims to make a dataset consistent enough that changes in pixel values are more likely to reflect real surface change and less likely to reflect processing or acquisition differences.
It can include radiometric normalization, BRDF correction, cross-sensor calibration, angle normalization (for SAR), and removal of seams in mosaics.
Harmonization is especially important for time series analysis and machine learning because models are sensitive to small systematic shifts. Without harmonization, time series often show “flicker” (spurious pixel-level variation) and false alerts that line up with acquisition conditions rather than real surface events.
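As an illustration of the radiometric normalization step mentioned above, a common approach is to fit a linear gain/offset between a target scene and a reference scene over pixels assumed to be unchanged (pseudo-invariant features), then apply that mapping to the whole target scene. The sketch below is a minimal, hypothetical example using NumPy; the function name and the fully "invariant" mask are assumptions for demonstration, not a specific library API.

```python
import numpy as np

def relative_normalization(target, reference, mask):
    """Fit a gain/offset mapping target values onto the reference
    over stable (pseudo-invariant) pixels, then apply it everywhere.

    target, reference: 2-D arrays of the same band from two scenes.
    mask: boolean array marking pixels assumed unchanged between scenes.
    """
    x = target[mask].ravel()
    y = reference[mask].ravel()
    # Least-squares fit: reference ≈ gain * target + offset
    gain, offset = np.polyfit(x, y, 1)
    return gain * target + offset, (gain, offset)

# Toy example: the "target" scene is a scaled, biased copy of the reference,
# mimicking a systematic radiometric shift between acquisitions.
rng = np.random.default_rng(0)
reference = rng.uniform(0.05, 0.45, size=(50, 50))  # surface reflectance
target = 0.8 * reference + 0.02                     # systematic shift
mask = np.ones_like(reference, dtype=bool)          # all pixels treated as invariant here

normalized, (gain, offset) = relative_normalization(target, reference, mask)
```

In practice the invariant mask matters far more than the fit itself: it is usually built from dark/bright stable targets or automated change screening, since including genuinely changed pixels biases the gain and offset.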
Related terms
Radiometry
Radiometry is the measurement and calibration of electromagnetic energy recorded by a sensor.
BRDF
BRDF describes how reflectance changes with viewing and illumination geometry, affecting apparent brightness across scenes.
Time Series
A time series is a sequence of observations over time for the same location, used for monitoring and change detection.
Incidence Angle
Incidence angle is the angle between the radar beam and the surface; it affects SAR backscatter and comparability.
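The angle normalization mentioned for SAR is often done, to first order, with a cosine-law correction that projects backscatter to a common reference incidence angle. The sketch below assumes a simple σ⁰ ∝ cosⁿ(θ) model; the function name, the 35° reference angle, and the exponent n are illustrative assumptions, and real workflows tune n to the surface type.

```python
import numpy as np

def normalize_incidence(sigma0_db, theta_deg, theta_ref_deg=35.0, n=1.0):
    """First-order cosine-law normalization of SAR backscatter to a
    reference incidence angle. The exponent n (assumed here) depends
    on the scattering behaviour of the surface.

    sigma0_db: backscatter in dB.
    theta_deg: local incidence angle in degrees (scalar or per-pixel array).
    """
    theta = np.radians(theta_deg)
    theta_ref = np.radians(theta_ref_deg)
    # sigma0 scales as cos^n(theta), so in dB the correction is additive.
    correction_db = 10.0 * n * np.log10(np.cos(theta_ref) / np.cos(theta))
    return sigma0_db + correction_db

# A pixel observed at 45 deg gets boosted slightly when referenced to 35 deg,
# because backscatter falls off at steeper incidence.
at_ref = normalize_incidence(-10.0, 35.0)   # unchanged at the reference angle
steeper = normalize_incidence(-10.0, 45.0)  # slightly higher after correction
```

This keeps scenes acquired at different look geometries on a comparable radiometric footing before they enter a time series.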