Harmonization vs Data Fusion: Making Multi-Sensor Imagery Agree and Work Together
2025-08-05 · 5 min read · Harmonization · Data Fusion · Sentinel-2 · Landsat · HLS · Sen2Cor · LaSRC · BRDF · Provenance · Uncertainty

TL;DR: Harmonization reduces systematic differences so sensors are comparable. Fusion then integrates multiple observations into a single best estimate. Harmonize first, fuse second, and publish QA and provenance so results are auditable.
Why this matters
Most regions cannot be monitored reliably with one sensor. Clouds, revisit gaps, and seasonal constraints make multi-source pipelines the practical default. That raises two questions. Can we compare inputs fairly across sensors and dates? And can we combine them without inventing structure or smearing time? The first question is harmonization; the second is fusion. Treating them separately yields smoother time series, higher coverage in cloudy seasons, and analyses that hold up in audits.
Large public programs apply this separation. NASA’s Harmonized Landsat and Sentinel-2 (HLS) products align Landsat 8 and 9 with Sentinel-2 surface reflectance on a common 30 meter grid before downstream analytics use the streams together. Sentinel-2 documentation highlights how illumination and viewing geometry affect cross-date comparability and why normalization belongs in the harmonization step.
What harmonization should change
Harmonization aligns the measurement systems so that remaining differences reflect the surface rather than the sensors.
- Geometry and geolocation. Orthorectify, co-register, and check that pixels truly overlap. One pixel of misalignment can dominate field-scale signals.
- Spectral response mapping. Landsat and Sentinel-2 bands are close but not identical. Use bandpass mapping or learned spectral transforms to place one into the other’s band space rather than histogram tricks.
- BRDF and illumination normalization. Reference all inputs to a standard sun and view geometry so low sun or off-nadir angles do not masquerade as surface change.
- Radiometric bias correction. Use pseudo-invariant features and stable targets to correct small residual offsets.
- Topography and adjacency. Apply terrain illumination and adjacency handling in bright snow and water scenes so neighbors do not contaminate each other.
Done well, harmonization removes sensor idiosyncrasies while preserving the physics of the scene. Done poorly, it can imprint new artifacts or hide true variability. A good quick test is that invariant targets like large rock outcrops or airfields show the same reflectance trend after harmonization.
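To make the spectral response mapping step concrete, here is a minimal sketch of a per-band linear bandpass adjustment that places Landsat OLI reflectance into Sentinel-2 MSI band space. The slope and offset values below are placeholders, not published coefficients; in practice they are fitted from coincident acquisitions or taken from a cross-calibration study.

```python
import numpy as np

# Hypothetical per-band (slope, offset) pairs mapping Landsat 8 OLI
# surface reflectance into Sentinel-2 MSI band space. Placeholder values:
# real coefficients must come from fitted cross-calibration.
BANDPASS = {
    "red": (0.98, 0.004),
    "nir": (1.01, -0.002),
}

def to_msi_band_space(reflectance: np.ndarray, band: str) -> np.ndarray:
    """Apply the linear bandpass transform for one band."""
    slope, offset = BANDPASS[band]
    return slope * reflectance + offset

oli_red = np.array([0.10, 0.25, 0.40])   # example OLI red reflectance
msi_like = to_msi_band_space(oli_red, "red")
```

A linear transform is deliberately simple; it keeps the mapping invertible and auditable, which is exactly what the invariant-target check above is meant to verify.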
What fusion tries to solve
Fusion answers a different question. Given several observations, what single estimate should we deliver, and how confident are we?
- Same-day fusion. Combine passes from the same date to reduce cloud and haze while keeping a defensible nominal date.
- Short temporal fusion. Reach back to prior days only when needed, and label those pixels clearly to avoid temporal leakage.
- Spatial goals. Sharpen structure using higher-resolution cues or super-resolution, with care to avoid inventing false detail.
- Sensor complementarity. Use SAR for all-weather continuity and optical for spectral interpretation.
Good fusion is explicit about weights and uncertainty. Pixels with high cloud probability, poor geometry, or unstable aerosol estimates should count less. The product should carry enough provenance to show which sources contributed to each pixel.
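The weighting logic above can be sketched as a per-pixel weighted mean in which cloud probability down-weights an observation. The function name and the simple `1 - cloud_prob` weight are illustrative assumptions, not a fixed recipe; a production system would also fold in geometry and aerosol quality.

```python
import numpy as np

def fuse_observations(values: np.ndarray, cloud_prob: np.ndarray):
    """Quality-weighted fusion of same-day observations.

    values:     (n_obs, H, W) reflectance stack
    cloud_prob: (n_obs, H, W) cloud probability in [0, 1]
    Returns the fused estimate and the per-pixel weights so that
    contributions can be published as provenance.
    """
    weights = np.clip(1.0 - cloud_prob, 0.0, None)  # cloudy pixels count less
    total = weights.sum(axis=0)
    safe_total = np.where(total > 0, total, 1.0)    # avoid division by zero
    fused = (weights * values).sum(axis=0) / safe_total
    fused = np.where(total > 0, fused, np.nan)      # mark unfusable pixels
    return fused, weights

# Two same-day observations of one pixel: one clear, one half-cloudy.
vals = np.array([[[0.2]], [[0.6]]])
probs = np.array([[[0.0]], [[0.5]]])
fused, w = fuse_observations(vals, probs)
```

Returning the weights alongside the estimate is the point: they are exactly the per-pixel contributions the provenance section below asks you to publish.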
Common pitfalls to avoid
- Histogram matching as a cure-all. It can hide bias on one scene and create bias on the next. Use spectral response and BRDF models first.
- Temporal leakage. Filling today with tomorrow improves coverage but breaks the meaning of the date. Mark and limit any prior-day use.
- Double counting. When two inputs overlap, avoid boosting confidence by counting them twice.
- Seamline bias. If weights change abruptly across tile boundaries, the fused surface will inherit seams.
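The seamline pitfall is usually handled by feathering: blend weights ramp smoothly over a margin near each tile edge so adjacent tiles cross-fade instead of switching abruptly. The function below is a one-dimensional sketch under that assumption; a real mosaicker applies it along both axes.

```python
import numpy as np

def feather_weight(width: int, margin: int) -> np.ndarray:
    """1-D blend weight ramping 0 -> 1 over `margin` pixels at each
    tile edge, so overlapping tiles cross-fade rather than producing
    an abrupt seam where weights switch."""
    x = np.arange(width, dtype=float)
    ramp_in = np.clip((x + 0.5) / margin, 0.0, 1.0)          # left edge
    ramp_out = np.clip((width - 0.5 - x) / margin, 0.0, 1.0)  # right edge
    return np.minimum(ramp_in, ramp_out)

w = feather_weight(10, 3)   # full weight in the interior, tapered edges
```

When two tiles overlap, normalizing their feathered weights to sum to one at each pixel removes the discontinuity that causes seamline bias.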
Choosing a strategy
When extending an archive or running cross-sensor studies, start with harmonization so differences reflect the surface. When operating in cloudy seasons or when decisions depend on daily updates, fusion is essential to maintain continuity. Prefer same-day fusion for date-critical use, fall back to prior-day inputs only when necessary, and always record provenance. Most robust systems do both. Harmonize into a common frame, then fuse to reach the coverage and stability your application needs.
What to publish with the product
A small set of fields turns a map into an auditable record.
- Nominal date for each pixel or tile.
- Source list with acquisition times and sensors.
- Weights or contributions used in fusion.
- Quality layers such as cloud probability and viewing geometry.
- Processing version and parameters so results can be reproduced.
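The list above can be packaged as a small sidecar record that travels with each fused tile. This is a minimal sketch assuming a JSON sidecar; the field names and values are illustrative, not a fixed schema.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class FusionProvenance:
    """Sidecar record that makes a fused tile auditable. Field names are
    illustrative; the point is that date, sources, weights, quality, and
    processing version all travel with the product."""
    nominal_date: str        # date the fused tile represents
    sources: list            # acquisitions that contributed
    weights: dict            # per-source fusion weights
    quality_layers: list     # e.g. cloud probability, viewing geometry
    processing_version: str  # pipeline version for reproducibility
    parameters: dict = field(default_factory=dict)

record = FusionProvenance(
    nominal_date="2025-07-14",
    sources=[{"sensor": "S2B", "time": "2025-07-14T10:32:11Z"}],
    weights={"S2B_2025-07-14": 1.0},
    quality_layers=["cloud_probability", "view_zenith"],
    processing_version="v1.3.0",
)
sidecar = json.dumps(asdict(record), indent=2)
```

Serializing to JSON keeps the record human-readable in an audit while remaining trivial to parse programmatically.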
ClearSKY in practice
ClearSKY prioritizes data fusion for daily continuity while protecting date fidelity. We integrate all same-day observations first to keep the nominal date meaningful. If gaps remain, we add prior-day inputs in a controlled way with per-pixel provenance and weights. For longer baselines we harmonize Landsat archives into a Sentinel-2 style frame so studies can reach back to the 1980s. Methods vary by region and season, and parameters are versioned so results can be audited.
FAQ
Do I need harmonization if I only use Sentinel-2?
If you use a single level and sensor, harmonization is light. You still benefit from BRDF and illumination normalization for low sun and off-nadir scenes.
Should I fuse before I harmonize?
No. Harmonize first so inputs land in a common frame, then fuse. Otherwise the fusion can learn the wrong corrections.
Is histogram matching enough to harmonize sensors?
It’s a quick fix, not a stable solution. Prefer spectral response mapping and BRDF normalization, with a small bias correction from invariant targets.
How do I mix SAR with optical?
Use SAR to maintain continuity through clouds and darkness, and optical for spectral labels and indices when a clear scene arrives. Track contributions explicitly in provenance.