Scaling dimensions in spectroscopy of soil and vegetation

https://doi.org/10.1016/j.jag.2006.08.003

Abstract

The paper revises and clarifies definitions of the term scale and scaling conversions for imaging spectroscopy of soil and vegetation. We demonstrate a new four-dimensional scale concept that includes not only the spatial but also the spectral, directional, and temporal components. Three remote sensing scaling techniques are reviewed: (1) radiative transfer, (2) spectral (un)mixing, and (3) data fusion. Relevant case studies are given in the context of their up- and/or down-scaling abilities over soil/vegetation surfaces, and a multi-source approach is proposed for their integration.

Radiative transfer (RT) models are described to show their capacity for spatial and spectral up-scaling and directional down-scaling within a heterogeneous environment. Spectral information and spectral derivatives, like vegetation indices (e.g. TCARI/OSAVI), can be scaled and even tested by their means. Radiative transfer of an experimental Norway spruce (Picea abies (L.) Karst.) research plot in the Czech Republic was simulated by the Discrete Anisotropic Radiative Transfer (DART) model to demonstrate the relevance of correctly scaling object optical properties up to image data at two different spatial resolutions. The interconnection of the successive modelling levels in vegetation is shown. A future development in measurement and simulation of the leaf directional spectral properties is discussed.

We describe linear and/or non-linear spectral mixing techniques and unmixing methods that demonstrate spatial down-scaling. Relevance of proper selection or acquisition of the spectral endmembers using spectral libraries, field measurements, and pure pixels of the hyperspectral image is highlighted. An extensive list of advanced unmixing techniques, a particular example of unmixing a reflective optics system imaging spectrometer (ROSIS) image from Spain, and examples of other mixture applications give insight into the present status of scaling capabilities.

Simultaneous spatial and temporal down-scaling by means of a data fusion technique is described. A demonstrative example is given for the moderate resolution imaging spectroradiometer (MODIS) and LANDSAT Thematic Mapper (TM) data from Brazil. Corresponding spectral bands of both sensors were fused via a pyramidal wavelet transform in Fourier space. New spectral and temporal information of the resultant image can be used for thematic classification or qualitative mapping.

All three described scaling techniques can be integrated as the relevant methodological steps within a complex multi-source approach. We present this concept of combining numerous optical remote sensing data and methods to generate inputs for ecosystem process models.

Introduction

Quantitative research methods in environmental sciences are based on measurements of environmental variables. Each measurement of any quantitative parameter is expressed in the units of the measured variable. These units determine the level of the object observation, which is called scale. However, not only quantitative measurements but also qualitative descriptions are related to a specific level of detail. For example, the interactions of a photon, the elementary quantum of radiation, with the cellular structures inside a leaf (spatial dimensions in micrometres) differ from its behaviour inside a plant canopy between clumps of leaves, small twigs, branches, and trunks (spatial dimensions in metres). This phenomenon of coexistence and interaction of differently sized objects makes scale an inseparable and basic component of any scientific domain.

The need to connect different scales in environmental sciences, and the importance of defining common preferred spatial scales, was identified by Marceau (1999). The term scale, however, carries several disconnected definitions. Key concepts related to scale are used in different ways across disciplines and among scholars, which makes the comparison and communication of research results across subfields and disciplines more difficult (Schneider, 1994, Lam et al., 2004). After struggling with the confusion created by different uses of the same word, Gibson et al. (2000) presented definitions for the basic terms related to the concept of scale (Table 1). For a standardized lexicon, Quattrochi (1993) defined scale as “the integral of space and time over which a measurement is made”. In the geo-statistical sciences the word support is defined as “an n-dimensional volume within which linear average values of a regionalized variable may be computed” (Olea, 1991). The term scale is most commonly used in relation to the absolute or relative scale of space (Meentemeyer, 1989) (Table 1). Cao and Lam (1997) introduced four scale concepts of the spatial-temporal domain: (1) cartographic scale, (2) scale of spatial extent, (3) scale of action, and (4) spatial resolution. First, the cartographic or map scale refers to the ratio of a distance on a map to the corresponding distance on the ground. A large-scale map covers a small area in high detail, whereas a small-scale map covers a larger area with less detailed information. The geographical or observational scale, which refers to the size or spatial extent of the study, takes the opposite perspective: a geographic large-scale study covers a large area of interest, as opposed to a geographic small-scale study covering a small area (Cohen et al., 2003).
Third, the operational scale refers to the level at which observed processes operate in the environment (Turner et al., 2003). This scale, also called the scale of action, represents the level at which a certain process phenomenon is best observed. Finally, the fourth meaning is the resolution of the measurement scale. Spatial resolution refers to the smallest distinguishable parts of an object, for instance the pixel size of a raster map or remote sensing image (Dungan, 2001). Scale is such a pervasive and general feature that a “science of scale” has been proposed (Goodchild and Quattrochi, 1997). It has been concluded that such a science should address the following interrelated issues: invariance of scale, the ability to change scale, measures of the impact of scale change, scale as a parameter of process models, and the implementation of multi-scale approaches.

Spectroscopy is the branch of physics concerned with the production, transmission, measurement, and interpretation of electromagnetic spectra (Swain and Davis, 1978). Spectral properties can be measured using spectroradiometers or imaging spectrometers in the laboratory or field, as well as from an airborne or satellite platform. Imaging spectroscopy is a technique that records a separate electromagnetic spectrum for every pixel in the image (Meer and Jong, 2001). Imaging spectroscopy is an inseparable part of passive optical remote sensing. It has many names, including imaging spectrometry, hyperspectral remote sensing, or ultraspectral imaging (Kumar et al., 2001). Imaging spectroscopy has brought a new and expanded perception of the term scale in remote sensing, mainly in the spectral domain. Imaging spectroscopy data, in contrast to multispectral data, contain a high number of narrow spectral bands. Moreover, they can be acquired from several oblique viewing angles and with a higher revisit frequency by means of modern satellite systems (e.g. the CHRIS spectrometer on the PROBA satellite).

An imaging spectroscopy measurement of the Earth surface reflectance R is predominantly a function fR defined by the spatial, spectral, directional, and temporal scale (Baret, personal communication):

R = fR(x, y; λ; Ωv, Ωs; t)

where x and y are spatial coordinates, λ is the wavelength of the electromagnetic spectrum, Ωv and Ωs describe the angular viewing geometry of the sun–object–sensor system, and t is the time frequency of the observation. A definition of remote sensing spectral image scale must consider all four of these dimensions. Adjusting the scale definition of Quattrochi (1993), we can state that the spectroscopy scale of optical imaging data is “the combination of space, electromagnetic wavelengths, their directions, and time intervals over which a spectrometric measurement is made”. A more precise definition may be reached by extending the term support (Olea, 1991). The original specification of support includes the geometrical shape, size, and orientation of the volume. The volume of spectroscopy support should enclose, in addition to the spatial and geometrical content, a spectro-directional component and the time intervals between successive observations.
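The four-dimensional support of a single measurement R = fR(x, y; λ; Ωv, Ωs; t) can be captured in a small data structure; a minimal sketch in which the class and field names are illustrative, not taken from the paper:

```python
from dataclasses import dataclass

@dataclass
class SpectroscopySupport:
    """One imaging spectroscopy observation, R = f_R(x, y; lambda; Omega_v, Omega_s; t):
    spatial, spectral, directional (view + illumination), and temporal components."""
    x: float                 # spatial coordinate [m]
    y: float                 # spatial coordinate [m]
    wavelength_nm: float     # lambda, band centre wavelength [nm]
    view_zenith_deg: float   # theta_v, part of Omega_v
    view_azimuth_deg: float  # Phi_v, part of Omega_v
    sun_zenith_deg: float    # theta_s, part of Omega_s
    sun_azimuth_deg: float   # Phi_s, part of Omega_s
    time_s: float            # t, acquisition time [s since some epoch]

# one nadir observation of a 550 nm band
obs = SpectroscopySupport(0.0, 0.0, 550.0, 0.0, 0.0, 30.0, 180.0, 0.0)
```

A full image cube would then be an array of such observations, with resolution, extent, and sampling defined separately along each of the four axes.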

Traditional parameters describing the scale of remote sensing image data are resolution (grain) and extent. Consequently, considering the four-dimensional spectroscopy scale scheme, spatial resolution is equal to the elementary pixel size of a remotely sensed image and spatial extent corresponds to the total area covered within an image swath. These spatial parameters are functions of the digital matrix of the spectral sensor and the 'instantaneous field of view' (IFOV) given by the optical system, flight altitude, and flight velocity, respectively (Forshaw et al., 1983).
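The dependence of the elementary pixel size on the IFOV and flight altitude can be sketched with standard nadir-view geometry (a simplified illustration under flat-terrain, nadir-view assumptions; the function name is ours):

```python
import math

def ground_pixel_size(ifov_mrad: float, altitude_m: float) -> float:
    """Approximate nadir ground sampling distance [m] for a sensor with
    the given instantaneous field of view (IFOV, in milliradians) flown
    at the given altitude above ground."""
    return 2.0 * altitude_m * math.tan(ifov_mrad * 1e-3 / 2.0)

# e.g. a 1 mrad IFOV flown at 3000 m yields a nadir pixel of about 3 m
```

For small angles this reduces to altitude × IFOV, which is why halving the flight altitude roughly halves the pixel size.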

Spectral resolution is described by Lillesand and Kiefer (1994) as “the ability to discriminate fine spectral differences”. The spectral resolution of a sensor is often described by the 'full-width-half-maximum' (FWHM) of the instrument response to a monochromatic source (Liang, 2004). The spectral extent, also named spectral range, is the difference between the minimum and maximum wavelengths at which measurements are made (λmax − λmin). A new parameter, spectral sampling, has to be introduced to describe the number and position of the spectral channels. The spectral sampling interval is the spacing between sample points in the spectrum (Liang, 2004). As shown in Fig. 1, the sampling interval is independent of the spectral resolution, which implies that consecutive bands can overlap. This is usually the case in imaging spectroscopy instruments, since their aim is to derive a contiguous spectrum, where over-sampling reduces the amount of incoming noise at the cost of information redundancy. Spectral and spatial resolutions of multi- and/or hyperspectral images are in an inverse relationship due to technical constraints on the sensor side. There is usually a trade-off between spectral and spatial resolution because of the limited extent and minimal element size of the 'charge-coupled device' (CCD) array recording the spectral image. Lower spatial resolution, caused by binning of the spatial array columns, allows a narrowing of the FWHM and, subsequently, an increase in spectral resolution. Conversely, spectral binning of the wavelengths widens the FWHM and gives the opportunity to increase the spatial resolution. More traditional multispectral satellite instruments operate at high spatial resolution and lower spectral resolution with a short temporal sampling interval (e.g. LANDSAT 7 ETM+ or SPOT 5 HRG).
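The independence of sampling interval and FWHM can be illustrated with a small check for overlap between consecutive bands; a hedged sketch that assumes a symmetric (e.g. Gaussian) instrument response characterized only by its FWHM:

```python
def bands_overlap(centres_nm, fwhm_nm):
    """For each pair of consecutive channels, report whether they overlap,
    i.e. whether the sampling interval between their centres is smaller
    than the mean FWHM of the two channels."""
    overlaps = []
    for i in range(len(centres_nm) - 1):
        interval = centres_nm[i + 1] - centres_nm[i]
        overlaps.append(interval < 0.5 * (fwhm_nm[i] + fwhm_nm[i + 1]))
    return overlaps

# 10 nm sampling with 12 nm FWHM: an over-sampled, contiguous spectrum
contiguous = bands_overlap([500.0, 510.0, 520.0], [12.0, 12.0, 12.0])
```

With a 10 nm sampling interval and 12 nm FWHM every pair overlaps, the over-sampled case typical of imaging spectrometers; widening the sampling to 20 nm with the same FWHM leaves gaps between bands.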
New imaging spectroscopy satellite sensors acquire data with coarser spatial resolution but denser spectral sampling and higher spectral resolution (e.g. the medium resolution imaging spectrometer (MERIS) on the ENVISAT satellite), as well as at both high spatial and spectral resolutions (e.g. the Hyperion sensor on board the EO-1 satellite).

Reflectance by the Earth’s surface and scattering by atmospheric particles and gases have a strong directional behaviour. This phenomenon is scientifically described by the concept of the bi-directional reflectance distribution function (BRDF). The BRDF is a conceptual quantity that describes the reflectance of a target as a function of the independent variables describing viewing and illumination angles and of variables determining the geometrical and optical properties of the observed target (Nicodemus et al., 1977, Deering, 1989, Myneni and Ross, 1991, Liang, 2004). The BRDF describes the scattering of a parallel beam of incident light from one direction in the hemisphere into another direction in the hemisphere (Schaepman-Strub et al., 2004). The incident and viewing directions are each defined by zenith and azimuth angles: those of the illumination (in nature the sun zenith θs and azimuth Φs angle) and those of the sensor view (viewing zenith θv and azimuth Φv angle). The difference between the viewing and illumination azimuth angles is called the relative azimuth angle (Φ = Φv − Φs) (Schönermark et al., 2004). The BRDF [sr−1] can then be expressed as a reflectance function fBRDF of the source illumination projected solid angle Ωs, the viewing projected solid angle Ωv, and the wavelength λ:

BRDF = fBRDF(Ωs, Ωv, λ) = dLv(θs, Φs; θv, Φv; λ) / dEs(θs, Φs; λ)

where Lv is the reflected radiance and Es is the incident solar irradiance. The unitless bi-directional reflectance factor (BRF) is proportional to the BRDF according to the relation BRF = π × BRDF. Finally, the reflectance acquired under illumination by the ambient hemispherical sky is called the hemispherical-directional reflectance factor (HDRF). The HDRF is physically defined in the same way as the BRF, except that the HDRF includes illumination coming from the entire hemisphere (Schaepman-Strub et al., 2004).
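The relations BRF = π × BRDF and Φ = Φv − Φs translate directly into code; a minimal sketch with illustrative function names (the azimuth fold into [0°, 180°] reflects the usual convention that only the angular difference matters):

```python
import math

def brf_from_brdf(brdf_sr: float) -> float:
    """Unitless bi-directional reflectance factor from BRDF [sr^-1]:
    BRF = pi * BRDF."""
    return math.pi * brdf_sr

def relative_azimuth_deg(phi_v: float, phi_s: float) -> float:
    """Relative azimuth angle Phi = Phi_v - Phi_s between viewing and
    illumination azimuths, folded into [0, 180] degrees."""
    d = abs(phi_v - phi_s) % 360.0
    return 360.0 - d if d > 180.0 else d
```

A perfectly diffuse (Lambertian) surface has a constant BRDF of 1/π sr⁻¹, which this conversion maps to a BRF of exactly 1.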
Any outdoor HDRF measurement depends not only on the scattering optical properties of the observed object, but also on the atmospheric conditions, the surroundings of the object, the topography, and the wavelength. Spectral field measurements of the HDRF are often performed with a goniometer device. In order to obtain high accuracy, goniometer reflectance measurements are usually performed at a local scale and on a specific vegetation or soil (snow) surface. Natural spatial patterns, however, are not uniformly distributed in space, and their extent covers the whole range from micro to macro scale (e.g. cell structure, leaves, branches, trees, forest). Details of ground directional reflectance measurements are given in Bruegge et al. (2004).

In accordance with the BRDF concept, the directional resolution of angular spectral image data is represented by the IFOV, given by the parameters of the optical set, the size of the basic CCD array element, and the tilt, motion speed, and altitude of a sensor. Directional extent is specified by the interval between the minimal viewing direction (θv min, Φv min) and the maximal viewing direction (θv max, Φv max), i.e. the range of oblique viewing angles. Finally, directional sampling is expressed by the total number of viewing directions and their angular position within the hemispherical space. Presently, only a few real multi-angular imaging spectroscopy satellite sensors are operational. Examples of successful missions are the Compact High Resolution Imaging Spectrometer (CHRIS) sensor on board the PROBA satellite, providing five angular images in 63 spectral bands (nadir, ±36°, ±55°), or the Multiangle Imaging SpectroRadiometer (MISR) on the NASA EOS Terra platform, consisting of nine cameras capturing four VIS/NIR spectral bands in nine backward and forward along-track viewing directions. Developments in remote sensing technology and radiative transfer modelling indicate that angular signatures can be exploited to provide not only improved accuracies relative to single-angle approaches but also unique diagnostic information about the Earth's atmosphere and surface, e.g. identification of atmospheric aerosol, cloud, or surface vegetation type (Diner et al., 1999), capitalizing on both the geometric aspects of the technique and the radiometric variation of the signal with angle.

Temporal scale must be considered, since geochemical or geophysical constituents of the surface (e.g. concentrations of various chemical compounds, or water content) exhibit specific spectral features that vary over time. Monitoring this variation becomes more important when ecosystems and their reactions to climate effects are observed. Some authors traditionally refer to temporal resolution as the image frequency, which depends on the revisit time of a sensor; in other words, how often an image is acquired over a specific location on Earth (Franklin, 2001). However, strictly following the foregoing concept of the imaging spectroscopy spatial and spectro-directional scale, we propose that the sensor revisit time be called the temporal sampling interval rather than the temporal resolution. Since image spatial resolution is given by the size of the smallest CCD array element, temporal resolution should similarly be defined as the shortest time span needed to integrate the reflected radiative information by the CCD array into the image. This parameter is commonly called the integration and/or dwell time of a sensor. Temporal extent is taken as the time interval between the last and first observation of the same location (tmax − tmin), which can be several years for a given satellite platform. Perhaps the most important temporal characteristic, the revisit frequency of a satellite, is driven by the orbit parameters and viewing extent and varies from mission to mission. Among the low revisit frequency satellite platforms are LANDSAT 7 and EO-1 (both 16 days) or SPOT (26 days). Examples of high revisit frequency sensors are the moderate resolution imaging spectroradiometer (MODIS) aboard the Terra (EOS AM) and Aqua (EOS PM) satellites, viewing the entire Earth's surface every one to two days, or the Medium Resolution Imaging Spectrometer (MERIS) with a revisit frequency of three days.
Note that the practical temporal sampling interval is usually longer than the theoretical one, because clouds can cover the location of interest at the time of the sensor overpass.

The increasing availability of remote sensing sensors provides the possibility of choosing the systems that are best adapted for specific research interests. Various factors such as cost, availability at a certain time and place, sensor characteristics (spatial, spectral, temporal, directional resolution) and, of course, specific research interests determine the final decision. Sensor characteristics and research interests are strongly related and this is where scale considerations play a major role. The choice of the appropriate scale for every dimension in a particular application depends on several factors and is a function of the type of environment and the kind of information desired (Woodcock and Strahler, 1987).

Transfer of data content from one scale to another is called scaling. According to Dungan (2001), scaling, when applied in remote sensing and GIS, is a procedure that changes the size of a measurement unit. Basically, scaling can be performed by means of two approaches: bottom-up and top-down. The bottom-up approach up-scales information from smaller to larger observational scales, while the top-down approach down-scales, in other words decomposes, information at a certain geographical scale into its constituents at smaller scales (Marceau and Hay, 1999). The capability to process and present geographic information “up” and “down” from local, through regional, to global scales has been advocated as a solution to understanding the global systems of both natural (e.g. global climate change) and societal (e.g. global economy) processes and the relationships between the two (Lam et al., 2004). But, as mentioned by Jarvis (1995), scaling represents a scientific challenge because of the non-linear relations between processes and variables and the heterogeneity of the characteristics determining the rates of processes.
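The simplest bottom-up case, spatial up-scaling by linear aggregation, can be sketched as block averaging (an illustrative sketch only; as Jarvis (1995) notes, such linear aggregation is valid only for variables that actually average linearly, such as reflectance, not for non-linearly derived quantities):

```python
import numpy as np

def upscale_block_mean(image: np.ndarray, factor: int) -> np.ndarray:
    """Bottom-up spatial up-scaling: aggregate each factor x factor block
    of fine pixels into one coarse pixel by linear averaging."""
    h, w = image.shape
    h2, w2 = h - h % factor, w - w % factor      # crop to a full block grid
    blocks = image[:h2, :w2].reshape(h2 // factor, factor,
                                     w2 // factor, factor)
    return blocks.mean(axis=(1, 3))              # average within each block

# a 4 x 4 fine-resolution band aggregated to 2 x 2
fine = np.arange(16.0).reshape(4, 4)
coarse = upscale_block_mean(fine, 2)
```

The top-down direction is the harder problem: recovering the fine-scale constituents of a coarse pixel requires extra information, which is exactly what the unmixing and fusion techniques discussed later provide.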

Much literature has been published on scaling in environmental research, with the vast majority concerning spatial scaling. Since the principles of spatial scaling in imaging spectroscopy do not differ from scaling in other research fields, this topic will not be discussed deeply in this paper. Several books have been written on this subject, so for further reading we direct the reader to the following references: Tate and Atkinson (2001), Cao and Lam (1997), and Goodchild and Quattrochi (1997).

We describe several techniques to perform up- or down-scaling in spectroscopy. We limit the description of these techniques to radiance and reflectance, observational directions, and time. Scaling of derived remote sensing mapping products is not described, since these data can be treated like any other thematic data and are therefore not specific to imaging spectroscopy. Furthermore, the spatial scale is only discussed when a technique is used specifically with spectroscopy, when imaging spectroscopy has a large added value, or when the described technique can be used for spatial scaling as well as for scaling in another dimension of the spectroscopic domain.

Three techniques are discussed in this paper: (1) radiative transfer modelling, (2) spectral unmixing, and (3) data fusion. These techniques are widely investigated and accepted in the research field of spectroscopy. This does not mean that they are easy to apply or result in standard products. Expert knowledge of the techniques and of the physical processes specific to the study area is needed for their valuable use. Therefore, the described limitations of the techniques should be taken seriously; nonetheless, they may be useful for those who deal with scaling problems in spectroscopy. The techniques are illustrated with case studies in the field of environmental research, specifically vegetation and soil studies. Finally, we describe the concept of a multi-source approach offering the possibility to integrate all these techniques in one complex methodology using multi-resolution remote sensing data and field spectral measurements.

Section snippets

Radiative transfer theory—types of models

A modern physical approach for scaling the spectro-directional and spatial information of soil and vegetation is based on radiative transfer theory. This theory was originally developed for turbid media problems in astrophysics, nuclear, and atmospheric physics (Chandrasekhar, 1960, Marchuk and Lebedev, 1971, Sobolev, 1975). The original radiative transfer equation was a linear integrodifferential equation which has proved to be remarkably intractable (Hapke, 1993). Therefore,

Spectral mixing

Each pixel in a remote sensing image is treated as a homogeneous area with a single reflectance value. However, homogeneity is a rare phenomenon at the Earth surface, so this one value is a combined reflectance of all objects present within the pixel. The spatial distribution of the objects within the pixel is lost, but much of the spectral information is preserved in the spectral signature.
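This combined reflectance is what the linear mixing model formalizes, and inverting it recovers sub-pixel fractions. A minimal unmixing sketch (the endmember spectra below are invented for illustration, and the clip-and-normalise step is a crude surrogate for the properly constrained least-squares solution used in practice):

```python
import numpy as np

def linear_unmix(pixel: np.ndarray, endmembers: np.ndarray) -> np.ndarray:
    """Linear spectral unmixing: solve pixel ~= endmembers @ fractions by
    least squares, then enforce non-negativity and a unit sum crudely."""
    fractions, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    fractions = np.clip(fractions, 0.0, None)   # crude non-negativity
    return fractions / fractions.sum()          # crude sum-to-one

# two hypothetical endmember spectra as columns: soil and vegetation,
# sampled at three wavelengths
E = np.array([[0.30, 0.05],
              [0.35, 0.08],
              [0.40, 0.50]])
mixed = 0.6 * E[:, 0] + 0.4 * E[:, 1]   # a 60 % soil / 40 % vegetation pixel
fractions = linear_unmix(mixed, E)       # recovers approximately [0.6, 0.4]
```

With noise-free data and correct endmembers the fractions are recovered exactly; in real imagery, endmember selection and noise make the estimates approximate, which is why the choice of spectral endmembers is emphasized in the text.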

When more materials or objects are present within a pixel, the measured reflectance is a result of the

Image fusion as a scaling technique

We currently have an increasing number of different sensors that image at a variety of spatial scales and spectral bands. Each of these sensors has its own characteristics with different levels of detail in the spatial, spectral, temporal or directional dimension. Therefore, the opportunity to integrate information from different sensors is now greater than ever before. The aim of image fusion is to integrate information from multi-source data sets to create a composite image that contains a
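The idea of fusing corresponding bands in frequency space can be sketched with a Fourier low-/high-pass split standing in for the pyramidal wavelet transform of the MODIS/TM case study (a simplified illustration; it assumes co-registered, equally sized input bands, and the cutoff value is an arbitrary choice):

```python
import numpy as np

def fourier_fuse(coarse: np.ndarray, fine: np.ndarray,
                 cutoff: float = 0.15) -> np.ndarray:
    """Fuse two co-registered, equally sized bands in Fourier space:
    keep the low spatial frequencies (radiometry) of the coarse-resolution
    sensor and the high frequencies (spatial detail) of the fine one."""
    fc, ff = np.fft.fft2(coarse), np.fft.fft2(fine)
    h, w = coarse.shape
    fy = np.fft.fftfreq(h)[:, None]          # normalised frequencies, rows
    fx = np.fft.fftfreq(w)[None, :]          # normalised frequencies, cols
    low = np.hypot(fy, fx) <= cutoff         # circular low-pass mask
    fused = np.where(low, fc, ff)            # substitute bands of frequencies
    return np.real(np.fft.ifft2(fused))
```

A wavelet pyramid performs the analogous split at several scales at once and with spatial localization, which is why it is preferred for real multi-sensor fusion; the Fourier version shows only the core substitution step.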

Multi-source remote sensing data

Reflectance measurements are usually performed at a local scale and on a specific vegetation surface and therefore exhibit limited significance for larger areas. The objective of this section is to propose a method of combining various remote sensing data and to point out the potential of diverse remote sensing data as input to local, regional or global ecosystem models (Huber et al., 2004). Thereby the whole processing chain from the input data sources to the end-user specific product, e.g.

General discussion and conclusions

Traditional understanding of the term scale in remote sensing is defined by two parameters: grid resolution (grain) and spatial extent. This review paper shows that the scale of imaging spectroscopy data is not a simple spatial function but must be defined as a complex four-dimensional function of space, wavelengths of the electromagnetic spectrum, angular geometrical vectors, and time. Consequently, this new idea, proposed for imaging spectroscopy data by Baret (personal communication),

References (182)

  • T.P. Dawson et al., LIBERTY—modeling the effects of leaf biochemical concentration on reflectance spectra, Rem. Sens. Environ. (1998)
  • V. Demarez et al., A modeling approach for studying forest chlorophyll content, Rem. Sens. Environ. (2000)
  • P.E. Dennison et al., Endmember selection for multiple endmember spectral mixture analysis using endmember average RMSE, Rem. Sens. Environ. (2003)
  • K.S. Fassnacht et al., Estimating the leaf area index of North Central Wisconsin forests using the Landsat Thematic Mapper, Rem. Sens. Environ. (1997)
  • J.P. Gastellu-Etchegorry et al., Modeling radiative transfer in heterogeneous 3-D vegetation canopies, Rem. Sens. Environ. (1996)
  • C.C. Gibson et al., The concept of scale and the human dimensions of global change: a survey, Ecol. Econ. (2000)
  • N.S. Goel et al., A computer graphics based model for scattering from objects of arbitrary shapes in the optical region, Rem. Sens. Environ. (1991)
  • H.N. Gross et al., Application of spectral mixture analysis and image fusion techniques for image sharpening, Rem. Sens. Environ. (1998)
  • D. Haboudane et al., Integrated narrow-band vegetation indices for prediction of crop chlorophyll content for application to precision agriculture, Rem. Sens. Environ. (2002)
  • A.R. Huete, A soil-adjusted vegetation index (SAVI), Rem. Sens. Environ. (1988)
  • S. Jacquemoud et al., PROSPECT—a model of leaf optical properties spectra, Rem. Sens. Environ. (1990)
  • S. Jacquemoud et al., Estimating leaf biochemistry using the PROSPECT leaf optical properties model, Rem. Sens. Environ. (1996)
  • R. Lacaze et al., Retrieval of vegetation clumping index using hot spot signatures measured by POLDER instrument, Rem. Sens. Environ. (2002)
  • R. Lacaze et al., G-function and HOt SpoT (GHOST) reflectance model: application to multi-scale airborne POLDER measurements, Rem. Sens. Environ. (2001)
  • K.-S. Lee et al., Hyperspectral versus multispectral data for estimating leaf area index in four different biomes, Rem. Sens. Environ. (2004)
  • C.C.D. Lelong et al., Hyperspectral imaging and stress mapping in agriculture: a case study on wheat in Beauce (France), Rem. Sens. Environ. (1998)
  • S. Li et al., Using the discrete wavelet frame transform to merge Landsat TM and SPOT panchromatic images, Inf. Fusion (2002)
  • G.I. Metternicht et al., Remote sensing of soil salinity: potentials and constraints, Rem. Sens. Environ. (2003)
  • R.B. Myneni, Modeling radiative-transfer and photosynthesis in 3-dimensional vegetation canopies, Agric. Forest Meteorol. (1991)
  • R.B. Myneni et al., Photon interaction cross-sections for aggregations of finite-dimensional leaves, Rem. Sens. Environ. (1991)
  • R.B. Myneni et al., A 3-dimensional radiative-transfer method for optical remote-sensing of vegetated land surfaces, Rem. Sens. Environ. (1992)
  • R.B. Myneni et al., A simplified formulation of photon transport in leaf canopies with scatterers of finite dimensions, J. Quant. Spectrosc. Radiative Transfer (1991)
  • F.W. Acerbi-Junior et al., Are we using the right quality measures in multi-resolution data fusion?
  • J.B. Adams et al., Imaging spectroscopy: interpretation based on spectral mixture analysis
  • J.B. Adams et al., Spectral mixture modeling: a new analysis of rock and soil types at the Viking Lander I site, J. Geophys. Res. (1985)
  • B. Aiazzi et al., Context-driven fusion of high spatial and spectral resolution images based on oversampled multiresolution analysis, IEEE Trans. Geosci. Rem. Sens. (2002)
  • W.A. Allen et al., Plant-canopy irradiance specified by Duntley equations, J. Opt. Soc. Am. (1970)
  • G. Asrar, Theory and applications of optical remote sensing
  • S. Baronti et al., Pan-sharpening of very high-resolution multispectral images via generalised Laplacian pyramid fusion, Bull. Soc. Fr. Photogrammetrie Teledetection (2002)
  • U. Beisl, Correction of Bidirectional Effects in Imaging Spectrometer Data, Ph.D. Thesis, Remote Sensing... (2001)
  • P. Blanc et al., Using iterated rational filter banks within the ARSIS concept for producing 10 m Landsat multispectral images, Int. J. Rem. Sens. (1998)
  • J. Boardman, Inversion of high spectral resolution data, Proc. SPIE Tech. Symp. Opt. Electro-Opt. Sens. (1990)
  • J.W. Boardman et al., Mapping target signatures via partial unmixing of AVIRIS data: in summaries, Fifth JPL Airborne Earth Science Workshop, JPL Publication (1995)
  • X. Briottet, Fundamentals of bi-directional reflectance and BRDF modelling: concepts and definitions
  • C.J. Bruegge et al., Field measurements of bi-directional reflectance
  • N.J.J. Bunnik, The Multispectral Reflectance of Shortwave Radiation by Agricultural Crops in Relation with their Morphological and Optical Properties (1978)
  • C. Cao et al., Understanding the scale and resolution effects in remote sensing and GIS
  • L.M.T. Carvalho et al., Multi-scale feature extraction from images using wavelets
  • S. Chandrasekhar, Radiative Transfer (1960)
  • P.S. Chavez et al., Comparison of three different methods to merge multiresolution and multispectral data: Landsat TM and SPOT panchromatic, Photogramm. Eng. Rem. Sens. (1991)