Original Article

Remote sensing image fusion: an update in the context of Digital Earth

Pages 158-172 | Received 20 Sep 2013, Accepted 21 Nov 2013, Published online: 19 Dec 2013

Abstract

Remote sensing image fusion has come a long way from research experiments to an operational image processing technology. Having established a framework for image fusion at the end of the 1990s, we now provide an overview of the advances in image fusion during the past 15 years. Assembling information about new remote sensing image fusion techniques, recent technical developments and their influence on image fusion, international societies and working groups, and new journals and publications, we provide insight into new trends. It becomes clear that image fusion facilitates remote sensing image exploitation. It aims at achieving more reliable information to better understand complex Earth systems. The numerous publications of the last decade show that remote sensing image fusion is a well-established research field. The experiences gained foster other technological developments in terms of sensor configuration and data exploitation. Multi-modal data usage enables the implementation of the concept of Digital Earth. To advance in this respect, image fusion needs updated guidelines and a set of commonly accepted quality assessment criteria.

1. Introduction

Since the publication of an extended remote sensing image fusion review paper in 1998 (Pohl and van Genderen 1998), there have been many technological advances in the field of remote sensing image fusion, an enormous increase in computer power, speed, and data storage, as well as new algorithms and applications. In the same year that the above-mentioned review was published, Al Gore gave his famous speech on ‘Digital Earth: Understanding our planet in the 21st Century’ (Gore 1998). His statement that ‘… we need a digital Earth. A multi-resolution, three-dimensional representation of the planet, into which we can embed vast quantities of geo-referenced data’ has stimulated much new research into interoperability and data integration, of which remote sensing image fusion forms a part. In addition, numerous new satellites with much higher spatial and spectral resolutions have been launched since that date, which enabled many other developments in this increasingly useful research field and brought new challenges to scientists. Hence, this paper aims to inform the reader of some of these new developments, indicating the progress that has been made and highlighting future research needs.

In the late 1990s, image fusion started to be accepted and applied in the remote sensing community. Since then, it has become part of commercial software packages, product lists of satellite image providers, and an important part of processing imagery for Digital Earth research. Fusion of multi-source images is considered a prime solution to optimize information extraction from remote sensing data (Zhang 2010; Fang et al. 2013). The challenges that researchers face nowadays are related more to the enormous increase in the spatial and spectral resolutions of the images provided and to the development of standardized quality evaluation criteria (Thomas and Wald 2004; Thomas et al. 2008; Zhang 2008a; Bovolo et al. 2010).

The second section of this paper concentrates on new image fusion approaches, followed by quality assessment methods. New developments in terms of remote sensing technology and computing hardware are presented in section four. Then we describe new working groups, societies, and publications. The paper ends with conclusions that address new research fields.

2. New algorithms

The methods used to fuse remote sensing images have developed from generic algorithms towards adaptive and context-oriented methods. With the growing number of sensors and bands and the increase in resolution, this was a logical development to account for the spatial and spectral integrity of the fused images. Researchers in the field have significantly improved the results of image fusion by considering the sensor and study area characteristics along with the anticipated information that they are looking for. Whereas early approaches restricted the user to fusing three multispectral bands with one panchromatic band, more sophisticated approaches now offer full flexibility: the number of bands is no longer limited, and the resulting fused image is adapted towards selected image content criteria, for example, spectral preservation or increased spatial enhancement. The following paragraphs discuss trends in image fusion and the challenges that users face today, and provide an overview of recently published techniques to fuse remotely sensed images.

2.1. Trends in image fusion

In the late 1990s and early 2000s, one of the main research areas in remote sensing image fusion was pansharpening, combining high- and low-resolution images from the same satellite system. Examples are the combination of 10-meter panchromatic images with the 20-meter multispectral images from the Satellite Pour l'Observation de la Terre (SPOT) satellite (e.g. Vrabel 1996). Other commonly used image combinations applied optical data from different platforms, such as SPOT 10-meter panchromatic with Landsat multispectral images at 30-meter spatial resolution (e.g. Chavez, Sides, and Anderson 1991; Price 1999; Ranchin and Wald 2000). Other processing approaches focused on the use of optical and microwave remote sensing images for applications that required input from synthetic aperture radar (SAR) (e.g. Harris, Murray, and Hirose 1990; Yesou et al. 1993). In the meantime, image fusion has become well established. Pansharpening still enjoys high popularity (Ehlers et al. 2010; Guo et al. 2010; Massip, Blanc, and Wald 2012). A comprehensive review including an overview of the different applications and recommendations has been prepared by Zhang (2008b). The importance of pansharpening is also shown by a US patent on the Gram-Schmidt (GS) pansharpening algorithm (No. 6011875; Laben, Bernard, and Brower 2000) and by commercially used algorithms (e.g. Zhang 2004; Fuze Go™ 2013). Fusion algorithms have now become standard functions in most commercial software packages such as ERDAS, PCI, ENVI, etc. Satellite and airborne sensors with higher spatial and spectral resolution are increasingly available. As a result, remote sensing image fusion is now also feasible between satellite imagery and airborne data-sets such as hyperspectral and Light Detection and Ranging (LiDAR) data (Huang, Zhang, and Yu 2012; Berger et al. 2013).

2.2. New challenges for the user

Taking into account the increased demand for detail and the parameters to be considered, researchers are facing new challenges:

  • Integration of more than three spectral bands;

  • High-resolution images require higher geometric accuracy prior to image fusion;

  • Different observation angles impose different shadow features in high-resolution images, creating artifacts in fused products;

  • Changes in the range of spectral bands of new-generation sensors require particular consideration in matching the panchromatic channel;

  • The high spatial resolution makes disturbances of fine features visible (e.g. blurred building edges or moving cars) that were not even noticeable before;

  • Higher spectral and spatial resolutions put higher demands on the quality of fused images and therefore on the fusion technique itself;

  • The geometric and radiometric requirements for good-quality fused imagery place high demands on the operator's knowledge and on processing complexity.

2.3. New sensors provide new potential

In terms of data sources being fused, new sensors are being taken into account, for example, fusing airborne hyperspectral imagery with high-resolution satellite data (Borel and Spencer 2009; Cetin and Musaoglu 2009; Zhang 2012; Zhang et al. 2013), optical data with LiDAR data (Dalponte et al. 2008; Goodenough et al. 2008; Roberts, van Aardt, and Ahmed 2011; Buckley et al. 2013; Buddenbaum, Seeling, and Hill 2013), and high-resolution SAR images from satellite systems such as the 1-meter TerraSAR-X with high-resolution optical systems (Chen, Hepner, and Forster 2003; Amarsaikhan et al. 2010; Berger 2010; Lisini et al. 2011; Berger et al. 2013) for regular monitoring in cloud-covered regions of the world.

2.4. Development of new image fusion techniques

With the advancement of sensor technology and the availability of new, very diverse imagery, the further development of commonly used fusion techniques became a prerequisite for success. Not only was it necessary to open the techniques to more than three bands, but the amount of detail in the images also required adaptive and context-oriented approaches. As a result, researchers derived new versions of popular fusion techniques, such as Intensity Hue Saturation (IHS), principal component analysis (PCA), the Brovey transform (BT), and wavelet-based techniques, and created so-called hybrid techniques (Zhang 2010) that combine known techniques. Apart from the methods listed in the following paragraphs, a good summary and description of the adapted and amended methods can be found in Alparone et al. (2007).

In terms of new algorithms, many researchers are concerned with various uses of the wavelet transform (Ranchin et al. 2003; Mercer et al. 2005; Gamba et al. 2012; Xiao and He 2013). Another development is the curvelet transform. To separate the image into unconnected scales, curvelets use multi-scale ridgelets with bandpass filtering. In comparison to wavelets, curvelets have the advantage of integrating directional elements (Nencini et al. 2007). Interestingly, more and more techniques are used in an integrated manner. Wavelets started to be combined with the traditional IHS method (Amolins, Zhang, and Dare 2007; Roberts, van Aardt, and Ahmed 2011). Others apply adapted versions of substitution techniques, such as the Generalized Intensity Hue Saturation (GIHS) with a genetic algorithm (GIHS-GA), or IHS in combination with the fast Fourier transform (FFT), called Ehlers fusion (Ehlers et al. 2010). The approach by Ehlers extracts the spatial detail in the frequency domain and incorporates this information into the multispectral data by the IHS transform.
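To make this concrete, the following is a minimal sketch of such frequency-domain detail injection, assuming a simple arithmetic-mean intensity component, an ideal low-pass mask, and ratio-based back-substitution; the published Ehlers fusion uses more elaborate filter design and histogram matching, so this illustrates the principle only.

```python
import numpy as np

def ihs_fft_fusion(ms_rgb, pan, cutoff=0.125):
    """Sketch of IHS/FFT detail injection (simplified; not the full Ehlers fusion).

    ms_rgb : (H, W, 3) multispectral bands, upsampled to the pan grid
    pan    : (H, W) panchromatic band, radiometrically matched to the MS data
    cutoff : normalized frequency separating low- and high-frequency content
    """
    intensity = ms_rgb.mean(axis=2)                 # simple linear intensity
    h, w = pan.shape
    fy = np.fft.fftfreq(h)[:, None]                 # vertical frequencies
    fx = np.fft.fftfreq(w)[None, :]                 # horizontal frequencies
    lowpass = (np.hypot(fx, fy) <= cutoff).astype(float)
    # low-pass the intensity, high-pass the pan, recombine in the image domain
    i_low = np.real(np.fft.ifft2(np.fft.fft2(intensity) * lowpass))
    pan_high = np.real(np.fft.ifft2(np.fft.fft2(pan) * (1.0 - lowpass)))
    i_new = i_low + pan_high
    # ratio-based inverse: rescale bands so hue/saturation stay roughly constant
    return ms_rgb * (i_new / (intensity + 1e-9))[:, :, None]
```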

Another new technique is the GS approach, which also exists as an adaptive version (GSA) and as GSA using a local model (GSA-CA) (Aiazzi, Baronti, and Selva 2007; Choi et al. 2013), just to name a few. Others tackle the fusion process at a higher level using a unified framework, establishing correspondences between structures in high spatial resolution images and in high temporal but low spatial resolution images (Huang and Song 2012). All recent techniques aim at taking into account sensor-specific characteristics, image band fit, and the local context in the imagery (Park and Kang 2004).
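Component-substitution methods such as GS can be expressed as a band-wise injection of pan detail with covariance-based gains (a formulation used, e.g., in Aiazzi, Baronti, and Selva 2007). Below is a simplified sketch under that formulation; the function name and the arithmetic-mean intensity are illustrative assumptions, and the adaptive variants (GSA, GSA-CA) replace the simple mean with regression weights and local models.

```python
import numpy as np

def gs_style_sharpen(ms, pan):
    """Simplified gain-injection sketch of GS-style pansharpening.

    ms  : (H, W, B) multispectral stack, upsampled to the pan grid
    pan : (H, W) panchromatic band, histogram-matched to the intensity
    """
    sim_pan = ms.mean(axis=2)          # simulated low-resolution pan
    detail = pan - sim_pan             # spatial detail to be injected
    var = sim_pan.var()
    fused = np.empty_like(ms, dtype=float)
    for b in range(ms.shape[2]):
        # injection gain: covariance of the band with the simulated pan
        gain = np.cov(ms[..., b].ravel(), sim_pan.ravel())[0, 1] / var
        fused[..., b] = ms[..., b] + gain * detail
    return fused
```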

A commercial algorithm that has just entered the market is Fuze Go™ (2013). It is based on the University of New Brunswick (UNB) algorithm, using regression analysis to produce weight factors and creating a synthetic image by multiplication and addition. The adaptive and context-specific approach preserves spectral integrity while adding spatial detail. Depending on the anticipated application, the user chooses different adjustments of the technique according to the parameters needed for interpretation (spectral integrity or high spatial detail). The commercial offer is currently being extended to a visible and infrared (VIR)/SAR fusion algorithm called Fuze Go SAR.
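As a rough illustration of regression-derived weights (the UNB/Fuze Go implementation itself is proprietary, so the function and its details here are hypothetical), one could estimate band weights by least squares and then modulate the bands with the ratio of the real to the synthetic intensity:

```python
import numpy as np

def regression_weight_fusion(ms, pan):
    """Generic regression-weighted fusion sketch (not the proprietary UNB code).

    ms  : (H, W, B) multispectral stack, co-registered and upsampled to pan
    pan : (H, W) panchromatic band
    """
    h, w, b = ms.shape
    bands = ms.reshape(-1, b)
    # least-squares weights so that a weighted band sum approximates the pan
    weights, *_ = np.linalg.lstsq(bands, pan.ravel(), rcond=None)
    synthetic = (bands @ weights).reshape(h, w)     # synthetic intensity image
    # inject spatial detail by modulating each band with the pan/synthetic ratio
    return ms * (pan / (synthetic + 1e-9))[..., None]
```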

Another interesting new look at image fusion is the approach published by Huang et al. (2013), called the spatial and spectral fusion model (SASFM), which uses spectral unmixing and spectral libraries. It is a two-stage algorithm: the first step establishes an optimal set of atoms from a spectral library, and the second step reconstructs the spectrum of each pixel to produce the high-resolution multispectral image. In summary, it can be stated that fusion algorithms have started to become ‘intelligent’, meaning that they are no longer generic but consider sensor, context, application, and other parameter knowledge in the process.

3. Quality assessment

There is value in performing image fusion only if the resulting fused image contains information of greater quality than the individual input images alone. This becomes obvious when we look at the accepted definition of image fusion proposed by Pohl and van Genderen (1998). Thomas and Wald (2004) contributed very valuable insights to the quality assessment aspect of image and data fusion. They refer to the issue of defining quality gain and of using a genuine reference to obtain a quality statement. As a result, it is very important to distinguish between relative and absolute quality. Relative quality refers to the closeness of the fused image to the original image. Absolute quality corresponds to the similarity between the fused image and a reference image. The latter is produced synthetically because there is no access to an image that has the desired quality parameters for comparison (Thomas and Wald 2007; Alparone et al. 2007).

Most commonly, users of image fusion assess their achievements in two different ways. The first, a qualitative approach, is the visual inspection of the results, comparing the outcome to the original input data. The second, a quantitative approach, uses statistics and other assessment methods to provide comparable quality measures, such as quality indexes. An excellent overview of quality metrics is compiled in Li, Li, and Gong (2010). Others measure the outcome of an application of fused images, for example, in the context of land cover mapping and classification (Amarsaikhan et al. 2012; Colditz et al. 2006; Pohl and Hashim 2013).

A visual evaluation of the results to assess fused image quality is done in almost all published studies in the field. In addition, the research community has come up with a variety of established quantitative evaluation criteria that can be applied, even though to date there is no consensus on which criteria should be used as a standard (Thomas and Wald 2004; Zhang 2008a).

A selection of commonly used indexes and assessment methods is listed in Table 1. This table is not meant as a review, nor does it list all published indexes. It illustrates the variety of quality measures image fusion users have access to and the difficulty of choosing the ‘right’ one.

Table 1. Examples of commonly used fusion quality assessment indexes.
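As an illustration of the kind of measures involved, the sketch below computes two widely used indexes: the universal image quality index (UIQI) of Wang and Bovik (2002) in its global, single-window form (published implementations usually average it over sliding windows), and the ERGAS of Wald (2000).

```python
import numpy as np

def uiqi(x, y):
    """Global universal image quality index (Wang and Bovik 2002)."""
    x, y = x.ravel().astype(float), y.ravel().astype(float)
    cov = np.cov(x, y)                              # 2x2 covariance matrix
    num = 4.0 * cov[0, 1] * x.mean() * y.mean()
    den = (cov[0, 0] + cov[1, 1]) * (x.mean() ** 2 + y.mean() ** 2)
    return num / den                                # 1.0 means identical images

def ergas(fused, reference, ratio):
    """ERGAS, the relative dimensionless global error in synthesis (Wald 2000).

    fused, reference : (H, W, B) fused image and reference at the same scale
    ratio            : pan/MS pixel-size ratio, e.g. 0.25 for 1 m pan and 4 m MS
    """
    terms = [
        np.mean((fused[..., b] - reference[..., b]) ** 2)
        / reference[..., b].mean() ** 2
        for b in range(fused.shape[2])
    ]
    return 100.0 * ratio * np.sqrt(np.mean(terms))  # lower is better
```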

There is plenty of literature explaining the various indexes and applying them in image fusion studies (Gonzalez-Audicana et al. 2004; Wang et al. 2004; Alparone et al. 2007; Ling et al. 2007; Li, Li, and Gong 2010; Meng, Borders, and Madden 2010; Padwick et al. 2010; Fonseca et al. 2011). An interesting study is described in a recent paper by Abdikan et al. (2012), focusing on quality considerations in the case of optical and radar image fusion.

What we can also see from Table 1 is that there is neither consensus in the literature on a uniform quality assessment approach, nor consistent naming for the various indexes. Looking at the Mean Structure Similarity Index Measure (MSSIM) and the Structural Similarity Quality Metric (SSQM), for example, the authors use similar approaches, but individual researchers keep producing their own terminology (Wang et al. 2004; Zheng and Qin 2009). In the context of Digital Earth this is not very useful. To achieve a multi-dimensional, multi-scale, multi-temporal, and multi-layer information facility for a better understanding of the complex Earth systems, we need unified approaches and terms. This may be a role that the International Society for Digital Earth could take up via one of its Working Groups (ISDE 2013).

4. New developments in technology

Since 1998, computer performance has greatly increased: from about 2000 million instructions per second (MIPS) at 600 MHz at the end of the 1990s to about 180,000 MIPS at 3 GHz in 2011. Data storage has increased by orders of magnitude: instead of talking about megabyte capacity, we are now dealing with tera- or even petabytes (Wikipedia 2013). As image fusion requires not only the input images but also many intermediate stages, such as geometrically and radiometrically corrected images and enhanced or otherwise pre-processed images, plus the fused output image, these developments have made it much easier to perform image fusion today than 15 years ago. Hence current researchers are able to fuse very large image data-sets from different sensors at high resolutions, even using sophisticated fusion techniques, in a timely manner. This enables many more applications to be carried out efficiently but also introduces new problems and issues to be considered, as already mentioned in section 2.2.
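For a sense of the data volumes involved, the following back-of-the-envelope estimate (scene size, band count, and number of intermediate copies are illustrative assumptions, not figures from this article) shows how quickly a single fusion experiment grows:

```python
# Storage estimate for one fusion experiment with intermediate products.
pixels = 30_000 * 30_000     # one 15 km x 15 km scene at 0.5 m resolution
bands = 8                    # e.g. 8 multispectral bands
bytes_per_sample = 2         # 16-bit radiometry
copies = 4                   # raw, geocoded, radiometrically corrected, fused
total_gb = pixels * bands * bytes_per_sample * copies / 1e9
print(f"{total_gb:.0f} GB")  # ~58 GB for a single experiment
```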

The imagery provided by the many satellites launched since 1998 has also improved in quality with regard to spatial resolution and the number of spectral bands available. There are high-resolution optical (spatial and spectral resolution) as well as microwave (spatial resolution and polarization) sensors. High-resolution data have become more and more publicly available due to the launch of many commercial satellites, in contrast to the many military satellites before. In the 1990s, spatial resolutions ranged between 10 and 80 meters; now we look at 0.5 to 1 meter resolutions, introducing a whole set of new challenges (e.g. matching accuracy, shadow, distortions, etc.). The number of spectral bands that we were concerned with in the 1990s was up to seven (Landsat TM, later ETM+); now hyperspectral sensors deliver hundreds of bands. We have access to stereo capabilities, and the sensors cover the entire range of the electromagnetic spectrum. In image fusion there are new aspects to be considered because the spectral range of panchromatic bands has changed (Padwick et al. 2010). In addition, previously working algorithms, such as the very popular IHS or the BT, no longer work in their original form because they are limited to three multispectral bands. Multi-look capabilities introduce different shadow effects between different image acquisition dates that have to be accounted for. In radar remote sensing, the multi-polarization aspect provides a wealth of new information for the user to deal with (Goodenough et al. 2008), not to mention the amount of data to be handled.

5. Image fusion working groups and societies

From the experimental stages at the very beginning of remote sensing image fusion, for example by Welch and Ehlers (1987), to commercial fused-image production, the technology has come a long way. It is only natural that professional societies recognized this and set up new mechanisms to advance knowledge in this field.

5.1. International Society for Digital Earth (ISDE)

In 1999, the Chinese Academy of Sciences organized the first International Symposium on Digital Earth in Beijing. That led to the establishment of the International Society for Digital Earth (ISDE), which organizes biennial Symposia and Digital Earth Summits. These events, plus the Society's journal, the International Journal of Digital Earth, have shown how image fusion techniques have enabled highly accurate 3-D models and the integration of multi-source, multi-resolution, and multi-temporal global Earth observation data-sets (see also below) (Liu et al. 2012; Abdikan et al. 2012; Wu et al. 2013; Yu et al. 2013).

5.2. International Society of Photogrammetry and Remote Sensing (ISPRS)

In 2004, ISPRS set up a new Technical Commission VII entitled Thematic Processing, Modeling and Analysis of Remotely Sensed Data. This Commission established a Working Group (VII/6) on Remote Sensing Image Fusion, chaired by Zhang Jixian (ISPRS 2013). During the past nine years, this Working Group has organized several international remote sensing image fusion symposia, such as on optical-SAR image fusion. In August 2013, the third International Workshop on Image and Data Fusion was held in Jilin, China. Major issues discussed in this workshop included the monitoring of changes and the improvement of our understanding of the dynamics of processes related to food and water security, sustainability, climate change, and disasters. The researchers presented new fusion techniques; issues in the use of SAR, interferometric SAR (InSAR), and unmanned aerial vehicles (UAVs); as well as more reliable and efficient ways to produce and update spatial data to feed Digital Earth.

5.3. IEEE: Geoscience and Remote Sensing Society (GRSS)

Another activity that boosted research in remote sensing image fusion was the IEEE's GRSS Working Group on image fusion. The IEEE GRSS Data Fusion Technical Committee (DFTC) serves as a global, multi-disciplinary network for geospatial data fusion, connecting people and resources (http://www.grss-ieee.org/community/technical-committees/data-fusion/). A very interesting and advancement-fostering activity of the DFTC is the annual Data Fusion Contest, which has been run since 2006. Here, a well-calibrated data-set is provided to all interested researchers to demonstrate their algorithms for fusing and classifying the different remote sensing data-sets. The first contest was on pansharpening (Alparone et al. 2007); the results of 2007, dealing with urban mapping, were published in Pacifici et al. (2008); the 2012 contest addressed very high spatial resolution multi-modal/multi-temporal image fusion using multispectral, SAR, and LiDAR data (Berger et al. 2013); and in 2013 the contest participants received a hyperspectral data-set along with a LiDAR image. Over the years, interest in this contest has increased to more than 1000 researchers.

5.4. The International Society of Information Fusion (ISIF)

The International Society of Information Fusion (ISIF) (http://www.isif.org) is a large society focusing on all aspects of information fusion, which is of course much broader than remote sensing image fusion. A paper on its website by Bostrom et al. (2007) gives an interesting perspective on information fusion as a research discipline. The society holds an annual International Conference, which produces relevant proceedings. The ISIF also publishes a journal, namely the Journal of Advances in Information Fusion, which has produced several interesting Special Issues. The Society has three Working Groups, of which the one on ‘Fusion Process Model and Framework’ is of most relevance to remote sensing researchers.

6. Publications

As a result of all the efforts undertaken in remote sensing image fusion, a variety of new journals, textbooks, and research articles have been published in the meantime.

6.1. New journals

One of the most significant events in this field was the launch of a dedicated new international peer-reviewed journal entitled the International Journal of Image and Data Fusion (IJIDF). The first issue came out at the beginning of 2010. Next year (2014), the journal will apply for Science Citation indexing with Thomson Reuters. The journal is published by the same publisher as this article. The IJIDF is not limited to remote sensing, but more than 90% of the papers published to date are on remote sensing image fusion (http://www.tandfonline.com/tidf).

Another new journal since 1998 is the Elsevier journal ‘Information Fusion’. As mentioned above, the ISIF also has a journal, which at times contains useful papers for remote sensing researchers working on image fusion. These journals, although they include papers on remote sensing image fusion from time to time, are much broader in their aims and scope, and most papers are from non-remote sensing fields. However, their Special Issues are useful for remote sensing researchers.

Last but not least, the International Journal of Digital Earth, launched in 2008, has published many papers on various aspects of image and data fusion in relation to Digital Earth (Shupeng and van Genderen 2008; Abdikan et al. 2012; Wu et al. 2013; Yu et al. 2013). Due to the integrative nature of Digital Earth, the journal contributes significantly to the dissemination of advances in image and data fusion.

6.2. New textbooks

In 1998 there were no dedicated textbooks on remote sensing image fusion. In fact, even now, in 2013, there is no good textbook on this subject yet. Some of the better books available back in 1998 were that by Hall (1992), on mathematical techniques in multi-sensor data fusion, and that by Bar-Shalom from 1995, which was updated in 2011 (Bar-Shalom, Willett, and Tian 2011). The update shows the increasing emphasis on image and data fusion, as the new version devotes four chapters (instead of just one) to image and data fusion (more than 250 pages). As of today, there are many up-to-date, excellent books on data fusion, fewer on image fusion, and even fewer on remote sensing image fusion. Books on data fusion that ought to be mentioned are those by Klein (1999), Hall and McMullen (2004), Liggins II, Hall, and Llinas (2009), and Mitchell (2012). One of the earliest books on remote sensing image fusion was that by Wald in 2002. It focused on pansharpening of SPOT images (panchromatic with multispectral images). Of course, as both sensors are on the same platform, with the same orbital characteristics, this is much more straightforward than fusing images from different satellite systems, with different orbit inclinations, different sensor geometry, completely different spectral characteristics, etc. There are some recent books on remote sensing image fusion, such as those by Mitchell (2010), Dong et al. (2011), Stathaki (2011), Zheng (2011), and Zhao (2012). But most of these are edited books consisting of a collection of papers on the subject by multiple authors. What is really needed is a good textbook for students, so that they understand the concepts, issues, techniques, and applications, as well as the challenges of fusing remote sensing images from disparate sources.

6.3. New research papers

Since 1998, more than 1000 papers on the subject of remote sensing image fusion have been identified by the authors, both in peer-reviewed international remote sensing journals and in remote sensing conference proceedings. Some of the most interesting recent review papers on image fusion are those by Karathanassi, Kolokousis, and Ioannidou (2007), Zhang (2010), and Foo and Ng (2013). An excellent review of the broader subject of data fusion can be found in Khaleghi et al. (2013). As a matter of fact, these authors tackle issues that also play an important role in remote sensing image fusion, such as data correlation and data registration.

In the many remote sensing image fusion papers that have appeared recently, one can observe a trend towards much more quantitative evaluation of fusion algorithm performance. There has been a marked shift from visually comparing the fused outputs to a more systematic quantitative evaluation (Wang and Bovik 2002; Li, Li, and Gong 2010; Carvalho and Chang 2012; Gamba et al. 2012; Han et al. 2013). In terms of applications, urban analysis, fusing either high- and low-spatial-resolution images or optical and SAR data-sets (Amarsaikhan et al. 2010; Wurm et al. 2011; Brook, Ben-Dor, and Richter 2013; Dahiya, Garg, and Jat 2013), and land cover change detection (Zeng et al. 2010; Du et al. 2013) are major application areas. Several journals have devoted Special Issues to the subject of image and data fusion. One should be mentioned in particular: the IEEE Transactions on Geoscience and Remote Sensing Special Issue on Data Fusion (2008), which contains a diverse set of papers on remote sensing image fusion for many different applications. Another application area that has received increasing attention from the remote sensing image fusion community is geology and mineral exploration (He and Zhao 2010; West and Resmini 2009; Beiranvand Pour and Hashim 2013). A last example of a rapidly growing field of research is the area of natural disasters (Llinas 2002; Cleary et al. 2012).

7. Conclusions

This update on the developments in remote sensing image fusion shows that there has been an enormous increase in the amount of research in this field over the past 15 years. At the time of the previous review paper, the focus was on the straightforward pixel-based fusion techniques existing at that time and their applications. Today, the achievements of international research have resulted in adapted, focused, and optimized techniques that take into account sensor and image characteristics as well as the intended application. It is obvious that higher temporal, spatial, and spectral resolutions require context-dependent and locally operating algorithms and assessment tools. Furthermore, standardized assessment criteria are needed to evaluate the quality of fused products, to exchange information, and to apply fused images in the context of Digital Earth. Based on the efforts and achievements of Wald and his team, a fundamental framework has already been established (Wald 2000; Thomas and Wald 2004, 2007). Future research should concentrate on common image fusion strategies and optimization aspects, to minimize the effort spent on fusion experiments in new application areas and with new sensor images, along with the standardization of quality assessment criteria for compatibility reasons.

Acknowledgements

The authors acknowledge the valuable comments of the reviewers to improve the original manuscript. They also acknowledge the support of the Research Alliance for Sustainability of the Universiti Teknologi Malaysia.

References

  • Abdikan, S., F. B. Sanli, F. Sunar, and M. Ehlers. 2012. “A Comparative Data-fusion Analysis of Multi-sensor Satellite Images.” International Journal of Digital Earth. doi:10.1080/17538947.2012.748846.
  • Aiazzi, B., S. Baronti, and M. Selva. 2007. “Improving Component Substitution Pansharpening through Multivariate Regression of MS +Pan Data.” IEEE Transactions on Geoscience and Remote Sensing 45 (10): 3230–3239. doi:10.1109/TGRS.2007.901007.
  • Alparone, L., L. Wald, J. Chanussot, C. Thomas, P. Gamba, and L. M. Bruce. 2007. “Comparison of Pansharpening Algorithms: Outcome of the 2006 GRS-S Data Fusion Contest.” IEEE Transactions on Geoscience and Remote Sensing 45 (10): 3012–3021. doi:10.1109/TGRS.2007.904923.
  • Amarsaikhan, D., H. H. Blotevogel, J. L. van Genderen, M. Ganzorig, R. Gantuya, and B. Nergui. 2010. “Fusing High-resolution SAR and Optical Imagery for Improved Urban Land Cover Study and Classification.” International Journal of Image and Data Fusion 1 (1): 83–97. doi:10.1080/19479830903562041.
  • Amarsaikhan, D., M. Saandar, M. Ganzorig, H. H. Blotevogel, E. Egshiglen, R. Gantuyal, B. Nergui, and D. Enkhjargal. 2012. “Comparison of Multisource Image Fusion Methods and Land Cover Classification.” International Journal of Remote Sensing 33 (8): 2532–2550. doi:10.1080/01431161.2011.616552.
  • Amolins, K., Y. Zhang, and P. Dare. 2007. “Wavelet Based Image Fusion Techniques — An Introduction, Review and Comparison.” ISPRS Journal of Photogrammetry and Remote Sensing 62 (4): 249–263. doi:10.1016/j.isprsjprs.2007.05.009.
  • Bar-Shalom, Y., P. K. Willett, and X. Tian. 2011. Tracking and Data Fusion: A Handbook of Algorithms. Bradford: YBS.
  • Beiranvand Pour, A., and M. Hashim. 2013. “Fusing ASTER, ALI and Hyperion Data for Enhanced Mineral Mapping.” International Journal of Image and Data Fusion 4 (2): 126–145. doi:10.1080/19479832.2012.753115.
  • Berger, C. 2010. “Fusion of High Resolution SAR Data and Multispectral Imagery at Pixel Level – A Comprehensive Evaluation.” MSc thesis, Institute for Geography, Friedrich-Schiller-University Jena, Jena, Germany.
  • Berger, C., M. Voltersen, R. Eckardt, J. Eberle, T. Heyer, N. Salepci, S. Hese, et al. 2013. “Multi-modal and Multi-temporal Data Fusion: Outcome of the 2012 GRSS Data Fusion Contest.” IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 6 (3): 1324–1340. doi:10.1109/JSTARS.2013.2245860.
  • Borel, C. C., and C. H. Spencer. 2009. “Novel Methods for Panchromatic Sharpening of Multi/Hyper-spectral Image Data.” Proceedings of IGARSS 2009, IEEE International 4: 757–760.
  • Bostrom, H., S. F. Andler, M. Brohede, R. Johansson, A. Karlsson, J. van Laere, L. Niklasson, M. Nilsson, A. Persson, and T. Ziemke. 2007. “On the Definition of Information Fusion as a Field of Research.” Technical Report HS-IKI-TR-07-006, School of Humanities & Informatics, University of Skövde, Sweden. Accessed September 2013. http://his.diva-portal.org/smash/get/diva2:2391/FULLTEXT01.
  • Bovolo, F., L. Bruzzone, L. Capobianco, A. Garzelli, S. Marchesi, and F. Nencini. 2010. “Analysis of the Effects of Pansharpening in Change Detection on VHR Images.” IEEE Geoscience and Remote Sensing Letters 7 (1): 53–57. doi:10.1109/LGRS.2009.2029248.
  • Brook, A., E. Ben-Dor, and R. Richter. 2013. “Modelling and Monitoring Urban Built Environment via Multi-source Integrated and Fused Remote Sensing Data.” International Journal of Image and Data Fusion 4 (1): 2–32. doi:10.1080/19479832.2011.618469.
  • Buckley, S. J., T. H. Kurz, J. A. Howell, and D. Schneider. 2013. “Terrestrial Lidar and Hyperspectral Data Fusion Products for Geological Outcrop Analysis.” Computers and Geosciences 54: 249–258. doi:10.1016/j.cageo.2013.01.018.
  • Buddenbaum, H., S. Seeling, and J. Hill. 2013. “Fusion of Full-waveform Lidar and Imaging Spectroscopy Remote Sensing Data for the Characterization of Forest Stands.” International Journal of Remote Sensing 34 (13): 4511–4524. doi:10.1080/01431161.2013.776721.
  • Carvalho, R. N., and K. Chang. 2012. “A Fusion Analysis Tool for Multi-sensor Classification Systems.” Journal of Advances in Information Fusion 7 (2): 142–152.
  • Cetin, M., and N. Musaoglu. 2009. “Merging Hyperspectral and Panchromatic Image Data: Qualitative and Quantitative Analysis.” International Journal of Remote Sensing 30 (7): 1779–1804. doi:10.1080/01431160802639525.
  • Chavez, P. S., S. C. Sides, and J. A. Anderson. 1991. “Comparison of Three Different Methods to Merge Multiresolution and Multispectral Data: Landsat TM and SPOT Panchromatic.” Photogrammetric Engineering and Remote Sensing 57 (3): 295–303.
  • Chen, C.-M., G. F. Hepner, and R. R. Forster. 2003. “Fusion of Hyperspectral and Radar Data Using the IHS Transformation to Enhance Urban Surface Features.” ISPRS Journal of Photogrammetry and Remote Sensing 58 (1–2): 19–30. doi:10.1016/S0924-2716(03)00014-5.
  • Choi, J., J. Yeom, A. Chang, Y. Byun, and Y. Kim. 2013. “Hybrid Pansharpening Algorithm for High Spatial Resolution Satellite Imagery to Improve Spatial Quality.” IEEE Geoscience and Remote Sensing Letters 10 (3): 490–494. doi:10.1109/LGRS.2012.2210857.
  • Cleary, P. W., M. Prakash, S. Mead, X. Tang, H. Wang, and S. Ouyeng. 2012. “Dynamic Simulation of Dam-break Scenarios for Risk Analysis and Disaster Management.” International Journal of Image and Data Fusion 3 (4): 333–363. doi:10.1080/19479832.2012.716084.
  • Colditz, R. R., T. Wehrmann, M. Bachmann, K. Steinocher, M. Schmidt, G. Strunz, and S. Dech. 2006. “Influence of Image Fusion Approaches on Classification Accuracy: A Case Study.” International Journal of Remote Sensing 27 (15): 3311–3335. doi:10.1080/01431160600649254.
  • Dahiya, S., P. K. Garg, and M. K. Jat. 2013. “A Comparative Study of Various Pixel-based Image Fusion Techniques as Applied to an Urban Environment.” International Journal of Image and Data Fusion 4 (3): 197–213. doi:10.1080/19479832.2013.778335.
  • Dalponte, M., L. Bruzzone, and D. Gianelle. 2008. “Fusion of Hyperspectral and LIDAR Remote Sensing Data for Classification of Complex Forest Areas.” IEEE Transactions Geoscience and Remote Sensing 46 (5): 1416–1427. doi:10.1109/TGRS.2008.916480.
  • Dong, J., D. Zhuang, Y. Huang, and J. Fu, eds. 2011. Survey of Multispectral Image Fusion Techniques in Remote Sensing. New York: InTech.
  • Du, P., S. Liu, J. Xia, and Y. Zhao. 2013. “Information Fusion Techniques for Change Detection from Multi-temporal Remote Sensing Images.” Information Fusion 14 (1): 19–27. doi:10.1016/j.inffus.2012.05.003.
  • Ehlers, M., S. Klonus, J. P. Åstrand, and P. Rosso. 2010. “Multi-sensor Image Fusion for Pansharpening in Remote Sensing.” International Journal of Image and Data Fusion 1 (1): 25–45. doi:10.1080/19479830903561985.
  • Fang, F., F. Li, G. Zhang, and C. Shen. 2013. “A Variational Method for Multisource Remote-sensing Image Fusion.” International Journal of Remote Sensing 34 (7): 2470–2486. doi:10.1080/01431161.2012.744882.
  • Foo, P. H., and G. W. Ng. 2013. “High-level Information Fusion: An Overview.” Journal of Advanced Information Fusion 8 (1): 33–72.
  • Fonseca, L., L. Namikawa, E. Castejon, L. Carvalho, C. Pinho, and A. Pagamisse. 2011. “Image Fusion for Remote Sensing Applications.” In Image Fusion and Its Applications, edited by Y. Zheng, 153–178. Rijeka: InTech.
  • Fuze Go™. 2013. Accessed September 20. http://www.fuzego.com/.
  • Gamba, P., P. Liu, P. Du, and H. Lin. 2012. “Evaluation and Analysis of Fusion Algorithms for Active and Passive Remote Sensing Image.” IGARSS Remote Sensing for a Dynamic Earth, 2272–2275, Munich, Germany, April 22–27.
  • Goodenough, D. G., C. Hao, A. Dyk, A. Richardson, and G. Hobart. 2008. “Data Fusion Study between Polarimetric SAR, Hyperspectral and Lidar Data for Forest Information.” Proceedings IGARSS 2008, IEEE International 2 (1): 281–284.
  • Gonzalez-Audicana, M., J. L. Saleta, R. G. Catalan, and R. Garcia. 2004. “Fusion of Multispectral and Panchromatic Images Using Improved IHS and PCA Mergers Based on Wavelet Decomposition.” IEEE Transactions on Geoscience and Remote Sensing 42 (6): 1291–1299. doi:10.1109/TGRS.2004.825593.
  • Gore, A. 1998. “The Digital Earth: Understanding Our Planet in the 21st Century.” Speech given at the California Science Center, Los Angeles, CA, January 21. Accessed September 2013. http://www.digitalearth.gov.
  • Guo, Q., S. Chen, H. Leung, and S. Liu. 2010. “Covariance Intersection Based Image Fusion Technique with Application to Pansharpening in Remote Sensing.” Information Sciences 180 (18): 3434–3443. doi:10.1016/j.ins.2010.05.010.
  • Hall, D. L. 1992. Mathematical Techniques in Multisensor Data Fusion. Norwood, MA: Artech House.
  • Hall, D. L., and S. A. H. McMullen. 2004. Mathematical Techniques in Multisensor Data Fusion. Norwood, MA: Artech House.
  • Han, Y., Y. Cai, Y. Cao, and X. Xu. 2013. “A New Image Fusion Performance Metric Based on Visual Information Fidelity.” Information Fusion 14 (2): 127–135. doi:10.1016/j.inffus.2011.08.002.
  • Harris, J. R., R. Murray, and T. Hirose. 1990. “IHS Transform for the Integration of Radar Imagery with Other Remotely Sensed Data.” Photogrammetric Engineering and Remote Sensing 56: 1631–1641.
  • He, H., and Y. Zhao. 2010. “Multisource Data Fusion Technology and Its Application in Geological and Mineral Survey.” In 2nd International Conference on Information Engineering and Computer Science (ICIECS), December 25–26, 1–6. Red Hook, NY: IEEE.
  • Huang, B., H. Zhang, and L. Yu. 2012. “Improving Landsat ETM+ Urban Area Mapping via Spatial and Angular Fusion with MISR Multi-angle Observation.” IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 5 (1): 101–109. doi:10.1109/JSTARS.2011.2177247.
  • Huang, B., and H. Song. 2012. “Spatiotemporal Reflectance Fusion via Sparse Representation.” IEEE Transactions on Geoscience and Remote Sensing 50 (10): 3707–3716. doi:10.1109/TGRS.2012.2186638.
  • Huang, B., H. Song, H. Cui, J. Peng, and Z. Xu. 2013. “Spatial and Spectral Image Fusion Using Sparse Matrix Factorization.” IEEE Transactions on Geoscience and Remote Sensing, published online PP (99): 1–1.
  • IEEE. 2008. “Special Issue on Data Fusion.” IEEE Transactions on Geoscience and Remote Sensing 56: 1283–1575.
  • ISDE. 2013. Homepage. Accessed November 17. http://www.digitalearth-isde.org/.
  • ISPRS. 2013. WG VII/6 Remote Sensing Data Fusion Homepage. Accessed December 5. http://www2.isprs.org/commissions/comm7/wg6.html.
  • Karathanassi, V., P. Kolokousis, and S. Ioannidou. 2007. “A Comparison Study on Fusion Methods Using Evaluation Indicators.” International Journal of Remote Sensing 28 (10): 2309–2341. doi:10.1080/01431160600606890.
  • Khaleghi, B., A. Khamis, F. O. Karray, and S. N. Razavi. 2013. “Multisensor Data Fusion: A Review of the State of the Art.” Information Fusion 14 (1): 28–44. doi:10.1016/j.inffus.2011.08.001.
  • Klein, L. A. 1999. Sensor and Data Fusion Concepts and Application. Bellingham, WA: SPIE.
  • Laben, C. A., V. Bernard, and W. Brower. 2000. Process for Enhancing the Spatial Resolution of Multispectral Imagery Using Pan-sharpening. Accessed November 17, 2013. http://www.google.com/patents/US6011875.
  • Li, S., Z. Li, and J. Gong. 2010. “Multivariate Statistical Analysis of Measures for Assessing the Quality of Image Fusion.” International Journal of Image and Data Fusion 1 (1): 47–66. doi:10.1080/19479830903562009.
  • Liggins II, M., D. Hall, and J. Llinas, eds. 2009. Handbook of Multisensor Data Fusion: Theory and Practice. London: CRC Press.
  • Ling, Y., M. Ehlers, E. L. Usery, and M. Madden. 2007. “FFT-enhanced IHS Transform Method for Fusing High-resolution Satellite Images.” ISPRS Journal of Photogrammetry and Remote Sensing 61 (6): 381–392. doi:10.1016/j.isprsjprs.2006.11.002.
  • Lisini, G., P. Gamba, F. Dell'Acqua, and F. Holecz. 2011. “First Results on Road Network Extraction and Fusion on Optical and SAR Images Using a Multi-scale Adaptive Approach.” International Journal of Image and Data Fusion 2 (4): 363–375. doi:10.1080/19479832.2011.613412.
  • Liu, Q. S., G. H. Liu, C. Huang, and C. J. Xie. 2012. “Using SPOT 5 Fusion-ready Imagery to Detect Chinese Tamarisk (Saltcedar) with Mathematical Morphological Method.” International Journal of Digital Earth 1–12. doi:10.1080/17538947.2012.671379.
  • Llinas, J. 2002. “Information Fusion for Natural and Man-made Disasters.” 5th International Conference on Information Fusion, 570–576. Annapolis, MD: Omnipress, July 8–11.
  • Massip, P., P. Blanc, and L. Wald. 2012. “A Method to Better Account for Modulation Transfer Functions in ARSIS-Based Pansharpening Methods.” IEEE Transactions on Geoscience and Remote Sensing 50 (3): 800–808. doi:10.1109/TGRS.2011.2162244.
  • Meng, Q., B. Borders, and M. Madden. 2010. “High-resolution Satellite Image Fusion Using Regression Kriging.” International Journal of Remote Sensing 31 (7): 1857–1876. doi:10.1080/01431160902927937.
  • Mercer, J. B., D. Edwards, G. Hong, J. Maduck, and Y. Zhang. 2005. “Fusion of INSAR High Resolution Imagery and Low Resolution Optical Imagery”. Proceedings of IGARSS 2005 6: 3931–3934. Art. No. 1525771.
  • Mitchell, H. B. 2010. Image Fusion: Theories, Techniques and Applications. Berlin: Springer-Verlag.
  • Mitchell, H. B. 2012. Data Fusion: Concepts and Ideas. Berlin: Springer-Verlag.
  • Nencini, F., A. Garzelli, S. Baronti, and L. Alparone. 2007. “Remote Sensing Image Fusion Using the Curvelet Transform.” Information Fusion 8 (2): 143–156. doi:10.1016/j.inffus.2006.02.001.
  • Pacifici, F., F. Del Frate, W. J. Emery, P. Gamba, and J. Chanussot. 2008. “Urban Mapping Using Coarse SAR and Optical Data: Outcome of the 2007 GRS-S Data Fusion Contest.” IEEE Geoscience and Remote Sensing Letters 5 (3): 331–335. doi:10.1109/LGRS.2008.915939.
  • Padwick, C., M. Deskevich, F. Pacifici, and S. Smallwood. 2010. “Worldview-2 Pan-sharpening.” American Society for Photogrammetry and Remote Sensing (ASRS) Annual Conference 2010: Opportunities for Emerging Geospatial Technologies, vol. 2, 740–753. San Diego, CA, April 26–30.
  • Park, J. H., and M. G. Kang. 2004. “Spatially Adaptive Multi-resolution Multispectral Image Fusion.” International Journal of Remote Sensing 25 (23): 5491–5508. doi:10.1080/01431160412331270830.
  • Pohl, C., and M. Hashim. 2013. “Increasing the Potential of Razaksat Images for Map-updating in the Tropics.” IOP Earth & Environmental Science, 8th International Symposium on Digital Earth 2013 ‘Transforming Knowledge into Sustainable Practice’, 6 p. Kuching – Sarawak, IOP Conference Series ‘Earth & Environmental Science’, Malaysia, August 26–29.
  • Pohl, C., and J. L. Van Genderen 1998. “Review Article Multisensor Image Fusion in Remote Sensing: Concepts, Methods and Applications.” International Journal of Remote Sensing 19 (5): 823–854. doi:10.1080/014311698215748.
  • Price, J. C. 1999. “Combining Multispectral Data of Differing Spatial Resolution.” IEEE Transactions on Geoscience and Remote Sensing 37 (3): 1199–1203. doi:10.1109/36.763272.
  • Ranchin, T., and L. Wald. 2000. “Fusion of High Spatial and Spectral Resolution Images: The ARSIS Concept and Its Implementation.” Photogrammetric Engineering & Remote Sensing 66 (1): 4–18.
  • Ranchin, T., B. Aiazzi, L. Alparone, S. Baronti, and L. Wald. 2003. “Image Fusion—The ARSIS Concept and Some Successful Implementation Schemes.” ISPRS Journal of Photogrammetry and Remote Sensing 58 (1–2): 4–18. doi:10.1016/S0924-2716(03)00013-3.
  • Roberts, J. W., J. A. N. van Aardt, and F. B. Ahmed. 2011. “Image Fusion for Enhanced Forest Structural Assessment.” International Journal of Remote Sensing 32 (1): 243–266. doi:10.1080/01431160903463684.
  • Shupeng, C., and J. van Genderen. 2008. “Digital Earth in Support of Global Change Research.” International Journal of Digital Earth 1 (1): 43–65. doi:10.1080/17538940701782510.
  • Stathaki, T. 2011. Image Fusion: Algorithms and Applications. London: Academic Press.
  • Thomas, C., and L. Wald. 2004. “Assessment of the Quality of Fused Products.” In Proceedings of the 24th Symposium of the European Association of Remote Sensing Laboratories (EARSeL) on ‘New Strategies for European Remote Sensing’, Dubrovnik, Croatia, May 25–27, 8 p. Rotterdam: Millpress.
  • Thomas, C., and L. Wald. 2007. “Comparing Distances for Quality Assessment of Fused Images.” In Proceedings of 26th EARSeL Symposium New Developments and Challenges in Remote Sensing, Varsovie, Poland, edited by Z. Bochenek, 101–111. Rotterdam: Millpress.
  • Thomas, C., T. Ranchin, L. Wald, and J. Chanussot. 2008. “Synthesis of Multispectral Images to High Spatial Resolution: A Critical Review of Fusion Methods Based on Remote Sensing Physics.” IEEE Transactions on Geoscience and Remote Sensing 46 (5): 1301–1312. doi:10.1109/TGRS.2007.912448.
  • Vrabel, J. 1996. “Multispectral Imagery Band Sharpening Study.” Photogrammetric Engineering & Remote Sensing 62 (9): 1075–1083.
  • Wald, L. 2000. “Quality of High Resolution Synthesised Images: Is there a Simple Criterion?” In Proceedings of 3rd Conference on ‘Fusion of Earth Data: Merging Point Measurements, Raster Maps and Remotely Sensed Images’, Sophia Antipolis, France, edited by T. Ranchin and L. Wald, 6 p. Nice: SEE/URISCA.
  • Wald, L. 2002. Data Fusion: Definitions and Architectures: Fusion of Images of Different Spatial Resolutions. Paris: Press de Mines. http://www.ensmp.fr/Presses.
  • Wang, Z., and A. C. Bovik. 2002. “A Universal Image Quality Index.” IEEE Signal Processing Letters 9 (3): 81–84. doi:10.1109/97.995823.
  • Wang, Z., A. C. Bovik, H. R. Sheikh, and E. P. Simoncelli. 2004. “Image Quality Assessment: From Error Visibility to Structural Similarity.” IEEE Transactions on Image Processing 13 (4): 600–612. doi:10.1109/TIP.2003.819861.
  • Welch, R., and M. Ehlers. 1987. “Merging Multiresolution SPOT HRV and Landsat TM Data.” Photogrammetric Engineering and Remote Sensing 53 (3): 301–303.
  • West, M. S., and R. G. Resmini. 2009. “Hyperspectral Imagery and LiDAR for Geological Analysis of Cuprite, Nevada.” In 15th Algorithms and Technologies for Multispectral, Hyperspectral and Ultraspectral Imagery, 13–17 April 2009, edited by S. S. Shen and P. E. Lewis, 7334. Orlando, FL: SPIE.
  • Wikipedia. 2013. Instructions Per Second. Accessed November 17. http://en.wikipedia.org/wiki/Instructions_per_second.
  • Wu, P., H. Shen, T. Ai, and Y. Liu. 2013. “Land Surface Temperature Retrieval at High Spatial and Temporal Resolutions Based on Multi-sensor Fusion.” International Journal of Digital Earth, 1–21. doi:10.1080/17538947.2013.783131.
  • Wurm, M., H. Taubenböck, M. Schardt, T. Esch, and S. Dech. 2011. “Object-based Image Information Fusion Using Multisensor Earth Observation Data Over Urban Areas.” International Journal of Image and Data Fusion 2 (2): 121–147. doi:10.1080/19479832.2010.543934.
  • Xiao, M., and Z. He. 2013. “Remote Sensing Image Fusion Based on Gaussian Mixture Model and Multiresolution Analysis.” Proceedings MIPPR 2013: Remote Sensing Image Processing, Geographic Information Systems, and Other Applications, edited by J. Tian and J. Ma, 1–8, 8921. Wuhan: SPIE.
  • Yesou, H., Y. Besnus, J. Rolet, J. C. Pion, and A. Aing. 1993. “Merging Seasat and SPOT Imagery for the Study of Geological Structures in a Temperate Agricultural Region.” Remote Sensing of Environment 43 (3): 265–279. doi:10.1016/0034-4257(93)90070-E.
  • Yu, L., J. Wang, N. Clinton, Q. Xin, L. Zhong, Y. Chen, and P. Gong. 2013. “FROM-GC: 30 m Global Cropland Extent Derived Through Multisource Data Integration.” International Journal of Digital Earth 6 (6): 521–533. doi:10.1080/17538947.2013.822574.
  • Zeng, Y., J. Zhang, J. L. van Genderen, and Y. Zhang. 2010. “Image Fusion for Land Cover Change Detection.” International Journal of Image and Data Fusion 1 (2): 193–215. doi:10.1080/19479831003802832.
  • Zhang, Y. 2004. “Understanding Image Fusion.” Photogrammetric Engineering & Remote Sensing 70 (6): 657–661.
  • Zhang, Y. 2008a. “Methods for Image Fusion Quality Assessment – A Review: Comparison and Analysis.” Proceedings of 37th ISPRS Congress, Remote Sensing and Spatial Information Sciences, XXXVII: 1101–1109.
  • Zhang, Y. 2008b. “Pan-sharpening for Improved Information Extraction.” In Advances in Photogrammetry, Remote Sensing and Spatial Information Sciences, vol. 7, edited by Z. Li, J. Chen and E. Baltsavias, 185–203. London: Taylor & Francis.
  • Zhang, J. 2010. “Multi-source Remote Sensing Data Fusion: Status and Trends.” International Journal of Image and Data Fusion 1 (1): 5–24. doi:10.1080/19479830903561035.
  • Zhang, Y. 2012. “Wavelet-based Bayesian Fusion of Multispectral and Hyperspectral Images Using Gaussian Scale Mixture Model.” International Journal of Image and Data Fusion 3 (1): 23–37. doi:10.1080/19479832.2010.551522.
  • Zhang, H., Y. Lan, C. P.-C. Suh, J. Westbrook, W. Clint Hoffmann, C. Yang, and Y. Huang. 2013. “Fusion of Remotely Sensed Data from Airborne and Ground-based Sensors to Enhance Detection of Cotton Plants.” Computers and Electronics in Agriculture 93: 55–59. doi:10.1016/j.compag.2013.02.001.
  • Zhao, S. 2012. Multi-source Remote Sensing Image Fusion and Application. Nanjing: University Press.
  • Zheng, Y., ed. 2011. Image Fusion and Its Applications. New York: InTech.
  • Zheng, Y., and Z. Qin. 2009. “Objective Image Fusion Quality Evaluation Using Structural Similarity.” Tsinghua Science & Technology 14 (6):703–709. doi:10.1016/S1007-0214(09)70138-5.
