Remote sensing

Remote sensing is the acquisition of information about an object or phenomenon without making physical contact with it, in contrast to on-site observation; the term is applied especially to acquiring information about the Earth. Remote sensing is used in numerous fields, including geography, land surveying and most Earth science disciplines (for example, hydrology, ecology, meteorology, oceanography, glaciology, geology); it also has military, intelligence, commercial, economic, planning, and humanitarian applications.

In current usage, the term "remote sensing" generally refers to the use of satellite- or aircraft-based sensor technologies to detect and classify objects on Earth, including on the surface and in the atmosphere and oceans, based on propagated signals (e.g. electromagnetic radiation). It may be split into "active" remote sensing (such as when a signal is emitted by a satellite or aircraft and its reflection by the object is detected by the sensor) and "passive" remote sensing (such as when the reflection of sunlight is detected by the sensor).[1][2][3][4][5]

Overview

This video is about how Landsat was used to identify areas of conservation in the Democratic Republic of the Congo, and how it was used to help map an area called MLW in the north.

Passive sensors gather radiation that is emitted or reflected by the object or surrounding areas. Reflected sunlight is the most common source of radiation measured by passive sensors. Examples of passive remote sensors include film photography, infrared, charge-coupled devices, and radiometers. Active collection, on the other hand, emits energy in order to scan objects and areas whereupon a sensor then detects and measures the radiation that is reflected or backscattered from the target. RADAR and LiDAR are examples of active remote sensing where the time delay between emission and return is measured, establishing the location, speed and direction of an object.

Illustration of remote sensing

Remote sensing makes it possible to collect data on dangerous or inaccessible areas. Remote sensing applications include monitoring deforestation in areas such as the Amazon Basin, glacial features in Arctic and Antarctic regions, and depth sounding of coastal and ocean areas. Military collection during the Cold War made use of stand-off collection of data about dangerous border areas. Remote sensing also replaces costly and slow data collection on the ground, ensuring in the process that areas or objects are not disturbed.

Orbital platforms collect and transmit data from different parts of the electromagnetic spectrum, which in conjunction with larger scale aerial or ground-based sensing and analysis, provides researchers with enough information to monitor trends such as El Niño and other natural long and short term phenomena. Other uses include different areas of the earth sciences such as natural resource management, agricultural fields such as land usage and conservation,[6][7] and national security and overhead, ground-based and stand-off collection on border areas.[8]

Types of data acquisition techniques

The basis for multispectral collection and analysis is that examined areas or objects reflect or emit radiation that stands out from surrounding areas. For a summary of major remote sensing satellite systems see the overview table.

Applications of remote sensing

  • Conventional radar is mostly associated with aerial traffic control, early warning, and certain large-scale meteorological data. Doppler radar is used by local law enforcement to monitor speed limits and in enhanced meteorological collection such as wind speed and direction within weather systems, in addition to precipitation location and intensity. Other types of active collection include sensing of plasmas in the ionosphere. Interferometric synthetic aperture radar is used to produce precise digital elevation models of large-scale terrain (see RADARSAT, TerraSAR-X, Magellan).
  • Laser and radar altimeters on satellites have provided a wide range of data. By measuring the bulges of water caused by gravity, they map features on the seafloor to a resolution of a mile or so. By measuring the height and wavelength of ocean waves, the altimeters measure wind speeds and direction, and surface ocean currents and directions.
  • Ultrasound (acoustic) and radar tide gauges measure sea level, tides and wave direction in coastal and offshore applications.
  • Light detection and ranging (LIDAR) is well known from military examples such as weapon ranging and laser-illuminated homing of projectiles. LIDAR is used to detect and measure the concentration of various chemicals in the atmosphere, while airborne LIDAR can be used to measure the heights of objects and features on the ground more accurately than radar technology. Vegetation remote sensing is a principal application of LIDAR.
  • Radiometers and photometers are the most common instruments in use, collecting reflected and emitted radiation in a wide range of frequencies. The most common are visible and infrared sensors, followed by microwave, gamma ray and, rarely, ultraviolet. They may also be used to detect the emission spectra of various chemicals, providing data on chemical concentrations in the atmosphere.
  • Spectropolarimetric Imaging has been reported to be useful for target tracking purposes by researchers at the U.S. Army Research Laboratory. They determined that manmade items possess polarimetric signatures that are not found in natural objects. These conclusions were drawn from the imaging of military trucks, like the Humvee, and trailers with their acousto-optic tunable filter dual hyperspectral and spectropolarimetric VNIR Spectropolarimetric Imager.[9][10]
  • Stereographic pairs of aerial photographs have often been used to make topographic maps by imagery and terrain analysts in trafficability and highway departments for potential routes, in addition to modelling terrestrial habitat features.[11][12][13]
  • Simultaneous multi-spectral platforms such as Landsat have been in use since the 1970s. These thematic mappers take images in multiple wavelengths of electromagnetic radiation (multi-spectral) and are usually found on Earth observation satellites, including (for example) the Landsat program or the IKONOS satellite. Maps of land cover and land use from thematic mapping can be used to prospect for minerals, detect or monitor land usage, detect invasive vegetation and deforestation, and examine the health of indigenous plants and crops, including entire farming regions or forests.[4][1] Prominent scientists using remote sensing for this purpose include Janet Franklin and Ruth DeFries. Landsat images are used by regulatory agencies such as KYDOW to indicate water quality parameters including Secchi depth, chlorophyll a density and total phosphorus content. Weather satellites are used in meteorology and climatology.
  • Hyperspectral imaging produces an image where each pixel has full spectral information with imaging narrow spectral bands over a contiguous spectral range. Hyperspectral imagers are used in various applications including mineralogy, biology, defence, and environmental measurements.
  • Within the scope of the combat against desertification, remote sensing allows researchers to follow up and monitor risk areas in the long term, to determine desertification factors, to support decision-makers in defining relevant measures of environmental management, and to assess their impacts.[14]
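
Thematic mapping of vegetation health with multispectral platforms such as Landsat, as described above, often relies on simple band ratios. Below is a minimal sketch of one widely used index, the Normalized Difference Vegetation Index (NDVI), computed from red and near-infrared reflectance; the function and parameter names are illustrative, not from any particular package:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index from NIR and red reflectance.

    Healthy vegetation reflects strongly in the near-infrared and absorbs
    red light, so its NDVI approaches 1; bare soil and water sit near 0
    or below.  `eps` guards against division by zero on dark pixels.
    """
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + eps)
```

Applied to two co-registered Landsat bands, the result is a per-pixel map that can be thresholded to flag stressed vegetation or deforested patches.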

Geodetic

  • Geodetic remote sensing can be gravimetric or geometric. Overhead gravity data collection was first used in aerial submarine detection. This data revealed minute perturbations in the Earth's gravitational field that may be used to determine changes in the mass distribution of the Earth, which in turn may be used for geophysical studies, as in GRACE. Geometric remote sensing includes position and deformation imaging using InSAR, LIDAR, etc.[15]

Acoustic and near-acoustic

  • Sonar: passive sonar, listening for the sound made by another object (a vessel, a whale etc.); active sonar, emitting pulses of sounds and listening for echoes, used for detecting, ranging and measurements of underwater objects and terrain.
  • Seismograms taken at different locations can locate and measure earthquakes (after they occur) by comparing the relative intensity and precise timings.
  • Ultrasound: Ultrasound sensors emit high-frequency pulses and listen for echoes; they are used for detecting water waves and water level, as in tide gauges or for towing tanks.

To coordinate a series of large-scale observations, most sensing systems depend on two things: platform location and the orientation of the sensor. High-end instruments now often use positional information from satellite navigation systems. Rotation and orientation are often provided within a degree or two by electronic compasses. Compasses can measure not just azimuth (i. e. degrees to magnetic north), but also altitude (degrees above the horizon), since the magnetic field curves into the Earth at different angles at different latitudes. More exact orientations require gyroscopically aided orientation, periodically realigned by different methods including navigation from stars or known benchmarks.

Data characteristics

The quality of remote sensing data is determined by its spatial, spectral, radiometric and temporal resolutions.

Spatial resolution
The size of a pixel that is recorded in a raster image – typically pixels may correspond to square areas ranging in side length from 1 to 1,000 metres (3.3 to 3,280.8 ft).
Spectral resolution
The wavelength of the different frequency bands recorded – usually, this is related to the number of frequency bands recorded by the platform. The current Landsat collection comprises seven bands, including several in the infrared spectrum, with wavelengths ranging from 0.7 to 2.1 μm. The Hyperion sensor on Earth Observing-1 resolves 220 bands from 0.4 to 2.5 μm, with a spectral resolution of 0.10 to 0.11 μm per band.
Radiometric resolution
The number of different intensities of radiation the sensor is able to distinguish. Typically, this ranges from 8 to 14 bits, corresponding to 256 levels of the gray scale and up to 16,384 intensities or "shades" of colour, in each band. It also depends on the instrument noise.
Temporal resolution
The frequency of flyovers by the satellite or aircraft, which is only relevant in time-series studies or those requiring an averaged or mosaic image, as in deforestation monitoring. Repeat coverage was first used by the intelligence community, where it revealed changes in infrastructure, the deployment of units or the modification or introduction of equipment. Cloud cover over a given area or object makes it necessary to repeat the collection of that location.
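
The bit depths quoted under radiometric resolution map to intensity counts as a simple power of two; a trivial illustration:

```python
def intensity_levels(bits):
    """Number of distinguishable intensity levels for a given bit depth."""
    return 2 ** bits

# The figures quoted in the text:
assert intensity_levels(8) == 256      # 8-bit sensor: 256 gray levels
assert intensity_levels(14) == 16384   # 14-bit sensor: 16,384 "shades"
```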

Data processing

In order to create sensor-based maps, most remote sensing systems need to relate sensor data to a reference point, including distances between known points on the ground. This depends on the type of sensor used. For example, in conventional photographs, distances are accurate in the center of the image, with the distortion of measurements increasing the farther you get from the center. Another factor is the platen against which the film is pressed, which can cause severe errors when photographs are used to measure ground distances. The step in which this problem is resolved is called georeferencing and involves computer-aided matching of points in the image (typically 30 or more points per image) against an established benchmark, "warping" the image to produce accurate spatial data. As of the early 1990s, most satellite images are sold fully georeferenced.
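
The georeferencing step described above can be sketched as fitting a transformation from image coordinates to ground coordinates using matched control points. The sketch below fits a simple affine transform by least squares; production workflows typically fit higher-order polynomials or use dedicated tools, and all names here are illustrative:

```python
import numpy as np

def fit_affine(pixel_xy, ground_xy):
    """Least-squares affine transform mapping pixel to ground coordinates.

    pixel_xy, ground_xy: (N, 2) arrays of matched control points (N >= 3).
    Returns a 2x3 matrix A such that ground ~= A @ [x, y, 1].
    """
    px = np.asarray(pixel_xy, float)
    gx = np.asarray(ground_xy, float)
    # Design matrix with a constant column for the translation terms.
    X = np.hstack([px, np.ones((len(px), 1))])
    A, *_ = np.linalg.lstsq(X, gx, rcond=None)
    return A.T  # shape (2, 3)

def apply_affine(A, pixel_xy):
    """Map pixel coordinates to ground coordinates with a fitted transform."""
    px = np.asarray(pixel_xy, float)
    X = np.hstack([px, np.ones((len(px), 1))])
    return X @ A.T
```

With 30 or more control points, as the text mentions, the least-squares fit averages out the error in any individual point match.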

In addition, images may need to be radiometrically and atmospherically corrected.

Radiometric correction
Allows avoidance of radiometric errors and distortions. The illumination of objects on the Earth surface is uneven because of different properties of the relief. This factor is taken into account in the method of radiometric distortion correction.[16] Radiometric correction gives a scale to the pixel values, e. g. the monochromatic scale of 0 to 255 will be converted to actual radiance values.
Topographic correction (also called terrain correction)
In rugged mountains, as a result of terrain, the effective illumination of pixels varies considerably. In a remote sensing image, a pixel on a shady slope receives weak illumination and has a low radiance value; in contrast, a pixel on a sunny slope receives strong illumination and has a high radiance value. For the same object, the pixel radiance value on the shady slope will differ from that on the sunny slope. Additionally, different objects may have similar radiance values. These ambiguities seriously affect the accuracy of information extraction from remote sensing images of mountainous areas and are a main obstacle to the further application of such images. The purpose of topographic correction is to eliminate this effect, recovering the true reflectivity or radiance of objects as they would appear under horizontal conditions. It is a prerequisite for quantitative remote sensing applications.
Atmospheric correction
Elimination of atmospheric haze by rescaling each frequency band so that its minimum value (usually realised in water bodies) corresponds to a pixel value of 0. The digitizing of data also makes it possible to manipulate the data by changing gray-scale values.
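
The radiometric and atmospheric (dark-object) corrections defined above can be sketched in a few lines. The gain/offset calibration coefficients are assumed inputs here; this is an illustration of the idea, not any sensor's actual calibration procedure:

```python
import numpy as np

def dn_to_radiance(dn, gain, offset):
    """Radiometric correction: rescale raw digital numbers (e.g. the
    monochromatic 0-255 scale) to radiance using per-band calibration
    coefficients (assumed to be supplied with the data)."""
    return gain * np.asarray(dn, float) + offset

def dark_object_subtraction(band):
    """Atmospheric correction by the dark-object method described above:
    shift the band so its minimum value (usually realised in water
    bodies) corresponds to a pixel value of 0."""
    band = np.asarray(band, float)
    return band - band.min()
```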

Interpretation is the critical process of making sense of the data. The first application was aerial photographic collection, which used the following process: spatial measurement through the use of a light table in both conventional single and stereographic coverage; added skills such as the use of photogrammetry; the use of photomosaics and repeat coverage; and making use of objects' known dimensions in order to detect modifications. Image analysis is the more recently developed, automated computer-aided application which is in increasing use.

Object-Based Image Analysis (OBIA) is a sub-discipline of GIScience devoted to partitioning remote sensing (RS) imagery into meaningful image-objects, and assessing their characteristics through spatial, spectral and temporal scale.
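
As an illustration of the idea behind OBIA — grouping pixels into meaningful image-objects and then characterising each object — here is a toy sketch that segments above-threshold pixels into 4-connected objects via a flood fill. Real OBIA systems use far richer segmentation and per-object spectral, spatial and textural statistics; everything here is illustrative:

```python
import numpy as np

def image_objects(band, threshold):
    """Toy OBIA step: group above-threshold pixels into 4-connected
    image-objects (flood fill) and report each object's label, mean
    value and pixel count."""
    band = np.asarray(band, float)
    mask = band > threshold
    labels = np.zeros(band.shape, dtype=int)
    objects = []
    next_label = 0
    for start in zip(*np.nonzero(mask)):
        if labels[start]:
            continue  # pixel already assigned to an object
        next_label += 1
        labels[start] = next_label
        stack, members = [start], []
        while stack:
            r, c = stack.pop()
            members.append(band[r, c])
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if (0 <= rr < band.shape[0] and 0 <= cc < band.shape[1]
                        and mask[rr, cc] and not labels[rr, cc]):
                    labels[rr, cc] = next_label
                    stack.append((rr, cc))
        objects.append((next_label, float(np.mean(members)), len(members)))
    return labels, objects
```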

Old data from remote sensing is often valuable because it may provide the only long-term data for a large extent of geography. At the same time, the data is often complex to interpret, and bulky to store. Modern systems tend to store the data digitally, often with lossless compression. The difficulty with this approach is that the data is fragile, the format may be archaic, and the data may be easy to falsify. One of the best systems for archiving data series is as computer-generated machine-readable ultrafiche, usually in typefonts such as OCR-B, or as digitized half-tone images. Ultrafiches survive well in standard libraries, with lifetimes of several centuries. They can be created, copied, filed and retrieved by automated systems. They are about as compact as archival magnetic media, and yet can be read by human beings with minimal, standardized equipment.

Generally speaking, remote sensing works on the principle of the inverse problem: while the object or phenomenon of interest (the state) may not be directly measured, there exists some other variable that can be detected and measured (the observation) which may be related to the object of interest through a calculation. The common analogy given to describe this is trying to determine the type of animal from its footprints. For example, while it is impossible to directly measure temperatures in the upper atmosphere, it is possible to measure the spectral emissions from a known chemical species (such as carbon dioxide) in that region. The frequency of the emissions may then be related via thermodynamics to the temperature in that region.
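
The atmospheric-temperature example above is an instance of inverting a forward radiation model. As a simplified illustration, a blackbody "brightness temperature" can be retrieved from a measured spectral radiance by inverting Planck's law; a real retrieval must additionally account for emissivity, the atmospheric path and the instrument response:

```python
import math

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
K = 1.380649e-23     # Boltzmann constant, J/K

def planck_radiance(wavelength, temperature):
    """Forward model: blackbody spectral radiance B(lambda, T),
    in W sr^-1 m^-3 (Planck's law)."""
    a = 2.0 * H * C ** 2 / wavelength ** 5
    b = H * C / (wavelength * K * temperature)
    return a / math.expm1(b)

def brightness_temperature(wavelength, radiance):
    """Inverse problem: the temperature a blackbody would need in order
    to emit the measured radiance at this wavelength.  The radiance is
    the 'observation'; the retrieved temperature is the 'state'."""
    a = 2.0 * H * C ** 2 / wavelength ** 5
    return H * C / (wavelength * K * math.log1p(a / radiance))
```

A round trip through the forward model and its inverse recovers the input temperature, which is the defining property of such a retrieval.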

Data processing levels

To facilitate the discussion of data processing in practice, several processing "levels" were first defined in 1986 by NASA as part of its Earth Observing System[17] and steadily adopted since then, both internally at NASA (e. g.,[18]) and elsewhere (e. g.,[19]); these definitions are:

  • Level 0 – Reconstructed, unprocessed instrument and payload data at full resolution, with any and all communications artifacts (e. g., synchronization frames, communications headers, duplicate data) removed.
  • Level 1a – Reconstructed, unprocessed instrument data at full resolution, time-referenced, and annotated with ancillary information, including radiometric and geometric calibration coefficients and georeferencing parameters (e. g., platform ephemeris) computed and appended but not applied to the Level 0 data (or, if applied, in a manner that Level 0 is fully recoverable from Level 1a data).
  • Level 1b – Level 1a data that have been processed to sensor units (e. g., radar backscatter cross section, brightness temperature, etc.); not all instruments have Level 1b data; Level 0 data is not recoverable from Level 1b data.
  • Level 2 – Derived geophysical variables (e. g., ocean wave height, soil moisture, ice concentration) at the same resolution and location as the Level 1 source data.
  • Level 3 – Variables mapped on uniform space-time grid scales, usually with some completeness and consistency (e. g., missing points interpolated, complete regions mosaicked together from multiple orbits, etc.).
  • Level 4 – Model output or results from analyses of lower-level data (i. e., variables that were not measured by the instruments but instead are derived from those measurements).

A Level 1 data record is the most fundamental (i. e., highest reversible level) data record that has significant scientific utility, and is the foundation upon which all subsequent data sets are produced. Level 2 is the first level that is directly usable for most scientific applications; its value is much greater than the lower levels. Level 2 data sets tend to be less voluminous than Level 1 data because they have been reduced temporally, spatially, or spectrally. Level 3 data sets are generally smaller than lower level data sets and thus can be dealt with without incurring a great deal of data handling overhead. These data tend to be generally more useful for many applications. The regular spatial and temporal organization of Level 3 datasets makes it feasible to readily combine data from different sources.

While these processing levels are particularly suitable for typical satellite data processing pipelines, other data level vocabularies have been defined and may be appropriate for more heterogeneous workflows.

History

The TR-1 reconnaissance/surveillance aircraft
The 2001 Mars Odyssey used spectrometers and imagers to hunt for evidence of past or present water and volcanic activity on Mars.

The modern discipline of remote sensing arose with the development of flight. The balloonist G. Tournachon (alias Nadar) made photographs of Paris from his balloon in 1858.[20] Messenger pigeons, kites, rockets and unmanned balloons were also used for early images. With the exception of balloons, these first, individual images were not particularly useful for map making or for scientific purposes.

Systematic aerial photography was developed for military surveillance and reconnaissance purposes beginning in World War I[21] and reaching a climax during the Cold War with the use of modified combat aircraft such as the P-51, P-38, RB-66 and the F-4C, or specifically designed collection platforms such as the U2/TR-1, SR-71, A-5 and the OV-1 series, in both overhead and stand-off collection.[22] A more recent development is that of increasingly smaller sensor pods such as those used by law enforcement and the military, on both manned and unmanned platforms. The advantage of this approach is that it requires minimal modification to a given airframe. Later imaging technologies would include infrared, conventional, Doppler and synthetic aperture radar.[23]

The development of artificial satellites in the latter half of the 20th century allowed remote sensing to progress to a global scale by the end of the Cold War.[24] Instrumentation aboard various Earth observing and weather satellites such as Landsat, the Nimbus series and more recent missions such as RADARSAT and UARS provided global measurements of various data for civil, research, and military purposes. Space probes to other planets have also provided the opportunity to conduct remote sensing studies in extraterrestrial environments: synthetic aperture radar aboard the Magellan spacecraft provided detailed topographic maps of Venus, while instruments aboard SOHO allowed studies to be performed on the Sun and the solar wind, to name just two examples.[25][26]

More recent developments began in the 1960s and 1970s with the development of image processing of satellite imagery. Several research groups in Silicon Valley, including NASA Ames Research Center, GTE, and ESL Inc., developed Fourier transform techniques leading to the first notable enhancement of imagery data. In 1999 the first commercial satellite (IKONOS) collecting very high resolution imagery was launched.[27]

Training and education

Remote sensing has a growing relevance in the modern information society. It represents a key technology within the aerospace industry and bears increasing economic relevance – new sensors such as TerraSAR-X and RapidEye are developed constantly, and the demand for skilled labour is increasing steadily. Furthermore, remote sensing increasingly influences everyday life, ranging from weather forecasts to reports on climate change or natural disasters. As an example, 80% of German students use the services of Google Earth; in 2006 alone the software was downloaded 100 million times. But studies have shown that only a fraction of them know much about the data they are working with.[28] There exists a huge knowledge gap between the application and the understanding of satellite images. Remote sensing plays only a tangential role in schools, regardless of political claims to strengthen support for teaching the subject.[29] Much of the computer software explicitly developed for school lessons has not yet been implemented due to its complexity. As a result, the subject is either not integrated into the curriculum at all or does not pass beyond the interpretation of analogue images. In fact, the subject of remote sensing requires a consolidation of physics and mathematics, as well as competences in the fields of media and methods, beyond the mere visual interpretation of satellite images.

Many teachers have great interest in the subject of remote sensing and are motivated to integrate it into teaching, provided that the curriculum allows for it. In many cases, this encouragement fails because of confusing information.[30] In order to integrate remote sensing in a sustainable manner, organizations like the EGU or Digital Earth[31] encourage the development of learning modules and learning portals. Examples include: FIS – Remote Sensing in School Lessons,[32] Geospektiv,[33] Ychange,[34] or Spatial Discovery,[35] which promote media and method qualifications as well as independent learning.

Software

Remote sensing data are processed and analyzed with computer software, known as a remote sensing application. A large number of proprietary and open source applications exist to process remote sensing data.

According to NOAA-sponsored research by Global Marketing Insights, Inc., the most used applications among Asian academic groups involved in remote sensing are as follows: ERDAS 36% (ERDAS IMAGINE 25% and ERMapper 11%); ESRI 30%; ITT Visual Information Solutions ENVI 17%; MapInfo 17%.

Among Western academic respondents, the figures were as follows: ESRI 39%, ERDAS IMAGINE 27%, MapInfo 9%, and AutoDesk 7%.

In education, those who want to go beyond simply looking at satellite image print-outs either use general remote sensing software (e.g. QGIS), Google Earth, StoryMaps, or software and web apps developed specifically for education (e.g. desktop: LeoWorks; online: BLIF).

Satellites

The first satellite UV/VIS observations simply showed pictures of the Earth's surface and atmosphere. Such satellite images are still used, for instance, as input for numerical weather forecasting. The first spectroscopic UV/VIS observations started in 1970 on board the US research satellite Nimbus 4. These measurements (backscatter ultraviolet, BUV, later also called solar BUV, SBUV) operated in nadir geometry, i.e., they measured the solar light reflected from the ground or scattered from the atmosphere. Like the Dobson instruments, the BUV/SBUV instruments measure the intensity in different narrow spectral intervals. The intention of these BUV/SBUV observations was to determine information on the atmospheric O3 profile, since the penetration depth into the atmosphere strongly depends on wavelength. For example, light at the shortest wavelengths has only 'seen' the highest parts of the O3 layer, whereas the longest wavelengths have seen the total column. While in principle the BUV/SBUV measurements worked well, they suffered from instrumental instabilities.

The big breakthrough in UV/VIS satellite remote sensing of the atmosphere took place in 1979 with the launch of the Total Ozone Mapping Spectrometer (TOMS) on Nimbus 7. TOMS is similar to the BUV/SBUV instrument but measures light at longer wavelengths; thus it is only sensitive to the total O3 column (instead of the O3 profile). However, compared to the BUV/SBUV instruments, the TOMS instruments were much more stable. The TOMS instrument on board Nimbus 7 yielded what is so far the longest global data set on O3 (1979 - 1992). This period in particular includes the discovery of the ozone hole. Several further TOMS instruments have been launched on other satellites. Like the Dobson instruments on the ground, they yield very accurate O3 total column densities using a relatively simple method. Apart from events of very strong atmospheric SO2 absorption and aerosols, however, they are only sensitive to O3.

Since April 1995 the first DOAS instrument has been operating from space. The Global Ozone Monitoring Experiment (GOME) was launched on the European research satellite ERS-2. Like SBUV and TOMS, GOME is a nadir-viewing instrument; unlike its predecessor instruments, it covers a large spectral range (240 - 790 nm) at a total of 4096 wavelengths arranged in four 'channels' with a spectral resolution between 0.2 and 0.4 nm. Its normal ground pixel size is 320 x 40 km2. Global coverage is achieved after three days. For O3 profile measurements the intensities at short wavelengths are observed (as with the BUV/SBUV instruments); for the determination of the total atmospheric O3 column the intensities at longer wavelengths are used (as with the TOMS instruments). In contrast to the limited spectral information of the BUV/SBUV and TOMS instruments, GOME spectra yield a surplus of spectral information. By applying the DOAS method to these measurements it is thus possible to retrieve a large variety of atmospheric trace gases, the majority of which are very weak absorbers (O3, NO2, BrO, OClO, HCHO, H2O, O2, O4, SO2). In addition, other quantities like aerosol absorptions, the ground albedo or indices characterising the solar cycle can be analysed. Because of the high sensitivity of GOME it is in particular possible to measure various tropospheric trace gases (NO2, BrO, HCHO, H2O, SO2). A further important advantage is that the GOME spectra can be analysed with respect to a spectrum of direct sunlight, which contains no atmospheric absorptions. Therefore, in contrast to ground-based DOAS measurements, the DOAS analysis of GOME spectra yields total atmospheric column densities rather than the difference between two atmospheric spectra.
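
The DOAS retrieval described above amounts to fitting reference absorption cross sections, plus a low-order polynomial for broadband extinction, to the measured optical density ln(I0/I). A simplified linear least-squares version might look like the sketch below; operational GOME processing additionally includes wavelength calibration, Ring-effect correction and other steps, and all names here are illustrative:

```python
import numpy as np

def doas_fit(i0, i, cross_sections, poly_order=2):
    """Simplified DOAS retrieval: model the optical density ln(I0/I) as a
    sum of trace-gas cross sections, each scaled by its slant column
    density (SCD), plus a low-order polynomial absorbing broadband
    scattering.  Returns the fitted SCDs, one per gas.

    i0, i          : (M,) reference and measured spectra
    cross_sections : (N, M) absorption cross section per gas
    """
    tau = np.log(np.asarray(i0, float) / np.asarray(i, float))
    x = np.linspace(-1.0, 1.0, tau.size)
    poly = np.vander(x, poly_order + 1)            # broadband terms
    sigma = np.asarray(cross_sections, float)      # shape (n_gas, n_wl)
    design = np.column_stack([sigma.T, poly])
    scale = np.linalg.norm(design, axis=0)         # balance column magnitudes
    coeffs, *_ = np.linalg.lstsq(design / scale, tau, rcond=None)
    return (coeffs / scale)[: sigma.shape[0]]
```

Because the weak-absorber optical densities are linear in the column densities, the whole retrieval reduces to one least-squares solve per spectrum, which is what makes DOAS practical for global satellite data sets.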

In March 2002 a second DOAS satellite instrument, the SCanning Imaging Absorption SpectroMeter for Atmospheric ChartographY (SCIAMACHY), was launched on board the European research satellite Envisat. Compared to GOME it measures over a wider wavelength range (240 - 2380 nm), including also the absorption of several greenhouse gases (CO2, CH4, N2O) and CO in the IR. It also operates in additional viewing modes (nadir, limb, occultation), which allow stratospheric trace gas profiles to be derived. Another advantage is that the ground pixel size for the nadir viewing mode was significantly reduced, to 30 x 60 km2 (in a special mode even to 15 x 30 km2). Especially for the observation of tropospheric trace gases this is very important because of the strong spatial gradients occurring for such species. The first tropospheric results of SCIAMACHY showed that it was now possible to identify pollution plumes of individual cities or other big sources.

See also

References

  1. ^ a b Ran, Lingyan; Zhang, Yanning; Wei, Wei; Zhang, Qilin (2017-10-23). "A Hyperspectral Image Classification Framework with Spatial Pixel Pair Features". Sensors. 17 (10): 2421. doi:10.3390/s17102421. PMC 5677443. PMID 29065535.
  2. ^ Schowengerdt, Robert A. (2007). Remote sensing: models and methods for image processing (3rd ed.). Academic Press. p. 2. ISBN 978-0-12-369407-2.
  3. ^ Schott, John Robert (2007). Remote sensing: the image chain approach (2nd ed.). Oxford University Press. p. 1. ISBN 978-0-19-517817-3.
  4. ^ a b Guo, Huadong; Huang, Qingni; Li, Xinwu; Sun, Zhongchang; Zhang, Ying (2013). "Spatiotemporal analysis of urban environment based on the vegetation–impervious surface–soil model". Journal of Applied Remote Sensing. 8: 084597. Bibcode:2014JARS....8.4597G. doi:10.1117/1.JRS.8.084597.
  5. ^ Liu, Jian Guo & Mason, Philippa J. (2009). Essential Image Processing for GIS and Remote Sensing. Wiley-Blackwell. p. 4. ISBN 978-0-470-51032-2.
  6. ^ "Saving the monkeys". SPIE Professional. Retrieved 1 Jan 2016.
  7. ^ Howard, A.; et al. (Aug 19, 2015). "Remote sensing and habitat mapping for bearded capuchin monkeys (Sapajus libidinosus): landscapes for the use of stone tools". Journal of Applied Remote Sensing. 9 (1): 096020. doi:10.1117/1.JRS.9.096020.
  8. ^ "Archived copy". Archived from the original on 29 September 2006. Retrieved 18 February 2009.CS1 maint: Archived copy as title (link)
  9. ^ Goldberg, A.; Stann, B.; Gupta, N. (July 2003). "Multispectral, Hyperspectral, and Three-Dimensional Imaging Research at the U.S. Army Research Laboratory" (PDF). Proceedings of the International Conference on International Fusion [6th]. 1: 499–506.
  10. ^ Makki, Ihab; Younes, Rafic; Francis, Clovis; Bianchi, Tiziano; Zucchetti, Massimo (2017-02-01). "A survey of landmine detection using hyperspectral imaging". ISPRS Journal of Photogrammetry and Remote Sensing. 124: 40–53. doi:10.1016/j.isprsjprs.2016.12.009. ISSN 0924-2716.
  11. ^ Mills, J.P.; et al. (1997). "Photogrammetry from Archived Digital Imagery for Seal Monitoring". The Photogrammetric Record. 15 (89): 715–724. doi:10.1111/0031-868X.00080.
  12. ^ Twiss, S.D.; et al. (2001). "Topographic spatial characterisation of grey seal Halichoerus grypus breeding habitat at a sub-seal size spatial grain". Ecography. 24 (3): 257–266. doi:10.1111/j.1600-0587.2001.tb00198.x.
  13. ^ Stewart, J.E.; et al. (2014). "Finescale ecological niche modeling provides evidence that lactating gray seals (Halichoerus grypus) prefer access to fresh water in order to drink". Marine Mammal Science. 30 (4): 1456–1472. doi:10.1111/mms.12126.
  14. ^ Begni G. Escadafal R. Fontannaz D. and Hong-Nga Nguyen A.-T. (2005). Remote sensing: a tool to monitor and assess desertification. Les dossiers thématiques du CSFD. Issue 2. 44 pp.
  15. ^ Geodetic Imaging
  16. ^ Grigoriev А.N. (2015). "Мethod of radiometric distortion correction of multispectral data for the earth remote sensing". Scientific and Technical Journal of Information Technologies, Mechanics and Optics. 15 (4): 595–602. doi:10.17586/2226-1494-2015-15-4-595-602.
  17. ^ NASA (1986), Report of the EOS data panel, Earth Observing System, Data and Information System, Data Panel Report, Vol. IIa., NASA Technical Memorandum 87777, June 1986, 62 pp. Available at http://hdl.handle.net/2060/19860021622
  18. ^ C. L. Parkinson, A. Ward, M. D. King (Eds.) Earth Science Reference Handbook – A Guide to NASA’s Earth Science Program and Earth Observing Satellite Missions, National Aeronautics and Space Administration Washington, D. C. Available at http://eospso.gsfc.nasa.gov/ftp_docs/2006ReferenceHandbook.pdf Archived 15 April 2010 at the Wayback Machine
  19. ^ GRAS-SAF (2009), Product User Manual, GRAS Satellite Application Facility, Version 1.2.1, 31 March 2009. Available at http://www.grassaf.org/general-documents/products/grassaf_pum_v121.pdf
  20. ^ Maksel, Rebecca. "Flight of the Giant". Air & Space Magazine. Retrieved 2019-02-19.
  21. ^ Wakefield, Alan (Head of Photographs at IWM) (2014-04-04). "A bird's-eye view of the battlefield: aerial photography". ISSN 0307-1235. Retrieved 2019-02-19.
  22. ^ "Air Force Magazine". www.airforcemag.com. Retrieved 2019-02-19.
  23. ^ "Military Imaging and Surveillance Technology (MIST)". www.darpa.mil. Retrieved 2019-02-19.
  24. ^ "The Indian Society pf International Law - Newsletter: VOL. 15, No. 4, October - December 2016". doi:10.1163/2210-7975_hrd-9920-2016004.
  25. ^ "In Depth | Magellan". Solar System Exploration: NASA Science. Retrieved 2019-02-19.
  26. ^ Garner, Rob (2015-04-15). "SOHO - Solar and Heliospheric Observatory". NASA. Retrieved 2019-02-19.
  27. ^ Colen, Jerry (2015-04-08). "Ames Research Center Overview". NASA. Retrieved 2019-02-19.
  28. ^ Ditter, R., Haspel, M., Jahn, M., Kollar, I., Siegmund, A., Viehrig, K., Volz, D., Siegmund, A. (2012) Geospatial technologies in school – theoretical concept and practical implementation in K-12 schools. In: International Journal of Data Mining, Modelling and Management (IJDMMM): FutureGIS: Riding the Wave of a Growing Geospatial Technology Literate Society; Vol. X
  29. ^ Stork, E.J., Sakamoto, S.O., and Cowan, R.M. (1999) "The integration of science explorations through the use of earth images in middle school curriculum", Proc. IEEE Trans. Geosci. Remote Sensing 37, 1801–1817
  30. ^ Bednarz, S.W. and Whisenant, S.E. (2000) "Mission geography: linking national geography standards, innovative technologies and NASA", Proc. IGARSS, Honolulu, USA, 2780–2782 8
  31. ^ Digital Earth
  32. ^ FIS – Remote Sensing in School Lessons
  33. ^ geospektiv
  34. ^ YCHANGE
  35. ^ Landmap – Spatial Discovery

Further reading

  • Campbell, J. B. (2002). Introduction to remote sensing (3rd ed.). The Guilford Press. ISBN 978-1-57230-640-0.
  • Jensen, J. R. (2007). Remote sensing of the environment: an Earth resource perspective (2nd ed.). Prentice Hall. ISBN 978-0-13-188950-7.
  • Jensen, J. R. (2005). Digital Image Processing: a Remote Sensing Perspective (3rd ed.). Prentice Hall.
  • Lentile, Leigh B.; Holden, Zachary A.; Smith, Alistair M. S.; Falkowski, Michael J.; Hudak, Andrew T.; Morgan, Penelope; Lewis, Sarah A.; Gessler, Paul E.; Benson, Nate C. (2006). "Remote sensing techniques to assess active fire characteristics and post-fire effects". International Journal of Wildland Fire. 15 (3): 319–345. doi:10.1071/WF05097.
  • Lillesand, T. M.; R. W. Kiefer; J. W. Chipman (2003). Remote sensing and image interpretation (5th ed.). Wiley. ISBN 978-0-471-15227-9.
  • Richards, J. A.; X. Jia (2006). Remote sensing digital image analysis: an introduction (4th ed.). Springer. ISBN 978-3-540-25128-6.
  • US Army FM series.
  • US Army military intelligence museum, FT Huachuca, AZ
  • Datla, R.U.; Rice, J.P.; Lykke, K.R.; Johnson, B.C.; Butler, J.J.; Xiong, X. (March–April 2011). "Best practice guidelines for pre-launch characterization and calibration of instruments for passive optical remote sensing". Journal of Research of the National Institute of Standards and Technology. 116 (2): 612–646. doi:10.6028/jres.116.009.
  • Kuenzer, C., Zhang, J., Tetzlaff, A., and Dech, S. (2013): Thermal Infrared Remote Sensing of Surface and Underground Coal Fires. In: Kuenzer, C. and Dech, S. (eds.) 2013: Thermal Infrared Remote Sensing – Sensors, Methods, Applications. Remote Sensing and Digital Image Processing Series, Volume 17, 572 pp., ISBN 978-94-007-6638-9, pp. 429–451
  • Kuenzer, C. and S. Dech 2013: Thermal Infrared Remote Sensing – Sensors, Methods, Applications. Remote Sensing and Digital Image Processing Series, Volume 17, 572 pp., ISBN 978-94-007-6638-9
  • Lasaponara, R. and Masini N. 2012: Satellite Remote Sensing - A new tool for Archaeology. Remote Sensing and Digital Image Processing Series, Volume 16, 364 pp., ISBN 978-90-481-8801-7.

Earth observation

Earth observation (EO) is the gathering of information about the physical, chemical, and biological systems of the planet via remote-sensing technologies, supplemented by Earth-surveying techniques, and encompasses the collection, analysis, and presentation of data. Earth observation is used to monitor and assess the status of, and changes in, natural and built environments.

In recent years, Earth observation has become increasingly sophisticated technologically. It has also become more important due to the dramatic impact that modern human civilization is having on the world and the need to minimize negative effects (e.g. geohazards), along with the opportunities such observation provides to improve social and economic well-being.

The term Earth observation is used in two ways, leading to confusion. In Europe, in particular, it has often been used to refer to satellite-based remote sensing, but the term is also used to refer to any form of observations of the Earth system, including in situ and airborne observations, for example. The Group on Earth Observations, which has over 100 member countries and over 100 participating organizations, uses EO in this broader sense.

To add to the confusion, in the US, for example, the term remote sensing is often used to refer to satellite-based remote sensing, but sometimes used more broadly for observations using any form of remote sensing technology, including airborne sensors and even ground-based sensors such as cameras. Perhaps the least ambiguous term to use for satellite-based sensors is satellite remote sensing, or SRS, an acronym which is gradually starting to appear in the literature.

Earth observations may include:

numerical measurements taken by a thermometer, wind gauge, ocean buoy, altimeter or seismometer

photos and radar or sonar images taken from ground or ocean-based instruments

photos and radar images taken from remote-sensing satellites

decision-support tools based on processed information, such as maps and models

Just as Earth observations consist of a wide variety of possible elements, they can be applied to a wide variety of possible uses. Some of the specific applications of Earth observations are:

forecasting weather

tracking biodiversity and wildlife trends

measuring land-use change (such as deforestation)

monitoring and responding to natural disasters, including fires, floods, earthquakes, landslides, land subsidence and tsunamis

managing natural resources, such as energy, freshwater and agriculture

addressing emerging diseases and other health risks

predicting, adapting to and mitigating climate change

The quality and quantity of Earth observations continue to grow rapidly. In addition to the ongoing launch of new remote-sensing satellites, increasingly sophisticated in situ instruments located on the ground, on balloons and airplanes, and in rivers, lakes and oceans are generating increasingly comprehensive, near-real-time observations.

Earth observation satellite

An Earth observation satellite or Earth remote sensing satellite is a satellite specifically designed for Earth observation from orbit, similar to spy satellites but intended for non-military uses such as environmental monitoring, meteorology, and map making.

The first occurrence of satellite remote sensing can be dated to the launch of the first artificial satellite, Sputnik 1, by the Soviet Union on October 4, 1957. Sputnik 1 sent back radio signals, which scientists used to study the ionosphere.

NASA launched the first American satellite, Explorer 1, on January 31, 1958. The information sent back from its radiation detector led to the discovery of the Earth's Van Allen radiation belts.

The TIROS-1 spacecraft, launched on April 1, 1960 as part of NASA's TIROS (Television Infrared Observation Satellite) Program, sent back the first television footage of weather patterns to be taken from space.

As of 2008, more than 150 Earth observation satellites were in orbit, recording data with both passive and active sensors and acquiring more than 10 terabits of data daily.

Most Earth observation satellites carry instruments that should be operated at a relatively low altitude. Altitudes below 500–600 kilometers are in general avoided, though, because the significant air drag at such low altitudes makes frequent orbit-reboost maneuvers necessary. The Earth observation satellites ERS-1, ERS-2 and Envisat of the European Space Agency, as well as the MetOp spacecraft of EUMETSAT, are all operated at altitudes of about 800 km. The Proba-1, Proba-2 and SMOS spacecraft of the European Space Agency observe the Earth from an altitude of about 700 km. The UAE Earth observation satellites DubaiSat-1 and DubaiSat-2 are also placed in low Earth orbit (LEO) and provide satellite imagery of various parts of the Earth.

To get (nearly) global coverage with a low orbit, it must be a polar orbit or nearly so. A low orbit has an orbital period of roughly 100 minutes, and the Earth rotates around its polar axis by about 25 degrees between successive orbits, so the ground track shifts westward by those 25 degrees of longitude with each orbit. Most such satellites are in Sun-synchronous orbits.

Spacecraft carrying instruments for which an altitude of 36,000 km is suitable sometimes use a geostationary orbit. Such an orbit allows uninterrupted coverage of more than one third of the Earth. Three geostationary spacecraft at longitudes separated by 120 degrees can cover the whole Earth except the extreme polar regions. This type of orbit is mainly used for meteorological satellites.
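The figures quoted above (a roughly 100-minute low-orbit period, the ~25-degree westward ground-track shift, and the ~36,000 km geostationary altitude) all follow from Kepler's third law. A minimal sketch, assuming a circular orbit and a spherical Earth with standard constants:

```python
import math

MU = 398600.4418        # Earth's gravitational parameter, km^3/s^2
R_EARTH = 6371.0        # mean Earth radius, km
SIDEREAL_DAY = 86164.1  # Earth's rotation period, s

def period_s(altitude_km):
    """Circular-orbit period from Kepler's third law: T = 2*pi*sqrt(a^3/mu)."""
    a = R_EARTH + altitude_km
    return 2 * math.pi * math.sqrt(a ** 3 / MU)

def track_shift_deg(altitude_km):
    """Westward longitude shift between successive ground tracks."""
    return 360.0 * period_s(altitude_km) / SIDEREAL_DAY

def geostationary_altitude_km():
    """Altitude at which the orbital period equals one sidereal day."""
    a = (MU * SIDEREAL_DAY ** 2 / (4 * math.pi ** 2)) ** (1 / 3)
    return a - R_EARTH

print(period_s(800) / 60)           # ~100.7 min for an ERS/Envisat-class orbit
print(track_shift_deg(800))         # ~25.2 deg westward per orbit
print(geostationary_altitude_km())  # ~35,800 km
```

The 800 km example reproduces both numbers cited for ERS-class satellites; the geostationary result matches the ~36,000 km figure for meteorological satellites.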

European Remote-Sensing Satellite

European remote sensing satellite (ERS) was the European Space Agency's first Earth-observing satellite programme using a polar orbit. The first satellite was launched on 17 July 1991 into a Sun-synchronous polar orbit at an altitude of 782–785 km.

General Organization of Remote Sensing

General Organization of Remote Sensing (GORS) is a Syrian space research agency. Established in 1986, GORS is responsible for carrying out aerospace and land surveying using remote sensing techniques.

Indian Institute of Remote Sensing

The Indian Institute of Remote Sensing, established in 1966 under the Indian Department of Space, is a premier institute for research, higher education and training in the fields of remote sensing, geoinformatics and GPS technology for natural resources, environmental and disaster management. It is located in the city of Dehradun, Uttarakhand.

International Society for Photogrammetry and Remote Sensing

The International Society for Photogrammetry and Remote Sensing (ISPRS) is an international non-governmental organization that enhances international cooperation among worldwide organizations with interests in the photogrammetry, remote sensing and spatial information sciences. Established in 1910, ISPRS is the oldest international umbrella organization in its field, which may be summarized as addressing "information from imagery".

ISPRS achieves its aims by:

Advancing knowledge in the areas of interest of ISPRS by encouraging and facilitating research and development, scientific networking and inter-disciplinary activities

Facilitating education and training with particular emphasis in less developed countries

Enhancing public recognition of the contributions of the photogrammetry, remote sensing and spatial information sciences for the benefit of humankind and the sustainability of the environment

The ISPRS scientific and technical programs are organized by five Technical Commissions. Each Commission is sponsored by an ISPRS member organization for the four-year period between Congresses. The five Technical Commissions have established around 60 Working Groups, which are responsible for particular topics within the Commissions' areas of interest. Each Technical Commission held a Symposium in its host country in 2018, and the Working Groups will organize smaller workshops before the 2020 Congress, to be held from June 28 to July 4, 2020, in Nice, France, by the French Society for Photogrammetry and Remote Sensing (http://www.isprs2020-nice.com).

Mars Global Remote Sensing Orbiter and Small Rover

The Mars Global Remote Sensing Orbiter and Small Rover (HX-1) is a planned project by China to deploy an orbiter and rover on Mars. The mission is planned to launch in July or August 2020 on a Long March 5 heavy-lift rocket. Its stated objective is to search for evidence of both current and past life and to assess the planet's environment.

Mawrth Vallis

Mawrth Vallis (Welsh: [maurθ]; Mawrth means "Mars" in Welsh) is a valley on Mars, located in the Oxia Palus quadrangle at 22.3°N, 343.5°E, with an elevation approximately two kilometers below datum. Situated between the southern highlands and northern lowlands, the valley is a channel formed by massive flooding in Mars' ancient past. It is an ancient water outflow channel with light-colored clay-rich rocks.

Prior to the selection of Gale Crater for the Mars Science Laboratory (MSL) Curiosity rover mission, Mawrth Vallis was considered as a potential landing site because of the detection of a stratigraphic section rich in clay minerals. Clay minerals have implications for past aqueous environments as well as the potential to preserve biosignatures, making them ideal targets for the search for life on Mars. Although Mawrth Vallis was not chosen as a landing target, there is still interest in understanding the mineralogy and stratigraphy of the area. Until a rover mission is committed to exploring Mawrth Vallis, orbiters remain the only source of information. These orbiters carry a number of spectrometers that contribute to our knowledge of Mawrth Vallis and the rest of the Martian surface.

Moderate Resolution Imaging Spectroradiometer

The Moderate Resolution Imaging Spectroradiometer (MODIS) is a payload imaging sensor built by Santa Barbara Remote Sensing that was launched into Earth orbit by NASA in 1999 on board the Terra (EOS AM) Satellite, and in 2002 on board the Aqua (EOS PM) satellite. The instruments capture data in 36 spectral bands ranging in wavelength from 0.4 µm to 14.4 µm and at varying spatial resolutions (2 bands at 250 m, 5 bands at 500 m and 29 bands at 1 km). Together the instruments image the entire Earth every 1 to 2 days. They are designed to provide measurements in large-scale global dynamics including changes in Earth's cloud cover, radiation budget and processes occurring in the oceans, on land, and in the lower atmosphere. MODIS utilizes four on-board calibrators in addition to the space view in order to provide in-flight calibration: solar diffuser (SD), solar diffuser stability monitor (SDSM), spectral radiometric calibration assembly (SRCA), and a v-groove black body. MODIS has used the marine optical buoy for vicarious calibration. MODIS is succeeded by the VIIRS instrument on board the Suomi NPP satellite launched in 2011 and future Joint Polar Satellite System (JPSS) satellites.
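MODIS's two 250 m bands (band 1, red; band 2, near-infrared) are widely used to derive vegetation indices such as NDVI, which exploits the fact that healthy vegetation absorbs red light and strongly reflects near-infrared. A minimal sketch of the index itself; the reflectance values below are illustrative, not actual MODIS measurements:

```python
def ndvi(red, nir):
    """Normalized Difference Vegetation Index from red and NIR surface reflectances.

    Ranges from -1 to +1; dense, healthy vegetation typically scores above ~0.6,
    bare soil near 0, and water slightly negative.
    """
    if red + nir == 0:
        raise ValueError("red + nir must be nonzero")
    return (nir - red) / (nir + red)

# Dense vegetation: low red reflectance, high NIR reflectance
print(ndvi(red=0.05, nir=0.45))  # 0.8
# Bare soil: red and NIR reflectances nearly equal
print(ndvi(red=0.30, nir=0.35))  # ~0.08
```

Real MODIS processing adds atmospheric correction, cloud masking, and compositing before the index is computed, but the band arithmetic is exactly this.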

The MODIS characterization support team (MCST) is dedicated to the production of high-quality calibrated MODIS products, which are a precursor to every geophysical science product. A detailed description of the MCST mission statement and other details can be found at MCST Web.

NISAR (satellite)

The NASA-ISRO Synthetic Aperture Radar (NISAR) mission is a joint project between NASA and ISRO to co-develop and launch a dual-frequency synthetic aperture radar on an Earth observation satellite. The satellite will be the first radar imaging satellite to use dual frequencies. It will be used for remote sensing, to observe and understand natural processes on Earth; for example, its right-facing instruments will study the Antarctic cryosphere. With a total cost estimated at US$1.5 billion, NISAR is likely to be the world's most expensive Earth-imaging satellite.

National Authority for Remote Sensing and Space Sciences

National Authority for Remote Sensing and Space Sciences (NARSS) is the pioneering Egyptian institution in the field of satellite remote sensing.

National Central University

National Central University (NCU, Chinese: 國立中央大學, Kuo-Li Chung-yang Ta-hsüeh, or 中大, Chung-ta) was founded in Nanjing in 1915, with roots traced to 258 CE in mainland China. NCU was the leading academic center in southeast China; the phrase "North, the Peking University; South, the Central University" at that time revealed the significance of NCU. NCU was renamed Nanjing University in 1949, and the former campus has been used by Nanjing Institute of Technology (later renamed Southeast University) since Nanjing University relocated in 1952. NCU was re-established in Taiwan in 1962. The school was initially located in Miaoli but relocated to Zhongli in 1968 and developed into a comprehensive university. It has become Taiwan's leading school in drama, film studies, cultural studies, gender studies, Hakka studies, geophysics, space science, remote sensing, astronomy, optoelectronics, nanotechnology, and business management, as well as the first university in Taiwan to research industrial economics and economic development (Taiwan's Consumer Confidence Index is released monthly by NCU). NCU is a member of AACSB. In 2001, NCU was selected by the Ministry of Education as one of the eleven research-oriented universities in Taiwan.

NCU now has eight colleges in different areas, including College of Liberal Arts, College of Science, College of Engineering, College of Electrical Engineering and Computer Science, College of Biomedical Science and Engineering, College of Earth Sciences, College of Management, and College of Hakka Studies, also with areas in sociology, law and government studies, etc.

The undergraduate population is represented by the Associated Students of National Central University.

National Remote Sensing Centre

National Remote Sensing Centre (Hindi: राष्ट्रीय सुदूर संवेदन केन्द्र), or NRSC, located at Hyderabad is one of the centres of the Indian Space Research Organisation (ISRO). NRSC manages data from aerial and satellite sources.

Pakistan Remote Sensing Satellite

The Pakistan Remote Sensing Satellite (PRSS), commercially known as Remote Sensing Satellite System (RSSS), is a dual-purpose Earth observational and optical satellite. Pakistan Remote Sensing Satellite-1 (PRSS-1) was launched from China's Jiuquan Satellite Launch Centre on 9 July 2018.

Remote sensing (archaeology)

Remote sensing techniques in archaeology are an increasingly important component of the technical and methodological tool set available in archaeological research. The use of remote sensing techniques allows archaeologists to uncover unique data that is unobtainable using traditional archaeological excavation techniques.

Space Research and Remote Sensing Organization

The Bangladesh Space Research and Remote Sensing Organization (Bengali: বাংলাদেশ মহাকাশ গবেষণা ও দূর অনুধাবন কেন্দ্র Bangladesh môhakash gôbeshôna o dur ônudhabôn kendrô), or SPARRSO, is a state agency concerned with astronomical research and the application of space technology in Bangladesh. SPARRSO works closely with JAXA, NASA and the ESA in environmental and meteorological research. Using Japanese and American satellites, SPARRSO monitors agro-climatic conditions and water resources in Bangladesh.

Topography

Topography is the study of the shape and features of land surfaces. The topography of an area could refer to the surface shapes and features themselves, or to a description of them (especially their depiction in maps).

Topography is a field of geoscience and planetary science and is concerned with local detail in general, including not only relief but also natural and artificial features, and even local history and culture. This meaning is less common in the United States, where topographic maps with elevation contours have made "topography" synonymous with relief.

Topography in a narrow sense involves the recording of relief or terrain, the three-dimensional quality of the surface, and the identification of specific landforms; this is also known as geomorphometry. In modern usage, this involves the generation of elevation data in digital form (a digital elevation model, DEM). It is often considered to include the graphic representation of the landform on a map by a variety of techniques, including contour lines, hypsometric tints, and relief shading.
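A basic geomorphometric operation on a DEM is estimating slope at each grid cell from the elevations of its neighbors. A minimal sketch using central differences (real GIS packages use refinements such as Horn's method; the grid and cell size below are illustrative):

```python
import math

def slope_deg(dem, i, j, cell_size):
    """Slope (degrees) at interior cell (i, j) of a DEM via central differences.

    dem is a 2D list of elevations in meters; cell_size is the grid spacing
    in meters. Gradient magnitude is converted to an angle with arctan.
    """
    dzdx = (dem[i][j + 1] - dem[i][j - 1]) / (2 * cell_size)
    dzdy = (dem[i + 1][j] - dem[i - 1][j]) / (2 * cell_size)
    return math.degrees(math.atan(math.hypot(dzdx, dzdy)))

# A tiny 3x3 elevation grid (meters) on 30 m cells: a plane rising eastward
dem = [[100, 110, 120],
       [100, 110, 120],
       [100, 110, 120]]
print(slope_deg(dem, 1, 1, 30.0))  # ~18.4 degrees: a rise of 10 m per 30 m
```

The same finite-difference gradient also yields aspect (the downslope direction, via atan2 of the two components), the other staple derivative of a DEM.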

UNOSAT

UNOSAT was established in 2001 as an operational, technology-intensive programme of the United Nations Institute for Training and Research (UNITAR). UNOSAT provides satellite imagery analysis and capacity development to the UN system, UN member states, and its partners. The programme's work supports evidence-based decision making for peace, security and resilience. UNOSAT products are used in response to humanitarian crises and for implementation of the 2030 Agenda for Sustainable Development.

The UNOSAT team is mainly composed of GIS and imagery analysts, remote sensing experts, geologists, hydrogeologists and hydrologists, supported by IT engineers, programmers, and management experts.

UNOSAT is headquartered at The European Organization for Nuclear Research (CERN) in Geneva, Switzerland with regional presence in Bangkok, Nairobi and N’Djamena.

This page is based on Wikipedia articles written by their contributors.
Text is available under the CC BY-SA 3.0 license; additional terms may apply.
Images, videos and audio are available under their respective licenses.