
Open standards-based geoprocessing Web services support the study and management of hazard and risk

Pages 171-184 | Received 23 Dec 2009, Accepted 22 Jan 2010, Published online: 23 Jun 2010

Abstract

Over the last 15 years, and particularly in the last 5 years, a robust framework of open geoprocessing standards and sensor web enablement standards has been developed by the Open Geospatial Consortium (OGC), an open, consensus-based standards development organization, in close cooperation with other standards development organizations. These standards have been implemented by developers in a wide variety of commercially successful geoprocessing software products and Web services. The standards are coming into wide use in domains of activity such as ocean observation, defence and intelligence, and civil protection, and they are beginning to be used in many other domains, including the domain of natural hazards and risks. A description of the standards is provided, along with a discussion of their benefits and the changes they tend to encourage in business and institutional arrangements. It is expected that use of these standards will become ubiquitous as new computing models (‘cloud computing’, notably) replace old computing models. It is also expected that this progress will have significant consequences for environmental risk and hazard assessment and management as well as the institutions, practices and methods of sciences that produce and use geospatial information.

1. Introduction

To understand the significance of open standards for applications of geomatics that involve risk and hazard assessment and management, it is necessary to look at geomatics in its larger socio-technical context. All of the ‘communities of interest’ that use geomatics benefit from the increasing power of computer processor chips, the increasing speed and capacity of storage devices, the increasing resolution and speed of sensors, and the increasing data transfer bandwidth between components in devices and between devices on a network, usually the Internet. But what makes computing progress useful is communication.

Communication means transmitting or exchanging through a common system of symbols, signs, or behaviour. Standardization means agreeing on a common system. In the information and communication technology (ICT) world, standardization is undertaken by communities of interest who see various benefits to having a common system that results in reliable ‘behaviour’ in terms of computers passing instructions that result in desired responses. Systems that ‘work together’ through a common system of interfaces and encodings (also called ‘Application Programming Interfaces’) are said to ‘interoperate’.

The Internet and Web are enabled and defined by their open standards (TCP/IP, HTTP, HTML, XML, etc.). Their extraordinary success provides a model of interoperability that is driving the whole ICT industry in the direction of open standards. The OGC, whose membership now includes more than 385 government agencies, corporations, universities, research institutions and non-governmental organizations, is the only open standards organization focused on geoprocessing standards. But because geospatial information is critical across many domains of activity and technology, the OGC membership has learned that it is imperative to cooperate with many other standards organizations, most of which are also inclusive, international, consensus-based and driven by the requirements of both technology users and technology providers. OGC also necessarily coordinates with and obtains requirements from groups that represent geomatics-using communities of interest; most recently (in November 2009), the OGC signed memoranda of understanding with the International Environmental Modeling and Software Society (iEMSs) and the World Meteorological Organization (WMO).

People often confuse open standards and open source, and it is worthwhile to distinguish between these at the outset:

Open standards, as defined by the OGC, are standards that are created in an inclusive, international, participatory industry process and that are publicly available at no cost. In the era we are leaving behind, standards were usually imposed by a single successful vendor or cartel or else they were mandated by some authority. The success of the Internet and the Web suggests that consensus-derived open standards are likely to be more successful over time than imposed standards.

Open source means that anyone can have free access to the source code for a software product, but open source also involves certain licensing restrictions designed to maintain the integrity of an open source product and encourage the product's continual improvement (see the Open Source Initiative http://www.opensource.org/docs/definition.php).

Open source and open standards tend to support each other, but they are different. Increasingly important and popular, open standards and open source are even being explored as a means for advancement in non-ICT industries, such as the automobile industry (http://www.theoscarproject.org/). This observation underscores their significance as a catalyst for social and economic change.

Regardless of the market model employed to advance geospatial and location-based technologies, OGC standards provide an underpinning of interoperability that is easing the discovery, access, integration, fusion and use of location information in a wide range of applications worldwide.

2 OGC Web Service Standards

To understand the importance of open geoprocessing standards in the field of hazard and risk assessment, one must understand what such standards do. Beginning in 1994, the OGC membership first created a system of policies and procedures for its own operation, and then began developing geoprocessing standards. These differ in kind from earlier standards in the GIS and remote sensing industry which were mainly standards for data formats. OGC standards comprise a ‘common language’ by which computer systems can request geoprocessing services from other computer systems. Such a request might be simply, ‘Client requests certain data from server’, or it might be more complex, such as, ‘Client requests server to conflate this client-provided data with other specified data to be provided by server, and then to return the conflated data for display in a certain portrayal style’.

Not all OGC standards apply in the Web environment. In this paper we focus on OGC Web Services (OWS) standards, because the World Wide Web has become the dominant platform for network-based client/server computing and because this model of computing is rapidly becoming the dominant model.

Twenty-eight member-approved OGC standards are currently available. A quick scan of summary descriptions of the OGC standards provided below gives a sense of the scope of OWS standards:

2.1 OGC® Web Map Service Interface Standard (WMS)

This provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. A WMS request defines the geographic layer(s) and area of interest, which are returned as one or more geo-registered map images (JPEG, PNG, etc.) that can be displayed in a browser application. The interface also allows the client to specify whether the returned images should be transparent, so that layers from multiple servers can be visually overlaid.
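To make the interface concrete, the sketch below (Python, standard library only) composes a WMS 1.1.1 GetMap request as a key-value-pair URL. The endpoint, layer names and bounding box are invented for illustration.

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width=800, height=600,
                   fmt="image/png", transparent=True):
    """Compose a WMS 1.1.1 GetMap request URL (key-value-pair encoding)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.1.1",
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),              # geographic layer(s) of interest
        "STYLES": "",                            # default portrayal for each layer
        "SRS": "EPSG:4326",                      # coordinate reference system
        "BBOX": ",".join(str(v) for v in bbox),  # minx,miny,maxx,maxy
        "WIDTH": str(width),                     # output image size in pixels
        "HEIGHT": str(height),
        "FORMAT": fmt,                           # e.g. image/png, image/jpeg
        "TRANSPARENT": "TRUE" if transparent else "FALSE",  # for overlay
    }
    return base_url + "?" + urlencode(params)

# Hypothetical server and layers; the returned image could be overlaid
# on layers fetched from other WMS servers.
url = wms_getmap_url("http://example.org/wms", ["rivers", "roads"],
                     (120.0, 22.0, 122.0, 25.0))
```

Because every conformant WMS accepts the same parameters, a client written against this interface can display and overlay map layers from any number of independently operated servers.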

2.2 OGC Web Feature Service Interface Standard (WFS)

This defines the interface to a data access service that enables features from multiple feature collections (http://www.opengeospatial.org/ogc/glossary/f) (as in a vector GIS) to be queried and managed using HTTP. The standard defines operations that enable clients to discover the feature collections available through the service, access features and their attributes, and perform editing, locking and a range of other operations against features and attributes.

The canonical feature encoding for input and output is the Geography Markup Language (GML) (http://www.opengeospatial.org/standards/gml). However, the WFS standard is extensible and allows for other feature encodings to be supported.
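A WFS request follows the same key-value-pair pattern. The sketch below composes a WFS 1.1.0 GetFeature request; the endpoint and feature type name are hypothetical, and the server would by default answer with a GML feature collection.

```python
from urllib.parse import urlencode

def wfs_getfeature_url(base_url, type_name, max_features=None):
    """Compose a WFS 1.1.0 GetFeature request URL."""
    params = {
        "SERVICE": "WFS",
        "VERSION": "1.1.0",
        "REQUEST": "GetFeature",
        "TYPENAME": type_name,  # qualified feature type name
    }
    if max_features is not None:
        params["MAXFEATURES"] = str(max_features)  # limit the result size
    return base_url + "?" + urlencode(params)

# Hypothetical feature type in a hazard-mapping namespace.
url = wfs_getfeature_url("http://example.org/wfs",
                         "hazard:DebrisFlowZone", max_features=10)
```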

2.3 OGC Geography Markup Language Encoding Standard (GML)

This defines a data encoding in XML for geographic data and a grammar to express models of such data using XML Schema (http://www.w3.org/XML/Schema). GML provides a means of encoding geographic information for both data transport and data storage, especially in a Web context. It is extensible, supporting a wide variety of spatial tasks, from portrayal to analysis. It separates content from presentation (graphic or otherwise), and permits easy integration of spatial and non-spatial data. Clients and servers with interfaces that implement the OGC® Web Feature Service Interface Standard (http://www.opengeospatial.org/standards/wfs) read and write GML data. GML is also an ISO standard (ISO 19136:2007) (http://www.iso.org/iso/iso_catalogue/catalogue_tc/catalogue_detail.htm?csnumber=32554).
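Because GML is XML, any standard XML toolkit can read it. The fragment below is a hand-written, minimal illustration (not drawn from a real dataset) of a feature with one non-spatial attribute and a GML point geometry, parsed with Python's standard library.

```python
import xml.etree.ElementTree as ET

# Minimal, hand-written fragment: a feature with a name (non-spatial
# attribute) and a GML point geometry (spatial attribute).
gml_fragment = """<Station xmlns:gml="http://www.opengis.net/gml">
  <name>Rain gauge 7</name>
  <gml:Point srsName="EPSG:4326">
    <gml:pos>24.15 120.67</gml:pos>
  </gml:Point>
</Station>"""

root = ET.fromstring(gml_fragment)
ns = {"gml": "http://www.opengis.net/gml"}
name = root.find("name").text
lat, lon = map(float, root.find("gml:Point/gml:pos", ns).text.split())
```

Note how the spatial and non-spatial content sit side by side in one document, which is what permits the easy integration of spatial and non-spatial data mentioned above.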

2.4 OGC Web Coverage Service Interface Standard (WCS)

This specifies how software client applications can request parts of coverages (http://www.opengeospatial.org/ogc/glossary/c) (usually grid coverages) offered by a geoweb server. (Client applications are software programs that invoke ‘services’ provided by server software in client/server environments such as a Web-services-based distributed processing environment.) The response to a WCS request includes coverage metadata and an output coverage whose pixels are encoded in a specified binary image format. A client application can define the spatial-temporal domain of an identified grid coverage to be returned, as well as the identified parts of the range of values for the coverage. A coverage could, for example, be a digital elevation model or a multispectral image, where each pixel has an associated set of attributes that can be requested and accessed by a client.
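A GetCoverage request names the coverage, a spatial subset and an output grid. The sketch below uses the WCS 1.0.0 key-value-pair encoding; the endpoint and coverage name are hypothetical, asking for a piece of a digital elevation model as GeoTIFF.

```python
from urllib.parse import urlencode

def wcs_getcoverage_url(base_url, coverage, bbox, width, height,
                        fmt="GeoTIFF"):
    """Compose a WCS 1.0.0 GetCoverage request URL."""
    params = {
        "SERVICE": "WCS",
        "VERSION": "1.0.0",
        "REQUEST": "GetCoverage",
        "COVERAGE": coverage,                    # e.g. a DEM or image layer
        "CRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),  # spatial subset
        "WIDTH": str(width),                     # output grid size in pixels
        "HEIGHT": str(height),
        "FORMAT": fmt,                           # binary encoding of the pixels
    }
    return base_url + "?" + urlencode(params)

url = wcs_getcoverage_url("http://example.org/wcs", "taiwan_dem",
                          (120.0, 23.0, 121.0, 24.0), 512, 512)
```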

2.5 OGC Symbology Encoding Standard (SE)

This defines an XML language for styling information that can be applied to digital geographic feature and coverage data.

2.6 OGC Catalogue Services Interface Standard (CS)

This supports the ability to publish and search collections of descriptive information (metadata) about geospatial data, services and related resources.

2.7 OGC Coordinate Transformation Service Standard (CT)

This provides a standard way for software to specify and access coordinate transformation services for use on specified spatial data.

2.8 OGC Geospatial eXtensible Access Control Markup Language Encoding Standard (GeoXACML)

This defines a geospatial extension to the OASIS standard ‘eXtensible Access Control Markup Language (XACML)’ (http://www.oasis-open.org/committees/xacml/).

2.9 OGC KML Encoding Standard

The OGC KML Encoding Standard, brought into the OGC by Google, defines a language for geographic visualization, including annotation of maps and images. Geographic visualization includes not only the presentation of graphical data on the globe, but also the control of the user's navigation in the sense of where to go and where to look. KML is complementary to most of the key existing OGC standards, including GML, WFS and WMS.

OGC Web Services are, of course, only a subset of the world's web services, most of which do not involve geoprocessing. But a key goal of the OGC has been to make OGC standards as compatible as possible with standards frameworks being developed for general purpose computing and for special technology domains that intersect with the geospatial technology domain. Such special technology domains include, for example, smart sensors, smart grid, security, digital rights management, workflow, semantic web, grid computing, 3D, and computer-aided design (CAD). This ongoing effort to weave geoprocessing into the larger fabric of Web-based computing is critically important for multidisciplinary activities such as hazard and risk assessment and management.

The full set of OGC standards, including Sensor Web Enablement (SWE) standards described below, along with reference models, standards profiles, public engineering reports, best practices documents, discussion papers and white papers are available at http://www.opengeospatial.org/standards.

3 OGC Sensor Web Enablement (SWE) Standards (Figure 1)

Most geospatial data are created by means of sensing and measurement devices (gauges, GPS receivers, satellite-borne imaging cameras, etc.); most sensors have a location that is pertinent to the sensor's purpose; and there is value in aggregating sensor access and sensor data archives for use in geospatial applications. For these reasons, OGC members have developed a set of ‘Sensor Web Enablement’ (SWE) standards that provide a Web services framework for working with sensors and sensor data. Again, a quick scan of summary descriptions of some of these standards provides a sense of the scope of this set of free and open resources available to tool and application developers serving the risk and hazard assessment community.

Figure 1. Sensor Web Enablement standards provide a platform for describing, publishing, discovering, controlling and collecting data from virtually any kind of sensor.


3.1 OGC® Sensor Model Language (SensorML) Encoding Standard

This defines the general models and XML encodings for sensors. SensorML enables:

Discovery of sensors and processes and plug-and-play operation of sensors – SensorML is the means by which sensors and processes make themselves and their capabilities known; it describes inputs, outputs and taskable parameters.

Observation lineage – SensorML provides history of measurement and processing of observations; supports quality knowledge of observations.

On-demand processing – SensorML supports on-demand derivation of higher-level information (e.g. geolocation or products) without a priori knowledge of the sensor system.

Intelligent, autonomous sensor network – SensorML enables the development of taskable, adaptable sensor networks, and enables higher-level problem solving anticipated from the Semantic Web.

3.2 OGC Observations and Measurements (O&M) Encoding Standard

This defines the general models and XML encodings for sensor observations and measurements.

3.3 OGC Sensor Observation Service (SOS) Interface Standard

This provides an open application programming interface (API) for managing deployed sensors and retrieving sensor data and specifically ‘observation’ data.
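As an illustration of the API, the sketch below composes an SOS 1.0.0 GetObservation request using the key-value-pair binding. The offering and observed-property identifiers are invented; a real deployment advertises its own in its capabilities document.

```python
from urllib.parse import urlencode

def sos_getobservation_url(base_url, offering, observed_property):
    """Compose an SOS 1.0.0 GetObservation request URL (KVP binding)."""
    params = {
        "service": "SOS",
        "version": "1.0.0",
        "request": "GetObservation",
        "offering": offering,                   # sensor grouping on the server
        "observedProperty": observed_property,  # phenomenon of interest
        # Ask for an Observations & Measurements (O&M) XML response.
        "responseFormat": 'text/xml;subtype="om/1.0.0"',
    }
    return base_url + "?" + urlencode(params)

url = sos_getobservation_url(
    "http://example.org/sos",
    "RAIN_GAUGE_NETWORK",
    "urn:ogc:def:phenomenon:OGC:1.0.30:rainfall",
)
```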

3.4 OGC Sensor Planning Service (SPS) Interface Standard

This provides an open API for a service by which a client can (1) determine the feasibility of collecting data from one or more mobile sensors/platforms and (2) submit collection requests to these sensors/platforms.

4 Examples of OGC Standards used in Risk and Hazard Assessment and Management

A 2007 US National Academy of Sciences study, ‘Successful Response Starts with a Map’ (Board on Earth Sciences and Resources 2007) concluded that geospatial information should be an essential part of all aspects of emergency and disaster management. The same point was made in a June 2005 report of the US National Science and Technology Council Committee on Environment and Natural Resources: ‘Grand Challenges for Disaster Reduction – A Report of the Subcommittee on Disaster Reduction’ (Subcommittee on Disaster Reduction 2005). In that report, Grand Challenge #1 is to ‘Provide hazard and disaster information where and when it is needed’.

4.1 Taiwan Debris Flow Warning System

A scenario in the 2008/2009 OWS-6 Testbed Activity involved an actual implementation in Taiwan of SWE standards and chained Web services in a working debris flow monitoring system. In parts of Taiwan – due to the steep terrain, severe weather (frequent typhoons) and geology (unstable soils and frequent earthquakes) – upland river valleys are subject to sudden and dangerous flows of earth and boulders, and it is important to provide alerts and warnings. This scenario involved alerts, notifications, grid processing and a real-time event architecture (Figure 2).

Figure 2. Parts of Taiwan are susceptible to sometimes disastrous flows of earth and boulders in upland river beds. A system of sensors and analysis can provide information about danger levels based on weather and geology. The system can also provide real-time alerts when an event has begun. (Image by Yao-Min Fang, GIS Research Center, Feng Chia University, Taiwan, 2009 OGC Technical Committee meeting).


Development and deployment were carried out by researchers and engineers at the GIS Research Center at Feng Chia University in Taichung, Taiwan, and at the Industrial Technology and Research Institute. Deployment of an operational Taiwan Debris Flow Monitoring System proceeded throughout 2009, resulting in a working system that monitors critical conditions. This system provides significantly increased protection for citizens and businesses in thirteen of the river valleys in Taiwan that have the highest potential for debris flows.

At 13 fixed locations, sensors such as CCD cameras, rain gauges, geophones, wire sensors and water level meters provide observation data through ADSL networks or satellite links. Two mobile vehicles equipped with monitoring devices can be dispatched to areas where debris flows are likely.

Web services that implement the OGC Sensor Observation Service (SOS), Sensor Planning Service (SPS) and Sensor Alert Service (SAS) Interface Standards enable communication between the various parts of the system. Components that provide alert notices use the OGC Web Notification Service (WNS). Applications, including one that provides Short Message Service (SMS) alerts, can access stored and real-time data. In some applications, users can click on sensor icons in graphical 3D terrain views to check on the status and outputs of sensors.
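The alerting step itself can be pictured as a simple threshold evaluation over the latest observations. The sketch below is a toy illustration only: the station identifiers and the 60 mm/h threshold are invented, and the deployed system combines rainfall with geophone, wire-sensor and water-level criteria.

```python
def stations_in_alert(latest_rainfall, threshold_mm_per_hr=60.0):
    """Return station ids whose latest rainfall intensity (mm/h)
    meets or exceeds the warning threshold."""
    return [station for station, value in latest_rainfall.items()
            if value >= threshold_mm_per_hr]

# Values as they might be retrieved through an SOS; an alert service
# would then push notifications (e.g. SMS) for the flagged stations.
alerts = stations_in_alert({"station-01": 72.5, "station-02": 12.0})
```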

The Feng Chia University and Industrial Technology and Research Institute team have provided a comprehensive OGC Best Practices document that serves as a cookbook for implementing SWE, including schemas, examples, and working code.

4.2 ORCHESTRA in Europe

ORCHESTRA, a major European ‘integrated project’ under IST-FP6, focused on the role of open standards in dealing with environmental risks. The ‘Orchestra Overview’ (http://www.eu-orchestra.org/overview.shtml) touches on the rationale for open standards:

Disaster-risk management activities involve multiple organizations at various administrative levels, each having their own systems and services.

and

The capacity to share relevant information required when dealing with cross-border environmental risks is too limited, thus preventing a truly efficient handling of problems. One of the most urgent and important challenges governments are facing is to get these systems to work together and share information to allow proper data analysis and resource management, both being critical elements of disaster risk management.

Thus, to improve technical interoperability for risk management, 15 organizations, including the OGC, worked together in ORCHESTRA to develop a service-oriented architecture for risk management based on open standards, together with a software infrastructure for enabling risk management services. The ORCHESTRA Architecture (Annoni et al. 2009) is a platform-neutral specification based on Web service standards of the ISO (International Organization for Standardization), OGC, W3C (World Wide Web Consortium) and OASIS (Organization for the Advancement of Structured Information Standards).

4.3 Sensor Webs: The German tsunami warning system for the Indian Ocean (Figure 3)

Figure 3. The German Indonesian Tsunami Early Warning System (GITEWS) uses SWE standards to help “fuse” data from multiple sensor types. (Image from the GITEWS Project website: http://www.gitews.org/fileadmin/images/Content/Homepage/GITEWS_dt_schema.gif)


Open source software that implements SWE standards is being used in a number of real-world systems, including a monitoring and control system for the Wupper River watershed in Germany, an Advanced Fire Information System (AFIS), and a wildfire monitoring system in South Africa. Another is the German Indonesian Tsunami Early Warning System (GITEWS), a 35 million euro project of the German Aerospace Center (DLR) and the GeoForschungsZentrum Potsdam (GFZ), Germany's National Research Centre for Geosciences. GITEWS uses SWE services for sharing tsunami-related information among GITEWS software components. Real-time sensors and simulation models provide data for GITEWS.

4.4 Hazard and risk scenarios in OGC interoperability initiatives

The OGC, through its Interoperability Program (IP), facilitates testbeds, pilot projects and interoperability experiments in which standards are created and tested. Many of these ‘interoperability initiatives’, as shown in the examples below, have had at least a partial focus on hazards and risk, both natural and human-caused.

In the Architecture, Engineering, Construction, Owner and Operator Phase 1 Testbed (AECOO-1) the OGC worked with leaders in the AEC community to improve the interoperability of building information models (BIM). The goal of AECOO-1 was to aid the AEC industry in advancing a collaborative prototyping environment to improve BIM-related standards development. Through the joint testbed process, OGC, buildingSMART and others worked collaboratively to advance BIM standards objectives through active prototyping, while setting the stage for follow-on initiatives to address the incorporation of standards-based Web services. Progress in this area, building on the OGC CityGML Encoding Standard, will result in improved representation, storage and exchange of virtual 3D city and landscape models for use in applications such as hazard and risk assessment. Among its capabilities, CityGML provides a means for deriving Web-standards-based 3D models from models based on the buildingSMART alliance's Industry Foundation Classes (IFC). IFC is an object-oriented file format with a data model developed by the buildingSMART alliance to facilitate interoperability in the building industry, and it is a commonly used format for building information models.

OGC Web Services, Phase 6 (OWS-6) and Phase 7 (OWS-7): OWS-6 brought together 10 sponsor organizations and 32 participating organizations to advance standards in five major technology focus areas or ‘threads’, including Sensor Web Enablement (SWE), Geo-Processing Workflow, Aeronautical Information Management and Decision Support Services. OWS-7 will focus on Sensor Fusion Enablement, Feature and Decision Fusion, and aviation information systems. Fusion is defined as ‘the act or process of combining two or more pieces of data or information regarding one or more entities to improve the capability for detection, identification or characterization of that entity’. For example, if a power plant has been flooded, advanced fusion capabilities will enable rapid combination of information from many sources for the purposes of assessing the risks inherent in the situation.

The Geo-interface for Atmosphere, Land, Earth, and Ocean netCDF Interoperability Experiment (GALEON IE) has produced reports that describe how OGC Web Services were used in applications that involve netCDF data, a data format used widely in meteorology and climatology. The goal of GALEON IE is to improve interoperability among the diverse geospatial information systems used in studying and responding to major storm events, tsunamis and earthquakes.

In January 2007, the Ocean Science OGC Interoperability Experiment (Oceans IE) was begun by members of the OGC to study implementations of OWS and SWE standards. OGC SWE standards and complementary standards from organizations including the ISO, IEEE and OASIS are being used by more than a dozen ocean observation initiatives. These include regional application-oriented organizations such as the US National Oceanic and Atmospheric Administration IOOS (Integrated Ocean Observing System) program, the open-source, community-oriented OOSTethys initiative, the SURA Coastal Ocean Observing and Prediction (SCOOP) program, the related community initiative OPENIOOS.ORG and the Interoperable GMES Services for Environmental Risk Management in Marine and Coastal Areas of Europe (InterRisk).

The GEOSS Architecture Implementation Pilot (AIP), a multiyear OGC Interoperability Initiative, has brought together technical contributions from over 120 organizations. Their task is to provide the architecture for the Global Earth Observing System of Systems (GEOSS) being developed by the Group on Earth Observations (GEO), a partnership of 126 governments and international organizations. The governments seek to make available on Web servers a very large collection of geospatial data, including live sensor data and stored geodata of many kinds. The goal is to improve understanding and capacity in dealing with challenges involving disasters, health, energy, climate, water, weather, ecosystems, agriculture and biodiversity.

Figure 4. Ocean observation applications of SWE demonstrate that in many applications, “neighbouring” standards need to be harmonized by standards development organizations so that diverse configurations can be accommodated.


This list of activities is provided to illustrate the value of open standards in studying and managing natural risks and hazards, and also to show the importance of collaborative efforts among users in developing the standards they will use. Science has always been a collaborative activity, but collaboration is particularly important in studying natural hazards and risks, because observations from multiple disciplines and multiple data collection activities need to be considered together. Arguably, all disciplines that produce and use geospatial data have a need for more data sharing and collaboration. In the presence of modern information technologies, geoscientists are developing a new paradigm for advancing scientific knowledge.

5 Changes induced by the new paradigm

Open standards-based geoprocessing Web services are part of the transition from standalone computers to cloud computing. As Sun Microsystems once proclaimed, ‘the network is the computer’. In the geospatial industry this means that not only geospatial data but also geoprocessing services (once available only as the discrete functions bundled in ‘full-featured’ desktop GISs and imaging packages) are becoming increasingly available on the Web. Metadata standards and efficient tools for creating metadata are in place. Standards for geospatial catalogues enable the development of catalogues that provide fine-grained search and discovery of geospatial feature collections and coverages. Catalogue services can search multiple catalogues. Publishing, discovering, accessing and processing data have become cheap, and far less time is now spent preparing data for use. ‘Data files’ are becoming less and less important as work objects for practitioners, because queries frequently return specific subsets of data that are used and discarded without being saved. Data are becoming cheaper to obtain, too, but because other costs are falling faster, data account for a growing share of total costs. Re-use increases the value and cost-effectiveness of data, but most of the value data could have in re-use never materializes, because few data are catalogued and published online. Institutional resistance to data sharing is slowly yielding as data holders realize that they have an opportunity to increase the value of their data through standards-based cataloguing and publishing.

The advance of the Web has also resulted in many new applications that involve volunteered geospatial information. Some of these applications are designed for individuals' routine use, but others provide critical and time sensitive information for hazard and risk management. The use of volunteered location information has been publicized in press coverage of recent disasters, and thus people who are not particularly knowledgeable about geospatial technology are prepared to provide information in the event that they become involved in a crisis.

These changes affect or will affect many things, including the work of practitioners, the curricula offered to practitioners in training, the information environment for decision-makers, the business models of academic publishers, the incentives that motivate researchers, the balance of responsibility (federal/national, state/provincial, local, private sector, individual) for data provision and quality, and many other activities and arrangements that are germane to the success of hazard and risk assessment and management efforts.

Understanding, forecasting and responding to these socio-technical changes requires consideration of three kinds of interoperability: technical, semantic and institutional.

Technical interoperability as described in this paper provides not only communication of instructions between diverse systems but also a framework in which it becomes much easier to establish semantic interoperability within and between different information communities that use different data models.

Semantic issues are becoming the main concern in the proceedings of data coordination groups (1) as they cease to be bogged down in issues of data formats and vendor-to-vendor non-interoperability and (2) as they begin to use tools that facilitate the creation of OGC- and ISO-compliant data schemas.

In-community and inter-community discussions of semantics (how to describe or what to name a feature type) and XML-enabled semantic translation (what a feature type in one schema translates to in another schema) bring data sharing a step closer to the doorstep of institutional interoperability.
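A minimal sketch of such schema-to-schema translation: a lookup table maps one community's feature type and attribute names onto another's. All names here are invented for illustration.

```python
# Hypothetical mapping between two community schemas for watercourses.
FEATURE_TYPE_MAP = {"hyd:Watercourse": "riv:River"}
ATTRIBUTE_MAP = {"hyd:Watercourse": {"flowRate": "discharge"}}

def translate_feature(feature_type, attributes):
    """Rename a feature type and its attributes into the target schema,
    passing through any attribute with no mapping entry."""
    target_type = FEATURE_TYPE_MAP[feature_type]
    attr_map = ATTRIBUTE_MAP.get(feature_type, {})
    return target_type, {attr_map.get(k, k): v for k, v in attributes.items()}

translated = translate_feature("hyd:Watercourse", {"flowRate": 3.2})
```

In practice such mappings are often expressed as transformations over GML documents (e.g. with XSLT) rather than in application code, but the translation table is the same idea.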

Though progress seems imminent, government, commercial and academic institutions all present their own obstacles to institutional data sharing:

‘Data fiefdoms’ and pre-Web ideas about cost-recovery persist in government agencies.

Commercial data providers and companies whose internal data may have commercial or social value (during a disaster, for instance) lack adequate digital rights management Web service standards (though such standards are pending – see the GeoDRM Reference Model) and so, in some cases, they have not yet developed viable e-commerce business models and emergency data-sharing policies.

Bound by tradition and a regime of incentives that don't reward publishing of data, many academics don't yet quite realize that the value of data increases with the number of researchers who can use it. But many scientists are aware that open data would be valuable in cross-discipline and longitudinal studies. Scientists are beginning to discuss how open publishing of climate data might advance academic understanding and public confidence in climate science. And as computer models, such as climate models, become more sophisticated and widely used, scientists increasingly recognize the value of having robust access to libraries of data. Some research funding institutions are getting close to changing the incentives to favour open data, but they are struggling with questions such as how to address costs and responsibilities associated with cataloguing and hosting data.

In this context, whether one is an advocate of data sharing and open data or a student of socio-technical transition, it is useful to refer to the literature of regime change, in which a regime is defined as a set of practices, rules, institutions, power relations and shared assumptions that dominate a system and its actors (Bergman 2008). As Bergman et al. explain, regimes rise and then decline or evolve because they are vulnerable in three ways. First, niche successes plant seeds of change. Niches may advance through socio-technical prototypes such as the implementation examples described above. These prototyping activities:

provide evidence to support later, larger actions;

test options, knowing some will fail;

pursue ambitious goals;

are large enough to yield observable results, but small enough to be affordable;

are designed to assess complex, socio-technical interactions;

assist knowledge diffusion by providing a visible model;

are created from scratch or by adapting another project.

Second, a regime is vulnerable to landscape changes, such as world trends. For example, in most domains this assertion from the European JRC (Joint Research Centre) study ‘Advanced Regional Spatial Data Infrastructures in Europe’ is true and destabilizing: ‘The technology is cheap, data are expensive, and social relations are invaluable’. A federal government mandate for open standards and Web archiving of data created for federally funded construction projects would constitute a landscape change.

Third, a regime is vulnerable to internal misalignment among actors (early adopters, change agents, mavens, etc.).

Another perspective to consider is the simple financial one: new technologies cause old financial arrangements to change, and then new answers to the ‘four who's’ questions must be found: Who benefits? Who pays? Who provides? and, as the case may be, Who loses? (Maier and Rechtin 2000).

Yet another perspective is McLuhan's: ‘The Medium is the Massage’. That is, the perceptions, mindsets and institutional arrangements of technology users are shaped by their technology. In an article originally published in 1998, Sui and Goodchild called for ‘a shift of perspective, from viewing them (GIS) as instruments for problem-solving to viewing them as media for communication’ (Sui and Goodchild 2003). In the 11 years since, what they called ‘GIS’ has been submerged in a much-expanded sea of ubiquitous geospatial capabilities available to both geospatial specialists and the wider public. Those authors touch on McLuhan's insight that all our tools, not only our communication media, alter us. Their review and reflections provide a useful starting point for further study of the deep perceptual and behavioural aspects of socio-technical change resulting from, and shaping, the development of geospatial technologies.

6. Conclusions

The world of geomatics and the geomatics-dependent world of hazard and risk assessment and management are changing along with changes in their underlying technical foundation. The new world that is emerging will surely be different. It will likely reflect some version of Metcalfe's Law: ‘The value of a telecommunications network is proportional to the square of the number of connected users of the system’ (en.wikipedia.org/wiki/Metcalfe's_law). Versions of that law include estimates of the network-enhanced value of each node, such as online geospatial feature collections. The new world of geomatics applications will also likely reflect Web-based social networking values, ubiquitous location-aware mobile devices, and much more. These technical and commercial developments are also cultural, and they are in some respects outrunning law and policy. They introduce risks in areas such as privacy, physical security, copyright, liability and ‘territorial integrity’.
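Metcalfe's Law as quoted above can be written as a simple proportionality. In this sketch the constant $k$ is an assumption standing in for the context-dependent value of each connection (for instance, of each online geospatial feature collection):

```latex
% Metcalfe's Law: value V of a network with n connected nodes
V(n) = k\,n^{2}
% Doubling the number of connected nodes quadruples the value:
\frac{V(2n)}{V(n)} = \frac{k\,(2n)^{2}}{k\,n^{2}} = 4
```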

For those ensconced in the old world, the changes are unsettling, but they are undeniable and largely unstoppable, and to understand them – both the technological transitions and the socio-economic transitions – is to enable discovery of options for adaptation. Those who understand the technological changes and believe in their value can accelerate change by applying an understanding of the vulnerabilities of regimes, or more simply and perhaps more practically, by participating in projects and programs that employ the new open technologies to advance understanding of hazards and risks and more effectively reduce their harms.

A thought for readers concerned about the hazards and risks of runaway technology: In a world where the development and use of technology is increasingly shaped by standards, consensus standards organizations offer unique potential as a means for guiding technology development.

References
