Original Articles

Expeditious management plan towards digital earth

Pages 635-649 | Received 06 Feb 2012, Accepted 23 Nov 2012, Published online: 11 Jan 2013

Abstract

Breakthrough developments in geospatial technologies and the increasing availability of spatial data make geoinformation a business asset and a decision-making element for management. Hence, it is important to have a management plan that factors in practical and feasible data sources when building geo applications. The authors of this paper are motivated by the observation that the right data sources can outperform in-house resources in various application scenarios. This paper outlines pragmatic cases showing the tangible benefits of existing potential data sources and expeditious patterns for digital earth. It also proposes 'good-enough' solutions, based on the pragmatic cases, the available literature, and the 3D city model developed here, that can be sufficient for meeting the objectives of common public usage and open business models. To demonstrate the approach, the paper describes the low-cost development of a virtual 3D city model using publicly available cadastral data and web services.

1. Introduction

Resource management is one of the key components of a project. Finding the right sources and making optimal use of those resources benefits an organization economically. The growing interaction with spatial data and its substantial usage across domains make it increasingly significant for initial analysis, planning, and collaborative decisions, from managers and geoscientists to the common public (Goodchild 2009). From the early study by Rajani (1996) to the more recent study by Goodchild (2008), it is evident that spatial data are widely used for basic operations such as visualization and querying rather than for modeling or advanced analysis. Technical staff, together with managers who are not technical, need a management plan in place to become familiar with the available spatial data sources. Such plans not only illustrate the merits and demerits of each method, but also facilitate choosing the right sources and, in some scenarios, carrying the merits of one source over to another for controlled and effective resource planning.

Spatial data collection is an expensive (Krek 2003) yet vital component of mapping activities. A survey conducted in Spain on the reuse of spatial data by companies rates Geographic Information System (GIS) data as one of the most actively reused domains (Aporta 2011). In addition, studies such as Morten (2011) and GITA (2005) show that sharing data at no cost, or for a reasonable fee, saves human effort, time, and financial resources for a project.

There are various data collection techniques for building the data models required by a project. Remote sensing, photogrammetry, surveying, and cartographic digitization each employ their own principles in creating spatial data (Khagendra and Robert 1991; Omar and Ayhan 2009). On the other hand, spatial portals (Voyager 2011), spatial data infrastructures (SDI) (GeoGratis 2011), spatial data providers (KCSC 2011; NEO 2011; SEC 2010), and freelancing sites (http://freegisdata.rtwilson.com/) are some of the platforms that provide free or low-priced data through web services or as downloads in standard formats. Further, EuroGEOSS is working on a project to provide access not only to data but also to various analytical models that can be reused in other application domains (EuroGEOSS 2011). In addition, Goodchild and Glennon (2010) discuss crowdsourcing and volunteered geographic information (VGI) as sources during natural catastrophes; in such cases, the quality of data produced by volunteers can be as good as, or even better than, that of authoritative sources. Crowdsourcing applications like Waze (http://world.waze.com/) and disaster response programs like the Humanitarian OpenStreetMap response to Hurricane Sandy (http://sandy.hotosm.org/) play a substantial role in generating spatial data. Social networking sites such as Twitter turn humans into sensors that provide real-time, community-based data on natural disasters, elections, games, and so on. Together, all these user-driven public services bring a new addition to the data sources.

This work aims to elucidate a management plan that suffices to achieve the objectives of a project in an expeditious way. Following this management plan, a 3D digital view of a city model has been developed using publicly available information. The work also highlights the resulting data value improvements and extends the use of existing 2D cadastral data to a digital earth model.

The following section discusses businesslike approaches based on pragmatic cases. Section 3 presents the expeditious management pattern. Section 4 illustrates the development of a 3D model from the Spanish cadastral service based on the proposed pattern. The paper ends with a conclusions section.

2. Pragmatic cases

This section illustrates various cases demonstrating minimal approaches that are sufficient to cater to an application objective in a fast and efficient way. These cases gauge the dimensions of a solution at different scales (national, international, and application usability). The discussion is not exhaustive, since there is no single best approach appropriate to every application scenario. Rather, we highlight currently available techniques and multiple geospatial and non-geospatial services that can be adequate and economical for building common-usage applications.

2.1. 3D cadastral mapping

The Spanish Cadastral Virtual Office has developed an online application to access the geometry of a plot and to view a 3D volumetric representation of the parcel on Google Earth (OVC 2010). This 3D model of cadastral parcels is built using a standard format (OGC KML) and the official exchange format FXCC, the Spanish format for exchanging graphical information on cadastral parcels (Cartesia 2010). SIGCA2 is a GIS tool developed to manage the cadastral mapping data. It can convert a shapefile to the Keyhole Markup Language (KML) format, run quality checks on FXCC files, and display georeferenced raster data on Google Earth. SIGCA2 also includes modules for validating, editing, and automatically updating the mapping through the FXCC format. Each cadastral plot encoded in FXCC is represented by two files: a DXF file containing the vector information of the parcel, mainly lines and text organized in different layers, and an ASC (ASCII) plain-text file that gathers descriptive information about the plot, such as boundaries and surfaces (Virgós and Olivares 2008). The application interface (Figure 1) gives access to the cadastral reference and address of the parcel; 'Ver Plantas 3D' displays the building parcel in 3D and 'Volver' returns to the normal view. A recent addition is facade photographs of the buildings in JPG format. Such 3D cadastral mapping could also be produced from light detection and ranging (LIDAR) data, probably the top level in quality and quantity; nevertheless, that kind of data would have been excessive in relation to the needs of this case. These 3D applications are intuitive aids for territorial planning, real estate valuation (bearing in mind that cadastral data are not 'real-time' information, since administrative processes imply a time gap between reality and the data), and taxation (Žiūriene et al. 2006), supporting improved operating decisions. This approach led us to consider cadastral information for the 3D model developed in this paper.
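As an illustration of what the OGC KML output involves, the minimal sketch below turns a parcel footprint and a height value into a KML extruded polygon, the kind of volumetric representation the Virtual Office displays on Google Earth. It is written in Java (the language used for the processes in Section 4); the class name, the coordinates, and the 12 m height are illustrative assumptions, not real cadastral values.

```java
// Minimal sketch: emit an extruded KML polygon for one cadastral parcel.
// The footprint coordinates and the height are illustrative placeholders.
public class ParcelKmlSketch {

    static String parcelToKml(String id, double[][] lonLatRing, double heightMetres) {
        StringBuilder coords = new StringBuilder();
        for (double[] p : lonLatRing) {
            // KML expects lon,lat,altitude triplets; the altitude becomes the
            // roof height because the polygon is extruded down to the ground.
            coords.append(p[0]).append(',').append(p[1]).append(',')
                  .append(heightMetres).append(' ');
        }
        return "<Placemark><name>" + id + "</name>"
             + "<Polygon><extrude>1</extrude>"
             + "<altitudeMode>relativeToGround</altitudeMode>"
             + "<outerBoundaryIs><LinearRing><coordinates>"
             + coords.toString().trim()
             + "</coordinates></LinearRing></outerBoundaryIs>"
             + "</Polygon></Placemark>";
    }

    public static void main(String[] args) {
        double[][] footprint = {
            {-0.8891, 41.6523}, {-0.8889, 41.6523}, {-0.8889, 41.6525},
            {-0.8891, 41.6525}, {-0.8891, 41.6523} // ring closes on first vertex
        };
        System.out.println(parcelToKml("parcel-0001", footprint, 12.0));
    }
}
```

Wrapping such placemarks in a kml/Document envelope yields a file that Google Earth renders as solid building blocks.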

Figure 1. 3D models generated from the cadastral data, displayed on Google Earth (Virgós and Olivares 2008).

2.2. Environmental management

Following the evolution of a forest fire in any geographical area can be of great help to environmental managers, and applications like the one presented by Anon (2011) are very useful tools. That application was developed using Application Programming Interface (API) technology. The plan identified in this approach may not be apt for studies requiring in-depth analysis of the data; nonetheless, an API can be an enterprising response service for managers planning in time-sensitive situations. This kind of technology allows applications to be built almost instantly on existing spatial data by using ready-made spatial functions.

The application communicates static and dynamic forest fire spots using distributed geo services (Figure 2). The static data, prepared from tabular data (comma-separated values), is converted to a shapefile and eventually to KML, and an overlay function places the KML data on Google Maps. For the dynamic data, all the non-spatial information is drawn from the Fire Information for Resource Management System database and overlaid dynamically onto Google Maps and Google Earth using APIs, ArcGIS, an overlay function, and the network link model (Figure 2). As Figure 3 shows, when a user clicks on any fire spot the application displays its latitude and longitude, date acquired, time, confidence level (a gauge of the quality of an individual fire spot), satellite details, and a web link to external information. It also provides mobile alerts and emails with updates on forest fires, which aid decision-makers in planning and allocating resources forthwith. With such geo services and eclectic approaches, combining non-spatial data from one database with spatial data from another source, applications with up-to-date information of pressing national importance can be built easily.
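The network link model mentioned above is the standard KML mechanism for keeping such an overlay current: the client periodically re-fetches a remote KML file. The minimal Java sketch below emits such a NetworkLink; the href and the hourly refresh interval are placeholders, not the actual fire-data endpoint used by the application.

```java
// Minimal sketch: a KML NetworkLink that makes Google Earth re-fetch a
// remote fire-spot KML on a fixed interval, so the overlay stays current.
// The URL is a placeholder, not the service used in the application above.
public class FireNetworkLinkSketch {
    public static void main(String[] args) {
        String kml =
            "<?xml version=\"1.0\" encoding=\"UTF-8\"?>\n"
          + "<kml xmlns=\"http://www.opengis.net/kml/2.2\">\n"
          + "  <NetworkLink>\n"
          + "    <name>Active fire spots</name>\n"
          + "    <Link>\n"
          + "      <href>http://example.org/firms/active_fires.kml</href>\n"
          + "      <refreshMode>onInterval</refreshMode>\n"
          + "      <refreshInterval>3600</refreshInterval>\n" // re-fetch hourly
          + "    </Link>\n"
          + "  </NetworkLink>\n"
          + "</kml>";
        System.out.println(kml);
    }
}
```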

Figure 2. The schema for forest fire mapping (Naveen and Deepak 2010).
Figure 3. MODIS forest fires data on Google Maps (Naveen and Deepak 2010).

2.3. Geomarketing

Anderson (2004) envisages geomarketing as a tool for promoting regional sustainable development. Geospatial applications are the instruments used for marketing tasks like site selection, location-based advertising, competitive assessment, and proximity analysis. Essentially, geomarketing applications require layers such as streets, buildings, business data, and remote sensing data. The street layer can be built by accessing and customizing OpenStreetMap data or by using automatic digitizing software such as R2V (http://www.ablesw.com/r2v/). Yecheng (1997) reports that R2V can complete the digitizing and labeling of a 1:50,000-scale map in two days, work that usually takes ten days by manual digitization. 3D models can be developed from basic elevation data by accessing the Elevation API, a service from Google (Google 2011a), or by generating elevation data from paper maps through software like R2V; such data can be further validated and interpolated to approximate real-world scenarios. Building models can be obtained through the Google Earth API (Google 2011b), and satellite data can be integrated from services like GeoGratis and GeoNorge. Hence, these integration and visualization techniques, conflated within an organization's own GIS, may eventually economize on in-house resources.
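To suggest how lightweight such service access can be, the sketch below queries Google's Elevation web service for a single point and prints the raw JSON response, which carries the elevation in metres. The endpoint and parameters follow Google's documented HTTP interface; the coordinates and the API key are placeholders.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

// Minimal sketch: fetch the elevation of one point from the Google Maps
// Elevation web service. YOUR_API_KEY and the coordinates are placeholders.
public class ElevationSketch {
    public static void main(String[] args) throws Exception {
        String url = "https://maps.googleapis.com/maps/api/elevation/json"
                   + "?locations=41.6488,-0.8891" // an example point in Zaragoza
                   + "&key=YOUR_API_KEY";
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try (BufferedReader in = new BufferedReader(
                 new InputStreamReader(conn.getInputStream()))) {
            String line;
            while ((line = in.readLine()) != null) {
                System.out.println(line); // JSON with an "elevation" field in metres
            }
        }
    }
}
```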

3. General scheme for expeditious management plan

The previous section showed that the characteristics of the data and the associated necessities bear a close relationship to the objectives of the system under development. Accordingly, this section identifies a general management pattern for choosing a better data source, taking into account several aspects related to the objectives of the project used as the application example in this paper. A plan is a set of sequential steps, along with the resources required, to accomplish an objective (Wikipedia 2012). A complementary contextual concept is the 'pattern'. Martin (1997) provides an insightful definition of a pattern: 'An idea that has been useful in one practical context and will probably be useful in others.' Apart from solving specific problems, patterns are also used to provide architectural outlines that may be reused in the development process of a program (Wikipedia 2011). Deriving the plan and pattern concepts and incorporating them into geo management for a fast, efficient, and economical way of building geo applications is what we call the expeditious management plan. This section structures such a solution to geo project management into three rudimentary components (Figure 4): workflow management, the pragmatic process, and the data sources framework.

Figure 4. General scheme for expeditious management pattern.

3.1. Workflow management

The workflow management is based on the classic waterfall life-cycle model used in software engineering. The basic steps are:

  • Requirements analysis. It is necessary to review the requirements of a project before its inception. This step analyzes the project proposal to determine the data necessities of the project, as well as the quality and quantity that the project requires.

  • Matching necessities and availability. The next step is to determine how a manager can match the necessities with the available resources. This requires reviewing the accessible 'markets' for the available resources; this 'market' is itself related to the requirements of the project and to the monetary resources the project has. As a result, a framework path is selected to determine the data resources to be used and the data transformation processes to be applied. The framework path is part of the pragmatic process and the data sources framework explained in the following sections.

  • Data construction and/or acquisition. By applying the resource selection and transformation processes established by the framework path (Figure 4), the workflow management provides a data source proposal to be used for achieving the objectives of the project.

  • Quality and quantity check. Finally, before delivering the data to the project, it is necessary to check the proposal to confirm that it fulfills the quality and quantity the project requires.

3.2. Pragmatic process

This process aids in selecting a data source that is minimalist, beneficial, and useful to the project based on the following components:

Problem: Extending the software management viewpoint to geo applications leads to identifying the problem, which is twofold:

  1. Context: The context in which the application has to be developed is based on its objective and on factors such as data quality, the method of acquiring the data, the timeframe, and the financial and human resources available. For our application objective, we develop 3D city models while reducing the process cost in terms of data acquisition time and the cost of the data.

  2. Forces: Constraints such as the availability of data, time, fitness for use, technology support, and the cost of procuring the data are some of the forces or setbacks in application development. The forces in our project were the unavailability of LIDAR data within the project schedule and study area, and the price of the data.

Based on the definition of the problem, the data sources can be selected. We achieved the objective of our application through the mix and match approach, as described below.

Data sources: Based on the problem identified, we present different approaches to obtaining the data for building the digital earth:

  1. Accessing external datasets, without the hassles of creating and maintaining the data, by exploring ready-made data through spatial portals, web services, or other data-sharing technologies; see, for example, the environmental management case discussed in Section 2.2.

  2. Using existing ready-made resources as a base, data can be made compatible by deciding on the right mix of services and tools. For instance, the North Atlantic Treaty Organization operations map for Libya (http://www.guardian.co.uk/news/datablog/interactive/2011/may/23/libya-natobombing-no-fly-zone) and the hurricane application (http://crisislanding.appspot.com/) build on spatial data from one source as base data and non-spatial data from other sources. With online tools like GeoCommons (geocommons.com), users can analyze and visualize maps of their own data, various available datasets, and base maps, with provisions to download the results or integrate them into websites. Likewise, in portals like MapTools (www.maptools.org), programmers and users can find open source web tools, desktop tools, source code, and many other resources that come in handy when building applications.

  3. Choosing economical data collection techniques as compiled by the respective authors. For example, Table 1 portrays various 3D optical acquisition methods based on application requirements, and Table 2 shows the most suitable data collection methods for various application domains. These tables may not be copper-bottomed, but they give an approximate idea.

  4. Mix and match approaches: The application objective can also be achieved by combining the benefits of any or all of the three approaches stated above, as in the geomarketing case explained earlier or the concept behind Google Earth Builder (http://www.google.com/enterprise/earthmaps/builder.html). This mix and match approach is explained in detail in the next section.

Table 1. Suggested technology for 3D data based on the application requirements (Zhenhua and Ioannis 2009).

Table 2. Data sources and various application areas (Omar and Ayhan 2009).

3.3. Data sources framework

This framework is structured around research on optimized data sources, with the aim of obtaining the desired ready-made data or customizing already existing data for the project. Six basic elements have been identified in this process:

  1. Remote sensing: Khagendra and Robert (1991) state that this technology offers a wide range of data sources, from low-resolution to high-resolution satellite images. Cary (2009) notes that technologies like LIDAR provide point data, which is widely used in many 3D applications. Undoubtedly, this data source provides the best results; nevertheless, factors like the cost involved, the unavailability of data for our study area, and the specialized software and hardware required to handle the enormous volumes of data were a setback.

  2. Photogrammetry: Zhenhua and Ioannis (2009) and NCHRP (2003) state that data obtained by photogrammetric methods are accurate, up to date, and flexible. These characteristics make photogrammetry a good option for achieving the objective; nevertheless, a free (complimentary) web service for orthophotos can be a better option to conserve project resources.

  3. Digitization: The NCHRP (2003) report explains that this technique gives an accuracy of 5 to 50 feet and is less expensive and faster than other conventional methods. Converting paper maps to spatial data using automatic, semi-automatic, or manual methods can be an option for some applications. As mentioned above, specialized software such as R2V and WinTopo saves a great deal of human effort and time. As a secondary option, digitization can be combined with photogrammetric techniques to obtain the planimetric and height data.

  4. Surveying: The NCHRP (2003) report says that this approach can be beneficial for projects limited to smaller study areas or for cases where the data needs updating. Surveying yields highly accurate data and is the best fit for urban applications.

  5. Spatial data sharing: The fundamental principle behind concepts such as spatial data infrastructures, open source, interoperability, and web services fosters the idea of sharing data. Shared data can be used by different people in a multitude of application scenarios (GSDI 2010). This not only saves time, money, and human resources, but also avoids the duplicated effort involved in creating and managing data. Spatial portals provide a platform for finding a dataset in a fast and efficient fashion (Winnie and Jan 2005). All these concepts aid in discovering and integrating apt data sources for the quick building of applications.

  6. Mix and match approaches: Google Earth Builder (http://www.google.com/enterprise/earthmaps/builder.html), which uploads and manages data from various sources to form a custom map, is a good example of the mix and match approach. Creating new data from scratch would exhaust resources. As highlighted earlier, data that fits the objective, or base data close to it, can be discovered and then tailored using any of the methods mentioned above to make it fit for the purpose of the project. For example, if base data discovered through spatial portals or other sources is incomplete for the application objectives, it can be completed or improved by discovering related web services, choosing the right mix of them, spending minimal resources on a field survey of the pivotal area of the study, or purchasing only the data component required to complete the dataset. In this way, instead of preparing the data from scratch, a mix and match approach economizes on project resources and boosts efficiency. The 3D model developed in the next section uses 2D cadastral data discovered via web services as base data, which is then enhanced to generate a 3D model.

4. Digital earth 3D city model

The plan presented in the previous section has been used to obtain a 3D model of the city of Zaragoza, Spain. This 3D model is to be used in a project whose main objective is the construction of 3D information representing the shadows of the city at different times of day and on different days of the year. This information will be used in several domains, such as models that analyze the spread of pollution in the air (which is affected by differences in temperature), selecting locations for solar panels on buildings, or planning tourist routes (different routes depending on the season and the time of day).

The workflow management proposal has been applied to this particular problem by developing the tasks presented in the following steps.

4.1. Requirements analysis

As mentioned earlier, the 3D model is to be used in a project that must provide 3D information representing the shadows of the city at different times of day and on different days of the year. This information is not going to be used directly for visualization, so matching the real appearance of the city is not a requirement. The relevant information that the project requires is the volume of each building (height, width, and length), and the granularity has to be at the building level because it will be necessary to analyze building connections and dependencies.
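Although the shadow computation itself is outside the scope of this section, the geometry that makes building volumes the essential input is standard solar geometry, not a method specific to this paper: on flat ground, a building of height $h$ casts a horizontal shadow of length

$$L_{\text{shadow}} = \frac{h}{\tan \alpha_s},$$

where $\alpha_s$ is the solar elevation angle at the chosen date and time, and the shadow falls in the direction opposite the solar azimuth. This is why the volume (and in particular the height) of each building, at building-level granularity, is the key requirement.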

4.2. Matching necessities and availability

To select the data sources, we experimented with and explored sources using the mix and match element of the data sources framework. The CartoCiudad service was first used to obtain the number of floors and the coordinates of each building (CartoCiudad 2010). The problem with this approach was that the accuracy of the information provided by CartoCiudad is not homogeneous across geographic areas, so it did not give the accuracy required for an urban setting. Revising this approach by analyzing various existing services, such as OpenStreetMap, CartoCiudad, and Catastro, the Cadastre WMS (Catastro 2010) was selected: the Catastro results were closer to the ground truth and offered the uniform data quality necessary for the urban environment.

4.3. Data construction and/or acquisition

The 3D model for the Spanish urban area of Zaragoza has been developed sequentially, as shown in the flow diagram (Figure 5) and in the steps described below:

Figure 5. Stages involved in making the digital earth model.

Extraction: The map data of Zaragoza city was procured by making OGC WMS requests to the Cadastre database for the buildings (CONSTRU) and buildings-with-labels (TXTCONSTRU) layers, which contain the information necessary for the 3D modeling.
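For concreteness, the sketch below composes such OGC WMS GetMap requests in Java. Only the layer names CONSTRU and TXTCONSTRU come from the text; the service URL, the EPSG:25830 spatial reference system, and the bounding box are our illustrative assumptions, not the exact values used in the experiment.

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

// Minimal sketch: compose WMS 1.1.1 GetMap URLs for the two cadastral layers.
// Service URL, SRS, and BBOX are assumptions for illustration only.
public class CadastreWmsSketch {

    static String getMapUrl(String layer, double minX, double minY,
                            double maxX, double maxY, int width, int height)
            throws Exception {
        return "http://ovc.catastro.meh.es/Cadastre/WMS/ServidorWMS.aspx"
             + "?SERVICE=WMS&VERSION=1.1.1&REQUEST=GetMap"
             + "&LAYERS=" + URLEncoder.encode(layer, StandardCharsets.UTF_8.name())
             + "&SRS=EPSG:25830" // assumed UTM zone 30N for Zaragoza
             + "&BBOX=" + minX + "," + minY + "," + maxX + "," + maxY
             + "&WIDTH=" + width + "&HEIGHT=" + height
             + "&FORMAT=image/png&STYLES=";
    }

    public static void main(String[] args) throws Exception {
        // A 1 km x 1 km tile at 0.5 m resolution maps to 2000 x 2000 pixels.
        System.out.println(getMapUrl("CONSTRU", 675000, 4612000, 676000, 4613000, 2000, 2000));
        System.out.println(getMapUrl("TXTCONSTRU", 675000, 4612000, 676000, 4613000, 2000, 2000));
    }
}
```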

Transformation: The data transformation, enhancing the data from 2D to 3D, is done by vectorizing the building parcels to produce 2D geometries, applying optical character recognition (OCR) algorithms to the labels that contain the height values, and performing a spatial join between the building shapes and the georeferenced height values to obtain 3D buildings.

The CONSTRU layer, which renders the geometry of the building parcels, and the TXTCONSTRU layer, which contains both the geometry of the building parcels and textual numerical values, are delivered in raster format. To extract the height information, a map subtraction is performed between the buildings-with-labels layer and the buildings layer. An OCR algorithm is then applied to the result of the subtraction to read the numerical value on each building parcel, giving the number of floors, from which the height of the building is estimated.
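The subtraction step can be pictured with the minimal sketch below: pixels that differ between the two rendered tiles are kept (these are the labels), while everything the tiles share, outlines and background, is blanked, and the result is what the OCR stage reads. The file names are placeholders, the two tiles are assumed to have been requested with identical bounding boxes and pixel sizes, and a production pipeline would also have to tolerate anti-aliasing noise along label edges.

```java
import java.awt.image.BufferedImage;
import java.io.File;
import javax.imageio.ImageIO;

// Minimal sketch of the map-subtraction step: keep only the pixels present
// in the labelled tile (TXTCONSTRU) but not in the bare buildings tile
// (CONSTRU), leaving the floor-count labels for OCR. File names are placeholders.
public class LabelSubtractionSketch {
    public static void main(String[] args) throws Exception {
        BufferedImage labelled  = ImageIO.read(new File("txtconstru_tile.png"));
        BufferedImage buildings = ImageIO.read(new File("constru_tile.png"));
        BufferedImage labelsOnly = new BufferedImage(
            labelled.getWidth(), labelled.getHeight(), BufferedImage.TYPE_INT_RGB);

        for (int y = 0; y < labelled.getHeight(); y++) {
            for (int x = 0; x < labelled.getWidth(); x++) {
                int a = labelled.getRGB(x, y);
                int b = buildings.getRGB(x, y);
                // Differing pixels belong to the text labels; shared pixels
                // (parcel outlines, background) are blanked to white.
                labelsOnly.setRGB(x, y, a != b ? a : 0xFFFFFF);
            }
        }
        ImageIO.write(labelsOnly, "png", new File("labels_only.png"));
        // labels_only.png is then handed to the OCR stage to read floor counts.
    }
}
```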

Finally, the vectorized buildings layer and the georeferenced heights are spatially joined to obtain the 3D model of Zaragoza city.

Building the model for the town of Zaragoza, covering an area of one square kilometre at a resolution of 0.5 metres and stored as an 8030 KB ASCIIGrid file, took eight hours. The whole process was set up on an Ubuntu 9.04 machine; the Geographic Resources Analysis Support System (GRASS 2010) and the Java programming language were used to implement some of the specific processes required for this experiment. There are also a few drawbacks to this approach. The processing time is high, as all operations had to be performed sequentially. Part of the process relies on the Cadastre WMS, over which there is no control: the service may occasionally be unavailable or may return incomplete responses during data acquisition. Some stages, like eliminating duplicate polygons and dangles, therefore need human intervention for quality checking, and the resulting digital terrain model may not be as accurate as one produced by conventional methods. Despite these setbacks, the approach is resource-saving, interesting, and may bear fruit for our application. This demonstration also highlights the data value improvements and the substantial reuse of existing data.

4.4. Quality and quantity check

The revision of the data generated in the previous step confirms that it satisfies the requirements of the project as presented in the requirements analysis section:

  • The information created can be represented graphically by overlaying an aerial photograph on the structures created (Figure 6). It is not an aesthetic view of the city, but it can be sufficient for the project requirements.

  • The information has been built as a raster file representing a digital terrain model that includes the buildings. With this information, it is possible to know the volume of each building (height, width, and length).

Figure 6. The 3D model of Zaragoza city.

The above 3D city model has been briefly compared with an analogous 3D representation based on LIDAR data, using the free LIDAR data provided by the province of Gipuzkoa in Spain (Gipuzkoa 2011). A one-square-kilometre grid of LIDAR data was processed in one hour, whereas our approach took five hours on the same grid. The LIDAR processing time is exceptionally fast; nonetheless, considering parameters such as data availability, the cost and time involved in procuring the data, and the expertise and software and hardware infrastructure required, LIDAR is much more demanding. LIDAR data is of very high quality and density; however, a good-enough solution like the one demonstrated in this paper will be sufficient for some digital earth applications.

5. Conclusions

Web map services, geoprocessing services, and concepts like SDIs, spatial portals, interoperability, and VGI are revolutionary in opening new opportunities for discovering and combining multiple potential sources for applications. A survey undertaken in Spain showed that, for spatial data reuse by companies, GIS is one of the most active domains, alongside others such as statistics and meteorology (Aporta 2011). Studies presented by Morten (2011) and GITA (2005) brought out that sharing data at no cost, or for a reasonable fee, saves human effort, time, and financial resources for a project. The overall idea of this paper has been to emphasize the expeditious plan and how the benefits of using existing potential sources can outweigh the creation of a new dataset. However, there can be glitches and legal issues when applications are developed for commercial purposes using services like Google Maps.

Each approach discussed in the sections above can be appropriate depending on the modeling context and forces. This paper addressed fundamental applications in relation to choosing data sources that can be directly integrated into a GIS environment and can be enough to benefit the public and business markets. The management plan explained in this paper gives a quick insight into choosing a data source. As future work, we would like to delve into eclectic approaches for digital earth and the automation of quality checks on cadastral data, to explore the possibility of improving data quality for analytical and sophisticated applications.

References
