Implementing Geoprocessing
The methods and tools we develop are successfully applied in various projects and use cases. Some of the success stories are listed below.
The 52°North Web Processing Service implementation is one of the official reference implementations of the OGC Web Processing Service. It is widely used and has been part of various OGC Testbeds.
In OGC Testbed 12, we implemented WPSs for the conflation of road datasets and for the asynchronous delivery of spatial information. Furthermore, we specified a Geoprocessing REST API, which the WPS Standards Working Group is currently considering for inclusion in the WPS specification.
In OGC Testbed 13, we are currently evaluating workflow description languages and implementing several WPSs that are composed into a geoprocessing workflow and can be executed automatically in a workflow engine.
52°North has implemented several time series analysis and spatial interpolation functions in R and wrapped these, as well as a complex run-off model, in WPSs for easy and flexible integration into a Web-based water dam monitoring system. A publication in the International Journal of Digital Earth describes the approach in detail.
enviroCar is an open platform for collecting and analyzing car tracks. The collected tracks consist of GPS positions enriched with additional sensor information obtained from the vehicle's on-board diagnostic capabilities. Based on this sensor information, fuel consumption and CO2 emissions are estimated for petrol cars. Users have full control over their collected tracks and can share them in an anonymized fashion via an open API. Various analysis tools support the exploration of the data, e.g. in map-based views or time series charts. In particular, we have developed a Web Processing Service that extracts stops at points of interest. Further, more sophisticated analysis tools are currently being developed, e.g. a tool for automated map matching to OSM street segments.
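The core of such a stop-extraction process can be sketched as a simple dwell-time rule over a GPS track: a stop is a run of fixes below a speed threshold that lasts long enough. The thresholds and the `Fix` record below are illustrative assumptions, not the actual enviroCar service logic.

```python
from dataclasses import dataclass

@dataclass
class Fix:
    t: float      # seconds since track start
    x: float      # projected easting (m)
    y: float      # projected northing (m)
    speed: float  # m/s, from GPS or the OBD adapter

def extract_stops(track, speed_threshold=1.0, min_duration=60.0):
    """Return (start_time, end_time, x, y) for each detected stop.

    A stop is a maximal run of fixes slower than `speed_threshold`
    (m/s) lasting at least `min_duration` seconds; its location is
    the centroid of the run. Threshold values are illustrative.
    """
    def close_run(run, stops):
        if run and run[-1].t - run[0].t >= min_duration:
            cx = sum(f.x for f in run) / len(run)
            cy = sum(f.y for f in run) / len(run)
            stops.append((run[0].t, run[-1].t, cx, cy))

    stops, run = [], []
    for fix in track:
        if fix.speed < speed_threshold:
            run.append(fix)
        else:
            close_run(run, stops)
            run = []
    close_run(run, stops)  # track may end while stopped
    return stops
```

In a real deployment the matching against points of interest would follow as a second step, e.g. a nearest-neighbour lookup of each stop centroid against a POI dataset.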
With the development of novel and cost-efficient observation technology, the amount of spatio-temporal data about our environment is rapidly increasing. Due to the amount of data available and its heterogeneity regarding format, metadata, storage, and interfaces, geoprocessing tools offered in traditional desktop GIS are unsuitable and novel approaches for distributed geoprocessing are needed.
The 52°North Geoprocessing Open Lab addresses these issues and develops novel solutions for sharing, distributing, and integrating geoprocessing tools in different technical environments. These tools may range from simple operators, e.g. buffering line segments, up to complex environmental models relying on spatial information, e.g. flood prediction models utilizing in-situ and satellite observations.
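To make the "simple operator" end of that range concrete, here is a minimal sketch of buffering a single line segment: the buffer polygon is built from two offset sides and two sampled semicircular end caps. Production GIS libraries (e.g. JTS/GEOS) implement this for arbitrary geometries; this is only an illustration of the operation.

```python
import math

def buffer_segment(p0, p1, dist, n_cap=16):
    """Approximate the buffer polygon of the segment p0->p1.

    Returns the polygon's vertices: a semicircular cap around p1
    (sampled with n_cap+1 points), then a cap around p0; the two
    straight sides fall out of connecting consecutive cap ends.
    """
    ang = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
    pts = []
    # cap around p1: sweep from ang - 90 deg to ang + 90 deg
    for i in range(n_cap + 1):
        a = ang - math.pi / 2 + math.pi * i / n_cap
        pts.append((p1[0] + dist * math.cos(a), p1[1] + dist * math.sin(a)))
    # cap around p0: sweep from ang + 90 deg to ang + 270 deg
    for i in range(n_cap + 1):
        a = ang + math.pi / 2 + math.pi * i / n_cap
        pts.append((p0[0] + dist * math.cos(a), p0[1] + dist * math.sin(a)))
    return pts
```

Every vertex lies exactly at distance `dist` from the segment, which is the defining property of the buffer boundary.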
A core focus of the lab is the integration of geoprocessing facilities in spatial data infrastructures by means of the OGC Web Processing Service (WPS) standard. 52°North has led the standardization process of the OGC WPS version 2.0 and provides a Java-based implementation, the 52°North Web Processing Service, which features a pluggable architecture for processes and data encodings and comes with several ready-to-use process repositories.
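For a sense of what invoking such a service looks like, the fragment below sketches a synchronous WPS 2.0 Execute request. The process identifier, input ids, and the referenced dataset URL are placeholders, not identifiers from the 52°North implementation.

```xml
<wps:Execute xmlns:wps="http://www.opengis.net/wps/2.0"
             xmlns:ows="http://www.opengis.net/ows/2.0"
             xmlns:xlink="http://www.w3.org/1999/xlink"
             service="WPS" version="2.0.0"
             response="document" mode="sync">
  <!-- placeholder process identifier -->
  <ows:Identifier>org.example.BufferProcess</ows:Identifier>
  <wps:Input id="geometry">
    <!-- input passed by reference to a remote dataset -->
    <wps:Reference xlink:href="https://example.org/roads.gml"/>
  </wps:Input>
  <wps:Input id="distance">
    <wps:Data><wps:LiteralValue>100</wps:LiteralValue></wps:Data>
  </wps:Input>
  <wps:Output id="result" transmission="value"/>
</wps:Execute>
```

The `mode` attribute switches between synchronous and asynchronous execution, and `transmission` controls whether the result is returned inline or by reference.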
Even if geoprocessing tools can be shared easily and deployed flexibly at various locations in spatial data infrastructures, the problem of how to find and execute these tools in a user-friendly way remains unsolved. Therefore, we develop novel concepts and clients for the discovery, orchestration, and execution of geoprocessing tools.
Processing large amounts of spatio-temporal data requires pre-processing and post-processing steps. Novel approaches are also necessary for deploying geoprocessing tools close to the data instead of transferring data to locations running geoprocessing tools. Hence, another challenge is to identify necessary pre- and post-processing steps for big datasets and develop concepts for comparing and optimizing architectures for distributed geoprocessing.
The challenges mentioned above are largely technical. We also deal with geostatistical issues, e.g. modeling spatio-temporal dependencies or extreme events, and work on describing how data was gathered and processed in order to infer meaningful applications of models and operators to datasets.
Geoprocessing refers to the generation of spatial information from some base data. Advancing digitalization requires novel ways to process geospatial information. Manual inspection and processing may be cumbersome or no longer possible – how can we process large amounts of spatial data? How can we automate workflows to generate maps or reports from sensor networks and other base data? How can we integrate different analysis tools in an efficient way? 52°North's Geoprocessing Lab aims to find answers to these questions with the overall goal of realizing efficient geoprocessing. Our research partners and customers come from academia and industry, covering different application domains including environmental monitoring, agriculture, and disaster management.
In particular, the Geoprocessing Lab addresses the following research topics.
Since more and more (large) datasets are available on the Web, geoprocessing can no longer only be executed locally; it also needs to run on the Web. We develop novel approaches and tools to deploy geoprocessing functionality on the Web, thereby applying and evaluating the use of standards, the OGC Web Processing Service in particular. Furthermore, we develop novel approaches for processing sensor data on the Web in close collaboration with the Sensor Web lab.
Using geoprocessing should be as simple as possible. We therefore develop and evaluate different user interfaces for discovering and using geoprocessing functionality, e.g. custom Web clients, extensions to common GIS software, or expert software, with the aim of finding the best possible solution for users ranging from untrained end users to experts.
The amount of available spatio-temporal data is growing rapidly, requiring novel ways to organize and process such large datasets. We develop novel solutions and optimize existing ones to deal with large amounts of heterogeneous spatio-temporal information, for example by parallelizing processing tasks through frameworks such as Apache Hadoop.
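The core idea behind such parallelization is split-apply-combine: partition the data into tiles, process tiles independently, then merge the partial results. The sketch below stands in for a cluster framework with a local thread pool; it is an illustration of the pattern, not Hadoop's API.

```python
from concurrent.futures import ThreadPoolExecutor

def tile_sum(tile):
    """The 'map' step: a partial statistic for one tile of raster values."""
    return sum(tile), len(tile)

def parallel_mean(tiles, workers=4):
    """Split-apply-combine over tiles, mimicking the map/reduce
    pattern that frameworks such as Apache Hadoop run across a
    cluster. A thread pool stands in for the cluster here.
    """
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = list(pool.map(tile_sum, tiles))  # map phase
    total = sum(s for s, _ in partials)             # reduce phase
    count = sum(n for _, n in partials)
    return total / count
```

Because the per-tile step only needs its own tile, the same function could run where the data resides, avoiding the transfer of large rasters to a central processing node.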
Scientific geoprocessing usually involves complex models, implemented for example in R or MATLAB, and large volumes of data. Therefore, we aim to implement models as services that can be run on the Web and coupled with other models to realize the vision of the Model Web.
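A minimal sketch of the "model as a service" idea: each model is hidden behind a uniform describe/execute interface, so that an R script, a MATLAB routine, or a native function all look the same to a caller, and models can be coupled by wiring outputs to inputs. The interface and the toy run-off/reservoir models below are illustrative assumptions, not the WPS API.

```python
class ModelService:
    """Uniform wrapper around a model: `describe` advertises the
    interface, `execute` runs the model. The wrapped callable could
    delegate to R, MATLAB, or any other runtime."""

    def __init__(self, name, inputs, outputs, fn):
        self.name, self.inputs, self.outputs, self.fn = name, inputs, outputs, fn

    def describe(self):
        return {"name": self.name, "inputs": self.inputs, "outputs": self.outputs}

    def execute(self, **kwargs):
        missing = [i for i in self.inputs if i not in kwargs]
        if missing:
            raise ValueError(f"missing inputs: {missing}")
        return self.fn(**kwargs)

def chain(first, second, mapping):
    """Couple two models: feed `first`'s outputs into `second`,
    renaming keys via `mapping` (output name -> input name)."""
    def run(**kwargs):
        out = first.execute(**kwargs)
        return second.execute(**{mapping[k]: v for k, v in out.items()})
    return run
```

This is the essence of the Model Web vision: because coupling only relies on the declared interface, chains of models can be assembled without knowing how each one is implemented.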
Using the Web for geoprocessing applications also offers novel ways of collaborating on and sharing geoprocessing functionality. We aim to develop concepts for such collaborative and shared geoprocessing with a focus on Citizen Science applications, where citizens and other stakeholders may share processing functionality rather than only data.
Automating the production of maps and other information products from geospatial datasets requires that such workflows can be described, exchanged, and executed. We develop and evaluate concepts for composing geoprocessing workflows and describing them in languages such as BPMN in order to make them transparent and reusable. Furthermore, we work on concepts for tracking the provenance of spatial data.
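The two concerns, executable workflow descriptions and provenance tracking, fit together naturally: an engine that executes a declared sequence of steps can record, for each step, which inputs it used and which output it generated. The toy engine below illustrates this for a linear workflow; a BPMN engine orchestrating WPS calls would follow the same principle, and the step names are placeholders.

```python
def run_workflow(steps, inputs):
    """Execute a linear geoprocessing workflow and record provenance.

    `steps` is a list of (name, fn, input_keys, output_key) tuples;
    each step reads named values from the shared data dictionary and
    writes one result back. The provenance log mirrors the
    used/generated structure of models such as W3C PROV.
    """
    data = dict(inputs)
    provenance = []
    for name, fn, in_keys, out_key in steps:
        args = {k: data[k] for k in in_keys}
        data[out_key] = fn(**args)
        provenance.append({"step": name, "used": in_keys, "generated": out_key})
    return data, provenance
```

Because the workflow is plain data (the `steps` list), it can be serialized, exchanged, and re-executed elsewhere, which is exactly what makes the production process transparent and reusable.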
Besides technical solutions for the topics mentioned above, we also do research on spatio-temporal statistics, such as the modeling of spatio-temporal dependencies or the assessment and communication of probabilistic uncertainties.
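A standard entry point for modeling spatial dependence is the empirical semivariogram, which measures how dissimilar observations become as the distance between them grows. The sketch below computes it for a purely spatial (not yet spatio-temporal) setting with Euclidean distances in projected coordinates.

```python
import math

def empirical_variogram(points, values, bin_width, n_bins):
    """Empirical semivariogram: for each distance bin, the average of
    0.5 * (z_i - z_j)**2 over all point pairs whose separation falls
    in that bin. Returns one semivariance per bin (None if a bin
    received no pairs).
    """
    sums = [0.0] * n_bins
    counts = [0] * n_bins
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            h = math.dist(points[i], points[j])
            b = int(h // bin_width)
            if b < n_bins:
                sums[b] += 0.5 * (values[i] - values[j]) ** 2
                counts[b] += 1
    return [s / c if c else None for s, c in zip(sums, counts)]
```

A rising curve that levels off indicates spatial dependence up to a range; extending the idea to space-time means binning pairs by spatial and temporal separation jointly.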