Efficient Processing of Geospatial Information
Geoprocessing refers to the generation of spatial information from base data. Advancing digitalization requires novel ways to process geospatial information. Manual inspection and processing may be cumbersome or no longer possible – how can we process large amounts of spatial data? How can we automate workflows that generate maps or reports from sensor networks and other base data? How can we integrate different analysis tools efficiently? 52°North’s Geoprocessing Lab aims to answer these questions with the overall goal of realizing efficient geoprocessing. Our research partners and customers come from academia and industry and cover different application domains, including environmental monitoring, agriculture, and disaster management.
In particular, the Geoprocessing Lab addresses the following research topics.
Since more and more (large) datasets are available on the Web, geoprocessing can no longer be executed only locally but also needs to run on the Web. We develop novel approaches and tools for deploying geoprocessing functionality on the Web, and thereby apply and evaluate the use of standards, in particular the OGC Web Processing Service (WPS). Furthermore, we develop novel approaches for processing sensor data on the Web, with a close link to the Sensor Web lab.
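To illustrate the kind of standardized interface the OGC Web Processing Service defines, the sketch below assembles a minimal WPS 1.0.0 Execute request using only Python's standard library. The process identifier and the input name are hypothetical examples, not real 52°North processes, and a production client would typically use a dedicated library instead of building the XML by hand.

```python
import xml.etree.ElementTree as ET

# WPS 1.0.0 and OWS 1.1 namespaces as defined by the OGC specifications
WPS = "http://www.opengis.net/wps/1.0.0"
OWS = "http://www.opengis.net/ows/1.1"

def build_execute_request(process_id, literal_inputs):
    """Assemble a minimal WPS 1.0.0 Execute request document.

    literal_inputs maps input identifiers to literal values.
    """
    ET.register_namespace("wps", WPS)
    ET.register_namespace("ows", OWS)
    root = ET.Element(f"{{{WPS}}}Execute",
                      {"service": "WPS", "version": "1.0.0"})
    ET.SubElement(root, f"{{{OWS}}}Identifier").text = process_id
    inputs = ET.SubElement(root, f"{{{WPS}}}DataInputs")
    for name, value in literal_inputs.items():
        inp = ET.SubElement(inputs, f"{{{WPS}}}Input")
        ET.SubElement(inp, f"{{{OWS}}}Identifier").text = name
        data = ET.SubElement(inp, f"{{{WPS}}}Data")
        ET.SubElement(data, f"{{{WPS}}}LiteralData").text = str(value)
    return ET.tostring(root, encoding="unicode")

# Hypothetical buffer process with a single literal distance input:
request = build_execute_request("org.example.Buffer", {"distance": 100})
```

The resulting document would be POSTed to a WPS endpoint, which parses the inputs, runs the process, and returns the result or a reference to it.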
Using geoprocessing functionality should be as simple as possible. We therefore develop and evaluate different user interfaces for discovering and using geoprocessing functionality, e.g. custom Web clients, extensions to common GIS software, or expert software, with the aim of finding the best possible solution for users ranging from untrained end users to experts.
The amount of available spatio-temporal data is growing rapidly, requiring novel ways to organize and process such large datasets. We develop new solutions and optimize existing ones to deal with large amounts of heterogeneous spatio-temporal information, for example by parallelizing processing tasks with frameworks such as Apache Hadoop.
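The map/reduce pattern underlying frameworks such as Apache Hadoop can be sketched in plain Python for a simple spatial task: the mapper assigns each point observation to a grid-cell key, and the reducer aggregates the counts per cell. The grid size and the toy point data are purely illustrative; in Hadoop itself the mapper and reducer would run distributed over many nodes.

```python
from collections import defaultdict

def map_to_cell(point, cell_size=1.0):
    """Mapper: emit a (cell_key, 1) pair for an (x, y) point,
    where the key identifies the grid cell containing the point."""
    x, y = point
    return (int(x // cell_size), int(y // cell_size)), 1

def reduce_counts(pairs):
    """Reducer: sum the emitted counts per cell key
    (stands in for Hadoop's shuffle + reduce phases)."""
    counts = defaultdict(int)
    for cell, n in pairs:
        counts[cell] += n
    return dict(counts)

# Toy point dataset; two points fall into cell (0, 0)
points = [(0.2, 0.7), (0.9, 0.1), (1.5, 0.3), (1.1, 1.9)]
cell_counts = reduce_counts(map_to_cell(p) for p in points)
# → {(0, 0): 2, (1, 0): 1, (1, 1): 1}
```

Because the mapper is stateless and the reducer only aggregates per key, both scale naturally across partitions of the input, which is what makes the pattern attractive for large spatio-temporal datasets.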
Scientific geoprocessing usually involves complex models, implemented for example in R or MATLAB, and large volumes of data. We therefore aim to implement models as services that can be run on the Web and coupled with other models to realize the vision of the Model Web.
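The model-as-a-service idea can be sketched with a minimal wrapper that exposes a model function as a named process with declared inputs, so that one model's output can be fed into another. The process identifiers, input names, and the two toy models below are hypothetical; a real wrapper would invoke an R or MATLAB runtime behind this interface.

```python
class ModelProcess:
    """Minimal wrapper exposing a model function as a named process
    with declared inputs -- a sketch of the interface a Web service
    wrapping an R or MATLAB model might provide."""

    def __init__(self, identifier, inputs, run):
        self.identifier = identifier
        self.inputs = inputs  # names of the expected inputs
        self._run = run

    def execute(self, **kwargs):
        """Validate the declared inputs, then run the model."""
        missing = set(self.inputs) - set(kwargs)
        if missing:
            raise ValueError(f"missing inputs: {sorted(missing)}")
        return self._run(**kwargs)

# Two toy models coupled in the spirit of the Model Web:
# a runoff model feeding a flood-risk classification model.
runoff = ModelProcess("Runoff", ["rainfall_mm"],
                      lambda rainfall_mm: 0.6 * rainfall_mm)
flood_risk = ModelProcess("FloodRisk", ["runoff_mm"],
                          lambda runoff_mm: "high" if runoff_mm > 20 else "low")

risk = flood_risk.execute(runoff_mm=runoff.execute(rainfall_mm=50))
# → "high"
```

Because each model only sees named inputs and outputs, the coupling logic stays independent of the modeling language behind each service.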
Automating the production of maps and other information products derived from geospatial datasets requires that such workflows can be described, exchanged, and executed. We develop and evaluate concepts for composing geoprocessing workflows and for describing them in languages such as BPMN in order to make them transparent and reusable. Furthermore, we work on concepts for tracking the provenance of spatial data.
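The composition idea can be sketched as a linear sequence of named processing steps, analogous to a BPMN sequence flow, where the log of executed step names doubles as simple provenance information. The step names and the toy transformations are illustrative only; actual BPMN descriptions are XML documents executed by a workflow engine.

```python
def run_workflow(steps, data):
    """Execute a linear geoprocessing workflow.

    steps is a list of (name, function) pairs applied in sequence;
    the returned list of executed step names serves as a minimal
    provenance record of how the result was produced."""
    provenance = []
    for name, func in steps:
        data = func(data)
        provenance.append(name)
    return data, provenance

# Hypothetical two-step chain on a list of (x, y) coordinates:
steps = [
    ("reproject", lambda pts: [(x * 2, y * 2) for x, y in pts]),
    ("clip", lambda pts: [p for p in pts if p[0] <= 4]),
]
result, prov = run_workflow(steps, [(1, 1), (3, 2)])
# → result [(2, 2)], prov ["reproject", "clip"]
```

Keeping the workflow description (the list of steps) separate from its execution is what makes such workflows exchangeable and reusable.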
Besides technical solutions for the topics mentioned above, we also conduct research on spatio-temporal statistics, such as the modeling of spatio-temporal dependencies or the assessment and communication of probabilistic uncertainties.
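One common way to quantify the spatial dependence mentioned above is the empirical semivariogram, which averages half the squared differences of observed values over pairs of locations grouped by distance lag. The sketch below computes it for toy one-dimensional data; real analyses would use dedicated geostatistics packages.

```python
from itertools import combinations
from math import hypot

def empirical_semivariogram(points, values, lag_width=1.0):
    """Empirical semivariance per distance lag bin:
    for each pair (i, j), accumulate 0.5 * (z_i - z_j)^2 in the
    bin k = floor(distance / lag_width), then average per bin."""
    sums, counts = {}, {}
    for i, j in combinations(range(len(points)), 2):
        (x1, y1), (x2, y2) = points[i], points[j]
        lag = int(hypot(x1 - x2, y1 - y2) // lag_width)
        sums[lag] = sums.get(lag, 0.0) + 0.5 * (values[i] - values[j]) ** 2
        counts[lag] = counts.get(lag, 0) + 1
    return {lag: sums[lag] / counts[lag] for lag in sums}

# Three toy observations along a line; semivariance grows with distance,
# reflecting that nearby values are more similar than distant ones.
pts = [(0, 0), (1, 0), (3, 0)]
vals = [1.0, 2.0, 5.0]
gamma = empirical_semivariogram(pts, vals)
# → {1: 0.5, 3: 8.0, 2: 4.5}
```

A fitted variogram model over these empirical values is what methods such as kriging then use to interpolate and to quantify prediction uncertainty.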