Linking Geospatial Data 2014

LGD14 Barcamp, featuring open plan space and beanbags.

I was very pleased to attend this event, co-organised by the World Wide Web Consortium (W3C) through the SmartOpenData project, the Open Geospatial Consortium (OGC), the UK Government (data.gov.uk), the Ordnance Survey (OS) and Google. Hosted by Google Campus London, the two-day event comprised presentations, lightning talks and a barcamp, all focussing on the use of geospatial data within the world of Linked Data. It was refreshing to be amongst researchers, users, developers and commercial folk all working in this area; I for one picked up some good ideas to help with my research project and hopefully my contributions were of use.

It was certainly good to bring together the camps working in this area: the geospatial technologists on one side and the web folks on the other (and people like me who have one foot in each camp, as well as limbs in other domains, my primary domain being digital cultural heritage of course). To make this stuff work, it’s going to take both groups working together through their respective consortia, the W3C and OGC.

Highlights

I noted a number of specific highlights that really inspired and gave me food for thought. Some reinforced my own perceptions and others gave me some new ideas for application to my project. The extensive use of IRC and Twitter combined with fast internet access throughout the event made it possible to discuss and find out more whilst talks were ongoing. The format lent itself to interaction and I was impressed by the amount of progress made in such a short space of time, with new working groups forming and ideas for revisions to standards such as GeoSPARQL forthcoming.

Some of my favourite bits:

Ontologies and Linked Data

The discussion of the relationship of ontologies to Linked Data resources was informative. Whilst there is a tendency in the world of the web to target the low-hanging fruit, publish data and sort out issues later, it is my opinion that there need to be robust semantics within our Linked Data resources. Otherwise we have a web of mess rather than semantically interoperable data. A couple of points made by Tim Duffy (British Geological Survey) particularly resonated here.

Kerry Taylor (Commonwealth Scientific and Industrial Research Organisation) gave some examples of where ontological development can support but also restrict aims, showing how things can go wrong when trying to implement the various standards out there. This is an important point; ontologies need to be simple enough to work with but also suit the domain and applications.

GeoSPARQL and geometries

It was interesting to note that the use of Well Known Text (WKT) within GeoSPARQL can be problematic; hearing Lars G. Svensson (Deutsche Nationalbibliothek) talk about their experiences was reassuring given my own struggles with it over the past few months!

Two crucial issues were raised by Raphaël Troncy (Eurecom), relating firstly to the use of coordinate systems and secondly to the way in which geometries are represented. I have often found the way in which geospatial data is used on the web to be problematic, with web developers focussing solely on location and paying only minimal respect to Coordinate Reference Systems (CRS), Spatial Reference Systems (SRS) or Spatial Reference Identifiers (SRID). In many cases this is an acceptable way of working (if you just want features on maps in roughly the right place), but the lack of clarity regarding spatial frameworks is problematic for any more detailed use of geospatial data. Being explicit about coordinate systems is essential for transforming between them and for taking into account factors such as tectonic plate movement. Put simply, assuming WGS84 is the only way to reference coordinates is a gross oversimplification.

Secondly, he went on to talk about the implementation of this within GeoSPARQL. The standard does support CRS (a good start) but the implementation is a little complex in my view. He suggested making CRS definitions simply part of the semantic model rather than being fudged into a geometry node as they currently are; a geometry node currently comprises up to three parts, the first being an (optional) SRID, the second being the geometry itself and the third being a literal type describing the format of the geometry (e.g. a WKT or GML literal). It was suggested that these could better be stored as individual assertions relating to a geometry object; this was well received and may well appear in the next version of the standard: hurrah!
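
To make that concrete, here is a minimal sketch of how a geometry tends to be asserted under GeoSPARQL 1.0, with the CRS URI packed into the front of the WKT literal, followed (commented out) by the kind of separate-assertion alternative that was discussed. The resource names and coordinates are purely illustrative, and ex:hasCRS is a hypothetical property, not part of the current standard.

```turtle
@prefix geo: <http://www.opengis.net/ont/geosparql#> .
@prefix sf:  <http://www.opengis.net/ont/sf#> .
@prefix ex:  <http://example.org/gstar#> .

# Current style: the (optional) CRS URI sits inside the WKT literal itself.
ex:barrow1 geo:hasGeometry ex:barrow1geom .
ex:barrow1geom a sf:Point ;
    geo:asWKT "<http://www.opengis.net/def/crs/EPSG/0/27700> POINT(414600 142850)"^^geo:wktLiteral .

# Suggested alternative (sketch only): the CRS becomes its own assertion on the
# geometry node, leaving the literal to carry just the geometry.
# ex:barrow1geom ex:hasCRS <http://www.opengis.net/def/crs/EPSG/0/27700> ;
#     geo:asWKT "POINT(414600 142850)"^^geo:wktLiteral .
```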

Versioning

A thorny issue if ever there was one. With heritage data in particular, it is important to know the provenance of vocabularies. This topic came up a couple of times and it was pleasing to hear that a lightweight solution exists (a current namespace plus historical, versioned namespaces; a bit clunky but doable) and that versioning can be more fully supported using ontologies designed for the purpose.
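
As a rough illustration of that lightweight pattern (all of the namespaces below are hypothetical), the current vocabulary URI can simply point at dated, versioned snapshots using standard OWL annotation properties:

```turtle
@prefix owl: <http://www.w3.org/2002/07/owl#> .

# The "current" namespace points at dated snapshots of itself (URIs are illustrative only).
<http://example.org/vocab/monument-types> a owl:Ontology ;
    owl:versionInfo  "2014-03 release" ;
    owl:versionIRI   <http://example.org/vocab/monument-types/2014-03> ;
    owl:priorVersion <http://example.org/vocab/monument-types/2013-11> .
```

Clunky, as noted, but it keeps older assertions resolvable while the current namespace moves on.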

Re-use

A key question with Linked Data is: how do you know who is using your data? Does this matter? Arguably not, but as with anything, proper citation and attribution is useful, polite and can be used to demonstrate impact (a good thing when looking for funding). It turns out that Adam Leadbetter (British Oceanographic Data Centre) and Dicky Allison (Woods Hole Oceanographic Institution) have both been using the Heritage Data vocabularies I blogged about previously, which is great stuff, but this only came to light through seeing the inclusion of English Heritage as a provider on one of their slides!

Precision & Accuracy

Precision and accuracy are important concepts for heritage data. When working with historic maps in particular, it is important to be able to record the tolerances against which data has been captured. As with coordinate systems, this is an area often ignored in the world of Linked Data, with coordinates expressed to spurious levels of precision (ten decimal places is a *seriously* precise measurement!) and no metadata to describe overall accuracy. Coming from a geospatial background where these are core items of metadata, I find the lack of proper support for this within current Linked Data standards problematic. It took a speaker working with heritage data to make this point; nice one Rob Warren (Big Data Institute, Dalhousie University).
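
There is no standard slot for this at present, so any example is necessarily speculative: in the sketch below, ex:horizontalAccuracy and ex:sourceScale are hypothetical properties standing in for the capture-tolerance metadata a geospatial dataset would normally carry.

```turtle
@prefix geo: <http://www.opengis.net/ont/geosparql#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/gstar#> .

# Hypothetical accuracy metadata alongside a geometry; none of the ex: properties are standardised.
ex:trench1geom a geo:Geometry ;
    geo:asWKT "<http://www.opengis.net/def/crs/EPSG/0/27700> POINT(414600.5 142850.2)"^^geo:wktLiteral ;
    ex:horizontalAccuracy "0.5"^^xsd:decimal ;  # metres: the tolerance of the survey, not spurious decimal places
    ex:sourceScale "1:2500" .                   # e.g. digitised from a historic map published at this scale
```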

Time

There was talk of temporal aspects to data; most spatial data have some kind of temporal component. Interestingly, the data I work with is placed in archaeological time and rarely do we have any absolute temporal data; chronologies are typically relative and imprecise, with occasional pegs to the absolute temporal classes generally used in Linked Data (timestamps, dates, etc). I think this makes for an interesting area in which to try out ideas, and the way this is represented in cultural heritage ontologies such as the CIDOC CRM, whilst a bit different to the norm, actually encapsulates some very powerful constructs for working with spatio-temporal data.
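
As a sketch of what I mean, using the RDFS encoding of the CIDOC CRM (class and property names as published; the event, dates, datatype handling and period label below are purely illustrative), an event can carry a time-span bounded only by earliest-possible and latest-possible limits rather than a precise timestamp:

```turtle
@prefix crm: <http://www.cidoc-crm.org/cidoc-crm/> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .
@prefix ex:  <http://example.org/gstar#> .

# An archaeological event pegged to a fuzzy time-span rather than an exact date.
ex:ditchConstruction a crm:E12_Production ;
    crm:P4_has_time-span ex:ditchTimespan .

ex:ditchTimespan a crm:E52_Time-Span ;
    crm:P82a_begin_of_the_begin "-2600-01-01"^^xsd:date ;  # earliest possible start (BC date shown loosely)
    crm:P82b_end_of_the_end     "-2300-12-31"^^xsd:date ;  # latest possible end
    crm:P78_is_identified_by ex:earlyBronzeAge .            # relative period label (an E49 Time Appellation)
```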

Cool stuff!

Last but not least, there was a liberal spread of really cool stuff.

Strabon

Strabon had a few mentions, with the point made that the GeoKnow report on platforms had evaluated an old version and that Strabon is now a very capable and scalable system. Being a semantic spatio-temporal system built from the ground up, rather than adding semantic, spatial and temporal functionality to an existing system, it sounds promising. I will certainly be reviewing it in more detail as a result.

Sextant

Also, building on the Strabon system comes Sextant. This application is described as:

Sextant is a web-based system for the visualization and exploration of time-evolving linked geospatial data and the creation, sharing, and collaborative editing of ‘temporally-enriched’ thematic maps which are produced by combining different sources of such data and other geospatial information available in standard OGC file formats (e.g., KML).

This looks like a very interesting platform for mapping geosemantic data, one which I will definitely be investigating further.

RAGLD

An absolutely brilliant piece of work was presented by John Goodwin (Ordnance Survey), entitled Rapid Assembly of Geo-centred Linked Data applications (RAGLD). A collaboration between the University of Southampton, Ordnance Survey and Seme4, this project provides a neat suite of developer tools (currently in beta) for working with Linked Geospatial Data. Massive +1 from me!

map4rdf

Another really interesting platform is map4rdf. This is described as:

map4rdf is a mapping and faceted browsing tool for exploring and visualizing RDF datasets enhanced with geometrical Information. map4rdf is an open source software. Just configure it to use your SPARQL endpoint and provide your users with a nice map-based visualization of your data.

Again, this is one I will be investigating further for my GSTAR project.
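
For context, the sort of query a map-based browser like this needs its endpoint to answer is very simple; the following is a hypothetical example of the pattern (standard prefixes, whatever data your endpoint holds), not map4rdf’s actual internal query:

```sparql
PREFIX geo:  <http://www.opengis.net/ont/geosparql#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

# Pull back labelled features with their WKT geometries for plotting on a map.
SELECT ?feature ?label ?wkt WHERE {
  ?feature geo:hasGeometry ?geom ;
           rdfs:label      ?label .
  ?geom    geo:asWKT       ?wkt .
}
LIMIT 100
```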

Campus London

Is just cool. Enough said. Love their displays of historical computer gear and of course the open plan, bean bag filled working space. Really tempted to join up and hang out there more (if only the trains to London didn’t require a mortgage…)

Summary

A brilliant event, well organised, with some amazing ideas and discussion. Not only that, but it was an excellent forum for meeting people working in the same subject area; my Twitter peeps grew considerably as a result and I’ve added lots of new folks to my LinkedGeoData list.

Big thanks of course to John Goodwin and Phil Archer for leading on the organisation front.

Looking forward to LGD 15 :-)
