GSTAR web services; now with added GeoJSON

"A boundary is an array of an array of an array of arrays" by Paul Downey

“A boundary is an array of an array of an array of arrays” by Paul Downey

Following on from the last update concerning the GSTAR web services, the final pieces of infrastructure for the case studies and demonstrator are nearly complete. Building on the API, a GeoJSON output format has been added so that results from GeoSPARQL queries can a) be accessed via a simple URL, as with all other outputs, and b) be visualised using a web map or indeed any platform which can consume GeoJSON.

I had assumed this last element would be straightforward; after all, plotting results on a map is just one of those things one would expect from a geospatial resource. But a couple of hurdles presented themselves.

1. Well Known Text

Geospatial data modelled using the GeoSPARQL ontology and stored in a triple store such as Parliament typically makes use of either Well Known Text (WKT) or Geography Markup Language (GML) to store geometries. Importantly, as per the GeoSPARQL standard, there is the ability to include a URI for a Coordinate Reference System (CRS) within a geo:wktLiteral node:

Req 10: All RDFS Literals of type geo:wktLiteral shall consist of an optional URI identifying the coordinate reference system followed by Simple Features Well Known Text (WKT) describing a geometric value. Valid geo:wktLiterals are formed by concatenating a valid, absolute URI as defined in [RFC 2396], one or more spaces (Unicode U+0020 character) as a separator, and a WKT string as defined in Simple Features [ISO 19125-1].

If using any of the standard outputs from Parliament via Jena over a SPARQL endpoint, the output includes the data as stored, and (as was politely pointed out to me by a couple of eminent GeoJSON folks) embedding WKT within JSON structures is not really the done thing.

Consider me slapped. But the triple store is doing exactly as asked, outputting a JSON version of the query results, whatever form they may be in. So I needed an extra step to produce a proper GeoJSON output in which geometries are represented as JSON arrays rather than WKT.

The inclusion of a CRS URI element in a geo:wktLiteral node is problematic for systems expecting a plain old WKT string in which the first element is the geometry type. The overall node structure of a feature geometry when represented in RDF is as follows, comprising three elements: the CRS URI, the WKT string and the datatype.

"<http://www.opengis.net/def/crs/EPSG/0/4326> Point(33.95 -83.38)"^^<http://www.opengis.net/ont/geosparql#wktLiteral>

The final element here is the datatype, recorded using the standard ^^<datatype> notation, as generally used for typed literals; this is handled easily and becomes a datatype node in XML and JSON output formats. As such, this element is not a problem when appended to a WKT string.

{ "datatype": "http://www.opengis.net/ont/geosparql#wktLiteral" , "type": "typed-literal", "value" : "<http://www.opengis.net/def/crs/EPSG/0/4326> Point(33.95 -83.38)"}

More generally speaking, it has been noted by the research community (for example by a number of contributors to the Linked Geospatial Data event I attended) that this compound approach is unhelpful and somewhat at odds with the usual approach to semantic structures in which each assertion is represented as a discrete triple. But currently, this is what we have, so that’s that.

Anyway, the solution was actually rather low tech. One of the few triple stores to natively support GeoJSON output is Strabon, but the triple store selected for use in the GSTAR project is Parliament, so it was necessary to investigate suitable approaches using the platforms in use. A quick look at the Strabon source code (available online under the Mozilla Public License v2.0) revealed an elegantly simple solution. Taking the same example used above, it is clear that there is a pattern to the structure which, with some judicious use of basic Java string functions, can be used to clean up the string and extract the individual components.

Firstly, the presence of a CRS URI can be detected by checking whether the string begins with a < character; if so, the EPSG code will be the last element of that URI, which is appended to the base URI for the namespace. Secondly, the WKT serialisation itself follows the closing > of the CRS URI. These angle-bracket markers provide the indices needed to split the string into its two components, the CRS URI and the WKT geometry:

<http://www.opengis.net/def/crs/EPSG/0/4326> Point(33.95 -83.38)

So this gives a nice plain WKT string which can be processed using GeoTools, with the added benefit of a Proj4 Coordinate Reference System (derived from the EPSG code in the CRS URI) which can be used to control any transformations. If there is no CRS element, the input string can be safely returned as is.
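
To make this concrete, here is a minimal sketch of the parsing step in Java. The class and method names are illustrative rather than taken from the GSTAR (or Strabon) codebase, but the logic follows the pattern described above.

    // Minimal sketch of splitting a geo:wktLiteral value into its CRS URI and plain WKT.
    // Class and method names are illustrative, not taken from the GSTAR or Strabon code.
    public class WktLiteralParser {

        /** Holds the two components of a geo:wktLiteral value. */
        public static class ParsedWkt {
            public final String crsUri; // null if no CRS URI was present
            public final String wkt;
            public ParsedWkt(String crsUri, String wkt) {
                this.crsUri = crsUri;
                this.wkt = wkt;
            }
        }

        /**
         * Splits a value such as
         * "<http://www.opengis.net/def/crs/EPSG/0/4326> Point(33.95 -83.38)"
         * into the optional CRS URI and the plain WKT string.
         */
        public static ParsedWkt parse(String literalValue) {
            String value = literalValue.trim();
            if (value.startsWith("<")) {
                int close = value.indexOf('>');            // end of the CRS URI
                String crsUri = value.substring(1, close); // the URI between < and >
                String wkt = value.substring(close + 1).trim();
                return new ParsedWkt(crsUri, wkt);
            }
            return new ParsedWkt(null, value); // no CRS URI: already plain WKT
        }

        /** Extracts the EPSG code, ie the last path segment of the CRS URI. */
        public static String epsgCode(String crsUri) {
            return crsUri.substring(crsUri.lastIndexOf('/') + 1);
        }
    }

The plain WKT can then be handed to GeoTools (or the underlying JTS WKTReader) and the EPSG code used to build the Coordinate Reference System for any transformations.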

2. Generating GeoJSON output

Having resolved the WKT issue, the only remaining hurdle was to take the processed WKT geometries and output JSON coordinate arrays within a GeoJSON structure, as per the GeoJSON specification. This was accomplished using Jackson to manipulate the JSON data, combined with geojson-jackson to bridge the JSON and GeoJSON elements. This made it relatively easy to take the SPARQL Query Results JSON Format data and transform it into GeoJSON.
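
As a rough illustration of how these pieces fit together (a sketch only, not the actual GSTAR code: the sample coordinates, property name and the use of the JTS WKTReader are assumptions made for the example), a single point geometry could be converted and serialised like this:

    // Minimal sketch: turning a plain WKT point into GeoJSON using Jackson and geojson-jackson.
    // Coordinates, property names and the use of the JTS WKTReader are illustrative only.
    import com.fasterxml.jackson.databind.ObjectMapper;
    import com.vividsolutions.jts.geom.Coordinate; // org.locationtech.jts in later JTS releases
    import com.vividsolutions.jts.io.WKTReader;
    import org.geojson.Feature;
    import org.geojson.FeatureCollection;
    import org.geojson.Point;

    public class GeoJsonSketch {
        public static void main(String[] args) throws Exception {
            // Plain WKT as produced by the parsing step described earlier
            // (axis order handling is omitted for brevity)
            Coordinate c = new WKTReader().read("Point(33.95 -83.38)").getCoordinate();

            // Build a GeoJSON feature: the geometry becomes a JSON coordinate array, not WKT
            Feature feature = new Feature();
            feature.setGeometry(new Point(c.x, c.y));
            feature.setProperty("label", "Example feature");

            FeatureCollection collection = new FeatureCollection();
            collection.add(feature);

            // Jackson serialises the FeatureCollection straight to a GeoJSON string
            System.out.println(new ObjectMapper().writeValueAsString(collection));
        }
    }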

This extra output format has now been added to the Sparqlr API so GeoJSON can be returned for the results of any query containing spatial data.

Results

A final check was to ensure the GeoJSON data is well formed, using a suitable viewer, in this case QGIS, which has good support for a wide range of OGC and other open standard formats, including GeoJSON.

GeoJSON output from Parliament via the Sparqlr API, viewed in QGIS

Next steps

The final pieces of the puzzle are the demonstrator interface and case study queries showing the capabilities of such resources. The former comprises a web based interface including some Seneschal widgets (for selecting controlled vocabulary items to be used in queries) combined with a web map for visualising results and making spatial selections. The latter comprises some interesting prebuilt GeoSPARQL queries based on real world archaeological research questions, visualised using the same web based UI.

Then just need to finish off that thesis…

GSTAR at Salisbury Museum

Beaker Pots by Wessex Archaeology

The final batch of source data, kindly provided by the good people at Salisbury Museum, has now been safely received and is being processed for inclusion in my GSTAR project. Thanks in particular are due to David Balston for assisting me and to Adrian Green for giving the necessary permissions to use the data. This final batch of museum collection data will augment that already supplied by Wiltshire Museum to provide a more complete coverage across the study area. Importantly, this dataset includes much of the material from excavations undertaken by Wessex Archaeology, who have also kindly made their archives available to me. Data was extracted in two ways: firstly, using the Places node in the collections management system to look for parish names within the study area; and secondly, using the People/Organisations node to look for projects undertaken by Wessex Archaeology. This provided a good coverage of around 8,000 detailed records for inclusion in GSTAR.

So the triumvirate of data is complete: data from archaeological fieldwork, data from museum collections and data from the Historic Environment Record. The next step is to use the tools already developed for museum collections data to process this source material and generate more Linked Geospatial Data for analysis and to support the technology demonstrators being constructed. Thankfully, Salisbury Museum, like Wiltshire Museum, uses Modes for its collections management, so the data provided as an XML export uses the same structure; as such, the XSLT transforms and Stellar templates already built can be applied to this new data without having to develop another processing pipeline.
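
For context, applying an existing XSLT transform to a Modes XML export is only a few lines of standard Java; the file names in this sketch are placeholders rather than the actual GSTAR transforms.

    // Minimal sketch of running an existing XSLT transform over a Modes XML export
    // using the standard javax.xml.transform API. File names are placeholders.
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class ModesTransform {
        public static void main(String[] args) throws Exception {
            Transformer transformer = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("modes-to-stellar-input.xsl"));
            transformer.transform(new StreamSource("salisbury-modes-export.xml"),
                                  new StreamResult("salisbury-stellar-input.xml"));
        }
    }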

Amesbury Archer – gold hair ornaments by Wessex Archaeology

It was also very nice to see the impressive new Wessex Gallery at the museum whilst visiting. It was impossible to resist having a quick look around and I will be taking my girls there as soon as possible to see the incredible range of artefacts and have a go at the hands-on archaeology activities. Of particular interest for me was the Amesbury Archer display, a chap excavated by Wessex Archaeology up at Boscombe Down and someone I was privileged to see close up whilst he was being looked after by WA post-excavation and prior to being deposited with the museum.

Day of Archaeology 2015

Dr. Henry Walton “Indiana” Jones, Jr.

Yes, it’s that time of year again: Time for the annual Day of Archaeology. And once again, my day does not involve any temples in remote jungles, crystal skulls or raiding any tombs. Indeed, as has become the norm, it does not even involve any digging of holes, artefacts or suchlike.

Yep, archaeology involves a much broader range of activities than many folk believe, many of them lab based and/or computer based, with the result that some archaeologists (myself included) rarely get to see daylight, let alone travel to far-off lands in search of ancient peoples. And this is one reason why I love the Day of Archaeology so much: the range of posts each year covers just about every aspect of archaeology and cultural heritage and goes a long way towards showing what we, as professional archaeologists, really get up to, shattering stereotypes perpetuated by the likes of Indiana Jones and Lara Croft.

Anyway, here’s my post for this year, which focusses on the usual range of geospatial and geosemantic stuff and not on being chased by angry tribespeople or making dramatic and implausible escapes from imminent danger and almost certain death (although I did get a small electric shock off a laptop power supply this morning…).

GSTAR Web Services

Web by David Reid

With all the source data prepped and ready to go, the next step is to build some demonstrators to show how such geosemantic resources can be used in practice. Whilst very powerful, a SPARQL endpoint is not the most friendly way of interacting with data resources, especially from within a web based application where options for programming are a bit limited. There is still quite some debate on this topic, which will be covered in more detail in the thesis (watch this space; still on track for submission 1st/2nd quarter 2016!), but the approach I have opted for is an API using web services to provide a range of outputs via a combination of URLs and parameters.

API vs Endpoint

The API is currently being finalised but the initial tests are working well, happily providing a range of outputs from the triple store. This approach allows the web service backend to draw on all the resources needed to handle geosemantic data and respond to a range of requests in a form readily digestible by an xHTML+JavaScript web application, including mapping. From a geospatial perspective, it means geospatial data can be requested from the web service in a form ready to be mapped (eg JSON) or used in map popups (eg HTML) rather than having to process large piles of RDF within the browser. It also takes advantage of browser caching for the URL based resources, dramatically improving performance by reducing the need for trips to the server to get data.

SPARQLr – the GSTAR web service

Sparklers by Derek Key

The Sparqlr web service which implements the API is a RESTful service and is being built using Jersey, which is a great platform for such tasks. This talks to the Parliament triple store via Jena, with some GeoTools components added to handle spatial data. As the service runs as a Java application on a GlassFish web server, it is possible to make use of the full range of Java tools out there without being limited to what can be achieved within a web browser. And thankfully, much of the code produced previously for the GSTAR Pilot is being recycled! As usual, all development is being undertaken using Eclipse+Maven.

Various queries can be performed in this way, some using basic URL syntax eg http://gstar:8080/sparqlr/api/features to return records about excavated features from archaeological archives as an RDF graph (default) or http://gstar:8080/sparqlr/api/artefacts/ntriple to return records about archaeological objects from museum collections as N-Triples. Parameters are also implemented which can be used to return particular records (eg http://gstar:8080/sparqlr/api/sites?MonUID=MWI11909 to return records about some pits near Stonehenge).
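
To give a flavour of what such a resource can look like (a sketch only: the paths, predicate URI, endpoint URL and query are placeholders, not the Sparqlr internals), a JAX-RS resource built with Jersey and Jena might resemble the following:

    // Minimal sketch of a JAX-RS (Jersey) resource backed by Jena.
    // The endpoint URL, predicate URI and query are placeholders, not the Sparqlr internals.
    import java.io.StringWriter;

    import javax.ws.rs.GET;
    import javax.ws.rs.Path;
    import javax.ws.rs.Produces;
    import javax.ws.rs.QueryParam;

    import com.hp.hpl.jena.query.QueryExecution;        // org.apache.jena in Jena 3+
    import com.hp.hpl.jena.query.QueryExecutionFactory;
    import com.hp.hpl.jena.rdf.model.Model;

    @Path("sites")
    public class SitesResource {

        private static final String SPARQL_ENDPOINT = "http://localhost:8080/parliament/sparql"; // placeholder

        @GET
        @Produces("application/n-triples")
        public String getSites(@QueryParam("MonUID") String monUid) {
            // Illustrative CONSTRUCT query; real code would parameterise the identifier safely
            String query = "CONSTRUCT { ?s ?p ?o } WHERE { ?s ?p ?o . "
                         + "?s <http://example.org/hasMonUID> \"" + monUid + "\" . }";
            QueryExecution qe = QueryExecutionFactory.sparqlService(SPARQL_ENDPOINT, query);
            try {
                Model model = qe.execConstruct();
                StringWriter out = new StringWriter();
                model.write(out, "N-TRIPLES");
                return out.toString();
            } finally {
                qe.close();
            }
        }
    }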

On the geosemantic front, parameters are also being used to pass in geometries as UTF-8 encoded WKT strings to facilitate spatial searches, with the incoming WKT geometries used within the web service to add GeoSPARQL filters to SPARQL queries (eg http://gstar:8080/sparqlr/api/sites/within?polygon=POLYGON+%28%28569186.11565982+169502.18457639%2C+569186.02731245+169502.23116132%2C+569185.82348375+169502.25234642%2C+569185.70491406+169502.19113299%2C+569185.57672168+169502.04343594%2C+569185.54491719+169501.88381892%2C+569185.64107717+169501.66842781%2C+569185.82308162+169501.55315212%2C+569186.01894577+169501.56512577%2C+569186.19385893+169501.68723228%2C+569186.29291775+169501.91384472%2C+569186.25804717+169502.07848985%2C+569186.11565982+169502.18457639%29%29 to return all sites/monuments within a specified region such as a user generated polygon drawn on a web map).
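
Within the web service, turning such an incoming polygon into a spatial restriction essentially means wrapping it in a GeoSPARQL filter function. A minimal sketch is shown below; the graph pattern, variable names and the choice of geof:sfWithin are illustrative assumptions rather than the exact Sparqlr queries.

    // Minimal sketch of building a GeoSPARQL 'within' query from an incoming WKT polygon.
    // The graph pattern and use of geof:sfWithin are illustrative, not the actual Sparqlr queries.
    public class SpatialQueryBuilder {

        /**
         * Builds a SPARQL query restricted to features whose geometry lies within
         * the supplied polygon (plain WKT, already URL decoded by the framework).
         */
        public static String withinQuery(String polygonWkt) {
            return "PREFIX geo: <http://www.opengis.net/ont/geosparql#>\n"
                 + "PREFIX geof: <http://www.opengis.net/def/function/geosparql/>\n"
                 + "SELECT ?site ?wkt WHERE {\n"
                 + "  ?site geo:hasGeometry ?geom .\n"
                 + "  ?geom geo:asWKT ?wkt .\n"
                 + "  FILTER(geof:sfWithin(?wkt, \"" + polygonWkt + "\"^^geo:wktLiteral))\n"
                 + "}";
        }
    }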

The bulk of this is now up and running with the next step being to build the web application. This will involve the construction of a web map using OpenLayers to present results and facilitate user input (eg to capture polygonal search areas or points with buffer distances; the kind of spatial searches typically used within web based GIS).

Linked Data: From interoperable to interoperating

Piazza Mercato, Siena

Videos of all the presentations in this CAA session, held in Siena in 2015, which I blogged about earlier. Full credit and thanks are due to Doug Rocks-Macqueen and his Recording Archaeology project for recording this and other sessions and making them available (see also the session on ArchaeoFOSS and the keynotes). Thanks also to Leif Isaksen and Keith May for organising and chairing the session.

The session outline:

Linked Data and Semantic Web based approaches to data management have now become commonplace in the field of heritage. So commonplace, in fact, that despite frequent mention in digital literature, and a growing familiarity with concepts such as URIs and RDF across the domain, it is starting to see a fall-off in Computer Science conferences and journals as many of the purely technical issues are seen to be ‘solved’. So is the revolution over? We propose that until the benefits of Linked Data are seen in real interconnections between independent systems it will not properly have begun. This session will discuss the socio-technical challenges required to build a concrete Semantic Web in the heritage sector.

The videos for the accepted papers:

  • The Syrian Heritage Project in the IT infrastructure of the German Archaeological Institute – Philipp Gerth, Sebastian Cuy (video)
  • Using CIDOC CRM for dynamically querying ArSol, a relational database, from the semantic web – Olivier Marlet, Stéphane Curet, Xavier Rodier, Béatrice Bouchou-Markhoff (video)
  • How to move from Relational to Linked Open Data 5 Star – a numismatic example – Karsten Tolle, David Wigg-Wolf (video)
  • The Labeling System: A bottom-up approach for enriched vocabularies in the humanities – Florian Thiery, Thomas Engel (video)
  • From interoperable to interoperating Geosemantic resources – Paul J Cripps, Douglas Tudhope (video)

A playlist for the full session is also available.
