Archive

Posts Tagged ‘Semantic Web’

Quick Play with Cayley Graph DB and Ordnance Survey Linked Data

June 29, 2014 2 comments

Earlier this month Google announced the release of the open source graph database/triplestore Cayley. This weekend I thought I would have a quick look at it, and try some simple queries using the Ordnance Survey Linked Data.

Cayley is written in Go, so first I had to download and install that. I then downloaded Cayley from here. As an initial experiment I decided to use the Boundary Line Linked Data, and you can grab the data as N-Triples here. I only wanted a subset of this data – I didn’t need all of the triples storing the complex boundary geometries for my initial test, so I discarded the files of the form *-geom.nt and the files of the form county.nt, dbu.nt etc. (these are the ones with the boundaries in). Finally, I put the remainder of the data into one file so it was ready to load into Cayley.
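For what it’s worth, that file pruning can be scripted. Here’s a rough Python sketch of the filtering described above (the directory name is an assumption, and the boundary files like county.nt and dbu.nt have to be listed explicitly – the suffix check alone only catches the *-geom.nt files):

```python
import glob
import os

# Files holding boundary geometry, skipped along with *-geom.nt
# (this list is illustrative, not exhaustive).
BOUNDARY_FILES = {"county.nt", "dbu.nt"}

def keep(path):
    """Keep a Boundary Line .nt file unless it holds geometry data."""
    name = os.path.basename(path)
    return (name.endswith(".nt")
            and not name.endswith("-geom.nt")
            and name not in BOUNDARY_FILES)

def build_all_nt(src_dir="boundaryline", out_path="all.nt"):
    """Concatenate the files we want into a single all.nt for Cayley."""
    with open(out_path, "w") as out:
        for path in sorted(glob.glob(os.path.join(src_dir, "*.nt"))):
            if keep(path):
                with open(path) as src:
                    out.write(src.read())
```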

It is very easy to load data into Cayley – see the getting started section on the Cayley pages here. I decided I wanted to try the web interface, so loading the data (in a file called all.nt) was a simple case of typing:

./cayley http --dbpath=./boundaryline/all.nt

Once you’ve done this point your web browser to http://localhost:64210/ and you should see something like:

[Screenshot: the Cayley web interface]

 

One of the things that will first strike people used to RDF/triplestores is that Cayley does not have a SPARQL interface; instead it uses a query language based on Gremlin. I am new to Gremlin, but it seems it has already been used to explore linked data – see this blog post from Dan Brickley from a few years ago.

The main purpose of this blog post is to give a few simple examples of queries you can perform on the Ordnance Survey data in Cayley. If you have Cayley running then you can find the query language documented here.

At the simplest level the query language seems to be an easy way to traverse the graph by starting at a node/vertex and following incoming or outgoing links. So to find all the regions that touch Southampton it is a simple case of starting at the Southampton node, following a touches outbound link and returning the results:

g.V("http://data.ordnancesurvey.co.uk/id/7000000000037256").Out("http://data.ordnancesurvey.co.uk/ontology/spatialrelations/touches").All()

Giving:

[Screenshot: query results]

If you want to return the names and not the IDs:

g.V("http://data.ordnancesurvey.co.uk/id/7000000000037256").Out("http://data.ordnancesurvey.co.uk/ontology/spatialrelations/touches").Out("http://www.w3.org/2000/01/rdf-schema#label").All()

[Screenshot: query results with labels]

You can also filter – so to just see the counties bordering Southampton:

g.V("http://data.ordnancesurvey.co.uk/id/7000000000037256").Out("http://data.ordnancesurvey.co.uk/ontology/spatialrelations/touches").Has("http://www.w3.org/1999/02/22-rdf-syntax-ns#type","http://data.ordnancesurvey.co.uk/ontology/admingeo/County").Out("http://www.w3.org/2000/01/rdf-schema#label").All()

[Screenshot: filtered query results]

 

The Ordnance Survey linked data also has the spatial predicates ‘contains’ and ‘within’ as well as ‘touches’, and analogous queries can be done with those. E.g. find me everything Southampton contains:

g.V("http://data.ordnancesurvey.co.uk/id/7000000000037256").Out("http://data.ordnancesurvey.co.uk/ontology/spatialrelations/contains").Out("http://www.w3.org/2000/01/rdf-schema#label").All()

So after this very quick initial experiment it seems that Cayley is very good at providing an easy way of doing quick/simple queries. One query I wanted to do was to find everything in, say, Hampshire – the full transitive closure. This is very easy to do in SPARQL, but in Cayley (at first glance) you’d have to write some extra code (not exactly rocket science, but a bit of a faff compared to SPARQL). I rarely touch JavaScript these days, so for me personally this will never replace a triplestore with a SPARQL endpoint, but for JavaScript developers this tool will be a great way to get started with and explore linked data/RDF. I might well brush up on my JavaScript and provide more complicated examples in a later blog post…
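For the record, the extra code needed for a transitive closure isn’t much. Here’s a rough sketch in plain Python (not Cayley’s Gremlin dialect – the toy ‘contains’ graph is made up) of the traversal such a query has to perform:

```python
from collections import deque

def transitive(edges, start):
    """All nodes reachable from start by repeatedly following edges
    (a dict mapping node -> set of directly related nodes)."""
    seen, queue = set(), deque([start])
    while queue:
        node = queue.popleft()
        for nxt in edges.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Toy 'contains' graph: Hampshire contains two districts, one of which
# contains a ward.
contains = {
    "Hampshire": {"Winchester", "Eastleigh"},
    "Winchester": {"St Barnabas Ward"},
}
print(transitive(contains, "Hampshire"))
```

In Cayley you’d express each step with Out() and drive the repetition from JavaScript; in SPARQL 1.1 the whole thing is just a property path (e.g. spatialrelations:contains+).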

 

 

 

New Ordnance Survey Linked Data Site not just for Data Geeks

June 3, 2013 1 comment

Ordnance Survey’s new linked data site went live today. You can read the official press release here. One of the major improvements is the look and feel of the site, and as a result it should be useful to people who don’t care about ‘scary things’ like APIs, linked data or RDF.

One key additional feature of the new site is map views (!) of entities in the data. This means the site could be useful if you want to share your postcode with friends or colleagues as a means of locating your house or place of work. Every postcode in Great Britain has a webpage in the OS linked data of the form:

http://data.ordnancesurvey.co.uk/id/postcodeunit/{POSTCODE}

Examples of this would be the OS HQ postcode:

http://data.ordnancesurvey.co.uk/id/postcodeunit/SO160AS

or the postcode for the University of Southampton:

http://data.ordnancesurvey.co.uk/id/postcodeunit/SO171BJ

Click on either of these links and you’ll see a map of the postcode – which you can view at various levels of zoom. You’ll also see useful information about the postcode, such as its lat/long coordinates. More interestingly, you’ll notice that it provides information about the ward, district/unitary authority, county (where applicable) and country your postcode is located in. So for the University of Southampton postcode we can see it’s located in the ward Portswood, the district Southampton and the country England.

Another interesting addition to the site is links to a few useful external sites such as They Work For You, Fix My Street, NHS Choices and Police UK. This hopefully makes the linked data site a useful location-based hub for information about what’s going on in your particular postcode area.

Why not give it a try with your postcode…:)

Announcing new beta Ordnance Survey Linked Data Site

April 25, 2013 1 comment

Ordnance Survey has released a new beta linked data site. You can read the official press release here.

I thought I’d write a quick (unofficial) guide to some of the changes. The most obvious one, which is hopefully apparent as you navigate around the site, is the much improved look and feel, including maps (!) showing where particular resources are located. Try this and this for example. Maps can be viewed at different levels of zoom.

Another improvement is the addition of new APIs. The first of these is an improved search function. Supported fields for search and some examples can be found here. The search API now includes a spatial search element.

The SPARQL API is improved. Output is now available in additional formats (such as CSV) as well as the usual SPARQL-XML and SPARQL-JSON. Example SPARQL queries are also included to get users started.

Another interesting addition is a new reconciliation API. This allows developers to use the Ordnance Survey linked data with the Open Refine tool. This would allow a user to match a list of postcodes or place names in a spreadsheet to URIs in the Ordnance Survey linked data.

In the new release the Ordnance Survey linked data has been split into distinct datasets. You could use the above described APIs with the complete dataset or, if preferred, just work on the Code-Point Open or Boundary Line datasets.

For details on where to send feedback on the new site please see the official press release here.

Update: I blogged a bit more about some of the new APIs here.

What is Linked Data?

July 5, 2012 Leave a comment

I wrote an introductory blog entitled “What is Linked Data?” over at the newly revamped data.gov.uk. You can read it here.

About Time…

April 22, 2012 8 comments

I’ve had an initial stab at encoding the Allen interval algebra as an ontology, mainly using this page as guidance for property composition. I’ve done two versions: the first is limited to the subset of the composition rules that can be expressed in OWL 2, and the second contains a hopefully complete axiomatisation using DL-safe SWRL rules.

I’ve included some simple examples in the ontologies to show the inference at work.

The next step will be aligning this ontology to the OWL Time ontology. Feedback on potential applications etc. would be appreciated.

Introducing RAGLD

December 21, 2011 1 comment

RAGLD (Rapid Assembly of Geo-centred Linked Data) is a project looking at the development of a software component library to support the rapid assembly of geo-centred linked data applications.

The advent of new standards and initiatives for data publication in the context of the World Wide Web (in particular the move to linked data formats) has resulted in the availability of rich sources of information about the changing economic, geographic and socio-cultural landscape of the United Kingdom, and many other countries around the world. In order to exploit the latent potential of these linked data assets, we need to provide access to tools and technologies that enable data consumers to easily select, filter, manipulate, visualize, transform and communicate data in ways that are suited to specific decision-making processes.

In this project, we will enable organizations to press maximum value from the UK’s growing portfolio of linked data assets. In particular, we will develop a suite of software components that enables diverse organizations to rapidly assemble ‘goal-oriented’ linked data applications and data processing pipelines in order to enhance their awareness and understanding of the UK’s geographic, economic and socio-cultural landscape.

A specific goal for the project will be to support comparative and multi-perspective region-based analysis of UK linked data assets (this refers to an ability to manipulate data with respect to various geographic region overlays), and as part of this activity we will incorporate the results of recent experimental efforts which seek to extend the kind of geo-centred regional overlays that can be used for both analytic and navigational purposes. The technical outcomes of this project will lead to significant improvements in our ability to exploit large-scale linked datasets for the purposes of strategic decision-making.

RAGLD is a collaborative research initiative between Ordnance Survey, Seme4 Ltd and the University of Southampton, and is funded in part by the Technology Strategy Board’s “Harnessing Large and Diverse Sources of Data” programme. Commencing October 2011, the project runs for 18 months.

If you’d like to input into the requirements phase of the project I’d be very grateful if you could fill in one of these questionnaires. Many thanks in advance.

So what can I do with the new Ordnance Survey Linked Data?

October 25, 2010 7 comments

In a previous post I wrote up some of the features of the new Ordnance Survey Linked Data. In this blog post I want to run through a concrete example of the sort of thing you can build using this linked data.

A while ago Talis built their BIS Explorer. The aim of this application was to allow users to “identify centres of excellence at the click of a button”, and more can be read about the application here. This data mash-up took different data sources about funded research projects and joined them together using linked data. In the original application you could, for example, look at funded research projects by European region in Great Britain, as can be seen here. At the time this demo was created, Ordnance Survey had yet to publish its postcode data as linked data, but if it had, it would have been very easy to get a more fine-grained view of research projects down at the county and district level. Here’s how…

The basic data model of the original BIS data was fairly straightforward: universities and businesses have a link to the projects they worked on. For each university there is also postcode information. Things get interesting if, instead of (or as well as) linking to a string representation of a postcode, you link to the URI for said postcode. This can be done by using the property:

http://data.ordnancesurvey.co.uk/ontology/postcode/postcode

So say we wanted to do this for Imperial College. All we need in our data is this triple (the example is in N-Triples format):

<http://education.data.gov.uk/id/institution/ImperialCollegeOfScienceTechnologyAndMedicine> <http://data.ordnancesurvey.co.uk/ontology/postcode/postcode> <http://data.ordnancesurvey.co.uk/id/postcodeunit/W68RF> .

Now, by the power of linked data, connecting to a resource for the postcode means we can enrich our university dataset with knowledge of the ward, county and district each university is in. Also, given that a university is connected to a project, we have a link from project to region. Through the chain from project to university to postcode to region we can now start to build a more finely grained view of which areas are getting more funding.
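As a toy illustration of that chain (all identifiers here are made up, and the real data of course lives in a triplestore rather than Python dicts), the project-to-region view is just three lookups:

```python
# Minimal stand-ins for the three links described above.
project_university = {"projX": "ImperialCollege"}
university_postcode = {"ImperialCollege": "postcodeunit/W68RF"}
postcode_district = {"postcodeunit/W68RF": "district/HammersmithAndFulham"}

def project_region(project):
    """Follow project -> university -> postcode -> district."""
    university = project_university[project]
    postcode = university_postcode[university]
    return postcode_district[postcode]

print(project_region("projX"))  # district/HammersmithAndFulham
```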

So how do we do this in practice? These are the steps I followed.

  1. Download the BIS data from here and load it into a triple store (linked data database) of your choice. There are plenty of good open source ones available e.g. Sesame or TDB to name two.
  2. I then added the links to postcode URIs as described above.
  3. Following that, I loaded the data for the postcodes in a similar manner to that described here. A relatively simple script retrieved the RDF for the relevant postcodes and loaded it into my store. The nice thing about linked data and RDF is that stores are like a big bucket of data and you can keep throwing more and more in. Hopefully future linked data tools will make this step trivial, but for now some scripting was required.
  4. Job done. I now have links from research projects to regions.
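The script in step 3 isn’t shown here, but its core is just content negotiation: request each postcode URI asking for RDF/XML. A rough Python sketch (the function name is illustrative only):

```python
import urllib.request

def rdf_request(uri):
    """Build a request for the RDF/XML representation of a linked
    data URI; content negotiation does the rest."""
    return urllib.request.Request(
        uri, headers={"Accept": "application/rdf+xml"})

# Fetching one postcode's RDF would then be (network call, so commented out):
# with urllib.request.urlopen(rdf_request(
#         "http://data.ordnancesurvey.co.uk/id/postcodeunit/W68RF")) as resp:
#     rdf_xml = resp.read()  # ready to load into the triplestore
```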

Basically, what I created was an aggregation of various datasets that you can now query – something made very easy by using linked data and URIs to identify things like postcodes. As more publishers release data in linked data form there is more and more potential for building services and applications on top of aggregations of these datasets. So that’s what I decided to do…

This application (I make the usual apology for my lack of web development skills, and for the slowness, which some caching would no doubt sort out) builds a clickable map view of this data aggregation. The OS OpenSpace API makes it possible to retrieve the unit ID for selected polygons. I can then use this unit ID in a SPARQL query to find the projects funded in that region.

However, it would have been easier if there were a RESTful API on top of the data aggregation that let me retrieve these results without writing SPARQL. So that is what I decided to build next using the Linked Data API. The Linked Data API basically lets you create RESTful shortcuts to relatively complex SPARQL queries. Due to my lack of PHP skills it was an initially bumpy ride getting it to work (see here), but I got there in the end, and the result is an API that lets you return research projects by selecting regions either through their SNAC codes or Ordnance Survey IDs, e.g.:

http://www.johngoodwin.me.uk/bis/api/project/county/os/{unit id},

e.g. http://www.johngoodwin.me.uk/bis/api/project/county/os/17765

 

Results can be returned in different formats using content negotiation [1], or by simply adding the relevant .html or .json to the URI, e.g.:

http://www.johngoodwin.me.uk/bis/api/project/euro/os/41424.html

http://www.johngoodwin.me.uk/bis/api/project/euro/os/41424.json

I hope this example shows how linked data can be useful in building applications on top of data aggregations. To summarise:

  1. Publishers release data in linked data format.
  2. Having data in a common format (RDF) with dereferenceable URIs makes it relatively easy to retrieve and aggregate data from a number of resources, especially if data is linked to URIs for ‘things’ and not just ‘strings’.
  3. The Linked Data API makes it possible to build a RESTful service on top of a data aggregation, so web developers need not be put off by complex SPARQL queries.
  4. Applications can then be built using these services.

[1] for some reason the HTML conneg only seems to work in Firefox.

/location /location /location – exploring Ordnance Survey Linked Data – Part 2

October 25, 2010 5 comments

Ordnance Survey have now released an update to their linked data, which can be seen here. The new data now includes postcode information as well as a few changes to the administrative geography data. In this post I’ll go through what’s in the data, and give a few sample SPARQL queries.

I spoke a bit about the administrative geography data in a previous blog post – but the data has changed a bit since then. Just to re-cap, the administrative geography linked data contains information about administrative and voting geographic regions. These include unitary authorities, counties, wards, constituencies, Welsh Assembly regions and a whole lot more [1].

If you want to find a full list of the sorts of things you can find in the data, simply go to the query interface (or SPARQL endpoint as it is known) and try the following query:

select distinct ?type
where { ?a a ?type . }

Now you have the list of all the types of things in the data, you can ask for lists of instances of those types.

For example, the following query will return all of the unitary authorities:

select ?a
where {
?a a <http://data.ordnancesurvey.co.uk/ontology/admingeo/UnitaryAuthority> .
}

All of the names of the regions have now been modelled using the SKOS vocabulary. If you want to find the official names of all the unitary authorities you can simply issue a query like:

select ?a ?name
where {
?a a <http://data.ordnancesurvey.co.uk/ontology/admingeo/UnitaryAuthority> .
?a <http://www.w3.org/2004/02/skos/core#prefLabel> ?name .
}

Also included in the data are two attributes called Unit ID and Area Code. These values are useful if you want to produce a mashup using this data and display it by boundary.

So for example, for Southampton (http://data.ordnancesurvey.co.uk/id/7000000000037256) the area code is UTA (for unitary authority) and the unit ID is 37256. These values can be used as follows:

/* Here we set up our variable called 'boundaryLayer' with the strategies that we require.
In this case, it is its ID and type, i.e. Unitary Authority */
boundaryLayer = new OpenSpace.Layer.Boundary("Boundaries", {
strategies: [new OpenSpace.Strategy.BBOX()],
admin_unit_ids: ["37256"],
area_code: ["UTA"]
});
// then we add the boundary to the map
osMap.addLayer(boundaryLayer);
// this effectively refreshes the map, so that the boundary is visible
osMap.setCenter(osMap.getCenter());

to display the Southampton boundary using the OS OpenSpace API. See http://openspace.ordnancesurvey.co.uk/openspace/support.html for more details.

Arguably the most useful information in this data is the qualitative spatial relationships between different regions. Regions are related to the regions they contain, the regions they are within and the regions they touch. In the case of the touching relationship, only regions of the same type have an explicit touching relationship. The exception to this is that unitary authorities, counties, districts and metropolitan districts also have touching relationships between each other. The following simple query will return a list of all counties, districts and unitary authorities that border the City of Southampton, along with their names:

PREFIX spatialrelations: <http://data.ordnancesurvey.co.uk/ontology/spatialrelations/>

select ?a ?name
where {
?a spatialrelations:touches <http://data.ordnancesurvey.co.uk/id/7000000000037256> .
?a <http://www.w3.org/2004/02/skos/core#prefLabel> ?name .
}

If you are only interested in the bordering counties you can add an extra line to your query:

PREFIX spatialrelations: <http://data.ordnancesurvey.co.uk/ontology/spatialrelations/>

select ?a ?name
where {
?a spatialrelations:touches <http://data.ordnancesurvey.co.uk/id/7000000000037256> .
?a <http://www.w3.org/2004/02/skos/core#prefLabel> ?name .
?a a <http://data.ordnancesurvey.co.uk/ontology/admingeo/County> .
}

Similarly, the following query returns all the county electoral divisions (and their names) within Hampshire:

PREFIX spatialrelations: <http://data.ordnancesurvey.co.uk/ontology/spatialrelations/>

select ?a ?name
where {
?a spatialrelations:within <http://data.ordnancesurvey.co.uk/id/7000000000017765> .
?a <http://www.w3.org/2004/02/skos/core#prefLabel> ?name .
?a a <http://data.ordnancesurvey.co.uk/ontology/admingeo/CountyElectoralDivision> .
}

For convenience, some shortcuts have been added to the data in this release. For certain nested geographies, such as the county – district – parish or district – ward nestings, various new properties have been added. For example, the property ‘countyElectoralDivision’ relates all counties to their constituent county electoral divisions. The above query can now be done in a simpler way:

PREFIX admingeo: <http://data.ordnancesurvey.co.uk/ontology/admingeo/>

select ?a ?name
where {
<http://data.ordnancesurvey.co.uk/id/7000000000017765> admingeo:countyElectoralDivision ?a .
?a <http://www.w3.org/2004/02/skos/core#prefLabel> ?name .
}

Similar predicates such as ‘county‘, ‘district‘, ‘ward‘, ‘constituency‘ etc. provide similar shortcuts. For example, the following returns all the Westminster constituencies in South East England.

PREFIX admingeo: <http://data.ordnancesurvey.co.uk/ontology/admingeo/>

select ?a ?name
where {
<http://data.ordnancesurvey.co.uk/id/7000000000041421> admingeo:westminsterConstituency ?a .
?a <http://www.w3.org/2004/02/skos/core#prefLabel> ?name .
}

The most significant introduction in this release is the inclusion of postcode information. The data now contains information about postcode units, postcode sectors, postcode districts and postcode areas. For each postcode unit an easting/northing coordinate value is given [2], along with the district, ward and county (where applicable) that contain said postcode unit. An example of this can be seen for the Ordnance Survey postcode SO16 4GU. Each postcode is also related to its containing postcode area, sector and district.

The properties ‘ward‘, ‘district‘ and ‘county‘ relate a postcode to the relevant regions. The simple query:

PREFIX postcode: <http://data.ordnancesurvey.co.uk/ontology/postcode/>

select ?district
where {
<http://data.ordnancesurvey.co.uk/id/postcodeunit/SO164GU> postcode:district ?district .
}

returns the unitary authority that contains the postcode SO16 4GU.

This query:

PREFIX spatialrelations: <http://data.ordnancesurvey.co.uk/ontology/spatialrelations/>

select ?postcode
where {
?postcode spatialrelations:within <http://data.ordnancesurvey.co.uk/id/postcodearea/SO> .
}

returns all the postcodes in the SO postcode area.

We can combine the above two queries to find the districts, along with their names, covered by the postcode area SO:

PREFIX spatialrelations: <http://data.ordnancesurvey.co.uk/ontology/spatialrelations/>
PREFIX postcode: <http://data.ordnancesurvey.co.uk/ontology/postcode/>

select distinct ?district ?name
where {
?postcode spatialrelations:within <http://data.ordnancesurvey.co.uk/id/postcodearea/SO> .
?postcode postcode:district ?district .
?district <http://www.w3.org/2004/02/skos/core#prefLabel> ?name .
}

Hopefully these few examples will give you enough information to fully explore this new release of the Ordnance Survey linked data. For those of you who don’t like SPARQL, watch this space – hopefully we will soon(ish) have an API built on top of this data to allow for even easier access.

[1] you’ll notice the ‘isDefinedBy’ link currently returns a 404 – not for long I hope :)

[2] lat/long to follow

Some quick linked data hacks

June 16, 2010 22 comments

In previous posts I discussed the work I’d been doing on my family tree linked data. I decided it might be interesting to plot places of birth for my ancestors on a map to get a true idea of where they all came from. The result, a faceted browser that lets me filter based on family name or birth place, can be seen here. This mashup was very easy to achieve using linked data and a tool called Exhibit. To quote: “Exhibit lets you easily create web pages with advanced text search and filtering functionalities, with interactive maps, timelines, and other visualizations…”.

As I explained in a previous post, the places of birth for family members were recorded in my family tree linked data by linking to place resources in DBpedia, for example: http://www.johngoodwin.me.uk/family/event1917. In order to perform the mashup I needed lat/long values for each place of birth. One option might have been to do some kind of geocoding on the place names using an API. However, I didn’t relish the world of pain I’d get from retrieving data in some arbitrary XML format, or the issues with ambiguities in place names. The easiest way to get that information was to enrich my family tree data by consuming the linked data I’d connected to. This is how I did it…

First I ran a simple SPARQL query to find all the places referenced:

select distinct ?place
where { ?a <http://purl.org/NET/c4dm/event.owl#place> ?place . }

(match on all triples of the form ?a <http://purl.org/NET/c4dm/event.owl#place> ?place, and then return all distinct values of ?place).

The results are URIs of the form http://dbpedia.org/resource/Luton. I then used cURL (a command line tool for transferring data with URL syntax) to retrieve the RDF/XML behind the URIs:

curl -H "Accept: application/rdf+xml" http://dbpedia.org/resource/Luton

This basically says: give me back the RDF/XML for the resource http://dbpedia.org/resource/Luton. It was then easy to insert this RDF/XML into my triplestore (RDF database). I could do this because my family tree data was in linked data format (RDF) and linked to existing resources also in RDF – so there was no problem with integrating data in different schemas/formats.

Now all I had to do was retrieve the information I needed to do the mashup. This was done using a SPARQL query:

select ?a ?name ?familyname ?birthdate ?birthplacename ?latlong
where
{
FILTER langMatches( lang(?birthplace), "EN" )
}
ORDER BY ?birthdate

Given that Exhibit works really well with JSON, I opted to have the results of the query returned in that format (SPARQL query results are typically returned as XML or JSON). It was then a simple matter of massaging the resultant JSON into a form that Exhibit can process.
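That massaging is mostly a flattening of the standard SPARQL-JSON result bindings into the simple items list that Exhibit expects. A minimal Python sketch (the function name and sample binding are invented for illustration):

```python
def sparql_json_to_exhibit(sparql_json):
    """Flatten SPARQL-JSON result bindings into the {"items": [...]}
    structure that Exhibit consumes."""
    items = [
        {var: cell["value"] for var, cell in binding.items()}
        for binding in sparql_json["results"]["bindings"]
    ]
    return {"items": items}

# A single result row as returned by a SPARQL endpoint:
example = {"results": {"bindings": [
    {"name": {"type": "literal", "value": "John"},
     "latlong": {"type": "literal", "value": "51.87,-0.42"}},
]}}
print(sparql_json_to_exhibit(example))
```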

I did another simple mashup using the BBC linked data here. This followed a similar process, except that the BBC had already enhanced their data by following links to DBpedia. This BBC mashup basically lets you find episodes of brands of radio show that play your favourite artists/genres. The BBC data contains links between artists and radio shows. There are ‘sameAs’ links from the BBC artist data to DBpedia. It is DBpedia that then provides the connection between artists and their genre(s).

Hopefully this shows the power of linked data in a simple way. There is a simple pattern to follow…

1) Make data, and make that data available in RDF. People can then link to you, and you can link to other people who have data in RDF. So I made family tree data in RDF, and the BBC made music/programme data in RDF.

2) Link to linked data resources on the web (in this case we both linked to DBpedia).

3) Enhance your data by consuming the data behind those links – this is trivial because both datasets are in the linked data format RDF.

4) Make something cool/useful :)

In fact it will be even easier to build useful services when the Linked Data API is in use, as this will bypass the need for SPARQL in many cases. As more and more people provide linked data we will have an easy way to build services on top of combined data sources, and the Linked Data API will make them web 2.0 friendly for those (understandably?) put off by SPARQL.

/location /location /location – exploring Ordnance Survey Linked Data

October 25, 2009 5 comments

Ordnance Survey now have some linked data available here. This data includes information about the local authority and voting regions of Great Britain. Included in the data are the names (and official names as set out by Statutory Instrument, where applicable), census code and area in hectares of each region. Also included are topological relationships between the administrative areas, which allow users to perform qualitative spatial queries on the data. So, for example, the data contains information about which regions are contained by other regions. Bordering information is given between regions of the same type (e.g. between constituencies). There is one exception to this, where additional bordering information is given between counties, unitary authorities, districts and metropolitan districts [1].

So what can you do with the data? First, you can simply explore it in your browser. For example, look at the URI for The City of Southampton: http://data.ordnancesurvey.co.uk/id/7000000000037256. As you can see, this contains a list of the regions Southampton borders, contains and overlaps [2].

It is possible to perform free text searches on the data here, with the results returned as an RSS feed. Try it out – type the name of the region you are looking for in the first search box. Typing in Southampton gives three results: the unitary authority The City of Southampton and two Westminster constituencies, Southampton, Test and Southampton, Itchen.

The interesting queries, however, are done at the SPARQL endpoint located here.  I’ll give a handful of SPARQL queries to get you going. You will need to add this at the top of each query:

PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX admingeo: <http://data.ordnancesurvey.co.uk/ontology/admingeo/>
PREFIX spatialrelations: <http://data.ordnancesurvey.co.uk/ontology/spatialrelations/>

So first of all I can ask for a list of the types of the things in the data:

select distinct ?type
where
{
?a rdf:type ?type .
}

Seeing that the data mentions unitary authorities, I can ask for a list of all unitary authorities and their official names:

select ?a ?name
where
{
?a rdf:type admingeo:UnitaryAuthority .
?a admingeo:hasOfficialName ?name .
}

I can now issue a topological query: find me all Westminster constituencies contained by the unitary authority Southampton:

select ?a ?name
where
{
<http://data.ordnancesurvey.co.uk/id/7000000000037256> spatialrelations:contains ?a .
?a rdf:type admingeo:WestminsterConstituency .
?a foaf:name ?name .
}

or find me the regions (and their names) that contain the district of Winchester:

select ?a ?name
where
{
?a spatialrelations:contains
<http://data.ordnancesurvey.co.uk/id/7000000000017754> .
?a foaf:name ?name .
}

This query finds me the regions (and their name and type) that border Winchester:

select ?a ?name ?type
where
{
<http://data.ordnancesurvey.co.uk/id/7000000000017754> spatialrelations:borders ?a .
?a rdf:type ?type .
?a foaf:name ?name .
}

This query returns me a list of counties, and the county electoral divisions contained within them along with the names of the county and county electoral division:

PREFIX owl: <http://www.w3.org/2002/07/owl#>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
PREFIX xsd: <http://www.w3.org/2001/XMLSchema#>
PREFIX foaf: <http://xmlns.com/foaf/0.1/>
PREFIX rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#>
PREFIX admingeo: <http://data.ordnancesurvey.co.uk/ontology/admingeo/>
PREFIX spatialrelations: <http://data.ordnancesurvey.co.uk/ontology/spatialrelations/>

select ?ced ?county ?cedname ?countyname
where
{
?county rdf:type admingeo:County .
?ced rdf:type admingeo:CountyElectoralDivision .
?county spatialrelations:contains ?ced .
?ced rdfs:label ?cedname .
?county rdfs:label ?countyname .
}

One final note for people wanting to do mashups with this data. If you wish to see the boundary on a map then the area code and unit ID attributes can be used in the OS OpenSpace API to display the boundary.

So for example, for Southampton (http://data.ordnancesurvey.co.uk/id/7000000000037256) the area code is UTA (for unitary authority) and the unit ID is 37256. These values can be used as follows:

/* Here we set up our variable called 'boundaryLayer' with the strategies that we require.
In this case, it is its ID and type, i.e. Unitary Authority */
boundaryLayer = new OpenSpace.Layer.Boundary("Boundaries", {
strategies: [new OpenSpace.Strategy.BBOX()],
admin_unit_ids: ["37256"],
area_code: ["UTA"]
});
// then we add the boundary to the map
osMap.addLayer(boundaryLayer);
// this effectively refreshes the map, so that the boundary is visible
osMap.setCenter(osMap.getCenter());

to display the Southampton boundary using the OS OpenSpace API. See http://openspace.ordnancesurvey.co.uk/openspace/support.html for more details. An example of the output can be seen here.

Happy SPARQLing…

[1] – if you are (rightly) confused about the geography of Great Britain then there is a handy glossary here.

[2] – the regions that contain Southampton will be added shortly.
