Citizenship hoops

At long last I have got around to sorting out the paperwork for getting British citizenship. At first it appears to be a fairly straightforward process:

  • Pay the Home Office £88 for a letter stating I do not hold British citizenship.
  • On receipt of the letter, submit the application for the retention of South African citizenship to the South African High Commission in London, along with £25.
  • Once that application is confirmed, pay the UK Border Agency a large sum of money and submit five years' worth of identity documents and proof of address.
  • Once citizenship has been granted, attend a citizenship ceremony to complete the legal process.
  • Bob's your uncle and Betty your Queen – go apply for a passport.

Exiftool – sorting photos

I recently had a few days off and managed to sort out the growing collection of photographs accumulating on my hard drive. The collection is almost 150GB, with 52,000+ image and video files spanning 10 years. I have used a variety of photo management tools over the years, including the Canon software that came with the camera, F-Spot, gThumb, iPhoto and Digikam (the tool of choice). The resulting mess of nested folders and sub-folders demanded some TLC. Thankfully I had a couple of backups on different disks, as well as two live working copies, so I was safe in case I messed up.

Enter exiftool. A command line tool to manage all aspects of your photo metadata.

I copied my collection to a scratch processing space year by year and processed them in chunks using a single line of exiftool wizardry:

exiftool -r -d ../output/%Y_%m/%Y-%m-%d_%H-%M-%S_%%f.%%e "-filename<datetimeoriginal" input

This command recurses (-r) through the input directory finding all supported image and video files. It moves the files to the output folder, creating a YEAR_MONTH sub-folder (%Y_%m) using the original creation date of the file to be moved. The creation date and time (%Y-%m-%d_%H-%M-%S) is prefixed to the original filename (%%f.%%e). For each year of photos I end up with 12 folders (2005_01, 2005_02, etc.) containing all the nicely sorted photos.
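
For example, a hypothetical IMG_1234.JPG taken at 09:22:10 on 14 March 2005 would be moved to:

../output/2005_03/2005-03-14_09-22-10_IMG_1234.JPG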

Exiftool also reports errors and files it is unable to process; these remain in the input folders after processing, making it simple to check through them manually. I also had some success with the remnants using the last modified date:

exiftool -r -d ../output/%Y_%m/%Y-%m-%d_%H-%M-%S_%%f.%%e "-filename<filemodifydate" input
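
Exiftool also supports a dry run: writing to testname instead of filename prints the proposed renames without moving anything, which is worth doing before committing to either command above.

exiftool -r -d ../output/%Y_%m/%Y-%m-%d_%H-%M-%S_%%f.%%e "-testname<datetimeoriginal" input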

ogr2ogr: PostGIS to PostGIS

I recently had to update a live database with updated tables from a staging database, and then keep them updated on a daily basis. As it is a regular update and the source and destination tables won't change, I generated a text file with a list of layers to process and tables to write, like this:

srcTable1, destTable1
srcTable2, destTable2

The first column is the list of layers in the staging database to process; this becomes the %G variable in the batch script. The second column is the new table to write, the %H variable.

The initial load reads the layers from the staging database and creates them in the live database. I set the progress flag to check it was doing something (this can be dropped), and set the geometry column name and output schema.

FOR /F "tokens=1,2 delims=," %G IN (list.txt) DO ogr2ogr -progress -lco GEOMETRY_NAME=geometry -lco SCHEMA=outputSchema -nln %H -f PostgreSQL --config PG_USE_COPY YES PG:"dbname='destdbName' host='srcHost' port='5432' user='srcUserName' password='srcPassWord'" PG:"dbname='srcdbName' host='destHost' port='5432' user='destUserName' password='destPassWord'" %G

Subsequent loads overwrite the tables in update mode.

FOR /F "tokens=1,2 delims=," %G IN (list.txt) DO ogr2ogr -update -overwrite -progress -lco GEOMETRY_NAME=geometry -lco SCHEMA=outputSchema -nln %H -f PostgreSQL --config PG_USE_COPY YES PG:"dbname='destdbName' host='srcHost' port='5432' user='srcUserName' password='srcPassWord'" PG:"dbname='srcdbName' host='destHost' port='5432' user='destUserName' password='destPassWord'" %G

Set the appropriate values in the scripts above: database name, host, port if different, username and password.
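
The commands above are written for an interactive command prompt. If you save the update into a .bat file for scheduling, the FOR variables need doubled percent signs – a minimal sketch, with placeholder connection details:

@echo off
REM nightly_update.bat - sketch only; substitute your own connection details
FOR /F "tokens=1,2 delims=," %%G IN (list.txt) DO ogr2ogr -update -overwrite -progress -lco GEOMETRY_NAME=geometry -lco SCHEMA=outputSchema -nln %%H -f PostgreSQL --config PG_USE_COPY YES PG:"dbname='destdbName' host='destHost' port='5432' user='destUserName' password='destPassWord'" PG:"dbname='srcdbName' host='srcHost' port='5432' user='srcUserName' password='srcPassWord'" %%G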

Converting NMEA to GPX with GPSBabel

These are the instructions for batch converting NMEA formatted GPS files to GPX format on Windows 7. Download and install GPSBabel first.

Open a command prompt and switch to your directory of NMEA files.  Mine are in:

  • C:\Workspace\GPS\nmea
  • C:\Workspace\GPS\gpx
Then run the following command:

for /f %i in ('dir /b *.txt') do "C:\Program Files\GPSBabel\gpsbabel.exe" -w -t -i nmea -f C:/Workspace/GPS/nmea/%i -o gpx -F C:/Workspace/GPS/gpx/%~ni.gpx

This can be translated as follows: for each text file in the directory, read it in as NMEA format and write it out in GPX format to the gpx folder. The %~ni modifier strips the .txt extension from the input filename, so the output is written with the input name and a .gpx extension.
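
To sanity-check the conversion on a single file first (sample.txt is just a placeholder name), the same options work without the loop:

"C:\Program Files\GPSBabel\gpsbabel.exe" -w -t -i nmea -f C:\Workspace\GPS\nmea\sample.txt -o gpx -F C:\Workspace\GPS\gpx\sample.gpx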

First Scottish QGIS user group

Together with the good folks at thinkWhere I have been organising the first Scottish QGIS user group meeting. It is happening on 19th March 2014 at the Stirling Management Centre in Stirling. Doors open at 9:30 for a 10:00 start. Registration is through Eventbrite and there are 50 places available on a first come, first served basis.

Details on how to get to the Centre are available on its website.

The agenda will be published a bit closer to the time once speakers have been finalised. If you would like to present, let me know, as it would be good to have a mix of input on the day. There are both 20 minute and “lightning talk” 5-10 minute slots available.

A big thanks to thinkWhere for hosting this first QGIS event in Scotland, and to QGISUK for the enthusiasm and passion.

It’s All Happening Now

Phew! What a whirlwind the last few weeks have been. Actually, the last eight months have been hectic. Mom, Dad and Great Gran arrive tomorrow for three weeks’ holiday. K turns three next week with a butterfly party and the whole family. D turned five last month with an Octonauts party, and Amy celebrated the last of her thirties. Kevin hit his early-mid thirties. E has started smiling and laughing, and also sleeping for more than three hours at a time. D has learned to use a computer mouse and to find LEGO video clips online.

There is a new climbing frame in the garden and the apple trees are overloaded with a bountiful crop this year. The garden shed needs a new roof and there is much to be done inside as well – a new dishwasher, new TV, new coffee table, some painting, new carpet and much furniture shuffling to accommodate new bits and pieces.

Work is exciting, with new projects using open-source GIS applications and further development of the web GIS. Just need to get over the tiredness and sleep more…

PostGIS Spiders

I had a request for some “spider diagrams” showing the connections between service centres and their customers, and was given some sample data of about 140,000 records.

QGIS spider/hub diagram

The data contained a customer ID, customer coordinates and a service centre ID. Using another table of service centres I was able to add and update the service centre coordinates for each record (eastings and northings on the British National Grid, EPSG:27700).
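
The gist of it in SQL looks roughly like this – a sketch only, as the table and column names here are placeholders rather than the original schema:

-- Copy each customer's service centre coordinates onto the customer record
UPDATE customers c
SET sc_easting = s.easting,
    sc_northing = s.northing
FROM service_centres s
WHERE s.id = c.service_centre_id;

-- Build one "spider leg" per customer, from customer to service centre
SELECT c.id,
       ST_MakeLine(
           ST_SetSRID(ST_MakePoint(c.easting, c.northing), 27700),
           ST_SetSRID(ST_MakePoint(c.sc_easting, c.sc_northing), 27700)
       ) AS geom
FROM customers c;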

Speeding up pgRouting

pgRouting and accessibility

I have been using pgRouting for some accessibility analysis to various facilities on the network and experimenting with different ways of making the process faster.

My initial network had 28,000 edges, and solving a catchment area problem from one location on the network to all other nodes was taking 40 minutes on a 2.93GHz quad core processor with 4GB RAM (Windows 7, PostgreSQL 9.2, PostGIS 2.0.3 and pgRouting 1.0.7). I put the query into a looping function that processed the facilities in order, but with any more than four facilities the machine would run out of memory, as the complete solution is held in RAM until the loop finishes.

The first step was to reduce the number of edges in the network to 23,000 and the number of nodes to 17,000 by removing pedestrian walkways, alleys, and private and restricted roads. Now the query is solved in about 12-14 minutes, using about 200MB of RAM per facility.
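
For reference, the per-facility catchment query looked roughly like this – a sketch assuming the pgRouting 1.x driving_distance() signature, with a placeholder edge table and node id:

SELECT vertex_id, cost
FROM driving_distance(
    'SELECT gid AS id, source, target, cost FROM edges', -- placeholder edge table
    12345,   -- network node nearest the facility (placeholder)
    100000,  -- maximum cost to search out to
    false,   -- directed
    false    -- has_reverse_cost
);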

I am in the process of rendering a series of map tiles based on the OS OpenData products using the gdal2tiles.py script (and an updated version that uses all cores on the machine to speed things up). The different raster products are rendered at different scales and then displayed in simple demonstration LeafletJS and OpenLayers applications.

The following command generates the tiles for the zoom levels I need:

python gdal2tiles.py -z '7-9' -e -p raster -r average osvmd.vrt osvmdtiles/
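
Assuming the standard gdal2tiles.py options: -z '7-9' limits rendering to zoom levels 7 to 9, -e resumes an interrupted run by generating only the missing tiles, -p raster selects the raster tile-cutting profile, and -r average sets the resampling method.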
