There and Back Again, a Work Experience Tale @thinkWhere by Joshua Fawcett

Ever since I was a child I have been amazed by the wonders of maps. When I was small and reading The Hobbit, I was more intrigued by where the characters were on the map than by the actual story. I love how maps can so clearly give you an idea of a place without you ever visiting it. This passion has developed since then, moving away from storybooks and on to more advanced maps and the future of the mapping world: Geographic Information Systems (GIS). The GIS world has allowed me to explore the amazing ways in which data can be displayed in map form, and led me to study for a BSc Geography degree at University College London to further explore my passion for GIS. So, when the opportunity came up to do a week's placement at one of the UK's leading GIS mapping companies, it was one I could not turn down.


Me (Josh Fawcett) with thinkWhere Logo

On my first day at the company I was welcomed into the Agile way of working with the 10am daily stand-up ceremony. This was my first ever exposure to Agile, a method used by many tech companies that are moving away from the more traditional waterfall way of working. The Agile way of working essentially follows the holy Agile Manifesto.


Agile Manifesto

During my time at thinkWhere I was allowed to see the inner workings of the business, attending various events. As I had joined the team mid-way through a sprint (a 2-week period of work), I was able to get an insight into the Operations Team Backlog Refinement meeting. This is where the team decides together how long each story (task) will take, by putting forward a rating of the difficulty of the task rather than giving it a set length of time. Furthermore, I was given my own stories to complete and progressed them along the four sections, from to-do all the way to done (see image below). This type of management and progress checking is one which I will apply to other large-scale group projects in the future.


Jira Dashboard Active Sprint Tracker

Alongside observing the various ceremonies at thinkWhere, I was given the opportunity to speak to the amazingly kind and friendly employees. The team at thinkWhere come from diverse backgrounds, giving me an invaluable insight into what a commercial GIS company is really like. Seeing GIS outside of the academic sphere, in the practical and commercial sector, will be beneficial in my future career. Getting the opportunity to present at the Valentine's edition of the company's monthly Meet and Eat was an amazing chance to present outside of a university environment. Discussing the work I completed over the summer, mapping Biodiversity Opportunity Areas for the Tees Valley area, was a true confidence builder. During this session I also got the chance to listen to an interesting piece of work about 'MapAction', learning how GIS is applied to humanitarian crisis situations around the world.


Presenting at Meet and Eat

While doing my placement I was given the opportunity to attend an introductory course on PostGIS. During this day-long session I learnt how to load and manage a spatial database, something I have never done before and aim to explore further in the future. Furthermore, following the advice given, I have expanded my Python and other coding skills, learning the languages that are essential for the development of open source GIS systems.


PostGIS User Training Certificate

Just like Bilbo Baggins on his journey, I too have learnt a lot, and overall I have gained invaluable experience. The things I have learnt from my week's worth of experience will stay with me forever, helping me greatly to develop my career path and giving me inside knowledge of the workings of GIS systems. I was also given advice and tips on how to improve my CV and further enhance my career opportunities.

My gratitude and thanks go to all the team at thinkWhere, who have made my time here unforgettable. Thank you for making me feel so welcome.


First-time Scrum master in a cross-functional team

My name is Zlata and I have been in a scrum master position for 6 months, alongside my QA role. I was lucky enough not to be the first scrum master in the team, as the role is rotating. Because of this, I learnt a lot from participating in scrum ceremonies established by my brilliant predecessors. I had been scrum master with the development team for about 4 months when we restructured, merging our developers and consultants into one single cross-functional team called 'Operations'. Being the scrum master for the first time in an established team provided many opportunities for learning; being the scrum master in the newly formed team has brought about many more interesting challenges.

This blog post contains my reflections on the challenges we are encountering in a small multi-skilled team and how we are addressing them. I hope some of the information here will be useful to a beginner scrum master or one working in a smaller organisation.

There is no handbook.

Ok, there are tons of books, papers, blog posts and memes, but the truth is that processes and team dynamics never perfectly match the real-life scenarios. There is no template you can follow exactly; furthermore, the one you tailor and adopt will always be changing. Oftentimes, when reading a book or an article, you realise that conceptually scrum may not tick all the boxes for your use case; however, it still provides a great framework for getting things done.

There are many interpretations of scrum and agile out there, and it is worth remembering that the definitions in the Scrum Guide may not perfectly fit the processes in your team and should be interpreted to match your organisation's requirements. For example, at thinkWhere we look at planned and committed work which the team estimates, and limit it to two weeks, which seems a reasonable iteration in which to deliver and review the processes. We also focus on having more face-to-face conversation and on continuously self-improving in a constructive manner.


This point is probably most relevant to new scrum masters who are starting out, because with the amount of information out there you will see a lot of mixed discourses. I keep reminding myself to use the proliferated theory as guidance rather than absolute truth. It is important to stick to the main principles, which focus on team collaboration and being able to adapt to change, rather than to any particular details listed by an agile influencer. For me this was the hardest point to learn, which probably stems from the fact that you cannot actually formalise and enforce all the processes in agile, as they depend on the team and the circumstances.

Do not ask for permission, ask for forgiveness.

Sorry for the misleading heading. In my interpretation of the statement above, changing things and asking for constructive feedback is essential. The main lesson I have learnt is that challenging your own preconceived ideas and following people’s advice on the internet can be a great thing. Furthermore, getting people out of their comfort zone improves alertness and focus. Stress can be good in small doses.

Changes in everyday behaviour help keep us focused, and in scrum this can easily be implemented by changing the ceremony setup: for example, the daily stand-up location can be moved, questions can be rephrased, and so on. For the daily stand-up, one of the recent changes was removing the Jira board from the daily ceremony and limiting it to a weekly occurrence, as it seemed to have been diverting team focus. For other ceremonies, such as retrospectives, the format can be modified, as many engaging models have been developed and tested.

One of the first things we did as a new Operations team was to implement geographic changes. We have relocated everyone’s desk to ensure the team members are in closer proximity to each other. This may have created a deserted corner in our office but has ultimately brought people together.

For a scrum master, it is really easy to get into a certain routine of doing things, which may slow down the continuous improvement process. The potential solution for that could be temporarily delegating some of your responsibilities. You can learn a lot from how other people approach your tasks. For instance, I feel that I have been a bit lazy with the retrospective part of the meetings and it would be great to get other people to facilitate. The team may just learn about it from this blog post.


Skill distribution in cross-functional teams.

With the Operations team’s creation, there has been an increase in the range of skills, which has contributed positively towards the knowledge exchange. We have a lot to learn from each other and there are many ways to encourage internal cross-pollination of information.

Because the team is quite new, we balance the various specialisations by doing a capacity review during sprint planning, to ensure the effort and skill capacity needed for a story's completion is fully met.


Fostering shared ownership of issues is another approach to encourage knowledge exchange. Linking the '3 Amigos' to the sprint goal is one way to make sure there is more than one person who has a grasp of the project's progress, and to encourage taking joint accountability. When capacity allows, we encourage pairing up on tasks, and we capture required improvements in documentation or as technical debt tickets.


In a small company it can be easy to lose focus with the range of solutions offered and developed, not to mention the unplanned support or high-priority sales jobs that may come in. There are many approaches that can help improve focus, and many of them are built into scrum.

Stick to scrum ceremonies. They are important as they provide a temporal indication of when the sprint begins and ends, as well as a regulated framework for everyday communication. Ceremonies and meetings are the most effective way of formalising scrum.


Document. Unplanned work is often unavoidable, and it is important that the team captures it and makes it visible on the board. If any other work is affected, then it should be flagged and, ultimately, reprioritised and removed from the sprint.

Agendas. Make sure ceremonies and meetings are time-boxed and well coordinated. A detailed meeting agenda shared with the team keeps you, as well as others, on track. This improves flow and can be modified and perfected over time. Make sure the team is familiar with it as well. I attach meeting agendas to the Confluence page for each sprint overview.


Fun may seem distracting; however, well-coordinated fun outlined in the meeting plan helps the team to be engaged and more focused, with the routine supplemented by some amusement.


For example, I try to integrate two short sessions into a 2-hour meeting. The first is time-boxed to 5 minutes and comes at the beginning of the sprint review meeting. The sessions vary; one is a modified version of Candy Love, wherein I offer people Skittles and ask random questions: What did you do over the weekend? What is your favourite office machine? The second short session is a modified version of 360 Degree Appreciation, which follows the retrospective part of the meeting: I ask people to give constructive, positive work feedback to the person sitting next to them.

These activities make people smile and also provide more natural transitions of the meeting’s agenda. They also provide positive reinforcement and contribute towards team building.

Retrospectives are the catalyst for actions.

Retrospective meetings are the perfect space to turn potential rants into constructive actions that lead to resolving issues and breaking poor cycles. If you are a new scrum master, it is useful to review retrospective notes from previous sprints, as this will help you to acknowledge the improvements the team has been making and appreciate the ceremony more.


It is important to use the retrospective meetings effectively: to derive executable action items and assign ownership of them. Mid-sprint, when we review progress during the stand-up meeting, the team goals are also revisited and reviewed. In time, these action points can also provide an effective metric, demonstrating to the team the number of items completed within a certain period of time.

Retrospectives don’t need to be boring. As mentioned earlier, it is relatively easy to alternate retrospective styles, and the introduction of a new style improves focus. We do, however, tend to use online boards, so that people can write up their thoughts prior to the meeting, as this saves time and ensures that everyone is prepared. This approach may change with another team member facilitating this short session.

Oftentimes, retrospective meetings can bring up many negative points. During the meetings, it is of great value to allocate time to positive feedback (see my previous point). It is easier to focus on the negative events and to forget the achievements, so constructive compliments should be encouraged.


Being the first-time or the part-time scrum master in a multi-skilled team is both challenging and interesting. The challenges can often differ from your main specialisation, hence providing some exciting learning opportunities.

I hope that sharing some of the challenges we are overcoming here at thinkWhere will be useful to anyone working in scrum.

Demystifying Coordinate Transformations


This blog post looks at the mathematical calculations involved in rendering 2D GIS vector data, based on a particular area that we want to see on a map, i.e. converting vertices from map space into view space, and back.

For each vertex in a feature geometry, we need to perform a set of operations to calculate the screen position, based on our chosen location, scale, and rotation. Note also that most applications use the top left of the view as the origin, whereas map coordinates have their origin at the bottom left.

Normally, we would use client software such as OpenLayers, or application software such as GeoServer, to make these calculations (and render the points, lines, or polygons) for us. Occasionally, however, it is more convenient to make the required calculations ourselves, e.g. generating an image file on a server when the data is not available as a WMS through GeoServer.

At thinkWhere, we have implemented these calculations in a number of programming languages over the years as the languages we have used have changed. We have gone from C++, through C# and Actionscript (Flash), to Python and Javascript.

The Maths

To make the calculations we use affine transformations. For 2D data, these take the form of 3×3 matrices which can be applied to a point location, producing a modified point.

We can create a number of transformations, each of which represents an action on the data, e.g. translating, scaling, or rotating the point.

These can be combined through matrix multiplication into a single transformation.

We can also calculate the inverse of a matrix, making it just as easy to perform the reverse set of actions.

The beauty of this approach is that we can apply a number of actions to each vertex in a small number of arithmetic operations, making them quick to apply. Even for rotation, the sine/cosine values are calculated once, and the resulting matrix values are used in simple multiplications and additions from then on. Once the completed matrix has been generated, the calculations required for each vertex are:

x' = (a * x) + (b * y) + c

y' = (d * x) + (e * y) + f

The Good News

The good news is that once we have created a class to perform the matrix arithmetic, we don’t have to think about it again. We can then create a sequence of actions which will transform our map coordinates to view coordinates.

  • Translate the data based on our chosen centre point – centring the data at 0,0
  • Scale the data based on our chosen view scale
  • Rotate the data around the centre if required
  • Flip the horizontal axis to switch the origin from bottom left to top left
  • Translate again to the centre of our view (based on the view size)
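As a rough illustration (a Python sketch, with all names invented for the example), the matrix arithmetic and the five actions above can be written as follows. A transform is stored as the six values (a, b, c, d, e, f), the top two rows of the 3×3 matrix, with the bottom row fixed at (0, 0, 1):

```python
import math

def apply(m, x, y):
    """Transform one vertex: x' = a*x + b*y + c, y' = d*x + e*y + f."""
    a, b, c, d, e, f = m
    return (a * x + b * y + c, d * x + e * y + f)

def multiply(m1, m2):
    """Compose two transforms; the result applies m2 first, then m1."""
    a1, b1, c1, d1, e1, f1 = m1
    a2, b2, c2, d2, e2, f2 = m2
    return (a1 * a2 + b1 * d2, a1 * b2 + b1 * e2, a1 * c2 + b1 * f2 + c1,
            d1 * a2 + e1 * d2, d1 * b2 + e1 * e2, d1 * c2 + e1 * f2 + f1)

def inverse(m):
    """Invert a transform, giving the view-to-map direction (e.g. hit-testing)."""
    a, b, c, d, e, f = m
    det = a * e - b * d
    return (e / det, -b / det, (b * f - c * e) / det,
            -d / det, a / det, (c * d - a * f) / det)

# The individual actions:
def translate(tx, ty):
    return (1, 0, tx, 0, 1, ty)

def scale(s):
    return (s, 0, 0, 0, s, 0)

def rotate(angle):
    c, s = math.cos(angle), math.sin(angle)
    return (c, -s, 0, s, c, 0)

def horizontal_flip():
    # Reflect about the horizontal axis (negate y), switching a
    # bottom-left origin to a top-left one.
    return (1, 0, 0, 0, -1, 0)

def map_to_view(cx, cy, angle, scalefactor):
    """Combine the actions for a 500x500 view; the rightmost matrix acts first."""
    m = translate(-cx, -cy)
    m = multiply(horizontal_flip(), m)
    m = multiply(rotate(angle), m)
    m = multiply(scale(scalefactor), m)
    m = multiply(translate(250, 250), m)
    return m
```

Because the whole sequence collapses into six numbers, each vertex costs only four multiplications and four additions, and inverse() gives the view-to-map direction essentially for free.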

The example below shows this in action.


Sample Application

This sample application has been written using an HTML5 Canvas to draw a GIS dataset from a GeoJSON file. It uses the turf library to assist in reading the GeoJSON files.

It uses an implementation of affine transformations to calculate the necessary transform to convert map coordinates in the GeoJSON file to screen coordinates. A standard class called CanvasRenderingContext2D draws the features. We have added additional members to make the drawing easier. This object already implements affine transformations. You can apply individual actions using the rotate(), scale(), and translate() methods, which alter the current transformation to allow you to build up a set of actions, or you can supply the six relevant parameters of the matrix in a single call to transform().

We have used the approach of calculating the matrix ourselves, in order to also produce the inverse transformation. This is then used to make the reverse calculation, i.e. for a point clicked on the map, finding the map coordinate it relates to, in order to then search for the clicked feature within the source data.

The code below uses the view control values to create the required transformation, and passes this to the CanvasRenderingContext2D:

function setupTransform(canvas, context) {
    var r = document.getElementById("rangeAngle");
    var tx = document.getElementById("rangeCentreX");
    var ty = document.getElementById("rangeCentreY");
    var sc = document.getElementById("rangeScale");
    var scalefactor = canvas.width / sc.value;

    var t = new Transform();
    t.apply(new Translate(-parseFloat(tx.value), -parseFloat(ty.value)));
    t.apply(new HorizontalFlip());
    t.apply(new Rotate(parseFloat(r.value)));
    t.apply(new Scale(scalefactor));
    t.apply(new Translate(250, 250));

    currentTransform = t;
    currentInverse = t.inverse();
    currentScaleFactor = scalefactor;
}

An alternative method would use a combination of the built-in rotate(), scale(), and translate() methods.

When a data file is loaded, the maximum extent of the features is calculated, and the controls are set up accordingly. When the controls are modified the transformation is adjusted, and the features redrawn.

Loops and more

Since it is quite simple to set up a set of actions, passing in the various parameters, we can make use of loops to run through a variety of settings and start having some fun.

This example uses the Python libraries: Shapely (to read the GeoJSON data), Pillow (drawing to image), and images2gif (animated GIF) to produce an animated map.
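The script itself is not embedded here, but a minimal sketch of the idea might look like the following. For the sake of a self-contained example it uses a hard-coded polygon in place of geometry read from GeoJSON, and Pillow's own animated GIF writer instead of images2gif:

```python
from PIL import Image, ImageDraw
from shapely.affinity import rotate as rotate_geom
from shapely.geometry import Polygon

# A stand-in geometry; in the real script this comes from a GeoJSON
# file read with Shapely.
poly = Polygon([(100, 50), (400, 120), (350, 400), (120, 350)])

frames = []
for angle in range(0, 360, 15):
    img = Image.new("RGB", (500, 500), "white")
    draw = ImageDraw.Draw(img)
    # Rotate the geometry about the view centre for this frame.
    spun = rotate_geom(poly, angle, origin=(250, 250))
    draw.polygon(list(spun.exterior.coords), outline="black", fill="lightblue")
    frames.append(img)

# Pillow writes the animated GIF directly.
frames[0].save("animated_map.gif", save_all=True,
               append_images=frames[1:], duration=80, loop=0)
```

Each frame is just the same render with a different transformation applied, so any of the view parameters (centre, scale, rotation) can be looped over in the same way.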

Address Search OS OpenNames with PostGIS, SQLAlchemy and Python – PART 2

Part 1 of this post outlined how to configure a PostGIS database to allow us to run Full Text searches against the OS OpenNames dataset.

In Part 2 we look at writing a simple Python 3 CLI app that shows how easy it is to integrate this powerful functionality into your apps and APIs. Other than Python, the only dependency we need is the SQLAlchemy ORM, to let our app communicate with Postgres.


Installing SQLAlchemy

SQLAlchemy can be installed using pip. It depends on psycopg2, which you may struggle to install on a Mac without Postgres present, which is frustrating (however, solutions can be found on Stack Overflow).

A simple address search CLI

Let me draw your attention to…

Hopefully this script is fairly easy to follow, but there are a couple of lines to draw your attention to:

  • Line 4 – note we have to tell SQLAlchemy we’re using the Postgres dialect so it understands TSVECTOR.
  • Lines 8–12 – simply SQLAlchemy boilerplate that sets up our connection and session for the app. You’ll need to swap out the connection details for your own.
  • Lines 17–20 – I’ve chosen to map only 3 columns; you’ll probably want to map more.
  • Line 25 – very important: here we append the OR operator to every word the user has supplied, meaning we return addresses matching any of the words. You could extend this to allow the user to specify an exact-match operator and change this to an & search.
  • Line 26 – finally, note we ask SQLAlchemy to match our search, and importantly we must supply the postgresql_regconfig param to say we’re searching in English. This is vital or you won’t get the matches you expect.
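As a rough guide, a minimal version of such a CLI might look like this. The table and column names follow Part 1 (open_names, textsearchable, and ogr2ogr's default ogc_fid key); the class and helper names are invented for the sketch:

```python
# Hypothetical sketch of the address-search CLI described above.
import sys

from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.dialects.postgresql import TSVECTOR  # Postgres dialect type
from sqlalchemy.orm import declarative_base, sessionmaker

# Boilerplate: swap the connection details for your own.
engine = create_engine("postgresql://iain:password@localhost:5432/Real-World")
Session = sessionmaker(bind=engine)
Base = declarative_base()

class OpenName(Base):
    __tablename__ = "open_names"
    # Only three columns mapped here; you'll probably want more.
    ogc_fid = Column(Integer, primary_key=True)
    text = Column(String)
    textsearchable = Column(TSVECTOR)

def build_query(terms):
    """Append the OR operator to every word the user supplied."""
    return " | ".join(terms.split())

def search(terms):
    session = Session()
    return session.query(OpenName.text).filter(
        # postgresql_regconfig is vital, or you won't get the expected matches.
        OpenName.textsearchable.match(build_query(terms),
                                      postgresql_regconfig="english")
    ).all()

if __name__ == "__main__":
    for (address,) in search(sys.argv[1]):
        print(address)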

Running our app

We can run our app from the command line simply by entering the following command:

python 'forth street'

And we see our app print out all matching addresses that contain either Forth or Street 🙂


Hopefully you can see how easy it would be to take the above code and integrate it into your apps and APIs. I hope you’ve found these tutorials useful. Happy text searching!

Address Search OS OpenNames with PostGIS, SQLAlchemy and Python – PART 1

In this two-part post we’ll look at implementing an address search using the Ordnance Survey Open Names dataset. We’ll use the power of Postgres with the PostGIS extension, leveraging its built-in Full Text Search, and use Python and the SQLAlchemy ORM to create a simple CLI.


Part 1 – Data Load and DB Config

Address Data

The UK is badly served for free address data. The best we have is the Ordnance Survey OpenNames dataset. It will work as a Postcode lookup or a street finder (at a push), but the dataset would require a lot of additional processing to be a useful address search. OS really want you to purchase AddressBase.

That said, OpenNames will suffice for this example and it should be easy to extend the example to a fuller dataset if you’re lucky enough to have one.

Loading Data to PostGIS

You can download OpenNames as either CSV or GML. I’d recommend GML, as it’s simpler to load into PostGIS using ogr2ogr.

Once you unzip the archive you’ll see that the files are referenced according to the British National Grid, so you can load as much or as little as you want.

We’ll load NS68 which contains addresses in my home town of Stirling, as follows (swap out the values for your db):

ogr2ogr -f PostgreSQL PG:"host=localhost dbname=Real-World port=5432 user=iain password=password" NS68.gml -progress -nln open_names --config PG_USE_COPY YES 

You should now have a new table called open_names containing the addressing info.

Note if you want to load more gml files just use the -append flag:

ogr2ogr -f PostgreSQL PG:"host=localhost dbname=Real-World port=5432 user=iain password=password" NS88.gml -append -progress -nln open_names --config PG_USE_COPY YES 

Setting up Full Text Search

We now have our open_names table, but no text search column. So we can add a textsearchable column which must be of type TSVECTOR as follows:

ALTER TABLE open_names ADD COLUMN textsearchable TSVECTOR;

We can populate the column using the built-in function TO_TSVECTOR, which tokenises the words based on the supplied config – in our case english, though multiple configs are supported.

UPDATE open_names SET textsearchable = TO_TSVECTOR('english', text || ' ' || localid);

If you look at the data in your new column you’ll see that it now contains text tokens representing the address data.

Increase accuracy by concatenating multiple columns

Note that we’re concatenating 2 columns together in this update statement – text and localid. In our case the reason for doing this is that the postcode in the localid column is stored without a space, meaning our search will return a result if the user enters a postcode without a space.

However, it should be clear if we had better address data, we could concat multiple columns. Meaning if a user searched for “1 Main St, Stirling, FK3 4GG” we would be able to return an accurate match.

Add an Index for faster searching

Now that we have data set up we can add an index to our new column which will ensure searches are fast:

CREATE INDEX textsearch_idx ON open_names USING GIN (textsearchable);

Let’s do some searches

Now let’s query our new column to see if we can find some matches, using the TO_TSQUERY function:

SELECT COUNT(1) FROM open_names WHERE textsearchable @@ TO_TSQUERY('english', 'avenue')

Here we find we have 41 streets in Stirling area containing the word avenue. You’ll note that I don’t need to worry about lowercase, uppercase or where the word might appear in the string. Full text search takes care of that for me 🙂

The @@ operator basically means that the query matches the tsvector column.

Using AND and OR for better matches

A very powerful feature of Postgres’ Full Text Search is the ability to find matches containing all or some of the words in the query, using the AND (&) operator or the OR (|) operator, as these examples show:

select * from open_names where textsearchable @@ to_tsquery('english', 'forth & view');

Here we return only one result, Forth View, which contains both Forth and View. If we change this to an OR search:

select * from open_names where textsearchable @@ to_tsquery('english', 'forth | view')

We get 7 results, including Forth View, Bruce View and Forth Place.

Again it should be easy to see how powerful text searches could be built for complex text documents.

A final note on Triggers

While our address data is fairly static, if you had a table where users were regularly editing address data, or any other columns you wanted to run a full text search on, you should consider adding a trigger to keep the TSVECTOR column up to date, as outlined here.

So for our example the trigger would look like:

CREATE TRIGGER tsvector_update BEFORE INSERT OR UPDATE ON open_names
FOR EACH ROW EXECUTE PROCEDURE tsvector_update_trigger(textsearchable, 'pg_catalog.english', localid, text);

Up Next

Hopefully Part 1 has demonstrated how it is very easy to set up powerful text searching in Postgres. In Part 2 we’ll look at how we can use Python and SQLAlchemy to allow you to integrate this functionality into your apps and APIs.

Restoring a Postgres database to AWS RDS using Docker

In this post I look at using Docker to restore a Postgres dump file to a Postgres database running in the cloud on AWS RDS.

Keep it clean

One of the big selling points of Docker, for me, is that I can have lots of apps and utils running in nice containers on my dev laptop, without having to install them locally. This ensures my laptop stays nice and responsive, and I don’t clutter/break it with lots of weird dependencies and running processes that I’m then too scared to delete.

Postgres is a good example – I don’t want to install it locally, but I do need access to the command line tools like psql and pg_restore, to be able to work with my databases effectively.

One way of accessing these tools would be to ssh onto the AWS cloud instances, but there’s a bunch of reasons, most pertinently security (not to mention the faff), why you’d want to avoid doing that every time you want to run some SQL. So let’s look at how we use Docker to ease the pain instead.

Start Me Up

With Docker installed you can build this simple Dockerfile to create a local Postgres container. The User and Password env vars aren’t strictly required; however, if you want to actually connect to the containerised DB, they’re pretty handy.

You can build, run and connect to the container as follows (this assumes you are on a Mac):

Note line 4, where I map the data-load dir I created at line 1 to a new directory called data-loader inside my container. This means that when I copy the Postgres dump file into my local data-load directory, it will be available to the Postgres tools in the container.

Line 6 allows me to connect to the container; swap the imageId for your locally running container ID.

Restoring your database with pg_restore

I’ll assume you already have a Postgres database set up within the AWS cloud. Now that we have connected to our container, we can use pg_restore to restore our dump file into AWS (note this command will prompt you for the admin password).

A note on schemas

If you’re doing a partial restore, you may want to restore your dump file to a separate schema. Unfortunately there appears to be no way to do this from the command line. What you have to do is rename the public schema, create a new public schema and restore into that, then reverse the process.

This StackOverflow answer outlines the process.

Restore Complete

You should now have a complete restore of your dumpfile in the cloud.  Please add comments if anything is unclear.

My work placement week @ thinkWhere

My name is Yacouba Traore. I am currently studying for the second year of a BSc (Hons) in Information Technology at Teesside University School of Computing.

I have had a great week's placement at thinkWhere, during which I was presented with a variety of opportunities. I had the chance to meet everyone from the various parts of the business, including the CEO, Portfolio Manager, Business Managers, Accounts Managers, Developers, Service Desk Consultants and Office Administrators.

I have learned how agile development frameworks such as Scrum work, as well as the differences between Agile and Waterfall. I have taken part in the different ceremonies, including daily stand-up, backlog refinement, demo and sprint planning.

Attending the Scrum daily-stand up

I have also learned about GIS (Geographical Information Systems) and theMapCloud. I have also created my own map using QGIS of where I am from in Côte d’Ivoire (Ivory Coast).

My first QGIS map of Côte d’Ivoire

The thinkWhere development team also worked with me to develop my coding skills including help with JavaScript, HTML and CSS using PyCharm. I used these skills to script a page showing a graph of theMapCloud usage metrics.

The role has allowed me to learn key skills and competencies in IT and business systems. On top of this, I delivered a 30-minute presentation about myself and a project I am working on at university.

One of the best things about the company is the people. They were very friendly, approachable and well organised. I have also made a great network of colleagues and made new friends.

I have really enjoyed my work placement. I have learnt a lot and have gained skills that I will take forward with me. I have also been given many opportunities and many new experiences. I’ve gained a deeper understanding of how large IT projects work, which is going to help me in the future.

Developing my JavaScript, HTML & CSS skills using PyCharm.

Being part of this placement has also helped me to develop my interview skills and my job prospects. It is also very valuable experience for my CV.

I hope to be back at thinkWhere again one day!