It is becoming increasingly difficult to spot errors in OpenStreetMap just by looking at the map on openstreetmap.org. To help, we have a collection of quality assurance tools, and this weekend I discovered a new one developed by Strava. The keen cyclists or runners amongst us will recognise the name – Strava is a phone app/service that enables you to monitor your athletic progress over time and compare it against others. To do this it keeps track of your run/cycle using the GPS built into your smartphone. Consequently Strava have built up a large database of GPS traces and user-contributed map errors. From this they have built a “routing errors” website to help iron out any issues in the underlying map data, which naturally is based on OpenStreetMap.
The service highlights two types of errors: those manually entered by its users, and those automatically detected from the map data and the recorded GPS traces (e.g. many cyclists riding the “wrong” way down what is a one-way street in OpenStreetMap suggests that the OSM data may be incorrect).
I’ve had a quick play with this new quality assurance tool (with mixed results), but I’d love to hear your views. One concern is that the manually submitted “errors” may reflect a problem with user behaviour or the Strava app itself, rather than an issue with the OpenStreetMap data – à la MapDust, a similar and now quite dated service from Skobbler. But is this view too pessimistic? Is there useful data in the manually submitted errors? And how about the (US-only) automatically detected errors? Are these a reliable source of quality assurance data for OpenStreetMap?
Let us know your views in the comments section below and feel free to share any other great QA tips you may have.