Fitness app Strava found itself in hot water this week, as a marketing gimmick spectacularly backfired, forcing the company to reassess its own privacy settings and putting militaries on high alert.
Strava is a fitness-logging app, used particularly to track workouts and training schedules. Apparently aiming to create a “fun and informative” post showing how its users work out across the world, the company released a global heatmap comprising data from more than 1 billion activities uploaded to the app.
Strava uses the GPS on a subscriber’s mobile phone to track fitness activity, and also pulls in information from connected devices such as Jawbone and Fitbit. The company used over three trillion pieces of this user data, uploaded to the app between 2015 and 2017, to put together the heatmap, and in doing so made it possible for those viewing the map to locate and identify military bases in various countries around the world.
The question at the heart of this story, however, is why Strava felt the need to use its users’ data for nothing more than a promotional gimmick in the first place. Lifestyle apps that use location services to track subscribers’ activity have a particular duty to protect that information for the sake of users’ privacy and security.
Strava would argue that the information shown on the heatmap had been anonymised and displayed in aggregate, meaning that an individual user’s data could not be isolated. But as the backlash against the map has shown, the safety and security of individuals can still be put at risk, even when data has been “protected” in this way.
Of course, if Strava hadn’t thought to use their users’ information in this manner, none of this would have happened—and it’s difficult to understand why they considered this appropriate in the first place.
People use tracker-style apps for a huge variety of purposes, from fitness and nutrition to sleep optimization and family organization. Users upload their data to these apps in good faith, trusting that the information will be used responsibly.
To use that data for nothing more than a gimmick undermines the trust that a user places in an app—and, as this situation shows, it can also threaten user safety.
In an open letter, Strava CEO James Quarles acknowledged the impact the heatmap had on the military, but did not comment on why the company felt the need to show user data in this way:
I’d like to take a moment to address the recent attention focused on Strava and our global heatmap. Our heatmap provides a visualization of activities around the world, and many of you use it to find places to be active in your hometown or when you travel. In building it, we respected activity and profile privacy selections, including the ability to opt out of heatmaps altogether. However, we learned over the weekend that Strava members in the military, humanitarian workers and others living abroad may have shared their location in areas without other activity density and, in doing so, inadvertently increased awareness of sensitive locations.
Many team members at Strava and in our community, including me, have family members in the armed forces. Please know that we are taking this matter seriously and understand our responsibility related to the data you share with us.
Despite this claim, at the time of writing, the Strava heatmap was still available to view.
This isn’t the first time that Strava has been criticised for its handling of user data, and the above response fails to make clear that the company understands its duty to its users.
However, Quarles did state that the company would be reviewing its privacy features:
We are committed to working with military and government officials to address potentially sensitive data.
We are reviewing features that were originally designed for athlete motivation and inspiration to ensure they cannot be compromised by people with bad intent.
We continue to increase awareness of our privacy and safety tools.
Our engineering and user-experience teams are simplifying our privacy and safety features to ensure you know how to control your own data.
Let’s hope that some reforms follow—and that consumers keep an ever-vigilant watch over how companies use their information.