Posted by Robert Vamosi on February 6, 2018
A few weeks ago, the fitness company Strava published a worldwide heatmap showing specific routes its customers have taken over the years. While it is a visual heatmap of human activity on the planet, according to various security researchers it also exposed sensitive areas, such as routes within top-secret military bases. This is not a conventional data breach in which sensitive data was exposed through a specific vulnerability in the software or device; in this case, the device and software operated as designed. Rather, the flaw appears to lie in how end users have configured their devices for collection and sharing, and that's a flaw all too common in the Internet of Things (IoT) today.
Thanks to the marketing team at Strava, the fine line between convenience and security is once again up for public discussion. The map itself doesn't tell us anything we didn't already know (such as the existence of secret locations); rather, it's the sensitive information lurking within its dataset (such as the names of the people using the device) that matters.
The Strava heatmap, as presented, is not live but generated from sensor data collected between 2015 and 2017. Even though it is historical data, using a Strava feature known as "Flyby" (in which you can compare your run with those of others), researchers could identify who was running at the time, provided they had an account. Perhaps this feature exists to build social networks ("Hey, what'd you think of that hill?"), but Henrik Lied, a journalist at Norway's NRKbeta (the Norwegian Broadcasting Corporation NRK's sandbox for technology and media), soon realized he could construct and upload a faux run as a GPX file dated in the past (specifically between 2015 and 2017) to tease out the identities of any runners nearby.
“Maybe we can create fake runs in the areas we’re interested in, and ‘go back in time’ and check for activity in those areas using the Flyby feature,” Lied told Security Ledger. “You can upload 25 GPX files at a time via the website. So I sat down and created a script which takes one GPX track as input, and creates 364 versions of the file as output–one file for each day. Each file has small differences in time of day and tiny differences in the track route to be able to catch as many possible routes as possible.”
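Lied's actual script isn't published here, but the approach he describes can be sketched in a few lines. The following is a minimal, hypothetical reconstruction: the track coordinates, the GPX skeleton, and the jitter amounts are all placeholder assumptions, not his data.

```python
from datetime import datetime, timedelta
import random

# Minimal GPX skeleton; real GPX files carry more metadata, but a track
# segment of timestamped points is the essential part.
GPX_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<gpx version="1.1" creator="sketch">
  <trk><trkseg>
{points}
  </trkseg></trk>
</gpx>"""

POINT = '    <trkpt lat="{lat:.6f}" lon="{lon:.6f}"><time>{time}</time></trkpt>'

def make_variant(base_points, day):
    """Render one GPX file dated `day` days after 2015-01-01,
    jittering the start time and each point slightly."""
    start = datetime(2015, 1, 1) + timedelta(days=day,
                                             minutes=random.randint(0, 120))
    lines = []
    for i, (lat, lon) in enumerate(base_points):
        lines.append(POINT.format(
            lat=lat + random.uniform(-1e-5, 1e-5),   # tiny route differences
            lon=lon + random.uniform(-1e-5, 1e-5),
            time=(start + timedelta(seconds=10 * i)).strftime(
                "%Y-%m-%dT%H:%M:%SZ"),
        ))
    return GPX_TEMPLATE.format(points="\n".join(lines))

# One file per day, as in Lied's description: 364 variants of one track.
# Placeholder coordinates, not an actual location of interest.
track = [(34.5000, 69.1000), (34.5005, 69.1005), (34.5010, 69.1010)]
variants = [make_variant(track, d) for d in range(364)]
```

Uploading batches of such files and then querying Flyby for each date is what lets the search "go back in time" across the whole 2015–2017 window.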
Lied constructed his runs near military bases in Afghanistan. And that’s where the problem came in.
“The leaderboard for one 600m stretch outside an airbase in Afghanistan, for instance, reveals the full names of more than 50 service members who were stationed there, and the date they ran that stretch,” wrote The Guardian. “One of the runners set his personal best on 20 January this year, meaning he is almost certainly still stationed there.” In other examples, most of the identified military personnel had since moved on.
The military, eager to see how fit its enlisted personnel are, promoted the use of wearable devices, but never anticipated this kind of data leak.
“Secretary Mattis has been very clear about not highlighting our capabilities to aid the enemy or give the enemy any advantage,” said Pentagon spokesman Army Col. Rob Manning, according to Army Times. “The secretary is aware [of the breach] and we are taking a look at our department-wide policies to determine if [they] need to be updated.”
“[The] DoD takes matters like these very seriously and is reviewing the situation to determine if any additional training or guidance is required, and if any additional policy must be developed to ensure the continued safety of DoD personnel at home and abroad,” Maj. Audricia Harris, a Pentagon press official, said in a statement to Military Times.
The fact is, on some level, the typical user already knew this was happening. Strava has a robust community where runners share their favorite routes and compare times. And it's not limited to Strava; other wearables and IoT devices freely post location data.
White House Cybersecurity Coordinator Rob Joyce wrote in a tweet: "Strava heatmap forces all to look at risks of big data analytics. It goes well beyond fitness trackers. Security and OPSEC need to be considered in our new reality. While policy evolution is needed, it is important to make good security policy balanced by not over reacting too."
Indeed, we’ve already seen how one IoT device, the personal drone, is managed using software. In creating a public policy for commercial drone use, the FAA created geo-fenced “no fly” zones. You cannot, for example, fly too close to an airport, nor can you fly too high. The software limits the user in some respects while permitting a vast number of other freedoms. The user is not necessarily aware this is happening.
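At its core, that kind of geofencing is just a distance-and-altitude check performed before the flight controller accepts a command. Here is a minimal sketch; the zone coordinates, radius, and 120 m ceiling are illustrative assumptions, not actual FAA parameters or any vendor's implementation.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Hypothetical no-fly zone: an 8 km radius around one airport,
# plus a 120 m altitude ceiling everywhere.
NO_FLY_ZONES = [{"lat": 37.6213, "lon": -122.3790, "radius_km": 8.0}]
MAX_ALT_M = 120.0

def flight_allowed(lat, lon, alt_m):
    """Return True only if the position is outside every zone and
    below the altitude ceiling."""
    if alt_m > MAX_ALT_M:
        return False
    return all(haversine_km(lat, lon, z["lat"], z["lon"]) > z["radius_km"]
               for z in NO_FLY_ZONES)
```

The point of the sketch is the asymmetry the article describes: the check runs silently in firmware, so the operator experiences only the refusal, not the policy behind it.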
Something similar exists within many wearable and home IoT devices. Customers can (but very often do not) choose how much personal data is collected or uploaded. And many privacy settings do not adequately explain the consequences of enabling or disabling a specific feature; for example, you might think you have turned off one feature only to find that another feature on the same device collects similar data in a different way.
In response, Strava CEO James Quarles wrote, “Many team members at Strava and in our community, including me, have family members in the armed forces. Please know that we are taking this matter seriously and understand our responsibility related to the data you share with us.
“Here’s what we are doing in response to what we’ve learned:
• We are committed to working with military and government officials to address potentially sensitive data
• We are reviewing features that were originally designed for athlete motivation and inspiration to ensure they cannot be compromised by people with bad intent
• We continue to increase awareness of our privacy and safety tools
• Our engineering and user-experience teams are simplifying our privacy and safety features to ensure you know how to control your own data.”
This is a great first step. Hopefully this incident prompts other IoT device manufacturers to simplify their privacy and safety features as well.
Robert Vamosi is a CISSP and Head of Corporate Content Strategy at Synopsys. He is the author of When Gadgets Betray Us: The Dark Side of our Infatuation with New Technologies and The Art of Invisibility (with Kevin Mitnick). He is also featured in the history-of-hacking documentary, Code2600. As an award-winning journalist, Vamosi has been writing about information security for more than 15 years for sites including Forbes.com, ZDNet, CNET, CBS News, PC World and Security Ledger.
Melissa Kirschner is Web Editor-in-Chief at Synopsys. She has been a writer, editor, and content strategist in high-tech for more than 15 years. Melissa is fascinated by the socio-cultural implications of the digital age. When not researching AI, autonomous driving, 5G, cryptography, and medical advancements, she enjoys reading the works of Neil Gaiman, watching dystopian dramas, and rescuing abused animals. Most of all, she likes to write things that people like to read.