Watch your step: Learn from the Strava Debacle

By TERESA NG | February 22, 2018


Two weeks ago, your Fitbit was outed as a national security threat. Strava, an app that allows users to share and compare the data from their fitness trackers, released a global heatmap aggregating users’ movements — including those of personnel in secretive overseas military bases. 

On Strava’s map, lines of activity from 2015 to 2017 are mapped out in glowing orange. 

In parts of the world like Afghanistan and Syria where user density should be nearly non-existent, small, neat patterns of light show signs of militaries at work. 

Nathan Ruser, an Australian college student, first drew attention to the map by tweeting screengrabs of patterns in these countries that could be associated with U.S., Turkish and Russian operating bases and patrols. 

Since then, security analysts and journalists everywhere have joined in a disturbing version of “Where’s Waldo?”, continually posting new discoveries and their implications on social media. 

Some even claim to have stumbled upon the locations of previously unknown military sites. But the map is a risk even for the many base locations that are already public knowledge. It shows where personnel live in these compounds, where activity is concentrated, and where key supply or patrol routes might be. 

Combined with other social media data, Strava can also be used to identify individuals who might be in sensitive occupations, as security analyst Tobias Schneider demonstrated when he identified 573 people who regularly jogged around British intelligence headquarters. 

Some say the solution is simple and should have been obvious from the start: Ban personnel from wearing Fitbits, or have companies monitor and censor sensitive information. 

However, the Strava map is only the latest instance of a larger problem that militaries have faced before and will continue to face.

Back in 2007, a geotagged photo of new helicopters posted on the internet by soldiers at a base in Iraq resulted in enemy forces being able to precisely locate and destroy four of the helicopters. 

The 2016 Pokémon Go craze posed another problem for the Pentagon, which worried that the data collected by the app could be used by foreign spies. 

In 2013 the U.S. Army distributed over 2,000 Fitbits to soldiers as part of a fitness program. Last week, as the news on Strava broke, the deputy secretary of defense announced himself “guilty” of using one too. 

Such technology has simply become ubiquitous in our lives. Our smartphones, smarthomes and smartwear silently collect information on our routines, locations, health and lifestyle through the many apps we may or may not know are tracking us. This makes any regulation on them difficult to craft or enforce, like cutting heads off a hydra. 

Of course, for a small group of top-tier secret operatives, renouncing the use of wearable technology is a necessary commitment for the job. But, for many of the rank-and-file, the situation is more ambiguous. Precisely because such technology is embedded in our lives, it could influence the mental health of military personnel on long postings in far-off and foreign environments. 

Strategist Lynette Nusbacher spoke on the importance of technology in keeping people grounded to life back home. 

“People on their third and fourth deployment are going to lose their minds or their marriages if they can’t use tech to simulate normalcy,” she said, according to Wired magazine.

The U.S. personnel in Iraq and Afghanistan who shared their run times for routes like “Base Perimeter” and “Sniper Alley” were probably doing just that. 

To complicate things further, even the companies themselves can’t fully know how the data they collect will be used in the future — especially in combination with data collected by other platforms. The continued rise of data collection and machine learning means that new data sets will keep becoming available, shedding unexpected light on older, once innocuous ones. 

The problem also isn’t just one of U.S. national security. UN peacekeepers, operatives from other countries and international aid workers could be compromised by this data too. It’s difficult to say how exactly U.S.-based multinational companies should deal with the many interests at play here, especially when these could come in conflict. 

The solution perhaps comes down to basic tech savviness and a reconsideration of the way data privacy consent currently works. The very fact that part of Strava’s initial response was to urge users to check their privacy settings is instructive: Data collection and storage tends to be opt-out rather than opt-in. And in instances like the Strava heatmap, opting out involves a long and circuitous series of steps, and it does not delete, or prevent the use of, data collected before the user opted out.

It’s easy to see why tech companies have an incentive to maximize data collection now and ask questions later. It may be time to figure out how to restructure these incentives so that consent between users and companies becomes more meaningful. 
