Cybersecurity

Your Fitness App Is a Surveillance Tool

A French newspaper recently tracked the real-time location of France's only aircraft carrier — the Charles de Gaulle, one of the most strategically sensitive military assets in Europe — using publicly available data from fitness apps. Crew members were running laps on the flight deck with Strava recording every step. The ship's position, course, and speed were visible to anyone who knew where to look.

This wasn't a hack. Nobody exploited a vulnerability or bypassed a security control. The data was public by default, shared willingly by users who didn't think about what their jogging route revealed about a nuclear-powered warship's deployment pattern. It's the same class of problem that exposed secret military bases in 2018 when Strava's global heatmap lit up running routes in Afghanistan and Syria. The lesson hasn't been learned.

The Problem With 'Harmless' Data

Location data from fitness apps seems innocuous in isolation. You went for a run. You cycled to work. You did laps at the pool. But aggregated over time, this data reveals patterns that are anything but harmless.

  • Home and work locations. Your most frequent start and end points are almost certainly your home and office. Any app that records your running routes knows where you live and work, even if you never explicitly provided this information.
  • Daily schedule. Run at 6 AM every weekday? The data shows when your house is empty. Travel for work on Tuesdays? Your absence pattern is recorded.
  • Sensitive locations. Military personnel, intelligence officers, government employees, corporate executives — anyone who works at or visits sensitive locations leaves a trail. Even a single workout at a classified facility reveals that the facility exists and that you have access to it.
  • Social connections. Group runs, shared routes, and segment leaderboards reveal who exercises together. In intelligence contexts, this can map social networks of individuals at sensitive installations.

The aircraft carrier case is dramatic, but the same principles apply to ordinary people. Stalkers have used fitness app data to locate victims. Burglars could identify when homes are unoccupied. Employers could monitor employee movements outside work hours. The data is rich, granular, and in many cases available to anyone with a free account.

How the Data Leaks

Fitness apps leak location data through multiple channels, some obvious and some subtle.

Public Activity Profiles

Many apps default to public profiles. Your runs, rides, and swims are visible to anyone, complete with GPS traces plotted on a map. Strava's defaults have improved over the years, but many users created accounts when defaults were more permissive and never revisited their settings. A quick search on Strava for activities near a military base, data center, or government building often reveals more than it should.

Heatmaps and Aggregated Data

Even when individual activities are private, aggregated data can reveal sensitive patterns. Strava's global heatmap — which shows the density of all recorded activities — famously outlined the floor plans of forward operating bases in conflict zones. The bases showed up as bright spots of activity in otherwise empty desert. Individual users were anonymous, but the aggregate pattern was unmistakable.

Segment Leaderboards

Strava segments — user-defined stretches of road or trail where users compete for the fastest time — create persistent location markers. A segment on an aircraft carrier flight deck identifies every user who has run that specific route. Even if individual profiles are private, the segment leaderboard shows usernames, times, and dates. Cross-referencing this with other public information can identify individuals.

API Access and Data Brokers

Fitness apps often share data with third parties through APIs, partnerships, and data broker relationships. Even if you set your profile to private, your data may flow to health insurance companies, advertisers, or data aggregators who combine it with other datasets. Location data from fitness apps has appeared in commercial databases sold to law enforcement, intelligence agencies, and private investigators.

OPSEC Lessons for Regular Developers

If you're building applications that handle location data — or any data that could reveal sensitive patterns — the fitness app failures offer concrete lessons.

Default to Private

This sounds obvious, but most fitness apps violated it for years because public profiles drove engagement and growth. If your app collects location data, the default should be private with users explicitly opting in to sharing. Not 'public with an option to go private buried in settings' — actually private by default.

// Bad: default to public, user must opt out
const userSettings = {
  profileVisibility: 'public',
  activityVisibility: 'public',
  showOnHeatmap: true,
  shareWithPartners: true
};

// Good: default to private, user must opt in
const userSettings = {
  profileVisibility: 'private',
  activityVisibility: 'private',
  showOnHeatmap: false,
  shareWithPartners: false
};

// Better: explain what each setting actually means,
// including what changing it would expose
const privacySettings = {
  profileVisibility: {
    value: 'private',
    description: 'Only people you approve can see your profile',
    riskIfPublic: 'Your name and activity history are visible to anyone'
  },
  activityVisibility: {
    value: 'private',
    description: 'Only you can see your activities',
    riskIfPublic: 'GPS routes show where you live, work, and travel'
  }
};

Implement Privacy Zones

Strava eventually added 'privacy zones' — areas around sensitive locations where GPS data is hidden. This is a good feature that was implemented years too late. If your app records location data, let users define areas where recording is suppressed or data is fuzzed. Most users will set this around their home and workplace, which eliminates the most sensitive patterns.

Implementation detail: fuzzing the start/end of a route by a random offset isn't sufficient if the route pattern is distinctive. Someone who runs the same 5K loop every day produces a recognizable shape even with endpoints obscured. Better approaches: trim the first and last N meters of the route, or snap endpoints to a grid to prevent exact location inference.
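The trimming and snapping ideas can be sketched in a few lines. This is illustrative only: the function names, the flat-earth distance approximation (adequate at the hundreds-of-meters scale involved), and the 0.01-degree cell size are all assumptions, not any real app's API.

```javascript
const EARTH_RADIUS_M = 6371000;

// Approximate distance between two [lat, lng] points in meters,
// using an equirectangular projection (fine for short distances).
function distanceMeters([lat1, lng1], [lat2, lng2]) {
  const toRad = (d) => (d * Math.PI) / 180;
  const x = toRad(lng2 - lng1) * Math.cos(toRad((lat1 + lat2) / 2));
  const y = toRad(lat2 - lat1);
  return Math.sqrt(x * x + y * y) * EARTH_RADIUS_M;
}

// Drop every point within radiusM of the zone center, rather than
// fuzzing endpoints: the stored route never enters the zone at all.
function applyPrivacyZone(points, zoneCenter, radiusM) {
  return points.filter((p) => distanceMeters(p, zoneCenter) > radiusM);
}

// Snap a point to a coarse grid (0.01 degrees is roughly 1 km of
// latitude) so a stored endpoint can't be inverted to an exact address.
function snapToGrid([lat, lng], cellDeg = 0.01) {
  const snap = (v) => Math.round(v / cellDeg) * cellDeg;
  return [snap(lat), snap(lng)];
}
```

Dropping points outright is deliberately lossy: unlike a random offset, it leaves nothing inside the zone for an attacker to average back out.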

Think About Aggregation Attacks

Individual data points may be harmless. Aggregate patterns may not be. When you build features that combine or visualize data across users — heatmaps, leaderboards, trending locations — think about what patterns emerge. A heatmap that highlights a secret facility is an aggregation attack, even if no individual user's data is exposed.

The defense: set minimum thresholds for aggregation (don't show heatmap data for areas with fewer than N unique users), exclude sensitive geographic areas, and review visualizations for unexpected pattern disclosure before releasing them. Differential privacy techniques — adding calibrated noise to aggregated queries — can help, though they're complex to implement correctly.
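The minimum-threshold defense amounts to k-anonymity at the tile level: publish a heatmap cell only if enough distinct users contributed to it. A minimal sketch, assuming activities have already been bucketed into coarse geo-tiles; the tile ids, the threshold value, and the function name are invented for illustration.

```javascript
const MIN_UNIQUE_USERS = 20; // below this, the tile is suppressed entirely

// activities: array of { userId, tile } where tile is a coarse geo-cell id
function buildHeatmap(activities) {
  const tiles = new Map();
  for (const { userId, tile } of activities) {
    if (!tiles.has(tile)) tiles.set(tile, { users: new Set(), count: 0 });
    const t = tiles.get(tile);
    t.users.add(userId);
    t.count += 1;
  }
  const published = {};
  for (const [tile, t] of tiles) {
    // Tiles with too few unique contributors are dropped, not dimmed:
    // a lone runner on a remote base never lights up the map at all.
    if (t.users.size >= MIN_UNIQUE_USERS) published[tile] = t.count;
  }
  return published;
}
```

Note the threshold counts unique users, not activities: one soldier running two hundred laps must not be able to cross it alone.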

Audit Your Data Sharing

Every API endpoint that serves location data, every data export feature, every third-party integration — these are all channels through which sensitive data can leak. Map your data flows. Know exactly where user location data goes, who can access it, and under what conditions. If you share data with partners or advertisers, be explicit with users about what you share and with whom.
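One concrete mechanism for controlling what leaves through an API is an explicit field allowlist on every external-facing serializer, so a new column added to the activity model stays withheld from partners unless someone deliberately exports it. A sketch with invented field names, not any real schema:

```javascript
// Only these fields ever reach partner integrations; everything else
// on the activity record is dropped by construction.
const PARTNER_FIELDS = ['activityType', 'durationS', 'totalDistanceM'];

function serializeForPartner(activity) {
  const out = {};
  for (const field of PARTNER_FIELDS) {
    if (field in activity) out[field] = activity[field];
  }
  return out; // gpsTrace, startPoint, etc. never appear, even if added later
}
```

The inverse pattern — a denylist of fields to strip — fails open: the first engineer to add a `homeLocation` column leaks it to every partner by default.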

The Bigger Picture: Data Minimization

The fundamental question is whether your application needs the data it collects. Does a running app need GPS coordinates at one-second intervals? For route visualization, yes. For distance and pace calculation, a lower resolution would suffice. For calorie estimation, you only need total distance and elevation change.
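That per-feature reasoning can be made explicit in code: reduce the raw trace to what each feature actually needs before anything is stored. A sketch assuming 1 Hz samples that carry cumulative distance and elevation; the sample shape, feature names, and sampling intervals are illustrative assumptions.

```javascript
// trace: array of { cumDistM, eleM }, one sample per second
function minimizeForStorage(trace, feature) {
  if (feature === 'route_map') {
    return trace; // the only feature that needs full resolution
  }
  if (feature === 'pace') {
    // one sample every 10 seconds is plenty for distance and pace
    return trace.filter((_, i) => i % 10 === 0);
  }
  if (feature === 'calories') {
    // only totals survive; the route itself is never stored
    let gain = 0;
    for (let i = 1; i < trace.length; i++) {
      gain += Math.max(0, trace[i].eleM - trace[i - 1].eleM);
    }
    return {
      totalDistanceM: trace[trace.length - 1].cumDistM - trace[0].cumDistM,
      elevationGainM: gain
    };
  }
  throw new Error(`unknown feature: ${feature}`);
}
```

The point of routing every write through a function like this is that the decision "do we really need the full trace?" gets made once, explicitly, per feature — not implicitly by whoever writes the next INSERT.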

Data minimization — collecting only the data you need for the specific feature the user is using — is the strongest protection against data leaks. Data you don't collect can't be breached, subpoenaed, sold, or aggregated into surveillance patterns. This conflicts with the growth-oriented instinct to collect everything and figure out uses later, but it's the only approach that's truly robust.

The best way to protect user data isn't better encryption or stricter access controls. It's not having the data in the first place.

The fitness app industry learned this lesson the hard way — and some companies still haven't learned it. Military personnel are still using Strava on aircraft carriers. Intelligence officers are still logging runs near classified facilities. The data is still flowing to servers, APIs, and data brokers who can piece together patterns that no individual user intended to reveal. As developers, we can either build systems that protect users from these risks by default, or we can build systems that exploit their inattention. The aircraft carrier's crew didn't make a security decision when they opened their running app. The app's developers did, years earlier, when they chose the default settings.