In 2019, Mike Yeagley, a tech expert who had spent years working on government projects, started sharing a major concern with anyone in Washington, D.C.’s national security circles who would listen: the U.S. had a big problem with Grindr, the dating app. Grindr, which helps people find potential partners nearby using the GPS in their phones, had become incredibly popular since its launch, turning into a significant part of gay culture worldwide. Yeagley, however, saw Grindr as a potential risk, not because of the app’s purpose but because it, like many other apps, was sharing loads of user data without most people realizing it.
Yeagley put together a presentation to show that Grindr’s data could pose a national security risk. He explained how advertisers could get the real-time locations of Grindr users through real-time bidding, the automated auction process that underpins most digital advertising. Whenever an ad slot loads in an app or on a website, a complex behind-the-scenes process decides which ad to show you, drawing on lots of data, including where you are at that moment. Yeagley was able to use this data to identify government employees who might be using Grindr, by tracking where their phones were during the workday and where they went afterwards. His point wasn’t to get anyone in trouble; instead, he wanted to show how easily people’s movements and personal lives could be tracked through seemingly harmless apps.
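The kind of cross-reference Yeagley demonstrated can be sketched in a few lines: take location pings harvested from the bid stream and flag devices that both use a particular app and spend working hours inside a sensitive building’s bounding box. This is a purely illustrative sketch; all names, coordinates, and the data shape are invented for the example.

```python
# Illustrative sketch of Yeagley's cross-reference: find advertising IDs
# whose bid requests come from a given app AND whose daytime pings fall
# inside a sensitive building's bounding box. All data here is invented.
def flag_devices(pings, app_of_interest, box):
    """pings: list of (ad_id, app, lat, lon, hour_of_day).
    box: (lat_min, lat_max, lon_min, lon_max) around the building."""
    lat0, lat1, lon0, lon1 = box
    # Devices seen inside the building during working hours.
    in_building = {ad for ad, app, lat, lon, h in pings
                   if lat0 <= lat <= lat1 and lon0 <= lon <= lon1
                   and 9 <= h < 17}
    # Devices whose ad traffic comes from the app of interest.
    uses_app = {ad for ad, app, lat, lon, h in pings
                if app == app_of_interest}
    return in_building & uses_app
```

Note that neither dataset on its own is especially sensitive; it is the intersection that becomes revealing.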
The data from ads can reveal a lot about where we go and what we do, and it’s incredibly valuable not just for advertisers but potentially for governments too. Yeagley was highlighting a new kind of intelligence gathering, dubbed “ADINT,” where data from the ad tech industry is used for spying. This was a big shift from traditional spying methods, which had been disrupted by new technology and privacy concerns.
To make his point clearer, Yeagley used the example of how data collection works when you use an app like the Weather Channel. When you decide to check the weather by tapping the Weather Channel app on your phone, you unknowingly kick off a whirlwind of digital activity aimed at showing you a customized advertisement. This process starts with something called an advertising exchange. Imagine it as a huge digital marketplace where countless phones and computers tell a central system whenever they have a spot available for an ad.
Almost instantly, the Weather Channel app sends a bundle of information to this ad exchange. This includes your phone’s IP address, its operating system and version, your mobile service provider, and even the specific setup of your phone, like its screen resolution. The most crucial bits of information shared, though, are your phone’s exact GPS location and a special advertising ID that Google assigned to your phone, which works like a pseudonym for your device (Apple phones have an equivalent identifier).
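The bundle of information sent to the exchange can be pictured as a structured request. The sketch below is loosely modeled on the shape of an OpenRTB bid request, but the field names and values are simplified assumptions for illustration, not the actual spec or the Weather Channel’s real traffic.

```python
# Illustrative sketch of the kind of bid request an app sends to an ad
# exchange (loosely modeled on OpenRTB; fields simplified and invented).
bid_request = {
    "id": "auction-7f3a",                  # unique ID for this auction
    "app": {"name": "WeatherExample"},     # hypothetical app name
    "device": {
        "ip": "203.0.113.42",              # phone's IP address
        "os": "Android",
        "osv": "13",                       # operating system version
        "carrier": "ExampleCell",          # mobile service provider
        "w": 1080, "h": 2400,              # screen resolution
        "geo": {"lat": 38.8977, "lon": -77.0365},  # exact GPS fix
        "ifa": "cdda802e-fb9c-47ad-9866-0794d394c912",  # advertising ID
    },
}

def fields_shared(req):
    """The identifying fields every bidder receives, win or lose."""
    d = req["device"]
    return {"ip": d["ip"], "ad_id": d["ifa"], "location": d["geo"]}
```

The point of the sketch is that the GPS fix and the advertising ID travel together in the same request, which is what makes the data stream so revealing.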
To the average person, this advertising ID just looks like a random mix of letters and numbers. But for advertisers, it’s like finding treasure. They can figure out, for example, that a specific ID belongs to someone with a Google Pixel phone who uses the Nike Run Club app, often visits a runners’ website, and seems really interested in buying a particular type of running shoes. They know all this because companies like Nike, Google, and various websites share information in this giant advertising network, all to better understand and target potential customers.
Advertisers take this detailed info to decide how and where to show their ads. For instance, if Nike and Brooks (another shoe brand) both want to market to women interested in running, they use all this data to create an “audience” – a big list of these advertising IDs thought to be in the market for running shoes. They then participate in super-fast, automated auctions to decide who gets to show their ad to this audience whenever someone loads an app or a webpage.
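The audience-plus-auction mechanics described above can be sketched as a simplified second-price auction, a common rule in programmatic advertising where the winner pays the runner-up’s bid. The advertiser names, bids, and audience contents below are invented for illustration.

```python
# Minimal sketch of an audience-targeted second-price ad auction.
# Advertisers, bids, and audience membership are invented examples.
def run_auction(ad_id, bids):
    """bids: list of (advertiser, audience_set, max_bid).
    Returns (winner, price); price is the runner-up's bid (second-price rule),
    or the winner's own bid if no one else was eligible."""
    eligible = sorted(
        ((bid, name) for name, audience, bid in bids if ad_id in audience),
        reverse=True,
    )
    if not eligible:
        return None, 0.0
    winner = eligible[0][1]
    price = eligible[1][0] if len(eligible) > 1 else eligible[0][0]
    return winner, price

# A hypothetical "in the market for running shoes" audience of ad IDs.
runners = {"cdda802e", "9b1deb4d"}
bids = [("Nike", runners, 2.50), ("Brooks", runners, 2.10)]
```

Real exchanges resolve auctions like this millions of times per second, once for every ad slot that loads.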
There are supposed to be some rules to protect people’s information. For example, users can change their advertising ID (though not many people do or even know they can), and they can restrict what information apps can access, like their GPS location. However, not all apps follow these rules as strictly as they should, and the ad exchanges don’t always check very carefully who’s buying and using this data.
Even the companies that don’t win the ad space auction still end up seeing all the data shared during the auction. This has led to a whole business model of collecting this data from auctions, repackaging it, and selling it to help other businesses understand consumer habits better.
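The business model built on losing bidders can be sketched very simply: a participant on the exchange logs every request it sees, keyed by advertising ID, and the result is a movement history it never had to win an auction to obtain. This is an illustrative sketch under assumed data shapes, not any company’s actual pipeline.

```python
# Sketch of bid-stream harvesting: a bidder on the exchange records every
# request it sees, whether or not it wins (or even places) a bid.
from collections import defaultdict

class BidStreamCollector:
    def __init__(self):
        # advertising ID -> list of (timestamp, lat, lon) pings
        self.pings = defaultdict(list)

    def on_bid_request(self, ad_id, timestamp, lat, lon):
        # Simply observing the auction is enough to capture this.
        self.pings[ad_id].append((timestamp, lat, lon))

    def track(self, ad_id):
        """Reconstruct a device's movement history from auction traffic."""
        return sorted(self.pings[ad_id])
```

Aggregated across millions of auctions a day, logs like this are what get repackaged and resold.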
The most prized piece of data from all this is the location information. The business of tracking where phones go has become worth billions of dollars. This geolocation data isn’t just for showing people ads for nearby restaurants; it’s used to track consumer habits, decide where to open new stores, and even to figure out if a store’s customer traffic is going up or down, which can affect a lot of business decisions.
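Why location data is the prized commodity becomes clear with a toy pattern-of-life analysis: the place a device sits overnight is a plausible home, and its most common weekday-daytime spot a plausible workplace. The heuristic and thresholds below are assumptions chosen for illustration.

```python
# Toy pattern-of-life sketch: infer likely home and work locations from
# timestamped pings. The hour cutoffs are illustrative assumptions.
from collections import Counter

def likely_home_and_work(pings):
    """pings: list of (hour_of_day, location) pairs, where location is any
    hashable key (e.g., coordinates rounded to a grid so nearby fixes
    compare equal)."""
    night = Counter(loc for hour, loc in pings if hour < 6 or hour >= 22)
    day = Counter(loc for hour, loc in pings if 9 <= hour < 17)
    home = night.most_common(1)[0][0] if night else None
    work = day.most_common(1)[0][0] if day else None
    return home, work
```

The same counting trick, applied to pings inside a store’s footprint, is how the industry measures whether customer traffic is rising or falling.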
Yeagley’s work was about warning people, especially in the government, about these risks while also pointing out the opportunities this kind of data could offer for intelligence work. He wasn’t just criticizing; he was showing that in the world of national security and intelligence, understanding and using data could be a powerful tool.
The Journey
In 2015, Mike Yeagley joined PlaceIQ, a pioneering company in the location data industry, driven by Duncan McCall’s vision to harness the vast amount of geospatial data generated by consumers. McCall’s inspiration came from an adventurous drive across the Western Sahara, where he realized the potential of GPS technology for navigation without traditional guides. PlaceIQ initially utilized data from Flickr but soon shifted to sourcing information from mobile ad exchanges, marking the beginning of a profitable business model.
Yeagley’s recruitment followed an investment from In-Q-Tel, the CIA’s venture arm, underscoring the intelligence community’s interest in geospatial data for understanding movements and interactions of individuals. PlaceIQ’s expertise in analyzing geographic data caught the CIA’s attention for its potential to reveal patterns such as covert meetings.
Working at PlaceIQ, Yeagley realized the significant value their data could offer to government agencies, beyond the company’s software solutions. Seeking to explore this further, he moved to PlanetRisk, a firm that operated both in the corporate and government contracting spheres. PlanetRisk aimed to help clients assess risks associated with different global locations by analyzing data on crime, civil unrest, and extreme weather.
PlanetRisk’s innovative use of commercial data for intelligence purposes became evident in a project that tracked mobile devices in Aleppo, Syria. This project demonstrated the unexpected availability and utility of mobile data even in war zones, offering insights into refugee movements. The data, sourced from companies like UberMedia, revealed the potential of advertising data for surveillance and intelligence beyond commercial advertising purposes.
The Locomotive project at PlanetRisk, named by Yeagley’s daughter, epitomized this new approach to intelligence gathering. With a modest budget from Pentagon research funds, Locomotive delved into the movements of individuals and devices across the globe, uncovering patterns of interest to national security. Notably, the project could track the movements of Vladimir Putin’s entourage, indicating the feasibility of monitoring world leaders through commercially available data.
However, the project also exposed vulnerabilities, such as the potential for foreign governments to acquire sensitive data about U.S. military operations. The realization that advertising data could reveal the locations of special operations units underscored the double-edged nature of this intelligence tool.
As the development of Locomotive advanced, Yeagley felt that PlanetRisk’s approach was off the mark. Instead of creating a data visualization tool for the government, he believed it was better to supply the raw data directly, allowing government analysts to work with it as they saw fit. Yeagley’s vision was not about selling software licenses but providing valuable data for a straightforward fee.
Ultimately, Yeagley and PlanetRisk went their separate ways, leading him to Aelius Exploitation Technologies. There, he aimed to evolve Locomotive into a fully fledged government tool for the Joint Special Operations Command, known for its pivotal role in counterterrorism operations, including the elimination of Osama bin Laden and Abu Musab al-Zarqawi, and the dismantling of ISIS.
The program, rebranded as VISR (Virtual Intelligence, Surveillance, and Reconnaissance), became a shared resource within the U.S. intelligence community, generating valuable leads. Its applications were broad, including a period when the FBI explored its use in domestic criminal investigations, though they later withdrew from the program. Despite this, VISR, like other adtech data products, found a keen user in the Department of Homeland Security, which utilized it for various purposes, from locating border tunnels to tracking unauthorized immigrants.
Concerns have been raised about the privacy implications of using such data, with a government inspector general calling for a halt until more stringent privacy safeguards are established. However, the Department of Homeland Security argued for its continued use, citing its critical role in filling investigative gaps.
VISR is just one among many tools harnessing adtech data for intelligence purposes, with companies worldwide developing similar technologies. Some have even integrated the ability to deliver malware through targeted ads, showcasing the dual-use nature of these technologies.
The widespread availability of detailed movement data from smartphones highlights a new privacy concern far beyond what many might expect. Not only can intelligence agencies access this data, but so can foreign governments, private investigators, and even journalists. This raises alarms about the extent of privacy individuals can realistically expect in the digital age, underscoring the ease with which personal movements and potentially sensitive activities can be tracked and analyzed.
This article is based on the following article:
https://www.wired.com/story/how-pentagon-learned-targeted-ads-to-find-targets-and-vladimir-putin/
Background Information
The following aspects provide a comprehensive backdrop to the article’s exploration of how Grindr’s data-sharing practices, and those of similar technologies, intersect with broader concerns about privacy, national security, and the ethical use of digital data. They shed light on the evolving landscape where digital privacy, social networking, and national security converge, raising critical questions about the boundaries of technology’s role in society.
1. Grindr and Its Significance
- Grindr: An app launched in 2009, Grindr is a location-based social networking and online dating application for gay, bi, trans, and queer people. It uses the GPS functionality on mobile devices to allow users to see other users’ profiles based on their geographical proximity.
- Cultural Impact: Beyond its functionality as a dating app, Grindr has played a significant role in LGBTQ+ culture and communities, facilitating connections in environments where such interactions might be difficult or risky offline.
2. Location Data and Privacy Concerns
- Location Data: Smartphones collect and share data about the user’s geographical location through GPS, Wi-Fi, and cellular networks. This data is invaluable for many apps and services but also raises privacy concerns.
- Privacy Concerns: The sharing of location data with third parties (e.g., advertisers, data brokers) can lead to unintended privacy invasions, allowing for detailed tracking of individuals’ movements and behaviors.
3. Ad Tech Industry and Real-Time Bidding (RTB)
- Ad Tech Industry: The technology and services that enable digital advertising, connecting advertisers with potential audiences through complex digital platforms and exchanges.
- Real-Time Bidding (RTB): A process where advertising inventory is bought and sold on a per-impression basis, via programmatic instantaneous auction, similar to financial markets.
4. The Concept of “ADINT”
- ADINT: A portmanteau of “advertising” and “intelligence,” referring to the use of collected advertising data (such as location, browsing habits, app usage) for surveillance and intelligence purposes, a practice that diverges from traditional espionage methods.
5. National Security Implications
- Risks to Individuals: The potential for foreign entities or malicious actors to exploit data for blackmail, espionage, or to uncover sensitive personnel locations.
- Strategic Risks: The aggregate data could reveal patterns or behaviors of government employees, military operations, or sensitive installations, posing a broader national security threat.
6. Regulations and Ethical Considerations
- Regulatory Framework: There are laws and regulations intended to protect personal data and privacy (e.g., GDPR in the European Union, CCPA in California). However, the enforcement and applicability, especially in the context of national security and international data flows, can be complex.
- Ethical Concerns: The ethical debate centers on the balance between the benefits of technology and data utilization (for security, convenience, economic growth) versus the rights to privacy and the potential for abuse.
Suggested Debate/Essay Questions
- Does the benefit of using ad tech data for intelligence and national security outweigh the potential risks to personal privacy and data protection?
- Can technology companies be trusted to self-regulate when it comes to protecting user data, or is government intervention necessary?
- Should users have more control over their digital data, and if so, how can this be effectively implemented?