Pokémon Go Players Helped Build a New AI—And They Had No Idea
Players scanned PokéStops; Niantic collected data for its next big AI project.
Remember Pokémon Go?
Back in 2016, it seemed like everyone was wandering the streets, phones in hand, trying to catch virtual creatures superimposed onto the real world. It was a viral phenomenon that got people outside, exploring their neighborhoods and experiencing augmented reality on a massive scale.
But what if I told you that those innocent days of Pikachu hunting might have contributed to something far more complex—and potentially concerning?
Earlier this month, Niantic, the company behind Pokémon Go, announced that it has been using data collected from its millions of players to build something called a "Large Geospatial Model" (LGM). As Ryan Broderick reported in his Garbage Day newsletter, Niantic's LGM is essentially a Large Language Model, but for visualizing and mapping physical space. The model builds on the company's Visual Positioning System (VPS), and Niantic plans to use it for future augmented reality products and robotics.
On the surface, this sounds like a natural progression for a company that made its name blending the digital and physical worlds. But not everyone is thrilled about the potential implications. Elise Thomas, an analyst at the Institute for Strategic Dialogue, pointed out a darker side of this development. "It's so incredibly 2020s coded that Pokémon Go is being used to build an AI system which will almost inevitably end up being used by automated weapons systems to kill people," she wrote on X.
That might sound alarmist at first, but let's look deeper. Niantic's LGM aims to help computers perceive, comprehend, and navigate the physical world in a way that mimics human spatial understanding. By training AI models on millions of geolocated images collected from players, the technology could be used in fields beyond gaming—including robotics and autonomous systems.
According to a report by Emanuel Maiberg at 404 Media, Niantic's Senior Vice President of Engineering, Brian McClendon, acknowledged that governments and militaries could be interested in purchasing this technology. When asked whether he could see militaries using LGMs, McClendon responded, "I could definitely see it... If the use case is specific in military and adding amplitude to war then that's obviously an issue."
This raises significant ethical questions. Did players realize that by scanning PokéStops and participating in AR mapping tasks, they were contributing to a dataset that could potentially be used for military purposes? As Maiberg noted, "Players of the incredibly viral Pokémon Go had no way of knowing that when they downloaded the game in 2016 that it would one day fuel this type of AI product."
Niantic has responded by stating that the scanning feature is completely optional and that "merely walking around playing our games does not train an AI model." They also emphasized that the LGM is an early-stage project and that they plan to tackle any questions responsibly and thoughtfully.
But the fact remains that massive amounts of data—collected from players who were likely more interested in catching a rare Charizard than contributing to an AI model—are now being used in ways that might extend far beyond gaming. This isn't just about Pokémon anymore; it's about the commodification of user-generated data and the unforeseen ways it can be repurposed.
In an era where data is the new oil, companies are racing to build AI models powered by vast datasets, often without explicit informed consent from the people who generated that data. We've seen this with text and images scraped from the internet to train language and image models. Now, with geospatial data, the stakes could be even higher.
Perhaps it's time for a broader conversation about data ethics and transparency. When we participate in seemingly innocuous activities like playing a mobile game, we deserve to know how our data might be used—not just now, but in the future.
So the next time you're tempted to scan that PokéStop or try out a new AR feature, it might be worth considering: What are you really contributing to? And are you okay with where that data might end up?
Recommended reading:
- A piece that brilliantly satirizes a lot of the post-election takes, "Kamala Harris Lost Because I Was Right."
- A piece that sounds the alarm on the right's push to end abortion "exceptions" designed to save women's lives.
- A piece that puts a price tag on Donald Trump's plan to eject 15,000 trans servicemembers from the military.