A system developed in Italy can recognize people with 95.5% accuracy using only Wi-Fi waves. But its potential for mass surveillance raises serious concerns

La Sapienza University of Rome has just opened a technological Pandora’s box. Their new WhoFi system can re-identify people with 95.5% accuracy using only the Wi-Fi waves that already fill virtually every modern space. It needs no cameras, doesn’t require you to carry any device, and works completely in secret.

The researchers present it as a revolution for convenience and privacy. But a deeper analysis reveals a much more complex and potentially disturbing landscape.

How It Works: The Invisible Digital Fingerprint

WhoFi exploits the fact that each human body interacts uniquely with electromagnetic waves. When a Wi-Fi signal propagates through an environment, its waveform is altered by the presence and physical characteristics of objects and people along its path.

The technology analyzes something called Channel State Information (CSI), which captures how Wi-Fi signals deform when passing through different materials and structures. Unlike optical systems that perceive only the outer surface of a person, Wi-Fi signals interact with internal structures, such as bones, organs, and body composition.
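To make the idea of CSI concrete, here is a deliberately simplified toy model (not the paper’s actual pipeline, and all numbers are invented): CSI reports one complex gain per Wi-Fi subcarrier, and a body in the signal path perturbs each subcarrier’s amplitude and phase slightly differently.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subcarriers = 52  # typical subcarrier count for a 20 MHz 802.11n channel

# Baseline channel of an empty room: one complex gain per subcarrier.
empty_room = rng.normal(size=n_subcarriers) + 1j * rng.normal(size=n_subcarriers)

# A person in the path perturbs each subcarrier; here that is modeled
# as a small, body-specific complex distortion added to the channel.
body_effect = 0.3 * (rng.normal(size=n_subcarriers) + 1j * rng.normal(size=n_subcarriers))
with_person = empty_room + body_effect

# The raw features a system like WhoFi would learn from: the amplitude
# and phase of the CSI vector, both of which shift when a body is present.
amplitude = np.abs(with_person)
phase = np.angle(with_person)
print(amplitude.shape, phase.shape)  # (52,) (52,)
```

In the real system these per-subcarrier amplitude and phase sequences, collected over time, form the input that the neural network encodes into a person-specific signature.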

The system uses advanced artificial intelligence, specifically Transformer architectures (the same technology behind ChatGPT), to learn and recognize these unique patterns. In tests with 14 subjects, the Transformer-based model achieved 95.5% accuracy for the Rank-1 metric and an mAP score of 88.4%.
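Rank-1 accuracy and mAP are the standard evaluation metrics for person re-identification. As a minimal sketch (illustrative only, not the authors’ code), here is how both can be computed from query and gallery embeddings using cosine similarity:

```python
import numpy as np

def rank1_and_map(query_emb, query_ids, gallery_emb, gallery_ids):
    """Rank-1 accuracy and mean Average Precision for re-identification."""
    # Normalize rows so a dot product equals cosine similarity.
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    g = gallery_emb / np.linalg.norm(gallery_emb, axis=1, keepdims=True)
    sims = q @ g.T  # shape: (n_query, n_gallery)

    gallery_ids = np.asarray(gallery_ids)
    rank1_hits, ap_scores = [], []
    for i, qid in enumerate(query_ids):
        order = np.argsort(-sims[i])          # gallery sorted best match first
        matches = gallery_ids[order] == qid
        rank1_hits.append(matches[0])         # is the top match the right person?
        # Average precision: precision at each rank where a true match appears.
        hit_ranks = np.flatnonzero(matches)
        precisions = (np.arange(len(hit_ranks)) + 1) / (hit_ranks + 1)
        ap_scores.append(precisions.mean())
    return float(np.mean(rank1_hits)), float(np.mean(ap_scores))

# Toy check: two identities whose embeddings cluster cleanly.
queries = np.array([[0.9, 0.1], [0.1, 0.9]])
gallery = np.array([[1.0, 0.0], [0.0, 1.0]])
print(rank1_and_map(queries, [0, 1], gallery, [0, 1]))  # (1.0, 1.0)
```

Rank-1 asks only whether the single best gallery match is correct, while mAP also rewards ranking *all* of a person’s gallery samples near the top, which is why the two figures reported in the study differ.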

The Promises: A World Without Cameras

The developers paint a seductive future:

Smart homes that recognize you instantly, automatically adjusting temperature, music, and lighting according to your personal preferences.

Discreet security in airports and public spaces, where authorities can identify persons of interest without visible facial recognition systems.

Personalized commerce where stores offer customized experiences without the invasiveness of facial recognition cameras.

Continuous and non-intrusive medical monitoring in hospitals and nursing homes.

The researchers emphasize that, unlike cameras, WhoFi “doesn’t record images” and therefore preserves visual privacy. It sounds perfect. Too perfect.

The Reality: Hidden Dangers

1. Invisible and Omnipresent Surveillance

WhoFi’s biggest risk isn’t what it does, but what you can’t see it doing. Unlike security cameras, which are visible and can be avoided, this technology is completely invisible.

Imagine a world where every public Wi-Fi router becomes a surveillance point. Shopping malls, airports, cafes, libraries, universities: all could track you without you having the slightest idea it’s happening.

There are no flashing red lights, no lenses pointing at you, no signs saying “area under surveillance.” Just invisible waves creating a biometric profile of you while you simply exist in space.

2. The Myth of “Privacy Preserved”

The developers insist that WhoFi is “privacy-respecting” because it doesn’t capture images. This claim is technically correct but fundamentally misleading.

Permanent biometric data: Your Wi-Fi signature, based on your bone structure and body composition, is as immutable as your fingerprints. Once a system has “learned” you, it can recognize you forever.

Cross-platform tracking: Different organizations could share Wi-Fi signature databases, creating a tracking system that transcends locations and contexts. Your signature captured in a shopping mall could be used to identify you at a political protest.

Sensitive inferences: Although the system doesn’t “see” your appearance, it can infer sensitive medical information. Changes in your Wi-Fi signature could reveal pregnancies, weight loss, degenerative diseases, or even the use of implanted medical devices.

3. Massive Scalability

The current study used specialized equipment, but researchers are already working to make any commercial router capable of running this technology. This means:

Immediate mass implementation: No new infrastructure needed. Millions of existing Wi-Fi points could be updated with a simple software change.

Minimal costs: Unlike installing facial recognition cameras, which requires specialized hardware and is expensive, WhoFi could be deployed massively at almost zero cost.

Stealthy adoption: Companies and governments could implement this technology without public announcements, terms of service updates, or consent processes.

4. Absence of Regulatory Framework

Currently, there’s no specific legislation regulating this type of invisible biometric identification. While facial recognition faces increasing regulatory scrutiny, WhoFi operates in a legal vacuum.

Insufficient GDPR: Although the European General Data Protection Regulation covers biometric data, it was designed with fingerprints and facial recognition in mind. Its applicability to electromagnetic signatures remains uncertain.

Impossible consent: How do you give informed consent for something you don’t know is happening? How can you opt out of a system that’s invisible?

5. Potential for Government Abuse

In the hands of authoritarian regimes, WhoFi would represent the perfect surveillance tool:

Tracking dissidents: Protesters, journalists, and activists could be tracked across multiple locations without their knowledge.

Social control: Governments could monitor movement patterns, social associations, and behaviors of entire populations.

Selective repression: The ability to identify specific individuals in crowds without visible technological means would facilitate targeted arrests.

6. Security Vulnerabilities

Like any connected system, WhoFi would be vulnerable to:

Hacking: Criminals could access biometric signature databases for stalking or crime planning.

Spoofing: Techniques to falsify Wi-Fi signatures could be used to frame innocent people.

Biometric identity theft: Unlike passwords, you can’t “change” your electromagnetic signature if it’s compromised.

Concerning Use Cases

Algorithmic Discrimination

If WhoFi can infer physical or medical characteristics, it could facilitate automated discrimination. People with certain health conditions could be subtly excluded from spaces or services.

Extreme Commercial Surveillance

Retailers could create detailed profiles of shopping behavior, visit frequency, and movement patterns. This information could be used for dynamic price manipulation or exclusion of certain demographic groups.

Corporate Control

Companies could monitor employees invisibly, tracking how much time they spend in different areas, who they interact with, and even infer health states that could affect hiring or promotion decisions.

The Innovation Paradox

WhoFi exemplifies a common paradox in modern technology: an innovation that promises to solve privacy problems while potentially creating much larger ones.

The researchers have created something technically impressive. Their scientific work is solid and their intentions seem genuine. But as we’ve learned with social media, GPS, and artificial intelligence, the unintended consequences of powerful technologies can be more significant than their original benefits.

Unanswered Questions

Before WhoFi is commercialized, society needs answers to critical questions:

  • Who will regulate this technology and how?
  • How will we ensure informed consent?
  • What safeguards will exist against government abuse?
  • How will we protect Wi-Fi signatures as sensitive biometric data?
  • Is there a “right to be anonymous” in public spaces?

The Future Awaiting Us

WhoFi isn’t just another technological innovation. It’s potentially the final nail in the coffin of public anonymity. In a world where every step you take can be tracked by your phone, every purchase recorded by cards, and every online search monitored, WhoFi could eliminate the last refuge: the ability to move physically through the world without being identified.

The developers are right about one thing: WhoFi could make surveillance cameras obsolete. Not because it’s more privacy-respecting, but because it’s far more invasive in ways we can’t even see.

The question isn’t whether this technology will be developed and deployed. The question is whether, as a society, we’ll be prepared for the consequences of living in a world where physical invisibility becomes technological impossibility.

WhoFi’s future isn’t in our routers. It’s in our regulatory decisions today.


The study “WhoFi: Deep Person Re-Identification via Wi-Fi Channel Signal Encoding” is available on arXiv. The researchers did not respond to requests for comments on the privacy implications of their technology.
