Surveillance versus Telemetry

From OMXUS

Surveillance versus Telemetry is a conceptual distinction in technology ethics that differentiates between data collection systems based on their power structures, transparency, and benefit flows rather than on the act of data collection itself. The distinction is central to the OMXUS project's approach to identity and safety systems.

The core claim is that the ethics of data collection are determined not by what is collected, but by who controls it, who can see it, and who benefits from it. Under this framework, the same data—location, activity patterns, social connections—can be either oppressive or liberating depending on its architectural context.

Definition

Surveillance

Surveillance refers to data collection systems characterised by:

  • Asymmetry: One party observes while the other is observed, with no reciprocal visibility
  • Opacity: The observed party does not know what is collected, when, or how it is used
  • Extractive value flow: Benefits accrue to the observer, not the observed
  • Weaponisability: Collected data can be used against the observed party (targeting, discrimination, punishment, coercion)
  • Power imbalance: The architecture creates dependency and enables control

Historical examples include state surveillance programs, corporate data harvesting for advertising, and social credit systems where individuals cannot access or contest their own records.

Telemetry for Humans

Telemetry for humans refers to data collection systems characterised by:

  • Symmetry: All participants operate under identical rules and can see the same information
  • Transparency: Participants know exactly what is collected, why, and how it is used
  • Reciprocal value flow: Benefits return to the data subject (insights, earnings, safety)
  • Non-weaponisability: Architectural constraints prevent data from being used for punishment or coercion
  • Power balance: Participants own and control their own data

The term telemetry is borrowed from engineering and medicine, where it denotes the remote measurement and transmission of data for the benefit of the system being measured (a spacecraft, a patient). The phrase "for humans" emphasises that the data subject is the intended beneficiary.

The distinction framework

  Dimension            | Surveillance                  | Telemetry for Humans
  ---------------------|-------------------------------|-----------------------------------
  Who sees             | Observer only                 | Everyone equally, including self
  What's visible       | Hidden from subject           | Fully transparent to subject
  Value direction      | Extracted from subject        | Returned to subject
  Potential for harm   | Weaponisable against subject  | Architecturally non-weaponisable
  Power relationship   | Asymmetric (watcher/watched)  | Symmetric (mutual visibility)
  Psychological effect | Paranoia, self-censorship     | Confidence, mutual accountability

The Shared Equal and Transparent test

A practical test for distinguishing surveillance from telemetry asks four questions about any data collection system:

  1. What data is collected?
  2. Why is it collected?
  3. How is it used?
  4. Who benefits from it?

If the answers to these questions are:

  • Knowable by all participants
  • Identical for all participants
  • Verifiable by all participants

...the system passes the test and qualifies as telemetry rather than surveillance. If any answers differ between observer and observed, or are hidden from the observed party, the system constitutes surveillance.

This test is related to but distinct from informed consent frameworks. A system could obtain consent while still being surveillance (if the consenting party cannot verify claims about data use), and a system could be telemetry without individual consent (if participation is universal and symmetric).
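The test's conditions can be sketched as a small check over a hypothetical data model, in which a system is described by the answers each participant can see to the four questions. Verifiability is an audit property and is out of scope for this sketch; only the knowable and identical conditions are coded, and all names are illustrative rather than OMXUS APIs.

```python
# Sketch of the Shared Equal and Transparent test over a hypothetical
# data model: each participant maps to the answers they can see for
# the four questions. Illustrative only, not an OMXUS interface.

QUESTIONS = ("what", "why", "how", "who_benefits")

def passes_set_test(answers_by_participant: dict) -> bool:
    """Telemetry if every participant sees complete, identical answers."""
    views = list(answers_by_participant.values())
    if not views:
        return False  # no participants, nothing to verify
    # Knowable: every participant has a non-empty answer to every question.
    for view in views:
        if any(not view.get(q) for q in QUESTIONS):
            return False
    # Identical: all participants see exactly the same answers.
    return all(view == views[0] for view in views)
```

A system where the observed party's view is empty, or differs from the observer's, fails on the first or second condition respectively.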

Philosophical foundations

Veil of ignorance

The distinction draws on John Rawls' concept of the veil of ignorance: a thought experiment asking what social arrangements people would accept if they did not know their position within them.[1]

Applied to data systems:

  • Surveillance fails the test: A rational person would not accept a system where they might be the watched party with no visibility into who watches them or how data is used
  • Telemetry passes the test: A rational person could accept a system where everyone is measured if everyone benefits equally and sees the same information

Panopticon inversion

The philosopher Michel Foucault analysed the Panopticon—Jeremy Bentham's prison design in which inmates cannot tell whether they are being watched—as a model for modern surveillance society. The uncertainty of observation produces self-discipline and conformity.[2]

Telemetry for humans represents a structural inversion: rather than many being watched by few (with the many unable to see the watchers), everyone watches everyone equally, including themselves. The asymmetry that produces Foucauldian discipline is eliminated.

Sousveillance

The concept relates to sousveillance ("watching from below"), coined by Steve Mann to describe the recording of authority figures by citizens.[3] Telemetry for humans extends this by making all observation bidirectional and transparent, rather than merely inverting the observer/observed relationship.

Architectural requirements

For a system to qualify as telemetry for humans rather than surveillance, certain architectural properties must be enforced technically rather than merely promised by policy:

User ownership

Data subjects must have cryptographic control over their own data. In the OMXUS implementation, this is achieved through the Human Existence Record (HER), a self-sovereign identity document signed by the user's own keys.
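The self-signing idea can be sketched as follows. This is an illustrative stand-in, not the OMXUS implementation: real self-sovereign identity would use asymmetric signatures (e.g. Ed25519), whereas this sketch uses HMAC-SHA256 so it runs with only the standard library, and the record fields are hypothetical.

```python
import hashlib
import hmac
import json

# Sketch: a user-controlled record signed with the user's own key.
# HMAC-SHA256 stands in for a real asymmetric signature scheme;
# the record structure is invented for illustration.

def sign_record(user_key: bytes, record: dict) -> dict:
    payload = json.dumps(record, sort_keys=True).encode()
    sig = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return {"record": record, "signature": sig}

def verify_record(user_key: bytes, signed: dict) -> bool:
    payload = json.dumps(signed["record"], sort_keys=True).encode()
    expected = hmac.new(user_key, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signed["signature"])
```

The point of the sketch is the control relationship: only the holder of the key can produce a valid signature over the record, so custody of the record rests with the data subject rather than an operator.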

Purpose limitation

Data must be usable only for specified beneficial purposes, with technical constraints preventing repurposing. This differs from policy-based purpose limitation (which can be violated) in that the architecture itself makes violations impossible or detectable.

Benefit sharing

Value generated from data must flow back to data subjects. In OMXUS, this includes:

  • Personal insights derived from one's own patterns
  • Revenue share from verification services (40% to vouchers, 30% to token holders)
  • Safety benefits from the community response network
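The fee split above can be illustrated with simple arithmetic. The 40% and 30% figures come from this article; the destination of the remaining 30% is not specified here, so the sketch leaves it as an unallocated remainder rather than guessing.

```python
# Illustrative arithmetic for the verification-fee split described above.
# 40% to vouchers and 30% to token holders per the text; the residual
# 30% is left unallocated because its destination is not specified here.

def split_verification_fee(fee: float) -> dict:
    vouchers = round(fee * 0.40, 2)
    token_holders = round(fee * 0.30, 2)
    remainder = round(fee - vouchers - token_holders, 2)
    return {
        "vouchers": vouchers,
        "token_holders": token_holders,
        "unallocated": remainder,
    }
```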

Transparency

All data collection must be visible to the subject in real time. This requires not only disclosure but legibility—the subject must be able to understand what is collected, not merely be notified in incomprehensible terms.

Symmetry

All participants must operate under identical rules. No participant may have access to data or capabilities unavailable to others. In OMXUS, this extends to the system operators themselves—there is no administrative backdoor that allows asymmetric observation.
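One way to make the "identical rules" property concrete is to route every request—operators included—through a single access-check function with no role distinctions. This is a hypothetical sketch, not the OMXUS access model; the field names are invented, and the baseline safety fields reflect the article's note that participants choose what to share beyond baseline safety data.

```python
# Sketch: symmetric access control with no privileged roles.
# Every participant, including system operators, is checked by the
# same rule. Field names are hypothetical.

BASELINE_SAFETY_FIELDS = {"proximity", "alert_status"}

def can_view(requester: str, subject: str, field: str) -> bool:
    # Identical rule for everyone: baseline safety fields are
    # mutually visible; all other fields are visible only to the
    # data subject themself. There is no operator branch, so no
    # administrative backdoor can exist in this rule.
    if field in BASELINE_SAFETY_FIELDS:
        return True
    return requester == subject
```

Because the function takes no role argument, an operator's request is evaluated exactly as any participant's would be, which is the architectural (rather than policy-based) sense of symmetry described above.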

Applications

Safety networks

Traditional approach (surveillance): Police cameras monitor "high-crime areas," footage accessible only to authorities, used for post-hoc investigation and prosecution.

Telemetry approach: All participants' proximity data visible to nearby community members, enabling 60-second mutual aid response. The same location data that would enable tracking under surveillance instead enables assistance under telemetry.

Identity systems

Traditional approach (surveillance): Government databases track citizens, accessible to authorities, used for verification and control. Citizens cannot access, audit, or contest their records.

Telemetry approach: Self-sovereign identity records controlled by the individual, auditable by the individual, monetised by the individual through verification services.

Economic systems

Traditional approach (surveillance): Corporate tracking of consumer behaviour for advertising profit. Data subjects receive no value; their attention and behaviour are the product sold to third parties.

Telemetry approach: Contribution metrics visible to the contributor and the community, generating reputation and revenue share. The data subject is the beneficiary rather than the product.

Criticisms and limitations

Transparency is not sufficient

Critics argue that making surveillance transparent does not eliminate its harms. If everyone can see everything, the result may be universal conformity pressure rather than liberation. The distinction requires not only transparency but also non-weaponisability—technical guarantees that data cannot be used for punishment or coercion.

Privacy as a value

Some privacy theorists argue that the ability to hide is itself valuable, and that universal transparency—even if symmetric—eliminates spaces necessary for personal development, dissent, and intimacy.[4]

The telemetry framework responds that selective privacy remains possible (participants choose what to share beyond baseline safety data), and that asymmetric hiding (where powerful actors hide while surveilling others) is worse than symmetric visibility.

Implementation difficulty

Achieving true architectural non-weaponisability is technically challenging. Systems that claim to be telemetry may have hidden asymmetries, backdoors, or emergent surveillance properties. Verification requires not only code audits but ongoing monitoring of system behaviour.

Power differentials persist

Even with symmetric data access, pre-existing power differentials may allow some participants to extract more value from shared data than others. A corporation with analytical resources may benefit more from transparent data than an individual, even if both have equal access.

References

  1. Rawls, John. A Theory of Justice. Harvard University Press, 1971.
  2. Foucault, Michel. Discipline and Punish: The Birth of the Prison. Pantheon Books, 1977.
  3. Mann, Steve. "Sousveillance: Inverse Surveillance in Multimedia Imaging." ACM Multimedia, 2004.
  4. Nissenbaum, Helen. Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford University Press, 2009.

Further reading

  • Zuboff, Shoshana. The Age of Surveillance Capitalism. PublicAffairs, 2019.
  • Schneier, Bruce. Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. W. W. Norton, 2015.
  • Mann, Steve, and Ferenbok, Joseph. "New Media and the Power Politics of Sousveillance in a Surveillance-Dominated World." Surveillance & Society, 2013.