The misleading name, metaphor defiance, and awesome potential of "personal data" — part 3 of 3


In the first post of this series I asserted that data is data. In other words, it's not like anything else. The second post explored and dismissed conceptualizations of data-as-property and data-as-labour. This, the last post in this series, explores data-as-reputation, data-as-public-good, and data-as-me, and then points to some architectural principles for a new direction — interpersonal data.

The problem with the way we frame the opportunity and the problem

Data-as-reputation

Rachel Botsman discusses reputation scoring in her book What's Mine Is Yours (check your library), and summarizes the opportunity in a later magazine article:

Imagine a world where banks take into account your online reputation alongside traditional credit ratings to determine your loan; where headhunters hire you based on the expertise you've demonstrated on online forums such as Quora; where your status from renting a house through Airbnb helps you become a trusted car renter on WhipCar; where your feedback on eBay can be used to get a head-start selling on Etsy; where traditional business cards are replaced by profiles of your digital trustworthiness, updated in real-time. Where reputation data becomes the window into how we behave, what motivates us, how our peers view us and ultimately whether we can or can't be trusted.

Welcome to the reputation economy, where your online history becomes more powerful than your credit history.

... It's the culmination of many layers of reputation you build in different places that genuinely reflect who you are as a person and figuring out exactly how that carries value in a variety of contexts.

The most basic level is verification of your true identity -- is this person a real person? Are they who they say they are?

Rachel Botsman, Wired Magazine

There is nothing to dislike about the advantages touched upon here. Unfortunately, like most things in life, the upsides come with downsides.

Programmatic quantification of reputation is an evil because of the unavoidable self-moderation and modulation it inflicts on its subjects, beyond anything that might be argued to be 'good for society'. To be clear, the social accretion of local, contextually relevant reputation, with forgiving opportunities for reparation, has served communities for millennia. What we are considering here, however, is universal, non-contextual, irremediable scoring and algorithmic assessment. Mix in a naïve conceptualization of identity, as implied by the final paragraph of the quote above, and we will have excluded from society anyone for whom identification is a question of personal safety, and shackled the very psychological change processes we all experience and rely upon.

Context is a keyword. I want to know whether Alice can be trusted to drive me safely from A to B. I couldn't care less whether she's up to date on her mortgage payments, or has been dropping litter (I do care about that generally, but not in this context), or has been 'dutifully' supportive of the current government, which of course would be an abhorrent context here and one we must ensure we don't engender accidentally. The introduction to Botsman's latest book, Who Can You Trust? (check your library), describes one such emerging system, in China, as Orwellian, and warns that we're creating "reputation trails where one mistake or misdemeanour could follow us potentially for the rest of our lives."

Some may think the GDPR limits such poor outcomes, but the regulation does not, for example, give anyone the right to an explanation of automated decision-making; and when algorithms feed other algorithms in complicated sequence, with dashes of machine learning and its inevitable biases thrown in for good measure, a black box can be just as opaque to its operators as to those on whom it operates.

We change ourselves, consciously and subconsciously, when we know we're the subject of a gaze — witness the 'selfie generation', poor bastards. And if only a small coterie gets to set the associated algorithms, there is massive potential for the abuse of power.

PEN America, an organization that exists to protect open expression in literature and related arts, reported in 2013 that writers were engaging in self-censorship directly attributable to their concerns about government surveillance, a system that inevitably 'scores' its subjects. Approximately a quarter of the writers surveyed had curtailed or avoided social media and had deliberately avoided certain topics in phone and email conversations, and that's before the current President's Orwellian attacks on the media.

If you resist censorship, it seems you must also resist reputation scoring.

Fortunately, we're not actually interested in reputation per se, only as a proxy for trustworthiness and accountability. With some forethought we can engineer our way around these challenges and avoid a Nosedive. I suspect the architecture will resemble the past in terms of our having to re-engineer appropriate contextual and social frictions.
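
To make 'contextual' concrete for the more technically minded, here is a minimal sketch in Python of what context-scoped trust attestations might look like. Everything in it (the Attestation structure, the context labels, the query) is a hypothetical illustration of the principle, not a description of any existing system.

```python
from dataclasses import dataclass

@dataclass
class Attestation:
    """One peer's experience of another, valid only within a named context."""
    subject: str    # who the attestation is about, e.g. "alice"
    context: str    # e.g. "driving", "lending", "littering"
    positive: bool

def trust_in_context(attestations: list[Attestation], subject: str, context: str) -> float | None:
    """Share of positive attestations for `subject`, counted within one context only.

    Returns None when there is no relevant history: no universal score is ever
    computed, and experiences from other contexts simply do not carry over.
    """
    relevant = [a for a in attestations if a.subject == subject and a.context == context]
    if not relevant:
        return None
    return sum(a.positive for a in relevant) / len(relevant)

# I want to know whether Alice will drive me safely; her mortgage is not my business.
history = [
    Attestation("alice", "driving", True),
    Attestation("alice", "driving", True),
    Attestation("alice", "lending", False),
]
print(trust_in_context(history, "alice", "driving"))    # 1.0
print(trust_in_context(history, "alice", "gardening"))  # None: no basis yet, not a low score
```

The point of the sketch is the absence of any universal number: if there is no history in the context I care about, the answer is 'no basis for trust yet', not a score borrowed from an unrelated part of someone's life.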

Data-as-public-good

The title of Mariana Mazzucato's latest book strikes at the heart of my focus here — The Value of Everything: Making and Taking in the Global Economy. The book revisits her previous work, noting how much of the technology underpinning the success of the dominant digital companies today was actually funded by the state, e.g. the Internet, GPS, the touchscreen, voice activation. She then expands on her grievance:

Facebook's and Google's business models are built on the commodification of personal data, transforming through the alchemy of a two-sided market our friendships, interests, beliefs, and preferences into sellable propositions.

Mariana Mazzucato, The Value of Everything: Making and Taking in the Global Economy, (check your library)

Mazzucato believes the public should own its own data, not merely with the possibility of selling it to the tech giants, but also to allow the public to have more collective agency over its collective application for public benefit. She is not a technologist so does not expand on the technical architecture that might be required beyond namechecking big data and artificial intelligence. As an economist she is considerably more cogent in criticising the application of neoclassical economics to personal data, the reduction of value to a market price, the market's inability to distinguish between value creation and value extraction, and a dedication to the latter leading ultimately to value destruction.

As Mazzucato concludes in an accompanying article:

Only by thinking about digital platforms as collective creations can we construct a new model that offers something of real value, driven by public purpose. We're never far from a media story that stirs up a debate about the need to regulate tech companies, which creates a sense that there's a war between their interests and those of national governments. We need to move beyond this narrative. The digital economy must be subject to the needs of all sides; it's a partnership of equals where regulators should have the confidence to be market shapers and value creators.

Mariana Mazzucato, Technology Review

Viktor Mayer-Schönberger advocates a "progressive data-sharing mandate".

... regulators wanting to ensure competitive markets should mandate the sharing of data. To this end, economists Jens Prüfer and Christoph Schottmüller offer an intriguing idea. They suggest that large players using feedback data [essential to machine learning] must share such data (stripping it of obvious personal identifiers, and stringently ensuring that privacy is not being unduly compromised) with their competitors.

... Building on this idea, we suggest what we term a progressive data-sharing mandate. It would kick in once a company's market share reaches an initial threshold --- say, 10 percent. It would then have to share a randomly chosen portion of its feedback data with every other player in the same market that requests it.

Viktor Mayer-Schönberger, Reinventing Capitalism in the Age of Big Data (check your library)
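
For the technically inclined, the mechanics of that sampling step can be sketched, very tentatively, in a few lines of Python. The 10 percent threshold comes from the quote; the record structure, the identifier-stripping, and the fraction shared are placeholders of my own, not anything proposed by the authors.

```python
import random

SHARE_THRESHOLD = 0.10   # the mandate kicks in at, say, 10% market share (per the quote)
SHARE_FRACTION = 0.05    # hypothetical: the portion of feedback data to be shared

def strip_identifiers(record: dict) -> dict:
    """Drop obvious personal identifiers before sharing (illustrative only)."""
    return {k: v for k, v in record.items() if k not in {"user_id", "name", "email"}}

def mandated_share(feedback: list[dict], market_share: float,
                   rng: random.Random | None = None) -> list[dict]:
    """Return the randomly chosen, de-identified portion a dominant firm must share.

    Firms below the threshold share nothing; above it, a random sample of
    feedback records is released to any competitor that requests it.
    """
    if market_share < SHARE_THRESHOLD or not feedback:
        return []
    rng = rng or random.Random()
    sample_size = max(1, int(len(feedback) * SHARE_FRACTION))
    return [strip_identifiers(r) for r in rng.sample(feedback, sample_size)]
```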

Academics such as Mazzucato and Mayer-Schönberger see a deeper and perhaps higher value in personal data than they believe the market can realise.

Similarly, in Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy (check your library), Professors Ariel Ezrachi and Maurice Stucke argue that today's dominant players are in fact now freed from the market's invisible hand. Indeed, one Silicon Valley businessman, Peter Thiel, considers this an aspirational outcome when he spouts that "competition is for losers."

On reading a draft of this post, Elizabeth Renieris noted that the role of regulatory frameworks, such as antitrust laws, shouldn't be under-appreciated; they are core to Viktor Mayer-Schönberger's outlook. In a subsequent conversation we noted that antitrust principles are founded on ensuring consumer benefit through competition. They typically focus on acts of monopolizing behaviour rather than monopolies per se, where there is an actual or potential consumer detriment. Many could argue, however, that the main centralizing, data-hoarding companies have provided consumer benefit rather than detriment, dismissing in the process mere 'philosophical' concerns such as those raised here. It seems the application of such regulatory frameworks will first require their revision, as is indeed under review in Europe — this 2016 Franco-German study on competition law and data, for example.

In conclusion, I share the academics' concerns, but at the end of the day market regulators can only regulate markets, and markets are dedicated to property ownership. The second post in this series already dismissed data-as-property.


Data-as-me

In 2005, Professor Luciano Floridi pointed out that personal-data-as-property completely fails to resolve some very real privacy concerns: information contamination, for example, where our privacy is invaded by junk mail and loud, intrusive chatter; or public contexts in which privacy norms still exist without any concept of ownership — the right not to have the contents of your packed lunch logged, for example, even though you eat it in plain sight. And then there is perhaps the most fundamental flaw in the data-as-property view of the world, as noted in my previous post — it's non-rivalrous.

Floridi offers up an alternative:

Informational privacy requires radical re-interpretation, one that takes into account the essentially informational nature of human beings and of their operations as informational social agents. Such re-interpretation is achieved by considering each person as constituted by his or her information, and hence by understanding a breach of one's informational privacy as a form of aggression towards one's personal identity.

Professor Luciano Floridi

After all, we call a type of privacy transgression identity theft. He observes that digital technologies must then be developed to allow individuals the opportunity:

... to design, shape and maintain their identities as informational agents. ... [and that] collecting, storing, reproducing, manipulating etc. one's information amounts now to stages in stealing, cloning or breeding someone else's personal identity.

... one's informational sphere and one's personal identity are co-referential, or two sides of the same coin. 'You are your information', so anything done to your information is done to you, not to your belongings.

This is an exciting vista. Nevertheless, it requires some modification in the wei spirit (see the first post in this series) to encourage the emergence of collective intelligence and our anti-rivalrous flourishing.

How might we realise this value while respecting personal dignity and agency? Indeed while respecting personal privacy, because what agency can anyone be said to have if they cannot maintain personally desired and contextually-appropriate privacy?

With some hope of bringing this three-post series to a conclusion, I can at least aspire to identify some qualifying architectural principles.

Interpersonal data architectural principles

Humans, not data subjects

You and I are not mere data subjects. Data is now integral to what it means to be human.

All human beings are born free and equal in dignity and rights

This is the first part of the first article of the Universal Declaration of Human Rights, and harks back to Thomas Jefferson's immortal declaration. It must be a core principle here of course.

Interpersonal, not personal

This principle was presented in the first post of this series, so I won't repeat the explanation here.

Edge-centric, not node-centric

If you accept the reframing in terms of interpersonal data, then architecturally the locus of our attention is shifted from the nodes to the edges.
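
A toy sketch, purely illustrative and assuming nothing about any particular implementation, may help picture the shift: the interpersonal datum is attached to the relationship between two people rather than filed away in either person's silo.

```python
from collections import defaultdict

# Node-centric: each person accumulates a private pile of attributes.
node_centric = {
    "alice": {"likes": ["cycling"], "trusted_drivers": ["bob"]},
}

# Edge-centric: the datum lives on the relationship itself, co-authored by and
# visible to both parties it connects. Keys are unordered pairs of people.
edge_centric: dict[frozenset, list[dict]] = defaultdict(list)

def record_between(a: str, b: str, datum: dict) -> None:
    """Attach an interpersonal datum to the edge joining a and b."""
    edge_centric[frozenset((a, b))].append(datum)

record_between("alice", "bob", {"shared_ride": "2019-03-01", "went_well": True})

# The ride is neither Alice's data nor Bob's data; it is data of the relationship.
print(edge_centric[frozenset(("bob", "alice"))])
```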

Data is data

No pre-existing conceptualization or metaphor can be ported to our needs here, and definitely not one based on a property paradigm. So wave goodbye to words such as my in the context of data ownership. But not I, me and we in combination — or indeed wei.

Agency, not control

Agency refers not to the intentions people have in doing things but to their capability of doing those things in the first place. ... To be able to 'act otherwise' means being able to intervene in the world, or to refrain from such intervention, with the effect of influencing a specific process or state of affairs.

Anthony Giddens

Agency entails a negotiation in and with the world that the word control appears to deny.

Sociologists appreciate that agency is unevenly distributed. Subscribing to a humanist emancipatory ideal dedicated to human flourishing on top of a natural self-interest in nurturing resilient living systems, one might want for digital technologies and services to expand agency on the whole, attenuate pre-digital constraints, and potentially spread it around a little more evenly.

Support posthumanism

Human beings are, of course, prioritized here over other person types (e.g. corporates), and it is then worth considering what we mean by human being in the digital age, specifically our digital augmentation as touched upon in the first post when defining person. Norbert Wiener anticipated this informational, cyber extension of ourselves way back in 1950!

... where a man's word goes, and where his power of perception goes, to that point his control and in a sense his physical existence is extended. To see and to give commands to the whole world is almost the same as being everywhere.

Norbert Wiener, The Human Use of Human Beings

Two thirds of a century later it seems almost timid to (re)define the human as biological, psychological, social, informational (i.e. 'personal data', per Floridi, and interpersonal data), and interfacial (the application of information for the self-sovereign sense-making of information, per the hi:project).

In terms of the three levels of detachment referred to at the top of the first post in this series, Floridi observes that information technologies enhance the corporeal membrane, empower the cognitive membrane, and extend the consciousness membrane, but I would go further. The informational and interfacial aspects of the human being may weaken the individualizing effects of Floridi's three membranes by casting new complementary wei-shaped ones. Once again, we're surveying a landscape first identified decades ago:

Gregory Bateson has clearly shown that what he calls the 'ecology of ideas' cannot be contained within the domain of the psychology of the individual, but organizes itself into systems or 'minds', the boundaries of which no longer coincide with the participant individuals.

Felix Guattari, The Three Ecologies

The concept of mind may be considered a consequence of our being social animals rather than just what's in here [taps skull]. Conceptualizations of cognition encompass the very digitally-enabled information flows that are our focus here, enabling embedded, embodied, extended, distributed and collective cognition.

Differences, not absolutes

Given Bateson's definition of information as a difference that makes a difference, it would seem a little unnecessary to include this architectural principle. Nevertheless, it's worth noting the broader potential value of focusing on difference in the positive rather than negative (i.e. A minus B) sense.

We're all human. We're also all now on one worldwide network, and we need to keep that human too. Nothing is more human than our differences — not only from each other, but from our former selves, even from moment to moment and context to context.

Doc Searls

[This new era] ushers in the concept that difference is what all of us have in common. That identity is not fixed but malleable. That technology is not separate but part of the body. That dependence, not individual independence, is the rule.

... Form follows dysfunction.

Lennard J. Davis

Rhizomes, not trees

Deleuze and Guattari use the word rhizome to describe theory that allows for multiple, non-hierarchical entry and exit points in data representation and interpretation. They define it negatively in good part as not hierarchical, not tree-like. This is an important assertion given how frequently tree-like metaphors pepper our sciences, computing, and social constructions.

... unlike trees or their roots, the rhizome connects any point to any other point, and its traits are not necessarily linked to traits of the same nature ... The rhizome is reducible to neither the One nor the multiple. ... It is composed not of units but of dimensions, or rather directions in motion. It has neither beginning nor end, but always a middle (milieu) from which it grows and which it overspills. ... Unlike a structure, which is defined by a set of points and positions ... the rhizome is made only of lines. ... In contrast to centered (even polycentric) systems with hierarchical modes of communication and pre-established paths, the rhizome is an acentered, nonhierarchical, non-signifying system without a General and without an organizing memory or central automaton, defined solely by a circulation of states.

Deleuze and Guattari
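
For readers who think in data structures, the contrast can be rendered as a toy sketch (illustrative only): a tree has a single root and exactly one path to each node, whereas a rhizome-like graph has no root, no privileged entry point, and connects any point to any other.

```python
# Tree: hierarchy, a single root, exactly one path from the root to any node.
tree = {
    "root": ["branch_a", "branch_b"],
    "branch_a": ["leaf_1"],
    "branch_b": ["leaf_2"],
}

# Rhizome-like graph: acentred, no root, any point may connect to any other,
# and you can enter or leave the structure at any node.
rhizome = {
    "a": {"b", "d"},
    "b": {"a", "c", "d"},
    "c": {"b"},
    "d": {"a", "b"},
}

def reachable(graph: dict[str, set], start: str) -> set:
    """Walk the graph from any entry point; no node is privileged over another."""
    seen, frontier = set(), [start]
    while frontier:
        node = frontier.pop()
        if node not in seen:
            seen.add(node)
            frontier.extend(graph.get(node, set()))
    return seen

print(reachable(rhizome, "c"))  # every node, whichever point we enter from
```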

Cache, not facsimile

A cache is a store of data solely for the purpose of serving future requests for that data faster. Caching is integral to IPFS for example. On the other hand, I use facsimile here in the sense of a mere copy of lesser import than the master record. You may have a facsimile of your bank account transactions for example, but you would never expect your bank to refer to your copy of events.
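
A minimal sketch, using nothing beyond the Python standard library, may help fix the distinction: the cache exists solely to answer repeat requests faster and can be thrown away at any time, while a facsimile is just a copy that nobody treats as the record of events.

```python
import time

class Cache:
    """A store of data whose sole purpose is to serve repeat requests faster.

    Entries expire and can be evicted at any moment; losing the cache costs
    nothing but speed, because it was never the master record.
    """

    def __init__(self, ttl_seconds: float = 60.0):
        self.ttl = ttl_seconds
        self._entries: dict[str, tuple[float, object]] = {}

    def get(self, key: str, fetch):
        """Return the cached value if still fresh, otherwise fetch it again."""
        now = time.monotonic()
        hit = self._entries.get(key)
        if hit and now - hit[0] < self.ttl:
            return hit[1]
        value = fetch()                      # the authoritative source answers
        self._entries[key] = (now, value)
        return value

# A facsimile, by contrast, is a static copy of lesser import than the master:
facsimile = ["txn-001", "txn-002"]   # your copy of the bank's transaction log
# The bank will never consult this copy to settle a dispute; it is not master.
```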

Note that the data is rendered into information — into valuable, actionable insight — when data flows in combination. Flow does not require static facsimile. Combination does not require isolated facsimile. Moreover, facsimile will by definition never be master. I contend then that the concept generally known as personal data store (also vault or wallet) is redundant before it's even made its way into the world. Rather, we need insight into where interpersonal data is flowing and for what purposes and with what consequences, entailing an interface into and onto the permissions and flows.

Ultimately, if we get this right, your bank's cloud (should they continue to maintain one, and should banks continue to be necessary to banking) will be the facsimile.

Information is simply complex

We are dealing with a phenomenon here that appears atomistically simple but in fact forms, informs and infects everything. It's the original complex. Identity is information. Relationships are information. Reputation is information. Exchange is information. Organizing is information. Life is information. We should then proceed with due caution — ethically and technically. Ethically, we need to take appropriate time for due diligence; yet, equally, we cannot delay and allow the data-as-property protagonists time to establish the mother of all Nash equilibria.

Please get in touch

A human being's subjectivity (our biases when it comes to perceiving and assessing reality) may continue to emphasise itself, by which I mean over-emphasise its self, but the rate at which information technologies transform the interconnecting and interdepending is considerably greater than in the context of the self alone — because that's what information technologies do.

We are each conscious as evolution has rendered sufficient complexity for such phenomena to arise. The complexity in which we are bound may not (yet) be conscious, but it is nevertheless inseparable. As we co-engineer sufficient complexity in the flow of interpersonal data, we may develop the ecology of mind required for a step change in wei's intelligence, and perhaps its consciousness. Our co-development of regenerative planetary systems may depend on it, and so the survival of our species and others.

Are you of a like or dislike mind? Please do get in touch.


This post, the final in a series of three, has been shared on the AKASHA dapp, and is also available on my personal blog and Medium.

To the point made in the first post, I cannot thank a thousand interlocutors, and their thousand interlocutors. But I can thank those who commented on the drafting of these posts: Tony Fish, Elizabeth Renieris, Mihai Alisie, Martin Etzrodt, Andrei Sambra, Adrian Gropper, Christina Bowen, Roy Brooks, John Grant, Jim Pasquale, Sergio Maldonado, Marc Lauritsen, Joel Doerfel, Rui Vale