Let’s start this second part with getting a bit more clarity on why this approach is relevant for you.
It was said that digital self-determination puts data subjects and their communities up front in a new approach to controlling data access and use. They might be better thought of as the ‘self’ component in DSD. Simply put, when looking at what has come to be known as personal data, there needs to be some relationship between people and their data. For DSD, this sort of data comes from people and takes form in the communication of messages to others – even to machines and technology. These messages naturally vary depending on their context and purpose: they could range from a request to buy a train ticket to a proposal of marriage, and they reveal different degrees of personal information. Why? Because they say something about where we want to travel, or who we might want to spend the rest of our lives with. The communication of personal data from its original source (the data subject) to others can create data communities, connected through bonds of trust and shared values which these messages might confirm or challenge. So, the ‘self’ is not just you and me; it is people communicating their data to others they want to message. And in doing this, the data has an original purpose that is often lost when data is used and reused.
Data communication in data communities happens somewhere. Data is communicated and exchanged in what we call digital spaces – where, through digital technology, people can transact data for a commercial purpose, like purchasing online, or for more personal reasons, such as when we take a blood test and the results are processed via technology and passed on to our doctor. These digital spaces might also be seen as data spaces (digital/data spaces) where digital tech helps transact data across the space. Whatever we use digital spaces for, if our data is conveyed there, we want to trust that space and how technology facilitates data communication. If we suspect that a digital space is not safe, because in that space our data may be used for purposes we never intended, then we will not trust it. How many of you have been victims of scams in digital spaces which you thought you could trust, but which turned out to be unsafe because you didn’t have enough information about data use, or had no control over the eventual processing of that data? That is where determination comes in. DSD works to provide data subjects (you and me) with the information necessary, and with genuine opportunities, to choose our digital spaces and the conditions that make them safe for us, and to decide what happens with our data if it might be used in ways we did not understand or intend.
Now, having cleared that up, let’s move on to the reasons for reading Blog 2. Many data protection regimes you might be aware of are designed to deter poor data practices, or to identify and sanction those who break the rules. Even those approaches that rely on the data rights of individuals hope that these rights will encourage compliance with the principles on which they are based, because failing to comply brings bad outcomes for bad practice. Looked at in these ways, data protection requires data subjects to rely on regulations or laws, and on the intervention of external institutions or agencies, to see that their data is safe. DSD takes a different approach.
Central to the notion of self-determination is the understanding that individuals can have an important role in controlling their data. Self-determination calls on you to do it for yourself and not wait on others to do it for you. This type of thinking could be seen as looking at data protection positively, and it requires that players or stakeholders in a digital/data space cooperate toward that outcome. By positively we mean that rather than sanctioning data breaches, DSD offers a best practice framework that rests on respectful and responsible data exchange.
Now you might be thinking that data subjects are minor players in these settings when compared with the power of big information platforms or government databases.
Exploring the Driving Forces Behind Digital Self-Determination
Why would influential entities allow data subjects to control their data when its unconsented use is profitable or furthers their organizational goals?
In its early stage of activation, DSD accepts that there will be power imbalances in data relationships. One way of levelling these is to recognize the self-interest of each main stakeholder and to give it space to listen to other interests. An example here will be helpful. Recently, banks and some financial institutions have opted for data portability – where customers can know what data is held on them and, if they want to move from one bank to another, can take that data with them. But surely banks have an interest in keeping their customers and not easing their access to competitors? Maybe so, but banks are more likely to retain accounts and attract new customers if they are trusted rather than secretive about personal data.
In this way, reputation is a market asset for the banks. Reputation is gained and grows through trust. In addition, if a bank lets data subjects know about their personal data, this acts as a check that the data is accurate. In a world of big data, data accuracy is not easy to guarantee. So, while it initially looks as though banks have an interest in data secrecy, they can be convinced that the commercial value of data openness is greater for them. Their self-interest shifts, and data subjects are empowered as a result. This is how the interests of the data holder and the data subject become more mutual.
How will this happen?
Corporations are not in business to be generous to data subjects, are they? The answer may at first be no, but again this is where DSD offers a novel way of looking at data relations. It works on the idea that participants will want to maximise their interests, but not in ways that endanger the sustainability of data relations. Imagine a situation where you might want to know more about your data and have some say in how it is accessed and used. Pause and think for ten seconds. There will be stakeholders in that situation, some more powerful than others, and these stakeholders initially might have no connection except in terms of data access and use, known or unknown. You want more information about your data, and a big platform wants more of your data. At this point it might be necessary to get help from another player who has an interest in getting you and the big platform talking. There isn’t time here to explore all the possibilities for who or what might facilitate this first discussion or negotiation. One example: some companies or organizations appoint officers they call data stewards, charged with helping maintain good data practices inside the organization. To do that, the steward might encourage the company to reach out to data subjects and their communities so that a dialogue can happen which benefits both parties. As with the reputation example, there can be many other reasons why the management of a platform might want to communicate with data subjects beyond asking you to agree to their cookies. One might be the desire to harvest more data for business, while recognizing that bad data access and use practices could lead to heavier external data regulation and, eventually, less data to use in the long run.
Say the organization is a government agency responsible for migration policy or disability services. It would make good sense to test whether policy initiatives that depend on personal data will work, or are working, by talking to the people whose data is being used and who are directly affected by these policies. To develop trust in any such dialogue, particularly where the data subject might otherwise not think much of the policy or those who promote it, the promise of openness and some accountability might start trust growing. The agency benefits by building up a knowledge base, and the data subject learns about data use. This mutual information interest enables better quality data, better understanding of data use, and, in the long run, greater trust in data. The next stage in the DSD operational process is building on that trust so that data subjects can have some say in how their data is managed. DSD grows through establishing trust and gradually negotiating shared interests. There are some use cases on DSD and migration, and on disability experience, that you can find at ……. These show how the interests of the parties coming to DSD will change and mutualise as stakeholders adapt their self-interests through negotiation and trust building.
Exploring the Essence of Trust: A Key Concept in DSD
We are starting out, aren’t we, not knowing enough about how data is used. So how are we going to trust those who keep us in the dark, and test that trust if we do? Well, DSD is a two-way street. Everything we do in our digital world depends on trust. We don’t know how our smartphone works, and yet we trust it. But there will be bumps in the road where that blind trust needs more information and active involvement for it to work to our advantage. In return for the data subject’s trust and the benefits it offers, big data users need to invest back into that trust with information and the capacity to participate in user decisions. If the data subject and her community know more about the access and use of their data, they can participate in ensuring data accuracy as well as opening up to new data exchanges. Where their relationships with data holders are respectful and responsible, any marketizing of such data will become more ‘fact based’, with the monetary benefits that provides. Further, data subjects can control who they prefer to engage with in any marketizing of their data. This will stimulate competition in data markets, where data users must better account for data subject preferences and data subjects can become real market players.
If you are thinking that this all sounds too aspirational, then think again. These shifts from self-interest to mutual interest are happening all around us. If you are renting an apartment, you and the landlord don’t look at the clauses of the lease every time you want to ensure your quiet enjoyment and they want you to pay the rent for it. Reduce this relationship down to data. The landlord expects a financial benefit and you want a place to live. But to get both, money has to be transacted, and today this is most likely done digitally through trusted data spaces like your online banking. Rental payments produce data that could be converted by credit rating agencies into scores for determining whether you are able to rent in the future. So, data about rent (your personal data), which records a benefit passed to the landlord, gets converted into a future benefit for you and for future landlords if you rent another place. Against this backdrop, expecting DSD to work on mutual benefit in data access and use is not too far-fetched. To offer another example of the progressive benefits of data sharing in trusted/safe data spaces: a patient would expect that the digitized data produced through MRI scanning will enable a more tailored approach to their treatment. From here, if the data subject knows the legitimate medical research possibilities of turning that data from personal to population data, they may be happy for this wider data use on the basis that it could contribute to new preventative therapies and treatments. But this will only proceed in an empathetic data environment if openness is assured at the initial patient-to-radiologist stage.
DSD isn’t a top-down approach to managing your data
Instead, it depends on you wanting to be part of managing your data. To motivate the participation of data subjects (discussed more in the next blog), DSD offers trusted/safe data spaces where you can find information about how your data is used, talk to those who use it, and be given real opportunities to negotiate your interests and theirs in respectful and responsible data practice. We call this co-creating a data governance strategy, and co-production in operationalizing its stages and outcomes.
DSD does not deny rights-based or risk-based approaches to data protection, and it can’t solve all data access and use challenges. For instance, if big data players do not want to engage in responsible and respectful data conversations with data subjects, then, provided data rights exist, you can fall back on them. And if these recalcitrant data users persist in risky data access and use behaviors, then more command-and-control regulation might need to intervene. But, as most data relations are not confrontational, DSD will have wide application.
Our advocacy for DSD now needs to address its limitations. We have already touched on motivating stakeholders to take the first steps in mutual negotiation. The fragility of trust has been referred to. The difficulty of getting data subjects to understand and take seriously the challenges posed by data use and re-use also needs mentioning. How can any or all of these be overcome? Without conscious trust building, some of these limitations will remain. Getting stakeholders to engage initially could well require external ‘nudging’, and to spark trust there will certainly need to be evidence that trust will be reciprocated. Blog 3 will talk through some of the use cases that have trialed DSD in very different contexts and hopefully provide that assurance.
It is early days for DSD as a recognized and preferred data governance principle. But its time has come. As mentioned in the first blog, worries surrounding the evolution of generative AI build on a longer-standing history of concern about the pace of digital transformation and the way that tech and data use are moving further and further from human supervision. The assurances of the big information lobby that we can trust their ethical commitment, or of governments and their regulators that they are sufficiently independent and tough enough to rein in the possible excesses of technological advance, ring hollow for many. In the same way that, during the Covid pandemic, people mobilised against mass surveillance and medical intervention when the risks were not fully understood and the control strategies seemed to sideline self-determination, despite contrary calls to trust the regulators, a countermovement is emerging that looks for more understanding of data dependencies and more involvement in their resolution. If this is not enough to drive DSD, then the responsibility rests with those promoting this approach to put its potential before the widest audience.
Such is the reason for the first blog, this one, and the one to follow, which will focus particularly on why you should get involved. Since you are reading this on the website of our DSD international network, it might encourage you to learn more about the network’s public engagement, its purpose, and its commitment to raising awareness among data subjects considering DSD to help manage their data.