
Data Disclosure and Suicide – 21st Jan 2022


I had the chance to contribute to a debate initiated by Baroness Kidron on social media and harms to children in the House of Lords yesterday.

A sensitive issue that was raised during the work of the Joint Committee on the Online Safety Bill was the question of access to data after a young person has taken their own life.

The Committee’s report argues for a way to be found to improve the situation for families and I agree with this.

This is the contribution I made to the debate:

My Lords, I will speak to one particular issue that the noble Baroness has raised, quite rightly in my opinion, in this debate and in the report of the Draft Online Safety Bill Joint Committee, of which I know she was a very active member. This is the question of access to data from the accounts of people who have sadly taken their own lives where there is a view that it may reveal something useful and important for their grieving relatives.

I do this as somebody who used to work for a social media platform and took part in the decision-making process on responding to requests for data in these tragic circumstances. In the internal debate, we had to weigh two potential harms against each other. It was obvious that refusing to disclose data would add to the pain and distress of grieving families, which the noble Baroness eloquently described for us, and, importantly, reduce opportunities for lessons to be learned from these awful situations. But there was also a fear that disclosing data might lead to other harms if it included sensitive information related to the connections of the person who had passed away.

The reluctance to disclose is sometimes described as being for “privacy reasons”. We should be more explicit; the concern in these cases is that, in trying to address one tragedy, we take an action that leads to further tragedy. The nightmare scenario for those discussing these issues within the companies is that another young person becomes so distressed by something that has been disclosed that they go on to harm themselves in turn. This genuine fear means that platforms will likely err on the side of non-disclosure as long as providing data is discretionary for them. If we want to solve this problem, we need to move to a system where disclosure is mandated in some form of legal order. I will briefly describe how this might work.

Families should not have to go directly to companies at a time of serious distress; they should instead be able to turn to a specialist unit within our court system which can assess their request and send disclosure orders to relevant companies. The noble Baroness eloquently described the problem we have with the status quo, where people approach companies directly. The platforms would then be required to provide data to the courts, which would need to be able to carry out two functions before making it available to families and coroners as appropriate.

First, they should be able to go through the data to identify whether there are particular sensitivities that might require them to withhold or effectively anonymise any of the content. To the extent possible, they should notify affected people and seek consent to the disclosure. In many cases, the platforms will have contact details for those individuals. Secondly, they must be able to consider any conflicts of law that might arise from disclosure, especially considering content related to individuals who may be protected by laws outside of the jurisdiction of the UK courts. This would need to include making decisions on content where consent has been withheld. If we could set up a structure such as this, we could have a workable regime that would work for all interested parties.

A few minutes is obviously not long enough to cover all these issues in detail, so I will publish a more comprehensive post on my blog, which is aptly named regulate.tech. I thank the noble Baroness for creating an opportunity to consider this important issue, one I am sure we will return to during the passage of the online safety Bill.

House of Lords Hansard Column 235GC


This was necessarily a short contribution, given that the debate was time-limited to one hour and a number of peers wished to speak, but I can add some more detail here about what a good regime might look like.

I suggested that a specialist unit be created within the UK courts system as I think there are significant benefits to all parties when we establish networks of ‘Single Points of Contact’ (SPoC).

This means that the people making the requests are familiar with the processes and policies of the platforms and can build up long-term working relationships with those who will be responding to their disclosure orders.

This is how the system for police requests for data from internet companies works in the UK, with officers investigating crimes able to go to their force's SPoC when they need to request data.

For families, it would be much better to be able to liaise with a skilled specialist within the UK court system rather than have to try and work out a direct route to different companies, or draft in support from local agencies who may be very sympathetic but also unfamiliar with this kind of request.

From a platform point of view, being able to discuss any specific issues related to a particular request with a specialist unit would help make this process as quick and smooth as possible.

Platforms will also be more comfortable disclosing data where they know it is first going to a specialist court which has the capability to assess any risks and take measures to address them.

There are two areas of risk that they will especially need to consider that are related but not necessarily coterminous.

Most importantly, they will need to be on guard for any disclosures that might cause someone else to feel significant distress.

There may be a whole range of sensitivities in the data including details of personal relationships and people’s views of others that could be shared more widely than anyone intended as a result of the disclosure process.

An important tool in addressing this would be a system of notice and consent where the court seeks to contact people who are mentioned in the data that is going to be disclosed.

Assistance from platforms can help facilitate this process as they will usually have contact details for most of the connections in the disclosed data.

Where someone has freely consented to disclosure of data that relates to them then we can be confident that it is OK for it to be shared.

Where they have not given consent, either because they could not be contacted or if they have replied refusing permission, then the court will need to make a decision about what to do with this data.

This is where the second key function of the court will come into play and this is the ability to understand and address any conflicts of law that are relevant to the disclosed data.

Where the data relates to people in the UK, the court will have to make a determination in line with any relevant UK legislation, notably but not exclusively the UK General Data Protection Regulation (UK GDPR), as tailored by the Data Protection Act 2018.

The Information Commissioner’s Office (ICO) will have a key role in guiding decisions about disclosure considering all the legitimate grounds for the processing of personal data set out in the legislation.

Decisions to disclose or withhold specific data will sometimes still leave people unhappy but where they are being taken by an official UK body informed by UK law this may be more acceptable than decisions taken by a private company under their own terms.

Other legislation may also be relevant; for example, where the data includes sexually explicit images, there could be complex issues to address in respect of the criminal law.

Many people have social media connections outside their home country whose data could form part of the disclosure process, and particular care will be needed to ensure their local legal rights are not breached.

Notice and consent will again be a useful mechanism for data related to people anywhere in the world, but where they do not give consent to disclosure it would not be appropriate for a determination to be made solely under UK law.

I describe these challenges not to make a case for why disclosure cannot happen but rather to try and chart a path towards it happening in a way that does not later have to be reversed because of legal challenges or negative outcomes.

Many cases will be relatively straightforward, but my experience working for a social media platform is that there are always hard cases that challenge any set of policies or processes you establish.

You cannot plan for every eventuality but it is wise to build systems that are able to assess and respond to specific circumstances and I hope this post can contribute to designing such a system.

We are unlikely to be able to lessen the pain for those who find themselves living through such an awful experience but we can at least do our best to avoid adding to that pain.
