One aspect of the response has been the development of apps. These might be used to collect information about the spread of the disease (including via contact tracing), to communicate information on symptoms and the number of cases, to identify “at risk” groups, or to monitor quarantine measures. The data could be used to inform policy around “de-escalating” lockdown measures and to ensure precious resources are allocated to “hot spots” in an evidence-based manner.
It is understandable that these apps are seen as vital to helping contain the disease. In times when physical contact with health professionals is necessarily limited, giving people a digital resource from which to obtain information and help limit community seeding is important. However, such apps should be seen not as “silver bullets” but as part of a wider package of measures; the longer-term implications should be borne in mind, especially where there is limited evidence that the apps improve the position, at least in respect of some of the proposed functionality.
The growth of “solutionism” has been rapid in recent years, but it has rarely reached the political elites on this scale. Politicians are viewing technology as a means of reducing or ending lockdowns and guiding economies back to “normal”, but a “blind faith” reliance on technology is unlikely to be wholly successful. For example, mass adoption is a prerequisite to accuracy, especially for contact tracing apps. If that cannot be achieved, then teams of human contact tracers will still be required. There is also a question over interoperability, with more than 30 different systems under development across the world. If the scope of their application is limited, cases may slip through the virtual “net”.
It is important to recognise that there are a number of different purposes behind the development of the apps. Some apps are designed as databases, with details of numbers of cases, guidance on social distancing and symptoms to look out for; some of these also contain “self diagnosis” tools. Others are focussed on contact tracing, collecting information on the user’s proximity to other devices, including those of confirmed disease carriers. The last main group of apps concentrates on tracking and monitoring the movement of users, principally to demonstrate that quarantine measures are being adhered to and that people are staying within pre-defined areas. This range of functionality means that not all “apps” should be grouped collectively when looking at the issues that arise from their use.
App development is being assessed on a global level. The European Commission published its recommendation concerning the adoption of a pan-European approach and the introduction of a Toolbox (essentially a broadly principled “canvas” on which app development should be based), followed by guidance. The Commission recognised that Member States would have variances in approach, but called for collaboration around interoperability of systems, respect for human rights (principally privacy) and having a clear strategy for ceasing the collection of data once the crisis passed.
The European Data Protection Board has also issued guidance, followed more recently by the ICO in the UK. The eHealth Network has issued a draft of the Toolbox and further revisions are expected. The French privacy regulator, the CNIL, has also weighed in with its views on such apps. Whilst there is now a raft of information out there (and conflicting approaches in some cases), the underlying message is clear – consider the privacy and security implications, and bake these concepts into the design, production and distribution from the outset (much like the development of any new product). At a time when cyber criminals are looking to exploit the complacency which might attach to working remotely and people’s natural fears around the pandemic, securing our information has never been more important.
We also need to remain in control over what happens to our data (which, after all, is likely in most cases to be sensitive health data). Differentiating between apps and their functionality is all the more important in this context – collecting health data from a user who is simply looking up the number of local cases is not only intrusive and unnecessary, but has no utility in controlling the spread of the disease. The legislation governing access to and use of data is also complex in these areas (the ePrivacy Directive, for example, requires the user’s consent, or necessity for the provision of the service, before a third party may access data held on a device).
It appears that Bluetooth communications will be widely used in the contact tracing process, given they are recognised as being more accurate for identifying proximity between devices than GPS or cellular data. Storage of such data can also be limited to “on device”, meaning that only relevant contact data need be centralised and used.
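To make the “on device” point concrete, the sketch below shows what such a local contact log might look like. It is purely illustrative and not taken from any particular app: the class names, the 14-day retention window and the use of signal strength (RSSI) as a distance proxy are all assumptions for the example. The key design point it demonstrates is that sightings stay on the phone, and only those matching published IDs of confirmed carriers would ever be acted upon.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Sighting:
    """One observation of a nearby device's broadcast."""
    ephemeral_id: bytes   # rotating anonymous ID broadcast over Bluetooth
    rssi: int             # signal strength, a rough proxy for distance
    timestamp: float

@dataclass
class ContactLog:
    """Illustrative on-device store: nothing leaves the phone unless matched."""
    sightings: list = field(default_factory=list)
    retention_days: int = 14  # assumed retention window

    def record(self, eid: bytes, rssi: int, now: float = None):
        now = time.time() if now is None else now
        self.sightings.append(Sighting(eid, rssi, now))

    def prune(self, now: float = None):
        """Delete sightings older than the retention window."""
        now = time.time() if now is None else now
        cutoff = now - self.retention_days * 86400
        self.sightings = [s for s in self.sightings if s.timestamp >= cutoff]

    def matches(self, infected_ids: set) -> list:
        # Only sightings matching the published IDs of confirmed
        # carriers would ever trigger a notification or an upload.
        return [s for s in self.sightings if s.ephemeral_id in infected_ids]
```

Automatic pruning is what gives effect, in code, to the data-deletion commitments discussed below: the log cannot outlive its stated purpose.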
When this is allied to considering the legal basis for how the data will be used, by whom and for what purpose, it is apparent that there are many challenges to be faced. They are not, however, insurmountable; it is simply important to embed these considerations at an early stage and recognise their importance. Balanced against the need for sufficient information to improve the accuracy of the app is the likely apathy where the apps are voluntary (in Singapore, an adoption rate of one third was achieved, against a requirement of around 60%). This reduces the accuracy of the information and the efficacy of the app – we have all seen the consequences of relying on inaccurate data.
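The arithmetic behind the adoption problem is worth spelling out: a contact is only captured if *both* parties run the app, so under a simplifying assumption of random mixing the proportion of contacts detected is roughly the square of the adoption rate. The short calculation below illustrates why Singapore’s one-third uptake falls so far short.

```python
def detection_rate(adoption: float) -> float:
    """Fraction of contacts captured, assuming both parties must run the
    app and adoption is independent across the two parties (a deliberate
    simplification for illustration)."""
    return adoption ** 2

# At Singapore's roughly one-third uptake, only about 11% of
# contacts are detectable; even at the ~60% target, only ~36% are.
singapore = detection_rate(1 / 3)   # ≈ 0.11
target = detection_rate(0.60)       # 0.36
```

Even the target adoption rate therefore leaves the majority of contacts invisible to the app, which is why human contact tracers remain necessary alongside it.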
Deactivation and data deletion are key considerations. Whilst the immediate focus is stemming the disease, we also need to consider what the world will look like afterwards. Experience tells us that mass surveillance measures are easy to introduce, but much more difficult to row back from (consider the number of years it took to reduce mass surveillance powers after 9/11 – a concern already raised by the European Data Protection Supervisor). The Coronavirus Act in the UK widened the government’s powers of mass surveillance, for example.
The use of algorithms will have to be closely monitored, and any “next steps” advice must be delivered by a human and be based on medical advice. Technology can nonetheless take us some of the way in some areas: NHSX (the innovation arm of the NHS in the UK) is using machine learning in a number of ways, including the efficient allocation of resources such as PPE. The GoodSam network has enabled the rapid deployment of volunteers, matching skillsets and location to required roles such as transporting patients and delivering shopping. The Pan-European Privacy-Preserving Proximity Tracing (PEPP-PT) system is also under development and is an example of an approach that generates temporary, anonymised and encrypted ID data, and which can be adapted to cater for jurisdictional variations in approach.
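The idea of “temporary, anonymised” IDs can be sketched in a few lines. This is not the PEPP-PT specification itself, merely an illustration of the general technique: a secret key held on the device is combined (via an HMAC) with the current time slot, so the broadcast ID rotates regularly and cannot be linked back to the user without the key. The 15-minute rotation interval and 16-byte truncation are assumptions chosen for the example.

```python
import hmac
import hashlib
import os
import time

EPOCH_MINUTES = 15  # assumed rotation interval; real protocols vary

def device_key() -> bytes:
    # In a real app this key would be generated once and stored
    # securely on the device, never transmitted in the clear.
    return os.urandom(32)

def ephemeral_id(key: bytes, t: float = None) -> bytes:
    """Derive a short-lived anonymous ID from the device key and the
    current time slot. IDs are stable within a slot and change between
    slots, so bystanders cannot track a device across slots."""
    t = time.time() if t is None else t
    interval = int(t // (EPOCH_MINUTES * 60))
    mac = hmac.new(key, interval.to_bytes(8, "big"), hashlib.sha256)
    return mac.digest()[:16]  # truncated ID, broadcast over Bluetooth
```

Because only the health authority (or the device itself, in decentralised designs) can map IDs back to a key, a bystander who records broadcasts learns nothing durable about the people around them.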
In the long term, just consider what all of this might mean. If you crossed paths with someone who subsequently tests positive for COVID-19, the app will notify you, tell you to go home, and you would have to self-isolate for 14 days. As a consequence of the notification, a test would be arranged for a date after that period. What happens if you don’t follow those instructions? Will the app notify the authorities? If so, would you then be warned, charged with an offence, or incarcerated in an isolation centre against your will?
If this seems far-fetched, consider what is happening today – in India, some states are requiring hourly selfies to evidence your location and status; in Poland, geo-located selfies are required and if you fail to upload the photo in time, the police are automatically alerted by the app. In Israel, the police have been granted access to the entire nation’s geolocation phone data.
In a small jurisdiction with demonstrably good governance, it is easy to be complacent. However, bear in mind the risk of stigmatisation, which we see on a weekly basis on social media. Imagine if you were identified as a carrier of the disease – how tolerant would your friends, family or work colleagues be if you wished to return to work (even if the State’s guidance allowed for this)?
It is easy to seek a convenient solution in times of difficulty, but we also need to collectively consider whether these tools impinge upon our fundamental rights and freedoms, with longer term consequences. Finding the right balance is surely achievable, with the common good in mind. In the words of one commentator, “I value my privacy, but I value my parents more”.