COVID-19 and Digital Technologies - Part 2



Raising the temperature on technology deployment

In the previous post we discussed some ethical considerations in the use of digital technologies whilst tackling the COVID-19 pandemic. In this post we will highlight more of the technologies that are being deployed during the opening up of society and present an approach to ensure technologies are trusted.

By Alexander Galt | June 19, 2020

All the ways to Track and Trace

With a number of countries beginning to reopen their societies for restricted work and leisure activities, we have seen governments accelerate the trialling and testing of technologies intended to guard against a second wave of infections. The most ubiquitous of these are digital tracing and tracking applications (DTTs), with ethical guidance on such systems being circulated by the WHO and prominent scholars. This guidance details concerns about efficacy; the proportionality of trade-offs between the risks of using an app and the benefits it can bring; the scope of use (both functional and temporal); governance mechanisms; and aspects of privacy, autonomy and consent.

In a troubling example of the scope creep of these systems, the US state of Minnesota has “begun contact tracing arrestees” in order to geographically profile protesters, highlighting the privacy-eroding potential of such technologies. The Privacy-Preserving Contact Tracing system developed by Apple and Google promises technical safeguards against such extended surveillance uses, with the first apps built on its backend technology launching in Switzerland and Germany. In recent days the UK government has abandoned development of its own system in favour of the Apple-Google system due to poor performance during trials, and the proprietary system in Norway has also been pulled over privacy concerns. Whilst the Apple-Google system has the support of the majority of privacy professionals due to its decentralised nature, pressing questions have been posed about the power dynamics at play, given the coercion the tech giants have displayed against government attempts to develop their own systems.
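The decentralised design that privacy professionals favour can be sketched in a few lines. The snippet below is a simplified illustration only – the real Apple-Google protocol derives identifiers with HKDF and AES rather than a bare hash, and the function names here are our own – but it shows the core idea: each phone derives short-lived pseudonymous identifiers from a key that never leaves the device, and exposure matching happens locally against the published keys of users who test positive.

```python
import hashlib
import os

def daily_key() -> bytes:
    # Each device generates a fresh random key per day; it leaves the
    # device only if the user tests positive and chooses to upload it.
    return os.urandom(16)

def rolling_ids(key: bytes, intervals: int = 144) -> list[bytes]:
    # Derive short-lived pseudonymous identifiers (one per ~10-minute
    # interval) from the daily key. Simplified: a hash stands in for
    # the HKDF/AES derivation used by the actual protocol.
    return [hashlib.sha256(key + i.to_bytes(2, "big")).digest()[:16]
            for i in range(intervals)]

def exposure_check(heard: set[bytes], diagnosis_keys: list[bytes]) -> bool:
    # Matching happens on-device: re-derive the identifiers of diagnosed
    # users from their published keys and intersect them with the
    # identifiers this phone overheard via Bluetooth.
    return any(heard & set(rolling_ids(key)) for key in diagnosis_keys)
```

Because only the random keys of diagnosed users are ever uploaded, no central authority learns who met whom – the key contrast with the centralised designs that drew criticism.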

Government surveillance hasn’t been limited to proximity tracing: temperature monitoring is being trialled via thermal imaging cameras mounted on drones and on helmets worn by police. Both systems have been criticised for their inability to detect fever effectively. The result for society has been described as “little more than a public health theatre”, in which supposed ‘tech solutions’ foster a public expectation of protection despite there being very little supporting evidence. Behind these debates about scientific soundness (and the impact on public behaviour) lie real concerns that privacy, autonomy and dignity are being sacrificed on the altar of ineffective health security safeguards – with French courts banning police use of drones to enforce lockdown, deeming them an infringement of privacy rights.

The above technologies highlight an apparent trend in which European countries are adopting solutions that have already been deployed across Asia – from DTTs to drones to thermal imaging. We are left wondering what comes next: will it be a ‘robot dog’ roaming around your local park, as seen in Singapore, or facial recognition systems helping to enforce quarantine, as seen in China? Now more than ever, society needs to engage in an open and informed debate about how technologies are deployed – one that takes into account the wider implications for stakeholders – rather than taking a scattergun approach that utilises any and every technical solution available.


Back to Work in the Future of Work

Governments aren’t the only party that needs to consider the ethical impacts of health and surveillance technologies: new tech solutions have emerged that enable proximity monitoring on the office or shop floor, as seen in the freight company trials at Amsterdam Airport Schiphol. Health monitoring is also becoming a corporate reality, with symptom tracing technologies emerging alongside the potential for people to digitally share their immunity status with employers. The dynamic between employer and employee is a complex one when introducing tracking measures that are invasive or obtrusive and that have the potential to impact the identity and autonomy of people in the workplace. Organisations need to demonstrate that they are acting responsibly and proportionately, and to consider how they will demonstrate accountability and transparency in any health and safety decisions made with the assistance of technology.

Employees will increasingly be returning to workplaces where ‘future of work’ trends – including workforce replacement, digital transformation and remote working – have been fast-tracked at scale. Our vision for the return to work embraces purpose, potential, perspective and possibility, in which organisations should evolve their thinking about technology from a purely substitution view (replacing humans with technology) towards using technology as an augmentation or collaboration strategy.

Technology and Social experimentation

Technologies need to be trusted in order to be fully effective and accepted by end users. If they aren’t, they don’t just risk reduced efficacy; they may become counter-productive, with people undermining or obfuscating their interactions with the technology. In some scenarios these technologies may even present a threat to fundamental human rights. One method of protecting against these risks is to treat technology as a ‘social experiment’: know what you want to achieve with it, set boundaries and parameters for how it should be used, know your target group and gain their consent, and above all consider the wider ethical implications of your use of the technology.
