University Of Essex Study Finds That Wearable Healthcare Devices Pose Threat To Human Rights Of Elderly
Many wearable healthcare technologies pose acute risks to the right to privacy, which is particularly serious given that these technologies are used in people’s homes, including bedrooms and bathrooms
Researchers at the UK-based University of Essex have warned that digital technology used to improve the care and lives of older people could be putting their basic human rights at risk. According to new research, although digital technologies can be used to improve social care, they can also adversely impact human rights and contribute to the segregation and isolation of older people. The report, ‘A Digital Cage is Still a Cage’, argues that emerging technologies used to monitor health and wellbeing, detect falls, or carry out household tasks demand further careful scrutiny.
The report was produced by researchers from the University of Essex’s ESRC-funded Human Rights, Big Data and Technology Project. Its authors, Professor Lorna McGregor and Neil Crowther, warn that technology can pose a significant risk to the rights to privacy and autonomy, and could lead to older people being ‘institutionalised’ in their own homes.
Professor Lorna McGregor said: “New and emerging digital technologies, including technologies enabled by Artificial Intelligence (AI), offer many opportunities to protect and advance our health and wellbeing as we get old and need more support to live our lives. But without careful attention they – or rather, the reasons for their employment and the way they are used – could also pose risks to our human rights, replicating some of the historic challenges associated with social care, such as control and coercion, while throwing up new ones.”
A range of new and emerging technologies are used by older people, their families, and care providers for different purposes in social care: for example, to monitor health and wellbeing, such as detecting a fall; to carry out discrete household tasks, such as vacuuming; to stimulate long-term memory and set reminders for particular tasks, such as taking medicine; to support decision-making by care providers; and to detect pain.
But the current design and use of many technologies pose acute risks to the right to privacy, which is particularly serious given that these technologies are used in people’s homes, including in bedrooms and bathrooms, and may record, process and share data about the most intimate details of older people’s lives.
These risks may increase as technologies become more sophisticated, and are able to interact, for example, within the context of a smart home.
Smart homes could help reduce the number of people in care homes, but if programmed to determine when a person gets up or goes to bed, whether they can leave the property or when and what they eat, they could easily replicate the features of institutionalisation, depriving people of liberty and autonomy.
As a result of these concerns, the report concludes that there are two possible futures. One is a future of increasingly remote and automated services, in which these new technologies serve to contain costs and merely maintain us. The other is a future in which these technologies become instruments for achieving the best conditions for later life, based on rights, autonomy, dignity, and social connection.
The authors conclude that, if their findings and recommendations for a framework to protect human rights are taken on board, digital technologies could help support older people to live independently.
Professor McGregor said: “We show how, depending on their design and deployment, and the framework in place to protect human rights, new and emerging technologies can pose significant risks to old people’s enjoyment of human rights, or they could enhance their autonomy and dignity, supporting them to live independently and participate in the community.”