NEWS – “without comment”

Posted by: Ian (D. Withers)

ICO issues guidance on workplace surveillance

Guidance on employee monitoring covers how employers can conduct their digital surveillance lawfully, transparently and fairly, and warns against businesses intruding on their workers’ private lives.
By: Sebastian Klovig Skelton, Senior reporter
Published: 04 Oct 2023

The Information Commissioner’s Office (ICO) has published guidance on the monitoring of workers by employers, warning that any workplace surveillance must respect staff’s right to privacy.

According to research commissioned by the ICO ahead of the guidance, almost one in five people believe they have been monitored by their employer, of whom 40% said they had their timekeeping and access to the workplace tracked; 25% said they had their calls, emails or messages checked; and a further 15% said they had audio and video footage recorded by their employer.

Of those surveyed, around 70% said they would find monitoring in the workplace intrusive. Fewer than one in five respondents (19%) said they would feel comfortable accepting a new job if they knew that their employer would be keeping tabs on them, whereas 57% said they would feel uncomfortable taking a new job if they knew that their employer would be monitoring them.

Published on 3 October 2023, the ICO’s guidance is intended to ensure that workplace monitoring is legal and complies with the UK’s data protection rules. The ICO said while workplace monitoring was possible, it could easily intrude into people’s private lives and undermine their privacy if it became excessive.

“Our research shows that monitoring at work is a real cause for concern, particularly with the rise of flexible working – nobody wants to feel like their privacy is at risk, especially in their own home,” said Emily Keaney, deputy commissioner for regulatory policy at the ICO.

“If not conducted lawfully, monitoring can have a negative impact on an employee’s well-being and worsen the power dynamics that already exist in the workplace. We want people to be aware of their rights under data protection law and empower them to both identify and challenge intrusive practices at work,” she added.

“We are urging all organisations to consider both their legal obligations and their workers’ rights before any monitoring is implemented. While data protection law does not prevent monitoring, our guidance is clear that it must be necessary, proportionate and respect the rights and freedoms of workers. We will take action if we believe people’s privacy is being threatened.”

The guidance outlined steps employers must take when conducting workplace monitoring, including making employees aware of the nature, extent and reasons for the monitoring; having a clearly defined purpose and using the least intrusive means possible; retaining only the relevant personal information to that purpose; and making all information collected about employees available through subject access requests (SARs).

It said employers must also have a clear legal basis for processing the data collected via monitoring, such as consent, a specific legal obligation, or the fulfilment of a contract. These are not recommendations but legislative requirements that employers must meet.

“You can monitor workers if you do it in a way which is consistent with data protection law. When deciding whether to monitor workers, carefully balance your business interests as an employer and workers’ rights and freedoms under data protection law,” said the guidance.

“If you carry out monitoring in a way which is unfair, this will impact on their rights and freedoms under data protection law. It will also negatively affect the trust between you and your workers, as well as potentially affecting their mental well-being.”

It added that just because a form of monitoring is available does not mean it is the best way to achieve a business’s aims.

If special category data is being collected – which is sensitive personal information relating to, for example, race or ethnicity, sexual orientation, trade union membership or disability – the ICO added that employers must also identify a special category processing condition on top of a lawful basis.

The ICO further added that while conducting a data protection impact assessment (DPIA) is only a legal obligation where the processing is likely to result in a high risk to workers, organisations should still carry one out even when not strictly required.

On monitoring tools that use automated processes where there is no human involvement in the decision-making, the ICO said Article 22 of the UK General Data Protection Regulation (GDPR) prevents organisations from making solely automated decisions that have “legal or similarly significant effects” on people’s lives.

It added that such processing is only lawful in very limited circumstances, such as where it is necessary for the performance of a contract, authorised by a specific law that applies to the employer, or the worker has given explicit consent.

The guidance also covered how employers must approach workplace monitoring in the context of biometric data, and further outlined considerations for different methods of monitoring workers, such as when workers use a company vehicle or where a third-party IT supplier is involved.

Responding to the guidance, Andrew Pakes, deputy general secretary of Prospect Union, said there had been an explosion in the use of surveillance software since the onset of the Covid-19 pandemic.

“With the regulator’s own research showing one in five workers feel they have been subject to digital monitoring at work, a light clearly needs to be shone on the practice,” he said.

“We need clear rules to prevent the abuse of people’s rights and protect them from intrusive and unwarranted behaviours. Transparency is a good start, but goes nowhere near far enough. The use of new technology can promote better jobs and higher productivity, but it needs to be done in partnership with workers, rather than imposed,” added Pakes.
Ongoing concerns

In August 2023, a Parliamentary select committee said workplace surveillance “should only be done in consultation with, and with the consent of, those being monitored”, given the potential for such surveillance to drive work intensification and harm mental health.

It added that the UK government should commission research to improve the evidence base around the deployment of automated data collection systems at work.

While that committee looked specifically at the surveillance of employees through connected devices, politicians and unions have long decried a range of surveillance-enabling technologies in the workplace.

For example, a Parliamentary inquiry into artificial intelligence (AI)-powered workplace surveillance, conducted by the All-Party Parliamentary Group (APPG) for the Future of Work, found in November 2021 that AI was being used to monitor and control workers with little accountability or transparency, and called for the creation of an Accountability for Algorithms Act.

In March 2022, the Trades Union Congress (TUC) also said the use of surveillance technology in the workplace was “spiralling out of control” and could lead to widespread discrimination, work intensification and unfair treatment without stronger regulation to protect workers.

In April 2023 – a month after the government published its AI whitepaper – the TUC further warned that the government was failing to protect workers from being “exploited” by AI technologies, noting that the whitepaper only offered a series of “vague” and “flimsy” commitments for the ethical use of AI at work, and that its separate Data Protection and Digital Information Bill (DPDI) had set a “worrying direction of travel”.

In May 2023, Labour MP Mick Whitley introduced “a people-focused and rights-based” bill to regulate the use of AI at work, which includes provisions for employers to meaningfully consult with employees and their trade unions before introducing AI into the workplace.