Deploying productivity monitoring software ethically

While remote working is nothing new, the widespread abandonment of offices during the Covid-19 pandemic has prompted many enterprises concerned about workforce productivity to adopt monitoring software that can help them keep track of remote employees’ day-to-day activities.

Many of the digital productivity monitoring tools available today allow enterprises to see a range of information about their employees’ activities, from recording their keystrokes and mouse clicks to tracking their physical location and use of applications or websites.

Using these and a variety of other metrics, the software can help enterprises to conduct predictive and behavioural analytics, enabling managers to understand and track how productive employees are over time. 

While the use of productivity monitoring tools was already ramping up before Covid-19 – a 2019 Accenture survey of C-suite executives found that 62% of their enterprises were “using new technologies to collect data on their people and their work to gain more actionable insights” – the move to remote working has facilitated a dramatic increase in their use.

In the UK specifically, a November 2020 YouGov survey of 2,000 employers found that 12% of firms were already monitoring their staff remotely, 8% already had plans to implement monitoring and a further 6% were considering it going forward.

In the same month, a report from the UK’s Trades Union Congress (TUC) found that one in seven workers had experienced increased monitoring at work since the start of the coronavirus pandemic.

However, despite the substantial increase in sales of employee productivity monitoring software and the benefits it can bring organisations, some are worried about how it is implemented and deployed, citing privacy and employee distrust as major concerns.

What are the ethical concerns?

According to a June 2019 workplace technology survey by the Chartered Institute of Personnel and Development (CIPD), only around one in 10 workers think workplace monitoring will have more benefits than downsides – a view shared by a similar proportion of managers.

“Almost three-quarters (73%) of respondents said that introducing new technologies to monitor the workplace will damage trust between workers and employers, and the same proportion (73%) also said employers should not be allowed to monitor employees outside of working hours, including on breaks,” says Hayfa Mohdzaini, a senior research adviser on data, technology and artificial intelligence (AI) at the CIPD.


“Our interviews with UK workers highlight that some felt monitoring would increase anxiety and encourage presenteeism. However, monitoring on a smaller scale with clear alignment with organisational performance is more acceptable – for example, using timesheets aligned to sales targets – but this must be reciprocated with trust,” she adds.

A similar survey published by Prospect Union in October 2020 found that around half of respondents felt the introduction of workplace monitoring software would negatively affect their relationship with their managers, a figure that rose to 62% among younger staff aged 18 to 24.

Andrew Pakes, director of research and communications at Prospect Union, says the whole concept of privacy has been radically changed by the mass extension of remote working during the pandemic, which has further blurred the line between home and work life.

“Even before the pandemic we had concerns about the always-on culture, the different ways technology was blurring the line between home and work, and the pressure it was putting on many people to be answering emails first thing when they wake up and last thing before going to bed,” he says.

“How do we have a right to a private life when our home is also our office? Many of us are bringing work, colleagues and monitoring into our most private spaces – the home is supposed to be our [sanctuary] – it’s our most personalised space for our family, our friends.”

He adds that putting “monitoring into the heart of our private lives in the name of work” also raises questions about the power disparities between employers and employees, and that increased work flexibility has to work for workers, not just employers.

On this power disparity, Phillipa Collins, a lecturer in law at Bristol University who specialises in labour law, human rights and technology, says employers using productivity monitoring software as the basis for disciplinary procedures will have to take into account an employee’s right not to be unfairly dismissed under the UK’s Employment Rights Act 1996, as well as the Equality Act 2010.

“It’s completely foreseeable, for example, that women who are primary caregivers working from home may spend less time working because they are balancing work, home and caring,” she says. “So if your women employees were three percentage points lower or five percentage points lower [on those metrics], that would constitute indirect sex discrimination and then you’d have to justify your use of the software to an employment tribunal.”

Necessity of data processing?

Collins adds that human resources departments and other decision-makers should not take it for granted that deploying productivity monitoring is inherently a good thing. “More data means more [legal] risks, but also potentially less trust. People under surveillance are not happy people, so you’d be risking a mass exodus,” she warns.

Pakes and Collins are also concerned about the “mission creep” of productivity monitoring technologies and how the data collected could be repurposed further down the line.


“Is this data going to be used for management information further down the line? Will public health or safety data suddenly end up in your next appraisal? Will you be presented with a list of websites you’ve looked at from your laptop?” says Pakes, adding that such uses of data would “breed a level of division that is really unhelpful…in terms of building good work cultures”.

Collins adds that organisations should only collect and process data that is absolutely necessary for the purposes they have defined, adding that they must be able to understand what the software is doing and clearly explain why it is necessary.

“If I was in the room with an [external] data protection officer… could I convince them that every single data processing point was necessary? I think that’s going to be quite tricky,” she says, adding that the software’s deployment also needs to be monitored on an ongoing basis to ensure its use remains within the original purpose and is therefore compliant with the General Data Protection Regulation (GDPR).

“You’ve got to keep checking. It’s not like you just introduce it one day and your obligations are done – you have to keep checking that what you’re doing is legitimate and that it’s actually achieving the aims you set out to achieve,” says Collins.

Ethical use and implementation

According to Tom Moran, chief strategy officer of productivity monitoring software provider Prodoscore, data can be a powerful tool, but it does not replace skill, judgement and leadership – so it is important to have clear communication and alignment around what the technology is supposed to achieve.

“Change management is a consideration not to be overlooked with any technology implementation. The human element is the most important part of the equation. Transparency regarding Prodoscore is encouraged so that everyone recognises how the benefits can be leveraged in a very positive way,” he says, adding that “a differentiated workplace is based on trust and outcomes” rather than attendance metrics.

“Data-based decision-making is well understood. However, analytics can be of questionable value unless leadership stays in close alignment with employees,” adds Moran.

Reid Blackman, CEO and founder of ethical technology consultancy Virtue, says both suppliers and users of monitoring technology should develop a number of ethical best practices.

These include: determining exactly which metrics are necessary to collect for the purpose of the processing – why they are being used, what they show and why they matter; developing onboarding processes that provide users in management roles with ethics training; and setting out procedures for how information about employees should be used.


“One question those developers have to ask is why they are including this feature – are we collecting this data because it gives us more data on the employee, or are we collecting the data because we can see how this is relevant to a manager making a good decision about how to treat this employee?” he says, arguing that while an employer may want to know whether people are working or not, collecting information on the non-work-related activity itself “seems overly invasive”.

On the buyer side, Blackman adds that once organisations have decided for themselves which data is essential to the purposes they want to achieve, procedures should be put in place to ensure the data is not used against the employee in an abusive way.

“If we have a conversation because an employee is falling short [according to] these metrics, what does the conversation look like in those cases? Is it, ‘What’s going on here?’ or is there a ‘script’ to it that helps managers have the right kinds of conversations with employees instead of looking at them with a magnifying glass?” he says.

For Pakes, the surge in enterprise use of productivity monitoring tools reflects wider problems around management culture in the UK and what actually makes people productive, with many companies using it wrongly, conflating the ability to track productivity with the ability to improve it.

“If companies want to engage their workforce, ensure staff are performing well and are supported while being remote and out of the office, the worst way to do that is to break down trust and bring in micromanagement and digital monitoring tools,” he says.

“It isn’t measuring people’s keystrokes at their laptop or whether they’re on Teams permanently that makes them more productive, it’s building a culture of trust and collaboration, and using technology to support good work and keep workers connected,” adds Pakes.

However, to build best practices for ethical use of monitoring software, organisations must consult employees about the technology and exactly how it will be deployed.

Employee consultation is key

According to Eli Sutton, vice-president of global operations at user activity monitoring (UAM) provider Teramind, a common question clients ask is whether they should inform employees that they’re being monitored. The answer, he says, is “most definitely”.

Sutton says staff must know if their employer is using software to monitor productivity to ensure “the trust factor is not infringed upon” and warns against the temptation to micromanage as it could reduce company morale if employees feel they are being watched over.


“It all depends on how you utilise the solution. Utilise it in a way [that lets you] assist the employee in being more productive, then you shouldn’t have any issues, especially if you let the employees know they’re being monitored,” he says.

Sutton says Teramind’s UAM software is configured by the client itself, but provides different controls that allow users to retain privacy depending on what they are accessing and what the firm wants to monitor.

“Teramind offers a ‘revealed agent’, where the user is fully aware they’re being monitored, they can even interact with the agent and disable it if they’re doing something that’s a little bit more private to them… And then we have what’s known as the ‘stealth’ or the ‘hidden’ agent, which is typically recommended when you’re monitoring for security… [because] you want to catch the bad eggs immediately without necessarily waving a flag,” he says.

The CIPD’s Mohdzaini says the process of consultation should be about having “open two-way conversations” where employees feel safe voicing their concerns, and organisations should specifically consult employees to ensure the productivity data collected is relevant and necessary and is not used in ways that discriminate against minority groups.

“The CIPD workplace technology report finds that employees who were consulted were more likely to agree with the benefits of a technology change than those who were not. For example, employees who were consulted were twice as likely to agree that the technology change improved the quality of goods and services as those who were not,” she says.

Pakes, too, says employers should be involving “workers and unions” from day one in discussions as workers have a right to be informed when monitoring is taking place. He adds that companies should be undertaking data protection impact assessments (DPIAs) of any new tools they wish to adopt, which should include extensive input from employees as the data subjects.

“New technology can make work more efficient, it can make it more enjoyable in some instances, it can get rid of poor tasks and allow people to focus on higher-value tasks, but it has to be introduced appropriately and with the involvement of workers in the decisions about how it’s deployed,” he says.

The legal landscape

Collins says the processing of data by productivity monitoring software is going to constitute “high-risk processing” – the legal threshold needed to trigger a data protection impact assessment under UK data protection laws – but that there is a tendency for DPIAs and related consultations to become box-ticking exercises.

“Understanding about all of this stuff is still fairly low, so making sure that you and whoever you’re consulting with on behalf of the workforce understand what’s going on, and can debate and engage genuinely on an equal footing about the software, [is important],” she says.

However, while Collins agrees employees should be consulted, both generally and as data subjects for DPIAs, she says organisations cannot simply rely on employee consent as their legal basis for using the technology.

“It’s very clear that, because of the imbalance of bargaining power between workers and employers, an employer would not, as a data [controller], be able to rely on the consent of their employees to process their data,” she says, adding it is a common misconception that signing an employment contract automatically allows enterprises to process their workers’ data.

“If you’re that employer, you’re looking to be compliant, you’re looking to do this in a way that’s entirely legitimate. Consultation would improve how you go about it – it would improve the logistics of it because you’d have employee buy-in, but you’d still want to be looking for another lawful basis to support your processing.”

Through processes of consultation, organisations can therefore navigate the ethical quandaries of productivity monitoring, giving both employees and management more confidence in the systems and how they are used. 

However, when identifying the legal basis for the processing, organisations should be wary of relying solely on the consent of employees and must regularly assess whether continued use of the tools still fits within their legitimate aims.