Data protection law in the UK is focused on personal data. This reflects both its political origins, which value the preservation of individual rights and freedoms, and a view of the digital economy in which the primary value of data lay in the classification and prediction of human behaviour. For this reason, privacy law has increasingly been considered alongside competition and antitrust in tech (e.g. Kerber 2022; Kuenzler 2022).
Yet with the emergence of Generative AI, attention has shifted from the use of data about people to the use of data about work. This debate, however, has narrowly focused on (a) creative work, (b) collection via web-scraping and (c) governance via intellectual property. In practice, a wide range of workplaces are collecting vast amounts of data about work by far more invasive means, data that falls outside these categories and is more than merely personal, such as work methods and processes. This is true not only of knowledge work, carried out on a computer at a desk, but also of manufacturing, logistics and other essential sectors.
Codifying work methods so that machines can substitute for labour is foundational to automation. Conventionally, the battle over the codification of work, and over the rewards that flow from it, has been waged between workers and their employers. This reflected a period in which, broadly, information about work processes was held within firms on local hardware. But the collection and processing of data by ‘Software as a Service’ (SaaS) complicates this picture. The boundary of the firm as a container of information, as it once was, has been transformed by cloud-based software services.
We observed this when examining a specific variant of SaaS whose adoption rose rapidly during Covid. ‘Connected worker’ platforms (CWPs) were introduced across essential work sectors (Gilbert and Thomas, 2021). Analogous to Microsoft Office for non-office workers, CWPs deliver in-work solutions for the 80% of the world’s workforce who don’t sit at a desk, tracking work methods and processes. They are found in manufacturing, maintenance, logistics, mining, telecoms, energy and food production. The overwhelming share of these firms was based in the US.
CWP developers we spoke to were explicit about their intention to elicit work methods, and indeed to model entire industry practices and processes from the data gathered in the workplaces they serviced. This could occur through (a) employers inputting work instructions and managing compliance by requiring staff to document their work (e.g. with photographs); (b) inviting workers to transcribe their work methods as they undertook a process, adding pictures in real time to support future instructions; or (c) deducing work methods through inference, relying on the most extensive surveillance, sometimes even tracking eye movements to understand where attention falls in a process. The integration of LLMs has further and rapidly accelerated these capabilities: because natural language can now be interpreted and processed automatically, it is easier to derive insights, encode them and compare across cases.
This can impact worker, firm and national interests in different ways.
The EU and China are conscious of these issues. The EU has more recently taken an interest in non-personal datasets as part of a broader agenda to strengthen competitive advantage, yet this interest seems to remain focused on the maintenance of industrial machines rather than workplace data in general. China has defined data as a fifth factor of production, and Article 21 of its Data Security Law (DSL), which took effect in 2021, provides that data be classified according to its importance to economic and social development.
Our ongoing research in this area suggests that the UK could benefit from further examining the importance of industrial and workplace data for individual, firm and national interests and, with the CMA, from monitoring risks and impacts closely.
The Responsible AI Sandbox that we have developed aims to support businesses dealing with these challenges, and to help regulators better understand the issues businesses face, through a dedicated ‘open call’ on tools used to elicit work methods, including those for Digital Twin and Edge AI. Find more about it here.
Dr Abby Gilbert is Co-Director of IFOW