Recently, in the context of the Procurement Bill, I argued for an obligation on the Government to have regard to the need to secure good work for those carrying out contracts under its procurement activities. This is what I said:

My own interests, and indeed concerns, in this area go back to the House of Lords Select Committee on AI. I chaired this ad hoc inquiry, which produced two reports: AI in the UK: Ready, Willing and Able? and a follow-up report via the Liaison Committee, AI in the UK: No Room for Complacency, which I mentioned in the debate on a previous group.

The issue of the adoption of AI and its relationship to the augmentation or substitution of human employment is key. We were very mindful of the Frey and Osborne predictions of 2013, since watered down, which estimated that 47% of US jobs were at risk of automation, pointing to the sheer potential scale of automation over the next few years through the adoption of new technology. The IPPR in 2017 was equally pessimistic. Others, such as the OECD, have been more optimistic about the job-creation potential of these new technologies, but it is notable that the former chief economist of the Bank of England, Andrew Haldane, entered the prediction game not long ago with a rather pessimistic outlook.

Whatever the actual outcome, we said in our report that we need to prepare for major disruption in the workplace. We emphasised that public procurement has a major role to play in managing the consequences of AI adoption for jobs, and that risk and impact assessments need to be embedded in the tender process.

The noble Lord, Lord Knight, mentioned the All-Party Parliamentary Group on the Future of Work, which, alongside the Institute for the Future of Work, has produced some valuable reports and recommendations on the impact of new technology on the workplace. In their reports, the APPG’s The New Frontier and the institute’s Mind the Gap, they recommend that public authorities be obliged to conduct algorithmic impact assessments, both as a systematic framework for accountability and as a regulatory tool to enhance the transparency of algorithmic systems. In the last Session I tried to introduce a Private Member’s Bill that would have obliged public authorities to complete an algorithmic impact assessment whenever they procure or develop an automated decision-making system, drawing on the impact assessments required by the Canadian directive on artificial intelligence and the 2022 US Algorithmic Accountability Act.

In particular, we need to consider the consequences for work and working people, as well as the impact of AI on the quality of employment. We also need to ensure that people have the opportunity to reskill and retrain so that they can adapt to a labour market that is evolving as a result of AI. The all-party group said:

“The principles of Good Work should be recognised as fundamental values … to guide development and application of a human-centred AI Strategy. This will ensure that the AI Strategy works to serve the public interest in vision and practice, and that its remit extends to consider the automation of work.”

The Institute for the Future of Work’s Good Work Charter is a useful checklist of AI impacts for risk and impact assessments—for instance, in a workplace context, issues relating to

“access … fair pay … fair conditions … equality … dignity … autonomy … wellbeing … support”

and participation. The noble Lord, Lord Knight, and the noble Baroness, Lady Bennett, have said that these amendments would ensure that impacts on the creation of good local jobs, and on access to, the terms of and the quality of work, are taken into account in the course of public procurement.

As the Work Foundation put it in a recent report,

“In many senses, insecure work has become an accepted part of the UK’s labour market over the last 20 years. Successive governments have prioritised raising employment and lowering unemployment, while paying far less attention to the quality and security of the jobs available.”

The Taylor review of modern working practices, Good Work—an independent report commissioned by the Department for Business, Energy and Industrial Strategy that remains largely unimplemented—concluded that there is a need to provide a framework that better reflects the realities of the modern economy and the spectrum of work carried out.

The Government have failed to legislate to ensure that we do not move even further down the track towards a preponderantly gig economy. It is crucial that they use their procurement muscle to ensure, in line with Good Work, that such measures are applied to every major public procurement involving AI and automated decision-making.