AFR Workforce Summit, Sydney
Introduction
- Thank you Cosima.
- And thank you to the Australian Financial Review for the invitation to speak this morning.
- I acknowledge the Gadigal people of the Eora nation as the Traditional Custodians of the lands we meet on.
- And I pay my respects to their elders past and present.
- This morning, I want to talk about the impacts of AI in the workplace as a critical policy issue for Australia, that we must tackle together.
- Because while artificial intelligence isn’t the only force currently shaping Australian workplaces, it’s certainly a significant one, as the AFR’s front pages reveal.
- The Fin has been leading Australian coverage of AI since Australia’s very first conference on artificial intelligence in 1986.
- Justice Michael Kirby spoke at that inaugural national conference about the ethics of what they called “expert systems”.
- Justice Kirby said:
- “The fundamental issue is the extent to which our human intelligence will remain the master, and not become the servant, of the remarkable developments of AI.”
- Those comments are still relevant 40 years later.
- There is no doubt AI has the potential to deliver significant economic gains for Australian firms – and the broader economy.
- But as Minister for Employment and Workplace Relations, the questions I am focused on are:
- How can we harness the benefits of AI for both business and workers?
- How can we ensure AI is working for humans, rather than the other way around?
- So today I want to talk about what we can learn from the past, including the importance of fairly sharing the benefits of technological change.
- But also highlight the fact that those benefits won’t be realised without trust between workers and employers.
- Trust that is built on transparency and a commitment to involving workers in the adoption of AI.
Lessons from history
- Of course, AI is not the first technology to reshape workplaces.
- If we look back at history, previous technological developments can offer some lessons for us today.
- One that comes to mind is the story of the Luddites.
- These were skilled weavers in England during the Industrial Revolution, who faced significant economic deprivation from the introduction of mechanised looms.
- But what often gets misunderstood in this story is that the Luddites weren’t actually opposed to new machinery.
- They were in favour of tools that would enhance employment opportunities and make their lives easier.
- What they objected to was the gains flowing only to employers.
- And the fact that factory owners were using technology to degrade working conditions and erode wages.
- In just a decade, the average weekly wage for a weaver in Lancashire fell by almost half.
- More than one hundred thousand textile workers lost their jobs, severely depressing local economies.
- But the factory owners refused to engage with their workforce, and the government failed to intervene.
- Violence and social unrest followed.
- The desperate workers resorted to destroying the machines they saw as taking their jobs.
- The consequences were twofold.
- Workers were rapidly pushed out of the labour market on a significant scale.
- And then found themselves competing, often with children, for low-paid and dangerous jobs operating the machines that had replaced them.
- Now in the long term, mechanisation did eventually lead to improved living standards overall.
- But many workers never, within their lifetimes, regained a position comparable to the one they had previously enjoyed.
- They became part of a lower-paid industrial working class, with significantly reduced autonomy and working conditions.
- They were treated as disposable.
- Cast onto the scrap heap.
- Obviously the circumstances are very different between 19th century England and 21st century Australia.
- But this story shows us the dangers of failing to harness those technological benefits for both business and workers.
- And just like the challenge we face now, it highlights serious questions we must address.
- How can business and the economy successfully embrace AI, while ensuring workers are supported with the skills and knowledge to adopt AI to improve their working lives?
- And how can government facilitate this adoption so the benefits are spread fairly?
Involving workers to realise the benefits
- As a starting point, workers must be meaningfully brought along on the journey.
- Because there is real fear out there.
- Fear that workers will be displaced from their jobs, fear that employees are disposable.
- Recent RedBridge polling published in the AFR showed 73 per cent of workers believe AI will be negative for job security.
- As Tony Barry observed:
- “As workers are learning more about AI, they are increasingly deciding that AI is not their friend.”
- If workers are treated as disposable by their employer, or if AI is used as an excuse for job loss, this anxiety will harden into distrust.
- People will lose faith in the technology, undermining its benefits.
- So employers must address these fears by meaningfully involving their workers in the adoption of AI in the workplace.
- And importantly, supporting staff so they are in a position to adopt AI, not be replaced by it.
- Because workers need to have confidence that by adopting AI, they are not just training their own replacement.
- And they need to feel that they can use AI to contribute to the business’s success and improve their own working lives.
- This will help address some of the anxiety for workers – but will also benefit the business.
- Jobs and Skills Australia published a major study last year showing that businesses get the best return on AI when they involve their workers in its delivery.
- That’s because businesses can maximise the effectiveness of AI when they build on workers’ expertise and their knowledge of how work is actually done.
- However, this is only half the story.
- Bringing workers along for the journey is critical.
- But workers also need to understand where that journey is taking them.
- Workers need to see the benefits of AI for them, rather than just facing the prospect of being displaced by AI.
- That research from Jobs and Skills Australia I just mentioned also showed that in the medium term, AI is far more likely to augment a worker’s duties, rather than fully automate them.
- But effective augmentation first requires an understanding of what tasks can be delegated to AI, and how it can help an employee to do their job better.
- Which parts of a job are intrinsically human – and must remain that way.
- How roles can be elevated through an increased capacity to undertake higher-value work, which in many cases is more fulfilling or rewarding.
- And how AI can create completely new opportunities for workers in the future, in jobs like AI orchestrator or trust engineer.
- If not for workers currently in the workforce, then for their children.
- But workers need to have confidence that these jobs will actually offer secure, well-paid employment.
- Developing this understanding must be an essential first step for companies’ workforce planning and design, implemented in partnership with workers.
- Through that process, companies need to highlight how the adoption of AI will deliver benefits to workers.
- In aged care, for example, research from the Centre for Decent Work and Industry has found AI can reduce the administrative burden for workers by taking over the lower-order paperwork.
- This frees carers up to spend more time with patients, enhancing the quality of care.
- And these workers are also able to benefit by spending more time doing the parts of the job that they enjoy.
- The adoption of AI can also help workers use more of their intrinsically human skills, like empathy, judgement and problem-solving.
- A call centre operator, for example, can move into a quality assurance role rather than answering the phones themselves.
- This helps them to focus their skills where they can make the biggest impact.
- But the additional value that a worker can bring to a firm, using AI to augment their role, needs to be recognised – and remunerated appropriately.
- It is short-sighted to use AI to simply cut a company’s wage bill.
- Because the adoption of AI represents an opportunity for firms to improve their productivity.
- And where there are productivity gains, then workers should share in those gains – through pay rises and more fulfilling work.
- However, as I have already mentioned, many workers are fearful of AI – particularly if they lack the confidence or skills to adopt it into their job.
- Businesses therefore have a responsibility to upskill their current employees, instead of replacing them with workers who already have skills and confidence in using AI.
- Providing time and resources for on-the-job training will be critical for workers to adapt and adopt.
- This will ensure employees are not left behind, which is essential for building trust.
- But it will also benefit employers.
- Research from the London School of Economics indicates that training existing workers in AI skills is one of the fastest ways for organisations to achieve returns on their investment in AI.
- 93 per cent of workers whose employer had provided AI training used those tools in their job, compared to just over 50 per cent for those who hadn’t received training.
- And greater uptake drives greater productivity benefits.
- Those employees who had received AI training saved 11 hours per week, compared with 5 hours for the untrained.
- Of course, there will inevitably be some workers whose roles become redundant as a result of AI.
- But to minimise displacement, and the loss of trust that comes with it, employers should be providing redeployment opportunities within their organisation wherever possible.
- Opportunities identified through proactive forward planning.
- But the Commonwealth can also use the architecture of government to ensure that workers who may be displaced because of AI are prepared and able to quickly take up other jobs in our economy that are in demand.
- We can facilitate this through the provision of responsive, fit-for-purpose employment services.
- And through our strong investments in training and re-training.
- Government can use its levers to look at how we can increase opportunities for those entering the workforce or changing careers to have core AI skills.
- Through the National Skills Agreement, our Government is providing funding to the states and territories to build AI and digital skills for students in the VET sector.
- Like at a TAFE I visited recently in Melbourne, which is embedding AI literacy in all its courses, so every student can graduate with knowledge of, and familiarity with, AI across multiple industries.
- Earlier this year we established the TAFE Centre of Excellence for AI together with the Victorian Government, which will focus on uplifting national standards in digital literacy and AI skills.
- And our tripartite Jobs and Skills Council for the tech sector is working with industry to build units of AI competency across our qualifications framework.
- This will ensure that training products reflect how AI and digital tools are actually being used across jobs and industries, to enhance students’ employability.
- I also want to see AI literacy being built into our foundational skills programs.
- Because these skills are fast becoming essential in many employment settings.
Role of Government
- Of course, Government also has a role to play in the adoption of AI in the workplace, as stewards of our workplace relations system.
- Because the question of how we can share the benefits I have mentioned is not yet settled.
- To answer this question, business, unions and their workers – along with Government – must work together.
- That is why we need a tripartite dialogue, to bring all parties together.
- And work out how we can get those productivity benefits, bring workers along on the journey, and ensure those benefits are shared.
- One of the ways I am building this dialogue is through the newly elevated AI Employment and Workplaces Forum.
- This tripartite Forum will bring together government, employers and unions around the same table, demonstrating the good will of all parties to tackle this challenge in a constructive way.
- And I am pleased to announce that tomorrow, I will be convening the Forum’s first ministerial-level meeting.
- The Forum will examine five key themes which will be essential to the adoption of AI in workplaces: trust, capability, transparency, safety and productivity.
- These themes will shape our discussions on how we can build common understanding and translate it into actions and outcomes in workplaces.
- Of course, this is a really fast-moving policy area.
- That’s why under the National AI Plan, my job as Minister for Employment and Workplace Relations is to ensure our workplace institutions, legislative settings and frameworks are compatible with the widespread adoption of AI.
- Because there is a live question about whether these settings are fit-for-purpose.
- My department is currently undertaking a gap analysis to identify how the current frameworks and institutions are interacting with the adoption of AI.
- And I commit to working with stakeholders on possible responses to that question, including through the Forum.
- Now can I be very clear.
- Tripartism does not, and should not, involve a right of veto.
- There will always be contestability – and I am not pretending tripartism will be a silver bullet.
- But I really believe tripartism can build shared understanding and trust.
- Of course, in order to make these consultation mechanisms work effectively, we all need to be operating from the same set of facts.
- There are a lot of predictions out there about what will happen to the labour market because of AI.
- While augmentation in the medium term is the most likely scenario, according to JSA, we need reliable data to accurately monitor the real-world impacts of AI adoption.
- Our Government is building the capability to track these changes, if and when they occur.
- This descriptive data will look at how the labour market has changed since the launch of ChatGPT in November 2022, with a particular focus on entry-level roles and workforce composition.
- My department will soon finalise a report on this data.
- But I’d like to give you all a preview of some of the early findings.
- Pleasingly, employment outcomes for young tertiary graduates have been positive, despite some expectations that they could be the ‘canaries in the coalmine’ for AI in the workplace.
- We are not seeing an elevated rate of compositional change, meaning that the mix of jobs in the economy is not changing faster than usual.
- However, we are starting to see a slight softening in the rate of growth for occupations that are most exposed to AI adoption, like filing clerks or keyboard operators.
- Now, I would say that this data is reflective of a particular point in time, and is not predictive.
- But it will help build a shared understanding of what is actually happening in the labour market, which gives us a platform to build trust.
Effect on work practices
- But AI doesn’t just affect whether you have a job, or what your job involves, or what jobs you can apply for.
- It can also reshape the expectations of you as an individual, and how you are managed at work.
- And this leads me to another issue that often gets missed in this debate: the use of AI as a workforce management tool.
- Of course, AI can create efficiencies in rostering or management processes, for example.
- But when decisions about workers are made by software instead of a human supervisor, people can be made to feel dehumanised.
- When AI makes decisions that affect workers, the errors it makes are not the only problem.
- The real unfairness arises when those decisions cannot be challenged or understood, leading directly to mistrust in the workplace.
- Because AI cannot replicate the human empathy or understanding that a good manager might have.
- So there needs to be fairness built into both the process and outcomes, for the adoption of AI as a workforce management tool.
- Whether through transparency or ensuring there is human oversight of decisions.
- We know the risks posed by automated systems when there isn’t a human in the loop.
- For example, gig workers having their source of income cut off immediately by being excluded from a platform, because of an unfair decision made by an algorithm.
- This has disastrous consequences for those workers and their families.
- That’s why last year our Government made it a requirement for gig workers to be given a fair process before they are deactivated.
- Including, critically, the right to have their deactivation considered by a human.
- The use of AI in the workplace is also posing emerging work health and safety risks.
- A recent study in the Harvard Business Review revealed that despite the predictions that we would all be sitting around twiddling our thumbs, generative AI tools didn’t actually reduce work.
- Instead – they consistently intensified it.
- AI can accelerate certain tasks, consistently raising expectations of workers through constant, real-time quantification of performance.
- And what initially looks like higher productivity can turn out to be unsustainable workloads.
- Which leads to much higher risk of cognitive overload and burnout.
- The use of AI-powered surveillance can also raise concerns about privacy and loss of autonomy for workers.
- Feeling like you’re always being watched significantly increases psychological stress and the risk of psychosocial injuries.
- Business and government both have a responsibility to monitor and manage these emerging work health and safety risks, to safeguard workers and maintain trust in the adoption of AI tools.
Data Centres Expectations
- But the impacts of AI on the broader labour market go beyond the deployment of AI in workplaces.
- The construction and operation of AI’s enabling infrastructure will also create workforce opportunities and challenges.
- Australia is an attractive destination for data centres and AI infrastructure developers, as highlighted last week by Microsoft’s announcement that it will be making significant investment in AI skills and capability in Australia.
- And Government has an important role in shaping how these investments can provide benefit to the broader Australian community, including through Memorandums of Understanding and our Government’s expectations of data centres and AI infrastructure developers.
- These expectations send a clear signal to incentivise investments that are consistent with our national interest.
- Not just around water and energy, but in our domestic construction skills pipeline as well.
- Because we know that there will be a workforce of tradespeople required to build these data centres.
- And we expect these data centre developers and operators to contribute to the development of the Australian workforce.
- Australia will prioritise proposals that invest in our skills pipeline, so we have enough tradies to build this vital infrastructure.
- Our Government will work closely with the states and territories to implement these expectations in their approval processes.
- And those conversations are ongoing.
- The types of actions I believe could help meet these expectations include data centre developers setting targets for minimum amounts of work for apprentices and trainees on these construction projects.
- Adopting explicit workforce participation targets for women, First Nations people and other priority cohorts across construction and operational roles.
- Funding experienced workers to obtain a Certificate IV in training, and providing paid release time for teaching.
- And ensuring that workers on these projects are gaining a range of skills, so they can progress in their careers once construction is completed.
Conclusion
- So despite the obvious challenges, I am optimistic about the future AI can offer Australia.
- It offers a major economic opportunity that can boost our productivity and raise living standards.
- And I believe we are well positioned to seize that opportunity.
- To harness the benefits of AI for both business and workers.
- And ensure AI is working for humans, rather than the other way around.
- As long as we remember Justice Kirby’s assertion from 40 years ago: that human intelligence must remain the master, and not the servant.
Thank you.
[ENDS]