Nicola Graham: do not fear the AI revolution


Society is standing on the threshold of a fourth industrial revolution (4IR). Emerging technology has created boundless possibilities but, as with its three predecessors, 4IR also evokes a degree of scepticism and fear.

Artificial intelligence, in particular, has tremendous potential to change how we all live, work, and communicate. It has the capacity to positively reshape government, education, healthcare and commerce. But it also prompts people to ask whether their jobs will be safe or rendered obsolete by robots.

Public sector technology leaders need to proactively embrace the benefits of technical innovation, while responding sympathetically to the implications it has for wider society and public discourse.

AI, particularly, has the potential to vastly improve place-based outcomes for service users. The public sector needs to harness the most relevant and appropriate tools to achieve this. However, at a time when people are concerned about the impact on their livelihoods, about data misuse and privacy breaches, striking a balance between service improvement and risk requires a flexible approach grounded firmly in ethical conduct.

The only way is ethics

As Socitm president, one of my main priorities is to address some of the challenges I see around the ethical use of AI. How can we make the best use of tools such as data analytics, automation and augmentation to create a more effective public services workforce? Could AI algorithms potentially discriminate against particular groups? How do machines affect our behaviour and interaction?

The ethical implications of emerging technologies and data constitute one of Socitm’s key policy themes. To ask the right questions for our members, we are proactively engaging with thought leaders and specialist organisations in local government, digital innovation, AI and data, in order to establish a library of best practice while the use of AI in the public sector is still in its infancy.

People-literate technology

Although it’s early days, some councils are starting to trial and use AI for discrete functions and services. It’s important that we look at their experiences and share the lessons learned.

In 2016 Enfield LBC became a pioneer in the adoption of cognitive technology by implementing Amelia, a sophisticated service agent chatbot. Enfield is one of London’s largest boroughs and its population is growing by 4,000-5,000 each year. Demand for services is rising and each month the council receives 100,000 visits to its website and 55,000 telephone calls. Such high volumes of traffic make maintaining high standards of customer service challenging, particularly against a backdrop of central government spending cuts.

Chatbot Amelia is able to absorb time-intensive routine requests, freeing up staff for more complex tasks and helping the council deliver more with existing resources. In order to balance these added efficiencies with the need to maintain good customer service, Enfield has aimed to make Amelia ‘people-literate’ rather than demanding that service users be technology-literate. In fact, there’s a hope that callers won’t notice they’re not speaking to a person: Amelia’s personality and social skills are based on natural language processing, which interprets the emotion in a human voice before responding appropriately.

During testing, Amelia answered planning permission queries from a limited group of constituents on the council’s website. In the first three months, she handled over 2,300 queries and understood the intention of requests 98% of the time.

Such positive findings highlight why the public sector needs to explore the full potential of AI further. Enfield has already started down this path, and its work could develop into a system used widely across local government.

We need to talk about AI

But, as well as celebrating the potential of AI, it’s vitally important we also look at the challenges it presents. Despite the positives of Enfield’s use of cognitive technology, the full implementation of Amelia has been delayed by complexities revealed during the pilot process, and it is substantially behind schedule.

There is a clear need for an immediate internal debate on the implementation of emerging technologies and the data they produce. We need to put our heads together and openly discuss the risks of AI implementation in an evolving society.

It is also imperative that we work together to establish, issue and monitor guidelines to avoid negative press coverage and public mistrust.

We have to pool our resources to actively seek out bias within systems, particularly in the early stages of deployment, because technology should counter inequalities, not create new or more deeply entrenched ones.

Furthermore, in order to pre-empt negative publicity arising from the intended and unintended ethical consequences of automation and digital technology, public sector digital leaders need to make the ethical use of data an integral part of business operations.

To broaden and strengthen global understanding of the ethical implications of AI it is essential that the voice of the public sector be heard and listened to. There is a need to bring individual experiences to the fore, sharing best practice and continually learning where improvements can be made.

Empowering leadership and decision-making

My ongoing discussion with AI thought-leaders and the wider sector strongly suggests that, for many, a lack of clarity is a barrier to embarking on this journey.

We’ve undertaken considerable research into the risks and benefits of AI for local public services. Societal wellbeing and outcomes previously side-lined by an austerity-driven obsession with efficiency come to the fore as ethical and intelligent use of technology and data re-focuses our attention on the real needs and vulnerabilities in our diverse localities.

Without further debate and collaboration, pivotal questions will remain unanswered, hampering immediate progress.

For example, should we be trying to come up with codes of ethics and checklists? There’s a hesitancy here: such codes risk entrenching the assumption that the public sector cannot be trusted to behave in the right way with the right outcomes in mind, and that assumption deserves to be challenged.

The most appropriate way forward is to empower public sector professionals to make the right decisions in support of outcomes pivotal to people in their localities.

Public sector professionals should be supported in making judgment calls on a case-by-case basis rather than constrained by a code or framework drafted by policymakers that could prove to be nothing more than broad, high-level principles. After all, starting from the premise that the public sector can be trusted to do the right thing is far healthier than assuming it will automatically get it wrong unless strictly regulated.

What is clear to me is that the public sector and members of the technology community should both support the healthy use of data and technologies and take a lead in explaining the risks of not implementing technology. Positive experiences and case studies should be used to counter negativity and build more trust in councils, their partner organisations and the people who work for them. Engendering this trust should be based on discussion, collaboration and shared best practice within the sector.

Nicola Graham, president, Socitm; head of ICT, Aberdeen City Council

Socitm’s annual President’s Conference takes place in Birmingham on June 18-19

 
