‘Not enough control on algorithms’
Video interview with Noel Sharkey, Emeritus Professor of AI and Robotics
Ethical guidelines, frameworks and ethics boards are popping up left and right. But there still isn’t enough control over AI, says professor and AI expert Noel Sharkey in an interview with Deloitte. “It’s about very high-level things. Be fair, be just. But what exactly does that mean? It needs to come down to the application.”
There has been a surge of interest in ethics and the responsible use of AI, but professor Sharkey hasn’t seen much of it put into practice yet. “Up until now, there have mostly been discussions and committees”, explains Sharkey. “They haven’t done anything yet, but they are working on it.”
It is unacceptable to have algorithmic bias that operates unjustly on the lives of ethnic groups, women, the poor and the vulnerable. Such systems are multiplying throughout the world and need to be shut down until adequately tested for equality.
Sharkey sees a need for tech workers, and specifically programmers, to be trained and made aware of ethical issues. “They need to consider these when they are building their programs or building their products. It’s not just ethical issues, it’s about social responsibility. What impact will this have on my society? If you’re developing a system, this has to be there in the design process. From the first line of code, you need to be working on principles.”
That’s not easy to do, Sharkey acknowledges. “You need to flesh the principles out and take some time to consider the impact on society. And you’ll need to do this beforehand. Because you don’t first write the code and then ask: what’s the impact of this?” This also entails the need for more diversity in development teams, Sharkey says. “Wherever algorithms have been used for justice – from welfare payments through to judicial reviews and prison sentencing – these turned out to be gender and racially biased. If you can’t fully demonstrate and test that an algorithm isn’t biased, then it should not be used. Not without a certification.”