The ethical dilemmas of the world’s first accounting chatbot

Elizabeth Barry 28 July 2017 NEWS


Kriti Sharma, VP of bots and artificial intelligence at Sage, talks through the surprisingly complex world of developing AI for small business.

For most people, artificial intelligence (AI) is Siri, Alexa or Google. The voice is female, the commands are simple, and if you ask it the right question, you can even get something funny out of it.

So is there anything wrong with this perception?

For Kriti Sharma, VP of bots and artificial intelligence at Sage, there are inherent ethical problems in AI. This was something she became acutely aware of when developing Pegg, the world’s first business accounting chatbot.

Finding Pegg

For Sharma, the basic goal when developing Pegg was to create a personalised accounting expert, finance expert and HR expert rolled into one.

“It's a personal trainer for your business,” Sharma explains.

“It looks at all your data and identifies what needs to happen to help you understand your business in simpler terms. It guides you on what the next step is and where you should be focusing. On top of that, it also takes care of a lot of admin activities with automation.”

While that is the functional definition of Pegg, that’s not where the development of the chatbot ended for Sage.

“For a very long time, humans have had to learn to speak like computers. But now we believe it's time for software to learn to speak like humans. And that's what we're focusing on.”

Challenging the black box approach

Sharma’s experience in computer engineering, which includes heading Big Data and Analytics at Barclays, meant she was already conscious of the ethical dilemmas of developing AI.

“I've always been very conscious of building technology the right way,” says Sharma.

“But when we started working on Pegg, we realised that the problems for ethics in business and financial services are very different to those in other domains.”

Sharma developed five core principles she believes the industry needs to adopt to ensure AI is built ethically. One of these is accountability, which relates to the AI “black box” approach.

“At the moment, AI is treated as a black box, and the reason for that is, if you build a bunch of models and add a lot of data, it pops out a suggestion or a result. And you don't exactly know what contributed to that result.”

An example would be Netflix or YouTube recommending the next movie or video you should watch.

“But when it comes to more serious applications in financial services or healthcare, it is very important to know what's going on,” she says.

“Why does the AI recommend a certain treatment or a certain diagnosis, or make a certain recommendation about your finances?”

“That's where we believe this black box approach of AI needs to be challenged.”

How diversity plays a part

The lack of diversity in workplaces across the world is a serious issue, and one that Sharma says carries over into AI.

“There’s a lack of diversity in the workforce and there's a shortage of women in technology – in AI even more so. The consequence of that, among other things, is that we end up with a bunch of very popular AI having feminine personalities,” says Sharma.

“In the early days of Siri, Microsoft's Cortana, Amazon's Alexa, even Google, they all had female voices. And they're doing simple tasks like turning lights on and off, scheduling meetings, ordering Ubers and getting you pizza, and that is just not fair.”

Pegg was developed with “British accounting humour” and a personality which is proud to be a bot. Pegg is also gender-neutral.

Sharma says thinking about personality and gender is an extremely important part of developing AI.

“AI is very different to other technologies we've seen before. It learns on its own. Humans build relationships with their AI. And that means it's very important to address these ethical design challenges in the right way.”

Adding value

Many chatbots in the past have relied on novelty value, but Sharma says we need to move beyond novelty and ensure AI delivers real value to consumers and businesses.

“If you just build a very approachable, fun, interesting bot, that's not going to lead to anything beyond the initial few interactions. Because then you're just going to get bored of harassing it, asking it out on dates, swearing at it and saying "I love you", which happens to all the bots.”

“We need to get to a point where the bot is adding value to humans. And that is where people will become more serious about this technology. Because the reality is it's not perfect.”

She admits that Pegg is not perfect, but says it’s important to grow and “be purposeful” with any AI.

Still a way to go

Pegg is currently available for small businesses, but this isn’t the end of the development road for Sage.

“AI is nothing new. It's been around since the 60s. So now is the time to figure out: how do we build it the right way?” says Sharma.

“It's not just about getting to the end results with AI, but also how it gets there. In the workplace, when we measure objectives or success, we look at what we've achieved and how we've achieved it. With AI, we need to model both of these principles.”

