While many businesses are eager to harness the latest AI technologies to improve processes, reduce costs, and gain other benefits, rapid adoption without the proper IT infrastructure and other guardrails in place can expose them to significant risks.
Case in point: the tribunal dispute between Air Canada and a passenger misled by the airline’s faulty chatbot. After months of contesting the claim, Air Canada was ordered to issue a partial refund to the passenger, who had been given inaccurate information about the airline’s bereavement travel policy by its chatbot.
In 2022, Jake Moffatt booked a flight from Vancouver to Toronto after his grandmother died. Unsure of how the airline’s bereavement rates worked, Moffatt consulted Air Canada’s chatbot. The chatbot incorrectly told him that he could book a full-fare flight and later submit his ticket for a reduced bereavement rate.
When Moffatt applied for the discount, he was told that the chatbot had given him inaccurate information: passengers needed to submit requests for reduced bereavement rates before their flights. Apart from refusing to issue the discount, Air Canada also refused to take any responsibility for the chatbot’s actions, arguing that the chatbot was a “separate legal entity that is responsible for its own actions”.
Air Canada did promise to update its chatbot and offered Moffatt a $200 coupon to use on a future flight.
Dissatisfied with this resolution, Moffatt filed a small claims complaint with Canada’s Civil Resolution Tribunal. After the case dragged on for months, the tribunal ruled in Moffatt’s favour. Tribunal member Christopher Rivers ordered Air Canada to pay a partial refund of the original fare, plus additional damages to cover Moffatt’s tribunal fees and interest on the airfare.
Establish Guardrails Before Jumping on the AI Bandwagon
Air Canada’s debacle is just one example of a business suffering reputational damage and other losses as a result of malfunctioning AI. Like any other technology, AI is not infallible, and even the best tools are prone to errors, malfunctions, and biases. Despite improvements in the underlying technology, generative AI chatbots and computer vision tools still hallucinate, and as the Air Canada case shows, the resulting damage can be reputational, financial, or worse.
While Velocity does not discourage its clients from exploring the latest AI tools or from using them to improve various business processes, it does caution businesses to establish certain guardrails before jumping on the AI bandwagon.
“We encourage our clients to be curious and open-minded about new AI technologies that can enhance productivity and improve business processes. However, we encourage them not to jump in too quickly without having the proper IT infrastructure, as well as checks and policies, in place to manage these tools,” notes James Dwyer, Velocity’s COO.
James recommends that businesses set up Modern Workplace Tools and Practices to transform their processes before initiating AI adoption.
Examples of Modern Workplace Tools and Practices include cloud storage and file sharing (for securely accessing and sharing documents from any device or location), collaboration platforms (which facilitate communication and collaboration among team members), and flexible work policies (which support remote and hybrid work arrangements without compromising productivity).
“Businesses should also establish comprehensive AI policies before implementing any of these tools,” James said. “Your AI policy should provide your employees with a clear understanding of their rights and responsibilities when it comes to AI. The AI policy should also cover key points, such as data privacy, bias, transparency, and accountability. Guardrails such as these can minimise risks resulting from malfunctioning AI.”
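For readers who want a concrete picture of what a technical guardrail can look like in practice, the sketch below shows one simple pattern: checking a chatbot’s draft answer against approved, human-reviewed policy text before it reaches a customer, and escalating to a human agent when the answer is not sufficiently grounded. The policy snippets, threshold, and function names are illustrative assumptions only; they are not drawn from Air Canada’s systems or any specific product.

```python
# Minimal sketch of an output guardrail: a chatbot's draft answer is only sent
# to the customer if it is grounded in approved policy text; otherwise the
# conversation is handed off to a human agent. All names, snippets, and
# thresholds below are hypothetical examples.

from difflib import SequenceMatcher

# Approved, human-reviewed policy statements the bot is allowed to rely on.
APPROVED_POLICY_SNIPPETS = [
    "Bereavement fares must be requested before travel and cannot be applied retroactively.",
    "Refund requests are reviewed by a customer service agent within 30 days.",
]

GROUNDING_THRESHOLD = 0.6  # similarity below this triggers human review


def grounding_score(draft_answer: str, policy_snippet: str) -> float:
    """Rough lexical similarity between a draft answer and a policy snippet."""
    return SequenceMatcher(None, draft_answer.lower(), policy_snippet.lower()).ratio()


def review_draft_answer(draft_answer: str) -> str:
    """Return the draft only if it matches approved policy closely enough;
    otherwise return a safe hand-off message instead."""
    best_score = max(grounding_score(draft_answer, s) for s in APPROVED_POLICY_SNIPPETS)
    if best_score >= GROUNDING_THRESHOLD:
        return draft_answer
    return "I'm not certain about this one. Let me connect you with an agent who can confirm."


if __name__ == "__main__":
    # A draft that contradicts policy scores poorly and is escalated rather than sent.
    risky_draft = "You can book now and claim the bereavement discount after your flight."
    print(review_draft_answer(risky_draft))
```

Real-world deployments would use more robust grounding checks than simple text similarity, but the principle is the same: no AI-generated answer about policy should reach a customer without being validated against an authoritative source or routed to a human.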
To learn more about our Modern Workplace Solution, get in touch for a free consultation.
Sources:
https://arstechnica.com/tech-policy/2024/02/air-canada-must-honor-refund-policy-invented-by-airlines-chatbot/
https://www.bbc.com/travel/article/20240222-air-canada-chatbot-misinformation-what-travellers-should-know