

Artificial Intelligence – what about its ethical deployment?

INTEGRITY BEAT - Henry Schumacker - The Freeman

While companies are making progress in deploying artificial intelligence (AI) and machine learning, more than half of enterprises overestimate their level of maturity in deploying responsible AI models. A responsible AI framework evaluates the technology's potential effects on safety, privacy and society at large.

Businesses often struggle with three areas of responsible AI maturity:

• fairness and equity,

• social and environmental impact mitigation, and

• safeguarding human well-being and preserving human authority. I like the emphasis on human authority.

AI technology underpins critical company systems. It's becoming the backbone of services across the enterprise, from consumer-facing applications to tools that help coordinate manufacturing or logistics. But leaders have long struggled with the ethical dimensions of the technology, including privacy and bias.

Businesses are working to deploy AI, but for many the ethical dimensions of the technology are still out of reach. According to Gartner data, companies find that security and privacy concerns are an obstacle to AI implementation. There are also challenges in integrating AI solutions with existing architecture and in managing data volume and complexity.

The benefits of responsible AI, however, include brand differentiation, an upper hand in employee recruiting and retention, and a culture of responsible innovation.

Customer expectations, risk mitigation, regulatory compliance and social responsibility should also drive business leaders toward responsible AI deployment.

To deal with the potential ethical implications of AI, business leaders need to focus on the minimum requirements expected by the Integrity Initiative and the minimum compliance enforced by the National Privacy Commission.

Ethics can come up in unusual ways, such as blowback for not supporting social issues or labor practices. "Tech is not immune" to the ethical issues faced across the supply chain, such as whether to support a vendor known for union busting or one contracting with controversial organizations. Amazon and Walmart have started to review the extent of their AI implementation.

IT business decisions bring a whole new set of ethical challenges. The use of data and AI, for example, can present privacy and discrimination issues on top of traditional supply chain ethics. And it falls to IT leaders to account for these ethical dilemmas.

These ethical issues affect a business's bottom line, too. Strong ethics sustains trust, which is good for competitive advantage. Ethics is becoming much more visible to stakeholders across the board, and they are using it as a measure of trust, both internally and externally.

If an organization doesn't do its ethical due diligence, customers will catch on and trust will be diminished!

In conclusion, yes, AI is part of our future, but it is essential to understand the need for its ethical deployment. We must also understand that humans need to manage AI and ensure that it does not drive the organization's performance in the wrong direction.

As mentioned above, responsible AI supports brand differentiation and gives an upper hand in employee recruiting and retention. To deal with its potential ethical implications, business leaders need to focus on the minimum requirements expected by the Integrity Initiative and the minimum compliance enforced by the National Privacy Commission.

I look forward to your views on this topic; contact me at [email protected]
