Freeman Cebu Business

The new AI systems – should we be scared?

INTEGRITY BEAT - Henry Schumacher - The Freeman

If you ask me: yes, we should be scared. With Artificial Intelligence (AI) starting to become ubiquitous, governments around the world are thinking that it’s high time to start creating protections for users.

For example, the Biden Administration has released a blueprint for a Bill of Rights that would try to govern how AI systems are developed. Although the US federal government won’t be able to legally enforce the principles, documented cases of AI negatively impacting people’s health and quality of life may push Congress to codify them into law.

Programmed protection

Here are the main points that the Biden administration’s Office of Science & Technology outlined in a “Blueprint for an AI Bill of Rights”:

• The systems should be safe and effective, with input on them given by various communities to ensure that blind spots are addressed.

• There should be no discrimination built into the algorithms, and companies should continually monitor and test the systems to ensure they stay that way.

• Users should know when an AI system is using their data and what it means for the system to use it — nothing should be left up to assumption.

• Privacy is paramount, so users should be protected from “abusive data practices” and have control over how their data is used.

• Users should be able to opt out of their data being included in the systems (and be able to do so by talking to an actual person).

The White House said that federal agencies using AI systems would soon adhere to all these principles, encouraging private companies to follow suit. It won’t be able to actually enforce their adoption in the private sector, but government agencies will still monitor how the technology is used.

Crowdsourced rights

While the White House hopes to be an authority in AI regulation, it’s not the first entity to start thinking about reining in the tech’s power.

• IBM published a set of principles back in 2017, which proposed that companies (or the AI systems themselves) should be able to explain how their underlying algorithms work.

• The EU also did so in 2019, calling for “trustworthy AI” built on several ethical principles.

• And the Vatican — yes, the Vatican — wrote a “framework” that pushed for AI to be developed in such a way that it protects “the rights and the freedom of individuals so they are not discriminated against by algorithms.”

And with AI systems soon to be powering humanoid robots, these rules can’t be implemented soon enough!

Why I am scared

Automated systems can influence or even determine important aspects of our lives, including healthcare, employment, housing, financing, and education.

Security principles should be incorporated into AI systems to ensure their safety and transparency, limit the impact of algorithmic discrimination, and give users control over their data.

Let’s look at the real-world consequences of failing to put such principles into practice:

• A model meant to predict the likelihood of sepsis in hospitalized patients underperformed and caused “alert fatigue” with false warnings.

• A hiring tool that “learned” from a workforce that was predominantly male rejected women applicants, penalizing resumes that contained language like “women’s chess club captain” when ranking candidates.

• These technologies are causing real harm in people’s lives, and that harm runs counter to our core democratic values, including the fundamental right to privacy, freedom from discrimination, and our basic dignity.

• Will AI understand integrity and ethics?

What’s your view? Will you also fight uncontrolled AI? I am looking forward to your views – contact me at [email protected]
