Tech Newsday

Movement to Hold AI Accountable Gains Ground

By TND News Desk
December 8, 2021
in Artificial Intelligence

Efforts to better understand how AI works and hold users accountable are rapidly gaining ground, and a number of initiatives are beginning to unfold.

Last month, the New York City Council passed a new law requiring the testing of algorithms used by employers in hiring or promotion.

The law, the first in the nation, requires employers to include outsiders in assessing whether an algorithm shows bias based on sex, race, or ethnicity. Employers are also required to inform applicants who live in New York when artificial intelligence plays a critical role in deciding whether to promote or hire staff.
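The law leaves the audit methodology open, but bias audits of this kind commonly compare selection rates across demographic groups, with the "four-fifths rule" from US employment-law guidance as one widely used threshold. A minimal sketch of such a check, using hypothetical data (the group names, counts, and the 0.8 cutoff are illustrative, not terms of the law):

```python
from collections import defaultdict

def selection_rates(records):
    """Compute the fraction of applicants selected, per group.

    records: iterable of (group, selected) pairs, e.g. ("women", True).
    """
    counts = defaultdict(lambda: [0, 0])  # group -> [selected, total]
    for group, selected in records:
        counts[group][0] += int(selected)
        counts[group][1] += 1
    return {g: sel / total for g, (sel, total) in counts.items()}

def impact_ratios(rates):
    """Ratio of each group's selection rate to the highest rate.

    Under the four-fifths rule, a ratio below 0.8 is treated as
    evidence of adverse impact.
    """
    best = max(rates.values())
    return {g: r / best for g, r in rates.items()}

# Hypothetical hiring outcomes: 40/100 men selected, 24/100 women.
records = ([("men", True)] * 40 + [("men", False)] * 60
           + [("women", True)] * 24 + [("women", False)] * 76)
rates = selection_rates(records)
ratios = impact_ratios(rates)
# The women's ratio (0.24 / 0.40 = 0.6) falls below 0.8, flagging the tool.
```

An external auditor would run this kind of comparison on the employer's real applicant data; the point of the law is that the employer cannot do it purely in-house.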

Members of Congress in Washington, DC are drafting a bill that would require companies to evaluate automated decision-making systems used in health care, housing, the labour market, or education, and to notify the Federal Trade Commission of the findings. Three of the five FTC commissioners favour stricter regulation of algorithms.

Last month, the White House proposed a Bill of Rights requiring disclosure when AI makes decisions that affect a person’s civil rights, and called, among other things, for greater scrutiny of AI systems to rid them of prejudice.

A forthcoming report by the Algorithmic Justice League (AJL), a private non-profit organization, advocates disclosing when an AI model is in use and creating a public repository of incidents in which AI has caused harm.

The repository will help auditors identify potential problems with algorithms and help regulators sanction repeat offenders. AJL co-founder Joy Buolamwini co-authored a major audit in 2018 that revealed that facial recognition algorithms work best for white men and worst for women with darker skin.
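That 2018 audit reached its conclusion by disaggregating accuracy by demographic subgroup rather than reporting a single overall number. A minimal sketch of that kind of evaluation, with made-up predictions (the group labels and counts are illustrative, not the audit's actual data):

```python
def error_rates_by_group(samples):
    """Disaggregate a classifier's error rate by demographic group.

    samples: iterable of (group, correct) pairs, where `correct` is
    whether the model's prediction was right for that sample.
    """
    totals, errors = {}, {}
    for group, correct in samples:
        totals[group] = totals.get(group, 0) + 1
        errors[group] = errors.get(group, 0) + (0 if correct else 1)
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical results: 1 error in 100 for one group, 35 in 100 for another.
samples = (
    [("lighter-skinned men", True)] * 99
    + [("lighter-skinned men", False)] * 1
    + [("darker-skinned women", True)] * 65
    + [("darker-skinned women", False)] * 35
)
rates = error_rates_by_group(samples)
# A 0.01 vs 0.35 error rate exposes a gap an aggregate score would hide.
```

Reporting only the pooled error rate here (roughly 18%) would mask the disparity; the per-group breakdown is what made the audit's finding legible.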

The report underlined the importance of independent auditors and of making audit results publicly available.

Deb Raji, an Audit Evaluator Fellow at AJL, participated in the 2018 review of facial-recognition algorithms. She calls for the creation of an audit oversight body within a federal agency to enforce standards or act as a mediator in disputes between companies and auditors, similar to the Financial Accounting Standards Board or the Food and Drug Administration's standards for evaluating medical devices.

Cathy O’Neil founded O’Neil Risk Consulting & Algorithmic Auditing (Orcaa) to assess artificial intelligence that is inaccessible to the public. One example is Orcaa’s work with the attorneys general of several US states to assess financial and consumer algorithms. O’Neil said she has lost potential clients because companies prefer to maintain deniability and choose not to learn how their artificial intelligence might harm people.

In a forthcoming paper in the Harvard Journal of Law & Technology, UCLA professor Andrew Selbst argues for documentation that helps people fully understand how AI harms them. Such impact-assessment documentation, he notes, will be vital for anyone who wants to file a lawsuit.

First introduced in 2019, a revised version of the Algorithmic Accountability Act is now being deliberated in Congress. The bill would require companies that use automated decision-making systems in health care, housing, employment, or education to conduct regular impact assessments and report their findings to the FTC.

As recently as August this year, the Center for Long-Term Cybersecurity at the University of California, Berkeley declared that a tool developed by the federal government to assess the risk of AI should include factors such as a system’s carbon footprint and its potential to perpetuate inequality, and called on the government to take stronger action against AI harms than it has against cybersecurity threats.

In 2020, users uncovered bias against people with darker skin in algorithms used by Twitter and Zoom. In response, Zoom tweaked its algorithm and Twitter ended its use of AI for cropping photos.

Another report, published in June this year by Data & Society’s AI on the Ground team, explains why community activists, critical scholars, politicians, and technologists working in the public interest should be included in the assessment of algorithms. The report argues that what counts as an impact is often a reflection of the wants and needs of people in power. Done incorrectly, impact assessments can entrench existing power structures while making businesses and governments appear accountable, instead of empowering ordinary people to act when things go wrong.

For more information, you may view the original story from Ars Technica.

Tags: Artificial Intelligence


About Tech News Day

In just 10 minutes you will have all your leadership tech news needs covered. Our Editors browse the top tech news sites for you, get rid of the fluff and post summaries of the best. Our content is created by trained professionals and enhanced for IT leaders using leading edge artificial intelligence.


© 2022 Tech News Day
