Expert warns Biden’s AI order has ‘wrong priorities’ despite some positive reviews

President Biden signed what he called a “landmark” executive order (EO) on artificial intelligence, drawing mixed reviews from experts in the rapidly developing technology.

“One key area the Biden AI [executive order] is focused on includes the provision of ‘testing data’ for review by the federal government. If this provision allows the federal government a way to examine the ‘black box’ algorithms that could lead to a biased AI algorithm, it could be helpful,” Christopher Alexander, chief analytics officer of Pioneer Development Group, told Fox News Digital. 

“Since core algorithms are proprietary, there really is no other way to provide oversight and commercial protections,” Alexander added. “At the same time, this needs to be a bipartisan, technocratic effort that checks political ideology at the door or this will likely make the threat of AI worse rather than mitigate it.”

Alexander’s comments come after Biden unveiled a long-anticipated executive order containing new regulations for AI, hailing the measures as the “most sweeping actions ever taken to protect Americans from the potential risks of AI systems.”

The executive order will require AI developers to share safety test results with the government, create standards to monitor and ensure the safety of AI and erect guardrails meant to protect Americans’ privacy as AI technology rapidly grows.

“AI is all around us,” Biden said before signing the order, according to a report from The Associated Press. “To realize the promise of AI and avoid the risk, we need to govern this technology.”

Jon Schweppe, policy director of the American Principles Project, told Fox News Digital that the concerns about AI behind the executive order are “warranted” and praised some of its details, but argued that parts of the order focus “on the wrong priorities.”

“There’s a role for direct government oversight over AI, especially when it comes to scientific research and homeland security,” Schweppe said. “But ultimately we don’t need government bureaucrats micromanaging all facets of the issue. Certainly we shouldn’t want a Bureau of Artificial Intelligence running around conducting investigations into whether a company’s AI algorithm is adequately ‘woke.'”

Schweppe argued that there is also a role for “private oversight” of the growing technology, while also noting that AI developers should be exposed to “significant liability.”

“AI companies and their creators should be held liable for everything their AI does, and Congress should create a private right of action giving citizens their day in court when AI harms them in a material way,” Schweppe said. “This fear of liability would lead to self-correction in the marketplace — we wouldn’t need government-approved authentication badges because private companies would already be going out of their way to protect themselves from being sued.”

The order was designed to build on voluntary commitments the president helped broker earlier this year with some of the largest technology companies, which require the firms to share AI safety data with the government.

Ziven Havens, policy director of the Bull Moose Project, told Fox News Digital that Biden’s order is a “decent first attempt at AI policy.”

“A significant portion of the EO is setting expectations for guidelines and regulations for topics including watermarks, workforce impact and national security,” Havens said. “All of which are crucial in the future of this new technology.”

But Havens also cautioned that there should still be concerns about “how long it will take to develop this guidance.”

“Falling behind in the AI race due to a slow and inefficient bureaucracy will amount to total failure,” Havens said.

Phil Siegel, founder of the Center for Advanced Preparedness and Threat Response Simulation, told Fox News Digital that Biden’s order was “thorough,” but questioned whether it attempted to “take on too much.”

Siegel argued that there are “four pillars to AI regulation”: protecting vulnerable populations such as children and the elderly; developing laws that “take into account the scope of AI”; ensuring that algorithms are fair by removing bias; and ensuring trust and safety in algorithms.

“I would give the EO high marks on [pillars] three and four and more of an incomplete on one and two,” Siegel said. “Sadly, there is only so much that can be done in an EO anyway, and it is necessary for Congress to engage with the White House to make some of this into law.”

The White House did not immediately respond to a Fox News request for comment.
