President Joe Biden issued a wide-ranging executive order on Monday that aims to safeguard against threats posed by artificial intelligence, seeking to ensure that bad actors cannot use the technology to develop devastating weapons or mount supercharged cyberattacks.
But how do you regulate something that has the potential to both help and harm people, that touches every sector of the economy and that is changing so quickly that even the experts can’t keep up?
That has been the main challenge for governments when it comes to artificial intelligence.
React too quickly and you risk writing bad or harmful rules, stifling innovation or ending up in the European Union’s position: it proposed its A.I. Act in 2021, just before a wave of new generative A.I. tools arrived, rendering much of the act obsolete.
In addition, a cultural battle has broken out in Silicon Valley, as some researchers and experts urge the A.I. industry to slow down and others push for its full-throttle acceleration.
The order stakes out a role for the federal government in a nearly half-trillion-dollar industry at the center of fierce competition among some of the nation’s largest companies, including Google and Amazon.
One of the order’s more consequential mandates requires companies developing the most advanced A.I. models to report information about model training, parameter weights, and safety testing to the government. Transparency about safety-test results sounds practical, but in reality it could discourage tech companies from doing more testing, since any results would have to be shared with the federal government. Moreover, the very essence of A.I. research is iterative experimentation, and this mandate could bog companies down in red tape and reporting when they should be tweaking their models to improve safety. Given these tradeoffs, it’s unclear that all this reporting will improve safety for anyone.
The Biden administration also calls on Congress to pass data privacy legislation, an achievement that has eluded lawmakers for years despite multiple attempts.