Mr Schmidt agreed with US export controls on the advanced microchips that power the most capable AI systems.
Before he left office, former US President Joe Biden restricted the export of microchips to all but 18 countries, in order to slow adversaries’ progress on AI research.
The decision could still be reversed by Donald Trump.
“Think about North Korea, or Iran, or even Russia, who have some evil goal,” Mr Schmidt said.
“This technology is fast enough for them to adopt that they could misuse it and do real harm,” he told Today presenter Amol Rajan.
He added that AI systems, in the wrong hands, could be used to develop weapons, enabling “a bad biological attack from some evil person”.
“I’m always worried about the ‘Osama Bin Laden’ scenario, where you have some truly evil person who takes over some aspect of our modern life and uses it to harm innocent people,” he said.
Bin Laden orchestrated the 9/11 attacks in 2001, in which hijacked planes were used to kill thousands of people on American soil.
Mr Schmidt called for a balance between government oversight of AI development and avoiding over-regulation of the sector.
“The truth is that AI and the future is largely going to be built by private companies,” Mr Schmidt said.
“It’s really important that governments understand what we’re doing and keep their eye on us.”
He added: “We’re not arguing that we should unilaterally be able to do these things without oversight, we think it should be regulated.”
He was speaking from Paris, where the AI Action Summit finished with the US and UK refusing to sign the agreement.
US Vice President JD Vance said regulation would “kill a transformative industry just as it’s taking off”.
Mr Schmidt said the result of too much regulation in Europe “is that the AI revolution, which is the most important revolution in my opinion since electricity, is not going to be invented in Europe.”
He also said the large tech companies “did not understand 15 years ago” the potential that AI had, but do now.
“My experience with the tech leaders is that they do have an understanding of the impact they’re having, but they might make a different values judgment than the government would make,” he said.