
South Korea is investing heavily in AI. The Seoul Metropolitan Government plans to train 10,000 AI professionals annually. Major companies like Microsoft and Intel are backing education programs.
But what if all this is missing the point?
“Korea is doing what every ambitious country does when a new technology arrives: it’s training specialists,” said Vilas Dhar, president of the $1.5 billion Patrick J. McGovern Foundation, in an interview with The Korea Herald. “But building an AI future isn’t just about specialists. It should also be about citizens.”
Dhar, in Seoul for the 2025 Asian Leadership Conference on May 21-22, leads one of the world’s largest philanthropic institutions focused on AI and digital equity.
With a background in both computer science and law, he advises major global bodies including the United Nations, the OECD, and Stanford’s Institute for Human-Centered Artificial Intelligence.
As South Korea heads into a presidential election on June 3, both major parties are competing over bold visions for AI development. The Democratic Party has pledged a massive 100 trillion won (about $73 billion) investment, while the People Power Party has promised to train 200,000 young AI professionals. Both aim to position South Korea among the world’s top three leaders in AI.
But these plans, Dhar argues, reflect a familiar, and potentially dangerous, pattern seen in many countries: rapid investment in AI infrastructure and workforce development, with little attention paid to how the general public understands and engages with these powerful technologies.
During his visit here, he met with Seoul Mayor Oh Se-hoon and discussed some of the capital's recent AI initiatives, including the expansion of the “Seoul Software Academy.” The program now offers short-term training in AI coding and data skills at 20 campuses across the city. Just last month, Seoul added 45 new AI-related courses, touting a 76 percent job placement rate in 2024.
While acknowledging the program’s efficiency, Dhar pointed out that its focus remains overwhelmingly technical. “Training someone to code is useful,” he said. “But what happens when AI coding also gets replaced in 10 years? More importantly, what happens when that same person is later asked to decide whether an AI system should be used to allocate welfare or predict crime? Do they have the context? Do they know how to ask whether it’s fair or biased?”
Dhar sees a crucial distinction between “AI skilling” and “AI fluency.” The first, he says, is about teaching people how to build AI systems. The second is about equipping people to live with them: understanding what these systems do, how they affect daily life, and how to hold them accountable.
When asked what real leadership on AI education looks like, Dhar pointed to the US. In April this year, President Donald Trump launched a national initiative to introduce AI education across schools and workforce programs, with a White House task force coordinating efforts between educators, industry and government. “It’s not perfect,” Dhar said, “but it shows a willingness to ask: How do ordinary people learn to live with AI, not just build it?”
He doesn’t underestimate the challenge.
“Sometimes, elected officials don’t fully understand the systems they’re deploying, which is understandable,” he said. “But that’s also exactly why we need public institutions that make AI legible and accountable to ordinary people.”
Otherwise, he warned, the gap between those who build AI and those who live under it will keep growing.
“The most advanced AI society won’t be the one that codes the fastest. It’ll be the one where ordinary people know what AI is, what it isn’t, and how to live alongside it.”
mjh@heraldcorp.com