The ChatGPT app is displayed on an iPhone in New York, May 18. A federal judge imposed $5,000 fines on two lawyers and a law firm, June 22, in an unprecedented case in which ChatGPT was blamed for their submission of fictitious legal research in an aviation injury claim. AP-Yonhap
Countries with no regulations will become havens for 'bad actors'
By Kim Yoo-chul
Advanced economies are seeking to maximize the economic and national security benefits of artificial intelligence (AI) because of its applications in sectors such as connected vehicles, embedded technology solutions, healthcare, education and even military systems.
To seize the available opportunities and move forward with technological innovation, countries are seeking ways to build solid and sustainable AI ecosystems. AI systems are advancing at a breakneck pace with the rise of ChatGPT and Stable Diffusion, which exponentially expand the possibilities of the technology.
Within that context, Korea ― home to the world's two largest memory chipmakers ― hopes to maintain its economic vitality. And AI, a transformative technology that can significantly enhance productivity and improve efficiency across various industries, has been identified as its next area of growth.
The government says it hopes to make the country one of the world's top three AI powerhouses by 2027. However, little is known about how it plans to achieve this. Korea is ahead of Japan when it comes to utilizing AI, although Japanese policymakers are also seeking to capitalize on the technology.
Both Seoul and Tokyo have similar export-driven economic growth models and the two countries are also home to leading high-tech companies. Korea's strength in chip manufacturing is expected to help the country advance its AI systems faster, according to government officials.
But is it possible for Korea to become a hub for AI innovation? And should the government develop applicable accountability measures for AI systems?
AI experts contacted by The Korea Times said it is possible for Korea to become a "mid-power alternative" because national rankings in computing power ― one of the three core factors for the development of modern AI systems, alongside data collection and algorithm design ― can differ depending on chip supply chains and cloud computing capacity. However, they added that Korea is weak in terms of AI literacy, AI talent and AI research.
No monopoly within AI
Korea's core strength in AI, aside from its position as a leader in semiconductors, is that it is one of only a few countries with companies that have developed foundation models for AI.
U.S. President Joe Biden, Governor of California Gavin Newsom and other officials attend a panel on artificial intelligence, in San Francisco, California, June 20. Reuters-Yonhap
"LG and Naver, for example, are looking for ways to create a niche for themselves, outside the market focus of major players like OpenAI. AI is, for the next decade, what search engines were for the previous decade. There's tremendous power in AI leadership as consumers rely more and more on AI for informational purposes," Modulus CEO Richard Gardner noted in a written interview.
Modulus is the largest holder of fintech intellectual property and has built financial services and product technologies for major brokerages, hedge funds, trading firms and leading technology companies ― including Merrill Lynch, JP Morgan Chase, Goldman Sachs, Barclays, Shell, Microsoft, Yahoo and NASA. Gardner also recently appeared on Fox News to propose responsible AI regulations, alongside U.S. Representative Ted Lieu, the California Democrat who is a major proponent of such policymaking in Congress.
According to Gardner, it is important to ensure that a monopoly does not form in the AI industry.
"And Korea can offer alternatives outside of those backed by big-tech," the CEO said, adding that securing continued access to advanced semiconductors is absolutely necessary for building domestic computing capacity with fine-tuned AI capabilities ― supporting public cloud computing and data centers, for example.
In a separate written interview, Anthony Clemons, a senior training manager for a Fortune 500 firm, said that while he viewed Korea's commitment to AI as commendable, continued efforts to foster talented AI engineers from a young age could further strengthen the country's position in the sector.
The expanding capabilities of generative AI have a lot of experts worried. While not all of them agree on whether AI technology should be regulated, the Biden administration favors responsible AI systems as the White House considers placing regulations on developments such as ChatGPT.
In the EU, Google is talking with regulators about the bloc's regulations and how it and other "big tech" companies can build AI safely and responsibly. The European Parliament recently passed the "EU AI Act," which includes provisions aimed at ensuring that the data used for generative AI tools doesn't violate copyright laws.
In Korea, the National Assembly Broadcasting and ICT Committee has passed proposed legislation to enact the "AI Act," which could take effect later this year. As in the EU, the AI Act protects the "rights to content" produced by people, differentiating it from content generated by AI.
Lawmakers at the European Parliament in Strasbourg, eastern France, vote on the Artificial Intelligence Act, June 14. European Union consumer protection groups urged regulators, June 20, to investigate the type of artificial intelligence underpinning systems such as ChatGPT, citing risks that leave people vulnerable and the delay before the bloc's groundbreaking AI regulations take effect. AP-Yonhap
Crafting regulations for emerging technologies requires a lot of work ― and may not be the most effective approach ― as such technologies are among the hardest to control due to their fast pace of development, according to experts.
Gardner, however, stressed the necessity of Korea developing a detailed national regulatory strategy for AI.
"There are questions about consumer privacy, as well as on the programming of AI leading to bias or hallucination. Then, of course, there are questions related to employment. The balancing act is to design a regulatory strategy which protects the general welfare without hampering AI development. Korea, like every other country, does need a strategy for AI," he said.
According to the Modulus CEO, countries without "any regulation at all" will become havens for "bad actors" to continue to develop and experiment with AI, so it is "critical" that responsible governments ensure that "positive actors" have the same opportunities for innovation.
"Because if one country cracks down on AI to a point where it is not feasible to keep up with global developments, other countries, as well as international bad actors, will not have those same prohibitions. AI is a technology where you don't want your enemy to be ahead of you in development. It's impossible to 'stop' AI research or otherwise put the genie back in the bottle," Gardner elaborated.
Citing the EU AI Act as a valid model, Clemons, who also researches AI in education at Northern Illinois University in the U.S., highlighted the importance of articulating a national strategy to guide the responsible and beneficial development of AI.
"Such a strategy can help Korea align its resources and policies towards common goals. It would emphasize education and skill development to prepare the workforce for an AI-driven future," Clemons responded.
He added that Korea's AI Act, if enacted, could help the country ensure the responsible and ethical use of AI. "But, it's vital that the Act should strike a balance between safeguarding interests and encouraging competitiveness in the AI sector. While protecting data sovereignty is critical, it's equally important to ensure that AI regulations don't stifle innovation."