Seoul – South Korean President Yoon Suk Yeol unexpectedly declared emergency martial law on the evening of December 3, a move that shocked the nation, particularly after it was revealed that Yoon had dispatched troops to the National Election Commission amid the chaos.
Many suspect that Yoon's actions were influenced by far-right extremist YouTube channels, which have long spread election fraud conspiracy theories, including claims of irregularities in the 2020 and 2024 elections.
The suspicions gained traction after reports emerged that Yoon's fondness for such channels was an “open secret.” It was also reported that Yoon invited about 30 far-right YouTubers to his inauguration, including the operators of channels such as Lee Bong-gyu TV and Sisa Warehouse, which have about 927,000 and 144,000 subscribers respectively.
On December 13, Rep. Lee Hae-min, a former Google employee, said, “Mr. Yoon does believe in the election fraud theories circulated on far-right YouTube channels. He believes they are the root of all problems.”
Critics, including former People Power Party leader Han Dong-hoon, have warned that aligning with conspiracy theorists and extremist YouTubers who “create commercial fear” could destroy conservatism in South Korea, suggesting that such channels have permeated the country's politics.
As the controversy intensifies, the role of YouTube's algorithms in guiding political opinions has come under increasing scrutiny, raising the question: How does YouTube shape users' beliefs?
Confirmation bias
Experts believe the mechanical nature of YouTube's algorithm amplifies confirmation bias.
“When you open the YouTube homepage, recommended videos appear on the right side. These suggestions are based on the user's viewing history, and the way they are made carries a certain bias,” said Han Jeong-hun, a professor at Seoul National University's Graduate School of International Studies and author of the 2022 paper “Understanding Political Polarization Based on User Activity: A Case Study of Korean Political YouTube Channels.”
Han continued: “Human nature also plays a role, as people naturally gravitate toward content they like. These factors combine to create a cycle in which users are repeatedly exposed to similar content, reinforcing a biased consumption of information. Over time, this pattern of consumption may affect people's behavior.”
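The feedback loop Han describes can be illustrated with a toy simulation. This is a deliberately crude sketch, not YouTube's actual system: the topic names and the ranking rule (always suggest the user's most-watched topic) are hypothetical stand-ins for engagement-driven personalization.

```python
from collections import Counter

TOPICS = ["election_fraud", "cooking", "sports", "music"]

def recommend(history):
    """Toy ranker: suggest whichever topic the user has watched most.
    A crude stand-in for history-based, engagement-driven ranking."""
    if not history:
        return TOPICS[0]
    return Counter(history).most_common(1)[0][0]

def simulate(first_click, steps=50):
    """Simulate a user who always accepts the next recommendation."""
    history = [first_click]
    for _ in range(steps):
        history.append(recommend(history))
    # Fraction of the resulting feed that matches the very first click
    return history.count(first_click) / len(history)

print(simulate("election_fraud"))  # 1.0: the feed locks onto the first click
```

Because the ranker only ever reinforces past behavior, a single initial click captures the entire feed. Real recommenders mix in exploration and many other signals, but the reinforcing tendency this sketch exaggerates is the cycle the experts describe.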
Experts also note that extreme or radical content is more likely to go viral. “YouTube's algorithm is a mathematical function, and provocative content gets promoted. Propagandistic material, whether from the right, the left, or even sexual in nature, tends to be more visible because the platform's goal is to increase monthly active users,” said Yu Hyun-jae, a journalism professor at Sogang University.
YouTube's popularity among older users
In South Korea, YouTube's influence is particularly worrisome as a growing number of older users, often unaware of the platform's potential harms, are becoming increasingly active.
“There has been a significant increase in the number of people in their 50s, 60s and 70s turning to YouTube for political news and similar content,” said Lee Moon-haeng, a media communication professor at Suwon University.
“This shift is largely driven by convenience, as YouTube now offers content tailored to their preferences. Unlike their children, who may voice differing opinions, older adults find on YouTube a peer group that reinforces their views.”
One of the most pressing issues is that turning off the algorithm in order to explore more diverse content is difficult to sustain in practice.
“Even if you block the algorithm, you can't go a day without it. The convenience disappears, and you end up turning it back on,” Yu said.
Mozilla researcher Jesse McCrosky pointed out in a 2022 report that “even if users indicate that they do not want to watch it, YouTube will continue to recommend videos, just at a slightly reduced frequency.” He added, “This means that users cannot effectively control YouTube's algorithm.”
Urgent need for media literacy
Right-wing YouTubers, among the platform's most profitable, as well as active left-wing YouTubers reported to be spreading disinformation through major left-leaning communities such as The DDanziGroup and Ruliweb, now face a unique opportunity in what is expected to be an even more turbulent period for Korean politics following the brief chaos of emergency martial law.
Experts emphasize the urgent need for media literacy in this environment. “Because of its virtually unrestricted network and strong infrastructure, few countries have linked YouTube to politics as closely as South Korea,” Yu said. “The best solution is to educate the public on media literacy.”
Lee agreed, stressing that a lack of media literacy can harm society as a whole. “It's not just passively accepting information; it's sharing it without verifying its authenticity, often without any desire to check. People see something they consider important or accurate and spread it, amplifying misinformation.”
Han echoed this sentiment, arguing that blocking algorithms or regulating content is not only impractical but also detrimental to freedom.
“The public must be more aware. People should be able to filter content themselves, and if they suspect something is false, they should cross-check it against other media sources. This is what citizens in a democratic society should do.”