February 17, 2025
Kuala Lumpur – This was written by Gemini AI, a generative artificial intelligence chatbot from Google. I had prompted it to write a 50-word article for this political column in Philip Golingai's writing style, on how AI and deepfakes may affect Malaysian politics.
For that effort, I gave Gemini a D+. For another task, where it mimicked my writing style, it earned a B-.
“Hahahaha! Very good. It has some of your fingerprints,” a senior media consultant told me when I shared the B- piece with him.
I was impressed because, in the 300-word article, it used “Donggongon”, a small town in Penampang, my home district near Kota Kinabalu. However, it used too many mixed metaphors, such as ikan bilis, rock, monsoon, latok-latok (sea grapes) and Sherlock Holmes, where I would use only one or two.
I've been playing with AI to see if it can write like I do. Earlier this week, another media practitioner contacted me to ask if I was responsible for an anonymous article.
I said no. I suspected a good writer, or AI, had written it.
Our chat inevitably turned to AI becoming more prominent in Malaysian politics.
“Like you said, it's terrible. As the election gets closer, I wouldn't be surprised if there were video reels of YBs in sexual acts,” he said.
I would be surprised if there were still political operatives who think this dirty tactic works.
Such video reels are juicy, but do they work?
Take, for example, the politician who allegedly appears in such a video. Of the last two elections he contested, he lost in GE15 (the 15th General Election at the end of 2022) and won in a state election in 2023.
It can be argued that he lost his parliamentary seat because voters in his constituency rejected his political stand, not his alleged personal behaviour. His win in the state seat shows that a juicy video reel won't necessarily derail a political career.
But a deepfake can sound or look real, and voters can and do fall for it.
Take last year's US presidential election. Then-President Joe Biden called on New Hampshire voters not to vote in the state's Democratic primary. “We know the value of voting Democratic when our votes count. It's important that you save your vote for the November election,” he said.
Except the voice was not Biden's. It was a deepfake created with AI.
The Biden deepfake shows how elections can be manipulated with fake articles, fake photos, and even fake audio and video.
This week, I saw an amateurish fake poster (probably created by a human rather than AI) with a photo of a Sabah lawyer/politician edited to place him with a woman. The caption claimed he had abandoned the woman and returned to his wife. It also claimed he was a pervert out to bring down the government.
At first glance, it looked real. But when I examined the photo carefully, I noticed the man's head was disproportionate to his body. About an hour later, the lawyer/politician posted on Facebook that the photo was fake. Someone then shared the original photo in a chat group; the man in it was not the lawyer/politician, and it had appeared in a news story about cheating.
But how many of us would have believed that fake photo?
Should we worry that AI and/or deepfakes may become a factor shaping Malaysia's political landscape?
Let me ask Gemini AI.
Its 185-word answer was too long, so I asked for a summary. Here is the result: “AI and deepfakes pose a major threat to Malaysian politics by spreading misinformation, eroding trust, increasing polarisation and potentially enabling foreign interference. However, AI can also be used for positive purposes.”
It all boils down to voters not being conned. But even for someone as sceptical as I am, some articles or videos can sound and look convincing. And, as a spinmeister once told me, a lie told many, many times takes on a life of its own and becomes the truth.
Here is an example.
What is memorable about the 2006 murder of Mongolian model Altantuya Shaariibuu is the use of C4 explosives. But if you Google “C4”, “Altantuya” and “court case”, the second result is a news article published in 2014 titled “We never said 'C4', prosecutor tells Altantuya murder trial”.
“The UTK never used C4 explosives. We never said the explosives were C4, but from day one, people have been talking about C4,” Datuk Tun Abdul Majid Tun Hamzah, who led the prosecution in the appeal, told the court. UTK refers to the Malaysian police's Special Actions Unit.
But most people don't read court reports. For them, the “truth” is that C4 was used.
A few days ago, I was with a defence lawyer involved in the case, and I wanted to confirm whether what was said in court was true. He confirmed that C4 was never used.
Right now, a big rumour is circulating that may have an impact on Malaysian politics.
A businessman told me: “My tycoon friend called me about it.”
His call prompted me to ring politicians, journalists, diplomats and people familiar with the matter. Interestingly, in trying to get confirmation, I became the one lending the rumour credibility. And it is humans, not artificial intelligence, who are spreading it.
Since Malaysian politics can be opaque, it is difficult to verify the authenticity of such talk. And, as usual, people will say there is no smoke without fire.
I asked Gemini AI about it, and it replied: “I can't help with responses on elections and political figures right now. I'm trained to be as accurate as possible, but sometimes I make mistakes.”
Sometimes, AI just can't beat the inside information that humans have.