How A Kenyan Jobseeker And ChatGPT Ended Up Inside Myanmar’s Cyber-Slavery Machine

TLDR: A Kenyan graduate lured by a fake “customer service” job in Thailand ends up trafficked into a Myanmar scam compound, where he’s forced to use free ChatGPT to pose as a rich American crypto investor and defraud U.S. real estate agents—showing how everyday AI now supercharges global fraud from brutal “cyber slavery” hubs along the Thai–Myanmar border. Behind his story is a wider pipeline: Kenyan recruiters, tourist visas, secret river crossings into militia-run scam cities like KK Park and Shwe Kokko, where tens of thousands of trafficked workers endure 16–19 hour shifts, beatings, electric shocks, and threats of organ harvesting while running romance and investment scams at industrial scale, powered by Starlink, crypto, and large language models. Kenya is scrambling to rescue victims, sue fake agencies, and warn jobseekers even as some returnees, now trained in AI-assisted scamming, are re-recruited into crime; meanwhile, militias, organized crime, money-laundering platforms, and tech firms all profit or look away. The piece argues that AI isn’t inherently evil but, in a world of weak regulation and deep inequality, becomes an efficiency tool for whoever already holds power—raising urgent questions about how governments, financial systems, and AI companies will curb this abuse, protect workers like Duncan, and prevent a future where fully automated scams make human suffering less visible but even harder to fight.


"ChatGPT was helping."

That's how 26-year-old Duncan Okindo described the tool sitting in his browser tab inside a Myanmar scam compound.

Picture this: A vast, fluorescent-lit room along the Moei River on the Myanmar–Thai border. Hundreds of trafficked workers sit in rows at desktop computers. Armed guards patrol the corridors. It's 11 p.m., Duncan has been at his desk since morning, and he's copying a question from a U.S. real estate agent on Zillow into the free version of ChatGPT.

His orders? Pose as a wealthy American crypto investor. ChatGPT rewrites his clumsy draft into confident, slangy English. The agent relaxes. The trap tightens.

Duncan will later tell Reuters that ChatGPT was "the most-used AI tool to help scammers do their thing" in the compound.

If you think of chatbots as homework helpers or productivity hacks, that lands like a glitch in the matrix.

But this is where AI in cyber scams, human trafficking in Myanmar, and Kenya's unemployment crisis collide—in the scam compounds of Southeast Asia, where people like Duncan are forced to turn everyday AI into an engine for global fraud.

The UN estimates at least 120,000 people are trapped in scam operations across Myanmar and another 100,000 in Cambodia. These aren't call centers. Survivors describe 16- to 19-hour shifts under armed guard, electric shocks for missing quotas, debt bondage, and threats of organ trafficking.

Duncan's story is one way into a much bigger system—and it starts with a Nairobi job ad.


From Nairobi to a locked room on the Thai border

Duncan was just trying to be a responsible adult.

Struggling to find work as his family's breadwinner, he spotted what looked like a dream: a customer service job in Bangkok, advertised by a Kenyan recruitment agent. The salary looked solid. The catch? Upfront fees around 200,000 shillings for "processing" and a tourist visa.

He paid. At Jomo Kenyatta International Airport, agency reps walked him and seven other recruits through check-in, snapping photos on the plane and posting updates in WhatsApp groups. It felt organized, official.

The bait-and-switch started in Bangkok.

Passports were confiscated "for safekeeping." The group was told to pose as tourists. Instead of heading to a gleaming office, they were packed into a van for a long night drive to the border town of Mae Sot.

Then came the river crossing—no immigration counters, no stamps. Just a boat across shallow water and a slow realization: this wasn't Thailand anymore.

On the other side rose a walled compound with guards and watchtowers. Inside were dorms, canteens, and floors of offices full of computers. It looked eerily like the drone shots of Silicon Valley campuses you see in tech ads, except this was one of Myanmar's border scam hubs.

The job was not customer service. It was forced online fraud.

Duncan worked in a room with hundreds of other trafficked workers, most targeting foreigners. Days blurred into nights behind screens, hitting strict quotas under constant surveillance. Survivors from these compounds describe electric shocks, beatings, and threats of being sold to "harsher" facilities if they didn't perform.

Kenya's Principal Secretary for Diaspora Affairs, Roseline Njogu, has publicly confirmed victims' reports of torture, electric shocks, physical beatings, and even threats of organ harvesting in these operations.

Duncan eventually got out during a wider crackdown and made it home. He's since spoken to journalists and posted TikToks warning others.

He's not alone. Between 2022 and late 2024, Kenya repatriated 150 citizens from "Golden Triangle" scam hubs. In early 2025, another 175 Kenyans were rescued, including 78 who landed at JKIA in April.

In November 2025, Justice Byram Ongaya of Kenya's Employment and Labour Relations Court ordered recruitment firm Gratify Solutions International Ltd to pay Ksh 5 million in compensation to Haron Nyakango, a trafficked student. The court found the company and its directors liable for trafficking him to a Myanmar scam compound under false promises of a customer service job in Thailand.

This is about AI. But it's also about Nairobi classifieds, Thai visas, and courts playing catch-up with transnational crime.


Where AI enters the scam workflow

Duncan's assignment was specific: target U.S. real estate agents on Zillow.

Here's how AI turned that into an industrial process:

Targeting and outreach
He'd trawl property sites for agents advertising their services. ChatGPT's job? Turn his non-native English into believable American investor speak—friendly intros, local slang, neighborhood references. Each message sounded personalized, not copy-pasted.

On-the-fly research
When agents asked sharp questions about specific U.S. neighborhoods, housing trends, or crypto jargon, he pasted their messages into ChatGPT and got back confident-sounding answers in seconds. He didn't need to understand crypto markets or U.S. real estate deeply. The model handled it.

Iteration and objection handling
When a script stopped working or a suspicious agent pushed back, he'd ask ChatGPT for alternative angles: new ways to build trust, softer approaches to re-engage people who'd already been burned, emotional hooks to keep conversations going.

The basic "pig butchering" playbook stayed the same: build a relationship, persuade the target to open a crypto account, direct them to what looks like a trading platform but is actually a wallet controlled by the syndicate. Once the money transfers, it's gone.

AI didn't invent that scam. It made it faster, smoother, and scalable across dozens of targets per week.

Duncan's account matches what Europol and the UN Office on Drugs and Crime have been warning: large language models lower the barrier to running complex, multilingual fraud campaigns. Europol noted that tools like ChatGPT help scammers craft "more authentic-sounding messages to gain victims' trust…faster, much more authentically, and at a significantly increased scale."

Recent testing by Harvard researcher Fred Heiding found several mainstream chatbots—including ChatGPT, Meta AI, Grok, and DeepSeek—would, under certain prompts, help design phishing campaigns, craft scam emails, and map out fraud workflows.

OpenAI says it actively works to identify and disrupt scam-related misuse of ChatGPT. The company has reported dismantling operations, including some that appeared to be based in Cambodia, Myanmar, and Nigeria, using ChatGPT to create fake investment websites and generate personas as financial advisors.

All of that can be true. And yet, in Duncan's room full of trafficked workers, the free version of ChatGPT was still helping scammers "do their thing."


Scam compounds: fraud factories in a war zone

Zoom out from Duncan's desk and you hit a landscape that sounds like dystopian fiction but shows up in court filings and satellite images.

KK Park and Shwe Kokko in Myanmar's Karen State sit on the Moei River opposite Thailand. Investigations describe sprawling complexes with villas, casinos, banks, and hospitals—high walls, watchtowers, CCTV, and private ferry crossings. Thousands of workers, many trafficked, many under armed guard.

The UN human rights office estimates around 120,000 people in Myanmar and another 100,000 in Cambodia are forced to carry out online scams. The UN Office on Drugs and Crime says these hubs are part of an industry worth tens of billions of dollars annually. Interpol's chief has warned that cyber scam and trafficking rings, born in Southeast Asia, are now generating up to $3 trillion globally as they expand into new regions.

Why here? Because the Thai–Myanmar border is what one analyst calls "Dark Zomia"—a frontier of weak governance, high corruption, and armed groups who rent out territory and protection.

Groups like the Karen National Army (KNA), the Border Guard Force (BGF), and the Democratic Karen Benevolent Army (DKBA) lease land, provide security, and sell electricity and internet connections to scam operators. The U.S. Treasury has sanctioned KNA, DKBA, Yatai International Holding Group, Cambodia's Prince Group, and others as transnational criminal organizations tied to scam compounds. Americans alone suffered over $10 billion in losses from these Southeast Asian networks in 2024.

When Thailand cut power and internet to border areas in early 2025, compounds didn't disappear. Many switched to diesel generators and, notoriously, to Starlink satellite internet. By October 2025, an AFP investigation found more than 2,000 Starlink terminals—Elon Musk's SpaceX satellite internet service—in use at scam centers in Myanmar.

Meanwhile, Myanmar's junta has staged televised raids and demolitions at KK Park and Shwe Kokko. Satellite analysis by the Australian Strategic Policy Institute suggests only about 13 percent of KK Park's total area was actually destroyed. The rest? Business as usual, or quietly migrating elsewhere.

So you have war-torn territory, militia-run "special economic zones" built on human trafficking, and Western and Chinese tech—satellite dishes to AI chatbots—woven into the infrastructure.

Into this steps a Kenyan graduate trying to pay rent.


The Kenya–Thailand–Myanmar pipeline

Kenyan officials, survivors, and NGOs describe a now-familiar route:

Online recruitment in Kenya
Ads on social media promise high-paying jobs in Thailand—customer care, IT support, teaching English, translation. Fees of 150,000–300,000 shillings are common for "visa processing" and air tickets.

Smooth departures
Victims get tourist visas and leave via JKIA, often escorted by agents. Embassies and airlines see a stream of "legitimate travelers."

Bangkok as decoy
On arrival, passports are taken. People are told they'll finalize job paperwork later. Instead, they're driven overnight to Mae Sot.

Illegal crossing to Myanmar
They cross the Moei River via informal boats and find themselves in walled compounds—KK Park, Shwe Kokko, and others—often without fully realizing they've left Thailand.

Kenya's embassy in Bangkok has been raising alarms for years. Principal Secretary Roseline Njogu has warned repeatedly: there are no legitimate job opportunities for Kenyans in the Golden Triangle. Prime Cabinet Secretary Musalia Mudavadi calls it "a sophisticated menace" and says nearly 500 Kenyans have been rescued from Southeast Asian scam operations since July 2022.

Why target Kenyans and other East Africans? Because syndicates want English speakers comfortable with tech who can be trained quickly to run U.S.- and Europe-facing scams, use AI tools, and navigate platforms like WhatsApp, Telegram, LinkedIn, and crypto exchanges.

It's a dark compliment: language skills and digital literacy—sold at home as tickets to a better life—make you more "valuable" to traffickers.

Kenya has started hitting back: coordinated rescues and repatriations with Thailand and Myanmar, public awareness drives about fake Thailand jobs, and legal accountability like the Gratify Solutions case.

But resources are thin. Officials told Parliament they needed 80 million shillings for anti-trafficking operations—the money used to bring stranded Kenyans home—but received only 20 million in the 2025/2026 budget, a shortfall of 60 million shillings.

There's a new worry: some rescued Kenyans, now trained in high-end cybercrime and familiar with AI-driven scam scripts, have gone back to work for syndicates or even become recruiters themselves. Mudavadi cited a case of a Kenyan man rescued in March 2025 who was later arrested in Thailand after illegally re-entering to work for a Chinese-owned scam company in Myanmar.


Who's really responsible?

It's tempting to pin this all on "bad guys with guns" or "evil tech." The reality is messier, with overlapping responsibility:

Militias and local bosses cash in by turning borderlands into scam cities, leasing out land, guards, and utilities to transnational crime networks. The U.S. Treasury says the Karen National Army collects roughly 50 percent of the approximately $192 million earned annually from Shwe Kokko scam operations.

Organized crime syndicates design the business models—pig butchering, romance scams, fake investment platforms—and decide humans are cheaper than customer support software.

Financial platforms and laundering hubs like Huione Group quietly move the money. U.S. regulators identified Huione as having laundered at least $4 billion in illicit proceeds between August 2021 and January 2025.

Governments talk tough but frequently move slowly or selectively:

  • Myanmar's junta profits from the very scam centers it theatrically "raids."
  • Thailand plays both enforcer and enabler: transit hub, power supplier, and, too often, a state that jails trafficking victims as fraudsters.
  • Origin countries like Kenya scramble to rescue citizens with limited funds while dealing with domestic unemployment that keeps the pipeline flowing.

Tech companies are caught in their own contradictions. Generative AI is rolled out at high speed, marketed as democratizing productivity. Misuse is handled reactively—after journalists demonstrate how easy it is to get phishing help or survivor testimonies expose ChatGPT being used inside scam compounds.

None of this means AI tools are inherently evil. But in a world of weak oversight and high inequality, they rarely stay neutral.

The same large language model that helps a teenager in Nairobi with a college essay can, in another tab, help a trafficked Kenyan in Myanmar sound like a charming crypto guru in Texas.


When AI replaces the trafficked

Some experts think scam bosses will eventually automate away roles people like Duncan were forced into.

As AI models improve at maintaining long, convincing chats, switching languages smoothly, and generating deepfake video and cloned voices, there's a real chance crime networks will lean harder on bots and lighter on dormitories full of trafficked workers.

That could reduce human trafficking in one part of the chain. But hard questions remain:

If fewer foreign nationals are visibly held in compounds, will governments in Africa, Europe, or East Asia lose interest in crackdowns? Will the people who remain—often local or poorer migrants—become even more invisible and disposable?

What happens to the thousands already "trained" in AI-assisted scamming when they come home to economies with few jobs?

Kenyan officials already worry some returnees might set up local scam operations with skills they picked up under duress. That doesn't make them villains. It makes the whole system look like a perverse, global, unregulated "bootcamp" for cybercrime.

And just as AI helped Duncan perform as an investor to his targets, it's now helping traffickers polish fake job ads and recruiting funnels that look even more legitimate to the next wave of jobseekers.


What real accountability looks like

No single fix resolves a problem at the intersection of tech, trafficking, and transnational crime. But some levers are obvious:

For AI and platform companies:

  • Build friction into products: Flag or rate-limit accounts generating high volumes of "investment pitch" messages or romance-style scripts across languages.
  • Report more, spin less: Publish regular, granular transparency reports on dismantled scam networks linked to known geographies and typologies.
  • Work with those on the ground: Partner with anti-trafficking NGOs and survivor groups to understand how AI is being used in compounds and what red flags look like from the inside.

For governments and regulators:

  • In origin countries like Kenya: Properly license and monitor recruitment agencies; shut down and prosecute fake job exporters. Treat cyber-scam trafficking as a national security issue. Fund survivor reintegration—counseling, skills training, economic support—so "rescued and abandoned" stops being the default.
  • In transit/destination hubs: Fully implement the non-punishment principle so trafficked "scammers" aren't jailed for crimes they were coerced into. Close loopholes that let scam compounds quietly reconnect to electricity, internet, and banking after every "crackdown."
  • In the financial system: Follow the money with the same intensity used for terror finance. Sanction laundering nodes, crypto mixers, shell companies, and complicit casinos.

For individuals:

  • As potential jobseekers: Be seriously skeptical of overseas job ads, especially "Thailand" roles in customer care, translation, or IT that require big upfront fees. Cross-check offers with official labor ministries and embassies. If a recruiter resists verification, treat the offer as a trap.
  • As potential scam targets: Don't confuse polished English or detailed knowledge with legitimacy. AI has erased "bad grammar" as a reliable red flag. Slow down. Scams thrive on urgency and secrecy: limited-time investment, can't use official channels, must communicate only via messaging apps.
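The red flags above can be distilled into a worked example. This is a rough heuristic checker, not a vetted classifier—the patterns are assumptions drawn from the recruitment tactics described in this piece:

```python
import re

# Hypothetical red-flag patterns distilled from the reporting above:
# upfront fees, vague Thailand-based roles, urgency, messaging-app-only contact.
RED_FLAGS = {
    "upfront_fee": re.compile(r"\b(processing|visa)\s+fee|upfront\b", re.I),
    "vague_role": re.compile(r"\bcustomer (care|service)\b.*\b(thailand|bangkok)\b", re.I | re.S),
    "urgency": re.compile(r"\b(limited slots?|apply (now|today)|urgent)\b", re.I),
    "messaging_only": re.compile(r"\b(whatsapp|telegram) only\b", re.I),
}

def job_ad_red_flags(ad_text: str) -> list[str]:
    """Return the names of red-flag patterns found in a job ad."""
    return [name for name, pattern in RED_FLAGS.items() if pattern.search(ad_text)]

# Example ad echoing the lures described in Duncan's case.
ad = ("Customer service job in Bangkok, Thailand! Limited slots. "
      "Pay a small visa fee to secure your place. Contact us on WhatsApp only.")
```

Running `job_ad_red_flags(ad)` on the example trips every pattern; a single hit warrants checking with an embassy, and several together are the profile of the ads that trafficked Duncan.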

None of this is glamorous. It's the opposite of the shiny "AI will change everything" narrative.

That's exactly why it matters.


Who does our tech really serve?

Back in Nairobi, Duncan is no longer in that fluorescent-lit room. He's home, working through trauma, talking to journalists, and warning others not to board the same flights he did.

His story is not just horror from Myanmar. It's a mirror.

The real scandal isn't simply that scammers use ChatGPT or that scam compounds in Southeast Asia are cruel. It's that a whole ecosystem—recruiters in Nairobi, militias in Karen State, anonymous shell companies in Phnom Penh, polite press releases in San Francisco—made it normal for a young Kenyan chasing a job to end up weaponizing AI against strangers while being tortured if he didn't hit quota.

AI is often sold as "intelligence," but intelligence without conscience is just efficiency for whoever already holds the power.

The real test of these systems is whether they end up serving people like Duncan—or the networks that once owned his passport, his time, and, through a browser window, even the words coming out of his keyboard.