When AI turns cricket into controversy: How a fake ‘Burkha-Clad’ Bangladesh Women’s team fooled millions on social media
GH News October 14, 2025 04:06 PM
When Fake News Wears a Burkha: The AI Hoax That Fooled Cricket Fans

A crash course, in case you blinked and scrolled past. For a few sleepless hours, a picture of Bangladesh's women's cricket team bowling and fielding at the Women's World Cup decked out in full black burkhas sprang up on every social media feed: Twitter, Facebook, WhatsApp forwards, the full monty. The picture went viral. From provocative to patriotic, from irreligious to ironic, every caption whipped a storm out of it. Menstruation, women's rights, religious extremism, free speech, cultural self-assertion: if a holy-war-on-the-cricket-pitch Twitter handle was in your feed, there was a hashtag storming about your profile.

The match: Bangladesh vs South Africa, a tightly contested World Cup affair. The talking point du jour wasn't a towering six, an overzealous appeal or a batter who flaked out in the last over. Instead, social media lost its mind over a digitally manipulated picture, generated using Google's Gemini AI, of two Bangladeshi cricketers decked head to toe in black.

The original post, shared within minutes of the match, raked up thousands of shares and hundreds of binary verdicts in the comments. Praise it. Criticise it. Mock it. Boycott it. Celebrate it. Hate it. Cry blasphemy. Laugh at it. Love it. Whatever. In less than three hours the image was everywhere, trolling every culture-identity-religion-gender node there is on the web.

Some spotted the flaws early: unusual lighting, skewed shadows, unnatural textures. Their suspicions were confirmed when they zoomed in on a watermark near the edge, hidden in plain sight for anyone looking: Gemini.

Fact Check: Bangladesh Played Normally

First things first. The Bangladesh women's cricket team were on the field in their usual team jerseys: green with red sleeves, with the Bangladesh Cricket Board crest.
In every television shot from official broadcasters, in social media posts from the ICC and national cricket boards, in newspaper cut-outs and eyewitness accounts, the team played in their usual match-wear: no modifications, no additions, no subtractions. Bangladesh versus South Africa at the Women's Cricket World Cup took place as scheduled and without any controversy around hijabs or burkhas. The women's cricket team played a hard-fought match in their regular uniforms. That's all. No hullabaloo.

The viral picture? An AI-generated fake. The much-hyped "burkha match"? A digital hallucination. A rumour. A fake.

Viral News Underdogs: Your Daily Dose of Misinformation

Why do these hoaxes catch fire? For the same reason they have since time immemorial: clickbait. Lies and deception spread faster and wider when they fan popular prejudices, tapping into our collective hatreds, fears and fury, be it about mosques, women's freedom or a headscarf-vs-football furore. When an incident stokes rumours about religious hypocrisy or nationalist fervour, it's petrol on a pyre. Add to this perfect storm the current climate of debates, riots and censorship around women's rights to cover up, uncover, and do whatever the mood takes them to in sport, and one viral doctored image was enough to make it a game-changer. Sadly, some well-known blue-ticked Twitter accounts regurgitated the meme without a moment of fact-checking or inquiry. Blindly.

AI vs. Sports Truths

The good news is that it's unlikely to happen again, at least not in cricket. The bad news? It won't just be in cricket. This season the internet saw several matchday hoaxes, all fanning the flames, from celebrity sightings (George Clooney playing in the nets) to political smear-memetics (Indian opposition parties holding fake news rallies) to women's rights (claims that pregnant women should not play cricket at all). Some were doctored; others were old, unrelated photos.
AI image generators like Google's Gemini or OpenAI's DALL·E are becoming so sophisticated at churning out such hoaxes that fact-checkers worldwide now have teams of forensic analysts running before-and-after digital filters to analyse textures, odd angles in shadows and hidden watermarks, even the 'fingerprints' or flaws that reveal which AI tool was used.

Playing Away from the Margins

It's easy to laugh off sports hoaxes, rumour-mongering and the trolling that builds careers out of bellyaching at scantily clad female athletes, beach balls at Wankhede Stadium or some other non-news. It's less easy to ignore their wider impact: polarising cultures, misinforming the public, reinforcing negative stereotypes about women and Muslims, and generally distracting from actual cricket, like Bangladesh's spirited and valiant World Cup campaign.

For the players it's doubly cruel. Yes, international sport in any country, team or league is a pressure cooker, with every chance of failure off the field as on it. But now you don't even have to play to lose, when photos can be doctored, entire players dressed up as fiction, and all kinds of lies recirculated as received truth at the click of a button.

Simple Tools to Beat Fake News

Reverse image search: if an image has no provenance on official sites like the ICC's or ESPNcricinfo's, don't trust it.

Look for watermarks: AI images sometimes carry 'invisible ink' watermarks branding them as AI-generated, such as those from Gemini or DALL·E. Zoom in on the edges of a picture to look for these.

Cross-verify with reports: if the entire official media circuit is silent on something, it's unlikely to be true.

Don't share until you verify: a fake may start with a single person's post. But one share can turn fiction into "fact" for millions.

Don't Believe Your Eyes

The tale of the "burkha-clad Bangladesh team" at the 2025 Women's Cricket World Cup is going down in the history books as one of the most audacious AI deepfakes ever to go viral.
The team didn't break any world records or dress codes. The viral "burkha match" didn't happen. It was just another run-of-the-mill, pull-the-rug-out-from-under-your-feet moment for AI-generated misinformation. So if you see something this bananas, just pause before you share. No records, no riots, just rumours in black-and-white pixels that only a forensic deepfake unit could untangle. Your feed will thank you for it.
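A footnote for the technically curious: the reverse image search recommended in the checklist above doesn't match pictures byte for byte. Search services typically compare compact perceptual fingerprints, so a re-compressed or lightly edited viral photo can still be traced back to its original. The sketch below is an illustrative "difference hash" in plain Python, not any particular search engine's actual algorithm; the synthetic "images" are just grids of grayscale values.

```python
# Illustrative perceptual "difference hash" (dHash), pure Python, no libraries.
# Visually similar images produce similar hashes, so near-duplicates can be
# found cheaply even after brightness tweaks or re-compression.

def downscale(pixels, width, height):
    """Naive nearest-neighbour resize of a 2D grayscale grid (list of rows)."""
    src_h, src_w = len(pixels), len(pixels[0])
    return [[pixels[r * src_h // height][c * src_w // width]
             for c in range(width)]
            for r in range(height)]

def dhash(pixels, hash_size=8):
    """64-bit fingerprint: one bit per horizontally adjacent pixel pair,
    set when the left pixel is brighter than its right neighbour."""
    small = downscale(pixels, hash_size + 1, hash_size)
    bits = 0
    for row in small:
        for left, right in zip(row, row[1:]):
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Bits that differ between two hashes; a small distance suggests
    the two images are the same picture."""
    return bin(a ^ b).count("1")

# A synthetic 32x32 gradient "photo", a brightened copy of the same scene,
# and an unrelated checkerboard: the copy hashes close, the stranger far.
photo = [[min(255, c * 8) for c in range(32)] for _ in range(32)]
edited = [[min(255, v + 30) for v in row] for row in photo]
other = [[255 if (r + c) % 2 else 0 for c in range(32)] for r in range(32)]

assert hamming(dhash(photo), dhash(edited)) <= 4   # near-duplicate
assert hamming(dhash(photo), dhash(other)) >= 8    # clearly different image
```

Real services index billions of such fingerprints; the point for a reader is simply that "find the original" is cheap and automated, which is why the reverse-image-search habit is worth building before hitting share.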
© Copyright @2025 LIDEA. All Rights Reserved.