An emotional picture showing a mother and child in water went viral on social media, shared with claims linking it to the Bargi Dam cruise accident in Jabalpur, Madhya Pradesh. A fact-check investigation found the claim to be completely false. Posts sharing the picture claimed it was an actual photograph recovered during the rescue operation after the accident. The official X account of the Jabalpur district administration also issued a clarification, stating that the picture is either AI-generated or taken from some other source and has no connection with the Bargi cruise accident.
The cruise accident at the Bargi Dam in Jabalpur on April 30 had shocked the entire country. A cruise boat capsized in a strong storm; nine people died and 22 were rescued. After the incident, a picture began circulating on social media showing a woman and child in water, with the claim that it was the last picture taken after the accident. The posts also said the mother held her child close to her chest until the last moment. This emotional claim spread quickly and created confusion among the public.
PTI Fact Check: This edited picture of mother and son being shared on social media has no connection with the Bargi cruise accident.
Read: https://t.co/u92yzqCUI8
follow #PTIFactCheck on WhatsApp Channel https://t.co/yFNHsOCaQU pic.twitter.com/rK46j7a606
— PTI Fact Check (@ptifactcheck) May 1, 2026
PTI Fact Check also says the picture may be AI-generated and has no connection with the actual incident. In its initial investigation, PTI ran the picture through several AI-detection tools, and the results indicated that the photo is not entirely realistic. The Hive moderation tool rated it roughly 96 percent likely to be AI-generated, while the Sightengine tool was less conclusive, estimating the probability of AI generation at around 48 percent. In addition, SynthID analysis and a Google Gemini check flagged several unusual elements in the photo, including the shape of the hands, the unnatural movement of the water and the overly smooth skin texture, which are considered hallmarks of AI-generated images.
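The scores above show why a single detector is rarely decisive: one tool was near-certain while another was on the fence. A minimal sketch of how such confidence scores can be combined into a rough verdict follows; this is an illustrative assumption, not PTI's actual workflow, and the thresholds are invented for the example.

```python
# Toy sketch (NOT PTI's actual method): averaging confidence scores from
# several AI-image detectors and mapping the result to a rough verdict.
# The thresholds below are illustrative assumptions.

def classify(scores, high=0.85, low=0.40):
    """Average detector scores (0..1) and map them to a verdict string."""
    avg = sum(scores.values()) / len(scores)
    if avg >= high:
        return avg, "likely AI-generated"
    if avg >= low:
        return avg, "inconclusive - needs manual review"
    return avg, "likely authentic"

# Scores reported in the article: Hive ~96%, Sightengine ~48%.
scores = {"hive": 0.96, "sightengine": 0.48}
avg, verdict = classify(scores)
print(f"average score: {avg:.2f} -> {verdict}")
```

With the two reported scores the average lands in the middle band, which mirrors what happened here: the automated tools disagreed, so the fact-checkers went on to manual checks such as SynthID analysis and visual inspection of hands, water and skin texture.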
During the investigation, a clarification about the viral picture was also found on the official X account of the Jabalpur district administration. The Jabalpur Collector stated clearly that the picture is either AI-generated or taken from some other source and has no connection with the Bargi cruise accident. This official statement completely exposed the viral claim. The administration appealed to people not to share such unverified pictures and posts, as doing so spreads confusion and misinformation.
This case once again shows how quickly sentimental pictures can go viral on social media in the wrong context. Many users shared this picture in connection with the Bargi accident without any verification. The PTI Fact Check and the Jabalpur district administration's clarification made it clear that the claim is false and the picture has no connection with the actual incident. AI-generated and edited photos are now spreading rapidly, increasing the risk of misinformation, so it is essential to verify any viral content before sharing it.
Such pictures are usually made with Artificial Intelligence (AI) tools. No real photograph is required; the model generates the complete image from text input alone. First, the user provides a text prompt, such as "Mother and child in water with life jacket." The AI model interprets that description and generates a new picture to match it. The technology is built on deep learning and trained on large image datasets, allowing it to combine elements such as faces, water, clothing and surroundings into a realistic, photo-like output.
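The prompt-in, image-out interface described above can be sketched as a tiny stub. This is purely illustrative: a real generator such as Stable Diffusion or DALL-E runs a large trained diffusion model, while the stub below only derives deterministic pixel values from the prompt text to show the shape of the call.

```python
import hashlib

# Toy stand-in for a text-to-image generator. Real systems run a trained
# diffusion model; this stub only hashes the prompt into RGB pixel values
# to illustrate the prompt-in, image-out interface.

def generate_image(prompt: str, width: int = 4, height: int = 4):
    """Return a width x height grid of (R, G, B) tuples derived from the prompt."""
    digest = hashlib.sha256(prompt.encode("utf-8")).digest()
    pixels = []
    for y in range(height):
        row = []
        for x in range(width):
            i = (y * width + x) * 3
            # Cycle through the 32-byte digest to fill the three channels.
            row.append(tuple(digest[(i + k) % len(digest)] for k in range(3)))
        pixels.append(row)
    return pixels

image = generate_image("Mother and child in water with life jacket")
print(len(image), len(image[0]))  # prints "4 4"
```

The key point the stub captures is that the same prompt always goes in and a complete synthetic image comes out, with no source photograph involved anywhere in the pipeline.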
Nowadays many AI tools, such as Google Gemini, ChatGPT, Grok AI, Midjourney, DALL-E and other generative models, are used to create such images. Some of these tools have become so advanced that it is difficult to tell real images from fake ones. In some cases, such pictures are also created by combining different photos in editing software, for example by changing the background, adding faces or reworking the entire scene.
This is why many of the emotional or shocking pictures going viral these days are not real but have been created through AI or editing. Sharing any photo without checking it first can therefore spread misinformation.