China's AI model DeepSeek has created a stir worldwide, becoming the most downloaded free app on the Apple App Store in the US and giving tough competition to ChatGPT. At the same time, serious concerns are being raised about its data privacy and potential for misuse. Many leading AI experts have advised users to use the chatbot cautiously and avoid sharing sensitive information, because they believe that data entered into DeepSeek may be accessible to the Chinese government, putting users' private information at risk.
Experts say that DeepSeek's success shows how rapidly artificial intelligence is developing and how influential a role it will play in the future. They argue that governments must therefore set strict rules to control the use of AI and protect users' privacy, and that governments and regulatory agencies should create a framework that ensures AI technology is used transparently and securely so that users' privacy and security are maintained.
Risk of data theft
Michael Wooldridge, a professor of AI at the University of Oxford, warned users not to enter sensitive data into the chatbot. According to a report by the Guardian, Wooldridge said, "I think it's fine to download it and discuss the performance of Liverpool Football Club or the history of the Roman Empire. But would I recommend putting any sensitive, personal, or confidential information in it? No ... because you don't know where the data goes."
United Nations AI advisor Dame Wendy Hall highlighted the potential dangers of DeepSeek, saying, "If you are a Chinese tech company and are doing information-related work, you are subject to the rules of the Chinese government. Companies are obliged to share information with the government." Companies like DeepSeek have to follow the Chinese government's instructions, which calls the privacy of users' data into question.
Surveillance and propaganda
With DeepSeek's growing popularity, experts are also concerned that the app could be used in surveillance and propaganda campaigns. Ross Burley, co-founder of the Centre for Information Resilience, expressed serious concern, saying, "We have seen over and over again how Beijing uses its technological dominance to monitor, control, and suppress, whether domestically or abroad." Burley believes that if technologies like DeepSeek are not controlled, they could fuel misinformation campaigns, undermine public trust, and strengthen authoritarian ideology within democratic systems.
Governments are also worried
Some countries are also worried about DeepSeek's growing popularity. UK Technology Secretary Peter Kyle said that the app includes censorship and that users should be careful before downloading it. He said on 'The News Agents' podcast, "I think people should decide on this themselves because we have not taken enough time to understand it fully. This is a Chinese model in which censorship is already included, so it does not have the freedom that you expect from other models." Australia's national security agencies have not yet issued any formal advice on the use of the app, but Australia's Housing Minister Clare O'Neil has advised caution. She said on the 'Sunrise' program, "This app was launched only a few days ago ... Our national security agencies are currently investigating its settings and trying to understand how it works."
Users should act with discretion
The emergence of DeepSeek has made clear the tension between technological innovation and data privacy. Experts and government officials are advising users to be cautious and think twice before sharing personal information. As Wooldridge said, "Curiosity is fine, but vigilance is necessary." In this context, users should ensure that they keep their personal and sensitive information safe when using AI models like DeepSeek.
Disclaimer: This content has been sourced and edited from News18 Hindi. While we have made modifications for clarity and presentation, the original content belongs to its respective authors and website. We do not claim ownership of the content.