Deep Dive into DeepSeek
May 12, 2025
This story was originally published in SiouxFalls.Business on April 30.
A research team at Dakota State University has taken a deep dive into the artificial intelligence company DeepSeek.
DeepSeek is a large language model that is reportedly faster than similar apps, which earned it widespread attention after its initial release. That attention soon turned into questions about data privacy.
“When it first came out, there was concern because it’s a product of a Chinese company,” said Will Campbell, digital forensics analyst with DSU’s Digital Forensics Lab. Many countries have placed restrictions on the app, including Australia, Canada, the Netherlands, South Korea and Italy, as have the U.S. Navy and NASA, he stated.
Lab director Dr. Arica Kulm said a research project was the best way to determine whether concerns about the app were legitimate, and she tasked Campbell and undergraduate research assistant Reina Girouard with the project.
Exploring new technology like this is one of the roles of the Digital Forensics Lab.
“This project demonstrates our commitment to thought leadership in cybersecurity—examining cybercrime, open-source intelligence, and emerging technology to investigate complex threats,” said Dr. Ashley Podhradsky, vice president for Research & Economic Development at DSU. “Under Dr. Arica Kulm’s leadership, the initiative continues to elevate our role as a trusted voice in the digital landscape.”
President José-Marie Griffiths said this project came about at her request.
“One of the roles we can play is to conduct a thorough cybersecurity investigation of questionable products. Most importantly, we can do so in a controlled lab environment,” she stated, adding that similar investigations have been conducted by DSU's MadLabs in the past couple of years.
The primary objective of the DeepSeek project was to explore the data acquisition capabilities of the app’s services, understand their potential impact on user data, and examine the broader implications. The team looked at what the application does, whom it communicates with, and what its privacy policies and implications are.
“We did find legitimate concerns and similarities to applications like TikTok,” Campbell said. “We found DeepSeek had security issues and data breaches, which included user information and chat conversations, so we questioned its security.”
The full results are outlined in a 27-page blog post on the Digital Forensics Lab webpage.
They found that a data breach did occur, though Girouard pointed out that the information was not publicly exposed. Still, other research articles confirmed clear vulnerabilities, and the fact that a breach could occur at all was a major concern.
Other data privacy concerns included the app’s collection of device information, such as location data, which is transmitted to DeepSeek. Because DeepSeek is a Chinese company, Campbell said, that information is subject to Chinese law, which raises data privacy concerns.
They also found examples of model bias. For example, questions about Taiwan were either shut down or answered in a way that aligned with Chinese ideals. Interestingly, this happened on the web application; if the model was downloaded and run offline (locally), the response was closer to what one would expect from a typical chatbot, Girouard said.
“Running locally, it didn’t censor the answer, but was still biased toward the Chinese perspective,” she said.
Responses to questions on historical events, like the Tiananmen Square protests and massacre in 1989, were surprising. Online, the app would begin a response, then erase it, and finally put up a sentence saying that it couldn’t talk about that incident.
“Watching that in real time was kind of creepy,” Campbell admitted. When run locally, the output was still biased in some way, even without the app’s visible censorship.
Attempts to replicate results were challenging. “It wasn’t consistent,” said Girouard. “We would try a prompt again to see if we could recreate it to document it, but we couldn’t, which was really interesting.”
The team’s final advice is simple.
“Stay away from DeepSeek,” Campbell said. For anyone who does run this or a similar app, “for data privacy and security, it's definitely worth running a model like that locally,” though users should still expect some bias, along with some data privacy and security risk, even when running locally.
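For readers curious about what “running locally” can look like in practice, below is a minimal sketch of querying an open-weight DeepSeek model hosted on one’s own machine through Ollama, a common local model runner. The model tag, endpoint and prompt are illustrative assumptions, not part of the DSU team’s methodology; the point is simply that prompts and responses stay on the local machine rather than being sent to DeepSeek’s servers.

# A minimal sketch (not from the DSU report) of querying a DeepSeek model running
# locally through Ollama's HTTP API. Assumes Ollama is installed and a model has
# been pulled first, e.g. with:  ollama pull deepseek-r1:7b
# Because the model runs on localhost, prompts and responses never leave the machine.

import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default local endpoint
MODEL = "deepseek-r1:7b"                            # example model tag (assumption)


def ask_local_model(prompt: str) -> str:
    """Send one prompt to the locally hosted model and return its full reply."""
    payload = json.dumps({
        "model": MODEL,
        "prompt": prompt,
        "stream": False,  # request a single JSON reply instead of a token stream
    }).encode("utf-8")

    request = urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read())["response"]


if __name__ == "__main__":
    print(ask_local_model("What happened at Tiananmen Square in 1989?"))

Running the same prompt against the web application and against the local model is roughly the kind of web-versus-local comparison the researchers describe.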
The blog is available on the digital forensics website, along with others: https://blogs.dsu.edu/digforce/2025/04/09/forensic-analysis-and-security-implications-of-deepseek/