Wednesday, January 29, 2025

DeepSeek may face trust issues in India – Technology News


Chinese artificial intelligence startup DeepSeek created waves on Monday, rocking global technology stocks as America's technological dominance in the field came into question. The shockwaves through Silicon Valley were not without reason.

The Chinese model, an alternative to OpenAI's ChatGPT and Google's Gemini, relies fully on open-source technology and lower-end chips, unlike its US counterparts. Developed at a cost of just $5.6 million, it sidesteps the need for high-end hardware restricted by US export controls. Simply put, DeepSeek is available at a cost that is 2% of what users would spend on OpenAI's o1 model.

However, despite these advantages, it is unlikely to expand its reach in India the way other Chinese tech products have in the past. The reason: the trust factor. Since the 2020 border clashes with China, the government has been restricting Chinese technology because it does not fall under the trusted-sources category. As a result, Indian firms developing applications are likely to continue depending on US AI technologies and on graphics processing units manufactured by Nvidia.

Tech policy experts said that, given the history of bans on security grounds of Chinese apps like TikTok and of equipment from companies such as Huawei and ZTE, DeepSeek could face similar scrutiny in the country.

Officials said the Chinese generative AI model is currently being evaluated, and any decision related to trusted sources would come in the voluntary ethics code, which is in the works.

“It is premature to talk about any kind of security implications with the use of open-source models like DeepSeek. However, the companies looking to leverage the same should do their due diligence, given that there have been concerns in the past over Chinese apps,” said Dhruv Garg, partner at the Indian Governance and Policy Project (IGAP) and a tech lawyer.

“The security threat of using models like DeepSeek can be avoided if the companies using those models are deploying the solutions on their servers located locally,” said Ankush Sabharwal, founder and CEO of CoRover and BharatGPT.

Experts point out that users of these models could end up transferring personal data, just as happens in the case of ChatGPT. This means that if domestic firms opt for DeepSeek, they would need to ensure that they do not use the readymade hosted models, but instead use the open model and its APIs to build their own applications, with the data hosted on their own servers.
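As an illustration of the local-deployment approach the experts describe, the sketch below assumes an open-weights model has already been served behind an OpenAI-compatible inference endpoint running on a firm's own machine; the endpoint URL, model name, and response format are placeholder assumptions for this example, not DeepSeek's hosted API.

```python
# A minimal sketch of the "keep data on your own servers" approach:
# prompts go to an inference server on local infrastructure rather than
# to a third-party hosted service, so no user data leaves the premises.
# The endpoint URL, model name, and OpenAI-style response shape are
# assumptions about how the open-weights model has been deployed locally.

import requests

LOCAL_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # assumed self-hosted server
MODEL_NAME = "deepseek-r1"  # assumed local model identifier


def ask_local_model(prompt: str) -> str:
    """Send a prompt to the locally hosted model and return its reply."""
    payload = {
        "model": MODEL_NAME,
        "messages": [{"role": "user", "content": prompt}],
    }
    response = requests.post(LOCAL_ENDPOINT, json=payload, timeout=60)
    response.raise_for_status()
    # Assumes an OpenAI-compatible response body.
    return response.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(ask_local_model("Summarise India's data protection rules in two sentences."))
```

In this setup the only network hop is to the firm's own server, which is the point Sabharwal makes about avoiding the security threat of sending data to an external service.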

Kamesh Shekar, senior programme manager at The Dialogue said, “DeepSeek offers an affordable, open-source alternative, particularly amidst tightened American export controls. However, its rise also highlights a growing dependency on Chinese models, raising critical questions about technological self-reliance and global digital sovereignty”.

According to Shekar, the cross-border data flow provisions being worked out by the government as it rolls out rules to implement the data protection law may restrict certain types of data from reaching China or other countries, which would act as a brake on using the Chinese tech.

Jameela Sahiba, senior programme manager at The Dialogue, said, “This also is a call for action for developing countries like India to optimise their compute and training resources, enhance efficiency, and foster innovation in home-grown model development, addressing regional needs with locally tailored AI solutions”.

According to Sahiba, the launch of the $500 billion Stargate project in the US and the latest export control order on graphics processing units (GPUs) opens the whole debate around the geopolitical implications of AI. For India, advancing indigenous AI capabilities will not only strengthen its technological sovereignty but also allow it to engage more effectively in geopolitical dialogues on equitable access to AI advancements and infrastructure.

While India is also looking at building its own compute infrastructure and own large language models (LLMs), these are early days and proposals in this regard are yet to be finalised.
