
GenAI advisory: Lawyers can't find any legal basis for government's LLM missive



The recent government advisory asking companies to take permission before launching generative artificial intelligence models may not have a solid legal foundation, technology lawyers told ET.

According to them, there are questions about the statutory basis and enforceability of such an advisory, especially the requirement to seek government permission for under-testing models. Terms like "bias" and "unreliability" are too vague, and companies may not be able to ensure compliance even with the best intent, they argued.

“It is not clear which legal provision MeitY (Ministry of Electronics and Information Technology) is relying on for this requirement and what it will cite to legally enforce this. There is no definition of ‘unreliable’ in the advisory either,” said Ranjana Adhikari, partner at IndusLaw.

To be sure, IT minister Ashwini Vaishnaw said on Monday that the advisory was not legally binding.

When serious regulatory objectives such as election integrity are at stake, the regulatory framework must have unimpeachable statutory validity, precise definitions, clear obligations and predictable enforcement mechanisms, said Dhruv Garg, a lawyer and tech policy advisor.

“Piecemeal regulation is not the answer. AI regulation is a complex techno-legal issue and must be done through a comprehensive legislative process centred on wide public consultation,” Garg said.

Aaron Solomon, managing partner, Solomon & Co, said: "In our view, the defined meaning of an intermediary would not extend to ChatGPT, Gemini, Krutrim and Perplexity AI, which are genAI technologies, and the exemption granted under the IT Act to intermediaries would not apply to OpenAI, Google, Ola and Perplexity."

Section 79 of the IT Act provides an exemption to intermediaries on third-party content and has not been designed to protect entities which provide content generated by themselves, he reasoned. He was referring to the safe harbour provision under which intermediaries are protected from liability for third-party content.

People+ai, an initiative by EkStep Foundation, is collating views from India’s AI community and startup ecosystem on the recent government advisory. EkStep is an organisation cofounded by Infosys cofounder Nandan Nilekani.

People+ai is “gathering insights from Indian startup founders on their concerns and aspirations regarding the responsible development and deployment of AI in India”, said a form created by it seeking responses from startups.

“Your feedback will be shared with policymakers and the broader AI community to shape future regulations that foster innovation and societal good,” it said.

Conversational AI platform Haptik's chief executive Aakrit Vaish told ET that the form went live on Wednesday and that once there were enough responses, he, along with members of People+ai, would meet MeitY officials.

"The biggest question is that of applicability. If a company builds AI products, do they have to submit details of their model to the government, or do fine-tuned models also come under the purview?" Vaish said.

Two of the questions on the form are: What are your short-term concerns regarding AI regulation in India? What other aspects of AI development and deployment in India would you like the Government of India to support?

ET could not immediately reach People+ai head Tanuj Bhojwani and director of strategy and operations Tanvi Lall.

(Dia Rekhi in Chennai contributed to this article.)


