
IndiaAI’s RegTech Challenge: A Deal Only A Desperate Start-up Would Sign
- Business
- Published on 5 March 2026 6:00 AM IST
IndiaAI and NFRA have designed a challenge that asks India's most capable AI startups to build critical regulatory infrastructure for the price of a studio apartment in a Mumbai suburb. The fine print is even more oppressive.
India's financial watchdog is drowning in paper. The National Financial Reporting Authority (NFRA), the body responsible for overseeing India’s financial audit quality and reporting, receives thousands of filings every year and checks them by hand. Human reviewers, manual checklists, and not nearly enough of either.
The obvious fix: an AI engine capable of extracting text, tables, and financial data from multi-format documents, segmenting them, and validating compliance against a configurable regulatory framework. The brief also includes an automated analytics layer and a chatbot interface over NFRA's entire corpus.
NFRA and IndiaAI have jointly set out to do that. IndiaAI is an Independent Business Division (IBD) under the Digital India Corporation (DIC) of the Ministry of Electronics and IT (MeitY). It serves as the implementation agency of the IndiaAI Mission.
Their IndiaAI Financial Reporting Compliance Challenge asks India's best AI startups to build an automated compliance engine for the regulator.
The winning startup will receive up to Rs 1 crore over two years. In exchange, it builds, deploys, and maintains a mission-critical government system — and surrenders meaningful control over the intellectual property it creates along the way. If a dispute arises, the government decides the outcome. If the startup continues improving its own product after the contract ends, it must keep sharing those improvements with the regulator — indefinitely, for free.
The ambition is real. So is the problem being solved. AI is, without a doubt, the right tool.
The trouble is the deal on offer, structured one-sidedly against the startups most capable of building the solution.
Seeking Enterprise Grade For The Price Of A Pilot
The structure of the challenge is a three-stage funnel. Stage 1 shortlists up to ten teams, each of which receives Rs 5 lakh to refine their solution on NFRA-curated data. Stage 2 narrows to three teams for a five-day on-premises round in New Delhi. Stage 3 produces one winner, who receives a two-year work contract of up to Rs 1 crore. That price — not a floor, a ceiling — must cover end-to-end development, deployment, maintenance, and bug-fixing across the entire application.
The product being sought is not a minimum viable product. It is an enterprise-grade regulatory compliance engine for one of India's apex financial oversight bodies. The solution must meet rigorous technical requirements, handle scanned and digital formats, map outputs against multiple regulatory frameworks simultaneously, and be designed from the outset for vector database integration and encrypted storage.
It must be explainable when it flags a non-compliance, citing the specific rule or text passage that triggered the flag.
The Rs 5 lakh Stage 2 grant is structured as access compensation rather than genuine R&D support. It requires teams to first sign an NDA before seeing any real data. Applications are submitted and managed via the AIKosh portal. The reward for the NDA and the data exposure is the chance to compete for a contract that, spread across twenty-four months, implies Rs 4.17 lakh per month before GST, overheads, employee benefits, and licensing costs. For context, a single mid-level machine learning engineer in Mumbai currently commands Rs 1.5–2.5 lakh per month in total cost to company. The Rs 1 crore envelope is not sufficient to retain a team of three competent engineers for two years, let alone build, deploy, and maintain an enterprise AI system.
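The arithmetic above can be checked directly. A minimal back-of-envelope sketch, using only the figures the article itself cites (all amounts in lakh rupees; the engineer cost range is the article's estimate, not an independent figure):

```python
# Back-of-envelope check of the contract economics (amounts in lakh rupees).
CONTRACT_CEILING = 100   # Rs 1 crore = 100 lakh, the maximum over the term
TERM_MONTHS = 24         # two-year work contract

monthly_envelope = CONTRACT_CEILING / TERM_MONTHS
print(f"Monthly envelope: Rs {monthly_envelope:.2f} lakh")

# Mid-level ML engineer cost-to-company range cited in the article
eng_low, eng_high = 1.5, 2.5
team_size = 3

team_cost_low = eng_low * team_size * TERM_MONTHS
team_cost_high = eng_high * team_size * TERM_MONTHS
print(f"Three-engineer team over {TERM_MONTHS} months: "
      f"Rs {team_cost_low:.0f}-{team_cost_high:.0f} lakh")

# Even the low end of payroll alone exceeds the full contract ceiling,
# before GST, overheads, licensing, or any actual development budget.
print(f"Shortfall at low end: Rs {team_cost_low - CONTRACT_CEILING:.0f} lakh")
```

Even under the most generous assumption in that range, payroll for three engineers consumes more than the entire envelope, leaving nothing for infrastructure, licensing, or margin.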
This sits uncomfortably alongside the surrounding rhetoric. At the India AI Impact Summit in February 2026, Union Minister Ashwini Vaishnaw declared that India is building sovereign compute infrastructure with "a national objective, to empower Indian startups, researchers and entrepreneurs to build world-class AI solutions." He added that "more than USD 200 billion is likely to flow into India's AI ecosystem over the next two years." Abhishek Singh, CEO of the IndiaAI Mission, has consistently positioned the IADI, the programme under which this challenge is launched, as a vehicle for transformative AI deployment across critical sectors.
The aspiration is world-class. The contract on offer is anything but.
The Intellectual Property Trap
The challenge document uses conflicting language. Section 9 opens with the assurance that all Intellectual Property Rights in the submitted solution "shall remain with the solution owner." The immediate next line states that NFRA and IndiaAI "shall have non-exclusive, royalty-free, irrevocable and perpetual license to use, reproduce, modify, and deploy the awarded AI solution, including any IPR arising out of its use."
The word "irrevocable" is doing the heaviest lifting in that sentence. Ownership without the ability to exclude is not ownership in any commercially meaningful sense; it is a title without value. A startup cannot license exclusively to a third party. It cannot raise a valuation premium from a strategic acquirer on the basis of that product. The government's rights cannot be bargained away, renegotiated, or extinguished by any future transaction.
The document then goes further. Stages 2 and 3 require that all derivative datasets, metadata, or outputs generated from NFRA-provided data belong exclusively to NFRA. This means that, if the winning entity trains or fine-tunes its model on NFRA's regulatory corpus, the improved model weights and their outputs belong to the regulator. The startup leaves the engagement with a product it cannot freely monetise, trained on data it cannot claim, operating under a license it cannot revoke.
Section 10(j) adds a perpetual obligation: any new enhancements, features, or innovations must be released on the chosen cloud environment, and updated source code must be shared with the partnering institution for its free use "at all times." The contract has a two-year term. The obligation has no sunset. A startup that continues developing its AI compliance product after the contract ends — using its own capital, its own team — must continue sharing improvements with the government if those improvements relate to this solution. The government has acquired, in effect, a perpetual R&D subscription with no renewal fee.
Heads-I-Win-Tails-You-Lose Dispute Clause
Section 10(p) of the challenge document contains one sentence that any experienced counsel would flag before their client signed anything: "In case of any dispute on any other matter related to the project during the course of its implementation, the decision of the CEO, IndiaAI and NFRA shall be final and binding on the winning entity."
The buyer is the judge. The entity with the power to define scope, approve costs, and assess deliverables is simultaneously the final arbiter of any dispute arising from scope definition, cost approval, or deliverable assessment. There is no independent arbitration clause, no escalation mechanism, and no reference to the Arbitration and Conciliation Act, 1996, the standard backstop in government contracts of this kind.
Failure to comply with the government's ruling carries a consequence specified elsewhere in the document: blacklisting from State, Central Government, PSU, and PSE contracts. For a startup that has built its business model around government AI deployments — the profile most likely to succeed in this challenge — that consequence is existential.
The foreign collaboration restriction in Section 10(n) introduces a second governance risk. The winning entity cannot engage with any foreign individual, academic institution, or industry partner in executing this project without prior approval from IndiaAI and NFRA. The restriction applies to "execution of this project," a phrase nowhere defined in scope or duration. Given the irrevocable, perpetual nature of the government's license, a cautious legal reading suggests this project never formally concludes — the approval requirement for foreign collaboration is potentially permanent.
India's AI Start-Up Ecosystem Deserves Better
None of this is to say that the problem being solved is unimportant. NFRA's compliance monitoring workload is real, as is its under-resourcing, and the application of AI to financial regulatory oversight is a legitimate and valuable use of the technology.
The problem is the transaction structure. AI startups are the entities with the skills and drive to build what NFRA needs. But India's most technically capable firms are being asked to subsidise government IT infrastructure in exchange for title without value, under a contract where the buyer adjudicates disputes, at a price that does not cover costs.
Shikha Dhaiya, Joint Director at MeitY, is listed alongside Vidhu Sood as a co-presenter at the challenge webinar. Between them, they represent the two institutions with the authority to revise these terms before the next edition of this challenge or the next similar initiative.
The IndiaAI Mission has the mandate, the institutional backing, and the political moment to build something genuinely lasting in regulatory AI. The commercial terms of any engagement with solution providers should be revised to include a realistic contract value, an independent dispute resolution mechanism, and a defined sunset on the perpetual enhancement-sharing obligation. None of this would weaken the government's position; it would simply make it possible for India's best-qualified teams to participate in, and drive, the country's AI future without taking on existential risk.
The compliance gap in Indian financial reporting is a legitimate national problem. It deserves a deal that India's best AI builders can afford to sign.
Dev Chandrasekhar advises corporations on multi-stakeholder narratives related to markets, valuation, governance, and doing-by-design.

