Section 20A – Transparency and Accountability in AI-related Government Initiatives and Public-Private Partnerships

(1) This section applies to all AI-related initiatives undertaken by any governmental body, statutory authority, or public sector entity, and to any public-private partnership (PPP) involving AI technologies for public services or infrastructure.


(2) Transparency Requirements: All entities under this section must comply with the Right to Information Act, 2005, by publicly disclosing the following information about AI initiatives:

(i) A clear statement of the project’s purpose and expected outcomes;

(ii) Details of funding, including public funds, subsidies, or PPP financial arrangements;

(iii) Summaries of risk assessments addressing privacy, security, and ethical impacts;

(iv) Descriptions of algorithms used in decision-making for public services, including their purpose and functionality;

(v) Key performance indicators (KPIs) to evaluate the AI system’s effectiveness.
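
The section does not prescribe a publication format for these disclosures. Purely as an illustration, items (i) to (v) could be kept as a machine-readable record and exported for publication alongside the narrative disclosure. The Python sketch below uses a hypothetical AIInitiativeDisclosure class; none of the field names are drawn from the Act.

from dataclasses import dataclass, field, asdict
import json

@dataclass
class AIInitiativeDisclosure:
    # (i) purpose and expected outcomes of the project
    purpose: str
    expected_outcomes: list[str]
    # (ii) funding details: public funds, subsidies, PPP financial arrangements
    funding_sources: dict[str, float]        # source -> amount (illustrative)
    # (iii) summary of the privacy, security and ethical risk assessments
    risk_assessment_summary: str
    # (iv) purpose and functionality of decision-making algorithms
    algorithm_descriptions: list[str]
    # (v) KPIs used to evaluate the AI system's effectiveness
    kpis: dict[str, str] = field(default_factory=dict)

    def to_json(self) -> str:
        """Serialise the record for publication on an official website."""
        return json.dumps(asdict(self), indent=2, ensure_ascii=False)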


(3) Additional Obligations for Public-Private Partnerships (PPPs): PPPs involving AI technologies must:

(i) Disclose key contractual terms, including payment structures, risk allocation, and responsibilities of each party;

(ii) Provide public access to data generated by AI systems in public service contexts, unless restricted under Section 8 of the RTI Act, 2005, or Section 6 of the DPDP Act, 2023;

(iii) Conduct annual independent audits to verify compliance with ethical standards and performance metrics, and publish the audit results.


(4) Algorithmic Accountability: AI systems used in government or PPP initiatives that impact individuals’ rights or access to public services must:

(i) Provide written explanations of algorithmic decisions upon request by affected individuals;

(ii) Document and disclose measures to prevent algorithmic bias, including details of data selection and validation processes;

(iii) Conduct and publish impact assessments before deployment, evaluating risks to vulnerable populations.
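
Clause (4)(i) leaves the form of the written explanation open. As a minimal sketch only, such an explanation could be assembled from a structured record of the decision; the AlgorithmicDecisionExplanation class and its fields below are hypothetical, not anything the section mandates.

from dataclasses import dataclass
from datetime import date

@dataclass
class AlgorithmicDecisionExplanation:
    """Written explanation of an automated decision, issued on request (clause 4(i))."""
    request_id: str
    decision_date: date
    outcome: str                 # e.g. "benefit application rejected"
    main_factors: list[str]      # plain-language factors that drove the outcome
    bias_mitigation_note: str    # summary of measures documented under clause 4(ii)
    appeal_contact: str          # where the affected individual can seek review

    def render(self) -> str:
        factors = "\n".join(f"  - {f}" for f in self.main_factors)
        return (
            f"Decision {self.request_id} ({self.decision_date.isoformat()}): {self.outcome}\n"
            f"Main factors:\n{factors}\n"
            f"Bias safeguards: {self.bias_mitigation_note}\n"
            f"To seek review, contact: {self.appeal_contact}"
        )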


(5) Public Consultation: Before launching large-scale AI projects or entering PPPs involving AI, the responsible government body must:

(i) Hold public consultations with stakeholders, including civil society, industry experts, academics, and affected communities;

(ii) Publish a summary of consultation feedback and explain how it was incorporated into the project plan.


(6) Annual Reporting: All entities under this section must submit an annual report to the Indian Artificial Intelligence Council (IAIC), to be published on official government websites, detailing:

(i) Progress on AI projects;

(ii) Results of audits or impact assessments;

(iii) Incidents of AI misuse or failure, with corrective actions taken;

(iv) Measures implemented to address transparency, accountability, and ethical concerns.


(7) Exemptions: Information may be withheld from disclosure if it:

(i) Compromises national security;

(ii) Violates data protection rights of data principals under the Digital Personal Data Protection Act, 2023;

(iii) Interferes with ongoing investigations or enforcement actions;

(iv) Conflicts with the legitimate uses defined under Section 7 of the DPDP Act, 2023, read with the exemptions under Section 8 of the Right to Information Act, 2005.
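
To show how these exemptions might gate the disclosures required elsewhere in this section (for example, the data access obligation in clause (3)(ii)), here is a minimal sketch of a pre-publication check. The ExemptionReview record and may_publish function are hypothetical and not prescribed by the section or the cited Acts.

from dataclasses import dataclass

@dataclass
class ExemptionReview:
    """Outcome of a manual review against the grounds in sub-section (7)."""
    national_security_risk: bool         # (7)(i)
    personal_data_of_principals: bool    # (7)(ii), DPDP Act, 2023
    active_investigation: bool           # (7)(iii)
    legitimate_use_conflict: bool        # (7)(iv)

def may_publish(review: ExemptionReview) -> bool:
    """Return True only if no exemption ground applies to the material."""
    return not any(
        (
            review.national_security_risk,
            review.personal_data_of_principals,
            review.active_investigation,
            review.legitimate_use_conflict,
        )
    )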
