The Legal Status Of AI Agents In E-Contracts: Need For Reform In Indian Contract Law
- Lets Learn Law
- Aug 21
Overview
The infusion of artificial intelligence (AI) into digital commerce has caused a paradigm shift in how contracts are formed, negotiated, and enforced. Decision-making and e-contracting are increasingly handled by AI agents such as chatbots, recommendation engines, and automated procurement systems, often with little or no human oversight. This raises an important legal question: can AI agents form binding contracts under Indian law? The Indian Contract Act, 1872, built on conventional concepts of human will and consensus ad idem (meeting of minds), never contemplated non-human entities making contractual decisions. As India moves towards a digitally automated economy, it is necessary to reassess the legal position of AI agents in contract formation and to recommend the required reforms.
Understanding E-Contracts and AI Agents
E-contracts are contracts formed electronically through email, the internet, or computerized systems. Traditionally, such contracts are concluded through electronic exchanges between two human parties. Today, however, AI agents, that is, computer programs using algorithmic reasoning or machine learning, can autonomously perform tasks such as accepting orders, agreeing on terms, and making product offers. These agents operate independently and adapt on the basis of past data and experience to arrive at optimal decisions. For example, a trading bot in finance can execute high-frequency trades based on the state of the market without direct human approval. Even though such efficiency has economic value, it raises a legal problem: can a contract be held valid if one or both of the contracting parties are not human?
Essentials For Valid Contract
Under the Indian Contract Act, 1872, a valid contract must have the following essential elements:
- offer and acceptance;
- lawful consideration;
- competent parties;
- free consent;
- lawful object.
Under the Act, the parties to a contract must be natural persons or juristic persons (such as companies) capable of giving consent and understanding the legal consequences of the agreement. Under Section 11, a person is competent to contract if he or she has attained the age of majority, is of sound mind, and is not disqualified from contracting by any law.
On this principle, AI agents are not competent to form contracts, since they are neither natural persons nor legal persons. They lack three foundational pillars of classical contract theory: consciousness, legal capacity, and intention.
Issues and Challenges
1. Insufficient Legal Personality
Under Indian law, AI systems such as advanced generative AI or machine learning models do not fall within the definition of legal persons. They therefore cannot sue, be sued, or be held liable. When a contract is breached or an error is made because of a decision the AI took on its own, the result is legal uncertainty.
2. Lack of Informed Consent
Contract law requires consent to be free, voluntary, and informed. AI systems have no conscious mind or emotional intelligence; they operate only on preloaded information and data analysis. The question remains: can the actions of an AI be treated as "consent"?
3. Liability Attribution
Deciding who is liable in AI-influenced contracts—the user, the company, the developer, or the AI—is challenging. Is the owner of an AI chatbot liable if the chatbot makes an incorrect offer? The current legislation provides no clear-cut solution.
4. Divergence from Global Practice
Jurisdictions such as Singapore and the United Kingdom have begun adapting their legal systems to recognize electronic agents, while India has not. Although the UNCITRAL Model Law on Electronic Commerce (1996) allows contracts to be formed by automated systems, Indian law has adopted this concept only in part.
Need For Reform In Indian Law
India's contract law must be revised to account for prevailing technological developments while ensuring consumer protection and legal certainty. Suggested reforms include:
1. Statutory Recognition of Electronic Agents
Legally define an "electronic agent," as per UNCITRAL or UETA standards, as a system that can initiate or respond to electronic messages without human intervention.
2. Imputation of Intention
Impute the intention and consent expressed by an AI agent to the person or entity that deploys it. This is akin to how a company is bound by the acts of its human agents under the principle of vicarious liability.
3. Establishing Norms for AI Responsibility
Amend existing law to hold those who deploy AI accountable for the actions of autonomous systems, especially in commercial and consumer transactions.
4. Standardized E-Contracting Guidelines
The government or the judiciary can prescribe standardized practices for the use of AI in contracting, such as mandatory disclosure when a user is interacting with a bot and human oversight for high-risk transactions.
Conclusion
As India moves closer to an AI-driven digital economy, its legal framework must evolve to accommodate technological progress without compromising the integrity of the law. Because it rests on nineteenth-century logic, the Indian Contract Act is ill-suited to deal with autonomous systems forming contracts. By recognizing the role of AI agents in online trade and introducing specific legislation on electronic agents, India can secure both legal certainty and economic growth. Bringing contract law up to date to accommodate AI is not merely apt but necessary for a future-ready legal system.
This article is authored by Shruti Sahay. She was among the Top 40 performers in the Quiz Competition on Mergers and Acquisitions organized by Lets Learn Law.



