In 1865, the British Parliament passed the Locomotive Act, requiring every self-propelled vehicle to be preceded by a person on foot carrying a red flag. The law was designed for steam engines lumbering through market towns.
By the time it was repealed in 1896, France and Germany had built automobile industries that Britain would spend decades trying to match. Regulatory caution, when it outlasts the risk it was designed to address, produces losses that are real, large and invisible.
Any economic regulation is supposed to have a philosophical foundation. The Arrow-Debreu model of general equilibrium shows that competitive markets produce efficient outcomes only under conditions never met in practice: complete markets, perfect information, no externalities, no public goods.
George Akerlof in 'The Market for Lemons' showed that information asymmetry alone can cause markets to unravel. Externalities drive a wedge between private and social costs. Natural monopolies and coordination failures require external correction. Regulation exists to address these failures. But it must balance three tensions that rarely resolve: efficiency against equity, market freedom against state control, and short-term price stability against long-term investment. Done well, regulation enables markets. Done poorly, it replaces them with discretion. The difference is measurable. India has rarely chosen to measure it.
The problem sharpens when the regulated field is new. Ex-ante regulation (anticipatory regulation) asks the regulator to know, in advance, what an industry will look like, what risks it will generate, what structures will emerge. In a new field, the regulator knows none of this. Frank Knight formalised the distinction in 1921. Risk is measurable. Probabilities can be assigned, tables constructed and insurance priced. Uncertainty is different in kind. It is the domain where probability distributions themselves are unknown. A new technology inhabits this domain. The information the regulator needs does not yet exist. It is tacit, local and embedded in the practices of actors who have not yet arrived.
James C Scott's framework in 'Seeing Like a State' explains how states achieve administrative control by making society legible through standardised categories. But legibility destroys what it cannot map. The practical, contextual knowledge that makes new industries function, what Scott calls metis, cannot survive the imposition of categories designed for a different world. What cannot be mapped cannot be approved. What cannot be approved cannot exist.
Four mechanisms explain why regulators impose ex-ante frameworks on new fields regardless:
Asymmetric costs
When a regulator approves something that causes harm, the causal chain is short and politically damaging. When a regulator blocks something and an industry fails to emerge, no one is blamed. Daniel Kahneman and Amos Tversky showed that losses are weighted approximately twice as heavily as equivalent gains. For regulators, this produces systematic over-restriction. Their private loss function and social welfare function point in opposite directions. This is not corruption. It is a structurally broken incentive system.
Temporal misalignment
Regulators calibrate to the last accident, not the next technology. The US Nuclear Regulatory Commission, rebuilt after Three Mile Island in 1979, created processes so elaborate that no new nuclear plant was commissioned for 30 years. France, applying a standardised framework to the same technology at the same moment, built 56 reactors.
Categorical error
The EU applied a process-based precautionary framework to GMOs in the 1990s, treating recombinant DNA technology as presumptively dangerous regardless of the specific product. The US asked whether the crop itself was harmful. European agricultural biotechnology collapsed as a commercial enterprise. Cass Sunstein's precautionary paradox explains why. Precautions against one risk necessarily create exposure to others. Applied to genuine uncertainty, the precautionary principle is prohibition in academic clothing.
Jurisdictional fragmentation
When a new technology fits no existing category, multiple agencies claim it simultaneously, each applying its own approval logic and compounding the compliance burden. India's response to cryptocurrency is an example. It produced an ecosystem that moved to Singapore and Dubai and continued serving Indian customers from outside the regulatory perimeter the RBI had been attempting to enforce.
India's disposition is structural. Its colonial administrative architecture was built for extraction and control. The burden of proof falls on the applicant to show that an activity is permissible, not on the state to show that it is harmful. Where neither party has sufficient information, the system defaults to prohibition.
India is now writing regulatory frameworks for AI, genomics, space commercialisation and synthetic biology. These are Knightian uncertainty domains. The frameworks being designed now will determine which of these industries India builds over the next two decades and which it exports to jurisdictions that answered the same question differently.
Three changes are necessary:
- Regulatory sandboxes must become the default mode of engaging with new sectors.
- Every ex-ante requirement for an emerging technology must carry a sunset clause requiring evidence-based re-justification within three years.
- Regulatory impact assessments must be applied to regulations before enactment, holding the rule-maker to the same evidentiary standard it demands of everyone else.
(Disclaimer: The opinions expressed in this column are that of the writer. The facts and opinions expressed here do not reflect the views of www.economictimes.com.)
Aditya Sinha
Assistant consultant, Economic Advisory Council to PM