Scaling laws are mathematical relationships that describe how a system's behavior changes as its size or parameters change.
Scaling laws apply to AI, physics, biology, internet infrastructure, and even the growth rate of cities.
In short, scaling laws aim to answer questions like: How much better does a model get if we double its parameters? How much more valuable does a network become as it adds users? How does an animal's energy use change with its size?
In the context of machine learning techniques such as supervised learning, neural scaling laws describe how performance improves as resources like model size, dataset size, or compute increase.
In the 2020 OpenAI paper Scaling Laws for Neural Language Models, researchers found that test loss falls off as a smooth power law in model size, dataset size, and training compute:
Doubling the number of parameters, while scaling data and compute alongside it, led to consistent, predictable improvements in performance.
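As a rough sketch, the model-size piece of that result can be written as L(N) = (N_c / N)^α, meaning loss falls by a constant factor each time the parameter count doubles. The constants below are illustrative placeholders in the ballpark of the paper's reported parameter-count fit, not authoritative values:

```python
# Illustrative neural scaling law of the form L(N) = (N_c / N) ** alpha.
# N_c and alpha are placeholder constants, roughly in the range reported for the
# parameter-count fit; treat them as illustrative, not exact.

def power_law_loss(n_params: float, n_c: float = 8.8e13, alpha: float = 0.076) -> float:
    """Predicted test loss for a model with n_params parameters."""
    return (n_c / n_params) ** alpha

# Each doubling of parameters reduces the predicted loss by the same factor (~2 ** -alpha).
for n in [1e8, 2e8, 4e8, 8e8]:
    print(f"{n:.0e} params -> predicted loss {power_law_loss(n):.3f}")
```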
In biology, Kleiber's Law is a scaling law stating that an animal's metabolic rate scales with roughly the 3/4 power of its body mass, which explains why larger animals are more energy-efficient per unit of mass than smaller ones.
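A minimal sketch of that relationship, assuming the textbook form of Kleiber's Law (metabolic rate proportional to body mass raised to the 3/4 power) and an illustrative normalization constant:

```python
# Kleiber's Law sketch: metabolic rate B ~ B0 * M ** 0.75, with M = body mass in kg.
# B0 = 70 is a classic rough value (kcal/day); treat it as illustrative, since the
# exact constant varies across studies and taxa.

def metabolic_rate(mass_kg: float, b0: float = 70.0) -> float:
    """Approximate whole-body metabolic rate (kcal/day) for a given body mass."""
    return b0 * mass_kg ** 0.75

# Per-kilogram energy use falls as body mass grows: the elephant is far more
# efficient per unit of mass than the mouse.
for mass in [0.02, 70, 4000]:  # rough masses: mouse, human, elephant (kg)
    rate = metabolic_rate(mass)
    print(f"{mass:>7} kg: total {rate:10.1f} kcal/day, per kg {rate / mass:8.2f}")
```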
In the context of the internet, the most common scaling law is Metcalfe's Law. It states that the value of a network is proportional to the square of the number of its connected users.
In other words, Metcalfe's Law describes the compounding benefit of connectivity. For example, when more people use email, the web, or social networks, the entire system becomes more valuable, not just linearly but roughly with the square of the number of users.
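As a simple illustration, one common way to make Metcalfe's Law concrete is to count the possible pairwise connections among n users, which grows roughly with n squared. The "value" measure below is just that proxy, not a claim about any specific network:

```python
# Metcalfe's Law sketch: with n users there are n * (n - 1) / 2 possible pairwise
# connections, so this proxy for network value grows roughly with the square of n.

def pairwise_connections(n_users: int) -> int:
    """Number of distinct user-to-user links possible in a network of n_users."""
    return n_users * (n_users - 1) // 2

# Doubling the user count roughly quadruples the number of possible connections.
for n in [10, 100, 1000]:
    print(f"{n:>5} users -> {pairwise_connections(n):>8} possible connections")
```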
Scaling laws matter because they are predictive, enable better planning, and reveal insights into underlying structures and constraints.