## The Idea In Brief

**Goodhart's Law** warns against using a measure as a target. Once a metric is tied to an incentive or becomes a goal in itself, it loses its reliability as an unbiased indicator. The law highlights how systems and individuals distort or game metrics, leading to unintended consequences. It is particularly relevant in economics, education, governance, and artificial intelligence.

---

## Key Concepts

### 1. **Definition**

Goodhart's Law is often summarised as:

> "When a measure becomes a target, it ceases to be a good measure."

This means that if a specific metric is used to evaluate or control a system, people may optimise for the metric in ways that undermine its original purpose.

---

### 2. **Origins and History**

- Introduced by **Charles Goodhart**, a British economist and former adviser to the Bank of England.
- First stated formally in a 1975 paper critiquing the use of monetary aggregates in policy-making.
- The idea gained broader attention through policy failures in centrally planned economies, and later through the work of social scientists such as Donald T. Campbell.

---

### 3. **Mechanisms of Distortion**

When a metric becomes a performance target, several things can happen:

- **Gaming**: people manipulate inputs or behaviours to artificially boost scores.
- **Tunnel vision**: focus narrows to only what is measured, neglecting broader goals.
- **Data manipulation**: records or statistics are adjusted to meet targets.
- **Strategic compliance**: rules are followed in form but not in spirit.

---

### 4. **Mitigations**

To reduce the risk of metric distortion:

- Use **multiple metrics** rather than a single target.
- Combine **quantitative** and **qualitative** assessments.
- Regularly review whether a measure still reflects the underlying goal.
- Emphasise **intent and behaviour**, not just outcomes.