posts_gdocs: 12hNREWcrwyOrJugzc7gYHN5Qv_UJBJ-GnY_FzMbXByE
Data license: CC-BY
This data as json
id | slug | type | content | published | createdAt | publishedAt | updatedAt | publicationContext | revisionId | breadcrumbs | markdown | title |
---|---|---|---|---|---|---|---|---|---|---|---|---|
12hNREWcrwyOrJugzc7gYHN5Qv_UJBJ-GnY_FzMbXByE | moores-law | article | { "toc": [ { "slug": "computing-efficiency-and-cost", "text": "Computing efficiency and cost", "title": "Computing efficiency and cost", "supertitle": "", "isSubheading": false } ], "body": [ { "type": "text", "value": [ { "text": "The observation that the number of transistors on computer chips doubles approximately every two years is known as Moore\u2019s Law.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "Moore\u2019s Law is not a law of nature, but an observation of a long-term trend in how technology is changing.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "The law was first described by Gordon E. Moore, the co-founder of Intel, in 1965.", "spanType": "span-simple-text" }, { "url": "#note-1", "children": [ { "children": [ { "text": "1", "spanType": "span-simple-text" } ], "spanType": "span-superscript" } ], "spanType": "span-ref" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "The chart shows Moore\u2019s original graph that he drew in 1965 to describe this regularity. At the time, he had only a handful of data points. 
Note that he drew it on a logarithmic scale, and remember that a straight line on a log-axis means that the\u00a0", "spanType": "span-simple-text" }, { "children": [ { "text": "growth rate", "spanType": "span-simple-text" } ], "spanType": "span-italic" }, { "text": " is constant and it is therefore showing the exponential growth of the number of transistors.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "However, he hypothesized that this relationship would continue at a similar rate: \u201cThere is no reason to believe it will not remain constant for at least 10 years\u201d.", "spanType": "span-simple-text" }, { "url": "#note-2", "children": [ { "children": [ { "text": "2", "spanType": "span-simple-text" } ], "spanType": "span-superscript" } ], "spanType": "span-ref" } ], "parseErrors": [] }, { "alt": "The Number of Components per Integrated Function Moore's Original Graph - Moore0", "size": "narrow", "type": "image", "filename": "The-Number-of-Components-per-Integrated-Function-Moores-Original-Graph-Moore0.png", "parseErrors": [] }, { "type": "horizontal-rule", "value": {}, "parseErrors": [] }, { "text": [ { "text": "Moore\u2019s Law has held true for more than half a century", "spanType": "span-simple-text" } ], "type": "heading", "level": 1, "parseErrors": [] }, { "type": "text", "value": [ { "text": "In 1965, Gordon Moore predicted that this growth would continue for another 10 years, at least. Was he right?", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "In the chart, we\u2019ve visualized the growth in transistor density \u2013 the number of transistors on integrated circuits \u2013 from 1970 onwards.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "It looks strikingly similar to Moore\u2019s simple plot from 1965. 
Note again that the transistor count is on a logarithmic axis, so the linear relationship over time means that the growth ", "spanType": "span-simple-text" }, { "children": [ { "text": "rate", "spanType": "span-simple-text" } ], "spanType": "span-italic" }, { "text": " has been constant.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "This means that the growth of the transistor count has, in fact, been exponential. You can also see this on our ", "spanType": "span-simple-text" }, { "url": "https://ourworldindata.org/grapher/transistors-per-microprocessor", "children": [ { "text": "interactive chart", "spanType": "span-simple-text" } ], "spanType": "span-link" }, { "text": ", which shows the average transistor count over time and lets you switch between a linear and a log axis.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "Transistor counts have doubled approximately every two years, just as Moore predicted.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "This has held true for more than 50 years now.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "alt": "", "size": "wide", "type": "image", "filename": "Transistor-Count-over-time.png", "parseErrors": [] }, { "type": "horizontal-rule", "value": {}, "parseErrors": [] }, { "text": [ { "text": "There are many examples of exponential technological change", "spanType": "span-simple-text" } ], "type": "heading", "level": 1, "parseErrors": [] }, { "type": "text", "value": [ { "text": "Moore\u2019s Law describes the increasing number of transistors on integrated circuits, which in itself doesn\u2019t matter for us as users of computer equipment. 
But it matters for those aspects that we do care about, like the speed and cost of computing.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "Many related metrics show a similar pattern of exponential growth. The computational capacity of computers has increased exponentially, doubling every 1.5 years, from 1975 to 2009.", "spanType": "span-simple-text" }, { "url": "#note-3", "children": [ { "children": [ { "text": "3", "spanType": "span-simple-text" } ], "spanType": "span-superscript" } ], "spanType": "span-ref" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "More recent data is shown in the interactive chart. It shows the increase in supercomputer power, measured as the power of the largest supercomputer in any given year. The unit of measurement is FLOPS: the number of floating-point operations the machine can carry out per second.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "url": "https://ourworldindata.org/grapher/supercomputer-power-flops", "type": "chart", "parseErrors": [] }, { "text": [ { "text": "Computing efficiency and cost", "spanType": "span-simple-text" } ], "type": "heading", "level": 2, "parseErrors": [] }, { "type": "text", "value": [ { "text": "Computing efficiency has also improved exponentially: the energy needed for a fixed amount of computation has halved every 1.5 years over the last 60 years.", "spanType": "span-simple-text" }, { "url": "#note-4", "children": [ { "children": [ { "text": "4", "spanType": "span-simple-text" } ], "spanType": "span-superscript" } ], "spanType": "span-ref" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "Exponential progress is also found in the cost of computer memory and storage. In the chart, we see the cost of computer storage across different mediums \u2013 disks, flash drives, and internal memory \u2013 since the 1950s. 
This is measured as the price per terabyte.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "type": "text", "value": [ { "text": "Moore\u2019s observation that the transistor count on integrated circuits grows exponentially is at the heart of many of the most consequential changes of our time. In ", "spanType": "span-simple-text" }, { "url": "https://ourworldindata.org/artificial-intelligence", "children": [ { "text": "our work on artificial intelligence", "spanType": "span-simple-text" } ], "spanType": "span-link" }, { "text": " we explore how the exponential growth in computing technology translated into more and more powerful AI systems.", "spanType": "span-simple-text" } ], "parseErrors": [] }, { "url": "https://ourworldindata.org/grapher/historical-cost-of-computer-memory-and-storage?country=~OWID_WRL", "type": "chart", "parseErrors": [] } ], "refs": { "errors": [], "definitions": { "1e0670cf3cfea1e14145351f4a51551ca051b67a": { "id": "1e0670cf3cfea1e14145351f4a51551ca051b67a", "index": 1, "content": [ { "type": "text", "value": [ { "text": "Quoted from Gordon E. Moore (1965) \u2013 ", "spanType": "span-simple-text" }, { "url": "https://web.archive.org/web/20211221191553/http://www.monolithic3d.com/uploads/6/0/5/5/6055488/gordon_moore_1965_article.pdf", "children": [ { "text": "Cramming more components onto integrated circuits", "spanType": "span-simple-text" } ], "spanType": "span-link" }, { "text": ". In Electronics, Volume 38, Number 8, April 19, 1965.", "spanType": "span-simple-text" } ], "parseErrors": [] } ], "parseErrors": [] }, "706ac47f1d04bda6d203c0997a4bbe9fc4c2983e": { "id": "706ac47f1d04bda6d203c0997a4bbe9fc4c2983e", "index": 2, "content": [ { "type": "text", "value": [ { "text": "Koomey, Berard, Sanchez, and Wong (2011) \u2013 Implications of Historical Trends in the Electrical Efficiency of Computing. 
In IEEE Annals of the History of Computing, 33, 3, 46\u201354.", "spanType": "span-simple-text" } ], "parseErrors": [] } ], "parseErrors": [] }, "80365e386c82cf874a3bfb955e5e9d839bb15bd8": { "id": "80365e386c82cf874a3bfb955e5e9d839bb15bd8", "index": 0, "content": [ { "type": "text", "value": [ { "text": "The original paper is Gordon E. Moore (1965) \u2013 ", "spanType": "span-simple-text" }, { "url": "https://web.archive.org/web/20211221191553/http://www.monolithic3d.com/uploads/6/0/5/5/6055488/gordon_moore_1965_article.pdf", "children": [ { "text": "Cramming more components onto integrated circuits", "spanType": "span-simple-text" } ], "spanType": "span-link" }, { "text": ". In Electronics, Volume 38, Number 8, April 19, 1965.", "spanType": "span-simple-text" } ], "parseErrors": [] } ], "parseErrors": [] }, "9a1ec2d1e7cdaa8e2e6e63852c6dc847b2258e1f": { "id": "9a1ec2d1e7cdaa8e2e6e63852c6dc847b2258e1f", "index": 3, "content": [ { "type": "text", "value": [ { "text": "A short ungated article on this research is in the ", "spanType": "span-simple-text" }, { "url": "https://web.archive.org/web/20151218143336/http://www.technologyreview.com:80/news/425398/a-new-and-improved-moores-law/", "children": [ { "text": "MIT Technology Review", "spanType": "span-simple-text" } ], "spanType": "span-link" }, { "text": ".", "spanType": "span-simple-text" } ], "parseErrors": [] } ], "parseErrors": [] } } }, "type": "article", "title": "What is Moore's Law?", "authors": [ "Max Roser", "Hannah Ritchie", "Edouard Mathieu" ], "excerpt": "Exponential growth is at the heart of the rapid increase of computing capabilities.", "dateline": "March 28, 2023", "subtitle": "Exponential growth is at the heart of the rapid increase of computing capabilities.", "sidebar-toc": false, "featured-image": "Moores-Law-01.png" } |
1 | 2024-02-01 18:23:46 | 2023-03-28 07:03:00 | 2024-02-01 18:32:53 | listed | ALBJ4LurdHvzoG6Wp7e099qFaBydZndpVqi8a2draq1TjKzjLJdEYzXG4dSZMg5XXEj1ElHUVsOoj4TCqPgGuA | The observation that the number of transistors on computer chips doubles approximately every two years is known as Moore’s Law. Moore’s Law is not a law of nature, but an observation of a long-term trend in how technology is changing. The law was first described by Gordon E. Moore, the co-founder of Intel, in 1965.1 The chart shows Moore’s original graph that he drew in 1965 to describe this regularity. At the time, he had only a handful of data points. Note that he drew it on a logarithmic scale, and remember that a straight line on a log-axis means that the _growth rate_ is constant and it is therefore showing the exponential growth of the number of transistors. However, he hypothesized that this relationship would continue at a similar rate: “There is no reason to believe it will not remain constant for at least 10 years”.2 <Image filename="The-Number-of-Components-per-Integrated-Function-Moores-Original-Graph-Moore0.png" alt="The Number of Components per Integrated Function Moore's Original Graph - Moore0"/> --- # Moore’s Law has held true for more than half a century In 1965, Gordon Moore predicted that this growth would continue for another 10 years, at least. Was he right? In the chart, we’ve visualized the growth in transistor density – the number of transistors on integrated circuits – from 1970 onwards. It looks strikingly similar to Moore’s simple plot from 1965. Note again that the transistor count is on a logarithmic axis, so the linear relationship over time means that the growth _rate_ has been constant. This means that the growth of the transistor count has, in fact, been exponential. 
You can also see this on our [interactive chart](https://ourworldindata.org/grapher/transistors-per-microprocessor), which shows the average transistor count over time and lets you switch between a linear and a log axis. Transistor counts have doubled approximately every two years, just as Moore predicted. This has held true for more than 50 years now. <Image filename="Transistor-Count-over-time.png" alt=""/> --- # There are many examples of exponential technological change Moore’s Law describes the increasing number of transistors on integrated circuits, which in itself doesn’t matter for us as users of computer equipment. But it matters for those aspects that we do care about, like the speed and cost of computing. Many related metrics show a similar pattern of exponential growth. The computational capacity of computers has increased exponentially, doubling every 1.5 years, from 1975 to 2009.3 More recent data is shown in the interactive chart. It shows the increase in supercomputer power, measured as the power of the largest supercomputer in any given year. The unit of measurement is FLOPS: the number of floating-point operations the machine can carry out per second. <Chart url="https://ourworldindata.org/grapher/supercomputer-power-flops"/> ## Computing efficiency and cost Computing efficiency has also improved exponentially: the energy needed for a fixed amount of computation has halved every 1.5 years over the last 60 years.4 Exponential progress is also found in the cost of computer memory and storage. In the chart, we see the cost of computer storage across different mediums – disks, flash drives, and internal memory – since the 1950s. This is measured as the price per terabyte. Moore’s observation that the transistor count on integrated circuits grows exponentially is at the heart of many of the most consequential changes of our time. 
In [our work on artificial intelligence](https://ourworldindata.org/artificial-intelligence) we explore how the exponential growth in computing technology translated into more and more powerful AI systems. <Chart url="https://ourworldindata.org/grapher/historical-cost-of-computer-memory-and-storage?country=~OWID_WRL"/> The original paper is Gordon E. Moore (1965) – [Cramming more components onto integrated circuits](https://web.archive.org/web/20211221191553/http://www.monolithic3d.com/uploads/6/0/5/5/6055488/gordon_moore_1965_article.pdf). In Electronics, Volume 38, Number 8, April 19, 1965. Quoted from Gordon E. Moore (1965) – [Cramming more components onto integrated circuits](https://web.archive.org/web/20211221191553/http://www.monolithic3d.com/uploads/6/0/5/5/6055488/gordon_moore_1965_article.pdf). In Electronics, Volume 38, Number 8, April 19, 1965. Koomey, Berard, Sanchez, and Wong (2011) – Implications of Historical Trends in the Electrical Efficiency of Computing. In IEEE Annals of the History of Computing, 33, 3, 46–54. A short ungated article on this research is in the [MIT Technology Review](https://web.archive.org/web/20151218143336/http://www.technologyreview.com:80/news/425398/a-new-and-improved-moores-law/). | What is Moore's Law? |
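The compound growth the article describes can be sanity-checked with a short calculation. This is a minimal sketch (the `growth_factor` helper and its 2-year default are illustrative, not part of the article's data pipeline); it shows what a fixed doubling time implies over the ~50-year span covered by the charts:

```python
# Sketch: compound growth implied by a fixed doubling time, using the
# article's figure of transistor counts doubling roughly every 2 years.

def growth_factor(years: float, doubling_time: float = 2.0) -> float:
    """Factor by which a quantity grows over `years`, given its doubling time."""
    return 2.0 ** (years / doubling_time)

# Over the ~50 years of data shown from 1970 onwards, a 2-year doubling
# time implies a 2**25 increase, i.e. more than 33 million-fold.
print(f"{growth_factor(50):,.0f}")  # 33,554,432
```

The same function covers the other rates the article mentions: a 1.5-year doubling time (computational capacity) gives `growth_factor(years, doubling_time=1.5)`.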