variables: 450118
Data license: CC-BY
Column | Value
--- | ---
id | 450118
name | Cost_training_computation
description | *(see below)*
createdAt | 2022-07-09 09:28:43
updatedAt | 2023-06-15 05:05:42
datasetId | 5515
sourceId | 21327
display | `{ "unit": "real 2020 US$", "zeroDay": "2020-01-01", "shortUnit": "$", "yearIsDay": true, "includeInTable": true, "numDecimalPlaces": 0 }`
columnOrder | 0
schemaVersion | 1

description:

Cost of training computation was calculated as the <a href="https://ourworldindata.org/grapher/artificial-intelligence-training-computation" target="_blank">system's training computation</a> divided by a floating-point operations per dollar (FLOP/$) value. The FLOP/$ value was calculated using one of two methods:

1. The FLOP/s per $ value at the time of the system's publication (according to the <a href="https://epochai.org/blog/trends-in-gpu-price-performance" target="_blank">"Our data" trend line in Figure 1 here</a>).
2. Dividing the theoretical peak throughput (including "Tensor Core" performance) by the reported unit price of the hardware that was actually used for training.

The authors expect that Method 2 is more accurate on average. If an estimate via Method 2 is available, they report that estimate; otherwise, they use Method 1. Additionally, to convert theoretical peak FLOP/s per $ into realized FLOP/$, the authors made the following assumptions for all systems:

1. Hardware utilization is 35%.
2. The hardware is used for a total of 2 years of GPU time over its lifetime.
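A minimal sketch of the calculation described above, under the stated assumptions (35% utilization, 2-year GPU lifetime). The function names and the example figures for training compute and peak FLOP/s per $ are illustrative inputs, not values taken from this variable or dataset.

```python
# Minimal sketch of the cost calculation described above. Function names and
# example figures are illustrative, not values from this dataset.

SECONDS_PER_YEAR = 365.25 * 24 * 3600


def realized_flop_per_dollar(peak_flops_per_sec_per_dollar: float,
                             utilization: float = 0.35,
                             lifetime_years: float = 2.0) -> float:
    """Convert theoretical peak FLOP/s per $ into realized FLOP per $,
    using the stated assumptions (35% utilization, 2-year GPU lifetime)."""
    return peak_flops_per_sec_per_dollar * utilization * lifetime_years * SECONDS_PER_YEAR


def training_cost_dollars(training_compute_flop: float,
                          peak_flops_per_sec_per_dollar: float) -> float:
    """Cost = training computation (FLOP) divided by realized FLOP/$."""
    return training_compute_flop / realized_flop_per_dollar(peak_flops_per_sec_per_dollar)


# Hypothetical system: 1e23 FLOP of training compute on hardware with a
# theoretical peak of 1e9 FLOP/s per $ (a Method 2 style input).
print(f"{training_cost_dollars(1e23, 1e9):,.0f}")  # ≈ 4.5 million (2020 US$)
```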
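The display settings listed in the table above control how values are rendered. The sketch below shows one plausible reading of that configuration, assuming `yearIsDay: true` means the variable's time values are stored as day offsets from `zeroDay`; the helper functions are illustrative, not part of any OWID library.

```python
from datetime import date, timedelta

# Sketch of the display settings above. Assumes `yearIsDay: true` means time
# values are day offsets from `zeroDay`; helper functions are illustrative.
display = {
    "unit": "real 2020 US$",
    "zeroDay": "2020-01-01",
    "shortUnit": "$",
    "yearIsDay": True,
    "includeInTable": True,
    "numDecimalPlaces": 0,
}


def offset_to_date(day_offset: int) -> date:
    """Convert a stored time value into a calendar date."""
    return date.fromisoformat(display["zeroDay"]) + timedelta(days=day_offset)


def format_value(value: float) -> str:
    """Format a cost value with the configured short unit and decimal places."""
    return f"{display['shortUnit']}{value:,.{display['numDecimalPlaces']}f}"


print(offset_to_date(31))       # 2020-02-01
print(format_value(4526867.0))  # $4,526,867
```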