variables: 935636
Data license: CC-BY
| column | value |
|---|---|
| id | 935636 |
| name | Number of parameters |
| createdAt | 2024-06-19 14:36:00 |
| updatedAt | 2024-07-08 16:38:43 |
| datasetId | 6571 |
| display | { "zeroDay": "1949-01-01", "yearIsDay": true, "numDecimalPlaces": 0 } |
| columnOrder | 0 |
| shortName | parameters |
| catalogPath | grapher/artificial_intelligence/2024-06-19/epoch_compute_intensive/epoch_compute_intensive#parameters |
| schemaVersion | 2 |
| processingLevel | major |
| descriptionShort | Total number of learnable variables or weights that the model contains. Parameters are adjusted during training to optimize the model's performance. |
| license | { "note": "Confirmed large-scale AI models are those where the training compute exceeds 10²³ floating-point operations (FLOP)." } |
| type | int |
| sort | [] |
| dataChecksum | e27c885c33f029aed4d2574721d24ca1 |
| metadataChecksum | c63ec5c1d3a64f1a07290a47176f0ca0 |

(unit, description, code, coverage, timespan, sourceId, shortUnit, originalMetadata, grapherConfigAdmin, dimensions, processingLog, titlePublic, titleVariant, attributionShort, attribution, descriptionFromProducer, descriptionProcessing, licenses, and grapherConfigETL are empty)

descriptionKey:

- Parameters are internal variables that machine learning models adjust during training to improve their ability to make accurate predictions. They act as the model's "knobs", fine-tuned based on the provided data. In deep learning, a subset of artificial intelligence (AI), parameters consist primarily of the weights assigned to the connections between the small processing units called neurons. Picture a vast network of interconnected neurons where the strength of each connection is a parameter.
- The total number of parameters in a model is influenced by various factors. The model's structure and its number of "layers" of neurons play a significant role: more complex models with additional layers generally have more parameters. Special components of specific deep learning architectures can add further to the overall count.
- Understanding the number of parameters in a model is crucial to designing effective models. More parameters can help a model capture complex patterns in data, potentially leading to higher accuracy. However, there is a balance to strike: a model with too many parameters risks memorizing the specific examples in its training data rather than learning their underlying patterns, and may then perform poorly on new, unseen data. Striking the right balance is a critical consideration in model development.
- In recent years, the AI community has seen the emergence of so-called "giant models" with parameter counts in the billions or even trillions. While these huge models have achieved remarkable performance, they come at a significant computational cost. Effectively managing and training such large-scale models has become a prominent and active area of research within the AI field.
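
The display config pairs "yearIsDay": true with a zeroDay. In OWID's grapher this convention stores each time value as an integer day offset from zeroDay rather than as a calendar year. A minimal sketch of that conversion, assuming that interpretation (the offset value here is made up for illustration):

```python
from datetime import date, timedelta

ZERO_DAY = date(1949, 1, 1)  # "zeroDay" from the display config above

def offset_to_date(day_offset: int) -> date:
    # With "yearIsDay": true, each stored time value is a day count
    # from zeroDay, not a year.
    return ZERO_DAY + timedelta(days=day_offset)

print(offset_to_date(27000))  # 2022-12-04
```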
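The descriptionKey text notes that models with more layers generally have more parameters. A minimal sketch of that bookkeeping, assuming a plain fully connected network (the layer sizes are hypothetical):

```python
# Count learnable parameters in a fully connected network:
# each layer contributes (inputs x outputs) weights plus one bias per output.
def count_parameters(layer_sizes):
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weight matrix + bias vector
    return total

# Hypothetical architecture: 784 inputs, two hidden layers of 512, 10 outputs.
print(count_parameters([784, 512, 512, 10]))  # 669706
```

Adding a layer or widening an existing one grows the count roughly with the product of adjacent layer sizes, which is why deeper and wider models quickly reach very large parameter counts.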
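The license note defines the dataset's inclusion cutoff as training compute above 10²³ FLOP. A trivial sketch of that filter; the compute figure used here is hypothetical:

```python
FLOP_THRESHOLD = 1e23  # cutoff from the license note: 10^23 FLOP

# Hypothetical training-compute figure for one model, in FLOP.
training_compute_flop = 2.15e25

# Models above the threshold count as confirmed large-scale AI models.
print(training_compute_flop > FLOP_THRESHOLD)  # True
```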