Temperature
Parameter often associated with models that generate probability distributions
'Temperature' is a parameter commonly used in models that generate probability distributions, such as language models. Adjusting the model's `temperature` reshapes that distribution, making the generated text more focused or more diverse.
- High Temperature (>1): When the temperature is set to a high value (greater than 1), the probability distribution becomes flatter. This means that tokens across the vocabulary are assigned more equal probabilities, making less likely tokens more probable. As a result, the model introduces more randomness and diversity in its output. This increased variability can lead to creative and exploratory generation, producing a wider range of possible sequences or outcomes.
- Low Temperature (<1): Conversely, when the temperature is set to a low value (less than 1), the probability distribution becomes sharper. The model becomes more deterministic, assigning higher probabilities to the most likely tokens based on its learned patterns. In this case, the model is more likely to generate sequences that closely resemble the most probable predictions. Lower temperature values result in more focused and conservative outputs, reducing diversity but often enhancing the coherence and reliability of the generated content.
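The effect described above can be sketched with a few lines of code: temperature scaling divides the model's logits by the temperature before the softmax. The logit values below are illustrative, not from any particular model.

```python
import numpy as np

def softmax_with_temperature(logits, temperature=1.0):
    """Scale logits by 1/temperature, then apply softmax.

    temperature > 1 flattens the distribution (more random sampling);
    temperature < 1 sharpens it (more deterministic sampling).
    """
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()          # subtract max for numerical stability
    exps = np.exp(scaled)
    return exps / exps.sum()

# Hypothetical logits for a 4-token vocabulary
logits = [2.0, 1.0, 0.5, 0.1]

for t in (0.5, 1.0, 2.0):
    print(f"T={t}:", np.round(softmax_with_temperature(logits, t), 3))
```

Running this shows the top token's probability shrinking as the temperature rises, while the remaining tokens gain probability mass.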
In summary, adjusting the temperature allows users to fine-tune the balance between randomness and determinism in the generative model's output.