Temperature

Turning the temperature up or down means setting the model’s “temperature” parameter to a higher or lower value. This parameter plays a central role in controlling the randomness, or creativity, of the model’s responses.

| Temperature Setting | Description | Effect on Outputs | Use Cases |
| --- | --- | --- | --- |
| High temperature (closer to 1 or above) | Results in more random, diverse, and creative outputs. The model takes risks in language generation, producing varied and sometimes unexpected or less probable responses. | May generate creative or unusual sentences, but can also lead to less coherent or relevant responses. | Creative tasks like poetry or story writing can benefit from higher temperature settings, which introduce more creativity and novelty. |
| Low temperature (closer to 0) | Makes the model’s responses more deterministic and conservative. The model is likely to choose the most probable next word or phrase, leading to more predictable and consistent outputs. | Generally produces more reliable, coherent, and contextually appropriate responses, but with less variation and creativity. | Tasks requiring accuracy and coherence, like factual summarization or technical explanations, are better suited to lower temperature settings. |
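To make the effect concrete, the sketch below applies temperature scaling to a toy set of token logits (the standard formulation divides the logits by the temperature before the softmax). The logit values are made up purely for illustration.

```python
import numpy as np

def softmax_with_temperature(logits, temperature):
    """Convert raw logits into a probability distribution, scaled by temperature."""
    scaled = np.array(logits) / temperature   # lower T sharpens, higher T flattens
    exp = np.exp(scaled - np.max(scaled))     # subtract max for numerical stability
    return exp / exp.sum()

# Toy logits for four candidate next tokens (illustrative values only).
logits = [2.0, 1.0, 0.5, 0.1]

for t in (0.2, 1.0, 1.5):
    probs = softmax_with_temperature(logits, t)
    print(f"T={t}: {np.round(probs, 3)}")

# Low T (0.2) concentrates almost all probability on the top token (near-deterministic);
# high T (1.5) spreads probability more evenly, so sampling becomes more varied.
```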

In short, lowering the temperature reduces randomness and leads to more predictable, coherent outputs, while raising it introduces more creativity and diversity.
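As a practical illustration, here is a minimal sketch of setting the parameter in an API request using the OpenAI Python SDK; the model name and prompts are placeholders, and other providers expose an equivalent temperature argument.

```python
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Low temperature for a factual, deterministic answer.
factual = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "Summarize the water cycle in two sentences."}],
    temperature=0.2,
)

# High temperature for a more creative, varied response.
creative = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Write a four-line poem about the water cycle."}],
    temperature=1.0,
)

print(factual.choices[0].message.content)
print(creative.choices[0].message.content)
```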