## Parameters
An Azure LLM interface can have the following parameters:

| Parameter | Type | Description |
|---|---|---|
| temperature | float | Controls sampling randomness; higher values produce more varied output. |
| max_tokens | int | The maximum number of tokens to generate. |
| top_p | float | Nucleus-sampling threshold; restricts sampling to the top-p probability mass. |
| frequency_penalty | float | Penalizes tokens in proportion to how often they have appeared so far, reducing repetition. |
| presence_penalty | float | Penalizes tokens that have already appeared at all, encouraging new topics. |
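To show how these parameters fit together, here is an illustrative set of values as a plain Python dict (the values are example choices, not defaults from any specific library):

```python
# Illustrative generation parameters for an Azure LLM call.
# The keys mirror the table above; the values are examples,
# not library defaults.
generation_params = {
    "temperature": 0.7,        # sampling randomness (commonly 0.0-2.0)
    "max_tokens": 512,         # cap on the number of generated tokens
    "top_p": 0.95,             # nucleus-sampling probability mass
    "frequency_penalty": 0.0,  # discourage frequently repeated tokens
    "presence_penalty": 0.0,   # discourage reusing any seen token
}
```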
## Usage
Here is how you set up an interface to interact with your Azure models. The steps below use Azure OpenAI models; other models are configured similarly.
1. Create a `config.yaml` file in the same directory as your code:

   - 📁 src
     - 🐍 PythonCode.py
     - 🐍 PyNotebook.ipynb
     - 📄 config.yaml
2. Define your Azure OpenAI provider and models inside the `config.yaml` file.
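The exact schema depends on the library you are using; as an illustrative sketch, the file might look like this (all keys and values below are placeholders to adapt to your library's documented schema):

```yaml
# Illustrative config.yaml layout — adjust key names to match
# the schema your library documents.
azure_openai:
  api_key: YOUR_AZURE_OPENAI_KEY
  endpoint: https://your-resource-name.openai.azure.com/
  models:
    - name: gpt-4o
      deployment: your-deployment-name
```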
3. Create your `llm` instance.
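The constructor you call here depends on the library; as a self-contained sketch, a minimal wrapper around the config fields might look like this (the class and field names are hypothetical, not a real library API):

```python
from dataclasses import dataclass, field

# Hypothetical minimal wrapper: a real library would expose its own
# class that reads these same fields from config.yaml.
@dataclass
class AzureLLM:
    endpoint: str
    api_key: str
    deployment: str
    params: dict = field(default_factory=dict)  # optional generation params

llm = AzureLLM(
    endpoint="https://your-resource-name.openai.azure.com/",
    api_key="YOUR_AZURE_OPENAI_KEY",
    deployment="your-deployment-name",
)
```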
4. Optional: you can add your parameters as follows:
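However the library exposes them, the parameters from the table above typically travel as keyword arguments or a dict merged into each request. A sketch of that merge pattern (the function and defaults below are hypothetical, for illustration only):

```python
# Hypothetical sketch: attach generation parameters to a request.
# Real libraries usually accept these as constructor kwargs or
# per-call overrides; this only shows the merge pattern.
DEFAULTS = {"temperature": 1.0, "top_p": 1.0}

def build_request(prompt: str, **overrides) -> dict:
    params = {**DEFAULTS, **overrides}  # per-call values win
    return {"prompt": prompt, **params}

request = build_request("Hello", temperature=0.2, max_tokens=128)
```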
You are done setting up your Azure LLM!