OpenAIPromptExecutionSettings Class

Common request settings for (Azure) OpenAI services.

Initialize the prompt execution settings.

Constructor

OpenAIPromptExecutionSettings(
    service_id: str | None = None,
    *,
    extension_data: dict[str, Any] = None,
    function_choice_behavior: FunctionChoiceBehavior | None = None,
    ai_model_id: str | None = None,
    frequency_penalty: Annotated[float | None, Ge(ge=-2.0), Le(le=2.0)] = None,
    logit_bias: dict[str | int, float] | None = None,
    max_tokens: Annotated[int | None, Gt(gt=0)] = None,
    number_of_responses: Annotated[int | None, Ge(ge=1), Le(le=128)] = None,
    presence_penalty: Annotated[float | None, Ge(ge=-2.0), Le(le=2.0)] = None,
    seed: int | None = None,
    stop: str | list[str] | None = None,
    stream: bool = False,
    temperature: Annotated[float | None, Ge(ge=0.0), Le(le=2.0)] = None,
    top_p: Annotated[float | None, Ge(ge=0.0), Le(le=1.0)] = None,
    user: str | None = None,
    store: bool | None = None,
    metadata: dict[str, str] | None = None,
)

Parameters

service_id (str)

The service ID to use for the request. Default value: None.

kwargs (Any)

Additional keyword arguments; these are attempted to be parsed into the keys of the specific prompt execution settings.
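
A minimal construction sketch follows. The import path is an assumption based on the semantic-kernel Python package layout and may differ between versions; the argument values are examples. service_id is the only positional parameter, and every other setting must be passed by keyword.

from semantic_kernel.connectors.ai.open_ai import OpenAIPromptExecutionSettings

# service_id is positional; all other settings are keyword-only.
settings = OpenAIPromptExecutionSettings(
    service_id="default",   # example service ID
    temperature=0.7,
    max_tokens=256,
    user="example-user",
)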

Keyword-Only Parameters

All keyword-only parameters are optional; the types and defaults below follow the constructor signature above.

extension_data (dict[str, Any]): default None
function_choice_behavior (FunctionChoiceBehavior | None): default None
ai_model_id (str | None): default None
frequency_penalty (float | None, -2.0 to 2.0): default None
logit_bias (dict[str | int, float] | None): default None
max_tokens (int | None, greater than 0): default None
number_of_responses (int | None, 1 to 128): default None
presence_penalty (float | None, -2.0 to 2.0): default None
seed (int | None): default None
stop (str | list[str] | None): default None
stream (bool): default False
temperature (float | None, 0.0 to 2.0): default None
top_p (float | None, 0.0 to 1.0): default None
user (str | None): default None
store (bool | None): default None
metadata (dict[str, str] | None): default None
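
The range constraints above are enforced through Pydantic validation (the FieldInfo metadata in the Attributes section below carries the same bounds), so out-of-range values should be rejected when the settings are constructed. A short sketch, assuming the same import path as in the earlier example:

from pydantic import ValidationError
from semantic_kernel.connectors.ai.open_ai import OpenAIPromptExecutionSettings

# Values inside the documented ranges are accepted.
settings = OpenAIPromptExecutionSettings(
    temperature=0.2,        # allowed range 0.0 to 2.0
    top_p=0.95,             # allowed range 0.0 to 1.0
    number_of_responses=1,  # allowed range 1 to 128
    stop=["\n\n"],
    seed=42,
)

# Values outside the documented ranges fail validation.
try:
    OpenAIPromptExecutionSettings(frequency_penalty=3.0)  # outside -2.0 to 2.0
except ValidationError as exc:
    print(exc.errors()[0]["loc"])  # typically ('frequency_penalty',)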

Attributes

ai_model_id

ai_model_id: Annotated[str | None, FieldInfo(annotation=NoneType, required=True, alias_priority=2, serialization_alias='model')]

frequency_penalty

frequency_penalty: Annotated[float | None, FieldInfo(annotation=NoneType, required=True, metadata=[Ge(ge=-2.0), Le(le=2.0)])]

logit_bias

logit_bias: dict[str | int, float] | None

max_tokens

max_tokens: Annotated[int | None, FieldInfo(annotation=NoneType, required=True, metadata=[Gt(gt=0)])]

metadata

metadata: dict[str, str] | None

number_of_responses

number_of_responses: Annotated[int | None, FieldInfo(annotation=NoneType, required=True, alias_priority=2, serialization_alias='n', metadata=[Ge(ge=1), Le(le=128)])]

presence_penalty

presence_penalty: Annotated[float | None, FieldInfo(annotation=NoneType, required=True, metadata=[Ge(ge=-2.0), Le(le=2.0)])]

seed

seed: int | None

stop

stop: str | list[str] | None

store

store: bool | None

stream

stream: bool

temperature

temperature: Annotated[float | None, FieldInfo(annotation=NoneType, required=True, metadata=[Ge(ge=0.0), Le(le=2.0)])]

top_p

top_p: Annotated[float | None, FieldInfo(annotation=NoneType, required=True, metadata=[Ge(ge=0.0), Le(le=1.0)])]

user

user: str | None
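
Two attributes carry a serialization alias: ai_model_id serializes as "model" and number_of_responses as "n". Because the settings class is a Pydantic model, a by-alias dump shows the field names that appear in the request payload. The sketch below is illustrative only and assumes the import path used in the earlier examples; the model ID is an example value.

from semantic_kernel.connectors.ai.open_ai import OpenAIPromptExecutionSettings

settings = OpenAIPromptExecutionSettings(
    ai_model_id="gpt-4o-mini",  # example model ID
    number_of_responses=2,
    temperature=0.5,
)

# by_alias=True applies the serialization aliases listed under Attributes.
payload = settings.model_dump(by_alias=True, exclude_none=True)
print(payload["model"])  # gpt-4o-mini  (ai_model_id -> "model")
print(payload["n"])      # 2            (number_of_responses -> "n")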