chainfury.components.tune package

class chainfury.components.tune.ChatNBX(*, model: str, messages: List[Message], max_tokens: int | None = 4294967296, temperature: float | None = 1)[source]

Bases: BaseModel

This model describes a chat request to the OpenAI-compatible ChatNBX API

class Message(*, role: str, content: str, name: str | None = None, function_call: str | Dict[str, str] | None = None)[source]

Bases: BaseModel

content: str
function_call: str | Dict[str, str] | None
name: str | None
role: str
max_tokens: int | None
messages: List[Message]
model: str
temperature: float | None
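A minimal sketch of constructing the request model from the fields above; the model name is illustrative, and plain dicts are coerced into Message objects by pydantic:

   from chainfury.components.tune import ChatNBX

   # build the request payload; field names follow the class signature above
   request = ChatNBX(
       model="llama-2-chat-70b-4k",  # illustrative model name, see https://chat.nbox.ai/
       messages=[
           {"role": "system", "content": "You are a helpful assistant."},
           {"role": "user", "content": "Summarize what ChainFury does."},
       ],
       max_tokens=256,
       temperature=0.7,
   )
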
chainfury.components.tune.chatnbx(model: str, messages: List[Dict[str, str]], chatnbx_api_key: Secret = '', max_tokens: int = 1024, temperature: float = 1, *, retry_count: int = 3, retry_delay: int = 1) Dict[str, Any][source]

Chat with the OpenAI-compatible ChatNBX API, see more at https://chat.nbox.ai/

Note: This API is only partially compatible with OpenAI’s API, so messages should be of type [{"role": ..., "content": ...}]

Parameters:
  • model (str) – The model to use, see https://chat.nbox.ai/ for more info

  • messages (List[Dict[str, str]]) – A list of messages to send to the API which are OpenAI compatible

  • chatnbx_api_key (Secret, optional) – The API key to use; if not provided, the CHATNBX_KEY environment variable is used

  • max_tokens (int, optional) – The maximum number of tokens to generate. Defaults to 1024.

  • temperature (float, optional) – Sampling temperature; higher values produce more random output. Defaults to 1.

Returns:

The response from the API

Return type:

Dict[str, Any]
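
A minimal usage sketch, assuming the CHATNBX_KEY environment variable is set and that the model name below is available on https://chat.nbox.ai/:

   from chainfury.components.tune import chatnbx

   # OpenAI-style message list, as described in the note above
   messages = [
       {"role": "system", "content": "You are a helpful assistant."},
       {"role": "user", "content": "What is ChainFury?"},
   ]

   # chatnbx_api_key falls back to the CHATNBX_KEY environment variable;
   # the model name is illustrative, see https://chat.nbox.ai/ for available models
   response = chatnbx(
       model="llama-2-chat-70b-4k",
       messages=messages,
       max_tokens=256,
       temperature=0.7,
   )
   print(response)  # Dict[str, Any] returned by the API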