openai_gpt
langroid/language_models/openai_gpt.py
AnthropicModel
Bases: str, Enum
Enum for Anthropic models.
OpenAIChatModel
Bases: str, Enum
Enum for OpenAI Chat models.
OpenAICompletionModel
Bases: str, Enum
Enum for OpenAI Completion models.
OpenAICallParams
Bases: BaseModel
Various params that can be sent to an OpenAI API chat-completion call. When specified, any param here overrides the one with the same name in the OpenAIGPTConfig.
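For illustration, a minimal sketch of attaching per-call params to a config. The specific field names (`temperature`, `seed`) and the `params` attribute on `OpenAIGPTConfig` are assumptions, not taken from this page; check the model for the params it actually supports.

```python
from langroid.language_models.openai_gpt import OpenAICallParams, OpenAIGPTConfig

# Hypothetical field names -- per-call params override config values of the same name.
call_params = OpenAICallParams(temperature=0.2, seed=42)

config = OpenAIGPTConfig(
    params=call_params,  # assumed attribute holding the per-call params
)
```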
OpenAIGPTConfig(**kwargs)
Bases: LLMConfig
Class for any LLM with an OpenAI-like API. Besides the OpenAI models, this includes:
(a) locally-served models behind an OpenAI-compatible API
(b) non-local models, accessed via a proxy adapter lib like litellm that provides an OpenAI-compatible API.
We could rename this class to OpenAILikeConfig.
Source code in langroid/language_models/openai_gpt.py
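A rough sketch of both cases, assuming fields such as `chat_model`, `api_base`, and `litellm` exist on the config; treat these names and values as illustrative rather than authoritative.

```python
from langroid.language_models.openai_gpt import OpenAIGPTConfig

# (a) locally-served model behind an OpenAI-compatible API (field names assumed)
local_config = OpenAIGPTConfig(
    chat_model="ollama/mistral",
    api_base="http://localhost:11434/v1",
)

# (b) non-local model reached through a proxy adapter lib like litellm
litellm_config = OpenAIGPTConfig(
    litellm=True,
    chat_model="claude-3-5-sonnet-20240620",
)
```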
create(prefix)
classmethod
Create a config class whose params can be set via a desired prefix from the .env file or env vars. E.g., using the prefix "ollama", you can have a group of params prefixed by "OLLAMA_", to be used with models served via ollama.
This way, you can maintain several setting-groups in your .env file, one per model type.
Source code in langroid/language_models/openai_gpt.py
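Based on the description above, usage might look like the following; the exact env-var names shown are illustrative.

```python
from langroid.language_models.openai_gpt import OpenAIGPTConfig

# Params for this config class are now read from env vars / .env entries
# prefixed with "OLLAMA_", e.g. OLLAMA_CHAT_MODEL, OLLAMA_API_BASE (names assumed).
OllamaConfig = OpenAIGPTConfig.create(prefix="ollama")
ollama_config = OllamaConfig()
```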
OpenAIResponse
Bases: BaseModel
OpenAI response model, either completion or chat.
OpenAIGPT(config=OpenAIGPTConfig())
Bases: LanguageModel
Class for OpenAI LLMs.
Source code in langroid/language_models/openai_gpt.py
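A minimal usage sketch; the `chat()` call and the `message` attribute on the response are assumptions about the LanguageModel interface, not documented on this page.

```python
from langroid.language_models.openai_gpt import OpenAIGPT, OpenAIGPTConfig

llm = OpenAIGPT(OpenAIGPTConfig())  # default config, per the constructor signature above
response = llm.chat("What is the capital of France?", max_tokens=50)  # assumed signature
print(response.message)             # assumed attribute on the response object
```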
unsupported_params()
List of params that are not supported by the current model.
Source code in langroid/language_models/openai_gpt.py
rename_params()
Map of param name -> new name for specific models. Currently the main troublemaker is the o1* series.
Source code in langroid/language_models/openai_gpt.py
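To illustrate how such a mapping would be applied, a small sketch. The specific rename shown (`max_tokens` -> `max_completion_tokens` for o1* models) reflects the OpenAI API's requirement, but the exact contents returned by `rename_params()` are not shown on this page.

```python
# Illustrative mapping; the actual entries come from rename_params().
renames = {"max_tokens": "max_completion_tokens"}

request = {"max_tokens": 100, "temperature": 1.0}
request = {renames.get(k, k): v for k, v in request.items()}
# -> {"max_completion_tokens": 100, "temperature": 1.0}
```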
chat_context_length()
Context-length for chat-completion models/endpoints. Get it from the dict, otherwise fall back to the general method.
Source code in langroid/language_models/openai_gpt.py
completion_context_length()
Context-length for completion models/endpoints. Get it from the dict, otherwise fall back to the general method.
Source code in langroid/language_models/openai_gpt.py
chat_cost()
(Prompt, Generation) cost per 1000 tokens, for chat-completion models/endpoints. Get it from the dict, otherwise fall back to the general method.
Source code in langroid/language_models/openai_gpt.py
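As a sketch of how the per-1000-token rates would be used to price a call (the rates below are made up for illustration, not real model pricing):

```python
prompt_rate, generation_rate = 0.0025, 0.01   # e.g. from llm.chat_cost(); made-up values

prompt_tokens, generation_tokens = 1200, 350
cost = (prompt_tokens / 1000) * prompt_rate + (generation_tokens / 1000) * generation_rate
print(f"${cost:.4f}")  # 0.0030 + 0.0035 = $0.0065
```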
set_stream(stream)
Enable or disable streaming output from API.
Args:
    stream: enable streaming output from API
Returns:
    previous value of stream
Source code in langroid/language_models/openai_gpt.py
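A sketch of temporarily disabling streaming and restoring the previous setting via the documented return value; the `chat()` call shown is an assumed interface.

```python
was_streaming = llm.set_stream(False)   # returns the previous stream setting
try:
    response = llm.chat("Summarize this in one line.", max_tokens=30)  # assumed signature
finally:
    llm.set_stream(was_streaming)       # restore the earlier setting
```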
get_stream()
tool_deltas_to_tools(tools)
staticmethod
Convert accumulated tool-call deltas to OpenAIToolCall objects. Adapted from this excellent code: https://community.openai.com/t/help-for-function-calls-with-streaming/627170/2
Parameters:

| Name | Type | Description | Default |
|---|---|---|---|
| tools | List[Dict[str, Any]] | list of tool deltas received from streaming API | required |

Returns:

| Type | Description |
|---|---|
| str | plain text corresponding to tool calls that failed to parse |
| List[OpenAIToolCall] | list of OpenAIToolCall objects |
| List[Dict[str, Any]] | list of tool dicts (to reconstruct OpenAI API response, so it can be cached) |

Source code in langroid/language_models/openai_gpt.py
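A rough sketch of converting hand-built deltas. The delta layout mimics the OpenAI streaming format (id/name in the first chunk, argument fragments in later chunks); the field layout and values here are hypothetical and only illustrate the three documented return values.

```python
from langroid.language_models.openai_gpt import OpenAIGPT

tool_deltas = [
    {"index": 0, "id": "call_1", "type": "function",
     "function": {"name": "get_weather", "arguments": ""}},
    {"index": 0, "function": {"arguments": '{"city": '}},
    {"index": 0, "function": {"arguments": '"Paris"}'}},
]

text, tool_calls, tool_dicts = OpenAIGPT.tool_deltas_to_tools(tool_deltas)
# text:       plain text for tool calls whose arguments failed to parse
# tool_calls: list of OpenAIToolCall objects
# tool_dicts: list of tool dicts (to reconstruct/cache the API response)
```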
noop()
litellm_logging_fn(model_call_dict)
Logging function for litellm.