mirror of https://github.com/hwchase17/langchain
add max_context_size property in BaseOpenAI (#6239)
Hi, I made a small improvement to BaseOpenAI.
I added a max_context_size attribute to BaseOpenAI so that the maximum
context size can be read directly, instead of only obtaining the maximum
token count for a prompt through the max_tokens_for_prompt method.
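A minimal sketch of the idea, assuming a simplified stand-in for langchain's BaseOpenAI (the lookup table, the whitespace token counter, and the class body here are illustrative, not the library's actual implementation; the names `max_context_size`, `max_tokens_for_prompt`, and `modelname_to_contextsize` are the ones discussed in this PR):

```python
class BaseOpenAI:
    """Simplified stand-in illustrating the max_context_size property."""

    model_name: str = "text-davinci-003"

    def modelname_to_contextsize(self, modelname: str) -> int:
        # Illustrative lookup; the real method covers many more models.
        context_sizes = {
            "text-davinci-003": 4097,
            "gpt-3.5-turbo": 4096,
        }
        return context_sizes.get(modelname, 4097)

    @property
    def max_context_size(self) -> int:
        """Maximum context size for the configured model."""
        return self.modelname_to_contextsize(self.model_name)

    def get_num_tokens(self, text: str) -> int:
        # Crude whitespace count standing in for a real tokenizer.
        return len(text.split())

    def max_tokens_for_prompt(self, prompt: str) -> int:
        """Tokens remaining for the completion, given a prompt."""
        return self.max_context_size - self.get_num_tokens(prompt)


llm = BaseOpenAI()
print(llm.max_context_size)                     # full context window
print(llm.max_tokens_for_prompt("Hello world")) # window minus prompt tokens
```

With the property in place, callers can read the model's full context window directly rather than passing a dummy prompt to max_tokens_for_prompt.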
Who can review?
@hwchase17 @agola11
I followed the [Common
Tasks](c7db9febb0/.github/CONTRIBUTING.md
),
and all tests pass.
---------
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
pull/6287/head^2 · parent 3e3ed8c5c9 · commit ca7a44d024