# Token text splitter for sentence transformers

The current `TokenTextSplitter` only works with OpenAI models via the `tiktoken` package, which is not clear from the name `TokenTextSplitter`. This first PR adds a token-based text splitter for sentence transformer models. In the future I think we should work towards injecting a tokenizer into the `TokenTextSplitter` to make it more flexible.

Could perhaps be reviewed by @dev2049

---------

Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
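Below is a minimal usage sketch contrasting the two splitters. The class name `SentenceTransformersTokenTextSplitter`, its parameters, and the model name shown are assumptions made for illustration; the commit message does not spell out the exact API, so check the installed `langchain` version.

```python
# Minimal sketch, assuming the new splitter is exposed as
# SentenceTransformersTokenTextSplitter in langchain.text_splitter
# (name and signature are illustrative, not confirmed by this commit message).
from langchain.text_splitter import TokenTextSplitter, SentenceTransformersTokenTextSplitter

# Existing splitter: counts tokens with tiktoken, i.e. OpenAI tokenizations.
openai_splitter = TokenTextSplitter(chunk_size=128, chunk_overlap=0)

# New splitter: counts tokens with the sentence transformer model's own
# tokenizer, so chunks respect that model's actual sequence limit.
st_splitter = SentenceTransformersTokenTextSplitter(
    model_name="sentence-transformers/all-mpnet-base-v2",  # assumed argument name
    chunk_overlap=0,
)

text = "LangChain splits long documents into smaller chunks before embedding them. " * 50
print(len(openai_splitter.split_text(text)))  # chunk count under the tiktoken tokenizer
print(len(st_splitter.split_text(text)))      # chunk count under the sentence transformer tokenizer
```

The tokenizer-injection idea mentioned above would generalize this further: rather than a dedicated class per tokenizer family, `TokenTextSplitter` would accept an arbitrary tokenizer object and count tokens through it.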
- _static
- additional_resources
- ecosystem
- getting_started
- integrations
- modules
- reference
- templates
- tracing
- use_cases
- conf.py
- dependents.md
- index.rst
- integrations.rst
- make.bat
- Makefile
- reference.rst
- requirements.txt