Big docs refactor! The motivation is to make it easier for people to find
the resources they are looking for. To accomplish this, there are now three
main sections:
- Getting Started: steps for getting started, walking through most core
functionality
- Modules: the different modules of functionality that langchain
provides. Each module has a "getting started", "how to", "key
concepts" and "reference" section (except in a few select cases where
that didn't easily fit).
- Use Cases: this separates use cases (like summarization, question
answering, evaluation, etc.) from the modules, and provides a different
entry point to the code base.
There is also a full reference section, as well as extra resources
(glossary, gallery, etc.).
Co-authored-by: Shreya Rajpal <ShreyaR@users.noreply.github.com>
This PR has two contributions:
1. Add a test for when a stop token is found in the middle of the text
(see the sketch after this list)
2. Add code coverage tooling and instructions:
- Add pytest-cov via poetry
- Add the necessary config files
- Add a new make target for `coverage`
- Update the README with coverage guidance
- Fix minor README formatting/spelling
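For illustration, a minimal sketch of the kind of test added here, assuming a
stop-token helper along the lines of `enforce_stop_tokens` (the helper body
below is a stand-in, not the library code):

```python
import re
from typing import List


def enforce_stop_tokens(text: str, stop: List[str]) -> str:
    """Cut the text off at the first occurrence of any stop token."""
    return re.split("|".join(stop), text)[0]


def test_stop_token_in_middle_of_text() -> None:
    """A stop token in the middle of the text should truncate the rest."""
    text = "foo bar baz\nSTOP and some trailing text"
    assert enforce_stop_tokens(text, ["STOP"]) == "foo bar baz\n"
```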
Co-authored-by: Hunter Gerlach <hunter@huntergerlach.com>
Love the project, a ton of fun!
I think the PR is pretty self-explanatory, happy to make any changes! I
am working on using it in an `LLMBashChain` and may update as that
progresses.
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Add support for calling HuggingFace embedding models
using the HuggingFaceHub Inference API. The new class mirrors
the existing HuggingFaceHub LLM implementation. Currently it
only supports 'sentence-transformers' models.
Closes #86
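A rough usage sketch, assuming the new class is exposed as
`HuggingFaceHubEmbeddings` and follows the usual `embed_query` /
`embed_documents` interface (the model id below is just a placeholder):

```python
from langchain.embeddings import HuggingFaceHubEmbeddings

# Requires HUGGINGFACEHUB_API_TOKEN to be set (or passed explicitly).
embeddings = HuggingFaceHubEmbeddings(
    repo_id="sentence-transformers/all-mpnet-base-v2",
)

query_vector = embeddings.embed_query("What is LangChain?")
doc_vectors = embeddings.embed_documents(["First document", "Second document"])
```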
Add MemoryChain and ConversationChain as chains that take a docstore in
addition to the prompt, and use the docstore to stuff context into the
prompt. This can be used to have an ongoing conversation with a chatbot.
This probably needs a bit of refactoring for code quality.
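A loose, self-contained sketch of the flow described above (prior turns are
pulled from a store and stuffed into the prompt before each call), with
illustrative names rather than the PR's actual classes:

```python
from typing import List

TEMPLATE = (
    "The following is a conversation between a human and an AI.\n"
    "{history}\n"
    "Human: {human_input}\n"
    "AI:"
)


class InMemoryHistory:
    """Toy stand-in for the docstore that holds prior turns."""

    def __init__(self) -> None:
        self.turns: List[str] = []

    def add(self, turn: str) -> None:
        self.turns.append(turn)

    def stuff(self) -> str:
        return "\n".join(self.turns)


def converse(llm, history: InMemoryHistory, human_input: str) -> str:
    """Stuff prior turns into the prompt, call the LLM, record the new turn."""
    prompt = TEMPLATE.format(history=history.stuff(), human_input=human_input)
    response = llm(prompt)
    history.add(f"Human: {human_input}\nAI: {response}")
    return response
```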
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
Also updated the docs, and noticed an issue with the add_texts method on
VectorStores that I had missed before: the metadatas arg should be
required, to match the classmethod that initializes the VectorStores
(otherwise the add_example methods in the ExampleSelectors break).
This will break atm, but I wanted to get thoughts on the implementation.
1. Should add() be on the docstore interface?
2. Should InMemoryDocstore change to take a list of documents in its init?
(This makes it slightly easier to implement in FAISS; if we think it is
less clean, we could instead expose a method to get the number of documents
currently in the dict and perform the logic of creating the necessary
dictionary in the FAISS.add_texts method; see the sketch below.)
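To make question 2 concrete, a rough sketch of the dict-based alternative, in
which FAISS.add_texts builds the id-to-text dictionary itself; the `count`
and `add` methods here are hypothetical:

```python
from typing import Dict, Iterable, List


class InMemoryDocstore:
    """Simplified docstore keyed by string ids (illustrative only)."""

    def __init__(self, docs: Dict[str, str]) -> None:
        self._docs = docs

    def count(self) -> int:
        """Hypothetical helper exposing how many documents are stored."""
        return len(self._docs)

    def add(self, new_docs: Dict[str, str]) -> None:
        overlap = set(new_docs) & set(self._docs)
        if overlap:
            raise ValueError(f"ids already in docstore: {overlap}")
        self._docs.update(new_docs)


def add_texts(docstore: InMemoryDocstore, texts: Iterable[str]) -> List[str]:
    """Build the id -> text dict here instead of changing the docstore init."""
    start = docstore.count()
    new_docs = {str(start + i): text for i, text in enumerate(texts)}
    docstore.add(new_docs)
    return list(new_docs)
```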
Co-authored-by: Harrison Chase <hw.chase.17@gmail.com>
`SQLDatabase` now accepts two `init` arguments:
1. `ignore_tables` to pass in a list of tables not to search over
2. `include_tables` to restrict the search to a list of tables to consider
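A short usage sketch, assuming `SQLDatabase` wraps a SQLAlchemy engine and is
importable from the top-level package (the URI and table names are
placeholders):

```python
from sqlalchemy import create_engine

from langchain import SQLDatabase

engine = create_engine("sqlite:///example.db")

# Restrict the database description to an explicit allowlist of tables...
db = SQLDatabase(engine, include_tables=["users", "orders"])

# ...or keep everything in scope except the listed tables.
db = SQLDatabase(engine, ignore_tables=["audit_log"])
```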
This is a simple proof of concept of using external files as templates.
I'm still feeling my way around the codebase.
As a user, I want to use files as prompts, so that prompts are easier to
manage and test.
The future direction is to use a template engine, most likely Mako.
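A minimal sketch of the idea in plain Python, with a made-up file path and
contents, just to show the template text living outside the code:

```python
from pathlib import Path

# prompts/summarize.txt might contain:
#   Summarize the following text in one sentence:
#   {text}
template = Path("prompts/summarize.txt").read_text()

prompt = template.format(text="LangChain is a library for building LLM apps.")
```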
Add support for HuggingFace Hub
I could not find a good way to enforce stop tokens over the HuggingFace
Hub API; that will hopefully be cleaned up in the future.
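A usage sketch, assuming the wrapper is exposed as `HuggingFaceHub` with a
`repo_id` argument; the truncation at the end is only an illustration of a
possible client-side workaround for the stop-token limitation, not
necessarily what the class does internally:

```python
import re

from langchain.llms import HuggingFaceHub

# Requires HUGGINGFACEHUB_API_TOKEN in the environment (or passed explicitly).
llm = HuggingFaceHub(repo_id="gpt2")

text = llm("Q: What is the capital of France?\nA:")

# The hub API itself does not honor stop sequences, so as a stopgap the
# generated text can be cut client-side at the first stop token.
text = re.split(r"\nQ:", text)[0]
```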