This PR adds a LangChain implementation of the CAMEL role-playing example:
https://github.com/lightaime/camel.
Apologies, I am not that familiar with LangChain, so I have only implemented
it in a naive way; there may be a better way to implement it.
**Description**
Add support for custom vector field and text field names when indexing and
querying with OpenSearch.
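A minimal sketch of the intended usage, assuming the new options are exposed as `vector_field` and `text_field` keyword arguments (field names and the local OpenSearch URL are placeholders):

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import OpenSearchVectorSearch

# Index into custom field names (kwarg names assumed from the description above)
docsearch = OpenSearchVectorSearch.from_texts(
    texts=["hello world", "foo bar"],
    embedding=OpenAIEmbeddings(),
    opensearch_url="http://localhost:9200",
    vector_field="my_vector",
    text_field="my_text",
)

# Query against the same custom fields
docs = docsearch.similarity_search(
    "hello",
    vector_field="my_vector",
    text_field="my_text",
)
```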
**Issues**
https://github.com/hwchase17/langchain/issues/2500
Signed-off-by: Naveen Tatikonda <navtat@amazon.com>
#2681
Original type hints
```python
allowed_special: Union[Literal["all"], AbstractSet[str]] = set(), # noqa: B006
disallowed_special: Union[Literal["all"], Collection[str]] = "all",
```
from
46287bfa49/tiktoken/core.py (L79-L80)
are not compatible with pydantic
<img width="718" alt="image"
src="https://user-images.githubusercontent.com/5096640/230993236-c744940e-85fb-4baa-b9da-8b00fb60a2a8.png">
I think we could use
```python
allowed_special: Union[Literal["all"], Set[str]] = set()
disallowed_special: Union[Literal["all"], Set[str], Tuple[()]] = "all"
```
Please let me know if you would like to implement it differently.
Hi,
just wanted to mention that I added `langchain` to
[conda-forge](https://github.com/conda-forge/langchain-feedstock), so
that it can be installed with `conda`/`mamba` etc.
This makes it available to corporate users with custom conda servers and to
people who prefer to manage their Python environments with conda.
**Problem:**
OpenAI Embeddings has a few minor issues: the method name and comment for
`_completion_with_retry` appear to be a copy-paste error, and a few comments
around the usage of `embedding_ctx_length` seem to be incorrect.
**Solution:**
Clean up these issues.
---------
Co-authored-by: Vijay Rajaram <vrajaram3@gatech.edu>
It took me a bit to find the right places to get the API keys. The link
provided earlier for setting up search is still good, but why not also provide
a direct link to the Google Cloud tools that let you create the keys?
`combine_docs` does not go through the standard chain call path, which means
chain callbacks won't be triggered and QA chains won't be traced properly;
this PR fixes that.
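For illustration, a minimal QA-chain sketch (document text and question are placeholders); invoking the chain like this goes through the standard call path, so callbacks and tracing should now fire for the combine-docs step as well:

```python
from langchain.chains.question_answering import load_qa_chain
from langchain.docstore.document import Document
from langchain.llms import OpenAI

docs = [Document(page_content="The meeting was moved to Friday at 3pm.")]
chain = load_qa_chain(OpenAI(temperature=0), chain_type="stuff", verbose=True)

# Calling the chain (rather than combine_docs directly) runs the standard
# callback path, so the QA run shows up in tracing/verbose output.
result = chain({"input_documents": docs, "question": "When is the meeting?"})
print(result["output_text"])
```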
Also fixes several errors in the `chat_vector_db` notebook.
Adds a new PDF loader using the existing dependency on PDFMiner.
The new loader can be helpful for chunking texts semantically into sections,
since the output HTML content can be parsed via `BeautifulSoup` to get richer,
more structured information about font sizes, page numbers, PDF
headers/footers, etc. that may not be available with other PDF loaders.
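A rough sketch of how the HTML output can be inspected (the loader class name and file path are assumptions for illustration):

```python
from bs4 import BeautifulSoup
from langchain.document_loaders import PDFMinerPDFasHTMLLoader  # assumed class name

loader = PDFMinerPDFasHTMLLoader("example.pdf")  # placeholder path
html_doc = loader.load()[0]

# PDFMiner's HTML output keeps layout details (font size, position, page breaks)
# in style attributes, which can be used for semantic chunking.
soup = BeautifulSoup(html_doc.page_content, "html.parser")
for span in soup.find_all("span")[:5]:
    print(span.get("style"), "->", span.get_text()[:40])
```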
Improvements to the Deep Lake vector store (a brief usage sketch follows the notes below):
- much faster view loading of embeddings after filters with
`fetch_chunks=True`
- 2x faster ingestion
- use `np.float32` for embeddings to save 2x storage, and LZ4 compression for
text and metadata storage (saves up to 4x storage for text data)
- user-defined functions as filters
Docs
- Added a full retriever example for analyzing Twitter's the-algorithm
source code with GPT-4
- Added a use case for code analysis (please let us know your thoughts on
how we can improve it)
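For context, a generic Deep Lake vector store sketch (dataset path and texts are placeholders; it does not exercise the new UDF filters or the `fetch_chunks` option):

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import DeepLake

# dataset_path and texts are placeholders
db = DeepLake.from_texts(
    texts=["tweets are ranked by engagement signals", "embeddings are stored as float32"],
    embedding=OpenAIEmbeddings(),
    dataset_path="./my_deeplake",
)
docs = db.similarity_search("how are tweets ranked?")
```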
---------
Co-authored-by: Davit Buniatyan <d@activeloop.ai>
## Why this PR?
Fixes #2624
There's a missing import statement in the AzureOpenAI embeddings example.
## What's new in this PR?
- Import `OpenAIEmbeddings` before creating its object (see the sketch below).
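For reference, the corrected cell looks roughly like this (assuming the Azure connection details are configured via environment variables earlier in the notebook):

```python
from langchain.embeddings import OpenAIEmbeddings

# Azure endpoint, key, and deployment settings are assumed to be configured elsewhere
embeddings = OpenAIEmbeddings()
```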
## How it's tested?
- By running the notebook and creating the embeddings object.
Signed-off-by: letmerecall <girishsharma001@gmail.com>
Referencing #2595
Added an optional parameter to adjust the index metadata upon collection
creation, per the Chroma code at
ce0bc89777/chromadb/api/local.py (L74)
This allows users to adjust the distance calculation function.
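A minimal sketch, assuming the new option is exposed as a `collection_metadata` parameter and that Chroma's `hnsw:space` metadata key selects the distance function:

```python
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Chroma

# collection_metadata is the assumed parameter name; "hnsw:space" picks the
# distance function (e.g. "l2", "ip", or "cosine") at collection creation time.
db = Chroma(
    collection_name="my_collection",
    embedding_function=OpenAIEmbeddings(),
    collection_metadata={"hnsw:space": "cosine"},
)
db.add_texts(["hello world"])
```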
closes #1634
Adds support for loading files from a shared Google Drive folder to
`GoogleDriveLoader`. Shared drives are commonly used by businesses on
their Google Workspace accounts (this is my particular use case).
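A minimal sketch (the folder ID is a placeholder; Google credentials are assumed to be set up as usual for `GoogleDriveLoader`):

```python
from langchain.document_loaders import GoogleDriveLoader

# folder_id is a placeholder; use the ID from the shared drive folder's URL
loader = GoogleDriveLoader(folder_id="1AbCdEfGhIjKlMnOpQrStUvWxYz")
docs = loader.load()
```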
RWKV is an RNN whose hidden state is part of its inference. However, the
model state should not be carried across uses, and doing so is a bug.
This resets the state between invocations.
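A quick sketch of the expected behaviour (model and tokenizer paths are placeholders):

```python
from langchain.llms import RWKV

# local checkpoint and tokenizer paths are placeholders
model = RWKV(
    model="./models/rwkv-4-raven.pth",
    tokens_path="./20B_tokenizer.json",
    strategy="cpu fp32",
)

# With the reset, each call starts from a fresh hidden state, so the second
# completion is not influenced by the first prompt.
print(model("Name a color."))
print(model("Name an animal."))
```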
Added support for passing openai_organization as an argument; previously it
was only supported via the environment variable, whereas openai_api_key was
supported both as an environment variable and as an argument.
`ChatOpenAI(temperature=0, model_name="gpt-4", openai_api_key="sk-****", openai_organization="org-****")`