Commit Graph

65 Commits

Author SHA1 Message Date
Jacob Lee
3328507f11
langchain[patch], experimental[minor]: Adds OllamaFunctions wrapper (#13330)
CC @baskaryan @hwchase17 @jmorganca 

Having a bit of trouble importing `langchain_experimental` from a
notebook, will figure it out tomorrow

~Ah and also is blocked by #13226~
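For context, a rough usage sketch of the new wrapper; the `langchain_experimental.llms.ollama_functions` import path and the exact `bind`/`invoke` calls are assumptions based on the experimental package of this period, not taken from this PR text:

```python
# Rough sketch only: import path and call signatures are assumed from the
# langchain_experimental docs of this era, not quoted from this PR.
from langchain_experimental.llms.ollama_functions import OllamaFunctions

# Talks to a locally running Ollama server (model name is illustrative).
model = OllamaFunctions(model="mistral")

# Bind an OpenAI-style function schema; the wrapper maps function calling
# onto the local Ollama model.
model = model.bind(
    functions=[
        {
            "name": "get_current_weather",
            "description": "Get the current weather in a given location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City and state, e.g. San Francisco, CA",
                    }
                },
                "required": ["location"],
            },
        }
    ],
    function_call={"name": "get_current_weather"},
)

response = model.invoke("What is the weather like in Boston?")
print(response.additional_kwargs)  # expected to carry the function_call payload
```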

---------

Co-authored-by: Lance Martin <lance@langchain.dev>
Co-authored-by: Bagatur <baskaryan@gmail.com>
2023-11-30 16:13:57 -08:00
Bagatur
a20e8f8bb0
experimental[patch]: release 0.0.43 (#13570) 2023-11-28 15:38:09 -08:00
Bagatur
48fbc5513d
infra[patch], langchain[patch]: fix test deps and upper bound langchain dep on core (#13984) 2023-11-28 13:26:15 -08:00
Nuno Campos
8329f81072
Use pytest asyncio auto mode (#13643)
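In practice, pytest-asyncio's auto mode lets async test coroutines be collected and run without a per-test marker. A minimal sketch, assuming the setting lives as `asyncio_mode = "auto"` under `[tool.pytest.ini_options]` in `pyproject.toml`:

```python
# Minimal illustration of pytest-asyncio "auto" mode: with
# asyncio_mode = "auto" configured for pytest, async test functions run
# without an explicit @pytest.mark.asyncio marker on each coroutine.
import asyncio


async def test_async_sleep_completes():
    # Collected and executed as an asyncio test purely by virtue of auto mode.
    result = await asyncio.sleep(0)
    assert result is None
```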
2023-11-21 15:00:13 +00:00
Bagatur
78a1f4b264
bump 338, exp 42 (#13564) 2023-11-18 15:12:07 -08:00
Martin Krasser
79ed66f870
EXPERIMENTAL Generic LLM wrapper to support chat model interface with configurable chat prompt format (#8295)
## Update 2023-09-08

This PR now supports further models in addition to Llama-2 chat models.
See [this comment](#issuecomment-1668988543) for further details. The
title of this PR has been updated accordingly.

## Original PR description

This PR adds a generic `Llama2Chat` model, a wrapper for LLMs able to
serve Llama-2 chat models (like `LlamaCPP`,
`HuggingFaceTextGenInference`, ...). It implements `BaseChatModel`,
converts a list of chat messages into the [required Llama-2 chat prompt
format](https://huggingface.co/blog/llama2#how-to-prompt-llama-2) and
forwards the formatted prompt as `str` to the wrapped `LLM`. Usage
example:

```python
# imports (paths as of the langchain / langchain_experimental releases of this period)
from langchain.chains import LLMChain
from langchain.llms import HuggingFaceTextGenInference
from langchain.memory import ConversationBufferMemory
from langchain.prompts.chat import (
    ChatPromptTemplate,
    HumanMessagePromptTemplate,
    MessagesPlaceholder,
)
from langchain.schema import SystemMessage
from langchain_experimental.chat_models import Llama2Chat

# uses a locally hosted Llama-2 chat model
llm = HuggingFaceTextGenInference(
    inference_server_url="http://127.0.0.1:8080/",
    max_new_tokens=512,
    top_k=50,
    temperature=0.1,
    repetition_penalty=1.03,
)

# Wrap llm to support Llama2 chat prompt format.
# Resulting model is a chat model
model = Llama2Chat(llm=llm)

messages = [
    SystemMessage(content="You are a helpful assistant."),
    MessagesPlaceholder(variable_name="chat_history"),
    HumanMessagePromptTemplate.from_template("{text}"),
]

prompt = ChatPromptTemplate.from_messages(messages)
memory = ConversationBufferMemory(memory_key="chat_history", return_messages=True)
chain = LLMChain(llm=model, prompt=prompt, memory=memory)

# use chat model in a conversation
# ...
```

Also part of this PR are tests and a demo notebook.

- Tag maintainer: @hwchase17
- Twitter handle: `@mrt1nz`

---------

Co-authored-by: Erick Friis <erick@langchain.dev>
2023-11-17 16:32:13 -08:00
Bagatur
a9b2c943e6
bump 336, exp 44 (#13420) 2023-11-15 14:08:34 -08:00
Predrag Gruevski
d63d4994c0
Bump all libraries to the latest ruff version. (#13350)
This version of `ruff` is the one we'll be using to lint the docs and
cookbooks (#12677), so I'm using it everywhere else too.
2023-11-14 16:00:21 -05:00
Bagatur
24386e0860
bump 334, exp 40 (#13211) 2023-11-10 09:43:29 -08:00
Bagatur
1f85ec34d5
bump 331rc3 exp 39 (#13086) 2023-11-08 13:00:13 -08:00
Bagatur
cf481c9418
bump exp 38 (#13016) 2023-11-07 11:49:23 -08:00
Bagatur
eee5181b7a
bump 328, exp 37 (#12722) 2023-11-01 10:27:39 -07:00
Predrag Gruevski
f94e24dfd7
Install and use ruff format instead of black for code formatting. (#12585)
Best to review one commit at a time, since two of the commits are 100%
autogenerated changes from running `ruff format`:
- Install and use `ruff format` instead of black for code formatting.
- Output of `ruff format .` in the `langchain` package.
- Use `ruff format` in experimental package.
- Format changes in experimental package by `ruff format`.
- Manual formatting fixes to make `ruff .` pass.
2023-10-31 10:53:12 -04:00
Harrison Chase
eb903e211c
bump to 36 (#12487) 2023-10-28 08:51:23 -07:00
Bagatur
c6a733802b
bump 324 and 35 (#12352) 2023-10-26 10:10:26 -07:00
Bagatur
286a29a49e
bump 322 and 34 (#12228) 2023-10-24 13:52:17 -07:00
Bagatur
963ff93476
bump 321 (#12161) 2023-10-23 12:49:38 -04:00
Bagatur
85302a9ec1
Add CI check that integration tests compile (#12090) 2023-10-21 10:52:18 -04:00
Bagatur
35c7c1f050
bump 317 (#11986) 2023-10-18 09:25:18 -07:00
Predrag Gruevski
dcd0392423
Upgrade to newer black (23.10) and ruff (first 0.1.x!) versions. (#11944)
Minor lint dependency version upgrade to pick up latest functionality.

Ruff's new v0.1 version comes with lots of nice features, like
fix-safety guarantees and a preview mode for not-yet-stable features:
https://astral.sh/blog/ruff-v0.1.0
2023-10-17 17:24:51 -04:00
Bagatur
ba0d729961
bump 316 (#11928) 2023-10-17 09:47:57 -07:00
Bagatur
25b1d65305
bump 315 (#11850) 2023-10-16 00:50:54 -07:00
Erick Friis
1861cc7100
General anthropic functions, steps towards experimental integration tests (#11727)
To match the change in js here
https://github.com/langchain-ai/langchainjs/pull/2892

Some integration tests need a bit more work in experimental:
![Screenshot 2023-10-12 at 12 02 49 PM](https://github.com/langchain-ai/langchain/assets/9557659/262d7d22-c405-40e9-afef-669e8d585307)

Pretty sure the sqldatabase ones are an actual regression or change in
interface because it's returning a placeholder.

---------

Co-authored-by: Bagatur <baskaryan@gmail.com>
2023-10-13 09:48:24 -07:00
Bagatur
9c0584be74
bump 313 (#11718) 2023-10-12 09:48:54 -07:00
Bagatur
7232e082de
bump 312 (#11621) 2023-10-10 12:34:49 -07:00
Bagatur
53887242a1
bump 310 (#11486) 2023-10-06 09:49:10 -07:00
Bagatur
8fafa1af91 merge 2023-10-05 18:09:35 -07:00
Bagatur
8b6b8bf68c
bump 309 (#11443) 2023-10-05 09:29:14 -07:00
Predrag Gruevski
c9986bc3a9
Tweak type hints to match dependency's behavior. (#11355)
Needs #11353 to merge first, and a new `langchain` to be published with
those changes.
2023-10-04 22:36:58 -04:00
Bagatur
16a80779b9
bump 307 (#11380) 2023-10-04 10:03:17 -04:00
Bagatur
8eec43ed91
bump 306 (#11289) 2023-10-02 10:25:08 -04:00
Bagatur
77c7c9ab97
bump 305 (#11224) 2023-09-29 08:55:00 -07:00
Bagatur
12fb393a43
bump 302 (#11070) 2023-09-26 08:13:01 -07:00
Bagatur
aa6e6db8c7
bump 301 (#11018) 2023-09-25 08:50:47 -07:00
Bagatur
24cb5cd379
bump 298 (#10892) 2023-09-21 08:26:11 -07:00
Bagatur
46aa90062b
bump exp 19 (#10851) 2023-09-20 10:17:52 -07:00
Bagatur
0d1550da91
Bagatur/bump 295 (#10785) 2023-09-19 08:22:42 -07:00
Bagatur
f7f3c02585
bump 287 (#10498) 2023-09-12 08:06:47 -07:00
olgavrou
b78d672a43 merge from upstream/master 2023-09-11 12:18:23 -04:00
olgavrou
11f20cded1 move everything into experimental 2023-09-11 12:16:08 -04:00
Bagatur
d2d11ccf63
bump 285 (#10373) 2023-09-08 08:26:31 -07:00
Bagatur
672907bbbb
bump 284 (#10330) 2023-09-07 08:45:42 -07:00
Tomaz Bratanic
db73c9d5b5
Diffbot Graph Transformer / Neo4j Graph document ingestion (#9979)
Co-authored-by: Bagatur <baskaryan@gmail.com>
2023-09-06 13:32:59 -07:00
Bagatur
098b4aa465
bump 281 (#10189) 2023-09-04 08:51:50 -07:00
Bagatur
0e4c5dd176
bump 13 (#10130) 2023-09-02 10:22:31 -07:00
maks-operlejn-ds
a8f804a618
Add data anonymizer (#9863)
### Description

The feature for anonymizing data has been implemented. In order to
protect private data, such as when querying external APIs (OpenAI), it
is worth pseudonymizing sensitive data to maintain full privacy.

Anonymization consists of two steps:

1. **Identification:** Identify all data fields that contain personally
identifiable information (PII).
2. **Replacement**: Replace all PIIs with pseudo values or codes that do
not reveal any personal information about the individual but can be used
for reference. We're not using regular encryption, because the language
model won't be able to understand the meaning or context of the
encrypted data.

We use *Microsoft Presidio* together with the *Faker* framework for
anonymization because of the wide range of functionality they provide.
The full implementation is available in `PresidioAnonymizer`.
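
A minimal usage sketch of the anonymizer described above (the module path, the `analyzed_fields` argument, and the extra `presidio-analyzer` / `presidio-anonymizer` / `faker` dependencies are assumptions based on the experimental package, not taken from this PR text):

```python
# Minimal sketch; requires the optional presidio-analyzer, presidio-anonymizer,
# and faker dependencies (plus a spaCy model for Presidio's NER step).
from langchain_experimental.data_anonymizer import PresidioAnonymizer

# Restrict detection to a couple of PII types; omit analyzed_fields for defaults.
anonymizer = PresidioAnonymizer(analyzed_fields=["PERSON", "PHONE_NUMBER"])

text = "My name is John Doe, call me at 313-666-7440."
print(anonymizer.anonymize(text))
# -> e.g. "My name is <fake name>, call me at <fake phone number>."
```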

### Future works

- **deanonymization** - add the ability to reverse anonymization. For
example, the workflow could look like this: `anonymize -> LLMChain ->
deanonymize`. By doing this, we will retain anonymity in requests to,
for example, OpenAI, and then be able to restore the original data.
- **instance anonymization** - at this point, each occurrence of PII is
treated as a separate entity and separately anonymized. Therefore, two
occurrences of the name John Doe in the text will be changed to two
different names. It is therefore worth introducing support for full
instance detection, so that repeated occurrences are treated as a single
object.

### Twitter handle
@deepsense_ai / @MaksOpp

---------

Co-authored-by: MaksOpp <maks.operlejn@gmail.com>
Co-authored-by: Bagatur <baskaryan@gmail.com>
2023-08-30 10:39:44 -07:00
Bagatur
d6957921f0
bump 276 (#9931) 2023-08-29 08:00:38 -07:00
Bagatur
9731ce5a40
bump 273 (#9751) 2023-08-25 03:05:04 -07:00
Predrag Gruevski
eee0d1d0dd
Update repository links in the package metadata. (#9454) 2023-08-18 12:55:43 -04:00
Bagatur
a69d1b84f4
bump 267 (#9403) 2023-08-17 08:47:13 -07:00