Commit Graph

2052 Commits

Author SHA1 Message Date
Harrison Chase
d86ed15d88
bump version to 158 (#4091) 2023-05-04 09:14:47 -07:00
OlajideOgun
624554a43a
DeepLake: Pass in rest of args to self._search_helper (#4080)
As of right now, when trying to use functions like
`max_marginal_relevance_search()` or
`max_marginal_relevance_search_by_vector()`, the rest of the kwargs are
not propagated to `self._search_helper()`. For example, a user cannot
explicitly state the `distance_metric` they want to use when calling
`max_marginal_relevance_search`.
2023-05-04 02:14:22 -07:00
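A minimal sketch of the call this change enables; the dataset path and the `distance_metric` value are illustrative, and the `DeepLake` store is assumed to be constructed as in the vectorstore docs.

```python
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import DeepLake

# Open an existing Deep Lake dataset (path and embedding model are placeholders).
db = DeepLake(dataset_path="./my_deeplake", embedding_function=OpenAIEmbeddings(), read_only=True)

# With this fix, extra kwargs such as distance_metric reach self._search_helper()
# instead of being silently dropped.
docs = db.max_marginal_relevance_search(
    "What is LangChain?",
    k=4,
    distance_metric="cos",  # illustrative value
)
```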
Eduard van Valkenburg
6d84541ff9
fix base url (#4095)
Noticed a mistake in the base URL and in the group vs. non-group URLs.
2023-05-04 02:08:21 -07:00
Harrison Chase
a9c2450330
Harrison/toml loader (#4090)
Co-authored-by: Mika Ayenson <Mikaayenson@users.noreply.github.com>
2023-05-03 23:14:39 -07:00
Harrison Chase
d4cf1eb60a
Add firestore memory (#3792) (#3941)
If you have any other suggestions or feedback, please let me know.

---------

Co-authored-by: yakigac <10434946+yakigac@users.noreply.github.com>
2023-05-03 22:55:47 -07:00
Harrison Chase
fba6921b50
Harrison/one drive loader (#4081)
Co-authored-by: José Ferraz Neto <netoferraz@gmail.com>
2023-05-03 22:55:34 -07:00
golergka
bd277b5327
feat: prune summary buffer (#4004)
If the library user has to decrease the `max_token_limit`, they would
probably want to prune the summary buffer even though they haven't added
any new messages.

Personally, I need it because I want to serialise the memory buffer object
and save it to a database, and when I load it, I may have re-configured my
code to have a shorter memory to save on tokens.
2023-05-03 22:45:48 -07:00
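A minimal sketch of the use case described above, assuming `ConversationSummaryBufferMemory` exposes the `prune()` method this change relies on; the token limit and messages are illustrative.

```python
from langchain.llms import OpenAI
from langchain.memory import ConversationSummaryBufferMemory

# Re-create the memory with a smaller token budget, e.g. after loading a
# serialized buffer from a database (requires OPENAI_API_KEY).
memory = ConversationSummaryBufferMemory(llm=OpenAI(), max_token_limit=40)
memory.save_context({"input": "hi"}, {"output": "hello, how can I help?"})

# Prune the buffer explicitly, without having to add a new message first.
memory.prune()
```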
AndreLCanada
bf726f9d8a
Update python_repl docs (#4012)
In the example for creating a Python REPL tool under the Agent module,
the ".run" was omitted. I believe this is required when defining a Tool.
2023-05-03 22:45:32 -07:00
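A sketch of the corrected docs example, assuming `PythonREPL` is importable from `langchain.utilities`; the tool name and description are illustrative.

```python
from langchain.agents import Tool
from langchain.utilities import PythonREPL

python_repl = PythonREPL()

# The docs fix: pass the bound .run method as the Tool's callable,
# not the PythonREPL instance itself.
repl_tool = Tool(
    name="python_repl",
    description="A Python shell. Use this to execute Python commands.",
    func=python_repl.run,
)
```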
Mike Wang
67db495fcf
[agent] Add Spark Agent (#4020)
- added support for Spark through the pyspark library.
- added a Jupyter notebook as an example.
2023-05-03 22:45:23 -07:00
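A usage sketch, assuming the new constructor is exposed as `create_spark_dataframe_agent` in `langchain.agents`; the CSV path and question are placeholders, and pyspark plus an OpenAI key are required.

```python
from pyspark.sql import SparkSession

from langchain.agents import create_spark_dataframe_agent  # name assumed from this PR
from langchain.llms import OpenAI

spark = SparkSession.builder.getOrCreate()
df = spark.read.csv("titanic.csv", header=True, inferSchema=True)  # placeholder dataset

# Build an agent that answers questions about the Spark DataFrame.
agent = create_spark_dataframe_agent(llm=OpenAI(temperature=0), df=df, verbose=True)
agent.run("How many rows are in the dataframe?")
```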
Gengliang Wang
8af25867cb
Simplify HumanMessages in the quick start guide (#4026)
In the section `Get Message Completions from a Chat Model` of the quick
start guide, the HumanMessage doesn't need to include `Translate this
sentence from English to French.` when there is a system message.

Simplifying the HumanMessages in these examples further demonstrates the
power of the LLM.
2023-05-03 22:45:03 -07:00
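The simplified pattern from the quick start guide: the system message carries the translation instruction, so the human message only needs the sentence itself (an OpenAI key is assumed).

```python
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage, SystemMessage

chat = ChatOpenAI(temperature=0)

messages = [
    SystemMessage(content="You are a helpful assistant that translates English to French."),
    HumanMessage(content="I love programming."),  # no need to repeat the instruction here
]
chat(messages)
```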
Harrison Chase
087a4bd2b8
improve agent documentation (#4062) 2023-05-03 22:44:01 -07:00
rogerserper
b1446bea5f
google-serper: async + full json results + support for Google Images, Places and News (#4078)
* implemented arun, results, and aresults. Reuses aiosession if
available.
* helper tools GoogleSerperRun and GoogleSerperResults
* support for Google Images, Places and News (examples given) and
filtering based on time (e.g. past hour)
* updated docs
2023-05-03 22:35:48 -07:00
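A sketch of the new options, with the `type` and `tbs` parameter names and the `results`/`aresults` methods taken from the commit description; a SERPER_API_KEY is assumed and the query values are illustrative.

```python
import asyncio

from langchain.utilities import GoogleSerperAPIWrapper

# News results filtered to the past hour (parameter names as described above).
search = GoogleSerperAPIWrapper(type="news", tbs="qdr:h")

results = search.results("latest AI news")  # full JSON results

async def main():
    # Async variant; reuses an aiosession if one is configured.
    return await search.aresults("latest AI news")

asyncio.run(main())
```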
mbchang
cdea47491d
refactor: refactor dialogue examples (DialogueAgent, DialogueSimulator) (#4074)
refactor dialogue examples to have same DialogueAgent and
DialogueSimulator definitions
2023-05-03 22:32:26 -07:00
Jan Philipp Harries
657f5f259f
Added option to reduce verbosity of Deeplake integration (#4038)
The deeplake integration was/is very verbose (see e.g. [the
documentation
example](https://python.langchain.com/en/latest/use_cases/code/code-analysis-deeplake.html))
when loading or creating a deeplake dataset, with only limited options to
dial down verbosity.

Additionally, the warning that a "Deep Lake Dataset already exists" was
confusing, as there is, as far as I can tell, no other way to load a
dataset.

This small PR changes that and introduces an explicit `verbose` argument
which is also passed to the deeplake library.

There should be minimal changes to the default output (the loading line
is printed instead of warned, to make it consistent with `ds.summary()`,
which also prints).
2023-05-03 22:16:27 -07:00
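A sketch of the new flag, assuming it is exposed as a `verbose` keyword on the `DeepLake` vector store as described above; the dataset path is a placeholder.

```python
from langchain.embeddings.openai import OpenAIEmbeddings
from langchain.vectorstores import DeepLake

# verbose=False silences the dataset loading/summary output and is also
# forwarded to the underlying deeplake library.
db = DeepLake(
    dataset_path="./my_deeplake",
    embedding_function=OpenAIEmbeddings(),
    verbose=False,
)
```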
Davis Chase
7f8727bbcd
Router chains (#4019)
Unpolished router examples to help flesh out abstractions and use cases 
![Screenshot 2023-05-02 at 7 02 58
PM](https://user-images.githubusercontent.com/130488702/235820394-389e5584-db0b-415e-a260-2824b5555167.png)

---------

Co-authored-by: Shreya Rajpal <shreya.rajpal@gmail.com>
2023-05-03 22:02:55 -07:00
Pulkit Mehta
bbbca10704
issue#4082 base_language had wrong code comment that it was using gpt… (#4084)
…3 to tokenize text instead of gpt-2

Co-authored-by: Pulkit <pulkit.mehta@catylex.com>
2023-05-03 21:58:29 -07:00
Leonid Ganeline
6caba8e759
docs: added a link to the Google Scholar articles (#4007)
Google Scholar outputs a nice list of scientific and research articles
that use LangChain.
I added a link to the Google Scholar page to the `gallery` doc page.
2023-05-03 21:54:44 -07:00
obbiondo
d18e788ee3
bugfix: return whole document when loading with ConfluenceLoader.load by label (#3980)
The method `confluence.get_all_pages_by_label` returns only metadata about
documents with a certain label (such as pageId, titles, ...). To return
all documents with a certain label, we need to extract all page ids for
that label and then fetch the page contents by these ids.

---------

Co-authored-by: Andrea Biondo <a.biondo@reply.it>
2023-05-03 21:52:05 -07:00
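A usage sketch for loading by label after this fix; the URL, credentials, and label are placeholders, and the `label`/`limit` parameter names are assumed from the loader's `load()` interface.

```python
from langchain.document_loaders import ConfluenceLoader

loader = ConfluenceLoader(
    url="https://example.atlassian.net/wiki",
    username="me@example.com",
    api_key="<api-key>",
)

# With the fix, loading by label returns full page content, not just metadata.
docs = loader.load(label="architecture", limit=50)
```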
Harrison Chase
5f30cc8713
Harrison/knn retriever (#4083)
Co-authored-by: Yuichi Tateno (secon) <hotchpotch@users.noreply.github.com>
2023-05-03 21:21:58 -07:00
Zander Chase
65c3b146c9
Accept str or list[str] for shell (#4060)
Relax the requirements
2023-05-03 21:11:06 -07:00
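A sketch of the relaxed input, assuming the tool in question is `ShellTool` with a `commands` input key; the commands shown are illustrative.

```python
from langchain.tools import ShellTool

shell_tool = ShellTool()

# A single command as a plain string is now accepted...
shell_tool.run({"commands": "echo 'Hello World'"})

# ...as well as the original list form.
shell_tool.run({"commands": ["echo 'Hello World'", "ls"]})
```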
Harrison Chase
5a269d3175
Harrison/media wiki xml (#4072)
Co-authored-by: Géraud de Drouas <gdedrouas@users.noreply.github.com>
2023-05-03 20:45:33 -07:00
Zeeland
c186f18aab
fix: incorrect data type when construct_path in chain (#4031)
An incorrect data type error happened when executing `_construct_path` in
`chain.py`, as follows:

```python
Error with message replace() argument 2 must be str, not int
```

The path is always a string, but the type of the result of
`args.pop(param, "")` is not guaranteed to be a string.
2023-05-03 18:49:47 -07:00
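A simplified illustration of the failure and the kind of cast that fixes it; the path template and argument names below are made up, not the actual `_construct_path` implementation.

```python
# Simplified illustration: str.replace() requires string arguments.
path = "/users/{user_id}/posts"
args = {"user_id": 42}

# Before: raises "replace() argument 2 must be str, not int"
# path = path.replace("{user_id}", args.pop("user_id", ""))

# After: cast the popped value to str before substituting it into the path.
path = path.replace("{user_id}", str(args.pop("user_id", "")))
```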
engkheng
349ba88aee
Export FileChatMessageHistory (#4042) 2023-05-03 18:14:47 -07:00
Nikolas Garske
1608f5dcae
Remove pip stdout and fix typo (#4050) 2023-05-03 18:06:39 -07:00
Ivo Stranic
3b556eae44
Update deeplake example (#4055) 2023-05-03 18:03:51 -07:00
Steve Kim
9b830f437c
Deleted importing Document from document_loaders.base because Documen… (#4068)
Hi,

- Modification:
https://python.langchain.com/en/latest/modules/indexes/document_loaders/examples/arxiv.html
- Reason: In this example, the first line is unnecessary because the
Document class does not exist in `document_loaders.base`.
- Resolves: Issue #4052

--------
P.S.: This is my first pull request, so please let me know if I need to
correct anything or add more explanation.
2023-05-03 17:54:30 -07:00
hp0404
374725a715
Refactor TelegramChatLoader and FacebookChatLoader classes and add tests (#3863)
This PR includes two main changes:

- Refactor the `TelegramChatLoader` and `FacebookChatLoader` classes by
removing the dependency on pandas and simplifying the message filtering
process.

- Add test cases for the `TelegramChatLoader` and `FacebookChatLoader`
classes. These tests ensure that the classes correctly load and process
the example chat data, providing better test coverage for this
functionality.
2023-05-03 15:59:19 -07:00
Jon Saginaw
ea64b1716d
Enhancement: option to Get All Tokens with a single Blockchain Document Loader call (#3797)
The Blockchain Document Loader's default behavior is to return 100
tokens at a time, which is the Alchemy API limit. The Document Loader
exposes a startToken that can be used for pagination against the API.

This enhancement includes an optional get_all_tokens param (default:
False) which will:

- Iterate over the Alchemy API until it receives all the tokens, and
return the tokens in a single call to the loader.
- Manage all/most tokenId formats (this can be int, or hex16 with zero or
all of the leading zeros). There aren't constraints on how smart
contracts can represent this value, but these three are the most common.

Note that a contract with 10,000 tokens will issue 100 calls to the
Alchemy API, and could take about a minute, which is why this param will
default to False. But I've been using the doc loader with these
utilities on the side, so figured it might make sense to build them in
for others to use.
2023-05-03 15:46:44 -07:00
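A usage sketch of the new option; the contract address and API key are placeholders, and parameter names other than `get_all_tokens` are assumed from the loader's existing interface.

```python
from langchain.document_loaders.blockchain import BlockchainDocumentLoader

loader = BlockchainDocumentLoader(
    contract_address="0xBC4CA0EdA7647A8aB7C2061c2E118A18a936f13D",  # placeholder contract
    api_key="<alchemy-api-key>",
    get_all_tokens=True,  # paginate past the 100-token-per-call Alchemy limit
)

# May issue many API calls for large collections, hence the False default.
docs = loader.load()
```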
Akash Sharma
525db1b6cb
Fixed typo leading to broken link (#4034) 2023-05-03 14:45:54 -07:00
Zander Chase
afa9d1292b
Re-Permit Partials in Tool (#4058)
Resolved issue #4053

Now that StructuredTool is a separate class, this constraint is no
longer needed.

Added/updated a unit test
2023-05-03 13:16:41 -07:00
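An illustration of what is permitted again: a `functools.partial` as the Tool's callable. The lookup function and names below are made up.

```python
from functools import partial

from langchain.agents import Tool

def lookup(table: str, query: str) -> str:
    """Toy lookup used only for illustration."""
    return f"searched {table} for {query}"

# A partial is accepted again as the Tool's func now that StructuredTool
# is a separate class.
tool = Tool(
    name="customer_lookup",
    description="Search the customers table.",
    func=partial(lookup, "customers"),
)
tool.run("Alice")
```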
Zander Chase
7e967aa4d5
Update Notebooks (#4051) 2023-05-03 09:31:02 -07:00
Nuno Campos
f3ec6d2449
Replace remaining usage of basellm with baselangmodel (#3981) 2023-05-02 21:52:29 -07:00
mbchang
f291fd7eed
docs: remove stdout from pip install (for gymnasium) (#3993) 2023-05-02 21:51:40 -07:00
Harrison Chase
b67be55ab8
bump ver (#4018) 2023-05-02 19:02:02 -07:00
Harrison Chase
a5dd73c1a6
Revert "[agent][property type] Change allowed_tools to Set as Duplicate doesn’t make sense" (#4014)
Reverts hwchase17/langchain#3840
2023-05-02 18:58:05 -07:00
Davis Chase
df3bc707fc
Dev2049/callback example fix (#4010)
Closes #3997

---------

Co-authored-by: Akshaj Jain <akshaj.jain@gmail.com>
2023-05-02 16:20:16 -07:00
Davis Chase
f08a76250f
Better custom model handling OpenAICallbackHandler (#4009)
Thanks @maykcaldas for flagging! I think this should resolve #3988. Let me
know if you still see issues after the next release.
2023-05-02 16:19:57 -07:00
Zander Chase
aa38355999
Vwp/docs improved document loaders (#4006)
Huge thanks to @leo-gan for improving the document loaders notebooks

---------

Co-authored-by: Leonid Ganeline <leo.gan.57@gmail.com>
2023-05-02 15:24:53 -07:00
Zander Chase
1c68cbdb28
Fix typing of attribute (#3999) 2023-05-02 15:11:23 -07:00
MichaelMDowling
36ee60c96c
Update \docs\modules\models\text_embedding\examples\openai.ipynb (#3976)
Single edit to: models/text_embedding/examples/openai.ipynb - Line 88:
changed from `embeddings = OpenAIEmbeddings(model_name="ada")` to
`embeddings = OpenAIEmbeddings()`, as model_name is no longer part of the
OpenAIEmbeddings class.
2023-05-02 14:41:31 -07:00
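The updated usage from the notebook, shown in context (an OpenAI key is assumed; the query string is illustrative).

```python
from langchain.embeddings import OpenAIEmbeddings

# model_name is no longer accepted; the default embedding model is used.
embeddings = OpenAIEmbeddings()
query_result = embeddings.embed_query("Hello world")
```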
Harrison Chase
e23391965b
fix import (#4003) 2023-05-02 14:26:46 -07:00
Jinto Jose
013208cce6
Fix Documentation - Nomic - Atlas Jupyter Notebook (#3987)
Correction to Nomic-Atlas Jupyter Notebook Docs
2023-05-02 14:20:01 -07:00
Ankush Gola
18f9d7b4f6
don't deepcopy handlers (#3995)
Co-authored-by: Sami Liedes <sami.liedes@iki.fi>
Co-authored-by: Sami Liedes <sami.liedes@rocket-science.ch>
2023-05-02 13:53:27 -07:00
Mike Wang
c26cf04110
[check] add import check and warning for pandas (#3944)
- as titled, add an `import` catch for pandas with a user-facing
suggestion message.
2023-05-02 10:08:16 -07:00
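The general shape of such an import check (illustrative wording, not the exact message added by the commit).

```python
try:
    import pandas as pd  # noqa: F401 - only checking availability here
except ImportError:
    # Surface an actionable suggestion instead of a bare ImportError.
    raise ImportError(
        "pandas is required for this feature. "
        "Please install it with `pip install pandas`."
    )
```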
Chop Tr
71a337dac6
Update output_fixing_parser.ipynb (#3978) 2023-05-02 09:33:46 -07:00
Ankush Gola
3bd5a99b83
v2 tracer with single runs endpoint (#3951) 2023-05-01 22:41:32 -07:00
Harrison Chase
8fcb56e74a
bump version to 155 (#3943) 2023-05-01 22:05:52 -07:00
Harrison Chase
ca08a34a98
retry to parsing (#3696) 2023-05-01 22:05:42 -07:00
mbchang
3993166b5e
docs: remove stdout from pip install (#3945) 2023-05-01 22:05:22 -07:00
Harrison Chase
2366e71bed
Harrison/azure openai (#3942)
Co-authored-by: Saverio Proto <zioproto@gmail.com>
2023-05-01 21:34:16 -07:00