mirror of https://github.com/hwchase17/langchain
master
wfh/update_project_name
erick/community-deprecate-community-ollama-integrations
bagatur/rm_optional_defaults
eugene/add_grit_linter
erick/test-ruff-output-existing
cc/document_loader_web
cc/sidebars
erick/docs-bump-memory-limit
cc/unstructured_deprecations
v0.2
eugene/qa_test_2
eugene/ci_fix_question_template
erick/partners-box-release-0-2-2
eugene/add_memory_equivalents
isaac/toolerrorpassing
bagatur/enable_azure_feats
cc/extended_tests
v0.1
cc/release_mongo
eugene/robocorp
eugene/foo
eugene/pydantic_v1_foo
erick/community-release-0-2-17
erick/docs-mdx-v3-compat-wip
eugene/update_is_caller_internal
isaac/toolerrorhandling03
bagatur/des_from_path
bagatur/v0.3/preview_api_ref
bagatur/run_tutorials
bagatur/run_all_docs_how_to
isaac/toolerrorhandling
bagatur/rfc_docs_gha
eugene/foo_meow
bagatur/fix_embeddings_filter_init
wfh/keyword_runnable_like_bu
eugene/add_mypy_plugin
eugene/v0.3_wut
bagatur/core_0_3_0_dev1
bagatur/v0.3rc_merge_master_2
bagatur/v0.3rc_merge_master
eugene/core_0.3rc_first
bagatur/oai_emb_fix
isaac-recursiveurlloader-testing
harrison/3.0
bagatur/core_pydocstyle_lint
harrison/support-kwargs-vectorstore
wfh/more_interops
bagatur/format_content_as
erick/community-undo-azure-ad-access-token-breaking-change
isaac/responseformatstuff
bagatur/delight
erick/ai21-integration-test-fixes
erick/all-more-lint-additions
erick/infra-continue-on-error
erick/ai21-address-breaking-changes-in-sdk-2-14-0-wip
bagatur/rfc_anthropic_cache_usage
bagatur/json_mode_standard
bagatur/dict_msg_tmpl2
wfh/nowarn
bagatur/rfc_dont_update_run_name
bagatur/oai_disabled_params
jacob/templates
bagatur/content_block_template
erick/chroma-fix-typing
isaac/moreembeddingtests
eugene/openai_0.3
bagatur/standard_tests_async
eugene/merge_pydantic_3_changes
eugene/0.3_release_docs
cc/deprecate_evaluators
bagatur/selector_add_examples
bagatutr/langsmith_example_selector
docs/fix_chat_model_os
eugene/security_related
eugene/add_async_api
eugene/integration_docs_cohere
eugene/multimodal_embedding_model
isaac/moretooltables
bagatur/merged_docs_styling
eugene/clean_up_pre_init
eugene/v0.3_meow
bagatur/serialize-pydantic-metadata
eugene/fix_tool_extra
erick/infra-pydantic-v2-scheduled-testing
bagatur/add_pydantic_sys_info
eugene/community_draft
cc/fix_exp_ci
eugene/pydantic_v_3
erick/docs-llm-embed-index-tables-wip
eugene/update_llm_result_types
eugene/add_json_schema_methods
bagatur/docs_versioning
bagatur/simplify_docs_header_footer
bagatur/docs_cp_ls_theme
erick/cli-new-template-types
isaac/ollamauniversalchat
isaac/llmintegrationtests
bagatur/fewshot_scratch
isaac/ollamallmfix
wfh/warn_name
eugene/add_indexer_to_retriever
bagatur/07_26_24/poetry_lock
cc/toolkits
erick/docs-new-integrations-docs
cc/api_chain
wfh/link
eugene/rate_limiting_requests
eugene/add_rate_limiter_integrations
isaac/ollamaimageissues
cc/many_tools_guide
eugene/update_relative_imports
eugene/add_timeout_for_tests
isaac/tavilynewlinefix
isaac/create_react_agent_doc_fix
eugene/add_tests_for_pydantic_models
eugene/document_indexer_v2
bagatur/ruff_0_5_3
wfh/parent
wfh/async_chromium
erick/infra-try-removing-uv-for-editable-install
eugene/indexing_abstraction_minimal
jacob/currying_docs
jacob/curry_tools
eugene/indexing_abstraction
wfh/triggered
cc/bind_tools
jacob/docs_style
isaac/fewshotpromptdocs
eugene/migrate_graphvectorstore_to_community
jacob/tools_additional_params
wfh/curry
wfh/inherit
eugene/stores_new
bagatur/rfc_tool_call_id_in_config
bagatur/docs_intro_wording
bagatur/lint_all_core_deps
eugene/update_add_texts
eugene/index_manager
eugene/tracing_interop2
eugene/migrate_vectorstores
eugene/cleanup_deprecated_code
wfh/is_error
eugene/root_validators_02
eugene/root_validators_03
eugene/pinecone_add_standard_tests
eugene/indexing_v2
wfh/url
bagatur/rfc_configurable_model
cc/tool_token_counts
renderer
bagatur/mypy_v1_update
wfh/add_list_support
bagatur/rfc_tool_call_filter
erick/anthropic-release-0-1-16
wfh/add_tool_param_descripts_2
wfh/add_tool_param_descripts
nc/19jun/core-no-pydantic
bagatur/opinionated_formatter
bagatur/rfc_docstring_lint
cc/fix_milvus
cc/update_pydantic_dep
maddy/support-options-in-langchainhub
isaac/chatopenaiparalleltoolcallingparam
erick/huggingface-relax-tokenizers-dep
rlm/test-llama-cpp
bagatur/retrieval_v2_scratch
erick/core-loosen-packaging-lib-version
eugene/disable_lint_rule
eugene/get_model_defaults
wfh/allyourtreesarebelongtome
eugene/pydantic_migration_2_b
isaac/sitemaploader-goldendocs
bagatur/recursive_url_bash
erick/docs-update-chatbedrock-with-tool-calling-docs-dont-use
erick/docs-update-chatbedrock-with-tool-calling-docs-do-not-use
isaac/sitemaploader-testing
isaac/sitemaploader-tests
eugene/pydantic_migration_2
erick/core-throw-error-on-invalid-alternative-import-in-deprecated
erick/core-throw-errors-on-invalid-alternative-import
bagatur/rfc_smithify_docs
eugene/async_history_2
bagatur/ai21_0_1_6
erick/docs-rewrite-contributor-docs
cc/update_openai_streaming_token_counts
eugene/llm_token_counts
eugene/langchain_how_to_config
bagatur/docs-format-api-ref
eugene/callbacks_propagate
erick/docs-v02-url-rfc
maddy/default-prompt-private-in-hub
eugene/update_version_docs
bagatur/parse_tool_docstring_fix
erick/docs-algolia-api-key-update
eugene/how_does_this_stream
eugene/langchain_core_manager
eugene/update_linting
erick/docs-ignore-echo-false-blocks
dqbd/api_ref_styles
erick/infra-codespell-v1
erick/infra-codespell-in-v1-branch
erick/community-release-0-2-0rc1
eugene/add_change_log2
harrison/new-docs
cc/retriever_score
bagatur/community_0_0_37
wfh/may3/help
eugene/core_0.2.0rc1
cc/docs_build
eugene/update_warnings2
bagatur/oai_tool_choice_required
cc/fix_openai
wfh/add_rid_to_chain
eugene/migrate_document_loaders
bagatur/mistral_client
wfh/add_parameter_descriptions
erick/core-remove-batch-size-from-llm-start-callbacks
eugene/refactor_deprecations
eugene/release_0_2_0
eugene/web_retriever
eugene/move_memories_2
bagatur/tryout_uv
eugene/entity_store
eugene/run_type_for_lambdas
bagatur/rfc_standardize_input_msgs
bagatur/rfc_serialized_tool
brace/show-last-update-docs
erick/release-note-experiments
eugene/runnable_config
cc/function_message
rlm/rag_eval_guide
bagatur/rfc_token_usage
eugene/custom_embeddings
eugene/community_fix_imports
bagatur/goog_doc_nit
erick/docs-runnable-list-operations
bagatur/rm_convert_to_tool_docs
eugen/providers_update
erick/core-deprecate-vectorstore-relevance-scoring
eugene/outline_wrapper_1
erick/pytest-experiments-2
erick/pytest-experiments
erick/partner-cloudflare
rlm/langsmith_testing
erick/community-patch-clickhouse-make-it-possible-to-not-specify-index
eugene/postgres_vectorstore
bagatur/openllm_new_api
bagatur/layerupai
cc/deprecated_imports
erick/cohere-adaptive-rag-cookbook
erick/cohere-multi-tool-integration-test
dqbd/openai-lax-jsonschema
eugene/xml_again
brace/format-dpcs
eugene/pull_to_funcs
bagatur/fix_getattr
erick/core-patch-placeholder-message-shorthand
bagatur/0.2
eugene/unsafe_pydantic
bagatur/community_migration_script
bagatur/versioned_docs_2
bagatur/versioned_docs
bagatur/find_broken_links
bagatur/stream_pydantic
wfh/add_hub_version
eugene/stackframe
wfh/log_error
wfh/add_eval_metadata
erick/airbyte-patch-baseloader-wip
bagatur/rename_msg_kwargs
wfh/specify_version
fork/feature_audio_loader_auzre_speech
erick/infra-remove-venv-from-poetry-cache
erick/ci-test-timeout
erick/test-community-ci
erick/test-ci
wfh/add_warnings
eugene/huggingface
erick/core-patch-community-patch-baseloader-to-core
erick/core-minor-multimodal-document-page-content-rfc
bagatur/support_pydantic_context
bagatur/rfc_structured_list
erick/test-partner-failure
erick/test-partner-success
erick/test-error
eugene/fix_openai_community_stream
erick/test-ci-should-fail
erick/testutils
jacob/people
erick/docs-remove-platforms-redirect
eugene/add_people
erick/infra-check-min-versions-in-pr-ci
langchain-ai/langchain@5cbabbd
eugene/test_lint
erick/exa-lint
bagatur/make_cohere_client_optional
bagatur/rfc_as_str
erick/infra--individual-template-ci-
bagatur/rfc_@chain_typing
erick/infra-try-1-job-sphinx-build
rlm/mistral_cookbook
bagatur/rfc_chat_invoke_llm_res
erick/infra-rtd-build-bump-null
erick/infra-rtd-build-bump
erick/autoapi-test
bagatur/speedup_sphinx
erick/community-lint
erick/partner-nomic
erick/cli-langchain-dep-versions
bagatur/init_chat_prompt_msg_like
wfh/custom_prompt
fork/async-doc-loader
rlm/rag_from_scratch
eugene/message_history_test
erick/api-ref-navbar-update
bagatur/rfc_bind_collision
bagatur/bind_outside_agent
erick/release-notes
jacob/chatbot_message_passing
bagatur/3_12_ci
bagatur/assign_unpack
bagatur/docs_top_nav
bagatur/batch_overload_typing
bagatur/core_0_1_15_rc_1
bagatur/rfc_rich_retrieval
bagatur/runnable_drop
bagatur/class_chain
bagatur/initial_tool_choices
eugene/streaming_events
erick/core-patch-fallbacks-error-chain
harrison/tool-invocation
bagatur/tool_executor
erick/deepinfra-chat
eugene/agents_docs
eugene/update_index.md
bagatur/rfc_tool_executor
bagatur/rfc_extraction_improvement
bagatur/downgrade_setup_python
erick/mistralai-patch-enforce-stop-tokens
bagatur/thread_inof
bagatur/docs_last_updated_2
bagatur/docs_last_updated
erick/google-docs
dqbd/json-output-oai-parser-serialization
bagatur/cli_pkg_tmpl_lc_ver
erick/infra-try-show-last-update-time
bagatur/try_stat
bagatur/core_tracer_backwards_compat
erick/infra-ci-python-matrix-update-3-12
bagatur/rfc_retriever_return_str
do-not-merge
bagatur/dispatch_main_ci
bagatur/show_last_update_time
harrison/docs-m
harrison/docs-revamp-mirror
harrison/new-docs-revamp
harrison/agents-rewrite-code
bagatur/api_flyout
harrison/revamp-memory
harrison/merged-branches
bagatur/stuff_docs_lcel
harrison/agent-docs-concepts
harrison/agent-docs-custom
bagatur/api-ref-navbar-update
bagatur/combine_docs_chain_as_runnable
erick/ci-test-do-not-merge
bagatur/chat_hf
erick/infra--run-ci-on-all-prs-
bagatur/core_update_ruff_mypy
erick/nbconvert
bagatur/lc_stack_update
wfh/bind_tools
eugene/fix_xml_agent
harrison/anthropic-package
wfh/vertexai_fixup
bagatur/core_0_0_13
wfh/add_oai_agent_core_examples
erick/docs-bullet-points
wfh/gemini
eugene/bug_history
bagatur/community
harrison/turn-off-serializable
harrison/serializable-baga
wfh/prevent_outside
harrison/mongo-agent
erick/all-patch---change-ci-title-in-event-of-no-matrix-expansion-
rlm/update-img-prompt
eugene/update_file_chat_memory
harrison/deepsparse
bagatur/core_0.1
harrison/integrations
erick/docs-docusaurus-3
rlm/mm-rag-deck
bagatur/core_lint_docstring
bagatur/core_0_0_8
bagatur/lcel_get_started
bagatur/fmt_notebooks
bagatur/export_prompt_chat_classes
harrison/add-imports
bagatur/serialization_tests
bagatur/patch_0.0.400
wfh/tqdm_for_wait
bagatur/fix_core_namespace
erick/core-namespace-same
erick/api-docs-core-bugfix-
brace/new-lc-stack-svg
wfh/tqdm_wait
v0.0.339
bagatur/cogniswitch
dqbd/docs-responsivity-fix
bagatur/multi_return_source
wfh/func_eval
bagatur/full_template_docs
bagatur/callbacks-refactor
(vectorstore)/PGVectorAsync
rlm/mm_template
bagatur/rfc_bind_getattr
erick/skip-release-check-cli
bagatur/rm_return_direct_error
rlm/sql-pgvector-template
erick/improvement-format-notebooks
erick/improvement-default-docs-url-root
bagautr/rfc_image_template
refactorChromaInitLogic
rlm/ollama_json
bagatur/rfc_pinecone_hybrid
eugene/document_runnables2
harrison/root-listeners
wfh/add_llm_output_to_adapter
rlm/biomedical-rag
bagatur/cohere_input_type
bagatur/update-schema
erick/cli-codegen
wfh/content_union
bagatur/docs_smith_serve
rlm/open_clip_embd_expt
rlm/multi-modal-template
wfh/ossinvoc
pg/test-publish-rc-versions
wfh/conversational_feedback
bagatur/voyage-ai
template-readme-missing-env
rescana-com/master
bagatur/lakefs-loader
bagatur/readthedocs-loader-improvements
hwchase17-patch-1
eugene/fix_type_onbase_transformer
bagatur/deep_memory_version_1
api-reference-agents-functions
erick/cli-ci
bagatur/retry_nit
wfh/tree_distance
jacoblee93-patch-1-1
wfh/runnable_traceable
bagatur/rfc_chat_batch_gen
rlm/text-to-pgvector
bagatur/e2b-integration2
bagatur/api-reference-agents-functions
shorthills-ai/master
bagatur/e2b-integration
wfh/save_model_name
bagatur/voyage
rlm/LLaMA2_sql_scrub
bagatur/cogniswitch_chains
bagatur/private_fn
erick/langservehub
bagatur/rfc_vecstore_interface
nc/repl-lib
charlie/fine-tuning-notebook
harrison/move-imports
wfh/rtds
wfh/json_schema_evaluator
pg/python-3.12
wfh/eval_public_dataset
ankush/single-generations
nc/pandas-eval
eugene/update_warning_class
ankush/single-input
ankush/delete_v1_tracer
wfh/background
bagatur/bump_304
bagatur/dedup_transformer
eugene/fix_webbase_loader
harrison/move-pydantic-v1
wfh/vectorstore_tracing
vdaas-feature/vald
harrison/agents-exoskelton-1
harrison/agents-exoskeleton
jacob/routing_cookbook
harrison/more-imports
harrison/remove-from-init
eugene/automaton_variant_4
harrison/specified-input-keys
bagatur/docs_zoom
jacob/feature_vercel_analytics
wfh/update_types
francisco/sql_agent_improvements
wfh/implicit_client
bagatur/auto_rewrite_retrieval
bagatur/konko
eugene/automaton_variant_3
wfh/default_retries
bagatur/lint_fix
rlm/fix-prompts
wfh/redirects
eugene/automaton_variant_2
bagatur/fix_multiquery
wfh/json_other
wfh/fix_link
bagatur/add-data-anonymizer
bagatur/mem_session
molly/vectorstore-batching
deepsense-ai/llama-cpp-grammar
bagatur/gpt_4_docstring
harrison/add-llm-kwargs
bagatur/redis_refactor
rlm/llama-grammar
bagatur/runnable_mem
wfh/clirun
eugene/document_pipeline
harrison/retrieval-agents
bagatur/rfc_fallback_inherit
bagatur/epsilla
bagatur/promptguard
harrison/pydantic-bridge
bagatur/cheatsheet
wfh/update_criteria_prompt
pydantic/b2_bump
eugene/pydantic_v2_tools2
wfh/criteria_strat
eugene/wrap_openapi_stuff
bagatur/bump_264
bagatur/new_msg
rlm/agent_use_case
harrison/remove-things-from-init
harrison/clean-up-imports
bagatur/lite_llm
bagatur/rfc_zep_search
bagatur/pydantic_agnostic
bagatur/bagel
bagatur/fix_sched_2
wfh/async_eval_default
bagatur/respect_light_mode
bagatur/docsly
eugene/automaton_variant_1
wfh/return_exceptions
wfh/example_id_config
bagatur/rm_nuclia_ext
wfh/fix_recursive_url_loader
bagatur/runnable_locals
wfh/embeddings_callbacks_v3
bagatur/google_drive
rlm/chatbots_use_case
wfh/langsmith_nopydantic
eugene/enum_rendering
harrison/add-memory-to-sql
bagatur/rfc_fallbacks
harrison/xml-agent
bagatur/mod_desc
wfh/memory_interface
wfh/throw_on_broken_links
eugene/expand_documentation
wfh/api_ref
wfh/swizzle
eugene/test
harrison/async-web
harrison/fix-typo
wfh/retriever_additional_data
harrison/experimental-package
bagatu/rfc_pkg_per_chain
wfh/default_data_type
harrison/move-to-schema/chain
harrison/move-to-schema-more-callbacks
wfh/to_prompt_template
wfh/not_implemented
wfh/limit_concurrency
wfh/delete_deprecated
harrison/move_to_core
harrison/move-to-core/prompts
wfh/add_agent_trajectory_loader
ankush/message-eval
harrison/variable-table
wfh/skip_no_output
harrison/apply-async
harrison/improve-docs-formatting
vwp/embedding_fuzzy
wfh/evals_docs_reorg_draft
vwp/comparison_with_references
wfh/comparison_with_references
harrison/split-schema-dir
vwp/accept_no_reasoning
wfh/embeddings_callbcaks
vwp/fix_promptlayer
wfh/key_matching
harrison/marqo
vwp/make_new_eval_chain_run
vwp/any_callable
vwp/time_to_first_token
vwp/accept_chain
vwp/evals_docs_reorg
vwp/similarity
vwp/use_langsmith
vwp/rm_dep
vwp/script_for_adding_docs
octoml/master
harrison/set-pydantic-docs
harrison/markdown-docs
vwp/drafts/unit_testing
harrison/functions
ankush/asyncio-gather-agenerate
vwp/retriever_callbacks_v2
vwp/schema_dir
harrison/allow-kwargs
eugene/persistence_db
vwp/anthropic_token_usage
vwp/evaluator_chains
vwp/envurl
eugene/research_v1
eugene/chain_generics
harrison/neo4j-lint
ankush/callbacks-cleanup
dev2049/pgvector_fix
harrison/anthropic-chat
vwp/simplify_tracer2
vwp/simplify_tracer
harrison/schema-directory
harrison/comp-prompt
dev2049/rough_draft_doc_manager
vwp/child_runs
ankush/chat-agent-parsing
dev2049/combine_quickstart
dev2049/concise_get_started
vwp/feedback_crud
harrison/exclude-embedings
dev2049/azure_vecstore
vwp/base_model
dev2049/getting_started_clean
dev2049/change_llm_name
dev2049/embedding_rename
eugene/prompt_template
harrison/serialize-chat
dev2049/embed_docs_to_texts
dev2049/doc_clean
dev2049/chroma_cleanup
harrison/few-shot-w-template-fix
retrievalqafinetune
dev2049/retrieval_eval
vwp/tracing_docs
vwp/bold_headergs
eugene/add_file_system
harrison/return-prompt
vwp/tracer-async-call
eugene/check_something
dev2049/combine_refac
eugene/updat_extended_tests
eugene/meow_draft
eugene/fix_google_palm_tests
vwp/dcv2
eugene/retriever_version
tjaffri/dgloader
harrison/pdfplumber
vwp/patch
harrison/character-chat-agent
harrison/mongo-loader
harrison/sharepoint
eugene/add_caching_from_master_only
dev2049/save-to-notion-tool
dev2049/self_query_integration
dev2049/update_lock
eugene/revert_workflows
revert-4465-harrison/env-var
dev2049/pgvector-size-fix
vwp/eval_examples
fork-chains
eugene/test_branch
vwp/add-github-api-utility
vwp/from_llm_and_tools
vwp/pandas_cb_manager
add-scenexplain-tool
vwp/tools_callbacks
vwp/relax_chat_agent
vwp/parser__type
vwp/filter_ambiguous_args
harrison/get-working-with-agents
dev2049/null_callback_hack
dev2049/llm_requests_chain
vwp/test_on_built_wheel
vwp/avoid_poetry_deps_in_ci
eugene/openai_optional
vwp/agent_tests
vwp/structured_tools
vwp/align_search_tools
vwp/structured_tools_with_pyd
vwp/inheritance_same_agents
vwp/chatregtests
dev2049/default_models
dev2049/perfect_retriever
dev2049/docs_stateful
vwp/add_args
khimaros/master
vwp/chroma_elements
vwp/default_dont_raise
vwp/lintfix
harrison/anthropic
dev2049/retrieval_eval_nb
harrison/contextual-compression
vwp/marathon
agents-4-18
harrison-outerr-exc
vwp/hf_image_gen
vwp/hf_imagen
vwp/tools_undo
vwp/characters_2
vwp/tools-refactor-2
harrison/autogpt
harrison/typeo
dev2049/fmt_nbs
vwp/numexpr
harrison/characters-nb
vwp/characters_with_planning
harrison/pinecone-backwards-compat
vwp/openapi_with_tool_retrieval
harrison/aws-text
ankush/patch1
harrison/processor
harrison/script-update
harrison/api-chain
harrison/ai21-embeddings
harrison/alpaca
nc/poe-handler-chat-model
nc/poe-handler
harrison/mrkl-parser
harrison/agent-experiments
harrison/replicate
harrison/chat-chain
harrison/update-wandb
harrison/debug
harrison/qasper
harrison/dbpedia
harrison/changes
jeremy/guardrails
nc/guardrails-error-handling
harrison/guardrails
harrison/use-output-parsers
John-Church-guard
agent_evaluation
harrison/kor-chain
harrison/inference-api
ankush/callback-refactor
harrison/eval
harrison/audio
ankush/prompt-abstractions
harrison/memory-chat
harrison/indexes
ankush/partial-prompt-apply
harrison/sagemaker
harrison/datetime
harrison/openapiagent
harrison/paged-pdf
harrison/pswsl
ankush/example-runner
harrison/guards
scad/api-chain
harrison/prompt-bugs
harrison/sql-agent
harrison/pinecone-try-except
harrison/callback-updates
harrison/map-rerank
harrison/combine-docs-parse
harrison/azure-rfc
harrison/sequential_chain_from_prompts
harrison/agent-refactor
harrison/agent_intermediate_steps
harrison/agent_multi_inputs
harrison/promot-mrkl
harrison/fix_logging_api
harrison/use_output_parser
harrison/track_intermediate_steps
harrison/sql_error
harrison/logging_to_file
harrison/output_parser
harrison/flexible_model_args
harrison/agent-improvements
harrison/router_docs
harrison/docs
samantha/add_llm_to_example
harrison/reorg_smart_chains
mako-templates
harrison/save_metadatas
harrison/router
harrison/custom_pipeline
harrison/chain_pipeline
harrison/prompts_docs
harrison/attempt_citing_in_prompt
harrison/load_prompt
harrison/prompts_take_2
harrison/ape
harrison/prompt_examples
harrison/add_dependencies
langchain-ai21==0.1.4
langchain-ai21==0.1.5
langchain-ai21==0.1.6
langchain-ai21==0.1.7
langchain-airbyte==0.1.1
langchain-anthropic==0.1.12
langchain-anthropic==0.1.13
langchain-anthropic==0.1.14rc1
langchain-anthropic==0.1.14rc2
langchain-anthropic==0.1.15
langchain-anthropic==0.1.16
langchain-anthropic==0.1.17
langchain-anthropic==0.1.18
langchain-anthropic==0.1.19
langchain-anthropic==0.1.20
langchain-anthropic==0.1.21
langchain-anthropic==0.1.22
langchain-anthropic==0.1.23
langchain-anthropic==0.2.0
langchain-anthropic==0.2.0.dev0
langchain-anthropic==0.2.0.dev1
langchain-anthropic==0.2.1
langchain-azure-dynamic-sessions==0.1.0
langchain-azure-dynamic-sessions==0.1.0rc0
langchain-azure-dynamic-sessions==0.2.0
langchain-box==0.1.0
langchain-box==0.2.0
langchain-box==0.2.1
langchain-chroma==0.1.1
langchain-chroma==0.1.2
langchain-chroma==0.1.4
langchain-cli==0.0.22
langchain-cli==0.0.23
langchain-cli==0.0.24
langchain-cli==0.0.25
langchain-cli==0.0.26
langchain-cli==0.0.27
langchain-cli==0.0.28
langchain-cli==0.0.29
langchain-cli==0.0.30
langchain-cli==0.0.31
langchain-community==0.0.35
langchain-community==0.0.36
langchain-community==0.0.37
langchain-community==0.0.38
langchain-community==0.2.0
langchain-community==0.2.0rc1
langchain-community==0.2.1
langchain-community==0.2.10
langchain-community==0.2.11
langchain-community==0.2.12
langchain-community==0.2.13
langchain-community==0.2.14
langchain-community==0.2.15
langchain-community==0.2.16
langchain-community==0.2.17
langchain-community==0.2.2
langchain-community==0.2.3
langchain-community==0.2.4
langchain-community==0.2.5
langchain-community==0.2.6
langchain-community==0.2.7
langchain-community==0.2.9
langchain-community==0.3.0
langchain-community==0.3.0.dev1
langchain-community==0.3.0.dev2
langchain-core==0.1.47
langchain-core==0.1.48
langchain-core==0.1.50
langchain-core==0.1.51
langchain-core==0.1.52
langchain-core==0.2.0
langchain-core==0.2.0rc1
langchain-core==0.2.1
langchain-core==0.2.10
langchain-core==0.2.11
langchain-core==0.2.12
langchain-core==0.2.13
langchain-core==0.2.15
langchain-core==0.2.16
langchain-core==0.2.17
langchain-core==0.2.18
langchain-core==0.2.19
langchain-core==0.2.2
langchain-core==0.2.20
langchain-core==0.2.21
langchain-core==0.2.22
langchain-core==0.2.23
langchain-core==0.2.24
langchain-core==0.2.25
langchain-core==0.2.26
langchain-core==0.2.27
langchain-core==0.2.28
langchain-core==0.2.29
langchain-core==0.2.29rc1
langchain-core==0.2.2rc1
langchain-core==0.2.3
langchain-core==0.2.30
langchain-core==0.2.31
langchain-core==0.2.32
langchain-core==0.2.33
langchain-core==0.2.34
langchain-core==0.2.35
langchain-core==0.2.36
langchain-core==0.2.37
langchain-core==0.2.38
langchain-core==0.2.39
langchain-core==0.2.4
langchain-core==0.2.40
langchain-core==0.2.41
langchain-core==0.2.5
langchain-core==0.2.6
langchain-core==0.2.7
langchain-core==0.2.8
langchain-core==0.2.9
langchain-core==0.3.0
langchain-core==0.3.0.dev1
langchain-core==0.3.0.dev2
langchain-core==0.3.0.dev3
langchain-core==0.3.0.dev4
langchain-core==0.3.0.dev5
langchain-core==0.3.1
langchain-core==0.3.2
langchain-core==0.3.3
langchain-core==0.3.4
langchain-core==0.3.5
langchain-couchbase==0.0.1
langchain-couchbase==0.1.0
langchain-couchbase==0.1.1
langchain-exa==0.1.0
langchain-exa==0.2.0
langchain-experimental==0.0.58
langchain-experimental==0.0.59
langchain-experimental==0.0.60
langchain-experimental==0.0.61
langchain-experimental==0.0.62
langchain-experimental==0.0.63
langchain-experimental==0.0.64
langchain-experimental==0.0.65
langchain-experimental==0.3.0
langchain-experimental==0.3.0.dev1
langchain-fireworks==0.1.3
langchain-fireworks==0.1.4
langchain-fireworks==0.1.5
langchain-fireworks==0.1.6
langchain-fireworks==0.1.7
langchain-fireworks==0.2.0
langchain-fireworks==0.2.0.dev0
langchain-fireworks==0.2.0.dev1
langchain-fireworks==0.2.0.dev2
langchain-groq==0.1.10
langchain-groq==0.1.4
langchain-groq==0.1.5
langchain-groq==0.1.6
langchain-groq==0.1.8
langchain-groq==0.1.9
langchain-groq==0.2.0
langchain-groq==0.2.0.dev0
langchain-groq==0.2.0.dev1
langchain-huggingface==0.0.1
langchain-huggingface==0.0.2
langchain-huggingface==0.0.3
langchain-huggingface==0.1.0
langchain-huggingface==0.1.0.dev1
langchain-ibm==0.1.5
langchain-ibm==0.1.6
langchain-ibm==0.1.7
langchain-ibm==0.1.8
langchain-ibm==0.1.9
langchain-milvus==0.1.0
langchain-milvus==0.1.1
langchain-milvus==0.1.2
langchain-milvus==0.1.3
langchain-milvus==0.1.4
langchain-milvus==0.1.5
langchain-mistralai==0.1.10
langchain-mistralai==0.1.11
langchain-mistralai==0.1.12
langchain-mistralai==0.1.13
langchain-mistralai==0.1.6
langchain-mistralai==0.1.7
langchain-mistralai==0.1.8
langchain-mistralai==0.1.9
langchain-mistralai==0.2.0
langchain-mistralai==0.2.0.dev1
langchain-mongodb==0.1.4
langchain-mongodb==0.1.5
langchain-mongodb==0.1.6
langchain-mongodb==0.1.7
langchain-mongodb==0.1.8
langchain-mongodb==0.1.9
langchain-mongodb==0.2.0
langchain-mongodb==0.2.0.dev1
langchain-nomic==0.1.0
langchain-nomic==0.1.1
langchain-nomic==0.1.2
langchain-nomic==0.1.3
langchain-ollama==0.1.0
langchain-ollama==0.1.1
langchain-ollama==0.1.2
langchain-ollama==0.1.3
langchain-ollama==0.2.0
langchain-ollama==0.2.0.dev1
langchain-openai==0.1.10
langchain-openai==0.1.11
langchain-openai==0.1.12
langchain-openai==0.1.13
langchain-openai==0.1.14
langchain-openai==0.1.15
langchain-openai==0.1.16
langchain-openai==0.1.17
langchain-openai==0.1.19
langchain-openai==0.1.20
langchain-openai==0.1.21
langchain-openai==0.1.21rc1
langchain-openai==0.1.21rc2
langchain-openai==0.1.22
langchain-openai==0.1.23
langchain-openai==0.1.24
langchain-openai==0.1.25
langchain-openai==0.1.5
langchain-openai==0.1.6
langchain-openai==0.1.7
langchain-openai==0.1.8
langchain-openai==0.1.8rc1
langchain-openai==0.1.9
langchain-openai==0.2.0
langchain-openai==0.2.0.dev0
langchain-openai==0.2.0.dev1
langchain-openai==0.2.0.dev2
langchain-pinecone==0.1.1
langchain-pinecone==0.1.2
langchain-pinecone==0.1.3
langchain-pinecone==0.2.0
langchain-pinecone==0.2.0.dev1
langchain-prompty==0.0.1
langchain-prompty==0.0.2
langchain-prompty==0.0.3
langchain-prompty==0.1.0
langchain-qdrant==0.0.1
langchain-qdrant==0.1.0
langchain-qdrant==0.1.1
langchain-qdrant==0.1.2
langchain-qdrant==0.1.3
langchain-qdrant==0.1.4
langchain-qdrant==0.2.0.dev1
langchain-robocorp==0.0.10
langchain-robocorp==0.0.10.post1
langchain-robocorp==0.0.6
langchain-robocorp==0.0.7
langchain-robocorp==0.0.8
langchain-robocorp==0.0.9
langchain-robocorp==0.0.9.post1
langchain-text-splitters==0.0.2
langchain-text-splitters==0.2.0
langchain-text-splitters==0.2.1
langchain-text-splitters==0.2.2
langchain-text-splitters==0.2.4
langchain-text-splitters==0.3.0
langchain-text-splitters==0.3.0.dev0
langchain-text-splitters==0.3.0.dev1
langchain-together==0.1.1
langchain-together==0.1.2
langchain-together==0.1.3
langchain-together==0.1.4
langchain-together==0.1.5
langchain-unstructured==0.1.0
langchain-unstructured==0.1.1
langchain-unstructured==0.1.2
langchain-unstructured==0.1.4
langchain-upstage==0.1.4
langchain-upstage==0.1.5
langchain-voyageai==0.1.1
langchain-voyageai==0.1.2
langchain==0.1.17
langchain==0.1.19
langchain==0.1.20
langchain==0.2.0
langchain==0.2.0rc1
langchain==0.2.0rc2
langchain==0.2.1
langchain==0.2.10
langchain==0.2.11
langchain==0.2.12
langchain==0.2.13
langchain==0.2.14
langchain==0.2.15
langchain==0.2.16
langchain==0.2.2
langchain==0.2.3
langchain==0.2.4
langchain==0.2.5
langchain==0.2.6
langchain==0.2.7
langchain==0.2.8
langchain==0.2.9
langchain==0.3.0
langchain==0.3.0.dev1
langchain==0.3.0.dev2
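The entries above follow the standard `package==version` pin syntax used by pip requirements files. A minimal sketch of splitting such pin lines into name/version pairs (the `parse_pin` helper and the sample pins are illustrative, not part of the repository):

```python
# Parse "package==version" pin lines like those listed above.
def parse_pin(line: str) -> tuple[str, str]:
    # str.partition splits on the first "==" only, so versions
    # containing "=" would not be mangled.
    name, _, version = line.partition("==")
    return name, version

pins = ["langchain==0.3.0", "langchain-core==0.2.41"]
parsed = dict(parse_pin(p) for p in pins)
print(parsed)  # {'langchain': '0.3.0', 'langchain-core': '0.2.41'}
```

Real requirements parsing should use a dedicated library (e.g. `packaging`), since pins can also carry extras, markers, and other operators.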
v0.0.1
v0.0.100
v0.0.101
v0.0.102
v0.0.103
v0.0.104
v0.0.105
v0.0.106
v0.0.107
v0.0.108
v0.0.109
v0.0.110
v0.0.111
v0.0.112
v0.0.113
v0.0.114
v0.0.115
v0.0.116
v0.0.117
v0.0.118
v0.0.119
v0.0.120
v0.0.121
v0.0.122
v0.0.123
v0.0.124
v0.0.125
v0.0.126
v0.0.127
v0.0.128
v0.0.129
v0.0.130
v0.0.131
v0.0.132
v0.0.133
v0.0.134
v0.0.135
v0.0.136
v0.0.137
v0.0.138
v0.0.139
v0.0.140
v0.0.141
v0.0.142
v0.0.143
v0.0.144
v0.0.145
v0.0.146
v0.0.147
v0.0.148
v0.0.149
v0.0.150
v0.0.151
v0.0.152
v0.0.153
v0.0.154
v0.0.155
v0.0.156
v0.0.157
v0.0.158
v0.0.159
v0.0.160
v0.0.161
v0.0.162
v0.0.163
v0.0.164
v0.0.165
v0.0.166
v0.0.167
v0.0.168
v0.0.169
v0.0.170
v0.0.171
v0.0.172
v0.0.173
v0.0.174
v0.0.175
v0.0.176
v0.0.177
v0.0.178
v0.0.179
v0.0.180
v0.0.181
v0.0.182
v0.0.183
v0.0.184
v0.0.185
v0.0.186
v0.0.187
v0.0.188
v0.0.189
v0.0.190
v0.0.191
v0.0.192
v0.0.193
v0.0.194
v0.0.195
v0.0.196
v0.0.197
v0.0.198
v0.0.199
v0.0.1rc0
v0.0.1rc1
v0.0.1rc2
v0.0.1rc3
v0.0.1rc4
v0.0.2
v0.0.200
v0.0.201
v0.0.202
v0.0.203
v0.0.204
v0.0.205
v0.0.206
v0.0.207
v0.0.208
v0.0.209
v0.0.210
v0.0.211
v0.0.212
v0.0.213
v0.0.214
v0.0.215
v0.0.216
v0.0.217
v0.0.218
v0.0.219
v0.0.220
v0.0.221
v0.0.222
v0.0.223
v0.0.224
v0.0.225
v0.0.226
v0.0.227
v0.0.228
v0.0.229
v0.0.230
v0.0.231
v0.0.232
v0.0.233
v0.0.234
v0.0.235
v0.0.236
v0.0.237
v0.0.238
v0.0.239
v0.0.240
v0.0.240rc0
v0.0.240rc1
v0.0.240rc4
v0.0.242
v0.0.243
v0.0.244
v0.0.245
v0.0.247
v0.0.248
v0.0.249
v0.0.250
v0.0.251
v0.0.252
v0.0.253
v0.0.254
v0.0.255
v0.0.256
v0.0.257
v0.0.258
v0.0.259
v0.0.260
v0.0.261
v0.0.262
v0.0.263
v0.0.264
v0.0.265
v0.0.266
v0.0.267
v0.0.268
v0.0.269
v0.0.270
v0.0.271
v0.0.272
v0.0.273
v0.0.274
v0.0.275
v0.0.276
v0.0.277
v0.0.278
v0.0.279
v0.0.281
v0.0.283
v0.0.284
v0.0.285
v0.0.286
v0.0.287
v0.0.288
v0.0.289
v0.0.290
v0.0.291
v0.0.292
v0.0.293
v0.0.294
v0.0.295
v0.0.296
v0.0.297
v0.0.298
v0.0.299
v0.0.300
v0.0.301
v0.0.302
v0.0.303
v0.0.304
v0.0.305
v0.0.306
v0.0.307
v0.0.308
v0.0.309
v0.0.310
v0.0.311
v0.0.312
v0.0.313
v0.0.314
v0.0.315
v0.0.316
v0.0.317
v0.0.318
v0.0.319
v0.0.320
v0.0.321
v0.0.322
v0.0.323
v0.0.324
v0.0.325
v0.0.326
v0.0.327
v0.0.329
v0.0.330
v0.0.331
v0.0.331rc0
v0.0.331rc1
v0.0.331rc2
v0.0.331rc3
v0.0.332
v0.0.333
v0.0.334
v0.0.335
v0.0.336
v0.0.337
v0.0.338
v0.0.339
v0.0.339rc0
v0.0.339rc1
v0.0.339rc2
v0.0.339rc3
v0.0.340
v0.0.341
v0.0.342
v0.0.343
v0.0.344
v0.0.345
v0.0.346
v0.0.347
v0.0.348
v0.0.349
v0.0.349-rc.1
v0.0.349-rc.2
v0.0.350
v0.0.351
v0.0.352
v0.0.353
v0.0.354
v0.0.4
v0.0.5
v0.0.64
v0.0.65
v0.0.66
v0.0.67
v0.0.68
v0.0.69
v0.0.70
v0.0.71
v0.0.72
v0.0.73
v0.0.74
v0.0.75
v0.0.76
v0.0.77
v0.0.78
v0.0.79
v0.0.80
v0.0.81
v0.0.82
v0.0.83
v0.0.84
v0.0.85
v0.0.86
v0.0.87
v0.0.88
v0.0.89
v0.0.90
v0.0.91
v0.0.92
v0.0.93
v0.0.94
v0.0.95
v0.0.96
v0.0.97
v0.0.98
v0.0.99
v0.1.0
v0.1.1
v0.1.10
v0.1.11
v0.1.12
v0.1.13
v0.1.14
v0.1.15
v0.1.16
v0.1.17rc1
v0.1.2
v0.1.3
v0.1.4
v0.1.5
v0.1.6
v0.1.7
v0.1.8
v0.1.9
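The tag list above is sorted lexicographically, which is why `v0.0.100` appears before `v0.0.2`. A minimal sketch of sorting such tags numerically instead, assuming plain `vMAJOR.MINOR.PATCH` tags (suffixes like `rc1` or `-rc.1` would need a real version parser such as `packaging.version`):

```python
# Sort release tags numerically rather than lexicographically.
# Assumes plain "vMAJOR.MINOR.PATCH" tags with no rc/dev/post suffixes.
def version_key(tag: str) -> tuple[int, ...]:
    return tuple(int(part) for part in tag.lstrip("v").split("."))

tags = ["v0.0.100", "v0.0.2", "v0.0.99", "v0.1.0"]
print(sorted(tags))                   # lexicographic: v0.0.100 sorts first
print(sorted(tags, key=version_key))  # numeric: v0.0.2 sorts first
```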
512 Commits (35ebd2620c56d8a109321d7eb3f7676e20175d69)
Author | SHA1 | Message | Date
---|---|---|---
Virat Singh | 264ab96980 | community: Add stock market tools from financialdatasets.ai (#25025). Adds three stock market tools: get balance sheets, get cash flow statements, and get income statements. | 2 months ago
maang-h | 1028af17e7 | docs: Standardize Tongyi (#25103). Standardizes the Tongyi LLM docs (#24803) and model init arg names (#20085). | 2 months ago
Dobiichi-Origami | 061ed250f6 | delete the default model value from langchain and discard the need fo… (#24915). Removes the mandatory `QIANFAN_AK` and the default model name, since the underlying `qianfan` SDK already supplies a default model. | 2 months ago
ZhangShenao | cda79dbb6c | community[patch]: Optimize test case for `MoonshotChat` (#25050). Uses the standard ChatModelIntegrationTests. | 2 months ago
maang-h | f5da0d6d87 | docs: Standardize MiniMaxEmbeddings (#24983). Standardizes the docs (#24856) and model init arg names (#20085). | 2 months ago
Isaac Francisco | 73570873ab | docs: standardizing tavily tool docs (#24736) | 2 months ago
Bagatur | 8e2316b8c2 | community[patch]: Release 0.2.11 (#24989) | 2 months ago
ZhangShenao | 71c0564c9f | community[patch]: Add test case for MoonshotChat (#24960) | 2 months ago
Serena Ruan | 1827bb4042 | community[patch]: support bind_tools for ChatMlflow (#24547). Adds the `ChatMlflow.bind_tools` method, tested in Databricks. | 2 months ago
Nikita Pakunov | c776471ac6 | community: fix AttributeError: 'YandexGPT' object has no attribute '_grpc_metadata' (#24432). Fixes #24049. | 2 months ago
Eugene Yurtsev | d24b82357f | community[patch]: Add missing annotations (#24890) | 2 months ago
This PR adds annotations in the community package. Annotations are only strictly needed in subclasses of BaseModel for pydantic 2 compatibility. This PR adds some unnecessary annotations, but they are harmless and useful for documentation pages. |
2 months ago |
Rajendra Kadam |
a6add89bd4
|
community[minor]: [PebbloSafeLoader] Implement content-size-based batching (#24871)
- **Title:** [PebbloSafeLoader] Implement content-size-based batching in the classification flow (loader/doc API) - **Description:** - Implemented content-size-based batching in the loader/doc API, set to 100KB with no external configuration option, intentionally hard-coded to prevent timeouts. - Removed unused field (`pb_id`) from doc_metadata - **Issue:** NA - **Dependencies:** NA - **Add tests and docs:** Updated |
2 months ago |
TrumanYan |
096b66db4a
|
community: replace it with Tencent Cloud SDK (#24172)
Description: The old method will be discontinued; use the official SDK for more model options. Issue: None Dependencies: None Twitter handle: None Co-authored-by: trumanyan <trumanyan@tencent.com> |
2 months ago |
Anush |
51b15448cc
|
community: Fix FastEmbedEmbeddings (#24462)
## Description This PR: - Fixes the validation error in `FastEmbedEmbeddings`. - Adds support for the `batch_size` and `parallel` params. - Removes support for very old FastEmbed versions. - Updates the FastEmbed doc with the new params. Associated Issues: - Resolves #24039 - Resolves https://github.com/qdrant/fastembed/issues/296 |
2 months ago |
Igor Drozdov |
c2706cfb9e
|
feat(community): add tools support for litellm (#23906)
I used the following example to validate the behavior ```python from langchain_core.prompts import ChatPromptTemplate from langchain_core.runnables import ConfigurableField from langchain_anthropic import ChatAnthropic from langchain_community.chat_models import ChatLiteLLM from langchain_core.tools import tool from langchain.agents import create_tool_calling_agent, AgentExecutor @tool def multiply(x: float, y: float) -> float: """Multiply 'x' times 'y'.""" return x * y @tool def exponentiate(x: float, y: float) -> float: """Raise 'x' to the 'y'.""" return x**y @tool def add(x: float, y: float) -> float: """Add 'x' and 'y'.""" return x + y prompt = ChatPromptTemplate.from_messages([ ("system", "you're a helpful assistant"), ("human", "{input}"), ("placeholder", "{agent_scratchpad}"), ]) tools = [multiply, exponentiate, add] llm = ChatAnthropic(model="claude-3-sonnet-20240229", temperature=0) # llm = ChatLiteLLM(model="claude-3-sonnet-20240229", temperature=0) agent = create_tool_calling_agent(llm, tools, prompt) agent_executor = AgentExecutor(agent=agent, tools=tools, verbose=True) agent_executor.invoke({"input": "what's 3 plus 5 raised to the 2.743. also what's 17.24 - 918.1241", }) ``` `ChatAnthropic` version works: ``` > Entering new AgentExecutor chain... 
Invoking: `exponentiate` with `{'x': 5, 'y': 2.743}` responded: [{'text': 'To calculate 3 + 5^2.743, we can use the "exponentiate" and "add" tools:', 'type': 'text', 'index': 0}, {'id': 'toolu_01Gf54DFTkfLMJQX3TXffmxe', 'input': {}, 'name': 'exponentiate', 'type': 'tool_use', 'index': 1, 'partial_json': '{"x": 5, "y": 2.743}'}] 82.65606421491815 Invoking: `add` with `{'x': 3, 'y': 82.65606421491815}` responded: [{'id': 'toolu_01XUq9S56GT3Yv2N1KmNmmWp', 'input': {}, 'name': 'add', 'type': 'tool_use', 'index': 0, 'partial_json': '{"x": 3, "y": 82.65606421491815}'}] 85.65606421491815 Invoking: `add` with `{'x': 17.24, 'y': -918.1241}` responded: [{'text': '\n\nSo 3 + 5^2.743 = 85.66\n\nTo calculate 17.24 - 918.1241, we can use:', 'type': 'text', 'index': 0}, {'id': 'toolu_01BkXTwP7ec9JKYtZPy5JKjm', 'input': {}, 'name': 'add', 'type': 'tool_use', 'index': 1, 'partial_json': '{"x": 17.24, "y": -918.1241}'}] -900.8841[{'text': '\n\nTherefore, 17.24 - 918.1241 = -900.88', 'type': 'text', 'index': 0}] > Finished chain. ``` While `ChatLiteLLM` version doesn't. But with the changes in this PR, along with: - https://github.com/langchain-ai/langchain/pull/23823 - https://github.com/BerriAI/litellm/pull/4554 The result is _almost_ the same: ``` > Entering new AgentExecutor chain... Invoking: `exponentiate` with `{'x': 5, 'y': 2.743}` responded: To calculate 3 + 5^2.743, we can use the "exponentiate" and "add" tools: 82.65606421491815 Invoking: `add` with `{'x': 3, 'y': 82.65606421491815}` 85.65606421491815 Invoking: `add` with `{'x': 17.24, 'y': -918.1241}` responded: So 3 + 5^2.743 = 85.66 To calculate 17.24 - 918.1241, we can use: -900.8841 Therefore, 17.24 - 918.1241 = -900.88 > Finished chain. ``` If no one reviews your PR within a few days, please @-mention one of baskaryan, efriis, eyurtsev, ccurme, vbarda, hwchase17. Co-authored-by: ccurme <chester.curme@gmail.com> |
2 months ago |
maang-h |
4bb1a11e02
|
community: Add MiniMaxChat bind_tools and structured output (#24310)
- **Description:** - Add `bind_tools` method to support tool calling - Add `with_structured_output` method to support structured output |
2 months ago |
maang-h |
bf685c242f
|
docs: Standardize QianfanEmbeddingsEndpoint (#24786)
- **Description:** Standardize QianfanEmbeddingsEndpoint, including: - docstrings, the issue #21983 - model init arg names, the issue #20085 |
2 months ago |
Haijian Wang |
cda3025ee1
|
Integrating the Yi family of models. (#24491)
- [x] **PR title**: "community: add Yi LLM", "docs: add Yi documentation" - **Description:** This PR adds support for the Yi model to LangChain. - **Dependencies:** [langchain_core, requests, contextlib, typing, logging, json, langchain_community] - **Twitter handle:** 01.AI - [x] **Add tests and docs**: I've added the corresponding documentation to the relevant paths --------- Co-authored-by: Bagatur <baskaryan@gmail.com> Co-authored-by: isaac hershenson <ihershenson@hmc.edu> |
2 months ago |
yonarw |
b65ac8d39c
|
community[minor]: Self query retriever for HANA Cloud Vector Engine (#24494)
Description: - This PR adds a self query retriever implementation for SAP HANA Cloud Vector Engine. The retriever supports all operators except for contains. - Issue: N/A - Dependencies: no new dependencies added **Add tests and docs:** Added integration tests to: libs/community/tests/unit_tests/query_constructors/test_hanavector.py **Documentation for self query retriever:** /docs/integrations/retrievers/self_query/hanavector_self_query.ipynb --------- Co-authored-by: Bagatur <baskaryan@gmail.com> Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com> |
2 months ago |
nobbbbby |
4f3b4fc7fe
|
community[patch]: Extend Baichuan model with tool support (#24529)
**Description:** Expanded the chat model functionality to support tools in the 'baichuan.py' file. Updated module imports and added tool object handling in message conversions. Additional changes include the implementation of tool binding and related unit tests. The alterations offer enhanced model capabilities by enabling interaction with tool-like objects. --------- Co-authored-by: ccurme <chester.curme@gmail.com> |
2 months ago |
Rave Harpaz |
ee399e3ec5
|
community[patch]: Add OCI Generative AI tool and structured output support (#24693)
- [x] **PR title**: community: Add OCI Generative AI tool and structured output support - [x] **PR message**: - **Description:** adding tool calling and structured output support for chat models offered by OCI Generative AI services. This is an update to our last PR 22880 with changes in /langchain_community/chat_models/oci_generative_ai.py - **Issue:** NA - **Dependencies:** NA - **Twitter handle:** NA - [x] **Add tests and docs**: 1. we have updated our unit tests 2. we have updated our documentation under /docs/docs/integrations/chat/oci_generative_ai.ipynb - [x] **Lint and test**: `make format`, `make lint` and `make test` we run successfully --------- Co-authored-by: RHARPAZ <RHARPAZ@RHARPAZ-5750.us.oracle.com> Co-authored-by: Arthur Cheng <arthur.cheng@oracle.com> |
2 months ago |
Yuki Watanabe |
2b6a262f84
|
community[patch]: Replace `filters` argument to `filter` in DatabricksVectorSearch (#24530)
The [DatabricksVectorSearch](https://github.com/langchain-ai/langchain/blob/master/libs/community/langchain_community/vectorstores/databricks_vector_search.py#L21) class exposes similarity search APIs with the argument `filters`, which is inconsistent with other VS classes, which use `filter` (singular). This PR updates the argument and adds an alias for backward compatibility. --------- Signed-off-by: B-Step62 <yuki.watanabe@databricks.com> |
2 months ago |
Chaunte W. Lacewell |
69eacaa887
|
Community[minor]: Update VDMS vectorstore (#23729)
**Description:** - This PR exposes some functions in the VDMS vectorstore, updates VDMS-related notebooks, updates tests, and upgrades the version of VDMS (>=0.0.20) **Issue:** N/A **Dependencies:** - Update vdms>=0.0.20 |
2 months ago |
KyrianC |
0fdbaf4a8d
|
community: fix ChatEdenAI + EdenAI Tools (#23715)
Fixes for Eden AI Custom tools and ChatEdenAI: - add missing import in __init__ of chat_models - add `args_schema` to custom tools. otherwise '__arg1' would sometimes be passed to the `run` method - fix IndexError when no human msg is added in ChatEdenAI |
2 months ago |
rick-SOPTIM |
cd563fb628
|
community[minor]: passthrough auth parameter on requests to Ollama-LLMs (#24068)
**Description:** This PR allows users of `langchain_community.llms.ollama.Ollama` to specify the `auth` parameter, which is then forwarded to all internal calls of `requests.request`. This works in the same way as the existing `headers` parameter. The `auth` parameter enables use of the class with Ollama instances that are secured by more complex authentication mechanisms which do not rely only on static headers. One example is an AWS API Gateway secured by the IAM authorizer, which expects signatures dynamically calculated for the specific HTTP request. **Issue:** Integrating a remote LLM running through Ollama using `langchain_community.llms.ollama.Ollama` only allows setting static HTTP headers with the `headers` parameter. This does not work if the given Ollama instance is secured with an authentication mechanism that uses dynamically created HTTP headers, which may for example depend on the content of a given request. **Dependencies:** None **Twitter handle:** None --------- Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com> |
2 months ago |
Oleg Kulyk |
4b1b7959a2
|
community[minor]: Add ScrapingAnt Loader Community Integration (#24514)
Added [ScrapingAnt](https://scrapingant.com/) Web Loader integration. ScrapingAnt is a web scraping API that allows extracting web page data into accessible and well-formatted markdown. Description: Added ScrapingAnt web loader for retrieving web page data as markdown Dependencies: scrapingant-client Twitter: @WeRunTheWorld3 --------- Co-authored-by: Oleg Kulyk <oleg@scrapingant.com> |
2 months ago |
Anindyadeep |
12c3454fd9
|
[Community] PremAI Tool Calling Functionality (#23931)
This PR is a work in progress and adds the following functionality: - [X] Supports tool calling across the langchain ecosystem (however, streaming is not supported) - [X] Updated documentation |
2 months ago |
Vishnu Nandakumar |
e271965d1e
|
community: retrievers: added capability for using Product Quantization as one of the retriever. (#22424)
- [ ] **Community**: "Retrievers: Product Quantization" - [X] This PR adds the Product Quantization feature to the retrievers in the LangChain Community. PQ is one of the fastest retrieval methods if the embeddings are rich enough in context, due to the concepts of quantization and representation through centroids - **Description:** Adding PQ as one of the retrievers - **Dependencies:** using the package nanopq for this PR - **Twitter handle:** vishnunkumar_ - [X] **Add tests and docs**: - [X] Added unit tests for the same in the retrievers. - [ ] Will add an example notebook subsequently - [X] **Lint and test**: Ran `make format`, `make lint` and `make test` from the root of the package(s) modified. See contribution guidelines for more: https://python.langchain.com/docs/contributing/ --------- Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com> Co-authored-by: Bagatur <baskaryan@gmail.com> Co-authored-by: Chester Curme <chester.curme@gmail.com> |
2 months ago |
Aayush Kataria |
0f45ac4088
|
LangChain Community: VectorStores: Azure Cosmos DB Filtered Vector Search (#24087)
This PR adds vector search filtering for Azure Cosmos DB Mongo vCore and NoSQL. |
2 months ago |
Alexander Golodkov |
2a70a07aad
|
community[minor]: added new document loaders based on dedoc library (#24303)
### Description This pull request added new document loaders to load documents of various formats using [Dedoc](https://github.com/ispras/dedoc): - `DedocFileLoader` (determine file types automatically and parse) - `DedocPDFLoader` (for `PDF` and images parsing) - `DedocAPIFileLoader` (determine file types automatically and parse using Dedoc API without library installation) [Dedoc](https://dedoc.readthedocs.io) is an open-source library/service that extracts texts, tables, attached files and document structure (e.g., titles, list items, etc.) from files of various formats. The library is actively developed and maintained by a group of developers. `Dedoc` supports `DOCX`, `XLSX`, `PPTX`, `EML`, `HTML`, `PDF`, images and more. Full list of supported formats can be found [here](https://dedoc.readthedocs.io/en/latest/#id1). For `PDF` documents, `Dedoc` allows to determine textual layer correctness and split the document into paragraphs. ### Issue This pull request extends variety of document loaders supported by `langchain_community` allowing users to choose the most suitable option for raw documents parsing. ### Dependencies The PR added a new (optional) dependency `dedoc>=2.2.5` ([library documentation](https://dedoc.readthedocs.io)) to the `extended_testing_deps.txt` ### Twitter handle None ### Add tests and docs 1. Test for the integration: `libs/community/tests/integration_tests/document_loaders/test_dedoc.py` 2. Example notebook: `docs/docs/integrations/document_loaders/dedoc.ipynb` 3. Information about the library: `docs/docs/integrations/providers/dedoc.mdx` ### Lint and test Done locally: - `make format` - `make lint` - `make integration_tests` - `make docs_build` (from the project root) --------- Co-authored-by: Nasty <bogatenkova.anastasiya@mail.ru> |
2 months ago |
Ben Chambers |
5ac936a284
|
community[minor]: add document transformer for extracting links (#24186)
- **Description:** Add a DocumentTransformer for executing one or more `LinkExtractor`s and adding the extracted links to each document. - **Issue:** n/a - **Dependencies:** none --------- Co-authored-by: Eugene Yurtsev <eugene@langchain.dev> |
2 months ago |
maang-h |
721f709dec
|
community: Improve QianfanChatEndpoint tool result to model (#24466)
- **Description:** When `QianfanChatEndpoint` uses a tool result to answer questions, the tool's content is required to be in Dict format. We could instead require users to return a Dict when calling the tool, but to stay consistent with the other Chat Models, I think this modification is necessary. |
2 months ago |
ccurme |
dcba7df2fe
|
community[patch]: deprecate langchain_community Chroma in favor of langchain_chroma (#24474) | 2 months ago |
ZhangShenao |
0f6737cbfe
|
[Vector Store] Fix function `add_texts` in `TencentVectorDB` (#24469)
Regardless of whether `embedding_func` is set, the 'text' attribute of the document should be assigned; otherwise the `page_content` of documents in the final search results will be lost |
2 months ago |
maang-h |
7b28359719
|
docs: Add ChatSparkLLM docstrings (#24449)
- **Description:** - Add `ChatSparkLLM` docstrings, the issue #22296 - To support `stream` method |
2 months ago |
Erick Friis |
f4ee3c8a22
|
infra: add min version testing to pr test flow (#24358)
xfailing some sql tests that do not currently work on sqlalchemy v1. #22207 was very much not sqlalchemy v1 compatible. Moving forward, implementations should be compatible with both to pass CI |
2 months ago |
Philippe PRADOS |
f5856680fe
|
community[minor]: add mongodb byte store (#23876)
The `MongoDBStore` can manage only documents. It's not possible to use MongoDB for a `CacheBackedEmbeddings`. With this new implementation, it's possible to use: ```python CacheBackedEmbeddings.from_bytes_store( underlying_embeddings=embeddings, document_embedding_cache=MongoDBByteStore( connection_string=db_uri, db_name=db_name, collection_name=collection_name, ), ) ``` and use MongoDB to cache the embeddings! |
2 months ago |
Dristy Srivastava |
020cc1cf3e
|
Community[minor]: Added checksum in while send data to pebblo-cloud (#23968)
- **Description:** - Updated checksum in doc metadata - Sending checksum and removing actual content while sending data to `pebblo-cloud`, if `classifier-location` is `pebblo-cloud`, in the `/loader/doc` API - Adding `pb_id` i.e. pebblo id to doc metadata - Refactoring as needed - Sending `content-checksum` and removing actual content while sending data to `pebblo-cloud`, if `classifier-location` is `pebblo-cloud`, in the `prompt` API - **Issue:** NA - **Dependencies:** NA - **Tests:** Updated - **Docs:** NA --------- Co-authored-by: dristy.cd <dristy@clouddefense.io> |
2 months ago |
keval dekivadiya |
06f47678ae
|
community[minor]: Add TextEmbed Embedding Integration (#22946)
**Description:** **TextEmbed** is a high-performance embedding inference server designed to provide a high-throughput, low-latency solution for serving embeddings. It supports various sentence-transformer models and includes the ability to deploy image and text embedding models. TextEmbed offers flexibility and scalability for diverse applications. - **PyPI Package:** [TextEmbed on PyPI](https://pypi.org/project/textembed/) - **Docker Image:** [TextEmbed on Docker Hub](https://hub.docker.com/r/kevaldekivadiya/textembed) - **GitHub Repository:** [TextEmbed on GitHub](https://github.com/kevaldekivadiya2415/textembed) **PR Description** This PR adds functionality for embedding documents and queries using the `TextEmbedEmbeddings` class. The implementation allows for both synchronous and asynchronous embedding requests to a TextEmbed API endpoint. The class handles batching and permuting of input texts to optimize the embedding process. **Example Usage:** ```python from langchain_community.embeddings import TextEmbedEmbeddings # Initialise the embeddings class embeddings = TextEmbedEmbeddings(model="your-model-id", api_key="your-api-key", api_url="your_api_url") # Define a list of documents documents = [ "Data science involves extracting insights from data.", "Artificial intelligence is transforming various industries.", "Cloud computing provides scalable computing resources over the internet.", "Big data analytics helps in understanding large datasets.", "India has a diverse cultural heritage." ] # Define a query query = "What is the cultural heritage of India?" 
# Embed all documents document_embeddings = embeddings.embed_documents(documents) # Embed the query query_embedding = embeddings.embed_query(query) # Print embeddings for each document for i, embedding in enumerate(document_embeddings): print(f"Document {i+1} Embedding:", embedding) # Print the query embedding print("Query Embedding:", query_embedding) ``` --------- Co-authored-by: Eugene Yurtsev <eugene@langchain.dev> |
2 months ago |
Ben Chambers |
3691701d58
|
community[minor]: Add keybert-based link extractor (#24311)
- **Description:** Add a `KeybertLinkExtractor` for graph vectorstores. This allows extracting links from keywords in a Document and linking nodes that have common keywords. - **Issue:** None - **Dependencies:** None. --------- Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com> Co-authored-by: ccurme <chester.curme@gmail.com> |
2 months ago |
Ben Chambers |
83f3d95ffa
|
community[minor]: GLiNER link extraction (#24314)
- **Description:** This allows extracting links between documents with common named entities using [GLiNER](https://github.com/urchade/GLiNER). - **Issue:** None - **Dependencies:** None --------- Co-authored-by: Eugene Yurtsev <eyurtsev@gmail.com> |
2 months ago |
Anas Khan |
b5acb91080
|
Mask API keys for various LLM/ChatModel Modules (#13885)
**Description:** - Added masking of the API keys for the modules: - `langchain/chat_models/openai.py` - `langchain/llms/openai.py` - `langchain/llms/google_palm.py` - `langchain/chat_models/google_palm.py` - `langchain/llms/edenai.py` - Updated the modules to utilize `SecretStr` from pydantic to securely manage API keys. - Added unit/integration tests - `langchain/chat_models/azure_openai.py` used the `openai_api_key` derived from the `ChatOpenAI` class and assumed it is a `str`, so we changed it to expect `SecretStr` instead. **Issue:** https://github.com/langchain-ai/langchain/issues/12165 **Dependencies:** none **Tag maintainer:** @eyurtsev --------- Co-authored-by: HassanA01 <anikeboss@gmail.com> Co-authored-by: Aneeq Hassan <aneeq.hassan@utoronto.ca> Co-authored-by: kristinspenc <kristinspenc2003@gmail.com> Co-authored-by: faisalt14 <faisalt14@gmail.com> Co-authored-by: Harshil-Patel28 <76663814+Harshil-Patel28@users.noreply.github.com> Co-authored-by: kristinspenc <146893228+kristinspenc@users.noreply.github.com> Co-authored-by: faisalt14 <90787271+faisalt14@users.noreply.github.com> Co-authored-by: Chester Curme <chester.curme@gmail.com> |
2 months ago |
ccurme |
f99369a54c
|
community[patch]: fix formatting (#24443)
Somehow this got through CI: https://github.com/langchain-ai/langchain/pull/24363 |
2 months ago |
Ben Chambers |
242b085be7
|
Merge pull request #24315
* community: Add Hierarchy link extractor * add example * lint |
2 months ago |
Brice Fotzo |
034a8c7c1b
|
community: support advanced text extraction options for pdf documents (#20265)
**Description:** - Updated constructors in PyPDFParser and PyPDFLoader to handle `extraction_mode` and additional kwargs, aligning with the capabilities of `PageObject.extract_text()` from pypdf. - Added `test_pypdf_loader_with_layout` along with a corresponding example text file to validate layout extraction from PDFs. **Issue:** fixes #19735 **Dependencies:** This change requires updating the pypdf dependency from version 3.4.0 to at least 4.0.0. Additional changes include the addition of a new test test_pypdf_loader_with_layout and an example text file to ensure the functionality of layout extraction from PDFs aligns with the new capabilities. --------- Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com> Co-authored-by: Bagatur <baskaryan@gmail.com> Co-authored-by: Erick Friis <erick@langchain.dev> |
2 months ago |
Rafael Pereira |
cf28708e7b
|
Neo4j: Update with non-deprecated cypher methods, and new method to associate relationship embeddings (#23725)
**Description:** At the moment the neo4j wrapper uses setVectorProperty, which is deprecated ([link](https://neo4j.com/docs/operations-manual/5/reference/procedures/#procedure_db_create_setVectorProperty)). I replaced it with the non-deprecated version. Neo4j recently introduced a new cypher method, "setRelationshipVectorProperty", to associate embeddings with relationships. In this PR I also implemented a new method to perform this association, maintaining the same format used in the "add_embeddings" method, which associates embeddings with Nodes. I also included a test case for this new method. |
2 months ago |
Rafael Pereira |
fc41730e28
|
neo4j: Fix test for order-insensitive comparison and floating-point precision issues (#24338)
**Description:** This PR addresses two main issues in the `test_neo4jvector.py`: 1. **Order-insensitive Comparison:** Modified the `test_retrieval_dictionary` to ensure that it passes regardless of the order of returned values by parsing `page_content` into a structured format (dictionary) before comparison. 2. **Floating-point Precision:** Updated `test_neo4jvector_relevance_score` to handle minor floating-point precision differences by using the `isclose` function for comparing relevance scores with a relative tolerance. Errors addressed: - **test_neo4jvector_relevance_score:** ``` AssertionError: assert [(Document(page_content='foo', metadata={'page': '0'}), 1.0000014305114746), (Document(page_content='bar', metadata={'page': '1'}), 0.9998371005058289), (Document(page_content='baz', metadata={'page': '2'}), 0.9993508458137512)] == [(Document(page_content='foo', metadata={'page': '0'}), 1.0), (Document(page_content='bar', metadata={'page': '1'}), 0.9998376369476318), (Document(page_content='baz', metadata={'page': '2'}), 0.9993523359298706)] At index 0 diff: (Document(page_content='foo', metadata={'page': '0'}), 1.0000014305114746) != (Document(page_content='foo', metadata={'page': '0'}), 1.0) Full diff: - [(Document(page_content='foo', metadata={'page': '0'}), 1.0), + [(Document(page_content='foo', metadata={'page': '0'}), 1.0000014305114746), ? +++++++++++++++ - (Document(page_content='bar', metadata={'page': '1'}), 0.9998376369476318), ? ^^^ ------ + (Document(page_content='bar', metadata={'page': '1'}), 0.9998371005058289), ? ^^^^^^^^^ - (Document(page_content='baz', metadata={'page': '2'}), 0.9993523359298706), ? ---------- + (Document(page_content='baz', metadata={'page': '2'}), 0.9993508458137512), ? 
++++++++++ ] ``` - **test_retrieval_dictionary:** ``` AssertionError: assert [Document(page_content='skills:\n- Python\n- Data Analysis\n- Machine Learning\nname: John\nage: 30\n')] == [Document(page_content='skills:\n- Python\n- Data Analysis\n- Machine Learning\nage: 30\nname: John\n')] At index 0 diff: Document(page_content='skills:\n- Python\n- Data Analysis\n- Machine Learning\nname: John\nage: 30\n') != Document(page_content='skills:\n- Python\n- Data Analysis\n- Machine Learning\nage: 30\nname: John\n') Full diff: - [Document(page_content='skills:\n- Python\n- Data Analysis\n- Machine Learning\nage: 30\nname: John\n')] ? --------- + [Document(page_content='skills:\n- Python\n- Data Analysis\n- Machine Learning\nage: John\nage: 30\n')] ? +++++++++ ``` |
2 months ago |
bovlb |
5caa381177
|
community[minor]: Add ApertureDB as a vectorstore (#24088)
- [X] **ApertureDB as vectorstore**: "community: Add ApertureDB as a vectorstore" - **Description:** this change provides a new community integration that uses ApertureData's ApertureDB as a vector store. - **Issue:** none - **Dependencies:** depends on the ApertureDB Python SDK - **Twitter handle:** ApertureData - [X] **Add tests and docs**: integration tests rely on a local run of a public docker image; the example notebook additionally relies on a local Ollama server. - [X] **Lint and test**: all lint tests pass. --------- Co-authored-by: Gautam <gautam@aperturedata.io> |
2 months ago |
Lage Ragnarsson |
a3c10fc6ce
|
community: Add support for specifying hybrid search for Databricks vector search (#23528)
**Description:** Databricks Vector Search recently added support for hybrid keyword-similarity search. See [usage examples](https://docs.databricks.com/en/generative-ai/create-query-vector-search.html#query-a-vector-search-endpoint) from their documentation. This PR updates the Langchain vectorstore interface for Databricks to enable the user to pass the *query_type* parameter to *similarity_search* to make use of this functionality. By default, there will not be any changes for existing users of this interface. To use the new hybrid search feature, it is now possible to do ```python # ... dvs = DatabricksVectorSearch(index) dvs.similarity_search("my search query", query_type="HYBRID") ``` Or using the retriever: ```python retriever = dvs.as_retriever( search_kwargs={ "query_type": "HYBRID", } ) retriever.invoke("my search query") ``` --------- Co-authored-by: Bagatur <22008038+baskaryan@users.noreply.github.com> Co-authored-by: Erick Friis <erick@langchain.dev> |
2 months ago |
Christopher Tee |
5171ffc026
|
community(you): Integrate You.com conversational APIs (#23046)
You.com is releasing two new conversational APIs, Smart and Research. This PR: - integrates those APIs with LangChain, as an LLM - streaming is supported |
2 months ago |