From 937a7e93f27977322c933ae83ffea0a3b925f21c Mon Sep 17 00:00:00 2001
From: Harrison Chase
Date: Wed, 21 Jun 2023 23:13:45 -0700
Subject: [PATCH] add motherduck docs (#6572)

---
 .../ecosystem/integrations/motherduck.mdx | 50 +++++++++++++++++++
 1 file changed, 50 insertions(+)
 create mode 100644 docs/extras/ecosystem/integrations/motherduck.mdx

diff --git a/docs/extras/ecosystem/integrations/motherduck.mdx b/docs/extras/ecosystem/integrations/motherduck.mdx
new file mode 100644
index 00000000..b8256586
--- /dev/null
+++ b/docs/extras/ecosystem/integrations/motherduck.mdx
@@ -0,0 +1,50 @@
# Motherduck

>[Motherduck](https://motherduck.com/) is a managed DuckDB-in-the-cloud service.

## Installation and Setup

First, you need to install the `duckdb` Python package.

```bash
pip install duckdb
```

You will also need to sign up for an account at [Motherduck](https://motherduck.com/).

After that, you should set up a connection string. We mostly integrate with Motherduck through SQLAlchemy, and the connection string is likely of the form:

```python
token = "..."  # your Motherduck service token

conn_str = f"duckdb:///md:{token}@my_db"  # my_db is the name of your Motherduck database
```

## SQLChain

You can use the SQLChain to query data in your Motherduck instance in natural language.

```python
from langchain import OpenAI, SQLDatabase, SQLDatabaseChain

# Wrap the Motherduck connection and build a chain that turns questions into SQL.
db = SQLDatabase.from_uri(conn_str)
db_chain = SQLDatabaseChain.from_llm(OpenAI(temperature=0), db, verbose=True)
```

From here, see the [SQL Chain](/docs/modules/chains/popular/sqlite.html) documentation on how to use it.


## LLMCache

You can also easily use Motherduck to cache LLM requests.
Once again, this is done through the SQLAlchemy wrapper.

```python
import langchain
import sqlalchemy
from langchain.cache import SQLAlchemyCache

# Point LangChain's global LLM cache at the Motherduck database.
eng = sqlalchemy.create_engine(conn_str)
langchain.llm_cache = SQLAlchemyCache(engine=eng)
```

From here, see the [LLM Caching](/docs/modules/model_io/models/llms/how_to/llm_caching) documentation on how to use it.
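
As a minimal sketch of the cache in action (the model choice and prompt below are placeholders, and it assumes the cache has been configured as above and a valid OpenAI API key is available in the environment), an identical repeated call should be answered from the Motherduck-backed cache instead of hitting the API:

```python
from langchain.llms import OpenAI

llm = OpenAI(temperature=0)  # any LLM works; OpenAI is just an example

# The first call goes to the API and the result is written to the Motherduck-backed cache.
llm("Tell me a joke")

# An identical second call is served from the cache, so it returns faster and costs nothing.
llm("Tell me a joke")
```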