# Airbyte JSON
Airbyte is a data integration platform for ELT pipelines from APIs, databases & files to warehouses & lakes. It has the largest catalog of ELT connectors to data warehouses and databases.
## Installation and Setup
These instructions show how to load any source from Airbyte into a local JSON file that can be read in as a document.

Prerequisites: Have Docker Desktop installed.
Steps:

- Clone Airbyte from GitHub: `git clone https://github.com/airbytehq/airbyte.git`.
- Switch into the Airbyte directory: `cd airbyte`.
- Start Airbyte: `docker compose up`.
- In your browser, visit http://localhost:8000. You will be asked for a username and password. By default, the username is `airbyte` and the password is `password`.
- Set up any source you wish.
- Set the destination as Local JSON, with a specified destination path - let's say `/json_data`. Set up a manual sync.
- Run the connection.
- To see what files are created, navigate to `file:///tmp/airbyte_local/` (a Python sketch for this follows the list).
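Before wiring a file into LangChain, it can help to check what the sync actually produced. A minimal sketch, assuming the destination path `/json_data` chosen above; the exact filenames depend on your Airbyte version and the streams you synced:

```python
from pathlib import Path

# Airbyte's Local JSON destination writes under /tmp/airbyte_local/ on the host;
# "json_data" is the destination path chosen in the steps above (an assumption).
data_dir = Path("/tmp/airbyte_local/json_data")

# List whatever files the sync produced (names depend on the synced streams).
for path in sorted(data_dir.glob("*")):
    print(path)
```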
## Document Loader
See a usage example.
```python
from langchain.document_loaders import AirbyteJSONLoader
```
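A minimal sketch of loading one of the synced files. It assumes the `/json_data` destination path from the steps above; the filename here is a placeholder and depends on the stream you synced:

```python
from langchain.document_loaders import AirbyteJSONLoader

# Placeholder path: the Local JSON destination writes under
# /tmp/airbyte_local/<destination path>; adjust the filename to your stream.
loader = AirbyteJSONLoader(
    "/tmp/airbyte_local/json_data/_airbyte_raw_your_stream.jsonl"
)

# load() returns a list of Document objects built from the file's records.
docs = loader.load()
print(docs[0].page_content[:500])
```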