# DeepInfra

This page covers how to use the DeepInfra ecosystem within LangChain.
It is broken into two parts: installation and setup, and then references to specific DeepInfra wrappers.

## Installation and Setup

- Get your DeepInfra API key from [here](https://deepinfra.com/).
- Set it as an environment variable (`DEEPINFRA_API_TOKEN`).

## Available Models

DeepInfra provides a range of Open Source LLMs ready for deployment.
You can list the supported models [here](https://deepinfra.com/models?type=text-generation).
The `google/flan*` models can be viewed [here](https://deepinfra.com/models?type=text2text-generation).

You can view a list of request and response parameters [here](https://deepinfra.com/databricks/dolly-v2-12b#API).

## Wrappers

### LLM

There exists a DeepInfra LLM wrapper, which you can access with:

```python
from langchain.llms import DeepInfra
```
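
As a rough usage sketch, the wrapper can be pointed at one of the models listed above. This assumes the `model_id` parameter and the `model_kwargs` attribute behave as shown; `databricks/dolly-v2-12b` is used here only as an example model, and the generation settings are illustrative.

```python
import os
from langchain.llms import DeepInfra

# Provide your DeepInfra API token (see the setup section above).
os.environ["DEEPINFRA_API_TOKEN"] = "<your-api-token>"

# Instantiate the wrapper with one of the supported text-generation models.
llm = DeepInfra(model_id="databricks/dolly-v2-12b")

# Optional generation parameters are forwarded to the DeepInfra API.
llm.model_kwargs = {"temperature": 0.7, "max_new_tokens": 250}

print(llm("What is the capital of France?"))
```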