-
I am in the process of migrating my chat app from LangServe to LangGraph and want to self-host it in Docker (as before). From the documentation I think it will be possible to run … What options do I have?
-
Also looking for a fully self-hosted, self-managed solution for deploying my LangGraph agents - no LangSmith monitoring desired. I'm researching this right now and will report back. In the past I used FastAPI to create an endpoint in which I called …
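For context, the general shape of that setup is something like the sketch below. It's only illustrative - the one-node graph, the `ChatOpenAI` model, and the `/chat` route are placeholders rather than anything LangGraph or FastAPI requires:

```python
from typing import Annotated, TypedDict

from fastapi import FastAPI
from langchain_openai import ChatOpenAI
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    # add_messages appends new messages instead of overwriting the list
    messages: Annotated[list, add_messages]


llm = ChatOpenAI(model="gpt-4o-mini")  # placeholder model


def chatbot(state: State) -> dict:
    # Single node: pass the running conversation to the model
    return {"messages": [llm.invoke(state["messages"])]}


builder = StateGraph(State)
builder.add_node("chatbot", chatbot)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)
graph = builder.compile()

app = FastAPI()


@app.post("/chat")
async def chat(message: str):
    # Invoke the compiled graph directly from the route handler
    result = await graph.ainvoke({"messages": [("user", message)]})
    return {"reply": result["messages"][-1].content}
```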
-
Self-Hosted Lite offers up to 1 million nodes executed per year for free. You can sign up for a Developer account on LangSmith; the Developer account is also free for up to 5k trace executions per month.

How-to guide: https://langchain-ai.github.io/langgraph/how-tos/deploy-self-hosted/
Up-to-date pricing information is here:

Deploying LangGraph Server requires both PostgreSQL and Redis. LangGraph Server uses a web-queue-worker architecture that maintains the queue even through application crashes or horizontal scaling.

LangGraph Server provides a built-in persistence layer that can be used for storing application data (e.g., chat history), supporting human-in-the-loop workflows, and recovering after an exception occurs in the middle of a workflow (i.e., resuming without repeating work that was already done).

LangGraph Server also has a number of additional features that you'll likely want when deploying applications to production, e.g., double-texting strategies, which handle the situation where a user sends a second message before the agent has finished responding to the first. You can read more about the feature set here: https://langchain-ai.github.io/langgraph/concepts/langgraph_platform/#why-use-langgraph-platform
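If you end up managing your own server rather than deploying LangGraph Server, the same persistence idea is available at the library level through checkpointers. A minimal sketch, assuming a throwaway one-node graph, a local Postgres connection string, and a made-up thread id:

```python
from typing import Annotated, TypedDict

from langgraph.checkpoint.postgres import PostgresSaver  # pip install langgraph-checkpoint-postgres
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages


class State(TypedDict):
    messages: Annotated[list, add_messages]


def echo(state: State) -> dict:
    # Throwaway node, just so there is something to checkpoint
    return {"messages": [("assistant", f"echo: {state['messages'][-1].content}")]}


builder = StateGraph(State)
builder.add_node("echo", echo)
builder.add_edge(START, "echo")
builder.add_edge("echo", END)

DB_URI = "postgresql://postgres:postgres@localhost:5432/postgres"  # placeholder

with PostgresSaver.from_conn_string(DB_URI) as checkpointer:
    checkpointer.setup()  # creates the checkpoint tables on first run
    graph = builder.compile(checkpointer=checkpointer)

    # One thread_id per conversation: the chat history lives in Postgres
    config = {"configurable": {"thread_id": "user-42"}}
    graph.invoke({"messages": [("user", "hello")]}, config)

    # A later call with the same thread_id picks up the saved state, which is
    # also what enables human-in-the-loop interrupts and crash recovery
    graph.invoke({"messages": [("user", "hello again")]}, config)
```

LangGraph Server wires this persistence up for you (which is part of why it needs Postgres); the sketch just shows what the layer does conceptually.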
-
@kisanmajumder I think you can just self-host Redis and Postgres in their own Docker containers alongside your FastAPI LangGraph API. This is what's done in a development environment when you use LangGraph Studio. Self-sovereignty demands fully self-hosted options. It looks like LangChain has very good Enterprise support - but even if less than 1% of users fall into this category, owning the full stack yourself is empowering.
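To make that concrete: assuming a docker-compose file (not shown) with a `postgres` service on the same network as the API container, the application side of the wiring could look roughly like the sketch below - the service name, env var, node, and route are all placeholders. Note that Redis only comes into play if you run LangGraph Server itself (it backs the task queue mentioned above); a hand-rolled FastAPI app that just uses the library's checkpointer needs only Postgres.

```python
import os
from contextlib import asynccontextmanager
from typing import Annotated, TypedDict

from fastapi import FastAPI, Request
from langgraph.checkpoint.postgres.aio import AsyncPostgresSaver
from langgraph.graph import StateGraph, START, END
from langgraph.graph.message import add_messages

# Placeholder: 'postgres' is the compose service name of the sibling container,
# resolvable on the shared Docker network; override via POSTGRES_URI.
DB_URI = os.environ.get(
    "POSTGRES_URI", "postgresql://postgres:postgres@postgres:5432/postgres"
)


class State(TypedDict):
    messages: Annotated[list, add_messages]


def respond(state: State) -> dict:
    # Placeholder node - swap in your real agent logic / model call here
    return {"messages": [("assistant", f"echo: {state['messages'][-1].content}")]}


builder = StateGraph(State)
builder.add_node("respond", respond)
builder.add_edge(START, "respond")
builder.add_edge("respond", END)


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Keep the checkpointer open for the app's lifetime and compile the graph once
    async with AsyncPostgresSaver.from_conn_string(DB_URI) as checkpointer:
        await checkpointer.setup()
        app.state.graph = builder.compile(checkpointer=checkpointer)
        yield


app = FastAPI(lifespan=lifespan)


@app.post("/chat/{thread_id}")
async def chat(thread_id: str, message: str, request: Request):
    # thread_id keys the conversation's state in Postgres
    config = {"configurable": {"thread_id": thread_id}}
    result = await request.app.state.graph.ainvoke(
        {"messages": [("user", message)]}, config
    )
    return {"reply": result["messages"][-1].content}
```

From there, Postgres (and Redis, if you later add LangGraph Server) are just ordinary compose services that you own and operate yourself.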