langgraph/tutorials/langgraph-platform/local-server/ #2527
Replies: 20 comments 20 replies
-
I tried setting it up using the
Any idea why it's failing to find the module?
-
I'm having a situation where
-
When using
Traceback (most recent call last):
File "<frozen runpy>", line 198, in _run_module_as_main
File "<frozen runpy>", line 88, in _run_code
File "D:\ANACONDA\envs\LangSmith\Scripts\langgraph.exe\__main__.py", line 7, in <module>
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\click\core.py", line 1157, in __call__
return self.main(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\click\core.py", line 1078, in main
rv = self.invoke(ctx)
^^^^^^^^^^^^^^^^
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\click\core.py", line 1688, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\click\core.py", line 1434, in invoke
return ctx.invoke(self.callback, **ctx.params)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\click\core.py", line 783, in invoke
return __callback(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\langgraph_cli\analytics.py", line 96, in decorator
return func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\langgraph_cli\cli.py", line 608, in dev
run_server(
File "D:\ANACONDA\envs\LangSmith\Lib\site-packages\langgraph_api\cli.py", line 221, in run_server
uvicorn.run(
TypeError: run() got an unexpected keyword argument 'auth'
This error occurs because the
Solution: To resolve this issue, modify the
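The explanation above is truncated, so purely as a hedged guess: this class of TypeError often comes from mismatched versions of langgraph-cli, langgraph-api, and uvicorn, and upgrading the CLI with its in-memory extra (pip install -U "langgraph-cli[inmem]") is a common first step. A small diagnostic sketch to print the versions actually installed in the active environment (the package names listed are my assumption of the relevant ones):

# Diagnostic sketch: print installed versions so they can be compared with PyPI.
from importlib.metadata import version, PackageNotFoundError

for pkg in ("langgraph-cli", "langgraph-api", "langgraph", "uvicorn"):
    try:
        print(pkg, version(pkg))
    except PackageNotFoundError:
        print(pkg, "not installed")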
-
Requirements: but the latest version of langchain-cli on PyPI is 0.0.35.
-
Hi, I followed all the steps mentioned in the documentation.
Ran this step after creating an env file with contents:
My LangGraph server is running, but it does not show any of the threads in the file. Error:
Am I missing something?
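Since the error text above is missing, one way to narrow things down is to list threads directly against the local server, bypassing Studio. This is only a sketch and assumes the dev server is running on the default http://127.0.0.1:2024:

# Sketch: list threads straight from the local LangGraph server via the SDK.
# Assumes `langgraph dev` is running on the default port 2024.
import asyncio
from langgraph_sdk import get_client

async def main():
    client = get_client(url="http://127.0.0.1:2024")
    threads = await client.threads.search()
    print(f"{len(threads)} thread(s) found")
    for thread in threads:
        print(thread["thread_id"], thread.get("status"))

asyncio.run(main())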
-
Mixed Content: The page at 'https://smith.langchain.com/studio/thread?baseUrl=http%3A%2F%2F0.0.0.0%3A2024' was loaded over HTTPS, but requested an insecure resource 'http://0.0.0.0:2024/assistants/search'. This request has been blocked; the content must be served over HTTPS.
It is impossible to load a local resource from an HTTPS page that is also on another domain. This can't work.
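Two hedged observations on this: Chrome treats http://localhost and http://127.0.0.1 as potentially trustworthy origins, so pointing the Studio baseUrl at 127.0.0.1 instead of 0.0.0.0 (https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024) sometimes avoids the mixed-content block; and independently of the browser, the blocked call can be reproduced from Python to confirm the server itself is fine. A sketch of the latter, assuming the default port:

# Sketch: call the same endpoint Studio was blocked from reaching, via the SDK.
# Assumes the dev server is listening on 127.0.0.1:2024.
import asyncio
from langgraph_sdk import get_client

async def main():
    client = get_client(url="http://127.0.0.1:2024")
    assistants = await client.assistants.search()
    print([a["assistant_id"] for a in assistants])

asyncio.run(main())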
-
Guys come on. This doesn't work. You put a quickstart tutorial on one of the most hyped tools at the moment and you don't have time to fix it. Come on guys!!!
-
What hook functions can be used to perform cleanup after langgraph-cli is shut down?
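I'm not aware of a documented shutdown hook exposed by langgraph-cli itself, so this is only a workaround sketch: because `langgraph dev` imports your graph module into its own process, a standard-library atexit handler registered in that module should fire when the process exits (an assumption about the in-memory server, not a documented guarantee):

# Workaround sketch: ordinary interpreter shutdown hook instead of a
# langgraph-cli-specific one. Registered at import time of the graph module.
import atexit

def cleanup():
    # Close connections, remove temp files, flush logs, etc.
    print("cleaning up after the dev server stops")

atexit.register(cleanup)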
-
Hey, I've followed all the steps of the process and after doing
-
After running
I see no correlation across the times that this happens versus not, but I do know it persists for a while. I've tried clearing cookies as well and that did not do the trick. I can see the langgraph server is up and I'm able to get responses from it from a Python client. Any idea how to fix this?
-
After running langgraph dev
(LangGraph ASCII banner)
This in-memory server is designed for development and testing.
2025-02-06T13:05:57.151330Z [info ] Will watch for changes in these directories: ['C:\Users\zbx\path\to\your\app'] [uvicorn.error] api_variant=local_dev
-
I followed the instructions above. I created a .env file with my OPENAI_API_KEY and OPENAI_API_BASE, and then changed the default model in configuration.py to 'openai/gpt-4-turbo-preview'. Is it because of the base URL? How do I change the base URL then?
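It may well be the base URL. A hedged sketch of passing it explicitly to the model instead of relying on the template picking up OPENAI_API_BASE (ChatOpenAI's base_url parameter is real; whether configuration.py wires the env var through depends on the template's own model-loading code):

# Sketch: point ChatOpenAI at a custom OpenAI-compatible endpoint explicitly.
# Assumes langchain-openai is installed and the env vars from the .env file are set.
import os
from langchain_openai import ChatOpenAI

model = ChatOpenAI(
    model="gpt-4-turbo-preview",
    base_url=os.environ["OPENAI_API_BASE"],  # e.g. https://your-gateway.example.com/v1
    api_key=os.environ["OPENAI_API_KEY"],
)
print(model.invoke("ping").content)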
-
Hi there, is it possible to get this to run with Python 3.10? Python 3.10 is the officially supported version at our company and our base images use Python 3.10.
-
GUYS GUYS GUYS! To fix the 'Failed to fetch' error: Chrome > Settings > Privacy and Security > Site Settings > Additional content settings > Insecure content > Allowed to show insecure content > Add smith.langchain.com
-
Team,
Attaching to langgraph-api-1, langgraph-postgres-1, langgraph-redis-1
Cleared all images, containers, and volumes but still get the error.
-
Create a proxy script with ngrok and set the API key in a file called config.env (the name the script loads below). Unfortunately the Studio, the way it works so far, is bad design, because you're basically allowing the LangSmith website to send and execute commands on your PC, which is A MAJOR SECURITY FLAW (both in design and security): DO NOT EVER LET A REMOTE SERVER execute code on your PC; this is very basic security stuff. My recommendation is to run LangGraph Studio as a Docker container connected to a docker-to-docker network with this proxy.
from flask import Flask, request, make_response
import requests
from pyngrok import ngrok
from dotenv import load_dotenv
import os

# Load environment variables from .env file
load_dotenv("config.env")

app = Flask(__name__)

# Target base URL for proxying (backend you're proxying to)
PROXY_BASE_URL = "http://127.0.0.1:2024"  # Replace with your langgraph docker URI

# Add CORS headers to allow everything
def add_cors_headers(response):
    response.headers['Access-Control-Allow-Origin'] = '*'  # Allow all origins
    response.headers['Access-Control-Allow-Credentials'] = 'true'  # Allow credentials
    response.headers['Access-Control-Allow-Headers'] = '*'  # Allow all headers
    response.headers['Access-Control-Allow-Methods'] = '*'  # Allow all HTTP methods
    return response

# Handle all routes and proxy requests
@app.route('/', defaults={'path': ''}, methods=['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'])
@app.route('/<path:path>', methods=['GET', 'POST', 'PUT', 'DELETE', 'OPTIONS'])
def proxy(path):
    # Handle preflight OPTIONS requests
    if request.method == 'OPTIONS':
        return add_cors_headers(make_response())

    # Build the proxied request to the target server
    target_url = f"{PROXY_BASE_URL}/{path}"
    try:
        response = requests.request(
            method=request.method,
            url=target_url,
            headers={key: value for key, value in request.headers.items() if key.lower() != 'host'},
            params=request.args,
            data=request.get_data(),
            cookies=request.cookies,
        )
        # Build the response with proxied content and status
        flask_response = make_response(response.content, response.status_code)
        # Skip hop-by-hop / encoding headers: requests already decodes the body,
        # so forwarding Content-Encoding or Content-Length would corrupt the response.
        excluded = {'content-encoding', 'content-length', 'transfer-encoding', 'connection'}
        for key, value in response.headers.items():
            if key.lower() not in excluded:
                flask_response.headers[key] = value
        # Add CORS headers
        return add_cors_headers(flask_response)
    except requests.RequestException as e:
        return make_response(f"Error proxying request: {e}", 502)

# Handle favicon requests to avoid unnecessary errors
@app.route('/favicon.ico')
def favicon():
    return '', 204  # Return an empty response with "No Content" status

if __name__ == '__main__':
    # Retrieve the ngrok authtoken from the .env file
    ngrok_authtoken = os.getenv("NGROK_AUTHTOKEN")
    # Authenticate ngrok with your authtoken
    ngrok.set_auth_token(ngrok_authtoken)
    # Start ngrok tunnel
    public_url = ngrok.connect(8123)
    print(f"ngrok tunnel: {public_url}")
    # Start Flask app
    app.run(host="0.0.0.0", port=8123)

Once launched, open the ngrok URL to validate the tunneling. Thank me later.
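A small follow-up sketch (the tunnel hostname is a placeholder, not output from the script above): Studio then needs the ngrok URL in the same baseUrl query parameter used elsewhere in this thread.

# Sketch: build the Studio URL for the ngrok tunnel started by the proxy script.
# Replace the placeholder with the public URL ngrok actually prints.
from urllib.parse import quote

ngrok_url = "https://your-tunnel.ngrok-free.app"  # placeholder
studio_url = "https://smith.langchain.com/studio/?baseUrl=" + quote(ngrok_url, safe="")
print(studio_url)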
-
There's no UI button to create new nodes.
-
For anyone struggling with the CORS-blocks-langgraph-studio-from-accessing-a-locally-deployed-langgraph-server problem, I've just posted a slightly simpler approach using nginx as a reverse proxy to add the missing Access-Control-XXXX headers needed for CORS to work in Chrome. It's simpler in the sense that it:
Ultimately it achieves a similar outcome to martinobettuchi's solution above, so choose whichever is more familiar or you prefer.
-
Hi everyone, LangGraph project not loading after setup. After cloning the project and setting up the .env file, I tried to start the project, but the UI stays stuck on the "Starting new-langgraphjs-project" screen (as shown in the screenshot below). No TypeScript or JavaScript templates are being opened or rendered.
I see logs like:
● Starting
Target port: 3000
Steps taken: Cloned the repository. Added a .env file with the correct values. Ran the project using the provided instructions.
Expected behavior: A template (or the LangGraph builder) should open and allow me to build or view a graph.
Actual behavior: Just a loading screen, nothing else happens.
-
I've set up langgraph per the instructions: created an app from the template, cd'd into the app path, set the API keys in .env, and then ran
I checked the log of the server program, shown below, which showed either 404 or 200 responses. Is there anything wrong with my setup?
(1) When I open http://localhost:2024/docs in a web browser, the page is blank, and the server log is:
(2) http://localhost:2024/, the page shows Not Found, and the server log is:
(3) https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024, and the server log is:
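For what it's worth, a 404 on / may simply mean the server has no root route, so it isn't necessarily a sign that something is broken. A quick sketch to see the raw status codes outside the browser, using only the URLs mentioned above and assuming port 2024:

# Sketch: fetch the same local URLs from Python to compare status codes.
import requests

for path in ("/", "/docs"):
    url = f"http://localhost:2024{path}"
    try:
        r = requests.get(url, timeout=5)
        print(url, r.status_code, r.headers.get("content-type"))
    except requests.RequestException as e:
        print(url, "error:", e)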
-
langgraph/tutorials/langgraph-platform/local-server/
Build language agents as graphs
https://langchain-ai.github.io/langgraph/tutorials/langgraph-platform/local-server/