diff --git a/src/docs.json b/src/docs.json index 5754c04b4..a28197750 100644 --- a/src/docs.json +++ b/src/docs.json @@ -869,7 +869,7 @@ { "group": "Releases & changelogs", "pages": [ - "langsmith/langgraph-server-changelog", + "langsmith/agent-server-changelog", "langsmith/release-versions" ] }, @@ -1191,7 +1191,7 @@ "group": "Deployment components", "pages": [ "langsmith/components", - "langsmith/langgraph-server", + "langsmith/agent-server", "langsmith/data-plane", "langsmith/control-plane" ] @@ -1677,6 +1677,14 @@ { "source": "oss/javascript/langchain/OUTPUT_PARSING_FAILURE", "destination": "oss/javascript/langchain/errors/OUTPUT_PARSING_FAILURE" + }, + { + "source": "/langsmith/langgraph-server", + "destination": "/langsmith/agent-server" + }, + { + "source": "/langsmith/langgraph-server-changelog", + "destination": "/langsmith/agent-server-changelog" } ] } diff --git a/src/langsmith/add-auth-server.mdx b/src/langsmith/add-auth-server.mdx index b3d138027..2e33c4b7b 100644 --- a/src/langsmith/add-auth-server.mdx +++ b/src/langsmith/add-auth-server.mdx @@ -270,7 +270,7 @@ You've successfully built a production-ready authentication system for your Lang 1. Set up an authentication provider (Supabase in this case) 2. Added real user accounts with email/password authentication -3. Integrated JWT token validation into your LangGraph server +3. Integrated JWT token validation into your Agent Server 4. Implemented proper authorization to ensure users can only access their own data 5. Created a foundation that's ready to handle your next authentication challenge 🚀 diff --git a/src/langsmith/add-human-in-the-loop.mdx b/src/langsmith/add-human-in-the-loop.mdx index aa6cdaf93..b3c05ec47 100644 --- a/src/langsmith/add-human-in-the-loop.mdx +++ b/src/langsmith/add-human-in-the-loop.mdx @@ -133,7 +133,7 @@ To review, edit, and approve tool calls in an agent or workflow, use LangGraph's - This is an example graph you can run in the LangGraph API server. 
+ This is an example graph you can run in the Agent Server. See [LangSmith quickstart](/langsmith/deployment-quickstart) for more details. ```python {highlight={7,13}} @@ -171,7 +171,7 @@ To review, edit, and approve tool calls in an agent or workflow, use LangGraph's 2. Any JSON serializable value can be passed to the @[`interrupt`] function. Here, a dict containing the text to revise. 3. Once resumed, the return value of `interrupt(...)` is the human-provided input, which is used to update the state. - Once you have a running LangGraph API server, you can interact with it using + Once you have a running Agent Server, you can interact with it using [LangGraph SDK](/langsmith/langgraph-python-sdk) diff --git a/src/langsmith/langgraph-server-changelog.mdx b/src/langsmith/agent-server-changelog.mdx similarity index 98% rename from src/langsmith/langgraph-server-changelog.mdx rename to src/langsmith/agent-server-changelog.mdx index e7864570a..0d377234d 100644 --- a/src/langsmith/langgraph-server-changelog.mdx +++ b/src/langsmith/agent-server-changelog.mdx @@ -1,9 +1,9 @@ --- -title: LangGraph Server changelog -sidebarTitle: LangGraph Server changelog +title: Agent Server changelog +sidebarTitle: Agent Server changelog --- -[LangGraph Server](/langsmith/langgraph-server) is an API platform for creating and managing agent-based applications. It provides built-in persistence, a task queue, and supports deploying, configuring, and running assistants (agentic workflows) at scale. This changelog documents all notable updates, features, and fixes to LangGraph Server releases. +[Agent Server](/langsmith/agent-server) is an API platform for creating and managing agent-based applications. It provides built-in persistence, a task queue, and supports deploying, configuring, and running assistants (agentic workflows) at scale. This changelog documents all notable updates, features, and fixes to Agent Server releases. 
## v0.5.4 diff --git a/src/langsmith/agent-server.mdx b/src/langsmith/agent-server.mdx new file mode 100644 index 000000000..6e3f399cb --- /dev/null +++ b/src/langsmith/agent-server.mdx @@ -0,0 +1,59 @@ +--- +title: Agent Server +--- + +LangSmith Deployment's **Agent Server** offers an API for creating and managing agent-based applications. It is built on the concept of [assistants](/langsmith/assistants), which are agents configured for specific tasks, and includes built-in [persistence](/oss/langgraph/persistence#memory-store) and a **task queue**. This versatile API supports a wide range of agentic application use cases, from background processing to real-time interactions. + +Use Agent Server to create and manage [assistants](/langsmith/assistants), [threads](/oss/langgraph/persistence#threads), [runs](/langsmith/assistants#execution), [cron jobs](/langsmith/cron-jobs), [webhooks](/langsmith/use-webhooks), and more. + + +**API reference**

+For detailed information on the API endpoints and data models, refer to the [API reference docs](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html). +
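For a concrete sense of the request shapes involved, the sketch below builds the JSON body for creating a background run. The assistant ID `"agent"` and the message shape are assumptions for illustration only; consult the API reference above for the authoritative schema.

```python
import json

# Illustrative only: a request body for creating a background run via
# POST /threads/{thread_id}/runs on a running Agent Server. "agent" is the
# default assistant created for a graph registered under that name.
payload = {
    "assistant_id": "agent",
    "input": {"messages": [{"role": "user", "content": "Summarize my open threads"}]},
}

body = json.dumps(payload)
print(json.loads(body)["assistant_id"])  # agent
```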
+ +To use the Enterprise version of the Agent Server, you must acquire a license key that you will need to specify when running the Docker image. To acquire a license key, [contact our sales team](https://www.langchain.com/contact-sales). + +You can run the Enterprise version of the Agent Server on the following LangSmith [platform](/langsmith/platform-setup) options: + +- [Cloud](/langsmith/cloud) +- [Hybrid](/langsmith/hybrid) +- [Self-hosted](/langsmith/self-hosted) + +## Application structure + +To deploy an Agent Server application, you need to specify the graph(s) you want to deploy, as well as any relevant configuration settings, such as dependencies and environment variables. + +Read the [application structure](/langsmith/application-structure) guide to learn how to structure your LangGraph application for deployment. + +## Parts of a deployment + +When you deploy Agent Server, you are deploying one or more [graphs](#graphs), a database for [persistence](/oss/langgraph/persistence), and a task queue. + +### Graphs + +When you deploy a graph with Agent Server, you are deploying a "blueprint" for an [Assistant](/langsmith/assistants). + +An [Assistant](/langsmith/assistants) is a graph paired with specific configuration settings. You can create multiple assistants per graph, each with unique settings to accommodate different use cases +that can be served by the same graph. + +Upon deployment, Agent Server will automatically create a default assistant for each graph using the graph's default configuration settings. + + +We often think of a graph as implementing an [agent](/oss/langgraph/workflows-agents), but a graph does not necessarily need to implement an agent. For example, a graph could implement a simple +chatbot that only supports back-and-forth conversation, without the ability to influence any application control flow. 
In reality, as applications grow more complex, a graph will often implement a more sophisticated flow that may use [multiple agents](/oss/langchain/multi-agent) working in tandem. + + +### Persistence and task queue + +Agent Server leverages a database for [persistence](/oss/langgraph/persistence) and a task queue. + +[PostgreSQL](https://www.postgresql.org/) is supported as a database for Agent Server and [Redis](https://redis.io/) as the task queue. + +If you're deploying using [LangSmith cloud](/langsmith/cloud), these components are managed for you. If you're deploying Agent Server on your [own infrastructure](/langsmith/self-hosted), you'll need to set up and manage these components yourself. + +For more information on how these components are set up and managed, review the [hosting options](/langsmith/platform-setup) guide. + +## Learn more + +- The [Application Structure](/langsmith/application-structure) guide explains how to structure your application for deployment. +- The [API Reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html) provides detailed information on the API endpoints and data models. diff --git a/src/langsmith/api-ref-control-plane.mdx b/src/langsmith/api-ref-control-plane.mdx index 1b753bc04..e4b613c6e 100644 --- a/src/langsmith/api-ref-control-plane.mdx +++ b/src/langsmith/api-ref-control-plane.mdx @@ -3,7 +3,7 @@ title: Control plane API reference for LangSmith Deployment sidebarTitle: Control plane API --- -The control plane API is part of [LangSmith Deployment](/langsmith/deployments). With the control plane API, you can programmatically create, manage, and automate your [LangGraph Server](/langsmith/langgraph-server) deployments—for example, as part of a custom CI/CD workflow. +The control plane API is part of [LangSmith Deployment](/langsmith/deployments). 
With the control plane API, you can programmatically create, manage, and automate your [Agent Server](/langsmith/agent-server) deployments—for example, as part of a custom CI/CD workflow. View the full Control Plane API reference documentation diff --git a/src/langsmith/assistants.mdx b/src/langsmith/assistants.mdx index 198132f52..008aafba8 100644 --- a/src/langsmith/assistants.mdx +++ b/src/langsmith/assistants.mdx @@ -19,9 +19,9 @@ Assistants are a [LangSmith](/langsmith/home) concept. They are not available in Assistants build on the LangGraph open source concept of [configuration](/oss/langgraph/graph-api#runtime-context). -While configuration is available in the open source LangGraph library, assistants are only present in [LangSmith](/langsmith/home). This is due to the fact that assistants are tightly coupled to your deployed graph. Upon deployment, LangGraph Server will automatically create a default assistant for each graph using the graph's default configuration settings. +While configuration is available in the open source LangGraph library, assistants are only present in [LangSmith](/langsmith/home). This is due to the fact that assistants are tightly coupled to your deployed graph. Upon deployment, Agent Server will automatically create a default assistant for each graph using the graph's default configuration settings. -In practice, an assistant is just an _instance_ of a graph with a specific configuration. Therefore, multiple assistants can reference the same graph but can contain different configurations (e.g. prompts, models, tools). The LangGraph Server API provides several endpoints for creating and managing assistants. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) and [this how-to](/langsmith/configuration-cloud) for more details on how to create assistants. +In practice, an assistant is just an _instance_ of a graph with a specific configuration. 
Therefore, multiple assistants can reference the same graph but can contain different configurations (e.g. prompts, models, tools). The LangSmith Deployment API provides several endpoints for creating and managing assistants. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) and [this how-to](/langsmith/configuration-cloud) for more details on how to create assistants. ## Versioning diff --git a/src/langsmith/cicd-pipeline-example.mdx b/src/langsmith/cicd-pipeline-example.mdx index 201ef0b63..6448e187d 100644 --- a/src/langsmith/cicd-pipeline-example.mdx +++ b/src/langsmith/cicd-pipeline-example.mdx @@ -93,7 +93,7 @@ The workflow includes: ![Agent Deployment Revision Workflow](./images/cicd-new-lgp-revision.png) -- **Testing and evaluation workflow**: In addition to the more traditional testing phases (unit tests, integration tests, end-to-end tests, etc.), the pipeline includes [offline evaluations](/langsmith/evaluation-concepts#offline-evaluation) and [LangGraph dev server testing](/langsmith/local-server) because you want to test the quality of your agent. These evaluations provide comprehensive assessment of the agent's performance using real-world scenarios and data. +- **Testing and evaluation workflow**: In addition to the more traditional testing phases (unit tests, integration tests, end-to-end tests, etc.), the pipeline includes [offline evaluations](/langsmith/evaluation-concepts#offline-evaluation) and [Agent dev server testing](/langsmith/local-server) because you want to test the quality of your agent. These evaluations provide comprehensive assessment of the agent's performance using real-world scenarios and data. 
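As a sketch of such a pipeline phase, a CI job might run the traditional test suites followed by offline evaluations before any deployment step. The job name, commands, and evaluation script below are assumptions for illustration, not a prescribed layout:

```yaml
# Illustrative GitHub Actions job; adapt names and commands to your pipeline.
test-and-evaluate:
  runs-on: ubuntu-latest
  steps:
    - uses: actions/checkout@v4
    - uses: actions/setup-python@v5
      with:
        python-version: "3.12"
    - run: pip install -U "langgraph-cli[inmem]" -r requirements.txt
    - run: pytest tests/unit tests/integration   # traditional test phases
    - run: python evals/run_offline_evals.py     # offline agent-quality evaluations
```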
![Test with Results Workflow](./images/cicd-test-with-results.png) @@ -207,12 +207,12 @@ Example `langgraph.json`: ### Local development and testing -![LangGraph Studio CLI Interface](./images/cicd-studio-cli.png) +![Studio CLI Interface](./images/cicd-studio-cli.png) First, test your agent locally using [Studio](/langsmith/studio): ```bash -# Start local development server with LangGraph Studio +# Start local development server with Studio langgraph dev ``` diff --git a/src/langsmith/cli.mdx b/src/langsmith/cli.mdx index b88ee03e5..716801b6f 100644 --- a/src/langsmith/cli.mdx +++ b/src/langsmith/cli.mdx @@ -3,7 +3,7 @@ title: LangGraph CLI sidebarTitle: LangGraph CLI --- -**LangGraph CLI** is a command-line tool for building and running the [LangGraph API server](/langsmith/langgraph-server) locally. The resulting server exposes all API endpoints for runs, threads, assistants, etc., and includes supporting services such as a managed database for checkpointing and storage. +**LangGraph CLI** is a command-line tool for building and running the [Agent Server](/langsmith/agent-server) locally. The resulting server exposes all API endpoints for runs, threads, assistants, etc., and includes supporting services such as a managed database for checkpointing and storage. ## Installation @@ -71,7 +71,7 @@ To build and run a valid application, the LangGraph CLI requires a JSON configur | `dockerfile_lines` | Array of additional lines to add to Dockerfile following the import from parent image. | | `checkpointer` | Configuration for the checkpointer. Supports:
  • `ttl` (optional): Object with `strategy`, `sweep_interval_minutes`, `default_ttl` controlling checkpoint expiry.
  • `serde` (optional, 0.5+): Object with `allowed_json_modules` and `pickle_fallback` to tune deserialization behavior.
| | `http` | HTTP server configuration with the following fields:
  • `app`: Path to custom Starlette/FastAPI app (e.g., `"./src/agent/webapp.py:app"`). See [custom routes guide](/langsmith/custom-routes).
  • `cors`: CORS configuration with fields such as `allow_origins`, `allow_methods`, `allow_headers`, `allow_credentials`, `allow_origin_regex`, `expose_headers`, and `max_age`.
  • `configurable_headers`: Define which request headers to expose as configurable values via `includes` / `excludes` patterns.
  • `logging_headers`: Mirror of `configurable_headers` for excluding sensitive headers from logs.
  • `middleware_order`: Choose how custom middleware and auth interact. `auth_first` runs authentication hooks before custom middleware, while `middleware_first` (default) runs your middleware first.
  • `enable_custom_route_auth`: Apply auth checks to routes added through `app`.
  • `disable_assistants`, `disable_mcp`, `disable_meta`, `disable_runs`, `disable_store`, `disable_threads`, `disable_ui`, `disable_webhooks`: Disable built-in routes or hooks.
  • `mount_prefix`: Prefix for mounted routes (e.g., "/my-deployment/api").
| - | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langsmith/langgraph-server-changelog) for details on each release. | + | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langsmith/agent-server-changelog) for details on each release. | | Key | Description | @@ -83,7 +83,7 @@ To build and run a valid application, the LangGraph CLI requires a JSON configur | `dockerfile_lines` | Array of additional lines to add to Dockerfile following the import from parent image. | | `checkpointer` | Configuration for the checkpointer. Supports:
  • `ttl` (optional): Object with `strategy`, `sweep_interval_minutes`, `default_ttl` controlling checkpoint expiry.
  • `serde` (optional, 0.5+): Object with `allowed_json_modules` and `pickle_fallback` to tune deserialization behavior.
| | `http` | HTTP server configuration mirroring the Python options:
  • `cors` with `allow_origins`, `allow_methods`, `allow_headers`, `allow_credentials`, `allow_origin_regex`, `expose_headers`, `max_age`.
  • `configurable_headers` and `logging_headers` pattern lists.
  • `middleware_order` (`auth_first` or `middleware_first`).
  • `enable_custom_route_auth` plus the same boolean route toggles as above.
| - | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langsmith/langgraph-server-changelog) for details on each release. | + | `api_version` | _(Added in v0.3.7)_ Which semantic version of the LangGraph API server to use (e.g., `"0.3"`). Defaults to latest. Check the server [changelog](/langsmith/agent-server-changelog) for details on each release. |
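As a sketch of how these keys combine in a `langgraph.json`, the fragment below pins the API version and sets a 30-day checkpoint TTL (43200 minutes). The graph path and the `strategy` value are illustrative assumptions, not prescribed values:

```json
{
  "dependencies": ["."],
  "graphs": {
    "agent": "./src/agent/graph.py:graph"
  },
  "checkpointer": {
    "ttl": {
      "strategy": "delete",
      "sweep_interval_minutes": 60,
      "default_ttl": 43200
    }
  },
  "api_version": "0.3"
}
```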
@@ -250,7 +250,7 @@ To build and run a valid application, the LangGraph CLI requires a JSON configur You can configure the time-to-live (TTL) for checkpoints using the `checkpointer` key. This determines how long checkpoint data is retained before being automatically handled according to the specified strategy (e.g., deletion). Two optional sub-objects are supported: * `ttl`: Includes `strategy`, `sweep_interval_minutes`, and `default_ttl`, which collectively set how checkpoints expire. - * `serde` _(LangGraph server 0.5+)_ : Lets you control deserialization behavior for checkpoint payloads. + * `serde` _(Agent server 0.5+)_ : Lets you control deserialization behavior for checkpoint payloads. Here's an example setting a default TTL of 30 days (43200 minutes): @@ -306,7 +306,7 @@ To build and run a valid application, the LangGraph CLI requires a JSON configur _(Added in v0.3.7)_ - You can pin the API version of the LangGraph server by using the `api_version` key. This is useful if you want to ensure that your server uses a specific version of the API. + You can pin the API version of the Agent Server by using the `api_version` key. This is useful if you want to ensure that your server uses a specific version of the API. By default, builds in Cloud deployments use the latest stable version of the server. This can be pinned by setting the `api_version` key to a specific version. ```json @@ -339,7 +339,7 @@ To build and run a valid application, the LangGraph CLI requires a JSON configur _(Added in v0.3.7)_ - You can pin the API version of the LangGraph server by using the `api_version` key. This is useful if you want to ensure that your server uses a specific version of the API. + You can pin the API version of the Agent Server by using the `api_version` key. This is useful if you want to ensure that your server uses a specific version of the API. By default, builds in Cloud deployments use the latest stable version of the server. 
This can be pinned by setting the `api_version` key to a specific version. ```json diff --git a/src/langsmith/cloud.mdx b/src/langsmith/cloud.mdx index 7ae0d176d..35a21c65b 100644 --- a/src/langsmith/cloud.mdx +++ b/src/langsmith/cloud.mdx @@ -19,10 +19,10 @@ The **Cloud** option is a fully managed model where LangChain hosts and operates | | **Who manages it** | **Where it runs** | |-------------------|-------------------|-------------------| | **LangSmith platform (UI, APIs, datastores)** | LangChain | LangChain's cloud | -| **Your LangGraph Servers** | LangChain | LangChain's cloud | +| **Your Agent Servers** | LangChain | LangChain's cloud | | **CI/CD for your apps** | LangChain | LangChain's cloud | -![Cloud deployment: LangChain hosts and manages all components including the UI, APIs, and your LangGraph Servers.](/langsmith/images/langgraph-cloud-architecture.png) +![Cloud deployment: LangChain hosts and manages all components including the UI, APIs, and your Agent Servers.](/langsmith/images/langgraph-cloud-architecture.png) ## Get started diff --git a/src/langsmith/components.mdx b/src/langsmith/components.mdx index d7c7b5d7c..17be90fd7 100644 --- a/src/langsmith/components.mdx +++ b/src/langsmith/components.mdx @@ -1,5 +1,5 @@ --- -title: LangSmith components +title: LangSmith Deployment components sidebarTitle: Overview mode: wide --- @@ -9,17 +9,17 @@ When running self-hosted [LangSmith Deployment](/langsmith/deploy-self-hosted-fu ```mermaid flowchart subgraph LangSmith Deployment - A[LangGraph CLI] -->|creates| B(LangGraph Server deployment) + A[LangGraph CLI] -->|creates| B(Agent Server deployment) B <--> D[Studio] B <--> E[SDKs] B <--> F[RemoteGraph] end ``` -- [LangGraph Server](/langsmith/langgraph-server): Defines an opinionated API and runtime for deploying graphs and agents. Handles execution, state management, and persistence so you can focus on building logic rather than server infrastructure. 
+- [Agent Server](/langsmith/agent-server): Defines an opinionated API and runtime for deploying graphs and agents. Handles execution, state management, and persistence so you can focus on building logic rather than server infrastructure. - [LangGraph CLI](/langsmith/cli): A command-line interface to build, package, and interact with graphs locally and prepare them for deployment. -- [Studio](/langsmith/studio): A specialized IDE for visualization, interaction, and debugging. Connects to a local LangGraph Server for developing and testing your graph. +- [Studio](/langsmith/studio): A specialized IDE for visualization, interaction, and debugging. Connects to a local Agent Server for developing and testing your graph. - [Python/JS SDK](/langsmith/sdk): The Python/JS SDK provides a programmatic way to interact with deployed graphs and agents from your applications. - [RemoteGraph](/langsmith/use-remote-graph): Allows you to interact with a deployed graph as though it were running locally. -- [Control Plane](/langsmith/control-plane): The UI and APIs for creating, updating, and managing LangGraph Server deployments. -- [Data plane](/langsmith/data-plane): The runtime layer that executes your graphs, including LangGraph Servers, their backing services (PostgreSQL, Redis, etc.), and the listener that reconciles state from the control plane. +- [Control Plane](/langsmith/control-plane): The UI and APIs for creating, updating, and managing Agent Server deployments. +- [Data plane](/langsmith/data-plane): The runtime layer that executes your graphs, including Agent Servers, their backing services (PostgreSQL, Redis, etc.), and the listener that reconciles state from the control plane. 
diff --git a/src/langsmith/control-plane.mdx b/src/langsmith/control-plane.mdx index fad3994c3..c1f8cc789 100644 --- a/src/langsmith/control-plane.mdx +++ b/src/langsmith/control-plane.mdx @@ -3,7 +3,7 @@ title: LangSmith control plane sidebarTitle: Control plane --- -The _control plane_ is the part of LangSmith that manages deployments. It includes the control plane UI, where users create and update [LangGraph Servers](/langsmith/langgraph-server), and the control plane APIs, which support the UI and provide programmatic access. +The _control plane_ is the part of LangSmith that manages deployments. It includes the control plane UI, where users create and update [Agent Servers](/langsmith/agent-server), and the control plane APIs, which support the UI and provide programmatic access. When you make an update through the control plane, the update is stored in control plane state. The [data plane](/langsmith/data-plane) “listener” polls for these updates by calling the control plane APIs. @@ -32,7 +32,7 @@ An integration is an abstraction for a `git` repository provider (e.g. GitHub). ### Deployments -A deployment is an instance of a LangGraph Server. A single deployment can have many revisions. +A deployment is an instance of an Agent Server. A single deployment can have many revisions. ### Revisions @@ -87,7 +87,7 @@ Resources for `Production` type deployments can be manually increased on a case- * Postgres connection timeouts/errors * Failed or retrying background runs -This behavior is expected. Preemptible compute infrastructure **significantly reduces the cost to provision a `Development` type deployment**. By design, LangGraph Server is fault-tolerant. The implementation will automatically attempt to recover from Redis/Postgres connection errors and retry failed background runs. +This behavior is expected. Preemptible compute infrastructure **significantly reduces the cost to provision a `Development` type deployment**. 
By design, Agent Server is fault-tolerant. The implementation will automatically attempt to recover from Redis/Postgres connection errors and retry failed background runs. `Production` type deployments are provisioned on durable compute infrastructure, not preemptible compute infrastructure. @@ -100,7 +100,7 @@ The control plane and [data plane](/langsmith/data-plane) "listener" application When implementing a LangGraph application, a [checkpointer](/oss/langgraph/persistence#checkpointer-libraries) does not need to be configured by the developer. Instead, a checkpointer is automatically configured for the graph. Any checkpointer configured for a graph will be replaced by the one that is automatically configured. -There is no direct access to the database. All access to the database occurs through the [LangGraph Server](/langsmith/langgraph-server). +There is no direct access to the database. All access to the database occurs through the [Agent Server](/langsmith/agent-server). The database is never deleted until the deployment itself is deleted. @@ -126,8 +126,8 @@ After a deployment is ready, the control plane monitors the deployment and recor * Number of container restarts. * Number of replicas (this will increase with [autoscaling](/langsmith/data-plane#autoscaling)). * [PostgreSQL](/langsmith/data-plane#postgres) CPU, memory usage, and disk usage. -* [LangGraph Server queue](/langsmith/langgraph-server#persistence-and-task-queue) pending/active run count. -* [LangGraph Server API](/langsmith/langgraph-server) success response count, error response count, and latency. +* [Agent Server queue](/langsmith/agent-server#persistence-and-task-queue) pending/active run count. +* [Agent Server API](/langsmith/agent-server) success response count, error response count, and latency. These metrics are displayed as charts in the Control Plane UI. 
diff --git a/src/langsmith/cron-jobs.mdx b/src/langsmith/cron-jobs.mdx index 02bc97d56..9f93a58c0 100644 --- a/src/langsmith/cron-jobs.mdx +++ b/src/langsmith/cron-jobs.mdx @@ -15,7 +15,7 @@ LangSmith Deployment supports cron jobs, which run on a user-defined schedule. T Note that this sends the same input to the thread every time. -The LangGraph Server API provides several endpoints for creating and managing cron jobs. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) for more details. +The LangSmith Deployment API provides several endpoints for creating and managing cron jobs. See the [API reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/) for more details. Sometimes you don't want to run your graph based on user interaction, but rather you would like to schedule your graph to run on a schedule - for example if you wish for your graph to compose and send out a weekly email of to-dos for your team. LangSmith Deployment allows you to do this without having to write your own script by using the `Crons` client. To schedule a graph job, you need to pass a [cron expression](https://crontab.cronhub.io/) to inform the client when you want to run the graph. `Cron` jobs are run in the background and do not interfere with normal invocations of the graph. diff --git a/src/langsmith/custom-routes.mdx b/src/langsmith/custom-routes.mdx index 7fa19e2ff..4ffb1f19e 100644 --- a/src/langsmith/custom-routes.mdx +++ b/src/langsmith/custom-routes.mdx @@ -6,7 +6,7 @@ When deploying agents to LangSmith Deployment, your server automatically exposes You can add custom routes by providing your own [`Starlette`](https://www.starlette.io/applications/) app (including [`FastAPI`](https://fastapi.tiangolo.com/), [`FastHTML`](https://fastht.ml/) and other compatible apps). You make LangSmith aware of this by providing a path to the app in your `langgraph.json` configuration file. 
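As an illustration, the relevant `langgraph.json` entry might look like the following, where the module path points at your app object (the paths shown are placeholders):

```json
{
  "graphs": {
    "agent": "./src/agent/graph.py:graph"
  },
  "http": {
    "app": "./src/agent/webapp.py:app"
  }
}
```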
-Defining a custom app object lets you add any routes you'd like, so you can do anything from adding a `/login` endpoint to writing an entire full-stack web-app, all deployed in a single LangGraph Server. +Defining a custom app object lets you add any routes you'd like, so you can do anything from adding a `/login` endpoint to writing an entire full-stack web-app, all deployed in a single Agent Server. Below is an example using FastAPI. diff --git a/src/langsmith/data-plane.mdx b/src/langsmith/data-plane.mdx index aefc5836e..ba19ffcdd 100644 --- a/src/langsmith/data-plane.mdx +++ b/src/langsmith/data-plane.mdx @@ -3,11 +3,11 @@ title: LangSmith data plane sidebarTitle: Data plane --- -The _data plane_ consists of your [LangGraph Servers](/langsmith/langgraph-server) (deployments), their supporting infrastructure, and the "listener" application that continuously polls for updates from the [LangSmith control plane](/langsmith/control-plane). +The _data plane_ consists of your [Agent Servers](/langsmith/agent-server) (deployments), their supporting infrastructure, and the "listener" application that continuously polls for updates from the [LangSmith control plane](/langsmith/control-plane). ## Server infrastructure -In addition to the [LangGraph Server](/langsmith/langgraph-server) itself, the following infrastructure components for each server are also included in the broad definition of "data plane": +In addition to the [Agent Server](/langsmith/agent-server) itself, the following infrastructure components for each server are also included in the broad definition of "data plane": - **PostgreSQL**: persistence layer for user, run, and memory data. - **Redis**: communication and ephemeral metadata for workers. @@ -26,15 +26,15 @@ In other words, the data plane "listener" reads the latest state of the control ## PostgreSQL -PostgreSQL is the persistence layer for all user, run, and long-term memory data in a LangGraph Server. 
This stores both checkpoints (see more info [here](/oss/langgraph/persistence)), server resources (threads, runs, assistants and crons), as well as items saved in the long-term memory store (see more info [here](/oss/langgraph/persistence#memory-store)). +PostgreSQL is the persistence layer for all user, run, and long-term memory data in an Agent Server. This stores both checkpoints (see more info [here](/oss/langgraph/persistence)), server resources (threads, runs, assistants and crons), as well as items saved in the long-term memory store (see more info [here](/oss/langgraph/persistence#memory-store)). ## Redis -Redis is used in each LangGraph Server as a way for server and queue workers to communicate, and to store ephemeral metadata. No user or run data is stored in Redis. +Redis is used in each Agent Server as a way for server and queue workers to communicate, and to store ephemeral metadata. No user or run data is stored in Redis. ### Communication -All runs in a LangGraph Server are executed by a pool of background workers that are part of each deployment. In order to enable some features for those runs (such as cancellation and output streaming) we need a channel for two-way communication between the server and the worker handling a particular run. We use Redis to organize that communication. +All runs in an Agent Server are executed by a pool of background workers that are part of each deployment. In order to enable some features for those runs (such as cancellation and output streaming) we need a channel for two-way communication between the server and the worker handling a particular run. We use Redis to organize that communication. 1. A Redis list is used as a mechanism to wake up a worker as soon as a new run is created. Only a sentinel value is stored in this list, no actual run information. The run information is then retrieved from PostgreSQL by the worker. 2. 
A combination of a Redis string and Redis PubSub channel is used for the server to communicate a run cancellation request to the appropriate worker. @@ -42,7 +42,7 @@ All runs in a LangGraph Server are executed by a pool of background workers that ### Ephemeral metadata -Runs in a LangGraph Server may be retried for specific failures (currently only for transient PostgreSQL errors encountered during the run). In order to limit the number of retries (currently limited to 3 attempts per run) we record the attempt number in a Redis string when it is picked up. This contains no run-specific info other than its ID, and expires after a short delay. +Runs in an Agent Server may be retried for specific failures (currently only for transient PostgreSQL errors encountered during the run). In order to limit the number of retries (currently limited to 3 attempts per run) we record the attempt number in a Redis string when it is picked up. This contains no run-specific info other than its ID, and expires after a short delay. ## Data plane features @@ -125,7 +125,7 @@ Multiple deployments can share the same Redis instance. For example, for `Deploy ### LangSmith tracing -LangGraph Server is automatically configured to send traces to LangSmith. See the table below for details with respect to each deployment option. +Agent Server is automatically configured to send traces to LangSmith. See the table below for details with respect to each deployment option. | Cloud | Hybrid | Self-Hosted | |------------|------------------------|----------------------| @@ -133,7 +133,7 @@ LangGraph Server is automatically configured to send traces to LangSmith. See th ### Telemetry -LangGraph Server is automatically configured to report telemetry metadata for billing purposes. See the table below for details with respect to each deployment option. +Agent Server is automatically configured to report telemetry metadata for billing purposes. 
See the table below for details with respect to each deployment option. | Cloud | Hybrid | Self-Hosted | |------------|------------------------|----------------------| @@ -141,7 +141,7 @@ LangGraph Server is automatically configured to report telemetry metadata for bi ### Licensing -LangGraph Server is automatically configured to perform license key validation. See the table below for details with respect to each deployment option. +Agent Server is automatically configured to perform license key validation. See the table below for details with respect to each deployment option. | Cloud | Hybrid | Self-Hosted | |------------|------------------------|----------------------| diff --git a/src/langsmith/data-storage-and-privacy.mdx b/src/langsmith/data-storage-and-privacy.mdx index 81f529994..e498d080a 100644 --- a/src/langsmith/data-storage-and-privacy.mdx +++ b/src/langsmith/data-storage-and-privacy.mdx @@ -2,7 +2,7 @@ title: Data storage and privacy sidebarTitle: Data storage and privacy --- -This document describes how data is processed in the LangGraph CLI and the LangGraph Server for both the in-memory server (`langgraph dev`) and the local Docker server (`langgraph up`). It also describes what data is tracked when interacting with the hosted Studio frontend. +This document describes how data is processed in the LangGraph CLI and the Agent Server for both the in-memory server (`langgraph dev`) and the local Docker server (`langgraph up`). It also describes what data is tracked when interacting with the hosted Studio frontend. ## CLI @@ -13,13 +13,13 @@ By default, calls to most CLI commands log a single analytics event upon invocat You can disable all CLI telemetry by setting `LANGGRAPH_CLI_NO_ANALYTICS=1`. 
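Since the opt-out is just an environment variable, it can be set per shell session before invoking any CLI command. A minimal sketch (the variable name comes from the paragraph above; everything else is illustrative):

```shell
# Opt out of CLI analytics for everything launched from this shell.
# LANGGRAPH_CLI_NO_ANALYTICS is the variable documented above.
export LANGGRAPH_CLI_NO_ANALYTICS=1

echo "CLI analytics opt-out: $LANGGRAPH_CLI_NO_ANALYTICS"
```

Subsequent `langgraph` invocations in the same session inherit the setting; run `unset LANGGRAPH_CLI_NO_ANALYTICS` to restore the default.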
-## LangGraph Server
+## Agent Server

-The [LangGraph Server](/langsmith/langgraph-server) provides a durable execution runtime that relies on persisting checkpoints of your application state, long-term memories, thread metadata, assistants, and similar resources to the local file system or a database. Unless you have deliberately customized the storage location, this information is either written to local disk (for `langgraph dev`) or a PostgreSQL database (for `langgraph up` and in all deployments).
+The [Agent Server](/langsmith/agent-server) provides a durable execution runtime that relies on persisting checkpoints of your application state, long-term memories, thread metadata, assistants, and similar resources to the local file system or a database. Unless you have deliberately customized the storage location, this information is either written to local disk (for `langgraph dev`) or a PostgreSQL database (for `langgraph up` and in all deployments).

### LangSmith Tracing

-When running the LangGraph server (either in-memory or in Docker), LangSmith tracing may be enabled to facilitate faster debugging and offer observability of graph state and LLM prompts in production. You can always disable tracing by setting `LANGSMITH_TRACING=false` in your server's runtime environment.
+When running the Agent Server (either in-memory or in Docker), LangSmith tracing may be enabled to facilitate faster debugging and offer observability of graph state and LLM prompts in production. You can always disable tracing by setting `LANGSMITH_TRACING=false` in your server's runtime environment.

### In-memory development server

@@ -37,7 +37,7 @@ If you've disabled [tracing](#langsmith-tracing), no user data is persisted exte

## Studio

-[Studio](/langsmith/studio) is a graphical interface for interacting with your LangGraph server. It does not persist any private data (the data you send to your server is not sent to LangSmith).
Though the Studio interface is served at [smith.langchain.com](https://smith.langchain.com), it is run in your browser and connects directly to your local LangGraph server so that no data needs to be sent to LangSmith. +[Studio](/langsmith/studio) is a graphical interface for interacting with your Agent Server. It does not persist any private data (the data you send to your server is not sent to LangSmith). Though the Studio interface is served at [smith.langchain.com](https://smith.langchain.com), it is run in your browser and connects directly to your local Agent Server so that no data needs to be sent to LangSmith. If you are logged in, LangSmith does collect some usage analytics to help improve the debugging user experience. This includes: @@ -46,7 +46,7 @@ If you are logged in, LangSmith does collect some usage analytics to help improv * Browser type and version * Screen resolution and viewport size -Importantly, no application data or code (or other sensitive configuration details) are collected. All of that is stored in the persistence layer of your LangGraph server. When using Studio anonymously, no account creation is required and usage analytics are not collected. +Importantly, no application data or code (or other sensitive configuration details) are collected. All of that is stored in the persistence layer of your Agent Server. When using Studio anonymously, no account creation is required and usage analytics are not collected. ## Quick reference diff --git a/src/langsmith/deploy-hybrid.mdx b/src/langsmith/deploy-hybrid.mdx index 2b515ab30..f0c896285 100644 --- a/src/langsmith/deploy-hybrid.mdx +++ b/src/langsmith/deploy-hybrid.mdx @@ -9,7 +9,7 @@ icon: "cloud" The Hybrid deployment option requires an [Enterprise](https://langchain.com/pricing) plan. 
-The [**hybrid**](/langsmith/hybrid) model lets you run the [data plane](/langsmith/data-plane)—your LangGraph Server deployments and agent workloads—in your own cloud, while LangChain hosts and manages the [control plane](/langsmith/control-plane) (the LangSmith UI and orchestration). This setup gives you the flexibility of self-hosting your runtime environments with the convenience of a managed LangSmith instance. +The [**hybrid**](/langsmith/hybrid) model lets you run the [data plane](/langsmith/data-plane)—your Agent Server deployments and agent workloads—in your own cloud, while LangChain hosts and manages the [control plane](/langsmith/control-plane) (the LangSmith UI and orchestration). This setup gives you the flexibility of self-hosting your runtime environments with the convenience of a managed LangSmith instance. The following steps describe how to connect your self-hosted data plane to the managed LangSmith control plane. @@ -34,7 +34,7 @@ The following steps describe how to connect your self-hosted data plane to the m 2. Create a listener from the LangSmith UI. The `Listener` data model is configured for the actual ["listener" application](/langsmith/data-plane#”listener”-application). 1. In the left-hand navigation, select `Deployments` > `Listeners`. 2. In the top-right of the page, select `+ Create Listener`. - 3. Enter a unique `Compute ID` for the listener. The `Compute ID` is a user-defined identifier that should be unique across all listeners in the current LangSmith workspace. The `Compute ID` is displayed to end users when they are creating a new deployment. Ensure that the `Compute ID` provides context to the end user about where their LangGraph Server deployments will be deployed to. For example, a `Compute ID` can be set to `k8s-cluster-name-dev-01`. 
In this example, the name of the Kubernetes cluster is `k8s-cluster-name`, `dev` denotes that the cluster is reserved for "development" workloads, and `01` is a numerical suffix to reduce naming collisions. + 3. Enter a unique `Compute ID` for the listener. The `Compute ID` is a user-defined identifier that should be unique across all listeners in the current LangSmith workspace. The `Compute ID` is displayed to end users when they are creating a new deployment. Ensure that the `Compute ID` provides context to the end user about where their Agent Server deployments will be deployed to. For example, a `Compute ID` can be set to `k8s-cluster-name-dev-01`. In this example, the name of the Kubernetes cluster is `k8s-cluster-name`, `dev` denotes that the cluster is reserved for "development" workloads, and `01` is a numerical suffix to reduce naming collisions. 4. Enter one or more Kubernetes namespaces. Later, the "listener" application will be configured to deploy to each of these namespaces. 5. In the top-right on the page, select `Submit`. 6. After the listener is created, copy the listener ID. You will use it later when installing the actual "listener" application in the Kubernetes cluster (step 5). @@ -65,11 +65,11 @@ The following steps describe how to connect your self-hosted data plane to the m createCRDs: true # set this to `false` if the CRD has been previously installed in the current Kubernetes cluster ``` - `config.langsmithApiKey`: The `langgraph-listener` deployment authenticates with LangChain's LangGraph control plane API with the `langsmithApiKey`. - - `config.langsmithWorkspaceId`: The `langgraph-listener` deployment is coupled to LangGraph Server deployments in the LangSmith workspace. In other words, the `langgraph-listener` deployment can only manage LangGraph Server deployments in the specified LangSmith workspace ID. 
- - `config.langgraphListenerId`: In addition to being coupled with a LangSmith workspace, the `langgraph-listener` deployment is also coupled to a listener. When a new LangGraph Server deployment is created, it is automatically coupled to a `langgraphListenerId`. Specifying `langgraphListenerId` ensures that the `langgraph-listener` deployment can only manage LangGraph Server deployments that are coupled to `langgraphListenerId`. + - `config.langsmithWorkspaceId`: The `langgraph-listener` deployment is coupled to Agent Server deployments in the LangSmith workspace. In other words, the `langgraph-listener` deployment can only manage Agent Server deployments in the specified LangSmith workspace ID. + - `config.langgraphListenerId`: In addition to being coupled with a LangSmith workspace, the `langgraph-listener` deployment is also coupled to a listener. When a new Agent Server deployment is created, it is automatically coupled to a `langgraphListenerId`. Specifying `langgraphListenerId` ensures that the `langgraph-listener` deployment can only manage Agent Server deployments that are coupled to `langgraphListenerId`. - `config.watchNamespaces`: A comma-separated list of Kubernetes namespaces that the `langgraph-listener` deployment will deploy to. This list should match the list of namespaces specified in step 2d. - - `config.enableLGPDeploymentHealthCheck`: To disable the LangGraph Server health check, set this to `false`. - - `ingress.hostname`: As part of the deployment workflow, the `langgraph-listener` deployment attempts to call the LangGraph Server health check endpoint (`GET /ok`) to verify that the application has started up correctly. A typical setup involves creating a shared DNS record or domain for LangGraph Server deployments. This is not managed by LangSmith. Once created, set `ingress.hostname` to the domain, which will be used to complete the health check. 
+ - `config.enableLGPDeploymentHealthCheck`: To disable the Agent Server health check, set this to `false`. + - `ingress.hostname`: As part of the deployment workflow, the `langgraph-listener` deployment attempts to call the Agent Server health check endpoint (`GET /ok`) to verify that the application has started up correctly. A typical setup involves creating a shared DNS record or domain for Agent Server deployments. This is not managed by LangSmith. Once created, set `ingress.hostname` to the domain, which will be used to complete the health check. - `operator.enabled`: There can only be 1 instance of the `langgraph-platform-operator` deployed in a Kubernetes namespace. Set this to `false` if there is already an instance of `langgraph-platform-operator` deployed in the current Kubernetes namespace. - `operator.createCRDs`: Set this value to `false` if the Kubernetes cluster already has the `LangGraphPlatform CRD` installed. During installation, an error will occur if the CRD is already installed. This situation may occur if multiple listeners are deployed on the same Kubernetes cluster. 5. Deploy `langgraph-dataplane` Helm chart. diff --git a/src/langsmith/deploy-standalone-server.mdx b/src/langsmith/deploy-standalone-server.mdx index 258ceef91..2ccf39b06 100644 --- a/src/langsmith/deploy-standalone-server.mdx +++ b/src/langsmith/deploy-standalone-server.mdx @@ -4,7 +4,7 @@ sidebarTitle: Standalone servers icon: "server" --- -This guide shows you how to deploy **standalone LangGraph Servers** without the LangSmith UI or control plane. This is the most lightweight self-hosting option for running one or a few agents as independent services. +This guide shows you how to deploy **standalone Agent Servers** without the LangSmith UI or control plane. This is the most lightweight self-hosting option for running one or a few agents as independent services. 
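Whichever route you pick below, a standalone server is wired to its backing services through environment variables. A hedged sketch of preparing a minimal env file — the variable names follow the standalone-server setup, but treat them as assumptions, and every value is a placeholder to replace with your own:

```shell
# Write a minimal environment file for a standalone server container.
# REDIS_URI / POSTGRES_URI point at the backing services; all values
# below are placeholders, not working credentials.
cat > .env <<'EOF'
REDIS_URI=redis://langgraph-redis:6379
POSTGRES_URI=postgres://postgres:postgres@langgraph-postgres:5432/postgres?sslmode=disable
LANGSMITH_API_KEY=lsv2-replace-me
EOF

# Sanity-check that all three entries landed in the file.
grep -c '=' .env
```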
**Standalone servers are intended primarily for development purposes.**

@@ -15,7 +15,7 @@ For production workloads, we recommend [LangSmith Deployment](/langsmith/self-ho

-**This is the setup page for deploying LangGraph Servers directly without the LangSmith platform.**
+**This is the setup page for deploying Agent Servers directly without the LangSmith platform.**

Review the [self-hosted options](/langsmith/self-hosted) to understand:
- [Standalone Server](/langsmith/self-hosted#standalone-server): What this guide covers (no UI, just servers).
@@ -52,7 +52,7 @@ Before continuing, review the [standalone server overview](/langsmith/self-hoste

## Kubernetes

-Use this [Helm chart](https://github.com/langchain-ai/helm/blob/main/charts/langgraph-cloud/README.md) to deploy a LangGraph Server to a Kubernetes cluster.
+Use this [Helm chart](https://github.com/langchain-ai/helm/blob/main/charts/langgraph-cloud/README.md) to deploy an Agent Server to a Kubernetes cluster.

## Docker

@@ -127,7 +127,7 @@ services:

You can run the command `docker compose up` with this Docker Compose file in the same folder.

-This will launch a LangGraph Server on port `8123` (if you want to change this, you can change this by changing the ports in the `langgraph-api` volume). You can test if the application is healthy by running:
+This will launch an Agent Server on port `8123` (to change this, edit the `ports` mapping of the `langgraph-api` service). You can test if the application is healthy by running:

```shell
curl --request GET --url 0.0.0.0:8123/ok
```
diff --git a/src/langsmith/deploy-with-control-plane.mdx b/src/langsmith/deploy-with-control-plane.mdx
index f244eb330..580bd7ba5 100644
--- a/src/langsmith/deploy-with-control-plane.mdx
+++ b/src/langsmith/deploy-with-control-plane.mdx
@@ -37,7 +37,7 @@ Before completing this guide, you'll need the following:

## Step 1. Test locally

-Before deploying, test your application locally.
You can use the [LangGraph CLI](/langsmith/cli#dev) to run a LangGraph server in development mode:
+Before deploying, test your application locally. You can use the [LangGraph CLI](/langsmith/cli#dev) to run an Agent Server in development mode:

```bash
langgraph dev
```
diff --git a/src/langsmith/deployments.mdx b/src/langsmith/deployments.mdx
index 2ce996fe9..5b8f57c7c 100644
--- a/src/langsmith/deployments.mdx
+++ b/src/langsmith/deployments.mdx
@@ -8,7 +8,7 @@ mode: wide

**Start here if you're building or operating agent applications.** This section is about deploying **your application**. If you need to set up LangSmith infrastructure, the [Platform setup section](/langsmith/platform-setup) covers infrastructure options (cloud, hybrid, self-hosted) and setup guides for hybrid and self-hosted deployments.

-This section covers how to package, build, and deploy your _agents_ and applications as [LangGraph Servers](/langsmith/langgraph-server).
+This section covers how to package, build, and deploy your _agents_ and applications as [Agent Servers](/langsmith/agent-server).

A typical deployment workflow consists of the following steps:

@@ -35,7 +35,7 @@ A typical deployment workflow consists of the following steps:

## What you'll learn

- Configure your [app for deployment](/langsmith/application-structure) (dependencies, [project setup](/langsmith/setup-app-requirements-txt), and [monorepo support](/langsmith/monorepo-support)).
-- Build, deploy, and update [LangGraph Servers](/langsmith/langgraph-server).
+- Build, deploy, and update [Agent Servers](/langsmith/agent-server).
- Secure your deployments with [authentication and access control](/langsmith/auth).
- Customize your server runtime ([lifespan hooks](/langsmith/custom-lifespan), [middleware](/langsmith/custom-middleware), and [routes](/langsmith/custom-routes)).
- Debug, observe, and troubleshoot deployed agents using the [Studio UI](/langsmith/studio).
@@ -47,12 +47,12 @@ A typical deployment workflow consists of the following steps: href="/langsmith/application-structure" cta="Configure your app" > - Package, build, and deploy your agents and graphs to LangGraph Server. + Package, build, and deploy your agents and graphs to Agent Server. ### Related -- [LangGraph Server](/langsmith/langgraph-server) +- [Agent Server](/langsmith/agent-server) - [Application structure](/langsmith/application-structure) - [Local server testing](/langsmith/local-server) diff --git a/src/langsmith/double-texting.mdx b/src/langsmith/double-texting.mdx index 2b71259b5..36c4da285 100644 --- a/src/langsmith/double-texting.mdx +++ b/src/langsmith/double-texting.mdx @@ -5,7 +5,7 @@ sidebarTitle: Overview **Prerequisites** -* [LangGraph Server](/langsmith/langgraph-server) +* [Agent Server](/langsmith/agent-server) Many times users might interact with your graph in unintended ways. diff --git a/src/langsmith/env-var.mdx b/src/langsmith/env-var.mdx index d8452b23a..022ffd56d 100644 --- a/src/langsmith/env-var.mdx +++ b/src/langsmith/env-var.mdx @@ -1,8 +1,8 @@ --- title: Environment variables -sidebarTitle: LangGraph Server environment variables +sidebarTitle: Agent Server environment variables --- -The LangGraph Server supports specific environment variables for configuring a deployment. +The Agent Server supports specific environment variables for configuring a deployment. ## `BG_JOB_ISOLATED_LOOPS` @@ -42,7 +42,7 @@ For more details, refer to [Set a sampling rate for traces](/langsmith/sample-tr ## `LANGGRAPH_AUTH_TYPE` -Type of authentication for the LangGraph Server deployment. Valid values: `langsmith`, `noop`. +Type of authentication for the Agent Server deployment. Valid values: `langsmith`, `noop`. For deployments to LangSmith, this environment variable is set automatically. For local development or deployments where authentication is handled externally (e.g. self-hosted), set this environment variable to `noop`. 
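For example, when running the server locally behind an external auth proxy, the variable above might be set like this — a sketch; the `case` check exists only to illustrate the two valid values:

```shell
# Local development with authentication handled externally:
# switch built-in auth to `noop` (valid values per the docs: `langsmith`, `noop`).
export LANGGRAPH_AUTH_TYPE=noop

case "$LANGGRAPH_AUTH_TYPE" in
  langsmith|noop) echo "auth type ok: $LANGGRAPH_AUTH_TYPE" ;;
  *) echo "unexpected auth type: $LANGGRAPH_AUTH_TYPE" >&2; exit 1 ;;
esac
```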
@@ -91,13 +91,13 @@ Set `LOG_JSON` to `true` to render all log messages as JSON objects using the co The `MOUNT_PREFIX` environment variable is only allowed in Self-Hosted Deployment models, LangSmith SaaS will not allow this environment variable. -Set `MOUNT_PREFIX` to serve the LangGraph Server under a specific path prefix. This is useful for deployments where the server is behind a reverse proxy or load balancer that requires a specific path prefix. +Set `MOUNT_PREFIX` to serve the Agent Server under a specific path prefix. This is useful for deployments where the server is behind a reverse proxy or load balancer that requires a specific path prefix. For example, if the server is to be served under `https://example.com/langgraph`, set `MOUNT_PREFIX` to `/langgraph`. ## `N_JOBS_PER_WORKER` -Number of jobs per worker for the LangGraph Server task queue. Defaults to `10`. +Number of jobs per worker for the Agent Server task queue. Defaults to `10`. ## `POSTGRES_URI_CUSTOM` @@ -123,7 +123,7 @@ Control Plane Functionality: Database Connectivity: -* The custom Postgres instance must be accessible by the LangGraph Server. The user is responsible for ensuring connectivity. +* The custom Postgres instance must be accessible by the Agent Server. The user is responsible for ensuring connectivity. ## `REDIS_CLUSTER` @@ -145,7 +145,7 @@ Defaults to `False`. This environment variable is supported in API Server version 0.1.9 and above. -Specify a prefix for Redis keys. This allows multiple LangGraph Server instances to share the same Redis instance by using different key prefixes. +Specify a prefix for Redis keys. This allows multiple Agent Server instances to share the same Redis instance by using different key prefixes. Defaults to `''`. diff --git a/src/langsmith/faq.mdx b/src/langsmith/faq.mdx index c8af606c1..6a24bb3db 100644 --- a/src/langsmith/faq.mdx +++ b/src/langsmith/faq.mdx @@ -147,7 +147,7 @@ Yes! 
LangGraph is totally ambivalent to what LLMs are used under the hood. The m ### Can I use Studio without logging in to LangSmith? -Yes! You can use the [development version of LangGraph Server](/langsmith/local-server) to run the backend locally. +Yes! You can use the [development version of Agent Server](/langsmith/local-server) to run the backend locally. This will connect to the Studio frontend hosted as part of LangSmith. If you set an environment variable of `LANGSMITH_TRACING=false`, then no traces will be sent to LangSmith. diff --git a/src/langsmith/generative-ui-react.mdx b/src/langsmith/generative-ui-react.mdx index 1e0239def..3dcf7f428 100644 --- a/src/langsmith/generative-ui-react.mdx +++ b/src/langsmith/generative-ui-react.mdx @@ -5,7 +5,7 @@ sidebarTitle: Implement generative user interfaces with LangGraph **Prerequisites** * [LangSmith](/langsmith/home) -* [LangGraph Server](/langsmith/langgraph-server) +* [Agent Server](/langsmith/agent-server) * [`useStream()` React Hook](/langsmith/use-stream-react) diff --git a/src/langsmith/home.mdx b/src/langsmith/home.mdx index ee81f8951..def86889b 100644 --- a/src/langsmith/home.mdx +++ b/src/langsmith/home.mdx @@ -58,7 +58,7 @@ Once your account and API key are ready, choose a quickstart to begin building w arrow="true" cta="Deploy your agents" > - Deploy your agents as LangGraph Servers, ready to scale in production. + Deploy your agents as Agent Servers, ready to scale in production. LangGraph Servers and agent workloads) runs in your cloud, managed by you. +- **Data plane** (your Agent Servers and agent workloads) runs in your cloud, managed by you. This combines the convenience of a managed interface with the flexibility of running workloads in your own environment. -Learn more about the [control plane](/langsmith/control-plane), [data plane](/langsmith/data-plane), and [LangGraph Server](/langsmith/langgraph-server) architecture concepts. 
+Learn more about the [control plane](/langsmith/control-plane), [data plane](/langsmith/data-plane), and [Agent Server](/langsmith/agent-server) architecture concepts. | Component | Responsibilities | Where it runs | Who manages it | |----------------|------------------|---------------|----------------| | Control plane |
  • UI for creating deployments and revisions
  • APIs for managing deployments
  • Observability data storage
| LangChain's cloud | LangChain | -| Data plane |
  • Listener to sync with control plane
  • LangGraph Servers (your agents)
  • Backing services (Postgres, Redis, etc.)
| Your cloud | You | +| Data plane |
  • Listener to sync with control plane
  • Agent Servers (your agents)
  • Backing services (Postgres, Redis, etc.)
| Your cloud | You | When running LangSmith in a hybrid model, you authenticate with a [LangSmith API key](/langsmith/create-account-api-key). @@ -30,7 +30,7 @@ When running LangSmith in a hybrid model, you authenticate with a [LangSmith API 1. Use the `langgraph-cli` or [Studio](/langsmith/studio) to test your graph locally. 1. Build a Docker image using the `langgraph build` command. -1. Deploy your LangGraph Server from the [control plane UI](/langsmith/control-plane#control-plane-ui). +1. Deploy your Agent Server from the [control plane UI](/langsmith/control-plane#control-plane-ui). Supported Compute Platforms: [Kubernetes](https://kubernetes.io/).

@@ -42,13 +42,13 @@ For setup, refer to the [Hybrid setup guide](/langsmith/deploy-hybrid). Hybrid deployment: LangChain-hosted control plane (LangSmith UI/APIs) manages deployments. Your cloud runs a listener, LangGraph Server instances, and backing stores (Postgres/Redis) on Kubernetes. Hybrid deployment: LangChain-hosted control plane (LangSmith UI/APIs) manages deployments. Your cloud runs a listener, LangGraph Server instances, and backing stores (Postgres/Redis) on Kubernetes. ### Compute Platforms @@ -76,7 +76,7 @@ In the hybrid option, one or more ["listener" applications](/langsmith/data-plan ### Kubernetes cluster organization - One or more listeners can run in a Kubernetes cluster. - A listener can deploy into one or more namespaces in that cluster. -- Cluster owners are responsible for planning listener layout and LangGraph Server deployments. +- Cluster owners are responsible for planning listener layout and Agent Server deployments. ### LangSmith workspace organization - A workspace can be associated with one or more listeners. diff --git a/src/langsmith/langgraph-server.mdx b/src/langsmith/langgraph-server.mdx deleted file mode 100644 index 964812322..000000000 --- a/src/langsmith/langgraph-server.mdx +++ /dev/null @@ -1,59 +0,0 @@ ---- -title: LangGraph Server ---- - -**LangGraph Server** offers an API for creating and managing agent-based applications. It is built on the concept of [assistants](/langsmith/assistants), which are agents configured for specific tasks, and includes built-in [persistence](/oss/langgraph/persistence#memory-store) and a **task queue**. This versatile API supports a wide range of agentic application use cases, from background processing to real-time interactions. - -Use LangGraph Server to create and manage [assistants](/langsmith/assistants), [threads](/oss/langgraph/persistence#threads), [runs](/langsmith/assistants#execution), [cron jobs](/langsmith/cron-jobs), [webhooks](/langsmith/use-webhooks), and more. 
- - -**API reference**

-For detailed information on the API endpoints and data models, refer to the [API reference docs](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html). -
- -To use the `Enterprise` version of the LangGraph Server, you must acquire a license key that you will need to specify when running the Docker image. To acquire a license key, [contact our sales team](https://www.langchain.com/contact-sales). - -You can run the `Enterprise` version of the LangGraph Server on the following LangSmith [hosting](/langsmith/platform-setup) options: - -- [Cloud](/langsmith/cloud) -- [Hybrid](/langsmith/hybrid) -- [Self-hosted](/langsmith/self-hosted) - -## Application structure - -To deploy a LangGraph Server application, you need to specify the graph(s) you want to deploy, as well as any relevant configuration settings, such as dependencies and environment variables. - -Read the [application structure](/langsmith/application-structure) guide to learn how to structure your LangGraph application for deployment. - -## Parts of a deployment - -When you deploy LangGraph Server, you are deploying one or more [graphs](#graphs), a database for [persistence](/oss/langgraph/persistence), and a task queue. - -### Graphs - -When you deploy a graph with LangGraph Server, you are deploying a "blueprint" for an [Assistant](/langsmith/assistants). - -An [Assistant](/langsmith/assistants) is a graph paired with specific configuration settings. You can create multiple assistants per graph, each with unique settings to accommodate different use cases -that can be served by the same graph. - -Upon deployment, LangGraph Server will automatically create a default assistant for each graph using the graph's default configuration settings. - - -We often think of a graph as implementing an [agent](/oss/langgraph/workflows-agents), but a graph does not necessarily need to implement an agent. For example, a graph could implement a simple -chatbot that only supports back-and-forth conversation, without the ability to influence any application control flow. 
In reality, as applications get more complex, a graph will often implement a more complex flow that may use [multiple agents](/oss/langchain/multi-agent) working in tandem. - - -### Persistence and task queue - -LangGraph Server leverages a database for [persistence](/oss/langgraph/persistence) and a task queue. - -[PostgreSQL](https://www.postgresql.org/) is supported as a database for LangGraph Server and [Redis](https://redis.io/) as the task queue. - -If you're deploying using [LangSmith cloud](/langsmith/cloud), these components are managed for you. If you're deploying LangGraph Server on your [own infrastructure](/langsmith/self-hosted), you'll need to set up and manage these components yourself. - -For more information on how these components are set up and managed, review the [hosting options](/langsmith/platform-setup) guide. - -## Learn more - -- LangGraph [Application Structure](/langsmith/application-structure) guide explains how to structure your LangGraph application for deployment. -- The [API Reference](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html) provides detailed information on the API endpoints and data models. diff --git a/src/langsmith/local-server.mdx b/src/langsmith/local-server.mdx index 5a1c3206a..e65d7fb98 100644 --- a/src/langsmith/local-server.mdx +++ b/src/langsmith/local-server.mdx @@ -75,9 +75,9 @@ You will find a [`.env.example`](/langsmith/application-structure#configuration- LANGSMITH_API_KEY=lsv2... ``` -## 5. Launch LangGraph server 🚀 +## 5. Launch Agent Server 🚀 -Start the LangGraph API server locally: +Start the Agent Server locally: @@ -104,10 +104,10 @@ Sample output: > - Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 ``` -The [`langgraph dev`](/langsmith/cli) command starts [LangGraph Server](/langsmith/langgraph-server) in an in-memory mode. This mode is suitable for development and testing purposes. 
+The [`langgraph dev`](/langsmith/cli) command starts [Agent Server](/langsmith/agent-server) in an in-memory mode. This mode is suitable for development and testing purposes. -For production use, deploy LangGraph Server with a persistent storage backend. For more information, refer to the LangSmith [hosting options](/langsmith/platform-setup). +For production use, deploy Agent Server with a persistent storage backend. For more information, refer to the LangSmith [platform options](/langsmith/platform-setup). ## 6. Test the API @@ -229,10 +229,10 @@ Now that you have a LangGraph app running locally, you're ready to deploy it: **Choose a hosting option for LangSmith:** - [**Cloud**](/langsmith/cloud): Fastest setup, fully managed (recommended). -- [**Hybrid**](/langsmith/hybrid): Data plane in your cloud, control plane managed by LangChain. +- [**Hybrid**](/langsmith/hybrid): Data plane in your cloud, control plane managed by LangChain. - [**Self-hosted**](/langsmith/self-hosted): Full control in your infrastructure. -For more details, refer to the [hosting comparison](/langsmith/platform-setup). +For more details, refer to the [Platform setup comparison](/langsmith/platform-setup). **Then deploy your app:** - [Deploy to Cloud quickstart](/langsmith/deployment-quickstart): Quick setup guide. @@ -240,4 +240,4 @@ For more details, refer to the [hosting comparison](/langsmith/platform-setup). **Explore features:** - **[Studio](/langsmith/studio)**: Visualize, interact with, and debug your application with the Studio UI. Try the [Studio quickstart](/langsmith/quick-start-studio). 
-- **API References**: [LangGraph Server API](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/), [Python SDK](/langsmith/langgraph-python-sdk), [JS/TS SDK](/langsmith/langgraph-js-ts-sdk) +- **API References**: [LangSmith Deployment API](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref/), [Python SDK](/langsmith/langgraph-python-sdk), [JS/TS SDK](/langsmith/langgraph-js-ts-sdk) diff --git a/src/langsmith/platform-logs.mdx b/src/langsmith/platform-logs.mdx index a4c3931e3..d13765502 100644 --- a/src/langsmith/platform-logs.mdx +++ b/src/langsmith/platform-logs.mdx @@ -21,7 +21,7 @@ Clicking this button will take you to the server logs view for the associated de The server logs view displays logs from both: -- **LangGraph Server's own operational logs** - Internal server operations, API calls, and system events +- **Agent Server's own operational logs** - Internal server operations, API calls, and system events - **User application logs** - Logs written in your graph with: - Python: Use the `logging` or `structlog` libraries - JavaScript: Use the re-exported Winston logger from `@langchain/langgraph-sdk/logging`: diff --git a/src/langsmith/platform-setup.mdx b/src/langsmith/platform-setup.mdx index b8184fcaa..29e66a4f2 100644 --- a/src/langsmith/platform-setup.mdx +++ b/src/langsmith/platform-setup.mdx @@ -16,7 +16,7 @@ If you want to deploy an agent application, the [Deployment section](/langsmith/ You can deploy LangSmith in one of three modes: - [**Cloud**](/langsmith/cloud): fully managed by LangChain -- [**Hybrid**](/langsmith/hybrid): LangChain manages the control plane; you host the data plane +- [**Hybrid**](/langsmith/hybrid): LangChain manages the control plane; you host the data plane - [**Self-hosted**](/langsmith/self-hosted): you manage the full stack within your infrastructure @@ -65,7 +65,7 @@ Refer to the following table for a comparison: | **Best for** | Quick setup, managed infrastructure | Data residency 
requirements + managed control plane | Full control, data isolation | -You can [run a LangGraph Server locally for free](/langsmith/local-server) for testing and development. +You can [run an Agent Server locally for free](/langsmith/local-server) for testing and development. ### Related diff --git a/src/langsmith/quick-start-studio.mdx b/src/langsmith/quick-start-studio.mdx index fda5131bb..f0ce428ea 100644 --- a/src/langsmith/quick-start-studio.mdx +++ b/src/langsmith/quick-start-studio.mdx @@ -6,7 +6,7 @@ sidebarTitle: Quickstart [Studio](/langsmith/studio) in the [LangSmith Deployments UI](https://smith.langchain.com) supports connecting to two types of graphs: - Graphs deployed on [cloud or self-hosted](#deployed-graphs). -- Graphs running locally with [LangGraph server](#local-development-server). +- Graphs running locally with [Agent Server](#local-development-server). ## Deployed graphs @@ -50,7 +50,7 @@ To test your application locally using Studio: Safari blocks `localhost` connections to Studio. To work around this, run the command with `--tunnel` to access Studio via a secure tunnel.
- This will start the LangGraph Server locally, running in-memory. The server will run in watch mode, listening for and automatically restarting on code changes. Read this [reference](/langsmith/cli#dev) to learn about all the options for starting the API server. + This will start the Agent Server locally, running in-memory. The server will run in watch mode, listening for and automatically restarting on code changes. Read this [reference](/langsmith/cli#dev) to learn about all the options for starting the API server. You will see the following logs: diff --git a/src/langsmith/release-versions.mdx b/src/langsmith/release-versions.mdx index a7ac8abfa..a47f8753a 100644 --- a/src/langsmith/release-versions.mdx +++ b/src/langsmith/release-versions.mdx @@ -8,4 +8,4 @@ import ReleaseVersionPolicy from "/snippets/release-version-policy.mdx"; ## Current version support -To check the current supported versions and their support levels, refer to the [LangGraph Server Changelog](/langsmith/langgraph-server-changelog) for the latest release information. +To check the current supported versions and their support levels, refer to the [Agent Server Changelog](/langsmith/agent-server-changelog) for the latest release information. diff --git a/src/langsmith/scalability-and-resilience.mdx b/src/langsmith/scalability-and-resilience.mdx index c638373ed..f33b02f05 100644 --- a/src/langsmith/scalability-and-resilience.mdx +++ b/src/langsmith/scalability-and-resilience.mdx @@ -29,10 +29,10 @@ If a hard shutdown occurs due to a server crash or an infrastructure failure, an For deployment modalities where we manage the Postgres database, we have periodic backups and continuously replicated standby replicas for automatic failover. This Postgres configuration is available in the [Cloud deployment option](/langsmith/cloud) for [`Production` deployment types](/langsmith/control-plane#deployment-types) only. -All communication with Postgres implements retries for retry-able errors. 
If Postgres is momentarily unavailable, such as during a database restart, most/all traffic should continue to succeed. Prolonged failure of Postgres will render the LangGraph Server unavailable. +All communication with Postgres implements retries for retryable errors. If Postgres is momentarily unavailable, such as during a database restart, most/all traffic should continue to succeed. Prolonged failure of Postgres will render the Agent Server unavailable. ## Redis resilience All data that requires durable storage is stored in Postgres, not Redis. Redis is used only for ephemeral metadata, and communication between instances. Therefore we place no durability requirements on Redis. -All communication with Redis implements retries for retry-able errors. If Redis is momentarily unavailable, such as during a database restart, most/all traffic should continue to succeed. Prolonged failure of Redis will render the LangGraph Server unavailable. +All communication with Redis implements retries for retryable errors. If Redis is momentarily unavailable, such as during a database restart, most/all traffic should continue to succeed. Prolonged failure of Redis will render the Agent Server unavailable. diff --git a/src/langsmith/sdk.mdx b/src/langsmith/sdk.mdx index 7ac8f423f..39d2ff4b4 100644 --- a/src/langsmith/sdk.mdx +++ b/src/langsmith/sdk.mdx @@ -3,7 +3,7 @@ title: LangGraph SDK sidebarTitle: LangGraph SDK --- -LangSmith provides both a Python SDK for interacting with [LangGraph Server](/langsmith/langgraph-server). +LangSmith provides both Python and JS/TS SDKs for interacting with [Agent Server](/langsmith/agent-server). **Python SDK reference** @@ -29,7 +29,7 @@ You can install the packages using the appropriate package manager for your language. ## Python sync vs.
async -The Python SDK provides both synchronous (`get_sync_client`) and asynchronous (`get_client`) clients for interacting with LangGraph Server: +The Python SDK provides both synchronous (`get_sync_client`) and asynchronous (`get_client`) clients for interacting with Agent Server: diff --git a/src/langsmith/self-hosted.mdx b/src/langsmith/self-hosted.mdx index b914a845c..97596329f 100644 --- a/src/langsmith/self-hosted.mdx +++ b/src/langsmith/self-hosted.mdx @@ -28,7 +28,7 @@ This page provides an overview of each self-hosted model: icon="layer-group" href="#langsmith-deployment" > - Enables deploying graphs to LangGraph Server via the control plane. The control plane and data plane provide the full LangSmith platform for running and monitoring agents. This includes observability, evaluation, and deployment. + Enables deploying graphs to Agent Server via the control plane. The control plane and data plane provide the full LangSmith platform for running and monitoring agents. This includes observability, evaluation, and deployment. - Host a LangGraph Server directly without the control plane UI. A lightweight option for running one or a few agents as independent services, with full control over scaling and integration. + Host an Agent Server directly without the control plane UI. A lightweight option for running one or a few agents as independent services, with full control over scaling and integration. Model | Includes | Best for | Methods ------------------|------------------|----------|-------------------- **Observability & Evaluation** |
  • LangSmith (UI + API)
  • Backend services (queue, playground, ACE)
  • Datastores: PostgreSQL, Redis, ClickHouse, optional blob storage
|
  • Teams who need self-hosted observability, tracing, and evaluation
  • Running LangSmith without deploying agents/graphs
|
  • Docker Compose (dev/test)
  • Kubernetes + Helm (production)
-**Observability, Evaluation & Deployment** |
  • Everything from Observability and Evaluation
  • Control plane (deployments UI, revision management, Studio)
  • Data plane (LangGraph Server pods)
  • Kubernetes operator for orchestration
|
  • Enterprise teams needing a private LangChain Cloud
  • Centralized UI/API for managing multiple agents/graphs
  • Integrated observability and orchestration
|
  • Kubernetes with Helm (required)
  • Runs on EKS, GKE, AKS, or self-managed clusters
-**Standalone server** |
  • LangGraph Server container(s)
  • Requires PostgreSQL + Redis (shared or dedicated)
  • Optional LangSmith integration for tracing
|
  • Lightweight deployments of one or a few agents
  • Integrating LangGraph Servers as microservices
  • Teams preferring to manage scaling & CI/CD themselves
|
  • Docker / Docker Compose (dev/test)
  • Kubernetes + Helm (production)
  • Any container runtime or VM (ECS, EC2, ACI, etc.)
+**Observability, Evaluation & Deployment** |
  • Everything from Observability and Evaluation
  • Control plane (deployments UI, revision management, Studio)
  • Data plane (Agent Server pods)
  • Kubernetes operator for orchestration
|
  • Enterprise teams needing a private LangChain Cloud
  • Centralized UI/API for managing multiple agents/graphs
  • Integrated observability and orchestration
|
  • Kubernetes with Helm (required)
  • Runs on EKS, GKE, AKS, or self-managed clusters
+**Standalone server** |
  • Agent Server container(s)
  • Requires PostgreSQL + Redis (shared or dedicated)
  • Optional LangSmith integration for tracing
|
  • Lightweight deployments of one or a few agents
  • Integrating Agent Servers as microservices
  • Teams preferring to manage scaling & CI/CD themselves
|
  • Docker / Docker Compose (dev/test)
  • Kubernetes + Helm (production)
  • Any container runtime or VM (ECS, EC2, ACI, etc.)
For setup guides, refer to: @@ -132,7 +132,7 @@ This includes everything from [LangSmith](#langsmith), plus: | Component | Responsibilities | Where it runs | Who manages it | |-----------|------------------|---------------|----------------| | Control plane |
  • UI for creating deployments & revisions
  • APIs for deployment management
| Your cloud | You | -| Data plane |
  • Operator/listener to reconcile deployments
  • LangGraph Servers (agents/graphs)
  • Backing services (Postgres, Redis, etc.)
| Your cloud | You | +| Data plane |
  • Operator/listener to reconcile deployments
  • Agent Servers (agents/graphs)
  • Backing services (Postgres, Redis, etc.)
| Your cloud | You | You run both the control plane and the data plane entirely within your own infrastructure. You are responsible for provisioning and managing all components. @@ -170,14 +170,14 @@ If you want to self-host LangSmith for observability, evaluation, and agent depl ## Standalone Server -The **Standalone server** option is the most lightweight and flexible way to run LangSmith. Unlike the other models, you only manage a simplified data plane made up of LangGraph Servers and their required backing services (PostgreSQL, Redis, etc.). +The **Standalone server** option is the most lightweight and flexible way to run LangSmith. Unlike the other models, you only manage a simplified data plane made up of Agent Servers and their required backing services (PostgreSQL, Redis, etc.). This includes: | Component | Responsibilities | Where it runs | Who manages it | |-----------|------------------|---------------|----------------| | **Control plane** | n/a | n/a | n/a | -| **Data plane** |
  • LangGraph Servers
  • Postgres, Redis, etc.
| Your cloud | You | +| **Data plane** |
  • Agent Servers
  • Postgres, Redis, etc.
| Your cloud | You | This option gives you full control over scaling, deployment, and CI/CD pipelines, while still allowing optional integration with LangSmith for tracing and evaluation. @@ -201,17 +201,17 @@ Do not run standalone servers in serverless environments. Scale-to-zero may caus 1. Define and test your graph locally using the `langgraph-cli` or [Studio](/langsmith/studio) 2. Package your agent as a Docker image -3. Deploy the LangGraph Server to your compute platform of choice (Kubernetes, Docker, VM) +3. Deploy the Agent Server to your compute platform of choice (Kubernetes, Docker, VM) 4. Optionally, configure LangSmith API keys and endpoints so the server reports traces and evaluations back to LangSmith (self-hosted or SaaS) ### Supported compute platforms -- **Kubernetes**: Use the LangSmith Helm chart to run LangGraph Servers in a Kubernetes cluster. This is the recommended option for production-grade deployments. +- **Kubernetes**: Use the LangSmith Helm chart to run Agent Servers in a Kubernetes cluster. This is the recommended option for production-grade deployments. - **Docker**: Run in any Docker-supported compute platform (local dev machine, VM, ECS, etc.). This is best suited for development or small-scale workloads. ### Setup guide -To set up a [LangGraph Server](/langsmith/langgraph-server), refer to the [how-to guide](/langsmith/deploy-standalone-server) in the application deployment section. +To set up an [Agent Server](/langsmith/agent-server), refer to the [how-to guide](/langsmith/deploy-standalone-server) in the application deployment section. 
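The standalone topology described above — one or more Agent Server containers plus the PostgreSQL and Redis they require — can be sketched as a Docker Compose file for dev/test use. This is an illustrative sketch only: `my-agent:latest` stands in for the image you packaged from your graph, the port mapping is a placeholder, and the `DATABASE_URI`/`REDIS_URI` variable names are assumptions to confirm against the standalone-server how-to guide.

```yaml
services:
  agent-server:
    image: my-agent:latest        # placeholder: the Docker image built from your graph
    ports:
      - "8124:8000"               # placeholder host:container ports
    environment:
      DATABASE_URI: postgres://postgres:postgres@postgres:5432/postgres   # assumed variable name
      REDIS_URI: redis://redis:6379                                       # assumed variable name
      LANGSMITH_API_KEY: lsv2...  # optional: report traces and evaluations back to LangSmith
    depends_on:
      - postgres
      - redis

  postgres:
    image: postgres:16
    environment:
      POSTGRES_PASSWORD: postgres

  redis:
    image: redis:7
```

Per the comparison table above, Compose like this suits dev/test only; for production-grade standalone deployments, use the LangSmith Helm chart on Kubernetes.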
diff --git a/src/langsmith/server-a2a.mdx b/src/langsmith/server-a2a.mdx index 5502b03d1..667a03348 100644 --- a/src/langsmith/server-a2a.mdx +++ b/src/langsmith/server-a2a.mdx @@ -1,11 +1,11 @@ --- -title: A2A endpoint in LangGraph Server -sidebarTitle: A2A endpoint in LangGraph Server +title: A2A endpoint in Agent Server +sidebarTitle: A2A endpoint in Agent Server --- [Agent2Agent (A2A)](https://a2a-protocol.org/latest/) is Google's protocol for enabling communication between conversational AI agents. [LangSmith implements A2A support](https://langchain-ai.github.io/langgraph/cloud/reference/api/api_ref.html#tag/a2a/post/a2a/{assistant_id}), allowing your agents to communicate with other A2A-compatible agents through a standardized protocol. -The A2A endpoint is available in [LangGraph Server](/langsmith/langgraph-server) at `/a2a/{assistant_id}`. +The A2A endpoint is available in [Agent Server](/langsmith/agent-server) at `/a2a/{assistant_id}`. ## Agent Card Discovery diff --git a/src/langsmith/server-api-ref.mdx b/src/langsmith/server-api-ref.mdx index 33385dd22..dbf53d987 100644 --- a/src/langsmith/server-api-ref.mdx +++ b/src/langsmith/server-api-ref.mdx @@ -1,17 +1,17 @@ --- -title: LangGraph Server API reference for LangSmith Deployment -sidebarTitle: LangGraph Server API +title: Agent Server API reference for LangSmith Deployment +sidebarTitle: Agent Server API --- -The LangGraph Server API reference is available within each [deployment](/langsmith/deployments) at the `/docs` endpoint (e.g. `http://localhost:8124/docs`). +The Agent Server API reference is available within each [deployment](/langsmith/deployments) at the `/docs` endpoint (e.g. `http://localhost:8124/docs`). - View the full LangGraph Server API reference documentation + View the full Agent Server API reference documentation ## Authentication -For deployments to LangSmith, authentication is required. Pass the `X-Api-Key` header with each request to the LangGraph Server. 
The value of the header should be set to a valid LangSmith API key for the organization where the LangGraph Server is deployed. +For deployments to LangSmith, authentication is required. Pass the `X-Api-Key` header with each request to the Agent Server. The value of the header should be set to a valid LangSmith API key for the organization where the Agent Server is deployed. Example `curl` command: diff --git a/src/langsmith/server-mcp.mdx b/src/langsmith/server-mcp.mdx index cefcc544a..1f3449693 100644 --- a/src/langsmith/server-mcp.mdx +++ b/src/langsmith/server-mcp.mdx @@ -1,13 +1,13 @@ --- -title: MCP endpoint in LangGraph Server -sidebarTitle: MCP endpoint in LangGraph Server +title: MCP endpoint in Agent Server +sidebarTitle: MCP endpoint in Agent Server --- The Model Context Protocol (MCP) is an open protocol for describing tools and data sources in a model-agnostic format, enabling LLMs to discover and use them via a structured API. -[LangGraph Server](/langsmith/langgraph-server) implements MCP using the [Streamable HTTP transport](https://spec.modelcontextprotocol.io/specification/2025-03-26/basic/transports/#streamable-http). This allows LangGraph **agents** to be exposed as **MCP tools**, making them usable with any MCP-compliant client supporting Streamable HTTP. +[Agent Server](/langsmith/agent-server) implements MCP using the [Streamable HTTP transport](https://spec.modelcontextprotocol.io/specification/2025-03-26/basic/transports/#streamable-http). This allows LangGraph **agents** to be exposed as **MCP tools**, making them usable with any MCP-compliant client supporting Streamable HTTP. -The MCP endpoint is available at `/mcp` on [LangGraph Server](/langsmith/langgraph-server). +The MCP endpoint is available at `/mcp` on [Agent Server](/langsmith/agent-server). You can set up [custom authentication middleware](/langsmith/custom-auth) to authenticate a user with an MCP server to get access to user-scoped tools within your LangSmith deployment. 
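The authentication requirement above — an `X-Api-Key` header carrying a valid LangSmith API key on every request — can be sketched with the Python standard library. The deployment URL and key below are placeholders, and the request is only constructed here, not sent:

```python
import urllib.request

DEPLOYMENT_URL = "http://localhost:8124"  # placeholder: your deployment's base URL
API_KEY = "lsv2-placeholder-key"          # placeholder: a LangSmith API key for the deployment's org

# Every request to the deployment carries the X-Api-Key header.
# (urllib normalizes header names, storing this one as "X-api-key".)
request = urllib.request.Request(
    f"{DEPLOYMENT_URL}/docs",  # the per-deployment API reference endpoint
    headers={"X-Api-Key": API_KEY},
)

# urllib.request.urlopen(request) would now perform the authenticated call.
print(request.full_url)
```

Any HTTP client works the same way; the `curl` example the page refers to passes the same header with `--header 'X-Api-Key: ...'`.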
@@ -76,7 +76,7 @@ To enable MCP: ### Client -Use an MCP-compliant client to connect to the LangGraph server. The following examples show how to connect using different programming languages. +Use an MCP-compliant client to connect to the Agent Server. The following examples show how to connect using different programming languages. @@ -85,7 +85,7 @@ Use an MCP-compliant client to connect to the LangGraph server. The following ex ``` > **Note** - > Replace `serverUrl` with your LangGraph server URL and configure authentication headers as needed. + > Replace `serverUrl` with your Agent Server URL and configure authentication headers as needed. ```js import { Client } from "@modelcontextprotocol/sdk/client/index.js"; diff --git a/src/langsmith/streaming.mdx b/src/langsmith/streaming.mdx index 71e639c1b..b6d4747ca 100644 --- a/src/langsmith/streaming.mdx +++ b/src/langsmith/streaming.mdx @@ -2,10 +2,10 @@ title: Streaming API sidebarTitle: Streaming API --- -[LangGraph SDK](/langsmith/langgraph-python-sdk) allows you to [stream outputs](/oss/langgraph/streaming/) from the [LangGraph Server API](/langsmith/server-api-ref). +[LangGraph SDK](/langsmith/langgraph-python-sdk) allows you to [stream outputs](/oss/langgraph/streaming/) from the [LangSmith Deployment API](/langsmith/server-api-ref). -LangGraph SDK and LangGraph Server are a part of [LangSmith](/langsmith/home). +LangGraph SDK and Agent Server are a part of [LangSmith](/langsmith/home). ## Basic usage @@ -88,7 +88,7 @@ Basic usage example: - This is an example graph you can run in the LangGraph API server. + This is an example graph you can run in the Agent Server. See [LangSmith quickstart](/langsmith/deployment-quickstart) for more details. 
```python @@ -117,7 +117,7 @@ Basic usage example: ) ``` - Once you have a running LangGraph API server, you can interact with it using + Once you have a running Agent Server, you can interact with it using [LangGraph SDK](/langsmith/langgraph-python-sdk) @@ -452,7 +452,7 @@ async for chunk in client.runs.stream( 1. Set `stream_subgraphs=True` to stream outputs from subgraphs. - This is an example graph you can run in the LangGraph API server. + This is an example graph you can run in the Agent Server. See [LangSmith quickstart](/langsmith/deployment-quickstart) for more details. ```python @@ -493,7 +493,7 @@ async for chunk in client.runs.stream( graph = builder.compile() ``` - Once you have a running LangGraph API server, you can interact with it using + Once you have a running Agent Server, you can interact with it using [LangGraph SDK](/langsmith/langgraph-python-sdk) diff --git a/src/langsmith/studio.mdx b/src/langsmith/studio.mdx index 159a310fe..a960ddc51 100644 --- a/src/langsmith/studio.mdx +++ b/src/langsmith/studio.mdx @@ -6,11 +6,11 @@ sidebarTitle: Overview **Prerequisites** * [LangSmith](/langsmith/home) -* [LangGraph Server](/langsmith/langgraph-server) +* [Agent Server](/langsmith/agent-server) * [LangGraph CLI](/langsmith/cli) -Studio is a specialized agent IDE that enables visualization, interaction, and debugging of agentic systems that implement the LangGraph Server API protocol. Studio also integrates with [tracing](/langsmith/observability-concepts), [evaluation](/langsmith/evaluation), and [prompt engineering](/langsmith/prompt-engineering). +Studio is a specialized agent IDE that enables visualization, interaction, and debugging of agentic systems that implement the Agent Server API protocol. Studio also integrates with [tracing](/langsmith/observability-concepts), [evaluation](/langsmith/evaluation), and [prompt engineering](/langsmith/prompt-engineering). 
## Features @@ -28,14 +28,14 @@ Key features of Studio: ```mermaid flowchart subgraph LangSmith Deployment - A[LangGraph CLI] -->|creates| B(LangGraph Server deployment) + A[LangGraph CLI] -->|creates| B(Agent Server deployment) B <--> D[Studio] B <--> E[SDKs] B <--> F[RemoteGraph] end ``` -Studio works for graphs that are deployed on [LangSmith](/langsmith/deployment-quickstart) or for graphs that are running locally via the [LangGraph Server](/langsmith/local-server). +Studio works for graphs that are deployed on [LangSmith](/langsmith/deployment-quickstart) or for graphs that are running locally via the [Agent Server](/langsmith/local-server). Studio supports two modes: diff --git a/src/langsmith/use-remote-graph.mdx b/src/langsmith/use-remote-graph.mdx index 6219098a8..4dabcd566 100644 --- a/src/langsmith/use-remote-graph.mdx +++ b/src/langsmith/use-remote-graph.mdx @@ -23,7 +23,7 @@ sidebarTitle: Interact with a deployment using RemoteGraph Before getting started with `RemoteGraph`, make sure you have: - Access to [LangSmith](/langsmith/home), where your graphs are developed and managed. -- A running [LangGraph Server](/langsmith/langgraph-server), which hosts your deployed graphs for remote interaction. +- A running [Agent Server](/langsmith/agent-server), which hosts your deployed graphs for remote interaction. ## Initialize the graph diff --git a/src/langsmith/use-stream-react.mdx b/src/langsmith/use-stream-react.mdx index 850b6e629..1b5157f9f 100644 --- a/src/langsmith/use-stream-react.mdx +++ b/src/langsmith/use-stream-react.mdx @@ -5,7 +5,7 @@ sidebarTitle: Integrate LangGraph into your React application **Prerequisites** * [LangSmith](/langsmith/home) -* [LangGraph Server](/langsmith/langgraph-server) +* [Agent Server](/langsmith/agent-server) The [`useStream()`](https://langchain-ai.github.io/langgraphjs/reference/modules/sdk.html) React hook provides a seamless way to integrate LangGraph into your React applications. 
It handles all the complexities of streaming, state management, and branching logic, letting you focus on building great chat experiences. diff --git a/src/oss/langchain/deploy.mdx b/src/oss/langchain/deploy.mdx index 2d21bb093..c1af59cde 100644 --- a/src/oss/langchain/deploy.mdx +++ b/src/oss/langchain/deploy.mdx @@ -15,6 +15,6 @@ Before you begin, ensure you have the following: ### 1. Create a repository on GitHub -Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langchain/studio#setup-local-langgraph-server). Then, push your code to the repository. +Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langchain/studio#setup-local-agent-server). Then, push your code to the repository. diff --git a/src/oss/langchain/studio.mdx b/src/oss/langchain/studio.mdx index 4291a816c..041eb9bc1 100644 --- a/src/oss/langchain/studio.mdx +++ b/src/oss/langchain/studio.mdx @@ -7,5 +7,5 @@ import Studio from '/snippets/oss/studio.mdx'; -For more information about local and deployed agents, see [Set up local LangGraph Server](/oss/langchain/studio#setup-local-langgraph-server) and [Deploy](/oss/langchain/deploy). +For more information about local and deployed agents, see [Set up local Agent Server](/oss/langchain/studio#setup-local-agent-server) and [Deploy](/oss/langchain/deploy). 
diff --git a/src/oss/langchain/ui.mdx b/src/oss/langchain/ui.mdx index c04459026..e8cd4d4f9 100644 --- a/src/oss/langchain/ui.mdx +++ b/src/oss/langchain/ui.mdx @@ -15,13 +15,13 @@ import chat_uiJS from '/snippets/oss/ui-js.mdx'; ### Connect to your agent -Agent Chat UI can connect to both [local](/oss/langchain/studio#setup-local-langgraph-server) and [deployed agents](/oss/langchain/deploy). +Agent Chat UI can connect to both [local](/oss/langchain/studio#setup-local-agent-server) and [deployed agents](/oss/langchain/deploy). After starting Agent Chat UI, you'll need to configure it to connect to your agent: 1. **Graph ID**: Enter your graph name (find this under `graphs` in your `langgraph.json` file) -2. **Deployment URL**: Your LangGraph server's endpoint (e.g., `http://localhost:2024` for local development, or your deployed agent's URL) -3. **LangSmith API key (optional)**: Add your LangSmith API key (not required if you're using a local LangGraph server) +2. **Deployment URL**: Your Agent Server's endpoint (e.g., `http://localhost:2024` for local development, or your deployed agent's URL) +3. **LangSmith API key (optional)**: Add your LangSmith API key (not required if you're using a local Agent Server) Once configured, Agent Chat UI will automatically fetch and display any interrupted threads from your agent. diff --git a/src/oss/langgraph/deploy.mdx b/src/oss/langgraph/deploy.mdx index 173f6587d..79044e19c 100644 --- a/src/oss/langgraph/deploy.mdx +++ b/src/oss/langgraph/deploy.mdx @@ -17,6 +17,6 @@ Before you begin, ensure you have the following: ### 1. Create a repository on GitHub -Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langgraph/studio#setup-local-langgraph-server). Then, push your code to the repository.
+Your application's code must reside in a GitHub repository to be deployed on LangSmith. Both public and private repositories are supported. For this quickstart, first make sure your app is LangGraph-compatible by following the [local server setup guide](/oss/langgraph/studio#setup-local-agent-server). Then, push your code to the repository. diff --git a/src/oss/langgraph/local-server.mdx b/src/oss/langgraph/local-server.mdx index c4bf052dd..9dbe7a923 100644 --- a/src/oss/langgraph/local-server.mdx +++ b/src/oss/langgraph/local-server.mdx @@ -91,7 +91,7 @@ You will find a `.env.example` in the root of your new LangGraph app. Create a ` LANGSMITH_API_KEY=lsv2... ``` -## 5. Launch LangGraph server 🚀 +## 5. Launch Agent Server 🚀 Start the LangGraph API server locally: @@ -119,7 +119,7 @@ Sample output: > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 ``` -The `langgraph dev` command starts LangGraph Server in an in-memory mode. This mode is suitable for development and testing purposes. For production use, deploy LangGraph Server with access to a persistent storage backend. For more information, see the [Platform setup overview](/langsmith/platform-setup). +The `langgraph dev` command starts Agent Server in an in-memory mode. This mode is suitable for development and testing purposes. For production use, deploy Agent Server with access to a persistent storage backend. For more information, see the [Platform setup overview](/langsmith/platform-setup). ## 6. Test your application in Studio @@ -129,7 +129,7 @@ The `langgraph dev` command starts LangGraph Server in an in-memory mode. This m > - LangGraph Studio Web UI: https://smith.langchain.com/studio/?baseUrl=http://127.0.0.1:2024 ``` -For a LangGraph Server running on a custom host/port, update the baseURL parameter. +For an Agent Server running on a custom host/port, update the baseURL parameter.
Use the `--tunnel` flag with your command to create a secure tunnel, as Safari has limitations when connecting to localhost servers: diff --git a/src/oss/langgraph/ui.mdx b/src/oss/langgraph/ui.mdx index 7287c6ba3..363e55530 100644 --- a/src/oss/langgraph/ui.mdx +++ b/src/oss/langgraph/ui.mdx @@ -14,13 +14,13 @@ import chat_uiJS from '/snippets/oss/ui-js.mdx'; ### Connect to your agent -Agent Chat UI can connect to both [local](/oss/langgraph/studio#setup-local-langgraph-server) and [deployed agents](/oss/langgraph/deploy). +Agent Chat UI can connect to both [local](/oss/langgraph/studio#setup-local-agent-server) and [deployed agents](/oss/langgraph/deploy). After starting Agent Chat UI, you'll need to configure it to connect to your agent: 1. **Graph ID**: Enter your graph name (find this under `graphs` in your `langgraph.json` file) -2. **Deployment URL**: Your LangGraph server's endpoint (e.g., `http://localhost:2024` for local development, or your deployed agent's URL) -3. **LangSmith API key (optional)**: Add your LangSmith API key (not required if you're using a local LangGraph server) +2. **Deployment URL**: Your Agent Server's endpoint (e.g., `http://localhost:2024` for local development, or your deployed agent's URL) +3. **LangSmith API key (optional)**: Add your LangSmith API key (not required if you're using a local Agent Server) Once configured, Agent Chat UI will automatically fetch and display any interrupted threads from your agent. diff --git a/src/snippets/oss/studio.mdx b/src/snippets/oss/studio.mdx index 896d1f158..03d86899c 100644 --- a/src/snippets/oss/studio.mdx +++ b/src/snippets/oss/studio.mdx @@ -18,7 +18,7 @@ Studio is our free-to-use, powerful agent IDE that integrates with [LangSmith](/ Before you begin, ensure you have the following: * An API key for [LangSmith](https://smith.langchain.com/settings) (free to sign up) -## Setup local LangGraph server +## Setup local Agent Server ### 1.
Install the LangGraph CLI @@ -125,7 +125,7 @@ yarn install ### 6. View your agent in Studio -Start your LangGraph server: +Start your Agent Server: :::python ```shell