2 changes: 2 additions & 0 deletions posts/2025-09-09-25.0.0.9.adoc
@@ -13,6 +13,8 @@ open-graph-image-alt: Open Liberty Logo
blog-available-in-languages:
- lang: zh-Hans
path: /zh-Hans/blog/2025/09/09/25.0.0.9.html
- lang: ja
path: /ja/blog/2025/09/09/25.0.0.9.html
---
= ECDH-ES support added to JWT Builder in 25.0.0.9
Ismath Badsha <https://github.com/IsmathBadsha>
10 changes: 9 additions & 1 deletion posts/2025-10-23-mcp-standalone-blog.adoc
@@ -32,7 +32,15 @@ Consider a scenario where your company provides weather forecasting services tha
A more effective solution is to enable the AI to access current weather data through tools exposed by your Liberty application. This allows the AI to retrieve up-to-date forecast information whenever needed, ensuring responses are always based on the most current data available, without the need for AI model retraining.

== How to Use the Liberty MCP Server Feature
The Liberty MCP Server feature enables a Liberty server to communicate with agentic AI workflows using the MCP protocol with https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#streamable-http[streamable HTTP]. Using the MCP protocol provides a standardized way for any AI application to be able to discover and utilize the business logic within your application.
The Liberty MCP Server feature enables a Liberty server to communicate with agentic AI workflows through the MCP protocol. This protocol provides a standardized way for AI applications to discover and utilize the business logic within your application.

The MCP endpoint is available at `/mcp` under your application's context root. For example, if you see this in your logs:
```
CWWKT0016I: Web application available (default_host): http://localhost:9080/myMcpApp/
```
Then your MCP endpoint can be accessed at `http://localhost:9080/myMcpApp/mcp`. You can connect any MCP client that supports the https://modelcontextprotocol.io/specification/2025-03-26/basic/transports#streamable-http[Streamable HTTP transport].
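If you want to confirm that the endpoint is responding before wiring up a full client, you can send it a JSON-RPC `initialize` request. The following is a minimal sketch, not part of the feature itself: it assumes the example endpoint from the log message above, the request body follows the `initialize` shape defined by the MCP specification, and the class name is just a placeholder.
```
// Minimal smoke test (sketch): POST an MCP "initialize" request to the endpoint above.
// The Streamable HTTP transport carries JSON-RPC over POST, and the Accept header
// must allow both JSON and SSE responses.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class McpSmokeTest {
    public static void main(String[] args) throws Exception {
        String initialize = """
            {"jsonrpc":"2.0","id":1,"method":"initialize",
             "params":{"protocolVersion":"2025-03-26","capabilities":{},
                       "clientInfo":{"name":"smoke-test","version":"0.0.1"}}}""";

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9080/myMcpApp/mcp"))
                .header("Content-Type", "application/json")
                .header("Accept", "application/json, text/event-stream")
                .POST(HttpRequest.BodyPublishers.ofString(initialize))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode());
        System.out.println(response.body());
    }
}
```
Per the MCP specification, a successful response reports the server's protocol version and advertised capabilities.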

To test your MCP server, you can use the https://modelcontextprotocol.io/docs/tools/inspector[MCP Inspector]. With `npm` installed, simply run `npx @modelcontextprotocol/inspector` to download and run it.

=== Declaring an MCP Tool
To expose your business logic to authorized AI applications, you'll need to declare it as an https://modelcontextprotocol.io/specification/2025-06-18/server/tools[MCP tool]. In this context, a tool is a function or operation that the AI can invoke to perform a specific task.
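Conceptually, a tool maps onto an ordinary method on a bean: plain parameters in, a result out, plus a description the AI can read to decide when to call it. The sketch below illustrates only that shape; the class, method, and the commented placeholder for a tool annotation are hypothetical and are not the actual Liberty MCP Server API.
```
// Hypothetical sketch only: class, method, and the commented annotation are
// placeholders, not the actual Liberty MCP Server API.
import jakarta.enterprise.context.ApplicationScoped;

@ApplicationScoped
public class ForecastTools {

    // A tool is an operation the AI can invoke by name.
    // A real declaration would mark this method with the MCP tool
    // annotation and supply a description for the AI to read.
    // @Tool(description = "Get the forecast for a city")   // placeholder
    public String getForecast(String city) {
        // Business logic: look up the current forecast for the city.
        return "Sunny with a high of 21°C in " + city;
    }
}
```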