πŸ€– AI-enable existing Microservices by adding built-in MCP Server

Overview

In this article, I will demonstrate how you can quickly extend your existing (or new) microservice, developed with the Java Microservice SDK, with an MCP Server that provides custom tools for your AI agent.

MCP (Model Context Protocol) is an open standard developed to integrate external systems with AI agents and applications.

While simple AI agents work pretty well using the LLM alone, some of them need special context. Just think of an AI agent that should help you discover issues in your device fleet, or one that requires any other data the LLM does not know about.

Riding the AI wave, MCP rapidly became THE standard for providing additional context to AI agents. No wonder a lot of vendors, open-source components, and applications adopted it, either developing MCP Servers themselves to expose data and functionality to AI, or providing tools to easily build your own MCP Server.

Cumulocity did so as well, providing an open-source MCP Server that covers most of the available REST API, as described by @Nikolaus_Neuerburg in the following article.

You can also check out the MCP Server yourself directly - it’s open source.

That’s a huge benefit: with it, you can develop AI agents that use IoT context provided by Cumulocity in a harmonized way.

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”    β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ LLM Agent        │◄──── MCP Protocol │◄──── IoT Platform       β”‚
β”‚ (e.g. Claude/GPT)β”‚    β”‚ Server       β”‚    β”‚ (e.g. Cumulocity)  β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜    β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜

(Source: Supercharging IoT Article)

But there is more to do. As Cumulocity is an open platform that allows you to develop your own extensions, such as microservices or UI plugins, this custom functionality and data might also be useful in your AI agent.

Spring AI

The Java Microservice SDK uses Spring Boot as its core. Luckily, Spring also jumped on the AI train and developed a comprehensive AI suite, including libraries to build your own MCP Server:

It provides multiple MCP Server variants that can be used:

  • Standard MCP Server - mainly supports the STDIO server transport, meaning it is designed to be executed in a terminal and to communicate over stdin/stdout, as known from other command-line tools.
  • WebMVC Server Transport - supports SSE (Server-Sent Events) and STDIO, meaning it contains a lightweight web server that pushes messages/events to clients over HTTP. Spring MVC is the classic servlet-based web framework.
  • WebFlux Server Transport - covers the same functionality as the WebMVC server but uses the WebFlux framework instead of the MVC framework. WebFlux, in short, is a reactive web framework, designed for building non-blocking, asynchronous web applications, especially under high concurrency and high load.
  • Streamable HTTP Server Transport - still in development and therefore considered unstable, but available in WebMVC and WebFlux variants. Streamable HTTP is the successor of the plain SSE server transport.

In this guide, I will use the classic WebMVC variant as it supports SSE which is currently supported by the AI Agent Manager, but you could also use the WebFlux or any streamable HTTP variant of course.

Adapting the microservice

Assuming you already have a microservice with existing functionality, you can easily add MCP-based tools by following the described steps.

  1. Upgrade your Microservice SDK to at least Spring Boot 3.4
  2. Add the Spring AI dependencies to pom.xml
  3. Add the MCP Server configuration properties
  4. Expose custom tools by using annotations

If you don’t have an existing microservice but you are still interested in adding an MCP Server, check out this guide first:

Upgrade your Microservice SDK

First of all, you need to check whether your microservice SDK must be upgraded. You should use at least version 2025.18.0, which is the first to use Spring Boot 3.4; even better, use a newer version such as 2025.67.0.

Follow the guide if migration is needed:

Adding dependencies

Next, we need to add the required dependencies to the pom.xml. First, add the dependency management block (please adapt the version if a newer one is available):

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.springframework.ai</groupId>
            <artifactId>spring-ai-bom</artifactId>
            <version>1.1.0-SNAPSHOT</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
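Note: if you keep a -SNAPSHOT version as above, Maven also needs the Spring snapshot repository; with a released version this block is unnecessary:

```xml
<repositories>
    <repository>
        <id>spring-snapshots</id>
        <name>Spring Snapshots</name>
        <url>https://repo.spring.io/snapshot</url>
        <releases>
            <enabled>false</enabled>
        </releases>
    </repository>
</repositories>
```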

Adding the dependency:

<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-mcp-server-webmvc</artifactId>
</dependency>

Adding configuration properties

Next, we need to adapt our application.properties. As a minimum, you should add the following:

#Spring AI
spring.ai.mcp.server.enabled=true
spring.ai.mcp.server.name=<YourDesiredMicroserviceMCPServerName>
spring.ai.mcp.server.base-url=/service/<YourExistingMicroserviceContextPath>

Example assuming your microservice name and context path is hello-world-service:

#Spring AI
spring.ai.mcp.server.enabled=true
spring.ai.mcp.server.name=hello-world-mcp-service
spring.ai.mcp.server.base-url=/service/hello-world-service
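A few further properties can be useful. Names and defaults below are as I understand them from the Spring AI reference; please verify them there:

```properties
# Optional - shown with their (assumed) defaults
spring.ai.mcp.server.version=1.0.0
# SYNC or ASYNC tool execution
spring.ai.mcp.server.type=SYNC
# Endpoints appended to the base-url
spring.ai.mcp.server.sse-endpoint=/sse
spring.ai.mcp.server.sse-message-endpoint=/mcp/message
```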

Check out the list of available properties here: MCP Server Boot Starter :: Spring AI Reference

Expose tools by using annotations

The next step is to annotate the existing methods of your services or components with the @Tool(..) annotation. Read more if you want to understand how this annotation works.

Here is an example that exposes a method evaluating a JSONata expression (to transform a JSON document), taking the expression and a JSON document (sourceJSON) as input:

/**
 * Test a JSONata expression against a JSON string.
 * @param expression JSONata expression to be evaluated against the source JSON
 * @param sourceJSON JSON string to be used as source for the JSONata expression evaluation
 * @return The result of the JSONata expression evaluation as a pretty-printed JSON string
 * @throws RuntimeException if the evaluation fails
 * @throws IllegalArgumentException if the expression or source JSON is null or empty
 */
@Tool(description = "Evaluate a JSONata expression against a JSON object")
public String evaluateJsonataExpression(String expression, String sourceJSON) {
    // jsonata(..) and Json.parseJson(..) are assumed to be statically imported
    // from a JSONata library such as com.dashjoin:jsonata
    if (expression == null || expression.isEmpty())
        throw new IllegalArgumentException("JSONata expression cannot be null or empty");
    if (sourceJSON == null || sourceJSON.isEmpty())
        throw new IllegalArgumentException("Source JSON cannot be null or empty");
    try {
        var expr = jsonata(expression);
        Object parsedJson = Json.parseJson(sourceJSON);
        Object result = expr.evaluate(parsedJson);
        return toPrettyJsonString(result);
    } catch (Exception e) {
        log.error("Error evaluating JSONata expression: ", e);
        throw new RuntimeException(e);
    }
}

Yep, it’s that simple. Just annotate your function with @Tool and a meaningful description, and you are good to go.

To get more examples, check out the Spring AI example servers

Adding ToolCallbackProviders

Finally, we need to point our microservice to all services that expose tools by adding a ToolCallbackProvider bean to the App.java class:

@Bean
public ToolCallbackProvider myTools(...) {
    return MethodToolCallbackProvider.builder()
            .toolObjects(...)
            .build();
}

Again, an example, where AIAgentService is a service in which the @Tool(...) annotation has been used:

@Bean
public ToolCallbackProvider tools(AIAgentService aiAgentService) {
    return MethodToolCallbackProvider.builder()
            .toolObjects(aiAgentService)
            .build();
}

You can add multiple such beans, and a single provider can also wrap several tool services at once, e.g. .toolObjects(serviceA, serviceB).
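Conceptually, the provider discovers tools by reflecting over the passed objects and collecting the annotated methods. Here is a stdlib-only sketch of that idea, using a stand-in annotation (not the real Spring AI one) purely for illustration:

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.Method;
import java.util.ArrayList;
import java.util.List;

public class ToolScanDemo {

    // Stand-in for the Spring AI @Tool annotation (illustration only)
    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.METHOD)
    @interface Tool {
        String description();
    }

    // Hypothetical service: one exposed tool, one ordinary method
    static class GreetingService {
        @Tool(description = "Say hello to a given name")
        public String sayHello(String name) {
            return "Hello, " + name + "!";
        }

        public String notATool() {
            return "ignored";
        }
    }

    // Collect "name - description" entries for all @Tool-annotated methods
    static List<String> discoverTools(Object target) {
        List<String> tools = new ArrayList<>();
        for (Method m : target.getClass().getDeclaredMethods()) {
            Tool t = m.getAnnotation(Tool.class);
            if (t != null) {
                tools.add(m.getName() + " - " + t.description());
            }
        }
        return tools;
    }

    public static void main(String[] args) {
        // Only the annotated method is picked up
        System.out.println(discoverTools(new GreetingService()));
    }
}
```

This is, of course, a simplification: the real provider also derives the JSON input schema from the method parameters and wraps each method in a callback, but the discovery step works on the same principle.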

That’s it! Now you can already build, deploy & run your microservice. Ideally, you deploy it directly to your tenant.

Test the MCP Server

Now that everything is implemented, we want to check whether the MCP Server and its provided tools work as expected.

There are several ways to achieve that.

  • Using a tool like MCP Inspector running locally.
  • Using Cumulocity provided tools like AI Agent Manager (currently in private preview).

Using MCP Inspector

We can easily start MCP Inspector with

npx @modelcontextprotocol/inspector

assuming Node.js is installed on your computer. Once installed and started, it will open a browser window:

Change the Transport Type to SSE
URL: https://<yourTenantURL>/service/<YourMicroserviceName>/sse
API Token Authentication: <BearerToken>

To retrieve the bearer token, you can simply go to your browser, log in to your Cumulocity tenant, open the developer tools with F12, and check the Application tab. Under cookies, you’ll find an authorization cookie containing an encoded bearer token as value. Copy it.

!! Caution !! Do not share this token with anyone!

Back in MCP Inspector, you can just paste the token and click on Connect. After a short time, you will be connected.

On the right side, you can now request all resources, prompts, and tools. By clicking on List Tools, it will fetch all tools of your MCP Server:

You can directly test it by providing the required input parameters.
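Under the hood, the inspector speaks JSON-RPC 2.0 over the SSE transport. Listing the tools and then invoking one roughly corresponds to messages like these (method names per the MCP specification; the tool name and arguments refer to the JSONata example above and are illustrative):

```json
{ "jsonrpc": "2.0", "id": 1, "method": "tools/list" }

{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "evaluateJsonataExpression",
    "arguments": {
      "expression": "name",
      "sourceJSON": "{\"name\": \"device-01\"}"
    }
  }
}
```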

Using AI Agent Manager

Will be added when AI Agent Manager is available, but to spoil the surprise: It’s even simpler than using MCP Inspector. Basically, you just need to paste the URL https://<yourTenantURL>/service/<YourMicroserviceName>/sse and click on Test connection, which will return the available tools already.

Summary

In this article, I demonstrated the steps to easily extend your existing microservices with a full-fledged MCP Server including tools to be used by AI agents.

As you saw, by leveraging the Microservice SDK, it’s quite simple. Of course, you can also build an MCP Server on top of any microservice developed in another programming language. This will most probably require a bit more effort in finding the right libraries and defining the tools in your code. As an example for Python, you can check this repository:

Leave a comment if you have feedback or questions.
