Delinea releases free open-source MCP server to secure AI agents

AI agents are becoming more common in the workplace, but giving them access to sensitive systems can be risky. Credentials often get stored in plain text, embedded in prompts, or passed around without proper oversight. Delinea wants to fix that problem with its new open-source Model Context Protocol (MCP) Server.

Delinea MCP server

How the Delinea MCP Server works

The MCP Server connects AI agents to the Delinea Platform, allowing them to securely retrieve and use credentials. The server applies identity checks and policy rules so access can be tracked and limited. This makes it easier to protect secrets without having to build custom integrations for every tool or workflow.

Phil Calvin, Chief Product Officer at Delinea, told Help Net Security that the design of the MCP Server is focused on reducing the risk of credential misuse by AI systems. “Delinea MCP Server combines abstraction, least-privilege controls, and ephemeral authentication to make sure that AI tools can be productive without creating new avenues for credential leakage or misuse,” Calvin said.

Instead of giving AI systems direct access to vaults, the MCP Server acts as a secure intermediary. “It only exposes carefully defined functions so AI tools can request what they need to do a job but never see raw credentials,” Calvin explained. Access is further restricted through policies and scope controls, and authentication happens with short-lived tokens. This means sensitive passwords are never directly handled by the AI agents themselves.
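The intermediary pattern Calvin describes can be sketched in a few lines. This is an illustrative mock-up, not Delinea's actual API: all names here (`VAULT`, `handle_tool_call`, `get_db_session`) are hypothetical, chosen only to show how a server can expose a narrow set of defined functions and hand back a short-lived opaque token while raw secrets stay server-side.

```python
import secrets
import time

# Hypothetical sketch of the intermediary pattern (not Delinea's API):
# the server exposes only named tools, and callers receive a scoped,
# short-lived handle rather than the underlying credential.

VAULT = {"prod-db": "s3cr3t-password"}   # raw secrets never leave the server
ALLOWED_TOOLS = {"get_db_session"}       # only carefully defined functions
SESSIONS = {}                            # token -> (secret_id, expiry time)

def handle_tool_call(tool: str, secret_id: str, ttl: int = 60) -> dict:
    """Return a short-lived opaque token instead of the raw credential."""
    if tool not in ALLOWED_TOOLS:
        raise PermissionError(f"tool {tool!r} is not exposed")
    if secret_id not in VAULT:
        raise KeyError(f"unknown secret {secret_id!r}")
    token = secrets.token_urlsafe(16)
    SESSIONS[token] = (secret_id, time.time() + ttl)
    return {"token": token, "expires_in": ttl}   # no password in the response

resp = handle_tool_call("get_db_session", "prod-db")
assert "s3cr3t-password" not in str(resp)  # the agent never sees the secret
```

The key design point is that the response contains only an opaque token with an expiry, so even a compromised or misbehaving agent has nothing durable to leak.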

Why securing AI credentials matters

Many AI agents need to interact with real systems, like databases or cloud services. If they are hardcoded with credentials, there is no easy way to audit or revoke access. By using ephemeral tokens and centralized policies, the MCP Server reduces that risk. It also supports standards like OAuth and includes connectors for popular AI platforms such as ChatGPT and Claude.
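The audit-and-revoke advantage of ephemeral tokens over hardcoded credentials can be shown with a minimal sketch. Again, this is a generic illustration under assumed names (`issue`, `use`, `revoke`, `AUDIT_LOG`), not Delinea's implementation:

```python
import time

# Illustrative sketch: every issuance, use, and revocation of a
# short-lived token is centrally logged, and access can be cut off
# without rotating any long-lived credential.

AUDIT_LOG = []   # central record of every token event
TOKENS = {}      # token -> expiry timestamp

def issue(token: str, ttl: int = 300) -> None:
    TOKENS[token] = time.time() + ttl
    AUDIT_LOG.append(("issued", token))

def use(token: str) -> bool:
    ok = token in TOKENS and time.time() < TOKENS[token]
    AUDIT_LOG.append(("used" if ok else "denied", token))
    return ok

def revoke(token: str) -> None:
    TOKENS.pop(token, None)
    AUDIT_LOG.append(("revoked", token))

issue("t1")
assert use("t1")        # valid while unexpired and unrevoked
revoke("t1")
assert not use("t1")    # revoked centrally; nothing to rotate on the agent
```

With hardcoded credentials there is no equivalent of this log or kill switch: revocation means finding and rewriting every agent that embeds the secret.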

Some organizations may face challenges rolling out the MCP Server, especially if they have complex legacy environments. Calvin noted that integration takes planning and care. “Integrating the MCP Server into existing AI workflows isn’t plug-and-play,” he said. “The main challenges organizations will face are around configuration, securely handling credentials, and making sure their existing AI agents use the new controlled tools instead of bypassing them.”

To help with that process, Delinea offers Docker images, documentation, and sample integrations. “We provide ready-to-use Docker images, documentation, and reference integrations with popular tools like ChatGPT, Claude, and VSCode Copilot,” Calvin said. “We also provide best practices on how to scope tools, separate credentials from configs, and test deployments before going live. In short, while the integration takes some upfront planning, we’ve built the project and documentation so organizations can adopt it step-by-step with confidence.”

Delinea’s Model Context Protocol (MCP) Server is available on GitHub.
