
Dnotitia Launches the World's First MCP-Based AI Agent Workstation with Integrated Vector Database

Aug. 12, 2025 – 

Seoul, South Korea – Dnotitia Inc. (Dnotitia), a fast-rising innovator in long-term memory AI and semiconductor-integrated solutions, unveiled its Mnemos Workstation, the world’s first AI agent workstation built on the Model Context Protocol (MCP) with an integrated vector database, at FMS 2025 (Future of Memory and Storage). FMS 2025, the world’s premier memory and storage conference, took place August 5–7 in Santa Clara, California. By combining storage, GPU, vector database, and AI agent capabilities in a single device, the Mnemos Workstation drew significant attention as a compact and cost-efficient alternative to traditional, complex AI infrastructure.

Dnotitia demonstrated the Mnemos Workstation during the event. The device featured the full capabilities of Seahorse Cloud, Dnotitia’s vector database platform that launched in beta this past April. Through a single web-based application, it enabled semantic search on S3-compatible object storage, with all vector indexing and retrieval fully managed. The Mnemos Workstation also integrated the Mnemos Agent, built to comply with the Model Context Protocol (MCP) standard, which uses the Seahorse search engine as a tool. This design was optimized to run the entire AI stack, including the vector database and agent layer, on a single workstation as a managed service, without requiring any external servers. With its tightly integrated compute resources and software, the Mnemos Workstation processed advanced AI workloads in real time on a single device, without the complexity of distributed infrastructure.
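
For readers less familiar with MCP, the sketch below shows in rough outline how a search engine such as Seahorse can be registered as a tool that any MCP-compliant agent can call. It uses the open-source MCP Python SDK; the server name, the in-memory corpus, and the keyword-overlap ranking are illustrative placeholders standing in for the Seahorse engine, whose interface Dnotitia has not published.

    # Minimal sketch: expose a search function as an MCP tool.
    # The corpus and ranking below are placeholders; in the real system the
    # lookup would query the Seahorse vector database on the same workstation.
    from mcp.server.fastmcp import FastMCP

    mcp = FastMCP("seahorse-search")  # assumed server name

    _CORPUS = [
        "Mnemos Workstation combines storage, GPU, vector database, and agent.",
        "Seahorse provides semantic search over S3-compatible object storage.",
        "FMS 2025 took place August 5-7 in Santa Clara, California.",
    ]

    @mcp.tool()
    def semantic_search(query: str, top_k: int = 3) -> list[str]:
        """Return up to top_k passages ranked by naive keyword overlap (placeholder)."""
        terms = set(query.lower().split())
        ranked = sorted(_CORPUS,
                        key=lambda text: len(terms & set(text.lower().split())),
                        reverse=True)
        return ranked[:top_k]

    if __name__ == "__main__":
        mcp.run()  # serve over MCP so any compliant agent or host can call the tool

Because the tool is exposed through the MCP standard rather than a bespoke plugin interface, any MCP-compliant agent or host can discover and invoke it, which is the property the Mnemos Agent relies on when it treats the Seahorse engine as a tool.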

The Mnemos Workstation integrated Dnotitia’s DNA foundation model, the high-performance Seahorse vector database, and an MCP-compliant AI agent into a single, compact device. This all-in-one design eliminated the need for external cloud or server infrastructure, allowing users to run advanced AI functions locally. Even without specialized expertise, users could perform complex tasks directly on the device. The Mnemos Agent was not a task-specific tool designed only for RAG use cases but a general-purpose AI agent that adhered to the MCP standard. With its fully web-based design, users could access it directly via a browser without the complicated process of installing an agent or MCP server on their own device.

In the demonstration, the Seahorse vector database’s performance advantages over a competing vector database were shown side by side under identical conditions, while real-time analysis revealed how vector database workloads vary depending on SSD characteristics. This went beyond simple numerical benchmarks, marking the industry’s first demonstration of how storage and databases operate and interact in a real AI environment.

“The Mnemos Workstation delivers a complete AI service in a local environment without a data center, integrating high-performance LLMs, a vector database, and AI agents into a single device,” said Moo-Kyoung Chung, CEO of Dnotitia. “As data becomes increasingly critical to AI services, SSD storage is also emerging as a core component of AI infrastructure,” he added. “Dnotitia has been driving innovation in this field, and this FMS demonstration was an opportunity to showcase those results.”

[About Dnotitia]

Dnotitia is an AI and semiconductor company that creates innovative value through the convergence of artificial intelligence (AI) and data, providing high-performance, low-cost LLM solutions. Leveraging its Vector Data Processing Unit (VDPU), the world’s first of its kind, the company offers ▲Seahorse, a high-performance vector database that supports Retrieval-Augmented Generation (RAG), a key technology for generative AI. Additionally, Dnotitia offers ▲Mnemos, a personal/edge LLM device based on its proprietary LLM foundation model.

  • Seahorse indexes various types of multi-modal data, such as text, images, and videos, into vector form, providing semantic search that retrieves information based on the meaning and context of user queries. Seahorse can be used not only in RAG systems but also to implement semantic search across all digital data stored globally; a minimal illustrative sketch of this pattern follows this list.
  • Mnemos is a solution designed to address the high cost and resource consumption of AI. It is a compact edge device capable of running high-performance LLMs without the need for a data center. Leveraging Dnotitia’s RAG and LLM optimization technology, Mnemos delivers high-performance LLM services using minimal GPU/NPU resources.
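
As a rough illustration of the embed-index-search pattern described in the Seahorse bullet above, the short sketch below uses the open-source sentence-transformers library as a stand-in, since Seahorse’s own API has not been published; the model name, documents, and query are illustrative assumptions, not Dnotitia’s implementation.

    # Illustrative only: the general pattern of indexing text as vectors and
    # answering a query by meaning rather than by keyword match.
    from sentence_transformers import SentenceTransformer, util

    model = SentenceTransformer("all-MiniLM-L6-v2")  # small general-purpose embedder

    documents = [
        "Quarterly revenue grew on strong demand for edge AI devices.",
        "The workstation runs the vector database and the agent on one box.",
        "Employees can book travel through the internal portal.",
    ]
    doc_vectors = model.encode(documents, convert_to_tensor=True)  # indexing step

    query = "Which machine hosts the database and the agent together?"
    query_vector = model.encode(query, convert_to_tensor=True)

    # Cosine similarity ranks documents by semantic closeness to the query.
    scores = util.cos_sim(query_vector, doc_vectors)[0]
    print(documents[scores.argmax().item()])

Because ranking is based on embedding similarity rather than shared keywords, the query retrieves the sentence about the workstation even though the two are phrased quite differently, which is the essence of semantic search.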

Founded in 2023, Dnotitia has grown to a team of over 100 employees in a short period and has established strategic partnerships across various industries. By integrating specialized semiconductors and optimized algorithms, Dnotitia aims to usher in a new era of AI. By fusing data with AI to develop AI with long-term memory, Dnotitia envisions low-cost AGI (Artificial General Intelligence) accessible to everyone, creating a future in which the benefits of AI can be enjoyed by all.

For more information about Dnotitia, please visit www.dnotitia.com.