Dell powers AI for enterprise communication with new platform

Dell enhances its AI Data Platform to support scalable search, analytics, and storage workloads that power next-gen collaboration and communication use cases.

Shubhendu Parth

Dell Technologies has announced updates to its AI Data Platform aimed at helping enterprises scale AI workloads across fragmented data environments, including communication-intensive use cases such as transcription, retrieval-augmented generation (RAG), and semantic search.


As AI adoption expands across industries, the need for infrastructure that converts siloed, distributed data into reliable AI outcomes has become more urgent. Dell’s enhanced AI Data Platform decouples data storage from compute, creating a modular foundation for AI training, inferencing, and real-time decision-making. The platform is a key component of Dell’s broader AI Factory approach.

The updates feature storage engines such as Dell PowerScale and ObjectScale, which support parallel processing and multi-protocol access, the company stated in a press release. PowerScale is designed for AI pipelines involving RAG and inferencing and now integrates with emerging GPU platforms, including those from NVIDIA. ObjectScale, Dell’s high-performance Amazon Simple Storage Service (S3)-native object storage, now offers improved performance for small object handling and comes with deeper S3 integration.
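A RAG pipeline of the kind PowerScale targets retrieves relevant stored documents and feeds them to a model as context. The sketch below is a hypothetical, minimal illustration using toy bag-of-words vectors in place of a real embedding model; it does not use Dell or NVIDIA APIs:

```python
from collections import Counter
import math

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words term-frequency vector.
    # A production RAG pipeline would call a real embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse term-count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, corpus: list[str], k: int = 2) -> list[str]:
    # Rank stored documents by similarity to the query and keep the top k.
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

corpus = [
    "meeting transcript about quarterly sales targets",
    "object storage performance tuning notes",
    "call transcript discussing sales pipeline for Q3",
]
context = retrieve("sales transcript", corpus)
# The retrieved passages would then be stuffed into the model prompt:
prompt = "Answer using this context:\n" + "\n".join(context)
```

The storage layer's job in such a pipeline is serving the corpus and its vectors fast enough that retrieval does not become the bottleneck.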

Dell ObjectScale will soon support S3 over Remote Direct Memory Access (RDMA), expected to significantly improve throughput and reduce latency, enabling faster access to large volumes of unstructured data common in communication workloads.


Platform Tools Target AI Search and Analytics

Dell also shared that it has introduced new data engines in collaboration with software partners to make AI data more actionable. The Data Search Engine, built with Elastic, enables natural language interactions for AI models and supports metadata-rich search across billions of files—crucial for generative AI and RAG applications used in communication and content-heavy environments.
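Metadata-rich search of this kind — narrowing billions of file records by attributes before matching content — can be illustrated with a toy in-memory index. The field names and records below are hypothetical and do not represent Elastic's actual API:

```python
# Toy file index: each record pairs content with searchable metadata.
files = [
    {"name": "q3_call.txt", "type": "transcript", "owner": "sales",
     "text": "customer asked about pricing"},
    {"name": "arch.md", "type": "doc", "owner": "eng",
     "text": "storage pricing model overview"},
    {"name": "q4_call.txt", "type": "transcript", "owner": "sales",
     "text": "renewal discussion and pricing"},
]

def search(files, keyword, **meta):
    # First narrow the candidate set by metadata fields, then match
    # the keyword against document content.
    hits = [f for f in files if all(f.get(k) == v for k, v in meta.items())]
    return [f["name"] for f in hits if keyword in f["text"]]

results = search(files, "pricing", type="transcript", owner="sales")
```

Filtering on metadata first is what keeps content matching tractable at the billions-of-files scale the platform claims.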

In addition, a new Data Analytics Engine, developed with Starburst, allows seamless querying across data sources and integrates an “agentic layer” that automates documentation and insights using large language models. This engine is designed to accelerate AI-driven business decisions by connecting data across cloud and on-prem platforms.
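The core idea of federated querying is joining records that live in different systems as if they were one database. The sketch below emulates that in plain Python with two mock sources; it is not Starburst's interface, where the equivalent would be a single SQL statement spanning both catalogs:

```python
# Two mock "sources": one standing in for a cloud warehouse, one for an
# on-prem database. A federated engine lets one query span both; here we
# emulate the cross-source join in plain Python.
cloud_orders = [
    {"order_id": 1, "customer_id": 10, "amount": 250},
    {"order_id": 2, "customer_id": 11, "amount": 90},
]
onprem_customers = [
    {"customer_id": 10, "name": "Acme"},
    {"customer_id": 11, "name": "Globex"},
]

def federated_join(orders, customers):
    # Roughly: SELECT c.name, o.amount FROM orders o JOIN customers c
    #          ON o.customer_id = c.customer_id
    by_id = {c["customer_id"]: c["name"] for c in customers}
    return [(by_id[o["customer_id"]], o["amount"]) for o in orders]

rows = federated_join(cloud_orders, onprem_customers)
```

The value of an engine like this is that the join logic — and any LLM-driven "agentic" layer on top of it — sees one schema instead of two silos.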

The platform also incorporates support for NVIDIA cuVS, a hybrid keyword-vector search tool that helps speed up AI retrieval tasks. While not the central focus, NVIDIA’s technologies support enhanced AI inferencing and indexing capabilities in Dell’s infrastructure.
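Hybrid keyword-vector retrieval of the sort cuVS accelerates blends an exact-match score with a semantic-similarity score. The following is a minimal CPU-only sketch of that blending, not the cuVS API, which runs these operations on GPUs over real embeddings:

```python
import math

def keyword_score(query: str, doc: str) -> float:
    # Fraction of query terms appearing verbatim in the document.
    q_terms = query.lower().split()
    return sum(t in doc.lower() for t in q_terms) / len(q_terms)

def vector_score(q_vec: list[float], d_vec: list[float]) -> float:
    # Cosine similarity between precomputed embedding vectors.
    dot = sum(a * b for a, b in zip(q_vec, d_vec))
    norm = (math.sqrt(sum(a * a for a in q_vec))
            * math.sqrt(sum(b * b for b in d_vec)))
    return dot / norm if norm else 0.0

def hybrid_score(query, doc, q_vec, d_vec, alpha=0.5):
    # Weighted blend: alpha shifts emphasis between exact keyword
    # matches and semantic (vector) similarity.
    return alpha * keyword_score(query, doc) + (1 - alpha) * vector_score(q_vec, d_vec)

score = hybrid_score("pricing call", "transcript of pricing call",
                     [1.0, 0.0], [1.0, 0.0])
```

The blend lets a retrieval system catch both literal matches (product names, ticket IDs) and paraphrases that only embeddings would find.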


Technical Enhancements in Dell’s AI Storage Stack

Dell’s AI platform update also brings a series of infrastructure improvements designed to increase performance and efficiency for large-scale AI deployments. The PowerScale F710, which has now received NVIDIA Cloud Partner (NCP) certification, is engineered to support workloads at a scale of up to 16,000 GPUs.

According to Dell, the system delivers denser rack utilisation, reduces the number of required network switches by up to 88%, and lowers power consumption by 72% compared to previous configurations, enabling enterprises to optimise data centre resources more effectively.

The company has also expanded ObjectScale with a software-defined option that runs on Dell PowerEdge servers. This version delivers up to an eightfold performance improvement over earlier generations of all-flash object storage. It is tuned for high-speed access to small objects, particularly relevant for AI-driven communication use cases where frequent metadata lookups and retrieval operations are common.


Moreover, Dell plans to introduce S3 over RDMA support in a forthcoming tech preview due in December 2025. This enhancement is expected to significantly accelerate object storage performance, offering up to 230% higher throughput, 80% lower latency, and 98% lower CPU utilisation when compared with traditional S3 implementations.
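As a quick sanity check on what those percentages mean relative to a baseline S3 implementation (taking the baseline as 1.0 for each metric):

```python
# Normalised baseline values for a traditional S3 implementation.
baseline_throughput = 1.0
baseline_latency = 1.0
baseline_cpu = 1.0

# Dell's claimed deltas for S3 over RDMA, applied to the baseline:
rdma_throughput = baseline_throughput * (1 + 2.30)  # 230% higher -> 3.3x
rdma_latency = baseline_latency * (1 - 0.80)        # 80% lower  -> 0.2x
rdma_cpu = baseline_cpu * (1 - 0.98)                # 98% lower  -> 0.02x
```

In other words, the claim amounts to roughly 3.3 times the throughput at a fifth of the latency and a fiftieth of the CPU cost.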

Such improvements aim to support enterprises that manage large volumes of unstructured data, including transcripts, logs, communications, and media artefacts.

Beyond performance upgrades, Dell has introduced deeper S3 integration, bucket-level compression and new optimisations for small object handling. The company says these updates can deliver up to 19% higher throughput and 18% lower latency for 10KB objects, improvements that benefit communication-intensive AI workloads that constantly ingest and reference small pieces of data.


Dell says these improvements aim to help enterprises scale from AI prototypes to full production, particularly in sectors such as healthcare, manufacturing, and telecom, where communication workflows and data sensitivity are central.

“AI’s success depends on unlocking enterprise data,” said Arthur Lewis, President, Infrastructure Solutions Group, Dell Technologies. “With our modular platform and trusted collaborators, we are helping organisations move faster from pilot to production.”