What Moveworks' New Collab Means for Accessible and Scalable GenAI Solutions

By Alex Passett

The Moveworks M.O. is clear: Connect more AI innovators around the world.

Founded in 2016, Moveworks strives to improve AI users’ experiences via its generative Copilot platform. By providing a user-friendly interface for AI self-service, Moveworks helps users a.) feel adequately supported, and b.) confidently focus their time on other areas of impactful work. Its secure, conversational interface gives organizations’ employees a great outlet for taking action, searching for general information, querying specific datasets, tailoring received notifications, and creating content across myriad enterprise applications.
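
Moveworks hasn't published the internals of that interface, but the general pattern it describes (a natural-language request mapped to an action in an enterprise system) can be sketched with OpenAI-style tool calling. The snippet below is purely illustrative: the client setup, the create_it_ticket tool, and its arguments are hypothetical placeholders, not Moveworks APIs.

```python
# Illustrative sketch only, not Moveworks code: a conversational request is routed
# to a (hypothetical) enterprise action via OpenAI-style tool calling.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

tools = [{
    "type": "function",
    "function": {
        "name": "create_it_ticket",  # hypothetical enterprise action
        "description": "Open an IT support ticket on the employee's behalf.",
        "parameters": {
            "type": "object",
            "properties": {"summary": {"type": "string"}},
            "required": ["summary"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "My laptop won't connect to the office Wi-Fi."}],
    tools=tools,
)

# If the model decides to act, it returns a structured tool call instead of prose.
call = response.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))
```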

And just last week, Moveworks announced a new multi-year collaboration deal with a company we think at least some of y’all (said with heavy sarcasm, readers) may have heard of before:

It’s Microsoft.

According to the official announcement, this collab “marks a pivotal step in standardizing Moveworks’ AI and ML platform on Microsoft Azure, which further underscores Microsoft Azure’s ability to meet the demands of Moveworks’ Next-Generation Copilot.”

For context, Moveworks Copilot is powered by the Moveworks Reasoning Engine, which combines Moveworks’ proprietary enterprise LLM, “MoveLM,” with Azure OpenAI’s GenAI capabilities. By leveraging Azure OpenAI Service’s retrieval-augmented generation, LLM chaining, and prompt engineering, Moveworks Copilot users can interact with it (and, subsequently, take action) across their business systems through natural language inputs.
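
The announcement doesn't include implementation details, so the following is a minimal sketch of what "retrieval-augmented generation on Azure OpenAI Service" typically looks like: retrieve relevant internal text, then prompt a deployed model with that context. The endpoint, key, deployment name, and the tiny in-memory knowledge base are all hypothetical stand-ins, not Moveworks' actual pipeline.

```python
# Minimal RAG sketch against Azure OpenAI Service; all names below are placeholders.
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://YOUR-RESOURCE.openai.azure.com",  # placeholder endpoint
    api_key="YOUR-AZURE-OPENAI-KEY",                          # placeholder key
    api_version="2024-02-01",
)

# Hypothetical in-memory "knowledge base"; a production system would query a vector store.
KNOWLEDGE_BASE = {
    "vpn": "To reset VPN access, open the IT portal and select 'Request VPN reset'.",
    "payroll": "Payroll questions are handled by HR via the #payroll-help channel.",
}

def retrieve(query: str) -> str:
    """Naive keyword retrieval standing in for a real retrieval layer."""
    hits = [text for key, text in KNOWLEDGE_BASE.items() if key in query.lower()]
    return "\n".join(hits) or "No internal documents matched."

def answer(query: str) -> str:
    context = retrieve(query)
    # Prompt-engineering step: ground the model's answer in the retrieved context.
    response = client.chat.completions.create(
        model="gpt-4o",  # placeholder Azure deployment name
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {query}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How do I reset my VPN access?"))
```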

Per Bhavin Shah, co-founder and CEO of Moveworks:

“We are seeing significant customer demand for our new Next-Generation Copilot, and Microsoft Azure provides the stability, security, and privacy standards required to deliver that experience at a first-class level. With Microsoft's flexible offerings, including the latest NVIDIA GPUs and Azure OpenAI Service’s provisioned throughput units (PTUs), we are able to build and deliver cutting-edge generative AI use cases to our customers at scale.”

And from Microsoft’s Katy Brown, CVP, Software and Digital Platforms:

“Microsoft's collaboration with Moveworks demonstrates Microsoft Azure OpenAI Service's role in reshaping how businesses operate. Together, we're directly embedding cutting-edge AI into employees' daily tasks, and responding to our customers' calls for deeper, actionable intelligence. Through this collaboration we are unlocking new levels of productivity and innovation by making generative AI accessible and secure across diverse industries.”




Edited by Greg Tavarez