Megan: Yeah. That trust piece is just so important, isn’t it, today? And Sam, as much as AI has impacted audio innovation, audio has also had an impact on AI capabilities. I wondered if you could talk a little bit about audio as a data input and the advancements that technologies like large language models, or LLMs, are enabling.
Sam: Absolutely. Audio is really a rich data source that’s added a new dimension to AI capabilities. If you think about speech recognition or natural language processing, they’ve made significant advances because of the audio data that feeds them. And to Brendan’s point about trust and accuracy, I like to think of the products Shure provides to customers as essentially the eyes and ears in the room for leading AI companions, like the Zoom AI Companion. You really need that pristine audio input to be able to trust the accuracy of what the AI generates. These AI Companions have become instrumental in the way we do business every day. Between transcription, speaker attribution, the ability to add action items within a meeting, and tracking what’s happening in our interactions, all of that relies on accurate, pristine audio going into the AI. I feel that further improves the trust our end users have in the results of AI and lets them leverage it more.
If you look at how audio inputs enhance an interactive AI system, they enable more natural and intuitive interactions with AI. It really allows for seamless integration and the ability for users to use it without having to worry about, is the room set up correctly? Is the audio level proper? And when we talk about agentic AI, we’re working on future developments where systems can self-heal, detect issues in the environment, autocorrect, and adapt across all these different environments, further enabling the AI to do a much more effective job, if you will.
Megan: Sam, you touched on future developments there. I wonder if we could close our conversation today with a bit of a forward-looking view. Brendan, can you share innovations that Zoom is working on now, and what are you most excited to see come to fruition?
Brendan: Well, your timing for this question is absolutely perfect because we’ve just wrapped up Zoomtopia 2025.
Megan: Oh, wow.
Brendan: And this is where we discussed a lot of the new AI innovations that we have coming to Zoom. Starting off, there’s AI Companion 3.0. We’ve launched this next generation of agentic AI capabilities in Zoom Workplace. And with 3.0, when it releases, it isn’t just about transcribing; it’s really turned into a platform that helps you with follow-up tasks, preps you for your next conversation, and even proactively suggests how to free up your time. For example, AI Companion can help you schedule meetings intelligently across time zones, suggest which meetings you can skip and still stay informed, and even prepare you with context and insights before you walk into the conversation. It’s about helping people focus on strategy and creativity instead of administrative busywork. And for hybrid work specifically, we introduced Zoomie Group Assistant, which will be a big leap for hybrid collaboration.
It acts as an assistant in group chats and meetings, so you can simply ask, “@Zoomie, what’s the latest update on the project?” or “@Zoomie, what are the team’s action items?” and get instant answers. Or, because we’re talking about audio here, you can go into a conference room and say, “Hey, Zoomie,” and get help with things like checking into a room, adjusting the lights or temperature, or even sharing your screen. And while all of these are built-in features, we’re also expanding the platform to allow custom AI agents through our AI Studio, so organizations can bring their own agents or integrate with third-party ones.