Razer’s CES 2026 Vision Puts AI Right Next to the Player

If there was a single takeaway from CES 2026, it was that Razer is thinking well beyond mice, keyboards, and headsets. The company's show wasn't about one headline-grabbing product so much as a collection of ideas that all point in the same direction: AI moving closer to the player, the desk, and even the body, rather than living somewhere off in the cloud.

Project AVA is a good place to start. What began as an AI esports coach has turned into a small animated desk companion designed to sit next to a PC. It watches what's happening on screen, adapts over time, and is meant to help with everything from gaming to everyday tasks. The concept taps into a broader trend of always-present AI assistants, the kind that feel more like a background helper than a chatbot you have to summon. The real test, as always, will be whether it feels genuinely useful or just another thing competing for attention.

Razer is also experimenting with what happens when AI becomes wearable. Project Motoko is a headset concept built on Snapdragon hardware, combining first-person cameras, spatial audio, and a dense microphone array to interpret what's going on around you. In theory, it could assist with gaming, work, or daily life by adding context and awareness in real time. It also raises the familiar questions that follow wearable tech everywhere, including comfort, privacy, and whether people actually want to wear something like this for hours at a time.

Away from the consumer-facing concepts, Razer is clearly trying to get the attention of developers. The Forge AI Developer Workstation is designed for teams that want to run and train AI models locally, with an emphasis on speed, control, and predictable costs. Instead of leaning on subscriptions and cloud services, the pitch is about keeping compute on the device, where latency is lower and data stays closer to home. That approach lines up with a growing interest in edge and on-premises AI, especially as models get larger and more demanding.

To make that hardware easier to use, Razer also introduced AIKit, an open-source toolkit that simplifies local AI workflows. It handles things like GPU detection and clustering, so developers can focus more on building and tuning models and less on setup. Keeping the project open on GitHub suggests Razer understands that developers want flexibility and transparency, not another closed ecosystem to manage.

That same mindset shows up in Razer's partnership with Tenstorrent. Together, they introduced a compact AI accelerator that connects over Thunderbolt 5 and delivers desktop-level AI performance in a portable package. It's aimed at developers who need serious compute for tasks like language models or image generation, but don't want to be tied to a full workstation or server. The modular design, which allows multiple units to be linked together, makes it feel more like a building block than a fixed solution.

Razer didn't ignore the living room, either. The Wolverine V3 Bluetooth controller, developed with LG, is tuned for cloud gaming on TVs, with low-latency wireless performance and built-in TV controls. It's a reminder that gaming is increasingly happening on big screens in shared spaces, not just at desks with dedicated hardware.

The company also leaned back into comfort and immersion with its seating announcements. Project Madison imagines a chair that adds haptics, spatial audio, and lighting into the mix, while the Iskur V2 NewGen focuses on more practical improvements like cooling and ergonomic support for long sessions. One is clearly experimental, the other immediately usable, and together they show how Razer continues to think about the physical side of extended play.

Taken together, Razer’s CES lineup feels less like a sales pitch and more like a snapshot of where the company is placing its bets. Local AI, open tools, and tighter integration between hardware and software are all central themes. Not every concept will make it to market, and not every idea will resonate with everyone, but the direction is clear. Razer is exploring how intelligence built directly into devices might quietly change how gaming and everyday computing fit into daily life.
