How HP sees the era of the AI PC | Alex Cho




PC makers are betting everything on the AI bandwagon. At the CES 2025 tech trade show in Las Vegas last week, AI technology was everywhere.

And the most visible AI at the show came in the form of the AI PC, built by a host of the world’s largest computer makers using chips from the biggest semiconductor companies, such as Nvidia, AMD, Intel and Qualcomm. The computers took the form of laptops, workstations and desktops.

I caught up with Alex Cho, president of HP’s personal systems division, at CES 2025. HP showed off a bunch of AI PCs and easily customized gaming accessories. Cho believes that bringing AI into our everyday workflow through the PC makes sense because we’re all working too hard.

Based on a survey of 15,000 people, HP believes that only 29% of us have a healthy work relationship. But with AI skills, people may find they can automate a lot of the drudgery that bogs them down. And putting more AI processing power in the PC — the edge of the network — makes sense because running LLMs locally means your data can stay private in your own home. It also means the network won’t get bogged down with traffic, responses come back in real time, and energy consumption is lower as well.
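To make that edge-AI argument concrete, here is a minimal sketch of what on-device inference looks like in code. It assumes the open-source llama-cpp-python package and a quantized GGUF model already downloaded to disk; neither is HP's stack, and an AI PC's NPU-tuned runtime would look different, but the data-locality point is the same: the prompt and the response never touch the network.

```python
# Minimal local-inference sketch (illustrative; not HP's software stack).
# Assumes: pip install llama-cpp-python, plus a quantized GGUF model on disk.
from llama_cpp import Llama

# Load the model from local storage; no cloud endpoint is involved.
llm = Llama(model_path="./models/local-assistant.gguf", n_ctx=4096)

# Run a prompt entirely on the local machine.
result = llm(
    "Summarize the action items from these meeting notes:\n"
    "- Vendor contract renews in 30 days\n"
    "- Two engineers roll off the project next sprint\n",
    max_tokens=200,
    temperature=0.2,
)

print(result["choices"][0]["text"])
```

Whatever runtime an AI PC actually ships with, that is the argument Cho makes below: the documents and the generated text stay on the device, and latency depends on local hardware rather than a round trip to a data center.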

The AI processing power in the newest HP machines is around 55 TOPS (trillions of operations per second), and applications such as AI companions are getting more useful, Cho said. HP showed off its Omen AI, which optimizes a computer's settings so games run as well as possible.

Here’s an edited transcript of our interview.

Alex Cho is president of HP Personal Systems.

VentureBeat: What are some things you’re excited about?

Alex Cho: We’ve been building and sharing our strategy around the future of work. We think that’s a massive opportunity because of where the world is going. Employees and employers are looking to enable work that’s far more flexible. That also coincides with the second year of a large body of work we’ve done around the work relationship index–we just finished the second round last year. It’s a very big body of work. People’s work relationship index is very low. I think 29%. It’s only changed a point. This is across 15,000 people in multiple countries. Only about one in four have a healthy work relationship. If you think about the implications, that’s a big deal for the world, for countries, for companies, for people.

One of the key drivers that improves that work relationship index is confidence in capabilities and having the skills needed for the future, AI being one of the top ones. Those who are actively using AI, their work relationship index is 11 points higher. Very pragmatically, we see that AI is going from a general concept to an area of skill sets where daily use is improving the work relationship index, which is relatively low. In that space, we’re excited about how we have a portfolio of products and solutions around work.

If you look at our portfolio, no one spends more time with that employee than us. They’re going to be on a PC from HP, a docking station, which now many people have because they like having an HP display, or multiple displays. They got hooked on that productivity value during COVID. Our printers. People are printing out. A lot of work is through Zoom and Teams, whether in the office or at home. Services that not only enable an IT manager to support the employee whether they work from home or in the office, but through our workforce experience platform, allow us to facilitate and manage that digital work experience across all those devices.

That breadth of portfolio is second to none. We feel very good about having that large scope, having all those endpoint devices to be a place where we’re bringing AI – for the value of local AI, for privacy. Cost is a big driver of why ISV developers want to run more AI locally. Also the security and privacy, and some of the latency benefits of local. Having each of those endpoint devices be a place delivering new AI experiences for that customer–we’re able to deliver that in a way that enables employees to work more productively. It’s about more than productivity. Work becomes a place that can enable more growth and fulfillment because of what we enable.

HP Omen Max 16 gaming laptops.

That’s what we’re very excited about. At CES we’ve announced our key proof points. The world’s most performant and secure platform for running AI on a PC. Software experiences that allow you to capitalize on running AI locally, whether it’s from HP – we call it AI Companion, for things like a local RAG research model – or from ISVs who we’re working with to capitalize on the AI PC.

There’s the wonderful world of work, where we have leadership. But a key component to that is play as well. We’re bringing the same capabilities to our gaming portfolio. We’ve announced some of the highest-performance gaming devices, along with ways for those devices to work together with our HyperX peripherals. Low latency, instant-pairing wireless, now able to manage across three devices. We’re using AI to help even the core gaming experience through Omen AI, which optimizes, in real time, your OS, CPU, GPU, and the title, so you have the maximum frames per second.

I know it’s a lot. We’re making the stuff real. We’re doing it in a way that’s unique and differentiated and meaningful for customers.

VentureBeat: A lot of the Nvidia talk was about agents, how you’re going to manage a bunch of agents that will do your work for you. Figuring that out is a lot of the task for humans. What agents can help me with my job? I have a gigantic knowledge base inside my Gmail, but I can never figure out the search to mine it for good things. If somebody gave me an agent that could do that for me, so I don’t have to figure it out, that would be very valuable.

Cho: The space is rapidly evolving, from models–the model area is obviously rapidly advancing. Better models, reasoning models. We’re working very much on bringing models locally to the device, so you get the advantages of local AI. Our view is that it will be a hybrid environment. You’ll do a lot of AI inferencing in the cloud, but also locally, for those benefits around cost, latency, privacy, and security. Then there’s–agentic workflow is the buzzword, I guess. But really, agents that are going to help facilitate key tasks or workflows.

This is why we think it ties to our vision around the future of work, that employee-work relationship index. You start seeing that in our marketing campaigns around the AI PC. When you allow someone to do the work that they love, when you have a lot of the tasks managed through AI, that frees up the employee, the worker, to be engaged at a different level. A lot more fulfillment. Work won’t feel like work. That’s what we think we have a unique opportunity to deliver across the portfolio of work devices that we have.

VentureBeat: It’s an elevated way of looking at what products to design.

HP AI PCs

Cho: The past few years, when we talk about HP–are we a PC company? Are we a printer company? In the future our clear ambition is that we want to be the work company, because of that portfolio breadth, because of the ability to bring AI locally, because of the unique security we can enable through the performance leadership we’ll provide, through the ability to manage it as a service and solution through our workforce experience platform. We think that’s an area of tremendous value for the future of work, an area of real differentiation we can provide, and an area of growth for us. That will also be higher-value growth.

VentureBeat: How has the AI PC been changing? How is it noticeably different from a year ago?

Cho: We’re starting to see the growth of ISVs and applications that are capitalizing on running AI locally. Remember, our discussion before was–it’s the changing use cases, or new use cases and experiences that these devices will provide, that will be the real catalyst for category transformation. We’re increasingly seeing those happen. It’s early days. We went from a couple of proof points to now seeing a broader set of titles. We’re actively working with the ISV community on helping them capitalize on the AI PC. A lot of what we’re showing for our customers and partners is those actual applications and use cases.

Perfect example: we talked about a product manager. Before, they were doing all their market research, customer panels and interviews. Those take multiple sessions. You have to schedule them. You have concept design. You have to work with a marketing agency to render some stuff. Then you present it to get feedback. You’re trying to communicate that feedback in the most compelling way.

Now, running AI locally, we have what we call AI Companion, which is basically a local RAG model for analysis. You get huge segmentation available for that product manager. You can run focus groups in multiple languages and have live translation. Before, that would take weeks. If you have a concept design, instead of going to the marketing department–at HP we go to the design guys, actually, and say, “Can you render this?” “Nah, I don’t have the time.” Now they can use AI tools locally and render that. They’re very good for driving discussion. Is there value to this? Being able to present that to a group of people and get real-time feedback–how are you presenting? Are you clear? Are you touching your face too much? Are you talking too fast? That changes a day in the life of the product manager.

The first thing we’re seeing on the AI PC front is these titles starting to come in. They’re increasing. They’re meaningful. That’s one thing. Second, when we last talked, that was primarily when we saw our first NPU-enabled devices, which were ARM-based. You’re seeing the introduction of x86-based now. We’ve introduced them here at CES, and at the end of last year. In that space, we’re leading the pack with the level of TOPS performance at 55 TOPS. We’re doing some unique co-engineering with AMD. We’re using our security stack to deliver a more secure platform, which will be increasingly important in AI. We see x86. We see the portfolio expanding there.

The other thing that you’re seeing–we’re tying these devices together and starting to create a better-together experience across the portfolio. Whether it’s headsets working with the device, you’re going to see a lot more of that. Which we can uniquely do as HP because of the portfolio breadth we have. We’re also, on our side, using AI in our workforce experience platform, so that IT managers managing those devices have a rich set of telemetry and AI that helps them proactively solve and remediate any kind of issues that would impact how employees use the device. There’s development in the industry, and then there’s a lot that we’re doing on top of that, adding more leadership.

Have you seen AI Companion? We’re using AI models and the ability to add in your own data. You can summarize. The ability to take a large file, analyze it, table it, synthesize it, without having any of your data going out there, that’s a rich opportunity for value. When it comes to work, it’s not about data. It’s about insights.
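As a rough picture of what a local RAG workflow like AI Companion involves, the sketch below shows the retrieval half of the pipeline: chunk a local file, embed the chunks, and pull back the passages most relevant to a question, all on the device. The embedding model, file name, and helper functions here are illustrative assumptions, not HP's implementation.

```python
# Illustrative local RAG retrieval sketch; not HP AI Companion code.
# Assumes: pip install sentence-transformers numpy
import numpy as np
from sentence_transformers import SentenceTransformer

# A small embedding model that runs entirely on the local machine.
embedder = SentenceTransformer("all-MiniLM-L6-v2")

def chunk(text: str, size: int = 500) -> list[str]:
    """Naive fixed-size chunking of a local document."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def top_chunks(document: str, question: str, k: int = 3) -> list[str]:
    """Return the k chunks most similar to the question (cosine similarity)."""
    chunks = chunk(document)
    doc_vecs = embedder.encode(chunks, normalize_embeddings=True)
    q_vec = embedder.encode([question], normalize_embeddings=True)[0]
    scores = doc_vecs @ q_vec  # dot product of unit vectors = cosine similarity
    best = np.argsort(scores)[::-1][:k]
    return [chunks[i] for i in best]

# The retrieved context plus the question would then go to a local LLM
# (for example, the on-device runtime sketched earlier) to produce the answer.
document = open("quarterly_report.txt", encoding="utf-8").read()
question = "Which product lines grew fastest last quarter?"
context = "\n---\n".join(top_chunks(document, question))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```

Nothing in that loop leaves the machine, which is the privacy point Cho keeps returning to.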

This is interesting. They put in three resumes. Now, prioritize these candidates for a position at an airline. Think about that. You could always print them all out and identify what’s needed, or you can use an AI model. Prioritization and rationale. When you start talking about the type of insights that you can drive from data, the broad bodies of data–this is more than just a new product. This is enabling a new level of productivity. We can do this with the highest performance and the most security, tied to a device that has the best in audio and video. Most people are using Teams and Zoom all day long. Our Poly acquisition and the assets we’re now integrating allow us to offer the best in how you can show up, how we capture you.

Suddenly this thing becomes more than just a refreshed device. It becomes the basis by which you’re being empowered as an employee. I love that. You know my excitement about–this is the beginning of the next decade.

HP Omen AI configures your PC for the right gaming settings.

VentureBeat: It’s picking the best resume for the job?

Cho: It’s looking at the attributes needed for the industry, which are probably in the public domain. It takes the resumes as they are and prioritizes them based on a set of criteria it’s identified. Think about how long that would have taken. Insight generation. You talked about agentic workflow. It’s really insights to action. If we’re moving this world from data – which is largely the age that we’re in in computing, it’s about data – to insights to action, that’s a profound impact that we think will unlock a lot. We’re excited because that’s very much around the work space. That’s why we say we want to be the leader in work. We’re leveraging AI at the edge and we have the broadest portfolio to do that.

VentureBeat: Do you think it’s part of HP’s job, or someone else’s job, to do things like coming up with useful queries for AI? How do I search through my Gmail to get insights, something like that? If you lay out for the user the kind of thing you want to feed into your AI model to get an answer, so you can get to insights quicker, is it part of HP’s job to do that?

Cho: I would start with, our job is to deliver a platform that can do that type of AI workflow with the highest performance and most security and privacy. Second, it’s our role particularly to help enable the community of ISVs to develop applications that capitalize on that. That’s why a lot of what you’ve seen from us in the past nine months is around being a catalyst for the ISV community to leverage that. On top of that, we think there are some core experiences that we ourselves will invest in developing. What I just showed you around AI Companion is one of them. One, it’s taking AI models, but it’s also uniquely optimizing the management of that on this device with your local data secured. That’s a key experience that we uniquely own.

The other area where we think we have a unique role is–remember, we’re not just in PC. We’re in work. Across the device, across the printer, across the room, and having a workforce experience platform that’s able to give you a level of management and facilitation of the digital experience across all of them. We think that’s our role. An employee doesn’t just use one device. The other role is for us to increasingly have all our devices work better together. That’s what we think our role is. We don’t own all of it for sure, but there are some key elements where we feel like we have an opportunity to contribute, differentiate, and add value for the customer.

There’s some great work being done by ISV providers. We’re working with them on helping them capitalize on how you can do those things faster, more cost-effectively–cost is a key driver, because for a lot of those application developers, the cost of delivering from the cloud is very high. Being able to do that with information and data that’s private and secure, that you want to keep locally versus putting it out in the public domain. And then giving them the highest performance for AI and a secure platform to do that.

VentureBeat: This one where you can print from a web page and it doesn’t come out all wonky, I thought that seemed particularly useful.

Cho: Think about Excel. When you try to print an Excel document, you can’t just go “print.” You’ll get 18 pages of a single column. In the past, you’d have to set a print area, and then that ends up being so small you can’t read it. Maybe adjust the margins. Maybe truncate things. The ability for AI to reduce the friction of these experiences is also huge. We talked about all these new things, but what we’re excited about–as we’re fusing AI into the core experiences and reducing friction, it’s improving those types of things, like what we call a perfect print. That’s so meaningful. I get to focus on what I care about, instead of reformatting this stuff. That’s just part of the larger experiential transformation that we feel like we can uniquely deliver across a portfolio of endpoint devices.

VentureBeat: I like the thinking about cables that you’ve put forward as well. Plug in this cable and your monitor is going to work with your laptop now.

Cho: We’ve done a lot around that. There’s multiple tiers. One is, let’s just make the cables and adapters smaller. You may have seen some of that. Second, let’s reduce the number of cables. Third, reduce it even more and get to wireless. That’s that wireless B. There’s multiple elements. Wireless B instantly connects. It’s not hard to connect. Let’s do it with low latency. In our gaming announcements, instead of having that dongle for wireless, it’s embedded and designed into our Omen PC with our HyperX headset. What we announced at CES is also not for one peripheral, but for three. You want to have your keyboard, mouse, and headset all connected. We think that’s part of the overall solution ecosystem we can enable. We call it Better Together.

HP Z2 Mini G1a Workstation Desktop PC

VentureBeat: The pace of change for the AI PC over the year looks like it’s been good.

Cho: A lot is happening. Silicon, applications, memory. We announced, with our AI workstations, not only the highest TOPS performance, but also the integration of memory. A lot of these models are more memory-constrained. Again, we’re raising the bar and setting the standard in providing the industry’s most performant AI workload support on our devices today.
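For a rough sense of why memory, and not just TOPS, becomes the constraint, model weights alone scale with parameter count times the bytes used per parameter. The figures below are generic back-of-the-envelope estimates, not HP or AMD specifications, and the KV cache and activations add more on top.

```python
# Back-of-the-envelope model memory footprints (weights only, illustrative).
def weight_footprint_gb(params_billions: float, bits_per_weight: int) -> float:
    """Approximate gigabytes needed just to hold the model weights."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params, bits in [(7, 16), (7, 4), (70, 4)]:
    gb = weight_footprint_gb(params, bits)
    print(f"{params}B parameters at {bits}-bit: ~{gb:.1f} GB of weights")
```

At 16-bit precision, a 7-billion-parameter model already needs roughly 14 GB just for its weights, which is why on-device inference leans on quantization and why workstation memory capacity matters.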

VentureBeat: Is there a role for HP in these energy-related issues that are coming up because of AI? We have data centers talking about using nuclear power.

Cho: We have a point of view on this. This is meaningful for customers. First, I’ll set aside AI for a second. We’re very much focused on sustainability. That’s not new. I feel like I’ve talked to you about this for years. It’s an ongoing focus that hasn’t changed. The world’s most sustainable portfolio. We’ve done that with all the transformation of our materials. We have recycled material in 100% of our devices. We’ve moved 100% of our packaging to sustainable sources. But as well, we are now the leader across our devices and our portfolio in EPEAT Gold, which includes energy consumption. That leadership is something we’ve been building and will continue.

Second, on the topic of AI, again, we believe that it will be an environment where AI will operate from the cloud and also locally. The real growth, we believe, will be much more inference locally, for the benefits in cost. But cost is tied to energy. And then latency, security, and privacy. The ability to run those models locally versus the cost of having support in the cloud–remember, with AI you’re iterating all the time. Over a period of time, we think that contributes to the broader value proposition of our endpoint devices. Not only the most sustainable, but when you run AI locally, it will be a more cost-effective solution. Not for all. There’s clearly a huge need for cloud-enabled inference.

VentureBeat: Have customers weighed in and told you that there’s so much demand for data center expansion that the sustainability infrastructure is not there to support it all?

HP OMEN 27qs GS gaming monitor.

Cho: We intersect with that in the following way. Yes, there’s a lot of dialogue around continued growth and the limitations around cloud data centers. But that’s not the center point. Where it floats down to us is, if you’re an application developer, the cost of delivering your experiences in the cloud versus locally shifts the equation such that you want to deliver locally. Many of those ISV companies we work with are saying, “I would rather enable a local model to deliver the experience, because of the cost to deliver from the cloud.”

That’s the way we intersect with it. That’s why we’re rapidly working on building these devices to support higher AI workloads, to do that in the most secure manner, and to do that in a way where these ISVs can optimize to deliver their value propositions. That’s why we see that. Inference at the edge will be the domain of real growth in the market. We have a unique opportunity to add value there.

VentureBeat: Do the microprocessor companies seem to have the same strategy for how to expand their AI capabilities enough for you?

Cho: There’s a lot of energy and innovation happening in the silicon space, which is good. You have this entire ecosystem innovating in the space. We think that’s great. How do you support more AI workloads and more demanding AI workloads? TOPS is a good measure of that locally. There’s a lot of focus on that. How do you optimize power consumption? That’s where you see a lot of focus too. More AI performance, but performance that minimizes impact. You see a lot of that.

They’re partnering with us very closely. We’re partnering very closely with them on identifying, for customer segments, what their needs are. We’re doing a lot of co-engineering to make sure the entire stack–we really curate the entire stack, from the OS to the silicon to the firmware to the security to the I/O and everything I mentioned around audio and video. We’re working with them to curate and solution-engineer for each of the target segments. Gamers, data scientists, knowledge workers.

VentureBeat: It was interesting to see Nvidia talk about DLSS 4, where they’re using AI to predict what the next pixels are going to be. For every 33 new pixels coming up, only two of them have to be drawn. The other 31 can be predicted. Bringing AI into the processing provides these dramatic performance speed-ups. It helps performance evolve faster than Moore’s Law would allow.
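The rough arithmetic behind that ratio, as a sketch: assume the GPU renders internally at 1080p and DLSS then upscales and generates frames to produce four 4K frames. Those render and output resolutions are assumptions for illustration, not figures from the interview.

```python
# Rough arithmetic behind the "2 drawn, 31 predicted" ratio (illustrative only).
rendered = 1920 * 1080            # ~2.1M pixels actually rasterized
displayed = 4 * (3840 * 2160)     # ~33.2M pixels shown across four 4K frames
predicted = displayed - rendered  # everything else is inferred by the model
print(f"rendered ~{rendered / 1e6:.1f}M, predicted ~{predicted / 1e6:.1f}M, "
      f"total ~{displayed / 1e6:.1f}M pixels")
```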

Cho: The reason why we look at AI being exciting for this category–it’ll take existing challenges and improve them. It will address friction points, which is great. We talked about printing. It enables new experiences. You’re not just doing the old things faster and solving friction, but actually doing new things. That’s why we think that the role–think about this device. It could be any PC. It’s been such an important instrument of knowledge and productivity. But in general, people have been doing the same types of things.

HP ZBook Ultra 14-inch G1a mobile workstation PC.

We measure what apps people use. It hasn’t changed much. Browsing, productivity, office apps, and then some kind of entertainment, like Candy Crush. In the future, what you can do on these devices is going to expand so much more. That’s exciting. This device that has been such an important instrument for that type of utility will dramatically expand. The direction is going to be more and more personalized, helping you get more insights.

Third, it’s going to enable more agentic workflows, more streamlined processes. The input process of this has been relatively consistent. Keyboard, mouse, maybe a little bit of voice and touch. The ability to take multimodal inputs – voices, gestures, using the camera – because of AI’s ability to manage multimodal streams–it’s powerful and exciting. A good time to be in this space. We’re excited about leveraging our assets and capabilities to deliver new value for customers.

VentureBeat: There’s also something interesting about all the digital twin talk that’s been going on. For gaming, the line between simulation and games is disappearing. Games are becoming these vast simulations, like Microsoft Flight Simulator 2024. It’s a simulation of the earth. You fly around an accurate simulation. The interplay going on between physical and digital because of digital twins, this feedback cycle that’s happening where each one improves the other–we’ve talked about robotics in the physical world as well. It’s another part of making work better, more accurate, maybe more practiced? Does that feed into some of what you think about the changing nature of work?

Cho: The intersection of physical and digital, we’ve talked about that for a while. We’ve had different terms for it. Now AI is a massive accelerant for that. That’s why having an endpoint portfolio that we think is the broadest for work–that being a basis by which you can capture a lot more than just through a keyboard and mouse. You can capture through microphones and cameras. A lot of the assets we gained through Poly are around audio and video, processing that. Analog to digital, physical to digital, having that as a part of your overall base of data that you can use to drive actionable insights.

We think all of that ties well together to the opportunity we have. Our ability to lead in the future of work, bringing all of those things together with assets we already have, is going to add a lot of value for our customers.


