However, with decades of experience providing high-performance computing (HPC) and storage solutions to the industry, the company is well-positioned to provide customers with the infrastructure they need for innovative AI initiatives.
The global tech giant’s goal is to enable customers to create their own “private AI cloud” covering a wide range of AI use cases, from autonomous driving to large language models (LLMs) and bioscience.
I recently spoke with Mark Armstrong, vice president and general manager of AI for HPE’s EMEA region, about how organizations ready to scale their artificial intelligence (AI) deployments can benefit from partnering with an infrastructure provider, and the challenges they face along the way. Here are some highlights from the chat; you can read the full interview here.
Optimized workload management
Getting the infrastructure and architectural elements in place is always important when deploying enterprise AI, and that is generally where HPE begins, as Armstrong tells me: “The first aspect of that is working closely with the customer to determine what the right architecture is and calculating the workloads that the customer needs. And to be honest, I think we have the strongest team in the world in terms of our ability to understand how the technology works and optimize it for our applications.”
With its deep AI expertise and long history in the HPC space, HPE is uniquely positioned to help.
“When you look at the workloads required for generative AI, these requirements are very similar to high-performance computing, which is why we have spent the past two years successfully serving this new and upcoming generative AI market,” Armstrong tells me.
The company has also created a number of high-performance storage solutions of the kind needed to feed real-time streaming data to machine learning algorithms.
But beyond these technologies, Armstrong says a customer-centric approach is central to the organization’s strategy when helping customers optimize their workloads, ensuring that solutions are tailored precisely to individual customer needs.
“By leveraging the experience gained from deploying these large-scale systems, we can ensure we design the right solutions for customers looking to solve problems with generative AI,” he says.
Generative AI for critical business functions
One example of this partnership in action is the collaboration between HPE and Aleph Alpha. By deploying an HPC platform built on HPE Apollo 6500 Gen10 Plus systems, together with the HPE machine learning development environment, the German startup was able to create the explainable and auditable AI solutions that its customers, in both the private sector and government, need today.
In this case, the concept of “data sovereignty” was important. The idea was to provide a solution that could process and act on sensitive data without compromising privacy or the commercial value of the data. This means professional clients such as lawyers and medical professionals can benefit from accessible analytics enabled by generative AI.
“Aleph Alpha has the desire to lead the development of next-generation AI… and they are demonstrating this with their remarkable strategy of making their LLM available in all major European languages. HPE supports that vision with the necessary computing architecture and solutions,” said Armstrong.
A further important aspect of HPE’s strategy in this area is its commitment to providing flexible procurement and usage models.
“Coupled with that are [HPE’s] capabilities that allow our customers to procure these systems and use them in a variety of ways,” said Armstrong.
This includes both capital expenditure and a number of as-a-service models, including LLM-as-a-service through the GreenLake platform. It all ties into his vision of HPE enabling customers to create their own private AI clouds.
“We aim to ensure that our customers can leverage this generative AI in a way that makes sense for their business,” Armstrong said.
This means HPE customers have a wide range of choices when it comes to purchasing and integrating AI infrastructure, and can adapt to the use case and scale they need.
Of course, none of this is new to HPE. For decades, HPE has offered everything from supercomputers to printer ink refills “as a service.” Extending these models into its generative AI strategy shows that the company considers this breakthrough technology one of its core products today.
How generative AI will become important in the future
Key to HPE’s strategy, Armstrong said, is the idea that generative AI can greatly simplify the process for companies to leverage data to create highly tailored services for their customers.
“I think we’re going to see significant innovation in almost every industry over the next few years in terms of what generative AI can offer,” he said.
This means more models tailored to deliver company-specific or industry-specific results, as well as a deeper understanding of how to keep data secure as it is used to provide customer-specific services.
“I think this is going to be an important step in the future, and we’re already starting to see it emerge,” Armstrong said. “I think we’ll start to see more of that over the next 24 months.”
You can click here to read the full interview with Mark Armstrong, vice president and general manager of AI for EMEA at Hewlett Packard Enterprise.