AI and the future of everything: Five ways AI will change our world as we know it
What does the future of AI look like? HPE Fellow and Chief Architect at Hewlett Packard Labs, Kirk Bresniker, shares his thoughts on what may happen in the years to come.
- Generative AI, powered by LLMs, is reshaping industries, marking a clear divide between life before and after its rise.
- Proficiency in AI will be essential for staying relevant as technology transforms every field.
For more than a decade, artificial intelligence (AI) and machine learning (ML) have been unleashing new capabilities for enterprises and researchers. Whether it’s using predictive analytics to forecast equipment maintenance, computer vision tools to give eyes to automated assembly line robots, or digital twins to simulate the behavior of factories, cities, and even economies, the list of AI-powered applications is long and growing longer.
But none of these breakthroughs have captured the imagination of individuals and enterprises like generative AI (GenAI). Over the past two years, the world has undergone a tectonic shift due to the emergence of the large language models (LLMs) that form the foundation of GenAI applications. The aftershocks will be felt for decades to come.
I grew up in Silicon Valley, where we were always waiting for “the big one” that could change our lives overnight. While I hear that metaphor used in big tech all the time, this time, it’s appropriate.
Our ability to ask questions of GenAI chatbots using natural language and have them produce answers by drawing from nearly the entirety of recorded human information will impact every knowledge-based pursuit performed by humans. In the future, we’ll look back and see a clear demarcation: life before LLMs and life after.
What will that future look like exactly? Here’s a look ahead to the end of the decade to imagine how we’ll settle into partnership with tomorrow’s AI technologies.
1. AI models will replace enterprise operating systems and applications
Today, we use a portfolio of applications to perform basic functions such as searching databases, sending messages, or creating documents, relying on the tools we know well. In the future, we’ll ask an AI-based executive agent to provide an answer or perform these tasks, and it will recruit models that have proven to be safe and compliant, write or rewrite applications on the fly, and negotiate terms and conditions in our best interest. The agent will simultaneously solve the equations of physics, economics, law, and more to decide the best way to carry out each sequence of tasks, orchestrating other models and seeking information from additional sources as needed. It will also remember past requests and anticipate future ones, adapting to users’ behavior and creating a highly personalized system for each of them.
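As a thought experiment, the dispatch-and-remember behavior described above can be sketched in a few lines of Python. The `ExecutiveAgent` class, its capability names, and the toy handlers are all hypothetical illustrations, not a real product design:

```python
# Minimal sketch of an "executive agent" that routes requests to vetted,
# specialized models and remembers past requests for personalization.
# All model names and the routing scheme are hypothetical.

class ExecutiveAgent:
    def __init__(self):
        self.models = {}    # registry of vetted models: capability -> handler
        self.history = []   # remembered past requests

    def register(self, capability, handler):
        """Admit a model that has been vetted as safe and compliant."""
        self.models[capability] = handler

    def handle(self, capability, request):
        """Route a request to the appropriate model and record it."""
        if capability not in self.models:
            raise ValueError(f"no vetted model for: {capability}")
        self.history.append((capability, request))
        return self.models[capability](request)


agent = ExecutiveAgent()
agent.register("search", lambda q: f"results for '{q}'")
agent.register("draft", lambda topic: f"draft document about {topic}")

print(agent.handle("search", "quarterly revenue"))  # routed to the search model
print(agent.handle("draft", "maintenance plan"))    # routed to the drafting model
print(len(agent.history))  # the agent remembers both requests
```

A real agent would replace the lookup table with learned routing, model negotiation, and on-the-fly code generation, but the shape of the loop stays the same: select, delegate, remember.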
2. AI model generation and operations will become more transparent
Today, data scientists and AI engineers building leading AI models are often at a loss to explain how a model reaches any particular outcome. The sheer scale of the inputs, the nature of the training process, and the massive computation required to produce a model combine to make its behavior difficult, if not impossible, to explain. In some cases that’s perfectly acceptable, but when a model is adopted for a specific use in a highly regulated enterprise, transparency is going to be the key.
As these models become increasingly important in critical decision-making, we will see an iterative process of legislation, litigation, negotiation, and innovation among regulators, enterprises, and the communities they operate in. This process will likely continue to reflect differences in risk tolerances, values, and priorities from industry to industry and from region to region.
AI models will also need to become more transparent about the resources they consume. You can’t talk about the future of AI without considering the unprecedented amounts of electricity, water, talent, and money required to train a leading-edge model. And while the eye-watering amount of resources going into training is top of mind today, we should prepare ourselves for that to continue increasing. Current leading social media infrastructure is scaled to hundreds of thousands of inferences per user-hour, but what resources will be required to support millions of inferences every hour of every day for 8 billion humans?[1]
Operators of foundation models will need to be explicit about the provenance of the energy, infrastructure, and information behind their models, allowing organizations to make informed decisions about whether the insights AI models offer are worth the cost.
3. Sustainability will become a global priority — and AI will help us get there
Every element within the world’s computing infrastructure — every component in every rack in every data center — will need to be optimized for sustainability. Decision-makers will be called upon to determine whether the value of each business outcome outweighs the energy expenditure required to produce it. From mining the minerals, manufacturing the infrastructure, and deploying it at scale to bring together the information and energy to train and infer results, we’ll have to account for every joule of energy, every byte of information, and every liter of water used.
A big reason why Hewlett Packard Enterprise adopted direct liquid cooling in its high performance computing systems is the energy efficiency DLC provides. Liquid cooling can slash a data center’s carbon footprint and cooling costs by nearly 90% per year.[2] While we’re decades into developing this technology for the world’s most demanding supercomputer applications in science and engineering, it’s now moving into data centers at 40 times the scale in gigawatt AI deployments. To put those numbers into perspective, the world’s fastest exascale supercomputers, Frontier, Aurora, and El Capitan, operate at around 25 megawatts, which is the equivalent of a little more than what 20,000 average U.S. homes consume.[3] Tomorrow’s data centers will consume more than a gigawatt, the equivalent of what 833,000 average U.S. homes consume.[4]
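The home-equivalence figures above can be sanity-checked with simple arithmetic. The ~1.2 kW average continuous draw per U.S. home (roughly 10,500 kWh per year) is an assumption chosen for illustration, not a figure from the article:

```python
# Back-of-envelope check of the power comparisons in the text,
# assuming an average U.S. home draws about 1.2 kW continuously.
AVG_HOME_KW = 1.2  # assumption: average continuous draw per U.S. home

def homes_equivalent(power_mw):
    """Number of average homes drawing the same continuous power."""
    return power_mw * 1000 / AVG_HOME_KW

print(round(homes_equivalent(25)))    # 25 MW exascale system: a little over 20,000 homes
print(round(homes_equivalent(1000)))  # 1 GW data center: roughly 833,000 homes
```

Under this assumption, a 25 MW supercomputer matches about 20,800 homes and a gigawatt campus about 833,000, consistent with the comparisons above.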
HPE is committed to pushing the boundaries of efficiency in every aspect of information technology: compute, storage, and networking. Reinforcement learning-powered digital twins can optimize the energy, transportation, and communication needs of massive-scale systems by identifying waste in the energy ecosystem, anticipating fluctuations in demand, and making suggestions for how to manage the grid more efficiently using renewable energy resources.
4. Building new LLMs will require new computing paradigms
Today’s most advanced LLMs are scaling to trillions of parameters, the number of variables that can be adjusted to make the model’s predictions more accurate. The open question is whether more parameters will yield even better performing models. If so, the next generation of models will require orders of magnitude more parameters, along with even larger volumes of data and gigawatts of computing power.
Research institute Epoch AI estimates that the most expensive model to date, Gemini Ultra, has a combined capital and operational cost of $800 million.[5] If the current pace of LLM development continues, within a decade, we could be spending the equivalent of the annual global IT budget to train one model at a time. In other words, we will hit a limit on our ability to train larger models using existing technologies. Even if novel technologies and algorithms begin to approach the training efficiency of biological intelligences, inferencing over these models, up to millions of times per hour for each of 8 billion people, will be an even greater hurdle. Can we afford to give everyone access to an AI-optimized future?
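The “within a decade” claim can be illustrated with a rough compound-growth calculation. The ~2.4x annual growth in frontier training cost and the ~$5 trillion annual global IT budget are assumptions used here for illustration, not figures from the article:

```python
import math

# Rough projection: starting from an ~$800M frontier model, how many
# years of compound cost growth until a single training run equals the
# annual global IT budget? Growth rate and budget are assumptions.
COST_TODAY = 800e6          # most expensive model to date (from the text)
GROWTH_PER_YEAR = 2.4       # assumed annual multiplier in training cost
GLOBAL_IT_BUDGET = 5e12     # assumed annual global IT spend

years = math.log(GLOBAL_IT_BUDGET / COST_TODAY) / math.log(GROWTH_PER_YEAR)
print(round(years, 1))  # roughly 10 years under these assumptions
```

Even generous changes to either assumption only shift the crossover by a few years, which is why the ceiling on training ever-larger models with existing technologies looks so close.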
Photonic computation, which uses light waves for data storage and processing, could enable us to build low-latency, low-energy devices for performing inference at the edge. But training the next generation of LLMs will likely require technologies and algorithms that are still being incubated by research teams like ours at Hewlett Packard Labs. The ultimate goal is to make AI capable of true deductive reasoning. Physics-based accelerators may be the key to a new dimension of AI behaviors that eventually lead us to artificial general intelligence.
5. AI’s biggest impact will be on human behavior
Just as humans have adapted to computers, the internet, and smartphones over the past three decades, we must adapt to AI and learn how to use it effectively.
Our CEO, Antonio Neri, likes to say that everyone at HPE should be pursuing a minor in AI. What he means is that every team and every team member should explore the possibilities of this technology and ask themselves if what they’re doing today could be done more effectively and efficiently using this technology.
The answer won’t always be yes, but every individual within every organization must be willing to seriously ponder the question. While I don’t think robots are coming for our jobs, I strongly believe that if you want to be proficient in science, engineering, industry, or even the arts, you’ll need to be proficient in AI. If you don’t know how to take advantage of this technology, you may find yourself being replaced by someone who does.
Join us in person: AI House Davos
Headed to Switzerland January 20-24? AI House Davos 2025 is a multi-stakeholder forum convening industry leaders, researchers, and policymakers to discuss how AI will shape the future. It is a partnership comprising organizations from across the globe, including Hewlett Packard Enterprise, Mergantix, ETH AI Center, Swisscom, and G42. HPE will host three sessions. Click on the links for more information and to register:
- Defining Sovereignty in a New Era of AI
Tuesday 21 January, 14:50-15:45
Abstract: Demand for sovereign AI is growing: it has become a prerequisite for protecting national security and strengthening the resilience of the public sector. Sovereign AI also allows nations to make compute and tools accessible for the greater good, unlocking innovation and economic growth. In this session, we’ll explore how nations invest in sovereign AI and how, through strong public-private collaboration, they can scale AI and make it inclusive to create a positive societal impact.
- In the Era of AI, the World is only as strong as our Network Connection
Wednesday 22 January, 9:00-9:55
Abstract: As government agencies, private enterprises, and NGOs turn to AI tools to solve their most complex and perplexing challenges, networking technologies, along with the related policies and programs, will directly impact their ability to succeed. This comes at a time of great convergence, including technological, organizational, and even infrastructure collaboration. This session will discuss the possibilities the era of AI represents, as well as the challenges that come with pursuing these innovative solutions, and demonstrate why reliable, secure, ubiquitous connectivity is crucial not only to advancing AI but to our entire society.
- Can We Resolve the Paradox of Sustainable AI?
Thursday 23 January, 16:10-17:05
Abstract: The adoption of AI comes with a large energy consumption challenge. The massive compute power required to run AI applications puts a heavy strain on resources and costs and, in the near term, risks exacerbating the climate crisis. Geopolitical factors are also creating energy challenges in certain parts of the world, making it even more difficult to adopt AI. This panel will explore holistic strategies – from reimagining data centers to establishing future-proof initiatives with a collaborative, whole-society approach – to reduce the carbon intensity of AI deployments and build a more sustainable world.
[1] “Building Meta’s GenAI Infrastructure,” Engineering at Meta, March 12, 2024 and “Facebook: The Leading Social Platform of Our Times,” Investing.com, October 31, 2024
[2] “Extending the viability of air-cooling in high-performance data centers with HPE Cray XD2000 systems,” HPC Wire, July 3, 2023
[3] “Exascale: The New Frontier of Computing,” Oak Ridge National Laboratory
[4] “The Gigawatt Data Center Campus is Coming,” Data Center Frontier, April 29, 2024
[5] Epoch AI, 2024