On a recent trip to Taiwan, our first management meeting in Taipei ended with a question I wasn’t expecting: “You didn’t ask about CPO?”
I paused. “CPO?” (For the record, CPO stands for Co-Packaged Optics, which essentially brings optical connections closer to chips in order to transfer data faster and more efficiently.) Add this to an ever-growing list of acronyms – LPU, TPU, CoWoS, DLC – and it serves as a good reminder that in AI today, the jargon is multiplying almost as fast as the demand[1].
Over the following days, we met with more than 30 corporates, including most of the key players across the AI supply chain. The proliferation of new acronyms is a symptom of something bigger: an ecosystem expanding rapidly, with new technologies, products and bottlenecks emerging faster than most investors can process. Demand shows no sign of slowing – if anything, it is accelerating. Companies across multiple parts of the ecosystem continue to report strong order visibility, ongoing capacity expansion, and a growing pipeline of applications. In many conversations, the most striking takeaway was that while the pace of investment already appears unprecedented, most still believe this is only the beginning of a much larger infrastructure build-out.
Meetings with corporates in Taipei.
More importantly, this wave of innovation may help explain why valuations look stretched on the surface: earnings are struggling to keep pace with how quickly the total addressable market is expanding, leaving investors constantly playing catch-up. While we were in Taipei, one of our small-cap holdings reported in a single quarter the earnings investors had been expecting for the full year. This is clearly a one-off, but it gives you a sense of the magnitude.
This is not to say there won’t be accidents along the way, and periods of sharp correction should be expected. It is also worth acknowledging the backdrop of an ongoing geopolitical conflict, with markets already showing signs of risk-off, and plenty of profits to be taken from AI-related equities, which can amplify downside moves. If prolonged, this conflict may even begin to disrupt parts of the supply chain, with inputs such as helium, a key input in semiconductor manufacturing, potentially becoming constrained. Competition is also likely to intensify, particularly as new entrants emerge in certain areas of the supply chain, which could moderate the pace of the earnings growth currently enjoyed by the incumbents.
Reflecting on the trip, I come away with a renewed sense of optimism and feel more constructive than I did previously: the pace of innovation and breadth of deployment still suggest we are relatively early in the AI cycle.
At the same time, the drivers of AI tech demand are gradually evolving. Demand for semiconductor components remains strong, but there is an increasing focus on system-level efficiency. In that context, three areas in particular stand out as the next key drivers:
The first area is ASICs (Application Specific Integrated Circuits). There is a noticeable shift from GPUs to ASICs for specific tasks. These accelerators are more efficient because they are designed for a single function, allowing for lower power consumption compared to GPUs, which handle a broad range of tasks. This transition will also reduce reliance on Nvidia’s GPUs, opening the door for new players in the semiconductor design space, especially fabless companies focused on creating highly specialised chips like TPUs and LPUs.
The second area is power architecture and thermal management. As AI infrastructure scales and power density increases, these are becoming some of the fastest growing areas of the value chain. The shift from AC/DC to HVDC power systems will likely drive the next leg of growth, but more importantly, as compute density rises and heat generation increases, cooling requirements extend well beyond GPUs and CPUs. Entire server racks will increasingly require integrated thermal management, creating another layer of value within the supply chain.
The third area is the expansion of AI into the physical world. While data centres remain the dominant source of demand today, companies are increasingly looking to robotics, industrial automation, and autonomous systems as the next drivers of growth. These segments are still early, but they have the potential to become meaningful contributors to company earnings over time. The extension of AI into the physical and industrial environments could ultimately prolong the cycle beyond what is currently assumed.
On a parallel note, these shifts do not appear to reduce demand for advanced semiconductor manufacturing, as most approaches still rely on leading-edge nodes and increasingly complex packaging. More broadly, the scope of AI applications continues to expand.
Taiwan remains central to all of these dynamics. While we continue to keep a close eye on valuations, the competitive landscape, and risks from technological displacement, we feel we are well exposed to these trends across our portfolios. The opportunity set is broadening, and with it, the scope for stock picking is improving.