
Terakraft and Neurophos collaborate on sustainable AI

The global race to build faster, more capable artificial intelligence systems is driving an unprecedented expansion of computing infrastructure. Yet with every leap forward in processing power comes an equally significant surge in energy demand. This tension between innovation and sustainability is now at the forefront of the AI industry.

Against this backdrop, Norwegian green datacentre operator Terakraft and U.S.-based optical chip innovator Neurophos have announced a landmark collaboration aimed at addressing one of the sector’s most pressing challenges: how to make large-scale AI both powerful and sustainable. The partnership centres on a pilot programme, expected to launch in 2027, that will combine Terakraft’s renewables-powered datacentre operations with Neurophos’ groundbreaking optical processing hardware. The initiative will give select enterprise clients early access to Neurophos’ ultra-efficient AI inference platform, hosted within one of the world’s most energy-efficient datacentres. If successful, the project could serve as a template for how artificial intelligence can scale without compromising the planet’s resources.

The sustainability challenge of AI

Artificial intelligence adoption has accelerated across industries, from healthcare and logistics to finance, energy, and creative sectors. However, this rapid uptake comes with staggering energy costs. Running large-scale models requires enormous computing power, with GPU clusters drawing megawatts of electricity for computation and demanding still more energy for cooling. Datacentres have emerged both as enablers of AI and as significant contributors to global electricity consumption. Analysts project that AI workloads could account for up to 10% of global electricity demand by the end of the decade if efficiency improvements are not realised.

The collaboration between Terakraft and Neurophos directly targets this problem. By pairing a datacentre with a power usage effectiveness (PUE) below 1.1 – a figure that outperforms most global benchmarks – with next-generation optical processors that promise 100x greater efficiency than current GPUs, the companies aim to demonstrate that sustainability and scale are not mutually exclusive.

Terakraft’s green foundation

Terakraft has positioned itself at the intersection of renewable energy and high-performance computing. Based in Norway, the company leverages the region’s abundant hydropower resources to fully power its datacentre infrastructure. Unlike conventional datacentres that depend on mechanical chillers and air conditioning, Terakraft employs natural lake water cooling, drastically reducing energy consumption while minimising environmental impact.

In addition, the company has opted to repurpose an existing reinforced-concrete hydropower plant rather than constructing new facilities. This decision avoids additional embodied carbon, a factor often overlooked in sustainability discussions. By reusing existing infrastructure, Terakraft reduces its carbon footprint across both the operational and construction phases, setting a precedent for future datacentre projects worldwide.

Chairman of the board Giorgio Sbriglia framed the move as part of a broader mission: “Our mission has always been to power the future responsibly, and this collaboration brings that vision to life.” For Terakraft, sustainability is not simply an add-on but the core of its business model.
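For readers unfamiliar with the metric, PUE is simply the ratio of a facility’s total energy draw to the energy delivered to its IT equipment, with 1.0 as the theoretical ideal. The short sketch below illustrates the calculation with hypothetical figures; they are not Terakraft’s reported numbers:

```python
# Power usage effectiveness (PUE) = total facility energy / IT equipment energy.
# Figures below are hypothetical, for illustration only.
def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Return the PUE ratio; 1.0 is the theoretical ideal (zero overhead)."""
    return total_facility_energy_kwh / it_equipment_energy_kwh

# A facility that draws 10.8 GWh in total to deliver 10.0 GWh to its IT hardware:
print(f"PUE = {pue(10_800_000, 10_000_000):.2f}")   # 1.08, under the 1.1 mark cited

# For comparison, a conventional air-cooled facility at PUE ~1.5 spends roughly
# a third of its total energy on cooling and other non-IT overhead.
print(f"Overhead share at PUE 1.5: {(1.5 - 1.0) / 1.5:.0%}")
```

A PUE below 1.1 therefore implies that under roughly 9% of total facility energy goes to cooling, power conversion and other non-IT overhead.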
Neurophos’ optical breakthrough

On the hardware side, Neurophos represents a new wave of AI chip startups challenging the dominance of GPU manufacturers. Its proprietary optical processing units (OPUs) are designed to fundamentally reshape how AI inference is conducted. Traditional GPUs, while powerful, rely heavily on electrical transistors, which generate heat and require immense cooling resources. Optical processors, by contrast, use light to perform computations, dramatically increasing speed and reducing energy usage.

Neurophos claims that miniaturising its optical modulators by a factor of 10,000 enables compute-in-memory architectures that mimic the human brain’s efficiency. In simulations, its processors have been shown to deliver performance equivalent to 100 GPUs while consuming just 1% of the energy. CEO and founder Patrick Bowen emphasises that this is not just about incremental gains but about enabling a paradigm shift in how AI workloads are powered. “By deploying our 100x more efficient inference chips in Terakraft’s green datacentre we’re proving that AI’s exponential growth can be achieved sustainably, together,” Bowen explained. The Neurophos vision is to democratise high-performance AI by making it accessible without environmental trade-offs, a message that resonates strongly in an era of rising corporate sustainability commitments.

A proving ground for the future

The pilot project scheduled for 2027 will function as more than a technical test. It is also a real-world demonstration of how green infrastructure and revolutionary hardware can align to create a scalable blueprint for sustainable AI. Select enterprise clients will gain early access to the Neurophos platform, providing feedback and performance validation that could accelerate commercial deployment.

For the AI industry, this collaboration signals that the sector is moving beyond simply chasing scale. The question is no longer just how powerful a model can be, but how responsibly it can be powered. If Terakraft and Neurophos succeed, they may catalyse a new wave of partnerships across the AI ecosystem, uniting datacentre operators, chipmakers, and enterprises around a shared mission: making AI both sustainable and transformative.

The Terakraft–Neurophos collaboration is more than just a technological experiment. It is a signal of how the AI infrastructure market is evolving in response to environmental, financial, and competitive pressures. At the heart of this shift lies a fundamental tension: while demand for artificial intelligence is rising exponentially, the global capacity to generate electricity sustainably is finite. Every company scaling AI must confront this reality, and many are beginning to see green infrastructure not as a cost burden but as a competitive advantage.

The competitive pressure on GPU incumbents

For more than a decade, GPUs manufactured by companies like NVIDIA and AMD have been the backbone of AI computing. Their architecture, originally designed for graphics rendering, proved remarkably well suited to machine learning workloads. Today, NVIDIA dominates the global AI chip market, with its H100 and successor chips driving model training and inference for the largest AI platforms. Yet the drawbacks of this GPU-centric model are becoming more apparent. Energy consumption, heat output, and supply chain constraints have created bottlenecks. Leading cloud providers frequently cite GPU shortages, while enterprises deploying AI at scale face spiralling costs.
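To give a rough sense of scale for the “100 GPUs at 1% of the energy” claim, the sketch below assumes a board power of about 700 W for an H100-class accelerator; the figures are illustrative assumptions rather than vendor-verified measurements:

```python
# Back-of-the-envelope comparison of the claimed efficiency gap.
# Assumptions (illustrative only): ~700 W per H100-class GPU, plus the cited
# claim of 100-GPU-equivalent performance at 1% of the energy.
GPU_POWER_W = 700            # assumed board power of an H100-class accelerator
GPU_COUNT_EQUIVALENT = 100   # the "performance of 100 GPUs" claim
ENERGY_FRACTION = 0.01       # the "1% of the energy" claim

gpu_cluster_kw = GPU_POWER_W * GPU_COUNT_EQUIVALENT / 1000   # 70 kW continuous draw
optical_kw = gpu_cluster_kw * ENERGY_FRACTION                # 0.7 kW claimed equivalent

HOURS_PER_YEAR = 24 * 365
gpu_mwh = gpu_cluster_kw * HOURS_PER_YEAR / 1000             # ~613 MWh per year
optical_mwh = optical_kw * HOURS_PER_YEAR / 1000             # ~6 MWh per year

print(f"GPU-equivalent cluster: {gpu_cluster_kw:.0f} kW, ~{gpu_mwh:.0f} MWh/yr")
print(f"Optical platform (claimed): {optical_kw:.1f} kW, ~{optical_mwh:.1f} MWh/yr")
```

Facility overhead – the PUE discussed earlier – would scale both figures proportionally, which is why chip-level and datacentre-level efficiencies compound.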
If Neurophos’ optical chips deliver on their promise of 100x efficiency gains, the dominance of GPUs could be challenged, particularly in inference – the stage at which AI models are deployed into production environments. The emergence of optical processors could spark a wave of hardware diversification, reducing reliance on a single vendor and introducing new paradigms of compute. Just as CPUs gave way to GPUs for AI, GPUs may eventually give way to light-based processors. Terakraft’s decision to host Neurophos chips in a renewables-powered datacentre provides a stage for this potential transition.

The datacentre industry’s pivot to green models

Datacentre operators worldwide are under mounting pressure to align with net-zero goals. The sector is projected to consume nearly 10% of global electricity by 2030, with AI workloads accounting for a growing share. Governments in Europe, North America, and Asia are beginning to scrutinise the carbon intensity of datacentres, with new regulations emerging around efficiency standards and renewable integration.

In this context, Terakraft’s model – powered entirely by hydropower and cooled with natural resources – is a strategic differentiator. A PUE below 1.1 places the company in the top tier globally, positioning it to attract enterprises that must meet environmental, social, and governance (ESG) commitments. As more enterprises set science-based climate targets, the location and energy profile of their AI workloads are becoming boardroom issues. Hosting in a green facility is no longer just good PR; it is a requirement for risk management and investor confidence.

By pairing renewable energy with breakthrough efficiency chips, Terakraft and Neurophos are showcasing a dual-pronged sustainability approach: reduce consumption at the chip level and minimise emissions at the datacentre level. This integrated strategy could become the gold standard for next-generation AI infrastructure.

Enterprise adoption and the ESG advantage

For enterprises, the appeal of this collaboration lies in both cost savings and compliance benefits. Energy efficiency translates directly into lower operating expenses, particularly at scale. A platform that requires only 1% of the energy of GPU-driven systems could make previously cost-prohibitive AI applications financially viable.

At the same time, enterprises face growing pressure from regulators, investors, and consumers to demonstrate climate responsibility. The European Union’s Corporate Sustainability Reporting Directive (CSRD) and similar frameworks globally are pushing companies to account for their Scope 3 emissions, which include outsourced datacentre operations. Choosing a partner like Terakraft allows enterprises to credibly report progress toward sustainability targets while benefiting from cutting-edge performance.

Furthermore, early adopters of Neurophos’ technology may gain a first-mover advantage in AI deployment. The pilot programme will provide select enterprises with access to an inference platform that not only reduces costs but also accelerates workloads. For industries where time-to-decision is critical – financial services, logistics, autonomous systems – such speed could prove transformative.

Signalling a shift in AI economics

The economics of AI are changing. Until now, the industry has largely operated on the assumption that greater performance requires greater consumption.
This “bigger is better” mindset has resulted in hyperscale clusters consuming vast amounts of energy, with little consideration for sustainability. The Terakraft–Neurophos collaboration suggests a different path forward: exponential performance gains without exponential energy growth.

If successful, this model could recalibrate how enterprises think about AI investment. Rather than bracing for escalating costs tied to energy and GPU scarcity, businesses could anticipate declining costs through efficiency gains. Such a shift would democratise access to AI, allowing not only global tech giants but also mid-sized firms and regional enterprises to deploy large-scale AI responsibly.

The broader implication is clear: sustainability is no longer just about environmental stewardship. It is about economic competitiveness in a world where energy, efficiency, and performance are inseparably linked.