
Graphcore

About

We're building the future of AI compute at Graphcore, where our team of semiconductor, software and AI experts is creating the complete AI compute stack from silicon to datacenter infrastructure. We developed the Intelligence Processing Unit (IPU), a groundbreaking processor specifically designed for machine intelligence that delivers leading-edge AI performance with unprecedented efficiency. Our silicon engineers are doing pioneering work, including being the first company in the world to bring a Wafer-on-Wafer AI processor to market, while our software and AI teams build new tools and optimize popular AI models to lead the market for performance, efficiency, and usability.

We're optimistic about a future where people live healthier, fairer, more informed, and more sustainable lives. Our AI technology aims to bring democratized intelligence that everyone can benefit from. Now as a wholly owned subsidiary of SoftBank Group, backed by significant long-term investment, we're expanding our teams around the world to solve the toughest problems in AI computing. We're bringing together the brightest minds in a place where everyone has the opportunity to make an impact on the company, our products, and the future of artificial intelligence.

Open roles at Graphcore

Explore 2 open positions at Graphcore and find your next opportunity.


AI Performance Engineer

Graphcore

Milpitas, California, United States (On-site)

4d ago

Senior Principal Test Framework Software Engineer

Graphcore

Austin, Texas, United States (On-site)

1w ago

Similar companies


d-Matrix

d-Matrix is a computing hardware company founded in 2019 that designs and builds purpose-built platforms for running generative AI inference at scale. Rather than adapting existing GPU architectures, the company built its platform from the ground up, using digital in-memory compute technology to address a core bottleneck in AI workloads: the costly and energy-intensive movement of data between memory and processors.

Its flagship product, Corsair, is a computing platform engineered specifically for the demands of generative AI inference. By processing data closer to where it is stored, Corsair is designed to deliver ultra-low latency and high throughput while keeping energy consumption and operational costs manageable - performance characteristics that are increasingly important as generative AI deployments scale.

The company sits at the intersection of silicon design, systems engineering, and software, working across the full stack to optimise compute architecture for inference. Its technical domains include:

- AI inference and generative AI
- Digital in-memory compute
- Compute architecture and silicon design
- Energy-efficient, high-throughput, low-latency systems
- Systems engineering and software

Founded with a mission to make generative AI commercially viable and sustainable beyond the largest technology companies, d-Matrix has grown from a small startup to a team of over 200 engineers and technologists. The company operates with a first-principles approach, taking on fundamental hardware problems rather than incremental adaptations of existing infrastructure.


NVIDIA USA

In 1993, three computer scientists - Jensen Huang, Chris Malachowsky, and Curtis Priem - shared a vision that interactive 3D graphics would become the cornerstone of future computing. They founded NVIDIA with a bold mission to bring 3D graphics to the gaming and multimedia markets. Their breakthrough invention of the GPU in 1999 not only sparked the growth of the PC gaming market but also ignited the modern era of artificial intelligence. From those early days of graphics chips, NVIDIA has evolved into the world leader in accelerated computing. The company pioneered the GPU as a specialized processor that handles complex mathematical calculations in parallel, enabling dramatic advancements in gaming, high-performance computing, and AI. Today, NVIDIA drives innovation across industries - from autonomous vehicles and robotics to healthcare and scientific research - tackling challenges no one else can solve by transforming massive amounts of data into actionable insights and real-time experiences.


Zuma

Zuma, headquartered in Santa Monica and backed by Andreessen Horowitz and Y Combinator, builds agentic AI systems for multifamily property management. Its platform automates the repetitive layers of onsite operations - lead engagement, tour scheduling, and rent collection - while keeping human staff in the roles that require direct human interaction. The company describes its approach as augmenting property teams rather than displacing them, operating under the premise that the future of work is human and AI working together. Zuma's primary product, Kelsey, is an AI leasing assistant designed to handle the high-volume, time-intensive tasks that typically occupy onsite teams at apartment communities. The company claims to have served thousands of apartment communities and millions of residents to date. Engineering and design at Zuma work directly alongside operations staff, with product iteration driven by feedback from property managers in the field. The company characterises its development pace as rapid and its priorities as impact-led rather than prestige-led - a culture signal aimed at those who want their work deployed and tested quickly against real-world conditions.


CoreWeave

CoreWeave is a specialized cloud platform founded in 2017 and built from the ground up to provide GPU-based infrastructure for artificial intelligence workloads. The company began as a cryptocurrency mining operation under the name Atlantic Crypto before pivoting entirely to AI infrastructure - a shift that has since placed it among TIME's 100 most influential companies.

The platform is designed specifically for training and deploying large language models at scale, rather than adapting general-purpose cloud architecture to AI use cases. Its core offerings include GPU compute, a Kubernetes-based orchestration layer tuned for AI workloads, and custom storage solutions built to meet the demands of machine learning pipelines. CoreWeave operates data centers across the United States and Europe, serving AI research labs, enterprises, and startups.

Engineering at CoreWeave spans a broad range of technical disciplines, including:

- AI infrastructure and GPU compute
- Cloud platform and data center operations
- Kubernetes-based orchestration
- Custom storage systems for ML workloads
- Large language model training and deployment

The company draws its team from varied professional backgrounds - including commodities trading and infrastructure engineering - and operates with the stated conviction that cloud infrastructure designed for AI must differ fundamentally from what came before it.


Clarifai

Clarifai is a full-stack AI platform company founded in 2013 by Matthew Zeiler, Ph.D. The company enables organisations to build, train, deploy, and monitor AI models across the entire machine learning lifecycle, handling unstructured data in the form of images, video, text, and audio. Its platform spans data preparation, model training, MLOps, and production deployment, with edge deployment supported through its Local Runners feature and large-scale compute managed via AI Compute Orchestration.

The platform serves over 400,000 users across 170+ countries and provides access to more than one million AI models, delivering billions of predictions. On the infrastructure side, Clarifai has benchmarked throughput of 544 tokens per second using the GPT-OSS-120B model. Clients include Amazon, Siemens, NVIDIA, Canva, Vimeo, and OpenTable. The company has been recognised as a leader in Forrester's New Wave Computer Vision report.

Clarifai's technical roots trace to the ImageNet 2013 competition, where it took all five top places in image classification. The company has raised $100 million in total funding, including a $60 million Series C round, with backing from Menlo Ventures, Union Square Ventures, Google Ventures, NVIDIA, and Qualcomm.


Chai Discovery

Chai Discovery builds AI foundation models designed to predict and reprogram biochemical molecular interactions, with the goal of transforming drug discovery into an engineering discipline. The company's founders are researchers who co-invented protein language modeling and developed state-of-the-art protein folding algorithms. Chai Discovery is backed by OpenAI, Thrive Capital, and General Catalyst, and has a partnership with Eli Lilly.

The company's core platform - anchored by its Chai-1 and Chai-2 models - predicts and redesigns interactions between biochemical molecules at atomic precision. Key capabilities include drug-like antibody design against challenging targets and the generation of functional GPCR agonists with minimal screening. The platform has reported a success rate of over 85% for computationally designed molecules meeting drug-like properties, a figure that contrasts with the traditional approach of screening millions or billions of protein sequences over years at a cost of billions of dollars.

Chai Discovery's technical work spans several domains:

- Protein language modeling and folding algorithms
- Computational protein and antibody design
- GPCR agonist design
- Biopharma-focused machine learning
- Computational prediction of biochemical interactions

The company focuses on targets traditionally considered undruggable, positioning its platform as a tool capable of addressing therapeutic challenges that conventional methods cannot efficiently reach.