
Descript

About

Descript makes a video and audio editing platform built around a straightforward idea: editing should work like editing a document. Rather than working with traditional timelines, users edit by manipulating a transcript - cutting text cuts the corresponding audio or video. The platform supports solo creators and distributed teams, and claims a user base of millions.
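The transcript-as-document model can be pictured as a mapping from words to media timestamps: deleting words in the text yields the spans of audio or video to cut. The sketch below is purely illustrative - the `Word` type, timestamps, and function are hypothetical, not Descript's actual implementation or API.

```python
# Hypothetical sketch of transcript-driven editing: each transcript word
# carries the timestamps of its media span, so deleting words determines
# which contiguous media segments survive.

from dataclasses import dataclass

@dataclass
class Word:
    text: str
    start: float  # seconds into the media
    end: float

def keep_segments(words, deleted_indices):
    """Return (start, end) media spans remaining after word deletions,
    merging adjacent kept words into contiguous segments."""
    segments = []
    for i, w in enumerate(words):
        if i in deleted_indices:
            continue
        if segments and abs(segments[-1][1] - w.start) < 1e-9:
            # This word continues the previous kept segment; extend it.
            segments[-1] = (segments[-1][0], w.end)
        else:
            segments.append((w.start, w.end))
    return segments

words = [Word("hello", 0.0, 0.4), Word("um", 0.4, 0.6), Word("world", 0.6, 1.0)]
print(keep_segments(words, deleted_indices={1}))  # → [(0.0, 0.4), (0.6, 1.0)]
```

Deleting the filler word "um" splits the media into two kept segments, which an editor would then play back-to-back - the document edit and the media cut are the same operation.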

The product spans video editing, audio editing, podcast production, and collaboration tools, with an expanding suite of AI-assisted features. These include automated design and the ability to generate content from a text description. The technical domains the company works across include generative AI, speech processing, and human-centered design - with a stated emphasis on building AI that augments rather than complicates the creative process.

Descript describes its team as drawing from both the creator world and the technology industry. The company places weight on human-centered design and positions itself against the complexity of conventional editing software. Its culture signals point toward a team that values invention and forward-looking thinking - people, as the company puts it, who think about things that don't yet exist.

Similar companies

Together AI

At Together AI, we believe software quality comes from research-driven innovation and genuine collaboration. Our engineers and researchers work side-by-side, contributing to open-source advancements like FlashAttention, Mamba, and RedPajama while building infrastructure that powers the next generation of AI. We optimize for impact over hierarchy - meaning you'll own substantial technical challenges from day one, whether it's designing distributed inference engines or pioneering new model architectures. We're committed to open and transparent AI systems because we believe the best outcomes for society emerge from shared knowledge. Our purpose-built GPU cloud platform empowers developers and organizations of all sizes to train, fine-tune, and deploy generative AI models without lock-in. We foster a culture of deep technical curiosity where tackling frontier problems in distributed systems, model optimization, and AI infrastructure is the norm - not the exception.

Clarifai

Clarifai is a full-stack AI platform founded in 2013 by Matthew Zeiler, Ph.D. The company enables organisations to build, train, deploy, and monitor AI models across the entire machine learning lifecycle, handling unstructured data in the form of images, video, text, and audio. Its platform spans data preparation, model training, MLOps, and production deployment, with edge deployment supported through its Local Runners feature and large-scale compute managed via AI Compute Orchestration. The platform serves over 400,000 users across 170+ countries and provides access to more than one million AI models, delivering billions of predictions. On the infrastructure side, Clarifai has benchmarked throughput of 544 tokens per second using the GPT-OSS-120B model. Clients include Amazon, Siemens, NVIDIA, Canva, Vimeo, and OpenTable. The company has been recognised as a leader in Forrester's New Wave Computer Vision report. Clarifai's technical roots trace to the ImageNet 2013 competition, where it took all five top places in image classification. The company has raised $100 million in total funding, including a $60 million Series C round, with backing from Menlo Ventures, Union Square Ventures, Google Ventures, NVIDIA, and Qualcomm.

Duku AI

Duku AI builds an autonomous testing platform designed to remove the manual burden of software quality assurance. Its platform deploys AI agents that simulate real customer journeys after every build, catching failures before users encounter them. Rather than requiring teams to write and maintain test scripts, the system self-heals as codebases evolve - eliminating the flaky tests and stale test suites that slow down engineering teams. The company is led by operators with experience scaling Meta's testing infrastructure, launching Uber's global playbooks, and growing Deliveroo from its early stages to hypergrowth. That background informs both the product's technical architecture and its positioning: giving engineering and product teams the confidence to ship quickly without sacrificing reliability. Despite being a three-person team, Duku AI has reached $330,000 in revenue in 2025. The company is venture-backed.

LangChain

LangChain builds the frameworks and platforms that developers use to construct, test, and deploy AI agents and large language model (LLM) systems. Its open source offerings - the LangChain and LangGraph frameworks - together exceed 90 million downloads per month and connect to more than 1,000 integrations, making them among the most widely adopted tools in the agent engineering ecosystem. The company operates on an open-core model, pairing its open source frameworks with LangSmith, a commercial platform that adds observability, evaluation, and deployment capabilities for production LLM systems. The open source frameworks provide pre-built architectures and developer tooling for assembling AI agents, while LangSmith addresses the reliability and monitoring demands that arise when those systems move into production. Together, the products cover the full development lifecycle - from early prototyping through to live deployment - targeting both the individual developer and the AI team operating at scale. LangChain's user base spans major enterprises and early-stage startups alike, with millions of developers relying on its frameworks worldwide. The company is focused specifically on agent engineering: the discipline of building LLM-powered systems that can reason, plan, and act reliably in real-world environments.

d-Matrix

d-Matrix is a computing hardware company founded in 2019 that designs and builds purpose-built platforms for running generative AI inference at scale. Rather than adapting existing GPU architectures, the company built its platform from the ground up, using digital in-memory compute technology to address a core bottleneck in AI workloads: the costly and energy-intensive movement of data between memory and processors.

Its flagship product, Corsair, is a computing platform engineered specifically for the demands of generative AI inference. By processing data closer to where it is stored, Corsair is designed to deliver ultra-low latency and high throughput while keeping energy consumption and operational costs manageable - performance characteristics that are increasingly important as generative AI deployments scale.

The company sits at the intersection of silicon design, systems engineering, and software, working across the full stack to optimise compute architecture for inference. Its technical domains include:

- AI inference and generative AI
- Digital in-memory compute
- Compute architecture and silicon design
- Energy-efficient, high-throughput, low-latency systems
- Systems engineering and software

Founded with a mission to make generative AI commercially viable and sustainable beyond the largest technology companies, d-Matrix has grown from a small startup to a team of over 200 engineers and technologists. The company operates with a first-principles approach, taking on fundamental hardware problems rather than incremental adaptations of existing infrastructure.

OpenEvidence

OpenEvidence is an AI-powered medical information platform built for healthcare professionals making clinical decisions at the point of care. Its core product functions as an AI copilot for physicians: clinicians submit clinical questions and receive evidence-based answers, synthesised from trusted medical literature and accompanied by references. The platform is available free to verified U.S. healthcare professionals.

The platform has reached considerable scale within the U.S. medical community. More than 40% of U.S. physicians log in daily, and the platform has supported over 100 million AI-powered clinical consultations from doctors and frontline clinicians. OpenEvidence draws on partnerships with leading medical institutions and is designed for use across practice settings, from major hospital systems to rural clinics.

The company operates at the intersection of several technical disciplines, including:

- Natural language processing applied to medical literature
- Clinical decision support and medical knowledge synthesis
- HIPAA-compliant health IT infrastructure
- Evidence-based AI for high-stakes clinical environments

OpenEvidence is focused on the United States market and positions itself as a foundational layer for how physicians access and apply medical knowledge in practice.
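The question-in, cited-answer-out workflow described above follows a general retrieve-and-cite pattern. The toy function below illustrates that pattern only - the corpus, references, and naive keyword scoring are invented for the example and have no connection to OpenEvidence's actual system.

```python
# Toy illustration of a retrieve-and-cite workflow: rank a small corpus
# against a clinical question and return the best-matching snippets paired
# with their references. Not OpenEvidence's method - a generic sketch.

def retrieve_with_citations(question, corpus, top_k=2):
    """Rank corpus entries by naive keyword overlap with the question and
    return (snippet, reference) pairs for the top matches."""
    q_terms = set(question.lower().split())
    scored = []
    for entry in corpus:
        overlap = len(q_terms & set(entry["text"].lower().split()))
        scored.append((overlap, entry))
    scored.sort(key=lambda s: s[0], reverse=True)
    return [(e["text"], e["ref"]) for score, e in scored[:top_k] if score > 0]

corpus = [
    {"text": "aspirin reduces cardiovascular risk", "ref": "Trial A, 2019"},
    {"text": "statins lower LDL cholesterol", "ref": "Trial B, 2021"},
]
hits = retrieve_with_citations("does aspirin lower cardiovascular risk", corpus)
print(hits[0])  # → ('aspirin reduces cardiovascular risk', 'Trial A, 2019')
```

Production systems replace the keyword overlap with learned retrieval and an LLM synthesis step, but the contract is the same: every returned claim is paired with the source that supports it.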