Funding & Deals

Bedrock Robotics Hits US$1.75B Valuation Following US$270M Series B Funding

Inside the funding round driving the shift to intelligent construction fleets

Updated

February 7, 2026 2:12 PM

Aerial shot of an excavator. PHOTO: UNSPLASH

Bedrock Robotics has raised US$270 million in Series B funding as it works to integrate greater automation into the construction industry. The round, co-led by CapitalG and the Valor Atreides AI Fund, values the San Francisco-based company at US$1.75 billion, bringing its total funding to more than US$350 million.

The size of the investment reflects growing interest in technologies that can change how large infrastructure and industrial projects are built. Bedrock is not trying to reinvent construction from scratch. Instead, it is focused on upgrading the machines contractors already use—so they can work more efficiently, safely and consistently.

Founded in 2024 by former Waymo engineers, Bedrock develops systems that allow heavy equipment to operate with increasing levels of autonomy. Its software and hardware can be retrofitted onto machines such as excavators, bulldozers and loaders. Rather than relying on one-off robotic tools, the company is building a connected platform that lets fleets of machines understand their surroundings and coordinate with one another on job sites.

This is what Bedrock calls “system-level autonomy”. Its technology combines cameras, lidar and AI models to help machines perceive terrain, detect obstacles, track work progress and carry out tasks like digging and grading with precision. Human supervisors remain in control, monitoring operations and stepping in when needed. Over time, Bedrock aims to reduce the amount of direct intervention those machines require.

The funding comes as contractors face rising pressure to deliver projects faster and with fewer available workers. In the press release, Bedrock notes that the industry needs nearly 800,000 additional workers over the next two years and that project backlogs have grown to more than eight months. These constraints are pushing firms to explore new ways to keep sites productive without compromising safety or quality.

Bedrock states that autonomy can help address those challenges, not by removing people from the equation, but by allowing crews to supervise more equipment at once and reduce idle time. If machines can operate longer, with better awareness of their environment, sites can run more smoothly and with fewer disruptions.

The company has already started deploying its system in large-scale excavation work, including manufacturing and infrastructure projects. Contractors are using Bedrock’s platform to test how autonomous equipment can support real-world operations at scale, particularly in earthmoving tasks that demand precision and consistency.

From a business standpoint, the Series B funding will allow Bedrock to expand both its technology and its customer deployments. The company has also strengthened its leadership team with senior hires from Meta and Waymo, deepening its focus on AI evaluation, safety and operational growth. Bedrock says it is targeting its first fully operator-less excavator deployments with customers in 2026—a milestone for autonomy in complex construction equipment.

In that context, this round is not just about capital. It is about giving Bedrock the runway to prove that autonomous systems can move from controlled pilots into everyday use on job sites. The company is betting that the future of construction will be shaped less by individual machines and more by coordinated, intelligent systems that work alongside human crews.


Artificial Intelligence

How Analog Devices Is Turning Hardware Into Intelligence

The upgraded CodeFusion Studio 2.0 simplifies how developers design, test and deploy AI on embedded systems.

Updated

January 8, 2026 6:34 PM

Illustration of CodeFusion Studio™ 2.0 showing AI, code and chip icons. PHOTO: ANALOG DEVICES, INC.

Analog Devices (ADI), a global semiconductor company, launched CodeFusion Studio™ 2.0 on November 3, 2025. The new version of its open-source development platform is designed to make it easier and faster for developers to build AI-powered embedded systems that run on ADI’s processors and microcontrollers.

“The next era of embedded intelligence requires removing friction from AI development,” said Rob Oshana, Senior Vice President of the Software and Digital Platforms group at ADI. “CodeFusion Studio 2.0 transforms the developer experience by unifying fragmented AI workflows into a seamless process, empowering developers to leverage the full potential of ADI's cutting-edge products with ease so they can focus on innovating and accelerating time to market.”

The upgraded platform introduces new tools for hardware abstraction, AI integration and automation. These help developers move more easily from early design to deployment.

CodeFusion Studio 2.0 enables complete AI workflows, allowing teams to use their own models and deploy them on everything from low-power edge devices to advanced digital signal processors (DSPs).

Built on Microsoft Visual Studio Code, the new CodeFusion Studio offers built-in checks for model compatibility, along with performance testing and optimization tools that help reduce development time. Building on these capabilities, a new modular framework based on Zephyr OS lets developers test and monitor how AI and machine learning models perform in real time. This gives clearer insight into how each part of a model behaves during operation and helps fine-tune performance across different hardware setups.

The CodeFusion Studio System Planner has also been redesigned to handle more device types and complex, multi-core applications. With new built-in diagnostic and debugging features, such as integrated memory analysis and visual error tracking, developers can now troubleshoot problems faster and keep their systems running more efficiently.

This launch marks a deeper pivot for ADI. Long known for high-precision analog chips and converters, the company is expanding its edge-AI and software capabilities to enable what it calls Physical Intelligence: systems that can perceive, reason, and act locally.

“Companies that deliver physically aware AI solutions are poised to transform industries and create new, industry-leading opportunities. That's why we're creating an ecosystem that enables developers to optimize, deploy and evaluate AI models seamlessly on ADI hardware, even without physical access to a board,” said Paul Golding, Vice President of Edge AI and Robotics at ADI. “CodeFusion Studio 2.0 is just one step we're taking to deliver Physical Intelligence to our customers, ultimately enabling them to create systems that perceive, reason and act locally, all within the constraints of real-world physics.”