
Atrium

The Operating System for Software-Defined Construction

Construction sites are really messy. They're dynamic environments full of edge cases. Controlling a fleet of autonomous robots in such a setting requires a strong grasp of the reality around us. Atrium bridges the digital and physical worlds: a software system that's deeply integrated into our business and highly adaptable to lessons from the field. It lets us model and maintain a digital map of the physical environment and integrates directly with the digital drawings of the structures we build, enabling autonomous robotic construction.


But Atrium is much more. It helps us plan deployments and track progress, design and model buildings, map the surrounding construction site, and execute robotic build plans—both in real life and in simulation. It also helps us analyze build results, investigate failures, calibrate our robots, and manage our fleet. It's CAD, ERP, BI, robotics control, and operations. All in one place.

Atrium is at the heart of everything we do at Monumental.

A Quick Note on Our Technology Stack

For the developers out there: Atrium isn’t a single monolith, but a tightly integrated ecosystem of tools and services, built with a variety of programming languages.

At the lowest level, we control our actuators and sensors directly from the microcontrollers. All embedded code is written in C++. Because debugging microcontrollers is tough and we deploy a lot of them, we keep embedded logic to an absolute minimum.
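
To give a sense of how thin that boundary can be, here's a simplified Rust sketch of the kind of command frame a host computer might send down to a board; the opcodes and fields are illustrative, not our actual protocol.

```rust
// Illustrative host-side framing for a thin firmware protocol: the
// microcontroller only executes simple, well-defined commands, while all
// higher-level logic stays on the central computer.

/// Commands the host might send to a motor/sensor board (hypothetical set).
enum Command {
    /// Move actuator `id` to `target` encoder ticks.
    SetActuatorTarget { id: u8, target: i32 },
    /// Request the latest reading from sensor `id`.
    ReadSensor { id: u8 },
}

/// Encode a command into a small byte frame: [opcode, payload..., checksum].
fn encode(cmd: &Command) -> Vec<u8> {
    let mut frame = match cmd {
        Command::SetActuatorTarget { id, target } => {
            let mut f = vec![0x01, *id];
            f.extend_from_slice(&target.to_le_bytes());
            f
        }
        Command::ReadSensor { id } => vec![0x02, *id],
    };
    // Simple XOR checksum; real firmware would more likely use a CRC.
    let checksum = frame.iter().fold(0u8, |acc, b| acc ^ b);
    frame.push(checksum);
    frame
}

fn main() {
    let frame = encode(&Command::SetActuatorTarget { id: 3, target: 12_800 });
    println!("{} bytes on the wire: {:02x?}", frame.len(), frame);
}
```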

One layer up, each of our autonomous ground vehicles runs a single central computer. This is where our robotics control and navigation logic lives, written entirely in Rust. For our computer vision and deep learning stack, we use Python. Using cameras, we can localize precisely on the construction site and monitor every brick placement for quality control. By tightly integrating our localization and control systems, we can place bricks in world-space coordinates with millimeter precision.
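
As a toy example of that world-space bookkeeping, here's a simplified 2D sketch of mapping a camera observation into world coordinates using the robot's localized pose; the types and numbers are made up for illustration, and the real system works in full 3D.

```rust
// Sketch: turning a camera-frame observation into a world-space target.
// A 2D rigid transform (rotation + translation) stands in for the full 3D
// pose math; field names and values are illustrative.

#[derive(Clone, Copy, Debug)]
struct Pose2 {
    x: f64,     // metres, world frame
    y: f64,     // metres, world frame
    theta: f64, // heading in radians
}

/// A point observed by the onboard camera, expressed in the robot's frame.
#[derive(Clone, Copy, Debug)]
struct RobotFramePoint {
    x: f64,
    y: f64,
}

/// Transform an observation from the robot frame into world coordinates,
/// using the robot pose estimated by the localization system.
fn to_world(robot_pose: Pose2, p: RobotFramePoint) -> (f64, f64) {
    let (s, c) = robot_pose.theta.sin_cos();
    (
        robot_pose.x + c * p.x - s * p.y,
        robot_pose.y + s * p.x + c * p.y,
    )
}

fn main() {
    // Robot localized at (12.40 m, 3.10 m), facing 90 degrees.
    let pose = Pose2 { x: 12.40, y: 3.10, theta: std::f64::consts::FRAC_PI_2 };
    // A brick seen 0.50 m ahead and 0.10 m to the left of the robot.
    let brick = RobotFramePoint { x: 0.50, y: 0.10 };
    let (wx, wy) = to_world(pose, brick);
    println!("brick in world frame: ({wx:.3}, {wy:.3}) m");
}
```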

Tying everything together is the Atrium desktop application, built using TypeScript and Rust. Much of our Rust code is compiled to WebAssembly, allowing our UI to interface directly with the same control code that we deploy to our robots. This gives us a shared stack between real-world deployments and simulated runs in the front end.
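
Here's a rough sketch of what such a shared abstraction can look like: the same control routine driving either a real backend or a simulated one that can be compiled to WebAssembly. The trait and type names are illustrative, not our actual API.

```rust
// Sketch of a hardware-abstraction trait that lets identical control code
// drive a real robot natively or a simulated one compiled to WebAssembly.

/// The minimal surface the control logic needs from "a robot".
trait RobotBackend {
    fn move_to(&mut self, x: f64, y: f64, z: f64);
    fn place_brick(&mut self);
}

/// Simulated backend: records actions instead of driving motors.
/// In the UI, this kind of implementation is what would run as WebAssembly.
struct SimulatedRobot {
    log: Vec<String>,
}

impl RobotBackend for SimulatedRobot {
    fn move_to(&mut self, x: f64, y: f64, z: f64) {
        self.log.push(format!("move_to({x:.3}, {y:.3}, {z:.3})"));
    }
    fn place_brick(&mut self) {
        self.log.push("place_brick()".to_string());
    }
}

/// Control logic written once, reused for real deployments and simulation.
fn place_course<B: RobotBackend>(
    robot: &mut B,
    start_x: f64,
    y: f64,
    z: f64,
    bricks: usize,
    pitch: f64,
) {
    for i in 0..bricks {
        robot.move_to(start_x + i as f64 * pitch, y, z);
        robot.place_brick();
    }
}

fn main() {
    let mut sim = SimulatedRobot { log: Vec::new() };
    place_course(&mut sim, 0.0, 0.0, 0.0625, 4, 0.2125);
    for line in &sim.log {
        println!("{line}");
    }
}
```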


Design

Atrium features a fully integrated architectural design tool that lets us draw building facades parametrically and in high detail. The exact position of every single brick and mortar joint is pre-computed based on the constraints set by the architect. Designs are broken down segment by segment and can be automatically translated into build-ready plans for our robots.
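
As a much-simplified illustration of the idea, the sketch below pre-computes brick centres for a single running-bond course from a brick size and joint width. Real designs handle openings, corners, special bricks, and many more constraints; the dimensions here are just common metric values used as assumptions.

```rust
// Minimal sketch of parametric brick placement for a single course in
// running bond.

const BRICK_LENGTH: f64 = 0.210; // metres
const BRICK_HEIGHT: f64 = 0.050; // metres
const JOINT: f64 = 0.010;        // mortar joint, metres

/// Centre position of one brick in the facade plane.
#[derive(Debug)]
struct BrickPlacement {
    x: f64, // metres along the wall
    z: f64, // metres above the base
}

/// Pre-compute brick centres for course `course_index` of a wall `wall_length` metres long.
/// Odd courses are shifted by half a pitch to create the running bond.
fn layout_course(wall_length: f64, course_index: usize) -> Vec<BrickPlacement> {
    let pitch = BRICK_LENGTH + JOINT;
    let offset = if course_index % 2 == 1 { pitch / 2.0 } else { 0.0 };
    let z = (course_index as f64 + 0.5) * (BRICK_HEIGHT + JOINT);

    let mut placements = Vec::new();
    let mut x = offset + BRICK_LENGTH / 2.0;
    while x + BRICK_LENGTH / 2.0 <= wall_length {
        placements.push(BrickPlacement { x, z });
        x += pitch;
    }
    placements
}

fn main() {
    for course in 0..2 {
        let bricks = layout_course(2.0, course);
        println!(
            "course {course}: {} bricks, first centre at ({:.3}, {:.3}) m",
            bricks.len(),
            bricks[0].x,
            bricks[0].z
        );
    }
}
```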

Align

The real world doesn't always match the blueprint. Before every build, we capture a full 3D reconstruction of the site using photogrammetry and bring this into Atrium. By aligning the scans with our designed building models, we catch millimeter-scale differences before even a single brick is placed. We reference physical site features such as door frames, window openings, and fiducial markers to ground everything in real-world data. The result is a precisely aligned, true-to-scale digital twin.
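
The core of that alignment step is a rigid fit between matched features. Here's a simplified 2D sketch of such a fit; the real pipeline works in 3D and is considerably more involved, and the sample coordinates are made up.

```rust
// Sketch: fitting a rigid 2D transform (rotation + translation) that maps
// scanned reference features onto their designed positions.

#[derive(Clone, Copy)]
struct P {
    x: f64,
    y: f64,
}

/// Least-squares rigid fit mapping `scan` points onto `design` points.
/// Returns (rotation angle in radians, translation).
fn fit_rigid(scan: &[P], design: &[P]) -> (f64, P) {
    let n = scan.len() as f64;
    let cs = P {
        x: scan.iter().map(|p| p.x).sum::<f64>() / n,
        y: scan.iter().map(|p| p.y).sum::<f64>() / n,
    };
    let cd = P {
        x: design.iter().map(|p| p.x).sum::<f64>() / n,
        y: design.iter().map(|p| p.y).sum::<f64>() / n,
    };

    // Accumulate the 2D cross- and dot-products of the centred point pairs.
    let (mut sin_sum, mut cos_sum) = (0.0, 0.0);
    for (s, d) in scan.iter().zip(design) {
        let (sx, sy) = (s.x - cs.x, s.y - cs.y);
        let (dx, dy) = (d.x - cd.x, d.y - cd.y);
        sin_sum += sx * dy - sy * dx;
        cos_sum += sx * dx + sy * dy;
    }
    let theta = sin_sum.atan2(cos_sum);

    // Translation that carries the rotated scan centroid onto the design centroid.
    let (sin_t, cos_t) = theta.sin_cos();
    let t = P {
        x: cd.x - (cos_t * cs.x - sin_t * cs.y),
        y: cd.y - (sin_t * cs.x + cos_t * cs.y),
    };
    (theta, t)
}

fn main() {
    // Matched features: as-scanned vs. as-designed (metres).
    let scan = [P { x: 0.012, y: 0.004 }, P { x: 4.015, y: 0.001 }, P { x: 4.013, y: 2.498 }];
    let design = [P { x: 0.0, y: 0.0 }, P { x: 4.0, y: 0.0 }, P { x: 4.0, y: 2.5 }];
    let (theta, t) = fit_rigid(&scan, &design);
    println!("rotation: {:.4} rad, translation: ({:.4}, {:.4}) m", theta, t.x, t.y);
}
```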

Build

Once everything is aligned, Atrium connects directly to our robots to begin construction. Using our onboard cameras, we localize each robot and cross-reference its position with the digital twin. Our central planner interprets real-time sensor data to guide each robot, segment by segment, course by course, brick by brick. Multiple robots can work together to speed up construction on larger sites.
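
In spirit, the planner's outer loop looks something like the sketch below: walk the plan brick by brick and hand each placement to the first free robot. The data structures, scheduling, and timings are deliberately simplified and illustrative.

```rust
// Sketch of a central build planner dispatching placements to idle robots.

#[derive(Debug, Clone, Copy)]
struct Brick {
    x: f64,
    z: f64,
}

struct Segment {
    courses: Vec<Vec<Brick>>, // courses[i] = bricks in course i, bottom up
}

struct Robot {
    name: &'static str,
    busy_until: f64, // seconds, simulated clock
}

fn run_plan(segments: &[Segment], robots: &mut [Robot], seconds_per_brick: f64) {
    let mut clock = 0.0;
    for (si, segment) in segments.iter().enumerate() {
        for (ci, course) in segment.courses.iter().enumerate() {
            for brick in course {
                // Pick the robot that frees up first.
                let robot = robots
                    .iter_mut()
                    .min_by(|a, b| a.busy_until.partial_cmp(&b.busy_until).unwrap())
                    .unwrap();
                clock = clock.max(robot.busy_until);
                robot.busy_until = clock + seconds_per_brick;
                println!(
                    "t={:6.1}s {}: segment {si}, course {ci}, brick at x={:.2} z={:.2}",
                    clock, robot.name, brick.x, brick.z
                );
            }
        }
    }
}

fn main() {
    let segment = Segment {
        courses: vec![
            vec![Brick { x: 0.11, z: 0.03 }, Brick { x: 0.33, z: 0.03 }],
            vec![Brick { x: 0.22, z: 0.09 }, Brick { x: 0.44, z: 0.09 }],
        ],
    };
    let mut robots = [
        Robot { name: "robot-a", busy_until: 0.0 },
        Robot { name: "robot-b", busy_until: 0.0 },
    ];
    run_plan(&[segment], &mut robots, 30.0);
}
```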

Whether operating directly on-site or remotely, we have live access to everything: robot telemetry, video feeds, build progress, and quality data. This way we can act fast if something needs attention.
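
The telemetry itself is just a stream of structured messages. Here's a simplified sketch of what such messages could look like; the variant names and fields are illustrative, not our actual schema.

```rust
// Sketch of the kinds of live messages a robot could stream back to Atrium.

#[derive(Debug)]
enum Telemetry {
    /// Periodic pose update from the localization system.
    Pose { x: f64, y: f64, theta: f64 },
    /// A brick was placed, with the measured positional error in millimetres.
    BrickPlaced { course: u32, index: u32, error_mm: f64 },
    /// Something needs human attention.
    Alert { message: String },
}

fn main() {
    let feed = vec![
        Telemetry::Pose { x: 12.4, y: 3.1, theta: 1.57 },
        Telemetry::BrickPlaced { course: 4, index: 17, error_mm: 0.8 },
        Telemetry::Alert { message: "mortar reservoir low".to_string() },
    ];
    for msg in &feed {
        // A real system would stream these over the network; here we just print.
        println!("{msg:?}");
    }
}
```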

Inspect

Every action our robots take is logged: sensor readings, images, videos, you name it. Atrium gives us tools to analyze that data in detail. We use it to catch mistakes, monitor quality, and debug both hardware and software. This isn't just quality control, but an ongoing feedback loop that allows us to adapt and fix issues on the fly.
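
A toy example of that kind of analysis: scanning placement logs for out-of-tolerance bricks. The record format and tolerance below are assumptions for illustration.

```rust
// Sketch: filtering logged placement events for out-of-tolerance bricks.

#[derive(Debug)]
struct PlacementLog {
    robot: &'static str,
    course: u32,
    index: u32,
    error_mm: f64, // measured deviation from the planned position
}

/// Return every placement whose measured error exceeds `tolerance_mm`.
fn out_of_tolerance(logs: &[PlacementLog], tolerance_mm: f64) -> Vec<&PlacementLog> {
    logs.iter().filter(|l| l.error_mm > tolerance_mm).collect()
}

fn main() {
    let logs = [
        PlacementLog { robot: "robot-a", course: 3, index: 12, error_mm: 0.6 },
        PlacementLog { robot: "robot-a", course: 3, index: 13, error_mm: 2.4 },
        PlacementLog { robot: "robot-b", course: 3, index: 14, error_mm: 0.9 },
    ];
    for bad in out_of_tolerance(&logs, 1.5) {
        println!("needs review: {bad:?}");
    }
}
```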

Detect

Bricks come in all shapes and sizes. To properly pick and place bricks and perform automated quality control, we need to recognize and model them accurately. We use custom-trained computer vision models to recognize and measure bricks under every circumstance we can encounter in our deployments. Partial obstructions, strong shadows and sun glare, rain, odd discoloring, it doesn't matter – we compute a precise model for every brick.
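
Downstream, a detection boils down to a fitted brick model that control and quality-control code can consume. Here's a simplified sketch of what that hand-off could look like; the fields, nominal size, and tolerances are illustrative.

```rust
// Sketch: a fitted brick model and a simple dimensional acceptance check.

#[derive(Debug)]
struct DetectedBrick {
    length_mm: f64,
    width_mm: f64,
    height_mm: f64,
    confidence: f64, // detector confidence in [0, 1]
}

/// Accept the detection only if we trust it and it is close to the nominal size.
fn usable(b: &DetectedBrick, nominal: (f64, f64, f64), tol_mm: f64) -> bool {
    b.confidence > 0.9
        && (b.length_mm - nominal.0).abs() <= tol_mm
        && (b.width_mm - nominal.1).abs() <= tol_mm
        && (b.height_mm - nominal.2).abs() <= tol_mm
}

fn main() {
    let nominal = (210.0, 100.0, 50.0); // a common metric brick size, in mm
    let brick = DetectedBrick { length_mm: 211.2, width_mm: 99.4, height_mm: 50.3, confidence: 0.97 };
    println!("usable: {}", usable(&brick, nominal, 3.0));
}
```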

Calibrate

Our robots are built to be scalable and cost-effective. That means we can't rely on perfect parts or precision machining. Instead, we use Atrium to handle calibration, aligning each robot's cameras, sensors, and kinematics so the entire fleet behaves consistently. Calibration happens in a dedicated area at our HQ, using high-fidelity motion-tracking cameras, right before robots head into the field.
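
Conceptually, a calibration run produces a per-robot correction that gets applied to every commanded pose. The sketch below uses a single planar offset and rotation as a stand-in; a real kinematic calibration has many more parameters, and the numbers are made up.

```rust
// Sketch: a per-robot calibration record estimated against motion-tracking
// ground truth, applied as a correction to commanded positions.

#[derive(Debug, Clone, Copy)]
struct Calibration {
    dx: f64,     // metres
    dy: f64,     // metres
    dtheta: f64, // radians
}

/// Apply the calibration to a position commanded in the nominal robot model.
fn correct(cal: Calibration, x: f64, y: f64) -> (f64, f64) {
    let (s, c) = cal.dtheta.sin_cos();
    (c * x - s * y + cal.dx, s * x + c * y + cal.dy)
}

fn main() {
    // Hypothetical result of a calibration run: a couple of millimetres of
    // offset and a tiny rotation.
    let cal = Calibration { dx: 0.002, dy: -0.001, dtheta: 0.0005 };
    let (x, y) = correct(cal, 0.500, 0.100);
    println!("corrected target: ({x:.4}, {y:.4}) m");
}
```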

Simulate

Before anything touches the construction site, we simulate the entire build in Atrium. From software changes to new hardware, we can test everything virtually, checking timing, reachability, collisions, and more. These fast, automated simulations help us catch bugs and optimize performance before real-world deployment, enabling very fast R&D cycles.
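
A dry run can be as simple as checking each planned brick against the robot's workspace and adding up estimated time. This sketch shows that idea only; real simulations also cover collisions and full kinematics, and the limits and timings here are assumptions.

```rust
// Sketch: a fast dry run of a build plan checking reachability and timing.

struct PlannedBrick {
    x: f64, // metres from the robot base, along the wall
    z: f64, // metres above the base
}

struct SimReport {
    unreachable: usize,
    estimated_seconds: f64,
}

fn simulate(plan: &[PlannedBrick], reach_m: f64, seconds_per_brick: f64) -> SimReport {
    let mut unreachable = 0;
    for b in plan {
        // Simple workspace check: brick must lie within the arm's reach sphere.
        if (b.x * b.x + b.z * b.z).sqrt() > reach_m {
            unreachable += 1;
        }
    }
    SimReport {
        unreachable,
        estimated_seconds: plan.len() as f64 * seconds_per_brick,
    }
}

fn main() {
    let plan = vec![
        PlannedBrick { x: 0.4, z: 0.3 },
        PlannedBrick { x: 0.8, z: 1.9 }, // likely out of reach for a small arm
    ];
    let report = simulate(&plan, 1.2, 30.0);
    println!(
        "{} unreachable bricks, ~{:.0} s total",
        report.unreachable, report.estimated_seconds
    );
}
```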

Analyze

With multiple robots across multiple sites, staying in sync can get messy. Atrium brings clarity by streaming live operational data, such as video from the field, placement logs, quality metrics, and operator notes, into a single dashboard. It gives our team a clear view of what's happening, no matter where it's happening.
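
Under the hood this is aggregation: rolling raw placement logs up into per-site summaries. A simplified sketch of that roll-up is below; the site names and metrics are made up for illustration.

```rust
// Sketch: aggregating placement logs into per-site dashboard summaries.

use std::collections::HashMap;

struct Placement {
    site: &'static str,
    error_mm: f64,
}

/// site -> (brick count, mean placement error in mm)
fn summarize(placements: &[Placement]) -> HashMap<&'static str, (usize, f64)> {
    let mut totals: HashMap<&'static str, (usize, f64)> = HashMap::new();
    for p in placements {
        let entry = totals.entry(p.site).or_insert((0, 0.0));
        entry.0 += 1;
        entry.1 += p.error_mm;
    }
    totals
        .into_iter()
        .map(|(site, (count, sum))| (site, (count, sum / count as f64)))
        .collect()
}

fn main() {
    let placements = [
        Placement { site: "site-a", error_mm: 0.7 },
        Placement { site: "site-a", error_mm: 1.1 },
        Placement { site: "site-b", error_mm: 0.9 },
    ];
    for (site, (count, mean)) in summarize(&placements) {
        println!("{site}: {count} bricks, mean error {mean:.2} mm");
    }
}
```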