
Jasin Bushnaief
CTO, Co-founder

We ingested 1.73 trillion lidar points covering New Zealand’s North Island, transforming 13 TB of data and putting it behind a seamless, queryable API. Instead of the roughly 300,000 fragmented “tiles” that traditional workflows rely on, this country-scale dataset becomes a single source of truth that significantly reduces time to insight and enables instant, interactive visualization directly in the browser. Whether you are querying a single building or a 700 km cross-section, the system eliminates file boundaries, so managing a massive archive feels as lightweight and accessible as working with a million-point scan.

We have added support for streaming point clouds directly from LumiDB into QGIS, letting you use your data for map creation or geospatial analysis without downloading massive files. Using EPT links, you can instantly stream datasets with full level-of-detail support. This feature builds on LumiDB’s runtime transformation, so all existing data is ready to use immediately, with no time-consuming re-indexing or pre-processing.
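For readers unfamiliar with EPT: an EPT link points at an `ept.json` metadata document that describes the whole dataset (bounds, total point count, schema), which is what lets a client like QGIS stream only the detail levels it needs. A minimal sketch of reading such a document with the Python standard library; the field names follow the Entwine Point Tile specification, and the values here are illustrative, not from a real LumiDB endpoint:

```python
import json

# Inline sample of an ept.json document. Field names per the EPT spec;
# the numbers are made up for illustration.
ept_json = """
{
  "version": "1.0.0",
  "bounds": [0, 0, 0, 1024, 1024, 1024],
  "points": 1730000000000,
  "span": 256,
  "dataType": "laszip",
  "hierarchyType": "json",
  "schema": [
    {"name": "X", "type": "signed", "size": 4},
    {"name": "Y", "type": "signed", "size": 4},
    {"name": "Z", "type": "signed", "size": 4}
  ]
}
"""

meta = json.loads(ept_json)

# The root bounds are a cube [minx, miny, minz, maxx, maxy, maxz];
# "span" is the voxel resolution of the root octree node, and each
# deeper level doubles it, which is how LOD streaming stays cheap.
minx, miny, minz, maxx, maxy, maxz = meta["bounds"]
print(f"{meta['points']:,} points, root span {meta['span']}")
print(f"extent: {maxx - minx} x {maxy - miny} x {maxz - minz}")
```

In QGIS itself none of this is needed: you paste the EPT URL as a point cloud data source and the application walks the octree hierarchy for you.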

We’ve brought progressive 3D Tiles streaming to the LumiDB Viewer, letting users see full 3D datasets and live query results the moment they load, with no waiting for downloads to finish. This upgrade is part of a broader summer rollout focused on delivering a smooth, high-performance experience for exploring and analyzing massive point cloud datasets.

Managing large 3D scan datasets efficiently is challenging—especially under strict memory constraints. In this post, we explore how metadata queries in LumiDB let you interactively enable and disable scans without ever loading the full dataset into memory. We walk through a real-world example, where a building scan is split across multiple scanner positions, and show how LumiDB’s built-in filtering and level-of-detail (LOD) handling keep your application fast and responsive. 🚀

From hacking together data management software for autonomous robots at Amazon to starting LumiDB, this is the story of how we set out to fix reality capture data. Learn how we’re tackling the challenges of exploding data volumes, outdated tools, and scattered workflows to build a future where reality capture data is easily accessible.



