Our November newsletter highlights Lance community governance, a deep dive on Lance and Iceberg, a demo of Netflix's multimodal search, recordings of previous talks, and the latest product and community updates.
We’ve launched a dedicated Lance Discord, website, and GitHub organization focused entirely on the format, feature discussions, proposals, and real-world use cases.
🧊 From BI to AI: A Modern Lakehouse Stack with Lance and Iceberg
The modern lakehouse stack is composed of six layers.
Iceberg remains a strong choice for large-scale OLAP and BI workloads. Lance complements it by addressing AI and multimodal data requirements with an Arrow-native layout, high-performance indexing, and built-in interop with Parquet.
Together, both formats can coexist in the same lakehouse stack: Iceberg for BI, Lance for AI.
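Below is a minimal sketch of what that split can look like in practice, using the `pylance` package and PyArrow. The file paths, the Parquet export of the Iceberg table, and the embedding column are all hypothetical; this is an illustration of the interop path, not a prescribed pipeline.

```python
# Sketch: "Iceberg for BI, Lance for AI" in one lakehouse, assuming a
# hypothetical Parquet export of an Iceberg table and the `pylance` package.
import pyarrow as pa
import pyarrow.parquet as pq
import lance

# BI side: analytics tables live in Iceberg; here we read a hypothetical
# Parquet export of one of those tables (Lance interops with Parquet via Arrow).
events = pq.read_table("warehouse/events_export.parquet")

# AI side: enrich the same records with (placeholder) embeddings and store them
# in Lance, which keeps the Arrow layout and supports fast random access and indexing.
embeddings = pa.array(
    [[0.1] * 768] * events.num_rows,
    type=pa.list_(pa.float32(), 768),
)
ai_table = events.append_column("embedding", embeddings)

lance.write_dataset(ai_table, "warehouse/events.lance", mode="overwrite")

# Downstream training or search jobs read the Lance dataset directly as Arrow.
ds = lance.dataset("warehouse/events.lance")
print(ds.to_table(limit=5))
```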
Here is a demo from Netflix and LanceDB’s joint talk at Ray Summit 2025, highlighting how to search through hundreds of terabytes of multimodal data with negligible latency and perform multimodal data understanding at scale.
Recordings You Might've Missed
Scaling Multimodal Data Curation with Ray and LanceDB
Lei Xu (LanceDB), Pablo Delgado (Netflix)
Data Loading for Data Engineers
Weston Pace (LanceDB)
Supercharging Multimodal Feature Engineering
Jack Ye (LanceDB)
Product Updates
LanceDB Enterprise Features
Full-text search is now available in SQL, reaching parity with our Python API. We have also introduced incremental indexing using SPFresh, which eliminates full reindexing while keeping centroids fresh and significantly reducing cold-query latency.
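For reference, here is a minimal sketch of full-text search through the open-source Python API, which the new SQL path is described as matching. The table name, column names, and sample rows are hypothetical, and the SPFresh-based incremental indexing mentioned above is an Enterprise server-side feature; in open source, `optimize()` folds newly added rows into existing indexes.

```python
# Sketch: full-text search with the open-source LanceDB Python API.
# Table name, columns, and data are hypothetical.
import lancedb

db = lancedb.connect("./fts_demo")
tbl = db.create_table(
    "docs",
    data=[
        {"id": 1, "text": "Lance is an Arrow-native format for multimodal data"},
        {"id": 2, "text": "Iceberg remains a strong choice for BI workloads"},
    ],
    mode="overwrite",
)

# Build a full-text index on the text column using the native Lance FTS engine.
tbl.create_fts_index("text", use_tantivy=False)

# Add new rows; in Enterprise, SPFresh-based incremental indexing absorbs them
# without a full rebuild. In open source, optimize() updates existing indexes.
tbl.add([{"id": 3, "text": "Full-text search is now available in SQL as well"}])
tbl.optimize()

hits = tbl.search("full-text search", query_type="fts").limit(5).to_pandas()
print(hits[["id", "text"]])
```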