Starburst Delta Lake and Unity Catalog connector vs Tableflow

Hi,

Does Starburst have similar capabilities when working with Databricks and Kafka?

Can an implementation with a Kafka cluster use Starburst for use cases similar to Confluent's Tableflow?

Thanks!

Starburst Galaxy has a feature called Kafka streaming ingestion, documented at Starburst | Kafka streaming ingestion. A recent webinar demonstrated it as well; you can watch the replay at https://www.youtube.com/watch?v=N09DmkAeP94.

Thanks! Can this be used for Delta Lake tables as well?

No, it can only create Iceberg tables. If you wanted to, you could run a little follow-along script that uses the CDC function (Iceberg connector — Trino 471 Documentation) to grab the changes made since the last Iceberg snapshot ID you caught up to, and then apply that next batch of changes to a Delta Lake table.
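
For illustration, here is a minimal sketch of what that follow-along script could look like. It assumes the CDC function referenced above is the Iceberg connector's table_changes table function, and every name in it is a hypothetical placeholder: a coordinator at trino.example.com, catalogs named iceberg and delta, a landing.events table with event_id and payload columns, and a local JSON file used to remember the last snapshot applied. It drives the SQL through the trino Python client.

```python
# Minimal sketch, not a supported Starburst feature: all names below
# (host, catalogs, schema, table, columns) are hypothetical placeholders.
import json
import pathlib

import trino  # pip install trino

STATE_FILE = pathlib.Path("last_snapshot.json")  # remembers the last snapshot applied

conn = trino.dbapi.connect(
    host="trino.example.com",  # your Starburst/Trino coordinator
    port=443,
    user="etl",
    http_scheme="https",
)
cur = conn.cursor()

# 1. Current snapshot of the Iceberg table the ingestion framework writes to.
cur.execute(
    'SELECT snapshot_id FROM iceberg.landing."events$snapshots" '
    "ORDER BY committed_at DESC LIMIT 1"
)
current_snapshot = cur.fetchone()[0]

# 2. Snapshot this script last caught up to (None on the first run).
last_snapshot = (
    json.loads(STATE_FILE.read_text())["snapshot_id"] if STATE_FILE.exists() else None
)

# 3. Grab the rows added between the two snapshots with table_changes and
#    append them to the Delta Lake copy of the table. (On the first run you
#    would instead seed the Delta table with a full copy.)
if last_snapshot is not None and last_snapshot != current_snapshot:
    cur.execute(
        f"""
        INSERT INTO delta.landing.events
        SELECT event_id, payload  -- the table's own columns, not the _change_* metadata
        FROM TABLE(iceberg.system.table_changes(
            schema_name       => 'landing',
            table_name        => 'events',
            start_snapshot_id => {int(last_snapshot)},
            end_snapshot_id   => {int(current_snapshot)}))
        WHERE _change_type = 'insert'
        """
    )
    cur.fetchall()  # consume the result so the INSERT runs to completion

# 4. Remember where we caught up to for the next run.
STATE_FILE.write_text(json.dumps({"snapshot_id": current_snapshot}))
```

You'd run that on whatever schedule the Delta Lake consumers need. Note it only copies inserts, which matches an append-only Kafka ingestion pattern; handling updates or deletes would mean acting on the other _change_type values that table_changes exposes.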

Fundamentally, I’d guess it isn’t a GIANT engineering effort to give the ingestion framework a switch that determines which table format to use, but that would be something you’d have to request from Starburst, since the ingestion framework is not open source.