Data Migration with Starburst

Keep your business moving forward while migrating to a Modern Data Lake

Transformative data teams choose Starburst to succeed at every stage of their cloud modernization journey

Data and analytics architectures built on legacy on-premises data lakes and rigid data warehouses prevent companies from harnessing the power, performance, productivity, and potential of all their big data.

With Starburst as a part of your modernization journey, complete your data migration while moving your business forward faster, unleashing the potential of all your data before, during, and after the move.

Customers choosing Starburst as their cloud analytics platform can complete their data migrations up to 67% faster than with competing modern data stacks and achieve up to 42% lower TCO than the leading cloud analytics vendors.


Make your Modern Data Lake the center of your data migration strategy

Modern data lakes, or data lakehouses, combine the advanced capabilities of a data warehouse for scaling high-concurrency SQL analytics with the flexibility and cost-effectiveness of a cloud data lake, consolidating your data architecture. The modern data lake becomes the center of gravity for data without being the single source system. Organizations can leverage open table and file formats from technologies like Iceberg, Delta Lake, or Hudi to use the lake not only as a cost-effective object store and landing zone, but also for the interactive analytics you would expect from a data warehouse and for prescriptive, predictive, and cognitive analytics. Modern data lakes also usher in a new era of efficiency, productivity, and cost optimization by phasing out legacy systems in favor of systems built in and for the cloud.

How 8 companies gained greater data warehousing value with Starburst

Read eBook

Starburst capabilities for data migrations


Plan for success

With Starburst’s data lake analytics platform as a critical component of the data migration strategy, data teams can bridge the silos between on-prem and cloud data sources, creating a single access point for company-wide SQL analytics while maintaining existing security and governance requirements. This avoids costly data outages and other risks frequently experienced in traditional migrations and allows your data migration plan to be built with greater flexibility while enhancing analytics user experience.

Execute with confidence and efficiency

Cut your data migration timeline in half and eliminate disruptions to business-critical use cases. During the migration, use a combination of Starburst's 50+ data source connectors, its SQL-based semantic layer, your data migration pipelines, and third-party services to create a shorter path for your data to reach its cloud destination. With Starburst's semantic layer in place, data consumers continue to benefit from fast, reliable, accurate, and timely data without downtime or any awareness of the in-flight data migration.

Innovate with newly unlocked capabilities

Post migration, Starburst improves data team collaboration and scales your analytics more cost-effectively than leading cloud data warehouses. With data management enhanced via universal search, cataloging, governance, data quality and lineage, and data products, data producers and consumers can find the data they need faster and streamline data sharing. New data requests benefit from Starburst’s cross-cloud and multi-source semantic layer and fault-tolerant engine, allowing new jobs to be created no matter where the data lives and without switching data platforms – enabling true, highly scalable multi-cloud analytics.

Starburst as a part of your migration solution

Starburst with Trino moves your business forward at all stages of your data migration strategy. With universal and uninterrupted access to your data from Day 0, whether it's on-prem, in the cloud, multi-cloud, or in a hybrid system, your data teams and consumers will never lose access to business-critical insights because of a migration – Starburst enhances those insights. Starburst's leading data lake analytics platform allows your migration team to complete data migrations with greater agility, in half the time of other vendors. Once your project is complete, Starburst's open standards reinforce your control over your data and provide the greatest flexibility to your modern data strategy and architecture.

Why a modern data lake architecture is essential for data-driven organizations

Read eBook

New capabilities and realized value

Once a data migration project is complete, the value of having Starburst as a part of the data architecture doesn't end. A whole new set of capabilities and benefits is unlocked.

Data lake query engine

A fast query engine for reading data directly from a data lake.

Federated queries

Join data across multiple data sources and types without moving data physically.
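To make the idea concrete, here is a minimal sketch of what a federated query looks like in Trino-style SQL, where tables in different systems are addressed as `catalog.schema.table`. The catalog, schema, and table names below are illustrative assumptions, not real deployments; the helper simply composes the SQL text.

```python
# Hypothetical sketch: composing a federated Trino-style query that joins a
# table in an on-prem PostgreSQL catalog with one in a cloud data lake catalog.
# All catalog/schema/table names are illustrative assumptions.

def qualified(catalog: str, schema: str, table: str) -> str:
    """Return a fully qualified table name: catalog.schema.table."""
    return f"{catalog}.{schema}.{table}"

def federated_join(left: str, right: str, on: str, columns: list[str]) -> str:
    """Build a SQL statement joining two tables that may live in
    different catalogs (i.e., different underlying systems)."""
    cols = ", ".join(columns)
    return (
        f"SELECT {cols}\n"
        f"FROM {left} AS l\n"
        f"JOIN {right} AS r ON l.{on} = r.{on}"
    )

sql = federated_join(
    left=qualified("postgres_onprem", "sales", "orders"),
    right=qualified("s3_lake", "crm", "customers"),
    on="customer_id",
    columns=["l.order_id", "r.customer_name", "l.total"],
)
```

Because both tables are addressed through catalogs rather than physical locations, the same join works unchanged whether the data sits on-prem, in object storage, or in both.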


Materialized views

Create persistent copies of data integrated across systems for low-latency, repeated use, such as reporting dashboards.

Data Products

Enable organization and reuse of data assets by publishing certified and managed data products.

Data Localization

Ensure compliance with regional regulations that require data to be stored and processed locally.


Starburst Gravity

Universal discovery, governance, and sharing layer that enables the management of all data assets connected to Starburst. This includes capabilities like universal search, cataloging, governance, and data products.

Multi-cloud queries

Federated queries joining data across multiple locations such as on-prem and public clouds, significantly reducing the need to move data between locations.

Data migration benefits data teams

Data analysts, data scientists, and data engineers gain

  • A high-performance query engine to access and explore data.
  • In-place exploration of data without having to move data into centralized data storage like a data warehouse or data lake.
  • Reduced dependency on data engineering or IT teams to provision data in a centralized repository.
  • Simplified single access point for data users to connect to 50+ sources.

Data engineers gain

  • Ability to create data products for frequently used data assets.
  • Ability to explore, sample, and model data assets without creating pipelines or moving data.

Data stewards, data governance admins, and data platform admins gain

  • Ability to define policies and uniform access control with built-in RBAC and ABAC and integration with leading governance and security tools.

IT and Infrastructure teams gain

  • Ability to provide access quickly to data without making users wait until data pipelines are set up.
  • Ability to track data usage and use this insight to consolidate the highest value, frequently used data into higher performance, lower latency data stores like data lakes.
  • Ability to flexibly scale and isolate workloads (BI, ad hoc, AI/ML, operational) for optimal performance.

Unlock access to all distributed data at every stage of the data migration process

Unlock access to disparate data across clouds, regions, and ground-to-cloud using cluster-to-cluster connectivity. With Starburst, customers link catalogs or sources supported in one cluster with data sources supported in remote clusters. Starburst becomes the gateway for connected data, unlocking data access across geographies while ensuring access controls are in place and data residency requirements are honored. No more cloud data lock-in: organizations take back control over constantly increasing egress fees.

Traditional data migration processes vs Starburst

Traditional data migration process

In a traditional data migration process, the downstream requirements (e.g., BI, reporting, data science) are tightly coupled with the cloud data warehouse schemas and, thus, with the ETL processes that create them. This would be manageable if those downstream workloads did not change so frequently, which can lead to data integration issues. Building and maintaining pipelines is not necessarily difficult, but at scale it becomes time-consuming, resource-intensive, and tedious. It requires extensive planning and modeling of what-if scenarios to get as much of the ETL and schemas right the first time, avoiding breaks and downstream impact that lead to downtime.

How Starburst simplifies the data migration process

Starburst simplifies the entire migration process. First, with federation, you can create a semantic layer over all of your source tables before moving a single one. That lets you immediately start plugging in your downstream client tools and applications and working out your downstream business requirements. As you migrate data to low-cost cloud object storage, repointing single tables or entire schemas is simple.
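One way to picture "repointing" is a stable view in the semantic layer: downstream tools always query the same view name, while the view's definition is switched from the legacy source to the migrated lake table. The sketch below builds the DDL text; all object names are illustrative assumptions.

```python
# Hypothetical sketch of "repointing" in a semantic layer: consumers query a
# stable view name while its definition is switched from the legacy source
# to the migrated lake table. All names are illustrative assumptions.

def repoint_view(view: str, target_table: str) -> str:
    """Build a CREATE OR REPLACE VIEW statement so consumers keep using
    the same view name while the underlying table moves."""
    return f"CREATE OR REPLACE VIEW {view} AS SELECT * FROM {target_table}"

# Before migration the view reads from the on-prem warehouse...
before = repoint_view("analytics.orders", "onprem_dw.sales.orders")
# ...after migration it reads from tables on cloud object storage.
after = repoint_view("analytics.orders", "s3_lake.sales.orders")
```

Because only the view definition changes, dashboards and applications pointed at `analytics.orders` never notice the cutover.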

Best practices for data migration

Besides the volume of data and location, it is crucial to inventory current processes, data structures, and safeguards. Key considerations for your data migration strategy should include business requirements, data value, data type, governance, and complexity.

Understand the data you are migrating

Data quality is fundamental to business success. The first best practice when migrating data is understanding the data you are moving from one system to another. This involves assessing the source and target systems, identifying the data formats, types, volumes, dependencies, and relationships, and evaluating the data quality and integrity.
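A lightweight profiling pass over a sample of source rows is one way to start this assessment. The sketch below, with invented field names, records row volume, inferred column types, and null rates per column; real migrations would run equivalent checks at scale against the source system itself.

```python
# Hypothetical sketch: profiling a sample of source rows before migration to
# record row count, inferred column types, and null rates. Field names are
# invented for illustration.

from collections import defaultdict

def profile(rows: list[dict]) -> dict:
    """Summarize row count, inferred types, and null rate per column."""
    types: dict = defaultdict(set)
    nulls: dict = defaultdict(int)
    for row in rows:
        for col, val in row.items():
            if val is None:
                nulls[col] += 1
            else:
                types[col].add(type(val).__name__)
    n = len(rows)
    return {
        "row_count": n,
        "columns": {
            col: {"types": sorted(types[col]), "null_rate": nulls[col] / n}
            for col in set(types) | set(nulls)
        },
    }

sample = [
    {"id": 1, "email": "a@x.com"},
    {"id": 2, "email": None},
]
report = profile(sample)
```

A report like this surfaces mixed types and unexpected null rates early, before they break target schemas mid-migration.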

Define a clear strategy

Data migration is not a one-time event but a continuous cycle that requires planning and execution. The second best practice when migrating data is to define a clear strategy that aligns with your business goals, requirements, and constraints. This involves setting the migration project’s scope, objectives, timeline, budget, and roles and responsibilities.

Have a disaster recovery plan

Data migration can be risky and result in data loss or corruption. The third best practice when migrating data is to create a backup of your data before you migrate it; this may also include continued replication of the source data. A backup helps you recover your data in case of errors or failures during the migration process.

Migrate data in batches

Data migration can be lengthy and complex, especially if you have a large amount of data to migrate to your new environment. The fourth best practice when migrating data is to migrate data in batches rather than all at once. This can help you reduce the migration time, minimize the impact on business operations, and monitor and troubleshoot the migration process more efficiently.
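A simple way to plan batches is to pack tables into groups under a size budget, so each group can be moved and validated independently. The greedy packer below is one possible sketch; table names and sizes are invented.

```python
# Hypothetical sketch: grouping tables into migration batches under a size
# budget so each batch can be moved and validated independently.
# Table names and sizes are invented.

def plan_batches(tables: dict[str, int], budget_gb: int) -> list[list[str]]:
    """Greedily pack tables (name -> size in GB) into batches whose total
    size stays within budget_gb; oversized tables get their own batch."""
    batches, current, used = [], [], 0
    for name, size in sorted(tables.items(), key=lambda t: -t[1]):
        if current and used + size > budget_gb:
            batches.append(current)
            current, used = [], 0
        current.append(name)
        used += size
    if current:
        batches.append(current)
    return batches

plan = plan_batches({"orders": 800, "customers": 120, "events": 450},
                    budget_gb=1000)
# -> [["orders"], ["events", "customers"]]
```

In practice the ordering would also weigh dependencies and business criticality, but a size budget alone already keeps each cutover window bounded.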

Use the right data migration tools

Data migration can be challenging and tedious, requiring significant manual work and expertise. The fifth best practice when migrating data is determining which types of data migration tools are needed and the best vendors to provide them. The focus should always be on automating and simplifying the data migration process without incurring massive costs. Many tools are available for data migration, such as cloud-specific data transfer services, other migration software, and professional services. At this stage, consider a SQL-based semantic layer over all of your data sources to operate as a single access point. This data consumption layer allows the underlying data locations, formats, and technologies to change without users needing to be aware.


Test your migration process

Data migration can be prone to errors and failures that can affect the quality and usability of your migrated data. The sixth best practice when migrating data is to test your migration process before you move your data to production. This involves validating your migrated data and systems’ functionality, performance, security, and compatibility.

Test your migrated data thoroughly before using it in production

Data migration can result in changes or discrepancies in your migrated data that can affect your business outcomes. The seventh best practice when migrating data is to test your migrated data thoroughly before using it in production. This involves verifying your migrated data’s accuracy, completeness, consistency, and integrity.
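Row counts and checksums are a common way to verify completeness and consistency. The sketch below is a pure-Python stand-in for illustration; in practice the same counts and digests would be computed as SQL against both the source and target systems and compared.

```python
# Hypothetical sketch: reconciling a migrated table against its source using
# row counts and an order-independent checksum over rows. Pure-Python
# stand-in; real checks would run as SQL against both systems.

import hashlib

def row_checksum(rows: list[tuple]) -> str:
    """Order-independent digest: hash each row, XOR the digests together."""
    acc = 0
    for row in rows:
        h = hashlib.sha256(repr(row).encode()).hexdigest()
        acc ^= int(h, 16)
    return f"{acc:064x}"

def reconcile(source: list[tuple], target: list[tuple]) -> dict:
    """Compare row counts and checksums between source and target tables."""
    return {
        "count_match": len(source) == len(target),
        "checksum_match": row_checksum(source) == row_checksum(target),
    }

result = reconcile(
    source=[(1, "a"), (2, "b")],
    target=[(2, "b"), (1, "a")],  # same rows, different physical order
)
```

The XOR aggregation makes the checksum insensitive to row order, which matters because migrated tables rarely preserve physical ordering.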

Keep your data flowing throughout the migration

Downstream business applications and teams should not experience downtime during migrations. The eighth best practice is to ensure business-critical applications, dashboards, reports, and teams continue to access relevant information without duplicating your entire data architecture.

Caution: Two data groups for special migration consideration

Two groups of data may not be worth the migration headaches. It is crucial to have a clear data migration strategy that defines the scope, criteria, and methods for selecting and migrating data. This strategy should consider the trade-offs between the costs and benefits of migrating different data workloads.

Data for applications managed on-premises

For example, a product team may continue to run a web application on-premises while its smartphone app sibling is deployed to the cloud. Tables supporting both applications, like “customers” or “order history,” may need to persist on-premises for the web application until the web application is also ready to be migrated to the cloud.

Regulated data

Data subject to regulatory requirements must be maintained, and data that is sensitive, outdated, inaccurate, incomplete, or irrelevant to the business's future operations may not be worth moving. Migrating this data can cause errors, performance issues, and compliance risks, and can have significant cost implications.

It may be advantageous to embrace a hybrid cloud approach: persist data on-premises and use Starburst to bridge the cloud and on-premises data sets. This increases the capacity of a migration team, speeds the migration, prevents data or workload outages, and helps avoid data duplication, performance, and compliance issues.


Hadoop

Starburst offers a modern cloud data lake strategy that helps simplify and streamline the data migration from Hadoop to the cloud. It enables a seamless data transition from HDFS to a cloud-based storage service without requiring data format or schema changes. It also allows a seamless transition of applications, pipelines, and tools from MapReduce, Hive, Spark, and other frameworks to Starburst, which, because of its Trino core, can query any data source without data movement or duplication. Starburst also integrates with other open-source tools and frameworks, like Apache Iceberg, Delta Lake, and Apache Hudi, to make the modern data lake a reality.

Data Warehouse

As a part of the data migration process, leading cloud data warehouses recommend first moving your data into a cloud data lake like S3 or Azure Data Lake Storage for staging before transferring the data into their proprietary vaults, making it easy for data to go in, but not so much for it to come back out. Because Starburst provides warehouse-like capabilities on the data lake, you don’t need to put all your data into an expensive and rigid cloud data warehouse. You decide which, if any, data should move into a cloud data warehouse while being able to run high-performance SQL analytics directly on your cloud data lake.

Activate your data on all major clouds

Starburst simplifies data migrations to all three major cloud providers and, once done, activates the data in and around the modern data lake. Choosing the right cloud platform provider can be daunting, with each offering massively scalable object storage across their global data centers, data lake orchestration, and managed Spark, Trino, and Hadoop services. You may opt for a multi-region and multi-cloud architecture, depending on your requirements. Learn how Starburst partners with each cloud to maximize the return on your migration.

Amazon Web Services | Microsoft Azure | Google Cloud Platform

Leading consulting partners work with Starburst to simplify your data migration and help ensure success

Get in touch

Want to try Starburst? Have questions? We're here to help.

Contact Us
