Databricks Architect – ETL Modernization / SSIS Migration (Remote, LATAM) at Atmosera
About the Role
We are seeking a Databricks Data Architect for a 6-week engagement to support the modernization of an existing data integration platform.
The current environment relies on SSIS packages orchestrated through SQL Server Agent jobs, synchronizing data between multiple SQL Server and Oracle databases. The goal of this engagement is to analyze the current architecture, design a target-state Databricks-based solution, and support the migration of these workloads to a modern lakehouse architecture.
You will work closely with infrastructure architects and data engineers to define migration strategies, redesign data pipelines, and ensure reliable, scalable data movement across systems.
This role is ideal for someone experienced in data-platform modernization, legacy ETL migrations, and Databricks-based architectures.
Assessment & Discovery
- Analyze the current SSIS packages, SQL Server Agent jobs, and cross-database synchronization processes.
- Document existing data flows, scheduling and orchestration patterns, transformation logic, and dependencies between SQL Server and Oracle systems.
- Identify bottlenecks, operational risks, and opportunities to simplify pipelines; define migration complexity and a sequencing strategy.
Architecture & Migration Design
- Define the target-state architecture using Databricks and Delta Lake.
- Design modern equivalents for SSIS ETL packages, SQL Agent job orchestration, and cross-database synchronization logic.
- Define patterns for data ingestion from SQL Server and Oracle, incremental data synchronization, job orchestration, and error handling.
- Provide a clear migration blueprint for moving workloads from SSIS into Databricks pipelines.
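For context on the "incremental data synchronization" pattern named above, a common approach is a high-water-mark load: track the latest successfully loaded modification timestamp per table and copy only newer rows on each run. The following is a minimal, database-agnostic sketch, not the engagement's actual design; sqlite3 stands in for the SQL Server/Oracle sources, and all table and column names (`orders`, `modified_at`, `_watermarks`) are illustrative.

```python
import sqlite3

def incremental_load(src: sqlite3.Connection, dst: sqlite3.Connection, table: str) -> int:
    """Copy rows from src to dst where modified_at exceeds the stored watermark."""
    dst.execute("CREATE TABLE IF NOT EXISTS _watermarks (tbl TEXT PRIMARY KEY, wm TEXT)")
    row = dst.execute("SELECT wm FROM _watermarks WHERE tbl = ?", (table,)).fetchone()
    watermark = row[0] if row else ""
    rows = src.execute(
        f"SELECT id, payload, modified_at FROM {table} "
        f"WHERE modified_at > ? ORDER BY modified_at",
        (watermark,),
    ).fetchall()
    for r in rows:
        # Upsert keeps the load idempotent if a batch is retried after a failure.
        dst.execute(
            f"INSERT OR REPLACE INTO {table} (id, payload, modified_at) VALUES (?, ?, ?)", r
        )
    if rows:
        # Advance the watermark only after the batch has been written.
        dst.execute(
            "INSERT OR REPLACE INTO _watermarks (tbl, wm) VALUES (?, ?)",
            (table, rows[-1][2]),
        )
    dst.commit()
    return len(rows)

# Demo with in-memory databases standing in for source and target systems.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, payload TEXT, modified_at TEXT)")
dst.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, payload TEXT, modified_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, "a", "2024-01-01"), (2, "b", "2024-01-02")])
src.commit()

first = incremental_load(src, dst, "orders")   # initial run loads both rows
src.execute("INSERT INTO orders VALUES (3, 'c', '2024-01-03')")
src.commit()
second = incremental_load(src, dst, "orders")  # later run loads only the new row
```

In a Databricks target the same idea typically maps to an ingestion job writing into Delta tables with `MERGE INTO` semantics; the watermark bookkeeping above is the part that replaces SSIS change-tracking logic.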
Implementation Support
- Guide or support the implementation of Databricks pipelines, jobs, and notebooks that replace existing SSIS workloads.
- Assist engineering teams in rebuilding ETL logic in Databricks, implementing incremental ingestion patterns, validating data consistency between systems, and supporting testing and cutover planning.
- Ensure data integrity, performance, and reliability throughout the migration.
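The "validating data consistency between systems" task above is often implemented as count-and-aggregate comparisons between the legacy source and the migrated target. A minimal sketch under that assumption, again with sqlite3 as a stand-in and hypothetical table/column names:

```python
import sqlite3

def tables_match(a: sqlite3.Connection, b: sqlite3.Connection, table: str) -> bool:
    """True when both connections agree on row count and simple aggregates."""
    q_count = f"SELECT COUNT(*) FROM {table}"
    q_agg = f"SELECT COALESCE(SUM(id), 0), COALESCE(SUM(LENGTH(payload)), 0) FROM {table}"
    return (a.execute(q_count).fetchone() == b.execute(q_count).fetchone()
            and a.execute(q_agg).fetchone() == b.execute(q_agg).fetchone())

# Demo: two stores with identical contents, then introduce drift in the target.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
for conn in (src, dst):
    conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, payload TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", [(1, "a"), (2, "b")])
    conn.commit()

ok = tables_match(src, dst, "orders")           # identical contents
dst.execute("INSERT INTO orders VALUES (3, 'c')")
drifted = not tables_match(src, dst, "orders")  # target has drifted
```

Coarse aggregates like these catch bulk drift cheaply during cutover testing; row-level hash comparison is the usual follow-up where a mismatch is detected.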
Documentation & Communication
- Document current architecture (as-is), target architecture (to-be), and the migration roadmap.
- Communicate architectural decisions, trade-offs, and risks clearly to both technical and non-technical stakeholders.
Required Qualifications
- Strong hands-on experience with Databricks, including clusters, jobs, notebooks, and Delta Lake.
- Experience modernizing or migrating legacy ETL systems such as SSIS, Informatica, or similar platforms.
- Experience building data pipelines integrating relational systems such as SQL Server and Oracle.
- Understanding of data synchronization patterns, incremental loading strategies, and ETL orchestration.
- Ability to analyze large, coupled legacy data workflows and define pragmatic modernization paths.
- Strong communication skills and experience collaborating with architecture and engineering teams.
Preferred Qualifications
- Experience with Databricks on AWS environments.
- Experience replacing SQL Server Agent or legacy schedulers with modern orchestration frameworks.
- Familiarity with data lakehouse architectures and Delta Lake design patterns.
- Experience with large-scale ETL modernization or data-platform migration initiatives.
You will work with a collaborative architecture and engineering team to modernize a legacy data integration platform and transition it to a scalable Databricks-based architecture. This short-term engagement offers the opportunity to make a high-impact contribution to a critical data-platform modernization initiative.