Senior Google Cloud Data Engineer in Brazil at Jobgether
Job Description
This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Google Cloud Data Engineer in Brazil.
This role is focused on designing, building, and optimizing modern, scalable data platforms on Google Cloud Platform to support high-impact digital and analytics products.
You will work on both real-time and batch data pipelines that process large-scale, complex datasets powering business-critical insights.
The position involves deep collaboration with analytics, data science, and engineering teams in a global, agile environment.
You will play a key role in ensuring data reliability, performance, governance, and cost efficiency across cloud-based architectures.
Beyond pipeline development, you will contribute to data modeling, transformation standards, and BI enablement using Looker and BigQuery.
This is a hands-on technical role with strong ownership, combining engineering excellence with strategic data platform thinking.
Responsibilities
- Design, build, and maintain scalable batch and real-time data pipelines using GCP services such as BigQuery, Dataflow, and Pub/Sub.
- Develop robust ETL/ELT workflows, ensuring high availability, fault tolerance, and data accuracy across distributed systems.
- Implement and maintain dbt transformation models, CI/CD pipelines, and structured data contracts for curated datasets and analytical marts.
- Optimize BigQuery performance through advanced query tuning, partitioning, clustering, and cost-efficient architecture design.
- Build and monitor data quality frameworks using tools such as Great Expectations, including freshness SLOs and reconciliation checks.
- Develop event-driven architectures and streaming pipelines using windowing, triggers, and watermarking strategies.
- Design and maintain LookML semantic models, ensuring consistent metrics and governance across the organization.
- Build impactful dashboards in Looker for both operational monitoring and executive reporting use cases.
- Perform root cause analysis to resolve data, pipeline, or performance issues and implement permanent fixes.
- Implement platform reliability controls including retries, dead-letter queues, disaster recovery runbooks, and security validation.
- Collaborate with cross-functional teams to ensure alignment between data engineering, analytics, and business needs.
- Document systems, pipelines, and architectural decisions to ensure transparency and maintainability.
Requirements
- Strong hands-on experience in data engineering using Python for automation, pipeline development, and data processing.
- Advanced SQL expertise, including complex queries, nested structures, and analytical functions.
- Deep experience with Google Cloud Platform, especially BigQuery, Dataflow (Apache Beam), and Pub/Sub.
- Proven ability to build scalable batch and streaming pipelines with high reliability and performance.
- Strong understanding of data modeling, transformation frameworks, and modern data architecture principles.
- Experience implementing dbt workflows, CI/CD pipelines, and data governance practices.
- Expertise in Looker, including LookML, semantic modeling, explores, and dashboard development for diverse audiences.
- Strong knowledge of data quality frameworks, monitoring systems, and production-grade data operations.
- Experience optimizing cloud cost and performance in large-scale distributed data environments.
- Ability to work independently while collaborating effectively in agile, cross-functional teams.
- Strong communication skills with the ability to translate technical concepts into business insights.
- Advanced English proficiency.
Benefits
- Competitive compensation aligned with market standards and experience level.
- Flexible remote work model within Brazil.
- Opportunity to work with cutting-edge Google Cloud technologies and large-scale data systems.
- International, collaborative environment with strong engineering culture.
- Career growth opportunities in data engineering, cloud architecture, and analytics leadership.
- Exposure to global clients and complex, high-impact data challenges.
- Continuous learning culture with support for certifications and technical development.