
Big Data Engineer (GCP) in Phoenix, Arizona at OmegaHires

Salary: $55.00 - $60.00/hr
Job Function: Information Technology
OmegaHires
Phoenix, Arizona, 85003, United States

Job Description

Job Title: Big Data Engineer (GCP)
Location: Phoenix, AZ (Onsite/Hybrid)
Duration:
Job Summary

We are seeking an experienced Big Data Engineer with strong expertise in Google Cloud Platform (GCP) to design, build, and optimize scalable data pipelines and analytics solutions. The ideal candidate will have hands-on experience with BigQuery and GCP data services, and will collaborate closely with data scientists, architects, and business stakeholders to deliver high-performance, reliable data systems.

Key Responsibilities

Data Engineering & Pipeline Development
  • Design, develop, and maintain scalable data pipelines using GCP services.
  • Build efficient ETL/ELT processes for structured and unstructured data.
  • Ensure data quality, integrity, and availability across systems.
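The pipeline responsibilities above can be sketched as a minimal ETL step in plain Python. This is an illustrative sketch only; the field names (`user_id`, `amount`) and the in-memory sink standing in for a warehouse table are assumptions, not part of the role description.

```python
# Minimal ETL sketch: extract raw records, validate/transform, load.
# Field names ("user_id", "amount") are illustrative, not from the posting.

def extract(raw_rows):
    """Yield rows from an upstream source (here, an in-memory list)."""
    yield from raw_rows

def transform(rows):
    """Drop malformed rows and normalize types (a basic data-quality gate)."""
    for row in rows:
        if row.get("user_id") is None:
            continue  # enforce integrity: skip rows missing the key field
        yield {"user_id": int(row["user_id"]),
               "amount": float(row.get("amount", 0))}

def load(rows, sink):
    """Append cleaned rows to a sink (stands in for a warehouse table)."""
    sink.extend(rows)
    return len(sink)

sink = []
raw = [{"user_id": "1", "amount": "9.5"}, {"user_id": None}]
load(transform(extract(raw)), sink)
print(sink)  # only the valid, normalized row survives
```

In a production GCP pipeline the extract and load stages would typically be Dataflow or BigQuery connectors rather than Python lists, but the validate-and-normalize structure is the same.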
GCP & Big Data Technologies
  • Work extensively with BigQuery, Dataflow, and Dataproc for data processing and analytics.
  • Optimize BigQuery queries for performance and cost efficiency.
  • Leverage GCP-native tools for scalable and resilient data architectures.
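The BigQuery cost-optimization point above usually comes down to scanning fewer bytes, since BigQuery bills by data scanned. A hedged sketch, with a hypothetical `project.dataset.events` table partitioned on `event_ts`:

```python
# Two versions of a BigQuery query; the table and column names are
# hypothetical. BigQuery bills by bytes scanned, so selecting only the
# needed columns and filtering on the partition column both cut cost.

costly = "SELECT * FROM project.dataset.events"

optimized = """
SELECT user_id, event_type, event_ts
FROM project.dataset.events
WHERE DATE(event_ts) BETWEEN '2024-01-01' AND '2024-01-07'  -- partition pruning
"""

print("SELECT *" in costly)     # full-column scan of every partition
print("SELECT *" in optimized)  # explicit columns + partition filter
```

Because BigQuery is columnar, dropping unused columns reduces bytes billed even when the row count is unchanged; the partition filter then prunes whole days of data before the scan starts.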
Programming & Processing
  • Develop data processing solutions using Python, Java, or Scala.
  • Implement batch and real-time data processing frameworks.
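The batch-versus-real-time distinction above can be illustrated in plain Python. This is a simplified stand-in, assuming a keyed-sum aggregation; in production these roles map to bounded and unbounded Dataflow pipelines rather than in-memory loops.

```python
# Batch vs. (simulated) streaming processing in plain Python.
# The windowless running sum is a deliberate simplification of what a
# streaming framework like Dataflow would do with real windowing.

from collections import defaultdict

events = [("a", 1), ("b", 2), ("a", 3), ("b", 4)]

def batch_totals(evts):
    """Batch: process the full, bounded dataset in one pass."""
    totals = defaultdict(int)
    for key, value in evts:
        totals[key] += value
    return dict(totals)

def stream_totals(evts):
    """Streaming: update state incrementally as each event arrives."""
    totals = defaultdict(int)
    for key, value in evts:  # in a real pipeline this is an unbounded source
        totals[key] += value
        yield dict(totals)   # emit a running result after every event

print(batch_totals(events))             # {'a': 4, 'b': 6}
print(list(stream_totals(events))[-1])  # converges to the batch answer
```

The design point the example makes: a streaming job keeps mutable state and emits partial results, so correctness questions (ordering, late data, exactly-once updates) appear that the batch version never has to face.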
Workflow Orchestration & Automation
  • Design and manage workflows using Airflow or Cloud Composer.
  • Automate data pipelines and integrate with CI/CD processes.
Collaboration & Delivery
  • Partner with data scientists, analysts, and business teams to understand requirements.
  • Participate in Agile ceremonies and contribute to sprint deliverables.
  • Ensure timely delivery of high-quality data solutions.
Required Qualifications
  • 7+ years of experience in Big Data Engineering.
  • Strong hands-on experience with GCP services (BigQuery, Dataflow, Dataproc).
  • Proficiency in Python, Java, or Scala for data engineering.
  • Strong SQL skills with experience in query optimization.
  • Experience with workflow orchestration tools (Airflow/Composer).
  • Familiarity with Agile methodologies and CI/CD practices.
  • Strong problem-solving and analytical skills.
Nice to Have
  • Experience with real-time streaming (Pub/Sub, Kafka).
  • Knowledge of data warehousing and data lake architectures.
  • Exposure to data governance and security best practices.

