Sr Data Engineer - Toronto/Waterloo at Syndesus – Toronto

Job Function: Information Technology

About This Position

Senior Data Engineer | Hybrid | Waterloo or Toronto, ON (must be locally based)

The Role

Senior IC role on a data innovation team responsible for designing, building, and operating the data architecture that powers analytics, ML/AI initiatives, and business intelligence across a large-scale consumer software platform. You'll work at the intersection of data engineering, data science, and data quality, partnering closely with stakeholders, data scientists, and product teams.
Responsibilities
  • Design and deploy comprehensive data architecture capturing structured and unstructured data from diverse internal and external sources
  • Build resilient ETL/ELT pipelines routing data across cloud structures, local databases, and other storage forms
  • Implement data quality frameworks — validation, monitoring, and automated recovery strategies
  • Collaborate with data scientists to enable advanced analytics, predictive modeling, and ML initiatives
  • Develop web-enabled, self-service analytics solutions that democratize data access company-wide
  • Apply AI/ML and big-data techniques to automate data cleansing, transformation, and enrichment
  • Leverage MCP (Model Context Protocol) to connect enterprise applications and automate data flows
  • Ensure secure, scalable, and compliant data ingestion with appropriate PII handling
  • Troubleshoot pipeline issues, optimize performance, and participate in on-call rotations
  • Mentor junior team members and contribute to data engineering practice growth

Requirements
  • 8+ years of hands-on ETL/ELT pipeline development across varied data sources
  • Strong programming skills in Python, Scala, or Java (production-quality code)
  • Experience with modern data platforms — Snowflake, Databricks, Apache Spark, Kafka, Airflow
  • Cloud platform experience — AWS, Azure, or GCP and their native data services
  • Experience with real-time data processing and streaming architectures
  • Solid data modeling, warehousing, and dimensional modeling fundamentals
  • Knowledge of containerization and orchestration (Docker, Kubernetes)
  • Practical knowledge of MCP and AI-assisted development tools
  • Familiarity with DataOps and MLOps practices
  • Experience managing sensitive/PII data with attention to compliance and governance
  • Strong communication skills across technical and non-technical stakeholders

Preferred
  • Background in data science or analytics
  • Experience in client-facing or Professional Services roles


Compensation

CA$130,000 – CA$175,000 base + ~15% bonus. Full benefits.

The Company

Well-established consumer software company with a large global footprint. Strong benefits: bonus, pension, medical/dental/vision, generous PTO, and paid parental leave.


Job Location

Toronto
