
Data Engineer Intern in Toronto, Ontario at PureFacts Financial Solutions

PureFacts Financial Solutions
Toronto, Ontario, M5E 1G6, Canada


Job Description

About PureFacts Financial Solutions

PureFacts is a revenue performance software company serving wealth management, asset management, and asset servicing firms. We help financial institutions protect, optimize, and grow revenue through a connected platform spanning pricing, billing, compensation, reporting, and transparency. By unifying fragmented data and workflows into a trusted revenue foundation, we help clients improve accuracy, strengthen governance, reduce manual effort, and unlock new growth opportunities.


At PureFacts, we are building an AI-native platform and company. We embed AI, intelligent automation, and agentic workflows across our products and operations to detect anomalies, surface insights, streamline repetitive work, and support faster, better decision-making. In a highly regulated industry, we believe AI must be practical, governed, and auditable, amplifying human expertise while helping our teams and clients focus on higher-value, strategic work.


About the role

We are seeking a Data Engineering Intern to assist in building and maintaining the foundational data pipelines that power our analytics and applications. This role focuses on the extraction of data from various source systems (SaaS APIs, internal services) and the loading of that data into our central data environment.


This is a technical, hands-on role ideal for a student or recent graduate looking to apply their programming skills to real-world data movement and orchestration challenges.


What you'll do

  • Pipeline Development: Build and maintain basic ELT/ETL pipelines to move data from external APIs and internal systems into our database.
  • API Integration: Interface with RESTful and GraphQL APIs to programmatically retrieve data using standard authentication methods (OAuth2, API Keys).
  • Data Transformation: Implement lightweight data cleaning and normalization during the extraction process to ensure data integrity.
  • Schema Management: Assist in the design and maintenance of destination tables and schemas.
  • Monitoring & Maintenance: Monitor pipeline health, identify failures, and perform basic debugging to ensure consistent data delivery.
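As an illustrative sketch of the extract-and-load pattern these responsibilities describe (the endpoint, table, and field names are hypothetical, not PureFacts systems), a minimal pipeline step might look like:

```python
import json
import sqlite3
import urllib.request  # stdlib stand-in for the `requests` library


def extract(url: str, api_key: str) -> list[dict]:
    """Fetch JSON records from a (hypothetical) REST endpoint using an API key."""
    req = urllib.request.Request(url, headers={"Authorization": f"Bearer {api_key}"})
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


def load(records: list[dict], conn: sqlite3.Connection) -> None:
    """Apply light cleaning and upsert records into a destination table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS accounts (id INTEGER PRIMARY KEY, name TEXT)"
    )
    rows = [(r["id"], r["name"].strip()) for r in records]  # normalize whitespace
    conn.executemany(
        "INSERT OR REPLACE INTO accounts (id, name) VALUES (?, ?)", rows
    )
    conn.commit()
```

A production pipeline would add retry/backoff, pagination, logging, and schema migrations on top of this; the sketch only shows the shape of the work.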

Qualifications

  • Programming: Proficiency in Python (including libraries such as requests and pandas, plus the standard-library json module).
  • SQL: Strong ability to write and optimize SQL queries for data insertion and retrieval.
  • API Literacy: Understanding of HTTP methods (GET, POST), status codes, and JSON/XML data formats.
  • Databases: Experience with at least one relational database (e.g., PostgreSQL, MySQL).
  • Cloud Platforms: Familiarity with cloud environments, with a strong preference for Azure.
  • Version Control: Basic familiarity with Git for collaborative development.
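The SQL skills listed above boil down to parameterized insertion and indexed retrieval. A small self-contained sketch (table and column names are illustrative only, using SQLite for portability):

```python
import sqlite3

# In-memory database so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")

# Placeholders (?) keep insertion safe from SQL injection.
conn.executemany(
    "INSERT INTO trades (account, amount) VALUES (?, ?)",
    [("ACC-1", 100.0), ("ACC-1", 250.5), ("ACC-2", 75.0)],
)

# An index on the filter column is the simplest retrieval optimization.
conn.execute("CREATE INDEX idx_trades_account ON trades (account)")

total = conn.execute(
    "SELECT SUM(amount) FROM trades WHERE account = ?", ("ACC-1",)
).fetchone()[0]
print(total)  # 350.5
```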


Bonus & Preferred Skills

  • Modern Data Stack: Experience or familiarity with Snowflake as a cloud data warehouse.
  • Orchestration: Understanding of basic scheduling concepts (e.g., cron jobs, Airflow).
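At its core, cron-style scheduling means computing the next run time and firing the job then. A stdlib-only sketch of that calculation (illustrative, not a replacement for cron or Airflow), for an hourly schedule like the cron expression `0 * * * *`:

```python
from datetime import datetime, timedelta


def next_run(now: datetime, minute: int = 0) -> datetime:
    """Return the next top-of-hour run time at the given minute,
    mimicking the cron expression '0 * * * *'."""
    candidate = now.replace(minute=minute, second=0, microsecond=0)
    if candidate <= now:
        candidate += timedelta(hours=1)  # this hour's slot has passed; run next hour
    return candidate
```

A scheduler loop would sleep until `next_run(...)`, execute the pipeline, and repeat; tools like Airflow add dependency graphs, retries, and monitoring on top of the same idea.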


Candidate Attributes

  • Logically Driven: A first-principles approach to solving technical bottlenecks.
  • Detail Oriented: High attention to data accuracy and edge-case handling in code.
  • Communicative: Ability to document technical processes and explain pipeline logic clearly.


The pay range for this role is:
25–30 CAD per hour (Toronto, Canada)

