Data Engineer in Bangalore at Trellance LLC
Job Description
Job Title: Data Engineer (SSIS)
Department: ProBridge
Reporting to: Manager
Company Detail
CU Rise Analytics Pvt. Ltd. is a wholly owned subsidiary of Trellance, Inc. CU Rise Analytics is an offshore development centre of Trellance Inc., working collaboratively to offer comprehensive solutions for data analytics, technology and talent to allow our credit union clients to provide high quality service to their members and remain competitive. Our core expertise lies in data science and technology, encompassing data analytics, predictive modelling, business intelligence, and technology services.
Trellance Cooperative Holdings, Inc. is a credit union cooperative and leading technology partner for credit unions. Its companies – consisting of Rise Analytics, ProBridge, Optiri and CUDX – provide innovative technology solutions for credit unions to increase efficiency, manage risk and improve the member experience. Trellance’s mission is to make sure credit unions have access to the tools and resources they need to grow, enhance member value and remain competitive in a rapidly evolving financial landscape.
Learn more at Trellance.com
Experience Required: 3 to 5 years
Job Description:
We are seeking a skilled Data Engineer with expertise in SSIS and SQL to join our team. As a Data Engineer, you will play a critical role in our client’s work automating data imports and exports and migrating from on-premises systems to a cloud-based data platform. You will be responsible for building robust and efficient SSIS packages, transforming raw data, and ensuring smooth data consumption. Experience with Snowflake is a plus.
Responsibilities:
- Collaborate with cross-functional teams to understand data requirements and design performant, efficient, and secure data pipelines from the landing zone to the consumption level in SQL server and SSIS.
- Utilize native SSIS capabilities to implement data transformations, aggregations, and enrichments to meet business needs.
- Develop and maintain data ingestion processes from various on-premises sources to the landing zone, ensuring data integrity and accuracy throughout the pipeline.
- Work closely with the architecture team to ensure seamless integration of data sources.
- Optimize SSIS package performance, scalability, and reliability, implementing best practices and monitoring mechanisms.
- Collaborate with data analysts and other stakeholders to understand data consumption requirements and provide solutions for efficient data retrieval.
- Ensure compliance with data governance policies and industry standards throughout the data engineering process.
- Continuously stay updated with the latest advancements in Snowflake and data engineering techniques, sharing knowledge and insights with the team.
Skills & Qualifications:
Must-Have:
- Bachelor’s degree in Computer Science, Engineering, or a related field (or equivalent practical experience).
- Proven experience as a Data Engineer with a focus on SQL Server, including hands-on experience in building end-to-end data pipelines.
- Strong understanding of SQL Server architecture, data warehousing concepts, and ETL/ELT methodologies.
- Expert knowledge of SQL (T-SQL specifically for SQL Server), including writing complex queries, stored procedures, and performance optimization techniques.
- Hands-on experience with Microsoft SQL Server Integration Services (SSIS) is essential, along with general knowledge of other ETL tools.
- Familiarity with cloud platforms (e.g., AWS, Azure) and their data services.
- Solid understanding of data modeling principles and ability to design efficient data structures within Snowflake.
- Strong problem-solving skills and ability to analyze complex data requirements to develop effective solutions.
- Excellent communication and collaboration skills, with the ability to work effectively in cross-functional teams.
- Proactive and self-motivated with a passion for data engineering and delivering high-quality solutions.
Preferred Qualifications:
- Expertise and experience using Snowflake procedures, functions, streams, tasks, external stages, and external tables.
- Expertise and experience using Snowpipe.
- Experience with Snowpark, especially with Python.
- Experience with version control systems (e.g., Git) and CI/CD pipelines.
- Knowledge of Python or other scripting languages for data processing and automation.
- Expertise in Banking Data/Domain.
- Familiarity with data governance frameworks and practices.
- Experience with data visualization tools (e.g., Tableau, Power BI) for reporting and analytics.