Senior Developer – PS Innovations in India at Jobgether
Job Description
This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Senior Developer – PS Innovations in India.
Join a high-impact engineering environment focused on building next-generation cloud data ecosystems and intelligent automation solutions. In this role, you will design scalable data architectures, optimize modern Snowflake environments, and create advanced ETL/ELT pipelines that support real-time business intelligence and decision-making. You will collaborate closely with cross-functional stakeholders to transform complex business requirements into resilient and automated technical solutions. The position offers the opportunity to work with cutting-edge technologies, including AI-driven workflows, cloud-native tools, and advanced analytics platforms. This is an ideal opportunity for an experienced developer who thrives in fast-paced environments, enjoys solving large-scale data challenges, and wants to contribute to innovative enterprise-grade initiatives with global reach.
Responsibilities:
- Design, develop, and maintain scalable ETL/ELT workflows using Python, SQL, and Node.js to support enterprise data operations.
- Optimize Snowflake data warehouse environments, including performance tuning, clustering strategies, stored procedures, and warehouse cost management.
- Build resilient and self-healing data pipelines with audit logging, monitoring, and automated validation mechanisms to ensure data integrity and availability.
- Develop high-volume API integrations and complex JSON processing pipelines for structured and unstructured data ingestion.
- Collaborate with technical leads, business stakeholders, and cross-functional teams to define data models and scalable architectural solutions.
- Implement DevOps and DataOps best practices, including CI/CD pipelines, Git-based workflows, and infrastructure automation.
- Contribute to innovation initiatives involving Generative AI, metadata automation, and intelligent data orchestration capabilities.
- Troubleshoot performance bottlenecks, pipeline failures, and operational inefficiencies, and resolve their root causes through long-term architectural improvements.
Requirements:
- 5+ years of professional experience in data engineering with strong expertise in ETL/ELT development and cloud-based data ecosystems.
- Advanced proficiency in SQL and extensive hands-on experience with Snowflake technologies, including Snowpark, Cortex, Tasks, and Streams.
- Strong programming skills in Python, SQL, and Node.js, with experience building scalable and maintainable data applications.
- Proven expertise in designing automated data pipelines, handling high-volume integrations, and processing complex datasets.
- Experience with DevOps/DataOps practices, CI/CD implementation, Git version control, and infrastructure-as-code methodologies.
- Solid understanding of cloud architecture, data modeling, warehouse optimization, and performance tuning strategies.
- Analytical mindset with strong troubleshooting and problem-solving capabilities in distributed and high-availability environments.
- Excellent communication skills with the ability to explain technical concepts clearly to both technical and non-technical audiences.
- Experience with Streamlit, Generative AI, LLM integrations, or contact center analytics solutions is considered a strong advantage.
Benefits:
- Flexible and remote-friendly work environment.
- Opportunity to work on cutting-edge cloud, AI, and data engineering initiatives.
- Exposure to large-scale global enterprise projects and modern technology stacks.
- Collaborative and innovation-driven engineering culture.
- Career growth opportunities in a fast-evolving international organization.
- Access to learning resources, technical development programs, and mentorship opportunities.
- Inclusive and diverse workplace focused on ownership, collaboration, and professional impact.
- Competitive compensation and comprehensive employee benefits package.