Remote Data Engineer Expert at Jobgether – Denmark, Western Australia
Jobgether
Denmark, Western Australia, 6333, Australia
Job Function: Information Technology
About This Position
Remote Data Engineer Expert
This position is posted by Jobgether on behalf of a partner company. We are currently looking for a Data Engineer - REMOTE. In this role, you will design and implement scalable data solutions that empower the business to make data-driven decisions. You will play a crucial role in developing data pipelines and working with cutting-edge technologies on cloud platforms. This position offers an exciting opportunity to contribute to our client's operational improvement by leveraging comprehensive data insights that enhance customer experiences. Collaborating with various teams, you will ensure high-quality data management practices are upheld, ultimately driving impactful results for the organization. Join us to be part of a dynamic team focused on innovation and customer satisfaction.
Accountabilities
- Design, build, and optimize ETL/ELT workflows using Databricks, SQL, and Python/PySpark.
- Develop and maintain robust, scalable, and efficient data pipelines for processing large datasets.
- Work on cloud platforms (Azure, AWS) to build and manage data lakes and scalable architectures.
- Utilize cloud services like Azure Data Factory and AWS Glue for data processing.
- Use Databricks for big data processing and analytics.
- Leverage Apache Spark for distributed computing and data transformations.
- Create and manage SQL-based data solutions ensuring scalability and performance.
- Develop and enforce data quality checks and validations.
- Collaborate with cross-functional teams to deliver impactful data solutions.
- Leverage CI/CD pipelines to streamline development and deployment of workflows.
- Maintain clear documentation for data workflows and optimize data systems.
Requirements
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- 3-6 years of experience in Data Engineering or related roles.
- Hands-on experience with big data processing frameworks and data lakes.
- Proficiency in Python, SQL, and PySpark for data manipulation.
- Experience with Databricks and Apache Spark.
- Knowledge of cloud platforms like Azure and AWS.
- Familiarity with ETL tools (Alteryx is a plus).
- Strong understanding of distributed systems and big data technologies.
- Basic understanding of DevOps principles and CI/CD pipelines.
- Hands-on experience with Git, Jenkins, or Azure DevOps.
Benefits
- Flexible remote working conditions.
- Opportunities for professional growth and training.
- Collaborative and inclusive company culture.
- Access to modern technologies and tools.
- Health and wellness benefits.
- Work-life balance.
- Participation in innovative projects.
- Dynamic and fast-paced working environment.
Job Location
Denmark, Western Australia, 6333, Australia