Data Engineer – Databricks

Company Description

DataAIT Technologies is an Australian company based in Melbourne, VIC, dedicated to providing Data & AI services. Specialising in cutting-edge technologies such as data analytics, cloud, AI, and automation, we support businesses in their data-driven transformation. Our approach is a flexible “Service-as-a-Service” model under which we collaborate with clients to innovate and deliver data initiatives. We also promote gender diversity through initiatives that support women in technology and through IT skills training programs.

Job Title: Data Engineer – Databricks Expert

Location: Melbourne / Hybrid

Type: Full-Time / Contract

Role Description

This is a full-time hybrid role for a Databricks Data Engineer at DataAIT Technologies in Melbourne, VIC. The Data Engineer will be responsible for data engineering, data modelling, ETL processes, data warehousing, and data analytics in support of business transformation initiatives driven by data ecosystems. In this role, you will design, build, and optimise data pipelines and platforms, leveraging Databricks’ capabilities to deliver impactful solutions to our clients.

Key Responsibilities:
• Design and implement scalable data pipelines using Databricks and Apache Spark (see the illustrative sketch after this list).
• Collaborate with data scientists, analysts, and stakeholders to deliver end-to-end data solutions.
• Optimise data architecture for performance, reliability, and scalability.
• Integrate data from multiple sources, including cloud platforms (Azure, AWS, GCP).
• Monitor and maintain data pipeline health, ensuring data quality and governance.
• Provide expertise in Data Lakehouse architecture and Big Data processing.
• Stay up to date with the latest Databricks features and best practices.
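To give a purely illustrative flavour of the pipeline work described above, here is a minimal PySpark sketch of a bronze-to-silver batch job of the kind typically run on Databricks. All paths, column names, and table names (e.g. /mnt/bronze/orders, silver.orders) are hypothetical placeholders, not DataAIT or client assets.

```python
from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession already exists; getOrCreate() simply reuses it.
spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

# Read raw JSON files landed in cloud storage (hypothetical path).
raw = spark.read.json("/mnt/bronze/orders/")

# Basic cleansing: deduplicate, enforce types, and drop invalid rows.
silver = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
)

# Persist as a Delta table so downstream analytics and BI tools can query it.
silver.write.format("delta").mode("overwrite").saveAsTable("silver.orders")
```

Writing to a Delta table here reflects the Lakehouse focus of the role: Delta provides ACID transactions and schema enforcement on top of cloud object storage.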

Qualifications
• Bachelor’s degree in Computer Science, Data Engineering, or a related field.
• 5+ years of experience in data engineering, with a focus on Databricks and Spark.
• Expertise in designing and implementing ETL pipelines and data transformation workflows.
• Strong data modelling, data warehousing, and data analytics skills.
• Proficiency in Python or Scala for data engineering, plus strong SQL.
• Experience with cloud technologies such as AWS or Azure.
• Experience working with data lakes and data warehouses.
• Solid understanding of data governance and security best practices.
• Excellent communication, analytical, and problem-solving skills.

Preferred Qualifications:
• Experience with MLflow, Delta Lake, and dbt (see the brief example after this list).
• Knowledge of GenAI and AI/ML models on Databricks.
• Familiarity with Snowflake and Fivetran integration.
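As a small, hypothetical illustration of the MLflow experience mentioned above, the snippet below logs parameters and a metric for a training run on Databricks; the run name, parameters, and metric values are invented for the example.

```python
import mlflow

# Hypothetical experiment-tracking snippet; all values are placeholders.
with mlflow.start_run(run_name="demo-churn-model"):
    mlflow.log_param("model_type", "gradient_boosted_trees")
    mlflow.log_param("max_depth", 6)
    mlflow.log_metric("rmse", 0.42)
```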

What We Offer:
• Opportunity to work with cutting-edge technologies and industry-leading clients.
• Flexible working environment with remote options.
• Continuous learning and development opportunities in the Data & AI space.
• Competitive remuneration package and potential for onsite opportunities.

If you’re passionate about Data Engineering and want to make an impact in the world of Data & AI, we’d love to hear from you!
