Job Description:
We are seeking a talented and experienced Senior Data Engineer to join our dynamic data team and help drive our data strategy and architecture forward.
The Senior Data Engineer will be responsible for designing, developing, and maintaining scalable data pipelines and data architectures. This role involves working closely with data analytics engineers, analysts, and development teams to ensure that data systems are optimised for performance, reliability, and accuracy.
The ideal candidate will have a strong background in data engineering, proficiency in various data technologies, and a passion for solving complex data challenges.
Roles and Responsibilities:
Design, build, and maintain scalable data pipelines and data processing systems, both streaming and batch, to support increasing data volume and complexity. Monitor production workload metrics and optimise pipelines for performance and cost.
Work with analytics engineers, analysts, developers, and business teams to gather design requirements, identify improvements, and implement software.
Implement data governance policies and procedures to ensure data accuracy, completeness, PII masking, and security.
Identify and resolve data quality issues and data-related problems.
Proactively discover and learn new frameworks and technologies to improve team efficiency, automate manual processes, and re-design infrastructure for greater scalability and optimised data delivery.
Improve the self-sufficiency of decentralised teams with respect to data, while ensuring data masking and security.
Define and manage IAM policies governing data access across teams.
Align with Agile principles and adhere to sprint methodology.
Hire and mentor team members as needed, and contribute to the team's growth and development.
Qualifications and Skills
2+ years of experience in the data engineering domain
Strong knowledge of data platforms and architecture, and experience with cloud platforms
Expertise in data warehousing, preferably with experience in BigQuery/Redshift
Strong experience with SQL and programming languages like Python
Experience with ETL and orchestration tools like Airflow, Fivetran, Meltano, dbt, etc.
Experience in data visualisation tools like Tableau, Looker, Quicksight, Metabase
Knowledge of industry practices in data security
Up-to-date knowledge of data engineering tools, and the ability to explore new frameworks and integrate them with existing architecture
Troubleshooting and debugging skills
Critical-thinking ability, with a systematic, logical, and solution-seeking approach to problem-solving
Ability to prioritise and deliver outcomes from each sprint
Good communication and leadership skills
A good eye for accuracy and strong attention to detail