Job Description
- Design, develop, and maintain scalable data pipelines and architectures using Google Cloud Platform (GCP) technologies, including Dataflow, BigQuery, Pub/Sub, and Cloud Storage.
- Build and manage ETL processes to transform raw and diverse data sources into clean, structured, and analytics-ready datasets.
- Collaborate closely with data scientists and analysts to deliver reliable data solutions that drive insights and informed decision-making.
- Document data pipelines, architecture, and data models to ensure transparency, consistency, and ease of maintenance.
- Diagnose and resolve data-related issues promptly to minimize downtime and ensure uninterrupted data flow.
- Continuously optimize pipelines for improved performance, scalability, and cost efficiency across environments.
- Automate repetitive workflows and enhance operations using scripts and modern data engineering tools.
- Ensure data quality, integrity, and security throughout the entire data lifecycle and across multiple platforms.
- Stay current with emerging GCP tools, technologies, and best practices to continuously innovate and enhance data infrastructure.
Job Experience and Skills Required
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field
- 5+ years' experience as a Data Engineer
- Strong programming skills in Python and SQL, with experience in BigQuery, GCP, ETL development, and data modeling.
Apply now!
For more IT jobs, please visit www.********.co.za
I also specialise in recruiting in the following:
Software Development
Data Engineering
Data Analytics
Infrastructure
Architecture
… and more!
If you have not had any response in two weeks, please consider the vacancy application unsuccessful. Your profile will be kept on our database for any other suitable roles / positions.
For more information contact:
Monica de Wet
Senior Recruitment Consultant
www.********.co.za