Job Description
ESSENTIAL SKILLS:
- Expertise with Apache Spark (PySpark), Databricks notebooks, Delta Lake, and SQL
- Strong programming skills in Python for data processing
- Experience with cloud data platforms (Azure) and their Databricks offerings; familiarity with object storage (ADLS)
- Proficient in building and maintaining ETL/ELT pipelines, data modeling, and performance optimization
- Knowledge of data governance, data quality, and data lineage concepts
- Experience with CI/CD for data pipelines and orchestration tools (GitHub Actions, Databricks Asset Bundles, or Databricks Jobs)
- Strong problem-solving skills, attention to detail, and ability to work in a collaborative, cross-functional team
ADVANTAGEOUS SKILLS:
- Experience with streaming data (Structured Streaming, Kafka, Delta Live Tables).
- Familiarity with materialized views, streaming tables, data catalogs, and metadata management.
- Knowledge of data visualization and BI tools (Splunk, Power BI, Grafana).
- Experience with data security frameworks and compliance standards relevant to the industry.
- Certifications in Databricks or cloud provider platforms.
QUALIFICATIONS/EXPERIENCE:
Bachelor’s or Master’s degree in Computer Science, Data Engineering, Information Systems, or a related field.
3+ years of hands-on data engineering experience.
Key Responsibilities:
- Design, develop, test, and maintain robust data pipelines and ETL/ELT processes on Databricks (Delta Lake, Spark, SQL, and Python/Scala notebooks)
- Architect scalable data models and data vault/dimensional schemas to support reporting, BI, and advanced analytics
- Implement data quality, lineage, and governance practices; monitor data quality metrics and resolve data issues proactively
- Collaborate with Data Platform Engineers to optimize cluster configuration, performance tuning, and cost management in cloud environments (Azure Databricks)
- Build and maintain data ingestion from multiple sources (RDBMS, SaaS apps, files, streaming queues) using modern data engineering patterns (CDC, event-driven pipelines, change streams, Lakeflow Declarative Pipelines)
- Ensure data security and compliance (encryption, access controls) in all data pipelines
- Develop and maintain CI/CD pipelines for data workflows; implement versioning, testing, and automated deployments
GO APPLY NOW
How to Apply
Click “GO APPLY NOW” to visit the company’s application page.
Follow their instructions carefully.
JVR Jobs connects you with employers – we don’t process applications directly.