Job Description
- Design, build, and maintain ELT pipelines
- Pipeline success rate (% jobs completed without failure)
- Data latency (extraction to availability)
- Error resolution turnaround time
- Integrate data across systems (WMS, TMS, ERP, IoT)
- Number of successful system integrations
- % data completeness and consistency across sources
- Average time to onboard new data source
- Data Lake / Warehouse Management
- System uptime and availability
- Query performance (execution speed)
- Storage usage vs capacity
- Structured and clean data sets
- Data quality score (accuracy, completeness, validity)
- Number of reported data issues
- Resolution turnaround time
- Collaboration with BI and Data Science teams
- Time to deliver datasets for reports and models
- Internal stakeholder satisfaction rating
- Support requests resolved within SLA
- Documentation and data cataloguing
- % of pipelines with up-to-date documentation
- Metadata completeness
- User ease of data discovery
- Security and compliance support
- % compliance with access control policies
- Number of unauthorised access incidents
- Audit readiness and completion rate
- Process automation
- Manual hours reduced via automation
- Number of recurring tasks automated
- Stability of automated workflows
- Issue analysis and root cause investigations
- Time to issue resolution (from investigation to recommendation)
- Number of root causes correctly identified
- Reduction in repeated issues
- Advanced SQL development, optimisation and data engineering
- Query performance improvements (execution speed and efficiency)
- SQL based data validation, transformation and cleansing coverage
- Reusable SQL pipelines for recurring logistics workflows (inventory, shipments, routing)
- SQL based reconciliation across multiple systems
- Version controlled, documented SQL scripts aligned with governance standards
- Reduction in errors or rework due to SQL inefficiencies
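As a hypothetical illustration of the SQL-based reconciliation KPI above, the sketch below compares shipment records between a WMS extract and a TMS extract and flags discrepancies. All table, column and shipment names are invented for the example, and Python's built-in sqlite3 is used only to keep the demo self-contained:

```python
import sqlite3

# Invented example data: the same shipments as seen by two systems.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE wms_shipments (shipment_id TEXT PRIMARY KEY, qty INTEGER);
CREATE TABLE tms_shipments (shipment_id TEXT PRIMARY KEY, qty INTEGER);
INSERT INTO wms_shipments VALUES ('S1', 10), ('S2', 5), ('S3', 7);
INSERT INTO tms_shipments VALUES ('S1', 10), ('S2', 4);
""")

# Reconciliation: rows where the quantities differ, or where a shipment
# exists in one system but not the other. A full outer join is emulated
# with two LEFT JOINs (older SQLite versions lack FULL JOIN).
query = """
SELECT w.shipment_id, t.shipment_id, w.qty, t.qty
FROM wms_shipments w LEFT JOIN tms_shipments t
  ON w.shipment_id = t.shipment_id
WHERE t.shipment_id IS NULL OR w.qty != t.qty
UNION ALL
SELECT w.shipment_id, t.shipment_id, w.qty, t.qty
FROM tms_shipments t LEFT JOIN wms_shipments w
  ON w.shipment_id = t.shipment_id
WHERE w.shipment_id IS NULL
"""
discrepancies = con.execute(query).fetchall()
for row in discrepancies:
    print(row)  # each row is a mismatch to investigate
```

In this toy data, shipment S2 has a quantity mismatch and S3 is missing from the TMS side, so two discrepancy rows are reported.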
JOB SPECIFIC REQUIREMENTS
- Minimum Requirements (Experience and Qualifications)
- Bachelor’s or Master’s degree in Computer Science or a related field
- Microsoft Certified Azure Data Engineer Associate
- Google Professional Data Engineer
- AWS Certified Data Analytics
- Certifications in BI or analytics tools such as Power BI, Tableau or SQL
- 3 to 5 years’ experience in data engineering, preferably in the logistics or supply chain sector
- Required Knowledge
- Development of ELT and ETL pipelines using tools such as Apache Airflow, SSIS or Azure Data Factory
- Data integration across logistics systems including ERP, WMS, TMS and IoT
- Data modelling, schema design and SQL optimisation
- Data warehousing concepts including star and snowflake schemas
- Version control and CI/CD pipelines for data products
- Supply chain and logistics data structures and flows
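As an illustrative sketch of the star-schema concept listed above, the following builds a minimal logistics star schema (one fact table joined to two dimension tables; all names are invented) and runs a typical dimensional aggregation, again using Python's built-in sqlite3 for a self-contained demo:

```python
import sqlite3

# Minimal star schema: fact_shipments at the centre, dimensions around it.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE dim_carrier (carrier_key INTEGER PRIMARY KEY, carrier_name TEXT);
CREATE TABLE fact_shipments (
    shipment_id TEXT PRIMARY KEY,
    date_key INTEGER REFERENCES dim_date(date_key),
    carrier_key INTEGER REFERENCES dim_carrier(carrier_key),
    pallets INTEGER
);
INSERT INTO dim_date VALUES (20240101, '2024-01-01'), (20240102, '2024-01-02');
INSERT INTO dim_carrier VALUES (1, 'RoadFreight Co'), (2, 'AirCargo Ltd');
INSERT INTO fact_shipments VALUES
    ('S1', 20240101, 1, 12), ('S2', 20240101, 2, 3), ('S3', 20240102, 1, 8);
""")

# Typical star-schema query: aggregate the fact table grouped by a dimension.
rows = con.execute("""
SELECT c.carrier_name, SUM(f.pallets)
FROM fact_shipments f JOIN dim_carrier c ON f.carrier_key = c.carrier_key
GROUP BY c.carrier_name
ORDER BY c.carrier_name
""").fetchall()
print(rows)
```

A snowflake schema would differ only in that the dimension tables are further normalised into sub-dimensions.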
- Required Skills
- Advanced proficiency in SQL and Python or another scripting language
- Strong debugging, problem-solving and performance-tuning skills
- Data validation, cleansing and transformation techniques
- Building scalable and reusable data pipelines
- Working knowledge of cloud based data platforms such as Azure, AWS or GCP
- Communication skills
- Required Competencies
- Ability to work under pressure in an agile environment
- Time management
- Collaboration
- Problem solving
- Attention to detail
- Analytical thinking
ADDITIONAL NOTES OF IMPORTANCE
- Operate in a safe manner, complying with all Health, Safety, Quality and Environmental requirements to ensure own safety and the safety of others
- Compliance with Good Distribution Practice and Good Documentation Practice guidelines, as per the South African Health Products Regulatory Authority (SAHPRA)
- All handling of Pharmaceutical Goods and Medical Devices must be in accordance with the operational requirements of the Business and the regulatory requirements of the relevant statutory bodies in South Africa, as per the Medicines and Related Substances Act 101 of 1965, as amended
DEFINITION
- Designing, constructing and maintaining data infrastructure
- Functional Operations
- Refers to the day-to-day functional delivery requirements of the role
- Check overnight ETL jobs and resolve any pipeline errors
- Review logs from a new IoT tracking feed integration
- Optimise SQL queries for delivery performance dashboard
- Deploy updated pipeline to automate inbound shipment data
- Document new data model changes for inventory tracking
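The overnight ETL check described above could, for example, be sketched as a simple health-check script that computes the pipeline success rate KPI and flags failures for follow-up. The job names, statuses and record shape here are invented for illustration:

```python
# Invented example: one record per overnight job run, as a scheduler
# or orchestration log might expose them.
runs = [
    {"job": "load_wms_inventory", "status": "success", "minutes": 12},
    {"job": "load_tms_shipments", "status": "success", "minutes": 9},
    {"job": "load_iot_tracking",  "status": "failed",  "minutes": 3},
]

# Pipeline success rate KPI: % of jobs completed without failure.
succeeded = [r for r in runs if r["status"] == "success"]
success_rate = 100 * len(succeeded) / len(runs)

# Failed jobs feed the error-resolution turnaround KPI.
failures = [r["job"] for r in runs if r["status"] == "failed"]

print(f"Pipeline success rate: {success_rate:.1f}%")
for job in failures:
    print(f"Investigate failed job: {job}")
```

In practice the run records would come from the orchestrator (e.g. Airflow or Azure Data Factory run history) rather than a hard-coded list.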