Build, enhance, and maintain our real-time data pipeline. Work with various infrastructure and operations teams to maintain our data infrastructure. As a senior engineer on the Data Streaming Platform team, you will develop and maintain the data ingestion pipelines vital to the continued growth of the bank. You will collaborate with teams across the bank to understand data needs and turn them into platforms and services, monitor and maintain the health of the data streams, and shape the data ecosystem within Square. You will develop data products and data warehouse solutions in on-premises and cloud environments using cloud-based services, platforms, and technologies.

The ideal candidate is a dynamic, results-driven Lead Stream Data Engineer with extensive experience designing, developing, and deploying high-performance, reusable streaming applications using Apache Kafka, Apache Flink, and Java; proven expertise in building scalable data pipelines and real-time processing systems that enhance operational efficiency and drive business insights; and a strong background in microservices architecture, cloud technologies, and agile methodologies.

Qualifications
- Bachelor's Degree in Computer Science or a similar field (e.g. Information Systems, Big Data)
- AWS Data Engineer certification would be advantageous
- Related technical certifications

Experience
- Experience developing solutions in the cloud
- At least 5 years' experience designing and developing streaming data pipelines for data ingestion or transformation using AWS technologies
- Experience with distributed log systems such as Kafka and AWS Kinesis; Redpanda or Confluent experience advantageous
- Experience developing data warehouses and data marts
- Experience with Data Vault and dimensional modelling techniques
- Experience working in a high-availability DataOps environment
- Proficiency in AWS data engineering services such as AWS Glue, Athena, and EMR
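To illustrate the distributed-log model the role calls for, here is a minimal, purely illustrative Python sketch of an in-memory partitioned log with offset-based consumption. It mirrors the core idea behind systems like Kafka and Kinesis (key-based partitioning, immutable records, consumers replaying from an offset) but is not real client code; all names here are hypothetical.

```python
class PartitionedLog:
    """Hypothetical in-memory stand-in for a distributed log (Kafka/Kinesis-style)."""

    def __init__(self, partitions: int = 3):
        self.partitions = [[] for _ in range(partitions)]

    def produce(self, key: str, value: str) -> int:
        # As in Kafka, the record key determines the partition, so records
        # with the same key keep their relative order.
        p = hash(key) % len(self.partitions)
        self.partitions[p].append((key, value))
        return p

    def consume(self, partition: int, offset: int):
        # Consumers pull from an explicit offset; the log itself is append-only,
        # so the same range can be replayed at any time.
        return self.partitions[partition][offset:]


log = PartitionedLog(partitions=2)
p = log.produce("account-42", "deposit:100")
log.produce("account-42", "withdraw:30")
records = log.consume(p, 0)  # replay the partition from offset 0
```

Because both records share the key "account-42", they land in the same partition and are consumed in production order, which is the ordering guarantee real distributed logs provide per partition.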
- Strong programming skills in languages such as Python and Java
- Experience implementing CI/CD pipelines; GitHub
- Design and implementation of scalable streaming architectures using technologies such as Apache Kafka, Apache Flink, or AWS Kinesis to handle real-time data ingestion and processing

Programming Languages: Proficient in Java (Java SE 8/11), with a solid understanding of object-oriented programming principles, and in Python 3.
Streaming Technologies: Extensive experience with Apache Kafka, including the Kafka Streams API for real-time data processing, producer/consumer development, and stream management; expertise in Apache Flink.
Desirable skills: Decodable, Kubernetes (K8s), Data Vault, data warehouses and modelling.
Frameworks: Deep knowledge of Spring Boot for building RESTful services and microservices architectures; adept at using Spring Cloud for distributed systems.
Database Management: Skilled in integrating various databases (e.g. NoSQL) with streaming applications to ensure efficient data storage and retrieval.
Cloud Platforms: Hands-on experience deploying applications on AWS, using services such as EC2, S3, RDS, and Lambda for serverless architectures.

Operational Delivery
- Build, enhance, and maintain our real-time data pipeline.
- Work with various infrastructure and operations teams to maintain our data infrastructure.
- Be self-driven in identifying and documenting feature gaps, and in designing and implementing solutions for them.
- Help build, modernize, and maintain services and tooling to ensure resiliency, fix data discrepancies, and enhance the customer experience.
- Monitor daily execution, diagnose and log issues, and fix pipelines to ensure SLAs are met with internal stakeholders.
- Mentor other engineers and help them grow through code reviews and guidance on best practices, leveraging your experience in the field.

Technical Leadership
- Participate in the engineering and other disciplines' communities of practice.
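The real-time aggregations mentioned above (Kafka Streams, Flink) typically group a keyed event stream into time windows. The sketch below shows the idea behind a tumbling-window count in plain Python; the event data and the 60-second window size are hypothetical, and real engines would additionally handle event time, watermarks, and state stores.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_size=60):
    """Count events per (window_start, key), the way a Kafka Streams or
    Flink tumbling-window aggregation would bucket a keyed stream.

    events: iterable of (timestamp_seconds, key) pairs.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_size)  # align timestamp to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)


events = [(5, "login"), (42, "login"), (65, "login"), (70, "purchase")]
result = tumbling_window_counts(events, window_size=60)
# Window [0, 60) holds two "login" events; window [60, 120) holds one
# "login" and one "purchase".
```

Tumbling windows are non-overlapping, so each event falls into exactly one bucket; sliding or hopping windows, by contrast, would assign an event to several overlapping windows.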
- Share AWS knowledge and practical experience with the community.
- Challenge and contribute to the development of architectural principles and patterns.
- Provide technical leadership and mentorship.
- Develop and monitor data engineering standards and principles.
- Lead technical delivery within teams and provide oversight of solutions.

Compliance
- Ensure solutions adhere to the company's patterns, guidelines, and standards.
- Operate within project environments and participate in continuous improvement efforts.

Delivery Management
- Follow and participate in defined ways of work including, but not limited to, sprint planning, backlog grooming, retrospectives, demos, and PI planning.
Data Streaming Engineer (Jhb/CT) position available in Gauteng, Johannesburg. The job was posted on 2025-03-14 in the IT Computing Software category.