Unifi Africa
Unifi Africa is a microfinance group that provides digital credit solutions in several Sub-Saharan African countries. Its head office is in Stellenbosch, South Africa.
About the role
- Unifi is on the lookout for a talented Data Engineer with strong expertise in Google Cloud Platform (GCP) to join our fast-growing team. In this role, you’ll design, build, and maintain scalable data pipelines and architectures that power our business.
- You’ll collaborate closely with data scientists and analysts to ensure seamless data flow across the organisation, enabling smarter decisions and impactful solutions.
- We’re looking for someone who is analytically sharp, self-motivated, and thrives in an unstructured environment. A genuine passion for African business is a must, along with a healthy sense of adventure and a good sense of humour to match our dynamic culture.
Responsibilities
- Design and build scalable data pipelines and architectures using GCP technologies such as Dataflow, BigQuery, Pub/Sub, and Cloud Storage (see the first sketch after this list).
- Develop and manage ETL processes to transform diverse data sources into clean, structured formats for analysis and reporting.
- Partner with data scientists and analysts to understand their needs and deliver solutions that enable insights and decision-making.
- Create and maintain documentation for data pipelines, architecture, and data models to ensure clarity and consistency.
- Troubleshoot and resolve data-related issues quickly to minimise disruption.
- Continuously optimise data pipelines for performance, scalability, and cost efficiency.
- Automate workflows and processes through scripts and tools that streamline operations (see the second sketch after this list).
- Safeguard data quality and integrity across all sources, pipelines, and platforms.
- Stay ahead of the curve by keeping up with new GCP tools, best practices, and data engineering trends.
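To give a concrete flavour of the pipeline and ETL work described above, here is a minimal sketch of a streaming pipeline using the Apache Beam Python SDK (the programming model Dataflow runs): it reads loan events from Pub/Sub, cleans them, and streams them into BigQuery. The project, subscription, table, and field names are hypothetical illustrations, not Unifi's actual stack.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_event(message: bytes) -> dict:
    """Decode a JSON Pub/Sub message into a flat dict ready for BigQuery."""
    event = json.loads(message.decode("utf-8"))
    return {
        "loan_id": event["loan_id"],        # hypothetical field names
        "amount": float(event["amount"]),
        "event_time": event["timestamp"],
    }


def run() -> None:
    # Runs locally by default; on Dataflow, pass --runner=DataflowRunner.
    options = PipelineOptions(streaming=True)
    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadEvents" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/loan-events"  # hypothetical
            )
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                "my-project:lending.loan_events",  # hypothetical table
                schema="loan_id:STRING, amount:FLOAT, event_time:TIMESTAMP",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )


if __name__ == "__main__":
    run()
```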
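And a minimal sketch of the kind of automation and data-quality safeguarding the role calls for, using the google-cloud-bigquery client library: a scheduled script that flags rows failing a basic integrity rule. The table and the rule itself are hypothetical examples.

```python
from google.cloud import bigquery


def count_bad_rows() -> int:
    """Return the number of rows violating a simple quality rule."""
    client = bigquery.Client()
    query = """
        SELECT COUNT(*) AS bad_rows
        FROM `my-project.lending.loan_events`   -- hypothetical table
        WHERE amount IS NULL OR amount <= 0
    """
    result = client.query(query).result()  # blocks until the job finishes
    return next(iter(result)).bad_rows


if __name__ == "__main__":
    bad = count_bad_rows()
    if bad:
        # In practice this might alert an on-call channel or fail a scheduler task.
        raise SystemExit(f"Data quality check failed: {bad} bad rows")
    print("Data quality check passed")
```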
Requirements
- Bachelor’s or Master’s degree in Computer Science, Data Science, or a related field.
- 5+ years’ experience as a Data Engineer or in a similar role.
- Strong programming skills in Python and SQL, with hands-on experience in BigQuery and other GCP services.
- Proven expertise in ETL development and data modelling.
- Familiarity with data lakehouse concepts and techniques.
- Excellent problem-solving, analytical, and critical-thinking skills.
- Strong communication and collaboration abilities.
- Experience with Google Cloud Platform (GCP) technologies, especially BigQuery, with additional exposure to Dataflow, Pub/Sub, and Cloud Storage, is considered highly beneficial.
- A background in financial services would be an added advantage.