Vacancy title: Data Generalist
Date posted: 2026-01-29T12:57:21+00:00
Company: CarePay
Logo: https://cdn.greatkenyanjobs.com/jsjobsdata/data/employer/comp_4507/logo/Carepay.png
Source: https://www.greatkenyanjobs.com/jobs
Employment type: FULL_TIME
Duty station: Nairobi, Nairobi, 00100, Kenya
Industry: Financial Services
Category: Computer & IT, Science & Engineering, Business Operations
Application deadline: 2026-02-06T17:00:00+00:00
Work hours: 8
Background
CarePay is a Kenyan company that administers conditional healthcare payments between funders, patients and healthcare providers. Through our M-TIBA platform, CarePay directs funds from public and private funders directly to patients into a "health wallet" on their mobile phone. The use of these funds is restricted to conditional spending at selected health...
The Data Generalist
The Data Generalist is an individual contributor in the Data team, operating across the full data lifecycle. In a small, highly autonomous team, this role focuses primarily on data engineering and backend engineering, with some involvement in data science and analytics engineering, emphasizing end-to-end delivery, stakeholder impact and continuous learning.
Data Engineering & Backend Engineering
- Maintain and improve the data infrastructure using Snowflake, AWS, Terraform and Kubernetes, always keeping in mind reliability, security and cost
- Orchestrate, refactor and troubleshoot ETL pipelines in Airflow
- Develop FastAPI microservices that expose the data products, from simple reports to advanced rules-based and ML systems
- Monitor data quality alerts and escalate issues to the relevant development teams
- Collaborate with the Head of Data on architectural decisions and technical improvements
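As a rough illustration of the data-quality monitoring mentioned above, the sketch below checks null rates against a threshold and produces alert messages. The table, column names, and the 5% threshold are hypothetical placeholders, not CarePay's actual rules.

```python
# Minimal data-quality check: flag columns whose null rate exceeds a threshold.
# All names and thresholds here are illustrative assumptions.

def null_rate(rows, column):
    """Fraction of rows where `column` is missing or None."""
    if not rows:
        return 0.0
    missing = sum(1 for r in rows if r.get(column) is None)
    return missing / len(rows)

def check_quality(rows, required_columns, max_null_rate=0.05):
    """Return human-readable alerts for columns breaching the null-rate threshold."""
    alerts = []
    for col in required_columns:
        rate = null_rate(rows, col)
        if rate > max_null_rate:
            alerts.append(f"{col}: {rate:.0%} null values exceeds {max_null_rate:.0%} threshold")
    return alerts

# Example: one of four payment records is missing its wallet_id.
payments = [
    {"payment_id": 1, "wallet_id": "w-1", "amount": 500},
    {"payment_id": 2, "wallet_id": None,  "amount": 250},
    {"payment_id": 3, "wallet_id": "w-3", "amount": 120},
    {"payment_id": 4, "wallet_id": "w-4", "amount": 900},
]
alerts = check_quality(payments, ["payment_id", "wallet_id", "amount"])
```

In practice a check like this would run inside an Airflow task and route its alerts to the owning development team.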
Analytics Engineering & Data Modelling
- Build and maintain analytics models using dbt, following best practices such as star schemas and dimensional modelling
- Adapt ETL pipelines to evolving data models
- Ensure analytical datasets are reliable, well-documented, and optimised for self-service BI and downstream consumption
- Occasionally work on ad-hoc analyses and customer-facing dashboards
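The star-schema idea behind the modelling work above can be sketched in a few lines: denormalised records are split into a dimension table and a fact table that references it. Field names are hypothetical; in a real setup this logic would live in dbt models rather than Python.

```python
# Illustrative dimensional modelling: split flat records into a provider
# dimension and a payments fact table (a minimal star schema).

def build_star_schema(records):
    """Split denormalised records into (dimension rows, fact rows)."""
    dim_provider = {}   # provider_id -> dimension row, deduplicated
    fact_payments = []  # one row per payment, referencing the dimension key
    for rec in records:
        pid = rec["provider_id"]
        if pid not in dim_provider:
            dim_provider[pid] = {"provider_id": pid, "provider_name": rec["provider_name"]}
        fact_payments.append({"payment_id": rec["payment_id"],
                              "provider_id": pid,
                              "amount": rec["amount"]})
    return list(dim_provider.values()), fact_payments

flat = [
    {"payment_id": 1, "provider_id": "p1", "provider_name": "Clinic A", "amount": 500},
    {"payment_id": 2, "provider_id": "p1", "provider_name": "Clinic A", "amount": 250},
    {"payment_id": 3, "provider_id": "p2", "provider_name": "Clinic B", "amount": 120},
]
dims, facts = build_star_schema(flat)
```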
Machine Learning & Advanced Analytics
- Design, build, and productionise machine learning models
- Develop user-facing ML and AI products and proofs of concept using tools such as scikit-learn, LLMs, Streamlit and FastAPI
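A hedged sketch of the scikit-learn workflow referenced above: train a small classifier and serialise it so a serving process (for example a FastAPI endpoint) can load it later. The features and labels are synthetic placeholders, not real CarePay data.

```python
# Toy scikit-learn train-and-persist loop; features/labels are invented.
import pickle
from sklearn.linear_model import LogisticRegression

# Hypothetical features: [visits_last_year, wallet_balance]; label 1 = likely to claim.
X = [[1, 100], [2, 80], [8, 400], [9, 500], [0, 50], [7, 450]]
y = [0, 0, 1, 1, 0, 1]

model = LogisticRegression().fit(X, y)
preds = model.predict(X).tolist()

# Serialise the fitted model so a separate serving process can restore it.
blob = pickle.dumps(model)
restored = pickle.loads(blob)
```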
Collaboration & Mentorship
- Actively collaborate with other Data team members through pairing sessions and informal mentorship
- Share best practices across data, backend and ML engineering
- Reach out to other technical leads to clarify backend logic, data semantics, or system behaviour required for data initiatives
Ownership & Stakeholder Interaction
- Lead data projects end-to-end with minimal supervision, from requirements gathering to delivery
- Proactively spot opportunities for improvements in the data infrastructure and ML/AI adoption
- Understand and articulate trade-offs between technical solutions, delivery speed, and stakeholder needs
- Translate complex technical concepts into clear, business-oriented language
- Occasionally present and explain data products, insights, and deliveries directly to stakeholders
Requirements
- 3+ years of experience in a data-related role
- Proficiency in Python, SQL, dbt and Airflow
- Experience building, maintaining, and documenting APIs in Python (preferably FastAPI)
- Familiarity with modern DevOps practices – Cloud (AWS preferred), IaC (Terraform), CI/CD, Kubernetes
- Familiarity with data science/ML concepts
- Strong curiosity and willingness to learn
- Good understanding of the data tooling landscape and the ability to pick the right tool for the job while keeping a pragmatic mindset
- Nice to have: Some experience with LLMs/Agentic AI, Streamlit, Frontend development
Base Salary: Not Disclosed
Work Hours: 8
Experience in Months: 36
Level of Education: Bachelor's degree
Job application procedure
Interested and qualified? Click here to apply