Data Generalist job at CarePay

Vacancy title:
Data Generalist

[Type: Full-time, Industry: Financial Services, Category: Computer & IT, Science & Engineering, Business Operations]

Jobs at:
CarePay

Deadline of this Job:
Friday, February 6 2026

Duty Station:
Nairobi, Kenya

Summary
Date Posted: Thursday, January 29 2026, Base Salary: Not Disclosed

JOB DETAILS:

Background

CarePay is a Kenyan company that administers conditional healthcare payments between funders, patients and healthcare providers. Through our M-TIBA platform, CarePay directs funds from public and private funders directly to patients into a "health wallet" on their mobile phone. The use of these funds is restricted to conditional spending at selected health...

The Data Generalist

The Data Generalist is an individual contributor on the Data team, operating across the full data lifecycle. In a small, highly autonomous team, the role focuses primarily on data engineering and backend engineering, with some involvement in data science and analytics engineering, and emphasises end-to-end delivery, stakeholder impact and continuous learning.

Data Engineering & Backend Engineering

  • Maintain and improve the data infrastructure using Snowflake, AWS, Terraform and Kubernetes, always keeping in mind reliability, security and cost
  • Orchestrate, refactor and troubleshoot ETL pipelines in Airflow
  • Develop FastAPI microservices that expose data products, from simple reports to advanced rules-based and ML systems (see the illustrative sketch after this list)
  • Monitor data quality alerts and escalate any issues that arise to other development teams
  • Collaborate with the Head of Data on architectural decisions and technical improvements
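
As an illustration of the kind of FastAPI microservice mentioned above, here is a minimal sketch of a service exposing a simple report backed by Snowflake. The endpoint path, the claims table, the query and the connection settings are hypothetical placeholders for illustration, not CarePay's actual API.

    # Minimal illustrative FastAPI data-product service; all names, queries and settings are hypothetical.
    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel
    import snowflake.connector  # requires the snowflake-connector-python package

    app = FastAPI(title="Example data-product API")

    class ClaimsSummary(BaseModel):
        provider_id: str
        total_claims: int
        total_amount: float

    @app.get("/reports/claims-summary/{provider_id}", response_model=ClaimsSummary)
    def claims_summary(provider_id: str) -> ClaimsSummary:
        """Return an aggregated claims report for one provider (hypothetical schema)."""
        conn = snowflake.connector.connect(
            account="example_account",   # placeholder credentials; a real service would
            user="example_user",         # read these from a secrets manager
            password="example_password",
            warehouse="ANALYTICS_WH",
            database="ANALYTICS",
        )
        try:
            cur = conn.cursor()
            cur.execute(
                "SELECT COUNT(*), COALESCE(SUM(amount), 0) "
                "FROM claims WHERE provider_id = %s",
                (provider_id,),
            )
            total_claims, total_amount = cur.fetchone()
        finally:
            conn.close()
        if total_claims == 0:
            raise HTTPException(status_code=404, detail="No claims found for this provider")
        return ClaimsSummary(
            provider_id=provider_id,
            total_claims=total_claims,
            total_amount=float(total_amount),
        )

Such a service would typically be containerised and deployed to Kubernetes alongside the rest of the data platform.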

Analytics Engineering & Data Modelling

  • Build and maintain analytics models using dbt, following best practices such as star schemas and dimensional modelling
  • Adapt ETL pipelines to evolving data models (see the orchestration sketch after this list)
  • Ensure analytical datasets are reliable, well-documented, and optimised for self-service BI and downstream consumption
  • Occasionally work on ad-hoc analyses and customer-facing dashboards
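
As a sketch of how ETL orchestration and dbt modelling might fit together, the DAG below uses the Airflow API (version 2.4 or later, which accepts the "schedule" argument) to build and test a hypothetical dbt project on a nightly schedule. The project path, schedule and task names are assumptions for illustration, not CarePay's actual pipeline.

    # Illustrative Airflow DAG orchestrating a dbt project; paths and schedule are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="example_dbt_pipeline",
        start_date=datetime(2026, 1, 1),
        schedule="0 2 * * *",  # nightly at 02:00
        catchup=False,
        tags=["example"],
    ) as dag:
        # Build the dbt models (staging layers plus star-schema marts).
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="cd /opt/dbt/example_project && dbt run",
        )

        # Validate the freshly built models with dbt tests before downstream consumers read them.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="cd /opt/dbt/example_project && dbt test",
        )

        dbt_run >> dbt_test

Keeping transformation logic in dbt and orchestration in Airflow is one common way to let analytics models evolve without rewriting the pipelines around them.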

Machine Learning & Advanced Analytics

  • Design, build, and productionise machine learning models (a minimal training sketch follows this list)
  • Develop user-facing ML and AI products and proofs of concept using tools such as scikit-learn, LLMs, Streamlit and FastAPI
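
As a minimal sketch of that workflow, the snippet below trains a scikit-learn classifier on synthetic data, reports a hold-out metric and persists the fitted model so a FastAPI or Streamlit front end could load and serve it. The data, features, model choice and file name are illustrative assumptions only.

    # Illustrative scikit-learn training script; the data and features are synthetic stand-ins.
    import joblib
    import numpy as np
    from sklearn.ensemble import GradientBoostingClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a real feature table (e.g. claim-level features).
    rng = np.random.default_rng(42)
    X = rng.normal(size=(1000, 5))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=1000) > 0).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42
    )

    model = GradientBoostingClassifier(random_state=42)
    model.fit(X_train, y_train)

    auc = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
    print(f"Hold-out AUC: {auc:.3f}")

    # Persist the fitted model so a serving layer (FastAPI, Streamlit) can load it.
    joblib.dump(model, "example_model.joblib")

In practice the synthetic arrays would be replaced by features drawn from the analytics models described above.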

Collaboration & Mentorship

  • Actively collaborate with other Data team members through pairing sessions and informal mentorship
  • Share best practices across data, backend and ML engineering
  • Reach out to other technical leads to clarify backend logic, data semantics, or system behaviour required for data initiatives

Ownership & Stakeholder Interaction

  • Lead data projects end-to-end with minimal supervision, from requirements gathering to delivery
  • Proactively spot opportunities for improvements in the data infrastructure and ML/AI adoption
  • Understand and articulate trade-offs between technical solutions, delivery speed, and stakeholder needs
  • Translate complex technical concepts into clear, business-oriented language
  • Occasionally present and explain data products, insights, and deliveries directly to stakeholders

Requirements

  • 3+ years of experience in a data-related role
  • Proficiency in Python, SQL, dbt and Airflow
  • Experience building, maintaining, and documenting APIs in Python (preferably FastAPI)
  • Familiarity with modern DevOps practices – Cloud (AWS preferred), IaC (Terraform), CI/CD, Kubernetes
  • Familiarity with data science/ML concepts
  • Strong curiosity and willingness to learn
  • Good understanding of the data tooling landscape and the ability to pick the right tool for the job with a pragmatic mindset
  • Nice to have: some experience with LLMs/agentic AI, Streamlit or frontend development

Work Hours: 8

Experience in Months: 36

Level of Education: Bachelor's degree

Job application procedure

 Interested and qualified? Click here to apply

Job Info
Job Category: Computer/IT jobs in Kenya
Job Type: Full-time
Deadline of this Job: Friday, February 6 2026
Duty Station: Nairobi, Kenya
Posted: 29-01-2026
No of Jobs: 1
Start Publishing: 29-01-2026

Caution: Never pay money at any stage of a recruitment process.

Some sophisticated scams can trick you into paying for psychometric tests.