Data Engineer job at Co-operative Bank of Kenya


Vacancy title:
Data Engineer

[Type: FULL_TIME, Industry: Banking, Category: Computer & IT, Science & Engineering]

Jobs at:
Co-operative Bank of Kenya

Deadline of this Job:
Wednesday, December 31 2025

Duty Station:
Nairobi | Nairobi

Summary
Date Posted: Tuesday, December 23 2025, Base Salary: Not Disclosed


JOB DETAILS:

Background

Are you looking for an employer who promotes individual excellence and mutual respect in a team-driven culture with a key focus on social empowerment? The Co-operative Bank of Kenya is the place for those seeking new horizons.

We seek a fearless inventor with an artistic streak who can find creative solutions to tough problems and sculpt brilliant strategies from a mountain of data: someone who can create the digital equivalent of the Mona Lisa with only an algorithm and a smile.

Reporting to the Head – Business Intelligence, the Data Engineer will be responsible for designing, building, and maintaining high-quality data pipelines, data integration layers, and automated machine-learning operational pipelines (MLOps). This role ensures the reliable, secure, and scalable movement of data across enterprise systems and supports the full lifecycle of AI/ML models.

The Role

Responsibilities

  • Gather information from business users to understand their detailed requirements and expectations; analyze business/use-case requirements from BI analysts to determine operational problems; and define data-modelling requirements and develop data structures that support the generation of business insights and strategy.
  • Carry out analysis of requirements and recommend solutions to address user requirements.
  • Assist users in preparing system definitions/specifications that highlight technical requirements, and roll out BI solutions to stakeholders.
  • Identify, analyze and interpret trends or patterns in complex data sets using statistical techniques and provide reports.
  • Create, schedule, test, deploy, and maintain data pipelines (ETL) from various sources to the required destinations, applying the transformations needed for reporting.
  • Design, build, and optimize data ingestion pipelines from structured and unstructured sources.
  • Ensure data quality, lineage, and governance through automated checks and metadata management.
  • Implement CI/CD for data pipelines including automated testing, version control, and rollbacks.
  • Create reusable pipeline components and templates to accelerate onboarding of new data sources.
  • Develop and maintain data models, warehouse layers, and lakehouse zones.
  • Build and automate end-to-end ML pipelines integrating training, validation, deployment, and monitoring.
  • Create feature pipelines, model training pipelines, and batch/real-time prediction services.
  • Manage ML model versioning, metadata tracking, and reproducibility.
  • Build visualizations to summarize findings and present them to business and other key stakeholders. Filter and clean data, and review reports, printouts, and performance indicators to locate and correct code problems.
  • Secure BI solutions by putting adequate controls in place and restricting user access to programs in accordance with the Bank's requirements.
  • Guide the business in drawing report formats and wireframes, advise on the best approach to transforming data and automating reports, and design and code reports/dashboards to user specifications, with the key objective of delivering reports that assist in decision-making and control.
  • Develop and maintain documentation/manuals on system configuration and set-up, carry out technical user training as required to enable users to interpret BI reports, and handle data, dashboard, and report queries from users, resolving them or advising accordingly.
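As a purely illustrative sketch of the ETL duties listed above (this is not the Bank's actual stack; the table and column names are hypothetical), a minimal extract-transform-load flow using only the Python standard library might look like this:

```python
import csv
import io
import sqlite3

def extract(csv_text):
    """Extract: parse raw CSV rows from a (hypothetical) source system."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def transform(rows):
    """Transform: clean and standardize records for reporting."""
    out = []
    for r in rows:
        amount = float(r["amount"])
        if amount < 0:  # drop invalid records
            continue
        out.append((r["account_id"].strip(), round(amount, 2)))
    return out

def load(rows, conn):
    """Load: write transformed rows into a reporting table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS txn_report (account_id TEXT, amount REAL)"
    )
    conn.executemany("INSERT INTO txn_report VALUES (?, ?)", rows)
    conn.commit()

raw = "account_id,amount\nA001, 120.50\nA002,-5.00\nA003,33.333\n"
conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
report = conn.execute("SELECT account_id, amount FROM txn_report").fetchall()
```

In practice, each stage would be a separately scheduled, tested, and monitored step, but the separation of extract, transform, and load shown here is the core of the responsibility.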

Skills, Competencies and Experience

Qualifications

  • Bachelor of Science degree in Computer Science, IT, Software Engineering, or any other degree in related fields.

Experience

  • A minimum of 3 years' experience in data engineering, BI, and software development using Oracle.
  • Strong knowledge of and experience with ETL tools (Oracle ODI, Microsoft SSIS, Talend), query languages (Oracle PL/SQL, SQL), and programming languages (Java, Python, Scala).
  • Experience with dimensional data modeling, data management, and data processing. Knowledge of statistics and experience using statistical packages to analyze large data sets (Python, R, SPSS, SAS, Excel, etc.).
  • Experience with CI/CD and automation tools (GitLab CI, Jenkins, Argo, etc.).
  • Experience with big data tools (Hadoop, Apache Hive, Scala, Kafka, Apache Spark, NoSQL databases).
  • Knowledge of visualization tools (Oracle Analytics Server, Power BI, SSRS, Tableau, QlikView).
  • Technical expertise regarding data models, database design development, data mining and segmentation techniques is desired.
  • Very good knowledge of Windows operating systems and fair working knowledge of Unix and Linux.
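For context on the MLOps items mentioned in the role (model versioning, metadata tracking, reproducibility), here is a bare-bones sketch using only the Python standard library; the registry API, model name, and parameters are all hypothetical illustrations, not a real tool:

```python
import hashlib
import json
from dataclasses import dataclass, field

@dataclass
class ModelRecord:
    """Minimal metadata record for one trained model version."""
    name: str
    version: int
    params: dict
    metrics: dict = field(default_factory=dict)

    def fingerprint(self):
        # Deterministic hash of the training parameters,
        # usable as a cheap reproducibility check.
        blob = json.dumps(self.params, sort_keys=True).encode()
        return hashlib.sha256(blob).hexdigest()[:12]

class Registry:
    """In-memory model registry: register versions, look up the latest."""
    def __init__(self):
        self._models = {}

    def register(self, name, params, metrics):
        version = len(self._models.get(name, [])) + 1
        rec = ModelRecord(name, version, params, metrics)
        self._models.setdefault(name, []).append(rec)
        return rec

    def latest(self, name):
        return self._models[name][-1]

reg = Registry()
reg.register("credit_scoring", {"max_depth": 4}, {"auc": 0.81})
rec = reg.register("credit_scoring", {"max_depth": 6}, {"auc": 0.84})
```

Production systems would back this with a database and a tool such as MLflow, but the essentials are the same: every trained model gets an immutable version, its parameters, and its evaluation metrics.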

 

Work Hours: 8

Experience in Months: 36

Level of Education: Bachelor's degree

Job application procedure
Interested in applying for this job? Click here to submit your application now.

If you fit the profile, then apply today! Please forward your application enclosing a detailed Curriculum Vitae, indicating the job reference number DE/IID/2025 as the subject.

 


Job Info
Job Category: Computer/ IT jobs in Kenya
Job Type: Full-time
Deadline of this Job: Wednesday, December 31 2025
Duty Station: Nairobi | Nairobi
Posted: 23-12-2025
No of Jobs: 1
Start Publishing: 23-12-2025
