Junior Data Engineer
2025-07-10T16:34:45+00:00
ibuQa Capital Ltd
https://cdn.greatkenyanjobs.com/jsjobsdata/data/employer/comp_8571/logo/ibuQa%20Capital%20Ltd.jpeg
https://www.ibuqa.io/
FULL_TIME
Nairobi
Nairobi
00100
Kenya
Nonprofit & NGO
Computer & IT
2025-07-20T17:00:00+00:00
Kenya
8
The Junior Data Engineer will be responsible for building and maintaining scalable data pipelines, ensuring data accuracy, and assisting in the integration of data sources. This role involves working with large datasets, optimizing data flow, and collaborating with cross-functional teams to meet the data needs of the organization. The ideal candidate is eager to learn, passionate about data, and excited to contribute to the company’s data infrastructure.
Responsibilities
- Data Pipeline Development: Assist in building and maintaining ETL (Extract, Transform, Load) pipelines to collect and transform data from various sources.
- Data Integration: Work on integrating diverse data sources into the company’s data warehouse or data lake, ensuring consistency and accuracy.
- Data Quality: Monitor and validate the integrity of incoming and existing data to maintain high standards of data quality and accuracy.
- Performance Optimization: Assist in optimizing data pipelines and infrastructure for efficiency, scalability, and speed.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and implement solutions that meet their needs.
- Documentation: Create and maintain documentation for data processes, pipelines, and systems to ensure clarity and reproducibility.
- Data Warehousing: Support the development and maintenance of the company’s data warehouse to facilitate effective data storage and retrieval.
- Data Security: Assist in implementing best practices to ensure the security and privacy of sensitive data throughout the pipeline.
Requirements
- Bachelor’s degree in Computer Science, Data Science, Engineering, or a related field.
- Basic knowledge of programming languages such as Python.
- Experience with SQL for querying and manipulating data.
- Familiarity with ETL processes and basic knowledge of data pipeline tools (e.g., Apache Airflow).
- Understanding of data warehousing concepts and experience working with databases such as PostgreSQL, MySQL, or NoSQL databases.
- Knowledge of cloud platforms such as AWS, Google Cloud, or Azure is a plus.
No Requirements
JOB-686feba53afa6
Vacancy title:
Junior Data Engineer
[Type: FULL_TIME, Industry: Nonprofit & NGO, Category: Computer & IT]
Jobs at:
ibuQa Capital Ltd
Deadline of this Job:
Sunday, July 20, 2025
Duty Station:
Nairobi | Nairobi | Kenya
Summary
Date Posted: Thursday, July 10, 2025, Base Salary: Not Disclosed
JOB DETAILS:
Work Hours: 8
Experience: No Requirements
Level of Education: Bachelor's degree
Job application procedure
Interested and qualified? Click here to apply