Technical AI Safety Grantmaker job at Open Philanthropy

Vacancy title:
Technical AI Safety Grantmaker

[Type: Full-time, Industry: Nonprofit and NGO, Category: Science & Engineering, Social Services & Nonprofit]

Jobs at:
Open Philanthropy

Deadline of this Job:
Monday, November 24 2025

Duty Station:
Remote | Nairobi | Kenya

Summary
Date Posted: Wednesday, November 12 2025, Base Salary: Not Disclosed


JOB DETAILS:

The Technical AI Safety (TAIS) team funds technical research aimed at reducing catastrophic risks from advanced AI, and is housed under our broader work on navigating transformative AI, the largest focus area at Open Philanthropy. Last year, we made $40 million in grants, and this year we expect to make >$130 million. We plan to continue expanding our grantmaking in 2026, and are looking to hire additional staff to enable this.

We think that technical AI safety grantmaking is a highly impactful career for reducing catastrophic risks from advanced AI. Grantmakers have an outsized influence on the field of technical AI safety: the role involves influencing dozens of research projects at once, setting incentives for the entire field, and growing the field by supporting new researchers and incubating organizations that could play important roles in the future. Our grants include general operating support to organizations conducting AI safety research (e.g. FAR.AI, Redwood Research), project-based grants to academic and independent researchers (e.g. through our recent RFP), and proactively seeding new initiatives.

We only have three grantmakers on the team, and are regularly bottlenecked by technical grantmaker capacity, particularly as we have scaled. If you join our team, you may be able to significantly increase the quantity and quality of grants we’re able to make. For example, growing our team’s capacity may enable us to:

  • Periodically update and reopen our recent RFP, keeping it open for longer (potentially permanently open), to match the substantial interest we have received from researchers throughout the year, even as the AI safety field continues to grow.
  • Spend more time actively seeking out and creating exciting new grant opportunities.
  • Engage more with our largest grantees to ensure they are set up for success, including suggesting alterations to make their research more impactful.
  • Write more in public about research we think would be impactful and how to make it happen.
  • Investigate more of the promising proposals we receive, instead of having to aggressively triage due to limited grantmaker capacity.

About the roles

We are looking for multiple hires at a range of seniority levels: Senior Program Associate, Associate Program Officer, and Senior Program Officer. Below, we outline what we are looking for across the roles, and then give more detail about how expectations differ between them.

The ideal candidate for these positions will possess many of the skills and experiences described below. However, there is no such thing as a “perfect” candidate, and we are hiring across a broad range of levels of seniority, so if you are on the fence about applying because you are unsure whether you are qualified, we strongly encourage you to apply. There is a single application for all of the roles listed; we plan to let you know at the point of inviting you for a work test which role(s) we are considering you for.

Who we’re looking for across the roles

The core function of each role is to recommend grants to advance technical research aimed at reducing catastrophic risks from AI. All of our grantmakers have significant responsibility for investigating and recommending grants. We expect team members to develop views about the field, and want to empower them to make grants that can help to shape it. In practice we expect to rely significantly on grantmakers’ inside views about individual grants, and often about entire research agendas.

You might be a good fit for these roles if you have:

  • Familiarity with AI safety. You have well-thought-out views on the sources and severity of catastrophic risk from transformative AI, and on the cases for and against working on various technical research directions to reduce those risks. You communicate your views clearly, and you regularly update these views through conversation with others.
  • Technical literacy. You are comfortable evaluating a proposal's technical feasibility, novelty, and potential contribution to a research area (e.g. one of the research areas we list in our most recent RFP). You are at home in technical conversations with researchers who are potential or current grantees.
  • Good judgment. You can identify and focus on the most important considerations, have good instincts about when to do due diligence and when to focus on efficiency, and form reasonable, holistic perspectives on people and organizations.
  • High productivity. You are conscientious and well-organized, and you can work efficiently.
  • Clear communication. You avoid buzzwords and abstractions, and give concise arguments with transparent reasoning (you’ll need to produce internal grant writeups, and you may also draft public blog posts).
  • High agency. You will push to make the right thing happen on large, unscoped projects, even if it requires rolling up your sleeves to do something unusual, difficult, and/or time-consuming.
  • Technical AI safety research experience. You have published TAIS research in the past. This is not a hard requirement, but is useful for these roles (especially the more senior roles).

We also expect all staff to model our operating values of ownership, openness, calibration, and inclusiveness.

In general, roles within the team are fairly fluid, with people at different levels of seniority contributing to a range of tasks according to where their skillset and experience is most valuable. Even “junior” team members (in terms of professional experience) regularly take on significant responsibility, especially in areas in which they have expertise.

Central tasks across the roles could include:

  • Evaluating technical grant applications, for example those that came from our recent RFP.
  • Iterating with potential grantees on their research ideas and strategic plans.
  • Maintaining strong knowledge of important developments in AI capabilities and safety research, and adapting our funding strategy appropriately.
  • Developing strong relationships with key AI safety researchers and other important people in the field, and understanding their views on important developments.
  • Explaining our AI safety threat models and research priorities to potential grantees.
  • Sharing feedback and managing relationships with grantees, both in writing and conversation.

Senior Program Associate

Senior Program Associates are typically engaged early contributors to the field of TAIS with strong independent judgment. Candidates might have roughly 0.5-2 years of TAIS-relevant experience – i.e. any experience that involves spending a significant fraction of your time thinking, talking, or reading about technical AI safety. Examples of TAIS-relevant experience include a research master’s degree focused on AI alignment research, time in a technical AI safety mentorship program, or employment in an organization that works on technical AI safety.

Associate Program Officer

Associate Program Officers typically have established expertise in technical AI safety (i.e. 2-4 years of TAIS-relevant experience) or bring professional judgment and transferable skills from other domains while having some technical AI safety expertise (i.e. typically 0.5-2 years of TAIS-relevant experience and 3+ years of other professional experience).

In addition to the tasks listed above, Associate Program Officers might expect to:

  • Develop our grantmaking strategy in particular areas, including ways we could increase impact or use active grantmaking to shape the field of AI safety.
  • Actively create highly promising grant opportunities where they do not already exist.
  • Own relationships with our largest and most important grantees.

Senior Program Officer

Senior Program Officers are typically recognized thought leaders in the field of technical AI safety (i.e. they typically bring 5+ years of TAIS-relevant experience) or bring senior-level professional expertise from other domains.

 

Work Hours: 8

Experience in Months: 36

Level of Education: Bachelor's degree

Job application procedure

Click here to Apply 

 
