
The Nonlinear Library EA - AI Safety Research Organization Incubation Program - Expression of Interest by kaykozaronek
Nov 21, 2023
01:58
Welcome to The Nonlinear Library, where we use Text-to-Speech software to convert the best writing from the Rationalist and EA communities into audio. This is: AI Safety Research Organization Incubation Program - Expression of Interest, published by kaykozaronek on November 21, 2023 on The Effective Altruism Forum.
Tl;dr: If you might want to participate in our incubation program and found an AI safety research organization, express your interest here. If you want to help out in other ways, please fill out that same form.
We - Catalyze Impact - believe that a bottleneck in AI safety is that there are too few AI safety organizations. To address this bottleneck, we are piloting an incubation program similar to Charity Entrepreneurship's program.
The incubation program is designed to help you:
find a complementary co-founder
acquire additional knowledge and skills for founding an AI safety research organization
get access to a network of mentors, advisors and potential funders
Program overview
We aim to deliver this program by the end of Q1 2024. Here's a broad outline of the three phases we are planning:
Phase 1: Online preparation focused on skill building, workshops from experts, and relationship building (1 month)
Phase 2: An immersive in-person experience in London, focused on testing co-founder fit, continuous mentorship, and networking (2 months)
Phase 3: Continued individualized coaching and fundraising support
Who is this program for?
We are looking for motivated and ambitious engineers, generalists, technical researchers, or entrepreneurs who would like to contribute significantly to reducing the risks from AI.
Express your Interest!
If you are interested in joining the program, funding Catalyze, or helping out in other ways, please fill in this form!
For more information, feel free to reach out at alexandra@catalyze-impact.org
crossposted to LessWrong
Thanks for listening. To help us out with The Nonlinear Library or to learn more, please visit nonlinear.org
