An initiative to create concise and widely shareable educational resources, infographics, and animated explainers on the latest contributions to the community AI alignment effort. Boosting the signal and moving the community towards finding and building solutions.

liondw/Signal-Alignment

Signal-Alignment 📡

Welcome to the Signal-Alignment project! This is my initiative to create educational resources, infographics, and animated explainers for understanding the latest community AI alignment research. My goal is to use clear, concise design to make AI alignment research more accessible and to encourage more people to contribute to the development of safe and beneficial AI.

I am in the very early stages, so please be patient with me as I get set up! Contact me below if you're interested in adding my design services to your AI-related projects.

Understanding the Problem ⚠

We are at a crossroads in AI development, fast approaching AGI and superintelligent autonomous AI. There are many valid concerns, ranging from security and identity theft to misalignment and malicious actors. Currently, AI alignment research and investment lag severely behind other advances in AI technology. AI alignment is an urgent matter on which the vast majority of society has had little to no awareness or say.

The lack of accessible educational resources in the field of AI alignment creates a barrier to entry for individuals who want to contribute. That's why I created Signal-Alignment: to reach people actively working on open-source AI projects, as well as people who are new to AI alignment or have limited technical knowledge. If you have an idea or resource you would like to suggest, you can contact me below.

Design Process ✒

To ensure accessibility, I design the resources with simplicity and clarity in mind. The design process involves brainstorming ideas, researching and gathering information, sketching and prototyping, and designing and refining the resources via feedback. I use AI tools where ethical to assist in the design process. You can view my projects in progress in the projects tab above.

I recently participated in a 24-hour hackathon to help present ETHOS, a modular and easily accessible agent alignment framework.

See more here: https://youtu.be/SL7f6WX20Ks

ETHOS Framework Presentation

Where to begin 🖼

I am prioritizing the design of educational slides articulating the AI alignment problem, as well as some immediate ways to contribute to a solution.

I have already created an introduction to David Shapiro's Heuristic Imperatives, a proposed framework for solving the AI alignment problem, and I am currently working on a follow-up addressing implementation of these principles, as well as a companion animated explainer video.

This covers only a fraction of David's work so far; I highly suggest learning more via his videos:

AGI Unleashed: Game Theory, Byzantine Generals, and the Heuristic Imperatives

The AGI Moloch: Nash Equilibrium, Attractor States, and Heuristic Imperatives: How to Achieve Utopia

You can also explore his GitHub repo on Heuristic Imperatives.

More projects to come, view progress in the projects tab above.


Finding More Educational Resources 📚

While Signal-Alignment provides an introduction to AI alignment, deeper reading and research are necessary for a thorough understanding of the field. Here are a few resources to help you get started:

David Shapiro - Author of Benevolence by Design and AI researcher

https://www.youtube.com/@DavidShapiroAutomator

https://github.com/daveshap/HeuristicImperatives

AI Safety Community document - links to other community alignment efforts

https://coda.io/@alignmentdev/alignmentecosystemdevelopment

Contact me 📞

If you have questions, would like to contribute, or have topic suggestions, feel free to reach out on the discussion page or at signal.alignment@gmail.com.

An early part of the Signal-Alignment initiative will be offering design and non-code contributions to others. For projects that I decide to take on, I can assist you with:

  • Branding
  • Logo design
  • Colour palettes
  • Landing page design for a product
  • README formatting
  • Design assets (images, social media posts)

Looking for researchers

If this project looks like a step in the right direction, I am looking for people to work closely with me to create these resources faster. Please send me an enquiry via the email above, and I will review it on a case-by-case basis.

If you are a non-coder with a special interest in communication and want to help gather information, read through papers, and suggest high-priority topics for me to cover, this might be the right place to start.

If you are experienced with implementing AI-assisted workflows into projects, I also welcome any advice or suggestions.
