A Safer Form of AI?
'Quantilizers' are a proposed form of AI that, instead of always taking the single highest-scoring action, picks at random from among the top-scoring actions (for example, the top 10% as ranked by expected utility). Because the AI never single-mindedly optimizes for the very best outcome, this might limit the extreme, unintended consequences that pure maximizers can produce.
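To make the idea concrete, here is a minimal sketch (not from the post) of quantilization over a finite set of actions with a uniform base distribution. The function name `quantilize`, the parameter `q`, and the toy utility function are all illustrative assumptions; the original proposal more generally samples from a base distribution restricted to its top-q quantile by expected utility.

```python
import random

def quantilize(actions, utility, q=0.1, rng=random):
    """Pick an action at random from the top-q fraction of actions,
    ranked by estimated utility, rather than always taking the argmax."""
    ranked = sorted(actions, key=utility, reverse=True)
    cutoff = max(1, int(len(ranked) * q))  # size of the top-q slice
    return rng.choice(ranked[:cutoff])

# Toy usage: 100 candidate actions with a made-up utility function.
actions = list(range(100))
chosen = quantilize(actions, utility=lambda a: -abs(a - 70), q=0.1)
print(chosen)  # some action whose utility is in the top 10%
```

The point of the randomization is that any single "weird" high-scoring action (the kind a pure maximizer might latch onto) is only chosen with small probability, so the worst-case damage is bounded relative to simply sampling an action the way a typical agent would.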