Welcome to the Singularity mini wiki at Scratchpad!
Singularitarianism is a moral philosophy based on the belief that a technological singularity (the technological creation of smarter-than-human intelligence) is possible, and that deliberate action should be taken both to bring it about and to ensure its safety. While many futurists and transhumanists speculate on the possibility and nature of such a singularity (often called simply the Singularity, capitalized to mark its magnitude as a historical event), Singularitarians hold that it is not only possible but desirable, and that its course can be guided. Accordingly, they "dedicate their lives" to promoting it and to acting in ways they believe will contribute to its safety and early arrival.