Talk:Existential risk from artificial intelligence
This is the talk page for discussing improvements to the Existential risk from artificial intelligence article. This is not a forum for general discussion of the subject of the article.
Archives: 1, 2 (auto-archiving period: 12 months)
This article is of interest to multiple WikiProjects.
The empirical argument
I'm pretty busy editing other articles, but to add my own perspective on this topic: I thought all of this was pretty silly until I started seeing actual empirical demonstrations of misalignment from research teams at Anthropic et al.; ongoing prosaic research convinced me it wasn't all navel-gazing. This article takes a very Bostromian armchair perspective that was popular around 2014, without addressing what I'd argue has become the strongest argument since then.
- "Hey, why'd you come around to the view that human-level AI might want to kill us?"
- "Well, what really convinced me is how it keeps saying it wants to kill us."
– Closed Limelike Curves (talk) 22:50, 20 September 2024 (UTC)
- Makes sense. There is more empirical research being done nowadays, so we could add content on that. Alenoach (talk) 00:44, 21 September 2024 (UTC)
- Nah. It still is pretty silly. Folks treating this topic seriously have spent a little too long watching Black Mirror and various other lame sci-fi. I'm sorta surprised this entire article hasn't been taken to AfD. How does it avoid WP:CRYSTALBALL's prohibition on speculative future history? NickCT (talk) 18:07, 27 November 2024 (UTC)
- I think the only sci-fi movie I've ever seen is Star Wars. In any case, it's an appropriate topic because the discussion itself is notable and widely-reported on in reliable sources—other examples of this would be the articles on designer babies and human genetic enhancement. Like the link says:
Predictions, speculation, forecasts and theories stated by reliable, expert sources or recognized entities in a field may be included, though editors should be aware of creating undue bias to any specific point-of-view.
– Closed Limelike Curves (talk) 19:34, 27 November 2024 (UTC)
Elon Musk?
[edit]"On March 2, 2025, Elon Musk estimated a 20% chance of AI-caused extinction."
Elon Musk is not an AI researcher, philosopher, or other specialist. There is no reason whatsoever for his opinion or estimates to be given credibility.
I don't remove the sentence myself because his persona inspires strong emotions and I don't want to enter a dispute over it, but I defer to some more senior editor who would consider removing it. DigitalDracula (talk) 12:51, 12 March 2025 (UTC)
- I don't have a strong opinion on the matter, but I would note that Musk co-founded OpenAI. WeyerStudentOfAgrippa (talk) 13:22, 12 March 2025 (UTC)
- Slight preference toward removing the sentence. A lot of what Musk says is unsubstantiated, especially these days, so it's unclear whether his subjective probability estimates are valuable for readers, and the article already mentions him a number of times. I'll remove it for now, but if someone insists on adding it back (with a better source than H2S Media), I won't oppose it. Alenoach (talk) 22:44, 15 March 2025 (UTC)
Additions to "Views on banning and regulation"
I think adding organizations such as StopAI and PauseAI would be a needed addition to this section. 109.247.57.11 (talk) 08:52, 28 April 2025 (UTC)
Use of word "doomer"
"Doomer" is a common word now for describing people who are into this cause, including by the people themselves. Can I add that term, or is it still a sore subject? Not logged in 2 (talk) 18:48, 18 May 2025 (UTC)
- The term "doomer" is mostly pejorative and isn't precisely defined (at which subjective probability estimate of doom does one start being a "doomer"? 50%? 90%?), so it's likely better to avoid the term when possible. Alenoach (talk) 20:02, 18 May 2025 (UTC)
Wiki Education assignment: Introduction to Technical Writing
This article is currently the subject of a Wiki Education Foundation-supported course assignment, between 19 August 2025 and 13 December 2025. Further details are available on the course page. Student editor(s): Aromer264 (article contribs).
— Assignment last updated by Aromer264 (talk) 02:22, 20 September 2025 (UTC)
