AI safety lacks a space to ask stupid or ballsy questions

(this page assumes that asking stupid or ballsy questions is important for learning/making intellectual progress)
 
I think the way that voting and crossposting work on [[LessWrong]] and the [[Alignment Forum]] significantly reduces the number of stupid or ballsy questions people feel able to ask. (I mean things like the existence of downvotes, and how it feels bad when something isn't "good enough" to be crossposted to the Alignment Forum.) I [https://www.greaterwrong.com/posts/5syG88rmW5iD9kTM5/is-it-harder-to-become-a-miri-mathematician-in-2019-compared like] [https://www.greaterwrong.com/posts/9YwNXt7ANyMqTu6Ky/degree-of-duplication-and-coordination-in-projects-that asking] [https://eaforum.issarice.com/posts/tDk57GhrdK54TWzPY/i-m-buck-shlegeris-i-do-research-and-outreach-at-miri-ama these] [https://eaforum.issarice.com/posts/an9GrNXrdMwBJpHeC/long-term-future-fund-august-2019-grant-recommendations-1#AkWpKT5dF7SDQHMD3 questions], but I feel like there's a ton of work involved in making them publicly acceptable. If I have 50 such questions (and I'm sure I've written down at least 50 like this), I only end up posting something like 5 of them on the internet, because it takes a lot of mental energy to simulate people's reactions, make the questions palatable to would-be downvoters, and so on. I feel like this significantly reduces my learning rate.
  
 
[[Category:AI safety meta]]
 