Wednesday, September 18, 2019

Another Topic Too Dangerous to Discuss?

I find the topic of sex, gender, identity, power, and social constructionism very interesting, and here's an article on the topic with the following catchy excerpt:
I basically just made it up.
Human characteristics generally have a basis in some mix of nature and nurture (or DNA and memes, if you prefer). Topics like the above are dangerous to discuss because if one can be interpreted as putting even a little too much emphasis on nature (for example, suggesting that the contribution of nature/DNA is non-zero), one can get in a lot of trouble.

I sometimes wonder if the study of biology, and particularly genetics, is going to be shut down in the future. The problem is that it seems increasingly at odds with social science. Via Genome Wide Association Studies (GWAS), biologists are finding more and more correlations between genes and human traits like intelligence and various behaviors, and they are starting to propose mechanisms for the genetic basis of those traits, while social scientists assert that what the biologists are finding simply cannot be correct.

Perhaps not all of biology will be banned - just those topics that have to do with things like intelligence, behavior, and identity. Nonetheless, it seems like we might be headed for a different sort of Creationism - not one that's deity-based, but rather social-science-based.

Tuesday, September 17, 2019

Richard Stallman Resigns

There have been topics that I've wanted to write about but have been hesitant to. For example, I found the Epstein phenomenon fascinating (though awful), from his motivations to his operations to his (apparent) suicide. However, it was reasonably clear that writing even one word on the subject that anybody could possibly interpret as not being politically correct could be devastating to me.

I met Richard Stallman, a MacArthur Fellowship ("genius grant") recipient and quintessential MIT nerd, a few times when I was at MIT, both at CSAIL and at parties. He was, in my opinion, quite opinionated and could be very abrasive, but he was also very smart, very talented, and extremely productive, and as far as I could tell he seemed, overall, to have a good heart.

He was recently forced to resign from various positions:
In 2019, Stallman was reported by colleagues to have made statements by email in defense of Marvin Minsky, then deceased, against allegations of sexual abuse in connection with Jeffrey Epstein's alleged child sex trafficking operation.[114] In the resulting furor, Stallman resigned from both MIT[115][116] and the Free Software Foundation.[117]
I'm not totally sure, but my recollection is that Minsky was at least somewhat of a mentor to Stallman, so it's not surprising that Stallman might be inclined to try to defend his late mentor. And given that he's the quintessential MIT nerd, it's also not surprising that he'd lack the filters to realize that doing so would be a really bad idea.

Anyway, if I needed confirmation that Epstein was yet another topic I should stay well away from, this was it.

My question is: what can I write about that won't get me in trouble? I guess more science and math stuff, so that's what I'll focus on.

Wednesday, September 11, 2019

Interesting Abstract

Here is an abstract I found interesting:

Technological innovation can create or mitigate risks of catastrophes—such as nuclear war, extreme climate change, or powerful artificial intelligence run amok—that could imperil human civilization. What is the relationship between economic growth and these existential risks? In a model of endogenous and directed technical change, with moderate parameters, existential risk follows a Kuznets-style inverted U-shape. This suggests we could be living in a unique “time of perils,” having developed technologies advanced enough to threaten our permanent destruction, but not having grown wealthy enough yet to be willing to spend much on safety. Accelerating growth during this “time of perils” initially increases risk, but improves the chances of humanity’s survival in the long run. Conversely, even short-term stagnation could substantially curtail the future of humanity. Nevertheless, if the scale effect of existential risk is large and the returns to research diminish rapidly, it may be impossible to avert an eventual existential catastrophe.

This has been my intuition for a long time. My metaphor is this: humanity/civilization is on a runway in a scramjet, accelerating towards a brick wall. If we go full pedal to the metal we might, just might, be going fast enough to lift off the runway in time to clear the brick wall. If we don't, we won't reach a high enough speed to clear the wall, but unfortunately our forward momentum is too great to stop, and we're sure to hit the wall, and that will be the end.
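
To make that inverted-U intuition a bit more concrete, here's a toy numerical sketch. To be clear, this is my own illustration, not the paper's actual model: every functional form and parameter below is an assumption chosen just to reproduce the qualitative shape.

```python
import numpy as np

# Toy sketch of the "time of perils" inverted U. My own illustration,
# NOT the paper's actual model: hazard scales up with consumption
# technology C (the "scale effect") and down with accumulated safety
# technology S, and the share of output spent on safety is assumed to
# become significant only once society is wealthy enough.
beta, gamma = 1.0, 1.5                 # assumed exponents, for illustration
t = np.arange(0.0, 11.0)               # "time" in arbitrary units
C = np.exp(0.5 * t)                    # consumption tech grows exponentially
safety_share = 1.0 / (1.0 + np.exp(-(t - 5.0)))  # spending share rises with wealth
S = 1.0 + np.cumsum(safety_share * C)  # safety tech accumulates from that spending

hazard = C**beta / S**gamma            # per-period existential hazard
for ti, hi in zip(t, hazard):
    print(f"t={ti:4.1f}  hazard={hi:7.4f}")
```

Running it, the hazard climbs from about 1.0 to a peak near t = 2 and then decays toward zero - the inverted U. And if beta exceeds gamma (a scale effect too strong for safety spending to outrun), the hazard eventually rises again no matter what, which loosely mirrors the paper's closing caveat.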

Thursday, September 05, 2019

YASP (Yet Another Sunset Photo)

As usual, from my apartment. This time with a little foreground rain. No rainbow though.