According to the GenderAnalyzer, that blog post was written by a man. I tested your original post as well and it was correctly guessed as being written by a woman.
I tried it on some other pages and if anything the thing is underconfident—it’s right more often than it supposes.
[/me googles “GenderAnalyzer” and checks own blog.]
Woo-hoo! (I’m male, but it seems to me a bad thing for that to be obvious from my writing.)
It’s probably not fair to the tool to use it on a community blog, but:
“lesswrong.com is probably written by a male somewhere between 66-100 years old. The writing style is academic and happy most of the time.”
The age result is interesting.
(This is a different web site that uses the same underlying service. It is based on the most recent posts, so the result will likely change over time.)
Darn—claims my blog is 63% woman. Not sure how to take that!
These percentages are supposedly Bayesian estimates, so it basically just means that it isn’t easy to tell one way or the other, but the tool leaned toward female. If it is well calibrated, it should be right 63% of the time and wrong 37% of the time when it gives an estimate like this. At least in my tests it was right even more often than that, though it seems other people had different experiences.
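If you want to check calibration more carefully than by eyeballing it, here’s a minimal sketch of how you could tally it up in Python; the numbers in it are made up for illustration, and you’d substitute the tool’s stated confidence and whether it was actually right for each page you test:

```python
from collections import defaultdict

# Hypothetical test log (made-up numbers): the tool's stated confidence in its
# guess for each page, and whether the guess turned out to be correct.
results = [
    (0.66, True), (0.56, True), (0.70, True), (0.63, False),
    (0.81, True), (0.75, True), (0.58, False), (0.73, True),
]

# Bin the tests by stated confidence and compare stated vs. observed accuracy.
bins = defaultdict(list)
for confidence, correct in results:
    bins[round(confidence, 1)].append(correct)

for level in sorted(bins):
    outcomes = bins[level]
    observed = sum(outcomes) / len(outcomes)
    print(f"stated ~{level:.0%}: right {observed:.0%} of {len(outcomes)} tests")

# Well calibrated: observed accuracy in each bin roughly matches the stated
# confidence. Observed accuracy consistently higher means underconfident.
```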
Just clicked through to the following screen after selecting “no—it didn’t get it right” to see the resulting poll:
Yes: 63%, No: 32%, Don’t know: 5%
This is based on all the estimates that people have voted on, so it’s not strange if it’s only getting 63-70% correct; it gives many estimates that are less certain than that.
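To make that concrete: the overall hit rate you’d expect from a perfectly calibrated tool is just the average of its stated confidences, weighted by how often each level comes up. A made-up mix for illustration:

```python
# Hypothetical mix of stated confidences (made-up numbers): fraction of pages
# that get an estimate at each level.
confidence_mix = {0.60: 0.5, 0.70: 0.3, 0.90: 0.2}

# A perfectly calibrated tool is right exactly as often as it claims, so its
# overall accuracy is the weighted average of the confidences it hands out.
overall = sum(level * share for level, share in confidence_mix.items())
print(f"expected overall accuracy: {overall:.0%}")  # about 69% for this mix
```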
What was the percentage? The tests I’ve done have ranged from 31% to 73% for the correct answer.
I wasn’t referring to the total percentage but to ranges: for example, when it estimated 65-75%, it seemed to be wrong only 1 time in 4 to 1 time in 6 rather than the 1 in 3 to 1 in 4 that those estimates imply. But maybe my sample was still too small.
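For a sense of how big the sample needs to be before that difference means anything, here’s a rough check, assuming for illustration a bin of 12 estimates in the 65-75% range with 2 misses:

```python
from math import comb

# Made-up bin: 12 estimates given at 65-75% confidence, 2 of them wrong.
n_tests, n_wrong = 12, 2
claimed_error = 0.30  # roughly what 65-75% confidence implies (25-35% wrong)

# One-sided binomial tail: probability of doing this well or better if the
# tool really is wrong 30% of the time.
p = sum(comb(n_tests, k) * claimed_error**k * (1 - claimed_error)**(n_tests - k)
        for k in range(n_wrong + 1))
print(f"P(<= {n_wrong} errors in {n_tests} tests | 30% error rate) = {p:.2f}")
# ~0.25, so a run this good is still quite compatible with honest 65-75% estimates.
```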
I’m sorry—I meant the percentage for the blog post and for Alicorn’s post.
66% for the blog post, 56% for Alicorn’s original post. For this comment: http://lesswrong.com/lw/1ss/babies_and_bunnies_a_caution_about_evopsych/1ofp , it gave 70% female, which is reasonable: it’s much more obvious than in the original post (apart from the fact that she says so explicitly, which I assume the tool doesn’t know).
My livejournal gets 58% female; my synopsis of my webcomic gets 81% female; and my serial fiction, which I coauthor with another woman, gets 75% female.