Oh bugger. Now I have to change careers. Or pretend I’m a dentist at all LW meetups henceforth.
If you’d like the impressions of an SEO in the trenches: yes, this is more or less the model I had, except for the novelty of seeing Google as the computational underdog. From the perspective of any particular SEO, it is the cursed Yellow Face which burns usss.
I would say that you oversell our powers somewhat, except that I’m one of the wimpy white-hatted specimens who don’t have access to the fun Dark Artefacts like link farms and industrialised content shops. Even with an industrialised content shop, trust me, it’s difficult to make a car insurance website interesting.
(As you might expect, our half of the industry circulates its own little self-justifying anecdotes about how often the link farms get sniffed out and banned, and how users are smarter than that and don’t fall for that hothouse spam stuff. Except of course they largely do, and we’re not seeing it, as you point out.)
I would say one thing about doing SEO as an aid to rationality: constructing fragile little houses of lies, and then watching an (admittedly imperfect) Big Bad Algorithm blow them down again and again, makes one REALLY REALLY receptive to Eliezer talking about entangled truths. I think it’s still my favourite part of the Sequences.
Hearing from the SEO (albeit white-hat) point of view is really useful. I only have my external observation of the industry and my theoretical understanding of OBP (optimization by proxy) to draw from, so I wasn’t sure how well I represented the SEO state of the art.
I think you describe current SEO well. Good content, and links to that content, are still the state of the art.
I got a lot out of thinking about the computational / human-bandwidth asymmetry of Google vs content creators.
But have you considered how the fear of being Sandboxed plays into things?
My first thought was that it improves the value of the proxy somewhat, by making people who know the proxy will change over time less cavalier. Most people engaging in serious SEO have lucrative websites. You have to be very risk-seeking to chase those small marginal gains at the risk of losing all your cash flow permanently. And there aren’t so many large players that it gets driven down to a Nash equilibrium quicker than Google’s algorithms can change.
But the more I think about it, the fear of being penalized also tends to make legitimate content producers even more concerned that doing ANY SEO is bad. That may make things doubly bad.
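To make that risk calculus concrete, here’s a rough expected-value sketch. All figures (revenue, lift, detection probability, horizon) are invented for illustration; the point is only that a small chance of losing the whole revenue stream swamps a small marginal lift.

```python
# Toy expected-value calculation for a risky SEO technique, as a one-shot bet:
# a small traffic lift if you get away with it, loss of the whole revenue
# stream over the horizon if the site gets sandboxed. All figures invented.

def ev_of_risky_technique(monthly_revenue, lift, p_caught, horizon_months):
    gain_if_safe = monthly_revenue * lift * horizon_months
    loss_if_caught = monthly_revenue * horizon_months
    return (1 - p_caught) * gain_if_safe - p_caught * loss_if_caught

# A $50k/month site chasing a 5% lift over two years: even a 5% chance of a
# permanent ban makes the gamble negative in expectation.
print(ev_of_risky_technique(50_000, 0.05, 0.05, 24))  # ≈ -3000
```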
It’s impossible to not do SEO. Every site is optimized for something.
For instance, lesswrong.com is optimized for:
vote
points
permalink
children
password
Think about that next time you lament that LessWrong is overwhelmingly less popular than other sites with clearly inferior content.
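A list like the one above can be approximated by naively counting word frequencies in a page’s raw HTML, since on a comment-heavy page the UI chrome (vote buttons, permalinks, login forms) dominates the source. A minimal sketch, with the URL and the whole method purely illustrative (this is a keyword-density caricature, not a real ranking signal):

```python
# Naive keyword-frequency view of "what is this page optimized for?":
# count word tokens in the raw HTML, markup and all. UI chrome (vote,
# permalink, password, ...) plausibly dominates the actual content.
# Illustrative only; not how a search engine actually ranks pages.

import re
import urllib.request
from collections import Counter

def top_terms(url, n=5):
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="ignore")
    words = re.findall(r"[a-z]{4,}", html.lower())  # crude tokenizer
    return Counter(words).most_common(n)

print(top_terms("http://lesswrong.com/"))
```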
Thanks for the feedback. I hadn’t factored sandboxing into my thinking at all. But as you say it’s a double-edged sword.
I assume the way SEO techniques get around this is to initially ‘interrogate’ the algorithm through throwaway, high-risk websites; any robust techniques discovered then slowly make their way up the ladder to established, highly conservative websites. Of course at every step you risk getting caught (and it depends on how repulsive the techniques are, too), but that’s always a risk when you’re building on someone else’s platform. And if a website depends on its standing in Google’s search results, it is effectively building on Google’s platform.
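A toy simulation of that ladder, purely to illustrate the dynamic (the detection probabilities and the promotion threshold are invented): techniques are trialled on disposable sites across several algorithm updates, and only those that reliably survive get promoted to the established sites.

```python
# Toy model of the 'interrogation ladder': trial each technique on many
# throwaway sites across several algorithm updates, and promote only the
# techniques that reliably survive. All probabilities are invented.

import random

def survives_probation(p_detect_per_round, rounds):
    """One throwaway site running the technique through `rounds` updates."""
    return all(random.random() > p_detect_per_round for _ in range(rounds))

def vet_techniques(techniques, rounds=6, trials=20, threshold=0.8):
    promoted = []
    for name, p_detect in techniques:
        survival_rate = sum(
            survives_probation(p_detect, rounds) for _ in range(trials)
        ) / trials
        if survival_rate >= threshold:  # conservative promotion bar
            promoted.append(name)
    return promoted

candidates = [("keyword stuffing", 0.30), ("link farm", 0.15), ("good content", 0.01)]
print(vet_techniques(candidates))  # typically only 'good content' is promoted
```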
Also, an interesting point about LessWrong’s optimization. I guess now that we know we count two Search Engine Optimizers in our midst, the Powers That Be can get in touch with you guys…