
Should I do reciprocal linking on my site?


Reciprocal links are touch and go for a few reasons.

The reason incoming links are good is that search engines treat your site as worthwhile when humans link to it. In other words, if a webmaster reviews your content and links to it, then your content must be good (or so the search engine reasons).

So people started actively building links, to make search engines think that their content is good.

The catch is that search engines hate it when you do things purely to optimize your site for search engines. Optimize your site for users, while making it easy for search engines to see what it's about (this is where bold, italics, the page title and headings come in).

Search engines see reciprocal links as a distortion. The webmaster is not linking to you because your content is good; they're only linking to you because they're getting something in return (a link back to their site, or even money). If you have a page of links pointing to sites that link back to you, a search engine is going to realize that none of those links actually mean anything. They're going to think that you're trying SEO "tricks".

Google specifically warns against "excessive" reciprocal linking, but doesn't quantify what counts as excessive. So if you find a nice juicy site that insists on a reciprocal link, then fine - give them a link from within an article. Never create a page of reciprocal links to a whole bunch of worthless sites that are irrelevant or have low PageRank. Think how easy it would be for a search engine to spot this.
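To see why such a page is easy to spot, here's a toy sketch (not Google's actual algorithm, and the site names are hypothetical): once a crawler has built a link graph, reciprocal pairs fall out of a simple set lookup.

```python
def find_reciprocal_pairs(links):
    """links: a set of (source, target) pairs from a crawl.
    Returns each reciprocal pair once, ordered alphabetically."""
    pairs = set()
    for source, target in links:
        # A pair is reciprocal if the reverse edge also exists;
        # 'source < target' keeps only one copy of each pair.
        if (target, source) in links and source < target:
            pairs.add((source, target))
    return pairs

# Hypothetical link graph from a small crawl
link_graph = {
    ("sitea.example", "siteb.example"),
    ("siteb.example", "sitea.example"),  # reciprocal with the line above
    ("sitea.example", "news.example"),   # one-way editorial link
}

print(find_reciprocal_pairs(link_graph))
# → {('sitea.example', 'siteb.example')}
```

A whole "links page" of entries that all show up in a result like this stands out immediately in crawl data, which is exactly the point the guidelines are making.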

Google puts this a whole lot better than I can (in their Guidelines for Webmasters) when they say "make pages primarily for users, not for search engines. ... Don't participate in link schemes designed to increase your site's ranking or PageRank."

They expand on this when defining link schemes.

Two better alternatives to reciprocal linking are link bait and submitting your site to RSS aggregators. Don't go rushing off to build hundreds of reciprocal links to other sites. Rather, spend your energy building your site into a great resource that people will want to visit (and link to).

By the way, to report a site that is using link spam you can use the Google tool: http://goo.gl/linkspam. If you can use this tool, then your competitors can use it on you too.
