How to Use Automated SEO Software Tools for the Best Results
There's a preamble before I assess the pros and cons of automated Search Engine Optimization (SEO) software tools, but I think that it serves to set the scene nicely.
I opt to receive junk e-mail from Internet marketers because, occasionally (very rarely, in fact), I find an item of software that is actually useful.
As a qualified professional software engineer, I'm easily able to distinguish good software from bad. Free software usually doesn't work well and, if it's created by an amateur, may even be detrimental to the computer in some way. I've found that even software that works well is almost always poorly designed, but the fact that it produces results matters more than how it looks or how usable it is, I suppose.
Every day I receive hundreds of junk e-mails. This morning (04 Sep 2010), for example, I received among them the three following items.
One was selling "Keyword Research Software" which seemed only to show search engine results from Google and YouTube when a keyword was typed in. It had nothing at all to do with keyword research. No price was stated for the software. Much of the sales page was taken up with the seller's life story (poor education, "littlle" (sic) brother died, etc.). Then it went on to offer article-spinning software for $47 with this enticement:
"College student plagerizes ...gets thrown out of school...
To Bad He Didn't Have The Best Research Software...
Click Here to Order Now!"
This is just one example of thousands of similar sales pages sloshing around on the Web. It raises the question: what sane person would even consider handing over money to such an unbusinesslike individual, who is barely coherent, can't spell and can't string sentences together properly?
The second junk e-mail I want to mention included this:
"How to use little known, yet brutally effective techniques for hijacking commissions."
"How to hijack all their hard work for massive paydays."
Good grief! What are Internet marketers coming to? This person is encouraging buyers of this "system" to steal from others! Inciting cybercrime! How would honest people, trying to make an honest living with the Internet, feel about this?
Again, this is only one of many similar appeals to cheat or steal from others that I've seen in my long experience.
The third junk e-mail, which is very relevant to the subject, talked about an "Instant Ranking Formula" that claims to get Google to rank any website highly. It stated:
"Get any new site indexed quickly and backlinks to it automatically and this happens simultaneously."
(Let's ignore the all-too-frequent poor English in this case.) Software like this usually scatters backlinks (explained later) around sites belonging to a group of like-minded people who have also paid for the backlinks, regardless of whether the text containing each backlink is relevant to the site it points to.
This is not the purpose intended by search engines for backlinks, and backlinks from sites whose subject matter is unrelated can harm the ranking of both sites. Google can detect this, and assumes that the site owners are trying to "beat the system" by robotic means. Moreover, even relevant outward-bound links, if they are too numerous, can get a site labelled as a "link farm", which renders all the linked sites liable to be penalized. Many purveyors of automated SEO software, keyword research software, etc. unashamedly claim that their software "cheats" the search engines.
Seeing these three examples today spurred me to write this article, with the message that attempting to cheat the search engines with fully automated SEO software may come back to bite you.
Google's 'Quality Guidelines' for websites state: "Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or 'bad neighborhoods' on the web, as your own ranking may be affected adversely by those links."
This is confirmed on Google's web page 'Google-friendly sites', where it says: "Keep in mind that our algorithms can distinguish natural links from unnatural links. Natural links to your site develop as part of the dynamic nature of the web when other sites find your content valuable and think it would be helpful for their visitors. Unnatural links to your site are placed there specifically to make your site look more popular to search engines. ... Only natural links are useful for the indexing and ranking of your site."
Regarding deceptive techniques generally, Google's 'Quality Guidelines' for websites also state: "Webmasters who spend their energies upholding the spirit of the basic principles will provide a much better user experience and subsequently enjoy better ranking than those who spend their time looking for loopholes they can exploit." Enough said!
What are Search Engines looking for, then?
That is the question, rather than "To cheat, or not to cheat?"
Most people who have investigated how to raise a website's position in search engines have, at some time or another, encountered the phrase "Content is King!" This declaration has always been, and always will be, true. The reputation of search engines relies on the usefulness to the visitor of the search results they return. They will, therefore, reward those websites that contain useful content by ranking them highly in their search results. It's that simple.
Because I know more about Google than Yahoo, Bing (previously MSN) and the others, I'll refer to Google's criteria. It is very likely that other search engines use similar criteria, and most of them get their results from Google, anyway.
So, how does Google determine whether or not website content is useful? Because Google uses robots ("spiders") to crawl through websites, it cannot read the content as humans do. Therefore, programmatic methods are employed. The most important of these determine how relevant the content is to the search term used by the visitor. Relevance is measured in several ways.
How does Google determine the relevance of a website to a search term?
1. Keyword Density
Keyword density is the number of times a keyword (phrase) occurs in a web page, expressed as a percentage of the total number of words in the page. Search engines calculate it to determine whether a web page is relevant to a specified keyword (phrase).
Keyword density is less important nowadays as a factor for determining page rank (PR), simply because it is too easily manipulated by website owners. Indeed, too many keywords in a web page is regarded as an attempt to cheat the search engines and can cause it to be penalized. This practice is known as "keyword stuffing".
The optimum keyword density is considered by many SEO experts to be between 1% and 3%. A keyword (phrase) density of 4% or more might be regarded as "search spam".
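To make the calculation concrete, here is a minimal sketch in Python of how keyword density might be computed. The sample sentence and the tokenization are my own simplifications; real search engines certainly use more sophisticated text analysis.

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's density as a percentage of total words.

    A multi-word keyword phrase counts once per occurrence; the
    denominator is the total word count of the page text.
    """
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    occurrences = text.lower().count(keyword.lower())
    return 100.0 * occurrences / len(words)

page = ("Automated SEO tools promise quick results, but automated "
        "tools cannot replace useful content written for readers.")
print(round(keyword_density(page, "automated"), 1))  # 12.5
```

A density that high (two occurrences in a sixteen-word snippet) would be far above the 1%-3% range on a full-length page, which is exactly the sort of figure that invites a "keyword stuffing" penalty.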
2. Latent Semantic Indexing
It's a grand title for a bold attempt to simulate human intelligence. To keep the explanation simple, a search engine that employs Latent Semantic Indexing (LSI) in its algorithms analyzes the entire content of a web page, to discover what the subject matter is about. Once it "knows" this, it indexes the page to appear in results for search terms that are conceptually similar in meaning, even if the actual search term does not appear in the text.
It is, therefore, fruitless to use software simply to create a nonsensical page and inject keywords into it at random. While search algorithms become more "intelligent", it becomes more and more essential to add the human touch to websites, to distinguish them from those cranked out by automation software. By all means, use software tools to automate repetitive mini-tasks that are beyond human capacity, but use them judiciously. There is no substitute for the human brain.
3. External Backlinks
Backlinks are links to a website from other websites. Genuine backlinks are created by people, not robots, who think that a certain website contains useful information from which other web users could benefit. So, they publish a link to it on their website, thus making their own site more useful. This is the way natural backlinking is done. Google rewards websites that have natural backlinks with a higher page rank (PR), because it reckons that, if many other people think that the website is useful enough to link to it, then it must be useful. Fairly logical, really.
Any other kind of backlinking is considered by Google to be unnatural. An example of unnatural backlinking is "reciprocal linking", whereby two websites simply exchange links to each other. This kind of backlinking was easy for Google to detect, and it adjusted its algorithm so that the two links cancelled each other out and neither website gained any benefit. Reciprocal linking became useless and died out.
People then began to develop "three-way linking" in order to try to beat the system. This works perfectly well and is acceptable to Google, provided that it is done in the spirit intended.
For example, Company A programs and builds websites, Company B specializes in graphic design, and Company C is expert in Search Engine Optimization (SEO). None of these companies is a competitor to either of the others; their businesses complement each other in that all three specialisms are related and are needed for creating a successful website. If Company A's site contains a link to Company B's site, Company B's site contains a link to Company C's site, and Company C's site contains a link to Company A's site, all three companies benefit from the link exchange, and Google is happy. The only way to achieve such a natural three-way link exchange is to make a personal, individual approach to related websites.
If, on the other hand, three-way links are created by a robot, or by a human for the sole purpose of trying to increase page rank, Google frowns on the practice and all the sites involved run the risk of being penalized.
Google values links to a website from other websites highly, and usually increases its page rank (PR), provided that:
A) The website containing the link has, itself, a high page rank;
B) The website containing the link is in a related industry;
C) The website containing the link is an "authority" site;
D) The website containing the link is not a "link farm", i.e., its primary purpose must be to provide useful information to visitors, and not to be a repository for dozens of links to other websites.
Unless these four criteria are met, a backlink to a website is unlikely to have any effect on its page rank. Moreover, if criteria B and D are not met, an adverse effect on the website is possible.
There are two fundamental problems with using automatic backlinking software tools:
1) They do not distinguish between relevant and irrelevant websites on which to place the backlinks, thus failing criterion B. They create backlinks only on general blog, forum and social media sites, or on sites in a group that distribute links to each other.
2) Because they are automated, they create the backlinks on the same sites for everybody, thus swelling the number of links on those sites, and failing criterion D.
A third drawback of automatic backlinking software tools is that any Tom, Dick or Harry can achieve exactly the same link results. Thus, the playing field is just as level as it was beforehand, and no advantage (which is questionable on such general sites, anyway) is gained by any user of the software. There is no more benefit for anyone who is prepared to put in some effort to achieve better results than for anyone who is lazy and just wants to click a button.
The expression "link juice" came into vogue some time ago. It's quite apt. If a website is linked to by many "authority" sites of high page rank which have only a few links on them to other sites, those links will be very valuable to that website, and its page rank is very likely to rise. If, however, the sites contain many links to other sites, the "link juice" is diluted and the links are less valuable. Moreover, if the number of links continues to grow and the informational content does not, it's only a matter of time before Google classifies such sites as "link farms", and penalizes them. Once that happens, the websites that they link to are likely to suffer, as Google will consider them to have tried to beat the system unfairly.
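The dilution effect can be illustrated with a toy model. The sketch below assumes the classic published PageRank idea, in which a page's damped score is split equally among its outbound links; it is my own simplification, not Google's actual, far more complex algorithm.

```python
def link_value(page_rank: float, outbound_links: int,
               damping: float = 0.85) -> float:
    """Toy 'link juice' passed by one link: the linking page's
    score, damped, split equally among all its outbound links.
    (A simplification of the classic PageRank formula.)"""
    return damping * page_rank / outbound_links

# An authority page with only a few links passes far more value
# per link than the same page carrying hundreds of links.
focused = link_value(page_rank=6.0, outbound_links=5)
diluted = link_value(page_rank=6.0, outbound_links=300)
print(round(focused, 3), round(diluted, 3))  # 1.02 0.017
```

The same page passes sixty times less value per link once it carries three hundred links instead of five, which is the dilution the "link juice" metaphor describes.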
Every day I receive e-mails from webmasters, asking for a three-way link arrangement with one or other of my websites. I respectfully decline, unless:
A) The domain name of the site they're offering to link to mine is relevant to my site's content;
AND
B) The content of the site they're offering to link to mine is relevant to my site's content;
AND
C) The domain name of the site to which they want my site to link is relevant to my site's content;
AND
D) The content of the site to which they want my site to link is relevant to my site's content;
AND
E) The site they're offering to link to mine is not an obvious "link farm", i.e., its primary purpose must be to provide useful information to visitors, and not to be a repository for dozens of links to other websites.
If only some of these criteria are met, I write back, offering the webmaster pre-paid advertising on my site instead of a link exchange. Sometimes, albeit rarely, they agree to pay for the link. Incidentally, most of my Internet income is derived from advertisers who write to me, offering to pay directly for one-way text links to their website from mine. Again, the websites must be on a related subject; otherwise my website could suffer, and theirs surely would. A prime example is TheGamesForum.com, which makes thousands of dollars annually from only a few advertisements. The reason for its high page rank is not so much its backlinks (which are all natural ones) as its content, which consists of hundreds of pages about games. In short, it is an "authority" website.
What are "authority" websites?
It's always difficult to know what's in Google's "mind", which rumours are true and which are merely rumours. Recently I heard that Google gives more credibility -- and therefore higher page rank (PR) -- to websites with at least 100 pages. Whether or not this is true, it makes sense. A website that contains lots of RELEVANT information about a subject is more likely to be an "authority" site than one that contains only a little information.
Nowadays, when it's so easy to create content-rich websites with specialist software (see below), I believe that effort is well spent by focusing on creating lots of useful content, and just letting the backlinks occur naturally, as, indeed, Google expects them to. After all, "Content is King", isn't it?
Unless the subject is vast, like "games", and you're willing to put a lot of time and effort into creating unique content manually, building an "authority" website quickly is conceivable only if the niche subject is narrow, and software tools that create unique content are used. An example of such a software tool is the "article spinner".
What are "article spinners"?
Many attempts have been made to create unique content automatically, and they have failed to varying degrees. Attempts using article-spinning software are also likely to fail, UNLESS some manual effort is also put in. I'll elaborate on that in a moment.
First, let me explain how article spinners work. If you read through any piece of text word by word, you'll notice that ALMOST EVERY WORD can be substituted by a synonym. If, therefore, you take an original article and replace almost every word with a synonym, you'll end up with two completely different articles that nevertheless say exactly the same thing. Because Google uses a robot to check for uniqueness of content, it sees these two versions of the same article as completely different content. A human, of course, would notice that they are, in fact, two versions of the same article.
That may be fine for two versions, but imagine how long it would take to produce ten versions, or a hundred. It would obviously be uneconomic for a human to do that. For software, however, it presents no problem to create several hundred unique content pages from an original in a few minutes. That's what "spinning" text means.
There are two types of article-spinning tools. The first type uses dictionaries to substitute the synonyms entirely automatically. These will, eventually, cause more harm than good, because they use synonyms randomly, resulting in stilted content at best and nonsense at worst. Human readers give up after having read just a few words.
The other type uses a dictionary to suggest synonyms, leaving it to the human author to select those which make sense in the current context. The best article spinners also allow the user to add more synonyms and alternative phrases to the dictionary for future use.
When you come to spin the articles -- I call them "articles" for want of a better word; they can, of course, be any kind of content -- the software creates different versions of the text, using the synonyms chosen. Because the alternative words and phrases were chosen by a human, every version will make sense and be readable by a human. The more synonymous words and phrases there are, the more different the versions will be.
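The spinning step just described can be sketched in a few lines of Python. The sketch assumes the common "spintax" convention, in which a human has already marked up the text with context-appropriate alternatives as {option1|option2} groups; the template below is an invented example.

```python
import random
import re

SPIN = re.compile(r"\{([^{}]*)\}")

def spin(template: str, rng: random.Random) -> str:
    """Produce one version of a spintax template: each {a|b|c}
    group is replaced by one of its (human-chosen) alternatives."""
    while SPIN.search(template):
        template = SPIN.sub(
            lambda m: rng.choice(m.group(1).split("|")), template, count=1)
    return template

# The alternatives below were picked by a human for this context.
template = ("{Useful|Helpful} content {attracts|earns} "
            "{natural|genuine} backlinks.")
rng = random.Random()
for _ in range(3):
    print(spin(template, rng))
```

With three two-way groups this tiny template already yields up to eight distinct, readable versions; full-scale spinning, as described above, also needs alternative phrases and alternative sentence orderings.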
Benefits and risks of article-spinners...
The last sentence of the previous section contains the recipe for success. You will not get the best out of article-spinning software if you choose only one or two synonyms for only the most significant words, and if you publish similar versions to create content, Google may even detect too many similarities between the versions and penalize your website. Google says: "Duplicate content on a site is not grounds for action on that site unless it appears that the intent of the duplicate content is to be deceptive and manipulate search engine results."
For the greatest success, therefore, and to avoid the "duplicate content" risk, you MUST supply ALL the possible synonyms for EVERY possible word AND phrase. This means finding alternatives, not just for individual words, but also phrases and the words within the phrases, AND also the positioning of the phrases in each sentence, AND also the positioning of the sentences in each paragraph. Are you getting the message?
Similarity is not the only consideration. The resulting spun text must also make good sense, be relevant to the search term, and be interesting and enjoyable for a human reader, because Google can also detect "doorway" pages.
In Google's own words: "Doorway pages are typically large sets of poor-quality pages where each page is optimized for a specific keyword or phrase. In many cases, doorway pages are written to rank for a particular phrase and then funnel users to a single destination. Whether deployed across many domains or established within one domain, doorway pages tend to frustrate users, and are in violation of our Webmaster Guidelines."
If your spun articles reside on your website(s), they MUST NOT, therefore, be "poor-quality", otherwise the domain(s) hosting them may be removed from Google's index. Again in Google's words about doorway pages: "Google's aim is to give our users the most valuable and relevant search results." Therefore your spun text pages MUST be of sufficiently high quality that they can be read sensibly, as if they were the original text, and are both relevant and interesting to human readers.
Depending on the length of the original text, you need to allocate at least a day or two with a good thesaurus just to the actual spinning of the text. The software just makes it easy. That's apart from the initial writing of the content. It needs to be done only once, and the effort is well worth it. The benefit is proportional to the effort. If you do the job properly, you'll have hundreds of web pages on your site with truly unique content. Remember: "Content is King!"
This raises the question: is this cheating the search engines? It depends on the way you look at it. On the one hand, you are using a software tool to replace human effort. On the other, you are creating intelligible, relevant, useful content which is found when surfers enter the particular search term that corresponds to one of your hundreds of pages. When you think about it logically, the visitor is not going to plough through a number of your web pages that all say essentially the same thing; the visitor is going to find and read the ONE page that matches the search term, and that's all. All you have done is provide the precise information that the visitor is looking for. And that's exactly what Google wants to happen, is it not? This is a moot point, and the jury is still out. Is the potential success worth the risk? The decision is yours.
Is article-spinning software enough on its own?
Sorry to disappoint you, but the answer is "No". I'm afraid that you still have some more work to do if you want your website to reach the heights in search engine results. You have, however, done most of the work by creating many unique, content-rich web pages that should convince search engines that yours is an "authority" site. Although content is king, there are other aspects of SEO that are essential in getting the content found in searches.
Domain names, file names and HTML tags must be relevant!
As suggested in the 'External Backlinks' section earlier, the domain name is of paramount importance for a high position in Google search engine results. Also extremely important are the file name and/or folder name, and the HTML 'title', 'description' and 'h1' heading tags. For example, if you search for the phrase "adwords campaign manager", you'll see this phrase in the domain names, folders, titles and descriptions on the first page, despite the fact that competition for this keyword phrase is fierce.
Ideally, each individual web page should be focused on one single keyword (phrase) that you know (from your keyword research!) that people search for. The keyword should appear in the textual content several times (taking care to avoid "keyword stuffing"), and it should also exist in the file name, and in the HTML 'title', 'description' and 'h1' heading tags. The appearance of the keyword in the first paragraph is also very important, as is, possibly, the last paragraph.
Now that you've spun hundreds of content-rich web pages, though, how are you going to change all their file names to match your keyword list, and insert each keyword into the corresponding file's content and HTML 'title', 'description' and 'h1' heading tags? Again, this would be uneconomic for a human to do, but is quick and easy for software to accomplish.
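In outline, such a tool does something like the following. This is only a simplified Python sketch of the idea; the function names and the HTML template are my own inventions for illustration.

```python
import re

def slugify(keyword: str) -> str:
    """Turn a keyword phrase into a search-friendly file name."""
    return re.sub(r"[^a-z0-9]+", "-", keyword.lower()).strip("-") + ".html"

def build_page(keyword: str, body: str) -> tuple[str, str]:
    """Return (file_name, html) with the keyword placed in the file
    name and in the 'title', 'description' and 'h1' tags."""
    html = (
        f"<html><head><title>{keyword}</title>\n"
        f'<meta name="description" content="All about {keyword}.">\n'
        f"</head><body>\n<h1>{keyword}</h1>\n<p>{body}</p>\n</body></html>"
    )
    return slugify(keyword), html

name, page = build_page("adwords campaign manager",
                        "Spun, human-checked content goes here.")
print(name)  # adwords-campaign-manager.html
```

Run over a whole keyword list, a loop like this renames every spun page and seeds each one's tags with its own keyword in seconds, a job that would take a human days.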
When I looked for a software tool to do these tasks, I couldn't find one. Fortunately, however, as a software engineer, I was able to create such a tool myself. It goes under the rather obvious name of "SEO Editor" and is available at the link below.
Are automated SEO software tools worthwhile?
In the old days there was no software to automate search engine optimization tasks. They had to be done manually. Nowadays, however, when there's so much fierce competition for that elusive number one spot in the search engine results, we must use a certain amount of automation, unless we have a lot of time to spare, otherwise we stand almost no chance of being found by the huge number of people who want or need what we have to offer.
The problem with software that does EVERYTHING at the touch of a button, however, is that too many casual "bedroom entrepreneurs" saturate the Web with the same stuff with the same pattern. It's always only a matter of time before Google and, presumably, the other search engines catch on to the latest ruse and change their criteria to combat it. Even while it lasts, the users still end up competing with each other on the same level. Only the level changes.
There is no substitute for the human element, and there probably never will be. It is the human touch that distinguishes one piece of work from another, one website from another. By all means, use automation tools for tedious, repetitive, time-consuming tasks, but give your website that something different, that human touch, that bit of hard work, that element of YOU, to distinguish it from those of the great unwashed masses.
An ideal result can be achieved with a judicious combination of automated SEO tools and human effort. If you are industrious, you will succeed.