An Overview of Seo Technology And Its Unreliability


This past weekend I had an epiphany: a revelation of the first order that showed how many in the public have been duped once again by the SEO technology gurus. Now the unreliability of SEO technology would be made obvious to these eyes. Being a newbie to this article-writing business on the Internet, every article I could find on enabling SEO for articles and websites was read and implemented immediately. So, in spite of not knowing the difference between C.E.O. and SEO, my very first article at Factoidz scored a "high" rating. Now another "sacred cow" of technology, that of SEO reliability, would be slaughtered. The image of SEO reliability was going to come down, at least in my mind.

Elated, this writer took Friday night off to go to church. The following Saturday morning, while checking to see how all was going, sudden shock settled in. Several of the articles that had previously rated a "high" were now "low", and some that were "medium" were now "high". And without human intervention of any kind. Something smelled rotten in Denmark.

More research was done, and questions about SEO rankings were posted on website forums. No one came back with the same or even a similar answer. "Strive for a 5-7% score." "No, go for 3%." "Don't get discouraged." "It'll be alright in a day or two." "Re-do the articles." "Forget keyword density!"

Attempts To Recover and Restore SEO Compatibility

After the keyword density on some "test" articles had been changed in order to increase the density factor, some came back after just 3 seconds of twirling by the almighty SEO analyzer. Something was definitely wrong, as it usually took up to 2 or 3 days of twirling. It had stopped spinning its analytical mind in seconds. This was the height of unreliability.

The Epiphany

An article written about, of all things, Vick's VapoRub received the greatest all-time page views of any of my articles, yet the SEO "monster" gave it a "medium" score. It had been written with pure SEO strategy in mind. Are the human reader and the English language being sacrificed because some weird little spider-like software robot is making decisions? Is the fabled SEO technology actually an unreliable deficiency with a tremendous spin machine in full gear? Should the Internet gurus repent for their cover-up of SEO technology's unreliability?

The three freebie text-analyzing sites, Keyword Analyzer, Live Key-word Analysis and Textalyser, certainly are a help, but be it known that, in this writer's limited experience, all three have come back with different values and even different word counts. Perhaps not greatly different, but there is a difference.

Let’s look at just a couple of the myths perpetrated by the SEO gurus and the Internet powers that be.

* Only links from relevant sites matter. Truth – not that relevant links aren’t valuable, but they are not required for ranking well in the search engines. Those will come as the site grows in popularity. A link is just a link to the SEO monster.

* Another hoax: “My site will get banned unless I use 100% unique content!” More poppycock. Truth – try submitting the same article to an article bank directory and see if its ranking goes any further south.

* The Big Google Monster will get you and penalize you! This was the same stuff we were handed about Microsoft and its invincibility. And in what direction is Microsoft headed today? It’s going south.

Google is an algorithm-based system, always will be, and that’s all. It relies on the writer’s content, the writer’s links and the writer’s text to determine rankings. It’s a system that can be worked to generate as much traffic as one wants. This in turn depends on how many sites one feels like building and indexing, and on how much web presence is desired. (1)

The Solution:

* One needs to gain a working knowledge of SEO technology, yes. But don’t get crazy with it.

* There is nothing like content. There, I said it – the “C” word! CONTENT! A writer’s fame and reputation are built by content, not algorithms.

* Once one knows the basics of keyword targeting (and what to target), online optimization and choosing a niche, it’s easier – a lot easier – than being in bondage to any software tyranny.

* Just keep building content and get links.

* And one more thing: “follow the inner voice.” Follow your gut feeling about your writing and about your site. “Follow the Force, Luke Skywalker.”

Keyword density is basically how often your selected keyword, or keyword phrase, appears on a web page relative to its length. It is calculated by taking the total number of keyword occurrences on a page and dividing that by the total number of words on the page. I use a 3% density for myself, but let’s use 5% for illustrative purposes.

For example, if you have 100 words on your web page (not including HTML code or any other code) and you use your keyword five times throughout the page, your keyword density is 5%. (2)

Here’s What The Numbers Look Like:

5 (total keyword usage) divided by 100 (total words) = .05

0.05 x 100 = 5%
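The arithmetic above is simple enough to sketch in a few lines of Python. This is only an illustrative helper (the function name and the naive whitespace tokenization are my own assumptions), and the tokenization choice is exactly why the freebie analyzer tools mentioned earlier can come back with different word counts and different values:

```python
def keyword_density(text, keyword):
    """Percent of words in `text` that match `keyword` (case-insensitive)."""
    words = text.lower().split()          # naive whitespace tokenization
    if not words:
        return 0.0                        # avoid dividing by zero on an empty page
    hits = words.count(keyword.lower())   # total keyword usage
    return hits / len(words) * 100        # (keywords / total words) x 100

# A 100-word "page" that uses the keyword five times:
page = " ".join(["filler"] * 95 + ["VapoRub"] * 5)
print(keyword_density(page, "vaporub"))   # prints 5.0
```

Note that this sketch only counts single-word keywords; a keyword *phrase* would need substring or n-gram counting, which is another place where the various analyzers diverge.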

Summary and This Writer’s Conclusion

This is not rocket science. SEO technology has a degree of unreliability, and one cannot refute the figures. There are literally millions of websites and articles on the web with very high rankings, and many of their owners and authors are academics, professionals or otherwise plain folk who don’t know the first thing about SEO technology. Yet many of them are making millions of dollars from their sales or the proliferation of their articles through Google and other search engines such as Bing, Yahoo and Alta Vista. Write the best content one can and match it as far as possible with the SEO guidelines, but go with your inner feelings – “trust your gut!”
