The Seven Deadly Sins of SEO: #7 “Avoid Black Hat Techniques”

They appear every so often on internet marketing forums: people claiming to have discovered a loophole or foolproof “black hat” search engine optimization technique. Their technique, available for a price, will propel your website to the top of the search engine listings – and of course they guarantee you’ll never get caught.

Now, think about it. While we’d all like to believe that there are methods that can get us to number one in Google with no effort whatsoever, it just isn’t true. Google is huge, and it’s smart. There’s no denying that those employing “black hat” techniques (a phrase used to describe methods that go against Google’s, or other search engines’, terms of service) may experience success at first, but it won’t be long term. Not ever. In fact, they’ll be lucky if it works for a few days.

Let’s say these people, these forum peddlers, really had discovered a flawless technique to guarantee themselves top of the pile picks in search engine results. Do you think they’d be selling their method for a couple of bucks on forums? No, of course not. If their method really worked, they’d be creating small affiliate websites in every profitable niche, working their SEO black hat magic and sitting back to watch the profits roll in. Furthermore, the more they publicize their method, the more likely it is that Google will discover it – so why would they risk it?

They wouldn’t, because these methods don’t exist. Avoid them. Don’t waste money on something that is doomed to fail – whether that’s the cost of buying the method or the time spent building it into a website.

The Seven Deadly Sins of SEO: #6 “Title Stacking”

When it comes to search engine optimization, one of the most useful tools in a web developer’s arsenal is the <title> tag within HTML code. Unlike articles, which must be written around keywords (never an easy job), the <title> tag is a section of code you can fill with your keywords – without having to worry about context, readability and all the other things an article needs. The extra bonus is that you can give your main page a <title> tag full of keywords, and keywords are often hard to work into a simple “welcome to this website” page.

The usefulness of the <title> tag is also one of its major problems. The tag is so powerful, so influential and so easy to use that those employing shady black hat SEO techniques quickly learned how to manipulate it. They discovered that by using more than one set of <title> and </title> tags in the HTML code for a web page, they could fit in many more keywords – and thus climb the search rankings. Using multiple sets of <title> tags is, understandably, known as title stacking.

Get caught doing it by the search engines, and you’ll be dropped from the search results quicker than you can say ‘jackrabbit’. It might work for a while, but the overall quality and reliability of your site will soon be called into question – because you will get caught. Use one set of <title> tags only, and keep the keywords relevant to your site.
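
For clarity, here is roughly what the correct approach looks like: a single, descriptive <title> tag in the page’s <head>. The business name and keywords below are made up purely for illustration.

<head>
  <!-- One <title> per page, with keywords that actually describe the site -->
  <title>Smith's Plumbing - Emergency Plumber in Austin, TX</title>
</head>

One tag, one page, keywords that match what the page is really about – that is all a search engine wants to see.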

The Seven Deadly Sins of SEO: #5 “Hosting Viruses, Malware or Other Nasties”

This may seem obvious; no search engine is going to rank you well in its results if its bots discover spyware, malware, viruses or any other kind of internet nasties within your website. In fact, if a bot does discover such content, your site will most likely be removed and blacklisted for good.

So that’s simple – and most of you won’t even be considering hosting that kind of content anyway, so there’s nothing to worry about, right? Not necessarily. Many sites are subject to hacking, which leads to them being infected with the nasties that search engines (and internet users in general, for that matter) hate so much. Even sites with strong security can be hacked and infected without the owner’s knowledge. So you could be merrily promoting your site, working on its content and ensuring your SEO is tip-top, unaware that your site is infected and only a few steps away from being blacklisted forevermore.

There are a few things you can do to prevent this. The first is obvious, but crucial: visit your site regularly with your antivirus software running, and check that everything seems okay. Secondly, you can get a good idea of what other people think of your site by installing a Firefox add-on called “Web of Trust”. It displays a ring in one of three colors in the browser toolbar: green means the website is ‘safe’, orange means ‘doubtful’ and red means ‘avoid this site’. The ratings are user generated, so installing the add-on lets you check that no one is reporting problems with your site.

The Seven Deadly Sins of SEO: #4 “Linking To Bad Sites”

Have you ever heard the phrase ‘falling in with a bad crowd’? Well, if you link to websites that search engines consider ‘bad’, that’s the search engine optimization equivalent of falling in with a bad crowd. While your website may not be intrinsically ‘bad’ in itself, if you promote (by linking) sites that violate the terms and conditions of major search engines, you’ll be tarred with the same brush. While it’s unlikely your site will be completely blacklisted, you may see a sharp fall in rankings position – or even be removed from the search rankings altogether.

This, of course, raises the question: how do I know what a ‘bad’ site is? After all, if someone links to you, you’re probably going to want to do the decent thing and return the favor. That’s what so much of website building, networking and promotion is all about – right? So how can you be sure you’re not destroying your own search engine chances by linking to a site the search engines consider bad?

It’s tricky, but the basic answer is to trust your gut. How does the website look? Does it look professionally designed and properly maintained? Is the content unique, or does it all sound familiar? Is the writing any good?

On a more technical level, you can check the site’s PageRank and its standing with Alexa. This should give you a good sense of the website’s general reputation, and whether or not it’s the kind of crowd you want to be associating with. Also familiarize yourself with Google’s terms of service, and scan the site for any obvious violations. If it passes, feel free to post a link back.

The Seven Deadly Sins of SEO: #3 “Duplicate Content”

Among those well versed in internet marketing, duplicate content is something of a sticky issue. The heart of the problem is what actually constitutes duplicate content: some internet marketers insist that anything previously published on any other website qualifies, while others say it only matters when the same text is repeated on the same website.

The exact definition is unclear, and the search engines are not particularly forthcoming on the issue. However, if you are found to be using duplicate content on your website and a search engine does take issue with it, you can forget about getting a good ranking with that search engine.

It is more likely – though not certain – that the duplicate content rule applies to text reused within the same site. You should not, for example, create lots of pages that all use the same article with no changes. This is the lesser version of duplicate content, though some marketers still insist that search engines frown on the same article or text being reused from anywhere on the internet, and that this will trigger a duplicate content penalty.

The idea, of course, is to discourage plagiarism and to stop search results from showing the same text over and over again. To be absolutely sure you’re not committing the duplicate content sin, always write and use original content, both on your website and externally. That way – no matter who is right or wrong in the debate – you can be sure you won’t be penalized for it.

The Seven Deadly Sins of SEO: #2 “Cloaking”

All the major search engines compete to make their search results as relevant, up to date and informative as possible. For a search engine to be considered effective, and therefore gain users, it relies on its reputation for providing the right information for any given search term.

And they’re right to guard that reputation. Imagine you were looking for some tips on how to clean your windows, and you used a search engine you’re unfamiliar with. If a result from this new search engine brought you to an adult website instead, you wouldn’t be too happy, would you? In fact, you’d probably dismiss the search engine as useless and never bother to use it again.

That’s why search engines take the practice known as ‘cloaking’ so very seriously. Since their livelihood depends on the search results being accurate and informative, search engines have a duty to their own business ethics – as well as their customers’ – to frown upon cloaking, and they do. Do it, and your website will be removed from the search results and most likely blacklisted.

So what is cloaking? Cloaking is the practice of writing code that shows human visitors to your website something very different from what a search engine bot crawling the site sees. Cloak effectively, and you could indeed disguise your adult site as something as harmless as window cleaning – and you’d benefit from a good ranking. You’d also, unfortunately, pollute the search engine’s results – and they won’t stand for that. So when it comes to cloaking, avoid the practice entirely.
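
To make the sin easier to recognize (not to copy), here is a rough sketch of how cloaking has typically been implemented: an Apache .htaccess rule that checks the visitor’s user agent and silently serves bots a different page. The file name is hypothetical; if you ever find something like this on a site you’ve inherited or bought, remove it.

# DO NOT do this - serving bots a different page than humans is cloaking
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (Googlebot|Bingbot) [NC]
RewriteRule ^ /keyword-stuffed-page-for-bots.html [L]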

The Seven Deadly Sins of SEO: #1 “Hidden Text”

Anyone with a basic understanding of search engine optimization will know that text on a website plays a large part in how you are ranked in search engines. In fact, it could be argued that the textual content of a website is actually the most important thing for search engines.

It’s therefore natural for the cunning mind to wonder if it’s possible to introduce sections of ‘hidden text’. Imagine you’re not the best writer in the world, and you don’t want to have to spend a lot of money outsourcing content creation. Yet at the same time, you’re aware of the importance that search engines place on textual content. So rather than writing poor articles yourself, trying to jam your keywords in, you can simply write the keywords into a spare section of your website – and then change the font color so it is the same, or virtually the same, as the background of the page. Suddenly, your website is stuffed with keywords, but all without having to publish poor articles or ruin the look and feel of your website in general.
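
To make that concrete – and therefore easier to avoid – this is roughly what hidden text looks like in a page’s markup. The keywords here are invented for illustration.

<!-- White text on a white background: invisible to visitors, obvious to search engines. Don't do this. -->
<body style="background-color: #ffffff;">
  <p style="color: #ffffff;">cheap plumber austin emergency plumber best plumber 24 hour plumber</p>
</body>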

This practice goes by a variety of names, including font matching and keyword stuffing. However, whatever you call it, it’s a bad idea.

Why? Well, the reason is obvious – it’s a cheat. Google and the other major search engines place importance on text content because they want their search results to be relevant. Hidden text defeats the point of this, and if you’re caught doing it, your website will be banned from the search engine – for good. So don’t do it.

What Is Robots.txt?

For a search engine to keep its listings up to date and present the most accurate results, it performs an action known as a ‘crawl’. This essentially means sending a ‘bot’ (sometimes known as a ‘spider’) out to crawl the internet. The bot finds new pages, updated pages and pages the search engine did not previously know existed. The end result of the crawl is that the search engine’s index is updated, and all of the pages found on the last crawl are included. It’s simply a method of finding sites on the internet.

However, there may be some instances where you have a website page you do not want included in search engine results. For example, you may be in the process of building a page, and do not want it listed in search engine results until it is completed. In these instances, you need to use a file known as robots.txt to tell a search engine bot to ignore your chosen pages within your website.

Robots.txt is basically a way of telling a search engine “don’t come in here, please”. When a bot finds a robots.txt file, it will ‘read’ it and will usually ignore all the URLs listed within. Pages listed in the file therefore do not appear in search results. It isn’t failsafe: robots.txt is a request for bots to ignore the pages, rather than a complete block, and some “nasty” bots may ignore the file and index everything they find. Most bots, however, will obey it – and when you are ready for a page to be included in a search engine, you simply modify your robots.txt file and remove that page’s URL.
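
As an illustration, a minimal robots.txt might look like this (the directory and file names are just examples):

# Allow all bots, but ask them to skip the unfinished section
User-agent: *
Disallow: /under-construction/
Disallow: /drafts/new-page.html

When the page is ready, delete its Disallow line and the bots will pick the page up on their next crawl.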

Why Is SEO Important?

Think back to the days before the internet, when everyone looked for business listings in the Yellow Pages or a similar hard copy directory. Now, imagine you’re trying to find a plumber. You go to the “P” page, flick through, and only see three plumbers listed.

It’d be a massive advantage for the three plumbers, wouldn’t it? To effectively be all that potential customers see. Well, that’s the power of SEO.

When someone uses a search engine, they type in a set of words to bring up results that are relevant to them. The websites that appear on the results page are ranked in terms of ‘relevancy’, and a website that has correctly used search engine optimization will be judged by Google to be more relevant. So if you have used SEO correctly, your website will appear somewhere near the top of the search engine results page – hopefully in the top three.

Why is that so important? Well, studies have shown that the vast majority of search engine users only click on the websites listed in the top three results of any results page. This is where the comparison with being one of only three businesses in an old-style paper directory like the Yellow Pages comes in. Master SEO, and your website will appear high in the search results, in the slots (one to three) that users actually click. That naturally leads to more people visiting your website, which in turn means more business, more customers and ultimately more money.

What Is SEO?

“SEO” stands for “search engine optimization”, and as the name suggests, it’s all about setting up your website so that search engines can find it easily.

When someone searches for a website, they key a few choice words into their chosen search engine. For example, if they’re looking for a restaurant in New York, they’ll probably type something like “good new york restaurant” or even just “new york restaurant”. SEO is all about making sure your website appears at the top of the search results for a phrase like this.

Obviously the phrase depends on your niche – what you’re selling or what services you offer – but good SEO practice is about distilling your site down to the keywords that people type into Google. In the example above, there would be no point in a great New York restaurant having its website online if it never came up in the search results for “new york restaurant”.

However, with so many websites online, there are literally thousands of pages that could turn up in the results for (again, using the example) “new york restaurant”. SEO is the method used to beat all the other websites and get yourself as high in the search results as possible. Because most users tend to visit a website from the top three search results and completely ignore the rest, the efficacy of your SEO really can make or break the success of your website. And if your business in general depends on your website, then SEO can be the difference between success and failure in the business world.
