Sunday, March 1, 2009

Drupal SEO mistakes
Deprecated - Drupal core · Drupal 4.7.x
toma - June 7, 2006 - 14:28

I want to start this subject so people can know what to do to make Drupal SEO friendly.

Drupal doesn't add a description meta tag to nodes by default; see an example on some popular Drupal sites.

Do a search on Google:

Google indexes 186,000 pages, but the Google robot ignores 185,000 pages and takes only one page.

Google ignores pages with the same description.

Solution :

Use the Node (key)words module.

If you are using the Internationalization module, be careful: you can be ignored or even banned by Google for duplicate content. Why?

The Internationalization module creates duplicate URLs for your site.

For example, if you are using the blog module and you have two languages, French and English,

you will have two URLs for each blog.
Imagine if you have 1,000 members: you will have 1,000 more pages of duplicate content, with the same title and the same description.

If someone has experience with SEO, please post it here.
patchak - June 7, 2006 - 14:48

In that situation it would be wise to build a meta description for user profiles from their own description field, if they have one... You could also block some languages in robots.txt and choose only one to let the SEs in; that way there's no duplication...

One should also check one's aliases and www/non-www 301 configurations to make sure no duplicate content is presented to the engines. An example is the comment form: when you click "add a comment" you go to another page with the post and the comment form. Well, Google crawled those pages, indexed them, and associated those pages with the original one. Oops... before anything went wrong with crazy G, I blocked Google from comment/reply/*. Now the pages are not duplicated anymore...

It is probable that Google won't do anything to your site and will recognize that page as a submit form associated with the other. But I preferred not to take the chance, having had a lot of problems myself with Google and unintentional duplicate content.
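The www/non-www 301 setup mentioned above can be done in Apache's .htaccess; a minimal sketch, assuming mod_rewrite is enabled and using example.com as a placeholder for your own domain:

```apache
# Redirect non-www requests to the www host with a permanent (301)
# redirect, so engines only ever see one canonical hostname.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```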
Better to make a robot.txt
toma - June 7, 2006 - 15:14

It's better to make a robots.txt for Drupal, to be SEO friendly and to prevent robots from indexing files that create duplicate content.
I am using pathauto, and in the admin configuration I am also creating feeds.

Google indexes these pages and shows them as "unknown file format" in Google search.

There is also Drupal pagination:

Google ignores those pages because they all have the same title and description, taking the data from the default index page.
Pager URLs are ignored as well.

The solution is to change the pathauto configuration.

If you have a vocabulary "news" and a taxonomy term "sports", the default pathauto configuration gives a path that returns "page not found"; it's better to use a pattern that includes the vocabulary.

All sports pages will then be listed under one path.
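As an illustration of the kind of pattern meant here (the token names are guesses at the 4.7-era pathauto tokens and may differ in your version):

```text
Category path setting:  [vocab]/[catpath]
Result:                 /news/sports   (all "sports" posts in the "news" vocabulary)
```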

You can use the taxonomy theme module

and theme every taxonomy term with a different title, description, content and blocks.

We can build a standard robots.txt solution. If someone has had some good SEO experience, post it here.
Thank you all.
category module
patchak - June 7, 2006 - 19:01

I would recommend the category module, so that all your vocabs and terms (containers and categories) have proper metas.

Also, SEO wise, I would recommend using path to create simple aliases to the most important pages. Keywords in the URL, but not too much.

Also make sure to keep control of the aliases, and check out if drupal is not leaking out two urls for the same page. It is not supposed to, but I still check it out sometimes.

Also, for SEO, having menus on the right, and thus after the text in the spider's eyes, might help you a bit to give the spider a good idea of what your page is about.

Overall I have to say that once all these precautions are taken care of (especially meta keywords, 301 redirection and robots.txt), Drupal actually delivers predictable and normal rankings, like any HTML page would rank, IMO. But you really need the metas; if all your descriptions are the same, you're not in the game SEO-wise.
what do you mean
klb - July 6, 2006 - 11:43

what do you mean writing

drupal is not leaking out two urls for the same page

? if i use specific urls for some content types then i should protect drupal from making them accessible via node/$node->nid?
Hello, you wrote "Also for
mcneelycorp - September 20, 2006 - 02:42


You wrote "Also for SEO, having menus to the right, thus after the text in the spiders eyes". I just wanted to comment that you can do this with CSS. You can have your column show up for the user on the left side, but through CSS you can actually load it after the content in the source, which is what the spider reads. This puts the main content in front of the spider first and the normal stuff last.

You can find loads of css examples online.
Menus to the right
guitarmiami - September 23, 2006 - 18:48

Also for SEO, having menus to the right, thus after the text in the spiders eyes, might help you a bit to give a good idea of what your page is about to the spider.

I hear about this technique a lot, but I'm not sure that it helps. Google is aware that sometimes menus appear on the right and sometimes on the left. I would imagine that they have a way to determine what is page navigation and what is content through good semantic markup. They are probably using things like header and paragraph tags to identify obvious content. An unordered list of links to other pages on the same domain is obviously navigation.

On a general business site, I think it is best to have navigation on the left. Right-side menus seem less intuitive for the average non-technical user. On blogs and tech sites, users are more familiar with menus possibly being on the right.

Just my opinion.

doesn't matter
sepeck - September 23, 2006 - 19:18

If you lay out your DIVs correctly, you can position your menus on the left and still have your content print before your left-side menus. It's how I did my site.

-Steven Peck
Test site, always start with a test site.
Drupal Best Practices Guide -|- Black Mountain
jiangxijay - October 6, 2007 - 00:34

Linearization of content is more important than whether or not the CSS places a div on the left or right side of the screen.
At my blog crawled by google
ahmaddani - April 6, 2007 - 20:24

Google crawled about a thousand pages of my blog, but in reality it only has about one hundred pages.

I think this is the reason why Google does that.

I've blocked all of my pages from the Google robot crawler with robots.txt, but nothing is different.

Ahmad Daniyal
globalwarming awareness2007
guitarmiami - April 14, 2007 - 21:58

I took a look at your blog and it doesn't have a robots.txt. The robots.txt file has to go in the root of your site, i.e.,

From just a quick look at your indexed pages I would add at least the following rules:

User-agent: *
Disallow: /aggregator/
Disallow: /comment/reply
Disallow: /node/add
Disallow: /archive/
Disallow: /tracker/
Disallow: /user/
Disallow: /*?*
Disallow: /*feed

Then wait a few weeks and check Google again. The number of indexed pages should go down to something more accurate.
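If you want to sanity-check rules like these before deploying them, Python's standard-library robots.txt parser can simulate a crawler locally (example.com and the sample paths are made up; note the stdlib parser does not understand the Googlebot-style * wildcards, so the /*?* and /*feed lines are left out here):

```python
from urllib.robotparser import RobotFileParser

# A subset of the rules above, minus the wildcard lines the stdlib
# parser can't interpret.
RULES = """\
User-agent: *
Disallow: /aggregator/
Disallow: /comment/reply
Disallow: /node/add
Disallow: /tracker/
Disallow: /user/
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# Paths under a Disallow prefix are blocked; everything else is allowed.
print(parser.can_fetch("Googlebot", "http://example.com/comment/reply/12"))  # False
print(parser.can_fetch("Googlebot", "http://example.com/my-first-post"))     # True
```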
Would this be a good rule of thumb for all sites?
tknospdr - April 15, 2007 - 14:19

Or just his specific one, because it's a blog?

Drupal robots.txt
guitarmiami - April 21, 2007 - 03:12

Those were just specific to his site. I would probably add those to any Drupal site though, plus others...

I will try to find time to write a longer Drupal SEO tutorial and include information about robots.txt. The robots.txt file should expand depending on what modules you have installed.
That would be handy
tknospdr - April 21, 2007 - 15:24

I can't wait.

I just deleted it
ahmaddani - June 20, 2007 - 06:22

I just deleted my robots.txt because my robots.txt module got an error, like the problem I explained before.

Ahmad Daniyal
ngadutrafik 2007|Ahmad Daniyal's Blog|Indie Publisher's Blog
this should work. i'll try a
Siime - August 26, 2007 - 19:49

this should work. i'll try a few things and update you guys later...
SEO legends
funana - July 6, 2006 - 12:37

I was thinking "what about a SEO thread" and stumbled upon this thread here. Thank you Toma!

I work as an SEO and I would recommend the following steps:

- Install the Pathauto and Nodewords modules
- Pathauto: change the separators to - or + because Google treats _ as a hard separator, while - or + act as word separators.

Example: a URL containing fashion_shop ranks only for the keyword "fashionshop", while a URL containing fashion-shop ranks for "fashion", "shop", "fashion shop" and "fashion-shop".

You can add more than just the title to the URLs in the node path settings, and separate that with an _, like "[nid]_[title]". Just remember: keep it short and relevant!
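The hyphen-separated aliases described above are easy to generate; a minimal sketch of the idea in Python (make_slug is a made-up helper for illustration, not part of pathauto):

```python
import re

def make_slug(title, sep="-"):
    """Join the words of a title with '-' (a soft separator to Google)
    rather than '_' (a hard separator that glues the words together)."""
    return sep.join(re.findall(r"[a-z0-9]+", title.lower()))

print(make_slug("Fashion Shop"))        # fashion-shop
print(make_slug("10 Drupal SEO tips"))  # 10-drupal-seo-tips
```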

If you use Freetagging (you really should), then go to Settings/Meta Tags (the nodewords mod) and

- Don't specify global Keywords, they are useless (often non-contextual)
- Select your Vocabularies
- Select "Use the teaser of the page if the meta description is not set"
- Deselect all "Tags to show on edit form" to prevent non-contextual keywords, which are poison for google

The META keywords tag is one of the most discussed SEO issues. Google doesn't use it anymore, but it still checks whether the keywords appear in the body too. If you don't use them, Google gives you a 100% keyword match, which is good. I prefer something like 5-7 keywords that are all included in the visible body text, which can easily be produced with this mod. There are still some search engines that like keywords, and they could be used in the future for tag clouds and new applications (think of Technorati, which completely reversed the Google art of indexing via tags (keywords) and pings).

The "Duplicate Content Legend":

Yes, there is duplicate content detection at Google & Co., and it does well at detecting session variables and other waste in the URLs. But if there are 5 different links to one unique page, it is no problem! You will see that Google just takes the one that is more trusted, whatever that means. Most of the time it shows the first crawled link in its SERPs.
There is absolutely NO "Google penalisation" for pages that have good rewrite rules and a lot of different links to one page. The one thing you should not do is set up several subdomains that point to one page.

The "Google ignore pages with same description Legend":

No, it just hides the results. If you click on "show all results" you can see them all.
This is very logical, because Google doesn't want to show a page with 10 URLs whose descriptions are exactly the same. Everybody would hate the SERPs otherwise ;-)
You may not like it, but it's very useful.

Just install the nodewords mod, OR (and this can easily be done with every Drupal version) simply delete the description META tag completely. Google will then show text excerpts from your body text, but you may rank a little bit worse in direct competition with another page with the same topic and relevance.

Although I've been doing SEO since '98 I am not almighty, and yes, it's right: "Two SEOs, three opinions!",

so please ask, discuss and share your thoughts and suggestions with us!
Thanks ;]
klb - July 6, 2006 - 13:42

Thanks ;]
nice post
dman - July 6, 2006 - 13:52

I hate SEO (I much prefer accurate metadata) but I know it's a necessary evil.
Thanks for the insights

funana - July 6, 2006 - 15:29

No, it's not evil. I don't do spamming, cloaking, etc. SEO has to be natural and permanent. Content is still king... If you have good unique content and an optimized Drupal site, everything is fine. No tricks, no nothing.
nothing personal
dman - July 6, 2006 - 16:30

Not Evil as a concept, just annoying because you have to do it to make up for all the shysters out there.

Like locking your door when you go out.

You wouldn't have to if everyone behaved, and life would be nicer.

Once upon a time Yahoo was the best search engine around, because every link was added by hand. If you built it, they would come.
Now that everyone is trying to hit their keywords, you've got to bend over backwards and know the ins and outs just to rank.
Ah well, as long as it's making somebody some money. I'm just sick of yelling at clients to get them to stop putting every keyword they can think of on every page.

funana - July 6, 2006 - 23:52

I'm just sick of yelling at clients to get them to stop putting every keyword they can think of in every page.

I'm with you. That is bad practice on your clients' part and has nothing to do with good SEO.

Yahoo is a good search engine right now, not because of their catalog, but mostly because of their very good algorithms. The concept of hand-submitted static links (remember: only one link per domain allowed!) has failed.

Building an optimized page today means thinking more semantically and topic-based: RSS, tagging, pinging, filtering, mashups, all that stuff.
And readable URLs and RSS feeds are nothing bad, are they?

I speak of organic SEO, the original meaning of the word "optimize". The new web concepts conform with organic SEO, except for the massive use of Ajax, Flash and JS (which is only a matter of time and will; Google seems to have been able to follow JavaScript links for a while now).
Drupal can help you build an optimized page (I am totally satisfied); you just have to produce good content and install two or three easy modules. In my opinion it's important for webmasters to know that you can make some mistakes with Drupal that may have a drastic impact on your ranking.

But I absolutely agree with you, Dan. It would be a great world if everybody were honest and respectful. Better to beg than to steal.
Hello, what did you mean by
mcneelycorp - September 20, 2006 - 02:35


what did you mean by "Deselect all "Tags to show on edit form" to prevent non-contextual keywords, which are poison for google"?

I am using the most updated version of this from CVS (on Drupal 4.7+). On that dropdown (tags to show on edit form) I have several choices,

where description, geourl, and keywords are selected...

Also, I wanted to ask: besides the US site, my site uses two other subdomains, UK and AU. I want Google and the others to see them as dedicated to those countries only, but I am seeing UK Google users finding AU pages and vice versa. I do use Google Sitemaps and submit all three sitemaps, one per country. Each country's pages should have unique content, even though they all pretty much have the same menu categories. And I should mention that each has its own database, in case you are wondering.

I will be adding Germany and France soon, so I imagine I will need the i18n module for that?

Since you have SEO experience, could you give me some tips on my site? That goes for you too, toma. I am using all the suggested modules already. You mentioned in one of the comments on this node that you should have content between links. Take a look at one of the product pages on my site and click on the stores tab (if there are more than 4 stores you will see this tab). The images in each store column are linked to the store offer. Each link has an identifier so I can test what people are clicking on. After reading what you wrote, I wonder if I should take those links out?

Thanks for your time.
Keywords & Linktracking
funana - September 20, 2006 - 12:20

I mean that you should not include keywords in the META keywords that are not contained in the body of the page. Which means, to me, that I disable the possibility for users to define their own keywords.
If you put in the categories or the tags, which are displayed in the body, there will be no problems!
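The rule of thumb above ("no meta keyword that isn't in the body") is easy to check mechanically; a rough sketch in Python (the function name is invented, and simple word matching stands in for whatever the engines actually do):

```python
import re

def noncontextual_keywords(meta_keywords, body_text):
    """Return the meta keywords that never appear in the visible body
    text, i.e. the 'non-contextual' keywords this thread warns against."""
    body_words = set(re.findall(r"[a-z0-9]+", body_text.lower()))
    return [kw for kw in meta_keywords if kw.lower() not in body_words]

print(noncontextual_keywords(["drupal", "seo", "widgets"],
                             "Some Drupal SEO tips and tricks"))  # ['widgets']
```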

I visited your site and it looks very good and well optimized. I wouldn't use the keywords so excessively, but you can change that at any time.
You display your description and keywords below the page content, and that is exactly what you should do if none of the keywords or the description appear in your page content already.

The store link issue: I see. You use a tracker script for outbound URLs. Hmmm. Although it is okay to use this, I would recommend using the original target URLs.
The starting page should not contain too many outbound links, but pages like the store overview may rank a lot higher if you use the external URLs.

Outbound link tracking: did you give it a try? I use this (paid) service and I am very satisfied. If you use the adsense module you will be able to see where people clicked your AdSense ads (sorted by ad format and number of clicks), both in the Drupal admin and in your MyBlogLog stats. It shows outbound clicks in real time and has a nice real-time referrer overview. You can try it for free; there is a demo.

Subdomains for countries: no. Google just looks at the language of the pages. If it's in English, the pages will show up in all English Google versions: AU, GB, USA...
Although i18n is a little bit tricky to configure, it will help you serve the right language to your visitors. You can define which content language should be used by default, by browser, or manually. You can use i18n to manage both content language and navigation
language, and you can define whether the content should depend on the navigation language or not.
Thanks for the feedback, did
mcneelycorp - September 20, 2006 - 14:56

Thanks for the feedback,

Did you notice on the product page that those tabs are all within the same page, so all the content from each tab is actually on one page/URL? I can move the order of the tabs around if that will help offset so many links being concentrated in the main body content. I purposely set this up for SEO. I notice many other price-comparison websites with tabs give each tab its own page, and in some cases use the same title and keywords for each tab. That is a huge SEO mistake for them, but good for me ;-)

I noticed that this site has been picking up Google traffic very quickly compared to the old version of the site. The rate of pages getting onto first-page search results is climbing, and so is the number of visits. Google used to bring about 75 visits per day, and since I changed to this new Drupal & product page layout, I am getting about 150 a day. Most are product landing pages, which is a good thing.

I know it is a good thing to have keywords and a description in the metas and to repeat them somewhere near the bottom, which you see I am doing. But I am wondering whether Google will interpret this as spamming. Those keywords and that description near the bottom are there for the reader and the bots, but I am just wondering if I am pushing Google's buttons.

Thanks for the tracking tip. It would be nice to see who is linking to you, which the program you referenced does offer. I think this should be a module in itself, if it doesn't exist already, with a block or so on the frontend.

I heard so many bad things about the i18n module that I was hesitant to work with it. But I do need it for the non-English sites, so I will have to use it. I am also looking for a solution for directing organic traffic to the right page: if someone comes from search and lands on the main site, I want to offer the user a small popup giving them the choice of which country/currency they want to browse in. I notice a lot of traffic coming in from countries with the wrong landing page. This is an evil that needs to be corrected. Do you have any other solutions in mind for this case?

Thanks for the feedback.
SEO optimization script/module
MikeyLikesIt - October 13, 2006 - 11:28

I use pathauto to generate my URLs, so they are pretty nicely done to match the content of the page. But you pointed out that there's a difference between using - and using _ to separate the words in the title. Mine use _ and I think I may keep it that way, because some of the page titles are long, the URL is built off the page title, and I wouldn't want the engine to index the entire title as one phrase.

I was thinking about making a script that searched a set of taxonomies for terms that are actually phrases, and did string replacements on the path so that those phrases would be picked up as complete phrases by the search engines that crawl the site.

Does this url modification also encourage the engine to index those phrases throughout the body text, or is that done automatically?

Does having a word or phrase in the url more than once hurt SEO? Sometime page titles contain the name of a parent section.

Do you know if wrapping span tags around the keyword phrases within the body of the page would also help the search engines index the page with the proper phrases? If so, the glossary module or something similar would be helpful.


I'm pretty new to actually using drupal so if anyone can help point out tips for building a script like this, that would be great. Please let me know how much interest there would be in a module for that.

guitarmiami - October 13, 2006 - 19:00

Do you know if wrapping span tags around the keyword phrases within the body of the page would also help the search engines index the page with the proper phrases?

I don't think that <span> will help you much. Semantic markup will help you though. Drupal is already pretty good about that.
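To illustrate what semantic markup means here (the content is made up), a real heading and paragraph tell the engine far more than a styled span:

```html
<!-- Headings and paragraphs signal what the page is about; -->
<!-- a bold-styled <span> carries no such meaning. -->
<h1>Drupal SEO tips</h1>
<p>Put your key phrases in real headings and body text,
   not in <span class="big-bold">styled spans</span>.</p>
```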

Dashes in the URLs are apparently slightly better than underscores, but don't change existing URLs because that will hurt your rankings.

If you make an SEO module, make sure that it doesn't change existing URLs for pages and that it doesn't create a lot of identical pages with different URLs.

It won't hurt to have a word mentioned twice in a URL, but I wouldn't do it often. Google doesn't care as much about the keywords in the URL... but having those keywords there benefits you when the URL is used as the link text on inbound links.
Can taxonomy terms and vocabularies adhere to SEO?
deadlyminds - July 8, 2007 - 21:27

Is it possible to create title/description tags for taxonomy terms?


Can I display these as nodes?

I've just migrated my old site to Drupal and saved all the old HTML URLs as they are...

When I build my vocabulary to match my old site structure, I find that I lose out with the SEs when displaying articles under taxonomy...
The url alias table should preserve old versions
iliphil - July 7, 2006 - 08:45

I'm very new to Drupal but my 2 cents on this.

The url_alias table should preserve old aliases, so a page can still be accessed using the node value or an old alias. A 301 Moved Permanently header should be sent to redirect to the latest (unique) URL when old values are used. It's an extra query: once you have the url_alias.src and the old alias's $pid, you query on that:
SELECT dst FROM drupal.url_alias WHERE src = 'node/2' AND pid > $pid ORDER BY pid DESC LIMIT 0,1
and 301-redirect to the latest dst.

It doesn't look like it works like that now.
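The lookup iliphil describes can be sketched outside SQL as well; a Python stand-in for the url_alias table (the rows and the function name are made up for illustration):

```python
def latest_alias(rows, src):
    """rows: (pid, src, dst) tuples, like Drupal's url_alias table.
    Return the dst with the highest pid for this src (the alias a
    301 redirect should point at), or None if there is no alias."""
    matches = [(pid, dst) for pid, s, dst in rows if s == src]
    return max(matches)[1] if matches else None

rows = [(1, "node/2", "blog/old-title"),
        (5, "node/2", "blog/new-title"),
        (3, "node/7", "about-us")]
print(latest_alias(rows, "node/2"))  # blog/new-title
print(latest_alias(rows, "node/9"))  # None
```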
Great Idea, I will try it
ahashim - August 3, 2006 - 10:05

Great Idea, I will try it today.

Ahmed Hashim
Google and SEO
johnchalekson - August 17, 2006 - 09:12

I have made several "test" sites, and I fear that Google has banned the content, because it only indexes the index page and none of the other pages. Have I made a terrible mistake and produced a lot of duplicate content that could be considered a 'link farm'? Is any of the damage I did reversible?

I now have robots.txt disallowing all of the nodes. I'm not sure if that will reinstate my status, but I do have the gsitemap module and pathauto running. Is it bad to have gsitemap running and making a lot of submissions on each cron?

thanks for the advice..
I check out your websites
toma - August 19, 2006 - 12:03

I checked out your websites and they don't seem to have been banned; they appear in Google search. But Google does index only your home index page. Maybe your sites are new. Try to use the Node (key)words module.

That way you will have a description for every node; it's better for SEO. If you use the Google sitemap module, try not to let Google download the sitemap on every cron.

Use the URL list module.

Try to make a good title for your index pages and use more words; for that, use the nodewords module. It's very important to get all your pages indexed.

Good luck
google downloading sitemap on every cron
mcneelycorp - September 20, 2006 - 02:15


I wanted to confirm what you wrote above: "if you use google sitemap module try to leave Google download the sitemap not on every cron". So you are saying not to have Google download your new gsitemap on every cron? I am asking because I run cron every 2 hours due to the amount of content added hourly; mostly this is done so that the search portion of the cron doesn't time out.

Saying that, I noticed that google is coming to the new gsitemap url after every cron. The page is getting large now and I noticed today that google got a timeout.

So, a few questions:
1) Why should Google not come on every cron (for me, every two hours)?
2) I think there is a way to tell Google to just visit every day or so (pinging, maybe), but is there a way to control that on Google's end, or is that on Drupal's end?
3) I know Google allows a zipped format; on my old site I used it because of the file sizes. Has anyone set up gsitemap to produce zipped files instead of plain XML?

thanks for your feedback on this thread
You are right
funana - August 20, 2006 - 12:36

The example you give is definitely a "link farm like" page.
You have to add a lot more text between the links to prevent Google from detecting "spam techniques".

Follow the tips described above: URL rewriting, pathauto, nodewords...
Yet another drupal
toma - August 26, 2006 - 15:30

Yet another set of Drupal mistakes:

1 - Admins can't control the title (between the title tags) of the taxonomy and vocabulary pages of their website.

The title module does a good job, as we can change the title of every node, but not of taxonomies and vocabularies, and Google gives priority to the title and description metatags.

2 - If you just install the node keywords module, you need to edit your taxonomy and vocabulary terms to add descriptions; that will give a good result in the next Google update...

3 - In the Forum module, admins can't change any metatags (for forums and containers), even with the nodewords module. That's not good for SEO...

Get your site on Yahoo Site Explorer

Use the URL list module.

It's important for Yahoo to have a URL list.

You need a Yahoo mail account in order to use Site Explorer; for Yahoo it's like a Google sitemap is for Google.

Go to Yahoo Site Explorer

and add your site. You may add a feed URL
and the urllist.
Authenticate your site with Yahoo, following the Yahoo steps.

Does anyone have a solution to 1 and 2?

Thank you
Toma, you mention to get you
mcneelycorp - September 20, 2006 - 02:39


You mention getting your site on Yahoo Site Explorer. I just wanted to share with everyone that Yahoo accepts the gsitemap URL and indexes it correctly. I submitted my gsitemap URL to them and Yahoo is rapidly indexing each node. I was a bit surprised to find this to be so, but if you look at the XML from the gsitemap URL, you will see the elements are generic, which allows Yahoo to pick them up.

So you kill two birds with one stone! Now if MSN would get on the wagon, my smile would be a bit bigger.
scb - September 11, 2006 - 16:08

I haven't tested it, but the Google Sitemap Module should be very helpful in SEO.
Excellent Info!
eagereyes - September 20, 2006 - 02:29

@ toma: To have total
patchak - September 20, 2006 - 21:12

@ toma:

To have total control over taxonomy pages you need the category module...
Thanks for the info...
v1nce - September 23, 2006 - 19:28

Thanks for the info... Subscribing to follow.
Get your site to google Blogsearch
toma - September 24, 2006 - 12:36

In order to get listed on Blog Search, you just need to ping a blog ping service.
Go there and open an account, and test your RSS feed.

I tested mine; it is a valid syndication feed.

Try sugree's patch for the ping module; it works great for me.

Once the patch is installed, go to the ping module administration page and add the blog ping server URL.

Read on and get more weblogs to ping.

Drupaldemo now 5.0
Would appreciate SEO tips and comments....
Sentiment - September 28, 2006 - 21:33

I have recently rolled out a Drupal site, Home Security Guru.
I installed and configured nodewords, and am using clean URLs.

Would any of the SEO guys here care to have a quick look at my website and give me some useful tips?
Hello, I personally think
mcneelycorp - September 29, 2006 - 02:32


I personally think there are too many ads throughout the site. The layout seems fine as a user. I would cut your title length down; it serves as a set of words for search engines to pick up on and also for readers. Your readers should come first, so tailoring the title for them is important, but your readers won't find your page in the search results if the engines don't pick up on the important words. So it is a catch-22.

On your index page, I would find less competitive keywords to use, to at least help drive some organic traffic. There are plenty of resources explaining why you should not pick the top keywords when first starting a site. The description for the index seems fine.

For subcategories,
you should use keywords related just to that page, and be sure to throw in a description.

One other thing I would recommend is only showing one search box. On some pages there are two.

I hope that is helpful.
guitarmiami - September 29, 2006 - 03:06

I agree about the titles. No need to add the site slogan to the <title>.
Site slogan
guitarmiami - September 29, 2006 - 16:40

Sorry, my mistake: from a quick glance at the site I thought you were using Home Security Guru as the site name and "The web's largest Home Security resource" as the site slogan, and that you had then modified the site to also print the site slogan in every page's title.

You could split your long title up, where the first part is the site name and the second is the site slogan. That would give you a similar front page title, but a shorter title on the interior pages of the site.
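Concretely, the split being suggested might look like this (titles taken from the site under discussion, used illustratively):

```text
Front page:      Home Security Guru | The web's largest Home Security resource
Interior pages:  <page title> | Home Security Guru
```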
Thanks, advice implemented!
Sentiment - September 30, 2006 - 21:58

I really appreciate your help.
If you have any more suggestions please let me know :-)
Sentiment - September 30, 2006 - 22:00

I have already implemented your comment about the title length, and will most definitely look into the other issues.
Thanks once again!
Very Nice! - September 29, 2006 - 09:06

Sorry this is off-topic, but you have a very nice site. Looks very professional. I had to bookmark it.

interested in this topic
enky - September 29, 2006 - 17:09


Indian Electronics Community
Sentiment - September 30, 2006 - 18:16

I had a great designer... Eitan Isaacson of Inapplicable Solutions.
Thank you all for your tips!
Sentiment - September 30, 2006 - 18:14

I will most definitely implement them very soon...

Looks like this post hasn't
treadLightly - October 10, 2006 - 18:09

Looks like this post hasn't seen any action in a couple of weeks. Time to resurrect it!

Just wondering if any of the great SEO people here would be kind enough to take a look at my site. I am using pathauto and clean URLs. I haven't added Nodewords yet, but I'll be doing that soon.

I'd really appreciate any tips or help pointing out things I may have done wrong.

First you need to install
toma - October 10, 2006 - 22:36

First you need to install the nodewords module; it's the most important. Don't forget to set a description for the front page on the nodewords settings page. You may need to make a list of keywords related to your site. As I can see, it's about cars; try to use more text in your content, especially at the top and left of your page, and use <h1><h2><h3> for titles, etc.
Try to change your head title from
Recent Posts | Auto Industry, Automotive News and Bookmarks
to
Auto Industry, Automotive News and Bookmarks

Search for "auto industry" on Google, and try to target a specific keyword like "auto industry" plus your city.

Try to have some keyword density in your pages; repeat some words in the footer, for example.
Teasers as description not working with CCK
toma - October 13, 2006 - 21:10

If you are using CCK and you check "Use the teaser of the page if the meta description is not set." in the nodewords admin settings, it will not work and you will not get any description. Read on.

Thanks to Robrecht Jacques, who will look into correcting this issue.
marcoBauli - December 13, 2006 - 09:02

this has been fixed some time ago by Robrecht ;)
301 module - global redirect
dman - October 17, 2006 - 10:16

As a cross-post to all the folks who have subscribed to this issue:
I recently suggested a clean fix to the 'problem' of content appearing to be duplicated when indexed by search engines: return a 301 redirect from the system (node/123) paths to your desired canonical alias.

Turns out this had already been thought of and implemented last year, as pavlos referred us to the Global Redirect module

SEO isn't my thing, but those who care may like to evaluate it :)

How to troubleshoot Drupal |
For duplicate content i make
toma - October 17, 2006 - 11:52

For duplicate content I made a robots.txt and disallowed some links; see my robots.txt:

User-agent: *
Crawl-delay: 10
Disallow: /aggregator/
Disallow: /tracker/
Disallow: /comment/reply/
Disallow: /node/add/
Disallow: /user/
Disallow: /u/
Disallow: /privatemsg/
Disallow: /mail/
Disallow: /files/
Disallow: /search/
Disallow: /book/print/
Disallow: /*?page=
Disallow: /*?from=
Disallow: /node/
Disallow: /comment/
Disallow: /taxonomy/
Disallow: /archive/

a 301 redirect is a good solution, thanks
Duplicate Content
gonefishing - November 30, 2006 - 21:20

Duplicate content will cause major damage to your site. Take a look at the following 3 URLs:

3 unique URLs that have the exact same content cause SE indexing and ranking problems.

More duplicate content:

I'm not a mod_rewrite expert, but I have managed to solve all the problems listed above.

redirect "node" and "node/" to your home page (substitute your own domain for www.example.com):

RewriteRule ^node/?$ http://www.example.com/ [R=301,L]

remove the trailing slash from any url:

RewriteRule ^(.*[^/])/$ /$1 [R=301,L]

I would also suggest downloading Drupal 5 and taking a look at its robots.txt file.
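Putting the rules discussed above together, a consolidated duplicate-content section of .htaccess might look like the sketch below. This is a hedged example, not any poster's exact file: www.example.com is a placeholder domain, and it assumes Drupal's stock .htaccess with mod_rewrite enabled.

```apache
RewriteEngine On

# Canonicalize non-www to www with a 301, so only one hostname gets indexed
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

# Send /node and /node/ to the front page (Drupal serves the same content there)
RewriteRule ^node/?$ http://www.example.com/ [R=301,L]

# Strip trailing slashes so each page has a single canonical URL
RewriteRule ^(.*[^/])/$ /$1 [R=301,L]
```

Order matters here: the hostname canonicalization runs first, so the later rules only ever issue one further redirect.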
Thanks for the above
drubeedoo - February 7, 2007 - 05:12

Thanks for the above .htaccess rewrite rules... I realize the links you provide above would show duplicate content, but are you saying that Drupal inadvertently mixes & matches these links when generating internal page links? Or, are you just concerned with consistent links coming in from the outside? How would Google know about the inconsistencies you mention above, if Drupal builds consistent links? (I hope my question makes sense.)
Log Files
gonefishing - June 9, 2007 - 14:26

I found links of that type by going through the access log files for my site. That means Google knows about them. The very worst inbound links have variables, query strings and session IDs in them.

E.g. (this link is at the bottom of Drupal's home page and it's the first of 80 pages) (I added "&cid=" to the URL and just created 80 new pages on the Drupal site)
I must be missing something obvious...
drubeedoo - June 16, 2007 - 07:20

Thanks for the reply, but aren't log files private, unless you take steps to make them otherwise? Google (or any search engine) should be oblivious to this stuff. Unless some web page includes a hard link to a /node?page address, I just don't see how this is an issue. Why would one set their logs as public for Google to slurp? (I must have a thick skull, because I'm still trying to understand the problem as it pertains to Drupal.)
Just my guess : Google sees
guix - October 5, 2007 - 18:09

Just my guess: Google sees /node/123, for instance, so it thinks "aha, there's a node/ directory, let's crawl it", and it crawls /node/ too.
good post
JohnForsythe - March 29, 2007 - 20:27

This is good advice. I actually just wrote an article about basically the same thing:

My .htaccess is a little bit different, but the idea is the same.

John Forsythe
Need reliable Drupal hosting?
Drupal .htaccess rewrites
guitarmiami - April 21, 2007 - 03:24

That is very good advice...

The only issue with the /node redirect is that I think it blocks login to the site if you have a login form on the front page. You can log in from any page except the front page. One solution would be not to have a login form on the front page, and instead link to a login page.
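A possible workaround, sketched under the assumption that the front-page login block submits its POST back to /node: exempt POST requests from the redirect, so only crawlable GET requests are rewritten. The domain www.example.com is a placeholder.

```apache
# Only redirect GET/HEAD requests for /node; let the login form's POST through
RewriteCond %{REQUEST_METHOD} !^POST$
RewriteRule ^node/?$ http://www.example.com/ [R=301,L]
```

Since search engine crawlers only issue GET and HEAD requests, the duplicate-content protection is unaffected.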
Multisites with shared database
marcoBauli - December 13, 2006 - 09:14

Hello thread, and thank you toma and all for the interesting tips!

I currently set up multiple sites from a single codebase:
these sites share basically everything, nodes as well, but each single site is a subdomain pointing to its respective category (taxonomy term).
So for example uses the "dogs" category as the front page, while uses "penguins".

The "problem" is that all nodes, together with some service links, still appear in the main site (you can see the live site at and a subdomain in action at )

Would this translate into search engine penalties for duplicate content?

Or shouldn't we worry, since every site is on its own domain or subdomain and is thus seen by Google and others as a completely separate site?

Thank you for any comment!

Hi Its not a good idea to
toma - December 13, 2006 - 15:34


It's not a good idea to have a lot of subdomains with the same content. If you want your visitors to remember your URL, as you say for the taxonomy "webhost", you can make a 301 redirect; it will not hurt your ranking. But it's not good to have a lot of subdomains; you can have , which will be better for search engine ranking.

Good luck
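As a sketch of the 301 consolidation toma suggests, assuming Apache with mod_rewrite; the hostnames and the /dogs path below are placeholders, not the poster's actual site:

```apache
# Permanently redirect a category subdomain to a path on the main domain,
# e.g. dogs.example.com/foo -> www.example.com/dogs/foo
RewriteCond %{HTTP_HOST} ^dogs\.example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/dogs/$1 [R=301,L]
```

With a rule like this per subdomain, any link equity the subdomains earned is passed to the main domain instead of being split.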
hi toma, thanks for
marcoBauli - December 14, 2006 - 19:57

Hi toma, thanks for replying!

Actually I did the opposite, to try to fill some smaller niches of content first and not give early visitors the impression of being in a big empty room.

I also needed these niches to be separate sites, so I could customize their content and navigation links (blocks) to avoid users navigating into still-empty areas.

For the moment I 301-redirected a couple of to . As soon as things start moving and content starts coming in, I will redirect back and follow your advice ;)

funana - December 16, 2006 - 17:04

Absolutely correct, toma!
tknospdr - December 14, 2006 - 19:23

Thanks for all the great tips!

I know people have already
GiorgosK - February 7, 2007 - 14:58

I know people have already talked about this, but I just wanted to stress that what toma said about "Google ignore pages with same description" is totally untrue, at least for Google.

It might be true for other search engines, but as far as my sites are concerned, most of them don't have meta keywords, or have exactly the same ones on all pages, and I have not been penalized; quite the contrary, I could not be doing any better.

world experts tag cloud
Keywords or Description?
funana - February 7, 2007 - 17:17

You are right about keywords. Google doesn't care about pages with the same keywords (maybe it doesn't care about keywords at all), which makes sense given that there can be a lot of pages with the same keywords but different content.

You are not right about the META description. As I mentioned before, the pages are in the Google SERPs but they are hidden if you use the same description on all of your pages. Google then usually shows only the main page in the SERPs, and these results are not clicked very often because they don't seem to match the user's search criteria...


SEO Tips For Successful Drupal Sites
Cape Verde News & Community
My Info Collection
I have a site that has
GiorgosK - February 7, 2007 - 20:58

about 15 pages total, and all of them have the exact same META keywords and META description, but most of the pages show up in the Google results and are not hidden as you say; so your theory could not possibly be universal and apply to all cases.

Google mostly cares about inbound links and the actual content, as far as I am concerned.
15 pages
funana - February 8, 2007 - 10:14

Hi Kongeo,

Thanks for your suggestions.

Still, I recommend having unique descriptions.

15 pages is not what we are talking about. We are talking about sites with 10,000+ pages, where this could be an issue.
You might be right that "the theory is not universal", but there are tons of examples of the described Google behaviour.

You are right about incoming links, which make up, let's say, more than 60% of good SEO.


re descriptions; h1 tags for page titles not sitenames n slogans
DocMartin - March 12, 2007 - 12:39

I've had pages with duplicate descriptions appear in Google, and perform well in the SERPs (especially in a couple of forums).

I think the description is one criterion for judging whether pages are similar or the same, but not the only one. Maybe if pages look to have a high degree of similarity elsewhere, having the same meta description can tip the balance so Google treats them as identical.

Re h1 tags: surely these should be used for page titles. Yet in at least Garland they are used for the site title and slogan, which I'd figure is only useful if you are aiming to rank for those alone.
That seems to me a mistake, then, made right in a key theme for Drupal.

I'm not yet sure how to arrange things so that there is an h1 for each page title, yet it doesn't also appear in lists of those titles (as on the front page, if you show titles and teasers from stories).
Quite right
dman - March 12, 2007 - 12:43

I've seen a bunch of themes that make this mistake. I don't care (much) about SEO, but it's semantically wrong to do so.

HTML, since its inception, has designated H1 to be the page title, not the site name.
I write scrapers and converters, and when I find a discrepancy between what the [title]tag[/title] says and what the [h1]visible[/h1] heading has, I prefer the H1, because woefully many (static) sites left the hidden title as something useless or default.

Site name should not be the only thing in the H1 - except perhaps on the home page :)

I've had a similar
bjraines - April 24, 2007 - 02:58

I've had a similar experience with meta tags. Google has indexed all my pages.
Drupal SEO
ThaboGoodDogs - March 30, 2007 - 21:13

It seems like more and more people are wising up to search engine optimization and how their sites rank. Sure, we've all found great results on the first page of Google's search results, but sometimes the best sites are buried pages deep in the rankings. I've put together an article that's a Dummies Guide to Drupal SEO. It has some nice tips for newbie Drupal users and also has links to other great articles that are floating around. Feel free to add comments and suggestions at the bottom of the page.
Google Slump: from #2 to #200 in a week
marcoBauli - April 17, 2007 - 13:33

Howdy all,

I'm running a sports site that until last week was scoring so well on Google, showing in the first 4 positions for specific keywords and in the first 10 for less specific ones... that was just great!

Two weeks ago I used Traffic Blazer to submit it to blogs, directories, local search engines, and all that stuff TB offers. Result? I now rank 200 or worse for the same exact keywords!

At this point I'm not completely sure it's Traffic Blazer's fault... I'm investigating right now. One thing I'm sure of is that I didn't change any relevant SEO things on my site (except for some page title patterns and global keywords), and I am absolutely *not* using any dirty trick that could get Google angry with my site.

Just my experience... 6 months of hard work, only to see my site kicked to the bottom of the SERPs and traffic terribly damaged in a week, for who knows what reason... really too bad <:(

I'm worried, to put it mildly; a comment from anyone SEO-savvy in this thread would be greatly appreciated. The site is visible at this address

Thanks a lot

PS: other minor changes since the slump are:
. copied 20 open-content pages onto my site
. added one more AdSense block in the page header
. added site links in the page footer
. enabled Google to crawl images as well (in Google Webmaster Tools)

The site still shows on the first SERPs of other major search engines, ranks 160,000 on Alexa and still has a PR of 4/10. It seems to be a Google-only thing. Any advice is much appreciated.
bad neighbourhoods?
DocMartin - April 20, 2007 - 12:06

A quick, none-too-authoritative response (hopefully someone will be much more helpful):

Google says it doesn't like links from "bad neighborhoods", sites that aren't related to yours. I wonder about the Traffic Blazer links?
I'd never heard of TB before, but a post at comments on it, based only on basic info, not experience, noting: "You only want the directories that are quality directories. Getting listed in the poorly viewed directories can hurt your rankings."
Maybe you could check your backlinks; Yahoo is more useful for this?

Google can make sites bounce about ("dance"), and sometimes all that's needed is patience. webmasterworld is a rich source of info.
I looked at your site; maybe we can swap links sometime, as Cheung Chau is a place with very little kitesurfing. Maybe it would be useful to have more links from pages directly to articles (blog posts) on your site? From experience with Joomla, I think such links help, rather than just going through intermediate category pages.
CheungChauHK 長洲HK - South China Sea island in Hong Kong.
funana - June 15, 2007 - 14:12

Compliments, very nice site!

To your "SEO problem":

That's exactly why I always warn everybody about using such "SEO" tools.
The bad-neighbourhood theory may be correct.

One thing I noticed is your excessive use of keywords in the metas. That's not a good idea. Let's take two of them as an example: "courses" and "lessons". You have them in your keywords, but they don't appear in the body text. Bad idea!

Although keywords are no longer actively used by SEs, Google checks whether there are keywords in the metas which are not in the body, and then gives you negative points.

My advice: kick the meta keywords completely! Or let them be generated by "Meta Tags" and "Freetagging".
One thing that you probably don't want to hear is that you now have to be very patient. If you want to speed up the process of getting a better ranking again, find sites with a PR above 4/5 which link back to you. Backlinks should be in the body text, not in the footer or navigation, and their link texts should be different from the one you used before. They may link to different pages of your site, not only to the main page.

Good luck!


Thank you so much - this an
smath - August 12, 2007 - 23:10

Thank you so much; this is an extremely informative and useful thread. I'm a little overwhelmed right now with all of this information; it's a long-running discussion!

Right now I'm busy with a site for a client and would like it to rank well, but without employing any dodgy techniques. He has just sent me a mail asking whether it is a good idea to sign up with and my gut feeling is that those sorts of companies are not to be trusted and, furthermore, not necessary. Am I talking out of turn here? Do others agree or disagree?

I'm Drupalizing his existing site and have my version, which isn't quite complete, up at . The other thing I was wondering is this: he has two domains pointing to the main site. Initially was registered, but the main domain should be . Is this a problem, or regarded by the search engines as a spamming technique?

I'm using Pathauto and clean URLs, and have just installed Nodewords based on your advice, and am wondering if I'm making any critical mistakes. I'd appreciate any advice.

Many thanks again for all the input.
Interesting advices. Are - August 26, 2007 - 14:42

Interesting advice. Are there any references that would support the theory?

Drupal Theme Garden
How long before Google/Yahoo Index your site?
vkr11 - May 23, 2007 - 09:28


How long does it take before a brand-new Drupal site is indexed by the Google or Yahoo search engines?

- Victor
FPGA Central
funana - June 15, 2007 - 13:42

That depends on the links that point to your page, and on whether it is a brand-new address (domain) or has been in Google's index before.

Normally it should not take longer than 2-3 weeks. But make sure that you have links pointing to your site from other well-listed sites.


I'm very new to Drupal but
david007 - May 23, 2007 - 10:40

I'm very new to Drupal, but here are my 2 cents on this.

The url_alias table should preserve old aliases, so a page can be accessed using the node value or an old alias. A 301 Moved Permanently header should be sent to redirect to the latest (unique) URL when old values are used. It's an extra query: once you have the url_alias.src and its pid ($pid), you query on that:
SELECT dst FROM drupal.url_alias WHERE src = 'node/2' AND pid > $pid ORDER BY pid DESC LIMIT 0,1
and 301-redirect to the latest dst.
Insurance Center
Active URL alias
Gurpartap Singh - June 5, 2007 - 17:17

We already have a feature request issue at . A screenshot: . Review the code and usability (UI) to get it in ASAP ;-)
omnyx - August 12, 2007 - 23:56

have every search in your web site indexed
toma - August 13, 2007 - 13:03

Read issue here

Fuzzy search module
X-Robots-Tag - controlling Googlebot via HTTP headers
toma - August 13, 2007 - 13:06

Read more

And here

I posted a feature issue for the Meta tags module
i agree
garymoore - August 23, 2007 - 10:15

i agree
interesting analysing guys
Sree - August 26, 2007 - 20:34

Interesting analysis, guys! I'll check this in detail and post my experiences...
Duplicate content, htaccess and multisite
kikko - August 29, 2007 - 17:22

Hi, I am an Italian Drupaler (newbie) and I have read this very useful discussion. Thanks to all.
Particularly for gonefishing's .htaccess rewrite tricks for duplicate content. I tried them successfully on my new site with Drupal 5.1 (it's not ready, I'm just preparing it on the server) and, wow, they work: no more www.mysite.ext/node and no more trailing slash at the end of the URLs.

Everything OK? No. I found a problem later, because I use a Drupal multisite installation with two sites: the main www.mysite.ext and subdomain.mysite.ext.
This rewrite rule in .htaccess (good for a single site)

RewriteRule ^node/?$ http://www.mysite.ext/ [R=301,L]

redirects subdomain.mysite.ext/node to the home page of www.mysite.ext.
Disaster (did I tell you I'm a newbie?).

I searched for something about this problem (rules to prevent duplicate content) in a multisite Drupal installation, but I didn't find anything. So, after finding a similar problem (I don't remember where, sorry), I tried this:

RewriteRule ^node/?$ http://%{HTTP_HOST}/$1 [R=301,L]

and, wow, it works!! No more /node on any site of the installation.
I hope it's useful for other people too, and if someone knows a mistake or a better .htaccess rule for the same problem, please tell us.
La casa di Kikko
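A host-aware variant of both rules can be sketched like this (mysite.ext stays the placeholder domain; an HTTPS site would need https:// instead). Note that `^node/?$` has no capture group, so the `$1` in the rule above is always empty; dropping it makes the intent clearer:

```apache
# Redirect /node and /node/ to the front page of whichever host was requested
RewriteRule ^node/?$ http://%{HTTP_HOST}/ [R=301,L]

# Strip trailing slashes, staying on the same host
RewriteRule ^(.*[^/])/$ http://%{HTTP_HOST}/$1 [R=301,L]
```

Because `%{HTTP_HOST}` is resolved per request, a single shared .htaccess serves every site in the multisite install without hard-coding any one domain.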
Hi i don't really get what
toma - August 29, 2007 - 21:06

Hi, I don't really get what you are trying to restrict; that's done with robots.txt.

Try for multisite, and if you want to have a different robots.txt file for every domain or subdomain you are using.

Use Google Sitemaps.

Google Sitemaps gives you stats about URLs indexed, errors, etc.
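One way to get a per-host robots.txt in a multisite install is an internal rewrite in the shared .htaccess; this is a sketch, and the hostname and file path below are placeholder assumptions:

```apache
# Serve sites/sub.example.com/robots.txt when robots.txt is requested on that host
RewriteCond %{HTTP_HOST} ^sub\.example\.com$ [NC]
RewriteRule ^robots\.txt$ sites/sub.example.com/robots.txt [L]
```

With no [R] flag this is an internal rewrite, so crawlers still see the file at /robots.txt on each host, which is the only location they check.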

Google Proxy Hacking
toma - September 18, 2007 - 17:31

Google Proxy Hacking Protector module

Read more

Help SEO Experts
prakashkadakol - September 27, 2007 - 09:31

Hi SEO experts, I am new to Drupal and currently working with an application. I need to improve its search engine ranking, especially in Google. Please give me some suggestions.
Sree - September 27, 2007 - 10:53


Also check :
also check the seo groups like
Step By Step Video For Drupal SEO
Joe Matthew - September 29, 2007 - 09:04

Here is a step-by-step video on how to rank for a search term using Drupal as your platform. I created it as part of a video series to rank for "listen with your headphones" on a brand-new domain. Within a week of writing this I ranked for it, and as of this writing I still do.



Drupal SEO |CMS Videos
jiangxijay - October 6, 2007 - 02:13

momper - October 8, 2007 - 19:56

NonProfit - October 10, 2007 - 19:47

Hey Kids!
dman - October 11, 2007 - 01:17

Look guys, this thread is long over a year old, and, valuable as it was, it's dead dead dead. Stop 'subscribing'.

Y'all should go over and join if you expect to see anything new.
G'wan. Git!


