Forum Moderators: Robert Charlton & goodroi


Dealing With Consequences of Jagger Update

Your site dropped? Lost rankings? What to do now?

         

reseller

8:25 am on Nov 12, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Hi Folks

Jagger is winding down and life must go on. If Jagger has been kind to your site, congrats. But for those fellow members who lost rankings or whose sites dropped out of the index, it's time to do some thinking and decide what to improve or change on the affected websites. Still, ethical measures are what interest me most.

Some food for thought.

My site was hit by Allegra (2-3 Feb 2005) and lost 75% of its Google referrals, then was hit a second time on 22 July 2005, ending up with only 5-10% of its pre-Allegra Google referrals.
My site is now back to around 50% of pre-Allegra Google referrals and growing... until further notice. I say "until further notice" because who knows what the next update or "everflux" will do to my site!

Before my site returned around 19-22 Sept 2005 (very slowly at the beginning), I had gone through it several times over a period of months and done the following:

- removed duplicate pages. In my case these were several test pages (some dating back to 1997) which I had simply forgotten on the server.

- removed one or two 100% frame pages.

- removed some pre-sell affiliate program pages with content provided entirely by affiliate program vendors.

- removed a few (affiliate referral) outbound links which were on the menu bar of all pages (in other words, sitewide links).

- on resource pages, I reduced the outbound links to fewer than 100.

- made a 301 redirect from non-www to www (thanks to my good Norwich friend Dayo-UK); see the sketch after this list.

- finally filed a reinclusion request in accordance with the guidelines posted on Matt's blog (thanks Mr. Inigo).
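
For fellow members who haven't done the non-www to www redirect yet, here is the general shape of it - a minimal sketch only, assuming an Apache server with mod_rewrite enabled, and with example.com standing in for your own domain:

-------------------------------------------------
# send every non-www request to the www version with a permanent (301) redirect
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
-------------------------------------------------

Check with a server header checker that it really returns a 301 and not a 302.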

Would you be so kind as to tell us how the Jagger update affected your site, and what you intend to do about it?

Thanks!

Newman

1:57 am on Nov 13, 2005 (gmt 0)

10+ Year Member



removed some pre-sell affiliate program pages with content provided entirely by affiliate program vendors.

Please could someone explain this problem...

wanderingmind

7:00 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



Err, reseller, Jagger3 may be over - but has the data spread to Google.com and all the other datacenters? I still see Jagger3 only on two of the 66.* DCs...

reseller

7:15 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Newman

>>"removed some pre-sell affiliate program pages with content provided entirely by affiliate program vendors."

Please could someone explain this problem...<<

Some affiliate vendors make available to their affiliates whole Pre-Sell page(s) to promote the product/service. When/if many affiliates use the same Pre-Sell pages, there is the risk that your own Pre-Sell page ends up being a duplicate of the same Pre-Sell pages on several affiliates' sites, IMO.

And it's, after all, a very lazy way to promote the affiliate program you are on. Better to create your own Pre-Sell page.

reseller

7:23 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



wanderingmind

>>Err, reseller, Jagger3 may be over - but has the data spread to Google.com and all the other datacenters? I still see Jagger3 only on two of the 66.* DCs... <<

Matt said something to the effect that what you see now on [66.102.9.104...] is what you are going to get. In this sense, we know by now what Jagger3 will look like in general.

Having said that, there is of course the chance that the following flux (maybe for a week or two) will bring some changes too. And I do believe that the folks at the plex are still tweaking Jagger3.

However, most of the sites affected at present might still be affected after the flux is over, IMO. And it's better to start thinking of solutions.

dibbern2

7:31 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



reseller,
would this thread include Thin Affiliate site issues if that was the cause of a rank drop? Or should I start a new one?

reseller

7:43 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



dibbern2

>>reseller,
would this thread include Thin Affiliate site issues if that was the cause of a rank drop? Or should I start a new one? <<

You are most welcome to post it here and share your thoughts with us if the drop in rankings happened in connection with the Jagger update.

Thanks.

cleanup

9:21 am on Nov 13, 2005 (gmt 0)

10+ Year Member



Good morning Reseller from a rather dreary wet Madrid!

Strange that you got your site back from a dup penalty on the 22nd, the same day that many of us here lost ours for some (the same?) reason.

Something really weird happened that day. Office party?

As for Jagger, it seems that most of the sites that went AWOL came back at Jagger3, except for the one lost in the Sept 22nd black hole.

What have I learned and changed? Difficult to say, and I think in this respect Google has triumphed: they have us all confused with Jagger!

.) I added 301s to all sites.
.) I changed hosting company for one site that was sharing hosting with another.
.) I have become even more discerning with link exchanges.
.) I have made sure of more direct deep inbound links to targeted pages, to help with Google's post-Jagger "loss of precision".

The first three of the above are arbitrary changes that could be argued for before Jagger.

The last one is really the only one I have altered with Jagger in mind.

reseller

9:38 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Good morning to you too, cleanup, from a cloudy Denmark at less than 10 degrees centigrade (around 50 F) :-)

>>Strange that you got your site back from a dup penalty on the 22nd, the same day that many of us here lost ours for some (the same?) reason.<<

Though we shall never get an "official" statement about what exactly happened around 19-22 Sept 2005, I think it was pre-Jagger preparation of some kind. At the time, folks talked about filters, but I think it was more than just filters.

>>I have made sure of more direct deep inbound links to targeted pages, to help with Google's post-Jagger "loss of precision".<<

Very interesting indeed!

Newman

10:26 am on Nov 13, 2005 (gmt 0)

10+ Year Member



Better to create your own Pre-Sell page.

reseller, do you think that something like this may also cause a site to drop from the Google Top 10?

reseller

10:47 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Newman

>>Better to create your own Pre-Sell page.

reseller, do you think that something like this may also cause a site to drop from the Google Top 10?<<

I really don't know for sure. But one thing I'm very sure of: duplicates are nasty killers, as has been reported in several threads by many kind fellow members.

And those out-of-the-box Pre-Sell pages are in reality nothing other than duplicates. I'm not blaming affiliate program vendors and affiliates for using them; I used them myself.
But the way Google looks at such Pre-Sell pages and treats them as duplicates isn't the same as before. IMO, affiliate program vendors need to learn and adapt very fast, otherwise they might end up hurting their affiliates "in good faith".

Kimkia

11:10 am on Nov 13, 2005 (gmt 0)

10+ Year Member



At reseller's invitation, I'm submitting the following post (also posted on the latest Jagger thread; apologies for the duplication):

My site has been at the top of the Google SERPs for very competitive phrases for about three years. It's a high-quality site offering hundreds of pages of original content, but I began seeing a downward trend a few months back: not achieving the rank I would expect for certain new pages, then a slide back on important older pages, particularly the index pages of my main sections. Then, ...overnight... my site hit the toilet for almost everything on September 22.
Thanks to Jagger, I have returned with a vengeance. I'm scoring very high for all those competitive keywords, plus quite a few more, and I'm delighted. I don't have it down to an exact science, but here are my observations of what works for me and what doesn't:

1) Duplication in titles dilutes the importance of all the keywords used, so avoid using your site slogan or similar repetition. Focus clearly and concisely on the content of that particular page.

2) If you are subdividing your site into logical and important sections, then use your home page to point to and describe each section. Have a universal menu on all pages that points to the home page, plus each index page of the separate sections. This establishes the importance that you place on these pages and is echoed in Google, both in PageRank and index placement.

3) Avoid too much cross-linking between "related pages." I'm pretty sure I tripped a filter here, and it works a heck of a lot better if I just point to the appropriate section index instead of cross-linking multiple related pages. It's as if Googlebot goes merrily along, identifying important pages, then hits cross-links to not-so-important pages and has a hissy fit because it just doesn't understand you any more. Unfortunately, when Googlebot doesn't "get it," you don't get indexed!

4) Avoid keyword stuffing. I was guilty of overusing keywords, putting my site slogan on each page, plus a keyword-rich description of the page topic, before actually reaching the topic itself, which also included keywords. I'm a professional magazine writer, and this isn't a natural way for me to write, but it had become second nature when writing for my site. I was, in effect, leaving a breadcrumb trail all over for Googlebot to follow... and, in the end, Googlebot said screw the breadcrumbs - all I want is the meat and potatoes. (My theory is that scrapers use so many breadcrumbs that they might as well make stuffing - so Googlebot has lost its appetite for breadcrumbs in general.)

5) Tighten up meta descriptions to reflect the content of the page only - no slogans or overused keywords here either.

6) Check your site for canonical problems and hijackers. If you find www and non-www issues, throw up a 301 redirect immediately and check WebmasterWorld for progress in this area. If you find hijackers, either use the URL Removal tool as described elsewhere on WebmasterWorld, or report them as spammers, whichever applies. I had to do both (special thanks to Bear for alerting me to canonical issues, and to Dayo and Reseller for keeping me up to date with the latest). I'm still wrestling with both of these, but they are, at least, under some kind of control now.

7) The only reciprocal linking you should do is genuine, heartfelt recommendations of one site for another. Forget the "you rub my back, I'll rub yours" links. If you are a worthwhile content site you will get natural links. Augment that by distributing worthwhile content articles with your bio and link included in the deal.

8) Be proactive if you find spammers in your niche appearing in the Google index. As reseller keeps reminding us, report them. If the spammers are using AdSense, report them to AdSense as well. On one occasion, a report that I made to both AdSense and Google user support resulted in the demise of about 10 different spam domains. These guys were appearing in the index spoofing my domain with page titles like "MyDomain specific page...url moved" then a description that said "url moved, please visit spamsite.homepage.htm" complete with fake 404s. Grrrrr....

9) Examine the successful competition with a microscope. Much as you dislike them - and I sure do dislike my top competitor - you can learn something from them! My competitor's site puts up hand-selected links with brief descriptions as if they belong to the site itself, then describes Amazon books to fluff out the rest of the page and, from September 22 to Jagger3, managed to beat me consistently in the index. It is still doing far too well, getting too many visitors but not much more than 1 page view per visitor (and what does that say, I ask you?). Regardless, its success in Google, I think, is because it never cross-links individual pages (because there is nothing of substance to cross-link to) and therefore the site remains focused on section index links - where you find more of the same sh...stuff. I learned and copied her narrow navigation technique, and it works. Point is... you can learn from whatever your competitors are doing, successful or otherwise.

10) Browse WebmasterWorld, and go a little further than you normally do. You'll be surprised what you will pick up. Next on my agenda is banning bad bots via htaccess, and WebmasterWorld is likely the source of the code that I will use (a rough sketch of the idea follows this list).

11) Realize that evolution on the net is inevitable and survival of the fittest applies. You can learn the skills you need on this forum, but you will also need to constantly hone those skills, and learn new ones, as Google continues its updates and the internet continues to grow.
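
As for the bad-bot banning in point 10, what I have in mind is roughly the following htaccess shape - a sketch only, and the user-agent names below are placeholders, not a researched blacklist:

-------------------------------------------------
# refuse requests from user-agents on our bad-bot list (names are examples only)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} ^BadBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} ^EvilScraper [NC]
RewriteRule .* - [F,L]
-------------------------------------------------

The [F] flag answers with 403 Forbidden instead of serving the page.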

Gimp

11:12 am on Nov 13, 2005 (gmt 0)

10+ Year Member



I have been looking at my problem. It seems to be duplicate content. I have many hotel pages, all of which share a basic format with some changed details. Those pages can't be changed to satisfy some algo.
So I am moving them to a throwaway domain. I will set up doorways in my main site pointing to the new pages on the throwaway domain. The doorways will have content and refer people to the throwaway domain for details. The throwaway domain will get penalized.

This will also be done for catalog pages. For example, a catalog of similar products has a standard format with only some changed details. So once again one gets hit with duplicate content. So they will be moved to a Google throwaway domain and doorways made in the main site. That domain will be optimized for other search engines.

We tripped over this accidentally when we noticed that a new site with no more than 20 pages of content, all of which referred back to a main site, started ranking well in Google while the main site stayed where it had dropped to in June. To make the main site rank, we will now move anything in a standard format to a new throwaway domain.

Any comments on this? Are there any better solutions?

reseller

11:16 am on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Kimkia

Very kind of you to post on this thread your valuable tips and observations. Much appreciated.

cleanup

11:19 am on Nov 13, 2005 (gmt 0)

10+ Year Member




" hit the toilet for almost everything on September 22"

OK, but why do you think that was?

and what did you do to get it back?

was it fixed by a 301 redirect?

when did you place it, and how long did it take before your return?

Thanks.

Gimp

11:24 am on Nov 13, 2005 (gmt 0)

10+ Year Member



In order to avoid excessive cross linking, does it make sense to use a rel="nofollow" tag on internal pages when you want to link to another internal page?

Kimkia

11:26 am on Nov 13, 2005 (gmt 0)

10+ Year Member



Just reading what has been said so far in this overdue thread - and yes, I concur...I tried dynamic affiliate pages once and will never do it again.

Back in the spring, I added a dynamic affiliate catalog, giving it its own category within my own site and inserting the code as per instructions... having NO clue that it would amount to over 1000 separate product pages, effectively turning my content-based site into an affiliate duplicate site, with my own unique content pages now relegated to a minor proportion of my own site.

BIG mistake on my part.

Fortunately, as I said, I had placed this monster in its own directory... so I used the Google URL Removal tool to remove the entire thing.

Depending on which database I look at for a site:mysite.com search, Google is still counting my site as having 1000-plus pages when, in fact, about 350 is more correct. I suspect that it's going to be a while before I get this monkey off my back.
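
One note on the tool: if memory serves, it wants the content blocked before it will process a removal, so a robots.txt rule on the directory comes first - roughly like this, with /catalog/ standing in for wherever the affiliate pages actually live:

-------------------------------------------------
# keep Googlebot out of the (example) affiliate catalog directory
User-agent: Googlebot
Disallow: /catalog/
-------------------------------------------------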

Also forgot to mention: check your site for inadvertent duplication - I still had old htm pages up long after I converted to shtml, resulting in a well-deserved duplication penalty on those pages.
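
If anyone else has old .htm twins of converted pages lying around, a single blanket htaccess rule - assuming every old .htm URL really does have an .shtml counterpart - would 301 them all across:

-------------------------------------------------
# permanently redirect any .htm request to its .shtml counterpart
RewriteEngine On
RewriteRule ^(.*)\.htm$ /$1.shtml [R=301,L]
-------------------------------------------------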

zoltan

11:49 am on Nov 13, 2005 (gmt 0)

10+ Year Member



I am wondering if this is caused by Jagger.
We have a PR6 homepage that is in the top 6-7 results for 2-3 competitive terms. The homepage is crawled every day by Googlebot, but the internal pages no longer get crawled as frequently. Until September 2005, Googlebot crawled about 10,000-15,000 pages every day; now this number has dropped to 500-600 pages/day. What can be the cause of this drop?
I should mention that our position is quite strong; on the Jagger3 DC(s) we even gained 1-3 positions.

Any clue as to why Googlebot's crawling frequency has been reduced that much?

Kimkia

11:55 am on Nov 13, 2005 (gmt 0)

10+ Year Member



cleanup,

yes, after everything hit the dumper late September, the first thing I did was come here and, after reading and posting a bit, became aware of the canonical problem.

There was some improvement shortly after installing the htaccess redirect, which in my case also required a specific redirect from mysite.com/index.htm to mysite.com, because of a stupid error I had made: linking to mysite.com/index.htm on some pages out of sheer ignorance.

Google is still listing mysite.com/index.htm on some inurl: searches that I do, much to my chagrin. However, I do believe that the basic 301 redirect from non-www to www was worthwhile and contributed to my rise from the ashes. How much so? Your guess is as good as mine. Like Dayo, I think there are residual canonical effects that are difficult to shake.
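
For the index.htm part, something along these lines is the sort of rule that handles it - a sketch, with mysite.com standing in for the real domain; the THE_REQUEST condition keeps Apache from looping when DirectoryIndex serves index.htm for the root URL:

-------------------------------------------------
# 301 direct requests for /index.htm to the root URL;
# the condition matches only real client requests, not DirectoryIndex subrequests
RewriteEngine On
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.htm\ HTTP
RewriteRule ^index\.htm$ http://www.mysite.com/ [R=301,L]
-------------------------------------------------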

I'm off to bed now, as it's dawn in Canada and some sleep is in order. Will check back with you all later. G'night!

glengara

12:04 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



FWIW, I'm becoming increasingly wary of placing AS on "actively" promoted sites; I suspect it may well be used to "prove" intent, and its placement may well restrict a site's usual "room for maneuver".

Gimp

12:16 pm on Nov 13, 2005 (gmt 0)

10+ Year Member



glengara,

Please explain your concern a little more. I am not that smart in this stuff.

glengara

12:51 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



IMO G is seeing AdSense as an "intent indicator", so any doubt as to the possibly innocent intent behind, say, a cross-linked domain network is dispelled.

Basically, I suspect the "rules" may be being applied more strictly to AS publishers....

coldfused

2:50 pm on Nov 13, 2005 (gmt 0)

10+ Year Member



Does anyone know if Google sees your pages as duplicates if your meta description is the same for each page? Each page basically has my company name and info, but they all have the same description. When I do a site:mydomain.com search, it only shows about two or three of my pages and omits the rest as similar pages. When I repeat the search with the omitted results included, it shows them all.

Ankhenaton

3:26 pm on Nov 13, 2005 (gmt 0)



Hmm, we started to interlink different language versions between domains .. absolutely useful for humans ... These links are then obviously reciprocal ...

This is between two PR6 sites.

Just deleted one direction, which obviously makes the site less useful ...

If that makes a difference, then maybe a triple wrapper with cloaking might help.. but then you could be considered to be cheating too. Algorithms will never work .. and shouldn't be trusted to control nearly all of the world's internet economy. PageRank only works if everyone plays honest.. simple game theory ... Maynard Smith 1973, etc ... trust ranking won't work either .. I think G is gonna run us through the whole AI history with the same ultimately unsatisfying results .. compared to humans :\

reseller

3:48 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



Gimp

>>In order to avoid excessive cross linking, does it make sense to use a rel="nofollow" tag on internal pages when you want to link to another internal page? <<

I guess you wouldn't wish to do so - unless you want Googlebot not to crawl those internal pages, or Google not to give them any credit when ranking your site!

Let's take a look at what Google says about the rel="nofollow" attribute:

[google.com...]

-------------------------------------------------
14. How do I tell Googlebot not to crawl a single outgoing link on a page?

Meta tags can exclude all outgoing links on a page, but you can also instruct Googlebot not to crawl individual links by adding rel="nofollow" to a hyperlink. When Google sees the attribute rel="nofollow" on hyperlinks, those links won't get any credit when we rank websites in our search results. For example a link,

<a href="http://www.example.com/">This is a great link!</a>

could be replaced with

<a href="http://www.example.com/" rel="nofollow">I can't vouch for this link</a>.
---------------------------------------------

reseller

4:11 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member Top Contributors Of The Month



coldfused

>>Does anyone know if Google sees your pages as duplicates if your meta description is the same for each page? Each page basically has my company name and info, but they all have the same description. When I do a site:mydomain.com search, it only shows about two or three of my pages and omits the rest as similar pages. When I repeat the search with the omitted results included, it shows them all.<<

You might wish to view the posts at this page of the thread: Serious analysis of Duplicate content penalties

[webmasterworld.com...]
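
And just to illustrate the general direction with hand-made examples (not anyone's actual pages): the usual advice is to give every page its own description instead of repeating the company blurb.

-------------------------------------------------
<!-- the same boilerplate on every page invites the "similar pages" filter: -->
<meta name="description" content="Example Widgets Inc. - quality widgets since 1995">

<!-- better: a description specific to each page's actual content: -->
<meta name="description" content="How to choose the right widget size for a small home workshop">
-------------------------------------------------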

Gimp

4:14 pm on Nov 13, 2005 (gmt 0)

10+ Year Member



Thank you Reseller but you did not answer the question.

An internal page may be linked by 30 other internal pages because it is important or contains basic information. Such a page may be a comment page for a news service.

One may want that page indexed by the search engine. But the search engine may say that 30 links to it is excessive linking.

The question is: should one use the rel="nofollow" attribute on 29 of the links to prevent an excessive-internal-linking penalty?

Going the next step, suppose one has a standard related-links table that is used on 100 pages? Or a link to a subscription page on each of the pages in a news service? Excessive internal linking? rel="nofollow" to stop an excessive linking penalty?

It is nice to read the Google guidelines. But this board is read by people who have read them and obviously interpreted them wrongly.

When the author writes something that the people do not understand, it is not that the people are stupid.

stever

4:45 pm on Nov 13, 2005 (gmt 0)

WebmasterWorld Senior Member 10+ Year Member



>>an excessive internal linking penalty

Would you maybe (pretty please) like to provide some evidence for such a theory before the (Webmaster)world and his mate go off into multi-thread paroxysms of rage about how Google is broken because of this penalty?

coldfused

4:49 pm on Nov 13, 2005 (gmt 0)

10+ Year Member



Thanks for the link reseller.

Gimp

5:11 pm on Nov 13, 2005 (gmt 0)

10+ Year Member



Search for "excessive linking penalty" in Google and look at the questions about too much internal linking possibly triggering a spam penalty.

I am looking for answers. If one thinks it does exist, what does one do to avoid it? If one thinks it does not exist, why not?
