Friday, November 23, 2007

VIDEO Q and A: Is our site being penalized by Google?

Dear Kalena...

The problems we are having are mainly with Google. We have 1,000 indexed pages showing for the 'site:' command in Google search. We have already done many things to get our rankings up, but we appear to be penalized. Customers cannot find our indexed pages by title OR content. We suspect sabotage by another company. We have sent emails and faxes to Google and asked them to investigate. They ignore us. We're out of options.

Darren


Kalena's Answer:

Dear Darren

Click here to see my video answer


Need more than advice? Take a Search Engine Marketing course online

Subscribe via: Yahoo Feeds | Feedburner | Technorati | Bloglines


Thursday, November 08, 2007

Q and A: Why have our most popular pages disappeared from Google?

Dear Kalena...

Thanks again for your help a couple of years ago. I need some more advice now and I don't even know where to start troubleshooting. About a week ago I noticed that the bulk of our most popular pages no longer seem to be on Google's radar. I'm talking about pages that used to show up in the top 10 results for typical searches in our industry. Currently, those pages don't show up AT ALL in Google's results, or Google offers a comparatively irrelevant page, like our home page or links page that might happen to have the keywords in question.

This problem is only with Google, not MSN or Yahoo. It probably began during the last 30 days. Other pages come up in Google SERPs just fine. Also, I checked our Google Webmaster Tools and everything looks OK - sitemap downloaded OK, all pages (including the problem ones) indexed. PR for the problem pages is unchanged (lackluster 2-3, but at least not lowered). The only thing I did differently was to use a new sitemap a couple of weeks ago. Any ideas?

Rick

Kalena's Answer:

Dear Rick

I'm getting a lot of similar questions to yours at the moment and I'm convinced it is the result of a major tweak Google has made this month to their PageRank algorithm (not to be confused with the Google Toolbar PageRank green bar). Here's my reasoning:
  • None of your pages show up in Google's Supplemental Index, indicating those pages haven't been removed from Google's main datacenter.

  • Google is currently showing 179 pages from your site as being indexed, whereas Yahoo is actually showing over 300 pages indexed, indicating that Google may be suppressing the value of some of your pages.

  • You didn't tell me the search query that returns the rankings you are talking about, but if you were previously ranking well for those terms and you've not changed the pages, then it's probably an external cause rather than something you did to cause the ranking drops.

Google makes small tweaks to their ranking algorithm on a regular basis. Some of these tweaks add new filters designed to detect and suppress anything Google sees as artificially influencing a page's relevancy. It may be that the new algorithm includes a filter that has picked up something on your pages that Googlebot doesn't like - for example, excessive keyword repetition or duplicate content. Many of your pages have almost identical content to each other, which could have triggered a suppression filter.

Also, you have quite a large number of backward links showing in Yahoo (over 300) but only 8 showing in Google. It may be that Google has decided many of those backlinks are not relevant and has suppressed any influence they previously had on your rankings. The reciprocal link swapping concept you use on your site and the advice you give to potential link partners is quite flawed. It will likely only attract links from very low quality sites, diluting your own site's link popularity as a result. Many of the sites listed on your links page are completely irrelevant to your site. Read my link swapping rant for more info.

Finally, keep in mind that thousands of new pages get added to the Internet every day. Chances are that some of these might be targeting the same keywords and phrases that you are. If those pages are better optimized than yours, yours will naturally be pushed down in the results.

Need more than advice? Take a Search Engine Marketing course online


Thursday, October 25, 2007

Q and A: Has my domain glitch caused permanent de-ranking in Google?

Dear Kalena...

I let my domain www.visaplace.com expire, but I re-registered it within 24 hours. The site went dark in the interim and it took about half a day to come back online. However, about a week or two later, I noticed that Google dropped my rankings for virtually all of my keywords. I am totally invisible online now, whereas I was well ranked before. I was told by my SEO person not to worry because once Google spiders my site again a few times, I will get back up to my original positions.

My SEO person looked at my site and said nothing has changed and I was not blacklisted or anything. He said I should be back up within days to a few weeks. My question is: is this explanation credible? Is there another possible reason why I am de-ranked? I am really concerned.

Thanks so much
Michael

Kalena's Answer:

Dear Michael

Your SEO is right. What's probably happened is that Googlebot tried to index your site during the time the site was down and so dropped some/all of your previously cached pages. This can happen from time to time, especially with hosting outages etc. Obviously, if those pages were previously ranking well for certain search queries, but the pages have temporarily disappeared from Google's data store, those rankings will disappear too.

I see now that Google last cached your page on October 19 so all seems to be well again. I'm not sure how many pages were indexed before the domain problem, but Google shows 79 pages currently indexed.

To check if any site is listed in Google, you can use their Site Status Tool. If your SEO is worth his salt, he will have created a Google Webmaster Tools account for your site and uploaded an XML sitemap to Google Sitemaps on your behalf. This will tell Google how many pages your site has and what the URLs are so Googlebot can index it accurately. If you think Google has dropped some pages, be sure to have your SEO update the XML file and ping Google from within Webmaster Tools when it's uploaded.
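For reference, a bare-bones XML sitemap is just a list of URLs in the sitemaps.org format. Here's a minimal sketch - the URLs and dates are placeholders, not Michael's actual pages:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2007-10-19</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/about.html</loc>
  </url>
</urlset>
```

Only the `loc` element is required for each URL; the others are optional hints to the crawler.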

If you want to keep close tabs on how/when Google indexes your site, ask your SEO to provide login access to your Webmaster Tools account or set one up for yourself.

Need more than advice? Take a Search Engine Marketing course online


Monday, October 15, 2007

Q and A: Why isn't our site coming up on Google for "pages from Australia"?

Dear Kalena...

We have an Australian website and until now, we have been hosted in the US. Two days ago, we moved to an Australian server, but our site is not coming up on the Australian sites yet (even if you do a Google search for Cairns Unlimited). However, we have other domain names redirected to specific pages within our site, and these come up when you search for sites from Australia. The main reason we moved to an Australian web server was to ensure that our site comes up on Google search results even if readers search for sites only from Australia. Any idea what the problem is?

Thanks for your help
Maria

Kalena's Answer:

Dear Maria

You don't say, but I'm assuming you mean your site doesn't come up when searching on Google.com.au and restricting the search to "pages from Australia"?

I've checked and here's the score:

1) Conducting a search for "Cairns Unlimited" using Google.com brings up your site in first place.

2) Conducting a search for "Cairns Unlimited" using Google.com.au selecting results from "the web" brings up your site in first place.

3) Conducting a search for "Cairns Unlimited" using Google.com.au selecting "pages from Australia" doesn't bring up your site in the first 50 matches, but it does bring up links to your site from other sites.

There could be a couple of things influencing this:

a) It's only been a few days since you made the hosting switch. The DNS entries may not have propagated across the net yet or Googlebot may not have picked up the switch yet. Google datacenters may still be storing cached versions of your pages from your old server. You should give it some more time.

b) Your site has a Google Toolbar PageRank of zero, meaning it hasn't built up enough trust-rank yet to be shown for related search queries on Google, unless you search for very specific terms such as your brand. Things might change when your PageRank improves.

c) You may have switched your hosting company from one based in the US to one based in Australia, but are you SURE the server they use to host your site isn't based in the US? We also use an Australian host but they outsource their server rackspace to a larger company in the US.

d) Even if your site is now hosted in Australia, the domain you are promoting is still a dot com domain. Google takes several things into account when determining a site's origin with server location being only one factor. It is unlikely you'll be able to outrank any sites with AU domain extensions in the regional results with a dot com domain.

e) A site with an Australian domain extension always has a better chance of being included in the regionally-specific search results and out-ranking dot com domains. I see that you also own the .com.au version of your domain but Google isn't caching it as they have determined your dot com domain to be your *correct* one. Have you thought about setting your preferred domain to the .com.au version and parking your dot com domain to that one? Or using 301s to point pages on the dot com to the .com.au? You could then update your Sitemap in your Google Webmaster Tools account to reflect pages on the .com.au domain.
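If you go the 301 route, one common way to do it is a rewrite rule on the dot com host. This is only a sketch - the domain names are placeholders, and it assumes your host runs Apache with mod_rewrite enabled:

```apacheconf
# Hypothetical .htaccess for the dot com domain: 301-redirect every
# request to the same path on the .com.au (assumes Apache + mod_rewrite)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com.au/$1 [R=301,L]
```

A permanent (301) redirect tells search engines the move is intentional and permanent, so existing link popularity should transfer to the .com.au over time.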

Keep in mind that doing this may improve your site's results in the regional search results, but it may have the opposite effect on your site's performance in Google.com.au "web results" and the wider search results shown on Google.com. You really should decide whether the Australian market is more important to you before you make this switch. You should also keep in mind that many Australian searchers still use Google.com or Google.com.au without selecting "pages from Australia".


Need more than advice? Take a Search Engine Marketing course online


Friday, September 28, 2007

Q and A: Why do my keywords appear and disappear in Google?

Dear Kalena...

Hello, I am confused as to what is going on with my site in Google. I have a pretty good base of backlinks - not a lot, but a decent amount - and when I do searches for a lot of keywords, my site appears on the first or second page. It stays like that for a few weeks, then disappears for a while - a few weeks or a month or two. Then the rankings reappear for a few weeks and go away again. This has happened for about 8-9 months now. I don't know if it has to do with changing algorithms. Any ideas?

Thanks!
Travis


Kalena's Answer:

Dear Travis

Google search results come from a wide range of datacenters located around the US and the world. It is very common for Google to pull search results from one datacenter for a week or two and then switch to another datacenter.

In my experience, the search results seem to fluctuate between two major datacenters, which would account for why you are seeing certain results for a few weeks and then seeing them disappear again. See Google's own explanation for ranking fluctuations.


Need more than advice? Take a Search Engine Marketing course online


Tuesday, September 04, 2007

Q and A: Why hasn't Google indexed all the pages in my sitemap?

Dear Kalena...

Hi, my website is www.pakinfobytes.com. Google had indexed 13 of my pages, then I added my sitemap, which contains 17 pages, to Google Webmaster Tools, but Google still only indexed 13 pages. Why? And how do I get all my sitemap pages indexed? Also, is there any way to tell Google about backlinks, or does Google automatically detect them? Are backlink pages included in sitemaps? I mean, if www.example.com has my link, do I add www.example.com to my sitemap or not? I am wondering how Google knows about the backlinks.

Regards
Bilal

Kalena's Answer:

Dear Bilal

Google doesn't automatically add all pages from your site to their index. Because the Internet is so large and millions of pages get added every day, it is not possible for Google to index every single one. They have to determine which pages are the most important and index those. The others either get excluded or included in their supplemental index. You can learn more about the supplemental index here.

According to a site search, Google has indexed 14 pages from your site. A check of the supplemental index on the main Google datacenters shows 8 pages from your site stuck in the supplementals. To get those out, you need to get links to them, preferably from domains that Google considers trustworthy, such as directories or popular sites.

There is no need to tell Google about your backlinks. You certainly don't add them to your sitemap. That is only for your own site pages. Provided your backlinks are from pages that are in Google's index already, the links will be found and registered towards your site's link popularity. Google never displays your true number of backlinks, only the ones they consider important. But rest assured that all links pointing to your site are taken into account when determining your site's true PageRank score.


Need more than advice? Take a Search Engine Marketing course online


Thursday, August 02, 2007

Google can't index the entire web

It's sometimes hard for people to think about the Internet without automatically thinking of Google. But Dan Crow of Google's Crawl Infrastructure Group gave this sobering message last month in his interview with Jonathan Hochman:
"...the World Wide Web is very large, and Google is not even sure how large. We can only index a fraction of it. Google has plenty of capital to buy more computers, but there just isn't enough bandwidth and electricity available in the world to index the entire Internet."
That leaves Google with a massive dilemma: which pages should they index and which should they ignore? According to Dan, PageRank plays a large role. If your site has relatively few pages and they all have high PageRank, it's likely they'll all be indexed, no problem. However, if you have a large number of pages with low PageRank, you'll probably find that they don't make the cut.

So that just leaves the $64,000 question: what can you do to give your web pages the best possible chance of being indexed? Jonathan was convinced that the following aspects have an impact on a page's indexability:

- Clean, valid HTML code
- Use of external CSS and external Javascript files
- No code bloat

During his interview, Jonathan asked Dan outright if these things would help a page get indexed and Dan agreed that they would. Pages with clean code load faster and use less bandwidth to index.

Looks like it's time to go clean up that sloppy code!


Tuesday, July 31, 2007

Q and A: What can I do to improve the rank of my regional domain on Google.com?

Dear Kalena...

My question is about regional Google sites. I have a .com.au and rank well on google.com.au when selecting 'search the web' and 'search Australian sites', but my rank on google.com is very bad. I would have assumed that the results for google.com and google.com.au 'search the web' would be the same? What can I do to improve my google.com rank - perhaps I could register a .com and point it to my .com.au? Any ideas would be greatly appreciated - thanks for offering this service, by the way :)

Tim


Kalena's Answer:

Dear Tim

First up, never assume anything with Google. Secondly, search engines use a few different methods to determine a site's country of origin. Here are just two:

1) IP address the site resides on (physical location of host servers)
2) Domain extension

The physical location of the server that stores your site can have an impact on how search engines treat your site. Even if your site is hosted by an Australian firm, if they use server space located in another country, that is usually the country search engines will associate with your site. Check with your host about server location if this is an issue for you.

Now about your specific example - think about who uses Google.com.au - the primary users are from Australia, correct? So why would Google show the same results to Australian users that they would show to users of Google.com? They (correctly) assume that Australians want to see results that are relevant to them. So Google naturally gives preference to sites with a .com.au domain extension or sites that are hosted in Australia for both regional searches and "search the web" searches on Google.com.au.

Not only that, but Google uses IP detection to determine a searcher's geographical location and present results it considers relevant for people in that location. How else do you think they decide which AdWords ads to show to different searchers? Advertisers request that their ads be shown to specific regions, countries or towns, so Google has a highly sophisticated algorithm to make sure this happens automatically.

If it is really vital that your site be shown more prominently on Google.com, I would suggest moving your site to a .com domain, on a server located in the US. You could then 301 redirect your .com.au domain to the .com. Pointing a .com to a .com.au won't do anything because you are still instructing bots that the .com.au site is your primary domain. I would really only consider switching domains if your major market is the US, the Australian market is relatively unimportant to you and you are happy to lose visibility in Google.com.au, which is what would inevitably happen.


Saturday, June 16, 2007

Q and A: Will search engines obey the robots.txt if a robots meta tag is used per page?

Dear Kalena...

I am running a CMS (Joomla) and have robots.txt configured for the pages I want indexed. However, I noticed that the CMS is automatically appending the meta robots tag (index, follow) to every single page - yikes! So my question is: will Googlebot respect robots.txt, or be led astray by the meta robots tag?

Thank you!
AP Clarke


Kalena's Answer:

Dear AP

I think Joomla has an option to turn off the automatic tag appending. Keep in mind that robots.txt controls crawling, while the meta robots tag controls indexing - if a page is disallowed in robots.txt, compliant bots never fetch it, so they never even see its meta tag. Search engines usually follow robots.txt in the first instance, but some will obey the robots meta tag on a per-page basis, so if you don't want to risk confusing search bots, turn off that meta robots tag option or manually delete the tag from the code after publishing.
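To see how a compliant crawler interprets robots.txt rules, you can experiment with Python's standard-library parser. The rules and URLs below are placeholders (Joomla installs commonly have an /administrator/ path, but check your own setup), not AP's actual site:

```python
# Sketch: checking which paths a robots.txt blocks, using Python's
# standard-library robots.txt parser. Rules and URLs are placeholders.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /administrator/",
    "Disallow: /cache/",
]

parser = RobotFileParser()
parser.parse(rules)

# A disallowed path is never fetched, so any meta robots tag on that
# page - including an auto-appended "index, follow" - is never seen.
print(parser.can_fetch("*", "http://www.example.com/administrator/index.php"))  # False
print(parser.can_fetch("*", "http://www.example.com/index.php"))  # True
```

For pages that are crawled, an "index, follow" value is simply the default behaviour, so it shouldn't do any harm on pages you do want indexed.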


Monday, May 28, 2007

Q and A: How long do you have to wait until a site gets crawled?

Dear Kalena...

Hi there - good, useful info here. Just a quick question: when you add your new URL to Google or other search engines, how long do you have to wait until it gets crawled? Your help would be great, thanks.

Eugene

Kalena's Answer:

Dear Eugene

It depends. Often a site will get crawled faster if it is linked to from another site that gets crawled daily, but it is totally up to the search engine robot's crawling schedule. Here's some info from Google's FAQ on the subject:
"Our crawl process is algorithmic; computer programs determine which sites to crawl, how often, and how many pages to fetch from each site."
If you are curious to know when and how often Google and Yahoo crawl your site, you can verify your site with Google's Webmaster Tools and Yahoo Site Explorer.


Thursday, May 24, 2007

Q and A: Does Google favor sites running AdSense?

Dear Kalena...

It's nice to see language I can understand! You make instructions very clear thank you. I have a new website with Google ads on it. If I get 10 clicks per day on my ads, does Google favor my site over and above others that may be in the same category that don't run Google ads? I was thinking to some extent they might, so they can make more money. Do you know anything about this? Thanks, and I look forward to your reply.

Dean


Kalena's Answer:

Dear Dean

When AdSense first launched, there were many sceptics in the industry who predicted that Google's algorithm would favor successful AdSense advertisers. Thankfully, they were wrong. I've never seen any indication that Google gives any type of ranking boost or favoritism to sites running AdSense. The only possible technical advantage to having AdSense units on your page would be that Googlebot might visit more often. But even that is not proven.


Thursday, May 10, 2007

Q and A: Should I be concerned that Google is not caching my index page?

Hello all. Happy Thursday! I've got a Live Chat FAQ transcript for you today:

Dale : Good evening.
kalena : How may I help you?
Dale : This is a tremendous service.
kalena : Live Help? Glad you like it
Dale : Quick question , should I be concerned that Google is not caching my index page?
kalena : Yes. Has it been cached before? How long has it not been cached?
Dale : Last week I changed the title format of my posts and the index cache was dropped.
kalena : Can you give me the URL and I'll take a quick look?
Dale : They have re-indexed the posts with the amended titles
Dale : jerseyboysblog.com
Dale : I have read that it could be a data center thing
kalena : I see that you've got a verification tag in place. Have you looked at the stats for the site in GG Webmaster Tools?
Dale : Yes I have, everything is fine except the cache
kalena : If Googlebot hasn't had any problems indexing and all looks ok in sitemaps, my guess is that either your new data is on a datacenter that hasn't updated yet or your page has been sent to the supplemental index.
kalena : Have you updated your XML file and pinged GG to request re-indexing of the sitemap?
Dale : I hope so. I've seen sites go from no cache to no PR to no index.
kalena : And the only changes you made were to your title tags for blog posts, right?
Dale : Yes, I updated the sitemap manually and resubmitted successfully.
Dale : I amended the CSS so the posts are at the top of the page and the navigation is at the bottom
kalena : Ok, then hopefully it is just a temporary issue and should be resolved between the next cache update and database shuffle.
kalena : Have you checked the way GG views your robots.txt since the changes?
Dale : I have tried to optimize organically.
kalena : Ah. If you've made a LOT of changes to the site in terms of organic SEO, it may have prompted an aging delay, but that shouldn't affect the cache. Are you still seeing good rankings?
Dale : I have dropped from six to ten for the key term Jersey Boys, but that has always fluctuated
kalena : yes. If you were suffering the aging/redesign delay, you wouldn't be ranking for anything. Same if you are in the supplementals, but you also would have an old cache.
Dale : It is hard to compete against ticket brokers with 100's of affiliate links.
kalena : sure, I understand. Just keep an eye on Webmaster Tools and you can always submit a query to GG via that interface if probs continue
Dale : Okay, good night, and thank you.
kalena : so long


Tuesday, May 08, 2007

Q and A: Why has our Google PageRank dropped to zero?

Dear Kalena...

Can you please help us? I just came across your site and you seem very knowledgeable.

Our problem is Google! Our site has been active for a few years now at www.theforeverrose.com We were once #1 for the search "the forever rose" (and ranked well for a few others as well). But we have been gradually slipping, now we are in position 90 for "the forever rose" and off the charts for others?

Our PageRank was once a three and has gradually dropped to zero. We cannot figure out why, and things keep getting worse. We strictly follow all of Google's rules and ethics, and we rank fine in Yahoo and MSN. I am tired of hearing the obvious: more links, more pages, better content, SEO... etc. We have been doing all that. I feel like we are just missing something really simple, something right in front of our eyes, something that is penalizing us!

Can you please help? Any of your help would be greatly appreciated.

Thank You,
Mike


Kalena's Answer:

Dear Mike

A quick check of your site with the Google Site Status tool shows that pages from your site are included in the index, but that Google may not know about all your site pages. The site was last indexed by Googlebot on 25 April and you have one backward link displayed by Google but 81 backlinks shown on Yahoo.

Your home page has a Google Toolbar PageRank of zero and some pages have greyed out PageRank and no cache, suggesting they haven't been indexed. Curiously, Google is showing 46 pages from your site in their index, while Yahoo is only showing 25 pages indexed. The fact that the site has already aged and used to have a much higher PageRank may suggest a penalty of some kind.

But there could be a few explanations for your poor PageRank and lack of rankings:

1) You are using a black page background but you then have a table on it with a white background and black text. Some search engines will see this as black text on a black background. It's possible that this may be tripping spam filters.

2) Your site is built using old technology and contains a lot of code bloat. Tables are clunky and difficult for search spiders to index and Googlebot may have tripped up on your code and not indexed all your pages.

3) Your home page contains keyword repetition for the words "rose" and "roses". I don't think the repetition is excessive, but it may have triggered some type of suppression filter in Google.

4) Your site has poor link popularity and the sites that link to you tend to have a very low quality score and no PageRank e.g. cufflinksdepot.com/dir-gifts.htm and escapesportif.com/resources/gifts.html. You don't have enough incoming links pointing to your site from what Google calls "trusted sites" - popular directories, portals and authoritative sites in your industry. Your internal links could also use some work from an anchor text angle.

5) Most of your site pages might be stuck in Google's supplemental index, colloquially (but unfairly) known as Google Hell. Google's Matt Cutts explains why some sites have the bulk of the pages moved to the supplemental results:

"If you used to have pages in our main web index and now they’re in the supplemental results, a good hypothesis is that we might not be counting links to your pages with the same weight as we have in the past. The approach I’d recommend in that case is to use solid white-hat SEO to get high-quality links (e.g. editorially given by other sites on the basis of merit)."

Here's what you should do to address the problems:

1) For better indexing, consider upgrading the site design away from tables to clean HTML and use CSS for formatting. Until you do that, change the background of all pages to white to avoid any potential hidden text penalties from your table layout.

2) Run your site through a text-based browser such as Lynx to see what search engines see when they index your site. Verify your site with Google Webmaster Tools and check the diagnostics for potential indexing problems.

3) Optimize your site from scratch. You should make sure your site is search engine compatible and optimized for a wider range of target search keywords and phrases rather than the obvious ones.

4) Create and upload an XML sitemap to Google Sitemaps or use the new Sitemaps Protocol in your robots.txt file to tell search engines where to find your XML sitemap. I like to use the free XML Sitemaps Generator to create my sitemaps.
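For the robots.txt option, the Sitemaps protocol just needs one extra line at the top level of the file. A minimal sketch (the URL is a placeholder - the sitemap location must be an absolute URL):

```text
# robots.txt at http://www.example.com/robots.txt
Sitemap: http://www.example.com/sitemap.xml

User-agent: *
Disallow:
```

Any search engine that supports the protocol will discover the sitemap on its next robots.txt fetch, with no separate submission needed.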

5) Commence a link building campaign pronto. This campaign should include submitting your site to all the major and minor directories and search engines where the site doesn't currently feature, as well as niche directories and portal sites in your specific industry. Where possible, anchor text incorporating your target keywords should be used within the links. My consulting company can take care of link building for you if you like.
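On the anchor text point, the difference is simply in the clickable words of the link. A hypothetical example - the URL and phrases are illustrative, not from Mike's actual site:

```html
<!-- weak: tells search engines nothing about the target page -->
<a href="http://www.example.com/roses.html">click here</a>

<!-- better: the anchor text carries a target keyword phrase -->
<a href="http://www.example.com/roses.html">preserved forever roses</a>
```

The same principle applies to both internal navigation links and the incoming links you request from other sites.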


Once changes to your site code have been made and you have achieved some good quality links, most of your problems should disappear. If the problems persist, file a re-inclusion request with Google, explaining what might have triggered penalties and what changes you've made to address the issues. Although technically your site hasn't been excluded from the Google index, this should prompt a review of your site by Google's anti-spam team and hopefully result in any suppression penalties being lifted.

Good luck and let us know how you get on!


Wednesday, May 02, 2007

Q and A: Why can't I find my site for target search terms?

Dear Kalena...

I have two websites: one, www.dhcottages.co.uk, which ranks #1 for the search terms I use, and the other, www.tacksuperstore.co.uk, which I can't even find. I have done everything I know to help improve the listing position. Can you help?

Leanne


Kalena's Answer:

Dear Leanne

You don't say how long either site has been online, but if either of them is less than 9 months old, it could be that one or both is suffering the Google aging delay for new sites. Google has indexed over 1,000 pages on your second site, so you'd think at least a few of those pages would appear in the search results, but it's difficult to tell without knowing your target search terms and seeing how well optimized your site is for those terms.

It may also be an indexing problem, so be sure to run your site code through a text-based browser such as Lynx to see your site the way a search robot would see it. You should also run your home page through an HTML validator to check for coding errors, as your page did seem to take an unusually long time to load, which could signal a problem. The alternative is to verify your site in Google's Webmaster Tools and review the site indexing statistics and diagnostics for potential problems.


Tuesday, April 24, 2007

Q and A: How do we stop our domains from competing with each other for search rankings?

Dear Kalena...

We recently constructed an optimised e-commerce site for a customer which initially had some great ranking results on the primary domain name (a .com) which we wanted to promote for a global market as a brand name. In the process we also picked up several keyword related domain names and pointed these in via A-record changes.

Unfortunately we now seem to have a pendulum effect going on between the non-primary domains and the primary ... one swings up the Google rankings, then the next, whilst the primary domain name raises its head occasionally but generally isn't ranking where it should be. We obviously want to remove the secondary domain names from the index so that we score simply on the primary, but we're concerned that this activity could penalise us for duplicate site content ... I'm sure the answer is pretty simple and we'd appreciate a point in the right direction rather than poking about in a 'try it and see' fashion!

Many thanks in anticipation
Rob

Kalena's Answer:

Dear Rob

It sounds to me like your question is actually: "How do we stop our domains from competing with each other for search rankings?"

It all comes down to the way you've set up your secondary domains. For starters, I don't know why you needed keyword-related domains unless it is for advertising reasons. You haven't sent me the domain info so I can only guess, but it sounds as though you have the same site content duplicated on multiple domain names and each of them is being indexed by Google. Effectively, this means your domains are competing with each other for rankings on the same search queries!

What you should have done (and should do immediately) is to park your secondary domains on the same IP as your primary domain so that search engines see the domains as a single site, index a single site and all your rankings and link popularity get attributed to your primary domain.

If you've got this set up correctly, when you view the Google cache of any of the domains, it will show your primary domain in the cached results page e.g."This is Google's cache of [primary url appears here]" or any parked domain entered in the browser URL field will automatically switch to your primary domain. Instructions for setting up multiple domain parking correctly can be found in this back issue of HighRankings Advisor.
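If your host can't set up domain parking for you, an alternative that achieves the same consolidation is a server-side 301 (permanent) redirect from each secondary domain to the primary. A minimal sketch, assuming an Apache server and using hypothetical domain names:

```apache
# Virtual host for one secondary (keyword) domain.
# "Redirect permanent" issues a 301, telling search engines
# that the primary domain is the canonical home of the content.
<VirtualHost *:80>
    ServerName www.secondary-keywords.com
    Redirect permanent / http://www.primary-brand.com/
</VirtualHost>
```

With this in place, any request to the secondary domain (including deep links) is forwarded to the equivalent path on the primary domain, so rankings and link popularity accrue to a single site.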


Thursday, April 19, 2007

Q and A: Why won't Google index my web page?

Dear Kalena...

Hi, I ran across your blog while searching for help, and I'm wondering if you could answer a question. I made a web page for my wife's photography biz (http://polkaphotos.com). Google last crawled the index page on Oct 29, 2006. I've done some major overhauling to help with search terms, aesthetics, etc.

The site has Flash, but I've run it through a text browser (Lynx) and everything seems to display okay. I've submitted a sitemap, verified through Google. What do I have to do to get Google to go back and re-index the index page?

Michael


Kalena's Answer:

Dear Michael

I've had a look today and the site was last indexed by Googlebot on 13 March so it is being indexed. But I doubt it will rank very well for target keywords. Why? Because apparently your "overhaul" included retro spam tactics that go directly against Google's Webmaster Guidelines!

What on earth made you think that stuffing a web page full of keywords would make it more attractive to search engines or users? Those paragraphs of meaningless keywords at the bottom of the page will do absolutely nothing except attract red flags and ranking penalties from Google, not to mention distracting visitors from your wife's lovely photography.

If you want the site to be taken seriously by both search engines and visitors, I strongly suggest ditching those out-dated spam tactics. Replace them with a paragraph or two of appealing, descriptive text about your wife's photography. You'll find that many of your target keywords will be integrated into the copy naturally, without becoming meaningless, repetitive drivel.
Meanwhile, thanks for providing our Retro Spam Tactic of the Week!


Wednesday, April 11, 2007

Q and A: How do I push offensive content about me off the front page of Google?

Dear Kalena...

On my personal Google page I have noticed some defamatory posts about me from an obscure chat room I was involved in 5 years ago. I'm being accused of posting there now as someone else. I have contacted the webmaster of the relevant site but he refuses to delete the offending posts. Any idea how to adjust my 1st page Google index to push these ugly things out of the way?

Thanks David

Kalena's Answer:

Dear David

That's not too difficult. If the information is offensive and/or defamatory, you can always threaten the Webmaster with legal action unless they remove it from their site.

If the Webmaster does not believe the information warrants removing and you have no legal options (e.g. the information is protected by freedom of speech or similar), you will just have to optimize some pages on your site for your own name and/or build some links from popular sites to your site using your name in the link text to ensure you rank higher than the offensive content. If you have an unusual surname, this should be easily achieved.


Monday, March 19, 2007

Q and A: How do I ensure a page is NOT indexed by search engines?

Dear Kalena...

I want to publish a private page on the web, that only I and a few other people will use. How can I ensure that this page is NOT picked up by search engines?

It will be a wiki style page, so there may be lots of content which could be indexed by the Search Engines. This is what I want to avoid.

Many thanks for your help,

Fintan


Kalena's Answer:

Dear Fintan

There are a few ways to achieve this:

1) Use a Robots META tag in the page HTML and include "noindex" in the tag.

2) Disallow the page in your robots.txt file.

3) Create the page in a sub-folder and password-protect the folder so only people with a login can view it. If your host uses cPanel, you can set this up yourself using Web Protect.
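To illustrate the first two options, here are minimal sketches (the page path is hypothetical):

```html
<!-- Option 1: robots META tag placed in the page's <head> section.
     "noindex" blocks indexing; "nofollow" also stops robots
     following the page's links. -->
<meta name="robots" content="noindex, nofollow">
```

```
# Option 2: robots.txt in the site root (hypothetical path)
User-agent: *
Disallow: /private/wiki-page.html
```

One caveat: a robots.txt disallow only stops compliant robots from crawling the page; the bare URL can still appear in search results if other sites link to it. For a genuinely private page, option 3 (password protection) is the safest choice.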


Sunday, March 04, 2007

Google confirms 301s are better than 302s to move a domain

Yes, I am meant to be on holidays and no, I'm not meant to be posting here, but I felt this news was too important to wait another two weeks.

At the Search Summit Conference this week, I had the opportunity to ask Adam Lasnik from Google a question that I get asked a lot: Is it better to use 301 Permanently Moved or 302 Temporarily Moved redirects if you need to move a site to a new domain?

Adam replied that provided you are using the same page file names, you should absolutely use 301s rather than 302s on your old pages if you want Googlebot to re-index your site quickly. He also recommended keeping the old site live until the new site was cached and transferring the site over in different stages, depending on the number of pages it has. Google Support Engineer Maile Ohye added that you should also make sure you verify the new domain and upload your XML sitemap for it via Google Webmaster Tools to aid faster indexing.

I asked if using 301s to the new domain would be more likely to trigger the aging delay to kick in for the new site, but Adam reassured me that using 301s in combination with Google Sitemaps should make the domain relocation process fast and painless. He used an example of a 500,000 page e-commerce site he watched moving domains recently via 301s in three stages and claimed that Googlebot had entirely indexed the new site in just over five weeks.
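For Apache servers, the 301s Adam describes can be set up in an .htaccess file on the old domain. A minimal sketch with hypothetical domain names (assumes mod_rewrite is enabled and page file names stay the same):

```apache
RewriteEngine On
# Match any host form of the old domain ([NC] = case-insensitive)
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
# Send a 301 Permanently Moved to the same file name on the
# new domain; [L] stops further rule processing
RewriteRule ^(.*)$ http://www.new-domain.com/$1 [R=301,L]
```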


Tuesday, February 13, 2007

Q and A: Does CSS help improve search engine rank?

Dear Kalena...

Does CSS help improve search engine rank?

contactlab


Kalena's Answer:

Hi contactlab

CSS (Cascading Style Sheets) alone probably won't make a blink of difference to the way your site ranks. However using CSS may reduce the amount of code you need to use on each page, avoiding code bloat. Bloated code can sometimes cause important content to be shoved to the bottom of the HTML, reducing the likelihood of it being indexed by engines and reducing its relevancy weight.

CSS can also improve the accuracy of your HTML because there is less code to make errors in, making it more likely that your site will validate to W3C standards. Valid code is less likely to trip up search robots as they crawl through your site. So while using CSS won't necessarily boost your rankings on its own, it could make your site more search engine compatible and that may in turn improve your rank.


Tuesday, January 23, 2007

Q and A: Why has our site suddenly dropped out of Google?

Dear Kalena...

Hi there, I have done a lot of work optimizing my wife's web site and succeeded in getting the site to number one on Google, Yahoo and MSN for the search term "childminder milton keynes". It took me 3 months to do it and her business has boomed; she is now completely booked up. I do regular checks to ensure the site is still number 1 for the search term, and on Yahoo and MSN it still is. However, on Google the site isn't even listed in the first 15 pages, whereas 2 weeks ago it was listed at number 1. I am completely baffled. Can you help please?

Thank you very much.
Mark

Kalena's Answer:

Hi Mark

First up, thanks for the caffeine contribution, it really helps! Now, about that site.

I've run the site through Google's Site Status Tool and according to the results, it is still being indexed, with the last visit by Googlebot on 14 January. However: the current Google cache of the site is completely blank and the Google Toolbar PageRank for the site is zero out of 10. Both these things indicate a major problem.

Now, I know the site is over a year old and that you last made changes over a month ago, so my guess is that, rather than the aging delay, an algorithm penalty or other such manual suppression, Googlebot encountered problems the last time it indexed the site, which resulted in zero pages being indexed and stored. Naturally, the site has dropped off the charts because there is no information stored in Google's datacenters as a result of the indexing and caching issue.

However, I'm not surprised Googlebot had trouble indexing the site. It breaks all the rules for search engine compatibility by using outdated Frames technology. Honestly! Frames are sooooo 1996. Search engines have always had trouble indexing frames-based sites and haven't gotten much better at it over the years. Search engine spiders generally only see the master frame-set (the page pulling all the frames together), not the individual frames. Consequently, there is no content for the search engine to index, apart from the content of the NoFrames element.

Because search engine spiders index sites by following links and because there are usually no links within the frame-set HTML code, search engines are usually unable to index frames-based sites beyond the home page. If you insist on using such dated design technology, you absolutely need to give the search engines a juicy No Frames tag to suck on. Yours currently states:
"Sorry, the Little Steppers website is only veiwable (sic) through a frame compatible browser. Please upgrade to a frame compatible browser."
What does that tell a search engine about your business? Zero, zilch, zip. The only reason your site was ranking on Google for "childminder milton keynes", was because you used that phrase within your Title Tag.

Ideally, a short keyword-filled description of the site should be included in the NoFrames element, as well as a link to the site map or main links page, which acts like a signpost for search engines so they know where to find and index further site content. Danny Sullivan wrote a terrific tutorial about how to optimize frames-based sites. Make sure you read it. But if you are really serious about optimizing your wife's site for search engines, you'll update the technology to a design that is search engine friendly.
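To make the fix concrete, here's a sketch of a frameset page with a descriptive NoFrames element. The copy and the sitemap.html link are illustrative, not Mark's actual pages:

```html
<frameset cols="200,*">
  <frame src="menu.html" name="menu">
  <frame src="content.html" name="content">
  <noframes>
    <body>
      <p>Little Steppers is a registered childminder in Milton Keynes
         offering flexible, affordable childcare.</p>
      <!-- Signpost link so spiders can reach the rest of the site -->
      <p><a href="sitemap.html">View our site map</a></p>
    </body>
  </noframes>
</frameset>
```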

To fix your immediate problem, here's what I suggest:
  • Verify your site with Google's Webmaster Tools, check for site errors and study Googlebot's indexing patterns.

  • Create and upload an XML sitemap to Google Sitemaps and study the results via Webmaster Tools. See the free sitemap creator that I recommend.

  • Use Danny's tutorial to reestablish the Frames Context for each page on your site so search engines can jump from one page to the next when indexing.

  • Give Frames the flick!
Oh, and one last thing: you are using keyword-stuffed ALT IMG tags on your home page. That is a real Google no-no. Better nip that in the bud before you DO get penalized.
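For comparison, here's roughly what the difference looks like (hypothetical file name and text):

```html
<!-- Keyword-stuffed ALT text: a red flag to Google -->
<img src="logo.gif"
     alt="childminder milton keynes childminding childcare nursery milton keynes">

<!-- Descriptive ALT text: useful to both robots and screen readers -->
<img src="logo.gif"
     alt="Little Steppers, registered childminder in Milton Keynes">
```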


Thursday, December 14, 2006

Q and A: Why doesn't Google index my site any more?

Dear Kalena...

My client's site (www.coaching.uk.net) was converted to Joomla around 6 months ago. The content is the same as previously. However, Google no longer indexes the site.

The other search engines pick it up with no problems, and all my other Joomla sites are picked up by Google.

Can you suggest any reasons why this one should be an exception??!

Carrie


Kalena's Answer:

Dear Carrie

I've taken a look at the site and here's what I see:

1) You've got a Google Toolbar PageRank of zero.

2) You've got zero backward links listed on Google.

3) Google last cached your site on 7th November.

4) According to their Site Status Tool, Google does not know about all your site pages.


Seeing this, I doubt that Joomla is causing the problem; it is more likely that the re-design has triggered the site to get stuck in Google's Aging Delay. This delay can impact new or existing sites and can last up to 9 months. You're not the only one with a Joomla site facing similar problems.

Something to keep in mind when working with dynamic technologies and CMSs like Joomla: Google has stated in the past that they don't index pages containing session ids. I've seen such sites indexed before, so perhaps Google is getting better at this. But why take the risk? Make sure that your site either doesn't use session ids or contains a way for searchbots to grab the data via a parameter-clean URL. You can check how a search engine robot would see your site by downloading a text-based browser like Lynx and running your URLs through it.
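To illustrate with hypothetical URLs, the difference looks like this:

```
# A session id bolted onto the URL - Google has said it may
# refuse to index URLs like this:
http://www.example.com/index.php?option=articles&sid=8f3a21c9d47e

# A parameter-clean equivalent that searchbots can crawl safely:
http://www.example.com/articles/coaching-tips.html
```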

See this post for advice on what to do while you're waiting in Google aging limbo.

---------------------------

[If you found this post helpful, you might benefit from downloading our free Search Engine Optimization lesson]


Wednesday, November 29, 2006

Q and A: I've cleaned up the site spam so why isn't our home page being indexed by Google?

Dear Kalena...

Hope you're happy and healthy. Thanks so much for your help last time. I cleaned the hidden text off the page and we got re-indexed quite quickly.

But now we seem to have dropped out again. Although some of our subsidiary pages are indexed, the homepage does not seem to be, unless I'm searching incorrectly.

Any ideas?

Best wishes
Robin
www.breatheonline.com/

Kalena's Answer:

Dear Robin

Ok, a couple of things:

1) You obviously have two domains pointing to the same content (www.breatheonline.com and www.breatheyoga.co.uk). If you view the cache of your .com home page, you'll see that Google produces the cache of the .co.uk home page. It seems that Google has decided that your UK domain is the main one and is caching only that site.

2) A "site:URL" search for each domain shows that 22 pages from your .co.uk domain have been indexed, while 4 pages from your .com domain have been indexed. Having Google index two domains with identical content is a dangerous thing because one will usually be suppressed and you rarely have control over which one. I'm not sure about your DNS and IP setup, but you need to decide which domain you wish to promote in search engines and park the other domain to the same IP as the main one. You can also inform Google which site is your main domain via the Sitemaps Protocol. Check out the free XML sitemaps creator that I recommend.

3) Google is indexing and caching your home page at www.breatheyoga.co.uk just fine, from what I'm seeing. You simply need to do a search for your full URL. The last cache of the page was taken as recently as 26 November.

4) Once you've sorted out your domain issue, it might be a good idea to prepare an XML sitemap and submit it to Google Sitemaps as explained in Google's Webmaster Tools area.

5) Some of your incoming link partners are pointing to the .com site while others point to the .co.uk site. This is diluting your link popularity. Decide which domain is more important to promote via search engines and ask all your link partners to change their links to point to that site. Make sure you do the same thing with all your internal site links.


Tuesday, September 26, 2006

Q and A: Why aren't search engines indexing beyond my home page?

Dear Kalena...

I have recently launched my website, but so far only my homepage has been indexed. I cannot understand what I am doing wrong. I will attempt a sitemap submission but I am still not sure I will be able to complete it correctly. Any advice is appreciated.

Regards
Gokarn

Kalena's Answer:

Dear Gokarn

If you had included your site URL in your question submission, I might have been able to provide more help!

All I can do is guess at the problem. If only your homepage is being indexed, it could be one of the following problems:

1) Your site is still experiencing Google's aging delay for new sites.

2) You haven't correctly submitted an XML sitemap to Google Sitemaps.

3) You don't have any incoming links pointing to pages beyond your home page.

4) You don't have an adequate internal site-map to enable search engine spiders to find and index all your site pages.

5) You don't have search engine-friendly navigation or text-based internal linking structure.

6) All pages except for your home page are dynamic in nature and created on the fly (not flat HTML). Dynamic pages are often not indexed by search engines unless included in an XML sitemap.

Send me your site URL and I can be more specific.

Monday, September 18, 2006

Q and A: Why has our site disappeared from Google?

Dear Kalena...

I run a small ecommerce website for disabled people in the UK called essentialaids.com and suddenly it has disappeared from Google.

You can't even find its pages when you search for the URL itself. I haven't deliberately done anything underhand to improve rankings so I'm not quite sure why what's happening.

If you can help shed some light on this I'd be eternally grateful because I'm scared stiff and I'm not sure what to do.

All the best,

Alex


Kalena's Answer:

Dear Alex

Don't panic! Your site is still in Google. In fact, Google has indexed 138 pages on your site so far and Googlebot last visited on September 8.

What you're probably experiencing is Google's aging delay for new or re-designed sites. This can result in your site not appearing in the SERPs (Search Engine Results Pages) for any of your target keyword phrases. This is all perfectly normal and part of Google's process for reviewing sites before adding them to the main database.

I'm afraid you just have to wait it out for 6 to 9 months. You can still find your site in the meantime by conducting a domain search in Google.


Thursday, September 07, 2006

Q and A: Why aren't all my site pages being cached in Google?

Dear Kalena...

Our website is in the top 100,000 websites (Alexa Rank) and I find that Googlebot has crawled almost all our webpages, but when I click on "show cache" in Firefox, nothing shows for many of the pages.

I hope you have seen such incidents before and would like to know why this happens and what can be done to fix it?

Waiting on you!
Rishi


Kalena's Answer:

Dear Rishi

I've checked a few pages of your site and most are showing up as recently cached in Google. There were a couple uncached and quite a few showing a very old cache.

Usually caching errors are caused by:

1) Googlebot abandoning the indexing of your site due to a problem it struck in your code.
2) Googlebot abandoning the indexing of your site due to reaching the maximum site data quota set by Google.
3) A no-cache tag appearing in the code on your page.
4) Googlebot avoiding certain areas of your site by obeying the contents of your Robots.txt file
5) A lack of internal / external links pointing to a particular page on your site (Googlebot being unable to find it).
6) Failure to include all site pages in your navigation structure and/or your Google Sitemap.
7) Incorrect formatting or uploading of your Google Sitemap XML file. Try creating an XML sitemap from scratch.
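On that last point, a well-formed sitemap file is short and simple. Here's a minimal sketch following the Sitemaps protocol (the URL and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2006-09-01</lastmod>
    <changefreq>weekly</changefreq>
  </url>
  <!-- repeat one <url> block per page you want crawled and cached -->
</urlset>
```

Upload it to your site root and submit its URL through Google Sitemaps so Googlebot can verify that it parses correctly.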

You should check all these possibilities and monitor your site's indexing via Google Sitemaps.

Hope this helps!


Wednesday, July 26, 2006

Q and A: Why isn't Google indexing my site?

Dear Kalena...

My site has not been indexed in Google for the last month. I submitted articles and blogs to different sites and submitted links to 100 directories. Could you please tell me what strategies I should adopt to get listed in Google? My site got indexed in Yahoo and MSN.

J


Kalena's Answer:

Dear J

It would have helped if you'd included your site URL in your question! Without that, I can only guess at the problem. Here are my best guesses:

1) If your site has only recently been launched, you are probably experiencing Google's aging delay for new sites, which can last up to 9 months.

2) You say you've submitted "links to 100 directories". If this has been done as part of some dodgy link scheme, then Google may have penalized you for it. Brush up on why here.

3) If only a few pages on your site have been indexed, your site's navigation may be preventing or discouraging Googlebot from finding all your content. Create a search engine friendly navigation structure and prepare and upload an XML sitemap to Google Sitemaps.


Wednesday, July 05, 2006

Q and A: Why is Google indexing fewer pages on our site?

Dear Kalena...

Thanks for all your great advice. My question is this:

We have had www.livingwithanxiety.com for several years now. We just recently did a major update and finally, after years, we changed our meta tags, titles, and so on. We have been submitting an XML sitemap to Google now for about three months. We topped out at about 42 pages being indexed, but today we looked and have only 9. What happened? Is it because of the changes? Traffic literally has halved. Hmm...

Thanks again for all you do!

Sincerely,
Nashell


Kalena's Answer:

Dear Nashell

I've checked and Google has currently indexed 25 pages on your site.

If you were ranking for particular keywords before your site update, the changes you made may have negatively impacted that. If you are certain your new page content, META and Title tags have been optimized well for target keywords, it is more likely that you have been caught up in the Google aging delay or Sandbox effect for re-designed sites.

You should continue to update and submit your XML sitemap whenever you add new content, build more incoming links and wait for Google to let you out of rankings limbo. Be patient!


Thursday, June 08, 2006

Q and A: Why is Google only indexing my home page?

Dear Kalena...

I published a site a few months ago. The site is http://www.maxwell3.com. I used the Add URL page to add my site to the Google index. Google picked it up a few weeks later.

Unfortunately, it appears that only the homepage is being indexed and I can't seem to determine why. I even created a SiteMap that holds information about the pages in the system and registered it at Google without any problems. It's like Google just decided to only use the default page.

The site map is located at http://www.maxwell3.com/maxwell3_sitemap.xml

I would love 2 cents of advice. Googlebot last scanned me last month and comes through about once a week.

Sam


Kalena's Answer:

Dear Sam

I've had a look at the site and the first thing that struck me was that it hasn't been cached by Google. This indicates a problem of some kind, or perhaps you have specifically blocked Googlebot from indexing the site somewhere in your code or robots.txt file?

I noticed you used MS Visual Basic to build the site. Have you made sure the code validates to W3C standards? My Google Toolbar is showing a PageRank of zero out of 10, which is not unusual for a new site, but the non-caching issue bothers me.

Also, I can't get your XML sitemap to load at the address you provided above. Make sure the URL is correct and then submit your sitemap to Google Sitemaps.
