Can I use robots.txt to optimize Googlebot's crawl?
Today's question comes from Blind Five Year Old in San Francisco, who wants to know, "Can I use robots.txt to optimize Googlebot's crawl?"
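For context on the robots.txt question above: robots.txt can't make Googlebot crawl more, but it can keep crawling away from URLs you don't care about, so capacity goes to pages that matter. A minimal sketch, with hypothetical paths (not from the video):

```
# Hypothetical robots.txt: steer crawlers away from low-value URLs.
User-agent: *
Disallow: /search/        # internal search result pages
Disallow: /cgi-bin/       # scripts with no indexable content

Sitemap: https://www.example.com/sitemap.xml
```

Note that Disallow blocks crawling, not indexing: a blocked URL can still show up in results if other pages link to it.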
We have a good question from Quinton in Vancouver, who asks, "In the search results, Google will often display a snippet appropriate to the specific search query, often disregarding the meta description."
Today's question is about geotargeting.
Okay, Chris from the UK asks, is there a limit to the number of pages that Google will index from one site?
Here is a fun question from Sebastian in Germany. Sebastian asks, "We still have old content in the index. We block them via robots.txt, use 404s, and delete them via Webmaster Tools."
Today's question comes from Andy in New York.
Vladgidea from Romania asked, "How much time is Google taking to index a new webpage, and how can we accelerate the process besides using Google Webmaster Tools?" Well, the simplest answer is to get more links.
Today's question comes from Bogdan Suvar.
We have a good question from Andy in New York, who asks, "As memorable .COM domains become more expensive, more developers are choosing alternate new domains like .IO and .IM, which Google geotargets to small areas."
We've been getting a lot of good questions from Ryan in Dearborn, Michigan. Ryan asks, "Many corporate legal departments insist on using registered and trademark symbols on brand keywords and titles."
Fernando C in Spain asks, we are a pretty big site.
Today's question comes from landlord in Colorado who asks, does PageRank take into account cross browser compatibility?
We have a question from Michael S in Austria.
Clayton SEO from South Africa asks, say your index page, or any page on your site, has been crawled, indexed, and cached by Google.
Hi, everybody.
We have a question from Phoenix, Arizona.
Today's question comes from India, where Arpit asks, "Can too many redirects or multiple hops for a given URL, 301 as well as 302, have a negative impact in crawling/indexing/ranking?"
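To make Arpit's question concrete, here is a small sketch of what "multiple hops" means: each 301/302 in a chain is one extra fetch before the crawler reaches the final URL. The function and URLs below are hypothetical illustrations, not anything from the video; the chain is modeled as a simple dict rather than live HTTP requests.

```python
# Hypothetical illustration: count redirect hops in a chain.
# `redirects` maps a URL to its 301/302 target; a URL absent
# from the map is treated as the final destination.
def count_hops(url, redirects, limit=10):
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if hops > limit:                      # crawlers give up on long chains
            raise RuntimeError("too many redirects")
    return hops, url

chain = {
    "http://example.com/a": "http://example.com/b",  # 301
    "http://example.com/b": "http://example.com/c",  # 302
}
print(count_hops("http://example.com/a", chain))  # (2, 'http://example.com/c')
```

The general advice that follows from this model: redirect directly to the final URL where you can, since every extra hop costs the crawler a round trip and very long chains may simply be abandoned.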
A fun question from SEOmofo in Simi Valley, who asks, "If I externalize all CSS style definitions and JavaScript and disallow all user agents from accessing these external files (via robots.txt), would this cause problems for Googlebot?"
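For reference, the pattern SEOmofo is describing would look something like the hypothetical robots.txt below. Google's published guidance goes the other way: Googlebot should be allowed to fetch CSS and JavaScript so it can render the page the way a browser would.

```
# The pattern being asked about (generally NOT recommended):
User-agent: *
Disallow: /css/
Disallow: /js/

# Google's guidance is the opposite -- let crawlers fetch these
# resources so the page can be rendered as users see it.
```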
Today's question comes from Pennsylvania.
Today's question comes from a different Matt.
Today's question comes from London. IanVisits asks, "Knowing that the keywords in a URL are one of many factors used to rank a site, what about the impact of the filename suffix?"
We have a fun question from Ryan in Dearborn, Michigan, who asks, "Previously, you mentioned that W3C validation isn't a ranking factor."
Today's question comes from Zurich, in Switzerland.
Today's question comes from Linda, in Lakewood, New Jersey. Linda asks, "If I use Google Analytics to track conversions on my e-commerce website, would that have a negative effect on my Google search engine results?"