
6.04.2012

DomainTools Scummy SEO



DomainTools.com is a website that provides technical information about other websites, such as their domain name registration, directory listings, keywords, and SEO score. It's a very useful site if you're looking for that kind of information, but are there really so many people looking for it that DomainTools would rank as the 208th most popular site on the internet?



DomainTools isn't as popular as its Alexa rank would imply, but it does get a lot of visits, and it gets many of them by scraping only the most important SEO-related content from over 100 million registered domain names without giving anything back to the sites it scrapes from. At one point it even had a close relationship with Google.



Let's look at what's really going on.



5.26.2012

Is Google Panda/Penguin Only For English Speaking Searches?



As I was writing my last article, patting myself on the back for my little part in making the internet a better place (since Matt Cutts and Google didn't), I noticed something very strange.



According to Alexa, many of the sites I checked that were negatively impacted by Panda and Penguin had very high Alexa regional rankings in non-English-speaking countries. That raised a red flag that something fishy might be happening. You know, because these sites are written in English and all.



It made me wonder if the recent Panda and Penguin changes (and even earlier ones) to Google's search ranking algorithms might only be affecting English-speaking, or maybe first-world, countries in general. Or maybe something else was going on entirely?



How SEO and IM Became Four Letter Words



I just saw Matt Cutts's response to the Scamworld video via a post on SaltyDroid.



Matt's tweet in response didn't come as a shock. Internet Marketers and SEOs seem to go hand in hand, and Matt has a strong relationship with the SEO community, since he's the divine messenger sent down from the top of Mount View to help those of us looking to put things online stay in Google's good graces.



I have respect for what Matt Cutts has been doing to combat webspam at Google, but I disagree with him here. While there may have been a time when SEO and IM had a more legitimate meaning (err, maybe just SEO), the titles have generally been corrupted. Maybe a little history lesson will help.



1.23.2009

Tell Googlebot to Crawl Faster


Google's Webmaster Tools has changed the interface that allows a webmaster to control how fast or how slow Googlebot crawls a site.



I just noticed the change today. In the past, there were three radio buttons: normal, faster, and slower. Many webmasters could only choose normal or slower, because faster was disabled unless Google thought your site really needed to be updated faster and could handle it.



As you can see from the screenshot, the control has been changed to a slider, and presumably everyone can now set it how they wish, though Google may still override the setting. Also, setting a faster crawl rate doesn't mean Googlebot will visit your site more frequently. The crawl rate just determines how spaced out requests will be from one another. If Google thinks your site should get 200 visits a day, you're going to get 200 visits a day. If you set a faster crawl rate, that just means those requests could happen within the same hour.





One neat thing about the new interface is that it shows you Google's recommended crawl rate for your site. On one of my sites the recommended crawl rate was faster than on another. In the past, you could only judge by looking at your crawl stats. The top chart shows the number of visits, the second the number of kilobytes downloaded, and the third the average time spent downloading a page.



As you can see from the charts, this site seems to be getting slower, and the recommended crawl rate was set a little lower than on my other sites, probably because Google thought there was too much strain on the server. That wasn't actually the case: the slower response times were caused by a third-party API, and the site itself had plenty of capacity. So in this case, I instructed Googlebot to crawl faster.



But sometimes you really do want to slow down your crawl rate. On another site of mine that is fairly new, Googlebot visits ten times more often than real visitors do. It's not a site I'm actively promoting yet, and while I don't mind getting the many pages crawled, it seems wasteful to spend so much bandwidth on crawling compared to the actual site traffic. In that case, I slowed it down.



If you want your site to be crawled less, setting a slower crawl rate can help accomplish that. For instance, if you only want Googlebot to visit your site 100 times a day, you would set your crawl rate so that Googlebot only visits every 864 seconds.



To figure out what to set your crawl rate to, take 86,400 (the number of seconds in a day) and divide it by the number of times per day you want Googlebot to crawl your site.
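The arithmetic above can be sketched in a few lines of Python (the function name is mine, just for illustration):

```python
SECONDS_PER_DAY = 86_400  # 24 hours * 60 minutes * 60 seconds

def crawl_delay(visits_per_day: int) -> float:
    """Seconds Googlebot should wait between requests to hit
    the desired number of visits per day."""
    return SECONDS_PER_DAY / visits_per_day

# 100 visits a day works out to one visit every 864 seconds,
# matching the example above.
print(crawl_delay(100))  # 864.0
```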

If you're not familiar with the Google Webmaster Tools interface, these are the steps to make the change (the placement of the setting has also moved):


  1. Log into Google Webmaster Tools
  2. Click on the website you'd like to change in the Dashboard
  3. Select Settings from the left menu bar
  4. Scroll down to the Crawl rate section
  5. Slide the slider either towards faster or slower
  6. Click the Save button

Remember that these are just recommendations and Google may decide to do what it feels is best.