How are there still so many dog websites ranking in the search engines? I would guess that over time some of the search engines would want to factor site design and code quality into the algorithm and give it some weight. It certainly would improve the user experience. I’ve seen some of the crappiest websites rank for very competitive keywords time and time again. It’s not necessarily because these companies are super tech savvy or have a monopoly on some good domain; they probably just got into the web game early and are still milking the profits from the links and domains they built up over years of working in the online space.
Here is my advice to those of you who have crappy websites and are actually making money: spend some money on a nice site redesign. It might help spruce things up a bit, including getting some dynamic content created for your new site.
If you’re looking at advertising on tablets or other mobile devices, a website that works well across those different platforms would be good as well. In short, you’ll ultimately need to make changes, or I could see Google making the changes for you in the near future by downgrading your site from the high station it currently holds. Mark my words: this aspect of the algorithm is not only coming, but it’s going to be a game changer, much bigger than Panda or Penguin.