AJAX makes sites load faster because it reloads only what has changed: sections that are the same on both pages are not fetched again, so the page loads faster. Search engines, however, face a different scenario. When a spider comes crawling, AJAX sites have their doors bolted.
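To illustrate the idea of a partial reload, here is a minimal sketch (not from the proposal itself) that models a page as an object mapping section names to content. Only the sections present in the fetched fragment are replaced; shared sections such as the header are left alone, which is why the page appears faster. The function and field names are hypothetical.

```javascript
// Sketch of an AJAX-style partial update. A real page would fetch the
// fragment over XMLHttpRequest and patch the DOM; here the "page" is a
// plain object so the idea is easy to see.
function applyPartialUpdate(page, fragment) {
  const updated = { ...page };                 // keep unchanged sections
  for (const [section, html] of Object.entries(fragment)) {
    updated[section] = html;                   // replace only what changed
  }
  return updated;
}

// Navigating between two pages that share a header:
const page = { header: '<h1>Shop</h1>', body: '<p>Home</p>' };
const next = applyPartialUpdate(page, { body: '<p>Products</p>' });
// next.header is untouched; only next.body was "reloaded".
```

The spider's problem is exactly this: the content in `fragment` is fetched by script after load, so a crawler that only reads the initial HTML never sees it.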
Now here we have some good news for you: at the SMX East conference, Google announced a proposal for a new standard that would make AJAX crawlable. If the proposal is accepted and supported by the other major search engines, it would be a green signal for developers who want the rich features of AJAX without sacrificing search engine visibility.
You can view the blog post above for details about the proposal. We have tried to represent Google's goals for the proposal in simpler, less technical language:
The website requires minimal changes as it grows.
Users and search engines will see the same content (no cloaking).
Search engines can send users directly to the AJAX URL (not to a static copy).
Site owners have a way of verifying that their AJAX website renders correctly, and thus that the crawler has access to all the content.
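The mechanics were not final at the time of the announcement, but the scheme Google went on to publish works roughly as sketched below: a stateful "pretty" URL containing `#!` is rewritten by the crawler into an "ugly" URL with an `_escaped_fragment_` query parameter, and the server answers that request with an HTML snapshot of the AJAX state, so users and crawlers see the same content.

```javascript
// Sketch of the crawler-side URL mapping in the AJAX crawling scheme.
// A URL like  http://example.com/page#!mystate  becomes
//             http://example.com/page?_escaped_fragment_=mystate
// which a crawler can fetch like any static page.
function toCrawlerUrl(prettyUrl) {
  const i = prettyUrl.indexOf('#!');
  if (i === -1) return prettyUrl;              // no AJAX state to expose
  const base = prettyUrl.slice(0, i);
  const fragment = prettyUrl.slice(i + 2);
  const sep = base.includes('?') ? '&' : '?';  // append to existing query if any
  return base + sep + '_escaped_fragment_=' + encodeURIComponent(fragment);
}

const crawlable = toCrawlerUrl('http://example.com/ajax.html#!mystate');
// -> 'http://example.com/ajax.html?_escaped_fragment_=mystate'
```

Note that the fragment after `#` is never sent to the server by browsers, which is why the scheme needs this rewrite into a query parameter at all.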
According to Google's estimates, about 70% of the content on the web is generated dynamically, and that figure is likely to grow in the near future. "This hurts search," Google says. "Not solving AJAX crawlability holds back progress on the web." Those quotes are from a Google Docs presentation deck about the proposal, embedded below.