I have been working on a large website redesign. The site relies heavily on client-side rendering of HTML elements and uses AJAX to limit server postbacks. For example, to display a list or grid of items, I use the Kendo UI Web ListView or Grid controls. These controls don't render search-engine-friendly content, however. Thankfully, with just a few lines of code, I can create custom views for search engines that render the content in a more indexable way. But that means giving up the benefits of these controls, namely client-side rendering, sorting, and paging.
DisplayModeProviders to the rescue
ASP.NET MVC 4 introduces a DisplayModeProvider class that lets application developers tap into MVC's view-selection pipeline. It can be configured to automatically serve different views to mobile browsers just by changing the filename of a Razor view from (for example) Index.cshtml to Index.Mobile.cshtml. And because the view is (or at least should be) solely responsible for rendering output to the client, changing the view shouldn't require any changes to the underlying business logic. Put simply: to support an alternative layout of the same page, a developer can add a new view for that layout without touching the application logic.
Out of the box, MVC 4 supports "Mobile" as a display mode, but additional display modes can easily be registered in the global.asax file. So I decided to add a simple display mode that detects whether the current request is from a search engine indexer and serves a custom view with all the AJAX and client-side rendering removed.
To add my new DisplayMode, I added the following code to the Application_Start method of the global.asax file:
DisplayModeProvider.Instance.Modes.Insert(0, new DefaultDisplayMode("Crawler")
{
    ContextCondition = (context => Utils.IsCrawler(context.Request))
});
And in my Utils class I have a static method to detect whether the request comes from a search engine indexer:
public static bool IsCrawler(HttpRequestBase request)
{
    // Use ASP.NET's built-in browser capability detection first,
    // then fall back to a configurable regex match on the User-Agent.
    bool isCrawler = request.Browser.Crawler;
    if (!isCrawler)
    {
        isCrawler = Regex.IsMatch(request.UserAgent ?? string.Empty,
            ConfigurationManager.AppSettings["CrawlerMatches"]);
    }
    return isCrawler;
}
Variations of this method are all over the Internet; the only interesting part here is that the regex pattern is stored in web.config, so it can be modified without recompiling.
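For reference, the corresponding appSettings entry in web.config might look like the following. The pattern shown is only an illustrative starting point, not an exhaustive list of crawler user agents:

```xml
<appSettings>
  <!-- Hypothetical example pattern; tune it to the crawlers you care about -->
  <add key="CrawlerMatches" value="bot|crawler|spider|slurp|mediapartners" />
</appSettings>
```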
So, what happens?
When a request comes into the site, the DisplayModeProvider evaluates it. If the request satisfies the crawler condition (checked by Utils.IsCrawler), view selection first looks for a view with the same name as the one that would normally be returned, but with a ".Crawler.cshtml" suffix instead of just ".cshtml".
So for pages that would normally use client-side rendering, I just add an additional version of the view with the ".Crawler.cshtml" suffix, and that version is served automatically to search engine indexers.
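As a minimal sketch, the crawler view can render the same model with plain server-side markup in place of the Kendo widget. The Product model, its properties, and the Details action here are hypothetical stand-ins for whatever the regular view binds to:

```cshtml
@* Views/Products/Index.Crawler.cshtml -- hypothetical model and action names *@
@model IEnumerable<MyApp.Models.Product>
<ul>
    @foreach (var item in Model)
    {
        <li><a href="@Url.Action("Details", new { id = item.Id })">@item.Name</a></li>
    }
</ul>
```

Because both views receive the same model from the same controller action, no business logic changes are needed.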
Developing a site that relies heavily on client-side content rendering can be detrimental to search engine visibility. In the past I would have done all the rendering on the server and sacrificed the interaction speed of rendering content client-side with script. With DisplayModeProvider, I can supplement the client-side view with one that presents a more indexable interface.
It should be noted that I am not trying to trick the search engine spider. The page the spider sees is exactly what a person would see; I am merely moving the navigation interaction from the client to the server.
Give it a try and let me know what you find.
Good luck to you!