One of the first questions I ask before starting a new SEO process is about the existing capacity and flexibility to implement technical and content changes on the website, which can be greatly facilitated or completely hindered by the characteristics of the content management system in use. Even with a dedicated team of experienced web developers, implementing some of the most important SEO recommendations can become a nightmare, or even non-viable, due to the restrictions of the CMS in use.
This is why, for new sites choosing a CMS or businesses migrating to a new one, it's also key to assess and take into consideration the different elements that will need to be optimized during an SEO process, besides those related to the required web publishing functionality.
To facilitate this CMS assessment and selection I've created a checklist of the different SEO-related requirements to take into consideration, from crawlability & indexability to relevance & sharing. You can download a bigger version of the checklist here.
Let’s go through these SEO requirements:
Crawlability & Indexability
Each of the site's pages should be served through only one URL. Featuring the same content at many URLs, or serving all of the site's content through a single URL, should be avoided. Content should be included directly as text in the HTML, avoiding Flash or scripts. Content should always be visible to users and accessible to search bots; cloaking should be avoided.
Although it's not necessary for the URLs to be optimized by default -as the optimal URL structure & pattern might change from site to site- what's important is having the flexibility to easily configure & personalize the URLs so they are descriptive & hierarchically organized, avoiding parameters & session IDs in the URLs that show the site content.
Canonicalization & redirects
The CMS should facilitate the inclusion & configuration of canonical annotations to indicate the original URL of your site's pages, as well as the implementation of 301-redirects, whether to point to your preferred URLs (from the non-www to the www versions of your pages, for example) to avoid content duplication issues, or to refer visitors & bots to the new URLs when changing your pages' addresses.
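For reference, a canonical annotation is a single tag in the page's head, and a 301-redirect is usually configured at the server level; the Apache rules below are just an illustrative sketch with a placeholder domain:

```html
<!-- In the <head> of the duplicate/alternative URL, pointing to the preferred one -->
<link rel="canonical" href="https://www.example.com/category/product-page/" />
```

```apache
# Example .htaccess rules (Apache) 301-redirecting the non-www host to the www version
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```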
Should allow you to easily create & edit the robots.txt file to block those areas of your site that you don't want search engines to crawl, and to indicate the location of your XML sitemap.
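As a reference, a minimal robots.txt blocking a couple of hypothetical private areas and pointing to the XML sitemap could look like this (the paths & domain are placeholders):

```text
User-agent: *
Disallow: /search/
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```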
Should provide the option to easily add & configure the meta robots tag on all or specific pages of your site, with the capacity to noindex them to avoid duplicate content, such as what can be generated through internal search results.
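The tag itself is a one-liner the CMS should let you set per page or per page type, for example:

```html
<!-- In the <head> of pages (e.g. internal search results) that shouldn't be indexed -->
<meta name="robots" content="noindex, follow" />
```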
Should provide features or extensions/plugins to optimize page speed by enabling caching & compression, minifying resources, removing render-blocking JS, etc.
Automatically generates XML sitemaps for your site's content -pages, images, videos- while facilitating their configuration to avoid the inclusion of blocked, noindexed or canonicalized URLs.
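For reference, a generated sitemap is a simple XML file following the Sitemaps protocol; the URL & date below are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/category/product-page/</loc>
    <lastmod>2015-06-01</lastmod>
  </url>
</urlset>
```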
Facilitates indicating paginated series through the inclusion of rel="next" & rel="prev" tags.
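These annotations go in the head of each page of the series, referencing its neighbors; for a hypothetical page 2 of a category listing:

```html
<!-- In the <head> of page 2 of a paginated series -->
<link rel="prev" href="https://www.example.com/category/page/1/" />
<link rel="next" href="https://www.example.com/category/page/3/" />
```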
Provides multi-device support with desktop, tablet & mobile-friendly Web versions, whether through responsive design, dynamic serving or parallel mobile sites.
Allows easily configuring error pages with the appropriate HTTP status codes.
Follows Web standards and provides clean HTML code with externalized CSS & JS.
Provides a flexible categorization feature that allows pages to be hierarchically organized by topic while avoiding content duplication issues.
Provides text-based navigation with menus & breadcrumbs featuring customizable anchor text & rel="nofollow" attributes, linking directly to each relevant page.
Allows managing internal search results to avoid content duplication issues & configuring the sitelinks search box.
Allows enabling independent language/country versions with differentiated Web structures & tagging the site's pages with hreflang annotations.
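For reference, hreflang annotations cross-reference every language/country alternative of a page (plus an optional x-default); the URLs & structure below are placeholders:

```html
<!-- In the <head> of each version, cross-referencing all language/country alternatives -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/" />
<link rel="alternate" hreflang="es-es" href="https://www.example.com/es/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```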
Allows personalizing & optimizing the title tag of each page & additionally setting rules to automate their generation by using patterns.
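As an illustration of this kind of rule, a title pattern might combine the page name, category & brand while dropping parts that would push the title past typical display limits; this is just a hypothetical sketch, not any particular CMS's implementation:

```python
def build_title(page_name, brand, category=None, max_length=60):
    """Build a title tag from a pattern like 'Page - Category | Brand',
    dropping parts if the result would exceed max_length characters."""
    candidates = []
    if category:
        candidates.append(f"{page_name} - {category} | {brand}")
    candidates.append(f"{page_name} | {brand}")
    candidates.append(page_name)
    for title in candidates:
        if len(title) <= max_length:
            return title
    # Fall back to a hard truncation of the page name itself
    return page_name[:max_length]

print(build_title("Red Running Shoes", "ShoeStore", "Footwear"))
```

The point of the pattern approach is that editors only fill in the page name, while the CMS handles the rest consistently across thousands of pages.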
Allows personalizing & optimizing the meta description of each page & setting rules to automate them.
Includes relevant & descriptive heading tags (H1, H2, H3, etc.) on each page & allows personalizing them.
Provides features or extensions to use Schema.org markup when relevant & show rich snippets for the supported content types.
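For instance, Schema.org markup for a product page can be added as a JSON-LD block in the page; the values below are placeholders:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Red Running Shoes",
  "offers": {
    "@type": "Offer",
    "price": "59.99",
    "priceCurrency": "USD"
  }
}
</script>
```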
Allows personalizing images & adding a relevant ALT description to each.
Allows adding relevant captions & transcriptions to videos (and using an HTML5 video player instead of a Flash one).
Enables indexable comments, Q&A & reviews on the site's pages.
Social tags & buttons
Provides functionalities to include social tags (Open Graph, Twitter Cards, Rich Pins) & integrate social buttons.
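For reference, these social tags are meta elements in the page's head; the content values below are placeholders:

```html
<!-- Open Graph tags (also read by most social networks) -->
<meta property="og:title" content="Page title" />
<meta property="og:description" content="Short page description" />
<meta property="og:image" content="https://www.example.com/image.jpg" />
<meta property="og:url" content="https://www.example.com/page/" />
<!-- Twitter Card tags -->
<meta name="twitter:card" content="summary_large_image" />
```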
Allows easily including Web analytics code on all of the site's pages & integrating with Webmaster Tools.
Provides Web security features & extensions that help to protect against hacking attacks.
Provides functionalities to perform frequent & schedulable backups.
Provides frequent platform updates with fixes & new features.
Allows easily exporting the site's content & configuration.
Supports the generation of RSS feeds to facilitate subscriptions.
Sometimes it's not possible to have all of the desired features, and it's a must to prioritize the requirements that are most critical, will have the highest SEO impact and will best facilitate the optimization process.