Some of the well-known tags commonly used in SEO are the following three meta tags: the meta title tag, the meta keywords tag and the meta description tag:
<meta name="title" content="title goes here" />
<meta name="keywords" content="keywords, for, the, page, go, here" />
<meta name="description" content="Here you will find a textual description of the page" />
A lot has been written about the benefits of using them, and almost as much claiming that search engines no longer take them into account. Either way, whether or not they are used in the calculation of SERPs (Search Engine Results Pages), nobody disputes the benefit of having them correctly set on all your pages. Meta description tags, at least, are somehow considered by Google, since Google Webmaster Tools warns you about pages with duplicate meta descriptions:
Differentiate the descriptions for different pages. Using identical or similar descriptions on every page of a site isn't very helpful when individual pages appear in the web results. In these cases we're less likely to display the boilerplate text. Wherever possible, create descriptions that accurately describe the specific page. [...]
Download the VB project code
The question is not “should I use meta tags in my pages?”. The real questions (and here comes the problem) are “how can I manage to create individual meta descriptions for all my pages?” and “how can I automate the process of creating meta keywords?”. Doing it by hand would be too much work (or too much technical work) for you (or your users, if they create content on their own).
For instance, consider a CMS (Content Management System) in which users are prompted for some fields in order to create a new entry. In its simplest form, the CMS asks the user to enter a title and content for the new entry. In advanced-user mode, the CMS could also ask the user to suggest some keywords, but the user will probably enter just two, three or four words (if any). The CMS needs a way to automatically guess and suggest a default set of meta keywords based on the content before the new entry is finally saved. Those keywords could then be reviewed by the user, completed if necessary, and accepted. Meta titles and meta descriptions are much easier, but they will also be covered in our code.
In our sample VB project we will not suggest keywords for the user to confirm; we will just calculate them on the fly and set them without user intervention. We will use a dummy VirtualPathProvider that overrides the GetFile function in order to retrieve the virtualPath file from the real file system. It is not a real VirtualPathProvider in the full sense, just a wrapper that takes control of the files being served to ASP.NET before they are actually compiled. A VirtualPathProvider is commonly used to seamlessly map path URLs to databases or any source of data other than the file system itself. Our custom class inheriting from VirtualPathProvider will be called FileWrapperPathProvider. In our case it will not use the full potential of VirtualPathProviders, since we will only retrieve the data from the file system, make minor changes to the source code on the fly and return the result to be compiled. This introduces a little overhead and some extra CPU cycles before the compilation of each page, but that only happens once, until the file needs to be compiled again (because the underlying file has changed, for instance).
Our FileWrapperPathProvider.GetFile function will return a FileWrapperVirtualFile whenever the requested virtualPath meets the conditions of the IsPathVirtual function: the file extension is .aspx or .aspx.vb and the path of the requested URL follows the scheme ~/xx/, that is to say, under a folder of two characters (for the language: ~/en/, ~/de/, ~/fr/, ~/es/, ...). Otherwise, it will return a VirtualFile handled by the previously registered VirtualPathProvider; i.e. none, or the file system itself without any change.
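To make the mechanism concrete, here is a minimal sketch of what such a provider can look like. The class and member names (FileWrapperPathProvider, FileWrapperVirtualFile, IsPathVirtual, GetFile) come from the sample project; the regex and the FileExists pass-through are illustrative assumptions of this sketch, not the exact sample code:

Imports System.Web
Imports System.Web.Hosting
Imports System.Text.RegularExpressions

Public Class FileWrapperPathProvider
    Inherits VirtualPathProvider

    ' True for .aspx / .aspx.vb files under a two-character language folder (~/en/, ~/de/, ...).
    Private Function IsPathVirtual(ByVal virtualPath As String) As Boolean
        Dim checkPath As String = VirtualPathUtility.ToAppRelative(virtualPath)
        Return Regex.IsMatch(checkPath, "^~/[a-z]{2}/.*\.aspx(\.vb)?$", RegexOptions.IgnoreCase)
    End Function

    Public Overrides Function FileExists(ByVal virtualPath As String) As Boolean
        Return Previous.FileExists(virtualPath)
    End Function

    Public Overrides Function GetFile(ByVal virtualPath As String) As VirtualFile
        If IsPathVirtual(virtualPath) Then
            ' Wrap the real file so its contents can be rewritten on the fly in Open().
            Return New FileWrapperVirtualFile(virtualPath)
        Else
            ' Everything else is served by the previously registered provider
            ' (i.e. the file system, unchanged).
            Return Previous.GetFile(virtualPath)
        End If
    End Function
End Class

The provider would typically be registered at application startup with HostingEnvironment.RegisterVirtualPathProvider(New FileWrapperPathProvider()).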
We have chosen to use a VirtualPathProvider wrapper around the real file system just to show what kind of things can be done with this class. If your data is in a database instead of static files, you will probably be using your own VirtualPathProvider already, and in that case it will work by virtualizing the requested path and retrieving the file contents from the database instead of the file system. Whatever the case, you can adapt it to your scenario in order to make use of the idea illustrated in this post.
The idea is somewhat twisted or cumbersome:
- Parse the code-behind file for the page being requested (the .aspx.vb file) and, using regular expressions (regex), replace the base class so that the page no longer inherits from System.Web.UI.Page but inherits from System_Web_UI_ProxyPage instead (a custom class of our own). This proxy page class declares public MetaTitle, MetaDescription and MetaKeywords properties and links them to the underlying meta title, meta description and meta keywords declared inside the head tag in the master page. When a page inherits from our System_Web_UI_ProxyPage, it exposes those three properties, which can then be easily set. See System_Web_UI_ProxyPage.OnLoad in our sample project (a sketch is shown after this list).
- Read and parse the .aspx file linked to the former .aspx.vb file (the same name without the .vb) and make a call to the JAGBarcelo.MetasGen.GuessMetasFromString method, which does the main job with the file contents. See the FileWrapperVirtualFile.Open function in the sample project.
- Besides changing the base class to the one of our own, we add a few lines to create (or extend) the Page_Init method in that .aspx.vb file. In those few lines of code, added on the fly, we set the three properties exposed by the System_Web_UI_ProxyPage class with the values we have just calculated.
- Return the Stream as the output of the VirtualFile.Open function with the modified contents, so that they can be compiled by the ASP.NET engine, based on the underlying real file and using the formerly calculated meta title, meta keywords and meta description. Note that this is all done in memory; the actual file system is never written to. The real files (.aspx.vb and .aspx) are read and parsed, and the virtual contents are created on the fly and handed to ASP.NET. You need to be really careful here, since you can run into compile-time errors in places that are hard to track down: the file-system versions of the files are the base contents, but not the actual contents being compiled.
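As an illustration, here is a minimal sketch of what the proxy page class can look like. The class name and the three properties come from the post; the way OnLoad locates the meta controls in the master page's head (scanning Page.Header for HtmlMeta controls) is an assumption of this sketch:

Imports System
Imports System.Web.UI
Imports System.Web.UI.HtmlControls

Public Class System_Web_UI_ProxyPage
    Inherits System.Web.UI.Page

    Public Property MetaTitle As String
    Public Property MetaDescription As String
    Public Property MetaKeywords As String

    Protected Overrides Sub OnLoad(ByVal e As EventArgs)
        MyBase.OnLoad(e)
        If Header IsNot Nothing Then
            ' Push the calculated values into the <title> and <meta> tags
            ' declared in the master page's <head runat="server">.
            Title = MetaTitle
            For Each ctl As Control In Header.Controls
                Dim meta As HtmlMeta = TryCast(ctl, HtmlMeta)
                If meta IsNot Nothing Then
                    Select Case meta.Name.ToLowerInvariant()
                        Case "description" : meta.Content = MetaDescription
                        Case "keywords" : meta.Content = MetaKeywords
                    End Select
                End If
            Next
        End If
    End Sub
End Class

The few lines injected on the fly into each .aspx.vb would then be something like the following, the literal values being the metas calculated for that page:

Protected Sub Page_Init(ByVal sender As Object, ByVal e As EventArgs) Handles Me.Init
    MetaTitle = "Calculated title"
    MetaDescription = "Calculated description for this page..."
    MetaKeywords = "calculated, keywords, for, this, page"
End Sub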
The way we calculate the metas in JAGBarcelo.MetasGen.GuessMetasFromString (sketched after this list) is:
- Select the proper set of noise words depending on the language of the text.
- Look for the content inside the whole text. It must be inside ContentPlaceHolders (we will assume you are using master pages), and we will look for the particular ContentPlaceHolder that contains the main body/contents of the page. Change the LookForThisContentPlaceHolder constant inside the MetasGen.vb file to match your own master page's ContentPlaceHolder names.
- Calculate the meta title as the text within the first <h1> tags right after the searched ContentPlaceHolder.
- Iterate through the rest of the content, counting single-word and two-word phrase occurrences, discarding noise words for the given language.
- Calculate the keywords, creating a string filled with the most frequent single-word occurrences (up to 190 characters) and the most frequent two-word occurrences (up to 250 characters in total).
- Calculate the description, concatenating previously parsed content to create a string of between 200 and 394 characters. Those two figures are not randomly chosen: Google Webmaster Tools warns you when any of your pages has a meta description shorter than 200 or longer than 394 characters (based on my experience).
- Return the calculated title, keywords and description in the proper ByRef parameters.
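Put together, the method's skeleton looks roughly like this. The namespace, method name and ByRef outputs come from the post; the parameter names and exact signature in the sample project may differ:

Namespace JAGBarcelo
    Public Class MetasGen
        Public Shared Sub GuessMetasFromString(ByVal contents As String, _
                                               ByVal language As String, _
                                               ByRef title As String, _
                                               ByRef keywords As String, _
                                               ByRef description As String)
            ' 1. Pick the noise-word lists for the language (EN, ES, FR, DE, IT).
            ' 2. Locate the main ContentPlaceHolder (LookForThisContentPlaceHolder).
            ' 3. Title: text of the first <h1>...</h1> right after it.
            ' 4. Count single-word (ht1) and two-word (ht2) occurrences, skipping noise words.
            ' 5. Keywords: top ht1 entries (up to 190 chars) plus top ht2 entries (250 chars total).
            ' 6. Description: concatenated first paragraphs, kept between 200 and 394 characters.
        End Sub
    End Class
End Namespace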
A good thing about this approach, using a VirtualFile, is that you can easily apply it to your already existing website. No matter how many pages your site has (hundreds, thousands, ...), this code adds meta titles, meta keywords and meta descriptions to all your pages automatically and transparently, without user intervention, with very few modifications (if any) to your existing pages, and it scales well.
Counting word occurrences.
We iterate through the words within the text under consideration (the ContentPlaceHolder) and store their occurrences in a HashTable (ht1 for single words and ht2 for two-word phrases). All words are considered in their lowercase form. A word must have more than two characters to be taken into account and must not start with a number. If it passes this fast initial test, it is checked against a noise-word list. If it is not a noise word, it is checked against the existing values in the proper HashTable and either included (ht1.Add(word, 1)) or, if it was already there, its value incremented (ht1(word) = ht1(word) + 1).
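A condensed sketch of that loop, assuming a single merged noise-word list and a simple regex-based word splitter (the sample project keeps separate MostCommonWordsXX and MostCommonConjugatedVerbsXX arrays; the module and parameter names here are illustrative):

Imports System.Collections
Imports System.Text.RegularExpressions

Public Module WordCounter
    Public Sub CountOccurrences(ByVal text As String, ByVal noiseWords As ArrayList, _
                                ByVal ht1 As Hashtable, ByVal ht2 As Hashtable)
        Dim previous As String = Nothing
        For Each m As Match In Regex.Matches(text.ToLowerInvariant(), "[\w'-]+")
            Dim word As String = m.Value
            ' Skip short words, words starting with a number, and noise words.
            If word.Length <= 2 OrElse Char.IsDigit(word(0)) OrElse noiseWords.Contains(word) Then
                previous = Nothing
                Continue For
            End If
            If ht1.ContainsKey(word) Then
                ht1(word) = CInt(ht1(word)) + 1
            Else
                ht1.Add(word, 1)
            End If
            ' Two-word phrases: pair each accepted word with the previous accepted one.
            If previous IsNot Nothing Then
                Dim phrase As String = previous & " " & word
                If ht2.ContainsKey(phrase) Then
                    ht2(phrase) = CInt(ht2(phrase)) + 1
                Else
                    ht2.Add(phrase, 1)
                End If
            End If
            previous = word
        Next
    End Sub
End Module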
Regarding the noise words, we first considered some word-frequency lists available out there, but then we thought about using verb conjugations as well. So we first created MostCommonWordsEN, an array based on simple frequency lists, and then we also created MostCommonVerbsEN, based on another frequency list that considered only verbs. Finally we created MostCommonConjugatedVerbsEN, where we stored all the conjugations of the former most common English verbs. When checking a word against these word lists we only use MostCommonWordsXX and MostCommonConjugatedVerbsXX (where XX is one of EN, ES, FR, DE, IT). Yes, we did the same for other languages, namely Spanish, French, German and Italian, whose conjugations are much more complex than the -ed, -ing and -s terminations of English. For the automatic generation of all possible conjugations of the given verbs (in their infinitive form) we used http://www.verbix.com/.
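For illustration, the lists are plain arrays and the check can be as simple as the following (the real arrays in MetasGen.vb are of course much longer; these few entries and the helper function are just illustrative):

Public Module NoiseWordsEN
    Public MostCommonWordsEN As String() = New String() _
        {"the", "and", "that", "with", "from", "this", "have", "not", "are", "you"}
    Public MostCommonConjugatedVerbsEN As String() = New String() _
        {"be", "is", "was", "were", "been", "do", "does", "did", "go", "goes", "went", "gone"}

    ' A word is noise if it appears in either list for the page's language.
    Public Function IsNoiseWord(ByVal word As String) As Boolean
        Return Array.IndexOf(MostCommonWordsEN, word) >= 0 OrElse _
               Array.IndexOf(MostCommonConjugatedVerbsEN, word) >= 0
    End Function
End Module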
Calculating meta title.
It will be the text between the first <h1> and </h1> heading tags right after the main ContentPlaceHolder of the page.
Calculating meta description.
Most of the time, a description of what a whole text is about (or at least what it should be about) lies within its first paragraphs. Based on that assumption, we try to parse and concatenate the text within paragraphs (<p></p> tags) after the first <h1> tag. In our experience, when a meta description tag is longer than 394 characters, Google Webmaster Tools complains about it being too long. With that in mind, we concatenate html-cleaned text from the first paragraphs of the text to create the meta description tag, ensuring it is not longer than 394 characters. Once we know how our meta descriptions are automatically created, all we need to do is create our pages starting with an <h1> header tag followed by one or more paragraphs (<p></p>) that will be the source for the meta description of the page. This will be suitable for most scenarios; in other cases, you should modify the way you create your pages or update the code to match your needs.
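A sketch of that builder, assuming simple regex-based HTML cleaning (the 200/394 limits are the ones discussed above; the module and function names are illustrative):

Imports System.Text.RegularExpressions

Public Module DescriptionBuilder
    Public Function BuildDescription(ByVal html As String) As String
        Const MinLen As Integer = 200
        Const MaxLen As Integer = 394
        ' Only consider content after the first <h1> heading.
        Dim h1 As Match = Regex.Match(html, "<h1[^>]*>", RegexOptions.IgnoreCase)
        Dim body As String = If(h1.Success, html.Substring(h1.Index), html)
        Dim description As String = ""
        For Each p As Match In Regex.Matches(body, "<p[^>]*>(.*?)</p>", _
                                             RegexOptions.IgnoreCase Or RegexOptions.Singleline)
            ' Strip inner tags and collapse whitespace.
            Dim clean As String = Regex.Replace(p.Groups(1).Value, "<[^>]+>", " ")
            clean = Regex.Replace(clean, "\s+", " ").Trim()
            If clean.Length = 0 Then Continue For
            ' Stop before exceeding the upper bound; stop once the lower bound is reached.
            If description.Length + clean.Length + 1 > MaxLen Then Exit For
            description = (description & " " & clean).Trim()
            If description.Length >= MinLen Then Exit For
        Next
        Return description
    End Function
End Module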
Calculating meta keywords.
Given the noise-word lists for a language, calculating the keyword (single-word) and key-phrase (two-word) occurrences within the text is straightforward. We just iterate through the text, check each word against the noise words, and add a new keyword or increment its frequency if it is already in the HashTable. At the end of the iteration, we sort the HashTables by descending frequency (using a custom class implementing System.Collections.IComparer). The final keyword list is a combination of the most frequent single keywords (ht1), up to 190 characters, and the most frequent two-word key phrases (ht2), until completing a maximum of 250 characters. All of them are comma-separated values in lowercase.
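The comparer and the final assembly can be sketched as follows (the class and variable names are illustrative; the 190/250 limits are the ones described above):

Imports System.Collections

Public Class DescendingFrequencyComparer
    Implements IComparer

    ' Order Hashtable entries by descending occurrence count.
    Public Function Compare(ByVal x As Object, ByVal y As Object) As Integer _
        Implements IComparer.Compare
        Dim countX As Integer = CInt(CType(x, DictionaryEntry).Value)
        Dim countY As Integer = CInt(CType(y, DictionaryEntry).Value)
        Return countY.CompareTo(countX)
    End Function
End Class

' Sort the single-word table and take the most frequent entries up to 190 characters;
' the same pattern is then repeated with ht2 until the 250-character total is reached.
Dim entries As New ArrayList(ht1)
entries.Sort(New DescendingFrequencyComparer())
Dim keywords As String = ""
For Each entry As DictionaryEntry In entries
    Dim candidate As String = If(keywords.Length = 0, CStr(entry.Key), keywords & ", " & CStr(entry.Key))
    If candidate.Length > 190 Then Exit For
    keywords = candidate
Next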
Summary.
Having meta tags correctly set is a must; however, it is sometimes difficult to set them manually on every page, let alone remember all possible keyword combinations. Too frequently only a few words are added, and this is where automatic keyword handling can help. If you think this might be your case, please download our sample VB project and give it a try (and add a few debug traces too). I will be waiting for your comments.