
2009/03/21

Internet Information Services (IIS) optimization

It has been a long time since my last post. For the last eight months I have been working on IIS-hosted web pages: ASP.NET 3.5, master pages, and so on.

When I thought most of the work was done (master page design, CSS/HTML editing, linking between pages, and so on), I ran into the other side of the problem: SEO, page size optimization, download times, conditional GETs, meta tags (title, keywords, description), and page compression (gzip, deflate). The biggest part of the iceberg was under the water; I had a lot to learn, and a lot of lines to code.

Now that all those things are in place and running, I would like to share what I have learnt with the community, in a series of posts that will cover:

  • ASP.NET menu control optimization: reduces the page size and speeds up downloads; desirable to have in place before using conditional GETs.
  • __VIEWSTATE size minimization: in our case it simply doubled the size of the page, and a proper optimization can cut the page to half its size (or less); see the first sketch after this list.
  • Conditional GET and ETag implementation for ASP.NET: generating the ETag and Last-Modified headers, and deciding when and how to return 304 Not Modified with no content (saves bandwidth and makes your site more responsive); sketched below.
  • Solving the CryptographicException "Padding is invalid and cannot be removed" when requesting WebResource.axd: the problem is somewhat common anyway, but it will fill your EventLog with these errors once you start using conditional GETs.
  • Automatic generation of meta tags (title, description, keywords), so that editing pages becomes much simpler and faster; sketched below.
  • URL canonicalization with 301 redirects for ASP.NET: solves http/https, www/non-www, upper/lower case, and duplicate content indexing problems, among others; sketched below.
  • Serving different versions of robots.txt depending on whether it is requested via http or https; sketched below.
  • Enforcing robots.txt directives: detect badly behaved bots in the wild that do not follow the rules in robots.txt, and ban them for some months so they stop wasting our valuable bandwidth (see the ban module below).
  • Distinguishing a genuine crawl by Googlebot (or any other well-known bot) from someone pretending to be it, in order to ban the pretenders for a while; sketched below.
  • Setting up honey-pots excluded in robots.txt and banning anyone who visits those forbidden URLs; very effective against screen scrapers, offline explorers, and so on (handled by the same ban module below).
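A first taste of the __VIEWSTATE post. This is only a minimal sketch, not necessarily our final approach: the built-in SessionPageStatePersister keeps the real state in Session, so the __VIEWSTATE field only carries a small token. BasePage is a hypothetical name; derive your pages from it instead of System.Web.UI.Page.

using System.Web.UI;

public class BasePage : Page
{
    protected override PageStatePersister PageStatePersister
    {
        // SessionPageStatePersister stores the real state in Session;
        // the rendered page only contains a short identifier.
        get { return new SessionPageStatePersister(this); }
    }
}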
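Next, a sketch of the conditional GET idea: compare the validators the browser sent (If-None-Match / If-Modified-Since) against your own, and answer 304 Not Modified with no body when nothing has changed. ConditionalGet and TryReturn304 are names made up for this example; the lastModified and etag values are assumed to come from your own content store.

using System;
using System.Web;

public static class ConditionalGet
{
    // Returns true when a 304 was sent and the caller should stop rendering.
    public static bool TryReturn304(HttpContext context, DateTime lastModified, string etag)
    {
        HttpRequest request = context.Request;
        HttpResponse response = context.Response;

        // Validators sent by the browser on a repeat visit, if any.
        string ifNoneMatch = request.Headers["If-None-Match"];
        string ifModifiedSince = request.Headers["If-Modified-Since"];

        bool etagMatches = (ifNoneMatch != null && ifNoneMatch == etag);

        DateTime since;
        bool dateMatches = DateTime.TryParse(ifModifiedSince, out since)
            && since.ToUniversalTime() >= lastModified.ToUniversalTime();
        // (In production, truncate lastModified to whole seconds first.)

        if (etagMatches || dateMatches)
        {
            response.StatusCode = 304;
            response.StatusDescription = "Not Modified";
            response.SuppressContent = true;   // a 304 must carry no body
            return true;
        }

        // Emit validators so the next request can be conditional.
        response.Cache.SetCacheability(HttpCacheability.Public);
        response.Cache.SetLastModified(lastModified);
        response.Cache.SetETag(etag);
        return false;
    }
}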
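For the automatic meta tags, the mechanics are just ASP.NET's HtmlMeta control plus a <head runat="server">. A sketch with a hypothetical page and placeholder values:

using System;
using System.Web.UI;
using System.Web.UI.HtmlControls;

public partial class ProductPage : Page   // hypothetical page
{
    protected override void OnPreRender(EventArgs e)
    {
        base.OnPreRender(e);

        Page.Title = "Widgets - Acme";   // placeholder values
        AddMeta("description", "All our widgets, with prices and photos.");
        AddMeta("keywords", "widgets, acme, catalogue");
    }

    private void AddMeta(string name, string content)
    {
        HtmlMeta meta = new HtmlMeta();
        meta.Name = name;
        meta.Content = content;
        Header.Controls.Add(meta);       // requires <head runat="server">
    }
}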
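URL canonicalization boils down to issuing a single 301 early in the pipeline. A sketch from Global.asax, assuming www plus lower-case path as the canonical form:

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        Uri url = Request.Url;

        // Canonical form assumed here: www host, lower-case path.
        string canonicalHost = url.Host.StartsWith("www.") ? url.Host : "www." + url.Host;
        string canonicalPath = url.AbsolutePath.ToLowerInvariant();

        if (url.Host != canonicalHost || url.AbsolutePath != canonicalPath)
        {
            string location = url.Scheme + "://" + canonicalHost
                + canonicalPath + url.Query;   // Uri.Query keeps its leading "?"
            Response.StatusCode = 301;         // Moved Permanently
            Response.AddHeader("Location", location);
            Response.End();                    // stop processing this request
        }
    }
}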
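Serving a different robots.txt per scheme is a small IHttpHandler mapped to the path robots.txt in web.config; the Disallow lines here are only example policies:

using System.Web;

public class RobotsHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        context.Response.ContentType = "text/plain";

        if (context.Request.IsSecureConnection)
        {
            // Keep crawlers out of the https version entirely (example policy).
            context.Response.Write("User-agent: *\nDisallow: /\n");
        }
        else
        {
            context.Response.Write("User-agent: *\nDisallow: /private/\n");
        }
    }
}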
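Telling a real Googlebot from a pretender uses the double DNS lookup Google itself recommends: reverse-resolve the caller's IP, check the name is under googlebot.com, then resolve that name forward and make sure it maps back to the same IP. A sketch:

using System;
using System.Net;
using System.Net.Sockets;

public static class BotVerifier
{
    public static bool IsRealGooglebot(string ipAddress)
    {
        try
        {
            // Reverse lookup: IP -> host name.
            string host = Dns.GetHostEntry(ipAddress).HostName;

            if (!host.EndsWith(".googlebot.com", StringComparison.OrdinalIgnoreCase)
                && !host.EndsWith(".google.com", StringComparison.OrdinalIgnoreCase))
                return false;

            // Forward confirmation: the claimed host must resolve back to the IP.
            foreach (IPAddress address in Dns.GetHostEntry(host).AddressList)
                if (address.ToString() == ipAddress)
                    return true;
        }
        catch (SocketException)
        {
            // No reverse DNS record at all: certainly not a genuine Googlebot.
        }
        return false;
    }
}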
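Finally, the honey-pot and the banning share one piece: an IHttpModule that records offenders and answers 403 to banned addresses. /trap/ is a hypothetical URL that robots.txt disallows, so only rule-breakers ever request it. The in-memory table is a simplification for this sketch; bans would normally be persisted somewhere.

using System;
using System.Collections.Generic;
using System.Web;

public class BanModule : IHttpModule
{
    // Simplification: in-memory ban table.
    private static readonly Dictionary<string, DateTime> Banned =
        new Dictionary<string, DateTime>();

    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            string ip = ctx.Request.UserHostAddress;
            bool banned;

            lock (Banned)
            {
                // /trap/ is disallowed in robots.txt, so only rule-breakers hit it.
                if (ctx.Request.Path.StartsWith("/trap/", StringComparison.OrdinalIgnoreCase))
                    Banned[ip] = DateTime.UtcNow.AddMonths(2);   // ban for some months

                DateTime until;
                banned = Banned.TryGetValue(ip, out until) && until > DateTime.UtcNow;
            }

            if (banned)
            {
                ctx.Response.StatusCode = 403;   // Forbidden
                ctx.Response.End();
            }
        };
    }

    public void Dispose() { }
}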

Since we use Google Webmaster Tools and Google Analytics for all our websites, we had the opportunity to check the consequences of every change. For instance, here is the graph that shows the decrease in the number of kilobytes downloaded per day once we enabled HTTP compression and put conditional GETs in place. Note how the number of crawled pages stays more or less the same during the period, while the kilobytes downloaded per day slide down past mid-January (the peaks match several master page updates).
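For reference, the compression itself can be as small as an HttpModule that wraps the response stream in a GZipStream whenever the browser advertises gzip support (on IIS 7 you would normally just enable dynamic compression instead). A sketch:

using System;
using System.IO.Compression;
using System.Web;

public class CompressionModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += delegate(object sender, EventArgs e)
        {
            HttpContext ctx = ((HttpApplication)sender).Context;
            string acceptEncoding = ctx.Request.Headers["Accept-Encoding"];

            if (acceptEncoding != null && acceptEncoding.Contains("gzip"))
            {
                ctx.Response.Filter =
                    new GZipStream(ctx.Response.Filter, CompressionMode.Compress);
                ctx.Response.AppendHeader("Content-Encoding", "gzip");
                // Tell caches to keep compressed and plain copies apart.
                ctx.Response.AppendHeader("Vary", "Accept-Encoding");
            }
        };
    }

    public void Dispose() { }
}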

2007/03/01

Top 10 searches that led to this blog during February '07

Keywords | Visits | Blog entry
mssql$microsoftsmlbiz | 10 | Showing posts with label business contact manager
messenger sharing folders usn journal reader service | 9 | Microsoft Live Messenger released
sql server cte csv | 9 | INNER JOIN with a comma separated values CSV field
fwsrv | 7 | ISA Server 2004: fwsrv stopped responding to all requests
sqlagent$microsoftsmlbiz | 7 | Outlook 2003 Business Contact Manager served by SQL Server 2005
could not find row in sysindexes for database id 7, object id 1, index id 1. run dbcc checktable on sysindexes. | 6 | Outlook 2003 Business Contact Manager served by SQL Server 2005
"msdewriter" informó acerca de un error 0x800423f4. esto forma parte del estado del sistema. la copia de seguridad no puede continuar. (in English: "msdewriter" reported error 0x800423f4. This is part of the system state. The backup cannot continue.) | 5 | VSS & ntbackup errors
schema_option | 5 | How to deploy foreign keys to subscribers that are using queued updates
select permission denied on object 'sysobjects', database 'mssqlsystemresource', schema 'sys' | 4 | Showing posts with label sql server
microsoftsmlbiz | 4 | Showing posts with label business contact manager
Totals: 1277 visits
It might not seem like many visits at first glance, but if you consider that this blog has only 32 entries so far, that they are rather technical, and that it is less than nine months old... well, maybe you will change your mind. Thanks to you all. I am looking forward to your comments.

2006/12/27

Microsoft SQL Server Search Engine using Google co-op

Google has recently launched a beta service called Google co-op. According to their overview:
Using it you can create a highly specialized Custom Search Engine that reflects your knowledge and interests.
Since it seemed a good idea, I gave it a try and created a customised search engine for specialised content on Microsoft SQL Server. This search engine lets you use the power and knowledge of Google while refining the results of your queries with additional keywords, giving a higher weight to the sites you consider most interesting, in order to return more precise results. In short, all I have done is customise the search engine with websites related to SQL Server (Microsoft, blogs, usenet groups, and so on). Feel free to give it a try and suggest more sites to include so I can refine the searches further. It seems to work fine.