
HttpCompress

An open compression engine for ASP.NET

News

Mar 12, 2012 – Moved to GitHub

I decided to move the source over to GitHub. Pull requests happily accepted.

Jan 14, 2008 – Version 7 available

A while ago I put together a patch for the very commonly reported bug “HttpCompress doesn’t work for documents in the root of a site”, but never cut a release. So, here’s a release with that patch incorporated.

Oct 27, 2007 – Project Hosting now at Google Code

In an effort to allow others to easily contribute to this project, I’m now hosting it at http://code.google.com/p/httpcompress/. Head over there to track issues, contribute patches and keep up to date on what’s going on. I’ll still keep current downloads available here, but the build of the project will be taking place over on the new site.

Nov 15, 2005 – Version 6 for .NET 2.0 released!

This is a pretty simple recompile of the version 6 source, targeting the 2.0 version of the .NET Framework. It now uses the built-in deflate and gzip streams found in System.IO.Compression instead of #ziplib. I consider this a beta release; please test it thoroughly before releasing it onto any production systems. [Binary Only, Source Only]
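
For reference, here is a minimal sketch of what that swap looks like on .NET 2.0. The helper class and method names below are illustrative, not the module’s actual internals; the point is just that the built-in streams now do the work that #ziplib used to do:

using System.IO;
using System.IO.Compression;

// Hypothetical helper: wrap the response filter chain in one of the
// built-in .NET 2.0 compressing streams.
static class CompressingFilterFactory
{
    public static Stream Wrap(Stream inner, string algorithm)
    {
        if (algorithm == "gzip")
            return new GZipStream(inner, CompressionMode.Compress);
        if (algorithm == "deflate")
            return new DeflateStream(inner, CompressionMode.Compress);
        return inner; // compressionLevel="none" or an unsupported encoding
    }
}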

Apr 21, 2004 – Version 6 released!

Another bugfix release:

  • Changed how the Content-Encoding header is written. It is now written on the first call to the compressing stream’s Write method (a rough sketch follows below). This fixes the issue where the response would come back with a header indicating it was compressed when the filter was really skipped. This allows Server.Transfer and the default exception reporting mechanism in ASP.NET to work without modification, though their output will not be compressed.
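
Roughly, the new ordering looks like the sketch below, assuming a filter stream that keeps a reference to the response; the field names here are made up for illustration:

// Sketch: defer the Content-Encoding header until bytes actually flow
// through the filter, so a skipped filter never advertises compression.
private bool _headerWritten;

public override void Write(byte[] buffer, int offset, int count)
{
    if (!_headerWritten)
    {
        // Only now do we know the filter really sits in the output path.
        _response.AppendHeader("Content-Encoding", _encodingName);
        _headerWritten = true;
    }
    _compressedStream.Write(buffer, offset, count);
}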

Mar 19, 2004 – Version 5 released!

This is mainly a bugfix release over version 4. New Features:

  • (v5) Plays nice with the OutputCache using the VaryByHeader property (see the directive example below this list).
  • (v5) No longer installs the filter if the CompressionLevel is set to “None”.
  • (v5) No longer throws an exception if a q-value cannot be parsed.
  • (v5) Properly installs the INSTALLED_TAG, preventing double processing when a filter is not installed.
  • Path-based exclusions
  • ContentType / MimeType based exclusions
  • A newer SharpZipLib
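
For the OutputCache interaction noted above, the usual way to keep the output cache from serving a compressed entry to a client that cannot accept it is to vary the cache by the Accept-Encoding header; a minimal page directive looks like this:

<%@ OutputCache Duration="60" VaryByParam="None" VaryByHeader="Accept-Encoding" %>

The Duration and VaryByParam values here are placeholders; the relevant piece is the VaryByHeader attribute.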

Download

Current Version
Older Versions

Known Bugs

Compression breaks when Server.Transfer is used
In ASP.NET 1.0 and 1.1, Server.Transfer and response filters do not play well together. Thankfully, a fix is available.
Images compressed with the filter are broken
This usually occurs with Internet Explorer 5.5 or 6.0. Both of these browsers can drop the first 2048 bytes of the response, breaking images that are compressed. The situation is documented by KnowledgeBase articles Q312496 for IE 6 and Q313712 for IE 5.5. The latest service pack fixes the issue for IE6, while you have to get a hotfix for IE 5.5.

Articles

OnDotNet – Filtering HTTP Requests with .NET
An article about filtering HTTP requests using ASP.NET. The article was based upon my experiences building the HttpCompressionModule.

40 replies on “HttpCompress”

I found a bug with this compression module. I have tried to debug it but have not found a way. When you use the ASP.NET substitution control with this module, the substitution does not occur.

Thoughts?

If that environment supports HttpModules and Response.Filter, it should work. Are you talking about the super-Cassini that ships with the more recent ASP.NET environments?

That said, you probably want the svn version right now. It fixes a bug with paths to apps that live off the root of the application instead of in a virtual directory.

Thanks Ben, for HttpCompress. I’m using it to compress the data from an ASP.NET web service. I’d like to know if the deflate compression method uses zlib. If so, how can I turn on the zlib headers in the response?

Hi, it’s a good tool. Do you have any plans to add functionality to compress WebResource.axd files, and also other resource files emitted by third-party components such as Telerik, Infragistics, etc.?

Hi Ben,

I’m using your compression module and it works very well. I use some PageMethods in my application and have noticed that the response from these does not get compressed. Is the mime type of “application/json” not supported at all?

Thanks.

I have configured this according to “http://www.c-sharpcorner.com/UploadFile/Ihelpable/httpcompression05022006094806AM/httpcompression.aspx?ArticleID=b0c4a586-97b4-4fbd-a95a-8b1200f0e068”. I tested compression using “http://www.whatsmyip.org/http_compression/” before and after configuring, and it shows compression working. My client is running Windows XP. The site comes up fine after compressing in Firefox and Chrome, but IE7 cannot display the site. Adding a single .aspx page to the exception list allows IE7 to display that page. Any suggestions on getting the site to display in IE7?

I downloaded the bin folder, added the reference, and modified the web.config, but then all my pages display with weird characters. Any solution?

thanks a lot

I’ve got this installed on IIS 6.0 and it appears to be working well. Thanks for providing.

I noticed, however, that the compression is not as good as IIS for some files. I was able to get IIS to compress EVERYTHING by editing the metabase and it compressed my file from 146k to 10k. With HttpCompressionModule, that same file is stuck at 22k.

Any idea why?

Today we tried to access our ASP.NET app via Android HTC Dream and it failed with the error: “Invalid Use of Response Filter” and references to GZip.

Every other browser, including iPhone and Palm Pre, along with all the standard PC/Mac browsers, work fine.

Any idea why I might be getting the error?

I found this reference at: http://www.nsilverbullet.net/CategoryView,category,AJAX.aspx

***************************
After doing this I received an “Invalid use of response filter” exception (more info at dasBlog.us) which turns out to be because my host already implements httpCompression, but disabling the blowery handler sorted it out.
***************************

We are running blowery.Web.HttpCompress.dll dated 11/18/07, v6.0.0.0, and blowery.Web.HttpCompress.xml dated 10/22/07.

Our web.config is like this:

Thanks for any advice you can give me.

Sorry, a few other points and a correction. We are running .NET Framework 1.1, and here’s our web.config:

<configuration>
  <configSections>
    <sectionGroup name="blowery.web">
      <section name="httpCompress" type="blowery.Web.HttpCompress.SectionHandler, blowery.Web.HttpCompress" />
    </sectionGroup>
  </configSections>

  <blowery.web>
    <httpCompress preferredAlgorithm="deflate" compressionLevel="high">
      <excludedMimeTypes>
        <add type="image/jpeg" />
        <add type="image/jpg" />
        <add type="image/gif" />
      </excludedMimeTypes>
      <excludedPaths>
        <add path="NoCompress.aspx" />
        <add path="spread11_msoft.aspx" />
        <add path="ScriptResource.axd" />
        <add path="WebResource.axd" />
      </excludedPaths>
    </httpCompress>
  </blowery.web>

  <httpModules>
    <!-- blowery -->
    <add name="CompressionModule" type="blowery.Web.HttpCompress.HttpModule, blowery.web.HttpCompress" />
    <!-- end blowery -->
  </httpModules>

I tried it and got no errors, but when I run my site I only get weird symbols. If I set compressionLevel="none" everything works fine again.

Any reason for that behavior?

Thanks a lot

On IIS 6, with a wildcard mapping to ASP.NET, it does not compress static resources.
In the debugger I never reach the CompressContent method.

I noticed the same problem as mogadanez: httpCompress will compress static files when run in the Visual Studio Web Development Server, but not in IIS 6 with a wildcard mapping set. It does, however, compress all dynamic requests as expected, including all my extension-less URLs, so the wildcard mapping actually seems to work.

Anybody any idea on what could be wrong here or how to work around it?

I understand it could become too CPU intensive for high-traffic sites to compress a lot of static content on the fly (without server-side caching of the compressed content, as IIS itself can do), but that’s not an issue for me now.

Using YSlow to check the stats on my site, I noticed that even with blowery, JS and CSS files are not being compressed. My config excludes image files and binaries, but not JS or CSS. This is a .NET 2.0 site. Anyone have any ideas why this is happening? I’ve included the source as-is in my project, by the way, rather than using the precompiled binary, just for ease of deployment.

For those having issues with JS files (and other static files), I found that adding this:

to the httpHandlers node in web.config seems to work. No idea why, as surely that’s what it’s doing by default, but there you go.

No. You would need a dedicated server. I looked into doing something just as CPU intensive a while back and I kept reaching my limits. I had to move it to a dedicated server which was about $180/month. Good luck!

Yeah, known issue. I never did figure out what’s going on there. If you can, I would love for someone to figure it out.

Hey there! Thanks for writing the component and making it available. Very helpful!
The issue with .axd is that the method that checks for excluded paths looks at the full file name, not just the extension. The default settings add “.axd”, but no file is named just “.axd”! This (basic) change should solve the problem:

public bool IsExcludedPath(string relUrl)
{
    // Match an excluded entry if the URL contains it or ends with it, so a
    // bare extension such as ".axd" excludes WebResource.axd and friends.
    string url = relUrl.ToLower();
    foreach (string path in _excludedPaths)
    {
        string excluded = path.ToLower();
        if (url.Contains(excluded) || url.EndsWith(excluded))
            return true;
    }
    return false; // _excludedPaths.Contains(relUrl.ToLower());
}

BTW, the source of the issue is that .axd files are, apparently, already compressed, so this module is double-compressing them. Another option would be to check whether the response headers already contain a Content-Encoding and, if so, not add the compression filter again. That may be a better solution, but I didn’t investigate to see if it works.
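
A minimal sketch of that alternative check, assuming the module runs under the IIS 7 integrated pipeline (HttpResponse.Headers cannot be read in earlier hosting models) and that app is the HttpApplication passed to the module’s event handler:

// Hypothetical guard before installing the compressing filter:
// if something upstream already set an encoding, leave the response alone.
string existing = app.Response.Headers["Content-Encoding"];
if (!string.IsNullOrEmpty(existing) && existing != "identity")
    return; // already encoded; do not wrap Response.Filter again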

Thanks!

Hello,
First, thanks for sharing this wonderful technology. Can I use it for my web application running on .NET Framework 3.5 and 4?

I tried using this on a website running .NET 4 and I keep getting the error below. Any ideas?

Exception information:
Exception type: ArgumentOutOfRangeException
Exception message: Index and count must refer to a location within the string.
Parameter name: count
at System.String.RemoveInternal(Int32 startIndex, Int32 count)
at System.String.Remove(Int32 startIndex, Int32 count)
at blowery.Web.HttpCompress.HttpModule.CompressContent(Object sender, EventArgs e)

I use this DLL for compression and it’s working fine. But when I export a report to Word/Excel it shows this error:

“Server cannot append header after HTTP headers have been sent. ”

My code:

public static void ExportToWord(GridView gvReport, Panel tdReportInfo, string reportName)
{
    try
    {
        reportName = reportName + ".doc";
        HttpContext.Current.Response.Clear();
        HttpContext.Current.Response.Buffer = true;
        HttpContext.Current.Response.ClearContent();
        HttpContext.Current.Response.ClearHeaders();
        HttpContext.Current.Response.AddHeader("content-disposition",
            "attachment;filename=" + reportName);
        HttpContext.Current.Response.Charset = "";
        HttpContext.Current.Response.ContentType = "application/vnd.ms-word";
        StringWriter sw = new StringWriter();
        HtmlTextWriter hw = new HtmlTextWriter(sw);
        tdReportInfo.RenderControl(hw);
        gvReport.RenderControl(hw);
        HttpContext.Current.Response.Output.Write(sw.ToString());
        HttpContext.Current.Response.Flush();
        HttpContext.Current.Response.End();
    }
    catch (Exception ex)
    {
        ReallySimpleLog.WriteLog(ex);
    }
}

Any suggestions, please?
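
One hedged workaround, reusing the excludedPaths section from the web.config posted earlier in this thread, is to exclude the page that streams the export so the filter never wraps that response (the page name below is only an example):

<excludedPaths>
  <add path="ExportReport.aspx" />
</excludedPaths>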

First of all, thank you for this useful code.

Only a heads-up:
When compressing a response, Fiddler complains about an HTTP protocol violation, because Content-Length still reports the uncompressed content size.

Bye

I haven’t seen any questions specifically addressing this issue.
When URL routing is used, the request goes into IIS and is then routed to the .aspx page. For example, a site: http://www.site.com/products may route the request to wwwroot/inetpub/mysite/pages/productsPage.aspx
It’s important to note that the page is ROUTED, not redirected.
Note that to get the URLs to work properly in IIS 6, a wildcard extension has to be set up which routes all resources to ASP.NET (ASP.NET will then hand static resources back over). This is documented here: http://blog.codeville.net/2008/07/04/options-for-deploying-aspnet-mvc-to-iis-6/
It would appear that IIS doesn’t gzip the content when it sends it back. I have followed the instructions here: http://www.kavinda.net/2007/02/17/how-to-enable-http-compression-iis6.html to enable IIS 6 compression.
Any idea why HTTP compression doesn’t seem to work? It works fine on other sites on my server; just the one with URL routing isn’t working.

Make your ASP.NET application gzip its output itself with a method like this one.
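
The linked method is not reproduced here, but a minimal sketch along those lines, placed in Global.asax, checks Accept-Encoding and wraps Response.Filter itself (this is an illustration, not the HttpCompress module’s code):

using System;
using System.IO.Compression;
using System.Web;

// Sketch for Global.asax: compress the response ourselves when the
// client advertises support, instead of relying on an HttpModule.
protected void Application_BeginRequest(object sender, EventArgs e)
{
    HttpApplication app = (HttpApplication)sender;
    string accept = app.Request.Headers["Accept-Encoding"];
    if (string.IsNullOrEmpty(accept))
        return;

    accept = accept.ToLowerInvariant();
    if (accept.Contains("gzip"))
    {
        app.Response.Filter = new GZipStream(app.Response.Filter, CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "gzip");
    }
    else if (accept.Contains("deflate"))
    {
        app.Response.Filter = new DeflateStream(app.Response.Filter, CompressionMode.Compress);
        app.Response.AppendHeader("Content-Encoding", "deflate");
    }
    app.Response.AppendHeader("Vary", "Accept-Encoding");
}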

The best thing to do is use HttpCompress by Ben Lowery. It’s a simple, free, open-source HttpModule that handles the HTTP compression of your pages. You can use it in combination with the IIS HTTP compression option.
I use HttpCompress in combination with Vici MVC and it works smoothly!
I’ve been searching the net for hours and it was either use HttpCompress, buy a commercial product (Port80 Software has a solution), or write my own HttpModule.
PS: IIS does HTTP compression based on the file extension. That’s why it’s not working for websites using URL routing.

Did you try it this way? See MS KB 322603.
To enable IIS 5.0 to compress .aspx pages, follow these steps:

Open a command prompt.
Type net stop iisadmin, and then press ENTER.
Type cd C:\InetPub\adminscripts, and then press ENTER.
Type the following, and then press ENTER: CSCRIPT.EXE ADSUTIL.VBS SET W3Svc/Filters/Compression/GZIP/HcScriptFileExtensions “asp” “dll” “exe” “aspx”
Type the following, and then press ENTER: CSCRIPT.EXE ADSUTIL.VBS SET W3Svc/Filters/Compression/DEFLATE/HcScriptFileExtensions “asp” “dll” “exe” “aspx”
Type net start w3svc, and then press ENTER.

Sorry to be this late to the discussion, but since I (still) have to enable IIS 6 compression on an MVC site, here is the IIS 6 native solution I found: include axd in the compressed extensions. This assumes you have IIS 6 extension-less URL support from .NET Framework 4 correctly enabled.
I did that directly in the IIS metabase (%windir%\system32\inetsrv\metabase.xml, as explained here). Before editing it, stop IIS or enable “metabase hot editing” in IIS, and back it up.
Extract from my configuration:
<IIsCompressionScheme Location="/LM/W3SVC/Filters/Compression/deflate"
    HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
    HcCreateFlags="0"
    HcDoDynamicCompression="TRUE"
    HcDoOnDemandCompression="TRUE"
    HcDoStaticCompression="TRUE"
    HcDynamicCompressionLevel="9"
    HcFileExtensions="htm html txt xml css js"
    HcOnDemandCompLevel="10"
    HcPriority="1"
    HcScriptFileExtensions="asp dll exe cgi aspx asmx ashx axd">
</IIsCompressionScheme>
<IIsCompressionScheme Location="/LM/W3SVC/Filters/Compression/gzip"
    HcCompressionDll="%windir%\system32\inetsrv\gzip.dll"
    HcCreateFlags="1"
    HcDoDynamicCompression="TRUE"
    HcDoOnDemandCompression="TRUE"
    HcDoStaticCompression="TRUE"
    HcDynamicCompressionLevel="9"
    HcFileExtensions="htm html txt xml css js"
    HcOnDemandCompLevel="10"
    HcPriority="1"
    HcScriptFileExtensions="asp dll exe cgi aspx asmx ashx axd">
</IIsCompressionScheme>
<IIsCompressionSchemes Location="/LM/W3SVC/Filters/Compression/Parameters"
    HcCacheControlHeader="max-age=86400"
    HcCompressionBufferSize="8192"
    HcCompressionDirectory="%windir%\IIS Temporary Compressed Files"
    HcDoDiskSpaceLimiting="TRUE"
    HcDoDynamicCompression="TRUE"
    HcDoOnDemandCompression="TRUE"
    HcDoStaticCompression="TRUE"
    HcExpiresHeader="Wed, 01 Jan 1997 12:00:00 GMT"
    HcFilesDeletedPerDiskFree="256"
    HcIoBufferSize="8192"
    HcMaxDiskSpaceUsage="99614720"
    HcMaxQueueLength="1000"
    HcMinFileSizeForComp="1"
    HcNoCompressionForHttp10="FALSE"
    HcNoCompressionForProxies="FALSE"
    HcNoCompressionForRange="FALSE"
    HcSendCacheHeaders="FALSE">
</IIsCompressionSchemes>
Rationale: under the hood, extension-less URLs work in IIS 6 by calling an eurl.axd page. See this blog for a more in-depth explanation of extension-less URLs in IIS 6 with fx4.
