Compression reduces file size, but it often requires more time to process data. With the CompressionLevel enum, we control the balance of size and time. Here we test this enum and measure its effect on file size.
Compression results, CompressionLevel

Original file:                  195,725 bytes
CompressionLevel.Fastest:        63,125 bytes
CompressionLevel.NoCompression: 195,773 bytes
CompressionLevel.Optimal:        55,171 bytes
Example. This program uses a real-world HTML file containing the Shakespeare play Macbeth. It specifies CompressionLevel.Optimal, but you can change this to Fastest or NoCompression. You can check the resulting file size in Windows Explorer.
Note: The CompressionLevel enum is not available in programs targeting earlier .NET Frameworks, including .NET 4.0.
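On those earlier frameworks, GZipStream can instead be constructed with the CompressionMode enum, which has been available since .NET 2.0. A minimal sketch (the file names here are placeholders, not from the program above):

```csharp
using System.IO;
using System.IO.Compression;

class OlderFrameworkExample
{
    static void Main()
    {
        // On .NET 4.0 and earlier, pass CompressionMode.Compress
        // instead of a CompressionLevel value.
        byte[] raw = File.ReadAllBytes("input.txt");
        using (MemoryStream memory = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(memory, CompressionMode.Compress))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            File.WriteAllBytes("output.gz", memory.ToArray());
        }
    }
}
```

With CompressionMode there is no control over the level; the runtime uses its default compression settings.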
Based on: .NET 4.5

C# program that uses CompressionLevel.Optimal

using System.IO;
using System.IO.Compression;
using System.Text;

class Program
{
    static void Main()
    {
	// This code opens the macbeth file.
	// ... Then it converts it to a byte array and compresses it.
	// ... The output is compress.gz.
	string data = File.ReadAllText("macbeth");
	byte[] text = Encoding.ASCII.GetBytes(data);
	byte[] compress = Compress(text);
	File.WriteAllBytes("compress.gz", compress);
    }

    /// <summary>
    /// Compress, optimal compression.
    /// </summary>
    public static byte[] Compress(byte[] raw)
    {
	using (MemoryStream memory = new MemoryStream())
	{
	    using (GZipStream gzip = new GZipStream(memory, CompressionLevel.Optimal))
	    {
		gzip.Write(raw, 0, raw.Length);
	    }
	    return memory.ToArray();
	}
    }
}
In the results, listed at the top, CompressionLevel.Optimal produced a file more than 10% smaller than Fastest. NoCompression yielded a file close in size to the original, but slightly larger, due to the gzip header and footer bytes.
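To verify the compressed output, a matching Decompress method can read the gzip data back into the original bytes. This is a sketch using CompressionMode.Decompress and Stream.CopyTo:

```csharp
using System.IO;
using System.IO.Compression;

class DecompressExample
{
    /// <summary>
    /// Decompress gzip data back into the original bytes.
    /// </summary>
    public static byte[] Decompress(byte[] gzipData)
    {
        using (MemoryStream input = new MemoryStream(gzipData))
        using (GZipStream gzip = new GZipStream(input, CompressionMode.Decompress))
        using (MemoryStream output = new MemoryStream())
        {
            gzip.CopyTo(output); // CopyTo requires .NET 4.0 or later.
            return output.ToArray();
        }
    }
}
```

No CompressionLevel is needed for decompression: the level only affects how the data was encoded, not how it is decoded.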
Benchmark. Next, I tested CompressionLevel.Fastest against Optimal. Is Fastest truly faster, and by how much? I compressed a 143 KB dictionary file 100 times with each level.
And: I found CompressionLevel.Fastest was about three times faster than Optimal. So Optimal is about 10% better for size, but much slower.
C# program that benchmarks CompressionLevel

using System;
using System.Diagnostics;
using System.IO;
using System.IO.Compression;
using System.Text;

class Program
{
    const int _max = 100;

    static void Main()
    {
	// ... Load data, convert to bytes, and run method once to warm up.
	string data = File.ReadAllText("dict");
	byte[] text = Encoding.ASCII.GetBytes(data);
	Compress(text, CompressionLevel.Fastest);

	var s1 = Stopwatch.StartNew();
	for (int i = 0; i < _max; i++)
	{
	    byte[] data1 = Compress(text, CompressionLevel.Fastest);
	}
	s1.Stop();

	var s2 = Stopwatch.StartNew();
	for (int i = 0; i < _max; i++)
	{
	    byte[] data2 = Compress(text, CompressionLevel.Optimal);
	}
	s2.Stop();

	Console.WriteLine(((double)(s1.Elapsed.TotalMilliseconds * 1000000) /
	    _max).ToString("0.00 ns"));
	Console.WriteLine(((double)(s2.Elapsed.TotalMilliseconds * 1000000) /
	    _max).ToString("0.00 ns"));
	Console.Read();
    }

    public static byte[] Compress(byte[] raw, CompressionLevel level)
    {
	using (MemoryStream memory = new MemoryStream())
	{
	    using (GZipStream gzip = new GZipStream(memory, level))
	    {
		gzip.Write(raw, 0, raw.Length);
	    }
	    return memory.ToArray();
	}
    }
}

Result

 4363345.00 ns    Fastest
13004928.00 ns    Optimal
Discussion. In testing CompressionLevel, I found some unexpected results. On certain generated files, such as sequential numbers, using Fastest resulted in a smaller file than Optimal. So Optimal does not always yield smaller sizes than Fastest.
However: As a general rule, Optimal improved compression ratios over Fastest. This applied to HTML files and natural language text files.
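To reproduce this kind of test, one can compress generated data at both levels and compare the lengths. This sketch builds a sequential-number string and prints both compressed sizes (results will vary by input, so no particular outcome is claimed here):

```csharp
using System;
using System.IO;
using System.IO.Compression;
using System.Text;

class LevelComparison
{
    static void Main()
    {
        // Build a string of sequential numbers as sample data.
        var builder = new StringBuilder();
        for (int i = 0; i < 100000; i++)
        {
            builder.Append(i);
        }
        byte[] text = Encoding.ASCII.GetBytes(builder.ToString());

        // Compare compressed sizes at both levels.
        Console.WriteLine("Fastest: {0} bytes",
            Compress(text, CompressionLevel.Fastest).Length);
        Console.WriteLine("Optimal: {0} bytes",
            Compress(text, CompressionLevel.Optimal).Length);
    }

    static byte[] Compress(byte[] raw, CompressionLevel level)
    {
        using (MemoryStream memory = new MemoryStream())
        {
            using (GZipStream gzip = new GZipStream(memory, level))
            {
                gzip.Write(raw, 0, raw.Length);
            }
            return memory.ToArray();
        }
    }
}
```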
Summary. CompressionLevel.Fastest is much faster than Optimal. Conversely, Optimal tends to yield slightly better compression ratios, though on certain files it may do worse.
Thus: CompressionLevel adjusts the balance of space and time. With greater time spent, file sizes on average are reduced.