Then: This WebClient downloads a page, and the server will think the request came from Internet Explorer 6. The result is a byte array of data.
Tip: You can add a new HTTP header to your WebClient's download request by assigning an entry in the Headers collection.
Also: You can use the WebHeaderCollection returned by Headers and call its Add, Remove and Set methods, or read its Count property.
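For instance, here is a minimal sketch of those members on a WebClient's Headers collection; the header names and values are only placeholders.
C# sketch that uses WebHeaderCollection methods
using System;
using System.Net;

class Program
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            // The Headers property is a WebHeaderCollection.
            WebHeaderCollection headers = client.Headers;
            headers.Add("Accept-Language", "en-US");   // Add a header.
            headers.Set("Accept-Charset", "utf-8");    // Set (or overwrite) a header.
            Console.WriteLine(headers.Count);          // Count the headers.
            headers.Remove("Accept-Language");         // Remove a header.
            Console.WriteLine(headers.Count);
        }
    }
}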
C# program that uses client user-agent
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Create web client simulating IE6.
        using (WebClient client = new WebClient())
        {
            client.Headers["User-Agent"] =
                "Mozilla/4.0 (Compatible; Windows NT 5.1; MSIE 6.0)";
            // Download data.
            byte[] arr = client.DownloadData("http://www.dotnetCodex.com/");
            // Write values.
            Console.WriteLine("--- WebClient result ---");
            Console.WriteLine(arr.Length);
        }
    }
}
Output
--- WebClient result ---
6585
Tip: When you assign the result to the variable, you are doing a bitwise copy of the reference to that data.
Assign: To set many request headers, assign the desired string values to the string keys in the Headers collection.
Content-Encoding: The next example reads a response HTTP header through the client.ResponseHeaders collection.
Tip: You can access this much like a hashtable or dictionary. If there is no header set for that key, the result is null.
C# program that uses Headers
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Create web client.
        WebClient client = new WebClient();
        // Set user agent and also accept-encoding headers.
        client.Headers["User-Agent"] =
            "Googlebot/2.1 (+http://www.googlebot.com/bot.html)";
        client.Headers["Accept-Encoding"] = "gzip";
        // Download data.
        byte[] arr = client.DownloadData("http://www.dotnetCodex.com/");
        // Get response header.
        string contentEncoding = client.ResponseHeaders["Content-Encoding"];
        // Write values.
        Console.WriteLine("--- WebClient result ---");
        Console.WriteLine(arr.Length);
        Console.WriteLine(contentEncoding);
    }
}
Output
--- WebClient result ---
2040
gzip
Note: If no Accept-Encoding header is specified, the server usually returns the response uncompressed.
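Because the example above requests gzip, the downloaded byte array may be compressed. As a rough sketch (not part of the original example), the bytes could be decompressed with GZipStream whenever the server reports gzip in Content-Encoding.
C# sketch that decompresses a gzip response
using System;
using System.IO;
using System.IO.Compression;
using System.Net;

class Program
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            client.Headers["Accept-Encoding"] = "gzip";
            byte[] data = client.DownloadData("http://www.dotnetCodex.com/");
            // Only decompress when the server actually used gzip.
            if (client.ResponseHeaders["Content-Encoding"] == "gzip")
            {
                using (MemoryStream memory = new MemoryStream(data))
                using (GZipStream gzip = new GZipStream(memory, CompressionMode.Decompress))
                using (StreamReader reader = new StreamReader(gzip))
                {
                    string page = reader.ReadToEnd();
                    Console.WriteLine(page.Length);
                }
            }
        }
    }
}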
Info: Internally, the DownloadString method will call into lower-level system routines in the Windows network stack.
And: It will allocate the resulting string on the managed heap. Then it will return a value referencing that data.
C# program that uses DownloadString
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Create web client.
        WebClient client = new WebClient();
        // Download string.
        string value = client.DownloadString("http://www.dotnetCodex.com/");
        // Write values.
        Console.WriteLine("--- WebClient result ---");
        Console.WriteLine(value.Length);
        Console.WriteLine(value);
    }
}
Also: You can access the Headers variable as a WebHeaderCollection, allowing you to perform more complex logic on the values.
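As a small sketch, the header names and values in that collection can be looped over with GetKey and Get; the headers set here are only placeholders.
C# sketch that loops over request headers
using System;
using System.Net;

class Program
{
    static void Main()
    {
        using (WebClient client = new WebClient())
        {
            client.Headers["Accept-Encoding"] = "gzip";
            client.Headers["Accept-Language"] = "en-US";
            // Loop over the request header names and values.
            WebHeaderCollection headers = client.Headers;
            for (int i = 0; i < headers.Count; i++)
            {
                Console.WriteLine("{0}: {1}", headers.GetKey(i), headers.Get(i));
            }
        }
    }
}
The same loop works on the ResponseHeaders collection after a download completes.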
Note: The asynchronous download methods, such as DownloadStringAsync, allow you to continue running the current method while the download completes, and they return void.
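Here is a minimal sketch of that asynchronous pattern using DownloadStringAsync and its completed event; the Thread.Sleep is only there so this small demo does not exit before the callback runs.
C# sketch that uses DownloadStringAsync
using System;
using System.Net;
using System.Threading;

class Program
{
    static void Main()
    {
        WebClient client = new WebClient();
        // The event handler runs when the download finishes.
        client.DownloadStringCompleted += (sender, e) =>
        {
            if (e.Error == null)
            {
                Console.WriteLine("Downloaded {0} chars", e.Result.Length);
            }
        };
        // DownloadStringAsync returns void and does not block this method.
        client.DownloadStringAsync(new Uri("http://www.dotnetCodex.com/"));
        Console.WriteLine("Download started...");
        Thread.Sleep(5000); // Crude wait, for the demo only.
    }
}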
However: If you manually call Dispose or use the using-statement, you can ensure these resources are cleaned up at more predictable times.
Info: If your website exposes some statistics or debugging information at a certain URL, you can configure this program to download that data and log it.
Also: It is possible to run this program on a timer, or to invoke it from other programs with the Process.Start method (see the sketch after the usage steps below).
Tip: You can write a console program that accesses a specific URL and then stores the result in a log file. The program here is configurable.
C# program that downloads web page and saves it
using System;
using System.IO;
using System.Net;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            Console.WriteLine("*** Log Append Tool ***");
            Console.WriteLine(" Specify file to download, log file");
            Console.WriteLine("Downloading: {0}", args[0]);
            Console.WriteLine("Appending: {0}", args[1]);
            // Download url.
            using (WebClient client = new WebClient())
            {
                string value = client.DownloadString(args[0]);
                // Append url.
                File.AppendAllText(args[1],
                    string.Format("--- {0} ---\n", DateTime.Now) +
                    value);
            }
        }
        finally
        {
            Console.WriteLine("[Done]");
            Console.ReadLine();
        }
    }
}
Program usage:
1. Compile to EXE.
2. Make shortcut to the EXE.
3. Specify the target URL and the local file to append to.
Such as "http://test/index.html" "C:\test.txt"
Try: In the next example, we use a try-catch-finally block. The program begins in the try block.
Here: It reads the command-line argument and writes the parameters to the screen. It sets the Accept-Encoding HTTP header.
Then: It downloads the page _max times (5 in this program). It averages the total milliseconds elapsed and prints this to the screen as well.
C# program that times web page downloads
using System;
using System.Diagnostics;
using System.Net;

class Program
{
    const int _max = 5;

    static void Main(string[] args)
    {
        try
        {
            // Get url.
            string url = args[0];
            // Report url.
            Console.ForegroundColor = ConsoleColor.White;
            Console.WriteLine("... PageTimeTest: times web pages");
            Console.ResetColor();
            Console.WriteLine("Testing: {0}", url);
            // Fetch page.
            using (WebClient client = new WebClient())
            {
                // Set gzip.
                client.Headers["Accept-Encoding"] = "gzip";
                // Download.
                // ... Do an initial run to prime the cache.
                byte[] data = client.DownloadData(url);
                // Start timing.
                Stopwatch stopwatch = Stopwatch.StartNew();
                // Iterate.
                for (int i = 0; i < Math.Min(100, _max); i++)
                {
                    data = client.DownloadData(url);
                }
                // Stop timing.
                stopwatch.Stop();
                // Report times.
                Console.WriteLine("Time required: {0} ms",
                    stopwatch.Elapsed.TotalMilliseconds);
                Console.WriteLine("Time per page: {0} ms",
                    stopwatch.Elapsed.TotalMilliseconds / _max);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(ex.ToString());
        }
        finally
        {
            Console.WriteLine("[Done]");
            Console.ReadLine();
        }
    }
}
Usage:
Create a shortcut to the program's EXE.
Then specify the URL on the command line in the shortcut.
Caution: This code has many limitations and does not adequately simulate the web browser environment. But it is helpful for benchmarking.
Possible results
... PageTimeTest: times web pages
Testing: http://www.google.com/
Time required: 259.7351 ms
Time per page: 51.94702 ms
[Done]