I would like to add a picture to a PictureBox via the .Load() method. The problem is that the picture lives on a website which requires authentication!
Link is like: https://intranet.company.com/_layouts/15/company/PortraitHandler.ashx?isinternal=true&account=test/account
How can I fix this?
Solved it like this:
public Bitmap getImageFromURL(string sURL)
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create(sURL);
    request.Method = "GET";
    request.UseDefaultCredentials = true; // send the current Windows credentials

    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (Stream stream = response.GetResponseStream())
    {
        // Copy into a MemoryStream first: a Bitmap requires its source
        // stream to remain open for the bitmap's entire lifetime, so the
        // response stream must not be the one backing the Bitmap.
        var ms = new MemoryStream();
        stream.CopyTo(ms);
        return new Bitmap(ms);
    }
}
Related
In C#, is it possible to detect whether the web address of a file is an image or a video? Is there a header value for this?
I have the following code that gets the filesize of a web file:
System.Net.WebRequest req = System.Net.WebRequest.Create("http://test.png");
req.Method = "HEAD";
using (System.Net.WebResponse resp = req.GetResponse())
{
    long contentLength;
    if (long.TryParse(resp.Headers.Get("Content-Length"), out contentLength))
    {
        // Do something useful with contentLength here
    }
}
Can this code be modified to see if a file is an image or a video?
Thanks in advance
What you're looking for is the "Content-Type" header:
string uri = "http://assets3.parliament.uk/iv/main-large//ImageVault/Images/id_7382/scope_0/ImageVaultHandler.aspx.jpg";
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(uri);
request.Method = "HEAD";
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    var contentType = response.Headers["Content-Type"];
    Console.WriteLine(contentType); // e.g. "image/jpeg"
}
You can check resp.Headers.Get("Content-Type") in the response headers.
For example, it will be image/jpeg for a JPG file.
See list of available content types.
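Putting the two answers together, a small helper can classify the header value. This is a sketch; checking the media-type prefix covers the common image/* and video/* types but makes no attempt to handle every registered content type:

```csharp
using System;

class ContentTypeDemo
{
    // Classify a Content-Type header value by its media-type prefix.
    static string Classify(string contentType)
    {
        if (string.IsNullOrEmpty(contentType)) return "unknown";
        if (contentType.StartsWith("image/")) return "image";
        if (contentType.StartsWith("video/")) return "video";
        return "other";
    }

    static void Main()
    {
        Console.WriteLine(Classify("image/jpeg")); // image
        Console.WriteLine(Classify("video/mp4"));  // video
        Console.WriteLine(Classify("text/html"));  // other
    }
}
```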
I have the following code for my handler. I've debugged it and I can see that my image variable b holds the actual image I need, however I am not able to display it in the browser. When I run this I just get the text System.Drawing.Bitmap on the screen instead of the image. I am not sure how to write it to the browser. Any ideas would be much appreciated, thanks.
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://myaddress");
request.Credentials = new NetworkCredential("username", "password");
request.Method = "GET";
request.Accept = "image/jpeg";
HttpWebResponse response = (HttpWebResponse)request.GetResponse();
Stream s = response.GetResponseStream();
System.Drawing.Image b = System.Drawing.Image.FromStream(s);
context.Response.ContentType = "image/jpeg";
context.Response.Write(b);
Write it to the output stream instead; Response.Write(b) just writes b.ToString(), which is the type name:
b.Save(context.Response.OutputStream, ImageFormat.Jpeg);
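The symptom follows from how the object overload of Write behaves: it emits the value's ToString(), and a type that doesn't override ToString() returns its type name. A minimal illustration, using a stand-in class rather than Bitmap so it runs without System.Drawing:

```csharp
using System;

class Demo { } // does not override ToString()

class Program
{
    static void Main()
    {
        var d = new Demo();
        // Writing the object itself emits the type name, which is
        // exactly what the browser displayed for the Bitmap.
        Console.WriteLine(d.ToString()); // prints "Demo"
    }
}
```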
I fetch web pages in order to feed data to my application. However, the pages contain a lot of images, which I don't need at all; I only need the text data.
My problem is that the web requests take an unacceptable amount of time. I suspect the images are also fetched during the web request. Is there any way to skip the images and download only the text data?
The following is the code that I am using currently.
var httpWebRequest = HttpWebRequest.Create(url) as HttpWebRequest;
httpWebRequest.Method = "GET";
httpWebRequest.ProtocolVersion = HttpVersion.Version11;
httpWebRequest.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip,deflate");
httpWebRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
httpWebRequest.Proxy = null;
httpWebRequest.KeepAlive = true;
httpWebRequest.Accept = "text/html";
string responseString = null;
using (var httpWebResponse = httpWebRequest.GetResponse() as HttpWebResponse)
using (var responseStream = httpWebResponse.GetResponseStream())
using (var streamReader = new StreamReader(responseStream))
{
    responseString = streamReader.ReadToEnd();
}
Also, any other optimization suggestions are most welcome.
That assumption is incorrect.
HttpWebRequest knows nothing about HTML or images; it just sends a raw HTTP request and hands you the raw response. It never parses the returned HTML, so it never fetches the images the page references.
You can use Fiddler to see exactly what goes over the wire.
I have a product sales module in which products are uploaded from CJ and saved into a database. Today I noticed a few records contain an image URL that returns 404 (e.g. http://www.bridalfashionmall.com/images/satin-2.jpg), so no image shows in the repeater. How can I check whether the dynamically built URL actually points to an image?
The method sean suggested could be used as a first pass. As a second pass, you can try loading the stream into an Image and see whether it really is an image:
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(imageFilePath);
request.Timeout = 5000;
request.ReadWriteTimeout = 20000;
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    // Image.FromStream throws an ArgumentException if the response
    // body is not a valid image.
    System.Drawing.Image img = System.Drawing.Image.FromStream(response.GetResponseStream());
    // If echoing the image back to the browser, set the content type first
    Response.ContentType = "image/gif";
}
This question helped:
http://stackoverflow.com/questions/1639878/how-can-i-check-if-an-image-exists-at-http-someurl-myimage-jpg-in-c-asp-net
This approach also worked:
try
{
    WebClient client = new WebClient();
    client.DownloadData(ImageUrl); // throws a WebException on 404
}
catch
{
    imgPhoto.ImageUrl = "../User/Images/ResourceImages/Candychocolate1.jpg"; // default image path
}
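The try/catch above can be folded into a reusable check. A sketch, under the assumption (the helper name UrlHasImage is mine) that a WebException is the failure mode you care about, since it covers 404s, DNS failures and timeouts alike:

```csharp
using System.Net;

class ImageCheck
{
    // Returns true if the URL can be downloaded and has a non-empty body.
    public static bool UrlHasImage(string url)
    {
        try
        {
            using (var client = new WebClient())
            {
                byte[] data = client.DownloadData(url);
                return data.Length > 0;
            }
        }
        catch (WebException)
        {
            // 404, DNS failure, timeout, connection refused, ...
            return false;
        }
    }
}
```

Note this only proves the URL is reachable and returns something; combine it with the Image.FromStream check above if you also need to know the bytes are a real image.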
Would it be possible to write a screen-scraper for a website protected by a form login? I have access to the site, of course, but I have no idea how to log in to the site and save my credentials in C#.
Also, any good examples of screenscrapers in C# would be hugely appreciated.
Has this already been done?
It's pretty simple. You need a custom login (HttpPost) method.
You can come up with something like this (this way you get all the cookies that are set at login, and just pass them on to the next HttpWebRequest):
public static HttpWebResponse HttpPost(String url, String referer, String userAgent, ref CookieCollection cookies, String postData, out WebHeaderCollection headers, WebProxy proxy)
{
    try
    {
        HttpWebRequest http = WebRequest.Create(url) as HttpWebRequest;
        http.Proxy = proxy;
        http.AllowAutoRedirect = true;
        http.Method = "POST";
        http.ContentType = "application/x-www-form-urlencoded";
        http.UserAgent = userAgent;
        http.CookieContainer = new CookieContainer();
        http.CookieContainer.Add(cookies);
        http.Referer = referer;

        byte[] dataBytes = Encoding.UTF8.GetBytes(postData);
        http.ContentLength = dataBytes.Length;
        using (Stream postStream = http.GetRequestStream())
        {
            postStream.Write(dataBytes, 0, dataBytes.Length);
        }

        HttpWebResponse httpResponse = http.GetResponse() as HttpWebResponse;
        headers = http.Headers;
        // Collect any cookies the server set during login so they can
        // be reused on subsequent requests.
        cookies.Add(httpResponse.Cookies);
        return httpResponse;
    }
    catch
    {
        // Swallowing every exception hides real errors; consider
        // logging or rethrowing in production code.
    }
    headers = null;
    return null;
}
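One detail worth getting right is the postData string: with application/x-www-form-urlencoded, each value must be URL-encoded. A sketch of building it; the field names below (username, password) are guesses, so inspect the site's actual login form for the real ones:

```csharp
using System;

class PostDataDemo
{
    // Build a form-encoded body from hypothetical field names.
    static string BuildPostData(string user, string pass)
    {
        return "username=" + Uri.EscapeDataString(user)
             + "&password=" + Uri.EscapeDataString(pass);
    }

    static void Main()
    {
        // Special characters like '@' and '&' are percent-encoded so
        // they can't be confused with field separators.
        Console.WriteLine(BuildPostData("me", "p@ss&word"));
        // username=me&password=p%40ss%26word
    }
}
```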
Sure, this has been done; I have done it a couple of times. This is (generically) called screen-scraping or web scraping.
You should take a look at this question (and also browse the questions under the tag "screen-scraping"). Note that scraping does not only cover data extraction from a web resource. It also involves submitting data to online forms so as to mimic the actions of a user filling in input such as a login form.