I'm trying to upload files from ASP.NET to SharePoint (I'm doing it this way in order to preserve the timestamp).
The following is my code:
protected void UploadFileToSharePoint(string UploadedFilePath, string SharePointPath)
{
    WebResponse response = null;
    try
    {
        string SUrl = "http://MysharepointPath/Folder";
        //WebRequest request = WebRequest.Create(SharePointPath);
        WebRequest request = WebRequest.Create(SUrl);
        request.Credentials = new NetworkCredential(username, password);
        //request.Method = "PUT";
        request.Method = "POST";
        FileStream fStream = File.OpenRead(UploadedFilePath);
        string fileName = fStream.Name.Substring(3);
        byte[] contents = new byte[fStream.Length];
        fStream.Read(contents, 0, (int)fStream.Length);
        fStream.Close();
        request.ContentLength = 0;
        //Custom code
        using (WebClient uploader = new WebClient())
        {
            try
            {
                uploader.UploadFile(new Uri(SUrl), UploadedFilePath);
            }
            catch (Exception ex)
            {
            }
        }
        response = request.GetResponse();
    }
    finally
    {
        // release the response
        if (response != null)
        {
            response.Close();
        }
    }
}
When I run the code in debug mode, it throws the exception:
"The remote server returned an error: (401) Unauthorized."
What do you mean by "in order to preserve the timestamp I'm doing it this way"? The timestamp on the uploaded document is going to be the date the document was added to the SharePoint document library.
The 401 (Unauthorized) error indicates that the client must first authenticate itself, so can you use Fiddler to check whether you're actually authenticating against the SharePoint server?
Are you using SharePoint Online or on-premises?
There are a few samples on CodePlex for what you're trying to accomplish; please review this link for more information:
http://spfileupload.codeplex.com/SourceControl/latest#Get-SPScripts.Copy-FilesToSP.ps1
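In the meantime, a minimal sketch of the usual WebClient/PUT approach with explicit credentials, in case it helps while you investigate (the destination file name, local file path, user name, password, and domain below are placeholders):

// Sketch only; requires using System.IO; and using System.Net;
string destinationUrl = "http://MysharepointPath/Folder/report.xlsx";   // placeholder file name
byte[] fileBytes = File.ReadAllBytes(@"C:\temp\report.xlsx");           // placeholder local path

using (var client = new WebClient())
{
    // Explicit credentials; swap for client.UseDefaultCredentials = true
    // if the app pool identity already has rights on the library.
    client.Credentials = new NetworkCredential("username", "password", "DOMAIN");
    client.UploadData(destinationUrl, "PUT", fileBytes);
}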
Related
I'm making a tool in Unity to retrieve data from a server. The server's interface can provide URLs that we can later click on which will return an XML or CSV file with the results of that query from that server. But, it requires Basic Authentication. When clicking the links, it simply pops up a login screen before giving me the results. If I try what I [think] I know in Unity (starting with WebRequest.GetResponse()) it simply fails and says I am not authorized. It does not show the popup for authentication. So how do I let that login popup appear when accessing with Unity and await the login results to get the file? Or is there some standardized way to provide that info in the link itself?
Here is some code that should get you started. Just fill in the request link, username, and password; see the comments in the code for what each part does.
// Requires: using System.Net.Http; and using System.Net.Http.Headers;
// try/catch just in case something goes wrong while calling the API
try
{
    // "using" ensures the client disposes itself when the block ends
    using (HttpClient client = new HttpClient())
    {
        // Set up the authentication information
        string yourusername = "username";
        string yourpwd = "password";
        // Use this when you expect JSON back from the API
        client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));
        // Add Basic authentication to every request this client sends
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic",
            Convert.ToBase64String(
                System.Text.Encoding.ASCII.GetBytes($"{yourusername}:{yourpwd}")));
        // API link used to make the call
        var requestLink = "apiLink";
        using (HttpResponseMessage response = client.GetAsync(requestLink).Result)
        {
            // Make sure the request was successful before proceeding
            response.EnsureSuccessStatusCode();
            // Get the response from the website and convert it to a string
            string responseBody = response.Content.ReadAsStringAsync().Result;
            // now you have the results
        }
    }
}
// Catch the exception if something went wrong and rethrow it
catch (Exception)
{
    throw;
}
This is what I ended up going with after looking at the comments above. Let me know if I'm doing anything terribly inefficient!
String username = "Superman"; // Obviously handled secretly
String pw = "ILoveLex4evar!"; // Obviously handled secretly
String url = "https://www.SuperSecretServer.com/123&stuff=?uhh";
const int BufferSize = 4096; // copy in 4 KB chunks

String encoded = System.Convert.ToBase64String(System.Text.Encoding.GetEncoding("ISO-8859-1").GetBytes(username + ":" + pw));

CookieContainer myContainer = new CookieContainer();
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
request.Headers.Add("Authorization", "Basic " + encoded);

try
{
    using (WebResponse response = request.GetResponse())
    {
        using (Stream responseStream = response.GetResponseStream())
        {
            using (FileStream xml = File.Create("filepath/filename.xml"))
            {
                byte[] buffer = new byte[BufferSize];
                int read;
                while ((read = responseStream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    xml.Write(buffer, 0, read);
                }
            }
        }
    }
}
catch (WebException e)
{
    // handle or log authentication and connection failures here
}
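Side note: the manual buffer loop works fine, but on .NET 4 or later Stream.CopyTo does the same copy in one call:

responseStream.CopyTo(xml);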
I have jobs in Jenkins that I cannot access unless I log in first with my username and password.
For example, if I try to access "localhost:xxx/job/some_job_1" I will get a 404 error unless I log in first. I say this because I have tried the following using the WebRequest class:
string formParams = "j_username=bobbyLee&j_password=SecretPassword25&from=%2F&json=%7B%22j_username%22%3A+%bobbyLee%22%2C+%22j_password%22%3A+%22SecretPassword%25%22%2C+%22remember_me%22%3A+false%2C+%22from%22%3A+%22%2F%22%7D&Submit=log+in";
// ***this is the exact string that is sent when i log in normally, obtained using Fiddler***
string formUrl = "http://serverName:PortNum/j_acegi_security_check";
// ***I have also tried http://serverName:PortNum/login***
string cookieHeader;
WebRequest req = WebRequest.Create(formUrl);
req.ContentType = "application/x-www-form-urlencoded";
req.Method = "POST";
byte[] bytes = Encoding.ASCII.GetBytes(formParams);
req.ContentLength = bytes.Length;
using (Stream os = req.GetRequestStream())
{
os.Write(bytes, 0, bytes.Length);
}
WebResponse resp = req.GetResponse();
cookieHeader = resp.Headers["Set-cookie"];
string pageSource;
string getUrl = "http://serverName:portNum/job/some_job/";
WebRequest getRequest = WebRequest.Create(getUrl);
getRequest.Headers.Add("Cookie", cookieHeader);
WebResponse getResponse = getRequest.GetResponse();
using (StreamReader sr = new StreamReader(getResponse.GetResponseStream()))
{
pageSource = sr.ReadToEnd();
}
The response that I get back from the POST request is "HTML OK", and cookieHeader is not null. But when I then make the GET request for the page I want, I get a 404 error when accessing the job "http://serverName:portNum/job/some_job/", as if I didn't log in successfully.
So what is the correct way to log into Jenkins from C#, and get the HTML source of the job pages that only appear after logging in?
The REST API is your best friend here.
It is an incredibly rich source of information. I have written a system that shows an entire program of work on a page with full deployment traceability.
I am going to assume you have some security in place in your Jenkins instance which means requests need to be authenticated.
I use the following class for this:
using System;
using System.Net;
using System.Text;

namespace Core.REST
{
    public class HttpAdapter
    {
        private const string ApiToken = "3abcdefghijklmnopqrstuvwxyz12345"; // you will need to change this to the real value
        private const string UserName = "restapi";

        public string Get(string url)
        {
            try
            {
                const string credentials = UserName + ":" + ApiToken;
                var authorization = Convert.ToBase64String(Encoding.ASCII.GetBytes(credentials));
                using (var wc = new WebClient())
                {
                    wc.Headers[HttpRequestHeader.Authorization] = "Basic " + authorization;
                    var htmlResult = wc.DownloadString(url);
                    return htmlResult;
                }
            }
            catch (WebException)
            {
                Console.WriteLine("Could not retrieve REST API response");
                throw;
            }
        }
    }
}
restapi is a dedicated user I created. I think I gave it admin access just so I didn't have to worry about it. I was an admin, but all the other developers and testers in the 3 crews had highly controlled access limited to only what they needed and nothing more. It is also better practice to have a dedicated user for functions like this.
I constructed my C# classes to consume (deserialise) data from any page that supports the api/json suffix.
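For illustration, a rough sketch of what consuming one of those api/json pages might look like, assuming Newtonsoft.Json for deserialisation and a placeholder Jenkins URL (the JenkinsJob class below only picks out a few of the fields the endpoint returns):

// Sketch; requires the Newtonsoft.Json package.
using Newtonsoft.Json;

public class JenkinsJob
{
    public string Name { get; set; }
    public string Url { get; set; }
    public string Color { get; set; }   // Jenkins encodes build status as a colour
}

// ...

var adapter = new HttpAdapter();
string json = adapter.Get("http://jenkins.example.com/job/some_job/api/json");   // placeholder URL
JenkinsJob job = JsonConvert.DeserializeObject<JenkinsJob>(json);
Console.WriteLine(job.Name + " is " + job.Color);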
I have a Windows Forms app where I am loading a site. I log in to the site inside the Windows Forms app with valid credentials.
I then manage to get a valid session id, and this is what the URL looks like after logging in with valid credentials:
var url = "http://www.somewebsite123.com/portal/sessionId=123";
I am using Microsoft.mshtml & AxInterop.ShDocVw for fetching the content of the authorized page.
WebClient client = new WebClient();
using (Stream data = client.OpenRead(new Uri(url)))
{
StreamReader reader = new StreamReader(data);
string htmlContent = reader.ReadToEnd();
But the line below throws the error:
strHTML = ((IHTMLElement)htmlContent.document).innerHTML.ToString();
Error
Internal error (WWC-00006)
An unexpected error occurred: ORA-01403: no data found (WWV-16016)
How do I get rid of this error?
The actual DOM content can be found in WebException.Response when WebClient hits a 4XX or 5XX status:
try
{
    // WebClient call that raises the 4XX
}
catch (WebException webex)
{
    using (var streamReader = new StreamReader(webex.Response.GetResponseStream()))
    {
        var domContent = streamReader.ReadToEnd();
    }
}
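For completeness, a self-contained version of the same idea (the URL is a placeholder; any 4XX/5XX response ends up in the catch block with its body readable):

// Requires using System; using System.IO; and using System.Net;
try
{
    using (var client = new WebClient())
    {
        string body = client.DownloadString("http://example.com/some/page");   // placeholder URL
    }
}
catch (WebException webex)
{
    if (webex.Response != null)
    {
        using (var streamReader = new StreamReader(webex.Response.GetResponseStream()))
        {
            string domContent = streamReader.ReadToEnd();   // the server's error page or JSON body
            Console.WriteLine(domContent);
        }
    }
}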
I am getting "The remote server returned an error: (400) Bad Request" while running the following code.
I am trying to upload an XML file to the HTTP server.
My XML file contains tags for the username, password, and domain. When I try to connect manually I can connect, but when I connect through this code with the same credentials, I get the 400 Bad Request error.
Please suggest how I can overcome this issue.
Thanks
public static void UploadHttp(string xml)
{
string txtResults = string.Empty;
try
{
string url = "http://my.server.com/upload.aspx ";
HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(url);
request.KeepAlive = false;
request.SendChunked = true;
request.AllowAutoRedirect = true;
request.Method = "Post";
request.ContentType = "text/xml";
var encoder = new UTF8Encoding();
var data = encoder.GetBytes(xml);
request.ContentLength = data.Length;
var reqStream = request.GetRequestStream();
reqStream.Write(data, 0, data.Length);
reqStream.Close();
WebResponse response = null;
response = request.GetResponse();
var reader = new StreamReader(response.GetResponseStream());
var str = reader.ReadToEnd();
}
catch (WebException ex)
{
if (ex.Status == WebExceptionStatus.ProtocolError)
{
HttpWebResponse err = ex.Response as HttpWebResponse;
if (err != null)
{
string htmlResponse = new StreamReader(err.GetResponseStream()).ReadToEnd();
txtResults = string.Format("{0} {1}", err.StatusDescription, htmlResponse);
}
}
else
{
}
}
catch (Exception ex)
{
txtResults = ex.ToString();
}
}
Are you sure you should be using POST and not PUT?
POST is usually used with application/x-www-form-urlencoded formats. If you are using a REST API, you should perhaps be using PUT, and if you are uploading a file you probably need to use multipart/form-data. Not always, but usually that is the right thing to do.
Also, you don't seem to be using the credentials to log in: you need to use the Credentials property of the HttpWebRequest object to send the username and password.
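For example, a minimal sketch of wiring the credentials into your request object (the user name, password, and domain are placeholders):

request.Credentials = new NetworkCredential("username", "password", "DOMAIN");
// or, to send the identity the process runs under:
// request.UseDefaultCredentials = true;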
A 400 Bad Request error can be thrown due to incorrect authentication entries.
Check whether your API URL is correct. Don't append or prepend spaces.
Verify that your username and password are valid, and watch for spelling mistakes when entering them.
Note: a 400 Bad Request is most often caused by authentication entries with these kinds of typos.
What type of authentication do you use?
Send the credentials using the properties Ben mentioned before and set up a cookie handler.
You already allow redirection, so check your web server to see whether any redirection occurs (NTLM auth does for sure). If there is a redirection, you need to store the session, which is usually kept in a session cookie.
//use "ASCII" or try with another encoding scheme instead of "UTF8".
using (StreamWriter postStream = new StreamWriter(request.GetRequestStream(), System.Text.Encoding.UTF8))
{
postStream.Write(postData);
postStream.Close();
}
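A cookie handler with HttpWebRequest is essentially a shared CookieContainer, roughly like this (the login URL is a placeholder; upload.aspx is the one from your code):

CookieContainer cookies = new CookieContainer();

HttpWebRequest login = (HttpWebRequest)WebRequest.Create("http://my.server.com/login.aspx");   // placeholder
login.Method = "POST";
login.CookieContainer = cookies;            // any session cookie set by the server lands here
// ... write the login body and call login.GetResponse() ...

HttpWebRequest upload = (HttpWebRequest)WebRequest.Create("http://my.server.com/upload.aspx");
upload.Method = "POST";
upload.CookieContainer = cookies;           // the same container replays that cookie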
How do you log in to a webpage and retrieve its content in C#?
That depends on what's required to log in. You could use a WebClient to send the login credentials to the server's login page (via whatever method is required, GET or POST), but that wouldn't persist a cookie by default. There is a way to get a WebClient to handle cookies, so you could POST the login info to the server, then request the page you want with the same WebClient and do whatever you want with it.
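One common way to get a WebClient to handle cookies is a small subclass that attaches a shared CookieContainer to every request it creates; a sketch:

using System;
using System.Net;

public class CookieAwareWebClient : WebClient
{
    private readonly CookieContainer cookies = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        HttpWebRequest httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
        {
            httpRequest.CookieContainer = cookies;   // same cookies for the login POST and later GETs
        }
        return request;
    }
}

POST the login form with UploadValues on one instance, then DownloadString the protected page with that same instance so the session cookie goes along.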
Look at System.Net.WebClient, or for more advanced requirements, System.Net.HttpWebRequest/System.Net.HttpWebResponse.
As for actually applying these: you'll have to study the HTML source of each page you want to scrape in order to learn exactly what HTTP requests it's expecting.
How do you mean "login"?
If the subfolder is protected at the OS level, and the browser pops up a login dialog when you go there, you will need to set the Credentials property on the HttpWebRequest.
If the website has its own cookie-based membership/login system, you will have to use HttpWebRequest to first post a response to the login form.
string postData = "userid=ducon";
postData += "&username=camarche";
byte[] data = Encoding.ASCII.GetBytes(postData);
WebRequest req = WebRequest.Create(URL);
req.Method = "POST";
req.ContentType = "application/x-www-form-urlencoded";
req.ContentLength = data.Length;
Stream newStream = req.GetRequestStream();
newStream.Write(data, 0, data.Length);
newStream.Close();
StreamReader reader = new StreamReader(req.GetResponse().GetResponseStream(), System.Text.Encoding.GetEncoding("iso-8859-1"));
string coco = reader.ReadToEnd();
Use the WebClient class.
Dim Html As String
Using Client As New System.Net.WebClient()
Html = Client.DownloadString("http://www.google.com")
End Using
You can use the built-in WebClient object instead of creating the request yourself.
WebClient wc = new WebClient();
wc.Credentials = new NetworkCredential("username", "password");
string url = "http://foo.com";
try
{
using (Stream stream = wc.OpenRead(new Uri(url)))
{
using (StreamReader reader = new StreamReader(stream))
{
return reader.ReadToEnd();
}
}
}
catch (WebException e)
{
// Error handling
}
Try this:
public string GetContent(string url)
{
using (System.Net.WebClient client = new System.Net.WebClient())
{
return client.DownloadString(url);
}
}