Download GitHub files in directories - C#

I have this function in my script:
private void DownloadFile(string file, string location)
{
    var githubToken = "[token]";
    var request = (HttpWebRequest)WebRequest.Create("https://api.github.com/repos/[owner]/[repo]/contents/" + file);
    request.Headers.Add(HttpRequestHeader.Authorization, string.Concat("token ", githubToken));
    request.Accept = "application/vnd.github.v3.raw";
    request.UserAgent = "useragent";
    try
    {
        using (var response = request.GetResponse())
        {
            var encoding = System.Text.ASCIIEncoding.UTF8;
            using (var reader = new StreamReader(response.GetResponseStream(), encoding))
            {
                /*using (StreamWriter sw = File.CreateText(location))
                {
                    sw.WriteLine(reader.ReadToEnd());
                    sw.Close();
                }*/
                using (FileStream fs = File.Create(location))
                {
                    byte[] info = new UTF8Encoding(true).GetBytes(reader.ReadToEnd());
                    fs.Write(info, 0, info.Length);
                }
                reader.Close();
            }
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show("ERROR: " + ex + "\n " + file);
    }
}
Downloading a single file in the root directory, like repo/text.txt, works great! But when I want to download a file such as files/file1.rar, it doesn't work; I get a 403 or 404. Can somebody explain why this is, and how to download files from a subdirectory? (The repository is private, hence the large chunk of code.)
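For what it's worth, the contents API does accept paths with subdirectories, so one common culprit is that the path segments are not URL-escaped (a space or '#' in a file name will otherwise truncate or break the URL), and a 403 on a private repository can also mean the token lacks the repo scope. A minimal, hedged sketch of escaping each segment before building the URL; the EscapePath helper name is mine, not from the original code:
// Hypothetical helper: escape each segment of a repository path such as
// "files/file1.rar" so it can be appended safely to the contents API URL.
private static string EscapePath(string path)
{
    var segments = path.Split('/');
    for (int i = 0; i < segments.Length; i++)
    {
        segments[i] = Uri.EscapeDataString(segments[i]);
    }
    return string.Join("/", segments);
}

// Usage inside DownloadFile (sketch):
// var url = "https://api.github.com/repos/[owner]/[repo]/contents/" + EscapePath(file);
// var request = (HttpWebRequest)WebRequest.Create(url);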

Related

Issues reading PDF to NetSuite using C#

In my RESTlet I see an error in my log (Code: UNEXPECTED_ERROR null):
var pdf = data.pdfdoc;
nlapiLogExecution('DEBUG', 'in data', pdf);
I think the data being passed in is not correctly read. Anything look unusual?
static void Main(string[] args)
{
    try
    {
        FileStream stream = new FileStream(@"C:\Users\simplified.PDF", FileMode.Open, FileAccess.ReadWrite);
        byte[] fileBytes = new byte[stream.Length];
        stream.Read(fileBytes, 0, fileBytes.Length);
        stream.Close();
        string jsonString = "{\"tranid\":\"18810\", \"pdfdoc\":\"" + fileBytes + "\"" + "}";
        WebRequest request = WebRequest.Create("https://rest.sandbox.netsuite.com/app/site/hosting/restlet.nl?script=332&deploy=1");
        request.ContentType = "application/json";
        request.Method = "PUT";
        request.Headers.Add("Authorization:NLAuth nlauth_account=XXX, nlauth_email=test@test.com, nlauth_signature=xxx, nlauth_role=18");
        using (var streamWriter = new StreamWriter(request.GetRequestStream()))
        {
            streamWriter.Write(jsonString);
        }
        WebResponse response = request.GetResponse();
        Console.Write(response);
        Console.Read();
    }
    catch (FileNotFoundException e)
    {
        Console.Write(e);
    }
}
EDIT: the code is now creating the PDF document in NetSuite, but it is not showing the content.
RESTlet
try
{
    var file = datain.file;
    nlapiLogExecution('DEBUG', 'in data', file);
    var base64content = nlapiEncrypt(file, 'base64', null);
    var doc = nlapiCreateFile('mydoc2.pdf', 'PDF', base64content);
    doc.setFolder(115);
    var fileid = nlapiSubmitFile(doc);
    nlapiLogExecution('DEBUG', 'after submit', fileid);
}
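One likely culprit (not confirmed in the original post): concatenating fileBytes into the JSON string just produces the literal text "System.Byte[]", so no real content ever reaches the RESTlet. A hedged sketch of the usual fix on the C# side is to Base64-encode the bytes yourself and send that string; the RESTlet can then pass the received string straight to nlapiCreateFile without running it through nlapiEncrypt. Field names and the path are taken from the question; the rest is an assumption:
// Sketch only: send the PDF as a Base64 string instead of "System.Byte[]".
byte[] fileBytes = File.ReadAllBytes(@"C:\Users\simplified.PDF");
string base64Pdf = Convert.ToBase64String(fileBytes);

// Build the JSON payload with the encoded content (field names as in the question).
string jsonString = "{\"tranid\":\"18810\", \"pdfdoc\":\"" + base64Pdf + "\"}";

// ...the request setup stays the same as in the Main method above;
// in the RESTlet, pass the received string directly to
// nlapiCreateFile('mydoc2.pdf', 'PDF', base64String) instead of re-encoding it
// with nlapiEncrypt (assumption, not verified here).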

Recursive method to get FTP directory list

I am trying to create a recursive method that copies the FTP directory listing into a TreeView.
I have already tried to do that, but the result is more quick & dirty than clean & simple.
Here are my code snippets:
public void connectToServer(string pServerIP, string pServerPort, string pUsername, string pPassword)
{
    _serverIP = pServerIP;
    _serverPort = pServerPort;
    _username = pUsername;
    _password = pPassword;
    string ftpServerPath = "ftp://" + pServerIP + ":" + pServerPort + "/";
    try
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpServerPath);
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        request.Credentials = new NetworkCredential(pUsername, pPassword);
        FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        Stream responseStream = response.GetResponseStream();
        StreamReader reader = new StreamReader(responseStream);
        secondLevelDirectotyList = new List<string>();
        int i = 0;
        TreeNode rootTreeNode = tVDirectories.Nodes.Add("/");
        Console.WriteLine("/\n");
        while (!reader.EndOfStream)
        {
            secondLevelDirectotyList.Add(reader.ReadLine());
            Console.WriteLine("...: " + secondLevelDirectotyList[i]);
            i++;
        }
        reader.Close();
        response.Close();
        getFTPDirectoryList(secondLevelDirectotyList, 0);
    }
    catch (WebException ex)
    {
        MessageBox.Show("The following Exceptions occurs:\n" + ex.Message, "Exception occurs", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}
private void getFTPDirectoryList(List<string> pTopLevelDirectoryList, int pDirectoryListIndexer)//string pFTPPath)
{
    //List<string>
    string ftpServerPath = "ftp://" + _serverIP + ":" + _serverPort + "/" + pTopLevelDirectoryList[pDirectoryListIndexer];//pFTPPath;
    try
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpServerPath);
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        request.Credentials = new NetworkCredential(_username, _password);
        FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        Stream responseStream = response.GetResponseStream();
        StreamReader reader = new StreamReader(responseStream);
        thirdLevelDirectoryList = new List<string>();
        int i = 0;
        TreeNode ftpServerDirectory = tVDirectories.Nodes[0].Nodes.Add(pTopLevelDirectoryList[pDirectoryListIndexer]);//pFTPPath);
        while (!reader.EndOfStream)
        {
            string streamFTPPath = reader.ReadLine(); // reads the next line of the stream
            thirdLevelDirectoryList.Add(streamFTPPath); // adds the full path to the string list
            Console.WriteLine("...........: " + thirdLevelDirectoryList[i]);
            string newTreeNode = streamFTPPath.Substring(streamFTPPath.IndexOf(@"/") + 1);
            ftpServerDirectory.Nodes.Add(newTreeNode); // adds only the subfolder or file name to the view
            i++;
        }
        reader.Close();
        response.Close();
        // recursive call
        pDirectoryListIndexer++;
        try
        {
            getFTPDirectoryList(pTopLevelDirectoryList, pDirectoryListIndexer);
        }
        catch (ArgumentOutOfRangeException ex)
        {
            // start next level directory list
            //pDirectoryListIndexer = 0;
            //getFTPDirectoryList(thirdLevelDirectoryList, 0);
        }
    }
    catch (WebException ex)
    {
        MessageBox.Show("The following Exceptions occurs:\n" + ex.Message, "Exception occurs", MessageBoxButtons.OK, MessageBoxIcon.Error);
    }
}
As you can see, I haven't figured out how to browse through the FTP folders.
You may also have noticed that this is my first time working with the FTP protocol.
For example, I don't know whether I should open a new request every time the ListDirectory command is sent.
I want something like this:
root/
...folder1
......file1infolder1
...file2
...folder2
......file3infolder2
I hope you can understand me :D and sorry for my bad English.
This is my solution for a recursive method that lists all files and folders from an FTP path,
BUT
it is inefficient and practically unusable!
If you only have about five folders this method will work, but with more than roughly five folders it effectively never ends (sure, it will finish SOME DAY).
So for everyone reading this:
Think twice about using a recursive method to list the FTP directory!
You are better off sending the NLIST FTP command only once the user "opens" a folder.
private void FtpNlistRecursive(string pPath)
{
    try
    {
        DirectoryListOfCurrent = new List<string>();
        _ftpServerFullPath = "ftp://" + _serverIP + ":" + _serverPort + "/" + pPath;
        string newItem = "";
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(_ftpServerFullPath);
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        request.Credentials = new NetworkCredential(_username, _password);
        FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        Stream responseStream = response.GetResponseStream();
        StreamReader reader = new StreamReader(responseStream);
        while (!reader.EndOfStream)
        {
            newItem = reader.ReadLine();
            string shortItem = pPath.Substring(pPath.IndexOf(@"/") + 1); // turns "Folder1/File1.txt" into "File1.txt"
            if (!shortItem.Equals(newItem))
            {
                try
                {
                    if (pPath.Equals("/"))
                    {
                        DirectoryListOfCurrent.Add(newItem);
                        directoryListOfAll.Add(newItem);
                    }
                    else
                    {
                        string completePath = pPath + newItem.Substring(newItem.IndexOf(@"/"));
                        DirectoryListOfCurrent.Add(completePath);
                        directoryListOfAll.Add(completePath);
                    }
                }
                catch (ArgumentOutOfRangeException ex)
                {
                    // entries like "File3.txt" contain no "/", so simply ignore them
                }
            }
        }
        reader.Close();
        response.Close();
        foreach (string item in DirectoryListOfCurrent)
        {
            FtpNlistRecursive(item);
        }
    }
    catch (Exception ex)
    {
        ExceptionOccurs(ex);
    }
}
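As noted above, loading on demand scales much better than the full recursive walk. A minimal sketch of that idea, assuming a WinForms TreeView named tVDirectories and the existing _serverIP/_serverPort/_username/_password fields; each node gets a dummy child so the expand arrow appears, and the real ListDirectory request is sent only when the user expands it (for brevity the sketch does not distinguish files from folders):
// Sketch: lazy-load one FTP directory level when the user expands a node.
// Assumes _serverIP, _serverPort, _username, _password and tVDirectories exist.
private void LoadDirectory(TreeNode parentNode, string path)
{
    parentNode.Nodes.Clear(); // remove the dummy placeholder node

    var request = (FtpWebRequest)WebRequest.Create(
        "ftp://" + _serverIP + ":" + _serverPort + "/" + path);
    request.Method = WebRequestMethods.Ftp.ListDirectory;
    request.Credentials = new NetworkCredential(_username, _password);

    using (var response = (FtpWebResponse)request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
        {
            TreeNode child = parentNode.Nodes.Add(line);
            child.Tag = string.IsNullOrEmpty(path) ? line : path + "/" + line;
            child.Nodes.Add("loading..."); // dummy child so the expand arrow appears
        }
    }
}

// Wire this up once, e.g. in the form constructor:
// tVDirectories.BeforeExpand += (s, e) => LoadDirectory(e.Node, (string)e.Node.Tag ?? "");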

Upload multiple files to FTP in C#

I'm using the method below to upload files from a local server to an FTP server. Here I'm creating a new connection and initiating a new session for each and every file upload, then closing it again. How can I achieve this with a single session in C#?
This is my code:
public bool UploadTempFilesToFTP()
{
    string[] fileList;
    try
    {
        ConfiguredValues conObj = new ConfiguredValues();
        conObj.PickTheValuesFromConfigFile();
        fileList = Directory.GetFiles(conObj.tempPath);
        foreach (string FileName in fileList)
        {
            FtpWebRequest upldrequest = (FtpWebRequest)FtpWebRequest.Create(conObj.tempOutboundURL + FileName);
            upldrequest.UseBinary = true;
            upldrequest.KeepAlive = false;
            upldrequest.Timeout = -1;
            upldrequest.UsePassive = true;
            upldrequest.Credentials = new NetworkCredential(conObj.user, conObj.pass);
            upldrequest.Method = WebRequestMethods.Ftp.UploadFile;
            string destinationAddress = conObj.tempPath;
            FileStream fs = File.OpenRead(destinationAddress + FileName);
            byte[] buffer = new byte[fs.Length];
            fs.Read(buffer, 0, buffer.Length);
            fs.Close();
            Stream requestStr = upldrequest.GetRequestStream();
            requestStr.Write(buffer, 0, buffer.Length);
            requestStr.Close();
            requestStr.Flush();
            FtpWebResponse response = (FtpWebResponse)upldrequest.GetResponse();
            response.Close();
            File.Delete(destinationAddress + FileName);
        }
        Console.WriteLine("Uploaded Successfully to Temp folder");
        return true;
    }
    catch (Exception ex)
    {
        Console.WriteLine("Upload failed. {0}", ex.Message);
        return false;
    }
}
It may be odd to answer an old question, but I tried almost everything to upload multiple files to FTP with no luck, while the solution is very simple and effective: looping. A foreach solved the issue for me. I use the function below to upload the files in one simple step.
public void Uploadbulkftpfiles(string[] list)
{
    bool ife; // does the remote folder exist?
    try
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://ftpsite.com/folder");
        request.Credentials = new NetworkCredential("Username", "Password");
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        ife = true;
    }
    catch (Exception)
    {
        ife = false;
    }
    ///////////////////////////////////////////// begin of upload process
    if (ife) // the folder already exists
    {
        foreach (var str in list)
        {
            try
            {
                // "str" is expected to be the local path of the file to upload
                FtpWebRequest requestUP2 = (FtpWebRequest)WebRequest.Create("ftp://ftpsite.com/folder/" + Path.GetFileName(str));
                requestUP2.Credentials = new NetworkCredential("UserName", "Password");
                requestUP2.Method = WebRequestMethods.Ftp.UploadFile;
                requestUP2.KeepAlive = false;
                requestUP2.UsePassive = true;
                using (Stream fileStream = File.OpenRead(str))
                using (Stream ftpStream = requestUP2.GetRequestStream())
                {
                    fileStream.CopyTo(ftpStream);
                }
            }
            catch (Exception ex1)
            {
                MessageBox.Show(ex1.Message);
            }
        }
    }
    else // the folder does not exist yet
    {
        // CREATE THE FOLDER
        try
        {
            FtpWebRequest request = (FtpWebRequest)WebRequest.Create("ftp://ftpsite.com/folder");
            request.Credentials = new NetworkCredential("UserName", "Password");
            request.Method = WebRequestMethods.Ftp.MakeDirectory;
            FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        }
        catch (Exception excr) { MessageBox.Show(excr.Message); }
        // UPLOAD THE FILES
        foreach (var str in list)
        {
            try
            {
                FtpWebRequest requestUP2 = (FtpWebRequest)WebRequest.Create("ftp://ftpsite.com/folder/" + Path.GetFileName(str));
                requestUP2.Credentials = new NetworkCredential("UserName", "Password");
                requestUP2.Method = WebRequestMethods.Ftp.UploadFile;
                requestUP2.KeepAlive = false;
                requestUP2.UsePassive = true;
                using (Stream fileStream = File.OpenRead(str))
                using (Stream ftpStream = requestUP2.GetRequestStream())
                {
                    fileStream.CopyTo(ftpStream);
                }
            }
            catch (Exception ex1)
            {
                MessageBox.Show(ex1.Message);
            }
        }
    }
}
The FTP protocol, as exposed by FtpWebRequest, works on a per-request basis.
You start a request with a method (in your case UploadFile).
The only thing you can do is keep the connection alive to avoid it being closed: set
upldrequest.KeepAlive = true;
on every request you create except the last one. This way only the first FtpWebRequest performs a login.
Then, when you create the last FtpWebRequest, set
upldrequest.KeepAlive = false;
and it will close the connection when done.
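A minimal sketch of that pattern applied to the loop from the question (the conObj fields and path handling are taken as-is from the original code; the only change of interest here is how KeepAlive is set):
// Sketch: reuse the FTP control connection by keeping KeepAlive = true
// on every request except the very last one.
for (int i = 0; i < fileList.Length; i++)
{
    string fileName = fileList[i];
    var upldrequest = (FtpWebRequest)WebRequest.Create(conObj.tempOutboundURL + fileName);
    upldrequest.Credentials = new NetworkCredential(conObj.user, conObj.pass);
    upldrequest.Method = WebRequestMethods.Ftp.UploadFile;
    upldrequest.UseBinary = true;
    upldrequest.UsePassive = true;

    // true for all but the last file, so the connection (and the login) is reused
    upldrequest.KeepAlive = (i < fileList.Length - 1);

    using (var fs = File.OpenRead(fileName))
    using (var requestStream = upldrequest.GetRequestStream())
    {
        fs.CopyTo(requestStream);
    }
    using (var response = (FtpWebResponse)upldrequest.GetResponse())
    {
        // optionally inspect response.StatusDescription here
    }
}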

How to save a webpage using C#?

How do I save a webpage using C#? I need to open a dialog asking for the path to save the file.
Any help?
Create a file chooser as explained on this blog, and then use a WebClient:
WebClient Client = new WebClient ();
Client.DownloadFile("pagename", " saveasname");
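Put together, a minimal sketch could look like this (the URL is a placeholder; a WinForms SaveFileDialog is assumed, as in the other answer below):
// Sketch: ask the user where to save, then download the page to that path.
using (var dialog = new SaveFileDialog())
{
    dialog.Filter = "HTML file (*.html)|*.html|All files (*.*)|*.*";
    if (dialog.ShowDialog() == DialogResult.OK)
    {
        using (var client = new WebClient())
        {
            client.DownloadFile("http://example.com/page.html", dialog.FileName);
        }
    }
}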
Here's another way:
private string DownlodHTMLPage(Uri url)
{
    WebResponse response = null;
    Stream stream = null;
    StreamReader sr = null;
    try
    {
        HttpWebRequest hwr = (HttpWebRequest)WebRequest.Create(url);
        // sometimes it doesn't work if the user agent is not set
        hwr.UserAgent = "Opera/9.80 (Windows NT 5.1; U; pl) Presto/2.2.15 Version/10.10";
        response = hwr.GetResponse();
        stream = response.GetResponseStream();
        // check if the content type of the page is text/xxx; you can add a statement for XHTML files
        if (!response.ContentType.ToLower().StartsWith("text/"))
        {
            return null;
        }
        string buffer = "", line;
        // get the stream reader
        sr = new StreamReader(stream);
        // download the HTML into the buffer
        while ((line = sr.ReadLine()) != null)
        {
            buffer += line + "\r\n"; // line with new line markers
        }
        return buffer;
    }
    catch (WebException e)
    {
        System.Console.WriteLine("Can't download from " + url + " 'cause " + e);
        return null;
    }
    catch (IOException e)
    {
        System.Console.WriteLine("Can't download from " + url + " 'cause " + e);
        return null;
    }
    finally
    {
        if (sr != null)
            sr.Close();
        if (stream != null)
            stream.Close();
        if (response != null)
            response.Close();
    }
}
Edit
To answer the question in Ranjana's comment: the method above just downloads a web page and returns it as a string. You can save it afterwards using e.g. a StreamWriter:
using (var writer = new StreamWriter(PATH_TO_FILE, false, Encoding.UTF8))
{
    writer.Write(DownlodHTMLPage(uri));
}
You can get the file path using a SaveFileDialog, e.g.:
SaveFileDialog dialog = new SaveFileDialog();
if (dialog.ShowDialog() == DialogResult.OK)
{
    string file = dialog.FileName;
    // rest of the code comes here
}
I hope that this is what you were asking for.

How do I programmatically download a file from a SharePoint site?

I have a SharePoint site with an Excel spreadsheet that I need to download on a scheduled basis.
Is this possible?
Yes, it is possible to download the file from SharePoint.
Once you have the URL for the document, it can be downloaded using HttpWebRequest and HttpWebResponse.
Attaching sample code:
private void DownLoadDocument(string strURL, string strFileName)
{
    HttpWebRequest request;
    HttpWebResponse response = null;
    request = (HttpWebRequest)WebRequest.Create(strURL);
    request.Credentials = System.Net.CredentialCache.DefaultCredentials;
    request.Timeout = 10000;
    request.AllowWriteStreamBuffering = false;
    response = (HttpWebResponse)request.GetResponse();
    Stream s = response.GetResponseStream();
    // Write to disk (myDownLoads is the local download folder, defined elsewhere)
    if (!Directory.Exists(myDownLoads))
    {
        Directory.CreateDirectory(myDownLoads);
    }
    string aFilePath = myDownLoads + "\\" + strFileName;
    FileStream fs = new FileStream(aFilePath, FileMode.Create);
    byte[] read = new byte[256];
    int count = s.Read(read, 0, read.Length);
    while (count > 0)
    {
        fs.Write(read, 0, count);
        count = s.Read(read, 0, read.Length);
    }
    // Close everything
    fs.Close();
    s.Close();
    response.Close();
}
You can also use the GetItem API of the Copy web service to download a file:
string aFileUrl = mySiteUrl + strFileName;
Copy aCopyService = new Copy();
aCopyService.UseDefaultCredentials = true;
byte[] aFileContents = null;
FieldInformation[] aFieldInfo;
aCopyService.GetItem(aFileUrl, out aFieldInfo, out aFileContents);
The file can be retrieved as a byte array.
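To actually persist what GetItem returns, writing the byte array to disk is enough; a quick sketch (the local path is a placeholder):
// Sketch: save the bytes returned by the Copy service to a local file.
if (aFileContents != null)
{
    File.WriteAllBytes(@"C:\Downloads\" + strFileName, aFileContents);
}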
You can also do this:
try
{
    using (WebClient client = new WebClient())
    {
        client.Credentials = new NetworkCredential("username", "password", "DOMAIN");
        client.DownloadFile(http_path, path);
    }
}
catch (Exception ex)
{
    MessageBox.Show("Error: " + ex.Message);
}
The link to the document in SharePoint should be a static URL. Use that URL in whatever solution you have to grab the file on your schedule.
Why not just use wget.exe <url>? You can put that line in a batch file and run it through the Windows Task Scheduler.
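If you'd rather stay in C#, the same idea is a tiny console program scheduled with the Windows Task Scheduler; a minimal sketch, with the URL, credentials, and target path as placeholders:
using System;
using System.Net;

// Sketch: console downloader intended to be run by the Windows Task Scheduler.
class ScheduledDownload
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // placeholder credentials and paths
            client.Credentials = new NetworkCredential("username", "password", "DOMAIN");
            client.DownloadFile("https://sharepoint.example.com/site/doc.xlsx",
                                @"C:\Downloads\doc.xlsx");
        }
        Console.WriteLine("Download finished: " + DateTime.Now);
    }
}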
