I need to connect to AWS without using the C# library, i.e. using only the HTTP REST endpoints. Is that possible?
The reason I want to do this is to give customers the flexibility to connect to any service; with the library, I have to rely on the library code to connect to the relevant services.
Also, can we create an AWS connection once and use it throughout the session, instead of passing a token or user name & password in the headers again and again?
Here is what I tried using the C# AWS library; I need to achieve the same using REST endpoints.
public bool sendMyFileToS3(System.IO.Stream localFilePath, string bucketName, string subDirectoryInBucket, string fileNameInS3)
{
IAmazonS3 client = new AmazonS3Client(RegionEndpoint.USEast1);
TransferUtility utility = new TransferUtility(client);
TransferUtilityUploadRequest request = new TransferUtilityUploadRequest();
if (string.IsNullOrEmpty(subDirectoryInBucket))
{
request.BucketName = bucketName; //no subdirectory just bucket name
}
else
{
// subdirectory and bucket name
request.BucketName = bucketName + @"/" + subDirectoryInBucket;
}
request.Key = fileNameInS3; //file name up in S3
request.InputStream = localFilePath;
request.ContentType = "";
utility.Upload(request); //commencing the transfer
return true; //indicate that the file was sent
}
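For reference, the raw S3 REST API is stateless: every request must carry its own AWS Signature Version 4 signature, so there is no connection object you can create once and reuse. One way to keep an SDK-free client simple is a pre-signed URL. Below is a minimal sketch, assuming the pre-signed URL is generated by some component that does hold the credentials (the class and method names are illustrative):

using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class RestUpload
{
    // PUT the stream to a pre-signed S3 URL; no AWS SDK needed on the client.
    public static async Task<bool> PutToPresignedUrlAsync(string presignedUrl, Stream content)
    {
        using (var http = new HttpClient())
        using (var body = new StreamContent(content))
        {
            HttpResponseMessage response = await http.PutAsync(presignedUrl, body);
            return response.IsSuccessStatusCode; // S3 returns 200 OK on success
        }
    }
}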
We need to read a CSV file of around 2 GB which is stored in Azure Data Lake Storage Gen1. The purpose is to render the data in a grid format (UI) with high performance when the user requests it.
We are using .NET Core 2.1 (C#) to build an API for this.
var creds = new ClientCredential(applicationId, clientSecret);
var clientCreds = ApplicationTokenProvider.LoginSilentAsync(tenantId, creds).GetAwaiter().GetResult();
// Create ADLS client object
AdlsClient client = AdlsClient.CreateClient(adlsAccountFQDN, clientCreds);
string fileName = "/cchbc/sources/MVP/Data.csv";
string line;
var content = new StringBuilder(); // avoids O(n^2) string concatenation on a large file
using (var readStream = new StreamReader(client.GetReadStream(fileName)))
{
    while ((line = readStream.ReadLine()) != null)
    {
        content.Append(line);
    }
}
I have tried the above code but it failed with the following error:
GETFILESTATUS failed with HttpStatus:Forbidden RemoteException: AccessControlException GETFILESTATUS failed with error 0x83090aa2 (Forbidden. ACL verification failed. Either the resource does not exist or the user is not authorized to perform the requested operation.)
Any suggestion would be very helpful. Thanks in advance.
If you want to use a service principal to access files stored in Azure Data Lake Gen1, you need to configure an ACL for the service principal. An ACL has three permissions: Read (read the contents of a file), Write (write or append to a file), and Execute (traverse the child items of a folder).
For example, to read the file /test/test.csv, configure the ACL as below (Execute to traverse each folder on the path, Read on the file itself):

Operation        /     /test/   /test/test.csv
Read test.csv    --X   --X      R--
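If the entries need to be granted in code rather than in the portal, the same SDK used below (Microsoft.Azure.DataLake.Store) has an ACL API. A hedged sketch, assuming the ModifyAclEntries method and using a placeholder AAD object ID:

using System.Collections.Generic;
using Microsoft.Azure.DataLake.Store;
using Microsoft.Azure.DataLake.Store.Acl;

// "objectId" is the service principal's AAD object ID - an assumed placeholder.
string objectId = "<service principal object id>";
client.ModifyAclEntries("/", new List<AclEntry> {
    new AclEntry(AclType.user, objectId, AclScope.Access, AclAction.ExecuteOnly) });
client.ModifyAclEntries("/test", new List<AclEntry> {
    new AclEntry(AclType.user, objectId, AclScope.Access, AclAction.ExecuteOnly) });
client.ModifyAclEntries("/test/test.csv", new List<AclEntry> {
    new AclEntry(AclType.user, objectId, AclScope.Access, AclAction.ReadOnly) });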
Code
string appId = "service principal appId";
string appSecret = "service principal appSecret";
string domain = "service principal domain";
var serviceSettings = ActiveDirectoryServiceSettings.Azure;
serviceSettings.TokenAudience = new Uri(@"https://datalake.azure.net/");
var creds = await ApplicationTokenProvider.LoginSilentAsync(domain, appId, appSecret, serviceSettings);
string accountName = "testadls02";
AdlsClient client = AdlsClient.CreateClient($"{accountName}.azuredatalakestore.net", creds);
string fileName = "/test/test.csv";
string line = null;
using (var readStream = new StreamReader(client.GetReadStream(fileName)))
{
while ((line = await readStream.ReadLineAsync()) != null) {
Console.WriteLine(line);
}
}
For more details, please refer to the Azure Data Lake Storage Gen1 access-control documentation.
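On the original performance goal: concatenating every line of a 2 GB file into one string will not scale for a grid UI. A minimal paging sketch over the same GetReadStream call (the method and page parameters are illustrative, not from the question):

using System.Collections.Generic;
using System.IO;
using Microsoft.Azure.DataLake.Store;

static class CsvPaging
{
    // Reads one page of lines; skipping still scans sequentially from the start,
    // so for true random access over 2 GB consider pre-chunking or an index.
    public static List<string> ReadPage(AdlsClient client, string fileName, int pageIndex, int pageSize)
    {
        var page = new List<string>(pageSize);
        using (var reader = new StreamReader(client.GetReadStream(fileName)))
        {
            long toSkip = (long)pageIndex * pageSize;
            for (long i = 0; i < toSkip && reader.ReadLine() != null; i++) { } // skip earlier pages
            string line;
            while (page.Count < pageSize && (line = reader.ReadLine()) != null)
                page.Add(line);
        }
        return page;
    }
}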
I need to upload a folder (which contains subfolders and files) from one server to another from C# code. I have done some research and found that we can achieve this using FTP, but with that I am able to move only files and not the entire folder. Any help here is appreciated.
FtpWebRequest (like every other FTP client in the .NET Framework) has no explicit support for recursive file operations, including uploads. You have to implement the recursion yourself:
List the local directory
Iterate the entries, uploading files and recursing into subdirectories (listing them again, etc.)
void UploadFtpDirectory(
string sourcePath, string url, NetworkCredential credentials)
{
IEnumerable<string> files = Directory.EnumerateFiles(sourcePath);
foreach (string file in files)
{
using (WebClient client = new WebClient())
{
Console.WriteLine($"Uploading {file}");
client.Credentials = credentials;
client.UploadFile(url + Path.GetFileName(file), file);
}
}
IEnumerable<string> directories = Directory.EnumerateDirectories(sourcePath);
foreach (string directory in directories)
{
string name = Path.GetFileName(directory);
string directoryUrl = url + name;
try
{
Console.WriteLine($"Creating {name}");
FtpWebRequest requestDir =
(FtpWebRequest)WebRequest.Create(directoryUrl);
requestDir.Method = WebRequestMethods.Ftp.MakeDirectory;
requestDir.Credentials = credentials;
requestDir.GetResponse().Close();
}
catch (WebException ex)
{
FtpWebResponse response = (FtpWebResponse)ex.Response;
if (response.StatusCode ==
FtpStatusCode.ActionNotTakenFileUnavailable)
{
// probably exists already
}
else
{
throw;
}
}
UploadFtpDirectory(directory, directoryUrl + "/", credentials);
}
}
For background on the complicated code around creating the folders, see:
How to check if an FTP directory exists
Use the function like:
string sourcePath = @"C:\source\local\path";
// root path must exist
string url = "ftp://ftp.example.com/target/remote/path/";
NetworkCredential credentials = new NetworkCredential("username", "password");
UploadFtpDirectory(sourcePath, url, credentials);
A simpler variant, if you do not need a recursive upload:
Upload directory of files to FTP server using WebClient
Or use FTP library that can do the recursion on its own.
For example, with the WinSCP .NET assembly you can upload a whole directory with a single call to Session.PutFilesToDirectory:
// Setup session options
SessionOptions sessionOptions = new SessionOptions
{
Protocol = Protocol.Ftp,
HostName = "ftp.example.com",
UserName = "username",
Password = "password",
};
using (Session session = new Session())
{
// Connect
session.Open(sessionOptions);
// Upload files
session.PutFilesToDirectory(
    @"C:\source\local\path", "/target/remote/path").Check();
}
The Session.PutFilesToDirectory method is recursive by default.
(I'm the author of WinSCP)
The following code is used by software on a server outside AWS to obtain some information from a file within an S3 bucket in Amazon. This data is then broken up and used for other purposes.
List<Document> documentList = new List<Document>();
try
{
AmazonS3Config amazonS3Config = new AmazonS3Config();
amazonS3Config.RegionEndpoint = Amazon.RegionEndpoint.GetBySystemName(Settings.AWSRegion);
if (Settings.Proxy == true)
{
if (Settings.IsMasterService == true)
{
amazonS3Config.ProxyHost = Settings.ProxyHost;
amazonS3Config.ProxyPort = Settings.ProxyPort;
amazonS3Config.ProxyCredentials = System.Net.CredentialCache.DefaultCredentials;
}
else
{
if (Settings.IsCompanyStore == true)
{
amazonS3Config.ProxyHost = Settings.ProxyHostCompanyStore;
amazonS3Config.ProxyPort = Settings.ProxyPortCompanyStore;
NetworkCredential corpProxyCreds = new NetworkCredential(Settings.ProxyUserNameCompanyStore, Settings.ProxyPasswordCompanyStore);
amazonS3Config.ProxyCredentials = corpProxyCreds;
}
}
}
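// Note: amazonCreds (an AWSCredentials instance) is constructed elsewhere and is not shown in the question.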
AmazonS3Client s3 = new AmazonS3Client(amazonCreds, amazonS3Config);
GetObjectRequest req = new GetObjectRequest();
req.BucketName = Settings.S3BucketName;
req.Key = Settings.S3ObjectName;
using (GetObjectResponse response = s3.GetObject(req))
if (response.HttpStatusCode == System.Net.HttpStatusCode.OK)
{
using (Stream amazonStream = response.ResponseStream)
{
StreamReader amazonStreamReader = new StreamReader(amazonStream);
string _lne = string.Empty;
while ((_lne = amazonStreamReader.ReadLine()) != null)
{
string[] _cfglines = _lne.Split('&');
foreach (string c in _cfglines)
{
string[] _fle = c.Split('|');
Document d = new Document();
d.Name = _fle[1];
d.FolderPath = _fle[0];
documentList.Add(d);
}
}
}
}
else
{
EventHandling.RaiseDebugEvent("response.HttpStatusCode.ToString() = " + response.HttpStatusCode.ToString());
throw new Exception("Could not obtain master configuration file. Status: " + response.HttpStatusCode.ToString());
}
}
catch (Exception ex)
{
EventHandling.RaiseDebugEvent(" ReturnCloudCaptureDocumentList ex.tostring = " + ex.ToString());
EventHandling.RaiseEvent(ex.Message, System.Diagnostics.EventLogEntryType.Error);
}
return documentList;
We have two different types of servers outside AWS. One behind a proxy, one not behind a proxy.
On the server not behind a proxy, this code works fine.
On the server behind a web proxy, this code fails every time with the following error:
"Error making request with Error Code ServiceUnavailable and Http Status Code ServiceUnavailable. No further error information was returned by the service."
Reviewing the Amazon documentation, the ServiceUnavailable error occurs when you are making too many requests to S3 in a short space of time. That isn't true of this scenario, however: we are making only one request, and even if we were making many, it would not explain why the code works fine on one server but fails on the other, with the only difference being the presence of a proxy.
Any advice would be appreciated.
(Well shoot, if no one else wants the reputation, I'll take it. ;) )
There are at least three things I can think of (ht to @Collin-Dauphinee).
Access to the bucket may be restricted to the non-proxied machine's IP.
Your proxy may be mangling the request.
Your proxy may be refusing to forward the request.
A quick probe through the same proxy, without the SDK, can help tell these apart; see the sketch below.
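A minimal probe sketch (the endpoint and proxy values are illustrative placeholders, not taken from the question):

using System;
using System.Net;

class ProxyProbe
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create(
            "https://your-bucket.s3.amazonaws.com/");          // placeholder endpoint
        request.Proxy = new WebProxy("http://proxyhost:8080"); // same proxy the SDK is configured with
        request.Proxy.Credentials = CredentialCache.DefaultCredentials;
        try
        {
            using (var response = (HttpWebResponse)request.GetResponse())
                Console.WriteLine("Proxy forwarded the request: " + response.StatusCode);
        }
        catch (WebException ex)
        {
            // Even a 403 AccessDenied from S3 proves the proxy forwarded the request;
            // a 503 or a proxy error here instead points at the proxy, not S3 throttling.
            var status = (ex.Response as HttpWebResponse)?.StatusCode;
            Console.WriteLine("Request failed: " + (status?.ToString() ?? ex.Status.ToString()));
        }
    }
}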
Our system has been used to upload millions of files over several years. The clients use the following code to send an authentication token and zip file to our WEB API on Windows Server 2008 R2. On our Windows 7 devices, the system works great. As we are attempting to move to Windows 10 devices, we have suddenly encountered an issue where the received file has blocks of data in a different order than the source file. The problem only occurs about half of the time, which makes it very difficult to track down.
client code (.NET 4.5)
private static void UploadFile(string srcFile, string username, string password)
{
if (File.Exists(srcFile))
{
ConnectionUtilities connUtil = new ConnectionUtilities();
string authToken = connUtil.GetAuthToken(username, password);
using (HttpContent authContent = new StringContent(authToken))
{
using (HttpContent fileStreamContent = new ByteArrayContent(File.ReadAllBytes(srcFile)))
{
FileInfo fi = new FileInfo(srcFile);
using (HttpClient client = new HttpClient())
using (MultipartFormDataContent formData = new MultipartFormDataContent())
{
client.DefaultRequestHeaders.ExpectContinue = false;
formData.Add(authContent, "auth");
formData.Add(fileStreamContent, "data", fi.Name);
var response = client.PostAsync(ConfigItems.hostName + "UploadData", formData).Result;
if (response.IsSuccessStatusCode)
{
File.Delete(srcFile);
}
}
}
}
}
}
WEB API code (.NET 4.5.2)
public async Task<HttpResponseMessage> PostUploadData()
{
if (Request.Content.IsMimeMultipartContent())
{
MultipartFormDataStreamProvider streamProvider =
    new MultipartFormDataStreamProvider(HttpContext.Current.Server.MapPath("~/app_data"));
await Request.Content.ReadAsMultipartAsync(streamProvider);
string auth = streamProvider.FormData["auth"];
if (auth != null)
{
auth = HttpUtility.UrlDecode(auth);
}
if (Util.IsValidUsernameAndPassword(auth))
{
string username = Util.GetUsername(auth);
foreach (var file in streamProvider.FileData)
{
DirectoryInfo di = new DirectoryInfo(ConfigurationManager.AppSettings["DataRoot"]);
di = di.CreateSubdirectory(username);
string contentFileName = file.Headers.ContentDisposition.FileName;
di = di.CreateSubdirectory("storage");
FileInfo fi = new FileInfo(file.LocalFileName);
string destFileName = Path.Combine(di.FullName, contentFileName);
File.Move(fi.FullName, destFileName);
}
return new HttpResponseMessage(HttpStatusCode.OK);
}
}
return new HttpResponseMessage(HttpStatusCode.ServiceUnavailable);
}
The problem initially manifested as a zip file that couldn't be opened in Windows. Only by doing a hexadecimal compare did we determine that all of the data was there, just not in the same order as in the original.
Any thoughts on what might be causing the blocks of data to be reordered?
P.S. I know the HttpClient is not being used as effectively as possible.
After some long and tedious testing (Yay, scientific method) we determined that our web content filter software was causing the issue.
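For anyone hitting something similar: an end-to-end checksum would have caught this much sooner. A hedged sketch (the form field name and helper are illustrative, not part of the original code):

using System;
using System.IO;
using System.Security.Cryptography;

static class UploadIntegrity
{
    // SHA-256 of a file, hex-encoded; compare the client and server values.
    public static string ComputeSha256(string path)
    {
        using (var sha = SHA256.Create())
        using (var stream = File.OpenRead(path))
            return BitConverter.ToString(sha.ComputeHash(stream)).Replace("-", "");
    }
}
// Client side: formData.Add(new StringContent(UploadIntegrity.ComputeSha256(srcFile)), "sha256");
// Server side: recompute over the saved file and fail the request on mismatch.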
I am using the .NET library for Amazon Web Services for an application that uploads images to an Amazon S3 bucket. It is used in an internal service of an ASP.NET 4.5 application. The NuGet package name is AWSSDK and its version is the latest (as of writing) stable: 2.3.54.2
When I attempt to use the PutObject method on the PutObjectRequest object (to upload the image blob), it throws an exception and complains that the hostname is wrong.
var accessKey = Config.GetValue("AWSAccessKey");
var secretKey = Config.GetValue("AWSSecretKey");
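// Note: "config" below is an AmazonS3Config constructed elsewhere and not shown in the question.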
using (var client = new AmazonS3Client(accessKey, secretKey, config))
{
var request = new PutObjectRequest();
request.BucketName = Config.GetValue("PublicBucket");
request.Key = newFileName;
request.InputStream = resizedImage;
request.AutoCloseStream = false;
using (var uploadTaskResult = client.PutObject(request))
{
using (var uploadStream = uploadTaskResult.ResponseStream)
{
uploadStream.Seek(0, SeekOrigin.Begin);
var resultStr = new StreamReader(uploadStream).ReadToEnd();
}
}
}
The exception details are as follows:
Fatal unhandled exception in Web API component: System.Net.WebException: The remote name could not be resolved: 'images.ourcompany.com.http'
at System.Net.HttpWebRequest.GetRequestStream(TransportContext& context)
at System.Net.HttpWebRequest.GetRequestStream()
at Amazon.S3.AmazonS3Client.getRequestStreamCallback[T](IAsyncResult result)
at Amazon.S3.AmazonS3Client.endOperation[T](IAsyncResult result)
at Amazon.S3.AmazonS3Client.EndPutObject(IAsyncResult asyncResult)
at Tracks.Application.Services.Bp.BpTemplateService.UploadImage(Byte[] image, String fileName) in ...
I have tried to debug this in VS by stepping through the code but AWSSDK doesn't come with debug symbols. It should be noted that the remote host name (or bucket name as I think Amazon calls them) is images.ourcompany.com (not our real company's name!). I have checked the value of Config.GetValue("PublicBucket") and it is indeed images.ourcompany.com. At this stage I have exhausted my limited knowledge about Amazon S3 and have no theories about what causes the exception to be thrown.
I think you have to add a region endpoint and/or set the ServiceURL to establish the connection to Amazon S3; check the similar question below:
Coping folder inside AmazonS3 Bucket (c#)
Source code to upload images to Amazon S3:
AmazonS3Config cfg = new AmazonS3Config();
cfg.RegionEndpoint = Amazon.RegionEndpoint.SAEast1; // your region endpoint
string bucketName = "yourBucketName";
AmazonS3Client s3Client = new AmazonS3Client("your access key",
"your secret key", cfg);
PutObjectRequest request = new PutObjectRequest()
{
BucketName = bucketName,
InputStream = stream,
Key = fullName
};
s3Client.PutObject(request);
or
AmazonS3Config asConfig = new AmazonS3Config()
{
ServiceURL = "http://irisdb.s3-ap-southeast-2.amazonaws.com/",
RegionEndpoint = Amazon.RegionEndpoint.APSoutheast2
};
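Presumably this config is then passed to the client the same way as in the first variant; a one-line usage sketch (the credentials are placeholders):

AmazonS3Client s3Client = new AmazonS3Client("your access key", "your secret key", asConfig);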