Asynchronous WebDAV upload - C#

Could somebody explain why my code isn't working as I expected? This is a .NET Framework 4.7.2 console app. I have one method in Main which triggers a synchronous void method on a file worker. This file worker looks for files in a folder on my disk and, if it finds any, uploads them.
The WebDav class has a public upload method which does some work and triggers a private method to upload the files one by one to WebDAV.
My problem is that the code either executes synchronously or doesn't execute at all. To upload files to WebDAV I'm using the WebDav.Client NuGet package.
What I want to achieve: if I have, say, 10 files, I want to fire the upload request for each file concurrently and cut the execution time. Cutting the total time is the main problem I'm trying to solve.
Here is my code sample:
private static void Main(string[] args)
{
    AudioCopy TestWebDav = new(_folderCount);
    TestWebDav.AsyncCopyToDav(_pathToSave);
}

public class AudioCopy : IDisposable
{
    public void AsyncCopyToDav(string path)
    {
        DirectoryInfo di = new(path);
        var task = Task.Run(() =>
        {
            foreach (FileInfo file in di.GetFiles())
            {
                WebDav.UploadAsync(file.Name, file.FullName);
            }
        });
        task.Wait();
    }
}
public static class WebDav
{
    public static void UploadAsync(string filename, string sourceFilePath, int? part = null)
    {
        string webDavTarget = ...some logic...;
        var task = Task.Run(() => UploadFileToWebDavAsync(webDavTarget, sourceFilePath));
    }

    private static async Task UploadFileToWebDavAsync(string path, string sourceFilePath)
    {
        WebDavClientParams @params = new()
        {
            Credentials = _credential,
            BaseAddress = new Uri(path),
            Timeout = new TimeSpan(0, 5, 0)
        };
        IWebDavClient client = new WebDavClient(@params);
        FileStream stream = new(sourceFilePath, FileMode.Open);
        WebDavResponse result = await client.PutFile(path, stream);
        if (!result.IsSuccessful)
        {
            throw new Exception();
        }
    }
}
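For reference, the usual way to get the concurrent behaviour described above is to make the upload chain async end-to-end and await all per-file tasks together (for example with Task.WhenAll) instead of discarding them inside Task.Run. A minimal sketch along those lines, reusing the names from the snippet above (the ...some logic... part is left exactly as in the question, and error handling/stream disposal are omitted):

public async Task CopyToDavAsync(string path)
{
    DirectoryInfo di = new(path);

    // Start one upload task per file, then await them all,
    // so the uploads run concurrently instead of one by one.
    var uploads = new List<Task>();
    foreach (FileInfo file in di.GetFiles())
    {
        uploads.Add(WebDav.UploadAsync(file.Name, file.FullName));
    }
    await Task.WhenAll(uploads);
}

public static class WebDav
{
    // Returns the task instead of starting a fire-and-forget Task.Run,
    // so callers can await completion and observe exceptions.
    public static Task UploadAsync(string filename, string sourceFilePath, int? part = null)
    {
        string webDavTarget = ...some logic...;
        return UploadFileToWebDavAsync(webDavTarget, sourceFilePath);
    }
}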

Related

How do I unit test that a file uploads to Amazon S3 in my C# project?

I've started writing unit tests for an Amazon S3 container for files. I've never written tests against S3 before, and I want to assert that a file is uploaded, or at the very least that the upload method for S3 is called.
So, all I'm attempting to do is unit test a method that uploads a file to an S3 container. Here is the test I have written so far to handle this.
Note: I am using NSubstitute for mocking in my tests.
using Amazon.S3;
using Amazon.S3.Model;
using Amazon.S3.Transfer;
using AmazonStorage.Repositories;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using NSubstitute;
using System;
using System.IO;
using System.Threading;
using System.Threading.Tasks;
namespace AmazonStorage.Tests.Repositories
{
    [TestClass]
    public class FileRepositoryTests
    {
        private readonly FileRepository _sut;
        private readonly IAmazonS3 _client = Substitute.For<IAmazonS3>();
        private readonly TransferUtility _transfer = Substitute.For<TransferUtility>();

        public FileRepositoryTests()
        {
            var request = new GetObjectRequest()
            {
                Key = "somefile.txt",
                BucketName = "aws-s3-apadmi"
            };
            _client.GetObjectAsync(request);
            _sut = new FileRepository(_client);
        }

        [TestMethod]
        public async Task PutFileInStorage_ShouldReceiveCallToUploadAsync()
        {
            // Arrange
            await _transfer.UploadAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<CancellationToken>());

            // Assert
            // I was thinking of something like this?
            // await _client.Received(1).UploadObjectFromStreamAsync(Arg.Any<string>(), Arg.Any<string>(), Arg.Any<Stream>());
        }
    }
}
As you'll see, I define my system under test, I bring in and mock the IAmazonS3 interface, and I also mock the TransferUtility, which is typically called when you upload a file to S3.
The FileRepository class has a file-exists check, which you'll see below in the method, and that is why I pass GetObjectAsync with a mocked GetObjectRequest into the FileRepository.
Here is my method to upload a file to S3.
private readonly IAmazonS3 _client;

public FileRepository(IAmazonS3 client)
{
    _client = client;
}

public async Task PutFileInStorage(string fileName, string content)
{
    try
    {
        await UploadToS3(fileName, content);
    }
    catch (Exception ex)
    {
        throw new Exception(ex.Message);
    }
}

public async Task UploadToS3(string fileName, string content)
{
    using MemoryStream ms = new(Encoding.UTF8.GetBytes(content));
    var uploadRequest = new TransferUtilityUploadRequest
    {
        InputStream = ms,
        Key = fileName,
        BucketName = "aws-s3-apadmi"
    };
    bool fileCheck = await CheckFileExists(uploadRequest.Key, uploadRequest.BucketName);
    if (fileCheck.Equals(true))
        throw new Exception("File already exists");
    if (!fileCheck)
    {
        // Upload file to storage
        try
        {
            var fileTransferUtility = new TransferUtility(_client);
            await fileTransferUtility.UploadAsync(uploadRequest);
        }
        catch (Exception ex)
        {
            throw new Exception(ex.Message);
        }
    }
}
The question is, how do I assert that this file has uploaded or at the very least that the UploadAsync has been called once?
Tight coupling to third-party dependencies (i.e. TransferUtility) is the design issue here:
//...
var fileTransferUtility = new TransferUtility(_client);
//...
This makes testing your code in isolation difficult.
Consider adding an additional layer of abstraction that wraps TransferUtility to give you more control over your code.
public interface ITransferUtility
{
    Task UploadAsync(TransferUtilityUploadRequest request);
}

public class FileTransferUtility : ITransferUtility
{
    private readonly IAmazonS3 client;

    public FileTransferUtility(IAmazonS3 client)
    {
        this.client = client;
    }

    public async Task UploadAsync(TransferUtilityUploadRequest uploadRequest)
    {
        bool fileCheck = await CheckFileExists(uploadRequest.Key, uploadRequest.BucketName);
        if (fileCheck)
            throw new Exception("File already exists");

        // Upload file to storage
        try
        {
            TransferUtility fileTransferUtility = new TransferUtility(client);
            await fileTransferUtility.UploadAsync(uploadRequest);
        }
        catch (Exception ex)
        {
            throw new Exception(ex.Message);
        }
    }

    //...CheckFileExists and other necessary members
}
Also note the Separation of Concerns / Single Responsibility Principle (SoC/SRP) applied by refactoring most of the checks out of the repository and into the new utility.
This now simplifies the repository to focus on its core concern:
private readonly ITransferUtility fileTransferUtility;

public FileRepository(ITransferUtility fileTransferUtility)
{
    this.fileTransferUtility = fileTransferUtility;
}

public async Task PutFileInStorage(string fileName, string content)
{
    try
    {
        await UploadToS3(fileName, content);
    }
    catch (Exception ex)
    {
        throw new Exception(ex.Message);
    }
}

private async Task UploadToS3(string fileName, string content)
{
    using MemoryStream ms = new(Encoding.UTF8.GetBytes(content));
    TransferUtilityUploadRequest uploadRequest = new TransferUtilityUploadRequest
    {
        InputStream = ms,
        Key = fileName,
        BucketName = "aws-s3-apadmi"
    };
    await fileTransferUtility.UploadAsync(uploadRequest);
}
The subject under test can now be isolated to verify the expected behavior:
[TestClass]
public class FileRepositoryTests
{
    private readonly FileRepository sut;
    private readonly ITransferUtility transfer = Substitute.For<ITransferUtility>();

    public FileRepositoryTests()
    {
        sut = new FileRepository(transfer);
    }

    [TestMethod]
    public async Task PutFileInStorage_ShouldReceiveCallToUploadAsync()
    {
        //Arrange
        transfer.UploadAsync(Arg.Any<TransferUtilityUploadRequest>()).Returns(Task.CompletedTask);
        string fileName = "somefile.txt";
        string bucketName = "aws-s3-apadmi";
        string content = "Hello World";

        //Act
        await sut.PutFileInStorage(fileName, content);

        //Assert
        transfer.Received().UploadAsync(Arg.Is<TransferUtilityUploadRequest>(x =>
            x.Key == fileName
            && x.BucketName == bucketName
        ));
    }
}
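Side note: since ITransferUtility.UploadAsync returns a Task, the Received(...) assertion above can also be awaited (await transfer.Received().UploadAsync(...)). The verification is the same either way; awaiting just avoids the compiler's "call is not awaited" warning inside the async test method.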

Execute xUnit tests in a Theory sequentially (not in parallel)

We host configuration files for different environments in our git repo. As part of the CI process I'd like to make sure that these config files are always valid. For this I've created this test, which copies the configurations, tries to start the server, and shuts it down right away.
public class DeployConfigurationValidationTests
{
    #region Private Fields

    private readonly ITestOutputHelper _testOutputHelper;
    private const string ServerBaseUrl = "http://localhost:44315";

    #endregion

    #region Constructors

    public DeployConfigurationValidationTests(ITestOutputHelper testOutputHelper)
    {
        _testOutputHelper = testOutputHelper;
    }

    #endregion

    #region Public Tests

    /// <summary>
    /// Copies all files contained in the directory specified by parameter <see cref="deployConfigDirectoryPath"/> to the executing directory and launches the application with this configuration.
    /// </summary>
    /// <param name="deployConfigDirectoryPath">The path of the directory containing the deploy configurations</param>
    [Theory]
    [InlineData("../../../../../Configurations/Dev/")]
    [InlineData("../../../../../Configurations/Int/")]
    [InlineData("../../../../../Configurations/Prod/")]
    public async Task ValidateDeployConfigurationsTest(string deployConfigDirectoryPath)
    {
        // Arrange (copy deploy configurations into directory where the test is running)
        var currentDirectory = Directory.GetCurrentDirectory();
        var configurationFilePaths = Directory.GetFiles(deployConfigDirectoryPath);
        foreach (var configurationFilePath in configurationFilePaths)
        {
            var configurationFileName = Path.GetFileName(configurationFilePath);
            var destinationFilePath = Path.Combine(currentDirectory, configurationFileName);
            File.Copy(configurationFilePath, Path.Combine(currentDirectory, destinationFilePath), true);
            _testOutputHelper.WriteLine($"Copied file '{Path.GetFullPath(configurationFilePath)}' to '{destinationFilePath}'");
        }

        // Act (launch the application with the deploy config)
        var hostBuilder = Program.CreateHostBuilder(null)
            .ConfigureWebHostDefaults(webHostBuilder =>
            {
                webHostBuilder.UseUrls(ServerBaseUrl);
                webHostBuilder.UseTestServer();
            });
        using var host = await hostBuilder.StartAsync();

        // Assert
        // Nothing to assert, if no error occurs, the config is fine
    }

    #endregion
}
The test works fine when running each InlineData case individually, but fails when running the Theory because the test cases are run in parallel by default. It will obviously not work to launch multiple (test) servers on the same port, using the same DLLs.
Question: How do I tell xUnit to run those tests sequentially?
We're using .NET Core 3.1 with xUnit 2.4.1.
One way to solve this problem is to make use of the CollectionAttribute.
Unfortunately, you can apply this attribute only to classes.
So you would need a small refactoring like this:
public abstract class ValidateDeploymentConfigBase
{
    // Members carried over from the original test class.
    private readonly ITestOutputHelper _testOutputHelper;
    private const string ServerBaseUrl = "http://localhost:44315";

    protected ValidateDeploymentConfigBase(ITestOutputHelper testOutputHelper)
    {
        _testOutputHelper = testOutputHelper;
    }

    public async Task ValidateDeployConfigurationsTest(string deployConfigDirectoryPath)
    {
        // Arrange
        var currentDirectory = Directory.GetCurrentDirectory();
        var configurationFilePaths = Directory.GetFiles(deployConfigDirectoryPath);
        foreach (var configurationFilePath in configurationFilePaths)
        {
            var configurationFileName = Path.GetFileName(configurationFilePath);
            var destinationFilePath = Path.Combine(currentDirectory, configurationFileName);
            File.Copy(configurationFilePath, Path.Combine(currentDirectory, destinationFilePath), true);
            _testOutputHelper.WriteLine($"Copied file '{Path.GetFullPath(configurationFilePath)}' to '{destinationFilePath}'");
        }

        var hostBuilder = Program.CreateHostBuilder(null)
            .ConfigureWebHostDefaults(webHostBuilder =>
            {
                webHostBuilder.UseUrls(ServerBaseUrl);
                webHostBuilder.UseTestServer();
            });

        // Act
        using var host = await hostBuilder.StartAsync();
    }
}
And then your test cases would look like this:
[Collection("Sequential")]
internal class ValidateDevDeploymentConfig: ValidateDeploymentConfigBase
{
[Fact]
public async Task ValidateDeployConfigurationsTest(string deployConfigDirectoryPath)
{
base.ValidateDeployConfigurationsTest("../../../../../Configurations/Dev/");
}
}
...
[Collection("Sequential")]
internal class ValidateProdDeploymentConfig : ValidateDeploymentConfigBase
{
[Fact]
public async Task ValidateDeployConfigurationsTest(string deployConfigDirectoryPath)
{
base.ValidateDeployConfigurationsTest("../../../../../Configurations/Prod/");
}
}
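For reference, and assuming xUnit's default configuration: test classes that share the same [Collection("Sequential")] name are placed into a single test collection, and xUnit never runs tests within one collection in parallel, so the Dev/Int/Prod classes above execute one after another. A collection definition class is optional here; it is only needed if you also want to share fixtures between those classes, for example:

[CollectionDefinition("Sequential")]
public class SequentialCollection
{
    // Intentionally empty. Add ICollectionFixture<T> interfaces here
    // only if the classes in this collection need shared fixtures.
}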

How to programmatically upload a file in .NET 5 to a .NET 5 REST API without losing content?

I created a .NET 5 REST API. I can easily upload files from Swagger; that works fine. When debugging, I can see that the byte array is not empty. Here is the controller method:
[Route("api/[controller]")]
[ApiController]
public class ImageController : ControllerBase
{
// POST api/<ImageController>
[HttpPost]
public void Post([FromForm] UserModel info)
{
var memoryStream = new MemoryStream();
info.Avatar.CopyTo(memoryStream);
var bytes = memoryStream.ToArray();
}
}
This is the UserModel:
public class UserModel
{
    [FromForm(Name = "avatar")]
    public IFormFile Avatar { get; set; }

    [FromForm(Name = "name")]
    public string Name { get; set; }
}
I also tried to upload a file programmatically. This is not entirely working: when I put breakpoints in the controller method, I see that the byte array is empty. So the call itself works, but the data is not coming through.
Here is the source code of the .NET 5 console application that uploads the file.
As explained, it does do something useful in that it really calls the REST API, which I can see by putting breakpoints in the controller method. However, my controller method does not get any data; the byte array is empty.
private static async Task TryUpload()
{
    using (var client = new HttpClient())
    {
        client.BaseAddress = new Uri("http://localhost:5000");
        string filePath = "C:\\Users\\daan1982\\Pictures\\RiderStart.png";
        var fileStream = File.Create(filePath);
        using (var content =
            new MultipartFormDataContent("Upload----" + DateTime.Now.ToString(CultureInfo.InvariantCulture)))
        {
            content.Add(new StreamContent(fileStream), "avatar", "RiderStart.png");
            var result = await client.PostAsync("/api/Image", content);
            var request = result.RequestMessage;
        }
    }
}
static async Task Main(string[] args)
{
    await TryUpload();
    Console.WriteLine("Hello World!");
}
As I named the content "avatar" both in the upload and in the request model, this should work fine. However, it does work, just not correctly, as the byte array is always empty.
What am I doing wrong? And how can I fix this?
File.Create "creates or overwrites a file in the specified path."
You probably want File.OpenRead.
That's how it worked for me.
static async Task Main(string[] args)
{
    await TryUpload();
}

private const string Boundary = "EAD567A8E8524B2FAC2E0628ABB6DF6E";

private static readonly HttpClient HttpClient = new()
{
    BaseAddress = new Uri("https://localhost:5001/")
};

private static async Task TryUpload()
{
    var requestContent = new MultipartFormDataContent(Boundary);
    requestContent.Headers.Remove("Content-Type");
    requestContent.Headers.TryAddWithoutValidation("Content-Type", $"multipart/form-data; boundary={Boundary}");

    var fileContent = await File.ReadAllBytesAsync(@"<path to file\Unbenannt.PNG");
    var byteArrayContent = new ByteArrayContent(fileContent);
    byteArrayContent.Headers.ContentType = MediaTypeHeaderValue.Parse("image/png");
    requestContent.Add(byteArrayContent, "avatar", "Unbenannt.PNG");

    var postResponse = await HttpClient.PostAsync("/api/Image", requestContent);
}

Struggling to get async working on deployment in ASP.NET

The code works fine in my development environment, but in deployment with a scalable architecture it appears to deadlock.
The objective here is to take a queue of API requests to send to SendGrid, batch them up, and process each batch one at a time.
First call, from the ASHX handler:
public void ProcessRequest(HttpContext context)
{
    var result = Code.Helpers.Email.Sendgrid.Queue.Process().Result;
    if (result.Success)
    {
Queue.Process()
public static async Task<GenericMethodResult> Process()
{
    var queueItems = GetQueueItemsToProcess();
    var batches = BatchQueueItems(queueItems);

    foreach (var batch in batches)
    {
        var r = await batch.SendToSendGrid();
        if (r.StopBatch)
        {
            break;
        }
    }
    return new GenericMethodResult(true);
}
SendToSendGrid()
public async Task<SendGridAPIMethodResponse> SendToSendGrid()
{
    var r = new SendGridAPIMethodResponse();
    var json = API.Functions.CreateJSONData(this);
    var sg = new SendGridClient(Settings.Email.SendgridAPIKey);
    dynamic response;
    if (Action == Action.UpdateRecipient)
    {
        response = await sg.RequestAsync(SendGridClient.Method.PATCH, urlPath: "contactdb/recipients", requestBody: json);
    }
    string jsonResponse = response.Body.ReadAsStringAsync().Result;
    // Process response...
    return r;
}
I've stripped out as much of the code as I could.
Is anyone able to tell me why this code is timing out in production?
This blocking call to .Result in SendToSendGrid() is causing a deadlock as you are mixing async and blocking calls.
string jsonResponse = response.Body.ReadAsStringAsync().Result;
Use async all the way through
var jsonResponse = await response.Body.ReadAsStringAsync();
and try to avoid mixing blocking calls in async methods.
You should also consider making your handler async as well, by using HttpTaskAsyncHandler.
public class MyHandler : HttpTaskAsyncHandler
{
    public override async Task ProcessRequestAsync(HttpContext context)
    {
        var result = await Code.Helpers.Email.Sendgrid.Queue.Process();
        if (result.Success)
        {
            //..other code
        }
    }
}

Accessing a file in a Windows 8 app using C#

This is basically for a Windows 8 app, and I'm writing a file using this method:
static async void WriteDataCords(int numDataCodewords)
{
    StorageFolder storageFolder = KnownFolders.DocumentsLibrary;
    var storageFile = await storageFolder.GetFileAsync("DataCodeWords.txt");
    string data = numDataCodewords.ToString();
    await FileIO.AppendTextAsync(storageFile, data);
}
and now I'm reading the file using this method:
StorageFolder storageFolder7 = KnownFolders.DocumentsLibrary;
var storageFile7 = await storageFolder7.GetFileAsync("DataCodeWords.txt");
string text7 = await Windows.Storage.FileIO.ReadTextAsync(storageFile7);
but when I run this program it throws an error: "Access denied or We can't access the file". Using this approach I'm writing and reading many files. Please let me know how to solve this problem.
Thanks in advance.
My problem is that when I access the file to read and display it, the file is still involved in the writing process, so I'm unable to access it and it shows an error.
So: is there an approach by which we only move forward when the file writing process is complete, and only then start the reading process?
You could use SemaphoreSlim, which limits the number of threads that can access a resource.
Below is an example of a class that handles writing/reading for a file. You create an instance of it when you want to write a file and call the WriteDataCords method. Then you need some way to access the correct instance when you want to read, and then call ReadDataCords:
public class FileReadWrite
{
    public string FileName { get; set; }

    public FileReadWrite(string fileName)
    {
        FileName = fileName;
    }

    private readonly SemaphoreSlim _semaphore = new SemaphoreSlim(1, 1);

    public async Task WriteDataCords(int numDataCodewords)
    {
        await _semaphore.WaitAsync();
        try
        {
            StorageFolder storageFolder = KnownFolders.DocumentsLibrary;
            var storageFile = await storageFolder.GetFileAsync(FileName);
            string data = numDataCodewords.ToString();
            await FileIO.AppendTextAsync(storageFile, data);
        }
        finally
        {
            _semaphore.Release();
        }
    }

    public async Task ReadDataCords()
    {
        await _semaphore.WaitAsync();
        try
        {
            StorageFolder storageFolder6 = KnownFolders.DocumentsLibrary;
            var storageFile7 = await storageFolder6.GetFileAsync(FileName);
            string text7 = await Windows.Storage.FileIO.ReadTextAsync(storageFile7);
        }
        finally
        {
            _semaphore.Release();
        }
    }
}
And calling code:
public class ClientCode
{
    public async void WriteFile()
    {
        var fileReadWrite = new FileReadWrite("DataCodeWords.txt");
        await fileReadWrite.WriteDataCords(42);
    }

    public async void ReadFile()
    {
        var fileReadWrite = GetFileReadWriteForFile("DataCodeWords.txt"); // Method for retrieving the correct instance of the FileReadWrite class
        await fileReadWrite.ReadDataCords();
    }

    private FileReadWrite GetFileReadWriteForFile(string fileName)
    {
    }
}
You could skip the FileReadWrite class (it adds complexity) and use SemaphoreSlim directly in the original code for writing/reading, but then you could only write/read one file at a time (which might not be a problem).
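For illustration, a minimal sketch of that simpler variant, assuming a single shared lock is acceptable (only one file operation runs at a time) and using a Task-returning write method instead of async void:

private static readonly SemaphoreSlim _fileLock = new SemaphoreSlim(1, 1);

static async Task WriteDataCordsAsync(int numDataCodewords)
{
    await _fileLock.WaitAsync();
    try
    {
        StorageFolder storageFolder = KnownFolders.DocumentsLibrary;
        var storageFile = await storageFolder.GetFileAsync("DataCodeWords.txt");
        await FileIO.AppendTextAsync(storageFile, numDataCodewords.ToString());
    }
    finally
    {
        _fileLock.Release(); // always release, even if the write throws
    }
}

Reading would wrap ReadTextAsync in the same WaitAsync/Release pattern, so a read can never start while a write still holds the lock.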
