I have a web API hosted on server 101.111.111.111. It accepts files as form data. My question is: how do I save these files on another server (e.g. 101.111.111.112)? I didn't find many examples of this on the web.
I have this code to save the files to a local folder:
var file = Request.Form.Files[i];
// fileInfo (LoanID, Alias) comes from elsewhere in the request
var folderName = Path.Combine("D:\\Loans", fileInfo.LoanID, fileInfo.Alias);
// folderName is already rooted, so Path.Combine returns it unchanged here
var pathToSave = Path.Combine(Directory.GetCurrentDirectory(), folderName);
if (!Directory.Exists(pathToSave))
{
Directory.CreateDirectory(pathToSave);
}
var fileName = ContentDispositionHeaderValue.Parse(file.ContentDisposition).FileName.Trim('"');
var fullPath = Path.Combine(pathToSave, fileName);
var dbPath = Path.Combine(folderName, fileName);
using (var stream = new FileStream(fullPath, FileMode.Create))
{
file.CopyTo(stream);
}
One way is to create a shared folder on server 101.111.111.112, map that share as a network drive (in Windows) on server 101.111.111.111, and then use that folder like a local folder in your code. You can also write to the share's UNC path directly.
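For example, a rough sketch of writing directly to the share's UNC path, reusing the variables from the snippet above (the share name Loans is an assumption, and the account the API runs under needs write access to the share):
// Hypothetical UNC path; replace "Loans" with your actual share name
var uncFolder = Path.Combine(@"\\101.111.111.112\Loans", fileInfo.LoanID, fileInfo.Alias);
// CreateDirectory is a no-op if the folder already exists
Directory.CreateDirectory(uncFolder);
var fullPath = Path.Combine(uncFolder, fileName);
using (var stream = new FileStream(fullPath, FileMode.Create))
{
    file.CopyTo(stream);
}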
I'm making a converter that receives an image/video and processes it into a webp file.
With jpg, png, and webm files I don't have any problem at all, but for some reason when I attempt to use a gif file I get this error: "Access to the path 'C:\Users\Desktop\computing\api\wwwroot\spin.gif' is denied."
The error occurs when I try to save the file received as an IFormFile.
My controller looks like this:
[HttpPost]
[DisableRequestSizeLimit]
public async Task<IActionResult> ConverterToWebp()
{
var webp = new WEBPConverter(new VideoSettings(_videoSettings));
var workingdir = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot");
var (file, _) = GetFile();
if (file == null)
return BadRequest("Arquivo inválido");
var (filepath, _) = await SaveToDir(file, workingdir);
var res = await webp.ConverterWebp(filepath);
if (!res.Success)
return BadRequest(res);
return File(res.bytes, "image/webp");
}
The GetFile() method looks like this:
private (IFormFile? file, string? mimetype) GetFile()
{
var files = HttpContext.Request.Form.Files;
var file = files.FirstOrDefault();
if (file == null || file.Length == 0)
return (null, null);
var contentTypeProvider = new FileExtensionContentTypeProvider();
var isBlob = file.FileName.ToLower() == "blob";
var mimetypeParseOk = contentTypeProvider.TryGetContentType(file.FileName, out var mimeType);
if (isBlob)
return (file, "blob");
if (mimetypeParseOk)
return (file, mimeType);
return (null, null);
}
And the method that triggers the error, SaveToDir(), looks like this:
private async Task<(string filepath, string filename)> SaveToDir(IFormFile file, string path)
{
// Strip spaces and diacritics from the uploaded file name
var filename = new string(file.FileName
.Replace(' ', '_')
.Normalize(NormalizationForm.FormD)
.Where(ch => char.GetUnicodeCategory(ch) != UnicodeCategory.NonSpacingMark)
.ToArray());
var filepath = Path.Combine(path, filename);
using var stream = new FileStream(filepath, FileMode.Create);
await file.CopyToAsync(stream);
return (filepath, filename);
}
The entire project uses .NET 6.0.
If I take a file with a .gif extension and rename it to .webm, I get no error, even though the conversion doesn't work well.
I don't know why this save method fails only for gif files with that generic error; the path exists and has the right permissions, which is why other file types don't trigger it.
By default IIS does not have permission to write to the wwwroot folder. You would need to grant the IIS_IUSRS group permissions on that folder. I would not recommend this approach, as it may be a potential security risk. The approach you could take instead is:
string path = Path.GetTempFileName(); // creates an empty file in the temp folder and returns its full path
With this you save the file under a temporary file name in the temp folder of the account the application runs as (by default IIS_IUSRS). Don't forget to delete the file after you're done with it.
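A rough sketch of that approach, using a hypothetical SaveToTemp helper in place of the SaveToDir method from the question (the try/finally cleanup around the conversion call is an assumption about how you would wire it into the controller):
private async Task<string> SaveToTemp(IFormFile file)
{
    // Path.GetTempFileName() creates an empty file in the temp folder and returns its full path
    var filepath = Path.GetTempFileName();
    using (var stream = new FileStream(filepath, FileMode.Create))
    {
        await file.CopyToAsync(stream);
    }
    return filepath;
}
// Caller side: convert, then always delete the temp file
var filepath = await SaveToTemp(file);
try
{
    var res = await webp.ConverterWebp(filepath);
    // ...
}
finally
{
    System.IO.File.Delete(filepath);
}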
If you want to go the route of granting access instead, do the following:
1. Go to where your inetpub folder is located.
2. Right-click on wwwroot and click Properties.
3. Switch to the Security tab and click Edit...
4. Look for the IIS_IUSRS user; it should be PCNAME\IIS_IUSRS.
5. Select Allow for all permissions.
In the end I managed to work around the problem by saving the file without an extension and using its byte array for the conversion.
But in fact the original question is not solved: any file with a .gif extension still gets the error. I tested by renaming a working webm file from "test.webm" to "test.gif" and got the same permission error.
This is the version of my method that gets no error:
private async Task<(string filepath, string filename)> SaveToDir(IFormFile file, string path)
{
// Unix-epoch timestamp gives a reasonably unique, extension-less temp file name
var timespanEpoch = (int)(DateTime.UtcNow - new DateTime(1970, 1, 1)).TotalSeconds;
var filename = $"web_temp_file-{timespanEpoch}";
var filepath = Path.Combine(path, filename);
using var stream = new FileStream(filepath, FileMode.Create);
await file.CopyToAsync(stream);
return (filepath, filename);
}
I'm using ASP.NET Core to create a project that uploads a file to a specific location on the server.
Here is the code:
TempFileName = Regex.Replace(vehicle.FileUpload.FileName, "[^a-zA-Z0-9_]+", "_", RegexOptions.Compiled);
var directiveToUpload = Path.Combine(_hostingEnvironment.WebRootPath, "images\\UploadFile");
if (!System.IO.Directory.Exists(directiveToUpload))
{
System.IO.Directory.CreateDirectory(directiveToUpload);
}
await SaveFileToServer(TempFileName);
Save the file:
async Task SaveFileToServer(string FileName)
{
if (vehicle.FileUpload.Length > 0)
{
using (var stream = new FileStream(Path.Combine(directiveToUpload, FileName), FileMode.Create))
{
await vehicle.FileUpload.CopyToAsync(stream);
}
}
}
The file gets uploaded, but the sub folder is not created on Mac; images\UploadFile does not work there.
I can't reproduce this issue, but since the directory separator is not \ on Mac, I suspect the path is not interpreted correctly. You could use the Path.DirectorySeparatorChar constant instead to avoid this.
var directiveToUpload = Path.Combine(_hostingEnvironment.WebRootPath, "images\\UploadFile");
Would become
var directiveToUpload = Path.Combine(_hostingEnvironment.WebRootPath, $"images{Path.DirectorySeparatorChar}UploadFile");
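Another option along the same lines is to pass the folder names as separate arguments and let Path.Combine insert the correct separator for the current platform:
var directiveToUpload = Path.Combine(_hostingEnvironment.WebRootPath, "images", "UploadFile");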
I'm new to .NET Core and Azure. I have created an API with SQL Server, and I use Dapper to save the path to the database. The image is posted as form data and saved like this:
private async Task<string> WriteFile(Image image)
{
String fileName;
IFormFile file = image.image;
long Id = image.ShopId;
try
{
var extension = "." + file.FileName.Split('.')[file.FileName.Split('.').Length - 1];
fileName = Guid.NewGuid().ToString() + extension;
var path = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot\\cccc", fileName);
using (var bits = new FileStream(path, FileMode.Create))
{
await file.CopyToAsync(bits);
}
Image imageupload = new Image(path,Id);
toDb(imageupload);
}
catch (Exception e)
{
return e.Message;
}
return fileName;
}
public void toDb(Image imageUpload)
{
string path = imageUpload.path;
long ShopId = unchecked((int)imageUpload.ShopId);
using (IDbConnection dbConnection = Connection)
{
string sQuery = "UPDATE shop SET path = #path WHERE ShopId = #ShopId ;";
dbConnection.Open();
dbConnection.Execute(sQuery, new {path = path,ShopId = ShopId});
}
}
Before I deployed to Azure it returned the image path "F:\\xxxx\\yyyyy\\zzzzzz\\aaaaaa\\wwwroot\\bbbbbb\\5d665cbc-679d-4926-862b-4e10f9358e8a.png".
After I deployed, it returns the image path
D:\\home\\site\\wwwroot\\wwwroot\\Shops\\a81c757e-df7e-4cf6-b778-20fc5fcf922d.png
Can I view the image using this path? If so, how?
If the error is in the path the file is saved to, how can I fix it? If I change the saved path to wwwroot\\bbbbbb\\5d665cbc-679d-4926-862b-4e10f9358e8a.png, can I view the file from the client app? If that is also not possible, how can I fix this?
Can I view the image using this path? If so, how?
Yes, in an Azure WebApp D:\home is shared for us. You can find more information in the Azure WebApp Sandbox documentation.
You can use Kudu to view, upload, or download the files (to access your Kudu console, navigate to https://*****.scm.azurewebsites.net using your deployment credentials, where ***** is the name of your Web App).
You can also use an FTP tool to download files from or upload files to the Azure WebApp site.
Can I view the file from the client app? If that is also not possible, how can I fix this?
I recommend storing the image in Azure Storage instead; it is easy to access from the client side. For more information about how to use Azure Storage, please refer to the Azure Storage documentation.
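As a rough sketch of that idea using the Azure.Storage.Blobs package (the connection string and the container name "shop-images" are placeholders, and WriteFileToBlob is a hypothetical counterpart to the WriteFile method above):
// Requires: using Azure.Storage.Blobs;
private async Task<string> WriteFileToBlob(Image image, string connectionString)
{
    IFormFile file = image.image;
    var blobName = Guid.NewGuid().ToString() + Path.GetExtension(file.FileName);

    // "shop-images" is a placeholder container name
    var container = new BlobContainerClient(connectionString, "shop-images");
    await container.CreateIfNotExistsAsync();

    using (var stream = file.OpenReadStream())
    {
        await container.UploadBlobAsync(blobName, stream);
    }

    // Store this URL in the database instead of a local file path
    return container.GetBlobClient(blobName).Uri.ToString();
}
You could then pass the returned URL to toDb() in place of the local path, so the client can load the image directly from Blob Storage.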
While trying to upload files to SharePoint Online remotely via SharePointClient upload, I am encountering a file size limit of 2 MB. From my searches it seems that people have overcome this limit using PowerShell, but is there a way to overcome it using the native SharePointClient package in .NET C#? Here is my existing code sample:
using (var ctx = new Microsoft.SharePoint.Client.ClientContext(httpUrl))
{
ctx.Credentials = new Microsoft.SharePoint.Client.SharePointOnlineCredentials(username, passWord);
try
{
string uploadFilename = string.Format(@"{0}.{1}", string.IsNullOrWhiteSpace(filename) ? submissionId : filename, formatExtension);
logger.Info(string.Format("SharePoint uploading: {0}", uploadFilename));
new SharePointClient().Upload(ctx, sharePointDirectoryPath, uploadFilename, formatData);
}
catch (Exception ex)
{
// exception handling was elided in the original snippet (a try block needs a catch or finally)
logger.Info(string.Format("SharePoint upload failed: {0}", ex.Message));
}
}
I have read on the following site that you can use ContentStream; I'm just not sure how that maps to SharePointClient (if at all):
https://msdn.microsoft.com/en-us/pnp_articles/upload-large-files-sample-app-for-sharepoint
UPDATE:
Per the suggested solution I now have:
public void UploadDocumentContentStream(ClientContext ctx, string libraryName, string filePath)
{
Web web = ctx.Web;
using (FileStream fs = new FileStream(filePath, FileMode.Open))
{
FileCreationInformation flciNewFile = new FileCreationInformation();
// This is the key difference for the first case - using ContentStream property
flciNewFile.ContentStream = fs;
flciNewFile.Url = System.IO.Path.GetFileName(filePath);
flciNewFile.Overwrite = true;
List docs = web.Lists.GetByTitle(libraryName);
Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(flciNewFile);
ctx.Load(uploadFile);
ctx.ExecuteQuery();
}
}
Still not quite working, but I will update again when it is successful. The current error is:
Could not find file 'F:approot12-09-2017.zip'.
FINALLY
I am using files from Amazon S3, so the solution was to take my byte data and stream it to the call:
public void UploadDocumentContentStream(ClientContext ctx, string libraryName, string filename, byte[] data)
{
Web web = ctx.Web;
FileCreationInformation flciNewFile = new FileCreationInformation();
flciNewFile.ContentStream = new MemoryStream(data);
flciNewFile.Url = filename;
flciNewFile.Overwrite = true;
List docs = web.Lists.GetByTitle(libraryName);
Microsoft.SharePoint.Client.File uploadFile = docs.RootFolder.Files.Add(flciNewFile);
ctx.Load(uploadFile);
ctx.ExecuteQuery();
}
You can use FileCreationInformation to create a new file and provide the contents via a FileStream, then add the file to the destination library. This should help you get around the 2 MB limit you are hitting with the upload method you are using. Example below:
FileCreationInformation newFile = new FileCreationInformation
{
Url = fileName,
Overwrite = false,
ContentStream = new FileStream(fileSourcePath, FileMode.Open)
};
var createdFile = list.RootFolder.Files.Add(newFile);
ctx.Load(createdFile);
ctx.ExecuteQuery();
In the example the destination library is list, which you need to get a reference to first. I can show you how to do this if required.
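For completeness, a minimal sketch of getting that list reference (the library title "Documents" is a placeholder):
List list = ctx.Web.Lists.GetByTitle("Documents");
ctx.Load(list, l => l.RootFolder);
ctx.ExecuteQuery();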
I am using a free MS Azure virtual webserver for my site.
On my dev machine I can successfully create a CSV file, save it to a relative temp directory, and then download it to the browser client.
However, when I run it from the Azure site, I get the following error:
System.IO.DirectoryNotFoundException: Could not find a part of the
path 'D:\home\site\wwwroot\temp\somefile.csv'.
Does the free version of Azure Websites block us from saving files to disk? If not, where are we allowed to create/save files that we generate on the fly?
Code Example
private FilePathResult SaveVolunteersToCsvFile(List<Volunteer> volunteers)
{
string virtualPathToDirectory = "~/temp";
string physicalPathToDirectory = Server.MapPath(virtualPathToDirectory);
string fileName = "Volunteers.csv";
string pathToFile = Path.Combine(physicalPathToDirectory, fileName);
StringBuilder sb = new StringBuilder();
// Column Headers
sb.AppendLine("First Name,Last Name,Phone,Email,Approved,Has Background Check");
// CSV Rows
foreach (var volunteer in volunteers)
{
sb.AppendLine(string.Format("{0},{1},{2},{3},{4},{5},{6}",
volunteer.FirstName, volunteer.LastName, volunteer.MobilePhone.FormatPhoneNumber(), volunteer.EmailAddress, volunteer.IsApproved, volunteer.HasBackgroundCheckOnFile));
}
using (StreamWriter outfile = new StreamWriter(pathToFile))
{
outfile.Write(sb.ToString());
}
return File(Server.MapPath(virtualPathToDirectory + "/" + fileName), "text/csv", fileName);
}
Make sure that the ~/temp folder gets published to the server, as it's possible your publish process isn't including it.
Azure Websites provides environment variables that you can use to find things like a temporary storage folder. For example, there is a "TEMP" variable you can read to get the path to the TEMP folder specific to your website.
Change line 2 in your method to this:
//string physicalPathToDirectory = Server.MapPath(virtualPathToDirectory);
string physicalPathToDirectory = Environment.GetEnvironmentVariable("TEMP");
Then change the last line to this:
//return File(Server.MapPath(virtualPathToDirectory + "/" + fileName), "text/csv", fileName);
return File(pathToFile, "text/csv", fileName);