Nancy HttpFile issue - C#

I have problems parsing uploaded files from a request.
My file is 1338521 bytes, but Nancy sometimes reports the size as 1751049 or even 3200349 bytes.
On my Windows PC it works fine; on the Linux server the problem appears, so I can't save the file.
string result = Convert.ToBase64String(Core.ReadBytesFromStream(file.Value));
using (MemoryStream ms = new MemoryStream(Convert.FromBase64String(result)))
{
    using (Bitmap bm2 = new Bitmap(ms))
    {
        bm2.Save(path);
    }
}
Any ideas?

You don't need to convert the file like that.
var filename = Path.Combine(storagePath, Request.Files[0].Name);

using (var fileStream = new FileStream(filename, FileMode.Create))
{
    Request.Files[0].Value.CopyTo(fileStream);
}
Validate the file when it comes in to ensure the extension is accepted, create a save path, and copy the stream to a new file on the filesystem.
That's it.
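The validation step could be sketched as a small whitelist check before the save path is built. This is a minimal sketch, not part of Nancy: the allowed-extension set and the helper name are assumptions.

```csharp
using System;
using System.IO;
using System.Linq;

// Hypothetical helper for the validation step: whitelist the uploaded
// file's extension before combining it with the storage path. The
// allowed set here is an assumption, not something Nancy prescribes.
string[] allowed = { ".png", ".jpg", ".jpeg", ".gif" };

bool HasAllowedExtension(string fileName) =>
    allowed.Contains(Path.GetExtension(fileName).ToLowerInvariant());

Console.WriteLine(HasAllowedExtension("photo.PNG")); // prints True
```

With a check like this in place, the module can reject the request early and only then combine storagePath with the validated file name.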

Related

Changing the name of a file while in a filestream or byte array to send via WebAPI

I would like to take the contents of a file and rename the file while in memory to send with a different file name using an API.
The Goals:
Not alter the original file (file on disk) in any way.
Not create additional files (like a copy of the file with a new name). I'm trying to keep IO access as low as possible and do everything in memory.
Change the Name of a file object (in memory) to a different name.
Upload the file object to a WebAPI on another machine.
Have "FileA.txt" on source MachineA and have "FileB.txt" on destination MachineB.
I don't think it would matter but I have no plans to write the file back to the system (MachineA) with the new name, it will only be used to send the file object (in memory) to MachineB via a Web API.
I found a solution that uses reflection to accomplish this...
FileStream fs = new FileStream(@"C:\myfile.txt", FileMode.Open);
var myField = fs.GetType()
    .GetField("_fileName", BindingFlags.Instance | BindingFlags.NonPublic);
myField.SetValue(fs, "my_new_filename.txt");
However, it's been a few years since that solution was given. Is there a better way to do this in 2021?
Another way would be to define the filename when you save the file on MachineB.
You could pass the desired filename as part of the Web API payload and use it when writing the file.
// buffer as byte[] and fileName as string would come from the request
using (FileStream fs = new FileStream(fileName, FileMode.Create))
{
    fs.Write(buffer, 0, buffer.Length);
}
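On the sending side, the filename can travel with the bytes in a multipart request without touching the original file. A minimal client-side sketch; the endpoint URL and the form field name "file" are placeholders, not part of the answer above.

```csharp
using System;
using System.Net.Http;

// Sketch: post in-memory bytes under an arbitrary file name via
// multipart/form-data. The bytes never go back to disk on MachineA.
byte[] buffer = { 1, 2, 3 }; // stands in for FileA.txt's contents
var content = new MultipartFormDataContent();
var part = new ByteArrayContent(buffer);
content.Add(part, "file", "FileB.txt"); // the name MachineB will see
// await new HttpClient().PostAsync("https://machineb/api/upload", content);
```

The third argument to Add sets the Content-Disposition filename on that part, which is what most server-side frameworks surface as the uploaded file's name.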
The best way I could come up with was using my old method from years ago. The following shows how I used it. I only do this to mask the original filename from the third-party WebAPI I'm sending it to.
// filePath: c:\test\my_secret_filename.txt
private byte[] GetBytesWithNewFileName(string filePath)
{
    byte[] file = null;
    using (var fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
    {
        // Change the name of the file in memory (does not affect the original file)
        var fileNameField = fs.GetType().GetField(
            "_fileName",
            BindingFlags.Instance | BindingFlags.NonPublic
        );
        // If I leave out the next line, the file name field will have the full filePath
        // string as its value in the resulting byte array. This replaces that with
        // only the file name I wish to pass along: "my_masked_filename.txt".
        fileNameField.SetValue(fs, "my_masked_filename.txt");

        // Get the size of the file and make sure it's compatible with
        // the BinaryReader object to be used
        int fileSize;
        try { fileSize = Convert.ToInt32(fs.Length); }
        catch (OverflowException)
        { throw new Exception("The file is too big to convert using a binary reader."); }

        // Read the file into a byte array
        using (var br = new BinaryReader(fs)) { file = br.ReadBytes(fileSize); }
    }
    return file;
}

Is it possible to upload a CSV file to an SFTP server directly from a MemoryStream?

Whenever I upload a file to the SFTP server with a .csv extension, the only thing inside the file is the text System.IO.MemoryStream. With a .txt extension it contains all the values. I can manually convert the .txt to .csv and it is fine. Is it possible to upload directly to the SFTP server as a CSV file?
The SFTP Service is using the SSH.NET library by Renci.
Using statement:
using (var stream = csvFileWriter.Write(data, new CsvMapper()))
{
    byte[] file = Encoding.UTF8.GetBytes(stream.ToString());
    sftpService.Put(SftpCredential.Credentials.Id, file, $"/file.csv");
}
SFTP service:
public void Put(int credentialId, byte[] source, string destination)
{
    using (SftpClient client = new SftpClient(GetConnectionInfo(credentialId)))
    {
        ConnectClient(client);
        using (MemoryStream memoryStream = new MemoryStream(source))
        {
            client.BufferSize = 4 * 1024; // bypass payload error on large files
            client.UploadFile(memoryStream, destination);
        }
        DisconnectClient(client);
    }
}
Solution:
The csvFileWriter I was using returned a Stream, not a MemoryStream, so by switching the csvFileWriter and CsvPut() over to MemoryStream it worked.
Updated using statement:
using (var stream = csvFileWriter.Write(data, new CsvMapper()))
{
    stream.Position = 0;
    sftpService.CsvPut(SftpCredential.Credentials.Id, stream, $"/file.csv");
}
Updated SFTP service:
public void CsvPut(int credentialId, MemoryStream source, string destination)
{
    using (SftpClient client = new SftpClient(GetConnectionInfo(credentialId)))
    {
        ConnectClient(client);
        client.BufferSize = 4 * 1024; // bypass payload error on large files
        client.UploadFile(source, destination);
        DisconnectClient(client);
    }
}
It looks like csvFileWriter.Write already returns a MemoryStream, and its ToString returns the string "System.IO.MemoryStream". That's the root cause of your problem.
Additionally, as you already have a MemoryStream, it's overkill to copy it to yet another MemoryStream; upload it directly. Copying the data over and over again is just a waste of memory.
Like this:
var stream = csvFileWriter.Write(data, new CsvMapper());
stream.Position = 0;
client.UploadFile(stream, destination);
See also:
Upload data from memory to SFTP server using SSH.NET
When uploading memory stream with contents created by csvhelper using SSH.NET to SFTP server, the uploaded file is empty
A simple test code to upload in-memory data:
var stream = new MemoryStream();
stream.Write(Encoding.UTF8.GetBytes("this is test"));
stream.Position = 0;

using (var client = new SftpClient("example.com", "username", "password"))
{
    client.Connect();
    client.UploadFile(stream, "/remote/path/file.txt");
}
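For the CSV case specifically, the text can be written straight into a MemoryStream and handed to UploadFile. A minimal sketch; a plain StreamWriter stands in for the csvFileWriter from the question, and the column names are invented.

```csharp
using System.IO;
using System.Text;

// Sketch: build CSV content entirely in memory, ready for
// SftpClient.UploadFile. leaveOpen: true keeps the MemoryStream
// alive after the writer is disposed (disposing flushes it).
var stream = new MemoryStream();
using (var writer = new StreamWriter(stream, new UTF8Encoding(false), 1024, leaveOpen: true))
{
    writer.WriteLine("Id,Name");
    writer.WriteLine("1,Alpha");
}
stream.Position = 0; // rewind so UploadFile reads from the start
// client.UploadFile(stream, "/remote/file.csv");  // as in the answer above
```

Disposing the StreamWriter before uploading matters: that is what flushes the buffered text into the underlying stream.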
You can avoid the unnecessary memory stream entirely by uploading straight from a file:
using (var sftp = new SftpClient(GetConnectionInfo(SftpCredential.GetById(credentialId).Id)))
{
    sftp.Connect();
    using (var uplfileStream = System.IO.File.OpenRead(fileName))
    {
        sftp.UploadFile(uplfileStream, fileName, true);
    }
    sftp.Disconnect();
}

create zip from byte[] and return to browser

I want to create a zip file and return it to the browser so that it downloads the zip to the downloads folder.
var images = imageRepository.GetAll(allCountryId);
using (FileStream f2 = new FileStream("SudaAmerica", FileMode.Create))
using (GZipStream gz = new GZipStream(f2, CompressionMode.Compress, false))
{
foreach (var image in images)
{
gz.Write(image.ImageData, 0, image.ImageData.Length);
}
return base.File(gz, "application/zip", "SudaAmerica");
}
I have tried the above, but I get an error saying the stream is disposed.
Is this possible, or should I use another library than GZipStream?
The problem here is exactly what it says: you are handing it something based on gz, but gz gets disposed the moment you leave the using.
One option would be to wait until outside the using block, then tell it to use the filename of the thing you just wrote ("SudaAmerica"). However, IMO you shouldn't actually be writing a file here at all. If you use a MemoryStream instead, you can use .ToArray() to get a byte[] of the contents, which you can use in the File method. This requires no IO access, which is a win in about 20 different ways. Well, maybe 3 ways. But...
var images = imageRepository.GetAll(allCountryId);

using (MemoryStream ms = new MemoryStream())
{
    using (GZipStream gz = new GZipStream(ms, CompressionMode.Compress, false))
    {
        foreach (var image in images)
        {
            gz.Write(image.ImageData, 0, image.ImageData.Length);
        }
    }
    return base.File(ms.ToArray(), "application/zip", "SudaAmerica");
}
Note that a gzip stream is not the same as a .zip archive, so I very much doubt this will have the result you want. Zip archive creation is available elsewhere in the .NET framework, but it is not via GZipStream.
You probably want ZipArchive
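A minimal sketch of building a real .zip archive in memory with ZipArchive; the entry names and the `images` tuple array are stand-ins for the question's imageRepository results.

```csharp
using System.IO;
using System.IO.Compression; // ZipArchive; may need the System.IO.Compression reference

// Sketch: build a .zip archive in memory from named byte arrays and
// return the raw bytes, ready for File(bytes, "application/zip", ...).
byte[] BuildZip((string Name, byte[] Data)[] images)
{
    using var ms = new MemoryStream();
    // leaveOpen: true so disposing the archive (which finalizes the
    // zip central directory) does not also close the MemoryStream.
    using (var archive = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
    {
        foreach (var (name, data) in images)
        {
            var entry = archive.CreateEntry(name);
            using var entryStream = entry.Open();
            entryStream.Write(data, 0, data.Length);
        }
    }
    return ms.ToArray();
}
```

Note the archive must be disposed before ToArray is called, since disposal is what writes the zip's closing records.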

Read a PDF into a string or byte[] and write that string/byte[] back to disk

I am having a problem in my app where it reads a PDF from disk, and then has to write it back to a different location later.
The emitted file is not a valid PDF anymore.
In very simplified form, I have tried reading/writing it using
var bytes = File.ReadAllBytes(@"c:\myfile.pdf");
File.WriteAllBytes(@"c:\output.pdf", bytes);
and
var input = new StreamReader(@"c:\myfile.pdf").ReadToEnd();
File.WriteAllText(@"c:\output.pdf", input);
... and about 100 permutations of the above with various encodings being specified. None of the output files were valid PDFs.
Can someone please lend a hand? Many thanks!!
In C#/.Net 4.0:
using (var i = new FileStream(@"input.pdf", FileMode.Open, FileAccess.Read))
using (var o = File.Create(@"output.pdf"))
    i.CopyTo(o);
If you insist on having the byte[] first:
using (var i = new FileStream(@"input.pdf", FileMode.Open, FileAccess.Read))
using (var ms = new MemoryStream())
{
    i.CopyTo(ms);
    byte[] rawdata = ms.GetBuffer();
    using (var o = File.Create(@"output.pdf"))
        ms.CopyTo(o);
}
The memory stream needs ms.Seek(0, SeekOrigin.Begin) (or ms.Position = 0) before the second CopyTo: the first CopyTo left the position at the end of the stream, so copying again from there writes nothing. Note also that GetBuffer returns the raw internal buffer, which may be longer than the data written; ToArray returns exactly the written bytes.
You're using File.WriteAllText to write your file out.
Try File.WriteAllBytes.
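A quick way to see why the text route corrupts a PDF: decoding arbitrary bytes as text and re-encoding them is lossy. A small sketch, not from the answer above; the byte values are invented.

```csharp
using System;
using System.Linq;
using System.Text;

// Sketch: round-tripping binary data through a string is lossy, which
// is why StreamReader + WriteAllText breaks a PDF.
byte[] original = { 0x25, 0x50, 0x44, 0x46, 0xFF, 0xD9 }; // "%PDF" plus bytes invalid in UTF-8
string asText = Encoding.UTF8.GetString(original); // invalid bytes become U+FFFD
byte[] back = Encoding.UTF8.GetBytes(asText);      // replacement chars, not the originals
Console.WriteLine(back.SequenceEqual(original));   // prints False
```

Any PDF contains byte sequences that are invalid in whatever text encoding the reader assumes, so some bytes are silently replaced on the way through the string.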

Include external file in XBAP application

How can I include an external file in an XBAP application? For example, a .dat file that I need to extract some data from? I want everything to happen inside the .xbap file; is that possible?
// Set the file's Build Action to Resource and rebuild.
byte[] ba = Properties.Resources.yourfilename;
byte[] ba1;

using (var compressedStream = new MemoryStream(ba))
using (var zipStream = new GZipStream(compressedStream, CompressionMode.Decompress))
using (var resultStream = new MemoryStream())
{
    zipStream.CopyTo(resultStream);
    ba1 = resultStream.ToArray();
}
You can add the .dat file as a resource to the project, then at runtime extract it using a stream and save it to a file in the temp folder.
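The extract-to-temp step might look like this. A minimal sketch: in the real app the bytes would come from Properties.Resources.yourfilename; here a plain byte[] and the helper name are stand-ins.

```csharp
using System;
using System.IO;

// Sketch: write resource bytes to the user's temp folder and return
// the path, so the rest of the app can read the .dat file normally.
string SaveToTemp(byte[] resourceBytes, string fileName)
{
    string tempPath = Path.Combine(Path.GetTempPath(), fileName);
    File.WriteAllBytes(tempPath, resourceBytes);
    return tempPath;
}

// e.g. var path = SaveToTemp(Properties.Resources.yourfilename, "data.dat");
Console.WriteLine(SaveToTemp(new byte[] { 7, 8, 9 }, "sample.dat"));
```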
