C# download file from FileStream

I want to export my data as a CSV file, and for that I'm using the CsvHelper library. However, I don't want all of the data in a single CSV file; each file should contain at most 1000 rows.
Here is what I tried for that limit:
var fileStream = new FileStream("/static/export.csv", FileMode.Create, FileAccess.Write);
var memoryStream = new MemoryStream();
var streamWriter = new StreamWriter(memoryStream);
var writer = new CsvWriter(streamWriter, config);
for (var i = 0; i < indexCount; i += 1000)
{
    var items = result.Skip(i).Take(1000);

    // ... logic for writing records with CsvHelper ...

    writer.Flush();
    memoryStream.Position = 0;
    byte[] data = memoryStream.ToArray();
    fileStream.Write(data, 0, data.Length);
}
For example, if I have 100,000 rows in the database, I want to end up with 100 CSV files. How can I download the chunked files from the stream?

You can use Enumerable.Chunk (available since .NET 6) to split the result into groups of 1000 and write each group to its own file:
var i = 1;
foreach (var chunk in result.Chunk(1000))
{
    using var fileStream = new FileStream($"/static/export{i++}.csv", FileMode.Create, FileAccess.Write);
    using var streamWriter = new StreamWriter(fileStream);
    using var writer = new CsvWriter(streamWriter, config);
    // logic for writing records with CsvHelper
}
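If that writing logic is just a straight dump of each chunk's rows, the loop body can be a single WriteRecords call. A sketch under that assumption (result already materialized in memory, config being the CsvConfiguration from the question):
var i = 1;
foreach (var chunk in result.Chunk(1000))
{
    using var fileStream = new FileStream($"/static/export{i++}.csv", FileMode.Create, FileAccess.Write);
    using var streamWriter = new StreamWriter(fileStream);
    using var writer = new CsvWriter(streamWriter, config);
    writer.WriteRecords(chunk); // CsvHelper writes a header plus one line per record in the chunk
}
On older runtimes without Enumerable.Chunk, the Skip(i).Take(1000) pattern from the question produces the same grouping.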

Related

Saving a binary file from a MemoryStream to a network path intermittently saves a 0-byte file

I have the following code, which works most of the time, for saving a file from a memory stream (pointing to a byte array) to a network location.
using (var writer = new BinaryWriter(new FileStream(filePath, FileMode.Create)))
using (var reader = new BinaryReader(stream))
{
    var chunkSize = 1024;
    var chunkCount = (int)reader.BaseStream.Length / chunkSize;
    var chunks = Enumerable.Range(0, chunkCount)
        .Select(_ => reader.ReadBytes(chunkSize));
    chunks.ForEach(c => writer.Write(c));
    writer.Write(reader.ReadBytes((int)reader.BaseStream.Length % chunkSize));
    writer.Close();
    reader.Close();
}
Is there any way the above code could end up saving a zero-byte file? Is it a bad idea to save to the network directly instead of using a "copy from temp" method?
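The "copy from temp" approach mentioned above would look roughly like this: write the stream to a local temporary file first, then copy the finished file to the network path, so an interrupted connection can at worst leave a stale temp file rather than a truncated or empty target. A sketch only, reusing the question's stream and filePath variables:
var tempPath = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());

// write everything to a local temp file first
using (var tempFile = new FileStream(tempPath, FileMode.Create, FileAccess.Write))
{
    stream.Position = 0;      // copy from the start of the source stream
    stream.CopyTo(tempFile);
}

// only then touch the network location; a failure here leaves at worst a stale temp file
File.Copy(tempPath, filePath, overwrite: true);
File.Delete(tempPath);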

Reading binary data from one file and writing it to another

I'm reading a binary file and writing it to another file in CP 437 encoding, skipping a few lines. But the output file is larger than the original and the data is corrupted. Any help identifying the issue?
StreamReader sStreamReader = new StreamReader(@"D:\Denesh\Input.txt");
string AllData = sStreamReader.ReadToEnd();
string[] rows = AllData.Split(",".ToCharArray());
FileStream fileStream = new FileStream(TransLog, FileMode.Open);
StreamReader streamReader = new StreamReader((Stream)fileStream, Encoding.GetEncoding(437));
StreamWriter streamWriter = new StreamWriter(outFile, false);
int num = 0;
int count = 0;
while (!streamReader.EndOfStream)
{
    string tlogline = streamReader.ReadLine();
    if (rows[count] == Convert.ToString(num))
    {
        ++count;
    }
    else
    {
        ++num;
        streamWriter.WriteLine(tlogline, streamReader.CurrentEncoding);
    }
}
fileStream.Close();
streamWriter.Close();
Adding a FileStream for the StreamWriter solved the issue. Thanks.
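Presumably that means constructing the StreamWriter over a FileStream with the same code page as the reader, so the output is written as CP 437 rather than the StreamWriter default of UTF-8 (which would turn every byte above 0x7F into a multi-byte sequence, making the file larger and apparently corrupted). A sketch under that assumption, using the question's outFile:
using (var outStream = new FileStream(outFile, FileMode.Create, FileAccess.Write))
using (var streamWriter = new StreamWriter(outStream, Encoding.GetEncoding(437)))
{
    // ... same while loop as above, writing each kept line with:
    // streamWriter.WriteLine(tlogline);
}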

ASP.NET Web API CsvHelper writing to the stream doesn't work

My initial requirement is to let the user download a file built from an object list, and for that I found this solution: https://stackoverflow.com/a/49207997/11178128.
But the problem is that when it gets to this line:
bin = stream.ToArray();
nothing has been written to the stream, so bin comes back as an empty array. What could be the problem?
Also, I'm hosting my Web API in a Windows service, and for some reason System.Web.HttpContext.Current.Response gives me null. Any idea why that might be?
Thanks in advance.
This is the code I have so far:
List<Device> devices;
using (StreamReader r = new StreamReader(String.Format(@"{0}\deviceList.json", savefilePath)))
{
    string json = r.ReadToEnd();
    devices = JsonConvert.DeserializeObject<List<Device>>(json);
}
byte[] bin;
//String.Format(@"{0}\devices.csv", savefilePath)
using (MemoryStream stream = new MemoryStream())
using (TextWriter textWriter = new StreamWriter(stream))
using (CsvWriter csv = new CsvWriter(textWriter))
{
    csv.Configuration.ShouldQuote = (field, context) => false;
    csv.WriteRecords(devices);
    bin = stream.ToArray();
}
This is related to another question, CsvHelper not writing anything to memory stream.
You just need to change your using statements so that the StreamWriter gets flushed before calling stream.ToArray():
List<Device> devices;
using (StreamReader r = new StreamReader(String.Format(@"{0}\deviceList.json", savefilePath)))
{
    string json = r.ReadToEnd();
    devices = JsonConvert.DeserializeObject<List<Device>>(json);
}
byte[] bin;
//String.Format(@"{0}\devices.csv", savefilePath)
using (MemoryStream stream = new MemoryStream())
{
    using (TextWriter textWriter = new StreamWriter(stream))
    using (CsvWriter csv = new CsvWriter(textWriter))
    {
        csv.Configuration.ShouldQuote = (field, context) => false;
        csv.WriteRecords(devices);
    }
    bin = stream.ToArray();
}
Actually, after a bit of struggling, I found that I was missing this line:
textWriter.Flush();
As mentioned in the reply below, I had to flush the textWriter object in order for the data to be written to the stream. Here is the working code:
byte[] data;
using (MemoryStream stream = new MemoryStream())
using (TextWriter textWriter = new StreamWriter(stream))
using (CsvWriter csv = new CsvWriter(textWriter))
{
    csv.Configuration.RegisterClassMap<DeviceMap>();
    csv.Configuration.ShouldQuote = (field, context) => false;
    csv.WriteRecords(values);
    textWriter.Flush();
    data = stream.ToArray();
}
return data;
using (var ms = new MemoryStream())
{
    using (var writer = new StreamWriter(ms))
    using (var csv = new CsvWriter(writer))
    {
        csv.WriteRecords(dbresponse);
    } // the closing brace here is important! It flushes the StreamWriter
    ms.ToArray(); // or ms.GetBuffer()
}
Now ms.ToArray() will contain the data written by CsvHelper.
For a field variant - for example a list of strings, which won't work with the WriteRecords method - you will need to use WriteField. I am submitting this here because this trifling issue caused me no small amount of pain.
Here is an async example:
var result = await GetListOfString();
using (var ms = new MemoryStream())
{
    using (var writer = new StreamWriter(ms))
    using (var csv = new CsvWriter(writer, CultureInfo.InvariantCulture))
    {
        foreach (var value in result)
        {
            csv.WriteField(value);
            await csv.NextRecordAsync();
        }
        await writer.FlushAsync();
        return ms.ToArray();
    }
}
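To turn the byte array into an actual download in classic ASP.NET Web API, the controller can return an HttpResponseMessage instead of going through System.Web.HttpContext.Current (which is typically null when the API is self-hosted, for example inside a Windows service). A sketch, assuming a hypothetical GetCsvBytes() helper that produces the array the same way as above:
public HttpResponseMessage GetDevicesCsv()
{
    byte[] bin = GetCsvBytes();   // hypothetical helper returning the CsvHelper output

    var response = new HttpResponseMessage(HttpStatusCode.OK)
    {
        Content = new ByteArrayContent(bin)
    };
    // header types live in System.Net.Http.Headers
    response.Content.Headers.ContentType = new MediaTypeHeaderValue("text/csv");
    response.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment")
    {
        FileName = "devices.csv"
    };
    return response;
}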

C# Spire Document.SaveToStream not working

I have the following code, but it just creates an empty 0 KB file.
using (var stream1 = new MemoryStream())
{
    MemoryStream txtStream = new MemoryStream();
    Document document = new Document();
    fileInformation.Stream.CopyTo(stream1);
    document.LoadFromStream(stream1, FileFormat.Auto);
    document.SaveToStream(txtStream, FileFormat.Txt);
    StreamReader reader = new StreamReader(txtStream);
    string text = reader.ReadToEnd();
    System.IO.File.WriteAllText(fileName + ".txt", text);
}
I know the data is loaded into document successfully, because if I use document.SaveToTxt("test.txt", Encoding.UTF8);
instead of the SaveToStream line, it exports the file properly.
What am I doing wrong?
When copying a stream, you need to reset its position to 0 before reading from it again. As seen in the answer here, you can do something like this with your streams:
stream1.Position = 0;
txtStream.Position = 0;
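Applied to the code in the question, that means rewinding stream1 before Spire reads it and rewinding txtStream before the StreamReader reads the text back. A sketch using the same objects as above:
using (var stream1 = new MemoryStream())
using (var txtStream = new MemoryStream())
{
    fileInformation.Stream.CopyTo(stream1);
    stream1.Position = 0;                     // rewind before Spire reads it

    var document = new Document();
    document.LoadFromStream(stream1, FileFormat.Auto);
    document.SaveToStream(txtStream, FileFormat.Txt);

    txtStream.Position = 0;                   // rewind before reading the text back
    using (var reader = new StreamReader(txtStream))
    {
        System.IO.File.WriteAllText(fileName + ".txt", reader.ReadToEnd());
    }
}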

Create and write to a text file in memory and convert it to a byte array in one go

How can I create a .csv file implicitly/automatically using the right approach, add text to that file in memory, and then convert the in-memory data to a byte array?
string path = @"C:\test.txt";
File.WriteAllLines(path, GetLines());
byte[] bytes = System.IO.File.ReadAllBytes(path);
With that approach I always create a file (good), write into it (good), then close it (bad), then open the file again from its path and read it from the hard disk (bad).
How can I improve that?
UPDATE
One nearly good approach would be:
using (var fs = new FileStream(@"C:\test.csv", FileMode.Create, FileAccess.ReadWrite))
{
    using (var memoryStream = new MemoryStream())
    {
        fs.CopyTo(memoryStream);
        return memoryStream.ToArray();
    }
}
but I am not able to write text into that FileStream... just bytes...
UPDATE 2
using (var fs = File.Create(@"C:\temp\test.csv"))
{
    using (var sw = new StreamWriter(fs, Encoding.Default))
    {
        using (var ms = new MemoryStream())
        {
            String message = "Message is the correct ääüö Pi(\u03a0), and Sigma (\u03a3).";
            sw.Write(message);
            sw.Flush();
            fs.CopyTo(ms);
            return ms.ToArray();
        }
    }
}
The string message is not persisted to the test.csv file. Does anyone know why?
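One thing that is certainly off in UPDATE 2: fs.CopyTo(ms) copies from the FileStream's current position, which is already at the end of the written data after the Flush, so nothing reaches the MemoryStream. Rewinding the FileStream first at least fills the byte array; a sketch of that one change, everything else as above:
using (var fs = File.Create(@"C:\temp\test.csv"))
using (var sw = new StreamWriter(fs, Encoding.Default))
using (var ms = new MemoryStream())
{
    String message = "Message is the correct ääüö Pi(\u03a0), and Sigma (\u03a3).";
    sw.Write(message);
    sw.Flush();

    fs.Position = 0;   // rewind so CopyTo sees the bytes that were just written
    fs.CopyTo(ms);
    return ms.ToArray();
}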
Write the text into a MemoryStream:
byte[] bytes = null;
using (var ms = new MemoryStream())
{
    using (TextWriter tw = new StreamWriter(ms))
    {
        tw.Write("blabla");
        tw.Flush();
        ms.Position = 0;
        bytes = ms.ToArray();
    }
}
UPDATE
Use the FileStream directly and write to the file:
using (var fs = new FileStream(@"C:\ed\test.csv", FileMode.Create, FileAccess.ReadWrite))
{
    using (TextWriter tw = new StreamWriter(fs))
    {
        tw.Write("blabla");
        tw.Flush();
    }
}
You can get a byte array from a string using an encoding:
Encoding.ASCII.GetBytes(aString);
Or
Encoding.UTF8.GetBytes(aString);
But I don't know why you would want a CSV as bytes. You could load the entire file into a string, add to it, and then save it:
string content;
using (var reader = new StreamReader(filename))
{
    content = reader.ReadToEnd();
}
content += "x,y,z";
using (var writer = new StreamWriter(filename))
{
    writer.Write(content);
}
Update: Create a CSV in memory and pass it back as bytes:
var stringBuilder = new StringBuilder();
foreach (var line in GetLines())
{
    stringBuilder.AppendLine(line);
}
return Encoding.ASCII.GetBytes(stringBuilder.ToString());
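One caveat, given the ääüö, Π and Σ characters in the earlier example: Encoding.ASCII replaces anything outside ASCII with "?". A UTF-8 variant of the same idea (GetLines() is the question's own placeholder):
var stringBuilder = new StringBuilder();
foreach (var line in GetLines())
{
    stringBuilder.AppendLine(line);
}
// UTF-8 keeps non-ASCII characters such as ä, ü, Π and Σ intact
return Encoding.UTF8.GetBytes(stringBuilder.ToString());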
