Export to Excel with server-side Web API and a list - C#

I am getting the response from the API as a generic list,
IEnumerable<myClass> objClass.
Here I am trying to export the list to a CSV file using StreamWriter:
var serviceResponse = await services.GetProfileRepositoryAsync(requestDto, token);
if(requestDto.IsExportable)
{
MemoryStream stream = new MemoryStream();
StreamWriter writer = new StreamWriter(stream);
writer.Write(serviceResponse.dto.NewSoftwareFileDto);
writer.Flush();
stream.Position = 0;
return File(stream, "text/csv", "filesname.csv");
}
Since serviceResponse.dto.NewSoftwareFileDto returns a list, writer.Write is not writing the content. I once used ObjectResult with the writer.Write() method and it worked, but now I cannot recall how.
I want to avoid looping through the list and writing the data manually.

You can't send a DTO list directly to a MemoryStream using StreamWriter.Write(): passing a list to Write() just calls ToString() on it, so you get the type name rather than the items. Since you want to avoid explicit loops (for or foreach), you can use LINQ to project the existing DTO list into a List<string> of comma-separated values and then write its contents to the stream:
MemoryStream stream = new MemoryStream();
StreamWriter writer = new StreamWriter(stream);
// create list of strings from DTO list
List<string> items = serviceResponse.dto.NewSoftwareFileDto.Select(x =>
string.Join(",", x.Property1, x.Property2, ...)).ToList();
// join the rows with a newline between each one
string combined = string.Join(Environment.NewLine, items);
// write combined strings to StreamWriter
writer.Write(combined);
writer.Flush();
stream.Position = 0;
return File(stream, "text/csv", "filesname.csv");
Note that Property1, Property2 etc. represent the property names of the DTO objects, in the order you want them to appear as columns. The property names used in Select must exist on NewSoftwareFileDto.
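If you also want a header row and prefer not to hard-code the column names, a small reflection-based helper could do it. This is only a sketch (the ToCsv name is made up here), and it does not escape values containing commas or quotes:
// Sketch: build CSV text (header + rows) for any IEnumerable<T> using reflection.
// Requires using System.Linq;
public static string ToCsv<T>(IEnumerable<T> items)
{
    var props = typeof(T).GetProperties();
    var header = string.Join(",", props.Select(p => p.Name));
    var rows = items.Select(item =>
        string.Join(",", props.Select(p => p.GetValue(item)?.ToString() ?? "")));
    return string.Join(Environment.NewLine, new[] { header }.Concat(rows));
}
You could then write the result with writer.Write(ToCsv(serviceResponse.dto.NewSoftwareFileDto));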

Related

Serialising a collection is converting to a JObject rather than a JArray [duplicate]

I am simply trying to serialize and deserialize a string array in Bson format using Json.NET, but the following code fails:
var jsonSerializer = new JsonSerializer();
var array = new string [] { "A", "B" };
// Serialization
byte[] bytes;
using (var ms = new MemoryStream())
using (var bson = new BsonWriter(ms))
{
jsonSerializer.Serialize(bson, array, typeof(string[]));
bytes = ms.ToArray();
}
// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms))
{
// Exception here
array = jsonSerializer.Deserialize<string[]>(bson);
}
Exception message:
Cannot deserialize the current JSON object (e.g. {"name":"value"}) into type 'System.String[]' because the type requires a JSON array (e.g. [1,2,3]) to deserialize correctly.
To fix this error either change the JSON to a JSON array (e.g. [1,2,3]) or change the deserialized type so that it is a normal .NET type (e.g. not a primitive type like integer, not a collection type like an array or List) that can be deserialized from a JSON object. JsonObjectAttribute can also be added to the type to force it to deserialize from a JSON object.
How can I get this to work?
Set ReadRootValueAsArray to true on BsonReader
http://james.newtonking.com/projects/json/help/index.html?topic=html/P_Newtonsoft_Json_Bson_BsonReader_ReadRootValueAsArray.htm
This setting is required because the BSON data spec doesn't save metadata about whether the root value is an object or an array.
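Applied to the deserialization block from the question, the fix would look something like this (same jsonSerializer, bytes and array as above; in Json.NET 10+ the classes are BsonDataReader/BsonDataWriter from the Newtonsoft.Json.Bson package):
// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms))
{
    // Tell the reader the root BSON value is an array, not an object
    bson.ReadRootValueAsArray = true;
    array = jsonSerializer.Deserialize<string[]>(bson);
}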
Hmmm, from where I sit, your code should work, but Json.Net seems to think that your serialized array of strings is a dictionary. This could be because, according to the BSON specification, arrays actually do get serialized as a list of key-value pairs just like objects do. The keys in this case are simply the string representations of the array index values.
In any case, I was able to work around the issue in a couple of different ways:
Deserialize to a Dictionary and then manually convert it back to an array.
var jsonSerializer = new JsonSerializer();
var array = new string[] { "A", "B" };
// Serialization
byte[] bytes;
using (var ms = new MemoryStream())
using (var bson = new BsonWriter(ms))
{
jsonSerializer.Serialize(bson, array);
bytes = ms.ToArray();
}
// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms))
{
var dict = jsonSerializer.Deserialize<Dictionary<string, string>>(bson);
array = dict.OrderBy(kvp => kvp.Key).Select(kvp => kvp.Value).ToArray();
}
Wrap the array in an outer object.
class Wrapper
{
public string[] Array { get; set; }
}
Then serialize and deserialize using the wrapper object.
var jsonSerializer = new JsonSerializer();
var obj = new Wrapper { Array = new string[] { "A", "B" } };
// Serialization
byte[] bytes;
using (var ms = new MemoryStream())
using (var bson = new BsonWriter(ms))
{
jsonSerializer.Serialize(bson, obj);
bytes = ms.ToArray();
}
// Deserialization
using (var ms = new MemoryStream(bytes))
using (var bson = new BsonReader(ms))
{
obj = jsonSerializer.Deserialize<Wrapper>(bson);
}
Hope this helps.
As explained in this answer by James Newton-King, the BSON format doesn't save metadata about whether the root value is a collection, making it necessary to set BsonDataReader.ReadRootValueAsArray appropriately before beginning to deserialize.
One easy way to do this, when deserializing to some known POCO type (rather than dynamic or JToken), is to initialize the reader based on whether the root type will be serialized using an array contract. The following extension methods do this:
public static partial class BsonExtensions
{
public static T DeserializeFromFile<T>(string path, JsonSerializerSettings settings = null)
{
using (var stream = new FileStream(path, FileMode.Open))
return Deserialize<T>(stream, settings);
}
public static T Deserialize<T>(byte [] data, JsonSerializerSettings settings = null)
{
using (var stream = new MemoryStream(data))
return Deserialize<T>(stream, settings);
}
public static T Deserialize<T>(byte [] data, int index, int count, JsonSerializerSettings settings = null)
{
using (var stream = new MemoryStream(data, index, count))
return Deserialize<T>(stream, settings);
}
public static T Deserialize<T>(Stream stream, JsonSerializerSettings settings = null)
{
// Use BsonReader in Json.NET 9 and earlier.
using (var reader = new BsonDataReader(stream) { CloseInput = false }) // Let caller dispose the stream
{
var serializer = JsonSerializer.CreateDefault(settings);
//https://www.newtonsoft.com/json/help/html/DeserializeFromBsonCollection.htm
if (serializer.ContractResolver.ResolveContract(typeof(T)) is JsonArrayContract)
reader.ReadRootValueAsArray = true;
return serializer.Deserialize<T>(reader);
}
}
}
Now you can simply do:
var newArray = BsonExtensions.Deserialize<string []>(bytes);
Notes:
BSON support was moved to its own package, Newtonsoft.Json.Bson, in Json.NET 10.0.1. In this version and later versions BsonDataReader replaces the now-obsolete BsonReader.
The same extension methods can be used to deserialize a dictionary, e.g.:
var newDictionary = BsonExtensions.Deserialize<SortedDictionary<int, string>>(bytes);
By checking the contract type, ReadRootValueAsArray is set appropriately.
Demo fiddle here.
In general, you could check the data type before setting ReadRootValueAsArray to true, like this:
if (typeof(IEnumerable).IsAssignableFrom(type))
bSonReader.ReadRootValueAsArray = true;
I know this is an old thread, but I discovered an easy way to deserialize using the power of MongoDB.Driver.
You can use BsonDocument.Parse(jsonString) to deserialize a JSON object, so to deserialize a string array use this:
string jsonArray = "[\"value1\", \"value2\", \"value3\"]";
BsonArray deserializedArray = BsonDocument.Parse("{\"arr\":" + jsonArray + "}")["arr"].AsBsonArray;
deserializedArray can then be used like any array, for example in a foreach loop.

How to pass a type dynamically to IEnumerable<T> in C#

I want to pass an object type to IEnumerable<T> at runtime.
For example, I have to read CSV files, and they all return different sets of data.
So if I want to use a single method but extract a different entity type, how can I do that? In the example shown below, I want to pass a different entity instead of Student:
public List<Patient> GetAcgFileData(string fullFileName)
{
using (var sr = new StreamReader(fullFileName))
{
var reader = new CsvReader(sr);
////CSVReader will now read the whole file into an enumerable
var records = reader.GetRecords<Student>().ToList();
return records;
}
}
You can make your method generic:
public List<T> GetAcgFileData<T>(string fullFileName) {
using (var sr = new StreamReader(fullFileName)) {
var reader = new CsvReader(sr);
////CSVReader will now read the whole file into an enumerable
var records = reader.GetRecords<T>().ToList();
return records;
}
}
Then you can call it with a type GetAcgFileData<Student>("somestring");
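If the element type is only known at runtime (say, from configuration), you could close the generic method via reflection. This is only a sketch; CsvImporter is a hypothetical class holding the GetAcgFileData<T> method above, and the type name string is an assumption:
// Requires using System.Collections; and using System.Reflection;
Type entityType = Type.GetType("MyApp.Models.Student"); // resolved at runtime (hypothetical name)
MethodInfo open = typeof(CsvImporter).GetMethod(nameof(CsvImporter.GetAcgFileData));
MethodInfo closed = open.MakeGenericMethod(entityType);
// At this point the result can only be treated as a non-generic IEnumerable
var records = (IEnumerable)closed.Invoke(new CsvImporter(), new object[] { "somefile.csv" });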

Serialise nested list or alternatives

I am trying to store a collection of lists (each containing over 20,000 ints) and was hoping to use a nested list for this, since each day a new list will be added.
Eventually I need to access the data in the following way:
"Take the first value of each list and compile a new list".
Ideally I'd like to serialise a List<List<int>>, however this does not seem to work (I can serialise a List<int>). Is there a trick to doing this (preferably without any add-ons)?
If not, how would you advise me to store such data efficiently and quickly?
The way I try it now:
static void saveFunction(List<int> data, string name)
{
using (Stream stream = File.Open(name + ".bin", FileMode.OpenOrCreate))
{
BinaryFormatter bin = new BinaryFormatter();
if (stream.Length == 0)
{
List<List<int>> List = new List<List<int>>();
List.Add(data);
bin.Serialize(stream, List);
}
else
{
List<List<int>> List = (List<List<int>>)bin.Deserialize(stream);
List.Add(data);
bin.Serialize(stream, List);
}
}
}
Strangely, list.Count remains 1 and the number of ints in the list stays the same, while the file size keeps increasing.
You need to rewind the stream and clear the previous data between reading and writing:
static void saveFunction(List<int> data, string name)
{
using (Stream stream = File.Open(name + ".bin", FileMode.OpenOrCreate))
{
BinaryFormatter bin = new BinaryFormatter();
if (stream.Length == 0)
{
var List = new List<List<int>>();
List.Add(data);
bin.Serialize(stream, List);
}
else
{
var List = (List<List<int>>)bin.Deserialize(stream);
List.Add(data);
stream.SetLength(0); // Clear the old data from the file
bin.Serialize(stream, List);
}
}
}
What you are doing now is appending the new list to the end of the file while leaving the old list as-is -- which BinaryFormatter will happily read as the (first) object in the file when it is re-opened.
As for your second question, "how would you advice me to store such data efficiently and quick?", since your plan is to "take the first value of each list and compile a new list", it appears you're going to need to re-read the preceding lists when writing a new list. If that were not true, however, and each new list was independent of the preceding lists, BinaryFormatter does support writing multiple root objects to the same file. See here for details: Serializing lots of different objects into a single file
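For the read side, a minimal sketch of "take the first value of each list and compile a new list" could look like the following (same BinaryFormatter approach as above; note that BinaryFormatter is obsolete in current .NET and should not be used with untrusted input):
// Requires using System.Linq;
static List<int> GetFirstValues(string name)
{
    using (Stream stream = File.Open(name + ".bin", FileMode.Open))
    {
        BinaryFormatter bin = new BinaryFormatter();
        var all = (List<List<int>>)bin.Deserialize(stream);
        // Take the first value of each stored list and compile a new list
        return all.Select(list => list.First()).ToList();
    }
}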

How to read and write a list object into a file

I have written a list object to a file like this:
private List<string> _cacheFileList=new List<string>(4);
_cacheFileList.Add("Something");
using (StreamWriter file = new StreamWriter(#"cache.bin"))
{
file.Write(_cacheFileList);
}
Now how can I retrieve the whole list object?
Instead of using StreamWriter like that, use BinaryFormatter to serialize your list. Then you can easily retrieve your list back by deserializing. MSDN has a good example about how to do that.
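A minimal sketch of that approach, reusing the _cacheFileList from the question (List<string> is already serializable, so no extra attributes are needed):
// Write
using (var stream = File.Open("cache.bin", FileMode.Create))
{
    new BinaryFormatter().Serialize(stream, _cacheFileList);
}
// Read
List<string> restored;
using (var stream = File.Open("cache.bin", FileMode.Open))
{
    restored = (List<string>)new BinaryFormatter().Deserialize(stream);
}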
If you just want a text file with a single line per list entry, you could try the code below. Of course, you would need error handling and need to ensure that the strings in the list did not contain newlines.
// Write
List<string> _listA = new List<string>(4);
_listA.Add("Test");
_listA.Add("Test2");
_listA.Add("Test3");
_listA.Add("Test4");
System.IO.File.WriteAllLines("test.txt", _listA);
// Read
List<string> _listB = new List<string>(4);
_listB.AddRange(System.IO.File.ReadAllLines("test.txt"));
If you just want to write your list line by line, you can modify your code like this:
var cacheFileList = new List<string>(4);
cacheFileList.Add("Something");
using (var file = new StreamWriter(#"cache.bin"))
{
file.Write(string.Join("\r\n", cacheFileList));
}

Put XML into an array

I have an XML file, and I need to be able to read it into either a list or an array.
The XML:
<Restaurant>
<name>test</name>
<location>test</location>
</Restaurant>
<Restaurant>
<name>test2</name>
<location>test2</location>
</Restaurant>
All the Restaurants will have the same number of fields and the same names for the fields, but the number of <Restaurant></Restaurant> in a given xml file is unknown.
In other words, I need an array or list and be able to do this:
String name = restaurantArray[0].name;
String location = restaurantArray[0].location;
While I don't need that syntax obviously, this is the functionality I'm trying to accomplish.
If you are trying to get names of restaurants and Restaurant elements are direct child of root element:
string[] names = xdoc.Root.Elements("Restaurant")
.Select(r => (string)r.Element("name"))
.ToArray();
EDIT: If you are trying to parse whole restaurant objects:
var restaurants = from r in xdoc.Root.Elements("Restaurant")
select new {
Name = (string)r.Element("name"),
Location = (string)r.Element("location")
};
Usage:
foreach(var restaurant in restaurants)
{
// use restaurant.Name or restaurant.Location
}
You can create instances of some Restaurant class instead of anonymous objects here. You can also put the restaurants into an array with a simple restaurants.ToArray() call.
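For example, with a simple Restaurant class (a name chosen here to match the XML elements), the array the question asks for could be built like this:
public class Restaurant
{
    public string Name { get; set; }
    public string Location { get; set; }
}

Restaurant[] restaurantArray = xdoc.Root.Elements("Restaurant")
    .Select(r => new Restaurant
    {
        Name = (string)r.Element("name"),
        Location = (string)r.Element("location")
    })
    .ToArray();

string name = restaurantArray[0].Name;
string location = restaurantArray[0].Location;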
The answer by Sergey is very clear, but if you want to load the data back from a saved XML file, this may also help.
I have used the method below to load an XML file into an array (in my case it was a jagged array); here I have adapted it to your Restaurant class:
private static Restaurant[] LoadXML(string filePath)
{
    // Open the XML file
    using (var fs = new System.IO.FileStream(filePath, System.IO.FileMode.Open))
    {
        // Create an XmlSerializer for an array of Restaurant objects
        var xmlSer = new System.Xml.Serialization.XmlSerializer(typeof(Restaurant[]));
        // Deserialize the file contents into the array
        return (Restaurant[])xmlSer.Deserialize(fs);
    }
}
With this function you can read all your data like this:
Restaurant[] res = LoadXML(@"YOUR FILE PATH");
Since each Restaurant contains a name and a location, accessing them (for example res[0].Name) should now be easy.
