SqlBulkCopy writes empty string instead of NULL - C#

I have a function that inserts new values into the database:
public async Task BulkAdd(IDataReader data)
{
    if (Connection.State == ConnectionState.Broken || Connection.State == ConnectionState.Closed)
    {
        await Connection.OpenAsync();
    }
    using (SqlBulkCopy bulk = new SqlBulkCopy(Connection))
    {
        bulk.DestinationTableName = GetTableName();
        bulk.BatchSize = BATCH_SIZE;
        bulk.BulkCopyTimeout = 0; // 0 means no timeout
        bulk.EnableStreaming = true;
        await bulk.WriteToServerAsync(data);
    }
}
The insert strings are generated in order and look like:
,,11111,,7,,620,7 11111,04/15/2013 00:00:00,false,Bulgaria
and are then converted to a CsvDataReader:
var csvStreamReader = MapDataExtraWithHeaders(reader, clientId, dataExtraHeadersMap, delimiter, uploadDataId, dateFormat);
using (var csv = new CsvReader(csvStreamReader))
{
    csv.Configuration.BadDataFound = null;
    csv.Configuration.Delimiter = delimiter;
    // ADDED
    csv.Configuration.TypeConverterCache.AddConverter<string>(new EmptyAsNullConverter());
    var dataReader = new CsvDataReader(csv);
    csv.ReadHeader();
    if (!HeadersValid(csv.Context.HeaderRecord, DataHeaders))
        throw new CvtException(CVTExceptionCode.Import.InvalidHeaders);
    await _transactionsDataRepository.BulkAdd(dataReader);
}
I added a default NULL constraint like:
alter table [dbo].[Extra] add constraint [DF_Custom] default (null) for [Custom]
However, when I look at what was added, I see that an empty string was inserted instead of NULL. How can that be fixed?
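For context (my note, not part of the thread): SqlBulkCopy copies exactly what the IDataReader yields, and a DEFAULT constraint only applies when a column is omitted from the insert entirely, so an empty CSV field arrives at the server as an empty string rather than NULL. The fix is to map empties to null before they reach the bulk copy; a minimal sketch (the helper name is illustrative):

```csharp
using System;

static class NullMapping
{
    // Maps an empty or whitespace-only CSV field to null so ADO.NET
    // sends DBNull to the server instead of an empty string.
    public static object EmptyToNull(string field) =>
        string.IsNullOrWhiteSpace(field) ? null : (object)field;
}
```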

You can write a custom converter to convert empty strings to null:
public class EmptyAsNullConverter : CsvHelper.TypeConversion.StringConverter
{
    public override object ConvertFromString(string text, IReaderRow row, MemberMapData memberMapData)
    {
        if (string.IsNullOrEmpty(text))
            return null;
        return base.ConvertFromString(text, row, memberMapData);
    }
}
Credit to brandonw
Example of use:
static void Main(string[] args)
{
    var text =
        "a,b,c,e,f" + Environment.NewLine +
        "a,,c,e,f" + Environment.NewLine +
        "a,b,c,,f";
    using (var csv = new CsvReader(new StringReader(text), CultureInfo.InvariantCulture))
    {
        csv.Configuration.TypeConverterCache.AddConverter<string>(new EmptyAsNullConverter());
        while (csv.Read())
        {
            for (int i = 0; i < 5; i++)
            {
                Console.Write($"{csv.GetField<string>(i) ?? "null"}\t");
            }
            Console.WriteLine();
        }
    }
    Console.Read();
}
Output:
a b c e f
a null c e f
a b c null f

You can also bind the CSV to a class and write a converter for it (this uses the FileHelpers library):
public class EmptyStringConverter : ConverterBase
{
    public override object StringToField(string sourceString)
    {
        if (String.IsNullOrWhiteSpace(sourceString))
            return null;
        return sourceString;
    }
}

[FieldConverter(typeof(EmptyStringConverter))]
public string MyStrField;
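ConverterBase and [FieldConverter] come from the FileHelpers library, not CsvHelper. A minimal record class using this converter might look like the following (the record shape and file name are illustrative, not taken from the question):

```csharp
using FileHelpers;

// Illustrative FileHelpers record type; empty CSV values in Custom
// become null via the converter from the answer above.
[DelimitedRecord(",")]
public class ExtraRecord
{
    [FieldConverter(typeof(EmptyStringConverter))]
    public string Custom;

    public string Country;
}

// Usage sketch:
// var engine = new FileHelperEngine<ExtraRecord>();
// ExtraRecord[] records = engine.ReadFile("extra.csv");
```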


System.OutOfMemoryException in C# when Generating huge amount of byte[] objects

I'm using this code to modify a PDF template to add specific details to it:
private static byte[] GeneratePdfFromPdfFile(byte[] file, string landingPage, string code)
{
try
{
using (var ms = new MemoryStream())
{
using (var reader = new PdfReader(file))
{
using (var stamper = new PdfStamper(reader, ms))
{
string _embeddedURL = "http://" + landingPage + "/Default.aspx?code=" + code + "&m=" + eventCode18;
PdfAction act = new PdfAction(_embeddedURL);
stamper.Writer.SetOpenAction(act);
stamper.Close();
reader.Close();
return ms.ToArray();
}
}
}
}
catch(Exception ex)
{
File.WriteAllText(HttpRuntime.AppDomainAppPath + @"AttachmentException.txt", ex.Message + ex.StackTrace);
return null;
}
}
This method is called from:
public static byte[] GenerateAttachment(AttachmentExtenstion type, string Contents, string FileName, string code, string landingPage, bool zipped, byte[] File = null)
{
byte[] finalVal = null;
try
{
switch (type)
{
case AttachmentExtenstion.PDF:
finalVal = GeneratePdfFromPdfFile(File, landingPage, code);
break;
case AttachmentExtenstion.WordX:
case AttachmentExtenstion.Word:
finalVal = GenerateWordFromDocFile(File, code, landingPage);
break;
case AttachmentExtenstion.HTML:
finalVal = GenerateHtmlFile(Contents, code, landingPage);
break;
}
return zipped ? _getZippedFile(finalVal, FileName) : finalVal;
}
catch(Exception ex)
{
return null;
}
}
And here is the main caller:
foreach (var item in Recipients)
{
//...
//....
item.EmailAttachment = AttachmentGeneratorEngine.GenerateAttachment(_type, "", item.AttachmentName, item.CMPRCode, _cmpTmp.LandingDomain, _cmpTmp.AttachmentZip.Value, _cmpTmp.getFirstAttachment(item.Language, item.DefaultLanguage));
}
The AttachmentGeneratorEngine.GenerateAttachment method is called approx. 4,000 times, because I'm adding a specific PDF generated from a PDF template for every element in my list.
Recently I started getting this exception:
Exception of type 'System.OutOfMemoryException' was thrown. at System.IO.MemoryStream.ToArray()
I already implemented IDisposable in the classes and made sure all instances are released.
Note: it ran very smoothly before, and I double-checked the system's resources - 9 GB used out of 16 GB, so I had enough memory available.
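One likely contributor (my assumption; the thread itself gives no answer) is that MemoryStream.ToArray() allocates a brand-new copy of the whole buffer on every call, and arrays over roughly 85 KB land on the Large Object Heap, which can fragment under thousands of allocations even when plenty of physical memory is free. Pre-sizing the stream at least avoids repeated internal buffer regrowth:

```csharp
using System.IO;

// Sketch of a mitigation, not the poster's code: start the stream at
// roughly the input size so its internal buffer is not reallocated as
// the stamped PDF grows. ToArray() still makes one final copy.
static byte[] CopyWithPresizedBuffer(byte[] file)
{
    using (var ms = new MemoryStream(file.Length + 64 * 1024))
    {
        ms.Write(file, 0, file.Length); // stand-in for the PdfStamper output
        return ms.ToArray();
    }
}
```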
==========================================
Update:
Here is the code that loops through the list
public static bool ProcessGroupLaunch(string groupCode, int customerId, string UilangCode)
{
CampaignGroup cmpGList = GetCampaignGroup(groupCode, customerId, UilangCode)[0];
_campaigns = GetCampaigns(groupCode, customerId);
List<CampaignRecipientLib> Recipients = GetGroupRcipientsToLaunch(cmpGList.ID, customerId);
try
{
foreach (var item in _campaigns)
item.Details = GetCampaignDetails(item.CampaignId.Value, UilangCode);
Stopwatch stopWatch = new Stopwatch();
#region single-threaded ForEach
foreach (var item in Recipients)
{
CampaignLib _cmpTmp = _campaigns.FirstOrDefault(x => x.CampaignId.Value == item.CampaignId);
bool IncludeAttachment = _cmpTmp.IncludeAttachment ?? false;
bool IncludeAttachmentDoubleBarrel = _cmpTmp.IncludeAttachmentDoubleBarrel ?? false;
if (IncludeAttachment)
{
if (_cmpTmp.AttachmentExtension.ToLower().Equals("doc") || (_cmpTmp.AttachmentExtension.ToLower().Equals("docx")))
_type = AttachmentGeneratorEngine.AttachmentExtenstion.Word;
else if (_cmpTmp.AttachmentExtension.ToLower().Equals("ppt") || (_cmpTmp.AttachmentExtension.ToLower().Equals("pptx")))
_type = AttachmentGeneratorEngine.AttachmentExtenstion.PowePoint;
else if (_cmpTmp.AttachmentExtension.ToLower().Equals("xls") || (_cmpTmp.AttachmentExtension.ToLower().Equals("xlsx")))
_type = AttachmentGeneratorEngine.AttachmentExtenstion.Excel;
else if (_cmpTmp.AttachmentExtension.ToLower().Equals("pdf"))
_type = AttachmentGeneratorEngine.AttachmentExtenstion.PDF;
else if (_cmpTmp.AttachmentExtension.ToLower().Equals("html"))
_type = AttachmentGeneratorEngine.AttachmentExtenstion.HTML;
}
//set "recpient" details
item.EmailFrom = _cmpTmp.EmailFromPrefix + "@" + _cmpTmp.EmailFromDomain;
item.EmailBody = GetChangedPlaceHolders((_cmpTmp.getBodybyLangCode(string.IsNullOrEmpty(item.Language) ? item.DefaultLanguage : item.Language, item.DefaultLanguage)), item.ID, _cmpTmp.CustomerId.Value, _cmpTmp.CampaignId.Value);
if (item.EmailBody.Contains("[T-LandingPageLink]"))
{
//..
}
if (item.EmailBody.Contains("[T-FeedbackLink]"))
{
//..
}
if (item.EmailBody.Contains("src=\".."))
{
//..
}
//set flags to be used by the SMTP Queue and Scheduler
item.ReadyTobeSent = true;
item.PickupReady = false;
//add attachment to the recipient, if any.
if (IncludeAttachment)
{
item.AttachmentName = _cmpTmp.getAttachmentSubjectbyLangCode(string.IsNullOrEmpty(item.Language) ? item.DefaultLanguage : item.Language, item.DefaultLanguage) + "." + _cmpTmp.AttachmentExtension.ToLower();
try
{
if (_type == AttachmentGeneratorEngine.AttachmentExtenstion.PDF || _type == AttachmentGeneratorEngine.AttachmentExtenstion.WordX || _type == AttachmentGeneratorEngine.AttachmentExtenstion.Word)
item.EmailAttachment = AttachmentGeneratorEngine.GenerateAttachment(_type, "", item.AttachmentName, item.CMPRCode, _cmpTmp.LandingDomain, _cmpTmp.AttachmentZip.Value, _cmpTmp.getFirstAttachment(item.Language, item.DefaultLanguage));
else item.EmailAttachment = AttachmentGeneratorEngine.GenerateAttachment(_type, value, item.AttachmentName, item.CMPRCode, _cmpTmp.LandingDomain, _cmpTmp.AttachmentZip.Value);
item.AttachmentName = _cmpTmp.AttachmentZip.Value ? (_cmpTmp.getAttachmentSubjectbyLangCode(string.IsNullOrEmpty(item.Language) ? item.DefaultLanguage : item.Language, item.DefaultLanguage) + ".zip") :
_cmpTmp.getAttachmentSubjectbyLangCode(string.IsNullOrEmpty(item.Language) ? item.DefaultLanguage : item.Language, item.DefaultLanguage) + "." + _cmpTmp.AttachmentExtension.ToLower();
}
catch (Exception ex)
{
}
}
else
{
item.EmailAttachment = null;
item.AttachmentName = null;
}
}
#endregion
stopWatch.Stop();
bool res = WriteCampaignRecipientsLaunch(ref Recipients);
return res;
}
catch (Exception ex)
{
Recipients.ForEach(i => i.Dispose());
cmpGList.Dispose();
Recipients = null;
cmpGList = null;
return false;
}
finally
{
Recipients.ForEach(i => i.Dispose());
cmpGList.Dispose();
Recipients = null;
cmpGList = null;
}
}

How to store multiple xml data into single list using IEnumerable<SyncEntity> in c#

I have to store data from multiple XML files into a single IEnumerable<SyncEntity>, but I don't get the exact result I expect. Below is my sample code.
public IEnumerable<SyncEntity> GetUpdatedItemsOfType(DateTime? fromDate, string entityName, List<string> fieldsToRetrieve)
{
ConnLogger.WriteInfo("Dooors SyncConnector", "Run DOORS DXL for List of Getupdateditems of type ");
try
{
var dxlInput = $"{TmpRootFolder};{entityName};{fromDate.ToString()};{string.Join(",", fieldsToRetrieve)};{fieldsToRetrieve.Count.ToString()}";
string dxlPath = GetDxl("GetUpdatedItemsOfType.dxl");
//ActivateAsync(() =>
//{
// DoorsHandle.result = dxlInput;
// DoorsHandle.runFile(dxlPath);
//});
_doorsHandle.result = dxlInput;
_doorsHandle.runFile(dxlPath);
return GetUpdatedItemsOfTypePagination(TmpRootFolder);
}
catch (Exception ex)
{
ConnLogger.WriteException("Doors SyncConnector", ex, "Failed to get list of updateditems of type");
throw;
}
}
The above GetUpdatedItemsOfType(DateTime? fromDate, string entityName, List<string> fieldsToRetrieve) method is where I start one DOORS process.
private IEnumerable<SyncEntity> GetUpdatedItemsOfTypePagination(string folderPath)
{
int currentPage = 1;
string finishFilePath = Path.Combine(folderPath, "GetUpdateItemsOfType_Finish.xml");
while (true)
{
string xmlFileFullPath = Path.Combine(folderPath, $"GetUpdateItemsOfType{currentPage}.xml");
bool pageReadCompleted = false;
for (int i = 0; i < 1000; i++) //wait max time of 1,000*0.1 = 100 seconds
{
if (!File.Exists(xmlFileFullPath))
{
if (File.Exists(finishFilePath))
{
yield break;
}
Thread.Sleep(TimeSpan.FromSeconds(0.1));
continue;
}
List<SyncEntity> pageItems = GetUpdatedItemsPage(xmlFileFullPath);
pageReadCompleted = true;
foreach (var syncEntity in pageItems)
{
yield return syncEntity;
}
break;
}
if (!pageReadCompleted)
{
throw new ApplicationException("Timeout reached for GetUpdatedItems method...");
}
currentPage++;
}
}
The above GetUpdatedItemsOfTypePagination(string folderPath) method checks for the XML files in the given folder.
private List<SyncEntity> GetUpdatedItemsPage(string xmlFilePath)
{
List<FileAttachment> fileAttachment = new List<FileAttachment>();
var xmlData = GenericSerializer.XmlDeSerialize<UpdatedItemsResult>(File.ReadAllText(xmlFilePath));
return xmlData.Items.Select(field => new SyncEntity
{
Name = field.ObjectName,
Id = field.Id,
Modified = Convert.ToDateTime(field.LastModifiedOn),
Fields = field.Attributes.Select(attr => new EntityField
{
Name = attr.Name,
Type = MetadataManager.FromDoorsDataType(attr.Type),
Value = fileAttachment
}).ToList()
}).ToList();
//copy here relevant code from the Execute method
}
The above GetUpdatedItemsPage(string xmlFilePath) method contains the code that deserializes the XML data.
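The GenericSerializer.XmlDeSerialize<T> helper is not shown in the question; it is typically a thin wrapper over System.Xml.Serialization, along these lines (a sketch, not the poster's actual helper):

```csharp
using System.IO;
using System.Xml.Serialization;

public static class GenericSerializerSketch
{
    // Deserializes an XML string into T using XmlSerializer.
    public static T XmlDeSerialize<T>(string xml)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var reader = new StringReader(xml))
        {
            return (T)serializer.Deserialize(reader);
        }
    }
}
```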

JSON string to CSV and CSV to JSON conversion in c#

I'm working with JSON/CSV files in my ASP.NET Web API project and tried the CsvHelper and ServiceStack.Text libraries, but couldn't make it work.
The JSON file contains an array, is dynamic, and may have any number of fields.
I read the file using a StreamReader and then need to convert it into a CSV file so end users can download it.
example file text
[{"COLUMN1":"a","COLUMN2":"b","COLUMN3":"c","COLUMN4":"d","COLUMN5":"e"},
{"COLUMN1":"a","COLUMN2":"b","COLUMN3":"c","COLUMN4":"d","COLUMN5":"e"}]
JSON to CSV
public static string jsonStringToCSV(string content)
{
var jsonContent = (JArray)JsonConvert.DeserializeObject(content);
var csv = ServiceStack.Text.CsvSerializer.SerializeToCsv(jsonContent);
return csv;
}
This doesn't give me CSV data.
Some files are also comma- or tab-delimited, and I want to use CsvHelper to convert the CSV string to an IEnumerable dynamically:
public static IEnumerable StringToList(string data, string delimiter, bool HasHeader)
{
using (var csv = new CsvReader(new StringReader(data)))
{
csv.Configuration.SkipEmptyRecords = true;
csv.Configuration.HasHeaderRecord = HasHeader;
csv.Configuration.Delimiter = delimiter;
var records = csv.GetRecords();
return records;
}
}
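A caveat with this method (my note, not from the original post): GetRecords() is lazily evaluated, so the CsvReader is disposed before the caller ever enumerates the result. Materializing inside the using block avoids that; here is a sketch against a recent CsvHelper API (configuration property locations have changed across versions, so treat the details as assumptions):

```csharp
using System.Collections.Generic;
using System.Globalization;
using System.IO;
using System.Linq;
using CsvHelper;
using CsvHelper.Configuration;

public static List<dynamic> StringToList(string data, string delimiter, bool hasHeader)
{
    var config = new CsvConfiguration(CultureInfo.InvariantCulture)
    {
        Delimiter = delimiter,
        HasHeaderRecord = hasHeader,
    };
    using (var csv = new CsvReader(new StringReader(data), config))
    {
        // ToList() forces enumeration while the reader is still open.
        return csv.GetRecords<dynamic>().ToList();
    }
}
```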
I was able to solve it by deserializing to a DataTable using Json.NET, so I'm posting my own answer, but I won't mark it as accepted in case anyone has a better way to do this.
To convert JSON string to DataTable
public static DataTable jsonStringToTable(string jsonContent)
{
DataTable dt = JsonConvert.DeserializeObject<DataTable>(jsonContent);
return dt;
}
To make CSV string
public static string jsonToCSV(string jsonContent, string delimiter)
{
StringWriter csvString = new StringWriter();
using (var csv = new CsvWriter(csvString))
{
csv.Configuration.SkipEmptyRecords = true;
csv.Configuration.WillThrowOnMissingField = false;
csv.Configuration.Delimiter = delimiter;
using (var dt = jsonStringToTable(jsonContent))
{
foreach (DataColumn column in dt.Columns)
{
csv.WriteField(column.ColumnName);
}
csv.NextRecord();
foreach (DataRow row in dt.Rows)
{
for (var i = 0; i < dt.Columns.Count; i++)
{
csv.WriteField(row[i]);
}
csv.NextRecord();
}
}
}
return csvString.ToString();
}
Final Usage in Web API
string csv = jsonToCSV(content, ",");
HttpResponseMessage result = new HttpResponseMessage(HttpStatusCode.OK);
result.Content = new StringContent(csv);
result.Content.Headers.ContentType = new MediaTypeHeaderValue("text/csv");
result.Content.Headers.ContentDisposition = new ContentDispositionHeaderValue("attachment") { FileName = "export.csv" };
return result;
I don't know if it's too late to offer a solution, but in case you want to explore an open-source library for the job, here is one.
Cinchoo ETL makes it easy to convert JSON to CSV with a few lines of code:
using (var r = new ChoJSONReader("sample.json"))
{
using (var w = new ChoCSVWriter("sample.csv").WithFirstLineHeader())
{
w.Write(r);
}
}
For more information / source, go to https://github.com/Cinchoo/ChoETL
Nuget package:
.NET Framework:
Install-Package ChoETL.JSON
.NET Core:
Install-Package ChoETL.JSON.NETStandard
Sample fiddle: https://dotnetfiddle.net/T3u4W2
Full Disclosure: I'm the author of this library.
Had the same problem recently, and I believe there is a slightly more elegant solution using System.Dynamic.ExpandoObject and CsvHelper. It is less code, and hopefully the performance is similar or better compared to the DataTable approach.
public static string JsonToCsv(string jsonContent, string delimiter)
{
var expandos = JsonConvert.DeserializeObject<ExpandoObject[]>(jsonContent);
using (var writer = new StringWriter())
{
using (var csv = new CsvWriter(writer))
{
csv.Configuration.Delimiter = delimiter;
csv.WriteRecords(expandos as IEnumerable<dynamic>);
}
return writer.ToString();
}
}
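A quick usage sketch for the method above (the input is the sample array from the question; the exact output depends on the CsvHelper version in use):

```csharp
// Header columns come from the keys of the first deserialized object.
string json = "[{\"COLUMN1\":\"a\",\"COLUMN2\":\"b\"}," +
              "{\"COLUMN1\":\"c\",\"COLUMN2\":\"d\"}]";
string csv = JsonToCsv(json, ",");
// Expected shape: a COLUMN1,COLUMN2 header row followed by a,b and c,d
```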
This code works for me:
Three functions (check, parse, and a helper):
private bool IsValidJson(string strInput)
{
try
{
if (string.IsNullOrWhiteSpace(strInput)) { return false; }
strInput = strInput.Trim();
if ((strInput.StartsWith("{") && strInput.EndsWith("}")) || (strInput.StartsWith("[") && strInput.EndsWith("]")))
{
try
{
_ = JToken.Parse(strInput);
return true;
}
catch
{
return false;
}
}
return false;
}
catch { throw; }
}
private string ParseJsonToCsv(string json)
{
try
{
XmlNode xml = JsonConvert.DeserializeXmlNode("{records:{record:" + json + "}}");
XmlDocument xmldoc = new XmlDocument(); xmldoc.LoadXml(xml.InnerXml);
DataSet dataSet = new DataSet(); dataSet.ReadXml(new XmlNodeReader(xmldoc));
string csv = DTableToCsv(dataSet.Tables[0], ",");
return csv;
}
catch { throw; }
}
private string DTableToCsv(DataTable table, string delimator)
{
try
{
var result = new StringBuilder();
for (int i = 0; i < table.Columns.Count; i++)
{
result.Append(table.Columns[i].ColumnName);
result.Append(i == table.Columns.Count - 1 ? "\n" : delimator);
}
foreach (DataRow row in table.Rows)
for (int i = 0; i < table.Columns.Count; i++)
{
result.Append(row[i].ToString());
result.Append(i == table.Columns.Count - 1 ? "\n" : delimator);
}
return result.ToString().TrimEnd(new char[] { '\r', '\n' });
}
catch { throw; }
}
public void Convert2Json()
{
try
{
if (FileUpload1.PostedFile.FileName != string.Empty)
{
string[] FileExt = FileUpload1.FileName.Split('.');
string FileEx = FileExt[FileExt.Length - 1];
if (FileEx.ToLower() == "csv")
{
string SourcePath = Server.MapPath("Resources//" + FileUpload1.FileName);
FileUpload1.SaveAs(SourcePath);
string Destpath = (Server.MapPath("Resources//" + FileExt[0] + ".json"));
StreamWriter sw = new StreamWriter(Destpath);
var csv = new List<string[]>();
var lines = System.IO.File.ReadAllLines(SourcePath);
foreach (string line in lines)
csv.Add(line.Split(','));
string json = new
System.Web.Script.Serialization.JavaScriptSerializer().Serialize(csv);
sw.Write(json);
sw.Close();
TextBox1.Text = Destpath;
MessageBox.Show("File is converted to json.");
}
else
{
MessageBox.Show("Invalid File");
}
}
else
{
MessageBox.Show("File Not Found.");
}
}
catch (Exception ex)
{
MessageBox.Show(ex.Message);
}
}
The code below compiles successfully with the latest stable version of the CsvHelper NuGet package.
public static string JsonToCsv(string jsonContent, string delimeter)
{
var expandos = JsonConvert.DeserializeObject<ExpandoObject[]>(jsonContent);
using (TextWriter writer = new StringWriter())
{
CsvConfiguration csvConfiguration = new CsvConfiguration(System.Globalization.CultureInfo.CurrentCulture);
csvConfiguration.Delimiter = delimeter;
using (var csv = new CsvWriter(writer, csvConfiguration))
{
csv.WriteRecords((expandos as IEnumerable<dynamic>));
}
return writer.ToString();
}
}
Newer CsvHelper versions require a CultureInfo argument in the CsvWriter constructor:
using System.Globalization;
using (var csv = new CsvWriter(csvString, CultureInfo.CurrentCulture)) {
...
}

Returning JSON from dynamic SQL query

In my data access layer, when I want to serialize something into JSON and give it to the client, I've been doing something like:
using (var con = new SqlConnection(connectionString))
{
    using (var cmd = new SqlCommand("spGetLengthsOfStay", con))
    {
        con.Open();
        cmd.CommandType = System.Data.CommandType.StoredProcedure;
        SqlDataReader rdr = cmd.ExecuteReader();
        while (rdr.Read())
        {
            var los = new LOS();
            los.VisitId = (int)rdr["VisitId"];
            los.PatientId = (int)rdr["PatientId"];
            los.Gender = (string)rdr["Gender"];
            los.Age = (int)rdr["Age"];
            los.Discharge = (string)rdr["Discharge"];
            los.LengthOfStay = (int)rdr["LengthOfStay"];
            losList.Add(los);
        }
    }
}
There are some instances where I need to query the database with a dynamically generated SQL query, so I don't always know the properties needed to create an instance of a concrete type, add it to a list, and return the list. What's the preferred method for getting the results of a SQL query back to the client all at once, without using a concrete type, in .NET MVC?
using System;
using System.Collections.Generic;
using System.Linq;
using System.Dynamic;
using Newtonsoft.Json;
// microsoft sqlserver
using System.Data.SqlClient;
// oracle
using Oracle.ManagedDataAccess.Client;
namespace InqdWeb
{
public class Dbio
{
//
public static class Consts
{
public const string msgname = "retmsg";
public const string valname = "retval";
public const string jsond = "{ }";
public const string jsonr = "{ \"" + msgname + "\": \"OK\" }";
}
//
//
// »»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»
// core functions
// »»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»»
//
//
// with a "sql statement"
// and a connection id,
// prepare the actual sql to get data
// return the result as json
public static string sqljson
(string pi_sql
, string pi_conn
)
{
// empty data
var vd = Consts.jsond;
// success message
var vr = Consts.jsonr;
string msgout = "00";
var ld = new List<dynamic>();
ld = sqlmaster(pi_sql, pi_conn, out msgout);
//
if (msgout.Substring(0, 2) == "00") // not empty and no errors
{
vd = JsonConvert.SerializeObject(ld);
vr = Consts.jsonr.Replace("OK", "00");
}
if (msgout.Substring(0, 2) == "10") // empty and no errors
{
vr = Consts.jsonr.Replace("OK", "10");
}
if (msgout.Substring(1, 1) == "1") // error
{
vd = JsonConvert.SerializeObject(ld);
vr = Consts.jsonr.Replace("OK", msgout);
}
// return json with 2 collections: d with data, r with status and message
var vt = jsonmerge(vd, vr);
return vt;
}
//
//
//
// with a sql
// and a conn id
// return data as dynamic list
public static List<dynamic> sqlmaster
(string pi_sql
, string pi_conn
, out string po_msg
)
{
string sql = pi_sql;
// (the body is elided in the original post: it resolves pi_conn to a
// connection string and calls sqldo, which produces lista and msgout)
var lista = sqldo(sql, pi_conn, out string msgout);
// result
po_msg = msgout;
// po_msg pos1 empty: 1 has rows: 0
// pos2 error: >0 no error: 0
// pos3... error message
return lista;
}
//
//
// with a sql statement
// and a connection string
// return the result on a dynamic list
// plus a string with
// pos1 error 0-ok 1-error
// pos2 list empty 0-ok 1-list is empty
// pos3... message return code from non-select or error message
public static List<dynamic> sqldo
(string pi_sql
, string pi_connstring
, out string msgout
)
{
// variables
string sql = pi_sql;
var lista = new List<dynamic>();
int retcode;
msgout = "0";
//
string ConnString = pi_connstring;
//
//
//
// Microsoft SqlServer
if (SqlFlavor == "Ms")
{
using (SqlConnection con = new SqlConnection(ConnString))
{
try
{
con.Open();
SqlCommand cmd = new SqlCommand(sql, con);
if (sqltype == "R")
{
SqlDataReader reada = cmd.ExecuteReader();
string datatype = "-";
string colname = "-";
while (reada.Read())
{
var obj = new ExpandoObject();
var d = obj as IDictionary<String, object>;
//
for (int index = 0; index < reada.FieldCount; index++)
{
datatype = reada.GetDataTypeName(index);
colname = reada.GetName(index);
bool isnul = reada.IsDBNull(index);
if (!isnul)
{
// add datatypes as needed
switch (datatype)
{
case "int":
d[colname] = reada.GetValue(index);
break;
case "varchar":
d[colname] = reada.GetString(index);
break;
case "nvarchar":
d[colname] = reada.GetString(index);
break;
case "date":
d[colname] = reada.GetDateTime(index);
break;
default:
d[colname] = reada.GetString(index);
break;
}
}
else
{
d[colname] = "";
}
}
lista.Add(obj);
}
reada.Close();
}
}
catch (Exception ex)
{
msgout = "11" + ex.Message.ToString();
}
}
}
//
// Oracle
if (SqlFlavor == "Oa")
{
// Or uses a "
sql = sql.Replace("[", "\"");
sql = sql.Replace("]", "\"");
using (OracleConnection con = new OracleConnection(ConnString))
{
try
{
con.Open();
//
OracleCommand cmd = new OracleCommand(sql, con);
OracleDataReader reada = cmd.ExecuteReader();
string datatype = "-";
string colname = "-";
while (reada.Read())
{
var obj = new ExpandoObject();
var d = obj as IDictionary<String, object>;
// browse every column
for (int index = 0; index < reada.FieldCount; index++)
{
datatype = reada.GetDataTypeName(index);
colname = reada.GetName(index);
bool isnul = reada.IsDBNull(index);
if (!isnul)
{
// add datatypes as needed
switch (datatype)
{
case "Decimal":
d[colname] = reada.GetValue(index);
break;
case "Varchar":
d[colname] = reada.GetString(index);
break;
default:
d[colname] = reada.GetString(index);
break;
}
}
else
{
d[colname] = "";
}
}
lista.Add(obj);
}
reada.Close();
//
}
catch (Exception ex)
{
msgout = "11" + ex.Message.ToString();
}
}
}
//
//
//
return lista;
}
//
//
}
Use it in your controller
string vret = "{'r':{'retval': 'OK' }}";
string sqltxt;
string connt;
connt = ConfigurationManager.ConnectionStrings["<your connection>"].ConnectionString;
sqltxt = "<your select>";
vret = Dbio.sqljson(sqltxt, connt); // (the original post passed a third "MsX" flavor argument not shown in the class above)
return Content(vret, "application/json");
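The same idea can be sketched more compactly with plain dictionaries and Json.NET, without the flavor/status plumbing (names here are illustrative, not the poster's API):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;
using Newtonsoft.Json;

// Reads any SELECT into a list of column-name -> value dictionaries and
// serializes it; DBNull is mapped to null so it serializes as JSON null.
static string QueryToJson(string connectionString, string sql)
{
    var rows = new List<Dictionary<string, object>>();
    using (var con = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(sql, con))
    {
        con.Open();
        using (var rdr = cmd.ExecuteReader())
        {
            while (rdr.Read())
            {
                var row = new Dictionary<string, object>();
                for (int i = 0; i < rdr.FieldCount; i++)
                    row[rdr.GetName(i)] = rdr.IsDBNull(i) ? null : rdr.GetValue(i);
                rows.Add(row);
            }
        }
    }
    return JsonConvert.SerializeObject(rows);
}
```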
