I am following this example to map custom column names to my class model:
CsvHelper Mapping by Name
In this particular part:
public FooMap()
{
Map(m => m.Id).Name("ColumnA");
Map(m => m.Name).Name("ColumnB");
}
Is it possible to use a string for the property instead of hard-coding the lambda? Something like this:
public FooMap()
{
Map("Col1").Name("ColumnA");
Map("Col2").Name("ColumnB");
}
"Col1" and "Col2" are the property of my class model. I've tried to use reflection but it didn't work:
Map(x => typeof(MyClassModel).GetProperty("Col1")).Name("ColumnA");
Please let me know whether what I am trying to achieve is possible. Some additional info: the column mappings (source and destination) are both stored in a table.
Thanks!
This should allow you to use a string to map both the property name and the header name.
void Main()
{
var mapping = new Dictionary<string, string>
{
{"Id","FooId"},
{"Name","FooName"}
};
using (var reader = new StringReader("FooId,FooName\n1,Jordan"))
using (var csv = new CsvReader(reader, CultureInfo.InvariantCulture))
{
var fooMap = new DefaultClassMap<Foo>();
fooMap.Map(mapping);
csv.Context.RegisterClassMap(fooMap);
var records = csv.GetRecords<Foo>().ToList();
}
}
public static class CsvHelperExtensions
{
public static void Map<T>(this ClassMap<T> classMap, IDictionary<string, string> csvMappings)
{
foreach (var mapping in csvMappings)
{
var property = typeof(T).GetProperty(mapping.Key);
if (property == null)
{
throw new ArgumentException($"Class {typeof(T).Name} does not have a property named {mapping.Key}");
}
classMap.Map(typeof(T), property).Name(mapping.Value);
}
}
}
public class Foo
{
public int Id { get; set; }
public string Name { get; set; }
}
As another approach, define an XML/JSON config file listing the columns you want to map. Write a parser that reads the config and returns the columns to be mapped dynamically. This approach allows you to map any number of columns on the fly, without having to recompile your code.
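For example, a minimal sketch of the JSON variant, assuming a hypothetical column-mappings.json file and reusing the dictionary-based Map extension method from the answer above (Newtonsoft.Json used for parsing):
// column-mappings.json (hypothetical):
// { "Id": "ColumnA", "Name": "ColumnB" }
using System.Collections.Generic;
using System.IO;
using Newtonsoft.Json;
public static class MappingConfigLoader
{
    // Reads property-name -> CSV-header mappings from a JSON config file,
    // so new columns can be mapped without recompiling.
    public static Dictionary<string, string> Load(string path)
    {
        var json = File.ReadAllText(path);
        return JsonConvert.DeserializeObject<Dictionary<string, string>>(json);
    }
}
// Usage with the dictionary-based Map extension from the other answer:
// var fooMap = new DefaultClassMap<Foo>();
// fooMap.Map(MappingConfigLoader.Load("column-mappings.json"));
// csv.Context.RegisterClassMap(fooMap);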
I'm working on a C# Windows service that reads a CSV file into a List using CsvHelper along with its map-by-index class map functionality. I would like to store the original raw data row in each model.
I've tried using Map(m => m.Row).Index(-1); but that did not work. I also tried ConvertUsing, but I get a message that MemberMap does not contain a definition for 'ConvertUsing'.
The RegisterClassMap and csv.GetRecords functionality is doing a bulk read that doesn't give me an opportunity to capture the original raw data row.
Any help would be greatly appreciated. I need to create an email with the status (from sending the data to a microservice) and the original raw data too, and would love to store it while CsvHelper is reading the file.
void Main()
{
var s = new StringBuilder();
s.Append("Id,Name\r\n");
s.Append("1,one\r\n");
var config = new CsvConfiguration(CultureInfo.InvariantCulture)
{
};
using (var reader = new StringReader(s.ToString()))
using (var csv = new CsvReader(reader, config))
{
csv.Context.RegisterClassMap<FooMap>();
csv.GetRecords<Foo>().ToList().Dump();
}
}
private class Foo
{
public int Id { get; set; }
public string Name { get; set; }
public string RawRow { get; set; }
}
private class FooMap : ClassMap<Foo>
{
public FooMap()
{
Map(m => m.Id);
Map(m => m.Name);
Map(m => m.RawRow).Convert(args => args.Row.Parser.RawRecord);
}
}
.Dump() is a LINQPad method.
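Outside LINQPad, a plain console dump could stand in for .Dump(), for example (hypothetical replacement):
// Replace the .Dump() call above with something like this:
var records = csv.GetRecords<Foo>().ToList();
foreach (var record in records)
{
    Console.WriteLine($"{record.Id} | {record.Name} | raw: {record.RawRow}");
}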
I have a project entity structure like the one below, with a project name and project number, along with some list objects as well as other properties.
public class DesignProject
{
[Key, GraphQLNonNullType]
public string ProjectNumber { get; set; }
public string Name { get; set; }
[Column(TypeName = "jsonb")]
public ProjectSectionStatus SectionStatuses { get; set; } = new ProjectSectionStatus();
[ForeignKey("AshraeClimateZone"), GraphQLIgnore]
public Guid? AshraeClimateZoneId { get; set; }
public virtual AshraeClimateZone AshraeClimateZone { get; set; }
[Column(TypeName = "jsonb")]
public List<ProjectObject<ClassA>> classAList { get; set; } = new List<ProjectObject<ClassA>>();
[Column(TypeName = "jsonb")]
public List<ProjectObject<ClassB>> classBList { get; set; } = new List<ProjectObject<ClassB>>();
// ... some more jsonb columns
}
and the project object class like this
public class ProjectObject<T>
{
public Guid? Id { get; set; }
public T OriginalObject { get; set; }
public T ModifiedObject { get; set; }
[GraphQLIgnore, JsonIgnore]
public T TargetObject
{
get
{
return ModifiedObject ?? OriginalObject;
}
}
}
and ClassA entity structure like as below
public class ClassA
{
public string Edition { get; set; }
public string City { get; set; }
}
I have some similar child entities (like ClassA) as above, and I want to copy the contents and statuses from one project entity to another project entity.
I have a project entity with ProjectNumber 1212 and another project with ProjectNumber 23323, and I would like to copy the entire project contents from 1212 to 23323. Is there a way to achieve this in C#? I am using .NET Core with Entity Framework Core.
The source design project that I am going to copy has the same structure as the destination design project. I am fine with overriding the destination project's values, but I don't want to update the project number.
Could anyone please let me know how I can achieve this copy? Thanks in advance!
Please let me know if I need to add any details to this question.
Update: deep copy related code
public InsertResponse<string> CopyBookmarkproject(string SourceProjectNumber, string destinationProjectNumber)
{
var sourceDesignProject = this._dbContext.DesignProject.Where(a => a.ProjectNumber == SourceProjectNumber).SingleOrDefault();
var destinationProject = this._dbContext.DesignProject.Where(a => a.ProjectNumber == destinationProjectNumber).SingleOrDefault();
CopyProject(sourceDesignProject, destinationProject);
// Need to update the Db context at here after deep copy
}
private void CopyProject(DesignProject sourceDesignProject, DesignProject destinationProject)
{
destinationProject.classAList = sourceDesignProject.classAList; // not sure whether this copy will work
destinationProject.AshraeClimateZone = sourceDesignProject.AshraeClimateZone; // not sure whether this copy will work either
}
Updated solution 2:
var sourceDesignProject = this._dbContext.DesignProjects.AsNoTracking()
.Where(a => a.ProjectNumber == sourceProjectNumber)
.Include(a => a.PrimaryBuildingType)
.Include(a => a.AshraeClimateZone).SingleOrDefault();
var targetDesignProject = this._dbContext.DesignProjects.Where(a => a.ProjectNumber == targetProjectNumber).SingleOrDefault();
sourceDesignProject.ProjectNumber = targetDesignProject.ProjectNumber;
sourceDesignProject.SectionStatuses.AirSystemsSectionStatus = Entities.Enums.ProjectSectionStage.INCOMPLETE;
sourceDesignProject.SectionStatuses.CodesAndGuidelinesSectionStatus = Entities.Enums.ProjectSectionStage.INCOMPLETE;
sourceDesignProject.SectionStatuses.ExecutiveSummarySectionStatus = Entities.Enums.ProjectSectionStage.INCOMPLETE;
sourceDesignProject.SectionStatuses.ExhaustEquipmentSectionStatus = Entities.Enums.ProjectSectionStage.INCOMPLETE;
sourceDesignProject.SectionStatuses.InfiltrationSectionStatus = Entities.Enums.ProjectSectionStage.INCOMPLETE;
this._dbContext.Entry(sourceDesignProject).State = EntityState.Modified; // getting the below error at this line
this._dbContext.SaveChanges();
ok = true;
I am getting an error like the one below:
The instance of entity type 'DesignProject' cannot be tracked because another instance with the same key value for {'ProjectNumber'} is already being tracked. When attaching existing entities, ensure that only one entity instance with a given key value is attached.
There's a simple and short solution: use .AsNoTracking() for the source entry, assign the destination's ProjectNumber to the source, and then set its EntityState to Modified. It will copy all the properties to the destination entry:
var sourceDesignProject = this._dbContext.DesignProject.AsNoTracking().Where(a => a.ProjectNumber == SourceProjectNumber).SingleOrDefault();
var destinationProject = this._dbContext.DesignProject.Where(a => a.ProjectNumber == destinationProjectNumber).SingleOrDefault();
sourceDesignProject.ProjectNumber = destinationProject.ProjectNumber;
this._dbContext.Entry(sourceDesignProject).State = EntityState.Modified;
this._dbContext.SaveChanges();
Make sure your relational (navigation) properties are loaded beforehand.
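If you then hit the 'cannot be tracked' error from the question, one hedged way around it (a sketch against the question's model, not verified) is to avoid materializing a second tracked DesignProject with the destination key, for example by reading only the key value:
// Sketch: only the source entity is materialized; the destination key is read as a scalar,
// so no second DesignProject instance with that ProjectNumber ends up tracked.
var destinationKey = this._dbContext.DesignProjects
    .Where(a => a.ProjectNumber == destinationProjectNumber)
    .Select(a => a.ProjectNumber)
    .SingleOrDefault();

var sourceDesignProject = this._dbContext.DesignProjects.AsNoTracking()
    .Include(a => a.AshraeClimateZone)   // load navigation properties before working with the detached copy
    .Where(a => a.ProjectNumber == sourceProjectNumber)
    .SingleOrDefault();

sourceDesignProject.ProjectNumber = destinationKey;
this._dbContext.Entry(sourceDesignProject).State = EntityState.Modified;
this._dbContext.SaveChanges();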
You can make use of reflection. Since you have both the source and destination objects, and they are of the same type, you can traverse the source object, fetch its property values, and copy them across.
Something like:
if (inputObjectA != null && inputObjectB != null)
{
    // create variables to store object values
    object value1, value2;
    // get all public properties of the object using reflection
    PropertyInfo[] properties = inputObjectA.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance);
    foreach (PropertyInfo propertyInfo in properties)
    {
        // get the property values of both objects
        value1 = propertyInfo.GetValue(inputObjectA, null);
        value2 = propertyInfo.GetValue(inputObjectB, null);
        // copy the source value onto the destination object
        propertyInfo.SetValue(inputObjectB, value1, null);
    }
}
Ignore the properties that do not need to be copied. The method will have a signature like the one below; a fuller sketch follows.
CopyObjects(object inputObjectA, object inputObjectB, string[] ignorePropertiesList)
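A hedged sketch of that method (a shallow copy via reflection; names listed in ignorePropertiesList keep their destination values; requires System.Linq and System.Reflection):
public static void CopyObjects(object inputObjectA, object inputObjectB, string[] ignorePropertiesList)
{
    if (inputObjectA == null || inputObjectB == null)
    {
        return;
    }

    // get all public instance properties of the source object
    PropertyInfo[] properties = inputObjectA.GetType().GetProperties(BindingFlags.Public | BindingFlags.Instance);

    foreach (PropertyInfo propertyInfo in properties)
    {
        // skip properties flagged as "do not copy"
        if (ignorePropertiesList != null && ignorePropertiesList.Contains(propertyInfo.Name))
        {
            continue;
        }

        // only copy properties that can be both read and written
        if (!propertyInfo.CanRead || !propertyInfo.CanWrite)
        {
            continue;
        }

        object value = propertyInfo.GetValue(inputObjectA, null);
        propertyInfo.SetValue(inputObjectB, value, null);
    }
}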
Two choices come to mind for a deep copy of objects:
1- Using BinaryFormatter (needs the [Serializable] attribute on your class; note that BinaryFormatter is obsolete in newer .NET versions. See the original answer):
public static T DeepClone<T>(this T obj)
{
using (var ms = new MemoryStream())
{
var formatter = new BinaryFormatter();
formatter.Serialize(ms, obj);
ms.Position = 0;
return (T) formatter.Deserialize(ms);
}
}
2- Using a JSON serializer like Newtonsoft.Json:
public static T CloneJson<T>(this T source)
{
// Don't serialize a null object, simply return the default for that object
if (Object.ReferenceEquals(source, null))
{
return default(T);
}
// initialize inner objects individually
// for example in default constructor some list property initialized with some values,
// but in 'source' these items are cleaned -
// without ObjectCreationHandling.Replace default constructor values will be added to result
var deserializeSettings = new JsonSerializerSettings {ObjectCreationHandling = ObjectCreationHandling.Replace};
return JsonConvert.DeserializeObject<T>(JsonConvert.SerializeObject(source), deserializeSettings);
}
Update
Here's the code you can use with the Newtonsoft.Json library:
private void CopyProject(DesignProject sourceDesignProject, ref DesignProject destinationProject)
{
if (Object.ReferenceEquals(sourceDesignProject, null))
{
destinationProject = null;
return;
}
var deserializeSettings = new JsonSerializerSettings {ObjectCreationHandling = ObjectCreationHandling.Replace};
var destProjNumber = destinationProject.ProjectNumber;
destinationProject = JsonConvert.DeserializeObject<DesignProject>
(JsonConvert.SerializeObject(sourceDesignProject), deserializeSettings);
destinationProject.ProjectNumber = destProjNumber;
}
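To address the "Need to update the Db context at here after deep copy" comment in the question, a hypothetical way to persist the clone (assuming destinationProject was loaded with AsNoTracking so the freshly deserialized instance can be attached without a duplicate-key tracking conflict):
var destinationProject = this._dbContext.DesignProject.AsNoTracking()
    .Where(a => a.ProjectNumber == destinationProjectNumber)
    .SingleOrDefault();

CopyProject(sourceDesignProject, ref destinationProject);

// The clone is a new, untracked instance carrying the destination key,
// so attach it and mark it for update before saving.
this._dbContext.Update(destinationProject);
this._dbContext.SaveChanges();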
I always prefer using AutoMapper for this type of work. It's optimized and carries a lower risk of mistakes, and it's better than the serialize/deserialize approach, which can cause performance issues.
In your constructor (or in Startup.cs, passing it in via dependency injection):
var config = new MapperConfiguration(cfg => {
    // DesignProject's key is ProjectNumber (it has no Id property), and the question asks not to change it
    cfg.CreateMap<DesignProject, DesignProject>().ForMember(x => x.ProjectNumber, opt => opt.Ignore());
    cfg.CreateMap<ProjectObject<ClassA>, ProjectObject<ClassA>>().ForMember(x => x.Id, opt => opt.Ignore());
    cfg.CreateMap<ProjectObject<ClassA>, ProjectObject<ClassB>>().ForMember(x => x.Id, opt => opt.Ignore());
    cfg.CreateMap<ProjectObject<ClassB>, ProjectObject<ClassB>>().ForMember(x => x.Id, opt => opt.Ignore());
    cfg.CreateMap<ProjectObject<ClassB>, ProjectObject<ClassA>>().ForMember(x => x.Id, opt => opt.Ignore());
});
var mapper = config.CreateMapper();
In your method :
var newProject = mapper.Map(sourceProject, new DesignProject());
OR
mapper.Map(sourceProject, targetProject);
This shouldn't work as-is. If I understand correctly, you want to copy all elements from one list into a new list, but as new objects. In that case you could use an extension method to clone the complete list, as described here (link). For updating navigation properties, check this post (link).
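For illustration, a hedged sketch of such a list-clone extension, reusing the JSON round-trip deep copy from the answers above (assumes Newtonsoft.Json, System.Linq, and System.Collections.Generic):
public static class ListCloneExtensions
{
    // Deep-copies every element by round-tripping it through JSON,
    // so the new list shares no object references with the source list.
    public static List<T> CloneList<T>(this List<T> source)
    {
        return source
            .Select(item => JsonConvert.DeserializeObject<T>(JsonConvert.SerializeObject(item)))
            .ToList();
    }
}
// Usage (hypothetical):
// destinationProject.classAList = sourceDesignProject.classAList.CloneList();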
I have a table called Asset. It has a lot of columns. I only want to select two of them and use them separately.
Both of these columns are strings.
LINQ query:
public static List<string> GetAssetIdsWithNames()
{
using (var db = DbManager.Get())
{
var result = db.Assets.SelectMany(i=> new[] { i.AssetName, i.AssetId }).Distinct().ToList();
return result;
}
}
Where I want to use them :
var assetList = AssetManager.GetAssetIdsWithNames();
//CURRENCYBOX IS A DROPDOWN
CurrencyBox.DataSource = assetList;
CurrencyBox.DataBind();
foreach (var item in assetList)
{
CurrencyBox.DataValueField = //asset id goes here
CurrencyBox.DataTextField =//asset name goes here
break;
}
You cannot access the anonymous type outside of the local scope.
Anonymous types can only be returned as Object outside their local scope and their properties inspected via reflection.
So in this scenario, you are likely better off using a typed data contract, mapping from your Asset entity instead, and then accessing it from your calling method.
Your use of SelectMany seems odd too; you are probably after Select instead.
public class AssetDto
{
public string Name { get;set; }
public string Id { get; set; }
}
public static List<AssetDto> GetAssetIdsWithNames()
{
using (var db = DbManager.Get())
{
var result = db.Assets.Select(i=> new AssetDto { Name = i.AssetName, Id = i.AssetId }).ToList();
return result;
}
}
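With the DTO in place, the dropdown binding from the question becomes straightforward (hypothetical usage; set DataValueField and DataTextField before calling DataBind):
var assetList = AssetManager.GetAssetIdsWithNames();

// Tell the dropdown which DTO properties to use for value and display text, then bind.
CurrencyBox.DataValueField = nameof(AssetDto.Id);
CurrencyBox.DataTextField = nameof(AssetDto.Name);
CurrencyBox.DataSource = assetList;
CurrencyBox.DataBind();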
You could use named value tuples for that, so you don't need to create an extra class:
public static List<(string Name, string Id)> GetAssetWithIds()
{
using (var db = DbManager.Get())
{
var result = db.Assets
.Select(a => new { a.AssetName, a.AssetId })
.Distinct().AsEnumerable()
.Select(a => (a.AssetName, a.AssetId))
.ToList();
return result;
}
}
You will need to add the System.ValueTuple NuGet package.
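One hedged caveat: ASP.NET data binding resolves DataValueField/DataTextField via public properties, and tuple elements are fields, so binding by name may not work; populating the list items manually sidesteps that:
// Hypothetical usage with the tuple-returning method and the dropdown from the question.
foreach (var (name, id) in AssetManager.GetAssetWithIds())
{
    CurrencyBox.Items.Add(new ListItem(name, id.ToString()));
}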
I have an immutable class that I want to write to and read from a CSV file. The issue is I am getting an exception when reading the CSV despite having mapped the object and set up a configuration that should allow this to work.
To do this I am using CsvHelper. The immutable class looks like the following.
public class ImmutableTest
{
public Guid Id { get; }
public string Name { get; }
public ImmutableTest(string name) : this(Guid.NewGuid(), name)
{
}
public ImmutableTest(Guid id, string name)
{
Id = id;
Name = name;
}
}
I have no issue writing this to a CSV file, but when I try to read it from a file, I get the following exception.
No members are mapped for type 'CsvTest.Program+ImmutableTest'
However, I have mapped the members for this class in the map class below.
public sealed class ImmutableTestMap : ClassMap<ImmutableTest>
{
public ImmutableTestMap()
{
Map(immutableTest => immutableTest.Id)
.Index(0)
.Name(nameof(ImmutableTest.Id).ToUpper());
Map(immutableTest => immutableTest.Name)
.Index(1)
.Name(nameof(ImmutableTest.Name));
}
}
I have also tried to configure the reader to use the constructor to build the object by using the following configuration.
Configuration config = new Configuration
{
IgnoreBlankLines = true
};
config.RegisterClassMap<ImmutableTestMap>();
config.ShouldUseConstructorParameters = type => true;
config.GetConstructor = type => type.GetConstructors()
.MaxBy(constructor => constructor.GetParameters().Length)
.FirstOrDefault();
None of this seems to be working. Where am I going wrong?
Complete MCVE .NET Framework Console Example
Install the packages
Install-Package CsvHelper
Install-Package morelinq
Sample console program
using System;
using System.IO;
using CsvHelper;
using CsvHelper.Configuration;
using MoreLinq;
namespace CsvTest
{
class Program
{
static void Main()
{
Configuration config = new Configuration
{
IgnoreBlankLines = true
};
config.RegisterClassMap<ImmutableTestMap>();
config.ShouldUseConstructorParameters = type => true;
config.GetConstructor = type => type.GetConstructors()
.MaxBy(constructor => constructor.GetParameters().Length)
.FirstOrDefault();
const string filePath = "Test.csv";
using (FileStream file = new FileStream(filePath, FileMode.Create))
using (StreamWriter fileWriter = new StreamWriter(file))
using (CsvSerializer csvSerializer = new CsvSerializer(fileWriter, config))
using (CsvWriter csvWriter = new CsvWriter(csvSerializer))
{
csvWriter.WriteHeader<ImmutableTest>();
csvWriter.NextRecord();
csvWriter.WriteRecord(new ImmutableTest("Test 1"));
csvWriter.NextRecord();
csvWriter.WriteRecord(new ImmutableTest("Test 2"));
csvWriter.NextRecord();
}
using (FileStream file = new FileStream(filePath, FileMode.Open))
using (StreamReader fileReader = new StreamReader(file))
using (CsvReader csvReader = new CsvReader(fileReader, config))
{
foreach (ImmutableTest record in csvReader.GetRecords<ImmutableTest>())
{
Console.WriteLine(record.Id);
Console.WriteLine(record.Name);
Console.WriteLine();
}
}
}
public sealed class ImmutableTestMap : ClassMap<ImmutableTest>
{
public ImmutableTestMap()
{
Map(immutableTest => immutableTest.Id)
.Index(0)
.Name(nameof(ImmutableTest.Id).ToUpper());
Map(immutableTest => immutableTest.Name)
.Index(1)
.Name(nameof(ImmutableTest.Name));
}
}
public class ImmutableTest
{
public Guid Id { get; }
public string Name { get; }
public ImmutableTest(string name) : this(Guid.NewGuid(), name)
{
}
public ImmutableTest(Guid id, string name)
{
Id = id;
Name = name;
}
}
}
}
If the type is immutable, it will use constructor mapping instead. Your constructor parameter names need to match the header names. You can do this using Configuration.PrepareHeaderForMatch.
void Main()
{
var s = new StringBuilder();
s.AppendLine("Id,Name");
s.AppendLine($"{Guid.NewGuid()},one");
using (var reader = new StringReader(s.ToString()))
using (var csv = new CsvReader(reader))
{
csv.Configuration.PrepareHeaderForMatch = (header, indexer) => header.ToLower();
csv.GetRecords<ImmutableTest>().ToList().Dump();
}
}
public class ImmutableTest
{
public Guid Id { get; }
public string Name { get; }
public ImmutableTest(string name) : this(Guid.NewGuid(), name)
{
}
public ImmutableTest(Guid id, string name)
{
Id = id;
Name = name;
}
}
So the issue here is that there are no ParameterMaps in the class map.
The easy way to fix this, with the example above, is to write something like
public ImmutableTestMap()
{
AutoMap();
Map(immutableTest => immutableTest.Id)
.Index(0)
.Name(nameof(ImmutableTest.Id).ToUpper());
}
But this causes an issue: the column headers in the CSV file are ID and Name, while the generated parameter maps are id and name. These are not equal to each other, so the reader throws an error saying it can't find a column with the name ID. The error also says you can set HeaderValidated to null. After playing around with this, I ended up at a point where the configuration
Configuration config = new Configuration
{
IgnoreBlankLines = true,
ShouldUseConstructorParameters = type => true,
GetConstructor = type => type.GetConstructors()
.MaxBy(constructor => constructor.GetParameters().Length)
.FirstOrDefault(),
HeaderValidated = null,
MissingFieldFound = null
};
config.RegisterClassMap<ImmutableTestMap>();
is trying to parse empty fields and failing to convert them to a GUID, so that road looked like a dead end.
To get around this, I looked at checking each parameter map generated by AutoMap for the "id" value and replacing it with "ID". However, this also did not work, as the auto-generated maps are created with a null name; the name is only assigned when SetMapDefaults is called during the registration of a map in a configuration.
So I went for a blind loop that sets the names of the parameter maps from the explicitly defined member maps.
public ImmutableTestMap()
{
AutoMap();
Map(immutableTest => immutableTest.Id)
.Index(0)
.Name(nameof(ImmutableTest.Id).ToUpper());
Map(immutableTest => immutableTest.Name)
.Index(1)
.Name(nameof(ImmutableTest.Name));
if (MemberMaps.Count != ParameterMaps.Count)
{
return;
}
for (int i = 0; i < MemberMaps.Count; i++)
{
ParameterMaps[i].Data.Name = MemberMaps[i].Data.Names[0];
}
}
However, I would say this is more of a bodge than a fix and would definitely be open to other answers.
I use CsvHelper to read and write CSV files, and it is great, yet I don't understand how to write only selected fields of a type.
Say we had:
using CsvHelper.Configuration;
namespace Project
{
public class DataView
{
[CsvField(Name = "N")]
public string ElementId { get; private set; }
[CsvField(Name = "Quantity")]
public double ResultQuantity { get; private set; }
public DataView(string id, double result)
{
ElementId = id;
ResultQuantity = result;
}
}
}
and we wanted to exclude the "Quantity" CsvField from the resulting CSV file, which we currently generate via something like:
using (var myStream = saveFileDialog1.OpenFile())
{
using (var writer = new CsvWriter(new StreamWriter(myStream)))
{
writer.Configuration.Delimiter = '\t';
writer.WriteHeader(typeof(DataView));
_researchResults.ForEach(writer.WriteRecord);
}
}
What could I use to dynamically exclude a type field from the CSV?
If necessary, we could post-process the resulting file, yet I do not know how to remove an entire CSV column with CsvHelper.
I recently needed to achieve a similar result by determining what fields to include at runtime. This was my approach:
Create a mapping class that selects which fields I need at runtime based on an enum passed into its constructor:
public sealed class MyClassMap : CsvClassMap<MyClass>
{
public MyClassMap(ClassType type)
{
switch (type)
{
case ClassType.TypeOdd:
Map(m => m.Field1);
Map(m => m.Field3);
Map(m => m.Field5);
break;
case ClassType.TypeEven:
Map(m => m.Field2);
Map(m => m.Field4);
Map(m => m.Field6);
break;
case ClassType.TypeAll:
Map(m => m.Field1);
Map(m => m.Field2);
Map(m => m.Field3);
Map(m => m.Field4);
Map(m => m.Field5);
Map(m => m.Field6);
break;
}
}
}
Write out the records using the mapping configuration:
using (var memoryStream = new MemoryStream())
using (var streamWriter = new StreamWriter(memoryStream))
using (var csvWriter = new CsvWriter(streamWriter))
{
csvWriter.Configuration.RegisterClassMap(new MyClassMap(ClassType.TypeOdd));
csvWriter.WriteRecords(records);
streamWriter.Flush();
return memoryStream.ToArray();
}
Mark the field like this:
[CsvField( Ignore = true )]
public double ResultQuantity { get; private set; }
Update: Never mind. I see you want to do this at runtime rather than compile time. I'll leave this up as a red flag for anyone else who might make the same mistake.
You can do this:
using (var myStream = saveFileDialog1.OpenFile())
{
using (var writer = new CsvWriter(new StreamWriter(myStream)))
{
writer.Configuration.AttributeMapping(typeof(DataView)); // Creates the CSV property mapping
writer.Configuration.Properties.RemoveAt(1); // Removes the property at the position 1
writer.Configuration.Delimiter = "\t";
writer.WriteHeader(typeof(DataView));
_researchResults.ForEach(writer.WriteRecord);
}
}
We are forcing the creation of the attribute mapping and then modifying it, removing the column dynamically.
I had a similar issue with my code and fixed it with the following.
You can do this:
var ignoreQuantity = true;
using (var myStream = saveFileDialog1.OpenFile())
{
using (var writer = new CsvWriter(new StreamWriter(myStream)))
{
var classMap = new DefaultClassMap<DataView>();
classMap.AutoMap();
classMap.Map(m => m.ResultQuantity).Ignore(ignoreQuantity);
writer.Configuration.RegisterClassMap(classMap);
writer.Configuration.Delimiter = "\t";
writer.WriteHeader(typeof(DataView));
_researchResults.ForEach(writer.WriteRecord);
}
}
I had to solve this also: I have a couple dozen record types with a common base class plus a common field that has to be ignored by all of them:
// Nothing special here
internal class MyClassMap<T> : ClassMap<T> where T : MyRecordBaseClass
{
public MyClassMap()
{
AutoMap();
Map( m => m.SOME_FIELD ).Ignore();
}
}
This part is generally well documented, and it isn't the dynamic part yet.
But one class needed special handling: ignoring a different field dynamically. Though I could have created a separate map class, that wouldn't scale for what I expect will be a lot more of these, so I finally figured out how to do it properly:
...
// special processing for *one* record type
csvwriter.Configuration.RegisterClassMap<MyClassMap<ONE_RECORD_TYPE>>();
if (ShouldIgnore)
{
var map = csvwriter.Configuration.Maps.Find<ONE_RECORD_TYPE>();
map.Map( m => m.SOME_OTHER_FIELD ).Ignore();
}
...
This worked on CsvHelper versions 7.1.1 and 12.1.1.