How to ignore symbols in string comparison and dictionary lookup - C#

It looks like I have a problem with symbol encodings between a string built by my program and a string retrieved from another data source.
Here is a .NET Fiddle and here is the explanation:
var context = new List<Foo>
{
new Foo { Name = "SoW.Probing.4GEPCCore.CaptSite[1].S1U" },
new Foo { Name = "SoW.Probing.4GEPCCore.CaptSite[2].S1U" },
new Foo { Name = "SoW.Probing.2G3GPSCore.CaptSite[1].GnGpU" },
new Foo { Name = "SoW.Probing.2G3GPSCore.CaptSite[2].GnGpU" }
};
var nameToCheckPresence = GetStringFromAnotherDataSource(); // the value of the string is for example: "SoW.Probing.4GEPCCore.CaptSite.S1U"
nameToCheckPresence = nameToCheckPresence.Replace("CaptSite", "CaptSite[1]");
var foo = context.FirstOrDefault(f => f.Name == nameToCheckPresence); // Should return an object since one object does have that name
My problem is that foo is null. It works if I use this code line:
var foo = context.FirstOrDefault(f => CultureInfo.CurrentCulture.CompareInfo.Compare(f.Name, nameToCheckPresence, CompareOptions.IgnoreSymbols) == 0);
So clearly, I have a problem with symbol encoding (the "."? the "[ ]"?). My real problem is that later on I do the same thing with a dictionary. The hash codes of the strings are different, and the dictionary lookup also fails:
var dictionary = context.ToDictionary(f => f.Name);
var foo = dictionary[nameToCheckPresence]; // Should return the object but failed and throw a KeyNotFoundException
Is there a way to change the string symbols encoding in a global manner in the application? (WPF application in my case)
As the context can be very large, it was planned to use a Dictionary in the first place anyway. So if you provide a solution that only works with a Dictionary, that is not a problem.
Just for the record, the data source is a SQLite database holding a copy of the data from a MySQL database filled by another WPF application (running on the same computer, no specific culture setup). Finally, nameToCheckPresence is extracted from a larger string by ANTLR4CS.

This is not a satisfactory answer, but it is all I have found to solve the problem. Instead of looking into the dictionary through the indexer, I run a LINQ query:
dictionary.FirstOrDefault(pair => CultureInfo.CurrentCulture.CompareInfo.Compare(pair.Key, localFactName, CompareOptions.IgnoreSymbols) == 0).Value;
But doing this, I lose all the benefit of the dictionary's lookup complexity. If anyone has a better solution, I will take it!
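One way to keep the constant-time dictionary lookup would be a custom IEqualityComparer<string> that ignores symbols in both Equals and GetHashCode, so that keys differing only in punctuation hash to the same bucket. This is a sketch, not a tested fix for the exact data above; it assumes CompareInfo.GetHashCode(string, CompareOptions), which is available from .NET Framework 4.7.1 / .NET Core 2.0 onwards:

```csharp
using System.Collections.Generic;
using System.Globalization;

// A comparer that ignores symbols for both equality and hashing, so that
// Dictionary<string, Foo> lookups stay O(1) instead of a linear scan.
class IgnoreSymbolsComparer : IEqualityComparer<string>
{
    private static readonly CompareInfo Info = CultureInfo.CurrentCulture.CompareInfo;

    public bool Equals(string x, string y) =>
        Info.Compare(x, y, CompareOptions.IgnoreSymbols) == 0;

    public int GetHashCode(string s) =>
        Info.GetHashCode(s, CompareOptions.IgnoreSymbols);
}
```

The dictionary would then be built as context.ToDictionary(f => f.Name, new IgnoreSymbolsComparer()), and the indexer lookup should succeed even when the keys differ only in symbols.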

Serialize Enum Values and Names to JSON

This question is related, but IMHO not identical to
How do I serialize a C# anonymous type to a JSON string?
Serialize C# Enum Definition to Json
Whilst testing, I've also stumbled across this LINQPad culprit, which made my life difficult:
Why does LINQPad dump enum integer values as strings?
Now, my actual question:
My application (in particular SyncFusion component datasources, such as MultiSelect) requires enumerations in JSON format, e.g. something like this:
[ {"Id":0,"Name":"Unknown"},{"Id":1,"Name":"Open"},{"Id":2,"Name":"Closed"},{"Id":3,"Name":"Approve"} ]
UPDATE
As dbc pointed out, my question may not have been clear enough. I do not want to serialize one entry of the enumeration, but the whole enumeration. The JSON could then be used as a data source in JavaScript, e.g. for a select list, simplified:
<option value=0>Unknown</option>
<option value=1>Open</option> etc
The JSON object is identical to an enum in a namespace (except that I have given a property name to the key and value of each entry):
public enum ListOptions
{
Unknown = 0,
Open = 1,
Closed = 2,
Approve = 3
}
I've struggled with enums; all the other approaches, such as specifying a JSON StringConverter, didn't yield all options in an array, so I ended up using LINQ. My view model now has a string property like this:
public string CrewListOption => JsonConvert.SerializeObject(Enum.GetValues(typeof(ListOptions))
.Cast<int>()
.Select(e => new { Id = (int) e, Name = typeof(ListOptions).GetEnumName(e) }));
Given that I'm pretty much a beginner with ASP.NET Core, I find it hard to believe that this is a good solution. Yet I also find it hard to find straightforward, better examples of the same thing.
I'd appreciate it if you could help me improve this and make it potentially more generically useful for "exporting" whole enumerations to JSON.
Here's the full LINQPad script (where Newtonsoft.Json is imported from the GAC):
void Main()
{
Enum.GetValues(typeof(ListOptions)).Cast<int>().Select(e => new { Id = e, Name = (ListOptions) e } ).Dump(); // these are identical, except for the typeof()
Enum.GetValues(typeof(ListOptions)).Cast<int>().Select(e => new { Id = (int) e, Name = typeof(ListOptions).GetEnumName(e) }).Dump(); // is typeof(MyEnumType) better?
string JsonString = JsonConvert.SerializeObject(Enum.GetValues(typeof(ListOptions)).Cast<int>().Select(e => new { Id = (int) e, Name = typeof(ListOptions).GetEnumName(e) }));
JsonString.Dump(); // [{"Id":0,"Name":"Unknown"},{"Id":1,"Name":"Open"},{"Id":2,"Name":"Closed"},{"Id":3,"Name":"Approve"}]
}
public enum ListOptions {
Unknown = 0,
Open = 1,
Closed = 2,
Approve = 3
};
You may have a static method like:
public static Dictionary<string, string> EnumToDictionary<T>() where T : Enum
{
var res = Enum.GetValues(typeof(T)).Cast<T>()
.ToDictionary(e => Convert.ToInt32(e).ToString(), e => e.ToString());
return res;
}
Then, for serializing as an object:
var enumValues= EnumToDictionary<ListOptions>();
var result = JsonConvert.SerializeObject(enumValues);
For serializing as an array:
var enumValues= EnumToDictionary<ListOptions>().ToArray();
var result = JsonConvert.SerializeObject(enumValues);
Here is an example from the Microsoft Docs that converts an enum to a dictionary:
https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/generics/constraints-on-type-parameters#enum-constraints
Then you can serialize the dictionary to JSON.
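For the exact array shape asked for in the question ([{"Id":0,"Name":"Unknown"}, ...]), the same idea can be packaged into one generic helper. A sketch using Newtonsoft.Json; the class and method names here are my own:

```csharp
using System;
using System.Linq;
using Newtonsoft.Json;

static class EnumJson
{
    // Serializes every member of an enum as [{ Id, Name }, ...].
    // The "where T : Enum" constraint requires C# 7.3 or later.
    public static string EnumToJsonArray<T>() where T : Enum =>
        JsonConvert.SerializeObject(
            Enum.GetValues(typeof(T)).Cast<T>()
                .Select(e => new { Id = Convert.ToInt32(e), Name = e.ToString() }));
}
```

Calling EnumJson.EnumToJsonArray<ListOptions>() should then produce the array shown in the question, for any enum type, without repeating the LINQ pipeline in each view model.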

Pass an array or list of different objects to a function for modification

Currently I am receiving an array of objects from a database.
object [] sqlResultData = DatabaseCall.Result();
This array of objects needs to be matched to class variables like this:
CClassOfVars classVar = new CClassOfVars();
classVar.myProperty = sqlResultData[0];
classVar.myProperty1 = sqlResultData[1];
What I wish to do is pass the list of properties on the class, in order, to a function and have the mapping from the object array occur automatically based on that order.
For example:
Method defined like this
FillData(object[] databaseValues, IList<object> mapMe)
Called like this
CClassOfVars classVar = new CClassOfVars();
object [] sqlResultData = DatabaseCall.Result();
FillData(sqlResultData, new List<object>(){classVar.myProperty,classVar.myProperty1});
The FillData function would hopefully typecast and set the values of myProperty and myProperty1 to the values at array locations 0, 1, etc.
Something like this
FillData(object [] databaseValues, IList<object> mapMe)
{
for (int i = 0; i < mapMe.Count; i++)
{
mapMe[i] = CastToTheCorrectType(mapMe[i], databaseValues[i]);
}
}
CastToTheCorrectType could look like this (I took it from here: cast object with a Type variable):
public T CastToTheCorrectType<T>(T hackToInferNeededType, object givenObject) where T : class
{
return givenObject as T;
}
How can I pass a list of different object types and have all their values modified and assigned within a different function?
What you are asking about is murky and difficult to implement with just a function. There are frameworks dedicated to object-relational mapping; if it is an option, install and learn an OR/M. If not... well, there is a somewhat dirty way.
You can use the JSON.NET library to do the heavy lifting for you. It's super easy to use and install through Nuget. My point is as follows.
Construct an anonymous object. Use the property names of the original object.
Fill it with the data from the object array. Spin a loop over the object array...
Serialize the anonymous object.
Deserialize the JSON string into the target type.
At this point, JSON.NET will handle property mapping for you.
E.g. if your target type is Person you might do this:
var persons = new List<Person>(sqlResultData.Length);
foreach (object[] record in sqlResultData) // assuming each row is itself an object[]
{
// anonymous-type properties are read-only, so build the object in one step
var x = new
{
FirstName = (string)record[0],
LastName = (string)record[1]
};
var s = JsonConvert.SerializeObject(x);
var personX = JsonConvert.DeserializeObject<Person>(s);
persons.Add(personX);
}
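Alternatively, the FillData idea from the question can be sketched with plain reflection: pass the target object plus the property names in column order, and let Convert.ChangeType handle the casts. The helper below and its names are hypothetical, and it assumes the members are public properties with setters (fields would need GetField instead):

```csharp
using System;

static class Mapper
{
    // Maps array values onto the target's properties, in the order the
    // property names are given; Convert.ChangeType performs the conversion.
    public static void FillData(object[] databaseValues, object target, params string[] propertyNames)
    {
        var type = target.GetType();
        for (int i = 0; i < propertyNames.Length; i++)
        {
            var prop = type.GetProperty(propertyNames[i]);
            prop.SetValue(target, Convert.ChangeType(databaseValues[i], prop.PropertyType));
        }
    }
}
```

Usage would look like Mapper.FillData(sqlResultData, classVar, "myProperty", "myProperty1"); this avoids the serialize/deserialize round trip, at the cost of doing the type conversion yourself.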

Why doesn't IEnumerable.Where() find my objects in DynamoDBContext.Scan() results?

While using the AWS DynamoDB "Object Persistence Model" in C#, I ran into an interesting issue parsing the results of a Scan operation. In the code below, the entries in Datastore.userIndicators (a Dictionary holding Lists of objects, indexed by username) always end up as empty lists.
var allIndicators = context.Scan<Indicator>();
Datastore.globalIndicators = allIndicators.Where(i => i.UserName == "*").ToList();
var userDefinedIndicators = allIndicators.Where(i => i.UserName != "*");
foreach (var username in userDefinedIndicators.Select(i => i.UserName).Distinct())
{
Datastore.userIndicators[username] = userDefinedIndicators.Where(i => i.DynamoDbRangeKey.StartsWith(username)).ToList();
}
So, for example, if I have entries in my table that include an attribute "UserName" with value "userA", when running this code the Dictionary "Datastore.userIndicators" will end up with an entry for key "userA" but the corresponding value will be an empty list.
After fiddling with this and following a hunch, I modified the assignment of
var allIndicators = context.Scan<Indicator>();
to
var allIndicators = context.Scan<Indicator>().ToList();
Voila!
It turns out (as confirmed by the AWS SDK documentation) that the result of the DynamoDBContext.Scan() method is lazily loaded. Because the original code enumerated that lazy sequence several times, the later passes came back empty; calling .ToList() forces a single enumeration and materializes all the results.
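The pitfall generalizes to any deferred IEnumerable whose underlying source can only be consumed once. A minimal sketch in plain C# (no DynamoDB involved) that mimics the behavior:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class LazyDemo
{
    // Wraps an enumerator so the sequence, like a paged Scan result,
    // can effectively be consumed only once.
    public static IEnumerable<int> OneShot(IEnumerator<int> source)
    {
        while (source.MoveNext())
            yield return source.Current;
    }

    static void Main()
    {
        IEnumerator<int> backing = new List<int> { 1, 2, 3 }.GetEnumerator();
        IEnumerable<int> seq = OneShot(backing);

        Console.WriteLine(seq.Count()); // 3: the first pass drains the enumerator
        Console.WriteLine(seq.Count()); // 0: the second pass finds nothing left
        // Materializing once with .ToList() up front avoids this entirely.
    }
}
```

This is why the Where/Select/Distinct chain in the question, which enumerates allIndicators several times, produced empty lists until .ToList() was added.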

C# MemoryCache, Cannot find out what is happening

I created a service supporting my ASP.NET MVC controller. The service returns a list of custom-defined fields that I render in the Create and Edit views. Since these fields are defined once, I want to return the cached list when available and create the cache when it is not. Since I am testing, I have not defined a cache expiration.
When I execute the Edit action, this service helps map the queried values onto the cached list of custom fields. What happens is that my object in the cache gets modified.
I am familiar with the fact that the MemoryCache stores a reference and does not store a copy of the object. What I do not understand is why the MemoryCache is modified when I am actually working with an object that - in my view - is not a reference to the cache: it was passed to the method, and no ref or out parameters are defined. To me, the reference lives in a totally different scope.
I have tried all sorts of things, but I am missing the essential issue causing this behavior, and I really want to figure out what is happening here. Is there a broader scope to the reference? Are local variables somehow shared among methods?
This is the method in the service that either returns the cached information or queries the database and stores the result in the cache. It is used by the Create and Edit actions. Notice that the Value property is set to null so that a Create action starts with empty fields.
public IList<CustomField> GetCustomFields()
{
var result = MemoryCache.Default["cache_customfield"] as List<CustomField>;
if (result == null)
{
result = session.Query<CustomField>()
.AsEnumerable()
.Select(c => new CustomField
{
Id = c.Id,
Name = c.Name,
Value = null
})
.ToList();
MemoryCache.Default["cache_customfield"] = result;
}
return result;
}
public static IList<CustomField> MapValues(IList<CustomField> fields, IDictionary<string,string> values = null)
{
// the cached information still has value properties that are null
var a = MemoryCache.Default["cache_customfield"] as List<CustomField>;
foreach (var field in fields.OrderBy(x => x.Name))
{
var persistedValue = string.Empty;
values?.TryGetValue(field.Id, out persistedValue);
field.Value = persistedValue;
}
// the cached information suddenly has value properties that are defined, however the 'fields' parameter has no reference to the original information?!
var b = MemoryCache.Default["cache_customfield"] as List<CustomField>;
return fields;
}
I doubt these have much impact on the situation, but these are the controller actions for Create and Edit.
public ActionResult Create()
{
var ticketService = BusinessServiceFacade.GetTicketService(RavenSession);
var vm = new TicketViewModel();
vm.Controls = ControlViewModel.CreateControls(ticketService.GetCustomFields());
return View(vm);
}
public ActionResult Edit(string id)
{
var ticketService = BusinessServiceFacade.GetTicketService(RavenSession);
var ticket = RavenSession.Load<Ticket>(id);
var customfieldValues = ticket.Attributes.ToDictionary(x => x.Name, x => x.Value);
var vm = new TicketViewModel(ticket);
var listOfCustomFields = TicketService.MapValues(ticketService.GetCustomFields(), customfieldValues);
vm.Controls = ControlViewModel.CreateControls(listOfCustomFields);
return View(vm);
}
So essentially: why is my cache modified in the MapValues method when the fields parameter has a scope of its own (no ref or out)? I really want to understand what is going on here.
UPDATE:
After making the modification by supplying a new List reference, I am not noticing any change.
It looks like the reference is still passed forward from the local variable into the newly created list parameter. One option would be to build an entirely new list with freshly created CustomField objects, but I would like to avoid that if possible.
I am possibly making a simple mistake.
public ActionResult Create()
{
var ticketService = BusinessServiceFacade.GetTicketService(RavenSession);
var vm = new TicketViewModel();
var fields = ticketService.GetCustomFields();
vm.Controls = ControlViewModel.CreateControls(new List<CustomField>(fields));
return View(vm);
}
public ActionResult Edit(string id)
{
var ticketService = BusinessServiceFacade.GetTicketService(RavenSession);
var ticket = RavenSession.Load<Ticket>(id);
var customfieldValues = ticket.Attributes.ToDictionary(x => x.Name, x => x.Value);
var vm = new TicketViewModel(ticket);
var fields = ticketService.GetCustomFields();
var listOfCustomFields = TicketService.MapValues(new List<CustomField>(fields), customfieldValues);
vm.Controls = ControlViewModel.CreateControls(listOfCustomFields);
return View(vm);
}
Solution
Do a deep copy.
public static IList<CustomField> MapValues(IList<CustomField> fields, IDictionary<string,string> values = null)
{
// break reference, deep copy to new list
var oldList = (List<CustomField>) fields;
var newList = oldList.ConvertAll(c => new CustomField(c.Id, c.Name, c.Visible, c.Type, c.TypeFormat, c.Value));
foreach (var field in newList.OrderBy(x => x.Name))
{
var persistedValue = string.Empty;
values?.TryGetValue(field.Id, out persistedValue);
field.Value = persistedValue;
}
return newList;
}
TicketService.MapValues(ticketService.GetCustomFields()...
Within your Edit method, you call MapValues passing in the result of GetCustomFields, and that result is the cached list. So, within MapValues, all of a, b, and fields are references to the same list (the cached object). That's why you see the changes you make to fields also appear in b.
why is my cache modified in the MapValues method when the fields parameter has a scope on his own (not ref or out).
Yes, fields is scoped to the method. But I think you're confusing 1) changing the value of fields, which is a reference to a list, with 2) changing the actual list that fields references. Any change you make to fields itself is scoped to this method (e.g. reassigning it won't affect the value that was passed in). However, as long as it points to a specific list, any changes you make to that list can be observed through every other reference to the same list. So the scope of fields doesn't mean the changes you make to the list are scoped to this method.
In response to the comment below, if you do something like this:
IList<CustomField> originalList = ticketService.GetCustomFields();
IList<CustomField> newList = new List<CustomField>(originalList);
and pass in the new list to MapValues (TicketService.MapValues(newList...) then the changes within MapValues won't affect the list referenced by the originalList. Because now you have two different lists.
Update: As commented below, I didn't notice you were modifying individual items within the list, so you need to deep-copy in that case. In this specific case, deep-copying isn't too bad since you only have a couple of properties to copy:
IList<CustomField> originalList = ticketService.GetCustomFields();
IList<CustomField> newList = originalList
.Select(x => new CustomField
{
Id = x.Id,
Name = x.Name,
Value = x.Value
})
.ToList();
However, you can see how this could quickly get problematic as you add more properties, or properties of complex types (you would need to copy properties of properties, and so on). There are solutions such as serializing and deserializing the object to copy it, but I'd consider a different design first. As I said, in your case manually copying a couple of properties isn't too bad.
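The shallow-versus-deep distinction above can be condensed into a small, self-contained sketch (the Item class here is hypothetical, standing in for CustomField):

```csharp
using System;
using System.Collections.Generic;

class Item { public string Value; }

static class CopyDemo
{
    static void Main()
    {
        var original = new List<Item> { new Item { Value = "cached" } };

        // Shallow copy: a new list, but it holds the very same Item objects.
        var shallow = new List<Item>(original);
        shallow[0].Value = "changed";
        Console.WriteLine(original[0].Value); // "changed": the element is shared

        // Deep copy: new Item objects, so mutations stay local to the copy.
        var deep = original.ConvertAll(i => new Item { Value = i.Value });
        deep[0].Value = "independent";
        Console.WriteLine(original[0].Value); // still "changed", untouched by the deep copy
    }
}
```

This is exactly why wrapping the cached list in new List<CustomField>(fields) did not help: the wrapper list was new, but the CustomField elements inside it were still the cached instances.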

Assign a column value if it throws an exception in Linq query

I have a query where one property is a path: "/Primary/secondary/tertiary/.../.../".
My task is to split this path on the slashes so that every sub-path can be assigned as a property in the query result.
The problem is that the length varies: some paths have a post-split array length of 1, some of 7. So I need to have 7 different category columns:
var result = MySource.Where(ms => ...)
.Select(ms => new {
ID = ms.ID,
Name = ms.Name,
Category1 = ms.Path.Split('/')[0],
Category2 = ms.Path.Split('/')[1] //exception
.... //exception
Category7 = ms.Path.Split('/')[6] //exception
});
After the path is split, the resulting array has a varying length (1-7), and indexing past the end throws an exception. How can I avoid these exceptions?
I have tried the null-coalescing operator, ms.Path.Split('/')[1] ?? "N/A", which did not help, because there is no null result; an exception is thrown instead. For the same reason, every shorthand if statement fails as well.
Is there a way to catch the exception (wrap it in a try/catch block?) so I can assign a default value when the index is out of bounds?
Your modeling seems a little broken. Instead of a flattened set of properties, populate a single collection. Something like this:
Select(ms => new {
ID = ms.ID,
Name = ms.Name,
Categories = ms.Path.Split('/')
})
Going a step further, you can create an actual (non-anonymous) model to hold this information, encapsulating the logic of category range checking. Something like:
Select(ms => new SomeObject(
ms.ID,
ms.Name,
ms.Path.Split('/')
))
Then in SomeObject you can have all sorts of logic, for example:
In the constructor you can perform input checking on the values, including the count of categories supplied, to ensure the object is valid.
You can keep the collection of categories private and expose properties for 1-7 if you really need to, each internally performing this check. (Though I really don't recommend that: it creates an unnecessary point of change for something a collection already handles, indexing values.) Something like:
public string Category1
{
get
{
if (categories.Length < 1)
return string.Empty;
return categories[0];
}
}
Maybe throw an exception instead of returning an empty string? Maybe do something else? The point is to encapsulate this logic within an object instead of in a LINQ query or in consuming code.
You can do:
Category7 = ms.Path.Split('/').ElementAtOrDefault(6) ?? "N/A",
see demo: https://dotnetfiddle.net/4nTBhq
ElementAtOrDefault returns the element at the given index (for example 6, like [6]), but returns null when the index is out of bounds.
Optimized, without calling Split multiple times:
.Select(ms => {
var categories = ms.Path.Split('/');
return new {
ID = ms.ID,
Name = ms.Name,
...
Category7 = categories.ElementAtOrDefault(6),
};
})
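Putting the two suggestions together (split once, then index safely), a self-contained sketch with an illustrative path:

```csharp
using System;
using System.Linq;

static class SplitDemo
{
    static void Main()
    {
        var path = "/Primary/secondary/";

        // Split once; RemoveEmptyEntries drops the empty strings produced
        // by the leading and trailing slashes.
        var categories = path.Split(new[] { '/' }, StringSplitOptions.RemoveEmptyEntries);

        var category1 = categories.ElementAtOrDefault(0) ?? "N/A"; // "Primary"
        var category7 = categories.ElementAtOrDefault(6) ?? "N/A"; // "N/A": index out of bounds

        Console.WriteLine($"{category1} / {category7}");
    }
}
```

No exception is thrown for the missing indices; each absent category simply falls back to the "N/A" default.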
