I am using Entity Framework in an ASP.NET MVC project. The point of the project is to create a web version of an existing older desktop application written in VB6. The VB6 application saves 11/11/1911 to the table whenever a blank date is needed, because I have been told by the VB6 programmer that VB6 dates cannot be set to null.
Obviously in my classes I have dates such as:
public DateTime? ExampleDate { get;set; }
This causes the field to appear as 11/11/1911 when ideally it should be null, and despite my insistence that the VB6 side simply store null dates, the programmer tells me it can't be done. I know I can create manual properties in the class that check the field and return a value accordingly, marked as [NotMapped], but there is a huge number of dates across the tables, and I don't want to create an extra property for every DateTime, nor do I want to make a manual check after loading each record and add extra unnecessary code.
So what I would like to do, if it's possible, is somehow create my own data type or method I can use in the class directly, i.e. changing:
public DateTime? ExampleDate { get;set; }
To something like:
public CustomDateTime? ExampleDate { get;set; }
And have the CustomDateTime do the check and return accordingly. Can anyone tell me if this is possible, or recommend another solution?
EDIT:
What about some kind of custom data attribute or similar?
[CustomDateTime]
public DateTime? ExampleDate { get;set; }
I tried a custom validator, but this only triggers on saving, and I need the check to run when reading.
If you really can't modify the DB to have meaningful values, a good idea might be to have a data access object and a business object.
I'm using this answer as a reference: How to map Data Access to Business Logic objects in Entity Framework. It explains the idea very well.
This will add some complexity to your code, but your entities will stay clean. And if the DB values are later updated and '11/11/1911' is replaced by NULL, you will just need to update your mapping.
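As a rough sketch of that separation (class and property names are made up for illustration): the data-access object keeps the raw value exactly as the VB6 app stores it, and the mapping to the business object translates the sentinel date to a real null in one place.

```csharp
using System;

// Data-access object: mirrors the table, sentinel date and all
public class UserDao
{
    public DateTime ExampleDate { get; set; }
}

// Business object: what the rest of the app works with
public class UserBo
{
    public DateTime? ExampleDate { get; set; }
}

public static class UserMapper
{
    // The VB6 app writes this date whenever a blank date is needed
    private static readonly DateTime Vb6NullDate = new DateTime(1911, 11, 11);

    public static UserBo ToBusinessObject(UserDao dao)
    {
        return new UserBo
        {
            // Translate the sentinel to null in exactly one place
            ExampleDate = dao.ExampleDate.Date == Vb6NullDate
                ? (DateTime?)null
                : dao.ExampleDate
        };
    }
}
```

If the DB is ever fixed to store NULL, only this mapper needs to change.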
I would also recommend not adding additional fields for this purpose to your Entities, as you want to keep them clean. If you're stuck with the values in your database representing what you would consider a null DateTime value then I would employ an extension method.
This way you don't have to repeat your code and you can keep your entities clean... something like this:
namespace Project.DateTimeExtensions
{
    public static class DateTimeExtensions
    {
        public static DateTime? FromVb6DateTime(this DateTime? dateTime)
        {
            // If the DateTime is the VB6 null representation, return null
            return (dateTime.HasValue && dateTime.Value.Date == new DateTime(1911, 11, 11))
                ? null
                : dateTime;
        }
    }
}
Then you can ensure that the values are converted to null under the right conditions:
// Using the extension namespace
using Project.DateTimeExtensions;

public class MyClass
{
    public IDataService DataService;

    public MyClass(IDataService dataService)
    {
        this.DataService = dataService;
    }

    public void MyMethod()
    {
        // Get the date time value however you would normally
        var vb6DateTime = this.DataService.GetDate();

        // Convert the value using the extension method
        var convertedDateTime = vb6DateTime.FromVb6DateTime();
    }
}
I'm using ASP.NET Boilerplate MVC (not Core) template in my project, which uses EF6 as ORM. Database is SQL Server Express.
Here's my entity object (ignoring non-related properties):
public class Asset : AggregateRoot<long>
{
    [DataType(DataType.DateTime)]
    public DateTime? LastControlTime { get; set; }
}
When I create a new Asset, this field is appropriately created as NULL. So everything works as intended at first. But when I try to update an object with a simple service call, it goes wrong.
Here's the method in the application service class:
public void ResetLastControlTime(EntityDto<long> input)
{
    var asset = Repository.Get(input.Id);
    asset.LastControlTime = default(DateTime?);
}
This should reset that field to null. I also tried asset.LastControlTime = null;, but in the end "0001-01-01 00:00:00.0000000" is written to that field in the database. I check for a null value in lots of places in the code, so now I would have to change tons of old files unless I find some way to reset that field to simply NULL.
I checked similar questions here but cannot find an answer. All of them talk about a nullable DateTime, which I already have. In the SQL Server table schema the data type is datetime2(7), so I guess that's correct too. Oh, and deleting the DataType annotation also didn't change anything.
So what am I missing here? What should I check to find the issue?
I suppose if all else fails, you can simplify most of your code by re-implementing the property:
public class Asset : AggregateRoot<long>
{
    private DateTime? _lastControlTime;

    [DataType(DataType.DateTime)]
    public DateTime? LastControlTime
    {
        get
        {
            return _lastControlTime;
        }
        set
        {
            // Map the non-null default back to null
            if (value == DateTime.MinValue)
            {
                _lastControlTime = null;
            }
            else
            {
                _lastControlTime = value;
            }
        }
    }
}
It doesn't really cut to the heart of the problem, but it will let you progress without having to change all of your == null and .HasValue checks throughout the entire program.
I have a quick question: is it possible to use MongoDB with the OnDeserializing attribute, or something like that?
MongoClient client { get; } = new MongoClient("mongodb://localhost:27017");
var Userscollection = db.GetCollection<UserModel>("Users");
var userfind = (await Userscollection.FindAsync(x => x.UserId == "UserId"));
var user = userfind.FirstOrDefault();
My UserModel class has a function with the OnDeserializing attribute but it doesn't fire on Find and fetching the item of the user.
[OnDeserializing]
void TestFunc(StreamingContext context)
{
}
Is there any way to fire it automatically, or any similar way to detect in the class constructor whether the instance is being created by my code or by the MongoDB serializer/deserializer?
OK, after poking a lot at attributes and lots of tries, I finally found a solution for my case.
I commented above that creating two constructors doesn't work, because the property values are set after the constructors run.
But there's a workaround: the BsonConstructor attribute.
Create a simple constructor of your own, with or without arguments (if you need one; otherwise you can skip it).
Then create another constructor marked with the BsonConstructor attribute, like below.
[BsonConstructor()]
public Postmodel(ObjectId postid, ObjectId userid)
{
    // Called by the BSON serializer/deserializer
}
But this constructor alone will surely not be what you need, because all of the properties are null or have default values, so you need to pass some arguments too.
The sample below will give you the values of the properties you need, so you can do whatever you want in the constructor.
[BsonConstructor(nameof(Id), nameof(UserId))]
public Postmodel(ObjectId postid, ObjectId userid)
{
}
Id and UserId are two properties in my class.
You can also simply use
[BsonConstructor("Id", "UserId")]
instead. But be careful: renaming or removing properties during development won't warn you to fix those strings, so using nameof(PropertyName) is much safer.
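Putting the pieces together, a minimal sketch of the whole class might look like this (property names as in my example; adjust to your model):

```csharp
using MongoDB.Bson;
using MongoDB.Bson.Serialization.Attributes;

public class Postmodel
{
    public ObjectId Id { get; set; }
    public ObjectId UserId { get; set; }

    // Used when your own code news up the object
    public Postmodel()
    {
    }

    // Called by the BSON deserializer with the stored values,
    // so here you know the instance comes from the driver
    [BsonConstructor(nameof(Id), nameof(UserId))]
    public Postmodel(ObjectId postid, ObjectId userid)
    {
        Id = postid;
        UserId = userid;
    }
}
```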
In the database we have to work with (which is DB2), there are fields stored as character data that are in fact other objects, the most common being custom formats the underlying application uses to store dates and times. For example:
[Table]
public class ExampleTable
{
    // This is stored in the DB as a char in the format: 2016-01-11-11.39.53.492000
    [Column(Name = "WTIMESTAMP")]
    public string WriteTimestamp { get; set; }
}
Is there a way to tell linq2db a conversion method to use when converting to/from the database, which would also allow us to access those properties as the object we want (for instance, a C# DateTime), but have them saved back in the proper format?
One thing I thought of was something like:
[Table]
public class ExampleTable
{
    public DateTime WriteTimestamp { get; set; }

    // This is stored in the DB as a char in the format: 2016-01-11-11.39.53.492000
    [Column(Name = "WTIMESTAMP")]
    public string WriteTimestampRaw
    {
        get
        {
            return ConvertWriteTimestampToDb2Format(WriteTimestamp);
        }
        set
        {
            WriteTimestamp = ConvertWriteTimestampToDateTime(value);
        }
    }
}
And then we would access WriteTimestamp, while linq2db uses WriteTimestampRaw in the queries.
But I'm not sure whether that's the best or only option. Thanks in advance.
Well... I just noticed that you said linq2db and not Entity Framework after I posted my answer. Maybe it will still give you some ideas, though.
What I have done before with Entity Framework (although not specifically with DB2, but I think it should still work), is to use the code provided in this answer to allow private properties to be mapped to a database column. Then, I have something similar to your code, except the getters and setters are reversed:
[Table("ExampleTable")]
public class ExampleTable
{
    [NotMapped]
    public DateTime WriteTimestamp
    {
        get
        {
            var db2Tstamp = DB2TimeStamp.Parse(WriteTimestampRaw);
            return db2Tstamp.Value;
        }
        set
        {
            var db2Tstamp = new DB2TimeStamp(value);
            WriteTimestampRaw = db2Tstamp.ToString();
        }
    }

    // This is stored in the DB as a char in the format: 2016-01-11-11.39.53.492000
    [Column("WTIMESTAMP")]
    private string WriteTimestampRaw { get; set; }
}
I used the DB2TimeStamp class to handle the conversion between string and DateTime values, but you could probably do it however you're comfortable.
You can use the MappingSchema.SetConverter method to set a conversion between specific types on the client side, or MappingSchema.SetConvertExpression to make the converters part of the query expression tree.
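A sketch of what that could look like for the WTIMESTAMP format from the question (I'm assuming linq2db's MappingSchema API here; verify the format string against your actual data):

```csharp
using System;
using System.Globalization;
using LinqToDB.Mapping;

public static class Db2Conversions
{
    // Matches the sample value 2016-01-11-11.39.53.492000
    private const string Db2Format = "yyyy-MM-dd-HH.mm.ss.ffffff";

    public static MappingSchema Build()
    {
        var ms = new MappingSchema();

        // char column -> DateTime when materializing results
        ms.SetConverter<string, DateTime>(s =>
            DateTime.ParseExact(s.Trim(), Db2Format, CultureInfo.InvariantCulture));

        // DateTime -> char column when writing parameters
        ms.SetConverter<DateTime, string>(dt =>
            dt.ToString(Db2Format, CultureInfo.InvariantCulture));

        return ms;
    }
}
```

The schema would then be attached to your DataConnection (e.g. via AddMappingSchema) so the WTIMESTAMP column can be declared as a plain DateTime property.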
I have an object:
public class Object1
{
    public ObjectId Id { get; set; }
    public List<Object2> Obj2List { get; set; }
}
public class Object2
{
    public ObjectId Id { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
}
I have a feature where the user can update just one instance of Object2. So my code for saving Object2 looks like
[HttpPost]
public ActionResult SaveObject2(Object2 obj2)
{
    if (obj2.Id == null)
    {
        // Add logic
        obj1.Obj2List.Add(obj2);
    }
    else
    {
        // Update logic
    }
}
But obj2.Id is never null. Id is of type ObjectId. How can I check to see if I need to insert or update? I am using ASP.NET MVC 3 and MongoDB with the official C# driver.
Thanks
The ObjectId type is a struct, not a class, so it's never going to be null. Check for ObjectId.Empty instead.
A word of caution, though: I suppose that you are storing the id of Object2 in some hidden field between requests. If that is the case, be aware that a malicious user can easily change the ID by using an HTTP proxy (such as Fiddler), tricking you into believing that the Object2 is being updated instead of added.
Depending on the context of what you are trying to do, I would suggest performing some additional checks to more reliably determine whether you should insert or update your object.
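A tiny sketch of the check itself (MongoDB C# driver types assumed): a default-constructed ObjectId equals ObjectId.Empty, so the action can branch on that.

```csharp
using MongoDB.Bson;

public static class ObjectIdCheck
{
    // Decide insert vs. update from the posted id
    public static string InsertOrUpdate(ObjectId id)
    {
        // ObjectId is a struct: default(ObjectId) == ObjectId.Empty, never null
        return id == ObjectId.Empty ? "insert" : "update";
    }
}
```

In the question's action that becomes if (obj2.Id == ObjectId.Empty) { /* add */ } else { /* update */ }.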
Situation: I have a large shrink-wrapped application that my company bought. It is supposed to be extensible, yada, yada. It has a DB, a DAL and a BLL in the form of SQL and DLLs. It also has an MVC project (the extensible part), but 95% of the "Model" part is in the DAL/BLL libraries.
Problem: I need to extend one of the "Models" located in the BLL. It is a User object with 47 properties, 0 methods and no constructor. What I started with was a simple derivation of their class, like:
public class ExtendedUser : BLL.DTO.User
{
    public bool IsSeller { get; set; }
    public bool IsAdmin { get; set; }
}
This works fine if I just create a new ExtendedUser. However, it is populated by another call into their BLL, like:
BLL.DTO.User targetUser = UserClient.GetUserByID(User.Identity.Name, id);
I tried the straightforward brute-force attempt, which of course throws a cast exception:
ExtendedUser targetUser = (ExtendedUser)UserClient.GetUserByID(User.Identity.Name, id);
I am drawing a complete blank on this very simple OO concept. I don't want to write a Constructor that accepts the existing User object then copies each of the properties into my extended object. I know there is a right way to do this. Can someone slap me upside the head and tell me the obvious?
TIA
If you do want to use inheritance, then with 47 properties something like AutoMapper might help you copy all the values across - http://automapper.codeplex.com/ - this would allow you to use:
// setup
Mapper.CreateMap<BLL.DTO.User, ExtendedUser>();
// use
ExtendedUser extended = Mapper.Map<BLL.DTO.User, ExtendedUser>(user);
Alternatively, you might be better off using aggregation instead of inheritance - e.g.
public class AggregatedUser
{
    public bool IsSeller { get; set; }
    public bool IsAdmin { get; set; }
    public BLL.DTO.User User { get; set; }
}
What about this approach (basically Aggregation):
public sealed class ExtendedUser
{
    public ExtendedUser(BLL.DTO.User legacyUser)
    {
        this.LegacyUser = legacyUser;
    }

    public BLL.DTO.User LegacyUser { get; private set; }
}
I don't want to write a Constructor that accepts the existing User object then copies each of the properties into my extended object.
This is typically the "right" way to do this, unless you have compile-time access to the BLL. The problem is that a cast will never work: an ExtendedUser is a concrete type of User, but not every User is an ExtendedUser, which would be required for the cast to succeed.
You can handle this via aggregation (contain the instance of the User as a member), but not directly via inheritance.
This is often handled at compile time via partial classes. If the BLL is set up to create the classes (i.e. User) as partial classes, you can add your own logic in a separate file, which prevents this from being an issue. This is common practice with many larger frameworks, ORMs, etc.
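A sketch of the partial-class route, assuming the BLL source declares User as partial and is compiled into the same assembly (partial classes cannot span assemblies, so this is off the table if the BLL only ships as a DLL):

```csharp
// Generated/BLL side (not under your control; property name illustrative):
public partial class User
{
    public string UserName { get; set; }
    // ... the other 46 properties
}

// Your side, in a separate file in the same namespace and assembly:
public partial class User
{
    public bool IsSeller { get; set; }
    public bool IsAdmin { get; set; }
}
```

The compiler merges both halves into a single User type, so no casting or property copying is needed at all.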