Let's say I have a domain model of an assembly line that has different orders on it. The user can change the value of an order, but each order's value must be greater than the one in front of it and less than the one behind it. I have created an aggregate root called Line to enforce this invariant. A simplified version of that code is below.
public class Line : IAggregateRoot
{
public int Id { get; }
public List<Order> Orders { get; }
public Line(int id, List<Order> orders)
{
Id = id;
Orders = orders;
}
public void SetOrderValue(int orderId, int newOrderValue)
{
var orderPos = Orders.FindIndex(o => o.Id == orderId);
if (orderPos != -1 && IsValidOrderValue(orderPos,newOrderValue))
{
Orders[orderPos].Value = newOrderValue;
}
}
private bool IsValidOrderValue(int orderPos, int newOrderValue)
{
var lessThanAfter = orderPos == Orders.Count - 1 || newOrderValue <= Orders[orderPos + 1].Value;
var greaterThanBefore = orderPos == 0 || newOrderValue >= Orders[orderPos - 1].Value;
return lessThanAfter && greaterThanBefore;
}
}
public class Order : IEntity<int>
{
public int Id { get; }
public int Value { get; set; }
/*
* Other info about the order goes here
*/
public Order(int id, int value)
{
Id = id;
Value = value;
}
}
The issue that I have is that any object that references Line can also change the value of any order and break the invariant.
line.Orders[0].Value = 10;
I know that in DDD the aggregate root shouldn't hand out references to its inner entities, so I thought about making the orders list private. But then, when I try to store the Line aggregate root in a repository, the repository has no way to fetch and save the list of orders. Is there a recommended way in DDD to protect the Order objects from being modified by outside objects while still keeping the Order info visible enough for the repository to save it to the database?
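For illustration, the kind of change I was considering (just a sketch, not code I have working) hides the list behind a read-only view:
public class Line : IAggregateRoot
{
    private readonly List<Order> _orders;
    public int Id { get; }
    // Callers can enumerate the orders but cannot replace or reorder the list.
    public IReadOnlyList<Order> Orders => _orders.AsReadOnly();
    public Line(int id, List<Order> orders)
    {
        Id = id;
        _orders = orders;
    }
    // SetOrderValue / IsValidOrderValue unchanged from above.
}
Even then, Order.Value still has a public setter (so line.Orders[0].Value = 10 still compiles), and making that setter private is exactly what seems to break the repository.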
I have a Product table in my DB. I also have Brand and Category tables, which are not related to each other, and I want to relate them. In the form UI, when I click one of the Categories, the Brands that have products in that category should be shown.
I tried to do this as follows. First, I get the products by categoryID with the GetList method, then I collect these products' brands into pblist (a list of Brand). However, some products share the same brand, so pblist ends up with repeated brand names. I tried to fix this with the Contains method, but it does not work. I have the same problem in the other part, where I try to remove brands not included in pblist from blist (the list of all brands). I tried removing an item from blist by taking its index with blist.RemoveAt(blist.IndexOf(item));, but that does not work either: IndexOf returns -1 even though the item is in blist.
public class BrandVM : BaseVM
{
public int ProductCount { get; set; }
}
public class BaseVM
{
public int ID { get; set; }
public string Name { get; set; }
public override string ToString()
{
return this.Name;
}
}
public class BrandService : ServiceBase, IBrandService
{
public List<BrandVM> GetList(int Count)
{
try
{
var result = GetQuery();
result = Count > 0 ? result.Take(Count) : result;
return result.ToList();
}
catch (Exception ex)
{
return null;
}
}
public List<BrandVM> GetListByCatID(int pCatID)
{
var plist = productService.GetListByCatID(pCatID);
List<BrandVM> pblist = new List<BrandVM>();
foreach (var item in plist)
{
if (!pblist.Contains(item.Brand))
{
pblist.Add(item.Brand);
}
}
var blist = GetList(0);
var blistBackup = GetList(0);
foreach (BrandVM item in blistBackup)
{
if (!pblist.Contains(item))
{
blist.Remove(item);
}
}
return blist;
}
}
These are my classes related to Brand. In BrandService I have shown only the methods I have filled in; there are more methods to implement.
This method is in my ProductService; I use it to pull the product list by CategoryID (plist):
public List<ProductVM> GetListByCatID(int EntityID)
{
try
{
var result = GetQuery().Where(x => x.Category.ID==EntityID);
return result.ToList();
}
catch (Exception ex)
{
return null;
}
}
This is the GetQuery method for ProductService; the other services have some differences, but they are similar:
private IQueryable<ProductVM> GetQuery()
{
return from p in DB.Products
select new ProductVM
{
ID = p.ProductID,
Name = p.ProductName,
UnitPrice = (decimal)p.UnitPrice,
Category =p.CategoryID==null?null:new CategoryVM()
{
ID = (int)p.CategoryID,
Name = p.Category.CategoryName
},
Brand = p.BrandID == null ? null :
new BrandVM
{
ID=(int)p.BrandID,
Name=p.Brand.BrandName,
}
};
}
Entity Framework will translate LINQ queries into SQL statements, which means that Equals (and GetHashCode) will not be used when comparing database objects. However, if you're comparing local instances of these objects, then these methods will be used.
The default Equals does a reference comparison to determine equality, which literally means that two instances of a type are only considered equal if they both refer to the exact same object in memory.
Instead, we want to use the ID property for equality comparison, which means we need to override the Equals (and GetHashCode) methods for the class.
Here's an example of how you could do this:
public class BaseVM
{
public int ID { get; set; }
public string Name { get; set; }
public override string ToString()
{
return Name;
}
public override bool Equals(object obj)
{
return obj is BaseVM &&
((BaseVM) obj).ID == ID;
}
public override int GetHashCode()
{
return ID;
}
}
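With that override in place, the original Contains and IndexOf calls behave as expected, since List<T> compares elements through EqualityComparer<T>.Default, which picks up the overridden Equals. A quick illustration with made-up values:
var a = new BaseVM { ID = 5, Name = "Acme" };
var b = new BaseVM { ID = 5, Name = "Acme (copy)" };
Console.WriteLine(a.Equals(b));                        // True: compared by ID
Console.WriteLine(new List<BaseVM> { a }.Contains(b)); // True
Console.WriteLine(new List<BaseVM> { a }.IndexOf(b));  // 0 instead of -1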
Alternatively, if you don't want to modify the class (though I would recommend doing so, since it solves this problem everywhere), you can modify your code to filter out any brands that have the same ID (or name):
foreach (var item in plist)
{
// Note: you could potentially use 'Name' instead of 'ID'
if (!pblist.Any(productBrand => productBrand.ID == item.Brand.ID))
{
pblist.Add(item.Brand);
}
}
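Equivalently (a sketch, assuming the items in plist are non-null), you can let LINQ to Objects do the de-duplication in one pass:
var pblist = plist
    .Select(p => p.Brand)
    .Where(b => b != null)   // skip products without a brand
    .GroupBy(b => b.ID)      // one group per distinct brand ID
    .Select(g => g.First())
    .ToList();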
Since you don't ensure that two different instances representing the same brand compare as equal, in the sense that .Equals(object other) returns true, the .Contains method has no way to identify them. I think you'll solve your issue by overriding .Equals in your Brand class.
I'm working on a project that was not designed with unit testing in mind.
Since inside the StartWorking() method I create a new instance of WorkYear and call year.RecalculateAllTime(), is the WorkYear class considered an external dependency (in unit-testing terms)?
Or, since Employee is bound to WorkYear by composition and Employee is meant to perform actions on WorkYears, do Employee and WorkYear form a cohesive unit in which neither class is considered a dependency of the other?
In other words, should StartWorking(...) method be tested in isolation from WorkYear class?
public abstract class Employee
{
private List<WorkYear> _workYears;
private readonly IntervalCalculator _intervalCalculator;
// Other fields...
protected Employee(IntervalCalculator intervalCalculator)
{
_intervalCalculator = intervalCalculator;
WorkYears = new List<WorkYear>();
}
public IEnumerable<WorkYear> WorkYears
{
get => _workYears.AsReadOnly();
private set => _workYears = value.ToList();
}
// Other properties...
public void StartWorking(DateTime joinedCompany)
{
List<PayPeriodInterval> allIntervals = _intervalCalculator.GenerateIntervalsFor(joinedCompany.Date.Year);
PayPeriodInterval currentInterval = allIntervals.Find(i => i.StartDate <= joinedCompany && joinedCompany <= i.EndDate);
PayPeriod firstPeriod = CalculateFirstPeriod(joinedCompany, currentInterval);
// There is a possibility that employee worked during this year and returned during
// the same exact year or even month. That is why we are trying to find this year in database:
WorkYear year = WorkYears.FirstOrDefault(y => y.CurrentYear == joinedCompany.Year);
if (year == null)
{
// Create new year with current and future periods.
year = new WorkYear(joinedCompany.Year, this, new List<PayPeriod> {firstPeriod});
AddYear(year);
}
else
{
// There is a possibility that employee left and got back during the same period.
// That is why we should try to find this period so that we don't override it with new one:
PayPeriod existingPeriod = year.GetPeriodByDate(joinedCompany);
if (existingPeriod != null)
{
var oldCurrentPeriodWorktime = new TimeSpan(existingPeriod.WorktimeHours, existingPeriod.WorktimeMinutes, 0);
firstPeriod = CalculateFirstPeriod(joinedCompany, currentInterval, oldCurrentPeriodWorktime);
}
year.PayPeriods.Add(firstPeriod);
}
List<PayPeriodInterval> futureIntervals = allIntervals.FindAll(i => currentInterval.EndDate < i.StartDate);
List<PayPeriod> futurePeriods = NewPeriods(futureIntervals);
year.PayPeriods.AddRange(futurePeriods);
year.RecalculateAllTime();
}
public abstract List<PayPeriod> NewPeriods(List<PayPeriodInterval> intervals);
public void AddYear(WorkYear workYear) => _workYears.Add(workYear);
protected abstract PayPeriod CalculateFirstPeriod(DateTime firstDayAtWork, PayPeriodInterval firstPeriodInterval, TimeSpan initialTime = default);
// Other methods...
}
public class WorkYear
{
public WorkYear(int currentYear, Employee employee, List<PayPeriod> periods)
{
Employee = employee;
EmployeeId = employee.Id;
CurrentYear = currentYear;
PayPeriods = periods ?? new List<PayPeriod>();
foreach (PayPeriod period in PayPeriods)
{
period.WorkYear = this;
period.WorkYearId = Id;
}
}
public int EmployeeId { get; }
public int CurrentYear { get; }
public Employee Employee { get; }
public List<PayPeriod> PayPeriods { get; set; }
// Other properties...
public void RecalculateAllTime()
{
//Implementation Logic
}
}
Super big caveat: This stuff gets really opinionated really fast. There are lots of valid designs!
Okay, now that I've said that: DTOs (data transfer objects) are not external dependencies. You don't inject them.
WorkYear has all the hallmarks of a DTO (other than that one method), so I think you are OK as is. Because it has that RecalculateAllTime method, however, it should also be unit tested itself. An example of an external dependency in this case would be something that fetches the list of work years.
The basic rule of thumb is:
You compose data (DTOs)
You inject behavior (services)
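As a sketch of what that rule looks like here (the IIntervalCalculator interface and stub below are hypothetical, not part of the posted code): the calculator is behavior, so it gets injected and can be stubbed; WorkYear is data that Employee composes, so a test of StartWorking exercises the real WorkYear and asserts on it.
// Hypothetical: extract an interface so the injected behavior can be stubbed.
public interface IIntervalCalculator
{
    List<PayPeriodInterval> GenerateIntervalsFor(int year);
}
// A test stub that always returns a fixed set of intervals.
public class FixedIntervalCalculator : IIntervalCalculator
{
    private readonly List<PayPeriodInterval> _intervals;
    public FixedIntervalCalculator(List<PayPeriodInterval> intervals) => _intervals = intervals;
    public List<PayPeriodInterval> GenerateIntervalsFor(int year) => _intervals;
}
A test would then construct a concrete Employee subclass with the stub, call StartWorking, and assert on the resulting WorkYears, with no need to isolate WorkYear itself.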
To avoid running out of memory by bringing in the whole table at once, I am loading it in chunks of LOAD_SIZE records.
Here is how I am doing it. I feel like some of the indexes may be off by one record, and that there are performance improvements I could make, so I wanted your opinion on this approach.
int totalCount = repo.Context.Employees.Count();
int startRow = 0;
while (startRow <= totalCount)
{
repo.PaginateEmployees(startRow, LOAD_SIZE);
startRow += LOAD_SIZE;
}
public List<EmpsSummary> PaginateEmployees(int startRow, int loadSize)
{
var query = (from p in this.Context.Employees
.Skip(startRow).Take(loadSize)
select new EmpsSummary
{
FirstName = p.FirstName,
LastName = p.LastName,
Phone = p.Phone
});
return query.ToList();
}
Because of how LINQ works (deferred execution and hash-based comparisons), if you formulate your statements right it will manage memory much better than you could by hand.
From your comments (which should be added to the question), I offer this solution, which should manage memory for you just fine.
This example code is not intended to compile -- it is given as an example
// insert list
List<EmpsSummary> insertList;
// add stuff to insertList
List<EmpsSummary> filteredList = insertList.Except(this.Context.Employees).ToList();
This assumes that this.Context.Employees is of type EmpsSummary. If it isn't you have to cast it to the correct type.
Also, you will need to be able to compare EmpsSummary instances. To do so, implement IEquatable<EmpsSummary> like this:
This example code is not intended to compile -- it is given as an example
public class EmpsSummary : IEquatable<EmpsSummary>
{
public string FirstName { get; set; }
public string LastName { get; set; }
public string Phone { get; set; }
public bool Equals(EmpsSummary other)
{
//Check whether the compared object is null.
if (Object.ReferenceEquals(other, null)) return false;
//Check whether the compared object references the same data.
if (Object.ReferenceEquals(this, other)) return true;
//Check whether the properties are equal (null-safe for strings).
return string.Equals(FirstName, other.FirstName) &&
string.Equals(LastName, other.LastName) &&
string.Equals(Phone, other.Phone);
}
// If Equals() returns true for a pair of objects
// then GetHashCode() must return the same value for these objects.
public override int GetHashCode()
{
int hashProductFirstName = FirstName == null ? 0 : FirstName.GetHashCode();
int hashProductLastName = LastName == null ? 0 : LastName.GetHashCode();
int hashProductPhone = Phone == null ? 0 : Phone.GetHashCode();
//Calculate the hash code
return hashProductFirstName ^ hashProductLastName ^ hashProductPhone;
}
}
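On the suspected off-by-one in the loop: with startRow <= totalCount, the final iteration requests a page that starts past the last record whenever totalCount is an exact multiple of LOAD_SIZE; startRow < totalCount avoids issuing that empty query.
Also worth noting: Skip/Take on an unordered query gives no guarantee about which rows land on which page, and LINQ to Entities generally requires an explicit ordering before Skip. A sketch of the paging method with a stable ordering (EmployeeID is an assumed key column here; any stable ordering works):
public List<EmpsSummary> PaginateEmployees(int startRow, int loadSize)
{
    return (from p in this.Context.Employees
                .OrderBy(e => e.EmployeeID) // assumed key; keeps pages stable
                .Skip(startRow)
                .Take(loadSize)
            select new EmpsSummary
            {
                FirstName = p.FirstName,
                LastName = p.LastName,
                Phone = p.Phone
            }).ToList();
}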
I have an ADO.Net Data Access layer in my application that uses basic ADO.Net coupled with CRUD stored procedures (one per operation e.g. Select_myTable, Insert_myTable). As you can imagine, in a large system (like ours), the number of DB objects required by the DA layer is pretty large.
I've been looking at the possibility of refactoring the layer classes into EF POCO classes. I've managed to do this, but when I performance test it, the results are pretty horrific. Using the class below (create an object, set the Key to the desired value, call DataSelect), 100,000 runs of data loading take about 47 seconds (there are only a handful of records in the DB), whereas the stored procedure method takes about 7 seconds.
I'm looking for advice on how to optimise this. As a point of note, I cannot change the exposed functionality of the layer, only how it implements the methods (i.e. I can't pass responsibility for context ownership to the BO layer).
Thanks
public class DAContext : DbContext
{
public DAContext(DbConnection connection, DbTransaction trans)
: base(connection, false)
{
this.Database.UseTransaction(trans);
}
protected override void OnModelCreating(DbModelBuilder modelBuilder)
{
base.OnModelCreating(modelBuilder);
//Stop Pluralising the Object names for table names.
modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
//Set any property ending in "Key" as a key type.
modelBuilder.Properties().Where(prop => prop.Name.ToLower().EndsWith("key")).Configure(config => config.IsKey());
}
public DbSet<MyTable> MyTable{ get; set; }
}
public class MyTable : DataAccessBase
{
#region Properties
public int MyTableKey { get; set; }
public string Name { get; set; }
public string Description { get; set; }
public bool Active { get; set; }
public int CreatedBy { get; set; }
public DateTime CreatedDate { get; set; }
public int ModifiedBy { get; set; }
public DateTime ModifiedDate { get; set; }
#endregion
#region constructors
public MyTable()
{
//Set Default Values.
Active = true;
Name = string.Empty;
CreatedDate = DateTime.MinValue;
ModifiedDate = DateTime.MinValue;
}
#endregion
#region Methods
public override void DataSelect(System.Data.SqlClient.SqlConnection connection, System.Data.SqlClient.SqlTransaction transaction)
{
using (DAContext ctxt = new DAContext(connection, transaction))
{
var limitquery = from C in ctxt.MyTable
select C;
//TODO: Sort the Query
limitquery = FilterQuery(limitquery);
var limit = limitquery.FirstOrDefault();
if (limit != null)
{
this.Name = limit.Name;
this.Description = limit.Description;
this.Active = limit.Active;
this.CreatedBy = limit.CreatedBy;
this.CreatedDate = limit.CreatedDate;
this.ModifiedBy = limit.ModifiedBy;
this.ModifiedDate = limit.ModifiedDate;
}
else
{
throw new ObjectNotFoundException(string.Format("No MyTable with the specified Key ({0}) exists", this.MyTableKey));
}
}
}
private IQueryable<MyTable> FilterQuery(IQueryable<MyTable> limitQuery)
{
if (MyTableKey > 0) limitQuery = limitQuery.Where(C => C.MyTableKey == MyTableKey);
if (!string.IsNullOrEmpty(Name)) limitQuery = limitQuery.Where(C => C.Name == Name);
if (!string.IsNullOrEmpty(Description)) limitQuery = limitQuery.Where(C => C.Description == Description);
if (Active) limitQuery = limitQuery.Where(C => C.Active == true);
if (CreatedBy > 0) limitQuery = limitQuery.Where(C => C.CreatedBy == CreatedBy);
if (ModifiedBy > 0) limitQuery = limitQuery.Where(C => C.ModifiedBy == ModifiedBy);
if (CreatedDate > DateTime.MinValue) limitQuery = limitQuery.Where(C => C.CreatedDate == CreatedDate);
if (ModifiedDate > DateTime.MinValue) limitQuery = limitQuery.Where(C => C.ModifiedDate == ModifiedDate);
return limitQuery;
}
#endregion
}
Selects are slow with tracking on. You should definitely turn off tracking and measure again.
Take a look at my benchmarks
http://netpl.blogspot.com/2013/05/yet-another-orm-micro-benchmark-part-23_15.html
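A minimal sketch of that change against the posted DataSelect (assuming EF6, where AsNoTracking() is available on a DbSet):
using (DAContext ctxt = new DAContext(connection, transaction))
{
    // AsNoTracking skips change-tracking setup for the materialized
    // entities, which is a large win for read-only loads like this one.
    IQueryable<MyTable> limitquery = ctxt.MyTable.AsNoTracking();
    limitquery = FilterQuery(limitquery);
    var limit = limitquery.FirstOrDefault();
    // ... copy properties exactly as before ...
}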
This might be just a hunch, but... in your stored procedure the filters are well defined, and the SP is in a compiled state with a decent execution plan. Your EF query gets constructed from scratch and recompiled on every use. So the task becomes devising a way to compile and preserve your EF queries between uses.
One way would be to rewrite FilterQuery so that it does not rely on a fluent chain of conditional method calls. Instead of appending (or not) a new condition every time your parameter set changes, convert it into a single query where each filter is either applied when its condition is met or neutralized by something like 1.Equals(1) when it is not. This way the query can be compiled and made available for re-use. The backing SQL will look funky, but execution times should improve.
Alternatively, you could devise an aspect-oriented approach where compiled queries are re-used based on parameter values. If I have the time, I will post a sample on Code Project.
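A sketch of that fixed-shape filter, starting from the posted FilterQuery (same idea as the 1.Equals(1) trick: every clause is always present, and clauses that don't apply collapse to an always-true comparison):
private IQueryable<MyTable> FilterQuery(IQueryable<MyTable> limitQuery)
{
    // Capture the parameters in locals so they become SQL parameters.
    int key = MyTableKey;
    string name = Name;
    // The query shape never changes between calls, so the translated
    // SQL (and its execution plan) can be cached and re-used.
    return limitQuery.Where(C =>
        (key <= 0 || C.MyTableKey == key) &&
        (string.IsNullOrEmpty(name) || C.Name == name));
    // ... the remaining filters follow the same pattern ...
}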
There's a list of objects, each object representing a record from a database. To sort the records there is a property called SortOrder. Here's a sample object:
public class GroupInfo
{
public int Id { get; set; }
public string Text { get; set; }
public int SortOrder { get; set; }
public GroupInfo()
{
Id = 0;
Text = string.Empty;
SortOrder = 1;
}
}
A list object would look like this:
var list = new List<GroupInfo>();
I need to be able to change the SortOrder of one item and update the SortOrder of the other objects in the list accordingly. I figured out how to move an item up or down by one; I need to know how to change it by more than one and adjust the SortOrder of the other records. Any ideas?
This can be done by first getting the original SortOrder and the updated SortOrder. You then iterate through the collection and adjust the SortOrder of any other GroupInfo objects that fall inside the range between the original and updated values. You could put all of this in a SetSortOrder function that takes in the containing collection.
public static void SetSortOrder(List<GroupInfo> groupInfos, GroupInfo target, int newSortOrder)
{
if (newSortOrder == target.SortOrder)
{
return; // No change
}
// If newSortOrder > SortOrder, shift all GroupInfos in that range down
// Otherwise, shift them up
int sortOrderAdjustment = (newSortOrder > target.SortOrder ? -1 : 1);
// Get the range of SortOrders that must be updated
int bottom = Math.Min(newSortOrder, target.SortOrder);
int top = Math.Max(newSortOrder, target.SortOrder);
// Get the GroupInfos that fall within our range
var groupInfosToUpdate = from g in groupInfos
where g.Id != target.Id
&& g.SortOrder >= bottom
&& g.SortOrder <= top
select g;
// Do the updates
foreach (GroupInfo g in groupInfosToUpdate)
{
g.SortOrder += sortOrderAdjustment;
}
target.SortOrder = newSortOrder;
// Uncomment this if you want the list to resort every time you update
// one of its members (not a good idea if you're doing bulk changes)
//groupInfos.Sort((info1, info2) => info1.SortOrder.CompareTo(info2.SortOrder));
}
Update: As suggested, I moved the logic into a static helper function.
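A quick usage example (hypothetical values; assumes the SortOrders start out contiguous at 1..n):
var list = new List<GroupInfo>
{
    new GroupInfo { Id = 1, Text = "A", SortOrder = 1 },
    new GroupInfo { Id = 2, Text = "B", SortOrder = 2 },
    new GroupInfo { Id = 3, Text = "C", SortOrder = 3 },
};
// Move "A" from position 1 to position 3; "B" and "C" each shift up by one.
SetSortOrder(list, list[0], 3);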
var sortedList = list.OrderBy(item => item.SortOrder);
Edit: Sorry, I misunderstood. You will need to write a method outside of GroupInfo to handle updating that property.