Dealing with OutOfMemoryException - C#

Is there a proper way to deal with an OutOfMemoryException while looping through a large result set and building objects from it? I have a LINQ query that returns around 600K items; I then go through each item, compute a few values, and add a new object to a list. The code is below.
static void Main(string[] args)
{
    GrabData();
}

public static void GrabData()
{
    decimal? TotalNetClaim = 0;
    using (var Context = new VSCMeSnapEntities())
    {
        List<DriveTimeObject> DataFile = new List<DriveTimeObject>();
        DriveTimeObject DT = new DriveTimeObject();
        DateTime ParamDate = new DateTime(2015, 05, 30);
        List<viewDriveTimeFileDump> DataQuery = (from z in Context.viewDriveTimeFileDumps select z).ToList();
        foreach (var item in DataQuery)
        {
            decimal? AmountChargedParts = DT.GetAmountChargedParts(item.chrComponSts.Trim(), item.mnsTotalParts);
            decimal? AmountChargedPartsTax = DT.GetAmountChargedPartsTax(item.chrComponSts.Trim(), item.mnsTotalPartTax);
            decimal? AmountChargedLabor = DT.GetAmountChargedLabor(item.chrComponSts.Trim(), item.mnsTotalLabor);
            decimal? AmountChargedLaborTax = DT.GetAmountChargedLaborTax(item.chrComponSts.Trim(), item.mnsTotalLaborTax);
            int? DaysOut = DT.GetDaysOutClaim(item.intRepairFacilCode, item.dtmContPurchDate, item.dtmReported);
            long? MilesOut = DT.GetMilesOutClaim(item.intRepairFacilCode, item.inbIncurMiles, item.inbOrigMiles);
            decimal? deductible = DT.GetDeductible(item.chrContSts, item.mnsDeduct);
            decimal? netClaim = DT.GetNetClaim(item.chrComponSts.Trim(), item.mnsTotalParts, item.mnsTotalPartTax, item.mnsTotalLabor, item.mnsTotalLaborTax, item.mnsDeduct);
            DataFile.Add(new DriveTimeObject
            {
                DealerNumber = item.chrDlrNum,
                VSCName = item.chvVSCName,
                IcLocationNumber = item.IcLocationNumber,
                IcRegion = item.IcRegion,
                Identifier = item.chrIdentifier,
                ContractNumber = item.chrContNum,
                VIN = item.chrVIN,
                CoverageCode = item.CvgCode,
                ClaimNum = item.intClaimNum,
                OriginalMiles = item.inbOrigMiles,
                ContractPurchaseDate = item.dtmContPurchDate,
                IncurMiles = item.inbIncurMiles,
                DateReported = item.dtmReported,
                DaysOutClaim = DaysOut,
                MilesOut = MilesOut,
                RepairFacilityNumber = item.intRepairFacilCode,
                FacilityName = item.chvFacilityName,
                ZipFive = item.chrZipFive,
                FacilityAdvisor = item.chrFacilAdvisor,
                ComponentStatus = item.chrComponSts,
                ComponentStatusWord = item.ComponDesc,
                ComponentCode = item.chrComponCode,
                StatusMasterDescription = item.MasterDesc,
                ComponentDescription = item.chvComponDesc,
                Parts = AmountChargedParts,
                PartsTax = AmountChargedPartsTax,
                Labor = AmountChargedLabor,
                LaborTax = AmountChargedLaborTax,
                Deductible = deductible,
                NetClaim = netClaim,
                CarrierCode = item.intCarrierCode,
                NetworkStatus = item.NetworkStatus,
                AddOn = item.chrAddOn,
                ETCDate = item.ETC,
                ATCDate = item.ATC,
                LaborTime = item.reaLaborTime,
                PaidDate = item.dtmPdDate,
                PaymentID = item.intPaymentID,
                BatchNumber = item.intBatchNum
            });
            TotalNetClaim += netClaim;
        }
    }
    Console.WriteLine(TotalNetClaim);
    Console.ReadKey();
}
I run out of memory during the foreach loop. How should I adjust this code to make it work?

The way to prevent an OutOfMemoryException is to not run out of memory, which means getting rid of objects you don't need.
Without knowing more about your use case it's hard to suggest a specific fix. Regardless, it's bad practice to hold so many objects in memory that you run out and crash; only keep in memory what you need.
One fix is to not keep everything in RAM and to use the disk instead. For example, you can write those objects to a database and discard them so you don't keep them around. With 600K objects you could do this in batches of 10K-25K records, and query them back later when you need them. If you need to do calculations over all the objects, I would recommend doing those operations with SQL queries on the database side.
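A rough sketch of the batching idea (the ordering column, the batch size, and the ProcessAndPersist helper are all illustrative, not from the question):
const int batchSize = 10000;
using (var context = new VSCMeSnapEntities())
{
    for (int page = 0; ; page++)
    {
        // Stable ordering is required for Skip/Take paging to be deterministic.
        var batch = context.viewDriveTimeFileDumps
            .OrderBy(z => z.intClaimNum)
            .Skip(page * batchSize)
            .Take(batchSize)
            .ToList();
        if (batch.Count == 0) break;

        ProcessAndPersist(batch); // hypothetical: write the results out, then let the batch go out of scope
    }
}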

You are creating a new DriveTimeObject for every row and storing it in a List<DriveTimeObject> called DataFile. Assuming there are 600K items in DataQuery, your list ends up holding 600K items as well.
However, you never read from that list, so it's just chewing up all of your memory for no reason. Remove it and you'll save yourself a whole tonne of memory; the method should also run a lot faster.
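Applied to the code in the question, that boils down to something like this (a minimal sketch that keeps only the running total; AsNoTracking() assumes the EF DbContext API is available):
public static void GrabData()
{
    decimal? totalNetClaim = 0;
    using (var context = new VSCMeSnapEntities())
    {
        var dt = new DriveTimeObject();
        // Enumerate the query directly instead of calling ToList(), so rows are
        // streamed and each item becomes collectable once the loop moves on.
        // AsNoTracking (EF 4.1+ DbContext API) keeps the context from caching each entity.
        foreach (var item in context.viewDriveTimeFileDumps.AsNoTracking())
        {
            totalNetClaim += dt.GetNetClaim(item.chrComponSts.Trim(), item.mnsTotalParts,
                item.mnsTotalPartTax, item.mnsTotalLabor, item.mnsTotalLaborTax, item.mnsDeduct);
        }
    }
    Console.WriteLine(totalNetClaim);
    Console.ReadKey();
}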

Related

Ramifications of returning a stored procedure's result type directly, as opposed to using a foreach to populate a data model?

Here are two examples of methods. The first one uses a foreach statement to populate a data model:
public List<TheOrderSummary> GetTheOrderSummaryByID(int id)
{
    ExoEntities = new ExoEntities();
    List<TheOrderSummary> lst = new List<TheOrderSummary>();
    // 'query' is defined elsewhere; its definition was elided in the original post
    foreach (var a in query)
    {
        lst.Add(new TheOrderSummary
        {
            OrderDetailID = a.OrderDetailID,
            OrderHeaderID = a.OrderHeaderID,
            ItemSeq = a.ItemSeq,
            MaterialID = a.MaterialID,
            Description = a.Description,
            DWidth = a.DWidth,
            DLength = a.DLength,
            MaterialArea = a.MaterialArea.ToString(),
            MaterialDetail = a.MaterialDetail,
            DHeight = a.DHeight,
            PurchUnitOfMeasure = a.PurchUnitOfMeasure,
            SellUnitOfMeasure = a.SellUnitOfMeasure,
            MaterialCategory = a.MaterialCategory,
            MaterialType = a.MaterialType,
            MaterialSubType = a.MaterialSubType,
            ColorID = (int)a.ColorID,
            Color = a.Color,
            MaterialPrice = a.MaterialPrice,
            MaterialCost = a.MaterialCost,
            MaterialLocationID = (int)a.MaterialLocationID,
            MaterialLocation = a.MaterialLocation,
            LaborPrice = a.LaborPrice,
            LaborCost = a.LaborCost,
            VendorID = (int)a.VendorID,
            VendorName = a.VendorName,
            Size = a.Size,
            Height = (decimal)a.Height,
            Length = (decimal)a.Length,
            Width = (decimal)a.Width,
            PurchaseQuantity = (decimal)a.PurchaseQuantity,
            SellQuantity = (decimal)a.SellQuantity,
            TotalFootage = (decimal)a.TotalFootage,
            GeneratedItemInd = (int)a.GeneratedItemInd,
            ExtendedMaterialPrice = (decimal)a.ExtendedMaterialPrice,
            ExtendedLaborCost = (decimal)a.ExtendedLaborCost,
            ExtendedMaterialCost = (decimal)a.ExtendedMaterialCost
        });
    }
    return lst;
}
and this one uses a stored procedure to return an object
public List<usp_GetTheOrderDetails_Result> GetTheOrderSummaryByID(int id)
{
    ExoEntities = new ExoEntities();
    var query = ExoEntities.usp_GetTheOrderDetails(id);
    return query.ToList();
}
Both of these are in a DAL, and the method that could call either one of these returns a JsonResult; both can be used to populate a grid. What ramifications would the second approach have down the road compared to the first? They both return exactly the same thing and, from the looks of it (without doing the numbers), the second one is faster.
The pain introduced by returning the result directly is that you get a hard-wired dependency on the information contract between the consumer and the underlying data provider. If you make a change to the underlying data structure, you must update the client at the same time (which may come with various degrees of pain).
If you instead go with your first option, you encapsulate the knowledge of the underlying information structure into this layer, and may use that to map between the old and new contracts, thereby decoupling the direct dependency between the client and the data provider.
So yes, the second option might be a tad faster (even though the difference really should be marginal), but the first one is likely to give you much less pain when it comes to maintaining code and deployments in the longer run.
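A sketch of a middle ground (assuming the procedure's result type exposes the same columns): keep calling the stored procedure, but map its result onto your own contract inside the DAL, so the client never sees the generated type:
public List<TheOrderSummary> GetTheOrderSummaryByID(int id)
{
    ExoEntities = new ExoEntities();
    // The proc remains the data source, but this projection is the only
    // thing that has to change if the underlying structure changes.
    return ExoEntities.usp_GetTheOrderDetails(id)
        .Select(r => new TheOrderSummary
        {
            OrderDetailID = r.OrderDetailID,
            OrderHeaderID = r.OrderHeaderID
            // ... map the remaining properties as in the first example
        })
        .ToList();
}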

Speeding up a LINQ query with 40,000 rows

In my service, I first generate 40,000 possible combinations of home and host countries, like so (clientLocations contains 200 records, and 200 x 200 is 40,000):
foreach (var homeLocation in clientLocations)
{
    foreach (var hostLocation in clientLocations)
    {
        allLocationCombinations.Add(new AirShipmentRate
        {
            HomeCountryId = homeLocation.CountryId,
            HomeCountry = homeLocation.CountryName,
            HostCountryId = hostLocation.CountryId,
            HostCountry = hostLocation.CountryName,
            HomeLocationId = homeLocation.LocationId,
            HomeLocation = homeLocation.LocationName,
            HostLocationId = hostLocation.LocationId,
            HostLocation = hostLocation.LocationName,
        });
    }
}
Then I run the following query to find existing rates for the locations above, including empty rows for the missing rates, resulting in a complete recordset of 40,000 rows.
var allLocationRates = (from l in allLocationCombinations
                        join r in Db.PaymentRates_AirShipment
                            on new { home = l.HomeLocationId, host = l.HostLocationId }
                            equals new { home = r.HomeLocationId, host = (Guid?)r.HostLocationId }
                            into matches
                        from rate in matches.DefaultIfEmpty(new PaymentRates_AirShipment
                        {
                            Id = Guid.NewGuid()
                        })
                        select new AirShipmentRate
                        {
                            Id = rate.Id,
                            HomeCountry = l.HomeCountry,
                            HomeCountryId = l.HomeCountryId,
                            HomeLocation = l.HomeLocation,
                            HomeLocationId = l.HomeLocationId,
                            HostCountry = l.HostCountry,
                            HostCountryId = l.HostCountryId,
                            HostLocation = l.HostLocation,
                            HostLocationId = l.HostLocationId,
                            AssigneeAirShipmentPlusInsurance = rate.AssigneeAirShipmentPlusInsurance,
                            DependentAirShipmentPlusInsurance = rate.DependentAirShipmentPlusInsurance,
                            SmallContainerPlusInsurance = rate.SmallContainerPlusInsurance,
                            LargeContainerPlusInsurance = rate.LargeContainerPlusInsurance,
                            CurrencyId = rate.RateCurrencyId
                        });
I have tried using .AsEnumerable() and .AsNoTracking(), which has sped things up quite a bit. The following change shaves several seconds off of my query:
var allLocationRates = (from l in allLocationCombinations.AsEnumerable()
join r in Db.PaymentRates_AirShipment.AsNoTracking()
But, I am wondering: How can I speed this up even more?
Edit: I can't replicate the foreach functionality in LINQ.
allLocationCombinations = (from homeLocation in clientLocations
                           from hostLocation in clientLocations
                           select new AirShipmentRate
                           {
                               HomeCountryId = homeLocation.CountryId,
                               HomeCountry = homeLocation.CountryName,
                               HostCountryId = hostLocation.CountryId,
                               HostCountry = hostLocation.CountryName,
                               HomeLocationId = homeLocation.LocationId,
                               HomeLocation = homeLocation.LocationName,
                               HostLocationId = hostLocation.LocationId,
                               HostLocation = hostLocation.LocationName
                           });
I get an error on from hostLocation in clientLocations which says "cannot convert type IEnumerable to Generic.List."
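(As an aside, that error just means the query yields an IEnumerable<AirShipmentRate> while allLocationCombinations is declared as a List<AirShipmentRate>; materializing the query fixes it:)
allLocationCombinations = (from homeLocation in clientLocations
                           from hostLocation in clientLocations
                           select new AirShipmentRate
                           {
                               // same initializers as above
                               HomeCountryId = homeLocation.CountryId,
                               HostLocationId = hostLocation.LocationId
                           }).ToList();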
The fastest way to query a database is to use the power of the database engine itself.
While LINQ is a fantastic technology to use, it still generates a SELECT statement out of the LINQ query and runs that query against the database.
Your best bet is to create a database View, or a stored procedure.
Views and stored procedures can easily be integrated into Linq.
Materialized views (indexed views in MS SQL) can further speed up execution, and adding missing indexes is by far the most effective way to speed up database queries.
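For example (a sketch only; vw_AllLocationRates is a made-up name for a view that would perform the LEFT JOIN on the server and be mapped into the model):
// The heavy lifting happens in the database; LINQ just materializes the result.
var allLocationRates = Db.vw_AllLocationRates.ToList();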
How can I speed this up even more?
Optimizing is a bitch.
Your code looks fine to me. Make sure the appropriate indexes are set in your DB schema, and, as already mentioned, run the generated SQL against the database directly to get a better idea of the performance.
Well, how do you improve performance anyway?
You may want to have a glance at the following link:
10 tips to improve LINQ to SQL Performance
To me, the most important points listed in the link above are:
Retrieve only the number of records you need.
Turn off the ObjectTrackingEnabled property of the DataContext if it's not necessary.
Filter data down to what you need using DataLoadOptions.AssociateWith.
Use compiled queries where needed (please be careful with that one; see the sketch below).
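A minimal sketch of those last two tips for LINQ to SQL (the DataContext type name and the Guid key type are assumptions, not taken from the question):
// Read-only work: stop the DataContext from tracking every entity it materializes.
// Must be set before the first query runs.
Db.ObjectTrackingEnabled = false;

// Compile a frequently used query shape once and reuse it across calls.
static readonly Func<MyDataContext, Guid, IQueryable<PaymentRates_AirShipment>>
    RatesByHomeLocation = CompiledQuery.Compile(
        (MyDataContext db, Guid homeLocationId) =>
            db.PaymentRates_AirShipment.Where(r => r.HomeLocationId == homeLocationId));

// Usage: var rates = RatesByHomeLocation(Db, someLocationId).ToList();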

How to work around NotMapped properties in queries?

I have a method that looks like this:
private static IEnumerable<OrganizationViewModel> GetOrganizations()
{
    var db = new GroveDbContext();
    var results = db.Organizations.Select(org => new OrganizationViewModel
    {
        Id = org.OrgID,
        Name = org.OrgName,
        SiteCount = org.Sites.Count(),
        DbSecureFileCount = 0,
        DbFileCount = 0
    });
    return results;
}
This returns results pretty promptly.
However, you'll notice the OrganizationViewModel has two properties that are just being set to 0. The Organization model has corresponding properties which I added via a partial class and decorated with [NotMapped]: UnsecureFileCount and SecureFileCount.
If I change those 0s to something useful...
DbSecureFileCount = org.SecureFileCount,
DbFileCount = org.UnsecureFileCount
... I get the "Only initializers, entity members, and entity navigation properties are supported" exception. I find this a little confusing because I don't feel I'm asking the database about them; I'm only setting properties of the view model.
However, since EF isn't listening to my argument, I tried a different approach:
private static IEnumerable<OrganizationViewModel> GetOrganizations()
{
    var db = new GroveDbContext();
    var results = new List<OrganizationViewModel>();
    foreach (var org in db.Organizations)
    {
        results.Add(new OrganizationViewModel
        {
            Id = org.OrgID,
            Name = org.OrgName,
            DbSecureFileCount = org.SecureFileCount,
            DbFileCount = org.UnsecureFileCount,
            SiteCount = org.Sites.Count()
        });
    }
    return results;
}
Technically this gives me the correct results without an exception, but it takes forever. (By "forever" I mean more than 60 seconds, whereas the first version delivers results in under a second.)
Is there a way to optimize the second approach? Or is there a way to get the first approach to work?
Another option would be to load the values back as an anonymous type and then loop through those to populate your view model (n+1 queries are most likely the reason for the slowness).
For example:
var results = db.Organizations.Select(org => new
{
    Id = org.OrgID,
    Name = org.OrgName,
    DbSecureFileCount = org.SecureFileCount,
    DbFileCount = org.UnsecureFileCount,
    SiteCount = org.Sites.Count()
}).ToList();

var viewmodels = results.Select(x => new OrganizationViewModel
{
    Id = x.Id,
    Name = x.Name,
    DbSecureFileCount = x.DbSecureFileCount,
    DbFileCount = x.DbFileCount,
    SiteCount = x.SiteCount
});
Sorry about the formatting; I'm typing on a phone.
You are basically lazy-loading each object at each iteration of the loop, causing n+1 queries.
What you should do is bring the entire collection into memory and use it from there.
Sample code:
db.Organizations.Load(); // Load() returns void; it pulls the whole set into the context in one query
foreach (var org in db.Organizations.Local)
{
    // Here you are free to do whatever you want
}

How to shorten this piece of C# object-oriented code?

This is my createCustomer function, which returns a Customer object to the caller.
getCustomerDetail returns a DataTable, which is then used to populate the Customer object's properties. The problem is that whenever the object changes, I have to modify this method again. How do I solve this so that I only need to change the Customer object, instead of having to modify all of this mapping code?
public Objects.Customer createCustomer()
{
    DataTable dt = Database.Master.Customer.getCustomerDetail(objCustomer.Custcode);
    objCustomer.Billaddress1 = dt.Rows[0]["Billaddress1"].ToString();
    objCustomer.Billaddress2 = dt.Rows[0]["Billaddress2"].ToString();
    objCustomer.Billaddress3 = dt.Rows[0]["Billaddress3"].ToString();
    objCustomer.Billcontact = dt.Rows[0]["Billcontact"].ToString();
    objCustomer.Billfaxno = dt.Rows[0]["Billfaxno"].ToString();
    objCustomer.Billpostalcode = dt.Rows[0]["Billpostalcode"].ToString();
    objCustomer.Billremarks = dt.Rows[0]["Billremarks"].ToString();
    objCustomer.Billtelno = dt.Rows[0]["Billtelno"].ToString();
    objCustomer.Custcode = dt.Rows[0]["Custcode"].ToString();
    objCustomer.Custname = dt.Rows[0]["Custname"].ToString();
    objCustomer.Doout = dt.Rows[0]["Doout"].ToString();
    objCustomer.Douom = dt.Rows[0]["Douom"].ToString();
    objCustomer.Inuom = dt.Rows[0]["Inuom"].ToString();
    objCustomer.Location = dt.Rows[0]["Location"].ToString();
    objCustomer.Outremarks1 = dt.Rows[0]["Outremarks1"].ToString();
    objCustomer.Outremarks2 = dt.Rows[0]["Outremarks2"].ToString();
    objCustomer.Outremarks3 = dt.Rows[0]["Outremarks3"].ToString();
    objCustomer.Pacout = dt.Rows[0]["Pacout"].ToString();
    objCustomer.Pacuom = dt.Rows[0]["Pacuom"].ToString();
    objCustomer.Perout = dt.Rows[0]["Perout"].ToString();
    objCustomer.Peruom = dt.Rows[0]["Peruom"].ToString();
    objCustomer.Shipaddress1 = dt.Rows[0]["Shipaddress1"].ToString();
    objCustomer.Shipaddress2 = dt.Rows[0]["Shipaddress2"].ToString();
    objCustomer.Shipaddress3 = dt.Rows[0]["Shipaddress3"].ToString();
    objCustomer.Shipcontact = dt.Rows[0]["Shipcontact"].ToString();
    objCustomer.Shipfaxno = dt.Rows[0]["Shipfaxno"].ToString();
    objCustomer.Shippostalcode = dt.Rows[0]["Shippostalcode"].ToString();
    objCustomer.Shipremaks = dt.Rows[0]["Shipremaks"].ToString();
    objCustomer.Shiptelno = dt.Rows[0]["Shiptelno"].ToString();
    objCustomer.Shortname = dt.Rows[0]["Shortname"].ToString();
    return objCustomer;
}
Some ideas:
Use an Object-Relational Mapping (ORM) tool like Entity Framework or NHibernate
Use a tool like CodeSmith Generator to automatically generate this code for you
Use AutoMapper or reflection to hydrate your object (see the sketch below)
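A rough sketch of the reflection idea (it assumes property names match column names and, like the original code, treats everything as strings):
public static T Hydrate<T>(DataRow row) where T : new()
{
    var obj = new T();
    foreach (var prop in typeof(T).GetProperties())
    {
        // Copy only writable string properties whose name matches a column.
        if (prop.CanWrite && prop.PropertyType == typeof(string)
            && row.Table.Columns.Contains(prop.Name))
        {
            prop.SetValue(obj, row[prop.Name].ToString(), null);
        }
    }
    return obj;
}

// Usage: objCustomer = Hydrate<Objects.Customer>(dt.Rows[0]);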
You can use AutoMapper:
AutoMapper.Mapper.CreateMap<IDataReader, Objects.Customer>();
var results = AutoMapper.Mapper.Map<IDataReader, IList<Objects.Customer>>(dt.CreateDataReader());

LINQ to SQL surprisingly fast at retrieving data. Is it normal that it is 10x faster than ADO.NET?

I'm currently learning LINQ to SQL and I'm very surprised by the performance of selecting data. I'm retrieving joined data from a few tables, selecting about 40K rows. Mapping this data to objects takes about 35 s with ADO.NET and about 130 s with NHibernate, but suspiciously only 3.5 s with LINQ to SQL. Additionally, I'm using immediate (eager) loading, which looks like this:
THESIS th = new THESIS(connectionString);
DataLoadOptions dlo = new DataLoadOptions();
dlo.LoadWith<NumericFormula>(x => x.RPN);
dlo.LoadWith<RPN>(x => x.RPNDetails);
dlo.LoadWith<RPNDetail>(x => x.Parameter);
th.LoadOptions = dlo;
th.Log = Console.Out;
Looking at the log while iterating, I can't see LINQ to SQL generating any additional queries to the database.
I'm very surprised by the huge differences in performance and I wonder whether I'm misunderstanding something.
Could someone explain to me why it works so fast?
To measure time I'm using the Stopwatch class.
ADO.NET Code:
public static List<NumericFormulaDO> SelectAllNumericFormulas()
{
    var nFormulas = new List<NumericFormulaDO>();
    string queryString = @"
        SELECT *
        FROM NumericFormula nf
        Left Join Unit u on u.Unit_Id = nf.Unit_Id
        Left Join UnitType ut on ut.UnitType_Id = u.UnitType_Id
        Join RPN r on r.RPN_Id = nf.RPN_Id
        Join RPNDetails rd on rd.RPN_Id = r.RPN_Id
        Join Parameter par on par.Parameter_Id = rd.Parameter_Id
        where nf.NumericFormula_Id <= 10000";
    using (var connection = new SqlConnection(connectionString))
    {
        var command = new SqlCommand(queryString, connection);
        connection.Open();
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                var det = new RPNDetailsDO();
                det.RPNDetails_Id = Int32.Parse(reader["RPNDetails_Id"].ToString());
                det.RPN_Id = Int32.Parse(reader["RPN_Id"].ToString());
                det.Identifier = reader["Identifier"].ToString();
                det.Parameter.Architecture = reader["Architecture"].ToString();
                det.Parameter.Code = reader["Code"].ToString();
                det.Parameter.Description = reader["Description"].ToString();
                det.Parameter.Parameter_Id = Int32.Parse(reader["Parameter_Id"].ToString());
                det.Parameter.ParameterType = reader["ParameterType"].ToString();
                det.Parameter.QualityDeviationLevel = reader["QualityDeviationLevel"].ToString();
                if (nFormulas.Count > 0)
                {
                    if (nFormulas.Any(x => x.RPN.RPN_Id == Int32.Parse(reader["RPN_Id"].ToString())))
                    {
                        nFormulas.First(x => x.RPN.RPN_Id == Int32.Parse(reader["RPN_Id"].ToString())).RPN.RPNDetails.Add(det);
                    }
                    else
                    {
                        NumericFormulaDO nFormula = CreatingNumericFormulaDO(reader, det);
                        nFormulas.Add(nFormula);
                        //System.Diagnostics.Trace.WriteLine(nFormulas.Count.ToString());
                    }
                }
                else
                {
                    NumericFormulaDO nFormula = CreatingNumericFormulaDO(reader, det);
                    nFormulas.Add(nFormula);
                    //System.Diagnostics.Trace.WriteLine(nFormulas.Count.ToString());
                }
            }
        }
    }
    return nFormulas;
}

private static NumericFormulaDO CreatingNumericFormulaDO(SqlDataReader reader, RPNDetailsDO det)
{
    var nFormula = new NumericFormulaDO();
    nFormula.CalculateDuringLoad = Boolean.Parse(reader["CalculateDuringLoad"].ToString());
    nFormula.NumericFormula_Id = Int32.Parse(reader["NumericFormula_Id"].ToString());
    nFormula.RPN.RPN_Id = Int32.Parse(reader["RPN_Id"].ToString());
    nFormula.RPN.Formula = reader["Formula"].ToString();
    nFormula.Unit.Name = reader["Name"].ToString();
    if (reader["Unit_Id"] != DBNull.Value)
    {
        nFormula.Unit.Unit_Id = Int32.Parse(reader["Unit_Id"].ToString());
        nFormula.Unit.UnitType.Type = reader["Type"].ToString();
        nFormula.Unit.UnitType.UnitType_Id = Int32.Parse(reader["UnitType_Id"].ToString());
    }
    nFormula.RPN.RPNDetails.Add(det);
    return nFormula;
}
LINQ to SQL Code:
THESIS th = new THESIS(connectionString);
DataLoadOptions dlo = new DataLoadOptions();
dlo.LoadWith<NumericFormula>(x => x.RPN);
dlo.LoadWith<RPN>(x => x.RPNDetails);
dlo.LoadWith<RPNDetail>(x => x.Parameter);
th.LoadOptions = dlo;
th.Log = Console.Out;
var nFormulas = th.NumericFormulas.ToList<NumericFormula>();
NHibernate Code:
IQueryable<NumericFormulaDO> nFormulas =
    session.Query<NumericFormulaDO>()
           .Where(x => x.NumericFormula_Id <= 10000);
List<NumericFormulaDO> nForList = nFormulas.ToList<NumericFormulaDO>();
Regarding your comments: you can see that in the ADO.NET part I'm using a SqlDataReader, and in the LINQ to SQL part I try to use immediate execution.
Of course it's possible that my mapping "algorithm" in the ADO.NET part isn't very good, but NHibernate is much slower than ADO.NET (4x slower), so I wonder whether everything is really alright in the LINQ to SQL part; I think the NHibernate part is fine, and yet it is still much slower than the admittedly confusing ADO.NET part.
Thank you guys for the responses.
LINQ to SQL consumes ADO.NET and has additional overheads, so no: it shouldn't be faster unless it isn't doing the same work. There was mention of access via ordinals vs names, but frankly that affects microseconds, not seconds; it won't explain an order-of-magnitude change.
The only way to answer this is to trace what LINQ-to-SQL is doing. Fortunately this is simple - you can just do:
dbContext.Log = Console.Out;
which will write the TSQL it executes to the console. There are two options then:
you discover the TSQL isn't doing the same thing (maybe it isn't eager-loading)
you discover the TSQL is valid (=doing the same), but has a better plan - in which case... "borrow" it :p
Once you have the TSQL to compare, test that side-by-side, so you are testing the same work. If you want the convenience without the overheads, I'd look at "dapper" - it takes away the boring grunt-work of mapping readers to objects, but is very optimised.
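For reference, the Dapper version of that grunt-work looks roughly like this (NumericFormulaFlatRow is a hypothetical flat DTO matching the joined columns; regrouping rows into parent/child objects would still be done by hand):
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    // Dapper maps each row to the target type by column name.
    var rows = connection.Query<NumericFormulaFlatRow>(queryString).ToList();
}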
Here is the ADO.NET code rewritten based on the remarks above; it should be a lot faster. You could still improve it by using ordinal positions instead of column names and by reading the fields in exactly the same order as in the query, but those are micro-optimizations (a sketch follows the code).
I've also removed a couple of duplications. You might also want to look at improving the typecasting and conversions, as the Parse(ToString()) route is very inefficient and can cause very strange issues on systems running in different languages (cultures). There's also a chance of data loss when doing these conversions with decimal, float, or double values, as not all of their values can be converted to strings and back without losing precision.
public static List<NumericFormulaDO> SelectAllNumericFormulas()
{
    var nFormulas = new Dictionary<int, NumericFormulaDO>();
    string queryString = @"
        SELECT *
        FROM NumericFormula nf
        Left Join Unit u on u.Unit_Id = nf.Unit_Id
        Left Join UnitType ut on ut.UnitType_Id = u.UnitType_Id
        Join RPN r on r.RPN_Id = nf.RPN_Id
        Join RPNDetails rd on rd.RPN_Id = r.RPN_Id
        Join Parameter par on par.Parameter_Id = rd.Parameter_Id
        where nf.NumericFormula_Id <= 10000";
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var command = new SqlCommand(queryString, connection))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                var det = new RPNDetailsDO();
                det.RPNDetails_Id = (int)reader["RPNDetails_Id"];
                det.RPN_Id = (int)reader["RPN_Id"];
                det.Identifier = (string)reader["Identifier"];
                det.Parameter.Architecture = (string)reader["Architecture"];
                det.Parameter.Code = (string)reader["Code"];
                det.Parameter.Description = (string)reader["Description"];
                det.Parameter.Parameter_Id = (int)reader["Parameter_Id"];
                det.Parameter.ParameterType = (string)reader["ParameterType"];
                det.Parameter.QualityDeviationLevel = (string)reader["QualityDeviationLevel"];

                NumericFormulaDO parent = null;
                if (!nFormulas.TryGetValue(det.RPN_Id, out parent))
                {
                    parent = CreatingNumericFormulaDO(reader, det);
                    nFormulas.Add(parent.RPN.RPN_Id, parent);
                }
                else
                {
                    parent.RPN.RPNDetails.Add(det);
                }
            }
        }
    }
    return nFormulas.Values.ToList();
}
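The ordinal-access micro-optimization mentioned above would look roughly like this: resolve each column ordinal once before the loop, then use the typed getters:
int ordRpnDetailsId = reader.GetOrdinal("RPNDetails_Id");
int ordRpnId = reader.GetOrdinal("RPN_Id");
int ordIdentifier = reader.GetOrdinal("Identifier");
while (reader.Read())
{
    int rpnDetailsId = reader.GetInt32(ordRpnDetailsId); // typed getter: no boxing, no parsing
    int rpnId = reader.GetInt32(ordRpnId);
    string identifier = reader.GetString(ordIdentifier);
    // ... and so on for the remaining columns
}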
