I am new to databases and to EF. I am using EF within an ASP.NET Core MVC project. The code below is from a controller and aims to combine data from two tables into a summary.
The database has tables: Batch, Doc.
Batch has many columns, including: int BatchId, string BatchEnd. BatchEnd is a consistently formatted DateTime, e.g. 23/09/2016 14:33:21
Doc has many columns including: string BatchId, string HardCopyDestination. Many Docs can refer to the same BatchId, but all Docs that do so have the same value for HardCopyDestination.
I want to populate the following ViewModel
public class Batch
{
public int BatchId { get; set; }
public string Time { get; set; } // from BatchEnd
public string HardCopyDestination { get; set; }
}
But my current query, below, is running dog slow. Have I implemented this correctly?
var BatchViewModels = new List<Batch>();
// this is fine
var batches = _context.BatchTable.Where(
b => b.BatchEnd.Contains(
DateTime.Now.Date.ToString("dd/MM/yyyy")));
// this bit disappears down a hole
foreach (var batch in batches)
{
var doc = _context.DocTable.FirstOrDefault(
d => d.BatchId == batch.BatchId.ToString());
if (doc != null)
{
var newBatchVM = new Batch
{
BatchId = batch.BatchId,
Time = batch.BatchEnd.Substring(11), // placeholder: grab the "14:33:21" part of BatchEnd
HardCopyDestination = doc.HardCopyDestination
};
BatchViewModels.Add(newBatchVM);
continue;
}
}
return View(BatchViewModels);
I think you're hitting the database once per batch. If you have many batches, that is expensive. You can get all the documents in one go from the database.
var batchDict = batches.ToDictionary(b => b.BatchId);
var documents = _context.DocTable
    .Where(doc => batchDict.Keys.Contains(doc.BatchId))
    .ToList(); // materialize so the dictionary lookup below runs in memory
BatchViewModels.AddRange(documents.Select(d => new Batch
{
    BatchId = d.BatchId,
    Time = batchDict[d.BatchId].BatchEnd.Substring(11), // you only want the time?
    HardCopyDestination = d.HardCopyDestination
}));
By the way, Igor is right about dates. In addition, if BatchId is an int in BatchTable, then it should be an int in DocTable as well. In the code above I assume they are the same type, but it shouldn't be hard to change if they aren't.
Igor is also right that profiling the database is a good way to see what the problem is. I'm just taking a guess based on your code.
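If the two BatchId columns do end up as the same type, you could also push everything into a single round trip and let the database do the join. A sketch, untested against your schema, assuming the columns match and EF can translate the join:

var today = DateTime.Now.Date.ToString("dd/MM/yyyy");
var viewModels = (from b in _context.BatchTable
                  where b.BatchEnd.Contains(today)
                  join d in _context.DocTable on b.BatchId equals d.BatchId
                  select new { b.BatchId, b.BatchEnd, d.HardCopyDestination })
                 .AsEnumerable()          // finish in memory; the Substring below is not meant for SQL
                 .GroupBy(x => x.BatchId) // many Docs per Batch, all with the same HardCopyDestination
                 .Select(g => new Batch
                 {
                     BatchId = g.Key,
                     Time = g.First().BatchEnd.Substring(11),
                     HardCopyDestination = g.First().HardCopyDestination
                 })
                 .ToList();

The GroupBy collapses the many Docs per Batch to one row each, relying on your guarantee that they all share the same HardCopyDestination.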
Sometimes we need to change order details, adding, removing, or editing rows, either at the customer's request or depending on stock quantity.
So I want to fetch a list, update it (remove, edit, and add rows), and then save the changes to the database.
What is the most efficient way to do this in C# with Entity Framework?
public class OrderDetail
{
public int Id { get; set; }
public int OrderId { get; set; }
public int Qty { get; set; }
public string ItemName { get; set; }
}
/// Dummy db, OrderDetail Table
{
{1, 1000, 24,"A"},
{2, 1000, 12,"B"}
}
public void Update()
{
using(var db = new xxEntities())
{
// Get All orders, OrderId==1000, total 2rows
List<OrderDetail> list = db.OrderDetails.Where(x => x.OrderId == 1000).ToList();
// remove some row or rows
var temp1 = list.First(x => x.Id == 1);
list.Remove(temp1);
// edit some row or rows
var temp2 = list.First(x => x.Id == 2);
temp2.Qty = 100;
// add some row or rows
list.Add(new OrderDetail{ Id=3, OrderId=1000, Qty=2, ItemName="C"});
list.Add(new OrderDetail{ Id=4, OrderId=1000, Qty=2, ItemName="D"});
// Apply all changes
db.SaveChanges();
}
}
Additional Question
public void UpdateOrder(int orderId, List<OrderDetail> newOrders)
{
var result = db.OrderDetails.Where(x => x.OrderId == orderId).ToList();
result = newOrders;
// it does not work
//db.OrderDetails.Update(result);
db.OrderDetails.RemoveRange(result);
db.OrderDetails.AddRange(newOrders);
db.SaveChanges();
}
Is this the right approach to update multiple rows?
As mentioned in another answer, EF will create individual statements for each change it detects (updates, inserts, deletes) and submit them inside a single transaction. It gets the job done but is potentially very "chatty". The benefit is that you don't need to worry about the details of how it gets done; it's pretty easy to just modify the data objects and call SaveChanges.
If you can consider not using EF for updates like this, one way we do this kind of update is to build a System.Data.DataTable and use it as a table-valued parameter to a stored procedure (if your datastore supports it).
Meta-code:
var dt = new DataTable();
var newRow = dt.NewRow();
newRow["column1"] = newdata;
dt.Rows.Add(newRow);
Then just use dt as your input parameter and let the stored proc determine the insert/update/delete operations.
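For completeness, a sketch of the call site, assuming SQL Server, a table type dbo.OrderDetailType, and a stored procedure dbo.UpsertOrderDetails (both names are made up; dt is the DataTable built above):

using System.Data;
using System.Data.SqlClient;

using (var conn = new SqlConnection(connectionString))
using (var cmd = new SqlCommand("dbo.UpsertOrderDetails", conn))
{
    cmd.CommandType = CommandType.StoredProcedure;
    var p = cmd.Parameters.AddWithValue("@OrderDetails", dt);
    p.SqlDbType = SqlDbType.Structured;  // marks dt as a table-valued parameter
    p.TypeName = "dbo.OrderDetailType";  // must match the table type defined in SQL Server
    conn.Open();
    cmd.ExecuteNonQuery();               // the proc decides what to insert/update/delete (e.g. via MERGE)
}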
If you want to Add / Remove / Update rows from your tables in Entity Framework, you have to Add / Remove / Update the items in your DbSet, not in fetched data.
using (var dbContext = new OrderContext())
{
// Add one Order
Order orderToAdd = new Order
{
// fill required properties; don't fill primary key
};
var addedOrder = dbContext.Orders.Add(orderToAdd);
// note: addedOrder has no Id yet.
// Add several Orders
IEnumerable<Order> orders = ...
dbContext.Orders.AddRange(orders);
dbContext.SaveChanges();
// now they've got their id:
Debug.Assert(addedOrder.Id != 0);
Debug.Assert(orders.All(order => order.Id != 0));
}
To Remove, you'll first have to fetch the complete Order
int orderIdToDelete = ...
using (var dbContext = new OrderContext())
{
Order orderToDelete = dbContext.Orders.Find(orderIdToDelete);
dbContext.Orders.Remove(orderToDelete);
var ordersToDelete = dbContext.Orders
.Where(order => order.Date.Year < 2000)
.ToList();
dbContext.Orders.RemoveRange(ordersToDelete);
// the orders are not deleted yet.
dbContext.SaveChanges();
}
To Update, you first have to get the value:
int orderIdToUpdate = ...
Order orderToUpdate = dbContext.Orders.Find(orderIdToUpdate);
orderToUpdate.Date = DateTime.Today;
var today = DateTime.Today;
var dateLimit = today.AddDays(-28);
var nonPaidOrders = dbContext.Orders
.Where(order => !order.Paid && order.Date < dateLimit)
.ToList();
foreach (var order in nonPaidOrders)
{
this.SendReminder(order);
order.ReminderDate = today;
}
dbContext.SaveChanges();
There is no "most efficient" way outside of making all changes then calling SaveChanges. upon which Ef will issue a lot of SQL Statements (one per operation).
There is most efficient way because there is no way to change the way Ef works and there is exactly one way Ef does its updates. They do NOT happen at the same time. Period. They happen in one transaction, one after the other, when you call SaveChanges.
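On the additional question: rather than RemoveRange plus AddRange, which deletes and reinserts rows that may not have changed, a common pattern is to diff the incoming list against the fetched rows. A sketch, assuming Id identifies a row and newOrders carries the desired final state:

public void UpdateOrder(int orderId, List<OrderDetail> newOrders)
{
    var existing = db.OrderDetails.Where(x => x.OrderId == orderId).ToList();

    // Delete rows that are absent from the incoming list.
    foreach (var row in existing.Where(e => newOrders.All(n => n.Id != e.Id)))
        db.OrderDetails.Remove(row);

    foreach (var incoming in newOrders)
    {
        var match = existing.FirstOrDefault(e => e.Id == incoming.Id);
        if (match == null)
            db.OrderDetails.Add(incoming);  // new row
        else
        {
            match.Qty = incoming.Qty;       // edit the tracked entity in place
            match.ItemName = incoming.ItemName;
        }
    }

    db.SaveChanges(); // all of it goes to the database in one transaction
}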
I have been trying to upload 31 distinct records from SQL Server to Azure using the Azure Search .NET SDK. I am able to upload the records without any technical errors, and the logs confirm that all 31 records were indexed, returning status code 200 for each one.
However, in the Azure portal the document count on the index shows only 27, which means 4 records did not get indexed for some reason. If two records have the same party id, only one gets uploaded.
To avoid this I created a new key in the DTO, a combination of party id and tag id, to ensure the keys are unique for every row. However, this didn't help and I keep losing the rows that have duplicate partyIds.
Could someone please explain why the records are missing? I have tried googling for related articles, but no luck so far.
Below is the DTO:
public class PartyTagMappingDto
{
[Key] //combination of partyId and TagId
public string Id { get; set; }
[IsFilterable,IsSearchable]
public string PartyId { get; set; }
[IsSearchable,IsFilterable]
public string TagId { get; set; }
[IsSearchable,IsFilterable]
public string TagName { get; set; }
public string Description { get; set; }
}
It is possible that you are sending duplicate data. If you want to check, add this code and you can find out where your 4 records are going.
var batch = IndexBatch.New(actions);
try
{
var data = GetIndexClient(IndexName).Documents.Index(batch);
var passResultCount = data.Results.Where(x => x.Succeeded).Count();
var failResultCount = data.Results.Where(x => !x.Succeeded).Count();
var messageResult = data.Results.Where(x => !string.IsNullOrEmpty(x.ErrorMessage));
var keyResult = data.Results.Where(x => !string.IsNullOrEmpty(x.Key)).Select(x => x.Key).ToList();
var unikKey = keyResult.Distinct().ToList(); // unique keys actually indexed
string json = Newtonsoft.Json.JsonConvert.SerializeObject(data);
}
catch (IndexBatchException e)
{
// Sometimes when your Search service is under load, indexing will fail for some of the documents in
// the batch. Depending on your application, you can take compensating actions like delaying and
// retrying. For this simple demo, we just log the failed document keys and continue.
Console.WriteLine(
"Failed to index some of the documents: {0}",
String.Join(", ", e.IndexingResults.Where(r => !r.Succeeded).Select(r => r.Key)));
}
Note: in the unikKey result you can find out which documents were actually updated or created on the Azure server.
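To see the problem before sending anything, you could also check the outgoing batch for duplicate keys (dtos here stands for whatever list you upload):

// Group the outgoing documents by key and keep the keys that occur more than once.
var duplicateIds = dtos.GroupBy(d => d.Id)
    .Where(g => g.Count() > 1)
    .Select(g => g.Key)
    .ToList();
// Azure Search treats the key as unique: uploading two documents with the same
// key leaves a single document in the index, which is how 31 uploads with a 200
// status can still yield only 27 documents.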
I know it is not complicated, but I am struggling with it.
I have an IList<Material> collection:
public class Material
{
public string Number { get; set; }
public decimal? Value { get; set; }
}
IList<Material> materials = new List<Material>();
materials.Add(new Material { Number = "111" });
materials.Add(new Material { Number = "222" });
And I have a DbSet<Material> collection with columns Number and ValueColumn.
I need to update the Value property of the IList<Material> items based on the DbSet<Material> collection, with the following conditions:
Only one query request to the database.
The data returned from the database has to be limited by the Number identifiers (do not load the whole database table into memory).
I tried the following (based on my previous question).
Working solution 1, but it downloads the whole table into memory (monitored in SQL Server Profiler).
var result = (
from db_m in db.Material
join m in model.Materials
on db_m.Number.ToString() equals m.Number
select new
{
db_m.Number,
db_m.Value
}
).ToList();
model.Materials.ToList().ForEach(m => m.Value = result.SingleOrDefault(db_m => db_m.Number.ToString() == m.Number).Value);
Working solution 2, but it executes a query for each item in the collection.
model.Materials.ToList().ForEach(m => m.Value = db.Material.FirstOrDefault(db_m => db_m.Number.ToString() == m.Number).Value);
An incomplete solution, where I tried to use the Contains method.
// I am trying to get a new filtered collection from the database, which I will iterate over afterwards.
var result = db.Material
    .Where(x =>
        // here is the understandable error: cannot convert int to Material, but I do not know how to solve this
        model.Materials.Contains(x.Number)
    )
    .Select(material => new Material { Number = material.Number.ToString(), Value = material.Value });
Any ideas? For me it would be much easier to execute a stored procedure with comma-separated id values as a parameter and get the data directly, but I want to master LINQ too.
I'd do something like this, without trying to get too cute:
var numbersToFilterBy = model.Materials.Select(m => m.Number).ToArray();
...
var result = from db_m in db.Material where numbersToFilterBy.Contains(db_m.Number) select new { ... };
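To finish the job, the values have to be copied back into the model. A sketch, assuming Number is an int in the database (as your solutions 1 and 2 imply via ToString()):

// Parse the string numbers once so the query compares int to int.
var numbers = model.Materials.Select(m => int.Parse(m.Number)).ToArray();

var result = db.Material
    .Where(db_m => numbers.Contains(db_m.Number)) // translates to a SQL IN (...)
    .Select(db_m => new { db_m.Number, db_m.Value })
    .ToList(); // one query, filtered server-side

foreach (var m in model.Materials)
{
    var match = result.FirstOrDefault(r => r.Number.ToString() == m.Number);
    if (match != null)
        m.Value = match.Value;
}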
Using a semi-complex structure, I am trying to 'combine' several objects into one using the LINQ Aggregate method (though if there is a better way, I am open to ideas).
Here is my basic class design.
class Aspect {
string Name { get; set; }
}
class Arrangement {
Aspect Aspect { get; set; }
IList<Aperture> Apertures { get; set; }
IList<Step> Steps { get; set; }
}
class Step {
int Rank { get; set; }
int Required { get; set; }
}
class Aperture {
string Name { get; set; }
int Size { get; set; }
}
Basically, I am trying to aggregate the entire hierarchy of an IEnumerable<Arrangement> and keep everything on the base level, but where things can appropriately overwrite, I want to overwrite them.
Update
I want to get all Arrangements that share the same Aspect.Name, and get a complete list of Steps, overwriting lower level Steps where higher level Arrangements have the same Rank with a different Required value.
So take for instance...
var list = new List<Arrangement>{
new Arrangement{
Aspect = Aspects.Named("One"),
Steps = new List<Step>{
new Step {
Rank = 1,
Required = 2
},
new Step {
Rank = 2,
Required = 4
}
}
},
new Arrangement{
Aspect = Aspects.Named("One"),
Steps = new List<Step>{
new Step {
Rank = 1,
Required = 3
}
}
}
};
When aggregated properly, it should come out to look like ...
Arrangement
- Aspect
- Name : One
- Steps
- Rank : 1
- Required : 3
- Rank : 2
- Required : 4
I have attempted to use Distinct and Aggregate and it just isn't getting me anywhere. I keep ending up with one list or the other missing. Can anyone help with this?
Update
Here is an example of my current aggregation.
public static Layouts.Template Aggregate(this IList<Layouts.Template> source) {
return source.Aggregate(
source.First(),
(current, next) => new Layouts.Template {
Apertures = (current.Apertures.Concat(next.Apertures).Distinct().ToList()),
Arrangements = (current.Arrangements.Concat(next.Arrangements).Distinct().ToList()),
Pages = (current.Pages.Concat(next.Pages).Distinct().ToList())
});
}
My problem is that I'm having a lot of trouble wrapping my head around how to do this at all, much less in one expression. I'm not unwilling to use multiple methods, but if I could encapsulate it all, it would be really useful. I am fascinated by LINQ in general and I really want to get my head around this.
Update 2
The other collection, Apertures, will work in a similar manner, but it is unrelated to the Steps. They are simply two different lists I must do the same thing to, and they have nothing in common with one another. Learning how to do this with one will give me the knowledge to do it with the other.
If there's no correlation between steps and apertures you can do this:
var result = new Arrangement
{
    Steps = list.SelectMany(arrangement => arrangement.Steps)
        .GroupBy(step => step.Rank)
        .Select(g => g.Last())
        .OrderBy(step => step.Rank)
        .ToList()
};
If there is, you'll need to combine the two somehow. If steps index into apertures, then you can use something similar.
After your updates:
var query = arrangements
    .GroupBy(a => a.Aspect.Name)
    .Select(g =>
        new Arrangement
        {
            Steps = g.SelectMany(a => a.Steps)
                .GroupBy(s => s.Rank)
                .Select(gs => gs.Last())
                .ToList(),
            Aspect = g.First().Aspect
        });
This will create output as in your example.
Now, how do you merge this with your current aggregation method? As I understand it, you want to create one big layout from all the current layout contents (including arrangements, pages, etc.)?
You don't need Aggregate at all; just split it into 3 LINQ queries:
// get all arrangements object from all layouts [flattening with SelectMany]
var arrangements = source.SelectMany(s => s.Arrangements);
// and filter them
var filteredArrangements = // enter larger query from above here
// repeat the same for Apertures and Pages
...
// and return single object
return new Layouts.Template
{
Apertures = filteredApertures,
Arrangements = filteredArrangements,
Pages = filteredPages
};
I assume this isn't the complete solution that you want:
private static Arrangement Accumulate(IEnumerable<Arrangement> arrangements)
{
var stepsByRank = new Dictionary<int, Step>();
foreach (var arrn in arrangements)
foreach (var step in arrn.Steps)
stepsByRank[step.Rank] = step; // later arrangements overwrite earlier ones with the same Rank
return new Arrangement { Steps = stepsByRank.Values.ToArray() };
}
What's missing from this? How else do you want to "aggregate" the arrangements and steps? (edit: after reading your comment, maybe this actually is what you want.)
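If you also need the grouping by Aspect.Name from your update, a usage sketch on top of this method:

// Group arrangements by aspect name, then accumulate each group.
var merged = list
    .GroupBy(a => a.Aspect.Name)
    .Select(g =>
    {
        var acc = Accumulate(g);
        acc.Aspect = g.First().Aspect; // every arrangement in the group shares this Aspect
        return acc;
    })
    .ToList();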
I have a text file that looks like this:
1,Smith, 249.24, 6/10/2010
2,Johnson, 1332.23, 6/11/2010
3,Woods, 2214.22, 6/11/2010
1,Smith, 219.24, 6/11/2010
I need to be able to find the balance for a client on a given date.
I'm wondering if I should:
A. Start from the end and read each line into an Array, one at a time.
Check the last name index to see if it is the client we're looking for.
Then, display the balance index of the first match.
or
B. Use RegEx to find a match and display it.
I don't have much experience with RegEx, but I'll learn it if it's a no-brainer in a situation like this.
I would recommend using the FileHelpers opensource project:
http://www.filehelpers.net/
Piece of cake:
Define your class:
[DelimitedRecord(",")]
public class Customer
{
public int CustId;
public string Name;
public decimal Balance;
[FieldConverter(ConverterKind.Date, "M/d/yyyy")] // adjust the pattern to match your file's date format
public DateTime AddedDate;
}
Use it:
var engine = new FileHelperAsyncEngine<Customer>();
// Read
using(engine.BeginReadFile("TestIn.txt"))
{
// The engine is IEnumerable
foreach(Customer cust in engine)
{
// your code here
Console.WriteLine(cust.Name);
// your condition >> add balance
}
}
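To answer the question's actual lookup with this engine, you could filter while enumerating (clientName and givenDate are stand-ins for your inputs):

// Assumes the file's columns map onto Customer as defined above.
decimal? balance = null;
using (engine.BeginReadFile("TestIn.txt"))
{
    foreach (Customer cust in engine)
    {
        if (cust.Name == clientName && cust.AddedDate.Date == givenDate)
            balance = cust.Balance; // keep the last matching record for that date
    }
}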
This looks like a pretty standard CSV type layout, which is easy enough to process. You can actually do it with ADO.Net and the Jet provider, but I think it is probably easier in the long run to process it yourself.
So first off, you want to process the actual text data. It is reasonable to assume each record is separated by some newline character, so you can use the ReadLine method to easily get each record:
StreamReader reader = new StreamReader(@"C:\Path\To\file.txt");
while(true)
{
var line = reader.ReadLine();
if(string.IsNullOrEmpty(line))
break;
// Process Line
}
And then to process each line, you can split the string on comma, and store the values into a data structure. So if you use a data structure like this:
public class MyData
{
public int Id { get; set; }
public string Name { get; set; }
public decimal Balance { get; set; }
public DateTime Date { get; set; }
}
And you can process the line data with a method like this:
public MyData GetRecord(string line)
{
var fields = line.Split(',');
return new MyData()
{
Id = int.Parse(fields[0]),
Name = fields[1],
Balance = decimal.Parse(fields[2]),
Date = DateTime.Parse(fields[3])
};
}
Now, this is the simplest example, and doesn't account for cases where the fields may be empty, in which case you would either need to support NULL for those fields (using nullable types int?, decimal? and DateTime?), or define some default value that would be assigned to those values.
So once you have that you can store the collection of MyData objects in a list, and easily perform calculations based on that. So given your example of finding the balance on a given date you could do something like:
var data = customerDataList.First(d => d.Name == customerNameImLookingFor
&& d.Date == dateImLookingFor);
Where customerDataList is the collection of MyData objects read from the file, customerNameImLookingFor is a variable containing the customer's name, and dateImLookingFor is a variable containing the date.
I've used this technique to process data in text files in the past for files ranging from a couple records, to tens of thousands of records, and it works pretty well.
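For completeness, a sketch of the glue that reads the file into customerDataList using the pieces above:

var customerDataList = new List<MyData>();
using (var reader = new StreamReader(@"C:\Path\To\file.txt"))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        if (string.IsNullOrWhiteSpace(line))
            continue; // skip blank lines
        customerDataList.Add(GetRecord(line));
    }
}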
I think the cleanest way is to load the entire file into an array of custom objects and work with that. For 3 MB of data, this won't be a problem. If you wanted to do completely different search later, you could reuse most of the code. I would do it this way:
class Record
{
public int Id { get; protected set; }
public string Name { get; protected set; }
public decimal Balance { get; protected set; }
public DateTime Date { get; protected set; }
public Record (int id, string name, decimal balance, DateTime date)
{
Id = id;
Name = name;
Balance = balance;
Date = date;
}
}
…
Record[] records = (from line in File.ReadAllLines(filename)
                    let fields = line.Split(',')
                    select new Record(
                        int.Parse(fields[0]),
                        fields[1],
                        decimal.Parse(fields[2]),
                        DateTime.Parse(fields[3])
                    )).ToArray();
Record wantedRecord = records.Single
    (r => r.Name == clientName && r.Date == givenDate);
Note that both your options will scan the file. That is fine if you only want to search in the file for 1 item.
If you need to search for multiple client/date combinations in the same file, you could parse the file into a Dictionary<string, Dictionary<DateTime, decimal>> first.
A direct answer: for a one-off, a RegEx will probably be faster.
If you're just reading it, I'd consider reading the whole file into memory using StreamReader.ReadToEnd and treating it as one long string to search through. When you find a record you want to look at, just find the previous and next line breaks, and you have the transaction row you want.
If it's on a server or the file is refreshed all the time, this might not be a good solution though.
If it's all well-formatted CSV like this, then I'd use something like the Microsoft.VisualBasic.FileIO.TextFieldParser class or the Fast CSV class over on CodeProject to read it all in.
The data type is a little tricky, because I imagine not every client has a record for every day. That means you can't just have a nested dictionary for your lookups. Instead, you want to "index" by name first and then date, but the form of the date record is a little different. I think I'd go for something like this as I read in each record:
Dictionary<string, SortedList<DateTime, double>>
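A sketch of building and querying that structure (double kept to match the suggestion above; decimal would arguably suit money better):

var index = new Dictionary<string, SortedList<DateTime, double>>();

// Build: call this once per record read from the file.
void Add(string name, DateTime date, double balance)
{
    if (!index.TryGetValue(name, out var byDate))
        index[name] = byDate = new SortedList<DateTime, double>();
    byDate[date] = balance; // the last record for a given date wins
}

// Lookup: the balance for a client on a given date, if present.
double? BalanceOn(string name, DateTime date) =>
    index.TryGetValue(name, out var byDate) && byDate.TryGetValue(date, out var b)
        ? b
        : (double?)null;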
Hey, hey, hey! Why not do it with this great project on CodeProject, LINQ to CSV? Way cool!
Rock solid.