So I have a DataSet to query a table in an Oracle database. The table is very large, with 3.5 million entries. However, later in the code I filter this table down to the few hundred entries that are actually needed.
AgileDataSet agileDataSet = new AgileDataSet();
AgileDataSetTableAdapters.UserDataTableAdapter userDataTableAdapter = new AgileDataSetTableAdapters.UserDataTableAdapter();
userDataTableAdapter.Fill(agileDataSet.UserData);
var l = agileDataSet.UserData.Where(x => x.ID == 1234);
Due to the large number of entries, the Fill() method takes forever. Is there a way to add a condition to the Fill method at runtime? Adding a permanent WHERE clause to the TableAdapter's SQL statement in the DataSet Designer is not an option, because I do not know beforehand which elements are needed.
OK, so I implemented a hacky "solution" and created a new partial class for the TableAdapter. In this class I wrote an adapted Fill function to use whenever I need to get only a subset of the whole database table into my DataTable:
partial class UserDataTableAdapter
{
    public virtual int FillByCICList(AgileDataSet.UserDataDataTable dataTable, List<long> cics)
    {
        // Append an IN clause for the requested C_ICs to the designer-generated SELECT.
        this.CommandCollection[0].CommandText += String.Format(" WHERE vu.C_IC IN ({0})", String.Join(", ", cics));
        this.Adapter.SelectCommand = this.CommandCollection[0];
        if ((this.ClearBeforeFill == true))
        {
            dataTable.Clear();
        }
        int returnValue = this.Adapter.Fill(dataTable);
        // Restore the original command text so the standard Fill() keeps working.
        this.InitCommandCollection();
        return returnValue;
    }
}
This function takes a list of C_ICs as additional input and appends a WHERE condition to the base command text used by the standard Fill function.
Calling it looks something like this:
List<long> c_ics = new List<long>{224,484,966,695,918};
userDataTableAdapter.FillByCICList(agileDataSet.UserData, c_ics);
I am sure there is a better solution to this, but that is all I have so far.
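One incremental improvement over concatenating the values into the SQL would be to use bind variables. The sketch below is only an assumption-laden variant of the same idea: it reuses the TableAdapter members shown above (CommandCollection, Adapter, ClearBeforeFill, InitCommandCollection), assumes the Oracle provider accepts ":name" style parameters, and the method name FillByCICListParameterized is made up for illustration.
public virtual int FillByCICListParameterized(AgileDataSet.UserDataDataTable dataTable, List<long> cics)
{
    var command = this.CommandCollection[0];
    // Build one named placeholder per value, e.g. :cic0, :cic1, ...
    var placeholders = new List<string>();
    for (int i = 0; i < cics.Count; i++)
    {
        string name = ":cic" + i;
        placeholders.Add(name);
        var parameter = command.CreateParameter();
        parameter.ParameterName = name;
        parameter.Value = cics[i];
        command.Parameters.Add(parameter);
    }
    command.CommandText += String.Format(" WHERE vu.C_IC IN ({0})", String.Join(", ", placeholders.ToArray()));
    this.Adapter.SelectCommand = command;
    if (this.ClearBeforeFill)
    {
        dataTable.Clear();
    }
    int returnValue = this.Adapter.Fill(dataTable);
    this.InitCommandCollection(); // restore the original command text and parameters
    return returnValue;
}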
I am using the Seed() method of the Configuration.cs class to fill data in the database when using the Update-Database command.
Among other things I am creating a list of EventCategory objects like this:
private IList<EventCategory> CreateEventCategoriesTestDefinition()
{
    eventCategories = new List<EventCategory>();
    var eventCategoryRecruitment = new EventCategory("Recruitment");
    eventCategories.Add(eventCategoryRecruitment);
    var eventCategoryInternship = new EventCategory("Internship");
    eventCategories.Add(eventCategoryInternship);
    var eventCategoryTrainingPrograms = new EventCategory("Training Programs");
    eventCategoryTrainingPrograms.Events
        .Add(new Event("Managerial Training Program 2012-2014", eventCategoryTrainingPrograms));
    eventCategories.Add(eventCategoryTrainingPrograms);
    var eventCategoryEmployee = new EventCategory("Employee & Team Potential");
    eventCategories.Add(eventCategoryEmployee);
    return eventCategories;
}
Adding element by element. eventCategories is just a private field:
private IList<EventCategory> eventCategories;
From the Seed() method I am calling CreateEventCategoriesTestDefinition().
Almost everything is good, but when I check the data in the database I notice that the rows in the EventCategory table are not in the correct order:
As you can see in the picture, Internship and Training Programs switched positions compared to the order in which they were added in the CreateEventCategoriesTestDefinition() method.
Does anybody know what is happening here? Why is the order of adding not preserved? I know it should be preserved in a List<>, but is that not the same for an IList<>?
Or does this maybe have something to do with Entity Framework?
If you are relying upon the database for your sort order, then either:
Turn auto-increment IDs off and specify your own ID:
EventCategory(int id, string name)
Or, if you have to use a database identity column, try adding a sort order (int) column to your objects instead:
EventCategory(string name, int sortOrder)
Either way, you cannot guarantee that the sorted List will be persisted to the database in that order. I can't say for certain what your use case is here, but you shouldn't rely on SQL to sort your objects for you; when querying the database, use LINQ to order them before binding to a view.
e.g.
var mySortedCategories = dbContext.EventCategories.OrderBy(x => x.Name).ToList();
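If you go with the sort-order column from the second option, the query is much the same. This is only a sketch and assumes EventCategory gains an int SortOrder property (not in the original model) that you set while seeding:
var mySortedCategories = dbContext.EventCategories
    .OrderBy(x => x.SortOrder)   // hypothetical SortOrder column
    .ThenBy(x => x.Name)
    .ToList();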
I created a simple stored procedure with some joins on the customer table and other related tables; it takes two parameters. I can execute this SP in SQL and it works.
I drag and drop this SP onto my DBML file and recompile.
I add the code below in order to call the SP and return the result in a List:
public IQueryable<Entities.Customer> AllCustomerRanges(int CId, int ItemID)
{
    List<Entities.Customer> c = myDataContext.spCustomerRanges(CId, ItemID).ToList();
}
This gives me the error:
Cannot implicitly convert type 'System.Collections.Generic.List<spCustomerRangesResult>' to 'System.Collections.Generic.List<Entities.Customer>'
Now, I don't have a class spCustomerRangesResult, but after some research I'm puzzled whether I have done something wrong, whether I need to implement a class with all the properties that the Customer class has (which sounds a little long-winded), or whether I've just made an error.
Any idea how I can call an SP and return the data in a List?
The class spCustomerRangesResult is automatically generated based on the SP result; you should convert it to Entities.Customer like this:
public IQueryable<Entities.Customer> AllCustomerRanges(int CId, int ItemID)
{
    var c = myDataContext.spCustomerRanges(CId, ItemID).ToList();
    if (c == null)
        return null;
    var customers = c.Select(a => new Entities.Customer
    {
        FirstName = a.spResultFirstName,
        LastName = a.spResultLastName
        // this is just an example conversion, change it as needed
    });
    return customers.AsQueryable();
}
Please note that I return IQueryable to match your signature, even though calling ToList() and then returning IQueryable may not be what you need. I don't know all the details, so this is only to show how to convert; the whole method may need refactoring.
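A hypothetical caller, just to show the shape of the result (the repository instance name and the argument values are made up):
var customers = repository.AllCustomerRanges(42, 7).ToList();
foreach (var customer in customers)
{
    Console.WriteLine("{0} {1}", customer.FirstName, customer.LastName);
}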
I am new to EF. I created entity models from the database.
I have tables CurrencyMaster([FromCurrency],[ToCurrency],[ActiveStatus]) and CurrencyConversion([ID],[FromCurrency],[ToCurrency],[Date],[CurrencyFactor])
I am looping over the CurrencyMaster records, and for each one DownloadCurrencyRates gets me a List<CurrencyRate> of objects.
I just want to add these objects to the entity database.
I tried something like this:
public DownloadStatus DownloadUpdateCurrency(DateTime toDate, DateTime fromDate)
{
    CurrencyEntities db = new CurrencyEntities();
    var curMasters = db.CurrencyMasters.Where(x => x.ActiveStatus == 0);
    foreach (var item in curMasters)
    {
        var curcRatesList = DownloadCurrencyRates(fromDate, toDate,
            item.FromCurrency, item.ToCurrency);
        //I know this is a bad code
        curcRatesList.Select(x =>
        {
            db.AddToCurrencyConversions(
                new CurrencyEntity.CurrencyConversion {
                    Date = x.date,
                    CurrencyFactor = x.value,
                    FromCurrency = item.FromCurrency,
                    ToCurrency = item.ToCurrency
                }
            );
            return true;
        });
    }
    db.SaveChanges();
    return DownloadStatus.DownloadSuccess;
}
How can I do the same in a proper way?
Is there any way I can do this without looping over curcRatesList?
I am using .NET 3.5 and am not sure about the EF version. I didn't try executing the code (I need some other setup for that), but I am quite sure that what I am doing is not correct, so I am posting here.
The procedure is correct. There is no bulk-insert capability in EF that would allow you to add a whole list of entities in a single method call. You must loop over the items and add them one by one.
As a side note: I would just use an ordinary foreach loop instead of that strange Select trick. Apart from misusing Select, it won't actually do anything as written, because Select is deferred and nothing ever enumerates its result. Or, if curcRatesList is of type List<T>, you can use the ForEach method of List<T> instead of Select.
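A rough sketch of that foreach version, reusing the types and methods from your code (CurrencyEntities, DownloadCurrencyRates, AddToCurrencyConversions, DownloadStatus):
public DownloadStatus DownloadUpdateCurrency(DateTime toDate, DateTime fromDate)
{
    CurrencyEntities db = new CurrencyEntities();
    var curMasters = db.CurrencyMasters.Where(x => x.ActiveStatus == 0);
    foreach (var item in curMasters)
    {
        var curcRatesList = DownloadCurrencyRates(fromDate, toDate,
            item.FromCurrency, item.ToCurrency);
        // Add each downloaded rate as a CurrencyConversion entity, one by one.
        foreach (var rate in curcRatesList)
        {
            db.AddToCurrencyConversions(new CurrencyEntity.CurrencyConversion
            {
                Date = rate.date,
                CurrencyFactor = rate.value,
                FromCurrency = item.FromCurrency,
                ToCurrency = item.ToCurrency
            });
        }
    }
    db.SaveChanges();
    return DownloadStatus.DownloadSuccess;
}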
Is there a "best practice" way of handling bulk inserts (via LINQ) while discarding records that may already be in the table? Or am I going to have to either do a bulk insert into an import table and then delete duplicates, or insert one record at a time?
08/26/2010 - EDIT #1:
I am looking at the Intersect and Except methods right now. I am gathering up data from separate sources, converting it into a List, and I want to "compare" it to the target DB and then INSERT just the NEW records.
List<DTO.GatherACH> allACHes = new List<DTO.GatherACH>();
State.IState myState = null;
State.Factory factory = State.Factory.Instance;
foreach (DTO.Rule rule in Helpers.Config.Rules)
{
    myState = factory.CreateState(rule.StateName);
    List<DTO.GatherACH> stateACHes = myState.GatherACH();
    allACHes.AddRange(stateACHes);
}
List<Model.ACH> newRecords = new List<Model.ACH>(); // Create a disconnected "record set"...
foreach (DTO.GatherACH record in allACHes)
{
    var storeInfo = dbZach.StoreInfoes.Where(a => a.StoreCode == record.StoreCode && (a.TypeID == 2 || a.TypeID == 4)).FirstOrDefault();
    Model.ACH insertACH = new Model.ACH
    {
        StoreInfoID = storeInfo.ID,
        SourceDatabaseID = (byte)sourceDB.ID,
        LoanID = (long)record.LoanID,
        PaymentID = (long)record.PaymentID,
        LastName = record.LastName,
        FirstName = record.FirstName,
        MICR = record.MICR,
        Amount = (decimal)record.Amount,
        CheckDate = record.CheckDate
    };
    newRecords.Add(insertACH);
}
The above code builds the newRecords list. Now I am trying to get the records from this list that are not in the DB by comparing on the three-field unique index:
AchExceptComparer myComparer = new AchExceptComparer();
var validRecords = dbZach.ACHes.Intersect(newRecords, myComparer).ToList();
The comparer looks like:
class AchExceptComparer : IEqualityComparer<Model.ACH>
{
    public bool Equals(Model.ACH x, Model.ACH y)
    {
        return (x.LoanID == y.LoanID && x.PaymentID == y.PaymentID && x.SourceDatabaseID == y.SourceDatabaseID);
    }
    public int GetHashCode(Model.ACH obj)
    {
        return base.GetHashCode();
    }
}
However, I am getting this error:
LINQ to Entities does not recognize the method 'System.Linq.IQueryable`1[MisterMoney.LARS.ZACH.Model.ACH] Intersect[ACH](System.Linq.IQueryable`1[MisterMoney.LARS.ZACH.Model.ACH], System.Collections.Generic.IEnumerable`1[MisterMoney.LARS.ZACH.Model.ACH], System.Collections.Generic.IEqualityComparer`1[MisterMoney.LARS.ZACH.Model.ACH])' method, and this method cannot be translated into a store expression.
Any ideas? And yes, this is completely in line with the original question. :)
You can't do bulk inserts with LINQ to SQL (I presume you were referring to LINQ to SQL when you said "LINQ"). However, based on what you're describing, I'd recommend checking out the new MERGE operator of SQL Server 2008.
Inserting, Updating, and Deleting Data by Using MERGE
Another example here.
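To give a rough idea of the shape of such a MERGE (a sketch only, not taken from the linked examples): it assumes the new rows were first bulk-loaded into a hypothetical staging table named ACH_Staging, that the target table and columns match the Model.ACH fields from the question, and that connectionString points at the target database.
using System.Data.SqlClient;

// Insert only the rows whose three-field key is not already present in ACH.
const string mergeSql = @"
    MERGE INTO ACH AS target
    USING ACH_Staging AS source
        ON  target.LoanID = source.LoanID
        AND target.PaymentID = source.PaymentID
        AND target.SourceDatabaseID = source.SourceDatabaseID
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (StoreInfoID, SourceDatabaseID, LoanID, PaymentID, LastName, FirstName, MICR, Amount, CheckDate)
        VALUES (source.StoreInfoID, source.SourceDatabaseID, source.LoanID, source.PaymentID,
                source.LastName, source.FirstName, source.MICR, source.Amount, source.CheckDate);";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(mergeSql, connection))
{
    connection.Open();
    command.ExecuteNonQuery();
}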
I recommend you just write the SQL yourself to do the inserting; I find it is a lot faster and you can get it to work exactly how you want it to. When I did something similar to this (just a one-off program), I used a Dictionary to hold the IDs I had already inserted, to avoid duplicates.
I find LINQ to SQL is good for one record, or for a small set that spends its entire lifespan in LINQ to SQL.
Or you can try to use SQL Server 2008's Bulk Insert.
One thing to watch out for is queuing more than 2000 or so records without calling SubmitChanges(): T-SQL has a limit on the number of statements per execution, so you cannot simply queue up every record and then call SubmitChanges(), as this will throw a SqlException. You need to periodically flush the queue to avoid this.
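A minimal sketch of that periodic flushing, assuming a LINQ to SQL DataContext named dataContext with a Table<Model.ACH> named ACHes (those names are assumptions; if your context is actually Entity Framework, the equivalent AddObject/SaveChanges calls apply):
int queued = 0;
foreach (Model.ACH ach in newRecords)
{
    dataContext.ACHes.InsertOnSubmit(ach);
    // Flush periodically so a single SubmitChanges never exceeds the statement limit.
    if (++queued % 1000 == 0)
    {
        dataContext.SubmitChanges();
    }
}
dataContext.SubmitChanges(); // flush whatever is left in the queue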
What is the easiest way to convert an IQueryable object to a DataSet?
modelshredder has exactly what you need. If you have the DataContext around and don't also need the data in terms of your model, nitzmahone's solution is fine performance-wise (if it matches your setup, which is not clear to me).
Call (yourDatacontext).GetCommand(yourIQueryableHere) to get a DbCommand object, call ExecuteReader on it, and pass the reader to the DataSet's .Load method.
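A minimal sketch of those steps, assuming a LINQ to SQL DataContext named dataContext and an IQueryable named query (both names are placeholders):
using System.Data;
using System.Data.Common;

// Translate the query to a command, execute it, and load the results into a DataSet.
DbCommand command = dataContext.GetCommand(query);
DataSet dataSet = new DataSet();
dataContext.Connection.Open();
try
{
    using (DbDataReader reader = command.ExecuteReader())
    {
        dataSet.Load(reader, LoadOption.OverwriteChanges, "Results");
    }
}
finally
{
    dataContext.Connection.Close();
}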
The easiest thing to do might be to write an implementation of IDataReader that can wrap an IEnumerable (IQueryable is an IEnumerable). There is an implementation here. Then you can call DataTable.Load(IDataReader).
If you create a DataTable from the schema so that it matches your LINQ to SQL entity, I have an extension method that takes the IQueryable and fills the DataTable:
public static DataTable AsDataTable(this IQueryable value, DataTable table)
{
    var reader = value.GetEnumerator();
    while (reader.MoveNext())
    {
        var record = (Customer)reader.Current;
        table.Rows.Add(record.CustomerID, record.City);
    }
    return table;
}
Note that the cast to Customer is my TEntity:
[Table(Name = "Customers")]
public class Customer
{
    [Column(IsPrimaryKey = true)]
    public string CustomerID;
    [Column]
    public string City;
}
There are other options that use reflection to build your DataTable from the Customer class. Given the performance hit from reflection, I chose to use the following method to build my table:
public DataTable GetSchema(params string[] columns)
{
    string col_list;
    if (columns.Length == 0)
        col_list = "*";
    else
        col_list = String.Join(", ", columns);
    return Provider.QueryAsDataTable(string.Format("select top 0 {0} from customers", col_list));
}
Putting all that together, I was able to push the results into my DataGridView:
dgridRO.DataSource = new DataView(rows.AsDataTable(dmap.GetSchema("CustomerID", "City")));
**ID10TException:**
I spent all this effort getting a conversion from the IQueryable to a DataTable just so I could create a DataView as a source for the DataGridView. Well, now I know I didn't need to do that.