Async and await for loop with database - C#

I'm facing a little problem. I've made a function that imports customers from my website into my database, to create invoices and so on. When this function starts, it calls another function to import only the new clients. Now I want to make that second function awaitable, so that my software at home can't start searching for the newly imported customers while the import is still running. The customer import itself is pretty easy: it just selects, in a loop, only the customers that have not been imported yet. But I've also built some safety checks in there in case something goes wrong. You know, most errors come from human input errors... What is the best way to make this function async, so that the other function can await it while it imports new customers?
public async Task<bool> ImportClients(bool onlyNewCustomers)
{
    System.Data.DataTable Table = new System.Data.DataTable();
    System.Data.DataTable CustomerTable = new System.Data.DataTable();
    System.Data.DataTable KlantTable = new System.Data.DataTable();
    int PrestaCustomerID = 0;   // original customer id from the prestashop
    int CustomerID = 0;         // original customer id from the software customer
    int CustomerIdInserted = 0; // id of the inserted customer in the software
    int Wait = 0;               // used to make MySQL wait a few milliseconds between x updates
    string Sql = "";
    string Prefix = "";
    DateTime Bday;
    // Vars for logging
    int CustomersImported = 0;
    StringBuilder NewCustomerInfo = new StringBuilder();
    StringBuilder NewCustomerAddress = new StringBuilder();
    // Select everything within the customers table. After that we check whether the
    // customer is imported; otherwise we update the client's credentials.
    Sql =
        "SELECT c.id_customer, id_gender, c.firstname, c.lastname, c.email, c.birthday, c.newsletter, c.optin, c.website, " +
        "c.active, c.date_add, c.date_upd, c.imported FROM ps_customer c " +
        (onlyNewCustomers ? "WHERE imported = 0 " : "") + "ORDER BY c.id_customer DESC;";
    Table = Functions.SelectWebQuery(Sql);
    if (Table.Rows.Count > 0)
    {
        for (int i = 0; i < Table.Rows.Count; i++)
        {
            if (somethingGoesWrong) // pseudocode for the safety checks
            {
                return false;
            }
        }
        return await Task.WhenAll<bool>(true); // this doesn't compile -- this is the part I can't figure out
    }
}
And here is how I've tried to call this function:
public async static void OrderImport()
{
    Functions fns = new Functions();
    bool importCustomers = await fns.ImportClients(true);
}
I'm using .NET 4.5 with a MySQL database in WinForms.
Thanks!
Paul

My advice would be not to return a Task<bool>, but to return the newly imported customers. If there are no new customers, or there are minor errors that will probably be solved the next time you import new customers, return an empty list. For errors that need immediate action, throw an exception.
Furthermore, I'd make an extra method to fetch new customers: FetchNewCustomers makes it easier to understand what it does than FetchCustomers(true).
public Task<IEnumerable<Customer>> FetchNewCustomersAsync()
{
    return FetchCustomersAsync(true);
}

public Task<IEnumerable<Customer>> FetchCustomersAsync(bool newOnly)
{
    ... // TODO: implement
}
Apparently you have a method Functions.SelectWebQuery(string) that returns a DataTable. This returned DataTable needs to be converted into a sequence of Customers.
Do this with an extension method; see Extension Methods Demystified.
public static IEnumerable<Customer> ToCustomers(this DataTable table)
{
    // in this method you handle the problems if the table is not a table of Customers
    foreach (DataRow dataRow in table.AsEnumerable())
    {
        Customer customer = new Customer()
        {
            ... // fetch values from the DataRow: dataRow.Field<string>("Name");
        };
        yield return customer;
    }
}
Usage:
string sql = ...
IEnumerable<Customer> customers = Functions.SelectWebQuery(sql).ToCustomers();
Or the async version:
IEnumerable<Customer> customers = (await Functions.SelectWebQueryAsync(sql))
.ToCustomers();
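Note that SelectWebQueryAsync does not exist in the question's code; it only has a synchronous SelectWebQuery. A minimal sketch of a stopgap on .NET 4.5 (Task.Run merely offloads the blocking call to a thread-pool thread so the WinForms UI stays responsive; a true async version would use the MySQL provider's async API):
public static Task<DataTable> SelectWebQueryAsync(string sql)
{
    // Offload the blocking ADO.NET call so the UI thread is free while the caller awaits.
    return Task.Run(() => SelectWebQuery(sql));
}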
You need to select a SQL statement, depending on whether you want all Customers or only the new Customers:
public string sqlTextAllCustomers { get; }
public string sqlTextNewCustomers { get; }
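For example, reusing the query from the question (the column list is abbreviated here and is an assumption; with a pre-C# 6 compiler, use readonly fields instead of auto-property initializers):
public string sqlTextAllCustomers { get; } =
    "SELECT c.id_customer, c.firstname, c.lastname, c.email, c.imported " +
    "FROM ps_customer c ORDER BY c.id_customer DESC;";

public string sqlTextNewCustomers { get; } =
    "SELECT c.id_customer, c.firstname, c.lastname, c.email, c.imported " +
    "FROM ps_customer c WHERE c.imported = 0 ORDER BY c.id_customer DESC;";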
Now we are ready to implement FetchCustomersAsync:
public async Task<IEnumerable<Customer>> FetchCustomersAsync(bool newOnly)
{
    string sqlText = newOnly ? sqlTextNewCustomers : sqlTextAllCustomers;
    try
    {
        // in baby steps:
        DataTable fetchedData = await Functions.SelectWebQueryAsync(sqlText);
        return fetchedData.ToCustomers();
    }
    catch (Exception exc)
    {
        // TODO: detect whether this is a serious error or not (isSeriousError is a placeholder)
        if (isSeriousError)
            throw; // rethrow without resetting the stack trace
        else
        {
            // not a serious error: we'll process the new customers next time
            return Enumerable.Empty<Customer>();
        }
    }
}
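With these pieces in place, the calling code from the question could become a sketch like the following (assuming FetchNewCustomersAsync lives on the Functions class; async void is avoided so the caller can await the method and observe its exceptions):
public static async Task OrderImportAsync()
{
    Functions fns = new Functions();
    IEnumerable<Customer> newCustomers = await fns.FetchNewCustomersAsync();
    foreach (Customer customer in newCustomers)
    {
        // create invoices etc. for each newly imported customer
    }
}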


I sent over 900 DataSet rows to the web API, causing a timeout. Using ASP.NET MVC

I would like to send them in batches of 100 rows at a time, but I don't know how to split them up. C#.
This is the code I use to call the web API:
public static string AddPlanningAPI(string planNo, string jobNo, string bDate, string eDate, string progId, string timeGroup, string userId, string stationId)
{
    DataSet ds = dsConfirmBookingAPI(planNo, jobNo, bDate, eDate, progId, timeGroup, userId, stationId);
    RSWSJobOrder.DataJobOrderSoapClient rsWsJobOrder = new RSWSJobOrder.DataJobOrderSoapClient();
    try
    {
        string iResult = rsWsJobOrder.AddPlaning_Return_JsonString(ds, userId);
        return iResult;
    }
    catch (Exception tmp_ex) { throw tmp_ex; }
}

public string AddPlaning_Return_JsonString(System.Data.DataSet SendDs, string createBy)
{
    return base.Channel.AddPlaning_Return_JsonString(SendDs, createBy);
}
So you basically want to batch-upload your rows. The code below will achieve that: you define the number of items per batch, and the rows are uploaded in those groups.
Please refer to the comments in the code for an explanation of each line.
Note: if your DataTable needs a name (or you have multiple tables in the DataSet), you'll need to modify the code below to accommodate those requirements.
public void BatchUpload(DataSet ds, string userId) // userId and the client below come from the question's code
{
    int numberPerBatch = 100; // Define the number per batch
    var rsWsJobOrder = new RSWSJobOrder.DataJobOrderSoapClient();
    for (int skip = 0; skip < ds.Tables[0].Rows.Count; skip += numberPerBatch) // Group the batches
    {
        DataTable batchDT = ds.Tables[0].Clone(); // Clone() copies the schema, so the batch table has the same columns
        var batch = ds.Tables[0].Rows.Cast<System.Data.DataRow>().Skip(skip).Take(numberPerBatch); // LINQ to create the batch off the existing set
        foreach (var row in batch) // Import rows into the new DataTable
        {
            batchDT.ImportRow(row); // ImportRow copies the row; Rows.Add(row) would throw because the row already belongs to another table
        }
        DataSet batchDS = new DataSet(); // Create a new DataSet
        batchDS.Tables.Add(batchDT);     // Add the DataTable to the DataSet
        string iResult = rsWsJobOrder.AddPlaning_Return_JsonString(batchDS, userId); // send the batch off
    }
}
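A possible call site, reusing dsConfirmBookingAPI from the question (the arguments are whatever you already pass to AddPlanningAPI):
DataSet ds = dsConfirmBookingAPI(planNo, jobNo, bDate, eDate, progId, timeGroup, userId, stationId);
BatchUpload(ds, userId); // sends the rows in groups of 100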

MultiSpeak API: How to Put Database Values in an Array

I am working on the MultiSpeak API, which I am not familiar with. A function is as follows:
public meter[] GetMeterByAccountNumber(string accountNumber)
{
    meter myMeter = new meter(); // Is this declaration right?
    // some query work, and next is the sql data reader
    int i = 0;
    while (rdr.Read())
    {
        myMeter[i].deviceClass = rdr["deviceClass"].ToString(); // error: Cannot apply indexing with [] to type 'meter'
        i++;
    }
    return myMeter[]; // generates ERROR: Value expected.
}
I don't know what the return type of 'GetMeterByAccountNumber' is, but it does expect a meter[] array to be returned.
GetMeterByAccountNumber is not the return type; it's the function name.
You could do something like this; however, I would call it GetMetersByAccountNumber, as it returns an array/IEnumerable.
Also, I'm not sure what deviceClass has to do with the account number.
using System.Linq;
using System.Collections.Generic;

public IEnumerable<meter> GetMetersByAccountNumber(string accountNumber)
{
    var items = new List<meter>();
    // some query work, and next is the sql data reader (rdr comes from that query code)
    while (rdr.Read())
    {
        var deviceClass = rdr["deviceClass"].ToString();
        var meter = new meter();
        // I'm guessing meter has some properties to set?
        meter.deviceClass = deviceClass;
        items.Add(meter);
    }
    return items.AsReadOnly();
}
Here is what works for me, based on Richard Friend's answer. Thanks Richard!
public meter[] GetMetersByAccountNumber(string accountNumber)
{
    meter[] final_return;
    var items = new List<meter>();
    while (rdr.Read())
    {
        var meter = new meter();
        meter.deviceClass = rdr["deviceClass"].ToString();
        items.Add(meter);
    }
    final_return = items.ToArray();
    return final_return;
}

CRM Dynamics 2013 SDK Update Current Accounts With 2 Values

I have a scenario in CRM where I need to update existing accounts with their VAT and registration numbers. There are well over 30 thousand accounts in the system. I am trying to update them using the CRM SDK API, but I am battling to figure out how to perform the actual update. The VAT and registration numbers have been provided to me in a spreadsheet, each against its corresponding account number. Please note that the accounts are already in CRM, so I just need to update the correct account with its VAT and registration number. How can I do this in CRM? Please advise on my code below:
public static void UpdateAllCRMAccountsWithVATAndRegistrationNumber(IOrganizationService service)
{
    QueryExpression qe = new QueryExpression();
    qe.EntityName = "account";
    qe.ColumnSet = new ColumnSet("account", "new_vatno", "new_registrationnumber");
    qe.Criteria.AddCondition("accountnumber", ConditionOperator.In, "TA10024846", "TA10028471", "TA20014015", "TA4011652", "TA4011557");
    EntityCollection response = service.RetrieveMultiple(qe);
    foreach (var acc in response.Entities)
    {
        acc.Attributes["new_vatno"] = // this is where I am struggling to figure out how I am going to match the records up
        acc.Attributes["new_registrationnumber"] = // this is where I am struggling to figure out how I am going to match the records up
        service.Update(acc);
    }
}
How am I going to ensure that I update the correct records? I have the VAT and registration numbers for the accounts in a spreadsheet, each against its account number. Can I please get some advice here? Thanks.
I would load the list of VAT updates from the spreadsheet into a dictionary, then load the 30k records from CRM into memory, match them up, and use ExecuteMultipleRequest to do the updates. Alternatively, you could query CRM using the account numbers (if the list is small enough). I made the assumption that you have thousands of updates to do across the record set of 30k. Note: if the Account record size were very large and couldn't be loaded into memory, you would need to do account-number queries instead.
Here is the rough code for the basic solution (I haven't tested it, the method should be split up, and there is minimal error handling):
public class VatInfo
{
    public string RegistrationNumber;
    public string TaxNumber;

    public static Dictionary<string, VatInfo> GetVatList()
    {
        //TODO: Implement logic to load CSV file into a list. Dictionary key value should be Account Number
        throw new NotImplementedException();
    }
}

public class UpdateVatDemo
{
    public const int maxBatchSize = 100;

    public static void RunVatUpdate(IOrganizationService conn)
    {
        var vats = VatInfo.GetVatList();
        // Page through all accounts, queueing them up in memory
        var pagingQuery = new QueryExpression("account");
        pagingQuery.ColumnSet = new ColumnSet("accountnumber");
        Queue<Entity> allEnts = new Queue<Entity>();
        while (true)
        {
            var results = conn.RetrieveMultiple(pagingQuery);
            if (results.Entities != null && results.Entities.Any())
                results.Entities.ToList().ForEach(allEnts.Enqueue);
            if (!results.MoreRecords) break;
            pagingQuery.PageInfo.PageNumber++;
            pagingQuery.PageInfo.PagingCookie = results.PagingCookie;
        }
        ExecuteMultipleRequest emr = null;
        while (allEnts.Any())
        {
            if (emr == null)
                emr = new ExecuteMultipleRequest()
                {
                    Settings = new ExecuteMultipleSettings()
                    {
                        ContinueOnError = true,
                        ReturnResponses = true
                    },
                    Requests = new OrganizationRequestCollection()
                };
            var ent = allEnts.Dequeue();
            if (vats.ContainsKey(ent.GetAttributeValue<string>("accountnumber")))
            {
                var newEnt = new Entity("account", ent.Id);
                newEnt.Attributes.Add("new_vatno", vats[ent.GetAttributeValue<string>("accountnumber")].TaxNumber);
                newEnt.Attributes.Add("new_registrationnumber", vats[ent.GetAttributeValue<string>("accountnumber")].RegistrationNumber);
                emr.Requests.Add(new UpdateRequest() { Target = newEnt });
            }
            if (emr.Requests.Count >= maxBatchSize)
            {
                try
                {
                    var emResponse = (ExecuteMultipleResponse)conn.Execute(emr);
                    foreach (var responseItem in emResponse.Responses.Where(responseItem => responseItem.Fault != null))
                        DisplayFault(emr.Requests[responseItem.RequestIndex],
                            responseItem.RequestIndex, responseItem.Fault);
                }
                catch (Exception ex)
                {
                    Console.WriteLine($"Exception during ExecuteMultiple: {ex.Message}");
                    throw;
                }
                emr = null;
            }
        }
        // TODO: a final partial batch (fewer than maxBatchSize requests) may remain
        // in emr here and should be executed the same way.
    }

    private static void DisplayFault(OrganizationRequest organizationRequest, int count,
        OrganizationServiceFault organizationServiceFault)
    {
        Console.WriteLine(
            "A fault occurred when processing {1} request, at index {0} in the request collection with a fault message: {2}",
            count + 1,
            organizationRequest.RequestName,
            organizationServiceFault.Message);
    }
}
Updating the fetched entity is bound to fail because of its entity state, which would not be null.
To update the fetched entities, you need to new up the entity:
foreach (var acc in response.Entities)
{
    var updateAccount = new Entity("account") { Id = acc.Id };
    updateAccount.Attributes["new_vatno"] = null; // using null as an example
    updateAccount.Attributes["new_registrationnumber"] = null;
    service.Update(updateAccount); // update the new entity, not the fetched one
}
The code below shows how I managed to get it right. First, let me explain: I imported my records into a separate SQL table; in my code I read that table into a list in memory, then query the CRM accounts that need to be updated. I then loop through each account and check whether the account number in CRM matches an account number from my SQL database; if it matches, I update the relevant registration and VAT numbers. See the code below:
List<Sheet1_> crmAccountList = new List<Sheet1_>();
//var crmAccount = db.Sheet1_.Select(x => x).ToList().Take(2);
var crmAccounts = db.Sheet1_.Select(x => x).ToList();
foreach (var dbAccount in crmAccounts)
{
    CRMDataObject modelObject = new CRMDataObject()
    {
        ID = dbAccount.ID,
        Account_No = dbAccount.Account_No,
        Tax_No = dbAccount.Tax_No.ToString(),
        Reg_No = dbAccount.Reg_No
        //Tarsus_Country = dbAccount.Main_Phone
    };
}
var officialDatabaseList = crmAccounts;
foreach (var crmAcc in officialDatabaseList)
{
    QueryExpression qe = new QueryExpression();
    qe.EntityName = "account";
    qe.ColumnSet = new ColumnSet("accountnumber", "new_vatno", "new_registrationnumber");
    qe.Criteria.AddCondition("accountnumber", ConditionOperator.In, /* list of account numbers goes here */);
    EntityCollection response = service.RetrieveMultiple(qe);
    foreach (var acc in response.Entities)
    {
        if (crmAcc.Account_No == acc.Attributes["accountnumber"].ToString())
        {
            //acc.Attributes["new_vatno"] = crmAcc.VAT_No.ToString();
            acc.Attributes["new_registrationnumber"] = crmAcc.Reg_No.ToString();
            service.Update(acc);
        }
    }
}

Updating an EntityFramework Record with relating new records

I am trying to use a single Attach to EF to update all the records I need.
public void UpdateSale(Sale s)
{
    Context.Sales.Attach(s);
    Context.Entry(s).State = System.Data.Entity.EntityState.Modified;
    Context.SaveChanges();
}
Let's say, like so. The above code gives me an error, because it says the primary keys I am trying to add already exist (they aren't automatically generated yet).
Now, Sale has a number of different other entity models inside it, like: SavedForm, ProductSale.
The code calling UpdateSale is here:
public JsonResult AddNewForms(string Anamaka, string NispahB, string Hazaot, string ManufactorerID, string ClientStatus, string TypeFile)
{
    BL.FormConnectorLogic fcl = new BL.FormConnectorLogic();
    DAL.SavedForm AnamakaForm = MakeSavedForm(Boolean.Parse(Anamaka), "מסמך הנמקה", ClientStatus);
    DAL.SavedForm NispahBForm = MakeSavedForm(Boolean.Parse(NispahB), "נספח ב", ClientStatus);
    DAL.SavedForm HazaotForm = MakeSavedForm(Boolean.Parse(Hazaot), TypeFile, ClientStatus, ManufactorerID);
    var results = new { A = AnamakaForm, N = NispahBForm, H = HazaotForm };
    return Json(results, JsonRequestBehavior.AllowGet);
}

public DAL.SavedForm MakeSavedForm(bool Authorized, string FormName, string ClientStatus, string ManufactorerID = "")
{
    DAL.Sale s = (DAL.Sale)Session["SaleSave"];
    DAL.SavedForm sf = new DAL.SavedForm();
    if (Authorized)
    {
        sf = new DAL.SavedForm();
        sf.FormName = new BL.FormConnectorLogic().getFormByName(FormName, ClientStatus, ManufactorerID).FormName;
        sf.DateFormed = DateTime.Now;
        sf.AgentID = s.AgentID;
        sf.Status = "פתוח";
        sf.SaleID = s.ID;
        s.SavedForms.Add(sf);
        new BL.SaleLogic().UpdateSale(s);
        Session["SaleSave"] = s;
        return sf;
    }
    else return null;
}
Now, I've read up on the State, and there is a difference between Added and Modified.
But I can't really tell when I am going to add and when I am going to modify.
Is there any way to disregard everything and just shove my whole class and all its relationships into the database?
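One common rule of thumb for the Added/Modified decision, as a sketch (not from the thread; it assumes Sale.ID is zero until the database generates a key):
public void UpsertSale(Sale s)
{
    // No key yet means the row is new (INSERT); otherwise it already exists (UPDATE).
    // Setting State on a detached entity also attaches it to the context.
    Context.Entry(s).State = s.ID == 0
        ? System.Data.Entity.EntityState.Added
        : System.Data.Entity.EntityState.Modified;
    Context.SaveChanges();
}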

How to take a CSV field and write to columns in SQL

I have the following code, which takes a CSV and writes it to the console:
using (CsvReader csv = new CsvReader(new StreamReader("data.csv"), true))
{
    // missing fields will not throw an exception,
    // but will instead be treated as if there was a null value
    csv.MissingFieldAction = MissingFieldAction.ReplaceByNull;
    // to replace by "" instead, use the following action:
    //csv.MissingFieldAction = MissingFieldAction.ReplaceByEmpty;
    int fieldCount = csv.FieldCount;
    string[] headers = csv.GetFieldHeaders();
    while (csv.ReadNextRecord())
    {
        for (int i = 0; i < fieldCount; i++)
            Console.Write(string.Format("{0} = {1};",
                headers[i],
                csv[i] == null ? "MISSING" : csv[i]));
        Console.WriteLine();
    }
}
The CSV file has 7 headers, for which I have 7 columns in my SQL table.
What is the best way to take each csv[i] and write it to a row for each column, then move on to the next row?
I tried to add csv[i] to a string array, but that didn't work.
I also tried the following:
SqlCommand sql = new SqlCommand("INSERT INTO table1 [" + csv[i] + "]", mysqlconnectionstring);
sql.ExecuteNonQuery();
My table (table1) is like this:
name address city zipcode phone fax device
Your problem is simple, but I will take it one step further and show you a better way to approach the issue.
When you have a problem to solve, always break it down into parts, and put each part in its own method. In your case:
1 - read from the file
2 - create a SQL query
3 - run the query
You can even add validation to the file (imagine one or more lines not even having 7 fields...). Also, take the example below only if your file never grows past around 500 lines; if it is normally bigger, you should consider a SQL statement that takes your file directly into the database. It's called bulk insert.
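The ADO.NET counterpart of the T-SQL BULK INSERT statement is SqlBulkCopy. A minimal sketch, assuming SQL Server, that table1's seven columns line up with the CSV fields in order, and a placeholder connectionString:
using System.Data;
using System.Data.SqlClient;

static void BulkInsert(DataTable parsedCsv, string connectionString)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();
        using (var bulkCopy = new SqlBulkCopy(connection))
        {
            bulkCopy.DestinationTableName = "table1";
            bulkCopy.WriteToServer(parsedCsv); // streams all rows in one round trip
        }
    }
}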
1 - read from the file:
I would use a List<string[]> to hold the line entries, and I always use a StreamReader to read from text files.
using (StreamReader sr = File.OpenText(this.CsvPath))
{
    while ((line = sr.ReadLine()) != null)
    {
        splittedLine = line.Split(new string[] { this.Separator }, StringSplitOptions.None);
        if (iLine == 0 && this.HasHeader)
            // header line
            this.Header = splittedLine;
        else
            this.Lines.Add(splittedLine);
        iLine++;
    }
}
2 - generate the SQL:
foreach (var line in this.Lines)
{
    // wrap each field in single quotes: a,b,c -> 'a','b','c'
    string entries = string.Concat("'", string.Join("','", line), "'");
    this.Query.Add(string.Format(this.LineTemplate, entries));
}
3 - run the query:
SqlCommand sql = new SqlCommand(string.Join(" ", query), mysqlconnectionstring); // query = the list returned by Generate()
sql.ExecuteNonQuery();
Having some fun, I ended up writing the whole solution. It needs more tweaks, but I will leave that to others. Solution written in C#, VS 2013.
The ExtractCsvIntoSql class is as follows:
public class ExtractCsvIntoSql
{
    private string CsvPath, Separator;
    private bool HasHeader;
    private List<string[]> Lines;
    private List<string> Query;

    /// <summary>
    /// Header content of the CSV File
    /// </summary>
    public string[] Header { get; private set; }

    /// <summary>
    /// Template to be used in each INSERT Query statement
    /// </summary>
    public string LineTemplate { get; set; }

    public ExtractCsvIntoSql(string csvPath, string separator, bool hasHeader = false)
    {
        this.CsvPath = csvPath;
        this.Separator = separator;
        this.HasHeader = hasHeader;
        this.Lines = new List<string[]>();
        this.Query = new List<string>(); // must be initialized before GenerateQuery() runs
        // you can also set this
        this.LineTemplate = "INSERT INTO [table1] VALUES ({0});";
    }

    /// <summary>
    /// Generates the SQL Query
    /// </summary>
    /// <returns></returns>
    public List<string> Generate()
    {
        if (this.CsvPath == null)
            throw new ArgumentException("CSV Path can't be empty");
        // extract csv into object
        Extract();
        // generate sql query
        GenerateQuery();
        return this.Query;
    }

    private void Extract()
    {
        string line;
        string[] splittedLine;
        int iLine = 0;
        try
        {
            using (StreamReader sr = File.OpenText(this.CsvPath))
            {
                while ((line = sr.ReadLine()) != null)
                {
                    splittedLine = line.Split(new string[] { this.Separator }, StringSplitOptions.None);
                    if (iLine == 0 && this.HasHeader)
                        // header line
                        this.Header = splittedLine;
                    else
                        this.Lines.Add(splittedLine);
                    iLine++;
                }
            }
        }
        catch (Exception ex)
        {
            // unwrap to the innermost exception before rethrowing
            while (ex.InnerException != null)
                ex = ex.InnerException;
            throw ex;
        }
        // Lines will have all rows and, per row, each column entry
    }

    private void GenerateQuery()
    {
        foreach (var line in this.Lines)
        {
            // wrap each field in single quotes: a,b,c -> 'a','b','c'
            string entries = string.Concat("'", string.Join("','", line), "'");
            this.Query.Add(string.Format(this.LineTemplate, entries));
        }
    }
}
You can run it as:
class Program
{
    static void Main(string[] args)
    {
        string file = Ask("What is the CSV file path? (full path)");
        string separator = Ask("What is the current separator? (; or ,)");
        var extract = new ExtractCsvIntoSql(file, separator);
        var sql = extract.Generate();
        Output(sql);
    }

    private static void Output(IEnumerable<string> sql)
    {
        foreach (var query in sql)
            Console.WriteLine(query);
        Console.WriteLine("*******************************************");
        Console.Write("END ");
        Console.ReadLine();
    }

    private static string Ask(string question)
    {
        Console.WriteLine("*******************************************");
        Console.WriteLine(question);
        Console.Write("= ");
        return Console.ReadLine();
    }
}
Usually I like to be a bit more generic, so I'll try to explain a very basic flow I use from time to time.
I don't like the hard-coded approach: even if your code works, it will be dedicated to exactly one type. I prefer simple reflection, first to figure out which DTO it is, and then which repository I should use to manipulate it.
For example:
public class ImportProvider
{
    private readonly string _path;
    private readonly ObjectResolver _objectResolver;

    public ImportProvider(string path)
    {
        _path = path;
        _objectResolver = new ObjectResolver();
    }

    public void Import()
    {
        var filePaths = Directory.GetFiles(_path, "*.csv");
        foreach (var filePath in filePaths)
        {
            var fileName = Path.GetFileName(filePath);
            var className = fileName.Remove(fileName.Length - 4); // strip the ".csv" extension
            using (var reader = new CsvFileReader(filePath))
            {
                var row = new CsvRow();
                var repository = (DaoBase)_objectResolver.Resolve("DAL.Repository", className + "Dao");
                while (reader.ReadRow(row))
                {
                    var dtoInstance = (DtoBase)_objectResolver.Resolve("DAL.DTO", className + "Dto");
                    dtoInstance.FillInstance(row.ToArray());
                    repository.Save(dtoInstance);
                }
            }
        }
    }
}
Above is a very basic class responsible for importing the data. Regardless of how this piece of code parses CSV files (CsvFileReader), the important part is that a CsvRow is a simple List<string>.
Below is the implementation of the ObjectResolver:
public class ObjectResolver
{
    private readonly Assembly _myDal;

    public ObjectResolver()
    {
        _myDal = Assembly.Load("DAL");
    }

    public object Resolve(string nameSpace, string name)
    {
        var myLoadClass = _myDal.GetType(nameSpace + "." + name);
        return Activator.CreateInstance(myLoadClass);
    }
}
The idea is to simply follow a naming convention; in my case, a "Dto" suffix for reflecting the instances, and a "Dao" suffix for reflecting the responsible DAO. The full name of the Dto or the Dao can be taken from the CSV name or from the header (as you wish).
The next step is filling the Dto; each Dto implements the following simple abstract class:
public abstract class DtoBase
{
    public abstract void FillInstance(params string[] parameters);
}
Since each Dto "knows" its structure (just like you knew to create an appropriate table in the database), it can easily implement the FillInstance method. Here is a simple Dto example:
public class ProductDto : DtoBase
{
    public int ProductId { get; set; }
    public double Weight { get; set; }
    public int FamilyId { get; set; }

    public override void FillInstance(params string[] parameters)
    {
        ProductId = int.Parse(parameters[0]);
        Weight = double.Parse(parameters[1]);
        FamilyId = int.Parse(parameters[2]);
    }
}
After you have your Dto filled with data, you should find the appropriate Dao to handle it, which basically happens via reflection in this line of the Import() method:
var repository = (DaoBase)_objectResolver.Resolve("DAL.Repository", className + "Dao");
In my case the Dao implements an abstract base class, but that's not really relevant to your problem; your DaoBase can be a simple abstract class with a single Save() method.
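That base could be as small as the sketch below. Note that the ProductDao further down also calls GetDbCommand, CreateParameter, and ExecuteNonQuery, so those are assumed to be helpers on this base (their signatures are a guess):
public abstract class DaoBase
{
    public abstract void Save(DtoBase dto);

    // Assumed helpers used by the concrete DAOs:
    protected abstract IDbCommand GetDbCommand(string query);
    protected abstract IDataParameter CreateParameter(string name, object value);
    protected abstract void ExecuteNonQuery(ref IDbCommand command);
}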
This way you have a dedicated Dao to CRUD your Dtos; each Dao simply knows how to save its relevant Dto. Below is the ProductDao corresponding to the ProductDto:
public class ProductDao : DaoBase
{
    private const string InsertProductQuery = @"SET foreign_key_checks = 0;
        Insert into product (productID, weight, familyID)
        VALUES (@productId, @weight, @familyId);
        SET foreign_key_checks = 1;";

    public override void Save(DtoBase dto)
    {
        var productToSave = dto as ProductDto;
        var saveproductCommand = GetDbCommand(InsertProductQuery);
        if (productToSave != null)
        {
            saveproductCommand.Parameters.Add(CreateParameter("@productId", productToSave.ProductId));
            saveproductCommand.Parameters.Add(CreateParameter("@weight", productToSave.Weight));
            saveproductCommand.Parameters.Add(CreateParameter("@familyId", productToSave.FamilyId));
            ExecuteNonQuery(ref saveproductCommand);
        }
    }
}
Please ignore the CreateParameter() method, since it's an abstraction from the base class; you can just use CreateSqlParameter or CreateDataParameter, etc.
Just notice that it's a really naive implementation; you can easily remodel it better, depending on your needs.
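For completeness, a hypothetical implementation of that helper, assuming the MySql.Data ADO.NET provider (the InsertProductQuery above uses MySQL syntax):
// On the DaoBase class; MySqlParameter comes from MySql.Data.MySqlClient.
protected IDataParameter CreateParameter(string name, object value)
{
    // Map nulls to DBNull so ADO.NET sends SQL NULL instead of throwing.
    return new MySqlParameter(name, value ?? DBNull.Value);
}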
From the first impression of your question, I guess you have a huge number of records (more than a lakh, i.e. hundreds of thousands). If so, I would consider SQL bulk copy an option. If there are fewer records, go ahead with single-record inserts. The reason your INSERT is not working is that you are not providing all of the table's columns, and there is also a syntax error.
