How can I execute functions for the respective SQL operation (insert and update) - C#

I am inserting and updating data from one database table to another. For this I have two functions, and I want control to go to the appropriate function based on the SQL operation.
If I insert data into the table, control should go to the insert function; if I update the table's data, control should go to the update function.
Can someone help me write the conditions to achieve this?
Many thanks.
C# code:
// When the service starts, control comes to this constructor.
public ESS_VMint()
{
    // Keep a separate DataLibrary instance per database; assigning both
    // to the same dLib field (as before) would overwrite the first connection.
    essLib = new DataLibrary("ESS", false, out essStatus);
    villagemateLib = new DataLibrary("VillageMate", false, out villagemateStatus);
    InsertFields();
    UpdateFields();
}

public DataTable InsertFields()
{
    // Insert operations
}

public DataTable UpdateFields()
{
    // Update operations
}
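One way to route the call (a minimal sketch, not from the original post): check whether the target row already exists and pick the function accordingly. The table name, key column, and connection string are placeholders for the real schema:
// Requires using System.Data.SqlClient;
public void SyncRecord(string connectionString, int recordId)
{
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT COUNT(*) FROM TargetTable WHERE Id = @id", connection))
    {
        command.Parameters.AddWithValue("@id", recordId);
        connection.Open();
        int count = (int)command.ExecuteScalar();

        if (count == 0)
            InsertFields();   // no matching row yet: take the insert path
        else
            UpdateFields();   // row already present: take the update path
    }
}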

Related

C# and MongoDB: create a database from code

Hi, I'm trying to create a database in MongoDB from C# code. This is the code I'm using:
public partial class SqlToMongo : Form
{
    public SqlToMongo()
    {
        InitializeComponent();
        ConnectToMongo();
    }

    public void ConnectToMongo()
    {
        var con = "mongodb://127.0.0.1";
        MongoClient client = new MongoClient(con);
        var db = client.GetDatabase("BetsOdds");

        // Ping the server; Wait returns true if it responds within 2 seconds.
        bool d = db.RunCommandAsync((Command<BsonDocument>)"{ping:1}").Wait(2000);

        var betsOdds = db.GetCollection<BetOdds>("Betodds");
    }
}
The ping returns true when the Mongo service is running and false when it is off, so the code works.
I'm using RoboMongo as a GUI for MongoDB, and after the code runs I still don't see the database in the GUI. I need some help with what I'm doing wrong.
Thanks
The database will not show up in the list until you have added some data. I have not used RoboMongo, but if you create a database in code and then use the Mongo console to list the databases, you will not see anything. Add some data and try again; the database will then show up in the output of the 'show dbs' command.
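For example (a hedged sketch against the 2.x C# driver, inserting a throwaway document purely to force creation):
// Requires the MongoDB.Bson namespace; reuses the db variable from the question.
var col = db.GetCollection<BsonDocument>("Betodds");
// Inserting any document materializes both the collection and the
// "BetsOdds" database, so 'show dbs' (and RoboMongo) will then list it.
col.InsertOne(new BsonDocument("createdAt", DateTime.UtcNow));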

LinqPad: connect two Azure databases on the same server

Per the FAQ (1), I can add additional databases to my existing connection in a number of ways. I have tried them all, and none work for SQL Azure.
In fact, SQL Azure as a provider doesn't even include the option to "Include additional databases."
Can someone please tell me a workaround for LinqPad to connect two databases? I am trying to create a LinqPad migration script to sync data from one database to another.
http://www.linqpad.net/FAQ.aspx#cross-database
This fails because SQL Azure does not let you create linked servers. See
Can linked server providers be installed on a SQL Azure Database instance?
If you simply want to copy data from one database to another, and the schemas are the same, a workaround is to create a separate connection using the same TypedDataContext class:
void Main()
{
    CopyFrom<Customer>("<source connection string>");
}

void CopyFrom<TTable>(string sourceCxString) where TTable : class
{
    // Create another typed data context for the source. Note that it must have a compatible schema:
    using (var sourceContext = new TypedDataContext(sourceCxString) { ObjectTrackingEnabled = false })
    {
        // Delete the rows currently in our table:
        ExecuteCommand("delete " + Mapping.GetTable(typeof(TTable)).TableName);

        // Insert the rows from the source table into the target table and submit changes:
        GetTable<TTable>().InsertAllOnSubmit(sourceContext.GetTable<TTable>());
        SubmitChanges();
    }
}
Simple Select Example:
void Main()
{
    SimpleSelect("<your conn string>");
}

void SimpleSelect(string sourceCxString)
{
    // Create another typed data context for the source. Note that it must have a compatible schema:
    using (var sourceContext = new TypedDataContext(sourceCxString) { ObjectTrackingEnabled = false })
    {
        sourceContext.Assignee.OrderByDescending(a => a.CreateTimeStamp).Take(10).Dump();
        Assignee.OrderByDescending(a => a.CreateTimeStamp).Take(10).Dump();
    }
}

Validating data in my DataTable before attempting to write to the database with SqlBulkCopy

I make use of SqlBulkCopy to insert a large number of entries into our logging database.
The layout of the program is:
It receives a stream of data from the network (other servers); it then parses the stream and builds up Log objects (200-400 a second). I then add each log to a DataTable.
I then increment a counter; once I have 10,000 logs, I do the bulk insert.
Now the issue I am having is that if one of the rows doesn't pass SQL validation (e.g. one field is too long), I lose all the remaining logs.
Is there not a way to validate each Log item as I add it to the DataTable, so I can skip the invalid ones and keep all the valid ones safe?
Currently I am inserting one item at a time, and if it fails I ignore it and carry on with the next. But this obviously defeats the point and the performance benefit of SqlBulkCopy.
Some code:
private DataTable _logTable;

public void AddLog(Log log)
{
    if (log.serverId != null || log.serverId > 1)
    {
        try
        {
            _logTable.Rows.Add(log.logId, log.messageId, log.serverId, log.time,
                log.direction, log.hasRouting, log.selfRouting, log.deviceType,
                log.unitId, log.accountCode, log.clientId, log.data);

            if (_logBufferCounter++ > BufferValue)
            {
                _logBufferCounter = 0;
                using (var sbc = new SqlBulkCopy(_connectionString, SqlBulkCopyOptions.TableLock))
                {
                    sbc.DestinationTableName = "dbo.Logs";
                    sbc.BulkCopyTimeout = 0;
                    sbc.WriteToServer(_logTable);
                    _logTable.Clear();
                    sbc.Close();
                }
            }
        }
        catch (Exception e)
        {
            Log.Error("Failed to write bulk insert for LOG Table", e);
            _logTable.Clear();
        }
    }
    else
    {
        Log.Error("Server Id is null for LOG: " + LogToString(log));
    }
}
No, there is not.
But you, as the programmer, can do the validation before inserting; it is not that hard. And there is no need for a heavy DataTable at all: use normal objects and feed them to the SqlBulkCopy instance through your own implementation of the required interface (IDataReader) ;)
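As a sketch of that pre-validation idea (my own illustration, assuming the DataTable's columns have MaxLength set to mirror the destination schema; the helper name is made up):
// Returns true when every string value fits its column; call it before
// _logTable.Rows.Add(...) and skip/log the row when it returns false.
private static bool FitsSchema(DataTable table, params object[] values)
{
    for (int i = 0; i < table.Columns.Count && i < values.Length; i++)
    {
        DataColumn column = table.Columns[i];
        string s = values[i] as string;
        // MaxLength is -1 when the column has no configured limit.
        if (column.MaxLength > 0 && s != null && s.Length > column.MaxLength)
            return false;
    }
    return true;
}
Rows that fail the check can be logged and dropped individually, while all the valid ones still go to SqlBulkCopy in a single batch.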

ASP.NET MVC real-time application performance

I'm trying to create an ASP.NET MVC web application. Some pages need to show data in "real time"; this data is in a SQL Server database and is always changing.
I created a stored procedure in SQL Server. I call this procedure in my controller using Entity Framework and LINQ, and send the result to the browser using Ajax.
I used output caching to minimize the number of executions of the stored procedure.
In the same controller there are multiple methods that use the same stored procedure, and every method executes the same procedure.
How can I improve the performance of my application? Is there a way to execute the stored procedure only once for the whole controller?
This is my controller; my objective is to minimize the use of the database:
[OutputCache(Duration = 20, VaryByParam = "*")]
public class JobsETLController : Controller
{
    private ETL_REP_MAUIEntities db = new ETL_REP_MAUIEntities();
    public ObjectResult<BIOGetETLJobs_Result> ETLJobs;

    public JobsETLController()
    {
        ETLJobs = db.BIOGetETLJobs();
    }

    public ActionResult Indexp()
    {
        var y = from xx in ETLJobs
                where xx.etat != "Completed"
                orderby xx.etat ascending
                select xx;
        return PartialView(y);
    }

    public ActionResult IndexpAll()
    {
        var y = from xx in ETLJobs
                where xx.etat == "Completed"
                select xx;
        return PartialView(y);
    }
}
If your server is not in a web farm (multiple servers), then you can cache the data in the ASP.NET cache (this is not output caching, it's data caching). You simply set a five-minute expiration on the data (you say the data needs to update every 5 minutes), so that when a controller needs the data it first checks the cache, and only if the data is not there does it execute the stored procedure.
MyData items = (MyData)Cache["MyData"];
if (items == null)
{
    items = DoQueryToReturnItems();
    // The trailing arguments complete the standard Cache.Add overload:
    // no dependency, no sliding expiration, default priority, no removal callback.
    Cache.Add("MyData", items, null, DateTime.Now.AddMinutes(5),
        Cache.NoSlidingExpiration, CacheItemPriority.Default, null);
}
It's even possible to set up the cache item to be dependent on a SqlDependency, so that when the data changes, the cache entry is invalidated.
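A rough sketch of that variant (assuming Service Broker is enabled on the database and SqlDependency.Start(connectionString) was called at application startup; the table and column names are placeholders):
// Requires System.Data.SqlClient and System.Web.Caching.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("SELECT etat, name FROM dbo.ETLJobs", connection))
{
    // The cache entry is evicted automatically when the query's result changes.
    var dependency = new SqlCacheDependency(command);
    var table = new DataTable();
    new SqlDataAdapter(command).Fill(table);
    Cache.Add("MyData", table, dependency, Cache.NoAbsoluteExpiration,
        Cache.NoSlidingExpiration, CacheItemPriority.Default, null);
}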

NHibernate transaction locks a table

I have developed a WCF API which uses NHibernate. I am new to this. I have used session.Update to take care of the transaction. I have a for loop in which, based on a select condition, I update a record; i.e., if A is present in table1 then I update the table, else I insert a new entry.
I am getting "could not execute query." when trying to execute a select query on a table to which a new entry was previously added.
What I think is that because I am using session.Save(table1) and then trying to select entries from that table, I am getting an error: since session.Save temporarily locks the table, I am not able to execute a select query on it.
What can be the solution to this?
Update:
This is the for loop I am using to check the database for some field:
using (ITransaction tranx = session.BeginTransaction())
{
    savefunction();
    tranx.Commit();
}
Save function:
public void savefunction()
{
    for (int i = 0; i < dictionary.Count; i++)
    {
        ICandidateAttachmentManager candidateAttach = new ManagerFactory().GetCandidateAttachmentManager();
        CandidateAttachment attach = new CandidateAttachment();
        attach = checkCV();
        if (attach == null)
        {
            // insert new entry into table attach
            session.Save(attach);
        }
    }
}
checkCV function:
public void checkCV()
{
    using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
    {
        IList<CandidateAttachment> lstCandidateAttachment = CandidateAttachmentManager.GetByfkCandidateId(CandidateId);
        if (lstCandidateAttachment.Count > 0)
        {
            CandidateAttachment attach = lstCandidateAttachment.Where(x => x.CandidateAttachementType.Id.Equals(FileType)).FirstOrDefault();
            if (attach != null)
            {
                return null;
            }
            else
            {
                return "some string";
            }
        }
    }
}
What is happening here is that in the for loop, if, say, for i=2 the attach value comes back null, I enter a new row into the attach table. Then for i=3, when it enters the checkCV function, I get an error at this line:
IList<CandidateAttachment> lstCandidateAttachment =
    CandidateAttachmentManager.GetByfkCandidateId(CandidateId);
I think it is because I am using session.Save and then trying to read the table's contents: I am unable to execute the query, and the table is locked until I commit my session. Between BeginTransaction and Commit, the table associated with the object is locked. How can I achieve this? Any ideas?
Update:
I read up on some of the posts. It looks like I need to set the isolation level for the transaction, but even after adding it, it doesn't seem to work. Here is how I tried to implement it:
using (ITransaction tranx = session.BeginTransaction(IsolationLevel.ReadUncommitted))
{
    saveDocument();
}
Something I don't understand in your code is where you get your NHibernate session.
Indeed you use
new ManagerFactory().GetCandidateAttachmentManager();
and
using (ICandidateAttachmentManager CandidateAttachmentManager = new ManagerFactory().GetCandidateAttachmentManager())
so your ManagerFactory class provides you the ISession?
Then you do:
CandidateAttachment attach = new CandidateAttachment();
attach = checkCV();
but checkCV() returns either null or a string?
Finally, you should never call
Save()
but instead
SaveOrUpdate()
Hope that helps you resolve your issue.
Feel free to give more details.
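In code, the suggested change is small (a minimal sketch, assuming attach is a properly mapped entity):
using (ITransaction tranx = session.BeginTransaction())
{
    // SaveOrUpdate issues an INSERT for a transient instance and an
    // UPDATE for an already-persistent one, based on the identifier.
    session.SaveOrUpdate(attach);
    tranx.Commit();   // committing flushes the session and releases row locks
}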
