Parallel Tasks & lock local variables - c#

I'm new to multithreaded programming and I have some questions about it. I'm developing a scheduled job that loads an .xls file in which every row with data needs to be validated and then used to open a request in an application, and I'm implementing multithreading to process the rows, aiming to gain some performance.
In my job I have something like this example:
public class Job
{
    private readonly IUserBLO _userBLO;
    private readonly IRequestBLO _requestBLO;
    private readonly object _lockerSheet;

    public Job()
    {
        _userBLO = Factory.CreateInstance("UserBLO");
        _requestBLO = Factory.CreateInstance("RequestBLO");
        _lockerSheet = new object();
    }

    public void ProcessJob(Stream stream)
    {
        using (ExcelPackage package = new ExcelPackage(stream))
        {
            var sheet = package.Workbook.Worksheets[1];
            var rows = Enumerable.Range(sheet.Dimension.Start.Row,
                                        sheet.Dimension.End.Row - sheet.Dimension.Start.Row + 1);
            Parallel.ForEach(rows, row =>
            {
                try
                {
                    var excelData = GetExcelData(sheet, row);
                    ValidateData(excelData);
                    _requestBLO.OpenRequestByExcelData(excelData);
                }
                catch (Exception ex)
                {
                    LogError(ex);
                }
            });
        }
    }

    public List<string> GetExcelData(ExcelWorksheet sheet, int row)
    {
        var excelData = new List<string>();
        for (int i = sheet.Dimension.Start.Column; i <= sheet.Dimension.End.Column; i++)
        {
            string cellValue;
            lock (_lockerSheet)
            {
                cellValue = sheet.Cells[row, i].Text;
            }
            excelData.Add(cellValue);
        }
        return excelData;
    }

    public void ValidateData(List<string> excelData)
    {
        var userLogin = excelData.First();
        var user = _userBLO.FindByLogin(userLogin);
        if (user == null)
            throw new Exception("User not found!");
    }
}
While I was coding, I ran into an issue where getting a value from the sheet at a given position sometimes returned empty, even though that position had a value. After some searching on Google, I tried using a lock at the moment I read the value from the sheet, and apparently it worked!
string cellValue;
lock (_lockerSheet)
{
    cellValue = sheet.Cells[row, i].Text;
}
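For reference, here is a sketch of an alternative I thought about (I haven't measured it): read the sheet on a single thread first, then parallelize only the validation and the request creation, so the sheet is never touched concurrently:
// Read every row on the calling thread first (the lock in GetExcelData is then
// uncontended), and only parallelize the validation and request creation.
var allRows = rows.Select(row => GetExcelData(sheet, row)).ToList();
Parallel.ForEach(allRows, excelData =>
{
    try
    {
        ValidateData(excelData);
        _requestBLO.OpenRequestByExcelData(excelData);
    }
    catch (Exception ex)
    {
        LogError(ex);
    }
});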
After this situation, I had some questions:
For every "local" variable that will be parallel used in the tasks needs to be locked, even if that variable is readonly. I'm correct?
If the question above is true, do I need to use a locker in the _userBLO and _requestBLO variables, which are interfaces that contais some specific methods?
If the first question is false, do I just need use a locker if the variable will recive some value in task? When do I really need to use a locker?
PS.: Sorry for my bad english! Brazilian here! :)

Related

Reading the file only once for every method call

I am new to object-oriented programming and I am working on a small personal project with some SQL scripts.
I have a scenario where a SQL script calls a static method with a file path as input.
queries = Select Query from Table where Utils.ContainsKeyword(Query, @Path1) AND NOT Utils.ContainsKeyword(Query, @Path2);
I had initially created a static class that does the following:
public static class Utils
{
    public static bool ContainsKeyword(string query, string path)
    {
        var isQueryInFile = false;
        var stringFromFile = GetStringFromFile(path);
        List<Regex> regexList = GetRegexList(stringFromFile);
        if (regexList != null)
        {
            isQueryInFile = regexList.Any(pattern => pattern.IsMatch(query));
        }
        return isQueryInFile;
    }

    private static string GetStringFromFile(string path)
    {
        var words = String.Empty;
        if (!string.IsNullOrEmpty(path))
        {
            try
            {
                using (StreamReader sr = File.OpenText(path))
                {
                    words = sr.ReadToEnd().Replace(Environment.NewLine, "");
                }
            }
            catch { return words; }
        }
        return words;
    }

    private static List<Regex> GetRegexList(string words)
    {
        if (string.IsNullOrEmpty(words)) { return null; }
        return words.Split(',')
                    .Select(w => new Regex(@"\b" + Regex.Escape(w) + @"\b",
                                           RegexOptions.Compiled | RegexOptions.IgnoreCase))
                    .ToList();
    }
}
My problem is that I neither want to read from the file every time the ContainsKeyword static method is called nor do I want to create a new RegexList every time. Also, I cannot change the SQL script and I have to send the path to the file as an input parameter for the method call in the SQL script since the path might change in the future.
Is there a way to make sure I read the contents of the input path only once, store them in a string, and use that string for the match against different input queries?
To read the content only once, saving it in memory will probably be needed. Memory capacity could be an issue.
public Dictionary<string, string> FileContentCache { get; set; } // make sure that gets initialized

public string GetFileContentCache(string path)
{
    if (FileContentCache == null) FileContentCache = new Dictionary<string, string>();
    if (FileContentCache.ContainsKey(path))
        return FileContentCache[path];

    var fileData = GetStringFromFile(path);
    FileContentCache.Add(path, fileData);
    return fileData;
}
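If ContainsKeyword can be called from multiple threads, a ConcurrentDictionary keyed by path gives the same "read once" behaviour without extra locking. Here is a sketch that caches the compiled regex list instead of the raw file text (the class layout mirrors the question; the cache field name is illustrative):
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Text.RegularExpressions;

public static class Utils
{
    // One cache entry per file path. GetOrAdd may invoke the factory more than
    // once under a race, but only one resulting list is ever stored per key.
    private static readonly ConcurrentDictionary<string, List<Regex>> _regexCache =
        new ConcurrentDictionary<string, List<Regex>>();

    public static bool ContainsKeyword(string query, string path)
    {
        var regexList = _regexCache.GetOrAdd(path, BuildRegexList);
        return regexList.Any(pattern => pattern.IsMatch(query));
    }

    private static List<Regex> BuildRegexList(string path)
    {
        if (string.IsNullOrEmpty(path) || !File.Exists(path))
            return new List<Regex>();

        var words = File.ReadAllText(path).Replace(Environment.NewLine, "");
        return words.Split(',')
                    .Select(w => new Regex(@"\b" + Regex.Escape(w) + @"\b",
                                           RegexOptions.Compiled | RegexOptions.IgnoreCase))
                    .ToList();
    }
}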

JSON Array to Entity Framework Core VERY Slow?

I'm working on a utility to read through a JSON file I've been given and to transform it into SQL Server tables. My weapon of choice is a .NET Core console app (I'm trying to do all of my new work with .NET Core unless there is a compelling reason not to). I have the whole thing "working", but there is clearly a problem somewhere because the performance is truly horrifying, almost to the point of being unusable.
The JSON file is approximately 27MB and contains a main array of 214 elements, each of which contains a couple of fields along with an array of 150-350 records (that array has several fields and potentially a small <5-record array or two). Total records are approximately 35,000.
In the code below I've changed some names and stripped out a few of the fields to keep it more readable but all of the logic and code that does actual work is unchanged.
Keep in mind, I've done a lot of testing with the placement and number of calls to SaveChanges(), thinking initially that the number of trips to the Db was the problem. Although the version below calls SaveChanges() once for each iteration of the 214-element loop, I've tried moving it outside of the entire looping structure and there is no discernible change in performance. In other words, with zero trips to the Db, this is still SLOW. How slow, you ask? How does > 24 hours to run hit you? I'm willing to try anything at this point and am even considering moving the whole process into SQL Server, but I would much rather work in C# than T-SQL.
static void Main(string[] args)
{
    string statusMsg = String.Empty;
    JArray sets = JArray.Parse(File.ReadAllText(@"C:\Users\Public\Downloads\ImportFile.json"));
    try
    {
        using (var _db = new WidgetDb())
        {
            for (int s = 0; s < sets.Count; s++)
            {
                Console.WriteLine($"{s}: {sets[s]["name"]}");
                // First we create the Set
                Set eSet = new Set()
                {
                    SetCode = (string)sets[s]["code"],
                    SetName = (string)sets[s]["name"],
                    Type = (string)sets[s]["type"],
                    Block = (string)sets[s]["block"] ?? ""
                };
                _db.Entry(eSet).State = Microsoft.EntityFrameworkCore.EntityState.Added;

                JArray widgets = sets[s]["widgets"].ToObject<JArray>();
                for (int c = 0; c < widgets.Count; c++)
                {
                    Widget eWidget = new Widget()
                    {
                        WidgetId = (string)widgets[c]["id"],
                        Layout = (string)widgets[c]["layout"] ?? "",
                        WidgetName = (string)widgets[c]["name"],
                        WidgetNames = "",
                        ReleaseDate = releaseDate, // parsed earlier; omitted from this listing
                        SetCode = (string)sets[s]["code"]
                    };
                    _db.Entry(eWidget).State = Microsoft.EntityFrameworkCore.EntityState.Added;

                    // WidgetColors
                    if (widgets[c]["colors"] != null)
                    {
                        JArray widgetColors = widgets[c]["colors"].ToObject<JArray>();
                        for (int cc = 0; cc < widgetColors.Count; cc++)
                        {
                            WidgetColor eWidgetColor = new WidgetColor()
                            {
                                WidgetId = eWidget.WidgetId,
                                Color = (string)widgets[c]["colors"][cc]
                            };
                            _db.Entry(eWidgetColor).State = Microsoft.EntityFrameworkCore.EntityState.Added;
                        }
                    }
                    // WidgetTypes
                    if (widgets[c]["types"] != null)
                    {
                        JArray widgetTypes = widgets[c]["types"].ToObject<JArray>();
                        for (int ct = 0; ct < widgetTypes.Count; ct++)
                        {
                            WidgetType eWidgetType = new WidgetType()
                            {
                                WidgetId = eWidget.WidgetId,
                                Type = (string)widgets[c]["types"][ct]
                            };
                            _db.Entry(eWidgetType).State = Microsoft.EntityFrameworkCore.EntityState.Added;
                        }
                    }
                    // WidgetVariations
                    if (widgets[c]["variations"] != null)
                    {
                        JArray widgetVariations = widgets[c]["variations"].ToObject<JArray>();
                        for (int cv = 0; cv < widgetVariations.Count; cv++)
                        {
                            WidgetVariation eWidgetVariation = new WidgetVariation()
                            {
                                WidgetId = eWidget.WidgetId,
                                Variation = (string)widgets[c]["variations"][cv]
                            };
                            _db.Entry(eWidgetVariation).State = Microsoft.EntityFrameworkCore.EntityState.Added;
                        }
                    }
                }
                _db.SaveChanges();
            }
        }
        statusMsg = "Import Complete";
    }
    catch (Exception ex)
    {
        statusMsg = ex.Message + " (" + ex.InnerException + ")";
    }
    Console.WriteLine(statusMsg);
    Console.ReadKey();
}
I had an issue with that kind of code: lots of loops and tons of changing state.
Every change / manipulation you make in the _db context is tracked, and that tracking makes the context slower each time. Read more here.
The fix for me was to create a new EF context (_db) at some key points. It saved me a few hours per run!
You could try creating a new instance of _db for each iteration of this loop:
contains a main array of 214 elements
If that makes no difference, try adding some Stopwatch timings to get a better idea of what/where is taking so long.
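Here is a sketch of that suggestion applied to the question's loop (WidgetDb and Set are the question's own types; the rest of the body is elided): create and dispose the context once per element of the outer 214-item array, so the change tracker never holds more than one set's worth of entities.
// Sketch: a fresh DbContext per set keeps the change tracker small, so
// DetectChanges never has to scan tens of thousands of tracked entities.
for (int s = 0; s < sets.Count; s++)
{
    using (var _db = new WidgetDb())
    {
        Set eSet = new Set { SetCode = (string)sets[s]["code"], SetName = (string)sets[s]["name"] };
        _db.Entry(eSet).State = Microsoft.EntityFrameworkCore.EntityState.Added;

        // ... add this set's widgets, colors, types and variations exactly as
        //     in the original loop body ...

        _db.SaveChanges(); // flushes only this set's entities
    }
}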
If you're making thousands of updates then EF is not really the way to go. Something like SQLBulkCopy will do the trick.
You could try the bulkwriter library.
IEnumerable<string> ReadFile(string path)
{
    using (var stream = File.OpenRead(path))
    using (var reader = new StreamReader(stream))
    {
        while (reader.Peek() >= 0)
        {
            yield return reader.ReadLine();
        }
    }
}

var items =
    from line in ReadFile(@"C:\products.csv")
    let values = line.Split(',')
    select new Product { Sku = values[0], Name = values[1] };
then
using (var bulkWriter = new BulkWriter<Product>(connectionString))
{
    bulkWriter.WriteToDatabase(items);
}
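For the plain SqlBulkCopy route mentioned above, a rough sketch looks like the following (it requires System.Data, System.Data.SqlClient and Newtonsoft.Json.Linq; the destination table name, its columns, and connectionString are assumptions, while sets is the parsed JArray from the question):
// Stage one entity type (the widgets) in a DataTable, then bulk-insert it in
// one round trip. Adjust the table/column names to the real schema.
var table = new DataTable();
table.Columns.Add("WidgetId", typeof(string));
table.Columns.Add("WidgetName", typeof(string));
table.Columns.Add("SetCode", typeof(string));

foreach (JToken set in sets)                // 'sets' is the parsed JArray from the question
{
    foreach (JToken widget in set["widgets"])
    {
        table.Rows.Add((string)widget["id"], (string)widget["name"], (string)set["code"]);
    }
}

using (var bulk = new SqlBulkCopy(connectionString))
{
    bulk.DestinationTableName = "dbo.Widgets"; // assumed destination table
    bulk.BatchSize = 5000;
    bulk.WriteToServer(table);
}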

CRM Dynamics 2013 SDK Update Current Accounts With 2 Values

I have a scenario in CRM where I need to update existing accounts with their VAT and registration numbers. There are well over 30 thousand accounts in the system. I am trying to do the update using the CRM SDK API, but I am battling to figure out how to perform the actual update. The VAT and reg numbers have been provided to me in a spreadsheet against their corresponding account numbers. Please note that the accounts are already in CRM, so I just need to update the correct account with its VAT and reg number. How can I do this in CRM? Please advise on my code below:
public static void UpdateAllCRMAccountsWithVATAndRegistrationNumber(IOrganizationService service)
{
    QueryExpression qe = new QueryExpression();
    qe.EntityName = "account";
    qe.ColumnSet = new ColumnSet("accountnumber", "new_vatno", "new_registrationnumber");
    qe.Criteria.AddCondition("accountnumber", ConditionOperator.In, "TA10024846", "TA10028471", "TA20014015", "TA4011652", "TA4011557");

    EntityCollection response = service.RetrieveMultiple(qe);
    foreach (var acc in response.Entities)
    {
        acc.Attributes["new_vatno"] = // this is where I am struggling to figure out how I am going to match the records up
        acc.Attributes["new_registrationnumber"] = // this is where I am struggling to figure out how I am going to match the records up
        service.Update(acc);
    }
}
How am I going to ensure that I update the correct records? I have the VAT and reg numbers for the accounts in a spreadsheet (see the example image below). Can I please get some advice here? Thanks.
I would load the list of VAT updates from the spreadsheet into a dictionary and then load the 30k records from CRM into memory. Then I would match them up and use ExecuteMultipleRequest to do the updates. Alternatively, you could query CRM using the account numbers (if the list is small enough). I made the assumption you have thousands of updates to do across the record set of 30k. Note: if the Account record size were very large and couldn't be loaded into memory, you would need to do account-number queries.
Here is the rough code for the basic solution (I haven't tested it, the method should be split up, and there is minimal error handling):
public class VatInfo
{
    public string RegistrationNumber;
    public string TaxNumber;

    public static Dictionary<string, VatInfo> GetVatList()
    {
        //TODO: Implement logic to load CSV file into a list. Dictionary key value should be Account Number
        throw new NotImplementedException();
    }
}

public class UpdateVatDemo
{
    public const int maxBatchSize = 100;

    public static void RunVatUpdate(IOrganizationService conn)
    {
        var vats = VatInfo.GetVatList();

        var pagingQuery = new QueryExpression("account");
        pagingQuery.ColumnSet = new ColumnSet("accountnumber");

        Queue<Entity> allEnts = new Queue<Entity>();
        while (true)
        {
            var results = conn.RetrieveMultiple(pagingQuery);
            if (results.Entities != null && results.Entities.Any())
                results.Entities.ToList().ForEach(allEnts.Enqueue);
            if (!results.MoreRecords) break;
            pagingQuery.PageInfo.PageNumber++;
            pagingQuery.PageInfo.PagingCookie = results.PagingCookie;
        }

        ExecuteMultipleRequest emr = null;
        while (allEnts.Any())
        {
            if (emr == null)
                emr = new ExecuteMultipleRequest()
                {
                    Settings = new ExecuteMultipleSettings()
                    {
                        ContinueOnError = true,
                        ReturnResponses = true
                    },
                    Requests = new OrganizationRequestCollection()
                };

            var ent = allEnts.Dequeue();
            if (vats.ContainsKey(ent.GetAttributeValue<string>("accountnumber")))
            {
                var newEnt = new Entity("account", ent.Id);
                newEnt.Attributes.Add("new_vatno", vats[ent.GetAttributeValue<string>("accountnumber")].TaxNumber);
                newEnt.Attributes.Add("new_registrationnumber", vats[ent.GetAttributeValue<string>("accountnumber")].RegistrationNumber);
                emr.Requests.Add(new UpdateRequest() { Target = newEnt });
            }

            // Execute when the batch is full, or when the queue is drained so the
            // final partial batch is not left behind.
            if (emr.Requests.Count > 0 && (emr.Requests.Count >= maxBatchSize || !allEnts.Any()))
            {
                try
                {
                    var emResponse = (ExecuteMultipleResponse)conn.Execute(emr);
                    foreach (
                        var responseItem in emResponse.Responses.Where(responseItem => responseItem.Fault != null))
                        DisplayFault(emr.Requests[responseItem.RequestIndex],
                            responseItem.RequestIndex, responseItem.Fault);
                }
                catch (Exception ex)
                {
                    Console.WriteLine($"Exception during ExecuteMultiple: {ex.Message}");
                    throw;
                }
                emr = null;
            }
        }
    }

    private static void DisplayFault(OrganizationRequest organizationRequest, int count,
        OrganizationServiceFault organizationServiceFault)
    {
        Console.WriteLine(
            "A fault occurred when processing a {1} request, at index {0} in the request collection, with fault message: {2}",
            count + 1,
            organizationRequest.RequestName,
            organizationServiceFault.Message);
    }
}
Updating the fetched entity is bound to fail because of its entity state, which would not be null.
To update the fetched entities, you need to new up the entity:
foreach (var acc in response.Entities)
{
    var updateAccount = new Entity("account") { Id = acc.Id };
    updateAccount.Attributes["new_vatno"] = null; // using null as an example.
    updateAccount.Attributes["new_registrationnumber"] = null;
    service.Update(updateAccount);
}
The code below shows how I managed to get it right. First, let me explain: I imported my records into a separate SQL table; in my code I read that table into a list in memory; I then query the CRM accounts that need to be updated; I then loop through each account and check whether the account number in CRM matches the account number from my SQL database; if it matches, I update the relevant reg no and VAT no. See the code below:
List<Sheet1_> crmAccountList = new List<Sheet1_>();
//var crmAccount = db.Sheet1_.Select(x => x).ToList().Take(2);
var crmAccounts = db.Sheet1_.Select(x => x).ToList();

foreach (var dbAccount in crmAccounts)
{
    CRMDataObject modelObject = new CRMDataObject()
    {
        ID = dbAccount.ID,
        Account_No = dbAccount.Account_No,
        Tax_No = dbAccount.Tax_No.ToString(),
        Reg_No = dbAccount.Reg_No
        //Tarsus_Country = dbAccount.Main_Phone
    };
}

var officialDatabaseList = crmAccounts;

foreach (var crmAcc in officialDatabaseList)
{
    QueryExpression qe = new QueryExpression();
    qe.EntityName = "account";
    qe.ColumnSet = new ColumnSet("accountnumber", "new_vatno", "new_registrationnumber");
    qe.Criteria.AddCondition("accountnumber", ConditionOperator.In /*, list of account numbers goes here */);

    EntityCollection response = service.RetrieveMultiple(qe);
    foreach (var acc in response.Entities)
    {
        if (crmAcc.Account_No == acc.Attributes["accountnumber"].ToString())
        {
            //acc.Attributes["new_vatno"] = crmAcc.VAT_No.ToString();
            acc.Attributes["new_registrationnumber"] = crmAcc.Reg_No.ToString();
            service.Update(acc);
        }
    }
}

MPXJ custom fields connected with Tasks?

Good Morning everyone,
Does anyone know how to use MPXJ v5.1.5 to effectively read an .mpp project file and get the Outline Code values linked to their assigned tasks?
I have already found a way of getting the tasks and the timescale data for them, but how do I find out which Outline Code or custom field is linked to any given task? This will help to create reports on how these custom fields are progressing.
Here is my main piece of code used to retrieve the tasks with their timescale data. This code runs on a BackgroundWorker and reports progress.
void Work_DoWork(object sender, DoWorkEventArgs e)
{
    try
    {
        Document_Details_To_Open Document_Selected_Details = e.Argument as Document_Details_To_Open;
        ProjectReader reader = ProjectReaderUtility.getProjectReader(Document_Selected_Details.FileName);
        MPXJ.ProjectFile mpx = reader.read(Document_Selected_Details.FileName);
        int count = mpx.AllTasks.Size();
        int stepsize = 100002 / count;
        int pos = 1;
        foreach (MPXJ.Task task in mpx.AllTasks.ToIEnumerable())
        {
            Task_Type task_ = new Task_Type()
            {
                Name = task.Name,
                Total_Days = task.Duration.toString(),
                ID = task.ID.toString()
            };
            //Task.getFieldByAlias()
            //can add task above to MVVM connection
            foreach (MPXJ.ResourceAssignment Resource in task.ResourceAssignments.ToIEnumerable()) //this will only run once per task, I use the ResourceAssignment variable to get the duration data
            {
                //use the selected document details given
                Dictionary<string, java.util.List> worklist = new Dictionary<string, java.util.List>();
                foreach (string Work_type in Document_Selected_Details.Data_To_Import)
                {
                    worklist.Add(Work_type, Get_Some_work(Resource, Work_type));
                }
                int Length_of_data_to_retrieve = Get_Time_Scale_int(Document_Selected_Details.Time_Scale_Units, task.Duration.Duration);
                TimescaleUtility TimeScale = new TimescaleUtility();
                java.util.ArrayList datelist = TimeScale.CreateTimescale(task.Start, Get_Scale_Type(Document_Selected_Details.Time_Scale_Units), Length_of_data_to_retrieve);
                MPXJ.ProjectCalendar calendar = Resource.Calendar;
                TimephasedUtility utility = new TimephasedUtility();
                Dictionary<string, java.util.ArrayList> durationlist = new Dictionary<string, java.util.ArrayList>();
                foreach (KeyValuePair<string, java.util.List> item in worklist)
                {
                    java.util.ArrayList duration = utility.SegmentWork(calendar, item.Value, Get_Scale_Type(Document_Selected_Details.Time_Scale_Units), datelist);
                    durationlist.Add(item.Key, duration);
                }
                Dictionary<string, List<string>> ssss = new Dictionary<string, List<string>>();
                foreach (var s in durationlist)
                {
                    string key = s.Key;
                    List<string> Hours = new List<string>();
                    foreach (var hours in s.Value.toArray().ToList())
                    {
                        Hours.Add(hours.ToString());
                    }
                    ssss.Add(key, Hours);
                }
                Task_With_All all = new Models.Task_With_All()
                {
                    Task_Name = task.Name,
                    Time_Step_Type = Document_Selected_Details.Time_Scale_Units,
                    Duration_List = ssss,
                    StartDate = task.Start.ToDateTime().ToString(),
                    Total_duration = Length_of_data_to_retrieve.ToString()
                };
                Task_With_All_list.Add(all);
                //I have now every task and their timescale data but I still need to know if the tasks could have custom fields connected or not
            }
            pos += stepsize;
            Work.ReportProgress(pos);
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
Any help would be greatly appreciated.
Thanks to Jon Iles, the answer on how to get the Outline Codes for a Task turned out to be very simple. In MS Project there is a limit of 10 Outline Codes that users can assign to Tasks. To get the Outline Codes that have been assigned to a Task using MPXJ v5.1.5, you can use this:
//this code comes from my code block in the question.
...
foreach (MPXJ.Task task in mpx.AllTasks.ToIEnumerable())
{
    //if one of these calls returns a non-empty string, that value is the Outline Code assigned to the task
    string Outline_code_1 = task.GetOutlineCode(1);
    string Outline_code_2 = task.GetOutlineCode(2);
    string Outline_code_3 = task.GetOutlineCode(3);
    string Outline_code_4 = task.GetOutlineCode(4);
    string Outline_code_5 = task.GetOutlineCode(5);
    string Outline_code_6 = task.GetOutlineCode(6);
    string Outline_code_7 = task.GetOutlineCode(7);
    string Outline_code_8 = task.GetOutlineCode(8);
    string Outline_code_9 = task.GetOutlineCode(9);
    string Outline_code_10 = task.GetOutlineCode(10);
}
...
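If you would rather not write out ten separate calls, the same lookup collapses into a loop over the ten slots (a small sketch built on the GetOutlineCode call shown above):
// Collect only the outline codes that are actually populated for this task.
var outlineCodes = new Dictionary<int, string>();
for (int index = 1; index <= 10; index++)
{
    string value = task.GetOutlineCode(index);
    if (!string.IsNullOrEmpty(value))
        outlineCodes.Add(index, value);
}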

Duplicate entries on server response .net

Scenario
One windows service polls a url every two minutes to retrieve certain data.
If any data has been added since the previous call, the data is retrieved and stored otherwise the loop carries on.
Issue
Sometimes a request takes more than two minutes to return a response.
When this happens, the next request is still made and finds new data, since the previous request hasn't returned a response yet.
This results in duplicate entries when the data is stored.
What I've tried
I tried to handle that by using a boolean like so:
Boolean InProgress = true;
foreach (var item in Lists)
{
    // Make a request and return new data (if any)
    InProgress = false;
    if (InProgress == false)
    {
        // Store new data
    }
}
This doesn't solve the issue. I believe I'm using the boolean in the wrong place, but I'm not sure where it should go.
This is the loop that makes the request and stores the data:
void serviceTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    try
    {
        Data getCredentials = new Data();
        DataTable credentials = getCredentials.loadCredentials();
        Boolean InProgress = true;
        for (int i = 0; i < credentials.Rows.Count; i++)
        {
            if (credentials != null)
            {
                var PBranchID = (int)credentials.Rows[i]["PortalBranchID"];
                var negRef = (int)credentials.Rows[i]["NegotiatorRef"];
                var Username = credentials.Rows[i]["Username"].ToString();
                var Password = credentials.Rows[i]["Password"].ToString();
                var Domain = credentials.Rows[i]["Domain"].ToString();
                var FooCompanyBaseUrl = "https://" + Domain + ".FooCompany.com/";
                Data getCalls = new Data();
                DataTable calls = getCalls.loadCalls(PBranchID);
                //If it's not the first call
                if (calls != null && calls.Rows.Count > 0)
                {
                    //Makes a call
                    DateTime CreatedSince = DateTime.SpecifyKind((DateTime)calls.Rows[0]["LastSuccessOn"], DateTimeKind.Local);
                    string IssueListUrl = FooCompany.WebApi.V2.URLs.Issues(FooCompanyBaseUrl, null, CreatedSince.ToUniversalTime(), null);
                    FooCompany.WebApi.V2.DTO.PrevNextPagedList resultIssueList;
                    resultIssueList = FooCompany.WebApi.Client.Helper.Utils.Getter<FooCompany.WebApi.V2.DTO.PrevNextPagedList>(IssueListUrl, Username, Password);
                    InProgress = false;
                    if (InProgress == false)
                    {
                        if (resultIssueList.Items.Count > 0)
                        {
                            //If the call returns new issues, save the call
                            Data saveCalls = new Data();
                            saveCalls.saveCalls(PBranchID);
                            foreach (var item in resultIssueList.Items)
                            {
                                var Issue = FooCompany.WebApi.Client.Helper.Utils.Getter<FooCompany.WebApi.V2.DTO.Issue>(item, Username, Password);
                                string TenantSurname = Issue.Surname;
                                string TenantEmail = Issue.EmailAddress;
                                Data tenants = new Data();
                                int tenantPropRef = Convert.ToInt32(tenants.loadTenantPropRef(PBranchID, TenantSurname, TenantEmail));
                                Data Properties = new Data();
                                DataTable propAddress = Properties.loadPropAddress(PBranchID, tenantPropRef);
                                var Address1 = propAddress.Rows[0]["Address1"];
                                var Address2 = propAddress.Rows[0]["Address2"];
                                var AddressFolder = Address1 + "," + Address2;
                                if (!Directory.Exists("path"))
                                {
                                    Directory.CreateDirectory("path");
                                }
                                string ReportPDFDestination = "path";
                                if (File.Exists(ReportPDFDestination))
                                {
                                    File.Delete(ReportPDFDestination);
                                }
                                FooCompany.WebApi.Client.Helper.Utils.DownloadFileAuthenticated(FooCompany.WebApi.V2.URLs.IssueReport(FooCompanyBaseUrl, Issue.Id), Username, Password, ReportPDFDestination);
                                //Store data
                            }
                            IssueListUrl = resultIssueList.NextURL;
                        }
                    }
                }
                else
                {
                    continue;
                }
            }
        }
    }
    catch (Exception ex)
    {
        //write to log
    }
}
Question
I'm sure there is a better way than a boolean.
Could anyone advise a different method to handle the issue properly?
Thanks.
Solution
I ended up using a combination of both Thomas's and Mason's suggestions. I wrapped a lock statement around the main function of my Windows service and used a boolean inside the section of the function that makes the call to the remote server.
I've tested it many times and it's error free.
You seem to have a synchronisation problem; just surround the code that iterates through the list with a lock and you will be fine.
public class MyClass
{
    private readonly object internalLock = new object();
    private bool AlreadyRunning { get; set; }

    void serviceTimer_Elapsed(object sender, ElapsedEventArgs e)
    {
        if (AlreadyRunning)
        {
            return;
        }
        try
        {
            lock (internalLock)
            {
                Thread.MemoryBarrier();
                if (AlreadyRunning)
                {
                    return;
                }
                AlreadyRunning = true;
                // ...Do all the things...
            }
        }
        catch (Exception ex)
        {
            // ...Exception handling...
        }
        finally
        {
            AlreadyRunning = false;
        }
    }
}
bool InProgress = false;

void serviceTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    if (!InProgress)
    {
        InProgress = true;
        // retrieve data
        InProgress = false;
    }
}
Your InProgress variable needs to be declared outside the event handler. When you enter the method, check to see if it's already running. If it is, then we do nothing. If it's not running, then we say it's running, retrieve our data, then reset our flag to say we've finished running.
You'll probably need to add appropriate locks for thread safety, similar to Thomas's answer.
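A lock-free variant of that guard (a sketch; the field name is arbitrary and it requires System.Threading) uses Interlocked.CompareExchange, so an Elapsed event that fires while the previous poll is still running simply returns:
// 0 = idle, 1 = a poll is already running. CompareExchange flips the flag
// atomically, so an overlapping timer tick just exits instead of re-entering.
private int _pollInProgress;

void serviceTimer_Elapsed(object sender, ElapsedEventArgs e)
{
    if (Interlocked.CompareExchange(ref _pollInProgress, 1, 0) != 0)
        return; // previous poll still running; skip this tick

    try
    {
        // ... make the request and store any new data ...
    }
    catch (Exception ex)
    {
        // write to log
    }
    finally
    {
        Interlocked.Exchange(ref _pollInProgress, 0); // mark idle again
    }
}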
