Trying to request data from a web service via an async call.
I started a similar "Wait until event has finished" post earlier; in that case I was able to use a non-async method. That is not an option now: there is no non-async method available on the current web service. So how would I go ahead and determine that the "RespService_Send_Completed" event has finished before moving on to the next row in the loop? If the call is successful, the web service returns a UUID in "e.Result".
foreach (string id in uuid)
{
WebRef.ResponderService RespService = new WebRef.ResponderService();
_uuid = id;
RespService.SendDataAsync(id);
RespService.SendCompleted += RespService_Send_Completed;
}
The code works fine when calling the method
public void InvokeSend(string[] uuid)
with one value in string[] uuid, but when the array contains more than one value the code fails.
public class SendReciveSoapData
{
private string _uuid { get; set; }
public void InvokeSend(string[] uuid)
{
foreach (string id in uuid)
{
WebRef.ResponderService RespService = new WebRef.ResponderService();
_uuid =id;
RespService.SendDataAsync(id);
RespService.SendCompleted += RespService_Send_Completed;
}
}
void RespService_Send_Completed(object sender, WebRef.CompletedEventArgs e)
{
//Saving Response Data to database
string SuccessID = e.Result;
string TransactionID = _uuid;
DataBase db = new DataBase();
db.UpdateResponseID(SuccessID, TransactionID);
}
}
The private _uuid member is the likely cause of your issue.
Specifically, the foreach loop does not wait for the completed event before continuing to the next id. _uuid is overwritten on every iteration, and since the loop runs quickly -- unless there are a lot (thousands) of calls to create -- it will finish before any of the completed events are raised. There is simply no way to know what value _uuid holds when a completed event is raised.
Most likely the loop will already have finished, and _uuid will be the last id, by the time the completed events arrive.
I would need to know more about the API you are calling to say for certain, but xxxxAsync() methods usually return a Task<>. Whether it is a Task<> or something else, save all of the return values in a collection and remove each one as it completes.
Again, I can't be more specific without more information about the API.
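To make that concrete: proxies generated for this event-based pattern usually also have a SendDataAsync(argument, object userState) overload, and the completed-event args echo that state back through e.UserState. If your generated WebRef proxy has those members (worth checking), each call can carry its own id instead of sharing the _uuid field. A rough sketch, reusing the question's types:
public class SendReciveSoapData
{
    // ids that have been sent but not yet answered
    private readonly HashSet<string> pending = new HashSet<string>();

    public void InvokeSend(string[] uuid)
    {
        foreach (string id in uuid)
        {
            var service = new WebRef.ResponderService();
            service.SendCompleted += RespService_Send_Completed;   // attach before sending
            lock (pending) pending.Add(id);
            service.SendDataAsync(id, id);                         // 2nd argument = userState
        }
    }

    void RespService_Send_Completed(object sender, WebRef.CompletedEventArgs e)
    {
        string transactionId = (string)e.UserState;                // the id this reply belongs to
        new DataBase().UpdateResponseID(e.Result, transactionId);

        lock (pending)
        {
            pending.Remove(transactionId);
            if (pending.Count == 0)
            {
                // every reply has arrived
            }
        }
    }
}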
I solved the problem by calling the InvokeSend method again from the completed event, working through the array that was passed to the class and removing the current uuid value after each iteration, until the array is empty. This way only one request is in flight at a time, so CurrentUUID is never overwritten before its completed event fires.
public class SendReciveSoapData
{
private string[] UUID_array { get; set; }
private string CurrentUUID { get; set; }
public void InvokeSend(string[] uuid_array)
{
int len = uuid_array.Length;
if (len > 0)
{
CurrentUUID = uuid_array[0].ToString();
string strToRemove = CurrentUUID;
UUID_array = uuid_array.Where(val => val != strToRemove).ToArray();
invokeSend(CurrentUUID);
}
}
private void invokeSend(string uuid)
{
CurrentUUID=uuid;
WebRef.ResponderService RespService = new WebRef.ResponderService();
RespService.SendDataAsync(uuid);
RespService.SendCompleted += RespService_Send_Completed;
}
void RespService_Send_Completed(object sender, WebRef.CompletedEventArgs e)
{
//Saving Response Data to database
string SuccessID = e.Result;
string TransactionID = CurrentUUID;
DataBase db = new DataBase();
db.UpdateResponseID(SuccessID, TransactionID);
InvokeSend(UUID_array);
}
}
I need some help. If you give my code a directory, it goes into every folder in that directory and gets every single file. This way I managed to bypass the AccessDeniedException, BUT if the directory is one that contains a lot of data and folders (example: C:/) it just takes way too much time.
I don't really know how to multithread and I could not find any help on the internet. Is there a way to make the code run faster by multithreading? Or is it possible to tell the code to use more memory or cores? I really don't know and could use advice.
My code to go through every file in every subdirectory:
public static List<string> Files = new List<string>();
public static List<string> Exceptions = new List<string>();
public MainWindow()
{
InitializeComponent();
}
private static void GetFilesRecursively(string directory)
{
try
{
foreach (string A in Directory.GetDirectories(directory))
GetFilesRecursively(A);
foreach (string B in Directory.GetFiles(directory))
AddtoList(B);
} catch (System.Exception ex) { Exceptions.Add(ex.ToString()); }
}
private static void AddtoList(string Result)
{
Files.Add(Result);
}
private void Btn_Click(object sender, RoutedEventArgs e)
{
GetFilesRecursively(Textbox1.Text);
foreach(string C in Files)
Textbox2.Text += $"{C} \n";
}
You don't need recursion to avoid inaccessible files. You can use the EnumerateFiles overload that accepts an EnumerationOptions parameter and set EnumerationOptions.IgnoreInaccessible to true:
var options=new EnumerationOptions
{
IgnoreInaccessible=true,
RecurseSubdirectories=true
};
var files=Directory.EnumerateFiles(somePath,"*",options);
The loop that appends file paths is very expensive too. Not only does it create a new temporary string on each iteration, it also forces a UI redraw. You could improve speed and memory usage (which, due to garbage collection, also affects performance) by creating a single string, e.g. with String.Join or a StringBuilder:
var text=String.Join("\n",files);
Textbox2.Text=text;
String.Join uses a StringBuilder internally whose internal buffer gets reallocated each time it fills up, and the previous buffer has to be garbage-collected. One could avoid even this by using a StringBuilder with a specific capacity; even a rough estimate can reduce reallocations significantly:
var builder=new StringBuilder(4096);
foreach(var file in files)
{
    builder.AppendLine(file);
}
Textbox2.Text=builder.ToString();
Create a class so that you can add a private field to count the depth of the directory recursion.
Add a TaskCompletionSource<bool> property to the class and await its Task only when the depth exceeds the limit, raising an event so your UI can hook into the action and ask the user.
If the user cancels, fail the task; if the user confirms, continue.
Some outline code:
public class FileLocator
{
    public FileLocator(int maxDepth = 6)
    {
        _maxDepth = maxDepth;
        this.TaskSource = new TaskCompletionSource<bool>();
        this.ConfirmTask = this.TaskSource.Task;
    }

    private int _maxDepth;
    private int _depth;

    public event Action<FileLocator> OnReachMaxDepth;
    public Task ConfirmTask;
    public TaskCompletionSource<bool> TaskSource { get; }

    public async Task<List<string>> GetFilesRecursivelyAsync(string path)
    {
        var result = new List<string>();
        foreach (xxxxxxx)                      // enumerate directories/files here
        {
            xxxxxxxxxxxxxx;                    // collect results, recurse, etc.
            this._depth += 1;
            if (_depth == _maxDepth)
            { OnReachMaxDepth?.Invoke(this); }
            if (_depth >= _maxDepth)
            {
                try
                {
                    await ConfirmTask;
                    continue;
                }
                catch
                {
                    return result;
                }
            }
        }
        return result;
    }
}
and call it like this:
var locator = new FileLocator();
locator.OnReachMaxDepth += (x) =>
{
    var result = UI.Confirm();
    if (result) { x.TaskSource.SetResult(true); }
    else { x.TaskSource.SetException(new Exception()); }
};
var result = await locator.GetFilesRecursivelyAsync("C:");
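For reference, here is a self-contained variant of the same idea with the elided parts filled in using plain Directory calls. The recursion shape, the single-shot confirmation, and UI.Confirm are assumptions for illustration, not the original code; on cancel it simply completes the confirmation task with false instead of faulting it:
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading.Tasks;

public class FileLocator
{
    private readonly int _maxDepth;

    public FileLocator(int maxDepth = 6) { _maxDepth = maxDepth; }

    public event Action<FileLocator> OnReachMaxDepth;
    public TaskCompletionSource<bool> TaskSource { get; } = new TaskCompletionSource<bool>();

    public async Task<List<string>> GetFilesRecursivelyAsync(string path)
    {
        var result = new List<string>();
        await WalkAsync(path, 0, result);
        return result;
    }

    private async Task WalkAsync(string dir, int depth, List<string> result)
    {
        if (depth == _maxDepth && !TaskSource.Task.IsCompleted)
            OnReachMaxDepth?.Invoke(this);             // ask the UI once; the walk waits below until it answers

        if (depth >= _maxDepth && !await TaskSource.Task)
            return;                                    // user declined: stop descending this branch

        try
        {
            result.AddRange(Directory.GetFiles(dir));
            foreach (var sub in Directory.GetDirectories(dir))
                await WalkAsync(sub, depth + 1, result);
        }
        catch (UnauthorizedAccessException) { /* skip folders we cannot read */ }
    }
}

// usage (UI.Confirm stands in for whatever dialog you show):
// var locator = new FileLocator();
// locator.OnReachMaxDepth += x => x.TaskSource.TrySetResult(UI.Confirm());
// var files = await locator.GetFilesRecursivelyAsync(@"C:\");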
I am using MVC with Entity Framework.
I have a method that is called on a button click. It reads a value from the database, does some calculation, and subtracts from the value based on my requirement. At the end I update the entity with the latest changes.
If there is not enough value in the database for the subtraction, I want to show the user the error message "Not enough value in db". This works fine for a single user.
But if the method is called by different users at the same time from different browsers, it does not work.
I have tried locking an object and async/await, but I am not able to handle this situation; the lock does not help when the requests are fired from different browsers at the same time.
Code:
public async Task SaveContainerRoutes(List<ContainerRouteVM> lstCRoute, int cid)
{
//my code
}
Lock code:
public ActionResult SaveContainerRoutes(List<ContainerRouteVM> lstCRoute, int cid)
{
    try
    {
        ContainerRouteBL bl = new ContainerRouteBL();
        string note;
        lock (bl)
        {
            note = bl.SaveContainerRoutes(lstCRoute, cid);
        }
        return Json(new { success = true, message = note });
    }
    catch (Exception ex)
    {
        return Json(new { success = false, message = ex.Message });
    }
}
Please help. Thanks in advance.
Declare this at class level:
private static Object thisLock = new Object();
and use thisLock in the method:
public async Task SaveContainerRoutes(List<ContainerRouteVM> lstCRoute, int cid)
{
    lock (thisLock)
    {
        //place your code here
    }
}
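A side note, not from the original answer: a lock block cannot contain an await, so if the body of SaveContainerRoutes itself needs to await (for example Entity Framework's SaveChangesAsync), a static SemaphoreSlim is a common substitute. Like the lock, it only serializes requests within a single web-server process. A rough sketch with an illustrative body:
private static readonly SemaphoreSlim saveGate = new SemaphoreSlim(1, 1);

public async Task SaveContainerRoutes(List<ContainerRouteVM> lstCRoute, int cid)
{
    await saveGate.WaitAsync();    // only one request at a time gets past this point
    try
    {
        // read the current value, check there is enough to subtract,
        // then subtract, update the entity and await SaveChangesAsync()
    }
    finally
    {
        saveGate.Release();
    }
}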
I have created an application that updates a logs form whenever emails are successfully sent. My code is something like this:
mailSender.cs
void Serche()
{
{
//perform thread background ip scanner
}
if (InvokeRequired){
this.Invoke(new MethodInvoker(delegate
{
sendReport();
}));
}
}
public void sendReport()
{
//some codes to trigger time schedule to send report
ExportToExcel(filePath);
int milliseconds = 2000;
Thread.Sleep(milliseconds);
sendMail(filename);
}
private void sendMail(string filename)
{
string getFilePath = @"D:\Report\" + filename;
string status = "send";
try
{
// send email filename as attachment
}
catch (Exception ex)
{
status = "Fail";
}
sendMailReport(filename, DateTime.Now, status);
}
private void sendMailReport(string fileName, DateTime dateDelivered, string status)
{
mailLog updateLogs = new mailLog();
updateLogs.updateMailLogs(fileName,dateDelivered,status);
}
mailLog.cs
public void updateMailLogs(string _fileName, DateTime _dateDelivered, string _status)
{
int num = dataGridView1.Rows.Add();
dataGridView1.Rows[num].Cells[0].Value = _fileName;
dataGridView1.Rows[num].Cells[1].Value = _dateDelivered;
dataGridView1.Rows[num].Cells[2].Value = _status;
dataGridView1.Refresh();
}
I have debugged the code line by line and found that all parameters successfully reach my updateMailLogs method, but I am not sure why it doesn't update my DataGridView. Does anyone have any idea why? Please advise.
SOLVED
Credit to @shell, who pointed me to the answer to this question.
Problem:
1. If the form is already open, I cannot create another object of the mailLog form and call its updateMailLogs method.
2. That will not update the grid data, because the two object references are different.
Solution:
1. Call the method on the mailLog form object that is currently loaded.
private void sendMailReport(string fileName, DateTime dateDelivered, string status)
{
    if (Application.OpenForms["mailLog"] != null)
        ((mailLog)Application.OpenForms["mailLog"]).updateMailLogs(fileName, dateDelivered, status);
}
The provided code does not fully show what you have done. As I understand it, you are executing the sendMailReport method, and that method creates a new mailLog object on every execution, which loses your existing data. It is better to create the mailLog object outside the sendMailReport method and only call updateMailLogs inside it:
mailLog updateLogs = new mailLog();
private void sendMailReport(string fileName, DateTime dateDelivered,string status)
{
updateLogs.updateMailLogs(fileName,dateDelivered,status);
}
EDITED:
If the form is already loaded, you should call the method like this; you don't need to create a new mailLog object here:
private void sendMailReport(string fileName, DateTime dateDelivered,string status)
{
((mailLog)Application.OpenForms["mailLog"]).updateMailLogs(fileName,dateDelivered,status);
}
Usually this happens because of a cross-thread exception, so I suggest adding a try/catch to find out; even then you will need to invoke the grid.
Edit: just noticed you asked where to put the try/catch.
You can put it in either of the two methods, for example:
try{
updateLogs.updateMailLogs(fileName,dateDelivered,status);
}
catch (Exception ex) {MessageBox.Show(ex.ToString());}
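For illustration, marshalling the grid update onto the UI thread could look like this inside updateMailLogs (a sketch, assuming the method lives on the form that owns dataGridView1):
public void updateMailLogs(string _fileName, DateTime _dateDelivered, string _status)
{
    if (dataGridView1.InvokeRequired)
    {
        // re-run this method on the UI thread
        dataGridView1.Invoke(new Action(() => updateMailLogs(_fileName, _dateDelivered, _status)));
        return;
    }

    int num = dataGridView1.Rows.Add();
    dataGridView1.Rows[num].Cells[0].Value = _fileName;
    dataGridView1.Rows[num].Cells[1].Value = _dateDelivered;
    dataGridView1.Rows[num].Cells[2].Value = _status;
}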
While keeping in mind that:
I am using a blocking queue that waits forever until something is added to it
I might get a FileSystemWatcher event twice
The updated code:
{
FileProcessingManager processingManager = new FileProcessingManager();
processingManager.RegisterProcessor(new ExcelFileProcessor());
processingManager.RegisterProcessor(new PdfFileProcessor());
processingManager.Completed += new ProcessingCompletedHandler(ProcessingCompletedHandler);
processingManager.Completed += new ProcessingCompletedHandler(LogFileStatus);
while (true)
{
try
{
var jobData = (JobData)fileMonitor.FileQueue.Dequeue();
if (jobData == null)
break;
_pool.WaitOne();
Application.Log(String.Format("{0}:{1}", DateTime.Now.ToString(CultureInfo.InvariantCulture), "Thread launched"));
Task.Factory.StartNew(() => processingManager.Process(jobData));
}
catch (Exception e)
{
Application.Log(String.Format("{0}:{1}", DateTime.Now.ToString(CultureInfo.InvariantCulture), e.Message));
}
}
}
What are your suggestions for making the code multi-threaded, taking into consideration that two identical string paths may be added to the blocking queue? I have left open the possibility that this might happen, and in that case the file would be processed twice. The thing is that sometimes I get it twice and sometimes not, which is really awkward; if you have suggestions on this, please tell.
The null check is for exiting the loop; I intentionally add a null from outside the threaded loop to tell it to stop.
For multi-threading this, I would add a "Completed" event to your FileProcessingManager and register for it. One argument of that event would be the bool return value you currently have. In that event handler I would check the bool and re-queue the file if needed. Note that you will have to keep a reference to the FileMonitorManager, so I would put this ThreadProc method in a class that holds the FileMonitorManager and FileProcessingManager instances in properties.
To deduplicate, in ThreadProc I would create a List outside the while loop. Inside the loop, before processing a file, lock the list and check whether the string is already there; if not, add it and process the file, otherwise skip processing.
Obviously this is based on little information surrounding your method, but my 2 cents anyway.
Rough code, from Notepad:
private static FileMonitorManager fileMon = null;
private static FileProcessingManager processingManager = new FileProcessingManager();
private static void ThreadProc(object param)
{
processingManager.RegisterProcessor(new ExcelFileProcessor());
processingManager.RegisterProcessor(new PdfFileProcessor());
processingManager.Completed += ProcessingCompletedHandler;
var procList = new List<string>();
while (true)
{
try
{
var path = (string)fileMon.FileQueue.Dequeue();
if (path == null)
break;
bool processThis = false;
lock(procList)
{
if(!procList.Contains(path))
{
processThis = true;
procList.Add(path);
}
}
if(processThis)
{
Thread t = new Thread (new ParameterizedThreadStart(processingManager.Process));
t.Start (path);
}
}
catch (System.Exception e)
{
Console.WriteLine(e.Message);
}
}
}
private static void ProcessingCompletedHandler(bool status, string path)
{
if (!status)
{
fileMon.FileQueue.Enqueue(path);
Console.WriteLine("\n\nError on file: " + path);
}
else
Console.WriteLine("\n\nSucces on file: " + path);
}
I have the following code:
public void IssueOrders(List<OrderAction> actions)
{
foreach (var action in actions)
{
if (action is AddOrder)
{
uint userId = apiTransactions.PlaceOrder((action as AddOrder).order);
Console.WriteLine("order is placing userId = " + userId);
}
// TODO: implement other actions
}
// how to wait until OnApiTransactionsDataMessageReceived for all userId is received?
// TODO: need to update actions with received data here
}
private void OnApiTransactionsDataMessageReceived(object sender, DataMessageReceivedEventArgs e)
{
var dataMsg = e.message;
var userId = dataMsg.UserId;
apiTransactions.PlaceOrder is asynchronous, so I receive the userId as the result, but the actual data arrives later in the OnApiTransactionsDataMessageReceived callback.
So, for example, if I place 3 orders I will receive 3 userIds, say 1, 3 and 4. Now I need to wait until the data for all of these userIds has been received.
userId is always increasing, if that is important. It is almost a consecutive integer sequence, but some numbers may be omitted due to parallel execution.
UPD Note:
IssueOrders can be executed in parallel from different threads
the callback may be called BEFORE PlaceOrder returns
UPD2
Likely I need to refactor the PlaceOrder code below so I can guarantee that the userId is known before the "callback" is received:
public uint PlaceOrder(Order order)
{
Publisher pub = GetPublisher();
SchemeDesc schemeDesc = pub.Scheme;
MessageDesc messageDesc = schemeDesc.Messages[0]; //AddMM
FieldDesc fieldDesc = messageDesc.Fields[3];
Message sendMessage = pub.NewMessage(MessageKeyType.KeyName, "FutAddOrder");
DataMessage smsg = (DataMessage)sendMessage;
uint userId = counter.Next();
FillDataMessageWithPlaceOrder(smsg, order, userId);
System.Console.WriteLine("posting message dump: {0}", sendMessage);
pub.Post(sendMessage, PublishFlag.NeedReply);
sendMessage.Dispose();
return userId;
}
So I need to split PlaceOrder into two methods: a CreateOrder that returns the userId and a void PostOrder. This will guarantee that when the callback is received, the userId is already known.
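A rough sketch of that split, reusing the names from the PlaceOrder code above (the out parameter is just one way to hand the prepared message back to the caller):
// Build the message and reserve the userId without posting anything yet.
public uint CreateOrder(Order order, out Message sendMessage)
{
    Publisher pub = GetPublisher();
    sendMessage = pub.NewMessage(MessageKeyType.KeyName, "FutAddOrder");
    uint userId = counter.Next();
    FillDataMessageWithPlaceOrder((DataMessage)sendMessage, order, userId);
    return userId;
}

// Post a previously created order. The caller already knows its userId,
// so the reply callback can never refer to an id we have not seen yet.
// (Assumes GetPublisher() returns the same publisher instance each time.)
public void PostOrder(Message sendMessage)
{
    GetPublisher().Post(sendMessage, PublishFlag.NeedReply);
    sendMessage.Dispose();
}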
I'd check out the ForkJoin method in the Reactive Framework. It will block until multiple async calls have completed.
Edit: It seems that ForkJoin() was only ever included in an experimental release of Rx. Here's a discussion of what you want based on Merge().
One of the silliest but working approaches would be:
public void IssueOrders(List<OrderAction> actions)
{
var userIds = new List<uint>();
lock(theHashMap)
theHashMap[userIds] = "blargh";
foreach (var action in actions)
{
if (action is AddOrder)
{
lock(userIds)
{
uint userId = apiTransactions.PlaceOrder((action as AddOrder).order);
Console.WriteLine("order is placing userId = " + userId);
userIds.Add(userId);
}
}
// TODO: implement other actions
}
// waiting:
do
{
lock(userIds)
if(userIds.Count == 0)
break;
Thread.Sleep(???); // adjust the time depending on how long you wait for a callback on average
}while(true);
lock(theHashMap)
theHashMap.Remove(userIds);
// now you have the guarantee that all were received
}
private Dictionary<List<uint>, string> theHashMap = new Dictionary<List<uint>,string>();
private void OnApiTransactionsDataMessageReceived(object sender, DataMessageReceivedEventArgs e)
{
var dataMsg = e.message;
var userId = dataMsg.UserId;
// do some other things
lock(theHashMap)
foreach(var list in theHashMap.Keys)
lock(list)
if(list.Remove(userId))
break;
}
but this is quite a crude approach. It's hard to suggest anything more unless you explain what you mean by "wait", as Jon asked in the comments. For example, you might want to leave IssueOrders, wait somewhere else, and just be sure that some extra job is done when all the responses have arrived? Or maybe you cannot leave IssueOrders until all are received? etc.
Edit: please note that near the Add, the lock must be taken before PlaceOrder; otherwise, when the callback arrives very fast, it may attempt to remove the ID before it has been added. Also note that this implementation is very naive: the callback has to search through and lock all the lists every time. With a few additional dictionaries/maps/indexes it could be optimized considerably, but I did not do that here for readability.
If you are able to change the API, consider using the Task Parallel Library; your code will get much easier with it.
Otherwise a ManualResetEvent per order might help you:
private Dictionary<uint, ManualResetEvent> m_Events = new Dictionary<uint, ManualResetEvent>();
public void IssueOrders(List<OrderAction> actions)
{
foreach (var action in actions)
{
if (action is AddOrder)
{
uint userId = apiTransactions.PlaceOrder((action as AddOrder).order);
// Attention: Race condition if PlaceOrder finishes
// before the MRE is created and added to the dictionary!
m_Events[userId] = new ManualResetEvent(false);
Console.WriteLine("order is placing userId = " + userId);
}
// TODO: implement other actions
}
WaitHandle.WaitAll(m_Events.Values.ToArray());
// TODO: Dispose the created MREs
}
private void OnApiTransactionsDataMessageReceived(object sender, DataMessageReceivedEventArgs e)
{
var dataMsg = e.message;
var userId = dataMsg.UserId;
m_Events[userId].Set();
}