I have a thread that returns a site's HTTP response status, but sometimes my program returns false results, and after a while it gives good results.
False result:
It takes a large amount of time to check, and then it says that (for example) Google is down, which is not reasonable, but after a few seconds it returns good results.
Can you take a look and tell me what's wrong, or how I can improve it?
Checks all sites in datagrid:
private void CheckSites()
{
if (CheckSelected())
{
int rowCount = dataGrid.BindingContext[dataGrid.DataSource, dataGrid.DataMember].Count;
string url;
for (int i = 0; i < rowCount; i++)
{
url = dataGrid.Rows[i].Cells[2].Value.ToString();
if (url != null)
{
Task<string[]> task = Task.Factory.StartNew<string[]>
(() => checkSite(url));
// We can do other work here and it will execute in parallel:
//Loading...
// When we need the task's return value, we query its Result property:
// If it's still executing, the current thread will now block (wait)
// until the task finishes:
string[] result = task.Result;
selectRows();
if (result[0] != System.Net.HttpStatusCode.OK.ToString()
    && result[0] != System.Net.HttpStatusCode.Found.ToString()
    && result[0] != System.Net.HttpStatusCode.MovedPermanently.ToString())
{
//bad
notifyIcon1.ShowBalloonTip(5000, "Site Down", dataGrid.Rows[i].Cells[2].Value.ToString() + ", has a status code of:" + result[0], ToolTipIcon.Error);
dataGrid.Rows[i].DefaultCellStyle.BackColor = System.Drawing.Color.Wheat;
TimeSpan timeTaken = TimeSpan.Parse(result[1]);
dataGrid.Rows[i].Cells[3].Value = result[0];
dataGrid.Rows[i].Cells[3].Style.BackColor = System.Drawing.Color.Red;
dataGrid.Rows[i].Cells[4].Value = timeTaken.Seconds.ToString() + "." + String.Format("{0:0.00000}", timeTaken.Milliseconds.ToString()) + " seconds.";
string sec = (DateTime.Now.Second < 10) ? "0" + DateTime.Now.Second.ToString() : DateTime.Now.Second.ToString();
string min = (DateTime.Now.Minute < 10) ? "0" + DateTime.Now.Minute.ToString() : DateTime.Now.Minute.ToString();
string hour = (DateTime.Now.Hour < 10) ? "0" + DateTime.Now.Hour.ToString() : DateTime.Now.Hour.ToString();
dataGrid.Rows[i].Cells[5].Value = hour + ":" + min + ":" + sec;
//loadbar
}
else if (result[0] == "catch")//catch
{
notifyIcon1.ShowBalloonTip(10000, "SITE DOWN", dataGrid.Rows[i].Cells[1].Value.ToString() + ", Error:" +result[1], ToolTipIcon.Error);
dataGrid.Rows[i].Cells[3].Value = result[1];
dataGrid.Rows[i].Cells[3].Style.BackColor = System.Drawing.Color.Red;
//loadbar
}
else
{
//good
TimeSpan timeTaken = TimeSpan.Parse(result[1]);
dataGrid.Rows[i].Cells[3].Value = result[0];
dataGrid.Rows[i].Cells[3].Style.BackColor = System.Drawing.Color.LightGreen;
dataGrid.Rows[i].Cells[4].Value = timeTaken.Seconds.ToString() + "." + String.Format("{0:0.00000}", timeTaken.Milliseconds.ToString()) + " seconds.";
string sec = (DateTime.Now.Second < 10) ? "0" + DateTime.Now.Second.ToString() : DateTime.Now.Second.ToString();
string min = (DateTime.Now.Minute < 10) ? "0" + DateTime.Now.Minute.ToString() : DateTime.Now.Minute.ToString();
string hour = (DateTime.Now.Hour < 10) ? "0" + DateTime.Now.Hour.ToString() : DateTime.Now.Hour.ToString();
dataGrid.Rows[i].Cells[5].Value = hour + ":" + min + ":" + sec;
//loadbar
}
selectRows();
}
}
}
}
Checks a site:
/////////////////////////////////
////Check datagrid websites-button - returns response
/////////////////////////////////
private string[] checkSite(string url)
{
string[] response = new string[2];
url = dataGrid.Rows[0].Cells[2].Value.ToString();
if (url != null)
{
try
{
HttpWebRequest httpReq = (HttpWebRequest)WebRequest.Create(url);
httpReq.Timeout = 10000;
//loadbar
dataGrid.Rows[0].DefaultCellStyle.BackColor = System.Drawing.Color.Wheat;
System.Diagnostics.Stopwatch timer = new System.Diagnostics.Stopwatch();
timer.Start();
HttpWebResponse httpRes = (HttpWebResponse)httpReq.GetResponse(); //httpRes.Close();
timer.Stop();
//loadbar
HttpStatusCode httpStatus = httpRes.StatusCode;
response[0] = httpStatus.ToString();
response[1] = timer.Elapsed.ToString();//*
httpRes.Close();
return response;
}
catch (Exception he)
{
response[0] = "catch";
response[1] = he.Message;
return response;
}
}
response[0] = "catch";
response[1] = "No URL entered";
return response;
//dataGrid.Rows[i].DefaultCellStyle.BackColor = System.Drawing.Color.Blue;
}
Thanks in advance.
Assuming the code provided is the actual code used:
First of all, your definition of 'False result' and 'Good result' is wrong. If you expect A but get B, that doesn't mean B is invalid. If your wife is giving birth and you expect a boy but it turns out to be a girl, it's not a false result, just an unexpected one.
That said, let's analyze your work: if it takes a long, long time to check a site only to finally get a result which isn't a 200 response code, we can almost safely assume you are dealing with a timeout. If your router, Google, or any fundamental network device in between is having problems, it's expected to get an unexpected answer: "Timeout", "Bad Request", "Server not available", etc. Why would this happen? It's impossible to say for certain without having direct access to your environment.
Looking at your code, however, I see that you're using the default TaskScheduler to make each check run as a task in the background (assuming you haven't changed the default task scheduler, which would be a very bad practice to begin with). The default task scheduler schedules each task on the thread pool, which results in many, many tasks running simultaneously. Here we have a good candidate for overloading your network. Many sites (especially Google) are somewhat sensitive to handling many requests from the same source (especially if the frequency is high), so maybe Google is blocking you temporarily or holding you back. Again, at this point it's pure speculation, but the fact that you're running all checks simultaneously (unless the thread pool is at its max) is very likely the cause of your problem.
UPDATE
I would recommend working with a LimitedConcurrencyTaskScheduler (see here: http://blogs.msdn.com/b/pfxteam/archive/2010/04/09/9990424.aspx). There you can limit the number of tasks that can run concurrently. You will have to do some testing to find what number works best in your situation. Also make sure that the frequency is not too high. It's hard to define what is too high; only testing can prove that.
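A minimal sketch of how that could look, assuming the LimitedConcurrencyLevelTaskScheduler class from the linked article has been copied into your project (the concurrency level of 4 is an arbitrary starting point, not a tested value):
var scheduler = new LimitedConcurrencyLevelTaskScheduler(4); // at most 4 checks run at once
var factory = new TaskFactory(scheduler);
var tasks = new List<Task<string[]>>();
foreach (var u in urlsToCheck) // urlsToCheck stands in for the URLs read from your grid
{
    var url = u; // copy to avoid the closure-over-loop-variable pitfall
    tasks.Add(factory.StartNew(() => checkSite(url)));
}
Task.WaitAll(tasks.ToArray()); // or process each task's Result as it completes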
In order to simulate your scenario, I created a WinForm with a data grid and a button. On load of the form, I programmatically create a list of URLs (in a table) and bind it to the data grid. On button click, we start the download process. In short, you have to write more defensive code, and the following code is only a skeleton of how you can fix the issue.
using System;
using System.Data;
using System.Net;
using System.Threading.Tasks;
using System.Windows.Forms;
namespace app
{
public partial class Form1 : Form
{
DataTable urls = new DataTable();
public Form1()
{
InitializeComponent();
}
//Fill your uri's and bind to a data grid.
void InitTable()
{
//Silly logic to simulate your scenario.
urls = new DataTable();
urls.Columns.Add(new DataColumn("Srl", typeof(string)));
urls.Columns.Add(new DataColumn("Urls", typeof(Uri)));
urls.Columns.Add(new DataColumn("Result", typeof(string)));
DataRow dr = urls.NewRow();
dr["Srl"] = "1";
dr["Urls"] = new Uri("http://www.microsoft.com");
dr["Result"] = string.Empty;
urls.Rows.Add(dr);
dr = urls.NewRow();
dr["Srl"] = "2";
dr["Urls"] = new Uri("http://www.google.com");
dr["Result"] = string.Empty;
urls.Rows.Add(dr);
dr = urls.NewRow();
dr["Srl"] = "3";
dr["Urls"] = new Uri("http://www.stackoverflow.com");
dr["Result"] = string.Empty;
urls.Rows.Add(dr);
urls.AcceptChanges();
}
void UpdateResult()
{
dataGridView1.DataSource = urls;
}
//Important:
//This example will freeze the UI. You can avoid this by implementing a
//background worker or pool with some event synchronization. I haven't covered that area since
//we are addressing a different issue. Let me know if you would like to address the UI freeze
//issue, or you can do it yourself.
private void button1_Click(object sender, EventArgs e)
{
//Create an array of Tasks to parallelize the downloads.
var tasks = new Task<string[]>[urls.Rows.Count];
//Initialize those tasks based on the number of Uri's
for(int i=0;i<urls.Rows.Count;i++)
{
int index = i;//Do not change this. This is to avoid data race
//Assign responsibility and start task.
tasks[index] = new Task<string[]>(
() => checkSite(
new TaskInput(urls.Rows[index]["Urls"].ToString(), urls.Rows[index]["Srl"].ToString())));
tasks[index].Start();
}
//Wait for all tasks to complete. Check the other overloads if interested.
Task.WaitAll(tasks);
//This block shows how to access each task's result
foreach (var item in tasks)
{
DataRow[] rows=urls.Select("Srl='"+item.Result[2]+"'");
foreach (var row in rows)
row["Result"]=item.Result[0]+"|"+item.Result[1];
}
UpdateResult();
}
//This is a dummy method which in your case is 'checkSite'. You can have your own.
string[] checkSite(TaskInput input)
{
string[] response = new string[3];
if (input != null)
{
try
{
WebResponse wResponse = WebRequest.Create(input.Url).GetResponse();
response[0] = wResponse.ContentLength.ToString();
response[1] = wResponse.ContentType;
response[2] = input.Srl;
return response;
}
catch (Exception he)
{
response[0] = "catch";
response[1] = he.Message;
response[2] = input.Srl;
return response;
}
}
response[0] = "catch";
response[1] = "No URL entered";
response[2] = input.Srl;
return response;
}
private void Form1_Load(object sender, EventArgs e)
{
InitTable();
UpdateResult();
}
}
//Supply custom object for simplicity
public class TaskInput
{
public TaskInput(){}
public TaskInput(string url, string srl)
{
Url = url;
Srl = srl;
}
public string Srl { get; set; }
public string Url { get; set; }
}
}
Related
Hello everyone, and thanks for helping me in advance. The following question might sound stupid or incorrect, but I'm a beginner at this.
I have a method that gets some information from my database and sends it to an external database using a POST call, and a PATCH call in case the information has changed. I use Entity Framework. In that db table there are at least 165k rows.
My question is the following: is there a way to optimize and speed up the whole process? Maybe using multithreading or parallelism? I'm a beginner at this and I hope some of you can help me understand.
The method is the following:
public async Task<List<dynamic>> SyncOrdersTaskAsync(int PageSize)
{
int PageIndex = 0;
if (PageSize <= 0) PageSize = 100;
const string phrase = "The fields order, task_code must make a unique set";
var sorting = new SortingCriteria {
Properties = new string[] { "WkOpenDate ASC" } };
List<dynamic> listTest = new List<dynamic>();
using (var uow = this.Factory.BeginUnitOfWork())
{
var repo = uow.GetRepository<IWorkOrderRepository>();
var count = await repo.CountAllAsync();
count = 150;
for (PageIndex = 0; PageIndex <= count / PageSize; PageIndex++)
{
var paging = new PagingCriteria
{
PageIndex = PageIndex,
PageSize = PageSize
};
var rows = await repo.GetByCriteriaAsync(
"new {WkID, CompanyID, JobNo, JobTaskNo ,WkNumber, WkYear," +
"WkYard,WkCustomerID,CuName,WkDivisionID,DvName,BusinessUnit," +
"BusinessUnitManagerID,BusinessUnitManager,WkWorkTypeID,WtName," +
"WkActivityID,WkActivityDescription,NoteDescrLavoro,WkWOManagerID," +
"ProjectManager,IDMaster,ProjectCoordinator,WkOpenDate," +
"WkDataChiusa,Prov,CodiceSito,CodiceOffice,CodiceLavorazione," +
"CodiceNodo,DescrizioneNodo,WkPrevisionalStartDate,WkRealStartDate," +
"WkPrevisionalEndDate,WkRealEndDate,NumeroOrdine," +
"WkPrevisionalLabourAmount,TotaleCosti,SumOvertimeHours," +
"SumTravelHours,SumNormalHours,WkProgressPercentage,Stato,CUP,CIG," +
"TotaleManodopera,TotalePrestazioni,TotaleNoli,TotaleMateriali," +
"SumAuxiliaryHours,TipoCommessa,TotaleOrdine, WkPreventivoData," +
"WkConsuntivoData,TotaleFatturato,AggregateTotaleFatturato," +
"AggregateTotalePrestazioni,Contract,CustomerOrderNumber," +
"XmeWBECode,LastUpdateDate,PreGestWkID,CommercialNotes,Mandant," +
"GammaProjectName,WkInventoryDate,WkCloseFlag,WkNote," +
"TotalRegisteredLabour,TotalRegisteredPerformances," +
"TotalRegisteredLeasings,TotalRegisteredMaterials,FlagFinalBalance," +
"FinalBalance,OrderDate,TotalOrderDivision,SearchDescription," +
"TotaleBefToBeApproved,TotaleBefToBeApprovedLeasings," +
"TotaleLabourToBeApproved,AggregateLevel, AggregateTotalLabour," +
"AggregateTotalLeasings,AggregateTotalMaterials," +
"AggregateTotalRegisteredLabour," +
"AggregateTotalRegisteredPerformances," +
"AggregateTotalRegisteredLeasings," +
"AggregateTotalRegisteredMaterials," +
"AggregateTotalCost,AggregateSumNormalHours," +
"AggregateSumAuxiliaryHours,AggregateSumRainHours," +
"AggregateSumTravelHours,AggregateSumOvertimeHours," +
"AggregateWkPrevisionalLabourAmount,AggregateFinalBalance," +
"AggregateTotalOrder,AggregateTotalOrderDivision," +
"AggregateTotalBefToBeApproved," +
"AggregateTotalBefToBeApprovedLeasings," +
"AggregateTotalLabourToBeApproved,TotalProduction," +
"AggregateTotalProduction,JobTaskDescription}", paging, sorting);
String url = appSettings.Value.UrlV1 + "order_tasks/";
using (var httpClient = new HttpClient())
{
httpClient.DefaultRequestHeaders.Add("Authorization", "Token " +
await this.GetApiKey(true));
if (rows.Count() > 0)
{
foreach (var row in rows)
{
var testWork = (Model.WorkOrderCompleteInfo)Mapper
.MapWkOrdersCompleteInfo(row);
var orderIdDiv = await this.GetOrderForSyncing(httpClient,
testWork.JobNo);
var jsonTest = new JObject();
jsonTest["task_code"] = testWork.JobTaskNo;
jsonTest["description"] = testWork.JobTaskDescription;
jsonTest["order"] = orderIdDivitel.Id;
jsonTest["order_date"] = testWork.OrderDate.HasValue
? testWork.OrderDate.Value.ToString("yyyy-MM-dd")
: string.IsNullOrEmpty(testWork.OrderDate.ToString())
? "1970-01-01"
: testWork.OrderDate.ToString().Substring(0, 10);
jsonTest["progress"] = testWork.WkProgressPercentage;
var content = new StringContent(jsonTest.ToString(),
Encoding.UTF8, "application/json");
var result = await httpClient.PostAsync(url, content);
if (result.Content != null)
{
var responseContent = await result.Content
.ReadAsStringAsync();
bool alreadyExists = responseContent.Contains(phrase);
if (alreadyExists)
{
var taskCase = await GetTaskForSyncing(httpClient,
testWork.JobTaskNo, orderIdDiv.Id.ToString());
var idCase = taskCase.Id;
String urlPatch = appSettings.Value.UrlV1 +
"order_tasks/" + idCase + "/";
bool isSame = taskCase.Equals(testWork
.toSolOrderTask());
if (!isSame)
{
var resultPatch = await httpClient.PatchAsync(
urlPatch, content);
if (resultPatch != null)
{
var responsePatchContent = await resultPatch
.Content.ReadAsStringAsync();
var jsonPatchContent = JsonConvert
.DeserializeObject<dynamic>(
responsePatchContent);
listTest.Add(jsonPatchContent);
}
}
else
{
listTest.Add(taskCase.JobTaskNo_ +
" is already updated!");
}
}
else
{
var jsonContent = JsonConvert
.DeserializeObject<dynamic>(responseContent);
listTest.Add(jsonContent);
}
}
}
}
}
}
return listTest;
}
}
Maybe I need to apply parallelism in the for loop?
Again, really thanks to everyone in advance and I hope I was clear :)
The handiest tool currently available for parallelizing asynchronous work is the Parallel.ForEachAsync method, which was introduced in .NET 6. Your code is quite complex though, and deciding where to put this loop is not obvious.
Ideally you would like to call the Parallel.ForEachAsync only once, so that it parallelizes your work with a single configurable degree of parallelism from start to finish. Generally you don't want to put this method inside an outer for/foreach loop, because then the degree of parallelism will fluctuate during the whole operation. But since your code is complex, I would go the easy way and do just that. I would replace this code:
foreach (var row in rows)
{
//...
}
...with this:
ParallelOptions options = new() { MaxDegreeOfParallelism = 2 };
await Parallel.ForEachAsync(rows, options, async (row, _) =>
{
//...
});
You have to make one more change. The List<T> is not thread-safe, so it will get corrupted if you call Add from multiple threads without synchronization. You can either add a lock (listTest) around each listTest.Add, or replace the list with a concurrent collection. My suggestion is to do the latter:
ConcurrentQueue<dynamic> listTest = new();
//...
listTest.Enqueue(jsonContent);
//...
return listTest.ToList();
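For reference, the lock-based alternative mentioned above is a small change at each call site:
lock (listTest)
{
    listTest.Add(jsonContent);
}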
After doing these changes, hopefully your code will still work correctly, and it will be running a bit faster. Then you'll have to experiment with the MaxDegreeOfParallelism setting, until you find the one that yields the optimal performance. Don't go crazy with large values like 100 or 1000. In most cases overparallelizing is harmful, and might yield worse performance than not parallelizing at all.
I have written an app that goes through our own properties and scrapes the data. To make sure I don't run through the same URLs, I am using a MySQL database to store each URL and flag it once it's processed. All this was being done in a single thread, and it was fine when I had only a few thousand entries. But now I have a few hundred thousand entries to parse, so I need to make changes in the code (I am a newbie at multithreading in general). I found an example and was trying to copy the style, but it doesn't seem to work. Does anyone know what the issue is with the following code?
EDIT: Sorry, I didn't mean to make people guess the issue; it was stupid of me not to include the exception. Here it is:
"System.InValidCastException: 'Specified cast is not valid.'"
When I start the process, it collects the URLs from the database and then never hits the DoWork method.
//This will get the entries from the database
List<Mappings> items = bot.GetUrlsToProcess(100);
if (items != null)
{
var tokenSource = new CancellationTokenSource();
var token = tokenSource.Token;
Worker.Done = new Worker.DoneDelegate(WorkerDone);
foreach (var item in items)
{
urls.Add(item.Url);
WaitingTasks.Enqueue(new Task(id => new Worker().DoWork((int)id, item.Url, token), item.Url, token));
}
LaunchTasks();
}
static async void LaunchTasks()
{
// keep checking until we're done
while ((WaitingTasks.Count > 0) || (RunningTasks.Count > 0))
{
// launch tasks when there's room
while ((WaitingTasks.Count > 0) && (RunningTasks.Count < MaxRunningTasks))
{
Task task = WaitingTasks.Dequeue();
lock (RunningTasks) RunningTasks.Add((int)task.AsyncState, task);
task.Start();
}
UpdateConsole();
await Task.Delay(300); // wait before checking again
}
UpdateConsole(); // all done
}
static void UpdateConsole()
{
Console.Write(string.Format("\rwaiting: {0,3:##0} running: {1,3:##0} ", WaitingTasks.Count, RunningTasks.Count));
}
static void WorkerDone(int id)
{
lock (RunningTasks) RunningTasks.Remove(id);
}
public class Worker
{
public delegate void DoneDelegate(int taskId);
public static DoneDelegate Done { private get; set; }
public async void DoWork(object id, string url, CancellationToken token)
{
if (token.IsCancellationRequested) return;
Content obj;
try
{
int tries = 0;
bool IsUrlProcessed = true;
DateTime dtStart = DateTime.Now;
string articleDate = string.Empty;
try
{
ScrapeWeb bot = new ScrapeWeb();
SearchApi searchApi = new SearchApi();
SearchHits searchHits = searchApi.Url(url, 5, 0);
if (searchHits.Hits.Count() == 0)
{
obj = await bot.ReturnArticleObject(url);
if (obj.Code != HttpStatusCode.OK)
{
Console.WriteLine(string.Format("\r Status is {0}", obj.Code));
tries = itemfound.UrlMaxTries + 1;
IsUrlProcessed = false;
itemfound.HttpCode = obj.Code;
}
else
{
string title = obj.Title;
string content = obj.Contents;
string description = obj.Description;
Articles article = new Articles();
article.Site = url.GetSite();
article.Content = content;
article.Title = title;
article.Url = url.ToLower();
article.Description = description;
string strThumbNail = HtmlHelper.GetImageUrl(url, obj.RawResponse);
article.Author = HtmlHelper.GetAuthor(url, obj.RawResponse);
if (!string.IsNullOrEmpty(strThumbNail))
{
//This condition needs to be added to remove ?n=<number> from EP thumbnails
if (strThumbNail.Contains("?"))
{
article.ImageUrl = strThumbNail.Substring(0, strThumbNail.IndexOf("?")).Replace("http:", "https:");
}
else
article.ImageUrl = strThumbNail.Replace("http:", "https:");
}
else
{
article.ImageUrl = string.IsNullOrEmpty(strThumbNail) ? article.Url.GetDefaultImageUrls() : strThumbNail.Replace("http:", "https:");
}
articleDate = HtmlHelper.GetPublishDate(url, obj.RawResponse);
if (string.IsNullOrEmpty(articleDate))
article.Pubdate = DateTime.Now;
else
article.Pubdate = DateTime.Parse(articleDate);
var client = new Index(searchApi);
var result = client.Upsert(article);
itemfound.HttpCode = obj.Code;
if (result)
{
itemfound.DateCreated = DateTime.Parse(articleDate);
itemfound.DateModified = DateTime.Parse(articleDate);
UpdateItem(itemfound);
}
else
{
tries = itemfound.UrlMaxTries + 1;
IsUrlProcessed = false;
itemfound.DateCreated = DateTime.Parse(articleDate);
itemfound.DateModified = DateTime.Parse(articleDate) == null ? DateTime.Now : DateTime.Parse(articleDate);
UpdateItem(itemfound, tries, IsUrlProcessed);
}
}
}
else
{
tries = itemfound.UrlMaxTries + 1;
IsUrlProcessed = true;
itemfound.HttpCode = HttpStatusCode.OK;
itemfound.DateCreated = DateTime.Parse(articleDate);
itemfound.DateModified = DateTime.Parse(articleDate) == null ? DateTime.Now : DateTime.Parse(articleDate);
}
}
catch (Exception e)
{
tries = itemfound.UrlMaxTries + 1;
IsUrlProcessed = false;
itemfound.DateCreated = DateTime.Parse(articleDate);
itemfound.DateModified = DateTime.Parse(articleDate) == null ? DateTime.Now : DateTime.Parse(articleDate);
}
finally
{
DateTime dtEnd = DateTime.Now;
Console.WriteLine(string.Format("\r Total time taken to process items is {0}", (dtEnd - dtStart).TotalSeconds));
}
}
catch (Exception e)
{
Console.WriteLine(e);
}
Done((int)id);
}
}
All this code is based on the Best multi-thread approach for multiple web requests link. Can someone tell me how to get this approach running?
I think the problem is in the way you're creating your tasks:
new Task(id => new Worker().DoWork((int)id, item.Url, token), item.Url, token)
This Task constructor overload expects an Action<object> delegate. That means id will be typed as object, and you need to cast it back to something useful first.
Parameters

action
Type: System.Action<Object>
The delegate that represents the code to execute in the task.

state
Type: System.Object
An object representing data to be used by the action.

cancellationToken
Type: System.Threading.CancellationToken
The CancellationToken that the new task will observe.
You decided to cast it to int by calling (int)id, but you're passing item.Url as the state object. I can't tell you 100% what the type of Url is, but I wouldn't expect a property named Url to be of type int.
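If you'd rather keep DoWork's int parameter, one fix (a sketch) is to pass an actual int as the task's state; this also makes the (int)task.AsyncState cast in LaunchTasks valid:
int i = 0;
foreach (var item in items)
{
    urls.Add(item.Url);
    int id = i; // copy so each task captures its own value
    WaitingTasks.Enqueue(new Task(
        state => new Worker().DoWork((int)state, item.Url, token),
        id,    // the state is now a boxed int, so the (int) casts succeed
        token));
    i++;
}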
Based on what @MarcinJuraszek said, I just went back to my code and added an int, as I couldn't find another way to resolve it. Here is the change I made:
int i = 0;
foreach (var item in items)
{
urls.Add(item.Url);
WaitingTasks.Enqueue(new Task(id => new Worker().DoWork((string)id, item.Url, token), item.Url, token));
i++;
}
I have this code to load and count data from an API server:
class TestNetWork
{
private Task taskFillPicker;
private List<CityItemDB> itemsCity;
private CustomPicker cpCity;
public async Task FillPicker()
{
try {
JObject res = await SuperFUNC.GET_CITY_ACTIVE_SENDER();
if(res == null){
//null
}else{
string message = res["message"].ToString();
if(message.Equals("Success")){
itemsCity.Clear();
cpCity.Items.Clear();
JArray data = (JArray)res["data"];
int count = data.Count;
for (int i = 0; i < count; i++) {
CityItemDB node = new CityItemDB();
node.cityId = Int32.Parse(data[i]["cityId"].ToString());
node.cityName = data[i]["cityName"].ToString();
itemsCity.Add(node);
cpCity.Items.Add(node.ToString());
}
}else{
//null
}
}
} catch (Exception ex) {
Debug.WriteLine (TAG + " : " + ex.StackTrace);
}
}
public TestNetWork()
{
this.itemsCity = new List<CityItemDB> ();
this.cpCity = new CustomPicker {
HeightRequest = 40,
TextColor = Color.FromHex("#5a5a5a"),
Title = "City Choose",
};
taskFillPicker = FillPicker ();
Debug.WriteLine (COUNT + " : " + itemsCity.Count);
}
}
But the console prints me COUNT : 0. I'm sure the code that gets and parses the JSON from the internet is correct; the picker shows the full data, but List<CityItemDB> itemsCity has a count of 0.
Thanks for reading, and sorry that my English is not good!
You need to await the task, otherwise execution might continue before FillPicker has completed:
taskFillPicker = FillPicker ();
await taskFillPicker;
As this code is in a constructor where await is not possible, I suggest moving it to a separate async method:
public async Task Init()
{
taskFillPicker = FillPicker ();
await taskFillPicker;
Debug.WriteLine (COUNT + " : " + itemsCity.Count);
}
You have to write a little bit more code to construct the object now:
var n = new TestNetWork();
await n.Init();
I am making a Bloomberg web service GetData call for the "DEBT_TO_EQUITY_FUNDAMENTALS_TKR" field. I am setting secmaster = true and asking for a single instrument with a CUSIP identifier (with yellowkey = MarketSector.Corp).
This strikes me as a fairly lightweight call, having seen people ask for thousands of instruments and dozens of fields at once.
I have played around with lots of different settings, but I just can't get this request to return in a few seconds. It gives me the correct return value, but it takes longer than 60 seconds.
Any idea if it is possible to get such a request to execute and return in a few seconds?
Thanks
EDIT - Here is the code I am running:
public string GetFundamentalTicker(string identifier, InstrumentType identifierType = InstrumentType.CUSIP)
{
PerSecurityWS ps = new PerSecurityWS();
try
{
log.DebugFormat("Cert path is: {0}", CertPath);
X509Certificate2 clientCert = new X509Certificate2(CertPath, "<password_redacted>");
ps.ClientCertificates.Add(clientCert);
}
catch (Exception e)
{
log.ErrorFormat("Error in cert setup - {0} - {1}", e.Message, e.InnerException == null ? "" : e.InnerException.Message);
return null;
}
//Set request header
GetDataHeaders getDataHeaders = new GetDataHeaders();
getDataHeaders.secmaster = true;
getDataHeaders.secmasterSpecified = true;
//getDataHeaders.fundamentals = true;
//getDataHeaders.fundamentalsSpecified = true;
//getDataHeaders.programflag = ProgramFlag.oneshot;//unnecessary - defaults to this anyway
//getDataHeaders.programflagSpecified = true;
//getDataHeaders.pricing = true;
getDataHeaders.secid = identifierType;
getDataHeaders.secidSpecified = true;
SubmitGetDataRequest sbmtGtDtreq = new SubmitGetDataRequest();
sbmtGtDtreq.headers = getDataHeaders;
sbmtGtDtreq.fields = new string[] {
"DEBT_TO_EQUITY_FUNDAMENTALS_TKR"
};
int currentFundYear = DateTime.Now.Year;
//var fundYears = new List<int>();
List<Instrument> fundYearInstruments = new List<Instrument>();
Instrument fundYearInstrument = null;
fundYearInstrument = new Instrument();
fundYearInstrument.id = identifier;
fundYearInstrument.typeSpecified = true;
fundYearInstrument.type = identifierType;
fundYearInstrument.yellowkey = MarketSector.Corp;
fundYearInstrument.yellowkeySpecified = true;
//fundYearInstrument.overrides = new Override[] {};//{ new Override() { field = "EQY_FUND_YEAR", value = currentFundYear.ToString() } };
fundYearInstruments.Add(fundYearInstrument);
//fundYears.Add(-1);
Instrument[] instr = fundYearInstruments.ToArray();
Instruments instrs = new Instruments();
instrs.instrument = instr;
sbmtGtDtreq.instruments = instrs;
try
{
SubmitGetDataResponse sbmtGtDtResp = ps.submitGetDataRequest(sbmtGtDtreq);
RetrieveGetDataRequest rtrvGtDrReq = new RetrieveGetDataRequest();
rtrvGtDrReq.responseId = sbmtGtDtResp.responseId;
RetrieveGetDataResponse rtrvGtDrResp;
do
{
System.Threading.Thread.Sleep(POLL_INTERVAL);
rtrvGtDrResp = ps.retrieveGetDataResponse(rtrvGtDrReq);
}
while (rtrvGtDrResp.statusCode.code == DATA_NOT_AVAILABLE);
if (rtrvGtDrResp.statusCode.code == SUCCESS)
{
for (int i = 0; i < rtrvGtDrResp.instrumentDatas.Length; i++)
{
for (int j = 0; j < rtrvGtDrResp.instrumentDatas[i].data.Length; j++)
{
if (rtrvGtDrResp.instrumentDatas[i].data[j].value == "N.A." || rtrvGtDrResp.instrumentDatas[i].data[j].value == "N.S." || rtrvGtDrResp.instrumentDatas[i].data[j].value == "N.D.")
rtrvGtDrResp.instrumentDatas[i].data[j].value = null;
return rtrvGtDrResp.instrumentDatas[i].data[j].value;
}
}
return null;
}
else if (rtrvGtDrResp.statusCode.code == REQUEST_ERROR)
{
log.ErrorFormat("Error in the submitted request: {0}", rtrvGtDrResp.statusCode.description);
return null;
}
}
catch (Exception e)
{
log.ErrorFormat("Error in GetData - {0} - {1}", e.Message, e.InnerException == null ? "" : e.InnerException.Message);
return null;
}
return null;
}
Poll interval is 5 seconds and the SOAP web service url is:
https://software.bloomberg.com/datalicensewp/dlws.wsdl
I am having the same issue. I found out that there is a difference between making the same call to the Bloomberg API from, for example, a console app (works very fast) and a web service (takes a lot of time to start the session).
The difference is that the console app runs under the same user as the bbcomm process, whereas the web service (or actually the IIS process) runs under the System account. You can try to log out all users on the PC where the web service is hosted and then make the call. In this case, I guess, bbcomm runs under the System account as no one else is logged in, and works fast. It worked for me and the call was answered instantly.
I have a netbook with a 1.20GHz processor and 1GB of RAM.
I'm running a C# WinForms app on it which, at 5-minute intervals, reads every line of a text file and, depending on the content of that line, either skips it or writes it to an XML file. Sometimes it may be processing about 2000 lines.
When it begins this task, the processor gets maxed out at 100% use. However, on my desktop with a 2.40GHz processor and 3GB of RAM it's untouched (for obvious reasons)... is there any way I can reduce this processor load dramatically? The code isn't complex, I'm not bad at coding either, and I'm not constantly opening the file, reading and writing... it's all done in one fell swoop.
Any help greatly appreciated!?
Sample Code
Timer setup:
#region Timers Setup
aTimer.Tick += new EventHandler(OnTimedEvent);
aTimer.Interval = 60000;
aTimer.Enabled = true;
aTimer.Start();
radioButton60Mins.Checked = true;
#endregion Timers Setup
private void OnTimedEvent(object source, EventArgs e)
{
string msgLoggerMessage = "Checking For New Messages " + DateTime.Now;
listBoxActivityLog.Items.Add(msgLoggerMessage);
MessageLogger messageLogger = new MessageLogger();
messageLogger.LogMessage(msgLoggerMessage);
if (radioButton1Min.Checked)
{
aTimer.Interval = 60000;
}
if (radioButton60Mins.Checked)
{
aTimer.Interval = 3600000;
}
if (radioButton5Mins.Checked)
{
aTimer.Interval = 300000;
}
// split the file into a list of sms messages
List<SmsMessage> messages = smsPar.ParseFile(smsPar.CopyFile());
// sanitize the list to get rid of stuff we don't want
smsPar.SanitizeSmsMessageList(messages);
ApplyAppropriateColoursToRecSMSListinDGV();
}
public List<SmsMessage> ParseFile(string filePath)
{
List<SmsMessage> list = new List<SmsMessage>();
using (StreamReader file = new StreamReader(filePath))
{
string line;
while ((line = file.ReadLine()) != null)
{
var sms = ParseLine(line);
list.Add(sms);
}
}
return list;
}
public SmsMessage ParseLine(string line)
{
string[] words = line.Split(',');
for (int i = 0; i < words.Length; i++)
{
words[i] = words[i].Trim('"');
}
SmsMessage msg = new SmsMessage();
msg.Number = int.Parse(words[0]);
msg.MobNumber = words[1];
msg.Message = words[4];
msg.FollowedUp = "Unassigned";
msg.Outcome = string.Empty;
try
{
//DateTime Conversion!!!
string[] splitWords = words[2].Split('/');
string year = splitWords[0].Replace("09", "20" + splitWords[0]);
string dateString = splitWords[2] + "/" + splitWords[1] + "/" + year;
string timeString = words[3];
string wholeDT = dateString + " " + timeString;
DateTime dateTime = DateTime.Parse(wholeDT);
msg.Date = dateTime;
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
Application.Exit();
}
return msg;
}
public void SanitizeSmsMessageList(List<SmsMessage> list)
{
// strip out unwanted messages
// list.Remove(some_message); etc...
List<SmsMessage> remove = new List<SmsMessage>();
foreach (SmsMessage message in list)
{
if (message.Number > 1)
{
remove.Add(message);
}
}
foreach (SmsMessage msg in remove)
{
list.Remove(msg);
}
//Fire Received messages to xml doc
ParseSmsToXMLDB(list);
}
public void ParseSmsToXMLDB(List<SmsMessage> list)
{
try
{
if (File.Exists(WriteDirectory + SaveName))
{
xmlE.AddXMLElement(list, WriteDirectory + SaveName);
}
else
{
xmlE.CreateNewXML(WriteDirectory + SaveName);
xmlE.AddXMLElement(list, WriteDirectory + SaveName);
}
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
Application.Exit();
}
}
public void CreateNewXML(string writeDir)
{
try
{
XElement Database = new XElement("Database");
Database.Save(writeDir);
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
}
}
public void AddXMLElement(List<SmsMessage> messages, string writeDir)
{
try
{
XElement Database = XElement.Load(writeDir);
foreach (SmsMessage msg in messages)
{
if (!DoesExist(msg.MobNumber, writeDir))
{
Database.Add(new XElement("SMS",
new XElement("Number", msg.MobNumber),
new XElement("DateTime", msg.Date),
new XElement("Message", msg.Message),
new XElement("FollowedUpBy", msg.FollowedUp),
new XElement("Outcome", msg.Outcome),
new XElement("Quantity", msg.Quantity),
new XElement("Points", msg.Points)));
EventNotify.SendNotification("A New Message Has Arrived!", msg.MobNumber);
}
}
Database.Save(writeDir);
EventNotify.UpdateDataGridView();
EventNotify.UpdateStatisticsDB();
}
catch (Exception e)
{
MessageBox.Show(e.ToString());
}
}
public bool DoesExist(string number, string writeDir)
{
XElement main = XElement.Load(writeDir);
return main.Descendants("Number")
.Any(element => element.Value == number);
}
Use a profiler and/or Performance Monitor and/or \\live.sysinternals.com\tools\procmon.exe and/or ResourceMonitor to determine what's going on
If the 5 minute process is a background task, you can make use of Thread Priority.
MSDN here.
If you do the processing on a separate thread and change your timer to a System.Threading.Timer with callback events, you should be able to set a lower priority on that thread than on the rest of your application.
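A sketch of that idea. Since timer callbacks run on thread-pool threads (whose priority you shouldn't change), one simple alternative is a dedicated low-priority background thread with a sleep loop; ProcessMessages is a hypothetical stand-in for your parse/sanitize/write pass:
var worker = new Thread(() =>
{
    while (true)
    {
        ProcessMessages(); // hypothetical: your existing 5-minute work
        Thread.Sleep(TimeSpan.FromMinutes(5));
    }
})
{
    IsBackground = true,                  // dies with the application
    Priority = ThreadPriority.BelowNormal // yields CPU to the UI and other processes
};
worker.Start();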
Inside your ParseFile loop, you could try adding a Thread.Sleep and/or an Application.DoEvents() call to see if that helps. It's better to do this if the parsing is on a separate thread, but at least you can try this simple test to see if it helps.
It might be that the MessageBoxes in your catches are running into cross-thread problems. Try swapping them out for writes to the trace output.
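For example, a drop-in replacement inside the catch blocks:
catch (Exception e)
{
    System.Diagnostics.Trace.WriteLine(e); // instead of MessageBox.Show(e.ToString());
}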
In any case, you've posted an entire (little) program, which will not help you get specific advice. Try deleting method bodies -- one at a time, scientifically -- and try to get the problem to occur/stop occurring. This will help you to locate the problem and eliminate the irrelevant parts of your question (both for yourself and for SO).
Your current processing model is batch-based: do the parsing, then process the messages, and so on.
You'll likely reduce the memory overhead if you switch to a LINQ-style "pull" approach.
For example, you could convert your ParseFile() method in this way:
public IEnumerable<SmsMessage> ParseFile(string filePath)
{
using (StreamReader file = new StreamReader(filePath))
{
string line;
while ((line = file.ReadLine()) != null)
{
var sms = ParseLine(line);
yield return sms;
}
}
}
The advantage is that each SmsMessage can be handled as it is generated, instead of parsing all of the messages at once and then handling all of them.
This lowers your memory overhead, which is one of the most likely causes for the performance difference between your netbook and your desktop.
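For instance, the calling code could then stream each message straight into the XML step instead of building intermediate lists (WriteToXml is a hypothetical per-message writer; the Number check mirrors what SanitizeSmsMessageList does):
foreach (var sms in ParseFile(smsPar.CopyFile()))
{
    if (sms.Number > 1)
        continue; // same filter SanitizeSmsMessageList applies, without a second list

    WriteToXml(sms); // hypothetical: append this single message to the XML file
}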