EasyNetQ/RabbitMQ client not getting response in IIS Web Application - c#

I'm using EasyNetQ/RabbitMQ to connect a web application running in IIS to a Windows service. However, I can't seem to get the request/response to work. I'm pretty sure it's not a code problem, but I'm running out of ideas where to look next.
This is a test console application that works just perfectly:
static void Main()
{
    var stopHandle = new ManualResetEventSlim();
    var producer = new Thread(() =>
    {
        string cs = "host=localhost;virtualHost=/;username=demo;password=y4xHEyq3nQOd;timeout=120;persistentMessages=false;prefetchcount=1";
        var bus = RabbitHutch.CreateBus(cs, x => x.Register<IEasyNetQLogger>(s => new EasyNetQ.Loggers.ConsoleLogger()));
        while (!stopHandle.IsSet)
        {
            try
            {
                var result = bus.Request<AutomationRequest, AutomationResponse>(new AutomationRequest
                {
                    taskId = "140061381555",
                    arguments = "type=pdf",
                    issueId = 97630548355,
                });
                Console.WriteLine("Result: " + result.status + ", " + result.status + ", " + result.downloadUrl);
            }
            catch (Exception)
            {
                Console.WriteLine("Failed");
            }
            if (!stopHandle.IsSet) Thread.Sleep(1000);
        }
        bus.Dispose();
    });
    producer.Start();
    Console.ReadLine();
    stopHandle.Set();
    producer.Join();
}
This on the other hand is the same test code but as a web application:
namespace WebApplication2
{
    public partial class _Default : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            string cs = "host=localhost;virtualHost=/;username=demo;password=y4xHEyq3nQOd;timeout=120;persistentMessages=false;prefetchcount=1";
            IBus bus = RabbitHutch.CreateBus(cs, x => x.Register<IEasyNetQLogger>(s => new EasyNetQ.Loggers.ConsoleLogger()));
            try
            {
                var result = bus.Request<AutomationRequest, AutomationResponse>(new AutomationRequest
                {
                    taskId = "140061381555",
                    arguments = "type=pdf",
                    issueId = 97630548355,
                });
                Console.WriteLine("Result: " + result.status + ", " + result.status + ", " + result.downloadUrl);
            }
            catch (Exception)
            {
                Console.WriteLine("Failed");
            }
            bus.Dispose();
        }
    }
}
The web application sends the message just fine; it is queued and processed on the back end. The back end provides the response with the correct correlation ID and queues it. However, for some reason unknown to me, the web application never gets the response.
The web application eventually gets a timeout. However, the length of the timeout is not the problem, as getting the response into the queue takes only about two seconds, and changing the timeout to two minutes on both sides does not make any difference.
Does somebody have a ready solution for what to fix? Or any ideas where to look? I have googled everything I could figure out about EasyNetQ and RabbitMQ, but I didn't even find anybody reporting similar problems.
--karri
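Not a confirmed fix, but for comparison: a common pattern with EasyNetQ in IIS is to create the bus once per application (for example in Global.asax) rather than per page request, so the connection and the response-queue subscription outlive the page. A minimal sketch, assuming the same connection string and the synchronous Request API used above (the Global.Bus property is illustrative, not part of the question's code):
using System;
using System.Web;
using EasyNetQ;

public class Global : HttpApplication
{
    // One bus (one RabbitMQ connection and one response-queue subscription)
    // shared for the whole application lifetime, instead of one per page request.
    public static IBus Bus { get; private set; }

    protected void Application_Start(object sender, EventArgs e)
    {
        string cs = "host=localhost;virtualHost=/;username=demo;password=y4xHEyq3nQOd;timeout=120;persistentMessages=false;prefetchcount=1";
        Bus = RabbitHutch.CreateBus(cs);
    }

    protected void Application_End(object sender, EventArgs e)
    {
        if (Bus != null) Bus.Dispose();
    }
}

// Page_Load would then reuse the shared bus:
// var result = Global.Bus.Request<AutomationRequest, AutomationResponse>(request);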

Related

How to use C# HTTPClient with Windows Credentials

I wrote a console app that will read a text file of links and test them using the HttpClient class to check if the links exist. It works wonderfully at home when tested against common links such as google.com.
When I run the app under my work intranet, though, I get "Forbidden" errors when it checks links on our company SharePoint and "Unauthorized" errors when it checks links on the Azure website. My hope was that running it on my authorized Windows desktop PC would be all I needed for it to work, but no.
Any hints on how to pass my network credentials when I access the links with HttpClient?
EDIT: Added code to handle passed console argument (string of authentication cookies)
using System;
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

namespace linkbot
{
    class Program
    {
        private static async Task ProcessRepositories(string the_url, string the_cookies)
        {
            int the_index = the_url.IndexOf("/", 9);
            string base_string = the_url;
            string the_rest = "/";
            if (the_index >= 0)
            {
                base_string = the_url.Substring(0, the_index);
                the_rest = the_url.Substring(the_index);
            }
            try
            {
                var baseAddress = new Uri(base_string);
                using (var handler = new HttpClientHandler { UseCookies = false })
                using (var client = new HttpClient(handler) { BaseAddress = baseAddress })
                {
                    var message = new HttpRequestMessage(HttpMethod.Get, the_rest);
                    message.Headers.Add("Cookie", the_cookies);
                    var result = await client.SendAsync(message);
                    result.EnsureSuccessStatusCode();
                }
                Write("\n" + the_url + " - WORKED!");
            }
            catch (Exception e)
            {
                Write("\nFailed: " + the_url + "||||||" + e.ToString());
                //throw e;
            }
        }

        static async Task Main(string[] args)
        {
            if (args.Length < 2)
            {
                Console.Write("\n###################\nLinkChecker by Sean J. Miller 2022\n\nusage:\n linkchecker.exe <linkfile> <cookies>\nwhere <linkfile> contains a text file of fully specified links (URLs) with one link per line. Example, https://www.google.com\n\nAn output file is generated titled brokenlinks.txt in the same directory where LinkChecker was launched. <cookies> should be a string such as \"cookie1=value1;cookie2=value2;cookie3=value3;\"\n###################\n\n\n");
                return;
            }
            System.IO.File.Delete("brokenlinks.txt");
            System.IO.File.WriteAllText("brokenlinks.txt", "Last Check Started " + (DateTime.Now).ToString());
            int counter = 0;
            int total_lines = TotalLines(args[0]);
            Console.Write("Started " + (DateTime.Now).ToString() + "\n");
            foreach (string line in System.IO.File.ReadLines(args[0]))
            {
                Console.Write("Processing Link " + (++counter) + "/" + total_lines + "\n");
                await ProcessRepositories(line, args[1]);
            }
            Console.Write("Finished " + (DateTime.Now).ToString() + "\n");
            Write("\n");
        }

        static void Write(string the_text)
        {
            //Console.Write(the_text);
            try
            {
                System.IO.File.AppendAllText("brokenlinks.txt", the_text);
            }
            catch (Exception ex)
            {
                throw ex;
            }
        }

        static int TotalLines(string filePath)
        {
            using (StreamReader r = new StreamReader(filePath))
            {
                int i = 0;
                while (r.ReadLine() != null) { i++; }
                return i;
            }
        }
    }
}
You want to set UseDefaultCredentials to true to use the current logged-on user credentials in your request. You can do that by instantiating your HttpClient like so:
private static readonly HttpClient client = new HttpClient(new HttpClientHandler() { UseDefaultCredentials = true });
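If you also want to keep the manual Cookie header from the question, the same flag can go on the existing handler inside ProcessRepositories. A sketch of how the two combine (using the variable names from the question; not tested against your SharePoint/Azure setup):
using (var handler = new HttpClientHandler { UseCookies = false, UseDefaultCredentials = true })
using (var client = new HttpClient(handler) { BaseAddress = baseAddress })
{
    // The request now carries both the explicit Cookie header and the
    // current Windows identity (NTLM/Kerberos) for intranet sites.
    var message = new HttpRequestMessage(HttpMethod.Get, the_rest);
    message.Headers.Add("Cookie", the_cookies);
    var result = await client.SendAsync(message);
    result.EnsureSuccessStatusCode();
}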

How to fix high CPU usage found using performance profiling results?

Intro
We are building a C# data collector that collects tagged data from an OPC source (data is sampled every half second). The code runs well except that it uses around 25% of the CPU (Xeon 2.8 GHz, 8 GB). Using the Visual Studio Performance Profiler, we got the following:
(screenshot: overall CPU usage)
We identified two functions with high CPU consumption (screenshot: hot path leading to the two high-consumption functions).
We noticed that the function itself doesn't consume much CPU (Self CPU around 1.5%) but accounts for around 50% of the whole app's consumption (Total CPU around 50%).
Digging further into the PrepareTags() function, we found that the SemaphoreSlim.Release() and Wait() methods are mostly responsible for the CPU usage (screenshot: PrepareTags() function consumption).
Global code structure
When the program begins, the three main threads are started. The one I'm focusing on here is the REST API thread. These threads are "permanent" and kept active as long as the application runs. Also, each main thread will start new threads; some are permanent while others execute and stop.
Based on my observations, around 40 threads are simultaneously active, with small variations caused by threads being started and stopped. The thread count stays constant, so there don't seem to be any deadlocks.
3 Main threads are starting
public void StartThreads()
{
    WriteLog.WriteFollowLog("CollecterService : is Started");
    //set up default culture
    CultureInfo culture = CultureInfo.CreateSpecificCulture(ConfigurationManager.AppSettings["Culturesetting"]);
    CultureInfo.DefaultThreadCurrentCulture = culture;
    CultureInfo.DefaultThreadCurrentUICulture = culture;
    listOfTagUpdated = new SemaphoreSlim(0);
    listOfTag = new Dictionary<string, TagSourceAddressObject>();
    tcpMemory = new TagMemoryFinFout();
    tagAdressMemoryForStat = new TagAdressMemoryFinFout();
    tagMemory = new TagMemoryLastValue(false);
    //configuration
    try
    {
        // Creation of the Stat thread
        this.statThread = new ThreadForCalculation(listOfTag, tagAdressMemoryForStat, tagMemory, tcpMemory);
        // Creation of the REST API client thread
        this.restApiThread = new ThreadRestApiClient(listOfTag, tcpMemory, listOfTagUpdated, tagMemory);
        // Creation of the OPCDA Thread
        this.opcDaThread = new ThreadOpcDaClient(listOfTag, tcpMemory, tagAdressMemoryForStat, tagMemory, listOfTagUpdated);
        if (configuration != null)
        {
            this.configuration.StartTracking(this.restApiThread);
        }
    }
    catch (Exception ex)
    {
        WriteLog.WriteErrorLog(ex);
    }
}
The RestAPIThreadClient will also start 3 threads including the permanent "PrepareTags" thread.
RestAPIThread permanent thread starts 3 more permanent threads
...
// Start the authentication and the websocket handlers for this connection
this.AuthenticationConnectionThread = new Thread(CheckConnectionAuthentication);
this.AuthenticationConnectionThread.Name = "AuthentificationConnectionThread";
this.AuthenticationConnectionThread.Start();
this.WebsocketConnectionThread = new Thread(CheckWebsocketConnection);
this.WebsocketConnectionThread.Name = "WebsocketConnectionThread";
this.WebsocketConnectionThread.Start();
this.TcpPrepareTagThread = new Thread(PrepareTags);
this.TcpPrepareTagThread.Name = "TcpPrepareThread";
this.TcpPrepareTagThread.Start();
...
For better comprehension, here is what a "TagValueObject" looks like:
TagValueObject newTag = new TagValueObject(null);
newTag.TagID = tag.TagID;
newTag.TagQuality = tag.TagQuality;
newTag.TagTimeStamp = tag.TagTimeStamp;
newTag.TagValue = tag.TagValue;
newTag.TagName = tag.TagName;
The overall process is:
1. The OPC server sends tag data to our collector (every 1/2 second)
2. The collector stores these in memory
3. After some processing they are redirected to an API on AWS
The high CPU usage is observed in step 3.
What happens in step 3
1. PrepareTags() gets called.
Function PrepareTags()
private void PrepareTags()
{
    WriteLog.WriteFollowLog("TCPRestAPIClient : Is Started");
    while (!stopWorking)
    {
        if (token != null)
        {
            TagValueObject tag = tcpMemory.GetFirstTag();
            if (tag != null)
            {
                tagsToSend.Add(tag);
                int numOfTagsToReturn = 50;
                if ((readyToSend == true && tagsToSend.Count > 0) || tagsToSend.Count >= numOfTagsToReturn)
                {
                    readyToSend = false;
                    //WriteLog.WriteFollowLog("Start count:" + tagsToSend.Count.ToString());
                    List<TagValueObject> tags = tagsToSend.GetRange(0, tagsToSend.Count > numOfTagsToReturn ? numOfTagsToReturn : tagsToSend.Count);
                    tagsToSend.RemoveRange(0, tagsToSend.Count > numOfTagsToReturn ? numOfTagsToReturn : tagsToSend.Count);
                    //WriteLog.WriteFollowLog("End count:" + tagsToSend.Count.ToString() + " / Remaining: " + tcpMemory.TagMemorySlot.Count);
                    Thread thread = new Thread(() =>
                    {
                        ProcessTagRequest(tags);
                    });
                    thread.Name = "PrepareTags";
                    thread.Start();
                }
            }
        }
    }
}
2. We get a range of tags stored in memory through "GetFirstTag()" and start the ProcessTagRequest thread.
Function GetFirstTag()
public TagValueObject GetFirstTag()
{
    TagValueObject tag;
    if (valueUpdated.Wait(10))
    {
        //WriteLog.WriteFollowLog(String.Format("TagMemoryFinFout-GetFirstTag Semaphore(valueUpdated) : Entered. Available Slots = {0}", valueUpdated.CurrentCount));
        lock (thislock)
        {
            tag = tagMemorySlot.FirstOrDefault();
            tagMemorySlot.Remove(tag);
        }
        valueUpdated.Release();
        //WriteLog.WriteFollowLog(String.Format("TagMemoryFinFout-GetFirstTag Semaphore(valueUpdated) : Released. Available Slots = {0}", valueUpdated.CurrentCount));
    }
    else
    {
        tag = null;
    }
    return tag;
}
3. "ProcessTagRequest()" thread is started
Function ProcessTagRequest()
private void ProcessTagRequest(List<TagValueObject> tags)
{
    if (tags.Count == 0) return;
    try
    {
        sendData(tags, false);
    }
    catch (Exception ex)
    {
        WriteLog.WriteErrorLog("TCPRestAPIClient", ex);
    }
    //WriteLog.WriteErrorLog("Closed Thread");
}
4. We finally send the data to the API through the sendData() function.
Function sendData()
private void sendData(List<TagValueObject> valueArray, Boolean externalMemory)
{
    if (valueArray.Count > 0)
    {
        List<object> valueArrayAdjusted = new List<object>();
        foreach (var valueEntry in valueArray)
        {
            dynamic data = new ExpandoObject();
            data.value = valueEntry.TagValue;
            data.date = valueEntry.TagTimeStamp;
            data.quality = valueEntry.TagQuality;
            data._id = valueEntry.TagID;
            valueArrayAdjusted.Add(data);
        }
        var body = new
        {
            tags = valueArrayAdjusted
        };
        Boolean sendFail = true;
        var tries = 0;
        do
        {
            tries++;
            IRestResponse response = APICall(Method.POST, "/senddata", body, token);
            if (response.StatusCode == HttpStatusCode.OK)
            {
                sendFail = false;
                // WriteLog.WriteErrorLog("Senddata connected successfully : " + response.StatusCode + " " + response.Content);
            }
            else
            {
                WriteLog.WriteFollowLog("Senddata failed to connect : " + response.StatusCode + " " + response.Content);
            }
        } while (tries < 1 && sendFail);
        if (sendFail)
        {
            try
            {
                if (ConfigurationManager.AppSettings["UseExternalStorage"] == "true")
                {
                    foreach (TagValueObject value in valueArray)
                    {
                        StoreInExternalMemory(value);
                    }
                    WriteLog.WriteFollowLog(String.Format("Senddata : Stored {0} tags in external memory", valueArray.Count));
                }
            }
            catch (Exception e)
            {
                WriteLog.WriteErrorLog("ThreatRestApiClient : " + e.ToString());
            }
        }
    }
}
My question is: how can I reduce the CPU usage? What is wrong in this code or architecture?
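Not a definitive fix, but one thing that stands out is that PrepareTags() spins in a tight while loop (and GetFirstTag() polls the semaphore with a 10 ms wait) whenever there is nothing to send. A minimal sketch of a blocking hand-off, with a hypothetical TagPump class in place of tcpMemory, so the sender thread sleeps until tags arrive instead of polling:
// Hypothetical sketch only (TagPump is not one of the project's classes):
// a BlockingCollection replaces the poll-and-spin in PrepareTags()/GetFirstTag().
using System;
using System.Collections.Generic;
using System.Collections.Concurrent;
using System.Threading;

class TagPump
{
    private readonly BlockingCollection<object> _tags = new BlockingCollection<object>();
    private const int BatchSize = 50;

    // Called by the OPC side each time a tag value arrives
    // (instead of tagMemorySlot.Add + valueUpdated.Release).
    public void Enqueue(object tag)
    {
        _tags.Add(tag);
    }

    // Replaces the while (!stopWorking) spin: the enumerable blocks until an item
    // is available, so the thread consumes no CPU while the queue is empty.
    public void PumpLoop(CancellationToken ct, Action<List<object>> send)
    {
        var batch = new List<object>(BatchSize);
        try
        {
            foreach (var tag in _tags.GetConsumingEnumerable(ct))
            {
                batch.Add(tag);
                // Flush a full batch, or whatever we have once the queue runs dry.
                if (batch.Count >= BatchSize || _tags.Count == 0)
                {
                    send(new List<object>(batch));
                    batch.Clear();
                }
            }
        }
        catch (OperationCanceledException)
        {
            // Shutdown requested.
        }
    }
}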

Kafka consumer is not consuming message

I am new to Kafka. The Kafka consumer is not reading messages from the given topic.
I am checking with the Kafka console consumer as well; it is not working either. I don't understand the problem. It was working fine earlier.
public string MessageConsumer(string brokerList, List<string> topics, CancellationToken cancellationToken)
{
    //ConfigurationManager.AutoLoadAppSettings("", "", true);
    string logKey = string.Format("ARIConsumer.StartPRoducer ==>Topics {0} Key{1} =>", "", string.Join(",", topics));
    string message = string.Empty;
    var conf = new ConsumerConfig
    {
        BootstrapServers = "localhost:9092",
        GroupId = "23",
        EnableAutoCommit = false,
        AutoOffsetReset = AutoOffsetResetType.Latest,
    };
    using (var c = new Consumer<Ignore, string>(conf))
    {
        try
        {
            c.Subscribe(topics);
            bool consuming = true;
            // The client will automatically recover from non-fatal errors. You typically
            // don't need to take any action unless an error is marked as fatal.
            c.OnError += (_, e) => consuming = !e.IsFatal;
            while (consuming)
            {
                try
                {
                    TimeSpan timeSpan = new TimeSpan(0, 0, 5);
                    var cr = c.Consume(timeSpan);
                    // Thread.Sleep(5000);
                    if (cr != null)
                    {
                        message = cr.Value;
                        Console.WriteLine("Thread" + Thread.CurrentThread.ManagedThreadId + "Message : " + message);
                        CLogger.WriteLog(ELogLevel.INFO, $"Consumed message Partition '{cr.Partition}' at: '{cr.TopicPartitionOffset} thread: { Thread.CurrentThread.ManagedThreadId}'. Message: {message}");
                        //Console.WriteLine($"Consumed message Partition '{cr.Partition}' at: '{cr.TopicPartitionOffset}'. Topic: { cr.Topic} value :{cr.Value} Timestamp :{DateTime.UtcNow.ToString("yyyy-MM-dd HH:mm:ss.fff", CultureInfo.InvariantCulture)} GrpId: { conf.GroupId}");
                        c.Commit();
                    }
                    Console.WriteLine($"Calling the next Poll ");
                }
                catch (ConsumeException e)
                {
                    CLogger.WriteLog(ELogLevel.ERROR, $"Error occured: {e.Error.Reason}");
                    Console.WriteLine($"Error occured: {e.Error.Reason}");
                }
                //consuming = false;
            }
            // Ensure the consumer leaves the group cleanly and final offsets are committed.
            c.Close();
        }
        catch (Exception ex)
        {
        }
    }
    return message;
}
Is there an issue with this code, or is there an installation issue with Kafka?
Is there a Producer actively sending data?
Your consumer is starting from the latest offsets based on the AutoOffsetReset, so it wouldn't read existing data in the topic
The console consumer also defaults to the latest offset
And if you haven't changed the GroupId, then your consumer might have worked once: you consumed data, then committed the offsets for that group. When the consumer starts again in the same group, it will only resume from the end of the topic, or from the offset of the last commit.
You also have an empty catch (Exception ex), which might be hiding some other error
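A sketch of the config with those two points applied, starting from the earliest offset and using a group id with no committed offsets; the property and enum names follow the (pre-1.0) Confluent.Kafka client version used in the question, and the Guid-based group id is just an illustration:
var conf = new ConsumerConfig
{
    BootstrapServers = "localhost:9092",
    GroupId = Guid.NewGuid().ToString(),            // fresh group => no stored offsets to resume from
    EnableAutoCommit = false,
    AutoOffsetReset = AutoOffsetResetType.Earliest, // read existing messages from the start of the topic
};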
Try removing the TimeSpan from the Consume method.

C# false http response

I have a thread that returns a site's HTTP response status, but sometimes my program returns false results, and after a while it gives good results.
False result:
It takes a large amount of time to check, and then it says that (for example) Google is down, which is not reasonable; but after a few seconds it returns good results.
Can you take a look and tell me what's wrong, or how I can improve it?
Checks all sites in datagrid:
private void CheckSites()
{
    if (CheckSelected())
    {
        int rowCount = dataGrid.BindingContext[dataGrid.DataSource, dataGrid.DataMember].Count;
        string url;
        for (int i = 0; i < rowCount; i++)
        {
            url = dataGrid.Rows[i].Cells[2].Value.ToString();
            if (url != null)
            {
                Task<string[]> task = Task.Factory.StartNew<string[]>
                    (() => checkSite(url));
                // We can do other work here and it will execute in parallel:
                //Loading...
                // When we need the task's return value, we query its Result property:
                // If it's still executing, the current thread will now block (wait)
                // until the task finishes:
                string[] result = task.Result;
                selectRows();
                if (result[0] != System.Net.HttpStatusCode.OK.ToString() && result[0] != System.Net.HttpStatusCode.Found.ToString() && result[0] != System.Net.HttpStatusCode.MovedPermanently.ToString())
                {
                    //bad
                    notifyIcon1.ShowBalloonTip(5000, "Site Down", dataGrid.Rows[i].Cells[2].Value.ToString() + ", has a status code of:" + result, ToolTipIcon.Error);
                    dataGrid.Rows[i].DefaultCellStyle.BackColor = System.Drawing.Color.Wheat;
                    TimeSpan ts;
                    TimeSpan timeTaken = TimeSpan.Parse(result[1]);
                    dataGrid.Rows[i].Cells[3].Value = result[0];
                    dataGrid.Rows[i].Cells[3].Style.BackColor = System.Drawing.Color.Red;
                    dataGrid.Rows[i].Cells[4].Value = timeTaken.Seconds.ToString() + "." + String.Format("{0:0.00000}", timeTaken.Milliseconds.ToString()) + " seconds.";
                    string sec = (DateTime.Now.Second < 10) ? "0" + DateTime.Now.Second.ToString() : DateTime.Now.Second.ToString();
                    string min = (DateTime.Now.Minute < 10) ? "0" + DateTime.Now.Minute.ToString() : DateTime.Now.Minute.ToString();
                    string hour = (DateTime.Now.Hour < 10) ? "0" + DateTime.Now.Hour.ToString() : DateTime.Now.Hour.ToString();
                    dataGrid.Rows[i].Cells[5].Value = hour + ":" + min + ":" + sec;
                    //loadbar
                }
                else if (result[0] == "catch")//catch
                {
                    notifyIcon1.ShowBalloonTip(10000, "SITE DOWN", dataGrid.Rows[i].Cells[1].Value.ToString() + ", Error:" + result[1], ToolTipIcon.Error);
                    dataGrid.Rows[i].Cells[3].Value = result[1];
                    dataGrid.Rows[i].Cells[3].Style.BackColor = System.Drawing.Color.Red;
                    //loadbar
                }
                else
                {
                    //good
                    TimeSpan timeTaken = TimeSpan.Parse(result[1]);
                    dataGrid.Rows[i].Cells[3].Value = result[0];
                    dataGrid.Rows[i].Cells[3].Style.BackColor = System.Drawing.Color.LightGreen;
                    dataGrid.Rows[i].Cells[4].Value = timeTaken.Seconds.ToString() + "." + String.Format("{0:0.00000}", timeTaken.Milliseconds.ToString()) + " seconds.";
                    string sec = (DateTime.Now.Second < 10) ? "0" + DateTime.Now.Second.ToString() : DateTime.Now.Second.ToString();
                    string min = (DateTime.Now.Minute < 10) ? "0" + DateTime.Now.Minute.ToString() : DateTime.Now.Minute.ToString();
                    string hour = (DateTime.Now.Hour < 10) ? "0" + DateTime.Now.Hour.ToString() : DateTime.Now.Hour.ToString();
                    dataGrid.Rows[i].Cells[5].Value = hour + ":" + min + ":" + sec;
                    //loadbar
                }
                selectRows();
            }
        }
    }
}
Checks a site:
/////////////////////////////////
////Check datagrid websites-button - returns response
/////////////////////////////////
private string[] checkSite(string url)
{
    string[] response = new string[2];
    url = dataGrid.Rows[0].Cells[2].Value.ToString();
    if (url != null)
    {
        try
        {
            HttpWebRequest httpReq = (HttpWebRequest)WebRequest.Create(url);
            httpReq.Timeout = 10000;
            //loadbar
            dataGrid.Rows[0].DefaultCellStyle.BackColor = System.Drawing.Color.Wheat;
            System.Diagnostics.Stopwatch timer = new System.Diagnostics.Stopwatch();
            timer.Start();
            HttpWebResponse httpRes = (HttpWebResponse)httpReq.GetResponse(); //httpRes.Close();
            timer.Stop();
            //loadbar
            HttpStatusCode httpStatus = httpRes.StatusCode;
            response[0] = httpStatus.ToString();
            response[1] = timer.Elapsed.ToString();//*
            httpRes.Close();
            return response;
        }
        catch (Exception he)
        {
            response[0] = "catch";
            response[1] = he.Message;
            return response;
        }
    }
    response[0] = "catch";
    response[1] = "No URL entered";
    return response;
    //dataGrid.Rows[i].DefaultCellStyle.BackColor = System.Drawing.Color.Blue;
}
Thanks in advance.
Assuming the code provided is the actual code used:
First of all, your definition of 'false result' and 'good result' is wrong. If you expect A but get B, that doesn't mean B is invalid. If your wife is giving birth and you expect a boy but it turns out to be a girl, it's not a false result. Just unexpected.
That said, let's analyze your work. If it takes a long, long time to check a site only to finally get a ??? result which isn't a 200 response code, we can almost safely assume you are dealing with a timeout. If your router, Google, or any fundamental network device in between is having problems, it's expected to get an unexpected answer: "Timeout", "Bad Request", "Server not available", etc. Why would this happen? It's impossible to say for certain without having direct access to your environment.
Looking at your code, however, I see that you're using the default TaskScheduler to make each check run as a task in the background (assuming you haven't changed the default task scheduler, which would be a very bad practice to begin with). The default task scheduler schedules each task on the thread pool, which results in many, many tasks running simultaneously. Here we have a good candidate for overloading your network. Many sites (especially Google) are somewhat sensitive to many requests from the same source (especially if the frequency is high), so maybe Google is blocking you temporarily or holding you back. Again, at this point it's pure speculation, but the fact that you're running all checks simultaneously (unless the thread pool is at its max) is very likely the cause of your problem.
UPDATE
I would recommend working with a LimitedConcurrencyTaskScheduler (see here: http://blogs.msdn.com/b/pfxteam/archive/2010/04/09/9990424.aspx). Here you can limit the number of tasks that can run concurrently. You have to do some testing to find what number works best in your situation. Also make sure that the frequency is not 'too' high. It's hard to define what is too high; only testing can prove that.
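A sketch of how that scheduler is typically wired up, assuming the LimitedConcurrencyLevelTaskScheduler class from the linked ParallelExtensionsExtras sample is copied into the project:
// Allow at most 4 site checks to run at the same time (tune this number by testing).
var scheduler = new LimitedConcurrencyLevelTaskScheduler(4);
var factory = new TaskFactory(scheduler);

// Checks started through this factory queue up instead of all hitting the network at once.
Task<string[]> task = factory.StartNew(() => checkSite(url));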
In order to simulate your scenario, I have created a WinForms app with a data grid and a button. On load of the form, I programmatically create a list of URLs (in a table) and bind it to the data grid. On button click, we start the download process. In short, you have to write more defensive code; the following is only a skeleton of how you can fix the issue.
using System;
using System.Data;
using System.Net;
using System.Threading.Tasks;
using System.Windows.Forms;

namespace app
{
    public partial class Form1 : Form
    {
        DataTable urls = new DataTable();

        public Form1()
        {
            InitializeComponent();
        }

        //Fill your uri's and bind to a data grid.
        void InitTable()
        {
            //Silly logic to simulate your scenario.
            urls = new DataTable();
            urls.Columns.Add(new DataColumn("Srl", typeof(string)));
            urls.Columns.Add(new DataColumn("Urls", typeof(Uri)));
            urls.Columns.Add(new DataColumn("Result", typeof(string)));
            DataRow dr = urls.NewRow();
            dr["Srl"] = "1";
            dr["Urls"] = new Uri("http://www.microsoft.com");
            dr["Result"] = string.Empty;
            urls.Rows.Add(dr);
            dr = urls.NewRow();
            dr["Srl"] = "2";
            dr["Urls"] = new Uri("http://www.google.com");
            dr["Result"] = string.Empty;
            urls.Rows.Add(dr);
            dr = urls.NewRow();
            dr["Srl"] = "3";
            dr["Urls"] = new Uri("http://www.stackoverflow.com");
            dr["Result"] = string.Empty;
            urls.Rows.Add(dr);
            urls.AcceptChanges();
        }

        void UpdateResult()
        {
            dataGridView1.DataSource = urls;
        }

        //Important
        // This example will freeze UI. You can avoid this while implementing
        //background worker or pool with some event synchronization. I haven't covered those area since
        //we are addressing different issue. Let me know if you would like to address UI freeze
        //issue. Or can do it your self.
        private void button1_Click(object sender, EventArgs e)
        {
            //Create array for Task to parallelize multiple download.
            var tasks = new Task<string[]>[urls.Rows.Count];
            //Initialize those task based on number of Uri's
            for (int i = 0; i < urls.Rows.Count; i++)
            {
                int index = i;//Do not change this. This is to avoid data race
                //Assign responsibility and start task.
                tasks[index] = new Task<string[]>(
                    () => checkSite(
                        new TaskInput(urls.Rows[index]["Urls"].ToString(), urls.Rows[index]["Srl"].ToString())));
                tasks[index].Start();
            }
            //Wait for all task to complete. Check other overloaded if interested.
            Task.WaitAll(tasks);
            //block shows how to access result from task
            foreach (var item in tasks)
            {
                DataRow[] rows = urls.Select("Srl='" + item.Result[2] + "'");
                foreach (var row in rows)
                    row["Result"] = item.Result[0] + "|" + item.Result[1];
            }
            UpdateResult();
        }

        //This is dummy method which in your case 'Check Site'. You can have your own
        string[] checkSite(TaskInput input)
        {
            string[] response = new string[3];
            if (input != null)
            {
                try
                {
                    WebResponse wResponse = WebRequest.Create(input.Url).GetResponse();
                    response[0] = wResponse.ContentLength.ToString();
                    response[1] = wResponse.ContentType;
                    response[2] = input.Srl;
                    return response;
                }
                catch (Exception he)
                {
                    response[0] = "catch";
                    response[1] = he.Message;
                    response[2] = input.Srl;
                    return response;
                }
            }
            response[0] = "catch";
            response[1] = "No URL entered";
            response[2] = input.Srl;
            return response;
        }

        private void Form1_Load(object sender, EventArgs e)
        {
            InitTable();
            UpdateResult();
        }
    }

    //Supply custom object for simplicity
    public class TaskInput
    {
        public TaskInput() { }

        public TaskInput(string url, string srl)
        {
            Url = url;
            Srl = srl;
        }

        public string Srl { get; set; }
        public string Url { get; set; }
    }
}

.net remoting problem with chat server

I have a problem with my chat server implementation and I couldn't figure out why it doesn't work as intended.
The client could send messages to the server, but the server only sends the messages to itself instead of the client.
E.g. the client connects to the server, then types "hello" into the chat. The server successfully gets the message but then posts the message to its own console instead of sending it to the connected clients.
Well... maybe I have missed something as I'm very new to .Net remoting. Maybe someone could help me figure out what the problem is. Any help is appreciated!
The code:
I have a small interface (IService) for the chat implementation on the server:
public class ChatService : MarshalByRefObject, IService
{
    private Dictionary<string, IClient> m_ConnectedClients = new Dictionary<string, IClient>();
    private static ChatService _Chat;

    private ChatService()
    {
        Console.WriteLine("chat service created");
        _Chat = this;
    }

    public bool Login(IClient user)
    {
        Console.WriteLine("logging in: " + user.GetIp());
        if (!m_ConnectedClients.ContainsKey(user.GetIp()))
        {
            m_ConnectedClients.Add(user.GetIp(), user);
            PostMessage(user.GetIp(), user.GetUserName() + " has entered chat");
            return true;
        }
        return false;
    }

    public bool Logoff(string ip)
    {
        Console.WriteLine("logging off: " + ip);
        IClient user;
        if (m_ConnectedClients.TryGetValue(ip, out user))
        {
            PostMessage(ip, user + " has left chat");
            m_ConnectedClients.Remove(ip);
            return true;
        }
        return false;
    }

    public bool PostMessage(string ip, string text)
    {
        Console.WriteLine("posting message: " + text + " to: " + m_ConnectedClients.Values.Count);
        foreach (var chatter in m_ConnectedClients.Values)
        {
            Console.WriteLine(chatter.GetUserName() + " : " + chatter.GetIp());
            chatter.SendText(text);
        }
        return true;
    }
}
My server registers the chat service as a singleton:
RemotingConfiguration.RegisterWellKnownServiceType(typeof(ChatService), "chatservice", WellKnownObjectMode.Singleton);
My client is also straightforward:
[Serializable]
public class Chat_Client : IClient
{
    private string m_IpAdresse;
    private string m_UserName = "Jonny";
    private string m_Input;

    public Chat_Client(string ip, string username)
    {
        m_IpAdresse = ip;
        m_UserName = username;
    }

    public bool HandleInput(string input)
    {
        if (input.Equals("exit"))
        {
            Client.m_ChatService.Logoff(m_IpAdresse);
            return false;
        }
        m_Input = input;
        Thread sendThread = new Thread(new ThreadStart(SendPostMessage));
        sendThread.Start();
        //Console.WriteLine("post message");
        return true;
    }

    private void SendPostMessage()
    {
        Client.m_ChatService.PostMessage(m_IpAdresse, m_Input);
        Thread thisThread = Thread.CurrentThread;
        thisThread.Interrupt();
        thisThread.Abort();
    }

    public void SendText(string text)
    {
        Console.WriteLine("send text got: " + text);
        Console.WriteLine(text);
    }
The main client connects to the server via:
public void Connect()
{
    try
    {
        TcpChannel channel = new TcpChannel(0);
        ChannelServices.RegisterChannel(channel, false);
        m_ChatService = (IService)Activator.GetObject(typeof(IService), "tcp://" + hostname + ":9898/Host/chatservice");
        System.Net.IPHostEntry hostInfo = Dns.GetHostEntry(Dns.GetHostName());
        m_IpAdresse = hostInfo.AddressList[0].ToString();
        Chat_Client client = new Chat_Client(m_IpAdresse, m_UserName);
        Console.WriteLine("Response from Server: " + m_ChatService.Login(client));
        string input = "";
        while (m_Running)
        {
            input = Console.ReadLine();
            m_Running = client.HandleInput(input);
        }
    }
}
#John: No, I wasn't aware of that. Thanks for the info, I'll look into it.
#Felipe: hostname is the dns name of the server I want to connect to.
I found a workaround to make this work. I added an additional TcpListener to the client, to which the server connects when the client logs in. Over this second channel I transmit the chat messages back.
However, I couldn't understand why the old solution does not work :<
Thanks for the hints, guys.
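For reference, a minimal sketch of the kind of client-side back channel the workaround describes (hypothetical names; the server would connect to this listener after Login and write one chat line per message):
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Threading;

class ChatBackChannel
{
    // Listens on the given port; the server connects back and streams chat lines.
    public void Start(int port)
    {
        var listener = new TcpListener(IPAddress.Any, port);
        listener.Start();
        var receiveThread = new Thread(() =>
        {
            using (TcpClient server = listener.AcceptTcpClient())
            using (var reader = new StreamReader(server.GetStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                    Console.WriteLine(line);   // print messages pushed by the server
            }
        });
        receiveThread.IsBackground = true;
        receiveThread.Start();
    }
}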
The best thing to do with .NET remoting is to abandon it for WCF. If you read all the best practices for scalability, the way you end up using it is extremely compatible with the web services model, and web services are far easier to work with.
Remoting is technically fascinating and forms the basis of reflection, but falls apart once slow, unreliable connections are involved - and all connections are slow and unreliable compared to in-process messaging.
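To make the suggestion concrete, a minimal sketch of what the same chat contract might look like in WCF (hypothetical names; the duplex callback contract plays the role of the hand-rolled back channel above):
using System.ServiceModel;

[ServiceContract(CallbackContract = typeof(IChatCallback))]
public interface IChatService
{
    [OperationContract]
    bool Login(string userName);

    [OperationContract(IsOneWay = true)]
    void PostMessage(string userName, string text);
}

// The server calls back into each connected client through this contract,
// so messages reach the clients instead of only the server's own console.
public interface IChatCallback
{
    [OperationContract(IsOneWay = true)]
    void SendText(string text);
}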
