I'm working on a P2P application in C#.
It does file transfer with file splitting, and text chat.
On a client there are 2 threads: 1 for listening, 1 for sending.
When I send a file, it's first split into, let's say, 10 pieces. These 10 pieces are added to a send queue in the client, which then starts sending file chunk 1.
But now I want to send a message through the same pipe.
My idea is then to insert that message into the send list before file chunk 2.
What kind of threading do i need for 2 threads to work on the same list?
I have accounted for the objects being received this way.
My initial idea for the send function was something along these lines:
public void Send()
{
    while (IsConnected())
    {
        if (unSentObjects.Count > 1)
        {
            Task sendTask = new Task(() => SendObj(unSentObjects[0]));
            sendTask.Start();
        }
    }
}
You could use a synchronization object such as a mutex to prevent race conditions or simultaneous reads/writes of the same data. Basically, only 1 thread will be able to access the object at a time.
If the data is global to the threads and they are all in one process, you can use the synchronization object simply to signal when the shared data may be used and when it may not. Other than that, using the shared global data is exactly the same; you are just directing traffic around it.
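For example, a minimal sketch of the send loop guarded by a lock, reusing the names (unSentObjects, SendObj, IsConnected) from your snippet; the priority insert for chat messages is an assumption about your design:

private readonly object queueLock = new object();
private readonly List<object> unSentObjects = new List<object>();

// Called from any thread. Chat messages jump ahead of queued file chunks.
public void Enqueue(object item, bool highPriority)
{
    lock (queueLock)
    {
        if (highPriority)
            unSentObjects.Insert(0, item);
        else
            unSentObjects.Add(item);
    }
}

public void Send()
{
    while (IsConnected())
    {
        object next = null;
        lock (queueLock)
        {
            if (unSentObjects.Count > 0)
            {
                next = unSentObjects[0];
                unSentObjects.RemoveAt(0);
            }
        }
        if (next != null)
            SendObj(next);       // send synchronously so order is preserved
        else
            Thread.Sleep(10);    // avoid burning a core while the queue is empty
    }
}

A BlockingCollection<T> would remove the Sleep, but it has no way to insert at the front of the queue, so for the "chat jumps ahead of file chunks" requirement a plain List plus lock is the simplest fit.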
I have a BlockingCollection with one producer and one consumer in my application. The consumer writes the received data to a file. I need to make a real-time graph drawing from this data.
Is it possible to run a second consumer that would draw a graph from the same data in the collection, in parallel?
However, is it the case that when one consumer takes out the data, the other consumer can no longer access it? In the sense that each item can be taken out of the collection only once? (It seems to me that this is how this 'buffer' works.)
In that case, is it a good idea for the consumer that writes the data to a file to also add that data to a second BlockingCollection, i.e. collection2.Add(dataBuffer);, and then create a consumer2 that will draw the graph? I still just don't know how consumer2 would get to the graph, which would be part of the GUI thread.
// -----------------------------------------------------------------
// A method that fetches data from a queue and writes it to a file.
// It works in a dedicated Task.
// -----------------------------------------------------------------
static void Consumer()
{
    foreach (DataBuffer dataBuffer in collection.GetConsumingEnumerable())
    {
        _FileStream.Write(dataBuffer.Buffer, 0, dataBuffer.Length);
        _FileStream.Flush();

        // adding dataBuffer to the second consumer here?
    }
}
To be more precise: my question is how to organize such an application in an optimal way, because I have no experience (until a month ago, I had no idea BlockingCollection existed).
At the moment:
I have the Form window of the application,
I have an object in which a Task is started, which receives data from an external device,
after receiving the data, an Event is generated, with which I pass the received data,
I have another object in which a Handler for this Event runs, and in this Handler the data is taken from the argument and added to the BlockingCollection,
I have the consumer Task running, which saves the received data to a file.
Now I also need to draw an overview chart from the same data as it goes to the file, and I am wondering how to do it.
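Something like the following is what I imagine; a rough, untested sketch where the second collection, the chartForm reference, and the DrawPoint method are my own placeholders:

static BlockingCollection<DataBuffer> collection  = new BlockingCollection<DataBuffer>();
static BlockingCollection<DataBuffer> collection2 = new BlockingCollection<DataBuffer>();

// Consumer 1: writes to the file and forwards a copy to the graph consumer.
static void Consumer()
{
    foreach (DataBuffer dataBuffer in collection.GetConsumingEnumerable())
    {
        _FileStream.Write(dataBuffer.Buffer, 0, dataBuffer.Length);
        _FileStream.Flush();
        collection2.Add(dataBuffer);   // fan out to the second consumer
    }
    collection2.CompleteAdding();      // propagate shutdown to consumer 2
}

// Consumer 2: its own Task; drawing is marshaled onto the GUI thread.
static void GraphConsumer(Form chartForm)
{
    foreach (DataBuffer dataBuffer in collection2.GetConsumingEnumerable())
    {
        DataBuffer local = dataBuffer;
        chartForm.BeginInvoke((Action)(() => DrawPoint(local)));
    }
}

Is BeginInvoke from the consumer Task the right way to get the data onto the GUI thread?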
I have a personal C# project that keeps track of live sports data during an event. It does this by scraping a JSON file on the sport's website. The JSON file is continuously updated during the sports event.
However, the page does NOT refresh itself. The existing file is simply overwritten. To monitor the data in real time as desired, I have to send requests continuously for 2-4 hours, from the start of the event to the end.
My code is configured to loop endlessly until I hit the Esc key:
string url = "https://www.example.com/live/feeds/stats.json";
while (!(Console.KeyAvailable && Console.ReadKey(true).Key == ConsoleKey.Escape))
{
    try
    {
        using (var client = new WebClient())   // dispose the client each iteration
        {
            string json = client.DownloadString(url);
            // parse JSON
            ...
        }
    }
    catch (...)
    {
        ...
    }
}
My questions are:
If I do send such a high volume of requests for hours at a time, am I at risk of having my IP address blacklisted and access denied?
Is it possible to monitor this JSON file continuously without sending a million requests?
Can this be done using another language/framework? It doesn't need to be in C#.
You can just add a sleep to the loop if you want to limit the number of calls:
Thread.Sleep(TimeSpan.FromSeconds(30));
This will sleep for 30 seconds in between calls but you can set it to whatever frequency you like.
It isn't apparent from your code snippet, but if you are in an async method, you should use:
await Task.Delay(TimeSpan.FromSeconds(30));
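Putting that together with your loop, a rough sketch of an async polling version; HttpClient in place of WebClient is my substitution, and the 30-second interval is arbitrary:

using System;
using System.Net.Http;
using System.Threading.Tasks;

static async Task PollAsync(string url)
{
    using (var client = new HttpClient())
    {
        while (!(Console.KeyAvailable && Console.ReadKey(true).Key == ConsoleKey.Escape))
        {
            try
            {
                string json = await client.GetStringAsync(url);
                // parse JSON here
            }
            catch (HttpRequestException)
            {
                // log and keep polling
            }
            await Task.Delay(TimeSpan.FromSeconds(30));
        }
    }
}

At one request every 30 seconds, a 2-4 hour event costs only a few hundred requests in total rather than millions, which is far less likely to get your IP blocked.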
I originally had a race condition when sending data. The issue was that I was allowing multiple SocketAsyncEventArgs to be used to send data, and the first packet didn't finish sending before the second. This is because, when the data doesn't fit in the buffer, I loop until all the data is sent; the first packet was large and the second packet was tiny, so the second packet was sent and reached the client before the first.
I solved this by assigning 1 SocketAsyncEventArgs to an open connection to be used for sending data, using a Semaphore to limit access to it, and having the SocketAsyncEventArgs call back once it has completed.
This works fine because all data is sent and it calls back when complete, ready for the next send. The issue is that it causes blocking when I want to send data at arbitrary times to the open connection, and when there is a lot of data to send it's going to block my threads.
I am looking for a workaround. My idea is a Queue: when data is requested to be sent, the packet is simply added to the Queue, and then 1 SocketAsyncEventArgs loops to send that data.
But how can I do this efficiently whilst still being scalable? I want to avoid blocking as much as I can whilst sending my packets in the order they are requested to be sent in.
Appreciate any help!
If the data needs to be kept in order, and you don't want to block, then you need to add a queue. The way I do this is by tracking, on my state object, whether we already have an active send async-loop in progress for that connection. After an enqueue (which obviously must be synchronized), just check whether a loop is in progress:
public void PromptToSend(NetContext context)
{
    if (Interlocked.CompareExchange(ref writerCount, 1, 0) == 0)
    {   // then **we** are the writer
        context.Handler.StartSending(this);
    }
}
Here writerCount is the count of write-loops (which should be exactly 1 or 0) on the connection; if there aren't any, we start one.
My StartSending tries to read from that connection's queue; if it can do so, it does the usual SendAsync etc:
if (!connection.Socket.SendAsync(args)) SendCompleted(args);
(note that SendCompleted here is for the "sync" case; it would have got to SendCompleted via the event-model for the "async" case). SendCompleted repeats this "dequeue, try send async" step, obviously.
The only thing left is to make sure that when we try to dequeue, we note the lack of action if we find nothing more to do:
if (bufferedLength == 0)
{   // nothing to do; report this worker as inactive
    Interlocked.Exchange(ref writerCount, 0);
    return 0;
}
Make sense?
I have a multi-threaded application which wants to send a sequence of data to an external device via a serial port. The sequence of data is a typical cmd-response protocol (i.e., a given thread sends a sequence of bytes, then waits to read a response, which is typically an ack, and then it might send another sequence).
What we are looking to do is declare that a section of code has exclusive access to this resource until it is done, and if another thread wants access to the same external resource, it waits.
This seems like what lock does, but all the examples I have seen show lock being used to protect a specific block of code, not to serialize access to a resource.
So programmatically, can I have
Object serialPortLock = new Object();
and in different parts of my program use a construct that looks like:
lock (serialPortLock)
{
    // my turn to do something that is not the same as what
    // someone else wants to do, but it acts on the same resource
}
The C# documentation talks about using Mutex as a more robust version of lock. Is that what's required here?
Yes, your pattern is correct as long as your program is the only software accessing the serial port.
You have not posted your entire code. If the class that contains serialPortLock has multiple instances, then you MUST make serialPortLock static. This is normally best practice.
class MySerialPort
{
    static object synchLock = new object();

    public void DoSomething()
    {
        lock (synchLock)
        {
            // whatever
        }
    }
}
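To see why static matters, a small usage sketch of my own: even with two MySerialPort instances on two threads, the bodies of DoSomething never overlap, because both lock the same static object.

var portA = new MySerialPort();
var portB = new MySerialPort();

var t1 = new Thread(() => portA.DoSomething());  // both threads contend on
var t2 = new Thread(() => portB.DoSomething());  // the single static synchLock
t1.Start(); t2.Start();
t1.Join(); t2.Join();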
Locking should work fine in the case you've suggested as long as you are locking around all access to any of the object instances that point at the external resource.
My application connects to a device and sends multiple commands across a single socket connection. It then reads the responses to these. The basic structure is:
command 1
    stream.write
    stream.read
command 2
    stream.write
    stream.read
...
I am wondering if there is a better way of doing this. I am not worried about blocking, because this runs on a different thread than the rest of the program. The problem I am encountering is that sometimes the data for command 1 lands in the read for command 2. The other thing is that the 1st byte I receive is unique to the command.
Any help would be appreciated.
Assuming TCP - there is no way to ensure that each command is read as it was sent. At the destination end, each command can be fragmented or joined to other commands, so you need to manually decide where the boundaries are between them.
A common technique is to prefix the commands with their length, which you can read first, and know precisely how many bytes to read before the next one. At the destination end, you usually have some kind of queue which you push all received data onto, and you read off the queue one command at a time, only when there is one or more completely received commands.
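For example, a minimal sketch of length-prefixed framing, assuming a 4-byte length header (the framing format itself is my assumption; your unique first byte would then live inside the payload):

// Sender: prefix each command with its length.
static void SendFrame(NetworkStream stream, byte[] payload)
{
    byte[] header = BitConverter.GetBytes(payload.Length); // 4 bytes
    stream.Write(header, 0, 4);
    stream.Write(payload, 0, payload.Length);
}

// Receiver: read exactly one frame, regardless of TCP fragmentation.
static byte[] ReadFrame(NetworkStream stream)
{
    byte[] header = ReadExactly(stream, 4);
    int length = BitConverter.ToInt32(header, 0);
    return ReadExactly(stream, length);
}

// Loop until the requested number of bytes has arrived.
static byte[] ReadExactly(NetworkStream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0) throw new EndOfStreamException("Connection closed mid-frame");
        offset += read;
    }
    return buffer;
}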
I wouldn't recommend using blocking sockets under any circumstances really, even if you're using a separate thread. If you need to both send and receive on the same socket, you could encounter issues where you attempt to call Read when no data is waiting, and you will not be able to send any data until some is received.
Rather than using the blocking calls, use BeginRead/EndRead for asynchronous receiving; then you'll be able to send and receive on the same thread without those worries.
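A rough sketch of that pattern, assuming a connected NetworkStream field and a callback that re-arms itself (buffer handling is simplified):

private NetworkStream stream;          // assumed already connected
private readonly byte[] readBuffer = new byte[4096];

private void BeginReceive()
{
    stream.BeginRead(readBuffer, 0, readBuffer.Length, OnRead, null);
}

private void OnRead(IAsyncResult ar)
{
    int bytesRead = stream.EndRead(ar);
    if (bytesRead == 0) return;        // connection closed

    // Process readBuffer[0..bytesRead) here, then keep listening.
    BeginReceive();
}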
Because you are multithreading, use a lock around sending commands, like:
private readonly object _commandLocker = new object();

public void SendCommand(Command command)
{
    lock (_commandLocker)
    {
        // stream.Write(...) the command
        // stream.Read(...) the response
    }
}
So only one command at a time will send and receive data.
However, if you can receive data from the device at any time (maybe it sends notifications), then consider doing something like:
private Queue<Notification> _notificationsBuffer = new Queue<Notification>(); // or use BlockingCollection if you are on .NET 4.0
Then, at SendCommand:
...
while (/* stream read */)
{
    if (/* this is a notification */)
    {
        // add it to the notification buffer and continue reading
        _notificationsBuffer.Enqueue(notification);
        continue;
    }
    else // this is our command's response
    {
        // ... read the response ...
    }
}