C# - Fastest way to copy image files to a LAN computer

Please read my problem to the end; someone might otherwise mark it as a duplicate.
I have a Windows application (client app) on one machine and a web application (server app) on another machine in the same network.
The client app captures the screen at 5 fps and saves the frames to a local folder, which is shared. A Windows service running on the server machine moves the client images from the client's shared folder to a server directory, using File.Move together with the FastDirectoryEnumerator class. The moved files are later used to create videos and also to show live streaming.
Questions:
Is there a better/faster option to move these files in real time (i.e. transfer each file as soon as it is created on the client side)?
I am also interested in real-time file transfer without a shared folder.
Update:
My major concerns:
File transfer should be fast enough to allow live streaming through my server app (ASP.NET).
The client should retain the files if the server/connection is unavailable and transfer them as soon as it comes back online.

I do not know why you have the server monitoring the client's share; this monitoring, or as you say, the DirectoryEnumerator procedure, surely takes time.
Since the client knows when each image has been captured, why not send it to the server immediately from the client? That way you do not need to monitor clients from the server and you do not need to find/enumerate folders; you simply transfer the data from client to server as soon as it is available, with a specific WCF/web service call that takes a stream of bytes as input.
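For illustration, here is a minimal sketch of such a call as a WCF streamed operation. The contract name, operation, and endpoint address are hypothetical placeholders, not something from your setup:

    // Hypothetical WCF contract: a single streamed operation that accepts one frame.
    using System.IO;
    using System.ServiceModel;

    [ServiceContract]
    public interface IFrameUpload
    {
        [OperationContract]
        void UploadFrame(Stream frame); // streamed, so frames are not buffered in memory
    }

    // Client side: push each capture to the server as soon as it is written.
    public static class FrameSender
    {
        public static void Send(string path)
        {
            var binding = new BasicHttpBinding { TransferMode = TransferMode.Streamed };
            var factory = new ChannelFactory<IFrameUpload>(binding, "http://server:8000/frames");
            IFrameUpload channel = factory.CreateChannel();
            using (FileStream frame = File.OpenRead(path))
            {
                channel.UploadFrame(frame);
            }
            factory.Close();
        }
    }

This shape also covers your second concern: if the call fails, the client simply keeps the file locally and retries when the server is reachable again.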

If you have no control over the client app that creates the image files, you could have a FileSystemWatcher running on the client machine. Add an event handler that will be called whenever the Created event is raised by a file being created on the file system.
See https://stackoverflow.com/a/15018082/1730317
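A minimal sketch of that approach, assuming the captures land in C:\Captures as PNG files (both are hypothetical placeholders):

    using System;
    using System.IO;

    class CaptureWatcher
    {
        static void Main()
        {
            using (var watcher = new FileSystemWatcher(@"C:\Captures", "*.png"))
            {
                watcher.Created += (sender, e) =>
                {
                    // The capturing process may still hold the file open here;
                    // production code should retry the read/move until it succeeds.
                    Console.WriteLine("New capture: " + e.FullPath);
                };
                watcher.EnableRaisingEvents = true;
                Console.ReadLine(); // keep the watcher alive
            }
        }
    }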

Things to consider for getting the best performance out of this type of activity:
For "real time" notifications you may want to look at the FileSystemWatcher class; see the documentation for a complete example. A word of warning on this, though: when the operating system is busy it will drop some of these events, and you will not be informed of the new file. It's a good idea to program a sweep-up loop to make doubly sure you have all the files; I have been caught out by this in the past (see the sketch below).
You may want to check your hardware configuration, like the network latency between the PC and the server, and also the disk speeds on both machines. Even a low-spec SSD is likely to give a big performance benefit over spinning disks in these kinds of operations.

Related

How to dump an XML file generated on the server to a random client, into a specific folder on the client

I am thinking of dumping an XML file generated on the server to a random client that requests it. The problem is that I have to dump that file into a specific folder on the client (C:/Application/POSMachine/WaitingXML), because another application is listening to that folder.
My approach:
A simple patch (changing Chrome's download location to the specific desired path).
Installing a Windows service / local API on the clients that need this feature and passing the server-generated XML to that installed service/API. I am assuming I will get the client's IP address on the server and hit the service hosted on the client:
Request.GetOwinContext().Request.RemoteIpAddress
Any comments or better implementations are appreciated.
It would be a security risk to allow a website to download a file to an arbitrary location (that would mean a malicious website could just download a new svchost.exe to C:\Windows\System32 or whatnot).
Also, your idea of running a service would not work in all cases, since your clients will most probably be behind a modem/router/NAT device (or several). All of these devices would have to be configured for port forwarding, so you really need a 'client-outbound' connection (like a browser makes).
I would implement a client program that contacts your server, downloads the file with a System.Net.WebRequest, and saves it to the specific location. Another possibility might be to add a System.Windows.Forms.WebBrowser control and handle the FileDownload event. However, your question does not contain enough information to be more specific (how does the client choose a file to download?).
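A minimal sketch of that first suggestion; the URL and file name are hypothetical placeholders, and the target folder is the one from the question:

    using System.IO;
    using System.Net;

    class XmlFetcher
    {
        static void Main()
        {
            // A client-outbound request, so no port forwarding is needed.
            WebRequest request = WebRequest.Create("https://yourserver.example/pos/next-xml");
            using (WebResponse response = request.GetResponse())
            using (Stream body = response.GetResponseStream())
            using (FileStream target = File.Create(@"C:\Application\POSMachine\WaitingXML\order.xml"))
            {
                body.CopyTo(target);
            }
        }
    }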

How to properly execute a C# file over a local network?

Overview
C# File - Users PC
PHP Server - Hosts Webpages for application
Server and Users PC on local network
I have a C# file that reads the weight from a USB scale. How would I trigger this file to run so that it feeds into my program? The problem is that I am using PHP to host our webpage/application, so it's not running client-side, and the scale is not hooked up to the server but to the client's PC.
The C# script would have to be on the clients in order to read the scale, so how would I trigger this to happen?
Is this even possible, and if not, what would be a better way?
Important Edit
I was able to run the scale script (C#) when I wanted by having PHP and C# use TCP sockets.
The C# side listens for PHP to send something; when it does, it reads the scale and sends this information back to PHP, because PHP is listening for a response. Mixed in with a little Ajax, it updates in the web browser (a rough sketch of the listener is shown below).
I gave Chris credit because he was the most helpful in answering my questions.
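Here is what that listener side could look like; the port number and the ReadScale stand-in are hypothetical:

    using System;
    using System.Net;
    using System.Net.Sockets;
    using System.Text;

    class ScaleListener
    {
        static void Main()
        {
            var listener = new TcpListener(IPAddress.Any, 9000);
            listener.Start();
            while (true)
            {
                using (TcpClient client = listener.AcceptTcpClient())
                using (NetworkStream stream = client.GetStream())
                {
                    // Any incoming connection is treated as a "read the scale now" trigger.
                    string weight = ReadScale();
                    byte[] reply = Encoding.ASCII.GetBytes(weight);
                    stream.Write(reply, 0, reply.Length); // PHP reads this response
                }
            }
        }

        static string ReadScale()
        {
            return "12.40"; // placeholder for the real USB scale read
        }
    }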
It sounds like what you really want is for the client application to submit the data to the website itself, and the most suitable approach is probably to expose a web service from your server.
This service should accept weight data, along with some sort of customer key or whatever, to correlate the records correctly on the server side. I've never created a web service in PHP personally, so I can't give any advice on the implementation of that, but it is fairly trivial to hook a C# client app up to a web service once you've exposed its metadata (assuming you use SOAP).
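As a sketch of the client side of such a service, here is a plain HTTP POST rather than SOAP; the URL and field names are made up:

    using System.Collections.Generic;
    using System.Net.Http;
    using System.Threading.Tasks;

    class WeightClient
    {
        static async Task Main()
        {
            using (var http = new HttpClient())
            {
                var form = new FormUrlEncodedContent(new Dictionary<string, string>
                {
                    ["customerKey"] = "ABC123", // correlates the record server-side
                    ["weight"] = "12.40"
                });
                HttpResponseMessage response =
                    await http.PostAsync("https://example.com/weights.php", form);
                response.EnsureSuccessStatusCode();
            }
        }
    }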
You can't start a C# application from a web page in a way that will work in every browser every time. But there are some workarounds:
Use an ActiveX component that reads the data on the client and uploads it to the server. The biggest con is that it will only work in Internet Explorer.
Use a Silverlight client application that runs in elevated mode (v4) and uploads the data to your server.
Have your clients download an application (the C# application you wrote about) and run it; this application will upload the data to your server.
Hope this helps.
C# isn't a scripting language; it's a language that compiles into executable binaries or libraries. You won't be able to execute C# code on the client's computer via a website, because C# code needs to be compiled before it can run.
Presumably what you really want is for your compiled C# binary to be executed on the client's machine via your website. You won't be able to easily do that. There are a lot of security measures in place to prevent browsers from running programs on your computer. There may be ways to hack around these security measures by using plugins (such as ActiveX), but it's not something that will be a one-liner.
Edit: I think you need to step back and think about what you're trying to do in a broad sense. You're trying to create a website that can read information from a user's USB port. This is the type of thing that browsers are designed to prevent, and for good reason. I wouldn't want random websites to be able to access peripheral hardware without my explicit permission. If you want this website to function the way you're expecting, you're going to have to seriously think about the security implications. You'll need some kind of client-side code (ActiveX, Silverlight, ...), and the user will need to explicitly give permission for this all to happen. It won't be easy, and it won't be automatic. And I'm damn glad that's true.

How can I block access to a certain site until reboot, WITHOUT changing the hosts file?

Here's what we'd like to achieve via a C# application. Is there a way this can be done, or is it impossible?
Block access to a website (say www.stackoverflow.com) between 11am and 2pm.
Also block access to MSN Messenger between the same hours.
Stop the blocking if the machine is rebooted.
What's confusing us is exactly where the blocking belongs. The only possible solution we have at the moment is programmatically altering the hosts file to block and unblock certain sites. This solution falls down on a few points:
The user can manually change the hosts file back, which would be undesirable.
If the user had a browser open at 11am they would have to restart it to pick up the amended hosts file.
I'm a web developer so not entirely sure how this works, but I believe the c# app would have to be running as an administrator in order to edit the hosts file, which again would not be desirable.
To clarify, we're trying to produce something like macfreedom.com, so users aren't going through a proxy or network that we control (unless the solution is to make the user's machine point to a proxy server, but I doubt that's achievable or desirable). macfreedom.com appears to work by switching off the network adapter until reboot. We were hoping for a more subtle effect.
This would have to work like most client firewalls: a network filter driver dropping packets. I fear you will have to use a kernel-mode driver, so .NET seems to be out of the game.

Upload a file to multiple servers

I see a ton of questions about uploading multiple files, but none about uploading a single file to multiple servers, so here goes...
I have an ASP.NET app that will be running on two load balanced servers, and I would like to allow users to upload files and have them end up on both servers. What is the cleanest way to do this? I am using IIS 6 btw.
Some ideas that come to mind are:
1) Use a virtual directory that points to some shared location that both servers can access. Will there be any access issues if the application runs as Network Service? I'm assuming the application will need to run as a user account that exists on the machine hosting the shared location. How should the permissions be set up for this?
2) It would be nice if I could, via jQuery, post the request to both of my servers, referencing them by their port numbers. Even though the servers are on the same domain, this violates the same-origin policy, right?
Is there another solution I'm overlooking? How do other sites do this?
I think you want to consider this problem more carefully - having a pair (or more) of servers means that some of them will be offline some of the time (at least for occasional reboots).
Uploads when not all of the servers are online won't be able to be sent to all servers immediately, so you'd need either an intermediate server (which would be a point of failure unless it was highly available itself) or a queuing system to "remember" which files were where, and to transfer them when the relevant servers were restored.
Also, you'll want a backup system, and some way to add newly provisioned servers to your cluster. You will also want a way to verify that these files stay the same, in case they get out of sync. Your architecture needs a lot of careful thought. I don't have the answers :)
The cleanest approach is forwarding the files server-side, really. If you force two uploads via JavaScript, not only will you have to worry about working around XSS safeguards, but you'll also force the user to use their very limited upstream bandwidth twice for each file.
You shouldn't be exposing that kind of detail to the client anyway. The browser doesn't need to know where the file ends up, just who to send it to. If you keep that logic server-side, not only do you keep the details hidden (and thus less prone to errors and exploits), but you'll also get more control over the process. You can create a gateway service later that handles a multitude of back end storages and you can handle failing servers better. You can queue failed uploads and retry. All these come at a very low cost if you do them on the server side, but are a pain to be made to work reliably on the client side.
Keep back end logic to your back end. Load balancing should be hidden from the user, so there's no need to tell them where they are sending their files exactly. Make it optional, if you want, but hide the action from them. Just swallow the file on the gateway server (which can be either of the load balancing servers -- in fact, it should probably be load balanced, too, so it should work with either of them in place) and send it to the other servers from there. The transfer from server to server will probably be faster too.
Your best bet is definitely a NAS, if one is available -- a shared file system that is not specifically associated with any machine. Then you can focus on making the NAS highly available via a clustered frontend.
If that's not an option, you can use a virtual directory on each machine that points to one folder on one of the machines, but then you lose redundancy.
I'm faced with this same challenge at my work. My app is small but needs to be highly available, and there's no NAS in sight. So in each machine's web.config I place a list of all the UNC paths where the uploaded file should be stored. After uploading to a temp folder, I copy the file to each machine one by one (see the sketch below). It's not perfect; a machine could go down, in which case when it came back up it might not have all the files (and the copy would be slowed by the hunt for the missing machine), but in my situation uploads are so infrequent that it's not worth improving.
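A sketch of that replication step; the appSetting name and share paths are hypothetical:

    using System;
    using System.Configuration;
    using System.IO;

    static class UploadReplicator
    {
        // web.config: <add key="uploadTargets" value="\\web1\uploads;\\web2\uploads" />
        public static void Replicate(string tempPath)
        {
            string[] targets = ConfigurationManager.AppSettings["uploadTargets"].Split(';');
            foreach (string target in targets)
            {
                try
                {
                    File.Copy(tempPath, Path.Combine(target, Path.GetFileName(tempPath)), true);
                }
                catch (IOException ex)
                {
                    // A machine may be down; log and continue so the others still get the file.
                    Console.Error.WriteLine("Copy to " + target + " failed: " + ex.Message);
                }
            }
        }
    }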
As others have mentioned, Javascript is right out. Upload once.
I have seen this problem solved with a NAS, using credentials for the app pool that can read/write files to that NAS. Make sure your NAS is set up for high availability to prevent a single point of failure, i.e. hot swap with RAID, multiple array controllers, power supplies, etc.
You could also put folder-monitoring software on the servers to keep certain directories in sync. I don't recommend this solution.

Upload file to a remote server, how should I?

I am scratching my head over this. My scenario is that I need to upload a file to the company server machine (to a folder on C:) from our hosting one (a totally different server). I don't know how I should do this. Do any of you have tips or code for how this is done?
Thanks Guys
I would set up an FTP server (like the one in IIS or a third-party server) on the Company Server. If security is an issue then you'll want to set up SFTP (secure FTP) rather than vanilla FTP since FTP is not a natively secure transfer protocol. Then create a service on the Hosting Server to pick up the file(s) as they come in and ship them to the company server using C#/.NET's FTP control. Honestly, it should be pretty straightforward.
Update: Reading your question, I am under the strong impression that you will NOT have a web site running on the company server. That is, you do not need a file upload control in your web app (or already know how to implement one given that the control is right in the web page toolbox). Your question, as I understand it, is how to get a file from the web server over to the company server.
Update 2: Added a note about security. Note that this is less of a concern if the servers are on the same subdomain and traffic won't be routed outside of the company network, and/or if the data is not sensitive. I didn't think of this at first because I am working on a project like this now, but our data is not, in any way, sensitive.
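For illustration, a minimal sketch of the push from the hosting server using FtpWebRequest; the host, paths, and credentials are placeholders. Note that FtpWebRequest speaks FTP and FTPS but not SFTP; SFTP would need a third-party library.

    using System.IO;
    using System.Net;

    class FtpPush
    {
        static void Main()
        {
            var request = (FtpWebRequest)WebRequest.Create("ftp://company-server/inbox/report.xml");
            request.Method = WebRequestMethods.Ftp.UploadFile;
            request.Credentials = new NetworkCredential("uploader", "secret");

            using (FileStream source = File.OpenRead(@"C:\temp\report.xml"))
            using (Stream target = request.GetRequestStream())
            {
                source.CopyTo(target);
            }
            using (var response = (FtpWebResponse)request.GetResponse())
            {
                // response.StatusDescription holds the server reply, e.g. "226 Transfer complete."
            }
        }
    }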
Darren Johnstone's File Upload control is as good a solution as you will find anywhere. It has the ability to handle large files without impacting the ASP.NET server memory, and can display file upload progress without requiring a Flash or Silverlight dependency.
http://darrenjohnstone.net/2008/07/15/aspnet-file-upload-module-version-2-beta-1/
There isn't enough info to tell your whole hosting scenario, but I have a few suggestions that might get you started in the right direction:
Is your external server owned by another company or group, so that you can't modify it? If not, you might consider hosting the process on the same machine, either in-process or as a separate service on the machine. If it cannot be modified, you might consider hosting the service on the destination machine; that way it's in the same place the files need to show up.
Do the files need to stay in sync with the process? I.e., do they need to be uploaded, moved, and verified as a single operation? If not, then a separate process is probably the best way to go. The separate process will give you some flexibility, but remember it will be a separate process and a separate set of code to manage and work with.
How big are the files being uploaded? Do they vary by upload? Are they plain files or binaries (zips, executables, etc.)? If the files are small you have more options than if they are large. If they are small enough, you can even relay them inline.
Depending on the answers to the above some of these might work for you:
Use MSMQ. This will work for simple messages under about 3 MB without too much hassle. It's ideal for messages that can be worked with directly (such as XML).
Use direct HTTP(S) relaying. On the host machine, open an HTTP(S) connection to the destination machine and transfer the file. Again, this will work better for smaller files (i.e. only a few KB, since it will be done inline).
If you have access to the host machine, deploy a separate process on the machine which builds or collects the files and uses any of the listed methods to send them to the destination machine.
You can use SCP or FTP (in any form: SFTP, etc.) on either the host machine (if you have access) or the target machine to host the incoming files, and use a batch process to move the files. This has a lot of issues to address, such as file size, keeping submissions in sync, and timing. I would consider this a last resort, depending on the situation.
Again depending on message size, you could also use a layer of abstraction such as a DB to act as the intermediate layer between the two machines. This will work as long as both machines can see the DB (or other storage location) and both act on it. SQL Server Service Broker could be used for this purpose (and most other DB vendors offer similar products).
You can look at other products like WSO2 ESB or NServiceBus to facilitate messaging between the two apps and do it inline.
Hopefully that will give you some starting points to look into.
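As one concrete starting point, a rough sketch of the MSMQ option from the list above; the queue path and file are hypothetical, and it requires a reference to System.Messaging on .NET Framework:

    using System.IO;
    using System.Messaging;

    class QueueSender
    {
        static void Main()
        {
            const string queuePath = @"FormatName:DIRECT=OS:company-server\private$\incoming-files";
            using (var queue = new MessageQueue(queuePath))
            using (var message = new Message())
            {
                message.Label = "report.xml";
                message.BodyStream = new MemoryStream(File.ReadAllBytes(@"C:\temp\report.xml"));
                queue.Send(message);
            }
        }
    }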
