Design Advice on Image Processing Architecture - C#

I'm working on a stock-standard ASP.NET MVC 3 web application (hosted on IIS 7). The site allows users to upload photos, among other things.
The upload process is as follows:
The user selects files from their PC with an upload widget (currently plupload).
An AJAX call is made to my server with the image in the HTTP POST (Request.Files).
The server resizes the photo N times.
Each resized photo is uploaded to Amazon S3.
At the moment, the above is implemented with a "fire and forget" technique using .NET 4.0's TPL.
I would like to make the above more flexible and robust. For example, if the image processing fails (it's using GDI, so that's likely), or S3 is down (which happens), neither I nor the user will know about it.
I'm thinking about hosting a WCF service as a Windows Service, which polls a folder for images.
My main website would simply FTP the image to the "watched" folder, then the service would take care of the image processing and the uploading.
The user doesn't need to be notified "immediately" that the photo is done. In other words, right now we show a "your image is being processed and will be available shortly" message.
To sum up, the service needs to:
Resize images
Upload images to S3
Read/write to database
Ability to "retry" failed images
Any advice? Is FileSystemWatcher a good option?

In my current project we implemented a similar middleware service responsible for data processing using FileSystemWatcher, with reasonable success. Some things to keep in mind:
Be sure to implement some sort of queueing for core processing. Starting 100 image conversion processes at the same time is not a good idea. Consider using a ThreadPool.
FileSystemWatcher will give notifications as soon as the file gets created, at which point it may still be locked for writing - you will have to perform periodic checks to determine the right moment to start processing, probably using a main loop and a queue (see the sketch below).
Keep track of fine-grained status changes (like file_created, file_processing, file_processed, file_uploading, etc.). You might really need them for debugging.
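To illustrate the queue and lock-check points, here is a rough sketch only (the folder path, the file filter, the retry interval and the ProcessImage body are placeholders I made up, not anything from your setup):

// Watch a folder, queue new files, and only process them once they can be opened exclusively.
using System.Collections.Concurrent;
using System.IO;
using System.Threading;

class ImageFolderWorker
{
    private readonly ConcurrentQueue<string> _pending = new ConcurrentQueue<string>();

    public void Start(string watchedFolder)
    {
        var watcher = new FileSystemWatcher(watchedFolder, "*.jpg");
        watcher.Created += (s, e) => _pending.Enqueue(e.FullPath);   // just enqueue, never process in the event handler
        watcher.EnableRaisingEvents = true;

        // Main loop: dequeue files and hand them to the ThreadPool once they are ready.
        while (true)
        {
            string path;
            if (_pending.TryDequeue(out path))
            {
                if (IsFileReady(path))
                    ThreadPool.QueueUserWorkItem(_ => ProcessImage(path));
                else
                    _pending.Enqueue(path);   // still being written - put it back and try again later
            }
            Thread.Sleep(500);
        }
    }

    private static bool IsFileReady(string path)
    {
        try
        {
            // If the file can be opened exclusively, the writer has finished with it.
            using (File.Open(path, FileMode.Open, FileAccess.Read, FileShare.None))
                return true;
        }
        catch (IOException)
        {
            return false;
        }
    }

    private static void ProcessImage(string path)
    {
        // Placeholder: resize, upload to S3, update the database, record the status change.
    }
}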
Hope this helps and good luck.

Long Process in ASP.NET Web Forms - Browser Timeout

I realise this question has been asked in different variations, but with newer features in .NET (e.g. async/await) I wonder which solution is best.
I have a C# .NET Web Forms app that has a long-running task: it handles a user request where they upload a CSV data file, parses it into objects, and imports them into a database. The task can take a few minutes and the browser regularly times out, which causes usability issues.
I have seen many solutions where the user uploads the data and the task is then carried out in the background. The page calls the server intermittently to request the status of the task, keeping the user informed of the progress.
I would like to know how this is achieved? The options I see on the table:
Windows Service
Web Service - how is this hosted: IIS or a Windows service?
Async/await - is this a possibility?
I think you could take two different approaches.
The first would be a pull approach: you keep the state of the process per user on the server, perhaps in session, and have the process update that state; the client then pulls the current state via AJAX at regular intervals. The AJAX call is made from the client's browser, and the handler can live in the same web page the client is viewing; there's no need to separate it if it's only going to be used from there.
The second could be a push approach, which is a bit more complex but gives you other possibilities. You would need a library like SignalR (https://www.asp.net/signalr), which allows you to communicate from the server to the client's browser, call JS functions, and push the updated state to the client's form. That gives you a more functional two-way communication channel and a better user experience, in exchange for a bit more complexity.
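For the pull approach, a rough Web Forms sketch might look like the following (the static progress dictionary, the page method name and the fake work loop are all placeholders of mine, and it glosses over app pool recycling, which a database-backed status record would survive):

// Code-behind sketch: keep per-user progress on the server and let the page poll it via a page method.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using System.Web;
using System.Web.Services;
using System.Web.UI;

public partial class ImportPage : Page
{
    // Per-user progress (0..100), keyed by session id; lives only as long as the app domain.
    private static readonly ConcurrentDictionary<string, int> Progress =
        new ConcurrentDictionary<string, int>();

    protected void StartImport(object sender, EventArgs e)
    {
        string key = Session.SessionID;
        Progress[key] = 0;

        Task.Run(() =>
        {
            // Placeholder for the real CSV parse/import loop; update progress as records are written.
            for (int i = 0; i <= 100; i += 10)
            {
                Progress[key] = i;
                Thread.Sleep(1000);
            }
        });
    }

    [WebMethod(EnableSession = true)]
    public static int GetImportStatus()
    {
        // Polled from the browser, e.g. via $.ajax to ImportPage.aspx/GetImportStatus every few seconds.
        int value;
        return Progress.TryGetValue(HttpContext.Current.Session.SessionID, out value) ? value : 0;
    }
}

The page then polls GetImportStatus from JavaScript every few seconds and updates a progress indicator.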

C# - Fastest way to copy image files between LAN computers

Please read my problem to the end; someone might otherwise think it is a duplicate.
I have a Windows application (client app) on one machine and a web application (server app) on another machine in the same network.
The client app captures the screen at 5 fps and saves the frames to a local folder which is shared. A Windows service running on the server machine moves the images from the client's shared folder to a server directory, using File.Move together with a FastDirectoryEnumerator class. The moved files are later used to create videos and also to show live streaming.
Questions:
Is there any other (better/faster) option to move these files in real time (i.e. transfer each file as soon as it is created on the client side)?
I am also interested in real-time file transfer without a shared folder.
Update:
My major concerns:
File transfer should be fast enough to allow live streaming through my server app (ASP.NET).
The client should retain files if the server/connection is not available and transfer them as soon as it comes back online.
I do not know why you have the server monitoring the client's share; surely this monitoring, or as you say the DirectoryEnumerator procedure, takes time.
Since the client knows when an image has been captured, why don't you send it to the server immediately from the client? That way you do not need to monitor clients from the server or find/enumerate folders; you simply transfer the data from client to server as soon as it is available, with a specific WCF/web service call that takes a stream of bytes as input.
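A sketch of what that call could look like with WCF streaming (the service name, the target folder and the file naming are made up, and a binding with transferMode="Streamed" is assumed in config):

// Service contract: the client pushes each captured frame to the server as a stream.
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IFrameUploadService
{
    [OperationContract]
    void UploadFrame(Stream frame);
}

public class FrameUploadService : IFrameUploadService
{
    public void UploadFrame(Stream frame)
    {
        // Write the incoming bytes straight into the server-side folder used for videos/streaming.
        string path = Path.Combine(@"C:\Frames", Path.GetRandomFileName() + ".jpg");
        using (var file = File.Create(path))
        {
            frame.CopyTo(file);
        }
    }
}

The client would call UploadFrame right after each capture, which also helps with your second concern: if the call fails, keep the frame locally and retry when the server is reachable again.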
If you have no control over the client app creating the image files you could have a FileSystemWatcher running on the client machine. Add an event handler which will be called whenever the Created event is raised due to a file being created on the file system.
See https://stackoverflow.com/a/15018082/1730317
Things to consider for getting the best performance out of this type of activity:
For "real time" notifications you may want to look at the FileSystemWatcher class; see the documentation for a complete example. A word of warning on this, though: when the operating system is busy it will drop some of these events and you will not be informed of the new file. It's a good idea to program a sweep-up loop to make doubly sure you have all the files (see the sketch below); I have been caught out by this in the past.
You may want to check your hardware configuration, like the network latency between the PC and the server, and also the disk speeds at both ends. Even a low-spec SSD is likely to give a big performance benefit over spinning disks in these kinds of operations.
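The sweep-up loop can be very simple; something like this (the filter and interval are arbitrary):

// Safety net alongside FileSystemWatcher: periodically re-scan the folder for files
// whose Created event may have been missed.
using System;
using System.IO;
using System.Threading;

static void SweepLoop(string folder, Action<string> enqueue)
{
    while (true)
    {
        foreach (var file in Directory.GetFiles(folder, "*.jpg"))
        {
            enqueue(file);   // the processing queue should ignore files it has already seen
        }
        Thread.Sleep(TimeSpan.FromSeconds(30));
    }
}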

Chatting and file sharing using C# in ASP.NET Web Forms

I have an ASP.NET Web Forms based application in C#. I need to add a module to the application which allows chatting between logged-in users, where users can share files during the chat, like Skype. Meanwhile, I have to keep a PERMANENT RECORD on my server of every word of the conversation and every file transferred during the session. I have a rough idea of how to implement the module to achieve the desired result, but I am sure it is not good practice. Here is my idea:
Chatting:
While users are chatting, create a DataTable which contains the sender ID, receiver ID, and message contents. Whenever a user presses the send button or hits Enter, a new row is inserted into the DataTable with both IDs and the message contents, and the DataTable is then bound to a div etc. to show the updated messages to both users. At the end, on some event (like window close), the DataTable is converted to XML and the XML file is stored permanently, either on disk or in the database.
File-transfer:
During chatting, whenever a user presses Enter/the send button we check the message contents; if the message being sent is a file (with some extension), we upload the file to the server and provide a download link to the receiver.
I hope you got my point.
Problem:
1) I want to share files asynchronously, i.e. transfer to the receiver and save on the server at the same time. Is that possible?
2) How to tell one user that the other user is typing?
Is there any better way to implement this module? What sort of knowledge should I have to properly comprehend and implement the module?
Thanks for any guidance.
For web-based real-time chat the current open source standard bearer seems to be SignalR.
There are quite a few discussions here on SO about that product and those should help move you in the right direction.
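As a very rough idea of what the SignalR piece might look like (this is a SignalR 2 hub sketch; SaveMessage and the client-side method names are placeholders of mine):

// Minimal SignalR hub: relays chat messages and typing notifications, and persists every message.
using Microsoft.AspNet.SignalR;

public class ChatHub : Hub
{
    public void SendMessage(string toUserId, string message)
    {
        // Persist first so there is a permanent record, then push to the receiver.
        SaveMessage(Context.User.Identity.Name, toUserId, message);
        Clients.User(toUserId).receiveMessage(Context.User.Identity.Name, message);
    }

    public void Typing(string toUserId)
    {
        // Lets the other user show an "... is typing" indicator.
        Clients.User(toUserId).userTyping(Context.User.Identity.Name);
    }

    private static void SaveMessage(string fromUserId, string toUserId, string message)
    {
        // Placeholder: insert a row (sender, receiver, text, timestamp) into your chat log table.
    }
}

Persisting inside SendMessage is what gives you the permanent record, and the Typing method covers your "other user is typing" question.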
As far as storage is concerned, that will depend upon the infrastructure you have available and the costs you are willing to incur to build the system.
You might look into using RabbitMQ for message delivery and if you set that up appropriately, you can attach queue listeners that will also perform logging of chats as needed. (There are well documented .NET/C# clients already available for RabbitMQ.) You may also want to check out the Wikipedia page for RabbitMQ.
File transfer would probably be best done through uploading of the file to the web-server and temporary storage there with a link to the file to be downloaded by the other chat client. That causes the server to increase its bandwidth requirements though.
You might also look into running your own XMPP server and using a web interface through SignalR to talk to it. That might give you the most functionality while easing time to market.
Have you looked into SignalR?

The connection was reset ASP.NET

I have some code that pulls data from a SQL database, then loops through the records to generate a string, which will eventually be written to a text file.
The code runs fine on my local machine, from VS, but on the live server, after about a minute and a half, I get a "No Data Received" error (Chrome). The code stops in the middle of looping through the DataTable. Hosting support said a "The connection was reset" error was thrown.
I'm not sure if this is a timeout issue or what. I've set the executionTimeout in my web.config (with debug = false) and it didn't seem to help. I also checked the Server.ScriptTimeout property, and it does match the executionTimeout value set in the web.config.
Additionally, a timeout would normally give a "Page not available" message.
Any suggestions are appreciated.
after about a minute and a half
There's your problem. This is a web application? A minute and a half is a very long time for a web application to respond to a request. Long enough that it's not really worth engaging in various trickery to make it kind of sort of work.
You'll want to offload this process so that it runs asynchronously from the web application itself. The nature of web applications is that they should receive a request and respond in a timely manner. What you have here is a long-running process which can't respond in a timely manner. The web application can facilitate interactions with the data, but shouldn't handle the processing itself directly in the request/response.
How does the web application interact with the process? Does it just start it, or does it provide information for the process to begin? I would recommend that the process itself be handled by something like a Windows Service or perhaps a Console Application. The more de-coupled from the web application, the better. Now, since I don't know anything about the process itself, I'm making a few assumptions about its behavior...
The web application can receive a request to start the process, along with any information needed for the process. It can store this in a database with a status value (pending, queued, etc.) and then respond to the user (in a timely manner) that the request has been received and the process has been queued. The web application can have a page which checks the status so that the user can see how the process is doing (if it's started, how many records it's gone through, etc.).
The offline application (Windows Service, et al) would just monitor that database for newly-queued data to be processed. When it sees it, it updates the status (running, processing, etc.) and provides any relevant feedback during the process (number of records processed, etc.) by updating that data. So the offline application and the web application are both interacting with the same data, but not in a manner which blocks the thread of the web application and prevents a response to the user.
When the process is finished, the status is again updated. The web application can show that it's finished and provide a link to download the results. The offline process could even perhaps send an email to the user when it's done, or maybe the web application can have some kind of notification system (I'm picturing the little notification icons in Facebook) which would alert the user to new activity.
This way the thread isn't blocked, the user can continue to interact with the application (if there's even anything with which to interact), etc. And you get other added benefits, too. For example, results of the process are thus saved in the database and automatically historically tracked.
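The offline worker's polling loop can be fairly small. A sketch, where the ExportJobs table and its columns are invented purely for illustration:

// Windows Service / console worker: poll a job table, claim queued jobs, and record status changes.
using System;
using System.Data.SqlClient;
using System.Threading;

static void PollForWork(string connectionString)
{
    while (true)
    {
        using (var conn = new SqlConnection(connectionString))
        {
            conn.Open();

            // Claim one queued job and mark it as running in a single statement.
            var claim = new SqlCommand(
                @"UPDATE TOP (1) ExportJobs
                  SET Status = 'Running'
                  OUTPUT inserted.Id
                  WHERE Status = 'Queued'", conn);

            object id = claim.ExecuteScalar();
            if (id != null)
            {
                ProcessJob(conn, (int)id);   // the long work: loop the records, updating a progress column as it goes

                var done = new SqlCommand("UPDATE ExportJobs SET Status = 'Finished' WHERE Id = @id", conn);
                done.Parameters.AddWithValue("@id", (int)id);
                done.ExecuteNonQuery();
            }
        }
        Thread.Sleep(TimeSpan.FromSeconds(10));
    }
}

static void ProcessJob(SqlConnection conn, int jobId)
{
    // Placeholder for building the text file and updating progress/status along the way.
}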
It sounds like it's the browser that's timing out waiting for a response, not the server. You can't control what the browser has set for this. What you can do is send a response of some kind to the browser, so that it knows you're still around and haven't crashed in some way.
For this to work, you can't wait until you finish building the entire string. You need to re-think your code so that instead of appending to a string, you are writing each addition to an output stream. This has the added advantage of being a much more efficient way to create your text file. For the purpose of keeping the browser alive, you can write out anything, as long as some data is coming back for the browser to read; HTML comments can work for this. You also need to periodically flush your response stream, so that your data isn't sitting buffered on your web server. Otherwise you might still time out.
Of course, the real solution here is to re-think your design, such that your operation doesn't take 90 seconds plus in the first place. But until you can do that, hopefully this is helpful.
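A sketch of the write-and-flush idea (the column joining and the flush interval are arbitrary choices of mine):

// Stream the output as it is produced and periodically flush so the browser keeps receiving data.
using System.Data;
using System.IO;
using System.Web;

static void WriteReport(DataTable dataTable, TextWriter outputWriter, HttpResponse response)
{
    response.BufferOutput = false;
    int rowCount = 0;

    foreach (DataRow row in dataTable.Rows)
    {
        // Placeholder for the existing per-record string building; write each line out
        // instead of appending it to one big string.
        outputWriter.WriteLine(string.Join(",", row.ItemArray));

        if (++rowCount % 500 == 0)
        {
            response.Write("<!-- still working -->");   // harmless keep-alive for the browser
            response.Flush();
        }
    }
}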
It does sound like a timeout. Could you try returning the information via a view? That would certainly speed things up (if possible).
When I had this error, I was able to resolve it by adding this to the Web.config file:
<system.web>
  <httpRuntime executionTimeout="600" maxRequestLength="51200" />
</system.web>

Upload a file to a remote server - how should I do it?

I am scratching my head about this. My scenario is that I need to upload a file to the company server machine (to a folder on C:) from our hosting server (a totally different server). I don't know how I should do this. Do any of you have tips or code for how this is done?
Thanks, guys.
I would set up an FTP server (like the one in IIS or a third-party server) on the company server. If security is an issue then you'll want to set up SFTP (secure FTP) rather than vanilla FTP, since FTP is not a natively secure transfer protocol. Then create a service on the hosting server to pick up the file(s) as they come in and ship them to the company server using .NET's FTP support (a sketch follows below). Honestly, it should be pretty straightforward.
Update: Reading your question, I am under the strong impression that you will NOT have a web site running on the company server. That is, you do not need a file upload control in your web app (or already know how to implement one given that the control is right in the web page toolbox). Your question, as I understand it, is how to get a file from the web server over to the company server.
Update 2: Added a note about security. Note that this is less of a concern if the servers are on the same subdomain and won't be routed outside of the company network and/or if the data is not sensitive. I didn't think of this at first because I am working a project like this now but our data is not, in any way, sensitive.
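The FTP piece can be as simple as this FtpWebRequest sketch (the host name, credentials and folder are placeholders):

// Upload a local file from the hosting server to an FTP server running on the company machine.
using System.IO;
using System.Net;

static void UploadViaFtp(string localPath)
{
    var request = (FtpWebRequest)WebRequest.Create(
        "ftp://company-server/incoming/" + Path.GetFileName(localPath));
    request.Method = WebRequestMethods.Ftp.UploadFile;
    request.Credentials = new NetworkCredential("ftpUser", "ftpPassword");   // placeholder credentials

    using (var source = File.OpenRead(localPath))
    using (var target = request.GetRequestStream())
    {
        source.CopyTo(target);
    }

    using (var response = (FtpWebResponse)request.GetResponse())
    {
        // response.StatusDescription can be logged to confirm the transfer.
    }
}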
Darren Johnstone's File Upload control is as good a solution as you will find anywhere. It has the ability to handle large files without impacting the ASP.NET server memory, and can display file upload progress without requiring a Flash or Silverlight dependency.
http://darrenjohnstone.net/2008/07/15/aspnet-file-upload-module-version-2-beta-1/
There isn't enough info to tell what your whole hosting scenario is, but I have a few suggestions that might get you started in the right direction:
Is your external server owned by another company or group so that you can't modify it? If not, you might consider hosting the process on the same machine, either in-process or as a separate service on that machine. If it cannot be modified, you might consider hosting the service on the destination machine; that way it's in the same place the files need to show up.
Do the files need to stay in sync with the process? I.e. do they need to be uploaded, moved and verified as a single operation? If not, then a separate process is probably the best way to go. The separate process will give you some flexibility, but remember it will be a separate process and a separate set of code to manage and work with.
How big are the file(s) being uploaded? Do they vary by upload? Are they plain files or binaries (zips, executables, etc.)? If the files are small you have more options than if they are large. If they are small enough, you can even relay them inline.
Depending on the answers to the above some of these might work for you:
Use MSMQ. This will work for simple messages under about 3 MB without too much hassle. It's ideal for messages that can be directly worked with (such as XML); see the sketch below the list.
Use direct HTTP(S) relaying. On the host machine, open an HTTP(S) connection to the destination machine and transfer the file. Again, this works better for smaller files (i.e. only a few KB, since it will be done inline).
If you have access to the host machine, deploy a separate process on the machine which builds or collects the files and uses any of the listed methods to send them to the destination machine.
You can use SCP or FTP (in any form: SFTP, etc.) on either the host machine (if you have access) or the target machine to host the incoming files, and use a batch process to move the files. This will have a lot of issues to address, such as file size, keeping submissions in sync, and timing. I would consider it a last resort, depending on the situation.
Again depending on message size, you could also use a layer of abstraction such as a DB to act as the intermediate layer between the two machines. This will work as long as the two machines can see the DB (or other storage location) and both act on it. SQL Server Service Broker could be used for this purpose (and most other DB products offer similar products).
You can look at other products like WSO2 ESB or NServiceBus to facilitate messaging between the two apps and do it inline.
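For the MSMQ option mentioned above, a minimal send/receive sketch (the queue names and folders are illustrative, and it only suits small files because of MSMQ's message size limit):

// MSMQ sketch: the hosting server sends the file bytes, a worker on the company server receives them.
using System.IO;
using System.Messaging;

static void SendFile(string localPath)
{
    const string queuePath = @"FormatName:DIRECT=OS:company-server\private$\incomingFiles";
    using (var queue = new MessageQueue(queuePath))
    using (var source = File.OpenRead(localPath))
    {
        var message = new Message
        {
            BodyStream = source,                      // small files only
            Label = Path.GetFileName(localPath)
        };
        queue.Send(message);
    }
}

static void ReceiveFile(string targetFolder)
{
    using (var queue = new MessageQueue(@".\private$\incomingFiles"))
    using (var message = queue.Receive())             // blocks until a message arrives
    using (var file = File.Create(Path.Combine(targetFolder, message.Label)))
    {
        message.BodyStream.CopyTo(file);
    }
}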
Hopefully that will give you some starting points to look into.
