I need access to a couple of XML files on a remote server. Because of cross-domain policies, I cannot do it with jQuery alone; I need a small C# .aspx page which will act as a gateway for my jQuery, fetching the remote XML and serving it as if it were a local file. But how would I best go about this page - what is the simplest structure of such a file?
You could use a "Generic Handler" (ASHX file).
I use this kind of file often when sending files to the browser without the need for a UI (i.e. no visible page content).
In my understanding, ASHX handlers are "pages without a UI" that have low overhead compared to a normal Page.
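As a minimal sketch, assuming the handler is named XmlGateway.ashx and the remote feed lives at http://remote.example.com/data.xml (both invented for illustration), the whole gateway fits in one file:

```csharp
<%@ WebHandler Language="C#" Class="XmlGateway" %>

using System.Net;
using System.Web;

// Hypothetical handler; the remote URL is an assumption.
public class XmlGateway : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // The cross-domain request happens server-side, so the browser
        // only ever talks to your own domain.
        using (var client = new WebClient())
        {
            string xml = client.DownloadString("http://remote.example.com/data.xml");
            context.Response.ContentType = "text/xml";
            context.Response.Write(xml);
        }
    }

    public bool IsReusable
    {
        get { return false; }
    }
}
```

Your jQuery then requests XmlGateway.ashx with $.get exactly as if the XML were local.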
Why not a web service?
client code (jQuery) -> your server (WCF) -> external XML
You can quickly create a REST web service with this template.
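Whatever template you start from, the core of such a WCF gateway is small. A rough sketch using webHttpBinding, where the contract name, the UriTemplate and the remote URL are all invented:

```csharp
using System.IO;
using System.Net;
using System.ServiceModel;
using System.ServiceModel.Web;
using System.Text;

[ServiceContract]
public interface IXmlGateway
{
    [OperationContract]
    [WebGet(UriTemplate = "feed")]
    Stream GetFeed();
}

// Needs an endpoint with webHttpBinding and the webHttp behavior in config.
public class XmlGateway : IXmlGateway
{
    public Stream GetFeed()
    {
        // Fetch the external XML server-side and relay it to the browser.
        using (var client = new WebClient())
        {
            string xml = client.DownloadString("http://remote.example.com/data.xml");
            WebOperationContext.Current.OutgoingResponse.ContentType = "text/xml";
            return new MemoryStream(Encoding.UTF8.GetBytes(xml));
        }
    }
}
```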
I want to host two websites (ASP.NET MVC). They each have a folder with the same name, and I want to copy data from one website to the other periodically - for example, from website1/file/ to website2/file/.
That's why I thought of creating a Windows service to do it.
My question is: how can I copy data between these two folders via HTTP?
Personally, given the complexity of developing a solution for this, I would look to use some kind of service like Dropbox.
Another alternative would be to store the files in a distributed file store such as Amazon S3 or Azure Blob Storage. This eliminates the need for the synchronization in the first place, and it can be fronted by a proxy web service that streams the file to the end user.
The reason I suggest this is because there is a lot of complexity around managing the synchronization of files via HTTP.
I don't think you will get a full solution on StackOverflow but I can make some recommendations.
I would use a master-slave system to coordinate the synchronization. This requires some design work and adds complexity, but it gives you the ability to add more nodes in the future. Implementing a master-slave system can't be detailed in a single post and would require you to research it further; there is a good resource on here already: How to elect a master node among the nodes running in a cluster?
Calculating deltas for each node, e.g. which files do I have that the master does not? Which files does the master have that I do not? Are there naming conflicts on other nodes? How do you determine the most up-to-date file? (A rough sketch of this delta step follows this list.)
Transferring the files will require some sort of endpoint to connect to, either as part of the service or as part of your existing website.
An HTTP client to send the files and track the progress/state of each transfer for error handling.
Error handling overall: what happens if a file is only partially transferred to the master, and how do you clean up failed transfers?
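To illustrate just the delta step from the list above (every type and member name here is invented), the comparison between a node's file list and the master's could start as simply as:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical metadata each node would report for its files.
public class FileEntry
{
    public string Name { get; set; }
    public string Hash { get; set; }              // e.g. a content hash
    public DateTime LastWriteUtc { get; set; }    // used to pick the "most up-to-date" copy
}

public static class SyncDelta
{
    // Files the master has that this node is missing.
    public static IEnumerable<FileEntry> MissingLocally(
        IEnumerable<FileEntry> master, IEnumerable<FileEntry> local)
    {
        var localNames = new HashSet<string>(local.Select(f => f.Name));
        return master.Where(f => !localNames.Contains(f.Name));
    }

    // Files present on both sides whose contents differ; how the conflict is
    // resolved (last write wins, master wins, ...) is a separate design decision.
    public static IEnumerable<FileEntry> Conflicts(
        IEnumerable<FileEntry> master, IEnumerable<FileEntry> local)
    {
        var localByName = local.ToDictionary(f => f.Name);
        return master.Where(m => localByName.ContainsKey(m.Name)
                                 && localByName[m.Name].Hash != m.Hash);
    }
}
```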
And that is probably just the tip of the complexity of trying to do this yourself - hence my recommendation to use an existing product or cloud service.
Can you please help me out with the best solution for how to do this?
I have a Windows Forms application which is continuously doing something (taking screenshots and generating data). This app runs locally.
So I don't want to use TeamViewer from the office just to see what my application is doing. Instead, I want to create an AngularJS web dashboard application (hosted on a web host) and display this WinForms data in it (in the form of charts, ...), so that I can access it from everywhere.
What is the best solution for this?
I have experience with AngularJS and with parsing JSON files from a web server.
I would prefer to create/serialize a JSON file (each second) from my WinForms app, load it somewhere on the web host, and then access this JSON file with $http.get from my AngularJS application.
Is this a possible solution? Any suggestions?
Thanks!
I agree with making an Angular web app (although you don't need Angular for this - you can use jQuery to make a simple AJAX call, and it seems quicker to add the jQuery reference than to set up the Angular scaffolding just for one or two AJAX calls) to call a web request where the WinForms app stores the data.
For example, you can create a web request (or a directory save) in the WinForms app to perform the 'snipping tool' action described here: C# snipping tool service, and send the data somewhere to store it. If the snipping tool doesn't work, then you can probably use an export of the chart or data you are capturing. The web app can then query that location to retrieve (AJAX / HTTP GET) whichever data you need.
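A rough sketch of the WinForms side pushing a JSON snapshot to the web host; the snapshot shape, the endpoint URL and the choice of Json.NET for serialization are all assumptions:

```csharp
using System;
using System.Net;
using Newtonsoft.Json;   // Json.NET

// Invented snapshot type - put whatever your dashboard charts need in here.
public class DashboardSnapshot
{
    public DateTime TakenAtUtc { get; set; }
    public int ScreenshotsTaken { get; set; }
    public double SomeMetric { get; set; }
}

public static class SnapshotPublisher
{
    // Call this from a timer in the WinForms app every second or so.
    public static void Publish(DashboardSnapshot snapshot)
    {
        string json = JsonConvert.SerializeObject(snapshot);
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.ContentType] = "application/json";
            // Hypothetical endpoint; it could just as well write data.json into
            // a folder that the Angular app then reads with $http.get.
            client.UploadString("https://example.com/api/snapshot", json);
        }
    }
}
```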
I would advise cleaning up that directory, as it could become quite big if you are saving to it every few seconds or so.
(See what I did there?)
I am developing a WinForms application which needs to retrieve information from a file which contains sensitive information. The information retrieved is used to perform some complex calculations, but it includes things like salaries of certain pay bands for employees of a large company. The WinForms application will eventually need to be deployed to members of that company, but I need to make sure that I do not reveal the contents of this file to them.
The file itself is a JSON file, and is currently stored locally within the Visual Studio project file structure.
If I was to "Publish" this application through Visual Studio's Build menu, and release it through a web link, would people be able to open up this JSON file and view it? If so, is there some way this can be avoided? I have considered storing the file online and accessing it via HTTP request, however I don't really know much about that so could do with some advice.
Cheers,
Josh
If I was to "Publish" this application through Visual Studio's Build menu, and release it through a web link, would people be able to open up this JSON file and view it?
Yes.
If so, is there some way this can be avoided?
Only by not publishing the file.
You should look into storing this information in a database that can only be accessed through an authorised account over HTTPS. I'd recommend using WCF, as it integrates well with C# and WinForms. The best approach would be to perform the calculations on the server side (either in the WCF service itself or as stored procedures in the database). That way you only need to gather the inputs on the client, pass them to the server, and then display the result.
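A minimal sketch of what that server-side shape could look like as a WCF contract; every name and the calculation itself are invented, and the salary lookup stands in for your database access:

```csharp
using System.Runtime.Serialization;
using System.ServiceModel;

[ServiceContract]
public interface IPayrollCalculationService
{
    // The client sends only the inputs it is allowed to know about...
    [OperationContract]
    CalculationResult Calculate(CalculationInputs inputs);
}

[DataContract]
public class CalculationInputs
{
    [DataMember] public string PayBand { get; set; }
    [DataMember] public int YearsOfService { get; set; }
}

[DataContract]
public class CalculationResult
{
    [DataMember] public decimal Amount { get; set; }
}

public class PayrollCalculationService : IPayrollCalculationService
{
    public CalculationResult Calculate(CalculationInputs inputs)
    {
        // ...while the sensitive salary data is looked up here, on the server,
        // and never travels to the client.
        decimal bandSalary = LookUpBandSalary(inputs.PayBand);
        return new CalculationResult { Amount = bandSalary * inputs.YearsOfService / 100m };
    }

    private decimal LookUpBandSalary(string payBand)
    {
        // Placeholder for the database / protected-file access.
        return 0m;
    }
}
```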
You can also do things like logging all attempts (successful or not) to access this data so you have a complete audit trail. You can also expose your WCF service to other clients if necessary.
I would look into creating a separate (WebAPI or WCF) service that has access to that file and knows how to serve up the public facing portions of it to your application.
So let's assume the file lives at \\hrserver\C$\sensitive.dat. Your service has access to that file, but the client applications do not. Your client applications call the service (e.g. https://hrserverhelper/GetHrData), which encapsulates the authentication/authorization for that file. It then parses out the sensitive data (perhaps from the JSON you are already set up to create for that file) and passes only the non-sensitive data to your client application.
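As a rough illustration of that "parse out the sensitive data" step (the property names and what counts as public are assumptions, and Json.NET is just one way to do the parsing):

```csharp
using System.IO;
using Newtonsoft.Json.Linq;   // Json.NET

public static class HrDataFilter
{
    // Runs under the service account, which can read the protected file;
    // client applications cannot.
    public static string GetPublicPortion()
    {
        string raw = File.ReadAllText(@"\\hrserver\C$\sensitive.dat");
        var all = JObject.Parse(raw);

        // Copy across only the fields that are safe to expose.
        var visible = new JObject();
        visible["payBands"] = all["payBands"];        // e.g. band names, not amounts
        visible["lastUpdated"] = all["lastUpdated"];
        return visible.ToString();
    }
}
```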
If it turns out that all the data in the file is sensitive, then have your service provide operations to perform the calculations that your WinForms app currently performs. For example, your WinForms app submits its inputs to a web method that knows how to perform those calculations using the sensitive data, and the web method returns the results.
However, in this scenario, be aware that basic mathematical skills will likely be able to reverse engineer the "sensitive" data here. If I submit 2 and get back 4, and I submit 3 and get back 6, I'll assume the "sensitive" number is 2.
Can anybody please tell me when to use HTTP and when to use FTP for accessing files from a server in C#?
There are a few considerations when deciding between HTTP and FTP. I would say it depends on who will be consuming your files and for what reason.
I would go with HTTP when:
Hosting HTML pages.
Hosting JSON / XML data structures that are a representation of data from a database.
Hosting images.
Generally, when the user is a non-technical web user who just needs to read your info.
I would use FTP when:
The user base is limited (you are not allowing access through a web site).
You need to allow reading and writing of large files.
FTP is also a potential integration approach for passing information between two systems, maybe even between two companies.
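On the mechanical side in C#, both protocols can be driven from WebClient; a rough sketch with invented URLs, paths and credentials:

```csharp
using System.Net;

class FileTransferExamples
{
    static void Main()
    {
        // Reading a file over HTTP(S) - typically anonymous and read-only.
        using (var http = new WebClient())
        {
            http.DownloadFile("https://example.com/reports/summary.xml",
                              @"C:\temp\summary.xml");
        }

        // Reading and writing over FTP - usually credentialed, and better suited
        // to large files and system-to-system integration.
        using (var ftp = new WebClient())
        {
            ftp.Credentials = new NetworkCredential("user", "password");
            ftp.DownloadFile("ftp://example.com/exports/archive.zip",
                             @"C:\temp\archive.zip");
            ftp.UploadFile("ftp://example.com/incoming/archive.zip",
                           @"C:\temp\archive.zip");
        }
    }
}
```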
I've created a Hello World WCF service that uses the ASP.NET Development Server. I launch the client, which opens a page in my web browser, HelloWorldService.svc; this then triggers the command prompt to open and print out a Hello World message.
I don't fully understand this chain of action or how it's useful.
I'm trying to create a WCF service that acts as a server that allows users to connect to the server and perform a file upload - I will take this file and store it locally on my machine.
For now, allowing this to work locally on only my PC is fine but I really don't know where to begin to accomplish this. Any advice would be appreciated.
EDIT: I NEED to use WCF. So please don't suggest alternative solutions.
WCF is arguably overkill. In the simplest case, just use the standard .NET FileUpload control.
Here are some examples:
ASP.NET File Upload
http://asp.net-tutorials.com/controls/file-upload-control/
Which leads to the question "When should I use WCF?". Here are a couple of answers:
When & where I should use WCF
http://forums.asp.net/t/1480028.aspx
http://msdn.microsoft.com/en-us/library/cc512038.aspx
http://forums.asp.net/t/1478962.aspx
Finally, here are a couple of links that describe WCF-based file transfer services:
http://www.codeproject.com/Articles/166763/WCF-Streaming-Upload-Download-Files-Over-HTTP
http://code.msdn.microsoft.com/windowsdesktop/Upload-files-using-a-REST-13f16af2
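In the spirit of those last two links, a bare-bones streamed WCF upload contract might look like the sketch below. The names and target folder are invented, and the binding has to be configured with transferMode="Streamed" (and a raised maxReceivedMessageSize) before large uploads will work:

```csharp
using System.IO;
using System.ServiceModel;

[ServiceContract]
public interface IFileUploadService
{
    [OperationContract]
    void Upload(Stream fileData);
}

public class FileUploadService : IFileUploadService
{
    public void Upload(Stream fileData)
    {
        // Store the incoming stream locally; the target path is an assumption.
        string target = Path.Combine(@"C:\Uploads", Path.GetRandomFileName());
        using (var file = File.Create(target))
        {
            fileData.CopyTo(file);
        }
    }
}
```

If the original file name also needs to travel with the stream, the usual approach is a MessageContract that carries the name in a message header.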