Reading from a confidential file - C#

(See what I did there?)
I am developing a WinForms application which needs to retrieve information from a file which contains sensitive information. The information retrieved is used to perform some complex calculations, but it includes things like salaries of certain pay bands for employees of a large company. The WinForms application will eventually need to be deployed to members of that company, but I need to make sure that I do not reveal the contents of this file to them.
The file itself is a JSON file, and is currently stored locally within the Visual Studio project file structure.
If I was to "Publish" this application through Visual Studio's Build menu, and release it through a web link, would people be able to open up this JSON file and view it? If so, is there some way this can be avoided? I have considered storing the file online and accessing it via HTTP request, however I don't really know much about that so could do with some advice.
Cheers,
Josh

If I was to "Publish" this application through Visual Studio's Build menu, and release it through a web link, would people be able to open up this JSON file and view it?
Yes.
If so, is there some way this can be avoided?
Only by not publishing the file.
You should look into storing this information in a database that can only be accessed through an authorised account via HTTPS. I'd recommend using WCF as it will integrate well with C# and WinForms. The best approach would be to perform the calculations on the server side (either in the WCF service itself or as stored procedures in the database). Thus you only need to gather the inputs on the client, pass these back to the server and then display the result.
You can also do things like logging all attempts (successful or not) to access this data so you have a complete audit trail. You can also expose your WCF service to other clients if necessary.
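The "calculate on the server" approach can be sketched roughly as below. Everything here is hypothetical: the interface name, the formula, and the 30000 base figure are made up, and the WCF [ServiceContract]/[OperationContract] attributes are noted in comments so the sketch stays framework-free.

```csharp
// In a real WCF service this interface would carry [ServiceContract] and
// [OperationContract] attributes; they are omitted here to keep the sketch
// framework-free. All names and the formula are illustrative only.
public interface IPayrollCalculator
{
    decimal CalculateAdjustedSalary(string payBand, int yearsOfService);
}

public class PayrollCalculator : IPayrollCalculator
{
    public decimal CalculateAdjustedSalary(string payBand, int yearsOfService)
    {
        // The sensitive salary table is read server-side only; the client
        // never receives it - just the computed result.
        decimal baseSalary = LookUpBaseSalary(payBand);
        return baseSalary * (1 + 0.02m * yearsOfService);
    }

    // Stand-in for a server-side database or file lookup.
    private decimal LookUpBaseSalary(string payBand) => 30000m;
}
```

The key point is that only inputs (pay band, years of service) cross the wire; the salary table itself stays behind the service boundary.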

I would look into creating a separate (WebAPI or WCF) service that has access to that file and knows how to serve up the public facing portions of it to your application.
So let's assume the file lives at \\hrserver\C$\sensitive.dat. Your service has access to that file, but the client applications do not. Your client applications access the service (https://hrserverhelper/GetHrData), which encapsulates the authentication/authorization to that file. It then parses out the sensitive data (perhaps from the JSON you are already set up to produce for that file), and passes only the non-sensitive data to your client application.
If it turns out that all the data in the file is sensitive, then have your service provide operations to perform the calculations that your WinForms app performs currently. For example, your WinForms app submits the inputs it wishes to perform to a WebMethod that knows how to perform those calculations with the sensitive data - the WebMethod spits out the results.
However, in this scenario, be aware that someone with basic mathematical skills will likely be able to reverse-engineer the "sensitive" data. If I submit 2 and get back 4, and I submit 3 and get back 6, I'll assume the "sensitive" number is 2.
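As a rough sketch of the "serve only the public-facing portions" idea, the service-side handler might strip the sensitive fields before anything is returned to the client. The field names ("salaries", "payBands") are assumptions for illustration; substitute whatever your JSON actually contains.

```csharp
using System.Text.Json.Nodes;

public static class HrDataService
{
    // Hypothetical: parse the sensitive JSON server-side and return only
    // the fields the client application is allowed to see.
    public static string GetPublicHrData(string json)
    {
        var doc = JsonNode.Parse(json)!.AsObject();
        doc.Remove("salaries");   // sensitive: never leaves the server
        doc.Remove("payBands");   // sensitive: never leaves the server
        return doc.ToJsonString();
    }
}
```

A Web API controller action would simply read the file, pass its contents through a filter like this, and return the result.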

Related

Copy data between two web site directory

I want to host two websites (ASP.NET MVC). They each have a folder with the same name, and I want to copy data from one website to the other periodically, e.g. from website1/file/ to website2/file/.
That's why I thought of creating a Windows service to do it.
My question is: how can I copy data between these two folders via HTTP?
Personally, given the complexity of developing a solution for this, I would look at using an existing service such as Dropbox.
Another alternative would be to store the files in a distributed file store such as Amazon S3 or Azure Blob Storage. This eliminates the need for the synchronization in the first place, and it can be fronted by a proxy web service that streams the file to the end user.
The reason I suggest this is because there is a lot of complexity around managing the synchronization of files via HTTP.
I don't think you will get a full solution on StackOverflow but I can make some recommendations.
I would use a master-slave system to coordinate the synchronization. This requires some design work and adds complexity, but it would give you the ability to add more nodes in the future. Implementing a master-slave system can't be detailed in a single post and will require further research; there is a good resource on here already: How to elect a master node among the nodes running in a cluster?
Calculating deltas for each node, e.g. which files do I have that the master does not? Which files does the master have that I do not? Are there naming conflicts on other nodes? How do I determine the most up-to-date file?
Transferring the files. This will require some sort of endpoint to connect to, either as part of the service or as part of your existing website.
An HTTP client to send the files and track the progress/state of each transfer for error handling.
Overall error handling: what happens if a file is only partially transferred to the master, and how do you clean up failed transfers?
And that is probably just the tip of the iceberg in terms of complexity, hence my recommendation to use an existing product or cloud service.
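To give a feel for the transfer step alone, an upload over HTTP with HttpClient might look like the sketch below. The endpoint URL, route, and multipart field name are all assumptions; retry and cleanup logic is deliberately left out, which is exactly the complexity being warned about above.

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class FileSync
{
    // Hypothetical sketch: POST one file to a receiving endpoint on the other site.
    public static async Task UploadAsync(HttpClient client, string filePath, string endpoint)
    {
        using var content = new MultipartFormDataContent();
        await using var stream = File.OpenRead(filePath);
        content.Add(new StreamContent(stream), "file", Path.GetFileName(filePath));

        var response = await client.PostAsync(endpoint, content);
        response.EnsureSuccessStatusCode(); // surface failures so callers can retry/clean up
    }
}
```

Multiply this by delta calculation, conflict resolution, and failure cleanup, and the "use an existing product" advice starts to look attractive.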

Which is better? Having class file in the application or getting the data from web service?

I recently took over a project which is partially done. In that project the previous developer uses a web service for everything, I mean for getting each and every piece of data from the database.
Ex: I need some data which takes three parameters (district code, taluk code and village code).
What is happening is:
Creating an XML document using these three parameters
Encrypting this XML
Sending this XML to the web service
Decrypting this XML in the web service
Retrieving the data through a stored procedure with the above-mentioned parameters
Again generating an XML document for the retrieved data
Again encrypting this XML and returning it to the application
Decrypting the returned XML and generating a DataTable from it
I asked him why he had done all this, and he said it was for security purposes.
I feel this is very lengthy and time consuming, and what I have learned is that using stored procedures already helps secure the application.
My question is: why do I need to go through all these steps when I can have a class file and use stored procedures in my application?
Isn't the use of stored procedures enough to secure my application?
Or should I continue with his technique? (To be frank, I disagree with this method.)
Note: the parameters are not passed by the user; they are stored in the session once the user logs in.
Unless the client code resides entirely in the browser, your mate seems to have created an over-engineered solution. You would normally only create a full HTTP interface through Web services if you were developing a full HTML5 app, or if the same backend had to be accessed by multiple client technologies.
Security layers can be implemented either before a request hits a Web service resource or as a regular concern in a layered software system.
A special note can be made about encryption. If we're talking about Web services over HTTP, encrypting the request body is unnecessary, since it can be done at the transport level using standard SSL/HTTPS.
Finally, about the XML: I guess you're consuming SOAP services. Either your mate started his project a long time ago, or, if this is a Web site against a Web service and he implemented a SOAP/XML service in the last 5-6 years, he should reconsider and use a convention-over-configuration approach like REST...
This sort of separation is not uncommon in large scale secure web applications in dual firewall configurations.
If the web application simply called stored procedures directly, you would have to open up port 1433 (or whichever port your database listens on) between the web network zone and the database network zone. This creates some exposure, since the web servers sit in a DMZ while your database servers will tend to be in a more secure zone. Generally speaking, you want to keep as many ports sealed up as you can, to keep the DMZ contained in case of compromise.
It is a shame that your colleague wrote his own solution, since it sounds like most of that functionality would be covered by SQLXML.

Two ASP.NET websites that shares one database

In general, I need to know the number of visits to my website and to access that data via an API so I have it everywhere.
For this I am trying to share an EF database between two projects. One is a simple Azure ASP.NET website with one controller which collects site-visit statistics. The second project is an Azure mobile service that connects to the same database as the website and provides access to those statistics via GET requests.
Locally I am getting this error:
Cannot attach file '...App_Data\aspnet-TargetrWebsite-20151001100420.mdf' as database 'aspnet-TargetrWebsite-20151001100420' because this database name is already attached with file '...\tagetr_statisticService\App_Data
So the problem is that I have two web.config files with connection strings that point to two different files with the same database name.
How can I get this working with one file on localhost and keep it working in production as well?
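One common fix for this kind of error is to point both projects' web.config files at the same full file path, so the two projects attach one .mdf under a single database name instead of each attaching its own copy. The database name below comes from the error message in the question; the file path and LocalDB instance name are assumptions you would adjust:

```xml
<connectionStrings>
  <!-- Use the same connection string in BOTH web.config files so the two
       projects share one attached database. Path and instance name are
       illustrative. -->
  <add name="DefaultConnection"
       connectionString="Data Source=(LocalDb)\v11.0;Initial Catalog=aspnet-TargetrWebsite-20151001100420;Integrated Security=True;AttachDbFilename=C:\Shared\aspnet-TargetrWebsite-20151001100420.mdf"
       providerName="System.Data.SqlClient" />
</connectionStrings>
```

In production you would normally point both projects at a proper SQL database rather than an attached file, which avoids the problem entirely.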
Actually my goal is to know page visits from everywhere. A separate service is not required for this: simply adding a new authenticated controller which binds to the Visits table on the same website solves the problem, so the service was removed.
This could probably be done via a PowerShell script sitting on any machine.
Here's a good start, where you can get back a list of IP addresses stored in an XML file. You could then pull the XML into the API quite easily, I believe. It should also be reasonably easy to convert an IP to a URL or location, etc.
https://www.petri.com/powershell-problem-solver - thanks to Jeff
Remember to watch your permissions!

web based remote connections in c# advice

I am going to write a webapp hosted on a Windows Server 2003 machine to allow me to connect to local and remote servers and do some basic things.
The webapp will be hosted on serverA. It will need to be able to copy files/folders from one folder to another on this server.
It will need to be able to connect to ServerB and copy files in the same way, e.g. copy \\serverB\path\to\sourcefiles to \\serverB\path\to\destinationfiles.
ServerB hosts an installation of MS SQL Server 2008; I want to be able to create new databases, logins, etc.
How do I go about this, please? I've been reading a bit about Windows Authentication, Impersonation and Delegation, but I don't know what to focus on.
thanks
S
To be honest, there isn't really a one-size-fits-all answer to your question, but there are a number of things you need to take into consideration early in development to ensure that your platform is built on solid foundations.
From the description you have given, the most critical consideration has to be security, and everything you develop has to have this at its core. Judging by your post, if the wrong person were to access your front end they could wreak havoc.
As for the model to use, I would suggest Windows Authentication, as this is built into the framework and gives you the ability to segregate users into groups with differing levels of access. It will also open up some of the functionality you need, e.g. copying files across the network.
As for the database management aspect, this again can easily be done via Windows Authentication, as you can grant (in SQL) Windows users the ability to perform certain tasks, e.g. CREATE DATABASE, CREATE LOGIN, DROP, etc.
All this said, it of course assumes that the two servers share user credentials, i.e. are joined to the same domain.
Another method would be to use the web "interface" as a pass-through onto a WCF service that operates under a specific user account with the access you need. You would then separately manage authentication/authorisation in whatever manner you decide.
Like I said, there is no simple one-size-fits-all answer - but hopefully this gives you something to chew on.
If your goal is to create new databases or logins, why can't you use the create database and create login commands?
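To make that concrete, a provisioning service running under a suitably privileged Windows account could build and execute the T-SQL directly (e.g. via SqlCommand). The helper below only constructs the command text; all names are placeholders, and since DDL cannot be parameterised, identifiers must be validated strictly before being embedded:

```csharp
using System;
using System.Linq;

public static class DbAdmin
{
    // Hypothetical helper: build the T-SQL a provisioning service would
    // execute (e.g. via SqlCommand) under a Windows account that has been
    // granted CREATE DATABASE / CREATE LOGIN permissions.
    public static string BuildCreateCommands(string dbName, string domainLogin)
    {
        // DDL cannot use query parameters, so validate the identifiers strictly.
        if (dbName.Length == 0 || !dbName.All(c => char.IsLetterOrDigit(c) || c == '_'))
            throw new ArgumentException("Invalid database name", nameof(dbName));
        if (domainLogin.Contains(']'))
            throw new ArgumentException("Invalid login name", nameof(domainLogin));

        return $"CREATE DATABASE [{dbName}];\nCREATE LOGIN [{domainLogin}] FROM WINDOWS;";
    }
}
```

The strict validation matters because anything embedded in DDL text is effectively a SQL injection surface.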

Persisting User Specific Settings in ASP.NET Server-Side

I'm working on a .NET 3.5 Web Application and I was wondering what would be a good way to persist user-specific settings (i.e. user preferences) server-side?
Here are the conditions:
It needs to be able to store/retrieve settings based on a user ID
I don't want to use SQL Server or any such DB engine
I don't want to store it in cookies
Any guidance would be greatly appreciated.
Edit: If it makes any difference, it doesn't have to support a web farm.
Use the ASP.NET Profile feature for this. See ASP.NET Profile Properties Overview
If you want to persist data, and you don't want to use a database, then you need to save the data to disk (such as XML). If you're looking for something that isn't local to your server, you could use a SaaS solution that would host your data for you, such as Amazon's S3 service. If you do that, the latency of data retrieval will slow your application down, so you'll want to cache the data.
Text files (JSON/XML/etc), though security then becomes an associated problem.
Given those parameters, you could just keep track of your own user preferences XML file on the server.
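A minimal sketch of the per-user XML file idea (the directory layout, file naming, and preference fields are all illustrative, and modern C# syntax is used for brevity):

```csharp
using System.IO;
using System.Xml.Serialization;

public class UserPreferences
{
    public string Theme { get; set; } = "light";
    public int PageSize { get; set; } = 25;
}

public static class PreferenceStore
{
    // One XML file per user ID under a server-side folder. In real code,
    // validate userId before using it in a path (path-traversal risk).
    public static string PathFor(string root, string userId) =>
        Path.Combine(root, userId + ".prefs.xml");

    public static void Save(string root, string userId, UserPreferences prefs)
    {
        Directory.CreateDirectory(root);
        using var stream = File.Create(PathFor(root, userId));
        new XmlSerializer(typeof(UserPreferences)).Serialize(stream, prefs);
    }

    public static UserPreferences Load(string root, string userId)
    {
        using var stream = File.OpenRead(PathFor(root, userId));
        return (UserPreferences)new XmlSerializer(typeof(UserPreferences)).Deserialize(stream)!;
    }
}
```

On a single server (no web farm, per the edit above) this is perfectly workable; just make sure the folder sits outside the web root and has appropriately restricted permissions.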
