I want to host two websites (ASP.NET MVC). Each has a folder with the same name, and I want to copy data from one website to the other periodically, e.g. from website1/file/ to website2/file/.
That's why I thought of creating a Windows service to do this.
My question is: how can I copy data between these two folders via HTTP?
Personally, given the complexity of developing a solution for this, I would look at using an existing service like Dropbox.
Another alternative would be to store the files in a distributed file store such as Amazon S3 or Azure Blob Storage. This eliminates the need for the synchronization in the first place, and the store can be fronted by a proxy web service that streams files to the end user.
The reason I suggest this is that there is a lot of complexity involved in managing file synchronization over HTTP.
I don't think you will get a full solution on StackOverflow but I can make some recommendations.
I would use a master-slave system to coordinate the synchronization. This would require some design work and add to the complexity, but it would give you the ability to add more nodes in the future. Implementing a master-slave system can't easily be detailed in a single post and would require you to research it further, but there is a good resource on here already: How to elect a master node among the nodes running in a cluster?
Calculating deltas for each node, e.g. which files do I have that the master does not? Which files does the master have that I do not? Are there naming conflicts on other nodes? How do I determine the most up-to-date file?
Transferring the files. This will require some sort of endpoint to connect to, either as part of the service or as part of your existing website.
An HTTP client to send the files and track the progress/state of each transfer for error handling (see the sketch after this list).
Error handling overall, e.g. what happens if a file is only partially transferred to the master, and how do you clean up failed files?
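To give a flavour of the transfer piece, here is a minimal sketch, assuming the master exposes a hypothetical POST /api/files endpoint that accepts multipart uploads (the endpoint and class names are invented for illustration):
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public class FileTransferClient
{
    private static readonly HttpClient Client = new HttpClient();

    // Uploads one file to the master node; returns false on any failure
    // so the caller can retry or mark the file for cleanup.
    public static async Task<bool> UploadAsync(string masterUrl, string filePath)
    {
        try
        {
            using (var stream = File.OpenRead(filePath))
            using (var content = new MultipartFormDataContent())
            {
                content.Add(new StreamContent(stream), "file", Path.GetFileName(filePath));
                HttpResponseMessage response = await Client.PostAsync(masterUrl + "/api/files", content);
                return response.IsSuccessStatusCode;
            }
        }
        catch (HttpRequestException)
        {
            // A network failure mid-transfer may leave a partial file on the
            // master: exactly the cleanup problem described above.
            return false;
        }
    }
}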
That is probably just the tip of the iceberg in terms of the complexity of trying to do this. Hence my recommendation to use an existing product or cloud service.
I am developing a WinForms application which needs to retrieve data from a file containing sensitive information. The data retrieved is used to perform some complex calculations, but it includes things like salaries for certain pay bands of employees of a large company. The WinForms application will eventually need to be deployed to members of that company, but I need to make sure that I do not reveal the contents of this file to them.
The file itself is a JSON file, and is currently stored locally within the Visual Studio project file structure.
If I was to "Publish" this application through Visual Studio's Build menu, and release it through a web link, would people be able to open up this JSON file and view it? If so, is there some way this can be avoided? I have considered storing the file online and accessing it via HTTP request, however I don't really know much about that so could do with some advice.
Cheers,
Josh
If I was to "Publish" this application through Visual Studio's Build menu, and release it through a web link, would people be able to open up this JSON file and view it?
Yes.
If so, is there some way this can be avoided?
Only by not publishing the file.
You should look into storing this information in a database that can only be accessed through an authorised account via HTTPS. I'd recommend using WCF as it will integrate well with C# and WinForms. The best approach would be to perform the calculations on the server side (either in the WCF service itself or as stored procedures in the database). Thus you only need to gather the inputs on the client, pass these back to the server and then display the result.
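As a very rough sketch of the shape that WCF contract might take (all names here are invented, and the calculation is just a placeholder):
using System.ServiceModel;

[ServiceContract]
public interface ISalaryCalculationService
{
    // The client only ever sees inputs and results; the pay band data
    // never leaves the server.
    [OperationContract]
    decimal CalculatePay(string payBand, int hoursWorked);
}

public class SalaryCalculationService : ISalaryCalculationService
{
    public decimal CalculatePay(string payBand, int hoursWorked)
    {
        // Placeholder: in reality this would read the sensitive rates
        // from the database and apply the real business rules.
        decimal rate = LookupRateFromDatabase(payBand);
        return rate * hoursWorked;
    }

    private decimal LookupRateFromDatabase(string payBand)
    {
        // Hypothetical data access; shown only to indicate where the
        // sensitive lookup would happen.
        throw new System.NotImplementedException();
    }
}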
You can also do things like logging all attempts (successful or not) to access this data so you have a complete audit trail. You can also expose your WCF service to other clients if necessary.
I would look into creating a separate (WebAPI or WCF) service that has access to that file and knows how to serve up the public facing portions of it to your application.
So let's assume the file lives at \\hrserver\C$\sensitive.dat. Your service has access to that file, but the client applications do not. Your client applications access the service (https://hrserverhelper/GetHrData), which encapsulates the authentication/authorization to that file. It then parses out the sensitive data (perhaps from the JSON you already are set up to create for that file), and passes the non-sensitive data to your client application.
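A very rough Web API flavoured sketch of that idea (the field names "employeeCount" and "departments" are invented; only the file path comes from the example above):
using System.IO;
using System.Web.Http;
using Newtonsoft.Json.Linq;

public class HrDataController : ApiController
{
    // GET api/hrdata: returns only the public-facing fields.
    public IHttpActionResult Get()
    {
        // The service account running this site can read the file;
        // the client machines cannot.
        string json = File.ReadAllText(@"\\hrserver\C$\sensitive.dat");
        JObject all = JObject.Parse(json);

        // Copy across only the fields that are safe to expose.
        var publicView = new JObject();
        publicView["employeeCount"] = all["employeeCount"];
        publicView["departments"] = all["departments"];
        return Ok(publicView);
    }
}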
If it turns out that all the data in the file is sensitive, then have your service provide operations to perform the calculations that your WinForms app currently performs. For example, the WinForms app submits its inputs to a WebMethod that knows how to perform those calculations with the sensitive data, and the WebMethod returns the results.
However, in this scenario, be aware that someone with basic mathematical skills will likely be able to reverse engineer the "sensitive" data. If I submit 2 and get back 4, and I submit 3 and get back 6, I'll assume the "sensitive" number is 2.
In general, I need to know the number of visits to my website and to access that data via an API so I can have it everywhere.
To do this I am trying to share an EF database between 2 projects. One is a simple Azure ASP.NET website with one controller which collects statistics of site visits. The second project is an Azure Mobile Service that connects to the same database as the website and provides access to those statistics via GET requests.
Locally I am getting this error:
Cannot attach file '...App_Data\aspnet-TargetrWebsite-20151001100420.mdf' as database 'aspnet-TargetrWebsite-20151001100420' because this database name is already attached with file '...\tagetr_statisticService\App_Data
So the problem is that I have 2 web.config files with connection strings that point to 2 different files with the same database name.
How can I get this working with one file on localhost and keep it working in production as well?
Actually my goal is just to know the page visits from everywhere, and a separate service is not required for that. Adding a new authenticated controller which binds to the Visits table on the same website solves the problem, so the service was removed.
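For reference, a rough sketch of the kind of controller that solves it (ApplicationDbContext and the Visits entity set are assumed to already exist in the website project; the names are approximate):
using System.Linq;
using System.Web.Mvc;

[Authorize]
public class VisitsController : Controller
{
    // The EF context already used by the website to record visits.
    private readonly ApplicationDbContext db = new ApplicationDbContext();

    // GET /Visits/Count: returns the total visit count as JSON so it
    // can be consumed from anywhere.
    public ActionResult Count()
    {
        int total = db.Visits.Count();
        return Json(new { visits = total }, JsonRequestBehavior.AllowGet);
    }
}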
This could probably be done via a PowerShell script which sits on any machine.
Here's a good start, where you can get back a list of IP addresses stored in an XML file. You could then pull the XML into your API quite easily, I believe. It should also be fairly easy to convert an IP to a URL or location, etc.
https://www.petri.com/powershell-problem-solver - Thanks to Jeff
Remember to watch your permissions!
I don't know why this is becoming such a hard concept for me to grasp. I'm struggling with the following issue and any help would be greatly appreciated.
I have two ASP.NET MVC 4 applications running C#. They are two separate applications: one for the public-facing site and the other for our admin side. The reason we separated the two is that they have completely different designs and code bases, and this makes them easier to manage.
The two applications are connected to one SQL Server database instance.
We have file upload functionality on each site, and I'm trying to figure out a way to store the uploads in one common directory for both sites.
The issue is that when a file gets uploaded, we store the image location in the database:
/Uploads/filename.png
We do this using the following call:
Server.MapPath("~" + TempImage.ThumbnailLocation.Replace("TempUploads/", ""));
How can I save the files from both sites to the same directory on the server so I can keep all my image paths the same in the database?
The other issue is that I need to be able to call the following, from both applications, to delete an image:
if (System.IO.File.Exists(HttpContext.Current.Server.MapPath(Path)))
{
System.IO.File.Delete(HttpContext.Current.Server.MapPath(Path));
}
You can create a virtual directory in each of your applications. Both virtual directories will point to a single physical path, so you can upload and delete files in the same physical directory from both sites.
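A minimal sketch of the upload path under that setup (SharedUploads is an invented helper name, and D:\SharedUploads is a placeholder for wherever the virtual directory points):
using System.Web;

public static class SharedUploads
{
    // Saves an uploaded file into the shared virtual directory and
    // returns the relative path to store in the database.
    public static string Save(HttpPostedFileBase uploadedFile)
    {
        string relativePath = "/Uploads/" + System.IO.Path.GetFileName(uploadedFile.FileName);
        // "~/Uploads" resolves through the virtual directory to the same
        // physical folder (e.g. D:\SharedUploads) from either site, so the
        // existing MapPath-based code keeps working unchanged in both apps.
        string physicalPath = HttpContext.Current.Server.MapPath("~" + relativePath);
        uploadedFile.SaveAs(physicalPath);
        return relativePath;
    }
}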
I usually use blob storage, which is very cheap from either Amazon or Microsoft (and many other providers); a rough upload sketch follows the links below.
This approach is better because:
It reduces the risk of data loss in case of hardware failure on your single server machine
Your page loads faster since assets are loaded from a CDN
You can reuse the files from any application since they're all in the cloud
Here are a couple of tutorials to get you started on Azure:
http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
http://code.msdn.microsoft.com/windowsazure/How-To-Use-Azure-Blob-16882fe2
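As a rough idea of what an upload looks like with the older WindowsAzure.Storage client library (the container name and paths are placeholders, not taken from the tutorials):
using System.IO;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public static class BlobUploader
{
    public static string Upload(string connectionString, string localPath)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudBlobClient client = account.CreateCloudBlobClient();
        CloudBlobContainer container = client.GetContainerReference("uploads");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference(Path.GetFileName(localPath));
        using (FileStream stream = File.OpenRead(localPath))
        {
            blob.UploadFromStream(stream);
        }
        // Store this URL in the database instead of a local path, so any
        // application can resolve the file.
        return blob.Uri.ToString();
    }
}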
One way of doing this would be to use virtual directories: in IIS, both sites can be configured with an "/Uploads/" virtual directory, and both can be mapped to the same location on the hard drive.
I have a few MVC4 websites that share some of the same images/videos/PDFs etc. They are confidential: that is, only authorized users can access them.
At the moment I just have the content in a folder under one of the web apps, and then I create a symlink to that folder from within the other web apps so that they share that directory. I don't want to do this because it complicates testing and deployment; I would rather have some kind of CDN-type website to serve the content.
What's the best practice here?
I guess you've answered your own question. Try using a CDN instead of hosting the files inside your server(s) and sharing them across your other web apps. Some CDNs offer authentication and some don't.
One CDN you might want to consider:
Amazon S3 (it has token auth; see the sketch below)
Another is Softlayer (this also has auth)
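To give an idea of what S3's token auth looks like in practice, here is a rough sketch using the AWS SDK for .NET to hand an authorized user a time-limited download URL (the bucket name and key are made up):
using System;
using Amazon.S3;
using Amazon.S3.Model;

public static class SecureContentLinks
{
    public static string GetTemporaryUrl(IAmazonS3 s3Client, string key)
    {
        var request = new GetPreSignedUrlRequest
        {
            BucketName = "my-confidential-assets",   // hypothetical bucket
            Key = key,                               // e.g. "videos/intro.mp4"
            Expires = DateTime.UtcNow.AddMinutes(15) // link expires after 15 minutes
        };
        // Only users your app has already authorized ever receive this URL.
        return s3Client.GetPreSignedURL(request);
    }
}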
If you really intend to create a CDN-like website yourself, I would say do a cost-benefit analysis first. Is it worth building from scratch? Could you just get a CDN (with authentication, of course) and host it yourself? Or could you have it hosted externally (which might be more reliable as well)?
Just my 2 cents.
I am using SharePoint solely as a repository to store and retrieve large files (~100 MB). How can I authenticate a web application so that it can upload and download files to a document list on SharePoint 2007 without using Windows integrated authentication?
The web application will handle the authorization: it will figure out which users are allowed to access the repository via integrated Windows authentication and a bunch of business rules that depend on the application's state. When a user wants a file, they will use the web app. The web app will then download that file on the user's behalf using some sort of credentials. I would prefer these credentials to be somewhat permanent, so the password doesn't expire every so often. I was thinking of using basic authentication because the files I'm access-controlling aren't high-value (so its weak security is tolerable), and it seems to be the simplest option. What are my options?
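For illustration, the download step I have in mind would look something like this (the URL, account name, and password are placeholders, and this assumes basic auth is enabled on the SharePoint web application):
using System.Net;

public static class RepositoryClient
{
    public static void DownloadFile(string documentUrl, string localPath)
    {
        using (var client = new WebClient())
        {
            // A long-lived service account, as described above. Note that
            // basic auth sends these credentials in (nearly) plain text,
            // hence only acceptable for low-value files.
            client.Credentials = new NetworkCredential("svc_repo", "password-here");
            client.DownloadFile(documentUrl, localPath);
        }
    }
}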
I wouldn't recommend using SharePoint for this at all. Its value comes from the features it provides through its user interface. If you remove this then you are looking at an expensive and over-complicated data store.
SharePoint stores all data in a database, and storage for databases is more expensive than storage for files. It's also more costly to configure, administer, back up, load-balance, scale, etc.
Development time is more costly with SharePoint too. It's a big and complex product that's not trivial or quick to develop against. There needs to be a solid business case, and using SharePoint only for its back end isn't a good one.
Please consider this seriously before going down that route!
You are better off just enabling Windows auth on your web application and then setting permissions on the folders/files.
If you do just need to get at the files, however, go to www.codeplex.com and search for "sharepoint powershell". There is a script there to upload files, which I believe could be modified to download instead.
As mentioned above, using SharePoint purely as a repository pretty much nullifies its benefits. You might as well just use a database to store your content (that's what SharePoint is doing behind the scenes anyway).