I need a way to access files on a file share in a different domain from my own. For example, I have an application that runs on a server in Domain1, and this application needs to retrieve files from a server in Domain2.
Any ideas?
Does this help you in the right direction? I assume it's what you're asking:
Map a Network Drive From Code for Cross-Domain File Copy.
The CodeProject link given on the site also gives the source code for downloading.
I recently had a similar issue and executing
net use \\machine.otherdomain mypassword /USER:myusername
as part of the user's network logon was a solution for us.
This is obviously not perfect but for our environment it was sufficient.
Related
I have my files stored in Azure Files, and here are the requirements:
Users should be able to view the documents without downloading them to their local machine.
This works fine for PDFs but not for other MIME types.
I tried setting Content-Type and Content-Disposition (in the Azure file properties, but no luck) and also an iframe.
Users should be able to edit the document online without downloading it.
I don't think this is possible with Azure alone and I may have to integrate with OneDrive? Correct me if I am wrong.
I would really appreciate any input or thoughts.
Not sure if this is a viable option, but using Storage Accounts in Azure, you can map these accounts as network drives on any client machine. Users would then be able to access the files via File Explorer.
This link covers the basic steps in setting it up.
Unfortunately, anyone who wishes to use this feature needs to be on Windows 8 (or above) to map the network drive successfully, as it uses SMB 3.0.
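As a rough sketch of the mapping step (the storage account name, share name, drive letter, and key below are all placeholders):

```shell
net use Z: \\mystorageaccount.file.core.windows.net\myshare /u:AZURE\mystorageaccount <storage-account-key>
```

The access key comes from the storage account's settings in the Azure portal.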
If this option is a no go I will delete the post.
I have created a web service that returns data in JSON format, and I am going to read this data to create a Highcharts chart. In the web service I get values as strings, which I serialize into JSON and store in a folder on my system. The problem is that when I deploy the web service to a remote machine, storing the file will fail, because I used a local path, as in the code below:
System.IO.File.WriteAllText(@"C:\Json\Json.json", jSearializer.Serialize(modified_listofstrings));
Can anyone please suggest what I am supposed to do so that the file is stored somewhere that remains easy to access after the web service is deployed to the remote machine?
Is that possible, or will I have to create a simple ASP.NET application that consumes the web service and stores the file in a folder of that new application?
I am very new to this concept, so I don't know about storing in a virtual folder or the like; I was advised to do so. I would be grateful if someone could explain that concept as well.
You have two options:
Give your application identity write permission to the C:\Json folder, which is not a good idea.
Change your code to use a relative path:
File.WriteAllText(System.Web.HttpContext.Current.Server.MapPath("~/json/json.json"), "json-text");
Server.MapPath maps a virtual path to its equivalent absolute path in the OS. For example, if your website is hosted in c:\websites\json-project\, then Server.MapPath("/foo") translates to the c:\websites\json-project\foo path.
By default any ASP.NET application has full access to all of its folders.
You can get the physical path of a file in your application from its relative path with something like Server.MapPath("/Json/Json.json").
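To make the idea concrete, here is a tiny stand-alone sketch of what Server.MapPath does conceptually (the method and paths are illustrative, not the real ASP.NET implementation):

```csharp
using System;
using System.IO;

class MapPathSketch
{
    // Illustrative stand-in for Server.MapPath: joins the site's physical
    // root with an app-relative virtual path such as "~/json" or "/json".
    public static string MapPath(string physicalRoot, string virtualPath)
    {
        string relative = virtualPath.TrimStart('~', '/')
                                     .Replace('/', Path.DirectorySeparatorChar);
        return Path.Combine(physicalRoot, relative);
    }
}
```

With a site rooted at c:\websites\json-project, this would turn "/json/json.json" into c:\websites\json-project\json\json.json, which is the kind of absolute path the real Server.MapPath hands to File.WriteAllText.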
For more information, check this SO question:
Server.MapPath("."), Server.MapPath("~"), Server.MapPath(@"\"), Server.MapPath("/"). What is the difference?
I don't know why this is becoming such a hard concept for me to grasp. I'm struggling with the following issue and any help would be greatly appreciated.
I have two ASP.NET MVC 4 applications running C#. They are two separate applications, one for the public-facing site and the other for our admin side. We separated the two because they have completely different designs and code bases, and it is easier to manage them that way.
The two applications are connected to one SQL Server Database instance.
We have file upload functionality on each site, and I'm trying to figure out a way to store the uploads in one common directory for both sites.
The issue is that when a file gets uploaded we store the image location in the database.
/Uploads/filename.png
We do this using the following function.
Server.MapPath("~" + TempImage.ThumbnailLocation.Replace("TempUploads/", ""));
How can I save the files from both sites to the same directory on the server so I can keep all my image paths the same in the database?
The other issue is that I need to be able to call the following, from both applications, to delete an image:
if (System.IO.File.Exists(HttpContext.Current.Server.MapPath(Path)))
{
System.IO.File.Delete(HttpContext.Current.Server.MapPath(Path));
}
You can create a virtual directory in each of your applications. Both virtual directories point to a single physical path, so you can upload and delete files in the same physical directory from both sites.
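For example (assuming IIS 7+; the site names and the shared folder below are placeholders), the same physical folder can be mounted as /Uploads in both sites:

```shell
%windir%\system32\inetsrv\appcmd add vdir /app.name:"PublicSite/" /path:/Uploads /physicalPath:"D:\SharedUploads"
%windir%\system32\inetsrv\appcmd add vdir /app.name:"AdminSite/" /path:/Uploads /physicalPath:"D:\SharedUploads"
```

After this, a database path like /Uploads/filename.png resolves to the same file from either application, and Server.MapPath works unchanged in both.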
I usually use BLOB storage, which is very cheap either from Amazon or Microsoft (and many other providers)
This approach is better because:
It reduces the risk of data loss in case of hardware failure on your single server machine
Your page loads faster since assets are loaded from a CDN
You can reuse the files from any application since they're all in the cloud
Here's a couple of tutorials to get started on azure:
http://www.windowsazure.com/en-us/develop/net/how-to-guides/blob-storage/
http://code.msdn.microsoft.com/windowsazure/How-To-Use-Azure-Blob-16882fe2
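A minimal upload sketch using the Azure.Storage.Blobs SDK might look like the following (the connection string, container name, and blob name are placeholders, and this assumes the container's access level permits anonymous reads if the returned URL is used directly in pages):

```csharp
using Azure.Storage.Blobs;

class BlobUploadSketch
{
    // Uploads a local file to a blob container and returns its URL,
    // which can be stored in the database in place of a local path.
    public static string Upload(string connectionString, string localPath, string blobName)
    {
        var container = new BlobContainerClient(connectionString, "uploads");
        container.CreateIfNotExists();

        BlobClient blob = container.GetBlobClient(blobName);
        blob.Upload(localPath, overwrite: true);

        return blob.Uri.ToString();
    }
}
```

Deleting from either application is then a single DeleteIfExists call on the same blob name, with no shared file system involved.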
One way of doing this would be to use Virtual Directories - in IIS, both sites can be configured as having a "/Uploads/" virtual directory and they can both be mapped to the same location on the hard drive.
I just need to create an extremely basic web server that will basically allow me to go to http://1.2.3.4:8080 and browse a list of files in C:\web or something.
I found this http://mikehadlow.blogspot.com/2006/07/playing-with-httpsys.html which looks perfect but I ran into a couple of questions.
1) When I replace the IP with * or +, as the documentation says, I get access-denied errors in system.dll. When I use localhost or my local IP, it works fine. Why is this? I would also like to be able to bind to a specific IP address on machines that have more than one.
2) I am probably missing something, but how do you specify the core directory where the files are that it is serving with this code?
Re 1: because you don't have permission to register this URL. Use "netsh http add urlacl" (as admin) to grant your user permission to make the binding. Example: netsh http add urlacl url=http://+:8080/ user=DOMAIN\UserName
Re 2: You don't. That is pretty much your code's job. Http.sys does not read from the file system; it is a driver. Your application must read the files and answer the request.
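As a rough sketch of what that per-request work looks like (the root folder and prefix are placeholders; the path check is the important part, since nothing else stops a "../" in the URL from escaping your folder):

```csharp
using System;
using System.IO;
using System.Net;

// Illustrative minimal static file server on HTTP.SYS via HttpListener.
class TinyFileServer
{
    // Maps a request path onto the root folder; returns null if the
    // resolved path escapes the root (guards against "../" tricks).
    public static string TryMapPath(string root, string urlPath)
    {
        string fullRoot = Path.GetFullPath(root);
        string candidate = Path.GetFullPath(Path.Combine(fullRoot, urlPath.TrimStart('/')));
        return candidate.StartsWith(fullRoot, StringComparison.OrdinalIgnoreCase) ? candidate : null;
    }

    public static void Run(string root, string prefix)
    {
        var listener = new HttpListener();
        listener.Prefixes.Add(prefix);   // e.g. "http://+:8080/"
        listener.Start();
        while (true)
        {
            HttpListenerContext ctx = listener.GetContext();
            string path = TryMapPath(root, ctx.Request.Url.AbsolutePath);
            if (path != null && File.Exists(path))
            {
                byte[] body = File.ReadAllBytes(path);
                ctx.Response.ContentLength64 = body.Length;
                ctx.Response.OutputStream.Write(body, 0, body.Length);
            }
            else
            {
                ctx.Response.StatusCode = 404;
            }
            ctx.Response.Close();
        }
    }
}
```

A directory listing for the "browse a list of files" part would be one more branch that calls Directory.GetFiles and writes out an HTML list.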
This might be a little overkill for what you want, but check out the aspNETserve web server project.
It is open source, so at the very least you can browse the code to get some ideas.
I know this does not help with your code problems, but why re-invent the wheel? I think you should look at using IIS Express, as it could meet your needs nicely:
http://learn.iis.net/page.aspx/868/iis-express-overview/
IIS Express is a standalone executable that will provide all the functionality you need. It will also run on Windows XP and above.
Here's a simple and secure C# web server offering Digest authentication without the need for Active Directory. Digest auth is broken, but it is not practical to crack with passwords over 18 characters; in any case, it shows how to build a web server using C# and .NET HTTP.SYS, which was the point of this question.
https://git.motes.camp/web/index.php?p=DigestAuthWebServer.NET-HTTPSYS.git&a=summary
clone url: https://git.motes.camp/DigestAuthWebServer.NET-HTTPSYS.git
Ok, my web application is at C:\inetpub\wwwroot\website
The files I want to link to are in S:\someFolder
Can I make a link in the webapp that will direct to the file in someFolder?
If it's on a different drive on the server, you will need to make a virtual directory in IIS. You would then link to "/virtdirect/somefolder/".
You would have to specifically map it to some URL through your web server. Otherwise, all your files would be accessible to anyone who guessed their URL and you don't want that...
Do you have another virtual directory/application pointing to s:\someFolder? If so, it's just a simple link.
Are you trying to stream files back? If so, take a look at Response.TransmitFile and Response.WriteFile.
Otherwise, maybe you could create a handler (.ashx) to grab a specified file and stream its contents back?
I think there are only two ways:
1) Make a virtual path which points to the download directory.
2) Call your own aspx/ashx handler, which loads the file locally and sends it to the client.
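Option 2 might look like the following (classic ASP.NET; the query parameter, folder, and content type are placeholders, and a real handler should also check the caller is allowed to see the file):

```csharp
using System.IO;
using System.Web;

// Illustrative download.ashx handler: loads a file from a server-side
// folder outside the web root and streams it to the client.
public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Never trust the raw file name: strip any path components first,
        // so "?file=..\..\web.config" cannot escape the folder.
        string name = Path.GetFileName(context.Request.QueryString["file"]);
        string path = Path.Combine(@"S:\someFolder", name);

        if (File.Exists(path))
        {
            context.Response.ContentType = "application/octet-stream";
            context.Response.AddHeader("Content-Disposition", "attachment; filename=" + name);
            context.Response.TransmitFile(path);
        }
        else
        {
            context.Response.StatusCode = 404;
        }
    }

    public bool IsReusable { get { return true; } }
}
```

The link in the web app would then point at the handler, e.g. download.ashx?file=report.pdf, rather than at the S: drive directly.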
A solution which works at the OS level rather than the webserver level is to make a symbolic link.
Links to files are supported on Vista onwards, and links to folders ("junctions") are supported from Windows 2000 onwards.
That depends on the configuration of your web server, but probably not. You don't want the web server to be able to access arbitrary files on the hard drive (e.g. your passwords file), so only files configured as accessible in the web server's configuration are reachable and can be linked to. Usually these are all kept under one directory. You could, of course, copy someFolder under your web directory, and then it would be accessible; or, if you are sure it is safe, change the configuration of your web server to allow access to that folder.