How do I reference a file outside my web site's root directory?
For example my website is located at C:\dev\TestSite
I am using ASP.NET with XSP. The webapp will be deployed on Apache using mod_mono.
I have images in C:\images and I would like to do this:
<img src="C:\images\logo.gif"/>
Your img tag's src value is going to be sent to the client, so you need to specify the path relative to your document root. Your best bet is to set up a virtual folder (in IIS; an alias is the Apache equivalent) that points to the C:\images path, and then change the tag's src as follows:
<img src="/images/logo.gif" />
To do this in Apache, you need an alias in your httpd.conf. The line looks like this:
Alias /images "C:/images"
Here are the docs: http://httpd.apache.org/docs/2.0/mod/mod_alias.html#alias
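For completeness, a minimal sketch of the httpd.conf entries, assuming Apache 2.0/2.2 as in the linked docs (on Apache 2.4 the two access lines become Require all granted). The Directory block is needed because C:/images sits outside the DocumentRoot:
Alias /images "C:/images"
<Directory "C:/images">
    Order allow,deny
    Allow from all
</Directory>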
While it might be possible to do this through some hacky method, it's not a good idea. Allowing the IIS account to access files/folders on the greater file system would be a big potential security hole.
The best way to accomplish this is to use IIS Virtual Directories. Put your content in folders dedicated to supporting the site, DO NOT make your entire C: drive a virtual directory.
I'm going to assume that C:\images\logo.gif is the path on the server, and not the path on the client.
The src attribute is interpreted by the HTML client (e.g. Internet Explorer). The client can't see anything outside of your web directory; in fact, the client can only see things inside your web directory if you've given it permission to do so. So this isn't an ASP.NET issue, but an issue of how web clients get access to web servers - which is designed this way for security.
In order for your application to use these images, you've got a couple of options that immediately spring to mind - neither of which is ideal:
The ASP.NET code (in the code-behind) could grab the file itself and serve it out in the HTML stream being sent to the client, which is a more complex task than I suspect you are willing to embark on.
Or the ASP.NET code (using System.IO) could grab the file from its home location in C:\images\logo.gif and copy it to a location that is accessible to the client - you could create a temporary directory, copy your image to it, serve it out, then delete the file and the directory (there's a rough sketch of this after the next paragraph).
Both of these are certainly hacks that should be avoided if possible, but if you're adamant that this is what you want to do, this will allow you to do it via your ASP.NET app.
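A rough sketch of the second (copy-to-temp) option, assuming the C:\images folder from the question, a writable ~/Temp folder under the site root, and an <asp:Image ID="imgLogo" runat="server" /> control - all names here are illustrative:
protected void Page_Load(object sender, EventArgs e)
{
    string source = @"C:\images\logo.gif";
    string tempDir = Server.MapPath("~/Temp");
    System.IO.Directory.CreateDirectory(tempDir);      // no-op if it already exists
    string dest = System.IO.Path.Combine(tempDir, "logo.gif");
    System.IO.File.Copy(source, dest, true);           // overwrite any stale copy
    imgLogo.ImageUrl = "~/Temp/logo.gif";              // now a client-visible URL
}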
The most ideal solution is to add C:\images as a virtual directory under your document root, e.g. /ImageCentral - this way images that are central to multiple websites can be stored in this one directory, and each website can reference them simply by adding a virtual directory pointing at the central images folder. As DaveSwersky points out, don't make any directory containing sensitive information a virtual directory: the minute you add a virtual directory to an externally visible website, you're giving people free rein over any of the information in it.
Good luck
You can't do this from within the HTML code of your page. The HTML page can only reference web-accessible content (images, CSS, JavaScript, etc.). You could create a virtual directory that points to your images folder so that it becomes web accessible.
EDIT:
The Apache way, inside your conf file:
Alias /images "C:/Images"
And a little walkthrough from some dude.
Related: I'm looking to do exactly this:
set src property in view to a url outside of the MVC3 project
That's fine for MVC, but what about Web Forms?
I tried simply putting the path as a string into the src of the image:
<asp:Image ID="imgInside" runat="server" src="\\serverName.com\dfs$\APPL-ADM\FichiersDev\MandatsInfo\SAR220-2020_1.jpg" >
Obviously that doesn't work, so I pointed src at this function I wrote, like so:
<asp:Image runat="server" Width="160px" src='<%# getImage(Container.DataItem as MandatMobile.DAL.MandatsEcoleCC_Result) %>' ></asp:Image>
In the C# code-behind:
protected Byte[] getImage(MandatsEcoleCC_Result p)
{
    using (MandatsDatas db = new MandatsDatas())
    {
        GROUPE_ARTICLE g = db.GROUPE_ARTICLE.First(t => t.ID_GROUPE == p.ID_GROUPE);
        if (string.IsNullOrWhiteSpace(g.image))
            return null;
        FileStream fs = null;
        try
        {
            fs = new FileStream(@"\\serverName.com\dfs$\APPL-ADM\FichiersDev\MandatsInfo\" + g.MANDAT.NO_MANDAT + g.image, FileMode.Open, FileAccess.Read);
        }
        catch
        {
        }
        BinaryReader br = new BinaryReader(fs);
        return br.ReadBytes((int)fs.Length);
    }
}
Still not working. I've been searching but I just can't figure it out, and I'm stuck trying all sorts of nonsense.
Well, you're confusing two things:
Code behind:
Any time you run code that uses a file, you are writing 100% server-side code. As such, any file path is a proper, fully qualified Windows path name. It has nothing whatsoever to do with web URLs.
Read the above a dozen times. Your code-behind does not use URL path names - end of story.
Web site:
Any time you reference a file, picture, script or anything else in markup, you must use a URL based on the path names of the web site - specifically, path names that resolve to the folders that make up the site:
root
  \Pictures (say, a folder in the web site's folder list that holds pictures)
So a src, or any URL used in the web site, does not use Windows path names the way code-behind does.
So, if there is a cat.png picture in the Pictures folder, your URL will be this:
www.mywebsite.com/Pictures/cat.png
If you write code to read/load/use that cat.png picture, then in code you convert from that site-relative URL to a fully qualified, standard Windows path name (with backslashes).
So, in code-behind, if you want to read or do something with the above file, you use:
Dim strFile As String
strFile = Server.MapPath("~/Pictures/cat.png")
MapPath returns a fully qualified Windows path on the server, e.g.:
c:\inetpub\wwwroot\mysite\Pictures\cat.png
OK, so now we realize that to use a valid link to pictures on the web site, we must use a valid URL.
So, what happens if we have, say, a huge network-attached SAN drive or some other big file server on the network with massive storage that holds our pictures?
Say:
\\SANSERVER\WebPictures\cat.png
Well, obviously that file path can't be used in a URL. Only URLs under the web site's folder structure can be used, and this is a good thing. When I go to www.amazon.com it is a very good thing that I can't type in a URL that gets at their internal accounting file server and steals the credit card information of all their customers.
So, how can I get at that cat.png and turn it into a valid URL?
There are two ways:
One:
You make the decision to expose and include the above path as part of the web site. This is typically done with what is called a virtual folder (virtual directory). You need IIS, and during development with IIS + Visual Studio it is a "pain" to set up such path names, but with the full version of IIS you can add the virtual folder to the web site through the IIS user interface tools.
So, you add a virtual folder called MyPictures and map it to:
\\SANSERVER\WebPictures
So now the web site URL becomes:
www.mycoolsite.com/MyPictures/cat.png
And in code, Server.MapPath("~/MyPictures/cat.png") will now return this:
\\SANSERVER\WebPictures\cat.png
OK, next issue:
Suppose I don't want to expose that other folder to the web site. I don't want a valid URL, and I don't even want users to be able to type in, say, this:
www.mycoolsite.com/MyPictures/doggie.png
If you DO expose another folder by adding it to the web site hierarchy (as a virtual folder), then users ARE free to type in a URL that resolves to that other folder.
On the other hand, with valid URL resolution in place, you can put markup on the web site with valid, full URL path names to the picture or whatever else the page needs.
However, let's say that for reasons of security I do NOT want that other server exposed as a URL.
Well, if it is not exposed as a valid web URL folder, then you cannot put in a valid URL - it's that simple.
However, that doesn't mean the code-behind can't read/load/open that file on the server. In fact, the code-behind can often read any file on the server, and indeed any file anywhere on the network the web server can reach. And as noted, code-behind does not use URLs, and does not use forward slashes "/" for the file - it uses a plain old fully qualified Windows path name.
So, given that the code-behind can read nearly any file and do anything it wants?
OK, then how can we get the code-behind to dish out such a file, or send that file to the web page?
Two simple ways:
Your code-behind could read the cat.png file and copy it to a folder that is part of the web server's folder layout. Once done, you can provide a valid URL. However, with a huge picture library, that would be painful.
And in some cases the picture might come from database rows that store pictures, in which case once again no valid path name exists for the web site.
So what you can do instead is read the file in code-behind and then "stream" the data directly to the web page.
When you stream content from code-behind, you don't care about, nor even require, a valid URL, because the code-behind is pumping the object data (in this case a picture, cat.png) directly to the web browser. This is often done because your pictures don't even exist as files, or because it's not practical to include that folder in the web site's folder list for reasons of security.
As noted, if this is just a folder of pictures outside the web site's folders, then 99% of the time adding a mapped (virtual) folder to the web site that points to the picture drive is commonly done, and is practical.
However, you might have a huge library of pictures on a big file server, and a database with keywords for searching the pictures, where each database row stores a path to the drive/server that holds all the pictures in a hodge-podge folder hierarchy that is not practical to expose as web-based URLs.
So, how to stream a file?
Your code is close, but you need to include additional information - and unfortunately you can't just hand the raw binary bytes to an src attribute.
So, say we drag and drop an Image control onto the form. You have this:
<asp:Image ID="Image1" runat="server" />
Now, in code-behind, to stream + set the picture to a picture on the hard drive, you can use this:
Dim strFile As String = "c:\Test4\pcards.bmp"
Me.Image1.ImageUrl = Gimage2(strFile)
Of course, no URL path to the above Test4 folder exists.
Gimage2 just reads the file as a byte array and then converts it to a string encoded as base64:
Function Gimage2(strPath As String) As String
    ' Requires Imports System.IO
    Dim PicData As Byte() = File.ReadAllBytes(strPath)
    ' Path.GetExtension returns ".bmp" - drop the dot for the MIME type
    Dim ContentType As String = "image/" & Path.GetExtension(strPath).TrimStart("."c)
    Return "data:" & ContentType & ";base64," & Convert.ToBase64String(PicData, 0, PicData.Length)
End Function
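Since the question's code is C#, here is a rough C# equivalent of the same data-URI idea (a sketch only; it requires using System.IO; and the calling markup would bind ImageUrl, not src):
protected string GetImageDataUri(string path)
{
    byte[] picData = File.ReadAllBytes(path);
    // Path.GetExtension returns ".jpg" - drop the dot for the MIME type
    string contentType = "image/" + Path.GetExtension(path).TrimStart('.');
    return "data:" + contentType + ";base64," + Convert.ToBase64String(picData);
}
// e.g. <asp:Image runat="server" ImageUrl='<%# GetImageDataUri(@"\\serverName.com\dfs$\APPL-ADM\FichiersDev\MandatsInfo\SAR220-2020_1.jpg") %>' />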
So I spent some time on a long post. The reason is that you attempted to use a URL with standard Windows backslashes, which means that in your mind you are using the concept of a Windows full path name and confusing it with a URL path name. Failing to make this distinction will cause you years of pain and suffering. You must be beyond crystal clear on the concept of a URL versus that of a file name in code-behind. They are two very different things.
If that additional folder is OK to expose to the web site, then create a virtual folder.
That means:
www.mycoolsite.com/MyPictures/dog.png
could in fact point to any mapped folder on your server. The web server will require permissions to that folder, and in most cases a user (or your code) can then type in and use a full web path to the picture.
However, as noted, for PDF documents and many other types of files it may be out of the question to expose a valid URL and a mapped folder. In that case you can use the 100% file-based approach above: read the file as bytes, and then stream + output the file to the browser.
You can even do a Response.Write and pump the file out directly to the browser, though again you don't have much control over where it will end up. Do realize that pumping out a string as base64 data as above can and will cause some bloat and expansion in the size of the string sent to be rendered as a picture. For a simple image, that's fine; but for a large, high-quality, high-resolution image I don't recommend sending the picture as a base64 string, due to the expansion of that string.
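For the streaming route, a common pattern is a small generic handler (.ashx) that reads the file server-side and writes the bytes to the response. A minimal sketch, assuming the \\SANSERVER\WebPictures share from above and a query-string file name that you validate yourself:
// GetImage.ashx code-behind - streams a picture from outside the web root
public class GetImage : System.Web.IHttpHandler
{
    public void ProcessRequest(System.Web.HttpContext context)
    {
        // GetFileName strips any directory parts so the caller can't walk the share
        string name = System.IO.Path.GetFileName(context.Request.QueryString["name"]);
        string path = System.IO.Path.Combine(@"\\SANSERVER\WebPictures", name);
        context.Response.ContentType = "image/png";   // or derive it from the file extension
        context.Response.TransmitFile(path);          // streams without loading the whole file into memory
    }
    public bool IsReusable { get { return true; } }
}
// markup usage: <img src="GetImage.ashx?name=cat.png" />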
I ended up putting a function in another MVC project that correctly retrieves images.
So my src now points at a URL instead of a file on a server path:
src='https://NameOf_MVC_webSite.csdn.qc.ca/imageBank/ForMandat?name=' + (Container.DataItem as MandatMobile.DAL.MandatsEcoleCC_Result).image
A dirty solution that uses another deployed app with a (better / easier to use / functional) framework.
But this is not an "OK" solution.
Let's assume our app is offline, i.e. we can't use 3rd-party CDNs, so we're creating our own.
I'd like to host all of the vendor scripts in a separate (Parent) web app and then include them in the bundles in several other MVC Apps.
e.g.
http://localhost/parentWeb/Scripts/jquery.js
http://localhost/parentWeb/Scripts/jquery-ui.js
http://localhost/parentWeb/Scripts/globalize.js
...which I'd like to include in the ASP.NET MVC app website located at: http://localhost/parentWeb/childWeb
i.e. do something like this:
bundles.UseCdn = true;
bundles.Add(
new ScriptBundle(
"~/bundles/VendorScripts",
"http://localhost/parentWeb/Scripts/jquery.js",
"http://localhost/parentWeb/Scripts/jquery-ui.js",
"http://localhost/parentWeb/Scripts/globalize.js"));
...which of course isn't currently possible. Is there a good workaround?
You can't bundle external resources. If you think about it, it makes sense why you can't. It would require the bundler to actually download the resource and save it to the filesystem before it could work with it, and of course do it all asynchronously with some sort of fallback if the external resource couldn't be reached. And, then, it would have to do this on every page load because it can't check for lastmod (and therefore, know whether it actually needs to rebundle or not) without fetching the resource first.
If you use a CDN resource, the bundler merely prints the URL directly to the page; it doesn't make any modifications. Even then, it only lets you create a "bundle" of just that one URL, because 1) it wouldn't make sense to bundle multiple CDN resources since that would defeat the purpose of a CDN and 2) the bundle only exists in this scenario to provide a fallback if the CDN resource is unavailable. Otherwise, you would be served just as well by just hardcoding it to the page and not worrying about setting up a bundle at all.
I know this is an old topic, but I came here looking for an actual way to bundle CDN resources. From @Chris Pratt's answer, I understood it wasn't possible.
If you're wondering, I am working on optimizing an existing project according to Google's Web Performance Best Practices, which gives a low score when there are multiple script tags and a higher one when all scripts are bundled into a single script reference.
I needed a way to bundle all the CDN script resources as well as local resources in order. I worked on this github repo, which solved my problem.
With it, you build a list of bundles, each containing a reference to the CDN resource, the local resource to save it to, and a Boolean indicating whether or not you want the bundle minified.
List<Bundle> jsBundles = new List<Bundle>();
jsBundles.Add(new Bundle("https://cdnjs.cloudflare.com/ajax/libs/jquery/3.1.0/jquery.min.js", @"~/jquery.min.js", Bundle.BundleType.JavaScript, false));
jsBundles.Add(new Bundle("https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.0/jquery-ui.min.js", @"~/jquery-ui.min.js", Bundle.BundleType.JavaScript, false));
jsBundles.Add(new Bundle(@"~/my-local-script.js", Bundle.BundleType.JavaScript, true));
To place on the page, you use
@jsBundles.Load();
This will process all bundles in the list, downloading content for bundles that have not been downloaded in the last 24 hours (It updates every 24 hours or when the web application restarts). All content downloaded will be placed in local files (where specified).
All content will be combined into the final result which will be spooled into the page in a script tag (or link tag for CSS).
The Load function also accepts a local File URL for the final script/css content. If specified, a tag with a src to the relative path for that local file will be given instead. E.g.
#jsBundles.Load("~/js/all-my-scripts.js");
The above statement will return something like:
<script src="~/js/all-my-scripts.js"></script>
An async attribute may be added to the script tag if the second parameter of the Load function is provided.
It also works on CSS CDN resources. E.g.
List<Bundle> cssBundles = new List<Bundle>();
cssBundles.Add(new Bundle("https://cdnjs.cloudflare.com/ajax/libs/jqueryui/1.12.0/jquery-ui.min.css", @"~/jquery.ui.css", Bundle.BundleType.CSS, false));
cssBundles.Add(new Bundle(@"~/css/my-local-style.css", Bundle.BundleType.CSS, true));
@cssBundles.Load("~/css/all-my-styles.css");
This is for the benefit of those who, like me, came here looking for a way to actually bundle CDN resources.
I have found a solution, which has nothing to do with CDNs. Basically, provided the childWeb is hosted in the parentWeb's subdirectory, the following bundle configuration in the childWeb app picks the files from the parentWeb and bundles them as usual:
bundles.Add(
new ScriptBundle(
"~/bundles/VendorScripts").Include(
"~/../Scripts/jquery.js",
"~/../Scripts/Scripts/jquery-ui.js",
"~/../Scripts/globalize.js"));
the important bit being: ~/../, which takes you one level up from the root location.
To use ScriptBundles with CDN resources, you need to use the overloaded constructor. Unfortunately, you need a separate ScriptBundle per file.
Here's a great blog post explaining things:
http://www.hanselman.com/blog/CDNsFailButYourScriptsDontHaveToFallbackFromCDNToLocalJQuery.aspx
And here's a code snippet:
bundles.UseCdn = true;
var bundle = new ScriptBundle("~/bundles/bundleNameHere", "//cdn.host.path/to/file");
// Path to the version of the file on your server (in case the CDN fails)
bundle.Include("~/../Scripts/path/to/file");
// JS expression to run, to test if CDN delivered the file or not
bundle.CdnFallbackExpression = "window.ValueYouExpectToBeTruthy";
bundles.Add(bundle);
The purpose of bundling is to reduce the traffic on the web server where your web application is hosted: with a bundle, the script files are served in a single request instead of being loaded separately. This matters when you are serving your own local files.
In the case of a CDN, the files are loaded from the CDN's servers, so you don't need to make a bundle for them. You only need bundles for local files.
In my project I will be having a link like:
<a href="~/hello world.crx">Download</a>
I want the users to download files of different types. The file will be in the root folder. When I click on the link it displays an error. The file is a plugin to install in Chrome; if the user downloads it and opens it, it will be added to Chrome automatically.
How can I do this?
The file is not even downloading.
This isn't a valid path:
~/hello world.crx
The ~ character is for use server-side to denote the root of the application. Client-side it has no meaning. The browser doesn't know what the root of the application is (or what the application is at all), it's just sending requests to resources at addresses. And it doesn't know what to do with that address.
You'll need to either use some server-side logic to translate that path into a browser-useable path, or manually make it a relative or absolute path.
If the ASP.NET MVC Framework isn't translating this for you then you're probably using a version that requires a little more manual work for it. Try something like:
<a href="@Url.Content("~/hello world.crx")">Download</a>
(Note: This assumes the use of the Razor view engine. If you're not using that then you'll want to use whatever your view engine equivalent is.)
What you need to do is set up a directory online where you can host the file.
I also see that in your a href you don't want to type the full path, so you denote it with /hello_world.crx - but make sure that you've set up a base href:
<base href="http://yourdomain.com/something/">
Try renaming the file to remove any spaces e.g. "hello_world.crx" and then change the name in the link code to match.
If the webpage and the downloadable file are in the same location, i.e.
SampleFolder->Download.html
SampleFolder->hello world.crx
then try the below:
<a href="hello world.crx">download</a>
If the webpage and the downloadable file are in different locations, i.e.
SampleFolder->Download.html
SampleFolder->Downloads->hello world.crx
then try the below:
<a href="Downloads/hello world.crx">download</a>
I'm working on an ASP.NET app, and the following link works in IE but not in FF:
<a href="~/BusinessOrderInfo/page.aspx" >
Isn't the tilde something that can only be used in ASP.NET server controls, where it will be replaced by an actual path?
Is it possible to use the tilde in an anchor tag? If so, what does it mean?
When I'm at the root, the link works
www.myserver.com/default.aspx, click the link, ok!
www.myserver.com/otherpart/default.aspx, click the link, not ok!
The link generated by ASP.NET is:
www.myserver.com/otherpart/~BusinessOrderInfo/page.aspx
Is this by design?
You are correct, it only works in server controls. You've got these basic options:
Change it to a HyperLink running as a web control:
<asp:HyperLink NavigateUrl="~/BusinessOrderInfo/page.aspx" Text="Whatever" runat="server" />
Or, run the anchor on the server side as an HTML Control:
<a href="~/BusinessOrderInfo/page.aspx" runat="server" >
Or, use Page.ResolveUrl:
...
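For example, a sketch in Web Forms markup (ResolveUrl is available on the page; the link text here is just an illustration):
<a href='<%= ResolveUrl("~/BusinessOrderInfo/page.aspx") %>'>Business order info</a>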
HTML controls can be turned into server controls by adding the runat="server" attribute.
<a href="~/BusinessOrderInfo/page.aspx" runat="server">
The tilde refers to the application root directory, and will be translated correctly in control properties such as NavigateUrl.
My understanding is that if you use it in plain-HTML tags, it will not be translated by ASP.Net.
This function can also be used to resolve paths for non-server elements:
VirtualPathUtility.ToAbsolute("~/App_Themes/Default/Icons/myimage.gif")
If you remove tilde and use forward slash only you will achieve the same result, i.e. pointing to the root folder on the current domain:
<a href="/BusinessOrderInfo/page.aspx" >
Using Web Paths and Tilde "~" in ASP.NET
~/ is not part of HTML, CSS, or JavaScript path systems.
~/ is an artificial path-resolution character that only ASP.NET or 3rd-party products use.
~/ is a web-server-only path that gets translated to a new path by the code running on the server.
~/ is a character that tells ASP.NET on the IIS Windows Server to find the "application root" of your website.
~/ resolves as a "virtual path": it tells the server to find the virtual or application root of an ASP.NET web application controlled by a given AppDomain on the server and resolve the path from that virtual root.
~/ in most cases resolves to the web root of a website right after the domain, no matter what page or subfolder you are in when the path is called. In almost all cases this resolves to /, so the two are the same in MOST cases unless you set up a Virtual Application on the server.
~/ is really only useful when your website uses one or more Virtual Applications in a web server like IIS. These are artificial sub-applications under your web domain that add a folder or path under the web root that does not truly exist but represents a separate application and process managed by the server. This often creates one or more virtual application folders under your domain in IIS, which ASP.NET and IIS manage when running separate instances of your ASP.NET website under one domain. See below.
Microsoft .NET now also uses ~/ in routing attribute paths. When used, they start the path back at the web root as an absolute path and override all controller or other attribute prefixes.
VIRTUAL WEB APPLICATIONS
In the old days, we used to create Virtual Applications in the IIS web server to create two or more web paths in order to isolate one or more web 'experiences' using the same domain. Each virtual path might be a "ghost" path that points back to the web root but creates an additional ghost folder under it. In many cases, that new virtual path pointed to a physical folder separate from the normal web path, or even to a hard drive path or mapping elsewhere on the machine. ASP.NET, with the right permissions, then ran web site code from there. The new virtual path shown to visitors of your web domain would appear to be part of the main site but would run a second instance of your web application using a separate process managed by ASP.NET (a separate AppPool or worker process).
~/ was very useful in those cases. It was used in path resolution and mapped easily to the root of each of these new virtual application roots created by the server, allowing you to run multiple applications under one website with no change to the paths in your ASP.NET code. The server-side code would resolve the paths for you inside each virtual application with no changes to the code base.
~/ in those situations was extremely valuable, as you no longer needed to manage multiple paths in your web app for each application when it ran in multiple virtual web applications under one website with different web roots. It could always find the new root in each application using ~/ rather than the true web root, which was always http://example.com/
EXAMPLES
In a normal website without virtual applications, most ~/ paths in ASP.NET resolve to / and point to the web root of the URL below. That is why, in most cases, ASP.NET's ~/ is redundant - just use /. Both point to the web root:
https://example.com/
However, if you added virtual directories to your domain, as this example below shows, ~/ inside each separate web application would resolve to two different web roots:
https://example.com/virtualapplication1/
https://example.com/virtualapplication2/
In the early days of ASP.NET, I always grabbed the application path using the code below, stored in a global variable. This allowed me to fully control all paths relative to the application web root - whether the domain root or a virtual root - no matter where my web application was moved. This is essentially the path that ~/ replaced long ago; however, it can still be handy, since you can build paths from it dynamically on the server:
var myWebRoot = HttpContext.Current.Request.ApplicationPath;
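For example, a sketch of building a root-relative URL from that variable (the folder and file names here are just illustrations):
// ApplicationPath is "/" for a root site, or "/virtualapplication1" for a virtual application
string appRoot = HttpContext.Current.Request.ApplicationPath.TrimEnd('/');
string catUrl = appRoot + "/Pictures/cat.png";   // "/Pictures/cat.png" or "/virtualapplication1/Pictures/cat.png"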
My opinion is virtual applications like this are rarely used today as domains are cheap and subdomains are often used instead, like so:
https://app1.example.com/
https://app2.example.com/
All web paths should use absolute, root-relative paths (/) in every case possible. The exception is CSS paths, which are relative to the page or code calling them. Many say that means those absolute web paths break if you move them. But I argue: why would you reference the root of your website and then suddenly change it? If you do, that should be managed on the server side and injected into your HTML and JavaScript, not the other way around.
Second, many open-source, UNIX-based vendors are creating JavaScript API libraries that stumble around with dot paths that HTML and CSS do not support, like ./ or .
These are UNIX conventions that simply mean "the local folder" - the same folder the calling code is in. That is the same as no path at all, so why use it? There are cases for their use, but the end result has zero effect on web paths, so I would avoid them. The only place they work reliably in JavaScript is the new JavaScript module system in ECMAScript, although in proprietary APIs like Google Angular they are required.
For example, these two image paths using UNIX-style local dot-path conventions, ./ and ., both fail in HTML and create missing-image errors:
// These return broken image icons in browsers when using
// these unconventional UNIX local dot path conventions on the Web:
<img id="image1" src="./images/image1.png" />
<img id="image2" src="/images/.image2.png" />
So avoid all these deviant path systems and stick with absolute / HTML paths, and your code will keep working for decades to come!
We have a c# asp.net web application that, amongst other things, allows users to download previously uploaded files such as PDF's, Word docs etc. The asp.net app is served up via an IIS6 server and the file resources live on a different server.
When the user requests a file (i.e. click a button on the web form), we stream the file back to their browser, changing the ContentType appropriately.
This seemed a good way to avoid going down the IIS virtual folder route to serve up the file resources, which we had concerns about due to the potential for users to hack the URL: i.e. with a URL like https://mydomain/myresource/clientid/myreport.docx, a savvy user could have a good stab at guessing alternative clientids and document names.
The trouble with streaming a Word document to the browser is that when the browser throws it at Word, Word treats it as a brand new doc, which means the original document's properties & margin info is lost.
Our users store metadata information in the Word doc properties, so this solution is not acceptable to them.
Serving up via IIS virtual folders solves that problem, but introduces the URL security problem.
So my questions are ...
Does anyone know how we can use URL encryption/decryption (or obfuscation) with IIS Virtual folders?
Or does anyone know of any open source projects that do a similar job.
Or does anyone have any suggestions on how to go about writing our own implementation of virtual folders but with encrypted URLs?
Many thanks in advance.
ps. our web app is delivered over https
Sorry guys, in my question, I have made some incorrect assumptions.
What I am trying to do is persist the properties stored in a Word document when it is delivered from the server (using either Response.TransmitFile or via a virtual folder) to a client browser.
I set up a test scenario with an IIS virtual folder and dropped a docx file (that I know contains info in the title & subject properties) in my virtual folder's physical path.
I pointed my browser at the virtual folder alias and the browser popped up its message to either open or save the doc.
If I choose to save it, the saved docx still has the properties intact.
If I choose to open it first and then save it from Word, the saved docx has lost the properties.
So I think I need to post a different question!
You may find that the ClaimsAuthorizationManager class in "Windows Identity Foundation" does what you want. You get to implement whatever logic you like to determine who can download what without using "directory security".
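For example, a rough sketch of a custom manager using the .NET 4.5 System.Security.Claims version of the class (the "Download" action and "clientid" claim type are made-up names; the class is registered via the claimsAuthorizationManager element under identityConfiguration in web.config):
// Requires: using System.Linq; using System.Security.Claims;
public class DocumentAuthorizationManager : ClaimsAuthorizationManager
{
    public override bool CheckAccess(AuthorizationContext context)
    {
        string clientId = context.Resource.First().Value;   // id of the requested document
        string action = context.Action.First().Value;       // e.g. "Download"
        // allow the download only when the caller holds a matching clientid claim
        return action == "Download" && context.Principal.HasClaim("clientid", clientId);
    }
}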