How to load a DICOM image from a server (path contains https://)
I am using the fo-dicom library.
var image = new DicomImage(filePath);
If filePath points to a local file, it works fine.
If filePath points to a server (like https://example.com/filepath.dcm), it throws a 'Dicom.DicomFileException' saying that the specified path value is not correct.
What is the right way to load a DicomImage from a server?
AFAIK: Not at all. Obtaining DICOM files through a Web API is referred to as WADO (Web Access to DICOM Objects), and there is an open feature request in fo-dicom to support this, but it does not appear to have been given a very high priority.
However, the URL https://example.com/filepath.dcm does not adhere to WADO encoding, so maybe you cannot use a DICOM communication protocol to obtain the image (or you do not want to). But in that case, your task is simply "downloading a file from a URL" and is not related to DICOM at all. After the download is completed, you have the file stored locally and can go on with local file access as you are used to.
The "missing link": How to download a file from a URL
Related
I have multiple web servers and one central file server inside my data center,
and all my web servers store the user-uploaded files on the central internal file server.
I would like to know the best way to pass the file from the web server to the file server in this case.
As suggested, I have tried to add more details to the question.
The solution I came up with was:
After receiving files from the user at the web server, I should just do an HTTP POST to the file server. But I think there is something wrong with this, because it causes large files to be loaded entirely into memory twice (once at the web server and once at the file server).
Is your file server just another Windows/Linux server, or is it a NAS device? I can suggest a number of approaches based on your requirement. The question is why you want to use the HTTP protocol when you have much better ways to transfer files between servers.
HTTP is best when you send text data, as HTTP itself is text-based. From the client side to the server side, HTTP is used because that is the only option the browser gives you. But between your servers, I feel you should use the SMB protocol (I am assuming you are using Windows, as the question is tagged IIS) to move data. It will be orders of magnitude faster, as it is much more efficient to transfer the same data over SMB than over HTTP.
And for the SMB protocol, you do not have to write any code or complex scripts to do this. As suggested in one of the other answers, you can just issue a simple copy command and it will happen for you.
So, just summarizing the options for you (based on my preference):
1. Let the files get uploaded to some location on each IIS web server, e.g. C:\temp\UploadedFiles. You can write a simple 2-3 line PowerShell script which copies the files from C:\temp\UploadedFiles to \\FileServer\Files\UserID\<FILEGUID>\uploaded.file. The same PowerShell script can delete the file once it has been moved to the other server successfully.
E.g. the script can be this simple, and it is easy to set up as a Windows scheduled task:
# Source folder on the web server; destination share on the file server
$Source = "C:\temp\UploadedFiles"
$Destination = "\\FileServer\Files\UserID\<FILEGUID>\"
New-Item -ItemType directory -Path $Destination -Force
Copy-Item -Path $Source\*.* -Destination $Destination -Force
This script can be modified to suit your needs, e.g. to delete the files once the copy is done :)
2. In the ASP.NET application, you can save the file directly to the network location. So in the SaveAs call, you can give the network path itself. You have to make sure this network share is accessible to the IIS worker process and that it has write permission. Also, to my understanding, ASP.NET saves the file to a temporary location first (you have no control over this if you are using the ASP.NET HttpPostedFileBase or FormCollection). More details here.
You can even run this asynchronously so that your requests will not be blocked:
if (FileUpload1.HasFile)
{
    // Call to save the file directly to the network share (UNC path).
    FileUpload1.SaveAs(@"\\networkshare\filename");
}
https://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.fileupload.saveas(v=vs.110).aspx
3. Save the file the current way to a local directory and then use HTTP POST. This is the worst design possible, as you first read the contents and then transfer them chunked to the other server, where you have to set up another web service which receives the file. Then you have to read the file from the request stream and save it again to your location. I am not sure you need to do this.
Let me know if you need more details on any of the listed methods.
Or you could just write it to a folder on the web servers, and create a scheduled task that moves the files to the file server every x minutes (e.g. via robocopy). This also makes sure your web servers are not reliant on your file server.
Assuming that you have an HttpPostedFileBase, the best way is just to call the .SaveAs() method.
You need the UNC path to the file server and that is it. The simplest version would look something like this:
public void SaveFile(HttpPostedFileBase inputFile)
{
    var saveDirectory = @"\\fileshare\application\directory";
    var savePath = Path.Combine(saveDirectory, inputFile.FileName);
    inputFile.SaveAs(savePath);
}
However, this is simplistic in the extreme. Take a look at the OWASP Guidance on Unrestricted File Uploads. File uploads can be the source of many vulnerabilities in your application.
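For example, a minimal hardening sketch along those lines (the allowed extension and the target folder are assumptions, and this is not a complete defence):
public void SaveFileSafely(HttpPostedFileBase inputFile)
{
    var saveDirectory = @"\\fileshare\application\directory";

    // Never trust the client-supplied name: strip any path components
    // and allow only the file types you expect.
    var fileName = Path.GetFileName(inputFile.FileName);
    if (!fileName.EndsWith(".zip", StringComparison.OrdinalIgnoreCase))
        throw new InvalidOperationException("Unexpected file type.");

    inputFile.SaveAs(Path.Combine(saveDirectory, fileName));
}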
You also need to make sure that the web application has access to the file share. Take a look at this answer
Creating a file on network location in asp.net
for more info. Generally the best solution is to run the application pool with a special identity which is only used to access the folder.
The solution I came up with was: after receiving files from the user at the web server, I should just do an HTTP POST to the file server. But I think there is something wrong with this, because it causes large files to be loaded entirely into memory twice (once at the web server and once at the file server).
I would suggest not posting the file all at once - it is then held fully in memory, which is not needed.
You could post the file in chunks using Ajax. When a chunk arrives at your server, just append it to the file.
With the FileReader API, you can read the file in chunks in JavaScript.
Something like this:
/** upload file in chunks */
function upload(file) {
    var chunkSize = 8000;
    var start = 0;
    while (start < file.size) {
        var end = start + chunkSize;
        var chunk = file.slice(start, end);
        var xhr = new XMLHttpRequest();
        xhr.onload = function () {
            // check if all chunks have arrived and then send the filename,
            // or send it in the first/last request.
        };
        xhr.open("POST", "/FileUpload", true);
        xhr.send(chunk);
        start = end;
    }
}
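On the receiving side, a rough sketch of "just add it to the file" (this assumes an ASP.NET MVC action named FileUpload and a hard-coded target share; how the file name is transported is up to you):
[HttpPost]
public ActionResult FileUpload(string fileName)
{
    // Strip any client-supplied path and append this chunk to the target file.
    var savePath = Path.Combine(@"\\fileserver\uploads", Path.GetFileName(fileName ?? "upload.tmp"));
    using (var target = new FileStream(savePath, FileMode.Append, FileAccess.Write))
    {
        Request.InputStream.CopyTo(target);
    }
    return new HttpStatusCodeResult(200);
}
Note that the XHR requests in the snippet above are asynchronous, so chunks may arrive out of order; a production version would need to number the chunks or send them sequentially.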
It can be implemented in different ways. If you are storing files on the file server as files in the file system, and all of your servers are inside the same virtual network,
then it is better to create a shared folder on your file server and, once you receive files at the web server, save them into this shared folder directly on the file server.
Here are the instructions on how to create shared folders: https://technet.microsoft.com/en-us/library/cc770880(v=ws.11).aspx
Just map a drive
I take it you have a means of saving the uploaded file on the web server's local filesystem. The question pertains to moving the file from the web server (which is probably one of many load-balanced nodes) to a central file system all web servers can access.
The solution to this is remarkably simple.
Let's say you are currently saving the files to some folder, say c:\uploadedfiles. The path to uploadedfiles is stored in your web.config.
Take the following steps:
Sign on as the service account under which your web site executes
Map a persistent network drive to the desired location, e.g. from the command line:
NET USE f: \\MyFileServer\MyFileShare password /user:SomeUserName /persistent:yes
Modify your web.config and change c:\uploadedfiles to f:\
Ta da, all done.
Just make sure the drive mapping is persistent, and make sure you use a user with adequate permissions, and voila.
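To illustrate the point: the application code itself does not need to change, only the configured path does. A minimal sketch, assuming the upload path is read from an appSettings key (the key name and the fileUpload control are illustrative):
// The upload folder still comes from web.config; after the drive mapping it
// simply contains "f:\" instead of "c:\uploadedfiles".
string uploadFolder = ConfigurationManager.AppSettings["UploadedFilesPath"];
fileUpload.SaveAs(Path.Combine(uploadFolder, Path.GetFileName(fileUpload.FileName)));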
I am developing a Windows Phone 8 application but am having a lot of issues with file access permission exceptions hindering the approval of my application whenever I try to access files in the "local" folder (this only happens after the application has been signed by the WP store, not when deployed from Visual Studio). To solve this I have moved all file operations to IsolatedStorage, and this seems to have fixed the problems.
I only have one problem left, though. My application needs to make use of the file extension system to open external files, and this seems to involve the file first being copied to the local folder, after which I can then manually copy it into IsolatedStorage. I have no problem implementing this, but it seems that a file access permission exception also occurs once the system tries to copy the external file into the local folder.
The only way I think this can be solved is if I can direct the system to copy directly into IsolatedStorage, but I cannot figure out how to do this or whether it is even possible. It seems as though the SharedStorageAccessManager can only copy into a StorageFolder instance, but I have no idea how to create one that points into IsolatedStorage. Any ideas?
PS. Do you think that the Microsoft system might be signing my application with some incompetent certificate or something because there is not a hint of trouble when I deploy the application from Visual Studio, it only happens when Microsoft tests it or when I install it from the store using the Beta submission method.
Below is a screenshot of the caught exception being displayed in a message box upon trying to open a file from an email.
EDIT:
Just to make it even clearer, I do NOT need assistance in figuring out the normal practice of using a deep link URI to copy an external file into my application directory. I need help in either copying it directly into IsolatedStorage or resolving the file access exception.
Listening for a file launch
When your app is launched to handle a particular file type, a deep link URI is used to take the user to your app. Within the URI, the FileTypeAssociation string designates that the source of the URI is a file association and the fileToken parameter contains the file token.
For example, the following code shows a deep link URI from a file association.
/FileTypeAssociation?fileToken=89819279-4fe0-4531-9f57-d633f0949a19
Upon launch, map the incoming deep link URI to an app page that can handle the file:
// Get the file token from the URI.
// (This is most easily done from a UriMapper that you implement based on UriMapperBase.)
// ...
// Get the file name.
string incomingFileName = SharedStorageAccessManager.GetSharedFileName(fileToken);
// You will then use the file name you got to copy the file into your local folder with
// See: http://msdn.microsoft.com/en-us/library/windowsphone/develop/windows.phone.storage.sharedaccess.sharedstorageaccessmanager.copysharedfileasync(v=vs.105).aspx
SharedStorageAccessManager.CopySharedFileAsync(...)
I've inlined the information on how to do this from MSDN: http://msdn.microsoft.com/en-us/library/windowsphone/develop/jj206987(v=vs.105).aspx
Read that documentation and it should be clear how to use the APIs as well as how to set up your URI mapper.
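Putting those pieces together, a minimal sketch (the wrapper method is hypothetical; fileToken is the value parsed from the deep link URI, and NameCollisionOption.ReplaceExisting is just one choice):
// Hypothetical handler, called once your UriMapper has extracted the fileToken.
private async Task HandleIncomingFileAsync(string fileToken)
{
    // Resolve the original file name from the token...
    string incomingFileName = SharedStorageAccessManager.GetSharedFileName(fileToken);

    // ...and copy the shared file straight into the app's local folder.
    IStorageFile copiedFile = await SharedStorageAccessManager.CopySharedFileAsync(
        ApplicationData.Current.LocalFolder,
        incomingFileName,
        NameCollisionOption.ReplaceExisting,
        fileToken);
}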
Good luck :)
OK, I figured it out. The "install" directory actually has restricted access, but for some reason the Visual Studio signing process leaves the app with enough permissions to access this folder. The correct way to determine a relative directory is not to use "Directory.GetCurrentDirectory()" but rather "ApplicationData.Current.LocalFolder". Hope this helps!
How can I get the path of a file on my computer or on the local area network?
I would like to show a window that allows me to browse the file system when I click a button, and I want to select a file and get the path to the file. How can I do this?
P.S. I'm not looking to upload the file; I just want to get the path.
The web application is running on the server, and you don't have access to the client file system at all. Can you imagine how much of a vulnerability that would be? I know I don't want the sites I visit inspecting my file system...
EDIT: Question is for a WinForms application
For a WinForms application, you can use the OpenFileDialog, and extract the path with something like this:
If you're looking for the file path:
string path = OpenFileDialog1.FileName; //output = c:\folder\file.txt
If you're looking for the directory path:
string path = Path.GetDirectoryName(OpenFileDialog1.FileName); //output = c:\folder
In general, the System.IO.Path class has a lot of useful features for retrieving and manipulating path information.
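A slightly fuller sketch showing where those lines fit around the ShowDialog call (WinForms assumed):
using (var dialog = new OpenFileDialog())
{
    if (dialog.ShowDialog() == DialogResult.OK)
    {
        string filePath = dialog.FileName;                          // c:\folder\file.txt
        string directory = Path.GetDirectoryName(dialog.FileName);  // c:\folder
        // Use the selected path as needed; the file itself is not read here.
    }
}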
To do this in VB.NET use the FolderBrowserDialog.
Due to security restrictions browsers include for user safety, you can't manipulate the client file system directly. You can only use a file input to let them pick a file, and even then it only sends the file name, not the whole path.
The FileSystem API in HTML5 allows for file manipulation, but only within a sandbox for your site specifically, not browsing across the network or other files on the client system.
Instead, provide your users easy steps on how they should use My Computer (or whatever equivalent on other OS's) to navigate to the file and copy & paste the path into a simple text input box.
Have you taken a look at the Path class from System.IO?
In my scenario, users are able to upload zip files to a.example.com
I would love to create a "daemon" which, at specified time intervals, will move/transfer any zip files uploaded by the users from a.example.com to b.example.com.
From the info I have gathered so far:
The daemon will be an .ashx generic handler.
The daemon will be triggered at the specified time intervals via a Plesk cron job.
The daemon (thanks to SLaks) will consist of two FtpWebRequests (one for reading and one for writing).
So the question is: how could I implement step 3?
Do I have to read the whole file into a memory array and try to write that to b.example.com?
How could I write the info I read to b.example.com?
Could I perform reading and writing of the file at the same time?
No, I am not asking for the full code; I just can't figure out how I could perform the reading and writing on the fly, without user interaction.
I mean, I could download the file locally from a.example.com and upload it to b.example.com, but that is not the point.
Here is another solution:
Let ASP.Net in server A receive the file as a regular file upload and store it in directory XXX
Have a Windows service on server A that checks directory XXX for new files.
Let the Windows service upload the file to server B using HttpWebRequest.
Let server B receive the file using a regular ASP.Net file upload page.
Links:
File upload example (ASP.Net): http://msdn.microsoft.com/en-us/library/aa479405.aspx
Building a windows service: http://www.codeproject.com/KB/system/WindowsService.aspx
Uploading files using HttpWebRequest: Upload files with HTTPWebrequest (multipart/form-data)
Problems you have to solve:
How to determine which files to upload to server B. I would use Directory.GetFiles in a Timer to find new files instead of using a FileSystemWatcher (a rough sketch follows this list). You need to be able to check whether a file has been uploaded previously (delete it, rename it, check a DB, or whatever suits your needs).
Authentication on server B, so that only you can upload files to it.
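A rough sketch of that polling approach (the folder, file pattern, the UploadToServerB helper, and the ".done" rename marker are all assumptions for illustration):
// Runs on each timer tick: pick up new files and hand them to the uploader.
private void OnTimerElapsed(object sender, System.Timers.ElapsedEventArgs e)
{
    foreach (string file in Directory.GetFiles(@"C:\Uploads\Outbox", "*.zip"))
    {
        UploadToServerB(file);            // e.g. HttpWebRequest multipart POST (see link above)
        File.Move(file, file + ".done");  // mark as handled so it is not uploaded twice
    }
}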
To answer your questions - yes you can read and write the files at the same time.
You can open an FtpWebRequest to server A and an FtpWebRequest to server B. On the FtpWebRequest to server A you would request the file and get the ResponseStream. Once you have the ResponseStream, you would read a chunk of bytes at a time, and write that chunk of bytes to the server B RequestStream.
The only memory you would be using would be the byte[] buffer in your read/write loop. Just keep in mind, though, that the underlying implementation of FtpWebRequest will download the complete FTP file before returning the response stream.
Similarly, you cannot send your FtpWebRequest to upload the new file until all bytes have been written. In effect, the operations will happen synchronously: you will call GetResponse, which won't return until the full file is available, and only then can you 'upload' the new file.
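A minimal sketch of that read/write loop (server names, paths, and credentials are placeholders; requires System.Net and System.IO):
private static void RelayFileOverFtp()
{
    // Request the file from server A.
    var downloadRequest = (FtpWebRequest)WebRequest.Create("ftp://a.example.com/uploads/archive.zip");
    downloadRequest.Method = WebRequestMethods.Ftp.DownloadFile;
    downloadRequest.Credentials = new NetworkCredential("userA", "passwordA");

    // Prepare the upload to server B.
    var uploadRequest = (FtpWebRequest)WebRequest.Create("ftp://b.example.com/uploads/archive.zip");
    uploadRequest.Method = WebRequestMethods.Ftp.UploadFile;
    uploadRequest.Credentials = new NetworkCredential("userB", "passwordB");

    using (var downloadResponse = (FtpWebResponse)downloadRequest.GetResponse())
    using (Stream source = downloadResponse.GetResponseStream())
    using (Stream target = uploadRequest.GetRequestStream())
    {
        // Copy a chunk at a time from server A straight into the upload to server B.
        var buffer = new byte[8192];
        int bytesRead;
        while ((bytesRead = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            target.Write(buffer, 0, bytesRead);
        }
    }

    // Completing the request finishes the upload on server B.
    ((FtpWebResponse)uploadRequest.GetResponse()).Close();
}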
References:
FTPWebRequest
Something you have to take into consideration is that a long-running web request (your .ashx generic handler) may be killed when the AppDomain refreshes. Therefore you have to implement some sort of atomic transaction logic in your code, and you should handle sudden disconnects and incomplete FTP transfers if you go that way.
Have you had a look at Windows Azure before? This cloud platform supports a distributed file system and has built-in atomic transactions. Plus, it scales nicely should your service grow fast.
I would make it pretty simple. The client program uploads the file to server A. This can be done very easily in C# with an FtpWebRequest.
http://msdn.microsoft.com/en-us/library/ms229715.aspx
I would then have a service on server A that monitors the directory where files are uploaded. When a file is uploaded to that directory, or at certain intervals, it simply copies the files over to server B. Again, this can be done via FTP or other means if they're on the same network.
You need some listener on the target domain, e.g. an FTP server running there, and on the client side you will use System.Net.WebClient and UploadFile or UploadFileAsync to send the file. Is that what you are asking?
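For example, a minimal WebClient sketch (URI, credentials, and the local path are placeholders):
using (var client = new System.Net.WebClient())
{
    client.Credentials = new System.Net.NetworkCredential("user", "password");
    // Uploads the local zip to the FTP listener on the target domain.
    client.UploadFile("ftp://b.example.com/uploads/archive.zip", @"C:\uploads\archive.zip");
}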
It sounds like you don't really need a web service or handler. All you need is a program that will, at regular intervals, open an FTP connection to the other server and move the files. This can be done by any .NET program with the System.Net.WebClient class; it doesn't have to be a "web app". This other program could be a service, which could handle its own timing, or a simple app run by your cron job. If you need this to go two ways, for instance if the two servers are mirrors, you simply have the same app on the second box doing the same thing to upload files over to the first.
If both machines are in the same domain, couldn't you just do file replication at the OS level?
DFS
Set up SSH keys if you are using Linux-based systems:
http://compdottech.blogspot.com/2007/10/unix-login-without-password-setting.html
Once you have the keys working, you can copy the file from system A to system B with regular shell scripts that do not need any user interaction.
I have an ASP.NET FileUpload control and I am uploading:
C:\Documents and Settings\abpa\Desktop\TTPublisher\apache-tomcat-6.0.26\webapps\ttpub\WEB-INF\classes\org\gtfs\tmp\GTFS_Rail\routes.txt
What is the C# code to grab this entire string using the code below:
var pathOfCsvFile = Server.MapPath(ImportRoutes.FileName);
var adapter = new GenericParsing.GenericParserAdapter(pathOfCsvFile);
DataTable data = adapter.GetDataTable();
I know that Server.MapPath needs to change.
UPDATE:
Using System.IO.Path.GetFullPath gave me the below output:
pathOfCsvFile = "C:\\Program Files\\Common Files\\Microsoft Shared\\DevServer\\10.0\\routes.txt"
You are mixing up client and server behavior, which is easy to do when you are testing locally. The issue you are having is that the FileUploadControl (and HTML file uploading in general) is specifically designed to not provide you with the full path to the file. That would be a privacy breach. What it is designed to give you is the binary data of the file that was uploaded itself. Specifically you should query the properties on FileUploadControl: FileBytes or FileContent.
Just to further clarify the issue, what would happen if the browser user was actually on a physically different machine from the web server (the usual case)? What good would the full path of the file on the client machine be to you on the server?
Server.MapPath will return the physical path to a file in or below the application root. If that path you list is outside the application root, Server.MapPath will not work.
You can map a virtual directory to a folder you want to use to hold file uploads, which you can then discover with Server.MapPath.
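For instance, a minimal sketch combining both points (the ~/App_Data/uploads folder is an assumption; use whatever folder or virtual directory suits you):
// Save the upload to a real server-side folder first...
string uploadDir = Server.MapPath("~/App_Data/uploads");
Directory.CreateDirectory(uploadDir);

string pathOfCsvFile = Path.Combine(uploadDir, Path.GetFileName(ImportRoutes.FileName));
ImportRoutes.SaveAs(pathOfCsvFile);

// ...then parse it from that server path.
var adapter = new GenericParsing.GenericParserAdapter(pathOfCsvFile);
DataTable data = adapter.GetDataTable();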