I've been experiencing the following issue.
I have a database attached to a remote SQL server.
All needed impersonation is in place; in other words, I have all the access I need to both the file system and the SQL server.
Let's say I have a FileStreamDB1 SQL database:
\\server\C$\MSSQL\Data\FileStreamDB1.mdf
\\server\C$\MSSQL\Data\FileStreamDB1_log.ldf
\\server\C$\MSSQL\Data\FileStreamDB1
At some point I would like to drop this database.
So I just use the following SQL statement (executed from C#):
DROP DATABASE [FileStreamDB1]
After that the database is dropped and all files are deleted as well (if I go to that server I can't find them; the files and directories really are gone).
But unfortunately the following code tells me that \\server\C$\MSSQL\Data\FileStreamDB1 still exists:
new DirectoryInfo(@"\\server\C$\MSSQL\Data\FileStreamDB1").Exists // returns true
Directory.Exists(@"\\server\C$\MSSQL\Data\FileStreamDB1") // returns true
It looks like the info about the directory is cached and I need to clear that cache
(the SMB2 directory cache, and I DO NOT WANT TO DISABLE IT).
I've also tried this (DirectoryInfo.Refresh() returns void, so it can't be chained):
var dir = new DirectoryInfo(@"\\server\C$\MSSQL\Data\FileStreamDB1");
dir.Refresh();
dir.Exists // still returns true
Any ideas how I can clear the Windows cache for UNC paths using C#?
The problem is that the remote info is cached for a certain amount of time, configurable in the registry.
All of the code reads from the cache first, so File.Exists keeps returning true until the cache is invalidated.
I found a couple of ways to bypass this cache from code.
Try accessing the server UNC path with the $NOCSC$ suffix; this bypasses the client-side cache (note: this doesn't work on Windows Server).
Like so: Directory.Exists(@"\\server$NOCSC$\C$\MSSQL\Data\FileStreamDB1").
It also appears that having a FileSystemWatcher watching the specified folder bypasses the cache as well. (I haven't tried this myself, so correct me if I'm wrong.)
Note: watching the UNC path from Windows Explorer also bypasses the cache, but only for that window, not for any other code running.
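As a rough sketch of that FileSystemWatcher idea (untested against SMB caching; the `UncHelper`/`ExistsUncached` names are mine), the watcher holds an open handle on the parent directory while the existence check runs:

```csharp
using System.IO;

static class UncHelper
{
    // Hypothetical helper: holding an active FileSystemWatcher handle on the
    // parent directory reportedly keeps the SMB2 client from answering the
    // existence check out of its directory cache.
    public static bool ExistsUncached(string path)
    {
        string parent = Path.GetDirectoryName(path.TrimEnd('\\'));
        using (var watcher = new FileSystemWatcher(parent))
        {
            watcher.EnableRaisingEvents = true; // opens a handle on the directory
            return Directory.Exists(path);
        }
    }
}
```

Whether this actually defeats the cache depends on the SMB client version and configuration; treat it as an experiment, not a guarantee.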
Sources:
https://stackoverflow.com/a/41057871/6058174
https://stackoverflow.com/a/35158172/6058174
http://woshub.com/slow-network-shared-folder-refresh-windows-server/
Related
I'm having trouble writing files to a remote directory over the network. The following code fails when I try to check whether the directory exists:
if (!Directory.Exists(processingPath))
Directory.CreateDirectory(processingPath);
processingPath is composed like
processingPath = xxxObject.serverPath + "processing\\";
xxxObject.serverPath contains something like this
\\machineNetworkName\sharedFolder\
It works properly, but when many requests are being processed (running asynchronously as tasks), it stops working and fails with this exception:
System.IO.IOException: The network path was not found.
Could you please help me figure out what the problem could be and why it fails on the network path after some time?
Thanks for your solutions.
I got the same error before; it was an authentication problem.
You have to be sure that you set the user properly in IIS, because it uses the Default App Pool's identity, which can't access your NFS share.
You can also use IIS virtual folders to set the identity.
(In IIS Manager, see App Pool settings -> Identity, and also virtual folder settings -> Identity.)
In my case, it worked better to use impersonation directly in the code, so I recommend using the VladL WrappedImpersonationContext object: How to provide user name and password when connecting to a network share
Last thing to check: the ownership of the files on your NFS server. If they were created under the root user, it might not work.
I had the same problem and solved it. The problem in my code, which I see in yours too, is the trailing slash at the end of the network path.
Instead of processingPath = xxxObject.serverPath + "processing\\"; write: processingPath = xxxObject.serverPath + "processing";
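A slightly safer way to build the path (`serverPath` below is a placeholder for `xxxObject.serverPath`) is Path.Combine, which avoids stray separators entirely:

```csharp
using System.IO;

class PathBuilding
{
    public static string BuildProcessingPath(string serverPath)
    {
        // Trim any trailing backslash before combining, so the result
        // never ends in a separator.
        return Path.Combine(serverPath.TrimEnd('\\'), "processing");
    }
}
```

String concatenation with "\\" is where the trailing-slash bug crept in; Path.Combine sidesteps that whole class of mistake.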
I'm working on an efficient way to copy big files within the same remote machine; call it FILESERVER. From another server (WEBSERVER) I want to issue copies of these files remotely. I tried copying and pasting files within the same remote shared folder using Windows Explorer and noticed it doesn't need to move the file contents through the network, so I thought using shared folders and simply copying files from WEBSERVER could work.
So, I gave it a try with the following code.
File.Copy(@"\\FILESERVER\FOLDER\bigfile", @"\\FILESERVER\FOLDER2\bigfile");
This works, but I noticed that it actually moves the file contents through the network, which is exactly what I wanted to avoid. I don't want to have to implement a server on FILESERVER to receive copy commands if I can do it with a built-in Windows mechanism. The behaviour I would like to reproduce from my C# code is what Explorer does. So, is it possible to do this in .NET?
EDIT:
I tried the XCOPY command and at first it seemed it didn't use the network.
But after some reboots to rule out any OS caching, I noticed that when I execute XCOPY from cmd it doesn't show any I/O in Process Explorer/taskmgr, but when I execute the same command from my C# code it does. So I think it does use the network to fetch/write the file contents, but for some odd reason this isn't reported by those diagnostic tools (taskmgr / Process Explorer).
Use PSEXEC and run the copy with local folder paths on the remote machine.
Definitely, WMI is a good way to do it. I finally managed it with the following code, using the CopyEx method to copy directories recursively.
// Requires a reference to System.Management.dll and: using System.Management;
var classInstance = new ManagementObject(
    "\\\\FILESERVER\\root\\cimv2",
    "Win32_Directory.Name='c:\\path\\to\\directory1'",
    null);
var copyExInParams = classInstance.GetMethodParameters("CopyEx");
// Add the input parameters (paths are local to FILESERVER).
copyExInParams["FileName"] = "c:\\path\\to\\directory2";
copyExInParams["Recursive"] = true;
copyExInParams["StartFileName"] = null;
var copyExOutParams = classInstance.InvokeMethod("CopyEx", copyExInParams, null);
// A ReturnValue of 0 means success; anything else is a Win32_Directory error code.
var returnValue = (uint)copyExOutParams["ReturnValue"];
It's important to note that the paths must be in the remote machine's own format. I can't prove it, but maybe Windows Explorer takes advantage of WMI to copy files between shared folders on the same remote machine to avoid useless network traffic. I haven't found a way to do it directly with UNC paths, but this suits my use case.
I am developing a Windows Phone 8 application but am having a lot of issues with file access permission exceptions hindering the approval of my application whenever I try accessing files in the "local" folder (this only happens after the application has been signed by the WP store, not when deployed from Visual Studio). To solve this I have moved all file operations to IsolatedStorage, and this seems to have fixed the problems.
I only have one problem left, though. My application needs to make use of the file extension system to open external files, and this seems to involve the file first being copied to the local folder, after which I can manually copy it into IsolatedStorage. I have no problem implementing this, but it seems that a file access permission exception also occurs when the system tries to copy the external file into the local folder.
The only way I think this can be solved is if I can direct the system to copy straight into IsolatedStorage, but I cannot figure out how to do this or whether it is even possible. It seems as though the SharedStorageAccessManager can only copy into a StorageFolder instance, and I have no idea how to create one that points into IsolatedStorage. Any ideas?
PS. Do you think the Microsoft system might be signing my application with an incorrect certificate or something? There is not a hint of trouble when I deploy the application from Visual Studio; it only happens when Microsoft tests it or when I install it from the store using the Beta submission method.
Below is a screenshot of the caught exception being displayed in a message box upon trying to open a file from an email:
EDIT:
Just to make it even clearer, I do NOT need assistance in figuring out the normal practice of using a deep link uri to copy an external file into my application directory. I need help in either copying it directly into isolatedstorage or resolving the file access exception.
Listening for a file launch
When your app is launched to handle a particular file type, a deep link URI is used to take the user to your app. Within the URI, the FileTypeAssociation string designates that the source of the URI is a file association and the fileToken parameter contains the file token.
For example, the following code shows a deep link URI from a file association.
/FileTypeAssociation?fileToken=89819279-4fe0-4531-9f57-d633f0949a19
Upon launch, map the incoming deep link URI to an app page that can handle the file
// Get the file token from the incoming URI
// (easiest done from a UriMapper that you implement based on UriMapperBase);
// the URI looks like "/FileTypeAssociation?fileToken=<guid>".
string fileToken = /* parsed from the fileToken query parameter */;
// Get the file name.
string incomingFileName = SharedStorageAccessManager.GetSharedFileName(fileToken);
// Then copy it into your local folder (call this from an async method).
// See: http://msdn.microsoft.com/en-us/library/windowsphone/develop/windows.phone.storage.sharedaccess.sharedstorageaccessmanager.copysharedfileasync(v=vs.105).aspx
StorageFile copy = await SharedStorageAccessManager.CopySharedFileAsync(
    ApplicationData.Current.LocalFolder, incomingFileName,
    NameCollisionOption.ReplaceExisting, fileToken);
I've inlined the information on how to do this from MSDN: http://msdn.microsoft.com/en-us/library/windowsphone/develop/jj206987(v=vs.105).aspx
Read that documentation and it should be clear how to use the APIs as well as how to set up your URI mapper.
Good luck :)
OK, I figured it out. The "install" directory actually has restricted access, but for some reason the Visual Studio signing process leaves the app with enough permissions to access this folder. The correct way to determine a relative directory is not to use "Directory.GetCurrentDirectory()" but rather "ApplicationData.Current.LocalFolder". Hope this helps!
I have made a Windows application that works very well when I run it directly, but after creating its setup it throws the exception below. I have tried giving full access to the database file, but it is still not working.
System.Data.OleDb.OleDbException: Operation must use an updateable query.
I am using Windows 7, the installation folder is c:\program files\abc\, and the Access DB is in the same folder. Is this a permissions issue? Please assist me in removing this exception.
There can be a permission issue; refer to this link:
http://www.mikesdotnetting.com/Article/74/Solving-the-Operation-Must-Use-An-Updateable-Query-error
Make sure the ASPNET account (or whatever account is in use at the time) has
Change permissions on the directory where the .mdb file is located. Access
needs to write some temp and locking files during the operation.
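To confirm whether the folder holding the .mdb is actually writable for the current account (the Jet engine has to create an .ldb lock file next to the database), here is a small probe; the `DbFolderCheck`/`CanWriteTo` names are mine:

```csharp
using System;
using System.IO;

static class DbFolderCheck
{
    // Returns true if the current account can create files in the folder,
    // which is what the Jet engine needs for its .ldb lock file.
    public static bool CanWriteTo(string dir)
    {
        try
        {
            string probe = Path.Combine(dir, Path.GetRandomFileName());
            using (File.Create(probe)) { }
            File.Delete(probe);
            return true;
        }
        catch (UnauthorizedAccessException) { return false; }
        catch (IOException) { return false; }
    }
}
```

Running this against c:\program files\abc\ from the installed application should immediately tell you whether the "updateable query" error is a folder-permission problem.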
I have my project files on my Dropbox folder so I can play around with my files at the office as well.
My project contains an EmbeddableDocumentStore with UseEmbeddedHttpServer set to true.
const int ravenPort = 8181;
NonAdminHttp.EnsureCanListenToWhenInNonAdminContext(ravenPort);
var ds = new EmbeddableDocumentStore {
DataDirectory = "Data",
UseEmbeddedHttpServer = true,
Configuration = { Port = ravenPort }
};
Then one day, when I started my project on my office PC, I saw this message: Could not open transactional storage: D:\Dropbox\...\Data
Since it's early in my development stage, I deleted the Data folder on my Dropbox and the project started flawlessly. But now that I'm back at home, I've run into the same issue! I don't want to end up deleting this folder every time, of course.
Can't I store my development data on my Dropbox? Should I bypass something to get this to work?
Set a data directory to a physical disk volume on your local computer. You will not be able to use any sort of mapped drive, network share, UNC path, dropbox or skydrive as a data directory. Just because you have a drive letter does not mean you have a physical disk.
The only types of non-physical storage that even make sense are a LUN attached from a SAN over iSCSI or FibreChannel, or an attached VHD in a virtualized or cloud environment. They all present as physical disks to the OS.
This would be the case for just about ANY data access environment. Try it with SQL Server if you don't believe me. In RavenDB's case, it is using ESENT as its data store, which requires direct access to the filesystem.
Update
To clarify, even if you are storing on a physical disk, you can't rely on any type of synchronization technology like DropBox or SkyDrive. Why? Because they will be taking a shared read lock on the files to watch for changes. Technologies like ESENT (which RavenDB is based upon) require an exclusive lock to the file.
Other technologies like SQL Server and Windows virtual machines also take exclusive locks on their data stores. Why? Because they constantly read and write bits of data in a random-access manner within the file. Would you really want DropBox to try to perform a sync operation for every bit of data that changes? It would be very inefficient and problematic.
Applications that use shared locks don't have this problem. For example, when you work on an MS Word document, it is all being done in memory. When you save the file, DropBox can read the entire file and sync it to the cloud. It can optimize by sending only the bits that have changed, but it still needs to be able to read the file to do so.
So if DropBox has a shared read lock on the ESENT file, then when RavenDB tries to open it exclusively, it gets an error and raises the exception you are seeing.
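To see this clash outside of RavenDB, here's a small stand-alone sketch using raw FileStream sharing flags (the class and method names are mine): the first open mimics ESENT's exclusive lock, the second mimics a sync client's shared read.

```csharp
using System.IO;

class LockDemo
{
    // Opens the file exclusively (as ESENT does), then attempts the kind of
    // shared read open a sync client like DropBox would perform.
    public static string TrySharedReadWhileExclusive(string path)
    {
        using (var exclusive = new FileStream(path, FileMode.OpenOrCreate,
                                              FileAccess.ReadWrite, FileShare.None))
        {
            try
            {
                using (new FileStream(path, FileMode.Open,
                                      FileAccess.Read, FileShare.ReadWrite))
                {
                }
                return "second open succeeded";
            }
            catch (IOException)
            {
                return "second open blocked";
            }
        }
    }
}
```

RavenDB hits the mirror image of this: the sync client already holds a handle when ESENT asks for exclusive access, so the open fails and you get the "Could not open transactional storage" exception.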