Reset Application setting in wp7 - c#

I have created an app that initially creates a database and saves some data in it.
Now I want to delete this database and its files when the user clicks a reset button, but I am getting an error: 'This is in use in another process'. I want the app to delete and recreate the database when the reset button is clicked. Any ideas?

The most frequent cause of this is due to the thread-unsafe nature of interacting with isolated storage on Windows Phone. Regardless of how you're implementing the database (be it in a file, or a series of files), you're interacting with isolated storage on some level.
I highly encourage you to read, and make sure you understand this overview of isolated storage before going too far.
Your remark:
This is in use in another process
makes me think you're using a third-party library to do your database work. This exception/error is thrown when the library itself is unable to access isolated storage. Without knowing exactly how you're implementing the database, it's hard to speak precisely to your situation.
You never "recreate IsolatedStorage". Isolated Storage is a term used to describe the collection of disk space your application has access to. Much like a folder, this disk space has a root and contains only files that you create.
To avoid threading exceptions when accessing Isolated Storage, make sure you use the using keyword in C#, like so:
namespace IsolatedStorageExample
{
    public class ISOAccess
    {
        // This example method reads a file inside your Isolated Storage.
        public static String ReadFile(string filename)
        {
            string fileContents = "";

            // Ideally, you should enclose this entire next section in a try/catch
            // block, since if anything below goes wrong, it will crash your app.
            //
            // This line returns the "handle" to your Isolated Storage. The phone
            // considers the entire isolated storage folder as a single "file",
            // which is why it can be a little bit of a confusing name.
            using (IsolatedStorageFile file = IsolatedStorageFile.GetUserStoreForApplication())
            {
                // If the file does not exist, return an empty string
                if (file.FileExists(filename))
                {
                    // Obtain a stream to the file
                    using (IsolatedStorageFileStream stream = file.OpenFile(filename, FileMode.Open))
                    {
                        // Open a stream reader to actually read the file.
                        using (StreamReader reader = new StreamReader(stream))
                        {
                            fileContents = reader.ReadToEnd();
                        }
                    }
                }
            }
            return fileContents;
        }
    }
}
That should help with your problem of thread safety. To speak more specifically to what you want to do, take a look at the following methods (you can add these to the class above):
// BE VERY CAREFUL: running this method will delete *all* the files in
// isolated storage... ALL OF THEM.
public static void ClearAllIsolatedStorage()
{
    // Get the handle to isolated storage
    using (IsolatedStorageFile file = IsolatedStorageFile.GetUserStoreForApplication())
    {
        // Get a list of all the folders in the root directory
        Queue<String> rootFolders = new Queue<String>(file.GetDirectoryNames());

        // For each folder...
        while (0 != rootFolders.Count)
        {
            string folderName = rootFolders.Dequeue();

            // First, recursively delete all the files and folders inside the
            // given folder. This is required, because you cannot delete a
            // non-empty directory.
            DeleteFilesInFolderRecursively(file, folderName);

            // Now that all of its contents have been deleted, you can delete
            // the directory itself.
            file.DeleteDirectory(folderName);
        }

        // And now we delete all the files in the root directory
        Queue<String> rootFiles = new Queue<String>(file.GetFileNames());
        while (0 != rootFiles.Count)
            file.DeleteFile(rootFiles.Dequeue());
    }
}
private static void DeleteFilesInFolderRecursively(IsolatedStorageFile iso, string directory)
{
    // Get the folders that are inside this folder; GetDirectoryNames takes a
    // search pattern and returns names relative to it
    Queue<string> enclosedDirectories = new Queue<string>(iso.GetDirectoryNames(Path.Combine(directory, "*")));

    // Loop through all the folders inside this folder, recurse into each one,
    // then delete the (now empty) subfolder itself
    while (0 != enclosedDirectories.Count)
    {
        string nextFolderPath = Path.Combine(directory, enclosedDirectories.Dequeue());
        DeleteFilesInFolderRecursively(iso, nextFolderPath);
        iso.DeleteDirectory(nextFolderPath);
    }

    // This search string will match all the files in this folder
    string fileSearch = Path.Combine(directory, "*");

    // Getting the files in this folder; GetFileNames returns the bare file
    // names, so re-combine them with the directory before deleting
    Queue<string> filesInDirectory = new Queue<string>(iso.GetFileNames(fileSearch));

    // Finally, deleting all the files in this folder
    while (0 != filesInDirectory.Count)
    {
        iso.DeleteFile(Path.Combine(directory, filesInDirectory.Dequeue()));
    }
}
Another thing I highly recommend is implementing the class that accesses IsolatedStorage using a "Multithreaded Singleton Pattern" as described here.
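For reference, here's a minimal sketch of that pattern (double-checked locking, as described in the linked article), assuming you funnel all Isolated Storage access through a single class; the class name here is hypothetical:
public sealed class StorageAccessor
{
    // volatile prevents a thread from seeing a half-constructed instance
    private static volatile StorageAccessor instance;
    private static readonly object syncRoot = new object();

    private StorageAccessor() { }

    public static StorageAccessor Instance
    {
        get
        {
            if (instance == null)
            {
                // Only take the lock when the instance might not exist yet
                lock (syncRoot)
                {
                    if (instance == null)
                        instance = new StorageAccessor();
                }
            }
            return instance;
        }
    }

    // Route all your isolated storage reads/writes through methods on this
    // single instance so access can be serialized in one place.
}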
Hope that's helpful. Code is provided "as-is"; I have not compiled it, but the general concepts are all there, so if something's amiss, read the MSDN docs to see where I goofed. But I assure you, most of this is copied from functional code of mine, so it should work properly with very little finagling.

Related

PCL storage package does not create folder

I have used PCL storage package to create a folder for my application. I referred to this. Here is my code sample:
public ListPage()
{
    testFile();
    Content = new StackLayout
    {
        Children = {
            new Label { Text = "Hello ContentPage" }
        }
    };
}

async public void testFile()
{
    // get hold of the file system
    IFolder rootFolder = FileSystem.Current.LocalStorage;

    // create a folder, if one does not exist already
    IFolder folder = await rootFolder.CreateFolderAsync("MySubFolder", CreationCollisionOption.OpenIfExists);

    // create a file, overwriting any existing file
    IFile file = await folder.CreateFileAsync("MyFile.txt", CreationCollisionOption.ReplaceExisting);

    // populate the file with some text
    await file.WriteAllTextAsync("Sample Text...");
}
The files folder is getting created under the sdcard/android/data/ directory, but the "MySubFolder" folder is not created under files.
I have set WRITE_EXTERNAL_STORAGE and READ_EXTERNAL_STORAGE for my Android project. Am I missing any other configuration?
Having run into similar issues (though on iOS), I now have this working; maybe it helps you. The issues were properly dealing with the async calls and other threading fun.
First, my use case is that I bundle a number of file resources with the app, provided for the user at first run but from then on updated online. Therefore, I take the bundled resources and copy them into the file system proper:
var root = FileSystem.Current.LocalStorage;

// already run at least once, don't overwrite what's there
if (root.CheckExistsAsync(TestFolder).Result == ExistenceCheckResult.FolderExists)
{
    _testFolderPath = root.GetFolderAsync(TestFolder).Result;
    return;
}

_testFolderPath = await root.CreateFolderAsync(TestFolder, CreationCollisionOption.FailIfExists).ConfigureAwait(false);

foreach (var resource in ResourceList)
{
    var resourceContent = ResourceLoader.GetEmbeddedResourceString(_assembly, resource);
    var outfile = await _testFolderPath.CreateFileAsync(ResourceToFile(resource), CreationCollisionOption.OpenIfExists);
    await outfile.WriteAllTextAsync(resourceContent);
}
Notice the .ConfigureAwait(false). I learned this from the excellent MSDN best-practices article on async/await.
Before that, I was going back and forth between the method not creating directories or files (as in your question) and the thread hanging. The article talks about the latter in detail.
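To make the failure mode concrete, here's a sketch (not my production code) of the classic hang: something up the stack blocks on the async call, and without ConfigureAwait(false) every await tries to resume on that same blocked UI context:
// Deadlock: the UI thread blocks in .Result while the awaits below
// try to resume on that same, now-blocked, UI context.
public string ReadFileBlocking()
{
    return ReadFileAsync().Result;
}

private async Task<string> ReadFileAsync()
{
    IFolder root = FileSystem.Current.LocalStorage;
    // ConfigureAwait(false) resumes the continuations on the thread pool
    // instead of the captured UI context, so the blocked caller above
    // can't deadlock them.
    IFile file = await root.GetFileAsync("MyFile.txt").ConfigureAwait(false);
    return await file.ReadAllTextAsync().ConfigureAwait(false);
}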
The ResourceLoader class is from here:
Embedded Resource
The ResourceToFile() method is just a helper that turns the long resource names in iOS into shorter file names, as I prefer those. It's not germane here (IOW: it's a kludge I'm ashamed to show ;)
I think I understand threading better day by day, and if I understand correctly, the art here is to ensure you wait for the async methods that load and write files to finish, but make sure you do that on a thread pool that will not deadlock with the main UI thread.

How to delete files in AppData/Temp after file upload with .NET WebAPI 2? File in use error

I am trying to upload files to S3 after a user uploads a file to my API. I obviously don't want them to live on MY server; in fact, I'd prefer they never exist on the server at all. My problem is that the files appear to remain in use for the lifetime of the server app! Here is the code:
[HttpPost]
public async Task<HttpResponseMessage> Upload()
{
    if (!Request.Content.IsMimeMultipartContent())
    {
        return this.Request.CreateResponse(HttpStatusCode.UnsupportedMediaType);
    }

    var provider = GetMultipartProvider();
    var result = await Request.Content.ReadAsMultipartAsync(provider);

    var originalFileName = GetDeserializedFileName(result.FileData.First());
    var fi = new FileInfo(result.FileData.First().LocalFileName);
    var extension = Path.GetExtension(originalFileName).ToLower();

    var amazonKey = S3Helper.Upload(fi.FullName, extension);

    // DELETE THE FILE HERE (BodyPart_2f33be26-09a2-4ae3-8b89-4158b99fe32d)
    File.Delete(fi.FullName); // This doesn't work, file in use error...

    return this.Request.CreateResponse(HttpStatusCode.OK, new Img {
        Extension = extension,
        S3Key = amazonKey.Key,
        OriginalFilename = originalFileName
    });
}
How, when or where do I delete these files? OR is there a way to keep the files from being written to my server's disk in the first place?
The similar question shows that this approach should work in general. The only visible differences are:
You're using FileInfo. So why don't you use FileInfo's Delete method?
You're using S3Helper.Upload(fi.FullName, extension). Try to comment this line out to determine if it locks the file.
Not sure, but S3Helper.Upload might be uploading on a background thread. Try this:
while (true)
{
    try
    {
        File.Delete(fi.FullName);
        break;
    }
    catch (IOException)
    {
        Thread.Sleep(1000);
    }
}
Just looked at the S3 docs, and yes, it uses not just a background thread but multiple threads:
Uploads the specified file. The object key is derived from the file's name. Multiple threads are used to read the file and perform multiple uploads in parallel. For large uploads, the file will be divided and uploaded in parts using Amazon S3's multipart API. The parts will be reassembled as one object in Amazon S3.
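If you do go the retry route, consider bounding it so a file that never unlocks can't spin the request forever. A sketch (the helper name is mine; pass fi.FullName from the question's code):
private static bool TryDeleteWithRetry(string path, int maxAttempts = 30)
{
    for (int attempt = 0; attempt < maxAttempts; attempt++)
    {
        try
        {
            File.Delete(path);
            return true;
        }
        catch (IOException)
        {
            // Still locked by the upload's worker threads; wait and retry.
            Thread.Sleep(1000);
        }
    }
    return false; // gave up; log it and clean up on a schedule instead
}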

Show every file in a directory, but don't show files that are currently being copied

I have a function that checks every file in a directory and writes a list to the console. The problem is, I don't want it to include files that are currently being copied to the directory, I only want it to show the files that are complete. How do I do that? Here is my code:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    Console.WriteLine(file);
}
There's really no way to tell "being copied" vs "locked for writing by something". Relevant: How to check for file lock? and Can I simply 'read' a file that is in use?
If you want to simply display a list of files that are not open for writing, you can do that by attempting to open them:
foreach (string file in Directory.EnumerateFiles(@"C:\folder"))
{
    try
    {
        // Opening for read fails if another process has the file locked
        using (var stream = File.Open(file, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            Console.WriteLine(file);
        }
    }
    catch
    {
        // file is in use
        continue;
    }
}
However, there are lots of caveats:
Immediately after displaying the filename (at the end of the using block), the file could be opened by something else.
The process writing the file may have used FileShare.Read, which means the call will succeed despite the file being written to.
I'm not sure what exactly you're up to here, but it sounds like two processes sharing a queue directory: one writing, one reading/processing. The biggest challenge is that writing a file takes time, and so your "reading" process ends up picking it up and trying to read it before the whole file is there, which will fail in some way depending on the sharing mode, how your apps are written, etc.
A common pattern to deal with this situation is to use an atomic file operation like Move:
Do the (slow) write/copy operation to a temporary directory that's on the same file system (very important) as the queue directory
Once complete, do a Move from the temporary directory to the queue directory.
Since move is atomic, the file will either not be there, or it will be 100% there -- there is no opportunity for the "reading" process to ever see the file while it's partially there.
Note that if you do the move across file systems, it will act the same as a copy.
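A sketch of the pattern above, with made-up paths; the key detail is that both directories live on the same volume, so the Move is a pure rename:
// Both directories must be on the same volume so the Move is a pure rename.
string tempPath  = Path.Combine(@"C:\queue-temp", "data-0001.bin");
string queuePath = Path.Combine(@"C:\queue", "data-0001.bin");
byte[] payload = new byte[1024]; // stand-in for the real content

// The slow write happens outside the watched queue directory...
File.WriteAllBytes(tempPath, payload);

// ...and the rename publishes the finished file all at once.
File.Move(tempPath, queuePath);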
There's no "current files being copied" list stored anywhere in Windows/.NET/whatever. Probably the most you could do is attempt to open each file for append and see if you get an exception. Depending on the size and location of your directory, and on the security setup, that may not be an option.
There isn't a clean way to do this, but this... works...
foreach (var file in new DirectoryInfo(@"C:\Folder").GetFiles())
{
    try
    {
        // Dispose the stream right away; we only care whether the open succeeds
        file.OpenRead().Dispose();
    }
    catch
    {
        continue;
    }
    Console.WriteLine(file.Name);
}

UnauthorizedAccessException when trying to save a photo to the same file a second time

So, the code below allows me to take a picture. I then display the picture. My XAML is bound to the Photo property of the Vehicle object. It works fine, until I go in and try to take a picture again. I then get an UnauthorizedAccessException. I create the file in 'LocalStorage', so I don't believe I need special permissions to write files there. I'm not sure what is causing the error.
public async Task TakePicture()
{
    CameraCaptureUI camera = new CameraCaptureUI();
    camera.PhotoSettings.CroppedAspectRatio = new Size(16, 9);

    StorageFile photo = await camera.CaptureFileAsync(CameraCaptureUIMode.Photo);
    if (photo != null)
    {
        var targetFolder = ApplicationData.Current.LocalFolder;
        var targetFile = await targetFolder.CreateFileAsync(
            String.Format("VehiclePhoto{0}.jpg", this.Vehicle.PrimaryKey),
            CreationCollisionOption.ReplaceExisting);
        if (targetFile != null)
        {
            await photo.MoveAndReplaceAsync(targetFile);
            this.Vehicle.Photo = String.Format("ms-appdata:///local/VehiclePhoto{0}.jpg", this.Vehicle.PrimaryKey);
        }
    }
}
I assume that StoragePhoto encapsulates some kind of file I/O under the hood. You must properly dispose of these objects in order to release the underlying unmanaged OS resources that keep "hooks" into the file. If you don't dispose of them, the application keeps its access to the file open, which is probably why your second access to the file throws an exception (the first access still remains). Show me the StoragePhoto code and I can get more specific.
On another note, if this application is multi-threaded, you should create granular semaphores/locks around writing the files to disk (perhaps by interning the physical path string and locking on that reference) to ensure you don't try to write the same file to disk at the same physical path at the same time - that would be bad.
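On that locking suggestion: locking on interned path strings works, but any other code that interns the same string can contend on the same lock, so a dictionary of per-path semaphores is a safer sketch of the same idea (names here are mine):
// One gate per physical path; ConcurrentDictionary makes the lookup thread-safe.
private static readonly ConcurrentDictionary<string, SemaphoreSlim> fileGates =
    new ConcurrentDictionary<string, SemaphoreSlim>();

private static async Task WriteFileGatedAsync(StorageFile source, StorageFile target)
{
    // Only one writer per path gets past the gate at a time.
    SemaphoreSlim gate = fileGates.GetOrAdd(target.Path, _ => new SemaphoreSlim(1, 1));
    await gate.WaitAsync();
    try
    {
        await source.MoveAndReplaceAsync(target);
    }
    finally
    {
        gate.Release();
    }
}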

How to delete the xmldocument saved in particular place

I have the following code, where I save the XML file into the particular location shown below:
public bool GetList(string keyword1, string streetname, string lat, string lng, string radius)
{
    XmlDocument xmlDoc = CreateXML(keyword1, streetname, lat, lng, radius);
    xmlDoc.Save(@"C:\Documents and Settings\Vijay.EKO-03\Desktop\blockseek3-9-2010\Block3.xml");
    return true;
}
This Block3.xml file gets stored in my application folder. I refer to that particular Block3.xml using this code:
function searchLocationsNear()
{
    var searchUrl = "Block3.xml";
    GDownloadUrl(searchUrl, function(data) {
        var xml = GXml.parse(data);
        var markers = xml.documentElement.getElementsByTagName('marker');
        map.clearOverlays();
I am able to parse that Block3.xml and display results, but my problem is that on the second iteration, when I again try to save Block3.xml:
xmlDoc.Save(@"C:\Documents and Settings\Desktop\blockseek3-9-2010\Block3.xml");
the previous Block3.xml gets replaced by the new one in the application folder,
but when I execute var searchUrl = "Block3.xml"; it reads the first Block3.xml, not the replaced one. Can anyone help me tackle this code?
Is there any syntax to clear the saved XML document in that particular folder?
Most likely, the GXml class retains an open file handle. Check the documentation: if it implements IDisposable, wrap your processing code like this:
using (var xml = GXml.parse(data))
{
    var markers = xml.documentElement.getElementsByTagName('marker');
    map.clearOverlays();
}
This also assumes that the parse() method is the one that actually loads and reads the file. If you don't release the file handle, your process doesn't know that there's another version in the file system. (And other nasty stuff can happen, depending on the exact mode the O/S opened the file in.)
I trust your production code won't contain these hard coded paths...?
