.NET File Access Violation - c#

I feel kind of stupid posting this, but it seems like a genuine issue that I've made sufficiently simple so as to demonstrate that it should not fail. As part of my work I am responsible for maintaining build systems that take files under version control, and copy them to other locations. Sounds simple, but I've constantly experienced file access violations when attempting to copy files that I've supposedly already set as 'Normal'.
The code sample below simply creates a set of test files, makes them read only, and then copies them over to another folder. If the files already exist in the destination folder, the RO attribute is cleared so that the file copy will not fail.
The code works for a while, but at seemingly random points an exception is thrown when the file copy is attempted. The code is all single threaded, so unless .NET is doing something under the hood that causes a delay in setting the attributes, I can't really explain the problem.
If anyone can explain why this is happening I'd be interested. I'm not looking for a solution unless there is something I am definitely doing wrong, as I've handled the issue already; I'm just curious, as no one else seems to have reported anything related to this.
After a few iterations I get something like:
A first chance exception of type 'System.UnauthorizedAccessException' occurred in mscorlib.dll
Additional information: Access to the path 'C:\TempFolderB\TEMPFILENAME8903.txt' is denied.
One other fact: if you get the file attributes BEFORE the file copy, the resulting state says the attributes are indeed Normal, yet examination of the local file shows it as Read Only.
/// <summary>
/// Test copying multiple files from one folder to another while resetting RO attr
/// </summary>
static void MultiFileCopyTest()
{
    // Temp folders for our test files
    string folderA = @"C:\TempFolderA";
    string folderB = @"C:\TempFolderB";

    // Number of files to create
    const int fileCount = 10000;

    // If the test folders do not exist, populate them with some test files
    if (System.IO.Directory.Exists(folderA) == false)
    {
        const int bufferSize = 32768;

        System.IO.Directory.CreateDirectory(folderA);
        System.IO.Directory.CreateDirectory(folderB);

        byte[] tempBuffer = new byte[bufferSize];

        // Create a bunch of files and make them all Read Only
        for (int i = 0; i < fileCount; i++)
        {
            string filename = folderA + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";

            if (System.IO.File.Exists(filename) == false)
            {
                System.IO.FileStream str = System.IO.File.Create(filename);
                str.Write(tempBuffer, 0, bufferSize);
                str.Close();
            }

            // Ensure files are Read Only
            System.IO.File.SetAttributes(filename, System.IO.FileAttributes.ReadOnly);
        }
    }

    // Number of iterations around the folders
    const int maxIterations = 100;

    for (int idx = 0; idx < maxIterations; idx++)
    {
        Console.WriteLine("Iteration {0}", idx);

        // Loop for copying all files after resetting the RO attribute
        for (int i = 0; i < fileCount; i++)
        {
            string filenameA = folderA + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";
            string filenameB = folderB + "\\" + "TEMPFILENAME" + i.ToString() + ".txt";

            try
            {
                if (System.IO.File.Exists(filenameB) == true)
                {
                    System.IO.File.SetAttributes(filenameB, System.IO.FileAttributes.Normal);
                }

                System.IO.File.Copy(filenameA, filenameB, true);
            }
            catch (System.UnauthorizedAccessException ex)
            {
                Console.WriteLine(ex.Message);
            }
        }
    }
}

(This isn't a full answer, but I don't have enough reputation yet to post comments...)
I don't think you are doing anything wrong; when I run your test code I can reproduce the problem every time. I have never got past 10 iterations without the error occurring.
I did a further test, which might shed some light on the issue:
I set all of the files in TempFolderA to hidden.
I then ensured all of the files in TempFolderB were NOT hidden.
I put a break-point on Console.WriteLine(ex.Message)
I ran the code; if it got past iteration 1, I stopped, reset the hidden attributes, and ran again.
After a couple of tries, I got a failure in the 1st iteration so I opened Windows Explorer on TempFolderB and scrolled down to the problematic file.
The file was 0 bytes in size, but had RHA attributes set.
Unfortunately I have no idea why this is. Process Monitor doesn't show any other activity which could be relevant.
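If it helps anyone else dig into this, one thing you could try (a small diagnostic sketch, not something from my original test) is to dump the attributes of both files inside the catch block of the question's code, at the moment the copy fails:

catch (System.UnauthorizedAccessException ex)
{
    Console.WriteLine(ex.Message);
    // Diagnostic sketch: record what both files look like at the moment of failure.
    // filenameA and filenameB are the variables from the question's loop.
    Console.WriteLine("Source attributes:      " + System.IO.File.GetAttributes(filenameA));
    Console.WriteLine("Destination attributes: " + System.IO.File.GetAttributes(filenameB));
}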

Well, right in the documentation for the System.IO.File.SetAttributes(string path, System.IO.FileAttributes attributes) method, I found the following:
Exceptions:
System.UnauthorizedAccessException:
path specified a file that is read-only.
-or- This operation is not supported on the current platform.
-or- The caller does not have the required permission.
So, if I had to guess, the file in the destination (e.g. filenameB) did in fact already exist. It was marked Read-Only, and so, the exception was thrown as per the documentation above.
Instead, what you need to do is remove the Read-Only attribute via an inverse bit mask:
if (File.Exists(filenameB))
{
    // Remove the read-only attribute
    FileAttributes attributes = File.GetAttributes(filenameB);
    attributes &= ~FileAttributes.ReadOnly;
    File.SetAttributes(filenameB, attributes);

    // You can't OR the Normal attribute with other attributes--see MSDN.
    File.SetAttributes(filenameB, FileAttributes.Normal);
}
To be fair, the documentation on the SetAttributes method isn't really clear about how to go about setting file attributes once a file is marked as ReadOnly. Sure, there's an example (using the Hidden attribute), but it doesn't explicitly say that you need to use an inverted bitmask to remove the Hidden or ReadOnly attributes. One could easily assume it's just how they chose to "unset" the attribute. It's also not clear from the documentation what would happen, for instance, if you marked the file thusly:
File.SetAttributes(pathToFile, FileAttributes.Normal);
File.SetAttributes(pathToFile, FileAttributes.Archive);
Does this result in the file first having Normal set and then only Archive, or does it result in the file having Normal set and then additionally having Archive set, resulting in a Normal but Archived file? I believe it's the former, rather than the latter, based on how attributes are "removed" from a file using the inverted bitmask.
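For anyone who wants to check, here is a minimal sketch (the temp-file path is made up purely for illustration) that reads the attributes back after each call; since SetAttributes replaces the whole attribute set, the second call should report only Archive:

using System;
using System.IO;

class AttributeCheck
{
    static void Main()
    {
        // Hypothetical test file purely for observing SetAttributes behaviour.
        string pathToFile = Path.Combine(Path.GetTempPath(), "attrtest.txt");
        File.WriteAllText(pathToFile, "test");

        File.SetAttributes(pathToFile, FileAttributes.Normal);
        Console.WriteLine(File.GetAttributes(pathToFile));   // Normal

        File.SetAttributes(pathToFile, FileAttributes.Archive);
        Console.WriteLine(File.GetAttributes(pathToFile));   // Archive only; Normal is gone

        // ReadOnly was never set, and setting Archive did not bring it back.
        Console.WriteLine(File.GetAttributes(pathToFile).HasFlag(FileAttributes.ReadOnly)); // False
    }
}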
If anyone finds anything contrary, please post a comment and I'll update my answer accordingly.
HTH.

This may be caused by the user not having sufficient permission to access the file system.
Workarounds:
1. Try running the application in administrative mode.
OR
2. Try running Visual Studio in administrative mode (if you are using the debugger).
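If you want to confirm whether elevation is really the problem, a minimal sketch (assuming Windows and the System.Security.Principal namespace) to check whether the current process is running with administrative rights:

using System;
using System.Security.Principal;

class ElevationCheck
{
    static void Main()
    {
        // Reports whether the current process is running elevated (as administrator).
        WindowsIdentity identity = WindowsIdentity.GetCurrent();
        WindowsPrincipal principal = new WindowsPrincipal(identity);
        bool isAdministrator = principal.IsInRole(WindowsBuiltInRole.Administrator);
        Console.WriteLine("Running as administrator: " + isAdministrator);
    }
}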

Related

System.IO.File.Delete throws "The process cannot access the file because it is being used by another process"

Every time I save a file and delete it right away using the function below, I keep getting this error message: "System.IO.IOException: The process cannot access the file because it is being used by another process".
Waiting for a couple of minutes or closing Visual Studio seems to unlock only the files that were uploaded previously.
public static bool DeleteFiles(List<String> paths)
{
    // Returns true on success
    try
    {
        foreach (var path in paths)
        {
            if (File.Exists(HostingEnvironment.MapPath("~") + path))
                File.Delete(HostingEnvironment.MapPath("~") + path);
        }
    }
    catch (Exception ex)
    {
        return false;
    }
    return true;
}
I think that the way I'm saving the files may cause them to be locked. This is the code for saving the file:
if (FileUploadCtrl.HasFile)
{
filePath = Server.MapPath("~") + "/Files/" + FileUploadCtrl.FileName;
FileUploadCtrl.SaveAs(filePath)
}
When looking for an answer I've seen someone say that you need to close the StreamReader, but from what I understand the SaveAs method closes and disposes automatically, so I really have no idea what's causing this.
After some testing, I found the problem. It turns out I forgot about a function I made that was called every time I saved a media file. The function returned the duration of the file and used the NAudio.Wave.WaveFileReader and NAudio.Wave.Mp3FileReader methods, which I forgot to close after I called them.
I fixed these issues by putting those methods inside a using statement.
Here is the working function:
public static int GetMediaFileDuration(string filePath)
{
    filePath = HostingEnvironment.MapPath("~") + filePath;

    if (Path.GetExtension(filePath) == ".wav")
        using (WaveFileReader reader = new WaveFileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);
    else if (Path.GetExtension(filePath) == ".mp3")
        using (Mp3FileReader reader = new Mp3FileReader(filePath))
            return Convert.ToInt32(reader.TotalTime.TotalSeconds);

    return 0;
}
The moral of the story is to check whether you are opening the file anywhere else in your project.
I think the problem here is not about the StreamReader.
When you run the program, it runs in a specific folder, and that folder is basically locked by your program. When you close the program, it is unlocked again.
To fix the issue, I would suggest writing/deleting/updating in a different folder.
Another solution could be to check the file's ReadOnly attribute and change it, as explained here.
A last solution could be using different users. What I mean is that if you create a file with a different, non-admin user, you can delete it with an admin user. However, I would definitely not go with this solution, because it is too tricky to manage different users if you are not an advanced Windows user.
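For the ReadOnly case mentioned above, a minimal sketch (the helper name is made up) of clearing the attribute before deleting:

using System.IO;

static class FileCleanup
{
    // Hypothetical helper: strips the ReadOnly flag, if present, before deleting,
    // since File.Delete throws UnauthorizedAccessException on read-only files.
    public static void DeleteEvenIfReadOnly(string path)
    {
        if (!File.Exists(path))
            return;

        FileAttributes attributes = File.GetAttributes(path);
        if ((attributes & FileAttributes.ReadOnly) == FileAttributes.ReadOnly)
        {
            File.SetAttributes(path, attributes & ~FileAttributes.ReadOnly);
        }

        File.Delete(path);
    }
}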

Unhandled DirectoryNotFound exception thrown using DirectoryInfo.EnumerateFiles and "subst" command

Background:
I'm trying to write a little program that recursively goes through all files contained in a user specified folder (and all its subfolders). The program is supposed to check the size of each file and if it matches a user defined value, copy their full path (File.FullName) to a text box control I've setup on the only form of the program.
The trouble is that because of the intended use of the program (working with recovered files and folders from damaged partitions on external drives), often the path length goes well over the maximum path length limit. In order to marginally circumvent this I decided to map the user selected starting directory to a virtual drive using the Win32 command line function "subst".
Incriminated Code:
private void button1_Click(object sender, EventArgs e)
{
    if (txtboxSizeFilter.Text != "") // code will execute only if a size filter has been provided by the user
    {
        if (folderBrowserDialog1.ShowDialog() == DialogResult.OK)
        {
            List<char> driveLetters = new List<char>(26); // Allocate space for alphabet
            for (int i = 65; i < 91; i++) // increment from ASCII values for A-Z
            {
                driveLetters.Add(Convert.ToChar(i)); // Add uppercase letters to possible drive letters
            }
            foreach (string drive in Directory.GetLogicalDrives())
            {
                driveLetters.Remove(drive[0]); // remove used drive letters from possible drive letters
            }

            // gets the next available drive letter to be used as the virtual drive and adds a convenient ":" at the end
            GlobalVars.Drive = driveLetters[0].ToString() + ":";

            // command to be passed to "cmd". E.g. content = subst J: "C:\users\someuser\somelongnamefolder\some longer name folder with spaces"
            string command = "subst " + GlobalVars.Drive + " " + "\"" + folderBrowserDialog1.SelectedPath.ToString() + "\"";

            // launches the command prompt and initiates subst. This section has been tested and works fine.
            System.Diagnostics.Process.Start("cmd.exe", @"/c " + command);

            // here I start at the drive letter that points to the desired directory.
            DirectoryInfo DI = new DirectoryInfo(GlobalVars.Drive);

            // searches for any file in all dirs. THE EXCEPTION "DirectoryNotFound" OBJECT OF THIS QUESTION IS THROWN EXACTLY HERE.
            foreach (var fi in DI.EnumerateFiles("*", SearchOption.AllDirectories))
            {
                try // I need this because I might throw a PathTooLongException despite my use of subst
                {
                    // if the size of the file is = to the target size then I add the full path name to the textbox along with its actual size (for debug purposes only).
                    if (fi.Length == Convert.ToInt64(txtboxSizeFilter.Text))
                    {
                        txtboxResults2Confirm.Text = txtboxResults2Confirm.Text + fi.FullName.ToString() + "_" + fi.Length.ToString() + Environment.NewLine;
                    }
                }
                catch (PathTooLongException) // if the path is too long, indicate it with the codeword "SKIP"
                {
                    txtboxResults2Confirm.Text = txtboxResults2Confirm.Text + "SKIP" + Environment.NewLine;
                }
            }

            btnConfirm.Enabled = true; // enables the other button on the form that will actually delete the files.
        }
    }
    else
    {
        MessageBox.Show("INPUT SIZE FIRST!");
    }
}
Problem/Research/Question:
As specified in the comments and the title, an unhandled DirectoryNotFound exception occurs at the second foreach statement (as indicated by the IDE, Microsoft Visual Studio Ultimate 2013). Doing a step-by-step debug indicates that the foreach loop actually works for a while and then "randomly" throws the exception. I was able to verify that it goes through all the files in the root directory and at least a large part of the first subdirectory without any trouble at all (because of the large number of files/subdirectories I wasn't able to pinpoint exactly where the failure occurs). The error states that the directory not found is the root one (so, to follow the example in the comments: "Unable to find J:\"). I've tried to follow the first catch with another to handle the DirectoryNotFound exception, to no avail, which makes sense since the exception seems to originate directly in the foreach statement. I've also tried to wrap the entire foreach statement in a try-catch framework without any luck. Finally, getting rid of the whole "subst" section and just working with the user-selected path produces no such error.
My question is, why is the exception thrown? And why am I not able to handle it in any way? Is there an alternate approach to this problem that would ensure the avoidance of the DirectoryNotFound exception?
The Process.Start does exactly that: it starts a new process (with its own main thread, etc etc) which may or may not complete by the time your next line of code in the original program executes. A couple of ways to resolve this: a) call DefineDosDevice which will mean the "subst" command runs in the current program thread, or b) grab the pipe to the cmd process and listen on it for completion. Both are of intermediate difficulty, pick your poison.
Of course, the exception you're experiencing occurs because the subst command hasn't completed by the time the code that enumerates the (not-yet-aliased) directory executes.
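If you take the second option, the smallest possible fix is to wait for the cmd/subst process to exit before creating the DirectoryInfo. A sketch, reusing the question's own variables (GlobalVars.Drive, folderBrowserDialog1):

// Sketch: make sure the subst mapping exists before touching the new drive letter.
string command = "subst " + GlobalVars.Drive + " \"" + folderBrowserDialog1.SelectedPath + "\"";

using (var substProcess = System.Diagnostics.Process.Start("cmd.exe", "/c " + command))
{
    substProcess.WaitForExit();   // block until cmd (and therefore subst) has finished
}

DirectoryInfo DI = new DirectoryInfo(GlobalVars.Drive);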

jpegoptim on ASP.Net - "error opening temporary file"

I suspect I'm failing to understand where jpegoptim tries to write its temp files.
I have IIS 7.5 running an ASP.Net 4 AppDomain. In it I have a process that optimizes JPEGs with jpegoptim like so:
FileHelper.Copy(existingPath, optimizerPath);
var jpegOptimResult = await ImageHelper.JpegOptim(optimizerPath, 30);
Running locally I get an optimized image. Running on the above server I get:
D:\www\hplusf.com\b\pc\test.jpg 4096x2990 24bit N Adobe [OK] jpegoptim: error opening temporary file.
I can show the code for FileHelper.Copy(), but it's basically just File.Copy() that overwrites if the file already exists.
Here's ImageHelper.JpegOptim:
public static async Task<string> JpegOptim(string path, int quality)
{
    string jpegOptimPath = Path.GetDirectoryName(new Uri(Assembly
        .GetExecutingAssembly().CodeBase).LocalPath)
        + @"\Lib\jpegoptim.exe";

    var jpegOptimResult = await ProcessRunner.O.RunProcess(
        jpegOptimPath,
        "-m" + quality + " -o -p --strip-all --all-normal \"" + path + "\"",
        false, true
    );

    return jpegOptimResult;
}
jpegOptimResult is what you're seeing there as the error message it's producing. And here's ProcessRunner.RunProcess:
public async Task<string> RunProcess(string command, string args,
    bool window, bool captureOutput)
{
    var processInfo = new ProcessStartInfo(command, args);
    if (!window)
        makeWindowless(processInfo);

    string output = null;
    if (captureOutput)
        output = await runAndCapture(processInfo);
    else
        runDontCapture(processInfo);

    return output;
}

protected void makeWindowless(ProcessStartInfo processInfo)
{
    processInfo.CreateNoWindow = true;
    processInfo.WindowStyle = ProcessWindowStyle.Hidden;
}

protected async Task<string> runAndCapture(ProcessStartInfo processInfo)
{
    processInfo.UseShellExecute = false;
    processInfo.RedirectStandardOutput = true;
    processInfo.RedirectStandardError = true;

    var process = Process.Start(processInfo);

    var output = process.StandardOutput;
    var error = process.StandardError;

    while (!process.HasExited)
    {
        await Task.Delay(100);
    }

    string s = output.ReadToEnd();
    s += '\n' + error.ReadToEnd();

    return s;
}
So:
jpegOptim runs properly on my local machine, and optimizes the file, so it's not how I'm calling jpegOptim.
The Copy operation succeeds without Exception, so it's not a Permissions issue with the ASP.Net user reading/writing from that directory
jpegOptim just optimizes and overwrites the file, so if it is in fact running under the same ASP.Net user, it should have no problem writing this file, but...
It's unclear where jpegOptim attempts to write its temp file, so perhaps the underlying issue is where this temporary file is being written.
However, judging by the Windows source:
http://sourceforge.net/p/jpegoptim/code/HEAD/tree/jpegoptim-1.3.0/trunk/jpegoptim.c
jpegOptim's "temporary file" appears to just be the destination file when used with the above options. Relevant lines of jpegOptim source:
int dest = 0;
int main(int argc, char **argv)
{
...
There's some code here looking for the -d argument that sets dest=1 - meaning here dest remains 0. It then hits an if branch, and the else clause, for dest == 0, does this:
if (!splitdir(argv[i],tmpdir,sizeof(tmpdir)))
  fatal("splitdir() failed!");
strncpy(newname,argv[i],sizeof(newname));
That's copying the directory name portion of the input image filename to the variable tmpdir - so like C:\Blah\18.jpg would assign tmpdir="C:\Blah\". Then it dumps the entire input image filename to newname, meaning it's just going to overwrite it in place.
At this point in the code the variables it's using should be:
dest=0
argv[i]=D:\www\hplusf.com\b\pc\test.jpg
tmpdir=D:\www\hplusf.com\b\pc\
newname=D:\www\hplusf.com\b\pc\test.jpg
It then in fact opens the file, and there's an opportunity to error out there, suggesting jpegoptim is successfully opening the file. It also decompresses the file, further confirming it's successfully opening it.
The specific error message I'm seeing occurs in these lines - I'll confess I don't know if MKSTEMPS is set or not for a default build (which I'm using):
snprintf(tmpfilename,sizeof(tmpfilename),
         "%sjpegoptim-%d-%d.XXXXXX.tmp", tmpdir, (int)getuid(), (int)getpid());
#ifdef HAVE_MKSTEMPS
  if ((tmpfd = mkstemps(tmpfilename,4)) < 0)
    fatal("error creating temp file: mkstemps() failed");
  if ((outfile=fdopen(tmpfd,"wb"))==NULL)
#else
  tmpfd=0;
  if ((outfile=fopen(tmpfilename,"wb"))==NULL)
#endif
    fatal("error opening temporary file");
So snprintf is like C# String.Format(), which should produce a path like:
D:\www\hplusf.com\b\pc\jpegoptim-1-2.XXXXXX.tmp
Judging by what I can find, it's likely MKSTEMPS is not defined, meaning fopen is being called with "wb" (writing a binary file) and returning null (it failed to open), and out comes the error message.
So - possible causes:
Bad path in tmpdir: It's possible I'm following the C poorly (likely), but from the looks of it, tmpdir should be identical to the source path of the image. Perhaps it's mangled by jpegoptim? The input path is clearly clean, because jpegoptim actually emits it cleanly in the error message.
Permissions issue: Seems fairly unlikely. The ASP.Net user this is running under can clearly read and write, because it copies to the dir before jpegoptim fires, and the only user on the machine with any permissions to this dir is that user, so jpegoptim should have failed prior to this point if it were permissions. It could be attempting to access a different dir, but that would really be the bad tmpdir scenario.
Something else I've not thought of.
Ideas?
Note: This question is similar:
Using jpegtran, jpegoptim, or other jpeg optimization/compression in C#
However, that question is asking about a shared env on GoDaddy, causing answers to spiral around the likelihood he can't spin up processes. We have full control over our server, and as should be clear from the above, the jpegoptim Process is definitely starting successfully, so it's a different scenario.
As it turns out my reading of jpegoptim was incorrect. The tmpdir it uses is where the executable's Working Directory points to, not where the input images are, and not where the executable sits. So, the solution was 2-fold:
Give the exe permissions to write to its own directory* (but deny it access to modify itself)
Modify ProcessRunner to run processes in-place - set the Working Directory to where the exe resides.
The second modification looks like this:
var processInfo = new ProcessStartInfo(command, args);
// Ensure the exe runs in the path where it sits, rather than somewhere
// less safe like the website root
processInfo.WorkingDirectory = (new FileInfo(command)).DirectoryName;
*Note: I happen to have jpegoptim.exe isolated on the server to its own dir to limit risk. If you had it someplace more global like Program Files, you definitely should not do this - instead set the Working Directory as above, but to someplace isolated/safe like a tmp dir or even better a scratch disk. If you've got the RAM for it a RAMdrive would be fastest.
**Second Note: Because of how hard drives and jpegoptim work, if the tmp location is not on the same disk as the ultimate destination of the output, there is a potential, partial race condition between jpegoptim and other code you might be using that depends on its outputs. In particular, if you use the same disk, when jpegoptim is done the output JPEG is complete - the OS changes the entry in its file table, but the data for the image on the hard drive has already been written to completion. When tmp and destination are separate disks, jpegoptim finishes by telling the OS to move the file from the tmpdir to the output dir. That's a data move that finishes sometime after jpegoptim is done running. If your waiting code is fast enough, it will start its work with an incomplete JPEG.

getting file name and moving it

string fName = Path.GetFileName(tempPaths[z]);
if (!File.Exists(subAch + fName))
{
    File.Move(tempPaths[z], subAch + fName);
    Console.WriteLine("moved!!! from " + tempPaths[z] + " tooooo ");
}
tempPaths is a list with all the image file paths. e.g. ./images/image4.jpg
subAch is a directory string.
I wish to get the file name of each file and then move it to another directory. But with the code above I kept getting the error: file is being used by another process.
Is there any way to get the file name and move the files? I have tried FileStream but was confused by it.
Please advise.
Thank you!
Your code should work just fine. You just need to figure out who is locking the files.
I'd put the code inside the if-block in a try-catch block to deal with the locked files.
I would also recommend using Path.Combine instead of dir + file.
One thing: you are checking if subAch + tempPaths[z] exists, yet you are copying to a different location; subAch + fName.
File is being used by another process means exactly that. Someone/something is already using the file, so it can't be moved. You can always catch the error and move everything else.
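Putting those suggestions together, a sketch of the loop body (variable names taken from the question) might look like this:

// Sketch: Path.Combine for the destination and a try/catch so a single
// locked file doesn't stop the rest of the loop.
string fName = Path.GetFileName(tempPaths[z]);
string destination = Path.Combine(subAch, fName);

try
{
    if (!File.Exists(destination))
    {
        File.Move(tempPaths[z], destination);
        Console.WriteLine("Moved from " + tempPaths[z] + " to " + destination);
    }
}
catch (IOException ex)
{
    // The file is still locked by another process; skip it and carry on.
    Console.WriteLine("Could not move " + tempPaths[z] + ": " + ex.Message);
}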
I have used a non-ideal way to grab the file names and move the files to another place.
tempPaths.AddRange(Directory.GetFiles(rawStorePath, filter, SearchOption.AllDirectories));
The code above gets the full paths of all the files under the folder. The outcome will be something like this (tempPaths is a List):
"./images/glass_numbers_5.jpg"
"./images/G.JPG"
"./images/E.JPG"
"./images/F.JPG"
"./images/glass_numbers_0.jpg"
"./images/C.JPG"
"./images/B.JPG"
"./images/A.JPG"
"./images/D.JPG"
"./images/glass_numbers_7.jpg"
Then I use a loop to grab the file names.
for (int i = 0; i < tempPaths.Count; i++)
{
    // Getting the original names of the images
    int pLength = rawStorePath.Length;
    string something = tempPaths[i].Remove(0, pLength);
    if (!_tfileName.ContainsKey(tempPaths[i]))
    {
        _tfileName.Add(tempPaths[i], something);
    }
}
rawStorePath is the targeted path, e.g. ./images/
tempPaths[i] is e.g. ./images/G.JPG
So using that length I remove the leading characters and get the file name back.
Please advise me of an ideal way to do this if there is one.
Thanks!
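For reference, if only the file name itself is needed, Path.GetFileName can replace the Remove(0, pLength) arithmetic. A sketch using the question's tempPaths list and _tfileName dictionary:

// Sketch: Path.GetFileName returns just the name portion, e.g. "G.JPG",
// without any string-length arithmetic.
foreach (string fullPath in tempPaths)
{
    string nameOnly = System.IO.Path.GetFileName(fullPath);
    if (!_tfileName.ContainsKey(fullPath))
    {
        _tfileName.Add(fullPath, nameOnly);
    }
}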

File.Copy and WPF

I have a little problem with the File.Copy method in WPF; my code is very simple, and I get an exception when I run it:
Could not find a part of the path 'Images\37c31987-52ee-4804-8601-a7b9b4d439fd.png'.
where Images is a relative folder.
Here is my code; as I said it is simple, and the same code works fine in a console application, no problem at all.
string filename = System.IO.Path.Combine(images, Guid.NewGuid().ToString() + System.IO.Path.GetExtension(imageFile));
System.IO.File.Copy(imageFile, filename);
this.ImageLocation = string.Empty;
So if any can help, thanks.
Does the images folder exist? File.Copy doesn't create it automatically.
Do you know what your current directory is? File open/save boxes can change that. So it's always safer to work with absolute paths.
Do a
Path.GetFullPath(filename)
and see where that points to. Is it the right location?
If you use the absolute instead of the relative path, does it work then?
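A sketch combining both suggestions, checking that the folder exists and working from an absolute path (the base-directory choice is an assumption; imageFile comes from the question):

// Sketch: resolve the Images folder against a known base directory and
// create it if missing, so File.Copy never sees a non-existent relative path.
string imagesDir = System.IO.Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "Images");

if (!System.IO.Directory.Exists(imagesDir))
{
    System.IO.Directory.CreateDirectory(imagesDir);
}

string destination = System.IO.Path.Combine(imagesDir,
    Guid.NewGuid().ToString() + System.IO.Path.GetExtension(imageFile));
System.IO.File.Copy(imageFile, destination);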
Before you access a file, you should call System.IO.File.Exists(). It's not clear from your error description if the origin file exists or not before the copy.
If you don't specify an absolute path, your relative path will often be resolved from unexpected places, usually the current working directory of the process. Calling this method may tell you where the process is currently running:
System.IO.Directory.GetCurrentDirectory()
You should never make assumptions about the current working directory of a running process as the user could start your program from anywhere. Even if you think you always control the current working directory, you will be surprised how often you will be wrong.
Do you have a debugger? Why not insert a breakpoint and check the values used at each step?
If the file system says "cannot find file", I wouldn't bother arguing with it...
Use \\ for the file path directory if it is local. If your file exists on a network path, use \\\\ (at the start) so that it looks for a network drive.
Thanks
It is necessary to embed all external files into the executable and change your code to work with these embedded files rather than to expect files on the disk.
To use images or whatever files you need (xml/txt/doc), you need to set the build action of your file to Embedded Resource, and call the method with the fully qualified name of the file, where the name is assembled like this:
[RootNameSpaceOfTheProject].[NameOfFolderInTheProject].[FileNameWithExtension]
Example:
Call the method:
var b = ResourceOperations.GetResourceAsByteArray("Store.Resources.EmbeddedIcons.toolbox.png");
Now you can write the byte array to a temporary file for example and use this as an image source, or you can build an image from the byte array directly. At least, you've got your data...
And to save these files to disk, we can use code by @Jon Skeet:
public static void CopyStream(Stream input, Stream output)
{
    // Insert null checking here for production
    byte[] buffer = new byte[8192];
    int bytesRead;
    while ((bytesRead = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        output.Write(buffer, 0, bytesRead);
    }
}
then call it:
using (Stream input = assembly.GetManifestResourceStream(resourceName))
using (Stream output = File.Create(path))
{
    CopyStream(input, output);
}
