Make a collection available to two processes - C#

Hi guys, I have a dictionary which has to be shared between two different .exe files. The first application creates a key and stores it in the dictionary, then the other application creates a key and stores it in the dictionary.
At the moment I do this:
private static void WriteToFile(Dictionary<string, byte[]> dictionary, string path)
{
Contract.Requires(dictionary != null);
Contract.Requires(!string.IsNullOrEmpty(path));
if (!(timestamp == File.GetLastWriteTime(DatabasePath)))
{
using (FileStream fs = File.OpenWrite(path))
using (var writer = new BinaryWriter(fs))
{
// Put count.
writer.Write(dictionary.Count);
// Write pairs.
foreach (var pair in dictionary)
{
writer.Write(pair.Key);
writer.Write(pair.Value);
}
timestamp = DateTime.Now;
File.SetLastWriteTime(DatabasePath, timestamp);
}
}
}
/// <summary>
/// This is used to read a dictionary from a file
/// http://www.dotnetperls.com/dictionary-binary
/// </summary>
/// <param name="path">The path to the file</param>
/// <returns>The dictionary read from the file</returns>
private static Dictionary<string, byte[]> ReadFromFile(string path)
{
Contract.Requires(!string.IsNullOrEmpty(path));
var result = new Dictionary<string, byte[]>();
using (FileStream fs = File.OpenRead(path))
using (var reader = new BinaryReader(fs))
{
// Determine the amount of key value pairs to read.
int count = reader.ReadInt32();
// Read in all the pairs.
for (int i = 0; i < count; i++)
{
string key = reader.ReadString();
// The byte count is hardcoded because the key size is consistent.
byte[] value = reader.ReadBytes(513);
result[key] = value;
}
}
return result;
}
Then when I want to store a key I call this method:
public static bool StoreKey(byte[] publicKey, string uniqueIdentifier)
{
Contract.Requires(ValidPublicKeyBlob(publicKey));
Contract.Requires(publicKey != null);
Contract.Requires(uniqueIdentifier != null);
Contract.Requires(uniqueIdentifier != string.Empty);
bool success = false;
if (File.Exists(DatabasePath))
{
keyCollection = ReadFromFile(DatabasePath);
}
if (!keyCollection.ContainsKey(uniqueIdentifier))
{
if (!keyCollection.ContainsValue(publicKey))
{
keyCollection.Add(uniqueIdentifier, publicKey);
success = true;
WriteToFile(keyCollection, DatabasePath);
}
}
return success;
}
When both programs have generated their keys and we then try to access them, the dictionary only contains one key. What am I doing wrong? Each key and string is stored perfectly on its own, but I'm afraid the processes are overwriting the file or something.
Thank you very much in advance, any help is greatly appreciated
PS: DatabasePath is the path where I want to save the file, declared as a field.

It is hard to say exactly what is going on, since you haven't provided information such as how many items are in the dictionary, but it seems like you've run into a file access issue when accessing the same file from multiple processes.
You can use a named Mutex as a cross-process synchronization object: before accessing the file, wait until the Mutex is released so you can acquire ownership, and the other process will then wait before accessing the file.
// Create (or open) the named mutex shared by both processes.
Mutex mutex = new Mutex(false, "DictionaryAccessMutex");
mutex.WaitOne();          // Acquire ownership before touching the file.
try
{
    // Read/modify/write the shared file here.
}
finally
{
    mutex.ReleaseMutex(); // Let the other process proceed.
}
EDIT: New finding
You are also trying to write immediately after the read, so perhaps the file system operation has not completed yet and the write fails. I'm not 100% sure about this; the managed classes like File/StreamReader may already handle such cases, but it is worth double-checking since it isn't entirely clear what is happening. Try adding a small delay, such as Thread.Sleep(500), between the read and write operations.
EDIT: One more thing you can do is download the Process Monitor SysInternals utility and see which operations fail when accessing the given file. Just add a filter with Path = file name and you will be able to see what is going on at the low level.

Writing to a file in parallel is generally not the best idea. You have two options here:
Use a mutex for cross-process synchronization to regulate access to the file.
Forward all write requests to a third process that has exclusive ownership of the file and does the actual writing.
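A minimal sketch of the first option, assuming the StoreKey, ReadFromFile and WriteToFile members from the question (the mutex name is arbitrary, and the ContainsValue check is omitted for brevity). The important part is that the re-read, the membership check, and the write all happen while holding the named mutex, so one process cannot silently overwrite the entry the other just added:
private static readonly Mutex DbMutex = new Mutex(false, "Global\\KeyDictionaryMutex");

public static bool StoreKey(byte[] publicKey, string uniqueIdentifier)
{
    DbMutex.WaitOne();
    try
    {
        // Re-read the latest file contents while holding the lock,
        // so changes made by the other process are not lost.
        var keyCollection = File.Exists(DatabasePath)
            ? ReadFromFile(DatabasePath)
            : new Dictionary<string, byte[]>();

        if (keyCollection.ContainsKey(uniqueIdentifier))
        {
            return false;
        }

        keyCollection.Add(uniqueIdentifier, publicKey);
        WriteToFile(keyCollection, DatabasePath);
        return true;
    }
    finally
    {
        DbMutex.ReleaseMutex();
    }
}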

So Process 1 loads the dictionary, adds an item, and calls write.
Process 2 loads the dictionary, adds an item, and calls write.
You get whichever one writes second, and you don't know which one that will be.
Trying to make this work is way more trouble than it's worth, and it will be about as future-proof as an inflatable dartboard.
Use a mutex at a push, or a third process to maintain the dictionary.


How to read and write more then 25000 records/lines into text file at a time?

I am connecting my application to a stock market live data provider using a web socket. When the market is live and the socket is open, it gives me nearly 45,000 lines a minute. I deserialize it line by line, write each line into a text file, and also read the text file and remove its first line. Because of this, handling the other work alongside the socket becomes slow. So please, can you help me make this process fast enough to handle roughly 25,000 lines a minute?
string filePath = @"D:\Aggregate_Minute_AAPL.txt";
var records = (from line in File.ReadLines(filePath).AsParallel()
select line);
List<string> str = records.ToList();
str.ForEach(x =>
{
string result = x;
result = result.TrimStart('[').TrimEnd(']');
var jsonString = Newtonsoft.Json.JsonConvert.DeserializeObject<List<LiveAMData>>(result);
foreach (var item in jsonString)
{
string value = "";
string dirPath = @"D:\COMB1\MinuteAggregates";
string[] fileNames = null;
fileNames = System.IO.Directory.GetFiles(dirPath, item.sym+"_*.txt", System.IO.SearchOption.AllDirectories);
if(fileNames.Length > 0)
{
string _fileName = fileNames[0];
var lineList = System.IO.File.ReadAllLines(_fileName).ToList();
lineList.RemoveAt(0);
var _item = lineList[lineList.Count - 1];
if (!_item.Contains(item.sym))
{
lineList.RemoveAt(lineList.Count - 1);
}
System.IO.File.WriteAllLines((_fileName), lineList.ToArray());
value = $"{item.sym},{item.s},{item.o},{item.h},{item.c},{item.l},{item.v}{Environment.NewLine}";
using (System.IO.StreamWriter sw = System.IO.File.AppendText(_fileName))
{
sw.Write(value);
}
}
}
});
How can I make this faster? With all this processing the application only gets through about 3,000 to 4,000 symbols, whereas without it it executes 25,000 lines per minute. So how do I increase the number of lines processed per minute with all this code?
First you need to clean up your code to gain more visibility. I did a quick refactor and this is what I got:
const string FilePath = @"D:\Aggregate_Minute_AAPL.txt";
class SomeClass
{
public string Sym { get; set; }
public string Other { get; set; }
}
private void Something() {
File
.ReadLines(FilePath)
.AsParallel()
.Select(x => x.TrimStart('[').TrimEnd(']'))
.Select(JsonConvert.DeserializeObject<List<SomeClass>>)
.ForAll(WriteRecord);
}
private const string DirPath = @"D:\COMB1\MinuteAggregates";
private const string Separator = @",";
private void WriteRecord(List<SomeClass> data)
{
foreach (var item in data)
{
var fileNames = Directory
.GetFiles(DirPath, item.Sym+"_*.txt", SearchOption.AllDirectories);
foreach (var fileName in fileNames)
{
var fileLines = File.ReadAllLines(fileName)
.Skip(1).ToList();
var lastLine = fileLines.Last();
if (!lastLine.Contains(item.Sym))
{
fileLines.RemoveAt(fileLines.Count - 1);
}
fileLines.Add(
new StringBuilder()
.Append(item.Sym)
.Append(Separator)
.Append(item.Other)
.Append(Environment.NewLine)
.ToString()
);
File.WriteAllLines(fileName, fileLines);
}
}
}
From here it should be easier to play with AsParallel to check how, and with which parameters, the code runs faster (see the sketch after the list below).
Also:
You are opening the written file twice.
The removes are also somewhat expensive, especially at index 0 (although if there are few elements it may not make much difference).
if (fileNames.Length > 0) is unnecessary; if the array is empty the loop simply won't execute.
You can try StringBuilder instead of string interpolation.
I hope these hints help you improve your time, and that I have not forgotten anything.
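For example, a minimal sketch of capping the degree of parallelism on the pipeline above; the value 4 is just a placeholder to experiment with and measure:
File
    .ReadLines(FilePath)
    .AsParallel()
    .WithDegreeOfParallelism(4) // placeholder: tune this number and measure
    .Select(x => x.TrimStart('[').TrimEnd(']'))
    .Select(JsonConvert.DeserializeObject<List<SomeClass>>)
    .ForAll(WriteRecord);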
Edit
We have nearly 10,000 files in our directory. So when process is
running then it's passing an error that The Process can not access the
file because it is being used by another process
Well, is there a possibility that duplicate file names show up among the lines you process?
If that is the case, you could try a simple approach: retry after a few milliseconds, something like
private const int SleepMillis = 5;
private const int MaxRetries = 3;
public void WriteFile(string fileName, string[] fileLines, int retries = 0)
{
try
{
File.WriteAllLines(fileName, fileLines);
}
catch(Exception e) //Catch the special type if you can
{
if (retries >= MaxRetries)
{
Console.WriteLine("Too many tries with no success");
throw; // rethrow exception
}
Thread.Sleep(SleepMillis);
WriteFile(fileName, fileLines, ++retries); // try again
}
}
I tried to keep it simple, but there are some notes:
- If you can make your methods async, it could be an improvement to replace the sleep with a Task.Delay, but you need to know and understand well how async works.
- If the collision happens a lot, then you should try another approach, something like a concurrent map of semaphores keyed by file name (see the sketch below).
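A minimal sketch of that idea, assuming the file name is the key and all writers live in the same process; the class and method names are placeholders:
using System.Collections.Concurrent;
using System.IO;
using System.Threading;

public static class FileGate
{
    // One semaphore per file name, created on first use.
    private static readonly ConcurrentDictionary<string, SemaphoreSlim> Locks =
        new ConcurrentDictionary<string, SemaphoreSlim>();

    public static void WriteAllLinesExclusive(string fileName, string[] fileLines)
    {
        var gate = Locks.GetOrAdd(fileName, _ => new SemaphoreSlim(1, 1));
        gate.Wait();
        try
        {
            File.WriteAllLines(fileName, fileLines);
        }
        finally
        {
            gate.Release();
        }
    }
}
Note that this only serializes writers inside one process; if several processes touch the same files, you are back to a named Mutex or the retry approach above.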
Second edit
In real scenario I am connecting to websocket and receiving 70,000 to
1 lac records on every minute and after that I am bifurcating those
records with live streaming data and storing in it's own file. And
that becomes slower when I am applying our concept with 11,000 files
It is a hard problem. From what I understand, you're talking about roughly 1,166 records per second, and at that size the little details can become big bottlenecks.
At that point I think it is better to consider other solutions; it could be too much I/O for the disk, too many threads or too few, the network...
You should start by profiling the app to check where it spends the most time, and focus on that area. How many resources is it using? How many resources do you have? How are the memory, processor, garbage collector and network doing? Do you have an SSD?
You need a clear view of what is slowing you down so you can attack it directly. It will depend on a lot of things, and it will be hard to help with that part.
There are tons of tools for profiling C# apps, and many ways to attack this problem (spread the load across several servers, use something like Redis to save data really quickly, use some kind of event store so you can work with events, ...).

How to set a dynamic number of threadCounter variables?

I'm not really into multithreading, so the question is probably naive, but it seems I cannot find a way to solve this problem (especially because I'm using C# and have only been using it for a month).
I have a dynamic number of directories (obtained from a query against the DB). Inside those directories there are a certain number of files.
For each directory I need to transfer these files over FTP concurrently, because I basically have no limit on FTP connections (not my words, it's written in the specification).
But I still need to control the maximum number of files transferred per directory, so I need to count the files I'm transferring (increment/decrement).
How could I do it? Should I use something like an array and the Monitor class?
Edit: Framework 3.5
You can use the Semaphore class to throttle the number of concurrent files per directory. You would probably want to have one semaphore per directory so that the number of FTP uploads per directory can be controlled independently.
public class Example
{
public void ProcessAllFilesAsync()
{
var semaphores = new Dictionary<string, Semaphore>();
foreach (string filePath in GetFiles())
{
string filePathCapture = filePath; // Needed to perform the closure correctly.
string directoryPath = Path.GetDirectoryName(filePath);
if (!semaphores.ContainsKey(directoryPath))
{
int allowed = NUM_OF_CONCURRENT_OPERATIONS;
semaphores.Add(directoryPath, new Semaphore(allowed, allowed));
}
var semaphore = semaphores[directoryPath];
ThreadPool.QueueUserWorkItem(
(state) =>
{
semaphore.WaitOne();
try
{
DoFtpOperation(filePathCapture);
}
finally
{
semaphore.Release();
}
}, null);
}
}
}
var allDirectories = db.GetAllDirectories();
foreach(var directoryPath in allDirectories)
{
DirectoryInfo directories = new DirectoryInfo(directoryPath);
//Loop through every file in that Directory
foreach(var fileInDir in directories.GetFiles()) {
//Check if we have reached our max limit
if (numberFTPConnections == MAXFTPCONNECTIONS){
Thread.Sleep(1000);
}
//code to copy to FTP
//This can be Aync, when then transfer is completed
//decrement the numberFTPConnections so then next file can be transfered.
}
}
You can try something along the lines above. Note that it's just the basic logic and there are probably better ways to do this.
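For the counter itself, a minimal sketch using Interlocked so the increment and decrement are safe across threads. numberFTPConnections and MAXFTPCONNECTIONS are the names from the snippet above, and UploadViaFtp is a placeholder for the actual transfer; note that the check-then-increment is not atomic, so it can briefly overshoot the limit, which is why the Semaphore approach in the first answer is stricter:
// Wait (politely) until a slot looks free.
while (Thread.VolatileRead(ref numberFTPConnections) >= MAXFTPCONNECTIONS)
{
    Thread.Sleep(100);
}
Interlocked.Increment(ref numberFTPConnections);
try
{
    UploadViaFtp(fileInDir); // placeholder for the actual FTP transfer
}
finally
{
    Interlocked.Decrement(ref numberFTPConnections);
}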

Modify Embedded String in C# compiled exe

I have an issue where I need a compiled exe (.NET 3.5, C#) that I will make copies of to distribute, where, for example, a key needs to be changed before each exe is sent out.
I cannot compile each time a new exe is needed. This is a thin client that will be used as part of a registration process.
Is it possible to add an entry to a resource file with a blank value, and then, when a request comes in, have another application grab the blank default thin client, copy it, and populate the blank value with the data needed?
If yes, how? If no, do you have any ideas? I have been scratching my head for a few days now, and the limitations are due to the boundaries I am required to work within.
The other idea I had was to inject the value into a method, which I have no idea how I would even attempt.
Thanks.
Convert the assembly to IL, do a textual search and replace, and recompile the IL to an assembly again. Use the standard tools from the .NET SDK.
Instead of embedding the key in the assembly, put it in the app.config file (or another file delivered with the application) and prevent your application from running if the key is not present and valid. To protect it against modification by users, also add an RSA signature to the config file.
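A minimal sketch of the config side of that idea, assuming an appSettings entry named LicenseKey (the key name is an assumption) and the CheckLicenseSignature method shown further below; it needs a reference to System.Configuration:
// app.config:
// <configuration>
//   <appSettings>
//     <add key="LicenseKey" value="...signed license XML here..." />
//   </appSettings>
// </configuration>

// e.g. at the top of Main():
string licenseKey = System.Configuration.ConfigurationManager.AppSettings["LicenseKey"];
if (string.IsNullOrEmpty(licenseKey) || !CheckLicenseSignature(licenseKey))
{
    Console.WriteLine("Missing or invalid license key; exiting.");
    return;
}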
This code could be used to generate XML containing your key.
public static void Main()
{
Console.WriteLine(GenerateKey());
}
public static Byte[] Transform(Byte[] bytes, ICryptoTransform xform)
{
using (System.IO.MemoryStream stream = new System.IO.MemoryStream())
{
using (CryptoStream cstream = new CryptoStream(stream, xform, CryptoStreamMode.Write))
{
cstream.Write(bytes, 0, bytes.Length);
cstream.Close();
stream.Close();
return stream.ToArray();
}
}
}
public static string GenerateKey()
{
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider();
// This is the private key and should never be shared.
// Generate your own with RSA.Create().ToXmlString(true).
String rsaPrivateKey = "<RSAKeyValue><Modulus>uPCow37yEzlKQXgbqO9E3enSOXY1MCQB4TMbOZyk9eXmc7kuiCMhJRbrwild0LGO8KE3zci9ETBWVVSJEqUqwtZyfUjvWOLHrf5EmzribtSU2e2hlsNoB2Mu11M0SaGd3qZfYcs2gnEnljfvkDAbCyJhUlxmHeI+35w/nqSCjCk=</Modulus><Exponent>AQAB</Exponent><P>4SMSdNcOP0qAIoT2qzODgyl5yu9RubpIU3sSqky+85ZqJHXLUDjlgqAZvT71ROexJ4tMfMOgSWezHQwKWpz3sw==</P><Q>0krr7cmorhWgwCDG8jmzLMo2jafAy6tQout+1hU0bBKAQaPTGGogPB3hTnFIr84kHcRalCksI6jk4Xx/hiw+sw==</Q><DP>DtR9mb60zIx+xkdV7E8XYaNwx2JeUsqniwA3aYpmpasJ0N8FhoJI9ALRzzp/c4uDiuRNJIbKXyt6i/ZIFFH0qw==</DP><DQ>mGCxlBwLnhkN4ind/qbQriPYY8yqZuo8A9Ggln/G/IhrZyTOUWKU+Pqtx6lOghVdFjSxbapn0W8QalNMFGz7AQ==</DQ><InverseQ>WDYfqefukDvMhPHqS8EBFJFpls/pB1gKsEmTwbJu9fBxN4fZfUFPuTnCIJsrEsnyRfeNTAUFYl3hhlRYZo5GiQ==</InverseQ><D>qB8WvAmWFMW67EM8mdlReI7L7jK4bVf+YXOtJzVwfJ2PXtoUI+wTgH0Su0IRp9sR/0v/x9HZlluj0BR2O33snQCxYI8LIo5NoWhfhkVSv0QFQiDcG5Wnbizz7w2U6pcxEC2xfcoKG4yxFkAmHCIkgs/B9T86PUPSW4ZTXcwDmqU=</D></RSAKeyValue>";
rsa.FromXmlString(rsaPrivateKey);
String signedData = "<SignedData><Key>Insert your key here</Key></SignedData>";
Byte[] licenseData = System.Text.Encoding.UTF8.GetBytes(signedData);
Byte[] sigBytes = rsa.SignData(licenseData, new SHA1CryptoServiceProvider());
String sigText = System.Text.Encoding.UTF8.GetString(Transform(sigBytes, new ToBase64Transform()));
System.Text.StringBuilder sb = new StringBuilder();
using (System.Xml.XmlWriter xw = System.Xml.XmlTextWriter.Create(sb))
{
xw.WriteStartElement("License");
xw.WriteRaw(signedData);
xw.WriteElementString("Signature", sigText);
xw.WriteEndElement();
}
return sb.ToString();
}
Example output from this code:
<?xml version="1.0" encoding="utf-16"?>
<License>
<SignedData>
<Key>Insert your key here</Key>
</SignedData>
<Signature>cgpmyqaDlHFetCZbm/zo14NEcBFZWaQpyHXViuDa3d99AQ5Dw5Ya8C9WCHbTiGfRvaP4nVGyI+ezAAKj287dhHi7l5fQAggUmh9xTfDZ0slRtvYD/wISCcHfYkEhofXUFQKFNItkM9PnOTExZvo75pYPORkvKBF2UpOIIFvEIU=</Signature>
</License>
Then you can use code like this to verify it. You never have to distribute the private key:
public static Boolean CheckLicenseSignature(String licXml)
{
try
{
System.Xml.XmlDocument xd = new System.Xml.XmlDocument();
xd.LoadXml(licXml);
String licSig = xd.SelectSingleNode("/License/Signature").InnerText;
RSACryptoServiceProvider rsa = new RSACryptoServiceProvider();
String rsaPublicKey = "<RSAKeyValue><Modulus>uPCow37yEzlKQXgbqO9E3enSOXY1MCQB4TMbOZyk9eXmc7kuiCMhJRbrwild0LGO8KE3zci9ETBWVVSJEqUqwtZyfUjvWOLHrf5EmzribtSU2e2hlsNoB2Mu11M0SaGd3qZfYcs2gnEnljfvkDAbCyJhUlxmHeI+35w/nqSCjCk=</Modulus><Exponent>AQAB</Exponent></RSAKeyValue>";
rsa.FromXmlString(rsaPublicKey);
Byte[] licenseData = System.Text.Encoding.UTF8.GetBytes(xd.SelectSingleNode("/License/SignedData").OuterXml);
return rsa.VerifyData(licenseData, new SHA1CryptoServiceProvider(), Transform(System.Text.Encoding.UTF8.GetBytes(licSig), new FromBase64Transform()));
}
catch (System.Xml.XmlException ex)
{
return false;
}
catch (InvalidOperationException ex)
{
return false;
}
}
From within the capability of the .NET code itself, I'm not sure this is doable. But it is possible to dynamically generate a .NET DLL which contains some key that can be referenced from the main application; that is, if you don't mind a second file in the distribution.
Or, if you don't mind using Ildasm to disassemble the .exe, changing the key, and then using Ilasm to reassemble it, you can automate that.
The accepted answer is garbage! I have done this successfully, and it is much easier.
Just build your base .NET application that needs the key with a string resource filled with "XXXXXXXXXXXXXXX" (more X's than you'll need).
.NET resources are usually kept near the top of the binary, so you will find them fast; skipping the first 100,000 bytes worked in my case.
Then you just read the file in and look for those X's. When you find them, you replace them with the real API key and replace the rest of the X's with spaces, which you simply trim off in code. This is the answer. It works and it works well.
ApiToken at = new ApiToken(UserId, SelectedCID);
at.MakeToken();
// Load the unkeyed template exe.
byte[] app = System.IO.File.ReadAllBytes(Path.Combine(AppDomain.CurrentDomain.GetData("DataDirectory").ToString(), "notkeyedapp.exe"));
// Skip the first 100,000 bytes, then look for the run of 'X' (0x58) placeholder bytes.
for (int i = 100000; i < app.Length; i++)
{
    if (app[i] == 0x58 && app[i + 1] == 0x58 && app[i + 2] == 0x58)
    {
        // Overwrite the 128-byte placeholder: the token first, then pad with spaces (0x20).
        for (int j = 0; j < 128; j++)
        {
            if (at.Token.Length >= j + 1)
                app[i + j] = System.Text.Encoding.ASCII.GetBytes(at.Token[j].ToString())[0];
            else
                app[i + j] = 0x20;
        }
        break;
    }
}
string filename = "SoftwareProduct for - " + BaseModel.CompanyName.Replace(".", "") + ".exe";
return File(app, System.Net.Mime.MediaTypeNames.Application.Octet, filename);
I don't think you can get away without recompiling your .exe and having the key embedded into said .exe. The compilation process can be automated, though, via ildasm.exe and ilasm.exe, as Daniel Earwicker suggested in his response https://stackoverflow.com/a/2742902/2358659
I'd like to expand on that in case anyone else stumbles across this topic in the future.
I recently faced a similar problem due to my poor source code version control habits. In a nutshell, I had an executable that was supposed to write some data to a Google Spreadsheet by referencing its ID. Long after the executable was released, another request came from a different team to use the tool, but it had to write the same information into a different spreadsheet in order to keep the two teams' data separate. At the time I did not have the original source code, hence I was not able to change the static variable holding the original spreadsheet ID. What I did was as follows:
Using CMD.exe → call "C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\ildasm.exe" "myApplication.exe" /out="myApplication.il"
Using Notepad++ → Find and replace the original ID with the new ID inside the myApplication.il file. This step can also be automated by writing your own C# application, using PowerShell, a VB/J-script, or some other off-the-shelf find-and-replace tool such as FART (using CMD.exe → call fart.exe myApplication.il "OldKey" "NewKey")
Using CMD.exe → call "C:\Windows\Microsoft.NET\Framework\v4.0.30319\ilasm.exe" "myApplication.il" /res="myApplication.res" /key="myApplicationKeyFile.snk"
As you can see, all of these steps can be put into one .bat file that takes "NewKey" as an input and produces a new .exe with NewKey embedded.
I hope that helps.
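The same three steps can also be driven from C#. A rough sketch, where the tool paths, file names, and the old/new key values are placeholders for your environment, and the text replace assumes the key appears as plain text in the .il file:
using System.IO;

static void ReplaceKey(string exePath, string oldKey, string newKey)
{
    const string ildasm = @"C:\Program Files (x86)\Microsoft SDKs\Windows\v8.0A\bin\NETFX 4.0 Tools\ildasm.exe"; // placeholder path
    const string ilasm = @"C:\Windows\Microsoft.NET\Framework\v4.0.30319\ilasm.exe";                             // placeholder path
    string ilPath = Path.ChangeExtension(exePath, ".il");

    Run(ildasm, "\"" + exePath + "\" /out=\"" + ilPath + "\"");
    File.WriteAllText(ilPath, File.ReadAllText(ilPath).Replace(oldKey, newKey));
    Run(ilasm, "\"" + ilPath + "\""); // add /res= and /key= switches here if your build needs them
}

static void Run(string tool, string arguments)
{
    using (var process = System.Diagnostics.Process.Start(tool, arguments))
    {
        process.WaitForExit();
    }
}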
What comes to my mind, but I have not tried it yet: create a default string in your program, for example as
static public string regGuid = "yourguidhere";
Then search the compiled EXE with any decent hex editor. If you find the string, replace it with another value as a test. If you can still execute the program, you could try to automate this process, and voilà, there you are.

Marshalling C# com items when using recursion

I am using the SourceSafe COM object (SourceSafeTypeLib) from C# to automate a SourceSafe recursive get (part of a larger build process). The recursive function is shown below. How do I ensure that all the COM objects created in the foreach loop get released correctly?
/// <summary>
/// Recursively gets files/projects from SourceSafe (this is a recursive function).
/// </summary>
/// <param name="vssItem">The VSSItem to get</param>
private void GetChangedFiles(VSSItem vssItem)
{
// 'If the object is a file perform the diff,
// 'If not, it is a project, so use recursion to go through it
if(vssItem.Type == (int)VSSItemType.VSSITEM_FILE)
{
bool bDifferent = false; //file is different
bool bNew = false; //file is new
//Surround the diff in a try-catch block. If a file is new(doesn't exist on
//the local filesystem) an error will be thrown. Catch this error and record it
//as a new file.
try
{
bDifferent = vssItem.get_IsDifferent(vssItem.LocalSpec);
}
catch
{
//File doesn't exist
bDifferent = true;
bNew = true;
}
//If the File is different(or new), get it and log the message
if(bDifferent)
{
if(bNew)
{
clsLog.WriteLine("Getting " + vssItem.Spec);
}
else
{
clsLog.WriteLine("Replacing " + vssItem.Spec);
}
string strGetPath = vssItem.LocalSpec;
vssItem.Get(ref strGetPath, (int)VSSFlags.VSSFLAG_REPREPLACE);
}
}
else //Item is a project, recurse through its sub items
{
foreach(VSSItem fileItem in vssItem.get_Items(false))
{
GetChangedFiles(fileItem);
}
}
}
If it is a short-running program and there is nothing to "commit" on the COM side, it is OK to let them go, believe it or not. The GC will come along and properly release the interfaces when it needs to.
If it is a long-running program (like a server component, or one that takes hours and hours to complete), or you need to "commit" or "save" changes, the best bet is to release each VSSItem right after your call to GetChangedFiles(fileItem) in your foreach loop.
Example:
foreach (VSSItem fileItem in vssItem.get_Items(false))
{
GetChangedFiles(fileItem);
// fileItem.Release(); or fileItem.Dispose();
// or even Marshal.ReleaseComObject(fileItem);
}
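A minimal sketch of the explicit-release variant, using Marshal.ReleaseComObject in a try/finally so each child's runtime callable wrapper is released even if the recursive call throws:
foreach (VSSItem fileItem in vssItem.get_Items(false))
{
    try
    {
        GetChangedFiles(fileItem);
    }
    finally
    {
        // Release the COM wrapper for this child item.
        System.Runtime.InteropServices.Marshal.ReleaseComObject(fileItem);
    }
}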

Wipe Free space on hard disk drive using C# [closed]

Closed. This question needs to be more focused. It is not currently accepting answers.
Closed 5 years ago.
I have been tasked with overwriting all the free space on a few laptops 3 times. I know there are some alternatives, but I like to know how things work and whether I can do it myself with C#.
1) Yes, I know there are plenty of freeware applications that will do this.
2) No, we don't need to conform to any specific government standard.
Where do I look for ideas on how to start this?
Thanks if you can point me in the right direction.
Can it be achieved using C#? If so, how?
Simple algorithm:
Create a large text file full of arbitrary text (it is best to use a pre-created file instead of regenerating random data each time, for performance reasons. Test it out.)
Create a clever folder and file naming scheme so you can track the files. You should also track the files with your app, but if it crashes, especially near the end of the first few test runs, you'll want to be able to easily find and clean up your handiwork.
Write it to the HDD until the drive is full.
Delete the files you created.
Repeat the above steps two more times.
Update: more advanced considerations on wiping, per the subsequent discussion:
On the first pass write files filled with 0x00 (all bits off).
On the second pass write files filled with 0xFF (all bits on).
On the last pass repeat with 0x00.
The above ignores a few details, such as the best file size, which depends on your file system anyway. You might also get different behavior from the OS as you near a full HDD...
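A minimal sketch of the fill-and-delete passes described above, under the assumption that simply creating files until the volume is full is acceptable; the folder path, chunk size, and per-file size are placeholders:
using System;
using System.IO;

static void FillPass(string folder, byte fillByte)
{
    Directory.CreateDirectory(folder);
    var buffer = new byte[4 * 1024 * 1024];               // 4 MB chunks (placeholder size)
    for (int i = 0; i < buffer.Length; i++) buffer[i] = fillByte;

    int fileIndex = 0;
    try
    {
        while (true)                                       // keep creating ~1 GB files until the disk is full
        {
            string path = Path.Combine(folder, "wipe_" + fileIndex++ + ".bin");
            using (var fs = new FileStream(path, FileMode.CreateNew, FileAccess.Write))
            {
                for (int chunk = 0; chunk < 256; chunk++)  // 256 * 4 MB = 1 GB per file
                    fs.Write(buffer, 0, buffer.Length);
            }
        }
    }
    catch (IOException)
    {
        // Disk full (or another I/O error): stop writing and move on to cleanup.
    }

    foreach (var file in Directory.GetFiles(folder, "wipe_*.bin"))
        File.Delete(file);                                 // free the space again for the next pass
}

// Usage: three passes as described above.
// FillPass(@"D:\wipe", 0x00);
// FillPass(@"D:\wipe", 0xFF);
// FillPass(@"D:\wipe", 0x00);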
This is really dangerous, but...
You can use the Defrag APIs (here's a C# wrapper) to get hold of the drive 'map' and specifically target the free space, writing junk to those parts of the disk.
Check the SDelete documentation, maybe you can get a clue there.
You're going to have to do some low-level manipulation, so you'll certainly have to talk to the Win32 API. I haven't done this sort of thing, so I can't give you specifics, but a good place to start looking might be the Win32 API reference: http://msdn.microsoft.com/en-us/library/aa383749%28VS.85%29.aspx
I'm really not an expert in this field at all, but to my naive understanding you'll need to:
1) get info on where the filesystem starts and stops
2) using the non-deleted files as a reference, get a list of the physical locations of what should be free space
3) write 0's to those locations
Maybe this isn't a great answer since I'm not an expert in the field, but it was a bit too long for a comment ;) I hope that helps a little.
System.Diagnostics.Process.Start("cipher.exe", @"/w:C:\");
This is asynchronous by default; you get the idea.
This code is from The Code Project, I think. I'm unsure where the original article is, but it does what you asked for.
Based on the comments, I clearly need to spoon-feed a bit more.
You can do this very simply, based on your requirements:
Make one large file that fills the remaining free space on your drive, then simply wipe and delete that file.
Make several files until your drive is full (this might be better if you want to keep using the machine while it's going on). Then wipe each file in turn, so the total time the system sits with a full hard disk drive is smaller than with method 1, though it will likely be a bit slower and need a bit more code.
The advantages are that the code is easy to write and you don't have to play with low-level APIs that will screw you over.
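A minimal sketch of method 1, assuming the drive letter and the filler path are placeholders, and using the WipeFile method from the code that follows to overwrite and then delete the filler file:
using System;
using System.IO;

var drive = new DriveInfo("C");
string fillerPath = @"C:\wipe_filler.bin";           // placeholder location on the drive being wiped
long bytesToFill = drive.AvailableFreeSpace;

using (var fs = new FileStream(fillerPath, FileMode.CreateNew, FileAccess.Write))
{
    var buffer = new byte[4 * 1024 * 1024];
    long written = 0;
    while (written < bytesToFill)
    {
        int chunk = (int)Math.Min(buffer.Length, bytesToFill - written);
        fs.Write(buffer, 0, chunk);
        written += chunk;
    }
}

new Wipe().WipeFile(fillerPath, 3);                  // overwrite the filler three times, then delete it
The WipeFile implementation referenced above: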
using System;
using System.IO;
using System.Security.Cryptography;
namespace QuickStarterShared
{
public class Wipe
{
/// <summary>
/// Deletes a file in a secure way by overwriting it with
/// random garbage data n times.
/// </summary>
/// <param name="filename">Full path of the file to be deleted</param>
/// <param name="timesToWrite">Specifies the number of times the file should be overwritten</param>
public void WipeFile(string filename, int timesToWrite)
{
#if !DEBUG
try
{
#endif
if (File.Exists(filename))
{
// Set the files attributes to normal in case it's read-only.
File.SetAttributes(filename, FileAttributes.Normal);
// Calculate the total number of sectors in the file.
double sectors = Math.Ceiling(new FileInfo(filename).Length/512.0);
// Create a dummy-buffer the size of a sector.
byte[] dummyBuffer = new byte[512];
// Create a cryptographic Random Number Generator.
// This is what I use to create the garbage data.
RNGCryptoServiceProvider rng = new RNGCryptoServiceProvider();
// Open a FileStream to the file.
FileStream inputStream = new FileStream(filename, FileMode.Open);
for (int currentPass = 0; currentPass < timesToWrite; currentPass++)
{
// Go to the beginning of the stream
inputStream.Position = 0;
// Loop all sectors
for (int sectorsWritten = 0; sectorsWritten < sectors; sectorsWritten++)
{
// Fill the dummy-buffer with random data
rng.GetBytes(dummyBuffer);
// Write it to the stream
inputStream.Write(dummyBuffer, 0, dummyBuffer.Length);
}
}
// Truncate the file to 0 bytes.
// This will hide the original file-length if you try to recover the file.
inputStream.SetLength(0);
// Close the stream.
inputStream.Close();
// As an extra precaution I change the dates of the file so the
// original dates are hidden if you try to recover the file.
DateTime dt = new DateTime(2037, 1, 1, 0, 0, 0);
File.SetCreationTime(filename, dt);
File.SetLastAccessTime(filename, dt);
File.SetLastWriteTime(filename, dt);
File.SetCreationTimeUtc(filename, dt);
File.SetLastAccessTimeUtc(filename, dt);
File.SetLastWriteTimeUtc(filename, dt);
// Finally, delete the file
File.Delete(filename);
}
#if !DEBUG
}
catch(Exception e)
{
}
#endif
}
}
# region Events
# region PassInfo
public delegate void PassInfoEventHandler(PassInfoEventArgs e);
public class PassInfoEventArgs : EventArgs
{
private readonly int cPass;
private readonly int tPass;
public PassInfoEventArgs(int currentPass, int totalPasses)
{
cPass = currentPass;
tPass = totalPasses;
}
/// <summary> Get the current pass </summary>
public int CurrentPass { get { return cPass; } }
/// <summary> Get the total number of passes to be run </summary>
public int TotalPasses { get { return tPass; } }
}
# endregion
# region SectorInfo
public delegate void SectorInfoEventHandler(SectorInfoEventArgs e);
public class SectorInfoEventArgs : EventArgs
{
private readonly int cSector;
private readonly int tSectors;
public SectorInfoEventArgs(int currentSector, int totalSectors)
{
cSector = currentSector;
tSectors = totalSectors;
}
/// <summary> Get the current sector </summary>
public int CurrentSector { get { return cSector; } }
/// <summary> Get the total number of sectors to be run </summary>
public int TotalSectors { get { return tSectors; } }
}
# endregion
# region WipeDone
public delegate void WipeDoneEventHandler(WipeDoneEventArgs e);
public class WipeDoneEventArgs : EventArgs
{
}
# endregion
# region WipeError
public delegate void WipeErrorEventHandler(WipeErrorEventArgs e);
public class WipeErrorEventArgs : EventArgs
{
private readonly Exception e;
public WipeErrorEventArgs(Exception error)
{
e = error;
}
public Exception WipeError{get{ return e;}}
}
# endregion
# endregion
}
