I store documents/files as byte[] in a database, and I want the user to be able to view/run those files from my application.
You need to know the file extension for the file you're writing, so the OS can run the default program based on the extension. The code would be something like this:
byte[] bytes = GetYourBytesFromDataBase();
string extension = GetYourFileExtension(); // ".doc" for example
string path = Path.GetTempFileName() + extension;
try
{
    using (BinaryWriter writer = new BinaryWriter(File.Open(path, FileMode.Create)))
    {
        writer.Write(bytes);
    }
    // open it with the default application based on the
    // file extension, and wait for the viewer to close
    Process p = System.Diagnostics.Process.Start(path);
    p.WaitForExit();
}
finally
{
    // clean up the tmp file
    File.Delete(path);
}
You will need to store the file extension in the database too. If you don't have the file extension the problem becomes very difficult as you cannot rely on the operating system to work out which program to launch to handle the file.
You can use the following pattern:
Load data from database and save to file using the original file extension.
Start a new System.Diagnostics.Process that points to the saved file path.
As you have saved the file with the original file extension, the OS will look for a program that is registered for the extension to open the file.
As chibacity and Daniel suggest, storing the file extension in the database, or at least some indicator that tells you the file type, is a good idea.
If these files are of a format of your own creation, then you might also want to store information about which version of the file format the data is stored in. During development file formats are prone to changing, and if you don't remember which version you used to store the data, you will have a hard time recovering the information.
The same problems are faced in object persistence generally.
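As a minimal illustration of that advice, the data could be stored inside a small wrapper that records both the extension and the format version; the DocumentEnvelope class and its property names here are hypothetical, not something from the answers above:
// Hypothetical wrapper stored in (or alongside) the database record
public class DocumentEnvelope
{
    public int FormatVersion { get; set; }  // bump this whenever your own file format changes
    public string Extension { get; set; }   // e.g. ".doc", so the OS can pick the right program later
    public byte[] Payload { get; set; }     // the raw document bytes
}
When loading, FormatVersion tells you which parser or upgrade path to apply, which is exactly the information that is hard to reconstruct after the fact.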
I'm using GetTempFileName() in order to write bytes from a database to a temporary file. These files are normally PDF type, however, they can and do vary.
I'm running into an issue where attempting Process.Start(tempPath) throws an error because Windows cannot find a program associated with the .tmp extension.
As opposed to writing a custom method that verifies the mime type and adjusts the file name as needed, is there any common/standard way to handle this .tmp file type in .NET?
I guess that Reed Copsey's answer on a related question helps to answer yours:
You can append the extension to the autogenerated temp-file:
string filename = System.IO.Path.GetTempFileName() + ".pdf"; // makes something like "C:\Temp\blah.tmp.pdf"
File.WriteAllBytes(filename, filedata);
var process = Process.Start(filename);
// Clean up our temporary file once the viewer exits...
process.EnableRaisingEvents = true; // required, otherwise the Exited event never fires
process.Exited += (s, e) => System.IO.File.Delete(filename);
I've had three reports now of users' machines crashing while using my software. The crashes are not related to my program, but when the machines restart, the config files my program writes are all corrupt.
There is nothing special about how the files are being written: I simply create a JSON representation and dump it to disk using File.WriteAllText():
// save our contents to the disk
string json = JsonConvert.SerializeObject(objectInfo, Formatting.Indented);
// write the contents
File.WriteAllText(path, json);
I've had a user send me one of the files and the length looks about right (~3kb) but the contents are all 0x00.
According to the post below File.WriteAllText should close the file handle, flushing any unwritten contents to the disk:
In my C# code does the computer wait until output is complete before moving on?
BUT, as pointed out by Alberto in the comments:
When System.IO.File.WriteAllText completes, it will have flushed all the text to the filesystem cache; it will then be lazily written to the drive.
So I presume what is happening here is that the file is being cleared and initialized with 0x00 but the data is not yet written when the system crashes.
I was thinking of maybe using some sort of temp file so the process would be like this:
Write new contents to temp file
Delete original file
Rename temp file to original
I don't think that will solve the problem as I presume Windows will just move the file even though the IO is still pending.
Is there any way I can force the machine to dump that data to disk instead of it deciding when to do it or perhaps a better way to update a file?
UPDATE:
Based on suggestions by @usr, @mikez and @Ilya Luzyanin, I've created a new WriteAllText function that performs the write using the following logic:
Create a temp file with the new contents using the FileOptions.WriteThrough flag
Write the data to disk (the call won't return until the write has completed)
Use File.Replace to copy the contents of the temp file over the real file, making a backup of the original
With that logic, if the final file fails to load, my code can check for a backup file and load that instead.
Here is the code:
public static void WriteAllTextWithBackup(string path, string contents)
{
    // generate a temp filename
    var tempPath = Path.GetTempFileName();
    // create the backup name
    var backup = path + ".backup";
    // delete any existing backups
    if (File.Exists(backup))
        File.Delete(backup);
    // get the bytes
    var data = Encoding.UTF8.GetBytes(contents);
    // write the data to a temp file
    using (var tempFile = File.Create(tempPath, 4096, FileOptions.WriteThrough))
    {
        tempFile.Write(data, 0, data.Length);
    }
    // replace the contents
    File.Replace(tempPath, path, backup);
}
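For completeness, here is a hedged sketch of the matching read side that falls back to the backup left behind by File.Replace; ReadAllTextWithBackup is my own name for it, not code from the original post:
public static string ReadAllTextWithBackup(string path)
{
    var backup = path + ".backup";
    // prefer the primary file when it exists and has real content
    if (File.Exists(path))
    {
        var text = File.ReadAllText(path);
        // a file zeroed out by a crash reads back as NUL characters, so treat that as corrupt
        if (text.Trim('\0').Trim().Length > 0)
            return text;
    }
    // otherwise fall back to the backup written by the previous successful save
    if (File.Exists(backup))
        return File.ReadAllText(backup);
    throw new FileNotFoundException("Neither the file nor its backup could be read.", path);
}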
You can use FileStream.Flush (passing flushToDisk: true) to force the data to disk. Write to a temp file and use File.Replace to atomically replace the target file.
I believe this is guaranteed to work, although file systems give only weak guarantees, and those guarantees are hardly ever documented and are complex.
Alternatively, you can use Transactional NTFS where it is available; it can be used from .NET.
FileOptions.WriteThrough can replace Flush but you still need the temp file if your data can exceed a single cluster in size.
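As a rough sketch of the Flush-based variant described above (assuming .NET 4.0 or later, where the FileStream.Flush(bool flushToDisk) overload exists, and reusing the path/contents variables from the question):
// write the new contents to a temp file on the same volume as the target,
// push them past the file system cache, then swap the files atomically
var tempPath = path + ".tmp";
var data = Encoding.UTF8.GetBytes(contents);
using (var stream = new FileStream(tempPath, FileMode.Create, FileAccess.Write))
{
    stream.Write(data, 0, data.Length);
    stream.Flush(true); // flushToDisk: true forces the data to the drive, not just the cache
}
// note: File.Replace requires the destination file to already exist
File.Replace(tempPath, path, path + ".backup");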
I am using the fileUpload control. When I upload the file, I want to find the exact location of the file.
I tried using:
string fname= Server.MapPath(FileUpload2.FileName);
string fname= FileUpload2.FileName;
string fname= FileUpload2.PostedFile.FileName;
Numbers 2 and 3 gave me the name of the file. Number 1 gave me the path of my website location. I do not know what the difference between 2 and 3 is, or why both gave me the same result.
I read somewhere that you cannot get the path. Is that true? If not, what code should I use?
There is no actual file path because a file uploaded to the server is simply held in memory.
The FileUpload control is just a wrapper around an HttpPostedFile instance, which itself is basically just a wrapper around an InputStream.
It's up to you to actually save the file somewhere. Until then it doesn't exist in any physical location.
The FileName property simply corresponds to the filename from the client's machine, minus the path. It has no correlation to anything on the server's file system.
There are a couple of different ways you can deal with the file.
Save The File To Disk:
The FileUpload control provides a SaveAs method that will allow you to save the file locally, or some UNC that you have access to.
FileUpload2.SaveAs("C:\\Temp\\" + FileUpload2.FileName);
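A slightly more defensive variant, since FileName comes from the client and can include a client-side path on some older browsers (the target folder here is just an example):
// keep only the file name portion and build the destination path safely
string safeName = Path.GetFileName(FileUpload2.FileName);
FileUpload2.SaveAs(Path.Combine(@"C:\Temp", safeName));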
Process The File In Memory:
Since you have access to the FileContent, you could simply manipulate and process the file directly, assuming you know what type of file it is (txt, pdf, csv, etc.):
using (var sr = new StreamReader(FileUpload2.FileContent))
{
    string line;
    while ((line = sr.ReadLine()) != null)
    {
        // Do something with 'line'
    }
}
I have code that reads encrypted credentials from a text file. I updated that text file to include a connection string. Everything else is read and decrypted fine, but not the connection string (naturally, I updated my code accordingly, too).
So I got to wondering: is it reading the correct file? The answer: no! The file in \bin\debug is dated 6/5/2012 9:41 AM, but this code:
using (StreamReader reader = File.OpenText("Credentials.txt")) {
string line = null;
MessageBox.Show(File.GetCreationTime("Credentials.txt").ToString());
...shows 6/4/2012 2:00:44 pm
So I searched my hard drive for all instances of "Credentials.txt" to see where it was reading the file from. It only found one instance, the one with today's date in \bin\debug.
???
Note: Credentials.txt is not a part of my solution; should it be? (IOW, I simply copied it into \bin\debug, I didn't perform an "Add | Existing Item")
Provided you don't change the current directory, the file in bin\Debug is going to be the one being read, as you're not specifying a full path.
The problem is likely due to the difference between the various file dates. The creation date (which is what you are fetching and displaying as 6/4 @ 2:00:44 PM) is likely different from the date modified (which is what is shown by default in Windows Explorer). The latter can be fetched using File.GetLastWriteTime instead of GetCreationTime.
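For example, showing the two timestamps side by side for the same file should make the mismatch obvious:
// creation time vs. last write time for the same file
MessageBox.Show(File.GetCreationTime("Credentials.txt") + " / " + File.GetLastWriteTime("Credentials.txt"));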
That being said, I would recommend using the full path to the file, and not assuming that the current directory is the same as the executable path. Specifying the full path (which can be determined based on the executable path) will be safer, and less likely to cause problems later. This can be done via:
var exePath = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetEntryAssembly().Location);
var file = System.IO.Path.Combine(exePath, "Credentials.txt");
using (StreamReader reader = File.OpenText(file)) { // ...
All right guys and gals, it's time for the age-old question: how do you password protect an XML file using C#?
I have actually created the file in C# as well (not that it's relevant), and now I need to password protect it so I can email it out to clients. Any suggestions?
Also, I tried putting the XML file into a zip file using C#, and upon doing this the file loses its extension. It does this with every method I find, so I would really just like to password protect the original file.
I should have been clearer on this: the file loses its extension permanently. When the end user unzips it, it's no longer an XML file; it's just a file with a name, no association or anything.
OK, changing this a bit. It's been pointed out a lot that XML doesn't get password protected because it's just text. Not a problem, so let's change this up: how about the zipping of it?
FileStream sourceFile = File.OpenRead(@"C:\sample.xml");
FileStream destFile = File.Create(@"C:\sample.zip");
GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress);
try
{
    int theByte = sourceFile.ReadByte();
    while (theByte != -1)
    {
        compStream.WriteByte((byte)theByte);
        theByte = sourceFile.ReadByte();
    }
}
finally
{
    compStream.Dispose();
    destFile.Dispose();
    sourceFile.Dispose();
}
The code above zips the file, but when the file is unzipped by the end user it loses its .xml extension, and with it its file association.
OK, I have an update: I figured out how to keep the file from losing its extension. If I change the output file name to sample.xml.zip, the system handles it fine. Granted, the output file comes out reading just like this, sample.xml.zip, but WinZip never complains about opening it and neither does 7-Zip, so I'm perfectly happy with this. The password protection is something I haven't figured out yet.
Just for reference's sake, my new code:
FileStream sourceFile = File.OpenRead(@"C:\sample.xml");
FileStream destFile = File.Create(@"C:\sample.xml.zip");
GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress);
try
{
    int theByte = sourceFile.ReadByte();
    while (theByte != -1)
    {
        compStream.WriteByte((byte)theByte);
        theByte = sourceFile.ReadByte();
    }
}
finally
{
    compStream.Dispose();
    destFile.Dispose();
    sourceFile.Dispose();
}
and upon doing this the file loses its extension
What do you mean, the file name changes from MyXMLFile.xml to MyXMLFile.zip?
There's nothing you can do about that, absolutely nothing.
An xml file is a plain text file, you can't password protect the file without somehow encrypting it. Once you encrypt it, it's no longer an Xml file, it's an encrypted file, that when decrypted will produce an Xml file.
Encrypting your xml file into a password protected Zip file is a perfectly good solution to this problem.
Once the end user unzips the zip file, they'll see it as an Xml file, and then everything will be ok.
Hope this helps.
You can't password protect an XML file the way you can with a Word document. The reason you can place passwords on Word documents is because Word and presumably other programs which can read Word documents support password protection. Nothing prevents a program from completely ignoring the password (unless the file is somehow encrypted using the password as a key generator).
XML files are simply text files. No password protection is possible without placing them in a password protected container (such as a zip file). When you zip up the XML file, it is placed inside a zip archive with the extension of .zip to indicate that it is a zip file.
It is then up to the person receiving the zip file to provide the correct password in order to decompress the zip file and retrieve the original XML file.
I don't believe .NET has any built-in support for managing .zip files (and certainly not password-protected ones). You can use a third-party library like DotNetZip to help you with this.
Encrypting the file and then decrypting it would be one option. This article gives some information on encrypting and decrypting.
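As a rough sketch of what that option could look like, using classes from System.Security.Cryptography; the password handling, file names and iteration count here are illustrative assumptions, not taken from the article:
using System.IO;
using System.Security.Cryptography;

static void EncryptFile(string inputPath, string outputPath, string password)
{
    // random salt so the same password does not always produce the same key
    byte[] salt = new byte[16];
    using (var rng = RandomNumberGenerator.Create())
        rng.GetBytes(salt);

    using (var keyDerivation = new Rfc2898DeriveBytes(password, salt, 10000))
    using (var aes = Aes.Create())
    {
        aes.Key = keyDerivation.GetBytes(32); // 256-bit key
        aes.IV = keyDerivation.GetBytes(16);  // 128-bit IV

        using (var output = File.Create(outputPath))
        {
            // store the salt in clear at the start of the file so it can be re-derived when decrypting
            output.Write(salt, 0, salt.Length);
            using (var crypto = new CryptoStream(output, aes.CreateEncryptor(), CryptoStreamMode.Write))
            using (var input = File.OpenRead(inputPath))
            {
                input.CopyTo(crypto);
            }
        }
    }
}
Decryption would mirror this: read the salt back from the start of the file, derive the same key and IV, and wrap the remainder of the stream in a CryptoStream built with CreateDecryptor.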
I think a good option would be to zip it and password protect the zip file, not the XML itself. A library like DotNetZip could work for this and is pretty straightforward.
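A hedged sketch of what that could look like with DotNetZip (the Ionic.Zip library); the password and file paths are placeholders:
using Ionic.Zip;

// create a password-protected zip containing the XML file
using (var zip = new ZipFile())
{
    zip.Password = "your-password-here";
    zip.Encryption = EncryptionAlgorithm.WinZipAes256;
    zip.AddFile(@"C:\sample.xml", ""); // "" keeps the entry at the root of the archive
    zip.Save(@"C:\sample.xml.zip");
}
Because the entry keeps its sample.xml name inside the archive, the file association survives the round trip, which also addresses the lost-extension problem from the question.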