I'm trying to implement SkyDrive backup in my app. Everything is working fine, except I have no idea how to easily save a stream (containing the downloaded backup from SkyDrive) into Isolated Storage.
I know that I could deserialize the stream and then save it, but that would be unnecessarily complicated.
So, here's my problem:
I have a Stream with the file from SkyDrive and I need to save it into Isolated Storage as the file 'Settings.xml'.
So basically I need to write the 'stream' into a Settings.xml file in Isolated Storage:
static void client_DownloadCompleted(object sender, LiveDownloadCompletedEventArgs e, string saveToPath)
{
Stream stream = e.Result; //Need to write this into IS
try
{
using (IsolatedStorageFile myIsolatedStorage = IsolatedStorageFile.GetUserStoreForApplication())
{
using (IsolatedStorageFileStream fileStream = myIsolatedStorage.OpenFile(saveToPath, FileMode.Create))
{
//How to save it?
}
}
}...
Thanks!
Use the stream's CopyTo() method - http://msdn.microsoft.com/en-us/library/dd782932.aspx
using (IsolatedStorageFileStream fileStream = myIsolatedStorage.OpenFile(saveToPath, FileMode.Create))
{
stream.CopyTo(fileStream);
}
The quickest and simplest thing to do is to copy the bytes block by block from one stream to the other - see the accepted answer in How do I copy the contents of one stream to another?
However, if there is any chance that the network stream will be corrupt or contain invalid data, then it might be worth deserializing the stream and validating before you save it.
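For reference, the block-by-block copy the linked answer describes looks roughly like this (the buffer size is arbitrary; stream and fileStream are the variables from the snippets above):
byte[] buffer = new byte[16 * 1024];
int bytesRead;
// Read a block from the source stream and write it to Isolated Storage until EOF.
while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
{
    fileStream.Write(buffer, 0, bytesRead);
}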
Related
I am calling a REST API that accepts a Stream to upload a file from the local device, so right now I am using the following code to get a Stream from a file and then closing that stream after the upload completes:
var stream = new FileStream(file, FileMode.Open, FileAccess.ReadWrite);
The problem with this approach is that until the entire file has been uploaded to the server, the user has no chance to delete that file, because its stream is still open. What would be a good way to resolve this?
If your typical file is reasonably sized (and I'm hoping you won't be uploading 2GB+ files to a REST API), you could always just read the stream into memory before feeding it to your API, like so:
using (MemoryStream memoryStream = new MemoryStream())
{
using (FileStream fileStream = new FileStream(file, FileMode.Open, FileAccess.ReadWrite)) {
fileStream.CopyTo(memoryStream);
}
memoryStream.Position = 0; // Reset to origin.
// Now use the MemoryStream as you would a FileStream:
api.Upload(memoryStream);
}
Another alternative is to create a temp copy of the file on your hard drive and feed that to the API - but then cleanup can become a bit cumbersome. FileOptions.DeleteOnClose is your friend here and may very well suffice for your purposes, but it still offers no bulletproof guarantees.
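A hedged sketch of that temp-copy route (api.Upload is carried over from the snippet above and is an assumption, not a real API):
// Sketch only: copy the source file to a temp file that the OS deletes when the
// stream is closed, so the original can be deleted as soon as the copy exists.
string tempPath = Path.GetTempFileName();
File.Copy(file, tempPath, true);
using (var tempStream = new FileStream(tempPath, FileMode.Open, FileAccess.Read,
                                       FileShare.None, 4096, FileOptions.DeleteOnClose))
{
    api.Upload(tempStream); // placeholder for your actual upload call
}
// tempPath is removed automatically when tempStream is disposed.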
I am building a Silverlight application in which I need to save data to the server.
Is it possible to save the recorded stream into a dummy file?
Stream stream = saveFileDialog.OpenFile();
WavManager.SavePcmToWav(_sink.BackingStream, stream, _sink.CurrentFormat);
stream.Close();
Instead of having the user pick a location with SaveFileDialog, I want to use a dummy file created at runtime.
If this is possible, any pointers would be greatly appreciated. Thanks in advance.
You can use IsolatedStorageFile to create a temp/dummy file without asking the user to select one.
Isolated storage is a restricted area where your Silverlight application can store files and data.
using (IsolatedStorageFile store = IsolatedStorageFile.GetUserStoreForApplication())
using (IsolatedStorageFileStream stream = store.CreateFile("dummy.wav"))
{
    WavManager.SavePcmToWav(_sink.BackingStream, stream, _sink.CurrentFormat);
}
Another solution would be to store the data of your .wav file in an in-memory stream. This can be done by using a MemoryStream.
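A minimal sketch of the in-memory variant, reusing the WavManager and _sink members from your snippet:
using (var memoryStream = new MemoryStream())
{
    // Write the recorded PCM data into memory instead of a file.
    WavManager.SavePcmToWav(_sink.BackingStream, memoryStream, _sink.CurrentFormat);
    memoryStream.Position = 0;
    // memoryStream now holds the .wav data and can be uploaded to the server
    // or copied into isolated storage later.
}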
I have files stored in one container within a blob storage account. I need to create a zip file in a second container containing the files from the first container.
I have a solution that works, using a worker role and DotNetZip, but because the zip file could end up being 1GB in size, I am concerned that doing all the work in-process using MemoryStream objects etc. is not the best way of doing this. My biggest concern is memory usage and freeing up resources, given that this process could happen several times a day.
Below is some very stripped down code showing the basic process in the worker role:
using (ZipFile zipFile = new ZipFile())
{
foreach (var uri in uriCollection)
{
var blob = new CloudBlob(uri);
byte[] fileBytes = blob.DownloadByteArray();
using (var fileStream = new MemoryStream(fileBytes))
{
fileStream.Seek(0, SeekOrigin.Begin);
byte[] bytes = CryptoHelp.EncryptAsBytes(fileStream, "password", null);
zipFile.AddEntry("entry name", bytes);
}
}
using (var zipStream = new MemoryStream())
{
zipFile.Save(zipStream);
zipStream.Seek(0, SeekOrigin.Begin);
var blobRef = ContainerDirectory.GetBlobReference("output uri");
blobRef.UploadFromStream(zipStream);
}
}
Can someone suggest a better approach please?
At the time of writing this question, I was unaware of the LocalStorage options available in Azure. I was able to write files individually to it, work with them within LocalStorage, and then write them back to blob storage.
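Roughly, that approach looks like this (the local resource name "ZipScratch" is a placeholder and must be declared in the service definition; the rest reuses names from the original code):
// Sketch: stage the zip on the role instance's local disk instead of in memory.
LocalResource scratch = RoleEnvironment.GetLocalResource("ZipScratch");
string zipPath = Path.Combine(scratch.RootPath, "archive.zip");

using (var zipFile = new ZipFile())
{
    // ... add the downloaded/encrypted entries as in the original code ...
    zipFile.Save(zipPath); // DotNetZip writes the archive to local disk
}

var blobRef = ContainerDirectory.GetBlobReference("output uri");
using (var zipStream = File.OpenRead(zipPath))
{
    blobRef.UploadFromStream(zipStream);
}
File.Delete(zipPath); // free up local storage for the next run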
If all you are worried about is your MemoryStream taking up too much memory, you can implement your own stream: as the stream is being read, you add your zip entries to it and drop the data that has already been read. This keeps your in-memory buffer down to roughly the size of one file.
I have seen this term bandied around, but I don't really understand how you open a file in memory.
I have the files written to disk in a temp location, but that folder needs cleaning out when a certain form closes, and I can't do it while the files are open. It's a must that this folder gets emptied. I was wondering whether opening the files in memory instead would make a difference.
MemoryStream inMemoryCopy = new MemoryStream();
using (FileStream fs = File.OpenRead(path))
{
fs.CopyTo(inMemoryCopy);
}
// Now you can delete the file at 'path' and still have an in memory copy
I think you want to work with memory-mapped files, added in .NET 4.
http://blogs.msdn.com/b/salvapatuel/archive/2009/06/08/working-with-memory-mapped-files-in-net-4.aspx
Memory Mapped Files .NET
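For completeness, a small hedged example of the .NET 4 memory-mapped file API the article covers (path is whatever file you would otherwise copy into memory):
using System.IO.MemoryMappedFiles;

// Map the file into the process's address space and read it through a stream view.
using (MemoryMappedFile mmf = MemoryMappedFile.CreateFromFile(path, FileMode.Open))
using (MemoryMappedViewStream view = mmf.CreateViewStream())
{
    int firstByte = view.ReadByte(); // pages are loaded on demand, not copied up front
}
Note that the file remains open while it is mapped, so this on its own won't let you empty the temp folder any sooner.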
I think it means reading the content of the file into memory as a whole and then closing the connection to the file. Assuming the file isn't too big, you could just read it into a byte[]:
byte[] fileContent = File.ReadAllBytes(fileName);
If it's a text file, read it into a string using:
string fileContent = File.ReadAllText(fileName);
Once you've done that, you can wrap the bytes in a MemoryStream (or the text in a StringReader) and read it later just as you would a file on disk.
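For example, once the bytes are in memory you can read them back through a MemoryStream just like a file (a minimal sketch):
byte[] fileContent = File.ReadAllBytes(fileName);
// The file on disk can now be deleted; the copy lives in fileContent.
using (var memoryStream = new MemoryStream(fileContent))
using (var reader = new StreamReader(memoryStream))
{
    string text = reader.ReadToEnd();
}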
You can use DeleteOnClose parameter of FileStream constructor:
FileStream fs = new FileStream("<Path Here>", FileMode.Create,
FileAccess.ReadWrite, FileShare.None, 1024, FileOptions.DeleteOnClose);
and the file will be deleted when closed.
Is it possible to open a file directly from a MemoryStream, as opposed to writing it to disk and doing Process.Start()? Specifically a PDF file? If not, I guess I need to write the MemoryStream to disk (which is kind of annoying). Could someone then point me to a resource on how to write a MemoryStream to disk?
It depends on the client :) If the client will accept input from stdin, you could push the data to it. Another possibility might be to write a named-pipes server or a socket server - not trivial, but it may work.
However, the simplest option is to just grab a temp file and write to that (and delete afterwards).
var file = Path.GetTempFileName();
using(var fileStream = File.OpenWrite(file))
{
// GetBuffer() returns the underlying buffer without copying; write only
// memStream.Length bytes, since the buffer may be larger than the data.
var buffer = memStream.GetBuffer();
fileStream.Write(buffer, 0, (int)memStream.Length);
}
Remember to clean up the file when you are done.
Path.GetTempFileName() returns a file name with a '.tmp' extension, therefore you can't use Process.Start() on it, since that relies on the Windows file association for the extension.
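One hedged workaround is to build your own temp path with the right extension before handing it to Process.Start (the file-name generation here is illustrative):
// Give the temp file a .pdf extension so the shell can find the associated viewer.
string file = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".pdf");
using (var fileStream = File.OpenWrite(file))
{
    memStream.WriteTo(fileStream); // memStream is the MemoryStream from the question
}
Process.Start(file);
// Remember to delete the file once the viewer no longer needs it.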
If by opening a file you mean something like starting Adobe Reader for PDF files, then yes, you have to write it to a file. That is, unless the application provides you with some API to do that.
One way to write a stream to file would be:
using (var memoryStream = /* create the memory stream */)
using (var fileStream = File.OpenWrite(fileName))
{
memoryStream.WriteTo(fileStream);
}