How to create a zip file in memory? - C#

I have to create a zip file from a set of URLs, and it should have a proper folder structure. So I tried the following:
public async Task<byte[]> CreateZip(Guid ownerId)
{
    try
    {
        // Base folder that mirrors the zip's folder structure
        string startPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "zipFolder");
        if (Directory.Exists(startPath))
        {
            DeleteAllFiles(startPath);
            Directory.Delete(startPath);
        }
        Directory.CreateDirectory(startPath);

        // Folder based on the owner id, used to hold the resulting zip
        string zipPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, ownerId.ToString());
        if (Directory.Exists(zipPath))
        {
            DeleteAllFiles(zipPath);
            Directory.Delete(zipPath);
        }
        Directory.CreateDirectory(zipPath);

        var attachments = await ReadByOwnerId(ownerId);
        attachments.Data.ForEach(i =>
        {
            var fileLocalPath = Path.Combine(startPath, i.Category);
            if (!Directory.Exists(fileLocalPath))
            {
                Directory.CreateDirectory(fileLocalPath);
            }
            using (var client = new WebClient())
            {
                client.DownloadFile(i.Url, Path.Combine(fileLocalPath, $"{i.Flags ?? ""}_{i.FileName}"));
            }
        });

        var zipFilename = Path.Combine(zipPath, "result.zip");
        if (File.Exists(zipFilename))
        {
            File.Delete(zipFilename);
        }
        ZipFile.CreateFromDirectory(startPath, zipFilename, CompressionLevel.Fastest, true);
        return File.ReadAllBytes(zipFilename);
    }
    catch (Exception ex)
    {
        var a = ex; // swallows the exception; at minimum this should be logged
        return null;
    }
}
Currently I'm writing all files to my base directory (maybe not a good idea), and I have to manually delete all folders and files to avoid exceptions and leftover files. Can everything be written in memory instead?
What changes are required to write all the files and the folder structure in memory?

No, you can't. At least not with the built-in ZipFile.CreateFromDirectory, anyway.
As per my comment, I would recommend storing the files in a custom location based on a Guid or similar. E.g.:
"/xxxx-xxxx-xxxx-xxxx/Folder-To-Zip/....".
This ensures you can handle multiple concurrent requests with the same files or similar file/folder names.
Then you just have to clean up and delete the folder again afterwards so you don't run out of space.
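A minimal sketch of that pattern, assuming a caller-supplied populate callback (hypothetical) that downloads the files into the working folder:

public byte[] ZipWorkingFolder(Action<string> populate)
{
    // Unique per-request working folder so concurrent calls never collide
    string workDir = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());
    string zipFile = workDir + ".zip";
    Directory.CreateDirectory(workDir);
    try
    {
        populate(workDir); // caller downloads its files into workDir
        ZipFile.CreateFromDirectory(workDir, zipFile, CompressionLevel.Fastest, false);
        return File.ReadAllBytes(zipFile);
    }
    finally
    {
        // Clean up even if populate or the zipping throws
        if (Directory.Exists(workDir)) Directory.Delete(workDir, true);
        if (File.Exists(zipFile)) File.Delete(zipFile);
    }
}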

Hope the code below does the job.
public async Task<byte[]> CreateZip(Guid ownerId)
{
    try
    {
        // Unique temp folder per call so concurrent requests don't collide
        string startPath = Path.Combine(Path.GetTempPath(), $"{Guid.NewGuid()}_zipFolder");
        Directory.CreateDirectory(startPath);

        var attachments = await ReadByOwnerId(ownerId);
        attachments.Data = filterDuplicateAttachments(attachments.Data);
        // Filtering out YouTube URLs; they are links, not downloadable files
        attachments.Data = attachments.Data
            .Where(i => !i.Flags.Equals("YoutubeUrl", StringComparison.OrdinalIgnoreCase))
            .ToList();

        attachments.Data.ForEach(i =>
        {
            var fileLocalPath = Path.Combine(startPath, i.Category);
            if (!Directory.Exists(fileLocalPath))
            {
                Directory.CreateDirectory(fileLocalPath);
            }
            using (var client = new WebClient())
            {
                client.DownloadFile(i.Url, Path.Combine(fileLocalPath, $"{i.Flags ?? ""}_{i.FileName}"));
            }
        });

        using (var ms = new MemoryStream())
        {
            // leaveOpen: true so the MemoryStream is still usable after the archive is disposed
            using (var zipArchive = new ZipArchive(ms, ZipArchiveMode.Create, true))
            {
                var di = new DirectoryInfo(startPath);
                var allFiles = di.GetFiles("*", SearchOption.AllDirectories); // an empty pattern matches nothing
                foreach (var attachment in allFiles)
                {
                    var type = attachments.Data
                        .FirstOrDefault(i => $"{i.Flags ?? ""}_{i.FileName}".Equals(attachment.Name, StringComparison.OrdinalIgnoreCase));
                    var entry = zipArchive.CreateEntry($"{type.Category}/{attachment.Name}", CompressionLevel.Fastest);
                    using (var file = File.OpenRead(attachment.FullName))
                    using (var entryStream = entry.Open())
                    {
                        file.CopyTo(entryStream);
                    }
                }
            }
            Directory.Delete(startPath, true); // remove the temp folder when done
            return ms.ToArray();
        }
    }
    catch (Exception ex)
    {
        var a = ex; // swallows the exception; consider logging instead
        return null;
    }
}
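To avoid touching the disk entirely, the downloads can also be streamed straight into the archive entries, so no temp folder is needed at all. A minimal sketch, assuming a hypothetical Attachment type with the same shape as above (Url, Category, Flags, FileName) and using HttpClient in place of the obsolete WebClient:

public async Task<byte[]> CreateZipInMemory(IEnumerable<Attachment> attachments)
{
    using (var http = new HttpClient())
    using (var ms = new MemoryStream())
    {
        using (var zip = new ZipArchive(ms, ZipArchiveMode.Create, leaveOpen: true))
        {
            foreach (var a in attachments)
            {
                // The folder structure lives in the entry names; no directories are created
                var entry = zip.CreateEntry($"{a.Category}/{a.Flags ?? ""}_{a.FileName}", CompressionLevel.Fastest);
                using (var entryStream = entry.Open())
                using (var download = await http.GetStreamAsync(a.Url))
                {
                    await download.CopyToAsync(entryStream);
                }
            }
        }
        return ms.ToArray();
    }
}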

Related

unable to update zip file in Azure File Share/Blob

I am using Azure File Share. I want to create a zip file only once, but update it multiple times (upload more files after it has been created).
Is it possible to create the .zip file only once and add more files to it later without overwriting the existing files in the zip?
When I try to add more files to the .zip, it overwrites the existing files in the zip with the new file.
private static async Task OpenZipFile()
{
    try
    {
        using (var zipFileStream = await OpenZipFileStream())
        {
            using (var zipFileOutputStream = CreateZipOutputStream(zipFileStream))
            {
                var level = 0;
                zipFileOutputStream.SetLevel(level);
                BlobClient blob = new BlobClient(new Uri(String.Format("https://{0}.blob.core.windows.net/{1}", "rtsatestdata", "comm/2/10029.txt")), _currentTenantTokenCredential);
                var zipEntry = new ZipEntry("newtestdata")
                {
                    Size = 1170
                };
                zipFileOutputStream.PutNextEntry(zipEntry);
                await blob.DownloadToAsync(zipFileOutputStream); // was .Wait(), which blocks the async method
                zipFileOutputStream.CloseEntry();
            }
        }
    }
    catch (TaskCanceledException)
    {
        throw;
    }
}
private static async Task<Stream> OpenZipFileStream()
{
    BlobContainerClient mainContainer = _blobServiceClient.GetBlobContainerClient("comm");
    var blobItems = mainContainer.GetBlobs(BlobTraits.Metadata, BlobStates.None);
    foreach (var item in blobItems)
    {
        if (item.Name == "testdata.zip")
        {
            BlobClient blob = new BlobClient(new Uri(String.Format("https://{0}.blob.core.windows.net/{1}", "rtsatestdata", "comm/testdata.zip")), _currentTenantTokenCredential);
            return await blob.OpenWriteAsync(true
                , options: new BlobOpenWriteOptions
                {
                    HttpHeaders = new BlobHttpHeaders
                    {
                        ContentType = "application/zip"
                    }
                }
            );
        }
    }
    throw new FileNotFoundException("testdata.zip not found in container"); // the original fell off the end without returning
}
private static ZipOutputStream CreateZipOutputStream(Stream zipFileStream)
{
    return new ZipOutputStream(zipFileStream)
    {
        IsStreamOwner = false,
    };
}
This is not possible in Azure Storage. The workaround is to download the zip, unzip it, add the new files, re-zip it, and re-upload it to storage.
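A minimal sketch of that workaround using the in-box System.IO.Compression types rather than SharpZipLib: download the whole zip into a MemoryStream, open it in ZipArchiveMode.Update (which preserves existing entries), append the new file, and upload the result back over the blob. zipBlob, fileBlob, and entryName are assumptions for illustration:

public static async Task AddFileToZipBlob(BlobClient zipBlob, BlobClient fileBlob, string entryName)
{
    using (var ms = new MemoryStream())
    {
        await zipBlob.DownloadToAsync(ms); // pull the existing zip down
        using (var zip = new ZipArchive(ms, ZipArchiveMode.Update, leaveOpen: true))
        {
            var entry = zip.CreateEntry(entryName); // existing entries are left intact
            using (var entryStream = entry.Open())
            {
                await fileBlob.DownloadToAsync(entryStream);
            }
        }
        ms.Position = 0;
        await zipBlob.UploadAsync(ms, overwrite: true); // replace the blob with the updated zip
    }
}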

Read Text file without copying to hard disk

I'm using ASP.NET Core 3.0, and I'm in a situation where the client passes text file(s) to my API. The API parses the text files into a data model using a function I created called ParseDataToModel(), and then stores that model in a database using Entity Framework. Since my code parses the files into a data model, I really don't need to copy them to the hard disk if it isn't necessary. I don't have much knowledge of streams, and I've googled quite a bit, but is there a way to retrieve the string data of the uploaded files without actually copying them to the hard drive? It seems like a needless extra step. Below is my code for the file upload and insertion into the database:
[HttpPost("upload"), DisableRequestSizeLimit]
public IActionResult Upload()
{
var filePaths = new List<string>();
foreach(var formFile in Request.Form.Files)
{
if(formFile.Length > 0)
{
var filePath = Path.GetTempFileName();
filePaths.Add(filePath);
using(var stream = new FileStream(filePath, FileMode.Create))
{
formFile.CopyTo(stream);
}
}
}
BaiFiles lastFile = null;
foreach(string s in filePaths)
{
string contents = System.IO.File.ReadAllText(s);
BaiFiles fileToCreate = ParseFileToModel(contents);
if (fileToCreate == null)
return BadRequest(ModelState);
var file = _fileRepository.GetFiles().Where(t => t.FileId == fileToCreate.FileId).FirstOrDefault();
if (file != null)
{
ModelState.AddModelError("", $"File with id {fileToCreate.FileId} already exists");
return StatusCode(422, ModelState);
}
if (!ModelState.IsValid)
return BadRequest();
if (!_fileRepository.CreateFile(fileToCreate))
{
ModelState.AddModelError("", $"Something went wrong saving file with id {fileToCreate.FileId}");
return StatusCode(500, ModelState);
}
lastFile = fileToCreate;
}
return CreatedAtRoute("GetFile", new { fileId = lastFile.FileId }, lastFile);
}
It would be nice to just hold all of the data in memory instead of copying it to the hard drive, just to turn around and open it again to read the text. I apologize if this isn't possible, or if this question has been asked before; I'm sure it has, and I just wasn't googling the correct keywords. Otherwise, I could be wrong and it is already doing exactly what I want, but System.IO.File.ReadAllText() makes me feel it's being copied to a temp directory somewhere.
After using John's answer below, here is the revised code for anyone interested:
[HttpPost("upload"), DisableRequestSizeLimit]
public IActionResult Upload()
{
var filePaths = new List<string>();
BaiFiles lastFile = null;
foreach (var formFile in Request.Form.Files)
{
if (formFile.Length > 0)
{
using (var stream = formFile.OpenReadStream())
{
using (var sr = new StreamReader(stream))
{
string contents = sr.ReadToEnd();
BaiFiles fileToCreate = ParseFileToModel(contents);
if (fileToCreate == null)
return BadRequest(ModelState);
var file = _fileRepository.GetFiles().Where(t => t.FileId == fileToCreate.FileId).FirstOrDefault();
if (file != null)
{
ModelState.AddModelError("", $"File with id {fileToCreate.FileId} already exists");
return StatusCode(422, ModelState);
}
if (!ModelState.IsValid)
return BadRequest();
if (!_fileRepository.CreateFile(fileToCreate))
{
ModelState.AddModelError("", $"Something went wrong saving file with id {fileToCreate.FileId}");
return StatusCode(500, ModelState);
}
lastFile = fileToCreate;
}
}
}
}
if(lastFile == null)
return NoContent();
else
return CreatedAtRoute("GetFile", new { fileId = lastFile.FileId }, lastFile);
}
System.IO.File.ReadAllText(filePath) is a convenience method. It essentially does this:
string text = null;
using (var stream = File.OpenRead(filePath))
using (var reader = new StreamReader(stream))
{
    text = reader.ReadToEnd();
}
IFormFile exposes an OpenReadStream method, so you can simply use it in place of the FileStream above:
string text = null;
using (var stream = formFile.OpenReadStream())
using (var reader = new StreamReader(stream))
{
    text = reader.ReadToEnd();
}
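One caveat, which is an observation about ASP.NET Core 3.0 defaults rather than part of the answer above: the server disables synchronous I/O on the request by default, so a blocking ReadToEnd() over the upload can throw an InvalidOperationException depending on how the file is buffered. The async equivalent sidesteps this (the action must then return Task<IActionResult>):

string text;
using (var stream = formFile.OpenReadStream())
using (var reader = new StreamReader(stream))
{
    text = await reader.ReadToEndAsync(); // non-blocking read of the upload
}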

FileStream does not save array files

I have an array of links, each of which points to one XML file. How can I iterate over each XML file and save it to a folder in one call?
GetMesssageAttachments(userId) returns an array of 6 links, but the current code saves only the first file. What's wrong here? Thanks
public async void SaveXMLMessages(string userId)
{
    try
    {
        if (_responseMessage.IsSuccessStatusCode)
        {
            string messagesFolder = @"C:\XMLMessages";
            Directory.CreateDirectory(messagesFolder);
            string messageFileName = Path.GetRandomFileName();
            string messagesPath = Path.Combine(messagesFolder, messageFileName);
            foreach (string xmlMessage in await GetMesssageAttachments(userId))
            {
                var xmlMessageResponse = await _client.GetAsync(xmlMessage);
                using (FileStream fileStream = new FileStream(messagesPath, FileMode.Create))
                {
                    await xmlMessageResponse.Content.CopyToAsync(fileStream);
                }
            }
        }
    }
    catch (Exception e)
    {
        throw e.InnerException;
    }
}
UPDATED
This works:
public async void SaveXMLMessages(string userId)
{
    try
    {
        if (_responseMessage.IsSuccessStatusCode)
        {
            string messagesFolder = @"C:\XMLMessages";
            Directory.CreateDirectory(messagesFolder);
            foreach (string xmlMessage in await GetMesssageAttachments(userId))
            {
                string messageFileName = Path.GetRandomFileName();
                string messagesPath = Path.Combine(messagesFolder, messageFileName);
                var xmlMessageResponse = await _client.GetAsync(xmlMessage);
                using (FileStream fileStream = new FileStream(messagesPath, FileMode.Create))
                {
                    await xmlMessageResponse.Content.CopyToAsync(fileStream);
                }
            }
        }
    }
    catch (Exception e)
    {
        throw e.InnerException;
    }
}
The same messagesPath is used on every iteration of the foreach, which means only one file is ever created by the loop.
You must reinitialize it inside the loop, like this:
foreach (string xmlMessage in await GetMesssageAttachments(userId))
{
    string messageFileName = Path.GetRandomFileName();
    string messagesPath = Path.Combine(messagesFolder, messageFileName);
    var xmlMessageResponse = await _client.GetAsync(xmlMessage);
    using (FileStream fileStream = new FileStream(messagesPath, FileMode.Create))
    {
        await xmlMessageResponse.Content.CopyToAsync(fileStream);
    }
}
Maybe you are overwriting the file on each iteration; try moving this block inside your foreach:
string messageFileName = Path.GetRandomFileName();
string messagesPath = Path.Combine(messagesFolder, messageFileName);
Only a small change to 'thierry v's code, and what rsb55 says is right. Your code should look like this:
foreach (string xmlMessage in await GetMesssageAttachments(userId))
{
    string messageFileName = Path.GetRandomFileName();
    string messagesPath = Path.Combine(messagesFolder, messageFileName);
    var xmlMessageResponse = await _client.GetAsync(xmlMessage);
    using (FileStream fileStream = new FileStream(messagesPath, FileMode.Create))
    {
        await xmlMessageResponse.Content.CopyToAsync(fileStream);
    }
}
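As a side note that none of the answers raise: SaveXMLMessages is declared async void, so callers cannot await it and any exception bypasses their catch blocks. Returning Task is the safer signature; a minimal sketch reusing the same members as above:

public async Task SaveXMLMessagesAsync(string userId)
{
    string messagesFolder = @"C:\XMLMessages";
    Directory.CreateDirectory(messagesFolder);
    foreach (string xmlMessage in await GetMesssageAttachments(userId))
    {
        // A fresh random name per file, as in the accepted fix
        string messagesPath = Path.Combine(messagesFolder, Path.GetRandomFileName());
        var response = await _client.GetAsync(xmlMessage);
        using (FileStream fileStream = new FileStream(messagesPath, FileMode.Create))
        {
            await response.Content.CopyToAsync(fileStream);
        }
    }
}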

Parse WebCacheV01.dat in C#

I'm looking to parse the WebCacheV01.dat file using C# to find the last file location for upload in an Internet browser.
%LocalAppData%\Microsoft\Windows\WebCache\WebCacheV01.dat
I'm using the ManagedEsent NuGet package.
Esent.Isam
Esent.Interop
When I try to run the code below, it fails at:
Api.JetGetDatabaseFileInfo(filePath, out pageSize, JET_DbInfo.PageSize);
Or if I use
Api.JetSetSystemParameter(instance, JET_SESID.Nil, JET_param.CircularLog, 1, null);
at
Api.JetAttachDatabase(sesid, filePath, AttachDatabaseGrbit.ReadOnly);
I get the following error:
An unhandled exception of type
'Microsoft.Isam.Esent.Interop.EsentFileAccessDeniedException' occurred
in Esent.Interop.dll
Additional information: Cannot access file, the file is locked or in use
string localAppDataPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
string filePathExtra = @"\Microsoft\Windows\WebCache\WebCacheV01.dat";
string filePath = string.Format("{0}{1}", localAppDataPath, filePathExtra);
JET_INSTANCE instance;
JET_SESID sesid;
JET_DBID dbid;
JET_TABLEID tableid;
String connect = "";
JET_SNP snp;
JET_SNT snt;
object data;
int numInstance = 0;
JET_INSTANCE_INFO [] instances;
int pageSize;
JET_COLUMNDEF columndef = new JET_COLUMNDEF();
JET_COLUMNID columnid;
Api.JetCreateInstance(out instance, "instance");
Api.JetGetDatabaseFileInfo(filePath, out pageSize, JET_DbInfo.PageSize);
Api.JetSetSystemParameter(JET_INSTANCE.Nil, JET_SESID.Nil, JET_param.DatabasePageSize, pageSize, null);
//Api.JetSetSystemParameter(instance, JET_SESID.Nil, JET_param.CircularLog, 1, null);
Api.JetInit(ref instance);
Api.JetBeginSession(instance, out sesid, null, null);
//Do stuff in db
Api.JetEndSession(sesid, EndSessionGrbit.None);
Api.JetTerm(instance);
Is it not possible to read this without making modifications?
Viewer
http://www.nirsoft.net/utils/ese_database_view.html
Python
https://jon.glass/attempts-to-parse-webcachev01-dat/
libesedb
impacket
Issue:
The file is probably in use.
Solution:
In order to free the locked file, stop the scheduled task \Microsoft\Windows\Wininet\CacheTask.
The Code
public override IEnumerable<string> GetBrowsingHistoryUrls(FileInfo fileInfo)
{
    var fileName = fileInfo.FullName;
    var results = new List<string>();
    try
    {
        int pageSize;
        Api.JetGetDatabaseFileInfo(fileName, out pageSize, JET_DbInfo.PageSize);
        SystemParameters.DatabasePageSize = pageSize;
        using (var instance = new Instance("Browsing History"))
        {
            var param = new InstanceParameters(instance);
            param.Recovery = false;
            instance.Init();
            using (var session = new Session(instance))
            {
                Api.JetAttachDatabase(session, fileName, AttachDatabaseGrbit.ReadOnly);
                JET_DBID dbid;
                Api.JetOpenDatabase(session, fileName, null, out dbid, OpenDatabaseGrbit.ReadOnly);
                using (var tableContainers = new Table(session, dbid, "Containers", OpenTableGrbit.ReadOnly))
                {
                    IDictionary<string, JET_COLUMNID> containerColumns = Api.GetColumnDictionary(session, tableContainers);
                    if (Api.TryMoveFirst(session, tableContainers))
                    {
                        do
                        {
                            var retrieveColumnAsInt32 = Api.RetrieveColumnAsInt32(session, tableContainers, containerColumns["ContainerId"]);
                            if (retrieveColumnAsInt32 != null)
                            {
                                var containerId = (int)retrieveColumnAsInt32;
                                using (var table = new Table(session, dbid, "Container_" + containerId, OpenTableGrbit.ReadOnly))
                                {
                                    var tableColumns = Api.GetColumnDictionary(session, table);
                                    if (Api.TryMoveFirst(session, table))
                                    {
                                        do
                                        {
                                            var url = Api.RetrieveColumnAsString(
                                                session,
                                                table,
                                                tableColumns["Url"],
                                                Encoding.Unicode);
                                            var downloadedFileName = Api.RetrieveColumnAsString(
                                                session,
                                                table,
                                                tableColumns["Filename"]);
                                            if (string.IsNullOrEmpty(downloadedFileName)) // keep download history only
                                                continue;
                                            // Access time can be used to order results and find the last uploaded file
                                            var accessedTime = Api.RetrieveColumnAsInt64(
                                                session,
                                                table,
                                                tableColumns["AccessedTime"]);
                                            var lastVisitTime = accessedTime.HasValue ? DateTime.FromFileTimeUtc(accessedTime.Value) : DateTime.MinValue;
                                            results.Add(url);
                                        }
                                        while (Api.TryMoveNext(session, table.JetTableid));
                                    }
                                }
                            }
                        } while (Api.TryMoveNext(session, tableContainers));
                    }
                }
            }
        }
    }
    catch (Exception ex)
    {
        // log goes here....
    }
    return results;
}
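The loop above computes lastVisitTime but only collects the URL. If the goal is the single most recently used download, one hypothetical variant gathers both values and sorts afterwards (visits is a new list introduced for illustration):

// Declared next to results:
var visits = new List<(string Url, DateTime LastVisit)>();

// Inside the inner do/while, instead of results.Add(url):
visits.Add((url, lastVisitTime));

// After the scan: URL of the most recently accessed download, or null
string lastUpload = visits.OrderByDescending(v => v.LastVisit)
                          .Select(v => v.Url)
                          .FirstOrDefault();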
Utils
Task Scheduler Wrapper
You can use the Microsoft.Win32.TaskScheduler.TaskService wrapper to stop it from C#; just add this NuGet package: https://taskscheduler.codeplex.com/
Usage
public static FileInfo CopyLockedFileRtl(DirectoryInfo directory, FileInfo fileInfo, string remoteEndPoint)
{
    FileInfo copiedFileInfo = null;
    using (var ts = new TaskService(string.Format(@"\\{0}", remoteEndPoint)))
    {
        var task = ts.GetTask(@"\Microsoft\Windows\Wininet\CacheTask");
        task.Stop();
        task.Enabled = false;
        var byteArray = FileHelper.ReadOnlyAllBytes(fileInfo); // author's helper for reading the now-unlocked file
        var filePath = Path.Combine(directory.FullName, "unlockedfile.dat");
        File.WriteAllBytes(filePath, byteArray);
        copiedFileInfo = new FileInfo(filePath);
        task.Enabled = true;
        task.Run();
        task.Dispose();
    }
    return copiedFileInfo;
}
I was not able to get Adam's answer to work. What worked for me was making a copy with AlphaVSS (a .NET class library with a managed API for the Volume Shadow Copy Service). The file was in the "Dirty Shutdown" state, so I additionally wrote this to handle the exception it threw when I opened it:
catch (EsentErrorException ex)
{
    // Usually after the database is copied, it's in Dirty Shutdown state.
    // This can be verified by running "esentutl.exe /Mh WebCacheV01.dat".
    logger.Info(ex.Message);
    switch (ex.Error)
    {
        case JET_err.SecondaryIndexCorrupted:
            logger.Info("Secondary Index Corrupted detected, exiting...");
            Api.JetTerm2(instance, TermGrbit.Complete);
            return false;
        case JET_err.DatabaseDirtyShutdown:
            logger.Info("Dirty shutdown detected, attempting to recover...");
            try
            {
                Api.JetTerm2(instance, TermGrbit.Complete);
                Process.Start("esentutl.exe", "/p /o " + newPath);
                Thread.Sleep(5000);
                Api.JetInit(ref instance);
                Api.JetBeginSession(instance, out sessionId, null, null);
                Api.JetAttachDatabase(sessionId, newPath, AttachDatabaseGrbit.None);
            }
            catch (Exception e2)
            {
                logger.Info("Could not recover database " + newPath + ", will try opening it one last time. If that doesn't work, try using other esentutl commands", e2);
            }
            break;
    }
}
I'm thinking about using the 'Recent Items' folder, since an entry is written there when you select a file to upload:
C:\Users\USER\AppData\Roaming\Microsoft\Windows\Recent
string recent = (Environment.GetFolderPath(Environment.SpecialFolder.Recent));
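A hypothetical sketch building on that idea: list the shortcuts in the Recent Items folder newest-first, since each .lnk file there points at a recently used file (resolving the .lnk target itself would additionally require the Windows Script Host COM API):

string recent = Environment.GetFolderPath(Environment.SpecialFolder.Recent);
var recentItems = new DirectoryInfo(recent)
    .GetFiles("*.lnk")
    .OrderByDescending(f => f.LastWriteTimeUtc) // most recently touched first
    .Select(f => Path.GetFileNameWithoutExtension(f.Name))
    .Take(10);
foreach (var name in recentItems)
{
    Console.WriteLine(name);
}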

Dynamic resource from MemoryStream with ResourceWriter

I'm developing an app with C# and WPF, and I'm writing a resource to a MemoryStream.
var ms = new MemoryStream();
var rWriter = new ResourceWriter(ms);
rWriter.AddResource("key1", "value1");
rWriter.AddResource("key2", "value2");
rWriter.AddResource("key3", "value3");
rWriter.Generate();
rWriter.Close();
Everything works fine up to here, but I don't know how to use this resource. Can you help me with using it?
Once you have read the resource into the stream, you need to add it to the MergedDictionaries to make it available in the application. An example might look like this:
var resourceInfo = skinAssembly.GetManifestResourceInfo(resourceName);
if (resourceInfo.ResourceLocation != ResourceLocation.ContainedInAnotherAssembly)
{
    var resourceStream = skinAssembly.GetManifestResourceStream(resourceName);
    using (var resourceReader = new ResourceReader(resourceStream))
    {
        foreach (DictionaryEntry entry in resourceReader)
        {
            if (IsRelevantResource(entry, bamlResourceName))
            {
                skinBamlStreams.Add(entry.Value as Stream);
            }
        }
    }
}
The code above was taken from my demo application; the full source code is on GitHub.
protected override sealed void LoadResources()
{
    var skinResolver = PreLoadResources();
    try
    {
        var skinBamlStreams = skinResolver.GetSkinBamlStreams(_fullName, _resourceName);
        foreach (var resourceStream in skinBamlStreams)
        {
            var skinResource = BamlHelper.LoadBaml<ResourceDictionary>(resourceStream);
            if (skinResource != null)
            {
                Resources.Add(skinResource);
            }
        }
    }
    catch (Exception ex)
    {
        Console.WriteLine(ex.Message);
    }
    finally
    {
        PostLoadResources();
    }
}
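For the simpler case in the question, the string resources can also be read straight back with a ResourceReader. A minimal sketch against the question's ms: note that ResourceWriter.Close() also closes the underlying stream, so the bytes are recovered with ToArray() (which still works on a closed MemoryStream) and re-wrapped:

// using System.Collections; using System.Resources;
using (var readMs = new MemoryStream(ms.ToArray()))
using (var reader = new ResourceReader(readMs))
{
    foreach (DictionaryEntry entry in reader)
    {
        Console.WriteLine($"{entry.Key} = {entry.Value}"); // key1 = value1, ...
    }
}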
