Writing to file using values from an object in C#

My objective with this code is to use the foreach loop to go through each object and write the current string value to a txt file.
I'm using "Woof" and "Bull" as a test. Bull is the string variable in my AverageValues class.
Unfortunately, it currently will not write the value of bull to the file, however, it will create the file.
I think this is something easy to fix, I'm just can't seem to find it right now.
All help would be appreciated!
public void doStuff()
{
    AverageValues AVS = new AverageValues();
    AVS.Bull = "Woof";
    string path = "C:\\users\\kjenks11\\Averages.txt";
    FileStream NewFile = File.Create(path);
    StreamWriter writeIt = new StreamWriter(NewFile);
    List<AverageValues> AV = new List<AverageValues>();
    AV.Add(AVS);

    foreach (var value in AV)
    {
        writeIt.Write(value.Bull);
    }

    NewFile.Close();
}

You need to flush the writer for the data to reach the file. You might also consider wrapping your writer in using statements so the resources are freed when you're done.
public void doStuff()
{
    AverageValues AVS = new AverageValues();
    AVS.Bull = "Woof";
    string path = "C:\\users\\kjenks11\\Averages.txt";

    using (var NewFile = File.Create(path))
    {
        using (var writeIt = new StreamWriter(NewFile))
        {
            List<AverageValues> AV = new List<AverageValues> { AVS };
            foreach (var value in AV)
            {
                writeIt.Write(value.Bull);
            }
        }
    }
}

Flush or close the stream before closing the file itself:
foreach (var value in AV)
{
    writeIt.WriteLine(value.Bull);
}
writeIt.Flush();
writeIt.Close();
Note on style: when creating a Stream (or, more generally, any object that implements IDisposable), create it with a using statement:
using (var writeIt = new StreamWriter(NewFile))
{
    // use writeIt here - it will dispose properly
}
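
For a case this small, a minimal sketch (assuming the same AverageValues instance, path and AV list as in the question) can skip the manual FileStream entirely and let the writer create the file:
// Minimal sketch, assuming the same AverageValues class, path and AV list as above.
// new StreamWriter(path) creates (or overwrites) the file itself, and disposing
// the writer flushes and closes it.
using (var writeIt = new StreamWriter(path))
{
    foreach (var value in AV)
    {
        writeIt.WriteLine(value.Bull);
    }
}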

If you use using around your stream, there's no need to call Flush and Close explicitly. As soon as you leave the using block's scope, the stream will be closed and disposed; closing will flush for you.
public void doStuff()
{
    AverageValues AVS = new AverageValues();
    AVS.Bull = "Woof";
    string path = "C:\\users\\kjenks11\\Averages.txt";
    FileStream NewFile = File.Create(path);
    List<AverageValues> AV = new List<AverageValues>();
    AV.Add(AVS);

    using (StreamWriter writeIt = new StreamWriter(NewFile))
    {
        foreach (var value in AV)
        {
            writeIt.Write(value.Bull);
        }
    }

    NewFile.Close();
}
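
Since disposing the StreamWriter also closes the underlying stream (unless the constructor overload with leaveOpen: true is used), the trailing NewFile.Close() is redundant; a sketch that puts the FileStream in its own using block makes the ownership explicit:
// Sketch: both using blocks are safe even though disposing the StreamWriter
// already closes the underlying FileStream.
using (FileStream NewFile = File.Create(path))
using (StreamWriter writeIt = new StreamWriter(NewFile))
{
    foreach (var value in AV)
    {
        writeIt.Write(value.Bull);
    }
}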

Related

Import macro to Excel using Open XML C#

I am able to copy a VbaProjectPart from one xlsm to another xlsm through a memory stream and add it through AddNewPart. Is there any possibility to import/add only the .bas file into the xlsm file using Open XML?
Can someone assist? The code below helped to take a clone of the VBA part and update it.
private static void cloneVbaPart(string src, string dst)
{
    using (WordprocessingDocument srcDoc = WordprocessingDocument.Open(src, false))
    {
        var vbaPart = srcDoc.MainDocumentPart.VbaProjectPart;

        using (WordprocessingDocument dstDoc = WordprocessingDocument.Open(dst, true))
        {
            var partsToRemove = new List<OpenXmlPart>();
            foreach (var part in dstDoc.MainDocumentPart.GetPartsOfType<VbaProjectPart>())
            {
                partsToRemove.Add(part);
            }
            foreach (var part in dstDoc.MainDocumentPart.GetPartsOfType<CustomizationPart>())
            {
                partsToRemove.Add(part);
            }
            foreach (var part in partsToRemove)
            {
                dstDoc.MainDocumentPart.DeletePart(part);
            }

            var vbaProjectPart = dstDoc.MainDocumentPart.AddNewPart<VbaProjectPart>();
            var vbaDataPart = vbaProjectPart.AddNewPart<VbaDataPart>();

            using (Stream data = vbaPart.GetStream())
            {
                vbaProjectPart.FeedData(data);
            }
            using (Stream data = vbaPart.VbaDataPart.GetStream())
            {
                vbaDataPart.FeedData(data);
            }
        }
    }
}
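
For what it is worth, the Open XML SDK treats vbaProject.bin as an opaque binary part (the MS-OVBA compound file), so there is no SDK-level way to inject just a .bas module; that would need the VBA extensibility (VBIDE) APIs or a tool that edits the binary project. A hypothetical sketch of the same clone pattern for an .xlsm, where the part hangs off the WorkbookPart rather than a MainDocumentPart, might look like this (names and usings are assumptions, not from the original post):
// Hypothetical sketch: clone the whole VBA project into an .xlsm, assuming
// DocumentFormat.OpenXml.Packaging, System.IO and System.Linq are imported.
private static void CloneVbaPartXlsm(string src, string dst)
{
    using (SpreadsheetDocument srcDoc = SpreadsheetDocument.Open(src, false))
    using (SpreadsheetDocument dstDoc = SpreadsheetDocument.Open(dst, true))
    {
        VbaProjectPart srcVba = srcDoc.WorkbookPart.VbaProjectPart;

        // Remove any existing VBA project from the destination workbook.
        foreach (var part in dstDoc.WorkbookPart.GetPartsOfType<VbaProjectPart>().ToList())
        {
            dstDoc.WorkbookPart.DeletePart(part);
        }

        // Add a fresh part and copy the binary project into it.
        var dstVba = dstDoc.WorkbookPart.AddNewPart<VbaProjectPart>();
        using (Stream data = srcVba.GetStream())
        {
            dstVba.FeedData(data);
        }
    }
}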

NAudio code holds lock on input file

My code below holds a lock on the input file preventing me from working with it later:
public static void ToWave_WMF(string source, string destination)
{
    using (var reader = new MediaFoundationReader(source))
    using (var rateSampler = new MediaFoundationResampler(reader, new WaveFormat(DefaultEncoding.SampleRate, reader.WaveFormat.Channels)))
    using (var channelSampler = new MediaFoundationResampler(rateSampler, new WaveFormat(rateSampler.WaveFormat.SampleRate, DefaultEncoding.Channels)))
    {
        WaveFileWriter.CreateWaveFile(destination, channelSampler);
    }
}

public static string BuildWavFile(string userFileLocation)
{
    var sampleList = new List<ISampleProvider>();
    try
    {
        // Add input file
        var waveFile = AudioHelpers.ToWave_WMF(userFileLocation);
        sampleList.Add(new AudioFileReader(waveFile));
        WaveFileWriter.CreateWaveFile16(sirenWaveFile, new ConcatenatingSampleProvider(sampleList));
    }
    finally
    {
        foreach (var sample in sampleList)
        {
            ((AudioFileReader)sample).Dispose();
        }
    }
    return sirenWaveFile;
}
Am I using resources wrong? Why is this lock being held? If I delete the file right after ToWave_WMF() there is no issue. If I use sampleList.Add(new AudioFileReader(userFileLocation)); I also don't have any issues deleting the user file.

the process cannot access the file, using filewriter

Trying to check if the file is empty or not, and then write something like "the text is empty" into the document.
But whenever I do, I get
the process cannot access the file because it is being used by another process
even though I'm closing the file after the write.
What am I missing here?
StreamWriter myWriter1 = new StreamWriter(resultpath);
List<string> a = File.ReadAllLines(path).ToList();
List<string> b = File.ReadAllLines(newPath).ToList();

foreach (string s in a)
{
    Console.WriteLine(s);
    if (!b.Contains(s))
    {
        myWriter1.WriteLine(s);
        myWriter1.Close();
    }
    else
    {
        continue;
    }
}

string[] resultfile = File.ReadAllLines(resultpath);
if (resultfile == null || resultfile.Length == 0)
{
    myWriter1.WriteLine("Der er ikke nogen udmeldinger idag", true);
}
myWriter1.Close();
You can close and dispose the file writer after writing to it in the loop and re-create it when you need to write to the same file again.
Also note it is better to wrap it in a using statement to ensure it will be closed and its unmanaged resources freed automatically (so you don't need to close it in the loop again and again).
List<string> a = File.ReadAllLines(path).ToList();
List<string> b = File.ReadAllLines(newPath).ToList();

using (var myWriter1 = new StreamWriter(resultpath, false))
{
    foreach (string s in a)
    {
        Console.WriteLine(s);
        if (!b.Contains(s))
            myWriter1.WriteLine(s);
    }
}

string[] resultfile = File.ReadAllLines(resultpath);
if (resultfile == null || resultfile.Length == 0)
{
    using (var myWriter1 = new StreamWriter(resultpath, true))
    {
        myWriter1.WriteLine("Der er ikke nogen udmeldinger idag", true);
    }
}
Try this code. You were closing the StreamWriter inside the loop as soon as a line met the condition, but if no line did, you never closed it.
using (var myWriter1 = new StreamWriter(resultpath, true))
{
    List<string> a = File.ReadAllLines(path).ToList();
    List<string> b = File.ReadAllLines(newPath).ToList();
    int coincidences = 0;

    foreach (string s in a)
    {
        Console.WriteLine(s);
        if (!b.Contains(s))
        {
            myWriter1.WriteLine(s);
            coincidences++;
        }
    }

    if (coincidences == 0)
    {
        myWriter1.WriteLine("Der er ikke nogen udmeldinger idag", true);
    }
}
Also, note that for IDisposable objects it's better to enclose them in a using statement, as it disposes the resources when finished.
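
If the end goal is simply to write the lines of path that are missing from newPath, or a placeholder line when there are none, a shorter sketch (assuming the same path, newPath and resultpath variables) avoids managing a StreamWriter at all. Note that Except also drops duplicate lines, unlike the Contains check:
// Sketch: File.WriteAllLines creates or overwrites resultpath and closes it on return.
// Requires System.Linq for Except.
var missing = File.ReadAllLines(path).Except(File.ReadAllLines(newPath)).ToList();
if (missing.Count == 0)
{
    missing.Add("Der er ikke nogen udmeldinger idag");
}
File.WriteAllLines(resultpath, missing);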

Threading and SqlFileStream. The process cannot access the file specified because it has been opened in another transaction

I am extracting the content of the files in a SQL FileTable. The following code works if I do not use Parallel.
I get the following exception when reading the SQL file streams simultaneously (in Parallel):
The process cannot access the file specified because it has been opened in another transaction.
TL;DR:
When reading a file from FileTable (using GET_FILESTREAM_TRANSACTION_CONTEXT) in a Parallel.ForEach I get the above exception.
Sample Code for you to try out:
https://gist.github.com/NerdPad/6d9b399f2f5f5e5c6519
Longer Version:
Fetch Attachments, and extract content:
var documents = new List<ExtractedContent>();
using (var ts = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    var attachments = await dao.GetAttachmentsAsync();

    // Extract the content simultaneously
    // documents = attachments.ToDbDocuments().ToList(); // This works
    Parallel.ForEach(attachments, a => documents.Add(a.ToDbDocument())); // this doesn't

    ts.Complete();
}
DAO Read File Table:
public async Task<IEnumerable<SearchAttachment>> GetAttachmentsAsync()
{
    try
    {
        var commandStr = "....";
        IEnumerable<SearchAttachment> attachments = null;

        using (var connection = new SqlConnection(this.DatabaseContext.Database.Connection.ConnectionString))
        using (var command = new SqlCommand(commandStr, connection))
        {
            connection.Open();
            using (var reader = await command.ExecuteReaderAsync())
            {
                attachments = reader.ToSearchAttachments().ToList();
            }
        }

        return attachments;
    }
    catch (System.Exception)
    {
        throw;
    }
}
Create objects for each file:
The object contains a reference to the GET_FILESTREAM_TRANSACTION_CONTEXT
public static IEnumerable<SearchAttachment> ToSearchAttachments(this SqlDataReader reader)
{
    if (!reader.HasRows)
    {
        yield break;
    }

    // Convert each row to SearchAttachment
    while (reader.Read())
    {
        yield return new SearchAttachment
        {
            ...
            ...
            UNCPath = reader.To<string>(Constants.UNCPath),
            ContentStream = reader.To<byte[]>(Constants.Stream) // GET_FILESTREAM_TRANSACTION_CONTEXT()
            ...
            ...
        };
    }
}
Read the file using SqlFileStream:
Exception is thrown here
public static ExtractedContent ToDbDocument(this SearchAttachment attachment)
{
    // Read the file
    // Exception is thrown here
    using (var stream = new SqlFileStream(attachment.UNCPath, attachment.ContentStream, FileAccess.Read, FileOptions.SequentialScan, 4096))
    {
        ...
        // extract content from the file
    }
    ....
}
Update 1:
According to this article it seems like it could be an isolation level issue. Has anyone ever faced a similar issue?
The transaction does not flow into the Parallel.ForEach; you must manually bring the transaction in.
//Switched to a thread safe collection.
var documents = new ConcurrentQueue<ExtractedContent>();
using (var ts = new TransactionScope(TransactionScopeAsyncFlowOption.Enabled))
{
    var attachments = await dao.GetAttachmentsAsync();

    //Grab a reference to the current transaction.
    var transaction = Transaction.Current;

    Parallel.ForEach(attachments, a =>
    {
        //Spawn a dependent clone of the transaction
        using (var depTs = transaction.DependentClone(DependentCloneOption.RollbackIfNotComplete))
        {
            documents.Enqueue(a.ToDbDocument());
            depTs.Complete();
        }
    });

    ts.Complete();
}
I also switched from List<ExtractedContent> to ConcurrentQueue<ExtractedContent> because you are not allowed to call .Add() on a list from multiple threads at the same time.
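
An alternative sketch under the same assumptions (each work item wrapped in its own dependent clone of the ambient transaction, inside the outer TransactionScope) is to avoid the shared collection entirely and let PLINQ materialize the results:
// Hypothetical PLINQ variant: no shared mutable list, each projection still
// runs inside its own dependent clone of the ambient transaction.
var transaction = Transaction.Current;
var documents = attachments
    .AsParallel()
    .Select(a =>
    {
        using (var depTs = transaction.DependentClone(DependentCloneOption.RollbackIfNotComplete))
        {
            var doc = a.ToDbDocument();
            depTs.Complete();
            return doc;
        }
    })
    .ToList();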

The process cannot access the file because it is being used by another process. XML

I am calling the method below in a loop with the same xmlRequestPath and xmlResponsePath files. It executes fine for two loop iterations; in the third iteration I get the exception "The process cannot access the file because it is being used by another process.".
public static void UpdateBatchID(String xmlRequestPath, String xmlResponsePath)
{
    String batchId = "";
    XDocument requestDoc = null;
    XDocument responseDoc = null;

    lock (locker)
    {
        using (var sr = new StreamReader(xmlRequestPath))
        {
            requestDoc = XDocument.Load(sr);
            var element = requestDoc.Root;
            batchId = element.Attribute("BatchID").Value;
            if (batchId.Length >= 16)
            {
                batchId = batchId.Remove(0, 16).Insert(0, DateTime.Now.ToString("yyyyMMddHHmmssff"));
            }
            else if (batchId != "") { batchId = DateTime.Now.ToString("yyyyMMddHHmmssff"); }
            element.SetAttributeValue("BatchID", batchId);
        }
        using (var sw = new StreamWriter(xmlRequestPath))
        {
            requestDoc.Save(sw);
        }
        using (var sr = new StreamReader(xmlResponsePath))
        {
            responseDoc = XDocument.Load(sr);
            var elementResponse = responseDoc.Root;
            elementResponse.SetAttributeValue("BatchID", batchId);
        }
        using (var sw = new StreamWriter(xmlResponsePath))
        {
            responseDoc.Save(sw);
        }
    }

    Thread.Sleep(500);
    requestDoc = null;
    responseDoc = null;
}
The exception occurs at using (var sw = new StreamWriter(xmlResponsePath)) in the above code.
Exception:
The process cannot access the file 'D:\Projects\ESELServer20130902\trunk\Testing\ESL Server Testing\ESLServerTesting\ESLServerTesting\TestData\Assign\Expected Response\Assign5kMACResponse.xml' because it is being used by another process.
Maybe at the third iteration the stream is still being closed, so it tells you that it is not accessible. Try waiting a bit before calling it again in the loop, for example:
while (...)
{
    UpdateBatchID(xmlRequestPath, xmlResponsePath);
    System.Threading.Thread.Sleep(500);
}
Or close the stream explicitly instead of leaving the work to the garbage collector:
var sr = new StreamReader(xmlResponsePath);
responseDoc = XDocument.Load(sr);
....
sr.Close();
Instead of using two streams, one for writing and one for reading, try using a single FileStream, since the problem might be that after loading the file the stream remains open until the garbage collector runs.
using (var f = new FileStream(xmlResponsePath, FileMode.Open, FileAccess.ReadWrite))
{
    responseDoc = XDocument.Load(f);
    var elementResponse = responseDoc.Root;
    elementResponse.SetAttributeValue("BatchID", batchId);

    // Truncate the stream before saving the updated document back to the same file.
    f.SetLength(0);
    responseDoc.Save(f);
}
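
An even simpler sketch, assuming the same batchId value has already been computed, is to let XDocument open and close the files itself, so no reader or writer handles are left for the garbage collector:
// Sketch: XDocument.Load(path) and Save(path) manage the file handles internally.
var responseDoc = XDocument.Load(xmlResponsePath);
responseDoc.Root.SetAttributeValue("BatchID", batchId);
responseDoc.Save(xmlResponsePath);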
