I am writing an application which creates files based on a timestamp every second, then moves them to another folder, then sends them as a POST to a web service which saves them to a folder.
When running the generating function alone, it generates the files successfully.
When running the upload function alone, it uploads them successfully.
But when running both of them on BackgroundWorker components, the first works perfectly, but the upload mechanism tells me that the file is opened by another process.
How can I solve that?
Thanks
A good practice when dealing with classes that implement the IDisposable interface, such as the FileStream class, is to wrap their usage in a using statement. From MSDN:
// Create the file.
using (FileStream fs = File.Create(path))
{
    AddText(fs, "This is some text");
    AddText(fs, "This is some more text,");
    AddText(fs, "\r\nand this is on a new line");
    AddText(fs, "\r\n\r\nThe following is a subset of characters:\r\n");

    for (int i = 1; i < 120; i++)
    {
        AddText(fs, Convert.ToChar(i).ToString());
    }
}

// Helper from the same MSDN sample: encodes the string and writes it to the stream.
private static void AddText(FileStream fs, string value)
{
    byte[] info = new UTF8Encoding(true).GetBytes(value);
    fs.Write(info, 0, info.Length);
}
Another thing you should be aware of is multi-threading synchronization. Maybe your "upload" background worker is trying to access the file before your "generate file" background worker has had time to finish creating it; one common way to avoid that race is sketched below.
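A minimal sketch of that pattern, assuming the generator and uploader use separate folders (the folder and file names here are hypothetical): the generator writes each file under a temporary name and only moves it into the folder the uploader watches after the stream is closed, so the uploader never sees a half-written file.

// Sketch: generator side. Write to a .tmp name first, then move the
// finished file into the folder the upload worker scans. File.Move on
// the same volume is effectively atomic.
string tempPath = Path.Combine(workFolder, name + ".tmp");   // hypothetical paths
string finalPath = Path.Combine(uploadFolder, name);

using (FileStream fs = File.Create(tempPath))
{
    // ... write the file contents ...
}   // the handle is released here

File.Move(tempPath, finalPath);   // only now does the uploader see the file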
I have a routine that reads XPS documents and chops off pages to separate documents. Originally it read one document, decided how to chop it, closed it and wrote out the new files.
Features were added; this was causing headaches with cleaning up old files before running the routine, so I saved all the chopped pieces to be written out at the end.
ChoppedXPS is a dictionary: the key is the filename, the data is the FixedDocument prepared from the original:
foreach (String OneReport in ChoppedXPS.Keys)
{
    File.Delete(OneReport);
    using (XpsDocument TargetFile = new XpsDocument(OneReport, FileAccess.ReadWrite))
    {
        XpsDocumentWriter Writer = XpsDocument.CreateXpsDocumentWriter(TargetFile);
        Writer.Write(ChoppedXPS[OneReport]);
        Logger($"{OneReport} written to disk", 2);
    }
    Application.DoEvents();
}
If the FixedDocument being written out here contains graphics, the source file is opened by the Writer.Write line and left open until the program is closed.
The XpsDocumentWriter does not seem to implement anything that can be used to clean it up.
(Yeah, that Application.DoEvents is ugly--this is an in-house program used by two people, and it's not worth the hassle of making this run in the background; without it, a big enough task can cause Windows to decide the program is non-responsive and kill it. And, yes, I know how to indent--I took the indents out to make it all fit this screen.)
.Net 4.5, using some C# 8.0 features.
I found a workaround for this problem. I'm not going to try to post the whole thing, as I had to change the whole data handling, but here is the heart of it:
using (XpsDocument Source = new XpsDocument(SourceFile, FileAccess.Read))
{
    [the using loop from my question]
}
I'm still hoping for an explanation, and for something more appropriate than this approach.
Yes--this produces a warning that Source is unused, but the compiler isn't eliminating it, so it does work.
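(One way to avoid that warning, sketched here without having tested it against this exact scenario: the using statement also accepts a bare expression, so the variable can be dropped entirely while the document is still held open for the duration of the block.)

// Keeps the source document open for the whole block without naming it,
// which avoids the "unused variable" warning.
using (new XpsDocument(SourceFile, FileAccess.Read))
{
    [the using loop from my question]
}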
So, the title may be misleading. I am building an Android app that reads information from a text file located on a cloud server (I would prefer to use OneDrive, Dropbox, or Google Drive [whichever is easiest]; others are fine). Periodically, the program will write information back to the text file, still located on the cloud server. So, my question is twofold: Is it possible to read and write to a text file that is located on a cloud server? If so, how in the world would I complete this task? I have noticed the use of WebClient, but I can't find a reasonable method or explanation of how it works. This program is coded in C#. This is what I have so far:
private string filename = "datafile.txt";
private List<Category> myList; // A list of an object that I developed ('Category')

// Allow the user interface to handle the error
public void readDatabase() {
    // Here is where the magic has to occur, in order to read the file
    ...
    // The usual reader that I use to read standard text files
    StreamReader fileReader = new StreamReader(filename);
    string line = "";
    while ((line = fileReader.ReadLine()) != null)
        // convertToCategory is my private method to convert the string to a 'Category'
        myList.Add(convertToCategory(line));
    fileReader.Close();
}

public void writeDatabase() {
    // Here is where the magic has to occur, in order to write to the file
    ...
    // The usual writer that I use to write standard text files
    StreamWriter fileWriter = new StreamWriter(filename);
    for (int i = 0; i < this.myList.Count; i++)
        // ToString() is something developed in my object called 'Category'
        fileWriter.WriteLine(myList[i].ToString());
    fileWriter.Close();
}
I would love to use Google Drive as my cloud server, but I am open to other possibilities, if necessary. I just want an easy and efficient method to read/write to the text file.
Possible Implementations:
I have seen possible solutions where the file is downloaded locally, read like normal, and then uploaded when closing. However, if I can get away with it, I don't want the text file to be downloaded to the device at all (see the sketch after this list).
I have also seen several places where a SQL database is used in this situation. Unfortunately, I don't have any knowledge of developing with SQL, so while a SQL server might be ideal (speed is very important for this application), it will be difficult for me to understand how it works.
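To the first half of the question: yes, it is possible, provided the file is reachable over HTTP (each provider differs; Dropbox and Google Drive both require an API token or a shared link rather than a plain URL). Below is a minimal sketch using HttpClient against a hypothetical direct-access URL; it reads the text straight from the response stream, so nothing is saved to local storage.

using System.Collections.Generic;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: read the remote text file line by line and write it back.
// The URL is hypothetical; each cloud provider has its own API and auth.
private const string FileUrl = "https://example.com/files/datafile.txt";

public async Task<List<string>> ReadRemoteLinesAsync(HttpClient client)
{
    var lines = new List<string>();
    using (Stream stream = await client.GetStreamAsync(FileUrl))
    using (var reader = new StreamReader(stream))
    {
        string line;
        while ((line = await reader.ReadLineAsync()) != null)
            lines.Add(line);   // e.g. convertToCategory(line)
    }
    return lines;
}

public async Task WriteRemoteAsync(HttpClient client, string contents)
{
    // Many provider APIs accept an authorized PUT/POST of the new file body.
    using (var body = new StringContent(contents))
    {
        HttpResponseMessage response = await client.PutAsync(FileUrl, body);
        response.EnsureSuccessStatusCode();
    }
}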
I made a program in C# that processes about 30 zipped folders containing about 35,000 files in total. My purpose is to read every single file and process its information. As of now, my code extracts all the folders and then reads the files. The problem is that the whole process takes about 15-20 minutes, which is a lot.
I am using the following code to extract files:
void ExtractFile(string zipfile, string path)
{
    ZipFile zip = ZipFile.Read(zipfile);
    zip.ExtractAll(path);
}
The extraction part is the one that takes the most time. I need to reduce this time. Is there a way I can read the contents of the files inside the zipped folders without extracting them? Or does anyone know another way to reduce the time this code takes?
Thanks in advance
You could try reading each entry into a memory stream instead of to the file system:
using (ZipFile zip = ZipFile.Read(zipfile))   // dispose the archive when done
{
    foreach (ZipEntry entry in zip.Entries)
    {
        using (MemoryStream ms = new MemoryStream())
        {
            entry.Extract(ms);
            ms.Seek(0, SeekOrigin.Begin);
            // read from the stream
        }
    }
}
Maybe instead of extracting to the hard disk, you should try reading it without extraction, using ZipFile.OpenRead and then the ZipArchiveEntry.Open method.
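A minimal sketch of that approach, assuming .NET 4.5 with references to System.IO.Compression and System.IO.Compression.FileSystem:

using System.IO;
using System.IO.Compression;

// Sketch: enumerate the archive and read each entry as a stream,
// without writing anything to disk.
using (ZipArchive archive = ZipFile.OpenRead(zipfile))
{
    foreach (ZipArchiveEntry entry in archive.Entries)
    {
        using (Stream stream = entry.Open())
        using (var reader = new StreamReader(stream))
        {
            string contents = reader.ReadToEnd();
            // ... process contents ...
        }
    }
}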
Also have a look at the CodeFluent Runtime tools, which claim to address performance issues.
Try to break your work into async methods that each await a single operation, and start them one by one if one of the responses takes longer than 50 ms. http://msdn.microsoft.com/en-us/library/hh191443.aspx
If we have, for example, 10 operations that are currently called one after another, with async/await we can start them all in parallel, and the total time will depend only on the server's capacity.
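A minimal sketch of that idea (UploadChunkAsync is a hypothetical async wrapper around one server call): start every call first, then await them together, so the slowest response bounds the total time instead of the sum.

// Sketch: inside an async method, fire off all requests, then await them together.
var tasks = new List<Task<bool>>();
for (int i = 0; i < 10; i++)
{
    tasks.Add(UploadChunkAsync(chunks[i]));   // starts immediately, no await yet
}
bool[] results = await Task.WhenAll(tasks);   // total time ~ the slowest call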
I'm uploading big files by dividing them into chunks (small parts) on my ASMX web service (ASMX doesn't support streaming; I have not found another way):
bool UploadChunk(byte[] bytes, string path, string md5)
{
    ...
    using (FileStream fs = new FileStream(tempPath, FileMode.Append))
    {
        fs.Write(bytes, 0, bytes.Length);
    }
    ...
    return status;
}
but on some files, after ~20-50 invocations, I catch this error: The process cannot access the file because it is being used by another process.
I suspect that this is related to Windows not releasing the file. Any idea how to get rid of this boring error?
EDIT
the requests execute sequentially and synchronously
EDIT2
the client code looks like:
_service.StartUpload(path);
...
do
{
    ..
    bool status = _service.UploadChunk(buf, path, md5);
    if (!status) return Status.Failed;
    ..
}
while (bytesRead > 0);
_service.CheckFile(path, md5);
Each request is handled independently. The process still accessing the file may be the previous request.
In general, you should use file transfer protocols to transfer files. ASMX is not good for that.
And, I presume you have a good reason to not use WCF?
Use WhoLockMe at the moment the error occurs to check who is using the file. You could put the application into debug mode and hold it at a breakpoint to do this. In all probability it will be your own process.
Also try adding a delay after each transfer (and before the next) to see if it helps. Maybe your transfers are too fast, and the stream is still in use or being flushed when the next transfer comes in.
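If a delay does help, a small retry loop around the open is one hedged way to ride out a handle that hasn't been released yet (the attempt count and delay below are arbitrary):

// Sketch: retry the append a few times if the file is still locked.
const int maxAttempts = 5;   // arbitrary
for (int attempt = 1; ; attempt++)
{
    try
    {
        using (var fs = new FileStream(tempPath, FileMode.Append))
        {
            fs.Write(bytes, 0, bytes.Length);
        }
        break;   // success
    }
    catch (IOException)
    {
        if (attempt >= maxAttempts) throw;
        Thread.Sleep(100);   // wait for the previous handle to be released
    }
}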
Option 1: Get the requirements changed so you don't have to do this using ASMX. WCF supports a streaming model that I'm about to experiment with; it should be much more effective for what you want.
Option 2: Look into WSE 3.0. I haven't looked at it much, but I think it extends ASMX web services to support things like DIME and MTOM, which are designed for transferring files, so that may help.
Option 3: Set the system up so that each call writes its piece of the file to a different filename, then write code to rejoin everything at the end (sketched below).
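A minimal sketch of option 3 (the .partN naming scheme is hypothetical): each chunk gets its own file, so no two requests ever touch the same handle, and a final call stitches the parts together in order.

// Sketch: write each chunk to its own file, keyed by chunk index.
void SaveChunk(byte[] bytes, string path, int index)
{
    File.WriteAllBytes(path + ".part" + index, bytes);
}

// Called once at the end: rejoin the parts in order.
void JoinChunks(string path, int count)
{
    using (var output = new FileStream(path, FileMode.Create))
    {
        for (int i = 0; i < count; i++)
        {
            byte[] part = File.ReadAllBytes(path + ".part" + i);
            output.Write(part, 0, part.Length);
            File.Delete(path + ".part" + i);
        }
    }
}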
Use this for creating a file; if you want to append, use FileMode.Append instead. FileShare.ReadWrite lets other processes open the file at the same time, which avoids the "being used by another process" error:
var filestream = new FileStream(name, FileMode.OpenOrCreate, FileAccess.ReadWrite, FileShare.ReadWrite);
I am working on a PC at 192.168.2.200, where I have made a simple C# Windows application.
I want to create a text file on the D: drive of a network PC whose IP is 192.168.2.201, whose username is abc and whose password is 123, from C# code in the Windows application.
How will I create the file on the network PC? Can anyone help me?
System.IO.File.Create(@"D:\myfile.txt");
OR
System.IO.File.WriteAllText(@"D:\myfile.txt", "Hello this is my File");
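Note that both of those write to the local D: drive. To reach the other machine, the usual route is a UNC path to a share on it; a sketch, assuming a hypothetical share named "share" already exists on 192.168.2.201 and the calling account (or a connection mapped beforehand with net use or WNetAddConnection2) has write access:

// Sketch: write to a shared folder on the remote PC via a UNC path.
// "share" must already be set up on 192.168.2.201 with write access.
System.IO.File.WriteAllText(@"\\192.168.2.201\share\myfile.txt",
                            "Hello this is my File");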
Windows application or not, you'd use the same technique. If you don't care what the format is, but want to use it as an input file that you read from and write to, then you might want to look into XmlSerializer, along with a custom class that you write (with username/password/IP properties).
Here is a simple tutorial on XML serialization:
http://www.switchonthecode.com/tutorials/csharp-tutorial-xml-serialization
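A minimal sketch of that idea (the ConnectionInfo class and its property names are hypothetical):

using System.IO;
using System.Xml.Serialization;

// Hypothetical settings class with the properties mentioned above.
public class ConnectionInfo
{
    public string UserName { get; set; }
    public string Password { get; set; }
    public string IP { get; set; }
}

// Serialize the settings to disk, then read them back.
var serializer = new XmlSerializer(typeof(ConnectionInfo));

using (var stream = File.Create("settings.xml"))
{
    serializer.Serialize(stream, new ConnectionInfo
    {
        UserName = "abc", Password = "123", IP = "192.168.2.201"
    });
}

using (var stream = File.OpenRead("settings.xml"))
{
    var settings = (ConnectionInfo)serializer.Deserialize(stream);
}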
If you want a custom format, and only want to write to it (such as a log file), then you can do this:
var file = System.IO.File.Create("file.txt");
var writer = new System.IO.StreamWriter(file);
writer.WriteLine("fjkldsaf");
Or use the overload for StreamWriter that takes a filename:
var writer = new System.IO.StreamWriter("otherfile.txt");
writer.WriteLine("some text");
Keep in mind that writing a password in clear text onto your hard drive is not very secure (same with clear-text over the network, though I know you're not asking about that).
Make sure you either call Dispose or Close on your file streams when you're done with them. You can stick them in a using block to do this automatically (even if an exception is accidentally thrown in your file writing code).
using(var writer = new System.IO.StreamWriter("otherfile.txt"))
{
writer.WriteLine("some text");
}