I am working on a PC at 192.168.2.200.
I have made a simple C# Windows application on 192.168.2.200.
I want to create a text file on the D: drive of a network PC whose IP is 192.168.2.201, with username abc and password 123, from C# code in a Windows application.
How will I create a file on the network PC?
Can anyone help me?
System.IO.File.Create(@"D:\myfile.txt");
OR
System.IO.File.WriteAllText(@"D:\myfile.txt", "Hello this is my File");
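Note that a local path like D:\myfile.txt resolves on the machine running the code, not on 192.168.2.201. One way to reach the remote drive is through a UNC path to a share on that machine, authenticating first. A minimal sketch, assuming D: is exposed on the remote PC as a share named "shared" (the share name is an assumption) and that the net use command is available:
// Authenticate against the remote machine with the credentials from the question.
// "shared" is a hypothetical share name; D:\ must actually be shared for this to work.
var psi = new System.Diagnostics.ProcessStartInfo("net", @"use \\192.168.2.201\shared 123 /user:abc")
{
    UseShellExecute = false,
    CreateNoWindow = true
};
using (var process = System.Diagnostics.Process.Start(psi))
{
    process.WaitForExit();
}
// Once the connection exists, a UNC path behaves like a local path.
System.IO.File.WriteAllText(@"\\192.168.2.201\shared\myfile.txt", "Hello this is my File");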
Windows application or not, you'd use the same technique. If you don't care what the format is, but want to use it as an input file that you read/write from, then you might want to look into XmlSerializer, along with a custom class that you write (with username/password/IP properties).
Here is a simple tutorial on XML serialization:
http://www.switchonthecode.com/tutorials/csharp-tutorial-xml-serialization
If you want a custom format, and only want to write to it (such as a log file), then you can do this:
var file = System.IO.File.Create("file.txt");
var writer = new System.IO.StreamWriter(file);
writer.WriteLine("fjkldsaf");
Or use the overload for StreamWriter that takes a filename:
var writer = new System.IO.StreamWriter("otherfile.txt");
writer.WriteLine("some text");
Keep in mind that writing a password in clear text onto your hard drive is not very secure (same with clear-text over the network, though I know you're not asking about that).
Make sure you either call Dispose or Close on your file streams when you're done with them. You can stick them in a using block to do this automatically (even if an exception is accidentally thrown in your file writing code).
using(var writer = new System.IO.StreamWriter("otherfile.txt"))
{
writer.WriteLine("some text");
}
So, the title may be misleading. I am building an android app that reads information from a text file, which is located on a cloud server (I would prefer to use either OneDrive, DropBox, or Google Drive [whichever is easiest]; others are fine). Periodically, the program will write information to the text file, still located on the cloud server. So, my question is twofold: Is it possible to read and write to a text file that is located on a cloud server? If so, how in the world would I complete this task? I have noticed the use of WebClient but I can't find a reasonable method or explanation on how this works. This program is coded in C#. This is what I have so far:
private string filename = "datafile.txt";
private List<Category> myList; //A list of an object that I developed ('Category')

//Allow the user interface to handle the error
public void readDatabase() {
    //Here is where the magic has to occur, in order to read the file
    ...
    //The usual reader that I use to read standard text files
    StreamReader fileReader = new StreamReader(filename);
    string line = "";
    while ((line = fileReader.ReadLine()) != null)
        //convertToCategory is my private method to convert the string to a 'Category'
        myList.Add(convertToCategory(line));
    fileReader.Close();
}

public void writeDatabase() {
    //Here is where the magic has to occur, in order to write to the file
    ...
    //The usual writer that I use to write standard text files
    StreamWriter fileWriter = new StreamWriter(filename);
    for (int i = 0; i < this.myList.Count; i++)
        //toString() is something that was developed in my object called 'Category'
        fileWriter.WriteLine(this.myList[i].toString());
    fileWriter.Close();
}
I would love to use Google Drive as my cloud server, but I am open to other possibilities, if necessary. I just want an easy and efficient method to read/write to the text file.
Possible Implementations:
I have seen possible solutions where the file is downloaded locally, read like normal, and then uploaded again at closing time. However, if I can get away with it, I don't want the text file to be downloaded.
I have also seen several places where a SQL database is used in this situation. Using a SQL server might be ideal (because speed is very important for this application), but the unfortunate thing is that I don't have any knowledge of developing with SQL, so it will be difficult for me to understand how it works.
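For what it's worth, the WebClient approach mentioned in the question boils down to plain HTTP requests. A minimal sketch of the read side, assuming the file is exposed through a direct-download URL (the URL below is hypothetical, and writing back requires whichever upload API or endpoint the chosen cloud provider offers):
public List<Category> readDatabaseFromUrl()
{
    var list = new List<Category>();
    using (var client = new System.Net.WebClient())
    {
        // Download the whole text file into memory; nothing is written to local storage.
        string content = client.DownloadString("https://example.com/path/to/datafile.txt");
        foreach (string line in content.Split(new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries))
        {
            list.Add(convertToCategory(line)); // same private helper as in the question
        }
    }
    return list;
}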
This is a tricky question. I suspect it will require some advanced knowledge of file systems to answer.
I have a WPF application, "App1," targeting .NET framework 4.0. It has a Settings.settings file that generates a standard App1.exe.config file where default settings are stored. When the user modifies settings, the modifications go in AppData\Roaming\MyCompany\App1\X.X.0.0\user.config. This is all standard .NET behavior. However, on occasion, we've discovered that the user.config file on a customer's machine isn't what it's supposed to be, which causes the application to crash.
The problem looks like this: user.config is about the size it should be if it were filled with XML, but instead of XML it's just a bunch of NUL characters. It's character 0 repeated over and over again. We have no information about what had occurred leading up to this file modification.
We can fix that problem on a customer's device if we just delete user.config because the Common Language Runtime will just generate a new one. They'll lose the changes they've made to the settings, but the changes can be made again.
However, I've encountered this problem in another WPF application, "App2," with another XML file, info.xml. This time it's different because the file is generated by my own code rather than by the CLR. The common themes are that both are C# WPF applications, both are XML files, and in both cases we are completely unable to reproduce the problem in our testing. Could this have something to do with the way C# applications interact with XML files or files in general?
Not only can we not reproduce the problem in our current applications, but I can't even reproduce the problem by writing custom code that generates errors on purpose. I can't find a single XML serialization error or file access error that results in a file that's filled with nulls. So what could be going on?
App1 accesses user.config by calling Upgrade() and Save() and by getting and setting the properties. For example:
if (Settings.Default.UpgradeRequired)
{
Settings.Default.Upgrade();
Settings.Default.UpgradeRequired = false;
Settings.Default.Save();
}
App2 accesses info.xml by serializing and deserializing the XML:
public Info Deserialize(string xmlFile)
{
if (File.Exists(xmlFile) == false)
{
return null;
}
XmlSerializer xmlReadSerializer = new XmlSerializer(typeof(Info));
Info overview = null;
using (StreamReader file = new StreamReader(xmlFile))
{
overview = (Info)xmlReadSerializer.Deserialize(file);
file.Close();
}
return overview;
}
public void Serialize(Info infoObject, string fileName)
{
XmlSerializer writer = new XmlSerializer(typeof(Info));
using (StreamWriter fileWrite = new StreamWriter(fileName))
{
writer.Serialize(fileWrite, infoObject);
fileWrite.Close();
}
}
We've encountered the problem on both Windows 7 and Windows 10. When researching the problem, I came across this post where the same XML problem was encountered in Windows 8.1: Saved files sometime only contains NUL-characters
Is there something I could change in my code to prevent this, or is the problem too deep within the behavior of .NET?
It seems to me that there are three possibilities:
The CLR is writing null characters to the XML files.
The file's memory address pointer gets switched to another location without moving the file contents.
The file system attempts to move the file to another memory address and the file contents get moved but the pointer doesn't get updated.
I feel like 2 and 3 are more likely than 1. This is why I said it may require advanced knowledge of file systems.
I would greatly appreciate any information that might help me reproduce, fix, or work around the problem. Thank you!
It's well known that this can happen if there is power loss. This occurs after a cached write that extends a file (it can be a new or existing file), and power loss occurs shortly thereafter. In this scenario the file has 3 expected possible states when the machine comes back up:
1) The file doesn't exist at all or has its original length, as if the write never happened.
2) The file has the expected length as if the write happened, but the data is zeros.
3) The file has the expected length and the correct data that was written.
State 2 is what you are describing. It occurs because when you do the cached write, NTFS initially just extends the file size accordingly but leaves VDL (valid data length) untouched. Data beyond VDL always reads back as zeros. The data you were intending to write is sitting in memory in the file cache. It will eventually get written to disk, usually within a few seconds, and following that VDL will get advanced on disk to reflect the data written. If power loss occurs before the data is written or before VDL gets increased, you will end up in state 2.
This is fairly easy to repro, for example by copying a file (the copy engine uses cached writes), and then immediately pulling the power plug on your computer.
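If you want to narrow the window for state 2 in your own file writing (this won't help for user.config, which the configuration system writes for you), one common pattern is to write to a temporary file, force it to disk, and then swap it into place. A minimal sketch of that idea, not taken from the code above:
public static void SafeWriteAllText(string path, string contents)
{
    string tempPath = path + ".tmp";
    using (var stream = new FileStream(tempPath, FileMode.Create, FileAccess.Write, FileShare.None))
    using (var writer = new StreamWriter(stream))
    {
        writer.Write(contents);
        writer.Flush();
        // Push the data out of the file cache and onto the disk before moving on.
        stream.Flush(flushToDisk: true);
    }
    if (File.Exists(path))
    {
        // Replace the old file with the fully written temporary file.
        File.Replace(tempPath, path, null);
    }
    else
    {
        File.Move(tempPath, path);
    }
}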
I had a similar problem and I was able to trace it to a corrupted HDD.
Description of my problem (all related information):
Disks attached to the mainboard (SATA):
SSD (system),
3 * HDD.
One of the HDDs had bad blocks, and there were even problems reading the disk structure (directories and file listing).
Operating system: Windows 7 x64
File system (on all disks): NTFS
When the system tried to read or write to the corrupted disk (user request, automatic scan, or any other reason) and the attempt failed, all write operations (to the other disks) were incorrect. The files created on the system disk (mostly configuration files written by other applications) looked valid on a direct check of their content right after writing (probably because the data was still cached in RAM).
Unfortunately, after a restart, all the files written after the failed read/write access to the corrupted drive had the correct size, but their content was all zero bytes (exactly like in your case).
Try to rule out hardware-related problems. You can try copying the file (after a change) to a different machine (or uploading it to the web/FTP), or try saving fixed content to a specific file. If the copy on the other machine is correct, or if the fixed-content file comes back empty, the cause is probably on the local machine. Try swapping hardware components, or reinstall the system.
There is no documented reason for this behavior; it happens to users, but nobody can tell the origin of these odd conditions.
It might be a CLR problem, although that is very unlikely: the CLR doesn't just write null characters, and an XML document cannot contain null characters unless xsi:nil is defined for the nodes.
Anyway, the only documented way to fix this is to delete the corrupted file, using code like this:
try
{
ConfigurationManager.OpenExeConfiguration(ConfigurationUserLevel.PerUserRoamingAndLocal);
}
catch (ConfigurationErrorsException ex)
{
string filename = ex.Filename;
_logger.Error(ex, "Cannot open config file");
if (File.Exists(filename) == true)
{
_logger.Error("Config file {0} content:\n{1}", filename, File.ReadAllText(filename));
File.Delete(filename);
_logger.Error("Config file deleted");
Properties.Settings.Default.Upgrade();
// Properties.Settings.Default.Reload();
// you could optionally restart the app instead
}
else
{
_logger.Error("Config file {0} does not exist", filename);
}
}
It will restore user.config via Properties.Settings.Default.Upgrade(), this time without null values.
I ran into a similar issue, but on a server. The server restarted while a program was writing to a file, which caused the file to contain all null characters and become unusable to the program writing to and reading from it. The server logs showed the restart, and the corrupted file's last-modified timestamp matched the time of the restart.
I have the same problem: there is an extra NUL character at the end of the serialized XML file.
I am using XmlWriter like this:
using (var stringWriter = new Utf8StringWriter())
{
using (var xmlWriter = XmlWriter.Create(stringWriter, new XmlWriterSettings { Indent = true, IndentChars = "\t", NewLineChars = "\r\n", NewLineHandling = NewLineHandling.Replace }))
{
xmlSerializer.Serialize(xmlWriter, data, nameSpaces);
xml = stringWriter.ToString();
var xmlDocument = new XmlDocument();
xmlDocument.LoadXml(xml);
if (removeEmptyNodes)
{
RemoveEmptyNodes(xmlDocument);
}
xml = xmlDocument.InnerXml;
}
}
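Utf8StringWriter isn't shown in the snippet; a common definition for that kind of helper (an assumption about the poster's code, not taken from it) is a StringWriter that reports UTF-8 as its encoding:
public class Utf8StringWriter : StringWriter
{
    // StringWriter reports UTF-16 by default; override it so the XML declaration says UTF-8.
    public override Encoding Encoding
    {
        get { return Encoding.UTF8; }
    }
}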
I'm working on a UWP application where a user can input data which is placed in a ListView. All fine and dandy, but how can I save the user data to a separate file and load it the next time the user boots up the app?
I've tried to find a solution, but I had great difficulty understanding the code snippets I found and how to apply them (since I'm fairly new to C# and app development). Would somebody like to explain how I can achieve the saving/loading of the data and explain what the code does?
Thanks in advance! :)
You can create a file like this:
StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
StorageFile ageFile = await local.CreateFileAsync("Age.txt", CreationCollisionOption.FailIfExists);
I can read and write to a file like this:
StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
var ageFile = await local.OpenStreamForReadAsync("Age.txt");
// Read the data.
using (StreamReader streamReader = new StreamReader(ageFile))
{
//Use like a normal streamReader
}
If you are trying to write, use OpenStreamForWriteAsync.
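For the write side, a minimal sketch along the same lines (it reuses the Age.txt file from above and writes a sample value):
StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
// Open (or create) Age.txt in the app's local folder for writing.
using (var stream = await local.OpenStreamForWriteAsync("Age.txt", CreationCollisionOption.OpenIfExists))
using (StreamWriter streamWriter = new StreamWriter(stream))
{
    streamWriter.WriteLine("21");
}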
If I understood correctly, you have some kind of object structure that serves as a model for your ListView. When the application is started, you want to read a file where the data is present, and when the application is closed (or on some other event), write the file with the changes. Right?
1) When your application is loaded / closed (or upon modifications, or some other event of your choice), use the Windows.Storage API to read / write the text of the file.
2) If the data you want to write is just a list of strings, you can save it as is in the file. If it is more complicated, I would recommend serializing it to JSON. Use JSON.NET to serialize (object -> string) and deserialize (string -> object) the content of your file and your object structure.
Product product = new Product();
product.Name = "Apple";
...
string json = JsonConvert.SerializeObject(product);
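Putting the two steps together, a minimal sketch of saving and loading a whole list with Windows.Storage and JSON.NET (the file name data.json and the List<Product> model are assumptions made for the example):
// Save: serialize the list with JSON.NET and write it to the app's local folder.
public async Task SaveProductsAsync(List<Product> products)
{
    string json = JsonConvert.SerializeObject(products);
    StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
    StorageFile file = await local.CreateFileAsync("data.json", CreationCollisionOption.ReplaceExisting);
    await FileIO.WriteTextAsync(file, json);
}
// Load: read the file back (if it exists) and deserialize it into the list.
public async Task<List<Product>> LoadProductsAsync()
{
    StorageFolder local = Windows.Storage.ApplicationData.Current.LocalFolder;
    var item = await local.TryGetItemAsync("data.json");
    if (item == null)
    {
        return new List<Product>();
    }
    string json = await FileIO.ReadTextAsync((StorageFile)item);
    return JsonConvert.DeserializeObject<List<Product>>(json) ?? new List<Product>();
}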
This problem is somewhat similar to this.
In my case, I have a text file. And since there is no content importer that works for text files, I have to write my own functions using stream readers. What I am trying to accomplish is to read from the text file and set a few values accordingly in the options screen. I have added all the necessary references, but using "../options.txt" as the file path does not work. Quite possibly the path resolves to something other than the Content folder. How do I proceed with it?
Also, I am getting errors saying "attempt to access method (System.IO....ctor) failed". Am I missing some other reference?
Does this have to work on the Xbox 360? If so, it's not possible to just use the file system like you can on Windows; you'll need to use a storage device. To be honest, I would use this method anyway, so that if you decide to take your game to WP7 or the 360 it'll just work.
There is a nice demo game that loads and saves data on this Microsoft page, as well as a description of what it's doing.
This is taken from the demo. It shows how to open a storage container and serialize some config data to it; the load operation is just as simple (a sketch of it follows after the save example below).
// Create the data to save.
SaveGameData data = new SaveGameData();
data.PlayerName = "Hiro";
data.AvatarPosition = new Vector2(360, 360);
data.Level = 11;
data.Score = 4200;
// Open a storage container.
IAsyncResult result =
device.BeginOpenContainer("StorageDemo", null, null);
// Wait for the WaitHandle to become signaled.
result.AsyncWaitHandle.WaitOne();
StorageContainer container = device.EndOpenContainer(result);
// Close the wait handle.
result.AsyncWaitHandle.Close();
string filename = "savegame.sav";
// Check to see whether the save exists.
if (container.FileExists(filename))
// Delete it so that we can create one fresh.
container.DeleteFile(filename);
// Create the file.
Stream stream = container.CreateFile(filename);
// Convert the object to XML data and put it in the stream.
XmlSerializer serializer = new XmlSerializer(typeof(SaveGameData));
serializer.Serialize(stream, data);
// Close the file.
stream.Close();
// Dispose the container, to commit changes.
container.Dispose();
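The load side, sketched under the same assumptions (the container has already been opened into container exactly as above; this part is not taken from the original demo snippet):
string filename = "savegame.sav";
// Nothing to load if the save has never been written.
if (container.FileExists(filename))
{
    // Open the file and turn the XML back into a SaveGameData object.
    Stream stream = container.OpenFile(filename, FileMode.Open);
    XmlSerializer serializer = new XmlSerializer(typeof(SaveGameData));
    SaveGameData data = (SaveGameData)serializer.Deserialize(stream);
    // Close the file and release the container when done.
    stream.Close();
    container.Dispose();
}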
I have a text file with a list of 300,000 words and the frequency with which they occur. Each line is in the format Word:FrequencyOfOccurrence.
I want this information to be accessible from within the C# code. I can't hard-code the list since it is too long, and I'm not sure how to go about accessing it from a file on the server. I'd ideally like the information to be downloaded only if it's used (to save on bandwidth), but this is not a high priority as the file is not too big and internet speeds are always increasing.
It doesn't need to be usable for binding.
The information does not need to be editable once the project has been built.
Here is another alternative. Zip the file up and stick it in the ClientBin folder next to the application XAP. Then, at the point in the app where the content is needed, do something like this:
public void GetWordFrequencyResource(Action<string> callback)
{
    WebClient client = new WebClient();
    client.OpenReadCompleted += (s, args) =>
    {
        try
        {
            var zipRes = new StreamResourceInfo(args.Result, null);
            var txtRes = Application.GetResourceStream(zipRes, new Uri("WordFrequency.txt", UriKind.Relative));
            string result = new StreamReader(txtRes.Stream).ReadToEnd();
            callback(result);
        }
        catch
        {
            callback(null); //Fetch failed.
        }
    };
    client.OpenReadAsync(new Uri("WordFrequency.zip", UriKind.Relative));
}
Usage:
var wordFrequency = new Dictionary<string, int>();
GetWordFrequencyResource(s =>
{
// Code here to burst string into dictionary.
});
// Note: code placed here runs before the callback has built the dictionary,
// so don't attempt to use the dictionary here.
The above code allows you to store the file in an efficient zip format, but not in the XAP itself, hence you can download it on demand. It makes use of the fact that a XAP is a zip file, so Application.GetResourceStream, which is designed to pull resources from XAP files, can be used on a plain zip file.
BTW, I'm not actually suggesting you use a dictionary; I'm just using a dictionary as a simple example. In reality I would imagine the file is in sorted order. If that is the case, you could use a KeyValuePair<string, int> for each entry, hold the entries in a custom collection type backed by an array or List, and then use binary search to index into it.
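For the "burst the string into a dictionary" step, a minimal sketch assuming the Word:FrequencyOfOccurrence format from the question:
var wordFrequency = new Dictionary<string, int>();
GetWordFrequencyResource(s =>
{
    if (s == null) return; // fetch failed
    foreach (string line in s.Split(new[] { "\r\n", "\n" }, StringSplitOptions.RemoveEmptyEntries))
    {
        // Each line looks like Word:FrequencyOfOccurrence.
        int separator = line.IndexOf(':');
        string word = line.Substring(0, separator);
        int frequency = int.Parse(line.Substring(separator + 1));
        wordFrequency[word] = frequency;
    }
});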
Based on your comments, you could download the word list file if you are required to have a very thin server layer. The XAP file containing your Silverlight application is nothing more than a ZIP file with all the referenced files for your Silverlight client layer. Try adding the word list as content that gets compiled into the XAP and see how big the file gets; text usually compresses really well. In general, though, you'll want to be friendly with your users in how much memory your application consumes. Loading a huge text file into memory, in addition to everything else your app needs, may ultimately make your app a resource hog.
A better practice, in general, would be to call a web service. The service would perform whatever lookup logic you need. Here's a blog post from a quick search that should get you started (it was written for SL2, but should apply just the same for SL3):
Calling web services with Silverlight 2
Even better would be to store your list in a SQL Server. It will be much easier and quicker to query.
You could create a WCF service on the server side that will send the data to the Silverlight application. Once you retrieve the information you could cache it in-memory inside the client. Here's an example of calling a WCF service method from Silverlight.
Another possibility is to embed the text file into the Silverlight assembly that is deployed to the client:
using (var stream = Assembly.GetExecutingAssembly()
.GetManifestResourceStream("namespace.data.txt"))
using (var reader = new StreamReader(stream))
{
string data = reader.ReadToEnd();
// Do something with the data
}