I have a C# WPF solution. The solution needs to add an object to a JSON file, which works. The problem I have is that when I open a new connection to the file, the old data is overwritten. What am I missing? Why does it overwrite the old data? I'm pretty sure it's something simple, but I just can't see it.
if (clickCount == 1)
{
    // changes create button text
    createBtn.Content = "Create";
    // creates new message obj
    Message message = new Message();
    // depending on the form type creates a json object
    if (valid.MessageType == "E")
    {
        // checks e-mail
        valid.CheckEmail(senderTxtBox.Text);
        // creates variables for adding to JSON file
        message.MessageId = messageTypeComboBox.Text + messageTypeTxtBox.Text;
        message.SenderTxt = senderTxtBox.Text;
        message.Subject = subjectTxtBox.Text;
        message.MessageTxt = messageTxtBox.Text;
    }
    messageList.Add(message);
    string json = JsonConvert.SerializeObject(messageList, Formatting.Indented);
    System.IO.File.WriteAllText(@"JsonMessage.Json", json);
    clickCount = 0;
    messageTxtBox.Clear();
    senderTxtBox.Clear();
    subjectTxtBox.Clear();
    messageTypeTxtBox.Clear();
    messageTypeComboBox.SelectedIndex = -1;
}
You are using WriteAllText, which rewrites the entire file every time you write a new JSON object to it.
File.AppendText seems like a better solution: you would not need to rewrite the whole file whenever a new message is added to your MessageList, and it would also solve your existing problem of previously inserted JSON data being deleted when you open a new connection to the file.
PS. If you use AppendText, you will not have to pass your whole collection to the file, only the message that you just received; otherwise the file would constantly be rewritten with duplicated data, and the situation would only get worse as your message list grows.
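As a rough sketch of that suggestion (assuming Json.NET and the message object created in the click handler above), appending only the newly created message could look like this:
// Sketch only: serialize just the new message and append it, instead of
// re-serializing the whole messageList on every click.
string newEntry = JsonConvert.SerializeObject(message, Formatting.Indented);
// AppendText opens the file (creating it if needed) and positions the writer
// at the end, so the earlier entries are preserved.
using (StreamWriter sw = System.IO.File.AppendText(@"JsonMessage.Json"))
{
    sw.WriteLine(newEntry);
}
Note that the file then holds a sequence of JSON objects rather than a single JSON array, so it has to be read back accordingly.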
I see that you've used the WriteAllText method to write the JSON to the file.
You should use the AppendText method instead.
Something like this:
using (StreamWriter sw = System.IO.File.AppendText(@"JsonMessage.Json"))
{
sw.WriteLine(json);
}
Here's a link for more information on File.AppendText
https://msdn.microsoft.com/en-us/library/system.io.file.appendtext(v=vs.110).aspx
OK, so I am trying to build a program that will allow me to make a recipe database.
The data I am trying to save is as follows:
class RecipeDataBase
{
    int RecipeID;
    string RecipeNames;
    string[,] Ingredients = new string[30, 30];
    string[] Steps = new string[30];
}
What I am trying to get the program to do is this: first, it has a create-new-database option, which creates an empty save file with the recipe database's name (e.g. DinnerRecipes.rdb). Then I have another form designed to let the user add recipes to whichever database they select, and finally another one that lets the user edit the recipes inside the database and lists all the current recipe names within whichever database they chose.
The idea is to have a program I can use to create what are almost digital cookbooks for another program I am working on. I know how to write about 85% of the code; it's this save-file part that has me stuck.
Currently, I only have:
// Setting up and restricting the save system
SaveFileDialog CRDB = new SaveFileDialog();
CRDB.InitialDirectory = @"./";
CRDB.RestoreDirectory = true;
CRDB.FileName = "*.rdb";
CRDB.Filter = "Recipe Database |*.rdb";
// running the save dialog and, if they click OK, creating a blank database.
if (CRDB.ShowDialog() == DialogResult.OK)
{
    // This is to prep the database template for saving and creating the database file.
    RecipeDataBase recipes = new RecipeDataBase();
    Stream fileStream = CRDB.OpenFile();
    StreamWriter sw = new StreamWriter(fileStream);
    sw.Write(recipes);
    sw.Close();
    fileStream.Close();
}
OK, I think I might be making this more complex than it really is. However, here is an example of the data I am trying to save:
RecipeID = 1;
RecipeNames = "Homemade Pasta Sauce";
Ingredients = ["Tomato", "Basil", "Onions", "Garlic", ect...]["4", "6 leaves or a tablespoon of ground", "3 Medium Sized", "8 clove(minced)", ect..];
Steps = ["Peel and seed the tomatoes and set aside.", "Chop the onion, mince the garlic, and grate half of the carrot.", "Pour the olive oil into a large stockpot over medium heat.", ect];
What worries me about saving the data straight into a text file is that not every recipe has the same number of ingredients or steps. I want to ensure that when loading the information I don't end up pulling the wrong data, and I also want the RecipeIDs to be easily searchable.
I've spent the last three days trying to find the answer I'm looking for and I don't know if I'm asking it wrong or just not connecting the answers I am finding with my project.
Please help me, Stack Overflow Kenobi. You're my only hope.
You should serialize your data to a format you can save: typically either a JSON or XML string, or a binary format such as protobuf or MessagePack. More details on serialization can be found here:
https://learn.microsoft.com/en-us/dotnet/csharp/programming-guide/concepts/serialization/
and
https://learn.microsoft.com/en-us/dotnet/standard/serialization/basic-serialization
You may have to alter your classes - for example adding default constructors or the [Serializable] attribute.
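For the recipe class in the question, one possible serializable shape might look like the sketch below. The switch from fixed-size arrays to lists is an assumption to handle recipes with differing numbers of ingredients and steps, the Quantities list is a hypothetical split of the original two-dimensional Ingredients array, and the members are made public properties so serializers can see them:
using System;
using System.Collections.Generic;

[Serializable]
public class RecipeDataBase
{
    public int RecipeID { get; set; }
    public string RecipeNames { get; set; }
    // parallel lists: Ingredients[i] pairs with Quantities[i]
    public List<string> Ingredients { get; set; } = new List<string>();
    public List<string> Quantities { get; set; } = new List<string>();
    public List<string> Steps { get; set; } = new List<string>();
}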
Some examples of serialization (from the links):
Binary:
MyObject obj = new MyObject();
obj.n1 = 1;
obj.n2 = 24;
obj.str = "Some String";
IFormatter formatter = new BinaryFormatter();
Stream stream = new FileStream("MyFile.bin", FileMode.Create, FileAccess.Write, FileShare.None);
formatter.Serialize(stream, obj);
stream.Close();
JSON:
string jsonString = JsonSerializer.Serialize(weatherForecast);
File.WriteAllText(fileName, jsonString);
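Applied to the recipe class above, the JSON route could look roughly like this (a sketch assuming System.Text.Json and the property-based class shape sketched earlier; CRDB.FileName comes from the SaveFileDialog in the question):
using System.IO;
using System.Text.Json;

// Serialize the recipe object to indented JSON and write it to the chosen .rdb file.
RecipeDataBase recipe = new RecipeDataBase { RecipeID = 1, RecipeNames = "Homemade Pasta Sauce" };
string json = JsonSerializer.Serialize(recipe, new JsonSerializerOptions { WriteIndented = true });
File.WriteAllText(CRDB.FileName, json);

// Loading it back later recreates the object, however many ingredients or steps it has.
RecipeDataBase loaded = JsonSerializer.Deserialize<RecipeDataBase>(File.ReadAllText(CRDB.FileName));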
Hi, I have created a console application with a Kafka consumer for receiving messages.
class Program
{
    static void Main(string[] args)
    {
        string topic = "IDGTestTopic";
        Uri uri = new Uri("http://localhost:9092");
        var options = new KafkaOptions(uri);
        var router = new BrokerRouter(options);
        var consumer = new Consumer(new ConsumerOptions(topic, router));
        foreach (var message in consumer.Consume())
        {
            Console.WriteLine(Encoding.UTF8.GetString(message.Value));
            // Saving messages in files
            string lines = Encoding.UTF8.GetString(message.Value);
            System.IO.File.WriteAllText(@"C:\Project\Kafka Research\Kafka_Consumer\Kafka_Consumer\KafkaMessages\Messages.txt", lines);
        }
    }
}
But it only stores the current message. If you look at the console, all the messages are displayed, but the text file only contains the latest message.
How do I save all the messages in a file?
For each message, System.IO.File.WriteAllText overwrites the file and therefore the created file will contain only the latest message.
In order to keep all the messages in a single file, you can replace System.IO.File.WriteAllText with System.IO.File.AppendAllText as shown below:
foreach (var message in consumer.Consume())
{
    Console.WriteLine(Encoding.UTF8.GetString(message.Value));
    // Saving messages in files
    string lines = Encoding.UTF8.GetString(message.Value);
    System.IO.File.AppendAllText(@"C:\Project\Kafka Research\Kafka_Consumer\Kafka_Consumer\KafkaMessages\Messages.txt", lines);
}
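One small refinement (an assumption about the desired output, not something the question states): AppendAllText does not add a line break, so consecutive messages run together. Appending a newline keeps one message per line:
// Sketch: append a newline after each message so entries land on separate lines.
System.IO.File.AppendAllText(
    @"C:\Project\Kafka Research\Kafka_Consumer\Kafka_Consumer\KafkaMessages\Messages.txt",
    lines + Environment.NewLine);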
According to the docs,
File.AppendAllText Method (String, String)
Opens a file, appends the specified string to the file, and then
closes the file. If the file does not exist, this method creates a
file, writes the specified string to the file, then closes the file.
and File.WriteAllText Method (String, String)
Creates a new file, writes the specified string to the file, and then
closes the file. If the target file already exists, it is overwritten.
Every time you consume a message, you overwrite the whole file with System.IO.File.WriteAllText.
You need to do the write outside the consume loop.
@Giorgos Myrianthous Your earlier second suggestion was better in some ways. Appending to a StringBuilder and writing to the file only once, outside the loop, is most likely much faster than going through I/O on every iteration. Here is what I was suggesting:
StringBuilder linebuilder = new StringBuilder(); // this line outside the loop
foreach (var message in consumer.Consume())
{
    Console.WriteLine(Encoding.UTF8.GetString(message.Value));
    // Saving messages in files
    linebuilder.Append(Encoding.UTF8.GetString(message.Value)); // this line inside the loop
}
System.IO.File.AppendAllText(@"C:\Project\Kafka Research\Kafka_Consumer\Kafka_Consumer\KafkaMessages\Messages.txt", linebuilder.ToString());
I'm having trouble making a JSON file from my Server class. This is my class:
public class CsServerInfo
{
    public string ip { get; set; }
    public string name { get; set; }
}
The idea is to add new servers into JSON file on a Button Click. It means every time I click on a button (in a WPF window which has TextBoxes for IP and Name properties) a new server should be added into JSON file.
CsServerInfo newServ = new CsServerInfo();
newServ.ip = this.serverIP.Text;
newServ.name = this.serverName.Text;
string json = JsonConvert.SerializeObject(newServ);
System.IO.File.AppendAllText(#"C:\JSON4.json", json);
The problem is I get JSON file that is not correctly formatted:
{"ip":"52.45.24.2","name":"new"}{"ip":"45.45.45.4","name":"new2"}
There's no comma between the servers, and if I use ToArray() I get:
[{"ip":"52.45.24.2","name":"new"}][{"ip":"45.45.45.4","name":"new2"}]
The correct format should be [{server properties}, {another server}], but I'm not able to get that. Thanks for your help.
You're appending the JSON text of one server at a time to the file. You should parse the existing list, add your server, and then serialize the whole list.
// TODO first check if there's an existing file or not
var servers = JsonConvert.DeserializeObject<List<CsServerInfo>>(File.ReadAllText(@"C:\JSON4.json"));
servers.Add(newServ);
File.WriteAllText(@"C:\JSON4.json", JsonConvert.SerializeObject(servers));
[{server properties}, {another server}] is a list of objects.
You should serialize a list:
List<CsServerInfo> listServ = new List<CsServerInfo>();
...
string json = JsonConvert.SerializeObject(listServ);
If you need to append to the file, you should read everything from the file into a list, add the new item, and save it back.
Don't try to append JSON to the file. Let Json.NET handle the work of serializing to JSON. You should be manipulating a List<CsServerInfo> and serializing the entire list when you're done modifying it. That way when you serialize and save, Json.NET is generating the JSON, which it does well, and it will be correctly formatted.
I'm trying to write an application in MVC 5 that will accept a file specified by a user and upload that file's information into the database. The file itself has multiple worksheets, which I think FileHelpers handles gracefully, but I can't find any good documentation about working with a byte array. I can get the file just fine and get to my controller, but I don't know where to go from there. I am currently doing this in the controller:
public ActionResult UploadFile(string filepath)
{
    // we want to check here that the first file in the request is not null
    if (Request.Files[0] != null)
    {
        var file = Request.Files[0];
        byte[] data = new byte[file.ContentLength];
        ParseInputFile(data);
        //file.InputStream.Read(data, 0, data.Length);
    }
    ViewBag.Message = "Success!";
    return View("Index");
}

private void ParseInputFile(byte[] data)
{
    ExcelStorage provider = new ExcelStorage(typeof(OccupationalGroup));
    provider.StartRow = 3;
    provider.StartColumn = 2;
    provider.FileName = "test.xlsx";
}
Am I able to use the Request like that in conjunction with FileHelpers? I just need to read the Excel file into the database. If not, should I be looking into a different way to handle the upload?
So, I decided instead to use ExcelDataReader to do my reading from Excel. It puts the stream (test in the code below) into a DataSet that I can manipulate manually. I'm sure it might not be the cleanest way to do it, but it made sense for me, and it allows me to work with multiple worksheets fairly easily as well. Here is the snippet of regular code that I ended up using:
// test is a stream here that I get using reflection
IExcelDataReader excelReader = ExcelReaderFactory.CreateOpenXmlReader(test);
DataSet result = excelReader.AsDataSet();
while (excelReader.Read())
{
    // process the file
}
excelReader.Close();
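For what it's worth, each worksheet ends up as a separate DataTable inside that DataSet, so multiple sheets can be walked in one pass, roughly like this (a sketch; the row-mapping step is left as a comment because it depends on the entity being imported):
// Each worksheet in the workbook becomes one DataTable in result.Tables.
foreach (System.Data.DataTable sheet in result.Tables)
{
    foreach (System.Data.DataRow row in sheet.Rows)
    {
        // map the row's values to your entity and save it to the database
    }
}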
---short version:
When I get to the while (!checkReader.EndOfStream) every time after the first, it says EndOfStream = true.
---more detail:
A user will upload a file using an Ajax AsyncFileUpload control. I take that file, ensure it's a very specific format of csv that we use and spit it out into a GridView. This all works great the first time through: I get the file, parse it out, and it displays great.
But if I call this same code again at any time during the user's session, StreamReader.EndOfStream is true.
For example, a user uploads a file and I spit it out into the GridView. Oops! User realizes there are headers... I have a checkbox available with an event handler that will call the method below to re-read the original file (it's stored in a session variable). User checks the box, event fires, method gets called, but my EndOfStream is now true.
I thought that the using () block would reset that, and I have tried adding checkReader.DiscardBufferedData just after the while loop below, but neither of those seems to have any effect.
What am I doing wrong?
private void BuildDataFileGridView(bool hasHeaders)
{
//read import file from the session variable
Stream theStream = SessionImportFileUpload.PostedFile.InputStream;
theStream.Position = 0;
StringBuilder sb = new StringBuilder();
using (StreamReader checkReader = new StreamReader(theStream))
{
while (!checkReader.EndOfStream)
{
string line = checkReader.ReadLine();
while (line.EndsWith(","))
{
line = line.Substring(0, line.Length - 1);
}
sb.AppendLine(line);
}
}
using (TextReader reader = new StringReader(sb.ToString()))
{
//read the file in and shove it out for the client
using (CsvReader csv = new CsvReader(reader, hasHeaders, CsvReader.DefaultDelimiter))
{
sDataInputTable = new DataTable();
try
{
//Load the DataTable with csv values
sDataInputTable.Load(csv);
}
catch
{
DisplayPopupMessage("ERROR: A problem was encountered");
}
//Copy only the first 10 rows into a temp table for display.
DataTable displayDataTable = sDataInputTable.Rows.Cast<System.Data.DataRow>().Take(10).CopyToDataTable();
MyDataGridView.DataSource = displayDataTable;
MyDataGridView.DataBind();
}
}
}
Edit:
SessionImportFileUpload is the actual Ajax AsyncFileUpload control being stored as a session variable (this was already the case; a previous developer wrote other code that uses it).
You are storing the posted file stream in Session. This is not correct, because the stream is not the data, but rather the mechanism to read the data. The file is uploaded only once, during a single POST request, and you won't be able to read from the same stream again later. Usually you even cannot rewind the stream to re-read it.
That's why I suggest reading the posted file stream only once and putting the whole content into Session; this way the content will be reusable, and you'll be able to reprocess it as many times as you need.
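As a rough sketch of that suggestion (SessionImportFileUpload and PostedFile are the names from the question's code; the session key is illustrative): read the content once at upload time, keep the string in Session, and rebuild a fresh reader from it whenever the grid needs to be regenerated.
// At upload time: read the posted file once and store its text, not the stream.
using (var uploadReader = new StreamReader(SessionImportFileUpload.PostedFile.InputStream))
{
    Session["ImportFileContent"] = uploadReader.ReadToEnd();
}

// Later, in BuildDataFileGridView: rebuild a reader over the stored string,
// so the existing EndOfStream loop works on every call.
string content = (string)Session["ImportFileContent"];
byte[] bytes = System.Text.Encoding.UTF8.GetBytes(content);
using (StreamReader checkReader = new StreamReader(new MemoryStream(bytes)))
{
    // existing line-trimming / CsvReader logic goes here, unchanged
}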