C# MongoDB: Chunk file does not upload following a prior deletion

I'm having a bit of a problem with an update feature designed to compare two images and, if they differ, delete the existing image and data from Mongo and replace them with a new copy. The problem is that, individually, each component works. The loading feature will successfully upload an image and a BsonDocument. The delete method will (seemingly) remove them successfully: the document, the fs.files entry, and the fs.chunks entry.
However, when the entry is deleted and the program then proceeds to upload the new image, only the fs.files entry and the BsonDocument are pushed to the server. The actual image is left off.
I'm running MongoDB 3.2.6 for Windows.
The replace block, followed by the upload block:
if (newMD5.Equals(oldMD5) == false)
{
    Debug.WriteLine("Updating image " + fileWithExt);
    BsonValue targetId = docCollection.FindOne(Query.EQ("id", fileNoExt))["_id"];
    deleteImageEntry(Query.EQ("_id", new ObjectId(targetId.ToString())));
    //continues to upload replacement
}
else
{
    continue;
}
}
//create new entry
uploadInfo = mongoFileSystem.Upload(memStream, fileFs);
BsonDocument entry = new BsonDocument();
entry.Add("fileId", uploadInfo.Id);
entry.Add("id", fileNoExt);
entry.Add("filename", fileFs);
entry.Add("user", "");
//appends to image collection
var newItemInfo = docCollection.Save(entry);
And the delete method
public static bool deleteImageEntry(IMongoQuery query)
{
    MongoInterface mongo = new MongoInterface();
    try
    {
        var docCollection = mongo.Database.GetCollection("employees");
        var imageCollection = mongo.Database.GetCollection<EmployeeImage>("employees");
        var toDelete = docCollection.FindOne(query);
        BsonValue fileId = toDelete.GetValue("fileId");
        mongo.Gridfs.DeleteById(fileId);
        WriteConcernResult wresult = imageCollection.Remove(query);
    }
    catch (Exception e)
    {
        Debug.WriteLine("Image could not be deleted \n\r" + e.Message);
        return false;
    }
    return true;
}
Sorry the code is messy; I've been doing a lot of guerrilla testing to try and find a reason for this. Similar code has worked in other parts of the program.

Well, this is embarrassing. After a few more hours of debugging, it turned out that MD5.ComputeHash() was leaving the memStream position at the end of the stream, so there was no data left to upload. Setting memStream.Position = 0; solved the problem.
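For anyone hitting the same thing, a minimal sketch of the fix, assuming newMD5 is computed from the same memStream that is later handed to GridFS (variable names follow the question):

using (var md5 = System.Security.Cryptography.MD5.Create())
{
    // ComputeHash reads the stream to its end...
    byte[] hash = md5.ComputeHash(memStream);
    string newMD5 = BitConverter.ToString(hash).Replace("-", "").ToLowerInvariant();

    // ...so the position must be rewound before uploading, or GridFS receives zero bytes
    memStream.Position = 0;
    var uploadInfo = mongoFileSystem.Upload(memStream, fileFs);
}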

Related

datagridview update row to csv file

I am working with a CSV file and a DataGridView in a C# project for an inventory app, and I am trying to update a row in the CSV file.
When the user edits a row, I need to replace the current word with a new word, but I also need to save both the current word and the new word and keep a running total. In pseudocode, for example:
foreach (DataGridViewRow row in dataGridView1.Rows)
{
    if (row in column is modified)
        update specific row with comma to current file and load it...
}
The CSV file looks like this:
Current:
1;2;;4;5
Update:
1;2,A;;4;5 (changed device A, total: 1 time...)
Next modified row:
1;A;;4,B,C;5 (changed devices B and C, total changes: 2 times...)
With a database it would be easy to update the data, but I don't have SQL Server installed, so I think that option is not available to me.
My goal is to track devices going out and in, so if you have a solution please share it.
Short of using an SQL server, maybe something like LiteDB could help? You'd have LiteDB host your data and export it to CSV whenever you need. Working with CSV files usually means you'll re-write the whole file every time there is an update to make, which is slow and cumbersome. I recommend you use CSV to transport data from Point A to Point B, but not to maintain data.
Also, if you really want to stick to CSV, have a look at the Microsoft Ace OLEDB driver, previously known as JET driver. I use it to query CSV files, but I have never used it to update... so your mileage may vary.
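If you do try the ACE OLEDB route, a rough sketch of reading a CSV into a DataTable could look like this. The provider has to be installed separately, and the folder path, file name, and Extended Properties settings here are assumptions you'll need to adjust; semicolon-delimited files typically also need a schema.ini in the same folder.

using System.Data;
using System.Data.OleDb;

// The Data Source points at the folder; the file name goes in the query.
string folder = @"C:\data";
string connString = "Provider=Microsoft.ACE.OLEDB.12.0;Data Source=" + folder +
                    ";Extended Properties=\"text;HDR=No;FMT=Delimited\"";

var table = new DataTable();
using (var conn = new OleDbConnection(connString))
using (var adapter = new OleDbDataAdapter("SELECT * FROM [devices.csv]", conn))
{
    adapter.Fill(table);   // loads the whole CSV into the DataTable
}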
Short of using an actual database or a database driver, you'll have to use a StreamReader along with a StreamWriter: read the file with the StreamReader and write the new file with the StreamWriter. This implies you'll have code around the StreamReader loop to find the correct line(s) to update.
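A minimal sketch of that read-and-rewrite approach (file names and the row-matching rule are placeholders; the real code would decide from the DataGridView which line changed and what to append):

using System.IO;

string source = "devices.csv";
string temp = "devices.tmp";

using (var reader = new StreamReader(source))
using (var writer = new StreamWriter(temp))
{
    string line;
    while ((line = reader.ReadLine()) != null)
    {
        // Example rule: when we hit the edited row, append the new word to column 2
        if (line == "1;2;;4;5")
        {
            var cols = line.Split(';');
            cols[1] = cols[1] + ",A";          // "2" becomes "2,A"
            line = string.Join(";", cols);
        }
        writer.WriteLine(line);
    }
}

// Swap the rewritten copy in place of the original
File.Delete(source);
File.Move(temp, source);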
Here's the class I created and am using to interact with LiteDB. It's not all that robust, but it did exactly what I needed it to do at the time. I had to make changes to a slew of products hosted on my platform, and I used this to keep track of the progress.
using System;
using LiteDB;

namespace FixProductsProperty
{
    public enum ListAction
    {
        Add = 0,
        Remove,
        Update,
        Disable,
        Enable
    }

    class DbInteractions
    {
        public static readonly string dbFilename = "MyDatabaseName.db";
        public static readonly string dbItemsTableName = "MyTableName";

        public void ToDataBase(ListAction incomingAction, TrackingDbEntry dbEntry = null)
        {
            if (dbEntry == null)
            {
                throw new Exception("dbEntry can not be null");
            }

            // Open database (or create it if it does not exist)
            using (var db = new LiteDatabase(dbFilename))
            {
                var backupListInDB = db.GetCollection<TrackingDbEntry>(dbItemsTableName);

                //override action if needed
                if (incomingAction == ListAction.Add)
                {
                    var tempone = backupListInDB.FindOne(p => p.ProductID == dbEntry.ProductID);
                    if (tempone != null)
                    {
                        //the record already exists
                        incomingAction = ListAction.Update;
                        //IOException ex = new IOException("Err: Duplicate. " + dbEntry.ProductID + " is already in the database.");
                        //throw ex;
                    }
                    else
                    {
                        //the record does not already exist
                        incomingAction = ListAction.Add;
                    }
                }

                switch (incomingAction)
                {
                    case ListAction.Add:
                        backupListInDB.Insert(dbEntry);
                        break;
                    case ListAction.Remove:
                        //backupListInDB.Delete(p => p.FileOrFolderPath == backupItem.FileOrFolderPath);
                        if (dbEntry.ProductID != 0)
                        {
                            backupListInDB.Delete(dbEntry.ProductID);
                        }
                        break;
                    case ListAction.Update:
                        if (dbEntry.ProductID != 0)
                        {
                            backupListInDB.Update(dbEntry.ProductID, dbEntry);
                        }
                        break;
                    case ListAction.Disable:
                        break;
                    case ListAction.Enable:
                        break;
                    default:
                        break;
                }

                backupListInDB.EnsureIndex(p => p.ProductID);

                // Use Linq to query documents
                //var results = backupListInDB.Find(x => x.Name.StartsWith("Jo"));
            }
        }
    }
}
I use it like this:
DbInteractions yeah = new DbInteractions();
yeah.ToDataBase(ListAction.Add, new TrackingDbEntry { ProductID = dataBoundItem.ProductID, StoreID = dataBoundItem.StoreID, ChangeStatus = true });
Sorry... my variable naming convention sometimes blows...
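The TrackingDbEntry class isn't shown; based on how it's used above, it's presumably a small POCO along these lines (property names come from the usage, the types are guesses):

using LiteDB;

public class TrackingDbEntry
{
    [BsonId]   // ProductID doubles as the document id, matching the Delete/Update calls above
    public int ProductID { get; set; }
    public int StoreID { get; set; }
    public bool ChangeStatus { get; set; }
}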

Xamarin ListView display item from SQLite Database C# Android

In my case I wanted to display items from a local SQLite database, which I created as shown below:
public string CreateDB() //create database
{
    var output = "";
    string dbPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
    output = "Database Created";
    return output;
}

public string CreateTable() //create table
{
    try
    {
        string dbPath = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
        var db = new SQLiteConnection(dbPath);
        db.CreateTable<UserInfo>();
        db.CreateTable<TableInfo>();
        string result = "Table(s) created";
        return result;
    }
    catch (Exception ex)
    {
        return ("Error" + ex.Message);
    }
}
And this is my code where I wish to retrieve the data:
string path = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.MyDocuments), "IsoModule.db3");
var tablelistout = new SQLiteConnection(path);
var alltables = tablelistout.Table<TableInfo>();

foreach (var listing in alltables)
{
    var from = new string[]
    {
        listing.tname + " - " + listing.status
    };

    ListView listtable = (ListView)FindViewById(Resource.Id.listtable);
    listtable.Adapter = new ArrayAdapter(this, Android.Resource.Layout.SimpleListItem1, from);
}
The code runs with no error, but it only displays the last item in the table. This is confusing me, so I would like to ask: how can I retrieve all the data from a specific table?
If someone has asked the same question already, please share the link. Much appreciated.
var alltables = tablelistout.Table<TableInfo>();
var data = new List<string>();
foreach (var listing in alltables)
{
    data.Add(listing.tname + " - " + listing.status);
}
ListView listtable = (ListView)FindViewById(Resource.Id.listtable);
listtable.Adapter = new ArrayAdapter(this, Android.Resource.Layout.SimpleListItem1, data.ToArray());
All I did was move two things out of the loop. First, I moved out the initialization of the collection (now a List<string>). Second, I moved out the ListView lookup and the assignment of the adapter.
Your issue is that in the loop you were always overwriting everything you had done in the previous iteration (leaving you with only the last item, like you said).
Also, You should take note that it will be important for you to create a custom adapter if you plan on having a decent amount of data. ArrayAdapter is a native Android class which is then wrapped by a C# Xamarin object, meaning you will have both a C# and Java object per row. It adds overhead as both garbage collectors will have work to do and can cause performance issues. Xamarin devs tend to generally avoid it with the exception of quick prototyping.
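If you do go the custom adapter route later, a bare-bones sketch (untested, reusing the same list of strings built above) might look like this:

using System.Collections.Generic;
using Android.App;
using Android.Views;
using Android.Widget;

public class TableInfoAdapter : BaseAdapter<string>
{
    private readonly Activity context;
    private readonly List<string> items;

    public TableInfoAdapter(Activity context, List<string> items)
    {
        this.context = context;
        this.items = items;
    }

    public override string this[int position] => items[position];
    public override int Count => items.Count;
    public override long GetItemId(int position) => position;

    public override View GetView(int position, View convertView, ViewGroup parent)
    {
        // Reuse the recycled row view when possible instead of inflating a new one
        var view = convertView ?? context.LayoutInflater.Inflate(
            Android.Resource.Layout.SimpleListItem1, parent, false);
        view.FindViewById<TextView>(Android.Resource.Id.Text1).Text = items[position];
        return view;
    }
}

You would then assign it with listtable.Adapter = new TableInfoAdapter(this, data);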
On another note, I would use the FindViewById<T>(Int32) instead of the FindViewById(Int32) and casting it. FindViewById<ListView>(Resource.Id.listtable) in your case. Just taking advantage of the power of generics.

Download ParseObject from Parse and Display

I am developing an application in Xamarin for Android. There are two buttons: one for uploading a ParseObject and another for downloading the object and displaying its image. The upload button is working properly, but I am facing two main problems.
First: I am getting a null value in my IEnumerable object. I have attached a screenshot of my breakpoint value.
Second: When I try to display the object I get an error. I have attached a screenshot of the error shown.
Upload Button:
upbutton.Click += async delegate {
    try {
        byte[] myfile = System.IO.File.ReadAllBytes (path);
        ParseFile file = new ParseFile ("imgfl.png", myfile);
        await file.SaveAsync ();
        // link your file object to your Parse object
        ParseObject gameScore = new ParseObject ("GameScore");
        gameScore ["score"] = 0001;
        gameScore ["playerName"] = " Bob";
        gameScore ["image"] = file;
        await gameScore.SaveAsync ();
    } catch (Exception e) {
        System.Console.WriteLine (e);
    }
};
Download Button:
downbutton.Click += async delegate {
    var query = from GameScore in ParseObject.GetQuery("GameScore")
                orderby GameScore.CreatedAt descending
                select GameScore;
    IEnumerable<ParseObject> results = await query.FindAsync();
    //I am getting a null object result here
    foreach (var obj in results)
    {
        ParseFile img = obj.Get<ParseFile>("image");
        _imageView.SetImageURI(img.Url);
        //error here
    }
};
One of your GameScore rows probably doesn't have the image property set; for that row you'll get a null result in your img variable.
Also note that in your download button code, you're going to get every single row and keep calling the _imageView.SetImageURI(img.Url) line for each one, potentially replacing the value over and over again.
You might want to be doing a FirstAsync() call instead of FindAsync() if you are going to want just a single row.
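A rough sketch of that, guarding against a missing image column. Downloading the bytes is an assumption on my part: SetImageURI only handles local/content URIs, so a remote file URL is better loaded as a bitmap.

downbutton.Click += async delegate {
    var query = ParseObject.GetQuery("GameScore").OrderByDescending("createdAt");
    ParseObject latest = await query.FirstAsync();   // just the newest row

    if (latest.ContainsKey("image"))                 // skip rows with no image set
    {
        ParseFile img = latest.Get<ParseFile>("image");
        using (var http = new System.Net.Http.HttpClient())
        {
            byte[] bytes = await http.GetByteArrayAsync(img.Url);
            var bmp = await Android.Graphics.BitmapFactory.DecodeByteArrayAsync(bytes, 0, bytes.Length);
            _imageView.SetImageBitmap(bmp);
        }
    }
};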

strange behavior of XamlReader.Load()?

I've got a very strange issue while parsing an external XAML file. The background is that I want to load an external XAML file with content to process, and I want to be able to load as many different files as I want. That happens by unloading the old one and loading the new one.
My issue is:
When I load a XAML file the first time, everything is good, all as it should be.
But when I load the same XAML file a second time, every entry of the object I'm loading is there twice. If I run this again, every object is there three times, and so on...
To debug the project yourself, download it here. The function starts at line 137 in the file "Control Panel.xaml.cs". I really don't know what is causing this. Is it my fault or simply a bug? And if it's a bug, is there a workaround?
/// <summary>
/// Load a xaml file and parse it
/// </summary>
public void LoadPresentation()
{
    this.Title = "Control Panel - " + System.IO.Path.GetFileName(global.file);
    System.IO.FileStream XAML_file = new System.IO.FileStream(global.file, System.IO.FileMode.Open);
    try
    {
        System.IO.StreamReader reader = new System.IO.StreamReader(XAML_file);
        string dump = reader.ReadToEnd(); //This is only for debugging purposes because of the strange issue...
        XAML_file.Seek(0, System.IO.SeekOrigin.Begin);
        presentation = (ResourceDictionary)XamlReader.Load(XAML_file);

        //Keys the resourceDictionary must have to be valid
        if (presentation["INDEX"] == null || presentation["MAIN_GRID"] == null || presentation["CONTAINER"] == null || presentation["LAYOUTLIST"] == null)
        {
            throw new Exception();
        }

        //When this list is loaded, every item in it is there twice or three times or four... Why????
        TopicList Index = null;
        Index = (TopicList)presentation["INDEX"];
        for (int i = 0; i < topics.Count; )
        {
            topics.RemoveAt(i);
        }
        foreach (TopicListItem item in Index.Topics)
        {
            topics.Insert(item.TopicIndex, (Topic)presentation[item.ResourceKey]);
        }
        lv_topics.SelectedIndex = 0;
        selectedIndex = 0;
    }
    catch
    {
        System.Windows.Forms.MessageBox.Show("Failed to load XAML file \"" + global.file + "\"", "Parsing Error", System.Windows.Forms.MessageBoxButtons.OK, System.Windows.Forms.MessageBoxIcon.Error);
        presentation = null;
    }
    finally
    {
        XAML_file.Close();
    }
}
Edit:
I have tried to serialize the object that was read by the XamlReader, and nowhere in the output was there any child element... But if I pull the object out of the dictionary, the children are all there (duplicated and triplicated, but there).
I have already tried clearing the list via
topics.Clear();
and
topics=new ObservableCollection<TopicListItem>();
lv_topics.ItemsSource=topics;
Try Index.Topics.Clear() after loading the Topics into your topics object. That appears to get rid of the duplication.
//When this list is loaded, every item in it is there twice or three times or four... Why????
TopicList Index = null;
Index = (TopicList)presentation["INDEX"];
topics.Clear();
foreach (TopicListItem item in Index.Topics)
{
    topics.Insert(item.TopicIndex, (Topic)presentation[item.ResourceKey]);
}
Index.Topics.Clear(); //Adding this will prevent the duplication
lv_topics.SelectedIndex = 0;
selectedIndex = 0;
In the posted code, topics is not declared in LoadPresentation(), so naturally it will still hold any prior values.
I know you said you tried topics = new ObservableCollection();, but please try again, and put that IN LoadPresentation():
public void LoadPresentation()
{
    ObservableCollection<TopicListItem> topics = new ObservableCollection<TopicListItem>();
I would pass the filename in:
public void LoadPresentation(string fileName)
I get that you may need to use topics outside LoadPresentation, but this is debugging. If you need topics outside, then return it:
public ObservableCollection<TopicListItem> LoadPresentation(string fileName)
If that does not fix it, I would put a try/catch block around the XAML_file.Close(); to see if something weird is going on.
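Putting those suggestions together, a minimal sketch might look like this. Topic, TopicList, and TopicListItem are the question's own types; whether the collection should hold Topic or TopicListItem depends on your model, and the caller becomes responsible for binding the returned collection to lv_topics.

public ObservableCollection<Topic> LoadPresentation(string fileName)
{
    var topics = new ObservableCollection<Topic>();
    var xamlFile = new System.IO.FileStream(fileName, System.IO.FileMode.Open);
    try
    {
        var presentation = (ResourceDictionary)XamlReader.Load(xamlFile);
        var index = (TopicList)presentation["INDEX"];
        foreach (TopicListItem item in index.Topics)
        {
            topics.Insert(item.TopicIndex, (Topic)presentation[item.ResourceKey]);
        }
    }
    finally
    {
        // Wrapping Close in its own try/catch, as suggested above
        try { xamlFile.Close(); }
        catch (Exception ex) { System.Diagnostics.Debug.WriteLine("Close failed: " + ex.Message); }
    }
    return topics;
}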

Adding an AsParallel() call causes my code to break on writing a file

I'm building a console application that has to process a bunch of documents.
To keep it simple, the process is:
for each year between X and Y, query the DB to get a list of document references to process
for each of these references, process a local file
The process method is, I think, independent and should be parallelizable as long as the input args are different:
private static bool ProcessDocument(
    DocumentsDataset.DocumentsRow d,
    string langCode
)
{
    try
    {
        var htmFileName = d.UniqueDocRef.Trim() + langCode + ".htm";
        var htmFullPath = Path.Combine(@"x:\path", htmFileName);
        var missingHtmlFile = !File.Exists(htmFullPath);
        if (!missingHtmlFile)
        {
            var html = File.ReadAllText(htmFullPath);
            // ProcessHtml is quite long: it uses a regex search for a list of references
            // which are other documents, then sends the result to a custom WS
            ProcessHtml(ref html);
            File.WriteAllText(htmFullPath, html);
        }
        return true;
    }
    catch (Exception exc)
    {
        Trace.TraceError("{0,8}Fail processing {1} : {2}", "[FATAL]", d.UniqueDocRef, exc.ToString());
        return false;
    }
}
In order to enumerate my documents, I have this method:
private static IEnumerable<DocumentsDataset.DocumentsRow> EnumerateDocuments()
{
    return Enumerable.Range(1990, 2020 - 1990).AsParallel().SelectMany(year => {
        return Document.FindAll((short)year).Documents;
    });
}
Document is a business class that wraps the retrieval of documents. The output of this method is a typed dataset (I'm returning the Documents table). The method takes a year, and I'm sure a document can't be returned for more than one year (the year is actually part of the key).
Note the use of AsParallel() here, but I never had an issue with this one.
Now, my main method is:
var documents = EnumerateDocuments();

var result = documents.Select(d => {
    bool success = true;
    foreach (var langCode in new string[] { "-e", "-f" })
    {
        success &= ProcessDocument(d, langCode);
    }
    return new {
        d.UniqueDocRef,
        success
    };
});

using (var sw = File.CreateText("summary.csv"))
{
    sw.WriteLine("Level;UniqueDocRef");
    foreach (var item in result)
    {
        string level;
        if (!item.success) level = "[ERROR]";
        else level = "[OK]";
        sw.WriteLine(
            "{0};{1}",
            level,
            item.UniqueDocRef
        );
        //sw.WriteLine(item);
    }
}
This method works as expected in this form. However, if I replace
var documents = EnumerateDocuments();
with
var documents = EnumerateDocuments().AsParallel();
it stops working, and I don't understand why.
The error appears exactly here (in my process method):
File.WriteAllText(htmFullPath, html);
It tells me that the file is already opened by another program.
I don't understand what can cause my program not to work as expected. Since my documents variable is an IEnumerable returning unique values, why is my process method breaking?
Thanks for any advice.
[Edit] Code for retrieving documents:
/// <summary>
/// Get all documents in data store
/// </summary>
public static DocumentsDS FindAll(short? year)
{
    Database db = DatabaseFactory.CreateDatabase(connStringName); // MS Entlib
    DbCommand cm = db.GetStoredProcCommand("Document_Select");
    if (year.HasValue) db.AddInParameter(cm, "Year", DbType.Int16, year.Value);
    string[] tableNames = { "Documents", "Years" };
    DocumentsDS ds = new DocumentsDS();
    db.LoadDataSet(cm, ds, tableNames);
    return ds;
}
[Edit2] Possible source of my issue, thanks to mquander. If I write:
var test = EnumerateDocuments().AsParallel().Select(d => d.UniqueDocRef);
var testGr = test.GroupBy(d => d).Select(d => new { d.Key, Count = d.Count() }).Where(c=>c.Count>1);
var testLst = testGr.ToList();
Console.WriteLine(testLst.Where(x => x.Count == 1).Count());
Console.WriteLine(testLst.Where(x => x.Count > 1).Count());
I get this result:
0
1758
Removing the AsParallel returns the same output.
Conclusion: there is something wrong with my EnumerateDocuments and it returns each document twice.
I'll have to dive in there, I think.
My source enumeration is probably the cause.
I suggest you have each task put the file data into a global queue and have a separate thread take writing requests from the queue and do the actual writing.
Anyway, the performance of writing in parallel to a single disk is much worse than writing sequentially, because the disk needs to seek to each new writing location, so you are just bouncing the disk around between seeks. It's better to do the writes sequentially.
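A minimal sketch of that idea, using a BlockingCollection as the global queue and one task doing all of the disk writes (names and the example path are illustrative):

using System.Collections.Concurrent;
using System.IO;
using System.Threading.Tasks;

var writeQueue = new BlockingCollection<(string Path, string Content)>();

// Single writer: drains the queue and writes sequentially
var writerTask = Task.Run(() =>
{
    foreach (var item in writeQueue.GetConsumingEnumerable())
    {
        File.WriteAllText(item.Path, item.Content);
    }
});

// Producer side (e.g. inside ProcessDocument): queue the write instead of doing it
writeQueue.Add((@"x:\path\doc-e.htm", "<html>...</html>"));

// After all documents have been processed
writeQueue.CompleteAdding();
writerTask.Wait();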
Is Document.FindAll((short)year).Documents threadsafe? Because the difference between the first and the second version is that in the second (broken) version, this call is running multiple times concurrently. That could plausibly be the cause of the issue.
Sounds like you're trying to write to the same file. Only one thread/program can write to a file at a given time, so you can't use Parallel.
If you're reading from the same file, then you need to open the file with read-only access so as not to put a write lock on it.
The simplest way to fix the issue is to place a lock around your File.WriteAllText, assuming the writing is fast and it's worth parallelizing the rest of the code.
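Something like this (a sketch; only the write is serialized, the regex work in ProcessHtml stays parallel):

private static readonly object WriteLock = new object();

private static void WriteHtmlSafely(string path, string html)
{
    // Only one thread writes to disk at a time
    lock (WriteLock)
    {
        File.WriteAllText(path, html);
    }
}

ProcessDocument would then call WriteHtmlSafely(htmFullPath, html) instead of File.WriteAllText directly.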
