Grouping Data from a text file in C#

I am currently working on a small project that returns text from a 'txt' file based on criteria and then groups it before I export it to a database. In the text file I have:
c:\test\123
Other Lines...
c:\test\124
Problem: "description of error". (this error is for directory 124)
Problem: "description of error". (this error is for directory 124)
c:\test\125
...
I would like to group the 'problems' with their associated directory when importing them to the database. So far I have tried using 'foreach' to return the rows where the line contains/begins with the directory or problem. Although this passes the values in order, it is not clear to users which directory each problem belongs to. Ideally I am after:
Directory(column1) Problem(column2)
c:\test\123 || Null
c:\test\124 || Problem: "description of Error".
c:\test\124 || Problem: "description of Error".
c:\test\125 || Null
Any help that you can give would be greatly appreciated. I have been racking my brains on this for the last week!
(current code)
var lines = File.ReadAllLines(filename);
foreach (var line in File.ReadLines(filename))
{
String stringTest = line;
if (stringTest.Contains(directory))
{
String test = stringTest;
var csb = new SqlConnectionStringBuilder();
csb.DataSource = host;
csb.InitialCatalog = catalog;
csb.UserID = user;
csb.Password = pass;
using (var sc = new SqlConnection(csb.ConnectionString))
using (var cmd = sc.CreateCommand())
{
sc.Open();
cmd.CommandText = "DELETE FROM table";
cmd.CommandText = "INSERT INTO table (ID, Directory) values (NEWID(), #val)";
cmd.Parameters.AddWithValue("#VAL", test);
cmd.ExecuteNonQuery();
sc.Close();
}
}
if (stringTest.Contains(problem))
{
Same for problem....

Here is one solution:
Assuming that you have the following class to hold a result item:
public class ResultItem
{
public string Directory { get; set; }
public string Problem { get; set; }
}
You can do the following:
var lines = File.ReadAllLines(filename);
string current_directory = null;
List<ResultItem> results = new List<ResultItem>();
//maintain the number of results added for the current directory
int problems_added_for_current_directory = 0;
foreach (var line in lines)
{
if (line.StartsWith("c:\\test"))
{
//If we are changing to a new directory
//And we didn't add any items for current directory
//Add a null result item
if (current_directory != null && problems_added_for_current_directory == 0)
{
results.Add(new ResultItem
{
Directory = current_directory,
Problem = null
});
}
current_directory = line;
problems_added_for_current_directory = 0;
}
else if (line.StartsWith("Problem"))
{
results.Add(new ResultItem
{
Directory = current_directory,
Problem = line
});
problems_added_for_current_directory++;
}
}
//If we are done looping
//And we didn't add any items for current (last) directory
//Add a null result item
if (current_directory != null && problems_added_for_current_directory == 0)
{
results.Add(new ResultItem
{
Directory = current_directory,
Problem = null
});
}
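Once the results list is built, writing it to the database in one pass is straightforward. Below is a minimal sketch that reuses the table name, ID/Directory columns and the host/catalog/user/pass variables from the question; the extra Problem column is an assumption based on the desired two-column output:
// Sketch: single connection, one parameterized INSERT per grouped row.
// Table/column names come from the question; the Problem column is assumed.
var csb = new SqlConnectionStringBuilder
{
DataSource = host,
InitialCatalog = catalog,
UserID = user,
Password = pass
};
using (var sc = new SqlConnection(csb.ConnectionString))
using (var cmd = sc.CreateCommand())
{
sc.Open();
cmd.CommandText = "INSERT INTO table (ID, Directory, Problem) VALUES (NEWID(), @dir, @problem)";
var pDir = cmd.Parameters.Add("@dir", SqlDbType.NVarChar, 260);
var pProblem = cmd.Parameters.Add("@problem", SqlDbType.NVarChar, -1);
foreach (var item in results)
{
pDir.Value = item.Directory;
// Directories with no problems get a NULL in the Problem column.
pProblem.Value = (object)item.Problem ?? DBNull.Value;
cmd.ExecuteNonQuery();
}
}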

Related

How to read and separate segments of a txt file?

I have a txt file that has headers and then 3 columns of values, i.e.
Description=null
area = 100
1,2,3
1,2,4
2,1,5 ...
... 1,2,1//(these are the values that I need in one list)
Then another segment
Description=null
area = 10
1,2,3
1,2,4
2,1,5 ...
... 1,2,1//(these are the values that I need in one list).
In fact I just need one list per "table" of values; the values are always in 3 columns, but there are n segments. Any ideas?
Thanks!
List<double> VMM40xyz = new List<double>();
foreach (var item in VMM40blocklines)
{
if (item.Contains(','))
{
VMM40xyz.AddRange(item.Split(',').Select(double.Parse).ToList());
}
}
I tried this, but it just puts all the values into one big list.
It looks like you want your data to end up in a format like this:
public class SetOfData //Feel free to name these parts better.
{
public string Description = "";
public string Area = "";
public List<double> Data = new List<double>();
}
...stored somewhere in...
List<SetOfData> finalData = new List<SetOfData>();
So, here's how I'd read that in:
public static List<SetOfData> ReadCustomFile(string Filename)
{
if (!File.Exists(Filename))
{
throw new FileNotFoundException($"{Filename} does not exist.");
}
List<SetOfData> returnData = new List<SetOfData>();
SetOfData currentDataSet = null;
using (FileStream fs = new FileStream(Filename, FileMode.Open))
{
using (StreamReader reader = new StreamReader(fs))
{
while (!reader.EndOfStream)
{
string line = reader.ReadLine();
//This will start a new object on every 'Description' line.
if (line.Contains("Description="))
{
//Save off the old data set if there is one.
if (currentDataSet != null)
returnData.Add(currentDataSet);
currentDataSet = new SetOfData();
//Now, to make sure there is something after "Description=" and to set the Description if there is.
//Your example data used "null" here, which this will take literally to be a string containing the letters "null". You can check the contents of parts[1] inside the if block to change this.
string[] parts = line.Split('=');
if (parts.Length > 1)
currentDataSet.Description = parts[1].Trim();
}
else if (line.Contains("area = "))
{
//Just in case your file didn't start with a "Description" line for some reason.
if (currentDataSet == null)
currentDataSet = new SetOfData();
//And then we do some string splitting like we did for Description.
string[] parts = line.Split('=');
if (parts.Length > 1)
currentDataSet.Area = parts[1].Trim();
}
else
{
//Just in case your file didn't start with a "Description" line for some reason.
if (currentDataSet == null)
currentDataSet = new SetOfData();
string[] parts = line.Split(',');
foreach (string part in parts)
{
if (double.TryParse(part, out double number))
{
currentDataSet.Data.Add(number);
}
}
}
}
//Make sure to add the last set (guard against an empty file).
if (currentDataSet != null)
returnData.Add(currentDataSet);
}
}
return returnData;
}
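And a quick usage sketch (the file path here is just an example):
// Each SetOfData holds one "Description=" segment with its area and flattened values.
List<SetOfData> sets = ReadCustomFile(@"C:\data\segments.txt");
foreach (SetOfData set in sets)
{
Console.WriteLine($"{set.Description} / area {set.Area}: {set.Data.Count} values");
}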

Efficiently comparing two lists

So, I have this block of code in my app that scans a directory for files, makes a list of them, compares that list with a list of files from a database (if the path to that directory exists in the DB), and adds the difference between them to one of two other lists. Here it is:
if (id > 0)
{
var dbDrawingList = mdl_drawing.GetDrawingsByBaseId(id);
var counter = 0;
if (dbDrawingList.Count() < serverDrawingList.Count())
{
counter = serverDrawingList.Count();
}
else
{
counter = dbDrawingList.Count();
}
for (int i = 0; i <= counter; i++)
{
if (i < serverDrawingList.Count())
{
if (dbDrawingList.Select(f => f.partNumber).Contains(serverDrawingList[i].partNumber) == false)
{
onServerAndNotDb.Add(serverDrawingList[i]);
}
}
if (i < dbDrawingList.Count())
{
if (serverDrawingList.Select(f => f.partNumber).Contains(dbDrawingList[i].partNumber) == false)
{
onDbAndNotServer.Add(dbDrawingList[i]);
}
}
}
serverDrawingList = null;
dbDrawingList = null;
}
Does anyone have a better way of doing this? (There can be more than one file with the same name, so the Except method doesn't work.)
I have one method that does exactly that; this is the way I implemented it, and it's working so far:
private void deleteRegistersFromFilesThatWasRemoved(string path)
{
// add local files to list
List<string> allFiles = new List<string>();
string[] dirs = Directory.GetDirectories(path, "*",
SearchOption.TopDirectoryOnly);
foreach (var dir in dirs)
{
string[] files = Directory.GetFiles(dir, "*", SearchOption.TopDirectoryOnly);
foreach (var file in files)
{
if(file != path)
{
allFiles.Add(file);
}
}
}
// list for file records in the database
List<string> record = new List<string>();
string queryfiles = //your query
SqlCommand cmd = new SqlCommand(queryfiles, connection);
cmd.Connection.Open();
SqlDataReader read = cmd.ExecuteReader();
while (read.Read())
{
// I have path and file name separated that's why here's a string sum
string r = read[2].ToString() + read[1].ToString();
record.Add(r);
}
cmd.Connection.Close();
for (int i = 0; i < record.Count; i++)
{
if (!allFiles.Contains(record[i]))
{
// do something if the record on the database is not in the local
//files list
}
}
}
}
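If partNumber is the field that decides whether two entries match, another option is to build a lookup set for each side first, so every membership test is O(1) instead of scanning the other list with Contains. A rough sketch against the drawing lists from the original question (the property and list names are taken from the posted code, and partNumber is assumed to be a string):
// Build key sets once; each item is then checked against the other side's keys.
var dbKeys = new HashSet<string>(dbDrawingList.Select(f => f.partNumber));
var serverKeys = new HashSet<string>(serverDrawingList.Select(f => f.partNumber));
// On the server but not in the database.
var onServerAndNotDb = serverDrawingList.Where(f => !dbKeys.Contains(f.partNumber)).ToList();
// In the database but not on the server.
var onDbAndNotServer = dbDrawingList.Where(f => !serverKeys.Contains(f.partNumber)).ToList();
Unlike Except, this keeps duplicate entries on each side, because every item is tested individually against the other side's key set rather than being de-duplicated.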

Import two CSV, add specific columns from one CSV and import changes to new CSV (C#)

I have to import 2 CSVs.
CSV 1 [49]: includes about 50 tab-separated columns.
CSV 2 [2]: includes 3 columns which should replace the [3], [6] and [11] columns of my first CSV.
So here's what I do:
1) Import the CSV and split it into an array.
string employeedatabase = "MYPATH";
List<String> status = new List<String>();
StreamReader file2 = new System.IO.StreamReader(filename);
string line = file2.ReadLine();
while ((line = file2.ReadLine()) != null)
{
string[] ud = line.Split('\t');
status.Add(ud[0]);
}
String[] ud_status = status.ToArray();
PROBLEM 1: I have about 50 columns to handle, and ud_status is just the first, so do I need 50 Lists and 50 string arrays?
2) Import the second CSV and split it into arrays.
List<String> vorname = new List<String>();
List<String> nachname = new List<String>();
List<String> username = new List<String>();
StreamReader file = new System.IO.StreamReader(employeedatabase);
string line3 = file.ReadLine();
while ((line3 = file.ReadLine()) != null)
{
string[] data = line3.Split(';');
vorname.Add(data[0]);
nachname.Add(data[1]);
username.Add(data[2]);
}
String[] db_vorname = vorname.ToArray();
String[] db_nachname = nachname.ToArray();
String[] db_username = username.ToArray();
PROBLEM 2: After loading these two CSVs I don't know how to combine them and change the columns as mentioned above.
Something like this?
mynewArray = ud_status + "\t" + ud_xy[..n] + "\t" + changed_column + ud_xy[..n];
Then save "mynewarray" into a tab-separated CSV with encoding "utf-8".
To read the file into a meaningful format, you should set up a class that defines the format of your CSV:
public class CsvRow
{
public string vorname { get; set; }
public string nachname { get; set; }
public string username { get; set; }
public CsvRow (string[] data)
{
vorname = data[0];
nachname = data[1];
username = data[2];
}
}
Then populate a list of this:
List<CsvRow> rows = new List<CsvRow>();
StreamReader file = new System.IO.StreamReader(employeedatabase);
string line3 = file.ReadLine();
while ((line3 = file.ReadLine()) != null)
{
rows.Add(new CsvRow(line3.Split(';')));
}
Similarly format your other CSV and include unused properties for the new fields. Once you have loaded both, you can populate the new properties from this list in a loop, matching the records by whatever common field the CSVs hopefully share. Then finally output the resulting data to a new CSV file.
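A rough sketch of that final step, assuming the first file has been read into string arrays (one per row, ~50 tab-separated fields) and that field 0 is the username shared with the second CSV. The csv1Rows list, the join key, and which replacement value goes into which column are illustrative assumptions, not from the original post; only the [3]/[6]/[11] positions come from the question:
// csv1Rows: hypothetical List<string[]> holding the rows of the first, tab-separated CSV.
// rows: the CsvRow records loaded above from the second CSV.
var output = new List<string>();
foreach (string[] fields in csv1Rows)
{
// Assumed join key: field 0 of the first file matches CsvRow.username.
CsvRow match = rows.FirstOrDefault(r => r.username == fields[0]);
if (match != null)
{
fields[3] = match.vorname;
fields[6] = match.nachname;
fields[11] = match.username;
}
output.Add(string.Join("\t", fields));
}
// Write the merged rows back out as a tab-separated, UTF-8 file.
File.WriteAllLines("merged.csv", output, Encoding.UTF8);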
Your solution is not to use string arrays to do this. That will just drive you crazy. It's better to use the System.Data.DataTable object.
I didn't get a chance to test the LINQ lambda expression at the end of this (or really any of it, I wrote this on a break), but it should get you on the right track.
using (var ds = new System.Data.DataSet("My Data"))
{
ds.Tables.Add("File0");
ds.Tables.Add("File1");
string[] line;
using (var reader = new System.IO.StreamReader("FirstFile"))
{
//first we get columns for table 0
foreach (string s in reader.ReadLine().Split('\t'))
ds.Tables["File0"].Columns.Add(s);
while (!reader.EndOfStream)
{
//and now the rest of the data.
line = reader.ReadLine().Split('\t');
var r = ds.Tables["File0"].NewRow();
for (int i = 0; i < line.Length; i++)
{
r[i] = line[i];
}
ds.Tables["File0"].Rows.Add(r);
}
}
//we could probably do these in a loop or a second method,
//but you may want subtle differences, so for now we just do it the same way
//for file1
using (var reader2 = new System.IO.StreamReader("SecondFile"))
{
foreach (string s in reader2.ReadLine().Split('\t'))
ds.Tables["File1"].Columns.Add(s);
while (!reader2.EndOfStream)
{
//and now the rest of the data.
line = reader2.ReadLine().Split('\t');
var r = ds.Tables["File1"].NewRow();
for (int i = 0; i < line.Length; i++)
{
r[i] = line[i];
}
ds.Tables["File1"].Rows.Add(r);
}
}
//you now have these in functioning datatables. Because we named columns,
//you can call them by name specifically, or by index, to replace in the first datatable.
string[] columnsToReplace = new string[] { "firstColumnName", "SecondColumnName", "ThirdColumnName" };
for(int i = 0; i < ds.Tables[0].Rows.Count; i++)
{
//you didn't give a sign of any relation between the two tables
//so this is just by row, and assumes the row count is equivalent.
//This is also not advised.
//if there is a key these sets of data share
//you should join on them instead.
DataRow dr = ds.Tables[0].Rows[i];
dr[3] = ds.Tables[1].Rows[i][columnsToReplace[0]];
dr[6] = ds.Tables[1].Rows[i][columnsToReplace[1]];
dr[11] = ds.Tables[1].Rows[i][columnsToReplace[2]];
}
//ds.Tables[0] now has the output you want.
string output = String.Empty;
foreach (var s in ds.Tables[0].Columns)
output = String.Concat(output, s ,"\t");
output = String.Concat(output, Environment.NewLine); // columns ready, now the rows.
foreach (DataRow r in ds.Tables[0].Rows)
output = string.Concat(output, string.Join("\t", r.ItemArray), Environment.NewLine);
if(System.IO.File.Exists("MYPATH"))
using (System.IO.StreamWriter file = new System.IO.StreamWriter("MYPATH")) //or a variable instead of string literal
{
file.Write(output);
}
}
With Cinchoo ETL - an open source file helper library - you can do the merge of the CSV files as below. This assumes the 2 CSV files contain the same number of lines.
string CSV1 = #"Id Name City
1 Tom New York
2 Mark FairFax";
string CSV2 = #"Id City
1 Las Vegas
2 Dallas";
dynamic rec1 = null;
dynamic rec2 = null;
StringBuilder csv3 = new StringBuilder();
using (var csvOut = new ChoCSVWriter(new StringWriter(csv3))
.WithFirstLineHeader()
.WithDelimiter("\t")
)
{
using (var csv1 = new ChoCSVReader(new StringReader(CSV1))
.WithFirstLineHeader()
.WithDelimiter("\t")
)
{
using (var csv2 = new ChoCSVReader(new StringReader(CSV2))
.WithFirstLineHeader()
.WithDelimiter("\t")
)
{
while ((rec1 = csv1.Read()) != null && (rec2 = csv2.Read()) != null)
{
rec1.City = rec2.City;
csvOut.Write(rec1);
}
}
}
}
Console.WriteLine(csv3.ToString());
Hope it helps.
Disclaimer: I'm the author of this library.

Check if a location is indexed in Windows Search

How do I check whether a location is indexed or not? I found the following code to index a location in Windows, which works fine, but I want to check whether it is already indexed before I add it.
Uri path = new Uri(location);
string indexingPath = path.AbsoluteUri;
CSearchManager csm = new CSearchManager();
CSearchCrawlScopeManager manager = csm.GetCatalog("SystemIndex").GetCrawlScopeManager();
manager.AddUserScopeRule(indexingPath, 1, 1, 0);
manager.SaveAll();
I have found a way to check if the location has been included for indexing by using IncludedInCrawlScope.
CSearchManager csm = new CSearchManager();
CSearchCrawlScopeManager manager = csm.GetCatalog("SystemIndex").GetCrawlScopeManager();
if (manager.IncludedInCrawlScope(indexingPath) == 0)
{
manager.AddUserScopeRule(indexingPath, 1, 1, 0);
manager.SaveAll();
}
But it only checks whether it has been added for indexing, not whether the indexing is complete. Since I will be querying the SystemIndex, I need to make sure that the location is indexed.
I ran into a similar need and this is what I came up with. In my case I have certain file extensions that are going to end up being sent to a document management system.
I have two methods: one uses System.IO to get a list of the files in the directory that match one of the extensions in the list.
public IEnumerable<string> DirectoryScan(string directory)
{
List<string> extensions = new List<string>
{
"docx","xlsx","pptx","docm","xlsm","pptm","dotx","xltx","xlw","potx","ppsx","ppsm","doc","xls","ppt","doct","xlt","xlm","pot","pps"
};
IEnumerable<string> myFiles =
Directory.GetFiles(directory, "*", SearchOption.AllDirectories)
.Where(s => extensions.Any(s.EndsWith))
.ToList();
return myFiles;
}
The second method uses the Windows Search index via Microsoft.Search.Interop:
public IEnumerable<string> QueryWindowsDesktopSearch(string directory)
{
List<string> extensions = new List<string>
{ "docx","xlsx","pptx","docm","xlsm","pptm","dotx","xltx","xlw","potx","ppsx","ppsm","doc","xls","ppt","doct","xlt","xlm","pot","pps"};
string userQuery = "*";
Boolean fShowQuery = true;
List<string> list = new List<string>();
CSearchManager manager = new CSearchManager();
CSearchCatalogManager catalogManager = manager.GetCatalog("SystemIndex");
CSearchQueryHelper queryHelper = catalogManager.GetQueryHelper();
queryHelper.QueryWhereRestrictions = string.Format("AND (\"SCOPE\" = 'file:{0}')", directory);
if (extensions != null)
{
queryHelper.QueryWhereRestrictions += " AND Contains(System.ItemType,'";
bool fFirst = true;
foreach (string ext in extensions)
{
if (!fFirst)
{
queryHelper.QueryWhereRestrictions += " OR ";
}
queryHelper.QueryWhereRestrictions += "\"" + ext + "\"";
fFirst = false;
}
queryHelper.QueryWhereRestrictions += "') ";
}
string sqlQuery = queryHelper.GenerateSQLFromUserQuery(userQuery);
using (OleDbConnection connection = new OleDbConnection(queryHelper.ConnectionString))
{
using (OleDbCommand command = new OleDbCommand(sqlQuery, connection))
{
connection.Open();
OleDbDataReader dataReader = command.ExecuteReader();
while (dataReader.Read())
{
var file = dataReader.GetString(0);
if (file != null)
{
list.Add(file.Replace("file:", ""));
}
}
}
}
return list;
}
I call both of these methods from another method that takes the two results, compares them, and returns a Boolean value indicating whether the two lists match. If they do not match, then the folder has not been indexed fully.
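That comparison method isn't shown, but a minimal version could look like the sketch below (the method name is made up, and treating the comparison as an order-insensitive set check is an assumption):
// Returns true when the file system and the Windows Search index
// report the same files for the given directory.
public bool IsFolderFullyIndexed(string directory)
{
var onDisk = new HashSet<string>(DirectoryScan(directory), StringComparer.OrdinalIgnoreCase);
var inIndex = new HashSet<string>(QueryWindowsDesktopSearch(directory), StringComparer.OrdinalIgnoreCase);
return onDisk.SetEquals(inIndex);
}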
If you call QueryWindowsDesktopSearch on a folder that has not been indexed, it returns zero files. You could use this as an indication that the folder isn't in the index, but it's possible that the folder has been added to the index while indexing is stopped.
You could check the status by calling something like this
CSearchManager manager = new CSearchManager();
CSearchCatalogManager catalogManager = manager.GetCatalog("SystemIndex");
_CatalogPausedReason pReason;
_CatalogStatus pStatus;
catalogManager.GetCatalogStatus(out pStatus, out pReason);
That may return something like pStatus = CATALOG_STATUS_PAUSED and pReason = CATALOG_PAUSED_REASON_USER_ACTIVE, in which case you would know that the index is not running. Another thing you could do is call the following:
int incrementalCount, notificationQueue, highPriorityQueue;
catalogManager.NumberOfItemsToIndex(out incrementalCount, out notificationQueue, out highPriorityQueue);
This returns, via the plIncrementalCount parameter, the number of files that the entire SystemIndex has queued for indexing.
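Putting those two calls together, a rough "has the indexer caught up?" check might look like this; it assumes the interop enum exposes CATALOG_STATUS_IDLE alongside the values mentioned above:
CSearchManager manager = new CSearchManager();
CSearchCatalogManager catalogManager = manager.GetCatalog("SystemIndex");
_CatalogStatus pStatus;
_CatalogPausedReason pReason;
catalogManager.GetCatalogStatus(out pStatus, out pReason);
int incrementalCount, notificationQueue, highPriorityQueue;
catalogManager.NumberOfItemsToIndex(out incrementalCount, out notificationQueue, out highPriorityQueue);
// Assumption: an idle catalog with an empty incremental queue means indexing has caught up.
bool indexUpToDate = pStatus == _CatalogStatus.CATALOG_STATUS_IDLE && incrementalCount == 0;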
Check this implementation from a document management system:
https://code.google.com/p/olakedms/source/browse/SearchEngine/CSearchDAL.cs?r=171

C# Observable Collection LDAP Paths Children for WPF TreeView

I'm hoping someone can help. I'm a long-time Windows Forms/ASPX user moving to WPF.
Not expecting a coded answer to this, but any pointers on a different way to approach would be greatly appreciated - I am probably approaching this in a very backward way.
So the objective is to have an ObservableCollection with sub ObservableCollection "children" within it, to then bind to my WPF TreeView control.
I can bind my collection to the TreeView without issues, and have styled it with checkboxes and images as desired. Frustratingly, it's the ObservableCollection with children of children of children that I am having trouble generating in the first place.
I have a table in SQL with LDAP paths, and various other information I'm storing against each LDAP path, which I read into my ObservableCollection.
A single level is no problem; the bit I'm struggling with is sorting the sub objects of sub objects by LDAP path, so that when I bind to the TreeView it is presented the way AD OUs are structured.
EG:
TopOU
Users
Front Office Users
Helpdesk Users
Example LDAP Paths in my DB
LDAP://OU=Front Office Users,OU=Users,OU=TopOU,DC=dev,DC=local
LDAP://OU=Helpdesk Users,OU=Users,OU=TopOU,DC=dev,DC=local
LDAP://OU=Users,OU=TopOU,DC=dev,DC=local
LDAP://OU=TopOU,DC=dev,DC=local
private ObservableCollection<AssignmentData> OUTreeAssignmentsCollection = new ObservableCollection<AssignmentData>();
public class AssignmentData : INotifyPropertyChanged
{
public Int32 AssignmentID { get; set; }
public String AssignmentName { get; set; }
public AssignmentTypes AssignmentType { get; set; }
//other stuff....
//For TreeView all sub nodes
public ObservableCollection<AssignmentData> Children { get; set; }
}
I then start to read from my db in a rather nasty way, and this is where it all goes wrong, and I could use some pointers.
cmd = new SqlCommand("SELECT UserGroups.UserGroupID, UserGroups.Name, UserGroups.LDAPPath FROM UserGroups WHERE UserGroups.TypeID=1", DBCon);
reader = cmd.ExecuteReader();
while (reader.Read())
{
String strLDAPHierarchical = GetLDAPHierarchical(reader[2].ToString());
AssignmentData newItem = new AssignmentData()
{
AssignmentID = Convert.ToInt32(reader[0]),
AssignmentName = reader[1].ToString(),
AssignmentImage = ouIcon,
AssignmentLDAPPath = reader[2].ToString(),
AssignmentCNPath = GetCNFromLDAPPath(reader[2].ToString()),
AssignmentTooltip = GetADSLocationTooltip(reader[2].ToString()),
AssignmentType = AssignmentTypes.UserOU,
AssignmentLDAPHierarchical = strLDAPHierarchical
};
if (strLDAPHierarchical.Contains(","))
{
//Now check all the root nodes exist to continue
String strLDAPHierarchicalCheckPath = strLDAPHierarchical;
String[] SplitLDAPHierarchical = strLDAPHierarchical.Split(new Char[] { ',' });
Int32 reverseI = SplitLDAPHierarchical.Length - 1;
String prevPath = "";
for (int i = 0; i < SplitLDAPHierarchical.Length; i++)
{
String path = SplitLDAPHierarchical[reverseI];
//now check if this node is already there and if not look it up and create it
if (path != "")
{
if (i == 0) { strLDAPHierarchicalCheckPath = path; }
else { strLDAPHierarchicalCheckPath = path + "," + prevPath; }
WriteLog("CHECK:" + strLDAPHierarchicalCheckPath);
LookupItemByLDAPHierarchical(strLDAPHierarchicalCheckPath, newItem);
if (i == 0) { prevPath = path; }
else { prevPath = path + "," + prevPath; }
reverseI = reverseI - 1;
}
}
}
else
{
//is top level object, so create at the root of the collection
UserOUCollection.Add(newItem);
}
Function to add sub items :-/
internal AssignmentData LookupItemByLDAPHierarchical(String strLDAPHierarchical, AssignmentData fromItem)
{
AssignmentData currentItem = null;
foreach (AssignmentData d in UserOUCollection)
{
if (d.AssignmentLDAPHierarchical == strLDAPHierarchical) { currentItem = d; break; }
if (d.Children != null)
{
currentItem = CheckChildNodesByLDAPHierarchical(d, strLDAPHierarchical);
if (currentItem != null) { break; }
}
}
String strMessage = "null";
if (currentItem != null) { strMessage = currentItem.AssignmentLDAPPath; }
if (currentItem == null)
{
String strWhere = "LDAPPath LIKE 'LDAP://" + strLDAPHierarchical + "%'";
SqlConnection DBCon = new SqlConnection(SQLString);
DBCon.Open();
SqlCommand cmd = new SqlCommand("SELECT UserGroupID, Name, LDAPPath FROM UserGroups WHERE " + strWhere + " AND TypeID=1", DBCon);
SqlDataReader reader = cmd.ExecuteReader();
while (reader.Read())
{
strLDAPHierarchical = GetLDAPHierarchical(reader[2].ToString());
AssignmentData newItem = new AssignmentData()
{
AssignmentID = Convert.ToInt32(reader[0]),
AssignmentName = reader[1].ToString(),
AssignmentImage = ouIcon,
AssignmentLDAPPath = reader[2].ToString(),
AssignmentCNPath = GetCNFromLDAPPath(reader[2].ToString()),
AssignmentTooltip = GetADSLocationTooltip(reader[2].ToString()),
AssignmentType = AssignmentTypes.UserOU,
AssignmentLDAPHierarchical = strLDAPHierarchical
};
String strLDAPHierarchicalCheckPath = strLDAPHierarchical;
foreach (String path in strLDAPHierarchical.Split(new Char[] { ',' }))
{
//now check if this node is already there and if not look it up and create it
if (path != "")
{
strLDAPHierarchicalCheckPath = strLDAPHierarchicalCheckPath.Replace(path + ",", "");
currentItem = LookupItemByLDAPHierarchical(strLDAPHierarchicalCheckPath, currentItem);
if (null == currentItem)
{
UserOUCollection.Add(newItem); //new root item
}
else
{
if (currentItem.Children == null)
{
//add new child
currentItem.Children = new ObservableCollection<AssignmentData> { newItem };
}
else
{
//add more children to existing
currentItem.Children.Add(newItem);
}
}
currentItem = null;
}
}
//Find a current Item to add the node to
//currentItem = LookupItemByLDAPHierarchical(strLDAPHierarchical);
}
reader.Close();
reader.Dispose();
DBCon.Close();
DBCon.Dispose();
}
return currentItem;
}
With my current solution I get a treeview with sub nodes of sub nodes, but they are wrong, with lots of duplication etc. I have spent literally days trying to fix my probably overcomplicated attempt above, but have come to the conclusion that I'm probably going about it the wrong way.
Any help greatly appreciated!
Just having a peruse ;) through your code. I think I can see why you have lots of duplications. It looks like your first SQL query gets all parent/child records. Then the second query will go and get some of those records again, if that makes sense.
One approach would be to only get the top level items in your first query. Possibly by getting SQL to count the number of commas.
SELECT UserGroups.UserGroupID, UserGroups.Name, UserGroups.LDAPPath,
LEN(LDAPPath) - LEN(REPLACE(LDAPPath, ',', '')) AS CommaCount
FROM UserGroups
WHERE UserGroups.TypeID = 1
AND LEN(LDAPPath) - LEN(REPLACE(LDAPPath, ',', '')) = 2
Since you asked for a different approach, I'd say it's not very efficient to repeatedly query the database in a loop. When I'm building a tree of parent/child objects I'd normally get all parent/child records in one query, build a flat dictionary of all the objects, then loop through it and make the parent/child associations.
The dictionary can also be useful to lookup your objects later on either directly by key or to loop through without having to make a recursive function that crawls the tree.
So I'd suggest that you break it down into 2 blocks of code.
First block: using your existing query that gets all of the items, create a flat Dictionary with everything in it.
The key of each item should probably be the result from GetLDAPHierarchical().
Second block: next, loop through the dictionary and create the hierarchy. Add anything with no parent directly to the UserOUCollection:
foreach(AssignmentData d in myDictionary.Values)
{
String parentKey = GetParentLDAPKey(d.AssignmentLDAPHierarchical);
if (myDictionary.ContainsKey(parentKey))
{
myDictionary[parentKey].Children.Add(d);
}
else
{
UserOUCollection.Add(d);
}
}
GetParentLDAPKey() will need to produce the same key as its parent by removing the first part of the LDAP path.
Hope that points you in the right direction.
H
Thanks so much to hman, who pointed me in a much more logical direction. I used LDAPPath as my dictionary key.
Dictionary<String, AssignmentData> UserOUDictionary = new Dictionary<String, AssignmentData>();
//Read from DB
cmd = new SqlCommand("SELECT UserGroups.UserGroupID, UserGroups.Name, UserGroups.LDAPPath FROM UserGroups WHERE UserGroups.TypeID=1", DBCon);
reader = cmd.ExecuteReader();
while (reader.Read())
{
AssignmentData newItem = new AssignmentData()
{
AssignmentID = Convert.ToInt32(reader[0]),
AssignmentName = reader[1].ToString(),
AssignmentImage = ouIcon,
AssignmentLDAPPath = reader[2].ToString(),
AssignmentCNPath = GetCNFromLDAPPath(reader[2].ToString()),
AssignmentTooltip = GetADSLocationTooltip(reader[2].ToString()),
AssignmentType = AssignmentTypes.UserOU,
};
UserOUDictionary.Add(reader[2].ToString(), newItem);
}
reader.Close();
reader.Dispose();
//Now Read OU List into TreeView Collection
foreach (AssignmentData d in UserOUDictionary.Values)
{
String parentKey = GetParentLDAPKey(d.AssignmentLDAPPath);
if (UserOUDictionary.ContainsKey(parentKey))
{
AssignmentData parentItem = UserOUDictionary[parentKey];
if (parentItem.Children == null) { parentItem.Children = new ObservableCollection<AssignmentData> { d }; } //add first child
else { parentItem.Children.Add(d); } //add more children to existing
}
else
{
UserOUCollection.Add(d); //add to root of control
}
}
private String GetParentLDAPKey(String strLDAPPath)
{
String retParentKey = strLDAPPath;
if (strLDAPPath.Contains(","))
{
retParentKey = retParentKey.Replace("LDAP://", "");
retParentKey = retParentKey.Remove(0, retParentKey.IndexOf(",") + 1);
retParentKey = "LDAP://" + retParentKey;
}
return retParentKey;
}
