I just encountered strange behavior from my VSTO Outlook add-in. I am trying to delete all distribution lists stored in an Outlook folder.
Here is how I do that:
public Outlook.MAPIFolder ListsFolder;
foreach (DistListItem distList in ListsFolder.Items.OfType<DistListItem>())
{
distList.Delete();
}
It deletes a whole lot of lists, but strangely not all of them. A few always remain. As far as I can see, there's nothing special about those. All of the lists in this folder have been programmatically created by the same add-in, like this:
myList = ListsFolder.Items.Add(Outlook.OlItemType.olDistributionListItem) as Outlook.DistListItem;
Any ideas on what I might be doing wrong?
Firstly, never loop through all items in a folder - you wouldn't execute a SQL SELECT statement without a WHERE clause, would you? Use Items.Find/FindNext or Items.Restrict - let the store provider do the filtering. In your case, the search query would be [MessageClass] = 'IPM.DistList'.
Secondly, you are deleting items from the very collection you are iterating, which changes it - that is why some items are always skipped. You need to either loop down from Count to 1, or store all entry IDs in a string list and then loop through that list, opening each item with Namespace.GetItemFromID.
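A minimal sketch combining both suggestions (Items.Restrict plus a loop that counts down from Count to 1; ListsFolder is the folder from the question):

// Restrict to distribution lists only, then delete from the end of the
// collection so removals don't shift the items that are still unvisited.
Outlook.Items distLists = ListsFolder.Items.Restrict("[MessageClass] = 'IPM.DistList'");
for (int i = distLists.Count; i >= 1; i--)   // Outlook collections are 1-based
{
    var item = distLists[i] as Outlook.DistListItem;
    if (item != null)
    {
        item.Delete();
    }
}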
It's a struggle with any application that provides select fields populated from a datasource: everything works fine at first, but as the application ages, older entries may get deleted, and select fields that previously referenced them can no longer resolve the entity in question.
Opening a view in which a select points to an already deleted data row will, in the best case, show an empty string.
We designed our system so that deletions are not real delete operations but merely set a deleted flag (so all the information is still there).
However, when using data bindings with C# (or even without them), the most obvious use case still does not seem to be covered by any general mechanism (I assume):
The select field should show all not-deleted entities while creating a new object that references the entity in question.
The select field (populated the very same way) should still show the "deleted" entity if it was selected days/months/years ago.
Is there a "handy" solution to this?
Currently we use a "proxy method" for every datasource, which reloads the deleted entity's data if it is not in the "available data" collection - but it is hard to believe there is no better way to deal with this, since the problem applies to almost every language out there.
In a normalized database you would have a constraint with ON DELETE NO ACTION/RESTRICT that would prevent removal of a referenced element from the list. It would force you to decide what is to be done with the referencing rows.
With your manually-controlled deletions this could have been covered by a trigger. As none of these were implemented, you are left with only one thing to do: updating the dropdown with the selected option before rendering the UI. My approach (in Java, I'm not good at C#):
List<String> options = getNonDeletedWhatever();
if (!options.contains(currentEntity.getWhatever())) {
options.add(currentEntity.getWhatever()); // This optionally inserts an outdated value
}
or simply:
Set<String> options = getNonDeletedWhatever();
options.add(currentEntity.getWhatever()); // This optionally inserts an outdated value
I solve it by creating a list of available (non-deleted) items and if the selected item is a deleted one, then I add that item to the list.
This list becomes the data source for my dropdown.
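In C#, that might look roughly like this (a minimal sketch; GetNonDeletedWhatever, currentEntity, and the dropdown binding are placeholder names, not real API calls):

// Build the dropdown source from all non-deleted entities, then make sure the
// currently referenced (possibly soft-deleted) value is still selectable.
List<string> options = GetNonDeletedWhatever();   // hypothetical: loads non-deleted entries
string current = currentEntity.Whatever;          // hypothetical: the value selected long ago

if (!options.Contains(current))
{
    options.Add(current); // re-inserts the outdated (soft-deleted) value
}

// 'options' then becomes the data source for the dropdown.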
I have a SharePoint list on a site that I want to update nightly from a SQL Server DB, preferably using C#. Here is the catch: I do not know if any records were removed, added, or whether any field in any record has been updated. I believe the simplest thing to do would be to remove the data from the list and then replace it with the new data. But is there any simple way to do this? I would hate to remove 3000+ items line by line from the list and then add the 3000+ records one at a time.
It depends on your environment. If there is not much load on the systems at night, I would prefer one of the following approaches:
1) Build a timer job, delete the list (not the items one by one, because that is slow), recreate the list, and import the items from the DB. When we are talking about 3,000 - 5,000 elements, that is not much and it should be done in under 10 minutes.
2) Loop through the SharePoint list items and check field by field whether each one was updated in the DB; if so, update it.
I would prefer to delete the list and import the complete table, since we are not talking about that much data.
Another good option is to use BCS or BDC. Then the data would always be in place and synced with the DB. See:
https://msdn.microsoft.com/en-us/library/office/jj163782.aspx
https://msdn.microsoft.com/de-de/library/ee231515(v=vs.110).aspx
Unfortunately there is no "easy" or elegant way to delete all the items in a list, like the DELETE statement in SQL. You can either delete the entire list and recreate it (if the list can easily be created from a list definition) or, if your concern is performance, use the ProcessBatchData method that the SPWeb class has had since SP 2007. You can use it to batch commands and avoid the performance penalty of issuing 6000 separate commands to the server. However, it still requires you to pass ugly XML that lists all the items to be deleted or added.
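For illustration, a delete batch is roughly of the following shape (a sketch from memory against the server object model; spList stands for the target SPList, and the exact element and attribute names should be treated as an assumption to verify):

// Build a Batch XML payload that deletes every item in the list in one round trip.
var sb = new System.Text.StringBuilder();
sb.Append("<?xml version=\"1.0\" encoding=\"UTF-8\"?><Batch OnError=\"Continue\">");

foreach (SPListItem item in spList.Items)
{
    sb.AppendFormat(
        "<Method ID=\"{0}\">" +
        "<SetList>{1}</SetList>" +
        "<SetVar Name=\"Cmd\">Delete</SetVar>" +
        "<SetVar Name=\"ID\">{0}</SetVar>" +
        "</Method>",
        item.ID, spList.ID);
}

sb.Append("</Batch>");
spList.ParentWeb.ProcessBatchData(sb.ToString());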
The ideal way is to enumerate all the rows from the database and check whether each row already exists in the SharePoint list using a primary field value. If it already exists, simply update it[1]. Otherwise, add a new item.
[1] - Optionally, while updating, compare the list item field values with the database column values and update only if any field has changed; otherwise skip it.
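A hedged sketch of that upsert approach, assuming the server object model, a DataTable of database rows, and a list column named "PrimaryKey" that matches the database key (all of those names are placeholders):

using Microsoft.SharePoint;
using System.Data;

static void SyncList(SPList list, DataTable dbRows)
{
    foreach (DataRow row in dbRows.Rows)
    {
        string key = row["PrimaryKey"].ToString();   // placeholder key column

        // Let the store find a matching item instead of scanning in memory.
        var query = new SPQuery
        {
            Query = "<Where><Eq><FieldRef Name='PrimaryKey'/>" +
                    "<Value Type='Text'>" + key + "</Value></Eq></Where>",
            RowLimit = 1
        };

        SPListItemCollection matches = list.GetItems(query);
        SPListItem item = matches.Count > 0 ? matches[0] : list.Items.Add();

        // Only write fields that actually changed, to avoid needless updates.
        bool changed = false;
        foreach (DataColumn col in dbRows.Columns)
        {
            object newValue = row[col];
            if (!Equals(item[col.ColumnName], newValue))   // assumes matching column names
            {
                item[col.ColumnName] = newValue;
                changed = true;
            }
        }

        if (changed) item.Update();
    }
}

Rows that were deleted from the database would still need a separate pass that removes SharePoint items whose keys no longer appear in the table.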
I have a function which tries to remove a member from a group.
The problem is that if you try to remove a member without knowing whether it actually exists in the group, you can cause an exception.
So I try to enumerate its membership beforehand.
The problem now is that the member property stops after 3000 entries, and I don't know a way to get more, or the next 3000 members of that group.
Here is my code
DirectoryEntry target_group = new DirectoryEntry(LDAP_group_DN);
if (target_group.Properties["member"].Contains(LDAP_member_to_remove_DN)) {
target_group.Properties["member"].Remove(LDAP_member_to_remove_DN);
}
target_group.CommitChanges();
target_group.Properties["member"] contains exactly 3000 entries, but in reality there are around 7,500 members.
As a quick fix I am calling Remove without the .Contains() check inside a try/catch block, but that doesn't seem correct/beautiful/right.
Can anyone lead me to the correct way?
PS: I cannot change the structure of our directory.
This is a group of RADIUS users, which should not be split up into more groups!
Instead of getting all the group members to determine if the user is part of that list I would use the memberOf/isMemberOf attribute (assuming that your directory supports this feature). This attribute will tell you if a user belongs to a group without having to retrieve all group members.
This other answer might help.
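A minimal sketch of that check (assuming the user object can be opened directly and that the value compared against is the group's plain DN, without the "LDAP://" prefix; note that memberOf does not cover the primary group or nested membership):

// Assumes: using System; using System.DirectoryServices; using System.Linq;
static bool IsMemberOf(string memberAdsPath, string groupDn)
{
    using (var user = new DirectoryEntry(memberAdsPath))
    {
        // Read the user's memberOf attribute instead of enumerating the group's members.
        return user.Properties["memberOf"]
                   .Cast<string>()
                   .Any(dn => dn.Equals(groupDn, StringComparison.OrdinalIgnoreCase));
    }
}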
You need to look into MaxValRange and learn how to retrieve more values using C#.
We have a very simple sample, but, alas, it is in Java.
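In C#, ranged retrieval of the member attribute might look roughly like this (the attribute is requested as member;range=low-high and the server echoes back the range it actually served; the step of 1500 matches a common MaxValRange default, but the actual limit depends on the directory, and the helper name is a placeholder):

// Assumes: using System; using System.Collections.Generic;
//          using System.DirectoryServices; using System.Linq;
static List<string> GetAllMembers(string groupAdsPath)
{
    var members = new List<string>();
    const int step = 1500;   // common MaxValRange default; may differ per directory
    int low = 0;

    using (var group = new DirectoryEntry(groupAdsPath))
    {
        while (true)
        {
            string rangedAttribute = string.Format("member;range={0}-{1}", low, low + step - 1);

            using (var searcher = new DirectorySearcher(group, "(objectClass=*)",
                                                        new[] { rangedAttribute },
                                                        SearchScope.Base))
            {
                SearchResult result = searcher.FindOne();
                if (result == null) break;

                // The server echoes the range it served, e.g. "member;range=0-1499",
                // or "member;range=3000-*" on the final chunk.
                string returned = result.Properties.PropertyNames
                    .Cast<string>()
                    .FirstOrDefault(name => name.StartsWith("member;range=",
                                                            StringComparison.OrdinalIgnoreCase));
                if (returned == null) break;   // no more values

                foreach (object dn in result.Properties[returned])
                    members.Add((string)dn);

                if (returned.EndsWith("*")) break;   // final chunk received

                low += step;
            }
        }
    }
    return members;
}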
I am using a List of strings to store words from a JSON file. I parse the JSON and store the values in the list. My script looks like this:
public List<string> a = new List<string>();
void Start()
{
//JSON Parsing
var jd = JSONNode.Parse(jsonString);
print (jd.Count);
for (int no = 0; no < jd["A"].Count; no++)
{
a.Add(jd["A"][no].Value);
}
print ("A => "+a.Count);
}
If I have 10 values in the JSON, they are added to the list a and I get the print "A => 10". When I stop and run my project again, my Start method does the parsing again and adds values to list a, but the list count is now 20. If I run again, it becomes 30, and so on. I tried it on a device too; even after uninstalling and reinstalling, values get appended and I still get the count as 20. Is it always necessary to Clear() the list in Start() to make the count 0? If I don't call Clear() before adding strings to the list, it keeps the previous values even after stopping the app, both in the editor and on the device.
Make the List private or use the "NonSerialized" attribute on the List variable if you are running your project in "ExecuteInEditMode".
Based on your comment about ExecuteInEditMode: that will cause the list to be persisted. You might want that when, for example, you are live editing a level or environment and want the final values/settings to persist across runs and builds. Public fields get serialized for persistence; the NonSerialized attribute can be used to prevent a field from being persisted.
An alternative is to identify which values/state you want to persist and which you want reset when actually playing, and then handle that accordingly.
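A small sketch of the NonSerialized route (the class name, and the assumption that the script runs with ExecuteInEditMode, are illustrative, not taken from the original project):

using System.Collections.Generic;
using UnityEngine;

[ExecuteInEditMode]                       // assumed from the comment thread
public class WordLoader : MonoBehaviour   // hypothetical class name
{
    // Public fields are serialized by Unity, so values added while running in
    // edit mode are written into the scene/prefab and survive the next run.
    // NonSerialized keeps this field out of serialization entirely.
    [System.NonSerialized]
    public List<string> a = new List<string>();

    void Start()
    {
        a.Clear();   // alternatively, keep the field serialized and reset it here
        // ...parse the JSON and fill the list as before...
    }
}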
I have an Excel COM add-in which reads the CustomDocumentProperties section of a workbook.
This is how I access a particular entry from the CustomDocumentProperties section
DocumentProperties docProperties = (DocumentProperties)
xlWorkbook.CustomDocumentProperties;
docProperty = docProperties[propName];
The problem is that when the CustomDocumentProperties contain more than 8000 entries, the performance of this code is really bad. I ran a CPU profiler and it showed that the following line takes more than a minute:
docProperty = docProperties[propName];
Does anyone know how to improve the performance of accessing DocumentProperties?
Thanks!
I doubt that there is anything that you could do to improve the performance of the document properties. I believe that it is implemented as a simple list -- not as a dictionary or hash table. In fact, I don't believe that the list is sorted, so with 8000 entries, on average half of them, or 4000, would have to be accessed in order to find the property that you are looking for.
You might consider not using the CustomDocumentProperties as a dictionary. Instead, you might try putting all 8000 of your entries into a custom dictionary, serializing it, and then adding the entire serialized dictionary to the CustomDocumentProperties as a single entry. So to use it, you would access the CustomDocumentProperties, deserialize the dictionary, and then use it repeatedly. When done, if there were any changes to the dictionary, you would have to re-serialize it and save it back to the CustomDocumentProperties, which you would probably only want to do once -- for example, just before saving your workbook. (You might want to put code to re-serialize and save your custom dictionary to the CustomDocumentProperties within the Workbook.BeforeSave event.)
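A rough sketch of that idea (the property name and the naive key=value encoding are placeholders; a real serializer such as JSON or XML would be a better fit, and string-valued document properties may be limited in length, so a very large dictionary could need to be split or stored elsewhere):

// Assumes: using System.Collections.Generic; using System.Linq;
//          using Office = Microsoft.Office.Core; using Excel = Microsoft.Office.Interop.Excel;
static class PropertyBag
{
    const string BagName = "MyAddInPropertyBag";   // hypothetical property name

    public static Dictionary<string, string> Load(Excel.Workbook wb)
    {
        var props = (Office.DocumentProperties)wb.CustomDocumentProperties;
        foreach (Office.DocumentProperty p in props)
        {
            if (p.Name == BagName)
            {
                // Naive encoding: assumes keys and values contain neither ';' nor '='.
                string raw = (string)p.Value;
                return raw.Split(';')
                          .Select(pair => pair.Split('='))
                          .ToDictionary(kv => kv[0], kv => kv[1]);
            }
        }
        return new Dictionary<string, string>();
    }

    public static void Save(Excel.Workbook wb, Dictionary<string, string> bag)
    {
        var props = (Office.DocumentProperties)wb.CustomDocumentProperties;
        string raw = string.Join(";", bag.Select(kv => kv.Key + "=" + kv.Value));

        // Replace any existing bag property, then store the new value as one entry.
        foreach (Office.DocumentProperty p in props)
        {
            if (p.Name == BagName) { p.Delete(); break; }
        }
        props.Add(BagName, false, Office.MsoDocProperties.msoPropertyTypeString, raw);
    }
}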