FilteredElementCollector Lincoln = new FilteredElementCollector(doc);
Lincoln.OfCategory(BuiltInCategory.OST_RvtLinks);
Autodesk.Revit.DB.View CurrentView = uiDoc.ActiveView;
ICollection<ElementId> Toggle_On = Lincoln.ToElementIds();
Toggle_On.Clear();
ICollection<ElementId> Toggle_Off = Lincoln.ToElementIds();
Toggle_Off.Clear();
List<Element> Processed = new List<Element>();
List<string> Revit_On = new List<string>();
List<string> Revit_Off = new List<string>();
List<string> Revit_Names = new List<string>();
foreach (Element One_Link in Lincoln)
{
    string Revit_Name = One_Link.Name;
    if (!Revit_Names.Contains(Revit_Name)) // prevents processing the same link twice; but the visibility still does NOT change!
    {
        Revit_Names.Add(Revit_Name);
        Boolean Is_Hidden = One_Link.IsHidden(CurrentView);
        if (Is_Hidden)
        {
            Toggle_On.Add(One_Link.Id);
            Revit_On.Add(One_Link.Name);
        } // this apparently does detect what is hidden
        else
        {
            Toggle_Off.Add(One_Link.Id);
            Revit_Off.Add(One_Link.Name);
        }
    }
}
Transaction Do_Toggle = new Transaction(doc, "DoToggle");
Do_Toggle.Start();
if (!Toggle_Off.Count.Equals(0)) { CurrentView.HideElements(Toggle_Off); }
if (!Toggle_On.Count.Equals(0)) { CurrentView.UnhideElements(Toggle_On); }
Do_Toggle.Commit();
Is the transaction somehow failing? Undo is not available, so Revit doesn't think it has done anything that might need to be undone. Note that this EXACT code is used in another one of my add-ins (in which multiple optional sub-programs are controlled by picking radio options on a form). But when I try to use the code in a standalone version, it fails (without errors). Note also that I've inserted multiple TaskDialog entries to verify that it is indeed finding the RvtLinks that are either visible or hidden in the current view. But it is simply refusing to actually change their visibility. If I run the dialog-controlled version, everything toggles, but if I immediately run the standalone, nothing toggles (proving it isn't a case of uneditable pinned links). I have made this option available to users by making "Toggle Links" the default, so they can call up my collected program and just hit a carriage return, but I need this to be a true standalone.
Your code confuses me. For instance, why do you initialise the Toggle_On and Toggle_Off collections with member values, only to clear them immediately afterwards?
In any case, your use of the transaction does not follow the recommended pattern of enclosing it in a using statement.
Please refer to The Building Coder topic group on Handling Transactions and Transaction Groups for more information on using transactions in the Revit API.
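For illustration, here is a minimal sketch of that pattern, reusing the doc, CurrentView, Toggle_On and Toggle_Off variables from the code above (the transaction name is arbitrary):
using (Transaction Do_Toggle = new Transaction(doc, "Toggle Link Visibility"))
{
    Do_Toggle.Start();
    // Hide the links that are currently visible and unhide the hidden ones.
    if (Toggle_Off.Count > 0) { CurrentView.HideElements(Toggle_Off); }
    if (Toggle_On.Count > 0) { CurrentView.UnhideElements(Toggle_On); }
    Do_Toggle.Commit();
    // If Commit is never reached, Dispose rolls the transaction back automatically.
}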
Related
I am working on a task in which the code automatically opens a drawing selected by the user [in a UI], selects all the objects in the drawing and explodes all of them until they can't be exploded any further. While doing this I face a problem: the original (un-exploded) 3D object is still present in the drawing, superimposed by the exploded object. Every recursive call of the Explode function creates a new exploded 3D object of that object.
Here is a snippet of the code I am working on:
PromptSelectionResult ss = ed.SelectAll();
using (DocumentLock acLckDoc = doc.LockDocument())
{
using (Transaction tr = db.TransactionManager.StartTransaction())
{
DBObjectCollection objs = new DBObjectCollection();
foreach (SelectedObject so in ss.Value)
{
Entity ent = (Entity)tr.GetObject(so.ObjectId, OpenMode.ForWrite);
if (!(ent is Solid3d))
{
ent.Explode(objs);
ent.UpgradeOpen();
ent.Erase();
}
}
tr.Commit();
}
}
As soon as control reaches the ent.Erase() statement, it throws an exception, eCannotBeErasedByCaller. I can't figure out why. I have unlocked all layers, opened the entity for write, and the CommandFlags have been set to Session and UsePickSet (I have shuffled through all of them).
Anybody got any suggestions?
Looking at your description, you probably need a recursive explode. Some time ago I wrote code for this for other types of entities, but you can adjust it.
private List<DBObject> FullExplode(Entity ent)
{
// final result
List<DBObject> fullList = new List<DBObject>();
// explode the entity
DBObjectCollection explodedObjects = new DBObjectCollection();
ent.Explode(explodedObjects);
foreach (Entity explodedObj in explodedObjects)
{
// if the exploded entity is a blockref or mtext
// then explode again
if (explodedObj.GetType() == typeof(BlockReference) ||
explodedObj.GetType() == typeof(MText))
{
fullList.AddRange(FullExplode(explodedObj));
}
else
fullList.Add(explodedObj);
}
return fullList;
}
source: http://adndevblog.typepad.com/infrastructure/2013/04/get-cogopoint-label-text.html
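As a rough, untested sketch of how FullExplode might be wired into the loop from your question (db, tr and ss in scope as in the snippet above), the exploded pieces have to be appended to the current space before the original is erased:
BlockTableRecord space = (BlockTableRecord)tr.GetObject(db.CurrentSpaceId, OpenMode.ForWrite);
foreach (SelectedObject so in ss.Value)
{
    Entity ent = (Entity)tr.GetObject(so.ObjectId, OpenMode.ForWrite);
    // Add every fully exploded piece to the database...
    foreach (DBObject piece in FullExplode(ent))
    {
        Entity pieceEnt = piece as Entity;
        if (pieceEnt == null) continue;
        space.AppendEntity(pieceEnt);
        tr.AddNewlyCreatedDBObject(pieceEnt, true);
    }
    // ...and only then erase the original entity.
    ent.Erase();
}
tr.Commit();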
I finally found out the reason why the original objects weren't getting erased.
In an earlier part of the code, an AutoCAD Plant 3D dwg is exported to AutoCAD (ExporttoAutoCAD / Saveas), and this was creating proxy items. These can't be deleted manually or via code.
The only way is to explode the pipe lines and inline assets before exporting the file. This happens automatically if you export the file, but if you use Saveas, you will have to explode the pipe components before you export the file.
Wasted a lot of time understanding the cause, but finally got it!
So I have a cmdlet written in C#: Get-LivingCharacter. I want users to use this like Get-LivingCharacter -Name "Bran", but I would like to allow the list of available characters to change. Maybe today "Bran" is a valid name to pass to Get-LivingCharacter, but maybe in the future it will not be. Things happen.
For convenience I want to allow tab completion of this field. However, I can't seem to get that to work for non-const data sets. Dynamic fields don't even auto-complete the field name, never mind the value, and I don't know a way to implement this for a non-dynamic field. Conceptually, I could generate a .ps1 file on startup from the current data set and then load that .ps1 as the module, but this feels a bit like killing a pup with a greatsword - lots of overkill. Is there a better option?
I had already implemented a function similar to the DynamicParam helper function referenced in the comments. However, tab completion wasn't working. I was writing a minimal reproduction example when... my tab completion worked.
It turns out, it reproducibly works/breaks based on the inclusion of a WriteDebug statement:
[Cmdlet("Get", "LivingCharacter")]
public class GetLivingCharacter : Cmdlet, IDynamicParameters
{
protected override void ProcessRecord()
{
}
public object GetDynamicParameters()
{
WriteDebug("Getting names"); // Tab completion won't work with this here - comment it out and it works.
var chars = new List<String>() { "Bran", "Arya" };
var dict = new RuntimeDefinedParameterDictionary();
var attributes = new Collection<Attribute>
{
new ParameterAttribute
{
HelpMessage = "Enter a valid open name",
Mandatory = true
},
new ValidateSetAttribute(chars.ToArray()),
};
dict.Add("Name", new RuntimeDefinedParameter("Name", typeof(string), attributes));
return dict;
}
}
After some digging, it turns out the WriteDebug statement is throwing (which I assume is because it can't output while I'm typing). PowerShell then recreates the GetLivingCharacter instance after I've finished the command in order to validate it. It took a while to find because, due to the issue, I couldn't write the error to the console, so I had to append it to a temp file instead.
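One possible workaround, sketched below and not part of the original finding, is to guard the call so a failed WriteDebug cannot prevent the dynamic parameters from being returned; the post does not name the exception type, so the sketch simply catches and ignores anything thrown by the debug write:
public object GetDynamicParameters()
{
    try
    {
        // WriteDebug throws while the user is tab-completing (see above).
        WriteDebug("Getting names");
    }
    catch (Exception)
    {
        // A lost debug message is preferable to broken tab completion.
    }

    var chars = new List<String>() { "Bran", "Arya" };
    var dict = new RuntimeDefinedParameterDictionary();
    var attributes = new Collection<Attribute>
    {
        new ParameterAttribute
        {
            HelpMessage = "Enter a valid open name",
            Mandatory = true
        },
        new ValidateSetAttribute(chars.ToArray()),
    };
    dict.Add("Name", new RuntimeDefinedParameter("Name", typeof(string), attributes));
    return dict;
}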
I can't sort this weird issue out and I have tried anything and everything I can think of.
I've got 5 pages, every one of them passing variables with navigation this way:
Pass:
NavigationService.Navigate(new Uri("/myPage.xaml?key=" + myVariable, UriKind.Relative));
Retrieve:
if (NavigationContext.QueryString.ContainsKey("myKey"))
{
String retrievedVariable = NavigationContext.QueryString["myKey"].ToString();
}
I open a list on many pages, and one of the pages automatically deletes an item from the list actualProject (actualProject is a variable for a string list). Then, when I navigate back far enough to reach a specific page, the app throws an exception. Why? I have no idea.
The code that deletes the item:
// Remove the active subject from the available subjects
unlinkedSubjects.Remove(actualSubject);
unlinkedsubjectsListBox.ItemsSource = null;
unlinkedsubjectsListBox.ItemsSource = unlinkedSubjects;
Then the OnNavigatedTo event of the page that throws the exception:
protected override void OnNavigatedTo(NavigationEventArgs e)
{
if (NavigationContext.QueryString.ContainsKey("key"))
{
actualProject = NavigationContext.QueryString["key"];
try
{
//Read subjectList from IsolatedStorage
subjectList = readSetting(actualProject) != null ? (List<String>)readSetting(actualProject) : new List<String>();
//Put the subjectList into the subjectListBox
subjectListBox.ItemsSource = subjectList;
//Set the subjectsPageTitle to the "actualProject" value, to display the name of the current open project at the top of the screen
subjectsPageTitle.Text = actualProject;
}
catch (Exception)
{
if (language.Equals("en."))
{
// Language is set to english
MessageBox.Show("Couldn't open the project, please try again or please report the error to Accelerated Code - details on the about page");
}
else if (language.Equals("no."))
{
// Language is set to norwegian
MessageBox.Show("Kunne ikke åpne prosjektet, vennligst prøv igjen eller rapporter problemet til Accelerated Code - du finner detaljer på om-siden");
}
}
}
}
Exception:
_exception {System.ArgumentException: Value does not fall within the expected range.} System.Exception {System.ArgumentException}
My theory:
The app kind of loads the currently opened and modified List. Is that possible? No idea.
So there are a number of ways to pass data between pages.
The way you have chosen is the least recommended.
You can use the PhoneApplicationService.Current.State dictionary, but this is also messy if you have a ton of variables, doesn't persist after the app shuts down, and could be simplified.
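For reference, a minimal sketch of that State-dictionary alternative (Microsoft.Phone.Shell; the "ActualProject" key is just an example name):
// On the page that navigates away:
PhoneApplicationService.Current.State["ActualProject"] = actualProject;

// On the destination page, e.g. inside OnNavigatedTo:
object value;
if (PhoneApplicationService.Current.State.TryGetValue("ActualProject", out value))
{
    actualProject = (string)value;
}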
I wrote a free DLL that kept this exact scenario in mind called EZ_iso.
You can find it here
Basically what you would do to use it is this.
[DataContractAttribute]
public class YourPageVars{
[DataMember]
public Boolean Value1 = false;
[DataMember]
public String Value2 = "And so on";
[DataMember]
public List<String> MultipleValues;
}
Once you have your class set up, you can pass it easily between pages.
YourPageVars vars = new YourPageVars { /*Set all your values*/ };
//Now we save it
EZ_iso.IsolatedStorageAccess.SaveFile("PageVars",vars);
That's it! Now you can navigate and retrieve the file.
YourPageVars vars = (YourPageVars)EZ_iso.IsolatedStorageAccess.GetFile("PageVars",typeof(YourPageVars));
This is nice because you can use it for more than navigation. You can use it for anything that would require isolated storage. The data is serialized to the device, so even if the app shuts down it will remain. You can of course always delete the file if you choose as well.
Please make sure to refer to the documentation for any exceptions you hit. If you still need help, feel free to hit me up on Twitter @Anth0nyRussell or at amr@AnthonyRussell.info.
We recently upgraded from Ektron 8.6 to 9.0 (Ektron CMS400.NET, Version: 9.00 SP2(Build 9.0.0.249)).
I have some code (below) which we use to display links to items in a taxonomy. Under 8.6, this would show library items if they had been added to the taxonomy. As of 9.0, it no longer displays library items. It still works for DMS items and normal pages (all first class content in Ektron).
private List<ContentData> getTaxonomyItems(long TaxonomyId)
{
listContentManager = new ContentManager();
criteria = new ContentTaxonomyCriteria(ContentProperty.Id, EkEnumeration.OrderByDirection.Ascending);
criteria.PagingInfo = new Ektron.Cms.PagingInfo(400); // there are a lot of items and I don't want to page them.
criteria.AddFilter(TaxonomyId, true); // this gets sub taxonomies too :)
List<ContentData> contentList = listContentManager.GetList(criteria);
return contentList;
}
(I would love to simply tell users to use the DMS instead of the library, but we have a security requirement, and I'm not aware of a way to enforce security on DMS items like we can with library items by dropping a web.config file in the library folder.)
Is this a bug that anyone else has experienced?
Or is there a problem with my code (did an API change in the upgrade to 9.0)?
Thanks.
I ended up emailing Ektron support in Sydney (I'm in Australia), and they said:
I would expect ContentManager to only return content, not library items – must have been a loophole which is now closed. Taxonomy is the way to go.
So I used some of the code they provided and came up with the following, which appears to work...
private List<TaxonomyItemData> getTaxonomyItems(long TaxonomyId)
{
List<TaxonomyItemData> list = new List<TaxonomyItemData>();
TaxonomyManager taxManager = new TaxonomyManager(Ektron.Cms.Framework.ApiAccessMode.Admin);
TaxonomyCriteria taxonomyCriteria = new Ektron.Cms.Organization.TaxonomyCriteria();
taxonomyCriteria.AddFilter(Ektron.Cms.Organization.TaxonomyProperty.Path,
Ektron.Cms.Common.CriteriaFilterOperator.StartsWith, GetTaxonomyPathById(TaxonomyId));
List<TaxonomyData> TaxonomyDataList = taxManager.GetList(taxonomyCriteria);
foreach (TaxonomyData taxd in TaxonomyDataList)
{
TaxonomyData taxTree = taxManager.GetTree(taxd.Path,
1, // depth. doesn't seem to work. have to manually traverse lower taxonomies.
true, // include items
null,
Ektron.Cms.Common.EkEnumeration.TaxonomyType.Content,
Ektron.Cms.Common.EkEnumeration.TaxonomyItemsSortOrder.taxonomy_item_display_order);
foreach (TaxonomyItemData taxItem in taxTree.TaxonomyItems)
{
list.Add(taxItem);
}
}
return list;
}
private static String GetTaxonomyPathById(long taxonomyId)
{
TaxonomyManager tMgr = new TaxonomyManager();
TaxonomyData tData = tMgr.GetItem(taxonomyId);
if (tData != null)
{
return tData.Path;
}
return "";
}
This code fetches items for all the child taxonomies as well as returning library items.
The one problem is that it fetches duplicates for some items, but those are easy to clean out.
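As an example, one way to clean them out with LINQ (using System.Linq; this assumes TaxonomyItemData exposes an Id you can group on):
List<TaxonomyItemData> distinctItems = list
    .GroupBy(item => item.Id)
    .Select(group => group.First())
    .ToList();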
I was also told by Ektron that...
TaxonomyManager.GetItem("{path}") is a more efficient way to get the categories.
That's why I've included the GetTaxonomyPathById() method (inspired by this blog post: http://www.nimbleuser.com/blog/posts/2009/iterating-through-ektron-content-in-multiple-taxonomies-via-directly-interfacing-with-search-indexing-services/ )
I am using the SharePoint Object Model via a console app on the same server as the SharePoint installation, and using the following code:
SPSite MySite = new SPSite("http://server/");
SPWeb MyWeb = MySite.OpenWeb();
MyWeb.AllowUnsafeUpdates = true;
SPList MyList = MyWeb.Lists["Test"];
const string EmptyQuery = "0";
SPQuery q = new SPQuery { Query = EmptyQuery };
String Source = "Test String";
for( int i = 1; i < 1000; i++)
{
Console.WriteLine("Creating new item");
SPListItem MyItem = MyList.GetItems(q).Add();
Console.WriteLine("Created new item");
Console.WriteLine("Assigning Title Value");
MyItem["Title"] = Source.ToString();
Console.WriteLine("Assigned Title Value");
MyItem.Update();
}
I am getting a several-second pause between "Assigning Title Value" and "Assigned Title Value".
When I deploy the code as a Web Part it's instantaneous; the delay only occurs when the code is deployed as a console application.
Edit: More information! When I have more than one field being assigned, it's always the first field that is slow; any subsequent assignments are as fast as expected. If I switch the order of the fields around, it has no effect on the delay - the first field is always slow.
Any thoughts?
It looks like this is because you're not accessing any fields before using the setter. If you read at least one field first, you'd probably see a slight delay there, and no delay on the setter, because under the hood SetValue() calls EnsureFieldCollection(), which - if the fields for the list item haven't already been populated - has to check back to the SPList's fields collection.
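As a hypothetical sketch based on the loop in the question, touching any field once before the first assignment should move that cost to the read:
SPListItem MyItem = MyList.GetItems(q).Add();
// The first read forces the item's field collection to be populated...
object dummy = MyItem["Title"];
// ...so the first setter call no longer pays that price.
MyItem["Title"] = Source;
MyItem.Update();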
Also, this is not in your code snippet, but make sure you are disposing of your SPWeb and SPSite objects when you're done. A good pattern is to use using:
using(SPSite site = new SPSite("url"))
{
using(SPWeb web = site.OpenWeb())
{
//do stuff
}
}
First of all I would suggest making use of
using (SPSite MySite = new SPSite("http://server/"))
{
using (SPWeb MyWeb = MySite.OpenWeb())
{
//do your stuff here
}
}
You can check the EnsureFieldCollection explanation by reading all the fields of the item before updating the field. If, after that, your first field is updated instantly, you can be pretty sure that is the reason.
It may be a difference between Release and Debug builds, or the fact that it has the debugger attached. Try changing to a Release build, and if that doesn't work, try running it without the debugger (Ctrl+F5 in Visual Studio).