Property of an AppModel application not getting updated in SCCM - c#

I'm trying to update the SDMPackageXML property of an AppModel application through C# code. SDMPackageXML is an XML property, and I only need to update one node in it, named AutoInstall. Here is my code:
ObjectGetOptions opt = new ObjectGetOptions(null, System.TimeSpan.MaxValue, true);
var path = new ManagementPath("SMS_Application.CI_ID=16777568");
ManagementObject obj = new ManagementObject(scope, path, opt);
obj.Get();
foreach (PropertyData property in obj.Properties)
{
    if (property.Name == "SDMPackageXML")
    {
        // change the value of the AutoInstall node
        XmlDocument xml = new XmlDocument();
        xml.LoadXml(property.Value.ToString());
        var autoInstallTag = xml.GetElementsByTagName("AutoInstall");
        autoInstallTag[0].InnerText = "false";
        property.Value = xml.OuterXml;
    }
}
obj.Put();
The problem is that obj.Put(); updates nothing on the SCCM server. Can someone help me please?

Similar to what I described in this answer, the main problem here is that Microsoft uses a special method to serialize this XML. Deserialization still works with the default classes, but there is no documentation on how to serialize it again (I'm fairly sure it is possible, but I don't know enough about the format to do it).
Instead of documentation, they provide wrapper classes for this, which ship with the SCCM console (located in the bin directory of the console's installation folder).
In this case that would be Microsoft.ConfigurationManagement.ApplicationManagement.dll. Unlike in PowerShell, where the dependencies in the same path seem to get loaded automatically, here you also have to reference at least Microsoft.ConfigurationManagement.ApplicationManagement.TaskSequenceInstaller.dll.
There are further DLLs with names like Microsoft.ConfigurationManagement.ApplicationManagement.MsiInstaller.dll, but at least in my tests the two above were the only ones needed. If you see deserialization failing with "InvalidPropertyException" errors, you might need the DLL matching your specific application type.
With those two DLLs referenced you can write something like the code below. (Note that I deserialized using the DLL as well, because it is already loaded anyway and it produces a convenient Application object whose properties can be modified directly. This is technically not necessary; you could deserialize as in your example and only use the serialization part.)
ManagementObject obj = new ManagementObject(@"\\<siteserver>\root\SMS\site_<sitecode>:SMS_Application.CI_ID=<id>");
Microsoft.ConfigurationManagement.ApplicationManagement.Application app =
    Microsoft.ConfigurationManagement.ApplicationManagement.Serialization.SccmSerializer.DeserializeFromString(obj["SDMPackageXML"].ToString(), true);
app.AutoInstall = true;
obj["SDMPackageXML"] = Microsoft.ConfigurationManagement.ApplicationManagement.Serialization.SccmSerializer.SerializeToString(app, true);
obj.Put();
One thing to keep in mind is that it can be a little tricky to reference applications by their CI_ID, because every time you update an application, the ID of the currently valid revision changes (the old ID can still be used to reference the older revision). So if you change an application fetched by ID and then change it back using the same ID, it will look as if only the first change worked. I don't know whether this matters for you (if you just collect all IDs and then change every application only once, it should not matter), but if it does, you can search for the application by its name plus IsLatest = 'true' in the WQL query to always get the current revision.
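For illustration, a query along these lines should return only the latest revision of an application by name (a rough sketch: "7-Zip" is a placeholder display name, scope is the same ManagementScope as above, and since SMS_Application is a lazy class you would still need obj.Get() on the returned instance before reading SDMPackageXML):
// Sketch: fetch the latest revision of an application by display name instead of by CI_ID.
var query = new ObjectQuery(
    "SELECT * FROM SMS_Application WHERE LocalizedDisplayName = '7-Zip' AND IsLatest = 'true'");
using (var searcher = new ManagementObjectSearcher(scope, query))
{
    foreach (ManagementObject app in searcher.Get())
    {
        // app["CI_ID"] refers to the current revision of the application
        Console.WriteLine(app["CI_ID"]);
    }
}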

WF4.5- Expression Activity type 'CSharpValue`1' requires compilation in order to run

Background
What I'm trying to do is use a scoped variable of one of my models in the XAML.
In my workflow project "MyProject.Workflows" I have created model classes, code activities and XAML files; they are all under the same namespace. In another project ("Engine") I load and execute these workflows.
To load the workflows in the "Engine", I use ActivityXamlServices with an ActivityXamlServicesSettings instance that has CompileExpressions = true.
When calling ActivityXamlServices, I use a XamlXmlReader with XamlXmlReaderSettings where I actually point to the "MyProject.Workflows" DLL.
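For reference, the loading code looks roughly like this (a simplified sketch of what I do; the xamlPath parameter and the LoadWorkflow wrapper are just for illustration):
using System.Activities;
using System.Activities.XamlIntegration;
using System.Xaml;

public static Activity LoadWorkflow(string xamlPath)
{
    var readerSettings = new XamlXmlReaderSettings
    {
        // point the reader at the assembly containing the models and activities
        LocalAssembly = typeof(MyProject.Workflows.Models.MyObject).Assembly
    };
    var serviceSettings = new ActivityXamlServicesSettings
    {
        CompileExpressions = true // should compile the C# expressions in the XAML
    };
    using (var xamlReader = new XamlXmlReader(xamlPath, readerSettings))
    {
        return ActivityXamlServices.Load(xamlReader, serviceSettings);
    }
}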
Since both projects are in the same solution, I reference MyProject.Workflows directly from the "Engine".
Earlier they were in different solutions, and when I tried this it complained that it couldn't find the "MyProject.Workflows" DLL, even though I pointed to it in the XamlXmlReaderSettings.
I then tried loading the DLL into the app domain and that worked, but I did not want to deal with app domains, so I put both projects into one solution so I could reference "MyProject.Workflows" from the "Engine".
Issue:
If I use one of those models inside the XAML in an expression (for example in an "Assign" activity), the workflow's expressions aren't getting compiled when I try to execute it.
For example, if I use this in an "Assign" activity with a scoped variable of type MyObject:
Newtonsoft.Json.JsonConvert.DeserializeObject<MyProject.Workflows.Models.MyObject>(inputString);
I will get the below error message when I run the workflow.
NotSupportedException: 'Expression Activity type 'CSharpValue`1' requires compilation in order to run. Please ensure that the workflow has been compiled.'
If I remove these objects and just deal with strings or ints, it works fine.
Things I found in my research:
I found that this was a bug in .NET Framework 4.5, but I'm using 4.6.
Even though I used CompileExpressions = true, I also tried this Compile method I found, but it did not change a thing.
private static void Compile(DynamicActivity dynamicActivity)
{
    TextExpressionCompilerSettings settings = new TextExpressionCompilerSettings
    {
        Activity = dynamicActivity,
        Language = "C#",
        ActivityName = dynamicActivity.Name.Split('.').Last() + "_CompiledExpressionRoot",
        ActivityNamespace = string.Join(".", dynamicActivity.Name.Split('.').Reverse().Skip(1).Reverse()),
        RootNamespace = null,
        GenerateAsPartialClass = false,
        AlwaysGenerateSource = true,
    };

    TextExpressionCompilerResults results =
        new TextExpressionCompiler(settings).Compile();
    if (results.HasErrors)
    {
        throw new Exception("Compilation failed.");
    }

    ICompiledExpressionRoot compiledExpressionRoot =
        Activator.CreateInstance(results.ResultType,
            new object[] { dynamicActivity }) as ICompiledExpressionRoot;
    CompiledExpressionInvoker.SetCompiledExpressionRootForImplementation(
        dynamicActivity, compiledExpressionRoot);
}
I read that some people faced this problem and had to move the models to a different namespace. I did that too, but it didn't fix the problem.
My Xaml file has this entry added at the top.
xmlns:local="clr-namespace:MyProject.Workflows.Models"
Can someone please help me to get through this?

How to find in VSPackage which version control system a solution uses

I'm new to extending Visual Studio and I'm trying to find a way to determine which source control system is used by the current solution.
I created a VSPackage project and I am able to obtain a reference to the solution via IVsSolution and to hook up to solution events via IVsSolutionEvents.
Inside OnAfterOpenSolution (or some other event, if there's a better alternative) I would like to act differently based on whether the solution uses TFS, Git or something else. How can I obtain this information?
I plan to support as many Visual Studio versions as possible, but if it isn't possible I would like to support at least VS2012 and higher.
Ok, after several hours of digging I've found a solution to this. Thanks to Mark Rendle's article and the source code of his NoGit extension, I found that the list of registered source control plugins is located in the registry under HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0_Config\SourceControlProviders (in the case of VS 2013).
So now we can get both the plugin GUID and the name of the provider. This sample code fetches those values:
var key = @"Software\Microsoft\VisualStudio\" + "12.0" + @"_Config\SourceControlProviders";
var subkey = Microsoft.Win32.Registry.CurrentUser.OpenSubKey(key);
var providerNames = subkey.GetSubKeyNames(); // (.Dump() removed; it is a LINQPad extension)
var dict = new Dictionary<Guid, String>();
foreach (var provGuidString in subkey.GetSubKeyNames())
{
    var provName = (string)subkey.OpenSubKey(provGuidString).GetValue("");
    dict.Add(Guid.Parse(provGuidString), provName);
}
Now, there are two ways I've found to obtain the GUID of the currently active provider.
Important update: apparently the second way of obtaining the currently active plugin does not work as expected. I strongly advise using the first solution.
This is the way based on the extension mentioned earlier:
var getProvider = GetService(typeof(IVsRegisterScciProvider)) as IVsGetScciProviderInterface;
Guid pGuid;
getProvider.GetSourceControlProviderID(out pGuid);
Or we can just go to HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0\CurrentSourceControlProvider and get the default value of this key:
var key2 = @"Software\Microsoft\VisualStudio\12.0\CurrentSourceControlProvider";
var guidString = (string)Microsoft.Win32.Registry.CurrentUser.OpenSubKey(key2).GetValue("");
var currentGuid = Guid.Parse(guidString);
Now we just take var activeProviderName = dict[currentGuid]; and that's all.
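Putting the first (recommended) approach together inside the solution-open event might look roughly like this (a sketch only, not tested across VS versions; it assumes your package class implements IVsSolutionEvents and that dict is the GUID-to-name dictionary built above):
public int OnAfterOpenSolution(object pUnkReserved, int fNewSolution)
{
    var getProvider = GetService(typeof(IVsRegisterScciProvider)) as IVsGetScciProviderInterface;
    if (getProvider != null)
    {
        Guid pGuid;
        getProvider.GetSourceControlProviderID(out pGuid);

        string providerName;
        if (dict.TryGetValue(pGuid, out providerName))
        {
            // branch here on TFS, Git, etc.
            System.Diagnostics.Debug.WriteLine("Active source control provider: " + providerName);
        }
    }
    return Microsoft.VisualStudio.VSConstants.S_OK;
}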

BizTalk Dynamic Disassembler Problems - The body part is NULL

I started with the solution here: http://social.technet.microsoft.com/wiki/contents/articles/20547.biztalk-server-dynamic-schema-resolver-real-scenario.aspx
It matches my scenario perfectly except for the send port, but that isn't necessary. I need the receive port to pick up the file and apply a schema to disassemble it. From there the orchestration does the mapping, some of it custom, etc.
I've done everything in the tutorial but I keep getting the following error.
"There was a failure executing the receive pipeline... The body part is NULL"
The things I don't get from the tutorial, but don't believe should be an issue, are:
I created a new solution and project to make the custom pipeline component (see figure 19), and thus the DLL file, meaning it is in its own namespace. However, it looks like in the tutorial they created the project within the main BizTalk solution (i.e. the one with the pipeline and the orchestration), so the namespace has "TechNetWiki.SchemaResolver." in it. Should I make the custom pipeline component use the namespace of my main solution? I'm assuming this shouldn't matter, because I should be able to use this component in other solutions; it is meant to be generic with respect to the business rules associated with the BizTalk application.
The other piece I don't have is figure 15: under the "THEN Action" they set the destination schema they would like to disassemble to, but then they put #Src1 at the end of "http://TechNetWiki.SchemaResolver.Schemas.SRC1_FF#Src1". What is the #Src1 for?
In the sample you've linked to, the Probe method of the pipeline component pushes the first 4 characters of the filename into a typed message that is then passed into the rules engine. It's those 4 characters that match the "SRC1" in the example.
string srcFileName = pInMsg.Context.Read("ReceivedFileName", "http://schemas.microsoft.com/BizTalk/2003/file-properties").ToString();
srcFileName = Path.GetFileName(srcFileName);

// Take the first four characters as the source code used to call the BRE API
string customerCode = srcFileName.Substring(0, 4);

// create an instance of the XML document
XmlDocument xmlDoc = new XmlDocument();
xmlDoc.LoadXml(string.Format(@"<ns0:Root xmlns:ns0='http://TechNetWiki.SchemaResolver.Schemas.SchemaResolverBRE'>
                                   <SrcCode>{0}</SrcCode>
                                   <MessageType></MessageType>
                               </ns0:Root>", customerCode));

// retrieve the source code from the cache dictionary if it is already there
if (cachedSources.ContainsKey(customerCode))
{
    messageType = cachedSources[customerCode];
}
else
{
    TypedXmlDocument typedXmlDocument = new TypedXmlDocument("TechNetWiki.SchemaResolver.Schemas.SchemaResolverBRE", xmlDoc);
    Microsoft.RuleEngine.Policy policy = new Microsoft.RuleEngine.Policy("SchemaResolverPolicy");
    policy.Execute(typedXmlDocument);
    // ... (the rest of the sample reads the resolved MessageType back out and caches it)
}
So the matching rule is based on the first 4 characters of the filename. If one isn't matched, the Probe method returns false, i.e. unrecognised.
The final part is that the message type is pushed into the returned message. It is made up of the namespace and the root schema node with a # separator, so your #Src1 is the root node.
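For illustration only, pushing the resolved type onto the message usually looks something like this in the custom component (a sketch; the article's sample may do it slightly differently, and the type string shown is the one from your figure 15):
// The resolved type is "<TargetNamespace>#<RootNodeName>".
string resolvedType = "http://TechNetWiki.SchemaResolver.Schemas.SRC1_FF#Src1";

// Promote it as the BizTalk system MessageType property so the disassembler
// and downstream subscriptions can use it.
pInMsg.Context.Promote(
    "MessageType",
    "http://schemas.microsoft.com/BizTalk/2003/system-properties",
    resolvedType);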
You need to implement IProbeMessage on the pipeline component class.
I forgot to add IProbeMessage in the code in the article; it is updated now. It was already there in the sample source code.
Src1 is the root node name of the schema. As mentioned in the article, the message type is TargetNamespace#Root.
I recommend downloading the sample code.
I hope this helps.
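For reference, a bare-bones sketch of what implementing IProbeMessage on the component looks like (interface and signature are from Microsoft.BizTalk.Component.Interop; the class name and body are only placeholders):
using Microsoft.BizTalk.Component.Interop;
using Microsoft.BizTalk.Message.Interop;

public class SchemaResolverComponent : IProbeMessage // plus IBaseComponent, IDisassemblerComponent, etc.
{
    // Called by the pipeline to ask whether this component recognises the message.
    public bool Probe(IPipelineContext pContext, IBaseMessage pInMsg)
    {
        // Read the filename, call the BRE policy, promote the message type (see above)...
        // Return false when the first four characters don't match any rule.
        return true; // placeholder
    }
}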

C# code completion with NRefactory 5

I just found out about NRefactory 5 and I would guess that it is the most suitable solution for my current problem. At the moment I'm developing a little C# scripting application for which I would like to provide code completion. Until recently I did this using Microsoft's "Roslyn" project, but since the latest update of that project requires .NET Framework 4.5, I can't use it any more, as I would like the app to run under Windows XP as well. So I have to switch to another technology here.
My problem is not the compilation stuff; that can be done, with some more effort, with the .NET CodeDomProvider as well. The problem is the code completion stuff. As far as I know, NRefactory 5 provides everything required for code completion (parser, type system, etc.), but I just can't figure out how to use it. I took a look at the SharpDevelop source code, but they don't use NRefactory 5 for code completion there; they only use it as a decompiler. As I couldn't find an example of how to use it for code completion on the net either, I thought I might find some help here.
The situation is as follows: I have one single file containing the script code. Actually it is not even a file but a string which I get from the editor control (by the way, I'm using AvalonEdit for this. Great editor!), plus some assemblies that need to be referenced. So, no solution files, no project files etc., just one string of source code and the assemblies.
I've taken a look at the demo that comes with NRefactory 5 and the article on CodeProject and came up with something like this:
var unresolvedTypeSystem = syntaxTree.ToTypeSystem();
IProjectContent pc = new CSharpProjectContent();
// Add parsed files to the type system
pc = pc.AddOrUpdateFiles(unresolvedTypeSystem);
// Add referenced assemblies:
pc = pc.AddAssemblyReferences(new CecilLoader().LoadAssemblyFile(
System.Reflection.Assembly.GetAssembly(typeof(Object)).Location));
My problem is that I have no clue how to go on. I'm not even sure whether this is the right approach to accomplish my goal. How do I use the CSharpCompletionEngine? What else is required? etc. You see, there are many things that are very unclear at the moment, and I hope you can bring some light into this.
Thank you all very much in advance!
I've just compiled an example project that does C# code completion with AvalonEdit and NRefactory.
It can be found on GitHub here.
Take a look at the method ICSharpCode.NRefactory.CSharp.CodeCompletion.CreateEngine. You need to create an instance of CSharpCompletionEngine and pass in the correct document and the resolvers. I managed to get it working for the Ctrl+Space completion scenario. However, I am having trouble with references to types that are in other namespaces. It looks like CSharpTypeResolveContext does not take the using namespace statements into account: if I resolve the references with CSharpAstResolver, they are resolved OK, but I am unable to correctly use that resolver in the code completion scenario...
UPDATE #1:
I've just managed to get this working by obtaining the resolver from the unresolved file.
Here is the snippet:
var mb = new DefaultCompletionContextProvider(doc, unresolvedFile);
var resolver3 = unresolvedFile.GetResolver(cmp, loc); // get the resolver from unresolvedFile
var engine = new CSharpCompletionEngine(doc, mb, new CodeCompletionBugTests.TestFactory(resolver3), pctx, resolver3.CurrentTypeResolveContext );
UPDATE #2:
Here is the complete method. It references classes from the unit test projects, so you would need to reference/copy them into your project:
public static IEnumerable<ICompletionData> DoCodeComplete(string editorText, int offset) // not the best way to pass in the whole string every time
{
    var doc = new ReadOnlyDocument(editorText);
    var location = doc.GetLocation(offset);
    string parsedText = editorText; // TODO: Why are there different values in the test cases?
    var syntaxTree = new CSharpParser().Parse(parsedText, "program.cs");
    syntaxTree.Freeze();
    var unresolvedFile = syntaxTree.ToTypeSystem();
    var mb = new DefaultCompletionContextProvider(doc, unresolvedFile);

    IProjectContent pctx = new CSharpProjectContent();
    var refs = new List<IUnresolvedAssembly> { mscorlib.Value, systemCore.Value, systemAssembly.Value };
    pctx = pctx.AddAssemblyReferences(refs);
    pctx = pctx.AddOrUpdateFiles(unresolvedFile);

    var cmp = pctx.CreateCompilation();
    var resolver3 = unresolvedFile.GetResolver(cmp, location);

    var engine = new CSharpCompletionEngine(doc, mb, new CodeCompletionBugTests.TestFactory(resolver3), pctx, resolver3.CurrentTypeResolveContext);
    engine.EolMarker = Environment.NewLine;
    engine.FormattingPolicy = FormattingOptionsFactory.CreateMono();

    var data = engine.GetCompletionData(offset, controlSpace: false);
    return data;
}
Hope it helps,
Matra
NRefactory 5 is being used in SharpDevelop 5. The source code for SharpDevelop 5 is currently available in the newNR branch on GitHub. I would take a look at the CSharpCompletionBinding class, which has code to display a completion list window using information from NRefactory's CSharpCompletionEngine.

NUnit is not failing test with dynamic keyword of .Net 4.0

I am using NUnit with Visual Studio 2010 Express Edition for C#. Normally the tests work fine, but whenever I try to use Massive.cs, the open source API for database access linked above, the tests that go through that file fail. If I run the application itself, the API works fine. I have created a separate library project for the data access.
I seriously don't understand the error. It just says that an object reference is not set to an instance of an object, yet if I run the same code outside of the test, it works fine. I am using the dynamic keyword as shown in the API link above. Does that cause a problem with NUnit?
Is there any other way to test in this type of scenario?
Here are further details of the code.
The test class looks like this:
dynamic item = new Item();
item.Insert(new { Name = "Maggi", Description = "Its 2 Min Nuddles", IsDelete = false });
var items = item.All();
Assert.AreEqual("Maggi", items.FirstOrDefault().Name);
I have put the test here, and it gives the error shown in the image.
If I run the code in a console application instead, it works fine; that code snippet is given below:
dynamic item = new Item();
item.Insert(new { Name = "Maggi", Description = "Its 2 Min Nuddles", IsDelete = false });
var result = item.All();
foreach (var i in result)
{
    Console.WriteLine(i.Name + i.Description);
}
Console.Read();
Here the code works, while the same thing does not work in an NUnit test. Please have a look and help me out. Let me know if any further information is needed from my side.
The most probable explanation is that you haven't set up your connection string in the test project.
If you are using NUnit, just put it in the app.config of your test project.
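For example, the test project's app.config would need an entry along these lines (the name and connection string below are placeholders; the name has to match whatever your Item model/Massive setup expects):
<!-- Hypothetical app.config for the test project -->
<configuration>
  <connectionStrings>
    <add name="DefaultConnection"
         connectionString="Data Source=.;Initial Catalog=MyDatabase;Integrated Security=True"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>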
Solved... There was an issue with NUnit testing: it was not picking up the config file properly. So I made two changes in the project settings.
The first change was to set the Application Base to bin\debug, and the second was to change the configuration file name from .config to .exe.config. After that, things were up and running. :)
