Dynamically Loading Project Online Properties via CSOM - c#

I'm trying to load our projects from Project Online into a .NET application. While I can load the projects, one thing I'm having trouble with is loading all the columns I would like. I know I can specify the columns I want to load using Include, but I want to be able to use a list of columns that is generated dynamically.
It's part of an ETL program, and I would like to allow users to configure the list of columns being brought over into the cache database. Below is what I have so far:
static void Main(string[] args)
{
    ProjectContext pc = getProjCtxt();
    pc.Load(pc.Projects);
    pc.ExecuteQuery();

    ColumnNames fldList = new ColumnNames();

    var enumerator = pc.Projects.GetEnumerator();
    while (enumerator.MoveNext())
    {
        var p2 = enumerator.Current;
        pc.Load(p2);
        pc.ExecuteQuery();
        Console.WriteLine(p2.FinishDate);
    }

    foreach (PublishedProject p in pc.Projects)
    {
        var pubProj = p.IncludeCustomFields;
        pc.Load(pubProj);
        pc.ExecuteQuery();
        //Dictionary<string, object> projDict = pubProj.FieldValues;
        var type = p.GetType();
        foreach (ColumnNames.colInfo ci in fldList.impFields)
        {
            if (type.GetProperty(ci.FieldName) != null)
            {
                Console.WriteLine(p.FinishDate);
                Console.WriteLine(type.GetProperty(ci.FieldName).GetValue(p, null));
            }
        }
    }
}
I get an error on FinishDate because it hasn't been initialized. How do I initialize all the properties of a task/project so I can work with them when the program doesn't know ahead of time which columns it is looking for?
Is there a way to build up a string and pass it to Project Online to tell it which properties to initialize?

So I eventually found an answer buried here:
https://sharepoint.stackexchange.com/questions/89634/syntax-for-including-fields-dynamically-in-csom-query
I also managed to simplify and clean up the code, and made sure that only non-custom fields go into the explicit properties list, since the IncludeCustomFields option brings all the custom fields down with it.
static void Main(string[] args)
{
    ProjectContext pc = getProjCtxt();
    ColumnNames fldList = new ColumnNames();

    var q = from ColumnNames.colInfo fld in fldList.impFields
            where fld.CustomField == false
            select fld;
    Console.WriteLine(q.Count());

    foreach (ColumnNames.colInfo dynField in q)
    {
        pc.Load(pc.Projects, p => p.Include(pj => pj[dynField.FieldName]));
    }
    pc.Load(pc.Projects, p => p.Include(pj => pj.IncludeCustomFields));
    pc.ExecuteQuery();
}

Related

How to read data from Mongodb which have duplicate element name in c#

I am using MongoDB.Drivers in my C# MVC application to communicate with Mongodb database.
C# Code
var client = new MongoClient("mongodb://localhost:27012");
var db = client.GetDatabase("Test_DB");
var collection = db.GetCollection<BsonDocument>("TestTable");
var tData = await collection.FindAsync(new BsonDocument(true)); // new BsonDocument(true) sets AllowDuplicateNames, to allow duplicate element names while reading
[MongoDB data screenshot: the document contains multiple fields named DuplicateCol, each with a different value.]
When I try to read this data in C# using MongoDB.Driver, I get the following error: InvalidOperationException: Duplicate element name 'DuplicateCol'.
When inserting the duplicate element names I used AllowDuplicateNames = true on the BsonDocument, as below. (It inserts duplicate element names without error.)
BsonDocument obj = new BsonDocument();
obj.AllowDuplicateNames = true;
obj.Add("DuplicateCol", "Value_One");
obj.Add("propone", "newVal");
obj.Add("DuplicateCol", "Value_Two");
.... // other properties with value
await collection.InsertOneAsync(obj);
Note: this schema is a must; I cannot alter it.
Please suggest how I can fix this issue.
Any help would be highly appreciated.
Thanks.
If nothing else helps, you have reviewed the other answers and comments, and you still think you absolutely must keep the design described in your question, you can use the following hack. Create a class like this:
class AlwaysAllowDuplicateNamesBsonDocumentSerializer : BsonDocumentSerializer
{
    protected override BsonDocument DeserializeValue(BsonDeserializationContext context, BsonDeserializationArgs args)
    {
        if (!context.AllowDuplicateElementNames)
            context = context.With(c => c.AllowDuplicateElementNames = true);
        return base.DeserializeValue(context, args);
    }

    public override BsonDocument Deserialize(BsonDeserializationContext context, BsonDeserializationArgs args)
    {
        if (!context.AllowDuplicateElementNames)
            context = context.With(c => c.AllowDuplicateElementNames = true);
        return base.Deserialize(context, args);
    }
}
This is a custom serializer for BsonDocument which always sets AllowDuplicateElementNames while deserializing. You then need a bit of reflection to overwrite the default BsonDocument serializer (because BsonDocumentSerializer.Instance has no setter):
// get __instance field, which is backing field for Instance property
var instanceField = typeof(BsonDocumentSerializer).GetField("__instance", BindingFlags.Static | BindingFlags.NonPublic);
// overwrite with our custom serializer
instanceField.SetValue(null, new AlwaysAllowDuplicateNamesBsonDocumentSerializer());
By doing that somewhere at startup you will be able to read back your documents with duplicated attributes.
Full code for test:
static void Main(string[] args)
{
    var instanceField = typeof(BsonDocumentSerializer).GetField("__instance", BindingFlags.Static | BindingFlags.NonPublic);
    instanceField.SetValue(null, new AlwaysAllowDuplicateNamesBsonDocumentSerializer());
    TestMongoQuery();
    Console.ReadKey();
}

static async void TestMongoQuery()
{
    var client = new MongoClient();
    var db = client.GetDatabase("Test_DB");
    var collection = db.GetCollection<BsonDocument>("TestTable");
    using (var allDocs = await collection.FindAsync(FilterDefinition<BsonDocument>.Empty))
    {
        while (allDocs.MoveNext())
        {
            foreach (var doc in allDocs.Current)
            {
                var duplicateElements = doc.Elements.Where(c => c.Name == "DuplicateCol");
                foreach (var el in duplicateElements)
                {
                    Console.WriteLine(el.Name + ":" + el.Value);
                }
            }
        }
    }
}
The error you receive is by design
InvalidOperationException: Duplicate element name 'DuplicateCol'
As CodeFuller said, MongoDB document keys should be unique. If you need a document to contain duplicate field names and you cannot alter this schema, MongoDB might not be the right database solution for you; I honestly don't know which one would be. Although it does seem possible to save duplicate keys using:
AllowDuplicateNames=true
I imagine you will experience challenges with querying and indexing, among other things.
An argument could be made that this schema is a very strange requirement. A more appropriate schema might be:
{
"_id" : ObjectId("xxx"),
"propTest" : 0,
...
"duplicateCol": [ "Value_Two", "Value_One" ]
}
Here you have a single property (duplicateCol) which is an array which accepts multiple strings.
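With that shape, no serializer hacks are needed. A minimal sketch of writing and reading the array field with the C# driver (connection string, database, and field values are illustrative, matching the question's setup):

```csharp
using System;
using MongoDB.Bson;
using MongoDB.Driver;

var client = new MongoClient("mongodb://localhost:27017");
var collection = client.GetDatabase("Test_DB").GetCollection<BsonDocument>("TestTable");

// Write the formerly duplicated values as a single array field.
var doc = new BsonDocument
{
    { "propTest", 0 },
    { "duplicateCol", new BsonArray { "Value_One", "Value_Two" } }
};
await collection.InsertOneAsync(doc);

// Reading back is straightforward; no AllowDuplicateNames required.
var loaded = await collection.Find(FilterDefinition<BsonDocument>.Empty).FirstAsync();
foreach (var value in loaded["duplicateCol"].AsBsonArray)
{
    Console.WriteLine(value.AsString);
}
```

You can also index and query the array directly (e.g. a filter on "duplicateCol" matches any element), which is where the duplicate-key design would have fought you.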

Roslyn get IdentifierName in ObjectCreationExpressionSyntax

Currently I am working on simple code analysis for C# with Roslyn. I need to parse all documents of all projects inside one solution and get the classes used inside each document.
For example from:
class Program
{
static void Main(string[] args)
{
var foo = new Foo();
}
}
I want to find out that Program uses Foo.
I already parse all documents and get the classes declared inside:
// all projects in solution
foreach (var project in _solution.Projects)
{
    // all documents inside project
    foreach (var document in project.Documents)
    {
        var syntaxRoot = await document.GetSyntaxRootAsync();
        var model = await document.GetSemanticModelAsync();
        var classes = syntaxRoot.DescendantNodes().OfType<ClassDeclarationSyntax>();
        // all classes inside document
        foreach (var classDeclarationSyntax in classes)
        {
            var symbol = model.GetDeclaredSymbol(classDeclarationSyntax);
            var objectCreationExpressionSyntaxs = classDeclarationSyntax.DescendantNodes().OfType<ObjectCreationExpressionSyntax>();
            // all object creations inside document
            foreach (var objectCreationExpressionSyntax in objectCreationExpressionSyntaxs)
            {
                // TODO: Get the identifier value
            }
        }
    }
}
The problem is getting the identifier name Foo. In the debugger I can see that objectCreationExpressionSyntax.Type has an Identifier whose Text holds the value I need, but objectCreationExpressionSyntax.Type.Identifier seems to be inaccessible.
I could use the SymbolFinder to find all references to a class in the solution, but since I already parse all documents it should work without it.
Maybe I am on the wrong path? How do I get the identifier value?
You'll need to handle the different types of TypeSyntaxes. See here: http://sourceroslyn.io/#Microsoft.CodeAnalysis.CSharp/Syntax/TypeSyntax.cs,29171ac4ad60a546,references
What you see in the debugger is a SimpleNameSyntax, which does have a public Identifier property.
Update
var sns = objectCreationExpressionSyntax.Type as SimpleNameSyntax;
if (sns != null)
{
    return sns.Identifier.ToString();
}

var pts = objectCreationExpressionSyntax.Type as PredefinedTypeSyntax;
if (pts != null)
{
    return pts.Keyword.ToString();
}
...
All other subtypes would need to be handled. Note that ArrayTypeSyntax.ElementType is also a TypeSyntax, so you would most probably need to make this method recursive.
You can get the identifier from the syntax's Type property:
foreach (var objectCreationExpressionSyntax in objectCreationExpressionSyntaxs)
{
    IdentifierNameSyntax ins = (IdentifierNameSyntax)objectCreationExpressionSyntax.Type;
    var id = ins.Identifier;
    Console.WriteLine(id.ValueText);
}
Strings can be misleading.
Let's say you have the expression new SomeClass(), and you get the string "SomeClass" out of it. How do you know whether that refers to Namespace1.SomeClass or Namespace2.SomeClass? What if there is a using SomeClass = Namespace3.SomeOtherType; alias in effect?
Fortunately, you don't have to do this analysis yourself. The compiler can bind the ObjectCreationExpressionSyntax to a symbol. You have your semantic model, use it.
foreach (var oce in objectCreationExpressionSyntaxs)
{
    ITypeSymbol typeSymbol = model.GetTypeInfo(oce).Type;
    // ...
}
You can compare this symbol with the symbols you get from model.GetDeclaredSymbol(classDeclarationSyntax), just make sure you use the Equals method, not the == operator.
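Tying that back to the loop in the question, the symbol-based version might look like this (a sketch; the output format is illustrative):

```csharp
var classSymbol = model.GetDeclaredSymbol(classDeclarationSyntax);
foreach (var oce in classDeclarationSyntax.DescendantNodes()
                                          .OfType<ObjectCreationExpressionSyntax>())
{
    ITypeSymbol created = model.GetTypeInfo(oce).Type;
    if (created != null && !created.Equals(classSymbol))
    {
        // ToDisplayString() is fully qualified, so aliases and
        // same-named types in different namespaces stay unambiguous.
        Console.WriteLine($"{classSymbol.Name} uses {created.ToDisplayString()}");
    }
}
```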

Obtaining classes from a namespace

I'm trying to get the classes and methods from an assembly. It works when the class lives in the same assembly, but not when it lives in another project. I have already added a reference to the project whose classes and methods I want to obtain, but the variable theList comes back empty. The two projects are in the same solution. I need some help.
class Program
{
    static void Main(string[] args)
    {
        var theList = Assembly.GetExecutingAssembly().GetTypes().Where(t => t.Namespace == "____mvc4.Models").ToList();
        Console.WriteLine("--List of classes with their respective namespace: ");
        foreach (var item in theList)
        {
            Console.WriteLine(item);
        }
        Console.WriteLine("------List of classes: ");
        foreach (var item in theList)
        {
            Console.WriteLine("*****************" + item.Name + "*****************");
            MemberInfo[] memberInfo = item.GetMethods(BindingFlags.Public | BindingFlags.NonPublic | BindingFlags.DeclaredOnly | BindingFlags.Static | BindingFlags.Instance);
            for (int i = 0; i < memberInfo.Length; i++)
            {
                if (!(memberInfo[i].Name.Contains("get_") || memberInfo[i].Name.Contains("set_")))
                {
                    Console.WriteLine("{0}", memberInfo[i].Name);
                }
            }
        }
        Console.ReadLine();
    }
}
The assembly containing the classes I want to obtain does not appear in AppDomain.CurrentDomain.GetAssemblies().ToList();
Here...
var theList = Assembly.GetExecutingAssembly().GetTypes()...etc
you are referring to the current ("executing") assembly. If you want to get types from another assembly, you need to get a reference to that assembly. A simple way to do so is to reference some type from that referenced assembly:
var otherAssembly = typeof(SomeTypeDefinedInAReferencedAssembly).Assembly;
var theList = otherAssembly.GetTypes()...etc
If you want to do it dynamically, you need to get all the assemblies in the current domain or iterate over the /bin/ directory. The domain will give you all kinds of assemblies, including standard ones like System; /bin/ restricts you to just your custom stuff.
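A minimal sketch of the /bin/ approach, assuming the DLLs sit in the application's base directory (the namespace filter is taken from the question):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Reflection;

// Load every DLL next to the running application and collect matching types.
var binDir = AppDomain.CurrentDomain.BaseDirectory;
var modelTypes = Directory.GetFiles(binDir, "*.dll")
    .Select(Assembly.LoadFrom)
    .SelectMany(a =>
    {
        // Some assemblies fail to fully load; keep whatever types resolved.
        try { return a.GetTypes(); }
        catch (ReflectionTypeLoadException ex) { return ex.Types.Where(t => t != null); }
    })
    .Where(t => t.Namespace == "____mvc4.Models")
    .ToList();
```

For a web application the base directory is the site root rather than /bin/, so you may need to point Directory.GetFiles at the bin subfolder explicitly.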
Here's a utility method I use. You pass in the evaluation -- i.e. the filter -- and it spits back a list of Types.
public static List<Type> GetClassesWhere(Func<Type, bool> evaluator)
{
    var assemblies = AppDomain.CurrentDomain.GetAssemblies().ToList();
    var types = new List<Type>();
    foreach (var assembly in assemblies)
    {
        try
        {
            types.AddRange(assembly.GetTypes().Where(evaluator).ToList());
        }
        catch
        {
        }
    }
    return types;
}
I try/catch each assembly individually because I found that sometimes I get some weird permission denied errors, especially in shared environments such as Azure and AppHarbor. It was always on assemblies I didn't care about anyway, so that's why I take no action on catch. For my custom assemblies, it always works for me.
In your example, you'd use it like this (assuming you put it in a static class called Utilities):
var types = Utilities.GetClassesWhere(t => t.Namespace == "____mvc4.Models");
If you're trying to do this as generically as possible, without knowing or caring which assemblies contain this namespace, you need to check the loaded modules:
var theList = new List<Type>();
// Forces all referenced assemblies to be loaded (ASP.NET).
BuildManager.GetReferencedAssemblies();
var modules = Assembly.GetExecutingAssembly().GetLoadedModules();
theList.AddRange(modules.SelectMany(x => x.Assembly.GetTypes().Where(t => t.Namespace == "____mvc4.Models")));
AppDomain.CurrentDomain.GetAssemblies()
    .SelectMany(am => am.GetTypes())
    .Where(t => t.Namespace == "Name of your namespace")
    .Distinct();

LINQ Deferred Execution Subtlety

All, I have recently localised the entire application and I am faced with the following problem; I have the following LINQ query in my application:
var ccsum = from a in dtSumGL2.AsEnumerable()
            group Math.Abs(a.Field<double>(strpcPSPassBegCost)) by new
            {
                BaseCC = a.Field<string>(strpcBaseCC)
            }
            into g
            select new
            {
                g.Key.BaseCC,
                PSPassBegCost = g.Sum()
            };
This creates a new object ccsum, which we use to create a DataTable and subsequently populate a SQL Server database.
The problem is that each of the new items is created in the DataTable with the column names BaseCC and PSPassBegCost, but these names do not match the German versions of those names. Now for my question: is there a way to do something like:
var ccsum = from a in dtSumGL2.AsEnumerable()
            group Math.Abs(a.Field<double>(strpcPSPassBegCost)) by new
            {
                BaseCC = a.Field<string>(strpcBaseCC)
            }
            into g
            select new
            {
                g.Key.BaseCC as Resources.BaseCC,
                PSPassBegCost = g.Sum() as Resources.PSPassBegCost
            };
so that I can name the tables according to their localised names?
Edit. The code that retrieves a DataTable from ccsum is
fooDt = Utils.LINQToDataTable(ccsum);
and
public static DataTable LINQToDataTable<T>(IEnumerable<T> varlist)
{
    DataTable dtReturn = new DataTable();
    PropertyInfo[] oProps = null;

    if (varlist == null)
        return dtReturn;

    foreach (T rec in varlist)
    {
        // Use reflection to get property names and create the
        // table columns; only on the first record.
        if (oProps == null)
        {
            oProps = rec.GetType().GetProperties();
            foreach (PropertyInfo pi in oProps)
            {
                Type colType = pi.PropertyType;
                if (colType.IsGenericType && colType.GetGenericTypeDefinition() == typeof(Nullable<>))
                {
                    colType = colType.GetGenericArguments()[0];
                }
                dtReturn.Columns.Add(new DataColumn(pi.Name, colType));
            }
        }
        DataRow dr = dtReturn.NewRow();
        foreach (PropertyInfo pi in oProps)
        {
            dr[pi.Name] = pi.GetValue(rec, null) ?? DBNull.Value;
        }
        dtReturn.Rows.Add(dr);
    }
    return dtReturn;
}
Thanks for your time.
Another approach might be to rename the 'columns' in the ccsum object as a post-processing step, although the values of ccsum are not populated until they are requested at run-time. Any other ideas?
Create a class or enum which will map column indexes to some readable names:
public static class SumGLColumn
{
    public const int BaseCC = 0;
    public const int PSPassBegCost = 1;
}
And use column indexes instead of column names to query your datatable:
var ccsum = from a in dtSumGL2.AsEnumerable()
            group Math.Abs(a.Field<double>(SumGLColumn.PSPassBegCost))
                by a.Field<string>(SumGLColumn.BaseCC) into g
            select new
            {
                BaseCCg = g.Key,
                PSPassBegCost = g.Sum()
            };
Attempting to localize components of your system for greater human comprehension is a laudable goal, but it will give you challenges in using the vast majority of tools and libraries: database tooling generally expects these things to be constant and ignorant of localization.
If you wish your database tables to be easier to understand, perhaps a more practical solution would be to produce localized views instead? The views could live in a de schema and be one-to-one translations of your tables. This would allow you to leverage a lot of the standard tooling, keeping your system in a consistent "neutral" culture internally (whatever your development culture is) and providing translations over the top of these wherever required.
I think trying to embed this kind of localization into the heart of your system is likely not worth the cost of working around the expectations of most developers and toolsets; you're better off providing a façade.
This is not possible. In the select statement you define an anonymous type. This is not a language feature but a compiler feature, which means the compiler creates a class for this type with the properties you define.
This means that the compiler must know the names at compile time. If you want something more dynamic, I recommend using a dictionary:
select new Dictionary<string, object>
{
    { Resources.BaseCC, g.Key.BaseCC },
    { Resources.PSPassBegCost, g.Sum() }
};
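Note that the reflection-based LINQToDataTable helper from the question won't handle this dictionary shape, so a small companion method would be needed. A sketch, assuming the first row's (localized) keys define the columns:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

public static class DictTableUtils
{
    public static DataTable DictionariesToDataTable(IEnumerable<Dictionary<string, object>> rows)
    {
        var table = new DataTable();
        foreach (var row in rows)
        {
            // Create columns lazily from the first row's keys.
            if (table.Columns.Count == 0)
            {
                foreach (var kvp in row)
                    table.Columns.Add(kvp.Key, kvp.Value?.GetType() ?? typeof(object));
            }
            var dr = table.NewRow();
            foreach (var kvp in row)
                dr[kvp.Key] = kvp.Value ?? DBNull.Value;
            table.Rows.Add(dr);
        }
        return table;
    }
}
```

Usage would then be fooDt = DictTableUtils.DictionariesToDataTable(ccsum); with the dictionary-producing query above.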

EF 4.1 Insert/Update Logic Best Practices

I'm inserting a lot of data, wrapped in a transaction (like 2 million+ rows) at a time, using EF 4.1. Now I'd like to add UPDATE logic. Keep in mind, change-tracking is disabled given the volume of data. Off the top of my head, I'd do something like this:
// Obviously simplified code...
public void AddOrUpdate(Foo foo)
{
    if (!db.Foos.Any(x => someEqualityTest(foo)))
    {
        db.Foos.Add(foo);
    }
    else
    {
        var f = db.Foos.First(x => someEqualityTest(foo));
        f = foo;
    }
    db.SaveChanges();
}
Any ideas on how possibly to improve on this?
I would keep the inserts separate from the updates.
For inserts, I would recommend using SqlBulkCopy to insert all records which don't already exist and it's going to be way faster.
First, the Bulk Insert method in your DbContext:
public class YourDbContext : DbContext
{
    public void BulkInsert<T>(string tableName, IList<T> list)
    {
        using (var bulkCopy = new SqlBulkCopy((SqlConnection)base.Database.Connection))
        {
            bulkCopy.BatchSize = list.Count;
            bulkCopy.DestinationTableName = tableName;

            var table = new DataTable();
            var props = TypeDescriptor.GetProperties(typeof(T))
                // Dirty hack to make sure we only have system
                // data types (i.e. filter out the
                // relationships/collections)
                .Cast<PropertyDescriptor>()
                .Where(p => "System" == p.PropertyType.Namespace)
                .ToArray();

            foreach (var prop in props)
            {
                bulkCopy.ColumnMappings.Add(prop.Name, prop.Name);
                var type = Nullable.GetUnderlyingType(prop.PropertyType) ?? prop.PropertyType;
                table.Columns.Add(prop.Name, type);
            }

            var values = new object[props.Length];
            foreach (var item in list)
            {
                for (var i = 0; i < values.Length; i++)
                {
                    values[i] = props[i].GetValue(item);
                }
                table.Rows.Add(values);
            }
            bulkCopy.WriteToServer(table);
        }
    }
}
Then, for your insert/update:
public void AddOrUpdate(IList<Foo> foos)
{
    var foosToUpdate = db.Foos.Where(x => foos.Contains(x)).ToList();
    var foosToInsert = foos.Except(foosToUpdate).ToList();

    foreach (var foo in foosToUpdate)
    {
        var f = db.Foos.First(x => someEqualityTest(x));
        // update the existing foo `f` with values from `foo`
    }

    // Insert the new Foos into the table named "Foos"
    db.BulkInsert("Foos", foosToInsert);
    db.SaveChanges();
}
Your update...
var f = db.Foos.First(x => someEqualityTest(foo));
f = foo;
...won't work because you are not changing the loaded and attached object f at all; you just overwrite the variable f with the detached object foo. The attached object is still in the context, but it has not been changed since it was loaded, and you no longer have a variable pointing to it. SaveChanges will do nothing in this case.
The "standard options" you have are:
var f = db.Foos.First(x => someEqualityTest(foo));
db.Entry(f).State = EntityState.Modified;
or just
db.Entry(foo).State = EntityState.Modified;
// attaches as Modified, no need to load f
This marks ALL properties as modified - no matter if they really changed or not - and will send an UPDATE for each column to the database.
The second option will only mark the really changed properties as modified and only send an UPDATE for the changed columns:
var f = db.Foos.First(x => someEqualityTest(foo));
db.Entry(f).CurrentValues.SetValues(foo);
Now, with 2 million objects to update you don't have a "standard" situation and it is possible that both options - especially the second which likely uses reflection internally to match property names of source and target object - are too slow.
The best option when it comes to performance of updates are Change Tracking Proxies. This would mean that you need to mark EVERY property in your entity class as virtual (not only the navigation properties, but also the scalar properties) and that you don't disable creation of change tracking proxies (it is enabled by default).
When you load your object f from the database, EF will then create a dynamic proxy object (derived from your entity), similar to lazy loading proxies, which has code injected into every property setter to maintain a flag indicating whether the property has been changed.
The change tracking provided by proxies is much faster than the snapshot based change tracking (which happens in SaveChanges or DetectChanges).
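For the proxies to be created at all, the entity class must be public and non-sealed, with every mapped property virtual. A sketch of what Foo would have to look like (property names here are assumptions, not from the question):

```csharp
using System.Collections.Generic;

public class Foo
{
    public virtual int Id { get; set; }

    // Scalar properties must be virtual too, not just navigation properties;
    // EF only creates a change tracking proxy when EVERY mapped property is virtual.
    public virtual string Name { get; set; }
    public virtual double Amount { get; set; }

    // Collection navigation properties must be declared as ICollection<T>.
    public virtual ICollection<Bar> Bars { get; set; }
}

public class Bar
{
    public virtual int Id { get; set; }
}
```

If any single property is left non-virtual, EF silently falls back to snapshot change tracking for that entity, so it is worth verifying at runtime that db.Foos.First() actually returns a proxy type.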
I am not sure though if the two options mentioned above are faster if you use change tracking proxies. It is possible that you need manual property assignments to get the best performance:
var f = db.Foos.First(x => someEqualityTest(foo));
f.Property1 = foo.Property1;
f.Property2 = foo.Property2;
// ...
f.PropertyN = foo.PropertyN;
In my experience, in a similar situation updating a few thousand objects, there is no real alternative to change tracking proxies as far as performance goes.
