Generate multiple attributes? - C#

At the moment I can't quite imagine how this will work, but I'm sure it can be done. I notice a pattern in my attribute usage where I always use 3 specific attributes together. Take the below as an example:
[MyAttr(4, #"a"),
MyAttr(41, "b"),
MyAttr(45, "ab")]
Mine is much more complicated, but I would like to define one attribute with more params that generates the data above. How might I do that? Let's say my one attribute will look like this:
MyAttr2(4, 41, "a", "b"); //4+41=45, "a"+"b" = "ab"
How might I generate the 3 MyAttr instances to apply to a class using MyAttr2?

The C# compiler can't convert a single attribute entry into multiple ones in the assembly metadata. However, you could model your attribute in such a way that it exposes the additional attribute information as properties (or a collection). The child information will not be accessible directly via reflection as independent attributes, though.
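For instance, a minimal sketch of that idea, keeping the question's hypothetical MyAttr2 shape (the derived properties are assumptions about what the combined data should look like):

using System;

[AttributeUsage(AttributeTargets.Class)]
public sealed class MyAttr2Attribute : Attribute
{
    public MyAttr2Attribute(int first, int second, string a, string b)
    {
        First = first;
        Second = second;
        A = a;
        B = b;
    }

    public int First { get; }
    public int Second { get; }
    public string A { get; }
    public string B { get; }

    // Derived values that would otherwise have been the third MyAttr entry.
    public int Sum => First + Second;     // 4 + 41 = 45
    public string Combined => A + B;      // "a" + "b" = "ab"
}

[MyAttr2(4, 41, "a", "b")]
public class Example { }

Calling code can then read it back with Attribute.GetCustomAttribute(typeof(Example), typeof(MyAttr2Attribute)) and consume First, Second, Sum, and Combined, rather than looking for three separate MyAttr entries.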

Related

C# attributes used for information or to import extern functions?

I'm a bit confused about C#'s use of attributes. At first I thought attributes were simply used to give code additional information, for example through the [Obsolete] attribute. Now I find that [DllImport] can be used to import a dynamic-link library and its functions. Can attributes import .exe files and other kinds of files?
And a last question: for programmers working in C# every day, how much do you use attributes, and do you use them for anything other than adding information and importing DLLs?
Simply put, attributes are, at their core, just metadata attached to classes or methods.
The compiler, however, reads through your code and runs specific, hardcoded actions for certain attributes it encounters along the way. E.g., when it finds a DllImportAttribute on a method, it will resolve it to an external symbol (again, this is a very simplified explanation).
When it finds an ObsoleteAttribute, it emits a warning of deprecation.
Your own attributes (which you can create with a class inheriting from the Attribute base class) will not have an effect on the default compiler. But you (or other libraries) can also scan for them at runtime, opening up many possibilities and leading to your second question:
I typically use them to do meta programming. For example, imagine a custom network server handling packets of a specific format, implemented in different classes. Each packet format is recognized by reading an integer value. Now I need to find the correct class to instantiate for that integer.
I could do that with a switch..case or dictionary mapping integer -> packet which I extend every time I add a packet, but that is ugly since I have to touch code possibly far away from the actual Packet class whenever I add or delete a packet. I may not even know about the switch or dictionary in case the server is implemented in another assembly than my packets (modularity / extensibility)!
Instead, I create a custom PacketAttribute, storing an integer property set via the attribute, and decorate all my Packet classes with it. The server only has to scan through my assembly types at startup (via reflection) and build a dictionary of integer -> packet pairs automatically. Of course I could scan my assembly every time I need a packet, but that's probably a bit slow performance-wise.
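A minimal sketch of that pattern, assuming hypothetical PacketAttribute and Packet types (these names are illustrative, not from a real library):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

[AttributeUsage(AttributeTargets.Class)]
public sealed class PacketAttribute : Attribute
{
    public PacketAttribute(int id) => Id = id;
    public int Id { get; }
}

public abstract class Packet { }

[Packet(1)] public sealed class LoginPacket : Packet { }
[Packet(2)] public sealed class ChatPacket : Packet { }

public static class PacketRegistry
{
    // Scan an assembly once at startup and map packet id -> concrete packet type.
    public static Dictionary<int, Type> Build(Assembly assembly) =>
        assembly.GetTypes()
            .Where(t => !t.IsAbstract && typeof(Packet).IsAssignableFrom(t))
            .Select(t => new { Type = t, Attr = t.GetCustomAttribute<PacketAttribute>() })
            .Where(x => x.Attr != null)
            .ToDictionary(x => x.Attr.Id, x => x.Type);
}

// Usage: var map = PacketRegistry.Build(typeof(Packet).Assembly);
//        var packet = (Packet)Activator.CreateInstance(map[packetId]);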
There are APIs which are much more attribute heavy, like controllers in ASP.NET Core: You map full request URLs to methods in handler classes with them, which then execute the server code. Even URL parameters are mapped to parameters in that way.
Debuggers can also make use of attributes. For example, decorating a class with the DebuggerDisplayAttribute lets you provide a custom string displayed for the instances of the class when inspecting them in Visual Studio, which has a specific format and can directly show the values of important members.
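For instance, a tiny example of that attribute (the class and its members are arbitrary):

using System.Diagnostics;

// Instances show as "Ted Danson, Age = 42" in the debugger instead of the type name.
[DebuggerDisplay("{Name}, Age = {Age}")]
public class Person
{
    public string Name { get; set; }
    public int Age { get; set; }
}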
As you can see, attributes can be very powerful when used well.
To answer the second part of your question: attributes are also used, for example, to define validation and display rules for both client- and server-side use in a web application. For example:
[Display(Name = "Person's age")]
[Required(ErrorMessage = "Persons's age is required")]
[RangeCheck(13, 59, ErrorMessage = "The age must be between 13 and 59")]
public int? PersonsAgeAtBooking { get; set; }
Or to decorate enums for use in display:
public enum YesNoOnlyEnum
{
    [Description("Yes")]
    Yes = 1,

    [Description("No")]
    No = 2
}
There are many other uses.

nunit - set Order attribute from custom attribute of Test method

Let's say we have a custom attribute:
[Precondition(1, "Some precondition")]
This would implement [Test, Order(1), Description("Some precondition")]
Can I access and modify the Order attribute (or create one) for this method?
I can modify the Description and Author, but Order is not a possibility.
I have tried
1: context.Test.Properties["Order"][0] = order;
2: method.CustomAttributes.GetEnumerator()
by walking the stack frames with
Object[] attributes = method.GetCustomAttributes(typeof(PreconditionAttribute), false);
if (attributes.Length >= 1){...}
3:
OrderAttribute orderAttribute = (OrderAttribute)Attribute.GetCustomAttribute(i, typeof(OrderAttribute));
orderAttribute.Order = _order;
Which is readonly.
If I try orderAttribute.Order = new OrderAttribute(myOrd), it doesn't do anything.
I have two answers to choose from. One is in the vein of "Don't do this" and the other is about how to do it. Just for fun, I'm putting both answers up, separately, so they can compete with one another. This one is about why I don't think this is a good idea.
It's easy enough to write either
[Test, Order(1), Description("xxx")] or the equivalent...
[Test(Description="xxx"), Order(1)]
The proposed attribute gives users a second way to specify order, making it possible to assign two different orders to a test. Which of the two attributes wins depends on (1) how each one is implemented, (2) the order in which the attributes are listed and (3) the platform on which you are running. For all practical purposes, it's non-deterministic.
Keeping the two things separate allows devs to decide which they need independently... which is why NUnit keeps them separate.
Using the standard attributes means that the devs can rely on the nunit documentation to tell them what the attributes do. If you implement your own attribute, you should document what it does in itself as well as what it does in the presence of the standard attributes... As stated above, that's difficult to predict.
I know this isn't a real answer in SO terms, but it's not pure opinion either. There are real technical issues in providing the kind of solution you want. I'd love to see what people think of it in comparison with "how to" I'm going to post next.
See my prior answer first! If you really want to do this, here's the how-to...
In order to combine the action of two existing attributes, you need equivalent code to those two attributes.
In this case both are extremely simple and both have about the same amount of code. DescriptionAttribute is based on PropertyAttribute so some of its code is hidden. OrderAttribute has a bit more logic because it checks to make sure the order has not already been set. Ultimately, both of them have code that implements the IApplyToTest interface.
Because they are both simple, I would copy the code, in order to avoid relying on implementation details that could change. Start with the slightly more complete OrderAttribute. Change its name. Modify the ApplyToTest method to set the description. You're done!
It will look something like this, depending on the names you use for properties...
public void ApplyToTest(Test test)
{
    if (!test.Properties.ContainsKey(PropertyNames.Order))
        test.Properties.Set(PropertyNames.Order, Order);

    test.Properties.Set(PropertyNames.Description, Description);
}
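Put together, the whole attribute might look roughly like this. This is a sketch that mirrors the shape of NUnit's own attributes rather than copying them verbatim; the class name and constructor come from the question, and [Test] still has to be applied alongside it:

using System;
using NUnit.Framework.Interfaces;
using NUnit.Framework.Internal;

[AttributeUsage(AttributeTargets.Method, AllowMultiple = false, Inherited = false)]
public class PreconditionAttribute : Attribute, IApplyToTest
{
    public PreconditionAttribute(int order, string description)
    {
        Order = order;
        Description = description;
    }

    public int Order { get; }
    public string Description { get; }

    // NUnit calls this for every attribute on the test method that implements IApplyToTest.
    public void ApplyToTest(Test test)
    {
        if (!test.Properties.ContainsKey(PropertyNames.Order))
            test.Properties.Set(PropertyNames.Order, Order);

        test.Properties.Set(PropertyNames.Description, Description);
    }
}

// Usage: [Test, Precondition(1, "Some precondition")]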
A comment on what you tried...
There is no reason to think that creating an attribute instance in your code will do anything. NUnit has no way to know about those attributes. Your attribute cannot modify the code so that the test magically has other attributes. The only way attributes communicate with NUnit is by having their interfaces (like IApplyToTest) called, and only attributes actually present in the code will receive such a call.

C# custom file parsing with 2 delimiters and different record types

I have a (not quite valid) CSV file that contains rows of multiple types. Any record could be one of about 6 different types and each type has a different number of properties. The first part of any row contains the timestamp and the type of record, followed by a standard CSV of the data.
Example
1456057920 PERSON, Ted Danson, 123 Fake Street, 555-123-3214, blah
1476195120 PLACE, Detroit, Michigan, 12345
1440581532 THING, Bucket, Has holes, Not a good bucket
And to make matters more complex, I need to be able to do different things with the records depending on certain criteria. So a PERSON type can be automatically inserted into a DB without user input, but a THING type would be displayed on screen for the user to review and approve before adding to DB and continuing the parse, etc.
Normally, I would use a library like CsvHelper to map the records to a type, but in this case, since the types can differ and the first part uses a space instead of a comma, I don't know how to do that with a standard CSV library. So currently, what I am doing in each loop is:
String split based off comma.
Split the first array item by the space.
Use a switch statement to determine the type and create the object.
Put that object into a List of type object.
Get confused as to where to go next, because I now have a list of various types and will have to use yet another switch or if to determine the next parts.
I don't really know for sure if I will actually need that List but I have a feeling the user will want the ability to manually flip through records in the file.
By this point, this is starting to make for very long, confusing code, and my gut feeling tells me there has to be a cleaner way to do this. I thought maybe using Type.GetType(string) would help simplify the code some, but this seems like it might be terribly inefficient in a loop with 10k+ records and might make things even more confusing. I then thought maybe making some interfaces might help, but I'm not the greatest at using interfaces in this context and I seem to end up in about this same situation.
So what would be a more manageable way to parse this file? Are there any C# parsing libraries out there that would be able to handle something like this?
You can implement an IRecord interface that has a Timestamp property and a Process method (perhaps others as well); a sketch of this shape appears after the examples below.
Then, implement concrete types for each type of record.
Use a switch statement to determine the type and create and populate the correct concrete type.
Place each object in a List<IRecord>.
After that you can do whatever you need. Some examples:
Loop through each item and call Process() to handle it.
Use LINQ's .OfType<{concrete type}>() to segment the list. (Warning: with 10k records this would be slow, since it traverses the entire list for each concrete type.)
Use an overridden ToString method to give a single text representation of the IRecord
If using WPF, you can define a DataTemplate for each concrete type, bind an ItemsControl derivative to a collection of IRecords, and your "detail" display (e.g. a ListItem or a separate ContentControl) will automagically display each item using the correct DataTemplate.
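A minimal sketch of that shape (the IRecord members and the concrete class names are assumptions for illustration):

using System;
using System.Collections.Generic;

public interface IRecord
{
    DateTimeOffset Timestamp { get; }
    void Process();                     // e.g. insert into the DB, or queue for user review
}

public sealed class PersonRecord : IRecord
{
    public DateTimeOffset Timestamp { get; set; }
    public string Name { get; set; }
    public string Address { get; set; }
    public void Process() { /* insert directly into the DB */ }
}

public sealed class ThingRecord : IRecord
{
    public DateTimeOffset Timestamp { get; set; }
    public string Name { get; set; }
    public string Notes { get; set; }
    public void Process() { /* show to the user for approval first */ }
}

public static class RecordParser
{
    public static IRecord Parse(string line)
    {
        string[] fields = line.Split(',');                    // the CSV part
        string[] head = fields[0].Split(new[] { ' ' }, 2);    // "timestamp TYPE"
        var timestamp = DateTimeOffset.FromUnixTimeSeconds(long.Parse(head[0]));

        switch (head[1].Trim())
        {
            case "PERSON":
                return new PersonRecord { Timestamp = timestamp, Name = fields[1].Trim(), Address = fields[2].Trim() };
            case "THING":
                return new ThingRecord { Timestamp = timestamp, Name = fields[1].Trim(), Notes = fields[2].Trim() };
            default:
                throw new FormatException("Unknown record type: " + head[1]);
        }
    }
}

From there, something like File.ReadLines(path).Select(RecordParser.Parse).ToList() gives you the List<IRecord> that the later steps can work against.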
Continuing from my comment: well, that depends. What you described is actually pretty good for starters. You can of course expand it into a series of factories, one for each object type, so that you move from an explicit switch to searching for the first factory that can parse a line (a sketch follows below). That might prove useful if you are looking to add more object types in the future: you just add another factory for the new kind of object. It's up to you whether these objects should share a common interface; an interface is generally used to define behavior, so it doesn't seem necessary here. You need to ask yourself whether you actually need strongly typed objects at all. Maybe what you need is a simple class with an ObjectType property and a Dictionary of properties, with some helper methods for easy typed property access like GetBool, GetInt, or a generic Get.
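For what it's worth, a rough sketch of those two ideas combined, a factory per record type returning a simple property-bag record (all names here are made up for illustration):

using System;
using System.Collections.Generic;
using System.Linq;

// One simple record shape instead of a class per type.
public sealed class GenericRecord
{
    public string ObjectType { get; set; }
    public Dictionary<string, string> Properties { get; } = new Dictionary<string, string>();

    public int GetInt(string key) => int.Parse(Properties[key]);
    public bool GetBool(string key) => bool.Parse(Properties[key]);
    public T Get<T>(string key) => (T)Convert.ChangeType(Properties[key], typeof(T));
}

public interface IRecordFactory
{
    bool CanParse(string line);          // typically checks the type token after the timestamp
    GenericRecord Parse(string line);
}

public sealed class PersonFactory : IRecordFactory
{
    public bool CanParse(string line) => line.Contains(" PERSON,");

    public GenericRecord Parse(string line)
    {
        string[] fields = line.Split(',');
        var record = new GenericRecord { ObjectType = "PERSON" };
        record.Properties["Name"] = fields[1].Trim();
        record.Properties["Address"] = fields[2].Trim();
        return record;
    }
}

public static class LineDispatcher
{
    // Adding a new record type means adding one new factory here; nothing else changes.
    private static readonly List<IRecordFactory> Factories = new List<IRecordFactory> { new PersonFactory() };

    public static GenericRecord Parse(string line) =>
        Factories.First(f => f.CanParse(line)).Parse(line);
}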

View model properties have changing validation rules at run time

I'm new to C# MVC and I'm trying to add some dynamic validation checks to my view models that are used in a form. For example, I have a string property called FirstName. I can add the [StringLength(10)] and [Required] attributes to it.
My problem is, depending on some other field, the FirstName StringLength could vary from 10 to 20, etc. I still want to use the MVC validations but be able to modify it. I know that attributes are bound to the class so maybe I'm using the wrong thing.
I want the abilities for attribute validation but have it modifiable at run time. Is this possible?
The values in an attribute have to be compile-time constants. You can still use attribute-based validation, but you will need to use the CustomValidation attribute and point it at a validation method. If the rule depends on multiple fields of the object, you will want to put the attribute on the class rather than on the property.
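For illustration, a sketch using CustomValidationAttribute at the class level, where the allowed length depends on another property (the model, property, and method names are made up):

using System.ComponentModel.DataAnnotations;

// Class-level rule, because it reads more than one property.
[CustomValidation(typeof(BookingViewModel), nameof(BookingViewModel.ValidateFirstNameLength))]
public class BookingViewModel
{
    public bool IsChildBooking { get; set; }

    [Required]
    public string FirstName { get; set; }

    public static ValidationResult ValidateFirstNameLength(BookingViewModel model, ValidationContext context)
    {
        int maxLength = model.IsChildBooking ? 10 : 20;   // the "other field" driving the rule

        return model.FirstName != null && model.FirstName.Length > maxLength
            ? new ValidationResult("First name must be at most " + maxLength + " characters.")
            : ValidationResult.Success;
    }
}

Note that this runs server-side; unlike the built-in attributes, it won't be picked up by client-side validation automatically.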
It seems you can add validation attributes at runtime by implementing DataAnnotationsModelValidatorProvider:
Dynamic Attributes # forums.asp.net

Linq to DataTable without enumerating fields

I'm trying to query a DataTable object without specifying the fields, like this:
var linqdata = from ItemA in ItemData.AsEnumerable()
select ItemA
but the return type is
System.Data.EnumerableRowCollection<System.Data.DataRow>
and I need the following return type
System.Data.EnumerableRowCollection<<object,object>>
(like the standard anonymous type)
Any idea?
Thanks
If I understand you correctly, you'd like to get a collection of objects that you don't need to define in your code but that are usable in a strongly typed fashion. Sadly, no, you can't.
An anonymous type seems like some kind of variant or dynamic object, but it is in fact a strongly typed class that is defined at compile time. .NET defines the type for you automatically behind the scenes. In order for .NET to be able to do this, it has to have some clue in the code from which to infer the type definition. It has to have something like:
from ItemA in ItemData.AsEnumerable()
select new { Name = ItemA.Field<string>("Name"), Email = ItemA.Field<string>("Email") }
so it knows what members to define. There's no way to get around it; the information logically has to be there for the anonymous type to be defined.
Depending on why exactly you are trying to do this, there are some options.
If you want IntelliSense while still encapsulating your data access, you can return XML instead of a DataTable from your encapsulated data access class. (You can convert data tables to XML very easily. You'll want to use the new System.Xml.Linq classes like XElement. They're great!) Then you can use VS2008's ability to create an XSD schema from the XML. Then use/import that schema at the top of your code page, and you have IntelliSense.
If you have to have an object with properties for your data, but don't want to define a class/structure for them, you'll love the new dynamic objects coming in C# 4.0/VB 10. You get object properties based on what the SQL returns, but you won't have IntelliSense. There is also a performance cost to this, but (a) that might not matter for your situation and (b) it actually is not so bad in some situations.
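Once dynamic is available, that option looks roughly like the sketch below, using ExpandoObject; the extension-method name is made up, and DataTable.AsEnumerable() needs a reference to System.Data.DataSetExtensions:

using System;
using System.Collections.Generic;
using System.Data;
using System.Dynamic;
using System.Linq;

public static class DataTableDynamicExtensions
{
    // Projects each DataRow into a dynamic object whose properties match the column names.
    public static IEnumerable<dynamic> AsDynamic(this DataTable table) =>
        table.AsEnumerable().Select(row =>
        {
            IDictionary<string, object> bag = new ExpandoObject();
            foreach (DataColumn column in table.Columns)
                bag[column.ColumnName] = row[column];
            return (dynamic)bag;
        });
}

// Usage (property access is resolved at run time, so no IntelliSense on Name):
// foreach (dynamic item in ItemData.AsDynamic())
//     Console.WriteLine(item.Name);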
If you're just trying to avoid writing a lot of classes, consider defining structs/structures in the same code file, beneath your class definition. When you add more columns to your result set, it's easy to adjust a struct with more public fields.
In short, you can have any two of the following three: (a) dynamic, (b) strongly typed objects, (c) IntelliSense. But not all three.
There is one way to accomplish what you want, but it requires knowledge of Dynamic LINQ. You would build the query at run time and then use it. I am no expert and have never really played around with it, but here is a link to Scott Guthrie's blog about it - Dynamic Linq. Hope that helps.
Wade
