Can I parameterize a generic type - C#

We have several reports in fixed-length record format, and we are using FileHelpers to transform them to delimited record format.
As there are several reports, we thought of describing the fixed and delimited models and passing them to
FileTransformEngine<TSource, TDestination>();
To keep it DRY, can we do something like
Transform<TInput, TOutput>()
{
    var engine = new FileTransformEngine<TInput, TOutput>();
    engine.TransformFileFast(...);
}
I'm new to C# and generics and I'm not sure where to start.
Can anyone give me some guidance on whether this is possible? All the models are already created, so they are available at compile time, I think.

The FileHelpers documentation shows how to read a fixed-length file and write a delimited file, so it should be pretty easy to wrap calls to both in your own FileTransformEngine:
public class FileTransformEngine<T>
{
    public void TransformFileFast(string inputFile, string outputFile)
    {
        var readEngine = new FixedFileEngine<T>();
        T[] records = readEngine.ReadFile(inputFile);

        var writeEngine = new FileHelperEngine<T>();
        writeEngine.WriteFile(outputFile, records);
    }
}
For this to work, your class will need the attributes for both reading and writing, e.g.
[FixedLengthRecord()]
[DelimitedRecord("|")]
public class MyRecord
{
    [FieldFixedLength(5)] // for reading
    [FieldConverter(ConverterKind.Decimal, ".")] // for writing
    public decimal Foo { get; set; }
}
Then you would use
var engine = new FileTransformEngine<MyRecord>();
engine.TransformFileFast("from/input.txt","to/output.csv");
If you want different types for input and output you could have the class take two types (an input and an output) along with a predicate for transforming one to the other.
public class FileTransformEngine<TInput, TOutput>
{
    public void TransformFileFast(string inputFile, string outputFile,
        Func<TInput, TOutput> transformer)
    {
        var readEngine = new FixedFileEngine<TInput>();
        TInput[] records = readEngine.ReadFile(inputFile);

        IEnumerable<TOutput> outputRecords = records.Select(transformer);

        var writeEngine = new FileHelperEngine<TOutput>();
        writeEngine.WriteFile(outputFile, outputRecords);
    }
}
This would then be called with your two types
var engine = new FileTransformEngine<MyInputRecord, MyOutputRecord>();
engine.TransformFileFast("from/input.txt", "to/output.csv", input => {
    // transform input to output.
    return new MyOutputRecord();
});
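To get back to the question's DRY goal, a thin generic helper can then wrap the two-type engine above so each report only supplies its models and a mapping function. A minimal sketch (the helper class and method names are illustrative, not part of the answer):

public static class ReportTransformer
{
    // Illustrative wrapper around the FileTransformEngine<TInput, TOutput> shown above.
    public static void Transform<TInput, TOutput>(
        string inputFile, string outputFile, Func<TInput, TOutput> map)
    {
        var engine = new FileTransformEngine<TInput, TOutput>();
        engine.TransformFileFast(inputFile, outputFile, map);
    }
}

// Usage, with the caller's FileHelpers models:
// ReportTransformer.Transform<MyInputRecord, MyOutputRecord>(
//     "from/input.txt", "to/output.csv", i => new MyOutputRecord());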

Related

Looping header and details records in same file

I have re-edited this question below. I have an example file which has multiple purchase orders in it, identified by the second column.
Order Number, Purchase Number,DATE,Item Code ,Qty, Description
1245456,98978,12/01/2019, 1545-878, 1,"Test"
1245456,98978,12/01/2019,1545-342,2,"Test"
1245456,98978,12/01/2019,1545-878,2,"Test"
1245456,98979,12/02/2019,1545-878,3,"Test 3"
1245456,98979,12/02/2019,1545-342,4,"Test 4"
1245456,98979,12/02/2019,1545-878,5,"Test 4"
What I want the end result to be is to place the above into one class like the following.
At the minute I am using FileHelpers to parse the CSV file. This would work fine if I had a separate header file and a separate rows file, but they are combined as you can see:
var engine = new FileHelperEngine<CSVLines>();
var lines = engine.ReadFile(csvFileName);
So the class should be like below:
[DelimitedRecord(",")]
public class SalesOrderHeader
{
    private Guid? _guid;

    public Guid RowID
    {
        get
        {
            return _guid ?? (_guid = Guid.NewGuid()).GetValueOrDefault();
        }
    }

    public string DocReference { get; set; }
    public string CardCode { get; set; }
    public string DocDate { get; set; }
    public string ItemCode { get; set; }
    public string Description { get; set; }
    public string Qty { get; set; }
    public string Price { get; set; }

    [FieldHidden]
    public List<SalesOrderHeader> OrdersLines { get; set; }
}
What I imagine I will have to do is two loops. As you will see from my CreateSalesOrder routine, I first create the header and then add the lines in.
public void CreateSalesOrder(List<SalesOrderHeader> _salesOrders)
{
    foreach (var record in _salesOrders.GroupBy(g => g.DocReference))
    {
        // Init the Order object
        oOrder = (SAPbobsCOM.Documents)company.GetBusinessObject(SAPbobsCOM.BoObjectTypes.oOrders);
        SAPbobsCOM.SBObob oBob;

        // set properties of the Order object
        // oOrder.NumAtCard = record.Where(w=>w.RowID = record.Where()
        oOrder.CardCode = record.First().CardCode;
        oOrder.DocDueDate = DateTime.Now;
        oOrder.DocDate = Convert.ToDateTime(record.First().DocDate);

        foreach (var recordItems in _salesOrders.SelectMany(e => e.OrdersLines).Where(w => w.DocReference == record.First().DocReference))
        {
            oOrder.Lines.ItemCode = recordItems.ItemCode;
            oOrder.Lines.ItemDescription = recordItems.Description;
            oOrder.Lines.Quantity = Convert.ToDouble(recordItems.Qty);
            oOrder.Lines.Price = Convert.ToDouble(recordItems.Price);
            oOrder.Lines.Add();
            log.Debug(string.Format("Order Line added to sap Item Code={0}, Description={1},Qty={2}", recordItems.ItemCode, recordItems.Description, recordItems.Qty));
        }

        int lRetCode = oOrder.Add(); // Try to add the order to the database
    }

    if (lRetCode == 0)
    {
        string body = "Purchase Order Imported into SAP";
    }

    if (lRetCode != 0)
    {
        int temp_int = lErrCode;
        string temp_string = sErrMsg;
        company.GetLastError(out temp_int, out temp_string);
        if (lErrCode != -4006) // In case adding an order failed
        {
            log.Error(string.Format("Error adding an order into sap ErrorCode {0},{1}", temp_int, temp_string));
        }
    }
}
The problem, as you will see, is first how do I split the CSV into the two lists, and second how do I access the header rows correctly in the strongly typed object; as you can see I am using First(), which will not work correctly.
With FileHelpers it is important to avoid using the mapping class for anything other than describing the underlying file structure. Here I suspect you are trying to map directly to a class which is too complex.
A FileHelpers class is just a way of defining the specification of a flat file using C# syntax.
As such, the FileHelpers classes are an unusual type of C# class and you should not try to apply accepted OOP principles to them. FileHelpers classes should not have properties or methods beyond the ones used by the FileHelpers library.
Think of the FileHelpers class as the 'specification' of your CSV format only. That should be its only role. (This is good practice from a maintenance perspective anyway - if the underlying CSV structure were to change, it is easier to adapt your code).
If you need the records in a more 'normal' object, map the results to something better, that is, a class that encapsulates all the functionality of the Order object rather than the CSVOrder.
So, one way of handling this type of file is to parse the file twice. In the first pass you extract the header records. Something like this:
var engine1 = new FileHelperEngine<CSVHeaders>();
var headers = engine1.ReadFile(csvFileName);
In the second pass you extract the details:
var engine2 = new FileHelperEngine<CSVDetails>();
var details = engine2.ReadFile(csvFileName);
Then you combine this information into a new dedicated class, maybe with some LINQ similar to this
var niceOrders =
    headers
        .DistinctBy(h => h.OrderNumber)
        .SelectMany(h => details
            .Where(d => d.OrderNumber == h.OrderNumber)
            .Select(d =>
                new NiceOrder()
                {
                    OrderNumber = h.OrderNumber,
                    Customer = h.Customer,
                    ItemCode = d.ItemCode
                    // etc.
                }));
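Note that DistinctBy is not in classic LINQ; it comes from MoreLINQ or from .NET 6+ LINQ. The CSVHeaders, CSVDetails and NiceOrder types are not defined in the answer; a minimal sketch of what they might look like for the sample file above (all names and field choices are illustrative, and Customer mirrors the answer's LINQ rather than the sample columns):

// Illustrative only. Because every physical line contains all six columns,
// both mapping classes describe the full line; they differ only in which
// fields you read from them afterwards.
[DelimitedRecord(",")]
[IgnoreFirst(1)]
public class CSVHeaders
{
    public string OrderNumber;
    public string PurchaseNumber;
    public string Date;
    public string ItemCode;
    public string Qty;
    [FieldQuoted]
    public string Description;
}

[DelimitedRecord(",")]
[IgnoreFirst(1)]
public class CSVDetails
{
    public string OrderNumber;
    public string PurchaseNumber;
    public string Date;
    public string ItemCode;
    public string Qty;
    [FieldQuoted]
    public string Description;
}

// Plain class for the combined result; deliberately free of FileHelpers attributes.
public class NiceOrder
{
    public string OrderNumber { get; set; }
    public string Customer { get; set; }
    public string ItemCode { get; set; }
    public string Qty { get; set; }
}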

MongoDB Linq OfType() on fields

This is my MongoDB document structure:
{
string _id;
ObservableCollection<DataElement> PartData;
ObservableCollection<DataElement> SensorData;
...
other ObservableCollection<DataElement> fields
...
other types and fields
...
}
Is there any possibility to retrieve a concatenation of fields with the type ObservableCollection<DataElement>? Using LINQ I would do something like
var query = dbCollection
    .AsQueryable()
    .Select(x => new {
        data = x
            .OfType(typeof(ObservableCollection<DataElement>))
            .SelectMany(x => x)
            .ToList()
    });
or alternatively
data = x.Where(y => typeof(y) == typeof(ObservableCollection<DataElement>))
        .SelectMany(x => x).ToList()
Unfortunately .Where() and .OfType() do not work on documents, only on queryables/lists, so is there another possibility to achieve this? The document structure must stay the same.
Edit:
After dnickless' answer I tried method 1b), which works pretty well for getting the fields the way they are in the collection. Thank you!
Unfortunately it wasn't precisely what I was looking for, as I wanted all those fields of that specific type put together in one list, as would be returned by the OfType or Where(typeof) statement.
E.g. data = [x.PartData, x.SensorData, ...] with data being an ObservableCollection<DataElement>[], so that I can use SelectMany() on it to finally get the concatenation of all the sequences.
Sorry for asking the question imprecisely and not including the last step of doing a SelectMany()/Concat().
Finally I found a solution doing this, but it doesn't seem very elegant to me, as it needs one Concat() for every field (and I have more of them) and it needs to create a new collection when it finds a non-existing field:
query.Select(x => new
    {
        part = x.PartData ?? new ObservableCollection<DataElement>(),
        sensor = x.SensorData ?? new ObservableCollection<DataElement>(),
    })
    .Select(x => new
    {
        dataElements = x.part.Concat(x.sensor)
    })
    .ToList()
In order to limit the fields returned you would need to use the MongoDB Projection feature in one way or the other.
There are a few alternatives I can think of, depending on your specific requirements:
Option 1a (fairly static approach): Create a custom type with only the fields that you are interested in if you know them upfront. Something like this:
public class OnlyWhatWeAreInterestedIn
{
    public ObservableCollection<DataElement> PartData { get; set; }
    public ObservableCollection<DataElement> SensorData { get; set; }
    // ...
}
Then you can query your collection like this:
var collection = new MongoClient().GetDatabase("test").GetCollection<OnlyWhatWeAreInterestedIn>("test");
var result = collection.Find(FilterDefinition<OnlyWhatWeAreInterestedIn>.Empty);
Using this approach you get a nicely typed result back without the need for custom projections.
Option 1b (still pretty static): A minor variation of Option 1a, just without a new explicit type, using a projection stage instead to limit the returned fields. Kind of like this:
var collection = new MongoClient().GetDatabase("test").GetCollection<Test>("test");
var result = collection.Find(FilterDefinition<Test>.Empty).Project(t => new { t.PartData, t.SensorData }).ToList();
Again, you get a nicely typed C# entity back that you can continue to operate on.
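If the end goal from the question's edit is one flat sequence of DataElement, the projected result can then be flattened client side. A minimal sketch, assuming the result variable from Option 1b above:

// Flatten the projected collections into a single list of DataElement.
// The null coalescing guards against documents where a field is missing.
var allElements = result
    .SelectMany(r => (r.PartData ?? Enumerable.Empty<DataElement>())
        .Concat(r.SensorData ?? Enumerable.Empty<DataElement>()))
    .ToList();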
Option 2: Use some dark reflection magic in order to dynamically create a projection stage. Downside: You won't get a typed instance reflecting your properties but instead a BsonDocument so you will have to deal with that afterwards. Also, if you have any custom MongoDB mappings in place, you would need to add some code to deal with them.
Here's the full example code:
First, your entities:
public class Test
{
    string _id;
    public ObservableCollection<DataElement> PartData { get; set; }
    public ObservableCollection<DataElement> SensorData { get; set; }
    // just to have one additional property that will not be part of the returned document
    public string TestString { get; set; }
}

public class DataElement
{
}
And then the test program:
public class Program
{
    static void Main(string[] args)
    {
        var collection = new MongoClient().GetDatabase("test").GetCollection<Test>("test");

        // insert test record
        collection.InsertOne(
            new Test
            {
                PartData = new ObservableCollection<DataElement>(
                    new ObservableCollection<DataElement>
                    {
                        new DataElement(),
                        new DataElement()
                    }),
                SensorData = new ObservableCollection<DataElement>(
                    new ObservableCollection<DataElement>
                    {
                        new DataElement(),
                        new DataElement()
                    }),
                TestString = "SomeString"
            });

        // here, we use reflection to find the relevant properties
        var allPropertiesThatWeAreLookingFor = typeof(Test).GetProperties().Where(p => typeof(ObservableCollection<DataElement>).IsAssignableFrom(p.PropertyType));

        // create a string of all properties that we are interested in with a ":1" appended so MongoDB will return these fields only
        // in our example, this will look like
        // "PartData:1,SensorData:1"
        var mongoDbProjection = string.Join(",", allPropertiesThatWeAreLookingFor.Select(p => $"{p.Name}:1"));

        // we do not want MongoDB to return the _id field because it's not of the selected type but would be returned by default otherwise
        mongoDbProjection += ",_id:0";

        var result = collection.Find(FilterDefinition<Test>.Empty).Project($"{{{mongoDbProjection}}}").ToList();

        Console.ReadLine();
    }
}
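Option 2 leaves you with BsonDocument results. If you then want typed objects and a single flat sequence, as the question asks, one possible follow-up (a sketch only) is to map the documents back onto the entity with the driver's BsonSerializer; fields that were projected away simply keep their defaults:

using MongoDB.Bson.Serialization;

// Map each projected BsonDocument back onto the Test entity.
var typedResults = result
    .Select(doc => BsonSerializer.Deserialize<Test>(doc))
    .ToList();

// Flatten all matching collections into one sequence of DataElement.
var allElements = typedResults
    .SelectMany(t => (t.PartData ?? new ObservableCollection<DataElement>())
        .Concat(t.SensorData ?? new ObservableCollection<DataElement>()))
    .ToList();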

Use C# Linq Lambda to combine fields from two objects into one, preferably without anonymous objects

I have a class setup like this:
public class Summary
{
    public Geometry geometry { get; set; }
    public SummaryAttributes attributes { get; set; }
}

public class SummaryAttributes
{
    public int SERIAL_NO { get; set; }
    public string District { get; set; }
}

public class Geometry
{
    public List<List<List<double>>> paths { get; set; }
}
and I take a JSON string of records for that object and cram them in there like this:
List<Summary> oFeatures = reportObject.layers[0].features.ToObject<List<Summary>>();
My end goal is to create a CSV file, so I need one flat List of records to send to the CSV writer I have.
I can do this:
List<SummaryAttributes> oAtts = oFeatures.Select(x => x.attributes).ToList();
and I get a nice List of the attributes and send that off to CSV. Easy peasy.
What I want, though, is to also pluck a field off the Geometry object and include that in my final List to go to CSV.
So the final List going to the CSV writer would contain objects with all of the fields from SummaryAttributes plus the first and last double values from the paths field on the Geometry object (paths[0][0][first] and paths[0][0][last]).
It's hard to explain. I want to graft two extra attributes onto the original SummaryAttributes object.
I would be ok with creating a new SummaryAttributesXY class with the two extra fields if that's what it takes.
But I'm trying to avoid creating a new anonymous object and having to spell out every field in the SummaryAttributes class, as there are many more than I have listed in this sample.
Any suggestions?
You can select a new anonymous object with the required fields, but you should be completely sure that paths has at least one item at each level of the nested lists:
var query = oFeatures.Select(s => new {
    s.attributes.SERIAL_NO,
    s.attributes.District,
    First = s.geometry.paths[0][0].First(), // or [0][0][0]
    Last = s.geometry.paths[0][0].Last()
}).ToList();
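If you cannot guarantee that every record has a populated paths collection, a more defensive variant of the same idea could look like this (illustrative only, still producing an anonymous type):

var safeQuery = oFeatures.Select(s =>
{
    // take the first inner list of doubles, if any exists at each level
    var firstPath = s.geometry?.paths?.FirstOrDefault()?.FirstOrDefault();
    return new
    {
        s.attributes.SERIAL_NO,
        s.attributes.District,
        First = firstPath != null && firstPath.Count > 0 ? firstPath.First() : (double?)null,
        Last = firstPath != null && firstPath.Count > 0 ? firstPath.Last() : (double?)null
    };
}).ToList();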
Got it figured out. I include the X and Y fields in the original class definition. When the JSON gets deserialized they will be null. Then I loop back and fill them in.
List<Summary> oFeatures = reportObject.layers[0].features.ToObject<List<Summary>>();

List<Summary> summary = oFeatures.Select(s =>
{
    var t = new Summary
    {
        attributes = s.attributes
    };
    t.attributes.XY1 = string.Format("{0} , {1}", s.geometry.paths[0][0].First(), s.geometry.paths[0][1].First());
    t.attributes.XY2 = string.Format("{0} , {1}", s.geometry.paths[0][0].Last(), s.geometry.paths[0][1].First());
    return t;
}).ToList();

List<SummaryAttributes> oAtts = summary.Select(x => x.attributes).ToList();
List<SummaryAttributes> oAtts = summary.Select(x => x.attributes).ToList();

How can I run the same Coded UI test with multiple input files

I am looking for a way to run the same Coded UI test class with different input files, e.g. I have a test with an end-to-end flow in the app, and I want to be able to run this test with two different users doing different workflows once inside the app. I do not want to run both tests every time (which is what happens with two rows in the data input CSV). I wasn't able to find a way of doing this so far. Any help/guidance is appreciated.
I can think of three possibilities.
1.
You could arrange the CSV to have two groups of columns, e.g.
UserName1,Password1,DataAa1,DataBb1,UserName2,Password2,DataAa2,DataBb2
Within the test method change the data source accesses to use something like
string testCode = (...) ? "1" : "2";
... = TestContext.DataRow["UserName" + testCode].ToString();
... = TestContext.DataRow["Password" + testCode].ToString();
This requires something else to specify which group of columns to use. That could be done via an environment variable, as sketched below.
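A minimal sketch of that idea, assuming a hypothetical environment variable named CODEDUI_DATASET (the variable name stands in for the (...) condition above and is not from the original answer):

// Pick the column group based on an environment variable set before the test run.
string dataSet = Environment.GetEnvironmentVariable("CODEDUI_DATASET");
string testCode = dataSet == "2" ? "2" : "1"; // default to the first group

string userName = TestContext.DataRow["UserName" + testCode].ToString();
string password = TestContext.DataRow["Password" + testCode].ToString();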
2.
Have three CSV files within the solution. Two of them are the CSV files for the two runs. For example SourceData1.csv and SourceData2.csv. The third file is SourceData.csv and is named in the [DataSource(...) attribute as "|DataDirectory|\\SourceData.csv". In the ".testsettings" file give the name of a batch file that chooses the wanted SourceData1.csv or SourceData2.csv file and uses xcopy to copy that file and overwrite SourceData.csv.
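As an alternative to the batch file, the same copy could be done from within the test assembly. A sketch only: the source file names follow the answer, while the environment variable and method name are illustrative.

[ClassInitialize]
public static void SelectDataFile(TestContext context)
{
    // Overwrite SourceData.csv with whichever source file this run should use.
    string chosen = Environment.GetEnvironmentVariable("CODEDUI_DATASET") == "2"
        ? "SourceData2.csv"
        : "SourceData1.csv";
    File.Copy(chosen, "SourceData.csv", overwrite: true);
}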
3.
Assuming that the test is currently written as
[TestMethod, DataSource(...)]
public void MyCodedUiTestMethod() {
    ...body of the test
}
Then change to having two test methods that call a third method. These two methods specify different CSV files and the called method accesses values from whichever file is being read.
[TestMethod, DataSource(... SourceData1.csv ...)]
public void MyFirstCodedUiTestMethod() {
    BodyOfTheTest();
}

[TestMethod, DataSource(... SourceData2.csv ...)]
public void MySecondCodedUiTestMethod() {
    BodyOfTheTest();
}

public void BodyOfTheTest() {
    ...body of the test
    ... = TestContext.DataRow["UserName"].ToString();
    ... = TestContext.DataRow["Password"].ToString();
}
Note that TestContext is visible in all methods of the class, hence the TestContext.DataRow... expressions can be written outside the methods that specify the [DataSource...] attribute.
If it is the same test case then you should have the same set of input parameters. Create a class with your test arguments, then serialize a list of class instances to an XML file with the different argument sets. When running the test case you can deserialize the XML file (inside the TestInitialize() block), iterate through each class instance, and pass the instance to the coded UI test method. You can call the test method as many times as you want, according to the number of class instances you have in the XML file. This is the method I am using for data-driven testing on coded UI tests.
Create a class with the test arguments
public class ClientDetails
{
    public String ClientType { get; set; }
    public String clientCode { get; set; }
    public String Username { get; set; }
    public String Password { get; set; }
}
Create some class instances and first serialize them to an XML file:
// location to store the XML file; the following relative path will store it
// outside the solution folder, alongside the TestResults folder
string xmlFileRelativePath = "../../../TestClientInfo.xml";

public List<ClientDetails> ListClientConfig = new List<ClientDetails>();

ClientDetails Client1 = new Classes.ClientDetails();
Client1.ClientType = "Standard";
Client1.clientCode = "xxx";
Client1.Username = "username";
Client1.Password = "password";

ClientDetails Client2 = new Classes.ClientDetails();
Client2.ClientType = "Easy";
Client2.clientCode = "xxxx";
Client2.Username = "username";
Client2.Password = "password";

ListClientConfig.Add(Client1);
ListClientConfig.Add(Client2);

XmlSerialization.genericSerializeToXML(ListClientConfig, xmlFileRelativePath);
Retrieve the stored XML objects within the test method or anywhere you prefer (better if inside the TestInitialize() block):
[TestMethod]
public void CommonClientExecution()
{
    List<ClientDetails> ListClientConfig = XmlSerialization.genericDeserializeFromXML(new List<ClientDetails>(), xmlFileRelativePath);
    foreach (var clientDetails in ListClientConfig)
    {
        // your test logic here...
    }
}
XML serialization methods for serializing a collection of objects:
using System.IO;
using System.Xml;
using System.Xml.Serialization;

class XmlSerialization
{
    public static void genericSerializeToXML<T>(T TValue, string XmalfileStorageRelativePath)
    {
        XmlSerializer serializer = new XmlSerializer(typeof(T));
        FileStream textWriter = new FileStream(System.IO.Path.GetFullPath(XmalfileStorageRelativePath), FileMode.OpenOrCreate, FileAccess.ReadWrite);
        serializer.Serialize(textWriter, TValue);
        textWriter.Close();
    }

    public static T genericDeserializeFromXML<T>(T value, string XmalfileStorageFullPath)
    {
        T Tvalue = default(T);
        try
        {
            XmlSerializer deserializer = new XmlSerializer(typeof(T));
            TextReader textReader = new StreamReader(XmalfileStorageFullPath);
            Tvalue = (T)deserializer.Deserialize(textReader);
            textReader.Close();
        }
        catch (Exception ex)
        {
            // MessageBox.Show(@"File Not Found", "Not Found", MessageBoxButtons.OK, MessageBoxIcon.Asterisk);
        }
        return Tvalue;
    }
}

How to update object values (based on the curr-previous pattern)?

Assume you have a CSV file with the following simplified structure
LINE1: ID,Description,Value
LINE2: 1,Product1,2
LINE3: ,,3
LINE4: ,,4
LINE5: 2,Product2,2
LINE6: ,,3
LINE7: ,,5
I am using FileHelpers to read the CSV and have hooked up one of the interfaces that allows me to access the current line after it has been read. Refer to this SO question for more background.
The issue is that using that approach I will need to write many more if statements to check all the fields that need to be copied. (I have at least 6 CSV files at the moment with the same 'blank' format, all having more than 20 fields that need to be copied ~ 120 if statements. Urggh.)
Now, this is not a micro-optimisation exercise, since there will be more files that have this 'incomplete' format.
How can I update the previous record in an elegant way such that I won't have to write if conditions and declarations for each field?
The current solution is to annotate the required fields using a custom attribute called CopyMe and then use reflection to copy.
[DelimitedRecord(",")]
[IgnoreFirst(1)]
public class Product
{
    [CopyMe()] public int? ID { get; set; } // nullable so a blank ID field reads as null
    [CopyMe()] public string Description { get; set; }

    [FieldConverter(ConverterKind.Decimal)]
    public decimal Val { get; set; }
}
with the AfterRead method looking like so...
public void AfterRead(AfterReadEventArgs<Product> e)
{
    var record = e.Record;

    if (PreviousRecord == null)
    {
        PreviousRecord = record;
    }

    if (record.ID == null) // a blank ID indicates a continuation row
    {
        var ttype = typeof(Product);

        // the mapping class uses properties, so reflect over properties rather than fields
        var properties = ttype.GetProperties();
        var propertiesToCopy = properties.Where(property =>
            property.GetCustomAttributes(typeof(CopyMeAttribute), true).Any());

        foreach (var item in propertiesToCopy)
        {
            var prevValue = item.GetValue(PreviousRecord, null);
            item.SetValue(record, prevValue, null);
        }
    }

    // remember the latest (now complete) record for the next row
    PreviousRecord = record;
}
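The CopyMe attribute itself is not shown in the question; it is presumably just a marker. A minimal sketch of what such an attribute might look like:

// Hypothetical marker attribute used only to flag which members should be
// copied forward from the previous record.
[AttributeUsage(AttributeTargets.Property | AttributeTargets.Field)]
public sealed class CopyMeAttribute : Attribute
{
}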
