Storing (and using) file naming schemas in a class property - C#

I have a project that contains a stateProfile class and an Inquiry class.
The Inquiry class is static and its job is to create a text file. The text file's contents are calculated using information stored in the properties of a stateProfile object (one for each state in the USA, read in from an XML file).
I had been able to share a global naming schema for all of the states, but I now have requirements to name the files based on the stateProfile object.
For example:
I have a stateProfile for Kansas and another for Missouri.
Inquiry inquiry = new Inquiry();
foreach (stateProfile p in _listOfProfiles)
{
    inquiry.CreateFile(p);
}
In Inquiry.CreateFile(stateProfile p) I have the expression that defines the filename:
sOutFileName = $"{p.prop1}_{p.prop2}_Literal3_{_staticProp.ToString("D4")}.txt";
Any suggestions for being able to store the logic/expression in a property like stateProfile.outFileName, such as
public string outFileName {
    get { return $"{this.prop1}_{this.prop2}_Literal3_{_staticProp.ToString("D4")}.txt"; }
}
and then be able to refer to that property in Inquiry.CreateFile(stateProfile p) like the following?
sOutFileName = $"{p.outFileName}";

I think that this sort of approach might be what you are looking for. It allows you to embed C# scripts into your XML and interpret them in code.
Every State could have a different filename format now.
using System;
using System.Threading.Tasks;
using Microsoft.CodeAnalysis.CSharp.Scripting;
using Microsoft.CodeAnalysis.Scripting;

namespace ConsoleApp1
{
    class Program
    {
        static async Task Main(string[] args)
        {
            // This was read from your XML file.
            var filenameFormatterScript = "p => $\"{p.Name.ToLower()}.txt\"";
            var options = ScriptOptions.Default.AddReferences(typeof(StateProfile).Assembly);
            var sp = new StateProfile
            {
                Name = "Alabama",
                FilenameFormatter = await CSharpScript.EvaluateAsync<Func<StateProfile, string>>(filenameFormatterScript, options)
            };
            Console.WriteLine(sp.Filename());
        }
    }

    public class StateProfile
    {
        public string Name { get; set; }
        public Func<StateProfile, string> FilenameFormatter { get; set; }

        public string Filename()
        {
            return FilenameFormatter(this);
        }
    }
}
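The comment in the code says the script string "was read from your XML file"; here is a minimal sketch of what that loading step might look like, assuming a hypothetical XML layout with a filenameFormatter element per state. Compiling once per state and caching the delegate matters, since script compilation is far too slow to repeat per file.

using System;
using System.Threading.Tasks;
using System.Xml.Linq;
using Microsoft.CodeAnalysis.CSharp.Scripting;
using Microsoft.CodeAnalysis.Scripting;

public static class FormatterLoader
{
    // Hypothetical XML shape:
    // <state name="Alabama">
    //   <filenameFormatter>p => $"{p.Name.ToLower()}.txt"</filenameFormatter>
    // </state>
    public static async Task<Func<StateProfile, string>> LoadAsync(XElement stateElement)
    {
        var script = (string)stateElement.Element("filenameFormatter");
        var options = ScriptOptions.Default.AddReferences(typeof(StateProfile).Assembly);

        // Compile once per state; store the resulting delegate on the profile.
        return await CSharpScript.EvaluateAsync<Func<StateProfile, string>>(script, options);
    }
}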

After thinking about this some more, Alex's comment was the way to go.
Since all of the properties I am looking to use in constructing the filename formulas belong to the stateProfile object, I made GetFileName() a class member and pass in anything else I might need.
I use a switch statement, which could get unwieldy as the number of states I have to consider grows (a delegate-map alternative is sketched after the code), but for now it works.
public string GetFileName(int prop1)
{
    string sOutFileName = "";
    switch (SourceState)
    {
        case "MO":
            sOutFileName = $"ReqPref_{this.stProfPropA}_{prop1}.txt";
            break;
        default:
            sOutFileName = $"{this.stProfPropB}_{prop1}.txt";
            break;
    }
    return sOutFileName;
}
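If the switch does grow unwieldy, one alternative, sketched here with the property names from the snippet above, is a delegate map keyed by state code; states without an entry fall back to the default format.

// Requires using System and System.Collections.Generic.
private static readonly Dictionary<string, Func<stateProfile, int, string>> _formatters =
    new Dictionary<string, Func<stateProfile, int, string>>
    {
        ["MO"] = (p, prop1) => $"ReqPref_{p.stProfPropA}_{prop1}.txt",
        // add new states here instead of new switch cases
    };

public string GetFileName(int prop1)
{
    return _formatters.TryGetValue(SourceState, out var formatter)
        ? formatter(this, prop1)
        : $"{this.stProfPropB}_{prop1}.txt";
}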

Related

Looping header and details records in same file

I have re-edited this question. Below is an example file which has multiple purchase orders, identified by the second column.
Order Number, Purchase Number,DATE,Item Code ,Qty, Description
1245456,98978,12/01/2019, 1545-878, 1,"Test"
1245456,98978,12/01/2019,1545-342,2,"Test"
1245456,98978,12/01/2019,1545-878,2,"Test"
1245456,98979,12/02/2019,1545-878,3,"Test 3"
1245456,98979,12/02/2019,1545-342,4,"Test 4"
1245456,98979,12/02/2019,1545-878,5,"Test 4"
What I want the end result to be is to place the above into one class like the following.
At the minute I am using FileHelpers to parse the CSV file. This would work fine if I had a separate header file and row file, but they are combined as you can see.
var engine = new FileHelperEngine<CSVLines>();
var lines = engine.ReadFile(csvFileName);
So the class should be like below:
[DelimitedRecord(",")]
public class SalesOrderHeader
{
    private Guid? _guid;
    public Guid RowID
    {
        get
        {
            return _guid ?? (_guid = Guid.NewGuid()).GetValueOrDefault();
        }
    }

    public string DocReference { get; set; }
    public string CardCode { get; set; }
    public string DocDate { get; set; }
    public string ItemCode { get; set; }
    public string Description { get; set; }
    public string Qty { get; set; }
    public string Price { get; set; }

    [FieldHidden]
    public List<SalesOrderHeader> OrdersLines { get; set; }
}
What I imagine I will have to do is two loops. As you will see from my CreateSalesOrder routine, I first create the header and then add the lines in.
public void CreateSalesOrder(List<SalesOrderHeader> _salesOrders)
{
    foreach (var record in _salesOrders.GroupBy(g => g.DocReference))
    {
        // Init the Order object
        oOrder = (SAPbobsCOM.Documents)company.GetBusinessObject(SAPbobsCOM.BoObjectTypes.oOrders);
        SAPbobsCOM.SBObob oBob;
        // set properties of the Order object
        // oOrder.NumAtCard = record.Where(w=>w.RowID = record.Where()
        oOrder.CardCode = record.First().CardCode;
        oOrder.DocDueDate = DateTime.Now;
        oOrder.DocDate = Convert.ToDateTime(record.First().DocDate);

        foreach (var recordItems in _salesOrders.SelectMany(e => e.OrdersLines).Where(w => w.DocReference == record.First().DocReference))
        {
            oOrder.Lines.ItemCode = recordItems.ItemCode;
            oOrder.Lines.ItemDescription = recordItems.Description;
            oOrder.Lines.Quantity = Convert.ToDouble(recordItems.Qty);
            oOrder.Lines.Price = Convert.ToDouble(recordItems.Price);
            oOrder.Lines.Add();
            log.Debug(string.Format("Order Line added to sap Item Code={0}, Description={1}, Qty={2}", recordItems.ItemCode, recordItems.Description, recordItems.Qty));
        }

        int lRetCode = oOrder.Add(); // Try to add the order to the database
        if (lRetCode == 0)
        {
            string body = "Purchase Order Imported into SAP";
        }
        if (lRetCode != 0)
        {
            int temp_int = lErrCode;
            string temp_string = sErrMsg;
            company.GetLastError(out temp_int, out temp_string);
            if (lErrCode != -4006) // In case adding an order failed
            {
                log.Error(string.Format("Error adding an order into sap ErrorCode {0},{1}", temp_int, temp_string));
            }
        }
    }
}
The problem, as you will see, is twofold: first, how do I split the CSV into the two lists, and second, how do I access the header rows correctly in the strongly typed object? As you can see I am using First(), which will not work correctly.
With FileHelpers it is important to avoid using the mapping class for anything other than describing the underlying file structure. Here I suspect you are trying to map directly to a class which is too complex.
A FileHelpers class is just a way of defining the specification of a flat file using C# syntax.
As such, the FileHelpers classes are an unusual type of C# class and you should not try to apply accepted OOP principles. A FileHelpers class should not have properties or methods beyond the ones used by the FileHelpers library.
Think of the FileHelpers class as the 'specification' of your CSV format only. That should be its only role. (This is good practice from a maintenance perspective anyway; if the underlying CSV structure were to change, it is easier to adapt your code.)
If you need the records in a more 'normal' object, map the results to something better, that is, a class that encapsulates all the functionality of the Order object rather than the CSVOrder.
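For instance, the dedicated class might be as simple as this sketch (property names guessed from the LINQ sample further down):

public class NiceOrder
{
    public string OrderNumber { get; set; }
    public string Customer { get; set; }
    public string ItemCode { get; set; }
    // ...plus whatever else the Order needs
}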
So, one way of handling this type of file is to parse the file twice. In the first pass you extract the header records. Something like this:
var engine1 = new FileHelperEngine<CSVHeaders>();
var headers = engine1.ReadFile(csvFileName);
In the second pass you extract the details:
var engine2 = new FileHelperEngine<CSVDetails>();
var details = engine2.ReadFile(csvFileName);
Then you combine this information into a new dedicated class, maybe with some LINQ similar to this (DistinctBy comes from MoreLINQ, or is built into .NET 6 and later):
var niceOrders =
    headers
        .DistinctBy(h => h.OrderNumber)
        .SelectMany(h => details.Where(d => d.OrderNumber == h.OrderNumber))
        .Select(x =>
            new NiceOrder()
            {
                OrderNumber = x.OrderNumber,
                Customer = x.Customer,
                ItemCode = x.ItemCode
                // etc.
            });
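The CSVHeaders and CSVDetails mapping classes are not shown in the answer; a plausible sketch for this particular file, where every line carries all six columns, could look like the following. The details class can have the same shape here, with the header pass simply deduplicated by order number.

// Hypothetical mapping class for the header pass; [IgnoreFirst(1)] skips
// the column-header line, and Description is quoted in the sample data.
[DelimitedRecord(",")]
[IgnoreFirst(1)]
public class CSVHeaders
{
    public string OrderNumber;
    public string PurchaseNumber;
    public string Date;
    public string ItemCode;
    public string Qty;
    [FieldQuoted('"', QuoteMode.OptionalForRead)]
    public string Description;
}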

Can I parameterize a generic Type

We have several reports in fixed-length record format. We are using FileHelpers to transform them to delimited record format.
As there are several reports, we thought of describing the fixed and delimited models and passing them to
FileTransformEngine<TSource, TDestination>();
To make it DRY, can we do something like
Transform<TInput, TOutput>()
{
    var engine = new FileTransformEngine<TInput, TOutput>();
    engine.TransformFileFast(...);
}
I'm new to C# and generics and I'm not sure where to start.
Can anyone give me some guidance on whether this is possible? All the models are already created, so they are available at compile time, I think.
The documentation for FileHelpers shows how to read a fixed file and write a delimited file, so it should be pretty easy to wrap calls to both in your own FileTransformEngine:
public class FileTransformEngine<T>
{
    public void TransformFileFast(string inputFile, string outputFile)
    {
        var readEngine = new FixedFileEngine<T>();
        T[] records = readEngine.ReadFile(inputFile);

        var writeEngine = new FileHelperEngine<T>();
        writeEngine.WriteFile(outputFile, records);
    }
}
For this to work, your class will need the attributes for both reading and writing, e.g.
[FixedLengthRecord]
[DelimitedRecord("|")]
public class MyRecord
{
    [FieldFixedLength(5)] // for reading
    [FieldConverter(ConverterKind.Decimal, ".")] // for writing
    public decimal Foo { get; set; }
}
Then you would use:
var engine = new FileTransformEngine<MyRecord>();
engine.TransformFileFast("from/input.txt", "to/output.csv");
If you want different types for input and output, you could have the class take two types (an input and an output) along with a delegate for transforming one to the other.
public class FileTransformEngine<TInput, TOutput>
{
    public void TransformFileFast(string inputFile, string outputFile,
        Func<TInput, TOutput> transformer)
    {
        var readEngine = new FixedFileEngine<TInput>();
        TInput[] records = readEngine.ReadFile(inputFile);
        IEnumerable<TOutput> outputRecords = records.Select(transformer);

        var writeEngine = new FileHelperEngine<TOutput>();
        writeEngine.WriteFile(outputFile, outputRecords);
    }
}
This would then be called with your two types:
var engine = new FileTransformEngine<MyInputRecord, MyOutputRecord>();
engine.TransformFileFast("from/input.txt", "to/output.csv", input => {
    // transform input to output.
    return new MyOutputRecord();
});
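For completeness, a sketch of what such a record pair might look like; the field names are hypothetical, with a fixed-length input and a pipe-delimited output:

[FixedLengthRecord]
public class MyInputRecord
{
    [FieldFixedLength(5)]
    public string Code;

    [FieldFixedLength(20)]
    [FieldTrim(TrimMode.Both)] // strip the padding that fixed-length fields carry
    public string Name;
}

[DelimitedRecord("|")]
public class MyOutputRecord
{
    public string Code;
    public string Name;
}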

How can I run the same Coded UI test with multiple input files

I am looking for a way to run the same Coded UI test class with different input files. For example, I have a test with an end-to-end flow in the app, and I want to run this test with two different users doing different workflows once inside the app. I do not want to run both tests each time (that much is possible by having two rows in the data input CSV). I wasn't able to find a way of doing this so far. Any help/guidance is appreciated.
I can think of three possibilities.
1.
You could arrange the CSV to have two groups of columns, e.g.
UserName1,Password1,DataAa1,DataBb1,UserName2,Password2,DataAa2,DataBb2
Within the test method change the data source accesses to use something like
string testCode = (...) ? "1" : "2";
... = TestContext.DataRow["UserName" + testCode].ToString();
... = TestContext.DataRow["Password" + testCode].ToString();
This requires something else to specify which group of columns to use. That could be done via an environment variable, as sketched below.
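A minimal sketch of the environment-variable approach; the variable name TEST_VARIANT is illustrative, not part of the original answer:

// Pick the column-group suffix from an environment variable set by
// whatever launches the test run; default to group 1.
string testCode = Environment.GetEnvironmentVariable("TEST_VARIANT") == "2" ? "2" : "1";
string userName = TestContext.DataRow["UserName" + testCode].ToString();
string password = TestContext.DataRow["Password" + testCode].ToString();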
2.
Have three CSV files within the solution. Two of them are the CSV files for the two runs, for example SourceData1.csv and SourceData2.csv. The third file is SourceData.csv and is named in the [DataSource(...)] attribute as "|DataDirectory|\\SourceData.csv". In the ".testsettings" file, give the name of a batch file that chooses the wanted SourceData1.csv or SourceData2.csv file and uses xcopy to copy that file over SourceData.csv.
3.
Assuming that the test is currently written as
[TestMethod, DataSource(...)]
public void MyCodedUiTestMethod() {
    ...body of the test
}
Then change to having two test methods that call a third method. These two methods specify different CSV files, and the called method accesses values from whichever file is being read.
[TestMethod, DataSource(... SourceData1.csv ...)]
public void MyFirstCodedUiTestMethod() {
    BodyOfTheTest();
}

[TestMethod, DataSource(... SourceData2.csv ...)]
public void MySecondCodedUiTestMethod() {
    BodyOfTheTest();
}

public void BodyOfTheTest() {
    ...body of the test
    ... = TestContext.DataRow["UserName"].ToString();
    ... = TestContext.DataRow["Password"].ToString();
}
Note that TestContext is visible in all methods of the class, hence the TestContext.DataRow... expressions can be written outside the methods that specify the [DataSource(...)] attribute.
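For reference, the CSV flavor of the MSTest DataSource attribute is usually written out in full along these lines, with a matching DeploymentItem; this is the documented general form, not something elided from the answer above:

[TestMethod]
[DeploymentItem("SourceData1.csv")]
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
    "|DataDirectory|\\SourceData1.csv", "SourceData1#csv",
    DataAccessMethod.Sequential)]
public void MyFirstCodedUiTestMethod()
{
    BodyOfTheTest();
}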
If it is the same test case, then you should have the same set of input parameters. Create a class with your test arguments, then serialize a list of class instances to an XML file with the different argument sets. When running the test case you can deserialize the XML file (inside the TestInitialize() block), iterate through each class instance, and pass each instance to the coded UI test method. You can call the test method as many times as you want, according to the count of class instances you have in the XML file. This is the method I am using for data-driven testing on Coded UI tests.
Create a class with the test arguments:
public class ClientDetails
{
    public String ClientType { get; set; }
    public String clientCode { get; set; }
    public String Username { get; set; }
    public String Password { get; set; }
}
Create some class instances and serialize them to an XML file:
// Location to store the XML file. The following relative path will store
// it outside the solution folder, with the TestResults folder.
string xmlFileRelativePath = "../../../TestClientInfo.xml";

public List<ClientDetails> ListClientConfig = new List<ClientDetails>();

ClientDetails Client1 = new Classes.ClientDetails();
Client1.ClientType = "Standard";
Client1.clientCode = "xxx";
Client1.Username = "username";
Client1.Password = "password";

ClientDetails Client2 = new Classes.ClientDetails();
Client2.ClientType = "Easy";
Client2.clientCode = "xxxx";
Client2.Username = "username";
Client2.Password = "password";

ListClientConfig.Add(Client1);
ListClientConfig.Add(Client2);

XmlSerialization.genericSerializeToXML(ListClientConfig, xmlFileRelativePath);
Retrieve the stored XML objects within the test method or anywhere you prefer (better inside the TestInitialize() block):
[TestMethod]
public void CommonClientExecution()
{
    // Pass a List<ClientDetails> so the generic parameter is inferred as
    // the list type, matching what was serialized above.
    List<ClientDetails> ListClientConfig =
        XmlSerialization.genericDeserializeFromXML(new List<ClientDetails>(), xmlFileRelativePath);
    foreach (var clientDetails in ListClientConfig)
    {
        // your test logic here...
    }
}
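If you prefer doing the deserialization in initialization, as suggested above, a sketch (the field and method names here are illustrative):

private List<ClientDetails> _clients;

[TestInitialize]
public void LoadTestData()
{
    // Runs before each test; _clients is then available to every
    // test method in the class.
    _clients = XmlSerialization.genericDeserializeFromXML(
        new List<ClientDetails>(), xmlFileRelativePath);
}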
XML serialization methods for serializing a collection of objects:
using System;
using System.IO;
using System.Xml;
using System.Xml.Serialization;

class XmlSerialization
{
    public static void genericSerializeToXML<T>(T TValue, string XmalfileStorageRelativePath)
    {
        XmlSerializer serializer = new XmlSerializer(typeof(T));
        FileStream textWriter = new FileStream(Path.GetFullPath(XmalfileStorageRelativePath), FileMode.OpenOrCreate, FileAccess.ReadWrite);
        serializer.Serialize(textWriter, TValue);
        textWriter.Close();
    }

    public static T genericDeserializeFromXML<T>(T value, string XmalfileStorageFullPath)
    {
        T Tvalue = default(T);
        try
        {
            XmlSerializer deserializer = new XmlSerializer(typeof(T));
            TextReader textReader = new StreamReader(XmalfileStorageFullPath);
            Tvalue = (T)deserializer.Deserialize(textReader);
            textReader.Close();
        }
        catch (Exception)
        {
            // Swallowing the exception means a missing file silently
            // returns default(T).
            // MessageBox.Show("File Not Found", "Not Found", MessageBoxButtons.OK, MessageBoxIcon.Asterisk);
        }
        return Tvalue;
    }
}

ServiceStack support for conditionally omitting fields from a REST response on a per-call basis

<TL;DR>
At a minimum, I'm looking for a way to conditionally exclude certain properties of the resource from the response on a per-call basis (see fields below).
Ideally, I'd like to implement a REST service with ServiceStack that supports all the major points below.
UPDATE
While I really like ServiceStack's approach in general and would prefer to use it if possible, if it isn't particularly well suited to these ideas I'd rather not bend over backwards bastardizing it to make it work. If that's the case, can anyone point to another C# framework that might be more appropriate? I'm actively exploring other options myself, of course.
</TL;DR>
In this talk entitled Designing REST + JSON APIs, the presenter describes his strategy for Resource References (via an href property on resources) in JSON. In addition, he describes two query parameters (fields and expand) for controlling what data is included in the response of a call to a REST service. I've been digging into the ServiceStack framework to achieve support for fields in particular, but have thus far been unsuccessful. Is this currently possible in ServiceStack? Ideally the solution would be format agnostic and would therefore work across all of ServiceStack's supported output formats. I would imagine expand would follow the same strategy.
I'll describe these features here but I think the talk at the link does a better job of explaining them.
Let's say we have a Profiles resource with the following properties: givenName, surname, gender, and favColor. The Profiles resource also includes a list of social networks the user belongs to in the socialNetworks property.
href - (42:22 in the video) Every resource includes a full link to itself on the REST service. A call to GET /profiles/123 would return
{
    "href":"https://host/profiles/123",
    "givenName":"Bob",
    "surname":"Smith",
    "gender":"male",
    "favColor":"red",
    "socialNetworks": {
        "href":"https://host/socialNetworkMemberships?profileId=123"
    }
}
Notice that the socialNetworks property returns an object with just the href value populated. This keeps the response short and focused while also giving the end user enough information to make further requests if desired. The href property, used across the board in this manner, makes it easy (conceptually anyway) to reuse resource data structures as children of other resources.
fields - (55:44 in video) Query string parameter that instructs the server to only include the specified properties of the desired resource in the REST response.
A normal response from GET /profiles/123 would include all the properties of the resource, as seen above. When the fields query param is included in the request, only the fields specified are returned. GET /profiles/123?fields=surname,favColor would return
{
    "href":"https://host/profiles/123",
    "surname":"Smith",
    "favColor":"red"
}
expand - (45:53 in video) Query string parameter that instructs the server to flesh out the specified child resources in the result. Using our example, if you were to call GET /profiles/123?expand=socialNetworks you might receive something like
{
    "href":"https://host/profiles/123",
    "givenName":"Bob",
    "surname":"Smith",
    "gender":"male",
    "favColor":"red",
    "socialNetworks": {
        "href":"https://host/socialNetworkMemberships?profileId=123",
        "items": [
            {
                "href":"https://host/socialNetworkMemberships/abcde",
                "siteName":"Facebook",
                "profileUrl":"http://www.facebook.com/..."
            },
            ...
        ]
    }
}
So...in my opinion ServiceStack's best feature is that it makes sending, receiving and handling POCOs over HTTP super easy. How you set up the POCOs and what you do in between (within the 'Service') is up to you. Does SS have opinions? Yes. Do you have to agree with them? No. (But you probably should :))
I think expanding on something like the code below would get you close to how you want to handle your API. It's probably not the best example of ServiceStack, but the ServiceStack code/requirements are barely noticeable and don't get in your way (AppHost configuration not shown). You could probably do something similar in other .NET frameworks (MVC/Web API/etc.) but, in my opinion, it won't look as much like straight C#/.NET code as it does with ServiceStack.
Request classes
[Route("/Profiles/{Id}")]
public class Profiles
{
public int? Id { get; set; }
}
[Route("/SocialNetworks/{Id}")]
public class SocialNetworks
{
public int? Id { get; set; }
}
Base Response class
public class BaseResponse
{
    protected virtual string hrefPath
    {
        get { return ""; }
    }

    public string Id { get; set; }
    public string href { get { return hrefPath + Id; } }
}
Classes from example
public class Profile : BaseResponse
{
    protected override string hrefPath { get { return "https://host/profiles/"; } }

    public string GivenName { get; set; }
    public string SurName { get; set; }
    public string Gender { get; set; }
    public string FavColor { get; set; }
    public List<BaseResponse> SocialNetworks { get; set; }
}

public class SocialNetwork : BaseResponse
{
    protected override string hrefPath { get { return "https://host/socialNetworkMemberships?profileId="; } }

    public string SiteName { get; set; }
    public string ProfileUrl { get; set; }
}
Services
public class ProfileService : Service
{
    public object Get(Profiles request)
    {
        var testProfile = new Profile
        {
            Id = "123",
            GivenName = "Bob",
            SurName = "Smith",
            Gender = "Male",
            FavColor = "Red",
            SocialNetworks = new List<BaseResponse>
            {
                new SocialNetwork { Id = "abcde", SiteName = "Facebook", ProfileUrl = "http://www.facebook.com/" }
            }
        };

        if (!String.IsNullOrEmpty(this.Request.QueryString.Get("fields")) || !String.IsNullOrEmpty(this.Request.QueryString.Get("expand")))
            return ServiceHelper.BuildResponseObject<Profile>(testProfile, this.Request.QueryString);

        return testProfile;
    }
}

public class SocialNetworkService : Service
{
    public object Get(SocialNetworks request)
    {
        var testSocialNetwork = new SocialNetwork
        {
            Id = "abcde",
            SiteName = "Facebook",
            ProfileUrl = "http://www.facebook.com/"
        };

        if (!String.IsNullOrEmpty(this.Request.QueryString.Get("fields")) || !String.IsNullOrEmpty(this.Request.QueryString.Get("expand")))
            return ServiceHelper.BuildResponseObject<SocialNetwork>(testSocialNetwork, this.Request.QueryString);

        return testSocialNetwork;
    }
}
Reflection Helper Class
public static class ServiceHelper
{
    public static object BuildResponseObject<T>(T typedObject, NameValueCollection queryString) where T : BaseResponse
    {
        var newObject = new ExpandoObject() as IDictionary<string, object>;
        newObject.Add("href", typedObject.href);

        if (!String.IsNullOrEmpty(queryString.Get("fields")))
        {
            foreach (var propertyName in queryString.Get("fields").Split(',').ToList())
            {
                // could check for 'socialNetworks' and exclude if you wanted
                newObject.Add(propertyName,
                    typedObject.GetType().GetProperty(propertyName,
                        BindingFlags.IgnoreCase | BindingFlags.Public | BindingFlags.Instance)
                    .GetValue(typedObject, null));
            }
        }
        if (!String.IsNullOrEmpty(queryString.Get("expand")))
        {
            foreach (var propertyName in queryString.Get("expand").Split(',').ToList())
            {
                newObject.Add(propertyName,
                    typedObject.GetType().GetProperty(propertyName,
                        BindingFlags.IgnoreCase | BindingFlags.Public | BindingFlags.Instance)
                    .GetValue(typedObject, null));
            }
        }
        return newObject;
    }
}
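One caveat with the reflection lookup above: GetProperty returns null for a name that doesn't match a property, so a misspelled ?fields= entry would throw a NullReferenceException. A defensive variant of the field loop (a sketch, not part of the original answer) might look like:

foreach (var propertyName in queryString.Get("fields").Split(','))
{
    var property = typedObject.GetType().GetProperty(propertyName,
        BindingFlags.IgnoreCase | BindingFlags.Public | BindingFlags.Instance);
    // Skip unknown field names instead of throwing.
    if (property != null)
        newObject.Add(propertyName, property.GetValue(typedObject, null));
}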
Usually you can control the serialization of your DTOs by setting DataMember attributes. With those attributes you can control whether the property should have defaults or not.
Meaning, if you simply do not define the property of the object you want to return, it should not be serialized and therefore will not show up in the resulting JSON.
ServiceStack internally uses the standard DataContract...Serializer, so this should be supported.
Otherwise, you could also make use of dynamic objects and simply compose your object at runtime, serialize it, and send it back.
Here is a very, very basic example:
// Note: this snippet uses Json.NET (Newtonsoft.Json) types rather than
// ServiceStack's own serializer; it needs using System.IO and
// using Newtonsoft.Json.
var seri = JsonSerializer.Create(new JsonSerializerSettings());
using (var textWriter = new StringWriter())
{
    var writer = new JsonTextWriter(textWriter);
    dynamic item = new { Id = id };
    seri.Serialize(writer, item);
    return textWriter.ToString();
}

How to update object values (based on the curr-previous pattern)?

Assume you have a CSV file with the following simplified structure:
LINE1: ID,Description,Value
LINE2: 1,Product1,2
LINE3: ,,3
LINE4: ,,4
LINE5: 2,Product2,2
LINE6: ,,3
LINE7: ,,5
I am using FileHelpers to read the CSV and have hooked up one of the interfaces that allows me to access the current line after it has been read. Refer to this SO question for more background.
The issue is that using that approach I will need to write many more if statements to check all the fields that need to be copied. (I have at least 6 CSV files at the moment, all with the same 'blank' format and more than 20 fields that need to be copied: roughly 120 if statements. Urggh.)
This is not a micro-optimisation exercise, since there will be more files with this 'incomplete' format.
How can I update the previous record in an elegant way, such that I won't have to write if conditions and declarations for each field?
The current solution is to annotate the required fields with a custom attribute called CopyMe and then use reflection to copy them.
[DelimitedRecord(",") ]
[IgnoreFirst(1)]
public class Product
{
[CopyMe()] public int ID { get; set; }
[CopyMe()] public string Description { get; set; }
[FieldConverter(ConverterKind.Decimal)]
public decimal Val{ get; set; }
}
with the AfterRead method looking like so:
public void AfterRead(AfterReadEventArgs<Product> e)
{
    var record = e.Record;
    if (PreviousRecord == null)
    {
        PreviousRecord = record;
    }
    if (String.IsNullOrEmpty(record.ID)) // a blank ID indicates a continuation row
    {
        var ttype = typeof(Product);
        // The annotated members are properties, so use GetProperties()
        // rather than GetFields(), which would return nothing here.
        var properties = ttype.GetProperties();
        var propertiesToCopy = properties.Where(prop =>
            prop.GetCustomAttributes(typeof(CopyMeAttribute), true).Any());
        foreach (var item in propertiesToCopy)
        {
            var prevValue = item.GetValue(PreviousRecord, null);
            item.SetValue(record, prevValue, null);
        }
    }
    else
    {
        PreviousRecord = record; // remember the latest complete row
    }
}
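Since AfterRead runs for every row, the reflection lookup could also be hoisted into a static field so it happens only once. A small sketch, reusing the Product and CopyMeAttribute names from above; the foreach would then iterate this array instead:

// Computed once per process instead of on every continuation row
// (requires using System.Linq and System.Reflection).
private static readonly PropertyInfo[] PropertiesToCopy =
    typeof(Product).GetProperties()
        .Where(p => p.GetCustomAttributes(typeof(CopyMeAttribute), true).Any())
        .ToArray();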
