Does anybody know how to extract reports from AVEVA E3D using C#? For example, how do I get a pipe name or an equipment name using C#?
Is there any documentation, apart from the .NET Customization Guide, to read about AVEVA E3D and C#?
With AVEVA E3D, you can easily extract them from the command line using PML (Programmable Macro Language).
You don't have to use C#, but if you really want to, you can have a look here:
PDMS C#
Write this into a .txt file:
-- this is a comment
-- Change the path directory, this is your output
!path=|C:\Users\XXX\Desktop\extract.csv|
-- Here you can choose between many element types, like PIPE or EQUI in this sample
-- var !c declares a variable, collect builds a collection of a type, and all equi means every equipment in my MDB (multi-database).
var !c collect all equi
-- The first line of my array contains Name; you can delete this if you don't need it
!tab = array()
!tab.append('Name')
-- For each value of my collection !c, which contains all equipment, do this action
Do !x values !c
-- You can select your attribute, here I choose the name
!name = !x.dbref().name
-- I put the result of the name into my array
!tab.append(!name)
Enddo
-- I create an object file from my output path, then choose the Append option and write my array to the output file.
!file = object file(!path)
!file.writefile('Append', !tab)
Once you have saved this into a .txt file, you have to call it from E3D: open E3D, open the command window and type this: "$m C:\Users\XXX\Desktop\YourMacroName.txt"
The $m calls the macro; the syntax is $m PathName\FileName
You can write a macro in .txt format or, like any AVEVA macro, you can use the .pmlmac format (PML macro). The format doesn't matter; it is just an easy way to recognize an AVEVA macro.
I am using ctreeACE to create a local database. I was given a CSV file that contains 1000 entries of data, and I wanted to know if there is a way to import it without hard-coding it.
Right now I am having to insert line by line with:
INSERT INTO testdata VALUES
('1ZE83A545192635139','2018-06-19 00:00:00',etc)
Note that ctreeACE only allows single row inserts with INSERT...VALUES (Source)
I can't find a way to do this directly, but you could use this tool to create your insert statements.
First, input your data. You can load the CSV directly; I just hard-coded two sample lines.
Next, set your input options as needed. I used comma separators and ' as the quoting character in the example.
Third, set your output options. This would be a huge screenshot and is pretty self-explanatory, so I'm leaving it out.
Last, click CSV to SQL Insert, and it will generate formatted insert statements (one line per insert) for you.
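If you would rather script the conversion than use the web tool, a minimal C# sketch along these lines could generate the statements. It assumes plain comma-separated fields with no embedded commas or quotes, it quotes every field as text (adjust for numeric columns), and the input.csv and testdata names are just placeholders:
using System;
using System.IO;
using System.Linq;

class CsvToInsert
{
    static void Main()
    {
        // Emit one single-row INSERT per CSV line, since ctreeACE only
        // accepts single-row INSERT...VALUES statements.
        foreach (string line in File.ReadLines("input.csv"))
        {
            // Naive split: assumes no commas or quotes inside the fields themselves.
            string[] fields = line.Split(',');
            string values = string.Join(", ",
                fields.Select(f => "'" + f.Trim().Replace("'", "''") + "'"));
            Console.WriteLine("INSERT INTO testdata VALUES (" + values + ");");
        }
    }
}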
Hope that helps.
I want to read through a XAML file, find all the lines with 'Annotation.AnnotationText', and get specific data from those lines. For example, this line:
<prwab:Branch Condition="{x:Null}" sap2010:Annotation.AnnotationText="testing information " ContinuouslyExecute="False" CreatedBy="System Administrator" CreatedOn="2013-02-23T14:51:28.1555955-05:00" DisplayName="Failure" EnableValidationRule="False" sap:VirtualizedContainerService.HintSize="160,234" ID="ab91dec8-1976-491e-91eb-58e073a69d16" IsReportable="False" LastModifiedBy="System Administrator" LastModifiedOn="2013-02-23T14:51:28.1555955-05:00" MediaRecord="[MediaRecord]" SystemName="CollectDigitsActivity1 Failure6" Timeout="10000" Type="Voice">
I want to find all the lines with 'AnnotationText' in my XAML file and get information like text = 'testing information', id = 'ab91dec8-1976-491e-91eb-58e073a69d16', the created date and the last modified date.
I have zero knowledge in this area and I don't know where to start or which method I should use. Thanks for helping!
XAML is just a specific flavour of XML. You will need to use XML parsing to read the file into an object that you can process in this manner. I recommend LINQ to XML for this (look at the XDocument class to get started), specifically because finding values by XName in a specific namespace, as you will need to do for the "sap2010" namespace, is very easy.
You can then easily parse and extract the information you are looking for using those classes.
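As a rough sketch with LINQ to XML: the namespace URI below is the one the sap2010 prefix is normally bound to in workflow XAML (check the xmlns declarations at the top of your own file), and "workflow.xaml" is a placeholder file name.
using System;
using System.Linq;
using System.Xml.Linq;

class AnnotationReader
{
    static void Main()
    {
        // The namespace the sap2010 prefix is usually bound to in workflow XAML.
        XNamespace sap2010 = "http://schemas.microsoft.com/netfx/2010/xaml/activities/presentation";

        XDocument doc = XDocument.Load("workflow.xaml");

        // Every element carrying a sap2010:Annotation.AnnotationText attribute.
        var annotated = doc.Descendants()
            .Where(e => e.Attribute(sap2010 + "Annotation.AnnotationText") != null);

        foreach (var e in annotated)
        {
            Console.WriteLine("Text: " + (string)e.Attribute(sap2010 + "Annotation.AnnotationText"));
            Console.WriteLine("ID: " + (string)e.Attribute("ID"));
            Console.WriteLine("Created: " + (string)e.Attribute("CreatedOn"));
            Console.WriteLine("Modified: " + (string)e.Attribute("LastModifiedOn"));
        }
    }
}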
I am trying to automate some testing for one of our web applications and I need to know how I can make my Coded UI project read data from a CSV file. Let's say I want to test a log in screen. My CSV file will contain a few user names and passwords. I want my Coded UI test to read these log in details and loop through them to run the test on each set of data.
The web has many tutorials on data driving Coded UI tests. The basic steps for data driving with a CSV file are as follows.
Create the CSV file.
Add the CSV file to the project.
Make sure the CSV file is deployed.
Add the CSV file as a data source for an individual test.
Read the CSV fields and use them in the test.
The detailed steps, with some variations, are explained below.
Visual Studio 2010 has a "data source wizard" that does some of these steps. Visual Studio versions 2012 and 2013 do not have the wizard and so all the steps have to be done manually.
Create the CSV file
One way is to create the file in a spreadsheet then save it as Comma Separated Values. Another way is to use a text editor and just write the file. I use a spreadsheet program for big data source files and a text editor for creating small files. Some editors add a byte order mark (BOM) at the start of the file; it becomes part of the first field name of the CSV, which appears to make that field unreadable. See this page for more about the BOM.
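For example, a small data source whose header row matches the field names used in the code later in this answer might look like this (the column names ValueOne, ValueTwo and Result are just illustrative):
ValueOne,ValueTwo,Result
alpha,beta,expected1
gamma,delta,expected2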
Add the CSV file to the project
Use the context menu in solution explorer, select Add -> Existing Item. Then browse to the required file. Note the file filter will probably need to be altered to be *.* or *.csv.
Make sure the CSV file is deployed
Open the properties panel for the CSV file from solution explorer. Set "Copy to output directory" to "Copy if newer" or to "Copy always". Some documents recommend "Copy if newer" but I prefer "Copy always" as occasionally a file was not copied as I expected. The difference between the two copy methods is a little disk space and a little time, but disks are normally big and the time to copy is normally small. Any savings are, in my opinion, far outweighed by being sure that the file will be copied correctly.
Add the CSV file as a data source for an individual test
Replace the [TestMethod] attribute with the correct data source line. This Microsoft blog shows the replacement code for several possible data source file types. For CSV use:
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV",
"|DataDirectory|\\data.csv", "data#csv",
DataAccessMethod.Sequential), DeploymentItem("data.csv"),
TestMethod]
Note that the file name occurs three times, and one copy has a '#' rather than a '.'. I have not found any useful documentation about the different fields of the DataSource(...) attribute, so I cannot advise further on how to choose values for non-CSV data sources.
The |DataDirectory| part above is replaced by the directory where files are deployed when the tests run. The whole file name within the string quotes could be replaced by a full path name of a file, if required.
Read the CSV fields and use them in the test
The Coded UI record and generate tool creates classes with fields that hold values entered into text boxes or used in assertions. Each action method has a ...Params class and each assert method has an ...ExpectedValues class, where the ... is the method name. The default values of these fields are the values used when the test was recorded. The recorded values can be overwritten by an assignment before the action or assertion method is called. The fields of the current row of the data source are accessed from TestContext.DataRow[...].
Suppose a Coded UI test has an EnterValue method that writes text into two fields of the screen and it also has a CheckResult method that asserts one field. The test method might then be written as follows.
[DataSource...
TestMethod]
public void CodedUITestMethod1()
{
this.UIMap.EnterValueParams.UIItem0TextSendKeys = TestContext.DataRow["ValueOne"].ToString();
this.UIMap.EnterValueParams.UIItem1TextSendKeys = TestContext.DataRow["ValueTwo"].ToString();
this.UIMap.EnterValue();
this.UIMap.CheckResultExpectedValues.UIItem0TextDisplayText = TestContext.DataRow["Result"].ToString();
this.UIMap.CheckResult();
}
The ...Params and ...ExpectedValues classes allow the test to create values when the test runs. For example, if the EnterValue method also wanted to write tomorrow's date into a field we could add the following line before it is called:
this.UIMap.EnterValueParams.UIItem2TextSendKeys = DateTime.Now.AddDays(1).ToString("yyyy-MM-dd");
Add the DataSource attribute to the Coded UI test:
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "|DataDirectory|\\data.csv", "data#csv", DataAccessMethod.Sequential), DeploymentItem("data.csv"), TestMethod]
Note: this data source driver determines the cell type based on the data in the first data row. If you have a column that should be treated as a string, but the first data row contains a number, e.g. 1234, the following rows will be returned as 0 if they are not empty.
Hope this link may help you:
http://blogs.msdn.com/b/mathew_aniyan/archive/2009/03/17/data-driving-coded-ui-tests.aspx
You don't need to go into test view. Simply replace your [TestMethod] with the below script:
[DataSource("Microsoft.VisualStudio.TestTools.DataSource.CSV", "|DataDirectory|\\LoginInfo.csv", "Sheet$1", DataAccessMethod.Sequential), DeploymentItem("LoginInfo.csv"), TestMethod]
From there, change the LoginInfo.csv to the name of your .csv file. To reference your data just use:
// Username and Password are Column Headers
UIMap.LoginParams.UserNameTextBox = TestContext.DataRow["UserName"].ToString();
UIMap.LoginParams.PasswordTextBox = TestContext.DataRow["Password"].ToString();
UIMap.Login();
This will take the item in each column and use it sequentially in each test.
I have the code below to copy VBA code from one Word document to another (I'm using C#). It works for modules; however, I can't seem to get it to work with userforms.
// Get the source component and create a component of the same type in the target document.
VBComponent sourceVBC = GetSourceDocVB();
VBComponent targetVBC = document.VBProject.VBComponents.Add(sourceVBC.Type);
// Copy the code text across and keep the original component name.
string codes = sourceVBC.CodeModule.get_Lines(1, sourceVBC.CodeModule.CountOfLines);
targetVBC.CodeModule.AddFromString(codes);
targetVBC.Name = sourceVBC.Name;
Yes, the userform is copied to the target document, but its fields are not. For example, if it contains labels and textboxes, those controls are not copied. Am I missing something here?
Yes, you are missing something. Forms are not defined in the code file only; they need a binary file too.
You don't say anything about the way the source files are generated. Normally, in VBA, you use the Export method of the VBComponent object. Of course, you can also do it manually by going to the VBA editor in Word, right-clicking the project component and selecting "Export". If you have a look into the export folder, you'll see that a form is saved as two files: "Form1.frm" (containing the code) and "Form1.frx" (containing binary form data, such as labels and other controls).
In the other project, you can manually use the File, Import function, which takes care of the binary definition when you import a form.
In VBA, you may use something like this to export from a project:
For Each vbC In ActiveDocument.VBProject.VBComponents
Select Case vbC.Type
Case vbext_ct_StdModule
strVbcExt = ".bas"
Case vbext_ct_ClassModule
strVbcExt = ".cls"
Case vbext_ct_MSForm
strVbcExt = ".frm"
Case Else
End Select
strvbCName = vbC.Name
strFilename = strPath & "\" & strvbCName & strVbcExt
vbC.Export strFilename
(omitted the rest)
And to import you'll use
ActiveDocument.VBProject.VBComponents.Import strFilename
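Back in C#, the same export/import round trip can be sketched with the interop objects from the question's snippet (Microsoft.Vbe.Interop); the temp-file path here is just illustrative, and GetSourceDocVB() and document are the question's own helpers:
using System.IO;
using Microsoft.Vbe.Interop;

// Export writes Form1.frm plus the binary Form1.frx next to it;
// Import reads both back, recreating the controls on the form.
VBComponent sourceVBC = GetSourceDocVB();
string exportPath = Path.Combine(Path.GetTempPath(), sourceVBC.Name + ".frm");
sourceVBC.Export(exportPath);
VBComponent targetVBC = document.VBProject.VBComponents.Import(exportPath);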
How do I import data into Excel from a CSV file using C#? What I want to achieve is similar to what we do in Excel: you go to the Data tab, select the From Text option, use the Text to Columns option, select CSV, and it does the magic and all that stuff. I want to automate it.
If you could point me in the right direction, I'd really appreciate it.
EDIT: I guess I didn't explain it well. What I want to do is something like
Excel.Application excelApp;
Excel.Workbook excelWorkbook;
// open excel
excelApp = new Excel.Application();
// something like
excelWorkbook.ImportFromTextFile(); // is what I need
I want to import that data into Excel, not my own application. As far as I know, I shouldn't have to parse the CSV myself and then insert the values into Excel; Excel does that for us. I simply need to know how to automate that process.
I think you're overcomplicating things. Excel automatically splits data into columns by comma delimiters if it's a CSV file. So all you should need to do is ensure your extension is CSV.
I just tried opening a file quickly in Excel and it works fine. So what you really need is just to call Workbooks.Open() with a file that has a CSV extension.
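A minimal sketch of that idea using the same interop objects as the question (the file paths are placeholders, and saving back out as .xlsx is optional):
using Excel = Microsoft.Office.Interop.Excel;

// Opening a .csv lets Excel split the data into columns automatically.
Excel.Application excelApp = new Excel.Application();
Excel.Workbook workbook = excelApp.Workbooks.Open(@"C:\data\input.csv");
// Optionally save it back out as a normal workbook.
workbook.SaveAs(@"C:\data\output.xlsx", Excel.XlFileFormat.xlOpenXMLWorkbook);
workbook.Close();
excelApp.Quit();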
You could open Excel, start recording a macro, do what you want, then see what the macro recorded. That should tell you what objects to use and how to use them.
I believe there are two parts: one is the split operation for the CSV, which the other responder has already picked up on and which I don't think is essential, but I'll include it anyway. The big one is writing to the Excel file, which I was able to get working, but only under specific circumstances, and it was a pain to accomplish.
CSV is pretty simple: you can do a string.Split on a comma separator if you want. However, this method is horribly broken, albeit I'll admit I've used it myself, mainly because I also have control over the source data and know that no quotes or escape characters will ever appear. I've included a link to an article on proper CSV parsing; however, I have never tested the source or fully audited the code myself. I have used other code by the same author with success. http://www.boyet.com/articles/csvparser.html
The second part is a lot more complex and was a huge pain for me. The approach I took was to use the Jet driver to treat the Excel file like a database and then run SQL queries against it. There are a few limitations, which may mean this doesn't fit your goal. I was looking to use prebuilt Excel file templates to basically display data and some preset functions and graphs. To accomplish this I have several tabs of report data and one tab which is raw_data. My program writes to the raw_data tab, and all the other tabs' calculations point to cells in this table. I'll go into some of the reasoning for this behavior after the code:
First off, the imports (not all may be required, this is pulled from a larger class file and I didn't properly comment what was for what):
using System.IO;
using System.Diagnostics;
using System.Data.Common;
using System.Globalization;
Next we need to define the connection string. My class already has a FileInfo reference to the file I want to use at this point, so that's what I pass in. You can search Google for what all the parameters are for, but basically this uses the Jet driver (which should be available on any Windows install) to open an Excel file as if it were a database.
string connectString = @"Provider=Microsoft.Jet.OLEDB.4.0;Data Source={filename};Extended Properties=""Excel 8.0;HDR=YES;IMEX=0""";
connectString = connectString.Replace("{filename}", fi.FullName);
Now let's open up the connection to the DB, and be ready to run commands on the DB:
DbProviderFactory factory = DbProviderFactories.GetFactory("System.Data.OleDb");
using (DbConnection connection = factory.CreateConnection())
{
connection.ConnectionString = connectString;
using (DbCommand command = connection.CreateCommand())
{
connection.Open();
Next we need the actual logic for the DB insertion. Basically, throw queries into a loop, or whatever your logic is, and insert the data row by row.
string query = "INSERT INTO [raw_aaa$] (correlationid, ipaddr, somenum) VALUES (\"abcdef\", \"1.1.1.1\", 10)";
command.CommandText = query;
command.ExecuteNonQuery();
Now here's the really annoying part: the Excel driver tries to detect your column type before the insert, so even if you pass a proper integer value, if Excel thinks the column type is text, it will insert all your numbers as text, and it's very hard to get them treated as numbers. As such, Excel must already have the column typed as a number. To accomplish this, in my template file I fill in the first 10 rows with dummy data, so that when you load the file in the Jet driver, it can detect the proper types and use them. Then all my formulas that point at this table operate properly, since the values are of the right type. This may work for you if your goals are similar to mine and you use templates that already point to this data (just start at row 10 instead of row 2).
Because of this, my raw_aaa tab in excel might look something like this:
correlationid ipaddr somenum
abcdef 1.1.1.1 5
abcdef 1.1.1.1 5
abcdef 1.1.1.1 5
abcdef 1.1.1.1 5
abcdef 1.1.1.1 5
abcdef 1.1.1.1 5
abcdef 1.1.1.1 5
abcdef 1.1.1.1 5
Note row 1 is the column names that I referenced in my sql queries. I think you can do without this, but that will require a little more research. By already having this data in the excel file, the somenum column will be detected as a number, and any data inserted will be properly treated as such.
Another note that makes this annoying: the Jet driver is 32-bit only, so in my case, where I had an explicitly 64-bit program, I was unable to execute this directly. So I had the nasty hack of writing to a file and then launching a program that would insert the data from the file into my Excel template.
All in all, I think the solution is pretty nasty, but thus far I haven't found a better way to do this, unfortunately. Good luck!
You can take a look at TakeIo.Spreadsheet .NET library. It accepts files from Excel 97-2003, Excel 2007 and newer, and CSV format (semicolon or comma separators).
Example:
var inputFile = new FileInfo("Book1.csv"); // could be .xls or .xlsx too
var sheet = Spreadsheet.Read(inputFile);
foreach (var row in sheet)
{
foreach (var cell in row)
{
// do something
}
}
You can remove leading and trailing empty rows, and also leading and trailing empty columns, from the imported data using the Normalize() function:
sheet.Normalize();
Sometimes you can find that your imported data contains empty rows between data, so you can use another helper for this case:
sheet.RemoveEmptyRows();
There is a Serialize() function to convert any input to CSV too:
var outfile = new StreamWriter("AllData.csv");
sheet.Serialize(outfile);
If you like to use comma instead of the default semicolon separator in your CSV file, do:
sheet.Serialize(outfile, ',');
And yes, there is a ToString() function too...
This package is available at NuGet too, just take a look at TakeIo.Spreadsheet.
You can use ADO.NET
http://vbadud.blogspot.com/2008/09/opening-comma-separate-file-csv-through.html
Well, importing from CSV shouldn't be a big deal. I think the most basic method would be to do it with string operations. You could build a pretty decent parser using the simple Split() method and getting the fields into arrays.
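For what it's worth, a bare-bones sketch of that approach (it assumes a data.csv file with no quoted fields, embedded commas or line breaks):
using System;
using System.IO;

class SimpleCsvReader
{
    static void Main()
    {
        // Split each line on commas; adequate only when fields never
        // contain quotes, commas or line breaks.
        foreach (string line in File.ReadLines("data.csv"))
        {
            string[] fields = line.Split(',');
            Console.WriteLine(string.Join(" | ", fields));
        }
    }
}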