Simplest possible key/value pair file parsing in .NET - c#

My project requires a file where I will store key/value pair data that should be able to be read and modified by the user. I want the program to just expect the keys to be there, and I want to parse them from the file as quickly as possible.
I could store them in XML, but XML is way too complex, and it would require traversing nodes and child nodes and so on; all I want is some class that takes a file and generates key/value pairs. I want as little error handling as possible, and I want it done with as little code as possible.
I could code a class like that myself, but I'd rather learn how it's done in the framework than reinvent the wheel. Is there some built-in magic class in .NET (3.5) that can do this?
MagicClass kv = new MagicClass("Settings.ini"); // It doesn't necessarily have to be an INI file, it can be any simple key/value pair format.
string Value1 = kv.get("Key1");
...

If you're looking for a quick, easy function and don't want to use the .NET app/user config files or worry about the serialization issues that sometimes occur over time, the following static function can load a file formatted as KEY=VALUE lines.
public static Dictionary<string, string> LoadConfig(string settingfile)
{
    var dic = new Dictionary<string, string>();
    if (File.Exists(settingfile))
    {
        var settingdata = File.ReadAllLines(settingfile);
        for (var i = 0; i < settingdata.Length; i++)
        {
            var setting = settingdata[i];
            var sidx = setting.IndexOf("=");
            if (sidx >= 0)
            {
                var skey = setting.Substring(0, sidx);
                var svalue = setting.Substring(sidx + 1);
                if (!dic.ContainsKey(skey))
                {
                    dic.Add(skey, svalue);
                }
            }
        }
    }
    return dic;
}
Note: I'm using a Dictionary, so keys must be unique, which is usually the case with settings.
USAGE:
var settingfile = AssemblyDirectory + "\\mycustom.setting";
var settingdata = LoadConfig(settingfile);
if (settingdata.ContainsKey("lastrundate"))
{
    DateTime lout;
    string svalue;
    if (settingdata.TryGetValue("lastrundate", out svalue))
    {
        DateTime.TryParse(svalue, out lout);
        lastrun = lout;
    }
}

Use the KeyValuePair class for your Key and Value, then just serialize a List to disk with an XmlSerializer.
That would be the simplest approach, I feel. You wouldn't have to worry about traversing nodes; calling the Deserialize method will do that for you. The user could also edit the values in the file if they wish.
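A minimal sketch of that idea. One caveat: KeyValuePair<TKey, TValue> has read-only properties, so XmlSerializer won't round-trip it; a small serializable pair class (here called Setting, a name chosen just for illustration) works instead:

using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class Setting
{
    public string Key;
    public string Value;
}

public static class SettingsFile
{
    public static void Save(string path, List<Setting> settings)
    {
        var serializer = new XmlSerializer(typeof(List<Setting>));
        using (var stream = File.Create(path))
            serializer.Serialize(stream, settings);
    }

    public static List<Setting> Load(string path)
    {
        var serializer = new XmlSerializer(typeof(List<Setting>));
        using (var stream = File.OpenRead(path))
            return (List<Setting>)serializer.Deserialize(stream);
    }
}

The resulting XML is plain enough for a user to edit by hand, and Load gives you the list back without any node traversal.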

I don't know of any built-in class for parsing INI files. I've used Nini when I needed to do so. It's licensed under the MIT/X11 license, so there is no issue including it in a closed-source program.
It's very easy to use. So if you have a Settings.ini file formatted this way:
[Configuration]
Name = Jb Evain
Phone = +330101010101
Using it would be as simple as:
var source = new IniConfigSource ("Settings.ini");
var config = source.Configs ["Configuration"];
string name = config.Get ("Name");
string phone = config.Get ("Phone");

If you want the user to be able to read and modify the file, I suggest a comma-delimited pair, one per line:
key1,value1
key2,value2
...
Parsing is simple: read the file, split at newline or comma, then take the elements in pairs.
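A rough sketch of that parsing, assuming keys and values contain no commas and keys are unique (settings.txt is just a placeholder file name):

// Read "key,value" lines into a dictionary (needs System.IO and System.Collections.Generic).
var pairs = new Dictionary<string, string>();
foreach (var line in File.ReadAllLines("settings.txt"))
{
    var parts = line.Split(',');
    if (parts.Length == 2)
        pairs[parts[0].Trim()] = parts[1].Trim();
}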

Format the file this way:
key1=value1
key2=value2
Read the entire file into a string (File.ReadAllText is the convenience function that does that), split it into lines, and call string.Split('=') on each line. Make sure you also call string.Trim() on each key and value as you traverse the list and pop each pair into a hashtable or dictionary.


C# getting array with a string name?

So here's a hypothetical. From someone fairly new to the whole C# and Unity thing:
Suppose for a moment that I have a series of string[] arrays, all of which follow a similar naming convention. For example:
public string[] UndeadEntities =
{
    // stuff
};
public string[] DemonEntities =
{
    // stuff
};
Now suppose I want to call one of them at random. I have another list that contains the names of all of those arrays, and I pick from it at random.
My problem is that I grab the name from that list and it's a string, not something I can use directly. So my question is this:
Is there any way for me to use this string to call the above-mentioned arrays?
Something like this is what I'm up to, but I'm unsure where to go from here, and I really would like to avoid making a massive series of if/else statements just for that.
public string[] EnemiesType = { /* list of all the other arrays */ };
public string enemiesTypeGeneratedArrayName = "";

public void GenerateEncounterGroup()
{
    enemiesTypeGeneratedArrayName = EnemiesType[Random.Range(0, 12)];
}
Can I nest arrays inside of other arrays? Is there another alternative?
I'm not sure if it is possible at all but if it is, I'll take any pointers as to where to go from there. Thanks.
There are several solutions to your specific problem; an easy one is using dictionaries:
A Dictionary is a data structure where you have a key (usually a string) and a value (whatever type you may want to store).
What you can do is, at start, initialize a Dictionary where each key is your enemy type and the value it stores is your array, something like:
Dictionary<string, string[]> enemyArrays = new Dictionary<string, string[]>();

void Start()
{
    enemyArrays["typeA"] = myArrayA;
    enemyArrays["typeB"] = myArrayB;
}
Then when you need to get that array, just:
enemiesTypeGeneratedArrayName = EnemiesType[Random.Range(0, 12)];
string[] myRandomArray = enemyArrays[enemiesTypeGeneratedArrayName];
string randomEnemy = myRandomArray[index];
Here you can read more about Dictionary class if you want.
There are other ways to do it, but I think this one is pretty easy to implement in the code you already made, and dictionaries are cool haha.
I hope it's clear :)

Importing different files from Excel with different rules

I have recently been tasked with writing a piece of software that will import Excel files.
The problem I am trying to solve is that my company has around 100 clients and each supplies a file in a different layout, in so much as the columns in a file will differ between clients, but the pertinent information is there in each file.
This process is complicated due to the fact that certain operations need to be done to different files.
In one file, for example, a column needs to be inserted after a specific column, and then the result of a calculation needs to be placed into that column. In that same sheet, an address is supplied across 9 columns; this address needs to be moved into the last 6 of the 9 columns and then have the first 3 columns removed.
What I don't want to do is write the processing logic for each file (around 100, as mentioned) and thereby get trapped into the drudge of having to maintain this code and be responsible for adding new customer files as they come in.
What I want to do is create a Rule or Processing engine of sorts whereby I can have basic rules like "Insert Column", "Remove Column", "Insert Calculation", "Format a, b, c, d, e & f Columns To Use d, e & f" - the idea being that the reading and processing of any new file can be configured through a front-end piece of software by an end user (obviously with some training on what to do).
Is there a pattern or strategy that might fit this? I have read about rules engines, but the best examples of these are simple boolean comparisons like "Age = 15" or "Surname = 'Smith'", and I can't find a decent example of doing something like "Insert Column after Column G" then "Put G - 125 into Column H".
Any help here, or a pointer to a good approach, would be greatly appreciated.
Let me see if I can help you out here.
Correct me if I am wrong, but it seems like all your input and output files contain data in columns and columns only.
In that case, you should imagine your problem as a transformation of X input columns to Y output columns. For each client, you will need a configuration that will specify the transform. The configuration might look like this:
Y1 = X1
Y2 = X1 + X2
Y3 = X3 + " some string"
As you can see, your configuration lines are simply C# expressions. You can use the LINQ Expression class to build an expression tree from your transformation formulas. You can learn about Expressions here. These expressions can then be compiled and used to do the actual transform. If you think in terms of C#, you will build a static transform method that takes a list as input and returns a list as output for each client. When you use Expressions, you will have to parse the configuration files yourself.
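As a rough illustration of the Expression route, here is a sketch where the formula (the equivalent of Y = X1 + X2) is hard-coded; a real engine would build the tree from the parsed configuration lines instead:

using System;
using System.Linq.Expressions;

class ExpressionTransformSketch
{
    static void Main()
    {
        // "input" stands for one row of columns read from the client file.
        ParameterExpression input = Expression.Parameter(typeof(string[]), "input");

        // Build the equivalent of: input[0] + input[1]
        Expression x1 = Expression.ArrayIndex(input, Expression.Constant(0));
        Expression x2 = Expression.ArrayIndex(input, Expression.Constant(1));
        Expression body = Expression.Call(
            typeof(string).GetMethod("Concat", new[] { typeof(string), typeof(string) }),
            x1, x2);

        // Compile the tree into a delegate once, then reuse it for every row.
        Func<string[], string> transform =
            Expression.Lambda<Func<string[], string>>(body, input).Compile();

        Console.WriteLine(transform(new[] { "AB", "123" })); // prints AB123
    }
}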
You can also use the Roslyn Compiler Services, which can support proper C# syntax. This way, you can literally have a static method which can do the transform. This also relieves you of the parsing duties.
In either case, you will still have to deal with things like: should I expect the columns to be strings (which means your support staff need to explicitly instruct the configuration GUI to parse the needed columns into numbers), or should I automatically convert number-like fields into numbers (now support doesn't have to do extra configuration, but they might hit issues with columns that look like numbers, such as IDs, but should be treated as strings to avoid improper handling), etc.
In Summary, my approach is:
Create config file per client.
Convert the config file into C# method dynamically using Expressions or Roslyn
Provide a GUI for generating this config - this way the support person can easily specify the transform without knowing your special syntax (Expressions) or C# syntax (Roslyn). When saving config, you can generate one method per client in a single assembly (or separate assembly per client) and persist it. Let's call it client library.
Your main application can do all the standard stuff of reading from excel, validating, etc and then call the client library method to generate the output in a standard format, which can be further processed in your main application.
Hope you got the gist.
Edit: Adding some code to demonstrate. The code is a bit long-winded, but commented for understanding.
// this data represents your excel data
var data = new string[][] {
    new string [] { "col_1_1", "10", "09:30" },
    new string [] { "col_2_1", "12", "09:40" }
};

// you should read this from your client specific config file/section
// Remember: you should provide a GUI tool to build this config
var config = @"
    output.Add(input[0]);
    int hours = int.Parse(input[1]);
    DateTime date = DateTime.Parse(input[2]);
    date = date.AddHours(hours);
    output.Add(""Custom Text: "" + date);
";

// this template code should be picked up from a
// non client specific config file/section
var code = @"
    using System;
    using System.Collections.Generic;
    using System.Linq;

    namespace ClientLibrary {
        static class ClientLibrary {
            public static List<string> Client1(string[] input) {
                var output = new List<string>();
                <<code-from-config>>
                return output;
            }
        }
    }
";

// Inject client configuration into template to form full code
code = code.Replace(@"<<code-from-config>>", config);

// Compile your dynamic method and get a reference to it
var references = new MetadataReference[] {
    MetadataReference.CreateFromFile(typeof(object).Assembly.Location),
    MetadataReference.CreateFromFile(typeof(Enumerable).Assembly.Location)
};

CSharpCompilation compilation = CSharpCompilation.Create(
    "ClientLibrary",
    syntaxTrees: new[] { CSharpSyntaxTree.ParseText(code) },
    references: references,
    options: new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

MethodInfo clientMethod = null;
using (var ms = new MemoryStream()) {
    EmitResult result = compilation.Emit(ms);
    if (!result.Success) {
        foreach (Diagnostic diagnostic in result.Diagnostics) {
            Console.Error.WriteLine("{0}: {1}", diagnostic.Id, diagnostic.GetMessage());
        }
    } else {
        ms.Seek(0, SeekOrigin.Begin);
        Assembly assembly = Assembly.Load(ms.ToArray());
        clientMethod = assembly.GetType("ClientLibrary.ClientLibrary").GetMethod("Client1");
    }
}

if (clientMethod == null)
    return;

// Do transformation
foreach (string[] row in data) {
    var output = clientMethod.Invoke(null, new object[] { row }) as List<string>;
    Console.WriteLine(string.Join("|", output));
}
You will need some nuget libraries to compile this, and their corresponding using clauses
nuget install Microsoft.Net.Compilers # Install C# and VB compilers
nuget install Microsoft.CodeAnalysis # Install Language APIs and Services
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;
using Microsoft.CodeAnalysis.Emit;
As you notice, the only piece to worry about is the GUI to auto-generate the code for the transformation - which I have not provided here. If you want simple transforms, that should be very easy, but for a complex transform, it will be more involved.
It sounds like you're expecting your end users to be technically savvy enough to understand the configuration mechanism that you're going to write. If they can handle that level of technical detail, it might be simpler to give them an Excel book and an official Excel template that contains all the columns your import app needs, and let them manually massage the data to the spec.
Otherwise, I would suggest a strategy-pattern-based solution to build a library of "data massager" classes for known formats, and just add new classes as new formats are encountered, e.g.
public interface IClientDataImporter
{
    List<MyCustomRowStructure> Import(string filename);
}

// client 1 importer
public class ClientOneImporter : IClientDataImporter
{
    public List<MyCustomRowStructure> Import(string filename)
    {
        var result = new List<MyCustomRowStructure>();
        // ..... insert custom logic here
        return result;
    }
}

// client 2 importer
public class ClientTwoImporter : IClientDataImporter
{
    public List<MyCustomRowStructure> Import(string filename)
    {
        var result = new List<MyCustomRowStructure>();
        // ..... insert custom logic here
        return result;
    }
}

// repeat up to however many formats you need
// then.....

public class ExcelToDatabaseImporter
{
    public void ImportExcelFile(string filename, string clientName)
    {
        var myValidData = GetClientDataImporter(clientName).Import(filename);
        StickMyDataToMyDatabase(myValidData); // this is where you would load the structure into the db... won't need to touch every time a new format is encountered
    }

    public IClientDataImporter GetClientDataImporter(string clientName)
    {
        switch (clientName)
        {
            case "ClientOne":
                return new ClientOneImporter();
            case "ClientTwo":
                return new ClientTwoImporter();
            default:
                throw new ArgumentException("No importer for client");
        }
    }
}
I would suggest maintaining an XML configuration file for each Excel file. The XML configuration has to be read by a tool, maybe a console application, which generates a new CSV file based on the XML configuration.
As the XML configuration file can be easily edited in any text editor, users can update it themselves.
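For example, a hypothetical per-client rule file could be read with LINQ to XML; the element and attribute names below are invented purely for illustration:

using System;
using System.Linq;
using System.Xml.Linq;

class XmlRuleConfigSketch
{
    static void Main()
    {
        // A made-up per-client rule file; the console tool would apply these
        // rules to the client's sheet before writing the CSV output.
        var xml = XDocument.Parse(@"
            <client name='ClientOne'>
              <rule type='InsertColumn' after='G' formula='G - 125' />
              <rule type='RemoveColumn' column='C' />
            </client>");

        foreach (var rule in xml.Root.Elements("rule"))
        {
            Console.WriteLine("{0}: {1}",
                rule.Attribute("type").Value,
                string.Join(", ", rule.Attributes()
                                      .Where(a => a.Name != "type")
                                      .Select(a => a.Name + "=" + a.Value)));
        }
    }
}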

How to organize a large number of file/directory path constants

I have a static class where I keep a large number of relative paths that are used in different places in my application. It looks like this:
static class FilePathConstants
{
public const string FirstDirectory = "First";
public const string FirstSecondDirectory = "First/Second";
public const string FirstSecondThirdFileA = "First/Second/Third/FileA";
public const string FirstSecondFourthFileB = "First/Second/Fourth/FileB";
... nearly 100 of similar members
}
All of them are relative to some parent directory, the location of which I know only at run time. I need to keep them all together because it allows me to easily control what files are used by my application and change their locations from time to time.
However, even though they are organized in alphabetical order and it's easy to find a certain path, I need to be able to change some of them depending on some setting. Let's say there is a setting 'bool SettingA', and when I turn it on, I have to modify some of the paths to use a different directory or a different file name.
The problem is that now I can't use constants; I have to rewrite my code to use properties or methods so that I can change file paths at runtime. And here is where my code becomes much bigger in size and the strict order now looks ugly. Is there a way I can group them so that it will not confuse anybody who uses this code? I can't break them into separate classes because it is difficult to remember which class keeps which constant. For now I'm grouping them by regions, but I have a bad feeling that keeping more than one hundred properties in one class is wrong.
Edit:
All directories and files that I declare in FilePathConstants are used in a large number of places in the application (each path can be used multiple times, and with more than one hundred paths, this adds up to a large number). I would like to keep the interface of this class the same, or with minimum changes to the other classes that use it.
Maybe you could use rowstructs.
Use something like an "index" file to store the directory paths and load it at runtime.
const string indexFilePath = @"C:\dirlist.txt";
IEnumerable<string> paths = File.ReadAllLines(indexFilePath);
Update
I would like to suggest using indirection - a "mapper" class.
Here is what it might look like.
public enum FileSystemElement
{
    FirstDirectory,
    FirstSecondDirectory,
    FirstSecondThirdFileA,
    FirstSecondFourthFileB
}

public class FileSystemMapper
{
    private readonly string _rootDirectory;
    private readonly Dictionary<FileSystemElement, string> _fileElements;

    public FileSystemMapper(string rootDirectory, string fileName)
    {
        _rootDirectory = rootDirectory;
        string[] lines = File.ReadAllLines(fileName);
        _fileElements = lines.Select(ParsePair).ToDictionary(pair => pair.Key, pair => pair.Value);
    }

    public string GetPath(FileSystemElement element)
    {
        string relativePath;
        if (!_fileElements.TryGetValue(element, out relativePath))
        {
            throw new InvalidOperationException("Element not found");
        }
        string resultPath = Path.Combine(_rootDirectory, relativePath);
        return resultPath;
    }

    private static KeyValuePair<FileSystemElement, string> ParsePair(string line)
    {
        const string separator = "|";
        // File element alias | Path
        if (string.IsNullOrEmpty(line))
            throw new ArgumentException("Null or empty line", "line");
        string[] components = line.Split(new[] { separator }, StringSplitOptions.RemoveEmptyEntries);
        if (components.Length != 2)
            throw new ArgumentException("Line has invalid format", "line");
        FileSystemElement element;
        bool parseResult = Enum.TryParse(components[0], out element);
        if (!parseResult)
            throw new ArgumentException("Invalid element name", "line");
        string path = components[1]; // for clarity
        return new KeyValuePair<FileSystemElement, string>(element, path);
    }
}
Client example:
FileSystemMapper fileSystemMapper = new FileSystemMapper(@"C:\root", @"C:\dirs.txt");
string firstDirectory = fileSystemMapper.GetPath(FileSystemElement.FirstDirectory);
string secondDirectory = fileSystemMapper.GetPath(FileSystemElement.FirstSecondDirectory);
string secondThirdFile = fileSystemMapper.GetPath(FileSystemElement.FirstSecondThirdFileA);
Index file format: <Element name>|<Path><New Line>
Example:
FirstDirectory|First
FirstSecondDirectory|First\Second
FirstSecondThirdFileA|First\Second\Third\FileA
FirstSecondFourthFileB|First\Second\Fourth\FileB
Can you not use your project's Properties.Settings? It's stored in the .config file, so it can be edited after deployment.
Or just don't make them const; then you can edit them at runtime, but they revert to the original values on the next run.
Or don't make the class static and create an instance each time you use it, then change what's needed and discard the instance when finished.
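Going back to the Properties.Settings suggestion, a minimal sketch, assuming a user-scoped string setting named FirstDirectory has been added in the project's Settings designer (Project > Properties > Settings):

// Read the setting through the designer-generated class.
string firstDirectory = Properties.Settings.Default.FirstDirectory;

// User-scoped settings can also be changed and persisted at runtime.
Properties.Settings.Default.FirstDirectory = @"First\Renamed";
Properties.Settings.Default.Save(); // written to the user.config file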

Importing data files using generic class definitions

I am trying to import a file with multiple record definitions in it. Each one can also have a header record, so I thought I would define a definition interface like so.
public interface IRecordDefinition<T>
{
    bool Matches(string row);
    T MapRow(string row);
    bool AreRecordsNested { get; }
    GenericLoadClass ToGenericLoad(T input);
}
I then created a concrete implementation for a class.
public class TestDefinition : IRecordDefinition<Test>
{
    public bool Matches(string row)
    {
        return row.Split('\t')[0] == "1";
    }

    public Test MapRow(string row)
    {
        var columns = row.Split('\t');
        return new Test { val = columns[0].parseDate("ddmmYYYY") };
    }

    public bool AreRecordsNested
    {
        get { return true; }
    }

    public GenericLoadClass ToGenericLoad(Test input)
    {
        return new GenericLoadClass { Value = input.val };
    }
}
However for each File Definition I need to store a list of the record definitions so I can then loop through each line in the file and process it accordingly.
Firstly, am I on the right track, or is there a better way to do it?
I would split this process into two pieces.
First, a specific process to split the file with multiple types into multiple files. If the files are fixed width, I have had a lot of luck with regular expressions. For example, assume the following is a text file with three different record types.
TE20110223 A 1
RE20110223 BB 2
CE20110223 CCC 3
You can see there is a pattern here, hopefully the person who decided to put all the record types in the same file gave you a way to identify those types. In the case above you would define three regular expressions.
string pattern1 = @"^TE(?<DATE>[0-9]{8})(?<NEXT1>.{2})(?<NEXT2>.{2})";
string pattern2 = @"^RE(?<DATE>[0-9]{8})(?<NEXT1>.{3})(?<NEXT2>.{2})";
string pattern3 = @"^CE(?<DATE>[0-9]{8})(?<NEXT1>.{4})(?<NEXT2>.{2})";

Regex Regex1 = new Regex(pattern1);
Regex Regex2 = new Regex(pattern2);
Regex Regex3 = new Regex(pattern3);

StringBuilder FirstStringBuilder = new StringBuilder();
StringBuilder SecondStringBuilder = new StringBuilder();
StringBuilder ThirdStringBuilder = new StringBuilder();

string Line = "";
Match LineMatch;
FileInfo myFile = new FileInfo("yourFile.txt");

using (StreamReader s = new StreamReader(myFile.FullName))
{
    while (s.Peek() != -1)
    {
        Line = s.ReadLine();

        LineMatch = Regex1.Match(Line);
        if (LineMatch.Success)
        {
            //Write this line to a new file
        }

        LineMatch = Regex2.Match(Line);
        if (LineMatch.Success)
        {
            //Write this line to a new file
        }

        LineMatch = Regex3.Match(Line);
        if (LineMatch.Success)
        {
            //Write this line to a new file
        }
    }
}
Next, take the split files and run them through a generic process, which you most likely already have, to import them. This works well because when the process inevitably fails, you can narrow it down to the single record type that is failing and not impact all the record types. Archive the main text file along with the split files and your life will be much easier as well.
Dealing with these kinds of transmitted files is hard, because someone else controls them and you never know when they are going to change. Logging the original file as well as a receipt of the import is very important and shouldn't be overlooked either. You can make that as simple or as complex as you want, but I tend to write a receipt to a db and copy the primary key from that table into a foreign key in the table I have imported the data into, and then never change that data. I like to keep an unmolested copy of the import on the file system as well as on the DB server, because there are inevitable conversion / transformation issues that you will need to track down.
Hope this helps, because this is not a trivial task. I think you are on the right track, but instead of processing/importing each line separately... write them to a separate file. I am assuming this is financial data, which is one of the reasons I think provability at every step is important.
I think the FileHelpers library solves a number of your problems:
Strong types
Delimited
Fixed-width
Record-by-Record operations
I'm sure you could consolidate this into a type hierarchy that could tie in custom binary formats as well.
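For instance, a rough sketch with FileHelpers; the record layout and field names here are invented for illustration:

using System;
using FileHelpers;

// One strongly-typed class per record layout; attributes describe the format.
[DelimitedRecord("\t")]
public class TestRecord
{
    public string RecordType;   // e.g. "1"

    [FieldConverter(ConverterKind.Date, "ddMMyyyy")]
    public DateTime Val;
}

public static class TestRecordLoader
{
    public static TestRecord[] Load(string path)
    {
        var engine = new FileHelperEngine<TestRecord>();
        return engine.ReadFile(path); // one parsed object per line
    }
}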
Have you looked at something using Linq? This is a quick example of Linq to Text and Linq to Csv.
I think it would be much simpler to use "yield return" and IEnumerable to get what you want working. This way you could probably get away with only having 1 method on your interface.
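Roughly along these lines; GenericLoadClass and the match/map logic are placeholders based on the question:

using System.Collections.Generic;
using System.IO;

public class GenericLoadClass
{
    public string Value;
}

public static class RecordReader
{
    // Yields one mapped record per matching line, so the caller can simply
    // foreach over the file; this could be the single method on the interface.
    public static IEnumerable<GenericLoadClass> ReadRecords(string path)
    {
        foreach (string line in File.ReadAllLines(path))
        {
            string[] columns = line.Split('\t');
            if (columns.Length > 1 && columns[0] == "1") // the same "does this line match?" test the question uses
                yield return new GenericLoadClass { Value = columns[1] };
        }
    }
}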

turn javascript array into c# array

Hey. I have this JavaScript file that I'm getting off the web, and it consists of basically several large JavaScript arrays. Since I'm a .NET developer, I'd like for these arrays to be accessible through C#, so I'm wondering if there are any CodePlex contributions or any other methods that I could use to turn the JavaScript arrays into C# arrays that I could work with from my C# code.
like:
var roomarray = new Array(194);
var modulearray = new Array(2055);
var progarray = new Array(160);
var staffarray = new Array(3040);
var studsetarray = new Array(3221);

function PopulateFilter(strZoneOrDept, cbxFilter) {
    var deptarray = new Array(111);
    for (var i = 0; i < deptarray.length; i++) {
        deptarray[i] = new Array(1);
    }
    deptarray[0][0] = "a/MPG - Master of Public Governance";
    deptarray[0][1] = "a/MPG - Master of Public Governance";
    deptarray[1][0] = "a/MBA_Flex MBA 1";
    deptarray[1][1] = "a/MBA_Flex MBA 1";
    deptarray[2][0] = "a/MBA_Flex MBA 2";
    deptarray[2][1] = "a/MBA_Flex MBA 2";
    deptarray[3][0] = "a/cand.oecon";
    deptarray[3][1] = "a/cand.oecon";
and so forth
This is what I'm thinking after looking over the suggestions:
Retrieve the JavaScript file in my C# code by making an HTTP request for it
Paste it together with some code I made myself
From C#, execute a self-made JavaScript function that will turn the JavaScript array into JSON (with help from json.org/json2.js) and output it to a new file
Retrieve the new file in C#, parsing the JSON with the DataContractJsonSerializer, hopefully resulting in a C# array
Does that sound doable to you guys?
I'm not in front of a computer with C# right now, so I'm not able to fully try this.
What you're going to need to do, @Jakob, is the following:
Write a parser that will download the file and store it in memory.
For each section that you want to "parse" into a C# array (for example zonearray), you need to set up bounds to begin and end searching the file. Example: we know that zonearray starts building the array two lines after zonearray[i] = new Array(1); and ends on zonearray.sort().
So with these bounds we can then zip through each line between them and parse a C# array. This is simple enough, I think, that you can figure it out. You'll need to keep track of the sub-index as well, remember.
Repeat steps 2-3 for each array you want to parse (zonearray, roomarray, etc.).
If you can't quite figure out how to code the bounds or how to parse the lines and dump them into arrays, I might be able to write something tomorrow (even though it's a holiday here in Canada).
EDIT: It should be noted that you can't use some JSON parser for this; you have to write your own. It's not really that difficult to do; you just need to break it into small steps (first figure out how to zip through each line and find the right "bounds").
HTH
EDIT: I just spent ~20 minutes writing this up for you. It should parse the file and load each array into a List<string[]>. I've heavily commented it so you can see what's going on. If you have any questions, don't hesitate to ask. Cheers!
private class SearchBound
{
    public string ArrayName { get; set; }
    public int SubArrayLength { get; set; }
    public string StartBound { get; set; }
    public int StartOffset { get; set; }
    public string EndBound { get; set; }
}

public static void Main(string[] args)
{
    //
    // NOTE: I used FireFox to determine the encoding that was used.
    //
    List<string> lines = new List<string>();

    // Step 1 - Download the file and dump all the lines of the file to the list.
    var request = WebRequest.Create("http://skema.ku.dk/life1011/js/filter.js");
    using (var response = request.GetResponse())
    using (var stream = response.GetResponseStream())
    using (var reader = new StreamReader(stream, Encoding.GetEncoding("ISO-8859-1")))
    {
        string line = null;
        while ((line = reader.ReadLine()) != null)
        {
            lines.Add(line.Trim());
        }
        Console.WriteLine("Download Complete.");
    }

    var deptArrayBounds = new SearchBound
    {
        ArrayName = "deptarray",                     // The name of the JS array.
        SubArrayLength = 2,                          // In the JS, the sub array is defined as "new Array(X)" and should always be X+1 here.
        StartBound = "deptarray[i] = new Array(1);", // The line that should *start* searching for the array values.
        StartOffset = 1,                             // The StartBound + some number line to start searching the array values.
                                                     // For example: the next line might be a '}' so we'd want to skip that line.
        EndBound = "deptarray.sort();"               // The line to stop searching.
    };
    var zoneArrayBounds = new SearchBound
    {
        ArrayName = "zonearray",
        SubArrayLength = 2,
        StartBound = "zonearray[i] = new Array(1);",
        StartOffset = 1,
        EndBound = "zonearray.sort();"
    };
    var staffArrayBounds = new SearchBound
    {
        ArrayName = "staffarray",
        SubArrayLength = 3,
        StartBound = "staffarray[i] = new Array(2);",
        StartOffset = 1,
        EndBound = "staffarray.sort();"
    };

    List<string[]> deptArray = GetArrayValues(lines, deptArrayBounds);
    List<string[]> zoneArray = GetArrayValues(lines, zoneArrayBounds);
    List<string[]> staffArray = GetArrayValues(lines, staffArrayBounds);
    // ... and so on ...

    // You can then use deptArray, zoneArray etc where you want...
    Console.WriteLine("Depts: " + deptArray.Count);
    Console.WriteLine("Zones: " + zoneArray.Count);
    Console.WriteLine("Staff: " + staffArray.Count);

    Console.ReadKey();
}

private static List<string[]> GetArrayValues(List<string> lines, SearchBound bound)
{
    List<string[]> values = new List<string[]>();

    // Get the enumerator for the lines.
    var enumerator = lines.GetEnumerator();
    string line = null;

    // Step 1 - Find the starting bound line.
    while (enumerator.MoveNext() && (line = enumerator.Current) != bound.StartBound)
    {
        // Continue looping until we've found the start bound.
    }

    // Step 2 - Skip to the right offset (maybe skip a line that has a '}' ).
    for (int i = 0; i <= bound.StartOffset; i++)
    {
        enumerator.MoveNext();
    }

    // Step 3 - Read each line of the array.
    while ((line = enumerator.Current) != bound.EndBound)
    {
        string[] subArray = new string[bound.SubArrayLength];

        // Read each sub-array value.
        for (int i = 0; i < bound.SubArrayLength; i++)
        {
            // Matches everything that is between an equal sign then the value
            // wrapped in quotes ending with a semi-colon.
            var m = Regex.Matches(line, "^(.* = \")(.*)(\";)$");

            // Get the matched value.
            subArray[i] = m[0].Groups[2].Value;

            // Move to the next sub-item if not the last sub-item.
            if (i < bound.SubArrayLength - 1)
            {
                enumerator.MoveNext();
                line = enumerator.Current;
            }
        }

        // Add the sub-array to the list of values.
        values.Add(subArray);

        // Move to the next line.
        if (!enumerator.MoveNext())
        {
            break;
        }
    }

    return values;
}
If I understand your question right, you are asking whether you can execute JavaScript code from C#, and then pass the result (which in your example would be a JavaScript Array object) into C# code.
The answer is: Of course it’s theoretically possible, but you would need to have an actual JavaScript interpreter to execute the JavaScript. You’ll have to find one or write your own, but given that JavaScript is a full-blown programming language, and writing interpreters for such a large and full-featured programming language is quite an undertaking, I suspect that you won’t find a complete ready-made solution, nor will you be able to write one unless your dedication exceeds that of all other die-hard C#-and-JavaScript fans worldwide.
However, with a bit of trickery, you might be able to coerce an existing JavaScript interpreter to do what you want. For obvious reasons, all browsers have such an interpreter, including Internet Explorer, which you can access using the WinForms WebBrowser control. Thus, you could try the following:
Have your C# code generate an HTML file containing the JavaScript you downloaded plus some JavaScript that turns it into JSON (you appear to have already found something that does this) and outputs it in the browser.
Open that HTML file in the WebBrowser control, have it execute the JavaScript, and then read the contents of the website back, now that it contains the result of the executed JavaScript.
Turn the JSON into a C# array using DataContractJsonSerializer as you suggested.
This is a pretty roundabout way to do it, but it is the best I can think of.
I have to wonder, though, why you are retrieving a JavaScript file from the web in the first place. What generates this JavaScript file? Whatever generates it, surely could generate some properly readable stuff instead (e.g. an XML file)? If it is not generated but written by humans, then why is it written in JavaScript instead of XML, CSV, or some other data format? Hopefully with these thoughts you might be able to find a solution that doesn’t require JavaScript trickery like the above.
The easiest solution is to just execute the JavaScript function that makes the array. Include there a function that turns it into JSON (http://www.json.org/js.html). After that, make an XMLHttpRequest (AJAX) to the server and, from there, extract the JSON into a custom class.
If I may use jQuery, here's an example of the needed Javascript:
var myJSONText = JSON.stringify(deptarray);
(function($){
    $.ajax({
        type: "POST",
        url: "some.aspx",
        data: myJSONText,
        success: function(msg){
            alert("Data Saved: " + msg);
        }
    });
})(jQuery);
Now you only need some code to rip the JSON string into a C# array.
EDIT:
After looking around a bit, I found Json.NET: http://json.codeplex.com/
There are also a lot of similar questions on Stack Overflow that ask the same thing.
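For the C# side, a minimal sketch with Json.NET; the handler and type names are made up, and it assumes the posted body is the JSON.stringify output of the jagged deptarray shown above:

using System.IO;
using System.Web;
using Newtonsoft.Json;

public static class DeptArrayReader
{
    public static string[][] Read(HttpRequest request)
    {
        string json;
        using (var reader = new StreamReader(request.InputStream))
            json = reader.ReadToEnd();

        // deptarray is an array of two-element arrays, so string[][] fits.
        return JsonConvert.DeserializeObject<string[][]>(json);
    }
}

The returned jagged array can then be indexed the same way as the JavaScript one, e.g. deptArray[0][0].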
