Convert jQuery chart to CSV using ASP.NET - C#

I need to convert a jQuery chart into a CSV file.
Can you please help me with the function?
I already have the graphs, but I could not find a function to convert my graph into CSV and download it.
Actually, I found this option right here:
http://flotr.googlecode.com/svn/trunk/flotr/examples/prototype/data-download.html
but there is no code for that.
Thanks

I just looked in the source of that page and pulled out the function that does the magic there. Maybe it helps you -
/**
 * Converts the data into CSV in order to download a file
 */
downloadCSV: function(){
    var i, csv = '',
        series = this.series,
        options = this.options,
        dg = this.loadDataGrid(),
        separator = encodeURIComponent(options.spreadsheet.csvFileSeparator);

    if (options.spreadsheet.decimalSeparator === options.spreadsheet.csvFileSeparator) {
        throw "The decimal separator is the same as the column separator ("+options.spreadsheet.decimalSeparator+")";
    }

    // The first row
    for (i = 0; i < series.length; ++i) {
        csv += separator+'"'+(series[i].label || String.fromCharCode(65+i)).replace(/\"/g, '\\"')+'"';
    }
    csv += "%0D%0A"; // \r\n

    // For each row
    for (i = 0; i < dg.length; ++i) {
        var rowLabel = '';
        // The first column
        if (options.xaxis.ticks) {
            var tick = options.xaxis.ticks.find(function(x){return x[0] == dg[i][0]});
            if (tick) rowLabel = tick[1];
        }
        else if (options.spreadsheet.tickFormatter){
            rowLabel = options.spreadsheet.tickFormatter(dg[i][0]);
        }
        else {
            rowLabel = options.xaxis.tickFormatter(dg[i][0]);
        }
        rowLabel = '"'+(rowLabel+'').replace(/\"/g, '\\"')+'"';
        var numbers = dg[i].slice(1).join(separator);
        if (options.spreadsheet.decimalSeparator !== '.') {
            numbers = numbers.replace(/\./g, options.spreadsheet.decimalSeparator);
        }
        csv += rowLabel+separator+numbers+"%0D%0A"; // \t and \r\n
    }

    if (Prototype.Browser.IE && !Flotr.isIE9) {
        csv = csv.replace(new RegExp(separator, 'g'), decodeURIComponent(separator)).replace(/%0A/g, '\n').replace(/%0D/g, '\r');
        window.open().document.write(csv);
    }
    else window.open('data:text/csv,'+csv);
}
});
If you have problems with this code, here's a link to the whole source - Flotr.
You could also get the table with the class "flotr-datagrid" and convert it in JavaScript (HowTo), or send it via PostBack to your ASP.NET server and do the magic server-side in C# (a rough sketch follows).
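A minimal, untested sketch of that server-side route, assuming the page posts the chart's labels and values back as two comma-separated form fields; the field names and the handler name are hypothetical:
// Hypothetical WebForms handler: builds a CSV from posted chart data and
// returns it as a file download. Field names "labels"/"values" are assumptions.
protected void DownloadCsv_Click(object sender, EventArgs e)
{
    string[] labels = Request.Form["labels"].Split(',');
    string[] values = Request.Form["values"].Split(',');

    var sb = new System.Text.StringBuilder();
    sb.AppendLine("Label,Value");
    for (int i = 0; i < labels.Length && i < values.Length; i++)
    {
        // Double up quotes and wrap the label so embedded commas don't break the row.
        sb.AppendLine("\"" + labels[i].Replace("\"", "\"\"") + "\"," + values[i]);
    }

    Response.Clear();
    Response.ContentType = "text/csv";
    Response.AddHeader("Content-Disposition", "attachment; filename=chart.csv");
    Response.Write(sb.ToString());
    Response.End();
}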

Related

How to replace characters in a string while looping?

I would like to verify whether a file exists on multiple servers. The only thing that's different is the number in the server name, e.g. Server1, Server2, Server3, etc.
How would I write a loop that replaces each number with the next highest number, up to, say, Server10?
Here is what I have so far:
var fileLocation = @"\\Server1\documents\File.txt";
var newFileInfoTest = fileLocation.Replace("Server1", "Server2");
if (File.Exists(newFileInfoTest))
    txtTextBox.Text = "Server1 -- File copy successful.";
else
    txtTextBox.Text = "Server1 -- File copy unsuccessful";
"How would I write a loop that replaces each number with the next highest number up until, say Server 10?"
You can set the server name in a loop, using the loop iterator value as part of the server name:
for (int i = 1; i <= 10; i++)
{
    txtTextBox.Text = File.Exists($@"\\Server{i}\documents\File.txt")
        ? $"Server{i} -- File copy successful."
        : $"Server{i} -- File copy unsuccessful";
}
Note that the code above will overwrite the txtTextBox.Text value on each iteration. You may instead want to capture all the statuses in the loop and then display them at the end:
txtTextBox.Text = string.Join(Environment.NewLine, Enumerable.Range(1, 10)
    .Select(i => File.Exists($@"\\Server{i}\documents\File.txt")
        ? $"Server{i} -- File copy successful."
        : $"Server{i} -- File copy unsuccessful."));
In the comments you asked:
"How would you do this if the file location was in a variable?"
One way to do this is to use a format string with a placeholder ({0}) where the number would go, and then use string.Format to fill in that placeholder inside the loop.
We can extract the server name from this string using string.Split on the \ character and grabbing the first item.
For example:
var serverPath = @"\\Server{0}\documents\File.txt";
txtTextBox.Text = string.Join(Environment.NewLine, Enumerable.Range(1, 10)
    .Select(i =>
    {
        var thisPath = string.Format(serverPath, i);
        var serverName = thisPath.Split(new[] { '\\' },
            StringSplitOptions.RemoveEmptyEntries).First();
        return File.Exists(thisPath)
            ? $"{serverName} -- File copy successful."
            : $"{serverName} -- File copy unsuccessful.";
    }));
You can do something like this. Make sure your message does not get overwritten; you could also use a StringBuilder instead of string concatenation (see the sketch after the code). Hope you can follow the logic:
var msg = string.Empty;
for (int i = 1; i < 11; i++)
{
    var fileLocation = $@"\\Server{i}\documents\File.txt";
    if (File.Exists(fileLocation))
    {
        msg += $"Server{i} -- File copy successful." + Environment.NewLine;
    }
    else
    {
        msg += $"Server{i} -- File copy unsuccessful." + Environment.NewLine;
    }
}
txtTextBox.Text = msg;
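As a rough equivalent of the StringBuilder suggestion above (same loop, just appending into a builder instead of concatenating strings):
var sb = new System.Text.StringBuilder();
for (int i = 1; i <= 10; i++)
{
    var fileLocation = $@"\\Server{i}\documents\File.txt";
    // AppendLine adds the line break for us, so nothing gets overwritten.
    sb.AppendLine(File.Exists(fileLocation)
        ? $"Server{i} -- File copy successful."
        : $"Server{i} -- File copy unsuccessful.");
}
txtTextBox.Text = sb.ToString();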
Try this:
for (int i = 1; i <= 10; i++)
{
    var fileLocation = $@"\\Server{i}\documents\File.txt";
    if (File.Exists(fileLocation))
        txtTextBox.Text = $"Server{i} -- File copy successful.";
    else
        txtTextBox.Text = $"Server{i} -- File copy unsuccessful";
}
"How would I write a loop that replaces each number with the next highest number up until, say Server 10?"
var numberOfServers = 10;
for (int i = 1; i <= numberOfServers; i++)
{
    var fileLocation = $@"\\Server{i}\documents\File.txt";
    if (File.Exists(fileLocation))
    {
        // Success
    }
    else
    {
        // Unsuccessful
    }
}

Difference between MetaTrader CSV and C# StreamWriter CSV - how to solve it?

I have code in MetaTrader that writes some quotes to CSV, but when I do the same thing in C#, the Expert Advisor reads the values in a different way...
This code in MetaEditor writes the CSV file:
li_40 = FileOpen(ls_32, FILE_CSV|FILE_WRITE|FILE_SHARE_READ, ";");
if (li_40 > 0) {
    FileWrite(li_40, ls_16);
    FileClose(li_40);
}
This is the C# code that writes it:
List<string> listB = new List<string>();
using (StreamReader reader = new StreamReader(File.OpenRead(oFile)))
{
    while (!reader.EndOfStream)
    {
        var line = reader.ReadLine();
        listB.Add(line);
    }
    reader.Close();
}
using (StreamWriter swOut = new StreamWriter(oFile))
{
    foreach (var item in listB)
    {
        swOut.Write(item);
        swOut.Write(Environment.NewLine);
    }
    for (int i = 0; i <= gridIn.Columns.Count - 1; i++)
    {
        if (i > 0)
        {
            swOut.Write(", ");
        }
        value = dr.Cells[i].Value.ToString();
        string vals = dr.Cells[2].Value.ToString();
        int priceDecimalPlaces = vals.Split('.').Count() > 1
            ? vals.Split('.').ToList().ElementAt(1).Length
            : 0;
        string nell = "0";
        if (priceDecimalPlaces == 3)
        {
            nell = "0.001";
        }
        if (priceDecimalPlaces == 5)
        {
            nell = "0.00001";
        }
        if (priceDecimalPlaces == 4)
        {
            nell = "0.0001";
        }
        // replace commas with spaces
        value = value.Replace(',', ' ');
        // replace embedded newlines with spaces
        value = value.Replace(Environment.NewLine, "");
If the difference between the C# double and MetaTrader's current double value is 0.12098 - 0.12096 = 2 points, MetaTrader doesn't see the value as 2 but as something much higher, like 18, 17, and so on; yet writing the same value from the MetaTrader code gives the correct result...
I read the CSV using _lread:
uchar chBuff[1024];
int res = _lread(hFile, chBuff, 1);
//
//CreateFileA(
res = _lread(hFile, chBuff, 350);
ls_308 = CharArrayToString(chBuff, 0, res, CP_UTF8);
//Alert(Ls_84);
ls_308=StringSubstr(ls_308,0,StringFind(ls_308,"\r\n",0));
if (_lclose(hFile)<0) Print("Error closing");
I think there is some difference between doubles written from C# and normal MetaTrader doubles.
The error was caused by the system's code page, which was different from the UTF-8 encoding specified in the MetaTrader code:
ls_308 = CharArrayToString(chBuff, 0, res, CP_UTF8);
So I changed from UTF-8 to the ANSI code page assigned by the system (CP_ACP), like this:
ls_308 = CharArrayToString(chBuff, 0, res, CP_ACP);
Hope someone finds this useful.
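Alternatively, you can attack it from the C# side by pinning the StreamWriter to an explicit encoding, so the bytes match what CharArrayToString expects on the MetaTrader side. A small, untested sketch (the file name and sample line are placeholders):
// Write the CSV with an explicit UTF-8 encoding (no BOM) so the bytes read
// back by CharArrayToString(..., CP_UTF8) decode the same way.
var lines = new[] { "EURUSD;1.12098;1.12096" };   // placeholder data
using (var swOut = new System.IO.StreamWriter("quotes.csv", false,
                                               new System.Text.UTF8Encoding(false)))
{
    foreach (var item in lines)
    {
        swOut.WriteLine(item);   // WriteLine emits \r\n on Windows
    }
}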

Moving in text file with C#

I have a problem with C#.
I am writing code that searches a text file until it finds a certain word; the code should then move down three lines, read the fourth, and then continue the search for the same word.
I don't know how to navigate through the file (forward and backward) to the line I want.
Can anybody help?
You can do something like this:
var text = File.ReadAllLines("path"); //read all lines into an array
var foundFirstTime = false;
for (int i = 0; i < text.Length; i++)
{
    //Find the word the first time
    if (!foundFirstTime && text[i].Contains("word"))
    {
        //Skip 3 lines - and continue
        i = Math.Min(i + 3, text.Length - 1);
        foundFirstTime = true;
    }
    if (foundFirstTime && text[i].Contains("word"))
    {
        //Do whatever!
    }
}
// read file
List<string> query = (from lines in File.ReadLines(this.Location.FullName, System.Text.Encoding.UTF8)
                      select lines).ToList<string>();
for (int i = 0; i < query.Count; i++)
{
    if (query[i].Contains("TextYouWant"))
    {
        i = i + 3;
    }
}
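That loop only advances the index; to actually read the line after the skip (as described in the question), something along these lines should work, with a bounds check so you don't run past the end of the list. The offset of 4 ("move three lines and read the fourth") is my reading of the question - adjust it if you mean a different line:
for (int i = 0; i < query.Count; i++)
{
    if (query[i].Contains("TextYouWant") && i + 4 < query.Count)
    {
        string fourthLine = query[i + 4];   // the fourth line below the match
        // ... use fourthLine here ...
        i += 4;                             // resume the search after that line
    }
}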
Your requirements state that you are searching for a specific word. If that is true, and you are not instead looking for a specific string, then the accepted answer here is wrong. Instead you should use:
string[] lines = System.IO.File.ReadAllLines("File.txt");
int skip = 3;
string word = "foo";
string pattern = string.Format("\\b{0}\\b", word);
for (int i = 0; i < lines.Length; i++)
{
    var match = System.Text.RegularExpressions.Regex.IsMatch(lines[i], pattern);
    System.Diagnostics.Debug.Print(string.Format("Line {0}: {1}", i + 1, match));
    if (match) i += skip;
}
If you use the String.Contains method and the word you are searching for is "man", while your text somewhere contains "mantle" or "manual", String.Contains will return true for those as well.
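A tiny illustration of the difference (nothing project-specific here):
// "man" is a substring of "mantle", so Contains matches; the \b word-boundary pattern does not.
bool byContains = "He wore a mantle".Contains("man");                  // true
bool byRegex = System.Text.RegularExpressions.Regex.IsMatch(
    "He wore a mantle", @"\bman\b");                                   // false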

How to fix this Base 64 array issue

I have a Base64 string on the view side. If I pass the whole Base64 string at once, I can convert it into bytes like this:
byte[] myBinary = Convert.FromBase64String(data);
where data represents the data coming from the view page. But the data is huge, so I am splitting it in the view page like this:
var arr = [];
for (var i = 0; i < data.length - 1; i += 1000000) {
arr.push(data.substr(i, 1000000));
}
And now I am passing the data to the controller
for (var x = 0; x < arr.length; x++) {
if (x = 0) {
r = "first";
}
else if (x = arr.length - 1) {
r = "last";
}
else {
r = "next";
}
$.post('/Home/Content', { content: e, data: r }, function (d) {
});
}
And on the controller side I have written code like this:
public JsonResult Content(string content, string data)
{
    datavalueincont += content;
    if (data == "last")
    {
        byte[] myBinary = Convert.FromBase64String(datavalueincont);
        var fname = "D://sri//data.mp4";
        FileStream stream = new FileStream(fname, FileMode.Create, FileAccess.Write);
        System.IO.BinaryWriter br = new System.IO.BinaryWriter(stream);
        br.Write(myBinary);
        br.Close();
        read.Close();
        stream.Close();
    }
    return Json("suc", JsonRequestBehavior.AllowGet);
}
But I am getting an error at:
byte[] myBinary = Convert.FromBase64String(datavalueincont);
and the error is:
The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.
How can I rectify this? If I pass all the data at once, I am able to get the bytes in the myBinary array. I hope you understand my question.
I have an idea.
As you are sending your data using Ajax, nothing ensures that your chunks will be sent sequentially.
So maybe when you aggregate your data, the chunks are not in the right order.
Try making your Ajax calls sequential to confirm this point.
[Edit]
something like this (not tested):
var data = []; //your data
var sendMoreData = function (firstTime) {
    if (data.length == 0)
        return; //no more data to send
    var content = data.shift();
    var r = firstTime ? "first" :
            data.length == 0 ? "last" :
            "next";
    $.post('/Home/Content', { content: content, data: r }, function (d) {
        sendMoreData();
    });
};
sendMoreData(true);
You can't use byte[] myBinary = Convert.FromBase64String(datavalueincont); until you have the full encoded string.
The problem is that you're splitting the Base64 data into chunks and then sending those chunks to the server -> on the server you're trying to convert back from Base64 on each individual chunk rather than on the whole collection of chunks.
The way I see it, you have 2 options:
Encode each individually split chunk of data to Base64 (rather than the whole thing beforehand) and decode it on the server.
Encode the whole thing, then split it into pieces (like you're doing now) -> send it to the server -> cache each result (any way you want -> session, db etc.) till you get the last one -> decode it all at once (see the sketch after this list).
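A rough, untested sketch of that second option, assuming ASP.NET MVC with session state available (and still assuming the chunks arrive in order - see the other answer about making the Ajax calls sequential):
public JsonResult Content(string content, string data)
{
    // Accumulate the chunks in the session; a controller field such as
    // datavalueincont does not survive between requests.
    var buffer = Session["uploadBuffer"] as System.Text.StringBuilder
                 ?? new System.Text.StringBuilder();
    buffer.Append(content);
    Session["uploadBuffer"] = buffer;

    if (data == "last")
    {
        // Decode only once the full Base64 string has been assembled.
        byte[] myBinary = Convert.FromBase64String(buffer.ToString());
        System.IO.File.WriteAllBytes(@"D:\sri\data.mp4", myBinary);
        Session.Remove("uploadBuffer");
    }
    return Json("suc", JsonRequestBehavior.AllowGet);
}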
As a side note:
if (x = 0) {
    r = "first";
}
else if (x = arr.length - 1) {
    r = "last";
}
should really be:
if (x == 0) {
    r = "first";
}
else if (x == arr.length - 1) {
    r = "last";
}
Not sure if typo, just sayin'.
I think your concept is fine... from what I understand you are doing the following...
View converts binary data to Base64String
View splits string into chunks and sends to controller
Controller waits for all chunks and concatenates them
Controller converts from Base64String
The problem is in how you are splitting your data in the view... I am assuming the splitting code has some extra padding characters on the end maybe?
var arr = [];
for (var i = 0; i < data.length - 1; i += 1000000) {
arr.push(data.substr(i, 1000000));
}
I can't build a test rig to check the code, but certainly on your last section of text you can't get 1000000 characters from .substr because there aren't that many characters left in the string. I don't know exactly what .substr will return there, but I would troubleshoot the splitting section of the code to find the problem.
Are you sure that datavalueincont += content; is really aggregating all your data? How do you store datavalueincont between HTTP requests?
Maybe that is all you are missing.
Have you debugged when data == "last" to see whether you have all your data in datavalueincont?

Out of bounds error in C#

I'm trying to read the contents of a CSV file into different variables in order to send them to a web service. It has been working fine, but today I suddenly got an exception:
index was outside the bounds of the array
What did I do wrong?
String sourceDir = @"\\198.0.0.4\e$\Globus\LIVE\bnk.run\URA.BP\WEBOUT\";
// Process the list of files found in the directory.
string[] fileEntries = Directory.GetFiles(sourceDir);
foreach (string fileName2 in fileEntries)
{
    // read values
    StreamReader st = new StreamReader(fileName2);
    while (st.Peek() >= 0)
    {
        String report1 = st.ReadLine();
        String[] columns = report1.Split(','); //split columns
        String prnout = columns[0];
        String tinout = columns[1];
        String amtout = columns[2];
        String valdate = columns[3];
        String paydate = columns[4];
        String status = columns[5];
        String branch = columns[6];
        String reference = columns[7];
    }
}
It's hard to guess without even seeing the .csv file, but my first guess would be that you don't have 8 columns.
It would be easier if you could show the original .csv file and tell us where the exception pops up.
Edit: If you think the data is alright, I'd suggest debugging and checking what the Split call returns in Visual Studio. That might help.
Edit 2: And since you're doing that processing in a loop, make sure each row has at least 8 columns.
My money is on a bad data file. If that is the only thing in the equation that has changed (i.e. you haven't made any code changes), then that's pretty much your only option.
If your data file isn't too long, post it here and we can tell you for sure.
You can add something like below to check for invalid column lengths:
while (st.Peek() >= 0)
{
    String report1 = st.ReadLine();
    String[] columns = report1.Split(','); //split columns
    if (columns.Length < 8)
    {
        //Log something useful, throw an exception, whatever.
        //You have the option to quietly note that there was a problem and
        //continue on processing the rest of the file if you want.
        continue;
    }
    //working with columns below
}
Just for sanity's sake, I combined all the various notes written here. This code is a bit cleaner and has some validation in it.
Try this:
string dir = @"\\198.0.0.4\e$\Globus\LIVE\bnk.run\URA.BP\WEBOUT\";
foreach (string fileName2 in Directory.GetFiles(dir)) {
    StreamReader sr = new StreamReader(fileName2);
    while (!sr.EndOfStream) {
        string line = sr.ReadLine();
        if (!String.IsNullOrEmpty(line)) {
            string[] columns = line.Split(',');
            if (columns.Length == 8) {
                string prnout = columns[0];
                string tinout = columns[1];
                string amtout = columns[2];
                string valdate = columns[3];
                string paydate = columns[4];
                string status = columns[5];
                string branch = columns[6];
                string reference = columns[7];
            }
        }
    }
}
EDIT: As some other users have commented, the CSV format also accepts text qualifiers, which usually means the double quote symbol ("). For example, a text qualified line may look like this:
user,"Hello!",123.23,"$123,123.12",and so on,
Writing CSV parsing code is a little more complicated when you have a fully formatted file like this. Over the years of parsing improperly formatted CSV files, I've worked up a standard code routine that passes virtually all my unit tests, but it's a pain to explain.
/// <summary>
/// Read in a line of text, and use the Add() function to add these items to the current CSV structure
/// </summary>
/// <param name="s"></param>
public static bool TryParseLine(string s, char delimiter, char text_qualifier, out string[] array)
{
    bool success = true;
    List<string> list = new List<string>();
    StringBuilder work = new StringBuilder();
    for (int i = 0; i < s.Length; i++) {
        char c = s[i];
        // If we are starting a new field, is this field text qualified?
        if ((c == text_qualifier) && (work.Length == 0)) {
            int p2;
            while (true) {
                p2 = s.IndexOf(text_qualifier, i + 1);
                // for some reason, this text qualifier is broken
                if (p2 < 0) {
                    work.Append(s.Substring(i + 1));
                    i = s.Length;
                    success = false;
                    break;
                }
                // Append this qualified string
                work.Append(s.Substring(i + 1, p2 - i - 1));
                i = p2;
                // If this is a double quote, keep going!
                if (((p2 + 1) < s.Length) && (s[p2 + 1] == text_qualifier)) {
                    work.Append(text_qualifier);
                    i++;
                // otherwise, this is a single qualifier, we're done
                } else {
                    break;
                }
            }
        // Does this start a new field?
        } else if (c == delimiter) {
            list.Add(work.ToString());
            work.Length = 0;
            // Test for special case: when the user has written a casual comma, space, and text qualifier, skip the space
            // Checks if the second parameter of the if statement will pass through successfully
            // e.g. "bob", "mary", "bill"
            if (i + 2 <= s.Length - 1) {
                if (s[i + 1].Equals(' ') && s[i + 2].Equals(text_qualifier)) {
                    i++;
                }
            }
        } else {
            work.Append(c);
        }
    }
    list.Add(work.ToString());
    // If we have nothing in the list, and it's possible that this might be a tab delimited list, try that before giving up
    if (list.Count == 1 && delimiter != DEFAULT_TAB_DELIMITER) {
        string[] tab_delimited_array = ParseLine(s, DEFAULT_TAB_DELIMITER, DEFAULT_QUALIFIER);
        if (tab_delimited_array.Length > list.Count) {
            array = tab_delimited_array;
            return success;
        }
    }
    // Return the array we parsed
    array = list.ToArray();
    return success;
}
You should note that, even as complicated as this algorithm is, it still is unable to parse CSV files where there are embedded newlines within a text qualified value, for example, this:
123,"Hi, I am a CSV File!
I am saying hello to you!
But I also have embedded newlines in my text.",2012-07-23
To solve those, I have a multiline parser that uses the Try pattern to keep adding additional lines of text until the main function parses correctly:
/// <summary>
/// Parse a line whose values may include newline symbols or CR/LF
/// </summary>
/// <param name="sr"></param>
/// <returns></returns>
public static string[] ParseMultiLine(StreamReader sr, char delimiter, char text_qualifier)
{
StringBuilder sb = new StringBuilder();
string[] array = null;
while (!sr.EndOfStream) {
// Read in a line
sb.Append(sr.ReadLine());
// Does it parse?
string s = sb.ToString();
if (TryParseLine(s, delimiter, text_qualifier, out array)) {
return array;
}
}
// Fails to parse - return the best array we were able to get
return array;
}
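For completeness, a rough usage sketch against the original question's loop; note that DEFAULT_TAB_DELIMITER, DEFAULT_QUALIFIER and ParseLine belong to the same helper class and are not shown above, and the column-count check simply mirrors the other answers:
using (var sr = new StreamReader(fileName2))
{
    while (!sr.EndOfStream)
    {
        // Parse one logical CSV record, comma-delimited and double-quote qualified.
        string[] columns = ParseMultiLine(sr, ',', '"');
        if (columns != null && columns.Length == 8)
        {
            string prnout = columns[0];
            string tinout = columns[1];
            // ... and so on for the remaining columns ...
        }
    }
}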
Since you don't know how many columns will be in the CSV file, you might need to test for the length:
if (columns.Length == 8) {
    String prnout = columns[0];
    String tinout = columns[1];
    ...
}
I bet you just got an empty line (an extra EOL at the end of the file), and it's as simple as that.
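If that's the case, a guard at the top of the original loop is enough; a small sketch, essentially the same as the IsNullOrEmpty check in the combined answer above:
String report1 = st.ReadLine();
if (String.IsNullOrWhiteSpace(report1))
    continue; // skip blank/trailing lines before splitting
String[] columns = report1.Split(',');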
