I have a middleware telemetry handler with a method that awaits the execution of a request and then tries to store some key data values from the response body into custom dimension fields in Application Insights, so I can use Grafana and potentially other third-party products to analyse my responses.
public class ResponseBodyHandler : IResponseBodyHandler
{
private readonly ITelemetryPropertyHandler _telemetryPropertyHandler = new TelemetryPropertyHandler();
public void TransformResponseBodyDataToTelemetryData(RequestTelemetry requestTelemetry, string responseBody)
{
SuccessResponse<List<Booking>> result = null;
try
{
result = JsonConvert.DeserializeObject<SuccessResponse<List<Booking>>>(responseBody);
}
catch (Exception e)
{
Log.Error("Telemetry response handler, failure to deserialize response body: " + e.Message);
return;
}
_telemetryPropertyHandler.CreateTelemetryProperties(requestTelemetry, result);
}
}
public class TelemetryPropertyHandler : ITelemetryPropertyHandler
{
private readonly ILabelHandler _labelHandler = new LabelHandler();
public void CreateTelemetryProperties(RequestTelemetry requestTelemetry, SuccessResponse<List<Booking>> result)
{
Header bookingHeader = result?.SuccessObject?.FirstOrDefault()?.BookingHeader;
requestTelemetry?.Properties.Add("ResponseClientId", "" + bookingHeader?.ConsigneeNumber);
Line line = bookingHeader?.Lines.FirstOrDefault();
requestTelemetry?.Properties.Add("ResponseProductId", "" + line?.PurchaseProductID);
requestTelemetry?.Properties.Add("ResponseCarrierId", "" + line?.SubCarrierID);
_labelHandler.HandleLabel(requestTelemetry, bookingHeader);
requestTelemetry?.Properties.Add("ResponseBody", JsonConvert.SerializeObject(result));
}
}
Now, inside: _labelHandler.HandleLabel(requestTelemetry, bookingHeader);
It extracts an image that is base64 encoded, chunks the string into pieces of 8192 characters, and adds them to the Properties as "Image index 0" .. "Image index N" (N being the total number of chunks).
I can debug and verify that the code works.
However, in Application Insights the entire "request" entry is missing, not just the custom dimensions.
I am assuming this is due to a maximum size constraint, and that I am trying to add more data than is "allowed"; however, I can't for the life of me find the documentation that enforces this restriction.
Can someone tell me what rule I am breaking, so I can either truncate the image out (if it isn't possible to store that much data), or fix whatever else I am doing wrong?
I have validated that my code works fine as long as I truncate the data into a single Property, which of course only partially stores the image (making said "feature" useless).
public class LabelHandler : ILabelHandler
{
private readonly IBase64Splitter _base64Splitter = new Base64Splitter();
public void HandleLabel(RequestTelemetry requestTelemetry, Header bookingHeader)
{
Label label = bookingHeader?.Labels.FirstOrDefault();
IEnumerable<List<char>> splitBase64String = _base64Splitter.SplitList(label?.Base64.ToList());
if (splitBase64String != null)
{
bool imageHandlingWorked = true;
try
{
int index = 0;
foreach (List<char> chunkOfImageString in splitBase64String)
{
string dictionaryKey = $"Image index {index}";
string chunkData = new string(chunkOfImageString.ToArray());
requestTelemetry?.Properties.Add(dictionaryKey, chunkData);
index++;
}
}
catch (Exception e)
{
imageHandlingWorked = false;
Log.Error("Error trying to store label in chunks: " + e.Message);
}
if (imageHandlingWorked && label != null)
{
label.Base64 = "";
}
}
}
}
The above code is responsible for adding the chunks to a requestTelemetry Property field
public class Base64Splitter : IBase64Splitter
{
private const int ChunkSize = 8192;
public IEnumerable<List<T>> SplitList<T>(List<T> originalList)
{
for (var i = 0; i < originalList.Count; i += ChunkSize)
yield return originalList.GetRange(i, Math.Min(ChunkSize, originalList.Count - i));
}
}
This is the method that splits the characters into chunks whose size corresponds to the Application Insights maximum size per custom dimension field.
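For illustration, a quick usage sketch of the splitter (the 20000-byte input is arbitrary; requires using System.Linq for ToList on a string):
var splitter = new Base64Splitter();
string base64 = Convert.ToBase64String(new byte[20000]); // 26668 characters
foreach (List<char> chunk in splitter.SplitList(base64.ToList()))
    Console.WriteLine(chunk.Count); // prints 8192, 8192, 8192, 2092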
Here is an image of the truncated field being added, if I limit myself to a single property and truncate the base64 encoded value.
[I'm from Application Insights team]
You can find field limits documented here: https://learn.microsoft.com/en-us/azure/azure-monitor/app/data-model-request-telemetry
On the ingestion side there is a limit of 64 * 1024 bytes for the overall JSON payload (we need to add that to the documentation).
You're facing something different though: the custom dimensions are removed completely. Maybe the SDK detects that 64 KB is exceeded and "mitigates" it this way. Can you try limiting it to a little less than 64 KB?
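For example, a minimal sketch of such a cap inside the chunking loop from the question (the 60 KB headroom value and the payloadBytes accounting are illustrative assumptions, not a documented API; requires using System.Text):
const int maxPayloadBytes = 60 * 1024; // stay safely under the 64 * 1024 limit
int payloadBytes = 0;
int index = 0;
foreach (List<char> chunkOfImageString in splitBase64String)
{
    string chunkData = new string(chunkOfImageString.ToArray());
    payloadBytes += Encoding.UTF8.GetByteCount(chunkData);
    if (payloadBytes > maxPayloadBytes)
        break; // drop the rest of the image rather than losing the whole request entry
    requestTelemetry?.Properties.Add($"Image index {index}", chunkData);
    index++;
}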
So I have no idea why it's not deleting the actual string from the websiteList; it's weird because it does delete from the proxyList.
When debugging, it says that it does delete something, because websiteList.Count gets lower after running through websiteList.Remove(website);
But it doesn't delete the string; it keeps looping through the same string.
foreach (var website in websiteList.ToArray())
{
    var webSplit = website.Split(')');
    foreach (var proxy in proxyList.ToArray())
    {
        if (proxyList.Count > 0)
        {
            if (websiteList.Count > 0)
            {
                var proxySplit = proxy.Split(':');
                int port;
                bool convert = Int32.TryParse(proxySplit[1], out port);
                if (convert) // TryParse succeeded, so the proxy has a valid port
                {
                    Console.WriteLine("Removing proxy");
                    proxyList.Remove(proxy);
                    websiteList.Remove(website);
                }
            }
        }
        else
            break;
    }
}
You are repeatedly deleting from the same proxyList (i.e. you are repeating the whole inner loop as many times as there are websites). Why are those 2 loops nested? The websites seem not to be related to the proxies. Only if the proxy list would be extracted from a website, nesting would make sense.
Are these 2 lists supposed to have the same length and to have proxies belonging to websites at the same index? If this is the case, loop using a for-loop and loop in reverse order to avoid messing up the indexes.
for (int i = websiteList.Count - 1; i >= 0; i--) {
if (<condition>) {
proxyList.RemoveAt(i);
websiteList.RemoveAt(i);
}
}
If you had a class for the websites, this would simplify manipulating things belonging together. It also has the advantage that you can add additional logic belonging to websites and proxies (like extracting the port number):
public class Website
{
    public string Site { get; set; }
    public string Proxy { get; set; }
    public int Port
    {
        get
        {
            string[] proxySplit = Proxy.Split(':');
            int portNo = 0;
            if (proxySplit.Length == 2)
            {
                Int32.TryParse(proxySplit[1], out portNo);
            }
            return portNo;
        }
    }
}
Now the list is of type List<Website> and contains both the websites and the proxies.
You can delete by using the for loop as before, or use LINQ to create a new list containing only the desired items:
websiteList = websiteList.Where(w => <condition using w.Site, w.Proxy, w.Port>).ToList();
Note: There is a System.Uri class for the manipulation of uniform resource identifiers. Among other things it can extract the port number. Consider using this class instead of your own.
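For example, a small sketch (this assumes the proxy strings look like "host:port"; Uri needs a scheme prefix to parse a bare authority):
Uri uri = new Uri("http://" + "203.0.113.7:8080");
Console.WriteLine(uri.Host); // 203.0.113.7
Console.WriteLine(uri.Port); // 8080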
Create a Visual C# application that displays the contents of the Teams.txt file in a ListBox control. When the user selects a team in the ListBox, the application should display the number of times that team has won the World Series in the time period from 1903 to 2012.
The two files used are Teams.txt, which contains a list of the names of teams that have won the Championship at least once, and WorldSeriesWinners.txt, which contains a chronological list of the World Series winning teams from 1903 to 2012. The first line in the file is the name of the team that won in 1903 and the last line is the name of the team that won in 2012. Note that the World Series was not played in 1904 or 1994.
This is the question I'm having problems with. In this question I must make use of a class, but the code is not working.
This is my code. I hope that you can help me find the problem.
This is the class part
class WorldSeries
{
// Field
private string _wins; // The team's total number of wins.
// Constructor
public WorldSeries()
{
_wins = "";
}
// Wins property
public string Wins
{
get { return _wins; }
set { _wins = value; }
}
}
This is the rest of my code
// Variables
string teamName; // To hold the teams names.
private void ReadTeams()
{
try
{
// Local Variables
StreamReader inputTeams; //To read the file
// Open the Teams.txt file.
inputTeams = File.OpenText("Teams.txt");
// Read the file's contents.
while (!inputTeams.EndOfStream)
{
// Read a line and add it to the ListBox.
teamName = inputTeams.ReadLine();
lst_teams.Items.Add(teamName);
}
// Close the file.
inputTeams.Close();
}
catch (Exception ex)
{
// Display an error message.
MessageBox.Show(ex.Message);
}
}
private void GetTeamWin (WorldSeries worldSeries)
{
try
{
//Local Variables
int index=0; // Loop counter, initialized to 0.
int winCount = 0; // Accumulator, initialized to 0.
// Open the WorldSeriesWinners.txt file.
StreamReader inputFile = File.OpenText("WorldSeriesWinners.txt");
// Create a List object to hold strings.
List<string> winnerList = new List<string>();
// Read the file's contents
while (!inputFile.EndOfStream)
{
// Read a line and add it to the list.
winnerList.Add(inputFile.ReadLine());
}
// Sort the items in the List.
winnerList.Sort();
while (index >=0)
{
// Search the team name in the List
index = winnerList.BinarySearch(teamName);
winCount++;
// Remove the team name from the List
winnerList.RemoveAt(index);
}
// Store the total number of wins of the team in the Wins
// parameter.
worldSeries.Wins = winCount.ToString();
// Clear the List
winnerList.Clear();
// Display the number of times the team has won.
lbl_results.Text = "The" + lst_teams.SelectedItem.ToString()
+ "has won the World Series" +
winCount + "time(s), between 1903 and 2012.";
}
catch (Exception ex)
{
// Display an error message.
MessageBox.Show(ex.Message);
}
}
private void btn_Exit_Click(object sender, EventArgs e)
{
// Close the file.
this.Close();
}
}
The immediate bug in GetTeamWin is the search loop: List<T>.BinarySearch returns a negative number once the team name is not found (including when the selected team never won at all), so RemoveAt is called with a negative index, throws, and you land in your catch block before the result is ever displayed. That aside, the number of team wins is easily small enough to hold in memory. You can read the whole file once and store a dictionary of the team name to the number of wins in memory. Something like this:
Dictionary<string, int> numberOfWins =
File.ReadAllLines("WorldSeriesWinners.txt")
.GroupBy(t => t)
.ToDictionary(g => g.Key, g => g.Count() );
You could then have simple function that checked if the selected team was in this list and returned the no of wins if so, if not, zero:
private int GetNoOfWins(string teamName)
{
if (numberOfWins.ContainsKey(teamName))
{
return numberOfWins[teamName];
}
else
{
return 0;
}
}
which could easily be used in your existing code:
int winCount = GetNoOfWins(lst_teams.SelectedItem.ToString());
lbl_results.Text = "The" + lst_teams.SelectedItem.ToString()
+ "has won the World Series" +
winCount + "time(s), between 1903 and 2012.";
My application uses UdpClient to receive images from another machine.
Each image is 951000 bytes and the MTU limit is 1500 bytes.
So the sender application must use fragmentation, and each packet sent contains a header with 2 ints:
total_number
current_number
The code receives bytes, and the bit rate is very high because the video has a new frame to send to my application every 30 milliseconds.
I find myself losing packets and I don't know how to do this differently so as not to lose them.
Does anyone have an idea how to solve this?
Is there a better way?
This is the code:
public class PackagePartial
{
    public int total_count;
    public int current_count; // first package is 1
    public byte[] buffer;

    public byte[] Serializable()
    {
        // make the Serialize
    }

    public static PackagePartial DeSerializable(byte[] v)
    {
        // parse the header and payload out of v:
        // total_count = ... ;
        // current_count = ... ;
        // buffer = ... ;
    }
}
// the network layer
int lastPackIndex = 0;
List<byte> collection = new List<byte>();
while (true)
{
    byte[] package = _clientListener.Receive(ref ep);
    PackagePartial p = PackagePartial.DeSerializable(package);
    // indication that I lost a package
    if (p.current_count - lastPackIndex != 1)
    {
        collection.Clear();
        lastPackIndex = 0;
        continue;
    }
    if (p.current_count == p.total_count)
    {
        // image is complete; convert it and send it to the GUI layer as a bitmap
        Image img = ConvertBytesToImage(collection);
        SendToGui(img);
        collection.Clear();
        lastPackIndex = 0;
    }
    else
    {
        lastPackIndex = p.current_count;
        collection.AddRange(p.buffer);
    }
}
Don't deserialize each package to an intermediate class after receive.
Create a List of byte arrays and stuff them all in there as they come in.
Once the other side finishes sending, look in the first one to find the total count and see if the List.Count matches the total count.
If it does, you have all of the packages, now you can reassemble the image, just disregard the headers, you don't need them anymore.
Since you don't need anything at this point but the data from each packet, assembling the image should be faster (no serialization to an intermediate class involved anymore).
This should minimize the processing required for each image.
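A rough sketch of that approach, assuming the header is two little-endian ints (total_count, then current_count) at the start of each datagram, and reusing ConvertBytesToImage and SendToGui from the question (in-order delivery is assumed here; with UDP you would really want to sort the fragments by current_count first):
const int headerSize = 8; // two 4-byte ints: total_count, current_count
var packets = new List<byte[]>();
while (true)
{
    byte[] package = _clientListener.Receive(ref ep);
    packets.Add(package); // no per-packet deserialization, just store the raw bytes

    int totalCount = BitConverter.ToInt32(packets[0], 0);
    if (packets.Count == totalCount)
    {
        // All fragments arrived: strip each header and concatenate the payloads.
        var image = new List<byte>();
        foreach (byte[] p in packets)
            image.AddRange(new ArraySegment<byte>(p, headerSize, p.Length - headerSize));
        SendToGui(ConvertBytesToImage(image));
        packets.Clear();
    }
}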
I have a Stock class which loads lots of stock data history from a file (about 100 MB). I have a Pair class that takes two Stock objects and calculates some statistical relations between the two then writes the results to file.
In my main method I have a loop going through a list of pairs of stocks (about 500). It creates 2 stock objects and then a pair object out of the two. At this point the pair calculations are written to file and I'm done with the objects. I need to free the memory so I can go on with the next calculation.
In addition to setting the 3 objects to null, I have added the following two lines at the end of the loop:
GC.Collect(GC.MaxGeneration);
GC.WaitForPendingFinalizers();
Stepping over these two lines seems to free up only 50 MB of the 200-300 MB that is allocated per loop iteration (viewing it in Task Manager).
The program does about eight or ten pairs before it gives me a system out of memory exception. The memory usage steadily increases until it crashes at about 1.5 GB. (This is an 8 GB machine running Win7 Ultimate)
I don't have much experience with garbage collection. Am I doing something wrong?
Here's my code since you asked. (Note: the program has two modes: (1) add mode, in which new pairs are added to the system; (2) regular mode, which updates the pair files in real time based on FileSystemWatcher events. The stock data is updated by an external app called QCollector.)
This is the segment in MainForm which runs in Add Mode:
foreach (string line in PairList)
{
string[] tokens = line.Split(',');
stockA = new Stock(QCollectorPath, tokens[0].ToUpper());
stockB = new Stock(QCollectorPath, tokens[1].ToUpper());
double ratio = double.Parse(tokens[2]);
Pair p = new Pair(QCollectorPath, stockA, stockB, ratio);
// at this point the pair is written to file (constructor handles this)
// commenting out the following lines of code since they don't fix the problem
// stockA = null;
// stockB = null;
// p = null;
// refraining from forced collection since that's not the problem
// GC.Collect(GC.MaxGeneration);
// GC.WaitForPendingFinalizers();
// so far this is the only way i can fix the problem by setting the pair classes
// references to StockA and StockB to null
p.Kill();
}
I am adding more code as per request: Stock and Pair are subclasses of TimeSeries, which has the common functionality
public abstract class TimeSeries {
protected List<string> data;
// the following Create method must be implemented by subclasses (Stock, Pair, etc.)
// as each class is created differently, although their data formatting is identical
protected abstract List<string> Create();
// . . .
public void LoadFromFile()
{
data = new List<string>();
List<StreamReader> srs = GetAllFiles();
foreach (StreamReader sr in srs)
{
List<string> temp = new List<string>();
temp = TurnFileIntoListString(sr);
data = new List<string>(temp.Concat(data));
sr.Close();
}
}
// uses directory naming scheme (according to data month/year) to find files of a symbol
protected List<StreamReader> GetAllFiles()...
public static List<string> TurnFileIntoListString(StreamReader sr)
{
List<string> list = new List<string>();
string line;
while ((line = sr.ReadLine()) != null)
list.Add(line);
return list;
}
// this is the only mean to access a TimeSeries object's data
// this is to prevent deadlocks by time consuming methods such as pair's Create
public string[] GetListCopy()
{
lock (data)
{
string[] listCopy = new string[data.Count];
data.CopyTo(listCopy);
return listCopy;
}
}
}
public class Stock : TimeSeries
{
public Stock(string dataFilePath, string symbol, FileSystemWatcher fsw = null)
{
DataFilePath = dataFilePath;
Name = symbol.ToUpper();
LoadFromFile();
// to update stock data when external app updates the files
if (fsw != null) fsw.Changed += new FileSystemEventHandler(fsw_Changed);
}
protected override List<string> Create()
{
// stock files created by external application
}
// . . .
}
public class Pair : TimeSeries {
public Pair(string dataFilePath, Stock stockA, Stock stockB, double ratio)
{
// assign parameters to local members
// ...
if (FileExists())
LoadFromFile();
else
Create();
}
protected override List<string> Create()
{
// since stock can get updated by fileSystemWatcher's event handler
// a copy is obtained from the stock object's data
string[] listA = StockA.GetListCopy();
string[] listB = StockB.GetListCopy();
List<string> listP = new List<string>();
int i, j;
i = GetFirstValidBar(listA);
j = GetFirstValidBar(listB);
DateTime dtA, dtB;
dtA = GetDateTime(listA[i]);
dtB = GetDateTime(listB[j]);
// this hidden segment adjusts i and j until they are starting at same datetime
// since stocks can have different amount of data
while (i < listA.Length && j < listB.Length)
{
double priceA = GetPrice(listA[i]);
double priceB = GetPrice(listB[j]);
double priceP = priceA * ratio - priceB;
listP.Add(String.Format("{0},{1:0.00},{2:0.00},{3:0.00}"
    , dtA
    , priceP
    , priceA
    , priceB
    ));
if (i < j)
i++;
else if (j < i)
j++;
else
{
i++;
j++;
}
}
    return listP;
}
public void Kill()
{
data = null;
stockA = null;
stockB = null;
}
}
Your memory leak is here:
if (fsw != null) fsw.Changed += new FileSystemEventHandler(fsw_Changed);
The instance of the stock object will be kept in memory as long as the FileSystemWatcher is alive, since it is responding to an event of the FileSystemWatcher.
I think that you want to either implement that event somewhere else, or at some other point in your code add a:
if (fsw != null) fsw.Changed -= fsw_Changed;
Given the way the code is written, it might be that the Stock object is intended to be used without a FileSystemWatcher in cases where bulk processing is done.
In the original code you posted, the constructors of the Stock classes were being called with a FileSystemWatcher. You have changed that now. I think you will find that with a null FileSystemWatcher you can remove your Kill() call and you will not have a leak, since you are no longer listening to fsw.Changed.
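If the watcher is genuinely needed in some modes, one option is to make Stock own the unsubscription, for example via IDisposable. A sketch under that assumption (field and handler names follow the question's code):
public class Stock : TimeSeries, IDisposable
{
    private readonly FileSystemWatcher _fsw;

    public Stock(string dataFilePath, string symbol, FileSystemWatcher fsw = null)
    {
        DataFilePath = dataFilePath;
        Name = symbol.ToUpper();
        LoadFromFile();
        _fsw = fsw;
        if (_fsw != null) _fsw.Changed += new FileSystemEventHandler(fsw_Changed);
    }

    public void Dispose()
    {
        // Detach the handler so the watcher no longer keeps this instance alive.
        if (_fsw != null) _fsw.Changed -= fsw_Changed;
    }
}
The add-mode loop could then wrap each Stock in a using block, or call Dispose() where Kill() is called today.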
I am using log4net and I have created my own appender based on AdoNetAppender. My appender implements a kind of buffer that permits grouping identical events into one log entry (for thousands of identical errors, I will only have one line in the database).
Here is the code for easy comprehension (my appender has a buffersize = 1):
class CustomAdoNetAppender : AdoNetAppender
{
//My Custom Buffer
private static List<LoggingEvent> unSendEvents = new List<LoggingEvent>();
private int customBufferSize = 5;
private double interval = 100;
private static DateTime lastSendTime = DateTime.Now;
protected override void SendBuffer(log4net.Core.LoggingEvent[] events)
{
LoggingEvent loggingEvent = events[0];
LoggingEvent l = unSendEvents.Find(delegate(LoggingEvent logg) { return GetKey(logg).Equals(GetKey(loggingEvent), StringComparison.OrdinalIgnoreCase); });
//If the events already exist in the custom buffer (unSendEvents) containing the 5 last events
if (l != null)
{
//Iterate the count property
try
{
l.Properties["Count"] = (int)l.Properties["Count"] + 1;
}
catch
{
l.Properties["Count"] = 1;
}
}
//Else
else
{
//If the custom buffer (unSendEvents) contains 5 events
if (unSendEvents.Count() == customBufferSize)
{
//Persist the older event
base.SendBuffer(new LoggingEvent[] { unSendEvents.ElementAt(0) });
//Delete it from the buffer
unSendEvents.RemoveAt(0);
}
//Set count properties to 1
loggingEvent.Properties["Count"] = 1;
//Add the event to the pre-buffer
unSendEvents.Add(loggingEvent);
}
//If timer is over
TimeSpan timeElapsed = loggingEvent.TimeStamp - lastSendTime;
if (timeElapsed.TotalSeconds > interval)
{
//Persist all events contained in the unSendEvents buffer
base.SendBuffer(unSendEvents.ToArray());
//Update send time
lastSendTime = unSendEvents.ElementAt(unSendEvents.Count() - 1).TimeStamp;
//Flush the buffer
unSendEvents.Clear();
}
}
/// <summary>
/// Function to build a key (aggregation of important properties of a logging event) to facilitate comparison.
/// </summary>
/// <param name="logg">The loggign event to get the key.</param>
/// <returns>Formatted string representing the log event key.</returns>
private string GetKey(LoggingEvent logg)
{
return string.Format("{0}|{1}|{2}|{3}", logg.Properties["ErrorCode"] == null ? string.Empty : logg.Properties["ErrorCode"].ToString()
, logg.Level.ToString()
, logg.LoggerName
, logg.MessageObject.ToString()
);
}
}
The buffer and count part is going well. My issue is that I am losing the last 5 logs because the buffer is not flushed at the end of the program. The unSendEvents buffer is full but never flushed to the database, because no new logs arrive to "push" the older logs into the db.
Is there any solution for me? I have tried to use the Flush() method but with no success.
The Smtp appender has a lossy parameter. If it's not set to false, you aren't guaranteed to get all of the logging messages. Sounds like that might be your problem. I use a config file, so this line is in my appender definition.
<lossy value="false" />
There are a couple of ways I can think of to handle this. The first is to change your buffer size to one (it is 5 right now). That would ensure that all entries get written right away. However, this might not be ideal. If that is the case, one workaround I can think of is to put five dummy log messages into your buffer. That will flush out the real ones, and your dummy events will be the ones that get dropped.
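Another option, since you control the appender code: override OnClose (inherited from AppenderSkeleton) and persist whatever is still sitting in unSendEvents; log4net calls it when the appender is closed, e.g. via LogManager.Shutdown() at application exit. A sketch against the code above:
protected override void OnClose()
{
    // Persist any events still held in the custom buffer before closing.
    if (unSendEvents.Count > 0)
    {
        base.SendBuffer(unSendEvents.ToArray());
        unSendEvents.Clear();
    }
    base.OnClose();
}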