Reading only the first value out of each JSON object - C#

I'm sending over 120 requests (spread across multiple tasks) to
this API, each returning 200 objects, and my goal is to read only the names without touching the rest.
public async Task<List<string>> ZwrocNazwe(string zapytanie)
{
    var listaNazw = new List<string>();
    var s = await _klientHttp.GetStreamAsync(zapytanie);
    using (StreamReader sr = new StreamReader(s))
    using (JsonReader reader = new JsonTextReader(sr))
    {
        reader.SupportMultipleContent = true;
        while (reader.Read())
        {
            if (reader.TokenType != JsonToken.StartObject) continue;
            reader.Read();
            if (reader.Value.ToString() != "name") continue;
            reader.Read();
            listaNazw.Add(Convert.ToString(reader.Value));
            reader.Skip();
        }
    }
    return listaNazw;
}
It works, but it takes more time than I expected. Am I doing something wrong?
This is the function that combines results:
public async Task<List<string>> ZwrocListePrzedmiotow(List<int> listaId, string sciezka)
{
    // Groups ids of items into groups of 200.
    var listaZapytan = ZwrocListeZapytan(listaId);

    // Makes one request per group to get item info.
    var listaZadan = new List<Task<List<string>>>();
    foreach (var zapytanie in listaZapytan)
        listaZadan.Add(ZwrocNazwe(sciezka + zapytanie));

    // Combines results.
    await Task.WhenAll(listaZadan);
    var listaPrzedmiotow = new List<string>();
    foreach (var zadanie in listaZadan)
        listaPrzedmiotow.AddRange(zadanie.Result);
    return listaPrzedmiotow;
}
Funny thing is that since I started using GetStreamAsync instead of GetStringAsync I'm waiting even longer for the results.
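One thing worth noting about the reader loop above: it assumes "name" is the first property of each object and skips the object otherwise. If System.Text.Json (built into .NET Core 3.0+) is available, a token-level pass that finds "name" wherever it appears in the object can be sketched like this; the class and method names are mine, and this swaps Newtonsoft's JsonTextReader for Utf8JsonReader:

```csharp
using System;
using System.Collections.Generic;
using System.Text;
using System.Text.Json;

static class NameExtractor
{
    // Token-level scan: collect every "name" value without
    // deserializing the surrounding objects.
    public static List<string> ExtractNames(byte[] utf8Json)
    {
        var names = new List<string>();
        var reader = new Utf8JsonReader(utf8Json);
        while (reader.Read())
        {
            if (reader.TokenType == JsonTokenType.PropertyName
                && reader.ValueTextEquals("name"))
            {
                reader.Read();
                if (reader.TokenType == JsonTokenType.String)
                    names.Add(reader.GetString());
                else
                    reader.Skip(); // a "name" whose value is an object/array
            }
        }
        return names;
    }

    static void Main()
    {
        var json = Encoding.UTF8.GetBytes(
            "[{\"id\":1,\"name\":\"sword\"},{\"name\":\"shield\",\"id\":2}]");
        Console.WriteLine(string.Join(",", ExtractNames(json))); // prints "sword,shield"
    }
}
```

Note this also picks up "name" properties on nested objects; if that matters, the depth can be checked via reader.CurrentDepth.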

Related

Parsing Azure Analysis Response to a JSON in Azure function using ADOMDClient

I am using Microsoft.AnalysisServices.AdomdClient.dll to connect to Azure Analysis Services and execute DAX queries in an Azure Function, and I need the output as JSON. Below is what I am doing, but when there are many records the conversion is slow: the Analysis Services query responds in 2 seconds, but converting the response to JSON takes more than 40 seconds. Can someone suggest a better way?
AdomdCommand cmd = new AdomdCommand(query, _connect);
public List<Dictionary<string, object>> Results { get; } = new List<Dictionary<string, object>>();

var reader = cmd.ExecuteReader();
var schemeTable = reader.GetSchemaTable();
ISet<string> columnSet = new HashSet<string>();
foreach (DataRow row in schemeTable.Rows)
{
    string columnName = row[0].ToString();
    columnSet.Add(columnName);
}
while (reader.Read())
{
    Dictionary<string, object> columns = new Dictionary<string, object>();
    foreach (string columnName in columnSet)
    {
        var value = reader[reader.GetOrdinal(columnName)];
        columns.Add(columnName, value); // value may be null; Add accepts that either way
    }
    Results.Add(columns);
}
JsonConvert.SerializeObject(Results);
I have a sample for this on GitHub: microsoft/azure-analysis-services-http-sample. It streams the results from an AdomdDataReader to an output stream as JSON. The Stream can be a MemoryStream or (in my case) an HttpResponse stream.
public static async Task WriteResultsToStream(object results, Stream stream, CancellationToken cancel)
{
    if (results == null)
    {
        return;
    }

    if (results is AdomdDataReader rdr)
    {
        var encoding = new System.Text.UTF8Encoding(false);
        using (var tw = new StreamWriter(stream, encoding, 1024 * 4, true))
        using (var w = new Newtonsoft.Json.JsonTextWriter(tw))
        {
            await w.WriteStartObjectAsync(cancel);
            var rn = "rows";
            await w.WritePropertyNameAsync(rn);
            await w.WriteStartArrayAsync(cancel);
            while (rdr.Read())
            {
                await w.WriteStartObjectAsync(cancel);
                for (int i = 0; i < rdr.FieldCount; i++)
                {
                    string name = rdr.GetName(i);
                    object value = rdr.GetValue(i);
                    await w.WritePropertyNameAsync(name, cancel);
                    await w.WriteValueAsync(value, cancel);
                }
                await w.WriteEndObjectAsync(cancel);
            }
            await w.WriteEndArrayAsync(cancel);
            await w.WriteEndObjectAsync(cancel);
            await w.FlushAsync();
            await tw.FlushAsync();
            await stream.FlushAsync();
        }
    }
    else if (results is CellSet cs)
    {
        throw new NotSupportedException("CellSet results");
    }
    else
    {
        throw new InvalidOperationException("Unexpected result type");
    }
}
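A related micro-optimization for the original question's loop, independent of streaming: resolve each column's ordinal once up front instead of calling GetOrdinal for every cell of every row. A rough sketch against the generic IDataReader interface (which AdomdDataReader implements); the DataTable here is only a stand-in for a real result set:

```csharp
using System;
using System.Collections.Generic;
using System.Data;

static class ReaderDemo
{
    // Build row dictionaries with one name lookup per column, not per cell.
    public static List<Dictionary<string, object>> ToRows(IDataReader reader)
    {
        int fieldCount = reader.FieldCount;
        var names = new string[fieldCount];
        for (int i = 0; i < fieldCount; i++)
            names[i] = reader.GetName(i); // ordinal -> name, resolved once

        var rows = new List<Dictionary<string, object>>();
        while (reader.Read())
        {
            var row = new Dictionary<string, object>(fieldCount);
            for (int i = 0; i < fieldCount; i++)
                row[names[i]] = reader.IsDBNull(i) ? null : reader.GetValue(i);
            rows.Add(row);
        }
        return rows;
    }

    static void Main()
    {
        var table = new DataTable();
        table.Columns.Add("id", typeof(int));
        table.Columns.Add("name", typeof(string));
        table.Rows.Add(1, "a");
        table.Rows.Add(2, "b");

        using (IDataReader reader = table.CreateDataReader())
        {
            Console.WriteLine(ToRows(reader).Count); // prints "2"
        }
    }
}
```

This doesn't remove the serialization cost, but it cuts the per-cell string lookups that grow with row count.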

how to implement on error goto next on API response in C#

I have a list of IDs and I am looping through each of them to get the details.
In a few cases the API call fails, so I want to skip the failed call and proceed with the next ID's API call, similar to "on error goto next".
foreach (var item in ID)
{
    try
    {
        List<string> resultList = new List<string>();
        url = string.Format(HttpContext.Current.Session["url"].ToString() +
            ConfigurationManager.AppSettings["propertyURL"].ToString(),
            HttpContext.Current.Session["ObjectID"].ToString(), item);
        var request1 = WebRequest.Create(url);
        request1.Headers["X-Authentication"] = HttpContext.Current.Session["vaultToken"].ToString();
        request1.Method = "GET";
        request1.Headers.Add("Cache-Control", "no-cache"); // was "request", which doesn't exist in this scope
        // Get the response.
        var response1 = request1.GetResponse();
        var deserializer1 = new DataContractJsonSerializer(typeof(PropertyValue[]));
        var result1 = (PropertyValue[])deserializer1.ReadObject(response1.GetResponseStream());
    }
    catch (Exception ex)
    {
    }
}
What is a possible solution so that even if one API call fails, the foreach loop goes on to the next ID's API call?
Looks like you just need to move your resultList outside the loop. Everything else looks fine...
List<string> resultList = new List<string>();
foreach (var item in ID)
{
    try
    {
        // Your call here.
        // Add the result to resultList here.
    }
    catch
    {
    }
}
Actually, this is clearer:
Dictionary<int, string> resultList = new Dictionary<int, string>();
foreach (var item in ID)
{
    resultList.Add(item, string.Empty);
    try
    {
        // Your call here.
        var apiCallResult = apiCall(item);
        resultList[item] = apiCallResult;
    }
    catch
    {
    }
}
You can query the dictionary for IDs that don't have a result.
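To make that last point concrete, a small sketch of querying the dictionary afterwards (the sample data here is made up):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

static class FailedCallsDemo
{
    // IDs whose API call never wrote a result keep their initial empty value.
    public static List<int> FailedIds(Dictionary<int, string> results) =>
        results.Where(kv => string.IsNullOrEmpty(kv.Value))
               .Select(kv => kv.Key)
               .OrderBy(id => id)
               .ToList();

    static void Main()
    {
        // Simulated outcome: calls for IDs 2 and 4 failed.
        var resultList = new Dictionary<int, string>
        {
            [1] = "ok", [2] = "", [3] = "ok", [4] = ""
        };
        Console.WriteLine(string.Join(",", FailedIds(resultList))); // prints "2,4"
    }
}
```

The failed IDs can then be retried in a second pass or logged.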

How can I read a CSV from Webserver using c# with LINQ? (ClickOnce deployment)

Hello, I'm very new to coding, but I developed a tool which works for local use.
My boss wants me to put it on our website for customers.
I figured out that ClickOnce is a way to do this.
The .exe reads a .csv file to find data:
public string FindSalesmenPhone(string userName)
{
    List<string> resLines = new List<string>();
    var lines = File.ReadLines(@"S:\data.csv");
    foreach (var line in lines)
    {
        var res = line.Split(new char[] { ',' });
        // id to search
        if (res[7] == userName)
        {
            resLines.Add(res[10]);
        }
    }
    // to get the output
    foreach (var line in resLines)
    {
        return line;
    }
    MessageBox.Show("no phone found!");
    return null;
}
My question is: how can I change this path, and will the .csv file still be accessible after I deploy the tool with ClickOnce?
List<string> resLines = new List<string>();
var lines = File.ReadLines(@"S:\Manuel\data.csv");
Can I simply change it to something like:
var lines = File.ReadLines(@"http://mywebsite.com/data.csv");
Sorry, this might be easy for you guys, but I really appreciate your help!
It would not be File.ReadLines, as that reads from the file system and you're asking to get the text from a webpage. The best way to go about this would be to create a WebRequest to the URL you have specified and then read the CSV from there using a StreamReader. See this thread, which is similar.
EDIT:
Glad this helped you. You may have some bad performance simply because you called the StreamReader without a using statement. Try the code below:
public string TestCSV(string id)
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://mywebsite.com/data.csv");
    HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
    using (StreamReader sr = new StreamReader(resp.GetResponseStream()))
    {
        string currentline;
        while ((currentline = sr.ReadLine()) != null)
        {
            if (string.IsNullOrWhiteSpace(currentline))
                continue;
            var res = currentline.Split(new char[] { ',' });
            if (res[0] == id)
            {
                return res[1]; // first match wins
            }
        }
    }
    return null;
}
Thanks Ryan, this was very helpful.
I modified it for my use, but now the performance is very bad. I guess the tool reads the file over and over again... (a few places call this method during runtime).
Is there a way I can save the .csv in a 2D array and keep it for the whole runtime of the program? The .csv contains about 900 entries.
public string TestCSV(string id)
{
    List<string> splitted = new List<string>();
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://mywebsite.com/data.csv");
    HttpWebResponse resp = (HttpWebResponse)req.GetResponse();
    StreamReader sr = new StreamReader(resp.GetResponseStream());
    string currentline;
    while ((currentline = sr.ReadLine()) != null)
    {
        var res = currentline.Split(new char[] { ',' });
        if (res[0] == id)
        {
            splitted.Add(res[1]);
        }
        foreach (var line in splitted)
        {
            return line;
        }
    }
    return null;
}
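One way to avoid re-downloading on every call: fetch and parse the CSV once, keep an id-to-value dictionary for the lifetime of the program, and have the lookup methods hit the dictionary. A sketch, assuming the lines are read once with the WebRequest/StreamReader code above and passed in (the class and its names are mine, not from the answer):

```csharp
using System;
using System.Collections.Generic;

class CsvCache
{
    // id -> value lookup, built once from the downloaded CSV lines.
    private readonly Dictionary<string, string> _lookup =
        new Dictionary<string, string>();

    public CsvCache(IEnumerable<string> csvLines)
    {
        foreach (var line in csvLines)
        {
            if (string.IsNullOrWhiteSpace(line)) continue;
            var res = line.Split(',');
            // Keep the first occurrence, matching the "first match wins" code above.
            if (res.Length > 1 && !_lookup.ContainsKey(res[0]))
                _lookup[res[0]] = res[1];
        }
    }

    public string Find(string id) =>
        _lookup.TryGetValue(id, out var value) ? value : null;

    static void Main()
    {
        var cache = new CsvCache(new[] { "smith,555-0101", "jones,555-0102" });
        Console.WriteLine(cache.Find("jones")); // prints "555-0102"
    }
}
```

With ~900 entries, the dictionary costs almost nothing in memory and turns every subsequent lookup into an O(1) hit instead of a fresh HTTP request.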

Returning a large collection serialized as JSON via MVC controller

I have a very large result set which I want to return to an Ajax call using JSON.
I started with creating a collection of objects and then serialize the collection but the collection creation would throw a System.OutOfMemoryException.
I've now tried to change the implementation to stream the JSON without having a collection but I still get the System.OutOfMemoryException.
Here are my current code snippets.
using (var stream = new MemoryStream())
{
    using (var streamWriter = new StreamWriter(stream))
    {
        using (var jsonWriter = new JsonTextWriter(streamWriter))
        {
            var serializer = new JsonSerializer();
            serializer.Serialize(jsonWriter, new { pins = MakePins(model), missingLocations = 0 });
            jsonWriter.Flush();
        }
    }
    stream.Seek(0, SeekOrigin.Begin);
    return new FileStreamResult(stream, "application/json");
}
The MakePins function looks like this:
var pinData = _geographyService.GetEnumerationQueryable()
    .SelectMany(x => x.EnumeratedPersonRoleCollection)
    .ApplyFilter(model)
    .Where(x => x.EnumerationCentre.Location != null)
    .AsNoTracking()
    .AsEnumerable();

return pinData.Select(item => new MapPin
{
    Id = item.EnumerationCentre.EnumerationCentreUid.ToString(),
    Name = item.Person.FullName,
    FillColour = GetMapPinColour(item, model),
    Latitude = item.EnumerationCentre.Location.Latitude,
    Longitude = item.EnumerationCentre.Location.Longitude,
    Count = item.IssuedVoucherCollection.Count()
});
I've tried using yield return instead of the Select, but the OutOfMemoryException is thrown within the Select function.
I've done a fair bit of googling but can't quite see what else I could try.
Your current solution still has the same problem: just before returning, you collect and store all the data in the MemoryStream.
You can try something in the following fashion:
public ActionResult RobotsText()
{
    Response.ContentType = "application/json";
    Response.Write("[");
    foreach (var item in Items)
    {
        Response.Write(JsonSerializer.Serialize(item));
        if ( /* not last */ )
        {
            Response.Write(",");
        }
    }
    Response.Write("]");
    return new EmptyResult();
}
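The `/* not last */` check is left open above; a common trick is a first-element flag, so the comma is written before every item except the first. A minimal self-contained sketch writing to any TextWriter (the per-item serialization is simplified to a plain Write here):

```csharp
using System;
using System.Collections.Generic;
using System.IO;

static class StreamingJson
{
    // Stream items as a JSON array without buffering the whole payload;
    // the 'first' flag handles the between-item commas.
    public static void WriteArray(TextWriter w, IEnumerable<int> items)
    {
        w.Write("[");
        bool first = true;
        foreach (var item in items)
        {
            if (!first) w.Write(",");
            first = false;
            w.Write(item); // stand-in for serializing the real object
        }
        w.Write("]");
    }

    static void Main()
    {
        var sw = new StringWriter();
        WriteArray(sw, new[] { 1, 2, 3 });
        Console.WriteLine(sw.ToString()); // prints "[1,2,3]"
    }
}
```

In the controller this would write to the response stream instead of a StringWriter, so no element is ever held in memory after it has been sent.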

Task.WaitAll not waiting on other async methods

I'm asynchronously retrieving some rss articles with my Portable Class Library that uses the Microsoft.Bcl library (which doesn't have Task.WhenAll). Each article has a url to rss comments that I need to asynchronously retrieve as well.
The code below is my library. I call GetArticles(), which creates a list of tasks that call GetComments() to asynchronously get the comments for each article, but the comments never come back populated.
I've tried using Task.WaitAll in GetArticles to wait for the comments, but it does not block the thread. Any help would be appreciated.
private const string ArticlesUri = "";
public async Task<List<ArticleBrief>> GetArticles()
{
    var results = new List<ArticleBrief>();
    try
    {
        var wfw = XNamespace.Get("http://wellformedweb.org/CommentAPI/");
        var media = XNamespace.Get("http://search.yahoo.com/mrss/");
        var dc = XNamespace.Get("http://purl.org/dc/elements/1.1/");
        var t = await WebHttpRequestAsync(ArticlesUri);
        StringReader stringReader = new StringReader(t);
        using (var xmlReader = System.Xml.XmlReader.Create(stringReader))
        {
            var doc = System.Xml.Linq.XDocument.Load(xmlReader);
            results = (from e in doc.Element("rss").Element("channel").Elements("item")
                       select new ArticleBrief()
                       {
                           Title = e.Element("title").Value,
                           Description = e.Element("description").Value,
                           Published = Convert.ToDateTime(e.Element("pubDate").Value),
                           Url = e.Element("link").Value,
                           CommentUri = e.Element(wfw + "commentRss").Value,
                           ThumbnailUri = e.Element(media + "thumbnail").FirstAttribute.Value,
                           Categories = GetCategoryElements(e.Elements("category")),
                           Creator = e.Element(dc + "creator").Value
                       }).ToList();
        }
        var tasks = new Queue<Task>();
        foreach (var result in results)
        {
            tasks.Enqueue(
                Task.Factory.StartNew(async () =>
                {
                    result.Comments = await GetComments(result.CommentUri);
                }));
        }
        Task.WaitAll(tasks.ToArray());
    }
    catch (Exception ex)
    {
        // should do some other logging here.
        // for now pass off exception to callback on UI
        throw ex;
    }
    return results;
}
public async Task<List<Comment>> GetComments(string uri)
{
    var results = new List<Comment>();
    try
    {
        var wfw = XNamespace.Get("http://wellformedweb.org/CommentAPI/");
        var media = XNamespace.Get("http://search.yahoo.com/mrss/");
        var dc = XNamespace.Get("http://purl.org/dc/elements/1.1/");
        var t = await WebHttpRequestAsync(uri);
        StringReader stringReader = new StringReader(t);
        using (var xmlReader = System.Xml.XmlReader.Create(stringReader))
        {
            var doc = System.Xml.Linq.XDocument.Load(xmlReader);
            results = (from e in doc.Element("rss").Element("channel").Elements("item")
                       select new Comment()
                       {
                           Description = e.Element("description").Value,
                           Published = Convert.ToDateTime(e.Element("pubDate").Value),
                           Url = e.Element("link").Value,
                           Creator = e.Element(dc + "creator").Value
                       }).ToList();
        }
    }
    catch (Exception ex)
    {
        // should do some other logging here.
        // for now pass off exception to callback on UI
        throw ex;
    }
    return results;
}
private static async Task<string> WebHttpRequestAsync(string url)
{
    //TODO: look into getting
    var request = WebRequest.Create(url);
    request.Method = "GET";
    var response = await request.GetResponseAsync();
    return ReadStreamFromResponse(response);
}

private static string ReadStreamFromResponse(WebResponse response)
{
    using (Stream responseStream = response.GetResponseStream())
    using (StreamReader sr = new StreamReader(responseStream))
    {
        string strContent = sr.ReadToEnd();
        return strContent;
    }
}
private List<string> GetCategoryElements(IEnumerable<XElement> categories)
{
    var listOfCategories = new List<string>();
    foreach (var category in categories)
    {
        listOfCategories.Add(category.Value);
    }
    return listOfCategories;
}
Updated code from the solution; I just added .Unwrap() on the enqueued task:
var tasks = new Queue<Task>();
foreach (var result in results)
{
    tasks.Enqueue(
        Task.Factory.StartNew(async () =>
        {
            result.Comments = await GetComments(result.CommentUri);
        }).Unwrap());
}
Task.WaitAll(tasks.ToArray());
It is waiting appropriately. The problem is that you are creating a Task which creates another Task: StartNew returns a Task<Task>, and you are only waiting on the outer Task, which completes quickly (before the inner Task is complete).
The questions will be:
Do you really want that inner task?
If yes, then you can use Task.Unwrap to get a proxy task that represents the completion of both the inner and outer Task and use that to Wait on.
If no, then you could remove the use of async/await in StartNew so that there is not an inner task (I think this would be prefered, it's not clear why you need the inner task).
Do you really need to do a synchronous Wait on an asynchronous Task? Read some of Stephen Cleary's blog: http://blog.stephencleary.com/2012/02/async-unit-tests-part-1-wrong-way.html
As an aside, if you are not using C# 5, then watch out for closing over the foreach variable result See
Has foreach's use of variables been changed in C# 5?, and
http://blogs.msdn.com/b/ericlippert/archive/2009/11/12/closing-over-the-loop-variable-considered-harmful.aspx)
In Microsoft.Bcl.Async we couldn't add any static methods to Task. However, you can find most of the methods on TaskEx, for example, TaskEx.WhenAll() does exist.
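Option 2 above (drop the inner task entirely) can be sketched like this; the names are stand-ins, and the simulated GetCommentsAsync just delays instead of hitting the network. Because GetComments already returns a Task, collecting those tasks directly and awaiting WhenAll (TaskEx.WhenAll on Microsoft.Bcl.Async) avoids the Task<Task> problem without Unwrap:

```csharp
using System;
using System.Linq;
using System.Threading.Tasks;

static class WhenAllDemo
{
    class Article
    {
        public string CommentUri;
        public string Comments;
    }

    // Stand-in for the real GetComments: it already returns a Task,
    // so there is no reason to wrap it in Task.Factory.StartNew.
    public static async Task<string> GetCommentsAsync(string uri)
    {
        await Task.Delay(10); // simulate network latency
        return "comments for " + uri;
    }

    static async Task Main()
    {
        var results = new[]
        {
            new Article { CommentUri = "a" },
            new Article { CommentUri = "b" },
        };

        // One task per article; WhenAll completes only after every
        // inner await has finished, so Comments is populated here.
        await Task.WhenAll(results.Select(async r =>
        {
            r.Comments = await GetCommentsAsync(r.CommentUri);
        }));

        Console.WriteLine(results[1].Comments); // prints "comments for b"
    }
}
```

Awaiting WhenAll (rather than blocking with Task.WaitAll) also sidesteps the synchronous-wait concern from Stephen Cleary's post linked above.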
