I have the method below in a Web API that pulls data.
I am building an app that will have a ListView whose default data comes from this method.
I want this data to change each time any user starts the app.
How can I generate random data with this method? There are about 4 different categories.
public IEnumerable<ArticlesDto> Find(string category)
{
    IEnumerable<ArticlesDto> objArticles = null;
    var context = new ArticlesContext();
    objArticles = (from j in context.Information
                   where j.Category == category
                   select new ArticlesDto()
                   {
                       Id = j.Id,
                       Headlines = j.Headlines,
                       Url = j.Url,
                       Category = j.Category,
                       Summary = j.Summary
                   });
    return objArticles;
}
Example: the first time I use the app, I see a list of about 20 rows (default data).
The second time I use it, I see a different list of 20 rows, different from the last time I used the app.
Why don't you try AutoFixture? This framework can generate random data every time your Web API call is made. Here is the GitHub link:
https://github.com/AutoFixture
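For example, a minimal sketch using AutoFixture against the ArticlesDto shape from the question (the generated values are purely synthetic placeholder data, not rows from your database):

using AutoFixture;

// Hypothetical sketch: returns 20 ArticlesDto objects filled with random values.
// This bypasses the database entirely, so it is only useful for demo/placeholder data.
public IEnumerable<ArticlesDto> FindRandom(string category)
{
    var fixture = new Fixture();
    return fixture.Build<ArticlesDto>()
                  .With(a => a.Category, category) // keep the requested category
                  .CreateMany(20);
}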
Just order by a random value and then take as many as you like:
objArticles = context.Information
    .Where(i => i.Category == category)
    // Guid.NewGuid() translates to NEWID() in SQL, so the rows come back in a
    // random order on every call; Random.Next() cannot be translated by Entity Framework.
    .OrderBy(i => Guid.NewGuid())
    .Select(i => new ArticlesDto
    {
        Id = i.Id,
        Headlines = i.Headlines,
        Url = i.Url,
        Category = i.Category,
        Summary = i.Summary
    })
    .Take(20);
Hey, I created a database with questions. Every question has a unique ID, and a random generator picks a number; the question with that ID is chosen and appears on the website.
So far so good, but I want every question to appear only once. Sadly, every user request is independent, so the previously picked values aren't remembered.
Is there an easy way to change my code so that when the form is submitted, my website still recognises the previously shown questions?
var questions = from m in _context.Question
                select m;
var listids = new List<int>();
Random r = new Random();
int rndRowIndex = r.Next(1, 10);
if (!string.IsNullOrEmpty(QuestionLayer))
{
    do
    {
        questions = questions.Where(x => x.ID == rndRowIndex);
        //questions = questions.Where(x => x.Layer == QuestionLayer);
    } while (listids.Contains(rndRowIndex));
    listids.Add(rndRowIndex);
}
The quick answer about the random number generator alone is no. However, you could keep track of questions answered or displayed by marking them in the database. Then, when a random number is generated, check the question associated with that number. If it has already been answered/displayed, either generate another number, or start from the number you drew and walk forward through the questions until you find the next unanswered one. When a question is chosen for display, mark it in the database as having been displayed.
Once all questions have been answered, mark all questions as unanswered again.
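A minimal sketch of that idea, assuming an EF context and a hypothetical Displayed flag on the Question entity (both the flag and the property names are assumptions, not from the original code):

// Assumed: Question has int ID and bool Displayed columns.
var remaining = _context.Question.Where(q => !q.Displayed).ToList();

if (remaining.Count == 0)
{
    // Every question has been shown once; reset the flags and start over.
    foreach (var q in _context.Question)
        q.Displayed = false;
    _context.SaveChanges();
    remaining = _context.Question.ToList();
}

// Pick one of the not-yet-displayed questions at random and mark it.
var rnd = new Random();
var question = remaining[rnd.Next(remaining.Count)];
question.Displayed = true;
_context.SaveChanges();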
Storage in the database is a good choice. If you don’t want to modify the database, you can use the session to store the randomly generated values each time.
For more details, refer to the following code.
First, enable the session in your Startup.cs:
In the ConfigureServices method, add services.AddSession();
In the Configure method, add app.UseSession();
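A minimal sketch of that Startup.cs wiring (the AddDistributedMemoryCache call is assumed here as the in-memory backing store the session middleware needs; the rest of the pipeline is illustrative):

using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        services.AddDistributedMemoryCache(); // in-memory backing store for the session
        services.AddSession();
        services.AddControllersWithViews();
    }

    public void Configure(IApplicationBuilder app)
    {
        app.UseRouting();
        app.UseSession(); // must run before the endpoints that read/write the session
        app.UseEndpoints(endpoints => endpoints.MapDefaultControllerRoute());
    }
}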
Then, in your controller:
//retrieve the list of already-used ids from the session
var str = HttpContext.Session.GetString("mykey");
List<int> listids = str == null ? new List<int>() : JsonConvert.DeserializeObject<List<int>>(str);
var questions = from m in _context.Question
                select m;
//If every question has already occurred, clear the stored list so we don't loop forever.
if (listids.Count == questions.Count())
{
    listids = new List<int>();
}
if (!string.IsNullOrEmpty(QuestionLayer))
{
    Random r = new Random();
    int rndRowIndex = r.Next(1, 10);
    while (listids.Contains(rndRowIndex))
    {
        rndRowIndex = r.Next(1, 10);
    }
    questions = questions.Where(x => x.ID == rndRowIndex);
    listids.Add(rndRowIndex);
    //store the updated list back into the session
    HttpContext.Session.SetString("mykey", JsonConvert.SerializeObject(listids));
}
I have more than 1000 customers and invoices, and I am trying to fetch all of them into a drop-down list.
The documentation on the QBO site suggests that we need to use pagination if we want to load all the customers in a grid, but what I want is to load all the customers and invoices in a drop-down list.
I am getting the following exception when I try to fetch more than 1000 customers and invoices:
Validation Exception was thrown.
Details: QueryValidationError: value 100000 is too large. Max allowed value is 1000.
I am trying to fetch all the customers using the following code
public static List<Customer> GetAllQBOCustomers(ServiceContext context)
{
    return Helper.FindAll<Customer>(context, new Customer(), 1, 100000);
}
I wrote the code below and solved my issue:
1. First, I get the count of all the customers.
2. Then I fetch the customers in chunks, with a chunk size of 1000.
3. Create a list for the customers.
4. Define three integer variables for counting.
5. Use a do-while loop to page through the results.
6. Add all the fetched customers to the main customer list.
string strQuery = "Select Count(*) From Customer";
string custCount = qboAccess.GetCutomerCount(qboInz.QboServiceContext, strQuery);
List<qboData.Customer> customers = new List<Customer>();
int maxSize = 0;
int position = 1;
int count = Convert.ToInt32(custCount);
do
{
    var custList = qboAccess.GetAllQBOEntityRecords(qboInz.QboServiceContext, new Customer(), position, 1000);
    customers.AddRange(custList);
    maxSize += custList.Count();
    position += 1000;
} while (count > maxSize);
The straightforward answer is to loop enough times to get the records you need:
public static List<Customer> GetAllQBOCustomers(ServiceContext context)
{
    var list = new List<Customer>();
    // QBO start positions are 1-based: 1, 1001, 2001, ...
    for (int startPosition = 1; startPosition <= 10000; startPosition += 1000)
    {
        var results = Helper.FindAll<Customer>(context, new Customer(), startPosition, 1000);
        list.AddRange(results);
    }
    return list;
}
Or if you want to try to do it in parallel (and the API allows concurrent connections):
public static List<Customer> GetAllQBOCustomers(ServiceContext context)
{
    var bag = new ConcurrentBag<Customer>();
    Parallel.ForEach(Enumerable.Range(0, 10), i =>
    {
        // Start positions 1, 1001, 2001, ...; ConcurrentBag has no AddRange, so add items one at a time.
        var results = Helper.FindAll<Customer>(context, new Customer(), i * 1000 + 1, 1000);
        foreach (var customer in results)
            bag.Add(customer);
    });
    return bag.ToList();
}
Since the series of calls is likely to be expensive, I suggest you cache the results.
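A minimal caching sketch, assuming Microsoft.Extensions.Caching.Memory is available (the cache key and the 10-minute lifetime are arbitrary choices, not from the original answer):

using Microsoft.Extensions.Caching.Memory;

private static readonly MemoryCache Cache = new MemoryCache(new MemoryCacheOptions());

public static List<Customer> GetAllQBOCustomersCached(ServiceContext context)
{
    // Reuse the previously downloaded list for 10 minutes instead of paging through QBO again.
    return Cache.GetOrCreate("qbo-all-customers", entry =>
    {
        entry.AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10);
        return GetAllQBOCustomers(context);
    });
}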
You can't download those records all at once. That is what the error is telling you - very clearly. There's no magic way to avoid the server's rules.
However, I really think you should not download them all at once anyway. A drop-down list is not a good way to display that amount of data to users. Consider the user experience - would you want to scroll through a list of thousands of customers to try and find the one you want? Or would it be easier to start typing part of the name and have it pop up a short list of possible matches to choose from?
A more user-friendly way to implement this would be to use an auto-complete box instead of a drop-down list; after the user has typed a few characters, it can use AJAX to search the API for customers whose names or IDs contain those characters. Then you'll only need to return a small number of records each time, and the user will not be stuck scrolling for 10 minutes just to find a customer at the bottom of a list of 10,000 records.
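As an illustration only, a hypothetical controller action that could back such an auto-complete box; it filters a customer list that has been loaded and cached elsewhere (the field and property names here are assumptions):

// Hypothetical search endpoint for the auto-complete box.
[HttpGet("api/customers/search")]
public IActionResult SearchCustomers(string term)
{
    if (string.IsNullOrWhiteSpace(term) || term.Length < 3)
        return Ok(Array.Empty<object>()); // wait until a few characters have been typed

    // _allCustomers: the full customer list, loaded and cached elsewhere (assumption).
    var matches = _allCustomers
        .Where(c => c.DisplayName != null &&
                    c.DisplayName.IndexOf(term, StringComparison.OrdinalIgnoreCase) >= 0)
        .Take(20) // keep the suggestion list short
        .Select(c => new { c.Id, c.DisplayName });

    return Ok(matches);
}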
I have a simple class which holds a primary key; I don't know what type it will be before the app runs, as I'm getting the data from COM. It will either be an int or a string.
I basically just need to fill up my toUpdateList and toAddList. This was working fine below when there weren't too many records to play around with. However, the mongoDBList now returns around 65k records, everything has turned very slow, and it's taking 15+ minutes to resolve toUpdateList.
I'm pretty new to C#, so I'm likely missing something.
I basically just need to compare one list to another and see if the RecordRevision is higher in the toUpdateList. The toAddList is pretty simple: if the record doesn't exist, it needs to be added.
Thanks for looking, I appreciate it!
class KeyRevision
{
    public dynamic RecordRevision;
    public dynamic PrimaryKey;
}
List<KeyRevision> potentialUpdateList = new List<KeyRevision>();
List<KeyRevision> mongoDBList = new List<KeyRevision>();
List<KeyRevision> toUpdateList = new List<KeyRevision>();
List<KeyRevision> toAddList = new List<KeyRevision>();
var sql = env.ExecuteSQL(sqlQuery);
sql.First();
// Loop over them and add to array
do
{
    if (sql.RecordCount > 0)
    {
        //Console.WriteLine(sql.GetPropertyValue(primaryKey).ToString() + " : " + sql.RecordRevision);
        var o = new KeyRevision();
        o.PrimaryKey = sql.GetPropertyValue(primaryKey);
        o.RecordRevision = sql.RecordRevision;
        potentialUpdateList.Add(o);
    }
    sql.Next();
} while (!sql.EOF);
// Ask mongo for docs
var docs = collection1.Find(_ => true).Project("{RecordRevision: 1}").ToList();
// Get them into our type
mongoDBList = docs.ConvertAll(x => new KeyRevision()
{
    PrimaryKey = x.GetValue("_id"),
    RecordRevision = x.GetValue("RecordRevision")
});
// Finds which records we need to update
toUpdateList = potentialUpdateList.Where(x =>
mongoDBList.Any(y => y.PrimaryKey == x.PrimaryKey && y.RecordRevision < x.RecordRevision)).ToList();
// Finds the records we need to add
toAddList = potentialUpdateList.Where(x =>
mongoDBList.FindIndex(y => y.PrimaryKey == x.PrimaryKey) < 0).ToList();
Console.WriteLine($"{toUpdateList.Count} need to be updated");
Console.WriteLine($"{toAddList.Count} need to be updated");
I have this query that was recently changed to allow searches using lists. However, the logic doesn't seem correct. My initial search logic was as follows:
data = data.Where(u => u.Location.Contains(FilterInput.RepositoryName)).ToList();
This worked for individual inputs and the logic made sense: in the data result, check whether the Location field contains the input value.
However, in order to handle inputs that are lists, I had to change it to the code at the bottom, which checks whether the input list contains the Location field.
The database outputs data as follows:
Output = {arhde, brhje, ckio}
That means my list input is only a small section of what the database contains:
FilterInput.RepositoryName = {a, b, c}
data = (from item in dbContext.Documents
        join id in initialData
            on item.Id equals id.DocumentId
        select new DocumentsListViewModel
        {
            Id = item.Id,
            Name = item.Name,
            ApplicationName = item.ApplicationName,
            ApplicationSecretKey = item.ApplicationSecretKey,
            Link = item.Link,
            Location = item.Location,
            FileType = item.FileType,
            CreatedOn = item.CreatedOn
        }).ToList();

if (FilterInput.RepositoryName.Count > 0)
{
    data = data.Where(u => FilterInput.RepositoryName.Contains(u.Location)).ToList();
}
I don't know if it's possible to change this logic to use the first approach but accommodate lists as well?
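For comparison, a minimal sketch of the first logic extended to a list input, i.e. checking whether the Location field contains any of the entries in the list (this only illustrates the difference between the two directions of Contains):

// For each row, keep it if its Location contains at least one of the input fragments,
// e.g. Location "arhde" matches the input "a".
if (FilterInput.RepositoryName.Count > 0)
{
    data = data
        .Where(u => FilterInput.RepositoryName.Any(r => u.Location.Contains(r)))
        .ToList();
}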
I've used NEST for Elasticsearch for a while now, and up until now I've used the regular ElasticClient.Index(...) function, but now I want to index many items in one bulk operation.
I found the IndexMany(...) function, but I must be doing something wrong, because nothing is added to the Elasticsearch database as it is with the regular Index(...) function.
Does anyone have any idea?
Thanks in advance!
I found the problem. I had to specify the index name in the call to IndexMany:
var res = ElasticClient.CreateIndex("pages", i => i.Mappings(m => m.Map<ESPageViewModel>(mm => mm.AutoMap())));

var page = new ESPageViewModel
{
    Id = dbPage.Id,
    PageId = dbPage.PageId,
    Name = dbPage.Name,
    Options = pageTags,
    CustomerCategoryId = saveTagOptions.CustomerCategoryId,
    Link = dbPage.Link,
    Price = dbPage.Price
};

var pages = new List<ESPageViewModel>() { page };
var res2 = ElasticClient.IndexMany<ESPageViewModel>(pages, "pages");
This works as expected. Guess I could specify a default index name in the configuration to avoid specifying the index for the IndexMany call.
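For reference, a small sketch of setting that default index on the connection settings (the URL is a placeholder):

using Nest;

var settings = new ConnectionSettings(new Uri("http://localhost:9200"))
    .DefaultIndex("pages"); // used whenever a call doesn't name an index explicitly

var elasticClient = new ElasticClient(settings);

// Now the index argument can be omitted:
elasticClient.IndexMany(pages);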
If you are using C#, you should create a list of the objects that you want to insert and then call the IndexMany function.
Example :
List<Business> businessList = new List<Business>();

#region Fill the business list
...............................
#endregion

if (businessList.Count == 1000) // the size of the bulk
{
    EsClient.IndexMany<Business>(businessList, IndexName);
    businessList.Clear();
}
And at the end, check again:
if (businessList.Count > 0)
{
    EsClient.IndexMany<Business>(businessList, IndexName);
}