I'm using Azure Mobile Services with a C# client. I have a table of "Scores" that have a Facebook Id for each score. What I need to do is pass in an array of friends for a user and return all scores in that list of friends.
So I tried this on the client:
return _client.GetTable<Score>().WithParameters(new Dictionary<string, string>
{
{ "Level", level.ToString() },
//aggregate our string[] to a comma-delimited string
{ "Friends", friends.Aggregate(new StringBuilder(), (b, s) => b.Append(s).Append(',')).ToString() }
}).ToListAsync();
Which is weird: I only have the option to pass in strings for custom parameters.
So I did this on the server's read:
function read(query, user, request) {
query.where({ Level: request.parameters.Level, UserId: request.parameters.Friends.split(',') });
request.execute();
}
It doesn't seem like a comma-delimited list is going to work. I get this error on the server:
Error in script '/table/Score.read.js'. Error: Unsupported literal value chucknorris,bobloblaw,
NOTE: I passed chucknorris and bobloblaw as Facebook ids for a test.
Is there another way to make this work? The "Level" value filters just fine if I take out the string delimited stuff.
Your usage of the mssql object definitely works, but you can also use a function in the query.where call, with the function returning the condition you want:
function read(query, user, request) {
query.where(function(level, userIds) {
return this.Level === level && this.UserId in userIds;
}, request.parameters.Level, request.parameters.Friends.split(','));
request.execute();
}
Notice that the in operator in this function doesn't behave exactly like the in operator in JavaScript. Instead, it's used precisely to create an 'or' grouping of equality statements, which is what you were building "by hand" to pass to the SQL object.
Got it working with this server-side script:
function read(query, user, request) {
var innerSql = '';
var friends = request.parameters.Friends.split(',');
var parameters = new Array(friends.length + 1);
parameters[0] = request.parameters.Level;
for (var i = 0; i < friends.length; i++) {
if (i !== 0) {
innerSql += ' or ';
}
innerSql += 'UserId = ?';
parameters[i + 1] = friends[i];
}
mssql.query('select * from Score where Level=? and (' + innerSql + ')', parameters, {
success: function (results) {
request.respond(statusCodes.OK, results);
}
});
}
I'll keep this answer open for a few days if someone has a cleaner solution.
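For anyone after that cleaner variant: the placeholder bookkeeping in the loop above can be pulled into a small helper. This is just a sketch (buildOrClause is a made-up name, not part of the Mobile Services API):

```javascript
// Build "UserId = ? or UserId = ?" plus the matching parameter array
// for an arbitrary column and list of values (hypothetical helper).
function buildOrClause(column, values) {
    var placeholders = values.map(function () {
        return column + ' = ?';
    });
    return {
        sql: '(' + placeholders.join(' or ') + ')',
        parameters: values.slice() // copy, so callers can prepend Level etc.
    };
}

// Usage sketch mirroring the script above:
// var clause = buildOrClause('UserId', request.parameters.Friends.split(','));
// mssql.query('select * from Score where Level=? and ' + clause.sql,
//             [request.parameters.Level].concat(clause.parameters), { ... });
```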
Related
I have a Cosmos DB stored procedure to which I am passing a list of comma-separated IDs. I need to pass those IDs to an IN query. When I pass one value for the parameter it works fine, but not with more than one value.
It would be great if anyone could help here.
below is the code of the stored procedure:
function getData(ids) {
var context = getContext();
var coll = context.getCollection();
var link = coll.getSelfLink();
var response = context.getResponse();
var query = {query: "SELECT * FROM c where c.vin IN (@ids)", parameters:
[{name: "@ids", value: ids}]};
var requestOptions = {
pageSize: 500
};
var run = coll.queryDocuments(link, query, requestOptions, callback);
function callback(err, docs) {
if (err) throw err;
if (!docs || !docs.length) response.setBody(null);
else {
response.setBody(JSON.stringify(docs));
}
}
if (!run) throw new Error('Unable to retrieve the requested information.');
}
For arrays, you should use the ARRAY_CONTAINS function:
var query = {
query: "SELECT * FROM c where ARRAY_CONTAINS(#ids, c.vin)",
parameters: [{name: "#ids", value: ids}]
};
Also, it is possible that, as stated in this doc, your @ids array is being sent as a string:
When defining a stored procedure in Azure portal, input parameters are always sent as a string to the stored procedure. Even if you pass an array of strings as an input, the array is converted to string and sent to the stored procedure. To work around this, you can define a function within your stored procedure to parse the string as an array
So you might need to parse it before querying:
function getData(ids) {
var arr = JSON.parse(ids);
}
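Since the portal may hand the parameter over either as a JSON-style array string or as a plain comma-separated list, a defensive parse covers both cases (parseIds is a hypothetical helper, not part of the Cosmos DB server SDK):

```javascript
// Accept '["a","b"]' (JSON), 'a,b' (comma-separated) or an actual array,
// and always return a plain array of strings (hypothetical helper).
function parseIds(ids) {
    if (Array.isArray(ids)) return ids;        // already an array
    try {
        var parsed = JSON.parse(ids);          // portal sent a JSON string
        return Array.isArray(parsed) ? parsed : [String(parsed)];
    } catch (e) {
        return String(ids).split(',');         // fall back to comma-splitting
    }
}
```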
Related:
How can I pass array as a sql query param for cosmos DB query
https://github.com/Azure/azure-cosmosdb-node/issues/156
This is how you can do it, inside the stored procedure:
1. Parse your one parameter into an array using the split function.
2. Loop through the array and:
a) build the parameter name/value pair and push it into the parameter array used by the query later;
b) use the parameter name to build a string for use inside the parentheses of the IN statement.
3. Build the query definition and pass it to the collection.
Example
This is how the value of the parameter looks: "abc,def,ghi,jkl"
If you are going to use this, replace "stringProperty" with the name of the property you are querying against.
// SAMPLE STORED PROCEDURE
function spArrayTest(arrayParameter) {
var collection = getContext().getCollection();
var stringArray = arrayParameter.split(",");
var qParams = [];
var qIn = "";
for(var i=0; i<stringArray.length; i++){
var nm = '@p'+ i; // parameter name
qParams.push({name: nm, value: stringArray[i]});
qIn += ( nm +','); // parameter name for query
}
qIn = qIn.substring(0,qIn.length-1); // remove last comma
// qIn only contains a list of the names in qParams
var qDef = 'SELECT * from documents d where d.stringProperty in ( ' + qIn + ' )';
console.log(qParams[0].name);
// Query Definition to be passed into "queryDocuments" function
var q = {
query: qDef,
parameters: qParams
};
// Query documents
var isAccepted = collection.queryDocuments(
collection.getSelfLink(),
q,
function (err, feed, options) {
if (err) throw err;
// Check the feed and if empty, set the body to 'no docs found',
// else return all documents from feed
if (!feed || !feed.length) {
var response = getContext().getResponse();
response.setBody('no docs found with a stringProperty in ' + arrayParameter);
}
else {
var response = getContext().getResponse();
response.setBody(feed);
}
});
if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
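The name/placeholder bookkeeping above is easy to unit-test if you pull it into its own function. This is only a sketch: buildInQuery is a made-up name, and "stringProperty" remains a placeholder for your real property:

```javascript
// Turn a list of values into a parameterized IN clause: parameter names
// @p0, @p1, ... plus the matching {name, value} objects for the query.
function buildInQuery(property, values) {
    var qParams = [];
    var names = [];
    for (var i = 0; i < values.length; i++) {
        var nm = '@p' + i;
        names.push(nm);
        qParams.push({ name: nm, value: values[i] });
    }
    return {
        query: 'SELECT * from documents d where d.' + property +
               ' in ( ' + names.join(',') + ' )',
        parameters: qParams
    };
}

// Usage sketch: collection.queryDocuments(selfLink,
//   buildInQuery('stringProperty', arrayParameter.split(',')), callback);
```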
Please refer to my sample JS code; it works for me.
function sample(ids) {
var collection = getContext().getCollection();
var query = 'SELECT * FROM c where c.id IN ('+ ids +')'
console.log(query);
var isAccepted = collection.queryDocuments(
collection.getSelfLink(),
query,
function (err, feed, options) {
if (err) throw err;
if (!feed || !feed.length) getContext().getResponse().setBody('no docs found');
else {
for(var i = 0;i<feed.length;i++){
var doc = feed[i];
doc.name = 'a';
collection.replaceDocument(doc._self,doc,function(err) {
if (err) throw err;
});
}
getContext().getResponse().setBody(JSON.stringify("success"));
}
});
if (!isAccepted) throw new Error('The query was not accepted by the server.');
}
Parameter : '1','2','3'
Hope it helps you.
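Note that with this approach the parameter has to arrive already quoted, as '1','2','3'. A tiny helper on the calling side can build that string from a plain array. This is a sketch only: no escaping is done, so it is only safe for ids you control (a properly parameterized query, as in the other answers, avoids that caveat entirely):

```javascript
// Turn ['1','2','3'] into the string "'1','2','3'" expected by the
// concatenated IN (...) clause above. No quote-escaping is performed.
function quoteIds(ids) {
    return ids.map(function (id) {
        return "'" + id + "'";
    }).join(',');
}
```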
I have a stored procedure which gives me a document count (count.js on GitHub). I have partitioned my collection. Because of this, I now have to pass the partition key in as an option to run the stored procedure.
Can I, and how should I, enable cross-partition queries in the stored procedure (i.e., collection(EnableCrossPartitionQuery = true)) so that I don't have to specify the partition key?
There is no way to do fan-out stored procedure execution in DocumentDB. They run against a single partition. I ran into this dilemma when trying to switch to partitioned collections and had to make some adjustments. Here are some options:
Return a 1 for every record and sum/count them client-side
Rerun the stored procedure for each unique partition key. In my case, this was not as bad as it sounds since the partition key is a tenantID and I only have a dozen of those and only expect a few hundred max.
I'm not sure about this one since I haven't tried it with partitioned collections, but each query now returns the resource usage of the collection in the x-ms-resource-usage header. That header has a documentsSize sub-header. You could use that divided by the average size of your documents to get an approximate count. There may even be a count record in that header information by now.
Also, there is an x-ms-item-count header, but I'm not sure how that behaves. If you send a query for all the records in the entire partitioned collection and set the max-item-count to 1, you'll only get back one record and it shouldn't cost you a lot in RUs, but I don't know what that header will contain. Does it return 1 in that case? Or does it return the total number of documents that all the pages of the query would eventually return if you bothered to request every page? A quick experiment should confirm this.
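As a rough illustration of the documentsSize estimate mentioned above (the numbers here are made up, and the helper name is hypothetical):

```javascript
// Approximate document count from the x-ms-resource-usage header's
// documentsSize value (in KB) and an assumed average document size.
function estimateCount(documentsSizeKb, avgDocSizeKb) {
    return Math.round(documentsSizeKb / avgDocSizeKb);
}

// e.g. a collection reporting 10240 KB of documents, averaging ~2 KB each:
// estimateCount(10240, 2) -> 5120
```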
Below you can find some example code that should allow you to read all records across partitions. The magic is inside the doForAll function, and at the top you can see how it is called.
// SAMPLE STORED PROCEDURE
function sample(prefix) {
var share = { counter: 0, hasEntityName : 0, isXXX: 0, partitions: {}, prefix };
doForAll({
filter: function limiter(record){
if (record && record.entityName === 'XXX') return true;
else return false;
},
callback: function handleRecord(record) {
//Keep track of this partition...
let partitionKey = record.partitionKey;
if (share.partitions[partitionKey])
share.partitions[partitionKey]++;
else
share.partitions[partitionKey] = 1;
//update some counters...
share.counter++;
if (record.entityName !== undefined) share.hasEntityName++;
if (record.entityName === 'XXX') share.isXXX++;
},
finaly: function whenAllIsDone() {
console.log("counter = " + share.counter + ". ");
console.log("has entity name: "+ share.hasEntityName+ ". ")
console.log("is XXX: " + share.isXXX+ ". ")
var parts = Object.getOwnPropertyNames(share.partitions)
console.log("partition keys: " + parts.length + " ...");
getContext()
.getResponse()
.setBody(share);
}
});
//The magic function...
//also see: https://azure.github.io/azure-cosmosdb-js-server/Collection.html
function doForAll(task, ctoken) {
if (!task) throw "Expected one parameter of type: { filter?: (rec?)=>boolean, callback?: (rec?) => void, finaly?: () => void }";
//Note:
//the "__" symbol is an alias for var collection = getContext().getCollection(); = aliased by __
var result = getContext()
.getCollection()
.chain()
.filter(task.filter || function (rec) { return true; })
.map(task.callback || function (rec) { return undefined; })
.value({ continuation: ctoken }, function afterBatchCallback (err, feed, options) {
if (err) throw err;
if (options.continuation)
doForAll(task, options.continuation);
else if (task.finaly)
task.finaly();
});
if (!result.isAccepted)
throw "catastrophic failure";
}
}
PS: it may help to know what the data used in this example looks like.
This is an example of such a document:
{
"id": "123",
"partitionKey": "PART_1",
"entityName": "EXAMPLE_ENTITY",
"veryInterestingInfo": "The 'id' property is also the collections id, the 'partitionKey' property happens to be the collections partition key, and all the records in this collection have a 'entityName' property which contains a (non-unique) string"
}
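The continuation-token recursion in doForAll can be simulated outside Cosmos DB with an in-memory page source, which makes the control flow easier to follow. Everything here (pages, fetchPage) is mock scaffolding, not the server SDK, and the finaly spelling mirrors the original script:

```javascript
// Mock paged storage: three "pages" of records with a partitionKey.
var pages = [
    [{ partitionKey: 'PART_1', entityName: 'XXX' }],
    [{ partitionKey: 'PART_2', entityName: 'XXX' },
     { partitionKey: 'PART_1', entityName: 'OTHER' }],
    [{ partitionKey: 'PART_2', entityName: 'XXX' }]
];

// fetchPage plays the role of chain().value(...): it hands back one page
// plus a continuation token (next page index) until the pages run out.
function fetchPage(ctoken, callback) {
    var i = ctoken || 0;
    callback(pages[i], i + 1 < pages.length ? i + 1 : null);
}

// Same shape as doForAll above: filter, per-record callback, final callback.
function doForAll(task, ctoken) {
    fetchPage(ctoken, function (feed, continuation) {
        feed.filter(task.filter).forEach(task.callback);
        if (continuation !== null) doForAll(task, continuation);
        else task.finaly();
    });
}

var counter = 0;
doForAll({
    filter: function (r) { return r.entityName === 'XXX'; },
    callback: function () { counter++; },
    finaly: function () { /* counter now holds the cross-page total */ }
});
// counter ends up at 3: the 'XXX' records found across all three pages
```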
MVC 4 seems to be having trouble parsing the object I am sending it.
Using jQuery's ajax function, I send the data using POST request. I receive it in the Request object, it appears like this in the Request.Form:
{Name=Test&Groups%5b0%5d%5bName%5d=GroupName1&Groups%5b0%5d%5bCount%5d=123&Groups%5b1%5d%5bName%5d=GroupName2&Groups%5b1%5d%5bCount%5d=123&ID=bee4c411-f06c-43c6-815f-8002df4f2779}
//formatted for readability
Name=Test &
Groups[0][Name]=GroupName1 &
Groups[0][Count]=123 &
Groups[1][Name]=GroupName2 &
Groups[1][Count]=123 &
ID=bee4c411-f06c-43c6-815f-8002df4f2779
The Name and ID values are parsed just fine, but the Groups array is not... I do get an IEnumerable, and it contains the correct number of groups but the values within are null.
I've read this and this and I can't seem to find what I did wrong...
What am I missing?
The MVC Action looks like this:
public ActionResult UpdateGroups(GroupsListRequest req)
{
[...] //handle the request
}
and GroupsListRequest looks like this:
public class GroupsListRequest
{
public string Name { get; set; }
public string ID { get; set; }
public IEnumerable<GroupRequest> Groups { get; set; }
}
Finally, GroupRequest looks like this:
public class GroupRequest
{
public string Name { get; set; }
public int Count { get; set; }
}
Try this: remove the square brackets around the Name and Count fields in the request.
Name:Test &
Groups[0].Name:GroupName1 &
Groups[0].Count:123 &
Groups[1].Name:GroupName2 &
Groups[1].Count:123 &
ID:bee4c411-f06c-43c6-815f-8002df4f2779
Your input name should look like this:
<input type="text" name="Groups[0].Name" value="George" />
So, as @sangramparmar and @stephenmuecke mentioned, the problem was with the format of the objects in the array: they were surrounded by an extra pair of brackets.
Several circumstances made this problematic to fix:
I am using jQuery to send the request to the server, and the param function is the one causing the problem by serializing the array incorrectly (at least as far as C#'s MVC is concerned)
Each request to the server needs to have a (session-constant) key added to its data prior to sending the request. This is performed automatically through a global ajaxSetup
For some reason, MVC failed to read the ajax request's data when I set the contentType to application/json (at least, failed to read it in the way I was expecting... which, if I've learned anything ever, was probably misdirected in some way)
The solution I ended up using was to build my own stringify function to parse the data object to a string, like so:
stringify: function (obj, prefix) {
var objString = '';
prefix = prefix || '';
if(typeof(obj) === 'object') {
if (Array.isArray(obj)) {
_.each(obj, function (item, index) {
objString = objString + utilityMethods.stringify(item, prefix + '%5b' + index + '%5d');
});
} else {
if (prefix != undefined && prefix.length !== 0) {
prefix = prefix + '.';
}
_.each(_.keys(obj), function (key) {
objString = objString + utilityMethods.stringify(obj[key], prefix + key);
});
}
} else {
objString = objString + '&' + prefix + '=' + obj;
}
return objString;
}
Then, in global ajaxSetup, I added a beforeSend to add the needed key, like so:
beforeSend: function (jqXHR, settings) {
var userKeyString = 'userKey=' + [... fetch the user key...];
if (settings.type.toUpperCase() === 'GET') {
settings.url = settings.url + '&' + userKeyString;
} else {
if (settings.data != undefined) {
settings.data = userKeyString + '&' + settings.data;
}
}
},
Now, I do end up with an extra ampersand in each request. I'll fix that eventually... maybe.
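As a sanity check, here is the same stringify logic as a standalone function (underscore's _.each and the utilityMethods reference replaced with plain JavaScript so it runs on its own), applied to a Groups payload like the one from the question:

```javascript
// Standalone version of the stringify sketch above: arrays keep
// URL-encoded bracket indices (%5b / %5d), object properties use dots,
// producing the Groups[0].Name-style keys MVC's binder expects.
function stringify(obj, prefix) {
    var objString = '';
    prefix = prefix || '';
    if (typeof obj === 'object') {
        if (Array.isArray(obj)) {
            obj.forEach(function (item, index) {
                objString += stringify(item, prefix + '%5b' + index + '%5d');
            });
        } else {
            if (prefix.length !== 0) prefix += '.';
            Object.keys(obj).forEach(function (key) {
                objString += stringify(obj[key], prefix + key);
            });
        }
    } else {
        objString += '&' + prefix + '=' + obj;
    }
    return objString;
}

var out = stringify({ Name: 'Test', Groups: [{ Name: 'GroupName1', Count: 123 }] });
// out: '&Name=Test&Groups%5b0%5d.Name=GroupName1&Groups%5b0%5d.Count=123'
```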
I'm trying to retrieve the Risks field from the tabs in TFS; however, when I print all the Fields, I can't see Risks.
I've tried accessing it directly via WorkItem.Fields["FieldName"], but no luck.
Any ideas?
You can use WIQL queries to get the values of all fields. Here is a list of all work item field indexes. Below is a sample showing how to get all work items and all fields for a particular project:
using Microsoft.TeamFoundation.WorkItemTracking.Client;
Query query = new Query(
workItemStore,
"select * from issue where System.TeamProject = #project",
new Dictionary<string, string>() { { "project", project.Name } }
);
var workItemCollection = query.RunQuery();
foreach(var workItem in workItemCollection)
{
/*Get work item properties you are interested in*/
foreach(var field in workItem.Fields)
{
/*Get field value*/
info += String.Format("Field name: {0} Value: {1}\n", field.Name, field.Value);
}
}
I am a bit late, I guess, but since it might still help somebody, I'm going to post this anyway, even though you haven't specified whether you're on the frontend or the backend.
tl;dr: try to omit the fields parameter in the request.
Background: I wanted to provide more workitem-details in the pull requests details-view, so I created a userscript for TamperMonkey. That means I don't have "direct" access to TFS, since I am only accessing the frontend via JavaScript.
Like you, I also noticed that TFS doesn't output all fields. To solve that, I then modified the TFS ajax request with jQuery to omit the fields-parameter. Then TFS started returning all existing fields for the work item.
I found the info in the TFS documentation for work-items
fields (string)
A comma-separated list of up to 100 fields to get with each work item.
If not specified, all fields are returned.
In case that actually is your use-case, I am also providing the script that I wrote to modify the ajax request:
// by Joel Richard -> http://stackoverflow.com/a/26849194/4524280
function parseParams(str) {
return str.split('&').reduce(function (params, param) {
var paramSplit = param.split('=').map(function (value) {
return decodeURIComponent(value.replace('+', ' '));
});
params[paramSplit[0]] = paramSplit[1];
return params;
}, {});
}
$.ajaxPrefilter(function( options, originalOptions, jqXHR ) {
// Modify ajax request to return all fields... definitely not a hack :D
if(options && options.url && options.url.indexOf('_apis/wit/workItems') >= 0) {
var parsedData = parseParams(options.data);
delete parsedData.fields;
options.data = $.param(parsedData);
}
});
$(document).ajaxComplete(function(event, request, settings) {
// trigger after ajax is complete to get values
if(settings && settings.url && settings.url.indexOf('_apis/wit/workItems') >= 0 && request.responseJSON) {
var workItemsData = request.responseJSON.value;
// -> workItemsData.fields contains all existing fields
}
});
Just for the protocol: I don't think anyone should use $.ajaxPrefilter in "normal" use-cases, but in this case, I didn't have many options available.
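As a quick check of parseParams in isolation (copied here so it runs standalone; the query-string values are made up):

```javascript
// Same helper as in the userscript above: split a query string into
// key/value pairs, decoding '+' and percent-escapes along the way.
function parseParams(str) {
    return str.split('&').reduce(function (params, param) {
        var paramSplit = param.split('=').map(function (value) {
            return decodeURIComponent(value.replace('+', ' '));
        });
        params[paramSplit[0]] = paramSplit[1];
        return params;
    }, {});
}

var parsed = parseParams('ids=1%2C2%2C3&fields=System.Title&q=hello+world');
// parsed.ids is '1,2,3'; parsed.q is 'hello world'
```

Deleting a key from the resulting object and re-serializing with $.param is exactly what the prefilter above does to drop the fields parameter.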
I'm attempting to implement the DataTables jQuery plugin in my ASP.NET MVC site. I used a slight variation of the code at this LINK to implement the server-side processing, and it's working like it's supposed to. However, I didn't realize until trying it out that the server-side filtering only returns results when a column contains the single word or the exact sequence of words being searched for. This differs from the client-side version, in which the global search filters on any column, for any word, in any order, truly filtering down the results.
For example, I have two columns in the table: ID and Description. If one of the values for ID is "A1B2C3" and one of the values for Description is "Capturing the sequence of events prior to initialization," if I search on the client-side by typing in "sequence initial," it will return the record. However, using server-side, if I type in "sequence initial," nothing will be returned. If I type in "sequence of," then the record will be returned, which tells me that the search is being performed by the entire search term and looking for a record that contains the entire phrase. This isn't the "filter-down" search I was hoping for.
I'm not well-versed in LINQ, so I don't know if what I'm trying to do is possible, but here's the FilterResult method that returns the search:
public IQueryable<IDS> FilterResult(string search, List<IDS> dtResult, List<string> columnFilters)
{
IQueryable<IDS> results = dtResult.AsQueryable();
results = results.Where(p => (search == null || p.ID != null && p.ID.ToLower().Contains(search.ToLower())
|| p.Description != null && p.Description.ToLower().Contains(search.ToLower())));
return results;
}
And here's my DataHandler method in the controller:
[HttpPost]
public JsonResult DataHandler(DTParameters param, FilterViewModel filterModel)
{
try
{
var dtSource = new List<IDS>();
dtSource = db.DBIDS.ToList();
List<string> columnSearch = new List<string>();
foreach (var column in param.Columns)
{
columnSearch.Add(column.Search.Value);
}
List<IDS> currentData = new ResultSet().GetCurrentDataSet(filterModel.currentSearch, db.DBIDS.ToList());
List<IDS> data = new ResultSet().GetResult(param.Search.Value, param.SortOrder, param.Start, param.Length, currentData, columnSearch);
int count = new ResultSet().Count(param.Search.Value, currentData, columnSearch);
DTResult<IDS> result = new DTResult<IDS>
{
draw = param.Draw,
data = data,
recordsFiltered = count,
recordsTotal = count
};
return Json(result, JsonRequestBehavior.AllowGet);
}
catch (Exception ex)
{
return Json(new { error = ex.Message }, JsonRequestBehavior.AllowGet);
}
}
I realize that my DataHandler is using the same data set for each call, which I believe is part of the problem. I attempted to capture the search term in a variable called newSearch, then assign it to a variable called oldSearch, and use the oldSearch as custom data (or HTTP variable) that's being sent to the server. My thought was that I would perform the search on the database using the oldSearch term to return a record set. I would then use that record set with the new search term to return the results I'm after. However, I'm not sure how to change the LINQ query so that it will search similarly to the client-side search.
Here's the DataTables code where I'm attempting to capture the oldSearch and send it to the server:
$("#myTable").dataTable({
ajax: {
type: "POST",
url: "datahandler",
contentType: "application/json; charset=utf-8",
data: function (data) { data.currentSearch = oldSearch; return data = JSON.stringify(data); }
},
columns: columnArray,
lengthChange: false,
order: [1, "asc"],
paging: true,
pageLength: 10,
processing: true,
searching: true,
serverSide: true
});
$("#myTable").dataTable().on("preXhr.dt", function (e, settings, data) {
newSearch = $("#dTable").DataTable().search();
data.currentSearch = oldSearch;
});
$("#dTable").dataTable().on("xhr.dt", function () {
oldSearch = newSearch;
});
Is this possible with LINQ? For reference, when I attempted to send the search term to the server, I also attempted to change the LINQ query by using the existing query and adding on to it. The add-on splits each word in the description into a list and splits the search term into a list, then uses .Any for the comparison:
results = results.Where(p => (search == null || p.ID != null && p.ID.ToLower().Contains(search.ToLower())
|| p.Description != null && (p.Description.ToLower().Contains(search.ToLower()) || p.Description.ToLower().Split(' ').ToList().Any(x => search.ToLower().Split(' ').ToList().Contains(x)))));
This doesn't work because the search is looking for each item in the list specifically. There's no filter, so any record that contains any of the words in the search list will be returned.
I've seen some posts online that indicate that people typically do a column-based search, but I'd prefer to stick to the one global search box due to limited real estate on the page where the DataTable resides.
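For what it's worth, the "filter-down" behaviour described above amounts to splitting the search box value into terms and requiring every term to match somewhere in the record: AND across terms, OR across columns. A sketch of that predicate (in JavaScript for brevity; the field names mirror the question, and the LINQ version would chain one .Where per term in the same way):

```javascript
// True when every whitespace-separated term in `search` appears in at
// least one of the record's fields (case-insensitive substring match).
function matchesAllTerms(record, search) {
    if (!search) return true;
    var fields = [record.ID, record.Description].map(function (f) {
        return (f || '').toLowerCase();
    });
    return search.toLowerCase().split(/\s+/).filter(Boolean).every(function (term) {
        return fields.some(function (f) { return f.indexOf(term) >= 0; });
    });
}

var rec = { ID: 'A1B2C3',
            Description: 'Capturing the sequence of events prior to initialization' };
// matchesAllTerms(rec, 'sequence initial') -> true  (both terms match)
// matchesAllTerms(rec, 'sequence missing') -> false (one term matches nothing)
```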