We have an object (XML or JSON) that we map to a DTO successfully, but inserting it into our database takes too long (5-7 minutes), so we switched to Parallel.ForEach. Eventually we noticed that some data is inserted incorrectly: for example, all items in a Category end up with the same name while their other properties are 100% correct, and in another case all the data in one category is identical, even though the provided JSON object doesn't contain that.
I admit it is fast now, taking less than a minute, but with wrong insertions. Have a look at the code below:
JSON
[
{
"CategoryId": 1,
"CategoryName": "Drinks",
"SortOrder": 1,
"Products": [
{
"ProductId": 100,
"ProductName": "Black Tea",
"SortOrder": 1,
"Price": 5,
"Choices": []
},
{
"ProductId": 101,
"ProductName": "Turkish Coffee",
"SortOrder": 2,
"Price": 7.5,
"Choices": []
},
{
"ProductId": 102,
"ProductName": "Green Tea",
"SortOrder": 3,
"Price": 6,
"Choices": []
},
{
"ProductId": 103,
"ProductName": "Café Latte Medium",
"SortOrder": 4,
"Price": 10,
"Choices": []
},
{
"ProductId": 104,
"ProductName": "Orange Juice",
"SortOrder": 5,
"Price": 11,
"Choices": []
},
{
"ProductId": 105,
"ProductName": "Mixed Berry Juice",
"SortOrder": 6,
"Price": 12.5,
"Choices": []
}
]
},
{
"CategoryId": 1,
"CategoryName": "Meals",
"SortOrder": 1,
"Products": [
{
"ProductId": 200,
"ProductName": "Breakfast Meal",
"SortOrder": 1,
"Price": 16,
"Choices": [
{
"ChoiceId": 3000,
"ChoiceName": "Strawberry Jam",
"SortOrder": 1,
"Price": 0
},
{
"ChoiceId": 3001,
"ChoiceName": "Apricot Jam",
"SortOrder": 2,
"Price": 0
},
{
"ChoiceId": 3002,
"ChoiceName": "Orange Jam",
"SortOrder": 3,
"Price": 0
},
{
"ChoiceId": 3003,
"ChoiceName": "Café Latte",
"SortOrder": 4,
"Price": 2
}
]
},
{
"ProductId": 201,
"ProductName": "Mixed Grill",
"SortOrder": 1,
"Price": 30,
"Choices": [
{
"ChoiceId": 3004,
"ChoiceName": "Moutabal",
"SortOrder": 1,
"Price": 0
},
{
"ChoiceId": 3005,
"ChoiceName": "Mineral Water",
"SortOrder": 2,
"Price": 0
},
{
"ChoiceId": 3006,
"ChoiceName": "French Fries",
"SortOrder": 2,
"Price": 0
},
{
"ChoiceId": 3007,
"ChoiceName": "Grilled Potatoes",
"SortOrder": 2,
"Price": 0
}
]
}
]
}
]
C# code
Parallel.ForEach(categories, (category) =>
{
    var newCreatedCategoryId = 0;
    using (var connection = new SqlConnection("CONNECTION_STRING_HERE"))
    {
        connection.Open();
        using (var command = new SqlCommand("SP_INSERT_INTO_CATEGORIES", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@P1", category.CategoryName);
            command.Parameters.AddWithValue("@P2", category.SortOrder);
            newCreatedCategoryId = int.Parse(command.ExecuteScalar().ToString());
        }
        connection.Close();
    }
    if (newCreatedCategoryId > 0)
    {
        Parallel.ForEach(category.Products, (product) =>
        {
            using (var connection = new SqlConnection("CONNECTION_STRING_HERE"))
            {
                connection.Open();
                using (var command = new SqlCommand("SP_INSERT_INTO_PRODUCTS", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.Parameters.AddWithValue("@P1", product.ProductName);
                    command.Parameters.AddWithValue("@P2", product.Price);
                    command.Parameters.AddWithValue("@P3", product.SortOrder);
                    command.Parameters.AddWithValue("@P4", newCreatedCategoryId);
                    command.ExecuteNonQuery();
                }
                connection.Close();
            }
        });
    }
});
I had a look here, but this is not our issue; we are already using SCOPE_IDENTITY() to get the last generated identity in the current scope of execution.
On the other hand, we are not allowed to use SqlBulkCopy to insert this amount of data, even with no TableLock.
It's the newCreatedCategoryId that is the problem. What is confusing me is why you call newCreatedCategoryId = int.Parse(command.ExecuteScalar().ToString()); again in the inner loop; if it's just the id of a category, it doesn't need to be incremented again.
Take a look at the edit below. You might also be better off turning the second Parallel.ForEach into a standard foreach; this is all working in parallel anyway. Lastly, Parallel.ForEach is not really suited to IO tasks; the correct pattern is async and await. That said, you could probably use an ActionBlock from TPL Dataflow to take advantage of the best of both worlds. Take a look at the Dataflow example in this question I answered: Downloading 1,000+ files fast?
Parallel.ForEach(categories, (category) =>
{
    var newCreatedCategoryId = 0;
    using (var connection = new SqlConnection("CONNECTION_STRING_HERE"))
    {
        connection.Open();
        using (var command = new SqlCommand("SP_INSERT_INTO_CATEGORIES", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@P1", category.CategoryName);
            command.Parameters.AddWithValue("@P2", category.SortOrder);
            newCreatedCategoryId = int.Parse(command.ExecuteScalar().ToString());
        }
        connection.Close();
    }
    if (newCreatedCategoryId > 0)
    {
        foreach (var product in category.Products)
        {
            using (var connection = new SqlConnection("CONNECTION_STRING_HERE"))
            {
                connection.Open();
                using (var command = new SqlCommand("SP_INSERT_INTO_PRODUCTS", connection))
                {
                    command.CommandType = CommandType.StoredProcedure;
                    command.Parameters.AddWithValue("@P1", product.ProductName);
                    command.Parameters.AddWithValue("@P2", product.Price);
                    command.Parameters.AddWithValue("@P3", product.SortOrder);
                    command.Parameters.AddWithValue("@P4", newCreatedCategoryId);
                    command.ExecuteNonQuery();
                }
                connection.Close();
            }
        }
    }
});
The objects that you are looping over are not thread-safe. You could add a lock object, but that would serialise the operation and defeat the purpose of the Parallel.ForEach. You need to change the Parallel.ForEach to a standard foreach loop.
Potential Pitfalls in Data and Task Parallelism
You change newCreatedCategoryId inside the Parallel.ForEach, which may cause incorrect data, because the queries won't run in order.
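For what it's worth, the ActionBlock idea mentioned in the answer can be sketched like this. This is only a minimal sketch, not the poster's actual code: the stored-procedure calls are replaced by a hypothetical in-memory store, and InsertCategoryAsync/InsertProductAsync are made-up stand-ins for the real database calls. The key point is that the new category id is a local variable of each block invocation, so products can never attach to the wrong category.

```csharp
// Categories are processed concurrently by the block, but each category's
// products are inserted sequentially with the id produced by THIS invocation.
using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Threading.Tasks;
using System.Threading.Tasks.Dataflow;

class Category { public string CategoryName; public string[] Products; }

class Demo
{
    static int _nextId;
    static readonly ConcurrentDictionary<int, (string Name, ConcurrentBag<string> Products)> Db
        = new ConcurrentDictionary<int, (string, ConcurrentBag<string>)>();

    static async Task Main()
    {
        var categories = new[]
        {
            new Category { CategoryName = "Drinks", Products = new[] { "Black Tea", "Green Tea" } },
            new Category { CategoryName = "Meals",  Products = new[] { "Breakfast Meal" } },
        };

        var block = new ActionBlock<Category>(async category =>
        {
            // Local to this invocation -- not shared across categories.
            int newCategoryId = await InsertCategoryAsync(category.CategoryName);
            foreach (var product in category.Products)
                await InsertProductAsync(newCategoryId, product);
        }, new ExecutionDataflowBlockOptions { MaxDegreeOfParallelism = 4 });

        foreach (var c in categories) block.Post(c);
        block.Complete();
        await block.Completion;

        foreach (var entry in Db)
            Console.WriteLine($"{entry.Key}: {entry.Value.Name} -> {string.Join(", ", entry.Value.Products)}");
    }

    // Hypothetical stand-in for SP_INSERT_INTO_CATEGORIES + SCOPE_IDENTITY().
    static Task<int> InsertCategoryAsync(string name)
    {
        int id = Interlocked.Increment(ref _nextId);
        Db[id] = (name, new ConcurrentBag<string>());
        return Task.FromResult(id);
    }

    // Hypothetical stand-in for SP_INSERT_INTO_PRODUCTS.
    static Task InsertProductAsync(int categoryId, string product)
    {
        Db[categoryId].Products.Add(product);
        return Task.CompletedTask;
    }
}
```

In real code each stand-in would open its own SqlConnection and await ExecuteScalarAsync/ExecuteNonQueryAsync, which is what makes this pattern suit IO-bound work better than Parallel.ForEach.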
Related
I have the below JSON object. I would like to know 2 points:
How to convert the below JSON object to a DataTable or DataSet in C# code.
After converting, how to reverse it back (convert the DataTable to a JSON object) in C# code.
Please provide me example code.
[
{
"transaction": {
"Version": "1.01",
"TranDtls": {
"TaxSch": "GST",
"SupTyp": "B2B",
"RegRev": "Y",
"EcmGstin": null,
"IgstOnIntra": "N"
},
"DocDtls": {
"Typ": "INV",
"No": "DAOC/007",
"Dt": "10/08/2020"
},
"SellerDtls": {
"Gstin": "29AAFCD5862R000",
"LglNm": "NIC company pvt ltd",
"TrdNm": "NIC Industries",
"Addr1": "5th block, kuvempu layout",
"Addr2": "kuvempu layout",
"Loc": "GANDHINAGAR",
"Pin": 560037,
"Stcd": "29",
"Ph": "9000000000",
"Em": "abc#gmail.com"
},
"BuyerDtls": {
"Gstin": "29AWGPV7107B1Z1",
"LglNm": "XYZ company pvt ltd",
"TrdNm": "XYZ Industries",
"Pos": "12",
"Addr1": "7th block, kuvempu layout",
"Addr2": "kuvempu layout",
"Loc": "GANDHINAGAR",
"Pin": 562160,
"Stcd": "29",
"Ph": "91111111111",
"Em": "xyz#yahoo.com"
},
"DispDtls": {
"Nm": "ABC company pvt ltd",
"Addr1": "7th block, kuvempu layout",
"Addr2": "kuvempu layout",
"Loc": "Banagalore",
"Pin": 562160,
"Stcd": "29"
},
"ShipDtls": {
"Gstin": "29AWGPV7107B1Z1",
"LglNm": "CBE company pvt ltd",
"TrdNm": "kuvempu layout",
"Addr1": "7th block, kuvempu layout",
"Addr2": "kuvempu layout",
"Loc": "Banagalore",
"Pin": 562160,
"Stcd": "29"
},
"ItemList": [
{
"SlNo": "1",
"PrdDesc": "Rice",
"IsServc": "N",
"HsnCd": "1001",
"Barcde": "123456",
"Qty": 100.345,
"FreeQty": 10,
"Unit": "BAG",
"UnitPrice": 99.545,
"TotAmt": 9988.84,
"Discount": 10,
"PreTaxVal": 1,
"AssAmt": 9978.84,
"GstRt": 12.0,
"IgstAmt": 1197.46,
"CgstAmt": 0,
"SgstAmt": 0,
"CesRt": 5,
"CesAmt": 498.94,
"CesNonAdvlAmt": 10,
"StateCesRt": 12,
"StateCesAmt": 1197.46,
"StateCesNonAdvlAmt": 5,
"OthChrg": 10,
"TotItemVal": 12897.7,
"OrdLineRef": "3256",
"OrgCntry": "AG",
"PrdSlNo": "12345",
"BchDtls": {
"Nm": "123456",
"ExpDt": "01/08/2020",
"WrDt": "01/09/2020"
},
"AttribDtls": [
{
"Nm": "Rice",
"Val": "10000"
}
]
}
],
"ValDtls": {
"AssVal": 9978.84,
"CgstVal": 0,
"SgstVal": 0,
"IgstVal": 1197.46,
"CesVal": 508.94,
"StCesVal": 1202.46,
"Discount": 10,
"OthChrg": 20,
"RndOffAmt": 0.3,
"TotInvVal": 12908,
"TotInvValFc": 12897.7
},
"PayDtls": {
"Nm": "ABCDE",
"AccDet": "5697389713210",
"Mode": "Cash",
"FininsBr": "SBIN11000",
"PayTerm": "100",
"PayInstr": "Gift",
"CrTrn": "test",
"DirDr": "test",
"CrDay": 100,
"PaidAmt": 10000,
"PaymtDue": 5000
},
"RefDtls": {
"InvRm": "TEST",
"DocPerdDtls": {
"InvStDt": "01/08/2020",
"InvEndDt": "01/09/2020"
},
"PrecDocDtls": [
{
"InvNo": "DOC/002",
"InvDt": "01/08/2020",
"OthRefNo": "123456"
}
],
"ContrDtls": [
{
"RecAdvRefr": "Doc/003",
"RecAdvDt": "01/08/2020",
"Tendrefr": "Abc001",
"Contrrefr": "Co123",
"Extrefr": "Yo456",
"Projrefr": "Doc-456",
"Porefr": "Doc-789",
"PoRefDt": "01/08/2020"
}
]
},
"AddlDocDtls": [
{
"Url": "https://einv-apisandbox.nic.in",
"Docs": "Test Doc",
"Info": "Document Test"
}
],
"ExpDtls": {
"ShipBNo": "A-248",
"ShipBDt": "01/08/2020",
"Port": "INABG1",
"RefClm": "N",
"ForCur": "AED",
"CntCode": "AE"
},
"EwbDtls": {
"TransId": "12AWGPV7107B1Z1",
"TransName": "XYZ EXPORTS",
"Distance": 100,
"TransDocNo": "DOC01",
"TransDocDt": "10/08/2020",
"VehNo": "ka123456",
"VehType": "R",
"TransMode": "1"
}
}
}
]
You can check it here How to create dataset from Object?
There are a lot of ways to instantiate an object from a JSON string.
If you can't use any third-party libraries such as Newtonsoft.Json, you can do it with JsonReaderWriterFactory:
// Requires: using System.IO; using System.Text; using System.Xml;
// using System.Runtime.Serialization.Json;
public static TEntity Create<TEntity>(string json)
{
    using (var memoryStream = new MemoryStream())
    {
        byte[] jsonBytes = Encoding.UTF8.GetBytes(json);
        memoryStream.Write(jsonBytes, 0, jsonBytes.Length);
        memoryStream.Seek(0, SeekOrigin.Begin);
        using (var jsonReader = JsonReaderWriterFactory.CreateJsonReader(
            memoryStream,
            Encoding.UTF8,
            XmlDictionaryReaderQuotas.Max,
            null))
        {
            var serializer = new DataContractJsonSerializer(typeof(TEntity));
            TEntity entity = (TEntity)serializer.ReadObject(jsonReader);
            return entity;
        }
    }
}
And vice versa:
public static string Create(object entity)
{
    var serializer = new DataContractJsonSerializer(entity.GetType());
    using (var stream = new MemoryStream())
    {
        using (var writer = JsonReaderWriterFactory.CreateJsonWriter(stream, Encoding.UTF8))
        {
            serializer.WriteObject(writer, entity);
        }
        return Encoding.UTF8.GetString(stream.ToArray());
    }
}
Then you will only need to work a little bit with C# to populate the required DataSet with your object instance.
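A quick round-trip sketch of the two helpers above. The Person type here is just a hypothetical example, not something from the question:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;
using System.Text;
using System.Xml;

[DataContract]
public class Person
{
    [DataMember] public string Name { get; set; }
    [DataMember] public int Age { get; set; }
}

public static class JsonHelper
{
    // string -> object, exactly as in the answer above
    public static TEntity Create<TEntity>(string json)
    {
        using (var memoryStream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
        using (var jsonReader = JsonReaderWriterFactory.CreateJsonReader(
            memoryStream, Encoding.UTF8, XmlDictionaryReaderQuotas.Max, null))
        {
            var serializer = new DataContractJsonSerializer(typeof(TEntity));
            return (TEntity)serializer.ReadObject(jsonReader);
        }
    }

    // object -> string
    public static string Create(object entity)
    {
        var serializer = new DataContractJsonSerializer(entity.GetType());
        using (var stream = new MemoryStream())
        {
            using (var writer = JsonReaderWriterFactory.CreateJsonWriter(stream, Encoding.UTF8))
            {
                serializer.WriteObject(writer, entity);
            }
            return Encoding.UTF8.GetString(stream.ToArray());
        }
    }
}

class RoundTripDemo
{
    static void Main()
    {
        var p = JsonHelper.Create<Person>("{\"Age\":30,\"Name\":\"Rita\"}");
        Console.WriteLine(p.Name + ", " + p.Age);   // Rita, 30
        Console.WriteLine(JsonHelper.Create(p));    // the same object serialized back to JSON
    }
}
```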
Here is my code
I'm using a Couchbase queue to enqueue my beacon information, and I'm trying to use a N1QL query for my GET method, but I'm having trouble getting all the information. I realized I'm only getting the first beacon entry, because result.Rows returns one element: an array of BeaconInfoN1ql. I want to iterate through that array and add each item to a list.
try
{
    var cluster = new Cluster(new ClientConfiguration());
    using (var bucket = cluster.OpenBucket("BeaconInfoN1ql"))
    {
        string query = "SELECT * FROM `BeaconInfoN1ql`";
        var queryRequest = new QueryRequest(query);
        var result = bucket.Query<dynamic>(queryRequest);
        foreach (var row in result.Rows)
        {
            int i = 0;
            var beacon = new Beacon()
            {
                SerialNumber = row.BeaconInfoN1ql[i].serialNumber,
                ReceivedDate = Convert.ToDateTime(row.BeaconInfoN1ql[i].receivedDate),
                ReceiverId = row.BeaconInfoN1ql[i].receiverId,
                Distance = Convert.ToDouble(row.BeaconInfoN1ql[i].distance),
                Rssi = Convert.ToInt32(row.BeaconInfoN1ql[i].rssi),
                NewDistance = Convert.ToDouble(row.BeaconInfoN1ql[i].newDistance),
                DistanceTesting = Convert.ToDouble(row.BeaconInfoN1ql[i].distanceTesting),
            };
            i++;
            _beaconsList.Add(beacon);
        }
    }
    return _beaconsList;
my result.Rows looks like this
result.Rows=
{{
"BeaconInfoN1ql": [
{
"distance": 2.2705747109792007,
"distanceTesting": 22,
"newDistance": 22,
"receivedDate": "0001-01-01T00:00:00",
"receiverId": "42008780c4b9b329",
"rssi": -73,
"serialNumber": "888"
},
{
"distance": 2.2705747109792007,
"distanceTesting": 22,
"newDistance": 22,
"receivedDate": "0001-01-01T00:00:00",
"receiverId": "42008780c4b9b329",
"rssi": -73,
"serialNumber": "888"
},
{
"distance": 2.2705747109792007,
"distanceTesting": 22,
"newDistance": 22,
"receivedDate": "0001-01-01T00:00:00",
"receiverId": "42008780c4b9b329",
"rssi": -73,
"serialNumber": "888"
},
{
"distance": 2.2705747109792007,
"distanceTesting": 22,
"newDistance": 22,
"receivedDate": "0001-01-01T00:00:00",
"receiverId": "42008780c4b9b329",
"rssi": -73,
"serialNumber": "888"
},
]
}}
I'm not sure how to make the second foreach/for loop iterate through all the keys.
For iterating JSON, I like to use dynamics. Here's an example:
var result = new Result()
{
    Rows = @"{
        'BeaconInfoN1ql': [
            {
                'distance': 2.2705747109792007,
                'distanceTesting': 22,
                'newDistance': 22,
                'receivedDate': '0001-01-01T00:00:00',
                'receiverId': '42008780c4b9b329',
                'rssi': -73,
                'serialNumber': '888'
            }
        ]
    }" //other entries omitted for brevity
};
dynamic parsedRows = JsonConvert.DeserializeObject(result.Rows);
foreach (var entry in parsedRows.BeaconInfoN1ql)
Debug.Write(entry.distance);
NOTE: I got rid of the double curly braces from your output in my example.
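Applied to the original loop, the fix is to iterate the BeaconInfoN1ql array itself rather than indexing it with a counter that resets on every row. A sketch, with the caveat that the Beacon type and property names are taken from the question and the live Couchbase result is replaced here by a raw JSON string parsed with Json.NET:

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

class Beacon
{
    public string SerialNumber { get; set; }
    public DateTime ReceivedDate { get; set; }
    public string ReceiverId { get; set; }
    public double Distance { get; set; }
    public int Rssi { get; set; }
}

class BeaconDemo
{
    static void Main()
    {
        // Stand-in for one row of result.Rows.
        string row = @"{ 'BeaconInfoN1ql': [
            { 'serialNumber': '888', 'receiverId': '42008780c4b9b329', 'distance': 2.27, 'rssi': -73, 'receivedDate': '0001-01-01T00:00:00' },
            { 'serialNumber': '888', 'receiverId': '42008780c4b9b329', 'distance': 2.27, 'rssi': -73, 'receivedDate': '0001-01-01T00:00:00' }
        ]}";

        var beacons = new List<Beacon>();
        dynamic parsed = JsonConvert.DeserializeObject(row);

        // Inner loop over every element of the array -- no manual index needed.
        foreach (var entry in parsed.BeaconInfoN1ql)
        {
            beacons.Add(new Beacon
            {
                SerialNumber = entry.serialNumber,
                ReceivedDate = Convert.ToDateTime((string)entry.receivedDate),
                ReceiverId = entry.receiverId,
                Distance = Convert.ToDouble(entry.distance),
                Rssi = Convert.ToInt32(entry.rssi),
            });
        }
        Console.WriteLine(beacons.Count); // 2 for this sample row
    }
}
```

In the real GET method this inner foreach would sit inside the existing `foreach (var row in result.Rows)` loop, iterating `row.BeaconInfoN1ql`.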
My first JSON is as follows
[{
"UserId": 4,
"FirstName": "rupesh",
"LastName": "Abc",
"Email": "abc#gmail.com",
"Gender": "Male"
}]
My Second JSON is as follows
[{
"AccountId": 2,
"AccountName": "rupeshinfo",
"AccountDomain": null,
"RoleId": 1,
"UserId": 4
}, {
"AccountId": 3,
"AccountName": "Rameshinfo",
"AccountDomain": null,
"RoleId": 2,
"UserId": 4
}]
the result must be
{
"UserDetails": [{
"UserId": 4,
"FirstName": "rupesh",
"LastName": "Abc",
"Email": "abc#gmail.com",
"Gender": "Male"
}],
"AccountDetails": [{
"AccountId": 2,
"AccountName": "rupeshinfo",
"AccountDomain": null,
"RoleId": 1,
"UserId": 4
}, {
"AccountId": 3,
"AccountName": "Rameshinfo",
"AccountDomain": null,
"RoleId": 2,
"UserId": 4
}]
}
If you don't want to mess with string manipulation, you can go with (and I recommend) dynamic objects:
var javaScriptSerializer = new JavaScriptSerializer();
var userDetails = javaScriptSerializer.DeserializeObject(json1);
var accountDetails = javaScriptSerializer.DeserializeObject(json2);
var resultJson = javaScriptSerializer.Serialize(new {UserDetails = userDetails, AccountDetails = accountDetails});
You can deserialize them into two objects, create a new anonymous type from those objects, and serialize it into the final JSON:
JavaScriptSerializer jsonSerializer = new JavaScriptSerializer();
var result = jsonSerializer.Serialize(new
{
    UserDetails = jsonSerializer.DeserializeObject(@"[{
        'UserId': 4,
        'FirstName': 'rupesh',
        'LastName': 'Abc',
        'Email': 'abc@gmail.com',
        'Gender': 'Male'
    }]"),
    AccountDetails = jsonSerializer.DeserializeObject(@"[{
        'AccountId': 2,
        'AccountName': 'rupeshinfo',
        'AccountDomain': null,
        'RoleId': 1,
        'UserId': 4
    }, {
        'AccountId': 3,
        'AccountName': 'Rameshinfo',
        'AccountDomain': null,
        'RoleId': 2,
        'UserId': 4
    }]")
});
Try this
var jsonStr = '{"UserDetails":[{"UserId": 4,"FirstName": "rupesh","LastName": "Abc","Email": "abc@gmail.com","Gender": "Male"}]}';
var obj = JSON.parse(jsonStr);
obj['AccountDetails'] = []; // the parsed object has no AccountDetails key yet, so create it before pushing
obj['AccountDetails'].push({"AccountId": 2,"AccountName": "rupeshinfo","AccountDomain": null,"RoleId": 1,"UserId": 4}, {"AccountId": 3,"AccountName": "Rameshinfo","AccountDomain": null,"RoleId": 2,"UserId": 4});
jsonStr = JSON.stringify(obj);
I'm building a dashboard using ASP.NET MVC, Angular.js, SQL Server, and FusionCharts. All my data for charting is stored in the database, and I'm retrieving it via a stored procedure. Now I need to convert the stored-procedure results to JSON/XML, the only formats FusionCharts supports. What would be the best method to convert this data:
Hour Input Output InTarget OutTarget
7 22314 18537 6500 4875
8 36395 29931 6500 4875
9 32661 28518 6500 4875
10 34895 29793 6500 4875
11 30300 26538 6500 4875
12 31011 26898 6500 4875
13 16363 13716 6500 4875
into this JSON?
{
"chart": {
"caption": "Input and Output",
"numberprefix": "$",
"plotgradientcolor": "",
"bgcolor": "FFFFFF",
"showalternatehgridcolor": "0",
"divlinecolor": "CCCCCC",
"showvalues": "0",
"showcanvasborder": "0",
"canvasborderalpha": "0",
"canvasbordercolor": "CCCCCC",
"canvasborderthickness": "1",
"yaxismaxvalue": "30000",
"captionpadding": "30",
"yaxisvaluespadding": "15",
"legendshadow": "0",
"legendborderalpha": "0",
"palettecolors": "#f8bd19,#008ee4,#33bdda,#e44a00,#6baa01,#583e78",
"showplotborder": "0",
"showborder": "0"
},
"categories": [
{
"category": [
{
"label": "7"
},
{
"label": "8"
},
{
"label": "9"
},
{
"label": "10"
},
{
"label": "11"
},
{
"label": "12"
},
{
"label": "13"
}
]
}
],
"dataset": [
{
"seriesname": "Input",
"data": [
{
"value": "22314"
},
{
"value": "36395"
},
{
"value": "32661"
},
{
"value": "34895"
},
{
"value": "30300"
},
{
"value": "31011"
},
{
"value": "16363"
}
]
},
{
"seriesname": "Output",
"data": [
{
"value": "18537"
},
{
"value": "29931"
},
{
"value": "28518"
},
{
"value": "29793"
},
{
"value": "26538"
},
{
"value": "26898"
},
{
"value": "13716"
}
]
},
{
"seriesname": "InTarget",
"renderas": "Line",
"data": [
{
"value": "6500"
},
{
"value": "6500"
},
{
"value": "6500"
},
{
"value": "6500"
},
{
"value": "6500"
},
{
"value": "6500"
},
{
"value": "6500"
}
]
},
{
"seriesname": "OutTarget",
"renderas": "Line",
"data": [
{
"value": "4875"
},
{
"value": "4875"
},
{
"value": "4875"
},
{
"value": "4875"
},
{
"value": "4875"
},
{
"value": "4875"
},
{
"value": "4875"
}
]
}
]
}
What I'm thinking of doing is:
load the stored-procedure results into a DataTable
put each column into a separate array
convert the arrays to JSON in the format above
Is this going to be the best approach, performance-wise?
EDIT:
public Series[] GetGraphData(string sp)
{
    var connection = ConfigurationManager.ConnectionStrings["EFDbContext"].ConnectionString;
    using (var da = new SqlDataAdapter("exec " + sp, connection))
    {
        var dt = new DataTable();
        da.Fill(dt);
        da.FillSchema(dt, SchemaType.Mapped);
        Series[] arrSeries = new Series[dt.Columns.Count];
        foreach (DataColumn dc in dt.Columns)
        {
            if (dc.Ordinal == 0)
            {
                //Category here
            }
            else
            {
                var strarr = dt.Rows.Cast<DataRow>().Select(row => row[dc.Ordinal]).ToList();
                Series s = new Series()
                {
                    seriesname = dc.ColumnName,
                    renderas = "Line",
                    data = strarr.Select(o => new SeriesValue { value = o.ToString() }).ToList()
                };
                arrSeries[dc.Ordinal] = s;
            }
        }
        return arrSeries;
    }
}
I would load all the data into a datatable as you said, then have a Series object:
class Series
{
    public string seriesname { get; set; }
    public string renderas { get; set; }
    public IList<SeriesValue> data { get; set; }
}
class SeriesValue
{
    public string value { get; set; }
}
and return an array of Series to the frontend, serialized as JSON. Then you have the dataset array already built, and you don't need to do any other processing on it.
I expect the performance bottleneck to be loading the data from the DB and sending it to the client; the actual conversion to JSON shouldn't matter in the grand scheme of things.
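To see the shape this produces, here is a small sketch serializing a Series array with Json.NET (the values are sample numbers from the question's table; swapping in JavaScriptSerializer or any other serializer would work the same way):

```csharp
using System;
using System.Collections.Generic;
using Newtonsoft.Json;

class Series
{
    public string seriesname { get; set; }
    public string renderas { get; set; }
    public IList<SeriesValue> data { get; set; }
}

class SeriesValue
{
    public string value { get; set; }
}

class FusionDemo
{
    static void Main()
    {
        var dataset = new[]
        {
            new Series
            {
                seriesname = "Input",
                data = new List<SeriesValue> { new SeriesValue { value = "22314" }, new SeriesValue { value = "36395" } }
            },
            new Series
            {
                seriesname = "InTarget",
                renderas = "Line",
                data = new List<SeriesValue> { new SeriesValue { value = "6500" }, new SeriesValue { value = "6500" } }
            }
        };

        // NullValueHandling keeps "renderas" out of series that don't set it,
        // matching the FusionCharts sample where only target series render as lines.
        string json = JsonConvert.SerializeObject(new { dataset }, Formatting.Indented,
            new JsonSerializerSettings { NullValueHandling = NullValueHandling.Ignore });
        Console.WriteLine(json);
    }
}
```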
I am trying to parse a schema and read an element from it, however I am getting an error.
This is my code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Data;
using System.IO;
using Newtonsoft.Json.Schema;

namespace ConsoleApplication1
{
    class Test
    {
        static void Main(string[] args)
        {
            string ingameschemaFilePath = @"C:\Users\Andrew\Documents\GitHub\SteamBot\Bin\Debug\ingameschema.txt";
            string dota2schemaFilePath = @"C:\Users\Andrew\Documents\GitHub\SteamBot\Bin\Debug\dota2schema.txt";
            string schemaFilePath = @"C:\Users\Andrew\Documents\GitHub\SteamBot\Bin\Debug\schema.txt";
            JsonSchema dota2schema = JsonSchema.Parse(File.ReadAllText(dota2schemaFilePath));
            Console.WriteLine(dota2schema.result.items.name);
            System.Console.WriteLine("Press any key to exit.");
            System.Console.ReadKey();
        }
    }
}
And this is the error I am getting:
Error 2 'Newtonsoft.Json.Schema.JsonSchema' does not contain a definition for 'result' and no extension method 'result' accepting a first argument of type 'Newtonsoft.Json.Schema.JsonSchema' could be found (are you missing a using directive or an assembly reference?)
I am trying to follow the sample here:
http://james.newtonking.com/projects/json/help/#
Samples -> JsonSchema -> Parse Json schema
And here is the start of the schema I am trying to read from:
{
"result": {
"status": 1,
"items_game_url": "http:\/\/media.steampowered.com\/apps\/570\/scripts\/items\/items_game.d8ab2f9911cea9d7f4bce1add62c7bb83a902322.txt",
"qualities": {
"normal": 0,
"genuine": 1,
"vintage": 2,
"unusual": 3,
"unique": 4,
"community": 5,
"developer": 6,
"selfmade": 7,
"customized": 8,
"strange": 9,
"completed": 10,
"haunted": 11,
"tournament": 12,
"favored": 13
},
"originNames": [
{
"origin": 0,
"name": "Timed Drop"
},
{
"origin": 1,
"name": "Achievement"
},
{
"origin": 2,
"name": "Purchased"
},
{
"origin": 3,
"name": "Traded"
},
{
"origin": 4,
"name": "Crafted"
},
{
"origin": 5,
"name": "Store Promotion"
},
{
"origin": 6,
"name": "Gifted"
},
{
"origin": 7,
"name": "Support Granted"
},
{
"origin": 8,
"name": "Found in Crate"
},
{
"origin": 9,
"name": "Earned"
},
{
"origin": 10,
"name": "Third-Party Promotion"
},
{
"origin": 11,
"name": "Wrapped Gift"
},
{
"origin": 12,
"name": "Halloween Drop"
},
{
"origin": 13,
"name": "Steam Purchase"
},
{
"origin": 14,
"name": "Foreign Item"
},
{
"origin": 15,
"name": "CD Key"
},
{
"origin": 16,
"name": "Collection Reward"
},
{
"origin": 17,
"name": "Preview Item"
},
{
"origin": 18,
"name": "Steam Workshop Contribution"
},
{
"origin": 19,
"name": "Periodic Score Reward"
},
{
"origin": 20,
"name": "Recycling"
},
{
"origin": 21,
"name": "Tournament Drop"
},
{
"origin": 22,
"name": "Passport Reward"
},
{
"origin": 23,
"name": "Tutorial Drop"
}
]
,
"items": [
{
"name": "Riki's Dagger",
"defindex": 0,
"item_class": "dota_item_wearable",
"item_type_name": "#DOTA_WearableType_Daggers",
"item_name": "#DOTA_Item_Rikis_Dagger",
"proper_name": false,
"item_quality": 0,
"image_inventory": null,
"min_ilevel": 1,
"max_ilevel": 1,
"image_url": "",
"image_url_large": "",
"capabilities": {
"can_craft_mark": true,
"can_be_restored": true,
"strange_parts": true,
"paintable_unusual": true,
"autograph": true
I thought I did pretty much exactly what the sample does. What have I done wrong? I have searched for similar problems but have not found the answer. Also, if you could let me know the correct way of getting the defindex of "Riki's Dagger" from the JSON schema, that would be great.
If you want to read an element from JSON:
Generate classes from your JSON using http://json2csharp.com/ or http://jsonclassgenerator.codeplex.com/ (I feel the latter is better).
Use Json.NET to deserialize your JSON into the root class:
var output = JsonConvert.DeserializeObject<YourRootClass>(jsonString);
Then just read the value you want.
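For the concrete case in the question, the approach looks like this. Note this is a sketch: the classes below model only the fields needed to reach defindex (the real schema has many more), and the JSON here is a trimmed sample, not the full schema file:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Newtonsoft.Json;

// Minimal classes modeling just the path result -> items -> name/defindex.
class Schema { public Result result { get; set; } }
class Result { public List<Item> items { get; set; } }
class Item
{
    public string name { get; set; }
    public int defindex { get; set; }
}

class SchemaDemo
{
    static void Main()
    {
        // Trimmed sample of the schema from the question.
        string json = @"{ ""result"": { ""status"": 1, ""items"": [
            { ""name"": ""Riki's Dagger"", ""defindex"": 0 }
        ] } }";

        var schema = JsonConvert.DeserializeObject<Schema>(json);
        var dagger = schema.result.items.First(i => i.name == "Riki's Dagger");
        Console.WriteLine(dagger.defindex); // 0 for this sample input
    }
}
```

The error in the question comes from using JsonSchema.Parse, which describes the *structure* of a JSON document; to read *values* you deserialize into your own classes as above (or load the text with JObject.Parse and index into it).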