Lambda can't parse input when triggered from Kinesis - C#

I have a C# AWS Lambda function which is triggered from AWS Kinesis, and it's failing before it even gets to my code. The error I am getting is:
Unexpected character encountered while parsing value: 1. Path 'Records[0].kinesis.approximateArrivalTimestamp', line 1, position 302.: JsonReaderException
at Newtonsoft.Json.JsonTextReader.ReadStringValue(ReadType readType)
at Newtonsoft.Json.JsonTextReader.ReadAsDateTime()
I have used the method signature given in the tutorial here, which looks like this:
[LambdaSerializer(typeof(Amazon.Lambda.Serialization.Json.JsonSerializer))]
public void HandleKinesisRecord(KinesisEvent kinesisEvent)
{
}
Is this the correct signature I should be using? It looks like there is a mismatch between the JSON that Kinesis is sending and what the Lambda serializer expects.
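One thing I've considered for debugging is sidestepping the built-in serializer entirely by taking the payload as a raw Stream and deserializing it myself. The sketch below is only an illustration: the POCO types are hand-written stand-ins, not the official Amazon.Lambda.KinesisEvents classes, and approximateArrivalTimestamp is mapped to a double because Kinesis sends it as an epoch-seconds number rather than a date string.

using System.IO;
using Newtonsoft.Json;

public class Function
{
    // Stream handlers don't need a [LambdaSerializer] attribute, so the raw JSON
    // can be inspected and deserialized manually.
    public void HandleKinesisRecord(Stream inputStream)
    {
        string json;
        using (var reader = new StreamReader(inputStream))
        {
            json = reader.ReadToEnd();
        }

        var payload = JsonConvert.DeserializeObject<KinesisPayload>(json);
        // ... process payload.Records ...
    }
}

// Hypothetical subset of the event shape, used only for this sketch.
public class KinesisPayload
{
    public KinesisRecord[] Records { get; set; }
}

public class KinesisRecord
{
    public KinesisData Kinesis { get; set; }
}

public class KinesisData
{
    // Epoch seconds, not an ISO date string; a double avoids the DateTime parsing error.
    public double ApproximateArrivalTimestamp { get; set; }
    public string PartitionKey { get; set; }
    public string Data { get; set; }
}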


Full type name followed by index, what does it mean?

I am trying to understand the logging output of dotnet run for an ASP.NET Core project. There are many places showing a full type name followed by what appears to be the indexer syntax.
This page explains how array types are represented, but in that case there is no index.
Console.WriteLine(new string[100]); shows: System.String[]
This is an actual dotnet run output:
info: Microsoft.AspNetCore.DataProtection.KeyManagement.XmlKeyManager[58]
How should I interpret this output? What is 58?
Is it a general C# string representation? What code construct would output something like that?
In the example provided, there are three components:
info, which is the log level.
Microsoft.AspNetCore.DataProtection.KeyManagement.XmlKeyManager, which is the log category.
58, which is the log event ID.
ASP.NET Core uses ILogger and ILogger<T> for logging, using calls such as:
logger.LogInformation(...);
The example log message you've shown is from a console provider, which has its own rules about how to format the message. By default, this starts with a header line of level: category[eventID], as I've shown.
As a crude example, you might imagine the following code being used to generate the final message:
var logLevel = "info";
var logCategory = "Microsoft.AspNetCore.DataProtection.KeyManagement.XmlKeyManager";
var logEventId = 58;
Console.WriteLine($"{logLevel}: {logCategory}[{logEventId}]");
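For completeness, here's a rough sketch of how application code typically produces such an entry with ILogger. The category string and the event ID 58 are just taken from the example above, creating the logger by name is a shortcut for the ILogger<T> injection the framework actually uses, and it assumes the console logging provider package is referenced.

using System;
using Microsoft.Extensions.Logging;

using var loggerFactory = LoggerFactory.Create(builder => builder.AddConsole());

ILogger logger = loggerFactory.CreateLogger(
    "Microsoft.AspNetCore.DataProtection.KeyManagement.XmlKeyManager");

// The EventId becomes the [58] part of the "level: category[eventId]" header line.
logger.LogInformation(new EventId(58), "Creating key {KeyId}", Guid.NewGuid());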

Identify Cause of Serialization Error Coming From Inside Library/NuGet Package

Sometimes when calling the Mastercard MATCH API via the NuGet package MasterCard-Match, I will get a JSON deserialization exception:
Unexpected character encountered while parsing value: <. Path '',
line 0, position 0.
The exception comes from MasterCard.Core, with an inner exception from Newtonsoft.Json; both have the same message.
It looks like I'm receiving HTML (the '<' at line 0, position 0) and the library is trying to deserialize it as JSON. My guess is that the MasterCard API is sending back an HTML error page instead of a JSON error, but I can't step into the function call to "see" the response it's getting before the exception is thrown.
As per the documentation, I create a request map with the provided data and call TerminationInquiryRequest.Create(map); this is the line where the exception is thrown. That call is a black box: I can't step into it, it just throws the exception.
try
{
    RequestMap requestMap = CreateRequestMap();

    // This line throws the exception
    TerminationInquiryRequest apiResponse = TerminationInquiryRequest.Create(requestMap);
}
catch (Exception e)
{
    // Exception handling
}
I've made over 11,000 calls using this library and only 32 have failed with this error, but of course I hear about it every time it happens.
Is there some way of debugging libraries that I'm not aware of, or a way to view the response the library is getting from the API?
I already have some logic to wait and retry the call if it fails.
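For context, the wait-and-retry wrapper looks roughly like this (a simplified sketch: the retry count and delay are arbitrary placeholders, and CreateRequestMap and the MasterCard types are the ones from the snippet above):

TerminationInquiryRequest apiResponse = null;
const int maxAttempts = 3;

for (int attempt = 1; attempt <= maxAttempts; attempt++)
{
    try
    {
        RequestMap requestMap = CreateRequestMap();
        apiResponse = TerminationInquiryRequest.Create(requestMap);
        break; // success
    }
    catch (Exception e) when (attempt < maxAttempts)
    {
        // Log the full exception chain (including the inner Newtonsoft.Json one)
        // so the failing calls are at least traceable, then back off and retry.
        Console.WriteLine($"Attempt {attempt} failed: {e}");
        System.Threading.Thread.Sleep(TimeSpan.FromSeconds(5 * attempt));
    }
}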

TensorflowSharp - TFException: Attempting to use uninitialized value

I'm attempting to import a model created in Keras/TensorFlow and use it for inference in a Unity project.
I have successfully imported the model and validated it by printing the names of the input and output nodes in the graph. However, when I try to get the output value from the runner, I get this exception:
TFException: Attempting to use uninitialized value action_W
[[Node: action_W/read = IdentityT=DT_FLOAT, _class=["loc:#action_W"], _device="/job:localhost/replica:0/task:0/cpu:0"]]
TensorFlow.TFStatus.CheckMaybeRaise (TensorFlow.TFStatus incomingStatus, System.Boolean last) (at <6ed6db22f8874deba74ffe3e566039be>:0)
TensorFlow.TFSession.Run (TensorFlow.TFOutput[] inputs, TensorFlow.TFTensor[] inputValues, TensorFlow.TFOutput[] outputs, TensorFlow.TFOperation[] targetOpers, TensorFlow.TFBuffer runMetadata, TensorFlow.TFBuffer runOptions, TensorFlow.TFStatus status) (at <6ed6db22f8874deba74ffe3e566039be>:0)
TensorFlow.TFSession+Runner.Run (TensorFlow.TFStatus status) (at <6ed6db22f8874deba74ffe3e566039be>:0)
RecordArbitraryData.ModelPredict (System.Single[,] input) (at Assets/Scripts/Spells/RecordArbitraryData.cs:230)
RecordArbitraryData.FixedUpdate () (at Assets/Scripts/Spells/RecordArbitraryData.cs:95)
Here are the two functions I use. InstantiateModel is called in Start() in my Unity script, and ModelPredict is called when the user passes an input to the script.
void InstantiateModel(){
    string model_name = "simple_as_binary";
    // Instantiate graph
    graphModel = Resources.Load(model_name) as TextAsset;
    graph = new TFGraph();
    graph.Import(graphModel.bytes);
    session = new TFSession(graph);
}

void ModelPredict(float[,] input){
    using (graph) {
        using (session) {
            // Assign input tensors
            var runner = session.GetRunner();
            runner.AddInput(graph[input_node_name][0], input);

            // Calculate and access the output of the graph
            runner.Fetch(graph[output_node_name][0]);
            Debug.Log("Output node name: " + graph[output_node_name].Name);
            float[,] recurrent_tensor = runner.Run()[0].GetValue() as float[,];
            //var results = runner.Run();
            //Debug.Log("Prediction: " + results);
        }
    }
}
Any help appreciated - TensorflowSharp is very new to me.
I was able to figure out most of my problems. I'm now at the point where my model predicts in Unity, but it only ever predicts the first of four classes. My guess was that the weights weren't being initialized correctly from the checkpoint files. Edit: the actual cause was that my input values weren't being normalized before being passed to the neural network.
Preface: Mozilla Firefox works best for displaying TensorBoard; it took me a long time to realize that Google Chrome was making my graph invisible (TensorBoard is how I figured out which nodes to use for input and output).
First Issue: I was simply renaming a .pb file to a .bytes file. That is incorrect, because the model's weights come from the checkpoint files and have to be baked into the nodes held in the .pb file; skipping that step is what caused the uninitialized variables. Those variables were only used for training and were removed after running the freeze_graph function.
Second Issue: I was passing the file literally named 'checkpoint' to freeze_graph, which threw an error. I renamed the checkpoint to 'test' and used that instead; when referencing the checkpoint files I had to use 'test.ckpt'. I assume the function knows to grab the three checkpoint files automatically based on the .ckpt extension, because 'test' without '.ckpt' did not work.
Third Issue: when using the freeze_graph function, I needed to export the .pb file from Keras/TF with text=False. I tested both True and False; True threw an error about "bad wiring".
Fourth Issue: TensorBoard was very difficult to use without any organization. Using tf.name_scope helped a lot, not only with visualization but also with making sure I was referencing the correct nodes in TensorFlowSharp. In Keras I found it helpful to put the final Dense layer and the Activation into their own scopes so I could find the correct output node; the rest of my network went into a 'body' scope, and the sole input layer into an 'input' scope. The name_scope function prepends 'scopename/' to the node name. I don't think it's strictly necessary, but it helped me.
Fifth Issue: the version of TensorFlowSharp released as a Unity package is not up to date. This caused an issue with the Keras placeholder 'keras_training_phase'. In Keras you pass this along with the input, like [0] + input. I tried to do the same by creating a new TFTensor(bool), but I got an 'inaccessible due to its protection level' error, because the implicit conversion between bool and TFTensor is not accessible in the TensorFlowSharp version bundled with Unity. To fix this I had to use a function from a Stack Overflow answer in which the .bytes file is read in, the placeholder for keras_training_phase is found, and it is swapped for a bool constant set to false. This worked for me because my model was pre-trained in Python, so it may not be a great fix for someone trying to train and test a model. A way to remove this node with the freeze_graph function would be really useful.
Hope someone finds this useful!

Unexpected Character when Parsing .NET-Encoded JSON with JavaScript

I'm developing an ASP.NET Web Pages app, and I'm seeing a problem where the JavaScript JSON.parse() method is unable to parse JSON output by the .NET Json.Encode() method. My specific problem is with the ampersand (&) character (Unicode U+0026).
For example, executing this code:
object SomeObject = new { SomeProperty = "A&B" };
Response.Write(Json.Encode(SomeObject));
In my .cshtml file results in the following content in the response:
{"SomeProperty":"A\u0026B"}
Which leads to a SyntaxError: Unexpected token u in my JavaScript:
function SomeCallback(aRequest) {
if (aRequest.status === 200) {
var lResponseJSON = JSON.parse(aRequest.Response); // Error on this line
}
}
How can I get the .NET JSON encoding and the JS JSON decoding to play nice when special characters are involved?
(Preferably, short of manually going through the stringified JSON before it's parsed to replace the unicode encodings)
EDIT: Might be worth mentioning that using Json.Write(SomeObject, Response.Output) instead of Response.Write(Json.Encode(SomeObject)) has no effect on the JSON output.
Your problem has to be somewhat different from what you are showing:
When I run this code through my console:
var k = JSON.parse('{"SomeProperty":"A\u0026B"}')
console.log(k);
// Object {SomeProperty: "A&B"}
everything behaves correctly.
Though it looks strange, this is valid JSON:
{"SomeProperty":"A\u0026B"}

Unsupported Media Type error when using json-patch in Ramone

Update: I downloaded the Ramone project source, added it to my project, and ran the application again with the debugger attached. The error is shown below:
public MediaTypeWriterRegistration GetWriter(Type t, MediaType mediaType)
{
    ...
    CodecEntry entry = SelectWriters(t, mediaType).FirstOrDefault(); // <-- this line throws the error
    ...
}
The error occurs in CodecManager.cs. I am trying to figure out why it does not recognize the json-patch media type. Could it be because the writer is not being registered correctly? I am looking into it, but if you figure out the problem, please let me know; since you are the author of the library, it will be easier for you to find the issue than for me to go through all the code files and methods. Thanks!
I was excited to learn that the Ramone library supports json-patch operations, but when I tried it I got the following error:
415 - Unsupported Media Type
This is the same error I get when I use RestSharp. I thought maybe RestSharp simply doesn't support json-patch, so I decided to try the Ramone library instead, but I still get the same error. The endpoint itself is fine: the same request works from Postman, but when I make it programmatically in C# it fails with the unsupported media type error. Here is my code:
var authenticator = new TokenProvider("gfdsfdsfdsafdsafsadfsdrj5o97jgvegh", "sadfdsafdsafdsfgfdhgfhehrerhgJ");
JsonPatchDocument patch = new JsonPatchDocument<MetaData>();
patch.Add("/Resident2", "Boyle");
//patch.Replace("/Resident", "Boyle");
RSession = RamoneConfiguration.NewSession(new Uri("https://api.box.com"));
RSession.DefaultRequestMediaType = MediaType.ApplicationJson;
RSession.DefaultResponseMediaType = MediaType.ApplicationJson;
Ramone.Request ramonerequest = RSession.Bind("/2.0/files/323433290812/metadata");
ramonerequest.Header("Authorization", "Bearer " + authenticator.GetAccessToken(code).AccessToken);
//var ramoneresponse = ramonerequest.Patch(patch); //results in error: 405 - Method Not Allowed
var ramoneresponse = ramonerequest.Put(patch); //results in error: 415 - Unsupported Media Type
var responsebody = ramoneresponse.Body;
Endpoint information is available here: http://developers.box.com/metadata-api
I used the json-patch section of the following article as a reference:
http://elfisk.dk/Ramone/Documentation/Ramone.pdf
By the way, I tried the Patch() method (as shown in the referenced article), but that resulted in "Method Not Allowed", so I used the Put() method, which gets further but then fails on the json-patch operation.
Any help, guidance, tips in resolving this problem will be highly appreciated. Thanks much in advance.
-Sham
The Box documentation says you should use PUT (which is quite a bit funny). The server even tells you that it doesn't support the HTTP PATCH method (405 Method Not Allowed) - so PUT it must be.
Now, you tell Ramone to use JSON all the time (RSession.DefaultRequestMediaType = MediaType.ApplicationJson), so you end up PUT'ing a JSON document to Box - where you should be PUT'ing a JSON-Patch document.
Drop the "RSession.DefaultRequestMediaType = MediaType.ApplicationJson" statement and send the patch document as JSON-Patch with the use of: ramonerequest.ContentType("application/json-patch+json").Put(...).
