Create REST controller & classes from proto files - C#

With gRPC on .NET Core I can define the interface of my service using proto files.
I need to expose my service as REST too, and would like to define the service once, using proto, instead of writing it all again by hand.
Is it possible to create REST endpoints (a controller and request/response classes) from the proto files when using gRPC on .NET Core?
[Update]
For the REST endpoints I want to use JSON - I just want to generate the controller and request/response classes from the proto files.
For the request/response classes it might be possible to reuse the types generated for the gRPC client, but it would be great if I could also generate a REST controller from the proto file.

Short answer: yes.
Longer answer: yes, but you'd have to write some custom code to read the proto files and then create the controllers in memory on startup. There's not much you can do out of the box.
A quick search on Google offers more insight.
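If reusing the generated types is enough, one pragmatic option is to keep the protobuf-generated request/response classes as the JSON models and only hand-write a thin controller around the gRPC client. Below is a minimal sketch of that shape, not generated code; MyService, MyRequest and MyReply are hypothetical names standing in for whatever your proto file produces, and the client is assumed to be registered for DI (e.g. via AddGrpcClient).

using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/my-service")]
public class MyServiceRestController : ControllerBase
{
    private readonly MyService.MyServiceClient _grpcClient;

    public MyServiceRestController(MyService.MyServiceClient grpcClient)
    {
        _grpcClient = grpcClient;
    }

    // The protobuf-generated MyRequest/MyReply classes double as the JSON
    // request/response models, so nothing is defined twice.
    [HttpPost("my-method")]
    public async Task<ActionResult<MyReply>> MyMethod([FromBody] MyRequest request)
    {
        var reply = await _grpcClient.MyMethodAsync(request);
        return Ok(reply);
    }
}

The controller itself is still written by hand; generating it automatically would need the kind of custom proto parsing described above.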

Related

gRPC Service use models from library

I am working on my first gRPC service. I have everything working - as in, I can call my service from the client and get a response. My question is: can my gRPC service use models from another library?
I have several projects in my solution.
gRPC Server
gRPC Client
Common DTO Library
And a few more
When I define my proto file, is it possible to use the classes from the Common DTO Library?
my.proto
syntax = "proto3";

option csharp_namespace = "myNameSpace";
package myPackageName;

// The service definition.
service MyService {
  rpc MyMethodName (DtoFromAnotherLibrary) returns (byte[]);
}
Thank you,
Travis
That's not possible, because proto files don't know anything about your C# projects.
You may consider using code-first gRPC though, where you write C# code from which the proto contract is then generated.
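As a rough illustration of the code-first style, assuming the protobuf-net.Grpc package (one common way to do code-first gRPC in C#), the contract is expressed in C# attributes and the proto schema is derived from it; the member names here are placeholders, not your actual DTOs:

using System.Runtime.Serialization;
using System.ServiceModel;
using System.Threading.Tasks;

[DataContract]
public class DtoFromAnotherLibrary
{
    [DataMember(Order = 1)]
    public string Name { get; set; }   // placeholder member
}

[DataContract]
public class MyReply
{
    [DataMember(Order = 1)]
    public byte[] Payload { get; set; }   // proto has no raw byte[] return, so wrap it in a message
}

[ServiceContract]
public interface IMyService
{
    ValueTask<MyReply> MyMethodNameAsync(DtoFromAnotherLibrary request);
}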
As @Ray stated, you cannot use your model objects directly through the gRPC interface, and he provided a link to the code-first method.
I tend to think of my proto definitions as my external interface and update them with care to ensure backwards compatibility as the interface ages. Because of that, I code up model objects separately from the gRPC definitions and write extension methods (ToProto for the model, ToModel for the gRPC message) to convert between the two types. It may seem like duplicated effort, but having the flexibility to add things to my model objects, like property change notifications or other convenience methods/properties, without affecting the external interface is a plus to me. I spend a lot of time working on the front end, so it's analogous to the model/view model relationship.
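A small sketch of that ToProto/ToModel pattern, with hypothetical WeatherModel (the class from the shared DTO library) and WeatherMessage (standing in for the protobuf-generated message) types:

// Hypothetical model class living in the Common DTO Library.
public class WeatherModel
{
    public int Temperature { get; set; }
    public string Summary { get; set; }
}

// Stand-in for the class the protobuf compiler would generate from the .proto file.
public class WeatherMessage
{
    public int Temperature { get; set; }
    public string Summary { get; set; }
}

public static class WeatherMappingExtensions
{
    // Model -> gRPC message, used just before sending over the wire.
    public static WeatherMessage ToProto(this WeatherModel model) =>
        new WeatherMessage { Temperature = model.Temperature, Summary = model.Summary ?? string.Empty };

    // gRPC message -> model, used as soon as a message is received.
    public static WeatherModel ToModel(this WeatherMessage message) =>
        new WeatherModel { Temperature = message.Temperature, Summary = message.Summary };
}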

ASP.NET Core DTOs and Controllers to TypeScript classes & interfaces

My idea consists of two main elements:
Take C# DTOs (data-transfer objects) and convert them into TypeScript interfaces to ensure client-side models stay in sync with the server side.
Take ASP.NET Core controller endpoints and convert them into TypeScript classes that use an HTTP service or similar. Again, to ensure client-side requests stay in sync with the server.
And whenever a change has been made to a controller or DTO, the generated TypeScript items should refresh to stay in sync while developing.
I have done some research and found the following Stack Overflow threads and other sources:
DTO to TypeScript generator, which suggests using the TypeLite library. That seems great, but according to the documentation it requires either a [TsClass] attribute or a reference to the class on startup. Since the project structure I'm using is set up so that all DTOs are located in a *.Dtos namespace, I'm kind of missing a TypeScript.Definitions().ForNameSpace(). Also, this only solves the first idea/problem.
Swashbuckle.AspNetCore would allow me to generate Swagger documentation from both the controllers and DTOs, and the task would then be to somehow interpret the Swagger documentation and create TypeScript classes and interfaces from it. The con is that, as far as I can read, this requires me to start up the server, which I would like to avoid if possible since that would make it hard to update on file change.
FYI, this is a new project I'm about to start, so there's no legacy code to update. Also, all of the ASP.NET Core endpoints will return IActionResult to enable the return of Ok(), BadRequest() and so on, so getting the return model would in my mind be hard, since there's no easy way to get the DTO it produces, if any.
So, I have thought of the following solutions that solve both problems:
Create a separate package/application that uses the Swashbuckle lib and generates the models and controllers without starting up the whole server.
Create annotations on every endpoint, something along the lines of [Produces(SomeDto)], after which I would create a small console application that uses reflection to get the information and generate TypeScript from it (a rough sketch of this idea follows below). This would of course require developers to keep this information in sync, so in my mind it's somewhat duplicated information.
But neither of these solutions would auto-update on C# source file save.
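Here's a rough sketch of what that reflection-based generator could look like; the *.Dtos namespace convention and the tiny type map are just assumptions for illustration, not a finished tool:

using System;
using System.Linq;
using System.Reflection;
using System.Text;

public static class TsInterfaceGenerator
{
    public static string Generate(Assembly assembly)
    {
        var sb = new StringBuilder();

        // Convention assumed from the question: all DTOs live in a *.Dtos namespace.
        var dtoTypes = assembly.GetTypes()
            .Where(t => t.IsClass && t.Namespace != null && t.Namespace.EndsWith(".Dtos"));

        foreach (var type in dtoTypes)
        {
            sb.AppendLine($"export interface {type.Name} {{");
            foreach (var prop in type.GetProperties(BindingFlags.Public | BindingFlags.Instance))
            {
                var camelName = char.ToLowerInvariant(prop.Name[0]) + prop.Name.Substring(1);
                sb.AppendLine($"    {camelName}: {MapType(prop.PropertyType)};");
            }
            sb.AppendLine("}");
            sb.AppendLine();
        }
        return sb.ToString();
    }

    // Very small type map; real DTOs would need collections, nullables, nested DTOs, etc.
    private static string MapType(Type type)
    {
        if (type == typeof(string)) return "string";
        if (type == typeof(bool)) return "boolean";
        if (type == typeof(int) || type == typeof(long) ||
            type == typeof(double) || type == typeof(decimal)) return "number";
        if (type == typeof(DateTime)) return "Date";
        return "any";
    }
}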
Looking forward to any discussions/suggestions.
Considering your first point, I made a C# DTO to TypeScript interface generator that uses MSBuild tasks, so it's completely independent of your workflow. It also works directly from the source, which makes it a bit less stable, but you don't have to make any template files.
Find it here or just search for MTT on NuGet.
In case you are still looking... I think Typewriter (http://frhagn.github.io/Typewriter/) is your solution. You can generate templates specifying what to transform and how.
It doesn't meet all of my needs, only because I need a tool that dynamically generates a complicated folder structure, but that's coming in their v2 roadmap.
Besides that, it does a lot of the heavy lifting and is pretty easy to configure.

Custom Business Object: AJAX-Enabled WCF

Can I pass a custom object between an AJAX-enabled WCF service and my ASP.NET page?
I searched the web but could not find any examples. Most show simple types like strings and integers.
I also do not know how to populate the custom object's properties through JavaScript on the client side.
We have a browser add-on and we have to pass data to that add-on from a web service. I did some research, and it looks like an AJAX-enabled WCF service is the way to go.
Using .NET Framework 3.5 and VS 2008.
You can't pass the actual custom objects, but you can of course pass serialized versions of them through your service to your page, JavaScript, etc. Basically, you have to map the fields of your complex custom .NET types to classes decorated with the DataContract attribute. These classes are the types your service will return. The DataContract-decorated classes will contain fields with primitive types, like strings, integers, etc. The WCF service will serialize these into XML or JSON.
On the client side, jQuery will be your best friend. I personally prefer JSON because the properties of your objects are much easier to get at that way, instead of having to parse a bunch of XML. So, set up your service to output JSON.
Also, to make your service URLs easier to read, use a RESTful approach. It's as easy as decorating your service methods with the WebGet attribute and supplying a UriTemplate. Once you see some examples, it'll blow your mind. Note: if you ever encounter a WebInvoke with Method="GET", just use WebGet instead - it's more compact, with no Method specification needed.
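A minimal sketch along those lines (PersonDto, IPersonService and the URI template are made-up names for illustration):

using System.Runtime.Serialization;
using System.ServiceModel;
using System.ServiceModel.Web;

[DataContract]
public class PersonDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

[ServiceContract]
public interface IPersonService
{
    // RESTful GET, e.g. /PersonService.svc/people/42, serialized as JSON.
    [OperationContract]
    [WebGet(UriTemplate = "people/{id}", ResponseFormat = WebMessageFormat.Json)]
    PersonDto GetPerson(string id);
}

On the client side, jQuery's $.getJSON (or $.ajax) can then hit that endpoint and read the DataMember fields as plain JSON properties.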
This particular article was EXTREMELY useful to me when I was developing my WCF service and the ASP.NET app that consumed it: http://www.c-sharpcorner.com/UploadFile/sridhar_subra/116/
Here's another person asking the same question as you: http://social.msdn.microsoft.com/forums/en-US/wcf/thread/879d46af-9c78-4b5d-b746-82843d742a6f
Hope this helps! Long live WCF!
With .NET 3.5 your best bet is WebHttpBinding, which works with plain old XML (POX), so you need to send XML to the WCF service.
You can also use WCF REST via the REST Starter Kit. For samples, have a look here. This supports JSON as well.
If you were using .NET 4.0, JSON-enabled WCF HTTP would be the way to go. WCF REST with 4.0 is an alternative, although I really do not like it.

Generate Contracts for REST objects

I'm new to REST and this sounds like it should be pretty simple. In a .NET app, I can create a reference to a WCF service and the contracts for all the available types will be generated for me.
Now I'm trying to consume a REST service in a Windows Phone 7 app. While I can make my call and get back the proper response, is there a simple way to create the classes that each object would be deserialized to?
I'm using RestSharp to manage my calls. In some examples I've seen, users have created their own classes and generated the XML manually. I would like to avoid this if at all possible.
Many thanks!
Assuming your response is XML, you can save the XML into a file and then call xsd.exe on it to generate a schema. Call xsd.exe on the schema and it will generate a C# class file you can serialize and deserialize to from the XML. Here's the documentation on how xsd.exe works:
http://msdn.microsoft.com/en-us/library/x6c1kb0s(v=VS.100).aspx
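For reference, the two xsd.exe steps and the deserialization end of it look roughly like this; ResponseRoot is just a stand-in for whatever root class xsd.exe generates from your response:

// Command line (run against a saved sample response):
//   xsd.exe response.xml            -> infers response.xsd from the sample
//   xsd.exe response.xsd /classes   -> generates response.cs with serializable classes

using System.IO;
using System.Xml.Serialization;

// Stand-in for the root class produced by xsd.exe /classes.
public class ResponseRoot
{
    public string Status { get; set; }
}

public static class ResponseLoader
{
    public static ResponseRoot Load(string xml)
    {
        var serializer = new XmlSerializer(typeof(ResponseRoot));
        using (var reader = new StringReader(xml))
        {
            return (ResponseRoot)serializer.Deserialize(reader);
        }
    }
}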
You have to generate the classes that your response data will map to (or use a dynamic deserialization scheme if you're on .NET 4) since REST does not include a schema definition system the way SOAP does. In RestSharp, there's a T4 helper to make generating the C# classes easier. It gets you about 80% of the way there. If you need any help with it, post to the RestSharp Google Group.

Class Libraries, Silverlight and Webservices

I have a Silverlight Class Library that I want to use in both my Silverlight and my WebService project.
I am able to create and reference the library in both projects without any problems, but when I try to use any of the classes from the library in the Silverlight project, I get an ambiguous reference error between my library and the ASMX web service (apparently, the Silverlight project believes that the classes in the class library exist in the web service).
How can I correct this issue? I have tried rebuilding and cleaning, but it does not seem to work. Can anyone help?
Sounds like the objects you are passing to Silverlight, via the WCF service, are the same objects as in your class library. In that case the generated web-reference objects will be given the same names, and linking with the library will then give you two sets of objects with the same names.
If you install RIA Services, one feature is the ability to share code between client and server by simply adding ".shared" to the class filenames before the extension. ASMX services are so last century :)
If you don't want to learn the RIA Services way of sharing objects across the great web divide (which I would recommend), you need to separate the data objects from the functionality you actually want to share between client and server side.
To give more specific advice on your current set-up I would need to see more about how it is structured.
A technique you can use is aliasing your using statements:
using MyNameSpace = My.Name.Space;
using MyWebService = My.Web.Service;
Then access all of your objects with these aliases to remove the ambiguities.
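For example (contrived namespaces, but they match the aliases above and show the pattern):

namespace My.Name.Space
{
    public class Customer { public string Name { get; set; } }
}

namespace My.Web.Service
{
    // The proxy class generated for the ASMX reference ends up with the same name.
    public class Customer { public string Name { get; set; } }
}

namespace ClientApp
{
    using MyNameSpace = My.Name.Space;
    using MyWebService = My.Web.Service;

    public static class CustomerMapper
    {
        // The aliases make it unambiguous which Customer is meant.
        public static MyNameSpace.Customer ToLibrary(MyWebService.Customer proxy)
        {
            return new MyNameSpace.Customer { Name = proxy.Name };
        }
    }
}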
