I am facing the following scenario and would appreciate some advice on how best to iterate forward:
My team is responsible for a web service built on ServiceStack v3. This service aggregates data from several other v3 web services for use in a SPA.
We are running into a situation where we are limited by the implementation of a downstream service: this particular service abstracts away data access, and queries that return large result sets occasionally time out.
We would like to rewrite this service to add pagination. The best solution (for us) would be to leverage AutoQuery from ServiceStack v4. However, this would require the upstream code to reference ServiceStack packages at two different versions (is this even possible?). We could also add pagination to the existing service, but it uses an internal data framework that is not easy to change, and there is a high chance we would break something.
Any ideas?
Yes, you can load two versions of a DLL inside your application, but only at runtime, not while developing, and I'm pretty sure this will lead to big problems in code execution (the runtime won't reliably resolve the right version of a class).
Your question is also answered here: Using multiple versions of the same DLL
A better solution would be to split your application into a v3 part and a v4 part using app domains, which is also discussed in that question.
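To make the app-domain idea concrete, here is a rough sketch (paths and type names are illustrative, and the gateway assembly would also need to be deployed to the v4 folder so the child domain can load it):

```csharp
// Sketch: isolating the v4 ServiceStack code in its own AppDomain so it can
// bind to its own copy of the assemblies, separate from the host's v3 copy.
using System;

public class V4Gateway : MarshalByRefObject
{
    // This method would call into code compiled against ServiceStack v4.
    public string Query(string request)
    {
        // ... invoke the v4 AutoQuery client here ...
        return "result for " + request;
    }
}

public static class Program
{
    public static void Main()
    {
        var setup = new AppDomainSetup
        {
            // Point the child domain at a folder containing the v4 binaries
            // and its own config (paths are made up for this sketch).
            ApplicationBase = @"C:\app\v4bin",
            ConfigurationFile = @"C:\app\v4bin\v4.config"
        };
        AppDomain v4Domain = AppDomain.CreateDomain("ServiceStackV4", null, setup);

        // Cross-domain calls go through a MarshalByRefObject proxy.
        var gateway = (V4Gateway)v4Domain.CreateInstanceAndUnwrap(
            typeof(V4Gateway).Assembly.FullName, typeof(V4Gateway).FullName);

        Console.WriteLine(gateway.Query("orders"));
        AppDomain.Unload(v4Domain);
    }
}
```

Note the cost of this approach: everything crossing the domain boundary must be serializable or a MarshalByRefObject, which is part of why it gets messy in practice.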
Original problem?
Your original problem is that you have old v3 services to which you want to add pagination for performance reasons, correct?
One solution could be to add it in the v3 parts, but this might break the services, which would then have to be retested.
Some options:
- Migrate from v3 to v4 (I'm not sure this alone would fix your problem, but I've found such a migration still very doable).
- Create your own wrapper services using Redis caching (advantage: no changes to the original code).
- Build a caching mechanism client-side or in an intermediary so you don't need to wait on the long API call.
- Migrate to AutoQuery (I have no experience here).
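The Redis caching wrapper idea could look roughly like this; ICacheClient is ServiceStack's caching interface (in the ServiceStack.CacheAccess namespace in v3), while IDownstreamClient and the 10-minute TTL are just placeholders:

```csharp
// Sketch: a thin wrapper service that caches downstream responses in Redis,
// so a slow large-result-set query is only paid for once per TTL.
using System;
using ServiceStack.CacheAccess;   // v3 namespace for ICacheClient

public interface IDownstreamClient
{
    string Fetch(string key);     // stands in for the slow downstream call
}

public class CachedAggregateService
{
    private readonly ICacheClient _cache;
    private readonly IDownstreamClient _downstream;

    public CachedAggregateService(ICacheClient cache, IDownstreamClient downstream)
    {
        _cache = cache;
        _downstream = downstream;
    }

    public string Get(string key)
    {
        var cached = _cache.Get<string>(key);
        if (cached != null)
            return cached;                                // served from Redis

        var fresh = _downstream.Fetch(key);               // slow query happens here
        _cache.Set(key, fresh, TimeSpan.FromMinutes(10)); // expire after 10 minutes
        return fresh;
    }
}
```

The appeal of this option is that the original service stays untouched; the trade-off is that cached pages can be up to one TTL stale.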
Roadmap
I think you'd do well to research some of these options. In your case there is no one perfect solution; there are only pros and cons.
It's up to you to decide which risks to take.
It's a web application running ASP.NET Core MVC 2.1 that started as an ASP.NET MVC 3 application and has been growing for 5 years.
I have defined services for different business processes, but over time many of those services have come to depend on each other. The end result is services with many dependencies and some circular references.
For example, service A performs an action and then has to trigger another action in service B, and the end result has to be sent back to the user. This is a simplification, of course.
In some cases, I'm using Azure Service Bus to decouple the processes. That way, service A performs an action and then queues a message that triggers a process in service B. Then service B sends an HTTP request to the web application to notify the users. This approach has worked out well, but I'm not sure whether I should apply it system-wide. It certainly adds complexity, as debugging is not that simple.
I know someone will say "use microservices", but that's a big change, and for now I need to have one database for all the processes.
I found the request-response pattern applied to Azure Service Bus to be really useful, but I couldn't find any code examples. The documentation is here: https://learn.microsoft.com/en-us/azure/service-bus-messaging/message-sessions#request-response-pattern
Any recommendations are welcome :)
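For illustration, a hedged sketch of that pattern with the Azure.Messaging.ServiceBus client (queue names and payloads are made up): the requester stamps the message with ReplyTo and a unique ReplyToSessionId, then waits on that session of the reply queue; the responder copies the value into SessionId on its reply so it is routed back to the waiting caller.

```csharp
// Sketch of request-response over Azure Service Bus using sessions.
// Assumes a "requests" queue and a session-enabled "replies" queue exist.
using System;
using System.Threading.Tasks;
using Azure.Messaging.ServiceBus;

public static class RequestResponseDemo
{
    public static async Task SendRequestAsync(ServiceBusClient client)
    {
        var sessionId = Guid.NewGuid().ToString();

        // Send the request, telling the responder where and how to reply.
        ServiceBusSender sender = client.CreateSender("requests");
        await sender.SendMessageAsync(new ServiceBusMessage("do-work")
        {
            ReplyTo = "replies",
            ReplyToSessionId = sessionId
        });

        // Block on our private session of the reply queue.
        ServiceBusSessionReceiver receiver =
            await client.AcceptSessionAsync("replies", sessionId);
        ServiceBusReceivedMessage reply =
            await receiver.ReceiveMessageAsync(TimeSpan.FromSeconds(30));
        Console.WriteLine(reply.Body.ToString());
    }

    public static async Task HandleRequestAsync(
        ServiceBusClient client, ServiceBusReceivedMessage request)
    {
        // ... perform the work, then reply to the address the caller gave us.
        ServiceBusSender sender = client.CreateSender(request.ReplyTo);
        await sender.SendMessageAsync(new ServiceBusMessage("done")
        {
            SessionId = request.ReplyToSessionId  // routes reply to the waiting session
        });
    }
}
```

The session id acts as a correlation key, so many concurrent requesters can share one reply queue without seeing each other's replies.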
I think you have encountered a very complex problem, and to say there is an obvious solution would be wrong.
Recently I was working on a company project that was trying to escape a so-called Death Star architecture, which is somewhat close to the problem you have. To break the coupling between projects/services, the company decided to move to solutions based on Azure services (Logic Apps, Data Factories, Azure Functions, etc.).
It's worth mentioning that this did solve the coupling issues: with these independent Azure services you can easily swap out one piece without worrying about dependencies on the rest.
There are quite a lot of articles about the good and bad sides of microservices, and you should examine all the pros and cons before moving forward with them. Also check the business needs: are the time and resources for this available or not?
One other thing that I can suggest you look at is the reactive programming approach.
Also, you can look at this thread; there are a couple of good suggestions worth mentioning.
I will briefly explain my situation before asking for recommendations:
Context
Among other topics,* I have been asked to develop a REST API that will run in the cloud (Azure).
The current (soon-to-be legacy) application works as a Windows service.
Behind this web API/Windows service, which receives the data and deserializes it (and serializes it again when sending the response), there is a Pricing Library used to compute the data, which is provided in a custom XML format.
The problem
I am quite concerned about compatibility issues, as I keep encountering errors due to incompatibilities between external libraries and .NET Core 2.0.
I had an issue with log4net, as the Pricing Lib uses version 1.2.13 while 2.0.8 is already available. I solved this, but I now run into the "RealProxy in dotnet core?" issue.
I feel I will keep encountering new issues, and it will be really time-consuming to fix them each time. But perhaps I am wrong, since I only want to revamp the web API with .NET Core 2.0 (not the Pricing Lib)?
My question
Is it really profitable, performance-wise or functionally, to switch the web API to .NET Core 2.0 now, knowing that we call a Pricing Lib built for .NET Framework 4.6.2? Is it worth that much bother just to be on the trending framework when the current one is mature?
Many thanks for your answers!
PS: I have already googled and read the relevant documentation; I am asking about other users' experience:
https://learn.microsoft.com/en-us/dotnet/standard/choosing-core-framework-server
*Code optimization, configuring automatic build and deployment, markdown docs, etc.
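One concrete data point relevant to this dilemma: ASP.NET Core 2.0 applications can target the full .NET Framework instead of .NET Core, which keeps the new programming model while sidestepping library incompatibilities like the ones above. A sketch of such a project file (package versions and paths are illustrative):

```xml
<!-- Sketch: an ASP.NET Core 2.0 web app targeting the full .NET Framework,
     so a net462-only dependency (the Pricing Library) keeps working. -->
<Project Sdk="Microsoft.NET.Sdk.Web">
  <PropertyGroup>
    <!-- net462 instead of netcoreapp2.0: same ASP.NET Core APIs, but
         full-framework libraries remain loadable. -->
    <TargetFramework>net462</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <PackageReference Include="Microsoft.AspNetCore" Version="2.0.0" />
    <ProjectReference Include="..\PricingLib\PricingLib.csproj" />
  </ItemGroup>
</Project>
```

This gives up the cross-platform benefits of .NET Core, but removes the pressure to fix every incompatibility in the Pricing Lib at once.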
(First, I just want to say I'm sorry if this isn't worded the best, but I have been researching for hours and thought maybe someone on here could clear it up for me.)
I'm new to creating web APIs. I have been googling and doing research, and I have built a few MVC applications just to get some exposure, but I never really thought of making an API until today. One of the reasons an API is listed as useful is that it allows your application to be used across tablets, smartphones, etc. What I'm not understanding is how you would do this: would you just add something to the API to make it compatible with all browsers, or would I need to rebuild the application using Web API instead of MVC?
Thanks,
I think you're looking at a Web API from the wrong perspective. It's not really about compatibility, necessarily, but rather about the ability to reuse the code/back-end functionality.
So rather than having your dependencies all wrapped up in one MVC project, the references are external. This allows for essentially the same functionality across multiple projects, as long as the requests are handled in the same manner.
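To make the reuse point concrete, a small sketch (all type names are illustrative): the same service class backs both an MVC controller that returns HTML for browsers and a Web API controller that returns serialized data for phones, tablets, or any other HTTP client.

```csharp
// Sketch: one shared back end, two front doors.
using System.Collections.Generic;

// Shared logic that both controllers reuse.
public class ProductService
{
    public List<string> GetProducts()
    {
        return new List<string> { "Widget", "Gadget" };
    }
}

// MVC: renders a view for browsers.
public class ProductsController : System.Web.Mvc.Controller
{
    private readonly ProductService _service = new ProductService();

    public System.Web.Mvc.ActionResult Index()
    {
        return View(_service.GetProducts());
    }
}

// Web API: returns data (serialized to JSON/XML) for any HTTP client.
public class ProductsApiController : System.Web.Http.ApiController
{
    private readonly ProductService _service = new ProductService();

    public List<string> Get()
    {
        return _service.GetProducts();
    }
}
```

So you don't rebuild the MVC app; you expose the same back-end functionality through an additional API surface.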
So I've spent the past few hours trawling through all the genuinely fantastic advice on Web API versioning. Some of my favourites, for those having as much fun as I am, in no particular order:
Best practices for API versioning?
Versioning REST API of an ASP.NET MVC application
http://www.troyhunt.com/2014/02/your-api-versioning-is-wrong-which-is.html
http://www.pluralsight.com/courses/web-api-design
http://www.pluralsight.com/courses/implementing-restful-aspdotnet-web-api
So all this advice has been very helpful in designing what is essentially the "front end" of the API. We can version the API calls... Now, I'm on to the hard part.
This is a heavily data-driven application, for a company with several products (this is a new one) doing monthly releases. Some big customers will want long-term support for API calls; some smaller customers will want the latest releases. This we could manage with something similar to milestone/long-term-support releases of the API. Great.
But in practice this is going to get really messy, really fast. We've worked hard to separate out the layers of our own website: the beta internal/external APIs, the repository layers, and even an SDK to boot. We separate each release out into its own branch, but it's SaaS: we host the database. So we're not just going to be able to version the API calls, but everything underneath them: the business logic, the repositories, and the database. Let's not even get started on unit/integration testing.
So, trying (and probably failing) to ask only one question here:
Is there a decent pattern for structuring a layered, data-driven, .NET application to cope with multiple versions?
Specifically, how will the database change, and how can you structure a generic stack to version it all? Some ideas I have include:
- Updating old source control branches of the stack and deploying those
- Keeping everything in the same project, but using folders/namespacing all the way down
- Splitting the projects further, so the API solution has a number of "Controller" projects, with similar concepts for the logic/repo layers
We have a fair number of developers, and no matter how much ace documentation I write, realistically it will only be read when something isn't working. So ideally this needs to be as glaringly obvious as possible for developers to get right.
There is no perfect solution that fits every situation, whether data-driven or not.
This is really difficult to answer. Your best bet is to use multiple versioning strategies.
For example, if the database change is simply adding a new column, older versions of the API can ignore the new column.
If the database change means you have to completely rewrite the repository layer, then you may want to create a new repository and a new controller, instead of just versioning an API method. Then, at the endpoint, you could either version the route or have consumers call the replacement endpoint.
If there are dramatic changes at all levels of the API, then versioning at the IIS level with a separate virtual directory may be your solution (and this may have corresponding branches or labels in source control, with the intent of supporting bug fixes only).
The folders/namespacing idea can get very confusing for developers, so I would steer away from it. Routing (e.g. [Route("/v4/Orders")]) may be the better way to handle it. Again, that depends on how much code there is and the nature of the changes.
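A minimal sketch of that route-based approach (controllers and data are illustrative; this assumes Web API 2 attribute routing, i.e. config.MapHttpAttributeRoutes() is enabled at startup): each breaking change gets its own controller, which can in turn be wired to its own repository.

```csharp
// Sketch: versioning via attribute routes, one controller per API version.
using System.Collections.Generic;
using System.Web.Http;

public class OrdersV3Controller : ApiController
{
    [Route("v3/orders")]
    public IEnumerable<string> Get()
    {
        // Backed by the original repository; kept for long-term-support customers.
        return new[] { "order-1", "order-2" };
    }
}

public class OrdersV4Controller : ApiController
{
    [Route("v4/orders")]
    public IEnumerable<string> Get()
    {
        // Backed by the rewritten repository; new consumers call this route.
        return new[] { "order-1", "order-2", "order-3" };
    }
}
```

Keeping each version in its own class makes the "which version am I touching?" question answerable at a glance, which helps with the documentation-won't-be-read problem.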
Currently I'm working with a big, old, and extremely poorly written ASP.NET 1.1 application, and its continuous maintenance is becoming quite a problem. Basically it's reaching breaking point, and I'm reluctant to expand it any more than the business demands. Based on my experience creating other projects from scratch, it would really suit an ASP.NET MVC-based solution. Oh, how I wish the world were that simple...
The fact is that I just cannot justify re-writing it from scratch and the business cannot afford it. The ideal solution would be to start writing an MVC-based application alongside it and begin a slow migration as new requirements arise.
I've read posts which state that this is entirely possible, but in my experiments I've not found it so easy. The current application contains several large data access and business logic layers shared by other applications the company produces. These are also written for 1.1 and will not compile under 2.0 (and upgrading them would destroy the other projects if I tried!), so I cannot upgrade them. That leaves me stuck with an application that cannot even be opened in a .NET 3.5-capable Visual Studio. The new MVC app would also have to make use of these layers.
I am entirely open to suggestions. I'm desperate to find a solution that I can quickly demonstrate would allow me to improve the product immensely without taking too much time or affecting the rest of the business.
You could write a WCF service on top of the existing business layer and have your new app talk to that service instead of referencing the business layer directly.
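A rough sketch of that facade, with illustrative names standing in for the legacy types (the real contract would mirror whatever the business layer actually exposes):

```csharp
// Sketch: a WCF facade over the legacy business layer. The new MVC app
// calls this service over HTTP instead of referencing the 1.1 assemblies.
using System.ServiceModel;

// Stand-in for the real 1.1 business layer (illustrative).
public class LegacyOrder
{
    public int Id;
    public decimal Total;
}

public static class LegacyOrderLogic
{
    public static LegacyOrder Load(int id)
    {
        return new LegacyOrder { Id = id, Total = 99.95m };
    }
}

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    OrderDto GetOrder(int id);
}

public class OrderService : IOrderService
{
    public OrderDto GetOrder(int id)
    {
        // Delegate to the existing business layer and map the result to a
        // DTO, so the new app never sees the legacy types.
        var legacy = LegacyOrderLogic.Load(id);
        return new OrderDto { Id = legacy.Id, Total = legacy.Total };
    }
}

public class OrderDto
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}
```

The DTO boundary is the important part: it keeps the legacy types out of the new codebase, so each piece can later be rewritten behind the same contract.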
You need to divide and conquer. Analyse the current app and its layers, and see if you can find a way to divide each significant piece of functionality into a discrete area with as few changes as possible.
Then make each area a unique service using the old technology.
Then you can rewrite each service slowly as you can fit it in and not affect the whole.
Otherwise you are going to have to come up with a convincing business case for your managers so that they allocate you the time to do the job properly. Sometimes our job is political as well as technical.