I’m using a service reference to communicate between my clients (C#) and my server.
I frequently want to transfer large amounts of data (> 2 MB) over this web service. This works perfectly, but for a few customers with very limited upload bandwidth the process is very slow and completely saturates their upload for a while.
I'm looking for a way to limit the bandwidth used by the client, to make sure some upload capacity remains available to the customer.
I found various solutions for throttling streams, but is it possible to apply throttling to a web service call?
Maybe using a custom binding?
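One common starting point is a rate-limiting Stream wrapper, which could then be wired into the transport (for example via a custom message encoder in a custom binding). Below is a minimal sketch of such a wrapper; the class name and the sleep-based pacing strategy are illustrative, not a WCF-specific API:

```csharp
using System;
using System.Diagnostics;
using System.IO;
using System.Threading;

// Minimal write-throttling Stream wrapper: caps the average write rate
// at a given number of bytes per second by sleeping when ahead of schedule.
public class ThrottledStream : Stream
{
    private readonly Stream _inner;
    private readonly long _bytesPerSecond;
    private readonly Stopwatch _watch = Stopwatch.StartNew();
    private long _totalBytes;

    public ThrottledStream(Stream inner, long bytesPerSecond)
    {
        _inner = inner;
        _bytesPerSecond = bytesPerSecond;
    }

    public override void Write(byte[] buffer, int offset, int count)
    {
        _totalBytes += count;
        // How long the transfer *should* have taken at the target rate.
        long expectedMs = _totalBytes * 1000 / _bytesPerSecond;
        long elapsedMs = _watch.ElapsedMilliseconds;
        if (expectedMs > elapsedMs)
            Thread.Sleep((int)(expectedMs - elapsedMs));
        _inner.Write(buffer, offset, count);
    }

    public override void Flush() => _inner.Flush();
    public override int Read(byte[] buffer, int offset, int count) => _inner.Read(buffer, offset, count);
    public override bool CanRead => _inner.CanRead;
    public override bool CanSeek => false;
    public override bool CanWrite => _inner.CanWrite;
    public override long Length => _inner.Length;
    public override long Position { get => _inner.Position; set => throw new NotSupportedException(); }
    public override long Seek(long offset, SeekOrigin origin) => throw new NotSupportedException();
    public override void SetLength(long value) => throw new NotSupportedException();
}
```

Note that this throttles the average rate on the client side only; the server never sees the wrapper.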
We want to create apps that connect to our database (Sybase ADS, accessible via .NET) over the Internet. The clients will include a Windows Forms app and probably an Android app for phones and devices.
I know C#. I'm new to Web apps and webservices so bear with me here.
I'm thinking maybe the best way is to create a web service in C#, running on IIS, that interrogates the ADS database using ADO.NET.
So the web service will expose methods via WCF that can be consumed by the clients.
The clients then invoke the methods via WCF. Am I right in thinking WCF will return data as .NET objects serialized to XML?
Also, can .NET Forms apps consume those XML objects easily?
The apps would have bespoke restricted access using credentials.
How does this approach sound? Any performance or security issues to think about? The data is not exactly classified but I wouldn't want snoopers to be able to pick up phone numbers etc.
Can most security be sorted out by just going from http to https?
What about performance? Presumably slower than if the apps were connected directly to ADS over the LAN using ADO.NET. Does WCF use buffers for HTTP requests?
E.g. can you start reading the stream on the client before the whole HTTP request has finished? I'm thinking of populating list boxes with large numbers of records, etc.
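On the "XML objects" question above: WCF serializes [DataContract] types to XML on the wire and rehydrates them as ordinary .NET objects on the client, so a Forms app just works with typed objects. A small sketch (the CustomerDto type is hypothetical) showing what the serialized form looks like, using the same DataContractSerializer WCF uses by default:

```csharp
using System;
using System.IO;
using System.Runtime.Serialization;

// Hypothetical DTO illustrating how WCF represents [DataContract] types as XML.
[DataContract]
public class CustomerDto
{
    [DataMember] public int Id { get; set; }
    [DataMember] public string Name { get; set; }
}

public static class Demo
{
    // Serialize a DTO to the XML form WCF would put on the wire.
    public static string ToXml(CustomerDto dto)
    {
        var serializer = new DataContractSerializer(typeof(CustomerDto));
        using (var ms = new MemoryStream())
        {
            serializer.WriteObject(ms, dto);
            return System.Text.Encoding.UTF8.GetString(ms.ToArray());
        }
    }
}
```

The client-side proxy generated by a service reference hides this entirely; you only see CustomerDto instances.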
I've a problem with a multithreading app in C# and I would appreciate some help, since I'm new to multithreading.
This is the scenario: I'll have a mobile app that will make a lot of queries/requests against my database (MySQL). My goal is to build a server-side application in C# that handles multiple queries, running on a Linux machine (Mono to the rescue). My company is doing the database side of the application; another company is making the mobile app. I'll send the data to the cloud, and the cloud server will send it to the client.
I'm reading the threading chapters of CLR via C# and C# 4.0 in a Nutshell, but so far I have only a vague idea of what I can do. I believe asynchronous methods would work, since they don't use a lot of resources, but I'm a little confused about how to handle thread concurrency (priority, state).
So here are my questions:
What is the best way to solve this problem?
Which class from .NET framework suits best for this job?
How should I handle the query queue?
How can I handle thousands of threads/queries fast enough that my mobile app's users get their query results within an estimated time of 5 minutes?
Some observations:
I know that the time to finish a query will grow with the size of the user's data in my database, but I need to handle both small and large data sets as fast as I can.
I'm sending the data to a cloud database (Amazon EC2), and from there I'll send it to the client. I won't handle that part; it will be handled by another company, so my job is to get the queries done quickly and make the results available to the cloud database.
I'm aware that sending the information to my client depends on my IT infrastructure, but the point here is: how can I solve this problem in a way that leaves me worrying only about my application infrastructure?
I cannot put the queries on a big string and throw it on the database, because I need to handle each query result separately before sending the result to the user.
The storage engine is MyISAM, so no transactions are allowed.
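Since the question mentions asynchronous methods, here is one way that idea could look: instead of thousands of threads, use tasks and bound the number of queries in flight with a SemaphoreSlim. The names are illustrative, and RunQueryAsync is a stand-in for a real MySQL call (e.g. MySqlCommand.ExecuteReaderAsync):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

// Sketch: run many queries concurrently without spawning a thread per query.
// The semaphore caps how many database calls are in flight at once.
public static class QueryRunner
{
    private static readonly SemaphoreSlim Gate = new SemaphoreSlim(8); // at most 8 concurrent queries

    public static async Task<string> RunQueryAsync(string sql)
    {
        await Gate.WaitAsync();
        try
        {
            await Task.Delay(10); // placeholder for the actual database round trip
            return "result of: " + sql;
        }
        finally
        {
            Gate.Release();
        }
    }

    // Fan out all queries; results come back in the same order as the input.
    public static Task<string[]> RunAllAsync(IEnumerable<string> queries) =>
        Task.WhenAll(queries.Select(RunQueryAsync));
}
```

This lets each query result be handled separately (as the question requires) while the thread pool, not hand-managed threads, does the scheduling.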
I would create a REST web service, on top of either ServiceStack or Web API, to abstract access to your data via a service. Either of these frameworks can handle simultaneous requests from your mobile client, as they are designed to do so. In addition, I would create a class that mediates access and provides a unit of work for your database (i.e. a repository). The connection provider for MySQL should be able to handle simultaneous requests from your web service, so you should not have to worry about threading and request management. If a single instance is not enough, you can add more web servers running the same code and use a load balancer to distribute requests across your instances, since the service/data code is the same.
Some resources for mono based web-api/servicestack:
http://www.piotrwalat.net/running-asp-net-web-api-services-under-linux-and-os-x/
What is the best way to run ServiceStack on Linux / Mono?
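To make the repository idea above concrete, here is a rough sketch. The interface and the in-memory implementation are illustrative; a real implementation would wrap MySqlConnection/MySqlCommand and scope the connection to one unit of work:

```csharp
using System.Collections.Generic;

// Sketch of a generic repository abstraction the web service would call
// instead of talking to MySQL directly.
public interface IRepository<T>
{
    T GetById(int id);
    IEnumerable<T> GetAll();
    int Add(T item); // returns the new id
}

// In-memory stand-in, useful for tests; a MySQL-backed version would
// implement the same interface.
public class InMemoryRepository<T> : IRepository<T>
{
    private readonly Dictionary<int, T> _items = new Dictionary<int, T>();
    private int _nextId;

    public T GetById(int id) => _items[id];
    public IEnumerable<T> GetAll() => _items.Values;

    public int Add(T item)
    {
        _items[_nextId] = item;
        return _nextId++;
    }
}
```

The controller or service class then depends only on IRepository&lt;T&gt;, which also makes it easy to swap the database out in tests.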
I have a .NET web app (MVC4, Web API) that needs to receive large uploads (videos >500MB) via Multipart POST requests. I am trying to improve user experience by relaying the incoming stream of the video part directly to the server that will process and store the video (Kaltura). This is to be an improvement over the current process which stores the video in a file on the filesystem which is later retrieved by the video server.
I've done this by creating a subclass of MultipartStreamProvider called RelayingStreamProvider. I pass it the request stream of a WebRequest I've created to the Kaltura server and it uses this for writing. When I construct the RelayingStreamProvider, I wrap the request stream in a BufferedStream to handle any latency or interruptions on either the reading or writing side of this process.
I've turned off Request buffering in my Web API controller as well.
My questions:
Am I overlooking anything that will likely bite me later with this approach? I'm trying to constrain for efficiency by keeping the filesystem out of the process. On the upside, multiple simultaneous 500 MB uploads are consuming essentially no memory in my test implementation. Downsides?
I'm using a 32KB buffer for my BufferedStream. Is this a reasonable figure? What are the reasons to make this larger/smaller?
Finally, I am indeed aware that there are widget and mobile APIs I can use to take myself out of the middleman role for Kaltura, and we are going to implement them. Since the widget requires Flash, we are using this approach as a fallback for users who don't have Flash.
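The core of the relay described above, independent of the Web API plumbing, is copying the incoming part stream to the outbound request stream through a fixed-size buffer. A BCL-only sketch (the 32 KB default mirrors the BufferedStream size in the question; names are illustrative):

```csharp
using System.IO;

// Copy one stream to another through a fixed-size buffer, returning the
// number of bytes relayed. Memory use stays at one buffer regardless of
// upload size, which is why simultaneous 500 MB uploads stay cheap.
public static class StreamRelay
{
    public static long Relay(Stream source, Stream destination, int bufferSize = 32 * 1024)
    {
        var buffer = new byte[bufferSize];
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }
}
```

On buffer size: larger buffers mean fewer read/write calls (helpful for fast, high-latency links) at the cost of memory per concurrent upload; 32 KB is a common middle ground.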
I am looking for an open-source framework that can handle communication between my backend host and a WPF frontend client. The following are points I need to consider:
Client is a WPF desktop app; host is a C# console app but can also run as a Windows service.
Host accepts connection requests and the client connects, but I still need bidirectional communication capabilities.
The client and host are not in the same network (need to send messages over the internet)
Just one client (possibly more, but fewer than 10) will connect to the host.
I'd like to be able to handle the following communication patterns: one-way messages, two-way request/response, and streaming from host to client.
The content will consist of serialized POCOs/DTOs and serialized time series data. Each serialized DTO will be approximately 400 bytes; the serialized time series can be a lot larger, potentially several megabytes.
Some messages will be sent on a schedule; for example, the host sends a new DTO every second. (If the framework includes such a scheduling mechanism, even better.)
I'd like to be able to stream data to the client, e.g. a client that receives data and updates its GUI in real time.
My question is: What might be the best C#-based framework to handle this? (I run on .NET 4.5.) I am not considering WCF because it seems far too heavyweight and complex for what I'm trying to do. I also do not want to delve into web programming, so REST/SOAP-style frameworks are not under consideration. I would be delighted if anyone could recommend a TCP- or WebSocket-based framework or similar that is lightweight enough to potentially send messages a lot more frequently than just every second.
Thanks a lot in advance.
Any reason why you are not considering HTTP? It runs over TCP, but you don't need to worry about firewalls and such, especially given that you want to do this over the Internet. Self-hosted ASP.NET Web API could be a good candidate. Fire-and-forget and request/response can be accomplished fairly straightforwardly, and for streaming it has PushStreamContent. ASP.NET Web API is lightweight compared to WCF and is open source. Just a suggestion, that's all.
I ended up using ZeroMQ for full-duplex communication between WPF client and Server. It is lightweight, very fast and pretty reliable for my needs.
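For a sense of what ZeroMQ abstracts away: the request/reply round trip it provides can be approximated with nothing but the BCL's TcpListener/TcpClient. The sketch below (port and message format are illustrative) does one round trip; ZeroMQ adds message framing, reconnection, and the richer pub/sub and streaming patterns the question asks for on top of this:

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;
using System.Threading.Tasks;

// BCL-only sketch of one request/reply round trip over TCP. A real host
// would loop accepting clients and use length-prefixed frames rather than
// newline-delimited text.
public static class TcpDemo
{
    public static string RoundTrip(string request, int port)
    {
        var listener = new TcpListener(IPAddress.Loopback, port);
        listener.Start();

        // Host side: accept one client, echo its message back.
        var serverTask = Task.Run(() =>
        {
            using (var server = listener.AcceptTcpClient())
            using (var reader = new StreamReader(server.GetStream()))
            using (var writer = new StreamWriter(server.GetStream()) { AutoFlush = true })
            {
                writer.WriteLine("echo: " + reader.ReadLine());
            }
        });

        // Client side: send the request, wait for the reply.
        using (var client = new TcpClient("127.0.0.1", port))
        using (var reader = new StreamReader(client.GetStream()))
        using (var writer = new StreamWriter(client.GetStream()) { AutoFlush = true })
        {
            writer.WriteLine(request);
            string reply = reader.ReadLine();
            serverTask.Wait();
            listener.Stop();
            return reply;
        }
    }
}
```

In practice the library's socket types (request/reply, pub/sub) map directly onto the patterns listed in the question, which is why it fit well here.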
I'm currently designing a live analytics site using ASP.NET + C#. I was wondering what the best method is to transmit the data from the server to the client. Here is what I've thought of so far:
Using an ASP.NET AJAX UpdatePanel, and regularly updating it with a JavaScript timer.
Creating a server application (in C#, Java, Node.js, or Ruby) and using Socket.IO to retrieve a constant stream of data from it.
Creating a JSON web service which returns large amounts of data; I could use jQuery/AJAX to request it and process it in real time on the page.
Do any of these seem like a good idea, or are there any other options?
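For the third option, the server side reduces to returning a JSON-serialized DTO. A small sketch of what that payload would look like, using the BCL's DataContractJsonSerializer (the StatsDto type and its fields are hypothetical; a Web API controller would normally do this serialization for you):

```csharp
using System.IO;
using System.Runtime.Serialization;
using System.Runtime.Serialization.Json;

// Hypothetical analytics payload a JSON endpoint would return to the
// jQuery/AJAX polling code on the page.
[DataContract]
public class StatsDto
{
    [DataMember(Name = "visitors")] public int Visitors { get; set; }
    [DataMember(Name = "pageViews")] public int PageViews { get; set; }
}

public static class JsonDemo
{
    public static string ToJson(StatsDto dto)
    {
        var serializer = new DataContractJsonSerializer(typeof(StatsDto));
        using (var ms = new MemoryStream())
        {
            serializer.WriteObject(ms, dto);
            return System.Text.Encoding.UTF8.GetString(ms.ToArray());
        }
    }
}
```

On the page, a jQuery $.getJSON call on a timer would fetch this and update the DOM.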
I would never use the UpdatePanel; it would be too heavy for what you're doing. Regular AJAX calls would work just fine; write up a Web API service. WebSockets would work well too, but are a little more complex. Depending on your time, you might want to check out SignalR, a WebSocket-based library that can fall back to polling if the browser doesn't support WebSockets.