I am transferring a large zipped text file over a classic ASMX web service.
I am zipping it because the file is 20 MB unzipped and only 4 MB zipped.
This is the method. I will provide additional information if necessary.
[WebMethod]
public byte[] Transfer()
{
    // Read the zipped file from disk and return its raw bytes.
    return File.ReadAllBytes(@"4MBFile.zip");
}
I am using C# and .NET 4. (I changed the initial settings for the project from 2.0 to 4.0).
A web method uses some kind of serialization, so I guess there will be some overhead.
Am I really transferring only 4 MB?
How do I measure this overhead, if there is any?
XML Web Services expose useful functionality to Web users through a standard Web protocol. In most cases, the protocol used is SOAP.
This question shows that the XmlSerializer used by ASMX Web Services Base64-encodes binary data by default, so yes, the overhead will be noticeable.
Am I really transferring only 4 MB?
What keeps you from monitoring a service call with Fiddler? It will tell you the exact HTTP response body size.
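If you would rather estimate the overhead without Fiddler, here is a rough local sketch (assuming the zip file from the question is on disk): run the byte[] through XmlSerializer, which Base64-encodes it just as the ASMX response does, and compare lengths. It ignores the SOAP envelope, so the real response will be a bit larger still.

using System;
using System.IO;
using System.Xml.Serialization;

class OverheadEstimate
{
    static void Main()
    {
        byte[] raw = File.ReadAllBytes(@"4MBFile.zip");

        var serializer = new XmlSerializer(typeof(byte[]));
        using (var stream = new MemoryStream())
        {
            // XmlSerializer writes a byte[] as a Base64-encoded element by default.
            serializer.Serialize(stream, raw);
            Console.WriteLine("Raw bytes:        {0:N0}", raw.Length);
            Console.WriteLine("Serialized bytes: {0:N0}", stream.Length);
        }
    }
}

Expect the serialized size to be roughly a third larger than the raw size, since Base64 spends four characters on every three bytes.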
There seems to be a solution: attribute the property as hexBinary so that it won't be Base64-encoded.
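A sketch of what that might look like (the wrapper type is invented; the web method would return it instead of a bare byte[]):

using System.Xml.Serialization;

public class ZipPayload
{
    // Serialized as hexadecimal text instead of the default Base64.
    [XmlElement(DataType = "hexBinary")]
    public byte[] Data { get; set; }
}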
Related
I recently took over a project which is partially done. In that project the previous developer is using a web service for everything, I mean for getting each and every piece of data from the database.
Ex: I need some data which takes 3 parameters (district code, taluk code and village code).
What is happening is:
Creating an XML document using these three parameters
Encrypting this XML
Sending the XML to the web service
Decrypting the XML in the web service
Retrieving the data through a stored procedure with the above-mentioned parameters
Again generating an XML document for the retrieved data
Again encrypting this XML and returning it to the application
Decrypting the returned XML and generating a DataTable from it
I asked him why he had done all this. He said it was for security purposes.
I feel this is very lengthy and time consuming, and what I learned about stored procedures is that they help secure the application.
My question is: why do I need to go through all these steps when I can have a class file and use stored procedures in my application?
Isn't the use of stored procedures enough to secure my application?
Or should I continue with his technique? (To be frank, I disagree with this method.)
Note: The parameters are not passed by the user. They will be in the session once the user logs in.
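For reference, the straightforward alternative described above, a data-access class calling the stored procedure directly with the three codes, might look roughly like this (the class, procedure, and parameter names and the connection string are invented, and SQL Server is assumed; swap in the Oracle provider if needed):

using System.Data;
using System.Data.SqlClient;

public static class VillageData
{
    public static DataTable GetData(string districtCode, string talukCode, string villageCode)
    {
        using (var connection = new SqlConnection("...connection string..."))
        using (var command = new SqlCommand("usp_GetVillageData", connection))
        {
            command.CommandType = CommandType.StoredProcedure;
            command.Parameters.AddWithValue("@DistrictCode", districtCode);
            command.Parameters.AddWithValue("@TalukCode", talukCode);
            command.Parameters.AddWithValue("@VillageCode", villageCode);

            var table = new DataTable();
            // Fill opens and closes the connection itself if it is not already open.
            new SqlDataAdapter(command).Fill(table);
            return table;
        }
    }
}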
Unless the client code resides entirely on the browser side, your mate seems to have created an over-engineered solution. Actually, you would create a full HTTP interface through Web services if you're developing a full HTML5 app or if the same backend should be accessed by multiple client technologies.
Security layers can be implemented either before a request hits a Web service resource or as a regular concern in a layered software system.
A special note can be made about encryption. If we're talking about Web services over HTTP, encrypting the request body is unnecessary since it can be done at the transport level using standard SSL/HTTPS.
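For illustration, an ASMX-flavored sketch of that idea (class, method, and namespace are invented): host the service under HTTPS so SSL encrypts the whole exchange, and optionally refuse calls that arrive over plain HTTP, instead of encrypting the XML payload by hand.

using System;
using System.Web.Services;

[WebService(Namespace = "http://example.com/")]
public class VillageService : WebService
{
    [WebMethod]
    public string GetVillageData(string districtCode, string talukCode, string villageCode)
    {
        // SSL on the site already encrypts the request and response bodies.
        if (!Context.Request.IsSecureConnection)
            throw new InvalidOperationException("Call this service over HTTPS.");

        // ... fetch the data via the stored procedure and return it ...
        return "...";
    }
}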
Finally, about the XML thing, I guess you're consuming SOAP services. Maybe your mate started his project a long time ago; otherwise, if we're talking about a Web site calling a Web service and he implemented a SOAP/XML service in the last 5-6 years, he should consider a convention-over-configuration approach like REST instead...
This sort of separation is not uncommon in large scale secure web applications in dual firewall configurations.
If the web application simply called stored procedures directly, you would have to open up port 1433 (or other ODBC port) between the web network zone and the database network zone. This creates a bit of exposure since the web servers are in a DMZ while your database servers will tend to be in a more secure zone. Generally speaking you want to keep as many ports sealed up as you can to keep the DMZ contained in case of compromise.
It is a shame that your colleague wrote his own solution, since it sounds like most of that functionality would be covered by SQLXML.
What is the most current (.NET 4) way to send a raw XML request to a web service? Does it differ between asmx and wcf services?
I want to build XML via a front-end interface and have the ability to send it to different endpoints. These endpoints could be asmx or wcf. What is the best approach to get started doing this?
I think we mean the same thing; I usually prefer not just to send XML but to work with objects and let the WCF engine decide in what format it will pass them to the service/client (which might depend on the protocols you and your service providers support). In WCF terms, this means defining classes as Data Contracts and working only with their instances, not with the XML they might represent.
If the other side supports ASMX only, that limits the protocols WCF can use against it to WS-I Basic Profile 1.1 (if I'm not mistaken), but whatever technology you choose or work against, the concept, in my opinion, should remain the same: in .NET, work with classes and let the underlying communication technology decide how to serialize and pass them to the other side. Of course, it's also more secure than just sending a raw string as XML (though you can probably solve some of the security issues by validating the received XML against a schema).
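A minimal sketch of that object-first approach (the contract and member names are made up): define the data and service contracts once and let WCF handle the XML on the wire, using basicHttpBinding when the other side is ASMX-level only.

using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class OrderRequest
{
    [DataMember] public int OrderId { get; set; }
    [DataMember] public string Notes { get; set; }
}

[ServiceContract]
public interface IOrderService
{
    [OperationContract]
    string Submit(OrderRequest request);
}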
There is a need to develop a "service" program that will receive and process BLOB data from an Oracle DB server. The clients will be written in Delphi 2010. I have freedom of choice in the technologies I use to produce the server part of this project, which is why I have posted this question here. Could you point me to some blog posts, articles, or forums where I can find information about creating this type of service? I have experience with Microsoft's WCF services, but WCF has bad integration with Delphi clients via WSDL. For now I have settled on an ASMX Web Service written in C# and need some samples of how to transfer BLOB data between server and client. It would be better if the server and client communicated through a raw socket instead of encapsulating all the data in SOAP. Thanks in advance, and I strongly hope for your help!
I would recommend you use RemObjects SDK to develop the server and client web service applications; it has many features not available in Delphi & .NET, and it supports different messaging formats, so you can use a binary message instead of SOAP to transfer the BLOB data, which is much faster and more compact.
They also have a .NET version of the server and client, so you can mix between them.
A nice and standard way of handling BLOB fields is the REST protocol.
Thanks to the REST protocol, you can GET, POST, PUT or DELETE a binary BLOB from its URI. That is, if your URI is dedicated to the BLOB field, you'll be able to use raw binary transmission, and no MTOM or Base64 transmission.
For instance, you can get the content of the BLOB with ID=123 with a GET at such a URI:
http://servername/service/123/blob
It will work also from a standard web browser. So if the BLOB is a picture, it should be displayed directly in the browser.
With a POST at the same URI, you add a new blob, or with a PUT you update the blob. With a DELETE verb... you delete it. This is what RESTful means over HTTP.
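A minimal client-side sketch in C# (the clients in this thread are Delphi, but the HTTP call is the same idea): GET the example URI from above and read back the raw bytes, with no Base64 or MTOM in the way.

using System.IO;
using System.Net;

class BlobClient
{
    static byte[] GetBlob()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://servername/service/123/blob");
        request.Method = "GET";

        using (var response = request.GetResponse())
        using (var body = response.GetResponseStream())
        using (var buffer = new MemoryStream())
        {
            // The response body is the BLOB itself, transferred as raw binary.
            body.CopyTo(buffer);
            return buffer.ToArray();
        }
    }
}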
This is, for instance, how our mORMot framework works. It is also able to have fast, direct access to the Oracle database on the server side, via some dedicated classes. What is nice about such an ORM-based framework is that high-level clients can use objects and handle much more than just BLOBs, and that it handles URL-level security and authentication.
But you can easily write your own service using some units available in mORMot, if you don't need the whole RESTful ORM feature:
For a fast http.sys based HTTP/1.1 server, take a look at SynCrtSock;
For the HTTP/1.1 client access, the same SynCrtSock unit defines some client classes;
For very fast direct access to Oracle, see SynOracle.
This is all Open-Source, working from Delphi 5 and later. There is a lot of documentation available (more than 600 pages), including high-level presentation of such concepts as REST, ORM or n-Tier.
This is fairly high-level, but so is the question:
If it is a "raw socket" it isn't really a "web service"; although there is of course the middle ground of REST or a HTTP POST.
If you are looking at a web-service, and the data is non-trivial, then you probably want to look at MTOM to avoid the base-64 overhead (which is supported in WSE 3, or (simpler) WCF via basicHttpBinding). I would expect most tools to have a reasonable comprehension of a basic web-service with MTOM.
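For the WCF route, a small sketch of the binding mentioned above (the size limit is arbitrary): basicHttpBinding with MTOM encoding, so byte[] members travel as raw MIME parts instead of Base64 text inside the XML.

using System.ServiceModel;

static class MtomBinding
{
    public static BasicHttpBinding Create()
    {
        return new BasicHttpBinding
        {
            MessageEncoding = WSMessageEncoding.Mtom,
            MaxReceivedMessageSize = 64 * 1024 * 1024   // leave room for large BLOBs
        };
    }
}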
If you want to expose some data in a database (in this case BLOB data in Oracle) as a web service, WSO2 DSS [1] provides an easier solution. It is under the Apache license and available for free. Since all WSO2 products are based on the WSO2 Carbon platform, the services you create support MTOM, WS-Security, and other web-service-related features as well.
[1] http://wso2.org/library/dss
I've recently discovered a way to implement RESTful services using Global.asax (by handling the Application_BeginRequest event). Basically, I am saying it is possible (and easy) to implement a RESTful web service in classic ASP.NET, without any need for WCF.
It takes approximately 30 lines of code to figure out which method you want to call (from the URL) and pass it parameters (from the query string, via reflection) as well as serialize the result using XmlSerializer. It all leads to a web service that can be accessed through HTTP GET requests, and returns standard XML data.
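The asker's 30 lines aren't shown; a rough sketch of the idea (the /api/ prefix, service class, and method are all invented here) could look like this: pick the target method from the URL, bind query-string values to its parameters via reflection, and serialize the return value with XmlSerializer.

using System;
using System.Linq;
using System.Reflection;
using System.Web;
using System.Xml.Serialization;

public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // Only handle URLs under a hypothetical /api/ prefix, e.g. /api/Echo?text=hi
        string path = Request.AppRelativeCurrentExecutionFilePath;
        if (!path.StartsWith("~/api/", StringComparison.OrdinalIgnoreCase))
            return;

        string methodName = path.Substring("~/api/".Length);
        MethodInfo method = typeof(MyService).GetMethod(methodName,
            BindingFlags.Public | BindingFlags.Instance | BindingFlags.IgnoreCase);
        if (method == null)
            return;

        // Bind query-string values to the method's parameters by name (no error handling here).
        object[] args = method.GetParameters()
            .Select(p => Convert.ChangeType(Request.QueryString[p.Name], p.ParameterType))
            .ToArray();

        object result = method.Invoke(new MyService(), args);

        Response.ContentType = "text/xml";
        new XmlSerializer(result.GetType()).Serialize(Response.Output, result);
        Response.End();   // stop normal page processing
    }
}

// Hypothetical service class that the reflection call targets.
public class MyService
{
    public string Echo(string text) { return "You said: " + text; }
}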
So with that in mind, is there any reason to use WCF when creating a RESTful web service that will be invoked only through HTTP GET requests? WCF introduces a lot of overhead and restrictions, whereas the Global.asax approach I described above is much easier to implement, customize, and deploy.
Note - JSON endpoints can also be implemented without WCF by using the JavaScriptSerializer.
Also - HTTP POST requests can be handled by Global.asax in a similar way.
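For the JSON note, a tiny sketch: JavaScriptSerializer (from System.Web.Extensions) turns an object into a JSON string that can be written straight to the response.

using System.Web.Script.Serialization;

static class Json
{
    public static string ToJson(object value)
    {
        return new JavaScriptSerializer().Serialize(value);
    }
}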
So in the end, what would be the reason to use WCF in such a case? Is there better scalability or performance?
You can also use ASP.NET MVC to implement REST quite easily.
What you don't get for free with this approach is:
non-HTTP bindings.
support for multiple message formats.
non-IIS hosting.
control over the process activation.
control over the instance creation.
object pooling.
message queueing.
transactions.
scalability and reliability.
additional technologies built on top of WCF like the OData support.
If none of these applies to your scenario, you don't need WCF.
The answer is that they are two different pipelines: the WCF pipeline was built specifically for services, and the ASP.NET pipeline was built for rendering content via HTTP. Both have their pluses and minuses.
If you are comfortable with the ASP.NET stack and don't have to worry about things like standards, then ASP.NET is fine.
But if you want the best of both worlds, try WCF Data Services, all the cool built-in features of WCF, with none of the hassles. Currently, MVC does not have a view engine to generate OData.
I am creating a few basic web services using C#, and I am trying to have the web service return just a normal name=value&name=value response without any kind of XML or JSON format. The legacy system hitting these services is fairly old and doesn't support XML or JSON. Is doing this possible?
If the legacy system that's targeting this web service is that old, how exactly are you calling the web service from it? It may be easier to create an .aspx page (or even better, an .ashx handler) that parses the request and builds the response simply using Response.Write.
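A minimal sketch of the .ashx idea (the handler name and the name=value fields are hypothetical): an IHttpHandler that writes a plain name=value&name=value body instead of XML or JSON.

using System.Web;

public class LegacyHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // These values would normally come from your data layer; hard-coded here.
        string status = "ok";
        string count = "42";

        context.Response.ContentType = "text/plain";
        context.Response.Write("status=" + status + "&count=" + count);
    }

    public bool IsReusable
    {
        get { return true; }
    }
}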
If you update your question or add a comment with details about how you're calling the service, I'll update my answer accordingly. =)