I have a web server and a database server. There is a WCF service on the web server and a website that uses it. The website requests data from the WCF service, and the WCF service connects to the database server, fetches the data, and returns it to the website.
To optimize this process and reduce the number of calls to the WCF service, I decided to cache the data manually on the web server. One option I considered was the Microsoft Sync Framework, but then I realized I would have to build the synchronization logic myself, because the Sync Framework does not provide an option for my kind of process. My process will actually work like this:
The website requests the data.
The website's business logic checks whether the data is available in a SQL Server Compact Edition (.sdf) database in the site's App_Data folder.
If it is present, fetch the data from the Compact Edition database.
If it is not present, call the WCF service to fetch the data from the main database server, copy it into the Compact Edition database, and then read it from there (a sketch follows below).
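For what it's worth, here is a minimal C# sketch of that check-the-local-cache-first flow, just to make the steps concrete. The WCF proxy (DataServiceClient), the Products table, and the Product type are hypothetical placeholders, not taken from any real project:

    // Cache-aside lookup: try the local SQL Server Compact (.sdf) file first,
    // fall back to the WCF service and store the result locally on a miss.
    using System.Data.SqlServerCe;

    public class Product { public int Id; public string Name; }

    public class ProductCache
    {
        private const string CeConnectionString = @"Data Source=|DataDirectory|\Cache.sdf";

        public Product GetProduct(int id)
        {
            Product local = ReadFromCompact(id);
            if (local != null)
                return local;                                   // cache hit

            Product fromServer;
            using (var client = new DataServiceClient())        // hypothetical generated WCF proxy
                fromServer = client.GetProduct(id);

            WriteToCompact(fromServer);                         // cache miss: store for next time
            return fromServer;
        }

        private Product ReadFromCompact(int id)
        {
            using (var conn = new SqlCeConnection(CeConnectionString))
            using (var cmd = new SqlCeCommand("SELECT Id, Name FROM Products WHERE Id = @id", conn))
            {
                cmd.Parameters.AddWithValue("@id", id);
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    return reader.Read()
                        ? new Product { Id = reader.GetInt32(0), Name = reader.GetString(1) }
                        : null;
            }
        }

        private void WriteToCompact(Product p)
        {
            using (var conn = new SqlCeConnection(CeConnectionString))
            using (var cmd = new SqlCeCommand("INSERT INTO Products (Id, Name) VALUES (@id, @name)", conn))
            {
                cmd.Parameters.AddWithValue("@id", p.Id);
                cmd.Parameters.AddWithValue("@name", p.Name);
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }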
So what I want to ask is: is this technique efficient? And if yes, is there an existing way to implement it quickly, or do I have to code all of it manually?
It could work, but if it were me, and my goal was to increase performance by caching the calls between the web server and a back-end SQL Server, I wouldn't choose SQL Server Compact Edition for caching purposes.
Something like Redis or memcached might be a more appropriate, higher-performance caching layer. SQL Server Compact may cut down on the number of calls to your back-end server, but it might do so at the expense of slower response times overall.
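For example, a cache-aside lookup with the StackExchange.Redis client might look roughly like this (the key naming, the ten-minute expiry and the FetchFromWcf placeholder are assumptions, not part of your setup):

    // Cache-aside lookup against a Redis instance using StackExchange.Redis.
    using System;
    using StackExchange.Redis;

    public static class ProductJsonCache
    {
        // Reuse one multiplexer for the whole process; it is designed to be shared.
        private static readonly ConnectionMultiplexer Redis = ConnectionMultiplexer.Connect("localhost");

        public static string GetProductJson(int id)
        {
            IDatabase db = Redis.GetDatabase();
            string key = "product:" + id;

            string cached = db.StringGet(key);
            if (cached != null)
                return cached;                                   // cache hit

            string json = FetchFromWcf(id);                      // placeholder for the existing WCF call
            db.StringSet(key, json, TimeSpan.FromMinutes(10));   // expire so stale data ages out
            return json;
        }

        private static string FetchFromWcf(int id) { return "{}"; /* call the WCF service here */ }
    }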
It could be efficient in some circumstances, but I would propose keeping the cached data in a local SQL Server Express instance instead of Compact.
The type system of SQL Server Compact is different from SQL Server's. I know that EF6 supports both, but I would not start this journey on my own...
In any case, there should be a good reason why you need to implement a cache through an RDBMS at all.
You can also use NHibernate's second-level caching. For more information, read this article: http://www.codeproject.com/Articles/529016/NHibernate-Second-Level-Caching-Implementation. It will store data in memcached and detect changes to records.
Another caching technology is AppFabric; you can see more details here: http://msdn.microsoft.com/en-us/library/ff383731(v=azure.10).aspx
Performance can be increased by using memcached.
These are the steps to implement it:
Create a Windows service that retrieves data from the database and stores it in memcached as JSON (key/value pairs).
For the website, create a handler file as an API that retrieves the data from memcached and returns the result.
I have implemented this in one of my projects; it retrieves thousands of records in milliseconds. A rough sketch of the idea follows.
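Here is a minimal sketch of those two pieces, assuming the Enyim.Caching memcached client and Json.NET for serialization (the key name and the Customer type are placeholders; your actual libraries and schema may differ):

    // Windows-service side: load data from the database and push it into memcached as JSON.
    using System.Collections.Generic;
    using Enyim.Caching;
    using Enyim.Caching.Memcached;
    using Newtonsoft.Json;

    public class Customer { public int Id; public string Name; }

    public class CacheLoader
    {
        private readonly MemcachedClient _cache = new MemcachedClient();   // configured in app.config

        public void RefreshCustomers(IEnumerable<Customer> customers)      // already loaded from the DB
        {
            string json = JsonConvert.SerializeObject(customers);
            _cache.Store(StoreMode.Set, "customers:all", json);            // key/value pair in memcached
        }
    }

    // Website side: a generic handler (.ashx) that serves the cached JSON to the browser.
    public class CustomersHandler : System.Web.IHttpHandler
    {
        private static readonly MemcachedClient Cache = new MemcachedClient();

        public void ProcessRequest(System.Web.HttpContext context)
        {
            string json = Cache.Get<string>("customers:all") ?? "[]";
            context.Response.ContentType = "application/json";
            context.Response.Write(json);
        }

        public bool IsReusable { get { return true; } }
    }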
I am working on a 3-tier project management application (IIS server + application server + DB server). We have a large JSON payload (~1 MB) that we need to cache in memory so that we can minimize the database calls needed to fetch it. There may be ~10-50 users logged in simultaneously, and the total cache size may vary between 10 and 50 MB.
I cannot use a caching server, as it would add a dependency on another server and we would need to check whether it is up or not to avoid problems downstream.
It's not a "highly scalable" application, so I want something easy to implement that performs well for the parameters above.
Thanks for your help.
There are several solutions. You can use a NoSQL store such as memcached or Redis; you can use the .NET in-memory cache ("System.Runtime.Caching": ObjectCache/MemoryCache); or, if you're using MSSQL 2012, you could use an MSSQL memory table.
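Since you want to avoid a dependency on a separate cache server, the in-process MemoryCache from System.Runtime.Caching is probably the simplest fit for your numbers. A minimal sketch (the key, the 30-minute expiry and the LoadJsonFromDatabase placeholder are assumptions):

    using System;
    using System.Runtime.Caching;

    public static class ProjectDataCache
    {
        private static readonly ObjectCache Cache = MemoryCache.Default;

        public static string GetProjectJson(int projectId)
        {
            string key = "project-json:" + projectId;

            var cached = Cache.Get(key) as string;
            if (cached != null)
                return cached;                                   // already in memory

            string json = LoadJsonFromDatabase(projectId);       // placeholder for the real DB call
            var policy = new CacheItemPolicy
            {
                AbsoluteExpiration = DateTimeOffset.Now.AddMinutes(30)   // let stale entries age out
            };
            Cache.Set(key, json, policy);
            return json;
        }

        private static string LoadJsonFromDatabase(int projectId) { return "{}"; }
    }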
I was wondering what the best approach to making a secure connection to SQL Server would be? Here is my scenario. At my work, we have SQL Server 2012 Standard. My boss wanted me to create a new DB utilizing TDE. I found out that you have to have Enterprise Edition in order to use TDE. We looked into it and it was going to cost a fortune, so we are not going to purchase Enterprise Edition. So I was thinking about using stored procedures to interact with the DB. Is this more secure than submitting SQL queries across the web? Also, what is the best security measure for communicating and transferring data to/from the web app and DB server?
Thanks in advance,
Brad
EDIT:
Also, is there any way to securely send username/password credentials in the connection string?
Stored procedures would in a sense be more secure, since you simply pass parameters into the procedure to generate the desired result. This masks the underlying SQL statement, so it could be considered more secure. I think most places rely on the Windows Authentication side of SQL Server in a domain environment.
It is fairly secure, more so if your site is wrapped in SSL. Avoid standard SQL authentication; it is text based and shouldn't really be considered.
Code-wise, you probably want a layer in between your DB and your website to do all the heavy lifting. This somewhat obfuscates what your website is doing, since it only calls your middle man, which handles all the truly transactional work.
Also, how are users going to be interacting with your website? Will they be required to log in first, and what mechanism will control this? There are quite a few other design details to figure out before you can really consider which method will be the best balance of security and usability. I'd go for Windows Auth/SSL and use a security account to perform all your transactions. It's easy to set up and AFAIK not easy to hack.
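To make the stored-procedure-plus-Windows-Authentication suggestion concrete, a minimal ADO.NET call could look like the sketch below. The server, database, procedure and parameter names are invented for illustration:

    using System.Data;
    using System.Data.SqlClient;

    public static class OrderRepository
    {
        // Integrated Security = Windows Authentication; no SQL credentials appear in the string.
        private const string ConnectionString = "Server=DBSERVER;Database=Sales;Integrated Security=SSPI;";

        public static DataTable GetOrdersForCustomer(int customerId)
        {
            using (var conn = new SqlConnection(ConnectionString))
            using (var cmd = new SqlCommand("dbo.GetOrdersForCustomer", conn))
            {
                cmd.CommandType = CommandType.StoredProcedure;   // only the procedure name and parameters cross the wire
                cmd.Parameters.Add("@CustomerId", SqlDbType.Int).Value = customerId;

                var table = new DataTable();
                conn.Open();
                using (var reader = cmd.ExecuteReader())
                    table.Load(reader);
                return table;
            }
        }
    }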
These are two different things. TDE helps you just with encrypting data on the file system (so if I have access to the file system where you keep your DB, I won't be able to read it if you're using TDE).
Communication between the application and the DB is a different issue. There are several things you can do:
Open the database's network ports only to the web server (the DB can be reached only from the web server's IP(s)).
Use integrated authentication (no one can sniff your password).
Embed your business logic in stored procedures (you limit access to the DB to just the functions needed by your web application).
However, the stored procedures part in particular can be a pain (ORMs like EF, LINQ to SQL, or NHibernate are just terrible when it comes to stored procedures). Also, this approach doesn't guarantee that no one will be able to see the data travelling from the database server to the web server.
If sniffing data between the web server and the DB server is a concern, you have to write a web service for accessing the data. This web service should sit on a trusted network with the DB server (as close to the DB as it can be; the same box is best). The web server should call this web service over HTTPS (so sniffing the data between the web server and the web service is impossible) and use authentication to access the web service (Windows authentication is recommended).
After much Google searching and at the risk of asking dumb questions, I could use some help. I’m developing a C# WinForms client application using ADO.NET to read/write data from a SQL Server 2012 database located on the Internet. That same application also needs to upload/download data files. The client application will only be used by a few employees (ever). The employees are all in different locations. The database is only about 20 MB. There will be about 100 data files totaling about 300 MB accessed individually on a periodic basis. SQL Server 2012 is running on a (non-virtual) Windows Server 2008 R2 machine which we have full control over. The client application will be running on Win-XP and Win-7 machines.
Priorities are: 1. Internet security – keeping hackers out of the Windows Server machine and off the client/server communications. 2. Performance. 3. Simplicity. Corporate security and scalability are not issues. Also, performance is not that important if the solution is overly complicated.
Two related questions I could really use help on:
Given the above priorities, what is the best way to communicate with the database? The only two options I've found are a WCF service or connecting directly through a VPN.
And again, given the above priorities, what is the best way to upload/download data files? I'm sure there are many options for this using VPN, WCF, or FTP, but I don't know the specifics. Also, using a SQL Server 2012 FileTable looks promising, but I'm not sure how that works over the web. Backup/restore plus being able to do a full-text search over the data would be nice features but not requirements.
I know what a VPN is but have never used one for these purposes. I know there are some security issues with PPTP, but we won’t be upgrading the XP machines for a while. I know what a WCF service is but have never written one. I also don’t know if SOAP or REST is better in this instance. I’ve built a FileTable in SQL Server, but I don’t know how to access the data remotely. I have decent knowledge of C#, ADO.NET, and SQL Server.
I realize these are big questions with subjective answers. Still, any ideas or a shove in the right direction would be greatly appreciated.
Keep it simple and use standard mechanisms. My recommendation is as follows:
Build a WCF service that is capable of performing the operations you want. You can build a SOAP or RESTful service. My general guidance here is to build a RESTful service, because you're transferring files and that is much more natural with REST; with SOAP there are settings you'll need to fiddle with to transfer large files.
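As a sketch only, a stripped-down RESTful WCF contract for streaming files up and down might look like this (the URI templates, folder path and method names are assumptions, not a finished service):

    using System.IO;
    using System.ServiceModel;
    using System.ServiceModel.Web;

    [ServiceContract]
    public interface IFileService
    {
        // GET /files/{name} - stream a file down to the client.
        [OperationContract]
        [WebGet(UriTemplate = "files/{name}")]
        Stream Download(string name);

        // PUT /files/{name} - stream a file up from the client.
        [OperationContract]
        [WebInvoke(Method = "PUT", UriTemplate = "files/{name}")]
        void Upload(string name, Stream contents);
    }

    public class FileService : IFileService
    {
        private const string Root = @"C:\Data\Files";   // assumed storage folder on the server

        public Stream Download(string name)
        {
            return File.OpenRead(Path.Combine(Root, Path.GetFileName(name)));
        }

        public void Upload(string name, Stream contents)
        {
            using (var target = File.Create(Path.Combine(Root, Path.GetFileName(name))))
                contents.CopyTo(target);
        }
    }

You would host this over webHttpBinding with TransferMode set to Streamed and a larger maxReceivedMessageSize so large files can move without being buffered entirely in memory.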
Use SSL to secure the service and keep it simple. A VPN is an added layer of complexity and very likely not needed in this scenario. Further, it will only make the experience less friendly for the users.
I would not recommend using the FileTable in SQL Server 2012 for your needs. You own the server, so when you send and receive files it will be much more straightforward to deal with the file system.
You can also build a simple forms-authentication process that creates a session key for the user and passes it back. I'm not sure this is necessary, but if you need that extra layer, just make it one of the operations. That session key can then be passed into each method and validated before performing the operation. This will be safe because you're using SSL.
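If you do go that route, the session-key piece can be very small; something like this in-memory sketch (purely illustrative, with no persistence or expiry):

    using System;
    using System.Collections.Concurrent;

    public static class SessionStore
    {
        private static readonly ConcurrentDictionary<string, string> Sessions =
            new ConcurrentDictionary<string, string>();

        // Called by a Login operation after the username/password check succeeds.
        public static string CreateSession(string userName)
        {
            string key = Guid.NewGuid().ToString("N");
            Sessions[key] = userName;
            return key;        // the client passes this key to every subsequent operation
        }

        // Called at the top of every other operation before doing any work.
        public static bool IsValid(string sessionKey)
        {
            return sessionKey != null && Sessions.ContainsKey(sessionKey);
        }
    }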
Here is a tutorial that will help walk you through building a RESTful WCF service, and it's fairly new.
My recommendation would be to deploy a VPN server to provide the security you are looking for. There are a number of good VPN servers available, and a quick search should turn up options at varying price points.
Once you have deployed the VPN server (and clients on all computers outside your local network that need to access the database), you can use ADO.NET to access the database. ADO.NET will work seamlessly through the VPN.
From the context of your question I am assuming that the files are stored in a file system outside of the database, and the database merely references the files. If this is the case, you could use any number of options for downloading the files, but FTP is a time-tested, easy-to-implement solution. There are others that may or may not work better in your situation (see here for a few options).
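For a sense of how little client code the FTP route needs, a minimal download with FtpWebRequest might look like this (host, credentials and paths are placeholders; FTPS via EnableSsl is one way to keep the transfer out of clear text):

    using System.IO;
    using System.Net;

    public static class FtpDownloader
    {
        public static void Download(string remoteFile, string localPath)
        {
            var request = (FtpWebRequest)WebRequest.Create("ftp://ftp.example.com/" + remoteFile);
            request.Method = WebRequestMethods.Ftp.DownloadFile;
            request.Credentials = new NetworkCredential("appUser", "appPassword");
            request.EnableSsl = true;   // FTPS, so credentials and data are not sent in clear text

            using (var response = (FtpWebResponse)request.GetResponse())
            using (var ftpStream = response.GetResponseStream())
            using (var file = File.Create(localPath))
            {
                ftpStream.CopyTo(file);
            }
        }
    }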
I have a web-based CRM system that stores all the client data from all the clients in one database (MS SQL Server). We need to build a system that maintains a local copy of the data for each client. So basically the client database will have all tables and columns except for the ClientId column that is in every server table. I am aware that I will need to add fields on the server to support synchronization.
Are there any good solutions or components already out there to help me accomplish this?
We are using MS SQL Server and .NET/C#.
A central SQL Server vs. a local one (Express) - this is what MS does in its CRM.
The Sync Framework (part of the SQL Server stack, etc.) could be helpful for programming a sync mechanism, in case you do not want to use replication here - which MAY work.
I have a datalogging application (c#/.net) that logs data to a SQLite database. This database is written to constantly while the application is running. It is also possible for the database to be archived and a new database created once the size of the SQLite database reaches a predefined size.
I'm writing a web application for reporting on the data. My web setup is c#/.Net with a SQL Server. Clients will be able to see their own data gathered online from their instance of my application.
For test purposes, to upload data to test with, I've written a rough-and-dirty application which basically reads from the SQLite DB and then injects the data into SQL Server using SQL; I run the application once to populate the online SQL Server DB.
My application is written in C# and is modular, so I could add a process that periodically checks the SQLite DB and then transfers new data in batches to my SQL Server.
My question is: if I wanted to continually synchronise the client-side SQLite database(s) with my server while the application is datalogging, what would be the best way of going about this?
Is there any technology/strategy I should be looking into employing here? Any recommended techniques?
Several options come to mind. You can add a timestamp to each table that you want to copy from and then select rows written after the last update. This is fast and will work if you archive the database and start with an empty one.
You can also journal your updates for each table as an XML string that describes the changes, and store it in a new table that is treated as a queue.
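A rough sketch of the timestamp idea, reading rows logged since the last sync from SQLite and bulk-loading them into SQL Server (table and column names, connection strings, and the System.Data.SQLite provider are all assumptions):

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Data.SQLite;

    public static class LogShipper
    {
        public static DateTime PushNewRows(DateTime lastSyncedUtc)
        {
            var batch = new DataTable();

            // 1. Pull rows written after the last successful sync from the local SQLite file.
            using (var source = new SQLiteConnection("Data Source=datalog.sqlite"))
            using (var cmd = new SQLiteCommand(
                "SELECT Id, LoggedAtUtc, Value FROM LogEntries WHERE LoggedAtUtc > @last", source))
            {
                cmd.Parameters.AddWithValue("@last", lastSyncedUtc);
                source.Open();
                using (var reader = cmd.ExecuteReader())
                    batch.Load(reader);
            }

            if (batch.Rows.Count == 0)
                return lastSyncedUtc;

            // 2. Bulk copy the batch into the central SQL Server database.
            using (var target = new SqlConnection("Server=.;Database=Reporting;Integrated Security=SSPI;"))
            {
                target.Open();
                using (var bulk = new SqlBulkCopy(target) { DestinationTableName = "dbo.LogEntries" })
                    bulk.WriteToServer(batch);
            }

            // In production you would persist the max LoggedAtUtc from the batch as the new
            // high-water mark rather than relying on the local clock.
            return DateTime.UtcNow;
        }
    }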
You could take a look at the Sync Framework. How complex is the schema that you're looking to sync up, and is it only one-way or does data need to come back down?
As a simple solution, I'd look at exporting the data in some delimited format and then using bcp/BULK INSERT to pull it into your central server.
You might want to investigate the concept of Log Shipping.
There is an open source project on GitHub, also available on NuGet, called SyncWinR. It implements the Sync Framework Toolkit to enable synchronization with WinRT or Windows Phone 8 and SQLite.
You can access the project from https://github.com/Mimetis/SyncWinRT.