I'm about to release a small tool that uses a database connection to store data. The question is: how can I prevent people from reverse engineering my code and extracting the username and password to gain access to the database?
For earlier projects (which were used only by myself), I defined the connection string as a global variable inside my app. But that's highly unsafe, as it takes only minutes to extract this string from the exe.
Also, a lot of code-obfuscation methods can be reversed.
I am really a big fan of providing code, but I don't know what to post here. This is more a question about the theory; the coding is the part I'll take care of myself.
Here is a small idea from me which I don't really like that much:
I could place a second tool on the server. The real app would connect to this second tool, hand over the data, and the second tool would then connect to the database itself. This way the connection string would be stored inside the second app, where nobody can grab it.
The fact of the matter is that storing sensitive information on the client machine leaves your database highly vulnerable to attack. One suggestion you can look into is a three-tier architecture for your application (http://en.wikipedia.org/wiki/Multitier_architecture#Three-tier_architecture). In a three-tier architecture, you have your presentation layer (your application), your logic tier (the central layer through which all your clients access the database), and your database layer (the server where your database lives). With this architecture, you can ensure that all data is stored and retrieved through a single source with a high level of security.
In the past (and still in the present), programmers would have to create their own socket servers or do advanced network programming to develop a solution like this. However, Microsoft has developed a tool called Windows Communication Foundation (WCF), which takes away the pain of coding your own socket server and lets you focus on your own implementation. Be warned though: WCF is secure by default, but that is no excuse not to research ways of making your product robust against hackers (like knowing what protocol you are going to use, what security measures you are going to apply (transport vs. message, etc.), and encrypting data on the client side so potential viruses don't uncover sensitive information). That said, WCF is a highly polished service, and it is really easy to get something up and running.
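To make the idea concrete, here's a minimal sketch of a WCF service standing between the clients and the database; the contract, names, and schema here are all hypothetical, and the connection string lives only in the server's own config:

```csharp
using System.ServiceModel;

// Hypothetical contract: clients call this service instead of the database.
[ServiceContract]
public interface ICustomerService
{
    [OperationContract]
    string GetCustomerName(int customerId);
}

public class CustomerService : ICustomerService
{
    // Read from server-side config (requires a System.Configuration reference);
    // this string is never shipped to clients.
    private static readonly string ConnectionString =
        System.Configuration.ConfigurationManager
              .ConnectionStrings["MainDb"].ConnectionString;

    public string GetCustomerName(int customerId)
    {
        using (var conn = new System.Data.SqlClient.SqlConnection(ConnectionString))
        using (var cmd = new System.Data.SqlClient.SqlCommand(
            "SELECT Name FROM Customers WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", customerId);
            conn.Open();
            return (string)cmd.ExecuteScalar();
        }
    }
}
```

The clients only ever see the service endpoint; even a fully reverse-engineered exe yields no database credentials.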
A good beginner video tutorial on WCF can be found here: https://www.youtube.com/playlist?list=PLhq7kqloVlM-bI9W_7iDZhObAeyrFt1y_
EDIT: The playlist for the videos is gone, but the videos themselves are still there. Just search through all his videos for the keyword 'WCF'.
Here's the link: https://www.youtube.com/user/JesseDietrichson/featured
After much searching, I am going to pose my question here. If there is a duplicate and my search-foo abilities failed me, I will gladly defer to it. On to the question.
I have a heavy background in service and web applications with centralized, 'always on' database servers. However, a friend of mine is in need of a simple CRUD application (probably C#, due to my limited experience with other GUI libraries/APIs) to replace a costly solution he currently has.
The application itself is very simple, and I have planned to use Dropbox as a network layer to keep it even simpler. The issue I'm having trouble solving is that I need a good way to handle a portable database for authentication/authorization. I was considering SQLite stored in a read-only folder within Dropbox, so that it stays current while connected to the internet but doesn't require a network connection to function (they will not always have internet when using the program).
I am looking for suggestions, reading material, or experience with a similar design as a starting place.
Thanks for taking the time to help.
I recommend SQLite. It works on almost all platforms and is pretty easy to use. With .NET I use a micro-ORM with it, like Dapper.
Here is an example: http://blog.maskalik.com/asp-net/sqlite-simple-database-with-dapper
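And here's a minimal sketch of SQLite plus Dapper along those lines (the file, table, and class names are made up):

```csharp
using System.Collections.Generic;
using System.Data.SQLite;   // System.Data.SQLite NuGet package
using Dapper;               // Dapper micro-ORM

public class Note
{
    public int Id { get; set; }
    public string Text { get; set; }
}

public static class NotesDb
{
    // The .db file can live in a shared (e.g. Dropbox) folder.
    private const string ConnString = "Data Source=notes.db;Version=3;";

    public static void Save(string text)
    {
        using (var conn = new SQLiteConnection(ConnString))
        {
            conn.Open();
            conn.Execute(
                "CREATE TABLE IF NOT EXISTS Note (Id INTEGER PRIMARY KEY, Text TEXT)");
            conn.Execute(
                "INSERT INTO Note (Text) VALUES (@Text)", new { Text = text });
        }
    }

    public static IEnumerable<Note> All()
    {
        // Dapper opens/closes the connection itself and buffers results by default.
        using (var conn = new SQLiteConnection(ConnString))
            return conn.Query<Note>("SELECT Id, Text FROM Note");
    }
}
```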
You can also use SQL Server Compact Edition (SQLCE) to achieve this very quickly. VS provides designers for making SQLCE database files, and ADO.NET plays nicely with it. It also supports simple replication.
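A rough sketch of creating a SQLCE database file from code, assuming a reference to the System.Data.SqlServerCe assembly (the file and table names are made up):

```csharp
using System.Data.SqlServerCe;  // ships with SQL Server Compact Edition
using System.IO;

public static class LocalDb
{
    private const string ConnString = "Data Source=app.sdf";

    public static void EnsureCreated()
    {
        // Create the .sdf database file on first run.
        if (!File.Exists("app.sdf"))
        {
            using (var engine = new SqlCeEngine(ConnString))
                engine.CreateDatabase();

            using (var conn = new SqlCeConnection(ConnString))
            using (var cmd = new SqlCeCommand(
                "CREATE TABLE Users (Id INT IDENTITY PRIMARY KEY, Name NVARCHAR(100))",
                conn))
            {
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```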
I'm trying to bootstrap a micro ISV on my nights and weekends. I have an application at a very early stage of development. It is written in C# and consists mainly of a collection of classes representing the problem domain. At this point there's no UI or data persistence. (I haven't even settled on the .NET platform. It's early enough that I could change to Java or native executables.)
My goal for this application is that it will be a hybrid single-user / occasionally connected multiuser application. The single-user part will use an embedded database for local storage. This is a development model I'm familiar with.
The multiuser part is where I have no prior experience. I know each user will need two things:
IP based communication to a remote server on the public internet
User authentication and remote data storage
I have an idea of what services I want this server to provide (information lookup and user to user transactions) but beyond that I'm out of my element. The server will need to be hosted by a third party since I don't have resources to run my own server. Keeping in mind that I will be the sole developer for this project for the foreseeable future:
Which technologies would be the simplest way to implement the two things mentioned above? Direct access to the datastore/database or is it better to isolate it? Should I implement a webservice? If so, SOAP or REST?
What other things do I need to consider when moving to a multiuser application?
I know security is a greater concern in a multiuser application, especially when you're dealing with any kind of banking information (which I will be). Performance can be an issue when dealing with a remote connection and large numbers of users. Anything else I'm overlooking?
Regarding moving to a multiuser application, centralising your data is the first step of course, and the simplest way to achieve it is often to use a cloud-based database, such as Amazon SimpleDB or MS Azure. You typically get an access key and a long 'secret' for authentication.
If your data isn't highly relational, you might want to consider Amazon SimpleDB. There are SDKs for most languages, which allow simple code to store/retrieve data in your SimpleDB database using a key and secret, anywhere in the world. You pay for the service based on your data storage and volume of traffic, so it has a very low barrier of entry, especially during development. It will also scale from a tiny home application up to something of the size of amazon.com.
If you do choose to implement your own database server, you should remember two key things:
Ensure no session state exists, i.e. the client makes a call to your web service, some action occurs, and the server forgets about that client (apart from any changed data in the database of course). Similarly the client should not be holding any data locally that could change as a result of interaction from another user. Cache locally only data you know won't change (or that you don't care if it changes).
For a web service, each call will typically be handled on its own thread, and so you need to ensure that access to the database from multiple threads is safe. If you use the standard .NET or Java ways of talking to a SQL database, this should be handled for you. However, if you implement your own data storage, it would be something you'd need to worry about.
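For example, the usual ADO.NET pattern is a fresh connection per call, with the built-in connection pool doing the heavy lifting; the table and column names below are hypothetical:

```csharp
using System.Data.SqlClient;

public static class AccountData
{
    // A new SqlConnection per call is cheap: ADO.NET pools the underlying
    // physical connections, and no state is shared between threads.
    public static decimal GetBalance(int accountId, string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Balance FROM Accounts WHERE Id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", accountId);
            conn.Open();
            return (decimal)cmd.ExecuteScalar();
        }
    }
}
```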
Regarding the question of REST/SOAP etc., a key consideration should be what kinds of platforms/devices you want to use to connect to the database server. For example if you were implementing your server in .NET you might consider WCF for implementing your web services. However that might introduce difficulties if you later want to use non-.NET clients. SOAP is a mature technology for web services, but quite onerous to implement, and libraries to wrap up the handling of SOAP calls may not necessarily be available for a given client platform. REST is simple to implement (trivially easy if you use ASP.NET MVC on your server), accessible by any client that can handle HTTP POST/GET without the need for libraries, and easy to test, so REST would be my technology of choice.
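As a taste of how little code a REST-style endpoint takes in ASP.NET MVC, here's a hypothetical controller (the names and the data-access call are made up):

```csharp
using System.Web.Mvc;

public class AccountsController : Controller
{
    // With the default route, GET /Accounts/Balance/42 returns JSON
    // that any client capable of HTTP GET can consume.
    public JsonResult Balance(int id)
    {
        var balance = LookUpBalance(id);  // hypothetical data-access call
        return Json(new { AccountId = id, Balance = balance },
                    JsonRequestBehavior.AllowGet);
    }

    private decimal LookUpBalance(int id)
    {
        // Placeholder; a real app would query the database here.
        return 0m;
    }
}
```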
If you are sticking with .net (my personal preference), I would expose data access calls via WCF. WCF configuration is really flexible and pretty easy to pick up and you'll want to hide your DB behind a service layer.
1. Direct access to the DB is the simplest, and the worst. Just think about how you'd authenticate the DB access... I would just write a remote-able API with serializable parameters (see the sketch below), and worry about how to connect it later (web services, IIOP, whatever) - the communication details are all wrapped and hidden anyway.
2. None.
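A rough sketch of what such a remote-able API might look like (all names here are hypothetical); the transport behind it can be chosen later without touching callers:

```csharp
using System;

// Parameters and results are plain serializable types, so the same API
// can later be exposed over web services, remoting, or anything else.
[Serializable]
public class TransferRequest
{
    public int FromAccount;
    public int ToAccount;
    public decimal Amount;
}

[Serializable]
public class TransferResult
{
    public bool Succeeded;
    public string Message;
}

public interface IBankingApi
{
    TransferResult Transfer(TransferRequest request);
}
```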
I will soon begin the painful (kidding) process of migrating multiple, separate Access applications to "real" applications (notice the quotes; no flame wars, please). Most likely these will be web apps, as the usual reasons are multiple users and deployability, but I will take it case by case.
Some of these are traditional Access apps using Access as the back end, and others are using SQL Server (a central one) as the back end.
What I am looking for is a combination of your experience doing this and what resources you used to help.
Websites, apps, standards, best practices, gotchas, don't-forgets, etcetera.
I am a one-person C# shop with a SQL Server back end, so whether web or not, I will be looking in that direction.
Also, is it overkill or unattainable to try and develop a Framework for this kind of thing? Would there just be TOO MANY variables to even try and walk this path? Anyone ever try this?
Some further info based on below questions. We currently have ~250 users and they are spread between 5 Locations.
What I meant by deployability is perhaps a little vague. I simply meant that we are a non-profit organization, and as such we do not have the best bandwidth available, so deploying full apps, even through ClickOnce, can be tricky when combined with the highly fickle nature of my users (I want that box purple, no green, no get rid of it altogether type stuff...).
My idea is to try and develop a "framework", of sorts, that will help to streamline the process of moving an Access App to a .Net App.
Now I fully understand that this "framework" may be nothing more than a set of steps and guidelines, like: use an ORM (LINQ2SQL or SubSonic) to generate the DAL, copy the UI to corresponding UserControls, rewrite the business logic.
I am just looking for your experience/expertise to help me streamline my streamlining process... ;)
Those apps which use an Access database to store tables and which need web access should first be upsized to SQL Server. There is a tool for this from the SQL Server group: SQL Server Migration Assistant for Access (SSMA for Access).
Then consider moving to the web only the portion of the app that requires remote access, and leaving the rest of the app in Access. That could save a considerable amount of time.
Alternatively consider going to Terminal Server. That along with a VPN means just some software licensing costs and next to no work on your part.
That said, what do you mean by "multiple users" and "deployability"? Possibly we can give you some suggestions there. Access is multi-user out of the box. However, if you have mission-critical data, can't rekey the data in the event of a corruption, or have more than 25-50 users on the LAN, then you should be moving the data to SQL Server.
Now that it's public, Access 2010 can deploy applications to the web. All kinds of very interesting stuff can be done. For more information, check the Microsoft Access product group blog or my blog with the appropriate Access 2010 tags.
Speaking from experience I think you would need to upgrade on a case by case basis. Upgrading is essentially a re-write from scratch and you should take the opportunity here to re-design as necessary. The type of application structure and code style used for Access (likely to be procedural I'm guessing) is very different to a well designed OO .Net app.
You will be able to re-use the SQL Server databases of course and, depending on the app, maybe even the Access ones. If you're feeling brave you could even try the upsizing wizard, although I wouldn't recommend it, as we found the results less than ideal.
I would also advise you to take a look at some kind of ORM tool (we use SubSonic), as this can massively reduce the amount of boilerplate code you need to write. Some ORM tools will also generate DDL for your database.
We follow these standards (a good idea to pick a standard early on and stick to it, we found) and also found this really useful for getting up and running.
Hope this was some help.
I've been asked to research approaches to deal with an app we're supposed to be building. This app, hypothetically a Windows form written in C#, will issue commands directly to the server if it's connected, but if the app is offline, the state must be maintained as if it was connected and then sync up and issue data changes/commands to the server once it is connected.
I'm not sure where to start looking. This is something akin to Google Gears, but I don't think I have that option if we go a Winform route (which looks likely, given that there are other functions the application needs that a web app couldn't perform). Is the Microsoft Sync framework a viable option? Does Silverlight do anything like this? Any other options? I've Googled around a bit but would like the community input on what's best given the scenario.
The Microsoft Sync Framework definitely supports the scenario you describe, although I would say that it's fairly complicated to get it working.
One thing to understand about the Sync Framework is that it's really two quite distinct frameworks shipping in the same package:
Sync Framework
ADO.NET Sync services v. 2
The ADO.NET Sync services are by far the easiest to set up, but they are constrained to synchronizing two relational data stores (although you can set up a web service as a remote facade between the two).
The core Sync Framework has no such limitations, but is far more complex to implement. When I used it about six months ago, I found that the best source to learn from was the SDK, and particularly the File/Folder sync sample code.
As far as I could tell, there was little to no sharing of code and types between the two 'frameworks', so you will have to pick one or the other.
In either case, there are no constraints on how you host the sync code, so Windows Forms is just one option among many.
If I understand correctly, this doesn't sound like an actual data-synchronization issue where you want to keep two databases in sync. It sounds more like you want a reliable mechanism for a client to call functions on a server in an environment where the connection is unstable, and if the connection is not present at the time, you want the function called as soon as the connection is back up.
If my understanding is right, this is one option. If not, this will probably not be helpful.
This is a very short answer to an in-depth problem, but we had a similar situation and this is how we handled it.
We have a client application that needs to monitor some data on a PC in a store. When certain events happen, this client application needs to update our server in the corporate offices, preferably in real time. However, the connection is not 100% reliable, so we needed a similar mechanism.
We solved this by trying to write to the server via a web service. If there is an error calling the web service, the command is serialized as an XML file in a folder named "waiting to upload".
We have a routine running in our client app on a timer set for every n minutes. When the timer elapses, it checks for XML files in this folder. If found, it attempts to call the web service using the information saved in the file, and so on until it is successful. Upon a successful call, the XML file is deleted.
It sounds hack-ish, but it was simple to code and has worked flawlessly for five years now. It's actually been our most trouble-free application all around, and we've implemented the pattern elsewhere successfully.
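Here's a condensed sketch of that pattern (folder, type, and method names are made up, and the real code also needs file locking and poison-message handling):

```csharp
using System;
using System.IO;
using System.Xml.Serialization;

[Serializable]
public class PendingCommand
{
    public string Action;
    public string Payload;
}

public static class UploadQueue
{
    private const string Folder = @"C:\App\WaitingToUpload";
    private static readonly XmlSerializer Serializer =
        new XmlSerializer(typeof(PendingCommand));

    public static void Send(PendingCommand cmd)
    {
        try
        {
            CallWebService(cmd);               // normal path: send immediately
        }
        catch (Exception)
        {
            // Server unreachable: park the command on disk for the retry timer.
            Directory.CreateDirectory(Folder);
            var path = Path.Combine(Folder, Guid.NewGuid() + ".xml");
            using (var stream = File.Create(path))
                Serializer.Serialize(stream, cmd);
        }
    }

    // Called from a timer every n minutes.
    public static void RetryPending()
    {
        if (!Directory.Exists(Folder)) return;
        foreach (var file in Directory.GetFiles(Folder, "*.xml"))
        {
            try
            {
                PendingCommand cmd;
                using (var stream = File.OpenRead(file))
                    cmd = (PendingCommand)Serializer.Deserialize(stream);
                CallWebService(cmd);
                File.Delete(file);             // success: remove from the queue
            }
            catch (Exception) { /* still offline; try again next tick */ }
        }
    }

    private static void CallWebService(PendingCommand cmd)
    {
        // Placeholder for the actual web-service proxy call.
        throw new NotImplementedException();
    }
}
```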
So my company stores a lot of data in a FoxPro database, and to get around the performance hit of touching it directly, I was thinking of messaging anything that can be done asynchronously for a snappier user experience. I started looking at ActiveMQ but don't know how well C# will hook into it. I want to hear what all of you think.
Edit: It is going to be a web application. Anything touching this FoxPro database is kind of slow (probably because the person who set it up 10 years ago messed it all to hell; some of the table files are incredibly large). We replicate the FoxPro data to SQL Server nightly, and most of our data reads are OK being a day old, so we are focusing on the writes. Plus, the writes affect a critical part of the user experience (purchasing), so we store them in SQL Server and then just send a message to have them put into FoxPro when possible. I wish we could just get rid of FoxPro; unfortunately, the company doesn't want to get rid of a very old piece of software they bought that depends on it.
ActiveMQ works well with C# using the Spring.NET integrations and NMS. A post with some links to get you started in that direction is here. Also consider using MSMQ (The System.Messaging namespace) or a .NET based asynchronous messaging solution, with some options here.
MSMQ (Microsoft Message Queuing) may be a great choice. It is part of the OS and present as an optional component (it can be installed via Add/Remove Programs / Windows Components), meaning it's free (as long as you've already paid for Windows, of course). MSMQ provides Win32/COM and System.Messaging APIs. The more modern Windows Communication Foundation (aka Indigo) queued channels also use MSMQ.
Note that MSMQ is not supported on Home SKUs of Windows (XP Home and Vista Home).
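A minimal send/receive sketch against the System.Messaging API (the queue name and message body are made up):

```csharp
using System.Messaging;  // reference the System.Messaging assembly

const string queuePath = @".\Private$\foxpro-writes";

// Create the local private queue on first use.
if (!MessageQueue.Exists(queuePath))
    MessageQueue.Create(queuePath);

using (var queue = new MessageQueue(queuePath))
{
    queue.Formatter = new XmlMessageFormatter(new[] { typeof(string) });

    // Sender: enqueue a write to be applied to FoxPro later.
    queue.Send("INSERT INTO orders ...", "purchase");

    // Receiver (e.g. a background service): blocks until a message arrives.
    Message msg = queue.Receive();
    string body = (string)msg.Body;
}
```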
It's worth mentioning that the ActiveMQ open-source project defines a C# API for messaging called NMS, which allows you to develop against a single C#/.NET API that can then use various messaging back ends (a minimal producer sketch follows the list), such as:
ActiveMQ
MSMQ
TIBCO's EMS
any STOMP provider
any JMS provider via StompConnect
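To give a feel for the API, here's a minimal NMS producer sketch, assuming the Apache.NMS and Apache.NMS.ActiveMQ packages and a broker at the URL shown:

```csharp
using Apache.NMS;
using Apache.NMS.ActiveMQ;

// One NMS API, pluggable providers: here ActiveMQ over TCP.
IConnectionFactory factory = new ConnectionFactory("tcp://localhost:61616");

using (IConnection connection = factory.CreateConnection())
using (ISession session = connection.CreateSession())
{
    IDestination queue = session.GetQueue("orders");
    using (IMessageProducer producer = session.CreateProducer(queue))
    {
        connection.Start();
        ITextMessage message = session.CreateTextMessage("hello from NMS");
        producer.Send(message);
    }
}
```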
You may want to look at MSMQ. It can be used by both .NET and VFP, but you'll need to rewrite your code to use it. Here's an article that tells you how to use MSMQ from VFP: https://learn.microsoft.com/en-us/previous-versions/visualstudio/foxpro/ms917361(v=msdn.10)
Sorry if this isn't what you are asking for...
Have you considered some sort of cache behind the scenes that acts a bit like the "bucket system" used with asynchronous sockets in C/C++ with Winsock? Basically, it works by accepting requests, sending an immediate response back to the web app, and, when it finally gets around to finding your record, updating it in the app via AJAX or any other technology of your choice. Since I'm not a C# programmer I can't provide a specific example. Hope this helps!
Does the Fox app use .CDX indexes? If so, you might be able to improve performance by adding indexes without needing to change any program code. If it uses .IDX indexes, though, the change would have to be done in the actual app.
As the problem is with writes, I would look more towards removing any unneeded indexes on the tables. As is common in RDBMSes, every index on a FoxPro table slows down a write operation, as the indexes need to be updated, and since you aren't reading directly from (or presumably directly querying) the table, you shouldn't need very many indexes. You might also want to look at any triggers or field rules on the tables, as they may be slowing down the write operation. Be sure your referential integrity is still preserved, though.