Image Pattern Matching in SQL Server - C#

I have a Metro app which takes a picture of a burning flame and sends it to Azure. I am storing the image directly in a SQL Server table and not in blob storage because the image is generally < 100 KB. The way I am implementing it: the image is inserted into the table, and after a successful insert a push notification is sent to the client with a set of instructions indicating the action to be taken for the flame.
Now, I am researching how I can implement pattern matching in the SQL Server table.
The table already has 10 images. My app takes a picture, inserts it into the table, compares it against the stored images to find the closest match, and based on that match specific instructions will be sent to the Metro app.
Is there any framework I can use to do this pattern matching in the cloud and carry out a specific task based on the result?
Can anybody please help me with any info in this regard?

While I can't recommend anything specific: whatever app you find that will install and run in either a Windows or Linux virtual machine (or in a Windows VM in a Cloud Service, if the install can be automated and quick) should be OK. Just make sure whatever library you use doesn't rely on a specific GPU (since Azure doesn't offer GPU support today).
I saw another answer recommending a CLR procedure to process the images. I really wouldn't recommend that, as you're now stressing the CPU of your SQL Server, and that's not something that can easily be scaled out to multiple servers. And if you choose to use Windows Azure SQL Database, you won't have CLR as an option. You're better off placing processing in, say, a Cloud Service worker role, where you can scale out to any number of instances, and you can then use an Azure Queue to instruct the workers to perform specific comparisons / processing.
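For the queue-driven approach, the worker loop can stay very small. Here is a minimal sketch, assuming the classic Microsoft.WindowsAzure.Storage SDK; the "image-compare" queue name and the CompareToStoredImages helper are illustrative placeholders, not a prescribed API:
using System;
using System.Threading;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Queue;

public class ComparisonWorker
{
    public void Run(string connectionString)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
        CloudQueue queue = account.CreateCloudQueueClient()
                                  .GetQueueReference("image-compare"); // hypothetical queue name
        queue.CreateIfNotExists();

        while (true)
        {
            CloudQueueMessage message = queue.GetMessage();
            if (message == null)
            {
                Thread.Sleep(TimeSpan.FromSeconds(5)); // queue empty, back off
                continue;
            }

            int imageId = int.Parse(message.AsString); // ID of the newly inserted image row
            CompareToStoredImages(imageId);            // your matching logic
            queue.DeleteMessage(message);              // delete only after successful processing
        }
    }

    private void CompareToStoredImages(int imageId)
    {
        // Load the image from SQL Server, score it against the stored images,
        // and push the matching instructions to the client.
    }
}
The web tier just enqueues the image ID after the insert, and you scale by adding worker instances.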

Related

Azure Mobile App Workflow

After using Azure Mobile Services a year ago, I decided to get back to mobile development, but Microsoft changed a lot in their offering and I'm actually struggling to set my project up.
My goal is to create a service with these features:
.NET backend preferred over the JavaScript one (I don't like callbacks :))
SSO (Facebook, Twitter, Google+, Windows Live)
SQL Database (I really need relations, and I already have a T-SQL schema)
Push Notifications (just to Windows and Android for now and with unlimited custom channels so that I can have one channel for each user and avoid dealing with notifications' logic)
Monthly scheduled jobs to update database from an external JSON API and to remove old entries
Mobile Client (with a shared Xamarin library to handle all the data-related stuff, plus UWP + Android support)
Web Client (I don't have a Mac so I can't build and publish the iOS version, so a web app may be needed as a temporary replacement)
What I did was to:
Opened the Azure Preview Portal
Clicked on New => Web + Mobile => Mobile App
Set the Resource Group with all the needed plans
Added a Data Connection to a newly created SQL Database
Added a Notification Hub with settings for GCM and WNS
Added Mobile Authentication with settings for Microsoft Account, Facebook, Twitter, and Google
Created the schema for my SQL Database
Before going on: I'm not sure this was the correct workflow, but the documentation is pretty confusing and the Get Started sections only discuss code, not how to properly set up the service and have it running, so I just did the same basic things I would've done with the old Mobile Services, plus dealing with the SQL Database instead of the NoSQL one.
Now comes the issue: I have no idea how to move forward, and even the Quickstart projects (both server and client) are not helpful (they're the old TodoItem sample working with the Mobile Service).
The first thing that I wanted to do was to create the Scheduled Job because I actually need to fill the database with the external data before moving forward.
The only thing close to what I need is the WebJob, but I can't schedule it yet, and it requires me to upload an exe file, while I'd like to be able to write my C# code directly on the server (and be able to remotely debug it).
An alternative may be to create a Compute Instance and write an endless loop doing what I need, but this would force me to manually deal with the SQL Database inside the Mobile App Service.
Another issue is related to the SQL Database. As I already wrote, the Quickstart seems to work with the NoSQL storage included in the old Mobile Service, meaning that I don't have a direct connection to my SQL Database, while I'd like to be able to do something like
App.MobileService.GetTable<MyTable>()
Plus, having 10 tables, I'd also like to have a way to map them automatically (like NetBeans does for Java EE projects).
So the question is: what's a good (or the best) workflow to get everything working as I need it or, at least, close to how I need it?
(I know that answers may be opinion-based, but they still may be useful since Microsoft's documentation is not complete.)
If you have an existing database, you could use Entity Framework Code First to Existing Database. That will generate the C# classes for you.
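For illustration, the generated classes come out looking roughly like this; the Project entity is a hypothetical table, and MS_TableConnectionString is the connection-string name a Mobile App data connection typically uses:
using System.Data.Entity;

public class Project
{
    public int Id { get; set; }
    public string Name { get; set; }
    public bool Complete { get; set; }
}

public class AppDbContext : DbContext
{
    // Points at the Data Connection created in the portal
    public AppDbContext() : base("name=MS_TableConnectionString") { }

    public DbSet<Project> Projects { get; set; }
}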
The database that you create when you add a Data Connection to your Mobile App is an Azure SQL Database by default--I'm not sure why you thought it was NoSQL?
Once you have done this, you can query your tables from the Azure Mobile Apps client SDK. For instance, in Xamarin, the quickstart project does queries as follows (see https://github.com/Azure/azure-mobile-services-quickstarts/blob/MobileApp/client/xamarin.android/ZUMOAPPNAME/ToDoActivity.cs#L126).
var list = await toDoTable.Where (item => item.Complete == false).ToListAsync ();
Finally, regarding your question on WebJobs, you can actually schedule it. See https://azure.microsoft.com/en-us/documentation/articles/web-sites-create-web-jobs/#CreateScheduledCRON for more information. Even though you Web Deploy your webjob target, you can still remote debug it. See this blog post for information: http://www.bursteg.com/remote-debugging-azure-webjobs-attach-a-debugger-from-server-explorer/

Scheduled, long-running user queries

I need some suggestions from the community on a requirement I have. Below is the requirement; I need suggestions on the approach.
Users from the client need to retrieve data from my source database (let's say a SQL database on my production server). The users access the data via an intermediary service layer (a WCF REST service). On another server (Info Server) I have a SQL database (Info DB) which holds all the queries that can be requested. Since in some cases my data is huge, I give the user the option to schedule the data retrieval and look at the data later. The schedule information per user is also stored in the Info DB. I also allow the user to retrieve data in real time if he wants.
In both cases I want to query data from the source (production DB), store it in file format (maybe CSV or Excel), and then, when the user wants the data, send it over to the client.
Since the queries are stored in the Info DB, I let the admin define a scheduled run time for every query. This enables the admin to arrange for long-running queries to run at night, when calls to the server are low. In case the user demands a query be run in real time, I would allow that.
As a solution architecture I have thought of this :
I will have a WCF REST service installed on the Info Server. This service will act as the calling point for users. When a user runs a query in real time, the service will get the results, save them to a file, and transfer it over. If the user schedules the query, the service will add an entry for the user/query in the Info database.
I will have a Windows Service on the Info Server. This Windows Service will periodically check the Info DB for scheduled query entries, and if a query falls within its scheduled time, it will run the query, save the data to a file location, and add the file path to the schedule entry. This lets me track which schedules are finished and where the data is available (file path).
Now here are my issues with this:
My data can be huge; will a WCF REST service be good enough to transfer large files over the wire? Can I transfer files over the wire, or do I need to transfer the data as JSON? What is the best approach?
If I use a Windows Service, is this a good approach, or is there a better alternative? The reason I am asking is that, as I understand it, the Windows Service will have to run continuously, because I need to figure out which entries are scheduled. This means that at a specific interval the Windows Service would check the Info database and see whether a schedule entry should be run or not. In the ideal scenario the Windows Service will run throughout the day and check the database periodically without much action, because preferably all schedules would be at night.
I have used an intermediary service approach because if I need to move to the cloud tomorrow, I can easily move this solution. Am I right in my assumption?
If I move to the cloud tomorrow, would I be able to encrypt the data transfer (maybe data encryption or file encryption)? I have no idea about data encryption/decryption.
I need your suggestions on this.
My data can be huge; will a WCF REST service be good enough to transfer large files over the wire? Can I transfer files over the wire, or do I need to transfer the data as JSON? What is the best approach?
When you say huge, how huge? Are we talking gigabytes, megabytes, or kilobytes? I regularly have 100 MB REST responses (you will probably have to tweak some things to increase your MaxReceivedMessageSize, but this should be enough to get you going). I would take their advice and use a streaming API, though, especially if you are talking several megs of content.
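As a rough idea of the tweaks involved, this sketch bumps the limits on a WCF WebHttpBinding in code (the numbers are assumptions; tune them to your payloads):
using System;
using System.ServiceModel;

public static class LargeTransferBinding
{
    public static WebHttpBinding Create()
    {
        return new WebHttpBinding
        {
            MaxReceivedMessageSize = 100L * 1024 * 1024, // allow ~100 MB responses
            TransferMode = TransferMode.Streamed,        // stream instead of buffering in memory
            SendTimeout = TimeSpan.FromMinutes(10)       // large transfers need more time
        };
    }
}
The same settings can go in web.config if you prefer configuration over code.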
If I use a Windows Service, is this a good approach, or is there a better alternative? The reason I am asking is that, as I understand it, the Windows Service will have to run continuously, because I need to figure out which entries are scheduled. This means that at a specific interval the Windows Service would check the Info database and see whether a schedule entry should be run or not. In the ideal scenario the Windows Service will run throughout the day and check the database periodically without much action, because preferably all schedules would be at night.
Beware writing your own scheduler. You might be better off dropping things onto a queue for processing, then just firing up workers at the appropriate time. That way you can invoke the worker directly for your real-time call. Plus you can run work whenever the database is idle, not on a fixed schedule. It's tricky "knowing" when a service will be idle, especially in a world of round-the-clock users.
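A minimal sketch of that shape, where QueryJob and QueryDispatcher are hypothetical names; the real-time path invokes the worker directly, while scheduled requests just land on a queue that workers drain when the database is quiet:
using System.Collections.Concurrent;

public class QueryJob
{
    public int QueryId { get; set; }  // which stored query to run
    public string UserId { get; set; }
}

public class QueryDispatcher
{
    private readonly ConcurrentQueue<QueryJob> _queue = new ConcurrentQueue<QueryJob>();

    public void Submit(QueryJob job, bool realTime)
    {
        if (realTime)
            Execute(job);          // run immediately for the waiting user
        else
            _queue.Enqueue(job);   // background workers drain this when idle
    }

    private void Execute(QueryJob job)
    {
        // Run the stored query, write the CSV/Excel file, record the file path.
    }
}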
I have used an intermediary service approach because if I need to move to the cloud tomorrow, I can easily move this solution. Am I right in my assumption?
Yes, wrapping an endpoint in a REST service (WCF) will make moving to the cloud much easier.
If I move to the cloud tomorrow, would I be able to encrypt the data transfer (maybe data encryption or file encryption)? I have no idea about data encryption/decryption.
HTTPS is your friend here. Don't invent your own encryption or use something proprietary. HTTPS is old, straightforward, and good.

Advice on options for shared database for distributed C# application

I'd like to know my options for the following scenario:
I have a C# WinForms application (developed in VS 2010) distributed to a number of offices within the country. The application communicates with a C# web service which lives on a main server at a separate location, and there is one database (SQL Server 2012) at a further location. (All servers run Windows Server 2008.)
Head Office (where we are) utilizes the same front-end to manage certain information in the database, which needs to be readily available to all offices in real time. At the same time, any data they change needs to be readily available to us at Head Office, as we have a real-time dashboard web application that monitors site-wide statistics.
Currently, the users are complaining about the speed at which the application operates. They say it is really slow. We work in a business-critical environment where every minute waiting may mean losing a client.
I have researched the following options, but I don't come from a DB background, so I'm not too sure what the best route for my scenario is.
Terminal Services/sessions (which I've just implemented at Head Office, and they say it's a great improvement, although there's a terrible lag, like remoting onto someone's desktop, which is not nice to work on).
Transactional replication (sounds quite plausible for my scenario, but it would require all offices to have their own SQL Server database on their individual servers, and they have a tendency to "fiddle" and break everything they're left in charge of! We wish we could take over all their servers, but they are franchises, so they have their own IT people on site).
I've currently got a whole lot of the look-up data being cached on start-up of the application, but this too takes 2-3 minutes to complete, which is just not acceptable!
Does anyone have any ideas?
With everything running through the web service, there is no need for additional SQL Servers to be deployed local to the clients. The WS wouldn't be able to communicate with those databases unless the WS was also deployed locally.
Before suggesting any specific improvements, you need to benchmark where your bottlenecks are occurring. What is the latency between the various clients and the web service, and then from the web service and the database? Does the database show any waiting? Once you know the worst case scenario, improve that, and then work your way down.
Some general thoughts, though:
Move the WS closer to the database
Cache the data at the web service level to save on DB calls
Find the expensive WS calls, and try to optimize the throughput
If the lookup data doesn't change all that often, use a local copy of SQL CE to cache that data, and use the MS Sync Framework to keep it synchronized to the SQL Server (see the sketch after this list)
Use SQL CE for everything on the client computer, and use a background process to sync between the client and WS
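Here is the SQL CE caching sketch referenced above; cache.sdf and the Lookup table are illustrative, and the data would be refreshed by the Sync Framework or a background job:
using System.Data.SqlServerCe;
using System.IO;

public static class LookupCache
{
    private const string DbFile = "cache.sdf";
    private const string ConnString = "Data Source=" + DbFile;

    public static void EnsureCreated()
    {
        if (File.Exists(DbFile)) return;

        using (var engine = new SqlCeEngine(ConnString))
            engine.CreateDatabase(); // one-time local database creation

        using (var conn = new SqlCeConnection(ConnString))
        using (var cmd = new SqlCeCommand(
            "CREATE TABLE Lookup (Id INT PRIMARY KEY, Name NVARCHAR(100))", conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}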
UPDATE
After your comment, two additional thoughts. If your web service payload(s) is/are large, you can try adding compression on the web service (if it hasn't already been implemented).
You can also update your client to make the WS calls asynchronously, either on a thread or, if you are using .NET 4.5, with async/await. This would at least let the client keep using the UI, but wouldn't necessarily fix any issues with data load times.
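For example, with .NET 4.5 the call can be as small as this sketch (the URL is a placeholder):
using System.Net.Http;
using System.Threading.Tasks;

public class LookupLoader
{
    public async Task<string> LoadAsync()
    {
        using (var http = new HttpClient())
        {
            // await frees the UI thread while the request is in flight
            return await http.GetStringAsync("http://server/api/lookups"); // placeholder URL
        }
    }
}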

Online database for C# Windows application

I'm going to develop a POS system for a medium-scale company,
and the requirement is to keep data current across all of their branches.
In my mind, moving the server from local to the web would solve this problem,
but I have never set up an online server for a Windows application.
May I know the best option for a secure database?
Can SQL Server handle this well?
I tried to Google it, but none of the results are what I want.
May I know what you would do when facing this problem?
My coding knowledge is just VB and C#,
plus SQL for databases.
I would like to learn something new if there is a better option.
I want the database to be inaccessible to anonymous users and stored securely at the back end only.
What you probably want to do is create a series of services exposed on the internet and accessed by your application. All database access would be mediated by these services. For security you would probably want to build them in WCF and expose them through IIS. Then your Windows application would just call these services for most of its processing.
If you design it properly, you could also have it work against a local database so that it can operate in a disconnected manner if, for example, your servers go down.
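A minimal sketch of such a mediating service, assuming WCF; IPosService and its operations are illustrative names, not a prescribed contract:
using System.ServiceModel;

[ServiceContract]
public interface IPosService
{
    [OperationContract]
    decimal GetPrice(string productCode);

    [OperationContract]
    void RecordSale(string branchId, string productCode, int quantity);
}

public class PosService : IPosService
{
    // Only this layer ever talks to the database, so the database itself
    // is never exposed to the internet.
    public decimal GetPrice(string productCode)
    {
        // Query the central database here.
        return 0m;
    }

    public void RecordSale(string branchId, string productCode, int quantity)
    {
        // Insert the sale into the central database here.
    }
}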
Typically you don't move the server off of the site premises.
The problem is that they will go completely down in the event your remote server is inaccessible. Things that can cause this are internet service interruption (pretty common) and remote server overload (common enough); basically anything that can stop the traffic between the store location and your remote server will bring them to their knees. The first time this happens they'll scream. The second time, they'll want your head due to the lost sales.
Instead, leave a SQL Server at each location. Set up a master SQL Server somewhere. Then set up a VPN connection between the stores and this central office. Finally, have the store SQL boxes do merge replication with the central office. Incidentally, don't use the built-in replication, but an off-the-shelf product which specializes in replicating SQL Server. The built-in one can be difficult to learn.
In the event their internet connection goes dark, the individual stores will still be able to function. It will also remain performant, as all of the desktop app traffic goes purely to the local SQL box.
Solving replication errors is much easier than dealing with a flaky ISP.
I would recommend you check out the Viravis Platform.
It is an application platform that can also be used just as an online database for any .NET client with the provided SDK. It has its own generic Windows and web clients, and some custom web solutions for specific applications.
You may use it as a complete solution or as a secure online database backend.

Implementing a desktop .NET application that can work offline

I need to create a desktop WPF application in .NET.
The application communicates with a web server, and can work in offline mode when the web server isn't available.
For example, the application needs to calculate how much time the user works on a project. The application connects to the server and gets a list of projects, the user selects one project, and presses a button to start a timer. The user can later stop the timer. The project start and stop times need to be sent to the server.
How to implement this functionality when the application is in offline mode?
Are there existing solutions or libraries that simplify this task?
Thanks in advance.
You'll need to do a couple of things differently in order to work offline.
First, you'll need to cache a list of projects. This way, the user doesn't have to go online to get the project list - you can pull it from your local cache when the user is offline.
Secondly, you'll need to save your timing results locally. Once you go online again, you can update the server with all of the historic timing data.
This just requires saving the information locally. You can choose to save it anywhere you wish, and even a simple XML file would suffice for the information you're saving, since it's simple - just a project + a timespan.
It sounds like this is a timing application for business tracking purposes, in which case you'll want to prevent the user from easily changing the data. Personally, I would probably save this in Isolated Storage, and potentially encrypt it.
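A rough sketch of that approach; TimeEntry and the file name are illustrative, and writing to Isolated Storage keeps the file out of the user's casual reach:
using System;
using System.Collections.Generic;
using System.IO.IsolatedStorage;
using System.Xml.Serialization;

public class TimeEntry
{
    public string Project { get; set; }
    public DateTime Start { get; set; }
    public DateTime Stop { get; set; }
}

public static class OfflineStore
{
    public static void Save(List<TimeEntry> entries)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForAssembly())
        using (var stream = store.CreateFile("pending-times.xml")) // hypothetical file name
        {
            new XmlSerializer(typeof(List<TimeEntry>)).Serialize(stream, entries);
        }
    }
}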
You can use SQL Server Compact for your local storage and then use the Microsoft Sync Framework to sync your local database to the server database. I recommend doing some research on the Microsoft Sync Framework.
Hello all, I implemented this application by creating my own offline framework, based on this article and the Microsoft Disconnected Service Agent (DSA). I've adapted this framework for my needs. Thank you all.
You can use a typed or untyped DataSet for offline storage.
When online (connected to the internet) you can download the data into a DataSet and upload it back to the database server. The DataSet can be loaded from and saved to a local file.
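A minimal sketch of that round trip (the file name is illustrative):
using System.Data;

public static class DataSetCache
{
    public static void Save(DataSet ds)
    {
        // WriteSchema keeps column types intact when the file is reloaded
        ds.WriteXml("offline-cache.xml", XmlWriteMode.WriteSchema);
    }

    public static DataSet Load()
    {
        var ds = new DataSet();
        ds.ReadXml("offline-cache.xml");
        return ds;
    }
}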
