Data import wizard library for .Net? - c#

Does anyone know of a 3rd party data import wizard that can be embedded into applications? It should import from Excel, Access, SQL Server, CSV, tab-separated flat files, XML, Oracle, etc. We have a fixed data structure within our application, and the user should be able to configure the wizard to match his/her import fields to our own data structure.
The wizard should be a library of sorts – preferably a .NET type library. We may want to have it both web-based and desktop-based (hence we may need an ASP.NET controls version and a WinForms version). We may also want integration with WPF and Silverlight.
If there's no UI wizard available, does anyone know of a non-UI library that supports easily configurable import from many different data sources?

A possible solution would be to use SQL Server Integration Services (SSIS).
The client can develop a custom package, and your system can run that package to import the data.
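If you go that route, running the package from your application is straightforward; here is a minimal sketch (the package path is just an example, and you need a reference to Microsoft.SqlServer.ManagedDTS):

```csharp
using Microsoft.SqlServer.Dts.Runtime;

class ImportRunner
{
    public static bool RunImportPackage(string packagePath)
    {
        // Load the .dtsx package the client has designed (path is just an example)
        var app = new Application();
        Package package = app.LoadPackage(packagePath, null);

        // Execute it and report success/failure back to the caller
        DTSExecResult result = package.Execute();
        return result == DTSExecResult.Success;
    }
}
```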

The primary problem here is that the requirements are nearly impossible to fulfill in a generic way that is still easy enough to use for an end user.
Any import tool would have to be programmed to know a great deal about your data structures, relationships, and business logic.
The actual act of performing the import or having a few screens to step through this is so minor in comparison (usually less than 5% of the work) that there's almost no point in building a "generic" tool for other coders to use.
Now, if you don't mind giving a lot of complexity to your end users AND allowing them to "forget" about your business logic and potentially screw up the imports, then by all means hand them something like SSIS.
However, if you need to control what goes in, validate it, and generally make sure it's not going to crater your system then you'll have to code this one yourself.

Related

Adding new types to compiled code (treating it like data)

This question is strictly about the inefficient architecture of an existing system that needs to be rebuilt. It solicits validation from fellow developers who have had experience with managing such awkward systems. I have tried to abstract it as best as I could below.
The application caters to a very complex need and it delivers very well. The problem is that internal plumbing makes code management and scalability a nightmare. The little information I can share about context includes the fact that we need to treat code as a data commodity. In other words, the system can only function if implemented classes are added to it on a continuing basis.
What the application delivers to end-users is not data, but an [Action] that requires a code execution context. So the application has to execute some code on the target system in order to deliver what the user expects. Now these expectations are not known at compile-time and new ones need to be added almost on a daily basis. That means, developers keep adding [Actions] to the system regularly.
The existing system links to these [Action] classes statically! Not only does that make code management a nightmare, it also requires a recompile every time an action is added.
My first instinct was to have the system dynamically link to assemblies at runtime where each assembly would contain a bunch of actions. This would be akin to adding extensibility capabilities to the application. I thought about the MEF framework but it just did not feel right.
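To illustrate, here is a minimal sketch of the runtime-loading idea I had in mind (the IAction interface and the plugin folder are hypothetical names, not part of the existing system):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Reflection;

public interface IAction
{
    void Execute();   // hypothetical contract each [Action] class would implement
}

public static class ActionLoader
{
    public static IEnumerable<IAction> LoadActions(string pluginDirectory)
    {
        foreach (string file in Directory.GetFiles(pluginDirectory, "*.dll"))
        {
            Assembly assembly = Assembly.LoadFrom(file);

            // Find every concrete type in the assembly that implements IAction
            var actionTypes = assembly.GetTypes()
                .Where(t => typeof(IAction).IsAssignableFrom(t) && !t.IsAbstract);

            foreach (Type type in actionTypes)
                yield return (IAction)Activator.CreateInstance(type);
        }
    }
}
```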
The only alternative I can think of is storing each action in the database as either source code or a compiled module. Each has its own trade-offs such as storing as source is less secure but gives me more control over code review and continued maintenance. Storing as compiled has the benefit of server-side assembly signing.
I would appreciate some advice about how to structure a system like this.
I don't think you need a more flexible architecture, but a more flexible software process. Adding new functionality on a daily basis is what most developers do; that isn't a valid argument for a plugin system.
You don't need a plugin architecture. You need a good software development methodology, such as an agile process (Scrum or XP), and you need to make sure you are able to do the following:
Let developers build new components in branches.
After thorough testing, merge new functionality to the main branch.
This way the main branch always has production quality and you can roll out new versions each day using continuous integration and continuous delivery.

C# creating multilingual application with plugin support (MEF)

I have read many articles and questions here on SO about this, however I am still not comfortable. I am planning to develop a plug-in based GUI desktop application based on MEF technology.
I would like to provide localization support for the application. The problem is that even if I localize the host application, the third-party plugins, which are basically DLL files and can be installed at any time, will also need to be localized.
I think having all localizable controls in a DLL is not an option for me. I can store the international texts in a database, have a caller function in the host application that retrieves text from the DB, and ask plugins to call this function. Or I can ask the developers to keep separate resource files in their applications, but that way they will not benefit from already-translated texts.
What is the best practice for providing a multilingual interface in this case?
I'm not sure that there is one best practice that applies, but I can talk you through the options as I see them. If you develop a central database with your different translations for everything, all of your tools can benefit from the translations. However, the downside is that now all of your plugins will also need to know about the database (in some way). That adds a more direct coupling that I prefer to avoid when using third party plugins.
If you use the resource files, you gain more flexibility but lose the ability to reuse the same text (which feels like you are violating DRY).
Personally, I would go down the resource file route for your localization. It provides the simplest way to get everyone working without major dependencies. However, if you can figure out a way to have the plugins call the central application for their localization text, the central database would be a better option (again, in my mind).
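As a rough illustration of that second option, the host could export a localization service through MEF and plugins could import it (ILocalizationService is a made-up name; the backing store could be your central database or resx files):

```csharp
using System.ComponentModel.Composition;

// Contract shared between the host and all plugins (hypothetical interface)
public interface ILocalizationService
{
    string GetText(string key);
}

// Host-side implementation; could read from a database or from resx files
[Export(typeof(ILocalizationService))]
public class DatabaseLocalizationService : ILocalizationService
{
    public string GetText(string key)
    {
        // Look the key up in the central translation store here
        return key; // placeholder so the sketch stands alone
    }
}

// Inside a plugin: MEF injects the host's service
public class SomePluginViewModel
{
    [Import]
    public ILocalizationService Localizer { get; set; }

    public string Title
    {
        get { return Localizer.GetText("SomePlugin.Title"); }
    }
}
```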
Here are a couple links that might help you out as well:
Is there a best-practice approach for internationalization of an application?
http://www.businessandprocess.com/2010/11/why-application-localization-should-start-in-the-design-stage/
http://expatsoftware.com/articles/2010/03/why-internationalization-is-hopelessly.html

Is it ok to roll your own localization layer in a .NET application?

Is it ok to roll your own localization framework? I would be ok using the default .NET localization behavior (i.e., putting text in resource files named a certain way in the assembly), except that we have localized images and text that need to be rendered in DirectX in addition to WinForms and WPF.
I could put form-specific strings in one place and other strings somewhere else, but I think that it makes more sense to keep everything in one place, not to mention it will help to avoid duplicates (for domain values like Yes/No, etc.). It's also possible we may be moving this tool to another platform in the future, so it would be nice to have all the localization information in one platform-agnostic area.
I realize this is a little subjective, but I'm looking for a best practice here...I've worked on projects that take both approaches. Any thoughts?
I have developed systems in which localisation is implemented via database-stored data and metadata. If your app is already making intense use of a fast database backend, you could create a database-backed localisation layer and use it to store localised information, including textual and non-textual data. It has worked great for us on a few occasions.
Edit. The details won't fit in here, but basically we mirrored the logic of the key/value resource manager that the Windows API or .NET use. We extended that by allowing resources to be grouped into groups, which can be nested arbitrarily. Resource names can be given as, for example, "ClientManagement.MainForm.StatusBar.ReadyMsg", meaning the ready message text to display on the status bar of the main form in the client management user interface.
On app startup, a locale setting is read from a config file and a resource manager is initialised with it; all subsequent calls to the resource manager use that locale setting until it is explicitly changed. We also built an administrative user interface that allowed us to edit the resources stored in the database, and even add new languages.
A final comment: data to be localised is not only labels and icons on screen. Option values in combo boxes, for example, also need to be localised.
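To make that a bit more concrete, here is a stripped-down sketch of what such a database-backed resource manager might look like (table and column names are only illustrative, not our actual schema):

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

// Minimal database-backed resource manager keyed by locale + hierarchical name
public class DbResourceManager
{
    private readonly Dictionary<string, string> _cache = new Dictionary<string, string>();

    public DbResourceManager(string connectionString, string locale)
    {
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(
            "SELECT ResourceName, ResourceValue FROM Resources WHERE Locale = @locale", connection))
        {
            command.Parameters.AddWithValue("@locale", locale);
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                    _cache[reader.GetString(0)] = reader.GetString(1);
            }
        }
    }

    // e.g. GetString("ClientManagement.MainForm.StatusBar.ReadyMsg")
    public string GetString(string resourceName)
    {
        string value;
        return _cache.TryGetValue(resourceName, out value) ? value : resourceName;
    }
}
```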
We implemented localization using a DB backend. We were able to create a great resource editor which allows "translator" end users to dynamically update translations (you cannot do that with a resx!). We were also able to support an approval process and group translations by module, such that an entire module could be approved for use in a language, or not.
We also decided to implement the localization provider for ASP.NET, which basically does 'automatic' localization with no code by the developer. This was actually the only difficult part of the project, as the interface is not well documented. It was hard to debug because it actually runs within the Visual Studio host process. We used a web service to decouple the implementation, which greatly simplified things. Another good thing is that the translations are automatically cached, so the DB is not working as hard. A bad thing is that if your translation service/back end is down and you have not precompiled your ASP.NET web site, then when a user launches a 'new' page the compiler might decide NOT to translate it. This behaviour remains (even after the translation service starts up again) until you force a recompile of the site.
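For anyone curious about the general shape of that ASP.NET provider, a skeleton might look like the following (in the real project the lookup goes through the web service mentioned above; the names here are illustrative):

```csharp
using System.Globalization;
using System.Resources;
using System.Web.Compilation;

// Registered in web.config via <globalization resourceProviderFactoryType="..."/>
public class DbResourceProviderFactory : ResourceProviderFactory
{
    public override IResourceProvider CreateGlobalResourceProvider(string classKey)
    {
        return new DbResourceProvider(classKey);
    }

    public override IResourceProvider CreateLocalResourceProvider(string virtualPath)
    {
        return new DbResourceProvider(virtualPath);
    }
}

public class DbResourceProvider : IResourceProvider
{
    private readonly string _scope;

    public DbResourceProvider(string scope)
    {
        _scope = scope;
    }

    public object GetObject(string resourceKey, CultureInfo culture)
    {
        // In the real implementation this calls the caching translation service;
        // here we just fall back to the key so the sketch stands alone.
        return resourceKey;
    }

    // Only needed for implicit localization; can return null if unused
    public IResourceReader ResourceReader
    {
        get { return null; }
    }
}
```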

How do I implement an auto update strategy for my in-house winform app

We have an in-house WinForms application that is used by about 20 users in my company. It's a real pain having to send the users a new MSI when the application has changed in scope, and I would like to have the application prompt users as to whether they would like to update their copy. My thought is that the source of the application would be on our company server and that the application would look to a database to see if updates are available. Aside from that, I don't know where to go from there. Has anyone done anything similar, or does anyone have any recommendations on how I should implement this?
Here's an open-source solution I wrote to address specific needs we had for WinForms and WPF apps. The general idea is to have the greatest flexibility, at the lowest overhead possible.
So integration is super easy, and the library does pretty much everything for you, including synchronizing operations. It is also highly flexible, and lets you determine what tasks to execute and under what conditions - you make the rules (or use some that are there already). Last but not least is the support for any update source (web, BitTorrent, etc.) and any feed format - whatever is not implemented you can just write yourself.
Cold updates (requiring an application restart) are also supported, and are done automatically unless "hot-swap" is specified for the task.
This all boils down to one DLL, less than 70 KB in size.
More details at http://www.code972.com/blog/2010/08/nappupdate-application-auto-update-framework-for-dotnet/
Code is at http://github.com/synhershko/NAppUpdate (Licensed under the Apache 2.0 license)
ClickOnce.
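If ClickOnce fits your deployment model, a basic in-app update check is only a few lines (a minimal sketch; it only does anything when the app is actually running as a ClickOnce deployment):

```csharp
using System.Deployment.Application;
using System.Windows.Forms;

public static class Updater
{
    public static void CheckForUpdate()
    {
        if (!ApplicationDeployment.IsNetworkDeployed)
            return; // running outside ClickOnce (e.g. from the IDE)

        ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;
        UpdateCheckInfo info = deployment.CheckForDetailedUpdate();

        if (info.UpdateAvailable &&
            MessageBox.Show("A new version is available. Update now?", "Update",
                MessageBoxButtons.YesNo) == DialogResult.Yes)
        {
            deployment.Update();
            Application.Restart();
        }
    }
}
```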
If it's a fairly simple program (not many dependencies), consider keeping the program on a network share and having users run it from there.
The most popular solutions with graphical update prompts are AutoUpdater.NET and WinSparkle. For a more powerful solution, take a look at Google Omaha.
Squirrel is definitely worth a look.

Create a document library with external persistence

I would like to create a custom document library where I use the standard UI but implement a different persistence layer. Basically fetch and display documents from a different source system. This way I can use my existing object model but leverage the great office integration within SharePoint.
I found a decent article here but they are cheating, they have coded a completely new UI for the external persistence.
I have looked at the SPList and SPDocumentLibrary objects but I can't override the necessary methods.
I have looked at the event framework and it is closer but it lacks important events such as 'GetFile' or 'PopulateList'.
Any thoughts?
This isn't a perfect (or probably even a "good") fit for what you're trying to do, but I mention it primarily for awareness and to possibly give you some additional ideas (and warnings).
SharePoint's storage architecture leverages two different back-end stores: one for metadata (always SharePoint's SQL databases), and another for BLOB storage (also SQL by default). In its current form, though, SharePoint allows you to "wire-in" your own BLOB storage provider via a type that implements the ISPExternalBinaryProvider interface. Wiring in a type that implements this interface allows you to continue storing metadata in SQL while storing documents and other BLOB item types in a different store of your choice.
This probably sounds somewhat promising, but there are a couple of serious considerations:
Wiring in your own ISPExternalBinaryProvider has a farm-wide impact. It's all or nothing, so once the provider is wired in, all sites and libraries will use the new provider.
You'll need to dive into unmanaged code, as the ISPExternalBinaryProvider is going to require you to work with some IDL.
You can read more here: http://msdn.microsoft.com/en-us/library/bb802976.aspx
My take is that the external BLOB storage (EBS) system is something of a "prototype" at this point -- not ready for prime-time. If nothing else, though, it gives you something to think about. SharePoint Server 2010 will hopefully do more with it and make it more attractive and easy to implement.
For what it's worth!
I have implemented SQL persistence in a Form Library by using a persistence Workflow that runs on creation and update of the library's documents.
I created an Office SharePoint 2007 Workflow project in Visual Studio 2008, retrieved my SPItem document content and extracted the relevant data from the XML generated by the InfoPath WebForm and persisted it to a database.
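Roughly, the workflow code looked something like the sketch below (simplified; the element names and connection string are placeholders):

```csharp
using System.Data.SqlClient;
using System.IO;
using System.Xml;
using Microsoft.SharePoint;

// Inside the workflow's code activity: read the InfoPath XML and persist it
private void PersistItem(SPListItem item, string connectionString)
{
    // The form library stores each document as InfoPath XML
    byte[] content = item.File.OpenBinary();
    var xml = new XmlDocument();
    using (var stream = new MemoryStream(content))
    {
        xml.Load(stream);
    }

    // Pull out the fields we care about (element name is a placeholder)
    string customerName = xml.GetElementsByTagName("CustomerName")[0].InnerText;

    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "INSERT INTO Documents (CustomerName) VALUES (@name)", connection))
    {
        command.Parameters.AddWithValue("@name", customerName);
        connection.Open();
        command.ExecuteNonQuery();
    }
}
```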
If you really want to roll your own external persistence, take a look at this brand new, extensive article from June on TechNet:
http://technet.microsoft.com/en-us/magazine/2009.06.insidesharepoint.aspx
Now, hand me the bounty. ;)
Sorry to say, but ISPExternalBinaryProvider is the only way to do this, I'm afraid, if you want to use the standard UI.
P.S. Another major drawback is that it is a backup/versioning nightmare. I'm not even sure whether versioning is supported.
Perhaps SharePoint 2010 will have a better way to do this...
