In ServiceStack I am using MiniProfiler configured to store profiles using SqlServerStorage. The profiles are recorded to the database in the 'MiniProfilers' table without issue. Is there a viewer that would render the data (especially the JSON) from the MiniProfilers table?
This sample shows how SqlServerStorage is initialized. The method is called from AppHost.cs in Configure():
private void EnableProfiling(string profilerConnection)
{
    using (var conn = new SqlConnection(profilerConnection))
    {
        conn.Open();
        var miniProfilersTableExists = conn.ExecuteScalar<int>("select case when exists((select * from information_schema.tables where table_name = 'MiniProfilers')) then 1 else 0 end");
        if (miniProfilersTableExists != 1)
            conn.Execute(SqlServerStorage.TableCreationScript);
    }
    Profiler.Settings.Storage = new SqlServerStorage(profilerConnection);
}
In your HTML page (just before </head>), add the following line:
@ServiceStack.MiniProfiler.Profiler.RenderIncludes().AsRaw()
In Application_Start (Global.asax), add the following lines:
Profiler.Settings.PopupRenderPosition = RenderPosition.Left;
Profiler.Settings.SqlFormatter = new SqlServerFormatter();
It will display a tiny tab at the top-left corner of your page, which you can click to reveal more information.
The Add method does not insert a new record into the database.
Working with the database through the application:
The SelectAll request is executed.
The Insert request is not executed, and there are no errors.
The Insert method code, which does not work:
public void Insert(Source source)
{
    using (var dbContext = new DbContextSQlite())
    {
        dbContext.Sources.Add(source);
        dbContext.SaveChanges();
    }
}
The SelectAll method code, which runs correctly:
public void SelectAll()
{
    using (var dbContext = new DbContextSQlite())
    {
        var rows = from x in dbContext.Sources
                   select x;
        int count = rows.Count();
    }
}
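As a diagnostic sketch (hedged: it assumes an EF 6-style `Database.Connection` API on the `DbContextSQlite` context from the question; `InsertWithDiagnostics` is a hypothetical helper), you can print the connection string and the row count SaveChanges reports, to confirm the insert actually reaches the file you inspect in DB Browser:

```csharp
public void InsertWithDiagnostics(Source source)
{
    using (var dbContext = new DbContextSQlite())
    {
        // Show which database file EF is actually writing to. If the .db file
        // is set to "Copy always"/"Copy if newer" in Visual Studio, each build
        // copies a fresh file into bin\Debug, so rows saved on a previous run
        // appear to vanish even though the insert succeeded.
        Console.WriteLine(dbContext.Database.Connection.ConnectionString);

        dbContext.Sources.Add(source);
        int written = dbContext.SaveChanges(); // number of rows affected
        Console.WriteLine("Rows written: " + written);
    }
}
```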
Working with the database via DB Browser using SQL queries:
The Select query executes.
The Insert query also executes.
Environment used:
DB Browser for SQLite 3.12.2;
Visual Studio Community 2019 16.11.10;
console application, .NET Framework 4.7.2.
Update-1
When I try to post the full question, stackoverflow.com gives me complaints about the formatting, and I don't understand how to resolve them.
That's why I'm posting the question in an online editor.
Follow the link -> Detailed question.
I've been trying to piece together how other users have finished their projects, but my understanding is still limited.
I want to take any given XML source, create a Data Flow Task, and pass its data to an OLE DB destination matching the table name of the XML file. Using the visual designer means I cannot build dynamic data flow tasks, because the metadata does not refresh.
I have created a script that creates a package, but when I open the package in Visual Studio, it has a red X saying that there cannot be zero input columns. When I drill down into the mappings of the OLE DB destination and click OK, it corrects the problem for me. I cannot figure out how to do that programmatically.
I've seen others solve it by using foreach loops and going through the Input columns, but I cannot seem to figure it out.
I also have a separate script in which I tried to mimic several people's scripts, and it has different issues. Not sure how to post it as an attachment.
Thank you in advance for the help :)
EDIT
I've been getting positive feedback for trying out BIML, and I will...but I want to know if in the short term anyone can help me figure out why this doesn't fill in ExternalMetaDataColumnId for my input. I've posted my updated code below with foreach loops that aren't doing what I expect them to.
Thank you
#region
using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using Microsoft.SqlServer.Dts.Runtime;
using Microsoft.SqlServer.Dts.Pipeline.Wrapper;
using System.Xml;
#endregion
namespace ConsoleApplication3
{
class Program
{
static void Main(string[] args)
{
    #region Initial Setup
    Application a = new Application();
    Package p = new Package();
    TaskHost t = p.Executables.Add("DTS.Pipeline") as TaskHost;
    t.Name = "DataFlow Task";
    t.Description = "Flat File to Database";
    MainPipe mp = t.InnerObject as MainPipe;
    #endregion

    #region Flat File Source in Dataflow Task
    IDTSComponentMetaData100 md = mp.ComponentMetaDataCollection.New();
    md.ComponentClassID = "Microsoft.XmlSourceAdapter";
    md.Name = "XML Source";
    CManagedComponentWrapper wrp = md.Instantiate();
    wrp.ProvideComponentProperties();
    #endregion

    #region Add connection manager to OLE DB
    ConnectionManager conn = p.Connections.Add("OLEDB");
    conn.Name = "westcoastuserDBO";
    conn.ConnectionString = "Data Source=SERVER;Initial Catalog=DBO;Provider=SQLNCLI11.1;Integrated Security=SSPI;Auto Translate=False;";
    #endregion

    #region XML Source Properties
    wrp.SetComponentProperty("XMLData", @"C:\Users\file.xml");
    wrp.SetComponentProperty("XMLSchemaDefinition", @"C:\Users\file.xsd");
    wrp.SetComponentProperty("AccessMode", 0);
    wrp.SetComponentProperty("UseInlineSchema", false);
    //below does not work
    //wrp.SetComponentProperty("XMLIntegerMapping", 0).TypeConverter = "Microsoft.SqlServer.Dts.Pipeline.XmlSourceAdapter + XMLIntegerMappingConverter";
    wrp.ReinitializeMetaData();
    wrp.ReleaseConnections();

    IDTSComponentMetaData100 md2 = mp.ComponentMetaDataCollection.New();
    md2.ComponentClassID = "Microsoft.OLEDBDestination";
    CManagedComponentWrapper wrp2 = md2.Instantiate();
    wrp2.ProvideComponentProperties();
    md2.Name = "OLE DB Connection";
    md2.UsesDispositions = true;
    md2.Version = 4;
    wrp2.SetComponentProperty("OpenRowset", "dbo.authorizations");
    #endregion
    IDTSPath100 path = mp.PathCollection.New();
    path.AttachPathAndPropagateNotifications(md.OutputCollection[0], md2.InputCollection[0]);

    IDTSInput100 input = md2.InputCollection[0];
    IDTSVirtualInput100 vInput = input.GetVirtualInput();
    //below taken from https://stackoverflow.com/questions/12587709/c-sharp-ssis-data-flow-component-creating-custom-input-columns
    IDTSExternalMetadataColumnCollection100 externalColumnCollection = input.ExternalMetadataColumnCollection;

    // Iterate through the virtual input column collection.
    foreach (IDTSVirtualInputColumn100 vColumn in vInput.VirtualInputColumnCollection)
    {
        // Call the SetUsageType method of the destination
        // to add each available virtual input column as an input column.
        wrp2.SetUsageType(input.ID, vInput, vColumn.LineageID, DTSUsageType.UT_READONLY);
    }

    // Get the destination's default output collection
    IDTSOutputCollection100 outColl = md2.OutputCollection;

    // Iterate through the outputs in the default output collection
    foreach (IDTSOutput100 output in outColl)
    {
        // Iterate through the default output columns in the output
        int count = output.OutputColumnCollection.Count;
        foreach (IDTSOutputColumn100 outputColumn in output.OutputColumnCollection)
        {
            // Get the output's external metadata column collection
            IDTSExternalMetadataColumnCollection100 extMetadataColumnColl = output.ExternalMetadataColumnCollection;
            // Iterate through the external metadata columns
            foreach (IDTSExternalMetadataColumn100 extMetadataColumn in extMetadataColumnColl)
            {
                // Call the MapOutputColumn method of the destination to map
                // each available output column to an external metadata column
                wrp2.MapOutputColumn(output.ID, outputColumn.ID, extMetadataColumn.ID, true);
            }
        }
    }

    md2.RuntimeConnectionCollection[0].ConnectionManager = DtsConvert.GetExtendedInterface(conn);
    md2.RuntimeConnectionCollection[0].ConnectionManagerID = conn.ID;
    conn.AcquireConnection(null);

    #region Save Package to FileSystem
    string packageXml = @"C:\Users\test.dtsx";
    XmlDocument myPkgDocument = new XmlDocument();
    p.SaveToXML(ref myPkgDocument, null, null);
    a.SaveToXml(packageXml, p, null);
    #endregion
}
I think the problem is that you are not mapping the input columns to the OLE DB destination. After opening the package, if you click on the OLE DB destination and go to the Mappings section, it automatically maps the columns based on their names. The foreach loops used by others loop over the columns and map them to the related destination columns.
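As a sketch of that missing mapping step (hedged: this assumes the destination table's column names match the input column names, and reuses the `input` and `wrp2` objects from the code above; `MapInputColumn` is the design-time call that populates ExternalMetadataColumnId for an input column):

```csharp
// After SetUsageType has added the input columns, pair each input column
// with the external metadata column (i.e. the table column) of the same
// name; this is what fills in ExternalMetadataColumnId.
foreach (IDTSInputColumn100 inputColumn in input.InputColumnCollection)
{
    foreach (IDTSExternalMetadataColumn100 extColumn in input.ExternalMetadataColumnCollection)
    {
        if (string.Equals(extColumn.Name, inputColumn.Name, StringComparison.OrdinalIgnoreCase))
        {
            wrp2.MapInputColumn(input.ID, inputColumn.ID, extColumn.ID);
            break;
        }
    }
}
```

This block would replace the output-column loop in the posted code: an OLE DB destination has no output columns to map, so the `MapOutputColumn` loops there never execute.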
There are many articles talking about creating SSIS package dynamically, you can refer to them for more information:
Dynamic Data Flow in SSIS using .NET/C#
Programmatically map the columns of a flat file destination?
Building Packages Programmatically
Samples for creating SSIS packages programmatically
Generating SSIS Packages Programmatically (Part I)
I'll start by asking: am I right in thinking that, in the image below, 'TABLE=CLOASEUCDBA.T_BASIC_POLICY' is not part of the connection string but is in fact the source table name?
I'm looking to alter this to another linked table in the same database. The connection string should therefore stay the same, and the name that appears in Access should stay the same. The only difference is that under the hood it actually references another table, and of course if you open the table it will contain different fields and data.
My code so far to do this is:
var dbe = new DBEngine();
Database db = dbe.OpenDatabase(@"C:\Users\xxxx\Documents\Test.accdb");

foreach (TableDef tbd in db.TableDefs)
{
    if (tbd.Name.Contains("CLOASEUCDBA_T_BASIC_POLICY"))
    {
        tbd.SourceTableName = "CLOASEUCDBA_T_BILLING_INFORMATION";
    }
}
db.Close();
However, I'm getting a big fat COMException: "Cannot set this property once the object is part of a collection." I'm not sure exactly why, and all the examples I can find online are written in VB/VBA, to which I have only very limited exposure. Any help is appreciated.
EDIT:
I have tried to go a different route, with no further success, using the code:
if (tbd.Name.Contains("CLOASEUCDBA_T_BASIC_POLICY"))
{
    var newtable = db.CreateTableDef("this is a new table");
    newtable.Name = "new table";
    newtable.Connect = tbd.Connect;
    newtable.SourceTableName = "CLOASEUCDBA_T_BILLING_INFORMATION";
    db.TableDefs.Append(newtable);
    //tbd.SourceTableName = "CLOASEUCDBA_T_BILLING_INFORMATION";
}
In this case I get the error "ODBC--call failed."
Since we're not allowed to change the SourceTableName of a TableDef object that already exists in the TableDefs collection, we need to create a new TableDef object, .Delete the old one, and then .Append the new one:
// This code requires the following COM reference in your project:
//
// Microsoft Office 14.0 Access Database Engine Object Library
//
// and the declaration
//
// using Microsoft.Office.Interop.Access.Dao;
//
// at the top of the class file
string tableDefName = "CLOASEUCDBA_T_BASIC_POLICY";
var dbe = new DBEngine();
Database db = dbe.OpenDatabase(@"C:\Users\xxxx\Documents\Test.accdb");
var tbdOld = db.TableDefs[tableDefName];
var tbdNew = db.CreateTableDef(tableDefName);
tbdNew.Connect = tbdOld.Connect;
tbdNew.SourceTableName = "CLOASEUCDBA_T_BILLING_INFORMATION";
db.TableDefs.Delete(tableDefName); // remove the old TableDef ...
db.TableDefs.Append(tbdNew); // ... and append the new one
db.Close();
I am trying to save data to my database. What I want is that when the button is pressed, data is saved to the database permanently. I've done tests where data is saved while the application is running, and the saved data is viewable. But when I terminate the application, the data is not present when I view that table in Visual Studio. I've provided the code that I am using for testing.
private void btn_otrFun_Click(object sender, EventArgs e)
{
    tblCheese cheese = new tblCheese();
    string cheesename = "TestCheese";
    cheese.CheeseName = cheesename;
    cheese.CheeseGroup = 1;
    cheeseEntity.tblCheese.AddObject(cheese);
    cheeseEntity.SaveChanges();
}
Here is where I am getting the context from; it is instantiated at the beginning of the form.
private CheeseWorld_DatabaseEntities cheeseEntity = new CheeseWorld_DatabaseEntities(); // instantiate new database entities
And further, I am using this snippet to retrieve data from the database for dynamically created buttons.
var cheeselist = cheeseEntity.ExecuteStoreQuery<tblCheese>("Select * FROM tblCheese WHERE cheeseGroup = 1", null).ToList();
Hope these further details help. If more are required, let me know.
You've departed from the normal pattern we usually use for this... maybe try putting it back to something like this (I don't see where you are getting the context):
using (var context = new CheeseWorld_DatabaseEntities())
{
    tblCheese cheese = new tblCheese();
    cheese.CheeseName = "TestCheese";
    cheese.CheeseGroup = 1;
    context.tblCheese.Add(cheese);
    context.SaveChanges();
}
This is covered in the documentation: http://msdn.microsoft.com/en-us/data/jj593489
(Pay attention to the bottom where it shows how to trace the generated SQL)
NOTE: I am using Add instead of AddObject.
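To see whether an INSERT is actually sent, here is a minimal sketch of the SQL-tracing tip from that page (hedged: it assumes an EF 6 DbContext-style context, where the Database.Log hook is available):

```csharp
using (var context = new CheeseWorld_DatabaseEntities())
{
    // Route the SQL that EF generates to the console, so you can confirm
    // that SaveChanges really issues an INSERT statement.
    context.Database.Log = Console.Write;

    tblCheese cheese = new tblCheese { CheeseName = "TestCheese", CheeseGroup = 1 };
    context.tblCheese.Add(cheese);
    context.SaveChanges();
}
```

If the INSERT does appear in the log but rows still seem to vanish, check whether the .mdf file in your project is set to "Copy always": each build then overwrites the copy in bin\Debug that the application writes to, while Visual Studio's table viewer shows the untouched project copy.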
I'm working on a webshop-like asp.net mvc 4 website with a wcf-service datalayer. My application is build with maincategories, subcategories and products. Each product can only be in one subcategory and my url's are like this:
/maincategoryname/subcategoryname/{productid}/producttitle
And the corresponding breadcrumb trail:
Home > Maincategory > Subcategory > Producttitle
I'm currently using MvcSitemapProvider to generate my navigation menus and breadcrumbs. I'm loading all the URLs as dynamic nodes without cache. This solution works for a couple of products, but when I add 1000 products the sitemap takes 6.5 seconds to populate, which is way too long.
I turned on caching in MvcSitemapProvider. This way the application loads much faster. But when a user adds a new product and navigates to its page, the URL is not yet in the sitemap file because of the cache, so my navigation and breadcrumbs are not generated.
My question is:
Is it possible to add a new node to the sitemap at runtime after a user adds a new product?
The accepted answer is now a little out of date. In MvcSiteMapProvider v4, there is no longer a GetCacheDescription() method in a DynamicNodeProvider. This didn't seem to work anyway.
You can now invalidate the cache manually by using the [SiteMapCacheRelease] attribute on the action methods that update the data:
[MvcSiteMapProvider.Web.Mvc.Filters.SiteMapCacheRelease]
[HttpPost]
public ActionResult Edit(int id)
{
// Update the record
return View();
}
Or by calling a static method:
MvcSiteMapProvider.SiteMaps.ReleaseSiteMap();
You also have the option now to extend the framework to supply your own cache dependencies.
MvcSiteMapProvider allows for dynamic sitemaps that solve for cache dependencies.
You can enable this by creating a class which implements IDynamicNodeProvider.
Below is an example that generates dynamic nodes based on a database query, and also sets up a cache dependency on that same query.
public class ProductNodesProvider : IDynamicNodeProvider
{
    static readonly string AllProductsQuery =
        "SELECT Id, Title, Category FROM dbo.Product;";
    string connectionString =
        ConfigurationManager.ConnectionStrings["db"].ConnectionString;

    /// Create DynamicNodes out of all Products in our database
    public System.Collections.Generic.IEnumerable<DynamicNode> GetDynamicNodeCollection()
    {
        var returnValue = new List<DynamicNode>();
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            SqlCommand command = new SqlCommand(AllProductsQuery, connection);
            connection.Open();
            SqlDataReader reader = command.ExecuteReader();
            try
            {
                while (reader.Read())
                {
                    DynamicNode node = new DynamicNode();
                    node.Title = reader[1].ToString();
                    node.ParentKey = "Category_" + reader[2];
                    node.RouteValues.Add("productid", reader[0]);
                    returnValue.Add(node);
                }
            }
            finally
            {
                reader.Close();
            }
        }
        return returnValue;
    }

    /// Create a CacheDependency on SQL
    public CacheDescription GetCacheDescription()
    {
        using (SqlConnection connection = new SqlConnection(connectionString))
        {
            SqlCommand command = new SqlCommand(AllProductsQuery, connection);
            SqlCacheDependency dependency = new SqlCacheDependency(command);
            return new CacheDescription("ProductNodesProvider")
            {
                Dependencies = dependency
            };
        }
    }
}
While this is all very nifty, and should invalidate the cache when your customers change products in the database, the whole SqlCacheDependency can be tricky and depends on the SQL Server version.
You may go with a custom CacheDependency instead, if you're using the cache to store your products.
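As a sketch of that alternative (hedged: `ProductCacheDependency` and the manual invalidation trigger are assumptions for illustration, not part of MvcSiteMapProvider; the base-class calls are the standard System.Web.Caching.CacheDependency hooks):

```csharp
using System;
using System.Web.Caching;

// A minimal custom CacheDependency: a cache entry built with this dependency
// is evicted whenever Invalidate() is called, e.g. from the action method
// that adds or edits a product.
public class ProductCacheDependency : CacheDependency
{
    public ProductCacheDependency()
    {
        // No file or SQL resource to watch; invalidation is triggered manually.
        FinishInit();
    }

    public void Invalidate()
    {
        // Signals ASP.NET that the dependency changed, which removes the
        // associated cache entry (here, the cached sitemap nodes).
        NotifyDependencyChanged(this, EventArgs.Empty);
    }
}
```

This avoids the SQL Server version constraints of SqlCacheDependency, at the cost of having to call Invalidate() yourself from every code path that changes the product data.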