My problem is really simple.
I have a .bak file that contains one or more backup sets.
When I'm using SMO to restore the database with this .bak file, it only takes the first backup set to do its work. It seems to ignore the remaining sets.
Why is that?
See my code:
// Sets the restore configuration
Restore restore = new Restore()
{
    Action = RestoreActionType.Database,
    Database = _databaseToRestore.DatabaseName,
    ReplaceDatabase = true
};
restore.Devices.Add(new BackupDeviceItem(_backupFilePath, DeviceType.File));
Server server = new Server(_databaseToRestore.ServerName);
DataTable fileList = restore.ReadFileList(server);
string serverDataFolder = server.Settings.DefaultFile;
if (string.IsNullOrEmpty(serverDataFolder))
    serverDataFolder = server.Information.MasterDBPath;

foreach (DataRow file in fileList.Rows)
{
    restore.RelocateFiles.Add(
        new RelocateFile((string)file["LogicalName"],
            Path.Combine(serverDataFolder, _databaseToRestore.DatabaseName + Path.GetExtension((string)file["PhysicalName"]))));
}

// Get exclusive access to the database
server.KillAllProcesses(_databaseToRestore.DatabaseName);
restore.Wait();
restore.SqlRestore(server);
I thought the BackupDeviceItem would give me feedback on how many backup sets it contains, so that I could warn the user, but it doesn't.
Does anyone have a clue about this?
Thanks for your time.
OK, I fixed my problem.
The important property is FileNumber on the Restore object.
Its default value is 1, which is why it always took the first backup set.
I just had to set this property to the number of backup sets in the file, and now it restores the most recent backup.
Note: no differential backups are involved here.
I just found out that I can easily tell how many backup sets the file contains:
DataTable backupSets = restore.ReadBackupHeader(server);
Now a simple backupSets.Rows.Count lets me warn the user.
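Putting the two findings together, a minimal sketch (FileNumber and ReadBackupHeader are SMO members; the rest of the setup is the same as in the question):

// Count the backup sets in the .bak file and restore the most recent one.
DataTable backupSets = restore.ReadBackupHeader(server);
int backupSetCount = backupSets.Rows.Count;

// FileNumber defaults to 1, which is why only the first backup set was being restored.
restore.FileNumber = backupSetCount;

restore.SqlRestore(server);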
I am new to C# and I have a problem connecting to a Firebird database. I want my program to access a Firebird database (an FDB file). See the code below:
File.Copy(pathway, new_pathway, true);
FbConnection addDetailsConnection = new FbConnection("User=sysdba;Password=masterkey;Dialect=3;Database=" + new_pathway + ";DataSource=localhost;");
string SQLCOMMAND = " SELECT UOM FROM ST_ITEM_UOM WHERE CODE = 'ANT'";
addDetailsConnection.Open();
FbCommand readCommand = new FbCommand(SQLCOMMAND, addDetailsConnection);
FbDataReader myreader = readCommand.ExecuteReader();
while (myreader.Read())
{
    MessageBox.Show(myreader[0].ToString());
}
myreader.Close();
readCommand.Dispose();
addDetailsConnection.Close();
addDetailsConnection.Dispose();
This code lets me read my FDB file and extract the data. The first time the code executes there is no error or problem; however, when I execute it again, this error is shown:
The process cannot access the file 'C:\Users\ACC-0001.FDB' because it is being used by another process.
You can use Handle to check which program is locking the file. It might be caused by your code or by another process running on your machine.
The tool identifies the process, for example:
C:\>handle.exe c:\test.xlsx
Handle v3.46 Copyright (C) 1997-2011 Mark Russinovich Sysinternals -
www.sysinternals.com
EXCEL.EXE pid: 3596 type: File 414: C:\test.xlsx
As found here.
If the problem lies within your code, make sure you dispose and close all connections, preferably by wrapping them in using blocks:
using (FbConnection addDetailsConnection = new FbConnection("..."))
{
// do work
}
More details on the using statement can be found here.
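Applied to the code from the question, that could look roughly like this (same connection string, query, and table as above; only the structure changes):

// Sketch: every ADO.NET object lives in a using block, so the connection,
// command and reader are closed and disposed even if an exception is thrown.
using (FbConnection addDetailsConnection = new FbConnection(
    "User=sysdba;Password=masterkey;Dialect=3;Database=" + new_pathway + ";DataSource=localhost;"))
{
    addDetailsConnection.Open();
    using (FbCommand readCommand = new FbCommand(
        "SELECT UOM FROM ST_ITEM_UOM WHERE CODE = 'ANT'", addDetailsConnection))
    using (FbDataReader myreader = readCommand.ExecuteReader())
    {
        while (myreader.Read())
        {
            MessageBox.Show(myreader[0].ToString());
        }
    }
}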
You might have bumped into this Firebird issue: FB server reports that DB file is used by another application on secondary attachment attempt through a symlink
It only happens on Windows and only when two non-embedded connections use different path names of which one or both have a symlink in their path so they effectively point to the same location.
Both handle.exe and Process Explorer will only show the canonical (final) filename that fbserver.exe actually opens.
The only way to find out is to:
compare the connection strings, and
verify with handle.exe or Process Explorer that the files are indeed opened by fbserver.exe (and not by your process itself using an embedded connection).
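For the first check, a rough sketch of comparing the database paths from two connection strings (FbConnectionStringBuilder is part of the Firebird ADO.NET provider; connectionStringA and connectionStringB are placeholders, and Path.GetFullPath only normalizes relative paths and casing, it does not resolve symlinks):

// Extract and normalize the database paths from both connection strings.
string pathA = Path.GetFullPath(new FbConnectionStringBuilder(connectionStringA).Database);
string pathB = Path.GetFullPath(new FbConnectionStringBuilder(connectionStringB).Database);

// If these differ but both connections end up locking the same file,
// a symlink somewhere in one of the paths is the likely culprit.
bool samePath = string.Equals(pathA, pathB, StringComparison.OrdinalIgnoreCase);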
Introduction to the task at hand (can be skipped if impatient)
The company I work for is not a software company, but focuses on mechanical and thermodynamic engineering problems.
To help solve their system design challenges, they have developed software for calculating the system impact of replacing individual components.
The software is quite old, written in FORTRAN, and has evolved over a period of 30 years, which means that we cannot quickly rewrite or update it.
As you may imagine, the way this software is installed has also evolved, but significantly more slowly than the rest of the system, meaning that packaging is done by a batch script that gathers files from different places and puts them in a folder, which is then compiled into an ISO, burned to a CD, and shipped by mail.
You young programmers (I am 30) may expect a program to load DLLs but otherwise be fairly self-contained after linking, even if the code is made up of several classes from different namespaces, etc.
In FORTRAN 70, however, not so much. This means that the software itself consists of an alarming number of calls to prebuilt modules (read: separate programs).
We need to be able to distribute via the internet, as any other modern company has been able to for a while. To do this we could just make the *.iso downloadable, right?
Well, unfortunately not: the ISO contains several files which are user-specific.
As you may imagine, with thousands of users that would mean thousands of nearly identical ISOs.
Also, we want to convert the old FORTRAN-based installation software into a real installation package, and all our other (more modern) programs are C# programs packaged as MSIs.
But the build time for a single MSI with this old software on our server is close to 10 seconds, so it is simply not an option to build the MSI when the user requests it (if multiple users request at the same time, the server won't be able to complete before the requests time out).
Nor can we prebuild the user-specific MSIs and cache them, as we would run out of memory on the server (about 15 GB in total per released version).
Task description (tl;dr)
Here is what I thought I would do (inspired by comments from Christopher Painter):
Create a base MSI with dummy files in place of the user-specific files.
Create a cab file for each user with the user-specific files.
At request time, inject the user-specific cab file into a temporary copy of the base MSI using the "_Streams" table.
Insert a row into the Media table with a new 'DiskId', a 'LastSequence' covering the extra files, and the name of the injected cab file.
Update the File table with the name of each user-specific file in the new cab file, a new Sequence number (within the new cab file's sequence range), and the file size.
Question
My code fails to do the task just described. I can read from the MSI just fine, but the cabinet file is never inserted.
Also:
If I open the MSI in DIRECT mode, it corrupts the Media table, and if I open it in TRANSACTION mode, it fails to change anything at all.
In direct mode, the existing row in the Media table is replaced with:
DiskId: 1
LastSequence: -2145157118
Cabinet: "Name of action to invoke, either in the engine or the handler DLL."
What am I doing wrong?
Below I have provided the snippets involved with injecting the new cab file.
snippet 1
public string createCabinetFileForMSI(string workdir, List<string> filesToArchive)
{
    // Create the temporary cabinet file at this path:
    string GUID = Guid.NewGuid().ToString();
    string cabFile = GUID + ".cab";
    string cabFilePath = Path.Combine(workdir, cabFile);

    // Create an instance of Microsoft.Deployment.Compression.Cab.CabInfo,
    // which provides file-based operations on the cabinet file.
    CabInfo cab = new CabInfo(cabFilePath);

    // Create a list of files and add them to the cab file.
    // Now an argument, but previously this was used as a test:
    // List<string> filesToArchive = new List<string>() { @"C:\file1", @"C:\file2" };
    cab.PackFiles(workdir, filesToArchive, filesToArchive);

    // We will need the path of this file when adding it to an MSI.
    return cabFile;
}
snippet 2
public int insertCabFileAsNewMediaInMSI(string cabFilePath, string pathToMSIFile, int numberOfFilesInCabinet = -1)
{
    // Open the MSI package for editing.
    // Have also tried Transact mode; in Direct mode the database was corrupted when writing.
    pkg = new InstallPackage(pathToMSIFile, DatabaseOpenMode.Direct);
    return insertCabFileAsNewMediaInMSI(cabFilePath, numberOfFilesInCabinet);
}
snippet 3
public int insertCabFileAsNewMediaInMSI(string cabFilePath, int numberOfFilesInCabinet = -1)
{
    if (pkg == null)
    {
        throw new Exception("Cannot insert cabinet file into non-existing MSI package. Please supply a path to the MSI package.");
    }

    int numberOfFilesToAdd = numberOfFilesInCabinet;
    if (numberOfFilesInCabinet < 0)
    {
        CabInfo cab = new CabInfo(cabFilePath);
        numberOfFilesToAdd = cab.GetFiles().Count;
    }

    // Create a cab file record as a stream (embeddable into an MSI).
    Record cabRec = new Record(1);
    cabRec.SetStream(1, cabFilePath);

    /* The Media table describes the set of disks that make up the source media for the installation.
       We want to add one after all the others.
       DiskId - Determines the sort order for the table. This number must be equal to or greater than 1;
       for our new cab file, it must be greater than the existing ones.
    */
    // The baby SQL service in the MSI does not support "ORDER BY `` DESC", but does support ORDER BY.
    IList<int> mediaIDs = pkg.ExecuteIntegerQuery("SELECT `DiskId` FROM `Media` ORDER BY `DiskId`");
    int lastIndex = mediaIDs.Count - 1;
    int DiskId = mediaIDs.ElementAt(lastIndex) + 1;

    // The WiX naming convention for embedded cab files is "#cab" + DiskId + ".cab".
    string mediaCabinet = "cab" + DiskId.ToString() + ".cab";

    // The _Streams table lists embedded OLE data streams.
    // It is a temporary table, created only when referenced by a SQL statement.
    string query = "INSERT INTO `_Streams` (`Name`, `Data`) VALUES ('" + mediaCabinet + "', ?)";
    pkg.Execute(query, cabRec);
    Console.WriteLine(query);

    /* LastSequence - File sequence number for the last file on this new media.
       The numbers in the LastSequence column specify which of the files in the File table
       are found on a particular source disk.
       Each source disk contains all files with sequence numbers (as shown in the Sequence column of the File table)
       less than or equal to the value in the LastSequence column, and greater than the LastSequence value of the previous disk
       (or greater than 0, for the first entry in the Media table).
       This number must be non-negative; the maximum limit is 32767 files.
       /MSDN
    */
    IList<int> sequences = pkg.ExecuteIntegerQuery("SELECT `LastSequence` FROM `Media` ORDER BY `LastSequence`");
    lastIndex = sequences.Count - 1;
    int LastSequence = sequences.ElementAt(lastIndex) + numberOfFilesToAdd;

    query = "INSERT INTO `Media` (`DiskId`, `LastSequence`, `Cabinet`) VALUES (" + DiskId.ToString() + "," + LastSequence.ToString() + ",'#" + mediaCabinet + "')";
    Console.WriteLine(query);
    pkg.Execute(query);

    return DiskId;
}
Update: stupid me, I forgot about "committing" in transaction mode - but now it does the same as in direct mode, so no real change to the question.
I will answer this myself, since I just learned something about DIRECT mode that I didn't know before, and want to keep it here for eventual re-googling.
Apparently we only successfully update the MSI if we close the database handle before the program eventually crashes.
For the purpose of answering the question, this destructor should do it:
~className()
{
    if (pkg != null)
    {
        try
        {
            pkg.Close();
        }
        catch (Exception)
        {
            // Rollback not included, as we edit directly?
            // Do nothing: we just don't want to break anything if the database
            // was already closed, without dereferencing.
        }
    }
}
After adding the correct close statement, the MSI grew in size (and a Media row was added to the Media table :) ).
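If you would rather not rely on a destructor, the same effect can be achieved deterministically: InstallPackage derives from Database, which is disposable, so a try/finally (or a using block) around the edit guarantees the handle is closed, and the changes flushed, as soon as the work is done. A rough sketch, reusing the pkg field from the snippets above:

pkg = new InstallPackage(pathToMSIFile, DatabaseOpenMode.Direct);
try
{
    insertCabFileAsNewMediaInMSI(cabFilePath, numberOfFilesInCabinet);
}
finally
{
    // Closing the handle is what actually flushes the changes to the MSI on disk.
    pkg.Close();
    pkg = null;
}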
I will post the entire class for solving this task once it's done and tested, but I'll do it in the related question on SO.
After searching Google for a couple of hours I found an answer to my question. I know the post Undo checkout TFS answers my question; however, it doesn't answer all the questions I have. I want to achieve the same objective the post asked about: how to revert checked-out files only if nothing was modified in them? The answer shouldn't be too hard.
So what I'm doing is copying files from a server and overwriting them in my local workspace. I am checking out all the files being copied. However, if a file that was copied is not modified in any way (the server file and the destination file are exactly the same), I'd like to undo the checkout of that file.
I know I'm supposed to use the workspace.Undo() method, and the gentleman said it worked for him; however, he didn't show how he implemented it.
Here is the code I have with help from the link:
public static void CheckOutFromTFS(string filepath)
{
    var workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(filepath);
    if (workspaceInfo == null)
    {
        return;
    }
    var server = new TfsTeamProjectCollection(workspaceInfo.ServerUri);
    var workspace = workspaceInfo.GetWorkspace(server);
    workspace.PendEdit(filepath);
}
The answer given was to use the workspace.Undo() method. Do I add this method as the last line in CheckOutFromTFS() like so?
public static void CheckOutFromTFS(string filepath)
{
    var workspaceInfo = Workstation.Current.GetLocalWorkspaceInfo(filepath);
    if (workspaceInfo == null)
    {
        return;
    }
    var server = new TfsTeamProjectCollection(workspaceInfo.ServerUri);
    var workspace = workspaceInfo.GetWorkspace(server);
    workspace.PendEdit(filepath);
    workspace.Undo();
}
Or is it done differently? I'm not sure if this Undo() will only revert files if there are no changes or just revert the checkout entirely and render the PendEdit() useless. Can someone help clarify this for me?
If you use a local workspace, then all files that have no changes will automatically revert to not checked out. You don't need to do anything at all. This works with VS 2012 or later and TFS 2012 or later. You'll need to convert your workspace to a local workspace first, like this.
So I found the answer to my question in various posts. I took bits and pieces and combined them to get my working solution. Calling the Undo() function and passing in the file path actually does undo the checkout of the file, regardless of whether it was modified or not. My workspace was also local, but VS and TFS couldn't automatically revert those unmodified files for me, so I took the approach below.
So what I decided to do was just use the Team Foundation Power Tools "uu" command to undo the changes to unchanged files in the workspace. I created a batch file and entered the following command: echo y | tfpt uu . /noget /recursive. Since we will not show the shell during execution, I used the "echo y" command to automatically answer the question, "Do you wish to undo these redundant pending changes? (Y/N)". Including /noget is highly recommended since it prevents a forced 'get latest' of all your project's files, which, depending on the total number, can take an extremely long time.
var startInfo = new System.Diagnostics.ProcessStartInfo
{
    WorkingDirectory = projectRoot,
    FileName = projectRoot + @"\undoUnchanged.bat",
    UseShellExecute = false,
    CreateNoWindow = true
};
Process process = Process.Start(startInfo);
process.WaitForExit();
process.Close();
After the script runs and process.Close() executes, you can double-check that your unmodified files actually were unchecked out by hitting the refresh button in the Team Explorer window of your project. Hope someone else can find some use in this.
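If you would rather keep everything in C# instead of shelling out to tfpt, here is a rough sketch of the idea (untested; it compares the server item's MD5 hash with the local file and only undoes the pending edit when they match; Item.HashValue, GetServerItemForLocalItem, and the ItemSpec overload of Undo are part of the TFS client API, the rest is just illustrative):

public static void UndoIfUnchanged(Workspace workspace, string filepath)
{
    // Fetch the server-side item so we can read its content hash.
    var versionControl = workspace.VersionControlServer;
    var serverItem = versionControl.GetItem(workspace.GetServerItemForLocalItem(filepath));

    // Hash the local file the same way (MD5).
    byte[] localHash;
    using (var md5 = System.Security.Cryptography.MD5.Create())
    using (var stream = File.OpenRead(filepath))
    {
        localHash = md5.ComputeHash(stream);
    }

    // Only undo the pending edit when the local content matches the server content.
    // (SequenceEqual needs a "using System.Linq;" directive.)
    if (serverItem.HashValue != null && serverItem.HashValue.SequenceEqual(localHash))
    {
        workspace.Undo(new[] { new ItemSpec(filepath, RecursionType.None) });
    }
}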
If I understand the question correctly and you actually need to undo through C# code-behind, I believe this should help you:
Undo checkout TFS
I've searched all over and now I have to ask SO. I'm trying to construct a simple dataflow using EzAPI. It's been anything but easy, but I'm committed to figuring this out. What I can't figure out is how to get the EzOleDbDestination working. Here's my complete code:
var a = new Application();
// using a template since it's impossible to set up an ADO.NET connection to MySQL
// using EzAPI and potentially even with the raw SSIS API...
var pkg = new EzPackage(a.LoadPackage(@"C:\...\Package.dtsx", null));
pkg.Name = "Star";
var df = new EzDataFlow(pkg);
df.Name = "My DataFlow";
var src = new EzAdoNetSource(df);
src.Name = "Source Database";
src.SqlCommand = "SELECT * FROM enum_institution";
src.AccessMode = AccessMode.AM_SQLCOMMAND;
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
src.ReinitializeMetaData();
var derived = new EzDerivedColumn(df);
derived.AttachTo(src);
derived.Name = "Prepare Dimension Attributes";
derived.LinkAllInputsToOutputs();
derived.Expression["SourceNumber"] = "id";
derived.Expression["Name"] = "(DT_STR,255,1252)description";
// EDIT: reordered the operation here and I no longer get an error, but
// I'm not getting any mappings or any input columns when I open the package in the designer
var dest = new EzOleDbDestination(df);
dest.AttachTo(derived, 0, 0);
dest.Name = "Target Database";
dest.AccessMode = 0;
dest.Table = "[dbo].[DimInstitution]";
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
// this comes from Yahia's link
var destInput = dest.Meta.InputCollection[0];
var destVirInput = destInput.GetVirtualInput();
var destInputCols = destInput.InputColumnCollection;
var destExtCols = destInput.ExternalMetadataColumnCollection;
var sourceColumns = derived.Meta.OutputCollection[0].OutputColumnCollection;
foreach (IDTSOutputColumn100 outputCol in sourceColumns)
{
    // Now getting COM Exception here...
    var extCol = destExtCols[outputCol.Name];
    if (extCol != null)
    {
        // Create an input column from an output column of the previous component.
        destVirInput.SetUsageType(outputCol.ID, DTSUsageType.UT_READONLY);
        var inputCol = destInputCols.GetInputColumnByLineageID(outputCol.ID);
        if (inputCol != null)
        {
            // Map the input column to an external metadata column.
            dest.Comp.MapInputColumn(destInput.ID, inputCol.ID, extCol.ID);
        }
    }
}
Basically, anything that involves calls to ReinitializeMetadata() results in 0xC0090001, because that method is where the error happens. There's no real documentation to help me, so I have to rely on any gurus here.
I should mention that the source DB is MySQL and the target DB is SQL Server. Building packages like this using the SSIS designer works fine, so I know it's possible.
Feel free to tell me if I'm doing anything else wrong.
EDIT: here's a link to the base package I'm using as a template: http://www.filedropper.com/package_1 . I've redacted the connection details, but any MySQL and SQL Server database will do. The package will read from MySQL (using the MySQL ADO.NET Connector) and write to SQL Server.
The database schema is mostly irrelevant. For testing, just make a table in MySQL that has two columns: id (int) and description (varchar), with id being the primary key. Make equivalent columns in SQL Server. The goal here is simply to copy from one to the other. It may end up being more complex at some point, but I have to get past this hurdle first.
I can't test this now BUT I am rather sure that the following will help you get it working:
Calling ReinitializeMetadata() causes the component to fetch the table metadata. This should only be called after setting the AccessMode and related property. You are calling it before setting AccessMode...
Various samples including advice on debugging problems
define the derived column(s) directly in the SQL command instead of using an EzDerivedColumn (see the sketch after this list)
try to get it working with two SQL Server DBs first; some of the available MySQL ADO.NET providers have shortcomings under some circumstances
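For the derived-columns-in-SQL suggestion, it could be as simple as something along these lines (column names taken from the question; the exact MySQL cast syntax may need adjusting):

// Hypothetical: push the derived columns into the source query itself,
// so the EzDerivedColumn component is no longer needed.
src.SqlCommand = "SELECT id AS SourceNumber, CAST(description AS CHAR(255)) AS Name FROM enum_institution";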
UPDATE - as per comments some more information on debugging this and a link to a complete end-to-end sample with source:
http://blogs.msdn.com/b/mattm/archive/2009/08/03/looking-up-ssis-hresult-comexception-errorcode.aspx
http://blogs.msdn.com/b/mattm/archive/2009/08/03/debugging-a-comexception-during-package-generation.aspx
Complete working sample with source
I've had this exact same issue and was able to resolve it with a lot of experimentation. In short, you must set the connection for both the source and the destination, and call AttachTo only after both connections are set. You must call AttachTo for every component.
I've written a blog post about starting with an SSIS package as a template and then manipulating it programmatically to produce a set of new packages.
The article explains the issue in more detail.
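To make the ordering concrete, here is a rough re-ordering of the code from the question (same components, names, and connections as above; untested, but it follows the rule of setting both connections before any AttachTo):

var src = new EzAdoNetSource(df);
src.Name = "Source Database";
src.Connection = new EzConnectionManager(pkg, pkg.Connections["SourceDB"]);
src.SqlCommand = "SELECT * FROM enum_institution";
src.AccessMode = AccessMode.AM_SQLCOMMAND;
src.ReinitializeMetaData();

var dest = new EzOleDbDestination(df);
dest.Name = "Target Database";
dest.Connection = new EzConnectionManager(pkg, pkg.Connections["TargetDB"]);
dest.AccessMode = 0;
dest.Table = "[dbo].[DimInstitution]";

// Only wire the pipeline together once both connections are in place.
var derived = new EzDerivedColumn(df);
derived.Name = "Prepare Dimension Attributes";
derived.AttachTo(src);
derived.LinkAllInputsToOutputs();
derived.Expression["SourceNumber"] = "id";
derived.Expression["Name"] = "(DT_STR,255,1252)description";

dest.AttachTo(derived, 0, 0);
dest.ReinitializeMetaData();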
On a Windows 7 (or server) box, we have a folder on a UNC share (cross machine UNC, not localhost). We rename that folder, and then check for the existence of a file at the new folder location. Even though it exists, it takes almost 5 seconds for File.Exists to return true on it.
Full repro can be found on https://github.com/davidebbo/NpmFolderRenameIssue. Here is the core code:
// This file doesn't exist yet
// Note that the presence of this existence check is what triggers the bug below!!
Console.WriteLine("Exists (should be false): " + File.Exists("test/test2/myfile"));
// Create a directory, with a file in it
Directory.CreateDirectory("test/subdir/test");
File.WriteAllText("test/subdir/test/myfile", "Hello");
// Rename the directory
Directory.Move("test/subdir/test", "test/test2");
var start = DateTime.UtcNow;
// List the files at the new location. Here, our file shows up fine
foreach (var path in Directory.GetFiles("test/test2"))
{
    Console.WriteLine(path);
}

for (; ; )
{
    // Now do a simple existence test. It should also be true, but when
    // running on a (cross machine) UNC share, it takes almost 5 seconds to become true!
    if (File.Exists("test/test2/myfile")) break;
    Console.WriteLine("After {0} milliseconds, test/test2/myfile doesn't show as existing",
        (DateTime.UtcNow - start).TotalMilliseconds);
    Thread.Sleep(100);
}

Console.WriteLine("After {0} milliseconds, test/test2/myfile correctly shows as existing!",
    (DateTime.UtcNow - start).TotalMilliseconds);
So it seems like the initial existence check causes the existence value to be cached, causing this bogus behavior.
Questions: what is the explanation for this? What's the best way to avoid it?
NOTE: this issue initially arose when using npm (Node Package Manager) on Windows. The code I have here is a C# port of the repro. See https://github.com/isaacs/npm/issues/2230 for the original Node/npm issue. The goal is to find a way to address it.
David,
The redirector implements a negative "File Not Found" cache which prevents a client from flooding a server with file not found requests. The default cache time is 5 seconds but you can modify the FileNotFoundCacheLifetime registry value to control the cache or disable it by setting this value to 0.
Details: http://technet.microsoft.com/en-us/library/ff686200(v=WS.10).aspx
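If the default behaviour gets in your way, the cache can be shortened or disabled on the client machine; a rough sketch, assuming the registry value described in the linked article (requires administrative rights, affects the whole machine, and may need a reboot or a restart of the workstation service to take effect):

// Disable the SMB client's negative "File Not Found" cache (value in seconds, 0 = disabled).
Microsoft.Win32.Registry.SetValue(
    @"HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\LanmanWorkstation\Parameters",
    "FileNotFoundCacheLifetime",
    0,
    Microsoft.Win32.RegistryValueKind.DWord);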
There are multiple levels of caching in the network code. This can delay when the file's existence finally shows up.
A solution would be not to use file shares but to create a simple client/server architecture where the server reports the file's existence from its local file system. That should really speed up detection times.
My guess is that if you tried to open the file, it would open correctly even while File.Exists still says it doesn't exist, so you could rely on the server's existence information. If that doesn't work, you can simply add a download option to the client/server architecture.
Once I knew about the "File Not Found" cache, I was able to get around the problem by using a FileInfo object, which implements a Refresh() method. Your code could do this instead:
FileInfo testFile = new FileInfo("test/test2/myfile");
Console.WriteLine("Exists (should be false): " + testFile.Exists);

Directory.Move("test/subdir/test", "test/test2");
testFile.Refresh();

// The FileInfo object has now been refreshed, so a second call to Exists returns a valid value.
if (testFile.Exists)
{
    ...
}