Using a DB File to run Tests on Azure DevOps - C#

Is there a way to create a DB File on the Azure Devops pipeline using one of the tasks?
My line of thought is to create a LocalDB on the agent (using VS) and run the unit tests (SSDT) on that DB file like I do in VS. In VS I can create a DB file via Tools > Connect to Database > SQL Server Database File and entering a name; I can then connect to it and run the tests. It seems like I can't do this in an Azure DevOps pipeline.
I know the preferred way is to allocate an Azure SQL server and run the tests against that, but the DB is very small, and if I can run the tests against the DB file it seems like a better idea.

Basically, I found a way to do it all on the agent. However, the agent's LocalDB has to be updated if you are using newer SQL syntax.
- task: CopyFiles@2
  inputs:
    Contents: '**/Output/*.dacpac'
    flattenFolders: true
    TargetFolder: '$(Build.ArtifactStagingDirectory)'
- task: PowerShell@2
  inputs:
    targetType: 'inline'
    script: |
      sqllocaldb start MSSQLLocalDB
      sqllocaldb info MSSQLLocalDB
      # import SqlServer module
      Import-Module -Name "SqlServer"
      # create variable with SQL to execute
      $sql = "
      CREATE DATABASE [MyDatabase]
      CONTAINMENT = NONE
      ON PRIMARY
      ( NAME = N'MyDatabase', FILENAME = N'd:\a\1\s\testing.Data\test\bin\Output\MyDatabase.mdf' , SIZE = 1048576KB , FILEGROWTH = 262144KB )
      LOG ON
      ( NAME = N'MyDatabase_log', FILENAME = N'd:\a\1\s\testing.Data\test\bin\Output\MyDatabase_log.ldf' , SIZE = 524288KB , FILEGROWTH = 131072KB )
      GO
      USE [master]
      GO
      ALTER DATABASE [MyDatabase] SET RECOVERY SIMPLE WITH NO_WAIT
      GO
      ALTER AUTHORIZATION ON DATABASE::[MyDatabase] TO [sa]
      GO "
      Invoke-SqlCmd -ServerInstance "(localdb)\MSSQLLocalDB" -Database master -Query $sql
- task: SqlDacpacDeploymentOnMachineGroup@0
  inputs:
    TaskType: 'dacpac'
    DacpacFile: '$(Build.ArtifactStagingDirectory)/*.dacpac'
    TargetMethod: 'connectionString'
    ConnectionString: 'Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=MyDatabase;Integrated Security=True;'
With this you can attach a generated .mdf file to your LocalDB instance and publish your dacpac to it. Then, if you want to run your tests, you can do so.
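For example, here is a minimal sketch of a test that could then run against the deployed database. This assumes MSTest and the Microsoft.Data.SqlClient package; the smoke-test query is just an illustration, not part of the original answer:

using Microsoft.Data.SqlClient;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class DatabaseSmokeTests
{
    // Matches the instance and database created by the pipeline above.
    private const string ConnectionString =
        @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=MyDatabase;Integrated Security=True;";

    [TestMethod]
    public void Database_IsReachable()
    {
        using var connection = new SqlConnection(ConnectionString);
        connection.Open();

        // The deployed dacpac should have created at least one table.
        using var command = new SqlCommand("SELECT COUNT(*) FROM sys.tables", connection);
        Assert.IsTrue((int)command.ExecuteScalar() > 0);
    }
}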

Using a DB File to run Tests on Azure DevOps
As a workaround, you could check in the LocalDB files (.mdf and .ldf), copy the files to the output directory, and change the connection string to use the current execution path:
protected override void OnConfiguring(DbContextOptionsBuilder optionsBuilder)
{
    optionsBuilder.UseSqlServer($"Data Source=(LocalDB)\\MSSQLLocalDB;Initial Catalog=Contacts;AttachDbFilename={AppDomain.CurrentDomain.BaseDirectory}Core\\IntegrationTests\\Contacts.mdf;Integrated Security=True");
}
You could check the document Integration Testing with SQL LocalDb on your build server for some details.
Hope this helps.

Related

Is there a way to set the file permissions when a database backup file is created in SQL Server?

I'm currently running Microsoft SQL Server 2022 Express Edition in a Docker container on Ubuntu. I'm running the following query using C# with the Microsoft.Data.SqlClient namespace:
BACKUP DATABASE [DBName] TO DISK = N'/path/to/dir/DBName.bak' WITH NOFORMAT, NOINIT, NAME = 'DBName', SKIP, NOREWIND, NOUNLOAD, STATS = 10;
The backup is successfully created with the following permissions:
-rw-r----- 1 root root 16900096 Feb 2 11:14 DBName.bak
Is there a way I can configure SQL Server to write the backup file with certain permissions (basically the equivalent of running sudo chmod on the file)? I specifically want all users to have read/write/execute permissions.
I COULD just have my code run a shell script that sets the permissions after the file is created, but I would prefer a platform-independent solution. I was hoping that SQL Server may have some way of configuring the backup file output.
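Regarding that last idea: if the backup path is visible to the C# process (say, via a mounted volume) and the process has rights to change the file's mode, .NET 7+ can do the chmod step without shelling out. This is only a sketch of that post-backup workaround, not a SQL Server setting; the connection string and mount path are placeholders:

using System.IO;
using Microsoft.Data.SqlClient;

// Placeholder connection string and backup path (mounted from the container).
var connectionString = "Data Source=localhost;Initial Catalog=DBName;User ID=sa;Password=yourStrong(!)Password;TrustServerCertificate=True";
const string backupPath = "/path/to/dir/DBName.bak";

using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using var command = new SqlCommand(
        "BACKUP DATABASE [DBName] TO DISK = N'/path/to/dir/DBName.bak' " +
        "WITH NOFORMAT, NOINIT, NAME = 'DBName', SKIP, NOREWIND, NOUNLOAD, STATS = 10;",
        connection);
    command.CommandTimeout = 0; // backups can outlast the default 30-second timeout
    command.ExecuteNonQuery();
}

// rwxrwxrwx, the equivalent of chmod 777; File.SetUnixFileMode throws on Windows.
if (!OperatingSystem.IsWindows())
{
    File.SetUnixFileMode(backupPath,
        UnixFileMode.UserRead | UnixFileMode.UserWrite | UnixFileMode.UserExecute |
        UnixFileMode.GroupRead | UnixFileMode.GroupWrite | UnixFileMode.GroupExecute |
        UnixFileMode.OtherRead | UnixFileMode.OtherWrite | UnixFileMode.OtherExecute);
}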

MVC4: Move SQLServer session into external hosting

I want to move my session database from my local machine into external hosting. Why? Because I do not have access to use
aspnet_regsql.exe -d DBName -S ServerName -U User -P Password -ssadd -sstype c
on the external hosting.
I tried to back up and restore. The problem is that the database name on the external hosting server starts with a prefix, which makes it different from the DBName on my local machine.
I want to have just one file that I can move to any external hosting server, with a dynamic name (meaning the DB name will be different each time I install my website).
I am considering using a custom SQL session state provider, but I don't know how to build one.
Why not use a config file and config file transformations for each deployment, so that the database connection string is substituted with the right one during deployment? (A sketch of such a transform follows the link below.)
https://forums.asp.net/t/1993077.aspx?How+to+Change+The+Default+Database+name+In+MVC+4
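A minimal sketch of that transform approach, assuming a Web.Release.config next to the Web.config; the connection string name and values here are hypothetical:

<!-- Web.Release.config: swaps in the host-specific connection string at deploy time -->
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <add name="SessionStateDb"
         connectionString="Data Source=HostServer;Initial Catalog=hostprefix_DBName;User ID=User;Password=Password"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
</configuration>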

using C# with PIG on HDInsight Azure

I am working with the HDInsight .NET SDK to use a C# app with Pig. I am getting an error when specifying the C# application path.
Here's how I am defining the C# app in the Pig script:
DEFINE pigudf `PigUDF.exe` SHIP('wasb://book@storage.blob.core.windows.net/PIG/app/PigUDF.exe');
I am getting the error "invalid ship specification": 'wasb://bookstore@storage160203122480.blob.core.windows.net/PIG/app/PigUDF.exe' doesn't exist. However, PigUDF.exe does exist at the given path.
If I run the same query from the HDInsight cluster console, with both the Pig script file and the C# app stored locally on the cluster, it runs successfully. I.e., the below works on the HDInsight cluster console:
DEFINE pigudf `PigUDF.exe` SHIP('C:/PigUDF.exe');
where PigUDF.exe is stored locally on the cluster.
I even tried running it through HDInsight Tools for Visual Studio, but I get the same error.
Any help here will be appreciated.
thanks,
Saleem
Try using http:// instead of wasb://. The wasb protocol is used to access Windows Azure blob storage.
DEFINE pigudf `PigUDF.exe` SHIP('http://book@storage.blob.core.windows.net/PIG/app/PigUDF.exe');
You can copy your UDF locally, update its permissions, ship it, and eventually remove it:
fs -copyToLocal wasb://<container>@account/udf.exe
sh cacls udf.exe /E /G <group>:F
define myUdf `udf.exe` ship('udf.exe')
-- run your computation...
sh del udf.exe

How do I generate a bacpac file from my local machine and upload it to an Azure blob?

I'm using SQL Server 2012. I know how to take a bacpac of a SQL database from the Azure portal and store that file in a blob, but:
How do I generate a bacpac file from my local machine and upload it to an Azure blob?
Is there any way to do so, using a C# program or any utility?
If you are looking to automate this, there is a way:
1) Generate the .bacpac file using SqlPackage.exe
For example:
"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe"
/a:Export /ssn:SourceServerName /sdn:SourceDatabaseName
/tf:C:\DataExtraction\SourceDatabase.bacpac
This would generate a bacpac under C:\DataExtraction\SourceDatabase.bacpac
for more info go here: SqlPackage.exe
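If you prefer to do the export from C# instead of calling sqlpackage.exe, here is a minimal sketch using the DacFx API (the Microsoft.SqlServer.DacFx NuGet package; server, database, and file path are placeholders matching the example above):

using Microsoft.SqlServer.Dac;

// Connect to the source instance (placeholder names from the example above).
var connectionString = "Data Source=SourceServerName;Initial Catalog=SourceDatabaseName;Integrated Security=True";
var services = new DacServices(connectionString);

// Writes the bacpac to disk; equivalent to sqlpackage /a:Export.
services.ExportBacpac(@"C:\DataExtraction\SourceDatabase.bacpac", "SourceDatabaseName");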
2) Upload the bacpac to Azure Storage as Blob using Azure PowerShell
Switch-AzureMode -Name AzureServiceManagement
$context= New-AzureStorageContext -StorageAccountName "Storageaccountname" -StorageAccountKey "mystoragekeyhere"
Set-AzureStorageBlobContent -Context $context -Container adventureworks -File "NameOfLocal.bacpac" -Blob "NameofBacpacInStorageContainer.bacpac"
for more information on this cmdlet go here:
Set-AzureStorageBlobContent
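If you would rather do the upload from C# as well (the question asks about C#), here is a minimal sketch using the current Azure.Storage.Blobs package rather than the older Azure PowerShell module shown above; the connection string, container, and blob names are placeholders mirroring the example:

using Azure.Storage.Blobs;

// Placeholder storage connection string, container name, and blob name.
var blobClient = new BlobClient(
    "DefaultEndpointsProtocol=https;AccountName=Storageaccountname;AccountKey=mystoragekeyhere;EndpointSuffix=core.windows.net",
    "adventureworks",
    "NameofBacpacInStorageContainer.bacpac");

// Upload the local bacpac, overwriting any existing blob with the same name.
blobClient.Upload(@"C:\DataExtraction\SourceDatabase.bacpac", overwrite: true);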
Should you need to import, the following command will help you:
"C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\sqlpackage.exe"
/a:Import /tsn:TargetServerName /tdn:TargetDatabaseName
/sf:C:\DataExtraction\SourceDatabase.bacpac
Should you desire to use sqlpackage.exe add the following directory to your PATH (instructions):
C:\Program Files (x86)\Microsoft SQL Server\110\DAC\bin\
Download and install the SSDT tool, then:
[Your Database] -> Tasks -> Export Data-tier Application -> [choose the local path for the .bacpac]
You can also deploy directly to SQL Azure using Deploy Data-tier Application to SQL Azure.

How to create a SQL Server database backup with .NET?

I want to make a database backup with this C# code:
connect = new SqlConnection(con);
connect.Open();
// Execute SQL
SqlCommand command = new SqlCommand
(
    @"backup database MY_database to disk='d:\SQLBackup\wcBackUp1.bak' with init, stats=10",
    connect
);
command.ExecuteNonQuery();
connect.Close();
When I run it, the following error message shows up:
Cannot open backup device 'd:\SQLBackup\wcBackUp1.bak'.
Operating system error 3(The system cannot find the path specified.).
If I change the path to d:\wcBackUp1.bak it seems to be OK and runs without error, but the file does not exist; it was not generated.
If I run the command in SQL, I get the message that it was 100% processed, but I didn't see the file.
Could someone help me, please?
Make sure the location "d:\SQLBackup\" exists on your database server and not on your client machine.
Two things to check.
First, the SQL Server service may not have access to the d:\sqlbackup folder. Old SQL installs used to default to installing the service with full access to the machine, but newer instances tighten that up. You could try changing the path to the directory where the default backups are stored.
Secondly, if the SQL Server is not on the same machine you are running this program from, remember that D: will be the D: drive on the SQL Server, not on your local machine.
Fundamentally, the Windows account that the SQL Server service runs under must have write permissions on the specified folder.
You can check what account this is by looking in SQL Server Configuration Manager, under SQL Server Services (look at the Log On As column)
Check what permissions that account actually has on the target folder using Explorer -> right click folder -> properties -> security -> advanced -> effective permissions.
One way to check that this is the problem is to change your code to back up to your SQL instance's backup folder, where the permissions are likely to be correct. For example
C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Backup
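As a hedged sketch of that suggestion: newer versions of SQL Server (2019 and later) can report their default backup directory via SERVERPROPERTY('InstanceDefaultBackupPath'), so the code can back up to a folder the service account is known to write to. The connection string is a placeholder, and the fallback path is just the example above:

using System.IO;
using Microsoft.Data.SqlClient;

// Placeholder connection string.
var connectionString = "Data Source=MyServer;Initial Catalog=master;Integrated Security=True";

using var connection = new SqlConnection(connectionString);
connection.Open();

// Available on SQL Server 2019+; returns NULL on older versions.
string? backupDir;
using (var pathCmd = new SqlCommand(
    "SELECT CAST(SERVERPROPERTY('InstanceDefaultBackupPath') AS nvarchar(4000))", connection))
{
    backupDir = pathCmd.ExecuteScalar() as string;
}
// Fall back to the instance backup folder mentioned above.
backupDir ??= @"C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Backup";

var backupFile = Path.Combine(backupDir, "wcBackUp1.bak");
using var backupCmd = new SqlCommand(
    $"backup database MY_database to disk = N'{backupFile}' with init, stats=10", connection);
backupCmd.CommandTimeout = 0; // backups can outlast the default 30-second timeout
backupCmd.ExecuteNonQuery();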
