Connecting to SQLite Database on Azure File Storage - C#

Problem:
I cannot figure out what connection string to use to connect a WPF desktop application to a SQLite database on Azure File Storage. Thanks to the MSDN documentation I am able to access the CloudFile from the app (so I have access to the URI), but when I pass the URI into a connection string, create a connection, and try to open it, I get an error message that my URI is invalid. The connection works fine when I connect to a SQLite database on my hard drive. Do I need to pass a key or something in the SQLite connection string to connect to a database on Azure File Storage? Is it even possible?
/// <summary>
/// Add all online (Azure file storage) data sources
/// </summary>
private void FindOnlineDataSources()
{
    var accountName = "myAccountName";
    var keyValue = "myKeyValue";
    var useHttps = true;
    var exportSecrets = true;
    var storageCredentials = new StorageCredentials(accountName, keyValue);
    var storageAccount = new CloudStorageAccount(storageCredentials, useHttps);
    var connString = storageAccount.ToString(exportSecrets);

    // Create a CloudFileClient object for credentialed access to Azure Files.
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();

    // Get a reference to the file share we created previously.
    CloudFileShare share = fileClient.GetShareReference("myShare");

    // Ensure that the share exists.
    if (share.Exists())
    {
        // Get a reference to the root directory for the share.
        CloudFileDirectory rootDir = share.GetRootDirectoryReference();

        // Get a reference to the directory we created previously.
        CloudFileDirectory sampleDir = rootDir.GetDirectoryReference("myDirectory");

        // Ensure that the directory exists.
        if (sampleDir.Exists())
        {
            // List the files and test a SQLite connection to each one.
            var fileList = sampleDir.ListFilesAndDirectories();
            foreach (var fileTemp in fileList)
            {
                if (fileTemp is CloudFile && TestConnection(SQLiteOnlineConnectionBuilder(fileTemp.StorageUri.PrimaryUri.AbsoluteUri)))
                {
                    // Store reference to data source
                }
            }
        }
    }
}

/// <summary>
/// Test data source connection to determine if it is accessible
/// </summary>
private bool TestConnection(DbConnection connection)
{
    bool retval = false;
    try
    {
        connection.Open();
        connection.Close();
        retval = true;
    }
    catch { }
    return retval;
}

/// <summary>
/// Create SQLite connection from URI string
/// </summary>
private DbConnection SQLiteOnlineConnectionBuilder(string uri)
{
    return new SQLiteConnection
    {
        ConnectionString = new SQLiteConnectionStringBuilder
        {
            Uri = uri,
            ForeignKeys = true,
            BinaryGUID = false,
        }.ConnectionString
    };
}
Background:
I am building a desktop app for use within my company. The data for the app is held in a SQLite database. We will have a maximum of 5 users accessing the data at one time, so I decided it would be unnecessary to set up a full database server; SQLite seems like a great option.
However, I am trying to put the SQLite database into our Azure File Storage account so that multiple users can access it through the desktop app wherever they have internet access. We don't have a central company network, so I figured Azure File Storage would be the way to go.

One option with Azure File Share that is not very secure but might suit your needs is to map the Azure file share as a network drive on the machine where the desktop app resides. Then you can just point to the SQLite *.db file inside the mapped drive.
https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows - how to make the mapping.
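If you go that route, here is a minimal sketch of the connection, assuming the share has been mapped to drive Z: and using a hypothetical database path (System.Data.SQLite, as in the question):
// Assumes the Azure file share is already mapped to Z: (see the link above), e.g.
//   net use Z: \\myAccountName.file.core.windows.net\myShare /u:myAccountName <storage-account-key>
// The directory and file names below are placeholders.
var connection = new SQLiteConnection(new SQLiteConnectionStringBuilder
{
    DataSource = @"Z:\myDirectory\MyDatabase.db",
    ForeignKeys = true,
}.ConnectionString);
connection.Open();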

Google DriveService Files.List() not returning results

Edit:
I've tried granting the SA access to my personal drive (within the organization Workspace) to do some troubleshooting. After granting the SA rights to a particular folder and rewriting the code to examine that folder, it successfully returned information about files within the test folder. The conclusion is that the SA has been set up correctly by our IT department and does have adequate scope and rights to read files in our organization's Workspace. So, the questions remain: why can't it return information about files in a Shared Drive? What other parameters need to be set in order to get it to return those files? Are there entirely other functions that need to be used? I did notice the deprecated TeamDrives.List() function, but the guidance when trying to use it was to use Files.List() as I had written originally.
--- end edit ---
We have a Google Workspace environment. I've been granted a Service Account (SA) by our IT department and am trying to use it to help maintain access rights. The SA has been granted Content Manager rights to a shared drive instance.
I've tried following along with this YouTube tutorial. Stepping through the code execution, it appears to log in correctly, but it does not return any files. I've tried substituting the full URL for the file ID of the root folder I'd like to examine, but then it returns a 404 error, so I think the file ID is pointing at the correct folder.
If the file ID is used, the code runs without errors; it simply returns no files (and there are hundreds of folders and files within the root).
Any suggestions?
namespace DriveQuickstart
{
    class Program
    {
        static string[] Scopes = { DriveService.Scope.DriveReadonly };
        private const string PathToServiceAccountKeyFile = @"<path to JSON Service Account file>";
        private const string ServiceAccountEmail = @"<Service Account email>";

        static void Main(string[] args)
        {
            MainAsync().Wait();
        }

        static async Task MainAsync()
        {
            var credential = GoogleCredential.FromFile(PathToServiceAccountKeyFile)
                .CreateScoped(new[] { DriveService.ScopeConstants.Drive });
            var service = new DriveService(new BaseClientService.Initializer()
            {
                HttpClientInitializer = credential
            });
            var request = service.Files.List();
            request.IncludeItemsFromAllDrives = true;
            request.SupportsAllDrives = true;
            request.Q = "parents in '<id of root folder in shared drive>'";
            FileList results = await request.ExecuteAsync();
            foreach (var driveFile in results.Files)
            {
                Console.WriteLine($"{driveFile.Name} {driveFile.MimeType} {driveFile.Id}");
            }
        }
    }
}
OK, it appears the @DAIMTO example is specific to personal drives. The Q parameter syntax in the example is incorrect for shared (Team) drives. To make it work in a Team environment:
IncludeItemsFromAllDrives must be set to true
SupportsAllDrives must be set to true
the Q search parameter syntax for finding specific directories is:
Q = "'folder_ID' in parents and mimeType = 'application/vnd.google-apps.folder'"; -- or the mimeType of your choice
(note: this is reversed from the YouTube example of "parents in 'folder_ID'")
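Putting those pieces together, a minimal sketch of the corrected request (the folder ID is a placeholder):
var request = service.Files.List();
request.IncludeItemsFromAllDrives = true;
request.SupportsAllDrives = true;
// Folder ID first, then "in parents" (reversed from the tutorial's syntax).
request.Q = "'<folder_ID>' in parents and mimeType = 'application/vnd.google-apps.folder'";
FileList results = await request.ExecuteAsync();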

How can I copy a file from the isolated storage to the Downloads folder?

I'm trying to copy my database file from the isolated storage to the Download folder (or any folder that the user can access).
Currently my database is stored in:
/data/user/0/com.companyname.appname/files/Databases/MyDatabase.db
I tried to use this code:
public string GetCustomFilePath(string folder, string filename)
{
    var docFolder = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
    var libFolder = Path.Combine(docFolder, folder);
    if (!Directory.Exists(libFolder))
        Directory.CreateDirectory(libFolder);
    return Path.Combine(libFolder, filename);
}

// GetDatabaseFilePath is another helper (not shown) that returns the database's full path.
var bas = GetDatabaseFilePath("MyDatabase.db");
var des = Path.Combine(Android.OS.Environment.DirectoryDownloads, "MyDatabase.db");
File.Copy(bas, des);
The Android.OS.Environment.DirectoryDownloads property returns Download, which is just the name of the downloads folder.
But File.Copy() throws an exception saying:
System.IO.DirectoryNotFoundException: Destination directory not found:
Download.
I tried putting a slash before it, like this: /Download/MyDatabase.db, with no luck.
Is there any way to copy a file like that? Do I need any permission?
1st) Yes, you do need permission to write to external storage.
You can request the required runtime permission yourself, as in the sketch after these links:
https://devblogs.microsoft.com/xamarin/requesting-runtime-permissions-in-android-marshmallow/
Or use a third-party plugin, such as James Montemagno's PermissionsPlugin:
https://github.com/jamesmontemagno/PermissionsPlugin
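For the manual route, a minimal sketch of the check inside an Activity (API 23+; the request code is arbitrary, and WRITE_EXTERNAL_STORAGE must also be declared in AndroidManifest.xml):
const int RequestStorageId = 1000; // arbitrary request code
// Ask for the permission at runtime if the user has not granted it yet.
if (CheckSelfPermission(Android.Manifest.Permission.WriteExternalStorage) != Android.Content.PM.Permission.Granted)
{
    RequestPermissions(new[] { Android.Manifest.Permission.WriteExternalStorage }, RequestStorageId);
}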
2nd) Once your user accepts that it is OK to write to external storage, you can combine Android.OS.Environment.ExternalStorageDirectory.AbsolutePath with Android.OS.Environment.DirectoryDownloads to obtain the path of the device's public Download folder, e.g. via a Forms dependency service:
public interface IDownloadPath
{
    string Get();
}

public class DownloadPath_Android : IDownloadPath
{
    public string Get()
    {
        return Path.Combine(Android.OS.Environment.ExternalStorageDirectory.AbsolutePath, Android.OS.Environment.DirectoryDownloads);
    }
}
https://learn.microsoft.com/en-us/xamarin/xamarin-forms/app-fundamentals/dependency-service/introduction
You end up with something like:
public void Handle_Button(object sender, System.EventArgs e)
{
    var fileName = "someFile.txt";
    using (var stream = File.Create(Path.Combine(FileSystem.CacheDirectory, fileName)))
    {
        // Just creating a dummy file to copy (in the cache dir, using Xamarin.Essentials).
    }
    var downloadPath = DependencyService.Get<IDownloadPath>().Get();
    // Note: File.Copy needs a destination file path, not just the destination folder.
    File.Copy(Path.Combine(FileSystem.CacheDirectory, fileName), Path.Combine(downloadPath, fileName));
}

403 on file creation while using Hadoop.WebHDFSClient, Although was able to create Folders in HDFS

This code works up to the folder creation, but then it cannot create the file and throws a 403. Am I missing something here for authentication? I am new to Hadoop and am trying to learn HDInsight.
private static void uploadFile()
{
    // Set variables.
    string srcFileName = @"./prime/in/integers.txt";
    string destFolderName = @"/prime/in";
    string destFileName = @"integers.txt";
    string outputFolderName = @"/prime/out";

    // Connect to the Hadoop cluster.
    Uri myUri = new Uri("http://DXPZN72-LP:50070");
    string userName = "hadoop";
    WebHDFSClient myClient = new WebHDFSClient(myUri, userName);

    // Drop the destination directory (if it exists).
    myClient.DeleteDirectory(destFolderName, true).Wait();

    // Create the destination directory.
    myClient.CreateDirectory(destFolderName).Wait();

    // Create the output directory.
    myClient.CreateDirectory(outputFolderName).Wait();

    // Load the file to the destination directory.
    var res = myClient.CreateFile(srcFileName, "/" + destFileName);
    res.Wait();

    // List the file contents of the destination directory.
    Console.WriteLine();
    Console.WriteLine("Contents of " + destFolderName);
    myClient.GetDirectoryStatus(destFolderName).ContinueWith(
        ds => ds.Result.Files.ToList().ForEach(
            f => Console.WriteLine("\t" + f.PathSuffix)
        ));

    // Keep the command window open until the user presses Enter.
    Console.ReadLine();
}
This turned out to be a service that needed restarting. Although I was able to create folders, file creation failed with a 403.
I restarted my machine and tried again; the issue was gone and I was able to create the file as well.

Command line SFTP "Local to local copy not supported" error

I'm running a program written in C# from the command line as administrator. It generates the batch file (which it does correctly), and then it should sFTP the file to a remote site. I have verified that the username and password are correct. When I run the utility, it says it's transferring the file and then immediately gives me this error:
ERROR: Local to local copy not supported.
However, I can manually (through FileZilla) move the file from our server to their site. It's probably something silly, but I just can't seem to figure it out. Any help is appreciated!
There are many files in this program, but here is where most of the FTP work happens in the code. I hope it helps:
if (pars.ContainsKey("ftp"))
{
    var env = (pars.ContainsKey("ftp") ? pars["ftp"] : null) ?? "default";
    entities = entities ?? new SBLStagingEntities();
    settings = settings ?? new SettingReader(entities, env).GetSetting();
    var filename = Path.GetFileName(pars["path"]);
    Console.WriteLine("Transfering {0} using sFTP ................................\t\t", filename);
    var processors = new SblFtpTransport(settings);
    processors.ProcessFile(pars["path"]);
    Console.Write("sFTP Done\n");
}
///----------------------- a different class that is called from the first one ------///
public SblFtpTransport(Settings settings)
{
    _settings = settings;
}

/// <summary>
/// This method is called by the file watcher for each new file dropped in the watched folder.
/// </summary>
/// <param name="file"></param>
public void ProcessFile(string file)
{
    var fileName = Path.GetFileName(file);
    if (!File.Exists(file) || string.IsNullOrEmpty(fileName))
    {
        Console.Error.WriteLine("file does not exist");
        return;
    }

    // FTP the file and record the result in the db.
    var result = FtpFile(file);
    Log(fileName, result);
    Console.Write("{0}", result);
    Archive(result, file);
}
///------------------------------- another class that is used --------------///
public class WatcherSettings
{
    public IFileProcessor CreateProcessor()
    {
        return new SblFtpTransport(new Settings()
        {
            AchiveFolder = @"C:\Docs\place\Watcher\Archived",
            FtpPort = "22",
            FtpServer = "xxxxx.someplace.net",
            FtpTargetPath = "/StudentBatchLoad_FW",
            FtpUsername = "xxx",
            Password = "xxxxxxx",
        });
    }

    public string WatcherPath { get; set; }
}
As far as I can tell, you never call CreateProcessor(). It appears you need to call it so the settings are created properly with the remote host; that's why you get an error saying you're trying to copy to the local host. Change your code to call it.
Also, your code is quite disjointed and hard to read. Spend some time cleaning it up, and step through it with a debugger to see exactly what's happening.
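A minimal sketch of that change, assuming IFileProcessor exposes ProcessFile (all names are from the posted code):
// Build the transport through WatcherSettings so the remote-host settings are applied,
// instead of constructing SblFtpTransport with locally-built settings.
var processor = new WatcherSettings().CreateProcessor();
processor.ProcessFile(pars["path"]);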

Not able to mount VHD drive on azure server

Please help me. I am using the following code to mount a VHD file, but I am not able to mount it. It works fine locally, but when I deploy it to the Azure server the web role remains offline. I tried removing the foreach block below, but in vain. When I removed the line "Global.driveLetter = drive.Mount(localCache.MaximumSizeInMegabytes - 20, DriveMountOptions.Force);" the role got ready on the server, but I can't do that because it is the key statement that mounts the drive.
What would be the problem?
private static void MountAzureDrive()
{
    string connectionStringSettingName = "AzureConnectionString";
    string azureDriveContainerName = "azuredrives";
    string azureDrivePageBlobName = Guid.NewGuid().ToString("N").ToLowerInvariant();
    string azureDriveCacheDirName = Path.Combine(Environment.CurrentDirectory, "cache");

    CloudStorageAccount.SetConfigurationSettingPublisher((a, b) =>
    {
        b(RoleEnvironment.GetConfigurationSettingValue(connectionStringSettingName));
    });

    //CloudStorageAccount storageAccount = CloudStorageAccount.FromConfigurationSetting(connectionStringSettingName);
    CloudStorageAccount storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
    LocalResource localCache = RoleEnvironment.GetLocalResource("InstanceDriveCache");
    CloudDrive.InitializeCache(localCache.RootPath + "cache", localCache.MaximumSizeInMegabytes);

    // Just checking: make sure the container exists.
    CloudBlobClient blobClient = storageAccount.CreateCloudBlobClient();
    blobClient.GetContainerReference("drives").CreateIfNotExist();

    // Create the cloud drive.
    //WebRole.drive = storageAccount.CreateCloudDrive(blobClient.GetContainerReference("drives").GetPageBlobReference("Test.VHD").Uri.ToString());
    WebRole.drive = storageAccount.CreateCloudDrive("drives/Test.VHD");
    try
    {
        WebRole.drive.CreateIfNotExist(512);
    }
    catch (CloudDriveException ex)
    {
        // Handle the exception here.
        // An exception is also thrown if all is well but the drive already exists.
    }

    foreach (var d in CloudDrive.GetMountedDrives())
    {
        var mountedDrive = storageAccount.CreateCloudDrive(d.Value.PathAndQuery);
        mountedDrive.Unmount();
    }

    //Global.driveLetter = drive.Mount(25, DriveMountOptions.Force);
    Global.driveLetter = drive.Mount(localCache.MaximumSizeInMegabytes - 20, DriveMountOptions.Force);
}
Thanks in advance.
Maybe this is stating the obvious, but... when you deploy to Windows Azure, did you change the storage account from dev storage? You have the dev storage emulator hard-coded:
CloudStorageAccount storageAccount = CloudStorageAccount.DevelopmentStorageAccount;
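A minimal sketch of the fix, mirroring the line that is already commented out in the question (SetConfigurationSettingPublisher is already called, so the setting should resolve at runtime):
// Read the storage account from the role configuration instead of hard-coding the emulator.
CloudStorageAccount storageAccount =
    CloudStorageAccount.FromConfigurationSetting(connectionStringSettingName);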
