I'm running a C# program from the command line as administrator. It generates a batch file (which it does correctly), and then it should sFTP the file to a remote site. I have verified that the username and password are correct. When I run the utility, it says it's transferring the file and then immediately gives me this:
ERROR: Local to local copy not supported.
However, I can manually (through FileZilla) move the file from our server to their site. It's probably something silly, but I just can't seem to figure it out. Any help is appreciated!
There are many files in this program, but here is where most of the FTP code lives. I hope it helps:
if (pars.ContainsKey("ftp"))
{
    var env = (pars.ContainsKey("ftp") ? pars["ftp"] : null) ?? "default";
    entities = entities ?? new SBLStagingEntities();
    settings = settings ?? new SettingReader(entities, env).GetSetting();
    var filename = Path.GetFileName(pars["path"]);
    Console.WriteLine("Transferring {0} using sFTP ................................\t\t", filename);
    var processors = new SblFtpTransport(settings);
    processors.ProcessFile(pars["path"]);
    Console.Write("sFTP Done\n");
}
///-----------------------a different class that is called from the first one------///
public SblFtpTransport(Settings settings)
{
    _settings = settings;
}

/// <summary>
/// this method is called by file watcher for each new file dropped in the watched folder
/// </summary>
/// <param name="file"></param>
public void ProcessFile(string file)
{
    var fileName = Path.GetFileName(file);
    if (!File.Exists(file) || string.IsNullOrEmpty(fileName))
    {
        Console.Error.WriteLine("file does not exist");
        return;
    }

    //ftp the file and record the result in db
    var result = FtpFile(file);
    Log(fileName, result);
    Console.Write("{0}", result);
    Archive(result, file);
}
///-------------------------------another class that is used--------------///
public class WatcherSettings
{
    public IFileProcessor CreateProcessor()
    {
        return new SblFtpTransport(new Settings()
        {
            AchiveFolder = @"C:\Docs\place\Watcher\Archived",
            FtpPort = "22",
            FtpServer = "xxxxx.someplace.net",
            FtpTargetPath = "/StudentBatchLoad_FW",
            FtpUsername = "xxx",
            Password = "xxxxxxx",
        });
    }

    public string WatcherPath { get; set; }
}
As far as I can tell, you never call CreateProcessor(). It appears you need to call it so the settings are created with the remote host filled in, which is why you get an error about copying to the local host. Change your code to call that method.
Your code is also quite disjointed and hard to read. Spend some time cleaning it up, and step through it with a debugger to see exactly what's happening.
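For example (a sketch, not tested; it assumes the WatcherSettings instance is available where the transfer code runs):

```csharp
// Hypothetical wiring: build the transport via CreateProcessor() so the
// remote host, port, and credentials in Settings are actually applied,
// instead of constructing SblFtpTransport from a half-populated Settings.
var watcherSettings = new WatcherSettings();
IFileProcessor processor = watcherSettings.CreateProcessor();
processor.ProcessFile(pars["path"]);
```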
I'm trying to copy my database file from the isolated storage to the Download folder (or any folder that the user can access).
Currently my database is stored in:
/data/user/0/com.companyname.appname/files/Databases/MyDatabase.db
I tried to use this code:
public string GetCustomFilePath(string folder, string filename)
{
    var docFolder = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal);
    var libFolder = Path.Combine(docFolder, folder);

    if (!Directory.Exists(libFolder))
        Directory.CreateDirectory(libFolder);

    return Path.Combine(libFolder, filename);
}

var bas = GetDatabaseFilePath("MyDatabase.db");
var des = Path.Combine(Android.OS.Environment.DirectoryDownloads, "MyDatabase.db");
File.Copy(bas, des);
The Android.OS.Environment.DirectoryDownloads property returns the path Download, which is the name of the downloads folder.
But File.Copy() throws an exception saying:
System.IO.DirectoryNotFoundException: Destination directory not found:
Download.
I tried prefixing a slash, like /Download/MyDatabase.db, with no luck.
Is there any way to copy a file like that? Do I need any permission?
1st) Yes, you do need permissions to write to external storage.
You can get the runtime time permission required by doing it yourself:
https://devblogs.microsoft.com/xamarin/requesting-runtime-permissions-in-android-marshmallow/
Or via a 3rd-party plugin, such as James Montemagno's PermissionsPlugin
https://github.com/jamesmontemagno/PermissionsPlugin
2nd) Once the user agrees that it is OK to write to external storage, you can combine:
Android.OS.Environment.ExternalStorageDirectory.AbsolutePath, Android.OS.Environment.DirectoryDownloads
to obtain the path of the device's public Download folder, e.g. using a Forms dependency service:
public interface IDownloadPath
{
    string Get();
}

public class DownloadPath_Android : IDownloadPath
{
    public string Get()
    {
        return Path.Combine(Android.OS.Environment.ExternalStorageDirectory.AbsolutePath, Android.OS.Environment.DirectoryDownloads);
    }
}
https://learn.microsoft.com/en-us/xamarin/xamarin-forms/app-fundamentals/dependency-service/introduction
You end up with something like:
public void Handle_Button(object sender, System.EventArgs e)
{
    var fileName = "someFile.txt";

    using (var stream = File.Create(Path.Combine(FileSystem.CacheDirectory, fileName)))
    {
        // just creating a dummy file to copy (in the cache dir, using Xamarin.Essentials)
    }

    var downloadPath = DependencyService.Get<IDownloadPath>().Get();

    // Note: the destination must include the file name, not just the folder.
    File.Copy(Path.Combine(FileSystem.CacheDirectory, fileName), Path.Combine(downloadPath, fileName));
}
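As an aside, the original error is easy to reproduce with plain Path calls: DirectoryDownloads is just the folder name "Download", so the destination built in the question was a relative path with no drive or root, and File.Copy resolved it against the app's working directory. A minimal sketch (standard .NET only, no Android types):

```csharp
using System;
using System.IO;

class RelativeDownloadPath
{
    static void Main()
    {
        // Android.OS.Environment.DirectoryDownloads is just the string "Download",
        // so combining it with a file name yields a relative path:
        string des = Path.Combine("Download", "MyDatabase.db");

        // Nothing roots the path, so File.Copy resolves it against the current
        // working directory, where no Download folder exists.
        Console.WriteLine(Path.IsPathRooted(des));   // False
    }
}
```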
Problem:
I cannot figure out what connection string to use to connect a WPF desktop application to a SQLite database on Azure File Storage. Thanks to the MSDN Documentation I am able to access the CloudFile from the app (so I have access to the URI), but when I pass the URI to a connection string to create a connection and then try to open the connection, I get an error message that my URI is invalid. The connection works fine when I try to connect to a SQLite database on my hard drive. Do I need to pass a key or something to the SQLite connection string to connect to a database on Azure File Storage? Is it even possible?
/// <summary>
/// Add all online (Azure file storage) data sources
/// </summary>
private void FindOnlineDataSources()
{
    var accountName = "myAccountName";
    var keyValue = "myKeyValue";
    var useHttps = true;
    var exportSecrets = true;

    var storageCredentials = new StorageCredentials(accountName, keyValue);
    var storageAccount = new CloudStorageAccount(storageCredentials, useHttps);
    var connString = storageAccount.ToString(exportSecrets);

    // Create a CloudFileClient object for credentialed access to Azure Files.
    CloudFileClient fileClient = storageAccount.CreateCloudFileClient();

    // Get a reference to the file share we created previously.
    CloudFileShare share = fileClient.GetShareReference("myShare");

    // Ensure that the share exists.
    if (share.Exists())
    {
        // Get a reference to the root directory for the share.
        CloudFileDirectory rootDir = share.GetRootDirectoryReference();

        // Get a reference to the directory we created previously.
        CloudFileDirectory sampleDir = rootDir.GetDirectoryReference("myDirectory");

        // Ensure that the directory exists.
        if (sampleDir.Exists())
        {
            // Get a reference to the file we created previously.
            var fileList = sampleDir.ListFilesAndDirectories();

            foreach (var fileTemp in fileList)
            {
                if (fileTemp is CloudFile && TestConnection(SQLiteOnlineConnectionBuilder(fileTemp.StorageUri.PrimaryUri.AbsoluteUri)))
                {
                    // Store reference to data source
                }
            }
        }
    }
}
/// <summary>
/// Test data source connection to determine if it is accessible
/// </summary>
private bool TestConnection(DbConnection connection)
{
    bool retval = false;

    try
    {
        connection.Open();
        connection.Close();
        retval = true;
    }
    catch { }

    return retval;
}
/// <summary>
/// Create SQLite connection from URI string
/// </summary>
private DbConnection SQLiteOnlineConnectionBuilder(string uri)
{
    return new SQLiteConnection
    {
        ConnectionString = new SQLiteConnectionStringBuilder
        {
            Uri = uri,
            ForeignKeys = true,
            BinaryGUID = false,
        }.ConnectionString
    };
}
Background:
I am building a desktop app for use within my company. The data for the app is held in a SQLite database. We will only have a maximum of 5 users accessing the data at one time so I decided it would be unnecessary to try to set up a full server - SQLite seems like a great option.
However, I am trying to put the SQLite database into our Azure File Storage account so that multiple users can access it through the desktop app wherever they have internet access. We don't have a central company network, so I figured Azure File Storage would be the way to go.
One option with Azure File Share, not very secure but it might suit your needs, is to map the Azure file share as a network drive on the machine where the desktop app resides. Then you can just point to the SQLite .db file inside the mapped drive.
https://learn.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows - how to make the mapping.
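Once the share is mapped (say to Z:), the connection might look like this; a sketch assuming System.Data.SQLite and a hypothetical path on the mapped drive, using a plain Data Source rather than the Uri keyword:

```csharp
// Hypothetical: the Azure file share from the question is mapped to Z:
// following the doc linked above.
var builder = new SQLiteConnectionStringBuilder
{
    DataSource = @"Z:\myDirectory\MyDatabase.db",
    ForeignKeys = true,
};

using (var connection = new SQLiteConnection(builder.ConnectionString))
{
    connection.Open();   // opens over the mapped SMB share
}
```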
Maybe someone knows a simple solution to my problem.
I do not know where the file was dropped, so it is not a static value.
It can be changed through the BizTalk GUI, and there we have a URI on the receive port, but I do not believe it is that easily accessible. What I want to do is write out the full path as the file name. It works well with the MessageID, where the file is given a specific file-path name, but the path where the file was dropped is not working that well.
I keep getting this error :
Message:
Object reference not set to an instance of an object.
the message resource is present but the message is not found in the string/message table
That does not tell me much.
Below you can see a snip from my code
internal static string UpdateMacroPathProperty(IBaseMessage baseMessage, string macroPathProperty, string macroDefsFile)
{
    if (macroName == "MessageID")
    {
        contextPropertyValue = baseMessage.MessageID.ToString();
    }
    else if (macroName == "SourceFileName")
    {
        contextPropertyValue = Directory.GetCurrentDirectory();
    }
}
This is a custom-built pipeline. Has anyone encountered this problem, or can you point me in the right direction?
I know that BizTalk has a built-in macro for this (BizTalk Server: List of Macros, e.g. %SourceFileName%), but I'm trying to save these as logs in a specific folder structure so that they do not get processed.
It's adapter dependent; some adapters will use the FILE adapter's namespace even though they're not the file adapter, but this is the kind of logic that I've used in the past for this:
string adapterType = (string)pInMsg.Context.Read("InboundTransportType",
    "http://schemas.microsoft.com/BizTalk/2003/system-properties");

string filePath = null;

if (adapterType != null)
{
    if (adapterType == "FILE")
    {
        filePath = (string)pInMsg.Context.Read("ReceivedFileName",
            "http://schemas.microsoft.com/BizTalk/2003/file-properties");
    }
    else if (adapterType.Contains("SFTP") && !adapterType.Contains("nsoftware"))
    // nsoftware uses the FTP schema
    {
        filePath = (string)pInMsg.Context.Read("ReceivedFileName",
            "http://schemas.microsoft.com/BizTalk/2012/Adapter/sftp-properties");
    }
    else if (adapterType.Contains("FTP"))
    {
        filePath = (string)pInMsg.Context.Read("ReceivedFileName",
            "http://schemas.microsoft.com/BizTalk/2003/ftp-properties");
    }
}
And then you can just fall back to the MessageID if you can't get the file path from any of these.
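That fallback might look like this (a sketch continuing the snippet above; pInMsg is the pipeline component's IBaseMessage):

```csharp
// Fall back to the MessageID when no adapter-specific ReceivedFileName was found.
if (string.IsNullOrEmpty(filePath))
{
    filePath = pInMsg.MessageID.ToString();
}
```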
My app generates spreadsheet files and subfolders in which to categorize them, and then stores them in a shared folder on the network; it works fine - this network location exists:
\\storageblade\cs\REPORTING\RoboReporter
...and the app dynamically adds, as necessary, subfolders to that, such as:
\\storageblade\cs\REPORTING\RoboReporter\ABUELOS
...which in turn, get their own subfolders, such as:
\\storageblade\cs\REPORTING\RoboReporter\ABUELOS\20161230_1336
...which final subfolders contain the generated spreadsheet file[s].
I am saving the dynamically created subfolders and .xlsx files like so:
String _uniqueFolder = RoboReporterConstsAndUtils.uniqueFolder;
var fromDate = _delPerfBeginDate.ToString("yyyyMMdd");
var toDate = _delPerfEndDate.ToString("yyyyMMdd");

var sharedFolder = String.Format(@"\\storageblade\cs\REPORTING\RoboReporter\{0}", _uniqueFolder);
RoboReporterConstsAndUtils.ConditionallyCreateDirectory(sharedFolder);

var filename = String.Format("{0}\\{1} - Delivery Performance - from {2} to {3}.xlsx", sharedFolder,
    _unit, fromDate, toDate);

if (File.Exists(filename))
{
    File.Delete(filename);
}

Stream stream = File.Create(filename);
package.SaveAs(stream);
stream.Close();
package.Save();
As I said, it's working fine - the folders and files are created in the shared network location.
However, the subfolders are also being created in the folder where the .exe lives. For example, on the dev machine, these end up in C:\Projects\~\bin\Debug, so that I have many subfolders such as:
C:\Projects\~\bin\Debug\ABUELOS\20161230_0908
...and on the "live" machine in the folder where I placed the .exe (as well as on the shared network area where they belong).
Here is the mysterious ConditionallyCreateDirectory() method:
internal static void ConditionallyCreateDirectory(string dirName)
{
    Directory.CreateDirectory(dirName);
}
Why? And more importantly, how can I prevent this? I don't need this doubling up of file storage - some of these files are large, and will eventually cause the machine to fail, which will cause the network to fail, which will cause the business to fail, which will cause myself to be invited away from the premises, etc.
UPDATE
In answer to OfirW's comment/question, here are, I think, the pertinent parts:
In RoboReporterConstsAndUtils:
public static string uniqueFolder = String.Empty;
. . .
internal static string GetUniqueFolder(string _unit)
{
    return uniqueFolder = String.Format("{0}\\{1}", _unit.ToUpper(), GetYYYYMMDDHHMM());
}

internal static void ConditionallyCreateDirectory(string dirName)
{
    Directory.CreateDirectory(dirName);
}
How it is referenced elsewhere:
RoboReporterConstsAndUtils.ConditionallyCreateDirectory(RoboReporterConstsAndUtils.GetUniqueFolder(_unit));
String _uniqueFolder = RoboReporterConstsAndUtils.uniqueFolder;
_unit is assigned in the constructor:
public PriceComplianceRpt_EPPlus(DateTime begin, DateTime end, String unit)
{
    _begDate = begin;
    _endDate = end;
    _unit = unit;
}
...which is called like so:
internal static void GenerateAndSavePriceComplianceReport(QueuedReports qr)
{
    var pcr = new PriceComplianceRpt_EPPlus(qr.StartDateRange, qr.EndDateRange, qr.Unit);
    pcr.GeneratePriceComplianceRpt();
}
This line is where the additional directory is being created:
RoboReporterConstsAndUtils.ConditionallyCreateDirectory(RoboReporterConstsAndUtils.GetUniqueFolder(_unit));
Why do you do that? GetUniqueFolder returns a relative path (e.g. ABUELOS\20161230_1336), so Directory.CreateDirectory resolves it against the current working directory, which is the folder where the .exe lives. That is where the extra copies come from.
I think you can change the class like this:
In RoboReporterConstsAndUtils:
public static string uniqueFolder = String.Empty;
. . .
internal static void SetUniqueFolder(string _unit)
{
    uniqueFolder = String.Format("{0}\\{1}", _unit.ToUpper(), GetYYYYMMDDHHMM());
}

internal static void ConditionallyCreateDirectory(string dirName)
{
    Directory.CreateDirectory(dirName);
}
and replace this:
RoboReporterConstsAndUtils.ConditionallyCreateDirectory(RoboReporterConstsAndUtils.GetUniqueFolder(_unit));
String _uniqueFolder = RoboReporterConstsAndUtils.uniqueFolder;
With this:
RoboReporterConstsAndUtils.SetUniqueFolder(_unit);
String _uniqueFolder = RoboReporterConstsAndUtils.uniqueFolder;
That's it, I think.
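The underlying behavior is easy to demonstrate with standard .NET alone: GetUniqueFolder returns a path with no root, and Directory.CreateDirectory resolves such paths against the current working directory, which is the .exe folder when the app is launched normally.

```csharp
using System;
using System.IO;

class RelativePathDemo
{
    static void Main()
    {
        // GetUniqueFolder returns something like "ABUELOS\20161230_1336":
        string unique = Path.Combine("ABUELOS", "20161230_1336");

        // The path has no root...
        Console.WriteLine(Path.IsPathRooted(unique));   // False

        // ...so CreateDirectory would resolve it against the working directory,
        // which is why the subfolders also appear next to the .exe:
        Console.WriteLine(Path.GetFullPath(unique));
    }
}
```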
I am using the SharpSvn DLL with Visual Studio 2010 to get the latest revision number so I can version my project with that number. I tried the piece of code below, but it gives me an error saying:
Can't determine the user's config path
I don't even understand what that means. All I want to do is provide the SVN URL and my credentials (username and password) and get the latest revision number.
Here is the code I tried so far:
using (SvnClient client = new SvnClient())
{
    //client.LoadConfiguration(Path.Combine(Path.GetTempPath(), "Svn"), true);
    Collection<SvnLogEventArgs> list;
    client.Authentication.DefaultCredentials = new NetworkCredential("john.locke", "s7y5543a!!");

    SvnLogArgs la = new SvnLogArgs();
    client.GetLog(new Uri("https://100.10.20.12/svn/P2713888/trunk/src/"), la, out list);

    string sRevisionNumber = string.Empty;
    int iRevisionNumber = 0;

    foreach (SvnLogEventArgs a in list)
    {
        if (Convert.ToInt32(a.Revision) > iRevisionNumber)
        {
            iRevisionNumber = Convert.ToInt32(a.Revision);
        }
    }

    RevisionNumber.Text = iRevisionNumber.ToString();
}
Other ways to get the revision number may also be selected as the answer.
I had this problem as well, needing to find/set properties on the SvnClient before use. Here's what I ended up using. Try this method instead of just instantiating your client object; it will auto-create a config folder if one doesn't already exist:
private SvnClient GetClient()
{
    SvnClient client = new SvnClient();

    // Note: Setting creds up here is optional
    // client.Authentication.DefaultCredentials = _creds;

    string configPath = Path.Combine(Path.GetTempPath(), "sharpsvn");

    if (!Directory.Exists(configPath))
    {
        Directory.CreateDirectory(configPath);
    }

    client.LoadConfiguration(configPath, true);

    return client;
}
Alternately, if you want to minimize the file I/O of checking whether the directory exists, you can try LoadConfiguration first and only create the directory and retry if that call fails, but just checking each time is simpler.
At any rate, you can then get the latest revision number for a location using the following code:
public long GetLatestRevisionNumber(Uri svnPath)
{
    using (SvnClient client = GetClient())
    {
        SvnInfoEventArgs info;
        client.GetInfo(svnPath, out info);

        return info.LastChangeRevision;
    }
}
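Usage might then look like this (the URL is the one from the question; RevisionNumber is the question's text control):

```csharp
long latest = GetLatestRevisionNumber(new Uri("https://100.10.20.12/svn/P2713888/trunk/src/"));
RevisionNumber.Text = latest.ToString();
```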