How to store a file on the server - ASP.NET / C#

Ok, so I am trying to keep an error log file for a website running on ASP.NET and C#.
I tried storing it to a drive with a regular path.
I expected that because it is C# code (in the code-behind page) it would pick up the server and store the file there. (In the end it is code that gets executed on the server.)
Unfortunately for me, that's not true. It gets saved on the local machine from which the client opens the site.
How can you force it to be saved on the server machine?
I want all of the error logs in one place.
Any suggestions on reading material or on how to do this?
string path = "C:\\WebSite";
string error = "error";
try
{
    // check if the dir exists; if it does, I add the file name to the path
    // (I removed that code to keep this simple)
    path += "\\logs.txt";
    FileStream log_fs =
        new FileStream(path, FileMode.Append, FileAccess.Write);
    StreamWriter log_sw =
        new StreamWriter(log_fs);
    string log = String.Empty;
    // format the log string using the error message
    log_sw.Write(log);
    log_sw.Close();
    log_fs.Close();
}
catch (Exception ex)
{
    // here I e-mail the error in case logging failed
}
This code will generate the file on the local machine instead of on the server.

You can log to any location on the web server (that the ASP.NET worker-process account has permission to write to) with code such as this:
string log = String.Empty;
// format log string using error message
try
{
    using (StreamWriter logFile = new StreamWriter(@"C:\WebSite\logs.txt", true))
    {
        logFile.WriteLine(log);
    }
}
catch
{
    // here I e-mail the error in case logging failed
}
This will write the file to the web server hosting the website. Now, if you're running the website locally during development and you're using either the built-in VS webserver or IIS, then of course the file will be created on your machine when testing the website locally.
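One way to keep the log tied to the application root on whatever machine actually executes the code is to resolve the path from Server.MapPath instead of hard-coding a drive letter. The sketch below is not the poster's code; the App_Data folder and the Append helper are my own choices for illustration:

```csharp
using System;
using System.IO;

public static class ErrorLogger
{
    // appRoot should come from Server.MapPath("~") in the page, so it
    // always points at the web root of the machine executing the code.
    public static string Append(string appRoot, string message)
    {
        string path = Path.Combine(appRoot, "App_Data", "logs.txt");
        Directory.CreateDirectory(Path.GetDirectoryName(path));
        File.AppendAllText(path,
            DateTime.UtcNow.ToString("o") + " " + message + Environment.NewLine);
        return path;
    }
}
```

In a code-behind you would call ErrorLogger.Append(Server.MapPath("~"), "something failed"). App_Data is a convenient target because IIS refuses to serve its contents directly to browsers.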

To answer the suggestion portion of your question: have you considered using ELMAH? It can log errors to several different stores:
Microsoft SQL Server
Oracle (OracleErrorLog)
SQLite (version 3) database file
Microsoft Access (AccessErrorLog)
VistaDB (VistaDBErrorLog)
Loose XML files
RAM (in-memory)
or send them via e-mail or Twitter.
ELMAH is very easy to set up and very effective.
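For reference, a minimal web.config wiring for ELMAH looks roughly like this (my sketch of the classic setup, logging loose XML files to App_Data; check the ELMAH documentation for the exact version you install):

```xml
<configuration>
  <configSections>
    <sectionGroup name="elmah">
      <section name="errorLog" requirePermission="false"
               type="Elmah.ErrorLogSectionHandler, Elmah" />
    </sectionGroup>
  </configSections>
  <elmah>
    <!-- swap XmlFileErrorLog for SqlErrorLog, AccessErrorLog, etc. -->
    <errorLog type="Elmah.XmlFileErrorLog, Elmah" logPath="~/App_Data" />
  </elmah>
  <system.webServer>
    <modules>
      <add name="ErrorLog" type="Elmah.ErrorLogModule, Elmah" />
    </modules>
    <handlers>
      <add name="Elmah" path="elmah.axd" verb="POST,GET,HEAD"
           type="Elmah.ErrorLogPageFactory, Elmah" />
    </handlers>
  </system.webServer>
</configuration>
```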

Related

Network share file access debugging with OS hardening

Is there a way in C# to see what credentials are used to access a file on a network share?
I'm trying to debug a scenario where most of CIS Windows L1 hardening settings are applied to a system. This system only has local users and one of them (a member of local Administrators group) is used to run a C# app (.NET Framework 4.7.2) that accesses a network share, say \\myserver\some\path\foo.xml. Without hardening, it all works fine, even though when I have a look at who can access the network share, the local user is not listed.
With hardening applied, it does not work. More concretely: in File Explorer, when trying to access the foo.xml, a CredUI prompt for credentials appears. I have to submit my own credentials (AD user) to get access.
What the app does is essentially
string[] possiblePaths = ... // load possible network paths of the desired file
foreach (string possiblePath in possiblePaths)
{
    if (File.Exists(possiblePath))
    {
        return possiblePath;
    }
}
I figured out how to deal with the SeTcbPrivilege policy violation (that gave me errors in the Event Log), but after that, I am lost.
To debug, I have created a test app:
string path = @"\\myserver\some\path\foo.xml";
if (File.Exists(path))
{
    Console.WriteLine($"Path {path} exists!");
}
else
{
    Console.WriteLine($"Path {path} does not exist!");
}
try
{
    var fh = File.Open(path, FileMode.Open);
    fh.Close();
}
catch (Exception ex)
{
    Console.WriteLine(ex);
    Console.WriteLine(ex.Message);
    if (ex.InnerException != null)
    {
        Console.WriteLine(ex.InnerException.Message);
    }
}
When I restart the system and try to run the app without providing my credentials to File Explorer, it says "The path does not exist" and I get an IOException saying the network path does not exist. However, when I submit my own credentials (not the local user, but mine, as an AD user) in File Explorer to access that location, the output of the test app says "The path exists" along with UnauthorizedAccessException: Access to the path ... is denied. Which is weird, as I don't run the app as me, but as the local user.
I know I can use trial-error (and probably will) to identify the policy messing it up, but I would like to constrain it as much as possible - and also to understand why exactly it is causing trouble (maybe there used to be some anonymous/guest access on the share that is not permitted now on the system?).
To that extent, I would very much welcome any nudge as to how to see what credentials a C# app is using to authenticate elsewhere, how to get more debug info, or how File.Exists()/File.Open() work under the hood.
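For the "what credentials" part, one starting point: SMB presents the process token (or the thread's impersonation token, if any) to the file server, so printing the effective identity from inside the app narrows things down. A minimal sketch, not from the original post; it uses the portable Environment properties, and on Windows System.Security.Principal.WindowsIdentity.GetCurrent() additionally exposes AuthenticationType and ImpersonationLevel:

```csharp
using System;

public static class IdentityProbe
{
    // Describes the account this process runs as; a remote SMB server
    // sees this account unless the thread impersonates another one.
    public static string Describe()
    {
        return Environment.UserDomainName + "\\" + Environment.UserName;
    }
}
```

Dropping a Console.WriteLine(IdentityProbe.Describe()) next to the File.Exists call shows whether the app is really running as the local user you expect.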
Thanks!

How to Connect to a Remote Database

I am having some trouble connecting to an Access database which sits on a remote computer:
The Microsoft Access database engine cannot open or write to the file '\\ACCESSSERVER-PC\Warehouse Manager\Error Logging\Error Logging.accdb'. It is already opened exclusively by another user, or you need permission to view and write its data.
The database is not open for editing by anyone else and I should be able to connect to it.
I am using this connection string:
Provider=Microsoft.ACE.OLEDB.12.0;Data Source=\\ACCESSSERVER-PC\Warehouse Manager\Error Logging\Error Logging.accdb;Persist Security Info=False;
And the error occurs when calling conn.Open();
using (var conn = new OleDbConnection(ConnectionString))
{
    try
    {
        conn.Open();
    }
    catch (OleDbException)
    {
        // error-handling code trimmed for brevity
    }
}
I gave myself full permissions on the server as well as on the .accdb file itself, and did the same for the NETWORK SERVICE account as per this post, but still no luck.
I don't have a problem connecting to databases which are stored locally; this only seems to happen when trying to connect over the network.
Has anyone else experienced this and found a solution? Any help or advice is much appreciated.
I have checked this answer and can confirm that I have all the required permissions to access the file.
That question is marked as a duplicate of this question, but I don't have any programs running which would have a stream open to the file.
It turns out this was being caused by trying to work with some terrible IT infrastructure.
The issue is that \\ACCESSSERVER-PC is actually a Windows 7 machine, and client editions of Windows can only accept a limited number of inbound connections.
Running the IsLocked code from this answer against the database file gave me a much more useful error message:
No more connections can be made to this remote computer at this time because there are already as many connections as the computer can accept.
After getting some users to disconnect their Access runtime and mapped drives from the server and running the code again, IsLocked returned false and I was then able to connect to the database.
Here is the code I used to help me find the solution to this:
protected virtual bool IsFileLocked(FileInfo file)
{
    FileStream stream = null;
    try
    {
        stream = file.Open(FileMode.Open, FileAccess.Read, FileShare.None);
    }
    catch (IOException)
    {
        // the file is unavailable because it is:
        //   still being written to,
        //   or being processed by another thread,
        //   or does not exist (has already been processed)
        //
        // ** OR there are already too many connections open to the
        // ** "server" where the file exists!
        return true;
    }
    finally
    {
        if (stream != null)
            stream.Close();
    }
    // file is not locked
    return false;
}
The moral of the story is: don't use (or work in an environment where they use) a Windows 7 machine as a server when you need 20+ concurrent connections to it!

Converting docx to html with dotnet-mammoth fails at deploy server

I'm using dotnet-mammoth (mammoth.js with edge.js) to convert a docx document to html in .net
I added it to my project via its nuget package.
I'm using the code provided by the sample, which works correctly in my development environment (running IIS Express):
var documentConverter = new Mammoth.DocumentConverter();
var result = documentConverter.ConvertToHtml(Server.MapPath("~/files/document.docx")); // problem here in the production environment
string theResult = result.Value;
However, once I deploy it to the production server, when execution reaches the documentConverter.ConvertToHtml() call, it redirects me to the login page, without displaying any error message and without writing anything to the IIS log file.
If I remove that line, everything else executes normally.
I assume it could be an issue related to permissions, but I don't know what it could be. Any ideas?
The latest version of Mammoth on NuGet no longer uses edge.js, and is now just .NET code, so should work more reliably.
You can resolve this by getting the exact error raised when the process tries to read the file. Below is the code from dotnet-mammoth's DocumentConverter.cs; as shown, the call reads all of the file's bytes to send to edge.js:
public Result<string> ConvertToHtml(string path)
{
    var mammothJs = ReadResource("Mammoth.mammoth.browser.js") + ReadResource("Mammoth.mammoth.edge.js");
    var f = Edge.Func(mammothJs);
    var result = f(File.ReadAllBytes(path));
    Task.WaitAll(result);
    return ReadResult(result.Result);
}
I suppose you are giving an absolute path as the input. In that case the absolute path must be accessible by the identity hosting the app pool of the web application.
If the path specified is inside the web root directory (not advised), you can use Server.MapPath to resolve it.
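A quick way to turn the silent failure into a concrete message is to verify, before calling ConvertToHtml, that the app-pool identity can actually open the file. The helper below is my own sketch (the class and method names are hypothetical); its result can be written to the response or a log:

```csharp
using System;
using System.IO;

public static class FileAccessCheck
{
    // Returns "ok" if the current identity can open the file for
    // reading, otherwise a short description of what went wrong.
    public static string CheckReadable(string path)
    {
        if (!File.Exists(path))
            return "missing: " + path;
        try
        {
            using (File.OpenRead(path)) { }
            return "ok";
        }
        catch (UnauthorizedAccessException)
        {
            return "no read permission: " + path;
        }
        catch (IOException ex)
        {
            return "io error: " + ex.Message;
        }
    }
}
```

Note that "missing" can also mean the account lacks permission to list the containing directory, so a missing-file result is still worth checking against the app-pool identity's rights.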

Batch file not working in server but working local

I created a batch file in my ASP.NET web site. It works on the local system without any problem, but after I uploaded it to my web site's server (a shared server), it does not work in the remote environment.
Here is a screenshot of the error I'm receiving
This is the code I'm using to test run the batch on the server.
string path = Server.MapPath("~/SourceCode/Jsil");
Response.Write(path + "<br>");
string batpath = Server.MapPath("~/Dir.bat");
string framework = Server.MapPath("~/SourceCode/v4.0.30319");
string vscompiler = @"\csc.exe /t:exe /r:JSIL.dll;JSIL.Meta.dll Program.cs";
string full = framework + vscompiler;
Response.Write(full);
StreamWriter file = new StreamWriter(batpath);
file.WriteLine("G:");
file.WriteLine("cd " + path);
file.Write(full);
file.Close();
// Execute the bat file
System.Diagnostics.Process compile = new System.Diagnostics.Process();
compile.StartInfo = new ProcessStartInfo(batpath);
compile.Start();
compile.Close();
This might be a simple problem of privileges: the application user might not have execute permission on the server box. You need to contact your system administrator and ask them to grant execute permission to the IIS app-pool user on the server.
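One concrete way to see what is failing on the server is to capture the batch file's output and exit code instead of starting it fire-and-forget. This is my own sketch, not the original code; redirecting the streams requires UseShellExecute = false:

```csharp
using System;
using System.Diagnostics;

public static class BatchRunner
{
    // Runs a program and captures its exit code plus both output
    // streams, so server-side failures (access denied, missing
    // csc.exe, a bad drive letter) show up somewhere readable.
    public static int Run(string fileName, string args, out string output, out string error)
    {
        var psi = new ProcessStartInfo(fileName, args)
        {
            UseShellExecute = false,        // required to redirect streams
            RedirectStandardOutput = true,
            RedirectStandardError = true,
            CreateNoWindow = true
        };
        using (var p = Process.Start(psi))
        {
            output = p.StandardOutput.ReadToEnd();
            error = p.StandardError.ReadToEnd();
            p.WaitForExit();
            return p.ExitCode;
        }
    }
}
```

Writing the captured error text to the response (or a log) usually pinpoints whether the problem is permissions, a missing executable, or the hard-coded G: drive.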

Using Smo.Backup to backup SQL Server database to string

I'm trying to make a little app that would help me with my server backups. The app would run on my home PC, so the main goal is to be able to connect to the external server, back up the selected database, and dump the backup content to a string or something so I could write it to my PC's disk and not the server's disk.
I did that; the code below works for writing to the server's disk, but I'd like to be able to write the backup's result to my PC's disk.
private bool BackupDatabase()
{
    try
    {
        // Filename
        string sFileName = string.Format("{0}\\{1}.bak", _sWhereToBackup, DatabaseName);
        // Connection
        string sConnectionString = String.Format(
            "Data Source=tcp:{0};Initial Catalog={1};User ID={2};Pwd={3};",
            DatabaseHost, DatabaseName, DatabaseUsername, DatabasePassword);
        SqlConnection oSqlConnection = new SqlConnection(sConnectionString);
        Server oServer = new Server(new ServerConnection(oSqlConnection));
        // Backup
        Backup backup = new Backup();
        backup.Action = BackupActionType.Database;
        backup.Database = DatabaseName;
        backup.Incremental = false;
        backup.Initialize = true;
        backup.LogTruncation = BackupTruncateLogType.Truncate;
        // Backup Device
        BackupDeviceItem backupItemDevice = new BackupDeviceItem(sFileName, DeviceType.File);
        backup.Devices.Add(backupItemDevice);
        // Start Backup
        backup.SqlBackup(oServer);
    }
    catch (Exception)
    {
        throw; // rethrow without resetting the stack trace
    }
    return true; // report success
}
Thanks so much!
This is going to get a bit hacky, because you need to do one of the following:
1. Call SQL functions to read the file on the server and return it to you as a binary array, then convert it back to a file. This will require the account you are running under to have permission to access the file on the server's drive. You can use T-SQL or a bit more 'advanced' .NET code. The T-SQL approach is shown in this great SQL injection guide: http://www.blackhat.com/presentations/bh-europe-09/Guimaraes/Blackhat-europe-09-Damele-SQLInjection-slides.pdf and the .NET approach here: http://www.mssqltips.com/sqlservertip/2349/read-and-write-binary-files-with-the-sql-server-clr/
2. Map a file location (i.e. access a share/drive/IPC connection) over the network.
3. Have the server FTP the files to a location.
Which sounds likely in your scenario?
I take it that you're not networked to the server? Ideally you could back it up directly to your machine using a network address, but I'm guessing you would have already thought of that.
Going the route you're thinking of is going to require permissions that you would not normally want to open up to someone over the internet, and it will be much more trouble to program than simply setting up a second process to move the file somewhere accessible to you.
Adam's suggestion to have the server FTP the files is a good one. Alternatively, you may find that using Dropbox (www.dropbox.com) is the path of least resistance. Then you could just back the database up to a folder managed by Dropbox, and it will automatically be copied to any folder on your machine that you set up to synchronize with it.
You should look into DAC; it is mainly made for SQL Server in Azure, but will work with SQL Server 2008 R2 too.
http://sqldacexamples.codeplex.com/
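If the home PC can expose a writable share, the SMO Backup object from the question can also target a UNC path directly; note it is the SQL Server service account (not the login in the connection string) that must have write access to the share. A small helper to build that device path, with hypothetical host/share names:

```csharp
using System;

public static class BackupPaths
{
    // Builds the UNC file name for a BackupDeviceItem, e.g.
    // new BackupDeviceItem(UncBackupFile("MYPC", "Backups", "Shop"), DeviceType.File)
    public static string UncBackupFile(string host, string share, string databaseName)
    {
        return string.Format(@"\\{0}\{1}\{2}.bak", host, share, databaseName);
    }
}
```

Passing the result to backup.Devices.Add(...) in place of sFileName makes the server write the .bak straight onto the client machine's share.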
