I have an application on my computer that acts like a cloud database server, and I'm hosting my game's data on it. However, every time a user tries to save data I get an UnauthorizedAccessException.
I'm running it as admin and I don't have any special rights set on the folder, so I have no idea what the problem is.
Here's my code:
public const string root = "D:/DATABASE/";
public static void WriteData(string playername, string type, string data)
{
if (!Directory.Exists("D:/DATABASE/" + playername))
{
Directory.CreateDirectory("D:/DATABASE/" + playername);
Directory.CreateDirectory("D:/DATABASE/" + playername + "/weapons");
}
if (type != "Weapon")
{
using (StreamWriter sw = new StreamWriter("D:/DATABASE/" + playername + "/" + type + ".sav"))
{
sw.WriteLine(data);
}
}
else
{
string[] dat = data.Split('%');
using (StreamWriter sw = new StreamWriter("D:/DATABASE/" + playername + "/weapons/" + dat[0] + ".gfa"))
{
string[] lines = dat[1].Split('#');
foreach (string cline in lines)
{
sw.WriteLine(cline);
}
}
}
}
public static string ReadLoadout(string playername)
{
string output = "";
string[] items = new string[2];
using (StreamReader sr = new StreamReader(root + playername + "/loadout.gfl"))
{
items[0] = sr.ReadLine();
items[1] = sr.ReadLine();
}
int c = 0;
foreach (string citem in items)
{
if (c > 0) output += "$";
output += citem + "%" + GetCompressedWeaponFile(playername, citem);
c++;
}
return output;
}
public static string GetCompressedWeaponFile(string playerName, string weaponName)
{
string output = "";
using (StreamReader sr = new StreamReader(root + playerName + "/weapons/" + weaponName))
{
string line = " ";
int c = 0;
while (line != null)
{
line = sr.ReadLine();
if (line != null)
{
if (c > 0) output += "#";
output += line;
}
c++;
}
}
return output;
}
public static void RegisterNewUser(string username, string password, string email)
{
string udir = root + username;
Directory.CreateDirectory(udir);
Directory.CreateDirectory(udir + "/weapons");
Directory.CreateDirectory(udir + "/loadouts");
File.WriteAllText(udir + "/password.sav", password);
File.WriteAllText(udir + "/level.sav", "1");
File.WriteAllText(udir + "/money.sav", "1000");
File.WriteAllText(udir + "/email.sav", email);
File.WriteAllText(udir + "/loadout.gfl", "");
using (StreamWriter sw = new StreamWriter(root + "emails.txt", true))
{
sw.WriteLine(email);
}
Email.Send(email, "New Account Registration", string.Format(mailTemplate, username, password));
}
public static void EditLoadout(string username, string items)
{
File.WriteAllLines(root + username + "/loadout.gfl",items.Split('#'));
}
It is difficult to provide specific help without more information. Here are a few troubleshooting suggestions:
1) Try running your code on a different machine, specifically your development computer. Do you still get the same error? If not, then there is indeed a permission problem.
2) Have you tried checking the stack trace of the exception?
When you run the application on your own computer, try using the IDE to display the exception. Yes, the problem may ultimately be in a low-level class, but you should be able to break on the error and go back in the call stack to see which method in your code is actually throwing the error.
3) Check the actual exception, even for a system-level exception.
Chances are, if you are able to debug this in the IDE, you will see property information that gives you a hint. Is it a directory method or a file-write method that throws? Check the additional properties; somewhere it may give you the text of the path it failed on (assuming it's a file issue), which could help narrow things down too.
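For example, you could temporarily wrap one of the failing calls and dump the exception details yourself. This is only a diagnostic sketch; the call and the player name are made up, everything else comes from the question's own code:
try
{
    WriteData("TestPlayer", "level", "1");   // any call that reproduces the failure
}
catch (UnauthorizedAccessException ex)
{
    // The message usually names the file or directory that was refused
    Console.WriteLine(ex.Message);
    // The stack trace shows which of your methods made the failing call
    Console.WriteLine(ex.StackTrace);
    throw;
}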
4) Add Exception handling to your code
This is a good rule of thumb, and you should really do it anyway to make a more robust program. Regardless of whose method you are calling (yours, someone else's, or a system method), you need to decide where the exception should be handled.
For example, in your code, in the RegisterNewUser() method, consider something like:
public static void RegisterNewUser(string username, string password, string email)
{
try
{
string udir = root + username;
Directory.CreateDirectory(udir);
Directory.CreateDirectory(udir + "/weapons");
Directory.CreateDirectory(udir + "/loadouts");
File.WriteAllText(udir + "/password.sav", password);
File.WriteAllText(udir + "/level.sav", "1");
File.WriteAllText(udir + "/money.sav", "1000");
File.WriteAllText(udir + "/email.sav", email);
File.WriteAllText(udir + "/loadout.gfl", "");
using (StreamWriter sw = new StreamWriter(root + "emails.txt", true))
{
sw.WriteLine(email);
}
Email.Send(email, "New Account Registration", string.Format(mailTemplate, username, password));
}
catch (Exception ex)
{
// Create a method to display or log the exception, with its own error handler
LogAndDisplayExceptions(ex);
// Send the user a message that we failed to add them. Put this in its own try-catch block
// (ideally, for readability, in its own method).
try
{
Email.Send(email, "Failed to register", "An error occurred while trying to add your account.");
}
catch (Exception exNested)
{
LogAndDisplayExceptions(exNested);
}
}
}
5) Add a "crash-and-burn" exception handler to "main"
In the method that is your "top method" (it's hard to tell from the snippet you provided, since there are a few methods that attempt to write to the disk) you could wrap your code in a try-catch block and print the exception or write it to disk.
If you are having trouble writing the exception to disk, I would suggest creating an error file first, making sure that the user account running the program can write to it, and then opening that file for APPEND in the catch block. This should make it easier to get at the error text.
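A minimal sketch of such a handler might look like this; RunServer and the log path are placeholders for whatever your real entry point and writable location are:
using System;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        try
        {
            RunServer();   // placeholder for your real startup code
        }
        catch (Exception ex)
        {
            // Open for APPEND so repeated crashes keep adding to the same file
            File.AppendAllText(@"D:\DATABASE\error.log",
                DateTime.Now + " " + ex + Environment.NewLine);
            throw;   // rethrow so the process still fails visibly
        }
    }

    static void RunServer()
    {
        // your existing server loop goes here
    }
}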
6) When all else fails, use the Debug class or Console class to write the traditional "I made it to line x."
While this will not solve your problem by itself, it should give you more insight into where your code is failing.
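For example, a couple of lines like these inside WriteData() would at least tell you how far it gets (Debug output appears in the IDE's Output window; Console.WriteLine works if you are running a console host):
System.Diagnostics.Debug.WriteLine("WriteData: creating directory for " + playername);
Console.WriteLine("WriteData: writing " + type + ".sav for " + playername);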
I'm writing a project, and the part I'm working on now is getting arrow-shaped (deeply nested) really fast. How can I remove the nested if statements but keep the same behaviour?
The code below might not look so bad yet, but I'm planning to refactor it to include more methods.
public async Task FirstDiffTestAsync()
{
string folderDir = "../../../";
string correctReportDir = folderDir + "Reports To Compare/Testing - Copy.pdf";
string OptyNumber = "122906";
//Making a POST call to generate report
string result = ReportGeneration(OptyNumber).Result;
Response reportResponse = JsonConvert.DeserializeObject<Response>(result);
string newURL = reportResponse.documentUrl;
//Logging the Response to a text file for tracking purposes
await File.WriteAllTextAsync(Context.TestRunDirectory + "/REST_Response.txt", result);
using (StreamWriter w = File.AppendText(Context.TestDir + "/../log.txt"))
{
//Checking if the Integration failed
if (reportResponse.Error == null)
{
//now we have the url, reading in the pdf reports
List<string> Files = new List<string> { correctReportDir, newURL };
List<string> parsedText = PdfToParsedText(Files);
DiffPaneModel diff = InlineDiffBuilder.Diff(parsedText[0], parsedText[1]);
// DiffReport is a customised object
DiffReport diffReport = new DiffReport(correctReportDir, newURL);
diffReport.RunDiffReport(diff);
//In-test Logging
string indent = "\n - ";
string logMsg = $"{indent}Opty Number: {OptyNumber}{indent}Activity Number: {reportResponse.ActivityNumber}{indent}File Name: {reportResponse.FileName}";
if (diffReport.totalDiff != 0)
{
await File.WriteAllTextAsync(Context.TestRunDirectory + "/DiffReport.html", diffReport.htmlDiffHeader + diffReport.htmlDiffBody);
logMsg += $"{indent}Different lines: {diffReport.insertCounter} Inserted, {diffReport.deleteCounter} Deleted";
}
LogTesting(logMsg, w);
//Writing HTML report conditionally
if (diffReport.totalDiff != 0)
{
await File.WriteAllTextAsync(Context.TestRunDirectory + "/DiffReport.html", diffReport.htmlDiffHeader + diffReport.htmlDiffBody);
}
Assert.IsTrue(diffReport.insertCounter + diffReport.deleteCounter == 0);
}
else
{
LogTesting($" Integration Failed: {reportResponse.Error}", w);
Assert.IsNull(reportResponse.Error);
}
}
}
As mentioned in the comments, the indentation level is fine for now, but it's always better to minimize nesting when possible, especially when you are repeating the same blocks of code.
The best way to do this is to move that block of code into a separate method and then call that method instead of the nested if statements.
In your case it would be something like this:
private async void checkTotalDiff(DiffReport diffReport) {
...
}
You could pass anything you need as parameters. That way, in your main code, you can replace the if statements with a call to checkTotalDiff(diffReport) and save the return value (if any) in a variable.
Also note that I used void as the return type, but you could change it depending on what the function returns.
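For instance, since the repeated block awaits a file write, a Task-returning version might look roughly like this. It is a sketch only; DiffReport, Context and the file name come from the question's code:
private async Task checkTotalDiff(DiffReport diffReport)
{
    if (diffReport.totalDiff != 0)
    {
        // Write the HTML diff report only when something actually differed
        await File.WriteAllTextAsync(Context.TestRunDirectory + "/DiffReport.html",
            diffReport.htmlDiffHeader + diffReport.htmlDiffBody);
    }
}
Each place in the test that currently writes DiffReport.html then becomes a single await checkTotalDiff(diffReport); call, while the logMsg handling can stay where it is.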
I wouldn't consider this an excessive amount of nested if statements; it is fine as is. Otherwise you could do the following (also suggested by @Caius Jard):
public async Task FirstDiffTestAsync()
{
string folderDir = "../../../";
string correctReportDir = folderDir + "Reports To Compare/Testing - Copy.pdf";
string OptyNumber = "122906";
//Making a POST call to generate report
string result = ReportGeneration(OptyNumber).Result;
Response reportResponse = JsonConvert.DeserializeObject<Response>(result);
using (StreamWriter w = File.AppendText(Context.TestDir + "/../log.txt"))
{
//Checking if the Integration failed; bail out early so the rest of the test needs no extra nesting
if (reportResponse.Error != null)
{
LogTesting($" Integration Failed: {reportResponse.Error}", w);
Assert.IsNull(reportResponse.Error);
return;
}
string newURL = reportResponse.documentUrl;
//Logging the Response to a text file for tracking purposes
await File.WriteAllTextAsync(Context.TestRunDirectory + "/REST_Response.txt", result);
//now we have the url, reading in the pdf reports
List<string> Files = new List<string> { correctReportDir, newURL };
List<string> parsedText = PdfToParsedText(Files);
DiffPaneModel diff = InlineDiffBuilder.Diff(parsedText[0], parsedText[1]);
// DiffReport is a customised object
DiffReport diffReport = new DiffReport(correctReportDir, newURL);
diffReport.RunDiffReport(diff);
//In-test Logging
string indent = "\n - ";
string logMsg = $"{indent}Opty Number: {OptyNumber}{indent}Activity Number: {reportResponse.ActivityNumber}{indent}File Name: {reportResponse.FileName}";
if (diffReport.totalDiff != 0)
{
await File.WriteAllTextAsync(Context.TestRunDirectory + "/DiffReport.html", diffReport.htmlDiffHeader + diffReport.htmlDiffBody);
logMsg += $"{indent}Different lines: {diffReport.insertCounter} Inserted, {diffReport.deleteCounter} Deleted";
}
LogTesting(logMsg, w);
//Writing HTML report conditionally
if (diffReport.totalDiff != 0)
{
await File.WriteAllTextAsync(Context.TestRunDirectory + "/DiffReport.html", diffReport.htmlDiffHeader + diffReport.htmlDiffBody);
}
Assert.IsTrue(diffReport.insertCounter + diffReport.deleteCounter == 0);
}
}
I'm getting an error from StreamWriter that the file is in use by another process, but I believe it may be down to the speed at which I'm writing the file, or more specifically the speed at which it is being opened and closed.
The code is as follows:
public static void writeLog(string msg)
{
StreamWriter log;
string currentMonth = DateTime.Now.ToString("MMM");
string currentYear = DateTime.Now.ToString("yyyy");
string directoryName = currentMonth + "-" + currentYear;
if (!Directory.Exists(@"C:\AutoSkill\LogFiles\" + directoryName + @"\"))
{
Directory.CreateDirectory(@"C:\AutoSkill\LogFiles\" + directoryName + @"\");
}
DateTime dt = DateTime.Now;
string date = dt.ToString("dd-MM-yy");
if (!File.Exists(@"C:\AutoSkill\LogFiles\" + directoryName + @"\" + date + ".txt"))
{
log = new StreamWriter(@"C:\AutoSkill\LogFiles\" + directoryName + @"\" + date + ".txt");
}
else
{
log = File.AppendText(@"C:\AutoSkill\LogFiles\" + directoryName + @"\" + date + ".txt");
}
try
{
log.WriteLine(msg);
}
catch (Exception err)
{
Console.WriteLine("There was an error writing to the log file.");
Console.WriteLine(err.Message);
}
log.Close();
}
So I'm closing the log each time I'm done writing to it, but I'm writing all of my console output to the file to keep track of what actually happened; sometimes the lines are only a few milliseconds apart if the action taken was quick or just returned null.
Am I getting this error because of the speed at which I'm writing to the file? Is there a better way to handle writing a log file?
Disregard this, I'm dumb.
I've not had this problem for the last 2 years; I'm getting the error because I'm writing to the same file from a different thread, which is where the overlap is.
The file really is in use by "another process": the same process, just a different thread.
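If you do need to keep logging from more than one thread, the simplest fix is to serialize the writes yourself. Here is a minimal sketch using the same folder and file naming as the question (it assumes using System; and using System.IO; are in scope):
private static readonly object logLock = new object();

public static void writeLog(string msg)
{
    string dir = Path.Combine(@"C:\AutoSkill\LogFiles", DateTime.Now.ToString("MMM-yyyy"));
    Directory.CreateDirectory(dir);   // no-op if the directory already exists
    string file = Path.Combine(dir, DateTime.Now.ToString("dd-MM-yy") + ".txt");

    lock (logLock)
    {
        // AppendAllText creates the file if needed, so the Exists checks go away too
        File.AppendAllText(file, msg + Environment.NewLine);
    }
}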
I am trying to write data to a log file, but nothing gets written to the file.
The aim of the program is to run a continuous loop and keep looking for a file; if the file is valid, process it and move it. I am logging any errors and the items that get created.
Also, how can I make the log file accessible while the loop is running, so that I can see the values being appended?
static void Main(string[] args)
{
var logFile = File.Create(filePath + "\\log_" + DateTime.Today.ToString("MMMM") + ".txt").ToString();
while (true)
{
try
{
var moveTo = Directory.CreateDirectory(@"" + directoryPath + "Processed_" + DateTime.Today.ToString("MMMM"));
var files = Directory.GetFiles(filePath);
var todaysDate = DateTime.Now.Date;
var firstOfMonth = new DateTime(todaysDate.Year, todaysDate.Month, 1);
var monthEnd = firstOfMonth.AddMonths(1).AddDays(-1);
if (todaysDate == monthEnd)
{
File.Move(logFile, @"" + moveToNewPath + logFile);
}
foreach (var fileName in files)
{
if (fileName.Contains("myFile.csv"))
{
var fileValues = File.ReadAllLines(filePath + fileName.Substring(44)).Skip(1).Select(v => new myFile(v)).ToList();
foreach (var i in fileValues)
{
try
{
var jsonValues = ValueFromFile(i);
var response = UploadData(url, username, password, values);
// this should be written to a log file:
File.AppendAllText(logFile, Environment.NewLine + DateTime.Now + "\t" + response);
}
catch (Exception exception)
{
File.AppendAllText(logFile, Environment.NewLine + DateTime.Now + "\t" + exception.Message.Replace("\n", " "));
}
}
File.Move(fileName, @"" + directoryPath + "\\" + moveTo + "\\" + "processedMyFile" + DateTime.Now.Date.ToString("MM-dd-yy") + ".csv");
}
}
}
catch (Exception exception)
{
File.AppendAllText(logFile, Environment.NewLine + DateTime.Now + "\t" + exception.Message.Replace("\n", " "));
}
}
}
Let's start with this line at the top of the program:
var logFile = File.Create(filePath + "\\log_" + DateTime.Today.ToString("MMMM") + ".txt").ToString();
I'm not sure what you're doing with that ToString() call hanging off the end. It almost certainly doesn't do what you think it does. But I really want to take a closer look at the documentation for the File.Create() method here. Specifically, this excerpt:
The FileStream object created by this method has a default FileShare value of None; no other process or code can access the created file until the original file handle is closed.
Uh oh. That means the File.AppendAllText() call later on will be out of luck. But let's look at the AppendAllText() documentation. Specifically this:
The method creates the file if it doesn’t exist
Meaning you can just remove the problem line at the top. You neither need nor want it. Or maybe you just want to create the file name there, like this:
var logFile = Path.Combine(filePath, "log_" + DateTime.Today.ToString("MMMM") + ".txt");
As a bonus, I'd explore changing this code to use System.Diagnostics.Trace with a TextWriterTraceListener pointed at a log file and maybe a ConsoleTraceListener attached.
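A minimal setup would be something along these lines (a sketch only; the listener file name is just an example, and fileName/ex stand in for whatever is in scope where you log):
using System.Diagnostics;

// once, at startup
Trace.Listeners.Add(new TextWriterTraceListener("processing.log"));
Trace.Listeners.Add(new ConsoleTraceListener());
Trace.AutoFlush = true;

// then anywhere in the program
Trace.TraceInformation("Moved {0}", fileName);
Trace.TraceError("Upload failed: {0}", ex.Message);
That said, here is a cleaned-up version of the original program built around a small LogMessage helper: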
static string moveToNewPath = "...";
static string filePath = "...";
static string logFormat = "\n{0:s}\t{1}";
static string logFile = "";
static string directoryPath = "...";
static void LogMessage(string Message)
{
File.AppendAllText(Path.Combine(filePath, logFile),
string.Format(logFormat, DateTime.Now, Message.Replace("\n", " ")));
}
static void Main(string[] args)
{
logFile = "log_" + DateTime.Today.ToString("MMMM") + ".txt";
//Rotate log file on last day of month
try
{
var todaysDate = DateTime.Now.Date;
var firstOfMonth = new DateTime(todaysDate.Year, todaysDate.Month, 1);
var monthEnd = firstOfMonth.AddMonths(1).AddDays(-1);
if (todaysDate == monthEnd)
{
File.Move(Path.Combine(filePath, logFile), Path.Combine(moveToNewPath, logFile));
}
}
catch (Exception ex)
{
LogMessage(ex.Message);
}
while (true)
{
// !!!!!!!!!!!!!!
//Log rotate code used to be here... but... you need something to be sure this only happens once per day.
// I STRONGLY suspect this code should be setup to run as
// a SCHEDULED TASK set to run once per day or maybe once per hour, rather than an always-on background program.
// !!!!!!!!!!!!!!!!
try
{
var moveTo = Path.Combine(directoryPath, "Processed_" + DateTime.Today.ToString("MMMM"));
Directory.CreateDirectory(moveTo);
var files = Directory.GetFiles(filePath).Where(f => f.EndsWith("myFile.csv"));
foreach (var fileName in files)
{
var fileValues = File.ReadAllLines(filePath + fileName.Substring(44)).Skip(1).Select(v => new myFile(v));
foreach (var i in fileValues)
{
try
{
var jsonValues = ValueFromFile(i);
var response = UploadData(url, username, password, jsonValues);
LogMessage(response);
}
catch (Exception ex)
{
LogMessage(ex.Message);
}
}
File.Move(fileName, Path.Combine(moveTo, "processedMyFile" + DateTime.Now.Date.ToString("MM-dd-yy") + ".csv"));
}
}
catch (Exception ex)
{
LogMessage(ex.Message);
}
}
}
I will be amazed if I find a solution for this, since it is very specific and vague, but I figured I would try. I'll give as much information as humanly possible, since I've been searching for answers for some time now.
I am building a utility in C# which copies records from a file in a library on the iSeries/AS400 and builds an encrypted text file, with each record from the AS400 written as a comma-separated string. Each line in the file has values like filename, fieldvalue1, fieldvalue2, fieldvalue3. I then take that text file to another PC and run a C# utility which copies each record into the same file name in a library on a different iSeries machine. Unfortunately, in some cases I receive an "Index was outside the bounds of the array" exception, but I cannot determine why. The record just prior to the exception looks pretty much the same, and it works fine. My code is below in a nutshell. I usually don't give up, but I don't expect to ever figure this out. If someone does, I'll probably sing karaoke tonight.
// Select records from AS400 file and write them to text file
Recordset rs = new Recordset();
sqlQuery = "SELECT * FROM " + dataLibrary + "." + fileName;
try
{
rs.Open(sqlQuery, con);
while (!rs.EOF)
{
int[] fieldLengths;
fieldLengths = new int[rs.Fields.Count];
String[] fieldValues;
fieldValues = new String[rs.Fields.Count];
String fullString = "";
for (i = 0; i < rs.Fields.Count; i++)
{
fieldLengths[i] += rs.Fields[i].DefinedSize;
fieldValues[i] += rs.Fields[i].Value;
}
fullString = fileName + "," + String.Join(",", fieldValues);
fullString = Functions.EncryptString(fullString);
File.AppendAllText(savefile.FileName, fullString + Environment.NewLine);
rs.MoveNext();
}
}
catch (Exception ex)
{
}
cmd.Dispose();
// This gives me a text file of filename, fieldvalue1, fieldvalue2, etc...
// Next, I take the file to another system and run this process:
while ((myString = inputFile.ReadLine()) != null)
{
int stringLength = myString.Length;
String[] valuesArray = myString.Split(',');
for (i = 0; i < valuesArray.Length; i++)
{
if (i == 0)
{
fileName = valuesArray[0];
// Create file if it doesn't exist already
createPhysicalFile(newLibrary, fileName);
SQLStatement = "INSERT INTO " + newLibrary + "." + fileName + "VALUES(";
}
else
{
if (i == valuesArray.Length - 1)
{
SQLStatement += "#VAL" + i + ")";
}
else
{
SQLStatement += "#VAL" + i + ", ";
}
}
}
try
{
using (connection)
{
try
{
connection.Open();
}
catch (Exception ex)
{
}
// Create a new SQL command
iDB2Command command = new iDB2Command(SQLStatement, connection);
for (i = 1; i < valuesArray.Length; i++)
{
try
{
command.Parameters.AddWithValue("@VAL" + i, (valuesArray[i]));
}
catch (Exception ex)
{
}
}
// Just split the array into a string to visually check
// differences in the records
String arraySplit = ConvertStringArrayToString(valuesArray);
// The query gets executed here. The command looks something
// like:
// INSERT INTO LIBNAME.FILENAME VALUES(@VAL1, @VAL2, @VAL3, @VAL4)
// There are actually 320 fields in the file I'm having a problem with,
// so it's possible I'm overlooking something. I have narrowed it down to
// field # 316 when the exception occurs, but in both cases
// field 316 is blanks (when it works and when it doesn't).
command.ExecuteNonQuery();
}
}
catch (Exception ex)
{
// Here I get the exception out of bounds error in MSCORLIB.DLL.
// Some records are added fine, while others cause this exception.
// I cannot visibly tell any major differences, nor do I see any
// errors in the AS400 job log or anything in C# that would lead me
// down a certain path.
String error = ex.Message;
}
}
For what it's worth, I found this happening on a smaller file in the system and was able to figure out what was going on, after painstaking research into the code and on the net. Basically, the file has numeric fields on the iSeries. Somehow, the records were written to the file on the original system with null values in the numeric fields instead of numeric values. When storing the original records, I had to add this check:
String fieldType = rs.Fields[i].Type.ToString();
object objValue = rs.Fields[i].Value;
if (fieldType == "adNumeric" && objValue is DBNull)
{
fieldValues[i] += "0";
}
else
{
fieldValues[i] += rs.Fields[i].Value;
}
After this, if a null value was found in one of the numeric fields, it just put "0" in its place, so that when writing to the new machine a valid numeric character went in and the rest of the values could be written. Thanks for all the advice and moral support. :)
Struggling with a C# Script Component in SSIS. What I am trying to do is take a column from my input source that is ntext and delimited with pipes, and write the resulting array to a text file. When I run my component my output looks like this:
DealerID,StockNumber,Option
161552,P1427,Microsoft.SqlServer.Dts.Pipeline.BlobColumn
I've been working with the GetBlobData method and I'm struggling with it. Any help will be greatly appreciated! Here is the full script:
public override void Input0_ProcessInputRow(Input0Buffer Row)
{
string vehicleoptionsdelimited = Row.Options.ToString();
//string OptionBlob = Row.Options.GetBlobData(int ;
//string vehicleoptionsdelimited = System.Text.Encoding.GetEncoding(Row.Options.ColumnInfo.CodePage).GetChars(OptionBlob);
string[] option = vehicleoptionsdelimited.Split('|');
string path = @"C:\Users\User\Desktop\Local_DS_CSVs\";
string[] headerline =
{
"DealerID" + "," + "StockNumber" + "," + "Option"
};
System.IO.File.WriteAllLines(path + "OptionInput.txt", headerline);
using (System.IO.StreamWriter file = new System.IO.StreamWriter(path + "OptionInput.txt", true))
{
foreach (string s in option)
{
file.WriteLine(Row.DealerID.ToString() + "," + Row.StockNumber.ToString() + "," + s);
}
}
}
Try using BlobToString(Row.Options) with this helper function:
private string BlobToString(BlobColumn blob)
{
string result = "";
try
{
if (blob != null)
{
result = System.Text.Encoding.Unicode.GetString(blob.GetBlobData(0, Convert.ToInt32(blob.Length)));
}
}
catch (Exception ex)
{
result = ex.Message;
}
return result;
}
Adapted from:
http://mscrmtech.com/201001257/converting-microsoftsqlserverdtspipelineblobcolumn-to-string-in-ssis-using-c
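In the script above, the helper would replace the ToString() line, something like:
string vehicleoptionsdelimited = BlobToString(Row.Options);
string[] option = vehicleoptionsdelimited.Split('|');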
Another very easy solution to this problem, because it is a total PITA, is to route the error output to a Derived Column component and cast your blob data to a STR or WSTR as a new column.
Route the output of that to your script component and the data will come in as an additional column on the pipeline ready for you to parse.
This will probably only work if your data is less than 8000 characters long.