Getting files to be committed - code problems - C#

BACKGROUND:
What I am trying to do is output a list of files which are unversioned or have had changes made to them and need to be committed.
WHAT I'VE TRIED:
I am currently using the code below; the code runs, but nothing is output to the console. The catch block isn't hit either, as the message box doesn't appear.
using (SvnClient client = new SvnClient())
{
    try
    {
        EventHandler<SvnStatusEventArgs> statusHandler = new EventHandler<SvnStatusEventArgs>(HandleStatusEvent);
        client.Status(Properties.Settings.Default.LocalFolderPath + @"\" + project, statusHandler);
    }
    catch
    {
        MessageBox.Show("ERROR");
    }
}
private void HandleStatusEvent(object sender, SvnStatusEventArgs args)
{
    switch (args.LocalContentStatus)
    {
        case SvnStatus.NotVersioned: // Handle appropriately
            Console.WriteLine(args.ChangeList);
            break;
    }
    // review other properties of 'args'
}
I'm not quite sure if this is the right code to get the list of files which need to be committed, as the documentation is poor. I've looked on this site and have found a few other methods (similar to this one), but I still can't get it working. Can anyone help?
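For what it's worth, here is a minimal sketch of one way to collect the paths of modified and unversioned files using SharpSvn's synchronous GetStatus overload instead of the event-based Status call. The working-copy path is an assumption, and this is not presented as the asker's final solution:

using System;
using System.Collections.ObjectModel;
using SharpSvn;

class WorkingCopyStatus
{
    static void Main()
    {
        string workingCopy = @"C:\work\MyProject"; // assumed path - point this at your checkout

        using (SvnClient client = new SvnClient())
        {
            Collection<SvnStatusEventArgs> statuses;
            // GetStatus fills the collection synchronously instead of raising events
            if (client.GetStatus(workingCopy, out statuses))
            {
                foreach (SvnStatusEventArgs entry in statuses)
                {
                    // NotVersioned = new files, Modified = edited files; both need committing
                    if (entry.LocalContentStatus == SvnStatus.NotVersioned ||
                        entry.LocalContentStatus == SvnStatus.Modified)
                    {
                        Console.WriteLine("{0}: {1}", entry.LocalContentStatus, entry.FullPath);
                    }
                }
            }
        }
    }
}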

Related

Why does my file sometimes disappear in the process of reading from it or writing to it?

I have an app that reads from text files to determine which reports should be generated. It works as it should most of the time, but once in a while the program deletes one of the text files it reads from/writes to. Then an exception is thrown ("Could not find file") and progress ceases.
Here is some pertinent code.
First, reading from the file:
List<String> delPerfRecords = ReadFileContents(DelPerfFile);
. . .
private static List<String> ReadFileContents(string fileName)
{
    List<String> fileContents = new List<string>();
    try
    {
        fileContents = File.ReadAllLines(fileName).ToList();
    }
    catch (Exception ex)
    {
        RoboReporterConstsAndUtils.HandleException(ex);
    }
    return fileContents;
}
Then, writing to the file -- it marks the record/line in that file as having been processed, so that the same report is not re-generated the next time the file is examined:
MarkAsProcessed(DelPerfFile, qrRecord);
. . .
private static void MarkAsProcessed(string fileToUpdate, string qrRecord)
{
    try
    {
        var fileContents = File.ReadAllLines(fileToUpdate).ToList();
        for (int i = 0; i < fileContents.Count; i++)
        {
            if (fileContents[i] == qrRecord)
            {
                fileContents[i] = string.Format("{0}{1} {2}",
                    qrRecord, RoboReporterConstsAndUtils.COMPLETED_FLAG, DateTime.Now);
            }
        }
        // Will this automatically overwrite the existing?
        File.Delete(fileToUpdate);
        File.WriteAllLines(fileToUpdate, fileContents);
    }
    catch (Exception ex)
    {
        RoboReporterConstsAndUtils.HandleException(ex);
    }
}
So I do delete the file, but immediately replace it:
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
The files being read have contents such as this:
Opas,20170110,20161127,20161231-COMPLETED 1/10/2017 12:33:27 AM
Opas,20170209,20170101,20170128-COMPLETED 2/9/2017 11:26:04 AM
Opas,20170309,20170129,20170225-COMPLETED
Opas,20170409,20170226,20170401
If "-COMPLETED" appears at the end of the record/row/line, it is ignored - will not be processed.
Also, if the second element (at index 1) is a date in the future, it will not be processed (yet).
So, for these examples shown above, the first three have already been done, and will be subsequently ignored. The fourth one will not be acted on until on or after April 9th, 2017 (at which time the data within the data range of the last two dates will be retrieved).
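As a reading aid only, here is a small hypothetical helper expressing those two rules. The real code uses ConvertCRVRecordToQueuedReport (not shown), so the field layout assumed here is inferred from the sample lines above:

using System;
using System.Globalization;

static class RecordRules
{
    // Hypothetical helper mirroring the rules described above.
    public static bool ShouldProcess(string record, DateTime today)
    {
        // Lines already marked with "-COMPLETED ..." are ignored
        if (record.Contains("-COMPLETED"))
            return false;

        // Assumed layout: Unit,DateToGenerate,RangeStart,RangeEnd (dates as yyyyMMdd)
        string[] fields = record.Split(',');
        DateTime dateToGenerate = DateTime.ParseExact(
            fields[1], "yyyyMMdd", CultureInfo.InvariantCulture);

        // Future-dated records are skipped until they come due
        return dateToGenerate <= today;
    }
}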
Why is the file sometimes deleted? What can I do to prevent it from ever happening?
If helpful, in more context, the logic is like so:
internal static string GenerateAndSaveDelPerfReports()
{
    string allUnitsProcessed = String.Empty;
    bool success = false;
    try
    {
        List<String> delPerfRecords = ReadFileContents(DelPerfFile);
        List<QueuedReports> qrList = new List<QueuedReports>();
        foreach (string qrRecord in delPerfRecords)
        {
            var qr = ConvertCRVRecordToQueuedReport(qrRecord);
            // Rows that have already been processed return null
            if (null == qr) continue;
            // If the report has not yet been run, and it is due, add it to the list
            if (qr.DateToGenerate <= DateTime.Today)
            {
                var unit = qr.Unit;
                qrList.Add(qr);
                MarkAsProcessed(DelPerfFile, qrRecord);
                if (String.IsNullOrWhiteSpace(allUnitsProcessed))
                {
                    allUnitsProcessed = unit;
                }
                else if (!allUnitsProcessed.Contains(unit))
                {
                    allUnitsProcessed = allUnitsProcessed + " and " + unit;
                }
            }
        }
        foreach (QueuedReports qrs in qrList)
        {
            GenerateAndSaveDelPerfReport(qrs);
            success = true;
        }
    }
    catch
    {
        success = false;
    }
    if (success)
    {
        return String.Format("Delivery Performance report[s] generated for {0} by RoboReporter2017", allUnitsProcessed);
    }
    return String.Empty;
}
How can I ironclad this code to prevent the files from being periodically trashed?
UPDATE
I can't really test this, because the problem occurs so infrequently, but I wonder if adding a "pause" between the File.Delete() and the File.WriteAllLines() would solve the problem?
UPDATE 2
I'm not absolutely sure what the answer to my question is, so I won't add this as an answer, but my guess is that the File.Delete() and File.WriteAllLines() were occurring too close together and so the delete was sometimes occurring on both the old and the new copy of the file.
If so, a pause between the two calls may have solved the problem 99.42% of the time, but from what I found here, it seems the File.Delete() is redundant/superfluous anyway, and so I tested with the File.Delete() commented out, and it worked fine; so, I'm just doing without that occasionally problematic call now. I expect that to solve the issue.
// Will this automatically overwrite the existing?
File.Delete(fileToUpdate);
File.WriteAllLines(fileToUpdate, fileContents);
I would simply add an extra parameter to WriteAllLines() (which could default to false) to tell the function to open the file in overwrite mode, and not call File.Delete() at all then.
Do you currently check the return value of the file open?
Update: OK, it looks like WriteAllLines() is a .NET Framework function and therefore cannot be changed, so I deleted this answer. However, this now shows up in the comments as a proposed solution from another forum:
"just use something like File.WriteAllText where if the file exists,
the data is just overwritten, if the file does not exist it will be
created."
And this was exactly what I meant (while thinking WriteAllLines() was a user-defined function), because I've had similar problems in the past.
So, a solution like that could solve some tricky problems (instead of deleting and quickly reopening, just overwrite the file) - it is also less work for the OS, and possibly less file/disk fragmentation.
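To illustrate the accepted suggestion: File.WriteAllLines (like File.WriteAllText) creates the file if it does not exist and overwrites it if it does, so the Delete call can simply be dropped. A minimal sketch, with the file path assumed:

using System.Collections.Generic;
using System.IO;

class OverwriteExample
{
    static void Main()
    {
        string fileToUpdate = @"C:\temp\DelPerf.txt"; // assumed path

        var fileContents = new List<string>(File.ReadAllLines(fileToUpdate));
        fileContents[0] = fileContents[0] + "-COMPLETED " + System.DateTime.Now;

        // No File.Delete needed: WriteAllLines truncates and rewrites the existing file,
        // or creates it if it is missing.
        File.WriteAllLines(fileToUpdate, fileContents);
    }
}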

Why is the email not sending even though the path is working?

This is a piece of code for generating mail, which works until I attach the path as a parameter. The thing is, if I attach the path it doesn't throw any error (no logs); the page just becomes unresponsive, and the debugger doesn't even jump to the next line.
Any help will help me understand my mistake. Thanks.
public ActionResult Mailsending(string list)
{
    try
    {
        string strIdeas = string.Empty;
        string Certpath = System.Configuration.ConfigurationManager.AppSettings["UploadPath"];
        List<string> pramAttachment = new List<string>();
        pramAttachment.Add(Server.MapPath(Certpath) + "MyPdf.pdf"); // path of the generated PDF
        Submitidlist = new CommonBL().GetSubmiidListForGenerateMail();
        new CommonBL().UpdateIsGenerateStatus(ideaidlist, UserID);
        foreach (var item in ideaidlist)
        {
            strIdeas = strIdeas + item.ToString() + ",";
        }
        GenerateMyPDF(list); // the PDF is generated here
        string path = GenerateMail(strIdeas.TrimEnd(','));
        if (path != string.Empty)
        {
            new CommonBL().AddGenerateImagePath(path, UserId);
            // the attachment path is passed here; after this call the debugger never leaves this scope
            new MailSender().SendMail((int)eMailType.GenerateMail, null, pramAttachment);
        }
        return Json("Mail generated Successfully."); // this message is never shown
    }
    catch (Exception ex)
    {
        return Json("Error");
    }
}
Edit:
public class MailSender : IDisposable
{
    public bool SendMail(short mailId, List<KeyValuePair<string, string>> parameters, List<string> attachmentsPath = null);
}
Possibly something is still holding a lock on the generated PDF, so MailSender is not able to access it due to that exclusive lock. Can you send emails with files that were generated previously?
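If that is the cause, one hypothetical way to check it (not from the original thread) is to try opening the generated PDF with a shared read lock just before attaching it; if this fails, something such as an undisposed PDF writer still has the file open:

using System.IO;

static class AttachmentCheck
{
    // Hypothetical helper: returns true if the file can currently be opened for reading.
    public static bool CanBeAttached(string attachmentPath)
    {
        try
        {
            using (var fs = new FileStream(attachmentPath, FileMode.Open,
                                           FileAccess.Read, FileShare.Read))
            {
                return fs.Length > 0; // also catches an empty, half-written file
            }
        }
        catch (IOException)
        {
            return false; // still locked by whatever generated it
        }
    }
}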
Adding a point which apparently is also an answer to this question:
After debugging the whole code, I found that my SMTP server was not allowing me to send mail, so even though the above code is right, it just sits there processing.
So anyone working with the above code should find it works fine.
An update: it now works fine after configuring my mail service from the control panel. So anyone who wants to take reference from this can go ahead; the code is fine.

commit files not working properly

I'm trying to commit changes to the project to my SVN server, but it doesn't seem to work.
The Test:
I run the code below against a folder where nothing has been changed and, as expected, nothing happens. Then I create a new folder and run the code again; this time nothing seems to happen either. The code runs and returns without the error box, as if it worked.
public bool svnCommitProject(String project, String message)
{
    using (SvnClient client = new SvnClient())
    {
        try
        {
            SvnCommitArgs args = new SvnCommitArgs();
            args.LogMessage = message;
            client.Authentication.ForceCredentials(Properties.Settings.Default.Username, Properties.Settings.Default.Password);
            return client.Commit(Properties.Settings.Default.LocalFolderPath + Properties.Settings.Default.Username + @"\" + project, args);
        }
        catch
        {
            MessageBox.Show("ERROR");
            return false;
        }
    }
}
Suspected Problem:
From looking at this and Google, I suspect that the problem exists because the file hasn't been "added" to SVN control, but I'm not sure.
Is this the case? And if so, how would I go about adding the files which need to be added? I also assume that something similar would be needed for files which are deleted/modified. Is this correct, and how would I add that in too?
See Find files not added to subversion
Yes, files just dropped into the local working directory are not automatically scheduled for commit; they have to be added to SVN first.
Collection<SvnStatusEventArgs> filesStatuses;
if (!client.GetStatus(localDir, new SvnStatusArgs
    {
        Depth = SvnDepth.Infinity,
        RetrieveRemoteStatus = true,
        RetrieveAllEntries = true
    }, out filesStatuses))
{
    throw new SvnOperationCanceledException("SvnClient.GetStatus didn't return anything.");
}

// Schedule new files for addition and missing files for deletion before committing
filesStatuses.Where(i => i.LocalContentStatus == SvnStatus.NotVersioned).ToList().ForEach(i => client.Add(i.Path));
filesStatuses.Where(i => i.LocalContentStatus == SvnStatus.Missing).ToList().ForEach(i => client.Delete(i.Path));
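Putting that together with the commit method from the question, one possible shape is sketched below. The settings-based path is carried over from the question, the usings for SharpSvn and System.Collections.ObjectModel are assumed, and this is not presented as the asker's final code:

public bool svnCommitProject(String project, String message)
{
    string localDir = Properties.Settings.Default.LocalFolderPath +
                      Properties.Settings.Default.Username + @"\" + project;

    using (SvnClient client = new SvnClient())
    {
        try
        {
            client.Authentication.ForceCredentials(
                Properties.Settings.Default.Username,
                Properties.Settings.Default.Password);

            // Schedule local changes (new and missing files) before committing
            Collection<SvnStatusEventArgs> filesStatuses;
            client.GetStatus(localDir, new SvnStatusArgs { Depth = SvnDepth.Infinity },
                             out filesStatuses);
            foreach (var status in filesStatuses)
            {
                if (status.LocalContentStatus == SvnStatus.NotVersioned)
                    client.Add(status.Path);
                else if (status.LocalContentStatus == SvnStatus.Missing)
                    client.Delete(status.Path);
            }

            SvnCommitArgs args = new SvnCommitArgs { LogMessage = message };
            return client.Commit(localDir, args);
        }
        catch (SvnException)
        {
            MessageBox.Show("ERROR");
            return false;
        }
    }
}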

0 rows inserted with Entity Framework

I've been trying to write a simple financial app to manage my home spending, and while I was writing my Save button code I've encountered a situation where the code runs fine but inserts 0 rows into the local database.
Here's the code that calls saveIncome method:
if (comboBox1.SelectedIndex == 0)
{
    try
    {
        saveIncome();
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}
And here's the saveIncome method itself:
public void saveIncome()
{
    using (WalletEntities ctx = new WalletEntities())
    {
        var Income = new Income
        {
            ID = transID,
            Name = tbName.Text,
            Date = calDate.SelectionRange.Start,
            Value = decimal.Parse(tbValue.Text),
            Owner = tbOwner.Text,
            Desc = tbDesc.Text,
        };
        ctx.Income.Add(Income);
        ctx.SaveChanges();
        MessageBox.Show("Added Income ID: " + transID.ToString());
    }
}
When I tried to debug this, everything ran OK: the Income object was populated and the message box was shown.
As I understand it, I was using the "Model First" approach for this.
Please be gentle - I'm a beginner in programming :) also sorry for my English - not my primary language.
OK, so the problem is fixed, and it was due to my lack of knowledge. Apparently @MilenPavlov was right: I did in fact inspect a different database. I had no idea that, when built, the project copies the *.sdf to the Debug folder and places changes there (courtesy of the Copy to Output Directory property on the *.sdf file). So, as I was inspecting via Visual Studio, I had been viewing a different copy of the file.
Thanks @MilenPavlov for showing me the way :)
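If you run into something similar, one quick way to confirm which file the context is really writing to is to show the connection details at runtime; a small sketch, assuming the DbContext-style WalletEntities from the question (EF 6):

using (var ctx = new WalletEntities())
{
    // Shows the resolved connection string and database file the context is using,
    // which may be the copy under bin\Debug rather than the one in the project folder.
    MessageBox.Show(ctx.Database.Connection.ConnectionString);
    MessageBox.Show(ctx.Database.Connection.DataSource);
}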

Camera server died! Error 100 when starting recording

NOTICE: I'm using Monodroid, so expect C# code.
I'm facing this error when _recorder.Start() is called.
CODE:
private void IniciarGrabacion()
{
    try
    {
        CamcorderProfile camProfile = CamcorderProfile.Get(CamcorderQuality.High);
        String outputFile = "/sdcard/trompiz.mp4";
        _camera.Unlock();
        _recorder = new MediaRecorder();
        _recorder.SetCamera(_camera);
        _recorder.SetAudioSource(AudioSource.Default);
        _recorder.SetVideoSource(VideoSource.Camera);
        _recorder.SetProfile(camProfile);
        _recorder.SetOutputFile(outputFile);
        _recorder.SetPreviewDisplay(_preview.Holder.Surface);
        _recorder.Prepare();
        _recorder.Start(); // HERE IS WHERE THE ERROR APPEARS
    }
    catch (Exception ex)
    {
        string error = "Error starting Recording: " + ex.Message;
        Log.Debug("ERROR", error);
        Toast.MakeText(Application, error, ToastLength.Long).Show();
    }
}
The outputFile is hardcoded because I'm still testing.
I can confirm it exists, because it gets created.
I just figured out the problem.
It wasn't in how the camera was handled; it was the profile setting.
CamcorderProfile camProfile = CamcorderProfile.Get(CamcorderQuality.High);
It might be a device bug, but I can't set it to High. To make it work, I changed it to Low:
CamcorderProfile camProfile = CamcorderProfile.Get(CamcorderQuality.Low);
I have a Zenithink C93 Z283 (H6_2f).
I hope this helps anyone else fighting with this...
Now I have to see how to record in high quality. I know I can, because the native camera app records in high...
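As a follow-up idea (not from the original answer), the device can be asked which camcorder profiles it reports as supported before one is picked; a sketch assuming the Xamarin.Android binding of CamcorderProfile.HasProfile (API level 11 and up). A device may still fail at Start() on a profile it advertises, as apparently happened here, but this at least avoids profiles it does not report at all:

using Android.Media;

// Prefer the highest profile the device reports as supported, falling back to Low,
// which is the setting that worked on the device in question.
CamcorderQuality quality = CamcorderProfile.HasProfile(CamcorderQuality.High)
    ? CamcorderQuality.High
    : CamcorderQuality.Low;

CamcorderProfile camProfile = CamcorderProfile.Get(quality);
_recorder.SetProfile(camProfile);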
