WP7: collection of images - C#

I have images in the folder Images in my Windows Phone solution. How can I get a collection of the images in this folder? The Build Action of all images is "Content".

It had been bugging me that it wasn't possible to do this, so I've done a bit of digging and come up with a way of getting a list of all image files with the build action of "Resource". Yes, this isn't quite what was asked for, but hopefully it will still be useful.
If you really must use a build action of "Content" I'd use a T4 script to generate the list of files at build time. (This is what I do with one of my projects and it works fine.)
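The T4 approach boils down to enumerating the image files on disk at build time and emitting a C# source file that hard-codes their paths. A plain C# sketch of that generation logic (the folder filter, class name, and "Images/" prefix are made up for illustration):

```csharp
using System;
using System.IO;
using System.Linq;
using System.Text;

static class ImageListGenerator
{
    // Scans a folder for images and emits C# source hard-coding their relative
    // paths, mimicking what a T4 template would generate at build time.
    public static string Generate(string imagesFolder)
    {
        var sb = new StringBuilder();
        sb.AppendLine("public static class ImageManifest");
        sb.AppendLine("{");
        sb.AppendLine("    public static readonly string[] Paths =");
        sb.AppendLine("    {");
        foreach (var file in Directory.EnumerateFiles(imagesFolder)
                                      .Where(f => f.EndsWith(".png") || f.EndsWith(".jpg"))
                                      .OrderBy(f => f))
        {
            sb.AppendLine("        \"Images/" + Path.GetFileName(file) + "\",");
        }
        sb.AppendLine("    };");
        sb.AppendLine("}");
        return sb.ToString();
    }
}
```

The generated file gets regenerated on each build, so the list can never drift out of sync with the folder contents.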
Assuming that the images are in a folder called "images" you can get them with the following:
var listOfImageResources = new StringBuilder();
var asm = Assembly.GetExecutingAssembly();
var mrn = asm.GetManifestResourceNames();
foreach (var resource in mrn)
{
    var rm = new ResourceManager(resource.Replace(".resources", ""), asm);
    try
    {
        var NOT_USED = rm.GetStream("app.xaml"); // without getting a stream, the next statement doesn't work - bug?
        var rs = rm.GetResourceSet(Thread.CurrentThread.CurrentUICulture, false, true);
        var enumerator = rs.GetEnumerator();
        while (enumerator.MoveNext())
        {
            if (enumerator.Key.ToString().StartsWith("images/"))
            {
                listOfImageResources.AppendLine(enumerator.Key.ToString());
            }
        }
    }
    catch (MissingManifestResourceException)
    {
        // Ignore any other embedded resources (they won't contain app.xaml)
    }
}
MessageBox.Show(listOfImageResources.ToString());
This just displays a list of the names, but hopefully it'll be easy to change this to do whatever you need to.
Any suggestions for improving this code will be greatly appreciated.


S3PutObjectCopy Batch Operations change destination structure of output directories in .NET

I'm using the AWS .NET SDK and trying to use S3CopyObjectOperation with CreateJob from Batch Operations. It works fine; I'm generating the manifest on the fly with the following content:
mysourcebucket,folder/subdir/file1.txt
mysourcebucket,folder/subdir/file2.txt
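Generating that two-column manifest on the fly is just string building; a minimal sketch (the bucket and key values are taken from the example above, the helper name is made up):

```csharp
using System;
using System.Collections.Generic;
using System.Text;

static class ManifestBuilder
{
    // Builds an S3 Batch Operations CSV manifest body: one "bucket,key" per line,
    // matching the S3BatchOperations_CSV_20180820 format's Bucket/Key fields.
    public static string Build(string bucket, IEnumerable<string> keys)
    {
        var sb = new StringBuilder();
        foreach (var key in keys)
        {
            sb.Append(bucket).Append(',').Append(key).Append('\n');
        }
        return sb.ToString();
    }
}
```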
Now I'm creating a job with CreateJob with the following request:
new CreateJobRequest()
{
    // some values omitted
    Operation = new JobOperation()
    {
        S3PutObjectCopy = new S3CopyObjectOperation
        {
            StorageClass = Standard,
            TargetResource = dstBucketArn,
            TargetKeyPrefix = dstSubdir
        }
    },
    Manifest = new JobManifest()
    {
        Spec = new JobManifestSpec() { Fields = { "Bucket", "Key" }, Format = JobManifestFormat.S3BatchOperations_CSV_20180820 },
        Location = new JobManifestLocation()
        {
            ObjectArn = // manifest key,
            Etag = // manifest Etag
        }
    }
}
It is copying the files correctly under dstBucket. Output:
dstSubdir/folder/subdir/file1.txt
dstSubdir/folder/subdir/file2.txt
Is it possible to change the target path somehow, so it doesn't include folder? Expected output:
dstSubdir/subdir/file1.txt
dstSubdir/subdir/file2.txt
Edit: Imagine a simple scenario where I want to copy these objects and at some point copy them back into the same location. I won't be able to do that if I use TargetPrefix. In the above example I'd create
targetbucket,dstSubdir/folder/subdir/file1.txt
targetbucket,dstSubdir/folder/subdir/file2.txt
And output would be in srcbucket:
srcSubdir/dstSubdir/folder/subdir/file1.txt
srcSubdir/dstSubdir/folder/subdir/file2.txt
The only solution that would work is not to use TargetPrefix at all and to keep the same structure in the source and destination buckets, which is quite a big restriction in my case.
Ideally I'd like to pass
my-bucket,{"origKey": "object1key", "newKey": "newObject1Key"}
my-bucket,{"origKey": "object2key", "newKey": "newObject2Key"}
my-bucket,{"origKey": "object3key", "newKey": "newObject3Key"}
as presented in this example https://docs.aws.amazon.com/AmazonS3/latest/userguide/batch-ops-invoke-lambda.html

Move outlook item to archive

I have a C# program (actually a C# library driven by NUnit) that I wish to modify slightly; it is based on this article: How to Programmatically move items in Outlook. I'm currently faced with a folder that has about 3500 messages, all around 350kb, and it is taking FOREVER to move them to my archive so I can send and receive emails again (since my inbox is currently at 1.5Gb out of 500Mb... lol), but for the life of me I can't figure out how to get my archive folder. I'm multitasking a bit since I'm at work, so I can edit as I go. If you have any code readily available that finds the archive folder, that would be great. Thank you.
EDIT
OK, to show that I do have some work in progress (based on negative feedback), here is the code I have in place right now (since, yes, I know this reads like a "give me teh codez" question).
Here is my NUnit test case that looks at a folder and gives me specific information:
[Test]
public void CheckMessages()
{
    List<EmailMessage> messages = new List<EmailMessage>();
    using (var target = new EmailMessageProvider())
    {
        messages.AddRange(target.GetEmailMessages("UnexpectedErrors\\NotFindImage"));
    }

    Dictionary<int, string> asdf = new Dictionary<int, string>();
    foreach (var item in messages)
    {
        var line = item.Body.Split(new string[] { Environment.NewLine }, StringSplitOptions.None)[2];
        var revisionId = int.Parse(Regex.Match(line, @"\-*\d+").Value);
        var path = line.Substring(line.IndexOf("\\\\"));
        if (asdf.ContainsKey(revisionId))
        {
            Assert.That(path, Is.EqualTo(asdf[revisionId]));
        }
        else
        {
            asdf.Add(revisionId, path);
        }
    }

    foreach (var item in asdf.OrderBy(x => x.Key))
    {
        Console.WriteLine($"{item.Key} {item.Value}");
    }
}
I use the same class to find messages (in another test) and move them to the subfolder that test uses.
Here is the code that does the moving:
public void MoveSurveyPrintComponentsNotFound()
{
    var destination = _baseFolder.Folders["UnexpectedErrors"].Folders["NotFindImage"];
    foreach (var mailItem in _baseFolder.Folders["UnexpectedErrors"].Items.OfType<MailItem>())
    {
        mailItem.UseMailItem(x =>
        {
            if (x.Body.Contains("Foobar.Common.Exceptions.ImageNotFound"))
                x.Move(destination);
        });
    }
}
EDIT 2
Looks like I may have just about got it. I found that in the MAPI Namespace one of the subfolders is Archives. I'm going to try changing a few of the variables and see if it moves. The problem is that just checking one folder takes over 31 seconds. Oh well; better than never.
I figured it out. It wasn't as hard as I had thought either, so I'll share what I have in case someone else has this problem. In my program I did two things. One was to set _baseFolder to my default email address's folder. The second was to set _mapi to Outlook.GetNamespace("MAPI"). Those two things I already had in my constructor.
private readonly OutlookApplication _outlook;
private readonly NameSpace _mapi;
private MAPIFolder _baseFolder;

public EmailMessageProvider()
{
    _outlook = new OutlookApplication();
    _mapi = _outlook.GetNamespace("MAPI");
    _baseFolder = _mapi.Folders["robert#defaultEmail.com"];
}
Archives works just like any other MAPIFolder, so it's just a matter of getting said folder. For me it was in _mapi.Folders["Archive"]. I would imagine this is fairly standard, so if you copy and paste it should work just fine.
So now I list out all of the emails I want to go through and move them appropriately.
public void MoveSpecificEmailsToArchives()
{
    var destination = _mapi.Folders["Archives"];
    foreach (var mailItem in _baseFolder.Folders["Unexpected Error"].Items.OfType<MailItem>())
    {
        mailItem.UseMailItem(x =>
        {
            if (x.Body.Contains("offensiveProgram.exe ERROR "))
                x.Move(destination);
        });
    }
    Release(destination);
}
FYI, UseMailItem is an extension method. It looks like this:
public static void UseMailItem(this MailItem item, Action<MailItem> mailItemAction)
{
    mailItemAction(item);
    Marshal.ReleaseComObject(item);
}

Modify programatically csproj files with Microsoft.Build.Evaluation (instead of Engine)

I would like to read, modify and write back csproj files.
I've found this code, but unfortunately the Engine class is deprecated:
Engine engine = new Engine();
Project project = new Project(engine);
project.Load("myproject.csproj");
project.SetProperty("SignAssembly", "true");
project.Save("myproject.csproj");
So I've continued based on the hint I should use Evaluation.ProjectCollection instead of Engine:
var collection = new ProjectCollection();
collection.DefaultToolsVersion = "4.0";
var project = new Project(collection);
// project.Load("myproject.csproj") There is NO Load method :-(
project.FullPath = "myproject.csproj"; // Instead of load? Does nothing...
// ... modify the project
project.Save(); // Interestingly there is a Save() method
There is no Load method anymore. I've tried to set the property FullPath, but the project still seems empty. Missed I something?
(Please note: I do know that the .csproj file is a standard XML file with an XSD schema, and I know we could read/write it using XDocument or XmlDocument. That's a backup plan. Seeing the .Save() method on the Project class, I think I've missed something if I can't load an existing .csproj. Thanks.)
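That XDocument backup plan would look roughly like the following; a minimal sketch that sets a property in the project's first PropertyGroup (the helper name is made up, and old-style projects carry the MSBuild XML namespace, which has to be used on every element lookup):

```csharp
using System;
using System.Linq;
using System.Xml.Linq;

static class CsprojXmlEditor
{
    // Treats the .csproj as plain XML: finds (or adds) the named property
    // in the first PropertyGroup and sets its value.
    public static string SetProperty(string csprojXml, string name, string value)
    {
        var doc = XDocument.Parse(csprojXml);
        var ns = doc.Root.Name.Namespace; // MSBuild namespace on old-style projects
        var group = doc.Root.Elements(ns + "PropertyGroup").First();
        var prop = group.Element(ns + name);
        if (prop == null)
            group.Add(new XElement(ns + name, value));
        else
            prop.Value = value;
        return doc.ToString();
    }
}
```

Unlike the Project API, this does no evaluation of imports or conditions, which is exactly why it is only the backup plan.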
I've actually found the answer, hopefully will help others:
Instead of creating a new Project(...) and trying to .Load(...) it, we should use a factory method of the ProjectCollection class.
// Instead of:
// var project = new Project(collection);
// project.FullPath = "myproject.csproj"; // Instead of load? Does nothing...
// use this:
var project = collection.LoadProject("myproject.csproj");
Since I can't comment:
This won't work in .NET Core without first setting the MSBuild.exe path variable. The code to do so can be found here:
https://blog.rsuter.com/missing-sdk-when-using-the-microsoft-build-package-in-net-core/
and is reproduced here:
private static void SetMsBuildExePath()
{
    try
    {
        var startInfo = new ProcessStartInfo("dotnet", "--list-sdks")
        {
            RedirectStandardOutput = true
        };
        var process = Process.Start(startInfo);
        process.WaitForExit(1000);
        var output = process.StandardOutput.ReadToEnd();
        var sdkPaths = Regex.Matches(output, @"([0-9]+\.[0-9]+\.[0-9]+) \[(.*)\]")
            .OfType<Match>()
            .Select(m => System.IO.Path.Combine(m.Groups[2].Value, m.Groups[1].Value, "MSBuild.dll"));
        var sdkPath = sdkPaths.Last();
        Environment.SetEnvironmentVariable("MSBUILD_EXE_PATH", sdkPath);
    }
    catch (Exception exception)
    {
        Console.Write("Could not set MSBUILD_EXE_PATH: " + exception);
    }
}

C# directory scan performance

I have a folder structure on a network drive that is
Booking Centre -> Facility -> Files
eg
EUR/12345678/File_archive1.txt
EUR/12345678/File_archive2.txt
EUR/12345678/File_latest.txt
EUR/5555/File_archive1.txt
EUR/5555/File_archive2.txt
EUR/5555/File_latest.txt
When a user selects a booking centre from the dropdown, I want the code to look in the above network path for that booking centre, look at all its subfolders, find the most recent file in each subfolder, and use that to populate a list of portfolios for a second dropdown. It is incredibly slow, though; my code is given below. Can anyone suggest a faster approach?
public IDictionary<string, Portfolio> ReadPortfolios()
{
    var portfolios = new Dictionary<string, Portfolio>();
    var di = new DirectoryInfo(PortfolioPath);
    var possibleFacilities = di.GetDirectories();
    foreach (var possibleFacility in possibleFacilities)
    {
        try
        {
            if (possibleFacility.GetFiles().Any())
            {
                var mostRecentFile = possibleFacility.GetFiles().OrderBy(file => file.LastWriteTimeUtc).Last();
                var portfolio = UnzipAndReadPortfolio(mostRecentFile);
                if (portfolio == null) continue;
                portfolios.Add(possibleFacility.Name, portfolio);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine(@"Failed to read portfolio: " + ex.Message);
        }
    }
    return portfolios;
}
If you're interested in all subdirectories of PortfolioPath, try the overload of GetDirectories and/or GetFiles that takes the SearchOption.AllDirectories parameter: it avoids multiple round trips to the network.
You also have TWO calls to GetFiles() in your loop; you should store the result of the first call in a local variable.
You don't provide the code of UnzipAndReadPortfolio, which may be the slowest part (... or not?).
Remember: in code like this you can often think "one method call = one network access", so try to flatten your loops, reduce file-system access, etc.
A probable small performance gain:
var mostRecentFile = possibleFacility.GetFiles()
                                     .OrderBy(file => file.LastWriteTimeUtc)
                                     .LastOrDefault();
if (mostRecentFile != null)
    ....
and comment out the first
// if (possibleFacility.GetFiles().Any())
The most obvious thing: every time you call possibleFacility.GetFiles() you fetch all files in the folder again. Call it once, save the result in a variable, and then use that variable.
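Putting those suggestions together (enumerate each facility folder once, and let LastOrDefault replace the Any()/Last() pair), the scanning part of the loop might look like this. UnzipAndReadPortfolio is left out, so this sketch just returns the newest file name per subdirectory:

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

static class PortfolioScanner
{
    // Maps each immediate subdirectory of root to the name of its most
    // recently written file, touching each directory exactly once.
    public static IDictionary<string, string> NewestFilePerFacility(string root)
    {
        var result = new Dictionary<string, string>();
        foreach (var dir in new DirectoryInfo(root).GetDirectories())
        {
            // Single enumeration; LastOrDefault handles the empty-folder case
            // that the original Any() check guarded against.
            var newest = dir.EnumerateFiles()
                            .OrderBy(f => f.LastWriteTimeUtc)
                            .LastOrDefault();
            if (newest != null)
                result.Add(dir.Name, newest.Name);
        }
        return result;
    }
}
```

EnumerateFiles streams results instead of materializing the whole array the way GetFiles does, which helps most on large network folders.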

DirectoryInfo.GetFileSystemInfos and File Renaming

I seem to be having a timing issue when renaming images and then redisplaying them. In my code I use System.IO.File.Move twice to rename some images in a directory. Later I try to retrieve a list of files in the directory, but when I do I get some file names that existed after the first rename and some that existed after the second rename. How do I ensure I get only file names that exist after the second rename? I have contemplated putting in a Thread.Sleep(), but that feels like a hack. In case it helps, I'm using MVC3.
public ActionResult UpdateImages()
{
    foreach (file in directory) // pseudocode
        System.IO.File.Move("oldname", "newname");
    foreach (file in directory)
        System.IO.File.Move("oldname", "newname");
    return RedirectToAction("Images", "Manager", new { id = Id });
}
public ViewResult Images(int id)
{
    var di = new DirectoryInfo(Server.MapPath("something"));
    var files = di.GetFileSystemInfos("*-glr.jpg");
    var orderedFiles = files.OrderBy(f => f.Name);
    var images = new List<string>();
    images.AddRange(orderedFiles.Select(fileSystemInfo => fileSystemInfo.Name));
    ViewData["Images"] = images;
    return View();
}
Edit
I wish I could remove my own question. It seems I have solved it, and the answer isn't even related to the information I provided in the question.
It turns out I was sending both a GET and a POST to the server. The POST kicked off the work, but its response was aborted because the GET also fired. Since the GET finished quickly, it caught the system in an in-between state.
The offending line of code was an anchor element that had both an href and a JavaScript click handler (through jQuery) attached to it.
