I'm trying to use a PreparationTask to grab my ResourceFiles to be used as input data.
My prep task looks like this:
myJob.JobPreparationTask = new JobPreparationTask { CommandLine = jobPrepCmdLine };
How do I configure my job's JobPreparationTask to download ResourceFiles from my auto-storage container to the pool VMs?
I tried:
var inputFiles = new List<ResourceFile> { };
var file = ResourceFile.FromAutoStorageContainer("fgrp-jill2");
inputFiles.Add(file);
myJob.JobPreparationTask.ResourceFiles = inputFiles;
But I get a null object error (NullReferenceException), even though inputFiles contains at least one file.
You should use the Storage SDK in conjunction with Batch in this scenario. You can follow this as an example: https://learn.microsoft.com/en-us/azure/batch/quick-run-dotnet#preliminaries
A job preparation task functions very similarly to a regular start task in that it runs before the job's tasks execute. In the example from the link, you'll see that we reference the blob client, container name, and file path. I'll paste the sample here:
List<string> inputFilePaths = new List<string>
{
    "taskdata0.txt",
    "taskdata1.txt",
    "taskdata2.txt"
};

List<ResourceFile> inputFiles = new List<ResourceFile>();
foreach (string filePath in inputFilePaths)
{
    inputFiles.Add(UploadFileToContainer(blobClient, inputContainerName, filePath));
}
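If you want to stay with the auto-storage container approach from the question rather than uploading files yourself, here is a minimal sketch using the Microsoft.Azure.Batch SDK and the names from the question ("fgrp-jill2", jobPrepCmdLine). The usual cause of the NullReferenceException is assigning ResourceFiles through a JobPreparationTask property that is still null; setting ResourceFiles on the same JobPreparationTask instance that is attached to the job avoids that:

```csharp
// Sketch, assuming Microsoft.Azure.Batch; "fgrp-jill2" is the
// auto-storage container name from the question.
var inputFiles = new List<ResourceFile>
{
    ResourceFile.FromAutoStorageContainer("fgrp-jill2")
};

// Set ResourceFiles when the JobPreparationTask is created, so the
// property is never accessed on a null JobPreparationTask.
myJob.JobPreparationTask = new JobPreparationTask
{
    CommandLine = jobPrepCmdLine,
    ResourceFiles = inputFiles
};
await myJob.CommitAsync();
```

If you do assign ResourceFiles in a separate statement, make sure it runs after the line that creates the JobPreparationTask, not before.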
Related
foreach (var lst in list)
{
    CloudBlobDirectory dira = container.GetDirectoryReference(folderName + "/" + prefix);
    bool isExists = dira.GetBlockBlobReference(filename + ".png").ExistsAsync().Result;
    if (isExists)
    {
        // create sas url using cloud block url
    }
}
I am using this code to check whether a blob exists for each path, but ExistsAsync() is taking too much time.
I have also tried GetBlobClient in the loop, but it was also slow.
Is there another way to check whether a block blob exists?
I am using the latest version (12) of Azure.Storage.Blobs.
It's not entirely clear what the OP wants to do. One naive way to try to make blob queries faster is to execute multiple requests in parallel.
Note that this example actually uses Azure.Storage.Blobs, while the OP's snippet (CloudBlobDirectory) appears to use the legacy WindowsAzure.Storage SDK despite stating otherwise.
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Specialized;
var blobClient = new BlobServiceClient("<connection string>");
var container = blobClient.GetBlobContainerClient("<container name>");
// However your list looks like, I'm assuming it's partial or full blob paths
var potentialBlobs = new List<string>() { "<some>/<blob>/<path>" };

// Number of requests to execute in parallel at a time; which value yields the
// best results depends on your system. Could also be derived from
// Environment.ProcessorCount.
const int chunkSize = 4;

var existingBlobs = new List<string>();
foreach (var chunk in potentialBlobs.Chunk(chunkSize))
{
    // Create multiple tasks
    var tasks = chunk.Select(async path => // potentialBlobs list item
    {
        bool exists = await container
            .GetBlockBlobClient(path) // adjust the path however needed
            .ExistsAsync();
        return exists ? path : null;
    });

    // Wait for tasks to finish
    var results = await Task.WhenAll(tasks);

    // Append to result list
    foreach (var cur in results)
    {
        if (cur is not null)
            existingBlobs.Add(cur);
    }
}
There does not seem to be an Exists method available in the BlobBatchClient of Azure.Storage.Blobs.Batch. As far as I'm aware it's mostly for adding, updating, or deleting blobs, not for checking their existence, so I don't think it is viable for this scenario, but I might have missed something.
If you cannot get acceptable performance this way, you'll have to store the blob paths in something like Table Storage or Cosmos DB that is faster to query against. Depending on your needs you might also be able to cache the result, or store it somewhere and continuously update it as new blobs get added.
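Another option worth considering: when the candidate paths share a common prefix, a single listing call can replace all of the per-blob ExistsAsync round-trips. A sketch, assuming Azure.Storage.Blobs v12 and placeholder connection/path values:

```csharp
// Sketch: list blobs under the common prefix once, then check existence
// locally. One listing call instead of N network round-trips.
var container = new BlobServiceClient("<connection string>")
    .GetBlobContainerClient("<container name>");

var existing = new HashSet<string>(StringComparer.Ordinal);
await foreach (var blob in container.GetBlobsAsync(prefix: "<folder>/<prefix>"))
{
    existing.Add(blob.Name);
}

// Each check is now an in-memory lookup:
bool exists = existing.Contains("<folder>/<prefix>/<filename>.png");
```

This pays off when the number of candidates is large relative to the number of blobs under the prefix; for a handful of checks against a huge container, the parallel ExistsAsync approach above is likely cheaper.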
I have the following code taking the input of my input file:
var inputStream = new AntlrInputStream(File.ReadAllText(fileName));
var lexer = new LegitusLexer(inputStream);
var commonTokenStream = new CommonTokenStream(lexer);
var parser = new LegitusParser(commonTokenStream);
parser.AddErrorListener(this);
var context = parser.program();
var visitor = new LegitusVisitor(_io.GetDefaultMethods(), _io.GetDefaultVariables())
{
    Logger = _logger
};
visitor.Visit(context);
When I call parser.program(), my program runs as it should. However, I need a way to validate that an input file is syntactically correct, so that users can verify their scripts without having to run them (the scripts run against a special machine).
Does Antlr4csharp support this easily?
The Antlr tool can be used to lint the source.
The only difference from a 'standard' tool run is that no output files are generated -- the warnings/errors will be the same.
For example (Java; string source content w/manually applied file 'name'):
Tool tool = new Tool();
tool.removeListeners();
tool.addListener(new YourLintErrorReporter());
ANTLRStringStream in = new ANTLRStringStream(content);
GrammarRootAST ast = tool.parse(name, in);
Grammar g = tool.createGrammar(ast);
g.fileName = name; // to ensure all err msgs identify the file by name
tool.process(g, false); // false -> lint: don't gencode
The C# implementation is equivalent.
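If the goal is to validate the user's input files (rather than lint the grammar itself), a common pattern with the C# runtime is to attach an error-counting listener and parse without visiting. A sketch using the generated LegitusLexer/LegitusParser from the question and the Antlr4.Runtime package; note the SyntaxError override signature differs slightly between runtime versions (older ones lack the TextWriter parameter):

```csharp
using System.IO;
using Antlr4.Runtime;

// Counts syntax errors instead of printing them to the console.
public class CountingErrorListener : BaseErrorListener
{
    public int ErrorCount { get; private set; }

    public override void SyntaxError(TextWriter output, IRecognizer recognizer,
        IToken offendingSymbol, int line, int charPositionInLine,
        string msg, RecognitionException e)
    {
        ErrorCount++;
    }
}

public static bool IsSyntacticallyValid(string fileName)
{
    var inputStream = new AntlrInputStream(File.ReadAllText(fileName));
    var lexer = new LegitusLexer(inputStream);
    var parser = new LegitusParser(new CommonTokenStream(lexer));

    var listener = new CountingErrorListener();
    parser.RemoveErrorListeners();   // drop the default console listener
    parser.AddErrorListener(listener);

    parser.program();                // parse only; no visitor, nothing executes
    return listener.ErrorCount == 0;
}
```

Because the visitor is never invoked, nothing runs against the machine; the parse alone surfaces all syntax errors.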
I am trying to retrieve all image files from a virtual directory in my Azure storage account. The path of the folder (container URI) is correct but is returning
StorageException: The requested URI does not represent any resource on the server.
Pasting the URI in the browser produces
BlobNotFound
The specified blob does not exist. RequestId:622f4500-a01e-0022-7dd0-7d9428000000 Time:2019-10-08T12:02:29.6389180Z
The URI, which is public, works fine; you can see this video by pasting the URI in your browser or clicking the Engine Video link below.
Engine Video
My code grabs the container, whose URI is https://batlgroupimages.blob.core.windows.net/enerteck/publicfiles/images/robson
public async Task<List<string>> GetBlobFileListAsync(CloudBlobContainer blobContainer, string customer)
{
    var files = new List<string>();
    BlobContinuationToken blobContinuationToken = null;
    do
    {
        // code fails on the line below
        var segments = await blobContainer.ListBlobsSegmentedAsync(null, blobContinuationToken);
        blobContinuationToken = segments.ContinuationToken;
        files.AddRange(segments.Results.Select(x => GetFileNameFromBlobUri(x.Uri, customer)));
    } while (blobContinuationToken != null);
    return files;
}
The code fails on the var segments = await blobContainer…. line,
and it is not the container that is causing the error (IMO), as you can see the container comes back with a valid URI,
and the virtual folder contains files
I would love to know what I am doing wrong here.
https://batlgroupimages.blob.core.windows.net/enerteck/publicfiles/images/robson is not a container URI.
https://batlgroupimages.blob.core.windows.net/enerteck is a container URI.
publicfiles/images/robson/image.png could be a blob's name in that container.
I'm thinking you may have included some of the virtual folder path in the container URI and maybe that is messing up something?
So the answer lies in the fact that I was trying to list blobs WITHIN a virtual folder path.
In the OP, I was trying to list the blobs using the FULL path to the folder containing the blobs. You cannot do it that way; you must use one of the overloads of ListBlobsSegmentedAsync on the MAIN CONTAINER. Thanks to juunas for getting me to realize I wasn't starting at the main container. I then realized there must be other overloads to accomplish what I was seeking to do.
The code below works well
public async Task<List<EvaluationImage>> GetImagesFromVirtualFolder(CloudBlobContainer blobContainer, string customer)
{
    var images = new List<EvaluationImage>();
    BlobContinuationToken blobContinuationToken = null;
    do
    {
        // This is the overload to use: pass the full virtual path from the main
        // container to the files (the prefix), useFlatBlobListing = true
        // (2nd parameter), a BlobListingDetails value, and the max number of
        // blobs to return (100 here, 4th parameter). The last two parameters
        // can be null.
        var results = await blobContainer.ListBlobsSegmentedAsync(
            "publicfiles/images/" + customer, true, BlobListingDetails.All, 100,
            blobContinuationToken, null, null);

        // Get the value of the continuation token returned by the listing call.
        blobContinuationToken = results.ContinuationToken;

        foreach (var item in results.Results)
        {
            var filename = GetFileNameFromBlobUri(item.Uri, customer);
            var img = new EvaluationImage
            {
                ImageUrl = item.Uri.ToString(),
                ImageCaption = GetCaptionFromFilename(filename),
                IsPosterImage = filename.Contains("poster")
            };
            images.Add(img);
        }
    } while (blobContinuationToken != null);
    return images;
}
Using NotificationHubClient I can get all registered devices using GetAllRegistrationsAsync(). But if I do not use the registration model but the installation model instead, how can I get all installations? There are methods to retrieve a specific installation but none to get everything.
You're correct, as of July 2016 there's no way to get all installations for a hub. In the future, the product team is planning to add this feature to the installations model, but it will work in a different way. Instead of making it a runtime operation, you'll provide your storage connection string and you'll get a blob with everything associated with the hub.
Sorry for visiting an old thread... but in theory you could use GetAllRegistrationsAsync to get all the installations. I guess this will also return registrations without an installation id, but you could just ignore those if you choose.
It could look something like this:
var allRegistrations = await _hub.GetAllRegistrationsAsync(0);
var continuationToken = allRegistrations.ContinuationToken;
var registrationDescriptionsList = new List<RegistrationDescription>(allRegistrations);
while (!string.IsNullOrWhiteSpace(continuationToken))
{
    var otherRegistrations = await _hub.GetAllRegistrationsAsync(continuationToken, 0);
    registrationDescriptionsList.AddRange(otherRegistrations);
    continuationToken = otherRegistrations.ContinuationToken;
}

// Put into DeviceInstallation objects
var deviceInstallationList = new List<DeviceInstallation>();
foreach (var registration in registrationDescriptionsList)
{
    var deviceInstallation = new DeviceInstallation();
    var tags = registration.Tags;
    foreach (var tag in tags)
    {
        if (tag.Contains("InstallationId:"))
        {
            deviceInstallation.InstallationId = new Guid(tag.Substring(tag.IndexOf(":") + 1));
        }
    }
    deviceInstallation.PushHandle = registration.PnsHandle;
    deviceInstallation.Tags = new List<string>(registration.Tags);
    deviceInstallationList.Add(deviceInstallation);
}
I am not suggesting this is the cleanest chunk of code ever written, but it does the trick for us. We only use it for debugging purposes anyway.
I am trying to add an item to an ObservableCollection during an async operation. If I run the application, the collection does not have the correct file; if I step through it, I see the correct file does get added, which suggests a timing issue. Trouble is I cannot figure out how to fix it. Besides this, everything else works as I expect.
Can someone explain to me what I am doing wrong and how to fix it so the correct filename is written to the ObservableCollection?
private void ChangeFile(INotificationComplete notification)
{
    FileInfo currentFileInfo = null;
    var destinationImageFilename = string.Empty;
    var imageDestinationFolder = Path.Combine(messageBrokerInstance.GetProgramPath("LevelThreeFilesWebLocation", this.SelectedPlatform), "images");
    var fileDestinationFolder = Path.Combine(messageBrokerInstance.GetProgramPath("LevelThreeFilesWebLocation", this.SelectedPlatform));
    try
    {
        Task.Factory.StartNew((Action)delegate
        {
            string[] files = null;
            if (directoryInfo.Exists)
            {
                files = Directory.GetFiles(directoryInfo.FullName, "*.htm", SearchOption.TopDirectoryOnly);
            }
            foreach (string file in files)
            {
                currentFileInfo = new FileInfo(file);
                // bunch of code
                // I've found what I want and now am ready to write the file
                // and add the filename to the collection the user sees.
                if (writeFile)
                {
                    var fileDestination = Path.Combine(fileDestinationFolder, currentFileInfo.Name);
                    File.WriteAllLines(webFileDestination, fileArray);
                    // Correct file was written but the wrong filename
                    // is added to the collection.
                    // If I step through this, the correct filename is added.
                    UIDispatcher.Current.BeginInvoke((Action)delegate
                    {
                        this.ChangedFiles.Add(currentFileInfo.Name); // ChangedFiles is an ObservableCollection<string>
                    });
                }
            }
            WaitAnimationNotification offNotification = new WaitAnimationNotification()
            {
                IsWaitAnimationOn = false,
                WaitAnimationMessage = "Please wait while the operation completes..."
            };
            WaitAnimationNotification waitNotification = notification as WaitAnimationNotification;
            if (waitNotification.IsWaitAnimationOn)
            {
                this.SendMessage("ToggleWaitAnimation", new NotificationEventArgs<WaitAnimationNotification, INotificationComplete>("ToggleWaitAnimation", offNotification));
            }
        });
    }
}
I don't know what UIDispatcher.Current is, but the correct dispatcher to use is Application.Current.Dispatcher. It's easy to spin up a separate dispatcher on a background thread unintentionally, which will give you the behavior you see. Application.Current.Dispatcher is the correct dispatcher for the main application message pump.
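There is likely a second issue behind the "wrong filename" symptom: the delegate captures the shared currentFileInfo variable, which is declared outside the loop and reassigned each iteration, so by the time the dispatcher runs the delegate the variable may already point at a later file (stepping through in the debugger slows the loop enough for the delegate to run first, which is why it looks correct then). A minimal sketch of both fixes, using the names from the question:

```csharp
// Inside the foreach loop, after writing the file:
var addedName = currentFileInfo.Name;   // copy into a loop-local so the
                                        // delegate does not capture the
                                        // shared, reassigned variable
Application.Current.Dispatcher.BeginInvoke((Action)(() =>
{
    // Runs on the UI thread that owns the ObservableCollection
    this.ChangedFiles.Add(addedName);
}));
```

With the name copied per iteration, each queued delegate adds the file it was created for, regardless of when the dispatcher gets around to running it.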