Azure WebJobs is now on V3, so this answer is no longer up to date (How to integration test Azure Web Jobs?).
I imagine we need to do something like this:
var host = CreateHostBuilder(args).Build();
using (var scope = host.Services.CreateScope())
using (host)
{
var jobHost = host.Services.GetService(typeof(IJobHost)) as JobHost;
var arguments = new Dictionary<string, object>
{
// parameters of MyQueueTriggerMethodAsync
};
await host.StartAsync();
await jobHost.CallAsync("MyQueueTriggerMethodAsync", arguments);
await host.StopAsync();
}
QueueTrigger Function
private readonly ILogger<MyService> _logger;

public MyService(
    ILogger<MyService> logger
)
{
    _logger = logger;
}

public async Task MyQueueTriggerMethodAsync(
    [QueueTrigger("MyQueue")] MyObj obj
)
{
    _logger.LogInformation("ReadFromQueueAsync success");
}
But after that, how can I see what's happened?
What do you suggest to be able to do Integration Tests for Azure Webjobs V3?
I'm guessing this is a cross-post with GitHub. The product team recommends looking at their own end-to-end tests for ideas on how to handle integration testing.
To summarize:
You can configure an IHost as a TestHost and add your integrated services to it.
public TestFixture()
{
IHost host = new HostBuilder()
.ConfigureDefaultTestHost<TestFixture>(b =>
{
b.AddAzureStorage();
})
.Build();
var provider = host.Services.GetService<StorageAccountProvider>();
StorageAccount = provider.GetHost().SdkObject;
}
Tests would look something like this:
/// <summary>
/// Covers:
/// - queue binding to custom object
/// - queue trigger
/// - table writing
/// </summary>
public static void QueueToICollectorAndQueue(
[QueueTrigger(TestQueueNameEtag)] CustomObject e2equeue,
[Table(TableName)] ICollector<ITableEntity> table,
[Queue(TestQueueName)] out CustomObject output)
{
const string tableKeys = "testETag";
DynamicTableEntity result = new DynamicTableEntity
{
PartitionKey = tableKeys,
RowKey = tableKeys,
Properties = new Dictionary<string, EntityProperty>()
{
{ "Text", new EntityProperty("before") },
{ "Number", new EntityProperty("1") }
}
};
table.Add(result);
result.Properties["Text"] = new EntityProperty("after");
result.ETag = "*";
table.Add(result);
output = e2equeue;
}
The difficulty of setting up a specific test depends on which triggers and outputs you are using and on whether or not an emulator is available.
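For example, assuming the fixture above exposes the CloudStorageAccount it resolves and the WebJobs host is running, a test against the storage emulator (or a real account) could drop a message on the trigger queue with the storage SDK and then assert on the side effects. This is only a sketch: the queue and table names, the payload, and the fixed delay are illustrative, and a real test should poll with a timeout.

[Fact]
public async Task QueueTrigger_ProducesExpectedTableRow()
{
    // _fixture is the TestFixture shown above, exposing StorageAccount (CloudStorageAccount).
    var queueClient = _fixture.StorageAccount.CreateCloudQueueClient();
    var queue = queueClient.GetQueueReference("myqueue");   // trigger queue name is an assumption
    await queue.CreateIfNotExistsAsync();
    await queue.AddMessageAsync(new CloudQueueMessage(JsonConvert.SerializeObject(new { Text = "before" })));

    // Crude wait for the listener to fire; prefer polling with a timeout in real tests.
    await Task.Delay(TimeSpan.FromSeconds(5));

    var tableClient = _fixture.StorageAccount.CreateCloudTableClient();
    var table = tableClient.GetTableReference("mytable");   // table name is an assumption
    var retrieved = await table.ExecuteAsync(TableOperation.Retrieve<DynamicTableEntity>("testETag", "testETag"));
    Assert.NotNull(retrieved.Result);
}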
I would like to use the .NET variant of the Google Cloud Client Libraries (Resource Manager for creating a new project, for example).
I would prefer not to use service account credentials or ADC (Application Default Credentials).
Can I somehow pass existing OAuth credentials (an access token obtained for the appropriate scope) to the client library to authenticate the given user?
(Or) do I need an additional authentication client library?
I briefly looked at the ProjectsClientBuilder class, but it (like its documentation) seems heavily generated, which makes it harder to find any hints.
The following example shows how to authorize the Google Cloud Resource Manager API using OAuth2 for an installed app.
// Key file from google developer console (INSTALLED APP)
var PathToInstalledKeyFile = @"C:\Development\FreeLance\GoogleSamples\Credentials\credentials.json";
// scope of authorization needed for the method in question.
var scopes = "https://www.googleapis.com/auth/cloud-platform";
// Installed app authorizaton.
var credential = GoogleWebAuthorizationBroker.AuthorizeAsync(GoogleClientSecrets.FromFile(PathToInstalledKeyFile).Secrets,
new []{ scopes },
"userName",
CancellationToken.None,
new FileDataStore("usercreds", true)).Result;
var client = new ProjectsClientBuilder()
{
Credential = credential,
}.Build();
var projects = client.ListProjects(new FolderName("123"));
Note that for a web application the code will be different; web authorization does not work the same way with the client library. I haven't tried to connect any of the Cloud APIs via web OAuth before.
As mentioned above, the only thing needed is to initialize the Credential property on the project builder before calling Build().
Just for completeness:
// when using Google.Apis.CloudResourceManager.v3
public class Program
{
private static async Task OlderMethod(string oAuthToken)
{
using var service = new CloudResourceManagerService();
var id = Guid.NewGuid().ToString("N")[..8];
var project = new Google.Apis.CloudResourceManager.v3.Data.Project
{
DisplayName = $"Prog Created {id}",
ProjectId = $"prog-created-{id}",
};
var createRequest = service.Projects.Create(project);
createRequest.Credential = new OlderCredential(oAuthToken);
var operation = await createRequest.ExecuteAsync();
// ...
}
}
public class OlderCredential : IHttpExecuteInterceptor
{
private readonly string oAuthToken;
public OlderCredential(string oAuthToken) { this.oAuthToken = oAuthToken; }
public Task InterceptAsync(HttpRequestMessage request, CancellationToken cancellationToken)
{
request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", oAuthToken);
return Task.CompletedTask;
}
}
In the end the newer one is simpler: it just returns the token, with no need to modify the request itself:
// when using Google.Cloud.ResourceManager.V3
public class Program
{
private static async Task NewerMethod(string oAuthToken)
{
var client = await new ProjectsClientBuilder
{
Credential = new NewerCredential(oAuthToken),
}.BuildAsync();
var id = Guid.NewGuid().ToString("N")[..8];
var project = new Project
{
DisplayName = $"Prog Created New {id}",
ProjectId = $"prog-created-new-{id}",
};
var operation = await client.CreateProjectAsync(project);
}
}
public class NewerCredential : ICredential
{
private readonly string oAuthToken;
public NewerCredential(string oAuthToken) { this.oAuthToken = oAuthToken; }
public void Initialize(ConfigurableHttpClient httpClient) { }
public Task<string> GetAccessTokenForRequestAsync(string? authUri, CancellationToken cancellationToken) => Task.FromResult(oAuthToken);
}
I have written unit test cases that run against the Cosmos DB Emulator. (For those who don't know, the emulator is a local development Cosmos DB provided by Microsoft, generally used to test your queries.)
In my unit test cases I instantiate the emulator database and then run the tests. The problem occurs when I push these changes to my Azure DevOps pipeline, where the test cases fail with the error:
Target machine actively refused the connection.
This means it is not able to reach the emulator database. How can I fix this? Any ideas?
Here is the initial code for testing:
public class CosmosDataFixture : IDisposable
{
public static readonly string CosmosEndpoint = "https://localhost:8081";
public static readonly string EmulatorKey = "C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==";
public static readonly string DatabaseId = "testdb";
public static readonly string RecordingCollection = "testcolec";
public static string Root = Directory.GetParent( Directory.GetCurrentDirectory() ).Parent.Parent.FullName;
public static DocumentClient client { get; set; }
public async Task ReadConfigAsync()
{
// StartEmulatorDatabaseFromPowerShell();
client = new DocumentClient( new Uri( CosmosEndpoint ), EmulatorKey,
new ConnectionPolicy
{
ConnectionMode = ConnectionMode.Direct,
ConnectionProtocol = Protocol.Tcp
} );
await client.CreateDatabaseIfNotExistsAsync( new Database { Id = DatabaseId } );
await client.CreateDocumentCollectionIfNotExistsAsync( UriFactory.CreateDatabaseUri( DatabaseId ),
new DocumentCollection { Id = RecordingCollection } );
await ReadAllData( client );
}
public CosmosDataFixture()
{
ReadConfigAsync();
}
public void Dispose()
{
DeleteDatabaseFromPowerShell();// this is also defined in above class
}
}
public class CosmosDataTests : IClassFixture<CosmosDataFixture>
{ // my unit test cases go here
You need to add this task to your YAML pipeline:
- task: PowerShell@2
inputs:
targetType: 'inline'
script: |
Import-Module "$env:ProgramFiles\Azure Cosmos DB Emulator\PSModules\Microsoft.Azure.CosmosDB.Emulator"
Start-CosmosDbEmulator
And the Connection String for CosmosDB should be: AccountEndpoint=https://localhost:8081/;AccountKey=C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==
You can instantiate the Cosmos DB client like this:
var connectionString = "AccountEndpoint=https://localhost:8081/;AccountKey=C2y6yDjf5/R+ob0N8A7Cgv30VRDJIWEHLM+4QDU5DE2nQ9nDuVTqobD4b8mGGyPMbIZnqyMsEcaGQy67XIw/Jw==";
var client = new CosmosClient(connectionString);
var database = await client.CreateDatabaseIfNotExistsAsync("testdb");
var container = await database.Database.CreateContainerIfNotExistsAsync("testcolec", "/partitionKey");
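If the tests still fail immediately after the emulator is started, it may simply not be accepting connections yet. A small retry loop around the first call is a cheap safeguard; this is only a sketch, and the attempt count and delay are arbitrary:

// Retry the first (cheap) call until the emulator is reachable.
static async Task<CosmosClient> ConnectWithRetryAsync(string connectionString, int maxAttempts = 10)
{
    var client = new CosmosClient(connectionString);
    for (var attempt = 1; ; attempt++)
    {
        try
        {
            await client.ReadAccountAsync();   // fails while the emulator is still starting up
            return client;
        }
        catch (Exception) when (attempt < maxAttempts)
        {
            await Task.Delay(TimeSpan.FromSeconds(3));
        }
    }
}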
I'm building a bot with the Microsoft Bot Framework V4. The documentation is really awful, and I'm having problems with Cosmos DB (Azure) when I try to store the UserState and the ConversationState.
I've tried every result in Google but nothing has worked yet. Plus, there isn't much information about the framework.
Below is the code of the Startup.cs file.
public void ConfigureServices(IServiceCollection services)
{
services.AddBot<SeguritoBot>(options =>
{
var secretKey = Configuration.GetSection("botFileSecret")?.Value;
var botFilePath = Configuration.GetSection("botFilePath")?.Value;
// Loads .bot configuration file and adds a singleton that your Bot can access through dependency injection.
var botConfig = BotConfiguration.Load(botFilePath ?? @".\Segurito.bot", secretKey);
services.AddSingleton(sp => botConfig ?? throw new InvalidOperationException($"The .bot config file could not be loaded. ({botConfig})"));
// Retrieve current endpoint.
var environment = _isProduction ? "production" : "development";
var service = botConfig.Services.FirstOrDefault(s => s.Type == "endpoint" && s.Name == environment);
if (!(service is EndpointService endpointService))
{
throw new InvalidOperationException($"The .bot file does not contain an endpoint with name '{environment}'.");
}
options.CredentialProvider = new SimpleCredentialProvider(endpointService.AppId, endpointService.AppPassword);
// Creates a logger for the application to use.
ILogger logger = _loggerFactory.CreateLogger<SeguritoBot>();
// Catches any errors that occur during a conversation turn and logs them.
options.OnTurnError = async (context, exception) =>
{
logger.LogError($"Exception caught : {exception}");
await context.SendActivityAsync("Sorry, it looks like something went wrong.");
};
var optionsConversation = new CosmosDbStorageOptions()
{
CosmosDBEndpoint = new Uri("--secret--"),
AuthKey = "--secret--",
DatabaseId = "--secret--",
CollectionId = "--secret--"
};
var optionsUser = new CosmosDbStorageOptions()
{
CosmosDBEndpoint = new Uri("--secret--"),
AuthKey = "--secret--",
DatabaseId = "--secret--",
CollectionId = "--secret--"
};
IStorage dataStoreConversationState = new CosmosDbStorage(optionsConversation);
IStorage dataStoreUserState = new CosmosDbStorage(optionsUser);
options.Middleware.Add(new ConversationState<ConversationState>(dataStoreConversationState));
options.Middleware.Add(new UserState<UserState>(dataStoreUserState));
});
}
The last two lines give the error:
The non-generic type 'ConversationState' cannot be used with type arguments
The non-generic type 'ConversationState' cannot be used with type arguments
OK, I'm not sure where you got this code from, but it looks like it's from a pre-release build. ConversationState and UserState are no longer middleware and no longer generic (i.e. they don't have type arguments).
This is what a Startup::ConfigureServices should look like when using CosmosDB for state storage on a 4.x release build:
public class Startup
{
public void ConfigureServices(IServiceCollection services)
{
// Only need a single storage instance unless you really are storing your conversation state and user state in two completely separate DB instances
var storage = new CosmosDbStorage(new CosmosDbStorageOptions
{
// … set options here …
});
var conversationState = new ConversationState(storage);
var userState = new UserState(storage);
// Add the states as singletons
services.AddSingleton(conversationState);
services.AddSingleton(userState);
// Create state properties accessors and register them as singletons
services.AddSingleton(conversationState.CreateProperty<YourBotConversationState>("MyBotConversationState"));
services.AddSingleton(userState.CreateProperty<YourBotUserState>("MyBotUserState"));
services.AddBot<SeguritoBot>(options =>
{
// … set options here …
});
}
}
Now, in your bot, if you want to access those properties, you take them as dependencies via the constructor:
public class SeguritoBot : IBot
{
private readonly ConversationState _conversationState;
private readonly UserState _userState;
private readonly IStatePropertyAccessor<YourBotConversationState> _conversationStatePropertyAccessor;
private readonly IStatePropertyAccessor<YourBotUserState> _userStatePropertyAccessor;
public SeguritoBot(
ConversationState conversationState,
UserState userState,
IStatePropertyAccessor<YourBotConversationState> conversationStatePropertyAccessor,
IStatePropertyAccessor<YourBotUserState> userStatePropertyAccessor)
{
_conversationState = conversationState;
_userState = userState;
_conversationStatePropertyAccessor = conversationStatePropertyAccessor;
_userStatePropertyAccessor = userStatePropertyAccessor;
}
public async Task OnTurnAsync(ITurnContext turnContext, CancellationToken cancellationToken = default(CancellationToken))
{
var currentConversationState = await _conversationStatePropertyAccessor.GetAsync(
turnContext,
() => new YourBotConversationState(),
cancellationToken);
// Access properties for this conversation
// currentConversationState.SomeProperty
// Update your conversation state property
await _conversationStatePropertyAccessor.SetAsync(turnContext, currentConversationState, cancellationToken);
// Commit any/all changes to conversation state properties
await _conversationState.SaveChangesAsync(turnContext, cancellationToken: cancellationToken);
}
}
Obviously you can do the same with the user state property and you can support multiple properties per state scope with more calls to CreateProperty and injecting those IStatePropertyAccessor<T> as well, etc.
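For instance, a user-scoped property follows exactly the same pattern inside OnTurnAsync, using the _userStatePropertyAccessor and _userState injected above (the property touched in the comment is illustrative):

// Mirror of the conversation-state flow, using the injected user-state accessor.
var currentUserState = await _userStatePropertyAccessor.GetAsync(
    turnContext,
    () => new YourBotUserState(),
    cancellationToken);

// currentUserState.SomeUserSetting = ...;   // illustrative property

await _userStatePropertyAccessor.SetAsync(turnContext, currentUserState, cancellationToken);
await _userState.SaveChangesAsync(turnContext, cancellationToken: cancellationToken);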
I recently started using TestServer class to self-host and bootstrap an Aspnet Core API to run Integration Tests without a dedicated running environment.
I love the way it works, and by using a custom environment name I decided to control how the EF context is created, switching from SQL Server to an in-memory database.
The problem is that seeding the data needed to run the tests via API requests would be very expensive in both coding and running time.
My idea is to create a class or a simple framework to seed the data needed by each test, but to do so I need to use the same in-memory database that is initialized with the API stack by the TestServer.
How can I do that?
First, it's important to understand that, as a test replacement for a relational database such as SQL Server, the EF in-memory database provider is not ideal.
Among its various limitations, it does not support foreign key constraints.
A better approach is to use SQLite in in-memory mode.
Here is the code I used to set up the TestServer, seed the data, and register the DB context for dependency injection:
TestServer
public class ApiClient {
private HttpClient _client;
public ApiClient()
{
var webHostBuilder = new WebHostBuilder();
webHostBuilder.UseEnvironment("Test");
webHostBuilder.UseStartup<Startup>();
var server = new TestServer(webHostBuilder);
_client = server.CreateClient();
}
public async Task<HttpResponseMessage> PostAsync<T>(string url, T entity)
{
var content = new StringContent(JsonConvert.SerializeObject(entity), Encoding.UTF8, "application/json");
return await _client.PostAsync(url, content);
}
public async Task<T> GetAsync<T>(string url)
{
var response = await _client.GetAsync(url);
response.EnsureSuccessStatusCode();
var responseString = await response.Content.ReadAsStringAsync();
return JsonConvert.DeserializeObject<T>(responseString);
}
}
Data Seeding (Helper class)
public class TestDataConfiguration
{
public static IMyContext GetContext()
{
var serviceCollection = new ServiceCollection();
IocConfig.RegisterContext(serviceCollection, "", null);
var serviceProvider = serviceCollection.BuildServiceProvider();
return serviceProvider.GetService<IMyContext>();
}
}
Data Seeding (Test class)
[TestInitialize]
public void TestInitialize()
{
_client = new ApiClient();
var context = TestDataConfiguration.GetContext();
var category = new Category
{
Active = true,
Description = "D",
Name = "N"
};
context.Categories.Add(category);
context.SaveChanges();
var transaction = new Transaction
{
CategoryId = category.Id,
Credit = 1,
Description = "A",
Recorded = DateTime.Now
};
context.Transactions.Add(transaction);
context.SaveChanges();
}
DB Context registration (In IocConfig.cs)
public static void RegisterContext(IServiceCollection services, string connectionString, IHostingEnvironment hostingEnvironment)
{
if (connectionString == null)
throw new ArgumentNullException(nameof(connectionString));
if (services == null)
throw new ArgumentNullException(nameof(services));
services.AddDbContext<MyContext>(options =>
{
if (hostingEnvironment == null || hostingEnvironment.IsTesting())
{
var connection = new SqliteConnection("DataSource='file::memory:?cache=shared'");
connection.Open();
options.UseSqlite(connection);
options.UseLoggerFactory(MyLoggerFactory);
}
else
{
options.UseSqlServer(connectionString);
options.UseLoggerFactory(MyLoggerFactory);
}
});
if (hostingEnvironment == null || hostingEnvironment.IsTesting())
{
services.AddSingleton<IMyContext>(service =>
{
var context = service.GetService<MyContext>();
context.Database.EnsureCreated();
return context;
});
} else {
services.AddTransient<IMyContext>(service => service.GetService<MyContext>());
}
}
The key is the file URI used to create the SQLite connection:
var connection = new SqliteConnection("DataSource='file::memory:?cache=shared'");
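That connection string uses SQLite's shared-cache in-memory mode: every connection opened with the same file::memory:?cache=shared URI within one process sees the same database, and the data survives only while at least one such connection stays open. A minimal sketch of the behaviour, assuming Microsoft.Data.Sqlite:

// The shared in-memory database lives as long as keepAlive stays open.
var keepAlive = new SqliteConnection("DataSource='file::memory:?cache=shared'");
keepAlive.Open();

using (var other = new SqliteConnection("DataSource='file::memory:?cache=shared'"))
{
    other.Open();   // sees the same schema and data as keepAlive (and as the API's DbContext)
}
// Once every connection is closed, the database and its data are gone.

This appears to be why the registration above keeps IMyContext as a singleton in the Test environment: the long-lived context holds its connection open for the lifetime of the TestServer, so the seeding code and the API share the same data.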
I've encountered a weird issue with naming of queues, which are created in Azure Service Bus and used by MassTransit.
Uri is "sb://{#namespace}.servicebus.windows.net/{path}/{queueName}"
E.g. if a path equals to dev and queueName contains dev as a substring, e.g. devices than dev is removed and I see a queue with a name ices created.
Same happens if path = test as well.
I've have not found any reserved words for queue naming so I wonder if there are any.
That issue happens only while sending messages in ASP.NET Web Api process. For Azure Worker Role everything works fine.
The methods used are:
public static Uri BuildQueueUri(string @namespace, string path, string queueName)
{
return new Uri($"sb://{@namespace}.servicebus.windows.net/{path}/{queueName}");
}
protected Task<ISendEndpoint> EstablishSendEndpoint(string queueName)
{
Uri uri = BusConfiguration.BuildQueueUri(
Settings.GetSetting(ConfigKeys.ServiceBusNamespace),
Settings.GetSetting(ConfigKeys.ServiceBusPath),
queueName);
return BusControl.GetSendEndpoint(uri);
}
public async Task<IHttpActionResult> SendGetIpsToInterceptCommand()
{
ISendEndpoint endpoint = await EstablishSendEndpoint(BusConfiguration.SendCommandsQueue);
var guid = Guid.NewGuid();
await endpoint.Send<ICGetItemsToInterceptCommand>(new
{
CommandId = guid
});
return Ok(guid);
}
MassTransit configuration (using Autofac)
private static void RegisterMicroserviceBus(ContainerBuilder builder)
{
builder.Register(c =>
Bus.Factory.CreateUsingAzureServiceBus(sbc =>
{
var serviceUri = ServiceBusEnvironment.CreateServiceUri("sb",
Settings.GetSetting(ConfigKeys.ServiceBusNamespace),
Settings.GetSetting(ConfigKeys.ServiceBusPath));
sbc.Host(serviceUri, h =>
{
h.TokenProvider =
TokenProvider.CreateSharedAccessSignatureTokenProvider(
Settings.GetSetting(ConfigKeys.ServiceBusKeyName),
Settings.GetSetting(ConfigKeys.ServiceBusAccessKey),
TimeSpan.FromDays(365),
TokenScope.Namespace);
});
}))
.SingleInstance()
.As<IBusControl>()
.As<IBus>();
}