ASP.NET Web API test instance initialization - C#

I am writing integration tests for my web API, which means it has to be running during test execution. Is there any way to run it with an in-memory database instead of a real SQL Server one?
Also, I need to run a few instances at a time, so I need a way to make the base address of each instance unique. For example, I could append the instance IDs mentioned in the code below to the base URL.
Here is the code which I am using to run a new instance for my tests:
public static class WebApiHelper
{
    private const string ExecutableFileExtension = "exe";

    private static readonly Dictionary<Guid, Process> _instances = new();

    public static void EnsureIsRunning(Assembly? assembly, Guid instanceId)
    {
        if (assembly is null)
            throw new ArgumentNullException(nameof(assembly));

        // The API executable sits next to the assembly, with an .exe extension.
        var executableFullName = Path.ChangeExtension(
            assembly.Location, ExecutableFileExtension);

        _instances.Add(instanceId, Process.Start(executableFullName));
    }

    public static void EnsureIsNotRunning(Guid instanceId)
        => _instances[instanceId].Kill();
}
Generally speaking, is this a good way to create test instances, or am I missing something? I'm asking because maybe there is another 'legal' way to achieve my goal.
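(A possible approach to the unique base address part - a sketch, not the original code: ASP.NET Core applications honor the --urls command-line switch, so each spawned instance can be handed its own URL. Deriving the port from a counter is an illustrative assumption.)
private static int _nextPort = 5001; // assumption: sequential ports are free

public static Uri EnsureIsRunning(Assembly assembly, Guid instanceId)
{
    var executableFullName = Path.ChangeExtension(assembly.Location, "exe");
    var baseAddress = new Uri($"http://localhost:{_nextPort++}/");

    // ASP.NET Core reads --urls and binds to the given address.
    var process = Process.Start(executableFullName, $"--urls {baseAddress}");
    _instances.Add(instanceId, process);
    return baseAddress;
}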

Okay, so in the end I came up with this super easy and obvious solution.
As was mentioned in the comments, using the in-memory database is not the best way to test, because it does not support the relational features you rely on when using MS SQL.
So I decided to go another way.
Step 1: Overwrite the connection strings.
In my case that was easy, since I have a static IConfiguration instance and just needed to overwrite the connection strings within it.
The method looks as follows:
private const string ConnectionStringsSectionName = "ConnectionStrings";
private const string TestConnectionStringFormat = "{0}_Test";

private static bool _connectionStringsOverwritten;

private static void OverwriteConnectionStrings()
{
    if (_connectionStringsOverwritten)
        return;

    // Materialize the list first: the loop below mutates the configuration
    // that is being enumerated.
    var connectionStrings = MyStaticConfigurationContainer.Configuration
        .AsEnumerable()
        .Where(entry => entry.Key.StartsWith(ConnectionStringsSectionName)
            && entry.Value is not null)
        .ToList();

    foreach (var connectionString in connectionStrings)
    {
        // Point each connection string at a "<Database>_Test" catalog instead.
        var builder = new SqlConnectionStringBuilder(connectionString.Value);
        builder.InitialCatalog = string.Format(TestConnectionStringFormat,
            builder.InitialCatalog);
        MyStaticConfigurationContainer.Configuration[connectionString.Key]
            = builder.ConnectionString;
    }

    _connectionStringsOverwritten = true;
}
Of course, you would need to handle database creation and deletion before and after running the tests; otherwise your test databases may become a mess.
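(A hedged sketch of what that setup and teardown could look like with NUnit; the EF Core context name TestDbContext is an assumption, not from the original post.)
[SetUpFixture]
public class TestDatabaseFixture
{
    [OneTimeSetUp]
    public void CreateDatabase()
    {
        OverwriteConnectionStrings();
        using var context = new TestDbContext(); // hypothetical EF Core context
        context.Database.EnsureCreated();        // create the *_Test database
    }

    [OneTimeTearDown]
    public void DropDatabase()
    {
        using var context = new TestDbContext();
        context.Database.EnsureDeleted();        // drop it so runs stay clean
    }
}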
Step 2: Simply run your web API instance within a separate thread.
In my case, I am using the NUnit test framework, which means I just need to implement the web API setup logic within the fixture. Basically, the process would be more or less the same for every testing framework.
The code looks as follows:
[SetUpFixture]
public class WebApiSetupFixture
{
    private const string WebApiThreadName = "WebApi";

    [OneTimeSetUp]
    public void SetUp() => new Thread(RunWebApi)
    {
        Name = WebApiThreadName
    }.Start();

    private static void RunWebApi()
        => Program.Main(Array.Empty<string>());

    // 'Program' - your main web app class with entry point.
}
Note: the code inside Program.Main() will also look for connection strings in MyStaticConfigurationContainer.Configuration, which was changed in the previous step.
And that's it! Hope this could help somebody else :)

Related

Console.SetOut does not work with ConsoleLogger

In my ASP.NET application, I'm using ConsoleLogger when debugging.
services.AddLogging(builder => builder.AddConsole());
In integration tests, I'm trying to obtain console output like this
var strWriter = new StringWriter();
Console.SetOut(strWriter);
// some code that has a lot of logging
var consoleOutput = strWriter.ToString();
and consoleOutput is an empty string.
Is that issue with ConsoleLogger? How can I obtain console output?
I'm not sure why you want to do this, and moreover I would say you should not do it, at least not this way. But if you still want to, you will need to call Console.SetOut before the first messages are written to the log. In the case of ASP.NET Core it should look something like this:
public static StringWriter GlobalStringWriter = new StringWriter(); // use this "singleton"

public static void Main(string[] args)
{
    Console.SetOut(GlobalStringWriter);
    CreateHostBuilder(args).Build().Run();
}
Note that it will capture all concurrent requests (as it should do anyway).
If you take a look at the source code for ConsoleLoggerProvider (which is what the AddConsole call actually sets up), you will see that it internally uses AnsiLogConsole/AnsiParsingLogConsole, both of which capture the current value of System.Console.Error/System.Console.Out in their constructors:
public AnsiParsingLogConsole(bool stdErr = false)
{
    _textWriter = stdErr ? System.Console.Error : System.Console.Out;
    _parser = new AnsiParser(WriteToConsole);
}

public AnsiLogConsole(bool stdErr = false)
{
    _textWriter = stdErr ? System.Console.Error : System.Console.Out;
}
So setting the string writer after this capture happened does not actually change where logs are written.
I would say this is not a very good approach; you would likely be better off implementing your own logger provider to handle this use case, or taking an existing one and setting it up in debug mode to write to a file.
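(A minimal sketch of such a custom provider, assuming the classic Microsoft.Extensions.Logging abstractions; newer versions add nullable annotations and a notnull constraint on TState, so adjust the signatures to your version.)
public sealed class StringWriterLoggerProvider : ILoggerProvider
{
    private readonly StringWriter _writer;

    public StringWriterLoggerProvider(StringWriter writer) => _writer = writer;

    public ILogger CreateLogger(string categoryName)
        => new StringWriterLogger(categoryName, _writer);

    public void Dispose() { }

    private sealed class StringWriterLogger : ILogger
    {
        private readonly string _category;
        private readonly StringWriter _writer;

        public StringWriterLogger(string category, StringWriter writer)
        {
            _category = category;
            _writer = writer;
        }

        public IDisposable BeginScope<TState>(TState state) => null;

        public bool IsEnabled(LogLevel logLevel) => true;

        public void Log<TState>(LogLevel logLevel, EventId eventId, TState state,
            Exception exception, Func<TState, Exception, string> formatter)
        {
            // StringWriter is not thread-safe, so synchronize writes.
            lock (_writer)
                _writer.WriteLine($"{logLevel}: {_category}: {formatter(state, exception)}");
        }
    }
}
Registration then replaces AddConsole in the test host, e.g. services.AddLogging(b => b.AddProvider(new StringWriterLoggerProvider(myStringWriter)));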

Force DBUP to rerun new scripts during development

We're using DbUp to handle DB migrations. Each release, we would like to run the DbUp console app with a command-line switch so that during dev we can re-run our scripts while we're working on them; however, we don't want it to re-run all the previous releases' scripts, which already appear in the database. How can this be achieved?
We added a '-debug' command-line switch to our DbUp console application. If this is present, we switch which Journal class is used when talking to the database.
The Journal class (https://dbup.readthedocs.io/en/latest/more-info/journaling/) in DbUp is the class that interacts with the database to check and record which scripts have already been run (stored by default in the SchemaVersions table). For dev, we force DbUp to use a read-only version of the journal, which can check which scripts are already present (to prevent you re-running everything each time) but prevents new records from being recorded, so that next time it will attempt to re-run your new scripts again.
The read-only journal looks like this:
public class ReadOnlyJournal : IJournal
{
    private readonly IJournal _innerJournal;

    public ReadOnlyJournal(IJournal innerJournal)
    {
        _innerJournal = innerJournal;
    }

    public void EnsureTableExistsAndIsLatestVersion(Func<IDbCommand> dbCommandFactory)
    {
        _innerJournal.EnsureTableExistsAndIsLatestVersion(dbCommandFactory);
    }

    public string[] GetExecutedScripts()
    {
        return _innerJournal.GetExecutedScripts().ToArray();
    }

    public void StoreExecutedScript(SqlScript script, Func<IDbCommand> dbCommandFactory)
    {
        // Deliberately don't store anything, so the script will re-run next time.
    }
}
Then an extension method makes the new journal easier to plug in:
public static class DbUpHelper
{
    public static UpgradeEngineBuilder WithReadOnlyJournal(
        this UpgradeEngineBuilder builder, string schema, string table)
    {
        builder.Configure(c => c.Journal = new ReadOnlyJournal(
            new SqlTableJournal(() => c.ConnectionManager, () => c.Log, schema, table)));
        return builder;
    }
}
And then, finally, the change to your DbUp console app:
var upgrader = debug
    ? DeployChanges.To
        .SqlDatabase(connectionString)
        .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
        .WithReadOnlyJournal("dbo", "SchemaVersions")
        .LogToConsole()
        .Build()
    : DeployChanges.To
        .SqlDatabase(connectionString)
        .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
        .LogToConsole()
        .Build();

var result = upgrader.PerformUpgrade();
if (!result.Successful)
    ....

Is it possible to use fluent migrator in application_start?

I'm using fluent migrator to manage my database migrations, but what I'd like to do is have the migrations run at app start. The closest I have managed is this:
public static void MigrateToLatest(string connectionString)
{
    using (var announcer = new TextWriterAnnouncer(Console.Out)
    {
        ShowElapsedTime = true,
        ShowSql = true
    })
    {
        var assembly = typeof(Runner).Assembly.GetName().Name;
        var migrationContext = new RunnerContext(announcer)
        {
            Connection = connectionString,
            Database = "SqlServer2008",
            Target = assembly
        };
        var executor = new TaskExecutor(migrationContext);
        executor.Execute();
    }
}
I'm sure I had this working, but I've not looked at it for some time (hobby project) and it's now throwing null reference exceptions when it gets to the Execute line. Sadly there are no docs for this and I've been banging my head on it for ages.
Has anyone managed to get this kind of thing working with FluentMigrator?
PM> Install-Package FluentMigrator.Tools
Manually add a reference to:
packages\FluentMigrator.Tools.1.6.1\tools\AnyCPU\40\FluentMigrator.Runner.dll
Note that the folder name will vary with the version number; this illustration uses the current 1.6.1 release. If you need the .NET 3.5 runner, use the \35\ directory.
public static class Runner
{
    public class MigrationOptions : IMigrationProcessorOptions
    {
        public bool PreviewOnly { get; set; }
        public string ProviderSwitches { get; set; }
        public int Timeout { get; set; }
    }

    public static void MigrateToLatest(string connectionString)
    {
        // var announcer = new NullAnnouncer();
        var announcer = new TextWriterAnnouncer(s => System.Diagnostics.Debug.WriteLine(s));
        var assembly = Assembly.GetExecutingAssembly();

        var migrationContext = new RunnerContext(announcer)
        {
            Namespace = "MyApp.Sql.Migrations"
        };

        var options = new MigrationOptions { PreviewOnly = false, Timeout = 60 };
        var factory =
            new FluentMigrator.Runner.Processors.SqlServer.SqlServer2008ProcessorFactory();

        using (var processor = factory.Create(connectionString, announcer, options))
        {
            var runner = new MigrationRunner(assembly, migrationContext, processor);
            runner.MigrateUp(true);
        }
    }
}
Note the SqlServer2008ProcessorFactory; this is configurable depending on your database. There is support for 2000, 2005, 2008, 2012, and 2014.
I have actually accomplished running migrations in Application_Start; however, it is hard to tell from that code what could be wrong. Since it is open source, I would just grab the code, pull it into your solution, and build it to find out what the Execute method is complaining about. I found that the source code for FluentMigrator is organized pretty well.
One thing you might have to be concerned about, if this is a web app, is making sure that no one uses the database while you are migrating. I used a strategy of establishing a connection, setting the database to single-user mode, running the migrations, setting the database back to multi-user mode, then closing the connection. This also handles the scenario of a load-balanced web application on multiple servers, so two servers don't try to run migrations against the same database.
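(A hedged sketch of that single-user strategy, assuming SQL Server; "MyDb" and RunMigrationsOn are placeholders. Caveat: in SINGLE_USER mode only one connection is allowed, so the migration runner must reuse this same connection, or you may prefer RESTRICTED_USER instead.)
using (var connection = new SqlConnection(connectionString))
{
    connection.Open();
    using (var command = connection.CreateCommand())
    {
        // Block other connections while migrations run.
        command.CommandText =
            "ALTER DATABASE [MyDb] SET SINGLE_USER WITH ROLLBACK IMMEDIATE";
        command.ExecuteNonQuery();
        try
        {
            RunMigrationsOn(connection); // hypothetical: run migrations on this connection
        }
        finally
        {
            command.CommandText = "ALTER DATABASE [MyDb] SET MULTI_USER";
            command.ExecuteNonQuery();
        }
    }
}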

Store the cache data locally

I'm developing a C# WinForms application; it is a client that connects to a web service to get data. The data returned by the web service is a DataTable, which the client displays in a DataGridView.
My problem is that the client takes a long time to get all the data from the server (the web service is not local to the client), so I have to use a thread to fetch the data. This is my model:
Client creates a thread to get data -> thread completes and sends an event to the client -> client displays the data in a DataGridView on a form.
However, when the user closes the form and opens it again later, the client has to fetch the data all over again, which makes the client slow.
So, I'm thinking about cached data:
Client <---get/add/edit/delete---> Cached Data ---get/add/edit/delete---> Server (web service)
Please give me some suggestions. For example: should the cache live in a separate application on the same host as the client, or run inside the client itself?
Please suggest some techniques to implement this solution, and share examples if you have any.
Thanks.
UPDATE: Hello everyone, maybe I made my problem sound bigger than it is. I only want to cache data for the client's lifetime. I think the cached data should be stored in memory, and when the client wants data, it should check the cache first.
If you're using C# 2.0 and you're prepared to ship System.Web as a dependency, then you can use the ASP.NET cache:
using System.Web;
using System.Web.Caching;

object cachedObject;
object webServiceResult;

// Note: outside of a web request HttpContext.Current is null, so in a
// WinForms client get the cache from HttpRuntime directly.
Cache webCache = HttpRuntime.Cache;

// See if there's a cached item already
cachedObject = webCache.Get("MyCacheItem");

if (cachedObject == null)
{
    // If there's nothing in the cache, call the web service to get a new item
    webServiceResult = new Object();

    // Cache the web service result for five minutes
    webCache.Add("MyCacheItem", webServiceResult, null, DateTime.Now.AddMinutes(5),
        Cache.NoSlidingExpiration, System.Web.Caching.CacheItemPriority.Normal, null);
}
else
{
    // Item already in the cache - cast it to the right type
    webServiceResult = (object)cachedObject;
}
If you're not prepared to ship System.Web, then you might want to take a look at the Enterprise Library Caching block.
If you're on .NET 4.0, however, caching has been pushed into the System.Runtime.Caching namespace. To use this, you'll need to add a reference to System.Runtime.Caching, and then your code will look something like this:
using System.Runtime.Caching;

MemoryCache cache;
object cachedObject;
object webServiceResult;

cache = new MemoryCache("StackOverflow");
cachedObject = cache.Get("MyCacheItem");

if (cachedObject == null)
{
    // Call the web service
    webServiceResult = new Object();
    cache.Add("MyCacheItem", webServiceResult, DateTime.Now.AddMinutes(5));
}
else
{
    webServiceResult = (object)cachedObject;
}
All these caches run in-process to the client. Because your data is coming from a web service, as Adam says, you're going to have difficulty determining the freshness of the data - you'll have to make a judgement call on how often the data changes and how long you cache the data for.
Do you have the ability to make changes or additions to the web service?
If you can, Sync Services may be an option for you. You can define which tables are synchronised, and all the sync plumbing is managed for you.
Check out http://msdn.microsoft.com/en-us/sync/default.aspx and shout if you need more information.
You might try the Enterprise Library's Caching Application Block. It's easy to use, stores in memory and, if you ever need to later, it supports adding a backup location for persisting beyond the life of the application (such as to a database, isolated storage, file, etc.) and even encryption too.
Use EntLib 3.1 if you're stuck with .NET 2.0. There's not much new (for caching, at least) in the newer EntLibs aside from better customization support.
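(A hedged sketch of what using the Caching Application Block looks like, based on the EntLib CacheFactory/ICacheManager API as I recall it; treat the exact calls as an assumption, and CallWebService is a placeholder for your data fetch.)
using Microsoft.Practices.EnterpriseLibrary.Caching;
using Microsoft.Practices.EnterpriseLibrary.Caching.Expirations;

// Resolve the cache manager configured in app.config.
ICacheManager cacheManager = CacheFactory.GetCacheManager();

DataTable table = cacheManager.GetData("MyCacheItem") as DataTable;
if (table == null)
{
    table = CallWebService(); // hypothetical data fetch

    // Keep the entry for five minutes.
    cacheManager.Add("MyCacheItem", table, CacheItemPriority.Normal, null,
        new AbsoluteTime(TimeSpan.FromMinutes(5)));
}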
Identify which objects you would like to serialize, and cache to isolated storage. Specify the level of data isolation you would like (application level, user level, etc).
Example:
You could create a generic serializer; a very basic sample would look like this:
public class SampleDataSerializer
{
    public static void Deserialize<T>(out T data, Stream stm)
    {
        var xs = new XmlSerializer(typeof(T));
        data = (T)xs.Deserialize(stm);
    }

    public static void Serialize<T>(T data, Stream stm)
    {
        var xs = new XmlSerializer(typeof(T));
        xs.Serialize(stm, data);
    }
}
Note that you probably should put in some overloads to the Serialize and Deserialize methods to accommodate readers, or any other types you are actually using in your app (e.g., XmlDocuments, etc).
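(For instance, an overload for a TextReader might look like this; a sketch, not part of the original answer.)
// Possible overload accepting a TextReader instead of a raw Stream.
public static void Deserialize<T>(out T data, TextReader reader)
{
    var xs = new XmlSerializer(typeof(T));
    data = (T)xs.Deserialize(reader);
}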
The operation to save to IsolatedStorage can be handled by a utility class (example below):
public class SampleIsolatedStorageManager : IDisposable
{
    private string filename;
    private string directoryname;
    IsolatedStorageFile isf;

    public SampleIsolatedStorageManager()
    {
        filename = string.Empty;
        directoryname = string.Empty;

        // create an ISF scoped to domain user...
        isf = IsolatedStorageFile.GetStore(IsolatedStorageScope.User |
            IsolatedStorageScope.Assembly | IsolatedStorageScope.Domain,
            typeof(System.Security.Policy.Url), typeof(System.Security.Policy.Url));
    }

    public void Save<T>(T parm)
    {
        using (IsolatedStorageFileStream stm = GetStreamByStoredType<T>(FileMode.Create))
        {
            SampleDataSerializer.Serialize<T>(parm, stm);
        }
    }

    public T Restore<T>() where T : new()
    {
        try
        {
            // Only try to deserialize if a cache file for this type exists.
            if (isf.GetFileNames(GetFileNameByType<T>()).Length > 0)
            {
                T result = new T();
                using (IsolatedStorageFileStream stm = GetStreamByStoredType<T>(FileMode.Open))
                {
                    SampleDataSerializer.Deserialize<T>(out result, stm);
                }
                return result;
            }
            else
            {
                return default(T);
            }
        }
        catch
        {
            // If anything goes wrong, drop the (possibly corrupt) cache file.
            try
            {
                Clear<T>();
            }
            catch
            {
            }
            return default(T);
        }
    }

    public void Clear<T>()
    {
        if (isf.GetFileNames(GetFileNameByType<T>()).Length > 0)
        {
            isf.DeleteFile(GetFileNameByType<T>());
        }
    }

    private string GetFileNameByType<T>()
    {
        return typeof(T).Name + ".cache";
    }

    private IsolatedStorageFileStream GetStreamByStoredType<T>(FileMode mode)
    {
        var stm = new IsolatedStorageFileStream(GetFileNameByType<T>(), mode, isf);
        return stm;
    }

    #region IDisposable Members

    public void Dispose()
    {
        isf.Close();
    }

    #endregion
}
Finally, remember to add the following using clauses:
using System.IO;
using System.IO.IsolatedStorage;
using System.Xml.Serialization;
The actual code to use the classes above could look like this:
var myClass = new MyClass();
myClass.name = "something";

using (var mgr = new SampleIsolatedStorageManager())
{
    mgr.Save<MyClass>(myClass);
}
This will save the specified instance to isolated storage. To retrieve the instance, simply call:
using (var mgr = new SampleIsolatedStorageManager())
{
    var restored = mgr.Restore<MyClass>(); // assign the result to use it
}
Note: the sample I've provided only supports one serialized instance per type. I'm not sure if you need more than that. Make whatever modifications you need to support further functionalities.
HTH!
You can serialise the DataTable to file:
http://forums.asp.net/t/1441971.aspx
Your only concern then is deciding when the cache has gone stale. Perhaps timestamp the file?
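(A minimal sketch of that approach; the file name and the five-minute window are illustrative choices, not from the linked post.)
using System;
using System.Data;
using System.IO;

public static class DataTableCache
{
    public static void Save(DataTable table)
        => table.WriteXml("cache.xml", XmlWriteMode.WriteSchema); // schema + rows

    public static DataTable TryLoad()
    {
        if (!File.Exists("cache.xml") ||
            DateTime.Now - File.GetLastWriteTime("cache.xml") > TimeSpan.FromMinutes(5))
            return null; // stale or missing - caller should hit the web service

        var table = new DataTable();
        table.ReadXml("cache.xml");
        return table;
    }
}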
In our implementation every row in the database has a last-updated timestamp. Every time our client application accesses a table we select the latest last-updated timestamp from the cache and send that value to the server. The server responds with all the rows that have newer timestamps.
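(Sketched in code, with illustrative names; the "LastUpdated" column and the GetRowsUpdatedSince service method are assumptions, not a real API.)
// Find the newest LastUpdated value we already have locally.
DateTime newest = cachedTable.AsEnumerable()
    .Select(r => r.Field<DateTime>("LastUpdated"))
    .DefaultIfEmpty(DateTime.MinValue)
    .Max();

// Ask the server only for rows changed after that point...
DataTable delta = service.GetRowsUpdatedSince(newest); // hypothetical call

// ...and fold them into the local cache.
cachedTable.Merge(delta);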

Silverlight 2 - Adding database records using WCF

I'm creating a simple Silverlight 2 application - a guestbook. I'm using MSSQL as the data source; I've managed to load the data, but I can't find out how to add new rows (messages) to the database.
I crawled all over the internet but didn't find any working solution. The SCMessages table has four columns - MessageID, MessageDate, MessageAuthor and MessageText. Right now I have the following code in the Service1 class (which implements the IService1 interface) (not working though):
public void SaveMessage(SCMessage message)
{
    DataClasses1DataContext db = new DataClasses1DataContext();
    db.GetTable<SCMessage>().Attach(message);
    db.SubmitChanges();
}
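(For what it's worth: in LINQ to SQL, Attach is intended for entities that already exist in the database, while new rows are added with InsertOnSubmit, so the insert likely needs to look like this - a probable fix, untested here.)
public void SaveMessage(SCMessage message)
{
    DataClasses1DataContext db = new DataClasses1DataContext();
    db.GetTable<SCMessage>().InsertOnSubmit(message); // insert a new row
    db.SubmitChanges();
}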
In the main class I'm simply calling this method:
private void SendBtn_Click(object sender, RoutedEventArgs e)
{
    SCMessage sm = new SCMessage
    {
        MessageAuthor = NameTB.Text,
        MessageDate = DateTime.Now,
        MessageText = TextTB.Text
    };
    newMessages.Add(sm);

    ServiceReference1.Service1Client client = new Service1Client();
    client.SaveMessageAsync(sm);
}
Could anybody help me? Thanks for any suggestions!
I'm not sure I completely understand the context (like whether you control your WCF service and/or your DB). But did you consider ADO.NET Data Services (also known as Astoria)?
(http://msdn.microsoft.com/en-us/library/cc668792.aspx)
Then you don't need to create a web service for it; it is already created for you.
Basically it is an easy way to access your data from within Silverlight, and you can even run queries from within Silverlight.
There is already a bit of documentation in blogs, for example:
A quickstart is here: http://michaelsync.net/2008/01/15/consuming-adonet-data-service-astoria-from-silverlight
How to update data can be seen here: http://michaelsync.net/2008/02/10/crud-operations-in-siverlight-using-adonet-data-service
A complete working example is here: http://www.silverlightdata.com/
Note that in a lot of examples on the web the Silverlight proxy is generated using the command line; that is not needed anymore - you can do it directly from within VS by using "Add Service Reference" on your project and pointing it to your ADO.NET Data Service.
Hope this helps a bit?
Tjipke
Is SCMessage decorated with the [DataContract] attribute or is it [Serializable]? Please provide us with the definition of SCMessage.
SCMessage is the name of a data class - I created a file from the "LINQ to SQL Classes" template (.dbml) and drag-and-dropped the SCMessages table onto the designer. It's decorated with the [DataContract] attribute, and I set its DataContext's Serialization Mode property to Unidirectional. The content of the SCMessage class is auto-generated, but here is at least a part of it:
[Table(Name="dbo.SCMessages")]
[DataContract()]
public partial class SCMessage : INotifyPropertyChanging, INotifyPropertyChanged
{
    private static PropertyChangingEventArgs emptyChangingEventArgs = new PropertyChangingEventArgs(String.Empty);

    private int _MessageID;
    private string _MessageAuthor;
    private string _MessageText;
    private System.DateTime _MessageDate;

    #region Extensibility Method Definitions
    partial void OnLoaded();
    partial void OnValidate(System.Data.Linq.ChangeAction action);
    partial void OnCreated();
    partial void OnMessageIDChanging(int value);
    partial void OnMessageIDChanged();
    partial void OnMessageAuthorChanging(string value);
    partial void OnMessageAuthorChanged();
    partial void OnMessageTextChanging(string value);
    partial void OnMessageTextChanged();
    partial void OnMessageDateChanging(System.DateTime value);
    partial void OnMessageDateChanged();
    #endregion

    public SCMessage()
    {
        this.Initialize();
    }

    [Column(Storage="_MessageID", AutoSync=AutoSync.OnInsert, DbType="Int NOT NULL IDENTITY", IsPrimaryKey=true, IsDbGenerated=true)]
    [DataMember(Order=1)]
    public int MessageID
    {
        get
        {
            return this._MessageID;
        }
        set
        {
            if ((this._MessageID != value))
            {
                this.OnMessageIDChanging(value);
                this.SendPropertyChanging();
                this._MessageID = value;
                this.SendPropertyChanged("MessageID");
                this.OnMessageIDChanged();
            }
        }
    }
And here's a little problem with Astoria - it doesn't work for me. I followed Michael Sync's tutorial and made a few modifications, such as using DataServiceQuery because WebDataQuery doesn't exist in the final version of Astoria, etc. I ended up with two code snippets - the first is an almost identical copy of the one in Michael Sync's article, and the second one uses a LINQ query instead of the CreateQuery method (I think both approaches lead to the same end). Here are the snippets:
SilverchatDBEntities entity =
    new SilverchatDBEntities(new Uri("http://localhost:65373/WebDataService1.svc", UriKind.Absolute));
entity.MergeOption = MergeOption.OverwriteChanges;

DataServiceQuery<SCMessages> messages = entity.CreateQuery<SCMessages>("SCMessages");
messages.BeginExecute(
    result =>
    {
        var mess = messages.EndExecute(result);
        foreach (var mes in mess)
        {
            MessagesLB.Items.Add(mes.MessageAuthor);
        }
    },
    null);
This doesn't do anything - it doesn't throw any exception, but it doesn't load any SCMessages either. The second snippet is as follows:
SilverchatDBEntities entity =
    new SilverchatDBEntities(new Uri("http://localhost:65373/WebDataService1.svc", UriKind.Absolute));

var query = (DataServiceQuery<SCMessages>)from m in entity.SCMessages select m;
query.BeginExecute(new AsyncCallback(result =>
{
    try
    {
        var mes = query.EndExecute(result);
        foreach (var r in mes)
        {
            MessagesLB.Items.Add(string.Format("{0}; {1} - {2}",
                r.MessageDate.ToString("d/M hh:mm", CultureInfo.InvariantCulture),
                r.MessageAuthor,
                r.MessageText));
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show(ex.Message);
    }
}), null);
This one throws an exception in the 'foreach' loop - 'Object reference not set to an instance of an object'. I have no idea what the problem could be.
Chrasty,
I don't see any obvious errors yet, so in theory it should work (I currently work almost daily with these kinds of queries). A few questions:
1. Can you use Fiddler2 to see what is going over the wire? (If you don't know what Fiddler is, then google it :-) If you're running Fiddler against localhost, add a '.' after "localhost" in the URL in the browser (like http://localhost.:1234/mywebsitehostingsilverlight.aspx) -> prevents another Google search.)
2. Do you have a stack trace of the second one throwing an exception?
3. Did you try putting a breakpoint in your callback (first example) to see if it is called and with what result?
Hope these questions help a bit in solving the problem.
