Blog

Useful tidbits related to software development that I think might be of use or interest to everyone else (or to me when I forget what I did!)

Cloning a Visual Studio Project With A New Name

July 21, 2021

Often when I'm working on solutions I have a lot of projects that all start from pretty much the same place as another. For example, "unit test project targeting .NET 5, with FluentAssertions, Moq and AutoFixture" - but also sometimes projects with similar class structures too, like a "repository that talks to MongoDB, with an ISomethingConfig" or a "Lambda with DI container". Of course I could create Visual Studio templates for these, but templates often fall out of date, or I'd still end up needing to copy in some specific classes from a recently used project that I want to replicate. In a lot of cases I found myself copying and pasting an entire project folder, renaming the files accordingly, then going through the csproj/cs files to replace the namespace and/or class names to target my "new" project. To automate this I wrote a Bash script, which might not be the most elegant or robust script in the world, but it does the job so I thought I'd share it here:
# Script to create a new Visual Studio project folder from an existing one
# Usage (source it from your solution directory): . ./csprojclone.sh Your.Namespaced.Source Your.Namespaced.Destination

if [ $# -lt 2 ]
	then
		echo "Please pass source and destination namespace arguments"
		return 1
fi
source=${1%%/}
dest=${2%%/}

if [[ ! -d $source ]]
	then
		echo "Source directory does not exist or isn't a folder - make sure you are in the correct working directory"
		return 1
fi

if [[ -e $dest ]]
        then
                echo "Destination folder already exists in the working directory"
                return 1
fi

oldproject=${source%%.UnitTests}
oldproject=${oldproject##*.}
newproject=${dest%%.UnitTests}
newproject=${newproject##*.}

cp -r $source $dest
rm -r $dest/bin/
rm -r $dest/obj/
find $dest -iname "*$source*" -exec rename -v "s/$source/$dest/" {} \;
find $dest -iname "*$oldproject*" -exec rename -v "s/$oldproject/$newproject/" {} \;
find $dest -type f -print0 | xargs -0 sed -i "s/$source/$dest/g"
find $dest -type f -print0 | xargs -0 sed -i "s/$oldproject/$newproject/g"
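The project-name derivation above leans on Bash parameter expansion: "%%pattern" strips a matching suffix, while "##pattern" strips the longest matching prefix. A quick illustration of what those expansions produce for a typical namespaced test-project folder:

```shell
# Illustration of the parameter expansions used in the script above.
source="My.Namespace.UserRepository.UnitTests"

oldproject=${source%%.UnitTests}   # strip the ".UnitTests" suffix
echo "$oldproject"                 # -> My.Namespace.UserRepository

oldproject=${oldproject##*.}       # keep only the text after the last dot
echo "$oldproject"                 # -> UserRepository
```

Because the pattern after "%%" contains no wildcard here, it simply removes that exact suffix when present and leaves the value untouched otherwise.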
As per the script, the usage is to call the script from within your solution directory, passing in the name of the existing project folder and then a destination one. The script will clone the source, rename the files, search and replace the namespace change, and update any classes that had the specific project name in them. If you're using it on a unit test project, it will strip off ".UnitTests" from the path - so if that's not your naming convention then feel free to edit those bits. Here's an example of how it might work: PWD = /path/to/SolutionFolder
    My.Namespace.UserRepository 
    My.Namespace.UserRepository/My.Namespace.UserRepository.csproj
    My.Namespace.UserRepository/MongoStuff/...
    My.Namespace.UserRepository/IUserRepositoryConfig.cs
    etc.
. /path/to/script/csprojclone.sh My.Namespace.UserRepository My.Namespace.OrderRepository
Will create:
    My.Namespace.OrderRepository
    My.Namespace.OrderRepository/My.Namespace.OrderRepository.csproj
    My.Namespace.OrderRepository/MongoStuff/...
    My.Namespace.OrderRepository/IOrderRepositoryConfig.cs
With all namespaces also updated.

Easy Way To Test All Constructor Null Argument Checks

February 09, 2021

Often C# classes will have several dependencies passed into the constructor, and some (maybe all) of these will be mandatory for the class to function. In this case, ordinarily you'd add a null argument check in the ctor and, if something is null, throw an "ArgumentNullException". This is pretty boilerplate and is usually auto-generated code, but it still needs to be tested in your unit tests to assert that indeed all mandatory dependencies have been checked (and conversely that all optional dependencies can indeed be null). This can be quite tedious and repetitive, and later changing the signature of the ctor can result in many tests that require fixing up. To make these tests smaller and more concise I've come up with a new strategy using test cases and nullable mocks, as follows:
// SomeClass.cs
public class SomeClass
{
	private readonly IDependency1 _dependency1;
	private readonly IDependency2 _dependency2;
	private readonly IDependency3 _dependency3;

	public SomeClass(IDependency1 dependency1, IDependency2 dependency2, IDependency3 dependency3)
	{
		_dependency1 = dependency1 ?? throw new ArgumentNullException(nameof(dependency1));
		_dependency2 = dependency2 ?? throw new ArgumentNullException(nameof(dependency2));
		_dependency3 = dependency3 ?? throw new ArgumentNullException(nameof(dependency3));
	}
}

// SomeClass.tests.cs
[TestFixture]
public class SomeClassTests
{
	private Mock<IDependency1> _dependency1Mock;
	private Mock<IDependency2> _dependency2Mock;
	private Mock<IDependency3> _dependency3Mock;

	[SetUp]
	public void SetUp()
	{
		_dependency1Mock = new Mock<IDependency1>();
		_dependency2Mock = new Mock<IDependency2>();
		_dependency3Mock = new Mock<IDependency3>();
	}

	[TestCase("dependency1")]
	[TestCase("dependency2")]
	[TestCase("dependency3")]
	public void Ctor_RequiredDependencyNull_ThrowsException(string dependency)
	{
		var setup = new Dictionary<string, Action>
		{
			{"dependency1", () => _dependency1Mock = null },
			{"dependency2", () => _dependency2Mock = null },
			{"dependency3", () => _dependency3Mock = null }
		};
		setup[dependency]();

		Func<SomeClass> act = GetDefaultSut;

		act.Should().Throw<ArgumentNullException>().And.ParamName.Should().Be(dependency);
	}

	private SomeClass GetDefaultSut()
	{
		return new SomeClass(_dependency1Mock?.Object, _dependency2Mock?.Object, _dependency3Mock?.Object);
	}
}
The example above is for NUnit, using FluentAssertions and Moq, but it can be converted to your testing tools of choice.

Recommendations For Acceptance Testing ASP.NET Core APIs Using SpecFlow

December 10, 2020

Before I begin with my recommendations, it's probably worth defining what I mean by "acceptance tests" by showing you where these sit conceptually in my testing arsenal:
  • Unit tests - tests for an individual class, to ensure that it behaves as expected and that all behaviour is documented with a test.
  • Acceptance tests - tests for a piece of functionality, to ensure that the collection of classes involved "actually works", but excluding downstream dependencies.
  • Integration tests - in-situ testing that the features continue to work when all dependencies are "real".
I find that writing "acceptance tests" using SpecFlow is a great way to de-couple your behaviour from your code structure, making TDD more realistic and also meaning that after a refactor (which often results in refactoring the unit tests) you can confirm there are no breaking changes. It also allows you to involve your QA/BA in the process by quantifying in plain English what scenarios you are catering for and how the system behaves for each. It's worth noting that SpecFlow can also be used to automate your integration tests; however, that's a little more complex to set up, as it usually involves spinning up SQL servers, Kafka instances, mocked external APIs etc., and those tests are also too slow to run on build, whereas the acceptance tests I will demonstrate below can be run quickly on build like any other unit test. To create a SpecFlow project for testing an API, add an NUnit test project, install the SpecFlow.NUnit.Runners & Microsoft.AspNetCore.Mvc.Testing NuGet packages into that test project, add a reference to the Api project and then begin creating your tests. My recommendations to consider are below:
  • Create a "WebTestFixture" that inherits from "WebApplicationFactory<Startup>"
    • Where "Startup" is your API Startup class
    • Take constructor params to capture shared data context classes from BoDi (the SpecFlow DI container)
    • Override the "ConfigureWebHost" method and use "builder.ConfigureTestServices" to replace any "real" dependencies with mocks defined in the test project
    • Also register any shared data contexts that your mocks require from BoDi with the .NET DI container
  • Create a folder structure that allows you to consider the following genres of classes:
    • Infrastructure - e.g. SpecFlow hooks, Value Retrievers, Transformations etc. (basically the custom SpecFlow pipework)
    • TestDataProviders - with a subfolder for each high level dependency you are mocking (e.g. what would be a class library in the real implementation)
        • Each data provider - containing:
        • Interceptors - create a class per interface which acts as an in-memory version of the system you are mocking (use constructor injection to give these an accessible backing store in the form of a "context" class.)
        • DataContext - POCO classes which represent the state of your in-memory repository and are shared between BoDi and the .NET DI container so they can be manipulated in the test steps
        • StepDefinitions - All the SpecFlow step definitions for interacting with these mocks
    • FolderPerController - the "tests" live in here so assuming your controllers align with a sensible functional grouping it makes sense to mirror that structure
      • Interactions - create a class which interacts with this controller via the "WebTestFixture.CreateClient()" HttpClient
      • Features - create a SpecFlow feature file per endpoint of the controller - in here create the scenarios this endpoint supports
      • Context - any classes that represent the data context of the controller itself (such as the data you will post, or the response from the API)
      • StepDefinitions - All the SpecFlow step definitions for interacting with this API controller and the assertions of the features

    This structure works well for me as it keeps code specific to a controller or endpoint separate (making it easier to see what is involved with which moving part), while still allowing re-use of the steps that contrive data in your mocked repositories, again with a clear separation that matches the structure of your project's class libraries. And of course, once you have defined the features/steps/data required to interact with all mocks and all controllers/endpoints, you can create a high-level folder of features that interact across several of these, if you have such scenarios to assert.
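To make the first recommendation concrete, here's a minimal sketch of what such a fixture might look like - the names ISomeRepository, SomeRepositoryInterceptor and SomeDataContext are illustrative placeholders for your own types:

```csharp
public class WebTestFixture : WebApplicationFactory<Startup>
{
    private readonly SomeDataContext _dataContext; // captured from BoDi via ctor injection

    public WebTestFixture(SomeDataContext dataContext)
    {
        _dataContext = dataContext;
    }

    protected override void ConfigureWebHost(IWebHostBuilder builder)
    {
        builder.ConfigureTestServices(services =>
        {
            // Swap the "real" dependency for an in-memory interceptor
            services.AddSingleton<ISomeRepository, SomeRepositoryInterceptor>();

            // Share the data context between BoDi and the .NET DI container
            services.AddSingleton(_dataContext);
        });
    }
}
```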

Global Error Logging For HttpClient Calls In ASP.NET

September 22, 2020

When you have multiple downstream dependencies that are accessed via HttpClient in .NET, you want a simple way of logging all the error responses that are received from those calls. Often your application will react to the non-successful response by logging its own error, but this can sometimes miss the detail of what actually went wrong downstream. An easy way to capture all the non-successful outbound calls your application makes is to inject a custom delegating handler into all instances of HttpClient (via DI), which can inspect the return code and call out to your logger if necessary: LoggingMessageHandler.cs:
public class LoggingMessageHandler : DelegatingHandler
{
	private readonly IExceptionLogger _exceptionLogger;

	public LoggingMessageHandler(IExceptionLogger exceptionLogger)
	{
		_exceptionLogger = exceptionLogger ?? throw new ArgumentNullException(nameof(exceptionLogger));
	}

	protected override async Task<HttpResponseMessage> SendAsync(
		HttpRequestMessage request, System.Threading.CancellationToken cancellationToken)
	{
		var response = await base.SendAsync(request, cancellationToken);
		if (!response.IsSuccessStatusCode)
		{
			var responseContent = await response.Content.ReadAsStringAsync();
			await _exceptionLogger.LogExceptionAsync(new HttpRequestException(
				$"Api call to {request.RequestUri} failed with status code {(int)response.StatusCode}. Response content: {responseContent}"));
		}
		return response;
	}
}
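The "IExceptionLogger" used above isn't shown in this post; as an assumed, minimal version of that contract, something like the following would satisfy the handler (this implementation simply forwards to ILogger):

```csharp
public interface IExceptionLogger
{
	Task LogExceptionAsync(Exception exception);
}

public class ExceptionLogger : IExceptionLogger
{
	private readonly ILogger<ExceptionLogger> _logger;

	public ExceptionLogger(ILogger<ExceptionLogger> logger)
	{
		_logger = logger ?? throw new ArgumentNullException(nameof(logger));
	}

	public Task LogExceptionAsync(Exception exception)
	{
		// Write the captured downstream failure to the application log
		_logger.LogError(exception, "Downstream HTTP call failed");
		return Task.CompletedTask;
	}
}
```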
Your implementation of "IExceptionLogger" may vary, but this is your opportunity to write something to your logs/DB etc. To set this delegating handler on your HttpClient, set up the Microsoft DI container as follows (note that the handler itself must also be registered with the container):
services.AddTransient<LoggingMessageHandler>();
services.AddHttpClient<ISomeClient, SomeClient>().AddHttpMessageHandler<LoggingMessageHandler>();
Since this class and custom logic will typically live in your composition root, I'd recommend passing the builder delegate down to any class libraries you are building (where you choose to define the ServiceCollectionExtensions inside the class library). For example: Startup.cs:
services.AddMyCustomLibrary(builder => builder.AddHttpMessageHandler<LoggingMessageHandler>());
CustomLibrary/ServiceCollectionExtensions.cs:
public static void AddMyCustomLibrary(this IServiceCollection services, Action<IHttpClientBuilder> clientBuilderDelegate = null)
{
	var someInnerClientBuilder = services.AddHttpClient<ISomeInnerClient, SomeInnerClient>();
	clientBuilderDelegate?.Invoke(someInnerClientBuilder);
}

Simple Test Approach for HttpClient

June 25, 2020

It's pretty common practice in .NET Core to take a dependency on HttpClient in your constructor, using the built-in DI container extensions to register it. When it comes to unit testing, it can always be a bit fiddly when you depend on a concrete class rather than an interface. After solving this problem several times in HttpClient based unit tests, I've created a simple TestHttpClient and TestHttpClientBuilder to simplify the process:
public class TestHttpClientBuilder
{
        private readonly HttpResponseMessage _stubHttpResponseMessage = new HttpResponseMessage(HttpStatusCode.OK);
        private Exception _exception = null;

        public TestHttpClientBuilder WithStatusCode(HttpStatusCode statusCode)
        {
            _stubHttpResponseMessage.StatusCode = statusCode;
            return this;
        }

        public TestHttpClientBuilder WithJsonContent<T>(T expectedResponseObject)
        {
            _stubHttpResponseMessage.Content = new StringContent(JsonConvert.SerializeObject(expectedResponseObject), Encoding.UTF8, "application/json");
            return this;
        }

        public TestHttpClientBuilder WithException(Exception ex)
        {
            _exception = ex;
            return this;
        }

        public TestHttpClient Build()
        {
            return new TestHttpClient(
                _exception != null ? 
                    new FakeHttpMessageHandler(_exception) : 
                    new FakeHttpMessageHandler(_stubHttpResponseMessage));
        }

        public class TestHttpClient : HttpClient
        {
            private readonly FakeHttpMessageHandler _httpMessageHandler;

            internal TestHttpClient(FakeHttpMessageHandler httpMessageHandler) : base(httpMessageHandler)
            {
                _httpMessageHandler = httpMessageHandler;
                BaseAddress = new Uri("http://localhost.com");
            }

            public IReadOnlyList<HttpRequestMessage> CapturedRequests => _httpMessageHandler.CapturedRequests;
        }
}

public class FakeHttpMessageHandler : HttpMessageHandler
{
        private readonly Exception _exception;
        private readonly HttpResponseMessage _response;
        private readonly List<HttpRequestMessage> _capturedRequests = new List<HttpRequestMessage>();

        public FakeHttpMessageHandler(Exception exception)
        {
            _exception = exception;
        }

        public FakeHttpMessageHandler(HttpResponseMessage response)
        {
            _response = response;
        }

        public IReadOnlyList<HttpRequestMessage> CapturedRequests => _capturedRequests;

        protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request,
            CancellationToken cancellationToken)
        {
            _capturedRequests.Add(request);

            if (_exception != null)
            {
                throw _exception;
            }

            return Task.FromResult(_response);
        }
}
Given this code is available to your unit tests, you can now use the builder when instantiating the SUT and use the builder methods to configure the possible responses and/or inspect the captured requests to test your outbound calls. e.g.
public class UnitTestClass
{
	private TestHttpClientBuilder _testHttpClientBuilder;
	private Lazy<TestHttpClientBuilder.TestHttpClient> _testHttpClient;

	[SetUp]
	public void SetUp()
	{
		_testHttpClientBuilder = new TestHttpClientBuilder()
			.WithStatusCode(HttpStatusCode.OK)
			.WithJsonContent(new MyDataType()); // use AutoFixture, test data builder etc. to create a default response

		_testHttpClient = new Lazy<TestHttpClientBuilder.TestHttpClient>(() => _testHttpClientBuilder.Build());
	}

	// Now create tests on the SUT using "_testHttpClient.Value" for the HttpClient ctor argument.
	
	// Make assertions based on changing the response status code / content, or by inspecting "_testHttpClient.Value.CapturedRequests"
}

Unit Testing Microsoft Configuration Bindings

June 21, 2020

When you have a class that is populated using "configuration.Bind", you might find it useful to test that the properties are correctly set. This confirms that the property names are correctly aligned with your expected keys and that all your bound properties have accessible "setters" for the binding engine to call. (Given that the interfaces my config classes implement are usually "read-only", the "set" is not otherwise enforced.) See the example class:
public class SomeSettings
{
    public SomeSettings(IConfiguration configuration)
    {
        if (configuration == null) throw new ArgumentNullException(nameof(configuration));
        configuration.Bind("SomeSettings", this);
    }

    public int SomeIntSetting { get; set; }
    public IReadOnlyList<SomeChildSetting> SomeListOfObjects { get; set; }
    public IReadOnlyList<string> SomeListOfValues { get; set; }
}

public class SomeChildSetting
{
    public string SomeChildItem { get; set; }
}
Your appsettings.json file might look like:
{
    "SomeSettings": {
         "SomeIntSetting": 1,
         "SomeListOfObjects": [
             {
                  "SomeChildItem": "hello"
             },
             {
                  "SomeChildItem": "world"
             }
          ],
       "SomeListOfValues": [ "this", "is", "great" ]
    }
}
The above should work; however, renaming a property on the class or removing the "set" against a property would break the binding, with no errors during compilation or at runtime - the binder simply ignores anything that couldn't be bound. Therefore, it's worth adding unit tests to protect against such bugs, which can be achieved using the "ConfigurationBuilder", which supports in-memory collections, as shown below:
[TestFixture]
public class SomeSettingsTests
{
    private readonly Fixture _fixture = new Fixture();
    private Dictionary<string, string> _stubConfigs;
    private IConfigurationBuilder _configurationBuilder;

    [SetUp]
    public void SetUp()
    {
        _stubConfigs = new Dictionary<string, string>();
        _configurationBuilder = new ConfigurationBuilder().AddInMemoryCollection(_stubConfigs);
    }

    [Test]
    public void Ctor_ConfigurationNull_ThrowsException()
    {
        Func<SomeSettings> act = () => new SomeSettings(null);

        act.Should().Throw<ArgumentNullException>().And.ParamName.Should().Be("configuration");
    }

    [Test]
    public void SomeIntSetting_WhenConfigured_IsExpected()
    {
        var value = _fixture.Create<int>();
        _stubConfigs.Add("SomeSettings:SomeIntSetting", value.ToString());

        var result = GetDefaultSut().SomeIntSetting;

        result.Should().Be(value);
    }

    [Test]
    public void SomeListOfObjects_WhenConfigured_IsExpected()
    {
        var childSettings = _fixture.CreateMany<SomeChildSetting>().ToList();

        for (var i = 0; i < childSettings.Count; i++)
        {
            foreach (var propertyInfo in typeof(SomeChildSetting).GetProperties())
            {
                _stubConfigs.Add($"SomeSettings:SomeListOfObjects:{i}:{propertyInfo.Name}", propertyInfo.GetGetMethod().Invoke(childSettings[i], null).ToString());
            }
        }

        var result = GetDefaultSut().SomeListOfObjects;

        result.Should().BeEquivalentTo(childSettings);
    }

    [Test]
    public void SomeListOfValues_WhenConfigured_IsExpected()
    {
        var values = _fixture.CreateMany<string>().ToList();

        for (var i = 0; i < values.Count; i++)
        {
            _stubConfigs.Add($"SomeSettings:SomeListOfValues:{i}", values[i]);
        }

        var result = GetDefaultSut().SomeListOfValues;

        result.Should().BeEquivalentTo(values);
    }

    private SomeSettings GetDefaultSut() => new SomeSettings(_configurationBuilder.Build());
}

Converting Periodised Tranches Into Flattened Permutations

March 09, 2020

This is a difficult one to describe in terms of the correct terminology for exactly what problem I was solving when I came up with this code. I think the following example is the best way to convey the problem this solution is designed to solve. Imagine you have a sort of 2-dimensional jagged array (in my case a list of lists) where the x dimension represents the passing of time and the y dimension represents the various options/forks in the data which could be used in that time segment, e.g.:
| 0 | 1 | 2 |
| A | A | A |
|   | B | B |
|   | C |   |
In the above, segment 0 of time can only use option "A", segment 1 can use "A", "B" or "C", and segment 2 can use "A" or "B". Given this set of data, there are a finite number of possible combinations in which the data can be used (equal to the product of the y counts), i.e. 1 * 3 * 2 = 6 combinations. I wanted a way to make a single pass over the data and build the truth table of possible permutations, filling in the gaps left by the lack of any option, e.g.:
| 0 | 1 | 2 |
| A | A | A |
| A | A | B |
| A | B | A |
| A | B | B |
| A | C | A |
| A | C | B |
My idea was that, ahead of time, you know for a given segment how many times its options should be repeated into the output matrix in order to end up with all the permutations. At the same time, you must periodically rotate through the option order so as not to generate a mirror image of an existing permutation. The code I came up with, an example of which can be seen below, can be used with any combination of x and y counts and returns an array containing all distinct permutations:
private static List<string>[] GetFullCombinations(List<List<string>> segmentOptions)
{
	var totalPermutations = segmentOptions.Aggregate(1, (x, y) => x * y.Count);
	var combos = new List<string>[totalPermutations];
	var repetitions = totalPermutations;

	foreach (var options in segmentOptions)
	{
		repetitions /= options.Count;
		var optionIndex = 0;
		for (var permutation = 0; permutation < totalPermutations; permutation++)
		{
			if ((permutation + 1) % repetitions == 0)
				optionIndex = (optionIndex + 1) % options.Count;

			var option = options[optionIndex];
			if (combos[permutation] == null)
			{
				combos[permutation] = new List<string>(segmentOptions.Count);
			}

			combos[permutation].Add(option);
		}
	}

	return combos;
}
Due to the modular arithmetic used to avoid mirror images, the output is actually in a slightly different order to how a human might have ordered it (as in my first table); nevertheless, all combinations are returned:
[Test]
public void GetFullCombinations_WhenInputSegmentsHaveOptions_ReturnsAllDistinctPermutations()
{
	var input = new List<List<string>>
	{
		new List<string>
		{
			"A"
		},
		new List<string>
		{
			"A",
			"B",
			"C"
		},
		new List<string>
		{
			"A",
			"B"
		}
	};
	var expectedPermutations = new[]
	{
		new [] { "A", "A", "A" },
		new [] { "A", "A", "B"},
		new [] { "A", "B", "A"},
		new [] { "A", "B", "B"},
		new [] { "A", "C", "A"},
		new [] { "A", "C", "B"}
	};

	var result = GetFullCombinations(input);

	using (new AssertionScope())
	{
		foreach (var expectedPermutation in expectedPermutations)
		{
			result.Should().ContainEquivalentOf(expectedPermutation, cfg => cfg.WithStrictOrdering());
		}
	}
}

Developer Tools

June 17, 2019

I try to maintain a toolkit of useful apps for doing my daily development tasks. Some of these I use very frequently, others not so much but they are useful to know about. I thought I'd catalogue them on my blog so that I remember them when I'm setting up a new machine :)
  • Microsoft Visual Studio - I think this one goes without saying, but if anyone getting into development needs to choose an IDE I'd highly recommend starting here! It pretty much does everything you need (solutions, projects, code editing, compiling, debugging, NuGet package management, profiling, source control and more) and at the time of writing is available for Windows and Mac. The main competitor is JetBrains Rider, which is fully cross-platform and includes ReSharper refactorings, but it has not yet tempted me away from the staple of Visual Studio. There are free editions of Visual Studio suitable for most people.
  • JetBrains ReSharper - A plugin for Visual Studio which has many extensions and helpers to refactor your code, spot potential issues, decompile .NET assemblies, trace performance etc. It does have a cost associated with it and I don't always install it, as I don't like the idea of being dependent on it and there is a lot of crossover with functionality provided by Visual Studio itself or other free 3rd party tools. However, more and more I am liking a lot of the features and it's becoming a staple in my day-to-day development.
  • CodeMaid - A free plugin for Visual Studio which provides shortcuts for cleaning up code files, such as ensuring the order of code within classes, removing and sorting "using" statements etc. You can also download my preferred settings for CodeMaid.
  • NCrunch - A plugin for Visual Studio which provides test code coverage and an automatic background test runner to keep you well informed of uncovered lines or broken tests while you develop. This one also has a cost associated with it, but I'm yet to find anything in the free software space that comes close to the functionality.
  • Notepad++ - A free text editor which is well maintained and comes with a lot of features for working with text files. It's not a "code editor" as such, although it supports syntax highlighting, but it's useful for quickly viewing or editing all kinds of text files.
  • VS Code - A free cross-platform extensible IDE/text editor from Microsoft. For me, this is the middle ground between opening Notepad++ and opening Visual Studio. I also like to use VS Code when working on any front-end projects, such as those built using webpack, due to the lack of Visual Studio project files in those kinds of projects and because of the built-in terminal window.
  • Sourcetree - A free GUI for Git. One of the best I've tried, and it adds real value vs using the Visual Studio plugin or going fully command line.
  • Docker - Installing Docker Desktop opens up a whole world of containerised apps ready for you to integrate with in your code, such as Redis caches, Kafka instances, SQL Server, FTP servers - pretty much run anything with a simple command!
  • Fiddler - A free tool to aid debugging web-based applications. It can capture web traffic, replay requests, intercept calls and more.
  • Wireshark - A free tool to aid debugging network traffic. Generally I use this when Fiddler can't intercept the traffic and I need something a little further down the network stack for capturing traffic.
  • ILSpy - A free tool for decompiling .NET assemblies.
  • Multi Commander - A free dual-pane file explorer tool with many extensions and helpful functions for dealing with different types of file. Most of the time I find Windows Explorer fine, but sometimes an alternative tool with more options can be useful. Of all the ones I tried this is currently my favourite.
  • FAR - Find And Replace - A free tool for performing 2 useful operations: renaming multiple files (multi-rename) and replacing text within files. This is useful when you want to create a new project based on another and quickly rename all project files and swap out the namespaces in all code files.
  • mRemoteNG - A free tool for managing connections to remote machines, including RDP, SSH and web interfaces.
  • WinMerge - A free tool for comparing and merging files and folders.
  • Conduktor - A free GUI for inspecting the data in a Kafka instance.

Visual Studio 2017/2019 Not Remembering Custom Fonts and Colours

April 04, 2019

I've had issues with both VS2017 and now VS2019 where my custom fonts/colour scheme is not maintained between sessions. The same trick I discovered in VS2017 worked in VS2019, so this time I'm blogging it! Basically, import your custom colour scheme as usual using the "Import and Export Settings" wizard. Now go to Tools > Options > Environment > General and switch the "Color Theme" to any theme other than the current one. Now switch the theme back. That's it! For some reason this persists your customisation of the theme, whereas without switching themes the changes get lost.

Testing if XML has deserialized correctly

March 24, 2019

XML is pretty old tech and, without a schema, a bit of a pain to work with! A semi-saving grace is Visual Studio's "Paste XML as Classes" option (under Paste Special), which will generate C# classes capable of representing the XML you had on the clipboard (using the XmlSerializer). However, the caveat is that it only generates code for the exact XML you have used, so any optional attributes/elements, or collections that only have 1 item in them, will be generated incorrectly and will silently start dropping information when you deserialize another file with slightly different XML content. To combat this, I wrote a simple XmlSchemaChecker class which takes the content of an XML file and its deserialized equivalent and ensures that every piece of data from the file is represented within the instance. It logs these problems when running with Debug logging enabled and is called from the class responsible for deserializing files.
using System;
using System.Collections;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Xml;
using System.Xml.Serialization;
using Microsoft.Extensions.Logging;

namespace Deserialization
{
    public class XmlSchemaChecker : IXmlSchemaChecker
    {
        private readonly ILogger<XmlSchemaChecker> _logger;

        public XmlSchemaChecker(ILogger<XmlSchemaChecker> logger)
        {
            _logger = logger ?? throw new ArgumentNullException(nameof(logger));
        }

        public void LogSchemaWarnings<T>(string originalXmlFilePath, T deserialized)
        {
            if (!_logger.IsEnabled(LogLevel.Debug)) return;

            var originalXml = File.ReadAllText(originalXmlFilePath);
            var newXml = ReSerialize(deserialized);

            var originalValues = GetXmlValues(originalXml);
            var newValues = GetXmlValues(newXml);

            var missingItems = originalValues.Except(newValues).ToList();

            if (missingItems.Any())
            {
                _logger.LogDebug("Schema for {originalXmlFilePath} was not fully deserialized. Missing items: {missingItems}", originalXmlFilePath, missingItems);
            }
        }

        private static void ProcessNodes(ISet<string> values, Stack<string> paths, IEnumerable nodes)
        {
            foreach (var node in nodes)
            {
                switch (node)
                {
                    case XmlDeclaration _:
                        continue;
                    case XmlElement element:
                        {
                            paths.Push(element.Name);

                            foreach (var att in element.Attributes)
                            {
                                if (att is XmlAttribute xmlAttribute && xmlAttribute.Name != "xmlns:xsd" && xmlAttribute.Name != "xmlns:xsi")
                                {
                                    values.Add($"{string.Join(":", paths.Reverse())}:{xmlAttribute.Name}:{CleanseValue(xmlAttribute.Value)}");
                                }
                            }

                            if (element.HasChildNodes)
                            {
                                ProcessNodes(values, paths, element.ChildNodes);
                            }

                            paths.Pop();
                            break;
                        }
                    case XmlText text:
                        {
                            values.Add($"{string.Join(":", paths.Reverse())}:{text.ParentNode.Name}:{CleanseValue(text.InnerText)}");
                            break;
                        }
                }
            }
        }

        private static string CleanseValue(string value)
        {
            return value.Replace("\r\n", "\n").Replace("\t", "").Trim(' ', '\n');
        }

        private static IEnumerable<string> GetXmlValues(string xml)
        {
            var values = new HashSet<string>();
            var paths = new Stack<string>();
            var doc = new XmlDocument();
            doc.LoadXml(xml);

            ProcessNodes(values, paths, doc.ChildNodes);

            return values;
        }

        private static string ReSerialize<T>(T item)
        {
            var xmlSerializer = new XmlSerializer(typeof(T));
            var output = new System.Text.StringBuilder();

            using (var outputStream = new StringWriter(output))
            {
                xmlSerializer.Serialize(outputStream, item);
            }

            return output.ToString();
        }
    }
}
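As a usage sketch (the deserializing method and the _xmlSchemaChecker field here are hypothetical), the checker slots into the class responsible for deserialization like so:

```csharp
// Hypothetical call site inside the class responsible for deserializing files.
public T DeserializeWithSchemaCheck<T>(string xmlFilePath)
{
	T item;
	using (var stream = File.OpenRead(xmlFilePath))
	{
		item = (T)new XmlSerializer(typeof(T)).Deserialize(stream);
	}

	// Logs any values present in the file but missing from the object graph
	// (only does work when Debug logging is enabled).
	_xmlSchemaChecker.LogSchemaWarnings(xmlFilePath, item);
	return item;
}
```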