So, one of my former customers reached out to me a couple of weeks ago with a question about how to use dependency injection in their AutoMapper profiles. On this project the profiles were loaded dynamically using MEF, and Autofac was used for dependency injection.

The way you would normally load all of these profiles is by using the `AddProfiles` method when initializing AutoMapper. The code would look similar to the following excerpt.

private static void RegisterAutomapperDefault(IEnumerable<Assembly> assemblies)
{
    AutoMapper.Mapper.Initialize(cfg =>
    {
        cfg.AddProfiles(assemblies);
    });
}

This works fine on most occasions and is, to my knowledge, the recommended approach.

When you start thinking about using dependency injection (constructor injection in this case), you might want to rethink your mapping profile first. If you need dependencies to map the properties of one object onto the properties of another, it probably means there's too much logic going on in the mapping.

Of course, if you do need this, one thing to consider is using custom type converters or custom value resolvers. You can use dependency injection (constructor injection) in these converters and resolvers by adding a single line to the `Initialize` method of AutoMapper.

private static void RegisterAutomapperDefault(IContainer container, IEnumerable<Assembly> assemblies)
{
    AutoMapper.Mapper.Initialize(cfg =>
    {
        cfg.ConstructServicesUsing(container.Resolve);

        cfg.AddProfiles(assemblies);
    });
}
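To illustrate, a custom value resolver with constructor injection might look something like the sketch below. The `ITranslationService` dependency is made up for this example; with `ConstructServicesUsing` in place, Autofac will construct the resolver along with its dependencies.

```csharp
// Hypothetical resolver; ITranslationService is an assumed dependency,
// resolved via the container thanks to ConstructServicesUsing.
public class TranslatedNameResolver : IValueResolver<Model, ViewModel, string>
{
    private readonly ITranslationService translationService;

    public TranslatedNameResolver(ITranslationService translationService)
    {
        this.translationService = translationService;
    }

    public string Resolve(Model source, ViewModel destination, string destMember, ResolutionContext context)
    {
        return translationService.Translate(source.SomeText);
    }
}
```

In the profile you would then point the member at the resolver, for example with `opt.ResolveUsing<TranslatedNameResolver>()` (or `opt.MapFrom<TranslatedNameResolver>()` in newer AutoMapper versions), instead of injecting the dependency into the profile itself.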

Now, if you still feel you need to do constructor injection inside your mapping `Profile` classes, that's also quite possible, but please think twice before doing so.

In order to get this working, I first created a new `Profile` class which injects an `IConvertor`, like below.

public class MyProfile : Profile
{
    public MyProfile(IConvertor convertor)
    {
        CreateMap<Model, ViewModel>()
            .ForMember(dest => dest.Id, opt => opt.MapFrom(src => src.Identifier))
            .ForMember(dest => dest.Name, opt => opt.MapFrom(src => convertor.Execute(src.SomeText)))
            ;
    }
}
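The `IConvertor` interface itself isn't shown in this post; a minimal sketch of what it might look like is below. The member name is taken from the profile above, but the implementation is made up for this example.

```csharp
public interface IConvertor
{
    string Execute(string input);
}

// Example implementation; the real convertor can do anything,
// including depending on other injected services.
public class Convertor : IConvertor
{
    public string Execute(string input)
    {
        return input?.ToUpperInvariant();
    }
}
```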

What you need to do now is register all of the `Profile` implementations in your IoC framework, in this case Autofac. To do this, you have to do some reflection magic. The code below retrieves all `Profile` implementations from the assemblies whose names start with "Some".

public static IContainer Autofac()
{
    var containerBuilder = new ContainerBuilder();

    // Register the dependencies...
    containerBuilder.RegisterType<Convertor>().As<IConvertor>();

    var loadedProfiles = RetrieveProfiles();
    containerBuilder.RegisterTypes(loadedProfiles.ToArray());

    var container = containerBuilder.Build();

    RegisterAutoMapper(container, loadedProfiles);

    return container;
}

/// <summary>
/// Scan all referenced assemblies to retrieve all `Profile` types.
/// </summary>
/// <returns>A collection of <see cref="AutoMapper.Profile"/> types.</returns>
private static List<Type> RetrieveProfiles()
{
    var assemblyNames = Assembly.GetExecutingAssembly().GetReferencedAssemblies()
        .Where(a => a.Name.StartsWith("Some"));
    var assemblies = assemblyNames.Select(an => Assembly.Load(an));
    var loadedProfiles = ExtractProfiles(assemblies);
    return loadedProfiles;
}

private static List<Type> ExtractProfiles(IEnumerable<Assembly> assemblies)
{
    var profiles = new List<Type>();
    foreach (var assembly in assemblies)
    {
        var assemblyProfiles = assembly.ExportedTypes.Where(type => type.IsSubclassOf(typeof(Profile)));
        profiles.AddRange(assemblyProfiles);
    }
    return profiles;
}

All of this code is needed just to register your mapping profiles with Autofac, so Autofac can resolve them when initializing AutoMapper. To register the profiles in AutoMapper you need a specific overload of the `AddProfile` method, one which takes a `Profile` instance instead of a type.

/// <summary>
/// Over here we iterate over all <see cref="Profile"/> types and resolve them via the <see cref="IContainer"/>.
/// This way the `AddProfile` method will receive an instance of the found <see cref="Profile"/> type, which means
/// all dependencies will be resolved via the <see cref="IContainer"/>.
/// </summary>
private static void RegisterAutoMapper(IContainer container, IEnumerable<Type> loadedProfiles)
{
    AutoMapper.Mapper.Initialize(cfg =>
    {
        cfg.ConstructServicesUsing(container.Resolve);
                
        foreach (var profile in loadedProfiles)
        {
            var resolvedProfile = container.Resolve(profile) as Profile;
            cfg.AddProfile(resolvedProfile);
        }
                
    });
}

You can see I'm resolving each of the loaded profiles via Autofac and adding every resolved instance to AutoMapper.

This takes quite a bit of effort, but resolving your profiles like this will give you the possibility to do any kind of dependency injection inside your AutoMapper code.
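With everything wired up, using the mapper is just the usual AutoMapper call. A quick sketch, using the `Model` and `ViewModel` types from the profile above (the property values are made up):

```csharp
// Build the container; this also initializes AutoMapper with the
// container-resolved profiles.
var container = Autofac();

// Hypothetical usage; the resulting Name depends on what IConvertor does.
var model = new Model { Identifier = 42, SomeText = "hello" };
var viewModel = AutoMapper.Mapper.Map<ViewModel>(model);
```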


Just remember, as I've written before: "Just because you can, doesn't mean you should!"
Still, I wanted to show you how this can be done, as it's kind of cool. If you want to check out the complete solution, have a look at my GitHub repository for this project.

For years, a lot of people I know (myself included) have been using the Unit of Work and Repository patterns combined with each other. This makes quite a lot of sense as, in most cases, they both have something to do with your database calls.

When searching for both of these patterns you'll often be directed to a popular article on the Microsoft documentation site. The sample code over there gives a very detailed implementation of how you can implement both patterns for accessing and working with your database. I kind of like that post, as it goes to great lengths to describe both the unit of work and repository patterns and the advantages of using them. I see a lot of projects and companies implementing the pattern combo as described in the Microsoft article. I can't really blame them, as it's one of the top hits when you search for it in any search engine.

There is a downside to this sample though. It violates the Open/Closed principle which states “software entities (classes, modules, functions, etc.) should be open for extension, but closed for modification”. Whenever you need to add a new repository to your database context, you also need to add this repository to your unit of work, therefore violating the open/closed principle.

It also violates the Single Responsibility Principle, which states "every module or class should have responsibility over a single part of the functionality provided by the software, and that responsibility should be entirely encapsulated by the class. All its services should be narrowly aligned with that responsibility", or in short: "A class should have only one reason to change." The sample implementation violates this principle because it handles multiple responsibilities. The unit of work's purpose should be to encapsulate and commit or roll back transactions of atomic operations. However, it's also creating and managing the several repository objects, and therefore has multiple responsibilities.

Implementing the unit of work and repository patterns can be done in multiple ways. Derek Greer goes into this at great length in an old post of his. As always, there are several ways to improve the design. You might even want to keep the design from the Microsoft example, because 'it just works'. For the sake of cleaner code I'll describe one of the ways, which I personally like very much, to improve the software design: by adding a decorator to the project, the functional code becomes much cleaner.

The first thing you have to consider is implementing some form of CQRS in your software design. This will make your life much easier when splitting the command, unit of work and repository functionality. You can perfectly well implement the described solution without CQRS, but why would you want to?

I’ll just assume you have a command handler in your application. The interface will probably look similar to the following piece of code.

public interface IIncomingFileHandler<in TCommand>
	where TCommand : IncomingFileCommand
{
	void Handle(TCommand command);
}
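The `IncomingFileCommand` base class isn't shown in this post. Judging from how the handler below uses it, a minimal sketch could look like this (the exact properties are an assumption):

```csharp
// Assumed shape, based on how the handler accesses the command.
public abstract class IncomingFileCommand
{
    public int CustomerId { get; set; }
    public string Request { get; set; }
}
```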

The actual command handler can be implemented like the following piece of code.

public class IncomingFileHandler<TCommand> : IIncomingFileHandler<TCommand>
    where TCommand : IncomingFileCommand
{
    private readonly IRepository<Customer> customerRepository;
    private readonly IRepository<File> fileRepository;

    public IncomingFileHandler(IRepository<Customer> customerRepository, IRepository<File> fileRepository)
    {
        this.customerRepository = customerRepository;
        this.fileRepository = fileRepository;
    }

    public void Handle(TCommand command)
    {
        // Implement your logic over here.
        var customer = customerRepository.Get(command.CustomerId);
        customer.LatestUpdate = command.Request;
        customerRepository.Update(customer);

        var file = CreateNewIncomingFileDto(command);
        fileRepository.Add(file);
    }
}

All of the necessary repositories are injected here so we can implement the logic for this functional area. The implementation doesn't make much sense, but keep in mind it's just an example. This piece of code wants to write to the database multiple times. We could call `SaveChanges()` inside the Update and Add methods, but that's a waste of database requests and you'd sacrifice transactional consistency.

At this point nothing is actually written back to the database, because `SaveChanges` isn't called anywhere and we aren't committing (or rolling back) any transaction either. The functionality for persisting the data will be implemented in a transaction handler, which is added as a decorator. The transaction handler starts a new transaction on the context, invokes the Handle method of the actual IIncomingFileHandler<TCommand> implementation (in our case the IncomingFileHandler<TCommand>), saves the changes and commits the transaction (or rolls it back).

A simple version of this transaction decorator is shown in the following code block.

public class IncomingFileHandlerTransactionDecorator<TCommand> : IIncomingFileHandler<TCommand> 
    where TCommand : IncomingFileCommand
{
    private readonly IIncomingFileHandler<TCommand> decorated;
    private readonly IDbContext context;

    public IncomingFileHandlerTransactionDecorator(IIncomingFileHandler<TCommand> decorated, IDbContext context)
    {
        this.decorated = decorated;
        this.context = context;
    }

    public void Handle(TCommand command)
    {
        using (var transaction = context.BeginTransaction())
        {
            try
            {
                decorated.Handle(command);

                context.SaveChanges();
                context.Commit(transaction);
            }
            catch
            {
                context.Rollback(transaction);
                throw;
            }
        }
    }
}

This piece of code is only responsible for creating a transaction and persisting the changes made into the database.

We are still using the repository pattern and making use of the unit of work, but each piece of code now has its own responsibility, which makes the code much cleaner. You also aren't violating the open/closed principle anymore, as you can still add dozens of repositories without affecting anything else in your codebase.

The setup for this separation is a bit more complex compared to just hacking everything together in one big file or class. Luckily, Autofac has some awesome built-in functionality for adding decorators. The following two lines are all you need to make the magic happen.

builder.RegisterGeneric(typeof(IncomingFileHandler<>)).Named("commandHandler", typeof(IIncomingFileHandler<>));
builder.RegisterGenericDecorator(typeof(IncomingFileHandlerTransactionDecorator<>), typeof(IIncomingFileHandler<>), fromKey: "commandHandler");

This tells Autofac to use the IncomingFileHandlerTransactionDecorator as a decorator for the IncomingFileHandler.
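With this registration in place, resolving the handler interface gives you the decorated version. A quick sketch, assuming a hypothetical command type SomeFileCommand deriving from IncomingFileCommand:

```csharp
// SomeFileCommand is a made-up command type for this example.
var handler = container.Resolve<IIncomingFileHandler<SomeFileCommand>>();

// This call goes through the transaction decorator, which wraps the
// inner IncomingFileHandler<SomeFileCommand> in a transaction.
handler.Handle(command);
```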

After this setup you are good to go. So, whenever you think of implementing the unit of work and repository patterns in your project, keep the suggestions in this post in mind.

On a recent project I had to implement the decorator pattern to add some functionality to the existing code flow.

Not a big problem of course. However, on this project we were using Autofac as our dependency injection framework, so I had to check how to implement this pattern using the framework's built-in capabilities. One of the reasons I always resort to Autofac is its awesome and comprehensive documentation. It's very complete and, most of the time, easy to understand. The advanced topics also have a chapter dedicated to the Adapter and Decorator patterns, which was very useful for implementing the decorator pattern in this project.

I wanted to use the decorator pattern to add some logic to determine if a command should be handled and for persisting database transactions of my commands and queries. You can also use it for things like security, additional logging, enriching the original command, etc.

As the documentation already states, you'll have to register your original command handler as a named service. The Autofac extensions for registering a decorator use this named instance to add the decorators onto. One thing to remember: when you need to add several decorators to your command handler, you'll have to register each decorator as a named service as well, except for the last one!

The command handlers we were using were accepting a generic argument to instantiate a class. Therefore, we also had to use the open generic version for registering the implementations and decorators.

The implementation of the actual command handler looks very much like the following code block.

public class ProcessedItemHandler<TCommand> : IProcessedMessageHandler<TCommand>
		where TCommand : ProcessedMessageCommand
{
	private readonly IBackendSystemFormatter<TCommand> formatter;
	private readonly IQueueItemWriter<TCommand> writer;
	private readonly IRepository<ProcessQueue> processQueueRepository;

	public ProcessedItemHandler(
		IBackendSystemFormatter<TCommand> formatter,
		IQueueItemWriter<TCommand> writer,
		IRepository<ProcessQueue> processQueueRepository)
	{
		this.formatter = formatter;
		this.writer = writer;
		this.processQueueRepository = processQueueRepository;
	}

	public void Handle(TCommand command)
	{
		/* Implementation logic */
	}
}

It implements the IProcessedMessageHandler<TCommand> interface and contains the logic to execute the command.

The decorator has to implement the same interface and one of the injected dependencies is the same interface. This tells Autofac to inject an IProcessedMessageHandler<TCommand> which is ‘linked’ in the registration of our application.

public class ProcessedMessageTransactionDecorator<TCommand> : IProcessedMessageHandler<TCommand>
		where TCommand : ProcessedMessageCommand
{
	private readonly IProcessedMessageHandler<TCommand> decorated;
	private readonly ITransactionHandler transactionHandler;

	public ProcessedMessageTransactionDecorator(
		IProcessedMessageHandler<TCommand> decorated,
		ITransactionHandler transactionHandler)
	{
		this.decorated = decorated;
		this.transactionHandler = transactionHandler;
	}

	public void Handle(TCommand command)
	{
		/* Decorator logic */

		decorated.Handle(command);

		/* Decorator logic */
	}
}

As you can see, you will be able to do all kinds of stuff in the Handle-method before or after invoking the decorated object.

The registration in our application looks very much like the following code block.

var storeProcessedMessageCommandHandlers = GetAllStoreProcessedMessageCommandHandlerImplementationsFromAssemblies();

foreach (var commandHandler in storeProcessedMessageCommandHandlers)
{
	builder.RegisterGeneric(commandHandler).Named("storeProcessedMessageHandler", typeof(IProcessedMessageHandler<>));
}

builder.RegisterGenericDecorator(typeof(ProcessedMessageTransactionDecorator<>), typeof(IProcessedMessageHandler<>),
										fromKey: "storeProcessedMessageHandler");

First we need to collect all implementations of the IProcessedMessageHandler<TCommand> and register them with the Autofac container. As you can see, all these implementations are registered as a named service under the name storeProcessedMessageHandler. If you only have one implementation of the command handler, you can of course just register that one implementation.

After having registered all of the command handlers, the decorator(s) can be registered. The helper method RegisterGenericDecorator helps with this. This method also works with open generics, and registration looks very similar to registering a 'normal' class and interface. The main difference is the addition of the fromKey argument, which determines to which named service the decorator is applied.

If you want to hook up multiple decorators, you can also add the toKey argument to your RegisterGenericDecorator call. By adding the toKey argument, the decorator itself is also registered as a named service in Autofac, and you can hook up another decorator to it by using the name from the toKey as the fromKey of the new decorator. This might be a bit abstract, so let me just write up a small example.

builder.RegisterGeneric(typeof(IncomingHandler<>)).Named("commandHandler", typeof(ICommandHandler<>));
builder.RegisterGenericDecorator(typeof(TransactionRequestHandlerDecorator<>), typeof(ICommandHandler<>), fromKey: "commandHandler", toKey: "transactionHandler");
builder.RegisterGenericDecorator(typeof(ShouldHandleCommandHandlerDecorator<>), typeof(ICommandHandler<>), fromKey: "transactionHandler");

Makes more sense, right?

Just remember not to add a toKey argument to the last decorator of your flow. Otherwise you will not be able to inject the interface, because everything is registered in the IIndex<T> collection and there is no default entry point. Ask me how I know…

Hope this helps you in future projects. Knowing about this functionality has surely helped me keep my code clean.