
Architecture of a .NET Application: 8 Case Studies

January 30, 2024 · 9 minutes read

A recent question on Reddit, Number of projects per solution, led to interesting debates. Of course, the answer depends largely on the overall size and business of the application. In this post, we’ll go through various code bases to figure out how industry leaders architect their large .NET applications.

Before going through the case studies, let’s first review a few points:

Points to keep in mind when partitioning .NET code

There are many aspects to consider when partitioning .NET code within the projects of a solution. All these points involve trade-offs between:

  • A single or multiple solutions.
  • A few large projects or many smaller projects in a solution.

Technical Points

  • Build time: If you work on a large enough code base, the build time can become a problem since the build is often triggered to run manual and automatic tests. Relying on incremental builds, where only the projects impacted by changes are rebuilt, helps a lot. But sometimes – to obtain an acceptable build experience – some projects need to be unloaded manually or trimmed down through Visual Studio Solution Filters (.slnf files). But doing so degrades the refactoring and exploration experience. For more details on this point, here is a related article I wrote recently: Improve Visual Studio Build Performance.
  • Cross-solution references: One drawback of having several solutions is that one needs to reference the DLLs of other solutions instead of referencing projects defined in the same solution. A DLL reference is a more brittle approach that breaks when the project output location gets changed. In such a situation, NuGet can be used to reference projects from other solutions as components, but doing so introduces some extra assets to maintain.

         DLL vs Project Reference

  • The physical nature of projects: Typically each project compiles to a .DLL or a .EXE assembly file. Those are physical artifacts, and having dozens or hundreds of DLLs can lead to versioning, deployment, and maintenance difficulties. This is why, when one creates a new project, it is worth questioning whether there is a physical reason underlying the need for this new project. One such common physical reason is that the runtime will load the project’s assembly dynamically, for example through a Dependency Injection (DI) or plugin framework, as sketched below.
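To illustrate this physical reason, here is a minimal sketch of a host loading plugin assemblies at runtime. The IPlugin interface and the plugin directory layout are hypothetical names, not taken from any of the code bases studied below:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Reflection;

// Hypothetical contract that each plugin project implements.
public interface IPlugin
{
    void Execute();
}

public static class PluginLoader
{
    // Scans a directory for DLLs and instantiates every IPlugin found.
    // Each plugin being a separate project (hence a separate DLL) is
    // precisely what makes this dynamic loading possible.
    public static IPlugin[] LoadPlugins(string pluginDirectory)
    {
        return Directory.GetFiles(pluginDirectory, "*.dll")
            .Select(Assembly.LoadFrom)
            .SelectMany(assembly => assembly.GetTypes())
            .Where(type => typeof(IPlugin).IsAssignableFrom(type) && !type.IsAbstract)
            .Select(type => (IPlugin)Activator.CreateInstance(type)!)
            .ToArray();
    }
}
```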

Development Points

  • Focus: having a few projects in multiple solutions can help enforce the separation of concerns, keep build time low, and may be well suited to multiple teams with a narrower focus and well-defined service boundaries. The last case study of the present article exhibits an application made of 1,600+ projects and 200,000+ classes: in such a situation, no one can develop without multiple solutions.
  • Refactoring impact: If your code is defined in several solutions, this can significantly slow down the daily refactoring process since popular refactoring tools (Visual Studio refactoring, ReSharper…) work within the boundary of a single Visual Studio solution. There’s room for a hybrid approach: use smaller solutions for the most part, but create a single solution including all projects for times when larger-scale changes are required. But it means an extra solution to maintain.
  • Project cycles prevented by the IDE: all .NET IDEs detect and prevent dependency cycles in the project dependency graph. This advocates for many fine-grained projects to prevent an anarchical structure, unless some sort of rule properly layers the classes defined in large projects. A modular approach is necessary to build an application, and this questions your definition of a component: is it a unit of re-use, of development, of feature, of versioning, of testing, of compilation?
  • Project as encapsulation container: If a class is only used in the scope of its parent project, it should be declared as internal. Such a class can then be consumed by tests declared in another project, thanks to InternalsVisibleToAttribute, as sketched below. However, this attribute should not be used between application projects: if you stumble on this need, it is an indication that some classes should be merged into the same project.
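For reference, here is a minimal sketch of this pattern; the assembly name MyApp.Tests and the OrderValidator class are hypothetical:

```csharp
using System.Runtime.CompilerServices;

// Declared once in the application project (e.g. in AssemblyInfo.cs):
// it grants the test assembly access to this project's internal types.
[assembly: InternalsVisibleTo("MyApp.Tests")]

namespace MyApp
{
    // Visible to MyApp.Tests, but hidden from every other consuming project.
    internal class OrderValidator
    {
        internal bool IsValid(decimal amount) => amount > 0m;
    }
}
```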

Test and Other Points

  • Test and application code segregation: one instance of the point above is that test code runs only in test processes while application code runs in both test and production processes. Thus it is better to segregate tests and application code into distinct projects.
  • Classes that don’t run in the same process: this is a good indication that these classes should belong to different projects.
  • Code that compiles against various .NET flavors: To increase re-use, some code like domain classes fits well in .NET Standard projects (that run everywhere) while some infrastructure code requires .NET 7 or .NET 8 projects to harness the latest improvements of the platform, as shown in the sketch after this list.
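One common way to reconcile both needs is multi-targeting combined with conditional compilation: the same source file compiles against .NET Standard for broad re-use while exploiting newer APIs on recent runtimes. A minimal sketch, assuming a project that multi-targets netstandard2.0 and net8.0:

```csharp
using System;
using System.Security.Cryptography;

public static class HashHelper
{
    public static string ComputeSha256(byte[] data)
    {
#if NET8_0_OR_GREATER
        // Modern one-shot API, only available on .NET 5 and later.
        byte[] hash = SHA256.HashData(data);
#else
        // Fallback that also compiles against .NET Standard 2.0.
        byte[] hash;
        using (var sha = SHA256.Create())
        {
            hash = sha.ComputeHash(data);
        }
#endif
        return BitConverter.ToString(hash).Replace("-", "");
    }
}
```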

There is no perfect approach, so let’s explore the choices made by some industry leaders.

There are many diagrams in the case study sections below. They have all been generated by the NDepend dependency graph. I mention this because readers asked in the comments how they were generated.

Clean Architecture

Clean Architecture is a term coined by Uncle Bob that refers to principles for structuring projects so that they are easy to understand and easy to change as the project grows. It is becoming increasingly popular for structuring ASP.NET Core web applications. Here are the project dependency diagrams of Jason Taylor’s CleanArchitecture .NET solution template, available here on GitHub.

CleanArchitecture .NET Solution

We can see the test/code segregation through the src and tests solution folders. Also, each application project represents a layer with standardized names and roles: Domain, Application, Infrastructure and WebUI, as sketched below. You can refer to this post Clean Architecture for ASP.NET Core Solution: A Case Study for an in-depth analysis of this way of structuring a .NET solution.
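To make the layering concrete, here is a minimal sketch of how responsibilities typically split across these projects. The class and interface names are illustrative, not taken from the template:

```csharp
// Domain project: pure business entities, with no outward dependencies.
namespace CleanApp.Domain
{
    public class TodoItem
    {
        public int Id { get; set; }
        public string Title { get; set; } = "";
        public bool Done { get; set; }
    }
}

// Application project: use cases expressed against abstractions.
namespace CleanApp.Application
{
    using CleanApp.Domain;

    public interface ITodoRepository
    {
        void Add(TodoItem item);
    }
}

// Infrastructure project: implements the abstractions (persistence, etc.).
namespace CleanApp.Infrastructure
{
    using System.Collections.Generic;
    using CleanApp.Application;
    using CleanApp.Domain;

    public class InMemoryTodoRepository : ITodoRepository
    {
        private readonly List<TodoItem> _items = new();
        public void Add(TodoItem item) => _items.Add(item);
    }
}
```

The key property is the direction of dependencies: Infrastructure references Application and Domain, never the reverse.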

NopCommerce

NopCommerce is a popular OSS eCommerce platform. It is way bigger than the CleanArchitecture template above, with a total of 28 projects. However, most of these projects are small plugins. The application code spans a few large projects: Core, Services, Data and Web.

  • Core contains mostly the domain and some infrastructure abstractions. In the context of eCommerce, the domain contains classes like Order, Payment, Store, Affiliate, Vendor, Catalog, Discount, Gdpr…
  • Services contains infrastructure code to implement the domains listed above (order checkout, caching, various discounts…).
  • Data contains the code related to persistence.
  • Web contains the ASP.NET Core code.

Thus NopCommerce’s engineers chose the few-large-projects approach. However, as mentioned earlier, this approach loses the benefit of the IDE’s dependency cycle control between components, which only works at the project level. As a consequence, large projects like Nop.Services become super-components where pretty much everything relies on everything else (screenshot below).

Such a large, entangled portion of code is also known as spaghetti code or a big ball of mud. These terms qualify software or a component that lacks a perceivable architecture. It doesn’t mean that this piece of code is not working well or that little care went into it. It means that in the project Nop.Services there are no layers to segregate the 700 types. Altogether these 700 types form a large unit of compilation, development, and testing. One cannot easily refactor the project Nop.Services into smaller components. This situation leads to extra maintenance costs. Later I’ll explain a way to counter this phenomenon, because it is worth having large and cohesive projects.

Microservices Architecture

Structuring a web application as multiple microservices is becoming more and more popular. The promises of microservices are:

  • Scalability: individual services can be scaled independently, without having to scale the whole monolithic app.
  • Faster development: development cycles are shorter because developers can focus on specific services.
  • Improved data security: microservices communicate with one another through secure APIs, which might provide development teams with better data security than the monolithic method.
  • Become “language and technology agnostic”: teams work somewhat independently of each other, and microservices allow different developers to use different programming languages and technologies.

See below the project dependency diagram of the OSS solution run-aspnetcore-microservices. We can also see the CleanArchitecture principles applied to the Ordering concern (Domain, Application, Infrastructure).

Micro Services Architecture

Also, the projects in this microservices diagram seem less coupled than in the NopCommerce diagram from the previous section. However, this diagram lacks some dependencies. For example, the service Basket.API consumes Discount.API by calling the method GetDiscount() even though their projects are not statically coupled. The key is that the gRPC framework handles such GetDiscount() calls (RPC stands for Remote Procedure Call), as illustrated in the screenshot above and sketched in the code below.
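On the Basket.API side, such a runtime-only dependency looks roughly like the following sketch. The generated client and message types (DiscountProtoService.DiscountProtoServiceClient, GetDiscountRequest) and the channel address are assumptions based on the repository’s discount.proto contract, not verbatim code from it:

```csharp
using System.Threading.Tasks;
using Grpc.Net.Client;

public class DiscountGrpcClient
{
    // No project reference to Discount.API is needed: the only shared
    // artifact is the .proto contract from which the client below is
    // generated at build time.
    public async Task<int> GetDiscountAmountAsync(string productName)
    {
        using var channel = GrpcChannel.ForAddress("https://localhost:5003");
        var client = new DiscountProtoService.DiscountProtoServiceClient(channel);

        // Remote Procedure Call: looks like a local method call, but is
        // actually an HTTP/2 round-trip to the Discount.API service.
        var coupon = await client.GetDiscountAsync(
            new GetDiscountRequest { ProductName = productName });

        return coupon.Amount;
    }
}
```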

log4Net

log4Net is a popular OSS logging framework. It consists of a single C# project, plus another project that contains the tests. In such a situation, the single-project approach makes sense. Indeed, log4Net is a cohesive enough framework and its clients don’t want to mess with multiple assemblies, even if they are packed in a single NuGet package. However, here also, having a single large project led to the super-component phenomenon: in the log4Net project pretty much everything statically depends on everything else.

Log4Net .NET Solution

.NET Base Class Libraries

See below the graph of the 166 assemblies of the .NET 8.0 BCL, found in the directory C:\Program Files\dotnet\shared\Microsoft.NETCore.App\8.0.0. Obviously, the BCL is not as cohesive as a smaller-scale API like log4Net. For example, none of the XML-related implementations should be loaded in memory if the application is dealing with JSON only. Thus it makes sense to split its 18K types (10K of them being public) over 166 projects.

.NET 8.0

NDepend

Here is the project dependency graph of our application. We also chose to have a few large projects (NDepend.Core and NDepend.UI) surrounded by smaller projects for the various NDepend flavors (analysis & reporting, Visual Studio extension, Azure DevOps extension, ILSpy extension…). The base project NDepend.API contains only abstractions and is consumed both by our code and by third-party consumers of NDepend.API to automatically pilot the core features of the product. Some users reported having literally thousands of .NET solutions to analyze, so such automation really makes sense for them.

NDepend .NET Architecture

Despite having large projects, we don’t face the super-component phenomenon because we dogfood NDepend with rules like Avoid namespaces mutually dependent and Avoid namespaces dependency cycles (sketched below). Thus, inside a large project, we group classes into a hierarchy of namespaces that we consider as our components. As mentioned, relying on fewer large projects has benefits: easier refactoring, easier versioning, less maintenance, and fewer physical assets to maintain. Fortunately, the C# compiler is very fast and compiles the 1,400 classes of NDepend.UI in 3 seconds on modern hardware.
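For reference, such a rule is written in CQLinq, NDepend’s C# LINQ dialect. The sketch below approximates the shape of the built-in rule from memory; it is not the verbatim rule text:

```csharp
// <Name>Avoid namespaces mutually dependent</Name>
warnif count > 0
from n in Application.Namespaces
// Namespaces that both use n and are used by n form a two-way cycle.
let mutuallyDependent = n.NamespacesUsed.Intersect(n.NamespacesUsingMe)
where mutuallyDependent.Any()
select new { n, mutuallyDependent }
```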

NDepend UI Architecture

Roslyn

The core of Roslyn consists of the three compiler projects Roslyn.CodeAnalysis, Roslyn.CodeAnalysis.CSharp and Roslyn.CodeAnalysis.VisualBasic. Around those projects, there is a galaxy of smaller projects to handle services like Workspace / Solution / Project, Analyzer Runner, Scripting, Expression evaluation…

Again, the large-projects approach makes sense here because a compiler is something cohesive: one might want to compile some C# code without hosting the VB.NET compiler in memory, but one certainly doesn’t want to use only a partial version of the C# compiler. The sketch below illustrates this.
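For instance, referencing the Microsoft.CodeAnalysis.CSharp NuGet package alone is enough to compile C# code, without the VB.NET compiler assembly ever being loaded. A minimal sketch:

```csharp
using System;
using System.IO;
using Microsoft.CodeAnalysis;
using Microsoft.CodeAnalysis.CSharp;

class Program
{
    static void Main()
    {
        // Only the C# flavor of the compiler is referenced and loaded here.
        SyntaxTree tree = CSharpSyntaxTree.ParseText(
            "public class Calculator { public int Add(int a, int b) => a + b; }");

        CSharpCompilation compilation = CSharpCompilation.Create(
            "Calculator",
            new[] { tree },
            new[] { MetadataReference.CreateFromFile(typeof(object).Assembly.Location) },
            new CSharpCompilationOptions(OutputKind.DynamicallyLinkedLibrary));

        // Emit the compiled assembly to disk and report success.
        using var peStream = new FileStream("Calculator.dll", FileMode.Create);
        var result = compilation.Emit(peStream);
        Console.WriteLine($"Compilation succeeded: {result.Success}");
    }
}
```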

Roslyn .NET Architecture

Visual Studio

With its 1,600+ projects and 200,000+ classes, Visual Studio might well be the largest .NET application on earth. I have no insider info about the number of solutions required, but it is certainly a lot. Clearly, multiple teams need a narrower focus and acceptable build times. Most of the features are extensions that the IDE loads on demand. No one uses all Visual Studio features in a single solution, so most assemblies remain unloaded most of the time. Also, for performance reasons, Visual Studio spawns many child processes at runtime, which reinforces the relevance of having multiple solutions.

Visual Studio Architecture

Conclusion

It seems that for large enough applications, the industry favors fewer but larger projects. On the other hand, for smaller-scale applications, guidance like Clean Architecture prevails. Also, the Microservices section made clear the benefits of that approach.

If you are wondering how to structure your next .NET solution or how to improve existing ones, I hope that the various case studies covered will help you make the right choices.

If you are interested in visualizing your .NET project architecture, just download the NDepend 14-day free trial. Then start VisualNDepend.exe, analyze your solution(s), and go to the Dependency Graph panel.

Comments:

  1. Great article! Really nice to see a number of approaches summarised in one article.

    2 questions:
    1. How can you tell how many classes from the same project are used at runtime? Do you analyse this “by hand” (using something like IIL) or using some advanced tool?
    2. What tool are you using to generate these nice dependency diagrams presented in the article? 🙂

  2. You may want to make it clear that these Mapping features are only available in Enterprise version. Pro and regular do not have it.

  3. Joan Comas says:

    There is another option: add a project to multiple solutions.

    This way, you avoid referencing DLLs, you can debug it, and you can work on it as necessary.

    Of course once the changes are done, the other solutions have to be built to check if there are any breaking changes, and if so, fix them.

    I’ve found that for simpler projects, this saves time compared to producing NuGet packages, publishing them, and updating every project in every solution.

  4. @Joan Comas To quote the article: “There’s room for a hybrid approach: use smaller solutions for the most part, but create a single solution including all projects for times when larger-scale changes are required. But it means an extra solution to maintain.”

    …and as you noticed “Of course once the changes are done, the other solutions have to be built to check if there are any breaking changes, and if so, fix them.”

    …this is extra maintenance. As developers, we must strive to avoid introducing any extra burden, so I wouldn’t be comfortable with this approach.

  5. This article gave me a lot of knowledge about how to architect a website with .NET. Sir, please also add a video for better clarification. Thank you.
