4 Predictions for the Future of .NET

In May 2019, Microsoft officially announced .NET 5, the future of .NET: it will build on all the .NET Core work already achieved. Here is the announced schedule:

On one hand, the future of .NET has never been so bright. On the other hand, this represents a massive move for all .NET development shops, especially for those that still target .NET Framework 4.x, which won’t evolve anymore. But not everything is clear from this announcement. Such a massive move will have many collateral consequences that we can only guess at for now. Certainly many points are not yet cast in stone and are still being debated.

Hence, for large legacy .NET code bases, some predictions must be made now to plan a seamless and timely migration toward the future of .NET. So let’s make some predictions: it will be interesting to come back in a few years and see how good or bad they were.

.NET Standard won’t evolve much

.NET Standard was introduced as a common API set that all .NET flavors must implement. It superseded PCL (Portable Class Library). Now that the various .NET frameworks will be unified on a .NET Core foundation, and that .NET Framework 4.x won’t support future versions of .NET Standard anymore, it sounds like the need for more .NET Standard APIs will decrease significantly. Actually .NET Framework 4.8 doesn’t even support the latest .NET Standard 2.1: “.NET Framework 4.8 will remain on .NET Standard 2.0 rather than implement .NET Standard 2.1”.

However .NET Standard is certainly not dead yet: it is (and will be for years to come) an essential tool to compile code into portable components that can be reused across several .NET flavors. But with this unification process, the future of .NET Standard is compromised.

Visual Studio will run on .NET 5 or 6 (and in an x64 process)

It has to. Imagine the consequences if, three years from now (we are in 2019 Q4), the main Microsoft IDE for professional .NET development still ran on .NET Framework v4.8:

  • Engineers working on VS would lack access to all the new .NET APIs, performance improvements and language improvements. They would remain locked in the past.
  • As a consequence they wouldn’t use their own tool (dogfooding), and dogfooding is a key aspect of developing tools for developers.
  • Overall, the message sent to users wouldn’t be acceptable.

On the other hand, if you know a bit about how VS works, imagine how massive this migration is going to be. For more than a decade there have been a lot of complaints from the community about Visual Studio not running in a 64-bit process. See some discussions on Reddit here for example. If I remember correctly, this x64 request was the most voted one when VS feedback was still handled by UserVoice. Microsoft has provided some technical explanations, like those given 10 years ago! If in 2019 Visual Studio still doesn’t run in an x64 process, that says a lot about how large and complex such a migration is.

It seems inevitable that this time the Visual Studio legacy will evolve toward what will be the future of .NET. One key benefit will be running in an x64 process, with plenty of memory to work with very large solutions. Another implication is that all Visual Studio extensions, like our extension, must evolve too. Here at NDepend we are already preparing for it, but it will take time, not because we’ll miss much API (we’ll mostly miss AppDomain) but because:

  • We depend on some third-party components that we’d like to get rid of, to have full control over our migration and, more generally, over our code.
  • For several years we’ll have to support both future Visual Studio versions and Visual Studio 2019, 2017 and maybe 2015, which run on .NET Framework v4.x (by the way, we still support VS 2013/2012/2010, but this will have to be discarded to benefit from DLLs reused through .NET Standard).

We cannot know yet whether Visual Studio vNext will run on .NET 5, or whether it will take a few more years until we see it running on .NET 6.

By the way, here are two posts, Quickly assess your .NET code compliance with .NET Standard and An in-depth analysis of .NET Core 3.0 support for WPF and Winforms APIs, that can help you plan your own legacy migration.

.NET will propose a cross-platform UI Framework: WPF or a similar XAML UI Framework

On October 4, 2019, Satya Nadella revealed why Windows may not be the future of Microsoft’s business. In August 2019, Microsoft ran a .NET Cross-Platform UI Framework Survey. Clearly a .NET cross-platform UI framework is wanted: the community is asking for it. So far Microsoft has closed the debate about WPF: WPF won’t be multi-platform.

“Let’s also be crystal clear. This (WPF cross platform) is a very hard project. If the cost was low, this would be a very different conversation and very likely a different outcome. We have enough trouble being compatible with OpenSSL and that’s just one library.” – Rich Lander, Dec 5, 2018

But given the immense benefits that WPF running cross-platform would offer, I wouldn’t be surprised to see WPF become cross-platform within the next few years. Or at least a similar XAML UI framework. Moreover WPF is now open-source, so who knows…

The Visual Studio UI is mostly based on WPF, hence one of the benefits of having WPF cross-platform would be a single cross-platform Visual Studio: the same way Microsoft is now unifying the .NET frameworks, they could unify the Visual Studio suite into a single cross-platform product.

Xamarin.Forms and Avalonia are also natural candidates to be the .NET cross-platform UI framework. But it seems those frameworks don’t receive enough love from the community; that is my subjective feeling. Also we have to keep in mind that Microsoft ran a survey and that the community is massively asking for such a framework.

Blazor is promised a bright future

If you haven’t followed the recent Blazor evolution, the promises of this technology are huge:

  • Run .NET code in all browsers (like Silverlight)
  • with no browser plugin needed (unlike Silverlight)
  • with near-native performance
  • with components compiled to a compact binary format

This is all possible thanks to the WebAssembly (Wasm) format supported by most browsers.

WebAssembly (abbreviated Wasm) is a binary instruction format for a stack-based virtual machine. Wasm is designed as a portable target for compilation of high-level languages like C/C++/Rust, enabling deployment on the web for client and server applications.

Blazor was initially a personal project created by Steve Sanderson from Microsoft. It was first introduced during NDC Oslo in July 2017: the video is worth watching, and notice how enthusiastic the comments are. However Blazor is not yet finalized and still has some limitations: it doesn’t yet offer a decent debugging experience, and the application size to download (a few MBs) is still too large because dependencies have to be loaded too. These limitations are currently being addressed (see here for debugging and here for download size; runtime code will be trimmed and cached, and usage of a CDN (Content Distribution Network) is mentioned).
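
To give a concrete feel for the programming model, here is a minimal Blazor component sketch (using the Razor component syntax of current Blazor releases; an illustrative example, not taken from the talk, and the early previews used a slightly different syntax):

@* Counter.razor: a minimal Blazor component where C# handles a DOM event, no JavaScript involved *@
<h3>Counter</h3>
<button @onclick="Increment">Clicked @count times</button>

@code {
    private int count;

    private void Increment() => count++;
}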

The community is enthusiastic, the technology is maturing, and there is no technological or political barrier in sight: Blazor’s future looks bright. Don’t miss the Blazor FAQ to learn more.

Log4net vs NLog: A Comparison of How They Affect Codebases

Ah, the old “versus” Google search.  Invariably, you’re in the research stage of some decision when you type this word into a search engine.  Probably not something like Coke vs Pepsi.  Maybe “C# vs Java for enterprise projects” or “angular vs react.”  Or if you landed here, perhaps you’re looking at “log4net vs NLog.”

With a search like this, you expect a certain standard script.  The writer should describe each one anecdotally, perhaps with a history.  Then comes the matrix with a list of features and checks and exes for each one, followed by a sober list of strengths and weaknesses.  Then, with a flourish, I should finish with a soggy conclusion that it really depends on your needs, but I maybe kinda sorta like one better.

I’m not going to do any of that.

NDepend and .NET Fx v4.7.2: an extension method collision and how to solve it easily

In October 2017 I wrote about the potential collision problem with extension methods. At that time the .NET Framework 4.7.1 had just been released with a new Enumerable.Append() extension method that collides with our own NDepend.API Append() extension method, which has the same signature.
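
To illustrate the kind of collision involved, here is a minimal, hypothetical reproduction (the NDepend.Helpers namespace shown for the NDepend.API extension class is an assumption):

using System.Collections.Generic;
using System.Linq;          // brings System.Linq.Enumerable.Append() into scope (new in .NET Fx 4.7.1)
using NDepend.Helpers;      // assumed namespace of NDepend.API's own Append() extension method

class CollisionDemo
{
    static IEnumerable<int> AppendAnswer(IEnumerable<int> source)
    {
        // error CS0121: the call is ambiguous between the two Append() extension methods,
        // since both candidates have the same signature and neither is a better match
        return source.Append(42);
    }
}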

The problem was solved easily because just one default rule consumed our Append() extension method: we just had to refactor that rule to call the method as a static method call instead of an extension method call: ExtensionMethodsEnumerable.Append(...)

Unfortunately, with the recent release of .NET Framework 4.7.2, the same problem just happened again, this time with the new Enumerable.ToHashSet() extension method.

This time 22 default code rules rely on our ToHashSet() extension method. This method is widely used because it is often the cornerstone of significant performance improvements. But this means that after installing .NET Fx v4.7.2, those 22 default rules will break.

This time the problem is not solved as easily by calling our ExtensionMethodsSet.ToHashSet<TSource>(this IEnumerable<TSource>) extension method as a static method, because in most of these 22 rules’ source code, changing the extension method call into a static method call requires a few brain cycles. Moreover it makes the rules’ source code less readable: the fluent extension-method call has to be transformed into a nested static call.
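
Here is a sketch of what that transformation looks like inside a query (a made-up rule fragment, not one of the 22 actual default rules):

// Before: extension-method call, ambiguous once System.Linq.Enumerable.ToHashSet() also exists
let largeMethods = Application.Methods.Where(m => m.NbLinesOfCode > 30).ToHashSet()

// After: static call to the NDepend.API overload, which breaks the fluent left-to-right reading
let largeMethods = ExtensionMethodsSet.ToHashSet(Application.Methods.Where(m => m.NbLinesOfCode > 30))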

We wanted a straightforward and clean way for NDepend users to solve this issue in all their default or custom code rules. The solution is the new extension method ToHashSetEx().

Solving the issue on an existing NDepend deployment is now as simple as replacing .ToHashSet() with .ToHashSetEx() in all textual files that contain user code rules and code queries (the files with the .ndproj and .ndrules extensions).
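
For a large number of rule files, the replacement can be scripted; here is a minimal sketch in C#, assuming the files to patch live under a root directory passed as argument:

using System;
using System.IO;
using System.Linq;

class PatchNDependRuleFiles
{
    static void Main(string[] args)
    {
        // args[0] is the root directory containing the .ndproj / .ndrules files to patch
        var files = Directory.EnumerateFiles(args[0], "*.*", SearchOption.AllDirectories)
                             .Where(f => f.EndsWith(".ndproj", StringComparison.OrdinalIgnoreCase)
                                      || f.EndsWith(".ndrules", StringComparison.OrdinalIgnoreCase));
        foreach (var file in files)
        {
            var text = File.ReadAllText(file);
            if (!text.Contains(".ToHashSet()")) continue;
            File.WriteAllText(file, text.Replace(".ToHashSet()", ".ToHashSetEx()"));
            Console.WriteLine($"Patched {file}");
        }
    }
}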

We just released NDepend v2018.1.1 with this new extension method ExtensionMethodsSet.ToHashSetEx<TSource>(this IEnumerable<TSource>). Of course all default rules and generated queries now rely on ToHashSetEx(), and a smart error message is now shown to the user in such a situation.

We hesitated between ToHashSetEx() and ToHashSet2(), but we are confident that this problem won’t repeat itself (more explanation on suffixing a class or method name with Ex here).

Actually we could have detected this particular problem as early as October 2017, because Microsoft had announced that the .NET Fx would ultimately support .NET Standard 2.0, and .NET Standard 2.0 already exposed this ToHashSet() extension method. So this time we analyzed both C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\netstandard.dll and NDepend.API.dll to double-check with a code query that there is no remaining risk of extension method collision.
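
The actual check was done with a CQLinq code query (not reproduced here); as a rough approximation of the same idea in plain C#, here is a sketch using Mono.Cecil, where the NDepend.API.dll path is a placeholder and “same signature” is simplified to same name and parameter count:

using System;
using System.Linq;
using Mono.Cecil;   // NuGet package: Mono.Cecil

class ExtensionMethodCollisionCheck
{
    const string ExtensionAttribute = "System.Runtime.CompilerServices.ExtensionAttribute";

    static void Main()
    {
        string[] assemblyPaths =
        {
            @"C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\netstandard.dll",
            @"C:\Path\To\NDepend.API.dll"   // placeholder: the actual NDepend install path
        };

        // Gather all public extension methods declared in both assemblies.
        var extensionMethods =
            (from path in assemblyPaths
             from type in AssemblyDefinition.ReadAssembly(path).MainModule.Types
             from method in type.Methods
             where method.IsPublic && method.IsStatic && method.HasParameters
                   && method.CustomAttributes.Any(a => a.AttributeType.FullName == ExtensionAttribute)
             select method).ToList();

        // Report extension methods that share a name and parameter count across the two assemblies
        // (a simplified notion of "same signature").
        var collisions =
            from m in extensionMethods
            group m by new { m.Name, ParamCount = m.Parameters.Count } into g
            where g.Select(m => m.Module.Assembly.Name.Name).Distinct().Count() > 1
            select g.Key;

        foreach (var c in collisions)
            Console.WriteLine($"{c.Name}() with {c.ParamCount} parameter(s)");
    }
}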

The query finds both the Append() and ToHashSet() collisions and, since NDepend.API is not concerned with IQueryable, there is no further risk of collision.

Quickly assess your .NET code compliance with .NET Standard

Yesterday evening I had an interesting discussion about the feasibility of migrating parts of the NDepend code to .NET Standard, to ultimately run it on .NET Core. We’re not there yet, but it might make sense to run at least the code analysis on non-Windows platforms, especially for the NDepend clones CppDepend (for C++), JArchitect (for Java) and others to come.

Then I went to sleep (as every developer knows, the brain keeps coding hard while sleeping), and this morning I went for an early jog and it struck me: NDepend is the perfect tool to assess .NET code compliance with .NET Standard, or with any other library actually! As soon as I was back at my machine I did a proof of concept in less than an hour.

The key is that the .NET Standard 2.0 types and members are all packed into a single assembly, netstandard.dll v2.0, which can be found under C:\Program Files\dotnet\sdk\NuGetFallbackFolder\netstandard.library\2.0.3\build\netstandard2.0\ref\netstandard.dll (on my machine). A quick analysis of netstandard.dll with NDepend shows 2,317 types in 78 namespaces, with 24,303 methods and 884 fields. Let’s be precise: netstandard.dll doesn’t contain any implementation code; it is a standard, not an implementation. The 68K IL instructions represent the IL code for throw null, which is the method body of every non-abstract method.

.NET Standard 2.0 analyzed by NDepend

(By the way, I am sure that if you are reading this you already have an understanding of what .NET Standard is, but if anything is still unclear, I invite you to read the great article my friend Laurent Bugnion wrote 3 days ago: A Brief History of .NET Standard.)

Given that, what struck me this morning is that to analyze .NET code compliance with .NET Standard, I’d just have to include netstandard.dll in the list of my application assemblies and write a code query that filters the dependencies the way I want. Of course, to put this idea to the test, I wanted to explore the NDepend code base’s compliance with .NET Standard:

NetStandard assembly included in the NDepend assemblies to analyze

The code query was pretty straightforward to write. It is written in a way that:

  • it is easy to use it to analyze compliance with any library other than .NET Standard,
  • it is easy to explore both the compliance and the non-compliance with a library in a comprehensive way, thanks to the NDepend code query result browsing facilities,
  • it is easy to refactor the query to ask for more; for example, below I refactor it to assess the usage of third-party code that is not .NET Standard compliant (a sketch of the general shape of such a query follows this list).
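
As a rough sketch of the general shape of such a query (this is not the exact query from the screenshot below; code model members like ChildTypes, TypesUsed, IsThirdParty and FullName are used as I recall them from NDepend.API, so treat the details as assumptions):

// Types of my code base that use third-party types not defined in .NET Standard
let netStandardTypeNames = Application.Assemblies
                              .Where(a => a.Name == "netstandard")
                              .SelectMany(a => a.ChildTypes)
                              .Select(t => t.FullName)
                              .ToHashSetEx()
from t in JustMyCode.Types
let nonCompliantTypesUsed = t.TypesUsed
                             .Where(tUsed => tUsed.IsThirdParty &&
                                             !netStandardTypeNames.Contains(tUsed.FullName))
                             .ToArray()
where nonCompliantTypesUsed.Length > 0
orderby nonCompliantTypesUsed.Length descending
select new { t, nonCompliantTypesUsed }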

The result looks like this, and IMHO it is pretty interesting. For example we can see at a glance that NDepend.API is almost fully compliant with .NET Standard, except for the usage of System.Drawing.Image (each “1 type” entry is actually the Image type) and for the usage of Code Contracts.

NDepend code base compliance with .NET standard

For a more intuitive assessment of compliance with .NET Standard we can use the metric view, which highlights the code elements matched by the currently edited code query.

  • Unsurprisingly, NDepend.UI is not compliant at all,
  • the portions of NDepend.Core that are not compliant with .NET Standard are well delimited (and I know it is mostly because of some UI code here too, which we consider Core because it is reusable in a variety of situations).

With this information it would be much easier to plan a major refactoring to segregate .NET Standard compliant code from non-compliant code, and especially to anticipate the hot spots that will be painful to refactor.

The code query to assess compliance can be refactored at will. For example, I found it interesting to see which non-compliant third-party code elements were the most used, so I refactored the query this way:
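
Again, as a sketch of the shape such a refactored query can take (same assumptions about NDepend.API members as in the previous sketch):

// Non .NET Standard compliant third-party types, ranked by how many of my types use them
let netStandardTypeNames = Application.Assemblies
                              .Where(a => a.Name == "netstandard")
                              .SelectMany(a => a.ChildTypes)
                              .Select(t => t.FullName)
                              .ToHashSetEx()
from tUsed in ThirdParty.Types
where !netStandardTypeNames.Contains(tUsed.FullName)
let userTypes = tUsed.TypesUsingMe.Where(t => !t.IsThirdParty).ToArray()
where userTypes.Length > 0
orderby userTypes.Length descending
select new { tUsed, userTypes }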

Unsurprisingly, UI code that is not .NET Standard compliant pops up first:

.NET Standard non-compliant third-party code usage

There is no limit to refactoring this query for your own needs, like assessing the usage of non-compliant code excluding UI code, for example, or assessing the usage of code not compliant with ASP.NET Core 2 (by changing the library).

Hope you’ll find this content useful to plan your migration to .NET Core and .NET Standard!