NDepend

Improve your .NET code quality with NDepend

4 Predictions for the Future of .NET

In May 2019, Microsoft officially announced .NET 5, the future of .NET: it will be built upon all the .NET Core work already achieved. Here is the announced schedule:

On one hand, the future of .NET has never been so bright. On the other hand, this represents a massive move for all .NET development shops, especially for those that still target .NET Framework 4.x, which won't evolve anymore. But not everything is clear from this announcement. Such a massive move will have many collateral consequences that we can only guess at for now. Certainly many points are not yet set in stone and are still being debated.

Hence, for large .NET legacy code bases, some predictions must be made now to plan a seamless and timely migration toward the future of .NET. So let's make some predictions: it will be interesting to come back in a few years and see how good or bad they were.

.NET Standard won’t evolve much

.NET Standard was introduced as a common API set that all .NET flavors must implement. .NET Standard superseded PCL (Portable Class Library). Now that several .NET frameworks will be unified on the .NET Core base, and that .NET Framework 4.x won't support future versions of .NET Standard anymore, it sounds like the need for new .NET Standard APIs will decrease significantly. Actually, .NET Framework 4.8 doesn't even support the latest .NET Standard 2.1: “.NET Framework 4.8 will remain on .NET Standard 2.0 rather than implement .NET Standard 2.1”.

Still, .NET Standard is certainly not dead yet: it is, and will be for years to come, an essential tool to compile code into portable components that can be reused across several .NET flavors. However, with this unification process, the long-term future of .NET Standard is compromised.

Visual Studio will run on .NET 5 or 6 (and in an x64 process)

It has to. Imagine the consequences if, 3 years from now (2019 Q4), the main Microsoft IDE for professional .NET development still ran on .NET Framework v4.8:

  • Engineers working on VS would lack access to all the new .NET APIs, performance improvements and language improvements. They would remain locked in the past.
  • As a consequence they wouldn't use their own tool (dogfooding), and dogfooding is a key aspect of developing tools for developers.
  • Overall, the message sent wouldn't be acceptable to users.

On the other hand, if you know a bit about how VS works, imagine how massive this migration is going to be. For more than a decade there have been a lot of complaints from the community about Visual Studio not running in a 64-bit process. See some discussions on reddit here for example. If I remember correctly, this x64 request was the most voted one when VS feedback was still handled by UserVoice. Some technical explanations have been provided by Microsoft, like these ones provided 10 years ago! If in 2019 Visual Studio still doesn't run in an x64 process, this says a lot about how large and complex such a migration is.

It seems inevitable that this time the Visual Studio legacy will evolve toward what will be the future of .NET. One key benefit will be to run in an x64 process and have plenty of memory to work with very large solutions. Another implication is that all Visual Studio extensions, like our extension, must evolve too. Here at NDepend we are already preparing for it, but it will take time, not because we'll miss much API (we'll mostly miss AppDomain) but because:

  • We depend on some third-party components that we'd like to get rid of to gain full control over our migration and, more generally, our code.
  • For several years we'll have to support both future Visual Studio versions and Visual Studio 2019, 2017 and maybe 2015, which run on .NET Framework v4.x (by the way, we still support VS 2013/2012/2010, but this support will have to be dropped to benefit from DLLs reused through .NET Standard).

We cannot yet know whether Visual Studio vNext will run on .NET 5, or whether it will take a few more years until we see it running on .NET 6.

By the way, here are two posts, Quickly assess your .NET code compliance with .NET Standard and An in-depth analysis of .NET Core 3.0 support for WPF and Winforms APIs, that can help you plan your own legacy migration.

.NET will propose a cross-platform UI Framework: WPF or a similar XAML UI Framework

On October 4, 2019 Satya Nadella revealed why Windows may not be the future of Microsoft's business. In August 2019 Microsoft ran a .NET Cross Platform UI Framework Survey. Clearly a .NET cross-platform UI framework is wanted: the community is asking for it. So far Microsoft has closed the debate about WPF: WPF won't be multi-platform.

Let’s also be crystal clear. This (WPF cross platform) is a very hard project. If the cost was low, this would be a very different conversation and very likely a different outcome. We have enough trouble being compatible with OpenSSL and that’s just one library.  Rich Lander – Dec 5, 2018

But given the immense benefits that WPF running cross-platform would offer, I wouldn't be surprised to see WPF become cross-platform within the next few years, or at least a similar XAML UI framework. Moreover, WPF is now open-source, so who knows…

The Visual Studio UI is mostly based on WPF, hence one of the benefits of having WPF cross-platform would be a single cross-platform Visual Studio: the same way Microsoft is now unifying the .NET frameworks, they could unify the Visual Studio suite into a single cross-platform product.

Xamarin Forms and Avalonia are also natural candidates to become the .NET cross-platform UI framework. But it seems those frameworks don't receive enough love from the community; this is my subjective feeling. Also, we have to keep in mind that Microsoft ran a survey and that the community is massively asking for it.

Blazor is destined for a bright future

If you didn’t follow the recent Blazor evolution, the promises of this technology are huge:

  • Run .NET code in all browsers (like Silverlight)
  • with no browser plugin needed (unlike Silverlight)
  • with near-native performance
  • with components compiled to a compact binary format

This is all possible thanks to the WebAssembly (Wasm) format supported by most browsers.

WebAssembly (abbreviated Wasm) is a binary instruction format for a stack-based virtual machine. Wasm is designed as a portable target for compilation of high-level languages like C/C++/Rust, enabling deployment on the web for client and server applications.

Blazor was initially a personal project created by Steve Sanderson from Microsoft. It was first introduced during NDC Oslo in July 2017: the video is worth watching; also note how enthusiastic the comments are. However, Blazor is not yet finalized and still has some limitations: it doesn't yet offer a decent debugging experience, and the application size to download (a few MBs) is still too large because dependencies have to be loaded too. These limitations are currently being addressed (see here for debugging and here for download size: runtime code will be trimmed and cached, and usage of a CDN (Content Delivery Network) is mentioned).

The community is enthusiastic, the technology is getting mature and there is no technological or political barrier in sight: Blazor's future looks bright. Don't miss the Blazor FAQ to learn more.

Identify .NET Code Structure Patterns with no Effort

The two pillars of code maintainability are automatic testing and clean code structure.

  • Testing is used to regularly challenge code correctness and detect regressions early. Testing can easily be assessed with numbers such as the code coverage ratio and the number of assertions tested.
  • A clean code structure prevents the phenomenon of spaghetti code: entangled code that is hard to understand and hard to maintain. However, the code structure cannot be assessed through numbers the way testing can. Moreover, the structure emerges from a myriad of details buried in many source files, and thus appropriate tooling is needed.

For most engineers, the code dependency graph is the tool of choice to explore code structure. A boxes-and-arrows graph is intuitive and well adapted to visualizing a small number of dependencies. However, to visualize a complex portion of code, the Dependency Structure Matrix (DSM) is better adapted. See below the same set of 34 namespaces visualized with the NDepend Dependency Graph and the NDepend Dependency Matrix.

NDepend dependency graph

NDepend Dependency Matrix

If the concept of dependency matrix is something new to you, it is important to note that:

  • The matrix headers' elements represent graph boxes.
  • The matrix non-empty cells correspond to graph arrows. The numbers on the cells represent a measure of the coupling in terms of the number of methods and fields involved. In a symmetric matrix, a pair of blue and green cells is symmetric because both cells represent the same thing: the blue cell represents A uses B and the green cell represents B is used by A.
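
To make the cell/arrow correspondence concrete, here is a minimal, language-neutral Python sketch. The 3-element structure and the row-uses-column encoding are hypothetical illustrations (NDepend's actual cell orientation may differ):

```python
# Hypothetical dependency structure: A uses 3 members of B, B uses 2 members of C.
elements = ["A", "B", "C"]

# dsm[i][j] = number of methods/fields of elements[j] used by elements[i].
# A non-zero cell corresponds to an arrow in the dependency graph.
dsm = [
    [0, 3, 0],  # A's row: a "blue" cell meaning "A uses B"
    [0, 0, 2],  # B -> C
    [0, 0, 0],
]

def uses(i, j):
    """True if elements[i] directly uses elements[j] (a 'blue' cell)."""
    return dsm[i][j] > 0

def used_by(i, j):
    """True if elements[i] is used by elements[j] (the symmetric 'green' cell)."""
    return dsm[j][i] > 0

print(uses(0, 1))     # A uses B
print(used_by(1, 0))  # B is used by A: the same fact, read from the symmetric cell
```

The point of the sketch is that the blue and green cells are two views of a single matrix entry, which is why they always come in symmetric pairs.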

Here is a 5-minute introduction video if you are not familiar with the dependency matrix:

Clearly the graph is more intuitive, but apart from the two red arrows that represent two pairs of mutually dependent namespaces, this graph tells us little about the overall structure.

On the other hand, the matrix algorithm naturally attempts to layer code elements, exhibits dependency cycles, and shows which elements are used heavily or not at all. Let's enumerate some structural patterns that can be visualized at a glance with the dependency matrix:

Layers

One pattern that is made obvious by a DSM is a layered structure (i.e., an acyclic structure). When the matrix is triangular, with all blue cells in the lower-left triangle and all green cells in the upper-right triangle, it shows that the structure is perfectly layered. In other words, the structure doesn't contain any dependency cycle.
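
The "triangular matrix means no cycle" property can be checked mechanically: a structure is layerable exactly when its elements admit a topological order. Here is a hedged Python sketch over hypothetical 3-element matrices (row-uses-column encoding):

```python
# A structure is layered (acyclic) iff its elements can be reordered so that
# the dependency matrix becomes triangular. Repeatedly peeling off elements
# that use nothing among the remaining ones finds such an order; if at some
# point no such element exists, the structure contains a dependency cycle.
def try_layer(dsm):
    n = len(dsm)
    remaining = set(range(n))
    order = []
    while remaining:
        # Elements that use nothing among the remaining ones (lowest layer).
        free = [i for i in sorted(remaining)
                if not any(dsm[i][j] for j in remaining if j != i)]
        if not free:
            return None  # no such element: a dependency cycle exists
        order.extend(free)
        remaining -= set(free)
    return order  # reordering rows/columns by `order` yields a triangular matrix

layered = [[0, 0, 0], [1, 0, 0], [1, 1, 0]]   # C -> B -> A: layerable
cyclic  = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]   # A -> B -> C -> A: a cycle

print(try_layer(layered))  # an ordering exists: the structure is layered
print(try_layer(cyclic))   # None: no ordering can make the matrix triangular
```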

On the right part of the snapshot, the same layered structure is represented with a graph. All arrows have the same left-to-right direction. The problem with graphs is that their layout doesn't scale. Here, we can barely see the big picture of the structure. If the number of boxes were doubled, the graph would be completely unreadable. On the other hand, the DSM representation wouldn't be affected; we say that the DSM scales better than the graph.

Notice that NDepend proposes 2 rules out of the box to control layering by preventing dependency cycles to appear: ND1400 Avoid namespaces mutually dependent and ND1401 Avoid namespaces dependency cycles.

Interestingly enough, most graph layout algorithms rely on the fact that a graph is acyclic. To compute the layout of a graph with cycles, these algorithms temporarily discard some dependencies to work with a layered graph, and then add the discarded dependencies back in the last step of the computation.

Cycles

If a structure contains a cycle, the cycle is displayed as a red square on the DSM. We can see that inside the red square, green and blue cells are mixed across the diagonal. There are also some black cells that represent mutual direct usage (i.e., A is using B and B is using A).

The NDepend DSM comes with the option Indirect Dependency. An indirect dependency between A and B means that A is using something, that is using something, that is using something… that is using B. Below, the same DSM with a cycle is shown in indirect mode. We can see that the red square is filled up with only black cells. It just means that given any elements A and B in the cycle, A and B are indirectly and mutually dependent.
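
The indirect mode corresponds to the transitive closure of the direct-usage relation, which can be sketched with Warshall's algorithm (the 3-element cycle below is a hypothetical example of mine):

```python
# "Indirect Dependency" mode: A depends indirectly on B if some usage chain
# leads from A to B. This is the transitive closure of the direct relation,
# computed here with Warshall's algorithm.
def transitive_closure(dsm):
    n = len(dsm)
    reach = [[dsm[i][j] > 0 for j in range(n)] for i in range(n)]
    for k in range(n):
        for i in range(n):
            for j in range(n):
                reach[i][j] = reach[i][j] or (reach[i][k] and reach[k][j])
    return reach

# A -> B -> C -> A: a 3-element dependency cycle.
dsm = [[0, 1, 0], [0, 0, 1], [1, 0, 0]]
reach = transitive_closure(dsm)

# Inside the cycle, every pair is mutually (indirectly) dependent: in indirect
# mode the whole red square is filled with "black" cells.
print(all(reach[i][j] for i in range(3) for j in range(3)))
```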

Here is the same structure represented with a graph. The red arrow shows that several elements are mutually dependent. But the graph is of no help in highlighting all the elements involved in the parent cycle.

Notice that in NDepend we provide a button to highlight cycles in the DSM (if any). If the structure is layered, this button triangularizes the matrix and keeps non-empty cells as close as possible to the diagonal.

High Cohesion / Low-Coupling

The idea of high cohesion (inside a component) / low coupling (between components) is popular nowadays. But if one cannot measure and visualize dependencies, it is hard to get a concrete evaluation of cohesion and coupling. The DSM is good at showing high cohesion. In the DSM below, an obvious squared aggregate around the diagonal is displayed. It means that the elements involved in the square have high cohesion: they are strongly dependent on each other. Moreover, we can see that they are layered since there is no cycle. They are certainly candidates to be grouped into a parent artifact (such as a namespace or an assembly).

On the other hand, the fact that most cells around the square are empty advocates for low coupling between the elements of the square and the other elements.

In the DSM below, we can see 2 components with high cohesion (upper and lower squares) and pretty low coupling between them.

While refactoring, having such an indicator can be pretty useful to know whether there are opportunities to split coarse components into several more fine-grained components.
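
What the eye reads off the DSM here can also be quantified. The metric below is my own simplistic illustration (not NDepend's): cohesion as the share of non-empty cells inside a candidate group, coupling as the share of non-empty cells between two groups, over a hypothetical 4-element matrix:

```python
# density(dsm, rows, cols): fraction of non-empty cells in the sub-matrix
# formed by `rows` x `cols` (diagonal excluded). High density inside a group
# means high cohesion; low density between groups means low coupling.
def density(dsm, rows, cols):
    cells = [(i, j) for i in rows for j in cols if i != j]
    used = sum(1 for (i, j) in cells if dsm[i][j] > 0)
    return used / len(cells)

# Hypothetical 4-element DSM: {0, 1} and {2, 3} form two tight components.
dsm = [
    [0, 2, 0, 0],
    [1, 0, 0, 0],
    [0, 1, 0, 3],
    [0, 0, 2, 0],
]

g1, g2 = [0, 1], [2, 3]
print(density(dsm, g1, g1))  # high cohesion inside group 1
print(density(dsm, g2, g2))  # high cohesion inside group 2
print(density(dsm, g1, g2))  # low coupling from group 1 to group 2
print(density(dsm, g2, g1))  # low coupling from group 2 to group 1
```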

Too many responsibilities

The popular Single Responsibility Principle (SRP) states that a class shouldn't have more than one reason to change. Another way to interpret the SRP is that a class shouldn't use too many different other types. If we extend the idea to other levels (assemblies, namespaces and methods), certainly, if a code element uses dozens of different other code elements (at the same level), it has too many responsibilities. Often the term God class or God component is used to qualify such a piece of code.

The DSM can help pinpoint code elements with too many responsibilities. Such a code element is represented by a column with many blue cells and a row with many green cells. The DSM below exposes this phenomenon.
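
In matrix terms this is just fan-out: the count of distinct elements an element uses. A hedged sketch over a hypothetical 5-element matrix (row-uses-column encoding as in the earlier sketches; the threshold is arbitrary, chosen only for the example):

```python
# An element's fan-out is the number of non-empty cells in its row: how many
# distinct elements it uses. An element using dozens of others at the same
# level likely has too many responsibilities (a "god component").
def fan_out(dsm, i):
    return sum(1 for j in range(len(dsm)) if j != i and dsm[i][j] > 0)

# Hypothetical DSM where element 0 uses every other element.
dsm = [
    [0, 1, 4, 2, 7],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 0, 0, 0],
]

THRESHOLD = 3  # arbitrary cut-off, for the sake of the example
suspects = [i for i in range(len(dsm)) if fan_out(dsm, i) > THRESHOLD]
print(suspects)  # element 0 is flagged as a potential god component
```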

Popular Code Elements

A popular code element is used by many other code elements. Popular code elements are unavoidable (think of the String class for example).

A popular code element is not a flaw. However, it is advised that popular elements be interfaces and enumerations. This way consumers rely on abstractions and not on implementation details. The benefit is that consumers are broken less often, because abstractions are less subject to change than implementations.

A popular code element is represented by a column with many green cells and a row with many blue cells. The DSM below highlights a popular code element.
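
Popularity is the symmetric measure, fan-in: how many elements use a given one. A small sketch over hypothetical data (same row-uses-column encoding as above):

```python
# An element's fan-in is the number of non-empty cells in its column: how many
# elements use it. The element with the highest fan-in is the most popular one,
# and a good candidate to be exposed as an interface or enumeration.
def fan_in(dsm, j):
    return sum(1 for i in range(len(dsm)) if i != j and dsm[i][j] > 0)

# Hypothetical DSM where element 2 is used by all three other elements.
dsm = [
    [0, 0, 3, 0],
    [0, 0, 1, 0],
    [0, 0, 0, 0],
    [0, 0, 2, 0],
]

most_popular = max(range(len(dsm)), key=lambda j: fan_in(dsm, j))
print(most_popular)  # element 2: consumers should ideally depend on its abstraction
```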

Something to notice is that when one keeps the code structure perfectly layered, popular components are naturally kept at a low level. Indeed, a popular component cannot, de facto, use many things: because popular components are low-level, they cannot use something at a higher level, since that would create a dependency from low-level to high-level and break the acyclic property of the structure.

Mutual dependencies

You can see the coupling between 2 components by right-clicking a non-empty cell and selecting the menu Open this dependency.

If the opened cell was black, as in the snapshot above (i.e., if A and B are mutually dependent), then the resulting rectangular matrix will contain both green and blue cells (and possibly black cells as well), as in the snapshot below.

In this situation, you'll often notice a deficit of green or blue cells (3 blue cells for 1 green cell here). This is because, even if 2 code elements are mutually dependent, there often exists a natural level order between them. For example, consider the System.Threading namespace and the System.String class. They are mutually dependent; they both rely on each other. But the matrix shows that Threading is much more dependent on String than the opposite (there are many more blue cells than green cells). This confirms the intuition that Threading is at a higher level than String.
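
This heuristic (the heavier direction of a mutual dependency points at the higher-level element) can be written down in a few lines. The weights below are hypothetical illustrations, not the real Threading/String member counts:

```python
# Even when two elements are mutually dependent, the asymmetry between the two
# cells often reveals a natural level order: the element that depends the most
# on the other is likely the higher-level one.
def likely_higher_level(dsm, a, b):
    if dsm[a][b] == dsm[b][a]:
        return None  # no asymmetry to exploit
    return a if dsm[a][b] > dsm[b][a] else b

THREADING, STRING = 0, 1
dsm = [[0, 3],   # "Threading" uses 3 members of "String" (hypothetical counts)
       [1, 0]]   # "String" uses 1 member of "Threading": mutual but asymmetric

print(likely_higher_level(dsm, THREADING, STRING))  # Threading sits higher
```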

The continuous adaptation of Visual Studio extensions

One could think that developing an extension for a product as mature as the two-decades-old Visual Studio would be headache-free.

Not really. Visual Studio is a big, big beast used daily by millions of professional developers. Version after version, Microsoft spends an astounding amount of effort to improve it and make sure it respects the latest standards in terms of dev platforms, IDE features, usability and performance.

This implies a lot of deep changes, and all VS extensions must adapt to these changes… or die. Most of this adaptation effort comes from the support of new platforms (.NET Core, Xamarin, the upcoming .NET 5…) but here I'd like to focus on recent VS integration changes.

Deferred extensions loading

On Monday 11th December 2017, all VS partners received an email announcing VS package deferred loading, also referred to as extension asynchronous initialization. In this mode VS extensions are loaded after VS startup and initialization. As a consequence, VS startup is not slowed down by extensions.

Implementing this scenario required serious changes, described here, but this deferred loading scenario became mandatory only with VS 2019.1 (16.1), released in spring 2019, around 18 months after the initial announcement. VS extension ISVs got plenty of time (and plenty of high-quality support from Microsoft engineers) to implement and test this async loading scheme.

VS2019 Extensions Menu

In February 2019, while developing the NDepend support for VS 2019, we downloaded the latest VS2019 preview, installed our NDepend extension and immediately noticed that there was no NDepend global menu anymore. After investigation, we found our NDepend main menu, and other extensions' menus, under a new Extensions menu, with no possibility to get the extension menus back into the main bar.

Visual Studio 2019 Extensions Menu

This change impacts millions of extension users and hundreds of partner ISVs, but it was never publicly announced ahead of time nor debated. As an immediate consequence, someone asked to get rid of this new Extensions menu, and this generated a long list of complaints.

I guess this major (and blunt) change was initially provoked by the VS2019 compact menu look, but I am not sure: even on a 1920-pixel-wide monitor (DPI 100%) there is still plenty of horizontal space to show extension menus.

Visual Studio 2019 Compact Menu

Sometime in April 2019, someone anonymously published in the discussion a link to a new extension that puts extensions' main menus back in the main bar.

Getting back Extensions in Main Menu

I guess the VS APIs used to achieve this magic are not well documented, hence we can wonder whether this initiative comes from Microsoft or not. At least users can now get their extension menus back in the main bar. But the Extensions in Main Menus extension has been downloaded only 800 times in around 3 months. This means only a tiny subset of VS extension users are aware of it and actually using it.

Personally, I still hope Microsoft will reconsider this move and embed a similar option in a future VS version to let users choose which features (out-of-the-box VS features or extension features) deserve no-brainer, main-bar access.

VS2019 Optimize rendering for screens with different pixel densities

We published NDepend v2019.2.0 on 14th March 2019, proudly supporting VS2019 and .NET Core 3. Two weeks later, a user reported severe UI bugs we had never observed. The repro scheme was easy:

  • install .NET Fx 4.8,
  • enable the new VS2019 option Optimize rendering for screens with different pixel densities,
  • run VS2019 with the NDepend extension in a multi-monitor environment with a different DPI scale for each monitor.
Optimize rendering for screens with different pixel densities

We then discussed with VS engineers. They pointed us to this documentation and kindly offered quite a bit of custom advice: Per-Monitor Awareness support for Visual Studio extenders.

Actually, this change is pretty deep for (the tons of) WPF and Winforms controls rendered in the Visual Studio UI. It took us time, but on July 3rd 2019 we released NDepend v2019.2.5 with full support for this Optimize rendering for screens with different pixel densities option. This required quite a few tricky fixes, and we wish we had had a few months to adapt to this major breaking change, as we had for the deferred loading explained above. As far as I know, extension ISVs were not informed in advance. On the other hand, I'd like to underline the awesome support we got from Visual Studio engineers (as always, actually); thanks again to all of them.

I noticed that Resharper pops up a dialog at VS2019 startup (not sure whether they got everything fixed at the time of writing).

Resharper Message related to Optimize rendering for screens with different pixel densities

I also noticed that up to VS2019 16.1.3 there were still some bugs in the VS UI; see the Project Properties and Diagnostics Tool windows below. I just tried with 16.1.5 and it seems fixed.

VS 2019 16.1.3 UI issues (now fixed)

My point is that this great improvement (Optimize rendering for screens with different pixel densities) should have been highlighted ahead of time, during the VS2019 preview period, to avoid the wide range of UI bugs experienced by users during the early months of the VS2019 RTM.

What can we expect next?

With the .NET 5 public announcement on May 6th 2019, the .NET community now knows that sooner or later most .NET Core or .NET Fx applications will have to be migrated to benefit from the latest innovations and improvements.

Hopefully the migration of .NET Core apps will be quite straightforward, since .NET 5 is .NET Core vNext. But the migration of .NET Fx apps will be (more or less) painful. Although the WPF and Winforms desktop APIs are already supported by .NET Core 3, some other major APIs are not, and won't be, supported (ASP.NET Web Forms, WCF, Windows Workflow, .NET Remoting, AppDomain…).

For migration plans that are too complex, here is Microsoft's recommendation: What do you do with your older applications that you are not spending much engineering time on? We recommend leaving these on .NET Framework. (…) .NET Framework will continue to be supported and will receive minor updates. Even here at Microsoft, many large products will remain on .NET Framework. There are absolutely no changes to support and that will not change in the future. .NET Framework 4.8 is the latest version of .NET Framework and will continue to be distributed with future releases of Windows. If it is installed on a supported version of Windows, .NET Framework 4.8 will continue to be supported too.

Visual Studio 2019 is still running in a 32-bit process on the .NET Fx 4 CLR, with a massive WPF-based UI and some COM usage.

Can we expect Visual Studio to run on .NET 5 sometime in 2021? Or in a later version? Or never?

Only the future will tell, but I hope VS extension ISVs will be advised ahead of time, to anticipate the migration and continue to offer a seamless integration experience to users.

Simplifying a Visual Studio extension menu: A Case Study

NDepend version 2019.2.1 has just been released. This new version proposes a simplified menu for the NDepend Visual Studio extension.

Before going further, I'd underline that we didn't get rid of the standard menu: the user can switch back and forth between the simplified and standard menus with a single click. If there is one thing in UI that all users worldwide dislike, it is being forced into a new UI just for the sake of it.

The simplified menu is shown by default on machines that have never run NDepend; otherwise the standard menu is shown. Here is a view of the simplified/beginner menu vs. the standard/advanced menu.

NDepend simplified/beginner menu vs. standard/advanced menu

We introduced a simplified menu because we think trial users and beginner users will benefit from it. Having most features available through a single-level menu certainly makes the product features more discoverable.

More and Tools popup menus

A More… popup menu is here to show features and actions that are typically used less often. It is a UI trade-off to require an extra click to access these actions:

More Actions Menu

The Tools popup menu offers an opportunity to switch to the advanced menu and see the NDepend Start Page. It is also a good place to discover the NDepend tooling outside Visual Studio: the standalone VisualNDepend.exe, the NDepend API and OSS Power Tools, the Azure DevOps extension and other Continuous Integration possibilities:

Tools Menu Actions

Opportunities to clean up the user experience

Having a simplified menu is also an opportunity to encourage the user to complete some configuration steps. For example, in the first screenshot we have the Explore Code Coverage button. With NDepend, the code coverage by tests of a code base can be harnessed in multiple ways, as explained here. But to harness code coverage data, the user has to import it first. Hence, by default an Import Code Coverage button is shown to the user, and the Explore Code Coverage button is shown only once data has been imported.

Another important NDepend feature is comparing against a baseline. As far as I know, NDepend is the only Visual Studio tool that lets the user diff the actual code base against a baseline. The user can then focus on new issues since the baseline, or on diff-related issues like API Breaking Changes or Code Smells Regression. The baseline can also be used to review code changes since the baseline. When NDepend analyzes a large legacy code base, letting the user focus on new issues since the baseline is a very important facility, to avoid leaving users lost in the prioritization of thousands of issues.

By default the baseline is set to the first analysis result obtained. This means that the first time the user runs an analysis on a code base, this first analysis result is used both as the actual snapshot and as the baseline snapshot. In this situation, when the user clicks the buttons New Issues since Baseline or Diff since Baseline, there is nothing to show. Thus we take the chance to show an informational message box that explains this no-diff situation. Doing so avoids the user being stuck with a no-result action or, even worse, a disabled button with no obvious reason. We also hope these explanations will motivate the user to work on changes and then re-analyze, to see what NDepend has to say about the new changes.

Having a single level menu just became mandatory

Another point to underline is that having a single-level VS extension menu has just become mandatory. Visual Studio 2019 just introduced the Extensions top-level menu, and extensions cannot have their own top-level menus anymore.

Visual Studio 2019 Extensions Menus

This move adds an extra click to access each extension feature. This extra click represents a UI regression for the millions of VS extension users and de facto discards features accessible only through a nested sub-menu. This move was unexpected because it solves a UI problem we (as a VS extension provider) have never heard of: having so many extension menus that it becomes a cluttering problem. The only right way to solve such a problem would be to let the user choose which extensions keep their top-level menu and which extensions live under the Extensions menu. An extension menu should be top-level by default upon installation: when a user installs an extension, the goal is to spend the next minutes or hours trying it and using it. In this context it doesn't make sense to relegate the extension menus to a second zone; doing so degrades the precious and fragile FTUE (First-Time User Experience). A VS extension doesn't get a second chance to make a first impression on a trial user.

Several big names in the VS extension industry have already complained about this move here. You can vote and comment now, because apparently Microsoft will monitor this thread only until mid-May 2019.


 

We hope this simplified menu will help trial and beginner users get started with all the NDepend goodies. Certainly some seasoned users will also prefer this less cluttered menu style. If you have comments or ideas for further improvement, let us know.

On forced UI changes, you can read further in Scott Hanselman's 2012 post Who moved my cheese?

People don’t like it when you move their cheese. They are just trying to get through the maze, they have it all under control and then, poof, someone moved their cheese. Now it’s a huge hassle to find it again. Change a hotkey or the case of the menus and all heck breaks loose.

 

An in-depth analysis of .NET Core 3.0 support for WPF and Winforms APIs

.NET Core 3.0 will be RTM soon and it supports WPF and Winforms APIs.

In my last post, I explored the new APIs of .NET Core 3.0 by comparing with NDepend the compiled bits of .NET Core 3.0 against those of .NET Core 2.2.

In this post I will compare the .NET Core 3.0 Windows Forms (Winforms) and WPF APIs with those of .NET Framework 4.x.

I won't keep you in suspense: .NET Core 3.0 support for the Winforms and WPF APIs is almost complete; I found very few breaking changes.

I will now explain what I did with NDepend to explore this API diff, and then dig into the results. If you are wondering how to port desktop applications to .NET Core 3.0, see the Microsoft explanations here.

Comparing .NET Core 3.0 Winforms and WPF APIs vs. .NET Framework 4.x with NDepend

From the NDepend Start Page, select the Compare 2 versions of a code base menu. Then use the Add assemblies in Folder button to add the .NET Framework assemblies from the folder C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319 and the Microsoft.WindowsDesktop.App NuGet package assemblies (from the folder C:\Users\psmac\.nuget\packages\microsoft.windowsdesktop.app\3.0.0-preview-27325-3\ref\netcoreapp3.0 on my machine).

Comparing .NET Core 3.0 Winforms WPF APIs with .NET Framework

A minor difficulty was to isolate the exact set of assemblies to focus on. Here is the list of concerned assemblies I came up with:

Notice that to preserve the correspondence between APIs and assembly packaging, the TypeForwardedToAttribute is used massively to delegate implementations.

Usage of TypeForwardedToAttribute

The few breaking changes

With the default NDepend rules about API breaking changes, I found only 16 missing public types and 52 missing public methods. Here are the types:

16 types missing in WPF Winforms .NET Core 3.0 API

16 types missing out of a total of 4,095 public types: well done!

Number of public types per assemblies

The 52 missing public methods are (out of a total of 42,645 public methods):

| Parent Assembly Name | Parent Type Name | Method Name |
|---|---|---|
| System.Security.Principal.Windows | System.Security.Principal.WindowsIdentity | .ctor(String,String) |
| System.Security.Principal.Windows | System.Security.Principal.WindowsIdentity | Impersonate() |
| System.Security.Principal.Windows | System.Security.Principal.WindowsIdentity | Impersonate(IntPtr) |
| System.Security.Principal.Windows | System.Security.Principal.IdentityReferenceCollection | get_IsReadOnly() |
| System.Security.Permissions | System.Net.EndpointPermission | ToString() |
| System.Security.Permissions | System.Security.HostProtectionException | GetObjectData(SerializationInfo,StreamingContext) |
| System.Security.Permissions | System.Security.Policy.ApplicationDirectory | Clone() |
| System.Security.Permissions | System.Security.Policy.ApplicationTrust | Clone() |
| System.Security.Permissions | System.Security.Policy.PermissionRequestEvidence | Clone() |
| System.Security.Permissions | System.Security.Policy.Site | Clone() |
| System.Security.Permissions | System.Security.Policy.StrongName | Clone() |
| System.Security.Permissions | System.Security.Policy.Url | Clone() |
| System.Security.Permissions | System.Security.Policy.Zone | Clone() |
| System.Security.Permissions | System.Security.Policy.GacInstalled | Clone() |
| System.Security.Permissions | System.Security.Policy.Hash | Clone() |
| System.Security.Permissions | System.Security.Policy.Publisher | Clone() |
| System.Security.Cryptography.Pkcs | System.Security.Cryptography.Pkcs.EnvelopedCms | .ctor(SubjectIdentifierType,ContentInfo) |
| System.Security.Cryptography.Pkcs | System.Security.Cryptography.Pkcs.EnvelopedCms | .ctor(SubjectIdentifierType,ContentInfo,AlgorithmIdentifier) |
| System.Security.Cryptography.Pkcs | System.Security.Cryptography.Pkcs.EnvelopedCms | Encrypt() |
| System.Security.Cryptography.Pkcs | System.Security.Cryptography.Pkcs.ContentInfo | Finalize() |
| System.Security.Cryptography.Cng | System.Security.Cryptography.ECDiffieHellmanCng | FromXmlString(String) |
| System.Security.Cryptography.Cng | System.Security.Cryptography.ECDiffieHellmanCng | ToXmlString(Boolean) |
| System.Security.Cryptography.Cng | System.Security.Cryptography.ECDsaCng | FromXmlString(String) |
| System.Security.Cryptography.Cng | System.Security.Cryptography.ECDsaCng | ToXmlString(Boolean) |
| System.Security.Cryptography.Cng | System.Security.Cryptography.RSACng | DecryptValue(Byte[]) |
| System.Security.Cryptography.Cng | System.Security.Cryptography.RSACng | EncryptValue(Byte[]) |
| System.Security.Cryptography.Cng | System.Security.Cryptography.RSACng | get_KeyExchangeAlgorithm() |
| System.Security.Cryptography.Cng | System.Security.Cryptography.RSACng | get_SignatureAlgorithm() |
| System.Printing | System.Printing.PrintQueue | set_Name(String) |
| System.Printing | System.Printing.IndexedProperties.PrintInt32Property | op_Implicit(PrintInt32Property) |
| System.Printing | System.Printing.IndexedProperties.PrintStringProperty | op_Implicit(PrintStringProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintStreamProperty | op_Implicit(PrintStreamProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintQueueAttributeProperty | op_Implicit(PrintQueueAttributeProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintQueueStatusProperty | op_Implicit(PrintQueueStatusProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintBooleanProperty | op_Implicit(PrintBooleanProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintThreadPriorityProperty | op_Implicit(PrintThreadPriorityProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintServerLoggingProperty | op_Implicit(PrintServerLoggingProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintDriverProperty | op_Implicit(PrintDriverProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintPortProperty | op_Implicit(PrintPortProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintServerProperty | op_Implicit(PrintServerProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintTicketProperty | op_Implicit(PrintTicketProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintByteArrayProperty | op_Implicit(PrintByteArrayProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintProcessorProperty | op_Implicit(PrintProcessorProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintQueueProperty | op_Implicit(PrintQueueProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintJobPriorityProperty | op_Implicit(PrintJobPriorityProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintJobStatusProperty | op_Implicit(PrintJobStatusProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintDateTimeProperty | op_Implicit(PrintDateTimeProperty) |
| System.Printing | System.Printing.IndexedProperties.PrintSystemTypeProperty | op_Implicit(PrintSystemTypeProperty) |
| System.Printing | System.Windows.Xps.XpsDocumentWriter | raise__WritingProgressChanged(Object,WritingProgressChangedEventArgs) |
| System.Printing | System.Windows.Xps.XpsDocumentWriter | raise__WritingCompleted(Object,WritingCompletedEventArgs) |
| System.Printing | System.Windows.Xps.XpsDocumentWriter | raise__WritingCancelled(Object,WritingCancelledEventArgs) |
| System.Drawing | System.Drawing.FontConverter | Finalize() |

Portability to .NET Core 3.0 analysis

Microsoft offers a Portability Analyzer tool to analyze changes in desktop APIs that will break your desktop app. I tested it on NDepend, but I got only very coarse results. Did I miss something? At least it is mostly green 🙂

Portability Analyzer analysis on NDepend

Last year I wrote a post named Quickly assess your .NET code compliance with .NET Standard; let me know in the comments if it is worth revisiting that post for desktop APIs. By the way, my guess is that desktop APIs won't be part of .NET Standard vNext (since there is no plan to support them on all platforms), but I haven't found any related info on the web.

Why migrate your desktop app to .NET Core 3.0?

It is great news that Microsoft has embedded the good old desktop APIs in .NET Core 3.0 with such outstanding compatibility. It is worth noting that so far (February 2019) there is no plan to port Windows Forms and WPF to platforms other than Windows. So, what are the benefits of porting an existing application to .NET Core 3.0?

I found answers in the recent 30-minute Channel9 video How to Port Desktop Applications to .NET Core 3.0, at 5:12. Basically, you'll get more deployment flexibility, Core runtime and API improvements, and better performance.

Microsoft promises not to urge anyone to port existing Winforms and WPF applications to .NET Core 3.0. However, for a Visual Studio extension shop like us, if it is decided that VS will run on .NET Core 3.0 in the future, we hope to be notified many months ahead. We discussed this on Twitter with Amanda Silver in January 2019, and it looks like a decision will be made in spring 2019. As a consequence, to support both past Visual Studio versions running on the .NET Framework and new versions running on .NET Core 3, an extension will need to support both the .NET Fx and .NET Core 3 desktop APIs.


Visual Studio Enterprise vs. Professional: Essential Differences

If you’re a .NET developer, then it’s overwhelmingly likely that you’re a Visual Studio user. There are alternatives to it, sure. But the product from the Redmond giant is the go-to when it comes to developing for the .NET Framework.

For a newcomer, though, things can get confusing since Visual Studio isn’t a single thing.

Instead, it comes in several shapes and sizes.

Which one should you pick? What are the features that matter for your use case?

Since Visual Studio isn’t free—most editions aren’t, at least—you want to get the best bang for your buck.

That’s what this post is going to cover. As its title suggests, we’ll focus on the differences between the enterprise and professional editions of the integrated development environment (IDE).

By the end of the post, you’ll have learned enough to make an informed decision on which version of the IDE better suits your needs.

Wishing your edition of Visual Studio had full architecture tooling support?

Download a free trial of NDepend  and check out all of the architecture visualization you can get without needing to upgrade your VS edition.

Let’s get started.

Continue reading Visual Studio Enterprise vs. Professional: Essential Differences

On the Superiority of the Visual Studio Dark Theme

When I downloaded the newest version of NDepend, something wonderful awaited me.  Was it support for the latest .NET Core version?  The addition of checks for ubiquitous language for DDD projects?  Any of the various rule additions or bug fixes?

Nope.  I’m a lot more superficial than that, apparently.  I saw this and was incredibly excited:

Dark Theme

In case you’re scratching your head and wondering, “what?” instead of sharing my delight, I’ll be more explicit.  NDepend added support for the Visual Studio dark theme.  And I absolutely love it.

Asserting the Superiority of the Visual Studio Dark Theme

Why so excited?  Well, as a connoisseur of the Visual Studio dark theme, this makes my experience with the tool just that much better.

Don’t get me wrong.  I didn’t mind the interface up until this release, per se.  I’ve happily used NDepend for years and never really thought about its color scheme.  But this version is the equivalent of someone going into my house where I already like the bathroom and installing a radiant floor.  I never knew I was missing it until I had it.

Everything in Visual Studio is better in the dark.

Oh, I know there’s evidence to the contrary.  People are, apparently, 26% more accurate when reading dark text on a light background than vice-versa. And it’s easier to focus on and remember text in a light theme.  I understand all of that.  And I don’t care.

The dark theme is still better.

Continue reading On the Superiority of the Visual Studio Dark Theme

A Guide to Code Coverage Tools for C#


I promise that I’ll get to a treatment of code coverage tools in short order.  In this post, I’ll go through 6 different options and list their features to help you make a decision.

But first, I want to offer a quick disclaimer and warning.

If you’re here because you’re looking for a way to let management track the team’s code coverage progress, please stop and reconsider your actions.

I’ve written in the past about how code coverage should not be a management concern, and that holds as true now as ever.  Code coverage is a tool that developers should use — not something that should be done to them.  Full stop.

With that out of the way, let’s get back to helping developers pick a tool to keep an eye on code coverage.

Interested in seeing a heat map of your codebase’s code coverage?

Download the NDepend trial for free and see what it looks like.

What Is Code Coverage and Why Do It?

Okay, with that out of the way, let’s talk extremely briefly about what code coverage is.

And take note that this is going to be a very simple explanation, glossing over the fact that there are actually (slightly) different ways to measure coverage.  But the short form is this.  Code coverage measurements tell you which lines of code your test suite executes and which lines it doesn’t.

Let’s look at the simplest imaginable example.  Here’s a pretty dubious implementation of division:
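A minimal sketch of such a method, consistent with the discussion below (the exact sample is an assumption, not reproduced from the post):

```csharp
using System;

public static class Calculator
{
    // A deliberately dubious Divide: three executable lines.
    public static int Divide(int x, int y)
    {
        if (y != 0)                                          // line 1: the conditional
            return x / y;                                    // line 2: the happy path
        throw new ArgumentException("y must not be zero.");  // line 3: never tested
    }
}
```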

Let’s say this was the only method in our codebase.  Let’s also say that we wrote a single unit test, from which we invoked Divide(2, 1) and asserted that we should get back 2.

We would then have 67% code coverage.  Why?

Well, this test would cause the runtime to test the conditional and then to execute the return x/y statement, thus executing two-thirds of the method.  Absent any other tests, no automated test ever executes that last line.

So in the grossest of terms, that’s the “what.”  As for the “why,” developers can use code coverage analysis to see holes in their testing efforts.  For instance, code coverage analysis would tell us here, “Hey, you need to test the case where you pass a 0 to the method for y.”

What Are Code Coverage Tools?

What, then, are code coverage tools?

Well, as you can imagine, computing code coverage the way I just did would get pretty labor intensive.

So developers have done what developers do.  They’ve automated code coverage detection.

In just about any language and tech stack imaginable, you have ways to detect how thoroughly the unit test suite covers your codebase.  (Don’t confuse code coverage with an assessment of the quality of your tests, though.  It’s just a measure of whether or not the tests execute the code.)

The result is an array of tools to choose from that help you see how well your test suite covers your codebase.  And that can become somewhat confusing.  So today, I’ll take you through some of your options in the .NET ecosystem.

Let’s take a look at those options.

Continue reading A Guide to Code Coverage Tools for C#

Static analysis of .NET Core 2.0 applications

NDepend v2017.3 has just been released with major improvements. One of the most requested features, now available, is the support for analyzing .NET Core 2.0 and .NET Standard 2.0 projects. .NET Core and its main flavor, ASP.NET Core, represents a major evolution for the .NET platform. Let’s have a look at how NDepend is analyzing .NET Core code.

Resolving .NET Core third party assemblies

In this post I’ll analyze the OSS application ASP.NET Core / EntityFramework MusicStore, hosted on GitHub. From the Visual Studio solution file, NDepend resolves the application assembly MusicStore.dll and also two test assemblies that we won’t analyze here. In the screenshot below, we can see that:

  • NDepend recognizes the .NET profile, .NET Core 2.0, for this application.
  • It resolves several folders on the machine that are related to .NET Core, especially NuGet package folders.
  • It resolves all 77 third-party assemblies referenced by MusicStore.dll. This is important since many code rules and other NDepend features take into account what the application code is using.

It is worth noting that the .NET Core platform assemblies have high granularity. A simple website like MusicStore references no fewer than 77 assemblies. This is because the .NET Core framework is implemented through a few NuGet packages that each contain many assemblies. The idea is to ship the application with only the assemblies it needs, in order to reduce the memory footprint.

.NET Core 2.0 third party assemblies granularity

NDepend v2017.3 has a new heuristic to resolve .NET Core assemblies. This heuristic is based on .deps.json files that contain the names of the NuGet packages referenced. Here we can see that 3 NuGet packages are referenced by MusicStore. From these package names, the heuristic will resolve third-party assemblies (in the NuGet store) referenced by the application assemblies (MusicStore.dll in our case).
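For illustration, the package references in a .deps.json file look roughly like this (the package names and versions below are hypothetical, not the actual MusicStore content):

```json
{
  "targets": {
    ".NETCoreApp,Version=v2.0": {
      "MusicStore/1.0.0": {
        "dependencies": {
          "Microsoft.AspNetCore.All": "2.0.0",
          "Microsoft.EntityFrameworkCore": "2.0.0"
        }
      }
    }
  }
}
```

From these dependency names, the heuristic can locate the corresponding assemblies in the NuGet package folders.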

NuGet packages referenced in .deps.json file

Analyzing .NET Standard assemblies

Let’s be clear that NDepend v2017.3 can also analyze .NET Standard assemblies. Interestingly enough, since .NET Standard 2.0, .NET Standard assemblies reference a single assembly named netstandard.dll, found in C:\Users\[user]\.nuget\packages\NETStandard.Library\2.0.0\build\netstandard2.0\ref\netstandard.dll.

By decompiling this assembly, we can see that it doesn’t contain any implementation, but it does contain all types that are part of .NET Standard 2.0. This makes sense if we remember that .NET Standard is not an implementation, but is a set of APIs implemented by various .NET profiles, including .NET Core 2.0, the .NET Framework v4.6.1, Mono 5.4 and more.

Browsing how the application is using .NET Core

Let’s come back to the MusicStore application that references 77 assemblies. This assembly granularity makes it impractical to browse dependencies with the dependency graph, since this generates dozens of items. We can see that NDepend suggests viewing this graph as a dependency matrix instead.

NDepend Dependency Graph on an ASP.NET Core 2.0 project

The NDepend dependency matrix can scale seamlessly to a large number of items. The numbers in the cells also provide a good hint about the represented coupling. For example, here we can see that 22 members of the assembly Microsoft.EntityFrameworkCore.dll are used by 32 methods of the assembly MusicStore.dll, and a menu lets us dig into this coupling.

NDepend Dependency Matrix on an ASP.NET Core 2.0 project

Clicking the menu item Open this dependency shows a new dependency matrix where only the members involved are kept (the 32 elements in columns use the 22 elements in rows). This way you can easily dig into which part of the application is using what.

NDepend Dependency Matrix on an ASP.NET Core 2.0 project

All NDepend features now work when analyzing .NET Core

We saw how to browse the structure of a .NET Core application, but let’s underline that all NDepend features now work when analyzing .NET Core applications. On the Dashboard we can see code quality metrics related to Quality Gates, Code Rules, Issues and Technical Debt.

NDepend Dashboard on an ASP.NET Core 2.0 project

Also, most of the default code rules have been improved to avoid reporting false positives on .NET Core projects.

NDepend code rules on an ASP.NET Core 2.0 project

We hope you’ll enjoy using all your favorite NDepend features on your .NET Core projects!

C# Tools to Help with Your Code Quality


Over the years, one of the things I’ve come to love about the .NET ecosystem is the absolute abundance of tools to help you.  It’s an embarrassment of riches.  I enjoy writing code in C# because the language itself is great.  But C# tools take the experience to a whole other level.

I know, I know.  Some of you out there might argue that you get all of this goodness only by using heavyweight, resource-intensive tooling.  I’ll just go ahead and concede the point while saying that I don’t care.  I’m happy to work on a powerful development rig, outfitted with powerful tools, to write code in a highly productive language.

Today, I’d like to talk about some of these C# tools.  Or I should say I’d like to talk about some of the many C# tools you can use that are generally oriented toward the subject of code quality.

So, if you’re a C# developer, what are some tools you can use to improve the quality of your code?

Continue reading C# Tools to Help with Your Code Quality