Microsoft Build 2018 – Session Recommendations (Part 2)

As promised, I bring a couple more sessions from Microsoft Build 2018!

The first part of this list can be found here.

Be aware that I’m a .NET and C# lover, so these are the sessions I enjoy the most. If you want a deeper look into the session catalog, you can go through this list on Channel 9; all the sessions are there (or will be very soon).

But let’s start with my session recommendations.

.NET Overview & Roadmap

This session, presented by Scott Hanselman and Scott Hunter, was amazing. It showed some impressive numbers about .NET Core and how bright (and fast) the future will be.

They also presented a couple of ideas they are working on, and I was particularly AMAZED by the demo of the microservices template they are playing with for ASP.NET Core 2.2. You will see that the session’s audience reaction was also very positive.

API CLI

There were also great demos on .NET tooling and Blazor (Blazor is unbelievable so far).

I particularly loved it. Scott Hanselman was also joking non-stop on this one; it’s almost half tech, half stand-up comedy 😛

The session:

Pair Programming Made Awesome with Visual Studio Live Share

I tried really hard to get access to VS Live Share during the beta phase, but unfortunately it didn’t work out for me. Now it is really available (in preview) and it’s awesome!

The whole idea is to be able to share code files during a LIVE session so that developers can collaborate and help solve each other’s problems (or do anything else you can come up with in a collaboration session).

But how is it different from screen sharing? Well, it has a number of benefits, which are enumerated in this session by Jonathan Carter and Jon Chu.

I played with it a little bit with some friends (even friends from Brazil!) and it really is super fun.

Past, Present, and Future of .NET

I did a post not long ago about a session from Richard Campbell on the history of .NET, which was really great. If you watched it you might know that he is now working on a book about the History of .NET (!)

Now, in this Build 2018 talk, he explores some of that history with some major players from the .NET world: Scott Hunter, Beth Massi and Mads Torgersen.

This is really just a casual talk exploring some of the major events that led to the current state of the platform. It’s really nice to see how things played out to get us to where we are right now.

The player is a bit different on this one because it’s straight out of Channel9 and not from YouTube.

Technology Keynote: Microsoft Azure

Also traditional at this point is the Microsoft Azure Technology Keynote by Scott Guthrie.

Again there was a big focus on the AI power that Microsoft is adding to the Azure Cloud every day. This one has a number of really interesting demos!

  • Visual Studio Live Share (Amanda Silver & Jonathan Carter)
    • I’m really super excited about this feature! So much potential.
  • VS App Center + GitHub Integration (Simina Pasat)
  • VSTS and a Lot on CI/CD (Donovan Brown)
    • I consume a lot of CI/CD and DevOps resources at work, but I’m not really a specialist in configuring or maintaining them. Still, this demo made me want to know a lot more about it!
  • Kubernetes + Azure + Visual Studio (Scott Hanselman)
    • Nice demo on how those three tools integrate really well to make handling containers with Kubernetes on Azure as smooth as possible.
    • There was a funny technical problem during this Demo 😛 Scott Guthrie added some value to the presentation.
  • Serverless + IoT (Jeff Hollan)
    • I don’t need to tell you how much Microsoft is investing in this, right? Azure Functions are getting better and better at an amazing speed.
  • CosmosDB (Rimma Nehme)
  • Azure Search + Cognitive Skills (Paige Bailey)
  • Azure DataBricks (Paige Bailey)

As you can see, there were two demos by Paige Bailey, a Cloud Developer Advocate for Azure who shares a number of amazing posts on Twitter! You can follow her here.

And here is the session.

Nice wasn’t it? 😀

I still have plenty of videos in my queue, so I will probably make at least one more post with more session recommendations. Let me know if you are enjoying it!

Microsoft Build 2018 – Session Recommendations (Part 1)

As some of you might know, Microsoft Build 2018 happened this week (07-09 May), and it is always an awesome event for .NET developers!

As I make my way through as many sessions as my time allows, I would like to share some of the best ones (in my not-so-humble opinion) with all of you!

Vision Keynote

I will start with the most obvious one, the keynote from Satya Nadella. This year’s Vision Keynote focused a lot on AI in the cloud (Intelligent Cloud) and AI in IoT (Intelligent Edge), which is pretty interesting as I still lack a lot of experience in both AI/ML and IoT.

This keynote actually made me very excited about what Microsoft envisions for AI and machine learning in the .NET world, and I will try to keep my radar on these subjects.

I am particularly excited about ML.NET and the ever-growing power of the Cognitive Services.

As usual, there are some amazing numbers about the industry:

Industry Numbers

Here is the video:

The Future of C#

This session is a must for me by now, as every year Mads Torgersen and Dustin Campbell give an awesome presentation on what is coming next for my favorite language ever. (I love you, Python, but C# comes first.)

I was a bit disconnected from what is coming in C# 8.0, and this session was great for getting me up to speed on the stage those feature proposals are in.

If you wanna know more about C# language feature proposals (or maybe even propose your own!) you can check this repository on GitHub.

ML.NET

MACHINE LEARNING IN DOTNET! Do I need to say anything else? Okay, a bit more then:

  • It’s multi-platform (Windows, Linux and MacOS)
  • It’s Open Source (check it out here)
  • It’s still on Preview

The session is a really short one (20 minutes), and in this short time Rowan Miller walks through an application that can fix a piece of music (!) using machine learning with ML.NET.

It’s a relatively high-level overview of the framework’s capabilities, but it’s also super FUN, and powerful.

I am still waiting for them to upload some sessions that look really amazing.

But what do you think of these sessions so far?

I will post another list of my favorite sessions in a couple of days, after I’ve gone through most of my watch list 😛

Talk to you soon 🙂

The History of .NET by Richard Campbell

Hi everyone,
I am preparing a nice post on Azure Functions, coming very soon, but today I would like to share some history with you.

Today I found a really amazing video on YouTube about the history of .NET. The video is actually a recording of a talk at NDC { London } 2018, just two months ago, by Richard Campbell (from .NET Rocks).

In the talk, Richard, who is a great storyteller, tells the story of how Microsoft was cornered and had to reinvent itself many times over the last 20-25 years, and how that led to the creation of the .NET Framework and all the technologies entangled with it. He cannot avoid mentioning all the great people involved in this process, from the original designer of J++ (Microsoft’s implementation of Java) to the current maintainers of the .NET Framework and the C# language, and even the current CEO.

It also, as it could not be otherwise, covers a lot of the history of the Windows OS, Microsoft’s web browsers and the rise of Microsoft Azure!

If you code in any Microsoft language and like .NET and a bit of history, I assure you it is worth every second of the roughly 1 hour and 10 minutes.

Here is the video:

So, what do you think? Great, wasn’t it?


Creating an OData v4 API with ASP.NET Core 2.0

Hallo Mensen 😀 Let’s talk OData, shall we?

In the last few years my work has revolved a lot around REST APIs. I think most of us will agree that REST APIs are a really good way of obtaining data from a server without caring too much about the details of how to get access to that data.

You call a URL with the proper parameters (including auth or not), headers and HTTP verb, and you get some data back. Easy enough, right?

The negative side is that these APIs are either very rigid in the way they are implemented, meaning that you can only get the data in the exact shape defined by the server, or very verbose in their implementation, meaning that you will have a lot of code to maintain on the server side to make the API flexible.

Then what?… Use OData!

What is the Open Data Protocol (OData)?

OData Logo

OData, or Open Data Protocol, is a set of ISO-approved standards for building and consuming RESTful APIs. But what does that mean in practice?

It means that OData is not really an implementation that you simply use, but rather documentation specifying how you must implement it yourself. You can read a lot more about OData here.

Why would I want to use OData?

There are a number of benefits you gain from implementing the OData standards, from an easier learning path for your API’s customers/consumers to the fact that OData is easily readable by machines. In this post I want to talk about the flexibility that implementing OData gives to your API through the OData URL Conventions.

Using URL Conventions you can expose a much cleaner and more generic API and let the consumers specify their needs through the call.

OData URL Conventions

The OData URL Conventions are a set of commands that you can pass to the API through the HTTP call’s query string.

An OData URL is typically composed of three parts: the service root URL, the resource path and the query options.

OData Url Format

  • Service Root URL is the root address of the API.
  • Resource Path identifies exactly which resource you are trying to reach.
  • Query Options is how you tell the API the shape in which you need that data to be delivered.

How does that sound? Simple enough, right? But also, with the proper options, extremely powerful. Let me give you a list of the possible options within the Query Options block of the call:

  • $select: Allows you to define a subset of properties to return from that Resource.
  • $expand: Allows you to include data from a related resource to your query results.
  • $orderby: Not surprisingly, allows you to define the ordering of the returned dataset.
  • $top: Allows you to select the top X results of the query.
  • $skip: Allows you to skip X results of the query.
  • $count: Allows you to get a count of items that would result from that query.
  • $search: Allows for a free-text search on that particular resource.
  • $format: Allows you to define the format for the returned data in some query types.
  • $filter: Allows you to define a filter for your dataset.

As you can see, many of the commands are pretty similar to what you have in most of the common query languages.

I will go into a bit more detail on each of those options in the code sample. Before that, here is a quick illustration of how a consumer might combine a few of them.
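
For example, a consumer could combine $select, $orderby and $top in a single call. This is just a minimal, illustrative C# snippet; the /odata/Book route and the localhost address are simply the ones used later in this post:

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

public static class ODataQuerySample
{
    // "Give me only the Name of the first two books, ordered alphabetically."
    public static async Task<string> GetTopBookNamesAsync()
    {
        using (var client = new HttpClient { BaseAddress = new Uri("http://localhost:5000") })
        {
            return await client.GetStringAsync("/odata/Book?$select=Name&$orderby=Name&$top=2");
        }
    }
}
```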

OData and ASP.NET

ASP.NET Core still doesn’t have a stable library implementing the OData protocol! But worry not: Microsoft has been working on it for some time, and right now we have a really promising beta version on NuGet. You can find it here.

ASP.NET on the full .NET Framework has a really good library for implementing OData, and it is quite stable by now. You can find it here.

Enough with the theory: how can we implement this query protocol in an ASP.NET Core application?


Implementing your API

Let’s start by creating a simple ASP.NET Core Web API application in Visual Studio and creating our models.
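
Something along these lines should do. Book and Author (with their Name properties and the Books navigation) are the entities used in the queries later in this post; the third entity, Publisher, is just an illustrative placeholder:

```csharp
using System.Collections.Generic;

public class Author
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<Book> Books { get; set; }
}

public class Book
{
    public int Id { get; set; }
    public string Name { get; set; }
    public int AuthorId { get; set; }
    public Author Author { get; set; }
}

// Hypothetical third entity; the post only says there are three of them.
public class Publisher
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<Book> Books { get; set; }
}
```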

Also, let’s create our DbContext…
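
A minimal sketch of the context, assuming the entities above (the LibraryContext name is illustrative):

```csharp
using Microsoft.EntityFrameworkCore;

public class LibraryContext : DbContext
{
    public LibraryContext(DbContextOptions<LibraryContext> options) : base(options) { }

    public DbSet<Book> Books { get; set; }
    public DbSet<Author> Authors { get; set; }
    public DbSet<Publisher> Publishers { get; set; }
}
```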

…and configure our Services.
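
In the standard ASP.NET Core 2.0 Startup class that could look roughly like this; the SQL Server provider and the connection string name are assumptions, and any EF Core provider works:

```csharp
// Inside the default ASP.NET Core 2.0 Startup class (which exposes a Configuration property).
public void ConfigureServices(IServiceCollection services)
{
    // Register the EF Core context; provider and connection string name are illustrative.
    services.AddDbContext<LibraryContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

    services.AddMvc();
}
```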

Good! Our plumbing is set; now we are going to seed some initial data into our database.
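
A seeder could be as simple as the sketch below; the sample data is made up, chosen only to match the queries used later in this post:

```csharp
using System.Linq;

public static class DbSeeder
{
    public static void Seed(LibraryContext context)
    {
        // Only seed an empty database.
        if (context.Books.Any()) return;

        var rowling = new Author { Name = "J.K. Rowling" };
        context.Authors.Add(rowling);
        context.Books.Add(new Book { Name = "Harry Potter and the Philosopher's Stone", Author = rowling });
        context.SaveChanges();
    }
}
```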

And now we call our seeder on app startup.
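
Calling it from Program.Main of the ASP.NET Core 2.0 template might look like this:

```csharp
// Requires: using Microsoft.Extensions.DependencyInjection;
public static void Main(string[] args)
{
    var host = BuildWebHost(args);

    // Resolve the context from a scope and seed before the host starts listening.
    using (var scope = host.Services.CreateScope())
    {
        var context = scope.ServiceProvider.GetRequiredService<LibraryContext>();
        DbSeeder.Seed(context);
    }

    host.Run();
}
```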

We must not forget to add our migration.

And last, but not least, let’s implement the simplest API possible on our 3 entities.
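
Here is a sketch of one of them in its plain, pre-OData form; note the explicit Include, which OData will make unnecessary later:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.EntityFrameworkCore;

[Route("api/[controller]")]
public class BooksController : Controller
{
    private readonly LibraryContext _context;

    public BooksController(LibraryContext context)
    {
        _context = context;
    }

    // GET api/books: returns every book together with its related author.
    [HttpGet]
    public async Task<IActionResult> Get()
    {
        var books = await _context.Books.Include(b => b.Author).ToListAsync();
        return Ok(books);
    }
}
```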

Done. Now we can test it using Postman:

 


Wow! It certainly looks like a nice API, doesn’t it? What is the problem with it? Why would I ever want to add OData to it?

Well, there are two fundamental problems with this approach to our API: the payload size and the way the data is queried from the database.

Payload Size

The payload format is completely defined on the API/server side of the application, and the client cannot define which data it really needs to receive.

This can be made more flexible by adding complexity to the code (more parameters? more calls?), but that is not what we want, right?

In the most common scenarios the client will simply have to ignore a lot of data that it doesn’t care about.

Look at the result of our query for books below and tell me: what should we do if I only want the name of the first book on the list?

We have no option here other than to accept all this data and filter what we need on the client side.

Querying the Data

For much the same reason, all the queries to the database have to be done in a very rigid way, never allowing for smaller queries, even when they would be possible.

In the same request as above, we just asked for a list of books; let’s have a look at what was sent to the database:

All this huge query just to get the name of one book. That doesn’t sound good, right?

Let’s make it better with OData 🙂

Changing our API to OData

The good news is that we can use much of the same structure for our OData API; we just need to make a few configuration changes. Let’s start by installing the package.

As you can see, the OData package for .NET Core is still in beta, as opposed to the .NET Framework version of the package, which has been stable for a long time already. I have high hopes that this package will be out of beta in no time!

Let’s configure our entities to understand the OData commands.
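
A minimal sketch of such a configuration class, assuming the beta Microsoft.AspNetCore.OData package (whose types live in the Microsoft.AspNet.OData.* namespaces); the class name is illustrative:

```csharp
using Microsoft.AspNet.OData.Builder;
using Microsoft.OData.Edm;

public static class ODataModelConfig
{
    public static IEdmModel GetEdmModel()
    {
        var builder = new ODataConventionModelBuilder();

        // Each EntitySet call exposes an entity through the OData route.
        // Singular names here, which is why the controllers are renamed below.
        builder.EntitySet<Book>("Book");
        builder.EntitySet<Author>("Author");
        builder.EntitySet<Publisher>("Publisher"); // hypothetical third entity

        return builder.GetEdmModel();
    }
}
```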

Pay close attention to the comments on each call in this class, as those calls are paramount for the full use of the URL Conventions. And now let’s wire our OData configuration into the rest of the API.
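
Roughly, that wiring means registering the OData services and mapping an OData route in Startup, something like the sketch below with the beta package (the MaxTop limit is just an illustrative value):

```csharp
public void ConfigureServices(IServiceCollection services)
{
    services.AddDbContext<LibraryContext>(options =>
        options.UseSqlServer(Configuration.GetConnectionString("DefaultConnection")));

    services.AddOData();   // from Microsoft.AspNet.OData.Extensions
    services.AddMvc();
}

public void Configure(IApplicationBuilder app)
{
    app.UseMvc(routeBuilder =>
    {
        // Opt in to the URL Conventions we want to allow on this route.
        routeBuilder.Select().Expand().Filter().OrderBy().MaxTop(100).Count();

        // http://localhost:5000/odata/... will be served by this route.
        routeBuilder.MapODataServiceRoute("odata", "odata", ODataModelConfig.GetEdmModel());
    });
}
```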

All good! Finally, we must adjust our three controllers to have them accept OData URLs. Things you should notice were changed on the controllers (a sketch of the result follows this list):

  • All the controllers were renamed to the singular form. That is only necessary because of our configuration in the ModelBuilder; they can be configured to be plural.
  • The return types were all changed to IQueryable<T>
  • The .Include() calls were removed, as they are no longer necessary. The OData package will take care of this for you.
  • We are no longer inheriting from Controller but from ODataController
  • We have a new decorator for the API calls: [EnableQuery]
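
Putting those points together, one of the converted controllers might look like the following sketch (again assuming the beta package):

```csharp
using System.Linq;
using Microsoft.AspNet.OData;

public class BookController : ODataController   // singular, matching the EntitySet name
{
    private readonly LibraryContext _context;

    public BookController(LibraryContext context)
    {
        _context = context;
    }

    // [EnableQuery] lets OData apply $select, $filter, $expand, $orderby, $top, etc.
    // Returning IQueryable means those options are translated into the database query itself.
    [EnableQuery]
    public IQueryable<Book> Get() => _context.Books;
}
```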

And that is it! Our API is ready to be used with OData URL Conventions. Let’s try it?

New API and Results

You can play with the new API format on Postman:

 

The original call to get books would look like this:

http://localhost:5000/api/books

The new call will look like this:

http://localhost:5000/odata/Book

First of all, let’s try to get a list of books and look at the results:

MUCH cleaner, right? Let’s make it even smaller, as we just want the name of the first book on the list:

http://localhost:5000/odata/Book?$top=1

And let’s have a look at the database query for this call:

Yes! A much more efficient call! But wait… We just need the NAME of the book, why don’t we make it more specific?

http://localhost:5000/odata/Book?$top=1&$select=Name

And that is an awesomely small payload! And the query is also more efficient.

What if you want details about a specific author and all of their related books?

http://localhost:5000/odata/Author?$expand=Books&$filter=Name eq 'J.K. Rowling'

Amazing, isn’t it? That can really increase the quality of our APIs.

As one last piece of information, let’s not forget that OData is designed to be readable by machines! So we get a couple of out-of-the-box URLs with documentation for our API:

http://localhost:5000/odata/$metadata

Cool, isn’t it?

What’s Next?

Well, now that you know how simple it is to implement the OData protocol in .NET, I recommend that you spend some time getting familiar with the protocol itself; you can find all the guidance you need here.

Also, if you intend to use the protocol with .NET Core, I suggest that you keep a close eye on the NuGet package page and also on the feature requests on GitHub.

Source Code

You can find the whole source code for this solution on my GitHub.

And that is all for today folks 🙂

I hope you enjoyed it, and don’t hesitate to use the comments section if you were left with any questions.


.NET Core 2.1 is coming! (and I will be back)

Hallo Mensen 🙂
I know I’ve been away from my blog for a long time and I won’t try to make excuses for it, but I want to make it clear that I intend to start writing again sometime this quarter!

Today I just want to share with you two new videos from Channel 9 with some cool demos of the new features in .NET Core 2.1. In particular, I would advise you to pay close attention to the improvements to HttpClient and the Entity Framework support for CosmosDB. Enjoy!

What is new in .NET Core 2.1?

The Demos!!

One last thing to mention: pay close attention to the benchmarks of the build process for .NET Core 2.1; it is amazing!

Incremental Build Improvements for .NET Core 2.1 SDK

Really excited for the future of .NET Core 😀

.NET Core 2.1 should have its first previews released later this February, and the RTM version is planned for this summer!

Source: .NET Core 2.1 Roadmap | .NET Blog

 

Announcing .NET Core 2.0 | .NET Blog

This post is a reblog from the Official .NET Blog on MSDN.


.NET Core 2.0 is available today as a final release. You can start developing with it at the command line, in your favorite text editor, in Visual Studio 2017 15.3, Visual Studio Code or Visual Studio for Mac. It is ready for production workloads, on your own hardware or your favorite cloud, like Microsoft Azure.

We are also releasing ASP.NET Core 2.0 and Entity Framework Core 2.0. Read the ASP.NET Core 2.0 and the Entity Framework Core 2.0 announcements for details. You can also watch the launch video on Channel 9 to see many of the new features in action.

The .NET Standard 2.0 spec is complete, finalized at the same time as .NET Core 2.0. .NET Standard is a key effort to improve code sharing and to make the APIs available in each .NET implementation more consistent. .NET Standard 2.0 more than doubles that set of APIs that you have available for your projects.

.NET Core 2.0 has been deployed to Azure Web Apps. It is available today in a small number of regions and will expand globally quickly.

.NET Core 2.0 includes major improvements that make .NET Core easier to use and much more capable as a platform. The following improvements are the biggest ones and others are described in the body of this post. Please share feedback and any issues you encounter at dotnet/core #812.

Runtime

  • Performance improvements
  • .NET Core 2.0 implements .NET Standard 2.0
  • Much easier to target Linux as a single operating system
  • Linux ARM32 support (in preview)
  • Globalization Invariant Mode

SDK

  • dotnet restore is implicit for commands that require it
  • Reference .NET Framework libraries from .NET Standard
  • .NET Standard NuGet packages no longer have required dependencies

Visual Studio

  • Live Unit Testing supports .NET Core
  • Code navigation improvements
  • C# Azure Functions support in the box
  • CI/CD support for containers

For Visual Studio users: You need to update to the latest versions of Visual Studio to use .NET Core 2.0. You will need to install the .NET Core 2.0 SDK separately for this update.

Thanks!

On behalf of the entire team, I want to express our gratitude for all the direct contributions that we received for .NET Core 2.0. Thanks! Some of the most prolific contributors for .NET Core 2.0 are from companies investing in .NET Core, other than Microsoft. Thanks to Samsung and Qualcomm for your contributions to .NET Core.

The .NET Core team shipped two .NET Core 2.0 previews (preview 1 and preview 2) leading up to today’s release. Thanks to everyone who tried out those releases and gave us feedback.

Using .NET Core 2.0

You can get started with .NET Core 2.0 in just a few minutes, on Windows, MacOS or Linux.

You first need to install the .NET Core SDK 2.0.

You can create .NET Core 2.0 apps on the command line or in Visual Studio.

Creating new projects is easy. There are templates you can use in Visual Studio 2017. You can also create a new application at the command line with dotnet new, as you can see in the following example.


You can also upgrade an existing application to .NET Core 2.0. In Visual Studio, you can change the target framework of an application to .NET Core 2.0.

If you are working with Visual Studio Code or another text editor, you will need to update the target framework to netcoreapp2.0.

It is not as critical to update libraries to .NET Standard 2.0. In general, libraries should target .NET Standard unless they require APIs only in .NET Core. If you do want to update libraries, you can do it the same way, either in Visual Studio or directly in the project file, as you can see with the following project file segment that targets .NET Standard 2.0.

You can read more in-depth instructions in the Migrating from ASP.NET Core 1.x to ASP.NET Core 2.0 document.

Relationship to .NET Core 1.0 and 1.1 Apps

You can install .NET Core 2.0 on machines with .NET Core 1.0 and 1.1. Your 1.0 and 1.1 applications will continue to use the 1.0 and 1.1 runtimes, respectively. They will not roll forward to the 2.0 runtime unless you explicitly update your apps to do so.

By default, the latest SDK is always used. After installing the .NET Core 2.0 SDK, you will use it for all projects, including 1.0 and 1.1 projects. As stated above, 1.0 and 1.1 projects will still use the 1.0 and 1.1 runtimes, respectively.

You can configure a directory (all the way up to a whole drive) to use a specific SDK by creating a global.json file that specifies a specific .NET Core SDK version. All dotnet uses “under” that file will use that version of the SDK. If you do that, make sure you have that version installed.

.NET Core Runtime Improvements

The .NET Core 2.0 Runtime has the following improvements.

Performance Improvements

There are many performance improvements in .NET Core 2.0. The team published a few posts describing the improvements to the .NET Core Runtime in detail.

.NET Core 2.0 Implements .NET Standard 2.0

The .NET Standard 2.0 spec has been finalized at the same time as .NET Core 2.0.

We have more than doubled the set of available APIs in .NET Standard from 13k in .NET Standard 1.6 to 32k in .NET Standard 2.0. Most of the added APIs are .NET Framework APIs. These additions make it much easier to port existing code to .NET Standard, and, by extension, to any .NET implementation of .NET Standard, such as .NET Core 2.0 and the upcoming version of Universal Windows Platform (UWP).

.NET Core 2.0 implements the .NET Standard 2.0 spec: all 32k APIs that the spec defines.

You can see a diff between .NET Core 2.0 and .NET Standard 2.0 to understand the set of APIs that .NET Core 2.0 provides beyond the set required by the .NET Standard 2.0 spec.

Much easier to target Linux as a single operating system

.NET Core 2.0 treats Linux as a single operating system. There is now a single Linux build (per chip architecture) that works on all Linux distros that we’ve tested. Our support so far is specific to glibc-based distros and more specifically Debian- and Red Hat-based Linux distros.

There are other Linux distros that we would like to support, like those that use musl C Standard library, such as Alpine. Alpine will be supported in a later release.

Please tell us if the .NET Core 2.0 Linux build doesn’t work well on your favorite Linux distro.

Similar improvements have been made for Windows and macOS. You can now publish for the following “runtimes”.

  • linux-x64, linux-arm
  • win-x64, win-x86
  • osx-x64

Linux ARM32 is now supported, in Preview

The .NET Core team is now producing Linux ARM32 builds for .NET Core 2.0+. These builds are great for using on Raspberry Pi. These builds are not yet supported by Microsoft and have preview status.

The team is producing Runtime and not SDK builds for .NET Core. As a result, you need to build your applications on another operating system and then copy to a Raspberry Pi (or similar device) to run.

There are two good sources of .NET Core ARM32 samples that you can use to get started:

Globalization Invariant Mode

.NET Core 2.0 includes a new opt-in globalization mode that provides basic globalization-related functionality that is uniform across operating systems and languages. The benefit of this new mode is its uniformity, distribution size, and the absence of any globalization dependencies.

See .NET Core Globalization Invariant Mode to learn more about this feature, and decide whether the new mode is a good choice for your app or if it breaks its functionality.

.NET Core SDK Improvements

The .NET Core SDK 2.0 has the following improvements.

dotnet restore is implicit for commands that require it

The dotnet restore command has been a required set of keystrokes with .NET Core to date. The command installs required project dependencies and performs some other tasks. It’s easy to forget to type it, and the error messages that tell you that you need to type it are not always helpful. It is now implicitly called on your behalf for commands like run, build and publish.

The following example workflow demonstrates the absence of a required dotnet restore command:

Reference .NET Framework libraries from .NET Standard

You can now reference .NET Framework libraries from .NET Standard libraries using Visual Studio 2017 15.3. This feature helps you migrate .NET Framework code to .NET Standard or .NET Core over time (start with binaries and then move to source). It is also useful in the case that the source code for a .NET Framework library is no longer accessible or is lost, enabling it to still be used in new scenarios.

We expect that this feature will be used most commonly from .NET Standard libraries. It also works for .NET Core apps and libraries. They can depend on .NET Framework libraries, too.

The supported scenario is referencing a .NET Framework library that happens to only use types within the .NET Standard API set. Also, it is only supported for libraries that target .NET Framework 4.6.1 or earlier (even .NET Framework 1.0 is fine). If the .NET Framework library you reference relies on WPF, the library will not work (or at least not in all cases). You can use libraries that depend on additional APIs, but not for the code paths you use. In that case, you will need to invest significantly in testing.

You can see this feature in use in the following images.

The call stack for this app makes the dependency from .NET Core to .NET Standard to .NET Framework more obvious.

.NET Standard NuGet Packages no longer have required dependencies

.NET Standard NuGet packages no longer have any required dependencies if they target .NET Standard 2.0 or later. The .NET Standard dependency is now provided by the .NET Core SDK. It isn’t necessary as a NuGet artifact.

The following is an example nuspec (recipe for a NuGet package) targeting .NET Standard 2.0.

The following is an example nuspec (recipe for a NuGet package) targeting .NET Standard 1.4.

Visual Studio 2017 version 15.3 updates

Side-by-Side SDKs

Visual Studio now has the ability to recognize the install of an updated .NET Core SDK and light up corresponding tooling within Visual Studio. With 15.3, Visual Studio now provides side-by-side support for .NET Core SDKs and defaults to utilizing the highest version installed in the machine when creating new projects while giving you the flexibility to specify and use older versions if needed, via the use of global.json file. Thus, a single version of Visual Studio can now build projects that target different versions of .NET Core.

Support for Visual Basic

In addition to supporting C# and F#, 15.3 now also supports using Visual Basic to develop .NET Core apps. Our aim with Visual Basic this release was to enable .NET Standard 2.0 class libraries. This means Visual Basic only offers templates for class libraries and console apps at this time, while C# and F# also include templates for ASP.NET Core 2.0 apps. Keep an eye on this blog for updates.

Live Unit Testing Support

Live Unit Testing (LUT) is a new feature we introduced in Visual Studio 2017 Enterprise edition, and with 15.3 it now supports .NET Core. Users who are passionate about Test Driven Development (TDD) will certainly love this new addition. Starting LUT is as simple as turning it ON from the menu bar: Test->Live Unit Testing->Start.

When you enable LUT, you will get unit test coverage and pass/fail feedback live in the code editor as you type. Notice the green ticks and red x’s shown in the code editor in the image below.

 

IDE Productivity enhancements

Visual Studio 2017 15.3 has several productivity enhancements to help you write better code faster. We now support .NET naming conventions and formatting rules in EditorConfig allowing your team to enforce and configure almost any coding convention for your codebase.

With regards to navigation improvements, we’ve added support for camelCase matching in GoToAll (Ctrl+T), so that you can navigate to any file/type/member/symbol declaration just by typing cases (e.g., “bh” for “BusHelpers.cs”). You’ll also notice suggested variable names (Fig.2) as you are typing (which will adhere to any code style configured in your team’s EditorConfig).

We’ve added a handful of new refactorings including:

  • Resolve merge conflict
  • Add parameter (from callsite)
  • Generate overrides
  • Add named argument
  • Add null-check for parameters
  • Insert digit-separators into literals
  • Change base for numeric literals (e.g., hex to binary)
  • Convert if-to-switch
  • Remove unused variable

Project System simplifications

We further simplified the .csproj project file by removing some unnecessary elements that were confusing to users and wherever possible we now derive them implicitly. Simplification trickles down to Solution Explorer view as well. Nodes in Solution Explorer are now neatly organized into categories within the Dependencies node, like NuGet, project-to-project references, SDK, etc.

Another enhancement made to the .NET Core project system is that it is now more efficient when it comes to builds. If nothing changed and the project appears to be up to date since the last build, then it won’t waste build cycles.

Docker

Several important improvements were made to .NET Core support for Docker during the 2.0 project.

Support and Lifecycle

.NET Core 2.0 is a new release, supported by Microsoft. You can start using it immediately for development and production.

Microsoft has two support levels: Long Term Support (LTS) and Current release. LTS releases have three years of support and Current releases are shorter, typically around a year, but potentially shorter. .NET Core 1.0 and 1.1 are LTS releases. You can read more about these support levels in the .NET Support and Versioning post. In that post, “Current” releases are referred to as “Fast Track Support”.

.NET Core 2.0 is a Current release. We are waiting to get your feedback on quality and reliability before switching to LTS support. In general, we want to make sure that LTS releases are at the stage where we only need to provide security fixes for them. Once you deploy an app with an LTS release, you shouldn’t have to update it much, at least not due to platform updates.

.NET Core 1.1

.NET Core 1.1 has transitioned to LTS Support, adopting the same LTS timeframe as .NET Core 1.0.

.NET Core 1.0 and 1.1 will both go out of support on June 27, 2019 or 12 months after the .NET Core 2.0 LTS release, whichever is shorter.

We recommend that all 1.0 customers move to 1.1, if not to 2.0. .NET Core 1.1 has important usability fixes in it that make for a significantly better development experience than 1.0.

Red Hat

Red Hat also provides full support for .NET Core on RHEL and will be providing a distribution of .NET Core 2.0 very soon. We’re excited to see our partners like Red Hat follow our release so quickly. For more information head to RedHatLoves.NET.

Closing

We’re very excited about this significant milestone for .NET Core. Not only is the 2.0 release our fastest version of .NET ever, but .NET Standard 2.0 also delivers on the promise of .NET everywhere. In conjunction with the Visual Studio family, .NET Core provides the most productive development platform for developers using macOS or Linux as well as Windows. We encourage you to download the latest .NET Core SDK from https://dot.net/core and start working with this new version of .NET Core.

Please share feedback and any issues you encounter at dotnet/core #812.

Watch the launch video for .NET Core 2.0 to see this new release in action.

Original Post