
What's new in .NET 6 Preview 4 #6098

Closed
leecow opened this issue Mar 29, 2021 · 14 comments

@leecow
Member

leecow commented Mar 29, 2021

What's new in .NET 6 Preview 4

This issue is for teams to highlight work for the community that will ship in .NET 6 Preview 4.

To add content, use a new conversation entry. The entry should include the team name and feature title as the first line as shown in the template below.

## Team Name: Feature title

[link to the tracking issue or epic item for the work]

Tell the story of the feature and anything the community should pay particular attention 
to be successful using the feature.

Preview 1: #5853
Preview 2: #5889
Preview 3: #5890
Preview 4: #6098
Preview 5: #6099

@eerhardt
Member

eerhardt commented Apr 16, 2021

.NET Libraries: Extensions Enhancements (WIP)

Hosting

In previous versions, when a BackgroundService threw an exception, the exception was lost and the service appeared to hang. .NET 6 has fixed this by logging the exception and stopping the Host when an unhandled exception is thrown. This behavior is consistent with the way other app models behave when unhandled exceptions are encountered.

If you prefer the previous behavior, in which an unhandled exception in a BackgroundService does not stop the Host, you can set HostOptions.BackgroundServiceExceptionBehavior to BackgroundServiceExceptionBehavior.Ignore.
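
For example, opting back into the .NET 5 behavior can be sketched like this (MyWorker is a hypothetical BackgroundService used only to make the sample self-contained):

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var host = Host.CreateDefaultBuilder(args)
    .ConfigureServices(services =>
    {
        services.AddHostedService<MyWorker>();

        // Opt back into the .NET 5 behavior: an unhandled exception in a
        // BackgroundService is ignored instead of stopping the Host.
        services.Configure<HostOptions>(options =>
            options.BackgroundServiceExceptionBehavior =
                BackgroundServiceExceptionBehavior.Ignore);
    })
    .Build();

await host.RunAsync();

// Hypothetical worker, shown only for illustration.
class MyWorker : BackgroundService
{
    protected override Task ExecuteAsync(CancellationToken stoppingToken)
        => Task.CompletedTask;
}
```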

Options

Configuration

Logging

Caching


@agocke
Member

agocke commented Apr 20, 2021

Single-file Publishing Improvements

Static Analysis

Analyzers for single-file publishing were added in .NET 5 to warn about Assembly.Location and a few other APIs which behave differently in single-file bundles.

For .NET 6 Preview 4 we've improved the analysis to allow for custom warnings. If you have an API that doesn't work in single-file publishing, you can now mark it with the [RequiresAssemblyFiles] attribute, and a warning will appear if the analyzer is enabled. Adding the attribute also silences all single-file warnings inside the method, so you can use it to propagate warnings upward to your public API.
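
For example, an API that relies on Assembly.Location could be annotated like this (AssemblyPathHelper is a hypothetical helper, shown only to illustrate the attribute):

```csharp
using System.Diagnostics.CodeAnalysis;
using System.IO;
using System.Reflection;

public static class AssemblyPathHelper
{
    // Hypothetical helper: Assembly.Location returns an empty string inside a
    // single-file bundle, so callers of this method get a single-file warning.
    [RequiresAssemblyFiles]
    public static string? GetContentDirectory(Assembly assembly)
        => Path.GetDirectoryName(assembly.Location);
}
```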

The analyzer is automatically enabled for exe projects when PublishSingleFile is set to true, but you can also enable it for any project by setting EnableSingleFileAnalysis to true. This could be helpful if you want to embed a library in a single-file bundle.
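
For a library project, the opt-in is a single property in the csproj:

```xml
<PropertyGroup>
  <!-- Enable single-file analysis even though this project is not itself published single-file -->
  <EnableSingleFileAnalysis>true</EnableSingleFileAnalysis>
</PropertyGroup>
```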

Compression

Single-file bundles now support optional compression, which can be enabled by setting the property EnableCompressionInSingleFile to true. At runtime, the files are decompressed into memory as necessary. This can provide huge space savings for some scenarios. For instance, NuGet Package Explorer:

Without compression: 172 MB

With compression: 71.6 MB

However, compression can also significantly increase the startup time of the application, especially on Unix platforms (because they have a no-copy fast-start path that can't be used with compression). You should test your app after enabling compression to see if the additional startup time is acceptable.
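
Enabling compression is a one-property change in the project file:

```xml
<PropertyGroup>
  <PublishSingleFile>true</PublishSingleFile>
  <EnableCompressionInSingleFile>true</EnableCompressionInSingleFile>
</PropertyGroup>
```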

@eiriktsarpalis
Member

System.Text.Json: IAsyncEnumerable Serialization Support dotnet/runtime#1570

System.Text.Json now supports serializing IAsyncEnumerable<T> values as JSON arrays:

using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;

static async IAsyncEnumerable<int> PrintNumbers(int n)
{
    for (int i = 0; i < n; i++) yield return i;
}

Stream stream = Console.OpenStandardOutput();
var data = new { Data = PrintNumbers(3) };
await JsonSerializer.SerializeAsync(stream, data); // prints {"Data":[0,1,2]}

Note that IAsyncEnumerable values are only supported using the asynchronous serialization methods. Attempting to serialize using the sync methods will result in a NotSupportedException being thrown.
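
A quick sketch of that behavior (the property name Data is arbitrary):

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Threading.Tasks;

static async IAsyncEnumerable<int> GetNumbersAsync()
{
    await Task.Yield();
    yield return 1;
}

// The synchronous entry points reject IAsyncEnumerable<T> values:
try
{
    JsonSerializer.Serialize(new { Data = GetNumbersAsync() });
}
catch (NotSupportedException)
{
    Console.WriteLine("Synchronous serialization is not supported.");
}
```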

Deserialization Support

Deserializing IAsyncEnumerable<T> values is nominally supported:

using System;
using System.Collections.Generic;
using System.IO;
using System.Text;
using System.Text.Json;

var stream = new MemoryStream(Encoding.UTF8.GetBytes(@"{""Data"":[0,1,2,3,4]}"));
var result = await JsonSerializer.DeserializeAsync<MyPoco>(stream);
await foreach (int item in result.Data)
{
    Console.WriteLine(item);
}

public class MyPoco
{
    public IAsyncEnumerable<int> Data { get; set; }
}

Note, however, that the deserializer buffers all IAsyncEnumerable contents in memory before returning the deserialized POCO. This is because the deserializer needs to consume the entire JSON value before returning a result.

The DeserializeAsyncEnumerable method

If you are looking to deserialize IAsyncEnumerable<T> values in a streaming fashion, we have added a dedicated JsonSerializer.DeserializeAsyncEnumerable method for this purpose:

using System;
using System.IO;
using System.Text;
using System.Text.Json;

var stream = new MemoryStream(Encoding.UTF8.GetBytes("[0,1,2,3,4]"));
await foreach (int item in JsonSerializer.DeserializeAsyncEnumerable<int>(stream))
{
    Console.WriteLine(item);
}

This will deserialize elements on-demand and can be useful when consuming particularly large data streams. Do note that the method only supports reading from root-level JSON arrays.

@eiriktsarpalis
Member

eiriktsarpalis commented Apr 22, 2021

.NET Libraries: new LINQ APIs dotnet/runtime#47231

This preview includes a number of new LINQ APIs that have been requested and contributed by the community.

Adding Enumerable support for Index and Range parameters

The Enumerable.ElementAt method now accepts indices from the end of the enumerable:

Enumerable.Range(1, 10).ElementAt(^2); // returns 9

We have also added an Enumerable.Take overload that accepts Range parameters. This greatly simplifies taking slices of enumerable sequences:

  • source.Take(..3) instead of source.Take(3)
  • source.Take(3..) instead of source.Skip(3)
  • source.Take(2..7) instead of source.Take(7).Skip(2)
  • source.Take(^3..) instead of source.TakeLast(3)
  • source.Take(..^3) instead of source.SkipLast(3)
  • source.Take(^7..^3) instead of source.TakeLast(7).SkipLast(3).

Credit to @Dixin for contributing the implementation.

TryGetNonEnumeratedCount

The TryGetNonEnumeratedCount method attempts to obtain the count of the source enumerable without forcing an enumeration. It checks for sources implementing ICollection/ICollection<T> and takes advantage of some of LINQ's internal optimizations. This can be useful in scenarios where we want to preallocate buffers ahead of enumeration:

List<T> buffer = source.TryGetNonEnumeratedCount(out int count) ? new List<T>(capacity: count) : new List<T>();
foreach (T item in source)
{
    buffer.Add(item);
}

DistinctBy/UnionBy/IntersectBy/ExceptBy

We have added variants to the set operations that allow specifying equality using key selector functions:

Enumerable.Range(1, 20).DistinctBy(x => x % 3); // {1, 2, 3}

var first = new (string Name, int Age)[] { ("Tom", 20), ("Dick", 30), ("Harry", 40) };
var second = new (string Name, int Age)[] { ("Peter", 30), ("John", 30), ("Toby", 33) };
first.UnionBy(second, person => person.Age); // { ("Tom", 20), ("Dick", 30), ("Harry", 40), ("Toby", 33) }

MaxBy/MinBy

The new MaxBy and MinBy methods allow finding maximal or minimal elements using a key selector:

var people = new (string Name, int Age)[] { ("Tom", 20), ("Dick", 30), ("Harry", 40) };
people.MaxBy(person => person.Age); // ("Harry", 40)

Chunk

Chunk can be used to split a source enumerable into slices of a fixed size:

IEnumerable<int[]> chunks = Enumerable.Range(0, 10).Chunk(size: 3); // { {0,1,2}, {3,4,5}, {6,7,8}, {9} }

Credit to @inputfalken for contributing the implementation.

FirstOrDefault/LastOrDefault/SingleOrDefault overloads taking default parameters

The existing FirstOrDefault/LastOrDefault/SingleOrDefault methods return default(T) if the source enumerable is empty. We have now added overloads that let you specify the default value to be returned in that case:

Enumerable.Empty<int>().SingleOrDefault(-1); // returns -1

Credit to @Foxtrek64 for contributing the implementation.

Zip overload accepting three enumerables

As the title suggests, this overload can be used to zip together three enumerables:

var xs = Enumerable.Range(1, 10);
var ys = xs.Select(x => x.ToString());
var zs = xs.Select(x => x % 2 == 0);

foreach ((int x, string y, bool z) in Enumerable.Zip(xs, ys, zs))
{
    Console.WriteLine($"{x}, {y}, {z}");
}

Credit to @huoyaoyuan for contributing the implementation.

@agocke
Member

agocke commented Apr 24, 2021

IL Trimming

Warnings enabled by default

You could previously access trim warnings, which tell you about places where trimming may remove code that's used at runtime, by setting <SuppressTrimAnalysisWarnings> to false. We've now annotated large portions of NetCoreApp and are enabling warnings by default. You can find an intro on resolving trim warnings at https://github.com/mono/linker/blob/main/docs/fixing-warnings.md and more detailed docs at https://docs.microsoft.com/en-us/dotnet/core/deploying/prepare-libraries-for-trimming. Trim warnings bring predictability to the trimming process and put power in developers' hands. We hope the community can improve the trimming ecosystem by annotating more code to be trim safe.

Default TrimMode=link

In .NET 5, trimming tried to find and remove unreferenced assemblies by default. That approach is safer, but provides limited benefit. Now that trim warnings are on by default, developers can be confident in the results of trimming. The new default TrimMode in .NET 6 is link.

The link TrimMode can provide significant savings by trimming not just unused assemblies, but unused members. As an example, the dotnet Ready To Run compiler can be trimmed with only a few trim warnings, which only occur when using legacy native PDBs.
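
To stay on the .NET 5 behavior, the trim mode can be set explicitly in the project file (a minimal sketch):

```xml
<PropertyGroup>
  <PublishTrimmed>true</PublishTrimmed>
  <!-- link (the new default) also trims unused members; copyused restores the .NET 5 behavior -->
  <TrimMode>copyused</TrimMode>
</PropertyGroup>
```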

First, if we publish without trimming at all: 80 MB

If we publish with the legacy copyused trim mode: 55 MB.

However, if we use the link Trim Mode: 36 MB

We hope that the new trim mode aligns much better with the expectations for trimming: significant savings and predictable results.

Native AOT

As part of our Native AOT experiment we've implemented the same trimming warnings for AOT as well, which should improve the Native AOT compilation experience in much the same way.

@mangod9
Member

mangod9 commented Apr 26, 2021

PublishReadyToRun now uses crossgen2 by default

As we continue the journey of enabling crossgen2 to replace crossgen, crossgen2 is now used by default when publishing ReadyToRun images in Preview 4. It also supports generating composite images, though these need to be enabled explicitly. Here are some of the settings that can be toggled:

      <PublishReadyToRun>true</PublishReadyToRun>
      <PublishReadyToRunUseCrossgen2>true</PublishReadyToRunUseCrossgen2> <!-- switch to false to use crossgen as in 5.0 -->
      <PublishReadyToRunComposite>false</PublishReadyToRunComposite> <!-- switch to true to generate a composite R2R image -->

@JulieLeeMSFT
Member

CodeGen

Community contributions

@SingleAccretion:

Dynamic PGO dotnet/runtime#43618

JIT Loop Optimizations dotnet/runtime#43549

LSRA

Optimizations:

@adamsitnik
Member

.NET Libraries: Developers using FileStream find it to be high performance and robust

As noted by @benaadams and other community members in dotnet/runtime#40359, FileStream had a lot of room for improvement.

We recognized that and decided to rewrite FileStream in .NET 6, with the main focus on Windows (the Unix implementation, being 20 years younger, was already very fast).

The first step we took was implementing the Strategy Pattern, which allows .NET to choose the FileStream implementation at run time (dotnet/runtime#47128).
By doing that, we made it possible to:

  • switch to the .NET 5 compatibility mode (the old implementation)
{
    "configProperties": {
        "System.IO.UseNet5CompatFileStream": true
    }
}

Building on the solid new fundamentals, we ensured (dotnet/runtime#48813) that a FileStream created for async IO never performs any blocking work.

The next step we took was optimizing sys-call usage (dotnet/runtime#49975).

And last but not least, we greatly reduced managed allocations.

Enough talking. Let's measure the improvements using BenchmarkDotNet:

public class FileStreamPerf
{
    private int fileSize = 1_000_000; // 1 MB
    private Memory<byte> _buffer = new byte[8_000]; // 8 kB
    private string _filePath = "test.file";
    private FileStream _fileStream;

    [GlobalSetup(Target = nameof(ReadAsync))]
    public void SetupRead()
    {
        File.WriteAllBytes(_filePath, new byte[fileSize]);
        _fileStream = new FileStream(_filePath, FileMode.Open, FileAccess.Read, FileShare.Read, bufferSize: 1, useAsync: true);
    }

    [Benchmark]
    public async ValueTask ReadAsync()
    {
        _fileStream.Position = 0; // read from the beginning

        while (await _fileStream.ReadAsync(_buffer) > 0)
        {
        }
    }

    [GlobalSetup(Target = nameof(WriteAsync))]
    public void SetupWrite()
    {
        _fileStream = new FileStream(_filePath, FileMode.CreateNew, FileAccess.Write, FileShare.Read, bufferSize: 1, useAsync: true);
    }

    [Benchmark]
    public async ValueTask WriteAsync()
    {
        _fileStream.SetLength(0); // truncate the file

        for (int i = 0; i < fileSize / _buffer.Length; i++)
        {
            await _fileStream.WriteAsync(_buffer);
        }
    }

    [GlobalCleanup]
    public void Cleanup()
    {
        _fileStream.Dispose();
        File.Delete(_filePath);
    }
}
BenchmarkDotNet=v0.12.1.1533-nightly, OS=Windows 10.0.18363.1500 (1909/November2019Update/19H2)
Intel Xeon CPU E5-1650 v4 3.60GHz, 1 CPU, 12 logical and 6 physical cores
.NET SDK=6.0.100-preview.4.21219.1
  [Host]     : .NET 5.0.5 (5.0.521.16609), X64 RyuJIT
  Job-JAYWNT : .NET 5.0.5 (5.0.521.16609), X64 RyuJIT
  Job-ZRAEXA : .NET 6.0.0 (6.0.21.21801), X64 RyuJIT
| Method     | Runtime  | Mean      | Ratio | Allocated |
|------------|----------|-----------|-------|-----------|
| ReadAsync  | .NET 5.0 | 3.419 ms  | 1.00  | 39,504 B  |
| ReadAsync  | .NET 6.0 | 1.445 ms  | 0.42  | 192 B     |
| WriteAsync | .NET 5.0 | 12.181 ms | 1.00  | 39,192 B  |
| WriteAsync | .NET 6.0 | 2.193 ms  | 0.18  | 192 B     |

In this particular example (Windows 10, SSD drive, BitLocker enabled), reading a 1 MB file is now 2.5 times faster, while writing is 5.5 times faster.
Managed allocations dropped from 39 kilobytes to 192 bytes: a 99.5% improvement!

@mattjohnsonpint

mattjohnsonpint commented May 13, 2021

.NET Libraries: Enhanced Date, Time and Time Zone support (dotnet/runtime#45318)

(Blog post pending Preview 4 release)

New DateOnly and TimeOnly structs (dotnet/runtime#49036)

  • Complements existing date/time types (DateTime, DateTimeOffset, TimeSpan, TimeZoneInfo).
  • In System namespace, shipped in CoreLib, just like the others.
  • Each represents one half of a DateTime: either only the date part or only the time part.
  • DateOnly ideal for birthdays, anniversary days, business days, etc. - Cleanly aligns with SQL Server's date type.
  • TimeOnly ideal for recurring meetings, alarm clocks, weekly business hours, etc. - Cleanly aligns with SQL Server's time type.
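
A short sketch of how the two halves relate to DateTime:

```csharp
using System;
using System.Globalization;

// Each type carries one half of a DateTime.
var date = new DateOnly(2021, 5, 13);  // no time component
var time = new TimeOnly(17, 30);       // no date component

// Recombine the halves when a full DateTime is needed.
DateTime meeting = date.ToDateTime(time);
Console.WriteLine(meeting.ToString("yyyy-MM-dd HH:mm", CultureInfo.InvariantCulture)); // prints 2021-05-13 17:30

// Split an existing DateTime back into its halves.
DateOnly d = DateOnly.FromDateTime(meeting);
TimeOnly t = TimeOnly.FromDateTime(meeting);
Console.WriteLine($"{d.DayOfWeek}, {t.Hour}:{t.Minute:D2}");
```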

Perf improvements to DateTime.UtcNow (dotnet/runtime#50263)

  • Fixes 2.5x perf regression for getting the system time on Windows
  • Utilizes a 5-minute sliding cache of Windows leap second data instead of fetching with every call

Support for both Windows and IANA time zones on all platforms

Improved time zone display names (dotnet/runtime#48931)

  • Removes ambiguity from the display names in the list returned by TimeZoneInfo.GetSystemTimeZones
  • Leverages ICU / CLDR globalization data
  • Unix only for now; Windows still uses the registry data (though this may change in the future).

Miscellaneous

@steveharter
Member

steveharter commented May 13, 2021

System.Text.Json: Writable DOM Feature

For additional background see https://github.com/dotnet/designs/blob/main/accepted/2020/serializer/WriteableDomAndDynamic.md

This adds a writeable DOM feature to System.Text.Json. It is expected many System.Text.Json consumers will use this feature for various reasons:

  • As a lightweight alternative to the serializer for cases when compilation of POCO types is not possible or desired, or when a JSON schema is not fixed and must be inspected.
  • To efficiently modify a subset of a large tree. For example, it is possible to efficiently navigate to a subsection of a large JSON tree and read an array or deserialize a POCO from that subsection. LINQ can also be used with that.
  • A desire to use C# "dynamic" keyword for varying reasons including sharing of loosely-typed, script-based code.

The basic class structure:

namespace System.Text.Json.Node
{
    public abstract class JsonNode {...};
    public sealed class JsonObject : JsonNode, IDictionary<string, JsonNode?> {...}
    public sealed class JsonArray : JsonNode, IList<JsonNode?> {...};
    public abstract class JsonValue : JsonNode {...};
}

and programming model:

    // Parse a JSON object
    JsonNode jNode = JsonNode.Parse("{\"MyProperty\":42}");
    int value = (int)jNode["MyProperty"];
    Debug.Assert(value == 42);
    // or
    value = jNode["MyProperty"].GetValue<int>();
    Debug.Assert(value == 42);

    // Parse a JSON array
    jNode = JsonNode.Parse("[10,11,12]");
    value = (int)jNode[1];
    Debug.Assert(value == 11);
    // or
    value = jNode[1].GetValue<int>();
    Debug.Assert(value == 11);

    // Create a new JsonObject using object initializers and array params
    var jObject = new JsonObject
    {
        ["MyChildObject"] = new JsonObject
        {
            ["MyProperty"] = "Hello",
            ["MyArray"] = new JsonArray(10, 11, 12)
        }
    };

    // Obtain the JSON from the new JsonObject
    string json = jObject.ToJsonString();
    Console.WriteLine(json); // {"MyChildObject":{"MyProperty":"Hello","MyArray":[10,11,12]}}

    // Indexers for property names and array elements are supported and can be chained
    Debug.Assert(jObject["MyChildObject"]["MyArray"][1].GetValue<int>() == 11);

Noteworthy features:

  • Programming model
    • Intuitive class hierarchy based on reference types. This allows the base class JsonNode to have a minimal set of members by keeping the majority on the appropriate derived types.
    • Support for common interfaces:
      • IDictionary<string, JsonNode?> for JSON objects.
      • IList<JsonNode?> for JSON arrays.
    • Methods on the base class JsonNode that are specific to a derived type are kept to minimum and are there only to support a terse syntax by:
      • Supporting chained indexers for property names and collection elements to quickly navigate to a specific node.
      • Ability to obtain a value without casting to JsonValue.
      • Inline syntax to cast to a derived type via AsObject(), AsArray() and AsValue() which allows chaining of methods such as .Add().
    • Implicit and explicit C# conversions between JsonValue and supported primitive types.
    • Standard language null semantics instead of a "JsonNullType" singleton.
    • A single programming model to obtain a value from JsonValue whether the value is backed by a JsonElement after a Parse() or backed by an actual CLR type while in "edit" mode. e.g. jValue.GetValue<int>() works in either case.
    • Property names on JsonObject can be configured to be case-sensitive or case-insensitive by using JsonNodeOptions.
    • Support for the C# dynamic keyword.
  • Interop with JsonSerializer and JsonElement
    • JsonValue can be created with any CLR type, assuming it is supported by JsonSerializer. This makes generating JSON easy when calling services, etc., when these types are readily available, including:
      • Custom data types registered with JsonSerializerOptions.
      • C# Anonymous types.
      • All POCOs and collection types.
    • A POCO property or collection element can be declared as a JsonNode or one of its derived types.
    • Ability to deserialize a POCO property or collection element declared as System.Object as JsonNode via JsonSerializerOptions.UnknownTypeHandling. Previously, only JsonElement could be deserialized.
    • The existing JsonSerializer static methods can be invoked with a JsonNode or a derived type.
    • A JsonValue can be created with an internal value of a JsonElement.
  • The [JsonExtensionData] attribute (to round-trip JSON properties that don't map to a CLR type during deserialization) can be applied to a JsonObject property. Previously, only Dictionary<string, JsonElement> based properties were possible which made it difficult to modify extension data since JsonElement is read-only.
  • Debugging
    • JsonNode.ToString() returns formatted JSON for easy inspection. Note that ToJsonString() should be used to obtain round-trippable, terse JSON.
    • JsonNode.GetPath() can be used to determine where a given node is in a tree.
    • JsonNode-derived classes have custom debug data visualizers that display the JSON, path and property/item counts in a Visual Studio watch window in an easy-to-understand format.
  • Performance
    • After a JsonNode.Parse(), a single JsonElement is created which is very efficient since a single allocation contains the entire JSON buffer and materialization of child nodes or values is deferred until necessary.
    • The JsonDocument Dispose pattern can be leveraged (by creating a JsonNode-derived type with JsonDocument.RootElement) which supports pooled buffers to prevent the potentially large JSON alloc.
  • LINQ
    • IEnumerable<JsonNode>-based JsonObject and JsonNode.
    • Parent and Root properties help with querying against relationships.

@davidortinau
Contributor

davidortinau commented May 20, 2021

.NET MAUI

This release includes general progress on porting controls, layouts, and features from Xamarin.Forms. Progress is tracked in our GitHub status report.

Highlights:

  • Desktop windows now size content to fill, and remeasure on window resize.

New in this release are:

BlazorWebView

BlazorWebView enables you to host a Blazor web application right in your .NET MAUI application and take advantage of seamless native platform features and UI controls.

<BlazorWebView 
    HostPage="wwwroot/index.html"
    Services="{StaticResource Services}">
    <BlazorWebView.RootComponent>
        <RootComponent 
            Selector="#app"
            ComponentType="{x:Type local:Main}"
        />
    </BlazorWebView.RootComponent>
</BlazorWebView>

Splash Screen

SplashScreens

Add a static splash screen to all platforms by marking any image as a MauiSplashScreen build type, or add this in your csproj.

<MauiSplashScreen Include="Resources\appiconfg.svg" Color="#512BD4" />

Raw Assets

Add other media to your single project, such as documents, video, audio, etc., by setting the MauiAsset build type. For example, we can add an HTML file:

<MauiAsset Include="Resources\Raw\index.html" />

and then reference it by filename from any native control:

<WebView Source="index.html" />

New Templates


@josalem

josalem commented May 24, 2021

.NET Diagnostics: EventPipe for Mono and Improved EventPipe Performance [WIP]

dotnet/runtime#45518

EventPipe is .NET's cross-platform mechanism for egressing events, performance data, and counters. Starting with .NET 6, we've moved the implementation from C++ to C. With this change, Mono will be able to use EventPipe as well! This means that both CoreCLR and Mono will use the same eventing infrastructure, including the .NET Diagnostics CLI tools! The change also came with a small reduction in size for CoreCLR:

| lib           | after size | before size | diff   |
|---------------|------------|-------------|--------|
| libcoreclr.so | 7037856    | 7049408     | -11552 |

We've also made some changes that improve EventPipe throughput while under load. Over the first few previews, we've made a series of changes that result in throughput improvements as high as 2.06x what .NET 5 was capable of.

Data collected using the EventPipeStress framework in dotnet/diagnostics. The writer app writes events as fast as it can for 60 seconds. The number of successful and dropped events is recorded.

@VARUN46

VARUN46 commented Jun 13, 2021

(Quoting the "System.Text.Json: Writable DOM Feature" post above.)

Is there any way to understand how Remove() works for JsonNode? Providing a simple way to call jsonNode["MyCurrentNode"].Remove("NodeName") would help!

@ryancdotnet

ryancdotnet commented Jun 19, 2021

dotnet/sdk: Preserve destination folder option for File System publish (dotnet/sdk@a66f426)

When using dotnet or msbuild to publish a project or website with the File System option and the "Delete existing files" option enabled, the target destination folder is deleted and recreated after its existing contents are deleted. This can have unwanted side effects; one is that all file system permissions and ACLs on the target destination folder are lost. When the folder is recreated, it is assigned permissions and ACLs based on the user account that creates it and where it is created (i.e. the recreated folder inherits any upstream inheritable permissions, or permissions based on who created the folder in the case of a CIFS share path, for example). The "Delete existing files" option does not explicitly indicate that the target folder itself will be deleted and recreated, which can cause issues for developers who are unaware of this behavior, particularly when publishing directly to a web server or a production share path where the target folder has explicit permissions applied.

Now there is a new msbuild option when publishing via File System that will preserve the destination folder and yet still delete any existing files inside when the "Delete existing files" option is selected.

This new option is called "PreserveDestinationFolder". Simply set this value to "true" along with "DeleteExistingFiles" set to "true" when publishing, and the destination folder will not be deleted and recreated.

This option is backward compatible with existing publish profiles and publish commands that use "DeleteExistingFiles": if "PreserveDestinationFolder" is not supplied, or is anything other than "true", the previous behavior of deleting and recreating the target folder still occurs. Only when you supply "true" for both "DeleteExistingFiles" and "PreserveDestinationFolder" is the target folder preserved.

By providing an opt-in option to preserve the destination folder, this issue can be bypassed, and users can still take advantage of the cleanup to remove older files that may be outdated or defunct.

This option is currently not represented in the UI for Visual Studio's publish wizard, but it can be manually inserted into publish profiles by editing them or it can be passed in as parameter to dotnet or msbuild:

Publish profile example

<DeleteExistingFiles>True</DeleteExistingFiles>
<PreserveDestinationFolder>True</PreserveDestinationFolder>

Dotnet example

dotnet publish -p:DeleteExistingFiles=True -p:PreserveDestinationFolder=True

Msbuild example

msbuild /target:publish /p:DeleteExistingFiles=True /p:PreserveDestinationFolder=True
