
Consider Using Surrogates (POCOs) Rather Than DOM Elements #1

Closed
Mike-E-angelo opened this issue Dec 6, 2016 · 16 comments

Comments

@Mike-E-angelo

This is a widespread problem and needs thorough ecosystem-wide correction. :P

I see a lot of this:
Example.
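The linked example is not reproduced here, but the DOM-style traversal being criticized typically looks something like this sketch (assuming Json.NET's LINQ-to-JSON API; the file path and property names are illustrative, not taken from the linked code):

```csharp
// Sketch only: the kind of hand-written JToken traversal the POCO
// approach replaces. Assumes the Newtonsoft.Json package is referenced.
using System;
using System.IO;
using System.Linq;
using Newtonsoft.Json.Linq;

class DomStyle
{
    static void Main()
    {
        var root = JObject.Parse(File.ReadAllText(@"C:\path\to\file.txt"));

        // Manually walk the "dependencies" object and project each
        // property into an ad hoc shape.
        var references = ((JObject)root["dependencies"])
            .Properties()
            .Select(p => new { Name = p.Name, Version = (string)p.Value })
            .ToList();

        foreach (var reference in references)
            Console.WriteLine($"{reference.Name} {reference.Version}");
    }
}
```

Every new element added to the format means another hand-written traversal like this, which is the labor-intensive part the rest of this issue argues against.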

This is incredibly labor-intensive and error-prone. Please consider a surrogate (or rather, POCO 😄) design instead:

using System.Collections.ObjectModel;
using System.IO;
using Newtonsoft.Json;
using Xunit;

[Fact]
public void BetterSerializationTrustMe()
{
	var file = File.ReadAllText(@"C:\path\to\file.txt");
	var definition = JsonConvert.DeserializeObject<ProjectDefinition>(file);
	foreach (var reference in definition.References)
	{
		// do stuff.
	}
}

class ProjectDefinition
{
	[JsonProperty("dependencies")]
	public References References { get; set; } = new References();
}

[JsonArray("dependencies")]
class References : Collection<PackageReference> {}

class PackageReference
{
	[JsonProperty("name")]
	public string Name { get; set; }

	[JsonProperty("version")]
	public Version Version { get; set; }
}

This is example code only. I did not run it; it is meant to get the point across. You will notice there is no LINQ statement, as all of that is handled in JsonConvert.DeserializeObject for you.

This can be made even better by generalizing the surrogates and decoupling the JSON.NET-specific attributes (making use of the JsonSerializerSettings class).

This is why I am so hard up on POCOs: you do not have to dick around with a schema and dive into LINQ land to translate your data into the target constructs in your code. You use one line (JsonConvert.DeserializeObject) and boom! you're done. 😛

@aL3891
Owner

aL3891 commented Dec 7, 2016

The reason it's not like that now is just because I didn't want to enforce a particular object model across all the different ways to specify references, although for the JSON one specifically this is certainly a better option (and that POCO could live in the JSON project specifically, too). Right now it's more proof-of-concept-level code :)

For the XAML one I've actually planned to do this: specify a POCO that is then created from XAML. I'd probably read that as XML anyway, since XAML is not supported on Core yet, but still.

@Mike-E-angelo
Author

Not enforcing a model is what makes it so error-prone, and why MSBuild's (and .NET Configuration's, for that matter) DOM is so disliked. :) I think there is also a conceptual gap here which seems to be continually missed: "data" files should be seen as serialized objects by rule rather than by exception. That is, when you are using a data file, you should be describing a CLR object that will eventually be activated into memory, not newed up and translated XElement by XElement (or JToken by JToken). If you are going to go through all that work, why not simply serialize a POCO? It's 1 line of code vs. O(n) (if I am applying my newly acquired O-notation knowledge correctly 😄) lines of code per newly introduced element.

It's simply madness. It doesn't help that MSFT continues to fall over itself by continuing to produce poor guidance, as project.json suffers from this same malady. This is also why I continue to sound like a broken record and point to Xaml as the premier example of the correct way of handling serialization and "configuration."

And yes, such a shame about XAML. There are nearly 200 votes on the dedicated GitHub issue, which continues to be ignored. 💔

@aL3891
Owner

aL3891 commented Dec 7, 2016

I think there is also a conceptual gap here which seems to be continually missed, in that "data" files should be seen as serialized objects

Well, that's a matter of opinion I suppose, and I do agree with you in the case of project.json. However, take the custom references where you store references as files: there you can't just deserialize something. In here I actually do have a POCO for the references, the PackageReference class; I just don't have a global one for the entire project model.

I do agree with you, though, that this is something that should be done for project.json specifically, and perhaps for other models as well, but I'm just not sure the best approach would be to define a single POCO for projects that fits all models.

Flexibility is also a problem. The rigidity of a POCO is both a strength and a weakness, but for a format that has to be as flexible as the MSBuild project system, I think it would be very hard to find a unified POCO that satisfies all the needs :)

MSBuild does have POCOs as well, after all: items and properties. They are just more general.

@Mike-E-angelo
Author

Well that's a matter of opinion I suppose

sigh :) And this is why we have an ecosystem inundated with weak, obscure schemas resulting in a subpar developer experience, in my opinion. To illustrate, consider a Xaml file vs. a JSON file. With Xaml, the artifacts required are:

  1. The class you are describing (PackageReference.cs).
  2. The class that represents the serialized form of that class (PackageReference.xaml).

In a typical XML/JSON-schema scenario (again, the kind that is incorrectly produced by MSFT), you have the above, plus now a third:

  3. PackageReference.xsd (or PackageReference.json.schema, or whatever it is).

By introducing that third artifact, you now force the developer (and tooling) to somehow find it and load it appropriately so that IntelliSense and the like "just works." But what happens when you change the originating class file? You now have to update that schema, too. So not only have you introduced yet another (arguably unnecessary) artifact, you have also doubled the amount of work required for each and every change, as a developer must now update both the class file and this other schema file.

In here I actually do have a poco for the references, the PackageReference class, I just don't have a global one for the entire project.

And why not? :) It should be the first thing you design!

but i'm just not sure the best approach would be to define a single poco for projects that fits all models.

Again, I am sensing a disconnect here. What "models" are you speaking of? There is one model here that I see: the PackageReference POCO that you have created. It's really as simple as that. Every format you are supporting is simply deserializing a collection of those suckers into memory. Or should be. IMO. :)

the rigidity of a poco is both a strength and a weakness

I guess I am confused as to why you accept this constraint in your code, but when you serialize artifacts generated from that code into a data stream and then onto disk, you no longer wish to abide by it. Help me out here.

Msbuild after all does have pocos as well, Items and properties, they are just more general

Now this is a matter of opinion. ;) (joking) Yes, MSBuild is rooted in POCOs, but those POCOs are tightly coupled to XmlElements. Can you imagine building a class these days where a property in that class is a JToken and/or JObject? The horror!

What we're after here are clean, plain ol' CLR objects that simply describe a concern in a domain solution. Again I go back to the disconnect: when an object leaves memory and goes to disk via a "data" file, all sorts of really weird conceptions start cropping up, and that is where the friction/confusion/angst occurs. That object is still an object and should be treated as such.

Yes, IMO. 😎

@aL3891
Owner

aL3891 commented Dec 7, 2016

And this is why we have an ecosystem inundated with weak, obscure schemas resulting in a subpar developer experience.

I think you're oversimplifying things a bit. What happens if that class is changed? Now you're potentially breaking millions of existing projects. What happens if some other party wants to add properties to the project file? Are they supposed to send you a PR to update the project class and then have you push an update that all clients need to install in order to use the third-party thing? Probably not, so what would end up happening is that the project POCO would have to have lists of properties and items that others would use to extend the project system. Starting to sound familiar? :)

Sure, there is an argument to be had that IntelliSense could be better with type checking, but then how do you specify what other classes to include? Both XAML and good old .config files are examples of this. Both work, and XAML is pretty clean too, but then you're looking at a lot more complex deserialization logic.

And why not? :) It should be the first thing you design!

Well, first of all, as I already mentioned, there is a POCO for package references, the thing that this project is supposed to handle. As I also said before, a single POCO for the project model is too restrictive in this case. Take the file-based package references: they don't really map to a project-level POCO. Effectively that project itself is a deserializer from a bunch of items into a list of package references.

I also disagree that there is an extra artifact needed for XAML vs. JSON; in both cases you need two artifacts, the schema and the data file. For XAML, the schema is in the form of a DLL and the data is in XAML. For JSON it's a JSON schema and a JSON file; for XML it's an XSD and an XML file, and so on. From an end-user/tooling perspective, you're still dealing with two files. XAML is no exception to this: the XAML editor also needs to find and load the corresponding type for the XAML content.

From an authoring perspective, sure, you're also using a .cs file, but you can easily generate whatever other schema file you want from the compilation output, so I don't really see that as a problem either.

Again, I am sensing a disconnect here. What "models" are you speaking of? There is one model here that I see: The PackageReference POCO that you have created. It's really as simple as that. Every format that you are supporting is simply deserializing a collection of those suckers into memory

You're the one that suggested having a project-level POCO that I deserialize project.json directly into. I'm fine with having that in the JSON project, but what I'm saying is that defining a global POCO for all the different reference styles is too restrictive. If I make a POCO that matches the structure of project.json, it won't make sense for packages.config, for example. Each of the different ways to define references would need its own. (And that's fine, I just haven't implemented that yet; I agree it's a good idea.)

I guess I am confused on why you allow this constraint with your code, but for when you serialize artifacts generated from that code into a data stream and then onto disk, you no longer wish to abide by it

I don't really understand what you mean by this... on disk everything is just bytes; there is no structure at all. Also, isn't your complaint that the format is too permissive, not too constrained? Anyway, what I was talking about was the general idea of POCOs in a project system: they have a set schema, and that is a good thing in some ways, like for IntelliSense, but it's also restrictive when it comes to extensibility.

MSBuild is rooted in POCOs, but those POCOs are tightly coupled to XmlElements.

Well, not really... MSBuild is built primarily on task items and properties, and you'll never see a TaskItem declared in the MSBuild XML. When you're writing an MSBuild target you never process any XML in your code; you just get POCOs in and return them back out.

I agree, though, that MSBuild as a whole is tightly coupled to XML, and that's a bit of a shame. But what I argue as part of this project is that it doesn't really matter; we can have our cake and eat it too, without throwing away large portions of the MSBuild code base.

Finally, I should say, I'm not part of @davkean's team, the MSBuild team, or Microsoft at all; I can't speak for any of them. But I think I understand their rationale for doing/not doing the changes they've planned, and I think we can work together to make things better.

And hey, MSBuild is open source. If you think it's easy to use XAML/POCOs, try it out :)

@aL3891
Owner

aL3891 commented Dec 7, 2016

tl;dr: yeah, I think a POCO for project.json is a good idea, but I'd use that particular POCO only in the project.json project. packages.config would get a different POCO to represent it, for example.

@Mike-E-angelo
Author

What happens if that class is changed? Now you're potentially breaking millions of existing projects. What happens if some other party wants to add properties to the project file? Are they supposed to drop a PR to update the project class to you and then have you push an update that all clients need to install in order to use the third-party thing?

OK I am definitely missing something here. This is what versioning is all about, is it not? The project you save/load has a version attached to it, and that is what solves these sorts of concerns.

but then you're looking at a lot more complex deserialization logic.

For whom? :) What seems to be lost here is developer experience, and therefore adoption of a technology/toolset. As a developer, are you going to go with the API that has you jump through 10 hoops to get a file into memory successfully, or the one that only takes 5? Or 3? Or 1? Or NONE??? :) :) :)

I also disagree that there is an extra artifact needed for XAML vs. JSON; in both cases you need two artifacts, the schema and the data file. For XAML, the schema is in the form of a DLL and the data is in XAML. For JSON it's a JSON schema and a JSON file; for XML it's an XSD and an XML file, and so on. From an end-user/tooling perspective, you're still dealing with two files. XAML is no exception to this: the XAML editor also needs to find and load the corresponding type for the XAML content.

Go ahead and do me a favor here. Load up this SLN, and then load up this file in Visual Studio.

I don't know about you, but without any work, this is what mine looks like:

Now opening this file (the JSON equivalent) looks like this:

See the difference? In POCO-based development, tooling detects the POCO and "lights up" accordingly, as the class is the schema.

However, in JSON, I do see this "feature":

But I do not see any way of selecting my own schema, nor do I know where such schemas are even stored. Even if I did know, I am already spending way more time (developer experience) dicking around trying to make the tool work in one format than in another that automatically "just works."

From an authoring perspective, sure you're also using a .cs file, but you can easily generate whatever other schema file you what from the compilation output so I don't really see that as a problem either.

Easily? No one is doing this, and no tooling supports this. If so, please demonstrate. I want to be sure I am positively educated on the matter.

if I make a poco that matches the structure of project.json, it won't make sense for packages.config for example

Ahhhhh I am with you! Yes, in those cases you would need a surrogate as well. OK I think we are on the same page there.

Finally, I should say, I'm not part of @davkean's team, the MSBuild team, or Microsoft at all; I can't speak for any of them. But I think I understand their rationale for doing/not doing the changes they've planned, and I think we can work together to make things better.

The CPS team "gets" Xaml. They are using it for their rules engine. The issue isn't CPS but MSBuild. It is indeed powerful (I've been with it since the start, keep in mind), but it is a total bear to endure. The issue here is creating an easy (even fun?) way of accessing/interfacing with the elements that make MSBuild tick.

And hey, msbuild is open source, if you think its easy to use xaml/pocos, try it out :)

That's what my little proof of concept is all about. :) I am rounding out the XML part, and then I hope to have a build to check out. This is for dotnet/msbuild#613, BTW. Please upvote if you haven't already; I would really appreciate the support.

BTW, thanks for the dialogue. I really appreciate the thoughts. Even though I am (really 😛) opinionated on the matter, I do concede I don't have all the answers. So it's good to get feedback/discussion to ensure I have my understanding on square ground.

@aL3891
Owner

aL3891 commented Dec 7, 2016

OK I am definitely missing something here. This is what versioning is all about, is it not? The project you save/load has a version attached to it, and that is what solves these sorts of concerns.

That doesn't help you much if the version is specified in the actual file you're trying to load; you'd need to know the version before deserializing the file. The XAML reader will barf if it encounters an invalid property. Of course that is solvable, but then you're no longer "just deserializing a POCO."

For whom? :) What seems to be lost here is developer experience and therefore adoption of a technology/toolset.

For the tool developer, obviously. You're making a very unclear argument here... First you seem to argue that effort in the tooling doesn't matter because the developer experience is the most important thing, but then you go on to say that developers will choose the simplest way to "get a file into memory." Well, yeah, and just deserializing a POCO is not the simplest way when you take versioning and extensibility into consideration.

Go ahead and do me a favor here. Load up this SLN, and then load up this file in Visual Studio.

This in no way addresses my comment; both formats rely on two files, schema and data. First of all, the JSON file in your example does not specify a schema, so obviously none is available. If you were to properly set the $schema property, you'd get better IntelliSense.
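The $schema association is a one-line change in the data file itself. An illustrative fragment (the schema URL here is a common convention for project.json on schemastore.org, shown only as an example):

```json
{
  "$schema": "http://json.schemastore.org/project",
  "dependencies": {
    "Microsoft.AspNet.ConfigurationModel": "0.1-alpha-*",
    "SomeProject": ""
  }
}
```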

Secondly, you're comparing the tooling experience, not the capability of each format. If someone wanted to, they could make the exact same tooling based on JSON schema as XAML; this has nothing to do with the format itself.

You're only getting that experience because you have all the assemblies and the editor is able to find them. If someone just hands you a XAML file, you get no better experience than you would if someone just handed you a JSON file.

Easily? No one is doing this, and no tooling is supporting this. If so, please demonstrate. I want to be sure I am positively educated on the matter.

Sure: http://www.newtonsoft.com/jsonschema/help/html/GeneratingSchemas.htm and https://blogs.msdn.microsoft.com/webdev/2014/04/10/intellisense-for-json-schema-in-the-json-editor/ are both among the first hits on Google, so clearly somebody does this...
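The generation API referenced above is small. A sketch, assuming the Newtonsoft.Json.Schema package (a separate package from Json.NET itself) and reusing the PackageReference class from the first post; the output file name is illustrative:

```csharp
// Sketch: generate a JSON schema from a CLR type so editors can pick
// it up. Assumes the Newtonsoft.Json.Schema package is referenced.
using System.IO;
using Newtonsoft.Json.Schema;
using Newtonsoft.Json.Schema.Generation;

class GenerateSchema
{
    static void Main()
    {
        var generator = new JSchemaGenerator();

        // Derive the schema directly from the class, so the class
        // remains the single source of truth.
        JSchema schema = generator.Generate(typeof(PackageReference));

        File.WriteAllText("PackageReference.schema.json", schema.ToString());
    }
}
```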

The CPS team "gets" Xaml. They are using it for their rules engine.

Yes, for their internal rules engine... The issue as I see it is versioning and extensibility. If you were to define an object model that was as flexible as the MSBuild schema, you'd end up with something far more verbose. Even your own sample is more verbose than the new MSBuild files, and it doesn't define things like dependencies, which you'd have to do with attached properties. If you were to define an object model that exactly matched the project, then that would not be flexible enough. Pick your poison :)

I'm not entirely sure why you're trying to convince me of this; I'm not in any position to influence the Microsoft teams at all, and this project doesn't try to replace the whole project file. But even if it did, it's meant to act as a bridge between MSBuild and whatever format you want. It's not meant to even be "the best" format, just to conform to the various ways projects have been specified in the past, like project.json.

In fact, going back to the original suggestion, project.json has features that make it hard to just deserialize into a POCO. Consider this:

{
  "dependencies": {
    "Microsoft.AspNet.ConfigurationModel": "0.1-alpha-*",
    "SomeProject": ""
  }
}

What does the POCO for that look like? A property called Microsoft.AspNet.ConfigurationModel? That doesn't work...
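One standard way to model a JSON object whose property names are themselves data is a dictionary rather than fixed properties. A minimal sketch, shown here with System.Text.Json purely so it is self-contained (a Json.NET version would use [JsonProperty] and JsonConvert.DeserializeObject instead); this is one possible answer, not necessarily what either participant had in mind:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;
using System.Text.Json.Serialization;

class ProjectJson
{
    // The package name is the JSON property name; the version is its value.
    [JsonPropertyName("dependencies")]
    public Dictionary<string, string> Dependencies { get; set; } = new();
}

class Demo
{
    static void Main()
    {
        var json = @"{ ""dependencies"": {
            ""Microsoft.AspNet.ConfigurationModel"": ""0.1-alpha-*"",
            ""SomeProject"": """" } }";

        var def = JsonSerializer.Deserialize<ProjectJson>(json);
        foreach (var (name, version) in def.Dependencies)
            Console.WriteLine($"{name} -> {version}");
    }
}
```

Note this only covers the simple case where every value is a version string; as discussed below in the thread, values can also be objects, which this sketch does not handle.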

@Mike-E-angelo
Author

Mike-E-angelo commented Dec 7, 2016

You'd need to know the version before deserializing the file. The xaml reader will barf if it encounters an invalid property. Of course that is solvable but then you're no longer "just deserializing a poco"

Not true. You can get the version in Xaml's case from the xmlns declaration. For example:

xmlns:tasks="clr-namespace:Contoso.Tasks;assembly=Extensibility.Tasks, Version=1.0.0.0, Culture=en, PublicKeyToken=a5d015c7d5a0b012"

So you know precisely which version to load when the file is deserialized. JSON.NET uses the $type property for this same purpose.
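For comparison, the $type metadata Json.NET emits when TypeNameHandling is enabled carries the assembly-qualified type name in the payload itself. An illustrative fragment (the type and assembly names below are hypothetical, echoing the xmlns example above):

```json
{
  "$type": "Contoso.Tasks.PackageReference, Extensibility.Tasks",
  "name": "Newtonsoft.Json",
  "version": "9.0.1"
}
```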

well yeah, and just deserializing a poco is not the simplest way when you take versioning and extensibility into consideration

Well, the way it has been designed, along with the practices and "guidance" that have been provided, this is very true. 😛 This is what all the discussion and conversation has been about, and what we should be striving to improve.

This in no way addresses my comment, both formats rely on two files, schema and data

It does address your comment, in that I thought it would be easier to show with pictures before adding more messy words to the fray. While you are correct that there is a third artifact in the case of Xaml, being the DLL, this is not something that the developer has to directly work with in order to get the scenario above to work. In the case of JSON and XML, you have three artifacts that the developer must constantly work with in the case of any changes. With Xaml (a strictly POCO paradigm), the workflow only involves two. Am I wrong here?

Secondly you're comparing the tooling experience not the capability of each format

Why yes, yes I am. :) The better experience should lead to stronger features and better adoption. That is what any API should strive to attain, right?

if someone wanted to they could make the exact same tooling based on json schema as xaml, this has nothing to do with the format itself.

Aha, but why do that when the schema already works and updates/synchronizes perfectly with the class definition in Xaml? Again, that is a paradigm that requires less work (better experience), so it should be the winner here. If "someone" wants to go create a schema, that is extra work/effort/time they have to put into something that is already readily available in another format. This impacts the bottom line and increases TCO (the whole reason we're looking for the "easiest" and "best" developer/tooling experience).

You're only getting that experience because you have all the assemblies and the editor is able to find them. If someone just hands you a xaml file you get no better experience than with you would if someone just handed you a json file.

OK, now we're talking. In either case the assemblies on which the serialized POCOs depend are assumed to be readily available. This would definitely be a consideration for going down this path, and it really should not be difficult to find a way to make assemblies readily accessible and visible to developers so that this is a non-issue. I mean, .NET has some pretty incredible assembly resolution, so you can piggyback off of that to start.

both are third hits on google, so clearly somebody does this..

Wow, did you see all the steps involved to make that work? Look at all the effort required here to make a JSON file render a schema! Compare that to simply opening a Xaml file in Visual Studio and marveling at the magic. This might get a little lost in the conversation here, but the ease and lack of friction involved in simply getting symbols to resolve as you would expect for a text representation of a serialized instance is part of the reason why Xaml became so popular.

As for the proposed JSON.NET generation link: you would have to run this tool (well, first build the tool, because what you provided was just the API to allow this) after each build. And of course you would no doubt have to have some logic in there to ensure there are changes, etc., etc. So now we're talking:

  1. Build the tool that generates the schema.
  2. Wire it into the build pipeline.
  3. Detect changes.
  4. Ensure it only executes during changes.
  5. Find the generated schema and ensure documents that point to it update correctly.
  6. Wire up the generated schema to any required documents with the considerable recommended instruction list.

Whew, I'm tired. :)

With Xaml, I simply have to build the project and, voilà, the designer/tooling automatically updates. I do not have to touch another file, tool, or API to do this.

Additionally, can you say with certainty that the JSON schema allows for custom visual editors? I am not sure if you noticed, but the screenshot above demonstrating the Xaml tooling experience includes designer controls for certain properties, such as checkboxes for booleans and dropdowns for enumerations. Furthermore, a whole API exists to extend the design controls that are used, allowing developers to create very powerful and engaging controls that help provide the data they are supposed to enter.

Again, this ultimately reduces developer time and project TCO.

Yes for their internal rules engine.

Right, and? My point is that Xaml is being used for the CPS system, internal or otherwise. Those rules are POCO objects, and they are no doubt benefiting from the tooling around what is used to define them.

If you where to define an object model that was as flexible as the msbuild schema, you'd end up with something far more verbose. Even your own sample is more verbose than the new msbuild files, and it doesn't define things like dependencies that you'd have to do with attached properties.

Orly? More verbose? How so? FWIW, I was working off of @gulshan's suggested DSL in dotnet/msbuild#1289. Obviously, adding more properties will add more content, but that is not the important point I am trying to demonstrate with this POC. What is important is being able to define the same POCO element in multiple formats. If it is too verbose for your liking, guess what? You can always modify the serializer to make it prettier. :)

I'm not entirely sure why you're trying to convince me of this

Like I said, I want to ensure my knowledge is square, and for the most part it sounds like it is. I definitely appreciate your thoughts here; I hope that is clear. My motivation is to ensure that you, as a developer embarking on a mission to help address some of the "format friction" in the ecosystem, are aware of the latent (nay, transparent!) issues you are going to be (or already are) wrestling with, and hopefully/maybe correct some perceived wrongs before they manifest.

A property called Microsoft.AspNet.ConfigurationModel? doesn't work..

Didn't I just model this in the original issue post? That is easily captured by the References class in the first post. How does that not work? Or rather, what have I misunderstood this time? :)

@aL3891
Owner

aL3891 commented Dec 8, 2016

So you know precisely which version to load when the file is deserialized. JSON.NET uses the $type property for this same purpose.

You still need to find that version of the assembly. You can't simply reference it, because you need to load it in order to find the references inside the file itself. So again, if you want to add a property to this format, you'd have to make a PR to the MSBuild repo, get it merged, released, and so on before you can actually use it.

The only other option would be, again, to design your project POCO such that it's a big list of properties and items, i.e., exactly what MSBuild is now, but with more XML namespaces and less flexible syntax (MSBuild collections are open, for example, so Compile elements can be defined in multiple places).

While you are correct that there is a 3rd artifact in the case of Xaml being the DLL, this is not something that the developer has to directly work with

...except that they have to specify it by name, along with the exact version and PublicKeyToken, at the top of the file, as well as make sure Visual Studio is able to find it in order for the project to even load. Oh, and that also goes for every other assembly the project file references and uses.

In the case of JSON and XML, you have 3 artifacts that the developer must constantly work with in the case of any changes. With Xaml (a strictly-POCO paradigm), the workflow process only involves two. Am I wrong here?

Where is this third file in the case of project.json? Where is it with MSBuild? You have the JSON file, the JSON schema, and...? In these specific cases the schema is even implicit, so you only have one file, but even in the general case there are only two: the file and the schema. The developer only ever needs those two.

Why yes, yes I am. :) The better experience should lead to stronger features and better adoption. That is what any API should strive to attain, right?

Yes, but the tooling is obviously not intrinsic to the format... XAML as a format is not better than JSON because it has better tooling; it simply has better tooling. There is also nothing in the XAML designer that would provide IntelliSense for packages, for example, so in that case the tooling is even better for project.json at the moment.

Wow, did you see all the steps involved to make that work?

I honestly am losing the ability to tell if you're trolling me... What you're referring to as "all the steps" is this: adding the line "$schema": "path" to a JSON file?

As for the proposed JSON.NET generation link: you would have to run this tool (well, first build the tool, because what you provided was just the API to allow this) after each build. And of course you would no doubt have to have some logic in there to ensure there are changes, etc., etc.
With Xaml, I simply have to build the project and, voilà, the designer/tooling automatically updates. I do not have to touch another file, tool, or API to do this.

Again, you're conflating the experience for an end user with that of the developer of the tooling. You're not continually building your own project system every time you build. Adding a few lines of PowerShell to your MSBuild to generate a schema is not exactly a big deal.
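As a sketch of what that wiring might look like as an MSBuild target (the schemagen.exe tool name and file paths are hypothetical; the Inputs/Outputs pair gives incremental "only when changed" behavior for free):

```xml
<!-- Hypothetical target: regenerate the schema only when the built
     assembly is newer than the last generated schema. -->
<Target Name="GenerateJsonSchema"
        AfterTargets="Build"
        Inputs="$(TargetPath)"
        Outputs="$(TargetDir)PackageReference.schema.json">
  <Exec Command="schemagen.exe &quot;$(TargetPath)&quot; PackageReference &quot;$(TargetDir)PackageReference.schema.json&quot;" />
</Target>
```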

Additionally, can you say for certainty that the JSON schema allows for custom visual editors?

What? Obviously, yes... In what way would the schema prevent a custom editor from existing in the tooling?

Right, and?

...it's an internal system, so it does not have to consider versioning and composability/extensibility in the same way.

Orly? More verbose? How so?

How is this unclear? From what I've seen, your Xaml file with all its namespace declarations has more text than an equivalent MSBuild file (again, in the new format). In other words, it's more verbose.

I am wanting to ensure my knowledge is square, and for the most part it sounds like it is.

Yeah, I don't agree. You're ignoring large parts of the problem space, like how to deal with extensibility and backwards compatibility (if the project.xaml file contains the package references, but you also have a custom property in it that requires those references to already be there, how do you restore, since you can't load the project?). You're also making a ton of assumptions and constantly conflating the current tooling experience with the capabilities of each format, and you keep switching between the experience for the end developer and the tool developer, as well as between the general case of XAML vs. JSON and the case of using it to build a project system specifically.

My motivation is to ensure that you as a developer embarking on a mission to help address some of the "format friction" in the ecosystem are aware of the latent (nay, transparent!) issues you are going to be (or already are) wrestling with, and hopefully/maybe correct some perceived wrongs before they manifest.

I think that if you want to do that, you need to make a more focused argument. This project isn't even about determining what the best way to represent a project is. It's about enabling people to specify references in any way they want, old or new, POCO or not.

Nonetheless, I think you've made your argument; some of it I agree with, other parts less so.

Didn't I just model this with the originating issue post? That is easily captured by the References class in the first post.

Not correctly... Your code assumes References is an array; it's not, it's an object. It assumes there are name and version properties; there aren't. The name of the package is the property name, as I outlined earlier, and the value of that property is the version. Your code also doesn't capture the fact that the value of a reference may be either a string (the version) or an object that specifies other things for the reference. (My code doesn't cover that last part either, btw.)
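For what it's worth, the shape described here can still be captured with a POCO-ish model; it just has to treat dependencies as a dictionary keyed by package name rather than an array. Below is a minimal, non-authoritative sketch. It uses System.Text.Json purely so the sample is self-contained (the thread's examples use JSON.NET, where a `Dictionary<string, JToken>` plays the same role), and the sample JSON is invented for illustration:

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

// Invented sample in the project.json style described above: the package
// name is the property name, and the value is either a bare version
// string or an object carrying extra detail.
const string json = """
{
  "dependencies": {
    "Newtonsoft.Json": "9.0.1",
    "My.Local.Lib": { "target": "project" }
  }
}
""";

using var doc = JsonDocument.Parse(json);
var versions = new Dictionary<string, string>();
foreach (JsonProperty dep in doc.RootElement.GetProperty("dependencies").EnumerateObject())
{
    // A string value is the version; an object value is a richer reference.
    versions[dep.Name] = dep.Value.ValueKind == JsonValueKind.String
        ? dep.Value.GetString()
        : "(expanded reference)";
}

Console.WriteLine(versions["Newtonsoft.Json"]); // prints "9.0.1"
```

The key design choice is the dictionary: because the package names are open-ended JSON property names, they cannot be fixed C# properties, but they still deserialize in one pass without any LINQ-over-DOM traversal.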

@Mike-E-angelo
Author

So again, if you want to add a property to this format, you'd have to make a PR to the msbuild repo, get it merged, released and so on and so on before you can actually use it.

While this would be true in the case of adding another property directly to the Project class or any other POCO, hopefully this would not be the case in the general scenario of extensibility. Ideally what you (or a typical developer in this world) would build is a Task extension and would look more like this (or this in JSON :) ).

Except that they have to specify it by name, along with the exact version and PublicKeyToken, at the top of the file, as well as make sure Visual Studio is able to find it in order for the project to even load.

We "have" to do this now with each of our references in our projects in our solutions, but somehow we don't have a problem with it. That is because tooling has made this part easier to do over time. Getting an assembly/namespace in Xaml has never been a chore, especially with ReSharper.

Where is this third file in the case of project.json? Where is it with msbuild?

Well, so let's take your extensibility concern again above with the Contoso.Tasks.CustomTask. If I add this to a project file, where is the schema? How do I easily add the schema to my current project.json or project.xml? Project.xaml "just works" because it simply loads the class definition from the .dll, but .json and .xml do not have this capability. There is simply more work involved for a developer when you do not use the class definition as the schema.

Yes, but the tooling is obviously not intrinsic to the format... Xaml as a format is not better than JSON because it has better tooling; it simply has better tooling. There is also nothing in the XAML designer that would provide IntelliSense for packages, for example, so in that case the tooling is currently even better for project.json.

100% agreed here. Part of the subtext here is that all of this great MSFT-sponsored and -built functionality "just worked" in Xaml, and somehow it was simply ignored in order to reinvent the wheel with a WAYYYYYYY inferior replacement. I will be honest and say that JSON's terseness is winning me over. It would be pretty sweet to see all the awesomeness I am pointing out to you in Xaml files replicated in JSON.

I honestly am losing the ability to tell if you're trolling me..

Hahaha... I seriously did/do not have the patience to read that entire list of instructions versus simply having to open a file to get the "same" (not sure if visual designers work yet?) experience in another!

In what way would the schema prevent a custom editor from existing in the tooling?

Well, as you perfectly well know, anything in software is possible. :) I am again simply pointing out the feature set that already exists in a mature, readily available tooling paradigm that does not require much work/effort to leverage, and how far away "yet another format" is from matching it, even if it is prettier to the eye.

This also further underscores the reason to support as many formats as possible, leaving each to the developer/organization to work with it in the tooling that is most comfortable/powerful/familiar to them.

From what I've seen your Xaml file with all its namespace declarations has more text than an equivalent msbuild file [again in the new format].

LOLz yes of COURSE Xaml is verbose! It has no choice but to be! But it is much more expressive and powerful because of it, allowing you to use MarkupExtensions and other concepts. You do not see this in JSON, either. You harp on my Xaml file, but what of the JSON? I think you still do not grasp the fact that both files describe the same theoretical (read: NOT REAL) POCO object!

Yeah, I don't agree. You're ignoring... // ... you're also ... // ... You're also ...

Yeah, I think this (very nice) discussion would be more valuable to have in person at this point.

This project isn't even about determining what the best way to represent a project is. It's about enabling people to specify references in any way they want, old or new, POCO or not.

Fair enough. I am simply hoping to ensure the best guidance on how to do this, so that when people hit the ground running here, they are given the experience that only requires one line of code rather than O(n) lines of code.

Your code assumes References is an array; it's not, it's an object. It assumes there are name and version properties; there aren't,

Now I'm even more confused. Are you saying that this is a dynamic object of some sort that can simply change its schema on the fly? Or that, really, there is no schema at all for this object?! You don't model your C# classes like this, do you? Again, there seems to be a disconnect between data and classes. I guess if anything we're learning that I am in the minority here in thinking that this is a terrible and very error-prone way to design solutions.

@aL3891
Owner

aL3891 commented Dec 8, 2016

While this would be true in the case of adding another property directly to the Project class or any other POCO, hopefully this would not be the case in the general scenario of extensibility

As I've been saying all along, you'd then need to design your POCO to be a bunch of lists of items, which is basically what msbuild is now. (But with better IntelliSense out of the box, fair enough; tooling would most likely have custom IntelliSense anyway, as with project.json.)

We "have" to do this now with each of our references in our projects in our solutions

Your argument was that "this is not something that the developer has to directly work with"; this is clearly false, as you do need to work with it. There is nothing stopping tooling from helping you find any schema, Xaml or otherwise.

Well, so let's take your extensibility concern again above with the Contoso.Tasks.CustomTask. If I add this to a project file, where is the schema?

Nowhere? I've never claimed that current msbuild is perfect when it comes to IntelliSense. You're claiming that there are three artifacts a developer has to deal with when it comes to JSON and XML; your example has nothing to do with that, and it also ignores that you need that dll for XAML to "just work". JSON and XML will also "just work" if the tooling is built that way, but that has nothing to do with the format, or with whether there is a CLR POCO somewhere in the background.

I want to be sure I am positively educated on the matter.

Hahaha... I seriously did/do not have the patience to read that entire list of instructions

Yeah, clearly, since you were unable to read that entire list of one item.

can you say for certainty that the JSON schema allows for custom visual editors?

Well, as you perfectly well know, anything in software is possible

You're the one that asked the question.

Orly? More verbose? How so?

LOLz yes of COURSE Xaml is verbose!

Alright then...

It is much more expressive and powerful because of it

Sure, but verbosity and ugliness are the main complaints people have against msbuild files. I doubt people who don't like project.csproj will feel a lot better about project.xaml.

You harp on my Xaml file,

I harp on about it? You're the one that keeps bringing it up!

but what of the JSON?

What about it? It's also very verbose, since you specify the types everywhere. Besides, how is that a good example? You're the one saying the editing experience for JSON is so terrible.

I think you still do not grasp the fact that both files describe the same POCO object

When have I ever disputed that they do?

I am simply hoping to ensure the best guidance on how to do this

You've given your perspective, but I find it a little arrogant to suppose that this is "the best" guidance. If you want to write something in a particular way, write it; don't expect other people to do it for you.

Are you saying that this is a dynamic object of some sort that can simply change its schema on the fly

...What are you talking about? I don't have anything modeled as a dynamic. I'm reading the project.json file as it was defined by the ASP.NET team; if you have a problem with that, take it up with them.

@Mike-E-angelo
Author

As I've been saying all along, you'd then need to design your POCO to be a bunch of lists of items, which is basically what msbuild is now.

And I am woefully trying to demonstrate that this is not necessary.

Your argument was that "this is not something that the developer has to directly work with"

OK, since we're getting into the nitty-gritty, how about: this is not something that the developer has to directly work with in existing tooling workflows. If I open a Xaml file in my project/solution, I don't even have to build the solution to get it working. The tooling immediately picks up the schema from the class that I define as the root element. Compared to the steps above with the current JSON tooling experience, this is much less work and therefore wins as the preferred/desired workflow.

Again I am not saying that this is inherent to Xaml, and I am in agreement that "nothing" is stopping this from working in other formats, but it doesn't currently work in other formats and the amount of work to get it to work in those formats is far greater than utilizing what already works in an existing (and preferred to many who have used WPF) format.

and also ignores that you need that dll for XAML to "just work". json and xml will also "just work" if the tooling is built like that, but that has nothing to do with the format or whether there is a clr poco somewhere in the background

As I just described, the DLL isn't necessary for in-project items, and I am in agreement with the rest, but the unfortunate reality is that no tooling is available to make it "just work" like what Xaml does. This is due in part to the systemic conceptual disconnect in our ecosystem between data and POCO which we are exploring here.

Yeah clearly, since you've unable to read that entire list of one item.

LOL, you got me here, champ. I took a deep breath and re-read that article. So right, adding a $schema attribute or specifying it from the dropdown (which is still a mystery as to how it is populated and/or how to add more entries to its list). This is part of the frustrating friction of dealing with JSON. Again, I hear you when you say that there is "nothing stopping" JSON from working like the "lit up" experience already available in Xaml, but my point is that no one is actually doing that, and the effort it would require of a developer is considerable.
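For concreteness, the $schema mechanism being discussed is just a one-line pointer inside the JSON file itself. A project.json could opt in roughly like this (the schemastore URL below is an assumption on my part; verify it against schemastore.org before relying on it):

```json
{
  "$schema": "http://json.schemastore.org/project",
  "dependencies": {
    "Newtonsoft.Json": "9.0.1"
  }
}
```

Editors that honor $schema (Visual Studio, VS Code, and others) can then offer IntelliSense and validation against the referenced schema without any further configuration.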

I also hope that you understand the frustration here that this is all taken under the context of Xaml, where it already "just works" the way "it should" without much effort on the part of the developers that engage in its workflow.

You're that one that asked the question.

Haha right, but I was seeing if YOU knew if it supported it in the same fashion (or better) than Xaml currently does. I know it can, but does it now? This sounds like a "no."

Sure, it's also way more verbose, and verbosity and ugliness are the main complaints people have against msbuild files.

Well, that is a part of it, but I think they are fundamentally frustrated with its tooling experience: an arbitrary schema with very little tooling support around it. There is a tremendous amount of friction involved in dealing with those files, to be sure.

I harp on about it? You're the one that keeps bringing it up!

You were saying how verbose my files were, but interestingly enough you were gravitating towards the Xaml and not the JSON.

You've given your perspective, but I find it a little arrogant to suppose that this is "the best" guidance. If you want to write something in a particular way, write it; don't expect other people to do it for you.

Well "best" is subjective, of course. I guess I should say "lesser or least effort required with existing tooling and workflows."

I'm reading the project.json file as it was defined by the ASP.NET team; if you have a problem with that, take it up with them

Already done and done. (Moved my grievances to CPS and MSBuild as that is where the action was migrated after the big shitshow went down).

Just think: if the ASP.NET team had done the "right thing" here and built that schema around a POCO, you could simply add the NuGet package containing that POCO into your cool project here, and half of your work would already be done. Imagine that. 😄

@gulshan

gulshan commented Dec 8, 2016

Hey guys. I am starting to envision the architecture of my proposal in dotnet/msbuild#1289, and your discussion helped a lot. 😃 Here it is:

  • MSBuild should supply an IProject interface (or abstract class) and extension class(es) based on the interface, supplying various functionality.
  • Among the public APIs, a special signature will be used to select methods, which will be available as MSBuild tasks and can be invoked from the command line or other tools.
  • There will be plugins to MSBuild. Plugins can do anything on top of the MSBuild-supplied API:
    • Implement IProject for a certain type of project, like CSharpProject or FSharpProject.
    • There can be child projects inheriting from projects defined by other (or the same) plugins, like FSharpLibrary or ASPNETWebApp.
    • That means a plugin can depend on other plugins.
    • Add extension classes/methods for IProject and its implementations.
    • And anything else.
  • The main MSBuild has to be installed as an app, but plugins will be acquired using NuGet. That will facilitate version and dependency management.
  • Plugins can be acquired on a per-project basis, but there should be aggressive caching to avoid fetching the same version of a plugin multiple times.
  • I prefer plugins to be hosted in a separate repository other than nuget.com, so that all the plugins can be reviewed by a responsible team as well as rated by users.
  • The end user or developer will be using a toned-down or restricted version of C# script to define/describe a project. The restrictions will be applied by a Roslyn analyzer.
  • The script will refer to the plugins it is using and describe the project using either object-initializer notation or other constructors supplied by the plugin. So a plugin can provide a constructor which actually builds the project object from an XML, JSON, or any other type of file. (There are some concerns for external tools here.)
  • End developers will also be able to define MSBuild tasks, matching the special signature already mentioned.
  • There will be an MSBuild tooling API, which will enable external tools to read (and edit if necessary) the project file. Visual Studio will just be one such tool.

An example of how a project file may look can be found here. Any opinion?
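To make the proposal above a bit more concrete, here is a rough, non-authoritative sketch of the shapes it describes. Every name here (IProject, CSharpProject, CSharpLibrary) is hypothetical and taken only from the bullet list; none of this is an actual MSBuild API:

```csharp
using System;
using System.Collections.Generic;

// The "restricted C# script" a developer would write could be little more
// than an object initializer over plugin-supplied types:
IProject project = new CSharpLibrary
{
    Name = "MyLib",
    References = { "Newtonsoft.Json/9.0.1" }
};
Console.WriteLine(project.Name); // prints "MyLib"

// Hypothetical MSBuild-supplied contract.
public interface IProject
{
    string Name { get; }
    IList<string> References { get; }
}

// A plugin implements IProject for a given project type...
public class CSharpProject : IProject
{
    public string Name { get; set; } = "";
    public IList<string> References { get; } = new List<string>();
}

// ...and child project types can inherit from it, so plugins can build on
// other plugins, as the proposal suggests.
public class CSharpLibrary : CSharpProject { }
```

Because the project description is plain C#, external tools would read it through the proposed tooling API rather than parsing the script directly.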

@aL3891
Owner

aL3891 commented Dec 8, 2016

@Mike-EEE, you know what, I don't see this discussion leading to anything constructive, especially not for this project. I've spent far more time than I should have bickering with you about the pros and cons of XAML; frankly, I have better things to do, like actually working on this project, for instance.

If you want to use xaml, do it. If you want to make a project system based on it, go ahead. Just show that your system can handle all the things msbuild can, and people will make their choice. Given how confident you are about its superiority and how easy it is to implement, I can't see how it would not be successful.

@gulshan
That's interesting, but I don't see how it is related to this issue or this project. It sounds like something you should discuss over at msbuild; I'm sure you'll get a better response from people who are actually involved in msbuild there.

@aL3891 aL3891 closed this as completed Dec 8, 2016
@Mike-E-angelo
Author

bickering with you...

Aw geeze, dude, sorry you feel that way. :( FWIW, I thought this was a discussion/heated debate. Sorry if I came off as inflammatory and/or condescending and left you with the impression that it was otherwise.

If you want to use xaml...

I hope that if there is anything you can take away from this, it is that what I am talking about is not Xaml (you're so turning that into a trigger word, LOL). It is a POCO-serialized, designer-friendly development paradigm, of which Xaml can be considered the best approximation, requiring the least amount of effort to enjoy today.

FWIW, if Xaml worked on .NET Core I'd soooo be all over helping you with your Xaml proj. :)

@gulshan You know that I am in full support of your efforts! The trick of course is to get MSBuild team to buy in, and that is extremely challenging, if even possible.
