
Add async completion support #1986

Merged
merged 8 commits into OmniSharp:master on May 6, 2021

Conversation

333fred
Contributor

@333fred 333fred commented Oct 16, 2020

The recent completion rewrite makes completion faster for a large number of cases, but it has a couple of issues:

  • Any completion provider that needs to make edits beyond the few we've special-cased doesn't work correctly. Roslyn is looking to add more of these, such as IntelliSense for casts, indexers, and operators (dotnet/roslyn#47511).
  • Completion providers that we do special case can be slow. Override and partial method completion in big types is painful, taking over a minute to show up sometimes.

To resolve this, I've added support for async edits by adding a new completion endpoint: /completion/afterInsert. It takes the item that was inserted, the file, and the new cursor position. It relies on the inserted InsertText still resolving to the completion in the same way as before, so I had to force a few providers that don't behave well here to be eagerly resolved regardless. I also kept import completion using the standard /completion/resolve step to resolve extra edits, as they don't need to modify the current cursor location at all.
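Roughly, the request carries the inserted item, the file, and the post-insertion cursor position, and the response carries the extra edits plus the final cursor position. The sketch below is only an illustration of those shapes; the type and member names are placeholders, not the PR's actual models.

using System.Collections.Generic;

// Hypothetical shapes for /completion/afterInsert; names and members are a sketch
// inferred from the description above, not the PR's actual wire models.
public class AfterInsertRequest
{
    public object Item { get; set; }       // the completion item the client just inserted
    public string FileName { get; set; }   // the file the insertion happened in
    public int Line { get; set; }          // cursor line after the InsertText was applied
    public int Column { get; set; }        // cursor column after the InsertText was applied
}

public class AfterInsertResponse
{
    // Edits computed after insertion (e.g. the expensive override/partial-method body),
    // plus where the cursor should end up once they are applied.
    public IReadOnlyList<object> Changes { get; set; }
    public int? Line { get; set; }
    public int? Column { get; set; }
}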

@333fred 333fred marked this pull request as draft October 16, 2020 07:22
@333fred
Contributor Author

333fred commented Oct 16, 2020

@filipw @JoeRobich @david-driscoll I've opened this as a draft because I'm not 100% sure we actually want to take this. It'll require implementation work in every client, as LSP does not support this. LSP can use the command field on a CompletionItem to trigger a custom command, but every editor client extension will have to know how to handle that. It certainly is much faster for many things, though.
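For reference, the command mechanism looks roughly like the sketch below on the wire. The command id and argument shape here are made up for illustration; each client would need its own handler that forwards them to /completion/afterInsert.

// Sketch of an LSP completion item carrying a custom command (illustrative only).
var completionItem = new
{
    label = "ToString",
    insertText = "public override string ToString()",
    command = new
    {
        title = "Complete insertion",
        command = "o#.afterInsert",                            // hypothetical client-side command id
        arguments = new object[] { /* the inserted item */ }   // whatever the handler needs to resolve
    }
};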

Pictures speak 1000 words, and animated gifs even more so.
Sync completion (the experience today):
[sync-completion GIF]
Async completion (the experience with this PR):
[async-completion GIF]

333fred added a commit to 333fred/vscode-csharp that referenced this pull request Oct 16, 2020
@david-driscoll
Member

@333fred
Contributor Author

333fred commented Oct 17, 2020

@david-driscoll better, but not good enough for our purposes. Unless they're planning on changing it, vscode only supports additionalTextEdits and detail/documentation fields for lazy resolve, not the main textEdit.

@333fred
Contributor Author

333fred commented Oct 17, 2020

Because, to be clear, I do not like this code. It's a race condition waiting to happen. But the vscode lsp implementation does not report textEdit as a resolvable property: https://github.com/microsoft/vscode-languageserver-node/blob/248560e74c9a715be59e62fdbdd554b87ddf8dba/client/src/common/client.ts#L1506-L1508.

@david-driscoll
Member

command should be sufficient, however I don't know if there is a good logical flag for whether a client supports it or not.

@david-driscoll
Member

vscode: supports it via native commands
emacs: lsp-mode supports it here
eclipse: lsp4e supports it here
vim: I can't tell if vim-lsp supports it, my assumption is it does not.

cc @nickspoons has vim knowledge

@nickspoons
Member

There are many LSP clients for vim.

I flicked through a few of them and can't find any reference to this command parameter in any. But I don't know these code bases well at all.

As for OmniSharp-vim, we're not an LSP client so we can adapt 😁

@filipw
Member

filipw commented Oct 20, 2020

I think since this is based on a request flag, there is absolutely no problem in adding this and thus giving more advanced (more O# protocol aware) clients an option to use a superior approach for extra edits

@333fred
Contributor Author

333fred commented Oct 20, 2020

I think since this is based on a request flag, there is absolutely no problem in adding this and thus giving more advanced (more O# protocol aware) clients an option to use a superior approach for extra edits

Note, I think I can (and probably should) make this a startup flag for omnisharp instead, so LSP clients that support the command parameter could map it appropriately.

@ffMathy

ffMathy commented Dec 14, 2020

In fact, I think this is a must if Omnisharp and VSCode are to have any chance of being a feasible alternative to Visual Studio.

@ffMathy

ffMathy commented Dec 14, 2020

@333fred what are the next steps for this? Can it be marked as a real PR instead of a draft?

I think this is super important.

@333fred
Contributor Author

333fred commented Dec 14, 2020

@ffMathy I'm really not happy with this code and think it's very likely to result in a subpar experience, which is why I haven't moved forward with it yet.

@ffMathy

ffMathy commented Dec 14, 2020

What can be done to help though? The current experience is insanely slow (as seen in the first GIF).

@333fred
Contributor Author

333fred commented Dec 15, 2020

What can be done to help though? The current experience is insanely slow (as seen in the first GIF).

Frankly, I'm not certain. There's an inherent race condition with this PR and we really can't do anything to solve it.

@ffMathy

ffMathy commented Dec 15, 2020

Hmm, but how does the full Visual Studio do it then? It manages to be much faster in terms of completions and suggestions.

I think it's crucial to get to the same speed for larger projects. Is this still the plan and/or vision?

@ffMathy

ffMathy commented Dec 15, 2020

Similarly one could ask, how does TypeScript do it? It also remains super fast for very large projects.

Could we get inspired by it?

@333fred
Contributor Author

333fred commented Dec 15, 2020

Hmm, but how does the full Visual Studio do it then?

VS has native support for this operation. VS Code does not have support for this type of asynchronous completion, and thus this remains a hack around functionality that VS Code does not have.

Similarly one could ask, how does TypeScript do it? It also remains super fast for very large projects.

TypeScript doesn't do this type of operation, so they don't have to be fast for it :)

@ffMathy

ffMathy commented Dec 15, 2020

Thank you for answering my questions so far.

Is there an issue in the VS Code repo for supporting this type of async operation?

@333fred
Contributor Author

333fred commented Dec 15, 2020

I've talked with @jrieken about it before; I'm not sure if there is any specific issue open about making TextEdit asynchronously resolvable.

@ffMathy

ffMathy commented Dec 15, 2020

I made an issue microsoft/vscode#112527 for it now.

@jrieken
Contributor

jrieken commented Dec 15, 2020

I made an issue microsoft/vscode#112527 for it now.

I have closed the issue as out-of-scope and this is why: accepting a completion is a synchronous operation and no further blocking or waiting can happen. You have to assume that users are typing and that typing has accepted a completion. Needing to asynchronously resolve the text edit would mean that we (a) block typing or (b) employ some kind of OT (operational transformation) strategy which inserts the text edit when resolving. Option (a) is not possible because you don't block in JS, and (b) is complex to implement and yields a subpar experience, e.g. when I accept a completion I wanna see it now, not after a delay.

@333fred
Contributor Author

333fred commented Dec 15, 2020

(b)... yields a subpar experience, e.g. when I accept a completion I wanna see it now, not after a delay.

FWIW, and I've brought this up before (though not in talking with you, Johannes), I fundamentally disagree with this assertion. The experience today is subpar because bringing up the completion list takes 10+ seconds in these complex cases, due to the nature of how we need to calculate the changes up front for every single completion in the list. I'd much rather have a 100 ms delay after insertion than a 10+ second delay on bringing up the list.

@ffMathy

ffMathy commented Dec 15, 2020

@333fred indeed. The current experience is simply unusable. Not just for the average user - for everyone.

Every C# developer I know (which is a lot) wants to use VS Code, but they always back out because Omnisharp's autocompletion is slow. This is across all kinds of solutions that are just a little bigger than the average Hello World example (multiple projects, dependencies, etc.).

The current project I have is only 7000 lines of code and spans 3 projects. Visual Studio can produce suggestions instantly. In VS Code, it takes me 10+ seconds for the completions to arrive.

@jrieken I agree blocking is probably not an option. That's why many people prefer VS Code over Visual Studio in the first place, for every language other than C#. However, whatever solution we end up with, it can't possibly be worse than the current state of things, and in that, I agree with @333fred.

@ffMathy

ffMathy commented Dec 15, 2020

I think it's as simple as asking "does anyone want 10+ seconds of wait time for squiggly error lines and/or warning lines to disappear, and the same wait time for completions and code fixes?"

The answer would clearly be no. What we have now is essentially Notepad with syntax highlighting, because of that wait time.

And don't get me wrong. Omnisharp and VS Code are great. Just slow. The feature set is amazing and an excellent piece of work. I just think it's a waste when it doesn't get the opportunity to be used on a larger scale.

It's almost like an API that presents an amazing feature, but doesn't have documentation for it.

@333fred
Contributor Author

333fred commented Dec 15, 2020

Tho, having said that we DO have additional text edits and a best effort approach to apply them async, after making the main edit.

Yes, this is how import completion works in omnisharp today (and in TS, I believe). The issue with override completion is that the expensive part is calculating the minimum text necessary for the argument types and what the auto-inserted base call should look like. These edits intersect the current cursor location, and indeed where the cursor should finally end up is based on this calculation. This means that additionalTextEdits is not suitable, as we cannot control the cursor location or intersect the cursor location.
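As a concrete illustration (not code from this PR), accepting an override completion has to replace the partially typed member and pick the final cursor position based on the generated body:

// The user has typed (| marks the cursor):
//
//     override ToStr|
//
// Accepting "ToString()" must replace that whole span with something like the
// member below and then place the cursor inside the generated body:
class Example
{
    public override string ToString()
    {
        return base.ToString();   // the cursor ends up on this auto-inserted base call
    }
}
// Both the replacement span and the final cursor position intersect where the user
// was typing, which is exactly what additionalTextEdits cannot express.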

There are other types of completion that become impossible or very hard to do well without asynchronicity. I know that rust-analyzer has brought this up in the past as well: they want to have a match completion that rewrites the current expression and returns a snippet, but they, like us, don't want to do the expensive part of that calculation unless the user actually wants the completion. Roslyn is also starting to add these types of completions, and in order to keep general completions responsive in O#, I'm going to have to block them. Either that, or try to work around vscode's limitations and provide a significantly worse experience than vscode could do natively, as I fundamentally cannot resolve this race condition in an extension.

@ffMathy

ffMathy commented Jan 5, 2021

@jrieken does @333fred's above elaboration make the idea more "feasible"? 🙏

333fred added a commit to 333fred/vscode-csharp that referenced this pull request Apr 25, 2021
@333fred 333fred marked this pull request as ready for review April 25, 2021 03:15
/// <summary>
/// Maximum number of completion lists allowed in cache. Must be >= 1.
/// </summary>
private const int MaxCacheSize = 2;
Member

we used to have a single statically cached item, I am not sure I follow why there are now two?
is this "just in case"?

Contributor Author

We now have an after insert step, and the user could technically have triggered a completion session in between. I want to make sure that we handle it somewhat gracefully.
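A minimal sketch of that idea (not the PR's actual cache type): keep the lists most-recent-first and bound the count at two, so the list still being resolved after insert isn't evicted by a session the user opened in the meantime.

using System.Collections.Generic;

// Minimal sketch, not the PR's implementation: a most-recent-first cache bounded
// at two entries.
public class CompletionListCache<T>
{
    private const int MaxCacheSize = 2;
    private readonly LinkedList<T> _lists = new();

    public void Add(T list)
    {
        _lists.AddFirst(list);       // newest list goes to the front
        if (_lists.Count > MaxCacheSize)
            _lists.RemoveLast();     // evict the oldest entry
    }

    public IEnumerable<T> Lists => _lists;
}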


if (lastCompletionItem.GetProviderName() is not (CompletionItemExtensions.OverrideCompletionProvider or
CompletionItemExtensions.PartialMethodCompletionProvider)
and var name)
Member

oh I didn't know you could do something like this 🧐

Contributor Author

Composition!

@333fred
Contributor Author

333fred commented May 3, 2021

@filipw any more comments?

Member

@filipw filipw left a comment

for me this is good to go - thanks!

@333fred
Contributor Author

333fred commented May 6, 2021

@filipw what's the next step?

@filipw filipw enabled auto-merge May 6, 2021 18:31
@filipw
Member

filipw commented May 6, 2021

it failed the last test run; I've enabled auto-merge now

@filipw filipw merged commit aa1018c into OmniSharp:master May 6, 2021
@333fred 333fred deleted the async-completion branch May 6, 2021 18:43
@ffMathy

ffMathy commented May 7, 2021

@333fred congrats on getting this out, and thank you for making it happen! 😍

When can this be enabled, and how?

@333fred
Contributor Author

333fred commented May 7, 2021

@333fred congrats on getting this out, and thank you for making it happen! 😍

When can this be enabled, and how?

When the vscode side is merged, and using the setting.
