Releases: sourcegraph/cody
Cody for VS Code 1.16.3
✨ See the What’s new in v1.16 blog post for what’s new in this release since v1.15 ✨
v1.16.3 Changes
- Tutorial: Fixed telemetry when activating the tutorial on first authentication by @umpox in #4068
- Tutorial: Improved the reliability and discoverability of the Edit command by @umpox in #4068
Full Changelog: vscode-v1.16.2...vscode-v1.16.3
Cody for VS Code 1.16.2
✨ See the What’s new in v1.16 blog post for what’s new in this release since v1.15 ✨
v1.16.2 Changes
Full Changelog: vscode-v1.16.1...vscode-v1.16.2
Cody for VS Code 1.16.1
✨ See the What’s new in v1.16 blog post for what’s new in this release since v1.15 ✨
v1.16.1 Changes
- Fixed a bug where old Sourcegraph instances' error messages caused Cody to ignore all context files by @dominiccooney in #4024
- Fixed a visually distracting drop shadow on some text labels in the model selection dropdown menu by @sqs in #4026
Full Changelog: vscode-v1.16.0...vscode-v1.16.1
Cody for VS Code 1.16.0
✨ See the What’s new in v1.16 blog post for what’s new in this release since v1.15 ✨
v1.16.0 Changes
- Chat: The context window for the `Claude 3 Sonnet` and `Claude 3 Opus` models is now increased by default for all non-Enterprise users, without requiring a feature flag by @abeatrix in #3953
- Custom Commands: Added the ability to create new custom Edit commands via the Custom Command Menu by @abeatrix in #3862
- Custom Commands: Added 'currentFile' option to include the full file content in the Custom Commands menu by @philipp-spiess in #3960
- Chat: Pressing Alt+Enter or Opt+Enter will submit a chat message without enhanced context (only @-mentions) by @sqs in #3996
- Chat: Fixed an issue where Cody's responses were not visible in small windows by @keegancsmith in #3859
- Edit: Fixed an issue where an Edit task would not correctly respin when an irresolvable conflict is encountered by @umpox in #3872
- Chat: Fixed an issue where older chats were displaying as 'N months ago' instead of the number in the Chat History sidebar by @abeatrix in #3864
- Custom Commands: Fixed an issue where the "selection" option was not being toggled correctly based on the user's selection in the Custom Command menu by @philipp-spiess in #3960
- Chat: Fixed an issue where the chat title showed up as "New Chat" when the question started with a new line by @abeatrix in #3977
- Sidebar (Settings & Support): For Pro & Enterprise, moved 'Account' up to the top. For Pro only, removed 'Usage' as it can be accessed via 'Account' → 'Manage Account' by @toolmantim in #3868
- Debug: Removed the `cody.debug.enabled` setting. Baseline debugging is now enabled by default by @philipp-spiess in #3873
- Chat: The experimental Ollama Chat feature, which allows using local Ollama models for chat and commands, is now enabled by default by @abeatrix in #3914
- Removed Claude 2, Claude 2.1 and Claude Instant from Cody Free and Cody Pro. All users are now upgraded to use Claude 3 by default by @philipp-spiess in #3971
Full Changelog: vscode-v1.14.0...vscode-v1.16.0
Cody for VS Code 1.14.0
✨ See the What’s new in v1.14 blog post for what’s new in this release since v1.13 ✨
v1.14.0 Changes
- Chat: Add highlighted code to Cody Chat as @-mention context by right-clicking on the code and selecting `Cody Chat: Add context` by @abeatrix in #3713
- Autocomplete: Add the proper infilling prompt for CodeGemma when using Ollama by @philipp-spiess in #3754
- Chat: The new `Mixtral 8x22B` chat model is available for Cody Pro users by @abeatrix in #3768
- Chat: Add a "Pop out" button to the chat title bar that allows you to move Cody chat into a floating window by @umpox in #3773
- Sidebar: A new button to copy the current Cody extension version to the clipboard shows up next to the Release Notes item in the SETTINGS & SUPPORT sidebar on hover. This is useful for reporting issues or getting information about the installed version by @abeatrix in #3802
- Generate Unit Tests: Added a new code action, "Ask Cody to Test", which currently shows against functions in JS, TS, Go, and Python by @umpox in #3763
- Chat: @-mentions that exceed the context window will be displayed as invalid to make it easier to identify them during input by @abeatrix in #3742
- Generate Unit Tests: Fixed an issue where Cody would generate tests for the wrong code in the file by @umpox in #3759
- Chat: Fixed an issue where changing the chat model did not update the token limit for the model by @abeatrix in #3762
- Troubleshoot: Don't show the sign-in page if the authentication error is due to network connectivity issues by @RXminuS in #3750
- Edit: Large file warnings for @-mentions are now updated dynamically as you add or remove them by @abeatrix in #3767
- Generate Unit Tests: Improved quality for creating file names by @umpox in #3763
- Custom Commands: Fixed an issue where newly added custom commands were not working when clicked in the sidebar tree view by @abeatrix in #3804
- Chat: Fixed an issue where whitespace in messages submitted by users was omitted by @abeatrix in #3817
- Chat: Improved token counting mechanism that allows more context to be correctly included or excluded by @abeatrix in #3742
- Chat: Fixed an issue where context files were opened with an incorrect link for Enterprise users due to double encoding by @abeatrix in #3818
- Chat: Line numbers for @-mentions are now included and counted toward the "x lines from y files" section in the UI by @abeatrix in #3842
- Command: The ghost text hint for `Document Code` ("Alt+D to Document") now only shows on documentable symbols without an existing docstring by @abeatrix in #3622
- Chat: Updates to the latest GPT-4 Turbo model by @philipp-spiess in #3790
- Chat: Slightly speeds up enhanced context fetching on Cody Free and Cody Pro when both embeddings and search are used by @philipp-spiess in #3798
- Support Sidebar: Consolidated all support links into our new Support page, which includes a new Community Forum for user discussion, by @abeatrix in #3803
- Support Sidebar: Update the icon for Discord to use the official Discord logo by @abeatrix in #3803
- Commands/Chat: Increased the maximum output limit of LLM responses by @umpox in #3797
- Commands: Updated the naming of various code actions to be more descriptive by @umpox in #3831
- Chat: Adds chat model to more telemetry events by @philipp-spiess in #3829
- Telemetry: Adds a new telemetry event when users sign in for the first time by @philipp-spiess in #3836
- Chat: Increased the context window size when using the `Claude 3 Sonnet` and `Claude 3 Opus` models by @abeatrix in #3742
Full Changelog: vscode-v1.12.0...vscode-v1.14.0
Cody Agent 0.0.5b
Fixed macOS builds crashing on startup by signing executables.
Cody Agent 0.0.5
agent-v0.0.5 fix path names for agent-release action (#3704)
Cody Agent 0.0.4
agent-v0.0.4 bump agent version to test release CI (#3703)
Cody for VS Code 1.12.0
✨ See the What’s new in v1.12 blog post for what’s new in this release since v1.11 ✨
v1.12.0 Changes
- Edit/Chat: Cody now expands the selection to the nearest enclosing function, if available, before attempting to expand to the nearest enclosing block by @umpox in #3507
- Edit: New `cody.edit.preInstruction` configuration option for adding custom instructions to the end of all your requests by @abeatrix in #3542
- Edit: Add support for the new `cody.edit.preInstruction` setting by @abeatrix in #3542
- Edit: Added telemetry to measure the persistence of edits in the document by @umpox in #3550
- Edit: "Ask Cody to Fix" now uses Claude 3 Sonnet by @umpox in #3555
- Chat: Added buttons in the chat input box for enabling/disabling Enhanced Context by @abeatrix in #3547
- Edit: Display warnings for large @-mentioned files during selection by @umpox in #3494
- Edit: Automatically show open tabs as available options when triggering an @-mention by @umpox in #3494
- Debug: Added a new `Cody Debug: Report Issue` command to easily file a pre-filled GitHub issue form for reporting bugs and issues directly inside VS Code. The `Cody Debug: Report Issue` command is accessible from the command palette and the `...` menu in the Cody Support sidebar by @abeatrix in #3624
- Chat: Fixed an issue where large files could not be added via @-mention. You can now @-mention line ranges within large files by @abeatrix in #3585
- Edit: Improved response reliability; Edit commands should no longer occasionally produce Markdown outputs by @umpox in #3192
- Chat: Handle empty chat message input and prevent submission of empty messages by @abeatrix in #3554
- Chat: Warnings are now displayed correctly for large files in the @-mention file selection list by @abeatrix in #3526
- Custom Commands: Errors when running context command scripts now show the error output in the notification message by @toolmantim in #3565
- Edit: The `document` command now defaults to Claude 3 Haiku by @umpox in #3572
- Chat: The Enhanced Context Settings modal is opened by default for the first chat session by @abeatrix in #3547
- Added information about which Cody tier is being used to analytics events by @philipp-spiess in #3508
- Auth: Enable the new onboarding flow that does not require the redirect back to VS Code for everyone by @philipp-spiess in #3574
- Chat: Claude 3 Sonnet is now the default model for every Cody Free or Pro user by @philipp-spiess in #3575
- Edit: Removed a previous Edit shortcut (`Shift+Cmd/Ctrl+v`); use `Opt/Alt+K` to trigger Edits by @umpox in #3591
- Commands: The `Editor Title Icon` configuration option has been removed from the Cody Settings menu. Users can configure the title bar icon by right-clicking on the title bar by @abeatrix in #3677
- Hover Commands: Cody commands are now integrated with the native hover provider, allowing you to seamlessly access essential commands on mouse hover by @abeatrix in #3585
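The `cody.edit.preInstruction` option introduced in this release is a plain string set in VS Code's `settings.json`. A minimal sketch, assuming a typical user-level settings file (the instruction text itself is an invented example, not a recommended value):

```jsonc
// .vscode/settings.json or the user settings.json
{
  // Appended to the end of every Edit request (example text is illustrative).
  "cody.edit.preInstruction": "Prefer arrow functions and keep lines under 100 characters."
}
```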
Full Changelog: vscode-v1.10.2...vscode-v1.12.0
Cody for VS Code 1.10.2
✨ See the What’s new in v1.10 blog post for what’s new in this release since v1.9 ✨
v1.10.2 Changes
- Cody Enterprise users now have access to an `experimental-openaicompatible` provider, which allows bringing your own LLM via any OpenAI-compatible API. For now, this is only supported with StarChat and specific configurations, but we continue to generalize this work to support more models and OpenAI-compatible endpoints by @slimsag in #3218
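Since this is an Enterprise feature, the provider is configured on the Sourcegraph instance rather than in VS Code. A hedged sketch of what a site-configuration completions block might look like; aside from the `experimental-openaicompatible` provider name from this release, every field value below (endpoint, token, model name) is an illustrative assumption and the exact schema depends on your Sourcegraph version:

```jsonc
// Sourcegraph site configuration (sketch; field values are assumptions)
{
  "completions": {
    "provider": "experimental-openaicompatible",
    "endpoint": "https://llm.example.com/v1",
    "accessToken": "REDACTED",
    "chatModel": "starchat-beta"
  }
}
```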
Full Changelog: vscode-v1.10.1...vscode-v1.10.2