Autocomplete
Learn how Cody helps you get contextually-aware autocompletion for your codebase.
Cody predicts what you're trying to write before you even type it. It offers single-line and multi-line suggestions informed by the surrounding code context, which keeps the completions accurate. Cody autocomplete supports a wide range of programming languages because it uses LLMs trained on broad data.
Code autocompletions are optimized for both server-side and client-side performance, so they integrate seamlessly into your coding workflow. The default autocomplete model for Cody Free, Pro, and Enterprise users is DeepSeek V2, which boosts both the responsiveness and accuracy of autocomplete.
Cody's autocomplete capabilities
The autocomplete model is designed to enhance speed, accuracy, and the overall user experience. It offers:
- Increased speed and reduced latency: The P75 latency is reduced by 350 ms, making the autocomplete function faster
- Improved accuracy for multi-line completions: Completions across multiple lines are more relevant and accurately aligned with the surrounding code context
- Higher completion acceptance rates: The average completion acceptance rate (CAR) is improved by more than 4%, providing a more intuitive user interaction
How does autocomplete work?
First, you'll need the following setup:
- A Free or Pro account via Sourcegraph.com or a Sourcegraph Enterprise instance
- A supported editor extension (VS Code, JetBrains)
Autocomplete is enabled by default in all supported IDE extensions (VS Code and JetBrains). Each extension provides a checkbox in its settings that shows whether autocomplete is enabled. Some IDEs also expose additional autocomplete settings; for example, JetBrains IDEs let you customize the colors and styles of the autocomplete suggestions.
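For example, in VS Code you can confirm or change this directly in your `settings.json`. This is a minimal sketch assuming the current Cody extension's setting names, which may differ between extension versions:

```jsonc
{
  // Turn Cody autocomplete on or off globally (Cody VS Code extension setting)
  "cody.autocomplete.enabled": true,

  // Optionally enable or disable autocomplete per language; "*" applies to all
  // languages (setting name assumed from the Cody extension; may vary by version)
  "cody.autocomplete.languages": {
    "*": true
  }
}
```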
When you start typing, Cody will automatically provide suggestions and context-aware completions based on your coding patterns and the code context. These autocomplete suggestions appear as grayed-out text. Press Enter or Tab to accept the suggestion.
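If you prefer a different key in VS Code, you can rebind the editor's built-in inline-suggestion command in `keybindings.json`. This is a sketch of a generic VS Code keybinding rather than a Cody-specific setting, and the `when` context keys are an assumption about current VS Code behavior:

```jsonc
// keybindings.json
[
  {
    // editor.action.inlineSuggest.commit is VS Code's built-in command for
    // accepting the currently shown inline suggestion (from Cody or any provider)
    "key": "ctrl+enter",
    "command": "editor.action.inlineSuggest.commit",
    "when": "inlineSuggestionVisible && textInputFocus"
  }
]
```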
Configure autocomplete on an Enterprise Sourcegraph instance
A fully configured Sourcegraph instance selects a default LLM for code autocomplete. You can configure a custom model for Cody autocomplete to fit your specific requirements. To do so:
- Go to the Site admin of your Sourcegraph instance
- Navigate to Configuration > Site configuration
- Edit the `completionModel` option inside the `completions` object (see the sketch after this list)
- Click the Save button to save the changes
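As a reference point, the relevant part of the site configuration might look like the sketch below. The `provider` value and the model identifiers are placeholders; check the supported models documentation for the exact names available on your Sourcegraph version and instance:

```jsonc
{
  // Site admin > Configuration > Site configuration
  "completions": {
    // LLM provider backing Cody (placeholder value; depends on your deployment)
    "provider": "sourcegraph",
    // Model used for code autocomplete (placeholder identifier)
    "completionModel": "fireworks/deepseek-coder-v2-lite-base",
    // Model used for chat, shown here for context only (placeholder identifier)
    "chatModel": "anthropic/claude-3-5-sonnet"
  }
}
```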
Cody supports a set of models for autocomplete. Learn more about these here. It's also recommended to read the Enabling Cody on Sourcegraph Enterprise docs.