Closes #6701 (one of the top-ranking issues as of writing)
Adds the ability to specify an HTTP/HTTPS proxy through which Copilot code
completion API requests are routed. This should fix Copilot functionality
in restricted network environments (where such a proxy is required), and it
also opens up the ability to point Copilot code completion requests at
your own local LLM, using e.g.:
- https://github.com/jjleng/copilot-proxy
- https://github.com/bernardo-bruning/ollama-copilot/tree/master
External MITM-proxy tools permitting, this can serve as a stop-gap to
allow local LLM code completion in Zed until a proper OpenAI-compatible
local code completions provider is implemented. With this in mind, in
this PR I've added separate `settings.json` options to configure a
proxy server _specific to the code completions provider_, instead of
using the global `proxy` setting, to cover cases like this where we
want to proxy _only_ the Copilot requests and not all other outgoing
traffic from the application.
Currently, two new settings are added:
- `inline_completions.copilot.proxy`: Proxy server URL (HTTP and HTTPS
schemes supported)
- `inline_completions.copilot.proxy_no_verify`: Whether to disable
certificate verification for connections made through the proxy
Example:
```js
"features": {
"inline_completion_provider": "copilot"
},
"show_completions_on_input": true,
// New:
"inline_completions": {
"copilot": {
"proxy": "http://example.com:15432",
"proxy_no_verify": true
}
}
```
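For context, here is a minimal sketch of how these two settings could be
applied when building an HTTP client. It assumes the `reqwest` crate and is
illustrative only (it is not Zed's actual HTTP stack); the
`CopilotProxySettings` struct and `build_copilot_client` function are
hypothetical names used for the example:

```rust
use reqwest::{Client, Proxy};

/// Hypothetical mirror of the `inline_completions.copilot` settings block.
struct CopilotProxySettings {
    proxy: Option<String>,
    proxy_no_verify: Option<bool>,
}

/// Build an HTTP client for Copilot requests, honoring the proxy settings.
fn build_copilot_client(settings: &CopilotProxySettings) -> reqwest::Result<Client> {
    let mut builder = Client::builder();

    // Route Copilot requests through the configured HTTP/HTTPS proxy, if any.
    if let Some(url) = &settings.proxy {
        builder = builder.proxy(Proxy::all(url)?);
    }

    // Optionally skip TLS certificate verification, e.g. for MITM proxies
    // that present self-signed certificates.
    if settings.proxy_no_verify.unwrap_or(false) {
        builder = builder.danger_accept_invalid_certs(true);
    }

    builder.build()
}

fn main() -> reqwest::Result<()> {
    // Example usage mirroring the settings.json snippet above.
    let settings = CopilotProxySettings {
        proxy: Some("http://example.com:15432".into()),
        proxy_no_verify: Some(true),
    };
    let _client = build_copilot_client(&settings)?;
    Ok(())
}
```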
Release Notes:
- Added the ability to specify an HTTP/HTTPS proxy for Copilot.
---------
Co-authored-by: Marshall Bowers <git@maxdeviant.com>