
daanturo/starhugger.el


MELPA: https://melpa.org/packages/starhugger-badge.svg | MELPA Stable: https://stable.melpa.org/packages/starhugger-badge.svg

An unofficial Hugging Face / AI-powered text & code completion client for Emacs.

Currently supported model(s): bigcode/starcoder, codellama/CodeLlama-13b-hf.

Demo video: demo.webm

Mirror (may not be up to date): https://github.com/daanturo/starhugger.el

Installation

Starhugger is installable from MELPA with the command package-install.
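For example, assuming MELPA is already in your package-archives:

;; Interactively: M-x package-install RET starhugger RET
;; Or from Lisp:
(package-install 'starhugger)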

Or, if you want to install from source, add one of the following to your configuration:

;; package-vc.el (built-in from Emacs 29 and above)
(unless (package-installed-p 'starhugger)
  (package-vc-install '(starhugger :url "https://gitlab.com/daanturo/starhugger.el")))

;; straight.el
(straight-use-package '(starhugger :files (:defaults "*.py")))

;; Doom
(package! starhugger :recipe (:files (:defaults "*.py")))

;; elpaca.el
(elpaca (starhugger :repo "https://gitlab.com/daanturo/starhugger.el" :files (:defaults "*.py")))

Or use any other package manager of your choice.

This package also requires these dependencies: compat, dash, spinner, s.el.

Usage

Performance may not be great, but requests are asynchronous, so Emacs won’t be blocked.

Primary commands

  • starhugger-trigger-suggestion: display the suggestion in a previewing overlay.

  • starhugger-show-next-suggestion / starhugger-show-prev-suggestion: cycle through the fetched suggestions.

  • starhugger-accept-suggestion: insert the current suggestion.

  • starhugger-dismiss-suggestion (bound to C-g by default while the suggestion is shown): cancel.

There is also starhugger-auto-mode (a non-global minor mode), but its usage isn’t prioritized because of the API’s rate limit.
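If you do want it, a minimal sketch to enable it in programming buffers (the hook choice here is just an illustration):

(add-hook 'prog-mode-hook #'starhugger-auto-mode)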

Example configuration

(setq starhugger-api-token "hf_ your token here")

;; Use https://huggingface.co/codellama/CodeLlama-13b-hf, best to be set before loading this package
(setq starhugger-model-id "codellama/CodeLlama-13b-hf")

(global-set-key (kbd "M-\\") #'starhugger-trigger-suggestion)

(with-eval-after-load 'starhugger
  ;; `starhugger-inline-menu-item' makes a conditional binding that is only active at the inline suggestion start
  (define-key starhugger-inlining-mode-map (kbd "TAB") (starhugger-inline-menu-item #'starhugger-accept-suggestion))
  (define-key starhugger-inlining-mode-map (kbd "M-[") (starhugger-inline-menu-item #'starhugger-show-prev-suggestion))
  (define-key starhugger-inlining-mode-map (kbd "M-]") (starhugger-inline-menu-item #'starhugger-show-next-suggestion))
  (define-key starhugger-inlining-mode-map (kbd "M-f") (starhugger-inline-menu-item #'starhugger-accept-suggestion-by-word)))
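Because starhugger-inline-menu-item creates conditional bindings, TAB, M-[, M-] and M-f above keep their usual functions whenever no inline suggestion is being shown.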

Additional settings

;; For evil users, dismiss after pressing ESC twice
(defvar my-evil-force-normal-state-hook '())
(defun my-evil-run-force-normal-state-hook-after-a (&rest _)
  (run-hooks 'my-evil-force-normal-state-hook))

(advice-add #'evil-force-normal-state
 :after #'my-evil-run-force-normal-state-hook-after-a)

;; Workaround conflict with `blamer.el'
;; (https://github.com/Artawower/blamer.el): when at the end of line, blamer's
;; overlay's `after-string' property will display before starhugger's
;; `display' property, which will result in starhugger's part of suggestion on
;; current line (1) being pushed out of the display

;; <before point>|                            commit info<right edge of the window><suggestion after point, before newline>
;; <the rest of suggestion>

;; workaround: disable `blamer-mode' while `starhugger-inlining-mode'

(defvar-local my-starhugger-inlining-mode--blamer-mode-state nil)
;; Declare `blamer-mode' so referencing it below doesn't error when
;; blamer.el isn't loaded
(defvar-local blamer-mode nil)

(defun my-starhugger-inlining-mode-h ()
  (if starhugger-inlining-mode
      (progn
        (add-hook 'my-evil-force-normal-state-hook
                  (lambda () (starhugger-dismiss-suggestion t))
                  nil t)
        (setq my-starhugger-inlining-mode--blamer-mode-state blamer-mode)
        (when my-starhugger-inlining-mode--blamer-mode-state
          (blamer-mode 0)))
    (when (and my-starhugger-inlining-mode--blamer-mode-state
               (not blamer-mode))
      (blamer-mode 1))))

(add-hook 'starhugger-inlining-mode-hook #'my-starhugger-inlining-mode-h)

Notes

Remember to set starhugger-api-token (create one at https://huggingface.co/settings/tokens); otherwise you may quickly hit the rate limit.
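To avoid hard-coding the token in your init file, one option is reading it from a private file; a minimal sketch (the path ~/.hf-token is an assumption):

(require 'subr-x) ; for `string-trim'
;; Read the token from a hypothetical private file containing only the token
(setq starhugger-api-token
      (string-trim
       (with-temp-buffer
         (insert-file-contents "~/.hf-token")
         (buffer-string))))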

Known quirks

From the model (https://huggingface.co/bigcode/starcoder):

  • Doesn’t support num_return_sequences (see detailed_parameters) for returning multiple responses; the workaround is making multiple requests.
  • Doesn’t support use_cache; the current workaround is forcing a different response by randomizing the temperature (a sketch follows below).
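A rough sketch of what the temperature workaround amounts to, with a hypothetical payload-building helper (parameter names follow the detailed_parameters docs; the value range is only illustrative):

;; Requires Emacs 27+ built with native JSON support for `json-serialize'.
;; Hypothetical helper: randomizing `temperature' forces the API to
;; generate a fresh response instead of serving a cached one.
(defun my-starhugger-request-payload (prompt)
  (json-serialize
   `((inputs . ,prompt)
     ;; a random temperature in [0.50, 0.99]
     (parameters . ((temperature . ,(+ 0.5 (/ (random 50) 100.0)))))
     (options . ((use_cache . :false))))))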

Emacs overlays are used under the hood to display inline suggestions; this approach has some shortcomings:

  • Not possible to display PRE|<ov>SUF with a single kind of overlay property: two different ones are needed depending on whether SUF is non-empty (in the middle of the buffer) or empty (at buffer end); see the sketch after this list.
  • At the end of the buffer (overlay start = overlay end), the overlay’s keymap property doesn’t work.
  • Conflict with https://github.com/Artawower/blamer.el, mentioned in “Example configuration”
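A minimal sketch of why two overlay property types are needed (the helper and the shadow face are illustrative, not Starhugger's actual implementation):

;; Hypothetical illustration: preview TEXT as an inline suggestion at point.
(defun my-preview-suggestion (text)
  (let ((text (propertize text 'face 'shadow)))
    (if (eobp)
        ;; At buffer end there is no character for the overlay to cover,
        ;; so a zero-length overlay with `after-string' is needed (and its
        ;; `keymap' property won't work there).
        (let ((ov (make-overlay (point) (point))))
          (overlay-put ov 'after-string text)
          ov)
      ;; Mid-buffer: cover the next character and use `display', so TEXT
      ;; shows up at point followed by the re-inserted covered character.
      (let ((ov (make-overlay (point) (1+ (point)))))
        (overlay-put ov 'display
                     (concat text (buffer-substring (point) (1+ (point)))))
        ov))))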

TODOs

  • ✓ Support setting parameters at https://huggingface.co/docs/api-inference/detailed_parameters#text-generation-task.
  • ✓ VSCode-like previewing overlays: take after https://github.com/zerolfx/copilot.el.
  • ✓ Let starhugger-trigger-suggestion fetch about 3 suggestions to quickly cycle.
  • ✓ Fill-in-the-middle support (https://github.com/huggingface/huggingface-vscode): <fim_prefix>〈code before〉<fim_suffix>〈code after〉<fim_middle>; see the sketch after this list.
  • ? More robust and reliable method to show a different suggestion.
  • ½ Batch-previewing multiple suggestions, maybe with syntax highlighting.
  • ½ Support for auto-completing while typing: investigate the asynchronous capabilities of Emacs’s built-in completion-at-point-functions, or another framework? Current implementation: starhugger-auto-mode using overlays.
  • ½ Find a way to take other files into account (Copilot Internals | thakkarparth007.github.io): the current experimental implementation isn’t as sophisticated, just a dumb grep/regex pass over the codebase to find “relevant” symbols (relevance is measured by naive line similarity to the current file, with no semantic analysis or AI involved); also, since the syntax to expose inter-file context isn’t known yet, comments are used as a made-up convention.
  • ? Separate frontend (inline suggestion interface) and backend (the text completion provider), and allow multiple backends, not just Hugging Face’s inference API.
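For reference, a sketch of how a fill-in-the-middle prompt can be assembled from the buffer (the helper name is hypothetical; the tokens are the FIM format quoted in the list above):

;; Hypothetical illustration: build a FIM prompt around point
(defun my-fim-prompt ()
  (concat "<fim_prefix>" (buffer-substring-no-properties (point-min) (point))
          "<fim_suffix>" (buffer-substring-no-properties (point) (point-max))
          "<fim_middle>"))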