FAQ

How do I set up my own LLM?

You must manually edit the config file located at ~/.waveterm/config/settings.json.

| Key Name | Type | Function |
|---|---|---|
| ai:baseurl | string | Set the AI base URL (must be OpenAI-compatible) |
| ai:apitoken | string | Your AI API token |
| ai:name | string | String to display in the Wave AI block header |
| ai:model | string | Model name to pass to the API |
| ai:maxtokens | int | Maximum tokens to pass to the API |
| ai:timeoutms | int | Timeout (in milliseconds) for AI calls |

Here's an example of pointing it to a local Ollama instance. Note that to get the text in the header of the AI block to update, you'll need to set the "ai:name" key. For Ollama, you'll also need to provide a value for "ai:apitoken" (it is ignored, but it must be set).

Here are Ollama's OpenAI compatibility docs: https://github.com/ollama/ollama/blob/main/docs/openai.md

    {
        "ai:baseurl": "http://localhost:11434/v1",
        "ai:name": "llama3.2",
        "ai:model": "llama3.2",
        "ai:apitoken": "ollama"
    }
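Since the config file must be valid JSON, hand-editing can introduce syntax errors. A small sketch of a helper script (not part of Wave; the function name and behavior are our own) that merges new keys into an existing settings.json without clobbering the rest:

```python
import json
import os

def merge_settings(updates, path=os.path.expanduser("~/.waveterm/config/settings.json")):
    """Read settings.json (if it exists), merge in the given keys, write it back."""
    settings = {}
    if os.path.exists(path):
        with open(path) as f:
            settings = json.load(f)  # raises ValueError if the file is malformed
    settings.update(updates)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        json.dump(settings, f, indent=2)
    return settings

if __name__ == "__main__":
    # Demo against a temporary file rather than the real config:
    import tempfile
    demo_path = os.path.join(tempfile.mkdtemp(), "settings.json")
    merge_settings({
        "ai:baseurl": "http://localhost:11434/v1",
        "ai:name": "llama3.2",
        "ai:model": "llama3.2",
        "ai:apitoken": "ollama",
    }, path=demo_path)
```

Because the file is re-serialized with json.dump, the result is always syntactically valid JSON, whereas a stray trailing comma from hand-editing would break Wave's config parsing.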

How can I connect to Azure AI?

You must manually edit the config file located at ~/.waveterm/config/settings.json.

You'll need to set ai:baseurl to your Azure AI base URL (do not include query parameters or api-version). You'll also need to set ai:apitype to "azure". You can then set ai:model and ai:apitoken appropriately for your setup.
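Putting those keys together, a settings.json fragment might look like the following. The resource name, deployment name, model, and key below are placeholders, not real values; substitute your own:

```json
{
    "ai:apitype": "azure",
    "ai:baseurl": "https://your-resource.openai.azure.com/openai/deployments/your-deployment",
    "ai:model": "your-deployment-model",
    "ai:apitoken": "your-azure-api-key"
}
```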

How can I see the block numbers?

The block numbers will appear when you hold down Ctrl-Shift (and disappear once you release the key combo).

How do I make a remote connection?

There is a connect button in the block header. Click it and type the [user]@[host] that you wish to connect to.

On Windows, how can I use Git Bash as my default shell?

To make Git Bash your default shell, set the configuration variable term:localshellpath to the location of the Git Bash "bash.exe" binary. By default, it is located at "C:\Program Files\Git\bin\bash.exe". Remember that in JSON, backslashes must be escaped, so add this to your settings.json file:

    "term:localshellpath": "C:\\Program Files\\Git\\bin\\bash.exe"