llm: do not evaluate symlink for exe path lookup (ollama#9088)
In some cases, the directories in the executable path read by
filepath.EvalSymlinks are not accessible, resulting in permission
errors that prevent models from running. It also does not work well
with long paths on Windows, likewise resulting in errors. This change
removes the filepath.EvalSymlinks call on the path returned by
os.Executable() altogether.
jmorganca authored Feb 14, 2025
1 parent 6600bd7 commit f05774b
Showing 2 changed files with 0 additions and 10 deletions.
5 changes: 0 additions & 5 deletions discover/path.go
@@ -19,11 +19,6 @@ var LibOllamaPath string = func() string {
 		return ""
 	}
 
-	exe, err = filepath.EvalSymlinks(exe)
-	if err != nil {
-		return ""
-	}
-
 	var libPath string
 	switch runtime.GOOS {
 	case "windows":
5 changes: 0 additions & 5 deletions llm/server.go
@@ -320,11 +320,6 @@ func NewLlamaServer(gpus discover.GpuInfoList, model string, f *ggml.GGML, adapt
 		return nil, fmt.Errorf("unable to lookup executable path: %w", err)
 	}
 
-	exe, err = filepath.EvalSymlinks(exe)
-	if err != nil {
-		return nil, fmt.Errorf("unable to evaluate symlinks for executable path: %w", err)
-	}
-
 	// TODO - once fully switched to the Go runner, load the model here for tokenize/detokenize cgo access
 	s := &llmServer{
 		port: port,
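For illustration, a minimal sketch of the lookup pattern after this change (the exeDir helper is hypothetical, not the upstream code): the path from os.Executable() is used as-is, with no filepath.EvalSymlinks call that could fail on inaccessible parent directories or long Windows paths.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// exeDir returns the directory containing the current executable.
// The path from os.Executable() is used directly; filepath.EvalSymlinks
// is intentionally not called, mirroring the pattern in this commit.
func exeDir() (string, error) {
	exe, err := os.Executable()
	if err != nil {
		return "", fmt.Errorf("unable to lookup executable path: %w", err)
	}
	return filepath.Dir(exe), nil
}

func main() {
	dir, err := exeDir()
	if err != nil {
		fmt.Println("error:", err)
		return
	}
	fmt.Println("executable directory:", dir)
}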
