GNU bug report logs - #71219
[PATCH] gnu: llama-cpp: Update configure flags for shared library build.


Package: guix-patches;

Reported by: Andy Tai <atai <at> atai.org>

Date: Mon, 27 May 2024 05:20:02 UTC

Severity: normal

Tags: patch

Done: Ludovic Courtès <ludo <at> gnu.org>

Bug is archived. No further changes may be made.




Report forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Mon, 27 May 2024 05:20:02 GMT) Full text and rfc822 format available.

Acknowledgement sent to Andy Tai <atai <at> atai.org>:
New bug report received and forwarded. Copy sent to guix-patches <at> gnu.org. (Mon, 27 May 2024 05:20:02 GMT) Full text and rfc822 format available.

Message #5 received at submit <at> debbugs.gnu.org (full text, mbox):

From: Andy Tai <atai <at> atai.org>
To: guix-patches <at> gnu.org
Cc: Andy Tai <atai <at> atai.org>
Subject: [PATCH] gnu: llama-cpp: Update configure flags for shared library
 build.
Date: Sun, 26 May 2024 22:16:48 -0700
* gnu/packages/machine-learning.scm (llama-cpp):
  [arguments](configure-flags): Add CMake configure flag
  to force position-independent code generation from the C
  compiler for the shared library build.

Change-Id: I7c4bc219a22aa9a949e811b340c7cf745b176d14
---
 gnu/packages/machine-learning.scm | 3 ++-
 1 file changed, 2 insertions(+), 1 deletion(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a385ddc18c..398b42f203 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -541,7 +541,8 @@ (define-public llama-cpp
       (build-system cmake-build-system)
       (arguments
        (list
-        #:configure-flags #~'("-DLLAMA_BLAS=ON"
+        #:configure-flags #~'("-DCMAKE_POSITION_INDEPENDENT_CODE=TRUE"
+                              "-DLLAMA_BLAS=ON"
                               "-DLLAMA_BLAS_VENDOR=OpenBLAS"
 
                               "-DLLAMA_NATIVE=OFF" ;no '-march=native'

base-commit: 0f3a25a25e212bfa8ab9db37d267fb260a087e5d
-- 
2.34.1
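
A note on the flag chosen here: -DCMAKE_POSITION_INDEPENDENT_CODE=TRUE only makes
CMake compile objects with -fPIC; it does not by itself turn library targets into
shared libraries, which is what -DBUILD_SHARED_LIBS=ON in the later revisions of
this patch does. A minimal sketch of the two variants in the package's gexp style,
trimmed to one flag each for illustration:

    (arguments
     (list
      ;; v1: request position-independent code (-fPIC) everywhere; the
      ;; library targets keep the type declared in the project's CMakeLists.
      #:configure-flags #~'("-DCMAKE_POSITION_INDEPENDENT_CODE=TRUE")))

    (arguments
     (list
      ;; v2 onwards: libraries declared without an explicit STATIC/SHARED
      ;; keyword are built as shared libraries.
      #:configure-flags #~'("-DBUILD_SHARED_LIBS=ON")))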





Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Tue, 28 May 2024 01:59:02 GMT) Full text and rfc822 format available.

Message #8 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Andy Tai <atai <at> atai.org>
To: 71219 <at> debbugs.gnu.org
Cc: Andy Tai <atai <at> atai.org>
Subject: [PATCH v2] gnu: llama-cpp: Update configure flags for shared library
 build.
Date: Mon, 27 May 2024 18:57:38 -0700
* gnu/packages/machine-learning.scm (llama-cpp):
  [arguments](configure-flags): Add CMake configure flag
  for shared library build.

Change-Id: I7c4bc219a22aa9a949e811b340c7cf745b176d14
---
 gnu/packages/machine-learning.scm | 7 +++++--
 1 file changed, 5 insertions(+), 2 deletions(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a385ddc18c..8a332a8b6f 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -541,7 +541,8 @@ (define-public llama-cpp
       (build-system cmake-build-system)
       (arguments
        (list
-        #:configure-flags #~'("-DLLAMA_BLAS=ON"
+        #:configure-flags #~'("-DBUILD_SHARED_LIBS=ON"
+                              "-DLLAMA_BLAS=ON"
                               "-DLLAMA_BLAS_VENDOR=OpenBLAS"
 
                               "-DLLAMA_NATIVE=OFF" ;no '-march=native'
@@ -591,7 +592,9 @@ (define-public llama-cpp
               (assoc-ref python:%standard-phases 'wrap))
             (add-after 'install 'install-main
               (lambda _
-                (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
+                (with-directory-excursion (string-append #$output "/bin")
+                    (symlink "main" "llama"))))
+            )))
       (inputs (list python))
       (native-inputs (list pkg-config))
       (propagated-inputs

base-commit: 0f3a25a25e212bfa8ab9db37d267fb260a087e5d
-- 
2.34.1
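
A note on the reworked 'install-main' phase: with-directory-excursion runs its
body with the given directory as the working directory and restores the previous
one afterwards, so the relative symlink makes <output>/bin/llama resolve to the
"main" binary installed next to it, instead of duplicating it as the old
copy-file call did. The same idiom in isolation, with comments (a sketch reusing
the names from the patch):

    (add-after 'install 'install-main
      (lambda _
        ;; Enter <output>/bin, then create a relative symlink so that
        ;; "llama" points at the "main" executable in the same directory.
        (with-directory-excursion (string-append #$output "/bin")
          (symlink "main" "llama"))))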





Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Mon, 03 Jun 2024 16:26:03 GMT) Full text and rfc822 format available.

Message #11 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Andy Tai <atai <at> atai.org>
To: 71219 <at> debbugs.gnu.org
Cc: Andy Tai <atai <at> atai.org>
Subject: [PATCH v3] gnu: llama-cpp: Update commit and update configure flags
 for shared library build.
Date: Mon,  3 Jun 2024 08:43:48 -0700
* gnu/packages/machine-learning.scm (llama-cpp): Update to commit a5735e with
  pkg-config support.
  [arguments](configure-flags): Add CMake configure flag
  for shared library build.
  (phases) 'install-python-scripts: Remove references to deleted scripts
  and add new ones from upstream.

Change-Id: I7c4bc219a22aa9a949e811b340c7cf745b176d14
---
 gnu/packages/machine-learning.scm | 15 ++++++++-------
 1 file changed, 8 insertions(+), 7 deletions(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a385ddc18c..52966bd8bc 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -524,7 +524,7 @@ (define-public guile-aiscm-next
   (deprecated-package "guile-aiscm-next" guile-aiscm))
 
 (define-public llama-cpp
-  (let ((commit "fed0108491a3a3cbec6c6480dc8667ffff9d7659")
+  (let ((commit "a5735e4426b19a3ebd0c653ad8ac01420458ee95")
         (revision "2"))
     (package
       (name "llama-cpp")
@@ -537,11 +537,12 @@ (define-public llama-cpp
                (commit commit)))
          (file-name (git-file-name name version))
          (sha256
-          (base32 "16rm9gy0chd6k07crm8rkl2j3hg7y7h0km7k6c8q7bmm2jrd64la"))))
+          (base32 "0nx55wchwf204ld6jygfn37cjrzc4lspwn5v0qk8i6p92499bv0h"))))
       (build-system cmake-build-system)
       (arguments
        (list
-        #:configure-flags #~'("-DLLAMA_BLAS=ON"
+        #:configure-flags #~'("-DBUILD_SHARED_LIBS=ON"
+                              "-DLLAMA_BLAS=ON"
                               "-DLLAMA_BLAS_VENDOR=OpenBLAS"
 
                               "-DLLAMA_NATIVE=OFF" ;no '-march=native'
@@ -584,14 +585,14 @@ (define-public llama-cpp
                   (mkdir-p bin)
                   (make-script "convert-hf-to-gguf")
                   (make-script "convert-llama-ggml-to-gguf")
-                  (make-script "convert-lora-to-ggml")
-                  (make-script "convert-persimmon-to-gguf")
-                  (make-script "convert"))))
+                  (make-script "convert-hf-to-gguf-update.py"))))
             (add-after 'install-python-scripts 'wrap-python-scripts
               (assoc-ref python:%standard-phases 'wrap))
             (add-after 'install 'install-main
               (lambda _
-                (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
+                (with-directory-excursion (string-append #$output "/bin")
+                    (symlink "main" "llama"))))
+            )))
       (inputs (list python))
       (native-inputs (list pkg-config))
       (propagated-inputs

base-commit: 879fc9b3f0c2e58c6232da03b94eba98c78e2d99
-- 
2.34.1
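
Since this revision moves the pinned commit, the sha256 of the checkout is
recomputed as well, which is why the base32 string changes together with the
commit. The origin fields involved, annotated; the upstream URL is assumed here,
as the diff hunk does not show it:

    (origin
      (method git-fetch)
      (uri (git-reference
            ;; Assumed upstream URL; not visible in the hunk above.
            (url "https://github.com/ggerganov/llama.cpp")
            ;; The commit bound in the surrounding `let'.
            (commit commit)))
      (file-name (git-file-name name version))
      (sha256
       ;; Recomputed for the new commit, e.g. with `guix hash' on a
       ;; clean checkout of that commit.
       (base32 "0nx55wchwf204ld6jygfn37cjrzc4lspwn5v0qk8i6p92499bv0h")))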





Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Thu, 06 Jun 2024 18:03:01 GMT) Full text and rfc822 format available.

Message #14 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Andy Tai <atai <at> atai.org>
To: 71219 <at> debbugs.gnu.org
Date: Thu, 6 Jun 2024 10:52:38 -0700
patch passes Guix QA
https://qa.guix.gnu.org/issue/71219




Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Thu, 06 Jun 2024 18:21:01 GMT) Full text and rfc822 format available.

Message #17 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Christopher Baines <mail <at> cbaines.net>
To: Andy Tai <atai <at> atai.org>
Cc: 71219 <at> debbugs.gnu.org
Subject: Re: [bug#71219]
Date: Thu, 06 Jun 2024 19:12:27 +0100
[Message part 1 (text/plain, inline)]
Andy Tai <atai <at> atai.org> writes:

> patch passes Guix QA
> https://qa.guix.gnu.org/issue/71219

It does, but the build is still failing.

The status is "Succeeding" overall because the package was failing to
build before, so the situation isn't worse.
[signature.asc (application/pgp-signature, inline)]

Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Tue, 11 Jun 2024 15:46:02 GMT) Full text and rfc822 format available.

Message #20 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Andy Tai <atai <at> atai.org>
To: 71219 <at> debbugs.gnu.org
Cc: Andy Tai <atai <at> atai.org>
Subject: [PATCH v4] gnu: llama-cpp: Update commit and update configure flags
 for shared library build.
Date: Tue, 11 Jun 2024 04:41:44 -0700
* gnu/packages/machine-learning.scm (llama-cpp): Update to commit a5735e with
  pkg-config support.
  [arguments](configure-flags): Add CMake configure flag
  for shared library build and adjust arguments so that OpenBLAS is
  found by CMake.
  (phases) 'install-python-scripts: Remove references to deleted scripts
  and add new ones from upstream.

Change-Id: I7c4bc219a22aa9a949e811b340c7cf745b176d14
---
 gnu/packages/machine-learning.scm | 18 +++++++++++-------
 1 file changed, 11 insertions(+), 7 deletions(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a385ddc18c..f433f8cd65 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -524,7 +524,7 @@ (define-public guile-aiscm-next
   (deprecated-package "guile-aiscm-next" guile-aiscm))
 
 (define-public llama-cpp
-  (let ((commit "fed0108491a3a3cbec6c6480dc8667ffff9d7659")
+  (let ((commit "a5735e4426b19a3ebd0c653ad8ac01420458ee95")
         (revision "2"))
     (package
       (name "llama-cpp")
@@ -537,12 +537,16 @@ (define-public llama-cpp
                (commit commit)))
          (file-name (git-file-name name version))
          (sha256
-          (base32 "16rm9gy0chd6k07crm8rkl2j3hg7y7h0km7k6c8q7bmm2jrd64la"))))
+          (base32 "0nx55wchwf204ld6jygfn37cjrzc4lspwn5v0qk8i6p92499bv0h"))))
       (build-system cmake-build-system)
       (arguments
        (list
-        #:configure-flags #~'("-DLLAMA_BLAS=ON"
+        #:configure-flags
+        #~(list               "-DBUILD_SHARED_LIBS=ON"
+                              "-DLLAMA_BLAS=ON"
                               "-DLLAMA_BLAS_VENDOR=OpenBLAS"
+                              (string-append "-DBLAS_INCLUDE_DIRS=" #$(this-package-input "openblas") "/include")
+                              (string-append "-DBLAS_LIBRARIES=" #$(this-package-input "openblas") "/lib/libopenblas.so")
 
                               "-DLLAMA_NATIVE=OFF" ;no '-march=native'
                               "-DLLAMA_FMA=OFF"    ;and no '-mfma', etc.
@@ -584,14 +588,14 @@ (define-public llama-cpp
                   (mkdir-p bin)
                   (make-script "convert-hf-to-gguf")
                   (make-script "convert-llama-ggml-to-gguf")
-                  (make-script "convert-lora-to-ggml")
-                  (make-script "convert-persimmon-to-gguf")
-                  (make-script "convert"))))
+                  (make-script "convert-hf-to-gguf-update.py"))))
             (add-after 'install-python-scripts 'wrap-python-scripts
               (assoc-ref python:%standard-phases 'wrap))
             (add-after 'install 'install-main
               (lambda _
-                (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
+                (with-directory-excursion (string-append #$output "/bin")
+                    (symlink "main" "llama"))))
+            )))
       (inputs (list python))
       (native-inputs (list pkg-config))
       (propagated-inputs

base-commit: bc8a41f4a8d9f1f0525d7bc97c67ed3c8aea3111
-- 
2.45.1
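
One detail worth spelling out in v4: the flag list changes from #~'(...) to
#~(list ...). Nothing inside a quoted list is evaluated, so calls such as
string-append cannot appear there; building the list with (list ...) lets
string-append run on the build side while #$(this-package-input "openblas")
splices in the store path of the openblas input. A trimmed sketch of the
before/after shapes (only a few of the flags shown):

    ;; Before: a literal, quoted list -- fixed strings only.
    #:configure-flags #~'("-DLLAMA_BLAS=ON"
                          "-DLLAMA_BLAS_VENDOR=OpenBLAS")

    ;; After: an evaluated list, so store paths can be spliced in with #$.
    #:configure-flags
    #~(list "-DBUILD_SHARED_LIBS=ON"
            "-DLLAMA_BLAS=ON"
            "-DLLAMA_BLAS_VENDOR=OpenBLAS"
            (string-append "-DBLAS_LIBRARIES="
                           #$(this-package-input "openblas")
                           "/lib/libopenblas.so"))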





Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Tue, 11 Jun 2024 16:55:01 GMT) Full text and rfc822 format available.

Message #23 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Andy Tai <atai <at> atai.org>
To: 71219 <at> debbugs.gnu.org
Cc: Andy Tai <atai <at> atai.org>
Subject: [PATCH v5] gnu: llama-cpp: Update commit and configure flags for
 shared library build.
Date: Tue, 11 Jun 2024 05:02:13 -0700
* gnu/packages/machine-learning.scm (llama-cpp): Update to commit a5735e with
  pkg-config support.
  [arguments](configure-flags): Add CMake configure flag
  for shared library build and adjust arguments so that OpenBLAS is
  found by CMake.
  (phases) 'install-python-scripts: Remove references to deleted scripts
  and add new ones from upstream.

Change-Id: I7c4bc219a22aa9a949e811b340c7cf745b176d14
---
 gnu/packages/machine-learning.scm | 18 +++++++++++-------
 1 file changed, 11 insertions(+), 7 deletions(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a385ddc18c..f433f8cd65 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -524,7 +524,7 @@ (define-public guile-aiscm-next
   (deprecated-package "guile-aiscm-next" guile-aiscm))
 
 (define-public llama-cpp
-  (let ((commit "fed0108491a3a3cbec6c6480dc8667ffff9d7659")
+  (let ((commit "a5735e4426b19a3ebd0c653ad8ac01420458ee95")
         (revision "2"))
     (package
       (name "llama-cpp")
@@ -537,12 +537,16 @@ (define-public llama-cpp
                (commit commit)))
          (file-name (git-file-name name version))
          (sha256
-          (base32 "16rm9gy0chd6k07crm8rkl2j3hg7y7h0km7k6c8q7bmm2jrd64la"))))
+          (base32 "0nx55wchwf204ld6jygfn37cjrzc4lspwn5v0qk8i6p92499bv0h"))))
       (build-system cmake-build-system)
       (arguments
        (list
-        #:configure-flags #~'("-DLLAMA_BLAS=ON"
+        #:configure-flags
+        #~(list               "-DBUILD_SHARED_LIBS=ON"
+                              "-DLLAMA_BLAS=ON"
                               "-DLLAMA_BLAS_VENDOR=OpenBLAS"
+                              (string-append "-DBLAS_INCLUDE_DIRS=" #$(this-package-input "openblas") "/include")
+                              (string-append "-DBLAS_LIBRARIES=" #$(this-package-input "openblas") "/lib/libopenblas.so")
 
                               "-DLLAMA_NATIVE=OFF" ;no '-march=native'
                               "-DLLAMA_FMA=OFF"    ;and no '-mfma', etc.
@@ -584,14 +588,14 @@ (define-public llama-cpp
                   (mkdir-p bin)
                   (make-script "convert-hf-to-gguf")
                   (make-script "convert-llama-ggml-to-gguf")
-                  (make-script "convert-lora-to-ggml")
-                  (make-script "convert-persimmon-to-gguf")
-                  (make-script "convert"))))
+                  (make-script "convert-hf-to-gguf-update.py"))))
             (add-after 'install-python-scripts 'wrap-python-scripts
               (assoc-ref python:%standard-phases 'wrap))
             (add-after 'install 'install-main
               (lambda _
-                (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
+                (with-directory-excursion (string-append #$output "/bin")
+                    (symlink "main" "llama"))))
+            )))
       (inputs (list python))
       (native-inputs (list pkg-config))
       (propagated-inputs

base-commit: bc8a41f4a8d9f1f0525d7bc97c67ed3c8aea3111
-- 
2.45.1





Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Tue, 11 Jun 2024 17:08:01 GMT) Full text and rfc822 format available.

Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Wed, 10 Jul 2024 04:03:02 GMT) Full text and rfc822 format available.

Message #29 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Andy Tai <atai <at> atai.org>
To: 71219 <at> debbugs.gnu.org
Date: Tue, 9 Jul 2024 21:00:20 -0700
patch passed QA
https://qa.guix.gnu.org/issue/71219




Reply sent to Ludovic Courtès <ludo <at> gnu.org>:
You have taken responsibility. (Wed, 10 Jul 2024 13:48:01 GMT) Full text and rfc822 format available.

Notification sent to Andy Tai <atai <at> atai.org>:
bug acknowledged by developer. (Wed, 10 Jul 2024 13:48:02 GMT) Full text and rfc822 format available.

Message #34 received at 71219-done <at> debbugs.gnu.org (full text, mbox):

From: Ludovic Courtès <ludo <at> gnu.org>
To: Andy Tai <atai <at> atai.org>
Cc: 71219-done <at> debbugs.gnu.org
Subject: Re: [bug#71219] [PATCH v5] gnu: llama-cpp: Update commit and
 configure flags for shared library build.
Date: Wed, 10 Jul 2024 15:47:36 +0200
[Message part 1 (text/plain, inline)]
Hi Andy,

Andy Tai <atai <at> atai.org> skribis:

> * gnu/packages/machine-learning.scm (llama-cpp): Update to commit a5735e with
>   pkg-config support.
>   [arguments](configure-flags): Add CMake configure flag
>   for shared library build and adjust arguments so that OpenBLAS is
>   found by CMake.
>   (phases) 'install-python-scripts: Remove references to deleted scripts
>   and add new ones from upstream.
>
> Change-Id: I7c4bc219a22aa9a949e811b340c7cf745b176d14

Applied with the indentation changes shown below.  Thanks!

Ludo’.

[Message part 2 (text/x-patch, inline)]
diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index a2be0bf9c8..1cb6586e81 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -567,18 +567,22 @@ (define-public llama-cpp
       (arguments
        (list
         #:configure-flags
-        #~(list               "-DBUILD_SHARED_LIBS=ON"
-                              "-DLLAMA_BLAS=ON"
-                              "-DLLAMA_BLAS_VENDOR=OpenBLAS"
-                              (string-append "-DBLAS_INCLUDE_DIRS=" #$(this-package-input "openblas") "/include")
-                              (string-append "-DBLAS_LIBRARIES=" #$(this-package-input "openblas") "/lib/libopenblas.so")
+        #~(list "-DBUILD_SHARED_LIBS=ON"
+                "-DLLAMA_BLAS=ON"
+                "-DLLAMA_BLAS_VENDOR=OpenBLAS"
+                (string-append "-DBLAS_INCLUDE_DIRS="
+                               #$(this-package-input "openblas")
+                               "/include")
+                (string-append "-DBLAS_LIBRARIES="
+                               #$(this-package-input "openblas")
+                               "/lib/libopenblas.so")
 
-                              "-DLLAMA_NATIVE=OFF" ;no '-march=native'
-                              "-DLLAMA_FMA=OFF"    ;and no '-mfma', etc.
-                              "-DLLAMA_AVX2=OFF"
-                              "-DLLAMA_AVX512=OFF"
-                              "-DLLAMA_AVX512_VBMI=OFF"
-                              "-DLLAMA_AVX512_VNNI=OFF")
+                "-DLLAMA_NATIVE=OFF" ;no '-march=native'
+                "-DLLAMA_FMA=OFF"    ;and no '-mfma', etc.
+                "-DLLAMA_AVX2=OFF"
+                "-DLLAMA_AVX512=OFF"
+                "-DLLAMA_AVX512_VBMI=OFF"
+                "-DLLAMA_AVX512_VNNI=OFF")
 
         #:modules '((ice-9 textual-ports)
                     (guix build utils)

Information forwarded to guix-patches <at> gnu.org:
bug#71219; Package guix-patches. (Wed, 10 Jul 2024 13:50:02 GMT) Full text and rfc822 format available.

Message #37 received at 71219 <at> debbugs.gnu.org (full text, mbox):

From: Ludovic Courtès <ludo <at> gnu.org>
To: Andy Tai <atai <at> atai.org>
Cc: 71219 <at> debbugs.gnu.org
Subject: Re: [bug#71219] [PATCH v5] gnu: llama-cpp: Update commit and
 configure flags for shared library build.
Date: Wed, 10 Jul 2024 15:49:25 +0200
Andy Tai <atai <at> atai.org> skribis:

>  (define-public llama-cpp
> -  (let ((commit "fed0108491a3a3cbec6c6480dc8667ffff9d7659")
> +  (let ((commit "a5735e4426b19a3ebd0c653ad8ac01420458ee95")
>          (revision "2"))

Also bumped ‘revision’, as explained here:

  https://guix.gnu.org/manual/devel/en/html_node/Version-Numbers.html
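
The ‘revision’ mentioned above is the counter Guix keeps for packages pinned to a
VCS commit: whenever the commit moves forward, the revision is bumped so that the
derived version string keeps increasing. A minimal sketch of the pattern as it
applies here; the "0.0.0" base version is illustrative, not taken from this thread:

    ;; Snapshot versioning pattern (sketch):
    (let ((commit "a5735e4426b19a3ebd0c653ad8ac01420458ee95")
          (revision "3"))     ;bumped along with the commit, e.g. "2" -> "3"
      ;; git-version produces something like "0.0.0-3.a5735e4", which
      ;; always sorts after the previous snapshot's version string.
      (git-version "0.0.0" revision commit))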




bug archived. Request was from Debbugs Internal Request <help-debbugs <at> gnu.org> to internal_control <at> debbugs.gnu.org. (Thu, 08 Aug 2024 11:24:06 GMT) Full text and rfc822 format available.

