Received: (at 68455) by debbugs.gnu.org; 26 Jan 2024 12:21:42 +0000
From: David Pflug <david@HIDDEN>
To: 68455 <at> debbugs.gnu.org
Subject: [PATCH v2] gnu: llama-cpp: Update to 1873.
Date: Fri, 26 Jan 2024 07:20:21 -0500
Message-ID: <20240126122110.10991-1-david@HIDDEN>
Cc: David Pflug <david@HIDDEN>

* gnu/packages/machine-learning.scm (llama-cpp): Update to 1873.
python-gguf added by #68735

Change-Id: I091cd20192743c87b497ea3c5fd18a75ada75d9d
---
 gnu/packages/machine-learning.scm | 110 +++++++++++++++---------
 1 file changed, 55 insertions(+), 55 deletions(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index 0e88f7265b..1d590d1c1b 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -519,63 +519,63 @@ (define-public guile-aiscm-next
   (deprecated-package "guile-aiscm-next" guile-aiscm))
 
 (define-public llama-cpp
-  (let ((commit "f31b5397143009d682db90fd2a6cde83f1ef00eb")
-        (revision "0"))
-    (package
-      (name "llama-cpp")
-      (version (git-version "0.0.0" revision commit))
-      (source
-       (origin
-         (method git-fetch)
-         (uri (git-reference
-               (url "https://github.com/ggerganov/llama.cpp")
-               (commit (string-append "master-" (string-take commit 7)))))
-         (file-name (git-file-name name version))
-         (sha256
-          (base32 "0ys6n53n032zq1ll9f3vgxk8sw0qq7x3fi7awsyy13adzp3hn08p"))))
-      (build-system cmake-build-system)
-      (arguments
-       (list
-        #:modules '((ice-9 textual-ports)
-                    (guix build utils)
-                    ((guix build python-build-system) #:prefix python:)
-                    (guix build cmake-build-system))
-        #:imported-modules `(,@%cmake-build-system-modules
-                             (guix build python-build-system))
-        #:phases
-        #~(modify-phases %standard-phases
-            (add-before 'install 'install-python-scripts
-              (lambda _
-                (let ((bin (string-append #$output "/bin/")))
-                  (define (make-script script)
-                    (let ((suffix (if (string-suffix? ".py" script) "" ".py")))
-                      (call-with-input-file
-                          (string-append "../source/" script suffix)
-                        (lambda (input)
-                          (call-with-output-file (string-append bin script)
-                            (lambda (output)
-                              (format output "#!~a/bin/python3\n~a"
-                                      #$(this-package-input "python")
-                                      (get-string-all input))))))
-                      (chmod (string-append bin script) #o555)))
-                  (mkdir-p bin)
-                  (make-script "convert-pth-to-ggml")
-                  (make-script "convert-lora-to-ggml")
-                  (make-script "convert"))))
-            (add-after 'install-python-scripts 'wrap-python-scripts
-              (assoc-ref python:%standard-phases 'wrap))
-            (replace 'install
-              (lambda _
-                (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
-      (inputs (list python))
-      (propagated-inputs
-       (list python-numpy python-pytorch python-sentencepiece))
-      (home-page "https://github.com/ggerganov/llama.cpp")
-      (synopsis "Port of Facebook's LLaMA model in C/C++")
-      (description "This package provides a port to Facebook's LLaMA collection
+  (package
+    (name "llama-cpp")
+    (version "1873")
+    (source
+     (origin
+       (method git-fetch)
+       (uri (git-reference
+             (url "https://github.com/ggerganov/llama.cpp")
+             (commit (string-append "b" version))))
+       (file-name (git-file-name name version))
+       (sha256
+        (base32 "11may9gkafg5bfma5incijvkypjgx9778gmygxp3x2dz1140809d"))))
+    (build-system cmake-build-system)
+    (arguments
+     (list
+      #:modules '((ice-9 textual-ports)
+                  (guix build utils)
+                  ((guix build python-build-system) #:prefix python:)
+                  (guix build cmake-build-system))
+      #:imported-modules `(,@%cmake-build-system-modules
+                           (guix build python-build-system))
+      #:phases
+      #~(modify-phases %standard-phases
+          (add-before 'install 'install-python-scripts
+            (lambda _
+              (let ((bin (string-append #$output "/bin/")))
+                (define (make-script script)
+                  (let ((suffix (if (string-suffix? ".py" script) "" ".py")))
+                    (call-with-input-file
+                        (string-append "../source/" script suffix)
+                      (lambda (input)
+                        (call-with-output-file (string-append bin script)
+                          (lambda (output)
+                            (format output "#!~a/bin/python3\n~a"
+                                    #$(this-package-input "python")
+                                    (get-string-all input))))))
+                    (chmod (string-append bin script) #o555)))
+                (mkdir-p bin)
+                (make-script "convert-hf-to-gguf")
+                (make-script "convert-llama-ggml-to-gguf")
+                (make-script "convert-lora-to-ggml")
+                (make-script "convert-persimmon-to-gguf")
+                (make-script "convert"))))
+          (add-after 'install-python-scripts 'wrap-python-scripts
+            (assoc-ref python:%standard-phases 'wrap))
+          (replace 'install
+            (lambda _
+              (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
+    (inputs (list python))
+    (propagated-inputs
+     (list python-numpy python-pytorch python-sentencepiece python-gguf))
+    (home-page "https://github.com/ggerganov/llama.cpp")
+    (synopsis "Port of Facebook's LLaMA model in C/C++")
+    (description "This package provides a port to Facebook's LLaMA collection
 of foundation language models.  It requires models parameters to be downloaded
 independently to be able to run a LLaMA model.")
-    (license license:expat))))
+    (license license:expat)))
 
 (define-public mcl
   (package

base-commit: c5453fbfeb0dbd19cb402199fe1e5ad51a051e56
-- 
2.41.0
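For illustration, the shebang-prepending logic of the install-python-scripts phase above can be sketched in Python (the package itself does this in Guile; the directory arguments and store path below are hypothetical):

```python
import os


def make_script(source_dir, bin_dir, python_prefix, script):
    """Copy SCRIPT from SOURCE_DIR into BIN_DIR, prepending a shebang
    that points at the packaged Python, and mark it executable."""
    # Mirror (if (string-suffix? ".py" script) "" ".py"): append the
    # extension only when the name does not already carry it.
    suffix = "" if script.endswith(".py") else ".py"
    src = os.path.join(source_dir, script + suffix)
    dst = os.path.join(bin_dir, script)
    with open(src) as inp, open(dst, "w") as out:
        out.write(f"#!{python_prefix}/bin/python3\n{inp.read()}")
    os.chmod(dst, 0o555)  # r-xr-xr-x, like #o555 in the phase
```

The installed file is the upstream script verbatim, with only the interpreter line added so it resolves the store's Python instead of relying on `/usr/bin/env`.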
Received: (at 68455) by debbugs.gnu.org; 17 Jan 2024 17:29:46 +0000
From: Mathieu Othacehe <othacehe@HIDDEN>
To: David Pflug <david@HIDDEN>
Subject: Re: [bug#68455] [PATCH] gnu: llama-cpp: Update to 1873.
In-Reply-To: <20240114203255.26500-1-david@HIDDEN> (David Pflug's message of
 "Sun, 14 Jan 2024 15:32:45 -0500")
References: <20240114203255.26500-1-david@HIDDEN>
Date: Wed, 17 Jan 2024 18:29:35 +0100
Message-ID: <87o7djj2bk.fsf@HIDDEN>
Cc: 68455 <at> debbugs.gnu.org

Hello David,

> +(define-public python-gguf
> +  (package
> +    (name "python-gguf")
> +    (version "0.6.0")
> +    (source
> +     (origin
> +       (method url-fetch)
> +       (uri (pypi-uri "gguf" version))
> +       (sha256
> +        (base32 "0rbyc2h3kpqnrvbyjvv8a69l577jv55a31l12jnw21m1lamjxqmj"))))
> +    (build-system pyproject-build-system)
> +    (arguments
> +     `(#:phases
> +       (modify-phases %standard-phases
> +         (delete 'check))))
> +    (inputs (list poetry python-pytest))
> +    (propagated-inputs (list python-numpy))
> +    (home-page "https://ggml.ai")
> +    (synopsis "Read and write ML models in GGUF for GGML")
> +    (description "Read and write ML models in GGUF for GGML")
> +    (license license:expat)))

This should be part of a separate patch. Can you send a v2?

Thanks,

Mathieu
Received: (at submit) by debbugs.gnu.org; 14 Jan 2024 20:33:46 +0000
From: David Pflug <david@HIDDEN>
To: guix-patches@HIDDEN
Subject: [PATCH] gnu: llama-cpp: Update to 1873.
Date: Sun, 14 Jan 2024 15:32:45 -0500
Message-ID: <20240114203255.26500-1-david@HIDDEN>
Cc: David Pflug <david@HIDDEN>
* gnu/packages/machine-learning.scm (llama-cpp): Update to 1873.

Change-Id: I091cd20192743c87b497ea3c5fd18a75ada75d9d
---
 gnu/packages/machine-learning.scm | 133 ++++++++++++++++++------------
 1 file changed, 78 insertions(+), 55 deletions(-)

diff --git a/gnu/packages/machine-learning.scm b/gnu/packages/machine-learning.scm
index 1616738399..0cdfe7bb08 100644
--- a/gnu/packages/machine-learning.scm
+++ b/gnu/packages/machine-learning.scm
@@ -22,6 +22,7 @@
 ;;; Copyright © 2023 Navid Afkhami <navid.afkhami@HIDDEN>
 ;;; Copyright © 2023 Zheng Junjie <873216071@HIDDEN>
 ;;; Copyright © 2023 Troy Figiel <troy@HIDDEN>
+;;; Copyright © 2023 David Pflug <david@HIDDEN>
 ;;;
 ;;; This file is part of GNU Guix.
 ;;;
@@ -517,63 +518,63 @@ (define-public guile-aiscm-next
   (deprecated-package "guile-aiscm-next" guile-aiscm))
 
 (define-public llama-cpp
-  (let ((commit "f31b5397143009d682db90fd2a6cde83f1ef00eb")
-        (revision "0"))
-    (package
-      (name "llama-cpp")
-      (version (git-version "0.0.0" revision commit))
-      (source
-       (origin
-         (method git-fetch)
-         (uri (git-reference
-               (url "https://github.com/ggerganov/llama.cpp")
-               (commit (string-append "master-" (string-take commit 7)))))
-         (file-name (git-file-name name version))
-         (sha256
-          (base32 "0ys6n53n032zq1ll9f3vgxk8sw0qq7x3fi7awsyy13adzp3hn08p"))))
-      (build-system cmake-build-system)
-      (arguments
-       (list
-        #:modules '((ice-9 textual-ports)
-                    (guix build utils)
-                    ((guix build python-build-system) #:prefix python:)
-                    (guix build cmake-build-system))
-        #:imported-modules `(,@%cmake-build-system-modules
-                             (guix build python-build-system))
-        #:phases
-        #~(modify-phases %standard-phases
-            (add-before 'install 'install-python-scripts
-              (lambda _
-                (let ((bin (string-append #$output "/bin/")))
-                  (define (make-script script)
-                    (let ((suffix (if (string-suffix? ".py" script) "" ".py")))
-                      (call-with-input-file
-                          (string-append "../source/" script suffix)
-                        (lambda (input)
-                          (call-with-output-file (string-append bin script)
-                            (lambda (output)
-                              (format output "#!~a/bin/python3\n~a"
-                                      #$(this-package-input "python")
-                                      (get-string-all input))))))
-                      (chmod (string-append bin script) #o555)))
-                  (mkdir-p bin)
-                  (make-script "convert-pth-to-ggml")
-                  (make-script "convert-lora-to-ggml")
-                  (make-script "convert"))))
-            (add-after 'install-python-scripts 'wrap-python-scripts
-              (assoc-ref python:%standard-phases 'wrap))
-            (replace 'install
-              (lambda _
-                (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
-      (inputs (list python))
-      (propagated-inputs
-       (list python-numpy python-pytorch python-sentencepiece))
-      (home-page "https://github.com/ggerganov/llama.cpp")
-      (synopsis "Port of Facebook's LLaMA model in C/C++")
-      (description "This package provides a port to Facebook's LLaMA collection
+  (package
+    (name "llama-cpp")
+    (version "1873")
+    (source
+     (origin
+       (method git-fetch)
+       (uri (git-reference
+             (url "https://github.com/ggerganov/llama.cpp")
+             (commit (string-append "b" version))))
+       (file-name (git-file-name name version))
+       (sha256
+        (base32 "11may9gkafg5bfma5incijvkypjgx9778gmygxp3x2dz1140809d"))))
+    (build-system cmake-build-system)
+    (arguments
+     (list
+      #:modules '((ice-9 textual-ports)
+                  (guix build utils)
+                  ((guix build python-build-system) #:prefix python:)
+                  (guix build cmake-build-system))
+      #:imported-modules `(,@%cmake-build-system-modules
+                           (guix build python-build-system))
+      #:phases
+      #~(modify-phases %standard-phases
+          (add-before 'install 'install-python-scripts
+            (lambda _
+              (let ((bin (string-append #$output "/bin/")))
+                (define (make-script script)
+                  (let ((suffix (if (string-suffix? ".py" script) "" ".py")))
+                    (call-with-input-file
+                        (string-append "../source/" script suffix)
+                      (lambda (input)
+                        (call-with-output-file (string-append bin script)
+                          (lambda (output)
+                            (format output "#!~a/bin/python3\n~a"
+                                    #$(this-package-input "python")
+                                    (get-string-all input))))))
+                    (chmod (string-append bin script) #o555)))
+                (mkdir-p bin)
+                (make-script "convert-hf-to-gguf")
+                (make-script "convert-llama-ggml-to-gguf")
+                (make-script "convert-lora-to-ggml")
+                (make-script "convert-persimmon-to-gguf")
+                (make-script "convert"))))
+          (add-after 'install-python-scripts 'wrap-python-scripts
+            (assoc-ref python:%standard-phases 'wrap))
+          (replace 'install
+            (lambda _
+              (copy-file "bin/main" (string-append #$output "/bin/llama")))))))
+    (inputs (list python))
+    (propagated-inputs
+     (list python-numpy python-pytorch python-sentencepiece python-gguf))
+    (home-page "https://github.com/ggerganov/llama.cpp")
+    (synopsis "Port of Facebook's LLaMA model in C/C++")
+    (description "This package provides a port to Facebook's LLaMA collection
 of foundation language models.  It requires models parameters to be downloaded
 independently to be able to run a LLaMA model.")
-    (license license:expat))))
+    (license license:expat)))
 
 (define-public mcl
   (package
@@ -5257,3 +5258,25 @@ (define-public oneapi-dnnl
 "OneAPI Deep Neural Network Library (oneDNN) is a cross-platform
 performance library of basic building blocks for deep learning applications.")
   (license license:asl2.0)))
+
+(define-public python-gguf
+  (package
+    (name "python-gguf")
+    (version "0.6.0")
+    (source
+     (origin
+       (method url-fetch)
+       (uri (pypi-uri "gguf" version))
+       (sha256
+        (base32 "0rbyc2h3kpqnrvbyjvv8a69l577jv55a31l12jnw21m1lamjxqmj"))))
+    (build-system pyproject-build-system)
+    (arguments
+     `(#:phases
+       (modify-phases %standard-phases
+         (delete 'check))))
+    (inputs (list poetry python-pytest))
+    (propagated-inputs (list python-numpy))
+    (home-page "https://ggml.ai")
+    (synopsis "Read and write ML models in GGUF for GGML")
+    (description "Read and write ML models in GGUF for GGML")
+    (license license:expat)))

base-commit: 18393fcdddf5c3d834fa89ebf5f3925fc5b166ed
-- 
2.41.0
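The core of this update is the switch from pinning a raw commit via `git-version` to tracking upstream's `b<build-number>` release tags. The two version schemes can be sketched as follows (helper names are illustrative, not Guix API):

```python
def old_style_version(base, revision, commit):
    """Sketch of Guix's git-version: BASE-REVISION.SHORT-COMMIT,
    used when upstream has no usable release tag."""
    return f"{base}-{revision}.{commit[:7]}"


def new_style_tag(version):
    """llama.cpp tags its rolling builds as b<build-number>, so the
    package version maps directly to a git tag."""
    return "b" + version


# The commit previously pinned by the package:
old = old_style_version("0.0.0", "0",
                        "f31b5397143009d682db90fd2a6cde83f1ef00eb")
# → "0.0.0-0.f31b539"

# The tag fetched after this patch:
tag = new_style_tag("1873")
# → "b1873"
```

Tag-based versions make `guix refresh`-style updates a one-field change (version plus hash) instead of a commit, revision, and hash change.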
GNU bug tracking system
Copyright (C) 1999 Darren O. Benham,
1997 nCipher Corporation Ltd,
1994-97 Ian Jackson.