How to Run Large Language Models (LLMs) in Your Command Line?
by: Chris
Post content copied from Be on the Right Side of Change.
July 15, 2023 at 07:25PM
LLM is a command-line utility and Python library for interacting with large language models. In the latest release (v0.5), it offers support for self-hosted language models through plugins.
Installation
LLM can be installed using pip, pipx, or Homebrew. The syntax for each is as follows:
- With pip:
pip install llm
- With pipx:
pipx install llm
- With Homebrew:
brew install simonw/llm/llm
Key Features
- Plugins: Users can install plugins that add support for additional models. This includes 17 models from the GPT4All project, Mosaic’s MPT-30B self-hosted model, and Google’s PaLM 2 (via their API).
- Model Installation: With the new plugin system, users can install models directly on their machine. For instance, the `llm-gpt4all` plugin can be installed with `llm install llm-gpt4all`.
- Running Prompts: Users can run prompts against a model. For instance, to run the prompt `"The capital of Germany?"` against the `ggml-vicuna-7b-1` model, use `llm -m ggml-vicuna-7b-1 "The capital of Germany?"`.
- Logging: All prompts and responses are logged to a SQLite database. Users can view the most recent record with `llm logs -n 1`.
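The logging feature can be pictured as a small SQLite table of prompts and responses. The sketch below illustrates that idea with Python's built-in sqlite3 module; it is my own illustration, not llm's actual schema (the table and column names here are invented):

```python
import sqlite3

# Illustrative only: llm's real database schema differs.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE logs (id INTEGER PRIMARY KEY, prompt TEXT, response TEXT)")
db.execute("INSERT INTO logs (prompt, response) VALUES (?, ?)",
           ("The capital of Germany?", "Berlin"))
db.execute("INSERT INTO logs (prompt, response) VALUES (?, ?)",
           ("Ten names for a new programming language", "1. Flux ..."))
db.commit()

# The idea behind `llm logs -n 1`: fetch the most recent record.
row = db.execute(
    "SELECT prompt, response FROM logs ORDER BY id DESC LIMIT 1"
).fetchone()
print(row)
```

Because every interaction is appended as a new row, "most recent" is simply the highest id, which is what the `ORDER BY id DESC LIMIT 1` query returns.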
You can list the available models by running the following command:
llm models list
Using LLM from Python
The new version also supports usage as a Python library. Here’s an example:
import llm

model = llm.get_model("gpt-3.5-turbo")
model.key = 'YOUR_API_KEY_HERE'
response = model.prompt("Ten names for a new programming language")
print(response.text())
You can also use conversations and send multiple prompts to the model within the same context:
import llm

model = llm.get_model("ggml-mpt-7b-chat")
conversation = model.conversation()
r1 = conversation.prompt("The biggest country in Europe?")
print(r1.text())
r2 = conversation.prompt("How many people live there?")
print(r2.text())
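Under the hood, a conversation simply keeps the earlier prompts and responses around so that each new prompt is answered with that history as context; that is how the follow-up question can resolve "there". The toy sketch below illustrates the idea only; the `Conversation` class and the canned replies are invented for this example and are not llm's implementation:

```python
class Conversation:
    """Toy stand-in for a chat session: accumulates (prompt, response) history."""
    def __init__(self, reply_fn):
        self.reply_fn = reply_fn  # function (history, prompt) -> response text
        self.history = []

    def prompt(self, text):
        response = self.reply_fn(self.history, text)
        self.history.append((text, response))
        return response

# Canned "model": answers a follow-up by consulting the conversation so far.
def fake_model(history, prompt):
    if "biggest country" in prompt:
        return "Russia"
    if "How many people" in prompt and any("Russia" in r for _, r in history):
        # "there" resolves to Russia because it appears in the history.
        return "About 144 million"
    return "I don't know"

conversation = Conversation(fake_model)
print(conversation.prompt("The biggest country in Europe?"))  # Russia
print(conversation.prompt("How many people live there?"))     # About 144 million
```

Asked in isolation, "How many people live there?" would be unanswerable; it only works inside the conversation because the earlier exchange is part of the context.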
This and the previous examples are taken from Simon Willison's blog. Check it out; I really love the blog!
Also check out our guide that I’m sure you’ll love!
Recommended: 26 Insane Auto-GPT and Baby AGI Alternatives You Ought to Try in 2023