How to Run Llama 2 Locally on Your Mac or Linux: The Complete Guide

Welcome to the World of Local LLMs!

Sumit Agrawal
2 min readNov 11, 2023

--

Hey there! If you’re like me, fascinated by the wonders of AI and itching to get your hands dirty with some of the most powerful language models out there, you’re in the right place. Today, we’re diving into how to run Llama 2 locally on your Mac or Linux system. And trust me, it’s not as daunting as it sounds. Let’s get started!

Why Run Llama 2 Locally?

  • No Internet, No Problem: You don’t need to be online to chat with Llama 2.
  • Privacy is Key: Your data stays with you, on your device.
  • Tailor-Made AI: Tweak and experiment with Llama 2 to your heart’s content.

For Linux users using llama.cpp

Run this in a Linux terminal:

#!/bin/bash
# Build llama.cpp and download a 4-bit quantized Llama 2 13B chat model
git clone https://github.com/ggerganov/llama.cpp.git
cd llama.cpp
make  # on Apple Silicon Macs, build with Metal support instead: LLAMA_METAL=1 make
export MODEL=llama-2-13b-chat.ggmlv3.q4_0.bin
curl -L "https://huggingface.co/TheBloke/Llama-2-13B-chat-GGML/resolve/main/${MODEL}" -o models/${MODEL}
# Start an interactive chat session
./main -m ./models/${MODEL} --color --ctx_size 2048 -n -1 -ins -b 256 --top_k 10000 --temp 0.2 --repeat_penalty 1.1 -t 8
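If you’d rather get a single answer than an interactive chat, the same `main` binary can take a prompt directly on the command line. A minimal sketch, assuming the build and model download above have finished (the prompt text and token count here are just illustrative):

```shell
# One-shot generation: pass the prompt with -p and cap the output length with -n
./main -m ./models/llama-2-13b-chat.ggmlv3.q4_0.bin \
  -p "Explain quantization in one sentence." \
  -n 128 --temp 0.2
```

This is handy for piping model output into other scripts instead of typing into the interactive session.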

Ollama: The Mac-Exclusive Club

Only for Mac (Apple Silicon)

Ollama is this cool macOS app that not only lets you run Llama 2 but also gives you the power to create and share large language models. It’s like having your own little AI playground.

Download and Run

  • Head over to ollama.ai/download and grab the app.
  • Once it’s installed, no need to create an account or join any waitlist. Just open your terminal and pull the model:
$ ollama pull llama2 # For the 7B model
$ ollama pull llama2:13b # For the 13B model

  • And then, it’s showtime:

$ ollama run llama2

>>> hi

Parting Thoughts

And there you have it, folks — your very own local Llama 2 setup on Mac or Linux. Whether you’re a developer, a researcher, or just an AI enthusiast, the possibilities are now endless, right at your fingertips. Happy experimenting, and let’s see what amazing things you’ll create with Llama 2!


Written by Sumit Agrawal

Enterprise Solutions Architect / Polyglot developer / Cybersecurity Enthusiast
