How to Install and Run DeepSeek-R1 on Ubuntu 24.04 | Complete Step-by-Step Guide

In this guide, we’ll walk you through installing and running DeepSeek-R1, one of the most capable open reasoning models available today, on Ubuntu 24.04. Whether you’re setting it up for research, automation, or simply to explore local AI, this tutorial will get DeepSeek-R1 up and running smoothly on your Ubuntu machine.

Watch the step-by-step process in this video for a detailed, visual guide.


System Requirements for Running DeepSeek-R1 on Ubuntu 24.04

Before diving into the installation process, let’s first check the system requirements for DeepSeek-R1. Ensuring your system meets these prerequisites will help you avoid any potential issues down the line.

  • Operating System: Ubuntu 24.04
  • RAM: At least 16GB (32GB+ recommended for larger models)
  • Storage: Minimum of 20GB of free storage for the 1.5b model (larger models may require more)
  • Internet: A stable internet connection for downloading dependencies and the model
  • Model: For optimal performance on lower-resource systems, we recommend the DeepSeek-R1:1.5b model.

If your system meets these requirements, you’re all set to begin!
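You can verify the RAM and disk figures above from the terminal before continuing. This is a minimal pre-flight sketch using standard Linux tools; the thresholds simply mirror the requirements list above:

```shell
# Total RAM in GB, read from /proc/meminfo (reported in kB):
ram_gb=$(awk '/MemTotal/ {printf "%d", $2 / 1024 / 1024}' /proc/meminfo)

# Free space on the root filesystem in GB:
disk_gb=$(df --output=avail -BG / | tail -1 | tr -dc '0-9')

echo "RAM: ${ram_gb} GB (16+ recommended)"
echo "Free disk on /: ${disk_gb} GB (20+ recommended for the 1.5b model)"

[ "$ram_gb" -ge 16 ] || echo "Warning: less than 16 GB RAM detected"
[ "$disk_gb" -ge 20 ] || echo "Warning: less than 20 GB free disk space"
```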


Step 1: Installing the Ollama Tool

Before we can run DeepSeek-R1, we need to install Ollama, the tool that simplifies the management of AI models on Linux. Ollama handles the heavy lifting of installing and updating dependencies, making it easier to interact with DeepSeek-R1.

What is Ollama?

Ollama is a command-line tool designed to streamline the process of running large language models like DeepSeek-R1. It ensures that all the required dependencies are automatically installed and updated, saving you time and effort.

To install Ollama on Ubuntu 24.04, open your terminal and execute the following command:

curl -fsSL https://ollama.com/install.sh | sh

This command will download and install Ollama, allowing you to run models like DeepSeek-R1 with ease. Once the installation completes, verify it by running:

ollama --version

If you see the version number, Ollama is installed successfully.


Step 2: Installing the DeepSeek-R1 Model

Now that Ollama is set up, it’s time to download and install the DeepSeek-R1 model. For users with limited system resources, we recommend installing the 1.5b model, which is optimized for smaller systems.

To install the DeepSeek-R1:1.5b model, execute the following command in your terminal:

ollama run deepseek-r1:1.5b

This command tells Ollama to download the DeepSeek-R1 model and start it. The first time you run it, the download may take several minutes depending on your connection. Once loading completes, you are dropped into an interactive prompt (>>>), where you can chat with the model directly; type /bye to exit the session.
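If you’d rather fetch the weights without being dropped into a chat session, `ollama pull` downloads the model only, and `ollama list` confirms what is installed locally:

```shell
# Download the model weights only (no interactive session):
ollama pull deepseek-r1:1.5b

# List locally installed models; deepseek-r1:1.5b should appear here:
ollama list
```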


Step 3: Querying the DeepSeek-R1 Model

Once DeepSeek-R1 is up and running, it’s time to test the model with a query. You can send a one-shot prompt without entering the interactive session by passing it as an argument:

ollama run deepseek-r1:1.5b "What is the weather like today?"

You’ll receive a response generated locally from the model’s training data. Note that DeepSeek-R1 has no internet access, so it cannot report live conditions like today’s weather, but it handles a wide range of prompts, from simple questions to multi-step reasoning and summarization tasks.
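Beyond a single positional prompt, Ollama also reads from stdin and serves a local REST API (on port 11434 by default), both of which are useful for scripting:

```shell
# Pipe a prompt on stdin:
echo "Explain quantum entanglement in one short paragraph." | ollama run deepseek-r1:1.5b

# Or call the local REST API, which the Ollama service exposes:
curl -s http://localhost:11434/api/generate \
  -d '{"model": "deepseek-r1:1.5b", "prompt": "Hello", "stream": false}'
```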


Step 4: Practical Use Cases for DeepSeek-R1

Now that DeepSeek-R1 is running, let’s explore some practical applications of this AI model.

1. Document Summarization

DeepSeek-R1 can quickly summarize long documents, helping you save time and focus on key information.

Example Query:

ollama run deepseek-r1:1.5b "Summarize the key findings of this technical report on machine learning."
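The query above mentions a report the model cannot actually see. One way to feed a real document in is to pipe its contents on stdin; `report.txt` below is a hypothetical file name standing in for your own document:

```shell
# Prepend the instruction, then stream the document itself
# ('report.txt' is a hypothetical placeholder for your file):
{ echo "Summarize the key findings of the following report:"; cat report.txt; } \
  | ollama run deepseek-r1:1.5b
```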

2. Customer Support Automation

You can use DeepSeek-R1 to automate customer support by answering frequently asked questions.

Example Query:

ollama run deepseek-r1:1.5b "How do I reset my password?"

3. Research Assistance

For researchers, DeepSeek-R1 can explain concepts and answer background questions from its training data. Keep in mind that a local model cannot browse the web, so its knowledge ends at its training cutoff.

Example Query:

ollama run deepseek-r1:1.5b "What are the latest advancements in AI research?"

Step 5: Advanced Usage with GPU Acceleration

Running DeepSeek-R1 on a CPU is sufficient for basic tasks, but for more complex operations or faster performance, you can enable GPU acceleration. This requires a compatible GPU and the necessary drivers.

GPUs are ideal for AI workloads due to their ability to handle parallel processing, which significantly speeds up model execution.
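A couple of quick checks before expecting GPU offload; this sketch assumes an NVIDIA card (AMD cards use ROCm instead):

```shell
# Is an NVIDIA driver installed and a GPU visible?
if command -v nvidia-smi >/dev/null 2>&1; then
    nvidia-smi --query-gpu=name,memory.total --format=csv
else
    echo "No NVIDIA driver found - install it before expecting GPU offload"
fi

# While a model is loaded, 'ollama ps' reports GPU vs CPU usage
# in its PROCESSOR column:
command -v ollama >/dev/null 2>&1 && ollama ps || true
```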

For detailed instructions on enabling GPU acceleration, refer to the official Ollama documentation.


Troubleshooting Common Issues

If you encounter issues during the installation or use of DeepSeek-R1, here are some common troubleshooting tips:

  • Command Not Found: Ensure Ollama has been added to your PATH.
  • Permission Issues: Use sudo to grant necessary permissions during installation.
  • Failed Model Download: Ensure your internet connection is stable.
  • Out of Memory Errors: Switch to a smaller model like deepseek-r1:1.5b.
  • Slow Performance: Consider enabling GPU acceleration.
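A few commands help narrow down which of these you’re hitting. They assume the installer registered Ollama as a systemd service, which the official install script does on Ubuntu:

```shell
command -v ollama                      # "command not found": is ollama on PATH?
systemctl status ollama --no-pager     # is the background service running?
journalctl -u ollama -n 50 --no-pager  # recent logs: failed downloads, OOM kills
free -h                                # available RAM before loading a model
```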

FAQs

Q1: How much RAM do I need to run DeepSeek-R1?
A1: At least 16GB of RAM is recommended for the 1.5b model. For larger models, 32GB or more is preferable.

Q2: Can I run DeepSeek-R1 without a GPU?
A2: Yes, DeepSeek-R1 can run on a CPU, but it will be slower. A GPU is recommended for faster performance.

Q3: What is Ollama?
A3: Ollama is a tool that simplifies running AI models like DeepSeek-R1 by managing dependencies and updates.

Q4: How do I query DeepSeek-R1?
A4: Use the command ollama run deepseek-r1:1.5b "Your question here" to interact with the model.

Q5: Can I run DeepSeek-R1 on a different version of Ubuntu?
A5: While this guide is for Ubuntu 24.04, you can run DeepSeek-R1 on earlier versions of Ubuntu with slight adjustments.


Conclusion

You’ve now installed DeepSeek-R1 on Ubuntu 24.04 and are ready to put it to work. Whether you’re using it for research, automation, or other AI tasks, DeepSeek-R1 offers immense potential.

For a visual walkthrough, be sure to check out our step-by-step video guide. If you have any questions or need further assistance, feel free to leave a comment below. Happy exploring!

