5.1.6 Function Calling
Last updated: 11/09/2025
Overview
This section explains how to use Large Language Models (LLMs) to perform function calling. With this feature, an LLM can go beyond understanding and generating natural language: it can automatically choose and execute local or cloud functions based on your instructions, shifting from simply “talking” to actually “doing.”
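For example, given the prompt “What's the weather in Beijing?”, a function-calling model emits a structured tool call instead of free text, along these lines (the exact format varies by model; get_weather and its argument are illustrative placeholders):

{"name": "get_weather", "arguments": {"city": "Beijing"}}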
Install Dependencies
Install the necessary dependencies; the spacemit-ollama-toolkit package provides Ollama, and the model resources are downloaded in the next step:
sudo apt update
sudo apt install spacemit-ollama-toolkit
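After installation, you can confirm that the Ollama CLI is available (assuming the toolkit ships the standard ollama binary):

ollama --version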
Create Models
Download the model weights (GGUF files) and their corresponding modelfiles. Two models are needed: a general instruct model (qwen2.5:0.5b) and a function-calling variant (qwen2.5-0.5b-fc).
sudo apt install wget
wget https://modelscope.cn/models/second-state/Qwen2.5-0.5B-Instruct-GGUF/resolve/master/Qwen2.5-0.5B-Instruct-Q4_0.gguf -P ./
wget https://archive.spacemit.com/spacemit-ai/modelfile/qwen2.5:0.5b.modelfile -P ./
wget http://archive.spacemit.com/spacemit-ai/gguf/qwen2.5-0.5b-fc-q4_0.gguf -P ./
wget http://archive.spacemit.com/spacemit-ai/modelfile/qwen2.5-0.5b-fc.modelfile -P ./
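Optionally, confirm that all four files were downloaded completely before creating the models:

ls -lh ./*.gguf ./*.modelfile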
Create the models in Ollama:
ollama create qwen2.5:0.5b -f qwen2.5:0.5b.modelfile
ollama create qwen2.5-0.5b-fc -f qwen2.5-0.5b-fc.modelfile
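You can verify that both models are registered:

ollama list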
Clone Repository
Clone the project repository and change into the directory that contains the example code.
git clone https://gitee.com/bianbu/spacemit-demo.git
cd spacemit-demo/examples/NLP
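The demo script talks to the local Ollama service through its Python client. If the client is not already installed (an assumption; check the repository for the exact dependency list), install it with pip:

pip install ollama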
Run the Example
Execute the main script. The program will start and wait for your input.
python 05_llm_demo.py
When you enter a command or question, the LLM analyzes your intent and automatically calls the appropriate function to handle the task. This enables the model to return a structured response or directly execute the requested action.
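Under the hood, the flow resembles the following minimal sketch, written against a recent version of the ollama Python client. Everything specific here is illustrative: get_weather, its JSON schema, and the sample prompt are placeholders rather than code taken from the demo itself.

import ollama

# Hypothetical local function the model may choose to call.
def get_weather(city: str) -> str:
    return f"Sunny, 25 °C in {city}"

# Describe the function to the model as a JSON-schema tool definition.
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
            },
            "required": ["city"],
        },
    },
}]

response = ollama.chat(
    model="qwen2.5-0.5b-fc",
    messages=[{"role": "user", "content": "What's the weather in Beijing?"}],
    tools=tools,
)

# If the model decided on a tool call, dispatch it to the local function.
for call in response.message.tool_calls or []:
    if call.function.name == "get_weather":
        print(get_weather(**call.function.arguments))

The model first decides whether the prompt maps to one of the declared tools; if so, it returns the function name and arguments as structured data, and the script executes the matching local function (and can feed the result back into the conversation for a final natural-language answer).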