
5.2.6 Function Call

Feature Introduction

This chapter introduces how to implement function calling with a Large Language Model (LLM). Function calling enables a large model not only to understand and generate natural language, but also to automatically select and invoke local or cloud functions based on user instructions, taking the model from "can speak" to "can do".

Install Dependencies

First install the required dependency, the SpacemiT Ollama toolkit (the model files are downloaded in the next step):

sudo apt update
sudo apt install spacemit-ollama-toolkit
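
The toolkit provides the ollama runtime used in the following steps. As a quick check (assuming the standard ollama CLI is on your PATH, which the later ollama create commands also rely on), you can print the installed version:

ollama --version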

Model Creation

Download the base instruct model and the function-calling (fc) variant together with their modelfiles, then register both with ollama create:

sudo apt install wget
wget https://modelscope.cn/models/second-state/Qwen2.5-0.5B-Instruct-GGUF/resolve/master/Qwen2.5-0.5B-Instruct-Q4_0.gguf -P ./
wget https://archive.spacemit.com/spacemit-ai/modelfile/qwen2.5:0.5b.modelfile -P ./

wget http://archive.spacemit.com/spacemit-ai/gguf/qwen2.5-0.5b-fc-q4_0.gguf -P ./
wget http://archive.spacemit.com/spacemit-ai/modelfile/qwen2.5-0.5b-fc.modelfile -P ./
ollama create qwen2.5:0.5b -f qwen2.5:0.5b.modelfile
ollama create qwen2.5-0.5b-fc -f qwen2.5-0.5b-fc.modelfile
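
After both create commands finish, you can confirm that the models are registered:

ollama list

The output should include qwen2.5:0.5b and qwen2.5-0.5b-fc.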

Clone Repository

git clone https://gitee.com/bianbu/spacemit-demo.git
cd spacemit-demo/examples/NLP

Run Example Code

python 05_llm_demo.py

After running, the large model parses the user's intent from natural-language input, automatically calls the corresponding function, and either returns a structured response or directly executes the function's logic.
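
For reference, the following is a minimal sketch of the function-calling loop that such a demo implements. It assumes the ollama Python client (pip install ollama, version 0.4 or later, for attribute-style response access) and the qwen2.5-0.5b-fc model created above; get_current_weather is a hypothetical example function for illustration, not necessarily the one used in 05_llm_demo.py:

import json
import ollama  # assumption: the ollama Python client (pip install ollama)

# Hypothetical local function the model can choose to call.
def get_current_weather(city: str) -> str:
    """Return a canned weather report for the given city (stand-in for real logic)."""
    return json.dumps({"city": city, "condition": "sunny", "temperature_c": 25})

# JSON-schema description of the function, passed to the model as a tool.
tools = [{
    "type": "function",
    "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "Name of the city"},
            },
            "required": ["city"],
        },
    },
}]

messages = [{"role": "user", "content": "What is the weather in Beijing?"}]
response = ollama.chat(model="qwen2.5-0.5b-fc", messages=messages, tools=tools)

# The model does not execute anything itself; it only returns the name and
# arguments of the function it selected, and the caller performs the dispatch.
for call in response.message.tool_calls or []:
    if call.function.name == "get_current_weather":
        print(get_current_weather(**call.function.arguments))

The key design point is the separation of roles: the model decides which function to call and with what arguments, while the calling program executes the function and can feed the result back for a final natural-language answer.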