UPMEM-MLP implements a multilayer perceptron (MLP) training application in C and accelerates it on the UPMEM platform.
## Prerequisites

- CMake 3.10 or higher
- GCC
- Python
- UPMEM SDK
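
As a quick sanity check (not part of the project's own tooling), you can confirm the host-side tools are installed and that CMake meets the minimum version:

```sh
cmake --version    # must report 3.10 or higher
gcc --version
python3 --version
```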
## Installing UPMEM SDK
- Download the UPMEM SDK tarball for your system from this link.

  NOTICE: The UPMEM SDK is no longer downloadable from UPMEM's official SDK Downloads page.

- Extract its contents and (preferably) move them to a sensible location such as `/usr/local/bin/` (see the consolidated example after this list).
- Add the shell script `upmem_env.sh`, which sets the necessary environment variables, to your `.bashrc` so it is sourced on every shell start:

  ```sh
  source /usr/local/bin/upmem-sdk/upmem_env.sh simulator > /dev/null
  ```

- Restart your shell session for the changes to take effect.
- Test your setup:

  ```sh
  which dpu-lldb
  ```
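
Putting the steps above together, a typical setup sequence might look like the sketch below. The archive and directory names are illustrative and depend on the SDK release you downloaded.

```sh
# Hypothetical archive name -- substitute the tarball you actually downloaded.
tar -xzf upmem-sdk-<version>.tar.gz
sudo mv upmem-sdk-<version> /usr/local/bin/upmem-sdk

# Source the SDK environment (simulator backend) on every new shell.
echo 'source /usr/local/bin/upmem-sdk/upmem_env.sh simulator > /dev/null' >> ~/.bashrc

# Restart the shell (or source ~/.bashrc), then verify the SDK tools are on PATH.
which dpu-lldb
```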
## Getting Started

- Clone this repository and navigate inside it:

  ```sh
  git clone https://github.com/OpenHardware-Initiative/UPMEM-MLP.git
  cd UPMEM-MLP
  ```

- (Optional, but recommended) Create a Python virtual environment:

  ```sh
  python3 -m venv venv
  source venv/bin/activate
  ```

- Install the Python requirements:

  ```sh
  pip install -r requirements.txt
  ```

- Extract the training samples and labels:

  ```sh
  python3 read_dataset.py
  ```

- Compile the MLP:

  ```sh
  make
  ```

- Run the MLP:

  ```sh
  ./build/mlp
  ```

With this command, you can use:

- `BATCH_SIZE=...` to configure the batch size used during training, which otherwise defaults to 20
- `MAX_EPOCH=...` to configure the maximum number of epochs the training can run for, which otherwise defaults to 10
- `NUM_TRAIN_SAMPLES=...` to configure how many samples the model should be trained with, which otherwise defaults to 200
- `UPMEM=0` to turn off matrix multiplication on UPMEM
- `SAN=1` to run the MLP with the GCC sanitizer
- `EVAL=1` to run the MLP in evaluation mode, which adds the number of cycles spent in training to the printout
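
Assuming these options are read from the environment at run time, as the list above suggests, a combined invocation could look like the following (the values are purely illustrative):

```sh
# Train on 500 samples with batch size 32 for at most 5 epochs, printing cycle counts.
BATCH_SIZE=32 MAX_EPOCH=5 NUM_TRAIN_SAMPLES=500 EVAL=1 ./build/mlp

# Same run, but with matrix multiplication on UPMEM turned off.
UPMEM=0 BATCH_SIZE=32 MAX_EPOCH=5 NUM_TRAIN_SAMPLES=500 ./build/mlp
```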
## Tests

UPMEM-MLP comes with unit tests, which can be found in `tests/`. Run them with:

```sh
mkdir build
cd build
cmake ..
make
make test
```
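
Because `make test` drives CTest under CMake, you can also invoke CTest directly when you want more verbose failure output; this relies only on standard CMake/CTest behaviour, not on anything project-specific:

```sh
cd build
ctest --output-on-failure
```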

## Status

UPMEM-MLP is complete and actively maintained as of 2025-11-23.

## License

UPMEM-MLP is licensed under the Apache License v2.0. See `LICENSE` for more details.