
Onboard AI Models

Bring Your Own Model


Last updated 7 months ago


In this tutorial we explain how to integrate your own AI model with ORA's AI Oracle. We will start by looking at the mlgo repository and understanding what's happening there. At the end we will showcase how opML works by running a simple dispute game script inside a Docker container.

Learning Objectives

  • Understand how to transform AI model inference code and integrate it with ORA's AI Oracle.

  • Execute a simple dispute game and understand the process of AI inference verification.

Prerequisites

  • git installed

Setup

  1. Clone the mlgo repository

git clone git@github.com:OPML-Labs/mlgo.git
  2. Navigate to the cloned repository

cd mlgo
  3. To install the required dependencies, run the following command:

pip install -r requirements.txt

If any dependencies are missing, install them in your Python environment.
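One way to spot missing packages before running the scripts is to probe for them with the standard library (a generic sketch, not part of mlgo; substitute the module names from requirements.txt):

```python
import importlib.util

def missing_modules(modules):
    """Return the subset of module names that cannot be imported."""
    return [m for m in modules if importlib.util.find_spec(m) is None]

# Demo with stdlib names so the sketch runs anywhere; in practice,
# pass the import names of the packages listed in requirements.txt.
print(missing_modules(["json", "struct", "definitely_not_installed_xyz"]))
# → ['definitely_not_installed_xyz']
```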

Train the model using Pytorch

First we need to train a DNN model using PyTorch. The training code is shown in examples/mnist/trainning/mnist.ipynb.

After training, the model is saved at examples/mnist/models/mnist/mnist-small.state_dict.

Convert model into ggml format

Ggml is a file format that consists of a version number followed by three components that define a large language model: the model's hyperparameters, its vocabulary, and its weights. Ggml allows for more efficient inference runs on CPU. We will now convert the Python model to ggml format by executing the following steps:

  1. Navigate to the mnist folder

cd examples/mnist
  2. Convert the Python model into ggml format

python3 convert-h5-to-ggml.py models/mnist/mnist-small.state_dict

To convert the AI model written in Python to ggml format, we execute a Python script, passing the file that stores the model as a parameter. The output is a binary file in ggml format. Note that the model is saved in big-endian byte order, making it easy to process in the big-endian MIPS-32 VM.
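To illustrate the byte-order point, here is a minimal Python sketch (not the actual mlgo serialization code) comparing the big-endian encoding the MIPS-32 VM reads with the little-endian layout typical on x86:

```python
import struct

# A single float32 weight serialized two ways.
weight = 1.0
big    = struct.pack(">f", weight)   # big-endian, as the MIPS-32 VM expects
little = struct.pack("<f", weight)   # little-endian, the usual x86 layout

print(big.hex(), little.hex())  # → 3f800000 0000803f
```

Storing the weights big-endian up front means the VM can read them directly, with no byte swapping at every inference step.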

Converting inference code to MIPS VM executable

The next step is to write the inference code in Go. Then we will transform the Go binary into a MIPS VM executable file.

Go supports compilation to MIPS. However, the generated executable is in ELF format, and we'd like a pure sequence of MIPS instructions instead. To build an ML program for the MIPS VM, execute the following steps:

  1. Navigate to the mnist_mips directory and build the Go inference code

cd ../mnist_mips && ./build.sh

The build script compiles the Go code and then runs compile.py, which transforms the compiled Go binary into a MIPS VM executable file.
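The reason the build output needs post-processing is visible in the ELF identification bytes of the compiled binary: Go emits a full ELF file, not raw instructions. A minimal Python sketch (illustrative only; the real extraction lives in the mlgo build tooling) that checks the class and endianness of an ELF header:

```python
def elf_ident(header: bytes):
    """Parse the first bytes of an ELF header and report class/endianness."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    ei_class = {1: "32-bit", 2: "64-bit"}[header[4]]   # EI_CLASS
    ei_data = {1: "little-endian", 2: "big-endian"}[header[5]]  # EI_DATA
    return ei_class, ei_data

# A Go binary built for linux/mips would identify like this:
sample = b"\x7fELF" + bytes([1, 2]) + b"\x00" * 10
print(elf_ident(sample))  # → ('32-bit', 'big-endian')
```

A big-endian 32-bit ELF matches the MIPS-32 VM's expectations; the build step strips the ELF wrapping and keeps only the instruction sequence.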

Running the dispute game

Now that we have compiled our AI model and inference code into a MIPS VM executable, we can test the dispute game process. We will use a bash script from the opml repository to showcase the whole verification flow. For this part of the tutorial we will use Docker, so make sure to have it installed. Let's first check the content of the Dockerfile that we are using:

  1. First we need to specify the operating system that runs inside our container. In this case we're using ubuntu:22.04.

# How to run instructions:
# 1. Generate ssh key: ssh-keygen -t rsa -b 4096 -C "your_email@example.com"
#    - Save the key in the local repo where the Dockerfile is placed as id_rsa
#    - Add the public key to the GitHub account
# 2. Build docker image: docker build -t ubuntu-opml-dev .
# 3. Run hardhat: docker run -it --rm --name ubuntu-opml-dev-container ubuntu-opml-dev bash -c "npx hardhat node"
# 4. Run the challenge script on the same container: docker exec -it ubuntu-opml-dev-container bash -c "./demo/challenge_simple.sh"


# Use an official Ubuntu as a parent image
FROM ubuntu:22.04
# Set environment variables to non-interactive to avoid prompts during package installations
ENV DEBIAN_FRONTEND=noninteractive
  2. Then we need to install all the necessary dependencies in order to run the dispute game script.

# Update the package list and install dependencies
RUN apt-get update && apt-get install -y \
    build-essential \
    cmake \
    git \
    golang \
    wget \
    curl \
    python3 \
    python3-pip \
    python3-venv \
    unzip \
    file \
    openssh-client \
    && apt-get clean \
    && rm -rf /var/lib/apt/lists/*

# Install Node.js and npm
RUN curl -fsSL https://deb.nodesource.com/setup_18.x | bash - && \
    apt-get install -y nodejs
  3. Then we configure SSH keys so that the Docker container can clone all the required repositories.

# Copy SSH keys into the container
COPY id_rsa /root/.ssh/id_rsa
RUN chmod 600 /root/.ssh/id_rsa
# Configure SSH to skip host key verification
RUN echo "Host *\n\tStrictHostKeyChecking no\n" >> /root/.ssh/config
  4. Navigate to the root directory inside the Docker container and clone the opml repository along with its submodules.

# Set the working directory
WORKDIR /root

# Clone the OPML repository
RUN git clone git@github.com:ora-io/opml.git --recursive
WORKDIR /root/opml
  5. Lastly, we tell Docker to build the executables and make the challenge script executable.

# Build the OPML project
RUN make build

# Change permission for the challenge script
RUN chmod +x demo/challenge_simple.sh

# Default command
CMD ["bash"]

Create the Docker container and run the script

  1. In order to successfully clone the opml repository, you need to generate a new SSH key. Once it's generated, save the key as id_rsa in the local directory where the Dockerfile is placed, then add the public key to your GitHub account.

    ssh-keygen -t rsa -b 4096 -C "your_email@example.com"

  2. Build the docker image

    docker build -t ubuntu-opml-dev .

  3. Run the local Ethereum node

    docker run -it --rm --name ubuntu-opml-dev-container ubuntu-opml-dev bash -c "npx hardhat node"

  4. In another terminal run the challenge script

    docker exec -it ubuntu-opml-dev-container bash -c "./demo/challenge_simple.sh"

After executing the steps above, you should see the interactive challenge process in the console.

The script first deploys the necessary contracts to the local node. The proposer opML node executes the AI inference, and challenger nodes can dispute the result if they believe it is invalid. The challenger and proposer interact in order to find the differing step between their computations. Once the disputed step is found, it is sent to the smart contract for arbitration. If the challenge is successful, the proposer node gets slashed.
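The interaction that narrows down the differing step can be sketched as a binary search over the two parties' per-step state hashes (a simplified illustration, not the actual opml protocol code):

```python
def find_divergent_step(proposer_trace, challenger_trace):
    """Binary-search for the first step where two execution traces differ.

    Both traces are lists of state hashes, one per VM step; the parties
    agree at step 0 (same input) and disagree at the final step (the
    disputed output).
    """
    lo, hi = 0, len(proposer_trace) - 1
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if proposer_trace[mid] == challenger_trace[mid]:
            lo = mid   # still in agreement; divergence is later
        else:
            hi = mid   # already diverged; divergence is at or before mid
    return hi  # first disputed step, arbitrated onchain

# Toy traces: the computations agree up to step 5, then diverge.
p = [f"h{i}" for i in range(10)]
c = p[:6] + [f"x{i}" for i in range(6, 10)]
print(find_divergent_step(p, c))  # → 6
```

Only this single disputed step is executed and checked by the smart contract, which is what keeps onchain arbitration cheap regardless of how long the inference trace is.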

In this tutorial we achieved the following:

  • converted our AI model from Python to ggml format

  • compiled AI inference code written in Go to a MIPS VM executable format

  • ran the dispute game inside a Docker container and walked through the opML verification process
