FAQ
Frequently Asked Questions
Points Program
How do I unstake my staked assets?
Since the points program and ORA staking have ended, here is a guide to unstaking your assets. Your funds are safe and will remain available to you indefinitely.
Click the link below, connect your wallet, open the "Write as Proxy" tab, select requestWithdraw, and fill out the parameters. The pool addresses are listed below; the amount should be the amount you staked. https://etherscan.io/address/0x784fDeBfD4779579B4cc2bac484129D29200412a#writeProxyContract
ETH POOL: 0x0a7Df7BC7a01A4b6C9889d5994196C1600D4244a
OLM POOL: 0x4F5E12233Ed7ca1699894174fCbD77c7eD60b03d
OLM2 POOL (6-month lock-up): 0x07b022BD57e22c8c5Abc577535Cf25e483dAe3dF
STETH POOL: 0x5982241e50Cb4C42cb51D06e74A97EAaCa3a8CE2
STONE ETH POOL: 0xc0b2FdA4EDb0f7995651B05B179596b112aBE0Ff
Once the first transaction is successful, come back after one day, click claimWithdraw, and fill out the parameters.
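If you prefer to do this programmatically instead of through the Etherscan UI, here is a minimal web3.py sketch of the two steps. The ABI fragment and function signatures (requestWithdraw(pool, amount) and claimWithdraw(pool)) are assumptions for illustration only; verify the real signatures on the verified contract's "Write as Proxy" tab before sending anything.

```python
# Minimal unstaking sketch using web3.py. The ABI fragment below is an
# assumption for illustration -- confirm the real function signatures on the
# verified contract (Etherscan, "Write as Proxy" tab) before sending funds.
from web3 import Web3

RPC_URL = "https://eth.llamarpc.com"  # any Ethereum mainnet RPC endpoint
STAKING_PROXY = "0x784fDeBfD4779579B4cc2bac484129D29200412a"
ETH_POOL = "0x0a7Df7BC7a01A4b6C9889d5994196C1600D4244a"  # see the pool list above

ASSUMED_ABI = [
    {"name": "requestWithdraw", "type": "function", "stateMutability": "nonpayable",
     "inputs": [{"name": "pool", "type": "address"},
                {"name": "amount", "type": "uint256"}], "outputs": []},
    {"name": "claimWithdraw", "type": "function", "stateMutability": "nonpayable",
     "inputs": [{"name": "pool", "type": "address"}], "outputs": []},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
staking = w3.eth.contract(address=Web3.to_checksum_address(STAKING_PROXY), abi=ASSUMED_ABI)
account = w3.eth.account.from_key("0x...")  # your private key

def send(call):
    """Build, sign, and broadcast a contract call, then wait for the receipt."""
    tx = call.build_transaction({
        "from": account.address,
        "nonce": w3.eth.get_transaction_count(account.address),
    })
    signed = account.sign_transaction(tx)
    # use signed.rawTransaction on older web3.py / eth-account versions
    tx_hash = w3.eth.send_raw_transaction(signed.raw_transaction)
    return w3.eth.wait_for_transaction_receipt(tx_hash)

# Step 1: request the withdrawal (amount = exactly what you staked, in wei).
send(staking.functions.requestWithdraw(Web3.to_checksum_address(ETH_POOL),
                                       Web3.to_wei(1, "ether")))

# Step 2: after the one-day delay, claim the withdrawal.
# send(staking.functions.claimWithdraw(Web3.to_checksum_address(ETH_POOL)))
```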
Tora Client
Information about running the Tora Client can be found in our Node Operator Guide.
How do I run a Tora Client?
Currently there are two options for running a Tora client: the Tora Launcher and the CLI.
Are there incentives for running a Tora Client?
The incentive is in the form of ORA points. Read more on the Points page.
Each validated transaction earns 3 points. Read more here: https://docs.ora.io/doc/points/tasks#task-4-running-validator-node
ORA Network
Is ORA network a rollup or layer 2?
Neither.
ORA is a verifiable AI oracle network. It contains a set of smart contracts capable of making calls to a network of nodes computing AI inference, secured by opML.
AI Oracle
How does AI Oracle handle large responses to generate videos or images?
The content generated by ORA's AI Oracle can be securely stored on decentralized storage networks like IPFS (InterPlanetary File System). Once stored, the files can be retrieved using the Content Identifier (CID), which is provided by ORA’s AI Oracle.
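For example, a generated image can be fetched from any public IPFS gateway once you have the CID. The gateway URL and CID below are placeholders:

```python
# Retrieve AI Oracle output from IPFS through a public HTTP gateway.
# The CID here is a placeholder; use the one returned by the AI Oracle.
import requests

cid = "Qm..."                          # CID returned by the AI Oracle
gateway = "https://ipfs.io/ipfs/"      # any public IPFS gateway works

response = requests.get(gateway + cid, timeout=60)
response.raise_for_status()

# Save the retrieved content (e.g. a generated image) to disk.
with open("output.png", "wb") as f:
    f.write(response.content)
```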
I want to build with ORA's AI Oracle, what options do I have?
Depending on your use case, you can choose from a variety of supported AI models. Please refer to the References page, where you'll find all the essential details regarding each model, including supported blockchain networks, associated fees, and other relevant information.
What does the AI Oracle fee consist of?
AI Oracle fee = Model Fee (for the model used, e.g. LlaMA2 or Stable Diffusion) + Callback Fee (for the node to submit the inference result back onchain) + Network Fee (gas)
Callback fees and network fees may be higher when the network is experiencing congestion.
Callback fees may be lower for a model such as Stable Diffusion, because the inference result is shorter (just an IPFS hash, instead of the long paragraphs an LLM returns).
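As a rough sketch of how the three components add up (every number below is a made-up placeholder, not an actual ORA fee):

```python
# Illustrative breakdown of the AI Oracle fee. All values are hypothetical
# placeholders in ETH; real fees depend on the model, the size of the
# inference result, and current network congestion.
model_fee = 0.0003     # fee for the model itself (e.g. LlaMA2 or Stable Diffusion)
callback_fee = 0.0005  # gas the node spends submitting the result back onchain
network_fee = 0.0002   # gas for your own request transaction

total_fee = model_fee + callback_fee + network_fee
print(f"Estimated AI Oracle fee: {total_fee:.4f} ETH")
```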
OPML
How does opML guarantee consistency, given ML models are non-deterministic?
ML inference can be deterministic provided that the random seed is fixed and the inference is run using Nvidia's deterministic framework or in our deterministic VM. Learn more from this talk on determinism in ML.
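As a generic illustration of the seed-fixing part (plain PyTorch on CPU, not ORA's deterministic VM):

```python
# Generic sketch: two inference runs become bit-identical once every source of
# randomness is seeded and non-deterministic kernels are disallowed. opML goes
# further by running the model in a deterministic VM; this only shows the idea.
import torch

def run_inference(seed: int) -> torch.Tensor:
    torch.manual_seed(seed)                    # fix weight init and input sampling
    torch.use_deterministic_algorithms(True)   # fail loudly on non-deterministic ops
    model = torch.nn.Linear(16, 4)
    x = torch.randn(1, 16)
    with torch.no_grad():
        return model(x)

# Two independent runs with the same seed produce identical outputs.
assert torch.equal(run_inference(seed=0), run_inference(seed=0))
```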
What are the limitations of opML?
Privacy: all models in opML need to be public and open-source so that network participants can challenge them. This can be mitigated with opp/ai.
What is the proving overhead, performance, and limitations of zkML frameworks?
The zkLLM and Ligetron data below come from the EthBelgrade conference talk.
Modulus Labs' zkML work bringing GPT2-1B onchain resulted in an overhead of more than 1,000,000x (200+ hours per call), requiring a 128-core CPU and 1 TB of RAM.
The zkML framework EZKL takes around 80 minutes to generate a proof for a 1M-nanoGPT model.
According to Modulus Labs, zkML has an overhead far greater than 1000x compared to pure computation, with the latest report citing 1000x, and 200x for small models.
According to EZKL's benchmark, the average proving time of RISC Zero is 173 seconds for Random Forest Classification.
For more details, refer to:
The benchmark granted by the Ethereum Foundation, which compares our zkML framework to other leading solutions in the space.
The EthBelgrade conference talk, which covers the zkLLM and Ligetron data.
Other Questions
How can I be an OG on Discord?
Unfortunately, the OG role was a limited-time opportunity exclusively for our first 100 Discord community members, and that window has closed. However, we appreciate your interest and look forward to having you as part of our community!
What is OLM?
OLM is the first AI model launched through the IMO framework. More details here: https://docs.openlm.io/olm/initial-model-offering
More questions? Reach out in our Discord.