- Read yield data from multiple EVM chains via Wormhole queries
- Execute a Classical ML model to optimize yield
- Schedule periodic position rebalancing via scheduled transactions
1. Initial assumptions
For the sake of example, assume that:
- For each of our yield sources, we have an on-chain read function that returns data including APY, liquidity, and 24H volume.
- We can treat the function for updating our cross-chain yield positions as a black box, also executed via Wormhole queries.
- We have already created a Classical ML model that consumes on-chain yield metrics and returns an allocation as output (see the sketch after this list).
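
In Solidity terms, these assumptions could be sketched roughly as follows. The struct and interface names (`YieldData`, `IYieldSource`, `IPositionManager`) are hypothetical stand-ins introduced for illustration, not part of any existing toolkit:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.19;

/// Hypothetical shape of the on-chain yield metrics each source exposes.
struct YieldData {
    uint256 apy;        // annualized yield, e.g. in basis points
    uint256 liquidity;  // available liquidity at the source
    uint256 volume24h;  // trailing 24-hour volume
}

/// Assumed read interface exposed by each cross-chain yield source,
/// fetched from the remote chain via Wormhole queries.
interface IYieldSource {
    function readYieldData() external view returns (YieldData memory);
}

/// Assumed black-box position manager that pushes updated allocations
/// back out to each chain.
interface IPositionManager {
    function updatePositions(uint256[] calldata allocations) external;
}
```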
2. Execute initial setup
First, we can set up our smart contract by inheriting the `IScheduler` interface:
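
A minimal setup sketch might look like the following. The import path is an assumption, and because the exact members of `IScheduler` depend on the scheduled-transactions toolkit, the contract is kept abstract rather than guessing at its required functions:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.19;

// Assumed import path; consult the actual toolkit for where IScheduler lives.
import {IScheduler} from "./interfaces/IScheduler.sol";

/// Marked abstract because the members required by IScheduler are not
/// reproduced here; the yield logic is added in the following steps.
abstract contract YieldOptimizer is IScheduler {
    // Remote yield sources to be read via Wormhole queries.
    address[] public yieldSources;

    constructor(address[] memory _yieldSources) {
        yieldSources = _yieldSources;
    }
}
```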
We can also upload our Classical ML model to Hugging Face or Arweave via our `infernet-ml` toolkit.

3. Read and update yield allocations
Next, we can:
- Tap into our multi-chain yield sources to collect yield data
- Process the yield data by executing our Classical ML model
- Update our position allocations

Using this interface, we can proceed to build a function that chains together the Wormhole and Classical ML inference precompiles to run our desired pipeline:
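
A rough sketch of such a function is shown below, reusing the `YieldData` and `IPositionManager` shapes from the assumptions in step 1. The precompile addresses, calldata encodings, and the `modelId` and `positionManager` state variables are illustrative placeholders rather than the real precompile ABI:

```solidity
// Inside the YieldOptimizer contract from the previous step.

// Placeholder precompile addresses; substitute the real ones from the chain docs.
address constant WORMHOLE_QUERY_PRECOMPILE = address(0x0a);
address constant ONNX_INFERENCE_PRECOMPILE = address(0x0b);

bytes32 public modelId;                   // identifier of the uploaded Classical ML (ONNX) model
IPositionManager public positionManager;  // black-boxed cross-chain position updater

function updateYieldAllocation() public {
    uint256 n = yieldSources.length;
    // Three features per source: APY, liquidity, 24H volume.
    uint256[] memory features = new uint256[](n * 3);

    // 1. Iterate through our yield sources, collecting their yield data
    //    via the Wormhole query precompile (encoding is illustrative only).
    for (uint256 i = 0; i < n; i++) {
        (bool ok, bytes memory out) = WORMHOLE_QUERY_PRECOMPILE.staticcall(
            abi.encode(yieldSources[i], abi.encodeWithSignature("readYieldData()"))
        );
        require(ok, "wormhole query failed");
        YieldData memory data = abi.decode(out, (YieldData));

        // 2. Parse and package this data for execution by the ONNX model.
        features[i * 3]     = data.apy;
        features[i * 3 + 1] = data.liquidity;
        features[i * 3 + 2] = data.volume24h;
    }

    // 3. Run the Classical ML model via the ONNX inference precompile.
    (bool inferOk, bytes memory result) = ONNX_INFERENCE_PRECOMPILE.staticcall(
        abi.encode(modelId, features)
    );
    require(inferOk, "inference failed");
    uint256[] memory allocations = abi.decode(result, (uint256[]));

    // 4. Use the returned allocation array to update our positions.
    positionManager.updatePositions(allocations);
}
```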
As you can see above, we:
- Iterate through our yield sources, collecting their yield data
- Parse and package this data for execution by an ONNX model
- Use the returned array from the ONNX model to update our allocations
4. Set up rebalancing automatically
Finally, with the `updateYieldAllocation()` function set up, we can easily schedule our position allocations to rebalance automatically through the use of scheduled transactions:
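
One way this could look is sketched below. The scheduler precompile address, the calldata encoding, and the one-day cadence are hypothetical placeholders for whatever the scheduled-transactions API actually exposes:

```solidity
// Inside the YieldOptimizer contract.

// Placeholder address for the scheduled-transactions precompile.
address constant SCHEDULER_PRECOMPILE = address(0x0c);

// Example cadence; tune to the strategy's needs.
uint256 public constant REBALANCE_INTERVAL = 1 days;

/// Register updateYieldAllocation() for periodic re-execution.
/// The encoding below is illustrative; use the real scheduler ABI.
function scheduleRebalancing() external {
    (bool ok, ) = SCHEDULER_PRECOMPILE.call(
        abi.encode(
            address(this),                                            // target contract
            abi.encodeWithSelector(this.updateYieldAllocation.selector),
            REBALANCE_INTERVAL                                        // seconds between runs
        )
    );
    require(ok, "scheduling failed");
}
```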