1. Applicant & Project Overview
TrustNomiks – Tokenomics Intelligence Graph
Team
- Leo Delion – Co-founder @ Nomiks, Tokenomics Engineer.
- Nomiks collaborators (research & engineering support)
Overview
TrustNomiks is a tokenomics intelligence layer built directly on the Intuition Protocol. It encodes key token metrics (supply, TGE, allocations, vesting, emission models, risk flags) as Intuition atoms and triples, each backed by expert staking in $TRUST. Instead of scattered PDFs and inconsistent spreadsheets, TrustNomiks offers a canonical, queryable tokenomics graph that agents, dashboards, tokenomists, and allocators can consume. Objective claims (e.g. max supply, vesting schedule) are clearly separated from interpretative ones (e.g. dilution risk, emission aggressiveness) and curated via staking and challenges. This makes tokenomics machine-readable, economically curated, and reusable across the ecosystem and in mathematical/financial models.
Project Category
- InfoFi / Knowledge Graph
- AI / Agent Context Provider
- Developer Tooling
- Consumer App
- Identity / Reputation
Elevator Pitch
TrustNomiks turns tokenomics into InfoFi-grade data: structured atoms and triples curated with $TRUST, so analysts, DeFi curators, and AI agents can finally query reliable token metrics instead of scraping PDFs, landing pages, or DAO proposals.
Origin Story
At Nomiks, we’ve designed tokenomics and risk frameworks for a wide range of L1s, DeFi protocols, and apps. The same issue kept blocking both humans and agents: token metrics are fragmented, ambiguous, and hard to trust. Intuition’s InfoFi primitives (atoms, triples, signals, $TRUST staking) are exactly the infrastructure we wanted for a token-curated tokenomics graph. TrustNomiks is the crystallization of this experience into an on-chain, machine-readable knowledge layer that anyone in the ecosystem can build on.
Traction / Achievements
- Delivered tokenomics designs, unlock studies, and economic models for multiple protocols (L1, DeFi, consumer).
- Built internal tools (Nomiks Builder, a token-allocation helper, and Nomiks Sentinel, a simulation engine) to stress-test unlocks and liquidity against market conditions.
- Co-designed the TrustNomiks concept in dialogue with Intuition contributors; early interest from tokenomics engineers and DeFi risk curators.
- Worked on protocol improvement proposals for Stacks $STX (SIP-31), The Sandbox $SAND, and Chiliz $CHZ.
2. What We Are Building
Problem
Tokenomics information today is:
- Scattered – across whitepapers, pitch decks, forum posts, on-chain explorers.
- Inconsistent – different numbers depending on the source; no canonical reference.
- Not machine-readable – hard for AI agents, dashboards, and automated due diligence.
- Weakly trusted – no economic cost for publishing wrong or misleading token data.
This hurts both human professionals (data analysts, allocators, developers, freelancers) and machines (agents, trackers, bots), especially for risk analysis and strategy construction.
Solution
TrustNomiks builds a Tokenomics Intelligence Graph on Intuition:
- Each key tokenomics datum becomes an Intuition claim, e.g.:
  - Token X — has Max Supply — 1,000,000,000
  - Token X — has Allocation Segment — Segment Team
  - Segment Team — allocation Percentage — 20%
  - Segment Team — vesting Duration Months — 36
- Claims are sourced (source URL, document, TGE announcement, audit, DAO proposal, etc.).
- Claims are asserted by identifiable experts (DIDs / Intuition identities / agents).
- Claims can be backed by $TRUST (signals and staking) and challenged if incorrect.
- Objective vs. interpretative claims are modeled separately, allowing:
  - Objective layer – hard data (supplies, dates, percentages).
  - Interpretative layer – risk flags (e.g. “>40% supply unlock in next 12 months”), quality tags, scoring (see the sketch after this list).
- Result: a reusable, queryable tokenomics substrate powering due diligence, risk filters, dashboards, modeling frameworks, and agents.
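To make the objective → interpretative split concrete, here is a minimal TypeScript sketch of how a flag like “>40% supply unlock in next 12 months” could be derived from objective vesting claims. The field names and the linear post-cliff vesting assumption are ours for illustration, not part of the ontology spec.

```typescript
// Minimal sketch: deriving an interpretative risk flag from objective claims.
// Field names and linear post-cliff vesting are illustrative assumptions.
interface AllocationSegment {
  name: string;                  // e.g. "Team"
  percentOfMaxSupply: number;    // e.g. 20 means 20% of max supply
  cliffMonths: number;           // months before any tokens vest
  vestingDurationMonths: number; // linear vesting period after the cliff
  monthsSinceTGE: number;        // where the schedule stands today
}

// Percent of max supply this segment unlocks over the next `horizonMonths`,
// assuming linear monthly vesting after the cliff (TGE unlocks ignored here).
function unlockingWithin(seg: AllocationSegment, horizonMonths: number): number {
  const start = Math.max(seg.monthsSinceTGE, seg.cliffMonths);
  const end = Math.min(
    seg.monthsSinceTGE + horizonMonths,
    seg.cliffMonths + seg.vestingDurationMonths
  );
  if (end <= start) return 0; // still inside the cliff, or fully vested
  return (seg.percentOfMaxSupply * (end - start)) / seg.vestingDurationMonths;
}

// Interpretative claim: flag tokens where >40% of max supply unlocks in 12 months.
function highUnlockConcentration(segments: AllocationSegment[]): boolean {
  const totalPct = segments.reduce((sum, s) => sum + unlockingWithin(s, 12), 0);
  return totalPct > 40;
}
```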
Stage of Development
- Ontology v1 (entities + predicates) is drafted.
- CSV/Sheet schemas for ingestion are defined (a sketch of one row follows below).
- Example mappings to Intuition atoms and triples are sketched, informed by the >200 tokens already integrated into Nomiks’ SaaS solutions.
This grant will fund the transition from ontology and design to a working MVP on Intuition, with real tokens modeled and live $TRUST-curated claims.
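For illustration, here is a sketch of what one ingestion row could look like and how it expands into triples, reusing the predicate names from the Solution examples above. Column names are hypothetical; the actual CSV/Sheet schemas are defined in the project spec.

```typescript
// Hypothetical ingestion row for one allocation segment of one token.
// Column names are illustrative; the real schemas may differ.
interface IngestionRow {
  tokenSymbol: string;           // e.g. "TKN"
  maxSupply: number;             // e.g. 1_000_000_000
  segmentName: string;           // e.g. "Team"
  allocationPercentage: number;  // e.g. 20
  cliffMonths: number;           // e.g. 12
  vestingDurationMonths: number; // e.g. 36
  sourceUrl: string;             // provenance for the resulting claims
}

type Triple = { subject: string; predicate: string; object: string };

// Expand one row into the triples it asserts on the graph.
function rowToTriples(row: IngestionRow): Triple[] {
  const token = `Token ${row.tokenSymbol}`;
  const segment = `Segment ${row.segmentName} (${row.tokenSymbol})`;
  return [
    { subject: token, predicate: "has Max Supply", object: String(row.maxSupply) },
    { subject: token, predicate: "has Allocation Segment", object: segment },
    { subject: segment, predicate: "allocation Percentage", object: `${row.allocationPercentage}%` },
    { subject: segment, predicate: "cliff Months", object: String(row.cliffMonths) },
    { subject: segment, predicate: "vesting Duration Months", object: String(row.vestingDurationMonths) },
  ];
}
```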
Technical Architecture
Core entities (Atoms):
- Token, Organisation, Wallet
- AllocationSegment (Team, Treasury, Investors, Liquidity, Community, Ecosystem, etc.)
- VestingSchedule (cliff, duration, frequency, hatch)
- EmissionModel (inflationary, fixed cap, deflationary, burn & mint, rebase…)
- Expert (analyst, curator, DAO, firm)
- DataSource (docs, on-chain, API)
- RiskFlag (unlock concentration, governance capture, emission aggressiveness, etc.)
Key predicates (Triples):
- has Max Supply, has Initial Supply, has Circulating Supply At Date
- has TGE Supply, percentage Of Max Supply
- has Allocation Segment, allocation Percentage, allocation Token Amount
- has Vesting Schedule, cliff Months, vesting Duration Months, vesting Frequency
- and more…
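As a type-level sketch of the entities and predicates above (our illustration, not a finalized or protocol-level spec):

```typescript
// Illustrative ontology model; names mirror the entities and predicates listed
// above, but the shapes are our sketch, not a finalized spec.
type AtomKind =
  | "Token" | "Organisation" | "Wallet"
  | "AllocationSegment" | "VestingSchedule" | "EmissionModel"
  | "Expert" | "DataSource" | "RiskFlag";

interface Atom {
  id: string;    // protocol-assigned or content-addressed identifier
  kind: AtomKind;
  label: string; // e.g. "Token TKN", "Segment Team"
}

type Predicate =
  | "has Max Supply" | "has Initial Supply" | "has Circulating Supply At Date"
  | "has TGE Supply" | "percentage Of Max Supply"
  | "has Allocation Segment" | "allocation Percentage" | "allocation Token Amount"
  | "has Vesting Schedule" | "cliff Months" | "vesting Duration Months" | "vesting Frequency";

interface Triple {
  subject: Atom;
  predicate: Predicate;
  object: Atom | string | number; // literals for quantities, dates, percentages
}

// Example: (Token TKN) — has Max Supply — 1,000,000,000
const example: Triple = {
  subject: { id: "atom:1", kind: "Token", label: "Token TKN" },
  predicate: "has Max Supply",
  object: 1_000_000_000,
};
```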
Integrations & Dependencies
- Intuition’s GraphQL API / MCP to read/write atoms, triples, signals.
- Potential integration with DeFi dashboards, allocators, and wallet UIs needing tokenomics context.
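To illustrate the read path, here is a minimal sketch of querying the triples for one token over GraphQL. The endpoint URL, query shape, and field names are placeholders, not Intuition’s actual schema; the official API documentation is the reference.

```typescript
// Minimal read-path sketch. ENDPOINT and the query/field names below are
// placeholders; check Intuition's actual GraphQL schema before use.
const ENDPOINT = "https://example.invalid/graphql"; // placeholder endpoint

const TOKEN_TRIPLES_QUERY = `
  query TokenTriples($label: String!) {
    triples(where: { subject: { label: $label } }) {
      subject { label }
      predicate { label }
      object { label }
    }
  }
`;

async function fetchTokenTriples(tokenLabel: string) {
  const res = await fetch(ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      query: TOKEN_TRIPLES_QUERY,
      variables: { label: tokenLabel },
    }),
  });
  const { data } = await res.json();
  return data?.triples ?? [];
}
```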
Security & Data Integrity
- Explicit separation between fact-like claims and opinion / interpretation.
- Strong provenance metadata (source, timestamp, asserter identity).
- $TRUST-backed claims with challenge / slashing to discourage low-quality data.
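A sketch of the provenance envelope attached to every claim; the field names are our assumption for illustration, not a protocol-defined structure:

```typescript
// Illustrative provenance metadata carried by each claim. Field names are an
// assumption for this sketch, not a protocol-defined structure.
interface Provenance {
  sourceUrl: string;    // whitepaper, TGE announcement, audit, DAO proposal...
  sourceType: "document" | "on-chain" | "api" | "dao-proposal" | "audit";
  assertedBy: string;   // DID / Intuition identity of the asserter
  assertedAt: string;   // ISO-8601 timestamp
  objective: boolean;   // fact-like claim (true) vs. interpretation (false)
}
```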
3. Team & Execution Ability
Backgrounds
- Leo Delion – Tokenomics engineer, co-founder of Nomiks. Experience across L1s, DeFi, and application tokens: allocation design, vesting, liquidity, incentive systems, risk analysis.
- Nomiks collaborators: quant modeling, simulation, and DeFi research.
Execution Proof (Leo)
- Built the database behind Nomiks Builder (internal tool for token metrics and raise scenarios).
- Worked on Nomiks Sentinel (simulation engines to stress-test unlock schedules and liquidity against market conditions).
- Shipped tokenomics and due-diligence work for several protocols.
Commitment
- Lead (Leo): part-time, with the ability to expand depending on funding and traction.
- Dev: we have our own network of freelancers, but we would also welcome working with developers already involved in Intuition projects.
- Additional Nomiks support: part-time on ontology, data modeling, and integration.
4. Grant Request & Milestones
Amount Requested
- Worker incentives: $35,000 equivalent for active workers on the project (engineers, quants, economists, developers…).
- Reward design (subject to technical feasibility): allocating a reward pool in $TRUST would enable us to design a dedicated incentive function for data commitments on the TrustNomiks graph.
Executive summary:
- Stage 1: 200 token-metrics ready to be implemented → completed
- Stage 2: deployment → needs grant
  - Milestone 1: TrustNomiks ontology and Intuition mapping
  - Milestone 2: reinforcement dApps / stress-testing and audit of the graph
  - User acquisition
- Stage 3: contributor incentives → needs $TRUST reward pool
  - Milestone 3: incentives for tokenomics experts on the interpretative layer
  - Contributor acquisition (data commitments)
| Milestone | Description | Deliverables | Timeline | Success Criteria |
|---|---|---|---|---|
| M1 – Ontology & Schemas v1 | Finalize TrustNomiks tokenomics ontology and Intuition mapping. | Public spec of entities & predicates; CSV/JSON schemas; example triples for 10–20 tokens. | Weeks 0–4 | Ontology reviewed with Intuition team + at least one external tokenomics expert; ready for ingestion. |
| M2 – MVP Graph with 100 Tokens | Implement ingestion pipeline and model 100 canonical tokens end-to-end. | Atoms & triples on Intuition for 100 tokens; sample queries; basic documentation for consumers. | Weeks 4–10 | Queries demonstrate useful tokenomics filters (e.g. unlock risk, team vesting length, supply distribution). |
| M3 – $TRUST-Curated Expert Layer | Add subjective claims, staking, and incentive design. | RiskFlag schema; initial set of expert-backed interpretative claims; simple UX/workflow for staking & challenges (could be basic UI + Atlas flows + scripts). | Weeks 10–16 | 3–5 experts actively staking on claims; at least one successful challenge / correction executed. |
5. Intuition Ecosystem Alignment
Why Intuition
TrustNomiks is natively an InfoFi application:
- Tokenomics is information that must be structured, attestable, and economically curated.
- Intuition provides exactly this: atoms, triples, attestations, signals, $TRUST staking, and a shared InfoFi graph.
- Tokenomics cuts across the whole ecosystem (L1s, DeFi, consumer apps) and is one of the most obvious verticals where better information directly changes capital allocation.
Intuition Primitives Used
- Atoms – Token, Organisation, Wallet, AllocationSegment, VestingSchedule, EmissionModel, Expert, DataSource, RiskFlag.
- Triples – core tokenomics relations: supplies, allocations, vesting, emissions, risk flags, provenance (assertedBy, sourceURL, etc.).
- Attestations / Signals – experts signal confidence in specific claims and groupings of claims.
- $TRUST staking & challenges – economic curation of tokenomics claims; incorrect or misleading data can be challenged and penalized.
- Knowledge graph / MCP context – TrustNomiks provides a coherent context for AI agents performing due diligence, risk scoring, and strategy design, and for developers building apps or dashboards that combine TrustNomiks intelligence with real-time market data (CoinMarketCap API, Token Terminal, etc.).
Why This Must Be Built on Intuition
- We specifically need a token-curated knowledge graph with staking, reward/punishment mechanism design, and composable queries: this is Intuition’s core.
- Tokenomics data should be co-owned by the ecosystem, not locked in a proprietary database. We believe tokenomics data is the backbone of a new generation of monetary policies (micro-incentive patterns, KPI-based inflation, autonomous economic agents…), so it must be open source.
New Schemas / Patterns / Agent Types
- Tokenomics RiskFlag schema – structured interpretative claims like “High unlock concentration” or “Team allocation > X% with short vesting.”
- Tokenomics Curator agents – agents that propose new claims and keep token entries up to date by reading DAO proposals, screening landing pages, etc.
- Due diligence query patterns – reusable query templates such as “find tokens with low dilution risk and long team vesting” or “filter by community vs. investor allocation balance” (see the sketch after this list).
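A minimal sketch of one such query pattern, expressed as a filter over an in-memory view of the graph; `TokenView` and both thresholds are our assumptions for illustration:

```typescript
// Illustrative query pattern: "tokens with low dilution risk and long team
// vesting". TokenView and the two thresholds are assumptions for this sketch.
interface TokenView {
  symbol: string;
  teamVestingMonths: number;              // from vesting Duration Months triples
  supplyUnlockingNext12MonthsPct: number; // derived interpretative metric
}

function lowDilutionLongTeamVesting(tokens: TokenView[]): TokenView[] {
  return tokens.filter(
    (t) => t.supplyUnlockingNext12MonthsPct < 10 && t.teamVestingMonths >= 36
  );
}
```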
Long-Term Network Contribution
- Provide a standardized tokenomics ontology that other projects on Intuition can reuse and extend.
- Act as a canonical reference layer feeding other Intuition-based products (risk dashboards, portfolio tools, institutional allocators, DeFi protocol screening…).
Contact & Attestation
- Nomiks: https://nomiks.io
- LinkedIn: https://www.linkedin.com/in/leodelion/
- Medium: https://medium.com/@Nomiks
- Contact: leo@nomiks.io
Thank you!