202602251427-spark-business-scales
🎯 Core Idea
Spark can be understood as three user-facing businesses that share a single brand and governance context:
- Savings
- Spark Lend
- Spark Liquidity Layer
This card captures a snapshot of how each business is doing right now, using publicly accessible dashboards and APIs.
Data snapshot (as of 2026-02-25)
Savings
- Metric: total supply (USD)
- Total: $5.32B
- Change:
- 1d: +$299.26M (+5.96%)
- 7d: +$362.83M (+7.32%)
- Base rate shown in the dashboard: 4.30%
Spark Lend
- Metric: TVL (USD)
- TVL: $2.21B
- Change:
- 1d: +$14.20M (+0.65%)
- 7d: -$108.71M (-4.68%)
- Additional scale signals:
- total supply: $3.46B
- total borrow: $1.25B
Spark Liquidity Layer
- Metric: AUM (USD)
- AUM: $1.86B
- Change:
- 1d: +$0.42M (+0.02%)
- 7d: -$61.53M (-3.20%)
- Top networks by AUM:
- ethereum: $1.81B
- base: $20.00M
- unichain: $15.00M
- optimism: $14.99M
- arbitrum: $14.98M
- Top protocols by AUM:
- sparklend: $794.54M
- paypal: $527.58M
- maple: $255.01M
- anchorage: $150.00M
- curve: $67.67M
Notes on interpretation
- These numbers are a point-in-time view from a dashboard API, not an audited financial statement.
- Different dashboards may define TVL and AUM differently; what matters most is using a consistent source when comparing changes over time.
- For Savings, the dashboard reports a base rate. The cashflow source for that rate and the long-run sustainability should be analyzed separately.
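As a quick consistency check, the USD deltas and percent changes in the snapshot can be reconciled with a small helper. This assumes the dashboard computes the percent against the prior-period value (current minus delta), which the Savings 1d figures appear to match:

```python
def implied_pct_change(current_usd: float, delta_usd: float) -> float:
    """Percent change implied by an absolute delta against the prior value."""
    previous = current_usd - delta_usd
    return delta_usd / previous * 100

# Savings snapshot from above, in $M: total $5,320, 1d change +$299.26
savings_1d = implied_pct_change(5_320, 299.26)
print(f"{savings_1d:.2f}%")  # close to the dashboard's +5.96%
```

If a reported percent diverges noticeably from the implied one, the source is probably computing the change against a different base or over a different window.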
🌲 Branching Questions
➡ For Savings, what exactly is included in total supply (USD)? Which assets and wrappers are counted, and where does the rate come from?
The Savings dashboard API exposes aggregate totals plus a vault breakdown.
- The total supply (USD) in the Savings endpoint is an aggregate value across Savings vaults.
- The vaults endpoint returns a list of vault entries, each with a symbol and total supply in token units, plus a percentage field. This suggests the USD total is derived from token balances multiplied by their prices, aggregated across vaults.
What is included depends on which vaults are active. The vault list is the authoritative inventory for this dashboard view.
Where the rate comes from depends on the specific savings product. The dashboard exposes a base rate field, but the API response does not fully explain the underlying cashflow. To understand the rate's source, you need to map:
- what assets back the savings token
- what the yield sources are (protocol revenue, lending revenue, or other allocation yield)
- what governance decisions can change the rate
A practical method is:
- use the Savings docs to identify which savings tokens exist
- use the vault list to see current composition
- treat the base rate as a surfaced parameter, and look for the governance artifact that sets it
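The vault-level aggregation described above can be sketched as follows. The field names (`symbol`, `total_supply`, `price_usd`) are illustrative, not the actual API schema, and the vault entries are made-up placeholders:

```python
# Hypothetical vault entries: token units per vault plus a USD price.
vaults = [
    {"symbol": "sUSDS", "total_supply": 4_000_000_000, "price_usd": 1.05},
    {"symbol": "sUSDC", "total_supply": 1_000_000_000, "price_usd": 1.00},
]

def total_supply_usd(vaults: list[dict]) -> float:
    """Token units per vault times price, summed across vaults."""
    return sum(v["total_supply"] * v["price_usd"] for v in vaults)

def vault_shares(vaults: list[dict]) -> dict[str, float]:
    """Each vault's percentage of the aggregate USD total."""
    total = total_supply_usd(vaults)
    return {v["symbol"]: v["total_supply"] * v["price_usd"] / total * 100
            for v in vaults}
```

The percentage field in the real vaults endpoint presumably plays the role of `vault_shares` here; comparing the two is a way to confirm how the dashboard derives its USD total.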
➡ For Spark Lend, what is the best primary scale metric: TVL, total supply, total borrow, or something else? Why?
Each metric answers a different question:
- TVL
  - best for overall size and market attention
  - but it mixes supplied assets that may not be economically productive
- total supply
  - best for how much capital depositors have placed into the market
  - closer to how much liquidity the market offers
- total borrow
  - best for actual utilization and revenue generation
  - more directly linked to interest revenue and risk exposure
For product performance, total borrow is often the best single metric because it reflects real demand. For risk and liquidity, total supply is more informative. A good dashboard view tracks both and their ratio.
In this snapshot, Spark Lend shows total supply of $3.46B and total borrow of $1.25B, implying an aggregate utilization of roughly 36%.
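The utilization figure follows directly from the snapshot numbers:

```python
# Aggregate utilization from this snapshot (figures in $B).
total_supply = 3.46
total_borrow = 1.25

utilization = total_borrow / total_supply
print(f"utilization ≈ {utilization:.1%}")
```

Tracking this ratio over time is often more informative than either number alone: rising utilization with flat supply signals demand growth, while rising utilization driven by shrinking supply signals liquidity stress.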
➡ For Spark Lend, what does a negative 7d TVL change imply: outflows, price movement, or utilization changes?
TVL is sensitive to multiple factors:
- net inflows or outflows of supplied assets
- price changes in volatile collateral (less relevant if the market is dominated by stable assets)
- changes in how the data source defines the protocol bucket
To interpret a negative 7d TVL change you want to check whether supply and borrow moved together:
- If total supply decreased, that suggests outflows.
- If supply stayed flat but TVL dropped, price movement or categorization changes may be involved.
- If total borrow increased while supply decreased, utilization rises and risk pressure can increase.
The Spark dashboard provides historic series for TVL, total supply, and total borrow. The simplest approach is to compare their 7d deltas together instead of relying on TVL alone.
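The decision rules above can be sketched as a rough classifier. The threshold, labels, and example deltas are illustrative assumptions, not part of the dashboard API:

```python
def classify_tvl_drop(d_supply: float, d_borrow: float, d_tvl: float,
                      flat_band: float = 0.005) -> str:
    """Interpret 7d deltas given as fractions (e.g. -0.047 for -4.7%).

    flat_band is the tolerance below which a delta counts as 'flat'.
    """
    if d_supply < -flat_band:
        if d_borrow > flat_band:
            return "outflows with rising utilization: risk pressure increasing"
        return "net outflows of supplied assets"
    if abs(d_supply) <= flat_band and d_tvl < -flat_band:
        return "price movement or source categorization change"
    return "inconclusive: inspect the historic series directly"

# Illustrative inputs, not the actual Spark Lend 7d deltas.
print(classify_tvl_drop(d_supply=-0.03, d_borrow=0.02, d_tvl=-0.047))
```

In practice you would feed this from the historic series endpoints listed in the references, after confirming their field names.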
➡ For the Liquidity Layer, what is the actual product promise: yield routing, treasury deployment, or liquidity management? How should I judge success?
The Liquidity Layer is best understood as allocation and liquidity management infrastructure: capital is deployed across multiple networks and protocols, and the dashboard tracks AUM by network and protocol bucket.
Judging success depends on the target function:
- If it is yield routing
  - success means competitive risk-adjusted yield compared to alternatives
  - stable performance without large drawdowns
- If it is treasury deployment
  - success means reliable income and diversified counterparties
  - clear governance controls on allocation and risk
- If it is liquidity management
  - success means the ability to move capital where it is needed quickly
  - the ability to support other Spark businesses (Savings and Lend) with liquidity
A practical KPI set combines scale (AUM), diversification (protocol mix), and stability (drawdowns and allocation changes).
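The diversification leg of that KPI set can be computed from the protocol breakdown in the snapshot. Note these shares are relative to the listed buckets ($1.79B), not total AUM ($1.86B), since the snapshot only shows the top protocols:

```python
# Top-protocol AUM from this snapshot, in $M.
aum_by_protocol = {
    "sparklend": 794.54,
    "paypal": 527.58,
    "maple": 255.01,
    "anchorage": 150.00,
    "curve": 67.67,
}

def top_share(aum: dict[str, float]) -> float:
    """Largest bucket's share of tracked AUM, in percent."""
    total = sum(aum.values())
    return max(aum.values()) / total * 100

def hhi(aum: dict[str, float]) -> float:
    """Herfindahl-Hirschman index on shares in [0, 1]; higher = more concentrated."""
    total = sum(aum.values())
    return sum((v / total) ** 2 for v in aum.values())

print(f"top protocol share: {top_share(aum_by_protocol):.1f}%")
print(f"HHI: {hhi(aum_by_protocol):.3f}")
```

Watching these two numbers week over week gives an early signal of concentration creep even when headline AUM is flat.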
➡ The Liquidity Layer shows large allocations by protocol. What are the top risk factors per protocol bucket (counterparty risk, smart contract risk, governance risk)?
A useful risk decomposition by bucket:
- Onchain DeFi protocols
  - smart contract risk
  - oracle risk
  - governance risk
  - liquidity risk in stress events
- Institutional or custody-like counterparties
  - counterparty default risk
  - legal and jurisdiction risk
  - transparency risk
- Stablecoin or issuer-like exposure
  - issuer risk
  - depeg risk
  - regulatory enforcement risk
The dashboard labels such as paypal, maple, and anchorage suggest that not all buckets are purely onchain protocols. That increases the importance of counterparty and legal risk analysis.
A practical workflow is to treat each protocol label as a risk dossier:
- what is the legal entity and contract structure
- what is the redemption path
- what is the failure mode
➡ How do these three businesses connect economically: do they reinforce each other or compete for the same capital?
They are linked through capital allocation:
- Savings attracts stable capital by offering a rate.
- Lend creates borrowing demand and interest flows.
- The Liquidity Layer can deploy capital across strategies and counterparties.
They can reinforce each other if:
- Savings draws in capital
- Liquidity Layer deploys capital productively
- Lend captures onchain demand and produces yield and fees
They can compete if:
- capital is finite and shifts between savings and lending markets
- incentives or rates pull the same users between products
The more integrated the allocation policy is, the more the system behaves like one balance sheet with multiple distribution channels.
➡ What leading indicators should I track weekly for each line (growth, retention, yield competitiveness, risk health)?
Savings
- total supply (USD) and weekly net inflow
- base rate changes and competitiveness versus alternatives
- composition changes across savings vaults
Spark Lend
- total borrow and total supply
- utilization and changes in borrow mix
- collateral composition and risk exposure
Liquidity Layer
- AUM and allocation changes by network and protocol
- concentration metrics (top protocol share)
- large allocation shifts that may signal risk events or strategy changes
A practical weekly report is a delta table plus notes on any large composition shift.
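A minimal version of that delta table can be generated like this. The prior-week values are back-derived from the 7d changes in the snapshot and rounded, so treat them as approximations:

```python
def delta_table(prev: dict[str, float], curr: dict[str, float]) -> str:
    """Render a simple metric / prev / curr / delta table as text."""
    rows = ["metric             prev    curr    delta"]
    for name in curr:
        d = curr[name] - prev[name]
        rows.append(f"{name:<16} {prev[name]:>6.2f} {curr[name]:>7.2f} {d:>+8.2f}")
    return "\n".join(rows)

# Figures in $B; prev values approximated from the 7d changes above.
prev = {"savings_supply": 4.96, "lend_tvl": 2.32, "sll_aum": 1.92}
curr = {"savings_supply": 5.32, "lend_tvl": 2.21, "sll_aum": 1.86}
print(delta_table(prev, curr))
```

Pairing this table with a short note on any large composition shift (vault mix, borrow mix, protocol mix) covers the weekly review in a few lines.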
📚 References
- https://docs.spark.fi/user-guides/earning-savings/
- https://docs.spark.fi/user-guides/spark-lend/
- https://docs.spark.fi/user-guides/spark-liquidity-layer/
- https://spark2-api.blockanalitica.com/sparkstar/savings/
- https://spark2-api.blockanalitica.com/sparkstar/sparklend/
- https://spark2-api.blockanalitica.com/sparkstar/sll/aum/
- https://spark2-api.blockanalitica.com/sparkstar/sll/aum/historic