# Page Not Found
Source: https://docs.spark.money/404
The page you are looking for doesn't exist or has been moved.
Sorry, we couldn't find the page you're looking for. Here are some helpful links:
* [Home](/home/welcome)
* [Wallet Documentation](/wallet/introduction)
* [Mint Documentation](/issuing/introduction)
* [API Reference](/issuing/api-reference)
If you believe this is a bug, please [let us know](https://spark.money/get-in-touch).
# Doc Sync Agent Instructions
Source: https://docs.spark.money/CLAUDE
This repo contains documentation for Spark ([https://spark.money](https://spark.money)).
## Repo Structure
```
issuance/ # Token issuance tutorials (IssuerSparkWallet)
wallets/ # Wallet tutorials (SparkWallet)
api-reference/
wallet/ # SparkWallet method reference
issuer/ # IssuerSparkWallet method reference
learn/ # Conceptual docs (architecture, trust model)
quickstart/ # Getting started guides
```
## SDK Source of Truth
The Spark SDKs are the source of truth:
* **SparkWallet**: `sdks/js/packages/spark-sdk/src/spark-wallet/spark-wallet.ts`
* **IssuerSparkWallet**: `sdks/js/packages/issuer-sdk/src/issuer-wallet/issuer-spark-wallet.ts`
## Sync Rules
### Method Signatures
* Parameter names must match SDK EXACTLY
* Types must match EXACTLY (bigint vs number matters)
* Optional parameters must be marked correctly
* Return types must match
### Deprecations
* Methods marked `@deprecated` in SDK need `**Deprecated**...` in docs
* Link to the replacement method
### Code Examples
* Must use correct parameter syntax
* Object-style params: `method({ param1, param2 })`
* Positional params: `method(param1, param2)`
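A minimal illustration of the two conventions, using a stub wallet with simplified signatures (not the real SDK types — the real methods are async and hit the network):

```typescript
// Stub wallet, for illustration only.
const stubWallet = {
  // Object-style params: method({ param1, param2 })
  transferTokens(params: { tokenAmount: bigint; receiverSparkAddress: string }): string {
    return `transfer:${params.tokenAmount}`;
  },
  // Positional params: method(param1, param2)
  mintTokens(amount: bigint): string {
    return `mint:${amount}`;
  },
};

const objectStyle = stubWallet.transferTokens({
  tokenAmount: 1000n,
  receiverSparkAddress: "spark1...",
});
const positional = stubWallet.mintTokens(500n);
```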
## Common Issues to Check
1. **Multi-token support** - IssuerSparkWallet methods now take `tokenIdentifier` parameter
2. **Object vs positional params** - Many methods changed from positional to object params
3. **Async/await** - Ensure examples have `await` for async methods
4. **Type accuracy** - `bigint` for token amounts, `number` for sats
## Files to Compare
When syncing, always compare:
| Doc File | SDK Source |
| ---------------------------- | ------------------------ |
| `api-reference/wallet/*.mdx` | `spark-wallet.ts` |
| `api-reference/issuer/*.mdx` | `issuer-spark-wallet.ts` |
| `issuance/*.mdx` | `issuer-spark-wallet.ts` |
| `wallets/*.mdx` | `spark-wallet.ts` |
# Issuer API
Source: https://docs.spark.money/api-reference/issuer-overview
IssuerSparkWallet class methods for token creation, minting, burning, and freezing.
The `IssuerSparkWallet` class extends `SparkWallet` with token issuance and management capabilities on the Spark network. All functions from `SparkWallet` are also available, including Bitcoin transfers, Lightning payments, and wallet management.
***
## Installation
Install the Issuer SDK packages using your package manager of choice.
```bash npm theme={null}
npm install @buildonspark/issuer-sdk
```
```bash yarn theme={null}
yarn add @buildonspark/issuer-sdk
```
```bash pnpm theme={null}
pnpm add @buildonspark/issuer-sdk
```
***
## Method Categories
# batchTransferTokens
Source: https://docs.spark.money/api-reference/issuer/batch-transfer-tokens
Transfer tokens to multiple recipients in one transaction.
Transfers tokens to multiple recipients in a single transaction via the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
async function batchTransferTokens(
  receiverOutputs: {
    tokenIdentifier: Bech32mTokenIdentifier;
    tokenAmount: bigint;
    receiverSparkAddress: string;
  }[],
  outputSelectionStrategy?: "SMALL_FIRST" | "LARGE_FIRST",
  selectedOutputs?: OutputWithPreviousTransactionData[]
): Promise<string>;
```
## Parameters
* `receiverOutputs`: Array of transfer objects. All outputs must have the same `tokenIdentifier`:
  * `tokenIdentifier`: Bech32m token identifier (e.g., `btkn1...`)
  * `tokenAmount`: Amount of tokens to transfer (`bigint`)
  * `receiverSparkAddress`: Recipient's Spark address
* `outputSelectionStrategy` (optional): Strategy for selecting outputs, `"SMALL_FIRST"` or `"LARGE_FIRST"` (default: `"SMALL_FIRST"`)
* `selectedOutputs` (optional): Specific outputs to use for the transfer (overrides the selection strategy)
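Since every output in a batch must share one `tokenIdentifier`, the constraint can be checked before submitting; a sketch (the helper and its types are ours, not part of the SDK):

```typescript
type ReceiverOutput = {
  tokenIdentifier: string;
  tokenAmount: bigint;
  receiverSparkAddress: string;
};

// True when every output in the batch uses the same token identifier.
function sameTokenIdentifier(outputs: ReceiverOutput[]): boolean {
  return outputs.every((o) => o.tokenIdentifier === outputs[0]?.tokenIdentifier);
}
```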
## Returns
Transaction ID
## Example
```typescript theme={null}
const txId = await issuerWallet.batchTransferTokens([
  { tokenIdentifier: "btkn1...", tokenAmount: 1000n, receiverSparkAddress: "spark1abc..." },
  { tokenIdentifier: "btkn1...", tokenAmount: 500n, receiverSparkAddress: "spark1def..." },
  { tokenIdentifier: "btkn1...", tokenAmount: 250n, receiverSparkAddress: "spark1ghi..." }
]);

console.log("Batch transfer completed:", txId);
```
# burnTokens
Source: https://docs.spark.money/api-reference/issuer/burn-tokens
Burn tokens to reduce circulating supply.
Burns existing tokens to reduce the circulating supply for the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
// Single-token issuer (positional parameter)
async burnTokens(
  amount: bigint,
  selectedOutputs?: OutputWithPreviousTransactionData[]
): Promise<string>

// Multi-token issuer (object parameters)
async burnTokens({
  tokenAmount,
  tokenIdentifier,
  selectedOutputs,
}: {
  tokenAmount: bigint;
  tokenIdentifier: Bech32mTokenIdentifier;
  selectedOutputs?: OutputWithPreviousTransactionData[];
}): Promise<string>
```
## Parameters
### Single-token issuer
* `amount`: The amount to burn (e.g., `1000n`)
* `selectedOutputs` (optional): Specific outputs to use for the burn operation
### Multi-token issuer
* `tokenAmount`: The amount to burn (e.g., `1000n`)
* `tokenIdentifier`: The token identifier to burn. Required for multi-token issuers.
* `selectedOutputs` (optional): Specific outputs to use for the burn operation
## Returns
Transaction ID
## Example
```typescript theme={null}
// Single token issuer (simple positional parameter)
const txId = await issuerWallet.burnTokens(1000n);
console.log("Tokens burned:", txId);

// Multi-token issuer (specify which token with object parameters)
const tokenId = await issuerWallet.getIssuerTokenIdentifier();
const txId2 = await issuerWallet.burnTokens({
  tokenAmount: 500n,
  tokenIdentifier: tokenId
});
```
# createToken
Source: https://docs.spark.money/api-reference/issuer/create-token
Create a new token with name, ticker, decimals, and supply settings.
Creates a new Spark Native Token via the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
// Returns transaction hash only
async createToken({
  tokenName,
  tokenTicker,
  decimals,
  maxSupply,
  isFreezable,
  extraMetadata,
  returnIdentifierForCreate,
}: {
  tokenName: string;
  tokenTicker: string;
  decimals: number;
  maxSupply?: bigint; // defaults to 0n (unlimited)
  isFreezable: boolean;
  extraMetadata?: Uint8Array;
  returnIdentifierForCreate?: false;
}): Promise<string>

// Returns both transaction hash and token identifier
async createToken({
  tokenName,
  tokenTicker,
  decimals,
  maxSupply,
  isFreezable,
  extraMetadata,
  returnIdentifierForCreate,
}: {
  tokenName: string;
  tokenTicker: string;
  decimals: number;
  maxSupply?: bigint; // defaults to 0n (unlimited)
  isFreezable: boolean;
  extraMetadata?: Uint8Array;
  returnIdentifierForCreate: true;
}): Promise<TokenCreationDetails>
```
## Parameters
* `tokenName`: Name of the token (e.g., SparkCoin)
* `tokenTicker`: Token ticker (e.g., SPARKC)
* `decimals`: The precision the token supports (e.g., 8 for BTC)
* `maxSupply` (optional): The maximum supply for this token (defaults to `0n` for unlimited supply)
* `isFreezable`: Whether or not the issuer can freeze this token
* `extraMetadata` (optional): Extra metadata bytes to associate with the token (e.g., image data, JSON metadata)
* `returnIdentifierForCreate` (optional): When `true`, returns both the transaction hash and token identifier as `TokenCreationDetails`. When `false` or omitted, returns only the transaction hash as a string (default: `false`)
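The relationship between `decimals` and token amounts can be made concrete with a conversion helper; a sketch (the helper is ours — the SDK simply treats amounts as base-unit `bigint`s):

```typescript
// Convert a human-readable decimal string into base units for a token
// with the given `decimals` precision. Token amounts are bigint in the SDK.
function toBaseUnits(amount: string, decimals: number): bigint {
  const [whole, frac = ""] = amount.split(".");
  // Pad or truncate the fractional part to exactly `decimals` digits.
  const padded = (frac + "0".repeat(decimals)).slice(0, decimals);
  return BigInt(whole + padded);
}

toBaseUnits("1.5", 8); // 150000000n
```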
## Returns
When `returnIdentifierForCreate` is `false` or omitted:
* Spark Transaction ID (string)
When `returnIdentifierForCreate` is `true`, an object containing:
* `tokenIdentifier`: Bech32m token identifier (e.g., `btkn1...`)
* `transactionHash`: Spark Transaction ID
## Example
```typescript theme={null}
// Basic usage - returns only transaction hash
const txId = await issuerWallet.createToken({
  tokenName: "SparkCoin",
  tokenTicker: "SPARKC",
  decimals: 8,
  maxSupply: 1000000n,
  isFreezable: true,
  // Optional: add extra metadata
  extraMetadata: new TextEncoder().encode(JSON.stringify({ icon: "..." }))
});
console.log("Token created:", txId);

// Get both transaction hash and token identifier
const result = await issuerWallet.createToken({
  tokenName: "SparkCoin",
  tokenTicker: "SPARKC",
  decimals: 8,
  maxSupply: 1000000n,
  isFreezable: true,
  returnIdentifierForCreate: true
});
console.log("Token created:", result.transactionHash);
console.log("Token identifier:", result.tokenIdentifier);
```
# freezeTokens
Source: https://docs.spark.money/api-reference/issuer/freeze-tokens
Freeze tokens held by a specific Spark address.
Freezes issuer's tokens for a specific wallet via the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
// Single-token issuer (positional parameter)
async freezeTokens(sparkAddress: string): Promise<{
  impactedOutputIds: string[];
  impactedTokenAmount: bigint;
}>

// Multi-token issuer (object parameters)
async freezeTokens({
  tokenIdentifier,
  sparkAddress,
}: {
  tokenIdentifier: Bech32mTokenIdentifier;
  sparkAddress: string;
}): Promise<{
  impactedOutputIds: string[];
  impactedTokenAmount: bigint;
}>
```
## Parameters
### Single-token issuer
* `sparkAddress`: The Spark Address to freeze
### Multi-token issuer
* `tokenIdentifier`: The token identifier to freeze. Required for multi-token issuers.
* `sparkAddress`: The Spark Address to freeze
## Returns
* `impactedOutputIds`: Array of output IDs that were frozen
* `impactedTokenAmount`: Total amount of tokens frozen
## Example
```typescript theme={null}
// Single token issuer (simple positional parameter)
const result = await issuerWallet.freezeTokens(
  "spark1qw508d6qejxtdg4y5r3zarvary0c5xw7kxpjzsx"
);
console.log("Frozen outputs:", result.impactedOutputIds);
console.log("Frozen amount:", result.impactedTokenAmount);

// Multi-token issuer (specify which token with object parameters)
const tokenId = await issuerWallet.getIssuerTokenIdentifier();
const result2 = await issuerWallet.freezeTokens({
  tokenIdentifier: tokenId,
  sparkAddress: "spark1qw508d6qejxtdg4y5r3zarvary0c5xw7kxpjzsx"
});
```
# getIdentityPublicKey
Source: https://docs.spark.money/api-reference/issuer/get-identity-public-key
Get the issuer wallet's identity public key as hex.
Gets the identity public key of the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getIdentityPublicKey(): Promise<string>
```
## Returns
The identity public key as a hex string
## Example
```typescript theme={null}
const identityPubKey = await wallet.getIdentityPublicKey();
console.log("Identity Public Key:", identityPubKey);
```
# getIssuerTokenBalance
Source: https://docs.spark.money/api-reference/issuer/get-issuer-token-balance
Get the issuer's own token balance and identifier.
**Deprecated**: Use [`getIssuerTokenBalances()`](/api-reference/issuer/get-issuer-token-balances) instead for multi-token support.
Gets the issuer's own token balance via the `IssuerSparkWallet`. Throws an error if the issuer has multiple tokens.
## Method Signature
```typescript theme={null}
async function getIssuerTokenBalance(): Promise<{
  tokenIdentifier: Bech32mTokenIdentifier | undefined;
  balance: bigint;
}>;
```
## Returns
* `tokenIdentifier`: Bech32m token identifier (e.g., `btkn1...`), or `undefined` if no token exists
* `balance`: Token balance held by the issuer
## Example
```typescript theme={null}
const { tokenIdentifier, balance } = await issuerWallet.getIssuerTokenBalance();

if (tokenIdentifier) {
  console.log("Token identifier:", tokenIdentifier);
  console.log("Issuer balance:", balance);
} else {
  console.log("No token created yet");
}
```
# getIssuerTokenBalances
Source: https://docs.spark.money/api-reference/issuer/get-issuer-token-balances
Get all token balances for a multi-token issuer.
Gets all token balances for tokens issued by this `IssuerSparkWallet`. Supports issuers with multiple tokens.
## Method Signature
```typescript theme={null}
async getIssuerTokenBalances(): Promise<{
  tokenIdentifier: Bech32mTokenIdentifier | undefined;
  balance: bigint;
}[]>
```
## Returns
Array of objects containing the token identifier and balance for each token issued by this wallet:
* `tokenIdentifier`: Bech32m token identifier (e.g., `btkn1...`), or `undefined` if no token exists
* `balance`: Token balance held by the issuer
## Example
```typescript theme={null}
const balances = await issuerWallet.getIssuerTokenBalances();

for (const { tokenIdentifier, balance } of balances) {
  if (tokenIdentifier) {
    console.log(`Token ${tokenIdentifier}: ${balance}`);
  }
}
```
# getIssuerTokenDistribution
Source: https://docs.spark.money/api-reference/issuer/get-issuer-token-distribution
Get token distribution stats: circulating supply, holders, and transactions.
Gets the token distribution information for the token associated with this `IssuerSparkWallet`.
This feature is currently under development and will be available in a future release of Spark.
## Method Signature
```typescript theme={null}
interface TokenDistribution {
  totalCirculatingSupply: bigint;
  totalIssued: bigint;
  totalBurned: bigint;
  numHoldingAddress: number;
  numConfirmedTransactions: bigint;
}

async function getIssuerTokenDistribution(): Promise<TokenDistribution>;
```
## Returns
* `totalCirculatingSupply`: Total circulating supply of the token
* `totalIssued`: Total issued tokens
* `totalBurned`: Total tokens burned
* `numHoldingAddress`: Number of addresses holding the token
* `numConfirmedTransactions`: Number of confirmed transactions
## Example
```typescript theme={null}
const distribution = await issuerWallet.getIssuerTokenDistribution();
console.log("Circulating supply:", distribution.totalCirculatingSupply);
console.log("Total issued:", distribution.totalIssued);
console.log("Total burned:", distribution.totalBurned);
console.log("Holding addresses:", distribution.numHoldingAddress);
console.log("Confirmed transactions:", distribution.numConfirmedTransactions);
```
# getIssuerTokenIdentifier
Source: https://docs.spark.money/api-reference/issuer/get-issuer-token-identifier
Get the Bech32m token identifier (btkn1...) for your token.
**Deprecated**: Use [`getIssuerTokenIdentifiers()`](/api-reference/issuer/get-issuer-token-identifiers) instead for multi-token support.
Gets the Bech32m token identifier for the issuer's token via the `IssuerSparkWallet`. Throws an error if the issuer has multiple tokens.
## Method Signature
```typescript theme={null}
async getIssuerTokenIdentifier(): Promise<Bech32mTokenIdentifier>
```
## Returns
Bech32m token identifier (e.g., `btkn1...`)
## Example
```typescript theme={null}
const tokenIdentifier = await issuerWallet.getIssuerTokenIdentifier();
console.log("Token identifier:", tokenIdentifier);

// Use in transfer operations
const txId = await issuerWallet.transferTokens({
  tokenIdentifier,
  tokenAmount: 1000n,
  receiverSparkAddress: "spark1..."
});
```
# getIssuerTokenIdentifiers
Source: https://docs.spark.money/api-reference/issuer/get-issuer-token-identifiers
Get all token identifiers (btkn1...) for a multi-token issuer.
Gets all Bech32m token identifiers for tokens issued by this `IssuerSparkWallet`. Supports issuers with multiple tokens.
## Method Signature
```typescript theme={null}
async getIssuerTokenIdentifiers(): Promise<Bech32mTokenIdentifier[]>
```
## Returns
Array of Bech32m token identifiers (e.g., `["btkn1...", "btkn1..."]`)
## Example
```typescript theme={null}
const tokenIdentifiers = await issuerWallet.getIssuerTokenIdentifiers();

console.log("Tokens issued by this wallet:");
for (const tokenId of tokenIdentifiers) {
  console.log(" -", tokenId);
}

// Use in operations (fetch the balance map once, then look up each token)
const balance = await issuerWallet.getBalance();
for (const tokenId of tokenIdentifiers) {
  const tokenBalance = balance.tokenBalances.get(tokenId);
  console.log(`${tokenId}: ${tokenBalance?.balance ?? 0n}`);
}
```
```
# getIssuerTokenMetadata
Source: https://docs.spark.money/api-reference/issuer/get-issuer-token-metadata
Get token metadata including name, ticker, decimals, and max supply.
**Deprecated**: Use [`getIssuerTokensMetadata()`](/api-reference/issuer/get-issuer-tokens-metadata) instead for multi-token support.
Gets the metadata about the token for the `IssuerSparkWallet`. Throws an error if the issuer has multiple tokens.
As an issuer, you can get your token public key by calling either `getIssuerTokenMetadata()` or `getIdentityPublicKey()`. For frequent retrieval, prefer `getIdentityPublicKey()`, since it doesn't require a network call.
## Method Signature
```typescript theme={null}
interface IssuerTokenMetadata {
  tokenPublicKey: string; // Issuer's public key (same as identity public key)
  rawTokenIdentifier: Uint8Array; // Binary token identifier
  tokenName: string;
  tokenTicker: string; // Token ticker symbol (e.g., "USDT")
  decimals: number; // Number of decimal places
  maxSupply: bigint;
  isFreezable: boolean;
  extraMetadata?: Uint8Array; // Arbitrary bytes set during token creation
}

async function getIssuerTokenMetadata(): Promise<IssuerTokenMetadata>;
```
## Returns
Object containing complete token metadata
## Example
```typescript theme={null}
const metadata = await issuerWallet.getIssuerTokenMetadata();
console.log("Token name:", metadata.tokenName);
console.log("Token ticker:", metadata.tokenTicker);
console.log("Decimals:", metadata.decimals);
console.log("Max supply:", metadata.maxSupply);
console.log("Is freezable:", metadata.isFreezable);

// Check for extra metadata
if (metadata.extraMetadata) {
  console.log("Extra metadata:", metadata.extraMetadata);
}
```
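If the `extraMetadata` bytes were written as UTF-8 JSON (as in the `createToken` example), they can be decoded back; a sketch (the helper is ours, and the payload shape is whatever the issuer chose at creation):

```typescript
// Decode extraMetadata bytes that were stored as UTF-8-encoded JSON.
// This helper is illustrative, not part of the SDK.
function decodeExtraMetadata(bytes: Uint8Array): unknown {
  return JSON.parse(new TextDecoder().decode(bytes));
}

const decoded = decodeExtraMetadata(
  new TextEncoder().encode(JSON.stringify({ icon: "x" }))
);
```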
# getIssuerTokensMetadata
Source: https://docs.spark.money/api-reference/issuer/get-issuer-tokens-metadata
Get metadata for all tokens issued by this wallet.
Gets metadata for all tokens issued by this `IssuerSparkWallet`. Supports issuers with multiple tokens.
## Method Signature
```typescript theme={null}
interface IssuerTokenMetadata {
  tokenPublicKey: string; // Issuer's public key (same as identity public key)
  rawTokenIdentifier: Uint8Array; // Binary token identifier
  tokenName: string;
  tokenTicker: string; // Token ticker symbol (e.g., "USDT")
  decimals: number; // Number of decimal places
  maxSupply: bigint;
  isFreezable: boolean;
  extraMetadata?: Uint8Array; // Arbitrary bytes set during token creation
  bech32mTokenIdentifier: string; // Bech32m encoded token identifier
}

async getIssuerTokensMetadata(): Promise<IssuerTokenMetadata[]>
```
## Returns
Array of metadata objects for each token issued by this wallet
## Example
```typescript theme={null}
const tokensMetadata = await issuerWallet.getIssuerTokensMetadata();

for (const metadata of tokensMetadata) {
  console.log("Token name:", metadata.tokenName);
  console.log("Token ticker:", metadata.tokenTicker);
  console.log("Token ID:", metadata.bech32mTokenIdentifier);
  console.log("Decimals:", metadata.decimals);
  console.log("Max supply:", metadata.maxSupply);
  console.log("---");
}
```
# getSparkAddress
Source: https://docs.spark.money/api-reference/issuer/get-spark-address
Get the issuer wallet's Spark address for receiving payments.
Gets the Spark Address of the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getSparkAddress(): Promise<string>
```
## Returns
The Spark Address
## Example
```typescript theme={null}
const sparkAddress = await wallet.getSparkAddress();
console.log("Spark Address:", sparkAddress);
```
# initialize
Source: https://docs.spark.money/api-reference/issuer/initialize
Create or restore an IssuerSparkWallet instance from mnemonic or seed.
Creates and initializes a new `IssuerSparkWallet` instance.
## Method Signature
```typescript theme={null}
interface IssuerSparkWalletProps {
  mnemonicOrSeed?: Uint8Array | string;
  accountNumber?: number;
  signer?: SparkSigner;
  options?: ConfigOptions;
}

static async initialize(props: IssuerSparkWalletProps): Promise<{
  wallet: IssuerSparkWallet;
  mnemonic?: string;
}>
```
## Parameters
* `mnemonicOrSeed` (optional): BIP-39 mnemonic phrase or raw seed
* `accountNumber` (optional): Number used to generate multiple identity keys from the same mnemonic
* `signer` (optional): Custom signer implementation for advanced use cases
* `options` (optional): Wallet configuration options, including network selection
## Returns
* `wallet`: The initialized `IssuerSparkWallet` instance
* `mnemonic`: The generated mnemonic, or `undefined` if a raw seed was provided
## Example
```typescript Create New Wallet theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";

// Create a new issuer wallet
const { wallet, mnemonic } = await IssuerSparkWallet.initialize({
  options: { network: "REGTEST" } // or "MAINNET"
});

console.log("Issuer wallet initialized:", wallet);
console.log("Generated mnemonic:", mnemonic);
```
```typescript Import Existing Wallet theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";

// Import issuer wallet from existing mnemonic
const { wallet } = await IssuerSparkWallet.initialize({
  mnemonicOrSeed: "abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon about",
  accountNumber: 0, // Optional: specify account index
  options: { network: "REGTEST" }
});

console.log("Issuer wallet restored from mnemonic");
```
# mintTokens
Source: https://docs.spark.money/api-reference/issuer/mint-tokens
Mint new tokens to increase circulating supply.
Mints new tokens to increase the circulating supply for the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
// Single-token issuer (positional parameter)
async mintTokens(amount: bigint): Promise<string>

// Multi-token issuer (object parameters)
async mintTokens({
  tokenAmount,
  tokenIdentifier,
}: {
  tokenAmount: bigint;
  tokenIdentifier: Bech32mTokenIdentifier;
}): Promise<string>
```
## Parameters
### Single-token issuer
* `amount`: The amount to mint (e.g., `1000n`)
### Multi-token issuer
* `tokenAmount`: The amount to mint (e.g., `1000n`)
* `tokenIdentifier`: The token identifier to mint. Required for multi-token issuers.
## Returns
Transaction ID
## Example
```typescript theme={null}
// Single token issuer (simple positional parameter)
const txId = await issuerWallet.mintTokens(1000n);
console.log("Tokens minted:", txId);

// Multi-token issuer (specify which token with object parameters)
const tokenId = await issuerWallet.getIssuerTokenIdentifier();
const txId2 = await issuerWallet.mintTokens({
  tokenAmount: 5000n,
  tokenIdentifier: tokenId
});
```
# queryTokenTransactions
Source: https://docs.spark.money/api-reference/issuer/query-token-transactions
Query token transactions by owner, issuer, hash, or token identifier.
Retrieves token transactions from the network with flexible filtering options for the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
async function queryTokenTransactions({
  sparkAddresses,
  ownerPublicKeys,
  issuerPublicKeys,
  tokenTransactionHashes,
  tokenIdentifiers,
  outputIds,
  order,
  pageSize,
  offset,
}: {
  sparkAddresses?: string[];
  ownerPublicKeys?: string[]; // deprecated, use sparkAddresses
  issuerPublicKeys?: string[];
  tokenTransactionHashes?: string[];
  tokenIdentifiers?: string[];
  outputIds?: string[];
  order?: "asc" | "desc";
  pageSize?: number;
  offset?: number;
}): Promise;
```
## Parameters
**Filter constraint:** You can specify only **one** filter type per query. The filter parameters are mutually exclusive: use `sparkAddresses` OR `ownerPublicKeys` OR `issuerPublicKeys` OR `tokenTransactionHashes` OR `tokenIdentifiers` OR `outputIds`, but never more than one at a time.
* `sparkAddresses` (optional): Array of Spark addresses to filter by (recommended)
* `ownerPublicKeys` (optional): Array of owner public keys to filter by (deprecated, use `sparkAddresses`)
* `issuerPublicKeys` (optional): Array of issuer public keys to filter by
* `tokenTransactionHashes` (optional): Array of token transaction hashes to filter by
* `tokenIdentifiers` (optional): Array of Bech32m token identifiers to filter by
* `outputIds` (optional): Array of output IDs to filter by
* `order` (optional): Sort order, `"asc"` or `"desc"` (defaults to `"desc"`)
* `pageSize` (optional): Number of results per page (defaults to 50)
* `offset` (optional): Pagination offset (defaults to 0)
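The `pageSize`/`offset` pair supports straightforward pagination; a sketch with a stub in place of the real network call (the stub and helper are ours, not SDK APIs):

```typescript
// Stub standing in for queryTokenTransactions, backed by a fixed list.
const fakeResults = Array.from({ length: 7 }, (_, i) => `tx${i}`);
async function queryStub({ pageSize = 50, offset = 0 }): Promise<string[]> {
  return fakeResults.slice(offset, offset + pageSize);
}

// Fetch every page: a short page signals the end of the result set.
async function fetchAll(pageSize = 3): Promise<string[]> {
  const all: string[] = [];
  for (let offset = 0; ; offset += pageSize) {
    const page = await queryStub({ pageSize, offset });
    all.push(...page);
    if (page.length < pageSize) break;
  }
  return all;
}
```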
## Returns
Token transactions response object
## Example
```typescript theme={null}
// Query all token transactions
const transactions = await issuerWallet.queryTokenTransactions({});

// Query transactions for specific tokens with pagination
const tokenTransactions = await issuerWallet.queryTokenTransactions({
  tokenIdentifiers: ["btkn1..."],
  order: "desc",
  pageSize: 20,
  offset: 0
});

// Query transactions for specific Spark addresses
const addressTransactions = await issuerWallet.queryTokenTransactions({
  sparkAddresses: ["spark1..."]
});

// Query all transactions for your token
const allTransactions = await issuerWallet.queryTokenTransactions({
  issuerPublicKeys: [await issuerWallet.getIdentityPublicKey()]
});
```
**Getting the sender address:** Token transaction outputs only show the recipient. To get the sender's address for an incoming transfer, query the previous transaction using the `prevTokenTransactionHash` from the input's `OutputToSpend`.
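A sketch of that one-hop lookback, with a stub map standing in for `queryTokenTransactions`. The field names (`outputs`, `ownerAddress`, `prevTokenTransactionVout`) are illustrative, not the exact SDK response shape — check the actual structures returned by the SDK:

```typescript
type StubTx = { hash: string; outputs: { ownerAddress: string }[] };

// Index of previously fetched transactions, keyed by transaction hash.
const txIndex = new Map<string, StubTx>([
  ["aaa111", { hash: "aaa111", outputs: [{ ownerAddress: "spark1sender..." }] }],
]);

// An incoming transfer's input points at the transaction it spends.
const incomingInput = { prevTokenTransactionHash: "aaa111", prevTokenTransactionVout: 0 };

// The spent output of the previous transaction belongs to the sender.
const prevTx = txIndex.get(incomingInput.prevTokenTransactionHash);
const senderAddress = prevTx?.outputs[incomingInput.prevTokenTransactionVout]?.ownerAddress;
```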
# transferTokens
Source: https://docs.spark.money/api-reference/issuer/transfer-tokens
Transfer tokens from issuer wallet to a Spark address.
Transfers tokens to another Spark Address via the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
async function transferTokens({
  tokenIdentifier,
  tokenAmount,
  receiverSparkAddress,
  outputSelectionStrategy,
  selectedOutputs,
}: {
  tokenIdentifier: Bech32mTokenIdentifier;
  tokenAmount: bigint;
  receiverSparkAddress: string;
  outputSelectionStrategy?: "SMALL_FIRST" | "LARGE_FIRST";
  selectedOutputs?: OutputWithPreviousTransactionData[];
}): Promise<string>;
```
## Parameters
* `tokenIdentifier`: Bech32m token identifier (e.g., `btkn1...`) of the token to transfer
* `tokenAmount`: Amount of tokens to transfer
* `receiverSparkAddress`: Recipient's Spark Address
* `outputSelectionStrategy` (optional): Strategy for selecting outputs, `"SMALL_FIRST"` or `"LARGE_FIRST"` (defaults to `"SMALL_FIRST"`)
* `selectedOutputs` (optional): Specific outputs to use for the transfer (overrides the selection strategy)
## Returns
Transaction ID
## Example
```typescript theme={null}
const txId = await issuerWallet.transferTokens({
  tokenIdentifier: "btkn1...",
  tokenAmount: 1000n,
  receiverSparkAddress: "spark1qw508d6qejxtdg4y5r3zarvary0c5xw7kxpjzsx"
});
console.log("Tokens transferred:", txId);
```
# unfreezeTokens
Source: https://docs.spark.money/api-reference/issuer/unfreeze-tokens
Unfreeze previously frozen tokens for a Spark address.
Unfreezes issuer's tokens for a specific wallet via the `IssuerSparkWallet`.
## Method Signature
```typescript theme={null}
// Single-token issuer (positional parameter)
async unfreezeTokens(sparkAddress: string): Promise<{
  impactedOutputIds: string[];
  impactedTokenAmount: bigint;
}>

// Multi-token issuer (object parameters)
async unfreezeTokens({
  tokenIdentifier,
  sparkAddress,
}: {
  tokenIdentifier: Bech32mTokenIdentifier;
  sparkAddress: string;
}): Promise<{
  impactedOutputIds: string[];
  impactedTokenAmount: bigint;
}>
```
## Parameters
### Single-token issuer
* `sparkAddress`: The Spark Address to unfreeze
### Multi-token issuer
* `tokenIdentifier`: The token identifier to unfreeze. Required for multi-token issuers.
* `sparkAddress`: The Spark Address to unfreeze
## Returns
* `impactedOutputIds`: Array of output IDs that were unfrozen
* `impactedTokenAmount`: Total amount of tokens unfrozen
## Example
```typescript theme={null}
// Single token issuer (simple positional parameter)
const result = await issuerWallet.unfreezeTokens(
  "spark1qw508d6qejxtdg4y5r3zarvary0c5xw7kxpjzsx"
);
console.log("Unfrozen outputs:", result.impactedOutputIds);
console.log("Unfrozen amount:", result.impactedTokenAmount);

// Multi-token issuer (specify which token with object parameters)
const tokenId = await issuerWallet.getIssuerTokenIdentifier();
const result2 = await issuerWallet.unfreezeTokens({
  tokenIdentifier: tokenId,
  sparkAddress: "spark1qw508d6qejxtdg4y5r3zarvary0c5xw7kxpjzsx"
});
```
# Overview
Source: https://docs.spark.money/api-reference/overview
Spark SDK methods for wallets, Lightning, and token issuance.
Welcome to the Spark API Reference documentation. This comprehensive guide covers all the methods available in the Spark SDKs for building Bitcoin-native applications, including wallet management, Bitcoin operations, Lightning Network integration, and token issuance.
***
## Available SDKs
***
## Getting Started
To get started with the Spark API, follow the steps below.
Install the SDK package using your package manager of choice.
For the **Wallet SDK**:
```bash npm theme={null}
npm install @buildonspark/spark-sdk
```
```bash yarn theme={null}
yarn add @buildonspark/spark-sdk
```
```bash pnpm theme={null}
pnpm add @buildonspark/spark-sdk
```
For the **Issuer SDK**:
```bash npm theme={null}
npm install @buildonspark/issuer-sdk
```
```bash yarn theme={null}
yarn add @buildonspark/issuer-sdk
```
```bash pnpm theme={null}
pnpm add @buildonspark/issuer-sdk
```
Create a wallet instance to start interacting with the Spark network.
For the **Wallet SDK**:
```ts theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";

async function main() {
  const { wallet, mnemonic } = await SparkWallet.initialize({
    options: { network: "MAINNET" },
  });

  const address = await wallet.getSparkAddress();
  console.log({ address });
}

main();
```
For the **Issuer SDK**:
```ts theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";

async function main() {
  const { wallet, mnemonic } = await IssuerSparkWallet.initialize({
    options: { network: "MAINNET" },
  });

  const address = await wallet.getSparkAddress();
  console.log({ address });
}

main();
```
Browse the method documentation using the sidebar navigation. Each method includes detailed parameters, return values, and code examples.
***
## Error Handling
The SDK throws typed errors that you can catch and handle appropriately:
```typescript theme={null}
import {
  SparkError,
  SparkValidationError,
  SparkRequestError
} from "@buildonspark/spark-sdk";

try {
  await wallet.transfer({ receiverSparkAddress: "...", amountSats: 1000 });
} catch (error) {
  if (error instanceof SparkValidationError) {
    // Invalid parameters (e.g., negative amount, invalid address format)
    console.error("Validation error:", error.message);
    console.error("Field:", error.context?.field);
  } else if (error instanceof SparkRequestError) {
    // Network/API errors
    console.error("Request failed:", error.message);
  } else if (error instanceof SparkError) {
    // General SDK errors
    console.error("SDK error:", error.message);
  }
}
```
| Error Type | When Thrown |
| ---------------------- | ------------------------------------------------------ |
| `SparkValidationError` | Invalid parameters, out-of-range values, format errors |
| `SparkRequestError` | Network failures, API errors, timeout |
| `SparkError` | General SDK errors, configuration issues |
***
## Use AI Tools with These Docs
Our documentation is optimized for AI coding assistants like Cursor, Claude, and ChatGPT.
| Resource | URL |
| :------------------------ | :----------------------------------------------------------------------- |
| Full docs (LLM-optimized) | [docs.spark.money/llms-full.txt](https://docs.spark.money/llms-full.txt) |
| Docs index | [docs.spark.money/llms.txt](https://docs.spark.money/llms.txt) |
| Any page as Markdown | Append `.md` to any URL |
Paste the `llms-full.txt` URL into your AI assistant's context for complete knowledge of Spark's APIs and best practices.
# Verify Signatures Without Wallet
Source: https://docs.spark.money/api-reference/utilities/verify-signature-readonly
Verify message signatures using only a Spark address, without initializing a wallet.
Verify message signatures without creating a wallet instance. This is useful for readonly verification scenarios where you only have access to a Spark address.
## Use Case
When building applications that need to verify signatures but don't have access to wallet credentials:
* Readonly wallet views
* Signature verification services
* Cross-chain verification (verifying Spark signatures from other chains)
* Server-side verification without sensitive key material
## How It Works
Spark addresses encode the wallet's identity public key. You can extract this key using `decodeSparkAddress` and verify signatures directly with secp256k1.
## Implementation
```typescript theme={null}
import { decodeSparkAddress } from "@buildonspark/spark-sdk";
import * as secp256k1 from "@noble/secp256k1";

// Extract identity public key from Spark address
const { identityPublicKey } = decodeSparkAddress(address, network);

// Verify the signature
const isValid = secp256k1.verify(signature, message, identityPublicKey);
```
## Parameters
* `address`: The Spark address to extract the identity public key from
* `network`: The network type (`MAINNET`, `TESTNET`, or `REGTEST`)
* `signature`: The signature to verify
* `message`: The original message that was signed
## Full Example
```typescript theme={null}
import { decodeSparkAddress, NetworkType } from "@buildonspark/spark-sdk";
import * as secp256k1 from "@noble/secp256k1";
import { sha256 } from "@noble/hashes/sha256";
async function verifySparkSignature(
address: string,
message: string,
signature: string | Uint8Array,
network: NetworkType = NetworkType.MAINNET
): Promise<boolean> {
// Decode address to get identity public key
const { identityPublicKey } = decodeSparkAddress(address, network);
// Hash the message (signatures are typically over message hashes)
const messageHash = sha256(new TextEncoder().encode(message));
// Convert signature from hex if needed
const sigBytes = typeof signature === "string"
? Uint8Array.from(Buffer.from(signature, "hex"))
: signature;
// Verify using secp256k1
return secp256k1.verify(sigBytes, messageHash, identityPublicKey);
}
// Usage
const isValid = await verifySparkSignature(
"sp1qw508d6qejxtdg4y5r3zarvary0c5xw7k...",
"Hello, Spark!",
"304402..."
);
console.log("Signature valid:", isValid);
```
## Key Points
* `decodeSparkAddress` is exported directly from the SDK and doesn't require wallet initialization
* Only needs the SDK and a secp256k1 library—no sensitive data or network calls
## Related
* [signMessageWithIdentityKey](/api-reference/wallet/sign-message-with-identity-key) - Sign messages with wallet
* [validateMessageWithIdentityKey](/api-reference/wallet/validate-message-with-identity-key) - Validate with wallet instance
* [getIdentityPublicKey](/api-reference/wallet/get-identity-public-key) - Get identity key from wallet
# Wallet API
Source: https://docs.spark.money/api-reference/wallet-overview
SparkWallet class methods for deposits, transfers, Lightning, and tokens.
The `SparkWallet` class is the primary interface for interacting with the Spark network, providing everything you need to build wallet applications on Bitcoin. It includes methods for creating and managing wallets, handling deposits and withdrawals, executing transfers, and interacting with the Lightning Network.
***
## Installation
Install the Spark SDK packages using your package manager of choice.
```bash npm theme={null}
npm install @buildonspark/spark-sdk
```
```bash yarn theme={null}
yarn add @buildonspark/spark-sdk
```
```bash pnpm theme={null}
pnpm add @buildonspark/spark-sdk
```
***
## Method Categories
# advancedDeposit
Source: https://docs.spark.money/api-reference/wallet/advanced-deposit
Deposit funds to Spark using a raw Bitcoin transaction.
Trust-minimized ("non-trusty") flow for depositing funds to the `SparkWallet`: you construct the Bitcoin transaction yourself, spending from an L1 wallet to the Spark deposit address.
## Method Signature
```typescript theme={null}
async advancedDeposit(txHex: string): Promise<TreeNode[]>
```
## Parameters
The hex string of the transaction to deposit
## Returns
The nodes resulting from the deposit
## Example
```typescript theme={null}
const nodes = await wallet.advancedDeposit("transaction-hex-string");
console.log("Deposit nodes:", nodes);
```
# batchTransferTokens
Source: https://docs.spark.money/api-reference/wallet/batch-transfer-tokens
Send tokens to multiple recipients in one transaction.
Transfers tokens to multiple recipients in a single transaction.
## Method Signature
```typescript theme={null}
async batchTransferTokens(
receiverOutputs: {
tokenIdentifier: Bech32mTokenIdentifier;
tokenAmount: bigint;
receiverSparkAddress: string;
}[],
outputSelectionStrategy?: "SMALL_FIRST" | "LARGE_FIRST",
selectedOutputs?: OutputWithPreviousTransactionData[]
): Promise<string>
```
## Parameters
* `receiverOutputs`: Array of transfer outputs, each containing:
  * `tokenIdentifier`: Bech32m token identifier (all must be the same token)
  * `tokenAmount`: Amount of tokens to send
  * `receiverSparkAddress`: Recipient's Spark address
* `outputSelectionStrategy` (optional): Strategy for selecting outputs: `"SMALL_FIRST"` or `"LARGE_FIRST"` (default: `"SMALL_FIRST"`)
* `selectedOutputs` (optional): Specific outputs to use (overrides the selection strategy)
## Returns
The transaction ID of the batch token transfer
## Example
```typescript theme={null}
const txId = await wallet.batchTransferTokens([
{
tokenIdentifier: "btkn1...",
tokenAmount: 500n,
receiverSparkAddress: "spark1abc..."
},
{
tokenIdentifier: "btkn1...",
tokenAmount: 300n,
receiverSparkAddress: "spark1def..."
}
]);
console.log("Batch transfer completed:", txId);
```
# checkTimelock
Source: https://docs.spark.money/api-reference/wallet/check-timelock
Check remaining timelock blocks for a node.
Checks the remaining timelock on a given node.
## Method Signature
```typescript theme={null}
async checkTimelock(nodeId: string): Promise<{
nodeTimelock: number;
refundTimelock: number;
}>
```
## Parameters
The ID of the node to check
## Returns
Object containing:
* `nodeTimelock`: Remaining blocks for the node transaction
* `refundTimelock`: Remaining blocks for the refund transaction
## Example
```typescript theme={null}
const timelocks = await wallet.checkTimelock("node-id-here");
console.log("Node timelock:", timelocks.nodeTimelock, "blocks");
console.log("Refund timelock:", timelocks.refundTimelock, "blocks");
```
# claimDeposit
Source: https://docs.spark.money/api-reference/wallet/claim-deposit
Claim a Bitcoin deposit from a single-use deposit address.
Claims a Bitcoin deposit made to a single-use deposit address for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async claimDeposit(txId: string): Promise<WalletLeaf[]>
```
## Parameters
The transaction ID of the Bitcoin deposit
## Returns
The wallet leaves resulting from the claim operation
## Example
```typescript theme={null}
const leaves = await wallet.claimDeposit("transaction-hash-here");
console.log("Deposit claimed, leaves:", leaves);
```
# claimHTLC
Source: https://docs.spark.money/api-reference/wallet/claim-htlc
Claim an HTLC by providing the preimage.
Claims an HTLC by providing the preimage.
## Method Signature
```typescript theme={null}
async claimHTLC(preimage: string): Promise<WalletTransfer>
```
## Parameters
The 32-byte preimage as a hex string
## Returns
The claimed transfer details
## Example
```typescript theme={null}
const transfer = await wallet.claimHTLC("abc123..."); // 32 bytes hex
console.log("HTLC claimed:", transfer.id);
```
# claimStaticDeposit
Source: https://docs.spark.money/api-reference/wallet/claim-static-deposit
Claim a deposit from a static address using a quote.
Claims a deposit made to a static deposit address using quote information from `getClaimStaticDepositQuote` for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async claimStaticDeposit({
transactionId,
creditAmountSats,
sspSignature,
outputIndex,
}: {
transactionId: string;
creditAmountSats: number;
sspSignature: string;
outputIndex?: number;
}): Promise
```
## Parameters
* `transactionId`: The Bitcoin transaction ID from the quote
* `creditAmountSats`: The amount of sats from the quote
* `sspSignature`: The SSP signature from the quote
* `outputIndex` (optional): The index of the output
## Returns
The claim result or null if the operation fails
## Example
```typescript theme={null}
const quote = await wallet.getClaimStaticDepositQuote(txId);
const claimResult = await wallet.claimStaticDeposit({
transactionId: txId,
creditAmountSats: quote.creditAmountSats,
sspSignature: quote.signature, // Note: quote returns 'signature', pass as 'sspSignature'
});
console.log("Claim result:", claimResult);
```
# claimStaticDepositWithMaxFee
Source: https://docs.spark.money/api-reference/wallet/claim-static-deposit-with-max-fee
Claim a static deposit only if fee is within your limit.
Gets a quote for a static deposit claim and automatically claims it if the fee is within the specified maximum.
## Method Signature
```typescript theme={null}
async claimStaticDepositWithMaxFee({
transactionId,
maxFee,
outputIndex,
}: {
transactionId: string;
maxFee: number;
outputIndex?: number;
}): Promise
```
## Parameters
* `transactionId`: The Bitcoin transaction ID of the deposit
* `maxFee`: Maximum fee in satoshis you're willing to pay
* `outputIndex` (optional): The output index (auto-detected if not provided)
## Returns
The claim result, or null if the fee exceeds maxFee
## Example
```typescript theme={null}
const result = await wallet.claimStaticDepositWithMaxFee({
transactionId: "abc123...",
maxFee: 500 // Will only claim if fee <= 500 sats
});
if (result) {
console.log("Deposit claimed:", result);
} else {
console.log("Fee too high, deposit not claimed");
}
```
# cleanupConnections
Source: https://docs.spark.money/api-reference/wallet/cleanup-connections
Close active connections and abort streams.
Cleans up connections and aborts any active streams for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async cleanupConnections(): Promise<void>
```
## Returns
This method returns nothing and resolves to `void`.
## Example
```typescript theme={null}
await wallet.cleanupConnections();
console.log("Connections cleaned up");
```
# createHTLC
Source: https://docs.spark.money/api-reference/wallet/create-htlc
Create a Hash Time-Locked Contract for atomic swaps.
Creates a Hash Time-Locked Contract (HTLC) for atomic swaps and conditional payments.
## Method Signature
```typescript theme={null}
async createHTLC({
receiverSparkAddress,
amountSats,
preimage,
expiryTime,
}: {
receiverSparkAddress: string;
amountSats: number;
preimage?: string; // Optional - auto-generated if not provided
expiryTime: Date;
}): Promise<WalletTransfer>
```
## Parameters
* `receiverSparkAddress`: The Spark address of the receiver
* `amountSats`: The amount in satoshis to lock
* `preimage` (optional): The preimage (32 bytes hex) for the HTLC hash lock. If not provided, a deterministic preimage is generated using [`getHTLCPreimage()`](/api-reference/wallet/get-htlc-preimage).
* `expiryTime`: The expiry time for the HTLC (must be in the future)
## Returns
The HTLC transfer details
## Example
```typescript theme={null}
// With auto-generated preimage (recommended)
const htlc = await wallet.createHTLC({
receiverSparkAddress: "spark1...",
amountSats: 10000,
expiryTime: new Date(Date.now() + 3600000) // 1 hour from now
});
console.log("HTLC created:", htlc.id);
// With custom preimage
const htlcWithPreimage = await wallet.createHTLC({
receiverSparkAddress: "spark1...",
amountSats: 10000,
preimage: "abc123def456...", // 32 bytes hex
expiryTime: new Date(Date.now() + 3600000)
});
```
# createHTLCReceiverSpendTx
Source: https://docs.spark.money/api-reference/wallet/create-htlc-receiver-spend-tx
Create a signed transaction to spend an HTLC as the receiver.
Creates a receiver spend transaction for an HTLC using the preimage.
## Method Signature
```typescript theme={null}
async createHTLCReceiverSpendTx({
htlcTx,
hash,
hashLockDestinationPubkey,
sequenceLockDestinationPubkey,
preimage,
satsPerVbyteFee,
}: {
htlcTx: string;
hash: string;
hashLockDestinationPubkey: string;
sequenceLockDestinationPubkey: string;
preimage: string;
satsPerVbyteFee: number;
}): Promise<string>
```
## Parameters
* `htlcTx`: The HTLC transaction hex
* `hash`: The hash used in the HTLC
* `hashLockDestinationPubkey`: Public key for the hash lock destination
* `sequenceLockDestinationPubkey`: Public key for the sequence lock destination
* `preimage`: The preimage to unlock the HTLC
* `satsPerVbyteFee`: Fee rate in sats per vbyte
## Returns
The signed receiver spend transaction hex
## Example
```typescript theme={null}
const txHex = await wallet.createHTLCReceiverSpendTx({
htlcTx: "02000000...",
hash: "abc123...",
hashLockDestinationPubkey: "02...",
sequenceLockDestinationPubkey: "03...",
preimage: "secret123...",
satsPerVbyteFee: 10
});
console.log("Receiver spend tx:", txHex);
```
# createHTLCSenderSpendTx
Source: https://docs.spark.money/api-reference/wallet/create-htlc-sender-spend-tx
Create a signed transaction to reclaim an expired HTLC as sender.
Creates a sender spend transaction for an HTLC after the timelock expires.
## Method Signature
```typescript theme={null}
async createHTLCSenderSpendTx({
htlcTx,
hash,
hashLockDestinationPubkey,
sequenceLockDestinationPubkey,
satsPerVbyteFee,
}: {
htlcTx: string;
hash: string;
hashLockDestinationPubkey: string;
sequenceLockDestinationPubkey: string;
satsPerVbyteFee: number;
}): Promise<string>
```
## Parameters
* `htlcTx`: The HTLC transaction hex
* `hash`: The hash used in the HTLC
* `hashLockDestinationPubkey`: Public key for the hash lock destination
* `sequenceLockDestinationPubkey`: Public key for the sequence lock destination
* `satsPerVbyteFee`: Fee rate in sats per vbyte
## Returns
The signed sender spend transaction hex
## Example
```typescript theme={null}
const txHex = await wallet.createHTLCSenderSpendTx({
htlcTx: "02000000...",
hash: "abc123...",
hashLockDestinationPubkey: "02...",
sequenceLockDestinationPubkey: "03...",
satsPerVbyteFee: 10
});
console.log("Sender spend tx:", txHex);
```
# createLightningInvoice
Source: https://docs.spark.money/api-reference/wallet/create-lightning-invoice
Create a BOLT11 Lightning invoice to receive payments.
Creates a Lightning invoice for receiving payments via the `SparkWallet`.
## Method Signature
```typescript theme={null}
type CreateLightningInvoiceParams = {
amountSats: number;
memo?: string;
expirySeconds?: number;
includeSparkAddress?: boolean;
receiverIdentityPubkey?: string;
descriptionHash?: string;
};
async createLightningInvoice(params: CreateLightningInvoiceParams): Promise<LightningReceiveRequest>;
```
## Parameters
* `amountSats`: Amount in satoshis to receive. Use `0` for zero-amount invoices. Must be a safe integer (less than 2^53).
* `memo` (optional): Memo/description for the invoice (max 639 characters). Cannot be used together with `descriptionHash`.
* `expirySeconds` (optional): Invoice expiry time in seconds (default: 2,592,000 = 30 days)
* `includeSparkAddress` (optional): Whether to embed the Spark address in the invoice fallback field. **Note:** If the payer uses the fallback address instead of Lightning, the payment cannot be correlated to this invoice—it appears as a separate Spark transfer.
* `receiverIdentityPubkey` (optional): 33-byte compressed identity pubkey for generating invoices for other Spark users
* `descriptionHash` (optional): SHA256 hash of the description for the BOLT11 description\_hash field. Cannot be used together with `memo`.
## Returns
The Lightning receive request object containing:
* `id`: Unique identifier for the request
* `invoice`: Invoice object with `encodedInvoice`, `paymentHash`, `amount`, etc.
* `status`: Request status
* `createdAt`, `updatedAt`: Timestamps
Access the BOLT11 invoice string via `request.invoice.encodedInvoice`.
## Example
```typescript theme={null}
const request = await wallet.createLightningInvoice({
amountSats: 1000,
memo: "Payment for services",
expirySeconds: 3600 // 1 hour
});
console.log("Lightning invoice:", request.invoice.encodedInvoice);
console.log("Request ID:", request.id);
console.log("Payment hash:", request.invoice.paymentHash);
```
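When the description is long or should stay off the invoice, you can pass its SHA256 via `descriptionHash` instead of `memo`. A minimal sketch of computing that hash with Node's built-in `crypto` module (the wallet call is shown as a comment and mirrors the example above):

```typescript
import { createHash } from "node:crypto";

// Compute the BOLT11 description_hash (hex-encoded SHA256) for a description.
function descriptionHashFor(description: string): string {
  return createHash("sha256").update(description, "utf8").digest("hex");
}

// Usage (sketch) — descriptionHash and memo are mutually exclusive:
// const request = await wallet.createLightningInvoice({
//   amountSats: 1000,
//   descriptionHash: descriptionHashFor("A much longer description of the purchase..."),
// });
```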
# createSatsInvoice
Source: https://docs.spark.money/api-reference/wallet/create-sats-invoice
Create a Spark invoice to receive sats from another Spark wallet.
Creates a Spark invoice for receiving a sats payment on Spark.
## Method Signature
```typescript theme={null}
async createSatsInvoice({
amount,
memo,
senderSparkAddress,
expiryTime,
}: {
amount?: number;
memo?: string;
senderSparkAddress?: SparkAddressFormat;
expiryTime?: Date;
}): Promise<SparkAddressFormat>
```
## Parameters
* `amount` (optional): The amount of sats to receive (optional for open invoices). Max: 2,100,000,000,000,000 sats (21M BTC).
* `memo` (optional): Memo/description for the payment
* `senderSparkAddress` (optional): Spark address of the expected sender
* `expiryTime` (optional): Expiry time for the invoice
## Returns
A Spark address/invoice that can be paid by another Spark wallet
## Example
```typescript theme={null}
// Create an invoice for 1000 sats
const invoice = await wallet.createSatsInvoice({
amount: 1000,
memo: "Payment for coffee"
});
console.log("Spark invoice:", invoice);
// Create an open invoice (no amount specified)
const openInvoice = await wallet.createSatsInvoice({
memo: "Tip jar"
});
```
# createTokensInvoice
Source: https://docs.spark.money/api-reference/wallet/create-tokens-invoice
Create a Spark invoice to receive tokens.
Creates a Spark invoice for receiving a token payment on Spark.
## Method Signature
```typescript theme={null}
async createTokensInvoice({
amount,
tokenIdentifier,
memo,
senderSparkAddress,
expiryTime,
}: {
tokenIdentifier?: Bech32mTokenIdentifier;
amount?: bigint;
memo?: string;
senderSparkAddress?: SparkAddressFormat;
expiryTime?: Date;
}): Promise<SparkAddressFormat>
```
## Parameters
* `tokenIdentifier` (optional): The Bech32m token identifier (e.g., `btkn1...`)
* `amount` (optional): The amount of tokens to receive. Max: 2^128 - 1.
* `memo` (optional): Memo/description for the payment
* `senderSparkAddress` (optional): Spark address of the expected sender
* `expiryTime` (optional): Expiry time for the invoice
## Returns
A Spark address/invoice that can be paid by another Spark wallet
## Example
```typescript theme={null}
const invoice = await wallet.createTokensInvoice({
tokenIdentifier: "btkn1...",
amount: 1000n,
memo: "Token payment"
});
console.log("Token invoice:", invoice);
```
# EventEmitter
Source: https://docs.spark.money/api-reference/wallet/event-emitter
Listen for wallet events like transfers and deposits.
`SparkWallet` extends `EventEmitter`, so it inherits the following methods for handling events.
## Available Events
| Event | Callback Signature | Description |
| ----------------------- | -------------------------------------------------------------------------------- | --------------------------------------------------------------------------------------------------------------------------- |
| `"transfer:claimed"` | `(transferId: string, updatedBalance: bigint) => void` | Emitted when an **incoming** transfer is claimed. Does NOT fire for outgoing Lightning payments. |
| `"deposit:confirmed"` | `(depositId: string, updatedBalance: bigint) => void` | Emitted when a pending L1 deposit becomes spendable (after 3 confirmations). Only fires if you claimed before confirmation. |
| `"stream:connected"` | `() => void` | Emitted when the event stream connects |
| `"stream:disconnected"` | `(reason: string) => void` | Emitted when the stream disconnects after max retries |
| `"stream:reconnecting"` | `(attempt: number, maxAttempts: number, delayMs: number, error: string) => void` | Emitted when attempting to reconnect |
The `updatedBalance` parameter is a `bigint` representing the wallet's new total balance in satoshis after the event.
Events are only emitted for **incoming** funds (transfers received, deposits confirmed). For outgoing operations (Lightning sends, withdrawals), poll the status using `getLightningSendRequest()` or `getCoopExitRequest()`.
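Since no events fire for outgoing operations, a small polling helper can wrap those status checks. A sketch (the terminal-status check in the commented usage is illustrative — inspect the actual request object returned by your SDK version):

```typescript
// Poll an async status check until a predicate passes, or give up.
async function pollUntil<T>(
  check: () => Promise<T>,
  done: (value: T) => boolean,
  { intervalMs = 2000, maxAttempts = 30 } = {},
): Promise<T> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const value = await check();
    if (done(value)) return value;
    // Wait before the next status check.
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error(`pollUntil: gave up after ${maxAttempts} attempts`);
}

// Usage (sketch): wait for an outgoing Lightning payment to leave pending state.
// const request = await pollUntil(
//   () => wallet.getLightningSendRequest(requestId),
//   (r) => r !== null && r.status !== "PENDING",
// );
```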
***
## on(event: string, listener: Function)
Adds a listener for the specified event.
```typescript theme={null}
on(event: keyof SparkWalletEvents, listener: Function): this
```
## Parameters
* `event`: The event name to listen for
* `listener`: The callback function to execute when the event is emitted
## Returns
The SparkWallet instance for chaining
## Example
```typescript theme={null}
wallet.on("transfer:claimed", (transferId, updatedBalance) => {
console.log(`Transfer ${transferId} claimed. New balance: ${updatedBalance}`);
});
```
***
## once(event: string, listener: Function)
Adds a one-time listener for the specified event.
```typescript theme={null}
once(event: keyof SparkWalletEvents, listener: Function): this
```
## Parameters
* `event`: The event name to listen for
* `listener`: The callback function to execute when the event is emitted
## Returns
The SparkWallet instance for chaining
## Example
```typescript theme={null}
wallet.once("deposit:confirmed", (depositId, updatedBalance) => {
console.log(`Deposit ${depositId} confirmed! New balance: ${updatedBalance}`);
});
```
***
## off(event: string, listener: Function)
Removes the specified listener from the specified event.
```typescript theme={null}
off(event: keyof SparkWalletEvents, listener: Function): this
```
## Parameters
* `event`: The event name
* `listener`: The callback function to remove
## Returns
The SparkWallet instance for chaining
## Example
```typescript theme={null}
const listener = (transferId) => console.log(`Transfer: ${transferId}`);
wallet.on("transfer:claimed", listener);
// Later, remove the listener
wallet.off("transfer:claimed", listener);
```
# fulfillSparkInvoice
Source: https://docs.spark.money/api-reference/wallet/fulfill-spark-invoice
Pay one or more Spark invoices.
Fulfills one or more Spark invoices by paying them.
## Method Signature
```typescript theme={null}
async fulfillSparkInvoice(
sparkInvoices: {
invoice: SparkAddressFormat;
amount?: bigint;
}[]
): Promise<FulfillSparkInvoiceResponse>
```
## Parameters
Array of invoices to fulfill:
* `invoice`: The Spark invoice to pay (must use `spark1...` prefix)
* `amount`: Amount to pay (required for invoices without encoded amount)
## Returns
Response containing results for all invoice payment attempts
```typescript theme={null}
interface FulfillSparkInvoiceResponse {
// Successfully paid sats invoices
satsTransactionSuccess: {
invoice: SparkAddressFormat;
transferResponse: WalletTransfer;
}[];
// Failed sats invoice payments
satsTransactionErrors: {
invoice: SparkAddressFormat;
error: Error;
}[];
// Successfully paid token invoices
tokenTransactionSuccess: {
tokenIdentifier: Bech32mTokenIdentifier;
invoices: SparkAddressFormat[];
txid: string;
}[];
// Failed token invoice payments
tokenTransactionErrors: {
tokenIdentifier: Bech32mTokenIdentifier;
invoices: SparkAddressFormat[];
error: Error;
}[];
// Invoices that failed validation before payment
invalidInvoices: {
invoice: SparkAddressFormat;
error: Error;
}[];
}
```
## Example
```typescript theme={null}
// Pay a single invoice
const result = await wallet.fulfillSparkInvoice([
{ invoice: "spark1..." }
]);
// Pay multiple invoices with amounts
const batchResult = await wallet.fulfillSparkInvoice([
{ invoice: invoiceWithNoAmount, amount: 1000n },
{ invoice: invoiceWithEncodedAmount }
]);
// Check results
console.log("Successful:", result.satsTransactionSuccess);
console.log("Errors:", result.satsTransactionErrors);
```
# getBalance
Source: https://docs.spark.money/api-reference/wallet/get-balance
Get wallet balance in sats and token balances with metadata.
Gets the current balance of the `SparkWallet`, including Bitcoin balance and token balances.
## Method Signature
```typescript theme={null}
async getBalance(): Promise<{
balance: bigint;
tokenBalances: TokenBalanceMap;
}>
// TokenBalanceMap = Map<Bech32mTokenIdentifier, { balance: bigint; tokenMetadata: UserTokenMetadata }>
interface UserTokenMetadata {
rawTokenIdentifier: Uint8Array; // Binary token identifier used to encode the bech32m identifier
tokenPublicKey: string; // Issuer's public key
tokenName: string;
tokenTicker: string;
decimals: number; // Number of decimal places
maxSupply: bigint;
extraMetadata?: Uint8Array; // Arbitrary bytes set by the issuer
}
```
## Returns
* `balance`: The wallet's current balance in satoshis as a `bigint`. Use `Number(balance)` for display or arithmetic with regular numbers.
* `tokenBalances`: Map of Bech32m token identifiers to token balance and metadata objects
## Example
```typescript theme={null}
const { balance, tokenBalances } = await wallet.getBalance();
console.log("Balance:", balance);
// Iterate over token balances
for (const [tokenId, info] of tokenBalances) {
console.log(`Token ${tokenId}: ${info.balance}`);
console.log(` Name: ${info.tokenMetadata.tokenName}`);
console.log(` Ticker: ${info.tokenMetadata.tokenTicker}`);
console.log(` Decimals: ${info.tokenMetadata.decimals}`);
// Check for extra metadata
if (info.tokenMetadata.extraMetadata) {
console.log(` Extra metadata: ${info.tokenMetadata.extraMetadata.length} bytes`);
}
}
```
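Token balances are raw `bigint` amounts; the `decimals` field in the metadata converts them to a human-readable value. A minimal formatting helper (a sketch, not part of the SDK — assumes non-negative balances):

```typescript
// Format a raw bigint token amount using the token's decimals metadata.
function formatTokenAmount(raw: bigint, decimals: number): string {
  if (decimals === 0) return raw.toString();
  const base = 10n ** BigInt(decimals);
  const whole = raw / base;
  // Pad the fractional part to `decimals` digits, then trim trailing zeros.
  const frac = (raw % base).toString().padStart(decimals, "0").replace(/0+$/, "");
  return frac.length > 0 ? `${whole}.${frac}` : whole.toString();
}

// Usage (sketch), continuing the iteration above:
// for (const [tokenId, info] of tokenBalances) {
//   const { tokenTicker, decimals } = info.tokenMetadata;
//   console.log(`${tokenTicker}: ${formatTokenAmount(info.balance, decimals)}`);
// }
```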
# getClaimStaticDepositQuote
Source: https://docs.spark.money/api-reference/wallet/get-claim-static-deposit-quote
Get a quote with fee and SSP signature for claiming a static deposit.
Gets a quote for claiming a deposit made to a static deposit address for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getClaimStaticDepositQuote(
transactionId: string,
outputIndex?: number
): Promise
```
## Parameters
* `transactionId`: The Bitcoin transaction ID of the deposit transaction
* `outputIndex` (optional): The index of the output (auto-detected if not provided)
## Returns
Quote object containing:
* `creditAmountSats`: The amount in satoshis to claim
* `signature`: The SSP signature required for claiming
## Example
```typescript theme={null}
const quote = await wallet.getClaimStaticDepositQuote("abc123...");
console.log("Credit amount:", quote.creditAmountSats);
console.log("Signature:", quote.signature);
```
# getCoopExitRequest
Source: https://docs.spark.money/api-reference/wallet/get-coop-exit-request
Get the status of a withdrawal request by ID.
Gets a cooperative exit request by ID for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getCoopExitRequest(id: string): Promise<CoopExitRequest | null>
```
## Parameters
The withdrawal request ID returned by `withdraw()`. **Not the on-chain transaction ID.**
Format: `SparkCoopExitRequest:xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx`
## Returns
The cooperative exit request details or null if not found
The `transfer` field in the response will be empty until the SSP has claimed the inbound Spark transfer. Poll until populated if you need the on-chain transaction details.
## Example
```typescript theme={null}
// Use the ID returned from withdraw(), not the on-chain txid
const exitRequest = await wallet.getCoopExitRequest(withdrawal.id);
if (exitRequest) {
console.log("Status:", exitRequest.status);
console.log("On-chain txid:", exitRequest.coopExitTxid);
}
```
# getHTLCPreimage
Source: https://docs.spark.money/api-reference/wallet/get-htlc-preimage
Generate a deterministic HTLC preimage from a transfer ID.
Generates a deterministic HTLC preimage from a transfer ID using the wallet's keys.
## Method Signature
```typescript theme={null}
async getHTLCPreimage(transferID: string): Promise<Uint8Array>
```
## Parameters
The transfer ID to generate a preimage for
## Returns
The 32-byte preimage
## Example
```typescript theme={null}
const preimage = await wallet.getHTLCPreimage("transfer-id");
console.log("Preimage:", bytesToHex(preimage));
```
# getIdentityPublicKey
Source: https://docs.spark.money/api-reference/wallet/get-identity-public-key
Get the wallet's identity public key as hex string.
Gets the identity public key of the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getIdentityPublicKey(): Promise<string>
```
## Returns
The identity public key as a hex string
## Example
```typescript theme={null}
const identityPubKey = await wallet.getIdentityPublicKey();
console.log("Identity Public Key:", identityPubKey);
```
# getLeaves
Source: https://docs.spark.money/api-reference/wallet/get-leaves
Get all wallet leaves (UTXOs) as TreeNode objects.
Gets all available leaves (UTXOs) owned by the wallet.
## Method Signature
```typescript theme={null}
async getLeaves(isBalanceCheck?: boolean): Promise<TreeNode[]>
```
## Parameters
If `true`, only queries the coordinator for faster response (default: `false`)
## Returns
Array of tree nodes representing wallet leaves
## Example
```typescript theme={null}
const leaves = await wallet.getLeaves();
console.log("Leaves:", leaves.length);
console.log("Total value:", leaves.reduce((sum, l) => sum + l.value, 0));
```
# getLightningReceiveRequest
Source: https://docs.spark.money/api-reference/wallet/get-lightning-receive-request
Check status of a Lightning invoice.
Gets the status of a Lightning receive request (invoice) for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getLightningReceiveRequest(id: string): Promise<LightningReceiveRequest | null>
```
## Parameters
The ID of the invoice to check
## Returns
The Lightning receive request details or null if not found
## Example
```typescript theme={null}
const receiveRequest = await wallet.getLightningReceiveRequest("request-id");
if (receiveRequest) {
console.log("Invoice status:", receiveRequest.status);
}
```
# getLightningSendFeeEstimate
Source: https://docs.spark.money/api-reference/wallet/get-lightning-send-fee-estimate
Estimate the fee for paying a Lightning invoice.
Estimates the fee for sending a Lightning payment via the `SparkWallet`.
## Method Signature
```typescript theme={null}
interface LightningSendFeeEstimateInput {
encodedInvoice: string;
amountSats?: number;
}
async getLightningSendFeeEstimate(params: LightningSendFeeEstimateInput): Promise<number>
```
## Parameters
* `encodedInvoice`: The BOLT11-encoded Lightning invoice
* `amountSats` (optional): Amount in satoshis to send (required for zero-amount invoices)
## Returns
The estimated fee in satoshis
## Example
```typescript theme={null}
const feeEstimate = await wallet.getLightningSendFeeEstimate({
encodedInvoice: "lnbc..."
});
console.log("Estimated fee:", feeEstimate, "sats");
```
# getLightningSendRequest
Source: https://docs.spark.money/api-reference/wallet/get-lightning-send-request
Check status of a Lightning payment.
Gets the status of a Lightning send request for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getLightningSendRequest(id: string): Promise<LightningSendRequest | null>
```
## Parameters
The ID of the Lightning send request to check
## Returns
The Lightning send request details or null if not found
## Example
```typescript theme={null}
const sendRequest = await wallet.getLightningSendRequest("request-id");
if (sendRequest) {
console.log("Payment status:", sendRequest.status);
}
```
# getSingleUseDepositAddress
Source: https://docs.spark.money/api-reference/wallet/get-single-use-deposit-address
Generate a one-time Bitcoin deposit address (bc1p...).
Generates a new single-use deposit address for receiving Bitcoin funds. Returns a Bitcoin address (not a Spark Address) that starts with "bc1p".
**CRITICAL: Single-use addresses are consumed after the first deposit.** Any subsequent deposits to a used address **will be permanently lost**. For most use cases, prefer [`getStaticDepositAddress()`](/api-reference/wallet/get-static-deposit-address) which is reusable and safer.
## Method Signature
```typescript theme={null}
async getSingleUseDepositAddress(): Promise<string>
```
## Returns
A Bitcoin address for depositing funds
## Example
```typescript theme={null}
const depositAddress = await wallet.getSingleUseDepositAddress();
console.log("Deposit Address:", depositAddress);
```
# getSparkAddress
Source: https://docs.spark.money/api-reference/wallet/get-spark-address
Get the wallet's Spark address for receiving payments.
Gets the Spark Address of the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getSparkAddress(): Promise<SparkAddressFormat>
```
## Returns
The Spark Address
## Example
```typescript theme={null}
const sparkAddress = await wallet.getSparkAddress();
console.log("Spark Address:", sparkAddress);
```
# getStaticDepositAddress
Source: https://docs.spark.money/api-reference/wallet/get-static-deposit-address
Get a reusable Bitcoin deposit address for the wallet.
Returns a reusable Bitcoin address for the `SparkWallet` that can be used to receive deposits multiple times. Currently only one static deposit address is supported per wallet.
If a static deposit address already exists for this wallet, calling this method again will return the existing address (not create a new one).
**Do not call this method concurrently.** Concurrent calls before an address exists can create multiple addresses. Always await the first call before making another.
## Method Signature
```typescript theme={null}
async getStaticDepositAddress(): Promise<string>
```
## Returns
A reusable Bitcoin address for depositing funds
## Example
```typescript theme={null}
const staticAddress = await wallet.getStaticDepositAddress();
console.log("Static deposit address:", staticAddress);
// Calling again returns the same address
const sameAddress = await wallet.getStaticDepositAddress();
console.log(staticAddress === sameAddress); // true
```
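One way to guard against the concurrent-call pitfall above is to memoize the in-flight promise so all callers share a single underlying request. A sketch of that pattern (the wrapped factory, e.g. `() => wallet.getStaticDepositAddress()`, is an assumption — any zero-argument async function works):

```typescript
// Wrap an async factory so concurrent callers share one in-flight call.
function singleFlight<T>(factory: () => Promise<T>): () => Promise<T> {
  let inFlight: Promise<T> | undefined;
  return () => {
    if (inFlight) return inFlight;
    const p = factory().catch((err) => {
      inFlight = undefined; // clear the cache on failure so a later call can retry
      throw err;
    });
    inFlight = p;
    return p;
  };
}

// Usage (sketch): both callers resolve to the same address,
// and only one underlying call is made.
// const getDepositAddress = singleFlight(() => wallet.getStaticDepositAddress());
// const [a, b] = await Promise.all([getDepositAddress(), getDepositAddress()]);
```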
# getSwapFeeEstimate
Source: https://docs.spark.money/api-reference/wallet/get-swap-fee-estimate
Estimate fees for a leaf swap operation.
Gets the estimated fee for a swap of leaves for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getSwapFeeEstimate(amountSats: number): Promise
```
## Parameters
The amount of sats to swap
## Returns
The estimated fee for the swap
## Example
```typescript theme={null}
const feeEstimate = await wallet.getSwapFeeEstimate(10000);
console.log("Swap fee estimate:", feeEstimate);
```
# getTokenL1Address
Source: https://docs.spark.money/api-reference/wallet/get-token-l1-address
Get the L1 Bitcoin address derived from identity key for tokens.
Gets the Layer 1 Bitcoin address derived from the wallet's identity key for token operations.
## Method Signature
```typescript theme={null}
async getTokenL1Address(): Promise<string>
```
## Returns
A P2WPKH Bitcoin address derived from the wallet's identity public key
## Example
```typescript theme={null}
const l1Address = await wallet.getTokenL1Address();
console.log("Token L1 address:", l1Address);
```
# getTokenOutputStats
Source: https://docs.spark.money/api-reference/wallet/get-token-output-stats
Get token output statistics including count and total amount.
Gets statistics about token outputs in the wallet.
## Method Signature
```typescript theme={null}
async getTokenOutputStats(
tokenIdentifier?: Bech32mTokenIdentifier
): Promise<{
outputCount: number;
totalAmount: bigint;
}>
```
## Parameters
Optional token identifier to get stats for. If not provided, returns stats for all tokens.
## Returns
Number of token outputs
Total amount across all outputs
## Example
```typescript theme={null}
// Get stats for a specific token
const stats = await wallet.getTokenOutputStats("btkn1...");
console.log("Output count:", stats.outputCount);
console.log("Total amount:", stats.totalAmount);
// Get stats for all tokens
const allStats = await wallet.getTokenOutputStats();
console.log("All token outputs:", allStats.outputCount);
```
# getTransfer
Source: https://docs.spark.money/api-reference/wallet/get-transfer
Get a specific transfer by ID.
Gets a specific transfer by ID that the wallet is a participant of.
This returns Spark transfer data only. For Lightning-related transfer information, use [`getTransferFromSsp()`](/api-reference/wallet/get-transfer-from-ssp).
## Method Signature
```typescript theme={null}
async getTransfer(id: string): Promise<WalletTransfer | undefined>
```
## Parameters
The transfer ID to query
## Returns
The transfer details, or undefined if not found
### WalletTransfer Fields
| Field | Type | Description |
| --------------------------- | ------------------- | ----------------------------------------------------------------------------------------- |
| `id` | `string` | Unique transfer identifier |
| `status` | `string` | Transfer status (`TRANSFER_STATUS_SENDER_KEY_TWEAKED`, `TRANSFER_STATUS_COMPLETED`, etc.) |
| `totalValue` | `number` | Amount in satoshis |
| `senderIdentityPublicKey` | `string` | Sender's identity public key |
| `receiverIdentityPublicKey` | `string` | Receiver's identity public key |
| `transferDirection` | `string` | `INCOMING` or `OUTGOING` relative to this wallet |
| `createdTime` | `Date \| undefined` | When the transfer was created |
| `updatedTime` | `Date \| undefined` | When the transfer was last updated |
| `expiryTime` | `Date \| undefined` | When the transfer expires |
## Example
```typescript theme={null}
const transfer = await wallet.getTransfer("transfer-id-here");
if (transfer) {
console.log("Transfer status:", transfer.status);
console.log("Amount:", transfer.totalValue);
}
```
# getTransferFromSsp
Source: https://docs.spark.money/api-reference/wallet/get-transfer-from-ssp
Get transfer details from SSP, including Lightning info.
Gets a transfer from the SSP (Spark Service Provider), including Lightning-related transfer information.
## Method Signature
```typescript theme={null}
async getTransferFromSsp(id: string): Promise
```
## Parameters
The transfer ID to query
## Returns
The transfer with user request details, or undefined if not found
## Example
```typescript theme={null}
const transfer = await wallet.getTransferFromSsp("transfer-id");
if (transfer) {
console.log("Transfer:", transfer);
console.log("User request:", transfer.userRequest);
}
```
# getTransfers
Source: https://docs.spark.money/api-reference/wallet/get-transfers
Get transfer history with pagination and date filters.
Gets all transfers for the `SparkWallet`.
`getTransfers()` includes Spark transfers, Lightning sends/receives, and cooperative exits (L1 withdrawals). For token transaction details (e.g., sender address), use [`queryTokenTransactions()`](/api-reference/wallet/query-token-transactions) instead.
## Method Signature
```typescript theme={null}
async getTransfers(
limit: number = 20,
offset: number = 0,
createdAfter?: Date,
createdBefore?: Date
): Promise<{
transfers: WalletTransfer[];
offset: number;
}>
```
## Parameters
Maximum number of transfers to return (default: 20)
Offset for pagination (default: 0)
Only return transfers created after this date (mutually exclusive with `createdBefore`)
Only return transfers created before this date (mutually exclusive with `createdAfter`)
`createdAfter` and `createdBefore` are mutually exclusive. Providing both will throw an error.
## Returns
Array of transfer objects
The offset used for this request
## Example
```typescript theme={null}
// Basic pagination
const transfers = await wallet.getTransfers(10, 0);
console.log("Transfers:", transfers.transfers);
console.log("Offset:", transfers.offset);
// Get transfers from the last 24 hours
const yesterday = new Date(Date.now() - 24 * 60 * 60 * 1000);
const recentTransfers = await wallet.getTransfers(50, 0, yesterday);
// Get transfers before a specific date
const cutoffDate = new Date("2024-01-01");
const oldTransfers = await wallet.getTransfers(50, 0, undefined, cutoffDate);
```
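To walk the full history, you can page with `limit` and `offset`. The sketch below is not part of the SDK; it assumes you can stop once a page returns fewer transfers than the requested limit.

```typescript
// Sketch only: page through the entire transfer history.
// The structural type stands in for SparkWallet's getTransfers surface.
async function getAllTransfers(wallet: {
  getTransfers(limit: number, offset: number): Promise<{ transfers: unknown[]; offset: number }>;
}): Promise<unknown[]> {
  const all: unknown[] = [];
  const limit = 20;
  let offset = 0;
  while (true) {
    const page = await wallet.getTransfers(limit, offset);
    all.push(...page.transfers);
    // A short page means we've reached the end of the history.
    if (page.transfers.length < limit) break;
    offset += limit;
  }
  return all;
}
```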
# getUnusedDepositAddresses
Source: https://docs.spark.money/api-reference/wallet/get-unused-deposit-addresses
Get all unused single-use deposit addresses.
Gets all unused single-use deposit addresses for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async getUnusedDepositAddresses(): Promise<string[]>
```
## Returns
Array of unused Bitcoin deposit addresses
## Example
```typescript theme={null}
const unusedAddresses = await wallet.getUnusedDepositAddresses();
console.log("Unused deposit addresses:", unusedAddresses);
```
# getUserRequests
Source: https://docs.spark.money/api-reference/wallet/get-user-requests
Get user requests from the SSP.
Gets user requests from the SSP.
## Method Signature
```typescript theme={null}
async getUserRequests(
params?: GetUserRequestsParams
): Promise
```
## Parameters
Optional filtering parameters
## Returns
User requests connection, or null if not available
## Example
```typescript theme={null}
const requests = await wallet.getUserRequests();
if (requests) {
console.log("User requests:", requests);
}
```
# getUtxosForDepositAddress
Source: https://docs.spark.money/api-reference/wallet/get-utxos-for-deposit-address
Get confirmed UTXOs for a deposit address.
Returns confirmed UTXOs for a given Spark deposit address.
## Method Signature
```typescript theme={null}
async getUtxosForDepositAddress(
depositAddress: string,
limit?: number,
offset?: number,
excludeClaimed?: boolean
): Promise<{ txid: string; vout: number }[]>
```
## Parameters
The deposit address to query
Maximum number of UTXOs to return (default: 100)
Pagination offset (default: 0)
Whether to exclude already claimed UTXOs (default: false)
## Returns
Array of UTXO objects with `txid` and `vout`
## Example
```typescript theme={null}
const utxos = await wallet.getUtxosForDepositAddress(
"bcrt1p...",
100,
0,
true
);
console.log("UTXOs:", utxos);
```
# getWalletSettings
Source: https://docs.spark.money/api-reference/wallet/get-wallet-settings
Get current wallet settings including privacy mode.
Gets the current wallet settings.
## Method Signature
```typescript theme={null}
async getWalletSettings(): Promise
```
## Returns
Current wallet settings, or undefined if not available
## Example
```typescript theme={null}
const settings = await wallet.getWalletSettings();
if (settings) {
console.log("Privacy enabled:", settings.privateEnabled);
}
```
# getWithdrawalFeeQuote
Source: https://docs.spark.money/api-reference/wallet/get-withdrawal-fee-quote
Get fee quote for on-chain withdrawal at fast, medium, or slow speeds.
Gets a fee quote for a cooperative exit (on-chain withdrawal) for the `SparkWallet`. The quote includes options for different speeds and an expiry time.
## Method Signature
```typescript theme={null}
async getWithdrawalFeeQuote({
amountSats,
withdrawalAddress,
}: {
amountSats: number;
withdrawalAddress: string;
}): Promise<CoopExitFeeQuote | null>
```
## Parameters
The amount in satoshis to withdraw
The Bitcoin address where the funds should be sent
## Returns
A fee quote for the withdrawal, or null if not available
### CoopExitFeeQuote Fields
| Field | Type | Description |
| ---------------------- | ---------------- | ---------------------------------------------- |
| `id` | `string` | Quote ID (use as `feeQuoteId` in `withdraw()`) |
| `expiresAt` | `string` | When this quote expires |
| `userFeeFast` | `CurrencyAmount` | Service fee for fast exit |
| `userFeeMedium` | `CurrencyAmount` | Service fee for medium exit |
| `userFeeSlow` | `CurrencyAmount` | Service fee for slow exit |
| `l1BroadcastFeeFast` | `CurrencyAmount` | L1 tx fee for fast exit |
| `l1BroadcastFeeMedium` | `CurrencyAmount` | L1 tx fee for medium exit |
| `l1BroadcastFeeSlow` | `CurrencyAmount` | L1 tx fee for slow exit |
`CurrencyAmount` has `originalValue` (number in satoshis) and `originalUnit` fields.
## Example
```typescript theme={null}
const feeQuote = await wallet.getWithdrawalFeeQuote({
amountSats: 17000,
withdrawalAddress: "bcrt1p..."
});
if (feeQuote) {
console.log("Quote expires:", feeQuote.expiresAt);
console.log("Fast fee:", feeQuote.userFeeFast.originalValue + feeQuote.l1BroadcastFeeFast.originalValue, "sats");
console.log("Medium fee:", feeQuote.userFeeMedium.originalValue + feeQuote.l1BroadcastFeeMedium.originalValue, "sats");
console.log("Slow fee:", feeQuote.userFeeSlow.originalValue + feeQuote.l1BroadcastFeeSlow.originalValue, "sats");
}
```
# initialize
Source: https://docs.spark.money/api-reference/wallet/initialize
Create or restore a SparkWallet instance from mnemonic or seed.
Creates and initializes a new `SparkWallet` instance.
## Method Signature
```typescript theme={null}
interface SparkWalletProps {
mnemonicOrSeed?: Uint8Array | string;
accountNumber?: number;
signer?: SparkSigner;
options?: ConfigOptions;
}
static async initialize(props: SparkWalletProps): Promise<{
wallet: SparkWallet;
mnemonic?: string;
}>
```
## Parameters
BIP-39 mnemonic phrase or raw seed
Number used to generate multiple identity keys from the same mnemonic
Custom signer implementation for advanced use cases
Wallet configuration options including network selection
```typescript theme={null}
interface ConfigOptions {
network?: "MAINNET" | "REGTEST"; // Required for most use cases
// Advanced options (rarely needed):
optimizationOptions?: {
auto?: boolean; // Auto-optimize leaves on sync (default: true)
multiplicity?: number; // Optimization level 0-5 (default: 1)
};
tokenOptimizationOptions?: {
enabled?: boolean; // Enable token output consolidation (default: true)
intervalMs?: number; // Optimization interval (default: 300000 = 5 min)
minOutputsThreshold?: number; // Min outputs before optimizing (default: 50)
};
events?: Partial; // Pre-register event handlers at init
}
```
## Leaf Optimization
Leaf optimization pre-arranges your wallet's internal structure to enable faster transfers. Without optimization, transfers may require a swap with the SSP (Spark Service Provider), adding latency. With optimization, transfers complete in \~5 seconds.
### How It Works
Spark creates **power-of-2 denomination leaves** (1, 2, 4, 8, 16... sats). With one leaf of each denomination, you can transfer any amount without needing an SSP swap. With multiple leaves of each denomination, you can make multiple transfers without swapping.
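This mirrors binary representation: each set bit of the amount selects one leaf. As an illustration (not SDK code):

```typescript
// Illustration only: decompose an amount into power-of-2 "leaf" denominations,
// the way a binary representation selects one leaf per set bit.
function decomposeIntoLeaves(amountSats: number): number[] {
  const leaves: number[] = [];
  let denomination = 1;
  let remaining = amountSats;
  while (remaining > 0) {
    if (remaining & 1) leaves.push(denomination); // this bit is set: use this leaf
    remaining >>>= 1;
    denomination *= 2;
  }
  return leaves;
}

// 13 sats = 8 + 4 + 1, so three leaves cover it exactly.
console.log(decomposeIntoLeaves(13)); // [1, 4, 8]
```

With one leaf of every denomination available, any amount up to the total can be assembled this way without swapping.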
### Multiplicity Levels
The `multiplicity` setting controls how aggressively to optimize:
| Level | Behavior | Use Case |
| ------- | -------------------------------- | -------------------------------------------- |
| **0** | No optimization | Testing only |
| **1** | 1 leaf per denomination | Likely 1 fast transfer before needing a swap |
| **2-4** | Multiple leaves per denomination | Multiple fast transfers |
| **5** | Maximum optimization | Likely 5+ fast transfers without any swaps |
### The Tradeoff
Higher multiplicity = faster transfers, but smaller individual leaves. Leaves under **16,384 sats cannot be unilaterally exited** (fees would exceed their value). If unilateral exit capability is critical for your users, use a lower multiplicity or larger balances.
For most consumer wallets, fast transfer speeds for 100% of users outweigh the unilateral exit costs for a small fraction of users.
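As illustrative arithmetic (not SDK behavior), a full set of denominations `1` through `2^k` sats holds `2^(k+1) - 1` sats per multiplicity level, which gives a rough sense of the balance each level implies:

```typescript
// Illustration only: sats needed to hold `multiplicity` leaves of each
// power-of-2 denomination from 1 up to 2^maxExponent sats.
function balanceForFullDenominations(multiplicity: number, maxExponent: number): number {
  // Geometric series: 1 + 2 + 4 + ... + 2^maxExponent = 2^(maxExponent + 1) - 1
  return multiplicity * (2 ** (maxExponent + 1) - 1);
}

// One full set of denominations up to 8,192-sat leaves:
console.log(balanceForFullDenominations(1, 13)); // 16383
// Five full sets (multiplicity 5) of the same range:
console.log(balanceForFullDenominations(5, 13)); // 81915
```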
### Auto vs Manual Mode
**Auto mode** (`auto: true`, default):
* Optimization runs automatically in the background after sync
* Swaps with SSP when leaves are too far from optimal
* Transfers wait for optimization to complete before sending
* Best for most applications
**Manual mode** (`auto: false`):
* You control exactly when optimization runs via [`optimizeLeaves()`](/api-reference/wallet/optimize-leaves)
* More aggressive optimization (skips the "is it needed?" check)
* Use when you want maximum optimization regardless of current state
### Configuration Examples
```typescript theme={null}
// Default behavior - auto optimization with multiplicity 1
const { wallet } = await SparkWallet.initialize({
options: { network: "MAINNET" }
});
// Fast transfers for consumer wallet
const { wallet } = await SparkWallet.initialize({
options: {
network: "MAINNET",
optimizationOptions: {
auto: true,
multiplicity: 5
}
}
});
// Manual control for maximum optimization
const { wallet } = await SparkWallet.initialize({
options: {
network: "MAINNET",
optimizationOptions: {
auto: false,
multiplicity: 5
}
}
});
// Then call wallet.optimizeLeaves() explicitly when needed
```
**Safe to pass on every init.** Pass `optimizationOptions` every time you initialize (e.g., on app reopen). The SDK only runs optimization if needed and does nothing on wallets with no balance.
## Returns
The initialized SparkWallet instance
The mnemonic if one was generated (undefined for raw seed)
## Example
```typescript Create New Wallet theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
// Create a new wallet
const { wallet, mnemonic } = await SparkWallet.initialize({
options: { network: "REGTEST" } // or "MAINNET"
});
console.log("Wallet initialized:", wallet);
console.log("Generated mnemonic:", mnemonic);
```
```typescript Import Existing Wallet theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
// Import wallet from existing mnemonic
const { wallet } = await SparkWallet.initialize({
mnemonicOrSeed: "abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon about",
accountNumber: 0, // Optional: specify account index
options: { network: "REGTEST" }
});
console.log("Wallet restored from mnemonic");
```
## Multiple SDK Instances
When running multiple wallet instances (e.g., service worker + popup in a browser extension):
Multiple instances are **safe** but may cause temporary claim failures. The SDK handles this automatically—failed claims retry and succeed on subsequent attempts.
**Best practices:**
* Avoid calling `getStaticDepositAddress()` concurrently—this can create duplicate addresses
* Let one instance handle background claiming if possible
* Failed claims due to concurrent access are automatically recoverable
## System Time Requirements
The SDK uses your device's system time for expiry calculations.
**Your device clock must be within 2 minutes of actual time**, or operations will fail with "invalid expiry\_time" errors. If users report this error, ask them to sync their device clock.
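As a sketch (not part of the SDK), you can sanity-check the device clock against a trusted reference timestamp, such as one returned by your own server, before initializing:

```typescript
// Sketch only: warn before initializing if local time drifts too far
// from a trusted reference timestamp (e.g. from your backend).
const MAX_CLOCK_SKEW_MS = 2 * 60 * 1000; // 2 minutes, per the requirement above

function isClockSynced(referenceTimeMs: number, localTimeMs: number = Date.now()): boolean {
  return Math.abs(localTimeMs - referenceTimeMs) <= MAX_CLOCK_SKEW_MS;
}

// A clock 3 minutes behind the reference fails; 30 seconds off passes.
const reference = Date.parse("2024-06-01T12:00:00Z");
console.log(isClockSynced(reference, reference - 3 * 60 * 1000)); // false
console.log(isClockSynced(reference, reference + 30 * 1000));     // true
```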
# isOptimizationInProgress
Source: https://docs.spark.money/api-reference/wallet/is-optimization-in-progress
Check if leaf optimization is currently running.
Checks if a leaf optimization operation is currently in progress.
## Method Signature
```typescript theme={null}
async isOptimizationInProgress(): Promise<boolean>
```
## Returns
`true` if optimization is running, `false` otherwise
## Example
```typescript theme={null}
if (await wallet.isOptimizationInProgress()) {
console.log("Optimization is running...");
} else {
console.log("No optimization in progress");
}
```
# isTokenOptimizationInProgress
Source: https://docs.spark.money/api-reference/wallet/is-token-optimization-in-progress
Check if token output optimization is currently running.
Checks if a token output optimization operation is currently in progress.
## Method Signature
```typescript theme={null}
async isTokenOptimizationInProgress(): Promise<boolean>
```
## Returns
`true` if token optimization is running, `false` otherwise
## Example
```typescript theme={null}
if (await wallet.isTokenOptimizationInProgress()) {
console.log("Token optimization is running...");
} else {
console.log("No token optimization in progress");
}
```
# optimizeLeaves
Source: https://docs.spark.money/api-reference/wallet/optimize-leaves
Manually optimize leaf structure for faster transfers.
Manually optimizes wallet leaf structure by consolidating or splitting leaves into power-of-2 denominations. This is an async generator that yields progress updates and can be interrupted between steps.
For background on leaf optimization concepts, multiplicity levels, and tradeoffs, see [Leaf Optimization](/api-reference/wallet/initialize#leaf-optimization) in the initialize docs.
## When to Use
Use this method when you've configured **manual mode** (`auto: false`) in your initialization options. In manual mode, the SDK won't optimize automatically—you control exactly when optimization runs.
```typescript theme={null}
// Initialize with manual mode
const { wallet } = await SparkWallet.initialize({
options: {
network: "MAINNET",
optimizationOptions: {
auto: false,
multiplicity: 5
}
}
});
// Then call optimizeLeaves() explicitly
for await (const progress of wallet.optimizeLeaves()) {
console.log(`Optimizing: ${progress.step}/${progress.total}`);
}
```
In **auto mode** (default), the SDK handles optimization automatically. You typically don't need to call this method directly.
## Method Signature
```typescript theme={null}
async *optimizeLeaves(
multiplicity?: number
): AsyncGenerator<{
step: number;
total: number;
controller: AbortController;
}>
```
## Parameters
Optimization multiplicity (0-5). Defaults to the value set in `optimizationOptions` during initialization.
## Yields
Progress object containing:
* `step`: Current step number
* `total`: Total number of steps
* `controller`: AbortController to cancel optimization between steps
## Interrupting Optimization
The abort controller lets you pause optimization between steps—useful when the user wants to send a payment mid-optimization.
```typescript theme={null}
let optimizeGenerator = wallet.optimizeLeaves();
async function runOptimization() {
for await (const progress of optimizeGenerator) {
console.log(`Step ${progress.step} of ${progress.total}`);
// User wants to send payment - abort optimization
if (userWantsToSend) {
progress.controller.abort();
break;
}
}
}
// Start optimization
runOptimization();
// Later: user triggers a send
userWantsToSend = true;
await wallet.transfer({ ... });
// Resume optimization after the transfer
runOptimization();
```
Optimization runs multiple swap operations with the SSP. If you abort mid-optimization, the wallet remains in a valid state but may not be fully optimized. You can resume optimization later.
## Example: Basic Usage
```typescript theme={null}
for await (const progress of wallet.optimizeLeaves()) {
console.log(`Step ${progress.step} of ${progress.total}`);
}
console.log("Optimization complete");
```
## Example: With Progress UI
```typescript theme={null}
async function optimizeWithProgress() {
const progressBar = showProgressBar();
try {
for await (const { step, total } of wallet.optimizeLeaves(5)) {
progressBar.update(step / total * 100);
}
progressBar.complete();
} catch (error) {
progressBar.error("Optimization failed");
throw error;
}
}
```
# optimizeTokenOutputs
Source: https://docs.spark.money/api-reference/wallet/optimize-token-outputs
Consolidate token outputs for improved performance.
Optimizes token outputs by consolidating them when there are more than the configured threshold.
## Method Signature
```typescript theme={null}
async optimizeTokenOutputs(): Promise<void>
```
## Returns
Resolves to `void` once consolidation completes.
## Example
```typescript theme={null}
await wallet.optimizeTokenOutputs();
console.log("Token outputs optimized");
```
# payLightningInvoice
Source: https://docs.spark.money/api-reference/wallet/pay-lightning-invoice
Pay a BOLT11 Lightning invoice.
Pays a Lightning invoice via the `SparkWallet`.
## Method Signature
```typescript theme={null}
interface PayLightningInvoiceParams {
invoice: string;
maxFeeSats: number;
preferSpark?: boolean;
amountSatsToSend?: number;
}
async payLightningInvoice(params: PayLightningInvoiceParams): Promise<LightningSendRequest | WalletTransfer>
```
## Parameters
The BOLT11-encoded Lightning invoice to pay
Maximum fee in satoshis to pay for the invoice
When `true`, initiate a Spark transfer if a valid Spark address is found in the invoice (default: `false`)
Amount in satoshis to send for zero-amount invoices
## Returns
The Lightning payment request details, or a `WalletTransfer` if `preferSpark` is `true` and a valid Spark address was found in the invoice
The payment preimage is not returned immediately. To retrieve the preimage after payment completes, call [`getLightningSendRequest(id)`](/api-reference/wallet/get-lightning-send-request) with the returned request ID.
When `preferSpark: true` and the invoice contains a valid Spark fallback address, the method returns a `WalletTransfer` instead of `LightningSendRequest`. If no valid Spark address is found, it falls back to Lightning.
## Examples
```typescript theme={null}
// Pay a regular invoice via Lightning
const paymentResponse = await wallet.payLightningInvoice({
invoice: "lnbc100n...", // Regular Lightning invoice with amount
maxFeeSats: 5,
});
console.log("Payment Response:", paymentResponse);
// Pay a zero-amount invoice
const zeroAmountPayment = await wallet.payLightningInvoice({
invoice: "lnbc...", // Zero-amount Lightning invoice
maxFeeSats: 5,
amountSatsToSend: 1000, // Specify amount for zero-amount invoice
});
console.log("Zero-amount Payment Response:", zeroAmountPayment);
// Prefer Spark transfer if invoice has Spark fallback address
const sparkPreferred = await wallet.payLightningInvoice({
invoice: "lnbc100n...", // Invoice with Spark fallback
maxFeeSats: 5,
preferSpark: true, // Will use Spark transfer if available
});
// Returns WalletTransfer if Spark used, LightningSendRequest otherwise
```
# queryHTLC
Source: https://docs.spark.money/api-reference/wallet/query-htlc
Query HTLCs by payment hash, status, or transfer ID.
Queries HTLCs with flexible filtering options.
## Method Signature
```typescript theme={null}
async queryHTLC({
paymentHashes,
status,
transferIds,
matchRole,
limit,
offset,
}: {
paymentHashes?: string[];
status?: PreimageRequestStatus;
transferIds?: string[];
matchRole?: PreimageRequestRole;
limit?: number;
offset?: number;
}): Promise
```
## Parameters
Array of payment hashes to filter by
Filter by HTLC status
Array of transfer IDs to filter by
Filter by role (default: `PREIMAGE_REQUEST_ROLE_RECEIVER`)
Maximum results to return (1-100, default: 100). Values outside this range throw `SparkValidationError`.
Pagination offset (default: 0). Must be non-negative.
## Returns
HTLC query results
## Example
```typescript theme={null}
const htlcs = await wallet.queryHTLC({
limit: 50,
offset: 0
});
console.log("HTLCs:", htlcs);
```
# querySparkInvoices
Source: https://docs.spark.money/api-reference/wallet/query-spark-invoices
Check the status of Spark invoices.
Queries the status of Spark invoices.
## Method Signature
```typescript theme={null}
async querySparkInvoices(
invoices: string[]
): Promise
```
## Parameters
Array of Spark invoice strings to query
## Returns
Response containing invoice status information
## Example
```typescript theme={null}
const status = await wallet.querySparkInvoices([
"spark1...",
"spark1..."
]);
console.log("Invoice statuses:", status);
```
# queryStaticDepositAddresses
Source: https://docs.spark.money/api-reference/wallet/query-static-deposit-addresses
List all static deposit addresses for this wallet.
Queries all static deposit addresses associated with the `SparkWallet`.
## Method Signature
```typescript theme={null}
async queryStaticDepositAddresses(): Promise<string[]>
```
## Returns
Array of static deposit Bitcoin addresses
## Example
```typescript theme={null}
const addresses = await wallet.queryStaticDepositAddresses();
console.log("Static deposit addresses:", addresses);
```
# queryTokenTransactions
Source: https://docs.spark.money/api-reference/wallet/query-token-transactions
Query token transaction history with flexible filters.
Retrieves token transactions from the network with flexible filtering options for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async queryTokenTransactions({
sparkAddresses,
ownerPublicKeys,
issuerPublicKeys,
tokenTransactionHashes,
tokenIdentifiers,
outputIds,
order,
pageSize,
offset,
}: {
sparkAddresses?: string[];
ownerPublicKeys?: string[]; // deprecated, use sparkAddresses
issuerPublicKeys?: string[];
tokenTransactionHashes?: string[];
tokenIdentifiers?: string[];
outputIds?: string[];
order?: "asc" | "desc";
pageSize?: number;
offset?: number;
}): Promise
```
## Parameters
**Filter Constraint:** You can only specify **one** filter type per query. The parameters are mutually exclusive - use `sparkAddresses` OR `ownerPublicKeys` OR `issuerPublicKeys` OR `tokenTransactionHashes` OR `tokenIdentifiers` OR `outputIds`, but not multiple types together.
Array of Spark addresses to filter by (recommended)
Array of owner public keys to filter by (deprecated, use `sparkAddresses`)
Array of issuer public keys to filter by
Array of token transaction hashes to filter by
Array of Bech32m token identifiers to filter by
Array of output IDs to filter by
Sort order: `"asc"` or `"desc"` (defaults to `"desc"`)
Number of results per page (defaults to 50)
Pagination offset (defaults to 0)
## Returns
Token transactions response object
## Example
```typescript theme={null}
// Query all token transactions
const transactions = await wallet.queryTokenTransactions({});
// Query transactions for specific tokens with pagination
const tokenTransactions = await wallet.queryTokenTransactions({
tokenIdentifiers: ["btkn1..."],
order: "desc",
pageSize: 20,
offset: 0
});
// Query transactions for specific Spark addresses
const addressTransactions = await wallet.queryTokenTransactions({
sparkAddresses: ["spark1..."]
});
```
**Getting the sender address:** Token transaction outputs only show the recipient. To get the sender's address for an incoming transfer, query the previous transaction using the `prevTokenTransactionHash` from the input's `OutputToSpend`.
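The one-hop walk described above can be sketched as follows. The shapes below are simplified stand-ins, not the exact SDK response interfaces; the field names (`prevTokenTransactionHash`, the output's owner address) follow the note above but are assumptions for illustration.

```typescript
// Sketch only: simplified stand-ins for token transaction shapes.
interface SimplifiedOutput { ownerAddress: string }
interface SimplifiedTx {
  hash: string;
  inputs: { prevTokenTransactionHash: string; prevTokenTransactionVout: number }[];
  outputs: SimplifiedOutput[];
}

// The sender of an incoming transfer is the owner of the output that the
// incoming transaction spent, found one transaction back.
function senderOf(tx: SimplifiedTx, lookup: Map<string, SimplifiedTx>): string | undefined {
  const input = tx.inputs[0];
  if (!input) return undefined; // e.g. a mint has no inputs to walk back
  const prev = lookup.get(input.prevTokenTransactionHash);
  return prev?.outputs[input.prevTokenTransactionVout]?.ownerAddress;
}
```

In practice, `lookup` would be populated by a second `queryTokenTransactions` call filtered with `tokenTransactionHashes`.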
# refundAndBroadcastStaticDeposit
Source: https://docs.spark.money/api-reference/wallet/refund-and-broadcast-static-deposit
Refund a static deposit and broadcast to Bitcoin network.
Refunds a static deposit and broadcasts the transaction to the Bitcoin network.
## Method Signature
```typescript theme={null}
async refundAndBroadcastStaticDeposit({
depositTransactionId,
outputIndex,
destinationAddress,
satsPerVbyteFee,
}: {
depositTransactionId: string;
outputIndex?: number;
destinationAddress: string;
satsPerVbyteFee?: number;
}): Promise<string>
```
## Parameters
The transaction ID of the original deposit
The output index (auto-detected if not provided)
The Bitcoin address to send the refund to
The fee rate in sats per vbyte
## Returns
The transaction ID of the broadcast refund transaction
## Example
```typescript theme={null}
const txid = await wallet.refundAndBroadcastStaticDeposit({
depositTransactionId: "abc123...",
destinationAddress: "bcrt1p...",
satsPerVbyteFee: 10
});
console.log("Refund broadcast, txid:", txid);
```
# refundStaticDeposit
Source: https://docs.spark.money/api-reference/wallet/refund-static-deposit
Create a refund transaction for a static deposit (returns tx hex).
Refunds a deposit made to a static deposit address back to a specified Bitcoin address for the `SparkWallet`.
## Method Signature
```typescript theme={null}
async refundStaticDeposit({
depositTransactionId,
outputIndex,
destinationAddress,
satsPerVbyteFee,
}: {
depositTransactionId: string;
outputIndex?: number;
destinationAddress: string;
satsPerVbyteFee?: number;
}): Promise<string>
```
## Parameters
The transaction ID of the original deposit
The index of the output (auto-detected if not provided)
The Bitcoin address to send the refund to
The fee rate in sats per vbyte (must be less than 150)
## Returns
The refund transaction as a hex string that needs to be broadcast
## Example
```typescript theme={null}
const txHex = await wallet.refundStaticDeposit({
depositTransactionId: "abc123...",
destinationAddress: "bcrt1p...",
satsPerVbyteFee: 10
});
console.log("Refund transaction:", txHex);
```
# setPrivacyEnabled
Source: https://docs.spark.money/api-reference/wallet/set-privacy-enabled
Enable or disable privacy mode for the wallet.
Enables or disables privacy mode for the wallet.
## Method Signature
```typescript theme={null}
async setPrivacyEnabled(privacyEnabled: boolean): Promise
```
## Parameters
Whether to enable privacy mode
## Returns
Updated wallet settings, or undefined if update failed
## Example
```typescript theme={null}
// Enable privacy mode
const settings = await wallet.setPrivacyEnabled(true);
console.log("Privacy enabled:", settings?.privateEnabled);
// Disable privacy mode
await wallet.setPrivacyEnabled(false);
```
# signMessageWithIdentityKey
Source: https://docs.spark.money/api-reference/wallet/sign-message-with-identity-key
Sign a message with the wallet's identity key.
Signs a message with the wallet's identity key.
## Method Signature
```typescript theme={null}
async signMessageWithIdentityKey(
message: string,
compact?: boolean
): Promise<string>
```
## Parameters
The message to sign
Whether to use compact encoding. If `false`, uses DER encoding.
## Returns
The signature as a hex string
## Example
```typescript theme={null}
const signature = await wallet.signMessageWithIdentityKey("Hello, Spark!");
console.log("Signature:", signature);
// Compact encoding
const compactSig = await wallet.signMessageWithIdentityKey("Hello", true);
```
# signTransaction
Source: https://docs.spark.money/api-reference/wallet/sign-transaction
Sign a Bitcoin transaction with wallet keys.
Signs a Bitcoin transaction with wallet keys.
## Method Signature
```typescript theme={null}
async signTransaction(
txHex: string,
keyType?: string
): Promise<string>
```
## Parameters
The transaction hex to sign
Key type to use: `"identity"`, `"deposit"`, or `"auto-detect"` (default: `"auto-detect"`)
## Returns
The signed transaction hex
## Example
```typescript theme={null}
// Auto-detect key type
const signedTx = await wallet.signTransaction("02000000...");
// Use specific key type
const signedWithIdentity = await wallet.signTransaction(
"02000000...",
"identity"
);
console.log("Signed transaction:", signedWithIdentity);
```
# transfer
Source: https://docs.spark.money/api-reference/wallet/transfer
Send Bitcoin to another Spark address.
Transfers Bitcoin to another `SparkWallet`.
## Method Signature
```typescript theme={null}
async transfer({
receiverSparkAddress,
amountSats,
}: {
receiverSparkAddress: string;
amountSats: number;
}): Promise<WalletTransfer>
```
## Parameters
The recipient's Spark Address
The amount in satoshis to transfer. Must be positive and less than 2^53 (JavaScript safe integer limit).
## Returns
The completed transfer details
## Example
```typescript theme={null}
const transfer = await wallet.transfer({
receiverSparkAddress: "spark1...",
amountSats: 1000
});
console.log("Transfer completed:", transfer);
```
Do not pass a Spark invoice (address with encoded payment details) to this method. Use [`fulfillSparkInvoice()`](/api-reference/wallet/fulfill-spark-invoice) instead.
# transferTokens
Source: https://docs.spark.money/api-reference/wallet/transfer-tokens
Send tokens to another Spark address.
Transfers tokens to another `SparkWallet`.
## Method Signature
```typescript theme={null}
async transferTokens({
tokenIdentifier,
tokenAmount,
receiverSparkAddress,
outputSelectionStrategy,
selectedOutputs,
}: {
tokenIdentifier: Bech32mTokenIdentifier;
tokenAmount: bigint;
receiverSparkAddress: string;
outputSelectionStrategy?: "SMALL_FIRST" | "LARGE_FIRST";
selectedOutputs?: OutputWithPreviousTransactionData[];
}): Promise<string>
```
## Parameters
* `tokenIdentifier` - The Bech32m token identifier (format: `btkn1...` for mainnet, `btknrt1...` for regtest)
* `tokenAmount` - The amount of tokens to transfer
* `receiverSparkAddress` - The recipient's Spark Address
* `outputSelectionStrategy` *(optional)* - Strategy for selecting outputs: `"SMALL_FIRST"` or `"LARGE_FIRST"` (defaults to `"SMALL_FIRST"`)
* `selectedOutputs` *(optional)* - Specific outputs to use for the transfer (overrides the selection strategy)
## Returns
The transaction ID of the token transfer
## Example
```typescript theme={null}
const txId = await wallet.transferTokens({
tokenIdentifier: "btkn1...",
tokenAmount: 1000n,
receiverSparkAddress: "spark1..."
});
console.log("Token transfer completed:", txId);
```
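The identifier prefixes documented above allow a quick sanity check before transferring; a sketch (helper name is ours, not part of the SDK):

```typescript theme={null}
// Infer the network from a Bech32m token identifier prefix
// (btkn1... mainnet, btknrt1... regtest, per the parameter docs above).
function tokenIdentifierNetwork(id: string): 'mainnet' | 'regtest' | 'unknown' {
  if (id.startsWith('btknrt1')) return 'regtest';
  if (id.startsWith('btkn1')) return 'mainnet';
  return 'unknown';
}
```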
# validateMessageWithIdentityKey
Source: https://docs.spark.money/api-reference/wallet/validate-message-with-identity-key
Verify a message signature against the wallet's identity key.
Validates a message signature against the wallet's identity key.
## Method Signature
```typescript theme={null}
async validateMessageWithIdentityKey(
message: string,
signature: string | Uint8Array
): Promise<boolean>
```
## Parameters
* `message` - The original message that was signed
* `signature` - The signature to validate (hex string or bytes)
## Returns
`true` if the signature is valid, `false` otherwise
## Example
```typescript theme={null}
const isValid = await wallet.validateMessageWithIdentityKey(
"Hello, Spark!",
"304402..."
);
console.log("Signature valid:", isValid);
```
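Since `signature` accepts either a hex string or raw bytes, a small converter covers both call styles (helper name is ours, not part of the SDK):

```typescript theme={null}
// Decode a hex signature string into bytes;
// validateMessageWithIdentityKey accepts either form.
function hexToBytes(hex: string): Uint8Array {
  const bytes = new Uint8Array(hex.length / 2);
  for (let i = 0; i < bytes.length; i++) {
    bytes[i] = parseInt(hex.slice(i * 2, i * 2 + 2), 16);
  }
  return bytes;
}
```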
# withdraw
Source: https://docs.spark.money/api-reference/wallet/withdraw
Withdraw to an on-chain Bitcoin address via cooperative exit.
Initiates a withdrawal to move funds from the Spark network to an on-chain Bitcoin address via the `SparkWallet`.
## Method Signature
```typescript theme={null}
interface WithdrawParams {
onchainAddress: string;
exitSpeed: ExitSpeed;
amountSats?: number;
feeQuoteId?: string; // ID from getWithdrawalFeeQuote
feeAmountSats?: number; // Fee amount based on exitSpeed
feeQuote?: CoopExitFeeQuote; // @deprecated - use feeQuoteId and feeAmountSats
deductFeeFromWithdrawalAmount?: boolean; // default: true
}
async withdraw(params: WithdrawParams): Promise
```
## Parameters
* `onchainAddress` - The Bitcoin address where the funds should be sent
* `exitSpeed` - The desired speed of the exit (`FAST`, `MEDIUM`, `SLOW`)
* `amountSats` *(optional)* - The amount in satoshis to withdraw (if not specified, withdraws all available funds)
* `feeQuoteId` *(optional)* - The ID from the fee quote returned by `getWithdrawalFeeQuote`
* `feeAmountSats` *(optional)* - The fee amount in satoshis based on the chosen `exitSpeed`
* `feeQuote` *(optional)* - **Deprecated:** Use `feeQuoteId` and `feeAmountSats` instead. The fee quote object returned by `getWithdrawalFeeQuote`
* `deductFeeFromWithdrawalAmount` *(optional)* - When `true`, fees are deducted from the withdrawal amount; when `false`, from the remaining wallet balance (default: `true`)
## Returns
The withdrawal request details, or `null`/`undefined` if the request cannot be completed
## Example
```typescript theme={null}
// 1) Fetch a fee quote
const feeQuote = await wallet.getWithdrawalFeeQuote({
amountSats: 17000,
withdrawalAddress: "bcrt1pf8hed85p94emupfpfhq2g0p5c40cgzqs4agvvfmeuy32nxeh549syu2lwf",
});
// 2) Calculate fee based on exit speed
const exitSpeed = ExitSpeed.MEDIUM;
let feeAmountSats: number;
switch (exitSpeed) {
case ExitSpeed.FAST:
feeAmountSats = (feeQuote.l1BroadcastFeeFast?.originalValue || 0) +
(feeQuote.userFeeFast?.originalValue || 0);
break;
case ExitSpeed.MEDIUM:
feeAmountSats = (feeQuote.l1BroadcastFeeMedium?.originalValue || 0) +
(feeQuote.userFeeMedium?.originalValue || 0);
break;
case ExitSpeed.SLOW:
feeAmountSats = (feeQuote.l1BroadcastFeeSlow?.originalValue || 0) +
(feeQuote.userFeeSlow?.originalValue || 0);
break;
}
// 3) Use the quote before it expires to create the withdrawal
const withdrawResult = await wallet.withdraw({
  onchainAddress: "bcrt1pf8hed85p94emupfpfhq2g0p5c40cgzqs4agvvfmeuy32nxeh549syu2lwf",
  amountSats: 17000,
  exitSpeed,
  feeQuoteId: feeQuote.id,
  feeAmountSats,
  deductFeeFromWithdrawalAmount: true,
});
console.log("Withdraw Result:", withdrawResult);
```
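The fee `switch` in step 2 can be factored into a small pure helper, using the quote field names from the example (the quote type here is a structural sketch, not the SDK's exact type):

```typescript theme={null}
// Sum the two fee legs (L1 broadcast fee + user fee) for a chosen exit
// speed. Field names follow the getWithdrawalFeeQuote example above.
type FeeEstimate = { originalValue?: number };
interface WithdrawalFeeQuoteFees {
  l1BroadcastFeeFast?: FeeEstimate;
  userFeeFast?: FeeEstimate;
  l1BroadcastFeeMedium?: FeeEstimate;
  userFeeMedium?: FeeEstimate;
  l1BroadcastFeeSlow?: FeeEstimate;
  userFeeSlow?: FeeEstimate;
}

function feeForExitSpeed(
  quote: WithdrawalFeeQuoteFees,
  speed: 'FAST' | 'MEDIUM' | 'SLOW',
): number {
  const legs = {
    FAST: [quote.l1BroadcastFeeFast, quote.userFeeFast],
    MEDIUM: [quote.l1BroadcastFeeMedium, quote.userFeeMedium],
    SLOW: [quote.l1BroadcastFeeSlow, quote.userFeeSlow],
  }[speed];
  return legs.reduce((sum, leg) => sum + (leg?.originalValue ?? 0), 0);
}
```

This keeps the withdrawal call site free of the per-speed branching.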
# Embedded Wallets
Source: https://docs.spark.money/guides/embedded-wallets
Integrate Spark wallets directly into web and mobile apps.
# Embedded Wallets
Integrate Spark wallets directly into your applications for seamless Bitcoin-native user experiences.
## Overview
Embedded wallets allow users to interact with Bitcoin and Spark without managing separate wallet applications. Perfect for DeFi apps, games, and consumer applications.
## Prerequisites
* Spark SDK installed
* Basic React/JavaScript knowledge
* Understanding of wallet concepts
## Getting Started
### Wallet Integration
Embed Spark wallets in your app:
```typescript theme={null}
import { SparkWallet } from '@buildonspark/spark-sdk';

// Initialize a wallet instance (initialize() generates a mnemonic if none
// is supplied; package name and option shapes may vary by SDK version)
const { wallet, mnemonic } = await SparkWallet.initialize({
  options: { network: 'MAINNET' },
});
```
### Key Features
* **Instant Setup**: Users can start transacting immediately
* **Lightning Integration**: Built-in Lightning Network support
* **Token Support**: Native BTKN token handling
* **Cross-Platform**: Works on web, mobile, and desktop
## Implementation Patterns
### Custodial Wallets
* Quick onboarding
* User-friendly experience
* Managed key recovery
### Non-Custodial Wallets
* Full user control
* Enhanced security
* Self-sovereignty
### Hybrid Approaches
* Flexible key management
* Progressive decentralization
* Best of both worlds
## Use Cases
* **DeFi Applications**: Trading, lending, yield farming
* **Gaming**: In-game economies and NFT trading
* **E-commerce**: Bitcoin payments and rewards
* **Social Apps**: Tipping and micropayments
## Security Considerations
* Key management strategies
* Multi-signature setups
* Hardware wallet integration
* Recovery mechanisms
## Next Steps
* Choose your wallet integration approach
* Implement basic wallet functionality
* Add advanced features like Lightning payments
# Ethereum Bridge
Source: https://docs.spark.money/guides/eth-bridge
Bridge assets between Ethereum and Spark.
Bridge assets between Bitcoin L1 and Spark L2 seamlessly.
## Supported Assets
* Bitcoin (BTC)
* Spark tokens
* Lightning Network payments
## Bridge Features
* **Fast deposits** - Move Bitcoin to Spark in minutes
* **Instant withdrawals** - Exit to Bitcoin L1 anytime
* **Lightning integration** - Seamless Lightning payments
* **Non-custodial** - You maintain full control
## How It Works
The bridge uses Spark's statechain technology to enable trustless movement of assets between layers.
# Fiat Offramps
Source: https://docs.spark.money/guides/fiat-offramps
Convert Bitcoin and Spark assets to fiat currency.
# Fiat Offramps
Enable users to convert Bitcoin and Spark assets back to traditional currency for seamless exits and real-world spending.
## Overview
Fiat offramps provide the reverse flow of onramps, allowing users to convert their Bitcoin holdings into traditional currency for spending, withdrawals, or other financial needs.
## Prerequisites
* Understanding of payment processing
* Familiarity with compliance requirements
* Spark SDK knowledge
* Integration with financial services
## Getting Started
### Offramp Integration
Convert Bitcoin to fiat through:
* **Bank Transfers**: Direct deposits to bank accounts
* **Card Payments**: Debit card spending
* **Digital Wallets**: PayPal, Apple Pay, Google Pay
* **Cash Pickup**: Physical cash withdrawal locations
### Key Features
* **Instant Conversion**: Real-time Bitcoin-to-fiat conversion
* **Global Reach**: Multi-currency and multi-region support
* **Low Fees**: Competitive exchange rates
* **Compliance**: Built-in regulatory compliance
## Implementation
### Basic Offramp Flow
```typescript theme={null}
// Initiate a fiat offramp (`createFiatOfframp` is a placeholder for your
// offramp provider's API, not a Spark SDK method)
const offramp = await createFiatOfframp({
amount: 0.001, // BTC
currency: 'USD',
destination: 'bank_account',
method: 'wire_transfer',
});
```
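Underneath, an offramp quote is just conversion arithmetic: fiat out equals the BTC amount times the quoted rate, minus the provider's fee. A sketch with illustrative names:

```typescript theme={null}
// Net fiat received = BTC amount * quoted rate, minus a percentage fee,
// rounded to cents. Names are illustrative, not a provider API.
function fiatOut(amountBtc: number, rateFiatPerBtc: number, feePct: number): number {
  const gross = amountBtc * rateFiatPerBtc;
  const net = gross * (1 - feePct / 100);
  return Math.round(net * 100) / 100;
}
```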
### Integration Patterns
**Direct Integration**
* Embedded offramp widgets
* Custom user interfaces
* Seamless experience
* Brand consistency
**Partner Integration**
* Third-party service providers
* Reduced compliance burden
* Established infrastructure
* Quick deployment
**API Integration**
* Custom implementation
* Full control
* Advanced features
* Complex requirements
## Supported Methods
### Bank Transfers
* **Wire Transfers**: International transfers
* **ACH Transfers**: US domestic transfers
* **SEPA**: European transfers
* **SWIFT**: Global bank network
### Card Services
* **Debit Cards**: Direct spending
* **Prepaid Cards**: Loaded with Bitcoin
* **Virtual Cards**: Digital payment cards
* **ATM Withdrawals**: Cash access
### Digital Wallets
* **PayPal**: Global digital payments
* **Apple Pay**: iOS integration
* **Google Pay**: Android integration
* **Venmo**: Social payments
### Cash Services
* **ATM Networks**: Bitcoin ATMs
* **Cash Pickup**: Physical locations
* **Money Orders**: Traditional methods
* **P2P Transfers**: Peer-to-peer cash
## Use Cases
### Consumer Applications
* **Spending**: Convert Bitcoin for purchases
* **Bills**: Pay utilities and services
* **Savings**: Traditional bank deposits
* **Emergency**: Quick access to cash
### Business Solutions
* **Payroll**: Employee salary payments
* **Vendor Payments**: Supplier settlements
* **Tax Payments**: Government obligations
* **Expense Management**: Business spending
### DeFi Applications
* **Yield Harvesting**: Convert profits to fiat
* **Liquidity Management**: Rebalance portfolios
* **Risk Management**: Hedge positions
* **Compliance**: Meet regulatory requirements
## Compliance & Security
### Regulatory Requirements
* **KYC/AML**: Know Your Customer procedures
* **Transaction Limits**: Regulatory thresholds
* **Reporting**: Suspicious activity reports
* **Tax Compliance**: Capital gains reporting
### Security Measures
* **Multi-Signature**: Enhanced security
* **Cold Storage**: Offline Bitcoin storage
* **Fraud Detection**: Machine learning systems
* **Audit Trails**: Complete transaction records
### Privacy Considerations
* **Data Protection**: User privacy safeguards
* **Transaction Privacy**: Confidential transfers
* **Identity Protection**: Minimal data collection
* **Compliance**: Regulatory privacy requirements
## Best Practices
### User Experience
* **Clear Pricing**: Transparent fee structure
* **Fast Processing**: Quick settlement times
* **Status Updates**: Real-time progress tracking
* **Support**: Customer service integration
### Technical Implementation
* **Rate Limiting**: Prevent abuse
* **Error Handling**: Graceful failure management
* **Monitoring**: Comprehensive logging
* **Scalability**: Handle high transaction volumes
### Risk Management
* **Fraud Prevention**: Advanced detection systems
* **Compliance Monitoring**: Ongoing regulatory compliance
* **Liquidity Management**: Maintain adequate reserves
* **Operational Security**: Secure infrastructure
## Integration Examples
### Wallet Applications
```typescript theme={null}
// Add an offramp entry point to a wallet UI. `OfframpButton` is a
// hypothetical component; the JSX here is illustrative.
const offrampButton = (
  <OfframpButton amountBtc={0.001} currency="USD" />
);
```
### E-commerce Integration
```typescript theme={null}
// Checkout with a Bitcoin offramp (`processPayment` is a placeholder for
// your payment provider's API, not a Spark SDK method)
const checkout = await processPayment({
amount: 100, // USD
method: 'bitcoin_offramp',
destination: 'merchant_account',
});
```
## Next Steps
* Choose your offramp provider
* Implement basic integration
* Add compliance features
* Test with small amounts
* Deploy production solution
# Fiat Onramps
Source: https://docs.spark.money/guides/fiat-onramps
Enable fiat-to-Bitcoin purchases in your Spark app.
# Fiat Onramps
Integrate seamless fiat-to-Bitcoin conversion services into your Spark applications for easy user onboarding.
## Overview
Fiat onramps allow users to convert traditional currency (USD, EUR, etc.) into Bitcoin and Spark assets, making Bitcoin applications accessible to mainstream users.
## Prerequisites
* Understanding of payment processing
* Familiarity with KYC/AML requirements
* Spark SDK knowledge
* Integration with payment providers
## Getting Started
### Onramp Integration
Connect users to Bitcoin through:
* **Credit/Debit Cards**: Instant Bitcoin purchases
* **Bank Transfers**: ACH and wire transfers
* **Digital Wallets**: PayPal, Apple Pay, Google Pay
* **Crypto Exchanges**: Integration with major exchanges
### Key Features
* **Instant Settlement**: Funds available immediately on Spark
* **Low Fees**: Optimized conversion rates
* **Global Support**: Multi-currency and multi-region
* **Compliance**: Built-in KYC/AML compliance
## Implementation
### Basic Onramp Flow
```typescript theme={null}
// Initiate a fiat onramp (`createFiatOnramp` is a placeholder for your
// onramp provider's API, not a Spark SDK method)
const onramp = await createFiatOnramp({
amount: 100, // USD
currency: 'USD',
destination: 'spark_address',
paymentMethod: 'card',
});
```
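The core of any onramp quote is the fiat-to-sats conversion at a quoted BTC price; a sketch (names illustrative, not a provider API):

```typescript theme={null}
// Sats received for a fiat amount at a quoted BTC price, rounded down
// to a whole satoshi (1 BTC = 100,000,000 sats).
function satsPurchased(amountUsd: number, btcPriceUsd: number): number {
  return Math.floor((amountUsd / btcPriceUsd) * 100_000_000);
}
```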
### Integration Patterns
**Direct Integration**
* Embed onramp widgets
* Custom UI components
* Seamless user experience
* Brand consistency
**Redirect Flow**
* External payment pages
* Reduced integration complexity
* Provider-hosted compliance
* Quick implementation
**API Integration**
* Custom payment flows
* Advanced features
* Full control
* Complex requirements
## Supported Providers
### Card Payments
* **Stripe**: Global card processing
* **Coinbase Commerce**: Crypto-focused payments
* **MoonPay**: Multi-asset onramp
* **Ramp**: European-focused service
### Bank Transfers
* **Plaid**: Bank account verification
* **Yodlee**: Financial data aggregation
* **Open Banking**: European bank integration
* **ACH Networks**: US bank transfers
### Digital Wallets
* **PayPal**: Global digital payments
* **Apple Pay**: iOS integration
* **Google Pay**: Android integration
* **Samsung Pay**: Samsung device support
## Use Cases
### Consumer Applications
* **Wallet Apps**: Easy Bitcoin purchasing
* **Gaming**: In-app currency conversion
* **E-commerce**: Bitcoin payment options
* **Social Apps**: Tipping and rewards
### Enterprise Solutions
* **Treasury Management**: Corporate Bitcoin purchases
* **Payment Processing**: Multi-asset payment rails
* **Compliance**: Regulated conversion services
* **B2B Services**: Business-to-business payments
### DeFi Applications
* **DEX Integration**: Fiat-to-DeFi bridges
* **Yield Farming**: Easy capital deployment
* **Lending Protocols**: Collateral acquisition
* **Trading Platforms**: Seamless asset conversion
## Compliance & Security
### KYC/AML Requirements
* **Identity Verification**: Document upload and verification
* **Address Verification**: Proof of residence
* **Sanctions Screening**: OFAC and other lists
* **Transaction Monitoring**: Suspicious activity detection
### Security Measures
* **Encryption**: End-to-end data protection
* **PCI Compliance**: Card data security
* **Fraud Prevention**: Machine learning detection
* **Audit Trails**: Complete transaction records
## Best Practices
### User Experience
* **Minimal Friction**: Streamlined onboarding
* **Clear Pricing**: Transparent fee structure
* **Progress Indicators**: User journey visibility
* **Error Handling**: Graceful failure management
### Technical Implementation
* **Rate Limiting**: Prevent abuse
* **Webhook Handling**: Real-time status updates
* **Retry Logic**: Robust error recovery
* **Monitoring**: Comprehensive logging
### Compliance
* **Data Privacy**: GDPR and CCPA compliance
* **Record Keeping**: Audit trail maintenance
* **Reporting**: Regulatory compliance
* **Risk Management**: Ongoing monitoring
## Next Steps
* Choose your onramp provider
* Implement basic integration
* Add compliance features
* Optimize user experience
* Deploy production-ready solution
# Guide Title
Source: https://docs.spark.money/guides/guide-template
# Guide Title
Brief introduction explaining what this guide covers, what developers will learn, and what they'll be able to build after completing it.
## Prerequisites
Before starting this guide, make sure you have:
* \[Prerequisite 1] - Brief explanation
* \[Prerequisite 2] - Brief explanation
* \[Prerequisite 3] - Brief explanation
## What You'll Build
In this guide, you'll learn how to:
* \[Learning objective 1]
* \[Learning objective 2]
* \[Learning objective 3]
By the end, you'll have a \[description of final result].
## Step 1: Setup and Installation
### Install Dependencies
```bash theme={null}
npm install @spark/sdk @partner/client
```
### Environment Configuration
Create a `.env` file with your credentials:
```env theme={null}
SPARK_API_KEY=your_spark_api_key
PARTNER_API_KEY=your_partner_api_key
SPARK_NETWORK=testnet
```
### Initialize the Client
```typescript theme={null}
import { SparkClient } from '@spark/sdk'
import { PartnerClient } from '@partner/client'
const sparkClient = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
network: process.env.SPARK_NETWORK
})
const partnerClient = new PartnerClient({
apiKey: process.env.PARTNER_API_KEY
})
```
## Step 2: \[Core Functionality]
### Understanding the Concept
\[Explain the core concept or functionality being implemented]
### Implementation
```typescript theme={null}
// Example implementation code
async function implementCoreFunctionality() {
try {
// Step-by-step implementation
const result = await sparkClient.someMethod({
// configuration
})
console.log('Success:', result)
return result
} catch (error) {
console.error('Error:', error)
throw error
}
}
```
### Error Handling
```typescript theme={null}
// Proper error handling example
try {
const result = await implementCoreFunctionality()
} catch (error) {
if (error.code === 'SPECIFIC_ERROR') {
// Handle specific error case
} else {
// Handle general error case
}
}
```
## Step 3: \[Advanced Features]
### Feature Overview
\[Explain advanced features or optimizations]
### Implementation
```typescript theme={null}
// Advanced implementation
class AdvancedImplementation {
constructor(private sparkClient: SparkClient, private partnerClient: PartnerClient) {}
async advancedFeature() {
// Implementation details
}
}
```
## Step 4: Testing and Validation
### Unit Testing
```typescript theme={null}
import { describe, it, expect } from 'vitest'
describe('Core Functionality', () => {
it('should implement basic functionality', async () => {
const result = await implementCoreFunctionality()
expect(result).toBeDefined()
})
})
```
### Integration Testing
```typescript theme={null}
describe('Integration Tests', () => {
it('should work with partner integration', async () => {
// Test integration with partner service
})
})
```
## Step 5: Deployment and Production
### Production Configuration
```typescript theme={null}
const productionConfig = {
apiKey: process.env.PRODUCTION_SPARK_API_KEY,
network: 'mainnet',
timeout: 30000,
retries: 3
}
```
### Monitoring and Logging
```typescript theme={null}
import { Logger } from '@spark/sdk'
const logger = new Logger({
level: 'info',
service: 'your-service-name'
})
// Use logger throughout your application
logger.info('Operation completed successfully')
```
## Troubleshooting
### Common Issues
**Problem**: Getting authentication errors when making API calls.
**Solution**:
1. Verify your API key is correct
2. Check that your API key has the required permissions
3. Ensure you're using the correct network (testnet/mainnet)
**Problem**: Receiving rate limit errors.
**Solution**:
1. Implement exponential backoff
2. Add request queuing
3. Consider upgrading your API plan
### Debug Mode
Enable debug logging to troubleshoot issues:
```typescript theme={null}
const client = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
debug: true,
logLevel: 'debug'
})
```
## Best Practices
* **Security**: Always store API keys in environment variables
* **Error Handling**: Implement comprehensive error handling
* **Rate Limiting**: Respect API rate limits and implement backoff
* **Testing**: Write tests for all critical functionality
* **Monitoring**: Implement proper logging and monitoring
## Next Steps
Now that you've completed this guide, you can:
* \[Next step 1] - Link to related guide or documentation
* \[Next step 2] - Link to advanced topics
* \[Next step 3] - Link to integration examples
## Additional Resources
* [Related Documentation](/path/to/related/docs)
* [API Reference](/api-reference/endpoint)
* [Community Examples](https://github.com/spark/examples)
* [Support Forum](https://community.spark.money)
## Code Examples
```typescript Complete Implementation theme={null}
// Complete working example
import { SparkClient } from '@spark/sdk'
const client = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
network: 'testnet'
})
async function completeExample() {
// Full implementation
}
```
```javascript Node.js theme={null}
// Node.js specific implementation
const { SparkClient } = require('@spark/sdk')
const client = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
network: 'testnet'
})
```
```python Python theme={null}
# Python implementation
from spark_sdk import SparkClient
client = SparkClient(
api_key=os.getenv('SPARK_API_KEY'),
network='testnet'
)
```
# Issue a stablecoin
Source: https://docs.spark.money/guides/issue-stablecoin
Launch a stablecoin on Spark using Brale.
Issue your own stablecoin with Brale.
## What is Privy?
Privy provides seamless authentication and wallet management for Web3 applications.
## Integration Benefits
* **Social logins** - Google, Twitter, Discord
* **Wallet creation** - Automatic wallet generation
* **User management** - Built-in user profiles
* **Security** - Enterprise-grade security
## Quick Integration
```typescript theme={null}
import { PrivyProvider } from '@privy-io/react-auth'
// Wrap your app in PrivyProvider (appId comes from the Privy dashboard)
function App({ children }) {
  return (
    <PrivyProvider appId="your-privy-app-id">
      {children}
    </PrivyProvider>
  )
}
```
## Documentation
* [Privy Docs](https://docs.privy.io/)
* [Spark + Privy Guide](/guides/privy-integration)
# Bitcoin Layer 1
Source: https://docs.spark.money/guides/layer-1
Interact with Bitcoin L1 from Spark for withdrawals and settlement.
# Bitcoin Layer 1 Integration
Learn how to interact with Bitcoin's base layer from your Spark applications for maximum security and decentralization.
## Overview
While Spark provides instant, low-cost transactions, sometimes you need the security and finality of Bitcoin's base layer. This guide shows you how to integrate L1 operations seamlessly.
## Prerequisites
* Understanding of Bitcoin transactions
* Familiarity with UTXO model
* Spark SDK knowledge
## Getting Started
### L1 Operations
Connect to Bitcoin's base layer for:
* **Final Settlement**: Move funds to Bitcoin L1
* **Security**: Leverage Bitcoin's security model
* **Compliance**: Meet regulatory requirements
* **Interoperability**: Connect with other Bitcoin applications
### Key Concepts
* **UTXO Management**: Understanding Bitcoin's transaction model
* **Fee Estimation**: Optimizing transaction costs
* **Confirmation Times**: Planning for network delays
* **Script Types**: P2TR, P2WPKH, and other output types
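For fee estimation, the total fee is simply the transaction's virtual size times the chosen fee rate; a sketch:

```typescript theme={null}
// Total fee = virtual size (vbytes) * fee rate (sat/vB), rounded up
// so the transaction never pays below the target rate.
function estimateFeeSats(vsizeVbytes: number, satsPerVbyte: number): number {
  return Math.ceil(vsizeVbytes * satsPerVbyte);
}
```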
## Implementation
### Withdrawal to L1
```typescript theme={null}
// Withdraw funds from Spark to Bitcoin L1 via cooperative exit.
// withdraw() takes object params; see the withdraw API reference for
// fee-quote handling (feeQuoteId, feeAmountSats).
const result = await wallet.withdraw({
  onchainAddress: 'bc1p...', // destination Bitcoin address
  amountSats: 100000,
  exitSpeed: ExitSpeed.MEDIUM,
});
```
### L1 Monitoring
Track Bitcoin transactions:
* Confirmation status
* Fee optimization
* Mempool monitoring
* Double-spend protection
## Use Cases
### DeFi Applications
* **Collateral Management**: Move collateral to L1 for security
* **Settlement**: Final settlement on Bitcoin
* **Cross-Chain**: Bridge to other Bitcoin layers
### Enterprise Solutions
* **Treasury Management**: Secure corporate Bitcoin holdings
* **Compliance**: Meet regulatory requirements
* **Auditing**: Transparent on-chain records
### Consumer Applications
* **Long-term Storage**: Move funds to cold storage
* **Backup**: Create L1 backups of Spark funds
* **Recovery**: Emergency withdrawal mechanisms
## Best Practices
* **Fee Management**: Use RBF and CPFP for fee optimization
* **Privacy**: Leverage CoinJoin and other privacy techniques
* **Security**: Implement proper key management
* **Monitoring**: Set up transaction tracking
## Next Steps
* Implement L1 withdrawal functionality
* Add transaction monitoring
* Optimize for fees and privacy
* Build compliance features
# Lightning Payments
Source: https://docs.spark.money/guides/lightning-payments
Send and receive Lightning payments from Spark wallets.
Spark wallets include built-in Lightning Network support, so applications can send and receive Lightning payments directly from a `SparkWallet`, including creating and paying Lightning invoices. See the [Wallet Documentation](/wallet/introduction) for the Lightning-related methods.
# Onchain Swaps
Source: https://docs.spark.money/guides/onchain-swaps
Implement atomic swaps and cross-chain trading on Bitcoin.
# Onchain Swaps
Implement atomic swaps and cross-chain trading directly on Bitcoin using Spark's native capabilities.
## Overview
Onchain swaps enable trustless trading between different assets without intermediaries, leveraging Bitcoin's security and Spark's speed.
## Prerequisites
* Understanding of atomic swaps
* Familiarity with Bitcoin scripting
* Spark SDK knowledge
* Basic cryptography concepts
## Getting Started
### Atomic Swap Basics
Atomic swaps allow two parties to exchange different cryptocurrencies without trusting each other or a third party.
### Key Components
* **Hash Time-Locked Contracts (HTLCs)**: Core mechanism for atomic swaps
* **Preimage Secrets**: Cryptographic keys that unlock funds
* **Time Locks**: Automatic refund mechanisms
* **Multi-Signature**: Enhanced security for swap execution
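The hash-lock half of an HTLC can be illustrated with plain hashing: the payer locks funds to `sha256(preimage)`, and revealing the preimage later unlocks them (Node.js `crypto`, illustrative only; a real HTLC also needs the time-lock refund branch):

```typescript theme={null}
import { createHash, randomBytes } from 'crypto';

// Generate a 32-byte preimage secret and its SHA-256 hash lock
function makeHashLock(): { preimage: Buffer; hashLock: Buffer } {
  const preimage = randomBytes(32);
  const hashLock = createHash('sha256').update(preimage).digest();
  return { preimage, hashLock };
}

// A claim succeeds only if the revealed preimage hashes to the lock
function canClaim(preimage: Buffer, hashLock: Buffer): boolean {
  return createHash('sha256').update(preimage).digest().equals(hashLock);
}
```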
## Implementation
### Basic Atomic Swap
```typescript theme={null}
// Create an atomic swap (`createAtomicSwap` is an illustrative helper,
// not part of the Spark SDK)
const swap = await createAtomicSwap({
assetA: 'BTC',
assetB: 'USDT',
amountA: 100000, // sats
amountB: 1000, // USDT
counterparty: 'spark_address',
});
```
### Swap Lifecycle
1. **Initiation**: One party creates swap proposal
2. **Acceptance**: Counterparty accepts terms
3. **Funding**: Both parties fund swap contracts
4. **Execution**: Atomic exchange of assets
5. **Completion**: Funds released to respective parties
## Advanced Features
### Cross-Chain Swaps
* Bitcoin ↔ Ethereum
* Bitcoin ↔ Solana
* Bitcoin ↔ Other L2s
* Multi-hop routing
### Automated Market Making
* Liquidity provision
* Price discovery
* Slippage protection
* Fee optimization
### Flash Swaps
* Instant execution
* Collateral-free trading
* Arbitrage opportunities
* Risk management
## Use Cases
### DeFi Applications
* **DEX Integration**: Decentralized exchange functionality
* **Yield Farming**: Automated trading strategies
* **Arbitrage**: Cross-chain price differences
* **Liquidity Mining**: Incentivize swap participation
### Enterprise Solutions
* **Treasury Management**: Corporate asset swapping
* **Payment Processing**: Multi-asset payment rails
* **Risk Hedging**: Portfolio rebalancing
* **Compliance**: Regulated swap execution
### Consumer Applications
* **Portfolio Management**: Automated rebalancing
* **Payment Flexibility**: Multi-asset payments
* **Investment Tools**: Dollar-cost averaging
* **Cross-Border**: International asset transfers
## Security Considerations
* **Smart Contract Audits**: Thorough security reviews
* **Time Lock Management**: Proper timeout handling
* **Preimage Security**: Secure secret generation
* **Multi-Signature**: Enhanced key management
## Best Practices
* **Fee Optimization**: Minimize transaction costs
* **Slippage Protection**: Price impact management
* **Liquidity Management**: Maintain adequate reserves
* **User Experience**: Intuitive swap interfaces
## Next Steps
* Implement basic atomic swap functionality
* Add cross-chain support
* Build automated market making
* Deploy production-ready swap platform
# Let's build something
Source: https://docs.spark.money/guides/overview
Developer guides for building on Spark.
# Developer Guides
A collection of practical guides and examples for building foundational applications on Spark.
## What You'll Find Here
These guides cover the essential building blocks and integrations you need to create Bitcoin-native applications on Spark. Each guide provides step-by-step instructions, code examples, and best practices.
## Available Guides
### **Core Integrations**
**[Embedded Wallets](/guides/embedded-wallets)**
Integrate Spark wallets directly into your applications for seamless Bitcoin-native user experiences.
**[Lightning Payments](/guides/lightning-payments)**
Add Lightning Network support to enable instant, low-cost Bitcoin payments.
**[Bitcoin Layer 1](/guides/layer-1)**
Interact with Bitcoin's base layer for maximum security and final settlement.
**[Token Lists](/guides/token-lists)**
Work with Bitcoin tokens and manage token metadata in your applications.
### **Financial Services**
**[Fiat Onramps](/guides/fiat-onramps)**
Integrate fiat-to-Bitcoin conversion services for easy user onboarding.
**[Fiat Offramps](/guides/fiat-offramps)**
Enable users to convert Bitcoin back to traditional currency.
**[Onchain Swaps](/guides/onchain-swaps)**
Implement atomic swaps and cross-chain trading on Bitcoin.
**[Issue Stablecoin](/guides/issue-stablecoin)**
Create and manage your own stablecoin using Spark's token system.
### **Cross-Chain Bridges**
**[Ethereum Bridge](/guides/eth-bridge)**
Bridge assets between Bitcoin L1 and Ethereum ecosystem.
**[Solana Bridge](/guides/sol-bridge)**
Connect Bitcoin with Solana's ecosystem for cross-chain functionality.
### **AI & Automation**
**[Build with AI](/guides/ai)**
Integrate AI capabilities into Spark applications for intelligent Bitcoin experiences.
**[Prompt Library](/guides/prompt-library)**
Curated AI prompts for Spark development and application building.
### **Authentication & User Experience**
**[Social Authentication](/guides/social-authentication)**
Implement user onboarding and authentication with Privy integration.
### **Developer Resources**
**[Example Repos](/guides/repos)**
Explore real-world implementations and open-source projects built on Spark.
## Getting Started
1. **Choose a guide** that matches what you want to build
2. **Follow the prerequisites** to set up your development environment
3. **Work through the examples** step by step
4. **Adapt the code** for your specific use case
## Prerequisites
Most guides assume you have:
* Basic familiarity with JavaScript/TypeScript
* Node.js installed
* Understanding of Bitcoin concepts
* Spark SDK installed
## Need Help?
* Check the [API Reference](/api-reference/introduction) for detailed method documentation
* Join the [Discord community](https://discord.gg/spark) for support
* Browse [example repositories](/guides/repos) for inspiration
* Review the [Wallet SDK docs](/wallets/introduction) for wallet-specific guidance
# Prompt Library
Source: https://docs.spark.money/guides/prompt-library
AI prompts for Spark development and code generation.
# Prompt Library
Curated collection of AI prompts to help you build intelligent Spark applications and enhance developer productivity.
## Overview
This library contains battle-tested prompts for common Spark development tasks, AI integration patterns, and creative use cases.
## Development Prompts
### Code Generation
**Spark Wallet Integration**
```
Create a React component for a Spark wallet that includes:
- Balance display
- Send/receive functionality
- Lightning invoice generation
- Transaction history
- Error handling
```
**Token Management**
```
Build a TypeScript class for managing BTKN tokens on Spark with:
- Token creation methods
- Transfer functionality
- Balance queries
- Event handling
- Type safety
```
### Testing & QA
**Unit Test Generation**
```
Generate comprehensive unit tests for Spark SDK methods including:
- Mock implementations
- Edge case coverage
- Error scenarios
- Integration test patterns
- Performance benchmarks
```
**Security Audit**
```
Create a security checklist for Spark applications covering:
- Key management
- Transaction validation
- Input sanitization
- Rate limiting
- Privacy considerations
```
## AI Integration Prompts
### Trading Bots
**Market Analysis**
```
Analyze Bitcoin market conditions and generate trading signals for:
- Price trend identification
- Support/resistance levels
- Volume analysis
- Risk assessment
- Entry/exit points
```
**Portfolio Management**
```
Create an AI-driven portfolio management system that:
- Analyzes user spending patterns
- Suggests optimal allocations
- Implements dollar-cost averaging
- Manages risk exposure
- Provides rebalancing recommendations
```
### User Experience
**Transaction Categorization**
```
Categorize Bitcoin transactions automatically by:
- Merchant identification
- Transaction purpose
- Spending patterns
- Tax implications
- Budget tracking
```
**Fraud Detection**
```
Implement fraud detection for Spark transactions using:
- Pattern recognition
- Anomaly detection
- Risk scoring
- Real-time alerts
- Machine learning models
```
## Creative Applications
### Gaming & NFTs
**Game Economy Design**
```
Design a Bitcoin-native game economy featuring:
- In-game currency systems
- NFT marketplace
- Player rewards
- Economic balance
- Monetization strategies
```
**NFT Metadata Generation**
```
Generate rich metadata for Bitcoin NFTs including:
- Dynamic descriptions
- Attribute generation
- Rarity calculations
- Collection themes
- Community features
```
### DeFi Applications
**Yield Farming Strategy**
```
Create an automated yield farming strategy that:
- Identifies optimal opportunities
- Manages liquidity provision
- Optimizes returns
- Manages risks
- Implements stop-losses
```
**Lending Protocol**
```
Design a Bitcoin-backed lending protocol with:
- Collateral management
- Interest rate models
- Liquidation mechanisms
- Risk assessment
- User interface
```
## Documentation & Content
### Technical Writing
**API Documentation**
```
Generate comprehensive API documentation including:
- Method descriptions
- Parameter details
- Code examples
- Error handling
- Best practices
```
**Tutorial Creation**
```
Create step-by-step tutorials for:
- Wallet setup
- First transaction
- Lightning integration
- Token creation
- Advanced features
```
### Marketing & Community
**Product Descriptions**
```
Write compelling product descriptions for Spark applications:
- Feature highlights
- User benefits
- Technical advantages
- Use case examples
- Call-to-action
```
**Community Content**
```
Create engaging community content including:
- Educational posts
- Technical deep-dives
- User stories
- Development updates
- Event announcements
```
## Best Practices
### Prompt Engineering
* **Be Specific**: Include detailed requirements
* **Provide Context**: Explain the Spark ecosystem
* **Iterate**: Refine prompts based on results
* **Test**: Validate AI-generated code
* **Document**: Keep track of successful prompts
### Quality Assurance
* **Code Review**: Always review AI-generated code
* **Testing**: Implement comprehensive tests
* **Security**: Audit for vulnerabilities
* **Performance**: Optimize for efficiency
* **Maintenance**: Keep prompts updated
## Next Steps
* Browse the prompt library
* Customize prompts for your use case
* Share successful prompts with the community
* Contribute new prompts
* Build AI-powered Spark applications
# Example Repos
Source: https://docs.spark.money/guides/repos
Open-source Spark projects and code examples.
# Example Repositories
Explore real-world implementations and get inspired by community-built Spark applications.
## Overview
Discover open-source projects that demonstrate Spark's capabilities across different use cases and industries.
## Featured Repositories
### Wallet Applications
**Spark Wallet Template**
* Complete wallet implementation
* React/TypeScript stack
* Lightning integration
* Token support
**Mobile Spark Wallet**
* React Native implementation
* Cross-platform compatibility
* Biometric authentication
* Offline transaction signing
### DeFi Applications
**Spark DEX**
* Decentralized exchange on Spark
* Automated market making
* Liquidity provision
* Yield farming strategies
**Spark Lending Protocol**
* Bitcoin-backed lending
* Collateral management
* Interest rate optimization
* Risk assessment
### Gaming & NFTs
**Bitcoin Game Engine**
* In-game Bitcoin economy
* NFT marketplace
* Player-to-player trading
* Achievement rewards
**Spark NFT Platform**
* Bitcoin-native NFTs
* Instant transfers
* Low-cost minting
* Cross-game compatibility
### Developer Tools
**Spark CLI Tools**
* Command-line utilities
* Development helpers
* Testing frameworks
* Deployment scripts
**Spark SDK Examples**
* Code samples and tutorials
* Best practices
* Common patterns
* Integration guides
## Community Projects
### Open Source Contributions
* **Wallet Integrations**: Third-party wallet support
* **API Wrappers**: Language-specific SDKs
* **Testing Tools**: Development and QA utilities
* **Documentation**: Community-maintained guides
### Hackathon Winners
* **Innovation Awards**: Creative use cases
* **Technical Excellence**: Advanced implementations
* **User Experience**: Intuitive interfaces
* **Security Focus**: Robust implementations
## Getting Started
### Fork and Contribute
1. Browse the repositories
2. Fork interesting projects
3. Contribute improvements
4. Share your creations
### Build Your Own
1. Study existing implementations
2. Identify gaps and opportunities
3. Start with simple projects
4. Scale to complex applications
## Resources
* **GitHub Organization**: [@buildonspark](https://github.com/buildonspark)
* **Community Discord**: Join the developer community
* **Hackathons**: Participate in Spark hackathons
* **Grants Program**: Apply for development funding
## Next Steps
* Explore featured repositories
* Contribute to open source projects
* Build your own Spark application
* Share your work with the community
# Social Authentication
Source: https://docs.spark.money/guides/social-authentication
Add social login to Spark apps with Privy.
Integrate Privy for enhanced authentication and user management.
## What is Privy?
Privy provides seamless authentication and wallet management for Web3 applications.
## Integration Benefits
* **Social logins** - Google, Twitter, Discord
* **Wallet creation** - Automatic wallet generation
* **User management** - Built-in user profiles
* **Security** - Enterprise-grade security
## Quick Integration
```typescript theme={null}
import { PrivyProvider } from '@privy-io/react-auth'

// Wrap your app in PrivyProvider; replace the appId placeholder with
// your Privy app ID from the Privy dashboard
function App() {
  return (
    <PrivyProvider appId="your-privy-app-id">
      {/* your app components */}
    </PrivyProvider>
  )
}
```

## Documentation
* [Privy Docs](https://docs.privy.io/)
* [Spark + Privy Guide](/guides/privy-integration)
# Solana Bridge
Source: https://docs.spark.money/guides/sol-bridge
Bridge assets between Solana and Spark.
Bridge assets between Bitcoin L1 and Spark L2 seamlessly.
## Supported Assets
* Bitcoin (BTC)
* Spark tokens
* Lightning Network payments
## Bridge Features
* **Fast deposits** - Move Bitcoin to Spark in minutes
* **Instant withdrawals** - Exit to Bitcoin L1 anytime
* **Lightning integration** - Seamless Lightning payments
* **Non-custodial** - You maintain full control
## How It Works
The bridge uses Spark's statechain technology to enable trustless movement of assets between layers.
# Token Lists
Source: https://docs.spark.money/guides/token-lists
Work with token metadata and registries on Spark.
# Guide Title
Brief introduction explaining what this guide covers, what developers will learn, and what they'll be able to build after completing it.
## Prerequisites
Before starting this guide, make sure you have:
* \[Prerequisite 1] - Brief explanation
* \[Prerequisite 2] - Brief explanation
* \[Prerequisite 3] - Brief explanation
## What You'll Build
In this guide, you'll learn how to:
* \[Learning objective 1]
* \[Learning objective 2]
* \[Learning objective 3]
By the end, you'll have a \[description of final result].
## Step 1: Setup and Installation
### Install Dependencies
```bash theme={null}
npm install @spark/sdk @partner/client
```
### Environment Configuration
Create a `.env` file with your credentials:
```env theme={null}
SPARK_API_KEY=your_spark_api_key
PARTNER_API_KEY=your_partner_api_key
SPARK_NETWORK=testnet
```
### Initialize the Client
```typescript theme={null}
import { SparkClient } from '@spark/sdk'
import { PartnerClient } from '@partner/client'
const sparkClient = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
network: process.env.SPARK_NETWORK
})
const partnerClient = new PartnerClient({
apiKey: process.env.PARTNER_API_KEY
})
```
## Step 2: \[Core Functionality]
### Understanding the Concept
\[Explain the core concept or functionality being implemented]
### Implementation
```typescript theme={null}
// Example implementation code
async function implementCoreFunctionality() {
try {
// Step-by-step implementation
const result = await sparkClient.someMethod({
// configuration
})
console.log('Success:', result)
return result
} catch (error) {
console.error('Error:', error)
throw error
}
}
```
### Error Handling
```typescript theme={null}
// Proper error handling example
try {
const result = await implementCoreFunctionality()
} catch (error) {
if (error.code === 'SPECIFIC_ERROR') {
// Handle specific error case
} else {
// Handle general error case
}
}
```
## Step 3: \[Advanced Features]
### Feature Overview
\[Explain advanced features or optimizations]
### Implementation
```typescript theme={null}
// Advanced implementation
class AdvancedImplementation {
constructor(private sparkClient: SparkClient, private partnerClient: PartnerClient) {}
async advancedFeature() {
// Implementation details
}
}
```
## Step 4: Testing and Validation
### Unit Testing
```typescript theme={null}
import { describe, it, expect } from 'vitest'
describe('Core Functionality', () => {
it('should implement basic functionality', async () => {
const result = await implementCoreFunctionality()
expect(result).toBeDefined()
})
})
```
### Integration Testing
```typescript theme={null}
describe('Integration Tests', () => {
it('should work with partner integration', async () => {
// Test integration with partner service
})
})
```
## Step 5: Deployment and Production
### Production Configuration
```typescript theme={null}
const productionConfig = {
apiKey: process.env.PRODUCTION_SPARK_API_KEY,
network: 'mainnet',
timeout: 30000,
retries: 3
}
```
### Monitoring and Logging
```typescript theme={null}
import { Logger } from '@spark/sdk'
const logger = new Logger({
level: 'info',
service: 'your-service-name'
})
// Use logger throughout your application
logger.info('Operation completed successfully')
```
## Troubleshooting
### Common Issues
**Problem**: Getting authentication errors when making API calls.
**Solution**:
1. Verify your API key is correct
2. Check that your API key has the required permissions
3. Ensure you're using the correct network (testnet/mainnet)
**Problem**: Receiving rate limit errors.
**Solution**:
1. Implement exponential backoff
2. Add request queuing
3. Consider upgrading your API plan
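A minimal sketch of the exponential-backoff pattern recommended above; the helper name and parameters are illustrative, not part of any SDK:

```typescript
// Retry an async operation with exponential backoff plus jitter.
// Delay doubles each attempt: base, 2*base, 4*base, ...
async function withBackoff<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  baseDelayMs = 500
): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // out of retries, surface the error
      const delayMs = 2 ** attempt * baseDelayMs + Math.random() * 100;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}
```

Wrap rate-limited calls in it, e.g. `await withBackoff(() => client.someMethod({...}))`.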
### Debug Mode
Enable debug logging to troubleshoot issues:
```typescript theme={null}
const client = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
debug: true,
logLevel: 'debug'
})
```
## Best Practices
* **Security**: Always store API keys in environment variables
* **Error Handling**: Implement comprehensive error handling
* **Rate Limiting**: Respect API rate limits and implement backoff
* **Testing**: Write tests for all critical functionality
* **Monitoring**: Implement proper logging and monitoring
## Next Steps
Now that you've completed this guide, you can:
* \[Next step 1] - Link to related guide or documentation
* \[Next step 2] - Link to advanced topics
* \[Next step 3] - Link to integration examples
## Additional Resources
* [Related Documentation](/path/to/related/docs)
* [API Reference](/api-reference/endpoint)
* [Community Examples](https://github.com/spark/examples)
* [Support Forum](https://community.spark.money)
## Code Examples
```typescript Complete Implementation theme={null}
// Complete working example
import { SparkClient } from '@spark/sdk'
const client = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
network: 'testnet'
})
async function completeExample() {
// Full implementation
}
```
```javascript Node.js theme={null}
// Node.js specific implementation
const { SparkClient } = require('@spark/sdk')
const client = new SparkClient({
apiKey: process.env.SPARK_API_KEY,
network: 'testnet'
})
```
```python Python theme={null}
# Python implementation
from spark_sdk import SparkClient
client = SparkClient(
api_key=os.getenv('SPARK_API_KEY'),
network='testnet'
)
```
# Brale
Source: https://docs.spark.money/integrations/brale
Issue and manage compliant stablecoins on Spark with Brale.
Brale lets you issue and manage fiat-backed stablecoins on Spark with built‑in compliance and treasury controls. Use Brale to create a token, set rules (freeze, allow/block lists), mint/burn supply, and distribute to Spark wallet addresses via API or dashboard. Designed for regulated issuers and fintechs. See the official docs for details.
***
## Features
***
## Quick Integration
Below is a minimal pattern that mirrors Brale’s Quick Start: create or select a token in the dashboard, use an API key, then mint to a Spark address. Adjust endpoints and fields per your Brale workspace and token configuration.
Reference: [Quick Start](https://docs.brale.xyz/overview/quick-start)
```typescript theme={null}
// 1) Create/select a token in the Brale dashboard and get an API key
// 2) Server-side mint to a Spark address
const BRALE_API = 'https://api.brale.xyz'; // example base URL
const BRALE_API_KEY = process.env.BRALE_API_KEY!;
const TOKEN_ID = process.env.BRALE_TOKEN_ID!; // your Brale-issued token id
type MintRequest = {
tokenId: string;
amount: string; // integer string amount in smallest units
recipientSparkAddress: string; // Spark address (bech32m) to receive tokens
};
async function mintToSpark(req: MintRequest) {
const res = await fetch(`${BRALE_API}/v1/tokens/${req.tokenId}/mint`, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'Authorization': `Bearer ${BRALE_API_KEY}`,
},
body: JSON.stringify({
amount: req.amount,
recipient: req.recipientSparkAddress,
}),
});
if (!res.ok) {
const err = await res.text();
throw new Error(`Brale mint failed: ${res.status} ${err}`);
}
return await res.json();
}
// Example usage
await mintToSpark({
tokenId: TOKEN_ID,
amount: '1000000', // e.g., 1.000000 unit if 6 decimals
recipientSparkAddress: 'spark1xxxxx',
});
```
***
## Documentation
# Breez
Source: https://docs.spark.money/integrations/breez
Add self-custodial Lightning and Spark payments with Breez SDK.
Breez is a fully featured Lightning SDK that lets developers add self-custodial Lightning and Spark payments to their apps with almost zero lift.
If you want to integrate LNURL, Lightning Addresses, or Nostr support, Breez gives you everything in one place. It also includes bindings for all major languages and frameworks, making it the easiest and most complete way to build on Lightning today.
***
## Features
***
The SDK is written in Rust with bindings for all major languages (JS, Kotlin, Swift, Go, Python, RN, Flutter, C#). Here's how to integrate it in your app.
## Installation
```bash theme={null}
npm install @breeztech/breez-sdk-spark
```
## Quick Integration
When developing a browser application, import from `@breeztech/breez-sdk-spark/web`. You must initialize the WebAssembly module with `await init()` before making any other calls.
```javascript theme={null}
import init, {
initLogging,
defaultConfig,
SdkBuilder,
} from "@breeztech/breez-sdk-spark/web";
// Initialise the WebAssembly module
await init();
```
When developing a Node.js application (v22+), import from `@breeztech/breez-sdk-spark/nodejs`.
```javascript theme={null}
const {
initLogging,
defaultConfig,
SdkBuilder,
} = require("@breeztech/breez-sdk-spark/nodejs");
const { Command } = require("commander");
require("dotenv").config();
class JsLogger {
log = (logEntry) => {
console.log(
`[${new Date().toISOString()} ${logEntry.level}]: ${logEntry.line}`
);
};
}
const fileLogger = new JsLogger();
class JsEventListener {
onEvent = (event) => {
fileLogger.log({
level: "INFO",
line: `Received event: ${JSON.stringify(event)}`,
});
};
}
const eventListener = new JsEventListener();
const program = new Command();
const initSdk = async () => {
// Set the logger to trace
await initLogging(fileLogger);
// Get the mnemonic
const mnemonic = process.env.MNEMONIC;
// Connect using the config
let config = defaultConfig("regtest");
config.apiKey = process.env.BREEZ_API_KEY;
console.log(`defaultConfig: ${JSON.stringify(config)}`);
let sdkBuilder = SdkBuilder.new(config, {
type: "mnemonic",
mnemonic: mnemonic,
});
sdkBuilder = await sdkBuilder.withDefaultStorage("./.data");
const sdk = await sdkBuilder.build();
await sdk.addEventListener(eventListener);
return sdk;
};
program
.name("breez-sdk-spark-wasm-cli")
.description("CLI for Breez SDK Spark - Wasm");
program.command("get-info").action(async () => {
const sdk = await initSdk();
const res = await sdk.getInfo({});
console.log(`getInfo: ${JSON.stringify(res)}`);
});
program.parse();
```
***
## Documentation
# Flashnet
Source: https://docs.spark.money/integrations/flashnet
Swap tokens on Spark using Flashnet's AMM and liquidity pools.
Flashnet is the leading AMM (automated market maker) on Spark. It enables instant token swaps, liquidity pools, and market creation without custody risk. Built on Spark's cryptographic messaging layer, Flashnet provides near-instant settlement and zero-fee transactions for Bitcoin and token trading in your applications.
***
## Use Cases
***
## Quick Integration
```bash npm theme={null}
npm install @flashnet/sdk @buildonspark/spark-sdk
```
```bash yarn theme={null}
yarn add @flashnet/sdk @buildonspark/spark-sdk
```
```bash bun theme={null}
bun add @flashnet/sdk @buildonspark/spark-sdk
```
### Initialize Flashnet Client
```typescript theme={null}
import { FlashnetClient } from '@flashnet/sdk';
import { SparkWallet } from '@buildonspark/spark-sdk';
// Initialize your Spark wallet
const { wallet } = await SparkWallet.initialize({
mnemonicOrSeed: process.env.MNEMONIC,
options: {
network: "REGTEST", // or "MAINNET"
},
});
// Create the Flashnet client
const client = new FlashnetClient(wallet);
await client.initialize();
// The client will automatically:
// - Authenticate with the AMM service
// - Handle all signing and intent generation
```
### Using the Spark Wallet
The Flashnet client can also be used as a regular Spark wallet:
```typescript theme={null}
// Create a Lightning invoice using the client wallet
const invoice = await client.wallet.createLightningInvoice({
amountSats: 1000000,
memo: "Test invoice",
});
console.log(invoice);
```
***
## Documentation
# Garden
Source: https://docs.spark.money/integrations/garden
Swap BTC across chains using Garden's atomic swap protocol on Spark.
# Lightspark Grid
Source: https://docs.spark.money/integrations/grid
Add fiat on-ramp and off-ramp to your Spark app with Grid.
Grid gives builders on Spark everything they need to create natively embedded on/off-ramp experiences directly within their wallet architecture. Whether you’re pulling funds from a U.S. bank account into a Spark wallet or off-ramping elsewhere, Grid provides the APIs to power the next generation of money apps natively on Bitcoin.
***
### Use Cases
***
## Documentation
# Partner
Source: https://docs.spark.money/integrations/integration-template
Template for Spark integration pages.
Integrate \[Partner Name] for use case this partner enables and what functionality they provide. \[Brief description of what the partner does and their value proposition for developers building on Spark.]
***
## Features
***
## Quick Integration (if applicable)
```typescript theme={null}
// Example code showing basic integration
import { PartnerClient } from '@partner/client'
const client = new PartnerClient({
apiKey: 'your-partner-key',
network: 'spark'
})
// Basic usage example
const result = await client.someMethod({
// configuration options
})
```
***
## Use Cases
## Guides
Coming soon...
***
## Documentation
# Privy
Source: https://docs.spark.money/integrations/privy
Add embedded wallets and social login to your Spark app with Privy.
Integrate Privy for seamless authentication and wallet management in your Spark applications. Privy provides a complete authentication and wallet management solution for Web3 applications, making it easy to onboard users and manage their digital identities on Spark.
***
## Use Cases
***
## Quick Integration
### Installation
Install the Privy React SDK using your package manager of choice:
```bash npm theme={null}
npm install @privy-io/react-auth@latest
```
```bash pnpm theme={null}
pnpm install @privy-io/react-auth@latest
```
```bash yarn theme={null}
yarn add @privy-io/react-auth@latest
```
### Setup
```typescript App.tsx theme={null}
'use client';
import {PrivyProvider} from '@privy-io/react-auth';

// Wrap your app in PrivyProvider; replace the appId placeholder with
// your Privy app ID from the Privy dashboard
export default function Providers({children}: {children: React.ReactNode}) {
  return (
    <PrivyProvider appId="your-privy-app-id">
      {children}
    </PrivyProvider>
  );
}
```
***
## Documentation
# Sparkscan
Source: https://docs.spark.money/integrations/sparkscan
Block explorer and API for querying Spark transactions and network data.
Integrate Sparkscan for comprehensive blockchain data and network analytics on Spark. Sparkscan is the official block explorer for Spark, providing real-time transaction data, network analytics, and comprehensive blockchain information through both web interface and API.
***
## Features
***
## Quick Integration
```typescript theme={null}
import { SparkscanClient } from '@sparkscan/client'
const client = new SparkscanClient({
apiKey: 'your-sparkscan-key',
network: 'spark'
})
// Get transaction details
const tx = await client.getTransaction('tx-hash')
// Get address balance
const balance = await client.getBalance('spark:address')
// Get network stats
const stats = await client.getNetworkStats()
```
***
## Use Cases
## Guides
Coming soon...
***
## Documentation
# Tether Wallet Development Kit
Source: https://docs.spark.money/integrations/tether-wdk
Build wallets with Tether's WDK using the Spark module for Bitcoin and stablecoin support.
# API Reference
Source: https://docs.spark.money/issuance/api-reference
# Burn Tokens
Source: https://docs.spark.money/issuance/burn-tokens
Burning permanently destroys tokens from the issuer wallet. When called, the tokens are removed from circulation and from the network state.
## Burn
```typescript theme={null}
await wallet.burnTokens(BigInt(5_000_000000)); // 5,000 tokens (6 decimals)
```
Burning is irreversible. Burned tokens are gone forever.
## When to Burn
Common use cases:
* Stablecoin redemption: user redeems tokens for fiat, you burn the equivalent
* Deflationary mechanics: periodic burns to reduce supply
* Token buyback: repurchase tokens from the market and burn them
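For the stablecoin-redemption case, the burn amount is just the redeemed fiat amount converted to the token's base units. A small illustrative helper (the function name and the 1:1 USD peg are assumptions for this sketch, not part of the SDK):

```typescript
// Convert a redeemed fiat amount (in cents) to token base units.
// Assumes a 1:1 USD peg and a token with `decimals` decimal places.
function redemptionBurnAmount(cents: bigint, decimals: number): bigint {
  // Cents carry 2 implicit decimals; rescale to the token's precision
  if (decimals >= 2) return cents * 10n ** BigInt(decimals - 2);
  return cents / 10n ** BigInt(2 - decimals);
}

// A user redeems 250.00 USD against a 6-decimal token:
const amount = redemptionBurnAmount(25000n, 6); // 250000000n
// then: await wallet.burnTokens(amount);
```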
## Verify the Burn
```typescript theme={null}
const before = await wallet.getIssuerTokenBalance();
console.log("Before:", before.balance);
await wallet.burnTokens(BigInt(1000_000000));
const after = await wallet.getIssuerTokenBalance();
console.log("After:", after.balance);
```
## Proof-of-Burn Address
Anyone can also burn tokens by sending them to the network's proof-of-burn address:
```
spark1pgssyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszykl0d2
```
Tokens sent to this address are permanently removed from circulation and cannot be recovered.
# Create Token
Source: https://docs.spark.money/issuance/create-token
Creating a token registers it on the network and locks in its properties. Once created, the token exists immediately and you can start minting. All token properties (name, ticker, decimals, max supply, freezability) are permanent and cannot be changed after creation.
## Create Your Token
```typescript theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";
const { wallet, mnemonic } = await IssuerSparkWallet.initialize({
options: { network: "REGTEST" }
});
await wallet.createToken({
tokenName: "Acme Dollar",
tokenTicker: "ACME",
decimals: 6,
maxSupply: BigInt(0),
isFreezable: true
});
const tokenId = await wallet.getIssuerTokenIdentifier();
console.log("Token created:", tokenId); // btkn1q...
```
## Token Properties
All properties are permanent. Choose carefully.
### Name and Ticker
Display name and symbol shown in wallets and explorers.
```typescript theme={null}
tokenName: "Acme Dollar"
tokenTicker: "ACME"
```
### Decimals
Defines the smallest unit. If `decimals: 6`, then `1000000` base units = 1 token.
| Decimals | Smallest Unit | Common Usage |
| :------- | :------------ | :------------ |
| 6 | 0.000001 | Stablecoins |
| 8 | 0.00000001 | Bitcoin-style |
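Since the SDK takes token amounts as `bigint` base units, you will usually need to convert human-readable amounts. A sketch of that conversion, working on the string representation to avoid floating-point drift (the helper is illustrative, not part of the SDK):

```typescript
// Convert a display amount to base units for a token with `decimals` places.
// Excess fractional digits beyond `decimals` are truncated.
function toBaseUnits(amount: number, decimals: number): bigint {
  const [whole, frac = ""] = amount.toString().split(".");
  const fracPadded = frac.padEnd(decimals, "0").slice(0, decimals);
  return BigInt(whole + fracPadded);
}

toBaseUnits(1.5, 6);  // 1500000n
toBaseUnits(5000, 6); // 5000000000n
```

Note this simple version does not handle scientific-notation inputs (e.g. `1e-7`) or negative amounts; validate inputs before converting in production code.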
### Max Supply
`BigInt(0)` for unlimited. Any other value caps total supply forever.
```typescript theme={null}
// Unlimited
maxSupply: BigInt(0)
// Capped at 21 million (8 decimals)
maxSupply: BigInt(21_000_000_00000000)
```
### Freezable
If `true`, you can freeze addresses from transacting your token. If `false`, you lose this ability permanently.
This cannot be changed. If you set `isFreezable: false`, you can never freeze tokens.
## Get Token Info
After creation, retrieve your token's metadata and identifier:
```typescript theme={null}
const metadata = await wallet.getIssuerTokenMetadata();
console.log(metadata.tokenName); // "Acme Dollar"
console.log(metadata.tokenTicker); // "ACME"
console.log(metadata.decimals); // 6
console.log(metadata.maxSupply); // 0n
console.log(metadata.isFreezable); // true
const tokenId = await wallet.getIssuerTokenIdentifier();
console.log(tokenId); // btkn1q...
```
## Vanity Identifiers
Token identifiers are derived from your wallet keys. Want a custom suffix like `btkn1...usdc`? Use the [vanity generator](https://github.com/buildonspark/spark/tree/main/tools/vanity-token-generator).
# FAQ
Source: https://docs.spark.money/issuance/faq
### How did you make tokens possible on Spark?
Tokens are possible through the BTKN (Bitcoin Token) protocol. BTKN is our adaptation of the LRC-20 protocol, specifically optimized for Spark. It lets us represent tokens using regular Bitcoin transactions. On Spark, we made this protocol native so tokens can be issued, transferred, and managed directly, while staying fully connected to Bitcoin.
### Can anyone issue a BTKN token?
Yes. Token issuance is permissionless. BTKN and Spark are open protocols.
### Is BTKN compatible with L1?
Yes, BTKN will be compatible with L1. We expect most of the activity to happen on Spark, with L1 acting as the escape valve in case anything goes wrong.
### Are there any fees?
No gas, no network fees. BTKN assets on Spark move freely.
### Does BTKN support stablecoins?
Yes. BTKN was built for stablecoin issuers. We're agnostic to what kind of stablecoin you're building. Our goal is to give issuers the right primitives, and meet them where they are. If you're planning to issue one, reach out.
### Is BTKN compatible with the Lightning Network?
Not planned. As of now, we're focused on Spark support where liquidity, UX, and interoperability matter most. Lightning support is possible in the future, but not a priority today.
### Can BTKN exit from Spark to Bitcoin?
Soon. Spark is designed for unilateral exits. Every token on Spark will map to a real UTXO on Bitcoin L1. Users will be able to exit at any time, without trusting anyone.
### Is Spark planning to support other token standards?
Not natively for now. We think liquidity will consolidate around a single standard, and BTKN has the highest chance of winning by onboarding stablecoin issuers. That said, we're open to supporting other standards if real demand emerges.
### How does BTKN work with Spark?
On Spark, BTKN tokens are native. Minting, transfers, and burns are embedded into Spark leaves and validated by Spark Operators.
### I want to issue my token: how do I get started?
Start with our [quickstart](/quickstart/launch-token) to issue your first token. For more details, check out our [API reference](/api-reference/issuer-overview).
# Freeze Tokens
Source: https://docs.spark.money/issuance/freeze-tokens
Freezing is an optional capability that issuers can enable when creating a token. Spark does not require or enforce freezing. If you created your token with `isFreezable: false`, this feature is permanently disabled and your token operates without any freeze restrictions.
Issuers who need compliance controls (regulated stablecoins, for example) can opt into freezing. Issuers who want fully permissionless tokens simply set `isFreezable: false` at creation.
## Freeze an Address
If your token has freezing enabled, you can prevent an address from sending or receiving:
```typescript theme={null}
const tokenId = await wallet.getIssuerTokenIdentifier();
const result = await wallet.freezeTokens({
  tokenIdentifier: tokenId,
  sparkAddress: "spark1abc..."
});
console.log("Frozen amount:", result.impactedTokenAmount);
```
The response includes:
* `impactedOutputIds`: The token outputs that were frozen
* `impactedTokenAmount`: Total amount of tokens frozen
## Unfreeze
```typescript theme={null}
await wallet.unfreezeTokens({
  tokenIdentifier: tokenId,
  sparkAddress: "spark1abc..."
});
```
## Check Freezability
```typescript theme={null}
const metadata = await wallet.getIssuerTokenMetadata();
if (!metadata.isFreezable) {
  console.log("This token cannot freeze addresses");
}
```
## Limitations
* You cannot freeze your own issuer address
* Freezing takes effect immediately
* Only works for tokens created with `isFreezable: true`
# Issuer Wallet
Source: https://docs.spark.money/issuance/issuer-wallet
Your issuer wallet holds the keys that control your token. Create one to get started, or restore an existing wallet with your mnemonic.
## Create a Wallet
```typescript theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";
const { wallet, mnemonic } = await IssuerSparkWallet.initialize({
  options: { network: "REGTEST" }
});
console.log("Backup this phrase:", mnemonic);
console.log("Your address:", await wallet.getSparkAddress());
```
Store the mnemonic securely. It's the only way to recover your wallet.
## Restore a Wallet
```typescript theme={null}
const { wallet } = await IssuerSparkWallet.initialize({
  mnemonicOrSeed: "abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon about",
  options: { network: "MAINNET" }
});
```
## Networks
| Network | Use |
| :-------- | :---------------------- |
| `REGTEST` | Development and testing |
| `MAINNET` | Production |
Start on `REGTEST`. When you're ready for production, generate a fresh wallet on `MAINNET`.
## Multiple Tokens
Each wallet creates one token. To issue multiple tokens, use different account numbers:
```typescript theme={null}
const { wallet: tokenA } = await IssuerSparkWallet.initialize({
  mnemonicOrSeed: "your mnemonic...",
  accountNumber: 0,
  options: { network: "REGTEST" }
});

const { wallet: tokenB } = await IssuerSparkWallet.initialize({
  mnemonicOrSeed: "your mnemonic...",
  accountNumber: 1,
  options: { network: "REGTEST" }
});
```
## Cleanup
Close connections when your app shuts down:
```typescript theme={null}
await wallet.cleanupConnections();
```
# Mint Tokens
Source: https://docs.spark.money/issuance/mint-tokens
Minting creates new tokens and deposits them directly into your issuer wallet. From there, you can transfer them to users or hold them in reserve. Only the issuer wallet that created the token can mint.
## Mint
```typescript theme={null}
// Mint 10,000 tokens (6 decimals)
await wallet.mintTokens(BigInt(10_000_000000));
```
The amount is in base units. If your token has 6 decimals, multiply the human-readable amount by 1,000,000.
```typescript theme={null}
function toBaseUnits(amount: number, decimals: number): bigint {
  return BigInt(Math.floor(amount * Math.pow(10, decimals)));
}
await wallet.mintTokens(toBaseUnits(10000, 6));
```
## Check Your Balance
After minting, tokens appear in your wallet immediately:
```typescript theme={null}
const balance = await wallet.getIssuerTokenBalance();
console.log("Your balance:", balance.balance);
```
## Max Supply
If you set a `maxSupply` when creating your token, minting fails once you reach it. With unlimited supply (`maxSupply: BigInt(0)`), you can mint indefinitely.
```typescript theme={null}
const metadata = await wallet.getIssuerTokenMetadata();
if (metadata.maxSupply === BigInt(0)) {
  console.log("Unlimited supply");
} else {
  console.log("Max supply:", metadata.maxSupply);
}
```
# Issuer SDK
Source: https://docs.spark.money/issuance/overview
The Spark Issuer SDK lets you create and manage tokens on the Spark network in the most scalable and developer-friendly way possible. Whether you're launching a new token, managing supply, or building token-based applications, the SDK provides everything you need to get started. See how [Brale built USDB](https://brale.xyz/blog/brale-spark-digital-dollars-native-to-bitcoin), the first Bitcoin-native stablecoin on Spark.
***
## Installation
***
## Guides
***
## Tools
# Testing Guide
Source: https://docs.spark.money/issuance/testing-guide
### Prerequisites
* Node.js 16+
* [Spark CLI tool](https://github.com/buildonspark/spark)
### Using our Spark CLI tool
We have a CLI tool that allows you to test your wallet operations on Spark. No coding is required!
To install the CLI tool:
```bash theme={null}
git clone https://github.com/buildonspark/spark.git
cd spark/sdks/js
yarn && yarn build
cd examples/spark-cli
yarn cli
```
This will start the CLI tool and you will be able to interact with the wallet. Run `help` to see the available commands.
## Command Reference
| Command | Usage |
| :---------------- | :------------------------------------------------------------------------------------------------ |
| `initwallet` | Creates a new wallet instance. If no mnemonic is provided, one is generated. |
| `getbalance` | Gets the current wallet balance as well as the token balance. |
| `getsparkaddress` | Gets a new Spark address for receiving transfers. |
| `announcetoken` | Announces a new token with the given parameters. |
| `minttokens` | Mints the specified amount of tokens. |
| `transfertokens` | Sends tokens to the given receiver Spark address, using the Bech32m token identifier (`btkn1...`). |
| `burntokens` | Burns the specified amount of tokens. |
| `freezetokens` | Freezes tokens held at the given Spark address. |
| `unfreezetokens` | Unfreezes tokens held at the given Spark address. |
| `tokenactivity` | Gets the token activity for the issuer's token. |
| `tokeninfo` | Gets the token info for the issuer's token. |
| `help` | Shows the help menu. |
| `exit` | Exits the CLI tool. |
## Demo application
A fully built demo application is available [here](https://github.com/buildonspark/spark/tree/develop/sdks/js/examples/spark-demo).
## Sample Express server project
### Clone the SDK repo
```bash theme={null}
git clone https://github.com/buildonspark/spark.git
```
### Navigate to project directory
```bash theme={null}
cd spark/sdks/js/examples/spark-node-express/
```
Follow the instructions in the README to install dependencies and run the server.
# Analytics
Source: https://docs.spark.money/issuance/token-analytics
Query the full transaction history for your token. Filter by address, status, or transaction hash to understand how your token moves through the network.
## Query Transactions
Get all transactions for your token:
```typescript theme={null}
const tokenId = await wallet.getIssuerTokenIdentifier();
const response = await wallet.queryTokenTransactions({
  tokenIdentifiers: [tokenId]
});
console.log("Total transactions:", response.tokenTransactions.length);
```
## Filter by Address
Get transactions for a specific Spark address:
```typescript theme={null}
const response = await wallet.queryTokenTransactions({
  sparkAddresses: ["spark1abc..."]
});
```
You can only use **one filter type** per query. To filter by both token and address, first query by `tokenIdentifiers`, then filter the results by address in your application code.
## Transaction Status
Each transaction has a status:
| Status | Meaning |
| :------------------------------------ | :----------------------- |
| `TOKEN_TRANSACTION_STARTED` | Created, not yet signed |
| `TOKEN_TRANSACTION_SIGNED` | Signed by all parties |
| `TOKEN_TRANSACTION_FINALIZED` | Confirmed |
| `TOKEN_TRANSACTION_STARTED_CANCELLED` | Cancelled before signing |
| `TOKEN_TRANSACTION_SIGNED_CANCELLED` | Cancelled after signing |
Filter by status:
```typescript theme={null}
const finalized = response.tokenTransactions.filter(
  tx => tx.status === "TOKEN_TRANSACTION_FINALIZED"
);
console.log("Completed transactions:", finalized.length);
```
## Pagination
For tokens with many transactions, use pagination:
```typescript theme={null}
const response = await wallet.queryTokenTransactions({
  tokenIdentifiers: [tokenId],
  order: "desc",
  pageSize: 50,
  offset: 0
});
console.log("Transactions in page:", response.tokenTransactions.length);
```
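To walk every page, loop until a page comes back short. A minimal generic sketch — `fetchPage` is a stand-in you would back with `wallet.queryTokenTransactions`, and treating a short page as the end of the results is an assumption about the API's behavior:

```typescript
// Generic page-walker. `fetchPage` stands in for a call like
// wallet.queryTokenTransactions({ tokenIdentifiers, order, pageSize, offset }).
async function fetchAllPages<T>(
  fetchPage: (offset: number, pageSize: number) => Promise<T[]>,
  pageSize = 50
): Promise<T[]> {
  const all: T[] = [];
  let offset = 0;
  while (true) {
    const page = await fetchPage(offset, pageSize);
    all.push(...page);
    // A page smaller than pageSize means we've reached the end.
    if (page.length < pageSize) break;
    offset += pageSize;
  }
  return all;
}
```

Backed by the SDK, the callback would be something like `(offset, pageSize) => wallet.queryTokenTransactions({ tokenIdentifiers: [tokenId], order: "desc", pageSize, offset }).then(r => r.tokenTransactions)`.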
# Holders
Source: https://docs.spark.money/issuance/token-holders
Check token balances for any Spark address. As an issuer, you can query the balance of any holder on the network.
## Your Balance
Check how many tokens you hold:
```typescript theme={null}
const balance = await wallet.getIssuerTokenBalance();
console.log("Your balance:", balance.balance, "base units");
```
## Check Any Address
Pass an address to check someone else's balance:
```typescript theme={null}
const userBalance = await wallet.getIssuerTokenBalance("spark1abc...");
console.log("User balance:", userBalance.balance);
```
## Display Amounts
Convert base units to human-readable format:
```typescript theme={null}
const metadata = await wallet.getIssuerTokenMetadata();
const balance = await wallet.getIssuerTokenBalance("spark1abc...");
const amount = Number(balance.balance) / Math.pow(10, metadata.decimals);
console.log(`${amount} ${metadata.tokenTicker}`);
```
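Converting through `Number` works for display but loses precision once a balance exceeds 2^53 base units. A bigint-safe formatter sketch (illustrative only, not part of the SDK):

```typescript
// Formats a base-unit bigint exactly, with no float conversion,
// so amounts beyond Number.MAX_SAFE_INTEGER still render correctly.
function formatTokenAmount(baseUnits: bigint, decimals: number): string {
  const neg = baseUnits < 0n;
  const abs = (neg ? -baseUnits : baseUnits).toString().padStart(decimals + 1, "0");
  const whole = abs.slice(0, abs.length - decimals);
  const frac = abs.slice(abs.length - decimals).replace(/0+$/, "");
  return (neg ? "-" : "") + whole + (frac ? "." + frac : "");
}

console.log(formatTokenAmount(1_500_000n, 6)); // 1.5
console.log(formatTokenAmount(100n, 6));       // 0.0001
```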
## All Token Balances
Your wallet can hold multiple tokens. Get them all at once:
```typescript theme={null}
const { balance, tokenBalances } = await wallet.getBalance();
console.log("Bitcoin:", balance, "sats");
tokenBalances.forEach((data, key) => {
  const amount = Number(data.balance) / Math.pow(10, data.tokenMetadata.decimals);
  console.log(`${data.tokenMetadata.tokenTicker}: ${amount}`);
});
```
## Listen for Changes
Get notified when balances change:
```typescript theme={null}
wallet.on("transfer:claimed", async (transferId, newBalance) => {
  console.log("Transfer received:", transferId);
  const updated = await wallet.getBalance();
  console.log("New balance:", updated.balance);
});
```
# Spark CLI
Source: https://docs.spark.money/issuance/tools/cli
# Transfer Tokens
Source: https://docs.spark.money/issuance/transfer-tokens
Send tokens to any Spark address. Transfers are instant, free, and recipients receive them automatically.
## Send Tokens
```typescript theme={null}
const tokenId = await wallet.getIssuerTokenIdentifier();
await wallet.transferTokens({
  tokenIdentifier: tokenId,
  tokenAmount: BigInt(100_000000), // 100 tokens (6 decimals)
  receiverSparkAddress: "spark1abc..."
});
```
## Batch Transfer
Send to multiple recipients in a single transaction:
```typescript theme={null}
await wallet.batchTransferTokens([
  { tokenIdentifier: tokenId, tokenAmount: BigInt(1000_000000), receiverSparkAddress: "spark1alice..." },
  { tokenIdentifier: tokenId, tokenAmount: BigInt(500_000000), receiverSparkAddress: "spark1bob..." },
  { tokenIdentifier: tokenId, tokenAmount: BigInt(250_000000), receiverSparkAddress: "spark1carol..." }
]);
```
Batch transfers are atomic. All succeed or none do. Each item in the array must include its own `tokenIdentifier`.
## Token Amounts
All amounts are in base units. If your token has 6 decimals:
| Human | Base Units |
| :---- | :---------- |
| 1 | 1,000,000 |
| 100 | 100,000,000 |
| 0.5 | 500,000 |
```typescript theme={null}
function toBaseUnits(amount: number, decimals: number): bigint {
  return BigInt(Math.floor(amount * Math.pow(10, decimals)));
}

await wallet.transferTokens({
  tokenIdentifier: tokenId,
  tokenAmount: toBaseUnits(50.5, 6),
  receiverSparkAddress: "spark1..."
});
```
## Receiving Tokens
Recipients don't need to do anything. Tokens appear instantly:
```typescript theme={null}
const { tokenBalances } = await wallet.getBalance();
tokenBalances.forEach((data, key) => {
  console.log(data.tokenMetadata.tokenName, ":", data.balance);
});
```
# TypeScript
Source: https://docs.spark.money/issuance/typescript
TypeScript SDK for issuing and managing tokens on Spark. Requires Node.js v16+.
## Getting Started
To get started, follow the steps below.
Install the Spark Issuer SDK packages using your package manager of choice.
```bash npm theme={null}
npm install @buildonspark/issuer-sdk
```
```bash yarn theme={null}
yarn add @buildonspark/issuer-sdk
```
```bash pnpm theme={null}
pnpm add @buildonspark/issuer-sdk
```
Create an issuer instance that will be used to interact with the Spark network.
```ts issuer.ts theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";
const { wallet, mnemonic } = await IssuerSparkWallet.initialize({
  mnemonicOrSeed: "optional-mnemonic-or-seed",
  accountNumber: 0, // optional
  options: {
    network: "REGTEST",
  },
});
console.log("Wallet initialized successfully:", mnemonic);
```
You're ready to start building.
```ts app.ts theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";
async function main() {
  try {
    // Initialize issuer wallet
    const { wallet } = await IssuerSparkWallet.initialize({
      options: { network: "REGTEST" }
    });
    console.log("Issuer wallet created!");
    console.log("Address:", await wallet.getSparkAddress());

    // Create a token
    const tx = await wallet.createToken({
      tokenName: "My Token",
      tokenTicker: "MTK",
      decimals: 8,
      maxSupply: BigInt(21000000_00000000), // 21M tokens
      isFreezable: false
    });
    console.log("Token created:", tx);

    // Get token identifier
    const tokenId = await wallet.getIssuerTokenIdentifier();
    console.log("Token ID:", tokenId);

    // Mint tokens to yourself
    await wallet.mintTokens(BigInt(1000_00000000)); // 1000 tokens
    console.log("Tokens minted!");
  } catch (error) {
    console.error("Error:", error);
  }
}

main();
```
## TypeScript Configuration
### tsconfig.json
Create a `tsconfig.json` file in your project root:
```json tsconfig.json theme={null}
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "lib": ["ES2020"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
```
### Package.json Scripts
Add TypeScript build scripts to your `package.json`:
```json package.json theme={null}
{
  "scripts": {
    "build": "tsc",
    "start": "node dist/app.js",
    "dev": "ts-node src/app.ts",
    "watch": "tsc --watch"
  }
}
```
## Core Issuer Operations
### Initialize an Issuer Wallet
An issuer wallet requires either a mnemonic or raw seed for initialization. The `initialize()` function accepts both. If no input is given, it will auto-generate a mnemonic and return it.
```ts theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";
// Initialize a new issuer wallet
const { wallet, mnemonic } = await IssuerSparkWallet.initialize({
  mnemonicOrSeed: "optional-mnemonic-or-seed",
  accountNumber: 0, // optional
  options: {
    network: "REGTEST" // or "MAINNET"
  }
});
console.log("Issuer wallet initialized:", mnemonic);
```
### Mnemonic Phrases
A mnemonic is a human-readable encoding of your wallet's seed. It's a 12- or 24-word phrase from the BIP-39 wordlist, used to derive the cryptographic keys that control your issuer wallet.
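For illustration, BIP-39 specifies that the seed is derived from the phrase with PBKDF2-HMAC-SHA512: 2048 rounds, salt `"mnemonic"` plus an optional passphrase, 64-byte output. A minimal sketch using Node's built-in crypto (the SDK handles this for you internally):

```typescript
import { pbkdf2Sync } from "node:crypto";

// BIP-39 seed derivation: PBKDF2-HMAC-SHA512, 2048 rounds,
// salt = "mnemonic" + optional passphrase, 64-byte output.
function mnemonicToSeed(mnemonic: string, passphrase = ""): Buffer {
  return pbkdf2Sync(
    mnemonic.normalize("NFKD"),
    ("mnemonic" + passphrase).normalize("NFKD"),
    2048,
    64,
    "sha512"
  );
}

const seed = mnemonicToSeed(
  "abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon about"
);
console.log(seed.length); // 64
```

Because derivation is deterministic, the same phrase (and passphrase) always yields the same seed, which is why backing up the mnemonic is sufficient to recover the wallet.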
## TypeScript Features
### Type Safety
The Spark Issuer TypeScript SDK provides full type safety for all issuer operations:
```ts theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";
// TypeScript will provide autocomplete and type checking
const { wallet } = await IssuerSparkWallet.initialize({
options: { network: "REGTEST" }
});
// All methods are fully typed
const address: string = await wallet.getSparkAddress();
const metadata = await wallet.getIssuerTokenMetadata();
const tokenId = await wallet.getIssuerTokenIdentifier();
```
### Interface Definitions
Key interfaces used by the Issuer SDK:
```ts theme={null}
// Token creation parameters
interface CreateTokenParams {
  tokenName: string;
  tokenTicker: string;
  decimals: number;
  maxSupply: bigint;
  isFreezable: boolean;
  extraMetadata?: Uint8Array;
}

// Token metadata returned by getIssuerTokenMetadata()
interface IssuerTokenMetadata {
  rawTokenIdentifier: Uint8Array;
  tokenPublicKey: string;
  tokenName: string;
  tokenTicker: string;
  decimals: number;
  maxSupply: bigint;
  isFreezable: boolean;
  extraMetadata?: Uint8Array;
}

// Token distribution stats
interface TokenDistribution {
  totalCirculatingSupply: bigint;
  totalIssued: bigint;
  totalBurned: bigint;
  numHoldingAddress: number;
  numConfirmedTransactions: bigint;
}
```
### Error Handling
The SDK provides typed error classes for better error handling:
```ts theme={null}
import { IssuerSparkWallet } from "@buildonspark/issuer-sdk";
import { SparkError, SparkValidationError } from "@buildonspark/spark-sdk";
try {
  await wallet.createToken({
    tokenName: "My Token",
    tokenTicker: "MTK",
    decimals: 8,
    maxSupply: BigInt(21000000_00000000),
    isFreezable: false
  });
} catch (error) {
  if (error instanceof SparkValidationError) {
    console.error('Validation error:', error.message);
  } else if (error instanceof SparkError) {
    console.error('SDK error:', error.message);
  } else {
    console.error('Unexpected error:', error);
  }
}
```
# Core Concepts
Source: https://docs.spark.money/learn/core-concepts
A statechain is a protocol that enables off-chain transfers of ownership for blockchain assets. It allows users to transfer control of a UTXO multiple times without creating on-chain transactions, using cryptographic signatures facilitated by a group of entities. This approach aims to improve transaction speed, privacy, and scalability while maintaining the security of the underlying blockchain.
## Spark Entity (SE)
The group of operators that run a Spark. They are responsible for performing the operations necessary for signing and forgetting past keys.
## Spark Operator (SO)
One of the operators within the SE. All or a threshold of operators are used to aid in the transfer of off-chain UTXOs.
## Spark Service Provider (SSP)
A service provider that facilitates efficient deposits to and withdrawals from Spark. Any number of SSPs can exist within a single Spark. SSPs (or other entities) can optionally also serve as Lightning providers, enabling Lightning transactions for users within Spark.
## Branches and Leaves
Leaves are terminal transactions of the tree that are owned by an individual user.
Branches are all transactions of the tree that are not leaf transactions. These are nearly identical to leaf transactions, except they do not have timelocks and can only be spent by the sum of the keys of the leaves under them.
## Exit Transaction
A signed Bitcoin transaction that sends funds from Spark to the user. This serves as the unilateral exit mechanism, enabling any user to withdraw funds from Spark without cooperation by broadcasting the exit transaction and its parents.
## Atomic Swap
Exchanging two secrets A and B trustlessly for each other such that either both parties involved know both A and B or neither know both A and B.
# Deposits from L1
Source: https://docs.spark.money/learn/deposits
Depositing L1 funds to Spark is straightforward. The SE (Spark Entity) and user collaborate to generate an aggregate public key and derive a pay-to-taproot address from it. They then work together to create and sign two transactions: an exit transaction, and an intermediate branch transaction before it that triggers the exit transaction's relative timelock. Once both transactions are signed, the user can broadcast the deposit transaction to the pay-to-taproot address. Note that the inputs to the deposit transaction should be SegWit inputs. The user will then have a leaf in Spark.
### Step-by-Step Process
1. **Key Generation:**
* The user and SE work together to generate an aggregate public key, which is the sum of the user's public key and the SE's public key (which itself is derived from the individual SO public keys). They then derive a pay-to-taproot address for this key.
* $\text{PubKey}_{Combined} = \text{PubKey}_{User} + \text{PubKey}_{SE}$
* where $\text{PubKey}_{SE} = \sum \lambda_i * \text{PubKey}_{SO_i}$
* and $\lambda_i$ is the Lagrange Coefficient on $x_0 = 0$ for $i$
2. **Setup and Signing:**
* User constructs a deposit transaction sending their funds to the pay-to-taproot address, but doesn't broadcast it.
* User and SE collaboratively create and sign two transactions:
1. An intermediate branch transaction (not broadcasted) with no timelock that spends from the deposit transaction. This transaction triggers the relative timelock of leaves under it.
2. An exit transaction that spends from the intermediate transaction. This transaction is broadcasted if the user wants to unilaterally exit Spark.
* **Both transactions are signed by all parties involved in order to provide a unilateral exit path. Spark-compatible wallets should verify the transaction validity of all transactions involved.**
3. **Storage:**
* User and SE securely store the signed transactions.
4. **User Deposit and Confirmation:**
* User broadcasts the deposit transaction created in step 2. Once the L1 transaction has been confirmed, the funds are available to transfer within Spark.
# FAQ
Source: https://docs.spark.money/learn/faq
### Is Spark live?
Yes. Spark's beta mainnet is now live. Developers can start building and transacting on the network. While this is a beta release, core functionality like sending and receiving Bitcoin, supporting tokens (e.g. stablecoins), and Lightning interoperability is fully operational. That said, Spark is still highly experimental. Expect bugs, rapid iteration, and breaking changes as we continue to scale the network.
### How do I check if Spark is having issues?
Check [spark.money/status](https://spark.money/status) for real-time network status. You can also report issues there.
### How is Spark native to Bitcoin?
The Bitcoin you hold on Spark is the same Bitcoin you hold on Bitcoin. It's not wrapped or custodied in a multisig by a set of signers. On Spark, you can unilaterally exit your funds back to L1 at any time by broadcasting your pre-signed transaction to a Bitcoin node.
### How many Spark Operators (SOs) are there, and who are they?
At launch, Spark is supported by a small set of two Spark Operators (SOs): [Lightspark](https://www.lightspark.com) and [Flashnet](https://flashnet.xyz). We've intentionally kept the set small to simplify debugging and testing during the early phase. More operators will be added soon as we scale.
### Should I expect some limits?
Spark is still early. Please treat everything as a beta environment.
With that in mind, limits may be applied. Our first priority is making sure everything stays secure.
### What happens if Spark operators go offline?
You can always exit to Bitcoin L1 using your pre-signed exit transaction. If all operators go offline, you won't be able to make new Spark transfers until operators are back online, but your funds remain in your custody and are redeemable to Bitcoin L1 at any time.
Check [Spark Status](https://spark.money/status) for real-time network health.
### What are the SSPs present?
Lightspark is running the first SSP on Spark. Anyone can become an SSP. Our goal is to maximize the diversity of SSPs to make the network more competitive and redundant.
### Does Spark have a token or a planned airdrop?
Spark does not have a token. Spark has not announced plans for any airdrop or token generation event. Be wary of scams claiming airdrops, giveaways, etc.
If in doubt, refer to official Spark communication channels:
Web: [https://spark.money](https://spark.money)
Twitter: [@spark](https://x.com/spark)
### Does Spark support smart contracts?
No, Spark does not support smart contracts.
### Is Spark open source?
Yes, Spark is open source. You can read the code or contribute directly [here](https://github.com/buildonspark/spark).
### Does Spark require KYC / KYB?
No. Spark is an open protocol that provides infrastructure for moving Bitcoin and tokens (e.g. stablecoins). Like Bitcoin, it does not impose any KYC requirements at the protocol level.
### What are the fees on Spark?
| Transaction Type | Fee |
| ------------------ | -------------------------------------------------------------------- |
| L1 to Spark | On-chain fee paid by user |
| Spark to Spark | Free (small flat fee coming in 6-12 months) |
| Spark to Lightning | 0.25% + routing fee |
| Lightning to Spark | 0.15% (charged on routing nodes) |
| Exit to L1 | L1 fee + SSP fee (formula: `sats_per_vbyte × (111 × 2 + tx_vbytes)`) |
The L1 exit fee is flat and doesn't scale with amount. For small withdrawals, it may be a higher percentage. The 111 × 2 component is the SSP fee based on minimum transaction size.
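As a sketch, the exit-fee formula from the table can be computed directly (`estimateExitFeeSats` is an illustrative helper, not an SDK method):

```typescript
// Exit-to-L1 fee per the table's formula; the 111 × 2 term is the
// SSP fee component based on minimum transaction size.
function estimateExitFeeSats(satsPerVbyte: number, txVbytes: number): number {
  return satsPerVbyte * (111 * 2 + txVbytes);
}

// At 10 sat/vB for a 150 vB transaction: 10 * (222 + 150) = 3720 sats
console.log(estimateExitFeeSats(10, 150)); // 3720
```

Because the fee depends only on the fee rate and transaction size, not the amount, batching withdrawals or exiting larger amounts keeps the percentage cost low.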
# Spark FROST
Source: https://docs.spark.money/learn/frost-signing
Spark employs a customized version of the FROST (Flexible Round-Optimized Schnorr Threshold) signing scheme to enable transaction signing by a required participant alongside a threshold group of signers known as Signing Operators (SOs).
## Problem Statement
In Spark, transactions require co-signing by a user and a group of Signing Operators (see [SOs](/spark/technical-definitions#spark-operator-so)); together, the SOs make up the Spark Entity [(SE)](/spark/technical-definitions#spark-entity-se). The signature must meet these requirements:
* **Threshold Signing for SOs:** Only $t$ out of $n$ total SOs are needed to complete the signing process.
* **Mandatory User Signing:** The user's participation is essential for a valid signature.
* **Single Schnorr Signature:** The final signature must be aggregated into a single Schnorr signature, ruling out multi-signature schemes like 2-of-2. The final signature is the aggregate of the user's signature and the SOs' signatures, where the user is the required participant.
Additionally, the cryptographic keys must support:
* **Tweakability:** Keys can be modified by adding or multiplying a tweak value.
* **Additive Aggregatability:** Given all shards of two keys (e.g., key1 and key2), it must be possible to derive all shards for their sum (key1 + key2).
This document presents a modified FROST signing scheme tailored to address these needs in Spark.
## Key Generation
### User Key
The user independently generates a single key pair, denoted as $(sk_{user}, pk_{user})$, where $sk_{user}$ is the secret key and $pk_{user}$ is the corresponding public key.
### SO Keys
The Signing Operators (SOs) collaboratively generate a threshold signing key pair, denoted as $(sk_{so}, pk_{so})$, using a Distributed Key Generation (DKG) protocol. Each SO receives a Shamir secret share $ss_i$ of $sk_{so}$, configured with a threshold $(t, n)$, meaning any $t$ of the $n$ SOs can reconstruct the key.
**Note:** Details of the secure DKG process are beyond the scope of this document. You can find more information in the [FROST paper](https://eprint.iacr.org/2020/852.pdf).
## Key Aggregation
The user's key and the SOs' key are combined into an aggregated key using additive aggregation:
* The aggregated secret key is computed as $sk_{agg} = sk_{user} + sk_{so}$.
* The corresponding public key is $Y = pk_{user} + pk_{so}$.
All participants must know the aggregated public key $Y$.
## Pre-processing
Mirroring FROST's pre-processing phase, all participants generate and commit to nonces:
* **SO Nonces:** Each SO generates nonce pairs $(d_{ij}, e_{ij})$ and their commitments $(D_{ij}, E_{ij})$, where $D_{ij} = g^{d_{ij}}$ and $E_{ij} = g^{e_{ij}}$, using a fixed generator $g$.
* **User Nonces:** The user generates nonce pairs $(d_{user_i}, e_{user_i})$ and commitments $(D_{user_i}, E_{user_i})$, sharing these commitments with all SOs.
These nonces enhance security during signing by preventing replay attacks.
## Signing Flow
The signing process involves the user, a coordinator, and a subset of SOs, proceeding as follows:
1. **Initiation:**
* The user submits a signing request for message $m$ to a signing operator coordinator.
2. **Participant and Nonce Selection:**
* The coordinator selects a set $S$ of $t$ participating SOs.
* It compiles an unused nonce commitment list $B = \{(D_{ij}, E_{ij}) \mid i \in S\} \cup \{(D_{user_j}, E_{user_j})\}$ and broadcasts $B$ to all participants.
3. **Signature Share Calculation by SOs:**
* Each SO $i \in S$ computes:
* $\rho_i = H_1(i, m, B)$, using a hash function $H_1$.
* Nonce contribution: $R_i = D_{ij} \cdot E_{ij}^{\rho_i}$.
* Challenge: $c = H_2(R, Y, m)$, where $R = \prod_{i \in S} R_i$.
* Signature share: $z_i = d_{ij} + e_{ij} \rho_i + \lambda_i ss_i c$, where $\lambda_i$ is the Lagrange coefficient for reconstructing $sk_{so}$.
4. **SO Signature Aggregation:**
* The coordinator aggregates the SO shares: $z_{so} = \sum_{i \in S} z_i$.
* It computes $R_{so} = \prod_{i \in S} R_i$, validates the partial signature, and sends $(R_{so}, z_{so})$ along with $B$ to the user.
5. **User Signature Share Calculation:**
* The user computes:
* $\rho_{user} = H_1(0, m, B)$ (assuming user index 0).
* Nonce contribution: $R_{user} = D_{user_j} \cdot E_{user_j}^{\rho_{user}}$.
* Full nonce: $R = R_{so} \cdot R_{user}$.
* Challenge: $c = H_2(R, Y, m)$.
* Signature share: $z_{user} = d_{user_j} + e_{user_j} \rho_{user} + sk_{user} c$.
6. **Final Signature:**
* The user aggregates the final signature as $(R, z)$, where $z = z_{so} + z_{user}$.
## Key Tweaks
The SO key $\mathit{sk_{so}}$ is shared via Shamir secret sharing with the polynomial:
$f(x) = \mathit{sk_{so}} + a_1 x + a_2 x^2 + \cdots + a_{t-1} x^{t-1}$
Each SO $i$ holds the share $(i, f(i))$.
### Additive Tweak
To tweak $\mathit{sk_{so}}$ by adding $t$ (i.e., $\mathit{sk_{so}'} = \mathit{sk_{so}} + t$):
* Define the new polynomial $f'(x) = f(x) + t$
* Update each share to $f'(i) = f(i) + t$
### Multiplicative Tweak
For a multiplicative tweak by $t$ (i.e., $\mathit{sk_{so}'} = t \cdot \mathit{sk_{so}}$):
* Update each share to $f'(i) = t \cdot f(i)$
### Secure Tweak Distribution
Directly sharing $t$ with SOs risks key exposure if the sender is an SO or colludes with one. Instead:
* Construct a polynomial $g(x)$ of degree $t-1$ where $g(0) = t$
* Distribute $g(i)$ to each SO $i$
* Each SO updates their share: $f'(i) = f(i) + g(i)$
This method applies the tweak securely without revealing $t$.
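The additive tweak can be checked with a toy sketch over a small prime field (illustrative only — Spark's arithmetic is over secp256k1, and real shares come from DKG, not a dealer): deal a 2-of-3 Shamir sharing, add the tweak to every share, and confirm the reconstructed secret shifts by exactly the tweak.

```typescript
// Toy prime field (2^61 - 1 is prime); NOT the curve Spark uses.
const P = (1n << 61n) - 1n;
const mod = (x: bigint): bigint => ((x % P) + P) % P;

// Modular inverse via Fermat's little theorem: a^(P-2) mod P.
function modInv(a: bigint): bigint {
  let result = 1n;
  let base = mod(a);
  let e = P - 2n;
  while (e > 0n) {
    if (e & 1n) result = mod(result * base);
    base = mod(base * base);
    e >>= 1n;
  }
  return result;
}

// Evaluate f(x) = coeffs[0] + coeffs[1]*x + ... (Horner's rule).
function evalPoly(coeffs: bigint[], x: bigint): bigint {
  let acc = 0n;
  for (let i = coeffs.length - 1; i >= 0; i--) acc = mod(acc * x + coeffs[i]);
  return acc;
}

// Lagrange interpolation at x = 0 recovers the secret f(0).
function reconstruct(shares: Array<[bigint, bigint]>): bigint {
  let secret = 0n;
  for (const [xi, yi] of shares) {
    let num = 1n;
    let den = 1n;
    for (const [xj] of shares) {
      if (xj === xi) continue;
      num = mod(num * mod(-xj));
      den = mod(den * (xi - xj));
    }
    secret = mod(secret + yi * num * modInv(den));
  }
  return secret;
}

// 2-of-3 sharing of sk_so: f(x) = sk_so + a1*x, shares (i, f(i)).
const skSo = 123456789n;
const coeffs = [skSo, 987654321n]; // degree 1 => threshold 2
const shares = [1n, 2n, 3n].map(
  (i) => [i, evalPoly(coeffs, i)] as [bigint, bigint]
);

// Additive tweak: each SO just adds the tweak to its own share.
const tweak = 42n;
const tweaked = shares.map(([i, y]) => [i, mod(y + tweak)] as [bigint, bigint]);

console.log(reconstruct(shares.slice(0, 2)) === skSo);              // true
console.log(reconstruct(tweaked.slice(0, 2)) === mod(skSo + tweak)); // true
```

Any two tweaked shares reconstruct `sk_so + t`, matching the polynomial view above: adding `t` to every share is the same as replacing `f(x)` with `f(x) + t`.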
## Key Split
When splitting a key into $n$ child keys (e.g., for transaction splitting), the property holds:
$\mathit{Alice_{old}} + \mathit{SO_{old}} = \sum_{i=1}^n (\mathit{Alice_i} + \mathit{SO_i})$
Here, $\mathit{Alice_{old}}$ and $\mathit{SO_{old}}$ are the original user and SO keys, and $\mathit{Alice_i}$ and $\mathit{SO_i}$ are the child keys.
### Process
1. **User Key Splitting:**
* The user (Alice) generates $n$ key pairs $(\mathit{sk_{Alice_i}}, \mathit{pk_{Alice_i}})$ for $i = 1$ to $n$
* Compute $\mathit{sum_{Alice}} = \sum_{i=1}^n \mathit{sk_{Alice_i}}$
* Calculate $\mathit{Tweak} = \mathit{sk_{Alice_{old}}} - \mathit{sum_{Alice}}$
2. **Tweak Communication:**
* The user sends $\mathit{Tweak}$ to the SOs.
3. **SO Key Splitting:**
* The SOs use DKG to generate $n-1$ key pairs $(\mathit{sk_{SO_i}}, \mathit{pk_{SO_i}})$ for $i = 1$ to $n-1$
* Set the $n$-th key as $\mathit{sk_{SO_n}} = \mathit{sk_{SO_{old}}} - \left( \sum_{i=1}^{n-1} \mathit{sk_{SO_i}} - \mathit{Tweak} \right)$
4. **Verification:**
* The sum of child keys is:
$\sum_{i=1}^n \mathit{sk_{Alice_i}} + \sum_{i=1}^n \mathit{sk_{SO_i}} = \mathit{sum_{Alice}} + \sum_{i=1}^{n-1} \mathit{sk_{SO_i}} + \mathit{sk_{SO_{old}}} - \sum_{i=1}^{n-1} \mathit{sk_{SO_i}} + \mathit{Tweak}$
* Substituting $\mathit{Tweak} = \mathit{sk_{Alice_{old}}} - \mathit{sum_{Alice}}$:
$= \mathit{sum_{Alice}} + \mathit{sk_{SO_{old}}} + (\mathit{sk_{Alice_{old}}} - \mathit{sum_{Alice}}) = \mathit{sk_{Alice_{old}}} + \mathit{sk_{SO_{old}}}$
5. **Shard Adjustment:**
* For SOs' Shamir shares, the $n$-th key's share for SO $j$ is adjusted as:
$f_{SO_n}(j) = f_{SO_{old}}(j) - \left( \sum_{i=1}^{n-1} f_{SO_i}(j) - \mathit{Tweak} \right)$
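The arithmetic in steps 1 through 4 can be checked directly with plain bigints (toy values, not real keys):

```typescript
// Toy check of the key-split identity: child keys sum to the original keys.
const skAliceOld = 1000n;
const skSoOld = 2000n;

// Step 1: Alice generates n = 3 child keys and computes the tweak.
const skAlice = [11n, 22n, 33n];
const sumAlice = skAlice.reduce((a, b) => a + b, 0n);
const tweak = skAliceOld - sumAlice;

// Step 3: SOs generate n - 1 keys, then derive the n-th from the identity.
const skSo = [100n, 200n];
const sumSoPartial = skSo.reduce((a, b) => a + b, 0n);
skSo.push(skSoOld - (sumSoPartial - tweak));

// Step 4: verify Alice_old + SO_old = sum of all child keys.
const total = sumAlice + skSo.reduce((a, b) => a + b, 0n);
```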
This provides Alice with true offline receive - she doesn't need to be online in order to receive the full Lightning payment. Additionally, Alice does not need to run a Lightning node and does not need to open any channels or perform any force closures. All operations here are atomic under the same 1/n or minority/n trust assumptions, as the operators must work together to recreate the preimage.
### Sending a Lightning Payment
1. Alice receives a Lightning invoice from Bob.
2. Alice makes an API call to the SSP, agreeing to conditionally transfer leaves upon payment completion.
3. SE locks transfers on Alice's specified leaves until a specified time.
4. SSP makes a Lightning payment to pay Bob's invoice.
5. SSP provides proof of Lightning payment to the SE.
6. SE finalizes the transfer of Alice's leaves to the SSP atomically.
**Note:** If the specified time expires, the SE unlocks usage of Alice's leaves and control returns to her.
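A minimal sketch of the SE-side locking logic in steps 3 through 6, using a toy in-memory lock table. All names here are illustrative; none correspond to the real SE or SSP APIs.

```typescript
// Illustrative only: leaves are locked for the SSP until either a payment
// proof arrives (finalize) or the lock expires (control returns to the owner).
type Lock = { owner: string; expiresAt: number };

class ToySigningEntity {
  private locks = new Map<string, Lock>();

  // Step 3: lock transfers on the sender's leaves until a specified time.
  lockLeaf(leafId: string, owner: string, ttlMs: number, now: number): void {
    this.locks.set(leafId, { owner, expiresAt: now + ttlMs });
  }

  // Steps 5-6: the SSP presents proof of payment; the SE finalizes atomically.
  // If the lock has expired, control of the leaf returns to its owner.
  finalize(leafId: string, paymentProofValid: boolean, now: number): string {
    const lock = this.locks.get(leafId);
    if (!lock || now >= lock.expiresAt) return "returned-to-owner";
    if (!paymentProofValid) return "rejected";
    this.locks.delete(leafId);
    return "transferred-to-ssp";
  }
}
```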
***
### Routing Nodes
If you're a Lightning node operator or LSP looking to establish channels with Spark, connect to our routing nodes:
```
02a98e8c590a1b5602049d6b21d8f4c8861970aa310762f42eae1b2be88372e924@172.30.102.161:9735
039174f846626c6053ba80f5443d0db33da384f1dde135bf7080ba1eec465019c3@172.30.202.34:9735
```
Opening channels to these nodes improves routing reliability for Lightning payments to and from Spark wallets.
# Limitations / Attacks
Source: https://docs.spark.money/learn/limitations
If a prior owner of a Spark leaf publishes the branch in a unilateral exit, there is a limited window during which the current owner must publish their own leaf transaction. If the current owner does not do so during the window, the attacker can claim the leaf UTXO. The SOs all hold a copy of the signed transaction and can act as watchtowers on behalf of the current leaf owner. Further, the user can delegate watchtower functionality to any third party, as the watchtower holds no toxic data. Additionally, depending on how the Spark is configured, this attack can be fairly costly for the attacker: they need to publish the entire branch (unless someone else has already unilaterally closed much of the same branch) and CPFP each tree node.
### Loss of SE liveness
If any of the SOs (or a minority, where the Spark is configured for threshold trust) lose liveness or lose their keys, the Spark will not be able to continue. Users will still be able to withdraw their funds through unilateral exits, but they will no longer be able to send off-chain payments with Spark. This means the entities comprising the SE should be chosen carefully to be highly available and low-latency, since they are in the flow of UTXO transfers. The number of entities and the threshold of trust are configurable - for example, a Spark could require trusting ⅓ of the n entities in the SE, which would grant higher liveness. This can be further mitigated by an SE member holding multiple public keys under the same threshold aggregate public key; that way, if one host server is lost, the other cold key can be used without the whole statechain being closed on-chain.
# Manifesto
Source: https://docs.spark.money/learn/manifesto
Sixteen years ago, Bitcoin introduced something radical: a peer-to-peer electronic cash system. At the time it sounded like science fiction, but today it's a trillion-dollar asset. It's held by institutions, written about in central bank reports, and debated on Senate floors. Everyone knows what Bitcoin is; whether they use it or not, they know it's real. That alone makes it one of the most successful pieces of software ever written.
The most well-known effort to scale Bitcoin, Lightning, proved you can move Bitcoin quickly and cheaply without breaking its core trust guarantees. However, Lightning alone cannot scale to billions of users: its cumbersome UX, fragmented liquidity, and high wallet-creation costs pose significant barriers.
Meanwhile, stablecoins took off by solving a real problem rather than adhering to original visions or philosophical purity. Most people want money that doesn't fluctuate daily, that they can send, store and build with, and that settles instantly anywhere in the world.
For developers, stablecoins became a superpower. A few lines of code enable wallets in any country and power new applications (marketplaces, savings tools, creator payouts) without permission or bank interactions. Stablecoins transformed crypto from a speculative asset into a functional financial rail, moving more value today than PayPal.
Yet none of this stablecoin activity occurs on Bitcoin, since it was not designed for such expressiveness and lacks the smart-contract capabilities of other networks.
Still, Bitcoin's network remains the most resilient primitive: over 200 million users, more than 60 percent of crypto liquidity, and a transcendent brand. Developers will continue building on Bitcoin long after other networks fade.
Today we introduce Spark, a Bitcoin-native Layer 2 for payments and settlement. No bridges, no custodians, only a lightweight signing protocol that makes digital cash, whether BTC or stablecoin, truly usable.
Spark returns to first principles by enabling native applications on Bitcoin. First, it delivers the best UX ever seen on Bitcoin: whether for wallets, games or marketplaces, it offers the fastest, simplest and cheapest rails in crypto. Second, it unlocks new primitives such as stablecoins directly on Bitcoin as native, scalable, self-custodial assets rather than through wrappers or bridges.
# Privacy
Source: https://docs.spark.money/learn/privacy
# Scalability
Source: https://docs.spark.money/learn/scalability
Spark's design enables it to be nearly infinitely scalable with minimal computational costs. This is unlocked by the absence of global consensus and propagation: validator nodes (SOs) only need to gather for signing at the moment of transfer. Transactions can be processed and settled independently of one another, making Spark fully parallelizable.
SOs themselves can be scaled horizontally, allowing them to process and sign more transactions as demand grows.
The goal of Spark is to be able to handle billions of concurrent users, and to do so with minimal fees and instant finality.
# Sovereignty
Source: https://docs.spark.money/learn/sovereignty
Self-sovereignty is a core principle of Spark's design. It is a quality that very few L2s afford their users.
Spark, like Lightning, is narrowly designed for transfers of value, not general computation like other L2s. This focus makes achieving non-custodiality and true trustless exits much more straightforward.
How it works:
* **Pre-signed Transactions:** Before you deposit funds into Spark, you work with the operators to create a pre-signed transaction that exits funds to the Bitcoin L1. This means you can always exit your funds if the operators go offline or become malicious after a transfer.
* **Timelocked Transactions:** When ownership of the leaf (in Spark, leaves work like UTXOs and represent the ownership of a UTXO on L1) is transferred (e.g., from Alice to Bob), a transaction is signed that gives Bob ownership of a leaf on L1. This exit transaction is encumbered by a timelock that is relative to its parent transaction. Because the timelock is relative, there is no limit to how long you can hold an unpublished leaf before it must go on-chain.
* **Decrementing Timelocks**: If Bob then transfers the leaf to Charlie, a new transaction is signed for Charlie with a shorter timelock (e.g., 300 blocks instead of Bob's 400). Each subsequent transfer reduces the timelock, so the most recent owner's transaction becomes valid first. This ensures that the current owner always has a sooner exit than the previous owner. This can be supported by watchtowers to ensure that the current owner always exits before the previous owner.
* **Fallback to L1:** If the Spark operators disappear, get compromised, attempt to censor, or refuse to cooperate, you (the current owner) can unilaterally broadcast to Bitcoin L1 the pre-signed transactions up to, and including, your leaf and claim the funds once the relative timelocks expire.
This mechanism ensures that you don't rely on any outside entity indefinitely. Even if it's a centralized component, you have an unshakeable and unconditional escape hatch to reclaim your funds on the Bitcoin L1, making your funds non-custodial in practice. The relativity of the timelock also means the user does not have to refresh or exit funds to L1 on a set schedule. Exiting Spark can take as little as 100 blocks, depending on the depth of your leaf.
This differs substantially from the standard L2 exit mechanisms, in which users must rely on a centralized sequencer or bridge to exit, without absolute guarantees.
# Understanding Spark's Operator Architecture
Source: https://docs.spark.money/learn/spark-vs-others
## The Mental Model Shift
If you're coming from Aptos, Sui, or EVM-based chains, you're used to a specific pattern: run a node, connect to it, and trust that the consensus mechanism guarantees validity. Spark works the same way—just with a different topology.
| Traditional Chain | Spark |
| :--------------------------------------- | :------------------------------------------ |
| Your node validates transactions locally | SDK validates operator signatures locally |
| Your node gossips with validators | Coordinator gossips with other operators |
| Validators run BFT/PoS consensus | Operators run threshold signature consensus |
| Finality requires quorum of validators | Finality requires threshold of operators |
The security guarantee is equivalent: **no single party can forge transactions or lie about state**.
***
## How the Client Connects to Operators
### Operator Registry
Every Spark SDK is configured with the **complete set of operators** upfront—not just a single API endpoint. This is analogous to how your Aptos/Sui node knows about the validator set.
The SDK ships with hardcoded knowledge of:
* Each operator's gRPC address
* Each operator's identity public key
* The threshold required for finality (e.g., 2-of-3)
```
Operator 0: https://0.spark.lightspark.com
Identity: 03dfbdff4b6332c220f8fa2ba8ed496c698ceada563fa01b67d9983bfc5c95e763
Operator 1: https://1.spark.lightspark.com
Identity: 03e625e9768651c9be268e287245cc33f96a68ce9141b0b4769205db027ee8ed77
Operator 2: https://2.spark.flashnet.xyz
Identity: 022eda13465a59205413086130a65dc0ed1b8f8e51437043161f8be0c369b1a410
```
This is **not** "just talking to an API." The SDK knows every participant in the consensus set and can verify their cryptographic signatures.
***
### Direct Verification
When the SDK performs a sensitive operation (like generating a deposit address), it doesn't blindly trust the coordinator's response. It verifies cryptographic proofs from **all operators**.
Here's what actually happens when you request a deposit address:
1. SDK sends request to coordinator
2. Coordinator forwards to all operators
3. Each operator signs the address with their identity key
4. Coordinator returns the address + all operator signatures
5. **SDK verifies every operator's signature locally**
If even one operator's signature is missing or invalid, the SDK rejects the response. This is identical to how your Aptos node would reject a block without sufficient validator signatures.
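A sketch of the client-side rule in step 5, assuming a hypothetical `registry` and a pluggable `verify` callback standing in for the SDK's real operator set and cryptographic verification:

```typescript
// Accept a deposit address only if EVERY operator in the registry produced a
// valid signature over it. `registry` and `verify` are illustrative stand-ins.
type OperatorSig = { operatorId: string; signature: string };

function acceptDepositAddress(
  address: string,
  sigs: OperatorSig[],
  registry: string[],
  verify: (operatorId: string, address: string, signature: string) => boolean,
): boolean {
  return registry.every((op) => {
    const sig = sigs.find((s) => s.operatorId === op);
    // A missing or invalid signature from any single operator => reject.
    return sig !== undefined && verify(op, address, sig.signature);
  });
}
```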
***
## How Operators Achieve Consensus
### Coordinator-Based Fanout
Spark uses a coordinator pattern where one operator acts as the entry point. This is purely an optimization—it reduces network round-trips. The coordinator has no special trust privileges.
When a transaction is submitted:
```
User SDK → Coordinator → [All Operators in parallel]
↓
Collects signatures
↓
Returns to SDK
```
The coordinator calls `ExecuteTaskWithAllOperators()`, which spawns parallel gRPC requests to every operator. Each operator independently:
* Validates the transaction
* Checks its local state
* Signs if valid, rejects if not
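The fanout can be sketched with `Promise.all`, using mocked operator clients in place of the real gRPC calls:

```typescript
// Mocked fanout: one parallel request per operator; the transaction proceeds
// only if every operator signs. The Operator type is an illustrative stand-in
// for the real gRPC clients behind ExecuteTaskWithAllOperators.
type Operator = { id: string; sign(tx: string): Promise<string | null> };

async function fanout(tx: string, operators: Operator[]): Promise<string[]> {
  // One request per operator, fired in parallel.
  const results = await Promise.all(operators.map((op) => op.sign(tx)));
  // Any single rejection aborts the transaction.
  if (results.some((r) => r === null)) {
    throw new Error("an operator rejected the transaction");
  }
  return results as string[];
}
```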
***
### Threshold Signatures (FROST)
For Bitcoin-layer operations, Spark uses FROST (Flexible Round-Optimized Schnorr Threshold signatures). This is the same threshold signature scheme used by cutting-edge wallet infrastructure.
The flow:
1. **Round 1**: Each operator generates and shares signing commitments
2. **Round 2**: Operators exchange partial signatures
3. **Aggregation**: Partial signatures combine into a single valid Schnorr signature
No single operator can produce a valid signature alone. Exactly like a 2-of-3 multisig, but with a single on-chain signature.
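A toy version of the aggregation arithmetic (plain integers rather than group elements, so this illustrates only the additive structure, not the security):

```typescript
// Partial signatures z_i = k_i + c * sk_i sum to a signature that verifies
// against the aggregate nonce and aggregate key. Toy values throughout.
const c = 7n; // Fiat-Shamir challenge (toy value)
const sk = [100n, 50n]; // per-signer secret key shares
const k = [5n, 9n]; // per-signer nonces from Round 1
const partials = sk.map((s, i) => k[i] + c * s); // Round 2
const z = partials.reduce((a, b) => a + b, 0n); // aggregation
const skAgg = sk.reduce((a, b) => a + b, 0n);
const kAgg = k.reduce((a, b) => a + b, 0n);
// The aggregate signature satisfies z === kAgg + c * skAgg.
```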
***
### State Replication
Every operator maintains its own complete database. There's no "primary" database that others read from. When a transaction is finalized:
1. Each operator writes to its local DB
2. All operators have identical state (eventually consistent, with strong guarantees on finalized txs)
3. The SDK can query **any** operator to verify state
This is how Aptos validators work—each has a full copy of state, and you can query any of them.
***
## Token Transaction Consensus
### The Three-Phase Protocol
Token transactions follow a strict protocol that requires participation from all (or threshold) operators:
**Phase 1: Start**
* User builds a partial transaction (inputs, outputs, amounts)
* Sends to coordinator with their signature
* Coordinator forwards to all operators
* Each operator validates and reserves resources
* Transaction enters `STARTED` state across all operators
**Phase 2: Sign**
* User confirms by requesting signatures
* Coordinator collects signatures from all operators
* Each operator signs the transaction hash with their identity key
* Transaction enters `SIGNED` state
**Phase 3: Reveal & Finalize**
* Operators exchange revocation secret shares
* Once threshold shares are collected, full secrets are recovered
* Transaction enters `FINALIZED` state
* State is now committed across all operators
At every phase, the transaction must pass validation on every operator independently. A malicious coordinator cannot forge consensus—it would need to compromise the threshold of operators.
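The phase progression can be modeled as a small state machine. This is illustrative only; the real protocol carries far more state per phase.

```typescript
// Minimal state machine for the three-phase protocol.
type TxState = "STARTED" | "SIGNED" | "FINALIZED";

const NEXT: Record<TxState, TxState | undefined> = {
  STARTED: "SIGNED",
  SIGNED: "FINALIZED",
  FINALIZED: undefined,
};

// A phase only advances when the required number of operators approve.
function advance(state: TxState, approvals: number, threshold: number): TxState {
  const next = NEXT[state];
  if (next === undefined) throw new Error("transaction already finalized");
  if (approvals < threshold) throw new Error(`stuck in ${state}: not enough approvals`);
  return next;
}
```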
***
## Why You Can't Run Your Own Operator (Yet)
This is a fair concern. Here's the honest answer:
**Current state**: The operator set is permissioned, run by Lightspark and partners. This is similar to how Aptos launched with a permissioned validator set before opening up.
**Why this exists**:
1. **Coordination complexity**: Operators must participate in DKG (Distributed Key Generation) ceremonies
2. **Uptime requirements**: Unlike validators that can be slashed, missing operators can halt transactions
3. **Network effects**: More operators means more latency (consensus fanout)
**What you get instead**:
1. **Cryptographic verifiability**: Your SDK verifies all operator signatures
2. **Query any operator**: You can hit any operator's endpoint to verify state
3. **Unilateral exit**: You can always withdraw to L1 using the revocation secrets (no operator cooperation required)
4. **Transparency**: Operator identity keys are public, their signatures are verifiable
***
## Comparison to Other Chains
### vs. Aptos/Sui
| Aspect | Aptos/Sui | Spark |
| :------------------------ | :---------------- | :---------------------- |
| Consensus | HotStuff BFT | Threshold signatures |
| Validator/Operator count | \~100+ | 3 (currently) |
| Can run your own node | Yes | Not yet (planned) |
| Client verifies consensus | Via Merkle proofs | Via operator signatures |
| Finality | \~1 second | Sub-second |
Spark's operator model is intentionally small to minimize latency. The tradeoff is fewer independent parties, but with the same threshold-based cryptographic guarantees.
### vs. Lightning Network
| Aspect | Lightning | Spark |
| :----------- | :--------------------- | :--------------------- |
| Counterparty | Single channel partner | Threshold of operators |
| Fraud proofs | On-chain dispute | Revocation secrets |
| Trust model | 1-of-1 | 2-of-3 (or n-of-m) |
| Exit path | Force-close on L1 | Unilateral exit on L1 |
Spark is strictly better than Lightning for counterparty risk—you're not trusting a single entity.
### vs. Rollups (Optimistic/ZK)
| Aspect | Rollups | Spark |
| :----------------------- | :--------------------------------- | :------------------------------------------------- |
| Data availability | On L1/DA layer | Operators + exit path |
| Verification | Fraud proofs / ZK proofs | Threshold signatures |
| Withdrawal time | 7 days (optimistic) / instant (ZK) | Instant (cooperative) / \~1000 blocks (unilateral) |
| Sequencer centralization | Often single sequencer | Multiple operators |
Spark's trust model is similar to a rollup with a decentralized sequencer set.
***
## The Security Guarantees
Let's be precise about what you're trusting:
### What a single malicious operator CAN'T do:
* Forge your signature
* Spend your tokens without your authorization
* Create tokens out of thin air
* Prevent your unilateral exit to L1
* Lie about the state (your SDK verifies signatures)
### What would require threshold collusion:
* Censoring your transactions (but you can exit to L1)
* Halting the network (liveness, not safety)
### What's cryptographically guaranteed:
* All operator signatures are verified client-side
* Transaction hashes are deterministic and verifiable
* Revocation secrets enable unilateral exit
***
## Practical Implications for Your Integration
1. **Your SDK is not blindly trusting an API**. It's verifying cryptographic proofs from a known set of participants.
2. **You can independently verify state** by querying any operator directly. The SDK does this automatically for balance checks.
3. **Your users can always exit to L1**. Even if all operators collude against a user, the revocation secret mechanism ensures funds are recoverable.
4. **The operator set is transparent**. You know exactly who the operators are, and you can verify their signatures.
5. **This is the same trust model as early-stage L1s**. Aptos, Sui, and even Ethereum started with small, permissioned validator sets.
***
## Summary
Spark's architecture is not fundamentally different from other chains. It's a **threshold signature consensus system** where:
* Multiple independent operators must agree on state changes
* Clients verify consensus via cryptographic signatures
* No single point of failure for safety (only for liveness)
* Unilateral exit path to Bitcoin L1 guarantees fund safety
The main difference is **topology, not security model**. Instead of running your own validator that participates in block production, you run a client that verifies validator (operator) signatures. This is analogous to running a light client on Ethereum—you don't produce blocks, but you verify that blocks are correctly signed.
For an issuer, this means:
* Your integration is verifying real cryptographic consensus, not just trusting an API
* The security model is threshold-based, similar to multisig but more sophisticated
* The path to further decentralization exists (more operators, possibly including you)
The architecture is sound. The operator set is small but growing. And the cryptographic guarantees are equivalent to what you'd expect from any other threshold-based system.
# Technical Definitions
Source: https://docs.spark.money/learn/technical-definitions
> **Disclaimer** *This section can be bypassed if the reader understands statechains, which are used as a building block for what follows. This section provides a high-level overview of how statechains work. Please note that the description here is intentionally simplified and may not be entirely accurate; its purpose is to convey the core functionality of statechains in an accessible manner.*
Statechains are a way to transfer ownership of a UTXO off-chain. In this example, all key signing operations are performed with [Schnorr](https://en.bitcoin.it/wiki/Schnorr).
First, begin with the assumption that keys can be linearly added together to form a single key. We will start with two parties holding keys - one is the user Alice (A), and the other will be called the statechain entity (SE), which can be composed of any number of members but will be shown as a single entity below.
$(SE+A=Key1)$
Each key is represented as a large random number. In this example, SE's key is represented by the value 100, and A's key is represented by the value 50.
$(100+50=150=Key1)$
Now assume that a signature is simply multiplying the key by the hash of some object. The hash will also be represented by a number. In this example, our hash will be represented by the value 2. Thus, a signature of this hash will be:
$((100+50)*2=300)$
To deposit funds into the statechain, Alice first creates a transaction to send money to a joint address controlled by the SE and herself (using their combined key of 150). However, she doesn't broadcast this transaction immediately. Instead, Alice and the SE collaborate to create and sign an exit transaction. This exit transaction is designed to transfer the coins directly to Alice, but it is locked with an absolute timelock set to an arbitrary amount of blocks in the future. Once Alice has the signed exit transaction in her possession, she then broadcasts the original deposit transaction. This creates a UTXO with a spending condition that requires the combined signature of SE and Alice. This careful sequencing ensures that Alice always has the ability to exit the statechain at any moment, even before the deposit is confirmed. This setup forms the foundation of the statechain mechanism.
Alice now wishes to transfer ownership of this UTXO to Bob (B), but she wants to do so without putting the transaction on-chain. Bob's randomly generated key is represented by the number 40. He then calculates that the difference between his key and Alice's key is 10 (50 - 40 = 10). For Bob plus the SE to sign using the same value as Alice+SE (150), the SE needs to tweak their key by 10. So, the SE discards the old key (100) and only keeps a copy of the newly tweaked key 110 (100 + 10 = 110). Now, SE+Bob still equals 150 (110 + 40 = 150), so the UTXO can be spent by signing using the combined 150. This meets the original spending condition of the UTXO. Alice can't sign a new transaction anymore because the SE discarded the 100 key. If Alice tried to sign, Alice+SE would now equal 160 (50 + 110 = 160), which isn't a valid key to sign the transaction. This process effectively transfers control of the UTXO from Alice to Bob without requiring an on-chain transaction.
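The worked example above, in code: toy keys are plain numbers and a "signature" is simply key times hash, exactly as in the text.

```typescript
// Toy statechain transfer using the numbers from the example.
const hash = 2;
const seKey = 100;
const alice = 50;
const combined = seKey + alice; // 150, the UTXO's spending key
const sign = (key: number) => key * hash;

// Transfer to Bob: SE tweaks its key by (alice - bob) and discards the old one.
const bob = 40;
const seKeyNew = seKey + (alice - bob); // 110

// Bob + new SE key still satisfies the original spending condition...
const bobSigns = sign(seKeyNew + bob); // (110 + 40) * 2 = 300
// ...while Alice + new SE key no longer does.
const aliceSigns = sign(seKeyNew + alice); // (110 + 50) * 2 = 320
```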
During the key handoff process, the new owner collaborates with the SE to create and sign a new exit transaction. This new exit transaction features a lower timelock compared to the previous owner's exit transaction. For instance, if Alice's original exit transaction had a timelock of 100 blocks, Bob (the new owner) might set his timelock to 90 blocks. This process of decreasing timelocks continues with each subsequent transfer. When Bob transfers to Carol, her exit transaction might have a timelock of 80 blocks, and when Carol transfers to Dave, his could be set to 70 blocks. This decreasing timelock pattern ensures that the most recent owner always has the earliest opportunity to claim the UTXO on-chain. It's important to note that these exit transactions can only be executed on-chain once their respective timelocks expire. This mechanism provides a safeguard, allowing the current owner to claim the funds before any previous owners, while still maintaining the off-chain nature of the transfers until necessary.
It's important to understand that this mechanism imposes a time limit on how long a UTXO can remain within a statechain. The current owner must claim the funds on-chain before the absolute timelock expires. If they fail to do so, previous owners will have the opportunity to claim the UTXO on-chain. This time constraint prevents indefinite off-chain existence of UTXOs in vanilla Statechains.
## Timelocks
Timelocks on Bitcoin can either be absolute or relative. An absolute timelock specifies a block height after which a transaction can be broadcasted. A relative timelock specifies a number of blocks after the parent transaction is included in a block before the child transaction can be broadcasted.
The diagram below shows how statechains typically work:
In the above diagram, transactions 1 through 3 are all held off-chain. They all spend the same output from Txn0 and are replacements for each other. They use decreasing timelocks such that Txn4 can be put on-chain prior to Txn3, which could be put on-chain prior to Txn2. In this way, when ownership of the key controlling the funds is transferred to a new user, they can check the prior transactions and be assured that they can publish their new transaction first. These timelocks create a timebomb where the off-chain transaction with the lowest timelock will need to be put on-chain when its timelock expires, otherwise there is a risk that other transactions can claim the funds.
This can, however, be eliminated by the following flow:
Txn1 spends the output of Txn0. Txn1 has no timelock. When users transfer keys, they transfer ownership of the key that encumbers the output of Txn1. The transactions that spend from Txn1's output look like the normal statechain transactions that include decreasing timelocks relative to their parent transaction. But because Txn1 is held off-chain, there is no absolute timebomb.
The obvious problem with this is that if someone double-spends the output consumed by Txn1, then all of the leaves (Txn2..4) become invalid. To avoid this, the "SE" key is deleted by the SE - resulting in only one transaction ever generated for Txn0's output. The exact mechanics of this will be discussed later.
## Splitting a Leaf
We split leaves to spend smaller denominations. We do so by constructing a Bitcoin transaction that takes the parent UTXO as an input and produces multiple outputs, each controlled by a new key split off from the original key. The sum of the new keys across all branches equals the original key; this allows for re-aggregation of leaves and more flexible leaf management without on-chain transactions.
To split a leaf in Spark, we need to split its existing key into multiple keys such that the sum of the new keys equals the original key. This can be represented as follows:
Original key: $(a_0=150)$
Split into two keys: $(a_1=100)$ $(a_2=50)$
Ensuring $(a_0= a_1+a_2)$
If the parent leaf is encumbered by the spending condition of $(a_0)$, it can alternatively be spent using $(a_1+a_2)$, since the two keys sum to $(a_0)$.
We want to ensure the following:
$(\text{PrivKey}_{\text{User\_Old}} + \text{PrivKey}_{\text{SE\_Old}} = \sum_{i=1}^{n} \left( \text{PrivKey}_{\text{User}_i} + \text{PrivKey}_{\text{SE}_i} \right))$
1. **User Key Splitting**
* Generate $(n)$ new user private keys
* Calculate the difference $(t)$ between old and new keys:
$(t = \text{PrivKey}_{\text{User\_Old}} - \sum_{i=1}^{n} \text{PrivKey}_{\text{User}_i})$
2. **SE Key Splitting**
* Calculate the new SE key sum:
$(P_{\text{SE\_New}} = P_{\text{SE\_Old}} + t)$
* Generate $(n-1)$ randomly generated SE private keys.
* Calculate the final SE key to satisfy the key sum equality:
$(P_{\text{SE\_New\_n}} = P_{\text{SE\_Old}} + t - \sum_{i=1}^{n-1} P_{\text{SE\_New\_i}})$
3. **Branch Transaction Creation**
* Create a transaction with $(n)$ outputs, each locked by a new combined public key $(\text{SE}_i+\text{U}_i)$
4. **Intermediate Branch and Exit Transactions for Each Leaf**
* Create and sign intermediate branch and exit transactions for each new leaf using the new keys
5. **Key Deletion**
* Securely delete original private keys
6. **Finalization**
* Store the branch transaction off-chain.
* Record the new leaves for use within Spark.
## Spark Tree
A Spark tree is made up of both leaf transactions and branch transactions.
We extend a UTXO within Spark into branches and leaves. Txn0 is spent by Txn1, which is held off-chain. Txn1 is the first branch transaction. As mentioned above, the absolute timelock timebombs are removed from the tree.
## Aggregation
When a transaction branches, the key that controls the parent input is divided into multiple child keys. These child keys are created in such a way that their sum equals the parent key. For example, Txn1 could be replaced by a new transaction, which would be valid if it's signed using the combined (aggregated) signatures from all the child keys at the bottom of the tree. In other words, the following keys, when aggregated, can spend Txn1:
$( (B_{0.1.1.1.1} + SE_{0.1.1.1.1}) + (B_{0.1.2.1.1} + SE_{0.1.2.1.1}) + (B_{0.2.1.1.1} + SE_{0.2.1.1.1}) + (B_{0.2.2.1.1} + SE_{0.2.2.1.1}) )$
Which is equivalent to all the following:
$(B_0 + SE_0)$
$(B_{0.1} + SE_{0.1} + B_{0.2} + SE_{0.2})$
$(B_{0.1.1} + SE_{0.1.1} + B_{0.1.2} + SE_{0.1.2} + B_{0.2.1} + SE_{0.2.1} + B_{0.2.2} + SE_{0.2.2})$
$((B_{0.1.1.1} + SE_{0.1.1.1}) + (B_{0.1.2.1} + SE_{0.1.2.1}) + (B_{0.2.1.1} + SE_{0.2.1.1}) + (B_{0.2.2.1} + SE_{0.2.2.1}))$
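The invariant behind these equivalences - each split produces child keys that sum to the parent key - can be checked on a toy tree (arbitrary deterministic split, no randomness):

```typescript
// Toy tree invariant: every branch splits its key into children summing to it,
// so the bottom layer always aggregates back to the root key.
type Node = { key: bigint; children: Node[] };

// Split a key into n children that sum to it (toy deterministic split).
const split = (key: bigint, n: bigint): bigint[] => {
  const part = key / n;
  const parts = Array.from({ length: Number(n) - 1 }, () => part);
  parts.push(key - part * (n - 1n));
  return parts;
};

const buildTree = (key: bigint, depth: number): Node => ({
  key,
  children: depth === 0 ? [] : split(key, 2n).map((k) => buildTree(k, depth - 1)),
});

// Sum the keys at the bottom of the tree.
const leafSum = (node: Node): bigint =>
  node.children.length === 0
    ? node.key
    : node.children.map(leafSum).reduce((a, b) => a + b, 0n);

const root = buildTree(150n, 3);
```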
# TLDR
Source: https://docs.spark.money/learn/tldr
Spark is an off-chain scaling solution that builds on top of [Statechains](https://bitcoinops.org/en/topics/statechains/) to enable instant, extremely low fee, and unlimited self-custodial transactions of Bitcoin and tokens while also natively enabling sending and receiving via Lightning.
Spark is not a rollup, nor a blockchain. There are no smart contracts and no VM. Spark is native to Bitcoin, and its payments-focused architecture enables on-chain funds to be transacted at any scale, with near-instant speeds, and at virtually zero cost.
At its core, Spark is a simple [shared signing protocol](/spark/frost-signing) on top of Bitcoin. It operates as a distributed ledger. There are no bridges, external consensus, or sequencers. Users can enter and exit the network freely, with funds always non-custodial and recoverable on L1 via a unilateral exit which depends on Bitcoin and no one else. Transactions on Spark happen by delegating on-chain funds via the shared signing protocol.
Like the Lightning Network, Spark transactions work by delegating ownership of on-chain UTXOs between parties. The key difference: Spark introduces a set of signers, called Spark Operators (SOs), responsible for helping sign transactions as they happen. SOs cannot move funds without the users, who are required participants in any transfer. On L1, Spark state appears as a chain of multi-sigs, secured by users and SOs. On L2, it operates as a tree structure, mapping transaction history and balances in real time. Simple, fully native to Bitcoin, and open-sourced.
Spark achieves the maximum level of trustlessness possible for a system that scales Bitcoin this effectively. Specifically, it maintains 1/n or minority/n trust assumptions depending on the setup. To learn more, read our [trust assumptions](/spark/trust-model) page.
# Transaction Lifecycle
Source: https://docs.spark.money/learn/tokens/broadcast-lifecycle
How token transactions actually work under the hood.
## The basics
A token transaction moves tokens from inputs to outputs. Three parties involved:
1. **You** (the wallet): construct and sign the transaction
2. **Spark Operators**: validate, co-sign, and commit
3. **Watchtowers**: watch for cheaters trying to double-spend on L1
The key thing: operators never have unilateral control. Every transaction needs your signature. They witness and enforce, but they can't move your tokens without you.
## The flow
```
1. You build the transaction
- Which outputs to spend
- Who gets what
- Sign it
2. Send to operators
3. Operators validate
- Check your signatures
- Check you own the inputs
- Reserve revocation keys
4. Operators fill in the final details
- Revocation commitments (for double-spend protection)
- Withdrawal parameters (for L1 exit)
5. Operators commit
- Threshold signs off
- Watchtowers updated
6. Done
- You get the final transaction
- Recipient can spend immediately
```
One call handles the whole thing. It's atomic. Either everything succeeds or nothing does.
## Transaction types
**Mint**: Issuer creates new tokens. No inputs spent. Just issuer signature authorizing creation.
**Transfer**: Move tokens between owners. Inputs must equal outputs.
**Create**: Register a new token type. Define name, ticker, decimals, max supply.
## Revocation: how double-spend protection works
This is the clever part.
When you receive tokens, each output has a "revocation commitment". This is a public key where the private key is split among operators. Nobody knows the full key.
When you spend those tokens, operators release their key shares to you. Now you can reconstruct the full "revocation key" for the outputs you just spent.
Why does this matter? If you try to cheat by broadcasting old outputs to L1:
* Watchtowers see it
* They have the revocation key (you gave it to them when you spent)
* They sweep your funds immediately
* You lose everything
The economic incentive is simple: cheating costs more than you could gain.
## Validity window
Transactions expire. You have up to 300 seconds (5 minutes) from when you create a transaction to when operators commit it. After that, it's rejected and you have to start over.
This prevents stale transactions from clogging the system.
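The window check itself is trivial. A minimal sketch, assuming the 300-second window is measured from transaction creation (the helper name is hypothetical, not SDK API):

```typescript
const VALIDITY_SECONDS = 300; // max validity window from the docs

// Returns true if operators should reject the transaction as stale.
function isExpired(createdAtMs: number, nowMs: number): boolean {
  return (nowMs - createdAtMs) / 1000 > VALIDITY_SECONDS;
}

const t0 = Date.now();
console.log(isExpired(t0, t0 + 200_000)); // false — still inside the window
console.log(isExpired(t0, t0 + 301_000)); // true  — operators reject it
```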
## What can go wrong
| Problem | What happened |
| -------------------- | ---------------------------- |
| Insufficient balance | You don't have enough tokens |
| Output already spent | Someone else spent it first |
| Invalid signature | Wrong key |
| Frozen | Issuer froze your address |
| Transaction expired | Took too long, try again |
## Finality
Once operators commit, it's done. Instant. Final. The recipient can spend immediately. No confirmations to wait for.
The only "confirmation" that matters is the threshold of operators agreeing. Once they do, the transaction is irreversible at the Spark level.
# Burning
Source: https://docs.spark.money/learn/tokens/burning
Burning destroys tokens permanently. Once burned, they're gone forever.
## How it works
The issuer signs a burn transaction specifying which tokens to destroy. Operators validate and commit. Those tokens no longer exist. Total supply decreases.
## Why burn?
**Redemptions.** User wants to cash out $100 of stablecoin. You send them $100, burn their 100 tokens. Supply stays matched to reserves.
**Deflationary mechanics.** Some tokens burn a portion of each transaction or do periodic buyback-and-burns.
**Mistakes.** Minted too many? Burn the excess.
## Who can burn
Only the issuer can use the burn function directly. But there's a workaround: anyone can burn tokens by sending them to the burn address.
```
spark1pgssyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszqgpqyqszykl0d2
```
Tokens sent here are gone. The address exists but nobody has the key. This lets regular users burn their own tokens if they want to.
## Irreversibility
Burning is permanent. There's no undo. Double-check amounts before you burn. If an issuer burns tokens by mistake, the only fix is minting new ones (if max supply allows).
## Burning vs freezing
Freezing is temporary and reversible. Burning is permanent. If you're not sure, freeze first. You can always burn later.
# Core Concepts
Source: https://docs.spark.money/learn/tokens/core-concepts
The building blocks of how tokens work on Spark.
## UTXO model
Tokens on Spark work like Bitcoin. You have outputs. To spend, you consume outputs and create new ones. Math has to balance. Inputs equal outputs.
```
Alice has: 100 tokens (one output)
Alice sends: 75 to Bob
Result:
Bob gets: 75 tokens (new output)
Alice gets: 25 tokens (change output)
Original output: consumed
```
This is different from account-based systems (like Ethereum) where you have a balance that gets debited. Here, you have discrete chunks of tokens that get spent and created.
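The Alice/Bob example above can be sketched in code. This illustrates the accounting only; the types and function names are hypothetical, not the SDK's API:

```typescript
interface TokenOutput {
  owner: string;
  amount: bigint; // token amounts are bigint
}

// Spend `outputs`, paying `amount` to `recipient`; change returns to `sender`.
function buildTransfer(
  sender: string,
  outputs: TokenOutput[],
  recipient: string,
  amount: bigint,
): TokenOutput[] {
  const inputTotal = outputs.reduce((sum, o) => sum + o.amount, 0n);
  if (inputTotal < amount) throw new Error("insufficient balance");
  const newOutputs: TokenOutput[] = [{ owner: recipient, amount }];
  const change = inputTotal - amount;
  if (change > 0n) newOutputs.push({ owner: sender, amount: change });
  return newOutputs; // inputs are consumed; totals balance exactly
}

const result = buildTransfer("alice", [{ owner: "alice", amount: 100n }], "bob", 75n);
// result: [{ owner: "bob", amount: 75n }, { owner: "alice", amount: 25n }]
```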
## TTXOs
A TTXO (Token Transaction Output) is a single token holding. It contains:
* **Owner**: Who can spend it
* **Token**: Which token (identified by a 32-byte identifier)
* **Amount**: How many
* **Revocation commitment**: For double-spend protection
* **Withdrawal parameters**: For L1 exit
Each TTXO maps to a pre-signed Bitcoin transaction. If operators disappear, you broadcast that transaction and claim your tokens on L1. That's the self-custody guarantee.
## Operators and threshold
Spark Operators validate and co-sign transactions. A threshold of operators must agree for any transaction to go through.
| Operators | Threshold |
| --------- | --------- |
| 2 | 2 |
| 3 | 2 |
| 5 | 3 |
Formula: `(n + 2) / 2` rounded down. This is majority, not 1-of-n.
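The table follows directly from the formula; a quick sketch (the helper name is ours, not the SDK's):

```typescript
// Threshold of operators required to commit a transaction.
// Formula from the docs: floor((n + 2) / 2) — a strict majority.
function requiredThreshold(operatorCount: number): number {
  return Math.floor((operatorCount + 2) / 2);
}

console.log(requiredThreshold(2)); // 2
console.log(requiredThreshold(3)); // 2
console.log(requiredThreshold(5)); // 3
```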
What operators can do:
* Delay transactions by refusing to sign
* See transaction metadata
* Run watchtowers
What operators cannot do:
* Move your tokens without your signature
* Steal your tokens (even if all collude)
* Block your exit to L1
## Revocation
The key innovation. How do you prevent double-spending in an off-chain system?
When you receive tokens, each output has a revocation commitment. This is a public key where the private key is split among operators via DKG. Nobody knows the full key.
When you spend those tokens, operators release their key shares. Now you can reconstruct the full revocation key for those outputs.
If you try to cheat by broadcasting old (spent) outputs to L1:
* Watchtowers detect it
* They have the revocation key
* They sweep your funds
* You lose everything
Cheating costs more than you could gain. That's the security model.
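A toy illustration of the share-release idea, using additive secret sharing over a small field. Real Spark uses threshold DKG over secp256k1; the field size, the deterministic "randomness", and all names here are purely illustrative:

```typescript
const P = 2n ** 61n - 1n; // small prime field for illustration only
const mod = (x: bigint) => ((x % P) + P) % P;

// Split a revocation key into n additive shares that sum to the key mod P.
function splitKey(key: bigint, n: number): bigint[] {
  const shares: bigint[] = [];
  let acc = 0n;
  for (let i = 0; i < n - 1; i++) {
    const s = (BigInt(i + 1) * 123456789n) % P; // stand-in for random shares
    shares.push(s);
    acc = mod(acc + s);
  }
  shares.push(mod(key - acc)); // last share makes the sum equal the key
  return shares;
}

const reconstruct = (shares: bigint[]) => shares.reduce((a, s) => mod(a + s), 0n);

const revocationKey = 42424242n;
const shares = splitKey(revocationKey, 3); // one share per operator
// No strict subset of shares reveals the key; all shares together reconstruct it.
console.log(reconstruct(shares) === revocationKey); // true
console.log(reconstruct(shares.slice(0, 2)) === revocationKey); // false
```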
## DKG
Distributed Key Generation. How operators create shared secrets without any single party knowing the full secret.
Operators pre-generate batches of keys where:
* Each operator holds a share
* No single operator has the full private key
* Threshold cooperation is needed to reconstruct
These become revocation commitments for new outputs.
## Watchtowers
Services that monitor Bitcoin L1 for attempted double-spends. Each operator runs one. If someone broadcasts a revoked output:
1. Watchtower detects it
2. Uses stored revocation key to sweep
3. Cheater loses everything
Only one honest watchtower is needed. They're incentivized to catch cheaters: the sweeping watchtower keeps the cheater's withdrawal bond.
## Relationship to Bitcoin
Tokens on Spark aren't wrapped or bridged. They're native Bitcoin constructs.
Exit transactions are valid Bitcoin transactions. Security comes from Bitcoin's proof-of-work. Operators can disappear and you still exit to L1.
No bridge. No custodian. That's the point.
# Freezing
Source: https://docs.spark.money/learn/tokens/freezing
Issuers can optionally freeze addresses. This is not enforced at the protocol level. It's a feature issuers can choose to implement.
## How it works
The issuer signs a message saying "freeze this public key." Spark Operators add that key to their frozen list. Any transfer attempt from that key gets rejected.
Unfreezing works the same way. Issuer signs, operators remove from list, transfers work again.
## What gets frozen
Freezing targets a public key, not specific tokens. So:
* All tokens owned by that key become untransferable
* Any new tokens sent to that key are also frozen
## Optional by design
Freezing is entirely optional. Issuers choose whether to use it. Some issuers may never freeze anyone. Others may need it for compliance reasons.
The protocol doesn't require freezing. It's a tool available to issuers who want it.
# Glossary
Source: https://docs.spark.money/learn/tokens/glossary
Quick definitions.
## BTKN
Bitcoin Token. The token protocol on Spark. Fast, cheap, native Bitcoin tokens.
## Create
Registering a new token type. You do this once per token. Define name, ticker, decimals, max supply.
## DKG
Distributed Key Generation. How operators create shared secrets where no single party knows the full key. Used for revocation commitments.
## Final Token Transaction
The complete transaction after operators fill in revocation commitments and withdrawal parameters. This is what gets stored and propagated.
## Mint
Creating new token units. Only the issuer can mint.
## Partial Token Transaction
What the wallet constructs before sending to operators. Missing revocation commitments and other fields that operators fill in.
## Revocation Commitment
A public key embedded in each token output. The private key is split among operators. When you spend, operators release their shares so you can reconstruct the full revocation key.
## Revocation Key
The private key for a revocation commitment. If you try to double-spend by broadcasting old outputs, watchtowers use this to sweep your funds.
## Spark Entity
The group of Spark Operators running the network.
## Spark Operator
A single node that validates, co-signs, and enforces transactions. Each one runs a watchtower.
## Threshold
How many operators must agree. Formula: `(n + 2) / 2`. For 5 operators, threshold is 3.
## Token Identifier
32-byte unique ID for a token. Displayed in bech32m format.
## Transfer
Moving tokens between owners. Inputs must equal outputs.
## TTXO
Token Transaction Output. A single token holding. Owner, amount, token type, revocation commitment, withdrawal parameters.
## Unilateral Exit
Withdrawing to Bitcoin L1 without operator cooperation. You broadcast pre-signed exit transactions, wait for locktime, claim tokens on L1.
## Validity Duration
How long a transaction can wait before operators reject it. Max 300 seconds.
## Watchtower
Service monitoring Bitcoin for double-spend attempts. Each operator runs one. If someone broadcasts a revoked output, watchtowers sweep their funds.
## Withdrawal Bond
Sats you post when exiting to L1. Returned after locktime if legitimate. Swept by watchtowers if you're cheating.
## Withdrawal Locktime
Blocks to wait after broadcasting an exit transaction. Gives watchtowers time to check and respond.
# Hello, BTKN!
Source: https://docs.spark.money/learn/tokens/hello-btkn
### What the BTKN?
BTKN is the Bitcoin token standard we wished existed when we started building on Bitcoin. It's fast, it's incredibly cheap, and it moves natively on Bitcoin without any of the UX limitations that usually come with tokens on Bitcoin.
For developers, BTKN just makes sense. You can mint tokens in seconds, transfer them instantly, and your users never have to think about gas fees or wait for confirmations. They're native Bitcoin tokens that inherit all of Bitcoin's security guarantees.
BTKN is everything we learned building on Bitcoin, distilled into one protocol. We took the best parts of Bitcoin's UTXO model, added instant settlement when you need it, and made sure you can always fall back to Bitcoin's base layer when security matters most.
### TL;DR
In more technical terms: BTKN (Bitcoin Token) is our adaptation of the [LRC-20](https://github.com/akitamiabtc/LRC-20/blob/main/LRC20-current.pdf) protocol but specifically optimized for Spark. We discovered LRC-20 in Summer 2023 and were impressed by it. After pushing it to its limits, we realized we needed to make significant improvements and changes that warranted its own identity: BTKN.
BTKN has two key components: Bitcoin (L1) as the settlement layer and Spark as the execution engine.
On Bitcoin (L1), BTKN works by tweaking Bitcoin addresses in a way that embeds token data inside regular transactions. Bitcoin nodes process these transactions as usual, but BTKN-aware nodes can extract and verify token movements by watching how those keys are adjusted.
On Spark, BTKN doesn't need to play the same tricks as L1. We don't tweak keys. Instead, they exist natively as metadata within Spark's TTXOs. When an issuer mints new tokens, they submit a transaction embedding the token's details (amount, ID, and properties) directly into a designated TTXO. Spark Operators validate these transactions, making sure they follow protocol rules and attesting to state changes. They then share this data with BTKN nodes, which continuously track transactions and keep Spark in sync with L1. BTKN tokens on Spark inherit the same L1 guarantees as Bitcoin. You can unilaterally exit your assets at any time.
# A bit of history
Source: https://docs.spark.money/learn/tokens/history
If you've spent any time in crypto, you're used to the idea that tokens live inside smart contracts. On Ethereum, Solana, and most other chains, tokens aren't native; they're managed by programs that live on-chain, enforcing balances and rules.
There are no smart contracts on Bitcoin, not in the way you think of them, at least. Bitcoin's base layer is intentionally simple. It doesn't execute arbitrary code. It doesn't have an account model. It doesn't let you just "deploy" a contract and start issuing tokens.
So how do you put tokens on Bitcoin? The answer: metaprotocols, systems that extend Bitcoin by embedding additional information into regular Bitcoin transactions. Instead of deploying new logic to the chain, metaprotocols work on top of Bitcoin's existing rules, using its transactions as a foundation while keeping token data indexed off-chain.
Over the years, we've seen a wave of attempts to bring tokens to Bitcoin: Ordinals, BRC-20, RGB, Taproot Assets, and more. Each one is impressive in its own right, but none have really scaled.
What does scaling actually mean? Liquidity. Tens, if not hundreds, of billions of dollars secured under a single standard.
We're incredibly opinionated on how this plays out: liquidity will consolidate around stablecoins. The Bitcoin standard that wins will be the one that integrates with every stablecoin issuer, meeting them exactly where they are. From the biggest players to the most niche issuers, adoption at scale is the only thing that matters.
And yet, today? Not a single major stablecoin issuer has been onboarded. Zero. If someone tells you they're planning to? Look at the fine print. Implementation details have a way of disappearing when it's time to ship.
We evaluated everything and came to a clear conclusion: we needed to build something new. We saw an opportunity to focus on making tokens that actually work the way people expect them to work.
So we built BTKN (Bitcoin Token) from first principles. It combines the best ideas from across the ecosystem with new innovations that only became possible on Spark. The result is a token standard that's native to Bitcoin, optimized for speed, and ready to onboard any asset type on Bitcoin.
# Minting
Source: https://docs.spark.money/learn/tokens/minting
Minting creates new tokens. Only the issuer can mint.
## How it works
The issuer signs a transaction saying "create X tokens and give them to Y address." Spark Operators verify the signature, confirm it doesn't exceed max supply, and commit the transaction. New tokens now exist.
That's it. No on-chain transaction. No gas. Instant.
## What gets checked
| Check | Why |
| ---------------- | --------------------------------------------- |
| Issuer signature | Only the issuer key can create tokens |
| Token exists | You have to create (register) the token first |
| Max supply | Can't mint beyond the cap you set |
| Not frozen | Frozen issuers can't mint |
## Mint vs Create
People confuse these.
**Create** registers a new token type. You do this once. You define the name, ticker, decimals, max supply.
**Mint** creates new units of an existing token. You do this whenever you want more tokens in circulation.
## Supply
If you set a max supply during creation, you can't mint beyond it. If you didn't set one, you can mint forever.
Most stablecoin issuers mint on-demand: user deposits \$100, issuer mints 100 tokens. Supply grows with demand.
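The supply rule boils down to one check. A sketch (names are hypothetical; treating a cap of `0n` as "uncapped" is a convention of this sketch, not necessarily the SDK's encoding):

```typescript
// Can the issuer mint `amount` more tokens?
// maxSupply === 0n means "no cap set" in this sketch.
function canMint(circulating: bigint, amount: bigint, maxSupply: bigint): boolean {
  if (maxSupply === 0n) return true; // uncapped: mint forever
  return circulating + amount <= maxSupply;
}

console.log(canMint(900n, 100n, 1000n)); // true  — exactly reaches the cap
console.log(canMint(950n, 100n, 1000n)); // false — would exceed max supply
console.log(canMint(950n, 100n, 0n));    // true  — uncapped token
```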
## Where tokens go
By default, minted tokens go to the issuer's address. You can also mint directly to someone else's address. Useful for airdrops or when you're minting in response to a deposit.
## Common patterns
**Premint everything**: Mint total supply upfront, then distribute via transfers. Simple accounting.
**Mint on demand**: Mint as users onboard. Better for stablecoins where supply should match deposits.
**Batch mint**: Send to multiple recipients in one transaction. More efficient.
# Transferring
Source: https://docs.spark.money/learn/tokens/transferring
Transfers move tokens between people. They're instant and final.
## How it works
You have token outputs (TTXOs). To send tokens, you spend some outputs and create new ones for the recipient. If you're sending less than what's in your outputs, you get change back.
```
Alice has: 100 tokens (one output)
Alice sends: 75 tokens to Bob
Result:
Bob gets: 75 tokens (new output)
Alice gets: 25 tokens (change output)
Original 100-token output: gone (revoked)
```
The math has to balance. Inputs = outputs. No tokens created or destroyed.
## Why it's instant
No on-chain transaction. No block confirmations. The transfer happens the moment Spark Operators commit it. Bob can spend those tokens immediately.
## Input selection
When you have multiple outputs, the SDK picks which ones to spend. Two strategies:
**Small first** (default): Spends your smallest outputs first. Over time, this consolidates your holdings into fewer, larger outputs.
**Large first**: Spends your biggest outputs first. Useful if you want to preserve small outputs for exact-amount payments later.
You can also pick specific outputs manually if you care.
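The two strategies differ only in sort order. A sketch of the selection loop (illustrative, not the SDK's actual coin selection):

```typescript
type Strategy = "SMALL_FIRST" | "LARGE_FIRST";

// Pick outputs until they cover `target`, in the order the strategy dictates.
function selectOutputs(amounts: bigint[], target: bigint, strategy: Strategy): bigint[] {
  const sorted = [...amounts].sort((a, b) =>
    strategy === "SMALL_FIRST" ? (a < b ? -1 : 1) : (a > b ? -1 : 1),
  );
  const picked: bigint[] = [];
  let total = 0n;
  for (const amt of sorted) {
    if (total >= target) break;
    picked.push(amt);
    total += amt;
  }
  if (total < target) throw new Error("insufficient balance");
  return picked;
}

console.log(selectOutputs([5n, 50n, 10n], 12n, "SMALL_FIRST")); // [5n, 10n] — consolidates small outputs
console.log(selectOutputs([5n, 50n, 10n], 12n, "LARGE_FIRST")); // [50n] — preserves small outputs
```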
## Sending to multiple people
You can send to multiple recipients in one transaction. More efficient than separate transfers. One round trip to operators instead of many.
## Consolidation
If you have lots of small outputs, transfer everything to yourself. You'll end up with one output instead of many. Cleaner, faster, cheaper to exit to L1 if you ever need to.
## Limits
| Limit | Value |
| --------------------------- | ----------- |
| Max inputs per transaction | 500 |
| Max outputs per transaction | 500 |
| Validity window | 300 seconds |
If you need to spend more than 500 inputs, split into multiple transactions.
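Splitting a large spend into transactions that respect the input limit is plain chunking; a sketch:

```typescript
const MAX_INPUTS = 500; // per-transaction input limit from the docs

// Split a list of inputs into batches that each fit one transaction.
function chunkInputs<T>(inputs: T[], maxPerTx: number = MAX_INPUTS): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < inputs.length; i += maxPerTx) {
    batches.push(inputs.slice(i, i + maxPerTx));
  }
  return batches;
}

const batches = chunkInputs(new Array(1200).fill(0));
console.log(batches.length);               // 3 transactions
console.log(batches.map((b) => b.length)); // [500, 500, 200]
```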
## Finality
Once operators commit, it's done. The only way to "reverse" is if the recipient sends tokens back voluntarily. There's no chargeback, no undo, no dispute process at the protocol level.
# Transfers on Spark
Source: https://docs.spark.money/learn/transfers
Leaf ownership is transferred by adjusting the SE's (Spark Entity's) key such that the combined key (SE + User) remains the same before and after the transfer, while control shifts from the sender to the receiver.
Original Combined Key:\
$(\text{PubKey}_{\text{Combined}} = \text{PubKey}_{\text{Sender}} + \text{PubKey}_{\text{SE}})$
After Transfer:\
$(\text{PubKey}_\text{Combined} = \text{PubKey}_\text{Receiver} + \text{PubKey}'_\text{SE})$
We do this by tweaking the SE key with the difference between the sender and receiver private keys. Note that no private keys are ever revealed through this process.
### Transfer Process
To transfer ownership of a leaf, the SE adjusts its key so that the combined public key (SE + User) remains the same, but control shifts from the sender to the receiver. This is achieved by the SE tweaking its key by the difference between the sender's and receiver's private keys. By doing so, the SE's new key, when combined with the receiver's key, equals the original combined key, ensuring the UTXO can still be spent under the original spending conditions.
This key adjustment allows the transfer of control without revealing any private keys or requiring an on-chain transaction. The SE securely deletes the old private key associated with the sender and collaborates with the receiver to sign a new exit transaction with a lower timelock than the previous one. This process effectively transfers control of the leaf to the receiver, who now has full control over the UTXO.
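Because a public key is linear in its private key ($\text{PubKey} = k \cdot G$), the tweak can be sanity-checked with scalar arithmetic alone. A toy demo over a small field (illustrative only; real keys live on secp256k1, and in the actual protocol the difference is computed without any private key being revealed):

```typescript
const N = 2n ** 61n - 1n; // stand-in for the curve order, for illustration
const mod = (x: bigint) => ((x % N) + N) % N;

// SE tweaks its key by the difference between sender and receiver keys:
// se' = se + (sender - receiver), so receiver + se' === sender + se.
function tweakSEKey(se: bigint, sender: bigint, receiver: bigint): bigint {
  return mod(se + sender - receiver);
}

const sender = 111n, receiver = 222n, se = 999n;
const seTweaked = tweakSEKey(se, sender, receiver);
console.log(mod(receiver + seTweaked) === mod(sender + se)); // true — combined key unchanged
```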
# Trust Model
Source: https://docs.spark.money/learn/trust-model
Spark operates under a “moment-in-time” trust model, meaning that trust is only required at the time of a transaction. As long as at least one (or a configurable threshold) of the Spark operators behaves honestly during a transfer, the system ensures perfect forward security. Even if operators are later compromised or act maliciously, they cannot retroactively compromise past transactions or take money from users.
This security is achieved through a fundamental action required of SOs: forgetting the operator key after a transfer. If an operator follows this protocol, the transferred funds are secure, even against future coercion or hacking. Spark strengthens this model by using multiple operators: as long as one (or the required threshold) deletes its key, users retain perfect forward security. This approach contrasts with most other Layer 2 solutions, where operators retain the ability to compromise user funds until users exit the system. The only exception is Lightning, which requires no trusted entities at all.
# Unilateral Exits
Source: https://docs.spark.money/learn/unilateral-exits
# Welcome to Spark
Source: https://docs.spark.money/learn/welcome
Spark is the fastest, lightest, and most developer-friendly way to build financial apps and launch assets, **natively on Bitcoin.**
Spark lets developers move Bitcoin and Bitcoin-native assets (including stablecoins) instantly, at near-zero cost, while staying fully connected to Bitcoin's infrastructure. Spark has no bridge and doesn’t wrap your asset: everything is native to Bitcoin.
## Choose Your Path
# What to Build
Source: https://docs.spark.money/learn/what-to-build
Spark is completely agnostic and unopinionated about what people build on top of it. You do whatever you want with it. The goal is to give developers low-level primitives, nothing more. That said, we're always buzzing with ideas about what's possible.
* Connect to >100 endpoints through Lightning
* Enable instant, dirt-cheap Bitcoin flows
* Enable your users to send Bitcoin privately via Lightning
* Issue your own branded reward/point token on Spark
* Enable native Bitcoin DCA for your users
* Give Bitcoin cashback to users performing a specific on-chain action (e.g. swap, send, pay with card, etc.)
* Give Bitcoin yield/interest for users holding a specific asset (e.g. BTC, stablecoin etc.)
* Launch your branded stablecoin (e.g. via brale.xyz)
* Enable your users to buy/sell native bitcoin in a self-custodial way (e.g. via flashnet.xyz)
* \[coming soon] Connect to the cheapest on-off ramp network
* \[coming soon] Issue debit cards and virtual accounts for your users
* Connect to >100 endpoints through Lightning
* Enable instant, dirt-cheap Bitcoin deposits and withdrawals
* Issue your branded stablecoin (e.g. via Brale.xyz)
* Connect to one of the largest and most competitive cross-border payment networks
* Open new geographies without compliance or regulatory overhead
* Launch on the most distributed crypto network and liquidity
* Enable on-chain Bitcoin applications (hedging for traders, lending & swapping, payouts to miners etc.)
* Give Bitcoin cashback to users performing a specific on-chain action (e.g. send a payment)
* Provide Bitcoin yield (instead of USD pro-rata) to users holding your stablecoin
* \[coming soon] Make your stablecoin interoperable with any domestic payment systems
* \[coming soon] Monetize and distribute any stablecoin activity
* \[coming soon] Let your stablecoin move privately
* Receive Bitcoin 100x faster and cheaper through Lightning
* Let your merchant's customers pay privately via Lightning
* Incentivize custom payment behaviors for users
* Connect to faster and cheaper on-off ramps for your merchants
* AI agents with Spark wallets
* Gate AI scraping behind Spark powered paywalls
* AI agents that can execute trades on your behalf
# Why Bitcoin
Source: https://docs.spark.money/learn/why-on-bitcoin
We get this question a lot: why build Spark on Bitcoin? Why not launch a new L1 or L2? Why take on the challenge of bringing stablecoins to Bitcoin?
The short answer: we're Bitcoin pragmatists. Building a network is one of the hardest things you can do, and if you're going to do it, you need every structural advantage on your side. Bitcoin gives us that.
### 1. Bitcoin is the only network with network effects
Distribution is everything. You can build the fastest, cheapest L1, but if you can't get it into people's hands, it doesn't matter.
Bitcoin has over 300 million users. It's natively integrated into the largest wallets, banks, and fintech apps. That means Spark is instantly interoperable with hundreds of millions of endpoints from day one.
### 2. Bitcoin is where the money is
Liquidity is the lifeblood of any financial network. If you're building in crypto, you need to bootstrap liquidity before your network can take off.
With Bitcoin, that problem is already solved: over 60% of all crypto liquidity is sitting in BTC.
### 3. People actually want to accumulate Bitcoin, not sell it
Most crypto tokens get farmed, dumped, and forgotten. Bitcoin is different — people stack it, not sell it. It's scarce, it's deflationary, and it has the strongest digital gold thesis.
For builders, this is amazing. Bitcoin gives you an instant edge for rewards, retention, and engagement. No need to convince users; it's already the asset they want. That unlocks entirely new product experiences, like BTC-based yield, cashback, and incentives that feel like wealth accumulation rather than just another points system.
### 4. Bitcoin is the only crypto your grandmother has heard of
Ask your grandmother if she's heard of Bitcoin. Now ask her about any other crypto project.
This matters. When you build on Bitcoin, you're not starting from scratch; you're tapping into the only crypto brand with mainstream trust and recognition.
### 5. Bitcoin is the only network that will outlive every timeline
In 1 year, 5 years, 10 years, 50 years, or even 100 years, Bitcoin will still be running. Can you confidently say the same for other scaling solutions?
As a builder, do you want to wake up one day and realize that all the liquidity your users rely on is at risk of disappearing?
***
Think we're wrong? Roast us. (Tweet or DM [@spark](https://x.com/spark)). Show us another chain that beats Bitcoin on all five of these dimensions. If you can, we'll listen.
# Withdrawals to L1
Source: https://docs.spark.money/learn/withdrawals
There are two ways to exit Spark and withdraw funds to Bitcoin L1: cooperative and unilateral exits.
### Cooperative Exit
In a cooperative exit, the user and SE work together to create a new transaction that spends directly from the deposit transaction to the user's desired Bitcoin address. This is the most efficient and cost-effective method, as it requires only a single on-chain transaction.
1. **Transaction Creation:** The user and SE collaborate to create a transaction that spends from the deposit transaction's output directly to the user's desired Bitcoin address.
2. **Signing:** Both the user and SE sign this transaction using their respective private keys.
3. **Broadcasting:** The signed transaction is broadcast to the Bitcoin network.
4. **Confirmation:** Once confirmed, the funds are available in the user's Bitcoin address.
### Unilateral Exit
If the SE is unavailable or uncooperative, the user can perform a unilateral exit using the pre-signed exit transactions created during the deposit and transfer processes.
1. **Broadcasting Branch Transaction:** The user broadcasts the branch transaction that was signed during the deposit process.
2. **Waiting for Confirmation:** Once the branch transaction is confirmed, the user waits for the relative timelock to expire.
3. **Broadcasting Exit Transaction:** After the timelock expires, the user broadcasts the exit transaction that was signed during the deposit or most recent transfer.
4. **Confirmation:** Once confirmed, the funds are available in the user's Bitcoin address.
The unilateral exit process ensures that users always have control over their funds, even if the SE becomes unavailable or malicious. This is a critical aspect of Spark's self-custody design, providing users with the security and sovereignty that Bitcoin was designed to offer.
# Create a Wallet
Source: https://docs.spark.money/quickstart/create-wallet
Create a self-custodial Bitcoin wallet on Spark, fund it, and send your first payment.
Get up and running with Spark in minutes. This quickstart guide will walk you through creating your first Spark wallet, funding it with Bitcoin, and making your first transaction on Spark.
***
Clone and launch the Spark CLI, powered by the Spark SDK:
```bash CLI theme={null}
# Clone the Spark SDK repo
git clone https://github.com/buildonspark/spark.git
# Navigate to the JS folder
cd spark/sdks/js
# Install dependencies and build the SDK
yarn && yarn build
# Navigate to the JS CLI folder
cd examples/spark-cli
# Start the CLI
yarn cli
```
Create your first wallet on Spark:
```bash CLI theme={null}
> initwallet
```
Example output:
```bash CLI theme={null}
Mnemonic: please broccoli hole unfold trigger novel marriage come invest need ostrich never
Network: REGTEST
```
Important: Keep your mnemonic securely stored offline. Anyone with access to it can take full control of your funds.
To recover an existing wallet:
```bash CLI theme={null}
> initwallet
```
Generate a static deposit address:
```bash CLI theme={null}
> getstaticdepositaddress
bcrt1pz5sxkd4eaycla7av8c9avmdleyertmhkh2zf60vrmn346wwnjayq8phsra
```
You’ll get a static Bitcoin L1 deposit address linked to your wallet; it can’t be changed. Try it out in our Regtest environment with some monopoly money from the [faucet](https://app.lightspark.com/regtest-faucet).
After depositing Bitcoin from the faucet, get the associated transaction hash:
```bash CLI theme={null}
> getlatesttx
# Example:
> getlatesttx bcrt1pz5sxkd4eaycla7av8c9avmdleyertmhkh2zf60vrmn346wwnjayq8phsra
2c5ccdc5852eb23662344c142970a1d96f2bed539a1be074cbbff65411ba3270
```
Once the transaction is confirmed on-chain, you can claim it. To claim your Bitcoin on Spark, start by requesting a quote:
```bash CLI theme={null}
> claimstaticdepositquote
# Example:
> claimstaticdepositquote 2c5ccdc5852eb23662344c142970a1d96f2bed539a1be074cbbff65411ba3270
```
With the quote info ready, you can now claim your Bitcoin.
```bash CLI theme={null}
> claimstaticdeposit
# Example:
> claimstaticdeposit 2c5ccdc5852eb23662344c142970a1d96f2bed539a1be074cbbff65411ba3270 3901 3045022100a69a1c58893947e46d4310d967c2d7e96f539e3e2656e1c76cbce1b96afc149102200a0aef9518cd0a76c9baecbf60afd52ca4c30d2d8025bcba70288a9df6a39e63
```
Verify that your balance has increased:
```bash CLI theme={null}
> getbalance
Sats Balance: 3901
```
Create a new wallet and fetch its Spark Address. The Spark Address is a static address that can be shared with payers to receive Bitcoin.
```bash CLI theme={null}
# Wallet 2
> initwallet
Mnemonic: repeat entry hazard estate normal relief pledge act online raw pull bean
Network: REGTEST
> getsparkaddress sparkrt1pgss95264kxj85cqz8g2cj5f66wy5yhhc0je35f8923de7mk8ttvls7a7x9vp9
```
Now send from Wallet 1:
```bash CLI theme={null}
# Wallet 1
> initwallet
> sendtransfer
# Example:
> sendtransfer 200 sparkrt1pgss95264kxj85cqz8g2cj5f66wy5yhhc0je35f8923de7mk8ttvls7a7x9vp9
```
That's it. The transfer’s complete. Run `getbalance` on each wallet to confirm the Bitcoin moved.
Spark is fully compatible with Lightning. Let's test it by sending a Lightning payment between our 2 wallets.
```bash CLI theme={null}
# Wallet 2 - Create invoice
> initwallet
> createinvoice
# Example:
> createinvoice 1000 Spark is awesome!
```
```bash CLI theme={null}
# Wallet 1 - Pay invoice
> initwallet
> payinvoice
# Example:
> payinvoice lnbcrt10u1p5pqphup[...]cpkql23a 200
```
The payer specifies the maximum fee they're willing to pay for the invoice (in sats). The SDK then finds the route with the lowest possible fees and never exceeds that limit.
Use `getbalance` on each wallet to verify the payment.
You can withdraw Bitcoin from Spark to L1 by sending it to a Bitcoin address. In this example, we’ll withdraw Bitcoin from Wallet 1 and send it to Wallet 2’s L1 address.
```bash CLI theme={null}
# Wallet 2 - Get deposit address
> initwallet
> getstaticdepositaddress
```
Check withdrawal fee:
```bash CLI theme={null}
# Wallet 1 - Check fees and withdraw
> initwallet
> withdrawalfee
# Example:
> withdrawalfee 15000 bcrt1p6tx52amnr448lv8vyr7fumqt3c2qmlkg4hgvj8swxfcz8cayukvqwk9mu6
```
If it’s acceptable:
```bash CLI theme={null}
> withdraw
# Example:
> withdraw 15000 bcrt1pslvlzmkwz8f42u8vr2fkhdhyzyh2x5cwy8l0lpdnqr4ptsjrefrq0sd0gl FAST
```
Once the transaction is confirmed, use the same claim process described above to claim your Bitcoin on Spark.
# Launch a Token
Source: https://docs.spark.money/quickstart/launch-token
Create a token on Spark, mint supply, and send your first token transfer.
Create and launch your first token on Spark in minutes. This quickstart will walk you through setting up your wallet, issuing a token, minting supply, and sending your first transfer. All on Spark, without touching L1 or waiting for confirmations.
***
Clone and launch the Spark CLI, powered by the Spark SDK:
```bash CLI theme={null}
# Clone the Spark SDK repo
git clone https://github.com/buildonspark/spark.git
# Navigate to the JS folder
cd spark/sdks/js
# Install dependencies and build the SDK
yarn && yarn build
# Navigate to the JS CLI example
cd examples/spark-cli
# Start the CLI
yarn cli
```
Create a wallet to serve as the issuer of your token:
```bash CLI theme={null}
> initwallet
```
Example output:
```bash CLI theme={null}
Mnemonic: rhythm torch mistake reopen device surround cabin wish snake better blind draft
Network: REGTEST
```
Important: Keep your mnemonic safe and offline. The wallet you create here becomes the root of trust for your token. It's the only wallet authorized to mint or burn supply.
Launch your token directly on Spark:
```bash CLI theme={null}
> createtoken
# Example:
> createtoken MyToken MTK 6 1000000 false
```
Human-readable name of the token.
Short symbol for the token (typically 3–6 characters).
Decimal precision used for display and arithmetic.
Hard cap on total mintable supply.
Whether the issuer can freeze or unfreeze token balances.
```bash CLI theme={null}
TxID: 9c5d17acb7fe203ab6342b3f556b4e5f3b4dabe1bca3b0b98b1b3d9cf7d92c4d
Token ID: btkn1qv6cps0n0ttm3p4gx62rzty4rjhzqwe5eqv2wlt
```
Spark transaction ID of the token creation.
Bech32m token identifier (e.g., btkn1...) used in later operations.
Your token now exists on Spark. It's ready for minting and transfers.
Bring new tokens into circulation. Only the issuer wallet can mint, up to the max supply you set at creation.
```bash CLI theme={null}
> minttokens 500000
```
Example response:
```bash CLI theme={null}
Minted: 500000 MTK
Remaining mintable: 500000 MTK
```
You can verify the token balance:
```bash CLI theme={null}
> gettokenbalance
Token: MTK
Balance: 500000
```
Tokens on Spark move instantly and cost nothing. Create a new wallet to act as the recipient.
```bash CLI theme={null}
# Wallet 2
> initwallet
Mnemonic: explain bullet cradle segment lava enable someone lemon bracket fossil invite crash
> getsparkaddress
sparkrt1pgw6rrt6s4y3xghx6cl5v4mm08eylktaygff62mg8uk3u5zqq2zwqf9t9d0
```
Then send tokens from your issuer wallet:
```bash CLI theme={null}
# Wallet 1 (Issuer)
> transfertokens btkn1qv6cps0n0ttm3p4gx62rzty4rjhzqwe5eqv2wlt 100000 sparkrt1pgw6rrt6s4y3xghx6cl5v4mm08eylktaygff62mg8uk3u5zqq2zwqf9t9d0
```
That’s it. The transfer settles instantly on Spark.
# Features
Source: https://docs.spark.money/start/features
The team behind Spark has spent years building on Bitcoin. After countless iterations, Spark represents our best attempt at creating the definitive platform for builders on Bitcoin. We built it from scratch, with strong opinions on what makes a truly great developer experience. Spark exposes powerful, low-level features for a wide range of use cases.
***
Live
In development
***
### Something we should be building?
Most of what's on this page came from working directly with our builders. If you have a use case that needs something we don't support yet, we want to hear about it.
Get in touch
# Welcome to Spark
Source: https://docs.spark.money/start/overview
Spark is the fastest, cheapest, and most UX-friendly way to build financial apps and launch assets natively on Bitcoin. It's a Bitcoin L2 that lets developers move Bitcoin and Bitcoin-native assets (including stablecoins) instantly, at near-zero cost, while staying fully connected to Bitcoin’s infrastructure.
***
## Quickstart
The fastest way to go from 0 to 1 on Spark
***
## Build something
Spark makes Bitcoin fast enough to build on. Payments, wallets, stablecoins, and whatever comes next.
***
## Get to know Spark
Spark integrates with best-in-class infrastructure partners to help you ship faster
***
## Building on Spark
# Products
Source: https://docs.spark.money/start/products
Spark SDKs and developer tools for Bitcoin L2 development.
Explore Spark's product ecosystem and offerings.
## Core Products
### Spark Network
The foundational layer-2 network for Bitcoin scaling.
### Spark Wallet SDK
Developer tools for building Spark-native wallets.
### Spark Issuer SDK
Platform for creating and managing tokens on Spark.
## Developer Tools
### Spark CLI
Command-line interface for Spark development.
### Spark Explorer
Block explorer for the Spark network.
### Spark Faucet
Test token distribution for developers.
# Use Cases
Source: https://docs.spark.money/start/use-cases
Spark makes Bitcoin fast enough to build on. We're building the infrastructure that makes Bitcoin the definitive platform for payments, wallets, and stablecoins. Where new kinds of apps can finally exist on the world's most secure ledger.
## Payments
Global transfers, tipping, merchant apps, and more payment solutions built on Bitcoin.
## DeFi
Liquidity pools, tokenized BTC, swaps, and decentralized finance protocols.
## Rewards
Bitcoin cashback, loyalty, micro-incentives, and reward-based applications.
# Token Lists
Source: https://docs.spark.money/tools/token-lists
Registry of BTKN tokens on Spark.
# Instant Bitcoin Deposits
Source: https://docs.spark.money/wallets/0-conf
This feature is in **private experimental beta** and only available via a **custom integration**. [Contact us](https://www.spark.money/build-now) to discuss your use case.
**Instant Bitcoin Deposits** enable instant BTC deposits onto Spark, without having to wait for on-chain confirmations.
Normally, payments on Bitcoin L1 can take 60 minutes or more, depending on the number of confirmations required. This waiting period hurts user experience and can expose deposits to volatility and missed opportunities.
Instant Deposits make L1 BTC usable immediately.
***
## 0-conf Bitcoin Deposit Flow
BTC is sent to a static Spark L1 deposit address generated via the Spark API.
When the transaction hits the Bitcoin mempool, the LP detects it and performs a risk evaluation.
If the transaction meets the LP's safety criteria, the LP makes the BTC immediately spendable on Spark.
***
## Use cases
Give wallets and platforms the fastest way to buy Bitcoin through your onramp. The moment a user clicks “Buy,” BTC lands in their destination wallet.
1. Deliver the smoothest card-to-BTC onramp. BTC is usable seconds after purchase, not 30–60 minutes later.
2. Let users use Bitcoin immediately. Funds can be traded, spent, or bridged as soon as the purchase completes.
3. Reduce customer support load. Fewer “Where is my Bitcoin?” tickets.
4. Turn speed into revenue for your integrators. Let wallets sell “Fast Bitcoin” as a paid upgrade.
BTC is purchased by a user through your onramp within a wallet or platform (one of your customers).
As the onramp provider, you send BTC directly to a static deposit address generated by your integrator.
Once approved, the BTC you sent is immediately usable on Spark by the end user or integrator.
The user has full control of the BTC. It can be moved to any wallet or platform use case (trading, betting, or bridging) and sent onward over Bitcoin or Lightning. Settlement is complete and final.
Give integrators (apps, DeFi protocols, wallets) and end users the fastest way to move Bitcoin to their preferred chain. The moment BTC is sent, it’s final on Spark and can be atomically swapped to another chain or asset, based on your bridge design.
1. Remove Bitcoin as the slow leg. BTC is final on Spark instantly, so cross-chain swaps execute without waiting on L1 confirmations.
2. Increase successful intent execution. Faster finality means fewer expired quotes, reverts, and failed routes.
3. Unlock more BTC flow. Lower latency and better UX turn Bitcoin into a viable source asset for high-frequency cross-chain volume.
4. Monetize speed and reliability. Offer “instant BTC bridging" as a premium option.
The bridge generates a static Spark L1 deposit address. While implementations may vary, the address can be created in a trustless manner (Spark supports native HTLCs to enforce this).
Once approved by the LP, BTC becomes immediately spendable on Spark and can be detected by the bridge infrastructure. The exact handling depends on each bridge’s architecture and execution model (e.g., solver-based designs).
Once BTC is available on Spark, the bridge can execute the cross-chain swap. Spark supports trustless execution via HTLCs, allowing BTC to be locked until predefined conditions are met.
Enable the best inbound Bitcoin deposit UX for self-custody wallets and wallet platforms. The moment BTC hits the mempool, it becomes spendable on Spark. Users can transact immediately instead of waiting on confirmations.
1. Keep users in-session. Deposits become usable immediately, so users don’t bounce while waiting for confirmations.
2. Unlock “instant receive” as a product feature. Let users trade, spend, bridge, or pay as soon as the deposit hits.
3. Reduce support and confusion. Fewer “pending deposit” tickets and fewer dropped sessions.
4. Monetize speed. Offer Instant Receive as a paid upgrade or included tier benefit.
Give your exchange the fastest BTC deposit experience in the market. Give your users a faster way to trade, convert, or withdraw immediately after depositing BTC.
1. Win volatile moments. Instant access lets traders act during fast markets.
2. Increase deposit to trade conversion. Fewer idle deposits, more active balances.
3. Reduce ops and support overhead. Fewer pending deposits and manual interventions.
4. Monetize instant access. Offer instant deposits as a VIP or paid feature (take bps).
Make your Bitcoin L1 payment operations more efficient. Reduce exposure to Bitcoin volatility, pay merchants faster, and increase checkout conversion.
1. Real-time payment UX. Accept BTC and complete the payment flow instantly.
2. Faster payouts and settlement. Move from “pending” to “done” in seconds for users and merchants.
3. Reduce FX exposure. Convert to stablecoins or other assets immediately when the flow requires it.
4. Improve cashflow. Faster settlement means funds aren’t stuck in limbo.
5. Increase payment completion rates. Less drop-off from pending states.
6. Charge for speed and certainty. Offer “instant acceptance” or premium execution tiers.
***
## Integration
Each integration can differ based on your specific use case or requirements. If you have any questions, just let us know.
This feature is in private experimental beta. For integrators: within the wallet SDK, once a deposit is detected, you’ll receive both the quote and claim functions needed to complete the flow.
```js theme={null}
const quote = await wallet.getInstantDepositQuote_EXPERIMENTAL(
txid,
vout,
PARTNER_JWT || undefined
);
if (!quote.is_eligible) {
console.log("Not available because", quote.reason);
} else {
console.log(`You can claim ${quote.amount} of ${quote.total} instantly!`);
const result = await wallet.claimInstantDeposit_EXPERIMENTAL(quote.id);
}
// After the transaction confirms on-chain, Spark will automatically
// release any remaining amount (beyond the instant credit) to the wallet balance.
```
## FAQ
Instant Deposits remove the 10–60 minute wait that normally applies to Bitcoin deposits.
Build the best deposit UX by letting users trade or spend as soon as the deposit hits, instead of waiting for confirmations.
You can charge fees for faster deposits and make it an opt-in feature.
More deposits turn into actual trading and usage because funds are available immediately.
Fewer support tickets about pending or stuck deposits.
Participating LPs on the Spark network run proprietary risk engines that evaluate each incoming transaction and can set threshold safety criteria.
No. Only transactions that meet the participating LP's safety criteria qualify for instant funding. If a transaction doesn't qualify, it follows the standard Spark deposit flow, which settles after on-chain confirmations (typically 1–3 confirmations depending on transaction characteristics).
Typically under 5 seconds from mempool detection to spendable balance on Spark.
Yes. Instant Deposits provide real, native BTC on Spark, not wrapped or synthetic assets. Once available, BTC can move instantly to Bitcoin L1, Lightning, or any Spark address. The Spark protocol guarantees unilateral exit: even if all infrastructure disappeared, funds can always be withdrawn directly to L1 using pre-signed exit transactions.
Yes. Instant Deposits build on Spark's native deposit flow, which is fully trust-minimized. No party, including LPs, can move or control deposited funds. BTC remains locked in the on-chain UTXO until the Spark Protocol finalizes settlement. During the instant deposit window, the LP fronts liquidity on Spark based on the unconfirmed transaction's risk profile. Once the Bitcoin transaction confirms on-chain, the protocol settles the deposit and the LP's fronted liquidity is reconciled.
Yes. LPs enforce criteria such as deposit size and frequency limits. Limits may vary by LP, integrator, and individual transaction characteristics.
# Addressing
Source: https://docs.spark.money/wallets/addressing
Spark Addresses are Bech32m-encoded wrappers that combine two elements: the network identifier and the wallet’s identity public key.
```typescript MAINNET theme={null}
spark1pgssyuuuhnrrdjswal5c3s3rafw9w3y5dd4cjy3duxlf7hjzkp0rqx6dj6mrhu
```
```typescript REGTEST theme={null}
sparkrt1pgssyuuuhnrrdjswal5c3s3rafw9w3y5dd4cjy3duxlf7hjzkp0rqx6dj6mrhu
```
```typescript TESTNET theme={null}
sparkt1pgssyuuuhnrrdjswal5c3s3rafw9w3y5dd4cjy3duxlf7hjzkp0rqx6dj6mrhu
```
```typescript SIGNET theme={null}
sparks1pgssyuuuhnrrdjswal5c3s3rafw9w3y5dd4cjy3duxlf7hjzkp0rqx6dj6mrhu
```
```typescript LOCAL theme={null}
sparkl1pgssyuuuhnrrdjswal5c3s3rafw9w3y5dd4cjy3duxlf7hjzkp0rqx6dj6mrhu
```
***
### Network Identifiers
| Network | Prefix | Availability |
| ----------- | --------- | -------------- |
| **Mainnet** | `spark` | ✅ Available |
| **Regtest** | `sparkrt` | ✅ Available |
| **Testnet** | `sparkt` | 🔜 Coming soon |
| **Signet** | `sparks` | 🔜 Coming soon |
| **Local** | `sparkl` | ✅ Available |
Currently, Spark supports **Mainnet** and **Regtest** networks. Support for Testnet and Signet is planned for future releases.
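Because the network identifier is the Bech32m human-readable prefix, an address's network can be determined with a plain string check. A minimal sketch (this `networkFromAddress` helper is illustrative, not an SDK export):

```typescript theme={null}
// Spark address prefixes from the table above. The "1" separator is
// appended so e.g. "sparkrt1..." is never matched by the "spark" prefix.
const SPARK_PREFIXES: [string, string][] = [
  ["sparkrt", "REGTEST"],
  ["sparkt", "TESTNET"],
  ["sparks", "SIGNET"],
  ["sparkl", "LOCAL"],
  ["spark", "MAINNET"],
];

function networkFromAddress(address: string): string | undefined {
  const entry = SPARK_PREFIXES.find(([prefix]) =>
    address.startsWith(prefix + "1"),
  );
  return entry?.[1];
}
```

This only inspects the prefix; full validation (checksum, data length) still requires a Bech32m decoder.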
***
## Getting Your Spark Address
To get your Spark Address, use the `getSparkAddress()` method:
```typescript theme={null}
const sparkAddress = await wallet.getSparkAddress();
console.log("My Spark Address:", sparkAddress);
```
## Important Notes
* A Spark Address is derived from your wallet's network and identity public key
* The same wallet will always generate the same Spark Address
* Spark Addresses are network-specific (mainnet/testnet/regtest)
# API Reference
Source: https://docs.spark.money/wallets/api-reference
# Balances & Activity
Source: https://docs.spark.money/wallets/balances
Query balances, view transfer history, and monitor wallet activity in real-time.
***
## Check Wallet Balance
Get your current Bitcoin balance and token holdings in your Spark wallet.
getBalance()
Gets the current balance of the wallet, including Bitcoin balance and token balances.
```typescript theme={null}
const balance = await wallet.getBalance();
console.log("Balance:", balance.balance, "sats");
console.log("Token Balances:", balance.tokenBalances);
```
The wallet's current balance in satoshis
Map of Bech32m token identifiers to token balances
***
## View Transfer History
Track all incoming and outgoing transfers for your wallet with pagination support.
getTransfers(limit?, offset?, createdAfter?, createdBefore?)
Gets all transfers for the wallet with optional pagination.
`getTransfers()` includes Spark transfers, Lightning sends/receives, and cooperative exits. For token transaction details (e.g., sender address), use [`queryTokenTransactions()`](/api-reference/wallet/query-token-transactions).
```typescript theme={null}
// Get first 20 transfers
const transfers = await wallet.getTransfers();
console.log("Transfers:", transfers.transfers);
// Get next 10 transfers with pagination
const nextTransfers = await wallet.getTransfers(10, 20);
console.log("Next page:", nextTransfers.transfers);
// Get transfers from the last 24 hours
const yesterday = new Date(Date.now() - 24 * 60 * 60 * 1000);
const recentTransfers = await wallet.getTransfers(50, 0, yesterday);
```
Maximum number of transfers to return (default: 20)
Offset for pagination (default: 0)
Only return transfers created after this date (mutually exclusive with `createdBefore`)
Only return transfers created before this date (mutually exclusive with `createdAfter`)
Array of transfer objects containing transfer details
The offset used for this request
***
## Real-time Event Monitoring
Monitor wallet activity in real-time using EventEmitter methods for instant updates.
on(event, listener)
Adds a listener for the specified event to monitor wallet activity.
```typescript theme={null}
// Listen for incoming transfer claims
wallet.on("transfer:claimed", (transferId, updatedBalance) => {
console.log(`Transfer ${transferId} claimed. New balance: ${updatedBalance}`);
});
// Listen for deposit confirmations (after 3 L1 confirmations)
wallet.on("deposit:confirmed", (depositId, updatedBalance) => {
console.log(`Deposit ${depositId} confirmed. New balance: ${updatedBalance}`);
});
```
The event name to listen for (e.g., "transfer:claimed", "deposit:confirmed")
The callback function to execute when the event is emitted
The SparkWallet instance for method chaining
once(event, listener)
Adds a one-time listener for the specified event.
```typescript theme={null}
// Listen for a single incoming transfer
wallet.once("transfer:claimed", (transferId, updatedBalance) => {
console.log(`Transfer ${transferId} claimed! New balance: ${updatedBalance}`);
});
```
The event name to listen for
The callback function to execute when the event is emitted
The SparkWallet instance for method chaining
off(event, listener)
Removes the specified listener from the specified event.
```typescript theme={null}
// Remove a specific listener
const handleTransfer = (transferId) => console.log(`Transfer: ${transferId}`);
wallet.on("transfer:claimed", handleTransfer);
// Later, remove the listener
wallet.off("transfer:claimed", handleTransfer);
```
The event name to remove the listener from
The specific callback function to remove
The SparkWallet instance for method chaining
***
## Available Events
Spark wallets emit various events for different types of activity:
#### Available Events
| Event | Description |
| --------------------- | --------------------------------------------------- |
| `transfer:claimed` | Emitted when an **incoming** transfer is claimed |
| `deposit:confirmed` | Emitted when a pending L1 deposit becomes spendable |
| `stream:connected` | Emitted when the event stream connects |
| `stream:disconnected` | Emitted when the stream disconnects |
| `stream:reconnecting` | Emitted when attempting to reconnect |
Events only fire for **incoming** funds. For outgoing operations (Lightning sends, withdrawals), poll the status using `getLightningSendRequest()` or `getCoopExitRequest()`.
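That polling loop can be factored into a small generic helper. A sketch (our own `pollUntil` utility, not part of the SDK; the commented usage assumes a `requestId` obtained when initiating the send):

```typescript theme={null}
// Call `fetchStatus` until `isDone` approves the result or attempts
// run out, waiting `intervalMs` between calls. Returns the last value.
async function pollUntil<T>(
  fetchStatus: () => Promise<T>,
  isDone: (value: T) => boolean,
  intervalMs = 2000,
  maxAttempts = 30,
): Promise<T> {
  let last = await fetchStatus();
  for (let attempt = 1; attempt < maxAttempts && !isDone(last); attempt++) {
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
    last = await fetchStatus();
  }
  return last;
}

// Usage sketch: wait for an outgoing Lightning send to finish.
// const request = await pollUntil(
//   () => wallet.getLightningSendRequest(requestId),
//   (r) => r.status === "TRANSFER_COMPLETED",
// );
```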
***
## Use Sparkscan Explorer
Monitor your wallet activity using the Sparkscan block explorer for a visual interface.
Sparkscan provides a web interface to view your wallet's transaction history, balance, and activity without needing to implement the API calls yourself.
***
## Example: Complete Balance Monitoring
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
async function setupBalanceMonitoring() {
const { wallet } = await SparkWallet.initialize({
options: { network: "REGTEST" }
});
// Get initial balance
const balance = await wallet.getBalance();
console.log("Initial balance:", balance.balance, "sats");
// Set up event listeners
wallet.on("transfer:claimed", (transferId, newBalance) => {
console.log(`Transfer ${transferId} claimed. New balance: ${newBalance} sats`);
});
// Get recent transfers
const transfers = await wallet.getTransfers(10);
console.log("Recent transfers:", transfers.transfers);
return wallet;
}
```
# Create a Wallet
Source: https://docs.spark.money/wallets/create-wallet
Create and initialize Spark wallets with full key control.
***
## Initialize Wallet
The `initialize` method is the primary way to create or restore a Spark wallet. Leave `mnemonicOrSeed` blank to generate a new wallet, or provide an existing mnemonic seed phrase to import an existing wallet.
```typescript Create Wallet theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
// Create a new wallet
const { wallet, mnemonic } = await SparkWallet.initialize({
options: {
network: "REGTEST" // or "MAINNET"
}
});
console.log("New wallet created!");
console.log("Mnemonic:", mnemonic);
console.log("Address:", await wallet.getSparkAddress());
```
```typescript Import Wallet theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
// Restore wallet from existing mnemonic
const { wallet } = await SparkWallet.initialize({
mnemonicOrSeed: "abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon about",
accountNumber: 0, // Optional: specify account index
options: {
network: "REGTEST" // or "MAINNET"
}
});
console.log("Wallet restored from mnemonic!");
console.log("Address:", await wallet.getSparkAddress());
```
BIP-39 mnemonic phrase or raw seed. Leave blank to generate a new wallet.
Account index for generating multiple identity keys from the same mnemonic (default: 0 on REGTEST, 1 on MAINNET). **Important:** Always specify this explicitly if you use the same mnemonic across networks.
Custom signer implementation for advanced use cases
Wallet configuration options including network selection
The initialized SparkWallet instance
The 12-word mnemonic seed phrase for wallet recovery (undefined for raw seed)
***
## Essential Wallet Operations
After creating your wallet, you can perform these essential operations
getIdentityPublicKey()
Gets the identity public key of the wallet
```typescript theme={null}
const identityKey = await wallet.getIdentityPublicKey();
console.log("Identity Public Key:", identityKey);
```
The identity public key as a hex string
getSparkAddress()
Gets the Spark address of the wallet
```typescript theme={null}
const sparkAddress = await wallet.getSparkAddress();
console.log("Spark Address:", sparkAddress);
```
The Spark address for receiving Bitcoin and tokens
getBalance()
Gets the current balance of the wallet, including Bitcoin and token balances.
```typescript theme={null}
const balance = await wallet.getBalance();
console.log("Balance:", balance.balance, "sats");
console.log("Token Balances:", balance.tokenBalances);
```
The wallet's current balance in satoshis
Map of Bech32m token identifiers to token balance and metadata. `UserTokenMetadata` includes `extraMetadata?: Uint8Array` for arbitrary issuer-defined bytes.
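If an issuer happens to store UTF-8 text in `extraMetadata`, the bytes can be decoded with the standard `TextDecoder`. A sketch (the byte format is entirely issuer-defined, so treating it as text is an assumption):

```typescript theme={null}
// extraMetadata is opaque bytes; here we assume UTF-8 text purely
// for illustration. Issuers may store any binary format.
function decodeExtraMetadata(extra?: Uint8Array): string | undefined {
  if (!extra || extra.length === 0) return undefined;
  return new TextDecoder().decode(extra);
}
```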
cleanupConnections()
Properly closes all network connections and cleans up resources when you're done using the wallet.
```typescript theme={null}
await wallet.cleanupConnections();
console.log("Wallet connections cleaned up");
```
No return value - cleans up connections and aborts active streams
***
## Network Configuration
Spark supports both mainnet and regtest networks:
```typescript Mainnet theme={null}
const { wallet } = await SparkWallet.initialize({
options: {
network: "MAINNET"
}
});
// Use for production applications
```
```typescript Regtest theme={null}
const { wallet } = await SparkWallet.initialize({
options: {
network: "REGTEST"
}
});
// Use for development and testing
```
Always use REGTEST for development and testing. Only use MAINNET for production applications with real Bitcoin.
**Account number defaults differ by network.** REGTEST defaults to account 0, MAINNET defaults to account 1. If you test on REGTEST without specifying `accountNumber`, then deploy to MAINNET with the same mnemonic, your wallet will be empty because funds are on a different account. Always explicitly set `accountNumber` for consistent behavior.
***
## Account Derivation
You can create multiple accounts from the same mnemonic by specifying different account numbers:
```typescript theme={null}
// Account 0 (default)
const { wallet: wallet0 } = await SparkWallet.initialize({
mnemonicOrSeed: "your mnemonic here",
accountNumber: 0
});
// Account 1
const { wallet: wallet1 } = await SparkWallet.initialize({
mnemonicOrSeed: "your mnemonic here",
accountNumber: 1
});
// Each account will have different addresses
console.log("Account 0:", await wallet0.getSparkAddress());
console.log("Account 1:", await wallet1.getSparkAddress());
```
***
## API Reference
```typescript theme={null}
SparkWallet.initialize({
mnemonicOrSeed?: string, // Optional: existing mnemonic or seed
accountNumber?: number, // Optional: account index (default: 0 on REGTEST, 1 on MAINNET)
signer?: Signer, // Optional: custom signer implementation
options: {
network: "MAINNET" | "REGTEST"
}
})
// Returns: { wallet: SparkWallet, mnemonic?: string }
```
**Wallet Methods:**
```typescript theme={null}
// Initialize wallet with mnemonic or seed
initialize({ mnemonicOrSeed, signer, options })
// Get the identity public key
getIdentityPublicKey()
// Get the Spark address
getSparkAddress()
// Clean up connections
cleanupConnections()
```
# Deposit from L1
Source: https://docs.spark.money/wallets/deposit-from-l1
Deposit Bitcoin from L1 using reusable static addresses.
***
## Deposit Flow
The complete process for depositing Bitcoin from Layer 1 into your Spark wallet:
Create a static deposit address that can be reused for multiple deposits.
```typescript theme={null}
const staticAddress = await wallet.getStaticDepositAddress();
console.log("Static Deposit Address:", staticAddress);
```
Send Bitcoin from any wallet to your deposit address.
```typescript theme={null}
// For mainnet: Send real Bitcoin to the address
// For regtest: Use the faucet
console.log("Send Bitcoin to:", staticAddress);
```
Wait for the transaction to be confirmed on the blockchain.
```typescript theme={null}
// Monitor using a block explorer or your infrastructure
// You need to monitor the address for new transactions
```
Claim the deposit once it has 3 confirmations.
```typescript theme={null}
const quote = await wallet.getClaimStaticDepositQuote(txId);
const claimResult = await wallet.claimStaticDeposit({
transactionId: txId,
creditAmountSats: quote.creditAmountSats,
sspSignature: quote.signature
});
```
***
## Generate Static Deposit Address
For Bitcoin deposits on L1, Spark generates P2TR addresses. These addresses start with `bc1p` for mainnet and can be used to receive Bitcoin from any wallet.
Static deposit addresses are reusable, allowing the same address to receive multiple deposits. This approach is user-friendly, minimizes operational overhead, and is ideal for production applications.
Currently, Spark supports one static deposit address per wallet. Creating a second static address will return your existing address instead of generating a new one.
```typescript theme={null}
const staticDepositAddress = await wallet.getStaticDepositAddress();
console.log("Static Deposit Address:", staticDepositAddress);
// This address can be reused for multiple deposits
```
**Mainnet Address Example**
`bc1p5d7rjq7g6rdk2yhzks9smtbqtedr4dekq08ge8ztwac72sfr9rusxg3297`
***
## Deposit Bitcoin
#### Mainnet Deposits
To deposit Bitcoin on the mainnet, send funds to your static deposit address.
#### Regtest Deposits
For testing purposes on the Regtest network, use the faucet to fund your Spark wallet without using real Bitcoin.
## Monitor for Deposit Transactions
After sending Bitcoin to your deposit address, you'll need to monitor for incoming transactions using a blockchain explorer or your own infrastructure.
```typescript theme={null}
const staticAddress = await wallet.getStaticDepositAddress();
// Example: Monitor for new transactions using a block explorer API
// const newTransactions = await yourBlockchainMonitor.checkAddress(staticAddress);
```
Since static addresses can receive multiple deposits, you need to actively monitor the address for new transactions.
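Because the address is reusable, your monitor also needs to distinguish new deposits from ones already processed. One simple approach is tracking seen transaction IDs (a sketch; `yourBlockchainMonitor` and the txid list format are assumptions about your own infrastructure):

```typescript theme={null}
// Track which deposit txids have already been handled so repeated
// polls of the same static address don't process a deposit twice.
const seenTxids = new Set<string>();

function filterNewDeposits(txids: string[]): string[] {
  const fresh = txids.filter((txid) => !seenTxids.has(txid));
  fresh.forEach((txid) => seenTxids.add(txid));
  return fresh;
}

// Usage sketch inside your polling loop:
// const txids = await yourBlockchainMonitor.checkAddress(staticAddress);
// for (const txid of filterNewDeposits(txids)) {
//   const quote = await wallet.getClaimStaticDepositQuote(txid);
//   // ...claim once the transaction has 3 confirmations
// }
```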
***
## Claiming Deposits
Once a deposit is found on the blockchain, you can claim it by providing the transaction ID.
```typescript theme={null}
// Step 1: Get a quote for your deposit (can be called anytime after transaction)
const quote = await wallet.getClaimStaticDepositQuote(txId);
console.log("Quote:", quote);
// Step 2: Claim the deposit using the quote details
const claimResult = await wallet.claimStaticDeposit({
transactionId: txId,
creditAmountSats: quote.creditAmountSats,
sspSignature: quote.signature
});
console.log("Claim successful:", claimResult);
```
You can call `getClaimStaticDepositQuote` anytime after the deposit transaction is made, but `claimStaticDeposit` will only succeed after the deposit transaction has 3 confirmations.
***
## Refunding Static Deposits
If you need to recover funds from a static deposit address without bringing them into Spark, you can refund the deposit back to a Bitcoin address:
```typescript theme={null}
// Refund a static deposit
const refundTxHex = await wallet.refundStaticDeposit({
depositTransactionId: txId,
destinationAddress: "bc1p...",
satsPerVbyteFee: 10
});
// You'll need to broadcast this transaction yourself
console.log("Refund transaction hex:", refundTxHex);
// Or use refundAndBroadcastStaticDeposit to broadcast automatically
const txid = await wallet.refundAndBroadcastStaticDeposit({
depositTransactionId: txId,
destinationAddress: "bc1p...",
satsPerVbyteFee: 10
});
```
**Use cases for refunding:**
* You don't like the quote from `getClaimStaticDepositQuote`
* You prefer to avoid the double-fee scenario (claim fee + cooperative exit fee)
* You want to send funds to a different address without using Spark
## Confirmation Requirements
* Deposits require **3 confirmations** on L1
* Funds will be available in your Spark wallet after claiming
* Static deposits require manual claiming after confirmation
## Minimum Deposit Amount
The minimum deposit must exceed the dust limit plus fees:
* **Dust limit:** \~400 sats
* **Claim fee:** \~99 sats (varies with network conditions)
Deposits below \~500 sats may fail with "utxo amount minus fees is less than the dust amount".
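As a quick pre-flight check before sending a deposit, you can verify the amount clears the dust limit plus an expected claim fee. A sketch using the approximate figures above (the real fee varies with network conditions, so treat this as a sanity check only):

```typescript theme={null}
// Approximate figures from above; the actual claim fee varies
// with network conditions.
const DUST_LIMIT_SATS = 400;
const APPROX_CLAIM_FEE_SATS = 99;

function clearsDustAfterFees(
  depositSats: number,
  claimFeeSats = APPROX_CLAIM_FEE_SATS,
): boolean {
  return depositSats - claimFeeSats > DUST_LIMIT_SATS;
}
```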
**Important:** Static deposits do NOT auto-claim when your wallet is offline. If a deposit confirms while your wallet instance is not running, you must manually claim it using `claimStaticDeposit()` with the transaction ID when you next initialize the wallet.
# Deposit from Lightning
Source: https://docs.spark.money/wallets/deposit-from-lightning
Receive Bitcoin via Lightning invoices with instant settlement into Spark.
***
## Understanding Lightning Invoices
To send and receive Lightning payments, you can generate and pay Lightning invoices. A Lightning invoice (also called a payment request) is a specially formatted string that contains all the information needed to make a Lightning Network payment:
* **Amount**: How many satoshis to send (can be omitted for zero-amount invoices)
* **Destination**: The recipient's node public key
* **Payment Hash**: A unique identifier for the payment
* **Description**: Optional memo describing the payment
* **Expiry**: How long the invoice is valid for (default 24 hours)
Lightning invoices start with "ln" followed by the network identifier (bc for mainnet) and typically look like this: `lnbc1...`
**Example invoice** (this one is a regtest invoice, prefix `lnbcrt`):
```
lnbcrt2500n1pj0ytfcpp5qqqsyqcyq5rqwzqfqypqhp58yjmdan79s6qqdhdzgynm4zwqdx40shp5jqp3qymd6qgpy99ppk0jqjzylqg5t7fhqhpl6s4kxmqgmrn59w5k0z0cqqqqqqzqqqqq9qsqqqqqq9qqqqqqgq9qsqxl9l55y5cwa9s2h8nvdh4h7h43tcwjdcysf7v0fprz5uh6vshs4n0tvhgzz2xgcqpg8yqv7
```
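Since the prefix encodes the network, a small helper can classify an invoice before displaying it. The helper below is our own sketch (the prefixes follow the BOLT11 spec; use a full BOLT11 decoder for real validation):

```typescript
// Map a BOLT11 human-readable prefix to its network.
// Per BOLT11: "lnbc" = mainnet, "lntb" = testnet, "lnbcrt" = regtest.
function invoiceNetwork(
  invoice: string
): "mainnet" | "testnet" | "regtest" | "unknown" {
  const s = invoice.toLowerCase();
  if (s.startsWith("lnbcrt")) return "regtest"; // check the longest prefix first
  if (s.startsWith("lntb")) return "testnet";
  if (s.startsWith("lnbc")) return "mainnet";
  return "unknown";
}
```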
***
## Lightning Deposit Flow
The complete process for receiving Lightning payments into your Spark wallet:
**Step 1: Create the invoice.** Generate a Lightning invoice with the desired amount and description.
```typescript theme={null}
const invoice = await wallet.createLightningInvoice({
  amountSats: 50000,
  memo: "Deposit to Spark wallet"
});
```
**Step 2: Share the invoice.** Provide the invoice to the sender (via QR code, link, or text).
```typescript theme={null}
// Display invoice for user to share
console.log("Share this invoice:", invoice.invoice.encodedInvoice);
console.log("Or share this ID:", invoice.id);
```
**Step 3: Monitor the payment.** Track the payment status until completion.
```typescript theme={null}
// Poll for payment status
const checkPayment = async (invoiceId) => {
  const request = await wallet.getLightningReceiveRequest(invoiceId);

  if (request.status === "TRANSFER_COMPLETED") {
    console.log("Payment received:", request.amountReceived, "sats");
    return true;
  } else if (request.status === "expired") {
    console.log("Invoice expired");
    return false;
  } else {
    console.log("Payment pending...");
    return false;
  }
};
```
***
## Create Lightning Invoice
Generate Lightning invoices to receive Bitcoin payments that will be deposited into your Spark wallet.
### `createLightningInvoice(params)`
Creates a Lightning invoice for receiving Bitcoin payments.
```typescript theme={null}
const invoice = await wallet.createLightningInvoice({
  amountSats: 10000, // Amount in satoshis
  memo: "Payment for services", // Optional description
  includeSparkAddress: true // Optional: embed Spark address
});
console.log("Lightning invoice:", invoice.invoice.encodedInvoice);
console.log("Invoice ID:", invoice.id);
```
**Parameters:**

* `amountSats`: The amount in satoshis to request
* `memo` *(optional)*: Memo/description for the invoice (max 120 bytes). Cannot be used with `descriptionHash`.
* `includeSparkAddress` *(optional)*: Whether to embed a Spark address in the invoice
* `receiverIdentityPubkey` *(optional)*: 33-byte compressed identity pubkey for other Spark users
* `descriptionHash` *(optional)*: Hash of a longer description (`h` tag). Used in LNURL/UMA. Cannot be used with `memo`.

**Returns:**

* `id`: Unique identifier for the invoice (e.g., `SparkLightningReceiveRequest:...`)
* `invoice`: Lightning invoice object containing:
  * `encodedInvoice`: The BOLT11-encoded string (e.g., `lnbc1...`)
  * `bitcoinNetwork`: Network identifier (`MAINNET` or `REGTEST`)
  * `paymentHash`: Unique payment identifier
  * `amount`: Object with `originalValue` (millisatoshis) and `originalUnit`
  * `createdAt`: Invoice creation timestamp
  * `expiresAt`: Invoice expiration timestamp
  * `memo`: Base64-encoded memo string
* `status`: Invoice status (e.g., `INVOICE_CREATED`, `TRANSFER_COMPLETED`)
***
## Embedding Spark Addresses
When you pass `true` for `includeSparkAddress`, a 36-byte payload consisting of a recognizable header and the receiver's compressed identity public key (`SPK:identitypubkey`) is embedded in the fallback address (`f`) field of the BOLT11 invoice:
```typescript theme={null}
const invoice = await wallet.createLightningInvoice({
  amountSats: 100,
  memo: "Invoice with Spark address",
  includeSparkAddress: true,
});
console.log("Invoice with Spark address:", invoice);
```
### Fallback Address Format
The embedded Spark address in the fallback field will look something like this:
```javascript theme={null}
{
  version: 31,
  address_hash: "53504b0250949ec35b022e3895fd37750102f94fe813523fa220108328a81790bf67ade5"
}
```
***
## Creating Invoices for Other Spark Users
To generate an invoice for another Spark user, pass in the 33-byte compressed identity pubkey as a string to `receiverIdentityPubkey`:
```typescript theme={null}
const invoice = await wallet.createLightningInvoice({
  amountSats: 100,
  memo: "Invoice for another user",
  receiverIdentityPubkey: "023e33e2920326f64ea31058d44777442d97d7d5cbfcf54e3060bc1695e5261c93",
});
console.log("Invoice for another user:", invoice);
```
If a wallet generates an invoice for itself, it does not need to pass `receiverIdentityPubkey` to embed its own Spark identity; the backend handles that automatically. Passing your own pubkey explicitly is harmless.
***
## Zero-Amount Invoices
Spark supports creating zero-amount Lightning invoices. Zero-amount invoices don't have a fixed amount and allow the sender to specify the amount when making the payment.
Zero-amount invoices are not widely supported across the Lightning Network. Some exchanges, such as Binance, currently do not support them.
### Creating Zero-Amount Invoices
To create a zero-amount invoice, pass `0` for the `amountSats` parameter:
```typescript theme={null}
// Create a zero-amount invoice
const zeroAmountInvoice = await wallet.createLightningInvoice({
  amountSats: 0, // Creates a zero-amount invoice
  memo: "Pay any amount you want",
});
console.log("Zero-amount Invoice:", zeroAmountInvoice);
```
***
## Monitor Lightning Payments
Track incoming Lightning payments and their status using receive request monitoring.
### `getLightningReceiveRequest(id)`
Gets the status of a Lightning receive request by ID.
```typescript theme={null}
// Monitor a specific invoice
const receiveRequest = await wallet.getLightningReceiveRequest(invoiceId);
console.log("Payment status:", receiveRequest.status);
console.log("Amount received:", receiveRequest.amountReceived);

// Check if payment is complete
if (receiveRequest.status === "TRANSFER_COMPLETED") {
  console.log("Payment received successfully!");
}
```
**Parameters:**

* `id`: The ID of the Lightning receive request

**Returns:** The receive request details including status and amount
***
## Real-time Payment Monitoring
Use event listeners to monitor Lightning payments in real-time.
```typescript theme={null}
// Listen for transfer events (Lightning payments trigger transfer events)
wallet.on("transfer:claimed", (transferId, updatedBalance) => {
  console.log(`Transfer ${transferId} claimed. New balance: ${updatedBalance} sats`);
});
```
***
## Error Handling
Implement proper error handling for Lightning invoice operations.
```typescript theme={null}
async function createInvoiceSafely(params) {
  try {
    // Validate amount (0 is allowed: it creates a zero-amount invoice)
    if (params.amountSats < 0) {
      throw new Error("Amount must not be negative");
    }

    // Validate expiry time
    if (params.expirySeconds && params.expirySeconds < 60) {
      throw new Error("Expiry time must be at least 60 seconds");
    }

    // Create invoice
    const invoice = await wallet.createLightningInvoice(params);
    console.log("Invoice created successfully:", invoice.id);
    return invoice;
  } catch (error) {
    console.error("Failed to create invoice:", error.message);

    // Handle specific error types
    if (error.message.includes("Amount")) {
      console.log("Please check the amount value");
    } else if (error.message.includes("Expiry")) {
      console.log("Please check the expiry time");
    } else if (error.message.includes("Network")) {
      console.log("Network error. Please try again.");
    }
    throw error;
  }
}
```
***
## Checking Balance
You can use `getBalance()` to check a wallet balance after receiving payments.
The `getBalance()` method returns a Promise resolving to an object containing:
* `balance`: A `bigint` representing the total amount in satoshis
* `tokenBalances`: A Map of token balances, where each entry contains:
* `balance`: A `bigint` representing the token amount
* `tokenInfo`: Information about the specific token the wallet is holding
```typescript theme={null}
const balanceInfo = await wallet.getBalance();
console.log("Balance:", balanceInfo.balance, "sats");
```
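To display token holdings, you can flatten the `tokenBalances` Map. A sketch (the `summarizeTokenBalances` helper is hypothetical; it assumes Map entries shaped as described above, keyed by token identifier):

```typescript
// Hypothetical formatter: turn a tokenBalances Map into printable lines.
// Each entry carries a bigint balance (plus tokenInfo, unused here).
function summarizeTokenBalances(
  tokenBalances: Map<string, { balance: bigint }>
): string[] {
  return [...tokenBalances].map(
    ([tokenIdentifier, entry]) => `Token ${tokenIdentifier}: ${entry.balance}`
  );
}
```

For example: `summarizeTokenBalances((await wallet.getBalance()).tokenBalances)`.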
Additionally, you can listen for balance update events:
```typescript theme={null}
wallet.on("transfer:claimed", (transferId: string, balance: bigint) => {
  console.log(`Transfer ${transferId} claimed. New balance: ${balance}`);
});
```
***
## Example: Complete Lightning Deposit Flow
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";

async function completeLightningDeposit() {
  const { wallet } = await SparkWallet.initialize({
    options: { network: "REGTEST" }
  });

  try {
    // 1. Set up event listeners
    wallet.on("transfer:claimed", (transferId, updatedBalance) => {
      console.log(`Transfer ${transferId} claimed. New balance: ${updatedBalance} sats`);
    });

    // 2. Create Lightning invoice
    const invoice = await wallet.createLightningInvoice({
      amountSats: 10000,
      memo: "Lightning deposit to Spark wallet"
    });
    console.log("Invoice created:", invoice.invoice.encodedInvoice);
    console.log("Invoice ID:", invoice.id);

    // 3. Monitor payment status
    const monitorPayment = async () => {
      const request = await wallet.getLightningReceiveRequest(invoice.id);
      switch (request.status) {
        case "TRANSFER_COMPLETED":
          console.log("Payment completed!");
          return true;
        case "expired":
          console.log("Invoice expired");
          return false;
        default:
          console.log("Payment pending...");
          setTimeout(monitorPayment, 5000); // Check again in 5 seconds
          return false;
      }
    };

    // Start monitoring
    await monitorPayment();
  } catch (error) {
    console.error("Lightning deposit failed:", error);
  }
}
```
# Estimate Fees
Source: https://docs.spark.money/wallets/estimate-fees
Fee breakdown for Lightning payments, cooperative exits, and swaps.
***
## Fee Structure Overview
Below is a breakdown of fees for different transaction types on Spark:
**For Bitcoin**

| Transaction Type | Fee structure |
| ------------------ | ------------------------------------------------ |
| L1 to Spark | On-chain fee paid by the user |
| Spark to Spark | Free. Small flat fee coming in 6-12 months |
| Spark to Lightning | 0.25% + routing fee |
| Lightning to Spark | 0.15% (charged on routing nodes via route hints) |
| Exit to L1 | L1 fee + SSP fee (see formula below) |
| Unilateral Exit | On-chain fee paid by the user |

**For BTKN assets**

| Transaction Type | Fee structure |
| ---------------- | ------------------------------------------ |
| L1 to Spark | On-chain fee paid by the user |
| Spark to Spark | Free. Small flat fee coming in 6-12 months |
| Unilateral Exit | On-chain fee + bond locked by user |
Some of the fees are sourced directly from the Lightspark SSP, specifically for Spark–Lightning interactions and exits back to L1. Lightspark is the first SSP on Spark, but the system is open. Anyone can run an SSP (SSP specs coming soon). If you're an SSP, reach out, and we'll include your fee structure.
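The percentage rates above (0.25% Spark to Lightning, 0.15% Lightning to Spark) translate to sats as follows. A sketch (the helper and its round-up behavior are our assumptions; actual fees come from the fee-estimate methods below, and Lightning routing fees come on top):

```typescript
// Illustrative: percentage-based SSP fee, rounded up to a whole sat.
// Rounding mode is our assumption, not confirmed SDK behavior.
function percentageFeeSats(amountSats: number, ratePercent: number): number {
  return Math.ceil((amountSats * ratePercent) / 100);
}
```

For example, `percentageFeeSats(100_000, 0.25)` gives the 0.25% component of a 100k-sat Spark-to-Lightning send.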
***
## Fee Estimation Flow
The complete process for estimating fees before making transactions:
**Step 1: Choose the operation.** Determine which type of operation you want to estimate fees for.
```typescript theme={null}
// Lightning payment
const lightningInvoice = "lnbc...";
// Cooperative exit (withdrawal to L1)
const withdrawalAmount = 10000;
const withdrawalAddress = "bc1p...";
// Swap operation
const swapAmount = 5000;
```
**Step 2: Request an estimate.** Call the appropriate fee estimation method.
```typescript theme={null}
// Lightning send fee estimate
const lightningFee = await wallet.getLightningSendFeeEstimate({
  encodedInvoice: lightningInvoice
});

// Cooperative exit fee estimate
const exitFee = await wallet.getWithdrawalFeeQuote({
  amountSats: withdrawalAmount,
  withdrawalAddress: withdrawalAddress
});

// Swap fee estimate
const swapFee = await wallet.getSwapFeeEstimate(swapAmount);
```
**Step 3: Review the estimate.** Examine the fee estimate response to understand costs.
```typescript theme={null}
console.log("Lightning fee estimate:", lightningFee, "sats");
console.log("Exit fee estimate:", exitFee);
console.log("Swap fee estimate:", swapFee);
```
**Step 4: Apply the estimate.** Use the fee estimate to set appropriate limits for your transaction.
```typescript theme={null}
// Use fee estimate for Lightning payment
const payment = await wallet.payLightningInvoice({
  invoice: lightningInvoice,
  maxFeeSats: lightningFee + 5 // Add buffer to fee estimate
});
```
***
## Lightning Send Fee Estimates
Estimate fees for sending Lightning payments before making the transaction.
### `getLightningSendFeeEstimate(params)`
Gets an estimated fee for sending a Lightning payment. Note: the actual fee assessed may be different from the fee estimate as it will be determined by the actual Lightning node routing.
```typescript theme={null}
const feeEstimateSats = await wallet.getLightningSendFeeEstimate({
  encodedInvoice: "lnbcrt1u1pnm7ammpp4v84f05tl0kzt6g95g056athdpp8f8azvg6d7epz74z562ymer9jqsp5nc50gazvp0e98u42jlu653rw0eutcl067nqq924hf89q4la4kd9sxq9z0rgqnp4qdnmwu8v22cvq9xsv2l05cn9rre7xlcgdtntxawf8m0zxq3qemgzqrzjqtr2vd60g57hu63rdqk87u3clac6jlfhej4kldrrjvfcw3mphcw8sqqqqrj0q7ew45qqqqqqqqqqqqqq9qcqzpgdq5w3jhxapqd9h8vmmfvdjs9qyyssqj7lf2w4m587g04n4t0ferdv0vnwftzca0xuc9yxycng78cnhrvmyw2mzaa8t76jskpypqnwqhp9xh0vnwxz90jytd34vrmhcngsnl8qplz7ylk"
});
console.log("Lightning send fee estimate:", feeEstimateSats, "sats");
```
**Parameters:**

* `encodedInvoice`: The BOLT11-encoded Lightning invoice

**Returns:** Estimated fee in satoshis
### Example Response
```typescript theme={null}
100 // fee estimate in satoshis
```
***
## Cooperative Exit Fee Estimates
Estimate fees for withdrawing funds back to the Bitcoin network (cooperative exit).
### `getWithdrawalFeeQuote(params)`
Gets a fee quote for a cooperative exit (on-chain withdrawal). The quote includes fee options for different exit speeds and has an expiry time; it must be passed to `withdraw` before it expires.
```typescript theme={null}
const exitFeeEstimate = await wallet.getWithdrawalFeeQuote({
  amountSats: 10000,
  withdrawalAddress: "bc1p5d7rjq7g6rdk2yhzks9smtbqtedr4dekq08ge8ztwac72sfr9rusxg3297"
});
console.log("Cooperative exit fee estimate:", exitFeeEstimate);
```
**Parameters:**

* `amountSats`: The amount in satoshis to withdraw
* `withdrawalAddress`: The Bitcoin address where the funds should be sent

**Returns:** A fee quote for the withdrawal, or `null` if not available
### Example Response
```typescript theme={null}
{
  feeEstimate: {
    originalValue: 1000,
    originalUnit: 'SATOSHI',
    preferredCurrencyUnit: 'USD',
    preferredCurrencyValueRounded: 0.50,
    preferredCurrencyValueApprox: 0.483
  }
}
```
***
## Swap Fee Estimates
Estimate fees for leaves swap operations. Leaves swaps are internal wallet optimization operations that consolidate or split your sats leaves.
### `getSwapFeeEstimate(amountSats)`
Gets the estimated fee for a swap of leaves.
```typescript theme={null}
const swapFeeEstimate = await wallet.getSwapFeeEstimate(5000);
console.log("Swap fee estimate:", swapFeeEstimate);
```
**Parameters:**

* `amountSats`: The amount of sats to swap

**Returns:** The estimated fee for the swap operation
***
## When Fees Apply
Understanding when different types of fees are charged:
**Lightning Payments:**
* Routing fees for Lightning Network transactions
* Fees vary based on network conditions and routing path
* Estimated fees may differ from actual fees charged
**Cooperative Exits (L1 Withdrawals):**
The fee consists of two components:
* **L1 broadcast fee**: `tx_vbytes × sats_per_vbyte`
* **SSP fee**: `111 × sats_per_vbyte × 2` (where 111 vbytes is the minimum transaction size)
**Total formula:** `sats_per_vbyte × (111 × 2 + tx_vbytes)`
The fee is flat and doesn't scale with withdrawal amount. For small withdrawals, the fee may represent a higher percentage of the amount.
Different speeds (fast, medium, slow) use different `sats_per_vbyte` estimates based on mempool conditions.
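The formula can be written out directly. A sketch (`coopExitFeeSats` is our own name; `txVbytes` must come from the actual transaction, and the authoritative number always comes from `getWithdrawalFeeQuote`):

```typescript
// Total cooperative exit fee: sats_per_vbyte × (111 × 2 + tx_vbytes),
// i.e. the SSP fee (111 vbytes × 2) plus the L1 broadcast fee.
function coopExitFeeSats(satsPerVbyte: number, txVbytes: number): number {
  const sspFee = 111 * satsPerVbyte * 2;
  const l1Fee = txVbytes * satsPerVbyte;
  return sspFee + l1Fee;
}
```

At 2 sats/vbyte and a 150-vbyte transaction, this works out to 2 × (222 + 150) = 744 sats.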
**Leaves Swap Operations:**
* Fees for consolidating or splitting sats leaves
* Swaps are automatic and run as needed by the wallet
* Used for internal wallet optimization
***
## Fee Deduction Options
For cooperative exits, you can choose how fees are handled:
### Deduct Fee from Withdrawal Amount
When `deductFeeFromWithdrawalAmount` is set to `true`:
```typescript theme={null}
const withdrawal = await wallet.withdraw({
  onchainAddress: "bc1p...",
  exitSpeed: "medium",
  amountSats: 10000, // This is the net amount you'll receive
  feeQuote: feeQuote,
  deductFeeFromWithdrawalAmount: true // Fee deducted from amount
});
// You receive: 10000 sats - fee
```
### Pay Fee Separately
When `deductFeeFromWithdrawalAmount` is set to `false`:
```typescript theme={null}
const withdrawal = await wallet.withdraw({
  onchainAddress: "bc1p...",
  exitSpeed: "medium",
  amountSats: 10000, // This is the full amount you'll receive
  feeQuote: feeQuote,
  deductFeeFromWithdrawalAmount: false // Fee paid separately
});
// You receive: 10000 sats (fee paid from wallet balance)
```
***
## Complete Example
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";

async function estimateAllFees() {
  const { wallet } = await SparkWallet.initialize({
    options: { network: "REGTEST" }
  });

  try {
    // 1. Lightning send fee estimate
    const lightningInvoice = "lnbcrt1u1pnm7ammpp4v84f05tl0kzt6g95g056athdpp8f8azvg6d7epz74z562ymer9jqsp5nc50gazvp0e98u42jlu653rw0eutcl067nqq924hf89q4la4kd9sxq9z0rgqnp4qdnmwu8v22cvq9xsv2l05cn9rre7xlcgdtntxawf8m0zxq3qemgzqrzjqtr2vd60g57hu63rdqk87u3clac6jlfhej4kldrrjvfcw3mphcw8sqqqqrj0q7ew45qqqqqqqqqqqqqq9qcqzpgdq5w3jhxapqd9h8vmmfvdjs9qyyssqj7lf2w4m587g04n4t0ferdv0vnwftzca0xuc9yxycng78cnhrvmyw2mzaa8t76jskpypqnwqhp9xh0vnwxz90jytd34vrmhcngsnl8qplz7ylk";
    const lightningFee = await wallet.getLightningSendFeeEstimate({
      encodedInvoice: lightningInvoice
    });
    console.log("Lightning fee estimate:", lightningFee, "sats");

    // 2. Cooperative exit fee estimate
    const exitFee = await wallet.getWithdrawalFeeQuote({
      amountSats: 10000,
      withdrawalAddress: "bc1p5d7rjq7g6rdk2yhzks9smtbqtedr4dekq08ge8ztwac72sfr9rusxg3297"
    });
    if (exitFee) {
      console.log("Exit fee estimate:", exitFee.fast, "sats (fast)");
      console.log("Exit fee estimate:", exitFee.medium, "sats (medium)");
      console.log("Exit fee estimate:", exitFee.slow, "sats (slow)");
    }

    // 3. Swap fee estimate
    const swapFee = await wallet.getSwapFeeEstimate(5000);
    console.log("Swap fee estimate:", swapFee);

    // 4. Use estimates for actual transactions
    const payment = await wallet.payLightningInvoice({
      invoice: lightningInvoice,
      maxFeeSats: lightningFee + 5 // Add 5 sats buffer
    });
    console.log("Lightning payment initiated with fee buffer");
  } catch (error) {
    console.error("Fee estimation failed:", error);
  }
}
```
# FAQ
Source: https://docs.spark.money/wallets/faq
### What languages does it currently support?
* **TypeScript** & **React Native** — Official Spark SDK
* **Rust, Swift, Kotlin, Python, Flutter, Go, C#, WebAssembly** — Built by the [Breez team](https://breez.technology/). Full documentation at [sdk-doc-spark.breez.technology](https://sdk-doc-spark.breez.technology/).
### What kinds of transfers can I do with Spark?
You can perform the following types of transfers with Spark:
* Deposit and withdraw funds to and from L1 (Bitcoin mainnet) to a Spark wallet.
* Make native Spark-to-Spark transfers (between two Spark addresses).
* Send and receive payments via the Lightning Network.
### Is Spark working on Regtest and Mainnet?
Yes, we support both. Mainnet is already live and ready for you to use today. Internally, we run a custom Regtest network; if you’d like access, reach out to us directly. We’re also planning to release a Signet environment that anyone can spin up locally, removing the need to rely on our Regtest setup.
### Is the SDK self-custodial?
Yes. The Spark Wallet SDK is fully self-custodial. Keys are generated on the user’s side. You have the flexibility to decide how you want to build your product, whether you prefer to manage custody of the funds or allow each user to own their own Spark wallet.
### Does it support MPC/multisig?
Not yet. It's coming soon.
### Are there any limits on payment size?
There are no hard limits on payment size. That said, Spark is still early, and we recommend starting with small amounts as the network matures.
The only practical constraints arise if you're moving large amounts in or out of Spark via Lightning, or exiting through a cooperative exit.
If you expect to move significant volume between Lightning and Spark, reach out to us. We're happy to work with you to make sure everything flows smoothly.
# Identity Key Derivation
Source: https://docs.spark.money/wallets/identity-key-derivation
Spark wallets use a hierarchical deterministic (HD) key derivation scheme purpose-built for the Spark protocol. This follows the general principles of BIP43, using a custom purpose field derived from the SHA256 hash of "spark" to define application-specific key derivations.
We recommend this derivation scheme so users can easily migrate between different wallet apps. You can implement a different scheme, but you will lose the ability to migrate to or from other Spark wallets.
***
## Derivation Scheme
### Base Derivation Path
```
m/8797555'/accountNumber'/keyType'
```
Where:
* **8797555'**: Custom purpose field (derived from last 3 bytes of SHA256("spark") = 0x863d73)
* **accountNumber'**: Account index (hardened derivation)
* **keyType'**: Specific key type (hardened derivation)
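You can reproduce the purpose field with Node's `crypto` module. A sketch that interprets the last 3 bytes of SHA256("spark") as a big-endian integer, which should give `0x863d73` (8797555) per the scheme above:

```typescript
import { createHash } from "crypto";

// Derive the purpose field: the last 3 bytes of SHA256("spark"),
// read as a big-endian integer (0x863d73 = 8797555 per the scheme above).
const hash = createHash("sha256").update("spark").digest();
const purpose = hash.readUIntBE(hash.length - 3, 3);
console.log(purpose.toString(16), purpose);
```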
### Key Types
Spark derives 5 key types from the master seed:
| Path | Key Type | Purpose |
| ------------------ | ------------------------- | ------------------------------------------------------ |
| `m/8797555'/n'/0'` | **Identity Key** | Primary wallet identifier and Spark Address generation |
| `m/8797555'/n'/1'` | **Signing HD Key** | Base key for leaf key derivation |
| `m/8797555'/n'/2'` | **Deposit Key** | Receiving L1 Bitcoin deposits |
| `m/8797555'/n'/3'` | **Static Deposit HD Key** | Reusable deposit addresses (SSP integration) |
| `m/8797555'/n'/4'` | **HTLC Preimage HD Key** | Lightning HTLC preimage generation |
**Path:** `m/8797555'/n'/0'`
Primary wallet identifier. Used for authentication, Spark Address generation, and message signing.
**Path:** `m/8797555'/n'/1'`
HD key used as the base for deriving leaf-specific signing keys. Each Spark leaf gets its own derived key.
**Path:** `m/8797555'/n'/2'`
Used for receiving L1 Bitcoin deposits into Spark. Single key, not HD-derived.
**Path:** `m/8797555'/n'/3'`
HD key for generating reusable static deposit addresses. Each index produces a different deposit address.
**Path:** `m/8797555'/n'/4'`
HD key for generating HTLC preimages in Lightning payments. The `htlcHMAC()` method uses this key.
***
## Account Number
### Default Behavior
The `accountNumber` parameter controls which account's keys are derived. **The default value differs by network:**
| Network | Default Account Number |
| ------- | ---------------------- |
| REGTEST | `0` |
| MAINNET | `1` |
MAINNET defaults to `1` for backwards compatibility with legacy wallets created before multi-account support. If you're integrating with the SDK and using account `0` internally, account for this off-by-one behavior on mainnet.
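If your integration tracks accounts zero-based, a small normalization helper avoids surprises. This is a hypothetical sketch reflecting the defaults above (verify the behavior against your SDK version):

```typescript
// Map a zero-based logical account index to the SDK's historical
// default numbering (MAINNET legacy wallets start at 1).
// The helper name and policy are ours, not part of the SDK.
function sdkAccountNumber(
  logicalIndex: number,
  network: "MAINNET" | "REGTEST"
): number {
  return network === "MAINNET" ? logicalIndex + 1 : logicalIndex;
}
```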
### Using Account Numbers
```typescript theme={null}
// Account 0 (first account)
const { wallet: account0 } = await SparkWallet.initialize({
  mnemonicOrSeed: mnemonic,
  accountNumber: 0,
  options: { network: "REGTEST" }
});

// Account 1 (second account, same mnemonic)
const { wallet: account1 } = await SparkWallet.initialize({
  mnemonicOrSeed: mnemonic,
  accountNumber: 1,
  options: { network: "REGTEST" }
});

// These are completely separate wallets
const address0 = await account0.getSparkAddress();
const address1 = await account1.getSparkAddress();
console.log(address0 !== address1); // true
```
### Explicit Account Numbers
When building wallet software, always specify the account number explicitly to avoid confusion:
```typescript theme={null}
// Explicit is better than implicit
const { wallet } = await SparkWallet.initialize({
  mnemonicOrSeed: mnemonic,
  accountNumber: 0, // Always specify
  options: { network: "MAINNET" }
});
```
***
## Leaf Key Derivation
When accepting transfers, Spark assigns a unique leaf ID to each UTXO. The signing key for that leaf is derived from the Signing HD Key using the leaf ID.
### Derivation Formula
```typescript theme={null}
// 1. Hash the leaf ID
const hash = sha256(leafId);
// 2. Extract first 4 bytes as unsigned 32-bit integer (big-endian)
const hashValue = hash.slice(0, 4).readUInt32BE();
// 3. Calculate derivation index (hardened)
const leafIndex = (hashValue % 0x80000000) + 0x80000000;
// 4. Derive child key
const leafKey = signingHDKey.deriveChild(leafIndex);
```
### Full Derivation Path
```
m/8797555'/{accountNumber}'/1'/{leafIndex}'
```
Where `leafIndex` is calculated from the leaf ID as shown above.
### Implementation
```typescript theme={null}
import { createHash } from 'crypto';

function getLeafDerivationPath(leafId: string, accountNumber: number): string {
  // Step 1: Calculate SHA256 hash of leaf ID
  const hash = createHash('sha256').update(leafId).digest();

  // Step 2: Convert first 4 bytes to number
  const hashValue = hash.readUInt32BE(0);

  // Step 3: Calculate hardened index
  const leafIndex = (hashValue % 0x80000000) + 0x80000000;

  // Step 4: Build derivation path
  return `m/8797555'/${accountNumber}'/1'/${leafIndex}'`;
}

// Example
const path = getLeafDerivationPath("leaf-abc-123", 0);
// Returns something like: m/8797555'/0'/1'/2147483747'
```
### Using KeyDerivation
In the SDK, you typically don't derive leaf keys manually. Instead, use the `KeyDerivation` type:
```typescript theme={null}
import { KeyDerivationType } from "@buildonspark/spark-sdk";
// The signer handles derivation internally
const publicKey = await signer.getPublicKeyFromDerivation({
  type: KeyDerivationType.LEAF,
  path: leafId // The leaf ID, not the derivation path
});
```
***
## Static Deposit Key Derivation
Static deposit keys are derived from the Static Deposit HD Key at path `/3'`:
```typescript theme={null}
// Derive static deposit key at index 5
const staticDepositPath = `m/8797555'/${accountNumber}'/3'/${index + 0x80000000}'`;
```
In the SDK:
```typescript theme={null}
// Get static deposit key at index 5
const publicKey = await signer.getStaticDepositSigningKey(5);
const privateKey = await signer.getStaticDepositSecretKey(5);
```
***
## Deposit Address Flow
For L1 Bitcoin deposits, the flow is:
1. **Generate Deposit Key**: Uses the Deposit Key at path `m/8797555'/n'/2'`
2. **Create Deposit Address**: Generate Bitcoin address from the key
3. **Receive Deposit**: User sends Bitcoin to the address
4. **Assign Leaf**: After tree creation, funds are assigned to a leaf with its own derived key
```typescript theme={null}
// Get the deposit signing key
const depositKey = await signer.getDepositSigningKey();
// The SDK handles address generation
const depositAddress = await wallet.getSingleUseDepositAddress();
```
***
## Usage in SparkWallet
### Initialize with Specific Account
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
const { wallet, mnemonic } = await SparkWallet.initialize({
  mnemonicOrSeed: "abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon about",
  accountNumber: 0,
  options: {
    network: "REGTEST",
  },
});
// Identity key is at m/8797555'/0'/0'
const identityKey = await wallet.getIdentityPublicKey();
console.log("Identity key:", identityKey);
```
### Multiple Accounts from Same Mnemonic
```typescript theme={null}
const mnemonic = "abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon abandon about";
// Create two separate accounts
const { wallet: wallet0 } = await SparkWallet.initialize({
  mnemonicOrSeed: mnemonic,
  accountNumber: 0,
  options: { network: "REGTEST" }
});

const { wallet: wallet1 } = await SparkWallet.initialize({
  mnemonicOrSeed: mnemonic,
  accountNumber: 1,
  options: { network: "REGTEST" }
});

// Each account has completely isolated keys
console.log(await wallet0.getSparkAddress()); // Different address
console.log(await wallet1.getSparkAddress()); // Different address
```
***
## Custom Derivation Paths
For non-standard use cases, you can provide a custom key generator:
```typescript theme={null}
import {
  DefaultSparkSigner,
  DerivationPathKeysGenerator
} from "@buildonspark/spark-sdk";

// Custom derivation template (? is replaced with account number)
const customGenerator = new DerivationPathKeysGenerator("m/44'/0'/?'/0'");
const signer = new DefaultSparkSigner({
  sparkKeysGenerator: customGenerator
});
// The keys will be derived from the custom path:
// Identity: m/44'/0'/n'/0'
// Signing HD: m/44'/0'/n'/0'/1'
// Deposit: m/44'/0'/n'/0'/2'
// Static Deposit: m/44'/0'/n'/0'/3'
// HTLC Preimage: m/44'/0'/n'/0'/4'
```
Custom derivation paths will not be compatible with other Spark wallets. Only use this if you have specific requirements that justify breaking compatibility.
***
## Migration Between Wallets
The standardized derivation scheme enables wallet migration:
### Export
```typescript theme={null}
// Save these securely
const walletBackup = {
  mnemonic: "your 12 or 24 word phrase",
  accountNumber: 0
};
```
### Import
```typescript theme={null}
const { wallet } = await SparkWallet.initialize({
  mnemonicOrSeed: walletBackup.mnemonic,
  accountNumber: walletBackup.accountNumber,
  options: { network: "MAINNET" }
});

// Same identity key, same Spark address
const address = await wallet.getSparkAddress();
```
***
## Security Considerations
### Hardened Derivation
All derivation paths use hardened derivation (indicated by the `'` suffix). This prevents:
* **Extended public key attacks**: Hardened child keys cannot be derived from the parent extended public key
* **Cross-key compromise**: A leaked child private key combined with the parent chain code cannot be used to recover the parent private key or sibling keys
### Key Isolation
Each account number creates a completely isolated set of keys:
```typescript theme={null}
// Account 0 and Account 1 share no key material
const account0Keys = deriveAccountKeys(mnemonic, 0);
const account1Keys = deriveAccountKeys(mnemonic, 1);
// No cryptographic relationship between them
assert(account0Keys.identity !== account1Keys.identity);
```
### Mnemonic Security
* Store mnemonic phrases securely (encrypted, offline if possible)
* Never transmit mnemonics over the network
* Consider using hardware wallets or secure enclaves for production
***
## Quick Reference
| Key Type | Path | SDK Method |
| -------------- | -------------------------------- | ----------------------------------------------------------------- |
| Identity | `m/8797555'/n'/0'` | `signer.getIdentityPublicKey()` |
| Signing HD | `m/8797555'/n'/1'` | Internal (leaf derivation base) |
| Deposit | `m/8797555'/n'/2'` | `signer.getDepositSigningKey()` |
| Static Deposit | `m/8797555'/n'/3'/i'` | `signer.getStaticDepositSigningKey(i)` |
| HTLC Preimage | `m/8797555'/n'/4'` | `signer.htlcHMAC(transferId)` |
| Leaf | `m/8797555'/n'/1'/hash(leafId)'` | `signer.getPublicKeyFromDerivation({type: "leaf", path: leafId})` |
# Overview
Source: https://docs.spark.money/wallets/overview
The Spark Wallet SDK lets you deploy Spark-native wallets in the most scalable and developer-friendly way possible. Whether you're building for your own custody or shipping self-custodial wallets for your users, the SDK is flexible by design.
***
## Installation
### Spark SDK
The official Spark SDK for TypeScript and React Native applications.
### Breez SDK
The [Breez](https://breez.technology) team has built a fully native SDK in Rust with bindings for Swift, Kotlin, Python, Flutter, Go, C#, and WebAssembly. If you need native mobile performance or a language not covered by the Spark SDK, use the Breez SDK.
***
## Fundamentals
***
## Tools
# Privacy Mode
Source: https://docs.spark.money/wallets/privacy
Hide your transaction history from public view
***
## Overview
Make your wallet fully private with a single call. By default, Spark transactions are visible from public endpoints. When you enable privacy mode, your transaction history becomes invisible.
Privacy mode currently applies to Bitcoin transactions only. Token transactions remain visible.
***
## Enable Privacy Mode
### `setPrivacyEnabled()`
Toggle privacy mode on or off for your wallet.
```typescript Enable Privacy theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
const { wallet } = await SparkWallet.initialize({
mnemonicOrSeed: "your mnemonic here",
options: { network: "MAINNET" }
});
// Enable privacy mode
await wallet.setPrivacyEnabled(true);
console.log("Privacy mode enabled. Your transactions are now hidden.");
```
```typescript Disable Privacy theme={null}
// Disable privacy mode (make transactions public again)
await wallet.setPrivacyEnabled(false);
console.log("Privacy mode disabled. Transactions are now publicly visible.");
```
`true` to hide transactions from public view, `false` to make them visible.
The updated wallet settings, including the new privacy state.
***
## Check Current Settings
getWalletSettings()
Query your wallet's current privacy configuration.
```typescript theme={null}
const settings = await wallet.getWalletSettings();
if (settings?.privateEnabled) {
console.log("Privacy mode is ON");
} else {
console.log("Privacy mode is OFF");
}
```
Whether privacy mode is currently enabled.
The wallet's identity public key.
***
## How It Works
When privacy mode is enabled:
1. Block explorers see nothing. Your address and transactions won't appear publicly.
2. Public APIs return empty. Third parties querying your address get no results.
3. You retain full access. Your wallet can still query all your transactions normally.
Privacy mode controls query visibility, not on-chain data. Your funds remain fully secure and self-custodial regardless of this setting.
***
## API Reference
```typescript theme={null}
// Enable or disable privacy mode
setPrivacyEnabled(privacyEnabled: boolean): Promise<WalletSettings>

// Query current wallet settings
getWalletSettings(): Promise<WalletSettings | undefined>
```
**WalletSettings Type:**
```typescript theme={null}
interface WalletSettings {
privateEnabled: boolean;
ownerIdentityPublicKey: string;
}
```
# React Native SDK
Source: https://docs.spark.money/wallets/react-native
React Native SDK for iOS and Android wallet apps. Requires Xcode/iOS Simulator or Android Studio.
Complete React Native example with proper polyfill setup
## Getting Started
To get started, follow the steps below.
Install the Spark SDK packages using your package manager of choice.
```bash npm theme={null}
npm install @buildonspark/spark-sdk
```
```bash yarn theme={null}
yarn add @buildonspark/spark-sdk
```
```bash pnpm theme={null}
pnpm add @buildonspark/spark-sdk
```
Install the required polyfills for React Native compatibility.
```bash npm theme={null}
npm install react-native-get-random-values @azure/core-asynciterator-polyfill buffer text-encoding
```
```bash yarn theme={null}
yarn add react-native-get-random-values @azure/core-asynciterator-polyfill buffer text-encoding
```
```bash pnpm theme={null}
pnpm add react-native-get-random-values @azure/core-asynciterator-polyfill buffer text-encoding
```
**Critical:** Import polyfills at the very top of your `index.js` file, BEFORE importing your app or any other modules. The crypto module will fail to load if this order is wrong.
```js index.js theme={null}
// These MUST be the first imports in your app entry point
import 'react-native-get-random-values';
import '@azure/core-asynciterator-polyfill';
import { Buffer } from 'buffer';
global.Buffer = Buffer;
// Now import your app
import { AppRegistry } from 'react-native';
import App from './App';
import { name as appName } from './app.json';
AppRegistry.registerComponent(appName, () => App);
```
For iOS, you must install the native module dependencies. This step is required for bare React Native apps.
```bash theme={null}
cd ios && pod install && cd ..
```
If you skip this step, you'll see errors like `Cannot read property 'decryptEcies' of null` when initializing the wallet.
Create a wallet instance that will be used to interact with the Spark network.
```tsx wallet.jsx theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
export const initializeWallet = async () => {
const { wallet, mnemonic } = await SparkWallet.initialize({
mnemonicOrSeed: "optional-mnemonic-or-seed",
accountNumber: 0, // optional account index (number)
options: {
network: "REGTEST" // or "MAINNET"
}
});
console.log("Wallet initialized successfully:", mnemonic);
return wallet;
};
```
You're ready to start building.
```tsx App.jsx theme={null}
import { useEffect, useState } from "react";
import { Button, Text, View } from "react-native";
import { SparkWallet } from "@buildonspark/spark-sdk";

export function WalletInfo() {
  const [wallet, setWallet] = useState(null);
  const [loading, setLoading] = useState(false);
  // Note: getSparkAddress() and getBalance() are async
  const [address, setAddress] = useState("");
  const [balance, setBalance] = useState(0n);

  const createWallet = async () => {
    setLoading(true);
    try {
      const { wallet, mnemonic } = await SparkWallet.initialize({
        options: { network: "REGTEST" }
      });
      setWallet(wallet);
      console.log("Mnemonic:", mnemonic);
    } catch (error) {
      console.error("Failed to create wallet:", error);
    } finally {
      setLoading(false);
    }
  };

  useEffect(() => {
    if (wallet) {
      wallet.getSparkAddress().then(setAddress);
      wallet.getBalance().then(b => setBalance(b.balance));
    }
  }, [wallet]);

  return (
    <View>
      <Text>Wallet Information</Text>
      {loading ? (
        <Text>Loading...</Text>
      ) : wallet ? (
        <View>
          <Text>Address: {address}</Text>
          <Text>Balance: {balance.toString()} sats</Text>
        </View>
      ) : (
        <Button title="Create Wallet" onPress={createWallet} />
      )}
    </View>
  );
}
```
### Initialize a Wallet
A wallet requires either a mnemonic or a raw seed for initialization; `initialize()` accepts both. If neither is provided, it auto-generates a mnemonic and returns it.
```tsx theme={null}
// Initialize a new wallet instance
const { wallet, mnemonic } = await SparkWallet.initialize({
mnemonicOrSeed: "optional-mnemonic-or-seed",
accountNumber: 0, // optional account index (number)
options: {
network: "REGTEST" // or "MAINNET"
}
});
console.log("Wallet initialized successfully:", mnemonic);
```
### Mnemonic Phrases
A mnemonic is a human-readable encoding of your wallet's seed. It's a 12- or 24-word phrase from the BIP-39 wordlist, used to derive the cryptographic keys that control your wallet.
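Before passing a user-entered phrase to `initialize()`, a quick client-side sanity check can catch obvious typos. The helpers below are hypothetical, not part of the SDK, and check word count only (they do not verify the BIP-39 checksum):

```typescript
// Hypothetical helpers, not part of the SDK: quick sanity checks on a
// user-entered phrase before handing it to SparkWallet.initialize().
function mnemonicWordCount(mnemonic: string): number {
  return mnemonic.trim().split(/\s+/).length;
}

// Per the description above, Spark mnemonics are 12 or 24 words.
// Length check only; the BIP-39 checksum is not validated here.
function looksLikeMnemonic(mnemonic: string): boolean {
  return [12, 24].includes(mnemonicWordCount(mnemonic));
}
```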
## Troubleshooting
### `Cannot read property 'decryptEcies' of null`
This error occurs when native crypto modules aren't loaded properly. Fix it by:
1. **Ensure polyfills are imported first** in your `index.js` - they must come before any other imports
2. **Run `pod install`** in your `ios/` directory for iOS builds
3. **Rebuild your app** completely (not just a hot reload)
```bash theme={null}
# iOS
cd ios && pod install && cd ..
npx react-native run-ios
# Android
npx react-native run-android
```
### Wallet initialization fails silently
Make sure you're awaiting the `initialize()` call and handling errors:
```tsx theme={null}
try {
const { wallet, mnemonic } = await SparkWallet.initialize({
options: { network: "MAINNET" }
});
} catch (error) {
console.error("Wallet init failed:", error);
}
```
## React Native Current Status
The React Native SDK is currently in **beta** with active development. We're shipping improvements weekly.
**Current Limitations:**
* Uses polling for updates instead of real-time streams
* Some edge cases may have rough handling
# Spark Invoices
Source: https://docs.spark.money/wallets/spark-invoices
Spark Invoices are native payment requests for the Spark network. Unlike Lightning invoices, Spark invoices support both Bitcoin (sats) and tokens, with optional amounts, expiry times, and signature verification.
***
## Overview
Spark Invoices provide:
* **Sats & Token Support** - Request payment in Bitcoin or any Spark token
* **Optional Amounts** - Create invoices with or without a preset amount
* **Expiration** - Set custom expiry times for invoices
* **Signature Verification** - Cryptographically signed by the receiver
* **Batch Payments** - Fulfill multiple invoices in a single call
Spark invoices are receiver-generated and receiver-signed: the invoice must be created by the receiving wallet so that its signature can be verified.
***
## Create a Sats Invoice
Request a Bitcoin payment using a Spark invoice.
createSatsInvoice(params)
Creates a Spark invoice for receiving Bitcoin.
```typescript theme={null}
const invoice = await wallet.createSatsInvoice({
amount: 10000, // Optional: amount in sats
memo: "Payment for coffee", // Optional: description
expiryTime: new Date(Date.now() + 3600 * 1000) // Optional: 1 hour from now
});
console.log("Spark Invoice:", invoice);
```
Amount in satoshis to request. If omitted, the payer specifies the amount.
Optional memo/description for the invoice (max 120 bytes).
Optional: restrict payment to a specific sender's Spark address.
Optional expiration time as a Date object (default: 30 days from now).
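The memo limit above is measured in bytes, not characters, so multi-byte UTF-8 characters use more of the budget. A hedged pre-flight check (helper names are hypothetical, assuming the limit counts UTF-8 bytes):

```typescript
// Hypothetical pre-flight check for the 120-byte memo limit.
// Assumes the limit counts UTF-8 bytes, not characters.
const MAX_MEMO_BYTES = 120;

function memoByteLength(memo: string): number {
  return new TextEncoder().encode(memo).length;
}

function memoFits(memo: string): boolean {
  return memoByteLength(memo) <= MAX_MEMO_BYTES;
}
```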
***
## Create a Token Invoice
Request a token payment using a Spark invoice.
createTokensInvoice(params)
Creates a Spark invoice for receiving tokens.
```typescript theme={null}
const invoice = await wallet.createTokensInvoice({
tokenIdentifier: "btkn1...", // Token identifier
amount: 1000n, // Optional: token amount
memo: "Token payment", // Optional: description
expiryTime: new Date(Date.now() + 3600 * 1000) // Optional: 1 hour expiry
});
console.log("Token Invoice:", invoice);
```
The Bech32m token identifier (e.g., `btkn1...`)
Token amount to request. If omitted, the payer specifies the amount.
Optional memo/description for the invoice (max 120 bytes).
Optional: restrict payment to a specific sender's Spark address.
Optional expiration time as a Date object (default: 30 days from now).
***
## Fulfill Spark Invoices
Pay one or more Spark invoices.
fulfillSparkInvoice(params)
Fulfills Spark invoices. Supports batch payments with a mix of sats and token invoices.
```typescript theme={null}
// Pay a single invoice
const result = await wallet.fulfillSparkInvoice([
{ invoice: "spark1..." }
]);
// Pay multiple invoices (batch)
const batchResult = await wallet.fulfillSparkInvoice([
{ invoice: "spark1..." }, // Sats invoice
{ invoice: "spark1..." }, // Token invoice
]);
// Pay invoice with no preset amount
const customAmount = await wallet.fulfillSparkInvoice([
{ invoice: "spark1...", amount: 5000n } // Specify amount for zero-amount invoice
]);
```
Array of objects with `invoice: SparkAddressFormat` and optional `amount: bigint` for zero-amount invoices
`fulfillSparkInvoice` can process multiple invoices in a single call, including a mix of sats and token invoices for different assets.
***
## Query Spark Invoices
Check the status of Spark invoices.
querySparkInvoices(invoices)
Query the status of one or more Spark invoices.
```typescript theme={null}
const invoiceStatus = await wallet.querySparkInvoices([
"spark1...",
"spark1..."
]);
for (const status of invoiceStatus) {
console.log("Invoice status:", status);
}
```
Array of raw Spark invoice strings to query
***
## Invoice Format
Spark invoices are Bech32m-encoded strings that contain:
* **Network identifier** - Derived from the HRP (human-readable prefix)
* **Receiver's identity public key** - Who will receive the payment
* **Payment type** - Sats or tokens (with token identifier)
* **Amount** - Optional preset amount
* **Memo** - Optional description
* **Expiry** - Optional expiration time
* **Signature** - BIP340 Schnorr signature from the receiver
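Because the network identifier comes from the HRP, you can peek at it without a full decoder: Bech32m reserves the last `1` in the string as the separator between the HRP and the data part. The helper below is illustrative only, not an SDK API:

```typescript
// Illustrative only — not an SDK function. Reads the human-readable
// prefix (network identifier) from a bech32m-encoded invoice string.
// Bech32m uses the LAST "1" as the HRP/data separator.
function invoiceNetworkPrefix(invoice: string): string {
  const sep = invoice.lastIndexOf("1");
  if (sep < 1) {
    throw new Error("not a bech32m-encoded string");
  }
  return invoice.slice(0, sep);
}
```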
***
## Use Cases
### Point of Sale
```typescript theme={null}
// Merchant creates invoice for specific amount
const invoice = await merchantWallet.createSatsInvoice({
amount: 25000,
memo: "Order #1234"
});
// Display QR code to customer
displayQR(invoice);
// Customer pays
await customerWallet.fulfillSparkInvoice([
{ invoice }
]);
```
### Donations (Variable Amount)
```typescript theme={null}
// Create invoice without preset amount
const donationInvoice = await wallet.createSatsInvoice({
memo: "Donation to Project X"
});
// Donor chooses amount when paying
await donorWallet.fulfillSparkInvoice([
{ invoice: donationInvoice, amount: 100000n }
]);
```
### Batch Payouts
```typescript theme={null}
// Pay multiple recipients in one call
await wallet.fulfillSparkInvoice([
{ invoice: employeeInvoice1 },
{ invoice: employeeInvoice2 },
{ invoice: employeeInvoice3 }
]);
```
# Spark Signer Interface
Source: https://docs.spark.money/wallets/spark-signer
The Spark SDK provides the `SparkSigner` interface to enable flexible implementation of signing operations. This abstraction allows you to customize how cryptographic operations are performed, enabling support for secure enclaves, hardware wallets, remote signing services, and other specialized key management systems.
The SDK includes `DefaultSparkSigner` which handles standard single-signature operations and stores nonces internally for security. For server-side enclave integrations, `UnsafeStatelessSparkSigner` is available.
***
## Core Concepts
### Key Types
Spark wallets derive 5 key types from a master seed using BIP32:
| Path | Key Type | Purpose |
| ------------------ | --------------------- | -------------------------------------------- |
| `m/8797555'/n'/0'` | Identity Key | Primary wallet identifier and authentication |
| `m/8797555'/n'/1'` | Signing HD Key | Base key for leaf key derivation |
| `m/8797555'/n'/2'` | Deposit Key | Receiving L1 Bitcoin deposits |
| `m/8797555'/n'/3'` | Static Deposit HD Key | Reusable deposit addresses |
| `m/8797555'/n'/4'` | HTLC Preimage HD Key | Lightning HTLC operations |
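The paths in the table share the shape `m/8797555'/n'/k'`, where `n` is the account number and `k` selects the key type. As an illustration of how such a path string is assembled (not SDK code):

```typescript
// Illustrative path builder for the table above — not an SDK API.
// All segments are hardened (the trailing apostrophes).
const SPARK_PURPOSE = 8797555;

// 0 = identity, 1 = signing HD, 2 = deposit,
// 3 = static deposit HD, 4 = HTLC preimage HD
function sparkKeyPath(accountNumber: number, keyType: number): string {
  return `m/${SPARK_PURPOSE}'/${accountNumber}'/${keyType}'`;
}
```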
### The KeyDerivation System
The signer uses a discriminated union type to specify how to derive or retrieve a private key for signing operations:
```typescript theme={null}
enum KeyDerivationType {
LEAF = "leaf", // Derive from signing HD key using sha256(path)
DEPOSIT = "deposit", // Use the deposit key directly
STATIC_DEPOSIT = "static_deposit", // Use static deposit key at index
ECIES = "ecies", // Decrypt private key from ciphertext
RANDOM = "random", // Generate a random key (for adaptor signatures)
}
type KeyDerivation =
| { type: KeyDerivationType.LEAF; path: string }
| { type: KeyDerivationType.DEPOSIT }
| { type: KeyDerivationType.RANDOM }
| { type: KeyDerivationType.STATIC_DEPOSIT; path: number }
| { type: KeyDerivationType.ECIES; path: Uint8Array };
```
This abstraction is used throughout the signer interface, particularly in `signFrost()` and `getPublicKeyFromDerivation()`.
### Security Model
* All private keys are derived from a master seed using BIP32 hierarchical deterministic key derivation
* Private keys never leave the signer. Only signatures and public keys are returned.
* `DefaultSparkSigner` stores nonces internally to prevent reuse attacks
* For enclave integrations, `UnsafeStatelessSparkSigner` exposes nonces externally
***
## Implementations
### DefaultSparkSigner
The recommended implementation for client-side applications. It stores signing nonces internally to prevent reuse attacks.
```typescript theme={null}
import { DefaultSparkSigner } from "@buildonspark/spark-sdk";
const signer = new DefaultSparkSigner();
const mnemonic = await signer.generateMnemonic();
const seed = await signer.mnemonicToSeed(mnemonic);
await signer.createSparkWalletFromSeed(seed, 0);
```
### UnsafeStatelessSparkSigner
For server-side enclave integrations where nonces need to be managed externally. This signer returns nonces in `getRandomSigningCommitment()` instead of storing them internally.
```typescript theme={null}
import { UnsafeStatelessSparkSigner } from "@buildonspark/spark-sdk";
// Only use in secure server environments
const signer = new UnsafeStatelessSparkSigner();
```
`UnsafeStatelessSparkSigner` exposes nonces externally. Only use this in secure server environments where you can properly protect nonce data.
### Custom Signer Implementation
You can extend `DefaultSparkSigner` to implement custom signing logic, such as forwarding requests to a secure enclave:
```typescript theme={null}
import { DefaultSparkSigner, SignFrostParams } from "@buildonspark/spark-sdk";
class EnclaveSigner extends DefaultSparkSigner {
private enclave: EnclaveClient;
constructor(enclave: EnclaveClient) {
super();
this.enclave = enclave;
}
async signFrost(params: SignFrostParams): Promise<Uint8Array> {
// Forward signing request to secure enclave
return this.enclave.signFrost(params);
}
async createSparkWalletFromSeed(
seed: Uint8Array | string,
accountNumber?: number
): Promise<string> {
// Initialize keys in enclave
return this.enclave.initializeWallet(seed, accountNumber);
}
}
// Use with SparkWallet
const { wallet } = await SparkWallet.initialize({
signer: new EnclaveSigner(myEnclave),
options: { network: "MAINNET" }
});
```
### Custom Key Derivation Paths
For non-standard derivation paths, use `DerivationPathKeysGenerator`:
```typescript theme={null}
import { DefaultSparkSigner, DerivationPathKeysGenerator } from "@buildonspark/spark-sdk";
// Use ? as placeholder for account number
const customGenerator = new DerivationPathKeysGenerator("m/44'/0'/?'");
const signer = new DefaultSparkSigner({
sparkKeysGenerator: customGenerator
});
```
***
## Wallet Initialization
### Generate Mnemonic
generateMnemonic()
Generates a new BIP39 mnemonic phrase for wallet creation.
```typescript theme={null}
const mnemonic = await signer.generateMnemonic();
console.log(mnemonic); // "abandon ability able about above absent..."
```
A 12-word BIP39 mnemonic phrase
### Convert Mnemonic to Seed
mnemonicToSeed(mnemonic)
Converts a BIP39 mnemonic phrase to a cryptographic seed.
```typescript theme={null}
const seed = await signer.mnemonicToSeed(mnemonic);
console.log("Seed length:", seed.length); // 64 bytes
```
Valid BIP39 mnemonic phrase
64-byte seed derived from the mnemonic
### Initialize from Seed
createSparkWalletFromSeed(seed, accountNumber?)
Initializes the signer with a master seed and derives all necessary keys.
```typescript theme={null}
const seed = await signer.mnemonicToSeed(mnemonic);
const identityPubKey = await signer.createSparkWalletFromSeed(seed, 0);
console.log("Identity public key:", identityPubKey);
```
Master seed as bytes or hex string
Account index for key derivation. Defaults to 0 on REGTEST, 1 on MAINNET for backwards compatibility.
Hex-encoded identity public key
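The network-dependent default described above can be sketched as follows (an assumption about the fallback, mirroring the documented behavior rather than the SDK's actual code):

```typescript
// Sketch of the documented default, not the SDK's internal code:
// when accountNumber is omitted, REGTEST falls back to 0 and
// MAINNET to 1 (for backwards compatibility).
function defaultAccountNumber(network: "REGTEST" | "MAINNET"): number {
  return network === "MAINNET" ? 1 : 0;
}
```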
***
## Key Management
### Get Identity Public Key
getIdentityPublicKey()
Retrieves the wallet's identity public key.
```typescript theme={null}
const identityKey = await signer.getIdentityPublicKey();
console.log("Identity key:", identityKey);
```
The identity public key (33 bytes, compressed)
### Get Deposit Signing Key
getDepositSigningKey()
Retrieves the deposit signing public key used for L1 Bitcoin deposits.
```typescript theme={null}
const depositKey = await signer.getDepositSigningKey();
console.log("Deposit signing key:", depositKey);
```
The deposit signing public key
### Get Static Deposit Signing Key
getStaticDepositSigningKey(idx)
Retrieves a static deposit signing public key by index.
```typescript theme={null}
const signingKey = await signer.getStaticDepositSigningKey(0);
console.log("Static deposit signing key:", signingKey);
```
Index for the static deposit key
The static deposit signing public key
### Get Static Deposit Secret Key
getStaticDepositSecretKey(idx)
Retrieves a static deposit private key by index. Used when the private key needs to be shared with the SSP for static deposit flows.
```typescript theme={null}
const secretKey = await signer.getStaticDepositSecretKey(0);
```
Index for the static deposit key
The static deposit private key
### Get Public Key from Derivation
getPublicKeyFromDerivation(keyDerivation)
Derives a public key based on a `KeyDerivation` specification.
```typescript theme={null}
import { KeyDerivationType } from "@buildonspark/spark-sdk";
// Get public key for a leaf
const leafPubKey = await signer.getPublicKeyFromDerivation({
type: KeyDerivationType.LEAF,
path: "leaf-123"
});
// Get deposit public key
const depositPubKey = await signer.getPublicKeyFromDerivation({
type: KeyDerivationType.DEPOSIT
});
```
Specifies how to derive the key (LEAF, DEPOSIT, STATIC\_DEPOSIT, ECIES, or RANDOM)
The derived public key
***
## Digital Signatures
### Sign with Identity Key
signMessageWithIdentityKey(message, compact?)
Signs a message using the wallet's identity key with ECDSA.
```typescript theme={null}
const message = new TextEncoder().encode("Hello, Spark!");
const signature = await signer.signMessageWithIdentityKey(message);
console.log("Signature:", signature);
// With compact format
const compactSignature = await signer.signMessageWithIdentityKey(message, true);
```
Message to sign
Use compact signature format instead of DER
ECDSA signature (DER or compact format)
### Validate Signature
validateMessageWithIdentityKey(message, signature)
Validates an ECDSA signature against the identity key.
```typescript theme={null}
const message = new TextEncoder().encode("Hello, Spark!");
const signature = await signer.signMessageWithIdentityKey(message);
const isValid = await signer.validateMessageWithIdentityKey(message, signature);
console.log("Signature valid:", isValid);
```
Original message
Signature to validate
True if signature is valid
### Sign with Schnorr (Identity Key)
signSchnorrWithIdentityKey(message)
Creates a Schnorr signature using the identity key.
```typescript theme={null}
const message = new TextEncoder().encode("Hello, Spark!");
const schnorrSignature = await signer.signSchnorrWithIdentityKey(message);
console.log("Schnorr signature:", schnorrSignature);
```
Message to sign
Schnorr signature (64 bytes)
### Sign Transaction Index
signTransactionIndex(tx, index, publicKey)
Signs a specific input of a Bitcoin transaction. The method looks up the private key based on the provided public key (must be either identity or deposit key).
```typescript theme={null}
import { Transaction } from "@scure/btc-signer";
const tx = new Transaction();
// ... build transaction ...
const identityKey = await signer.getIdentityPublicKey();
signer.signTransactionIndex(tx, 0, identityKey);
```
The Bitcoin transaction to sign (from @scure/btc-signer)
Input index to sign
Public key identifying which private key to use
***
## FROST Protocol (Threshold Signatures)
Spark uses FROST (Flexible Round-Optimized Schnorr Threshold) signatures for collaborative signing between users and Signing Operators.
### Get Random Signing Commitment
getRandomSigningCommitment()
Generates a random signing commitment for FROST protocol. In `DefaultSparkSigner`, the nonce is stored internally. In `UnsafeStatelessSparkSigner`, the nonce is returned in the response.
```typescript theme={null}
const commitment = await signer.getRandomSigningCommitment();
console.log("Commitment:", commitment.commitment);
// commitment.nonce is only present in UnsafeStatelessSparkSigner
```
Object containing the signing commitment, and optionally the nonce (for stateless signers)
### Get Nonce for Commitment
getNonceForSelfCommitment(selfCommitment)
Retrieves the nonce associated with a previously generated commitment. In `DefaultSparkSigner`, this looks up the internally stored nonce. In `UnsafeStatelessSparkSigner`, this returns the nonce from the commitment object.
```typescript theme={null}
const commitment = await signer.getRandomSigningCommitment();
const nonce = signer.getNonceForSelfCommitment(commitment);
```
The commitment returned from getRandomSigningCommitment()
The signing nonce, or undefined if not found
### FROST Signing
signFrost(params)
Performs FROST signing operation. This produces the user's signature share that will be combined with Signing Operator shares.
```typescript theme={null}
import { KeyDerivationType } from "@buildonspark/spark-sdk";
const commitment = await signer.getRandomSigningCommitment();
const params = {
message: sighash, // Transaction sighash to sign
keyDerivation: { type: KeyDerivationType.LEAF, path: leafId },
publicKey: leafPublicKey,
verifyingKey: leaf.verifyingPublicKey,
selfCommitment: commitment,
statechainCommitments: soCommitments, // From Signing Operators
adaptorPubKey: undefined // Optional adaptor for atomic swaps
};
const signatureShare = await signer.signFrost(params);
```
FROST signing parameters:
* `message`: The message (sighash) to sign
* `keyDerivation`: How to derive the signing key
* `publicKey`: The user's public key for this leaf
* `verifyingKey`: The aggregated public key (user + SOs)
* `selfCommitment`: User's signing commitment
* `statechainCommitments`: Signing Operators' commitments
* `adaptorPubKey`: Optional adaptor public key
FROST signature share
### Aggregate FROST Signatures
aggregateFrost(params)
Aggregates FROST signature shares (user's + Signing Operators') into a final Schnorr signature.
```typescript theme={null}
const params = {
message: sighash,
publicKey: leafPublicKey,
verifyingKey: leaf.verifyingPublicKey,
selfCommitment: commitment,
selfSignature: userSignatureShare,
statechainCommitments: soCommitments,
statechainSignatures: soSignatures,
statechainPublicKeys: soPublicKeys
};
const finalSignature = await signer.aggregateFrost(params);
```
FROST aggregation parameters including all signature shares and public keys
Final aggregated Schnorr signature (64 bytes)
***
## Secret Sharing
These methods implement Shamir's Secret Sharing with verifiable proofs, used internally for key splitting operations.
### Split Secret with Proofs
splitSecretWithProofs(params)
Splits a secret into shares using Shamir's Secret Sharing with verifiable proofs.
```typescript theme={null}
const params = {
secret: privateKey,
curveOrder: secp256k1.CURVE.n,
threshold: 3,
numShares: 5
};
const shares = await signer.splitSecretWithProofs(params);
```
* `secret`: The secret to split (as Uint8Array)
* `curveOrder`: The curve order (bigint)
* `threshold`: Minimum shares needed to reconstruct
* `numShares`: Total number of shares to generate
Array of verifiable secret shares
### Subtract and Split with Proofs
subtractAndSplitSecretWithProofsGivenDerivations(params)
Subtracts two derived private keys and splits the result into verifiable shares. Used in transfer flows.
```typescript theme={null}
import { KeyDerivationType } from "@buildonspark/spark-sdk";
const shares = await signer.subtractAndSplitSecretWithProofsGivenDerivations({
first: { type: KeyDerivationType.LEAF, path: "old-leaf" },
second: { type: KeyDerivationType.LEAF, path: "new-leaf" },
curveOrder: secp256k1.CURVE.n,
threshold: 3,
numShares: 5
});
```
### Subtract, Split, and Encrypt
subtractSplitAndEncrypt(params)
Subtracts keys, splits into shares, and encrypts the second key for the receiver. Used in transfer operations.
```typescript theme={null}
const result = await signer.subtractSplitAndEncrypt({
first: { type: KeyDerivationType.LEAF, path: oldLeafId },
second: { type: KeyDerivationType.LEAF, path: newLeafId },
curveOrder: secp256k1.CURVE.n,
threshold: 3,
numShares: 5,
receiverPublicKey: receiverIdentityKey
});
console.log(result.shares); // Verifiable secret shares
console.log(result.secretCipher); // Encrypted key for receiver
```
***
## Encryption
### Decrypt ECIES
decryptEcies(ciphertext)
Decrypts ECIES-encrypted data using the identity key. Returns the public key corresponding to the decrypted private key.
```typescript theme={null}
const ciphertext = encryptedKeyFromSender;
const publicKey = await signer.decryptEcies(ciphertext);
```
ECIES-encrypted private key
Public key corresponding to the decrypted private key
***
## HTLC Operations
### Generate HTLC HMAC
htlcHMAC(transferID)
Generates an HMAC for HTLC (Hash Time-Locked Contract) operations using the HTLC preimage key.
```typescript theme={null}
const hmac = await signer.htlcHMAC(transferId);
```
The transfer ID to generate HMAC for
HMAC output (32 bytes)
***
## Complete Example
```typescript theme={null}
import {
SparkWallet,
DefaultSparkSigner,
KeyDerivationType
} from "@buildonspark/spark-sdk";
async function demonstrateSparkSigner() {
// 1. Create and initialize signer
const signer = new DefaultSparkSigner();
const mnemonic = await signer.generateMnemonic();
console.log("Generated mnemonic:", mnemonic);
const seed = await signer.mnemonicToSeed(mnemonic);
const identityKeyHex = await signer.createSparkWalletFromSeed(seed, 0);
console.log("Identity key:", identityKeyHex);
// 2. Get keys
const identityKey = await signer.getIdentityPublicKey();
const depositKey = await signer.getDepositSigningKey();
const staticDepositKey = await signer.getStaticDepositSigningKey(0);
console.log("Keys initialized");
// 3. Sign and validate a message
const message = new TextEncoder().encode("Hello, Spark!");
const signature = await signer.signMessageWithIdentityKey(message);
const isValid = await signer.validateMessageWithIdentityKey(message, signature);
console.log("Signature valid:", isValid);
// 4. Schnorr signature
const schnorrSig = await signer.signSchnorrWithIdentityKey(message);
console.log("Schnorr signature length:", schnorrSig.length);
// 5. Get public key from derivation
const leafPubKey = await signer.getPublicKeyFromDerivation({
type: KeyDerivationType.LEAF,
path: "my-leaf-id"
});
console.log("Leaf public key:", leafPubKey);
// 6. Use with SparkWallet
const { wallet } = await SparkWallet.initialize({
mnemonicOrSeed: mnemonic,
accountNumber: 0,
options: { network: "REGTEST" }
});
const address = await wallet.getSparkAddress();
console.log("Spark address:", address);
}
```
***
## Integration Patterns
### Remote Signer (Enclave Pattern)
For wallet providers that need to keep keys in a secure enclave:
```typescript theme={null}
class RemoteSigner extends DefaultSparkSigner {
private apiClient: EnclaveAPIClient;
private userId: string;
constructor(apiClient: EnclaveAPIClient, userId: string) {
super();
this.apiClient = apiClient;
this.userId = userId;
}
async signFrost(params: SignFrostParams): Promise<Uint8Array> {
// Forward to enclave
return this.apiClient.signFrost(this.userId, params);
}
async aggregateFrost(params: AggregateFrostParams): Promise<Uint8Array> {
return this.apiClient.aggregateFrost(this.userId, params);
}
async createSparkWalletFromSeed(
seed: Uint8Array | string,
accountNumber?: number
): Promise<string> {
// Keys are managed in the enclave
return this.apiClient.initializeWallet(this.userId, seed, accountNumber);
}
}
```
### Multi-User Wallet Pattern
For services managing wallets for multiple users:
```typescript theme={null}
class MultiUserSigner extends UnsafeStatelessSparkSigner {
private keyStore: KeyStore;
async signFrost(params: SignFrostParams): Promise<Uint8Array> {
// Look up user's key material from secure storage
const userKeys = await this.keyStore.getKeys(params.publicKey);
// Perform signing with user's keys
return super.signFrost({
...params,
// Additional context if needed
});
}
}
```
For multi-user wallets, consider the trust model carefully. See the [Alby architecture blog post](https://getalby.com/blog/a-trust-minimized-multi-user-nwc-wallet-with-ark-spark) for a trust-minimized approach using NWC.
# Sparkscan
Source: https://docs.spark.money/wallets/sparkscan
Sparkscan is the official block explorer for the Spark network, providing visibility into transactions, addresses, and network activity.
## Features
* **Transaction Explorer**: View detailed transaction information and status
* **Address Lookup**: Search for addresses and view their transaction history
* **Network Statistics**: Monitor network activity and performance metrics
* **Real-time Updates**: Live transaction feeds and network monitoring
## Usage
Visit [Sparkscan](https://sparkscan.io) to explore the Spark network and track your transactions.
## API Access
Sparkscan also provides API endpoints for programmatic access to blockchain data, enabling developers to integrate network data into their applications.
# Testing Guide
Source: https://docs.spark.money/wallets/testing-guide
Learn how to test your Spark wallet integration effectively.
## Overview
Testing your Spark wallet implementation is crucial for ensuring reliability and user experience. This guide covers testing strategies, tools, and best practices.
## Testing Environment
### Regtest Network
```typescript theme={null}
// Configure for regtest testing
const { wallet } = await SparkWallet.initialize({
  mnemonicOrSeed: "your-test-mnemonic",
  signer: yourSigner,
  options: {
    network: "REGTEST"
  }
});
```
### Test Data
```typescript theme={null}
// Use test tokens and addresses
const testTokenId = 'test-token-id';
const testAddress = 'test-spark-address';
```
## Unit Testing
```typescript theme={null}
// Test wallet initialization
describe("Wallet Initialization", () => {
  it("should initialize with a valid mnemonic", async () => {
    const { wallet } = await SparkWallet.initialize({
      mnemonicOrSeed: testMnemonic, // a valid 12- or 24-word BIP-39 phrase
      options: { network: "REGTEST" }
    });
    expect(wallet).toBeDefined();
  });
});
```
## Integration Testing
```typescript theme={null}
// Test wallet operations
describe("Wallet Operations", () => {
  it("should transfer tokens", async () => {
    const txId = await wallet.transferTokens({
      tokenIdentifier: testTokenId,
      tokenAmount: 1000n,
      receiverSparkAddress: testAddress
    });
    expect(txId).toBeDefined();
  });
});
```
## Best Practices
* Use regtest network for development
* Mock external dependencies
* Test error conditions
* Verify transaction states
* Test with different network conditions
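The "mock external dependencies" advice can be followed without any mocking framework. A minimal sketch, assuming a pared-down wallet interface (the real `SparkWallet` surface is larger; the `sendTokens` helper is hypothetical code under test):

```typescript theme={null}
// Minimal wallet surface used by the code under test (illustrative subset).
interface TokenTransferParams {
  tokenIdentifier: string;
  tokenAmount: bigint;
  receiverSparkAddress: string;
}

interface WalletLike {
  transferTokens(params: TokenTransferParams): Promise<string>;
}

// A hand-rolled mock that records calls and returns a canned transaction ID.
function makeMockWallet() {
  const calls: TokenTransferParams[] = [];
  const wallet: WalletLike = {
    async transferTokens(params) {
      calls.push(params);
      return "mock-tx-id";
    },
  };
  return { wallet, calls };
}

// Hypothetical code under test: validate before delegating to the wallet.
async function sendTokens(wallet: WalletLike, params: TokenTransferParams): Promise<string> {
  if (params.tokenAmount <= 0n) throw new Error("amount must be positive");
  return wallet.transferTokens(params);
}
```

Because the mock records every call, a test can assert both the happy path and the validation path without touching the network.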
# Spark CLI
Source: https://docs.spark.money/wallets/tools/cli
# Transfer Bitcoin
Source: https://docs.spark.money/wallets/transfer-bitcoin
Send Bitcoin instantly between Spark wallets with zero fees.
***
## Transfer Bitcoin
Send Bitcoin to another Spark wallet using a simple transfer method.
transfer(params)
Transfers Bitcoin to another Spark wallet on the Spark network.
```typescript theme={null}
const transferResult = await wallet.transfer({
  receiverSparkAddress: "spark1p...", // Recipient's Spark address
  amountSats: 50000, // Amount in satoshis
});
console.log("Transfer successful:", transferResult);
```
* `receiverSparkAddress` - The recipient's Spark address
* `amountSats` - The amount in satoshis to transfer

Returns: the completed transfer details, including transaction ID and status
***
## What is a Spark Transfer?
A Spark Transfer is a Bitcoin transfer that occurs entirely within the Spark network. Unlike traditional Bitcoin transactions that go through the blockchain, Spark transfers are:
* **Instant** - No waiting for blockchain confirmations
* **Low cost** - Minimal fees compared to on-chain transactions
* **Private** - Transfers are not visible on the public blockchain
* **Efficient** - Uses Spark's layer 2 infrastructure
Spark transfers are ideal for frequent Bitcoin transactions, micro-payments, and applications requiring instant settlement.
***
## Check Transfer Status
Monitor your transfers and track their status using transfer queries and events.
getTransfers(limit?, offset?)
Gets all transfers for the wallet with optional pagination.
`getTransfers()` includes Spark transfers, Lightning sends/receives, and cooperative exits. For token transaction details, use [`queryTokenTransactions()`](/api-reference/wallet/query-token-transactions).
```typescript theme={null}
// Get recent transfers
const transfers = await wallet.getTransfers(10);
console.log("Recent transfers:", transfers.transfers);
// Check specific transfer status
const recentTransfer = transfers.transfers[0];
console.log("Transfer ID:", recentTransfer.id);
console.log("Transfer status:", recentTransfer.status);
console.log("Amount:", recentTransfer.totalValue, "sats");
```
* `limit` - Maximum number of transfers to return (default: 20)
* `offset` - Offset for pagination (default: 0)

Returns: an object with `transfers` (an array of transfer objects containing transfer details) and `offset` (the offset used for this request)
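Because `getTransfers(limit, offset)` is paginated, walking the full history means advancing the offset until a page comes back short. A sketch against a stand-in page-fetcher (`TransferPage` is an assumed minimal shape; the SDK's actual return type has more fields):

```typescript theme={null}
interface TransferPage<T> {
  transfers: T[];
  offset: number;
}

// Collect every transfer by paging with limit/offset until a short page arrives.
async function getAllTransfers<T>(
  fetchPage: (limit: number, offset: number) => Promise<TransferPage<T>>,
  pageSize = 20,
): Promise<T[]> {
  const all: T[] = [];
  for (let offset = 0; ; offset += pageSize) {
    const page = await fetchPage(pageSize, offset);
    all.push(...page.transfers);
    if (page.transfers.length < pageSize) return all; // last page
  }
}
```

With a real wallet this would be called as `getAllTransfers((l, o) => wallet.getTransfers(l, o))`.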
***
## Real-time Transfer Monitoring
Monitor transfer status in real-time using event listeners.
```typescript theme={null}
// Listen for incoming transfer events
wallet.on("transfer:claimed", (transferId, updatedBalance) => {
  console.log(`Incoming transfer ${transferId} claimed! New balance: ${updatedBalance} sats`);
});
// Note: There are no events for outgoing transfers.
// The transfer() method returns immediately when the transfer completes.
```
The `transfer:claimed` event only fires for **incoming** transfers. For outgoing transfers, the `transfer()` method returns a `WalletTransfer` object when complete.
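If your code needs to await a specific incoming transfer, the `transfer:claimed` event can be wrapped in a promise. A sketch using Node's `EventEmitter` as a stand-in for the wallet's event interface (the timeout default and the event payload shape are assumptions):

```typescript theme={null}
import { EventEmitter } from "node:events";

// Resolve with the updated balance when a given transfer is claimed,
// or reject if it isn't claimed within the timeout.
function waitForClaim(
  emitter: EventEmitter,
  transferId: string,
  timeoutMs = 30_000,
): Promise<number> {
  return new Promise((resolve, reject) => {
    const onClaimed = (id: string, updatedBalance: number) => {
      if (id !== transferId) return; // ignore other transfers
      clearTimeout(timer);
      emitter.off("transfer:claimed", onClaimed);
      resolve(updatedBalance);
    };
    const timer = setTimeout(() => {
      emitter.off("transfer:claimed", onClaimed);
      reject(new Error(`transfer ${transferId} not claimed within ${timeoutMs}ms`));
    }, timeoutMs);
    emitter.on("transfer:claimed", onClaimed);
  });
}
```

Removing the listener on both paths keeps repeated calls from leaking handlers.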
***
## Error Handling
Implement proper error handling for failed transfers and edge cases.
```typescript theme={null}
async function transferBitcoinSafely(params) {
  try {
    // Check if you have enough balance
    const balance = await wallet.getBalance();
    if (balance.balance < params.amountSats) {
      throw new Error("Insufficient balance");
    }
    // Validate recipient address format
    if (!params.receiverSparkAddress.startsWith('spark1') &&
        !params.receiverSparkAddress.startsWith('sparkrt1')) {
      throw new Error("Invalid Spark address format");
    }
    // Attempt the transfer
    const result = await wallet.transfer(params);
    console.log("Transfer successful:", result);
    return result;
  } catch (error) {
    console.error("Transfer failed:", error.message);
    // Handle specific error types
    if (error.message.includes("Insufficient")) {
      console.log("Please check your Bitcoin balance");
    } else if (error.message.includes("Invalid")) {
      console.log("Please verify the recipient address");
    } else if (error.message.includes("Network")) {
      console.log("Network error. Please try again.");
    }
    throw error;
  }
}
```
***
## Example: Complete Bitcoin Transfer Flow
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
async function completeBitcoinTransfer() {
  const { wallet } = await SparkWallet.initialize({
    options: { network: "REGTEST" }
  });

  try {
    // 1. Check current balance
    const balance = await wallet.getBalance();
    console.log("Current balance:", balance.balance, "sats");

    // 2. Set up event listeners for incoming transfers
    wallet.on("transfer:claimed", (transferId, newBalance) => {
      console.log(`Incoming transfer ${transferId} claimed! New balance: ${newBalance} sats`);
    });

    // 3. Transfer Bitcoin
    const transferResult = await wallet.transfer({
      receiverSparkAddress: "spark1p...", // Replace with recipient address
      amountSats: 10000 // Transfer 10,000 sats
    });
    console.log("Transfer initiated:", transferResult);

    // 4. Check transfer status
    const transfers = await wallet.getTransfers(5);
    console.log("Recent transfers:", transfers.transfers);
  } catch (error) {
    console.error("Transfer failed:", error);
  }
}
```
# Transfer Tokens
Source: https://docs.spark.money/wallets/transfer-tokens
Send tokens to any Spark address with single or batch transfers.
***
## Transfer Tokens
Send tokens to another Spark wallet using the token identifier and amount.
transferTokens(params)
Transfers tokens to another user on the Spark network.
```typescript theme={null}
const transferResult = await wallet.transferTokens({
  tokenIdentifier: "btkn1p...", // Bech32m token identifier
  tokenAmount: BigInt(1000), // Amount of tokens to transfer
  receiverSparkAddress: "spark1p...", // Recipient's Spark address
});
console.log("Transfer successful:", transferResult);
```
* `tokenIdentifier` - The Bech32m token identifier (e.g., btkn1...) of the token to transfer
* `tokenAmount` - The amount of tokens to transfer
* `receiverSparkAddress` - The recipient's Spark address
* Optional: specific outputs to use for the transfer

Returns: the transaction ID of the token transfer
***
## Get Token Balances
Before transferring tokens, check what tokens you own and their balances using `getBalance()`.
```typescript theme={null}
const { balance, tokenBalances } = await wallet.getBalance();
console.log("Sats balance:", balance);
// Iterate over token balances
for (const [tokenId, tokenData] of tokenBalances) {
  console.log(`Token ${tokenId}:`);
  console.log("  Balance:", tokenData.balance);
  console.log("  Name:", tokenData.tokenMetadata.tokenName);
  console.log("  Ticker:", tokenData.tokenMetadata.tokenTicker);
}

// Check balance of a specific token
const specificToken = tokenBalances.get("btkn1...");
if (specificToken) {
  console.log("Token balance:", specificToken.balance);
}
```
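Token balances are raw `bigint` amounts. To display them you typically divide by the token's decimal places; a small sketch (the number of decimals comes from the token's metadata, whose exact field name may differ in the SDK):

```typescript theme={null}
// Format a raw bigint token amount for display, given its decimal places.
function formatTokenAmount(raw: bigint, decimals: number): string {
  const base = 10n ** BigInt(decimals);
  const whole = raw / base;
  const frac = raw % base;
  if (frac === 0n) return whole.toString();
  // Pad the fractional part to full width, then trim trailing zeros.
  const fracStr = frac.toString().padStart(decimals, "0").replace(/0+$/, "");
  return `${whole}.${fracStr}`;
}
```

Doing the arithmetic in `bigint` avoids the precision loss of converting large token amounts to `number`.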
***
## Error Handling
Proper error handling is essential when transferring tokens to ensure a smooth user experience.
```typescript theme={null}
async function transferTokensSafely(params) {
  try {
    // Check if you have enough tokens
    const balance = await wallet.getBalance();
    const tokenBalance = balance.tokenBalances.get(params.tokenIdentifier);
    if (!tokenBalance || tokenBalance.balance < params.tokenAmount) {
      throw new Error("Insufficient token balance");
    }
    // Attempt the transfer
    const result = await wallet.transferTokens(params);
    console.log("Transfer successful:", result);
    return result;
  } catch (error) {
    console.error("Transfer failed:", error.message);
    // Handle specific error types
    if (error.message.includes("Insufficient")) {
      console.log("Please check your token balance");
    } else if (error.message.includes("Invalid address")) {
      console.log("Please verify the recipient address");
    } else {
      console.log("Transfer failed. Please try again.");
    }
    throw error;
  }
}
```
***
## Example: Complete Token Transfer Flow
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
async function completeTokenTransfer() {
  const { wallet } = await SparkWallet.initialize({
    options: { network: "REGTEST" }
  });

  try {
    // 1. Check current balance and token holdings
    const { balance, tokenBalances } = await wallet.getBalance();
    console.log("Current balance:", balance, "sats");
    console.log("Token balances:", tokenBalances);

    // 2. Set up event listeners
    wallet.on("transfer:claimed", (transferId, newBalance) => {
      console.log(`Token transfer ${transferId} claimed!`);
    });

    // 3. Transfer tokens
    const transferResult = await wallet.transferTokens({
      tokenIdentifier: "btkn1p...", // Replace with actual token ID
      tokenAmount: BigInt(100), // Transfer 100 tokens
      receiverSparkAddress: "spark1p..." // Replace with recipient address
    });
    console.log("Transfer initiated:", transferResult);
  } catch (error) {
    console.error("Transfer failed:", error);
  }
}
```
# TypeScript SDK
Source: https://docs.spark.money/wallets/typescript
TypeScript SDK for building Bitcoin wallet applications on Node.js and browsers. Requires Node.js v16+.
## Getting Started
To get started, follow the steps below.
Install the Spark SDK packages using your package manager of choice.
```bash npm theme={null}
npm install @buildonspark/spark-sdk
```
```bash yarn theme={null}
yarn add @buildonspark/spark-sdk
```
```bash pnpm theme={null}
pnpm add @buildonspark/spark-sdk
```
Create a wallet instance that will be used to interact with the Spark network.
```ts wallet.ts theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
export const initializeWallet = async () => {
  const { wallet, mnemonic } = await SparkWallet.initialize({
    // mnemonicOrSeed: "your mnemonic or seed", // optional; auto-generated if omitted
    // accountNumber: 0,                        // optional account index (number)
    options: {
      network: "REGTEST" // or "MAINNET"
    }
  });
  console.log("Wallet initialized successfully:", mnemonic);
  return wallet;
};
```
You're ready to start building.
```ts app.ts theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
async function main() {
  try {
    // Initialize wallet
    const { wallet, mnemonic } = await SparkWallet.initialize({
      options: { network: "REGTEST" }
    });
    console.log("Wallet created successfully!");
    console.log("Mnemonic:", mnemonic);
    console.log("Address:", await wallet.getSparkAddress());

    const { balance } = await wallet.getBalance();
    console.log("Balance:", balance, "sats");

    // Example: Send Bitcoin
    const transfer = await wallet.transfer({
      receiverSparkAddress: "spark1...",
      amountSats: 1000,
    });
    console.log("Transfer completed:", transfer.id);
  } catch (error) {
    console.error("Error:", error);
  }
}

main();
```
## TypeScript Configuration
### tsconfig.json
Create a `tsconfig.json` file in your project root:
```json tsconfig.json theme={null}
{
  "compilerOptions": {
    "target": "ES2020",
    "module": "commonjs",
    "lib": ["ES2020"],
    "outDir": "./dist",
    "rootDir": "./src",
    "strict": true,
    "esModuleInterop": true,
    "skipLibCheck": true,
    "forceConsistentCasingInFileNames": true,
    "resolveJsonModule": true,
    "declaration": true,
    "declarationMap": true,
    "sourceMap": true
  },
  "include": ["src/**/*"],
  "exclude": ["node_modules", "dist"]
}
```
### Package.json Scripts
Add TypeScript build scripts to your `package.json`:
```json package.json theme={null}
{
  "scripts": {
    "build": "tsc",
    "start": "node dist/app.js",
    "dev": "ts-node src/app.ts",
    "watch": "tsc --watch"
  }
}
```
## Core Wallet Operations
### Initialize a Wallet
A wallet requires either a mnemonic or raw seed for initialization. The `initialize()` function accepts both. If no input is given, it will auto-generate a mnemonic and return it.
```ts theme={null}
// Initialize a new wallet instance
const { wallet, mnemonic } = await SparkWallet.initialize({
  // mnemonicOrSeed: "your mnemonic or seed", // optional; auto-generated if omitted
  // accountNumber: 0,                        // optional account index (number)
  options: {
    network: "REGTEST" // or "MAINNET"
  }
});
console.log("Wallet initialized successfully:", mnemonic);
```
### Mnemonic Phrases
A mnemonic is a human-readable encoding of your wallet's seed. It's a 12- or 24-word phrase from the BIP-39 wordlist, used to derive the cryptographic keys that control your wallet.
## TypeScript Features
### Type Safety
The Spark TypeScript SDK provides full type safety for all wallet operations:
```ts theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
// TypeScript will provide autocomplete and type checking
const { wallet } = await SparkWallet.initialize({
  options: { network: "REGTEST" }
});

// All methods are fully typed
const address: string = await wallet.getSparkAddress();
const { balance, tokenBalances } = await wallet.getBalance();
// balance: bigint (satoshis)
// tokenBalances: Map<string, { balance: bigint; tokenMetadata: TokenMetadata }>
```
### Interface Definitions
All wallet methods return properly typed interfaces:
```ts theme={null}
interface InitResult {
  wallet: SparkWallet;
  mnemonic?: string; // undefined if a raw seed was provided
}

interface BalanceResult {
  balance: bigint;
  tokenBalances: Map<string, { balance: bigint; tokenMetadata: TokenMetadata }>;
}

interface WalletTransfer {
  id: string;
  senderIdentityPublicKey: string;
  receiverIdentityPublicKey: string;
  status: TransferStatus; // See TransferStatus below
  totalValue: number;
  expiryTime: Date | undefined;
  leaves: WalletTransferLeaf[];
  createdTime: Date | undefined;
  updatedTime: Date | undefined;
  type: TransferType; // "TRANSFER" | "COOPERATIVE_EXIT" | "PREIMAGE_SWAP" | "UTXO_SWAP"
  transferDirection: "INCOMING" | "OUTGOING";
  userRequest: UserRequestType | undefined;
}

// Common status values you'll encounter:
type TransferStatus =
  | "TRANSFER_STATUS_COMPLETED"
  | "TRANSFER_STATUS_EXPIRED"
  | "TRANSFER_STATUS_RETURNED"
  | "TRANSFER_STATUS_SENDER_INITIATED";
  // ... and other intermediate states
```
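When polling transfer status, it helps to distinguish terminal states from intermediate ones. A small guard over the status strings listed above (the terminal set here is an assumption based on those values; check the SDK's `TransferStatus` enum for the full list):

```typescript theme={null}
// Statuses after which a transfer will no longer change state
// (assumed subset; verify against the SDK's TransferStatus enum).
const TERMINAL_STATUSES = new Set([
  "TRANSFER_STATUS_COMPLETED",
  "TRANSFER_STATUS_EXPIRED",
  "TRANSFER_STATUS_RETURNED",
]);

// True once a transfer has reached a final state and polling can stop.
function isTerminal(status: string): boolean {
  return TERMINAL_STATUSES.has(status);
}
```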
# Unilateral Exit
Source: https://docs.spark.money/wallets/unilateral-exit
A unilateral exit allows you to withdraw funds from Spark without requiring anyone else's permission. If you choose not to trust Spark anymore or if Spark were to shut down, you can still retrieve your money.
**Critical Requirements:**

* Only exit leaves **> 16,348 sats** (smaller amounts cost more in fees than they're worth)
* Must complete the entire process **in a single session** - the wallet is unusable until finished
* Transactions must be broadcast **from root to leaves** in the exact order shown
* **Bitcoin Core v29 required** for ephemeral anchors
* **Beta feature:** we're fixing bugs as we find them
## Tree Structure Basics
Your Spark funds are organized in a tree:
* Each "leaf" = portion of your funds with different values/timelocks
* Deeper leaves = more transactions required to exit
* Broadcasting order matters: root → leaves
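The root-to-leaves rule means every parent transaction must hit the chain before any of its children. A toy sketch of producing a valid broadcast order with a breadth-first walk (the node shape is illustrative, not the CLI's internal format):

```typescript theme={null}
interface TreeNode {
  txid: string;
  children: TreeNode[];
}

// Breadth-first walk: every parent appears before its children,
// which is exactly the order a unilateral exit must broadcast in.
function broadcastOrder(root: TreeNode): string[] {
  const order: string[] = [];
  const queue: TreeNode[] = [root];
  while (queue.length > 0) {
    const node = queue.shift()!;
    order.push(node.txid);
    queue.push(...node.children);
  }
  return order;
}
```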
## Getting Started
### 1. Check Your Leaves
```bash theme={null}
sparkcli getleaves
```
Example output:
```
Leaf ID: 019765d1-ad06-79ed-b30e-0afc064c5a1b
Value: 16384 sats
Status: AVAILABLE
Tree ID: 019765d1-ac2e-7ec8-ac37-c77ea070237d
```
### 2. Start Interactive Exit
```bash theme={null}
sparkcli unilateralexit --testmode=true
```
**Important:** Use `--testmode=true` ONLY on Regtest. Mainnet users must use the regular flow without testmode to properly handle timelocks and manual fee signing.
The CLI guides you through:
* Leaf selection
* Timelock handling (auto-refreshed in test mode)
* UTXO generation for fees
* Transaction creation and signing
* Broadcast order instructions
## Regtest Testing
**Test on regtest first** - don't learn on mainnet.
### Setup
* Initialize wallet and deposit 131,072 sats for testing
* Use the [regtest faucet](https://app.lightspark.com/regtest-faucet) (max 50,000 sats per deposit)
* CLI auto-generates test wallet for fees
### Broadcasting
Use [https://regtest-mempool.us-west-2.sparkinfra.net/tx/push](https://regtest-mempool.us-west-2.sparkinfra.net/tx/push) to submit packages in order. Wait \~1 min between packages for confirmation.
## Mainnet Usage
### Prerequisites
* Sufficient L1 Bitcoin UTXOs for transaction fees
* Only exit leaves > 16,348 sats
* Recommended: 131,072 sats deposit for better leaf sizes
### Mainnet vs Regtest
* **UTXOs**: Provide your own Bitcoin UTXOs when prompted
* **Broadcasting**: Use `bitcoin-cli submitpackage '["node_tx", "fee_bump_tx"]'`
* **Costs**: Real network fees apply
### Process
1. Check leaves with `sparkcli getleaves`
2. Run `sparkcli unilateralexit` (no testmode - you'll handle timelocks manually)
3. CLI will walk you through it (bring your own UTXOs)
4. Broadcast in exact order shown by CLI
## Essential Commands
```bash theme={null}
sparkcli unilateralexit --testmode=true
sparkcli unilateralexit
sparkcli getleaves
sparkcli checktimelock
sparkcli testonly_expiretimelock
sparkcli signfeebump
```
# Withdraw to L1
Source: https://docs.spark.money/wallets/withdraw-to-l1
Withdraw Bitcoin from Spark to any L1 address via cooperative exit.
***
## Withdrawal Flow
The complete process for withdrawing Bitcoin from Spark back to Layer 1:
Request a fee quote for your withdrawal with different speed options.
```typescript theme={null}
const feeQuote = await wallet.getWithdrawalFeeQuote({
  amountSats: 50000,
  withdrawalAddress: "bc1p..." // Your Bitcoin address
});
console.log("Fee quote:", feeQuote);
```
Start the withdrawal process using the fee quote.
```typescript theme={null}
// Calculate fee based on exit speed
const exitSpeed: "fast" | "medium" | "slow" = "fast";
let feeAmountSats: number;
switch (exitSpeed) {
  case "fast":
    feeAmountSats = (feeQuote.l1BroadcastFeeFast?.originalValue || 0) +
      (feeQuote.userFeeFast?.originalValue || 0);
    break;
  case "medium":
    feeAmountSats = (feeQuote.l1BroadcastFeeMedium?.originalValue || 0) +
      (feeQuote.userFeeMedium?.originalValue || 0);
    break;
  case "slow":
    feeAmountSats = (feeQuote.l1BroadcastFeeSlow?.originalValue || 0) +
      (feeQuote.userFeeSlow?.originalValue || 0);
    break;
}

const withdrawal = await wallet.withdraw({
  onchainAddress: "bc1p...", // Your Bitcoin address
  exitSpeed: exitSpeed,
  amountSats: 50000,
  feeQuoteId: feeQuote.id,
  feeAmountSats: feeAmountSats,
  deductFeeFromWithdrawalAmount: false
});
console.log("Withdrawal initiated:", withdrawal);
```
Track the withdrawal status until completion.
```typescript theme={null}
const exitRequest = await wallet.getCoopExitRequest(withdrawal.id);
console.log("Withdrawal status:", exitRequest.status);
console.log("Transaction ID:", exitRequest.coopExitTxid);
```
***
## Get Withdrawal Fee Quote
Request fee quotes for different withdrawal speeds before initiating a withdrawal.
getWithdrawalFeeQuote(params)
Gets a fee quote for a cooperative exit (on-chain withdrawal). The quote includes options for different speeds and an expiry time and must be passed to `withdraw` before it expires.
```typescript theme={null}
const feeQuote = await wallet.getWithdrawalFeeQuote({
  amountSats: 100000,
  withdrawalAddress: "bc1p5d7rjq7g6rdk2yhzks9smtbqtedr4dekq08ge8ztwac72sfr9rusxg3297"
});
console.log("Fee quote:", feeQuote);
console.log("Fast fees:", feeQuote.l1BroadcastFeeFast, feeQuote.userFeeFast);
console.log("Medium fees:", feeQuote.l1BroadcastFeeMedium, feeQuote.userFeeMedium);
console.log("Slow fees:", feeQuote.l1BroadcastFeeSlow, feeQuote.userFeeSlow);
```
* `amountSats` - The amount in satoshis to withdraw
* `withdrawalAddress` - The Bitcoin address where the funds should be sent

Returns: a fee quote for the withdrawal, or null if not available
***
## Initiate Withdrawal
Start the withdrawal process using a fee quote.
withdraw(params)
Initiates a cooperative exit to withdraw Bitcoin from Spark to Layer 1.
```typescript theme={null}
// Calculate fee based on exit speed
const exitSpeed: "fast" | "medium" | "slow" = "fast";
let feeAmountSats: number;
switch (exitSpeed) {
  case "fast":
    feeAmountSats = (feeQuote.l1BroadcastFeeFast?.originalValue || 0) +
      (feeQuote.userFeeFast?.originalValue || 0);
    break;
  case "medium":
    feeAmountSats = (feeQuote.l1BroadcastFeeMedium?.originalValue || 0) +
      (feeQuote.userFeeMedium?.originalValue || 0);
    break;
  case "slow":
    feeAmountSats = (feeQuote.l1BroadcastFeeSlow?.originalValue || 0) +
      (feeQuote.userFeeSlow?.originalValue || 0);
    break;
}

const withdrawal = await wallet.withdraw({
  onchainAddress: "bc1p5d7rjq7g6rdk2yhzks9smtbqtedr4dekq08ge8ztwac72sfr9rusxg3297",
  exitSpeed: exitSpeed,
  amountSats: 100000,
  feeQuoteId: feeQuote.id,
  feeAmountSats: feeAmountSats,
  deductFeeFromWithdrawalAmount: false
});
console.log("Withdrawal ID:", withdrawal.id);
console.log("Status:", withdrawal.status);
```
* `onchainAddress` - The Bitcoin address to send the funds to
* `exitSpeed` - The withdrawal speed: "fast", "medium", or "slow"
* `amountSats` - The amount in satoshis to withdraw (if not specified, withdraws all available funds)
* `feeQuoteId` - The ID from the fee quote returned by getWithdrawalFeeQuote
* `feeAmountSats` - The fee amount in satoshis based on the chosen exitSpeed
* **Deprecated:** Use feeQuoteId and feeAmountSats instead
* `deductFeeFromWithdrawalAmount` - Whether to deduct the fee from the withdrawal amount (default: true)

Returns: the withdrawal request details, including ID and status
***
## Monitor Withdrawal Status
Track the status of your withdrawal request.
getCoopExitRequest(id)
Gets a cooperative exit request by ID to check withdrawal status.
```typescript theme={null}
const exitRequest = await wallet.getCoopExitRequest(withdrawalId);
console.log("Withdrawal status:", exitRequest.status);
console.log("Transaction ID:", exitRequest.coopExitTxid);
console.log("Fee:", exitRequest.fee.originalValue, exitRequest.fee.originalUnit);

// Check if withdrawal is complete
if (exitRequest.status === "SUCCEEDED") {
  console.log("Withdrawal completed successfully!");
}
```
* `id` - The ID of the cooperative exit request

Returns: the cooperative exit request details, or null if not found
***
## Withdrawal Speeds
Spark offers three withdrawal speed options with different fee structures:
| Speed  | Priority         | Fees          | Estimated time | Best for               |
| ------ | ---------------- | ------------- | -------------- | ---------------------- |
| Fast   | Highest priority | Higher fees   | \~1-2 hours    | Urgent withdrawals     |
| Medium | Balanced option  | Moderate fees | \~4-6 hours    | Most use cases         |
| Slow   | Lowest priority  | Lowest fees   | \~12-24 hours  | Non-urgent withdrawals |
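The fee-selection switch repeated in the examples on this page can be factored into a helper. A sketch assuming the quote field names used here (`l1BroadcastFee*`/`userFee*` with `originalValue` in sats; verify against the SDK's actual quote type):

```typescript theme={null}
type ExitSpeed = "fast" | "medium" | "slow";

interface FeeComponent {
  originalValue: number; // sats
}

interface FeeQuoteLike {
  l1BroadcastFeeFast?: FeeComponent;
  userFeeFast?: FeeComponent;
  l1BroadcastFeeMedium?: FeeComponent;
  userFeeMedium?: FeeComponent;
  l1BroadcastFeeSlow?: FeeComponent;
  userFeeSlow?: FeeComponent;
}

// Total fee in sats for a given exit speed: L1 broadcast fee + user fee.
function feeForSpeed(quote: FeeQuoteLike, speed: ExitSpeed): number {
  switch (speed) {
    case "fast":
      return (quote.l1BroadcastFeeFast?.originalValue ?? 0) +
        (quote.userFeeFast?.originalValue ?? 0);
    case "medium":
      return (quote.l1BroadcastFeeMedium?.originalValue ?? 0) +
        (quote.userFeeMedium?.originalValue ?? 0);
    case "slow":
      return (quote.l1BroadcastFeeSlow?.originalValue ?? 0) +
        (quote.userFeeSlow?.originalValue ?? 0);
  }
}
```

With this helper, the withdrawal call shrinks to `feeAmountSats: feeForSpeed(feeQuote, exitSpeed)`.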
***
## Real-time Withdrawal Monitoring
Monitor withdrawal status by polling `getCoopExitRequest()`.
Withdrawals don't emit events. Poll the status using `getCoopExitRequest()` with the withdrawal ID returned from `withdraw()`.
```typescript theme={null}
// Check withdrawal status periodically
const checkWithdrawalStatus = async (withdrawalId) => {
  const exitRequest = await wallet.getCoopExitRequest(withdrawalId);
  switch (exitRequest.status) {
    case "SUCCEEDED":
      console.log("Withdrawal completed!");
      console.log("On-chain txid:", exitRequest.coopExitTxid);
      return true;
    case "FAILED":
      console.log("Withdrawal failed");
      return false;
    default:
      console.log("Withdrawal pending...");
      setTimeout(() => checkWithdrawalStatus(withdrawalId), 30000); // Check again in 30 seconds
      return false;
  }
};
```
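The setTimeout recursion above works, but it can't be awaited end-to-end. A generic async poll loop composes more cleanly (the interval and attempt limits are arbitrary defaults):

```typescript theme={null}
// Poll `check` until it returns a non-null verdict, or give up.
async function pollUntil<T>(
  check: () => Promise<T | null>,
  intervalMs = 30_000,
  maxAttempts = 60,
): Promise<T> {
  for (let attempt = 0; attempt < maxAttempts; attempt++) {
    const result = await check();
    if (result !== null) return result;
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw new Error(`gave up after ${maxAttempts} attempts`);
}
```

For a withdrawal, the `check` callback would call `wallet.getCoopExitRequest(withdrawal.id)` and return the status once it is "SUCCEEDED" or "FAILED", and `null` while still pending.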
***
## Error Handling
Implement proper error handling for withdrawal operations.
```typescript theme={null}
async function withdrawSafely(params) {
  try {
    // 1. Get fee quote first
    const feeQuote = await wallet.getWithdrawalFeeQuote({
      amountSats: params.amountSats,
      withdrawalAddress: params.onchainAddress
    });
    if (!feeQuote) {
      throw new Error("Unable to get fee quote");
    }

    // 2. Check if quote is still valid
    const now = Date.now();
    if (feeQuote.expiryTime && now > feeQuote.expiryTime) {
      throw new Error("Fee quote expired, please get a new one");
    }

    // 3. Calculate fee based on exit speed
    let feeAmountSats: number;
    switch (params.exitSpeed) {
      case "fast":
        feeAmountSats = (feeQuote.l1BroadcastFeeFast?.originalValue || 0) +
          (feeQuote.userFeeFast?.originalValue || 0);
        break;
      case "medium":
        feeAmountSats = (feeQuote.l1BroadcastFeeMedium?.originalValue || 0) +
          (feeQuote.userFeeMedium?.originalValue || 0);
        break;
      case "slow":
        feeAmountSats = (feeQuote.l1BroadcastFeeSlow?.originalValue || 0) +
          (feeQuote.userFeeSlow?.originalValue || 0);
        break;
      default:
        throw new Error(`Unknown exit speed: ${params.exitSpeed}`);
    }

    // 4. Initiate withdrawal
    const withdrawal = await wallet.withdraw({
      ...params,
      feeQuoteId: feeQuote.id,
      feeAmountSats: feeAmountSats
    });
    console.log("Withdrawal initiated successfully:", withdrawal.id);
    return withdrawal;
  } catch (error) {
    console.error("Withdrawal failed:", error.message);
    // Handle specific error types
    if (error.message.includes("Insufficient")) {
      console.log("Please check your balance");
    } else if (error.message.includes("expired")) {
      console.log("Please get a new fee quote");
    } else if (error.message.includes("Invalid address")) {
      console.log("Please verify the withdrawal address");
    }
    throw error;
  }
}
```
***
## Example: Complete Withdrawal Flow
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
async function completeWithdrawal() {
  const { wallet } = await SparkWallet.initialize({
    options: { network: "REGTEST" }
  });

  try {
    // 1. Check current balance
    const balance = await wallet.getBalance();
    console.log("Current balance:", balance.balance, "sats");

    // 2. Set up event listeners
    wallet.on("transfer:claimed", (transferId, updatedBalance) => {
      console.log(`Transfer ${transferId} claimed. New balance: ${updatedBalance} sats`);
    });

    // 3. Get fee quote
    const feeQuote = await wallet.getWithdrawalFeeQuote({
      amountSats: 50000,
      withdrawalAddress: "bc1p5d7rjq7g6rdk2yhzks9smtbqtedr4dekq08ge8ztwac72sfr9rusxg3297"
    });
    console.log("Fee quote:", feeQuote);

    // 4. Calculate fee based on exit speed
    const exitSpeed: "fast" | "medium" | "slow" = "medium";
    let feeAmountSats: number;
    switch (exitSpeed) {
      case "fast":
        feeAmountSats = (feeQuote.l1BroadcastFeeFast?.originalValue || 0) +
          (feeQuote.userFeeFast?.originalValue || 0);
        break;
      case "medium":
        feeAmountSats = (feeQuote.l1BroadcastFeeMedium?.originalValue || 0) +
          (feeQuote.userFeeMedium?.originalValue || 0);
        break;
      case "slow":
        feeAmountSats = (feeQuote.l1BroadcastFeeSlow?.originalValue || 0) +
          (feeQuote.userFeeSlow?.originalValue || 0);
        break;
    }

    // 5. Initiate withdrawal
    const withdrawal = await wallet.withdraw({
      onchainAddress: "bc1p5d7rjq7g6rdk2yhzks9smtbqtedr4dekq08ge8ztwac72sfr9rusxg3297",
      exitSpeed: exitSpeed,
      amountSats: 50000,
      feeQuoteId: feeQuote.id,
      feeAmountSats: feeAmountSats,
      deductFeeFromWithdrawalAmount: false
    });
    console.log("Withdrawal initiated:", withdrawal.id);

    // 6. Monitor withdrawal status
    const monitorWithdrawal = async () => {
      const exitRequest = await wallet.getCoopExitRequest(withdrawal.id);
      if (exitRequest.status === "SUCCEEDED") {
        console.log("Withdrawal completed successfully!");
        console.log("Transaction ID:", exitRequest.coopExitTxid);
      } else if (exitRequest.status === "FAILED") {
        console.log("Withdrawal failed");
      } else {
        console.log("Withdrawal pending...");
        setTimeout(monitorWithdrawal, 30000); // Check again in 30 seconds
      }
    };
    await monitorWithdrawal();
  } catch (error) {
    console.error("Withdrawal failed:", error);
  }
}
```
# Withdraw to Lightning
Source: https://docs.spark.money/wallets/withdraw-to-lightning
Pay any Lightning invoice directly from your Spark wallet.
***
## Lightning Withdrawal Flow
The complete process for withdrawing Bitcoin from Spark to the Lightning Network:
Obtain a Lightning invoice from the recipient or Lightning service.
```typescript theme={null}
// Example Lightning invoice
const invoice = "lnbcrt1u1pnm7ammpp4v84f05tl0kzt6g95g056athdpp8f8azvg6d7epz74z562ymer9jqsp5nc50gazvp0e98u42jlu653rw0eutcl067nqq924hf89q4la4kd9sxq9z0rgqnp4qdnmwu8v22cvq9xsv2l05cn9rre7xlcgdtntxawf8m0zxq3qemgzqrzjqtr2vd60g57hu63rdqk87u3clac6jlfhej4kldrrjvfcw3mphcw8sqqqqrj0q7ew45qqqqqqqqqqqqqq9qcqzpgdq5w3jhxapqd9h8vmmfvdjs9qyyssqj7lf2w4m587g04n4t0ferdv0vnwftzca0xuc9yxycng78cnhrvmyw2mzaa8t76jskpypqnwqhp9xh0vnwxz90jytd34vrmhcngsnl8qplz7ylk";
console.log("Lightning invoice:", invoice);
```
Get a fee estimate for the Lightning payment to understand costs.
```typescript theme={null}
const feeEstimate = await wallet.getLightningSendFeeEstimate({
  encodedInvoice: invoice
});
console.log("Fee estimate:", feeEstimate);
```
Send the Lightning payment with appropriate fee limits.
```typescript theme={null}
const payment = await wallet.payLightningInvoice({
  invoice: invoice,
  maxFeeSats: 5, // Maximum fee to pay
  preferSpark: true // Prefer Spark transfers when possible
});
console.log("Payment initiated:", payment);
```
Track the payment status until completion.
```typescript theme={null}
const paymentStatus = await wallet.getLightningSendRequest(payment.id);
console.log("Payment status:", paymentStatus.status);
```
***
## Pay Lightning Invoice
Send Bitcoin from your Spark wallet to any Lightning Network invoice.
payLightningInvoice(params)
Pays a Lightning invoice using your Spark wallet balance.
```typescript theme={null}
const payment = await wallet.payLightningInvoice({
  invoice: "lnbcrt1u1pnm7ammpp4v84f05tl0kzt6g95g056athdpp8f8azvg6d7epz74z562ymer9jqsp5nc50gazvp0e98u42jlu653rw0eutcl067nqq924hf89q4la4kd9sxq9z0rgqnp4qdnmwu8v22cvq9xsv2l05cn9rre7xlcgdtntxawf8m0zxq3qemgzqrzjqtr2vd60g57hu63rdqk87u3clac6jlfhej4kldrrjvfcw3mphcw8sqqqqrj0q7ew45qqqqqqqqqqqqqq9qcqzpgdq5w3jhxapqd9h8vmmfvdjs9qyyssqj7lf2w4m587g04n4t0ferdv0vnwftzca0xuc9yxycng78cnhrvmyw2mzaa8t76jskpypqnwqhp9xh0vnwxz90jytd34vrmhcngsnl8qplz7ylk",
  maxFeeSats: 5,
  preferSpark: true,
  amountSatsToSend: 1000 // Only for zero-amount invoices
});
console.log("Payment Response:", payment);
```
* `invoice` - The BOLT11-encoded Lightning invoice to pay
* `maxFeeSats` - Maximum fee in satoshis to pay for the invoice
* `preferSpark` - When true, Spark wallets will initiate a Spark transfer instead of a Lightning transfer if a valid Spark address is found in the invoice
* `amountSatsToSend` - Amount in satoshis to send (only used for zero-amount invoices)

Returns: the Lightning payment request details, including ID and status
***
## Get Fee Estimate
Estimate the fees for a Lightning payment before sending.
getLightningSendFeeEstimate(params)
Gets an estimated fee for sending a Lightning payment.
```typescript theme={null}
const feeEstimate = await wallet.getLightningSendFeeEstimate({
  encodedInvoice: "lnbcrt1u1pnm7ammpp4v84f05tl0kzt6g95g056athdpp8f8azvg6d7epz74z562ymer9jqsp5nc50gazvp0e98u42jlu653rw0eutcl067nqq924hf89q4la4kd9sxq9z0rgqnp4qdnmwu8v22cvq9xsv2l05cn9rre7xlcgdtntxawf8m0zxq3qemgzqrzjqtr2vd60g57hu63rdqk87u3clac6jlfhej4kldrrjvfcw3mphcw8sqqqqrj0q7ew45qqqqqqqqqqqqqq9qcqzpgdq5w3jhxapqd9h8vmmfvdjs9qyyssqj7lf2w4m587g04n4t0ferdv0vnwftzca0xuc9yxycng78cnhrvmyw2mzaa8t76jskpypqnwqhp9xh0vnwxz90jytd34vrmhcngsnl8qplz7ylk"
});
console.log("Estimated fee:", feeEstimate);
```
* `encodedInvoice` - The BOLT11-encoded Lightning invoice

Returns: the estimated fee for the Lightning payment
***
## Monitor Payment Status
Track the status of your Lightning payment.
getLightningSendRequest(id)
Gets a Lightning send request by ID to check payment status.
```typescript theme={null}
const paymentStatus = await wallet.getLightningSendRequest(paymentId);
console.log("Payment status:", paymentStatus.status);
console.log("Payment amount:", paymentStatus.amountSats);

// Check if payment is complete
if (paymentStatus.status === "TRANSFER_COMPLETED") {
  console.log("Payment completed successfully!");
}
```
* `id` - The ID of the Lightning send request

Returns: the Lightning send request details, including status and amount
***
## Zero-Amount Invoices
Spark supports paying zero-amount Lightning invoices, which allow you to specify the amount when making the payment.
Zero-amount invoices are not widely supported across the Lightning Network. Some exchanges, such as Binance, currently do not support them.
### Paying Zero-Amount Invoices
When paying a zero-amount invoice, you need to specify the amount using the `amountSatsToSend` parameter:
```typescript theme={null}
// Pay a zero-amount invoice with a specific amount
const payment = await wallet.payLightningInvoice({
  invoice: "lnbc...", // Zero-amount Lightning invoice
  maxFeeSats: 5,
  amountSatsToSend: 1000, // Specify the amount to send (in satoshis)
});
console.log("Zero-amount Payment Response:", payment);
```
The `amountSatsToSend` parameter is only used for zero-amount invoices. For regular invoices with a fixed amount, this parameter is ignored.
***
## Spark Transfer Preference
When paying Lightning invoices, you can enable Spark transfer preference to automatically use Spark transfers when possible.
```typescript theme={null}
// Pay with Spark preference enabled
const payment = await wallet.payLightningInvoice({
  invoice: "lnbc...", // Lightning invoice (potentially with an embedded Spark address)
  maxFeeSats: 5,
  preferSpark: true, // Defaults to false
});
console.log("Payment Response:", payment);
```
When `preferSpark` is set to `true`, Spark wallets will:
* Initiate a Spark transfer instead of a Lightning transfer if a valid Spark address is found in the invoice
* Fall back to regular Lightning payment if no Spark address is found
***
## Real-time Payment Monitoring
Monitor Lightning payment status by polling.
The `transfer:claimed` event does **not** fire for outgoing Lightning payments. Use `getLightningSendRequest()` to poll payment status.
```typescript theme={null}
// For outgoing Lightning payments, poll the status:
const checkPaymentStatus = async (paymentId: string): Promise<boolean> => {
  while (true) {
    const paymentStatus = await wallet.getLightningSendRequest(paymentId);
    switch (paymentStatus.status) {
      case "TRANSFER_COMPLETED":
        console.log("Lightning payment completed!");
        return true;
      case "TRANSFER_FAILED":
        console.log("Lightning payment failed");
        return false;
      default:
        console.log("Lightning payment pending...");
        await new Promise((resolve) => setTimeout(resolve, 5000)); // Check again in 5 seconds
    }
  }
};
// For INCOMING transfers/deposits, use events:
wallet.on("transfer:claimed", (transferId, updatedBalance) => {
  console.log(`Incoming transfer ${transferId} claimed. New balance: ${updatedBalance} sats`);
});
```
***
## Fee Recommendations
We recommend setting the maximum routing fee to whichever is greater:
* **5 sats** (minimum fee)
* **17 bps × transaction amount** (0.17% of the transaction)
```typescript theme={null}
// Calculate recommended fee
const amountSats = 10000; // Your payment amount
const recommendedFee = Math.max(5, Math.ceil(amountSats * 0.0017)); // 17 bps = 0.17%
const payment = await wallet.payLightningInvoice({
  invoice: "lnbc...",
  maxFeeSats: recommendedFee,
});
```
***
## Error Handling
Implement proper error handling for Lightning payment operations.
```typescript theme={null}
async function payLightningSafely(invoice: string, maxFeeSats: number) {
  try {
    // 1. Get a fee estimate first
    const feeEstimate = await wallet.getLightningSendFeeEstimate({
      encodedInvoice: invoice,
    });
    console.log("Estimated fee:", feeEstimate);

    // 2. Check that the fee is within limits
    if (feeEstimate > maxFeeSats) {
      throw new Error(`Estimated fee (${feeEstimate}) exceeds maximum (${maxFeeSats})`);
    }

    // 3. Pay the invoice
    const payment = await wallet.payLightningInvoice({
      invoice,
      maxFeeSats,
    });
    console.log("Payment initiated successfully:", payment.id);
    return payment;
  } catch (error) {
    // `error` is `unknown` in a TypeScript catch clause; narrow before reading .message
    const message = error instanceof Error ? error.message : String(error);
    console.error("Lightning payment failed:", message);

    // Handle specific error types
    if (message.includes("Insufficient")) {
      console.log("Please check your balance");
    } else if (message.includes("expired")) {
      console.log("Invoice has expired, please get a new one");
    } else if (message.includes("Invalid invoice")) {
      console.log("Please verify the invoice format");
    }
    throw error;
  }
}
```
***
## Checking Balance
You can use `getBalance()` to check your wallet's balance after sending payments.
The `getBalance()` method returns a Promise resolving to an object containing:
* `balance`: A `bigint` representing the total amount in satoshis
* `tokenBalances`: A Map of token balances, where each entry contains:
* `balance`: A `bigint` representing the token amount
* `tokenInfo`: Information about the specific token the wallet is holding
```typescript theme={null}
const balanceInfo = await wallet.getBalance();
console.log("Balance:", balanceInfo.balance, "sats");
```
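To illustrate the shape described above, here is a self-contained sketch that iterates a `tokenBalances` map. The sample values are made up; in practice the object comes from `await wallet.getBalance()`, and the fields inside `tokenInfo` are omitted here:

```typescript
// Illustrative shape of the getBalance() result described above.
// Sample values are made up; a real object comes from `await wallet.getBalance()`.
interface TokenBalanceEntry {
  balance: bigint;    // token amount
  tokenInfo: unknown; // token metadata (fields omitted in this sketch)
}

const balanceInfo = {
  balance: 1500n, // total sats, as a bigint
  tokenBalances: new Map<string, TokenBalanceEntry>([
    ["tokenA", { balance: 250000n, tokenInfo: null }],
    ["tokenB", { balance: 42n, tokenInfo: null }],
  ]),
};

console.log("Sats balance:", balanceInfo.balance.toString());
for (const [id, entry] of balanceInfo.tokenBalances) {
  console.log(`Token ${id}: ${entry.balance.toString()}`);
}
```

Note that both balances are `bigint`, so convert with `.toString()` (or `Number()` for small values) before formatting.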
***
## Complete Example
```typescript theme={null}
import { SparkWallet } from "@buildonspark/spark-sdk";
async function payLightningInvoice() {
  const { wallet } = await SparkWallet.initialize({
    options: { network: "REGTEST" },
  });

  try {
    // 1. Set up event listeners for incoming transfers
    wallet.on("transfer:claimed", (transferId, updatedBalance) => {
      console.log(`Transfer ${transferId} claimed. New balance: ${updatedBalance} sats`);
    });

    // 2. Example Lightning invoice (replace with a real invoice)
    const invoice = "lnbcrt1u1pnm7ammpp4v84f05tl0kzt6g95g056athdpp8f8azvg6d7epz74z562ymer9jqsp5nc50gazvp0e98u42jlu653rw0eutcl067nqq924hf89q4la4kd9sxq9z0rgqnp4qdnmwu8v22cvq9xsv2l05cn9rre7xlcgdtntxawf8m0zxq3qemgzqrzjqtr2vd60g57hu63rdqk87u3clac6jlfhej4kldrrjvfcw3mphcw8sqqqqrj0q7ew45qqqqqqqqqqqqqq9qcqzpgdq5w3jhxapqd9h8vmmfvdjs9qyyssqj7lf2w4m587g04n4t0ferdv0vnwftzca0xuc9yxycng78cnhrvmyw2mzaa8t76jskpypqnwqhp9xh0vnwxz90jytd34vrmhcngsnl8qplz7ylk";

    // 3. Get fee estimate
    const feeEstimate = await wallet.getLightningSendFeeEstimate({
      encodedInvoice: invoice,
    });
    console.log("Fee estimate:", feeEstimate);

    // 4. Calculate recommended fee: max(5 sats, 17 bps of the amount)
    const amountSats = 1000; // Example amount
    const recommendedFee = Math.max(5, Math.ceil(amountSats * 0.0017));

    // 5. Pay the invoice
    const payment = await wallet.payLightningInvoice({
      invoice,
      maxFeeSats: recommendedFee,
      preferSpark: true,
    });
    console.log("Payment initiated:", payment.id);

    // 6. Monitor payment status until it settles
    while (true) {
      const paymentStatus = await wallet.getLightningSendRequest(payment.id);
      if (paymentStatus.status === "TRANSFER_COMPLETED") {
        console.log("Lightning payment completed successfully!");
        console.log("Amount sent:", paymentStatus.amountSats, "sats");
        break;
      }
      if (paymentStatus.status === "TRANSFER_FAILED") {
        console.log("Lightning payment failed");
        break;
      }
      console.log("Lightning payment pending...");
      await new Promise((resolve) => setTimeout(resolve, 5000)); // Check again in 5 seconds
    }
  } catch (error) {
    console.error("Lightning payment failed:", error);
  }
}
```