Spark Wallet SDK
This workspace is the semi-official Rust development environment for Spark. This crate forms a complete wallet with all necessary Spark utilities. The cryptographic primitives are provided by the spark-cryptography crate.
Overview
Spark Wallet SDK has 5 components:
- config: This module contains the configuration for the Spark wallet, as found in the config directory.
- handlers: This module contains the user-facing APIs for the Spark wallet, as found in the handlers directory. Examples illustrating typical usage are provided below.
- internal_handlers: Contains the internal service handlers for coordinating signing processes and Spark RPC communications, as documented in the internal_handlers directory.
- rpc: Provides an RPC client for establishing secure connections to Spark nodes, handling TLS configurations, and creating service-specific clients.
- signer: Provides comprehensive key management, storage, and signing capabilities, fully conforming to the traits found in src/signer/traits. In addition, a convenient built-in signer (default_signer.rs) is included for quick and straightforward integration.
Installation
Make sure that you have the Protocol Buffers compiler (protoc) installed.
# Install protobuf, which provides protoc
brew install protobuf
Also, make sure you have at least Rust version 1.75.0. Ideally, you should use the latest stable version.
# For version 1.75.0
rustup install 1.75.0
# For the latest stable version
rustup update stable
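Then add the crate to your project's Cargo.toml. A minimal dependency entry might look like the following (0.1.7 was the latest release at the time of writing; check crates.io for the current version, and note that tokio is needed because the SDK's APIs are async):

```toml
[dependencies]
spark-rust = "0.1.7"
tokio = { version = "1", features = ["full"] }
```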
Quick Start
use spark_rust::SparkSdk;
use spark_rust::SparkNetwork;
use spark_rust::signer::default_signer::DefaultSigner;
use std::time::Duration;
use tokio::time::sleep;

#[tokio::main]
async fn main() -> Result<(), SparkSdkError> {
    // Initialize the default signer. Alternatively, you can create a custom
    // signer, as long as it implements all the necessary signing traits. In
    // that case, it is your responsibility to make sure that the signer is
    // safe to use and works as expected.
    let mnemonic = "abandon ability able about above absent absorb abstract absurd \
                    abuse access accident";
    let default_signer = DefaultSigner::from_mnemonic(mnemonic, SparkNetwork::Regtest).await?;
    let sdk = SparkSdk::new(SparkNetwork::Regtest, default_signer).await?;

    // Generate a deposit address. Note: this deposit address is one-time-use only!
    let deposit_address = sdk.generate_deposit_address().await?;
    println!("Deposit address: {}", deposit_address.deposit_address);

    // Send a deposit to this address on L1, and Spark will detect it. You can
    // choose the amount of sats. Here, `l1_wallet` stands in for your own
    // on-chain Bitcoin wallet; this line sends 100,000 sats to the deposit address.
    let txid = l1_wallet.send_to_address(deposit_address.deposit_address, 100_000).await?;

    // For Regtest, sleep for 30 seconds while the deposit confirms.
    sleep(Duration::from_secs(30)).await;

    // Claim the deposit.
    let deposits = sdk.claim_deposit(txid).await?;
    let balance = sdk.get_bitcoin_balance();
    assert_eq!(balance, 100_000);

    // Also, query all the incoming transfers from other Spark users.
    println!("Querying pending transfers...");
    let pending = sdk.query_pending_transfers().await?;
    for transfer in &pending {
        println!("Transfer: {:?} satoshis", transfer.total_value);
    }

    // So far, you have NOT claimed these transfers. Claim them by calling
    // `sdk.claim_transfers()`.
    sdk.claim_transfers().await?;

    // And now, your Bitcoin balance should be updated.
    let balance = sdk.get_bitcoin_balance();
    println!("Balance: {} sats", balance);

    Ok(())
}
API Documentation
Below you will find the primary wallet API documentation. For developers interested in implementing custom signers, refer to the signer documentation at the end.
Conceptual Overview: SSP and Fees in Spark
The Spark network operates with fee structures that are different from traditional Bitcoin or Lightning wallets. All fees in Spark are service fees charged by the Spark Service Provider (SSP) for various operations they facilitate on behalf of users.
Fee Overview
- All fees are taken by the SSP - The Spark Service Provider charges service fees for operations they perform on your behalf.
- No direct mining fees - You don't directly pay Bitcoin mining fees when using Spark. These are handled by the SSP when they interact with the Bitcoin network.
- Fee estimation - Before performing fee-incurring operations, you can use the estimation methods to determine the cost.
- Common fee operations include:
- Lightning payments (sending and receiving)
- Leaves swaps (optimizing your wallet structure)
- Cooperative exits (withdrawing to on-chain Bitcoin)
Types of Fees
- Lightning Send Fees - Charged when you pay a Lightning invoice through the SSP.
- Lightning Receive Fees - Charged when you receive Lightning payments through the SSP.
- Cooperative Exit Fees - Charged when you withdraw funds from Spark to an on-chain Bitcoin address.
- Leaves Swap Fees - Charged when you optimize your wallet leaf structure through the SSP.
Each fee type has a corresponding estimation method that helps you determine the cost before performing the actual operation. The fee structure is designed to be transparent and predictable.
Initialize the SDK
Use the new
method to create a new instance of the Spark SDK. This is the main entry point for interacting with the Spark protocol.
Parameters
- network: SparkNetwork - The Spark network to connect to (e.g., Regtest or Mainnet)
- signer: S where S: SparkSigner - Implementation of the SparkSigner trait for secure key management
Response
Returns a Result<SparkSdk, SparkSdkError>
, which contains:
• The initialized SparkSdk instance if successful
• A SparkSdkError if initialization fails
Internally, this constructor:
- Creates the wallet configuration for the specified network
- Initializes the leaf manager to track UTXOs
- Authenticates with the Spark network
- Synchronizes Bitcoin leaves to establish the wallet state
Steps
- Create a signer implementation (typically DefaultSigner)
- Call SparkSdk::new(network, signer) with the desired network and signer
- The SDK connects to the network, authenticates, and initializes its state
Example
// Create a signer using a mnemonic phrase
let mnemonic = "abandon ability able about above absent absorb abstract absurd abuse access accident";
let network = SparkNetwork::Regtest;
let signer = DefaultSigner::from_mnemonic(mnemonic, network.clone()).await?;
// Initialize the SDK with the signer and network
let sdk = SparkSdk::new(network, signer).await?;
// The SDK is now ready to use
println!("SDK initialized successfully");
Get Spark Address
Use the get_spark_address
method to retrieve the Spark address of the wallet, which is derived from the wallet's identity public key.
Parameters
None - This method doesn't require any parameters.
Response
Returns a Result<PublicKey, SparkSdkError>
, which contains:
• A PublicKey representing the wallet's identity public key
• This key serves as the wallet's unique identifier on the Spark network
The Spark address serves several purposes:
- Authenticates the wallet with Spark operators during API calls
- Used in deposit address generation to prove ownership
- Required for validating operator signatures
- Helps prevent unauthorized access to wallet funds
Steps
- Call sdk.get_spark_address().
- The SDK returns the PublicKey that represents the wallet's identity.
Example
// Get the wallet's Spark address
let spark_address = sdk.get_spark_address()?;
// This address is a compressed secp256k1 public key in SEC format (33 bytes)
println!("Your Spark address: {}", spark_address);
// You can share this address with others so they can send you funds
let serialized_address = spark_address.serialize();
assert_eq!(serialized_address.len(), 33); // 33-byte compressed format
Get Network
Use the get_network
method to retrieve the Bitcoin network that this wallet is connected to.
Parameters
None - This method doesn't require any parameters.
Response
Returns a SparkNetwork
enum indicating whether this is a mainnet or regtest wallet.
The network determines which Spark operators the wallet communicates with and which Bitcoin network (mainnet or regtest) is used for transactions. It's set when creating the wallet and cannot be changed after initialization.
Steps
- Call sdk.get_network().
- The SDK returns the SparkNetwork enum value.
Example
// Get the network this wallet is connected to
let network = sdk.get_network();
// You can use this to display appropriate information in your UI
match network {
SparkNetwork::Mainnet => println!("Connected to Spark Mainnet"),
SparkNetwork::Regtest => println!("Connected to Spark Regtest (testing network)"),
}
// Or use it for conditional logic
if network == SparkNetwork::Regtest {
println!("This is a test wallet - don't use real funds!");
}
Generate Deposit Address
Use the generate_deposit_address
method to obtain a unique, one-time-use deposit address for Spark. This method returns a
GenerateDepositAddressSdkResponse
, which explicitly contains:
• A deposit address of type bitcoin::Address
• A signing public key of type bitcoin::secp256k1::PublicKey
• A verifying public key of type bitcoin::secp256k1::PublicKey
Internally, Spark combines the user's signing public key with a Spark Operator public key to derive a taproot address.
Steps
- Call sdk.generate_deposit_address().
- Spark returns a GenerateDepositAddressSdkResponse containing all three fields.
- Normally, you only need the deposit address. For advanced use cases, you can also use the signing public key and verifying public key.
Example
// 1. Calling Spark to generate all three fields in GenerateDepositAddressSdkResponse.
let generate_deposit_response = sdk.generate_deposit_address().await?;
// 2. This deposit address (bitcoin::Address) is a one-time address you can use to send funds on L1.
let deposit_address = generate_deposit_response.deposit_address;
// 3. The signing public key (bitcoin::secp256k1::PublicKey) for the deposit address,
// generally managed internally by the SDK.
let signing_public_key = generate_deposit_response.signing_public_key;
// 4. The verifying public key (bitcoin::secp256k1::PublicKey),
// used to verify threshold signatures (not typically needed directly).
let verifying_public_key = generate_deposit_response.verifying_public_key;
Claim Deposit
Use the claim_deposit
method to claim funds that have been deposited to a Spark deposit address.
Parameters
- txid: String - The transaction ID of the L1 transaction that sent funds to the Spark deposit address
Response
Returns a Result<Vec<TreeNode>, SparkSdkError>
, which explicitly contains:
• A vector of TreeNode
objects representing the claimed deposits
• Each TreeNode is returned by the Spark Operators and contains details about the deposit such as amount and status
• TreeNode
is a tonic
message type pre-compiled using Spark's official protobuf
definitions
Internally, Spark processes the L1 transaction, verifies the deposit, and adds it to your wallet balance, making the funds available for use in the Spark network.
Steps
- Call sdk.claim_deposit(txid) with the transaction ID of your deposit.
- Spark processes the deposit and returns a vector of TreeNode objects.
- The funds are now available in your Spark wallet.
Example
// 1. Generate a deposit address first (as shown in the previous example)
let deposit_address = sdk.generate_deposit_address().await?;
// 2. Send bitcoin to this address on L1 (using an L1 wallet)
let txid = l1_wallet.send_to_address(deposit_address.deposit_address, 100_000).await?;
// 3. Wait for the transaction to be confirmed
// For Regtest, this will take around 30 seconds
sleep(Duration::from_secs(30)).await;
// 4. Claim the deposit using the transaction ID
let deposits = sdk.claim_deposit(txid).await?;
// 5. Verify the balance has been updated
let balance = sdk.get_bitcoin_balance();
assert_eq!(balance, 100_000);
Query Unused Deposit Addresses
Use the query_unused_deposit_addresses
method to retrieve all unused deposit addresses that have been previously generated for your wallet. This helps you track deposit addresses that you've created but haven't received funds on yet.
Parameters
None - This method doesn't require any parameters.
Response
Returns a Result<Vec<DepositAddressQueryResult>, SparkSdkError>
, which explicitly contains:
• A vector of DepositAddressQueryResult
objects representing unused deposit addresses
• Each result contains the deposit address and associated metadata
• DepositAddressQueryResult
is a tonic
message type pre-compiled using Spark's official protobuf
definitions
Internally, Spark queries the network for all deposit addresses associated with your identity public key that haven't been used for deposits yet.
Steps
- Call sdk.query_unused_deposit_addresses().
- Spark returns a vector of DepositAddressQueryResult objects representing all unused deposit addresses.
- You can use these addresses to track expected deposits or for reconciliation purposes.
Example
// Query all unused deposit addresses associated with your wallet
let unused_addresses = sdk.query_unused_deposit_addresses().await?;
// Process each unused address
for address_result in unused_addresses {
println!("Unused address: {}", address_result.deposit_address);
// You might want to check if these addresses have received funds on L1
// or display them to users who are expected to make deposits
}
// You can also count how many unused addresses you have
println!("You have {} unused deposit addresses", unused_addresses.len());
Finalize Deposit
This is an advanced method that allows you to finalize a deposit without using the claim_deposit
method for custom use cases. Use the finalize_deposit
method to finalize the claiming process for funds deposited to a Spark deposit address. Note: Users typically do not need to call this method directly, as claim_deposit
automatically calls it internally. This method is provided for advanced use cases where you need to override the default claiming logic.
Parameters
- signing_pubkey: Vec<u8> - Binary representation of the signing public key used for the deposit
- verifying_pubkey: Vec<u8> - Binary representation of the verifying public key used for the deposit
- deposit_tx: bitcoin::Transaction - The full Bitcoin transaction containing the deposit
- vout: u32 - The output index in the transaction that contains the deposit
Response
Returns a Result<TreeNode, SparkSdkError>
, which explicitly contains:
• A TreeNode
object representing the finalized deposit
• Contains details about the deposit such as amount and status
• TreeNode
is a tonic
message type pre-compiled using Spark's official protobuf
definitions
Internally, Spark finalizes the deposit process by submitting the provided parameters to Spark Operators, who verify and process the deposit, making the funds available in your wallet.
Steps
- Call sdk.finalize_deposit() with the required parameters.
- Spark processes the finalization request and returns a TreeNode object.
- The funds are now available in your Spark wallet.
Example
// STANDARD APPROACH: In most cases, you would simply use claim_deposit:
// let deposits = sdk.claim_deposit(txid).await?;
// ADVANCED APPROACH: Only if you need to bypass claim_deposit for custom logic:
// 1. Get the Bitcoin transaction containing the deposit
let deposit_tx = bitcoin_client.get_transaction(txid).await?;
// 2. Identify which output contains the deposit (custom logic)
let vout = 0; // Example: using custom logic to determine output index
// 3. Get the signing and verifying public keys from your deposit tracking system
let signing_pubkey = your_custom_storage.get_signing_pubkey_for_deposit(txid).await?;
let verifying_pubkey = your_custom_storage.get_verifying_pubkey_for_deposit(txid).await?;
// 4. Call finalize_deposit directly (bypassing claim_deposit)
let deposit = sdk.finalize_deposit(
signing_pubkey,
verifying_pubkey,
deposit_tx,
vout
).await?;
// The funds are now available in your wallet
let balance = sdk.get_bitcoin_balance();
Query Pending Transfers
Use the query_pending_transfers
method to retrieve all pending transfers where the current user is the receiver. A pending transfer represents funds that have been sent to the user but have not yet been claimed. The transfers remain in a pending state until the receiver claims them, at which point the funds become available in their wallet.
This function does not claim any pending transfers. To claim a transfer, you should call claim_transfers()
. This will execute key tweaking, which is the core of Spark's security mechanism. Before the receiver tweaks the keys, the transfer is not final.
Parameters
None - This method doesn't require any parameters.
Response
Returns a Result<Vec<Transfer>, SparkSdkError>
, which explicitly contains:
• A vector of Transfer
objects representing pending transfers
• Each Transfer
contains details about the pending transfer such as amount, sender, and status
• Transfer
is a tonic
message type pre-compiled using Spark's official protobuf
definitions
Internally, Spark queries the network for all pending transfers associated with the user's identity public key.
Steps
- Call sdk.query_pending_transfers().
- Spark returns a vector of Transfer objects representing all pending transfers.
- You can then process these pending transfers as needed, such as displaying them to the user or accepting them.
Example
// Query all pending transfers where the current user is the receiver
let pending = sdk.query_pending_transfers().await?;
// Process each pending transfer
for transfer in pending {
println!("Pending transfer: {:?} satoshis", transfer.total_value);
// You might want to automatically accept transfers or display them to the user
// For example:
// if should_auto_accept(&transfer) {
// sdk.accept_transfer(transfer.id).await?;
// }
}
Transfer Funds
Use the transfer
method to send funds from your wallet to another Spark user. This initiates a transfer process where the funds are removed from your wallet and become available for the recipient to claim.
Parameters
- amount: u64 - The amount to transfer in satoshis. Must be greater than the dust limit, and the wallet must have a leaf with exactly this amount.
- receiver_spark_address: &bitcoin::secp256k1::PublicKey - The Spark address identifying the receiver of the transfer. This should be the receiver's identity public key, not a regular Bitcoin public key.
Response
Returns a Result<String, SparkSdkError>
, which explicitly contains:
• A String representing the transfer ID if successful
• This ID can be used to track the status of the transfer
Internally, Spark handles the process of transferring funds by selecting appropriate leaves (UTXOs), locking them, generating new signing keys, creating and signing the transfer transaction, and removing the used leaves from your wallet.
Steps
- Call sdk.transfer(amount, &receiver_spark_address) with the amount and receiver's Spark address.
- Spark selects appropriate leaves (UTXOs) containing sufficient funds for the transfer.
- Spark generates new signing keys and creates the transfer transaction.
- The transfer is submitted to the Spark network, and the leaves are removed from your wallet.
- The transfer remains in a pending state until the receiver claims it (expiry is set to 30 days by default).
Example
// Define the amount to transfer (in satoshis)
let amount = 100_000;
// Get the recipient's Spark address (which is their public key)
// This can be shared between users in your application
let receiver_spark_address = PublicKey::from_str(
"02782d7ba8764306bd324e23082f785f7c880b7202cb10c85a2cb96496aedcaba7"
).unwrap();
// Send the transfer
let transfer_id_string = sdk.transfer(amount, &receiver_spark_address).await?;
// The transfer ID is a UUID string that can be parsed and stored
let transfer_id = Uuid::parse_str(&transfer_id_string).unwrap();
println!("Transfer successfully initiated with ID: {}", transfer_id);
// The recipient will need to call query_pending_transfers() and claim_transfer()
// to receive these funds
Transfer Specific Leaves (Advanced)
This is an advanced method intended for specialized use cases where you need precise control over which leaves (UTXOs) are used in a transfer. Most users should use the standard transfer(amount, receiver)
method instead.
Use the transfer_leaf_ids
method to transfer specific leaves from your wallet to another Spark user by directly providing the leaf IDs to be transferred.
Parameters
- leaf_ids: Vec<String> - Vector of leaf IDs to transfer. Each ID identifies a specific UTXO in your wallet.
- receiver_identity_pubkey: &PublicKey - The Spark address identifying the receiver of the transfer. This should be the receiver's identity public key.
Response
Returns a Result<String, SparkSdkError>
, which explicitly contains:
• A String representing the transfer ID if successful
• This ID can be used to track the status of the transfer
Internally, this method follows a similar process to the standard transfer, but instead of selecting leaves based on an amount, it uses the exact leaves specified by their IDs.
Steps
- Call sdk.transfer_leaf_ids(leaf_ids, &receiver_spark_address) with the leaf IDs and receiver's Spark address.
- Spark locks the specified leaves and generates new signing keys.
- The transfer is created, signed, and submitted to the Spark network.
- The specified leaves are removed from your wallet.
- The transfer remains in a pending state until the receiver claims it (expiry is set to 30 days by default).
Example
// Get specific leaf IDs from your wallet that you want to transfer
// This requires knowledge of your wallet's internal leaf structure
let leaf_ids = vec!["leaf_id_1".to_string(), "leaf_id_2".to_string()];
// Get the recipient's Spark address
let receiver_spark_address = PublicKey::from_str(
"02782d7ba8764306bd324e23082f785f7c880b7202cb10c85a2cb96496aedcaba7"
).unwrap();
// Transfer the specified leaves
let transfer_id_string = sdk.transfer_leaf_ids(leaf_ids, &receiver_spark_address).await?;
// The transfer ID can be parsed and stored
let transfer_id = Uuid::parse_str(&transfer_id_string).unwrap();
println!("Leaf transfer initiated with ID: {}", transfer_id);
Claim Transfer
Use the claim_transfer
method to claim a specific pending transfer that was sent to your wallet. This method processes a pending transfer and adds the funds to your wallet balance.
Parameters
- transfer: Transfer - The pending transfer to claim; it must be in SenderKeyTweaked status
Response
Returns a Result<(), SparkSdkError>
, which indicates:
• Success (Ok) if the transfer was successfully claimed
• Error (Err) if there was an issue during the claim process
Internally, Spark performs several security-critical steps:
- Verifies the transfer is in the correct state (SenderKeyTweaked)
- Verifies and decrypts the leaf private keys using your identity key
- Generates new signing keys for the claimed leaves
- Finalizes the transfer by tweaking the leaf keys, signing refund transactions, and submitting signatures
Steps
- Obtain a pending transfer (typically from query_pending_transfers()).
- Call sdk.claim_transfer(transfer) with the transfer object.
- Spark processes the transfer and adds the funds to your wallet.
- The funds are now available for use in your wallet.
Example
// First get pending transfers
let pending = sdk.query_pending_transfers().await?;
// Then claim each transfer individually
for transfer in pending {
    let transfer_id = transfer.id.clone();
    sdk.claim_transfer(transfer).await?;
    println!("Successfully claimed transfer: {}", transfer_id);
}
// Verify your updated balance
let balance = sdk.get_bitcoin_balance();
println!("Updated balance: {} satoshis", balance);
Claim All Transfers
Use the claim_transfers
method to claim all pending transfers sent to your wallet in a single operation. This convenience method automatically retrieves all pending transfers and claims them for you.
Parameters
None - This method doesn't require any parameters.
Response
Returns a Result<(), SparkSdkError>
, which indicates:
• Success (Ok) if all transfers were successfully claimed
• Error (Err) if there was an issue during the claim process
Internally, this method:
- Calls query_pending_transfers() to get all pending transfers
- Processes each transfer in parallel using claim_transfer
- Returns success only if all transfers are claimed successfully
Steps
- Call sdk.claim_transfers().
- Spark automatically retrieves and processes all pending transfers.
- The funds from all claimed transfers are added to your wallet.
Example
// Claim all pending transfers in a single call
sdk.claim_transfers().await?;
println!("Successfully claimed all pending transfers");
// Verify your updated balance
let balance = sdk.get_bitcoin_balance();
println!("Updated balance: {} satoshis", balance);
// You can also check if there are any remaining pending transfers
// (there shouldn't be any if claim_transfers was successful)
let pending = sdk.query_pending_transfers().await?;
assert!(pending.is_empty(), "All transfers should have been claimed");
Get All Transfers (Get transfer history)
Use the get_all_transfers
method to retrieve the history of all transfers (both sent and received) associated with your wallet. This method supports pagination to manage large transfer histories.
Parameters
- limit: Option<u32> - Optional maximum number of transfers to return (defaults to 20 if not specified)
- offset: Option<u32> - Optional number of transfers to skip (defaults to 0 if not specified)
Response
Returns a Result<QueryAllTransfersResponse, SparkSdkError>
, which explicitly contains:
• A QueryAllTransfersResponse
object containing the list of transfers
• This response includes both sent and received transfers
• Each transfer contains details such as amount, sender, receiver, status, and timestamp
• QueryAllTransfersResponse
is a tonic
message type pre-compiled using Spark's official protobuf
definitions
Internally, Spark queries the network for all transfers associated with your identity public key and applies the pagination parameters.
Steps
- Call sdk.get_all_transfers(limit, offset) with optional pagination parameters.
- Spark returns a QueryAllTransfersResponse containing the requested transfers.
- You can process these transfers as needed, such as displaying them in a transaction history UI.
Example
// Get the first 20 transfers (default pagination)
let first_page = sdk.get_all_transfers(None, None).await?;
println!("First page of transfers: {}", first_page.transfers.len());
// Display transfer details
for transfer in &first_page.transfers {
println!("Transfer ID: {}, Amount: {} sats, Status: {}",
transfer.id,
transfer.total_value,
transfer.status);
}
// Get the next 20 transfers (pagination)
let second_page = sdk.get_all_transfers(Some(20), Some(20)).await?;
println!("Second page of transfers: {}", second_page.transfers.len());
// You can implement pagination controls in your UI
let page_size = 10;
let page_number = 3; // 0-indexed
let transfers = sdk.get_all_transfers(
Some(page_size),
Some(page_size * page_number)
).await?;
Get Bitcoin Balance
Use the get_bitcoin_balance
method to retrieve the current total balance of your wallet in satoshis.
Parameters
None - This method doesn't require any parameters.
Response
Returns a u64
value representing the total available balance in satoshis.
Internally, Spark calculates this by summing the value of all available leaves (UTXOs) in your wallet.
Steps
- Call sdk.get_bitcoin_balance().
- Spark returns the total balance as a u64 value.
Example
// Get the current wallet balance
let balance = sdk.get_bitcoin_balance();
println!("Your current balance is {} satoshis", balance);
// You can also use this to check if you have enough funds for a transfer
let amount_to_send = 50_000;
if balance >= amount_to_send {
sdk.transfer(amount_to_send, &receiver_spark_address).await?;
} else {
println!("Insufficient funds: you need {} but only have {}",
amount_to_send, balance);
}
Sync Wallet
Use the sync_wallet
method to perform a comprehensive synchronization of your wallet with the Spark network. This is a convenience method that executes multiple synchronization operations in a single call.
Parameters
None - This method doesn't require any parameters.
Response
Returns a Result<(), SparkSdkError>
, which indicates:
• Success (Ok) if all synchronization operations completed successfully
• Error (Err) if there was an issue during any synchronization step
Internally, this method performs the following operations in sequence:
- Claims all pending Bitcoin transfers
- Synchronizes all leaves (UTXOs) with the Spark network
- Optimizes leaf distribution for efficient wallet operation
Steps
- Call sdk.sync_wallet().
- Spark automatically performs all synchronization operations.
- Your wallet state is updated with the latest information from the network.
Example
// Perform a full wallet synchronization
sdk.sync_wallet().await?;
println!("Wallet successfully synchronized with the network");
// After syncing, you'll have the most up-to-date balance
let updated_balance = sdk.get_bitcoin_balance();
println!("Updated balance: {} satoshis", updated_balance);
// Your wallet will also have claimed all pending transfers
let pending = sdk.query_pending_transfers().await?;
assert!(pending.is_empty(), "All transfers should have been claimed during sync");
Request Leaves Swap (Advanced)
This is an advanced method that allows you to optimize your wallet's leaf structure by swapping your current leaves with the Spark Service Provider (SSP). This function is primarily used internally by the SDK when you need to transfer an amount that doesn't match any of your existing leaves.
For example, if you have a single leaf of 100,000 satoshis but need to send 80,000 satoshis, this function will swap with the SSP to get leaves totaling 100,000 satoshis but with denominations that include the 80,000 you need. The SSP typically provides leaves in power-of-2 denominations for optimal efficiency.
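To illustrate the denomination idea, a remainder amount can be broken greedily into power-of-2 pieces. This is only a sketch of the concept, not the SDK's or SSP's actual leaf-selection algorithm:

```rust
// Hypothetical sketch: greedily decompose an amount (in sats) into
// power-of-2 denominations, as an SSP might when returning change leaves.
fn power_of_two_denominations(mut amount: u64) -> Vec<u64> {
    let mut denoms = Vec::new();
    while amount > 0 {
        // Largest power of two not exceeding the remaining amount.
        let d = 1u64 << (63 - amount.leading_zeros());
        denoms.push(d);
        amount -= d;
    }
    denoms
}

fn main() {
    // You hold a single 100_000-sat leaf but want to send exactly 80_000.
    // After a swap you might hold one 80_000-sat leaf for the transfer,
    // plus the 20_000-sat remainder split into power-of-2 denominations.
    let change = power_of_two_denominations(20_000);
    assert_eq!(change.iter().sum::<u64>(), 20_000);
    println!("{:?}", change); // [16384, 2048, 1024, 512, 32]
}
```

The greedy split always terminates because each step removes the highest set bit of the remaining amount.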
Parameters
- target_amount: u64 - The amount (in satoshis) you want to have in a specific leaf after the swap
Response
Returns a Result<String, SparkSdkError>
, which explicitly contains:
• A String representing the ID of the newly created leaf with the target amount
• This leaf ID can be used for future transfers
Internally, this method:
- Locks all available Bitcoin leaves in your wallet
- Prepares leaf key tweaks for each leaf
- Creates a transfer to the SSP with all your available leaves
- Uses cryptographic adaptor signatures for security
- Requests new leaves from the SSP with your desired target amount
- Verifies the cryptographic integrity of the returned leaves
- Completes the swap process and claims the new leaves
- Deletes your old leaves
Steps
- Call sdk.request_leaves_swap(target_amount) with your desired amount.
- Spark handles the entire swap process with the SSP.
- Your wallet now has optimized leaves, including one with your target amount.
Example
// Let's say you have a single leaf of 100,000 satoshis but need to send 80,000
let target_amount = 80_000;
// Request a swap with the SSP to get optimized leaves
let new_leaf_id = sdk.request_leaves_swap(target_amount).await?;
println!("Created new leaf with ID: {}", new_leaf_id);
// Now you can transfer exactly 80,000 satoshis
let receiver_spark_address = PublicKey::from_str(
"02782d7ba8764306bd324e23082f785f7c880b7202cb10c85a2cb96496aedcaba7"
).unwrap();
sdk.transfer(target_amount, &receiver_spark_address).await?;
// Your wallet balance should still total 100,000 satoshis, but in optimized denominations
let balance = sdk.get_bitcoin_balance();
assert_eq!(balance, 100_000);
Pay Lightning Invoice
Use the pay_lightning_invoice
method to pay a Lightning Network invoice using the Spark Service Provider (SSP) as an intermediary. Unlike traditional Lightning wallets, Spark doesn't directly connect to the Lightning Network. Instead, it uses a cooperative approach where:
- You provide your leaves (UTXOs) to the SSP
- The SSP makes the Lightning payment on your behalf
- The transaction is secured using cryptographic techniques
Parameters
- invoice: &String - A BOLT11 Lightning invoice string that you want to pay
Response
Returns a `Result<String, SparkSdkError>`, which on success contains:
• A String representing the payment ID if successful
• This ID can be used to track the payment status
Internally, this method:
- Parses and validates the Lightning invoice
- Selects appropriate leaves to cover the invoice amount
- Prepares cryptographic leaf tweaks for security
- Executes a swap with the SSP (your leaves in exchange for the invoice payment)
- The SSP processes the Lightning payment using their Lightning node
- The leaf transfer is completed and your old leaves are removed
Steps
1. Call `sdk.pay_lightning_invoice(invoice)` with the Lightning invoice string
2. Spark handles the entire payment process with the SSP
3. If successful, you'll receive a payment ID
Example
```rust
// Get a Lightning invoice from somewhere (e.g., a merchant)
let invoice = "lnbc1500n1p3zty3app5wkf0hagkc4egr8rl88msr4c5lp0ygt6gvzna5hdg4tpna65pzqdq0vehk7cnpwga5xzmnwvycqzpgxqyz5vqsp5v9ym7xsyf0qxqwzlmwjl3g0g9q2tg977h70hcheske9xlgfsggls9qyyssqtghx3qqpwm9zl4m398nm40wj8ryaz8v7v4rrdvczypdpy7qtc6rdrkklm9uxlkmtp3jf29yhqjw2vwmlp82y5ctft94k23cwgqd9llgy".to_string();

// Pay the invoice
let payment_id = sdk.pay_lightning_invoice(&invoice).await?;
println!("Lightning payment initiated with ID: {}", payment_id);

// Your leaves have been transferred to the SSP, and the SSP has made the Lightning payment
```
Create Lightning Invoice
Use the `create_lightning_invoice` method to generate a Lightning Network invoice that others can pay to you. When someone pays this invoice via Lightning, the funds will be received by the SSP and then transferred to your Spark wallet.
Parameters
- `amount_sats: u64` - The amount in satoshis that you want to receive
- `memo: Option<String>` - Optional description/memo for the invoice
- `expiry_seconds: Option<i32>` - Optional expiry time in seconds (defaults to 30 days if not specified)
Response
Returns a `Result<Bolt11Invoice, SparkSdkError>`, which on success contains:
• A `Bolt11Invoice` object representing the generated Lightning invoice
• This invoice can be shared with anyone who wants to pay you via Lightning
Internally, this method:
- Generates a secure payment preimage and hash
- Creates the invoice through the SSP
- Distributes preimage shares to Spark operators using a threshold secret sharing scheme
- Returns the formatted BOLT11 invoice
Steps
1. Call `sdk.create_lightning_invoice(amount, memo, expiry)` with your desired parameters
2. Spark generates the invoice and distributes the cryptographic material
3. Share the returned invoice string with the person who wants to pay you
Example
```rust
// Create an invoice for 50,000 satoshis
let amount_sats = 50_000;
let memo = Some("Payment for services".to_string());
let expiry = Some(3600 * 24); // 24 hours

// Generate the Lightning invoice
let invoice = sdk.create_lightning_invoice(amount_sats, memo, expiry).await?;

// Get the invoice string to share with the payer
let invoice_string = invoice.to_string();
println!("Lightning Invoice: {}", invoice_string);

// When someone pays this invoice via Lightning, the funds will automatically
// appear in your Spark wallet (after being processed by the SSP)
```
Get Lightning Send Fee Estimate
Use the `get_lightning_send_fee_estimate` method to estimate the fees associated with sending a Lightning payment through the Spark Service Provider (SSP).
Parameters
- `invoice: String` - The Lightning invoice you want to pay
Response
Returns a `Result<SparkFeeEstimate, SparkSdkError>`, which on success contains:
• A `SparkFeeEstimate` object with the estimated fees in satoshis
This helps you understand the cost of making a Lightning payment before you commit to it. The fee is a service fee charged by the SSP for facilitating the Lightning payment.
Steps
1. Call `sdk.get_lightning_send_fee_estimate(invoice)` with the invoice
2. Spark returns the estimated fees from the SSP
Example
```rust
// Get a Lightning invoice from somewhere
let invoice = "lnbc1500n1p3zty3app...".to_string();

// Get fee estimate before paying
let fee_estimate = sdk.get_lightning_send_fee_estimate(invoice.clone()).await?;
println!("Estimated fee: {} satoshis", fee_estimate.fees);

// Decide whether to proceed with the payment
if fee_estimate.fees < 100 {
    // Fee is acceptable, proceed with payment
    sdk.pay_lightning_invoice(&invoice).await?;
} else {
    println!("Fee too high, payment aborted");
}
```
Get Lightning Receive Fee Estimate
Use the `get_lightning_receive_fee_estimate` method to estimate the fees associated with receiving a Lightning payment through the Spark Service Provider (SSP).
Parameters
- `amount: u64` - The amount in satoshis you want to receive
Response
Returns a `Result<SparkFeeEstimate, SparkSdkError>`, which on success contains:
• A `SparkFeeEstimate` object with the estimated fees in satoshis
This helps you understand how much will be deducted from the payment amount as fees. The fee is a service fee charged by the SSP for facilitating the Lightning payment reception.
Steps
1. Call `sdk.get_lightning_receive_fee_estimate(amount)` with the desired amount
2. Spark returns the estimated fees from the SSP
Example
```rust
// Amount you want to receive
let amount_sats = 50_000;

// Get fee estimate for receiving this amount
let fee_estimate = sdk.get_lightning_receive_fee_estimate(amount_sats).await?;
println!("Estimated receive fee: {} satoshis", fee_estimate.fees);

// Calculate the net amount you'll receive after fees
let net_amount = amount_sats - fee_estimate.fees;
println!("You'll receive {} satoshis after fees", net_amount);

// Create invoice if fees are acceptable: less than 1% fee
// (integer division avoids mixing u64 and floating point)
if fee_estimate.fees < amount_sats / 100 {
    sdk.create_lightning_invoice(amount_sats, None, None).await?;
}
```
Get Cooperative Exit Fee Estimate
Use the `get_cooperative_exit_fee_estimate` method to estimate the fees associated with withdrawing funds from Spark to an on-chain Bitcoin address through the Spark Service Provider (SSP).
Parameters
- `leaf_ids: Vec<String>` - The specific leaf IDs you want to withdraw
- `on_chain_address: String` - The Bitcoin address where you want to receive the funds
Response
Returns a `Result<SparkFeeEstimate, SparkSdkError>`, which on success contains:
• A `SparkFeeEstimate` object with the estimated fees in satoshis
This helps you understand the cost of withdrawing your funds back to the Bitcoin blockchain before initiating the withdrawal. The fee is a service fee charged by the SSP for facilitating the on-chain exit.
Steps
1. Call `sdk.get_cooperative_exit_fee_estimate(leaf_ids, on_chain_address)` with the leaf IDs and address
2. Spark returns the estimated fees from the SSP
Example
```rust
// Identify the leaves you want to withdraw
let leaf_ids = vec!["leaf_id_1".to_string(), "leaf_id_2".to_string()];

// Specify the Bitcoin address to receive funds
let onchain_address = "bc1q...".to_string();

// Get fee estimate before withdrawing
let fee_estimate = sdk.get_cooperative_exit_fee_estimate(leaf_ids.clone(), onchain_address.clone()).await?;
println!("Estimated withdrawal fee: {} satoshis", fee_estimate.fees);

// Decide whether to proceed with the withdrawal
if fee_estimate.fees < 1000 { // Example threshold
    // Fee is acceptable, proceed with withdrawal
    let bitcoin_address = Address::from_str(&onchain_address).unwrap();
    sdk.withdraw(&bitcoin_address, None).await?;
} else {
    println!("Fee too high, withdrawal aborted");
}
```
Get Leaves Swap Fee Estimate
Use the `get_leaves_swap_fee_estimate` method to estimate the fees associated with optimizing your wallet's leaf structure by swapping your leaves with the Spark Service Provider (SSP).
Parameters
- `total_amount_sats: u64` - The total amount in satoshis that will be involved in the swap
Response
Returns a `Result<SparkFeeEstimate, SparkSdkError>`, which on success contains:
• A `SparkFeeEstimate` object with the estimated fees in satoshis
This helps you understand the cost of optimizing your leaf structure before initiating the swap. The fee is a service fee charged by the SSP for facilitating the leaves swap operation.
Steps
1. Call `sdk.get_leaves_swap_fee_estimate(total_amount_sats)` with the total amount
2. Spark returns the estimated fees from the SSP
Example
```rust
// Total amount to be swapped
let total_amount_sats = 100_000;

// Get fee estimate before swapping leaves
let fee_estimate = sdk.get_leaves_swap_fee_estimate(total_amount_sats).await?;
println!("Estimated swap fee: {} satoshis", fee_estimate.fees);

// Decide whether to proceed with the swap: less than 0.5% fee
// (integer division avoids mixing u64 and floating point)
if fee_estimate.fees < total_amount_sats / 200 {
    // Fee is acceptable, proceed with swap
    let target_amount = 80_000; // The specific denomination you need
    sdk.request_leaves_swap(target_amount).await?;
} else {
    println!("Fee too high, swap aborted");
}
```
Signer Documentation
The signing system is a critical component of the Spark wallet, handling all cryptographic operations including key derivation, transaction signing, and threshold signatures via the FROST protocol. This documentation is intended for developers who need to implement custom signers or understand the internal signing architecture.
Signer Architecture Overview
The signer in Spark follows a trait-based architecture, where various cryptographic capabilities are separated into distinct traits that together form a complete signing system:
```
SparkSigner
├── SparkSignerDerivationPath - Key derivation path handling
├── SparkSignerEcdsa - ECDSA signature operations
├── SparkSignerEcies - Encryption/decryption of secret keys
├── SparkSignerFrost - FROST nonce and commitment management
├── SparkSignerFrostSigning - FROST threshold signature operations
├── SparkSignerSecp256k1 - Secp256k1 keypair operations
└── SparkSignerShamir - Verifiable secret sharing operations
```
The SDK includes a `DefaultSigner` implementation that manages keys in memory. While this implementation works well for most use cases, you may implement your own signer for specialized needs such as remote signing or integration with custom key management systems.
Security Model
The Spark security model requires that both the user and Spark Operators participate in signing Bitcoin transactions:
- The user always maintains control of their signing keys
- Spark Operators use threshold signing (FROST) for their portion of signatures
- The total signature that appears on the blockchain is a single signature, composed of both the user's signature and the aggregated operator signatures
- The user always initiates the signing process and receives signature shares from operators first
This ensures that neither the user nor the operators alone can spend funds, providing a secure multi-party computation model for Bitcoin transactions.
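The "neither party alone can spend" property can be illustrated with a toy additive-sharing sketch in plain modular arithmetic. This is not the actual FROST protocol (which operates on secp256k1 scalars with nonce commitments); the prime `P` and the `combine` function are purely illustrative:

```rust
// Toy illustration of two-party signing authority (NOT real cryptography):
// the "full key" is the sum of a user share and an operator share modulo a
// prime, so neither share alone reveals it.
const P: u64 = 2_147_483_647; // Mersenne prime, stand-in for a group order

fn combine(user_share: u64, operator_share: u64) -> u64 {
    (user_share + operator_share) % P
}

fn main() {
    let full_key: u64 = 123_456_789;
    let user_share: u64 = 987_654_321;
    // The operator share is chosen so the two shares sum to the full key mod P.
    let operator_share = (full_key + P - user_share) % P;

    // Only the combination of both shares recovers the full key.
    assert_eq!(combine(user_share, operator_share), full_key);
    // Either share alone is just a random-looking number.
    assert_ne!(user_share, full_key);
    assert_ne!(operator_share, full_key);
    println!("combined: {}", combine(user_share, operator_share));
}
```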
Implementing a Custom Signer
To create a custom signer, you must implement the `SparkSigner` trait and all its associated sub-traits. The implementation details will depend on your specific requirements, but there are some important considerations:
Key Derivation Path
The derivation path scheme is critical for compatibility with other Spark wallets. The scheme follows:
```
m/8797555'/account'/key_type'/[leaf_index']
```
Where:
- `8797555'` is the purpose value (derived from "spark")
- `account'` is the account index (hardened, starting from 0)
- `key_type'` is the key type:
  - `0'` for identity key
  - `1'` for base signing key
  - `2'` for temporary signing key
- `leaf_index'` is a hash-derived index for leaf-specific keys (optional)

All indices use hardened derivation for enhanced security.
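As a sketch, a path following this scheme can be assembled as a plain string. The `spark_path` helper below is illustrative only (the SDK works with typed derivation paths, not strings), and the leaf index is assumed to be precomputed rather than hash-derived:

```rust
// Build a Spark-style derivation path string (illustrative sketch only).
const SPARK_PURPOSE: u32 = 8_797_555; // purpose value derived from "spark"

fn spark_path(account: u32, key_type: u32, leaf_index: Option<u32>) -> String {
    // All components are hardened, marked with a trailing apostrophe.
    let mut path = format!("m/{SPARK_PURPOSE}'/{account}'/{key_type}'");
    if let Some(idx) = leaf_index {
        // Leaf-specific keys append one more hardened index.
        path.push_str(&format!("/{idx}'"));
    }
    path
}

fn main() {
    // Identity key for account 0 (key type 0')
    assert_eq!(spark_path(0, 0, None), "m/8797555'/0'/0'");
    // Base signing key (key type 1') for account 1 with a leaf-specific index
    assert_eq!(spark_path(1, 1, Some(42)), "m/8797555'/1'/1'/42'");
    println!("{}", spark_path(1, 1, Some(42)));
}
```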
FROST Threshold Signing
The FROST implementation in Spark is customized to support Taproot tweaking. The process generally follows these steps:
1. Generate nonce pairs and commitments (`SparkSignerFrost`)
2. Create signing jobs with the appropriate messages and participant information
3. Perform signing operations to generate signature shares (`SparkSignerFrostSigning`)
4. Aggregate signature shares from all participants into a complete signature
Your custom signer will need to properly implement these steps while maintaining the security properties of the FROST protocol.
Key Management Considerations
When implementing a custom signer, carefully consider:
- Private Key Storage: Determine how to securely store private keys (default implementation keeps them in memory)
- Deterministic Keys: Signing keys are deterministic given the seed and leaf ID
- Ephemeral Keys: One-time keys used for commitments or during transfers
- Key Recovery: Consider how keys can be recovered or backed up
Trait Details
SparkSignerDerivationPath
Handles the derivation of keys according to Spark's custom path scheme.
```rust
fn get_deposit_signing_key(&self, network: Network) -> Result<PublicKey, SparkSdkError>;

fn derive_spark_key(leaf_id: Option<String>, account: u32, seed_bytes: &[u8],
    key_type: SparkKeyType, network: Network) -> Result<SecretKey, SparkSdkError>;

fn get_identity_derivation_path(account_index: u32) -> Result<SparkDerivationPath, SparkSdkError>;
```
SparkSignerEcdsa
Provides ECDSA signature capabilities for identity verification and other non-threshold operations.
```rust
fn sign_message_ecdsa_with_identity_key<T: AsRef<[u8]>>(&self, message: T,
    apply_hashing: bool, network: Network) -> Result<Signature, SparkSdkError>;

fn sign_message_ecdsa_with_key<T: AsRef<[u8]>>(&self, message: T,
    public_key_for_signing_key: &PublicKey,
    apply_hashing: bool) -> Result<Signature, SparkSdkError>;
```
SparkSignerEcies
Handles encryption and decryption of secret keys for secure exchange between parties.
```rust
fn encrypt_secret_key_with_ecies(&self, receiver_public_key: &PublicKey,
    pubkey_for_sk_to_encrypt: &PublicKey) -> Result<Vec<u8>, SparkSdkError>;

fn decrypt_secret_key_with_ecies<T>(&self, ciphertext: T,
    network: Network) -> Result<SecretKey, SparkSdkError>
where
    T: AsRef<[u8]>;
```
SparkSignerFrost
Manages FROST nonce pairs and commitments for threshold signing.
```rust
fn new_frost_signing_noncepair(&self) -> Result<SigningCommitments, SparkSdkError>;

fn sensitive_expose_nonces_from_commitments<T>(&self, signing_commitments: &T)
    -> Result<SigningNonces, SparkSdkError>
where
    T: AsRef<[u8]>;

fn sensitive_create_if_not_found_expose_nonces_from_commitments(&self,
    signing_commitments: Option<&[u8]>) -> Result<SigningNonces, SparkSdkError>;
```
SparkSignerFrostSigning
Performs the actual FROST threshold signing operations, including signing and aggregation.
```rust
fn sign_frost(&self, signing_jobs: Vec<FrostSigningJob>) -> Result<SignFrostResponse, SparkSdkError>;

fn aggregate_frost(&self, request: AggregateFrostRequest) -> Result<AggregateFrostResponse, SparkSdkError>;

// Additional specialized signing methods...
```
SparkSignerSecp256k1
Manages secp256k1 keypairs for various wallet operations.
```rust
fn get_identity_public_key(&self, account_index: u32, network: Network) -> Result<PublicKey, SparkSdkError>;

fn new_secp256k1_keypair(&self, leaf_id: String, key_type: SparkKeyType,
    account_index: u32, network: Network) -> Result<PublicKey, SparkSdkError>;

fn insert_secp256k1_keypair_from_secret_key(&self, secret_key: &SecretKey) -> Result<PublicKey, SparkSdkError>;

// Additional keypair management methods...
```
SparkSignerShamir
Provides verifiable secret sharing operations for secure key distribution.
```rust
fn split_with_verifiable_secret_sharing(&self, message: Vec<u8>, threshold: usize,
    num_shares: usize) -> Result<Vec<VerifiableSecretShare>, SparkSdkError>;

fn split_from_public_key_with_verifiable_secret_sharing(&self, public_key: &PublicKey,
    threshold: usize, num_shares: usize) -> Result<Vec<VerifiableSecretShare>, SparkSdkError>;
```
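The splitting half of this trait can be illustrated with a minimal, non-verifiable Shamir sketch over a small prime field. The real implementation works over secp256k1 scalars and adds verification commitments; the field prime, function names, and 2-of-3 parameters below are all illustrative:

```rust
// Minimal 2-of-3 Shamir secret sharing over a small prime field (toy only).
const P: u64 = 2_147_483_647; // prime field modulus

// Modular exponentiation, used for Fermat inverses (P is prime).
fn mod_pow(mut b: u64, mut e: u64) -> u64 {
    let mut r = 1u64;
    b %= P;
    while e > 0 {
        if e & 1 == 1 { r = (r as u128 * b as u128 % P as u128) as u64; }
        b = (b as u128 * b as u128 % P as u128) as u64;
        e >>= 1;
    }
    r
}

// Evaluate the degree-1 polynomial (secret + c1*x) at x: threshold 2.
fn share(secret: u64, c1: u64, x: u64) -> (u64, u64) {
    let y = (secret as u128 + c1 as u128 * x as u128) % P as u128;
    (x, y as u64)
}

// Lagrange interpolation at x = 0 from any two shares.
fn reconstruct(a: (u64, u64), b: (u64, u64)) -> u64 {
    let inv = |v: u64| mod_pow(v, P - 2); // Fermat inverse
    let wa = (b.0 as u128 * inv((b.0 + P - a.0) % P) as u128 % P as u128) as u64;
    let wb = (a.0 as u128 * inv((a.0 + P - b.0) % P) as u128 % P as u128) as u64;
    ((a.1 as u128 * wa as u128 + b.1 as u128 * wb as u128) % P as u128) as u64
}

fn main() {
    let secret = 123_456_789u64;
    let c1 = 42u64; // random in a real implementation
    let (s1, s2, s3) = (share(secret, c1, 1), share(secret, c1, 2), share(secret, c1, 3));
    // Any 2 of the 3 shares reconstruct the secret; a single share reveals nothing.
    assert_eq!(reconstruct(s1, s2), secret);
    assert_eq!(reconstruct(s2, s3), secret);
    println!("reconstructed: {}", reconstruct(s1, s3));
}
```

This mirrors how preimage shares are distributed to Spark operators during invoice creation: no single operator holds enough information to recover the secret.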
Example: Using the Default Signer
For most applications, the provided `DefaultSigner` implementation will be sufficient:
```rust
// Create a DefaultSigner from a mnemonic
let mnemonic = "abandon ability able about above absent absorb abstract absurd abuse access accident";
let network = SparkNetwork::Regtest;
let signer = DefaultSigner::from_mnemonic(mnemonic, network.clone()).await?;

// Initialize the SDK with the signer
let sdk = SparkSdk::new(network, signer).await?;
```
Note on Current Implementation
This is an early version of the Spark signing system. The architecture may undergo optimizations and refinements in future releases while maintaining backward compatibility where possible. The current implementation prioritizes security and correctness over performance optimization.
For most users, the provided `DefaultSigner` will be sufficient. Custom signer implementations should be undertaken only when specific requirements necessitate it, such as integration with remote signing services or hardware security modules.