feat: Jupyter Notebook / IPython Magic Integration (#454)#481

Merged
MDUYN merged 5 commits into main from dev
Apr 28, 2026

Conversation

MDUYN (Collaborator) commented Apr 27, 2026

Summary

Add %%backtest cell magic and %backtest line magic for running backtests directly in Jupyter notebooks.

Closes #454

Changes

New files

  • investing_algorithm_framework/notebook/__init__.py — Module entry point
  • investing_algorithm_framework/notebook/magic.py — IPython magic extension
  • tests/notebook/__init__.py
  • tests/notebook/test_magic.py — 15 unit tests

Modified files

  • investing_algorithm_framework/__init__.py — Added load_ipython_extension for %load_ext support
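The `%load_ext` hook follows IPython's extension protocol: the package exposes a top-level `load_ipython_extension(shell)` that registers the magics class. A minimal sketch of that wiring, with `BacktestMagics` as an assumed name for the class in `notebook/magic.py`, using a fake shell so the snippet is self-contained:

```python
# Sketch of the hook added to investing_algorithm_framework/__init__.py.
# IPython calls load_ipython_extension(shell) when the user runs
#   %load_ext investing_algorithm_framework
# BacktestMagics is an assumed name, not confirmed by the PR.

class BacktestMagics:  # stand-in for the real IPython Magics subclass
    pass

def load_ipython_extension(ipython):
    ipython.register_magics(BacktestMagics)

class _FakeShell:  # mimics the one InteractiveShell method used here
    def __init__(self):
        self.registered = []

    def register_magics(self, cls):
        self.registered.append(cls)

shell = _FakeShell()
load_ipython_extension(shell)
print(shell.registered == [BacktestMagics])  # True
```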

Usage

# Load the extension
%load_ext investing_algorithm_framework

# Cell magic — define and run a backtest inline
%%backtest --start 2023-01-01 --end 2023-12-31 --initial-amount 10000 --market BITVAVO --trading-symbol EUR -o results
from investing_algorithm_framework import TradingStrategy, DataSource, TimeUnit

class MyStrategy(TradingStrategy):
    time_unit = TimeUnit.DAY
    interval = 1
    data_sources = [
        DataSource(identifier="btc", symbol="BTC/EUR", data_type="OHLCV", time_frame="1d")
    ]

    def run_strategy(self, context, data):
        ...

# Line magic — run from an existing file
%backtest strategies/my_strategy.py --start 2023-01-01 --end 2023-12-31 -o results

Supported parameters

  • --start — Start date (YYYY-MM-DD or YYYY-MM-DD-HH)
  • --end — End date (defaults to now)
  • --initial-amount — Initial balance (default: 1000)
  • --market — Market (e.g. BITVAVO)
  • --trading-symbol — Quote currency (e.g. EUR)
  • -o / --output — Store the result as a notebook variable
  • --vectorized — Use vectorized backtesting
  • --show-report — Display an inline HTML report
  • --show-progress — Show progress bars
  • --risk-free-rate — Risk-free rate for metrics
  • --snapshot-interval — DAILY or TRADE_CLOSE
  • --no-fill-missing-data — Skip filling missing data
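The flags above map naturally onto a stdlib `argparse` parser. This is an illustrative sketch built only from the table (defaults and types are assumptions; the PR's actual parser may differ):

```python
import argparse

def build_parser():
    # Hedged reconstruction of the %backtest argument parser from the
    # flag table; not the PR's exact implementation.
    p = argparse.ArgumentParser(prog="%backtest", add_help=False)
    p.add_argument("strategy_path", nargs="?")  # line-magic form only
    p.add_argument("--start")
    p.add_argument("--end")
    p.add_argument("--initial-amount", type=float, default=1000)
    p.add_argument("--market")
    p.add_argument("--trading-symbol")
    p.add_argument("-o", "--output")
    p.add_argument("--vectorized", action="store_true")
    p.add_argument("--show-report", action="store_true")
    p.add_argument("--show-progress", action="store_true")
    p.add_argument("--risk-free-rate", type=float)
    p.add_argument("--snapshot-interval", choices=["DAILY", "TRADE_CLOSE"])
    p.add_argument("--no-fill-missing-data", action="store_true")
    return p

args = build_parser().parse_args(
    "--start 2023-01-01 --end 2023-12-31 --initial-amount 10000 -o results".split()
)
print(args.initial_amount, args.output)  # 10000.0 results
```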

Backtesting modes

  • Default (event-driven): calls app.run_backtest() — the strategy's run_strategy is invoked at each time step
  • --vectorized: calls app.run_vector_backtest() — uses generate_buy_signals/generate_sell_signals instead
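In vectorized mode the framework asks the strategy once for whole-series signals rather than stepping through time. A sketch of what such a strategy could look like, using the method names from the PR but assumed signatures and plain lists in place of the framework's data objects:

```python
class MyVectorStrategy:
    """Illustrative --vectorized strategy: generate_buy_signals /
    generate_sell_signals return one boolean per bar for the whole
    series. Signatures and data shape are assumptions."""

    def generate_buy_signals(self, data):
        closes = data["close"]
        mean = sum(closes) / len(closes)
        return [c > mean for c in closes]   # buy when above the mean

    def generate_sell_signals(self, data):
        closes = data["close"]
        mean = sum(closes) / len(closes)
        return [c < mean for c in closes]   # sell when below the mean

s = MyVectorStrategy()
data = {"close": [10, 12, 14, 8, 6]}
print(s.generate_buy_signals(data))   # [False, True, True, False, False]
```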

Tests

All 15 tests pass, covering:

  • Date parsing (YYYY-MM-DD, YYYY-MM-DD-HH, invalid, rounding)
  • Argument parser (minimal, full, strategy path, risk-free-rate, no-fill)
  • Strategy class discovery (subclass, multiple, base class, non-classes)
  • Extension registration (magic registration, top-level load_ext)
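The date-parsing behavior the tests cover (both accepted formats, rejection of invalid input) can be sketched with `datetime.strptime`; the helper name is hypothetical:

```python
from datetime import datetime

def parse_backtest_date(value):
    # Try the more specific YYYY-MM-DD-HH format first, then YYYY-MM-DD.
    # Illustrative helper; the PR's actual parsing code may differ.
    for fmt in ("%Y-%m-%d-%H", "%Y-%m-%d"):
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(
        f"Invalid date: {value!r} (use YYYY-MM-DD or YYYY-MM-DD-HH)"
    )

print(parse_backtest_date("2023-01-01"))     # 2023-01-01 00:00:00
print(parse_backtest_date("2023-01-01-09"))  # 2023-01-01 09:00:00
```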

MDUYN added 5 commits April 27, 2026 13:01

- Replace hardcoded /tmp/ with tempfile.gettempdir()
- Use Path.as_uri() instead of f"file://" for browser URLs
- Normalize backslashes in sqlite:/// URIs
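The three portability fixes in that commit come down to stdlib calls. A short sketch (the variable names are illustrative):

```python
import tempfile
from pathlib import Path

# Portable temp directory instead of a hardcoded /tmp/ prefix.
report_path = Path(tempfile.gettempdir()) / "backtest_report.html"

# Path.as_uri() builds a correct file:// URL on every OS, where a naive
# f"file://{path}" breaks on Windows drive letters and backslashes.
url = report_path.as_uri()

# sqlite:/// URIs want forward slashes even on Windows.
db_path = str(report_path.with_suffix(".db")).replace("\\", "/")
db_uri = "sqlite:///" + db_path

print(url.startswith("file://"), "\\" in db_uri)  # True False
```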
Add %%backtest cell magic and %backtest line magic for running
backtests directly in Jupyter notebooks.

- %%backtest: define a strategy inline and run a backtest in one cell
- %backtest: run a backtest from an existing strategy file
- Supports both event-driven and vectorized backtesting (--vectorized)
- Parameters: --start, --end, --initial-amount, --market, --trading-symbol,
  -o (output variable), --show-report, --show-progress, --risk-free-rate
- Extension loaded via %load_ext investing_algorithm_framework
- Includes unit tests for parser, date parsing, strategy discovery, and
  extension registration

Closes #454

Replace batch-level progress bar with strategy-level progress tracking
when using parallel workers (n_workers). Workers now increment a shared
counter after each individual strategy completes, and a monitoring thread
updates a tqdm progress bar in real time (every 500ms).

Changes:
- Use multiprocessing.Manager().Value() as shared counter across workers
- Add monitoring thread to poll counter and update tqdm bar
- Show per-strategy throughput (strategies/s) and ETA instead of
  batch-level progress
- Move multiprocessing, threading, concurrent.futures imports to top level
- Order all imports per PEP 8 (stdlib, third-party, local)
- Remove redundant inline combine_backtests import
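The counter-plus-monitor wiring from that commit can be sketched in-process. The `Manager().Value` counter and the polling monitor thread are taken from the commit message; the worker loop, poll interval, and recorded snapshots (standing in for the tqdm bar) are illustrative:

```python
import threading
import time
from multiprocessing import Manager

def run_with_progress(tasks, worker, poll_interval=0.05):
    # Workers bump a Manager-backed shared counter after each strategy;
    # a monitor thread polls it to drive the progress bar (tqdm in the
    # PR, every 500 ms; snapshots here). In-process sketch only -- the
    # PR submits tasks to a ProcessPoolExecutor.
    manager = Manager()
    counter = manager.Value("i", 0)
    lock = manager.Lock()
    stop = threading.Event()
    snapshots = []  # stands in for: bar.n = counter.value; bar.refresh()

    def monitor():
        while not stop.is_set():
            snapshots.append((counter.value, len(tasks)))
            time.sleep(poll_interval)

    monitor_thread = threading.Thread(target=monitor, daemon=True)
    monitor_thread.start()

    results = []
    for task in tasks:
        results.append(worker(task))
        with lock:
            counter.value += 1  # one tick per completed strategy

    stop.set()
    monitor_thread.join()
    completed = counter.value
    manager.shutdown()
    return results, completed

results, completed = run_with_progress([1, 2, 3], lambda x: x * 2)
print(results, completed)  # [2, 4, 6] 3
```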
Pass data_provider_service via ProcessPoolExecutor initializer instead
of pickling it per task submission. On spawn-based systems (Windows/WSL)
this avoids serializing the full data provider (with loaded dataframes)
for every batch — now each worker pickles it only once at startup.

- Add _init_worker() initializer and _worker_data_provider_service global
- Copy data_provider_service once before pool starts
- Worker falls back to module-level global when args value is None
- No behavior change on fork-based systems (macOS/Linux)
@MDUYN MDUYN merged commit 0d3f37c into main Apr 28, 2026
8 checks passed