Parallel Test Runners — pytest-xdist and JUnit 5
Grid provides the browser infrastructure for parallelism, but your test runner decides how tests are distributed across those browsers. In Python, pytest-xdist is the standard plugin for parallel test execution; in Java, JUnit 5 ships with built-in parallel execution configuration. Both split your test suite into groups and run them simultaneously, each group using its own browser session on the Grid.
Configuring parallel execution is the final piece of the puzzle: Grid provides the browsers, the runner distributes the tests, and together they deliver the speedup.
# ── pytest-xdist — Parallel Execution for Python ──
# Install: pip install pytest-xdist
# Run with 4 parallel workers:
# $ pytest tests/ -n 4
# Run with auto-detected workers (one per CPU core):
# $ pytest tests/ -n auto
# Run on Grid with 8 workers:
# $ GRID_URL=http://grid:4444/wd/hub pytest tests/ -n 8
# ── How pytest-xdist works ──
XDIST_MECHANICS = [
    "1. xdist collects all test items from the test suite",
    "2. It spawns N worker processes (N set by the -n flag)",
    "3. Each worker gets its own pytest session with independent fixtures",
    "4. Tests are distributed to workers using a scheduling algorithm",
    "5. Each worker's browser fixture creates its OWN browser session",
    "6. Workers execute tests independently and report results back",
    "7. xdist aggregates results into a single report",
]
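The distribution in steps 4–6 can be sketched as a small simulation. This is a hedged illustration of the default load-scheduling idea (the function name and details are hypothetical, not xdist's actual internals): whichever worker is free next pulls the next pending test.

```python
import heapq

def simulate_load_dist(durations, n_workers):
    """Simulate load scheduling: each idle worker pulls the next
    pending test, so faster workers end up running more tests.
    Returns (per-worker test index lists, total wall-clock time)."""
    # Min-heap of (time_this_worker_becomes_free, worker_id)
    heap = [(0.0, w) for w in range(n_workers)]
    assigned = [[] for _ in range(n_workers)]
    for i, duration in enumerate(durations):
        finish, w = heapq.heappop(heap)   # earliest-free worker
        assigned[w].append(i)
        heapq.heappush(heap, (finish + duration, w))
    return assigned, max(t for t, _ in heap)

# Four workers, eight tests of varying length: wall-clock time is
# far below the 22s serial total because no worker sits idle.
tests = [5, 1, 1, 1, 5, 1, 1, 7]
plan, wall = simulate_load_dist(tests, 4)
```

This is why the load mode suits suites with uneven test durations: a worker that drew a quick test immediately takes another, instead of waiting for a pre-assigned batch.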
# ── Scheduling modes ──
SCHEDULING_MODES = [
    {
        "mode": "load (default)",
        "flag": "-n 4 --dist=load",
        "behaviour": "Send tests to whichever worker finishes first",
        "best_for": "Tests with varying durations — keeps all workers busy",
    },
    {
        "mode": "loadscope",
        "flag": "-n 4 --dist=loadscope",
        "behaviour": "Group tests by module or class, send each group to one worker",
        "best_for": "Tests that share expensive fixtures within a module",
    },
    {
        "mode": "loadfile",
        "flag": "-n 4 --dist=loadfile",
        "behaviour": "All tests in a file go to the same worker",
        "best_for": "Tests within a file that share state (not recommended but practical)",
    },
    {
        "mode": "each",
        "flag": "-n 4 --dist=each",
        "behaviour": "Every test runs on EVERY worker (for cross-browser testing)",
        "best_for": "Running the same suite on Chrome, Firefox, Edge simultaneously",
    },
]
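The practical difference between load, loadscope, and loadfile is the unit of work sent to a single worker. A minimal sketch (a hypothetical helper, not xdist internals) that derives the grouping key from a pytest node ID of the form `file.py::Class::test`:

```python
def dist_groups(node_ids, dist="load"):
    """Group test node IDs by the unit that --dist keeps together
    on one worker: the test itself, its class/module scope, or its file."""
    groups = {}
    for nid in node_ids:
        parts = nid.split("::")
        if dist == "load":
            key = nid                      # every test is its own unit
        elif dist == "loadfile":
            key = parts[0]                 # the whole file stays together
        elif dist == "loadscope":
            key = "::".join(parts[:-1])    # class if present, else module
        groups.setdefault(key, []).append(nid)
    return groups

ids = [
    "test_checkout.py::test_add_to_cart",
    "test_checkout.py::test_pay",
    "test_login.py::TestAuth::test_ok",
    "test_login.py::TestAuth::test_bad_pw",
]
```

With these four tests, load yields four independent units, loadfile yields two (one per file), and loadscope also yields two here — but it would split `test_login.py` further if it contained a second test class.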
# ── conftest.py for parallel-safe fixtures ──
PARALLEL_CONFTEST = '''
import pytest
from selenium import webdriver
from selenium.webdriver.chrome.options import Options
import os


@pytest.fixture(scope="function")
def browser():
    """Each test gets its own browser — safe for parallel execution."""
    options = Options()
    options.add_argument("--headless=new")
    options.add_argument("--no-sandbox")
    options.add_argument("--window-size=1920,1080")

    grid_url = os.getenv("GRID_URL")
    if grid_url:
        driver = webdriver.Remote(command_executor=grid_url, options=options)
    else:
        driver = webdriver.Chrome(options=options)

    yield driver
    driver.quit()
'''
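The fixture's Grid/local branch can also be pulled out into a plain function so it is unit-testable without starting a browser. A sketch under stated assumptions — `resolve_executor` is an illustrative name, not part of the fixture above:

```python
import os

def resolve_executor(env=None):
    """Hypothetical helper mirroring the fixture's branch: returns
    ('remote', url) when GRID_URL is set, ('local', None) otherwise."""
    env = os.environ if env is None else env
    grid_url = env.get("GRID_URL")
    return ("remote", grid_url) if grid_url else ("local", None)
```

Passing the environment as a parameter keeps the decision deterministic in tests, while the fixture itself can call the helper with no arguments.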
print("pytest-xdist — Parallel Test Execution")
print("=" * 60)
print("\nHow it works:")
for step in XDIST_MECHANICS:
    print(f" {step}")
print("\n\nScheduling Modes:")
for mode in SCHEDULING_MODES:
    print(f"\n {mode['mode']}")
    print(f" Flag: {mode['flag']}")
    print(f" Behaviour: {mode['behaviour']}")
    print(f" Best for: {mode['best_for']}")
print("\n\nCommon Commands:")
print(" $ pytest tests/ -n 4 # 4 workers, local browsers")
print(" $ pytest tests/ -n auto # auto-detect CPU count")
print(" $ pytest tests/ -n 8 --dist=loadscope # 8 workers, grouped by module")
print(" $ GRID_URL=... pytest tests/ -n 12 # 12 workers on Selenium Grid")
// ── JUnit 5 — Parallel Execution Configuration ──
// File: src/test/resources/junit-platform.properties
// junit.jupiter.execution.parallel.enabled=true
// junit.jupiter.execution.parallel.mode.default=concurrent
// junit.jupiter.execution.parallel.config.strategy=fixed
// junit.jupiter.execution.parallel.config.fixed.parallelism=4
// Each test method gets its own browser via @BeforeEach
// Tests within a class can run concurrently if thread-safe
The scope="function" setting in the browser fixture is critical for parallel safety. It creates a new browser for every test function, ensuring no two parallel workers share a browser session. Using scope="session" or scope="module" with parallel workers causes race conditions — two workers sending commands to the same browser produce unpredictable results. Always use function-scoped fixtures when running tests in parallel.

Use the --dist=loadscope scheduling mode when tests within a module share expensive setup (like database seeding). This sends all tests from test_checkout.py to one worker, allowing module-scoped fixtures to work correctly while still distributing different modules across workers. It is a practical compromise between full test isolation and setup efficiency.

With pytest-xdist, console output from different workers is interleaved and can be confusing. Use pytest -n 4 --tb=short -q for concise output during CI, and add the pytest-html or allure-pytest plugin for structured reports that cleanly separate results by test, regardless of which worker executed them.

Common Mistakes
Mistake 1 — Using session-scoped browser fixtures with parallel workers
❌ Wrong: @pytest.fixture(scope="session") for the browser with -n 4 — four workers attempt to share one browser session, causing race conditions.
✅ Correct: @pytest.fixture(scope="function") — each test in each worker gets its own browser. More browser startups, but zero shared-state bugs.
Mistake 2 — Setting parallel workers higher than Grid capacity
❌ Wrong: Running pytest -n 20 against a Grid with only 8 available slots — 12 sessions queue and time out, appearing as failures.
✅ Correct: Matching the -n count to your Grid capacity. With 2 Chrome nodes x 4 sessions = 8 slots, use -n 8 or lower.
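The capacity arithmetic from Mistake 2 fits in a one-line helper (names are illustrative, not part of pytest-xdist):

```python
def safe_worker_count(requested, nodes, sessions_per_node):
    """Cap the -n worker count at the Grid's total session slots
    so no worker queues waiting for a browser and times out."""
    capacity = nodes * sessions_per_node
    return min(requested, capacity)

# 2 Chrome nodes x 4 sessions each = 8 slots: a request for 20
# workers is capped to 8; a request for 6 passes through unchanged.
```

Computing the cap in CI (rather than hard-coding -n) keeps the suite correct when nodes are added to or removed from the Grid.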