My macOS Development Environment for UX Testing

This document details my current macOS-based development environment, optimized for hybrid AI-assisted design, biometric UX testing, and rapid iteration on digital systems. The setup prioritizes performance, modularity, and long-term maintainability over fleeting trends.


1. Core Principles of My Environment

  • Speed over spectacle — Every tool must justify its presence with measurable output gains.

  • Isolation over contamination — Dev environments are containerized or sandboxed to avoid polluting the host OS.

  • Consistency across contexts — Whether I’m at my desk or on the go, my toolset behaves predictably.

  • Biometric readiness — Hardware and software are preconfigured to integrate with testing devices without reconfiguration.

2. Hardware Stack

  • Primary Machine: MacBook Pro 14” M3 Max, 16GB RAM

  • Monitors: Dual external displays (16” + 14”) for split testing and parallel code/design workflows

  • Input Devices: Split mechanical keyboard, precision trackball mouse

  • Testing Equipment: Eye-tracking device, EEG headset, GSR sensor

  • Connectivity: High-speed, low-latency wired connection for live biometric streaming

3. Core Software Layer

  • Development IDEs: Cursor (primary), Zed (offline coding), Xcode (Swift/macOS-specific projects)

  • Containerization: Docker for isolated build environments

  • Data Analysis: Python + Jupyter for quick biometric dataset parsing

  • Design Tools: Figma, Spline, Amadine (for vector work)

  • Documentation: Obsidian for raw notes, exported to this documentation site

  • Automation: Raycast + custom scripts for instant tool launches
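
As a concrete illustration of the automation layer, here is a minimal sketch of a Raycast script command that launches a working set of apps in one shot. The metadata comment format is Raycast's own; the command title and the list of apps are hypothetical stand-ins, not the author's actual scripts.

```python
#!/usr/bin/env python3
# Raycast script-command metadata (see Raycast's Script Commands docs):
# @raycast.schemaVersion 1
# @raycast.title Start UX Session
# @raycast.mode silent

import subprocess
import sys

# Assumed app names for illustration only.
APPS = ["Cursor", "Figma", "Obsidian"]

def launch_commands(apps):
    """Build the `open -a <App>` invocations macOS uses to launch apps."""
    return [["open", "-a", app] for app in apps]

if __name__ == "__main__" and sys.platform == "darwin":
    for cmd in launch_commands(APPS):
        subprocess.run(cmd, check=False)
```

Saved into Raycast's script directory, this makes "launch the whole toolset" a single keystroke rather than three manual app launches.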

4. Biometric Testing Integration

  • Preconfigured drivers and data ingestion pipelines for biometric hardware

  • Custom scripts to auto-start test sessions and sync outputs to analysis folders

  • AI-assisted tagging for participant reactions during playback review
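
The "auto-start and sync" step above can be sketched roughly as follows. All paths, file patterns, and function names here are assumptions for illustration; the actual pipeline and device export formats are not specified in this document.

```python
from datetime import datetime
from pathlib import Path
import shutil

def start_session(root: Path, participant: str) -> Path:
    """Create a timestamped session folder for one participant."""
    stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
    session = root / f"{participant}-{stamp}"
    session.mkdir(parents=True, exist_ok=True)
    return session

def sync_outputs(device_dir: Path, session: Path,
                 patterns=("*.csv", "*.edf")) -> int:
    """Copy raw device exports (EEG, eye tracking, GSR) into the session
    folder so analysis always reads from one canonical location."""
    copied = 0
    for pattern in patterns:
        for f in device_dir.glob(pattern):
            shutil.copy2(f, session / f.name)
            copied += 1
    return copied
```

Wrapping these two calls in a Raycast shortcut is what makes a test session start and land in the analysis folders without manual file shuffling.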

5. Workflow Example

1. Launch biometric capture via Raycast shortcut

2. Open Cursor project in Docker container

3. Run AI-assisted UI build

4. Simultaneously monitor EEG + eye tracking feeds

5. Export data for immediate review in Python

6. Document results directly into Foundations & Experiments sections
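
Step 5 ("export data for immediate review in Python") might look like the sketch below: load an exported eye-tracking CSV and compute a first-pass summary. The column names ("t", "x", "y") and file layout are assumptions, not the actual export schema of any particular device.

```python
import csv
from statistics import mean

def load_gaze(path):
    """Parse an exported gaze CSV into a list of row dicts."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def summarize_gaze(rows):
    """Quick first-pass metrics: sample count and mean gaze position.
    Assumes normalized screen coordinates in columns "x" and "y"."""
    xs = [float(r["x"]) for r in rows]
    ys = [float(r["y"]) for r in rows]
    return {"samples": len(xs), "mean_x": mean(xs), "mean_y": mean(ys)}
```

In a Jupyter notebook this is two cells: `rows = load_gaze("session.csv")` then `summarize_gaze(rows)`, which is usually enough to decide whether a session is worth a deeper review.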

6. Why This Environment Works

This setup minimizes the handoff time between ideation, build, test, and iteration. Everything is one command away, every tool is purpose-driven, and every output ties back to a measurable UX or system-performance metric.

Jonathan Hines Dumitru

Software architect focused on translating ambiguous ideas into fully shippable native applications.