A bare-metal chronometer implemented purely in software on resource-constrained hardware. The goal was to achieve maximum timing precision without relying on high-level abstraction libraries.
Problem
•Standard libraries introduce significant overhead and unpredictable latency.
•Limited clock cycles available for updating the display and handling user input concurrently.
•Need for a reliable state machine to prevent 'ghost' inputs caused by switch bounce.
Approach
•Wrote raw C++ interacting directly with hardware registers to minimize latency.
•Implemented interrupt-driven timer logic for high-precision timekeeping (see the first sketch after this list).
•Designed a custom circular buffer for debounce logic to handle noisy physical buttons (see the second sketch after this list).
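The first sketch below illustrates the interrupt-driven timekeeping approach at the register level. It is a minimal reconstruction, assuming an ATmega328P-class Arduino (Uno) at 16 MHz; the register and vector names are standard AVR, but the choice of Timer1, the prescaler, and the 1 ms tick are illustrative, not taken from the project.

```cpp
// Minimal sketch: Timer1 in CTC mode fires a compare-match interrupt every 1 ms.
// The ISR only increments a counter, leaving the main loop free for display and input.
#include <avr/io.h>
#include <avr/interrupt.h>

volatile uint32_t g_millis = 0;   // elapsed milliseconds, written only by the ISR

void timer1_init() {
    cli();                        // disable interrupts while configuring registers
    TCCR1A = 0;                   // no output compare pins; CTC mode set via TCCR1B
    TCCR1B = (1 << WGM12)         // CTC mode: clear timer on compare match with OCR1A
           | (1 << CS11);         // prescaler /8 -> 2 MHz timer clock at F_CPU = 16 MHz
    OCR1A  = 1999;                // 2000 ticks per compare match = 1 ms period
    TCNT1  = 0;                   // start counting from zero
    TIMSK1 = (1 << OCIE1A);       // enable the compare-match-A interrupt
    sei();                        // re-enable interrupts
}

ISR(TIMER1_COMPA_vect) {
    ++g_millis;                   // one tick per millisecond
}

uint32_t now_ms() {
    cli();                        // read the 32-bit counter atomically
    uint32_t t = g_millis;
    sei();                        // assumes interrupts were enabled before the call
    return t;
}
```

Because the ISR does nothing but increment a counter, nearly all clock cycles remain available to the main loop, and sub-millisecond resolution can be recovered by additionally reading TCNT1 (0.5 µs per tick with this prescaler).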
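The second sketch shows the circular-buffer debounce idea; the class and constants below are hypothetical names chosen for illustration, not the project's actual code. A press is reported only when every sample in the window agrees, so a single bounce edge cannot reach the state machine as a 'ghost' input.

```cpp
// Minimal sketch: a small ring buffer of raw pin samples used for debouncing.
// One raw sample is pushed per timer tick; the input counts as pressed only
// when the whole window agrees, which filters contact bounce without blocking.
#include <stdint.h>

class DebounceRing {
public:
    // Push one raw sample (true = pin reads "pressed") into the ring.
    void push(bool raw) {
        buf_[head_] = raw;
        head_ = (head_ + 1) % kWindow;   // wrap around: classic circular buffer
        if (count_ < kWindow) ++count_;
    }

    // Stable-pressed only when every sample in the window agrees.
    bool pressed() const {
        if (count_ < kWindow) return false;   // not enough history yet
        for (uint8_t i = 0; i < kWindow; ++i)
            if (!buf_[i]) return false;
        return true;
    }

private:
    static const uint8_t kWindow = 8;    // 8 samples at a 1 ms tick = 8 ms debounce window
    bool    buf_[kWindow] = {false};
    uint8_t head_  = 0;
    uint8_t count_ = 0;
};
```

In use, the raw pin would be sampled once per tick (for example, paced by the 1 ms timer above), pushed into the ring, and only the stable pressed() state fed to the chronometer's state machine.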
Outcome
•Achieved microsecond-level timing precision on standard Arduino hardware.