Converting an Integer to a Decimal String in Under Two Nanoseconds
Monday, 16 Feb 2026
Jaël Champagne Gareau
Daniel Lemire
Abstract
Converting binary integers to variable-length decimal strings is a fundamental
operation in computing. Conventional fast approaches rely on recursive division
and small lookup tables. We propose a SIMD-based algorithm that leverages
AVX-512 IFMA instructions available on recent AMD and Intel processors. Our
method eliminates lookup tables entirely and computes multiple quotients and
remainders in parallel.
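For context, the conventional scalar baseline the paper improves upon can be sketched as follows. This is not the paper's SIMD algorithm, only a minimal illustration of division-based digit peeling:

```cpp
#include <cstdint>
#include <string>

// Conventional scalar conversion (baseline sketch, not the proposed method):
// peel off one decimal digit at a time by dividing by 10, writing digits
// from the end of a fixed buffer so no reversal pass is needed.
std::string u64_to_decimal(uint64_t value) {
    char buf[20];                 // 2^64 - 1 has at most 20 decimal digits
    char* p = buf + sizeof(buf);  // cursor starts one past the buffer end
    do {
        *--p = char('0' + value % 10);
        value /= 10;
    } while (value != 0);
    return std::string(p, buf + sizeof(buf));
}
```

Each iteration issues a dependent integer division, which is what table-based and SIMD approaches avoid.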
Additionally, we introduce a dual-variant design with dynamic
selection that adapts to input characteristics: a branch-heavy variant optimized
for homogeneous digit-length distributions and a branch-light variant for
heterogeneous datasets. Our single-core algorithm consistently outperforms all
ten competing methods across the full range of integer sizes, running
1.4–2× faster than the closest competitor and 2–4× faster than
the C++ standard library function
std::to_chars across tested workloads.
Type
Publication
Software: Practice and Experience
This paper has been submitted for peer review. Data are shared here for reviewer access.

Authors
Postdoctoral Researcher in Computer Science
I am currently a postdoctoral researcher in computer science at Université
TÉLUQ, where my research focuses on speeding up the conversion of integer and
floating-point numbers into decimal strings. During my doctoral studies, I
designed algorithms and data structures that leverage modern computer
architectures to solve large instances of Markov decision processes (MDPs). In
my master’s research, I developed routing algorithms for electric vehicles
aimed at determining the optimal path between two points while minimizing
travel time (including driving, charging, and expected waiting time at
charging stations).