GTAC: A Generative Transformer for Approximate Circuits

arXiv:2601.19906v2 (replace-cross)

Abstract: Targeting error-tolerant applications, approximate computing relaxes rigid functional equivalence to significantly improve power, performance, and area. Traditional approximate logic synthesis (ALS) relies on incremental rewriting, which limits design-space exploration. Meanwhile, the inherently probabilistic nature of Transformer-based generative AI makes it a natural fit for generating approximate circuits. Exploiting this, we propose GTAC, an end-to-end framework for arbitrary-scale generative ALS. To overcome the memory bottleneck of generative AI, GTAC partitions a large circuit into tractable subcircuits, applies a generative core to produce approximate candidates for each subcircuit, and finally selects suitable candidates to form the final design. Its core generative Transformer uses a novel irredundant encoding to compactly represent a circuit, alongside a masking mechanism that excludes designs violating the given error bound. Empowered by a self-evolutionary training strategy, GTAC establishes a new paradigm with superior performance: it reduces delay by 30.9% and gate count by 50.5% over exact generative baselines, and saves 6.5% area with a 4.3x speedup against traditional ALS methods. Furthermore, its irredundant encoding achieves a 33.3x reduction in sequence length and a 61.6x reduction in peak memory compared to conventional memoryless traversal.
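The partition → generate → mask → select flow described above can be sketched in miniature. The following is a hypothetical illustration only, not the paper's implementation: the gate representation, the toy stand-in for the generative Transformer (which simply drops gates at random), and the error metric are all assumptions made for the sketch.

```python
import random

def partition(gates, size):
    """Split a flat gate list into tractable subcircuits of at most `size` gates."""
    return [gates[i:i + size] for i in range(0, len(gates), size)]

def generate_candidates(subcircuit, n=4):
    """Toy stand-in for the generative Transformer: propose variants that drop gates.

    The fraction of dropped gates serves as a crude proxy for approximation error.
    """
    rng = random.Random(0)  # fixed seed so the sketch is deterministic
    candidates = []
    for _ in range(n):
        kept = [g for g in subcircuit if rng.random() > 0.3]
        error = (len(subcircuit) - len(kept)) / max(len(subcircuit), 1)
        candidates.append({"gates": kept, "error": error})
    return candidates

def select(candidates, error_bound):
    """Masking step: discard candidates over the bound, keep the smallest survivor."""
    valid = [c for c in candidates if c["error"] <= error_bound]
    return min(valid, key=lambda c: len(c["gates"])) if valid else None

def approximate(gates, size=4, error_bound=0.5):
    """End-to-end pipeline: partition, generate per subcircuit, select under the bound."""
    result = []
    for sub in partition(gates, size):
        best = select(generate_candidates(sub), error_bound)
        result.extend(best["gates"] if best else sub)  # fall back to the exact subcircuit
    return result

circuit = [f"g{i}" for i in range(12)]
approx = approximate(circuit)
assert len(approx) <= len(circuit)
```

In the actual framework, the candidate generator is a trained Transformer over the irredundant circuit encoding and the error check is enforced during decoding via the masking mechanism; the per-subcircuit structure above is what keeps peak memory tractable for large circuits.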
