Bioinspired123D: Generative 3D Modeling System for Bioinspired Structures
arXiv:2603.29592v1 Announce Type: cross
Abstract: Generative AI has made rapid progress in text, image, and video synthesis, yet text-to-3D modeling for scientific design remains particularly challenging due to limited controllability and high computational cost. Most existing 3D generative methods rely on meshes, voxels, or point clouds, which can be costly to train on and difficult to control. We introduce Bioinspired123D, a lightweight and modular code-as-geometry pipeline that generates fabricable 3D structures directly through parametric programs rather than dense visual representations. At the core of Bioinspired123D is Bioinspired3D, a compact language model finetuned to translate natural language design cues into Blender Python scripts encoding smooth, biologically inspired geometries. We curate a domain-specific dataset of over 4,000 bioinspired and geometric design scripts spanning helical, cellular, and tubular motifs with parametric variability. The dataset is expanded and validated through an automated LLM-driven, Blender-based quality-control pipeline. Bioinspired3D is then embedded in a graph-based agentic framework that integrates multimodal retrieval-augmented generation and a vision-language model critic to iteratively evaluate, critique, and repair generated scripts. We evaluate performance on a new benchmark for 3D geometry script generation and show that Bioinspired123D achieves a nearly fourfold improvement over its non-finetuned base model, while also outperforming substantially larger state-of-the-art language models despite using far fewer parameters and less compute. By prioritizing code-as-geometry representations, Bioinspired123D enables compute-efficient, controllable, and interpretable text-to-3D generation, lowering barriers to AI-driven scientific discovery in materials and structural design.
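To make the code-as-geometry idea concrete, the sketch below shows how a few named parameters can fully determine a smooth 3D structure such as a helical motif. This is illustrative only: the paper's pipeline emits Blender Python scripts, and `helix_vertices` is a hypothetical helper written here in plain Python, not code from the paper.

```python
import math

def helix_vertices(radius=1.0, pitch=0.25, turns=3.0, samples_per_turn=64):
    """Sample points along a circular helix: a compact parametric program
    whose handful of interpretable parameters determines the geometry."""
    n = int(turns * samples_per_turn)
    verts = []
    for i in range(n + 1):
        t = 2.0 * math.pi * turns * i / n          # angle swept so far
        x = radius * math.cos(t)
        y = radius * math.sin(t)
        z = pitch * t / (2.0 * math.pi)            # rise of `pitch` per full turn
        verts.append((x, y, z))
    return verts

# In a Blender script, such vertices would seed a mesh or curve object;
# editing `radius`, `pitch`, or `turns` regenerates the shape controllably.
pts = helix_vertices(radius=2.0, pitch=0.5, turns=4.0)
```

Because the representation is a program rather than a dense mesh, a critic or human can repair or retarget the design by editing a parameter instead of regenerating millions of voxels.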