Jiaheng Zhang pioneered and continues to lead the emerging field of zero-knowledge machine learning (ZKML), which uses zero-knowledge proofs (ZKPs) to guarantee both the integrity and the confidentiality of model inference. Building on this foundational work, his team's 2025 research reduced the proof-generation time for Transformer model inference to under one minute—a 100-fold improvement over prior systems—clearing a key barrier to deploying privacy-preserving AI at scale. To lower the technology's barrier to entry, he is also leading development of the zkPyTorch framework, which integrates ZKP capabilities directly into mainstream machine learning workflows.
In the blockchain field, he addresses the challenges of interoperability and scalability. His zkBridge is a cross-chain solution that eliminates external trust assumptions by using ZKPs to verify on-chain states, enabling secure cross-chain communication. The technology has been commercialized by Polyhedra Network, a company he co-founded, which now supports over 25 blockchains and processes hundreds of cross-chain transactions daily. To address scalability bottlenecks, his 2024 Pianist protocol enables scalable ZK-Rollups through a fully distributed design in which many machines collaboratively generate a single proof, in a model akin to a mining pool.
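The mining-pool analogy can be made concrete with a toy sketch: a large proving task is sharded across machines, each machine produces a sub-proof for its shard, and a coordinator folds the sub-proofs into one result. This is illustrative only—it is not the actual Pianist protocol (which builds on distributed polynomial commitments), all names are hypothetical, and the hash-based "sub-proofs" here merely stand in for real proof shares.

```python
# Toy illustration of mining-pool-style distributed proving.
# NOT the Pianist protocol: real systems use distributed polynomial
# commitments; hashes stand in for proof shares here.
import hashlib

def sub_prove(shard: bytes) -> bytes:
    """Each participating machine 'proves' only its shard of the work."""
    return hashlib.sha256(b"subproof|" + shard).digest()

def aggregate(sub_proofs: list) -> bytes:
    """The coordinator folds all sub-proofs into one compact result."""
    acc = hashlib.sha256(b"aggregate")
    for p in sub_proofs:
        acc.update(p)
    return acc.digest()

# Split a large witness across 4 machines, prove the shards
# independently (in a real deployment, in parallel), then combine.
witness = b"transactions-in-this-rollup-batch" * 100
shards = [witness[i::4] for i in range(4)]
proof = aggregate([sub_prove(s) for s in shards])
print(proof.hex())
```

The design point the sketch captures is that no single machine ever holds the whole proving workload, which is what lets proving capacity scale out horizontally.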
These application breakthroughs stem from his contributions to the design of the underlying ZKP protocols. His early protocols, such as Libra and Virgo, reduced the bottleneck of prover computational overhead and removed the reliance on trusted setups. More recently, he has expanded his research to the systems and hardware levels, proposing in 2025 a batch proof generation system (BatchZK) that leverages GPU pipeline parallelism to significantly improve proof-generation throughput.
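The general idea of pipeline parallelism for batch proving can be sketched as follows: proving is split into stages, and while one statement is in a late stage, the next statement already occupies an earlier stage, so the stages overlap across the batch. This is a minimal conceptual sketch using threads and queues—the stage names are hypothetical placeholders, and it does not reflect BatchZK's actual GPU implementation.

```python
# Toy sketch of pipelined batch proving (conceptual only, not BatchZK).
# Three placeholder stages overlap across a batch of statements.
import queue
import threading

def make_stage(fn, inq, outq):
    """Run fn on items from inq until a None sentinel, forwarding results."""
    def worker():
        while True:
            item = inq.get()
            if item is None:
                outq.put(None)  # propagate shutdown to the next stage
                return
            outq.put(fn(item))
    return threading.Thread(target=worker)

# Placeholder proving steps standing in for real pipeline stages.
witness_gen = lambda x: ("w", x)
commit      = lambda w: ("c", w)
finalize    = lambda c: ("proof", c)

def prove_batch(statements):
    q0, q1, q2, q3 = (queue.Queue() for _ in range(4))
    stages = [make_stage(witness_gen, q0, q1),
              make_stage(commit,      q1, q2),
              make_stage(finalize,    q2, q3)]
    for t in stages:
        t.start()
    for s in statements:          # feed the whole batch in
        q0.put(s)
    q0.put(None)                  # sentinel drains the pipeline in order
    proofs = []
    while (p := q3.get()) is not None:
        proofs.append(p)
    for t in stages:
        t.join()
    return proofs

proofs = prove_batch(range(5))
print(len(proofs))  # 5
```

Because each stage has its own worker, statement i+1 can enter witness generation while statement i is still being committed—the same overlap that, on a GPU, keeps all stages of the proving hardware busy at once.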
His work is shaping decentralized systems and verifiable AI, helping to build a more secure, transparent, and trustworthy digital future.