# quantum-computing

5 posts


Dynamic surface codes open new avenues for quantum error correction

Google Research has demonstrated the operation of dynamic surface codes for quantum error correction, marking a significant shift from traditional static circuit architectures. By alternating between different circuit constructions and re-tiling "detecting regions" in each cycle, these dynamic circuits offer greater flexibility to avoid hardware defects and suppress correlated errors. Experimental results on the Willow processor show that these methods can match the performance of static codes while significantly simplifying the physical design and fabrication of quantum chips.

## Error Triangulation via Dynamic Detecting Regions

Quantum error correction (QEC) works by localizing physical errors within specific "detecting regions" over multiple cycles to prevent them from affecting logical information. While standard surface codes use a static, square tiling for these regions, dynamic codes periodically change the tiling pattern.

* Dynamic circuits allow the system to "deform" the detecting regions in spacetime, providing multiple perspectives from which to triangulate errors.
* This approach enables the use of gate types and connectivity layouts that are not possible with fixed, repetitive cycles.
* The flexibility of dynamic re-tiling allows the system to sidestep common superconducting-qubit issues such as "dropouts" (failed qubits or couplers) and leakage out of the computational subspace.

## Quantum Error Correction on Hexagonal Lattices

Traditional square lattices require each physical qubit to connect to four neighbors, which creates significant overhead in wiring and coupler density. Dynamic circuits enable the use of a hexagonal lattice, where each qubit requires only three couplers.

* The hexagonal code alternates between two distinct cycle types, using one of the three couplers twice per cycle to maintain error-detection capability.
* Testing on the Willow processor showed that scaling the hexagonal code from distance 3 to distance 5 improved the logical error rate by a factor of 2.15, matching the performance of standard static circuits (a back-of-the-envelope extrapolation of this factor follows the summary).
* Reducing coupler density simplifies the optimization of qubit and gate frequencies, leading to a 15% improvement in simulated error suppression compared to four-coupler designs.

## Walking Circuits to Mitigate Leakage

Superconducting qubits are prone to "leakage," where a qubit exits its intended computational states (0 and 1) into a higher energy state (2). In static circuits, repeated measurements on the same physical qubits can cause these leakage errors to accumulate and spread.

* "Walking" circuits address this by shifting the roles of data and measurement qubits across the lattice in each cycle.
* By constantly moving the location where errors are measured, the circuit effectively "flushes out" leakage and other correlated errors before they can damage logical information.
* Experiments confirmed that walking circuits achieve error suppression equivalent to static circuits while offering a more robust defense against long-term error correlations.

## Flexibility with iSWAP Entangling Gates

Most superconducting quantum processors are optimized for controlled-Z (CZ) gates, but dynamic circuits prove that QEC can be implemented effectively using alternative gates like iSWAP.

* The research team demonstrated a dynamic surface code that utilizes iSWAP gates, which are native to many quantum hardware architectures.
* This flexibility ensures that QEC is not tethered to a specific gate set, allowing hardware designers to choose entangling operations that offer the highest physical fidelity for their specific device.

The move toward dynamic surface codes suggests a future where quantum processors are more resilient to manufacturing imperfections. By adopting hexagonal layouts and walking circuits, developers can reduce hardware complexity and mitigate physical noise, providing a more scalable path toward fault-tolerant quantum computing.
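To make the quoted scaling concrete, here is a quick back-of-the-envelope calculation in Python. The standard figure of merit in surface-code experiments is the error-suppression factor $\Lambda = \varepsilon_d / \varepsilon_{d+2}$, so the reported factor of 2.15 from distance 3 to distance 5 implies geometric suppression as the code distance grows. Only the 2.15 factor comes from the post; the starting error rate below is a hypothetical placeholder.

```python
# Extrapolating logical error rates from the quoted suppression factor.
# LAMBDA comes from the post (distance 3 -> 5 improvement); EPS_3 is a
# made-up placeholder for the distance-3 logical error rate per cycle.
LAMBDA = 2.15
EPS_3 = 1e-2

eps = EPS_3
for d in range(3, 16, 2):
    print(f"distance {d:2d}: projected logical error rate ~ {eps:.2e} per cycle")
    eps /= LAMBDA  # each +2 in code distance divides the error rate by LAMBDA
```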


Google Research 2025: Bolder breakthroughs, bigger impact

Google Research in 2025 has shifted toward an accelerated "Magic Cycle" that rapidly translates foundational breakthroughs into real-world applications across science, society, and consumer products. By prioritizing model efficiency, factuality, and agentic capabilities, the organization is moving beyond static text generation toward interactive, multi-modal systems that solve complex global challenges. This evolution is underpinned by a commitment to responsible AI development, ensuring that new technologies like quantum computing and generative UI are both safe and culturally inclusive.

## Enhancing Model Efficiency and Factuality

* Google introduced new efficiency-focused techniques like block verification (an evolution of speculative decoding, sketched after this summary) and the LAVA scheduling algorithm, which optimizes resource allocation in large cloud data centers.
* The Gemini 3 model achieved state-of-the-art results on factuality benchmarks, including SimpleQA Verified and the newly released FACTS benchmark suite, by emphasizing grounded world knowledge.
* Research into Retrieval Augmented Generation (RAG) led to the development of the LLM Re-Ranker in Vertex AI, which helps models determine whether they possess sufficient context to provide accurate answers.
* The Gemma open model expanded to support over 140 languages, supported by the TUNA taxonomy and the Amplify initiative to improve socio-cultural intelligence and data representation.

## Interactive Experiences through Generative UI

* A novel implementation of generative UI allows Gemini 3 to dynamically create visual interfaces, web pages, and tools in response to user prompts rather than providing static text.
* This technology is powered by specialized models like "Gemini 3-interactive," which are trained to output structured code and design elements.
* These capabilities have been integrated into AI Mode within Google Search, allowing for more immersive and customizable user journeys.

## Advanced Architectures and Agentic AI

* Google is exploring hybrid model architectures, such as Jamba-style models that combine State Space Models (SSMs) with traditional attention mechanisms to handle long contexts more efficiently.
* The development of agentic AI focuses on models that can reason, plan, and use tools, exemplified by Project Astra, a prototype for a universal AI agent.
* Specialized models like Gemini 3-code have been optimized to act as autonomous collaborators for software developers, assisting in complex coding tasks and system design.

## AI for Science and Planetary Health

* In biology, research teams utilized AI to map human heart and brain structures and employed RoseTTAFold-Diffusion to design new proteins for therapeutic use.
* The NeuralGCM model has revolutionized Earth sciences by combining traditional physics with machine learning for faster, more accurate weather and climate forecasting.
* Environmental initiatives include the FireSat satellite constellation for global wildfire detection and the expansion of AI-driven flood forecasting and contrail mitigation.

## Quantum Computing and Responsible AI

* Google achieved significant milestones in quantum error correction, developing low-overhead codes that bring the industry closer to a reliable, large-scale quantum computer.
* Security and safety remain central, with the expansion of SynthID, a watermarking tool for AI-generated text, audio, and video, to help users identify synthetic content.
* The team continues to refine the Secure AI Framework (SAIF) to defend against emerging threats while promoting the safe deployment of generative media models like Veo and Imagen.

To maximize the impact of these advancements, organizations should focus on integrating agentic workflows and RAG-based architectures to ensure their AI implementations are both factual and capable of performing multi-step tasks. Developers can leverage the Gemma open models to build culturally aware applications that scale across diverse global markets.
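Since the post describes block verification as an evolution of speculative decoding, a minimal sketch of the baseline accept/reject loop may help situate it. This is vanilla speculative-decoding verification (accept a drafted token with probability $\min(1, p/q)$, resample from the residual distribution on rejection), not Google's block verification, which per the post scores draft blocks jointly rather than token by token. All distributions and sizes here are toy stand-ins.

```python
# Hedged sketch of the token-by-token accept/reject loop in vanilla
# speculative decoding -- the baseline that block verification improves on.
import numpy as np

rng = np.random.default_rng(1)
VOCAB = 8  # toy vocabulary size

def toy_dist(rng):
    """Random categorical distribution standing in for a model's next-token probs."""
    p = rng.random(VOCAB)
    return p / p.sum()

def verify_draft(draft_tokens, q_dists, p_dists, rng):
    """Accept draft token t with probability min(1, p(t)/q(t)); on the first
    rejection, resample from the residual max(p - q, 0) and stop."""
    accepted = []
    for t, q, p in zip(draft_tokens, q_dists, p_dists):
        if rng.random() < min(1.0, p[t] / q[t]):
            accepted.append(t)            # target model agrees often enough
        else:
            residual = np.maximum(p - q, 0)
            residual /= residual.sum()
            accepted.append(int(rng.choice(VOCAB, p=residual)))
            break                         # tokens after a rejection are discarded
    return accepted

q_dists = [toy_dist(rng) for _ in range(4)]   # draft-model distributions
p_dists = [toy_dist(rng) for _ in range(4)]   # target-model distributions
draft = [int(rng.choice(VOCAB, p=q)) for q in q_dists]
print("draft:", draft, "-> accepted:", verify_draft(draft, q_dists, p_dists, rng))
```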


A new quantum toolkit for optimization

Researchers at Google Quantum AI have introduced Decoded Quantum Interferometry (DQI), a new quantum algorithm designed to tackle optimization problems that remain intractable for classical supercomputers. By leveraging the wavelike nature of quantum mechanics to create specific interference patterns, the algorithm converts complex optimization tasks into high-dimensional lattice decoding problems. This breakthrough provides a theoretical framework in which large-scale, error-corrected quantum computers could eventually outperform classical methods by several orders of magnitude on commercially relevant tasks.

### Linking Optimization to Lattice Decoding

* The DQI algorithm works by mapping the cost landscape of an optimization problem onto a periodic lattice structure.
* The "decoding" step identifies the nearest lattice element to a specific point in space, a task that becomes exponentially difficult for classical computers as the dimension grows into the hundreds or thousands.
* By using quantum interference to bridge these fields, researchers can apply decades of sophisticated classical decoding research, originally developed for data storage and transmission, to optimization challenges.
* The approach is unique because it takes a quantum computer to leverage these classical decoding algorithms in this way; conventional hardware cannot.

### Solving the Optimal Polynomial Intersection (OPI) Problem

* The most significant application of DQI is the OPI problem: find a low-degree polynomial that intersects the maximum number of given target points (made concrete in the toy example after this summary).
* OPI is a foundational task in data science (polynomial regression), cryptography, and digital error correction, yet it remains "hopelessly difficult" for classical algorithms in many scenarios.
* DQI transforms the OPI problem into a task of decoding Reed-Solomon codes, a family of codes widely used in technologies like QR codes and DVDs.
* Technical analysis indicates a massive performance gap: certain OPI instances could be solved by a quantum computer in a few million operations, while the most efficient classical algorithms would require over $10^{23}$ (one hundred sextillion) operations.

### Practical Conclusion

As quantum hardware moves toward the era of error correction, Decoded Quantum Interferometry identifies a specific class of NP-hard problems where quantum machines can provide a clear win. Researchers and industries focusing on cryptography and complex data regression should monitor DQI as a primary candidate for demonstrating the first generation of commercially viable quantum advantage in optimization.
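The OPI objective itself is simple to state, and a toy brute force makes it concrete: over a small prime field, enumerate all low-degree polynomials and count how many target points each one hits. This minimal sketch uses hypothetical sizes; real OPI instances have field sizes and degrees for which enumerating the $p^{k}$ coefficient vectors is hopeless, which is exactly the regime where DQI's recasting as Reed-Solomon decoding is claimed to win.

```python
# Toy brute-force search for a small OPI instance over GF(7).
# All sizes here are illustrative; real instances defeat enumeration.
from itertools import product

P = 7                    # small prime field GF(7)
DEGREE = 1               # search polynomials a0 + a1*x of degree <= 1
targets = [(0, 3), (1, 5), (2, 0), (3, 2), (4, 4), (5, 5), (6, 1)]

def evaluate(coeffs, x, p=P):
    """Evaluate the polynomial with the given coefficients at x mod p (Horner's rule)."""
    acc = 0
    for c in reversed(coeffs):
        acc = (acc * x + c) % p
    return acc

best_score, best_poly = -1, None
for coeffs in product(range(P), repeat=DEGREE + 1):   # all P**(DEGREE+1) candidates
    score = sum(evaluate(coeffs, x) == y for x, y in targets)
    if score > best_score:
        best_score, best_poly = score, coeffs

print(f"best coefficients {best_poly} intersect {best_score}/{len(targets)} targets")
```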


Accelerating the magic cycle of research breakthroughs and real-world applications

Google Research is accelerating a "magic cycle" in which breakthrough scientific discoveries and real-world applications continuously reinforce one another through advanced AI models and open platforms. By leveraging agentic tools and large-scale foundations, the company is transforming complex data into actionable insights across geospatial analysis, genomics, and quantum computing. This iterative process aims to solve critical global challenges while simultaneously uncovering new frontiers for future innovation.

### Earth AI and Geospatial Reasoning

* Google has integrated various geospatial models (including those for flood forecasting, wildfire tracking, and air quality) into a unified Earth AI program.
* The newly introduced Geospatial Reasoning Agent uses Large Language Models (LLMs) to allow non-experts to ask complex questions and receive plain-language answers derived from diverse datasets.
* Riverine flood models have been significantly expanded and now provide forecasts for over 2 billion people across 150 countries.
* New Remote Sensing and Population Dynamics Foundations have been released to help researchers understand nuanced correlations in planetary data and supply chain management.

### DeepSomatic and Genomic Research

* Building on ten years of genomics work, DeepSomatic is an AI tool designed to identify somatic mutations (genetic variants in tumors) to assist in cancer research.
* The tool follows earlier foundational models such as DeepVariant and DeepConsensus, which helped map human and non-human genomes.
* These advancements aim to move the medical field closer to precision medicine by providing health practitioners with higher-resolution data on genetic variations.

### The Magic Cycle of Research and Development

* Google highlights "Quantum Echoes" as a key breakthrough in quantum computing, contributing to the broader goal of solving fundamental scientific problems through high-scale computation.
* The acceleration of discovery is largely attributed to "agentic tools" that assist scientists in navigating massive datasets and uncovering new research opportunities.
* The company emphasizes a collaborative approach, making foundation models available to trusted testers and partners like the WHO and various international research institutes.

To maximize the impact of these breakthroughs, organizations should look toward integrating multimodal AI agents that can bridge the gap between specialized scientific data and practical decision-making. By utilizing open platforms and foundation models, the broader scientific community can translate high-level research into scalable solutions for climate resilience, healthcare, and global policy.


A verifiable quantum advantage

Google Quantum AI researchers have introduced "Quantum Echoes," a new algorithm that measures out-of-time-order correlators (OTOCs) to characterize quantum chaos. By demonstrating this task on the 103-qubit Willow chip, the team has achieved a verifiable quantum advantage that surpasses the limitations of previous random circuit sampling techniques. This work establishes a direct path toward solving practical problems in physics and chemistry, such as Hamiltonian learning, through the use of stable and reproducible quantum expectation values.

## Limitations of Random Circuit Sampling

* While the 2019 "quantum supremacy" milestone proved quantum computers could outperform classical ones, the bitstring sampling method used was difficult to verify and lacked practical utility.
* In large-scale quantum systems, specific bitstrings rarely repeat, which restricts the ability to extract useful, actionable information from the computation.
* Quantum Echoes instead targets quantum expectation values (such as magnetization, density, and velocity), which remain consistent across different quantum computers and are computationally verifiable.

## The Quantum Echoes Algorithm and OTOCs

* The algorithm measures OTOCs, which capture the state of a single qubit after a series of "forward" ($U$) and "backward" ($U^\dagger$) evolutions (see the numerical sketch at the end of this summary).
* In the experiment, 103 qubits on the Willow processor were evolved through random quantum circuits to reach a highly chaotic state.
* A perturbation (gate $B$) is applied between the forward and backward evolutions; if the system is chaotic, this small change triggers a "butterfly effect," leaving a final state significantly different from the initial one.
* Higher-order OTOCs involve multiple "round trips" of these evolutions, increasing the system's sensitivity to the perturbation and allowing a more detailed characterization of the quantum dynamics.

## Many-Body Interference and Signal Amplification

* The researchers discovered that higher-order OTOCs function like many-body interferometers, in which the quantum states of many particles interfere with one another.
* The perturbation gates ($B$ and $M$) act as mirrors; when a resonance condition is met (with $U^\dagger$ the exact inverse of $U$), constructive interference occurs.
* This constructive interference amplifies specific quantum correlations, so the OTOC signal magnitude scales as a negative power of the system size rather than decaying exponentially, as signals in chaotic systems typically do.
* The amplification makes the OTOC a sensitive instrument for identifying the specific correlations generated between two different qubits during the circuit's evolution.

## Practical Applications and Future Research

The success of the Quantum Echoes algorithm on the Willow chip marks a transition toward using quantum computers for tasks that are both beyond-classical and physically relevant. The method is particularly well suited to Hamiltonian learning in nuclear magnetic resonance (NMR) and to studying the flow of electrons in high-temperature superconductors. Moving forward, the ability to measure verifiable expectation values in the chaotic regime will be essential for researchers looking to simulate complex quantum materials that are impossible to model on classical hardware.
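As a concrete illustration of the forward/backward structure, here is a minimal numpy sketch: a toy 4-qubit system in which a Haar-random unitary stands in for the random circuits, with single-qubit Pauli-X operators as the perturbation $B$ and the measured operator $M$. These choices are hypothetical simplifications, not the Willow experiment. Without chaotic evolution the echo leaves $B$ and $M$ commuting and the OTOC stays at 1; a chaotic $U$ scrambles $B$ across the system and drives the OTOC toward zero, the butterfly effect described above.

```python
# Toy OTOC F = <psi0| U^dag B^dag U M^dag U^dag B U M |psi0> on 4 qubits.
# A Haar-random U stands in for a chaotic random circuit; B and M are
# Pauli-X on distant qubits. Everything here is an illustrative toy.
import numpy as np

rng = np.random.default_rng(0)
N = 4
DIM = 2 ** N

def haar_unitary(dim, rng):
    """Haar-random unitary via QR decomposition of a complex Gaussian matrix."""
    z = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
    q, r = np.linalg.qr(z)
    return q * (np.diag(r) / np.abs(np.diag(r)))  # fix column phases

def embed(op, site, n=N):
    """Place a 2x2 operator on one qubit of an n-qubit register."""
    out = np.array([[1.0]], dtype=complex)
    for i in range(n):
        out = np.kron(out, op if i == site else np.eye(2))
    return out

X = np.array([[0, 1], [1, 0]], dtype=complex)
B = embed(X, site=0)        # "butterfly" perturbation on qubit 0
M = embed(X, site=N - 1)    # measured operator on a distant qubit

psi0 = np.zeros(DIM, dtype=complex)
psi0[0] = 1.0               # |0...0>

for label, U in [("no evolution (U = I)", np.eye(DIM, dtype=complex)),
                 ("chaotic (Haar-random U)", haar_unitary(DIM, rng))]:
    W = U.conj().T @ B @ U  # B after the forward/backward "echo"
    F = np.vdot(M @ W @ psi0, W @ M @ psi0)  # <psi0| W^dag M^dag W M |psi0>
    print(f"{label:>24}: OTOC = {F.real:+.3f}")
```

With no evolution, $W = B$ commutes with $M$ and the printed OTOC is exactly +1; with the chaotic unitary it collapses toward 0, which is the scrambling signature the experiment measures at scale.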