Haiqu, a quantum software company, has demonstrated a proprietary technique for efficiently encoding complex, high-dimensional data onto quantum processors, achieving performance gains in anomaly detection. The experiment was run via the cloud on an IBM Quantum Heron processor.
The core innovation is a novel data encoding solution that addresses the dimensionality gap: the inability of most near-term quantum devices to handle datasets with hundreds or thousands of features due to limited qubit counts. Haiqu’s method successfully loaded over 500 features into quantum circuits on 128 qubits, a process the company claims can scale to problems with tens of thousands of features on near-term hardware.
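To see why loading 500 features onto 128 qubits is nontrivial, it helps to compare the textbook encoding baselines: sparse angle encoding needs one qubit per feature, while dense amplitude encoding packs exponentially many values into few qubits at the cost of deep state-preparation circuits. Haiqu's encoding is proprietary and not described in the announcement; the sketch below shows only these standard baselines as a point of reference.

```python
import math

# Hedged illustration of the "dimensionality gap". These are textbook
# encoding baselines, NOT Haiqu's proprietary method.

def qubits_angle_encoding(n_features: int) -> int:
    # One rotation gate per feature -> one qubit per feature.
    return n_features

def qubits_amplitude_encoding(n_features: int) -> int:
    # n qubits hold 2**n amplitudes, so ceil(log2(n_features)) qubits
    # suffice in principle, but state preparation is generally deep.
    return math.ceil(math.log2(n_features))

print(qubits_angle_encoding(500))      # 500 qubits: beyond sparse encoding on 128 qubits
print(qubits_amplitude_encoding(500))  # 9 qubits: compact, but costly circuits
```

The 128-qubit result sits between these extremes, which is why a scalable encoding scheme is the interesting part of the claim.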
The experiment used quantum preprocessing combined with classical machine learning to analyze high-dimensional financial datasets. This hybrid approach consistently outperformed a classical baseline, achieving a final F1 score of 0.96 even on the noisy IBM Quantum Heron hardware. The quantum features are generated with a form of Projected Quantum Kernel (PQK), in which classical data is encoded into quantum states and measured projections of those states serve as enriched features for the classical model.
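The hybrid pipeline described above can be sketched end to end with a toy PQK-style example. Haiqu's circuits and dataset are not public, so everything here is a stand-in: the "quantum" features are single-qubit Pauli expectations of a product-state angle encoding (which is classically simulable, unlike the entangling circuits a real PQK would use), the anomaly detector is a simple distance-to-centroid rule, and the data is synthetic.

```python
import numpy as np

# Minimal PQK-style sketch, assuming a product-state RY(x) encoding:
# for RY(x)|0>, the expectations are <Z> = cos(x) and <X> = sin(x).
# Real PQK circuits use entangling layers; this is only the general idea.

def pqk_features(X: np.ndarray) -> np.ndarray:
    # Projected features: per-feature Pauli expectations [<Z>, <X>].
    return np.concatenate([np.cos(X), np.sin(X)], axis=1)

def f1_score(y_true: np.ndarray, y_pred: np.ndarray) -> float:
    tp = np.sum((y_pred == 1) & (y_true == 1))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    return 2 * tp / (2 * tp + fp + fn)

# Synthetic stand-in data: 8 features, inliers near 0, anomalies near 1.5.
rng = np.random.default_rng(0)
normal = rng.normal(0.0, 0.3, size=(200, 8))
anomalies = rng.normal(1.5, 0.3, size=(20, 8))
X = np.vstack([normal, anomalies])
y = np.concatenate([np.zeros(200), np.ones(20)]).astype(int)

# "Train" on a clean slice: centroid of projected features of inliers.
Z = pqk_features(X)
center = pqk_features(normal[:100]).mean(axis=0)
scores = np.linalg.norm(Z - center, axis=1)

# Hypothetical cutoff: 99th percentile of inlier training scores.
threshold = np.quantile(scores[:100], 0.99)
pred = (scores > threshold).astype(int)
print(f"F1 = {f1_score(y, pred):.2f}")
```

The point of the sketch is the division of labor: the (simulated) quantum layer only produces features, and all fitting and scoring stays classical, which matches the preprocessing-plus-classical-ML structure the article describes.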
The preprocessing time on the quantum device was reported to be faster than classically simulating the same computations, providing an empirical signal of potential quantum advantage in industrial deployment. Haiqu is offering early access to its quantum feature embedding technology for beta testers to explore applications across financial modeling, predictive maintenance, and health diagnostics.
Read the full announcement here and the technical blog detailing the experiment and results here.
November 13, 2025
