Russian Scientists Cut Memory Use for Training Neural Networks Eightfold
Researchers at Moscow State University have introduced a new method for aligning medical images using neural operators.

Scientists at Moscow State University have found a way to reduce memory consumption in artificial intelligence training by a factor of eight, the university’s press service said.
They developed a new method for medical image registration based on neural operators. Their FNOReg model does not require high-resolution images for training, which matters most when working with large volumes of 3D data: it saves computing resources without sacrificing accuracy.
In medicine, precisely aligning images of the same anatomy, a process known as registration, is a core step in diagnosis. However, traditional mathematical approaches demand significant computing power and careful manual tuning, while neural network-based methods require more memory than most desktop computers have.
Resilient to Resolution Changes
During testing, the model demonstrated accuracy comparable to leading international solutions. While the accuracy of other programs dropped by 24–25% when image quality was reduced by half, FNOReg’s accuracy declined by less than 1%.
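The article does not give FNOReg's implementation details, but its resilience to resolution changes is characteristic of Fourier neural operators in general: each layer learns weights for a fixed number of low-frequency Fourier modes, so the same trained weights apply to an input sampled at any grid resolution. A minimal 1-D sketch of this idea (all names and parameters here are illustrative, not from the paper):

```python
import numpy as np

def spectral_layer(x, weights, n_modes=8):
    """Spectral convolution that keeps only the lowest n_modes Fourier
    modes, so one set of weights works at any input resolution."""
    X = np.fft.rfft(x)                      # to the frequency domain
    out = np.zeros_like(X)
    out[:n_modes] = X[:n_modes] * weights   # mix only the retained modes
    return np.fft.irfft(out, n=len(x))      # back to the spatial domain

rng = np.random.default_rng(0)
w = rng.normal(size=8) + 1j * rng.normal(size=8)  # one weight per mode

# The same 8 weights are applied to a coarse and a 4x finer sampling
# of the same signal; the layer never sees the grid size at "training".
coarse = spectral_layer(np.sin(np.linspace(0, 2 * np.pi, 64, endpoint=False)), w)
fine = spectral_layer(np.sin(np.linspace(0, 2 * np.pi, 256, endpoint=False)), w)
```

Because the layer acts on normalized Fourier coefficients rather than pixels, `fine` evaluated at the coarse grid points matches `coarse`, which is the property that lets such a model train on low-resolution images and still handle high-resolution ones.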
As Dmitry Sorokin, senior researcher at the Laboratory of Mathematical Image Processing Methods, noted, the development paves the way for more efficient large-scale data processing in medicine.