Photo: STR/JIJI PRESS/AFP (Getty Images)
Kyoto University, a top research institution in Japan, recently lost a ton of research data after its supercomputer system accidentally erased as much as 77 terabytes during what should have been a routine backup procedure.
The incident, which occurred sometime between Dec. 14 and 16, wiped out roughly 34 million files belonging to 14 different research groups using the school’s supercomputer system. The university operates Hewlett Packard Cray computing systems and a DataDirect ExaScaler storage system, which research teams can use for a variety of purposes.
It’s unclear exactly which files were deleted or what caused the erasure, though the school has said the work of at least four groups cannot be recovered.
BleepingComputer, which originally reported on the incident, helpfully points out that supercomputing research isn’t cheap, either: running a system like this costs somewhere in the neighborhood of hundreds of dollars an hour.
Kyoto University, one of the most highly regarded schools in Japan and the recipient of significant research funding, published details about the unfortunate incident in mid-December.
“Dear users of Supercomputing services,” the message begins (translated to English via Google). “Today, a bug in the storage system backup utility caused an accident where some files in /LARGE0 were lost. We have stopped processing the issue, but we may have lost close to 100TB of files and are investigating the magnitude of the impact.”
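The notice doesn’t say what the bug actually was, but a cleanup or rotation step that deletes the wrong path is a classic failure mode for backup utilities: if a path variable unexpectedly comes back empty or malformed, a delete aimed at expired backups can land on live data instead. Below is a minimal, purely hypothetical Python sketch of that failure mode and the guard that prevents it; none of the names or paths here come from Kyoto’s actual system:

    import shutil
    from pathlib import Path

    # Hypothetical backup tree; invented for illustration only.
    BACKUP_ROOT = Path("/backups")

    def purge_expired_backup(subdir: str) -> None:
        """Delete one expired backup directory under BACKUP_ROOT."""
        target = (BACKUP_ROOT / subdir).resolve()
        # The classic bug: if `subdir` arrives empty or malformed
        # (say, a config variable failed to load mid-update), `target`
        # silently collapses to BACKUP_ROOT itself, or escapes it via
        # "..", and the recursive delete hits far more than intended.
        if not subdir or BACKUP_ROOT.resolve() not in target.parents:
            raise ValueError(f"refusing unsafe delete target: {target}")
        shutil.rmtree(target)

    # Normal use (removes only that one snapshot):
    #   purge_expired_backup("snapshot-2021-12-01")
    # The buggy path: an empty name would otherwise delete /backups
    # wholesale; the guard raises ValueError instead:
    #   purge_expired_backup("")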
Supercomputing differs from normal computing largely in its speed and its ability to yoke many computer systems together to churn through complex mathematical calculations in parallel. Those advantages make it a valuable research tool across a whole range of areas, including climate and atmospheric modelling, physics, vaccine science, and everything in between (the toy sketch below illustrates the basic split-compute-combine pattern). Unfortunately, all of that is pointless if your machine isn’t working properly.
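As a toy illustration of that parallelism, here is a short Python sketch that splits one big sum across several worker processes and combines the partial results. This is only a stand-in for the idea; real supercomputers coordinate thousands of nodes with dedicated frameworks such as MPI, not Python’s multiprocessing module:

    from multiprocessing import Pool

    def partial_sum(bounds):
        """Sum one slice of the range; each slice runs on its own core."""
        start, end = bounds
        return sum(range(start, end))

    if __name__ == "__main__":
        n, workers = 10_000_000, 8
        step = n // workers
        # Carve the problem into independent chunks, one per worker.
        chunks = [(i * step, n if i == workers - 1 else (i + 1) * step)
                  for i in range(workers)]
        with Pool(workers) as pool:
            # Map the chunks across processes, then combine the results:
            # the same split-compute-combine pattern supercomputers use,
            # just at a vastly smaller scale.
            total = sum(pool.map(partial_sum, chunks))
        print(total)  # 49999995000000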