
One-bit Flip is All You Need: When Bit-flip Attack Meets Model Training

We propose a training-assisted bit-flip attack, in which the adversary is involved in the training stage to build a high-risk model for release.


Pricing Type

  • Pricing Type: Free

GitHub Link

The GitHub link is https://github.com/jianshuod/tba

Introduction

The GitHub repository “jianshuod/TBA” contains the official code for the ICCV 2023 paper “One-bit Flip is All You Need: When Bit-flip Attack Meets Model Training.” The project focuses on the intersection of bit-flip attacks and model training. The code, developed with Python 3 and PyTorch, provides the main pipeline for the method along with installation and usage instructions. The repository documents task specifications, hyperparameters, and results, most notably for attacking an 8-bit quantized ResNet-18. The work is licensed under the Apache License 2.0.
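To see why a single bit flip in a quantized model can matter so much, consider the sketch below. It is not the repository's code, just a minimal Python illustration of the underlying mechanism: toggling the most significant bit of an 8-bit two's-complement weight shifts its integer value by 128 quantization steps, while a low-order flip barely moves it.

```python
# Minimal illustration (not the paper's method) of a single-bit flip
# on an 8-bit two's-complement quantized weight.

def flip_bit(value: int, bit: int) -> int:
    """Flip one bit of an int8 value and reinterpret the result as signed."""
    flipped = (value & 0xFF) ^ (1 << bit)                 # toggle the chosen bit
    return flipped - 256 if flipped >= 128 else flipped   # back to signed int8

weight = 3
print(flip_bit(weight, 7))  # MSB flip: 3 -> -125 (a drastic change)
print(flip_bit(weight, 0))  # LSB flip: 3 -> 2 (nearly harmless)
```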


Content

This is the official implementation of our paper “One-bit Flip is All You Need: When Bit-flip Attack Meets Model Training,” accepted at ICCV 2023. The project is developed with Python 3 and PyTorch. If you find this work or our code useful for your research, please cite our paper.

To set up, run the install command from the work directory, then:

Step 1: Download the model checkpoint and place it in the directory “checkpoint/resnet18”.
Step 2: Fill in the path to the work directory on your server.
Step 3: Configure the path to the CIFAR-10 dataset in config.py.

A log for attacking an 8-bit quantized ResNet-18 is provided; see log_resnet18_8.txt for the results. This project is licensed under the terms of the Apache License 2.0; see the LICENSE file for the full text.
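For orientation, here is a hypothetical sketch of what Steps 1 through 3 amount to. The actual variable names in the repository's config.py may differ, so treat WORK_DIR, CIFAR10_PATH, and CHECKPOINT_PATH as illustrative placeholders only.

```python
# config.py — hypothetical sketch of the paths the setup steps refer to;
# the real variable names in the repository's config.py may differ.
WORK_DIR = "/path/to/TBA"                            # Step 2: work directory on your server
CIFAR10_PATH = "/path/to/datasets/cifar10"           # Step 3: CIFAR-10 dataset location
CHECKPOINT_PATH = WORK_DIR + "/checkpoint/resnet18"  # Step 1: where the downloaded checkpoint goes
```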

