TIST, 2026, Journal Article
Microscale-Searching Optimization for Transfer Learning-Based Filter Fine-Tuning
- Le Feng
- Fujian Feng
- Li Xiao
- Mian Tan
- Han Huang
- Yinong Wang
Fine-tuning has emerged as a popular technique in transfer learning, achieving remarkable results on various data-scarce tasks. The performance of fine-tuning in deep convolutional neural networks depends on selecting which parameters to fine-tune and which to freeze. However, it is difficult to determine which parameters of the pre-trained model need to be fine-tuned for a new task. This article proposes a filter-level discrete optimization model that identifies the subset of filters to fine-tune, with filter-selection encoding as its core step. Because the search space of the filter fine-tuning problem is enormous, we propose a filter-interactivity decomposition strategy that divides the whole problem into multiple sub-optimization problems and thereby finds a valid search subspace, i.e., a smaller subspace that still contains the optimal solution. Based on this decomposition strategy, we design a microscale-searching transfer optimization algorithm that solves each subproblem by searching the valid search subspace rather than the original search space of the filter fine-tuning problem. To verify the effectiveness of the proposed algorithm, extensive experiments are conducted on seven publicly available image classification datasets: Stanford Dogs, MIT Indoors, Caltech 256-30, Caltech 256-60, Aircraft, UCF-101, and Omniglot. Experimental results show that the proposed method significantly improves fine-tuning accuracy while effectively reducing the scale of the filter fine-tuning problem. Moreover, the proposed algorithm outperforms state-of-the-art fine-tuning methods for transfer learning.
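To make the abstract's idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation): filter selection is encoded as a binary mask over all filters, the filters are partitioned into groups standing in for the interactivity-based decomposition, and each subproblem is searched independently over its own, much smaller subspace. All function and parameter names (evaluate, decompose, microscale_search, group_size, trials) are illustrative assumptions, and the objective is a placeholder for validation accuracy after fine-tuning the selected filters.

```python
# Hypothetical sketch of filter-level selection with a decomposition-based search.
import random

def evaluate(mask):
    # Placeholder objective: a real version would fine-tune only the filters
    # whose mask entry is 1 and return validation accuracy.
    return sum(mask) * random.random()

def decompose(num_filters, group_size):
    # Partition filter indices into groups; the paper's strategy is assumed to
    # use filter interactivity, here a simple contiguous split for illustration.
    return [list(range(i, min(i + group_size, num_filters)))
            for i in range(0, num_filters, group_size)]

def microscale_search(num_filters, group_size=8, trials=20):
    mask = [0] * num_filters                      # start with all filters frozen
    for group in decompose(num_filters, group_size):
        best_mask, best_score = mask, float("-inf")
        for _ in range(trials):                   # search only this subproblem's subspace
            candidate = mask[:]
            for idx in group:
                candidate[idx] = random.randint(0, 1)
            score = evaluate(candidate)
            if score > best_score:
                best_mask, best_score = candidate, score
        mask = best_mask                          # keep the best setting for this group
    return mask

if __name__ == "__main__":
    print(microscale_search(num_filters=32))
```

In this sketch, each group's subproblem explores only 2^group_size candidate settings instead of the full 2^num_filters space, which is the sense in which the decomposition reduces the scale of the filter fine-tuning problem.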