Research Article

Big Transfer Learning for Fine Art Classification

Table 3

State-of-the-art results for artist, style, and genre categorization on the WikiArt dataset, including the number of samples and classes used in each task.

| Reference | Year | Method | Artist: Samples / Classes / Acc. (%) | Style: Samples / Classes / Acc. (%) | Genre: Samples / Classes / Acc. (%) |
|---|---|---|---|---|---|
| [9] | 2016 | CNN fine-tuning (AlexNet) | 19,050 / 23 / 76.11 | 81,444 / 27 / 54.5 | 64,993 / 10 / 74.14 |
| [31] | 2016 | Feature fusion | 18,599 / 23 / 63.06 | 78,449 / 27 / 45.97 | 63,691 / 10 / 60.28 |
| [43] | 2017 | CNN fine-tuning (ResNet-18) | 17,100 / 57 / 77.7 | – | – |
| [35] | 2017 | CNN fine-tuning (ResNet-34) | – | 79,434 / 26 / 61.15 | – |
| [44] | 2018 | CNN fine-tuning (CaffeNet) | 20,320 / 23 / 81.94 | 96,014 / 27 / 56.43 | 86,087 / 10 / 77.6 |
| [19] | 2019 | Two-stage classification approach | – | 26,400 / 22 / 66.71 | – |
| [32] | 2020 | RGB and brush stroke channels | 9,766 / 19 / 88.38 | 30,825 / 25 / 58.99 | 28,760 / 10 / 76.27 |
| [34] | 2021 | Structure selection | 19,050 / 23 / 91.73 | 81,444 / 27 / 69.97 | 64,993 / 10 / 78.03 |
| BiT-S (ours) | – | Big transfer | 19,050 / 23 / 91.34 | 81,444 / 27 / 68.27 | 64,993 / 10 / 80.88 |
| BiT-M (ours) | – | Big transfer | 19,050 / 23 / **93.50** | 81,444 / 27 / **71.24** | 64,993 / 10 / **82.39** |

The bold values show that the proposed transfer learning approach outperforms previous work by a large margin and achieves state-of-the-art performance on all three WikiArt classification tasks.
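To make the "Big transfer" rows concrete, the sketch below shows one way to fine-tune a BiT-M backbone on the WikiArt artist task. It is a minimal illustration, assuming the publicly released BiT-M R50x1 feature-vector module on TensorFlow Hub; the module URL, input resolution, optimizer settings, and dataset pipeline names are illustrative assumptions, not the exact training configuration behind Table 3.

```python
import tensorflow as tf
import tensorflow_hub as hub

NUM_CLASSES = 23          # artist task in Table 3; use 27 for style, 10 for genre
IMAGE_SIZE = (384, 384)   # assumed fine-tuning resolution, not taken from the paper

# BiT-M R50x1 feature extractor (URL from the public BiT release on TF Hub).
backbone = hub.KerasLayer(
    "https://tfhub.dev/google/bit/m-r50x1/1",
    trainable=True,       # fine-tune the whole backbone, not only the new head
)

model = tf.keras.Sequential([
    tf.keras.layers.InputLayer(input_shape=IMAGE_SIZE + (3,)),
    backbone,
    # New task head, zero-initialized as in the public BiT fine-tuning examples.
    tf.keras.layers.Dense(NUM_CLASSES, kernel_initializer="zeros"),
])

model.compile(
    optimizer=tf.keras.optimizers.SGD(learning_rate=3e-3, momentum=0.9),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
    metrics=["accuracy"],
)

# train_ds / val_ds are assumed tf.data.Dataset pipelines yielding (image, label)
# pairs resized to IMAGE_SIZE and scaled to [0, 1].
# model.fit(train_ds, validation_data=val_ds, epochs=10)
```

Fine-tuning the entire backbone with a freshly zero-initialized head mirrors the general BiT recipe; changing NUM_CLASSES to 27 or 10 adapts the same sketch to the style and genre tasks.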