
Pyramidal Combination of Separable Branches for Deep Short Connected Neural Networks

EasyChair Preprint 464

12 pages · Date: August 29, 2018

Abstract

Recent works have shown that Convolutional Neural Networks (CNNs) with deeper structures and short connections achieve excellent performance on image classification tasks. However, deep short connected neural networks have been shown to behave merely as ensembles of relatively shallow networks. Motivated by this observation, instead of traditional networks built by stacking simple modules, we propose Pyramidal Combination of Separable Branches Neural Networks (PCSB-Nets), whose basic module is deeper, more delicate, and more flexible, with far fewer parameters. PCSB-Nets can fuse the captured features more thoroughly, substantially improve parameter efficiency, and enhance the model's generalization ability and capacity. Experiments show that this novel architecture achieves performance gains on benchmark CIFAR image classification datasets.
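The parameter savings behind separable branches come from factorizing a standard convolution into a depthwise and a pointwise step. The abstract does not give the paper's exact module, so the following is only a minimal sketch of that general counting argument (plain Python, biases ignored; the channel sizes are illustrative assumptions, not values from the paper):

```python
def conv_params(c_in, c_out, k):
    # Standard k x k convolution: each of the c_out filters
    # spans all c_in input channels.
    return c_out * c_in * k * k

def separable_params(c_in, c_out, k):
    # Depthwise k x k step: one k x k filter per input channel,
    # followed by a 1x1 pointwise convolution mixing channels.
    return c_in * k * k + c_in * c_out

# Illustrative sizes (not taken from the paper).
c_in, c_out, k = 64, 128, 3
standard = conv_params(c_in, c_out, k)       # 73728
separable = separable_params(c_in, c_out, k) # 8768
print(f"standard: {standard}, separable: {separable}, "
      f"ratio: {standard / separable:.1f}x")
```

For these sizes the separable factorization uses roughly 8x fewer parameters, which is the kind of budget that lets each basic module be made deeper without increasing the total parameter count.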

Keyphrases: CNNs, PCSB-Nets, deep learning

BibTeX entry
BibTeX does not have the right entry for preprints. This is a hack for producing the correct reference:
@booklet{EasyChair:464,
  author    = {Yao Lu and Guangming Lu and Rui Lin and Bing Ma},
  title     = {Pyramidal Combination of Separable Branches for Deep Short Connected Neural Networks},
  doi       = {10.29007/dq2m},
  howpublished = {EasyChair Preprint 464},
  year      = {EasyChair, 2018}}