Knowledge Distillation Circumvents Nonlinearity for Optical Convolutional Neural Networks

dc.contributor.advisor: Shlizerman, Eli
dc.contributor.author: Xiang, Jinlin
dc.date.accessioned: 2021-07-07T20:03:22Z
dc.date.issued: 2021-07-07
dc.date.submitted: 2021
dc.description: Thesis (Master's)--University of Washington, 2021
dc.description.abstract: In recent years, Convolutional Neural Networks (CNNs) have enabled ubiquitous image processing applications. As such, CNNs require fast runtime (forward propagation) to process high-resolution visual streams in real time. This remains a challenging task even with state-of-the-art graphics and tensor processing units. The bottleneck in computational efficiency primarily occurs in the convolutional layers. Performing operations in the Fourier domain is a promising way to accelerate forward propagation since it transforms convolutions into elementwise multiplications, which are considerably faster to compute for large inputs and kernels. Furthermore, such computation could be implemented using an optical 4f system with orders of magnitude faster operation. However, a major challenge in using this spectral approach, as well as in an optical implementation of CNNs, is the inclusion of nonlinearity between each convolutional layer, without which CNN performance drops dramatically. Here, we propose a Spectral CNN Linear Counterpart (SCLC) network architecture and develop a Knowledge Distillation (KD) approach to circumvent the need for nonlinearity and successfully train such networks. While the KD approach is known in machine learning as an effective process for network pruning, we adapt the approach to transfer the knowledge from a nonlinear network (teacher) to a linear counterpart (student). We show that the KD approach can achieve performance that easily surpasses the standard linear version of a CNN and could approach the performance of the nonlinear network. Our simulations show that the possibility of increasing the resolution of the input image allows our proposed 4f optical linear network to perform more efficiently than a nonlinear network with the same accuracy on two fundamental image processing tasks: (i) object classification and (ii) semantic segmentation.
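The abstract's speedup argument rests on the convolution theorem: a circular convolution in the spatial domain equals an elementwise product in the Fourier domain, which is what a 4f optical system computes. A minimal NumPy sketch of that equivalence (all names and sizes are illustrative, not taken from the thesis):

```python
import numpy as np

rng = np.random.default_rng(0)
image = rng.standard_normal((8, 8))
kernel = rng.standard_normal((8, 8))

# Spectral route: FFT both signals, multiply elementwise, inverse FFT.
spectral = np.real(np.fft.ifft2(np.fft.fft2(image) * np.fft.fft2(kernel)))


def circ_conv2d(x, k):
    """Brute-force 2D circular convolution, for comparison only."""
    n, m = x.shape
    out = np.zeros_like(x)
    for i in range(n):
        for j in range(m):
            s = 0.0
            for u in range(n):
                for v in range(m):
                    s += x[u, v] * k[(i - u) % n, (j - v) % m]
            out[i, j] = s
    return out


# The two routes agree up to floating-point tolerance.
assert np.allclose(spectral, circ_conv2d(image, kernel))
```

The spectral route costs O(N log N) per layer versus O(N k) for direct convolution with an N-pixel input and k-tap kernel, which is why large inputs and kernels favor the Fourier-domain (and optical) implementation.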
dc.embargo.lift: 2022-07-07T20:03:22Z
dc.embargo.terms: Restrict to UW for 1 year -- then make Open Access
dc.format.mimetype: application/pdf
dc.identifier.other: Xiang_washington_0250O_22581.pdf
dc.identifier.uri: http://hdl.handle.net/1773/47092
dc.language.iso: en_US
dc.rights: none
dc.subject: Convolutional Neural Networks
dc.subject: Knowledge Distillation
dc.subject: Optical Neural Networks
dc.subject: Optics
dc.subject.other: Mechanical engineering
dc.title: Knowledge Distillation Circumvents Nonlinearity for Optical Convolutional Neural Networks
dc.type: Thesis

Files

Original bundle

Name: Xiang_washington_0250O_22581.pdf
Size: 2.86 MB
Format: Adobe Portable Document Format