Handle URI:
http://hdl.handle.net/10754/626689
Title:
Deep Learning Microscopy
Authors:
Rivenson, Yair; Gorocs, Zoltan; Gunaydin, Harun; Zhang, Yibo; Wang, Hongda; Ozcan, Aydogan
Abstract:
We demonstrate that a deep neural network can significantly improve optical microscopy, enhancing its spatial resolution over a large field of view and depth of field. After training, the only input to this network is an image acquired with a regular optical microscope, without any changes to the microscope's design. We blindly tested this deep learning approach using various tissue samples imaged with low-resolution, wide-field systems; the network rapidly outputs an image with remarkably better resolution, matching the performance of higher numerical-aperture lenses while significantly surpassing their limited field of view and depth of field. These results are transformative for the many fields that rely on microscopy, including the life sciences, where optical microscopy is among the most widely used and deployed techniques. Beyond such applications, the presented approach is broadly applicable to other imaging modalities spanning different parts of the electromagnetic spectrum, and can be used to design computational imagers that keep improving as they continue to image specimens and establish new transformations among different modes of imaging. (A minimal code sketch of this image-to-image idea appears after the record below.)
Publisher:
arXiv
Issue Date:
12-May-2017
arXiv ID:
arXiv:1705.04709
Type:
Preprint
Additional Links:
http://arxiv.org/abs/1705.04709v1; http://arxiv.org/pdf/1705.04709v1
Appears in Collections:
Publications Acknowledging KAUST Support
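Illustrative sketch: the abstract describes a trained image-to-image network that takes a single wide-field, low-resolution microscope image and outputs a resolution-enhanced version. The minimal SRCNN-style PyTorch model below is an assumption made for illustration; the layer counts, kernel sizes, and the MicroscopyEnhancer name are hypothetical and do not reproduce the authors' published architecture.

import torch
import torch.nn as nn

class MicroscopyEnhancer(nn.Module):
    """Toy convolutional image-to-image network (hypothetical, SRCNN-style)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 64, kernel_size=9, padding=4),   # coarse feature extraction
            nn.ReLU(inplace=True),
            nn.Conv2d(64, 32, kernel_size=5, padding=2),  # nonlinear feature mapping
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 1, kernel_size=5, padding=2),   # reconstruct enhanced image
        )

    def forward(self, x):
        # Input and output share spatial dimensions; after training, the
        # output would carry restored high-frequency detail.
        return self.net(x)

model = MicroscopyEnhancer()
low_res = torch.rand(1, 1, 256, 256)   # one grayscale microscope image patch
enhanced = model(low_res)              # same shape: torch.Size([1, 1, 256, 256])

In the setting the abstract describes, such a network would be trained on co-registered image pairs acquired with low- and high-numerical-aperture objectives, so that at inference time only the low-resolution, wide-field image is needed.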

Full metadata record

DC Field | Value | Language
dc.contributor.author | Rivenson, Yair | en
dc.contributor.author | Gorocs, Zoltan | en
dc.contributor.author | Gunaydin, Harun | en
dc.contributor.author | Zhang, Yibo | en
dc.contributor.author | Wang, Hongda | en
dc.contributor.author | Ozcan, Aydogan | en
dc.date.accessioned | 2018-01-04T07:51:39Z | -
dc.date.available | 2018-01-04T07:51:39Z | -
dc.date.issued | 2017-05-12 | en
dc.identifier.uri | http://hdl.handle.net/10754/626689 | -
dc.description.abstract | We demonstrate that a deep neural network can significantly improve optical microscopy, enhancing its spatial resolution over a large field of view and depth of field. After training, the only input to this network is an image acquired with a regular optical microscope, without any changes to the microscope's design. We blindly tested this deep learning approach using various tissue samples imaged with low-resolution, wide-field systems; the network rapidly outputs an image with remarkably better resolution, matching the performance of higher numerical-aperture lenses while significantly surpassing their limited field of view and depth of field. These results are transformative for the many fields that rely on microscopy, including the life sciences, where optical microscopy is among the most widely used and deployed techniques. Beyond such applications, the presented approach is broadly applicable to other imaging modalities spanning different parts of the electromagnetic spectrum, and can be used to design computational imagers that keep improving as they continue to image specimens and establish new transformations among different modes of imaging. | en
dc.publisher | arXiv | en
dc.relation.url | http://arxiv.org/abs/1705.04709v1 | en
dc.relation.url | http://arxiv.org/pdf/1705.04709v1 | en
dc.title | Deep Learning Microscopy | en
dc.type | Preprint | en
dc.contributor.institution | California NanoSystems Institute (CNSI), University of California, Los Angeles, CA, 90095, USA. | en
dc.contributor.institution | Bioengineering Department, University of California, Los Angeles, CA, 90095, USA. | en
dc.contributor.institution | Electrical Engineering Department, University of California, Los Angeles, CA, 90095, USA. | en
dc.contributor.institution | Department of Surgery, David Geffen School of Medicine, University of California, Los Angeles, CA, 90095, USA. | en
dc.identifier.arxivid | arXiv:1705.04709 | en
All items in the KAUST repository are protected by copyright, with all rights reserved, unless otherwise indicated.