- WaveY-Net: Physics-Augmented Deep Learning for High-Speed Electromagnetic Simulation and Optimization. SPIE-INT SOC OPTICAL ENGINEERING, 2022
- Multiobjective and categorical global optimization of photonic structures based on ResNet generative neural networks. NANOPHOTONICS, 2021; 10 (1): 361–69
- Deep neural networks for the evaluation and design of photonic devices. NATURE REVIEWS MATERIALS, 2020
- Design Space Reparameterization Enforces Hard Geometric Constraints in Inverse-Designed Nanophotonic Devices. ACS PHOTONICS, 2020; 7 (11): 3141–51
- Robust Freeform Metasurface Design Based on Progressively Growing Generative Networks. ACS PHOTONICS, 2020; 7 (8): 2098–2104
MetaNet: a new paradigm for data sharing in photonics research. OPTICS EXPRESS, 2020; 28 (9): 13670–81
Optimization methods are playing an increasingly important role in all facets of photonics engineering, from integrated photonics to free space diffractive optics. However, efforts in the photonics community to develop optimization algorithms remain uncoordinated, which has hindered proper benchmarking of design approaches and access to device designs based on optimization. We introduce MetaNet, an online database of photonic devices and design codes intended to promote coordination and collaboration within the photonics community. Using metagratings as a model system, we have uploaded over one hundred thousand device layouts to the database, as well as source code for implementations of local and global topology optimization methods. Further analyses of these large datasets allow the distribution of optimized devices to be visualized for a given optimization method. We expect that the coordinated research efforts enabled by MetaNet will expedite algorithm development for photonics design.
DOI: 10.1364/OE.388378
Web of Science ID: 000530854700092
PubMedID: 32403837
Reparameterization to Enforce Constraints in the Inverse Design of Metasurfaces
Web of Science ID: 000612090000187
Free-Form Diffractive Metagrating Design Based on Generative Adversarial Networks.
A key challenge in metasurface design is the development of algorithms that can effectively and efficiently produce high-performance devices. Design methods based on iterative optimization can push the performance limits of metasurfaces, but they require extensive computational resources that limit their implementation to small numbers of microscale devices. We show that generative neural networks can train from images of periodic, topology-optimized metagratings to produce high-efficiency, topologically complex devices operating over a broad range of deflection angles and wavelengths. Further iterative optimization of these designs yields devices with enhanced robustness and efficiencies, and these devices can be utilized as additional training data for network refinement. In this manner, generative networks can be trained, with a one-time computation cost, and used as a design tool to facilitate the production of near-optimal, topologically complex device designs. We envision that such data-driven design methodologies can apply to other physical sciences domains that require the design of functional elements operating across a wide parameter space.
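The data-driven loop the abstract describes — train a generative network on topology-optimized layouts, refine its samples with further optimization, and fold the refined devices back into the training set — can be sketched numerically. This is a hedged toy illustration, not the paper's implementation: `efficiency` is a surrogate figure of merit, `train_generator` fits a simple per-pixel model in place of a generative network, and `refine` stands in for adjoint-based iterative optimization; all names are hypothetical.

```python
import numpy as np

# Toy stand-ins for the generate -> refine -> retrain workflow. The "device"
# is a 1D pixel pattern in [-1, 1]; `ideal` plays the role of the unknown
# high-efficiency layout that topology optimization would discover.
rng = np.random.default_rng(1)
n_pixels = 32
ideal = np.sign(rng.standard_normal(n_pixels))   # unknown best layout (toy)

def efficiency(x):
    return float(np.mean(x * ideal))             # surrogate deflection efficiency

def train_generator(dataset):
    mean = dataset.mean(axis=0)                  # stand-in "generative network"
    return lambda: np.clip(mean + 0.3 * rng.standard_normal(n_pixels), -1.0, 1.0)

def refine(x):
    return np.clip(x + 0.5 * ideal, -1.0, 1.0)   # stand-in iterative optimizer

# Seed dataset: noisy copies of good layouts (stands in for the images of
# topology-optimized metagratings), then three rounds of refinement.
dataset = np.clip(ideal + 0.8 * rng.standard_normal((50, n_pixels)), -1.0, 1.0)
for _ in range(3):
    sample = train_generator(dataset)
    refined = np.stack([refine(sample()) for _ in range(20)])
    dataset = np.vstack([dataset, refined])      # augment data for network refinement

best = max(dataset, key=efficiency)
```

The point of the loop is that refinement cost is paid once per generation, after which the retrained generator emits near-optimal candidates cheaply.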
DOI: 10.1021/acsnano.9b02371
PubMedID: 31314492
Global Optimization of Dielectric Metasurfaces Using a Physics-Driven Neural Network.
We present a global optimizer, based on a conditional generative neural network, which can output ensembles of highly efficient topology-optimized metasurfaces operating across a range of parameters. A key feature of the network is that it initially generates a distribution of devices that broadly samples the design space and then shifts and refines this distribution toward favorable design space regions over the course of optimization. Training is performed by calculating the forward and adjoint electromagnetic simulations of outputted devices and using the subsequent efficiency gradients for backpropagation. With metagratings operating across a range of wavelengths and angles as a model system, we show that devices produced from the trained generative network have efficiencies comparable to or better than the best devices produced by adjoint-based topology optimization, while requiring less computational cost. Our reframing of adjoint-based optimization to the training of a generative neural network applies generally to physical systems that can utilize gradients to improve performance.
DOI: 10.1021/acs.nanolett.9b01857
PubMedID: 31294997
Simulator-based training of generative neural networks for the inverse design of metasurfaces
DOI: 10.1515/nanoph-2019-0330