How to simulate exit pupil expander (EPE) with diffractive optics for augmented reality (AR) system in OpticStudio: part 3

In this article, an example is demonstrated of setting up an exit pupil expander (EPE) using the RCWA tool for an augmented reality (AR) system in OpticStudio. The planning of the gratings in k-space (optical momentum) is explained first, and then the details of setting up each grating are discussed.

Authored By Michael Cheng

Downloads

Article attachment

Introduction

This article is part 3 of a 4-part series. It demonstrates how to check the footprint diagram and simulate the image for the exit pupil expander system. At the end, possible improvements and other considerations for the EPE system are also discussed. Links to the other parts are listed below for the reader’s convenience.

How to simulate exit pupil expander (EPE) with diffractive optics for augmented reality (AR) system in OpticStudio: part 1

How to simulate exit pupil expander (EPE) with diffractive optics for augmented reality (AR) system in OpticStudio: part 2

How to simulate exit pupil expander (EPE) with diffractive optics for augmented reality (AR) system in OpticStudio: part 4

Check footprint

It is useful to investigate the footprint of each field when designing the EPE. To check the footprint, a detector needs to be added, and the result can then be observed in the Shaded Model. An example is saved in the attached file “step4_check footprint.zar”. (A scripted way to run the same ray trace through the ZOS-API is sketched after Figure 3.)

Key points to know when checking this file:

1.    The Color and Scale parameters of the detector are changed from their default values of 0. They are mainly used to switch between different appearances of the detector in the Shaded Model and do not have any physical effect.
2.    To make sure the detector can be clearly seen in the Shaded Model, we also need to change the opacity of all the grating and waveguide objects, as shown in Figure 1. In this example, the waveguide is set to 10% and the gratings are set to 30%, but these values are entirely up to the user.


Figure 1 An object can be made transparent in the Shaded Model by setting its Opacity to less than 100% in the Object Properties.

3.    The footprint can then be observed by opening the Shaded Model. In the Shaded Model settings, the objects’ hidden lines are shown and the detector pixels are colored by the last analysis, as shown in Figure 2. The final result is shown in Figure 3.


Figure 2 In the Shaded Model, show the objects’ hidden lines and color the detector pixels by the last analysis.


Figure 3 A detector is added to record the footprints from each field.
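
For users who prefer to drive this check from a script, the same ray trace can be run through the ZOS-API before inspecting the result in the Shaded Model. The snippet below is only a minimal sketch in Python (pythonnet): it assumes that TheSystem is an IOpticalSystem handle obtained with the standard connection boilerplate from the ZOS-API sample scripts, and that the attached archive has already been restored to a .zos file at the placeholder path shown.

import os

# Placeholder path -- point this to wherever the restored .zos file is saved.
sample_file = os.path.join(os.path.expanduser("~"), "Documents", "Zemax", "step4_check footprint.zos")
TheSystem.LoadFile(sample_file, False)

# Run the non-sequential ray trace so the detector records the footprints.
ray_trace = TheSystem.Tools.OpenNSCRayTrace()
ray_trace.SplitNSCRays = True       # the grating DLLs rely on ray splitting
ray_trace.UsePolarization = True    # ray splitting requires polarization ray tracing
ray_trace.ScatterNSCRays = False
ray_trace.IgnoreErrors = True
ray_trace.ClearDetectors(0)         # clear all detectors before tracing
ray_trace.RunAndWaitForCompletion()
ray_trace.Close()

# The footprints can now be inspected in the Shaded Model or a Detector Viewer.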

Image simulation

To simulate how the image looks when observed by the human eye, a dummy image source is required. Additionally, an ideal lens system is needed at the exit pupil to mimic the eye and check the image. An example is provided in the attached file “step5_image simulation.zar”.


Figure 4 To simulate how the image looks with this waveguide system, 6 objects are added. The first 3 objects (#10 to #12) form an ideal projection system that collimates the light from each point on the image source and sends the beam to the in-coupling grating. The last 3 objects (#13 to #15) form an imaging system that focuses the out-coupled beams onto a detector at its focal plane.

Key points to know when checking this file:

1.    Objects #10 to #12 together represent an ideal projection system, which projects the image source to infinity (afocal). The exit pupil of the projection system is at the couple-in grating, which is the entrance pupil of the EPE system.
2.    Object 10 is a rectangular Lambertian light source built by using a Source DLL object with Lambertian_Overfill.DLL. The parameters Target Diameter and Target Distance are set to match the couple-in grating. This makes the simulation efficient because only rays that can hit the couple-in grating are launched.
3.    Object 11 is a Slide object, which can use a bitmap as a mask. Its position should be slightly offset from Object 10. Combined with Object 10, it forms a Lambertian image source. In this example, the bitmap is a QR code used as a test image.
4.    Object 12 is a Paraxial Lens with a focal length equal to the distance between Object 12 and Objects 10 and 11. In other words, the image source is exactly at the focal plane of this ideal lens.
5.    The image source is rectangular with a width of 5 mm, and the focal length of the projection system is 10 mm. This means the AR system has an FOV of about ±14 degrees in both the X and Y directions and about ±20 degrees in the diagonal direction (a short calculation is sketched after this list).
6.    Objects #13 to #15 represent an ideal imaging system that simulates what a human eye would observe. Object 13 is a Paraxial Lens that focuses the infinite-conjugate image onto its focal plane. Object 15 is a detector placed at that focal plane to record the image formed by the Paraxial Lens.
7.    Object 14 is an Annulus that absorbs light. It is used for two reasons. First, Object 13, the Paraxial Lens, is rectangular, while the human pupil is circular. Second, an Annulus with a large outer diameter blocks the rays that do not pass through the Paraxial Lens but would otherwise hit the detector directly. The Annulus has an inner radius of 3 mm, which represents a human eye with a 6 mm pupil.
8.    Note that objects cannot overlap, so a small offset is applied to the Z Position of Objects 13 and 14. A ray first hits the exit pupil detector (Object 8), then the Paraxial Lens (Object 13), and is finally filtered by the Annulus (Object 14).
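
As a quick cross-check of the field of view quoted in point 5, the half-angles follow directly from the source half-width and the projection focal length. The short Python sketch below simply reuses the 5 mm source width and 10 mm focal length from this example:

import math

source_width_mm = 5.0    # full width of the rectangular image source (Objects 10 and 11)
focal_length_mm = 10.0   # focal length of the ideal projection lens (Object 12)

half_width = source_width_mm / 2.0
half_fov_xy = math.degrees(math.atan(half_width / focal_length_mm))
half_fov_diag = math.degrees(math.atan(half_width * math.sqrt(2.0) / focal_length_mm))

print(f"Half FOV in X and Y : +/-{half_fov_xy:.1f} deg")    # about +/-14.0 deg
print(f"Half FOV, diagonal  : +/-{half_fov_diag:.1f} deg")  # about +/-19.5 deg, i.e. ~20 deg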

Discussion

The example here is only for demonstration. The following points are not considered in detail in this article, but they might inspire designers of this kind of system.

1.    The position and shape of the turning grating can be rearranged for the best use of space. As shown in Figure 5, the turning grating can be placed as close to the couple-in grating as possible. The smallest required shape can be determined from the couple-in grating’s size and the desired FOV of the system. From the FOV, it is easy to find the “fan angle” formed by the light propagating from the couple-in grating (a k-space estimate of this fan angle is sketched after Figure 5).
2.    To improve the uniformity at the exit pupil, the grating can be separated into several regions with different grating parameters.
3.    In the image simulation demonstration, the eye pupil is only placed at the center of the out-coupling grating. It is important to also check how the eye receives the image when the eye pupil shifts, and to check the image quality when the eye pupil is smaller.
4.    In this demonstration, the projection system is idealized with a Paraxial Lens; in reality, it is a lens system designed separately. Using a real lens introduces aberrations, but it can also bring some flexibility to the system. For example, through adequate vignetting design, the F/# of different field points can be controlled to compensate for the performance of the waveguide and gratings.


Figure 5 The planning of the turning grating shape mainly needs to consider the couple-in grating’s size and the FOV of the system. The FOV used by the system determines the “fan angle”.
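
To make the fan angle in point 1 concrete, the in-plane angular spread after the couple-in grating can be estimated in k-space, in the same spirit as the planning described in part 2. The Python sketch below is a rough estimate only: the 525 nm wavelength and 380 nm grating period are assumed illustration values, not taken from the attached files, and the grating vector is assumed to lie along +X with the small cross terms in the field direction cosines ignored.

import itertools
import math

# Assumed illustration values -- replace with the actual design parameters.
wavelength_um = 0.525        # wavelength in air
grating_period_um = 0.38     # period of the couple-in grating
half_fov_deg = 14.0          # half FOV in X and Y (see the image simulation section)

k0 = 2.0 * math.pi / wavelength_um       # vacuum wavenumber
kg = 2.0 * math.pi / grating_period_um   # grating vector magnitude, assumed along +X

azimuths = []
for sx, sy in itertools.product((-1.0, 1.0), repeat=2):
    # Transverse k-components of a corner field in air (cross terms neglected)...
    theta = math.radians(half_fov_deg)
    kx = k0 * math.sin(theta) * sx
    ky = k0 * math.sin(theta) * sy
    # ...shifted by one grating vector when diffracted into the waveguide.
    kx_wg = kx + kg
    ky_wg = ky
    # The transverse k is conserved at the air/glass interface, so the in-plane
    # propagation direction inside the waveguide is the azimuth of (kx_wg, ky_wg).
    azimuths.append(math.degrees(math.atan2(ky_wg, kx_wg)))

fan_angle_deg = max(azimuths) - min(azimuths)
print(f"Estimated fan angle: {fan_angle_deg:.1f} deg")   # about 24 deg with these values

With these assumed numbers the fan is roughly 24 degrees; the smallest turning-grating footprint then follows from sweeping this fan over the couple-in grating aperture, as illustrated in Figure 5.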

 

Previous Article: How to simulate exit pupil expander (EPE) with diffractive optics for augmented reality (AR) system in OpticStudio: part 2

Next Article: How to simulate exit pupil expander (EPE) with diffractive optics for augmented reality (AR) system in OpticStudio: part 4

KA-01984
