A number of optical systems (e.g. digital displays and projection systems) require knowledge of the photometric (human eye) response of the system to a broadband source. In this article we'll examine the methods used in OpticStudio to implement broadband sources, and to both analyze and optimize system performance with the Detector Color object and the NSDE operand.
Authored By Sanjay Gangadhara
Downloads
Introduction
A number of optical systems (e.g. digital displays and projection systems) require knowledge of the photometric (human eye) response of the system to a broadband source. As described in the article entitled “How to Model Colored and Tristimulus Sources”, a polychromatic source may be defined in Zemax either by using multiple system wavelengths or by defining the Tristimulus (or associated) values for the source. The resultant color distribution of the output image may be visualized using the Detector Color object. In this article, we’ll provide a brief overview of the Detector Color object and its associated merit function operand NSDE.
Detector Color object
The Detector Color object is very similar to the Detector Rect object. The main difference is that the Detector Color object stores Tristimulus data in addition to power data (and unlike the Detector Rect object, the Detector Color object does not store coherent data). The Detector Color object can be used to visualize the True Color image of a given irradiance distribution:
This image can be viewed in both Position Space and Angle Space, so we can view the True Color image of both the irradiance and radiant intensity distributions.
A simple example in which a beam of light is used to illuminate the Detector Color object is provided in the file “Simple Example System.ZMX”. The archive file for this example is located at the beginning of this article. In this example, a collimated beam of light at 0.555 microns is propagated 1 mm to the Detector Color object:
The color of the beam shown in the Layout plot corresponds to an RGB representation of the wavelength being traced:
For the source, the wavelength traced is set by System Wavelengths:
which are defined in the Wavelength Data dialog box:
The wavelengths emitted by the source can also be determined by Tristimulus values, as described in the article “How to Model Colored and Tristimulus Sources”.
Tracing 1 million rays in this system, the following results are found for the irradiance distribution:
The total power is given in Lumens, as expected for a True Color image (in which the photopic, i.e. human eye, response has been accounted for). We can use this same detector object to determine the amount of power on the detector in Watts, simply by switching the view to False Color (or equivalent):
The total power is 1 Watt, consistent with the amount of power launched by the source. At 0.555 microns, this amount of power corresponds to 683 Lumens, as shown in the True Color image.
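This is just the standard photometric conversion, in which the radiant flux is weighted by the photopic response V(λ) and scaled by the peak luminous efficacy of 683 lm/W. At 0.555 microns the photopic response is at its peak value of 1, so:

$$\Phi_v = 683\,\tfrac{\mathrm{lm}}{\mathrm{W}}\cdot V(\lambda)\cdot \Phi_e = 683 \times 1 \times 1\ \mathrm{W} = 683\ \mathrm{lm}$$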
The radiant intensity distribution shows that all the power lands on a single pixel in angle space, as expected for a collimated beam:
The distribution is still shown in True Color in this case, although the results may also be observed in False Color (or Grey Scale, etc.). As we will see later, both Tristimulus and power values for the irradiance and radiant intensity on the detector may be obtained in the merit function using the NSDE operand.
Color mixing
In addition to being able to see the True Color representation of a single wavelength propagated through an optical system, the Detector Color object can of course be used to view the True Color representation of a range of wavelengths as they hit the detector. Thus, this object can be used to see how colors would “mix” as they hit the detector.
A nice example of this is provided in the sample file “Example 1 Two-color mixing gives white.ZMX”, which is located in Zemax\Samples\Non-Sequential\Colorimetry. In this example, two monochromatic sources are mixed together to form a white color. Each source is also shown individually, so that the color for each source can also be seen separately:
The top two sources are not mixed together, so that we may see the yellow and blue colors separately on the Detector Color object. The wavelengths used to define the yellow and blue colors are 0.57 and 0.46 microns, respectively (Tristimulus sources could have also been used to define the color of each source, as described in the article “How to Model Colored and Tristimulus Sources”):
The orientation of each of the bottom two sources is tilted so that the colors will mix together. If 1 million rays are traced from each source, we find:
At the top of the detector, each individual color can be seen separately. On the bottom of the detector, we see that the individual colors have mixed together over some region to form the color white. If we zoom in on the results, the region of overlap may be observed, as well as the regions where light from the two sources does not overlap:
It may be counterintuitive to see only two monochromatic colors mixing together to form white (generally, we think that three colors are needed to form white). However, remember that the Detector Color object shows the True Color representation of the image, which also accounts for the response of the human eye to color. When the human eye (photopic) response is accounted for, we find that there are indeed two monochromatic colors which can mix together to form what the human eye perceives to be white.
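As a rough numerical illustration of this point (entirely outside OpticStudio), the chromaticity of such a two-line mixture can be estimated from the CIE 1931 color matching functions. The color matching values and the power ratio used below are approximate, illustrative assumptions and are not taken from the sample file:

```python
# Sketch: chromaticity of a mixture of two monochromatic sources, using
# approximate CIE 1931 2-degree color matching values (x_bar, y_bar, z_bar).
# Both the tabulated values and the power ratio are illustrative assumptions.
cmf = {
    0.46: (0.2908, 0.0600, 1.6692),   # blue line, ~0.46 um
    0.57: (0.7621, 0.9520, 0.0021),   # yellow line, ~0.57 um
}
power = {0.46: 1.0, 0.57: 1.6}        # assumed radiant power (W) in each line

# Tristimulus values of the mixture are just the power-weighted sums
X = sum(power[w] * cmf[w][0] for w in cmf)
Y = sum(power[w] * cmf[w][1] for w in cmf)
Z = sum(power[w] * cmf[w][2] for w in cmf)

x, y = X / (X + Y + Z), Y / (X + Y + Z)
print(f"x = {x:.3f}, y = {y:.3f}")    # ~ (0.32, 0.33), near the achromatic point
```

With a suitable power ratio the mixture lands close to the equal-energy white point (x = y = 1/3), which is why the region of overlap on the detector appears white.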
Color mixing can also be seen in the radiant intensity distribution:
However, in this case the observed colors correspond to mixing of different sources. The sources at the bottom of the system – which mixed together in Position Space – are tilted at different angles (+/- 25 degrees), so in Angle Space the light from these sources lands on different pixels on the detector. Thus, in this case the two sources at the bottom of the system provide the distinct yellow and blue spots seen in the above image. Conversely, the sources at the top of the system – which did not mix together in Position Space – are both oriented at the same angle (0 degrees), and thus the light from these sources lands on the same pixels in Angle Space. In this case, the two sources at the top of the system mix together, providing the white color seen in the center of the image (centered at zero degrees in Angle Space).
NSDE operand
The NSDE operand allows Tristimulus and power data from the Detector Color object to be placed in the merit function. This data may be extracted simply for purposes of investigation, or for purposes of optimization. For example, a system could be set up to determine the correct ratio of various source colors that will generate a desired output color.
The setup of such a system is provided in the file “Color optimization setup.ZMX”. The archive (.ZAR) file for this system is located at the beginning of this article. In this example, three source rays land on a single-pixel Detector Color object. The wavelength of each source is different, and the three different wavelengths are defined in the Wavelength Data dialog box:
The initial power of each source is 1 Watt. The resulting color generated by the mixing of these three wavelengths is a powder blue:
We can quantify this color in a number of ways: using the Tristimulus XYZ values associated with the color, the Chromaticity xy values associated with the color, or the Chromaticity u’v’ values associated with the color. You may read more about these different color definitions in the article entitled “How to Model Colored and Tristimulus Sources”. The NSDE operand provides values for all of these various color definitions, as described in the OpticStudio help files section The Optimize Tab (sequential ui mode)>Automatic Optimization Group>Merit Function Editor (automatic optimization group)>Optimization Operands (Alphabetically). In this example, the color will be characterized using the Chromaticity xy values.
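For reference, the Chromaticity coordinates follow from the Tristimulus values through the standard CIE relations:

$$x = \frac{X}{X+Y+Z},\qquad y = \frac{Y}{X+Y+Z},\qquad u' = \frac{4X}{X+15Y+3Z},\qquad v' = \frac{9Y}{X+15Y+3Z}$$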
To use the NSDE operand to obtain Tristimulus or power data from the Detector Color object, we first need to have an NSDE operand in the merit function that clears the detector. Then, an NSTR operand is needed to run the ray trace inside the merit function. Subsequent NSDE operands may then be used to evaluate results on the detector. This sequence of operands is identical to what is needed when using the NSDD operand with the Detector Rect, Detector Surf, or Detector Volume objects – see the article entitled “How to Optimize Non-Sequential Optical Systems” for more details.
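Schematically, the operand sequence described above looks like this (parameter values are omitted here; the full operand definitions are given in the Help file):

- NSDE: clears the stored detector data
- NSTR: traces rays from the non-sequential sources
- NSDE (Data = 5): returns the Chromaticity x value from the Detector Color object
- NSDE (Data = 6): returns the Chromaticity y value from the Detector Color object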
The powder blue color associated with the initial setup of the example has a Chromaticity x value of 0.27 and a Chromaticity y value of 0.29, as determined using the NSDE operand with Data = 5 and 6, respectively:
The total power reaching the detector in Watts and Lumens is also reported back in the merit function using the NSDE operand with Data = 1 and Data = 3, respectively. The total power in Watts is 3.0, as expected (since each source has a power of 1 W), and the total power in Lumens is 786.8, consistent with the value shown on the Detector Color object itself.
The targets for the desired color have been set to 0.50 and 0.35 for the Chromaticity x and y values, respectively. The optimization variables in this case are the power associated with each source:
If we perform optimization using the Damped Least Squares algorithm, we quickly find a solution:
that yields the desired salmon pink color:
These optimized results are provided in the file “Color optimization results.ZMX”. The archive (.ZAR) file for this system is located in the Article Attachments.
The total power of the system has increased from 3.0 to 5.7 Watts and from 786.8 to 1134.9 Lumens. Additional operands could be added to constrain the total power of the system, if needed.
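To see why varying only the source powers is sufficient to hit a chromaticity target, the same balancing problem can be sketched numerically outside OpticStudio. Because the detector XYZ values are linear in the source powers, the required power ratio follows from a small linear solve. The wavelengths and color matching values below are hypothetical stand-ins (they are not the wavelengths defined in the example file), so the resulting powers will not match the OpticStudio solution:

```python
import numpy as np

# Hypothetical stand-in sources with approximate CIE 1931 2-degree color
# matching values (x_bar, y_bar, z_bar); NOT the wavelengths from the example.
cmf = np.array([
    [0.1954, 0.0910, 1.2876],   # ~0.47 um
    [0.4334, 0.9950, 0.0087],   # ~0.55 um
    [0.8544, 0.3810, 0.0019],   # ~0.62 um
])

# Target chromaticity from the example, plus the implied z = 1 - x - y
x_t, y_t = 0.50, 0.35
target = np.array([x_t, y_t, 1.0 - x_t - y_t])

# The mixture XYZ is powers @ cmf, so requiring it to be proportional to
# (x_t, y_t, z_t) is a 3x3 linear system in the source powers.
powers = np.linalg.solve(cmf.T, target)
powers /= powers.max()          # only the ratio matters for chromaticity

X, Y, Z = powers @ cmf
print("relative powers:", np.round(powers, 3))
print("chromaticity   :", np.round([X / (X + Y + Z), Y / (X + Y + Z)], 3))
```

A physically meaningful solution requires all powers to be non-negative, i.e. the target chromaticity must lie inside the gamut spanned by the three sources; this is the same condition under which the Damped Least Squares optimization above can drive the operand errors to zero.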
NSDE: more details
As seen in the previous section, the NSDE operand can be used to extract both Tristimulus and power data into the merit function. These data can be in either position space or angle space, and can be values from a single pixel or averaged over all pixels on the Detector Color object. Average values for the Chromaticity xy and u’v’ parameters are obtained by averaging only over those pixels that have a finite amount of power on them. More details are provided in the OpticStudio help files section The Optimize Tab (sequential ui mode)>Automatic Optimization Group>Merit Function Editor (automatic optimization group)>Optimization Operands (Alphabetically).
When power or Tristimulus XYZ values are reported from a single pixel, these values correspond to the power (in Watts) or XYZ values (in Lumens) per unit area (if the data are in position space) or per unit steradian (if the data are in angle space). XYZ values are calculated for a given pixel by summing the intensities on each pixel weighted by the human eye response:
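Schematically, for a set of traced spectral components the summation has the form

$$X = 683\sum_i \Phi_i\,\bar{x}(\lambda_i),\qquad Y = 683\sum_i \Phi_i\,\bar{y}(\lambda_i),\qquad Z = 683\sum_i \Phi_i\,\bar{z}(\lambda_i)$$

where Φ_i is the radiant power (in Watts) of the i-th spectral component reaching the pixel, x̄, ȳ, z̄ are the CIE color matching functions, and the factor of 683 lm/W yields values in Lumens (see the referenced article for the exact expressions used by OpticStudio).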
More details are provided in the article entitled “How to Model Colored and Tristimulus Sources”. The XYZ values on a given pixel – or even averaged over all pixels on the Detector – may not always agree with the Tristimulus XYZ values for the source, as a result of two conditions:
- If the number of rays traced is small, there can be statistical variation between the XYZ values on the detector and those of the source
- If there are optics between the source and the detector whose behavior varies with wavelength (e.g. optics with wavelength-dependent transmission), the XYZ values on the detector will vary from those of the source
Once the XYZ values are calculated, these values are divided by the pixel area (position space) or solid angle (angle space) to determine the actual value returned by the NSDE operand for the given pixel.
The ZPL function NSDE() is available for extracting both Tristimulus and power data into a macro. The NSDE() function takes the same inputs as the NSDE merit function operand. This function requires that a ray trace is first run in the macro using the NSTR keyword. More details may be found in the OpticStudio help files section The Programming Tab>About the ZPL.
KA-01423