Measurement solutions for
virtual and augmented reality headsets

Over the last decades, digitalization has increasingly influenced our lives, and virtual and augmented reality products are now changing reality as we know it: our real environment is either replaced with digital images (Virtual Reality – VR) or enhanced with digital data (Augmented Reality – AR). TRIOPTICS offers a range of measurement and test solutions for VR and AR optics, enabling the production of ever higher-quality VR and AR products.

AR Headset

The complete AR headset consists of two Near-Eye Displays (NEDs), sensors (such as position sensors), cameras and the eye-tracking unit. The alignment of the two NEDs (stereo alignment) is important to prevent nausea while wearing the headset and to ensure the best image quality.
A complete test of the headset is usually the last step in production.

Test parameters

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view in VIS and NIR
  • Absolute brightness, uniformity and color fidelity
  • Measurement of the exit pupil position and eye relief distance
  • Measurement of virtual object distance
  • Measurement of the alignment of the NEDs to each other: divergence, dipvergence, rotation, and distance of the eyeboxes (see the sketch below)
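
The relative orientation of the two NEDs is usually expressed as small angle differences between their lines of sight. The sketch below only illustrates how divergence, dipvergence and rotation could be derived from such data; the line-of-sight vectors, roll angles and sign conventions are invented assumptions, not TRIOPTICS results.

    import numpy as np

    def yaw_pitch(v):
        """Horizontal (yaw) and vertical (pitch) angle of a line-of-sight vector, in degrees."""
        x, y, z = v / np.linalg.norm(v)
        return np.degrees(np.arctan2(x, z)), np.degrees(np.arctan2(y, z))

    # Hypothetical measured data: line-of-sight unit vectors and image roll angles
    # for the left and right NED (z points along the nominal viewing direction).
    los_left,  roll_left  = np.array([0.0020, -0.0005, 1.0]), 0.08   # roll in degrees
    los_right, roll_right = np.array([-0.0011, 0.0009, 1.0]), -0.05

    yaw_l, pitch_l = yaw_pitch(los_left)
    yaw_r, pitch_r = yaw_pitch(los_right)

    divergence  = yaw_l - yaw_r           # horizontal angular misalignment
    dipvergence = pitch_l - pitch_r       # vertical angular misalignment
    rotation    = roll_left - roll_right  # relative image rotation (roll)

    print(f"divergence {divergence:.3f} deg, dipvergence {dipvergence:.3f} deg, "
          f"rotation {rotation:.3f} deg")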

Near-Eye Display

The near-eye display (NED), which comprises a projector and a waveguide/combiner, projects the virtual image, superimposed on the real environment, into the human eye. Frequently, two NEDs, one per eye, are combined into one headset.

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view in VIS and NIR (see the MTF sketch after this list)
  • Absolute brightness, uniformity and color fidelity
  • Measurement of the exit pupil position and eye relief distance
  • Measurement of virtual object distance
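
As a generic illustration of how image sharpness is quantified (not the algorithm of a TRIOPTICS instrument): the MTF can be computed as the normalized Fourier transform of a measured line spread function. The Gaussian LSF and the 2 µm pixel pitch below are invented example values; for NEDs the result is often given in cycles per degree of field angle rather than cycles/mm.

    import numpy as np

    def mtf_from_lsf(lsf, pixel_pitch_mm):
        """MTF as the magnitude of the Fourier transform of a line spread function,
        normalized to 1 at zero spatial frequency. Frequencies in cycles/mm."""
        lsf = np.asarray(lsf, dtype=float)
        lsf = lsf / lsf.sum()                 # unit area so that MTF(0) = 1
        mtf = np.abs(np.fft.rfft(lsf))
        freqs = np.fft.rfftfreq(lsf.size, d=pixel_pitch_mm)
        return freqs, mtf

    # Hypothetical example: a Gaussian-shaped LSF sampled on a 2 µm pixel grid
    x = np.arange(-32, 32)
    lsf = np.exp(-0.5 * (x / 3.0) ** 2)
    freqs, mtf = mtf_from_lsf(lsf, pixel_pitch_mm=0.002)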

Projector

The projector displays an image of the computer-generated parts of the AR view, which is projected onto the user’s eye via the waveguide/combiner. It consists of a display unit (e.g. based on LCOS, micro-LED or laser beam scanning) and a lens that projects the generated image from a virtual distance.

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view in VIS and NIR
  • Absolute brightness, uniformity and color fidelity
  • Measurement of the exit pupil position and eye relief distance
  • Measurement of virtual object distance
  • Measurement of the alignment of optical elements to each other

Alignment of optical elements to each other for optimal centering and positioning

Eye-Tracking Camera (NIR)

AR headsets continuously measure the viewing direction and the distance between the user’s pupils to calculate an accurate virtual image.

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view
  • OECF (Opto-Electronic Conversion Function; see the sketch below)
  • Defective pixels
  • Tilt and focus position of the lens image plane relative to the sensor

Automatic alignment of optics to image sensor and optimal focus position
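
The OECF describes how the camera converts input luminance into digital output levels and is typically measured with gray patches of known luminance. A minimal sketch of the idea; the patch luminances, digital values and the simple interpolation are illustrative assumptions rather than a specific ISO 14524 implementation.

    import numpy as np

    def oecf(patch_luminance_cd_m2, mean_digital_value):
        """Measured opto-electronic conversion function as a lookup table:
        luminance -> mean digital output, sorted by luminance."""
        L = np.asarray(patch_luminance_cd_m2, dtype=float)
        D = np.asarray(mean_digital_value, dtype=float)
        order = np.argsort(L)
        return L[order], D[order]

    # Hypothetical measurement of a 6-patch gray chart (8-bit sensor output)
    luminance = [2.0, 8.0, 30.0, 80.0, 160.0, 320.0]    # cd/m^2
    digital   = [6.0, 21.0, 68.0, 131.0, 189.0, 243.0]  # mean pixel values

    L, D = oecf(luminance, digital)
    # Interpolate to estimate the digital response at an arbitrary luminance
    print(np.interp(50.0, L, D))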

Exterior Cameras (VIS & NIR)

AR headsets usually have several VIS and NIR cameras for observing the environment, e.g. for object or marker detection.

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view
  • OECF (Opto-Electronic Conversion Function), color reproduction
  • Defective pixels (see the sketch below)
  • Tilt and focus position of the lens image plane relative to the sensor

Automatic alignment of optics to image sensor and optimal focus position
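
Defective (hot, dead or stuck) pixels are commonly located by comparing each pixel of a flat-field exposure with its local neighborhood. The criterion below is only an illustrative sketch; the 3×3 neighborhood and the 20 % threshold are arbitrary assumptions.

    import numpy as np
    from scipy.ndimage import median_filter

    def find_defective_pixels(flat_field, rel_threshold=0.2):
        """Flag pixels whose value deviates from the local 3x3 median
        by more than rel_threshold (as a fraction of that median)."""
        img = np.asarray(flat_field, dtype=float)
        local = median_filter(img, size=3)
        deviation = np.abs(img - local) / np.maximum(local, 1e-6)
        return np.argwhere(deviation > rel_threshold)   # (row, col) indices

    # Hypothetical flat-field frame with one hot pixel
    frame = np.full((8, 8), 100.0)
    frame[3, 5] = 220.0
    print(find_defective_pixels(frame))   # -> [[3 5]]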

Do you have a measurement or assembly task?
Talk to our experts.

“The VR and AR market offers great growth potential. TRIOPTICS has been driving this development from the very beginning.”

Dr. Stefan Krey | Chief Technology Officer

VR Headset

The complete headset consists of the VR projector, sensors (such as position sensors), cameras and the eye-tracking unit. A complete test of the headset is usually the last step in production.

Test parameters

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view in VIS and NIR
  • Absolute brightness, uniformity, and color fidelity
  • Measurement of the exit pupil position and eye relief distance
  • Measurement of virtual object distance
  • Measurement of the alignment of the NEDs to each other: divergence, dipvergence, rotation, and distance of the eyeboxes.

VR-Projector

The VR projector consists of a display and a projection lens and presents the virtual image to the eye. Usually, a display and two projection lenses, one per eye, are combined to form a VR projector. The alignment of the two projection lenses (stereo alignment) is important to prevent nausea when wearing the headset and to ensure the best image quality.

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view in VIS and NIR
  • Absolute brightness, uniformity and color fidelity across the field of view
  • Measurement of the exit pupil position and eye relief distance
  • Measurement of virtual object distance of the projection lens
  • Alignment of the two individual projectors to each other: line of sight (divergence, dipvergence)

Automatic alignment of the two projection lenses to the display(s) and to each other (stereo alignment)

Eye-Tracking Cameras (NIR)

AR and VR headsets continuously measure the viewing direction and the distance between the user’s pupils in order to calculate an accurate virtual image.

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view
  • OECF (Opto-Electronic Conversion Function)
  • Defective pixels
  • Tilt and focus position of the lens image plane relative to the sensor

Automatic alignment of optics to image sensor and optimal focus position

Projection lens

The projection lens projects the 2D image of the display from a virtual object distance at which it can be perceived by the eye. Pancake, Fresnel or free-form lenses are often used to keep the overall build height as low as possible.

  • Image sharpness (MTF), distortion, lateral chromatic aberration (chief ray angle) across the field of view and eyebox in VIS and NIR
  • Measurement of the exit pupil position and eye relief distance
  • Scattered light (veiling glare)

Alignment of optical elements to each other for optimal centering and positioning

Eye Simulator

In the human eye, light enters through the pupil and is focused onto the retina by the eye’s lens. The amount of light reaching the retina is controlled by the iris, which adjusts the diameter of the pupil.

The ImageMaster® VR Series replicates this function of the biological eye. A camera with adjustable focus relays the generated test images to an image sensor: the camera takes over the function of the human pupil, and the image sensor that of the retina.
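
Because the eye-simulator camera has an adjustable focus, the virtual object distance of a headset can be read off as the focus distance that yields the sharpest image, often quoted in diopters (the reciprocal of the distance in meters). The sweep data below are invented to illustrate the principle and do not reflect the instrument’s actual procedure.

    # Focus sweep: find the focus distance with maximum sharpness, express it in diopters.
    # All numbers are hypothetical example values, not measured data.
    focus_distance_m = [0.5, 1.0, 2.0, 4.0, 8.0]        # focus positions of the eye simulator
    sharpness        = [0.41, 0.63, 0.78, 0.71, 0.55]   # e.g. MTF at a chosen spatial frequency

    best = max(range(len(focus_distance_m)), key=lambda i: sharpness[i])
    virtual_object_distance_m = focus_distance_m[best]
    vergence_diopters = 1.0 / virtual_object_distance_m

    print(f"virtual object distance: {virtual_object_distance_m} m "
          f"({vergence_diopters:.2f} dpt)")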

Eyebox scanning

VR and AR headsets must produce a uniform, realistic and pleasant image for many different people. Since every human head is shaped differently, variations such as different pupil distances must be taken into account. The test devices of the ImageMaster® VR Series therefore measure the test images over a defined range of pupil positions – the so-called eyebox.
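
In practice this means repeating the image-quality measurement at many pupil positions inside the eyebox. The sketch below only illustrates the scanning idea; the eyebox dimensions, the grid spacing and the move_pupil_to / measure_image_quality placeholders are hypothetical and not part of the ImageMaster® API.

    import numpy as np

    def scan_eyebox(measure_image_quality, move_pupil_to,
                    width_mm=10.0, height_mm=8.0, step_mm=2.0):
        """Measure image quality on a grid of pupil positions covering the eyebox.
        Both callables are hypothetical placeholders for instrument control code."""
        results = {}
        for x in np.arange(-width_mm / 2, width_mm / 2 + step_mm, step_mm):
            for y in np.arange(-height_mm / 2, height_mm / 2 + step_mm, step_mm):
                move_pupil_to(x, y)                  # position the eye-simulator pupil
                results[(x, y)] = measure_image_quality()
        return results

    # Example call with dummy placeholders standing in for real instrument control:
    data = scan_eyebox(measure_image_quality=lambda: {"mtf": 0.0},
                       move_pupil_to=lambda x, y: None)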

Stray light

Stray light can cause unwanted effects such as ghost reflections in the image plane, which can reduce image contrast. For optimal image quality, optical designers must be able to characterize stray light effects caused by the facet-like structure of Fresnel lenses and by damage to or contamination of the lens surfaces. ImageMaster® VR/AR Series products measure the Veiling Glare Index to help optimize the optical design of VR and AR eyewear.
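
The veiling glare index is commonly determined with a black-spot target in a bright, extended surround: the luminance measured inside the image of the black spot is expressed as a percentage of the luminance of the bright field (as in ISO 9358-type measurements). The numbers below are made up for illustration, and the exact definition and procedure used by the ImageMaster® VR/AR Series may differ.

    def veiling_glare_index(luminance_black_spot, luminance_bright_field):
        """Veiling glare index in percent: stray-light level measured in the image
        of a black target relative to the surrounding bright field."""
        return 100.0 * luminance_black_spot / luminance_bright_field

    # Hypothetical measurement values in cd/m^2
    print(veiling_glare_index(luminance_black_spot=1.2,
                              luminance_bright_field=150.0))   # -> 0.8 (percent)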

Some of our products for the AR/VR market
