Technical specs are available for the TL1 component.
WARNING - HANDLE WITH CARE
The TL1 waveguide is a precision instrument and must be kept free of surface contaminants such as fingerprints. Images confined in the glass waveguide are reflected repeatedly on their path between the first and second holographic couplers, and any contamination at an interface will distort and compromise them.
TL2 is a clear, low-iron glass waveguide with two holographic couplers bonded transparently onto it: one couples the image data into the waveguide and the other couples it out.
TL2 Dimensions
Thickness: 2.4 mm (3.9 mm over couplers)
Width: 107 mm
Height: 30 mm
Holographic couplers
Thickness: 1.5 mm
Width: 13 mm
Height: 21 mm
NB: We are working towards mass-producing the holograms on polymer, which will reduce the eventual thickness for consumers.
Please see the diagram (left) showing these dimensions schematically; the waveguide and its couplers are drawn in blue.
The holograms for TL2 were shot at 640 nm, 532 nm and 473 nm
We allow for a modest amount of shrinkage, as this improves the rendition, so the current replay central wavelengths are approximately 625 nm, 525 nm and 460 nm.
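The shift between recording and replay wavelengths implied by the figures above can be expressed as a fractional shrinkage per colour. The simple ratio model below is an illustrative assumption for comparing the three channels, not the manufacturer's stated model.

```python
# Recording and replay central wavelengths from the text above (nm).
recording_nm = {"red": 640, "green": 532, "blue": 473}
replay_nm = {"red": 625, "green": 525, "blue": 460}

# Fractional shift of each channel, assuming replay ~ recording * (1 - shrinkage).
for colour in recording_nm:
    shift = recording_nm[colour] - replay_nm[colour]
    fraction = shift / recording_nm[colour]
    print(f"{colour}: shift {shift} nm, ~{fraction:.1%} of recording wavelength")
```

On these numbers the shift works out at roughly 1-3% of the recording wavelength per channel.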
The spectral curve is shown on the following graph.
The red and green are at minimum 50% efficient, and 60% has been reached. The blue trails at between 30% and 35% efficiency, and we are working to improve this.
The nominal bandwidths (FWHM) are approximately:
Colour  Bandwidth
Red     25-30 nm
Green   20-25 nm
Blue    15-20 nm
Angle of view
The total angle of view is 18 to 20 degrees, though colour rendition is most accurate in a narrower central area.
The holograms were made using RGB lasers in our own proprietary material. This enables us to tailor the characteristics required during prototyping, such as refractive index modulation, layer thickness and Bragg-plane profile linearity.
Because we have access to our own proprietary material, we welcome enquiries from product developers for bespoke optical specifications.
The waveguide can be checked for basic functionality by holding the left-hand coupler up to the right eye.
The right-hand coupler then gives access to views (say 20 m away) arriving from behind the head, provided that neither the hand holding the coupler nor the head itself obscures the line of sight.
See right for illustration
For a more specific evaluation, you can image the icons on your phone using a lens with a focal length between 20 and 50 mm to collimate the beam.
The image quality will be determined by the quality of the lens and the display screen resolution.
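For the collimation test above, the field of view delivered through the waveguide depends on the source size and the lens focal length: an object of width w at the focal plane of a lens of focal length f subtends roughly 2·atan(w/2f). The 7 mm source width below is a hypothetical example for illustration, not a TL2 specification.

```python
import math

def collimated_fov_deg(source_width_mm: float, focal_length_mm: float) -> float:
    """Approximate full angle subtended by a source of the given width
    placed at the focal plane of a collimating lens."""
    return math.degrees(2 * math.atan(source_width_mm / (2 * focal_length_mm)))

# Example: a ~7 mm wide source behind the 20 mm and 50 mm lenses
# suggested in the text.
for f in (20, 50):
    print(f"f = {f} mm -> {collimated_fov_deg(7.0, f):.1f} degrees")
```

Note that a short focal length fills more of the waveguide's 18-20 degree angle of view, while a longer one uses only the central, most colour-accurate region.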
The simplest form of AR using the waveguide optics requires a microdisplay, a collimating lens and a frame to hold them together, so that the microdisplay is imaged through the waveguide to the eye. The microdisplay will need a connection to the real-time imaging source – for example a smartphone.
The microdisplay would normally be at the side of the head pointing forwards to the input holographic coupler on the waveguide. The collimating lens would be between the microdisplay and the waveguide. The other end of the waveguide would be positioned so that the output coupler would be directly in front of the eye.
With optimum alignment of all the elements the microdisplay image should be visible in the transparent output coupler and at a comfortable focus for the eye. This allows the environment to be viewed at the same time as the received imaging from the microdisplay.
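The comfortable focus described above follows from the thin-lens equation: placing the microdisplay just inside the lens's focal length forms a virtual image at a comfortable viewing distance, and placing it exactly at the focal plane collimates the light (image at infinity). The 25 mm focal length and 24 mm spacing below are illustrative assumptions.

```python
def virtual_image_mm(focal_mm: float, object_mm: float) -> float:
    """Thin-lens image distance: 1/f = 1/do + 1/di -> di = 1/(1/f - 1/do).
    A negative result means a virtual image on the display side of the lens."""
    return 1.0 / (1.0 / focal_mm - 1.0 / object_mm)

# Display 24 mm in front of a 25 mm lens: virtual image ~0.6 m away,
# a comfortable focus for the eye.
print(virtual_image_mm(25.0, 24.0))
```

Moving the display towards the focal plane pushes the virtual image further out, which is one practical way to fine-tune the perceived focus during alignment.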
The optimum angle at which the microdisplay and lens assembly couple to the waveguide is slightly off 90 degrees laterally. This offset is best found by experiment and is only a matter of a few degrees, typically 3 degrees from the normal.
Diagram illustrating the setup of an augmented reality device using the TL2 optic
We have found that using two sets of TL2, one feeding each eye, allows for augmented reality in 3D.
Supplying separate imaging streams to the right and left eyes to produce stereo 3D would permit the operator to manually handle real objects in the same depth space as the digital 3D imaging.
This would be useful for many complex tasks where updated or detailed graphical information could be presented for work such as pipeline repair, surgical operations, etc.
Due to the diffraction characteristics of the holographic technology, the image tends to show a discrete lateral colour bias, i.e. red to the left, green in the centre and blue to the right.
We are looking at a number of ways to solve this issue including:
1. Software-enabled modification to apply an opposite colour bias to the image, offsetting that caused by diffraction
2. Linear variable edge filters on the backlight
3. Stereo pair use – by using two sets of TL2, one for each eye. In our 3D version the colour spectrum is balanced by mounting the two waveguides in opposite senses for the right and left eyes; this counterbalances the colour gradation so that the 3D image appears in good colour over the entire field of vision.
4. Digital input using a fibre optic
The method shown below is a possible way to modify imaging and reduce the lateral colour bias.
In this regime we would utilise a spatial light modulator combined with a holographic optic and a filter to optimise aspects of the image.
Here the image is injected via optical fibre, and the hologram provides the angles required for the SLM to work effectively. The SLM would need to be fast-switching to handle the RGB illumination sequentially, as it can only deal with one colour per computer-generated fringe pattern.
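The software correction in option 1 above could be sketched as a lateral gain profile applied to the source image, opposite in sense to the red-left / blue-right bias of the waveguide. The linear profile and its strength below are illustrative assumptions, not measured correction curves.

```python
def counter_bias(pixel_row, strength=0.2):
    """Pre-bias one row of (r, g, b) pixels: boost red towards the right
    edge and blue towards the left edge, leaving green flat, to offset
    the red-left / blue-right gradation introduced by diffraction."""
    width = len(pixel_row)
    corrected = []
    for x, (r, g, b) in enumerate(pixel_row):
        t = x / (width - 1)                 # 0.0 at left edge, 1.0 at right
        r_gain = 1.0 + strength * t         # red strongest on the right
        b_gain = 1.0 + strength * (1 - t)   # blue strongest on the left
        corrected.append((min(255, round(r * r_gain)), g,
                          min(255, round(b * b_gain))))
    return corrected

# A uniform grey row comes back red-tilted right and blue-tilted left.
print(counter_bias([(100, 100, 100)] * 5))
```

In practice the gain profile would be tuned against the measured spectral gradation of a particular waveguide rather than assumed linear.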