Denmark’s Realfiction Expanding Beyond Hologram-ish Displays With New Multi-View Tech
December 15, 2023 by Dave Haynes
The Danish display tech firm Realfiction is best known for its merchandising-sized holographic-ish displays that use a variation on the old Pepper’s Ghost reflected visual illusion technique, but next month during CES in Las Vegas it plans to demo technology that can show different visuals to different viewers.
The company is not at the show proper, but has a demo suite booked at Aria.
This is how I think of these guys (below), but what’s being advanced here is different from what you might have seen at trade shows or in retail.
I assume these are all from what the company calls Project Echo.
AV consultant and writer Chris Chinnock of Insight Media has pushed out a summary, mostly over my big, empty head, that runs through four different demos of what is also called multi-view technology. I’m just going to copy-paste it here because it gets very nerdy, very quickly.
Demo #1 – This is essentially the same demo as was shown at DisplayWeek earlier this year. It is a proof-of-concept demo that showcases the use of a new, very fast-switching ferroelectric modulator that creates a multi-zone display, with each zone able to display a different image. It features a low-resolution microLED display with the F-LCD modulator in front used to create a moving parallax barrier pattern. The new F-LCD modulator uses new materials to offer robust mechanical stability (a prior issue) along with clever drive schemes that mitigate the need to blank the display during the reverse voltage cycle, which is needed to maintain DC balancing. They call this a Hybrid Scan Display. In addition, this technology forms the basis for replacing the optically based steerable 2D/3D displays used in the next two demos.
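For the code-inclined, here’s a rough sketch of the time-multiplexing idea behind that kind of multi-zone display: cycle the barrier pattern through a handful of phases and show each zone its own image during its slot. The zone count, switching rate and names below are illustrative assumptions on my part, not Realfiction’s actual Hybrid Scan drive scheme.

```python
# Illustrative only -- NOT Realfiction's drive scheme. Shows how a fast-switching
# barrier could time-multiplex one panel into several independent viewing zones.
from dataclasses import dataclass

@dataclass
class Zone:
    name: str           # label for the viewing zone
    barrier_phase: int  # which barrier slit pattern steers light toward this zone
    image: str          # placeholder for the frame buffer shown to this zone

def schedule_zones(zones, modulator_hz):
    # Each zone gets a slice of the modulator's switching budget,
    # so per-zone refresh falls as more zones are added.
    per_zone_hz = modulator_hz / len(zones)
    for z in zones:
        print(f"{z.name}: barrier phase {z.barrier_phase} -> '{z.image}' at ~{per_zone_hz:.0f} Hz")

# Assumed numbers: three zones and a 360 Hz modulator, purely for illustration.
zones = [Zone("zone A", 0, "image_a"),
         Zone("zone B", 1, "image_b"),
         Zone("zone C", 2, "image_c")]
schedule_zones(zones, modulator_hz=360)
```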
Demo #2 – This and the third demo are both based on a new way to create a directional backlight. They are also both multi-stereoscopic 17” displays. The backlight uses conventional LEDs and drivers, but the arrangement is different, along with the driving algorithm. On top is a horizontal-only parallax lens array and an IPS LCD panel. The idea is to time-sequentially drive the LED backlight to steer light through the lens array and LCD. 28 backlight scanning columns are created and scanned in synchrony with the IPS panel using a specially designed overdrive algorithm. To maintain the full resolution of the panel and achieve a 60 Hz refresh rate, they employ eye tracking technology. Using this data, they project left- and right-eye images in the zone where a viewer is sitting. While eye trackers to support up to 10 users are in development, the current demo will use an Intel RealSense eye tracker to deliver the image to two different users.
I’m back, briefly. This video seems to be getting into how that might work for automotive, but there would perhaps be commercial display applications, as well.
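Here’s a loose sketch of how the eye-tracking piece of Demo #2 might work: map each tracked eye to one of the 28 backlight scanning columns and only light the columns where viewers are actually sitting. Apart from the 28 columns, the geometry, names and numbers below are my assumptions, not Realfiction’s driving algorithm.

```python
# A guess at the logic, not the real algorithm: steer left/right-eye images only
# into the backlight columns where the tracker says eyes actually are.
NUM_COLUMNS = 28          # backlight scanning columns, per the demo description
VIEWING_WIDTH_MM = 600.0  # assumed horizontal extent of the viewing zone

def column_for_position(x_mm):
    """Map a horizontal eye position (mm from the left edge) to a column index."""
    idx = int(x_mm / VIEWING_WIDTH_MM * NUM_COLUMNS)
    return max(0, min(NUM_COLUMNS - 1, idx))

def plan_frame(tracked_eyes):
    """tracked_eyes: (viewer_id, 'L' or 'R', x_mm) tuples from the eye tracker.
    Returns which image to flash in each active backlight column this frame;
    columns not listed stay dark, so the panel keeps its full resolution."""
    plan = {}
    for viewer, eye, x_mm in tracked_eyes:
        plan[column_for_position(x_mm)] = f"viewer{viewer}_{eye}_image"
    return plan

# Two viewers, two eyes each, as in the Intel RealSense setup described above.
eyes = [(1, "L", 120), (1, "R", 185), (2, "L", 410), (2, "R", 475)]
print(plan_frame(eyes))
```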
Demo #3 – This demo features the same hardware with a software modification to showcase an automotive application for a dual-view display. The demo drops the eye tracking, as the relative positions of the driver and passenger are well known, while the number of zones is reduced to two and the width of the viewing cone is widened for easier access to the image. Two separate 2D images can then be shown to the driver and the passenger. Since the directional backlights in demos #2 and #3 are software-driven, they can also be used as conventional 2D displays.
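A minimal sketch of that dual-view idea, assuming fixed driver and passenger zones and no eye tracking; the mode names and content sources are hypothetical, not anything Realfiction has published.

```python
# Hypothetical dual-view mode table: two fixed viewing cones, no eye tracking,
# plus a fallback where both cones show the same image (conventional 2D mode).
MODES = {
    "dual_view": {"driver_cone": "navigation_ui", "passenger_cone": "video_player"},
    "single_2d": {"driver_cone": "shared_image", "passenger_cone": "shared_image"},
}

def frame_plan(mode):
    """Return which content source feeds each viewing cone for the current mode."""
    return MODES[mode]

print(frame_plan("dual_view"))  # driver and passenger see different 2D content
print(frame_plan("single_2d"))  # the same panel behaves as a normal 2D display
```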
Demo #4 – The fourth demo is based on a novel OLED backplane design for eye-tracked light field displays, supporting a very large number of subpixels per pixel. Perhaps the best way to explain this is by analogy. Consider a conventional video wall solution. Each module contains a medium-resolution display with wiring to every subpixel in the module. But the module-to-module connection has a much simpler interface, and the entire video wall can be driven over a single cable from a controller. This concept has been extended to an OLED backplane made from LTPS, LTPO or oxide materials.

As with the video wall concept, the backplane is divided into a series of zones, with connections to each subpixel in the zone. These zones are then interconnected with a separate network, which also forms the interface to the external display drivers. This separate zone-connection network is an active matrix that can be driven with standard TCON driver chips. An FHD display must add 33% more column driver contacts on the edge of the display, but the zone can have a much higher density. In the demo, each zone now has 600 subpixels per pixel instead of 3. Subpixel addressing is embedded in the video signal passed through the TCON.

The tradeoff is that you cannot update all subpixels individually. Instead, you must select which subpixels to update when you scan out an image. For example, one can scan out six different time-multiplexed images directed to six different tracked eyeballs at 360 fps, and each eyeball will experience a separate perspective image at a frame rate of 60 fps. The 12 x 6 mm proof-of-concept is on silicon, not glass. It has two such segments with a total of 128 pixels, with 76.8 µm x 7 µm subpixels, using only 40 data lines and 16 scan lines.
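If you want to sanity-check the arithmetic in that last one, here’s a quick back-of-the-envelope using only the figures Chris quotes; how those subpixels map onto the 40 data and 16 scan lines isn’t spelled out, so I’m not guessing at that part.

```python
# Back-of-the-envelope check of the Demo #4 numbers quoted above.
scan_rate_fps = 360       # scan-out rate for the time-multiplexed images
tracked_eyeballs = 6      # six separately tracked eyeballs
per_eye_fps = scan_rate_fps / tracked_eyeballs
print(per_eye_fps)        # 60.0 -> each eyeball sees its own 60 fps perspective

pixels = 128              # proof-of-concept: two segments, 128 pixels in total
subpixels_per_pixel = 600 # versus 3 in a conventional RGB pixel
print(pixels * subpixels_per_pixel)  # 76800 subpixels behind just 40 data lines
                                     # and 16 scan lines, which is why subpixel
                                     # selection rides inside the video signal
                                     # rather than being individually wired.
```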
https://www.linkedin.com/posts/peter-simonsen-6a3a937_realfiction-introduces-dualview-an-innovative-activity-7137738933186027520-qwOE?utm_source=share&utm_medium=member_android
Here you can see a finished product of the “Multi-view Display” for automotive. The technology itself is called DPT (“directional pixel technology”).