Virtual Display provides a display to the system that is not necessarily linked to real display hardware. Applications and middleware, e.g. the system compositor, can use this display as usual. The concrete implementation has to transfer the content of this virtual display and present it on the real display hardware. The Virtual Display concept can be considered a subcategory of Surface Sharing.
- Simple to use because it abstracts the details of the display implementation from the middleware
- Doesn't provide fine-granular control over the display content, but is still good enough for many use-cases
- The entire display content is handled instead of individual applications' content
Virtual Display doesn't describe a sharing technology of its own; it is rather a concept that can be used to realize distributed HMI use-cases. An important characteristic of the Virtual Display concept is that the entire display content is transferred, instead of transferring content from dedicated applications. The system that provides the content should have a display that looks like a normal one but is not necessarily linked to a physical display, so the middleware and applications can use it as usual. Such a display can be called a Virtual Display. The implementation of a Virtual Display on the producer system should be generic enough to look like a normal display, and should take care of transferring the display content to another HMI unit or another system. This basically means a final graphical surface needs to be transferred, which is why the concept can be considered a subcategory of Surface Sharing.

On the receiver side the content can be handled with more flexibility: it could be used directly as the content for a physical display, mapped to one physical layer of the composition hardware, or used as part of a composition combined with other locally available content. This flexibility makes the definition of, and the separation between, the different technologies a bit blurry.
Example: virtual display in Weston
The open-source Wayland compositor Weston provides an example of a Virtual Display implementation. Weston can be configured to create a Virtual Display that is not linked to any physical display. Properties like resolution, timing, display name, IP address and port can be defined in the weston.ini file. From this configuration Weston creates a virtual display "area"; all content that appears in this area is encoded to the M-JPEG video compression format and transmitted to the configured IP address. The current Weston implementation uses the corresponding GStreamer plugins for this. How content is assigned to the virtual display depends on the shell used in Weston: if ivi-shell is used, the new virtual display can be identified by the configured display name, and the corresponding Wayland protocol can be used to put an application's content on this display. In the case of xdg-shell, the user can simply drag the application to the corresponding area with the mouse.
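As a rough illustration, such a configuration could look like the fragment below. The section and key names follow Weston's remoting plugin; the concrete name, mode, host and port values are placeholders, not a tested setup.

```ini
[core]
# Weston's remoting feature runs on top of the DRM backend
backend=drm-backend.so

[remote-output]
# Name under which the virtual output appears, e.g. to ivi-shell
name=remote-1
# Resolution and refresh rate of the virtual display (placeholder values)
mode=1280x720@60
# Destination of the encoded M-JPEG stream (placeholder address)
host=192.168.10.2
port=5005
```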
The Weston example doesn't provide any code for the receiver side. For a first iteration, an example GStreamer pipeline is provided that is able to decode and present the content produced by Weston; it should run on the main Linux distributions. For a production environment it should be fairly easy to write an application that receives the content using the APIs and frameworks available on the target system.
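A minimal receiver of this kind can be sketched as a stock `gst-launch-1.0` pipeline. This is only a sketch: the port must match the weston.ini configuration, and the available elements depend on the installed GStreamer plugin sets.

```shell
# Receive the RTP-wrapped M-JPEG stream sent by Weston's remoting plugin,
# depayload and decode it, and present it in a window.
# Port 5005 is a placeholder and must match the configured remote output.
gst-launch-1.0 \
  udpsrc port=5005 \
    caps="application/x-rtp,media=(string)video,clock-rate=(int)90000,encoding-name=(string)JPEG,payload=(int)26" ! \
  rtpjpegdepay ! \
  jpegdec ! \
  autovideosink
```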
Example: virtual display in Android
Placeholder. Existing APIs in AOSP & Auto, how this is implemented under the hood (BSP abstraction?), considerations, limitations, real-world experience... (AllGo)