New development in projection mapping technology
Automatic positioning technique that requires neither special pattern projection nor camera measurement
The use of projection mapping is advancing in a wide range of fields, from entertainment to medical treatment. In projection mapping, it is essential to establish correspondence between the projected imagery and the projection surface.
A group of researchers led by Professor Kosuke SATO and Associate Professor Daisuke IWAI at the Graduate School of Engineering Science, Osaka University, succeeded in developing two position estimation techniques for a laser projector. One estimates the position of a projected beam when it directly illuminates a photosensor; the other localizes a beam by measuring the reflection from a retro-reflective marker with a photosensor placed in the optical path of the projector. The technology developed by this group for a laser projector, a next-generation image projection technology, requires neither the projection of special patterns nor measurement by a camera.
A laser projector creates an image on a screen by scanning a laser beam horizontally and vertically, so each pixel is irradiated exactly once per frame, at a fixed time after the vertical synchronization (V-Sync) signal. By estimating the positions of projected pixels from the time at which the beam hits a photosensor embedded in the projection surface, while the projector displays meaningful image content, the group achieved correspondence between the projection image and the projection surface. In experiments with a prototype system, the group also demonstrated that their technique overcomes technical limitations of conventional positioning techniques.
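The timing-to-position mapping above can be sketched as follows. This is a minimal illustration, assuming a fixed raster scan with uniform line and pixel periods; the resolution and frame rate are illustrative assumptions, not values from the group's actual prototype, which may also account for blanking intervals and bidirectional scanning.

```python
# Sketch: estimating which pixel a raster-scanning laser beam illuminates
# from the time elapsed since the vertical sync (V-Sync) pulse.
# All parameters below are illustrative assumptions.

H_RES, V_RES = 1280, 720            # assumed projector resolution (pixels)
FRAME_RATE = 60                     # assumed frames per second
LINE_PERIOD = 1.0 / (FRAME_RATE * V_RES)   # assumed time to scan one line (s)
PIXEL_PERIOD = LINE_PERIOD / H_RES         # assumed time per pixel in a line (s)

def pixel_from_timestamp(t_since_vsync: float) -> tuple[int, int]:
    """Map a photosensor trigger time (seconds after V-Sync) to (x, y)."""
    y = int(t_since_vsync // LINE_PERIOD)          # which scan line
    x = int((t_since_vsync % LINE_PERIOD) // PIXEL_PERIOD)  # position in line
    return x, y
```

For example, a pulse arriving three line periods plus ten and a half pixel periods after V-Sync would be localized to pixel (10, 3).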
In addition, the group showed that correspondences between the projector pixels and patterns on the projection surface can be established by measuring the reflection from a retro-reflective marker with a photosensor placed in the optical path of the projector. In each frame, this process generates an image equivalent to one captured by a camera sharing the projector's field of view. Using this technique, the group demonstrated that conventional camera-based positioning, which images retro-reflective markers under binary code patterns (e.g., Gray code), can be performed without cameras.
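Because each pixel is scanned at a known time, one frame of photosensor readings can be reassembled into a camera-like image in projector coordinates. The sketch below illustrates this idea with a simulated sensor trace; the resolution and the flat sample-per-pixel-slot model are hypothetical stand-ins, not the group's actual hardware interface.

```python
# Sketch: reconstructing a "camera-like" image from the projector's viewpoint.
# A photosensor in the projector's optical path reads light reflected back by
# retro-reflective markers; since scan timing determines which pixel was lit,
# the per-slot readings of one frame can be reshaped into a 2-D image.

import numpy as np

H_RES, V_RES = 64, 48  # small illustrative resolution (assumed)

def reassemble_frame(sensor_samples: np.ndarray) -> np.ndarray:
    """Reshape one frame of per-pixel-slot sensor readings (length H*V,
    ordered by scan time) into a 2-D image in projector coordinates."""
    return sensor_samples.reshape(V_RES, H_RES)

# Simulated frame: a retro-reflective marker covering a small region
samples = np.zeros(H_RES * V_RES)
for y in range(10, 14):
    for x in range(20, 24):
        samples[y * H_RES + x] = 1.0  # strong retro-reflection at marker pixels

image = reassemble_frame(samples)  # bright block where the marker sits
```

In this view, the marker appears as a bright region at its projector-coordinate location, which is what allows camera-based registration methods to run on the reassembled image directly.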
As the Internet of Things (IoT) advances, photosensors connected to the network are expected to pervade every aspect of our lives. The techniques developed by this group will enable a variety of daily-life services that use projection mapping systems without additional camera-based measurement devices.
This paper presents a novel projected pixel localization principle for online geometric registration in dynamic projection mapping applications. We propose applying a time measurement of a laser projector's raster-scanning beam using a photosensor to estimate its position while the projector displays meaningful visual information to human observers. Based on this principle, we develop two types of position estimation techniques. One estimates the position of a projected beam when it directly illuminates a photosensor. The other localizes a beam by measuring the reflection from a retro-reflective marker with the photosensor placed in the optical path of the projector. We conduct system evaluations using prototypes to validate this method as well as to confirm the applicability of our principle. In addition, we discuss the technical limitations of the prototypes based on the evaluation results. Finally, we build several dynamic projection mapping applications to demonstrate the feasibility of our principle.