Intermediated Reality

Intermediated Reality empowers users to interact with physical objects as if they were magically coming to life through Augmented Reality.




We introduce Intermediated Reality (IR), a framework for intermediated communication that enables collaboration through remote possession of physical entities (e.g., toys) that come to life in mobile Mediated Reality (MR). By altering the camera video feed with a reconstructed appearance of the object in a deformed pose, we create the illusion of movement in real-world objects, realizing collaborative tele-present Augmented Reality (AR).

Reality Mixing

Intermediated Reality (IR) allows multiple remote users to interact simultaneously with physical entities that come to life in Augmented Reality, enabling them to collaborate and have fun together from afar. The real-world object acts as an intermediary that reproduces recorded interactions made by the sender, while texture deformation preserves the seamless appearance of the physically deformed object.
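As an illustration, texture deformation can be sketched as a per-triangle barycentric warp: camera pixels observed under the object's rest pose are re-sampled at the deformed vertex positions, so the object keeps its real-world appearance. The sketch below uses our own function names and simple nearest-neighbour sampling, and is a simplification rather than the published implementation:

```python
import numpy as np

def barycentric(p, a, b, c):
    # Barycentric coordinates of point p in triangle (a, b, c).
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return 1.0 - v - w, v, w

def retarget_texture(camera, src_tri, dst_tri, out_shape):
    """Warp the camera pixels under src_tri (rest pose, image space)
    to dst_tri (deformed pose) so the object keeps its real appearance."""
    out = np.zeros(out_shape, dtype=camera.dtype)
    a, b, c = [np.asarray(p, float) for p in dst_tri]
    sa, sb, sc = [np.asarray(p, float) for p in src_tri]
    x0, y0 = np.floor(np.min([a, b, c], axis=0)).astype(int)
    x1, y1 = np.ceil(np.max([a, b, c], axis=0)).astype(int)
    for y in range(y0, y1 + 1):
        for x in range(x0, x1 + 1):
            u, v, w = barycentric(np.array([x, y], float), a, b, c)
            if min(u, v, w) < 0:          # outside the deformed triangle
                continue
            src = u * sa + v * sb + w * sc  # same barycentric point, rest pose
            sx = min(max(int(round(src[0])), 0), camera.shape[1] - 1)
            sy = min(max(int(round(src[1])), 0), camera.shape[0] - 1)
            out[y, x] = camera[sy, sx]      # nearest-neighbour sample
    return out
```

In practice this warp would run per mesh triangle on the GPU; the loop form here is only for clarity.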



Seamless Augmentation

By augmenting the camera feed with our reconstructed appearance of the object in a deformed shape, we create the illusion of movement for static real-world objects, remotely. This illusion is achieved through real-time image retargeting techniques applied to the real-world object.
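A minimal sketch of this augmentation step, under our own simplifying assumptions (precomputed silhouette masks and a background estimate, which a real system would recover at runtime): pixels the static object covers but the deformed pose no longer does are filled from the background, and pixels under the deformed pose take the re-rendered appearance.

```python
import numpy as np

def composite_ir(camera, background, render, orig_mask, deformed_mask):
    """Overlay the reconstructed, deformed object onto the live camera
    frame and hide the now-uncovered parts of the real, static object."""
    out = camera.copy()
    # Diminished-reality step: the real object covers these pixels, but
    # the deformed pose does not, so reveal the background estimate.
    reveal = (orig_mask > 0) & (deformed_mask == 0)
    out[reveal] = background[reveal]
    # Augmentation step: paint the re-rendered deformed appearance.
    out[deformed_mask > 0] = render[deformed_mask > 0]
    return out
```

The hypothetical masks here stand in for the object tracking and pose estimation a mobile MR pipeline would provide.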



Shadow Retargeting

Shadow Retargeting maps real shadow appearance to virtual shadows given a corresponding deformation of scene geometry such that appearance is seamlessly maintained. By performing virtual shadow reconstruction from un-occluded real shadow samples observed in the camera frame, we recover the deformed shadow appearance efficiently.
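The idea can be sketched as a ratio transfer: un-occluded real shadow pixels give the attenuation between shadowed and lit intensity, which is then applied under the deformed shadow's footprint. This simplified sketch uses a single mean attenuation and hypothetical inputs (a lit-intensity estimate and precomputed masks), whereas the published method samples correspondences per pixel:

```python
import numpy as np

def retarget_shadow(camera, lit_estimate, real_shadow_mask, target_mask):
    """Darken the deformed shadow region (target_mask) using the
    attenuation observed in un-occluded real shadow pixels."""
    shadowed = camera[real_shadow_mask > 0].astype(float)
    lit = lit_estimate[real_shadow_mask > 0].astype(float)
    atten = (shadowed / lit).mean()  # mean ratio; a per-pixel search is finer
    out = camera.astype(float).copy()
    out[target_mask > 0] = lit_estimate[target_mask > 0] * atten
    return out
```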



Publications


Intermediated Reality: A Framework for Communication through Tele-Puppetry


L. Casas and K. Mitchell. "Intermediated Reality: A Framework for Communication through Tele-Puppetry". Frontiers in Robotics and AI, Special Issue "Collaboration in Mixed-Reality", 2019.


Enhanced Shadow Retargeting with Light-Source Estimation Using Flat Fresnel Lenses


L. Casas, M. Fauconneau, M. Kosek, K. Mclister, and K. Mitchell, "Enhanced Shadow Retargeting with Light-Source Estimation Using Flat Fresnel Lenses", Computers, vol. 8, no. 2, p. 29, Apr. 2019.


Multi-reality games: an experience across the entire reality-virtuality continuum


L. Casas*, L. Ciccone*, G. Çimen, P. Wiedemann, M. Fauconneau, R. W. Sumner and K. Mitchell. "Multi-reality games: an experience across the entire reality-virtuality continuum". In Proceedings of the 16th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and its Applications in Industry (VRCAI '18). Tokyo, Japan. 2018. Article 18, 1–4.


Image Based Proximate Shadow Retargeting


L. Casas, M. Fauconneau, M. Kosek, K. Mclister, and K. Mitchell, "Image Based Proximate Shadow Retargeting", In Proceedings of the Conference on Computer Graphics & Visual Computing (CGVC '18), Swansea, Wales, UK, 2018, Eurographics Association, pp. 43–50.


Props Alive: A Framework for Augmented Reality Stop Motion Animation


L. Casas, M. Kosek and K. Mitchell, "Props Alive: A Framework for Augmented Reality Stop Motion Animation", 2017 IEEE 10th Workshop on Software Engineering and Architectures for Realtime Interactive Systems (SEARIS), Los Angeles, CA, USA, 2017, pp. 1-4.


Patents


Intermediated Reality. Augmented Reality Systems and Methods


L. Casas and K. Mitchell. 2019. US20220068010A1 (Patent Pending).

Get in touch!


Dr. Llogari Casas and Professor Kenny Mitchell co-founded 3FINERY LTD with a host of top-notch advisors and partners. We are an independent spin-out of Edinburgh Napier University, building on more than seven years of research into advanced audio and visual immersive avatar communication technologies. If you would like to find out more about how we can help you and your business, please get in touch.





Supported by