Life’s Imprints in 360 Degrees

In an earlier blog entry, we wrote about diving into virtual reality through tech research at profiq. We started by thoroughly examining VR, back when we were primarily interested in the general status of this technology. What equipment is available on the market, and what usage opportunities does it present for users? What’s the architecture of the technical solution itself, and on which platforms is it built? What are the use cases in the real world, and how can we, as software engineers, use our expertise and our own development to contribute? These are the kinds of questions we asked during this very fun and inspiring phase of getting to know virtual reality and testing its ever-expanding boundaries. So which area did we ultimately select for closer examination?

It is truly fascinating how many possibilities this new technology offers for engineering research, and how many facets of human life it may impact in the near future. One of the basic premises of virtual reality is that it can transport you to an environment other than the one in which you are using it. The places you can “visit” from the comfort of your home or office come in two essential varieties. How so?

First, these hyper-real worlds can be modelled artificially. A user enters a newly created space of 3D objects and polygons, which offers a very high level of believability and adds a wonderful interactivity that closely approaches the way the real world behaves around us. Valve Corporation, one of the top game developers, created an engaging application called The Lab, which offers a suitable example of this case. As VR users, we’re very curious about what sorts of new opportunities artificially modelled worlds will bring to VR. Current developments indicate we have a lot to look forward to!

The second method for “teleporting” elsewhere through your VR headset is the interposition of experiences and images recorded in the real world. These are transmitted to your glasses and displayed in a manner that most faithfully evokes the impression that you’re in the place where the recording was taken. Google Street View offers a good example of how this use of virtual reality allows for very believable virtual travel to places that have been captured in the application. This domain of “life’s imprints in 360 degrees” has brought a whole new dimension to 360° photographs and, especially, 360° videos.

The transmission of video content to VR glasses, and its display there, has recently become the subject of closer, more detailed research at profiq. We’ll cover this topic in this and several upcoming blog entries.

How does the transmission of 360° videos actually work? We’ll start with a simple overview of the most essential steps involved in the process, from recording a 360° video to playing it in your VR glasses. In the infographic below, you can identify the major milestones along a video’s journey from the camera all the way to your headset. Although it’s a simplified diagram, it’s clear that this process encompasses a wide range of topics that software developers can explore. So which of these did we choose, and why? Which of these topics impressed our developers? Follow our blog and soon you’ll find out more!
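To give a flavour of the maths at the very end of that journey: 360° videos are commonly stored as equirectangular frames, where the player maps the headset’s viewing direction onto a pixel in the flat frame. The function below is a minimal illustrative sketch of that standard projection (the function name and resolution are our own choices, not from any particular player):

```python
import math

def equirect_pixel(yaw, pitch, width, height):
    """Map a viewing direction (in radians) to a pixel in an
    equirectangular 360-degree frame.

    yaw:   rotation around the vertical axis, in [-pi, pi)
    pitch: elevation above the horizon, in [-pi/2, pi/2]
    """
    # Horizontal position: yaw sweeps the full 360 degrees of the frame.
    u = int((yaw / (2 * math.pi) + 0.5) * width) % width
    # Vertical position: pitch covers 180 degrees, top of frame = straight up.
    v = int((0.5 - pitch / math.pi) * height)
    return u, v

# Looking straight ahead lands in the centre of a 4K equirectangular frame.
print(equirect_pixel(0.0, 0.0, 3840, 1920))  # (1920, 960)
```

A real VR player performs this mapping per pixel on the GPU (typically by texturing the inside of a sphere), but the underlying geometry is exactly this.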


