My "work from VR" experience

Published On: 28 June 2023

Tags:

VR
AR
VisionPro
Meta
Quest
Immersed
Workrooms

In case you didn't notice, VR and AR are all the rage again. That's because Apple presented a revolutionary (and expensive) headset that might completely change the game, as the company has already done multiple times in the past (remember pre-2007 phones?).

Interestingly enough, a few weeks earlier I had gotten my hands on a VR headset for the first time. I realize that I am definitely not an early adopter: the first Oculus prototype was created in 2012 and Facebook bought the company in 2014. Somehow, however, this technology never really captured my interest, so I managed to spend all these years without even trying one of these devices. This changed around one month ago when, while having dinner, my friend Andres said: "By the way: I have an Oculus Quest 2 in this backpack. You might want to give it a try".

So I went home with the new toy and thought: "What do I do with this now?".

I obviously tried some cool games (Hyper Dash and Beat Saber, among others), created my personal Meta avatar, explored the power of 360° videos, etc.

This is how I look in the Metaverse

But the thing I was most interested in was testing it in a professional environment and seeing whether a VR headset can potentially replace a real work setup.

I remembered watching a YouTube video about Immersed a while ago, so that was the first app I tried out. The installation is quite simple: you install the Immersed app on the headset and the desktop agent on the Mac. The two devices need to be on the same Wi-Fi network, and after a quick setup you're done: your computer is now available in the virtual space.

I quickly created a 3-monitor setup, chose a nice virtual room, and started to play around with my new workstation.

Here I immediately noticed something interesting: I thought I could type on my keyboard without needing to look at it. While this is mostly the case, it turned out that sometimes I had to take a quick look. For example, I was struggling to take a screenshot (with the CMD+SHIFT+4 combination), and it took me a few attempts to get it right.

Speaking of the keyboard, there are two possible modes:

  1. Keyboard Passthrough Portal: with this option, you open a "portal" through which you can see your actual keyboard in the real space, and therefore also your real hands typing on it.

  2. Tracked keyboard: in this case, you have to tell Immersed which model of keyboard you have. And then boom, a virtual replica of your keyboard is available in the virtual space.

I tried option 1 but realized that it wasn't good enough. Passthrough in general is not great on the Quest 2, so you can't really see the keyboard clearly.

Then I tried option 2 and overall found it way better. If you have hand tracking enabled, you also see a visual representation of your hands in the virtual space. The cool thing is that, while you're typing, your hands transform into the real ones (thanks to passthrough) while the keyboard remains the virtual one. It's still not as good as in the real world, but good enough to type with an acceptable number of mistakes. The annoying thing is that, when using the mouse (which doesn't have a counterpart in the virtual space), Immersed doesn't really understand what you're doing, so your hands keep projecting a light beam as if you were constantly pointing at something.

Choosing a room

This was actually quite fun: when choosing the location, I initially picked one called "Space Lounge". I thought it was quite cool and didn't realize at first that it was not a private room. At some point, another "person" popped up in the seat right next to me and I freaked out a bit. Then I found out that I was actually in one of the many co-working space locations that Immersed offers. I even ended up having a nice chat with this person, who was also a developer. Later I discovered that there are several other locations that are indeed private spaces, so nobody will randomly pop up in front of you when you least expect it.

Immersed Webcam

Let's say you want to join a video meeting while you're in VR. What will other people see? If you use your standard webcam, people will just see a dude wearing a VR headset. Not very cool. That's why Immersed created a virtual webcam. This camera exists only in your virtual setup and basically floats around, which means you can position it exactly where you want. Remember, we are in a virtual world, so the laws of gravity don't apply here.

This camera is selectable in most meeting apps (I tried it only on Google Meet, but it surely works in Zoom, Skype, and Microsoft Teams as well). As for what other people will see: your avatar (including your hand movements) and, obviously, the virtual location you chose to be in.

I put together a small video showcasing most of the things described so far:

Doing some actual work

I had everything set up and working fine. But the question was: can I really work with this setup? The only way to find out was to try, so I decided to start my work week from my fancy new virtual cottage at the lake, with my 3 giant screens floating in front of me. When I joined the daily standup meeting I could see that my colleagues were a mix of entertained, curious, and facepalming. Which is totally understandable, since I showed up as a virtual version of myself. However, the whole meeting went perfectly fine, including screen sharing, etc.

The only thing I noticed was a little bit of lag at certain moments. This was easily fixed by using a cable connection between the headset and the computer instead of the Wi-Fi connection.

Speaking of the actual work, I did a coding session of ~1 hour. After that, I kind of felt the need to go back to my real monitors. The virtual screens are simply not sharp enough, and after a while you realize that it's too much fatigue for your eyes. I repeated the experiment multiple times that day and in the following days, but I never managed to do a session longer than 1 hour.

Other Software

I initially thought that Immersed was the only software offering this type of functionality. After some research, I found out that having your computer in the virtual space is also offered by Meta Horizon Workrooms, so I decided to try it out. Setting everything up is quite similar: you install an agent on your computer (called Meta Quest Remote Desktop), install Meta Horizon Workrooms on the Quest, and then you're all set.

One interesting thing I noticed is that, while with Immersed having hand tracking enabled is completely optional, for Workrooms it is a requirement: you cannot even launch the app if the setting is off. Speaking of hand tracking, I think the whole thing is handled better here. For example, I didn't experience the "accidental pointing" issue I described earlier, and the typing experience also felt a bit better. As explained earlier, you see your real fingers typing on the virtual keyboard, which looks exactly like the real one.

Another cool thing in Workrooms is that you can transform your desk into a whiteboard and use your controller as a pencil. This worked incredibly well and I can definitely see some nice use cases for it, especially if you are sharing the space with somebody else (more on this later).

Unfortunately, the mixed reality behavior of the hands is not really visible in the video since, apparently, screen recording doesn't include the passthrough video stream. But trust me, in the headset this looks way cooler.

Meeting people

In line with Meta's company vision, Workrooms is software for working, collaborating, and connecting. Therefore, I was super curious to try it with other people and have a real meeting in VR. Luckily my friend and former co-worker Antonio (yes, I know, same name) also happens to have a Quest 2, so I set up a meeting and invited him to my workroom.

Meeting a friend inside a Workroom

I have to say that this is way cooler than I anticipated. First of all, it takes only a few minutes to completely forget that you're in a virtual space. Everything feels very natural: the audio comes from where it's supposed to come from (thanks to spatial audio), you look the other person in the eyes when you're talking, you see their hands moving as in real life, etc.

When you're using your computer, the other person sees that you have a screen in front of you but cannot see what is being displayed, unless you explicitly decide to share it. In that case, your shared screen shows up on the big whiteboard, just as it would in a real meeting room with a projector. Speaking of the whiteboard, you can literally just walk up to it and draw something using the controller. Or you can stay at your desk, draw on your workstation (as in the video shown earlier) and share your drawing in real time on the whiteboard so that everybody can see it. There are also different configurations for the room, depending on the type of meeting you want to have.



Final thoughts

Overall it was a very interesting experience. The two apps I tested each have pros and cons. For example, Immersed provides more customization and more types of locations (from cafeterias to spaceships), but the visual quality of these places is not great; at times it felt a bit like being in an early-2000s video game. Workrooms is a bit more visually appealing in this sense, even if it provides just a few different spaces. Another difference I noticed is that Workrooms doesn't allow you to choose your screen resolution. This is probably on purpose, because they know that readability won't be optimal at higher resolutions; in my case, it was locked at 1600x1200, which is quite low. Immersed, on the other hand, offers full customization in this regard.

Also, Immersed has this nice concept of co-working spaces, while in Workrooms you are responsible for creating rooms and there are no public places where you can just go and meet people.

In general I really liked the concept, and I think the only downside was the sharpness of the screens, which was not enough for prolonged work sessions. But, of course, this is just a hardware limitation and will surely be solved. The recently announced Quest 3 should already be way better, and I'm really curious to try the Apple Vision Pro to see if it's really as good as many people who have already tried it claim.

What I'm already pretty sure of is that this will be part of our future in computing, and that exciting times are ahead.
