This post is brought to you by…
NFTfi allows borrowers to put up assets for a loan and lenders to make offers in exchange for interest. The NFT is held in escrow, so lenders know they will either get their money back with interest or receive the NFT instead.
Part 2 of the Metaverse Primer covers the hardware that enables us to interact with the Metaverse at a user level. Specifically, Matthew focuses on hardware designed to access, interact with, or develop the Metaverse. The article deliberately doesn’t include compute or network-specific hardware (e.g. chips, servers, or cabling).
Despite being one of the shorter pieces in the primer, it was mind-blowing to discover just how capable some of the current tech we hold in our hands (and wear on our heads!) has become. There isn’t a great deal of conjecture from Matthew on the future either, but we add some colour ourselves. Let’s get to it.
Consumer Hardware
The takeaway from this section is not just that consumer tech gets better every year, but that the portable devices we already own are immensely powerful. Improvements in sensors, battery life, haptic feedback and screen quality all make our experience of the Metaverse more immersive, and combined with the ubiquity of smart devices, they draw users ever deeper into the digital realm.
As an example, Matthew turns to live avatars, made possible by the immensely powerful hardware in our mobile phones, something many of us, myself included, probably take for granted.
A second example comes again from Apple. Using ‘Object Capture’, you can snap a few photos of anything in the real world and use them to create a virtual version of the object. Once generated, the virtual object can be shared inside a variety of virtual environments.
In order to understand how tech like this is possible, let’s take a quick glance at some numbers:
iPhones are able to track 30,000 infrared points on the face, primarily for Face ID. That data is also available to other apps, with the live avatars above being the perfect example.
The latest models also emit 500 million lidar pulses per second(!) to map the surroundings and provide location data down to the centimetre. This means you can walk around your home wearing a VR headset and not bump into things. Handy.
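To get a feel for what those figures imply, here’s a quick back-of-envelope calculation. Only the 30,000-point and 500-million-pulse numbers come from the article; the frame rate and bytes-per-point are illustrative assumptions of ours.

```python
# Back-of-envelope: raw data implied by the sensor figures above.
# Assumptions (ours, not the article's): the face mesh is captured at
# 60 frames per second, and each point costs ~6 bytes (three 16-bit values).

FACE_POINTS = 30_000                 # infrared points tracked for Face ID
LIDAR_PULSES_PER_SEC = 500_000_000   # lidar pulses per second
FPS = 60                             # assumed capture rate
BYTES_PER_POINT = 6                  # assumed: x, y, depth as 16-bit values

face_points_per_sec = FACE_POINTS * FPS
face_mb_per_sec = face_points_per_sec * BYTES_PER_POINT / 1e6

print(f"Face mesh: {face_points_per_sec:,} points/s "
      f"(~{face_mb_per_sec:.1f} MB/s raw)")
print(f"Lidar: {LIDAR_PULSES_PER_SEC / 1e6:.0f} million pulses/s")
```

Under those assumptions the face mesh alone generates roughly 1.8 million points, or around 10 MB of raw data, every second, which gives a sense of why on-device processing matters so much for this kind of tracking.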
Extended Reality (XR) headsets have a similar story, where advances in the hardware immerse us more deeply every year. There is still a way to go before we reach levels of near-perfect immersion, however. For that, we still need:
Refresh rate >120Hz (currently 90-120Hz)
Field of view of 210° (currently ~52°)
Resolution at 8K or better (currently 4K)
Headsets also deal with only a single sense: sight. As noted in the piece, any advancements here will need to be complemented by increased quality and reduced size in other wearables to bring the experience closer to real life.
Overall, this stance fits with our vision for the Metaverse here at MetaPortal. The Metaverse is an emergent state that exists at the intersection of digital and physical realities. It will be accessed through embedded tech like that described above, including earbuds, haptic gloves, better VR gear and so on. The tech exists at the boundary between the two realities, a region that will tend towards imperceptibility as the hardware is able to project at higher quality, in a smaller, lighter package. We might also expect it to become more embedded within our clothing, and potentially even our bodies, reducing that boundary further still.
Non-consumer Hardware
Another slew of insane numbers is presented as we move to higher-end hardware. Leica sells photogrammetric cameras that can scan 360,000 laser points per second, capturing urban spaces in greater detail than a person would ever see. Epic Games has something similar in Quixel, whose Megascans are accurate to the tune of tens of billions of pixels.
The advent of this tech makes it faster and cheaper to produce high-quality mirror worlds of our own reality, or brand new environments entirely. Again, the pace of innovation is emphasised: ten years ago, Google stunned the world by capturing 2D Street View; now Amazon Go stores capture dozens of people moving around in-store, every day. Matthew quips that perhaps in the future:
When you go to Disneyland, you might be able to see virtual (or even robot) representations of your friends at home and collaborate with them to defeat Ultron.
In the future, perhaps you won’t even physically ‘go’ to Disneyland at all…
Conclusion
The article doesn’t conclude so much as abruptly end, having covered the two sections succinctly. The final sentence does wrap things up nicely, however, conceding that while hardware is just one piece of the puzzle, without it the Metaverse could not exist:
These experiences require more than just the hardware, but they will always be constrained, enabled and realised through it.