The Future of Computer Generated Imagery and The Metaverse

Hey Hive Peeps

For this post, I wanted to talk a bit about CGI (Computer Generated Imagery) and what the metaverse of the future could look like. I have recently been doing a fair amount of research on the current state of CGI and some of the many areas that utilise it.


Looking at the current landscape, it really feels like we might be on the verge of an industrial revolution of sorts when it comes to software like Unreal Engine and Unity. I think it's also fair to say that consumer-grade hardware is finally reaching a point where it can deliver visual experiences almost indistinguishable from the real thing, and at a semi-affordable cost. As I'm sure some of you are aware, there is a battle going on between Unreal Engine and Unity to see who can deliver the best tools for developers to craft these experiences. I'm still not quite sure who is winning at the moment, as both are doing a pretty mind-blowing job of pushing the envelope to places it has never been pushed before. All I can say is that they are both getting very good at making the bleeding edge bleed.

Given that both of these companies are already playing a very large part in building the platforms that will eventually come together to form what we will possibly come to call the metaverse, it has me wondering: how real could things look in this elusive world when it eventually gets here?

Being a self-confessed Unreal Engine and Matrix fanboy, I thought I would start with the photorealistic city that Unreal created for The Matrix Awakens tech demo, given that it is pretty much a whole city-like world out of the box and is available to anyone whose hardware is up to the task of running it. It is entirely conceivable that an asset like that could quite easily be turned into a metaverse-type platform, or that something similar in nature could be created for that purpose. It would definitely be interesting to experience it in a VR setting, and it wouldn't surprise me at all if a mod like that is already in the works somewhere.

Epic Games have also been kind enough to make a large number of the assets that were used to create this city available in the Unreal Engine marketplace so that people can have some fun with them in their own creations.

Pretty cool!!!

The whole Matrix Awakens city was definitely groundbreaking for many reasons, and it did a really good job of showcasing what is possible with the new Unreal Engine 5 Nanite and Lumen technology. From what I understand, these are two of the main things that have made this type of project possible, given the previous constraints around the baking of lighting and the general optimization of graphical elements in CGI production environments. This push really began a short while back with the Valley of the Ancient tech demo and revolved around the idea of being able to achieve more with less.

Here is a really good vid that explains this in more detail so you can see where it all started.

Another massively important part of the Unreal Engine 5 arsenal is the Quixel Megascans asset library. The team at Quixel have built an entire library of photogrammetry-scanned assets that can literally be dragged and dropped directly into your Unreal environments and, with the help of Nanite, blended seamlessly into almost any scene.

I have been finding that the deeper I dive into this world the more amazing the people and the things that they create become. It seems that there is honestly no limit to what can be achieved when it comes to this amazing world of CGI. It is a world that knows no bounds except that of the imagination. It’s hard not to love it.

Another area where both Unreal and Unity are starting to shake things up is movie production. It is well known how much a high-fidelity Hollywood production can cost, as there are so many factors involved that, when brought together, can run into millions of dollars: actors, film locations, wardrobe, hair and makeup, camera equipment, film crews and so on. Unreal and Unity have managed to find ways to virtualise many of these components and drastically reduce the cost of production. We already have amazing TV shows like The Mandalorian using Unreal Engine to solve some of these problems.

We are also now at a point where entire movies are being created using software and CGI and in some cases have been created by a single person using a single computer.

Here are some nuggets created by some incredibly talented people that I managed to dig up.

The last example is from the recent Unity Enemies tech demo, which is probably one of the best tech demos I have seen so far when it comes to digital humans. Unity recently joined forces with Ziva Dynamics and Weta Digital, a move that may well help level the playing field between Unreal and Unity.

So, in answer to the question of how real the metaverse could look: pretty damn real, I think!!!

If we are already creating things of this quality now, what will we be creating 3 or 5 years from now?

I think the bigger question is: are we ready for this kind of reality?

I will leave you with that thought for now.

I hope you enjoyed this little showcase and as always thanks for taking the time to read my post.
