
Does Sony’s investment in Epic Games shape the future of next-gen console success?

The UE5 tech demo that I spoke about in last month’s blog has set the internet on fire. With the demo launching on PS5, Sony managed to secure something of a major coup for their new console, which finally launches this autumn alongside the new XBOX console. Sony’s subsequent investment of $250m USD to secure a minority stake in Epic Games, developer of Unreal, the engine that powers so many of the games industry’s biggest titles, has certainly raised eyebrows. So, what does this mean for the other consoles, specifically the two that will fight for dominance in the coming months and years?

Tim Sweeney, CEO of Epic Games, has seemingly scotched any suggestion that the deal might somehow favour the PS5 and disadvantage Microsoft’s hopes for its own new console, the XBOX Series X. ‘There’s no secret deal’, Sweeney tweeted last week, further stating that ‘Serious investment discussions followed from the Unreal Engine 5 demo we showed on PlayStation 5. I guess they liked it!’.

 

SONY’S MINORITY STAKE IN EPIC GAMES

Any notion that this represents some kind of ‘golden handcuffs’ deal that might tie Unreal exclusively to the future of PS5 should be viewed with some scepticism. Such a strategy would be foolish for Epic, who have much broader ambitions outside of the games industry. Indeed, Epic have already confirmed that the deal will still allow them to publish to other platforms. In the grand scheme of things, Sony’s investment gives them only a relatively small minority stake compared to other investments the developer has received in recent years. Although an enormous sum of money, $250m USD currently represents only a 1.4% stake in Epic Games. By comparison, Tencent still retain a 40% stake in the developer following an investment that it made in 2012, which it acquired at the time for ‘only’ $330m USD. With Epic now valued at a whopping $17.86 billion USD, Tencent’s investment has proved to be an extremely savvy one that not only underpins the value of Epic’s Unreal engine to the games industry but hints at Unreal’s broader potential in other sectors.

 

EPIC’S RISING STOCK

The rise in Epic’s stock value is a reflection of their growing ambitions (and success) for Unreal’s application outside of the games industry. With Sony’s own portfolio of entertainment assets extending to Music, Film and TV, it makes perfect sense for Sony to make this investment now. As Tim Sweeney mentioned when the announcement was made, “Sony and Epic have both built businesses at the intersection of creativity and technology, and we share a vision of real-time 3D social experiences leading to a convergence of gaming, film, and music.”

It’s the last sentence that is most interesting. Historically, console developers have always been fiercely protective of their tech, taking a ‘walled garden’ approach to their platforms, usually resisting any form of cross-play compatibility with other hardware. Epic’s own success with the juggernaut that is ‘Fortnite’ has enabled it to push for its belief in more open platforms, paving the way for cross-play functionality between competing consoles. Indeed, Fortnite was one of the first games to allow players on a PS4 to compete with their friends experiencing the same game on an XBOX One. With 350 million registered players and still one of the biggest games on the planet nearly three years after its launch, Fortnite’s success has given Epic a huge say in how games should be played in the future. Thanks to this, other games including ‘Dauntless’, ‘Paladins’ and ‘Smite’ can also now be played as cross-platform experiences.

Sony’s investment in Epic shouldn’t be viewed as anything that will give them an immediate advantage in the upcoming ‘console wars’ that the launch of the next-gen consoles will inevitably bring. Epic’s push for cross-play should be viewed as evidence of this. Technologically agnostic, Unreal continues to be at the centre of many leading XBOX exclusive games including ‘Sea of Thieves’, ‘Everwild’ and other forthcoming games that the REALTIME team have used the games engine to create cinematics and VFX for.

 

UNREAL’S LONG TERM PLANS

Instead, the investment should be viewed as a mark of the level of confidence that Sony has in shaping the future of entertainment outside the remit of games alone. With Fortnite recently hosting a ‘virtual concert’ by Travis Scott that reached 27 million viewers, it’s easy to see how such an experience might positively impact Sony’s music business. Similarly, with Unreal being used more and more as a VFX tool for Film & TV, this will also be of huge interest to their Sony Pictures division. And this is just the tip of the iceberg. Aside from its groundbreaking graphical capability, the tech could be used as the building blocks for a metaverse, Tim Sweeney’s dream for a shared digital space where we ‘live, work and play’ that may well turn Sweeney into a real-life James Halliday, the creator of the OASIS in Ernest Cline’s ‘Ready Player One’.

Sony’s investment probably won’t shape the outcome of who will likely succeed in the next generation of consoles, but it might well be an indication of new experiences to come.

REALTIME have experience of working with the Unreal Engine across all areas of our studio, including TV VFX and automotive. So, if you’re looking for an experienced partner who specialises in creative solutions, give me a shout on [email protected]

TV production in a COVID-19 world

Some months after the UK lockdown began, businesses can finally reopen. But running a business can’t be done like it was before COVID-19. Every industry has had to adopt new ways of working in an effort to keep staff and customers safe. Strict cleaning regimes, two-metre distancing, perspex screens to separate people – every safety angle has to be covered and adapted to each industry’s differing requirements.

Production has been on hold for every movie and TV show, and it’s only now that some are starting to get back up and running. Productions have been given guidance on how to work safely with COVID-19: the BFI and the BFC have released a detailed document. The guidelines include advice on maintaining distance and how to navigate daily tasks such as hair and makeup.

 

Working under COVID-19 – Survey Results

These BFI and BFC guidelines are helpful, but in practice, productions will find it difficult to implement and adhere to many of the guidelines – indeed some shows will be unable to enter production again at all. 

Working within these guidelines will be difficult, particularly for those productions that are starting up again now. We conducted our own research survey into the topic and came to some very interesting conclusions, which we are sharing as promised.

 

The guidelines in practice

Put simply, over 80% of professionals in the industry expect that implementing these changes will present some degree of difficulty. Over 77% of people said they had seen delays on their projects, with many expecting it to take three months, at the very least, to get back to normal production. In fact, a significant number are thinking more along the lines of 12–18 months.

Some shows have already picked up production again, but as predicted, the process is filled with new challenges. This particularly interesting Financial Times piece details some of the extreme ways some productions are working around restrictions. Over in Spain, casts carried out rehearsals wearing face shields, and another show quarantined their cast, with Netflix arranging food deliveries.

But this is revealing another issue; not everyone has the financial backing of Netflix. These guidelines, whether productions choose to follow them or not, will increase costs and timeframes. One respondent in our survey predicts production could be up to 50% slower than it was before.

While Netflix shows can benefit from their significant resources, many productions are wondering how they will manage. It’s an industry that relies on freelancers too, people who won’t have been able to make use of any government furlough schemes. They will need the most help; our survey found almost 90% have seen a significant drop in new work. 

 

Insurance Issues

Another key finding from the survey was the issue of production insurance – or the lack thereof.

In our survey, every respondent said a government-backed production insurance scheme was essential to return to pre-COVID-19 levels of work, with over 77% labelling it as “extremely important”.

Insurers won’t cover the shutting down of a production if a crew or cast member is affected by the coronavirus. Some insurers may cover the costs of replacing key cast and crew who get infected by COVID-19 (though the rates are high). However, they won’t cover any other related costs from the delay caused by an outbreak on set, or the wider impact of having a sick crew or cast member on the production.

These insurance costs are being covered by some of the larger studios, but it’s much harder if you’re an independent film or TV show that doesn’t have the backing of a studio that will underwrite your insurance.

As it is, there are too many variables to go back to how things were. As one of our respondents put it:

“Stunts need physicality, crowd scenes need people to be close together, crew and cast all need to be transported to and from home to set, foreign locations will need flights. Equipment will need deep cleaning every day. Shooting will slow down exponentially and budgets will rise.”

It’s also why one of the most important VFX additions is predicted to be crowd replication, as it’s the simplest way to circumvent the issue and respect the guidelines. People are looking for any way to speed the process back up again without negatively affecting their shows too much.

Some problems are out of our control, so it’s up to us to take control of the things we can. There are still things to be optimistic about, but exactly how the industry will change in the long-term remains to be seen. I think many production companies and producers will be looking to the first shows that start up again to see how they manage to adapt to producing under the COVID-19 industry guidelines.

At a time like this, we need a strong support network. Now is the time to reach out to others and work together to deliver high-quality productions we can be proud of. If you have an upcoming project you need assistance with, get in touch at [email protected].

How video game technology is breaking into other industries

Video games are one of the most-loved pastimes. To get an idea of just how popular they are, look no further than the PS5’s reveal which has – as of writing – garnered more than 23 million views on YouTube. It’s fair to say everyone’s a bit excited.

But for some, the most exciting aspect of gaming isn’t the new releases or the shiny new hardware. Its uses extend far beyond that. There are people out there who use gaming technology without even touching a video game. For them, it has revolutionised the way they work, learn, or even interact with the world.

Gaming has given us a lot of great moments but we’re here today to talk about the alternative uses you might not know about. Here are three gaming innovations that have found a place outside of our living rooms.

 

Engines

Epic Games has long touted its popular Unreal Engine as a resource for use outside of gaming. That will no doubt still be the case with its recently revealed fifth iteration. And plenty of people have taken them up on that offer.

It’s popular in industries where you need to render a 3D model for one reason or another. The automotive industry will use it to create hyper-realistic recreations of their cars for a configurator. Or the TV and film industry might use it to create pre-vis assets to give the production team an idea of how the finished product might look.

One drug development company has even used it to develop medicines. It came from a need to make the initial discovery and development stage simpler. It’s for reasons like this that Epic created their “Enterprise” team, whose job it is to bring the technology to new industries. Who knows, maybe it’ll play an integral role in creating the next groundbreaking medicine.

 

Motion controls

Opinions on motion controls vary depending on who you ask. There’s a tendency to view them as “gimmicky” or a “flash in the pan”. And maybe that’s the case; Kinect is hardly in everyone’s living rooms, as Microsoft hoped it might be. But the technology has found a home elsewhere.

When Microsoft released the SDK for the peripheral, it opened the door for many other industries who could make use of its frankly amazing motion-tracking technology. Here’s a video of it being used to interpret sign language, which could break down barriers in communication.

There were even stories of it being used in surgery. Beyond that, it could be used for mixed reality purposes, such as digitally trying on clothes at a store, capturing high-quality 3D scans, or even in stroke recovery.

 

Virtual reality

VR is slowly gathering pace in the video game world, with games like Valve’s Half-Life: Alyx pushing it further into the mainstream. But such a strong piece of tech was always bound to find a home outside of video games.

In healthcare, it’s become a useful tool for training surgeons, with some using it to hone their skills for complicated procedures, such as brain surgery. You also have the military using it to train soldiers. And you can see museums adopting the technology to deliver a more immersive, educational experience.

Back to the automotive industry, where Toyota is using VR to teach people about the dangers of distracted driving. It could also be used during car production to let people look inside the car in great detail (thanks to a high-quality render) even when they aren’t at the manufacturing facility.

 

And more!

This one is a throwback; remember when the US Air Force made the 33rd largest supercomputer using PS3s? 1,760 of them, to be precise. Not only was it big, but it was fast, too. At the time, it was actually the fastest interactive computer in the entire US Defense Department.

But we’re still seeing more and more examples of how we can use games for good to this day. CCP Games have even integrated scientific research into their massively popular game EVE Online. As part of what they call “Project Discovery”, players can take part in a minigame that can help scientists fight COVID-19. By marking groups of cell populations present in blood, they can help scientists understand how different cell populations are altered through infection.

And this list is far from exhaustive! There are hundreds of other uses for video game tech I wish we could talk about, but no one wants to read me go on for 10,000 words. It’s great to see video games have an influence on even more lives and show how they can be a force for greater good. With the impending release of new consoles and technology, we can’t wait to see what it’s used for next.

At REALTIME, we’re always looking for ways to improve what we do, across all aspects of our operations, not just video game trailers. We live and breathe this technology, and if you need an experienced body for your next project, our team is the perfect pick. Get in touch with me at [email protected].

Unreal – No longer just a game

When REALTIME was founded in 1996, its choice of name was a nod to the optimism of a far distant future where high-quality animation and VFX could be achieved in real-time – a premise that, at the time, was regarded as the holy grail of the VFX industry. Unimpeded by extensive rendering times, the most demanding visuals would be realised at a silky-smooth framerate right in front of the artist’s eyes. Thanks to ever-evolving technology and the games industry’s pursuit of visual excellence, this dream continued to take iterative steps towards its goal; with the release of each new generation of hardware, those steps would sometimes be entire leaps forward. Even so, the quality never quite reached the same level as that of the biggest Hollywood movie blockbusters. High framerates always came at the expense of the final level of visual fidelity, so real-time was never a production pipeline that could be used to achieve the very highest levels of quality – something that REALTIME subsequently founded its reputation on.

Recently, Epic has thrust Unreal into the limelight, demonstrating that the newest iteration of their engine might make our optimistic hopes a reality very soon.

Initially created by Epic Games to power its 1998 first-person shooter ‘Unreal’, the Unreal engine was quickly adopted by many other games developers who wanted to take advantage of its superior graphical capability. Fast forward to 2020 and the engine’s capability has grown exponentially, allowing its users to create experiences and productions for TV and Film that extend far beyond video games alone. Whilst UE4 has found its place in Film and TV, it’s fair to say that, by and large, its use in large-scale productions has been limited to that of a pre-vis tool. It has become invaluable for directors and storytellers, letting them see how successfully their story works at a very early stage of production and hopefully avoid the need for expensive re-shoots. However, that was pretty much where the story ended for UE4; despite its graphical capabilities pushing the boundaries of what could be achieved in games, its final output paled in comparison to that of Film and TV. Whereas UE4 could offer the flexibility of a real-time rendering solution, its final visual output could not compete with the quality that Film and TV demanded.

Ahead of the launch of the PlayStation 5 console, all of that looks set to change. As is the norm, the main competitors are keen to showcase the power of their machines in an attempt to seduce gamers to their platform and ecosystem. The launch of the PlayStation 5 and XBOX Series X in Q4 2020 is already seeing this battle take place online, with exclusive events designed to win over the hearts, minds and wallets of the next generation of gamers. This year, Sony has landed something of a coup by using their console as the first to showcase UE5 – the latest iteration of Unreal, the video games engine that has powered generations of the industry’s biggest games. It’s a clever move that has shown the PS5 as a powerhouse console – one that looks set to finally deliver the same level of visuals as the most ambitious Hollywood blockbuster movies, and all at a dizzyingly high frame rate. Marrying photo-real quality with the flexibility of real-time rendering, the latest iteration of the Unreal engine is a real game changer (excuse the pun).

Whilst this is obviously a major boost for the PlayStation 5, it is worth noting that the new version of Unreal won’t be exclusive to that console, with new projects already confirmed for XBOX.

UE5 boasts both an impressive geometry system and a lighting system that can handle even the most demanding assets with apparent ease. Whereas games engines typically rely on geometry assets being fairly well optimised to allow them to be rendered successfully at a high frame rate, UE5 uses its newly developed ‘Nanite’ technology to allow high-poly assets to be ingested into the engine without the need to optimise them first. In addition to assets imported directly from ZBrush, the demo shows cinematic-quality Megascans assets, of the kind that would typically be used in a pre-rendered solution, being drawn in real-time. Each multimillion-polygon asset is seen rendering in real-time using 8K textures. Epic boasts that the engine can handle an ‘insane’ number of triangles per frame, citing that each frame of the demo crunches a billion source triangles down to 20 million drawn triangles, evidently with no need for LODs or compromise to the final visual quality. Fitting that many triangles onto the screen means that each one ends up around the size of a pixel – evidence of just how much detail the engine can show. So much detail demands pixel-perfect shadows, something the demo shows UE5 is capable of delivering too. The engine’s lighting solution, Lumen, offers dynamic multi-bounce global illumination without the need for baking lightmaps. In short, it all adds up to the holy grail of VFX – real-time, movie-quality levels of visual fidelity without the need for extensive rendering time. Well, at least for environments anyway.
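Epic’s triangle figures are easy to sanity-check with a little arithmetic. The sketch below is my own back-of-the-envelope working, not Epic’s: the one-billion and 20-million triangle counts come from the demo commentary, but the 4K output resolution is an assumption for illustration – it simply shows why ‘a triangle around the size of a pixel’ is a fair description.

```python
# Back-of-the-envelope check of the UE5 demo's triangle numbers.
# Assumption: a 4K output frame (the demo's actual resolution isn't
# stated in this post).

SOURCE_TRIANGLES = 1_000_000_000   # geometry present in the scene per frame
DRAWN_TRIANGLES = 20_000_000       # triangles Nanite actually rasterises

width, height = 3840, 2160         # assumed 4K frame
pixels = width * height            # 8,294,400 pixels

crunch_factor = SOURCE_TRIANGLES // DRAWN_TRIANGLES   # how hard Nanite reduces
triangles_per_pixel = DRAWN_TRIANGLES / pixels

print(f"Nanite reduces the scene {crunch_factor}x per frame")
print(f"roughly {triangles_per_pixel:.1f} drawn triangles per pixel")
```

At roughly two drawn triangles per pixel on an assumed 4K frame, no individual triangle can be much larger than a pixel – which is exactly the level of detail the demo appears to show.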

The recent Disney production of its hit TV series ‘The Mandalorian’ used Unreal technology, combined with state-of-the-art LED screens, to provide an alternative to traditional green screen techniques. Whereas green screen can cause green lighting ‘spill’ to be cast onto its subject, the combination of Unreal and an LED screen setup to create a ‘virtual studio’ has multiple benefits. Firstly, it enables its subjects to be filmed in a more realistic lighting setting; the projections of the virtual set are reflected more accurately onto the subject, embedding the real-world subject into a virtual environment (rendered using Unreal) with pixel-perfect accuracy. Secondly, it gives the production team an unparalleled level of freedom in composing their shots; they can effortlessly change the viewpoint and lighting conditions of their world, generated in real-time using Unreal, with the click of a mouse. All of this happens in real-time, without the need to wait to see the final output.

Unreal Engine

It is a huge irony that the games industry, which has always been viewed as the poor relation to Film and TV, is now playing a pivotal role in the reinvention of its production pipeline. Film makers and storytellers are using the many benefits of Unreal to speed up the process and improve the quality of their productions. In the future, I’m confident that UE5 will continue to make further inroads into Film and TV, challenging incumbent production pipelines that have been in place for decades. In doing so, it will further blur the lines between games and movies, making games more accessible and appealing to audiences that might otherwise be dismissive. It’s a reciprocal relationship that offers a win-win for both movies and games. For REALTIME, the adoption of Unreal technology to create Film and TV quality VFX is not only an inevitability, but an imminent reality that our clients can benefit from. As well as enabling our team to create projects that might otherwise be too unwieldy to produce using a more ‘traditional’ pre-rendered pipeline, Unreal’s toolsets are finally allowing REALTIME to grow into its name.

UE5 is a true game changer (excuse the pun, again) and REALTIME can’t wait to share the future with you.

Six questions to ask yourself when appointing a CG supplier

The journey of creating a car is a multi-faceted one. There are seemingly countless jobs to check off your list as you approach its launch. One of those is finding a CG supplier who can fit perfectly into your development and work as a cohesive part of your team.

Much like recruiting a new designer or engineer, you can’t underestimate the importance of finding someone who gels with your team. And – just like in that situation – it’s not a decision to take lightly. But how can you be sure they’ll fit in? How does working with an outside company integrate into your ongoing project? Here are six questions to ask yourself when looking at a new CG supplier.

 

Can they feel your pain?

There are a plethora of CG studios out there. But not all will be the right choice for you. You need someone who understands the automotive industry – the challenges you face and the problems you regularly come up against – something that I’d wager can only come from actually having spent time working on that side of the fence.

If you can establish that they have this direct experience of what it really means to design, develop, launch and then sustain a new car in the sales charts, then your chosen agency will really appreciate your processes, will do a better job of integrating into the overall direction and will be able to address any issues before they even become a problem.

 

What is their approach to tech?

This is an industry where you need to know how cutting-edge a supplier’s tech is. How can they keep pace with you if their technology isn’t up to scratch?

With the best tech available, they’ll be able to produce the best results for you, the client. This is something you could see in their case studies or ask them about outright. Ask them to propose a solution to your problem; see if they can come up with something you haven’t.

 

What is their tech pipeline?

Ask yourself if you want or need visibility of the tech pipeline they use. This ties into the above, giving you confidence in the results they’ll deliver, along with the timescale you’ll be looking at.

Beyond that, visibility is especially useful if you plan on working with other third parties, as you may need to share assets during a big campaign. How well your supplier integrates into the process is an important factor you need to consider.

 

Do you know your own requirements?

This requires a bit of introspection. To know what you need from a CG supplier, you first have to understand your own requirements. Ask yourself what you need in terms of the balance between features, interactivity with what’s going on on-screen, and visual fidelity.

You might find some companies excel in two areas, but not the third. So you may have to prioritise some elements over others. There’s a chance someone out there can do all three, but again, you need to ask yourself if you have the time and budget for them.

 

How will they tie into your marketing?

As we said at the start, the entire process of releasing a car has a plethora of moving parts. One of which will be your marketing. Your CG supplier can play a role in this part too.

The 3D assets they create would work great in your marketing materials. But you have to ensure they can actually supply you with these assets. If they can, then great – it will save you money.

 

What is important to you?

This might sound somewhat philosophical, but we’re referring to the look and feel of your car. What ‘look’ are you after as a client? Do you have an idea in mind, or are you planning on working with your supplier to develop something unique to you? Or maybe they have a ‘one look fits all’ approach that you’re happy with?

You also need to work out what you expect of the final product. What do you expect of the look, the materials, the functionality, or the quality of the final image? With these details in mind, you can find a CG supplier that can meet those expectations and help give them an idea of the costs during the brief.

So before you set out to find your perfect CG partner, consider these questions first. Answering them ahead of time will help set your expectations and provide your chosen supplier with the details they need to get the job done to the best of their ability. If you follow these steps, you could end up with a fruitful, long-lasting relationship.

At REALTIME, we have years of automotive experience under our belt. Our team is well-equipped to handle all your wants and needs, and we’re always looking for new, exciting projects to be a part of. Feel free to get in touch with me on [email protected].