
tyFlow Test

Our hugely talented artist Danilo Lombardo has been testing some new tools that have caught our attention, and tyFlow in particular produced an amazing result.

Lucky for us he’s done a lovely little write up… take it away Danilo!

This scene started as an experiment, with the mossy forest of Wistmans Wood, in Devon, England, as our main reference.
A scene like this relies on a few small but crucial fundamentals.

The main objective was to create a scattering system that could be manipulated and art directed while creating an interesting chaos throughout the scene.
In doing this, it’s really beneficial to come up with a hierarchy of growth that can be repeated across the scene and helps define the visual pattern of the elements.

In our scene we defined some elements that would act as the main surfaces, while the rest would simply grow on top and dress them.
These elements were the terrain, rocks and trees.

The first element is the terrain, which was created using Gaea’s powerful node-based system and exported as geometry. The terrain was then scattered with rocks and trees using Forest Pack for a fast random distribution, keeping a low-poly version of those elements that would later serve as the scatter base for all the vegetation.
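As a rough illustration of what a scatter tool does under the hood, here’s a minimal Python sketch of area-weighted scattering over a mesh. It’s a conceptual example only, not Forest Pack’s or tyFlow’s actual API, and the tiny quad “terrain” and point count are made up for the demo.

```python
import random

def triangle_area(a, b, c):
    """Area of a 3D triangle via the cross product."""
    ab = [b[i] - a[i] for i in range(3)]
    ac = [c[i] - a[i] for i in range(3)]
    cross = [ab[1] * ac[2] - ab[2] * ac[1],
             ab[2] * ac[0] - ab[0] * ac[2],
             ab[0] * ac[1] - ab[1] * ac[0]]
    return 0.5 * sum(v * v for v in cross) ** 0.5

def random_point_in_triangle(a, b, c):
    """Uniformly sample a point inside a triangle (barycentric coordinates)."""
    u, v = random.random(), random.random()
    if u + v > 1.0:            # fold the sample back into the triangle
        u, v = 1.0 - u, 1.0 - v
    return tuple(a[i] + u * (b[i] - a[i]) + v * (c[i] - a[i]) for i in range(3))

def scatter_on_mesh(triangles, count):
    """Distribute `count` points evenly over a list of triangles."""
    areas = [triangle_area(*t) for t in triangles]
    picked = random.choices(triangles, weights=areas, k=count)
    return [random_point_in_triangle(*t) for t in picked]

# Two triangles standing in for a terrain patch, scattered with 200 points.
terrain = [((0, 0, 0), (10, 0, 0), (10, 10, 1)),
           ((0, 0, 0), (10, 10, 1), (0, 10, 1))]
points = scatter_on_mesh(terrain, 200)
```

Picking triangles with probability proportional to their area keeps the distribution even regardless of how the surface is triangulated, which is also why a low-poly proxy of each rock or tree is enough to act as a scatter base.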

The base layout is quite simple.


Trees and rocks were sculpted in ZBrush and textured in Substance Painter, following a pretty standard approach for this kind of object, always keeping the gnarly dwarf oak trees of Wistmans Wood as the goal.

Only five different variations of rocks and trees were needed to create enough visual noise in the scene. Some smart materials, such as moss and tree bark, were developed in Substance so they could be easily instanced across all elements. The vegetation in the scene is a mix of Megascans assets plus some additional ones built from texture atlases from textures.com mapped onto simple planes. Every final tree is then converted to a VRayProxy and placed in the scene.

[All the trees are sculpted in ZBrush and then dressed with vines, plants and leaves using ZBrush FiberMesh and tyFlow. The plugin “Ultimate Painter” was used to place some of the objects manually.]


The real fun starts with the infinite possibilities that scattering brings to the table. Forest Pack was used on the ground, from small grass to bigger plants, using textures to drive the distribution and the powerful Forest Material to colourise those assets.

The rest of the scattering was handled by tyFlow in order to test its capabilities. Although it is mainly a tool for FX artists, I was blown away by the possibilities it offers environment artists too. I have always been a fan of using Particle Flow in Max for certain tasks, and tyFlow has the amazing ability to scatter a huge number of points onto any surface, giving artists a lot of freedom in what to do with those points.

I was able to create a moss system that just works on every asset in the scene and can be reused in future projects. Particles are born on objects using a Position Object operator, which has now been improved to take into account object normals, material IDs and textures to drive the density of those points. Points are then converted to moss using a Shape operator with the out-of-the-box “grass clumps” 3D template present in the Shape node.
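To make the idea of density-driven placement concrete, here’s a small, purely illustrative Python sketch of the kind of rejection test such an operator performs. It is not tyFlow’s API; the candidate points, normals and texture samples below are made-up placeholders.

```python
import random

def keep_point(density, normal, up=(0.0, 0.0, 1.0), slope_bias=1.0):
    """Rejection test: `density` is a 0-1 texture sample at the candidate
    point and `normal` its surface normal; upward-facing spots keep more."""
    facing_up = max(0.0, sum(n * u for n, u in zip(normal, up)))   # 0..1
    return random.random() < density * (facing_up ** slope_bias)

# Hypothetical candidates: (position, surface normal, density-texture sample)
candidates = [((1.0, 2.0, 0.3), (0.0, 0.1, 0.99), 0.8),   # mossy top of a rock
              ((4.0, 1.0, 2.1), (0.9, 0.0, 0.43), 0.2)]   # steep, drier side

moss_points = [pos for pos, nrm, dens in candidates if keep_point(dens, nrm)]
```

Points that fail the test are simply discarded, so flatter, upward-facing areas with a bright density texture end up carrying most of the moss.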

[Particles are scattered on the objects using the Z-axis to drive their placement. Each particle is converted to a grass clump at render time.]


This simple setup was all I needed to cover literally the entire scene in moss, with a Display node set to “sprite” keeping the viewport extremely fast.
tyFlow was used to scatter all the moss plus many other smaller items, like twigs on the rocks, and it proved to be a trusty helper when it comes to scattering and art directing a huge number of points in the scene without ever leaving Max.

I was able to texture my moss using different custom V-Ray materials assigned to particle ID groups. A Mesh operator then told V-Ray to treat each strand of moss as a V-Ray instance.
tyFlow is fully compatible with V-Ray and is also able to output V-Ray proxies. Final lighting and fog were handled in V-Ray.

The final scene is quite light considering the amount of detail involved. Different colours were used for the different tyFlow systems.

Scattering and dealing with huge numbers of points is vital in 3D environment production. From creating garbage on the ground and piling up objects in a natural fashion, to parametrically dressing a shelf full of items without having to do it manually, to creating more complex setups with destruction in order to model decaying structures, the possibilities are endless.

I’m really excited about tyFlow and the power it brings to 3ds Max.

Here’s a little breakdown showing different stages of growth.

We have lots of top tips in our Insights section, so take a look!

 

What can we expect from the next generation of consoles?

We are fast approaching the end of the current generation of consoles. Next year, we will most likely see the gaming landscape move into a new era. Both Microsoft and Sony will reveal and potentially release their newest offerings, and with Google releasing Stadia, the next generation could be less of a ‘war of the consoles’ and more a ‘battle of the streaming services’.

We don’t know what the future of gaming will look like but we can make some assumptions based on the trajectory of the market at the moment. So let’s don our hypothetical hats and delve into the potential future of the games industry.

 

Console wars

When it comes to the next generation of consoles, we’ll no doubt see the same game of Top Trumps we’ve seen before. Each will claim to have bleeding-edge graphics, unbeatable processing power, and tantalising frame rates. There are plenty of buzzwords already floating around in relation to the new consoles – ray tracing and SSD drives were particular highlights of Microsoft’s E3 wink-and-a-nod reveal of what they are currently calling ‘Project Scarlett’.

But what will the games be like? We’re already seeing a shift towards a ‘games as a service’ model, with ongoing support planned for future releases like Square-Enix’s ‘Avengers’ game. Coming so late in the current cycle’s life, we may well see it become a cross-gen release. Will the next generation have fewer sequels, instead choosing to focus on advancing just one game? Or will a game continue to be relevant up until the point its sequel releases, meaning players have no downtime from said game?

It will also be interesting to see where VR, which was at one point all the rage but has since taken a backseat, fits into the landscape. PlayStation hasn’t said much about their VR unit in recent months, though that may only be a feeling exacerbated by their no-show at this year’s E3. And of course, will consoles even stay relevant?

 

Enter Google

In Google’s own words, “the future of gaming is not a box; it’s a place”. While we have yet to see the long-term effects of Google’s efforts, you can’t deny they have the power behind them. What separates them from previous attempts to bring game streaming to the masses is their sheer size. The future of gaming could move from Microsoft vs Sony to consoles vs streaming.

How they fight that remains to be seen. Microsoft is planning their own streaming service; Sony already has one in the form of PlayStation Now. The alternative might be a ‘games-on-demand’ sort of service. Microsoft has Xbox Game Pass and if their E3 presentation is anything to go by, it will play an integral role in their future plans. They aren’t alone; EA has EA Access, Ubisoft will have Uplay+, and PlayStation Now allows you to download PS4 games to your console for offline play.

It all depends on whether Stadia can be truly stable. Its biggest mountain to climb will be delivering a seamless, lag-free experience. If it can’t accomplish that for the majority of people, they may well stick with something that lets them avoid that issue. Stadia also doesn’t have the Netflix-esque subscription model other services have. In this age of binging, could a service that delivers hundreds of games at the touch of a button win out?

 

The future is bright

Regardless of which way the games industry goes, it’s good news for the consumer. Having multiple options is never a bad thing.

Whatever happens, there will always be new games. Traditional releases, games as a service, free to play – there’s something for everyone. What won’t change is the need for a quality trailer. With more options comes more competition, so you need to think now about how you will separate yourself from the pack. A quality trailer is your way of making your game known to the masses and could be the big difference between a successful launch and a damp squib.

At Realtime, we have worked with many great developers and helped them deliver a quality trailer befitting of their game. If you have an upcoming project and want to discuss it, get in touch with me at [email protected].

The art of recreating a car in CGI

Cars are works of art. From the body lines to the composition of the headlights, every section has been meticulously crafted and thought through to a level Picasso would be proud of. But this isn’t a one-off artwork; these are marketable products, open to being altered by a customer. So no matter what, you need to ensure the designer’s vision is still represented accurately.

If you work with an external company to develop CGI assets – be they for adverts or a configurator – you want someone who can represent that vision. Anyone can take the CAD data to recreate the car, but if you want the feel to be correct, you need to recreate the passion of the designer.

 

The bigger picture

That CAD data is the start of the journey. With it, a CG production company can accurately recreate every minute detail.

Understanding how best to represent a car goes beyond its form. It is about understanding the car, the brand, how best to show the lines, the details, the blood, sweat and tears that have gone into creating the beauty that stands before you. What separates the good from the great is their dedication to realising the why of the car.

The audience. The market. The picture. To deliver the best, they have to become an extension of your identity. In the case of the configurator, they might have to integrate with your wider system, capable of being a sales platform that can easily translate the buyer’s preferences in a way that still instils the emotional connection. They aren’t just modelling the car; they are digitally recreating the entire process, down to the individual stitch.

 

The smaller details

And you know what they say about that: the devil’s in the detail. Using just the CAD data would give you a pristine CG model. That isn’t necessarily a good thing. Part of creating the best CG assets is making something that looks real. During the buying process, you don’t want consumers to spend their time questioning whether or not what they are seeing is real.

Translating the car into a CG model means making it look real in the most literal sense. It means recreating the exact stitching on the seats, to the point where you can see the leather stretch. It means taking the texture of the fabric and materials and demonstrating the feel of it. Hours of craftsmanship go into the real-world models, sometimes in areas you might never see, such as the texture on the back of the paddles affixed to the steering wheel. Let’s use this medium to enlighten the customer on exactly what they are about to receive.

The attention to detail should carry over to every area. For instance, the paint shouldn’t just show that it has metallic flakes in it that give it that lustrous shine. In reality, the flakes can be of varying sizes and shapes, influenced to lie a particular way. The top coat can be candied to enhance the lustrousness, all of which is more than just a pretty wet line. The realisation of these elements goes towards turning some metal into that aforementioned work of art.

Done right, by the end you’ll have more CG assets than a movie has frames. The entire process is reminiscent of the attention to detail you’d find in a multi-million dollar Hollywood production.

All of this can take time, though nowhere close to the manufacture of the real thing. But you need someone who can work with your schedule to deliver according to your roadmap. Whether you’re aiming for a car launch or a motor show, a brochure or a configurator, these CG assets need to play their part as an important cog in your marketing and sales campaign. Nothing can go wrong. And with the right people, it won’t.

At RealtimeUK, we know that recreating a car in CGI takes more than polygons. We have a history of delving into our clients’ wants and needs. We are an extension of your company. If you would like to discuss your upcoming project, get in touch with me at [email protected].

Making a CGI trailer for your video game

So you are developing a game. Years of passion are coming to fruition. Your heart and soul – your vulnerability – laid out for the world to see. The time to show your hard work to the general public draws near. It’s a moment you both dread and gleefully anticipate. It’s time for the marketing to begin as you only get one chance to impress.

Like anybody else, you want your game to sit on a pedestal for the world to see. But in a market rife with so much competition, it can be so easy to fade into obscurity. Kickstarting your campaign with a CGI trailer can be an ideal way to attract the attention your game deserves. But putting trust in someone else to deliver on your vision is a big leap. How does that process work?

 

The start of the journey

There is no paint-by-numbers roadmap that works for everyone. The first thing to understand is that the journey is different for every project. And that’s a good thing.

Producing a high-quality trailer is only half the battle. What separates the good from the bad is the level of dedication to accurately representing your IP and key USPs. This only comes from a company that cares about working collaboratively with you; as part of you.

You need a studio that maintains a constant line of contact. Whether this is through in-studio meetings or more convenient online communication, what matters is that they listen to you. Worried you don’t have your own in-house creative available? That shouldn’t get in the way; a good production company should mould to your situation, helping with any script development and suggesting creative solutions that won’t blow the budget.

The initial discussions should help outline the direction of the CG trailer. Working with you, the production company should pin down which characters or assets you’d like to feature in the trailer that can get across the USPs and distinctive brand of your game. If you have these ready to go then great. If not, then a good partner should be able to make these assets in-house, carefully updating you with their progress as the pre-production process begins.

 

Producing brilliance

The length of time a production may take will vary, depending on the scope of the piece. A typical pre-rendered trailer can often take several months, so planning on your side will be an essential element to the success of the piece. Even with this in mind, it is crucial to keep the studio up-to-date with your plans, allowing enough time for the studio to produce the trailer and apply the specialist resources to accommodate the project. 

Also, be wary of when you want to enlist their service. In the run-up to any major industry event, such as E3, many studios will be fully booked up. With this in mind, you should look as far down the road ahead as possible to avoid disappointment. What matters most is that the final trailer is a creative testament to your game and an open line of communication can go a long way to help with this.

 

Collaboration

A consistent and collaborative attitude to communication throughout production will inevitably help you arrive at a CG trailer that all stakeholders are happy with. So having a permanent point of contact within the production company is key. The studio’s Head of Production should be your day-to-day contact, keeping you informed of any changes and responding to feedback.

Over the course of the production cycle, you should be privy to many milestones, initially beginning at the pre-production stage with concept art. Storyboards and rudimentary animatics are intended to give you insight into the direction of the trailer. These are created with the intention of providing your team with an opportunity for feedback. Over time, you should see the final product start to form, as the production company sculpts a work of art before your eyes.

If you have stayed communicative throughout, you should have a final product that trumps every expectation. A cinematic tour de force sure to capture the attention of any audience. Something that encapsulates your game with ease.

It seems like an almost impossible task; how can you find someone who can deliver on your expectations? How can a company ever truly understand your product? A specialist CG studio that understands games can – if you allow them.

RealtimeUK is that company. We aren’t just a production company; we are an extension of your studio. We work intimately with our clients from the start, working with a focus on open communication. If you would like to discuss your next project, get in touch with me at [email protected].

TV VFX for dummies

Welcome back to part two of our ‘VFX for dummies’ series! We know you could be a veteran of the industry but still be unsure about this rapidly changing field of TV production. What CGI and VFX are might not be immediately obvious to you. So if you’ve ever tried to research the topic to gain a clearer understanding, you might have left more confused than before.

Like any highly technical specialisation, VFX has its own dictionary of industry terms that can – on the surface – appear confusing. But we’re here to demystify some of the most common terms and break them down for you.

TV VFX

8-bit/16-bit

This refers to the bit depth of your footage, i.e. how much colour information is stored in your imagery. The higher the bit depth, the more colours can be stored and the smoother the gradations between them. This term comes up often when discussing concepts like ultra-high definition (UHD) or high dynamic range (HDR).
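To put some quick numbers on that (a back-of-the-envelope Python sketch, nothing more):

```python
# Levels per channel double with every extra bit; colours are the
# combinations across the red, green and blue channels.
for bits in (8, 10, 16):
    levels = 2 ** bits
    colours = levels ** 3
    print(f"{bits}-bit: {levels:,} levels per channel, {colours:,} possible colours")

# 8-bit  ->    256 levels per channel, ~16.7 million colours
# 16-bit -> 65,536 levels per channel, ~2.8 x 10^14 colours
```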

 

Assets

These are your 3D models – what you might think of when you think ‘VFX’. They can be as large or small, or as noticeable or inconsequential as you want. It can be anything from a plane to a three-legged war machine, a tree to an animal, or a box to an even smaller box.

 

CGI

Computer-generated imagery. As the name suggests, these are the visual elements of your production that are created on a computer. Often used to refer specifically to 3D Computer Animation or as another term for Visual Effects or VFX. CGI and VFX are not the same though. CGI – and its integration – can be considered part of the VFX process.

 

Compositing

The combination of at least two source images to create a new integrated image. Compositing happens when you put all your different ‘elements’ together – your 3D assets, your backgrounds, your particle effects, and your actual on-set footage.
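At its core, most compositing boils down to the classic “over” operation. Here’s a minimal Python sketch of it for a single pixel, assuming a straight (non-premultiplied) alpha channel:

```python
def over(fg_rgb, fg_alpha, bg_rgb):
    """Composite one foreground pixel over one background pixel."""
    return tuple(fg_alpha * f + (1.0 - fg_alpha) * b
                 for f, b in zip(fg_rgb, bg_rgb))

# A 50%-transparent red element over a mid-grey background:
print(over((1.0, 0.0, 0.0), 0.5, (0.5, 0.5, 0.5)))   # (0.75, 0.25, 0.25)
```

Real compositing packages apply this per pixel and per layer, alongside colour management, premultiplication and filtering, but the blend itself really is this simple.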

 

EXR

See OpenEXR.

 

High dynamic range (HDR)

This is a common term used in relation to next-generation TVs, which can deal with a larger than normal dynamic range. Dynamic range relates to the brightness values in a scene or image, from brightest to darkest, often expressed as a ratio. In a digital image, it also relates to the total number of different colours in the image. Streaming content providers like Netflix and Amazon have spearheaded the industry’s drive to deliver films and series to HDR standards.

 

Keying

Keying is the process of algorithmically extracting an object from its background and combining it with a different background. To help with this process, productions use a ‘green screen’ or ‘blue screen’ to shoot against. This is used so, during the keying process, you can cut out the green or blue colour and insert your own background digitally. Ideal in situations where a location does not exist so needs creating (such as an alien planet) or where it is too dangerous to have the actor in that situation (such as in a special effects explosion).
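As a heavily simplified illustration of what a keyer does, the Python sketch below flags pixels where green clearly dominates red and blue and marks them as transparent. Real keyers handle spill, soft edges and motion blur far more gracefully; the pixel values and threshold here are invented for the example.

```python
def green_screen_matte(pixel, threshold=0.2):
    """Return 0.0 (transparent) for screen-green pixels, 1.0 (opaque) otherwise."""
    r, g, b = pixel
    is_screen_green = g > r + threshold and g > b + threshold
    return 0.0 if is_screen_green else 1.0

pixels = [(0.1, 0.9, 0.1),   # screen green -> keyed out
          (0.8, 0.6, 0.5)]   # skin tone    -> kept
print([green_screen_matte(p) for p in pixels])   # [0.0, 1.0]
```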

 

Matte painting

From the small to the large, sometimes you need to create entire landscapes. You may be able to use a matte painting, which is a 2D, digitally drawn background that can be added to your scene. Before digital production became the industry standard, matte paintings were painted onto glass. Nowadays they are created using software like Photoshop, Nuke, Mari, and ZBrush.

 

OpenEXR

This is a specific image file format designed for use with High Dynamic Range (HDR) imagery.

 

Parallax

Parallax is the perceived difference in an object’s location or spatial relationship when seen from different vantage points. It is an effect that can be used to add depth to 2D shots: elements closer to the camera appear to move more than elements further away, and you can adjust focus and depth of field to reinforce that sense of distance.
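A quick back-of-the-envelope example shows why nearby elements appear to shift so much more than distant ones (illustrative Python, with made-up distances):

```python
import math

def parallax_shift_deg(distance_m, camera_move_m=1.0):
    """Apparent angular shift of an object when the camera slides sideways."""
    return math.degrees(math.atan2(camera_move_m, distance_m))

for d in (2, 10, 50):
    print(f"object {d:>2} m away shifts ~{parallax_shift_deg(d):.1f} degrees")

# object  2 m away shifts ~26.6 degrees
# object 10 m away shifts ~5.7 degrees
# object 50 m away shifts ~1.1 degrees
```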

 

Particle system / particle effects 

A 3D computer graphics technique that is used to create a large number of objects that obey well-defined behavioural rules. Useful for controlling multitudes of discrete objects, such as asteroids or flocks of birds, but also as a tool for creating natural phenomena such as fire, smoke or water. Particles are small 3D elements that add tiny details to a shot. If there’s a fire, you’ll need rising embers and smoke. If there’s rain, you’ll need small droplets.
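For the curious, here’s a bare-bones sketch of the idea in Python: each “ember” is born with a position, velocity and lifespan, gets nudged by a simple wind force every step, and is removed when it dies. It is purely illustrative; production particle systems are vastly more capable.

```python
import random

class Particle:
    def __init__(self):
        self.pos = [random.uniform(-1.0, 1.0), 0.0, random.uniform(-1.0, 1.0)]
        self.vel = [0.0, random.uniform(1.0, 2.0), 0.0]   # embers rise
        self.age = 0.0
        self.lifespan = random.uniform(2.0, 4.0)

def step(particles, dt=0.04, wind=(0.3, 0.0, 0.0)):
    """Advance every particle by one time step and cull the dead ones."""
    for p in particles:
        for i in range(3):
            p.vel[i] += wind[i] * dt      # behavioural rule: wind force
            p.pos[i] += p.vel[i] * dt     # integrate position
        p.age += dt
    return [p for p in particles if p.age < p.lifespan]

embers = [Particle() for _ in range(500)]
for _ in range(100):                      # simulate roughly four seconds
    embers = step(embers)
    embers.extend(Particle() for _ in range(5))   # keep emitting new embers
```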

 

Pipeline

A pipeline is the generic term used to describe a set of processes for achieving a certain result. It is most commonly used to describe the VFX pipeline. The VFX pipeline covers all the processes from pre-production through to post-production and delivery. It involves many things in this glossary, including previz, matte painting, and tracking. Creating a robust and efficient pipeline is a key part of developing a successful VFX company.

 

Previz (abbreviation for previsualisation)

Previz is a collaborative process that generates preliminary versions of shots or sequences, predominantly using 3D animation tools and a virtual environment. It is used by filmmakers to explore creative ideas, plan technical solutions for shooting, and help the whole production team visualise how finished 3D elements will look in the final project ahead of the final animation being completed.

 

Rotoscoping

A rotoscope was originally the name of a device patented in 1917 to aid in cel animation. It is now used as a generic term for ‘rotoing’. This is the process of cutting someone or something out of a more complex background to use them in another way. For example, you might want to put a VFX explosion behind your actors as they walk away from it towards the camera. You would need to rotoscope them out of the shot so you can place the explosion behind them but in front of the background scenery.

 

SFX

Special effects. While these aren’t visual effects, it’s worth defining how the two are different. While visual effects are digitally created assets, special effects are real effects done on set – for example, explosions or stunts. It can also include camera tricks or makeup. People often confuse SFX and VFX.

 

Texturing

When 3D models are first created, they are just blank shapes with no realistic details on them. Texturing is a process which is akin to painting the model – giving it a skin or surface.

 

Tracking

Tracking is the process of determining the movement of objects in a scene (relative to the camera) by analysing the captured footage of that scene. 2D tracking depends on tracking points in the image; these can be tracking markers placed on set or points on the objects being tracked. 3D tracking – also referred to as match moving – is the process of extracting the camera move from a live-action plate in order to replicate it in a computer-generated (CG) environment. A match move can be created by hand, whereas 3D tracking is typically done with specialist software. 3D tracking is used to recreate the movements of a camera in a digital space. So, for example, if you have a shot that pans from left to right, when you add in your 3D asset it needs to move from left to right in the same way, at the same speed, so it looks as if it was actually there.

 

VFX

VFX stands for ‘visual effects’. It is a very broad term used to describe just about anything that cannot be captured through standard photographic means.

 

Speaking of which, it’s time to end another entry in our VFX for dummies series. We hope this has helped you understand the often complex world of VFX. There’s more to come in the series so check back regularly. Next time, we’ll discuss how to plan your CGI elements to make filming your TV show that much easier.

We know VFX can be confusing to even the most experienced industry veteran. But at REALTIME, we make the process as stress-free and easy as possible. If you need CGI elements in your upcoming project, get in touch with me on [email protected] to see how we can help.