
tyFlow Test

Our hugely talented Artist Danilo Lombardo has been testing some new tools that have caught our attention, and tyFlow in particular produced an amazing result.

Lucky for us he’s done a lovely little write-up… take it away Danilo!

This scene started as an experiment, with the mossy forest of Wistmans Wood, in Devon, England, as our main reference.
A scene like this relies on a few small but crucial fundamentals.

The main objective was to create a scattering system that could be manipulated and art-directed while creating interesting chaos throughout the scene.
In doing this, it’s really beneficial to come up with a hierarchy of growth that can be repeated across the scene, which helps define the visual pattern for the elements.

In our scene we defined a few elements that would act as the main surfaces, while the rest would simply grow on top and dress them.
These elements were the terrain, rocks and trees.

The first area is the terrain, which was built using Gaea’s powerful node-based system and then exported as geometry. The terrain was then scattered with rocks and trees using Forest Pack for a fast random distribution, keeping a low-poly version of those elements that would later serve as the scatter base for all the vegetation.
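
This isn’t the actual Gaea or Forest Pack setup, but as a rough illustration of the idea – random transforms dropped onto a terrain, ready for low-poly proxies to be instanced onto – here is a minimal Python sketch. The heightfield function, ranges and counts are assumptions purely for illustration:

```python
import math
import random

def scatter_on_terrain(height_at, extent, count, seed=42):
    """Scatter `count` random transforms across a square terrain.

    `height_at(x, y)` is assumed to sample the exported terrain height;
    each result is a (position, yaw, scale) tuple that a low-poly proxy
    (rock or tree) could later be instanced onto.
    """
    rng = random.Random(seed)
    transforms = []
    for _ in range(count):
        x = rng.uniform(0.0, extent)
        y = rng.uniform(0.0, extent)
        z = height_at(x, y)                    # sit the proxy on the ground
        yaw = rng.uniform(0.0, 2.0 * math.pi)  # random rotation for variety
        scale = rng.uniform(0.8, 1.2)          # slight size variation
        transforms.append(((x, y, z), yaw, scale))
    return transforms

# A fake rolling terrain standing in for the exported geometry.
def fake_terrain(x, y):
    return 2.0 * math.sin(x * 0.1) * math.cos(y * 0.1)

rocks = scatter_on_terrain(fake_terrain, extent=100.0, count=500)
print(len(rocks), rocks[0])
```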

The base layout is quite simple.


Trees and rocks were sculpted in ZBrush and textured in Substance Painter, following a pretty standard approach for this kind of object, always keeping the gnarly dwarf oak trees from Wistmans Wood as the goal.

Only five different variations of rocks and trees were needed to create enough visual noise in the scene. Some smart materials, such as moss and tree bark, were developed in Substance so they could easily be instanced on all elements. The vegetation in the scene is a mix of Megascans assets plus some additional ones made by mapping texture atlases from textures.com onto simple planes. Every final tree was then converted to a V-Ray proxy and placed in the scene.

[All the trees are sculpted in ZBrush and then dressed with vines, plants and leaves using ZBrush FiberMesh and tyFlow. The plugin “Ultimate Painter” was used to place some of the objects manually.]


The real fun starts with the infinite possibilities that scattering brings to the table. Forest Pack was used on the ground, from small grass to bigger plants, with textures driving the distribution and the powerful Forest material colourising those assets.

The rest of the scattering was handled by tyFlow in order to test its capabilities. Although it is mainly a tool for FX artists, I was blown away by what it offers environment artists too. I have always been a fan of using Particle Flow in Max for certain tasks, and tyFlow has the amazing ability to scatter a crazy number of points onto any surface, giving artists a lot of freedom in what to do with those points.

I’ve been able to create a moss system that just works on every asset in the scene and can be reused in future projects. Particles are born on objects using a Position Object operator, which has now been improved to take into account object normals, material IDs and textures to drive the density of those points. Points are then converted to moss using a Shape operator with the out-of-the-box “grass clumps” 3D template present in the Shape node.

[Particles are scattered on the objects using the z-axis to drive their placement. Each particle is converted to a grass clump at render time.]
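
tyFlow’s operators are node-based rather than scripted, but the underlying idea – birthing points on a surface with density driven by how far the normal points up the z-axis and by a mask texture – can be sketched in a few lines of Python. The face data structure and mask lookup below are illustrative assumptions, not tyFlow’s API:

```python
import random

def scatter_moss_points(faces, mask, target_count, seed=7, max_tries=100_000):
    """Rejection-sample moss points, favouring up-facing, masked areas.

    `faces` is a list of dicts holding a 'centre' (x, y, z), a unit
    'normal' and a 'uv' pair; `mask(u, v)` returns 0..1 from a density
    texture. The more a face points up and the brighter the mask, the
    more likely it is to receive a moss point.
    """
    rng = random.Random(seed)
    points = []
    for _ in range(max_tries):
        if len(points) >= target_count:
            break
        face = rng.choice(faces)                  # pick a candidate face
        up_facing = max(0.0, face["normal"][2])   # z-axis drives placement
        density = up_facing * mask(*face["uv"])   # texture drives density
        if rng.random() < density:                # accept or reject it
            points.append(face["centre"])
    return points
```

Each accepted point would then be handed a moss clump – the equivalent of what the grass-clump template does here – which is what lets the same setup dress rocks, trees and terrain alike.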


This simple setup was all I needed to literally cover the entire scene in moss, with a Display node set to “sprite” keeping the viewport extremely fast.
tyFlow was used to scatter all the moss plus many other smaller items, like twigs on the rocks, and it proved to be a trustworthy helper when it comes to scattering and art-directing a huge number of points in the scene without ever leaving Max.

I was able to texture my moss using different custom V-Ray materials assigned to particle ID groups. A Mesh operator then told V-Ray to treat each strand of moss as a V-Ray instance.
tyFlow is fully compatible with V-Ray and is also able to output V-Ray proxies. Final lighting and fog were handled in V-Ray.
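
The particle ID grouping works a bit like dealing particles into buckets and giving each bucket its own shader. A tiny, tool-agnostic sketch of that idea (the material names are placeholders, not the actual V-Ray setup):

```python
# Placeholder names standing in for the custom V-Ray moss materials.
MOSS_MATERIALS = ["moss_bright_green", "moss_dry_yellow", "moss_dark"]

def material_for_particle(particle_id: int) -> str:
    """Map a particle ID to one of a small set of materials.

    Cycling IDs through the list gives an even mix of moss colours
    across the scene without any manual assignment.
    """
    return MOSS_MATERIALS[particle_id % len(MOSS_MATERIALS)]

print([material_for_particle(i) for i in range(6)])
# ['moss_bright_green', 'moss_dry_yellow', 'moss_dark',
#  'moss_bright_green', 'moss_dry_yellow', 'moss_dark']
```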

The final scene is quite light considering the amount of detail involved. Different colours were used for the different tyFlow systems.

Scattering and dealing with huge numbers of points is vital in 3D environment production: from creating garbage on the ground and piling objects up in a natural fashion, to parametrically dressing a shelf full of items without having to do it manually, to building more complex setups with destruction in order to model decaying structures, the possibilities are endless.

I’m really excited about tyFlow and the power it brings to 3ds Max.

Here’s a little breakdown showing different stages of growth.

We have lots more top tips in our insights, so take a look!

 

TV VFX for dummies

Welcome back to part two of our ‘VFX for dummies’ series! You could be a veteran of the industry and still be unsure about this rapidly changing field of TV production. What CGI and VFX actually are might not be immediately obvious, and if you’ve ever tried to research the topic to gain a clearer understanding, you might have come away more confused than before.

Like any highly technical specialisation, VFX has its own dictionary of industry terms that can – on the surface – appear confusing. But we’re here to demystify some of the most common terms and break them down for you.


8-bit/16-bit

This refers to the bit depth of your footage, i.e. how much colour information is stored in your imagery: the higher the bit depth, the more distinct colour values each channel can store. The term comes up often when discussing concepts like ultra-high definition (UHD) or high dynamic range (HDR).
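
As a quick worked example of what those numbers mean for integer image formats (just the arithmetic, nothing tool-specific):

```python
# Distinct values per colour channel for a given integer bit depth,
# and the number of RGB combinations that allows.
for bits in (8, 16):
    levels = 2 ** bits        # values a single channel can hold
    colours = levels ** 3     # combinations across R, G and B
    print(f"{bits}-bit: {levels:,} levels per channel, {colours:,} colours")

# 8-bit gives 256 levels per channel (about 16.7 million RGB colours);
# 16-bit gives 65,536 levels per channel.
```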

 

Assets

These are your 3D models – what you might think of when you think ‘VFX’. They can be as large or small, or as noticeable or inconsequential, as you want: anything from a plane to a three-legged war machine, a tree to an animal, or a box to an even smaller box.

 

CGI

Computer-generated imagery. As the name suggests, these are the visual elements of your production that are created on a computer. The term is often used to refer specifically to 3D computer animation, or as another name for visual effects (VFX). CGI and VFX are not the same thing, though: CGI – and its integration – can be considered part of the VFX process.

 

Compositing

The combination of at least two source images to create a new integrated image. Compositing happens when you put all your different ‘elements’ together – your 3D assets, your backgrounds, your particle effects, and your actual on-set footage.
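
Under the hood, the most common compositing operation is the ‘over’ merge, where a foreground element with an alpha channel is laid over a background. A minimal NumPy sketch, assuming straight (unpremultiplied) RGBA and RGB images with values in 0..1:

```python
import numpy as np

def over(foreground_rgba, background_rgb):
    """Composite a straight-alpha RGBA foreground over an RGB background.

    Where alpha is 1 you see the foreground, where it is 0 you see the
    background, and in between the two are blended.
    """
    rgb = foreground_rgba[..., :3]
    alpha = foreground_rgba[..., 3:4]   # keep the channel axis for broadcasting
    return rgb * alpha + background_rgb * (1.0 - alpha)

# Tiny 1x1 example: a half-transparent red element over a blue background.
fg = np.array([[[1.0, 0.0, 0.0, 0.5]]])
bg = np.array([[[0.0, 0.0, 1.0]]])
print(over(fg, bg))   # [[[0.5 0.  0.5]]]
```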

 

EXR

See OpenEXR.

 

High dynamic range (HDR)

This is a common term used in relation to next-generation TVs, which can handle a larger-than-normal dynamic range. Dynamic range relates to the brightness values in a scene or image, from brightest to darkest, often expressed as a ratio. In a digital image, it also relates to the total number of different colours in the image. Streaming content providers like Netflix and Amazon have spearheaded the industry drive to deliver films and series to HDR standards.
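
Dynamic range is often quoted in ‘stops’, where each stop is a doubling of brightness, so a contrast ratio converts to stops with a base-2 logarithm. A tiny illustrative calculation (the example ratios are just assumptions):

```python
import math

def stops(contrast_ratio: float) -> float:
    """Convert a brightest-to-darkest contrast ratio into photographic stops."""
    return math.log2(contrast_ratio)

# e.g. a 1,000:1 display is ~10 stops; a 1,000,000:1 scene is ~20 stops.
print(round(stops(1_000), 1), round(stops(1_000_000), 1))   # 10.0 19.9
```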

 

Keying

Keying is the process of algorithmically extracting an object from its background and combining it with a different background. To help with this process, productions use a ‘green screen’ or ‘blue screen’ to shoot against. This is used so, during the keying process, you can cut out the green or blue colour and insert your own background digitally. Ideal in situations where a location does not exist so needs creating (such as an alien planet) or where it is too dangerous to have the actor in that situation (such as in a special effects explosion).
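
Real keyers are far more sophisticated, but the core idea can be sketched in a few lines: build a matte wherever green strongly dominates the other channels, then use it to swap in a new background. The threshold and test images below are arbitrary assumptions:

```python
import numpy as np

def simple_green_key(frame_rgb, background_rgb, threshold=0.15):
    """Very naive green-screen key on float RGB images in 0..1.

    A pixel is treated as 'screen' when its green channel exceeds both
    red and blue by `threshold`; those pixels are replaced with the new
    background. Real keyers also handle soft edges, spill removal, etc.
    """
    r, g, b = frame_rgb[..., 0], frame_rgb[..., 1], frame_rgb[..., 2]
    screen = (g - np.maximum(r, b)) > threshold        # boolean matte
    matte = screen[..., np.newaxis].astype(frame_rgb.dtype)
    return frame_rgb * (1.0 - matte) + background_rgb * matte

# 1x2 frame: one green-screen pixel, one skin-tone pixel.
frame = np.array([[[0.1, 0.9, 0.1], [0.8, 0.6, 0.5]]])
bg = np.zeros_like(frame)                              # keyed pixels go black
print(simple_green_key(frame, bg))
```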

 

Matte painting

From the small to the large, sometimes you need to create entire landscapes. You may be able to use a matte painting, which is a 2D, digitally drawn background that can be added to your scene. Before digital production became the industry standard, matte paintings were painted onto glass; they are now created using software like Photoshop, Nuke, Mari, and ZBrush.

 

OpenEXR

This is a specific image file format designed for use with High Dynamic Range (HDR) imagery.

 

Parallax

Parallax is the apparent difference in an object’s location or spatial relationship when seen from different vantage points. It can be used to add depth to 2D shots: by adjusting focus and depth of field, you can make certain elements appear closer to or further away from the camera.

 

Particle system / particle effects 

A 3D computer graphics technique that is used to create a large number of objects that obey well-defined behavioural rules. Useful for controlling multitudes of discrete objects, such as asteroids or flocks of birds, but also as a tool for creating natural phenomena such as fire, smoke or water. Particles are small 3D elements that add tiny details to a shot. If there’s a fire, you’ll need rising embers and smoke. If there’s rain, you’ll need small droplets.
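
As a stripped-down illustration of ‘objects obeying well-defined behavioural rules’, here is a minimal ember-style particle update in Python; the spawn ranges, gravity value and frame rate are purely illustrative assumptions:

```python
import random
from dataclasses import dataclass

@dataclass
class Particle:
    x: float      # position
    y: float
    vx: float     # velocity
    vy: float
    life: float   # remaining lifespan in seconds

def spawn_ember() -> Particle:
    """Embers start near the origin with a small random upward drift."""
    return Particle(x=random.uniform(-0.1, 0.1), y=0.0,
                    vx=random.uniform(-0.2, 0.2), vy=random.uniform(0.5, 1.5),
                    life=random.uniform(1.0, 3.0))

def step(particles, dt=1 / 24):
    """Advance every particle by one frame and cull the dead ones."""
    for p in particles:
        p.vy -= 0.3 * dt          # a little gravity pulling embers back down
        p.x += p.vx * dt
        p.y += p.vy * dt
        p.life -= dt
    return [p for p in particles if p.life > 0.0]

embers = [spawn_ember() for _ in range(100)]
for _ in range(48):               # simulate two seconds at 24 fps
    embers = step(embers)
print(len(embers), "embers still alive")
```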

 

Pipeline

A pipeline is the generic term used to describe a set of processes for achieving a certain result. It is most commonly used to describe the VFX pipeline. The VFX pipeline covers all the processes from pre-production through to post-production and delivery. It involves many things in this glossary, including previz, matte painting, and tracking. Creating a robust and efficient pipeline is a key part of developing a successful VFX company.

 

Previz (abbreviation for previsualisation)

Previz is a collaborative process that generates preliminary versions of shots or sequences, predominantly using 3D animation tools and a virtual environment. Previz is used by filmmakers to explore creative ideas, plan technical solutions for shooting, and help the whole production team visualise how finished 3D elements will look in the final project ahead of final animation being completed.

 

Rotoscoping

A rotoscope was originally the name of a device patented in 1917 to aid in cel animation. It is now used as a generic term for ‘rotoing’. This is the process of cutting someone or something out of a more complex background to use them in another way. For example, you might want to put a VFX explosion behind your actors as they walk away from it towards the camera. You would need to rotoscope them out of the shot so you can place the explosion behind them but in front of the background scenery.

 

SFX

Special effects. These aren’t visual effects, but it’s worth defining how the two differ: visual effects are digitally created, whereas special effects are real effects done on set – for example, explosions or stunts. They can also include camera tricks or makeup. People often confuse SFX and VFX.

 

Texturing

When 3D models are first created, they are just blank shapes with no realistic details on them. Texturing is a process which is akin to painting the model – giving it a skin or surface.

 

Tracking

Tracking is the process of determining the movement of objects in a scene (relative to the camera) by analysing the captured footage of that scene. 2D tracking depends on tracking points in the image – these can be tracking markers placed on set or points on the objects being tracked. 3D tracking – also referred to as match moving – is the process of extracting the camera move from a live-action plate in order to replicate it in a computer-generated (CG) environment. A match move can be created by hand, whereas 3D tracking is typically done with specialist software. 3D tracking is used to recreate the movements of a camera in a digital space. Say, for example, you have a shot that pans from left to right: when you add in your 3D asset, it needs to move from left to right in the same way, at the same speed, so it looks as if it was actually there.

 

VFX

VFX stands for ‘visual effects’. It is a very broad term describing just about anything that cannot be achieved through standard photographic capture.

 

Speaking of which, it’s time to end another entry in our VFX for dummies series. We hope this has helped you understand the often complex world of VFX. There’s more to come in the series so check back regularly. Next time, we’ll discuss how to plan your CGI elements to make filming your TV show that much easier.

We know VFX can be confusing to even the most experienced industry veteran. But at REALTIME, we make the process as stress-free and easy as possible. If you need CGI elements in your upcoming project, get in touch with me on [email protected] to see how we can help.

TV VFX for dummies: what are 3D VFX?

We are in a golden age of visual effects. CGI visuals have never been more in demand. Your expertise in the industry may lie elsewhere so you might not realise the actual breadth and depth of what a VFX supplier like us can provide.

3D VFX are more than an alien invasion or a superhero fight. VFX is a multi-faceted process, one that we will break down over the course of our ‘CGI for dummies’ series! With our three-part adaptation of War of the Worlds looming, we couldn’t think of a better time to dive into the world of 3D visual effects for TV.

 

What is VFX for TV shows?

At its most basic, visual effects (VFX) are imagery created, altered, or enhanced for a production. They accomplish that which cannot be done during live-action shooting. They aren’t to be confused with special effects (SFX), which are usually done on location during filming. Explosions are a common example, though they are increasingly being replaced by digital recreations as these are safer and cheaper.

On a large scale, they can create the impractical or impossible, such as new background environments, props such as planes, trains, or boats, or life-like animals and creatures. But they are also frequently used for small, simple fixes and to save time and money.

As we said, they aren’t limited to your big-budget action flicks. Today, you’ll be hard-pressed to find a movie, TV show, or advert that doesn’t use VFX in some way. In the coming months, we’ll be covering everything VFX. Here’s what you can expect in our upcoming series.

 

Demystifying industry terms

Like any industry, the VFX sector has its own dictionary of jargon. It can make the VFX process seem like a Gordian knot, but every term has a simple definition. In this blog, we’ll break down the ABCs of VFX in an easily digestible way. You can learn terms like keying, which is where you replace the green from a green screen with whatever background you like.

 

How to plan your CGI elements

Creating your CGI elements is one thing; planning them is a whole other process. From concept art to pre-visualisation, we’ll guide you through the process of planning the integration of your assets. You can’t expect to direct your show if you don’t know where your assets will be and when. Look out for our simple step-by-step guide in the future.

 

Asset building

This might be the first thing you think of when you think ‘VFX’ – creating 3D assets for the production. You can create anything you can think of: from a squadron of planes to a giant three-legged machine or a simple crate, whatever you need. But even this aspect has multiple parts to it. We’ll save the details for a future blog, so keep your eyes peeled.

 

Particle effects

3D modelling can cover even the tiniest details, including particles like smoke or dust. Particle effects – or particle systems – are made up of tiny particles that, when combined, create the illusion of a greater entity. They will also have their own ‘lifespan’ of sorts, so you can realistically create something like embers in a fire or droplets of rain. The smallest details can make the biggest difference, so look out for a blog on this in the future.

 

2D vs 3D

If you read 2D and thought we were talking about animated cartoons, then this blog will be for you. Because when we say 2D, we’re talking about visual effects that can be achieved in a 2D space, as opposed to a 3D one. For example, your backgrounds might be 2D matte painting backdrops placed behind your actors. While 3D VFX are great, you can achieve an awful lot more cost-effectively with 2D VFX. We’ll dive into this topic more in a future blog.

 

CGI vs practical effects

How do you know when to use practical effects or CGI? Is a practical explosion better looking than a CG one? Hopefully, we’ll be able to break down the pros and cons of both for you, so you can figure out how to save yourself time and money during production. Look out for that blog in the near future.

 

Creating realistic, seamless 3D visual effects takes time and effort. Over the coming months, we hope you’ll find a new appreciation for this art. Check back regularly for another taste of the world of VFX. We hope to see you soon!

At REALTIME, we have the staff and expertise to carry out everything we just talked about. So if you have an exciting project in the pipeline and would like to discuss your visual effects needs, please get in touch with me at [email protected].

Say hello to Joe!

We love to give you an insight into who we are at REALTIME – and we are our people – so say hello to our new Lighting Artist, Joe!

 

Tell me a little bit about yourself?

My name is Joe Worthington, I’m 31 and from Manchester. I studied Computer Visualisation and Animation at Bournemouth University which landed me an exciting post-graduate job at my local Tesco! I then moved to London and got my foot in the door of the VFX industry as render support. Through this I was fortunate enough to be moved into lighting where I worked on several exciting (and some less exciting) blockbusters. After seven and a bit years working in the south I felt the pull of home and moved back up north, eventually leading me to RealtimeUK!

 

What’s your role at REALTIME?

My primary role here will be CG lighting and look-dev but I’ll also be handling compositing duties.

 

What first sparked your interest in 3D Art?

I’d love to say I had a revelatory experience during a classic film at a young age and this has been my dream ever since, but it was nothing that poetic. I come from a fine art background but I also enjoyed coding; this, in conjunction with a love of film, led me down the path to 3D art. While it may have been a decision made with logic, I couldn’t be happier with the choice. I love working in this industry and am passionate about the work I create.

 

So, what does an average day consist of for you?

A lighter’s first job in the morning is to check the farm and see how their overnight renders fared, investigate errored frames, resubmit where needed and generally assess render times. I’ll then quickly comp up anything that is ready so it can be reviewed in dailies. Tasks throughout the day will generally be a balancing act of shot lighting, asset lookdev and compositing. Late afternoon is the time to make sure renders are prepared to run overnight, ensuring I’m pulling together all the latest and greatest assets passed on by other artists.

Oh and lots of cups of tea, especially in the morning, I need a strong brew to wake me up.

 

What’s the best thing about working here?

So far the best thing at RealtimeUK has been the people; they’re an exceptionally talented bunch. I’m always looking to improve and learn, so I’m excited to work as part of a team that will push me to be a better artist.

 

Are there any upcoming things in the industry that you think people need to keep an eye out for?

AI is a growing area for the industry. While it’s only had specific applications thus far, like facial capture, I hear it’s also being looked into as a means of denoising ray-traced renders. It will be very interesting to see where else in the pipeline it can be incorporated. In terms of lighting and rendering, game engines are producing images of increasingly impressive fidelity, and with NVIDIA making a push for ray-traced graphics on the GPU I’m curious to see how, if at all, this will affect the industry.

 

Tell me a fun fact about yourself?

I once survived having a pub fall on me. Made the news.

The road ahead is strong for diversity in digital media

In the past few decades, digital media has made tremendous progress in the pursuit of diversity. Protagonists, directors, and developers are just some of the roles that are being increasingly occupied by under-represented minorities in video games, film, and TV.

Industry-wide encouragement for more representation is enjoying success in several ways. Let’s dive into the figures and see what people are doing to increase engagement with a more diverse audience.

 

Video game participation

Studies are challenging the inaccurate ‘gamer’ stereotype that assumes players are straight, white, and male.

Stereotypes are clearly not an accurate representation of gamer demographics. Many developers have seen this as an opportunity to engage with a larger audience.

 

Successful models of diverse engagement

Diversity in video games is growing and some adopters are enjoying roaring success. Apex Legends, a free-to-play battle royale game, has taken the gaming world by storm. With relatively little marketing effort, it hit 50 million players in its first month. The developers also instilled diversity into the core fabric of the game’s characters: of the eight characters so far, two are black women, one is non-binary, one is mixed race, one is Hispanic, and another is non-heterosexual.

Other developers are creating narratives that represent the experiences of different backgrounds. Assassin’s Creed III: Liberation revolves around interacting with the world as a biracial woman, and Naughty Dog’s The Last of Us explored Ellie’s sexuality – a theme that will be expanded upon as she steps into the light as the protagonist in The Last of Us Part II.

Elsewhere, in-game character customisation gives players even more options: they don’t have to be bound to any race, gender, or even sexuality. BioWare might be the trendsetter in this regard, and relative newcomer Stardew Valley lets you marry NPCs regardless of gender.

 

Film and TV diversity

Minority-led movies and TV series are taking off across many platforms. Black Panther enjoyed record-shattering box office success: it was the biggest February opening weekend ever, the biggest non-sequel debut ever, and the top-grossing film by a black director. It broke the Hollywood illusion that actors and directors of colour generate less revenue than their white counterparts.

Before the major motion picture Crazy Rich Asians, Hollywood hadn’t released a film with a majority Asian cast for 25 years. This romantic comedy topped the box office, once again proving the power of diversity and the importance of engaging with a variety of audiences.

In terms of TV series, Netflix has done well to reflect diverse perspectives and progressive points of view. Important social and political conversations are opened when popular shows tackle the stigma around mental illness (Lady Dynamite), the prison industrial complex (Orange Is The New Black), the immigrant experience in the U.S. (Master of None), and systemic racism (Dear White People).

For the gaming, film, and TV industries, it’s clear that minority-represented or minority-led productions can thrive. The movement to actively seek out engagement from wider audiences has begun and is largely enjoying success. There’s still a long way to go, but representation and diversity in digital media have a bright future ahead.

RealtimeUK understands the importance of diverse engagement and can bring that to life in your cinematic trailers. Get in touch with me today on [email protected] to discuss your next steps.