
Screen Scene © Starz

Screen Scene goes Invisible with V-Ray on Black Sails


In television VFX right now, some of the most elaborate work required is of the ‘invisible’ variety. These are the shots that you might just not notice, but have actually gone through the hands of many skilled effects artisans.

One show capitalizing on this seamless type of VFX work is the Starz series Black Sails. For season four, Dublin-based studio Screen Scene VFX was called upon to create a raft of period city shots by augmenting live action and creating photorealistic buildings and locations. In order to reach that photorealistic look, Screen Scene took advantage of Chaos Group’s V-Ray renderer, a plugin the studio has been using for more than a decade.

We asked Screen Scene visual effects supervisor Ed Bruce and senior 3D artist John O’Connell to break down their Black Sails work and discuss how V-Ray was part of the studio’s pipeline.

The amount of invisible visual effects work by Screen Scene in Black Sails is incredible – what was your brief when coming onto the show in terms of the style of VFX required?

Ed Bruce (visual effects supervisor): Ultimately the brief was the same as it always is: respect the in-camera material, be as invisible and seamless as possible, don’t let the VFX work distract from the director’s storytelling, and bring something rewarding to every shot.

Season four of Black Sails gave us a great opportunity to deliver unnoticeable visual effects, but also the scope of many hero wide establishing shots that clearly had to be VFX because of the period, the environments and locations, and the assets. These big wides really helped locate the story and were important for the audience, whilst giving our crew at SSVFX the challenge and reward that fully CG shots bring.

Can you talk about the assets that had to be built or used for your shots – what reference did you look to in crafting and look-dev’ing them? What sets or areas had been shot for real that you needed to extend or replicate in CG?

Ed Bruce: The production’s visual effects team, headed by visual effects supervisor Erik Henry and producer Terron Pratt, collected a vast array of data from the set in sunny South Africa. Production also shared assets from previous seasons from multiple vendors, especially some very high resolution ship models.

Many of Screen Scene’s shots were based around the dockland area of 1720s Philadelphia, which we built in CG and populated with ships, masts and small vessels. On a TV post-production schedule, sharing assets to avoid any additional modelling work really becomes beneficial.

Archival reference for Philadelphia

Recreating 1720s Philadelphia was challenging. As with most things before the 1900s, there aren’t any photos of the period, which made it more difficult to get good references. Philadelphia was one of the first US cities to adopt a grid-style layout, which helped reduce many of the layout decisions for the backgrounds.

Production themselves had stayed faithful to Philadelphia’s main square layout, and some of the buildings featured in the show still stand. They built the ground floor of each building on the streets that the actors walk along to the measurements of the current city. First, we were tasked with topping up all of the on-set buildings with their second stories and roofing. The art department had created SketchUp documents for their set construction plans, and while they only physically built the first level of each house, they had completed the entire building in SketchUp. This is what we used as our guide for the building top-ups.

The on-set VFX team had also taken Lidar scans of the shoot set and supplied us with mesh files to work with. We were then able to create the extensions cleanly and lay them on top of the Lidar scans. As you’d imagine, the real world is never as mathematically perfect as what we create in 3D, so we made slight adjustments to our models to line them up with the Lidar and better fit the live-action set.

The second task was creating the continuation of each street, as well as the surrounding streets and buildings, whilst populating the set with CG people and props. There are some drawings and artworks of Philadelphia from the 1700s, which were a great starting point for the period’s style of building construction, and our modelers were then able to create generic houses around those themes. A few fun things popped up in our research too. Houses in Philadelphia back then were quite often timber-framed, and since it was a new city gradually spreading out from a central docks area, there weren’t always concrete foundations to build upon. As a result some houses were raised above the ground on posts and had the ability to be moved. If you hated your neighbors you could pop your house up on a trailer and pull it with a load of horses somewhere else!

In terms of lookdev, we find CG integration far easier than fully CG shots. You’ve got rules and limitations set for you by the shoot location, with most of the reference you’ll need right there in the plate in front of you. We made a lot of use of Itoo’s RailClone, both for creating the building cladding and roofing and as an overall layout tool to create city blocks driven by splines we’d traced from historical maps of Philadelphia’s streets.
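To picture the spline-driven layout idea, here is a short, generic sketch: sample a traced street spline at regular intervals and drop pre-built facade modules along it, oriented to the local tangent. This is not RailClone’s actual API, and the module names, spacing and street coordinates below are invented for illustration.

```python
# Generic illustration of spline-driven layout: place building modules along a
# polyline traced from a historical street map. Not RailClone's API, just the idea.
import math
import random

def resample_polyline(points, spacing):
    """Return evenly spaced (position, tangent) samples along a 2D polyline."""
    samples = []
    travelled = 0.0     # arc length covered so far
    next_sample = 0.0   # arc length of the next sample to emit
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        seg_len = math.hypot(x1 - x0, y1 - y0)
        if seg_len == 0.0:
            continue
        tangent = ((x1 - x0) / seg_len, (y1 - y0) / seg_len)
        while next_sample <= travelled + seg_len:
            d = next_sample - travelled
            samples.append(((x0 + tangent[0] * d, y0 + tangent[1] * d), tangent))
            next_sample += spacing
        travelled += seg_len
    return samples

def layout_street(street_spline, modules, module_width):
    """Pick a facade module for each slot along the street, aligned to the spline."""
    placements = []
    for pos, tangent in resample_polyline(street_spline, module_width):
        angle = math.degrees(math.atan2(tangent[1], tangent[0]))
        placements.append({"module": random.choice(modules), "pos": pos, "rot": angle})
    return placements

# Hypothetical street traced from a period map, plus a tiny library of house modules.
street = [(0.0, 0.0), (40.0, 5.0), (95.0, 30.0)]
print(layout_street(street, ["timber_house_a", "timber_house_b", "brick_house_a"], 8.0))
```

In production this kind of layout would feed instanced geometry to the renderer rather than printing placements, but the principle is the same: the traced spline drives where the generic blocks go.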

For the island and fort shots we did most of our vegetation population with Itoo’s Forest plugin, which is fabulous for environment layout. The Itoo guys have worked quite closely with the Chaos Group team to take advantage of render-time instancing. For some close-up trees we used GrowFX, which gave us some gentle wind motion to match what was happening with the practical vegetation.

Can you talk about some of the lighting and rendering challenges in particular? What was it, do you think, that helped sell your CG shots and also integrate CG work into live action?

Ed Bruce: As always on TV schedules, time is our biggest challenge, especially with heavy CG renders. On a previous show we’d done large scale fully CG shots containing similar components and levels of detail. However, Black Sails had a lot more large CG extensions and fully CG shots within a single episode.

Filling out environments as complex and busy as the streets of Philadelphia leading down to its bustling docks required us to create a large variety of assets in order for the shots to become an environment that feels varied and lived in. The downside to this level of detail is that you’ve got to deal with all of the data you’ve produced, both within the 3D scene and, of course, when it comes time to render! We imagine we’re no different from any other VFX company in that we’ll try and push our resources as far as we can, and in this case we were just about squeezing the renders into the available RAM on the render nodes.

In terms of lighting and integration, the on-set VFX team did a great job of Lidar-ing the majority of the set where we had to extend buildings upwards, and we were able to use these models to drive our matchmoving, giving us very precise alignment between the on-set surfaces and our CG extensions.

Since the shoot was in South Africa, with its incredibly crisp blue skies and direct sunlight, we could use the Lidar surfaces and the on-set silver balls to match the direction and softness of the sun very easily, and have something that flowed very nicely from the CG shadows into the live-action shadows. Our 3D supervisor, Krzysztof Fendryk, spent quite a bit of time making sure that the weathering and breakup of our building textures were a close match to what was detailed practically.
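The silver-ball trick Ed mentions comes down to a simple reflection calculation: the sun appears as a bright highlight on the chrome ball, and reflecting the view direction about the sphere normal at that highlight gives the light direction. A minimal sketch of that idea, assuming an orthographic camera looking down -Z; the pixel coordinates and ball radius below are made-up values, not measurements from the show.

```python
# Minimal sketch: estimate the sun direction from the brightest point on a chrome ball.
# Assumes an orthographic view straight down -Z; for a distant sun this approximation
# is usually close enough, but a real pipeline would account for the actual camera.
import numpy as np

def sun_direction_from_mirror_ball(highlight_xy, ball_center_xy, ball_radius_px):
    """highlight_xy: pixel position of the sun highlight on the ball (x right, y up)."""
    # Surface normal of the sphere at the highlight pixel, in camera space.
    nx = (highlight_xy[0] - ball_center_xy[0]) / ball_radius_px
    ny = (highlight_xy[1] - ball_center_xy[1]) / ball_radius_px
    nz = np.sqrt(max(0.0, 1.0 - nx * nx - ny * ny))
    n = np.array([nx, ny, nz])

    # View vector from the surface point towards the camera (orthographic => +Z).
    e = np.array([0.0, 0.0, 1.0])

    # Mirror reflection: the light direction is the view vector reflected about the normal.
    light = 2.0 * np.dot(n, e) * n - e
    return light / np.linalg.norm(light)

# Hypothetical values read off a plate: ball centred at (512, 512), 200 px radius,
# highlight spotted at (580, 630).
print(sun_direction_from_mirror_ball((580, 630), (512, 512), 200))
```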

The compositors played a big part in the subtler, finer points of matching lens artefacts and overall levels, taking advantage of the Macbeth colour charts and lens distortion grids shot on set.
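One common way a shot Macbeth chart is used for level matching is to solve for a simple colour matrix that maps the CG patch colours onto the ones sampled from the plate. A minimal sketch of that least-squares idea; the patch values below are placeholders, not data from the show.

```python
# Minimal sketch: fit a 3x3 colour matrix that maps CG chart patches to plate chart patches.
# A real chart has 24 squares; the sample values here are made-up placeholders.
import numpy as np

def fit_colour_matrix(cg_patches, plate_patches):
    """Least-squares 3x3 matrix M so that cg @ M ~= plate. Inputs are (N, 3) RGB arrays."""
    m, _, _, _ = np.linalg.lstsq(cg_patches, plate_patches, rcond=None)
    return m

# Placeholder chart samples (one row per chart square).
cg = np.array([[0.18, 0.17, 0.16],
               [0.80, 0.78, 0.75],
               [0.35, 0.12, 0.10],
               [0.10, 0.30, 0.12]])
plate = np.array([[0.17, 0.17, 0.18],
                  [0.76, 0.77, 0.78],
                  [0.33, 0.12, 0.11],
                  [0.09, 0.29, 0.13]])

M = fit_colour_matrix(cg, plate)
corrected = cg @ M   # apply the same matrix to the CG render to sit it into the plate
print(M)
```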

What were some of the advantages of using V-Ray on this show?

John O’Connell (senior 3D artist): We’ve been using V-Ray for around 12 years now, so our artists have a lot of familiarity with it. V-Ray doesn’t have many weaknesses, as it’s been used for most aspects of 3D. So, while some renderers might have particular strengths, V-Ray is able to cover nearly anything you’d ask of it very well, without too much fuss. The frame buffer and render pass options give you a lot of feedback, and we find it quite easy to isolate aspects of the render when we want to tweak one specific thing. The speed improvements in the V-Ray mesh format meant we could keep our working files quite agile and not get heavily penalised when it came to rendering shots out. The compositors like the number of matte, light and other utility passes we can throw over to them with little fuss.
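Isolating one aspect of the render works because a beauty pass can be approximately rebuilt by summing the light-contribution elements, which lets a compositor grade a single component without touching the rest. A rough sketch of that back-to-beauty idea, assuming the passes have already been loaded as float image arrays; the element names follow common V-Ray conventions but vary from setup to setup.

```python
# Rough sketch of "back to beauty": the beauty is approximately the sum of the
# light-contribution render elements, so any one element can be graded in isolation.
import numpy as np

def rebuild_beauty(elements):
    """Sum whichever contribution passes are present. elements: dict name -> float RGB array."""
    contribution_names = ["lighting", "gi", "reflection", "refraction",
                          "specular", "sss", "self_illumination"]
    present = [elements[name] for name in contribution_names if name in elements]
    return np.sum(present, axis=0)

# Toy 2x2 "images" standing in for loaded EXR passes.
passes = {
    "lighting":   np.full((2, 2, 3), 0.30),
    "gi":         np.full((2, 2, 3), 0.10),
    "reflection": np.full((2, 2, 3), 0.05),
    "specular":   np.full((2, 2, 3), 0.02),
}

# Example tweak: warm up just the GI contribution, then rebuild the beauty.
passes["gi"] = passes["gi"] * np.array([1.1, 1.0, 0.9])
print(rebuild_beauty(passes))
```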

It’s a great-performing renderer and it’s got pretty much everything you’d need out of the box. We are always interested to see where the development team are innovating. We’re especially interested in their porting of the automatic texture mip-mapping from the GPU renderer – with the jobs we’re doing, we always want to try and add in more “stuff” to make our scenes richer, so if the Chaos team can find ways for us to do this on the same machines, then that’s very much appreciated!

Overall there are very few complaints about the software at SSVFX. It’s very well proven, and if we ever run into anything specific, the support team is fabulous. As we keep trying to push ever more ambitious shots through our facility, having a renderer that continues to develop ways to solve and complete these tasks is vitally important.

Can you talk about how V-Ray is used in general in SSVFX’s pipeline, and perhaps on other projects? Did you use any other Chaos Group plugins or products on the show?

John O’Connell: Screen Scene has always been a 3ds Max-based house. We started off in the commercials market and the native Irish market. In the past that body of work wasn’t big enough to warrant large teams or R&D departments, so it was very handy for us to take advantage of the agility of Max and its wealth of off-the-shelf plugins. We always get a variety of requests from clients, which can cover anything they see on TV or in cinema, so we wouldn’t have the resources, budgets or speed to build software to meet each project’s needs. It’s great to have a wealth of small developers making relatively inexpensive, high-quality plugins to meet the market’s needs or fill in gaps in the base software’s capabilities.

In the early 2000s there was a bit of a renderer battle going on between Brazil, FinalRender, Arnold (we had an early beta for Max before it was sold to Sony for development) and of course V-Ray. We had tried the lot of them and they all had their good points, but we settled on V-Ray for our first HD job as it had a really solid implementation of all the basics – good geometry handling, high-quality anti-aliasing, fast raytracing and 3D motion blur. It was the first renderer to do render-time displacement too, which was great for a character job we were doing at the time.

All the other major Dublin companies at the time were based around Maya or Softimage, which meant their renderers were quite far behind. We were getting far nicer results purely because we were able to use GI from a very early point without murdering ourselves with render times, and having the light cache to fill out the brightness of an interior was a godsend. V-Ray was easy to get to grips with for the non-technical artists and gave great results right away. It’s been our renderer ever since.

When we set up a dedicated VFX department for long-form TV and film in 2010, it was great to have a solid EXR implementation, render passes for free, and all the various other image-quality features it just kept on delivering. Since it’s been a faster raytracer and better GI solution than most renderers for years, it’s been great that other VFX companies have picked it up and requested all of the other features you’d need in production.

In terms of the next wave of utilising V-Ray, we’ve yet to implement it within Nuke, but it’s very interesting to look at the “scene assembly” approach, where everything is baked down into dumb caches and drawn together in a content management application to be fed to V-Ray.
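The scene-assembly approach John describes can be pictured as a lightweight manifest that does nothing but point at baked caches and record transforms and material assignments, leaving the heavy geometry on disk until render time. A hypothetical sketch of that idea; the file paths, field names and asset names below are all invented for illustration.

```python
# Hypothetical sketch of a scene-assembly manifest: the assembly is plain data that
# references baked geometry caches, transforms and material assignments, and whatever
# application feeds V-Ray resolves it at render time. All names here are invented.
import json
from dataclasses import dataclass, field, asdict

@dataclass
class CachedAsset:
    name: str
    cache_path: str                 # e.g. a baked .vrmesh or Alembic file on disk
    material: str                   # name of a pre-built material to bind
    transform: list = field(default_factory=lambda: [0.0, 0.0, 0.0])  # translation only, for brevity

@dataclass
class Assembly:
    shot: str
    assets: list

assembly = Assembly(
    shot="phila_docks_wide_010",
    assets=[
        CachedAsset("dock_warehouse_a", "/caches/dock_warehouse_a.vrmesh", "weathered_timber", [12.0, 0.0, -4.0]),
        CachedAsset("merchant_ship_01", "/caches/merchant_ship_01.vrmesh", "ship_hull", [30.0, 0.0, 55.0]),
    ],
)

# Because the manifest is plain data, it can be versioned, diffed and handed to
# whichever front end drives the renderer, without opening the heavy source scenes.
print(json.dumps(asdict(assembly), indent=2))
```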

We have been using Phoenix recently; the infinite ocean texture is terribly handy for a lot of our wide shots where we just need moving water with a matte-painted opposite shore reflected in it. We’re using it heavily on a current project for a tonne of blood and gore too; it’s a very fast simulator. We completed a show with a lot of stormy ocean setups a few years ago, which was a very challenging process, so we’re looking forward to when the Phoenix dev team have some time to blend a texture-driven water tank into an infinite ocean for large rolling wave shots.

What were one or two of the most challenging shots to pull off in your work for Black Sails?

Ed Bruce: On the live-action side there were a few difficult shots that were extremely long. We were dealing with getting very accurate registration of our building top-ups; grafting tree branches onto trunks that had been purposely trimmed back heavily to allow easier keying for the street extensions; and the usual roto fun of a heavily populated street with motion-blurred people criss-crossing everywhere, which in the most challenging cases was shot handheld.

With long shots, technical problems can be difficult to spot until you see a full render. You think you’ve got it all perfect, and then you spot one little misalignment in the extension geometry or a render glitch. When that happens it’s back through another iteration and render. On the fully CG side it wasn’t bad; the main issue was the memory overhead. The scenes we generated were pretty close to the limits of our render nodes’ RAM capacity.

John O’Connell: We have a good workflow for generating the scenes, laying everything out fast, which gives us things to look at quite quickly. A big thanks must go to Paul Roberts at Itoo for helping us with a very elegant layout solution, using RailClone to make generic city blocks very efficiently.

We started the scenes off in as realistic a fashion as possible, again using the proper proportions of Philadelphia and the Delaware River from maps and references. It was also great to have Erik Henry pop in for a few days, especially for setting up the camera angles for the fully CG shots. Erik knew the client’s vision and desires, which helped reduce the design process and the version count for each shot.

How did we pull them off? SSVFX has always had great, talented artists working on our shows and it’s always good to see their craft materialize in the finished product. The visual success of season four is a collective effort from the production team through to our contribution. We’re looking forward to working again with Starz, Erik Henry and Terron Pratt and their team.

Original story by Ian Failes. Republished with permission from AV3 Software. Check out their blog.

About the author

Henry Winchester

Before becoming Chaos' content marketing manager, Henry contributed to magazines and websites including PC Gamer, Stuff, T3, ImagineFX, Creative Bloq, TechRadar, and many more. Henry loves movies, cycling, and outrageously expensive coffee.

Originally published: October 9, 2017.

