All presentation and panel recordings from the online edition of ICE.ART 2025.II, held 1 & 2 October 2025, have now been released and are freely available via the website and the YouTube channel.
Game developer and educator Sam Martino, founder of Dogwood Gaming, delivered one of ICE.ART 2025.II’s coolest talks with Rebuilding 100-Year-Old Tracks: The Montgomery Raceways Project. His presentation explored how Unreal Engine 5, GIS height-map data and painstaking archival research are being combined to digitally reconstruct long-vanished pre-war racetracks for a historically grounded racing game. Working with small museums and local historical societies, Martino’s team has recreated sites such as the Laurel Raceway board track in Maryland, the Aquidneck Park circuit in Rhode Island, and the Second Beach course, America’s first recorded beach race. Each track rebuild begins with scans of archival stock brochures, period town maps and surviving photos, which are aligned with real-world topography to ensure geographic accuracy inside Unreal. Beyond the technical craft, Martino framed the project as a form of digital preservation, rescuing automotive heritage from oblivion as museums close and collections disperse. By transforming lost raceways into interactive worlds, he argued, game technology can safeguard mechanical history and remind new generations how ingenuity once roared on wood and sand.
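The talk focused on the research and the results rather than implementation details, but the kind of GIS-to-engine step described above can be sketched briefly: the snippet below converts a GeoTIFF elevation model into a 16-bit grayscale raster of the sort Unreal Engine’s Landscape importer accepts. The file names, the normalisation choice and the rasterio dependency are assumptions made for this illustration, not details from Martino’s pipeline.

```python
# Minimal sketch: convert a GIS height map (GeoTIFF) into a 16-bit raw
# grayscale file suitable for Unreal Engine's Landscape import.
# Paths, the normalisation and the rasterio dependency are assumptions
# made for this example; they are not taken from the talk.
import numpy as np
import rasterio  # pip install rasterio

SRC = "montgomery_area_dem.tif"   # hypothetical GeoTIFF from a GIS source
DST = "montgomery_heightmap.r16"  # 16-bit raw file for Landscape import

with rasterio.open(SRC) as dem:
    elevation = dem.read(1).astype(np.float64)  # assumed metres above sea level
    print(f"Resolution: {dem.width} x {dem.height}, pixel size: {dem.res}")

# Normalise the elevation range to the full 16-bit span so the terrain
# uses the Landscape Z-scale rather than clipping.
lo, hi = np.nanmin(elevation), np.nanmax(elevation)
normalised = (elevation - lo) / max(hi - lo, 1e-6)
height16 = (normalised * 65535.0).round().astype("<u2")  # little-endian uint16

# Unreal accepts raw 16-bit grayscale (.r16/.raw) or 16-bit PNG heightmaps.
height16.tofile(DST)
print(f"Wrote {DST}; note the real-world Z range ({lo:.1f}-{hi:.1f} m) "
      "for setting the Landscape scale.")
```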
In his talk Capturing the Magic of Automotive Viz in Unreal Engine, Nairobi-based CGI generalist and director Jay H. Patel shared an in-depth look at how he and the Lotus Tech Creative Centre produce cinematic automotive visualisations entirely in Unreal Engine. With nearly 15 years of cross-industry experience, Patel demonstrated how world-building, lighting, virtual cinematography and procedural content generation (PCG) combine to deliver high-fidelity marketing films such as Lotus Type One and Lotus Emira for Drivers. He described Unreal as the team’s “final assembly” tool, integrating assets from Houdini, 3ds Max, Blender and Substance Painter, and emphasised real-time iteration as the key advantage over offline renderers. His breakdowns included real-world techniques such as path-traced rendering for glass accuracy, motion-blur under-cranking for speed shots, and rule-based PCG setups for dynamic environments.
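Patel’s rule-based setups live inside Unreal’s node-based PCG framework rather than in scripts, but the underlying idea of placement driven by simple rules can be illustrated outside the engine. The sketch below scatters roadside props with three invented rules (nominal spacing, random jitter, and a keep-out radius around a hero car); every value in it is hypothetical rather than taken from the Lotus projects.

```python
# Conceptual sketch of rule-based procedural placement, in the spirit of
# PCG-driven set dressing: props are scattered along a road with simple
# rules (fixed spacing, random jitter, a keep-out radius around the hero
# car). All values here are illustrative, not from the Lotus projects.
import random

random.seed(42)  # deterministic scatter so the layout is reproducible

ROAD_LENGTH_M = 200.0
SPACING_M = 8.0           # rule 1: nominal distance between props
JITTER_M = 2.0            # rule 2: random offset so the line looks organic
HERO_CAR_POS_M = 100.0
KEEP_OUT_RADIUS_M = 15.0  # rule 3: leave the hero car unobstructed

props = []
distance = 0.0
while distance < ROAD_LENGTH_M:
    pos = distance + random.uniform(-JITTER_M, JITTER_M)
    if abs(pos - HERO_CAR_POS_M) > KEEP_OUT_RADIUS_M:
        side = random.choice((-1, 1))          # left or right verge
        asset = random.choice(("lamp_post", "bollard", "shrub"))
        props.append({"asset": asset, "along_road_m": round(pos, 2), "side": side})
    distance += SPACING_M

for p in props[:5]:
    print(p)
print(f"{len(props)} props placed")
```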
In his presentation Scope City – Procedural Urban Manufacturing, Faruk Heplevent, founder of The Scope, demonstrated how his team has built a fully procedural city-generation system in Houdini to meet the demands of high-end automotive visualisation. Developed over five years, Scope City automates the creation of legally safe, photoreal urban environments for car commercials: cities that feel authentic without copying real-world architecture. The tool ingests OpenStreetMap data and converts it into editable urban layouts, allowing art directors to control scale, style and composition for each campaign. Heplevent detailed the studio’s pipeline from Houdini and Substance Designer through V-Ray, Vantage and Omniverse, with ongoing work toward Unreal Engine and OpenUSD/MaterialX support. By designing “manufactured” cities with strict calibration (six pixels per centimetre), watertight geometry and procedural materials, his team ensures the assets hold up under close scrutiny and serve multiple production needs, from commercials to virtual production and synthetic-data training. At its core, Heplevent described Scope City as an evolving machine for storytelling control: a digital assembly line that lets car brands design their own cities.
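Heplevent did not show code, but the very first step he described, ingesting OpenStreetMap data and turning it into an editable layout, is easy to illustrate in isolation. The sketch below parses an OSM XML export and collects closed building footprints that a procedural system could then extrude and dress; the file name and the restriction to buildings are assumptions for the example, not part of Scope City itself.

```python
# Minimal sketch of the ingestion step described in the talk: read
# OpenStreetMap XML and extract closed building footprints that a
# procedural system could extrude. The .osm file name and the focus on
# buildings are assumptions for this example only.
import xml.etree.ElementTree as ET

OSM_FILE = "district_export.osm"  # hypothetical OSM XML export

root = ET.parse(OSM_FILE).getroot()

# OSM XML stores geometry as nodes (id -> lat/lon) referenced by ways.
nodes = {
    n.get("id"): (float(n.get("lat")), float(n.get("lon")))
    for n in root.findall("node")
}

footprints = []
for way in root.findall("way"):
    tags = {t.get("k"): t.get("v") for t in way.findall("tag")}
    if "building" not in tags:
        continue
    ring = [nodes[nd.get("ref")] for nd in way.findall("nd") if nd.get("ref") in nodes]
    if len(ring) >= 4 and ring[0] == ring[-1]:  # closed polygon
        footprints.append({"levels": tags.get("building:levels", "1"), "ring": ring})

print(f"Extracted {len(footprints)} closed building footprints")
```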
Two members of Platige Image’s R&D team, Sara Batista and Zoltan Cseri, presented Lights, Camera, Unreal: Redefining the Cinematic Workflow at Platige. The duo offered a technically grounded look at how the studio has standardised Unreal Engine as a core component of its cinematic pipeline. Batista, a software engineer turned production manager, and Cseri, a senior developer specialising in real-time engines, outlined how Platige’s internal Unreal team evolved from early experiments in 2020 (Koala) into a full-fledged department producing over 30 projects across commercials, cinematics and short films. The studio now maintains its own Unreal 5.6 build, develops proprietary tools, and integrates the engine into traditional CG workflows used for productions such as Love, Death & Robots, Crossfire, and Little Ruby.
Their presentation compared two milestone projects. The Aston Martin commercial, produced with an Unreal Engine 5 beta, demonstrated procedural marble and crystal materials and spline-based city generation, but also exposed the need for a dedicated pipeline and toolset. The later Warhaven cinematic, created on the Unreal Engine 5 release, showed the results of that investment: real-time fire effects built with EmberGen flipbooks, dynamic slow-motion using Unreal’s time dilation, a custom “turn-to-stone” Blueprint replacing Houdini caches, and efficient render-farm integration.
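The slow-motion work relies on Unreal’s global time dilation, which Platige drives from Blueprints and Sequencer rather than from scripts; still, its effect is simple to model, since game time advances each frame by the delta time multiplied by the dilation value, so a dilation curve is effectively a speed ramp. The curve and frame rate below are invented for illustration and are not taken from the Warhaven cinematic.

```python
# Conceptual sketch of how a time-dilation curve produces a speed ramp:
# each rendered frame advances "game time" by frame_dt * dilation, so a
# dip in the curve slows the action on screen. The curve and frame rate
# below are illustrative only.
FPS = 24
FRAME_DT = 1.0 / FPS

def dilation_at(t: float) -> float:
    """Hypothetical speed ramp: full speed, dip to 10% slow motion, recover."""
    if 1.0 <= t < 2.0:
        return 0.1
    if 2.0 <= t < 2.5:
        return 0.5
    return 1.0

game_time = 0.0
for frame in range(int(3.0 * FPS)):          # three seconds of playback
    wall_time = frame * FRAME_DT
    game_time += FRAME_DT * dilation_at(wall_time)
    if frame % FPS == 0:
        print(f"wall {wall_time:4.2f}s -> game {game_time:4.2f}s "
              f"(dilation {dilation_at(wall_time):.2f})")
```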
Beyond showpieces, the talk detailed Platige’s in-house automation: Perforce and Git versioning, Deadline render-queue management, custom Python/C++ shot-building and render-submission tools, shared shader caches, and OCIO-based colour consistency. Batista and Cseri also addressed the cultural shift of retraining artists for realtime production, citing optimisation discipline as the hardest transition from offline rendering.
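Platige’s submission tools are proprietary, but scripted Deadline submission generally follows a public pattern: write a job info file and a plugin info file, then pass both to deadlinecommand. The sketch below shows that shape with placeholder values; the generic CommandLine plugin, the pool name and the commented-out Unreal arguments are assumptions for the example, not Platige’s actual setup.

```python
# Sketch of scripted render-farm submission via Deadline's command-line
# interface: write a job info file and a plugin info file, then hand both
# to deadlinecommand. Paths, pool names and the CommandLine plugin keys
# are placeholders; a studio tool would generate these per shot.
import subprocess
import tempfile
from pathlib import Path

def submit_shot(shot: str, frames: str, exe: str, args: str) -> None:
    tmp = Path(tempfile.mkdtemp(prefix="deadline_"))

    job_info = tmp / "job_info.txt"
    job_info.write_text(
        "Plugin=CommandLine\n"        # generic plugin; studios often use custom ones
        f"Name={shot}_render\n"
        f"Frames={frames}\n"
        "Priority=50\n"
        "Pool=unreal\n"               # hypothetical farm pool
    )

    plugin_info = tmp / "plugin_info.txt"
    plugin_info.write_text(
        f"Executable={exe}\n"
        f"Arguments={args}\n"
    )

    # deadlinecommand <job info file> <plugin info file> submits the job.
    subprocess.run(["deadlinecommand", str(job_info), str(plugin_info)], check=True)

# Hypothetical usage for a headless Unreal render (arguments elided):
# submit_shot("sq010_sh0040", "1001-1120",
#             "C:/UE/Engine/Binaries/Win64/UnrealEditor-Cmd.exe",
#             "MyProject.uproject -game ...")
```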
Michael Shelton, a 30-year industry veteran, works as a Visual Effects Supervisor, Second Unit Director, and in a range of other roles. Find out more about his services on his website and browse his credits on IMDb.
In his presentation, Michael talks about his early work, starting in the 1980s, which ranged from designing and building practical creatures to helping puppeteer them. Creatures such as a 1/6-scale Godzilla for Roland Emmerich’s 1998 film were driven by a telemetry suit that controlled massive, dangerous hydraulic systems.
With time, Michael moved into the digital realm, both for 2D and 3D work, using the earliest versions of Photoshop, Maya, and many other tools. Projects he has worked on include Mary Poppins, Star Wars, Westworld, and many others. We’re all waiting for Michael to publish a book about his work!
How to watch
Recordings are available through the ICE.ART website, in the “Recordings” section under ICE.ART 2025.II, and on the official YouTube channel, making it easy to catch up at your own pace.
Upcoming events
Looking ahead, ICE.ART already lists a special collaboration event with the Visual Effects Society (VES) on 26 November 2025 under the rubric “Beyond VFX”, focusing on AI ethics. Also planned is the next main edition, ICE.ART 2026.I (7–9 April 2026), which may switch to a new presentation platform. Be ready to be surprised!