renderman | test

i’ve known pixar’s renderman since ca. 2005, when it first came out as a plug-in for (then) alias|wavefront maya. at that time it was the king of soft lighting, less technical than mental ray, with often unrealistic but beautiful results.

over the years i lost contact, and in recent months the → arnold renderer by the spanish programmers at solid angle has served all my needs. it is simply an amazing tool for lighting, texturing and rendering.

renderman is a direct competitor, and this is my first attempt at using it for a maya scene about → NURBS modeling. i’m quite impressed. the scene has one global and three local light sources. the NURBS fillets are rendered as highly reflective. nice, and not hard to set up.

maya scene with NURBS fillets, rendered using renderman

our | house

computer gamers would love this. it’s a very versatile house, made in 3D, of course. you can enter from different sides and angles, but you cannot use the same entry points for exiting the house. i won’t show that here; i’ll only show how versatile the house can be from the outside. technically, i used maya 2018, the prosets plug-in for one-minute modeling of a cylinder, a toon shader and two lights.

the house in black with orange

the house in white with some yellow

the house in grey and some red

sobering view

thumbs | down

i run a youtube channel, free of charge, about computer animation. since the channel reached a critical number of subscribers, every new tutorial i upload receives one (and in most cases only one) negative review by a thumb pointing down. since that typically happens only a few minutes after publication of the video, i guess it’s a bot at work. but what kind of bot, and to what purpose?

thumbs down within minutes

blue horse in a | box

this horse was an “accident”, or at least not planned in any way. initially i wanted to model a small, almost cubic room with subtle deformations of all three visible walls. then i thought, nice, let’s cut out some windows. after having done this (a matter of seconds), i needed a light, a light from the back, a strong light from the back, a light with so-called “god rays”. well, then only one thing was missing: a 3D object. i picked the default horse (from maya’s built-in library of 3D meshes) and coloured it red. didn’t look good. blue? yep. the rendering at the very bottom has a slight overdose of depth of field (DOF). nevertheless, quite cool.

the whole work was done in maya 4, which still sits on my computer. i later imported the scene into the current version for the screenshots.

blue horse in a box. © ms

the 3D scene

same scene, thin focus plane and an extra in the left window.

wired | horse

wired horse. © ms

the rendering above is how it ended. it involves some atmospheric light distribution and a tiny breeze of depth of field. the focus plane is where the horse’s eyes would be, and this works nicely together with the wireframe shadow. the light sends so-called “god rays” through the wired horse. the next images show different approaches, with an intermediate result at the bottom. once i had that image, it was only a matter of minutes to refine it for the final rendering. the model of the horse, btw, is from → maya’s built-in library of 3D objects.

two different approaches. wireframe and toon rendering

end of the wiring and test-rendering phase

swinging | orbit

swinging orbit. graphic: ms

actually, this was not meant to lead to a rendered image at all. i experimented with particles in CGI, 10 per second, emitted at the center of the scene, about 10 cm from the objects you see in this rendering, and blown in one direction by wind. i then applied so-called trails to connect them. what i actually wanted, however, was a looser connection of the particle objects by thin curves. it took me an hour or two to find out how this could work. and while experimenting, this camera angle came up. the view was so beautiful that i introduced some light and did this high-res rendering with depth of field. the camera actually sits in the particle stream, with the particles flying towards us, and it looks into a world of macroscopic dimensions. that’s why the DOF is so shallow, even with a pretty small aperture.
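the shallow DOF at close range can be sanity-checked with the standard macro depth-of-field approximation. this is just a sketch; the f-number, circle of confusion and magnification below are illustrative guesses, not values from the actual scene:

```python
# hedged sketch: total depth of field at high magnification, using the
# common macro approximation DOF ≈ 2·N·c·(m+1)/m²
# (N = f-number, c = circle of confusion in mm, m = magnification).
# all numbers are hypothetical, not taken from the rendered scene.

def macro_dof_mm(f_number: float, coc_mm: float, magnification: float) -> float:
    """approximate total depth of field in mm for a close-up shot."""
    return 2.0 * f_number * coc_mm * (magnification + 1.0) / magnification ** 2

# even a fairly small aperture (f/8) leaves a razor-thin focus zone at 5x:
print(round(macro_dof_mm(8.0, 0.03, 5.0), 3))  # → 0.115 (mm)
```

which is why a camera sitting inside a macro-scale world blurs almost everything, no matter how far you stop down.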

never seen a | hand

google lens is an AI app that was unlocked only a few days ago. it typically does the following: i point my phone camera at a kitchen lamp, and google lens not only recognizes it as a kitchen lamp but tells me “ikea” and offers me similar lamps, including the option to buy them. here, however, the app fails, artificial intelligence and all:

google lens does not recognize that these are fingers.