6 posts tagged 3dmodeling

Houdini to Redshift: Keeping Colors Sharp

In Houdini, I usually assign color to primitives (though Houdini defaults to assigning it to “points”). However, if you want Redshift to recognize color attributes (using RSUserDataColor), you need to promote the Cd attribute to points or vertices, as Redshift doesn’t interpret it directly on polygons.

Promoting Cd to points will result in color blending when you subdivide the model, which can create blurred colors. To maintain sharp color boundaries, promote Cd to vertices instead, as Redshift can understand vertex-level color attributes clearly.
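For reference, here is a rough VEX sketch of that promotion done in a Primitive Wrangle instead of an Attribute Promote SOP (just an illustration of the idea, not necessarily the exact setup):

    // Primitive Wrangle: copy the point-level Cd onto every vertex of the
    // primitive, so each vertex keeps its own hard color after subdivision.
    int nvtx = primvertexcount(0, @primnum);
    for (int i = 0; i < nvtx; i++) {
        int pt = vertexpoint(0, vertexindex(0, @primnum, i));
        vector col = point(0, "Cd", pt);
        setvertexattrib(0, "Cd", @primnum, i, col);
    }
    // Delete the point-level Cd afterwards (Attribute Delete SOP) so
    // RSUserDataColor reads the vertex attribute.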

 7   27 d   3dmodeling   houdini   redshift

Modeling low poly pirate ship in 3D

Timelapse of modeling a low poly pirate ship in 3D using Houdini, ZBrush, Substance Painter, RizomUV and Redshift:

If you prefer the video on YouTube with chapters, you can watch it here.

The small concept drawing you see in the timelapse I found on Pinterest quite a while ago. Sadly, I don’t know the author.
You can rotate the model in your browser here:

I also passed the geometry to Meta Spark Studio to make an IG filter.
Here is the link to open the filter in Instagram:

If you want to do that, there are a couple of things to remember:

 967   2023   3dmodeling   animation   modeler   process   rizomuv

ZBrush lowpoly modeling and polygroups.

If you ever wondered why, during polymodeling in ZBrush, you keep selecting several polygroups by CTRL+Shift clicking on only one:
ZBrush uses vertices for most brushes and selections. And if the polygon you clicked has no other polygon of the same polygroup next to that vertex, it simply selects the neighboring polygroup as well.

So in this case you can use the Select Lasso tool and click on one edge to hide the full polygon loop, and then invert the visibility by CTRL+Shift dragging outside of the mesh:

 864   2022   3dmodeling   zbrush

Houdini – random coloring from an image palette.

I was trying to optimize my coloring process for a project, and here is where I am at right now:

Coloring process:

  1. Get the palette that I want as a screenshot from here:
    https://paletton.com/#uid=60B0u0kllzcboPZgUH4pEuxt-pp
  2. Convert the image from Utility-Texture_sRGB to the target color space ACEScg using the PYCO ColorSpace converter. (I still need to do some more tests on this part by using these .exr files as an emission texture to compare the colors with the reference.)
    https://pyco.gumroad.com/l/pycocs
    Free with the code “free” at checkout.
  3. From GitHub you can install Color Palette Ramp – a Houdini HDA that creates a ramp based on a color palette from an image.
    https://github.com/jamesrobinsonvfx/colorpaletteramp
  4. In Houdini, use that HDA (colorpaletteramp) at SOP level to create a ramp. If I got the image from the Paletton webpage, I use Stops -> 20, but something around 10 works great for other images.
  5. With OD Tools you can right-click on the result and “Palletize Ramp [OD]” to make the color separation constant and look more like a palette instead of a gradient. You can get OD Houdini Shelf Tools 2021 for $100 here:
    https://origamidigital.com/cart/index.php?route=product/product&manufacturer_id=11&product_id=66
  6. You can save this ramp in your OD Asset library for future use.
  7. To color geometry based on disconnected pieces: first use a “Connectivity” node on points to create an integer attribute called id. Then use an “Attribute Adjust Color” node with Adjustment Value -> Pattern Type set to Random, Randomization By -> Custom Attribute, and Custom Attribute -> id. Then by changing the seed parameter you can get random options of color combinations (see the sketch after this list).
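Here is also a rough VEX sketch of the ramp sampling and per-piece coloring squeezed into a single Point Wrangle, in case you want to try it without the HDAs (the ramp parameter name palette_ramp and the stops/seed sliders are assumptions for this example, not part of my actual setup):

    // Point Wrangle: pick one flat palette color per connected piece.
    // Assumes the Connectivity SOP already created the integer "id" attribute
    // and that a color ramp parameter named "palette_ramp" exists on this node.
    int   stops = max(1, chi("stops"));            // e.g. 20 for a Paletton ramp
    float u     = rand(i@id * 13.37 + ch("seed")); // stable random value per piece
    // Snap to the center of a stop so every piece gets a flat palette color
    // instead of an in-between gradient value.
    u = (floor(u * stops) + 0.5) / stops;
    v@Cd = chramp("palette_ramp", u);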

Results from 3 different ramps:

 1164   2022   3dmodeling   houdini

Shaman – Houdini vs Blender

I had wanted to try Blender for a long time, and came across a series of tutorials from the YouTube channel Blender 3d. After watching them, it became clear why so many people love this free software.
I started in Blender, but then jumped back into Houdini. With the plugin called Modeler, you can repeat the steps without problems.
Here is a “turntable” and then a sped-up walkthrough:

The UVs I did in RizomUV. They have just released an update, and now you can insert one group into another. For example, a group of “feathers” can be included in the “head” group and packed together. One of my favorite tricks: you can pack the islands using their direction in 3D space. Want everything to be aligned by Y in UV space? Just a click of a button. By the way, the groups made in Houdini are visible in Rizom.

After UVing, I imported some groups into ZBrush to add details.
I baked from high to low in Marmoset. It also understands groups from Houdini, so it is not necessary to export an “exploded” mesh separately, as is usually done for baking in Substance. Another nice thing about it is the auto-reload of textures and geometry: if you change something in another program and save, Marmoset automatically shows those changes.
I textured in Substance Painter. Then I rendered in Houdini with Redshift.

To make the cartoon outline, I cloned the geometry and assigned a double-sided material to it. The “front” is transparent, and the “back” has only a black emission material assigned to it. Then you add displacement with a constant instead of a texture, and that’s it. You can control the thickness of the line with the amount of displacement, and the color of the line with the emission (yellow in this example):

Then I repeated the same trick in Marmoset. It works when rendering, but displacement is not supported in the “viewer”. So if you want to send a link to the client so they can rotate the model in the browser, you need another approach:
I exported additional geometry from Houdini, but with reversed normals and inflated a bit with the “Peak” node. Then in Marmoset I assigned a new dark material without reflections, and set the Diffusion module to Unlit.
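The Peak part of that is basically just pushing points along their normals. A minimal Point Wrangle sketch of it (the thickness slider is an assumed parameter; the normal flip itself is still done with a Reverse SOP):

    // Point Wrangle: inflate the outline copy slightly along its point normals,
    // which is what the Peak SOP does. Assumes point normals exist; add a
    // Normal SOP upstream if they do not.
    float thickness = ch("thickness");   // e.g. 0.005 in scene units
    @P += normalize(v@N) * thickness;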

Here is the result that you can rotate:

And couple more renders from Redshift:

The original concept drawing was made by the amazing La Foret Oublie.
And here is a great article on shading in Marmoset.

 1684   2021   3dmodeling   blender   houdini   panama   redshift

What language to use in Panama.

Sometimes I feel like writing or talking on social media. But deciding which language to use is tricky. The Russian guys don’t know Spanish. Many Panamanians do not speak English fluently and certainly do not know Russian.
Everything I watch and read is in English, but I rarely speak it. All my notes are also in English.
After ten years in the tropics, I have begun to forget some Russian words. Whether it is worth remembering them, I do not know. Better to learn to speak correct Spanish. I often speak like a dockworker, with zero benevolence. My partner suggested that at the beginning of conversations with new people I mention that I am Russian, and that frankness is not considered rudeness in our country.

In a video suggested by YouTube, a guy makes a dialog icon in five minutes in Sketch:

I liked the thumbnail, so I made it in 3D.
At the same time, I practiced stitching high dynamic range panoramas. They are used for lighting most scenes in 3D. Just before the start of the quarantine, I took a few photos in the office. Unfortunately, the only program that stitches them well (PTGui 12) costs $300. The demo works without restrictions, but it fills everything with watermarks. In Photoshop, however, they can be erased even in 32 bits. I guess it’s ok for just a fun project.
I also figured out a little bit more about the differences in normals between points and primitives in Houdini.

Sometimes I need to get vector graphics from Illustrator and extrude them in Houdini.

A constant problem is that some parts can be flipped and will extrude in a different direction. It happens because the direction of the paths, as they were drawn in Illustrator, differs.

So I thought I could add an Attribute Expression node, set it to Points, set the attribute to Normal, set the VEXpression dropdown to Constant Value and write 0, 1, 0 as the constant value to get normals pointing up. But it will not change the primitive normals, because primitive normals are not actually an attribute: they are derived information calculated from the vertices that make up the primitive. As such, they cannot be modified. You can still use PolyExtrude, set it to point normal and the extrusion mode to Existing, but you will end up with geometry where some primitive normals point “out” and others point “in”. I don’t know if there is an easy fix for that.

So after bringing your paths in from Illustrator, you first need to separate the primitives that are flipped. You can do this with a simple Group node: use only “Keep by Normals”, set the direction to 0, -1, 0 and lower the spread angle. There is also the Labs Split Primitives by Normal node that does exactly this with fewer clicks (see the sketch below).
Then use a Reverse node on that group. It WILL reverse the vertex order in those primitives.
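If you prefer a wrangle over the Group node, here is a rough Primitive Wrangle sketch of the same “keep by normals” idea (the group name flipped and the spread_angle parameter are assumptions for this example):

    // Primitive Wrangle: put every primitive whose derived normal points
    // roughly down (0, -1, 0) into the "flipped" group, then feed that group
    // to a Reverse SOP to fix the winding.
    vector n     = normalize(prim_normal(0, @primnum, 0.5, 0.5));
    float  limit = radians(ch("spread_angle"));   // e.g. 30 degrees
    if (acos(clamp(dot(n, {0, -1, 0}), -1.0, 1.0)) < limit)
        setprimgroup(0, "flipped", @primnum, 1);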

Also, takes are amazing. You can create different versions of a scene and, in the render node, save the image with the take name like this:
$HIP/render/r01/bubbles.static.s5.`chsop("take")`.$F2.tif
So the `chsop("take")` part is responsible for the take name, and in my case the output names will be:
bubbles.static.s5.blue_bubbles.01.tif
bubbles.static.s5.orange_bubbles.01.tif

 829   2021   3dmodeling   houdini   redshift