One of the most recent additions to the Cycles render engine is a toon shader. Many people seem interested in using it, but aside from the links shown here there is little to no information about it, so they must resort to personal experimentation. This tutorial aims to teach readers how to use the toon shader to create renders with a comic style, and will be based on this simple scene as an example:


We will break this tutorial up into four sections:

  • Lighting setup
  • Shading
  • Render parameters
  • Compositing

Lighting setup:

The first step will be setting up our lighting, which will consist of a key light and a fill light. Both will be point lights with a size of 0. Any light with a size different from zero will create gradients in our shading. We want the shading to be as flat as possible, since we will create and control gradients on our objects and characters manually using the ‘size’ and ‘smooth’ parameters of the toon shader.

Key Light: we will use a point lamp, placed slightly above and to the right of our character and set to a very high ‘strength’ value. This will be our main light, the one that defines the shading direction and the only one casting shadows in the scene. The high ‘strength’ value shown here is needed to light a model that is 10 Blender units tall. Smaller objects will need much lower values.


Key light location


Key light settings


Scene lighted by key light

Fill Light: we will use a point lamp parented to the camera, with ‘cast shadow’ disabled so it won’t create a second shadow under objects and characters. This light is there to fill the dark side of objects so they won’t be pitch black. It will also create additional highlights and details in our objects. Again, the seemingly excessive ‘strength’ value is due to the size of the lighted model. Smaller objects will require more ordinary values.


Fill light location


Fill light settings


Scene lighted by key and fill lights
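
For those who prefer to script their scenes, here is a minimal bpy sketch of this two-lamp setup. It is written against the 2.6x-series Python API this tutorial was made with (bpy.data.lamps, scene.objects.link), and the lamp names, locations and strength values are just placeholders to adapt to your own scene:

    import bpy

    scene = bpy.context.scene

    # Key light: point lamp, size 0 so the shading stays flat, very high strength
    key_data = bpy.data.lamps.new("KeyLight", type='POINT')
    key_data.shadow_soft_size = 0.0
    key_data.use_nodes = True
    key_data.node_tree.nodes['Emission'].inputs['Strength'].default_value = 5000.0  # placeholder; scale to your model
    key = bpy.data.objects.new("KeyLight", key_data)
    key.location = (4.0, -4.0, 12.0)  # placeholder: slightly above and to the right of the character
    scene.objects.link(key)

    # Fill light: parented to the camera, with shadow casting disabled
    fill_data = bpy.data.lamps.new("FillLight", type='POINT')
    fill_data.shadow_soft_size = 0.0
    fill_data.use_nodes = True
    fill_data.node_tree.nodes['Emission'].inputs['Strength'].default_value = 1500.0  # placeholder
    fill_data.cycles.cast_shadow = False  # the 'Cast Shadow' checkbox
    fill = bpy.data.objects.new("FillLight", fill_data)
    fill.parent = scene.camera
    scene.objects.link(fill)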

World Background: usually our background will be absolute black, but if you are willing to experiment a little with it, you can raise it to a 25% gray. It can help fill shadows, but it can also introduce some noise, which will require higher sampling to clear up, and that means longer render times. For this particular scene we won’t use background lighting.


Shading:

In this stage we will set up the materials that define our model’s appearance. We are going to use the toon shader for all our materials here, so some advice about its behavior is in order.

First of all, you should always use the toon shader with smooth shading activated; otherwise you will get some nasty artifacts.


Another thing to be aware of when working with the toon shader is that certain combinations of ‘size’ and ‘smooth’ settings in the shader will produce very visible terminator artifacts on low poly objects.


The toon shader can work in diffuse mode and glossy mode, and a combination that causes artifacts in one mode might not cause them in the other. After some experimenting I found that by following these two rules, for diffuse and glossy mode respectively, you can avoid terminator artifacts:

  • Diffuse mode: size+smooth <= 0.9
  • Glossy mode: size+smooth <= 1.0

As long as you don’t push the sum of size and smooth above 0.9 in diffuse mode or 1.0 in glossy mode, you will be safe from terminator artifacts in the toon shader.
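
If you tweak these values from a script, a tiny helper like the one below can encode the rule. It is only an illustration of the arithmetic (the function is mine, not part of Blender):

    def toon_settings_are_safe(size, smooth, component='DIFFUSE'):
        """size + smooth must stay <= 0.9 in diffuse mode or <= 1.0 in glossy mode."""
        limit = 0.9 if component == 'DIFFUSE' else 1.0
        return size + smooth <= limit

    print(toon_settings_are_safe(0.6, 0.2))            # True: safe in diffuse mode
    print(toon_settings_are_safe(0.7, 0.3))            # False: risks terminator artifacts in diffuse mode
    print(toon_settings_are_safe(0.7, 0.3, 'GLOSSY'))  # True: safe in glossy mode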

We will now examine the node setup for the main material of our robot, beginning with the base diffuse shader. ‘Size’ lets us define the proportion between lit and shadowed areas on the object, while ‘smooth’ lets us define the width of the terminator, the border region that separates the lit and shadowed areas. The diffuse component of our material will have a size of 0.6, so that the lit surface covers the majority of the model, and a smooth of 0.2, giving it a nice transition into shadow.


Material node setup – Step 1


Material appearance – Step 1

The next step is adding the glossy element. As counterintuitive as it may seem, this time we are going to use another toon shader in diffuse mode. As it turns out, setting the smooth value to 0 creates a very sharp terminator with no transition between light and shadow, which reads as a glossy look. In certain cases such as this one, a toon shader in diffuse mode with smooth set to 0 will give us a better result than a toon shader in glossy mode. In order to get more subtle details in certain areas, we will use the same color we used for the diffuse element, tuned down to 80% by an HSV node. Also, instead of combining our diffuse and glossy elements with a Mix Shader node driven by a Layer Weight node, we will simply use an Add Shader node:


Material node setup – Step 2


Material appearance – Step 2
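
The same node setup can be built from Python. The sketch below reproduces the two toon shaders, the HSV node and the Add Shader described above; the material name, base color and the size of the glossy element are placeholder values:

    import bpy

    mat = bpy.data.materials.new("RobotToon")  # placeholder name
    mat.use_nodes = True
    nodes, links = mat.node_tree.nodes, mat.node_tree.links
    nodes.clear()
    base_color = (0.8, 0.25, 0.1, 1.0)  # placeholder color

    # Step 1: base toon shader in diffuse mode, size 0.6 / smooth 0.2 (sum stays under 0.9)
    diffuse = nodes.new('ShaderNodeBsdfToon')
    diffuse.component = 'DIFFUSE'
    diffuse.inputs['Color'].default_value = base_color
    diffuse.inputs['Size'].default_value = 0.6
    diffuse.inputs['Smooth'].default_value = 0.2

    # Step 2: "glossy" element, another toon shader in diffuse mode with smooth = 0
    glossy = nodes.new('ShaderNodeBsdfToon')
    glossy.component = 'DIFFUSE'
    glossy.inputs['Size'].default_value = 0.3   # placeholder; tune to taste
    glossy.inputs['Smooth'].default_value = 0.0

    # Same base color, value tuned down to 80% by an HSV node
    hsv = nodes.new('ShaderNodeHueSaturation')
    hsv.inputs['Color'].default_value = base_color
    hsv.inputs['Value'].default_value = 0.8
    links.new(hsv.outputs['Color'], glossy.inputs['Color'])

    # Combine both elements with an Add Shader and plug into the material output
    add = nodes.new('ShaderNodeAddShader')
    out = nodes.new('ShaderNodeOutputMaterial')
    links.new(diffuse.outputs['BSDF'], add.inputs[0])
    links.new(glossy.outputs['BSDF'], add.inputs[1])
    links.new(add.outputs['Shader'], out.inputs['Surface'])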

To get a more detailed look, we can assign different variations of this material setup to parts of our model. This is how the model looks before proceeding to final rendering and compositing:


Scene with all materials before compositing stage


Render parameters:

Before reaching the compositing stage, we must set the render settings for the scene.


Render settings


Light paths: Forget physical correctness or unbiased rendering. We are doing a cartoon-style render here, so we don’t need light bouncing around. Set min and max bounces to 0 and disable caustics.

Sampling: Zero bounces and hard lights mean that noise will be low to practically nonexistent very quickly. Usually 50 samples are enough for a clean render. You could use even fewer samples and probably wouldn’t see much of a difference with the naked eye, but the edge filter nodes in the compositor will notice the difference and return a lot of noise along with the edges we need for our contours. Besides, this technique already renders blazing fast, so there isn’t much benefit to lowering the sampling.

Film: We are going to break up our scene into ‘background’ and ‘foreground’ renderlayers, so in order to be able to composite them we will need to enable transparent film.
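
These settings can also be applied from a script. This is a minimal sketch using the Cycles properties of the 2.6x API; newer versions may expose some of them under different names:

    import bpy

    scene = bpy.context.scene
    cycles = scene.cycles

    # Light paths: no bounces, no caustics
    cycles.max_bounces = 0
    cycles.min_bounces = 0
    cycles.no_caustics = True        # the "No Caustics" toggle of the 2.6x API

    # Sampling: 50 samples is usually enough with hard lights and zero bounces
    cycles.samples = 50

    # Film: transparent background so the renderlayers can be composited later
    cycles.film_transparent = True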

As I said above, before going into the compositing stage we must create two renderlayers: Foreground and Background. This is necessary for correct edge detection, and can even help speed up our render (more on that in the background renderlayer paragraph). It will also allow us to set different contour intensities for background and foreground elements.

Foreground renderlayer: This renderlayer will include all objects in the scene except background elements such as floors, walls, ceilings, landscapes or terrains. We must always enable the normal pass in this renderlayer, as it is necessary for edge detection in the compositor. As you can see in the screenshot below, this renderlayer only includes the first layer, where we have placed everything but the floor.


Background renderlayer: This renderlayer is for background elements such as floors, walls, ceilings, landscapes or terrains. Notice in the screenshot below that we have selected only the second layer for this renderlayer, where we have placed the floor. We will enable the normal pass here only when we want contours drawn for the background. Also, in many cases with rather simple backgrounds, like the one in this scene, we can set a much lower sampling just for this renderlayer, thus speeding up our render. This can be done by adjusting the ‘Samples’ field shown in the screenshot below; if we leave the default value of 0, this renderlayer will be rendered with the same number of samples as the rest of the scene.
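
For reference, the two renderlayers can also be created from Python. In this sketch the scene layer indices, the renderlayer names and the per-layer sample count are assumptions matching the description above:

    import bpy

    scene = bpy.context.scene

    # Foreground: everything except the floor (scene layer 1), with the normal pass enabled
    fg = scene.render.layers.new("Foreground")
    fg.layers = [i == 0 for i in range(20)]
    fg.use_pass_normal = True

    # Background: only the floor (scene layer 2), no normal pass, fewer samples
    bg = scene.render.layers.new("Background")
    bg.layers = [i == 1 for i in range(20)]
    bg.use_pass_normal = False
    bg.samples = 10   # 0 would mean "use the scene sampling"; a low value speeds this layer up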



Compositing:

We can now hit render and begin compositing. This is the node configuration we will set up in the compositor to create our contours:


As you can see, we start by extracting the inner and outer contours from the foreground renderlayer. We are not going to extract a contour from the background renderlayer in this example because the floor is a simple plane. Once we have our contours extracted, we composite background and foreground together and then multiply the result by the inner and outer contours in independent steps, so we can use a different intensity for each contour.

Inner Contour: this is where the normal pass comes into play. We first split the Normal output of the Foreground renderlayer into its HSV components and filter its V (for Value) output using an edge detection filter. In this case we will use Kirsch, but in certain cases Prewitt and Sobel will work too. The resulting image can show some noise in areas where the source render is darker, which can be solved by decreasing the factor of the filter. After that we invert the colors to get black edges over a white background, apply a Dilate/Erode filter to make the edges more homogeneous, and a Gaussian blur for some antialiasing. If you are not sure whether you will change the resolution in the future, setting relative values for the Gaussian blur is the sensible thing to do, since that mode gives the most consistent output across resolutions and thus minimizes future tweaking. Finally, when we apply this contour by multiplying our render with it, we will use only a 0.4 factor, which results in a subtle contour that integrates better into the image.


Extracted inner contour


Inner contour applied to render
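
For those building the tree from Python, here is a minimal sketch of the inner contour branch. It assumes a renderlayer named "Foreground", and the Dilate/Erode distance and blur size are placeholders:

    import bpy

    scene = bpy.context.scene
    scene.use_nodes = True
    nodes, links = scene.node_tree.nodes, scene.node_tree.links

    # Foreground renderlayer as input
    rl = nodes.new('CompositorNodeRLayers')
    rl.layer = 'Foreground'

    # Split the Normal pass into HSVA and run an edge filter on the V channel
    sep = nodes.new('CompositorNodeSepHSVA')
    links.new(rl.outputs['Normal'], sep.inputs['Image'])

    edge = nodes.new('CompositorNodeFilter')
    edge.filter_type = 'KIRSCH'                 # Prewitt or Sobel can work too
    links.new(sep.outputs['V'], edge.inputs['Image'])

    # Invert to get black edges on white, clean up with Dilate/Erode, blur slightly for antialiasing
    inv = nodes.new('CompositorNodeInvert')
    links.new(edge.outputs['Image'], inv.inputs['Color'])

    dilate = nodes.new('CompositorNodeDilateErode')
    dilate.distance = -1                        # placeholder; the distance controls line thickness
    links.new(inv.outputs['Color'], dilate.inputs['Mask'])

    blur = nodes.new('CompositorNodeBlur')
    blur.use_relative = True                    # relative size keeps the result consistent across resolutions
    blur.factor_x = blur.factor_y = 0.2         # placeholder relative size
    links.new(dilate.outputs['Mask'], blur.inputs['Image'])

    # Multiply the contour over the composited render at a 0.4 factor
    mul_inner = nodes.new('CompositorNodeMixRGB')
    mul_inner.blend_type = 'MULTIPLY'
    mul_inner.inputs['Fac'].default_value = 0.4
    links.new(blur.outputs['Image'], mul_inner.inputs[2])
    # mul_inner.inputs[1] takes the background/foreground composite (e.g. from an Alpha Over node)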

Outer contour: Using the same basic method we used to get the inner contour, we perform edge extractions based on the S (for Saturation) and A (for Alpha) components of the Image output of the Foreground renderlayer and multiply them together before applying a Gaussian blur to soften the resulting contour. This gives us a cleaner and more regular contour, which is another good reason to split background and foreground elements into separate renderlayers. Having already composited the inner contour into our render, we multiply the previous step with this contour, but this time with a higher factor in order to make the outer contour stand out over the inner one.


Extracted outer contour


Outer contour applied
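
And this is the corresponding sketch for the outer contour branch, continuing the node tree from the previous snippet; the multiply factor is only an example of "higher than the 0.4 used for the inner contour":

    # Outer contour: edges from the S and A channels of the Foreground renderlayer's Image output
    sep_img = nodes.new('CompositorNodeSepHSVA')
    links.new(rl.outputs['Image'], sep_img.inputs['Image'])

    edge_s = nodes.new('CompositorNodeFilter')
    edge_s.filter_type = 'KIRSCH'
    links.new(sep_img.outputs['S'], edge_s.inputs['Image'])

    edge_a = nodes.new('CompositorNodeFilter')
    edge_a.filter_type = 'KIRSCH'
    links.new(sep_img.outputs['A'], edge_a.inputs['Image'])

    # Multiply both edge maps together before blurring; add an Invert node here
    # if you need dark lines on white, as in the inner contour branch
    combine = nodes.new('CompositorNodeMixRGB')
    combine.blend_type = 'MULTIPLY'
    links.new(edge_s.outputs['Image'], combine.inputs[1])
    links.new(edge_a.outputs['Image'], combine.inputs[2])

    blur_out = nodes.new('CompositorNodeBlur')
    blur_out.use_relative = True
    blur_out.factor_x = blur_out.factor_y = 0.2     # placeholder relative size
    links.new(combine.outputs['Image'], blur_out.inputs['Image'])

    # Multiply over the previous step with a higher factor so the outer line stands out
    mul_outer = nodes.new('CompositorNodeMixRGB')
    mul_outer.blend_type = 'MULTIPLY'
    mul_outer.inputs['Fac'].default_value = 0.8     # placeholder
    links.new(mul_inner.outputs['Image'], mul_outer.inputs[1])
    links.new(blur_out.outputs['Image'], mul_outer.inputs[2])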

Final considerations:

Cycles has often been criticized for its render speeds and hardware requirements, but this 2048×1024 render takes a mere 26.69 seconds on an i7-2600K (quad-core CPU at 3.4 GHz with hyperthreading). Another common misconception is that, being a path tracer, unbiased physically correct renders are its only objective. However, Cycles has a large bag of tricks and tools for non-photorealistic rendering, such as lights that don’t cast shadows, ray visibility settings for all lights and objects, negative lights that absorb light from the scene, light path nodes… and now a toon shader. The bottom line is this: although still in development, Cycles is already a versatile render engine, and its possibilities have not been fully exploited yet. Its future certainly couldn’t look any brighter.

  1. Robynsveil says:

    Very elegant, thoughtful tutorial, JuanJose. Thank you for taking the time to make a HTML page vs a video: this I can refer to and do searches on and go back and forth much more easily.
    Interestingly, I have developed a need for just such a shader, so this is very timely! 🙂

    • Juan José Torres says:

      Glad you will use this. Just drop me a line when you start having results. I’m very eager to see what others can do with this technique. As for the text tutorial form, like you I find text tutorials more useful and flexible than video tutorials. That and the fact that I don’t like recording myself. 😀

  2. Tom Telos says:

    Text tutorials, yes! Agreed. Thank you.

    This is a total surprise to me, coming right after the Blender Foundation released the long-awaited Freestyle for Blender Internal (which I haven’t even tried yet), but I’m modeling a large building, futuristic in the spirit of e.g. Vincent Callebaut so, yes, I’m fascinated by what you are showing us.

    If you look at any of Callebaut’s projects, you will see some quasi-realistic renderings of the intended finished product, followed up by cutaways and draftsman-style images, which are all made (I assume) with the help of archiviz-oriented software (presumably expensive), but I’m married to Blender, so let me ask you:

    1.- Edge detection via the compositor is a new concept to me, but it could allow draftsman-style rendering of architectural glass, long before Cycles receives any “Toon Glass” shader, right?

    2.- Have you compared the usability of Freestyle versus the Cycles+Compositor technique that you describe?

    • Juan José Torres says:

      Hi, Telos. Glad you like text tutorials. 🙂 The designs in the link you have provided are certainly mesmerizing.

      As for your questions:

      1.- I want to do some experimentation to find ways of showing certain transparent and semi-transparent materials in toon style. I will write a post with my findings when I’m done.

      2.- I’ve done a few tests. Unfortunately, the workflow is not exactly ideal. You are forced to create a copy of your scene and adapt it to BI, applying a single shadeless white material to everything. Once you render both the original and the copied scene, you still have to composite them together, and Freestyle takes its sweet time on scenes with a high polygon count. Compositor edges are lightning fast even on high resolution renders with a high polygon count and don’t require a copy of my scene adapted to BI; that’s why I used them instead of Freestyle. I wish developers worked on direct integration between Freestyle and Cycles. That would be an interesting thing to try.

      • Tom Telos says:

        …You are forced to create a copy of your scene and adapt it to BI…

        Thank you!
        You led me by that line of thought, to an idea that could
        turn out convenient for architectural visualization:

        Say you’re modeling a building, and you’re planning to show it
        both in realistic form and in draftsman mode.
        Hence, when assigning materials to said building,
        rather than suffering to choose between Toon and non-Toon,
        choose BOTH, and feed the two into a Mix node.

        Connect the non-Toon to the upper input socket
        and the Toon to the lower, and in the Fac field
        right-click and select Add Driver.
        Blender now expects you to type a Python expression
        into the Fac field, and I suggest that you type the name of
        a Custom Property that you have defined for the whole scene,
        to wit:

        In the Properties panel, I selected the Scene icon,
        then added a Custom Property that I called ArchiDraft,
        my intent being that, by assigning a zero to ArchiDraft,
        I’m telling Blender Cycles that I want a realistic render,
        and by assigning a one, I want a draftsman-style render.
        Conceivably, I might assign some intermediate value.
        The software should pipe said value into the Toon/noToon Mix node
        that I set up for each material pertaining to the building.

        As explained in
        there are two Python ways to refer to the value of ArchiDraft:
        one, is via a call to get(“ArchiDraft”, default)
        which would force a dictionary lookup for every pixel rendered(!);
        the other says[0].ArchiDraft

        which leaves the efficiency freak inside me
        a lot more satisfied, at least conceptually.

        The complication with that,
        is that you then have to add the following
        in the Text Editor panel, give it a name
        (e.g. ArchiDraftActivate) and Run it:

        import bpy
        bpy.types.Scene.ArchiDraft = bpy.props.FloatProperty()

        Forever after, you will have to Run those two lines of code
        in order for the ArchiDraft property (conceptually: a Scene-scope variable)
        to take effect (i.e. enable it to take on a value other than zero).

        The scheme works —hats off to the Blender developers—
        except that, at least in BF2.67r57600, two bugs kept pestering:
        One is the fact that if I add a PyDriver directly into the Fac field
        of the Mix node, and then look for it in the Graph Editor,
        it is not there, and, even though the field turns purple and
        (by hovering the cursor over the field)
        one can see that the intended expression is indeed present,
        Blender simply refuses to evaluate the expression.
        That bug has a workaround:
        bring in a Value node, plug it into the Fac field of the Mix node,
        and plop the PyDriver expression into the Value node.
        The Graph Editor acknowledges; I bear witness. Good. But:
        Bug #2: the expression does not update automatically;
        it needs to be coaxed and cajoled!
        Ok; I post this, rush over to the bugtracker, etc.


        P.S. How does one add pictures to these comments??

    • Juan José Torres says:

      You’re welcome. I haven’t tried it, but I imagine that currently the only way to integrate Freestyle with Cycles would be to render the scene in BI, with a single shadeless white material on everything and Freestyle enabled. Then you could combine the result with the Cycles render using the compositor. For my taste that is too convoluted, and it would probably take quite a bit longer to render than the method described in the tutorial.

  3. jovlem says:

    Where can I find the Toon shader in Blender? It is not available in my 2.67a version (Windows 32-bit). Do I need to activate something or so?

  4. Robin says:

    Why can’t I find the toon shader in the cycles shader menu?

    Do I need to download a non-trunk version of Blender?

    • Juan José Torres says:

      Hi, Robin. The toon shader was committed to trunk in revision r56980 (2.67.1). If you want to try it before the next official Blender release comes out, you can grab the latest buildbot build from here.

  5. doug powell says:

    Thanks for this tutorial and the great model you made. I am an absolute n00b with Blender and am just starting to learn it. I read somewhere that an accomplished Blender artist saw a piece of Blender artwork that inspired him to achieve something like it himself, no matter how many years of learning it would take. Your figure here, with this type of shading, is my goal.

    Thanks for the inspiration to persevere and learn.

  6. TBolt says:

    Greetings Juan Jose, just popped in to say this is one of the best text tutorials I’ve come across! Well organized, thorough, and easy to understand. I generally like video tuts, but would certainly enjoy seeing more in this format from you. As I’ve been playing with Blender for a bit over a year, I’ve tried to soak in as much knowledge and inspiration from as many sources as possible. I’ve added you to my favorites and look forward to seeing more from you. Thank you sir and good luck in your endeavours.

  7. gimatli says:

    Outer contour: I performed edge extractions based on the Saturation component of the “Diffuse Color” output instead of the “Image” output of the Foreground renderlayer to get less noise.

  8. Adam says:

    I made the lighting setup as you wrote, but I cannot achieve an effect like “Scene lighted by key light”. The shadows on the robot look so smooth. Is it a raw render or did you add any nodes or compositing to this? Maybe there is a problem with the Blender version; I used 2.68. Great tutorial btw 🙂

    • Juan José Torres says:

      Hi, sorry for the late reply; I have been going through a period with a heavy workload. To increase or reduce the thickness of the lines, you need to adjust the parameters of the “dilate/erode” node. In principle the distance value should let you make the edges thicker or thinner, but you can also experiment with the different modes of this node if you are not happy with the result.

  9. Pat says:

    Fantastic tutorial! I mainly work in LightWave 3D, but your detailed explanation gives me some good ideas for translating this technique from Blender to LightWave. Thanks for sharing!

    • Juan José Torres says:

      Thank you! It would be really awesome if you succeed in translating this technique into LightWave. If you do, could you post some renders or maybe a short tutorial? I’d love to add a link to it in my article. 🙂

  10. Lawrence D’Oliveiro says:

    What does it mean to apply a separate-HSVA node to a normal vector? A vector isn’t a colour. What you are doing is calculating the magnitude of the vector. It would help to say so. Because there is a more direct way of doing it.

  11. Patrick says:

    Very nice tutorial. I’m glad you wrote it down instead of making a video. I like the art style, and for me this is a new way to make the outlines of the models instead of using Freestyle from Blender.

    I was wondering if you have any idea on how to make cartoony fire and/or smoke.

    Anything like this


    I was trying to use quick smoke but the result isn’t what I was thinking of

    This is my result atm

    Thank you in advance

  12. Victor says:

    Hi, excuse me, I didn’t understand how you extracted the inner and outer contour nodes :/ I hope you can give me a hand with that. Thanks

  13. Dan says:

    Hola! I couldn’t resist but send a quick comment to thank you for this fantastic tutorial! I love the step-by-step, detailed, clear, beautifully-illustrated explanations 🙂 You make it a real pleasure to read and to learn! So far it’s the best Blender tutorial I’ve ever seen on the Internet! Muchas muchas gracias!
