MaxMara Bearing Gifts
Live the new #MaxMaraBearingGifts experience with Max The Teddy.
Our friends at LOW reached out to us about an interactive web experience for MaxMara. The concept is straightforward: Max The Teddy leads the audience through different scenes, some playful and some for product display. LOW wanted to work with us to bring something beautiful and playful to their client.
Early Stage Look Dev
Whenever we start a project, we plan ahead to estimate the achievable render quality based on the project timeline, the budget and the target audience. This helps us align expectations with our client and build something as nice as possible within the scope. Nailing down the expected look at a very early stage is important: it lets the artists and the art director at our agency partner know what to expect, so they feel comfortable letting us explore something interesting.
The same applied to this project. To make sure the experience runs smoothly on every single device, we decided not to use any real-time lighting system: everything you see in the experience is either matcap-shaded or prebaked. We worked with the 3D modelling vendor to build the experience in real-time applications for the web, helped generate and optimise some of the 3D assets we needed, and provided guidelines on adding our service to their pipeline.
Matcap is a widely used real-time technique that mimics a high-quality final render by baking the whole lighting environment into a small texture sampled by the view-space normal. Its biggest downside is that it doesn't hold up under camera movement, since the texture carries no information about other camera angles. Luckily, our materials are not very shiny, so we could somewhat get away with it.
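To show the lookup itself, here is a minimal sketch of the matcap UV math (written in plain JavaScript rather than our production GLSL, where it is a one-liner): a view-space normal is remapped from [-1, 1] to [0, 1] and used as the texture coordinate.

```javascript
// Remap a view-space normal to matcap texture coordinates.
function matcapUV(normalViewSpace) {
  const [x, y, z] = normalViewSpace;
  // Normalize defensively in case the interpolated normal drifted.
  const len = Math.hypot(x, y, z) || 1;
  return [
    (x / len) * 0.5 + 0.5, // u: left edge of the matcap maps to x = -1
    (y / len) * 0.5 + 0.5, // v: bottom edge maps to y = -1
  ];
}

// A normal pointing straight at the camera samples the matcap centre.
matcapUV([0, 0, 1]); // → [0.5, 0.5]
```

Because the lookup depends only on the normal in view space, two surfaces with the same orientation always shade identically, which is exactly why camera movement can break the illusion.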
Another downside of matcap is that it doesn't work well on flat surfaces, and our solution was simply to avoid having truly flat surfaces. Concretely, we made sure the vertex normals within the same polygon never point in exactly the same direction. Taking the gift box as an example, we first added a slight bevel to its edges to soften them. Then, instead of using the face normal, each vertex uses the average of the normals of its adjacent faces. This simple hack forces the matcap sampling to vary across a flat surface, so every pixel of the polygon no longer gets the same result.
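The averaging step above can be sketched like this (a hypothetical helper, not our actual tooling): every vertex accumulates the normals of all faces that touch it, so shared vertices on a nominally flat surface end up pointing in slightly different directions.

```javascript
// Compute smoothed per-vertex normals by averaging adjacent face normals.
// positions: array of [x, y, z]; indices: flat triangle index list.
function averagedVertexNormals(positions, indices) {
  const normals = positions.map(() => [0, 0, 0]);
  for (let i = 0; i < indices.length; i += 3) {
    const [a, b, c] = [indices[i], indices[i + 1], indices[i + 2]];
    const [pa, pb, pc] = [positions[a], positions[b], positions[c]];
    // Face normal = (pb - pa) x (pc - pa), left unnormalized so larger
    // faces contribute more (a common area-weighting choice).
    const u = pb.map((v, k) => v - pa[k]);
    const v = pc.map((w, k) => w - pa[k]);
    const n = [
      u[1] * v[2] - u[2] * v[1],
      u[2] * v[0] - u[0] * v[2],
      u[0] * v[1] - u[1] * v[0],
    ];
    for (const idx of [a, b, c]) {
      normals[idx] = normals[idx].map((w, k) => w + n[k]);
    }
  }
  // Normalize the accumulated sums.
  return normals.map((n) => {
    const len = Math.hypot(...n) || 1;
    return n.map((v) => v / len);
  });
}
```

Combined with the bevelled edges, this guarantees the interpolated normal (and hence the matcap UV) changes across every polygon.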
Prebaked Ambient Occlusion
Even with matcap shading, everything still looks a bit flat because there is no lighting interaction between neighbouring geometry:
One of the easiest ways to improve the quality of the image is to add prebaked ambient occlusion:
We chopped and packed the whole scene into a couple of smaller models, each reading its AO values from a different color channel of a few 2K textures. This works nicely on both desktop and mobile, with a small network-bandwidth and GPU-memory footprint.
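The packing idea can be sketched as follows (the helper name and layout are ours for illustration, not the production pipeline): up to four single-channel AO maps of the same resolution are interleaved into one RGBA buffer, and each model remembers which channel index to read back in its shader.

```javascript
// Interleave up to four grayscale AO maps (Uint8Array, same length)
// into a single RGBA texture buffer.
function packAOChannels(aoMaps) {
  if (aoMaps.length === 0 || aoMaps.length > 4) {
    throw new Error("expected 1-4 AO maps");
  }
  const size = aoMaps[0].length;
  const rgba = new Uint8Array(size * 4); // 4 channels per texel
  for (let channel = 0; channel < aoMaps.length; channel++) {
    const map = aoMaps[channel];
    for (let i = 0; i < size; i++) {
      rgba[i * 4 + channel] = map[i]; // model n reads channel n
    }
  }
  return rgba;
}
```

One RGBA texture then serves four models, which is where the bandwidth and memory savings come from.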
Blurry Reflection and Refraction
Rendering transparent objects in real time is always challenging, especially if we want the experience to run smoothly on mobile devices. So instead of using a complex depth-raymarching technique to render the reflection and refraction for the glassy and glossy objects, we chose a simple fake: render all opaque objects first, then reuse that screen-space result for both reflection and refraction, without any depth information. This kind of technique is widely used in AR applications because it is cheap to render and can still yield a pretty convincing result.
As you can see, since no depth information is involved, nothing is physically correct. So we added a bit of blurriness to hide the artifacts and to give a bit of "thickness" to the side of the transparent cylinder.
The solution is pretty straightforward: instead of using the screen-space result directly, we first generate its mipmap chain, then sample from different LOD levels based on the view normal of the transparent cylinder's surface.
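One way the LOD choice could look is sketched below (the exact curve in the production shader is not shown in this post, so treat the mapping as an assumption): the more the surface normal faces away from the camera, the higher the mipmap level we sample, so grazing edges get the blurriest result.

```javascript
// Pick a mipmap LOD for the fake refraction from how much the surface
// faces the camera. viewDir and surfaceNormal are unit vectors.
function refractionLod(viewDir, surfaceNormal, maxLod) {
  const dot =
    viewDir[0] * surfaceNormal[0] +
    viewDir[1] * surfaceNormal[1] +
    viewDir[2] * surfaceNormal[2];
  const facing = Math.min(Math.max(Math.abs(dot), 0), 1);
  // Facing the camera head-on → sharp (LOD 0); grazing angle → blurry.
  return (1 - facing) * maxLod;
}
```

In GLSL the same idea is a `textureLod` call with this computed level.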
However, as you can see, the linear sampling from the higher mipmap levels introduces some artifacts. The fix is pretty simple: switch to bicubic sampling, and everything just looks way smoother:
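For the curious, here is a one-dimensional sketch of bicubic filtering using cubic B-spline weights, a common choice for smoothing blocky mipmap lookups (we are not reproducing the production GLSL here): given a fractional sample position, the four nearest texels are blended.

```javascript
// Cubic B-spline weights for the four texels around a fractional
// position t in [0, 1). The weights always sum to 1.
function cubicBSplineWeights(t) {
  const t2 = t * t, t3 = t2 * t;
  return [
    (1 - 3 * t + 3 * t2 - t3) / 6,     // texel at offset -1
    (4 - 6 * t2 + 3 * t3) / 6,         // texel at offset  0
    (1 + 3 * t + 3 * t2 - 3 * t3) / 6, // texel at offset +1
    t3 / 6,                            // texel at offset +2
  ];
}

// Bicubic sampling of a 1D texel row (the 2D case applies this on
// each axis). Edges are clamped, like CLAMP_TO_EDGE wrapping.
function sampleBicubic1D(texels, x) {
  const i = Math.floor(x);
  const w = cubicBSplineWeights(x - i);
  let sum = 0;
  for (let k = -1; k <= 2; k++) {
    const idx = Math.min(Math.max(i + k, 0), texels.length - 1);
    sum += texels[idx] * w[k + 1];
  }
  return sum;
}
```

A GPU implementation usually rewrites this as a handful of bilinear fetches instead of sixteen point samples, but the weights are the same.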
Here are some more runtime screenshots of these transparent materials on other objects to demonstrate the effect:
As a bonus, we used a similar technique on the golden material to add some fake screen-space reflection. Take a look at the shiny fake reflections on the golden gift boxes:
Other things and stuff
- GPU Confetti - As in most of our projects, the particle simulation runs on the GPU. We also used an 8-bit-per-channel render target instead of a 32-bit-float one to make sure it works on most mobile devices.
- Physics - We used Oimo.js for the rigid-body physics of the letters and the pipes. It is straightforward and performant.
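The 8-bit render-target trick for the confetti simulation relies on packing higher-precision values into byte channels. A generic sketch of that workaround (not our exact shader code): a value in [0, 1] is split across two 8-bit channels, giving roughly 16 bits of precision per simulated coordinate.

```javascript
// Pack a [0, 1] value into two bytes (high byte, low byte).
function encode16(value) {
  const scaled = Math.round(Math.min(Math.max(value, 0), 1) * 65535);
  return [scaled >> 8, scaled & 0xff];
}

// Reconstruct the value from the two bytes.
function decode16([hi, lo]) {
  return (hi * 256 + lo) / 65535;
}
```

An RGBA8 texel can therefore hold two such coordinates, so particle state survives a round trip through render targets that older mobile GPUs actually support.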
It was a great experience working with the talented team at LOW, and we appreciated the professional trust they placed in us to help build this immersive experience together; it kept the whole process running smoothly. The website also won awards on FWA and AWWWARDS, which was a huge plus!