# pathtracer

A WebGPU-based path tracer.
## Overview

An interactive path tracer implemented in WGSL. It supports multiple sampling methods, physically based materials including microfacets, and realistic light sources. It was written primarily to explore WGSL and the WebGPU API, so it takes some shortcuts and keeps things straightforward.

This is a GPU "software" path tracer: there is no hardware acceleration via RT cores, so scene intersections and hit tests are done manually in the compute shader (a ray-triangle test is sketched after the feature list below).

- A single megakernel compute pass that blits the output to a viewport quad texture
- All primitives are extracted from the model, and a SAH-split BVH is constructed from them on the host (the split cost is sketched after this list)
- Expects glTF models, since the base specification's textures and PBR material model map nicely to the goals here: roughness, metallic, emission, and albedo textures, plus the normal map
- A pretty bog-standard BRDF for these material properties: a cosine-weighted hemisphere sample for the rougher materials and a GGX-distribution-based lobe for the specular (the diffuse sample is sketched after this list)
- There is no support for transmission, IOR, or alpha textures
- Multiple importance sampling with the balance heuristic (sketched after this list); two NEE rays, one for direct emissives and another for the sun
- Uses stratified, animated blue noise for all screen-space sampling, which also makes the image resolve faster
- A free cam with mouse look: the typical `[W][A][S][D]`, plus `[Q][E]` for +Z and -Z respectively, and `[SHIFT]` for a speed-up. The camera also includes a basic thin-lens approximation
- Since WebGPU doesn't have bindless textures, the suggested way of handling many textures is a storage buffer, which would require generating mipmaps on the host. I just stack the textures and call it a day here (one possible reading of that is sketched after this list)
- All shader internals work in full-fat linear color space, which is then tonemapped and clamped to sRGB upon blitting (a sketch follows this list)

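For reference, a host-side TypeScript sketch of the Möller–Trumbore ray-triangle test mentioned above and in the resources; the actual implementation lives in the WGSL megakernel, and the `Vec3` helpers here are purely illustrative.

```ts
type Vec3 = [number, number, number];

const sub = (a: Vec3, b: Vec3): Vec3 => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
const dot = (a: Vec3, b: Vec3): number => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const cross = (a: Vec3, b: Vec3): Vec3 => [
  a[1] * b[2] - a[2] * b[1],
  a[2] * b[0] - a[0] * b[2],
  a[0] * b[1] - a[1] * b[0],
];

// Möller–Trumbore ray-triangle test: returns the hit distance t, or null on a miss.
function intersectTriangle(orig: Vec3, dir: Vec3, v0: Vec3, v1: Vec3, v2: Vec3): number | null {
  const EPS = 1e-7;
  const e1 = sub(v1, v0);
  const e2 = sub(v2, v0);
  const pvec = cross(dir, e2);
  const det = dot(e1, pvec);
  if (Math.abs(det) < EPS) return null; // ray is parallel to the triangle plane
  const invDet = 1 / det;
  const tvec = sub(orig, v0);
  const u = dot(tvec, pvec) * invDet;
  if (u < 0 || u > 1) return null;
  const qvec = cross(tvec, e1);
  const v = dot(dir, qvec) * invDet;
  if (v < 0 || u + v > 1) return null;
  const t = dot(e2, qvec) * invDet;
  return t > EPS ? t : null; // only count hits in front of the ray origin
}
```
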
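The SAH split picks, per node, the candidate plane with the lowest surface-area cost. A sketch of that cost under assumed traversal/intersection constants (the `Aabb` shape and the constants are illustrative, not the project's actual values):

```ts
interface Aabb { min: [number, number, number]; max: [number, number, number]; }

// Surface area of an axis-aligned bounding box.
function surfaceArea(b: Aabb): number {
  const dx = b.max[0] - b.min[0];
  const dy = b.max[1] - b.min[1];
  const dz = b.max[2] - b.min[2];
  return 2 * (dx * dy + dy * dz + dz * dx);
}

// SAH cost of splitting a node into (left, right); the candidate split with
// the lowest cost wins, and the node stays a leaf if no split beats it.
function sahCost(
  parent: Aabb,
  left: Aabb, leftCount: number,
  right: Aabb, rightCount: number,
  cTraversal = 1.0,
  cIntersect = 2.0,
): number {
  const invParentArea = 1 / surfaceArea(parent);
  return cTraversal +
    cIntersect * surfaceArea(left) * invParentArea * leftCount +
    cIntersect * surfaceArea(right) * invParentArea * rightCount;
}
```
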
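The diffuse lobe's cosine-weighted hemisphere sample is small enough to show in full; a TypeScript sketch of the math (the WGSL version additionally rotates the result from the local z-up frame into the shading-normal basis):

```ts
// Map two uniform random numbers in [0, 1) to a unit direction on the
// z-up hemisphere, distributed with pdf = cos(theta) / pi.
function sampleCosineHemisphere(u1: number, u2: number): [number, number, number] {
  const r = Math.sqrt(u1);
  const phi = 2 * Math.PI * u2;
  return [r * Math.cos(phi), r * Math.sin(phi), Math.sqrt(Math.max(0, 1 - u1))];
}
```
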
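The MIS weight itself is just the balance heuristic; a minimal sketch of the weight used when combining two sampling strategies:

```ts
// Balance heuristic weight for a sample drawn from strategy f, given the pdf
// that the competing strategy g would have assigned to the same direction.
// nf and ng are the number of samples taken from each strategy.
function balanceHeuristic(nf: number, pdfF: number, ng: number, pdfG: number): number {
  const f = nf * pdfF;
  const g = ng * pdfG;
  return f / (f + g);
}
```
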
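One plausible reading of the texture "stacking", sketched on the host side: copy every material image into a layer of a single 2D array texture so the shader only needs one binding. The resolution, format, and the assumption that all images were already resized to a common size are illustrative.

```ts
// Pack material images into one rgba8unorm 2D array texture, one image per
// layer; mipmap generation is skipped entirely.
function createTextureStack(
  device: GPUDevice,
  images: ImageBitmap[],
  width: number,
  height: number,
): GPUTexture {
  const texture = device.createTexture({
    size: { width, height, depthOrArrayLayers: images.length },
    format: "rgba8unorm",
    usage:
      GPUTextureUsage.TEXTURE_BINDING |
      GPUTextureUsage.COPY_DST |
      GPUTextureUsage.RENDER_ATTACHMENT, // required by copyExternalImageToTexture
  });
  images.forEach((image, layer) => {
    device.queue.copyExternalImageToTexture(
      { source: image },
      { texture, origin: { x: 0, y: 0, z: layer } },
      { width, height, depthOrArrayLayers: 1 },
    );
  });
  return texture;
}
```
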
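A TypeScript sketch of the resolve-time color handling, assuming the Uncharted 2 (Hable) filmic curve from the resources below followed by the standard linear-to-sRGB transfer function; the exposure and white-point values are illustrative.

```ts
// John Hable's Uncharted 2 filmic curve, applied per channel.
function hable(x: number): number {
  const A = 0.15, B = 0.50, C = 0.10, D = 0.20, E = 0.02, F = 0.30;
  return (x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F) - E / F;
}

// Tonemap a linear HDR value, clamp it, then encode with the sRGB transfer function.
function tonemapToSrgb(linear: number, exposure = 2.0, whitePoint = 11.2): number {
  const mapped = hable(linear * exposure) / hable(whitePoint);
  const c = Math.min(Math.max(mapped, 0), 1);
  return c <= 0.0031308 ? 12.92 * c : 1.055 * Math.pow(c, 1 / 2.4) - 0.055;
}
```
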
## Local setup

```
pnpm install
pnpm run dev
```

`/public` should contain the assets. Compose the scene manually in `main.ts`: position, scale, rotate (a hypothetical sketch follows below).
All the included models are licensed under Creative Commons Attribution 4.0.

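A purely hypothetical sketch of what that composition could look like; none of the names below exist in the actual `main.ts`, they only illustrate the position/scale/rotate idea.

```ts
// Hypothetical types and asset names, for illustration only.
interface SceneEntry {
  gltfPath: string;                    // asset placed under /public
  position: [number, number, number];
  scale: [number, number, number];
  rotation: [number, number, number];  // Euler angles in radians
}

const scene: SceneEntry[] = [
  { gltfPath: "/model.glb", position: [0, 1, 0], scale: [1, 1, 1], rotation: [0, Math.PI / 2, 0] },
  { gltfPath: "/floor.glb", position: [0, 0, 0], scale: [4, 1, 4], rotation: [0, 0, 0] },
];
```
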
## To-do

- Direct lighting NEE from an HDR equirectangular map

### Resources

- [WebGPU Specification](https://www.w3.org/TR/webgpu/)
- [WGSL Specification](https://www.w3.org/TR/WGSL/)
- **Jacco Bikker:** [Invaluable BVH resource](https://jacco.ompf2.com/about-me/)
- **Christoph Peters:** [Math for importance sampling](https://momentsingraphics.de/)
- **Jakub Boksansky:** [Crash Course in BRDF Implementation](https://boksajak.github.io/files/CrashCourseBRDF.pdf)
- **Brent Burley:** [Physically Based Shading at Disney](https://media.disneyanimation.com/uploads/production/publication_asset/48/asset/s2012_pbs_disney_brdf_notes_v3.pdf)
- **Möller–Trumbore:** [Ray-triangle intersection test](http://www.graphics.cornell.edu/pubs/1997/MT97.pdf)
- **Pixar ONB:** [Building an Orthonormal Basis, Revisited](https://www.jcgt.org/published/0006/01/01/paper-lowres.pdf)
- **Uncharted 2 tonemap:** [Uncharted 2: HDR Lighting](https://www.gdcvault.com/play/1012351/Uncharted-2-HDR)
- **Frostbite BRDF:** [Moving Frostbite to Physically Based Rendering 2.0](https://media.contentapi.ea.com/content/dam/eacom/frostbite/files/course-notes-moving-frostbite-to-pbr-v2.pdf)
- **Reference books:** Ray Tracing Gems 1 and 2, Physically Based Rendering (4th edition)