The goal is to support more combinations of shader stages, both for
vertex-only shaders now and for mesh/raytracing shaders in the future.
In general, most of the logic that conflated "stage count" with "shader
type" has been changed to look at individual shader stages.
This allows modules to be initialized and destroyed from multiple
threads in any order. Previously, the first thread to require a module
also had to be the last thread to use it; otherwise the module would be
destroyed too early.
There are still a few issues: if the main thread doesn't require a
module, it won't pick up the conf.lua settings; the graphics module
isn't handling shader cache writing properly; and I think this breaks
the headset-graphics refcounting. These will be fixed in future
commits.
- If you load an image with a non-blittable format, mipmap count will
default to 1.
- If you explicitly request mipmaps on a non-blank texture with a
  non-blittable format, that's an error (rather than leaving the mipmap
  chain undefined or silently falling back to 1 mipmap).
- 'sample' now implies both sample and linear filtering (practically always
true for all formats lovr supports)
- 'render' now includes 'blend' for color formats (also practically
always true except for r32f on some old mobile GPUs)
- 'blit' now includes 'blitsrc'/'blitdst' because lovr doesn't support
blitting between textures with different formats
- 'atomic' is removed because lovr doesn't really support atomic images yet
These are methods that implement some common raycast behaviors without
creating a closure. They also take tag filters. The tag filters could
be optimized to use tag numbers instead of string comparisons, and the
default raycast method could take a tag filter, but I didn't want to
modify the C API while the Jolt work is in progress.