Because of how and when draws occur in our Oculus Mobile path, a restart could attempt to draw a frame after lovrGraphicsDestroy() was called, leading to a crash in lovrGraphicsSetCamera(). This change blocks draws until the restart has finished and renderTo() has been called (conveniently detectable using the existing state.renderCallback).
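A minimal sketch of the guard, assuming a simplified state struct (only state.renderCallback, renderTo(), lovrGraphicsDestroy(), and lovrGraphicsSetCamera() come from the source; everything else here is invented for illustration):

```c
#include <stdbool.h>
#include <stddef.h>

// Hypothetical, simplified graphics state.  In the real code,
// state.renderCallback is set by renderTo(), which makes it a convenient
// signal that the post-restart graphics state exists again.
typedef struct {
  void (*renderCallback)(void* userdata);
  void* userdata;
} GraphicsState;

static GraphicsState state;

static bool lovrCanDraw(void) {
  return state.renderCallback != NULL;
}

// Called by the VR main loop each frame.
static void drawFrame(void) {
  if (!lovrCanDraw()) {
    return; // restart in progress: skip the draw instead of crashing
  }
  state.renderCallback(state.userdata);
}

// Example callback, standing in for whatever renderTo() installs.
static void onRender(void* userdata) {
  (void) userdata;
}
```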
Returns the predicted display time, which is the estimated time at which
the photons of the next frame will hit the eyeballs of a person in the HMD.
Use this instead of lovr.timer.getTime when rendering anything
time-dependent. Simulation updates, game logic, and other code that
needs high-frequency timestamps should still use lovr.timer.getTime.
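For example, in Lua (the draw body is illustrative, and `world` is a hypothetical physics world; only getDisplayTime and getTime are from the source):

```lua
function lovr.update(dt)
  world:update(dt)  -- simulation: ordinary frame timing is fine here
end

function lovr.draw()
  local t = lovr.headset.getDisplayTime()      -- predicted photon time
  lovr.graphics.cube('fill', 0, 1, -2, .5, t)  -- spin matches display time
end
```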
It's still a rough draft and likely only works on my machine, but can be
improved over time.
Rough explanation:
- tup.config contains high-level build configuration defaults.
- Tuprules.tup contains mostly compiler flags (generated from the
tup.config) and declares some macros used to compile code.
- Tupfile takes all generated object files and links them into the
lovr executable.
- src/Tupdefault defines the default build steps for src and all
subdirectories, which is to compile all .c files to .o files and put
them in the <objects> bucket for linking by the top-level Tupfile.
It's possible to have multiple configs active at once for different
platforms, projects, etc. To do this, create a folder for each build
variant you want, and place a tup.config in each folder (it can be a
symlink, which is helpful). Then, invoking `tup` will build all your
variants, or you can build a specific one by doing `tup <foldername>`.
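For example, a hypothetical two-variant layout (folder names and CONFIG_ keys are invented; tup.config entries use tup's CONFIG_ prefix):

```
build-linux/tup.config      (symlink to a shared config, if you like)
build-android/tup.config

# build-linux/tup.config might contain:
CONFIG_PLATFORM=linux
CONFIG_CC=clang
```

With this layout, `tup` builds both variants and `tup build-linux` builds just that one.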
- Ref struct only stores refcount now and is more general.
- Proxy stores a hash of its type name instead of an enum.
- Variants store additional information instead of using a vtable.
- Remove the concept of superclasses from the API.
- Clean up some miscellaneous includes.
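A hypothetical sketch of the resulting shapes (field and function names are guesses, not the actual source):

```c
#include <stddef.h>
#include <stdint.h>

// Hypothetical: Ref carries only a reference count now.  With no
// destructor pointer or type tag, any object can embed it.
typedef struct {
  uint32_t count;
} Ref;

// Hypothetical: a Proxy identifies its type by hashing the type name,
// so adding a type doesn't require extending a central enum.
typedef struct {
  uint64_t hash;
  void* object;
} Proxy;

// Toy FNV-1a hash standing in for whatever the real code uses.
static uint64_t hash64(const char* str) {
  uint64_t h = 0xcbf29ce484222325u;
  while (*str) h = (h ^ (uint8_t) *str++) * 0x100000001b3u;
  return h;
}
```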
If we expose both unhanded hands and handed hands, people need to
deal with handling (haha) both cases in their apps. It's simpler
to always deal with left and right hands, even though it is a bit
less general. This is also congruent with the current state of
OpenVR and OpenXR, and I think there are still open questions about
the less common cases where there are more than two hands.