The new libOBS API allows us to access the underlying API directly instead of having to poke around in raw memory. By using it we avoid crashing when libOBS was built with a different compiler, or when the actual back-end structure changes.
Additionally, the mostly unimplemented and unused options have been removed, which streamlines the use of this class even further and reduces both shader and code complexity.
Finally, by optimizing the use of the internal render target, we achieve a speed-up of up to 3000% over the old approach, allowing for many more mipmapped filters.
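As a rough illustration of the render-target optimization (a hedged sketch: the struct and function names are hypothetical, only the gs_texrender_* calls are real libOBS API), the idea is to keep one render target alive and reset it per frame instead of recreating GPU resources:

```cpp
#include <obs-module.h>

struct rt_cache {
	gs_texrender_t *rt = nullptr; // kept alive across frames
};

static gs_texture_t *render_cached(rt_cache &cache, uint32_t width, uint32_t height)
{
	if (!cache.rt)
		cache.rt = gs_texrender_create(GS_RGBA, GS_ZS_NONE);

	gs_texrender_reset(cache.rt); // cheap per-frame reset, no reallocation
	if (gs_texrender_begin(cache.rt, width, height)) {
		// ... draw the source / build the mip chain here ...
		gs_texrender_end(cache.rt);
	}
	return gs_texrender_get_texture(cache.rt);
}
```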
By converting the code to a threaded, asynchronous approach, the libOBS video renderer no longer has to wait for our tracking code to run, and we gain a little extra calculation time before we actually have to do anything.
However, due to the remaining synchronization with the Direct3D11/OpenGL context, it is not entirely safe to spend a full frame on tracking, as libOBS will then start skipping/dropping frames. Even though the priority of the stream is now increased, we still can't just sit around and have to finish all work quickly.
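A minimal sketch of the threading idea, assuming a simple future-based worker (all type and function names below are illustrative, not the plugin's actual code):

```cpp
#include <chrono>
#include <future>

struct frame_data { /* pixels handed over from video_tick */ };
struct tracking_result { float crop_x = 0.f, crop_y = 0.f, zoom = 1.f; };

// Stand-in for the actual (expensive) tracking call.
static tracking_result run_tracking(frame_data) { return {}; }

struct tracker {
	std::future<tracking_result> pending;
	tracking_result latest; // last finished result

	void tick(frame_data frame)
	{
		if (!pending.valid()) // no job in flight, start one
			pending = std::async(std::launch::async, run_tracking, frame);
	}

	void render()
	{
		using namespace std::chrono_literals;
		// Poll instead of blocking: if tracking is still busy, reuse the
		// previous result so the libOBS renderer never waits on us.
		if (pending.valid() && pending.wait_for(0ms) == std::future_status::ready)
			latest = pending.get();
		// ... render using 'latest' ...
	}
};
```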
Related #150
Previously, sources had to implement migration code manually, which resulted in unresolvable regressions due to the lack of version and commit tagging. With the new migration code, all sources carry this version and commit tag at all times, so a temporary regression can now be fixed without the user having to change any values manually.
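A minimal sketch of what version-tagged migration can look like, assuming settings live in an obs_data_t; the key name, version numbers, and step functions are hypothetical, only the obs_data_* calls are real libOBS API:

```cpp
#include <obs-module.h>

static const long long CURRENT_VERSION = 2; // illustrative

static void migrate_v0_to_v1(obs_data_t *) { /* rename/rescale old values */ }
static void migrate_v1_to_v2(obs_data_t *) { /* undo a temporary regression */ }

static void migrate(obs_data_t *settings)
{
	// Returns 0 for old save files that carry no tag yet.
	long long version = obs_data_get_int(settings, "version");

	// Step forward one version at a time so every old file ends up current.
	if (version < 1)
		migrate_v0_to_v1(settings);
	if (version < 2)
		migrate_v1_to_v2(settings);

	obs_data_set_int(settings, "version", CURRENT_VERSION);
}
```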
This header contains all data common to the headers used in the plugin. It should improve cross-platform compilation wherever possible, as all platform-dependent common includes and defines can be done in one place.
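Such a header might look roughly like this (a sketch; the concrete includes and macros are purely illustrative):

```cpp
#pragma once

// Standard includes shared by the whole plugin.
#include <cstddef>
#include <cstdint>
#include <string>

// All platform-dependent common includes and defines live here, once.
#if defined(_WIN32)
#include <windows.h>
#define PLUGIN_EXPORT __declspec(dllexport)
#else
#define PLUGIN_EXPORT __attribute__((visibility("default")))
#endif
```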
Ever wished you had a professional camera operator to highlight and follow the action, ensuring the audience never misses a beat? Thanks to NVIDIA, you can now do this at home for free! The new NVIDIA AR SDK unlocks augmented reality features, including motion tracking for faces.
This allows me to provide you with an automated zoom and cropping solution for your video camera to transform your streams into a slick, polished broadcast, where you’ll always be the star of the show. Don’t forget: everything is customizable, so the possibilities are endless. You can even recreate that Futurama squinting meme if you want to (with some scripting)!
The filter requires compatible NVIDIA RTX hardware and the NVIDIA AR SDK Runtime to be installed ahead of time. This filter is considered "stable" and shouldn't change much from version to version.
These allow you to apply any kind of filtering to any source, using just standard HLSL. Just like transitions, one extra parameter is set, called 'InputA'.
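On the plugin side, the source texture might be handed to that parameter roughly like this (a hedged sketch; apart from the 'InputA' name, only the libOBS calls are real, the rest is illustrative):

```cpp
#include <obs-module.h>

static void bind_input(gs_effect_t *effect, gs_texture_t *source_tex)
{
	// Look up the extra parameter exposed to custom shaders...
	gs_eparam_t *input_a = gs_effect_get_param_by_name(effect, "InputA");
	// ...and point it at the texture of the filtered source.
	if (input_a)
		gs_effect_set_texture(input_a, source_tex);
}
```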
Fixes #95
This fixes #116, which was caused by a refactor in commit efb6b0b9be. The bug went undiscovered until users started upgrading from the last stable version to the current pre-release.
Previously, a wrong blend state caused slight discoloration on transparent sources, as they were assumed to always be fully opaque. By relying on OBS to do the rendering instead, we no longer have to deal with blend states as much and can simply enjoy the result.
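A hedged sketch of the render path this implies, using the standard libOBS filter helpers that manage blend state for us (the struct and callback names are illustrative):

```cpp
#include <obs-module.h>

struct filter_data {
	obs_source_t *context = nullptr; // set in the filter's create callback
};

static void filter_video_render(void *data, gs_effect_t *)
{
	auto *self = static_cast<filter_data *>(data);

	// Let libOBS capture the filter's target with the blend state it
	// considers correct, instead of configuring one ourselves.
	if (!obs_source_process_filter_begin(self->context, GS_RGBA, OBS_ALLOW_DIRECT_RENDERING)) {
		obs_source_skip_video_filter(self->context);
		return;
	}

	obs_source_t *target = obs_filter_get_target(self->context);
	uint32_t width = obs_source_get_base_width(target);
	uint32_t height = obs_source_get_base_height(target);

	// libOBS draws the result for us, again with correct blending.
	obs_source_process_filter_end(self->context, obs_get_base_effect(OBS_EFFECT_DEFAULT), width, height);
}
```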
Fixes #104
This drastically improves stability and keeps exceptions from leaking into libOBS C code, preventing crashes and unexpected freezes caused by exception handlers further down the stack.
Additionally, minor work was done to further improve the quality and user experience of the filter.
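A minimal sketch of such an exception barrier, assuming a C callback registered with libOBS (everything except the blog() call is illustrative):

```cpp
#include <obs-module.h>
#include <util/base.h>

#include <exception>

// Function-try-block: nothing thrown inside can unwind into libOBS C code.
static void filter_video_tick(void *data, float seconds)
try {
	(void)data;
	(void)seconds;
	// ... actual per-frame work that may throw ...
} catch (const std::exception &ex) {
	blog(LOG_ERROR, "Unexpected exception: %s", ex.what());
} catch (...) {
	blog(LOG_ERROR, "Unexpected unknown exception.");
}
```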