Originally intended to be an experiment with no future, it turned out to be very popular with streamers who move around a lot. In the end it was popular enough that NVIDIA added their own variant to their Broadcast software, which works decently enough. Unfortunately my wrapper code around the library was written very poorly, so it didn't take long for it to break seemingly out of nowhere.
'Perspective' and 'Orthographic' work great if you know the parameters that generated the exact object position, but what if you don't? That is where 'Corner Pin' comes in! With it you can specify the exact location of every corner down to the micro-pixel, instead of fiddling with parameters.
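For the curious, corner pinning boils down to the classic unit-square-to-quad projective mapping (Heckbert's square-to-quad formulation). A minimal sketch of the math, not necessarily the plugin's exact code:

```cpp
// Classic square-to-quad projective mapping (Heckbert); illustrative only.
struct Vec2 { float x, y; };

// Maps (u, v) in [0,1]^2 onto a quad with corners c[0..3] given in the
// order top-left, top-right, bottom-right, bottom-left.
Vec2 corner_pin(const Vec2 c[4], float u, float v)
{
    // Non-zero when the quad is not a parallelogram (perspective case).
    float sx = c[0].x - c[1].x + c[2].x - c[3].x;
    float sy = c[0].y - c[1].y + c[2].y - c[3].y;

    float g = 0.0f, h = 0.0f;
    if (sx != 0.0f || sy != 0.0f) {
        float dx1 = c[1].x - c[2].x, dx2 = c[3].x - c[2].x;
        float dy1 = c[1].y - c[2].y, dy2 = c[3].y - c[2].y;
        float det = dx1 * dy2 - dx2 * dy1;
        g = (sx * dy2 - dx2 * sy) / det;
        h = (dx1 * sy - sx * dy1) / det;
    }
    float a = c[1].x - c[0].x + g * c[1].x;
    float b = c[3].x - c[0].x + h * c[3].x;
    float d = c[1].y - c[0].y + g * c[1].y;
    float e = c[3].y - c[0].y + h * c[3].y;

    // Homogeneous divide gives the perspective-correct position.
    float w = g * u + h * v + 1.0f;
    return {(a * u + b * v + c[0].x) / w, (d * u + e * v + c[0].y) / w};
}
```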
Fixes #565
As people appear to be far too willing to mess with settings they have no reason to touch, removing these seems like the best option. Both can still be set if you know where to look, and neither is actually required for operation.
The previous name was too strict about what could be put into the effect, and would result in additional clutter in the Filter menu once we eventually decide to support upscaling methods other than Super-Resolution networks.
While the long descriptions were useful, keeping them updated and translated is pretty much impossible. Technology moves fast, and not everyone who translates the project knows a lot about technology.
Therefore the long descriptions have been replaced with a button that opens the wiki page for the feature instead. This should drastically reduce the number of support cases and improve translation coverage at the same time.
There is hardly any reason for us to recalculate everything all the time. LUTs let us cache the work once and re-use it whenever necessary, reducing the cost of Color Grading by almost 60% (on some GPUs even more). Additionally, this fixes the negative gamma issue that plagued the filter for a while.
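As a rough illustration of the caching idea (hypothetical names, not the filter's actual code): the grading function is evaluated once per LUT entry, and every pixel afterwards is just a cheap lookup:

```cpp
#include <array>
#include <cmath>
#include <cstddef>

// Hypothetical stand-in for the filter's grading parameters.
struct Grade { float gamma; float gain; };

// Bake: evaluate the (expensive) grading function once per LUT entry.
template <std::size_t N>
std::array<float, N> bake_lut(const Grade& g)
{
    std::array<float, N> lut{};
    for (std::size_t i = 0; i < N; ++i) {
        float x = static_cast<float>(i) / static_cast<float>(N - 1);
        // Guard against non-positive gamma (illustrative; not necessarily
        // how the actual negative-gamma fix works).
        float gamma = std::fmax(g.gamma, 1e-6f);
        lut[i] = std::pow(x, 1.0f / gamma) * g.gain;
    }
    return lut;
}

// Apply: per pixel, grading is now one interpolated table read.
template <std::size_t N>
float apply_lut(const std::array<float, N>& lut, float x)
{
    float f = std::fmin(std::fmax(x, 0.0f), 1.0f) * static_cast<float>(N - 1);
    std::size_t i0 = static_cast<std::size_t>(f);
    std::size_t i1 = (i0 + 1 < N) ? i0 + 1 : N - 1;
    float t = f - static_cast<float>(i0);
    return lut[i0] + (lut[i1] - lut[i0]) * t;
}
```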
In the future, once PR 4199 (https://github.com/obsproject/obs-studio/pull/4199) has been merged, we can cut away one intermediate rendering step currently required to make the effect work. Hopefully this will be with the 27.x release of OBS Studio.
Adds support for the AMD Advanced Media Framework H.264 and H.265 encoders via FFmpeg. The majority of settings are supported, and the UI/UX mimics that of the NVENC implementation. Various settings are left out due to their complexity and should instead be controlled via the custom parameters field.
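For reference, a minimal sketch of opening the AMF H.264 encoder through FFmpeg's API; `usage` is one of h264_amf's real private options, but the setup around it is simplified:

```cpp
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
}

// Opens FFmpeg's AMF H.264 encoder; error handling trimmed for brevity.
AVCodecContext* open_amf_h264(int width, int height)
{
    const AVCodec* codec = avcodec_find_encoder_by_name("h264_amf");
    if (!codec)
        return nullptr; // no AMF support in this FFmpeg build

    AVCodecContext* ctx = avcodec_alloc_context3(codec);
    ctx->width     = width;
    ctx->height    = height;
    ctx->time_base = AVRational{1, 60};
    ctx->pix_fmt   = AV_PIX_FMT_NV12;

    // UI settings map onto private options like this one; anything the UI
    // leaves out can be passed the same way via the custom parameters field.
    av_opt_set(ctx->priv_data, "usage", "transcoding", 0);

    if (avcodec_open2(ctx, codec, nullptr) < 0) {
        avcodec_free_context(&ctx);
        return nullptr;
    }
    return ctx;
}
```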
Implements a manual and automatic update checker with support for both the release and testing update channels, allowing users to stay as up to date as possible. It is fully compliant with privacy regulations around the world, as it stays completely silent and inactive until the user gives the OK to connect to GitHub for the latest releases.
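Conceptually, the check is a single opt-in request to GitHub's public releases endpoint, roughly like this libcurl sketch (OWNER/REPO stands in for the actual repository):

```cpp
#include <curl/curl.h>
#include <string>

static size_t collect(char* ptr, size_t size, size_t nmemb, void* userdata)
{
    static_cast<std::string*>(userdata)->append(ptr, size * nmemb);
    return size * nmemb;
}

// Returns the latest-release JSON, or an empty string. Nothing at all runs
// until the user has explicitly opted in.
std::string fetch_latest_release_json(bool user_consented)
{
    if (!user_consented)
        return {};

    std::string body;
    CURL* curl = curl_easy_init();
    if (!curl)
        return {};
    // OWNER/REPO is a placeholder for the actual repository.
    curl_easy_setopt(curl, CURLOPT_URL,
                     "https://api.github.com/repos/OWNER/REPO/releases/latest");
    curl_easy_setopt(curl, CURLOPT_USERAGENT, "update-checker");
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, collect);
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &body);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return body; // parse "tag_name" and compare against the running version
}
```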
* "Quality" Minimum/Maximum is actually QP Minimum/Maximum
* Bitrate Limits is now just Limits
* Buffer Size and Quality Target have been moved into "Limits".
The high-priority CUDA stream causes libOBS work to run at a lower priority than tracking, which is not what we want. Instead we want tracking to be left incomplete in those cases, rather than slowing down encoding and other things.
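The fix amounts to creating the tracking stream at the lowest available priority rather than the highest. A minimal sketch using the CUDA runtime API:

```cpp
#include <cuda_runtime.h>

// Create the tracking stream at the lowest priority so it yields to
// encoding and rendering, instead of the other way around.
cudaStream_t make_tracking_stream()
{
    int least = 0, greatest = 0;
    // "least" is the numerically largest value, i.e. the lowest priority.
    cudaDeviceGetStreamPriorityRange(&least, &greatest);

    cudaStream_t stream = nullptr;
    cudaStreamCreateWithPriority(&stream, cudaStreamNonBlocking, least);
    return stream;
}
```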
Geometry updates are also now done once per frame instead of once per tracking update, which should improve smoothness without affecting performance too much. Additionally, all tracking info is now in the 0..1 range, which drastically simplifies some math, especially with texture coordinates.
To deal with tracking and frame updates being asynchronous, a very simple approximation of movement velocity has been added. It is mostly wrong, but it can bridge the gap where tracking updates are slower, as the values are all filtered anyway.
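A sketch of what such an approximation can look like (hypothetical names; plain first-order extrapolation):

```cpp
// Hypothetical names; plain first-order extrapolation in 0..1 coordinates.
struct TrackState {
    float pos[2]      = {0.5f, 0.5f};
    float vel[2]      = {0.0f, 0.0f};
    float last_update = 0.0f; // seconds
};

// Called whenever a tracking result arrives (often slower than the frame rate).
void on_tracking_update(TrackState& s, float now, const float new_pos[2])
{
    float dt = now - s.last_update;
    if (dt > 0.0f) {
        // Crude and "mostly wrong", but good enough since downstream
        // values are filtered anyway.
        s.vel[0] = (new_pos[0] - s.pos[0]) / dt;
        s.vel[1] = (new_pos[1] - s.pos[1]) / dt;
    }
    s.pos[0]      = new_pos[0];
    s.pos[1]      = new_pos[1];
    s.last_update = now;
}

// Called once per rendered frame: extrapolate between tracking updates.
void predict(const TrackState& s, float now, float out[2])
{
    float dt = now - s.last_update;
    out[0] = s.pos[0] + s.vel[0] * dt;
    out[1] = s.pos[1] + s.vel[1] * dt;
}
```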
As OBS Studio locks some mutexes in a different order depending on what is being done, using modified_properties for GPU work causes things to freeze in place. Instead, users now manually click the refresh button when they change files, which prevents this freeze from happening.
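In libobs terms, the reload moves out of the modified_properties callback and into a plain button callback, roughly like this (`my_filter` and its reload helper are hypothetical):

```cpp
#include <obs-module.h>

struct my_filter;                        // hypothetical filter state
void my_filter_reload_files(my_filter*); // hypothetical GPU-side reload

static bool refresh_clicked(obs_properties_t*, obs_property_t*, void* data)
{
    // GPU work happens here, in a plain button callback, instead of inside
    // a modified_properties handler that runs under libobs' own locks.
    my_filter_reload_files(static_cast<my_filter*>(data));
    return false; // the property list itself did not change
}

static obs_properties_t* get_properties(void* data)
{
    obs_properties_t* props = obs_properties_create();
    obs_properties_add_button2(props, "refresh", "Refresh", refresh_clicked, data);
    return props;
}
```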
Fixes #118
Ever wished you had a professional camera operator to highlight and follow the action, ensuring the audience never misses a beat? Thanks to NVIDIA, you can now do this at home for free! The new NVIDIA AR SDK unlocks augmented reality features, including motion tracking for faces.
This allows me to provide you with an automated zoom and cropping solution for your video camera to transform your streams into a slick, polished broadcast, where you’ll always be the star of the show. Don’t forget - everything is customizable so the possibilities are endless. You can even recreate that Futurama squinting meme if you wanted to (with some scripting)!
The filter requires compatible NVIDIA RTX hardware and the NVIDIA AR SDK Runtime to be installed ahead of time. This filter is considered "stable" and shouldn't change much from version to version.
Adds support for specifying the Minimum Bitrate directly in the UI instead of requiring custom settings to do so. Additionally, Adaptive I/B-Frames are now only shown when Look-Ahead is set to more than 0 frames.
Quality Minimum can now also be left at its default value of -1, the Quality group is no longer toggleable, and Quality Target has moved into the group. Searching for options on the settings context now also covers children (if there are any).
Finally, some of the code was cleaned up to C++17 style.
Fixes #101
This drastically improves stability and prevents all exceptions from leaking into libobs C code, which prevents crashes and unexpected freezes caused by exception handlers further down the stack.
Additionally, minor work was done to further improve the quality and user experience of the filter.
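The pattern is a try/catch boundary on every C callback handed to libobs, roughly like this sketch (`my_filter` is hypothetical):

```cpp
#include <obs-module.h>
#include <exception>

struct my_filter { // hypothetical filter state
    my_filter(obs_data_t*, obs_source_t*) {}
};

// Function-try-block: no C++ exception can unwind into libobs' C code.
static void* filter_create(obs_data_t* settings, obs_source_t* source) try {
    return new my_filter(settings, source);
} catch (const std::exception& ex) {
    blog(LOG_ERROR, "filter_create failed: %s", ex.what());
    return nullptr;
} catch (...) {
    blog(LOG_ERROR, "filter_create failed with an unknown error");
    return nullptr;
}
```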
Caching the output of a source is only necessary for sources that are really expensive to render, so it is now disabled by default. Thanks to that, most Source Mirrors are now "free" instead of requiring two context switches and a texture, while the really expensive ones can be manually set to cache.
The scaling mode is also set to Disabled instead of Point when rescaling is off, further improving performance; the previous method incorrectly caused an extra texture to be used.
Additionally, we now have support for debug markers for graphics debugging, allowing us to precisely measure improvements in rendering cost for this source.
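With debug markers (e.g. libobs' gs_debug_marker_begin/gs_debug_marker_end), the mirror's draw path can be bracketed so tools like RenderDoc attribute cost to this source. A minimal sketch:

```cpp
#include <obs.h>

// Bracket the mirror's draw path so graphics debuggers (RenderDoc, PIX)
// can attribute GPU cost to this source.
void render_mirror()
{
    static const float marker_color[4] = {0.2f, 0.6f, 1.0f, 1.0f};
    gs_debug_marker_begin(marker_color, "Source Mirror");
    // ... actual draw calls for the mirrored source ...
    gs_debug_marker_end();
}
```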