The rendering code here was some of the older code in the project, riddled with bugs and leaks. The new code doesn't look pretty, but it works for the time being until a better solution is found or made. It should be able to handle HDR inputs now, though it may not be completely correct yet. This also fixes the double-free bug.
As an additional improvement, I've moved the allocation of the effect into a shared class, which should slightly reduce memory usage when multiple effects are in play (see the sketch below). Selecting nothing now selects the filter's target itself, without endlessly adding references to the filter. Good enough in my eyes.
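For illustration, this is a minimal sketch of the shared-allocation idea, with placeholder names rather than the actual classes in the code base:

```cpp
// Minimal sketch of sharing one compiled effect between filter instances.
// 'effect' is a stand-in for the real graphics effect wrapper used by the plugin.
#include <map>
#include <memory>
#include <mutex>
#include <string>

struct effect {
	explicit effect(std::string file) : file(std::move(file)) { /* compile the shader here */ }
	std::string file;
};

class effect_cache {
	std::mutex                                   _lock;
	std::map<std::string, std::weak_ptr<effect>> _cache;

public:
	std::shared_ptr<effect> get(const std::string& file)
	{
		std::lock_guard<std::mutex> guard(_lock);
		if (auto cached = _cache[file].lock())
			return cached; // Re-use the already compiled effect.
		auto fresh   = std::make_shared<effect>(file); // Compile only once per file.
		_cache[file] = fresh;
		return fresh;
	}
};
```

Shared ownership also means the effect is released exactly once, when the last filter lets go of it, which is the same property that rules out the double-free.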
Fixes #819
It turns out it is possible to encode at a different framerate than the one libOBS is configured for. While technically any framerate appears to be possible, it is currently limited to integer fractions only in order to keep the implementation simple: an integer fraction just requires keeping every Nth frame and multiplying the framerate denominator by N, where N is the configured integer (see the sketch below). For sanity reasons, the limit of N is currently 10.
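The core of the logic looks roughly like this (illustrative names only, not the actual implementation):

```cpp
// Sketch of the "integer fraction" framerate logic: keep one out of every N
// frames and stretch the timebase by N.
#include <cstdint>

struct fraction {
	uint32_t num;
	uint32_t den;
};

class frame_divider {
	uint32_t _divisor;     // N, limited to 1..10 for sanity.
	uint32_t _counter = 0;

public:
	explicit frame_divider(uint32_t divisor) : _divisor(divisor) {}

	// Example: 60000/1001 with N=2 becomes 60000/2002, i.e. ~29.97 FPS.
	fraction adjust(fraction obs_rate) const
	{
		return {obs_rate.num, obs_rate.den * _divisor};
	}

	// Returns true for exactly one frame out of every N.
	bool keep_frame()
	{
		bool keep = (_counter == 0);
		_counter  = (_counter + 1) % _divisor;
		return keep;
	}
};
```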
This allows power users to split their streaming and recording framerates with relative ease, and opt for things such as:
- 30 FPS (1/4) streaming with 120 FPS (1/1) recording.
- 30 FPS (1/10) streaming with 300 FPS (1/1) recording.
- 30 FPS (1/10) streaming with 100 FPS (1/3) recording.
- and so on.
While some of these combinations are just stupid, they are now available to power users.
To ensure better stability of future releases, we need to adopt multiple stages in the release cycle. Since we already label Alpha, Beta, Candidate and Stable differently, simply adopting this classification system gives us everything we need. It also lets us maintain compatibility with the existing system, while offering something entirely new.
When FFmpeg Encoders was originally written, FFmpeg 4.2 was still new and OBS Studio did not seem to want to update to anything newer for a while. This led to code being fine-tuned for FFmpeg 4.2, which stops working the moment OBS Studio upgrades FFmpeg. This change hopefully removes the hard dependency on FFmpeg 4.2 and allows newer FFmpeg versions to be used - or perhaps even older ones. The rough idea is sketched below.
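Version-specific behavior is better hidden behind compile-time checks than hard-coded against 4.2. A sketch of that pattern (the version constant should correspond to the libavcodec release that shipped with FFmpeg 4.2, and the branch contents are only placeholders):

```cpp
// Guard version-specific FFmpeg usage at compile time instead of assuming 4.2.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/version.h>
}

static void configure_codec(AVCodecContext* ctx)
{
#if LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(58, 54, 100) // roughly FFmpeg 4.2 and newer
	// Use the newer option or field here.
	ctx->thread_count = 0; // e.g. let FFmpeg pick the thread count automatically.
#else
	// Fall back to something older releases understand.
	ctx->thread_count = 4;
#endif
}
```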
Additionally, the nonsensical behavior of the Target Quality slider has been fixed: it now ranges from 0 to 51 instead of from 0 to 100, and as such matches FFmpeg exactly.
Implements the File, Source and Enumeration types for Texture shader inputs, completing the initial Shader implementation.
Related: #5
Co-authored-by: Michael Fabian 'Xaymar' Dirks <info@xaymar.com>
Originally intended to be an experiment with no future, it turned out to be very popular with streamers who move a lot. In the end it was popular enough that NVIDIA added their own variant to their Broadcast software, which works decently enough. Unfortunately my wrapper code around the library was written very poorly, so it didn't take long for it to break seemingly out of nowhere.
'Perspective' and 'Orthographic' work great if you know which parameters generate the exact object position you want, but what if you don't know them? That is where 'Corner Pin' comes in! With it you can specify the exact location of every corner down to the micro-pixel, instead of fiddling with parameters. A rough sketch of the idea follows.
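Conceptually, the quad's vertices are simply placed wherever the user put the corners (illustrative stand-in types, not the actual transform code):

```cpp
// 'Corner Pin' sketch: instead of deriving the quad from perspective or
// orthographic parameters, place each vertex exactly at the user-given corner.
#include <array>

struct vec2 {
	float x, y;
};

struct vertex {
	vec2 position; // Where the corner ends up on screen.
	vec2 uv;       // Which corner of the source texture it samples.
};

// corners = {top-left, top-right, bottom-left, bottom-right} in scene units.
std::array<vertex, 4> build_corner_pin_quad(const std::array<vec2, 4>& corners)
{
	return {{
		{corners[0], {0.0f, 0.0f}},
		{corners[1], {1.0f, 0.0f}},
		{corners[2], {0.0f, 1.0f}},
		{corners[3], {1.0f, 1.0f}},
	}};
}
```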
Fixes #565
As people appear to be far too willing to mess with settings they have absolutely no reason to touch, removing these seems like the best option. Both of these can still be set if you know where to look, and neither is actually required for operation at all.
The previous name was too strict about what could be put into the effect, and would result in additional clutter in the Filter menu once we eventually decide to support upscaling methods other than Super-Resolution networks.
While the long descriptions were useful, keeping them updated and translated is pretty much impossible. Technology moves fast, and not everyone who translates the project knows a lot about technology.
Therefore the long descriptions have now been replaced with a button that opens the wiki page for the feature instead. This should drastically reduce the number of help cases, and improve the translation coverage at the same time.
There is hardly any reason for us to recalculate everything all the time. With LUTs the work can be cached once and re-used whenever necessary, reducing the impact of Color Grading by almost 60% (on some GPUs even more). Additionally this fixes the negative gamma issue, which plagued the filter for a while. The caching approach is sketched below.
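The idea, with illustrative stand-in names rather than the filter's real classes:

```cpp
// Bake the entire grading chain into a lookup texture once, rebuild it only
// when a setting changes, and apply it with a single lookup per pixel per frame.
struct texture {}; // Stand-in for a GPU texture handle.

class color_grade_cache {
	bool    _dirty = true; // Set whenever a grading parameter changes.
	texture _lut;          // The cached lookup texture.

public:
	void invalidate() { _dirty = true; }

	void render(const texture& source, texture& target)
	{
		if (_dirty) {
			_lut   = bake_lut(); // Run the expensive grading math exactly once.
			_dirty = false;
		}
		apply_lut(source, _lut, target); // Cheap per-frame pass.
	}

private:
	texture bake_lut() { return {}; }                              // Real code renders the LUT on the GPU.
	void    apply_lut(const texture&, const texture&, texture&) {} // Real code samples the LUT in a shader.
};
```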
In the future, once PR 4199 (https://github.com/obsproject/obs-studio/pull/4199) has been merged, we can cut away one intermediate rendering step currently required to make the effect work. Hopefully this will be with the 27.x release of OBS Studio.
Adds support for the AMD Advanced Media Framework H.264 and H.265 encoders via FFmpeg. The majority of settings are supported, and the UI/UX experience mimics that of the NVENC implementation. Various settings are left out due to their complexity and should be controlled via the custom parameters field.
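Under the hood this goes through FFmpeg's named encoders; a rough sketch of that path (the option name and error handling are illustrative, and the snippet assumes reasonably recent FFmpeg headers):

```cpp
// Look up the AMF encoder by name and pass extra settings as private options.
extern "C" {
#include <libavcodec/avcodec.h>
#include <libavutil/opt.h>
}

static AVCodecContext* open_amf_h264()
{
	const AVCodec* codec = avcodec_find_encoder_by_name("h264_amf"); // "hevc_amf" for H.265.
	if (!codec)
		return nullptr; // FFmpeg was built without AMF support.

	AVCodecContext* ctx = avcodec_alloc_context3(codec);
	if (!ctx)
		return nullptr;

	// Anything not exposed in the UI can still be set through the custom
	// parameters field, which ends up as private options like this:
	av_opt_set(ctx->priv_data, "usage", "transcoding", 0); // Illustrative option.
	return ctx;
}
```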
Implements a manual and automatic update checker with support for both release and testing update channels, allowing users to stay as up to date as possible. It is fully compliant with privacy regulations around the world, as it stays completely silent and inactive until the user explicitly gives the OK to connect to GitHub and check for the latest releases.
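The consent gate at the heart of it looks roughly like this (illustrative names, with the actual HTTPS request left as a placeholder):

```cpp
// No network request is ever made unless the user explicitly opted in, and the
// chosen channel decides whether testing builds (Alpha/Beta/Candidate) count.
#include <optional>
#include <string>

enum class update_channel { release, testing };

struct update_checker {
	bool           user_consented = false; // Off by default; only an explicit "Ok" enables it.
	update_channel channel        = update_channel::release;

	std::optional<std::string> check()
	{
		if (!user_consented)
			return std::nullopt; // Stay completely silent and inactive.

		// Only past this point would the GitHub releases list be queried.
		bool include_testing = (channel == update_channel::testing);
		return query_github_releases(include_testing);
	}

	// Placeholder: real code performs an HTTPS request to GitHub here and
	// returns the newest version matching the selected channel.
	std::optional<std::string> query_github_releases(bool /*include_testing*/)
	{
		return std::nullopt;
	}
};
```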
* "Quality" Minimum/Maximum is actually QP Minimum/Maximum
* Bitrate Limits is now just Limits
* Buffer Size and Quality Target have been moved into "Limits".
The high-priority CUDA stream causes libOBS to run at a lower priority than the tracking, which is not what we want. Instead we want tracking to be incomplete in those cases, rather than slowing down encoding and other things. A sketch of the change follows.
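In CUDA runtime API terms, the fix amounts to creating the tracking stream at the lowest available priority (the real code may use the driver API instead, but the idea is the same):

```cpp
// Give tracking the *lowest* stream priority so it can fall behind under load,
// instead of a high-priority stream that starves libOBS rendering and encoding.
#include <cuda_runtime.h>

static cudaStream_t create_tracking_stream()
{
	int least = 0, greatest = 0;
	cudaDeviceGetStreamPriorityRange(&least, &greatest);

	cudaStream_t stream = nullptr;
	// 'least' is the numerically largest value, i.e. the lowest priority.
	cudaStreamCreateWithPriority(&stream, cudaStreamNonBlocking, least);
	return stream;
}
```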
Geometry updates are also now done once per frame instead of once per tracking update, which should improve smoothness without affecting performance too much. Additionally, all tracking info is now in the 0..1 range, which drastically simplifies some of the math - especially with texture coordinates.
To deal with tracking and updates being asynchronous, a very simple approximation of movement velocity has been added. It is mostly wrong, but it can bridge the gap when tracking updates are slower, as the values are all filtered anyway (see the sketch below).
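The bridging is deliberately crude; something along these lines (illustrative names, using the normalized 0..1 coordinates mentioned above):

```cpp
// Estimate velocity from the last two tracking results and extrapolate the
// position between updates. The result is filtered afterwards anyway.
struct tracked_point {
	float x = 0.5f, y = 0.5f; // Normalized 0..1 coordinates.
};

struct tracking_predictor {
	tracked_point last{};
	tracked_point velocity{}; // Units per second.

	void on_tracking_update(tracked_point fresh, float seconds_since_previous_update)
	{
		if (seconds_since_previous_update > 0.0f) {
			velocity.x = (fresh.x - last.x) / seconds_since_previous_update;
			velocity.y = (fresh.y - last.y) / seconds_since_previous_update;
		}
		last = fresh;
	}

	// Called once per rendered frame, even when no new tracking data arrived.
	tracked_point predict(float seconds_since_last_update) const
	{
		return {last.x + velocity.x * seconds_since_last_update,
			last.y + velocity.y * seconds_since_last_update};
	}
};
```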
As OBS Studio locks some mutexes in a different order depending on which actions are being performed, doing GPU work in modified_properties causes things to freeze in place. Instead, users now have to manually click the refresh button after changing files, which prevents this freeze from happening. The deferred approach is sketched below.
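A rough sketch of that deferral, assuming libOBS' obs_properties_add_button2 is available (the flag handling and names are illustrative):

```cpp
// The "Refresh" button only flips a flag; the actual (GPU-touching) reload
// happens later on the graphics thread, so no GPU work runs inside the
// properties callback and the lock-order problem never occurs.
#include <atomic>
#include <obs-module.h>

struct my_filter {
	std::atomic_bool refresh_requested{false};
};

static bool on_refresh_clicked(obs_properties_t*, obs_property_t*, void* data)
{
	static_cast<my_filter*>(data)->refresh_requested = true; // No GPU work here.
	return false; // The property list itself does not need to be rebuilt.
}

static obs_properties_t* get_properties(void* data)
{
	obs_properties_t* props = obs_properties_create();
	obs_properties_add_button2(props, "refresh", "Refresh", on_refresh_clicked, data);
	return props;
}

// Later, inside the filter's video render callback (graphics thread):
//   if (self->refresh_requested.exchange(false)) { /* reload the files here */ }
```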
Fixes: #118
Ever wished you had a professional camera operator to highlight and follow the action, ensuring the audience never misses a beat? Thanks to NVIDIA, you can now do this at home for free! The new NVIDIA AR SDK unlocks augmented reality features, including motion tracking for faces.
This allows me to provide you with an automated zoom and cropping solution for your video camera to transform your streams into a slick, polished broadcast, where you’ll always be the star of the show. Don’t forget - everything is customizable so the possibilities are endless. You can even recreate that Futurama squinting meme if you wanted to (with some scripting)!
The filter requires compatible NVIDIA RTX hardware and the NVIDIA AR SDK Runtime to be installed ahead of time. This filter is considered "stable" and shouldn't change much from version to version.