Nvidia Broadcast App (nvidia.com)
263 points by ryneandal on Sept 17, 2020 | 151 comments


Overall I'm very impressed and plan to incorporate the Background Blur into my lecture videos (since I had to create a 'decorative' background for them anyway).

So everyone can see how it looks, here are shots with a makeshift lighting setup (kitchen lights, overhead lights, and a 3-bulb lamp):

Background Removal: https://i.imgur.com/e3y4kwX.png

Background Replacement: https://i.imgur.com/AajgJIf.png

Background Blur - Low Setting: https://i.imgur.com/CeXmneX.png

Background Blur - Max Setting: https://i.imgur.com/OoJ76ow.png

Frame Tracking: https://i.imgur.com/wQ8IOq1.mp4

Full Album: https://imgur.com/a/RWECpd4

EDIT: As a follow-up to a question my friend asked: memory-wise it sits at 105MB when idle; while running OBS it went up to 670MB.

EDIT2: The blur effect holds even if you move into the background; the objects behind you stay blurred rather than sharpening. It did look a little "off", but that's mostly because a sharply rendered body interacting with a blurred environment looks odd.


Nice smile :)


Gotta look good for all those movie producers that just found their next big star :D


That smile is going to haunt me in my dreams.


Can you post an example of your edit 2? Seems like it'd be interesting to watch



This is a little uncanny!

Thanks for posting.


Wow, is that image view counter only from these links on HN?


It's the only place I've posted them, so unless someone's sharing the gif around, it turns out a lot of people were interested in face tracking.


Interestingly, this is a perfect example of the 90/10 rule.

2450 views on Imgur

2450 * 0.1 = 245, which is roughly the 259 points on this post

147 comments


The frame tracking one needs to be a gif I can use


> For those of you that want to try out the AI noise removal capabilities but aren’t ready to upgrade to an RTX GPU yet, we have also patched RTX Voice with support for NVIDIA GeForce GTX GPUs. Though, of course, your mileage may vary on older cards.

That's a strange way to say "yeah, we admit we were just blocking older GPUs in the installer, and you guys noticed, so we decided to stop."


Well, it kind of looks like your car manufacturer stating in the manual that your car cannot tow.

You know it's just a matter of installing a hitch, but they basically don't want to spend time supporting something they have no interest in.


In the case of cars, often a car can tow, but it shouldn't. I know someone who burned out a transmission because the trailer was overweight (in that case it was a truck whose manual said the max tow weight was 1/4th of what he was towing). I haven't seen it myself, but my EMT friends have had to respond to calls where someone was towing more than their vehicle could safely handle.

I work for John Deere, and it's well known that ECU code can change the horsepower of our tractors. What isn't as well known is that the tractors with higher-horsepower ECUs generate more warranty work, and that's a case where we engineer the system for the change (more than the ECU is changed to get higher horsepower if you buy it that way from us, because we don't want warranty claims).

If you know what the real limits of your system are, you'll sometimes discover the stated limits are lower than the real ones. Other times, though, the stated limits are there for a reason.


Often it's also for regulatory reasons, particularly around emissions. Larger engines require more complex emission controls, so an engine (including, e.g., a JD) will often be sold with a nameplate of, say, 74hp and chip-limited to that. It could very likely reach its maximum efficiency at that output.

We have a vendor that sells equipment with a built-in engine, and we use some of the power from that engine to run other equipment. Under Tier 4 emissions requirements the vendor moved to all-digital controls, which have much tighter control of power output. Our customers are annoyed that they can't get the same amount of power out of the new equipment. Our vendor tells us, unofficially, that you could tweak the ECU and get more power out, but we're not going to do that to equipment we're selling, and we're not going to tell our customers to do it, because it would violate the legal restrictions on the operation of that equipment.


idk... John Deere is notorious for DRMing their tractors so that even out-of-warranty modifications or repairs are impossible. It should be up to the user to do the modifications they like, and then it's simply up to the manufacturer whether to honor the warranty if the device/tractor has been significantly tampered with.


This is tangential and doesn't engage with or negate the main point you're replying to in any way. Yes, John Deere does these things, but the reasoning the parent provided for why softlocks are sometimes used, in the context of performance restriction and warranty support, is absolutely plausible and worthy of consideration all the same. We owe it to ourselves not to conflate this conversation with that cause, and to consider these views rather than distracting and derailing from them.


I'm not going to comment much except to point out that it is fraud if you do such a thing and then sell/trade it in - the mechanical damage is done and the buyer should know - but has no way to know.

I'll also point out that you probably were mad at VW for making the types of modifications that you claim anyone should be able to make.


John Deere DRM has such a bad reputation there's a lot of goodwill to be won back before people will listen to some of the good reasons for it.

Honestly, I'm a fan of price discrimination through DRM. It leads to better economic efficiency. But John Deere DRM must really be something else. I'm not even in the field and I have a negative reaction to the brand. Like they're going to nickel and dime me and then give me the run-around on fixes.


Industries are being computerised, so it's not too surprising that business practices are bleeding over.

I don't really see too much of a difference between buying a tractor from John Deere and buying a fancy SAN from Oracle. In either case you're paying a premium for the name and the associated service contract for a time-sensitive mission-critical locked-down (and to some extent black-box) function which is presumably crucial to your business's ability to generate revenue. JD have competitors, and I'd imagine that if this episode hasn't tanked their business it's due to risk-aversion from buyers (à la IBM), vendor lock-in, or customers being set in their ways.

It's also a bit like the Creative sector when Adobe moved to a subscription model; people balked and held out for a time, but ultimately they voted with their wallets and bought [subscriptions to] the magic product that could be provided by only one company on Earth, and without which they couldn't possibly continue to make art-work [commercially, in a productive manner, and in keeping with their prior experience].

I'm just cynical about how often people will knowingly make themselves reliant on a single company without reading the small print or thinking through the implications of what is essentially a long-term strategic partnership with a vastly unequal partner, without having a hedging strategy. If you can't imagine not using their products after the next refresh, you've put yourself in a very poor negotiating position.


There's a vast difference in costs, expected lifetime, depreciation, maintenance requirements, capital efficiency, time-sensitivity, and criticality to a business between a fancy SAN and a tractor.

Right-to-repair is all about people being able to repair their own products, or at the very least use certified 3rd-party mechanics, instead of being forced to go to the manufacturer for every possible problem.


I think all of those differences (time-scales, capital intensity/efficiency, credit access) are accounted for by the translation between the type of company that would buy a combine harvester vs. the kind of company that would buy a SAN. I totally get consumers' desire for a right to repair, but ultimately I feel that this shouldn't override the freedom of economic actors to make voluntary agreements in the marketplace. And in the case of agricultural equipment I can see why some vendors would want to move to a Trusted Computing model whereby every component authenticates with every other component via HSMs, addressing problems such as theft (which is widespread in some parts of the world) by making stolen devices/parts practically useless; that would obviously conflict with a right to repair (as well as serve any purposefully user-hostile ambitions they had).


Free markets still need some guardrails, and the limited competition combined with the importance of agriculture makes this a complex situation.

Besides, if you have used a warranty recently, it's a good bet that it was legislated to act that way. Right to repair has a similar focus of helping consumers.


I disagree with the idea that free markets need guardrails for the reasons that I mentioned before: I believe that consensual agreements between people shouldn't normally be regulated by the state, and that when you take a free market and then add "guardrails" to it, it's no longer a "free" market.

All that minimum warranty legislation does is to remove the freedom that people previously had to buy products with shorter warranties, or without one. It means that they can't even buy the same product together with an optional warranty from the manufacturer, thereby ending up with a situation that's functionally identical. Warranties for practically every type of durable product existed before these changes were introduced, and people have always tended to weigh up the projected lifetime (which they can infer from factors such as the availability of manufacturer or third-party warranties or insurance), manufacturer reputation and projected total costs of ownership whenever they have bought something sufficiently expensive.

Agriculture is of course an important industry, but I don't think there's any wider threat to agriculture in this particular case, and if these practices were to create one in the future, the market would respond by punishing a company like JD (and no doubt the state would as well in light of the strategic implications). It just seems like a slow and painful realisation for parts of the industry that things have changed. Hopefully we'll now see plenty of disruption and innovation (and modding ;)).


> But John Deere DRM must really be something else.

I think it's rather that it was introduced in a domain where users historically can (and often need to) do more or less anything, as many operations are extremely time-sensitive and they generally live in the ass-end of nowhere, CA. Any measure which can (and does) hamper the ability to do the thing right here, right now, without huge benefits to compensate, is an active threat.


>I'm a fan of price discrimination through DRM.

Me too, because it means I can buy the cheapest version and unlock it for free.


Under US law the burden is on the manufacturer to show that a modification impacted the functioning of the device/machine.

It's kind of funny, because parts DRM could be seen as the manufacturer breaking the device when a replacement could have succeeded.


Pretty sure nobody is going to go after me for fraud for putting non-VW heads on the '73 bus I used to drive around in...


If you try to sell it as an "unmodified engine" without mentioning that in writing, they absolutely could. They might not, but if it breaks in some manner that could remotely be connected to them, they could.

If the tractor is rated for 50 whatever and you chip it to run at 60 whatever and sell it without saying that you'd ever done so, then if the buyer finds damage caused by overheating or who knows what from running at 60 whatever on a 50-rated-and-softlocked engine, then you absolutely could be held liable for failing to indicate the non-factory modifications you made to the engine, even if you later 'removed' them.

(Usual disclaimer: I am not your lawyer, I will not compile citations for your review, please seek legal counsel before taking action based on my comments.)


I wouldn't go so far as to say cars shouldn't tow, but people can and do go well beyond both the manufacturer's rated capacity and what's dictated by common sense, and as a result manufacturers are extremely conservative.


Give people the facts. Let them decide but make sure you're protected from supporting idiots.

Better than coming out of it looking like the evil dudes.

EDIT: being best known outside your sector for suing your customers is generally a bad look.


It sadly doesn't work like that. Idiots are idiots. "You sold me this truck, and it doesn't work anymore.

- Well, we stated that you shouldn't do what you did...

- It doeesnnn't work !!!"

Cue the time, energy, and money lost defending yourself against idiots who don't listen.

Edit: I don't know anything about trucks or John Deere. But you can see this type of "artificial" limitation with a simple unlock pretty often. For example, the CPU overclocking community: the manufacturer doesn't officially support overclocking, but they give hints and some advice if you want to do it anyway. You fry your CPU, you're on your own.


Why should you care how you're known outside your sector? People outside your sector aren't your customers, by definition.


And in many cases it's just a certification issue - my Mazda is rated to tow things, while the US model with the exact same transmission and engine isn't.


It's just an analogy, dude.


It’s actually nothing like that because nobody’s expecting Nvidia to support the functionality, just not block the software from even trying. You could solve it by literally just adding a disclaimer pop up that “this functionality is not supported on your GPU, would you like to try anyway?”.


> nobody’s expecting Nvidia to support the functionality

I 100% guarantee you there are people who will call Nvidia support (yes, they will literally dial +1-800-797-6530 on a phone like it's 1972) to complain about broadcast. Some people are unbelievably entitled, and think that no matter what it says on the box if they paid good money for it then it should by God do whatever stupid ass thing they want it to do.


So? This is why service desks exist. And I say that as someone who previously worked in one. The key is that something being unsupported is clearly stated in the software and enforced by everyone in the organization. If done correctly you'll have no PR penalty, as the majority will take the side of the organization, and a PR benefit if it works for people even when they know it "shouldn't".


I don't know the specifics of the tech requirements, and I certainly don't condone giving Nvidia the benefit of the doubt, but it may be down to the efficacy of the hardware.

Non-RTX cards are technically capable of DXR as well, but the lack of specialized silicon makes it a nonstarter. I've used RTX Voice for months now, and coworkers have noticed issues and artifacts on occasion; I imagine they occur more often on non-RTX cards.


Yep. I have a GTX 980 and most likely won't be replacing it until later this year when 3080s aren't as hard to find and I can get a look at the upcoming Ryzens for a full replacement build.

I've used the "tweaked" RTX Voice on my 980 and it mostly works, but while testing it in some social Zoom meetings I also found it was causing artifacts unless I applied the typical modest overclock profile I use when playing games.

I will confess that one of the reasons I look forward to getting a 3000-series is that I want to use RTX Voice and Broadcast during all of the web conferences I do nowadays. Hardware accelerated fake-chromakey and effective noise cancellation will both be welcome additions.


I get "artifacts" on my 1080ti SLI machines too, OC isn't really solving it the RTX 2080ti machine isn't having issues.

I might try to run some GPU profiler at some point, my best guest is that the model is time sensitive so it requires a very fast execution and one that might not be parallelized that well.

To me the sound issues seem to be the same as certain temporal based effect shader issues you might encounter at low framerate or frame drops when streaming video basically since the model and the app are built for real time voice buffering is limited so there is an expectation of specific timing constraints in which the computation must be completed when your GPUs cant meet those constraints you get audio artifacts on the output.
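
To make that timing constraint concrete, here is a rough back-of-the-envelope sketch (my own illustration, not anything from NVIDIA's app; the 48 kHz sample rate and 10 ms block size are assumptions) of the real-time budget a denoising model has to fit into:

```python
import time
import numpy as np

SAMPLE_RATE = 48_000                    # assumed capture rate (Hz)
BLOCK_SIZE = 480                        # assumed block of 10 ms of audio
DEADLINE_S = BLOCK_SIZE / SAMPLE_RATE   # 0.010 s to process each block

def denoise_block(block: np.ndarray) -> np.ndarray:
    """Stand-in for the real model; here just a pass-through."""
    return block

def process_stream(blocks):
    """Process blocks and count any that miss the real-time deadline."""
    missed = 0
    for block in blocks:
        start = time.perf_counter()
        _ = denoise_block(block)
        elapsed = time.perf_counter() - start
        if elapsed > DEADLINE_S:
            # In a real pipeline this is where you'd hear a glitch:
            # the output buffer underruns because the block arrived late.
            missed += 1
    return missed

if __name__ == "__main__":
    stream = (np.zeros(BLOCK_SIZE, dtype=np.float32) for _ in range(1000))
    print(f"blocks over the {DEADLINE_S * 1000:.1f} ms budget: {process_stream(stream)}")
```

If the model regularly takes longer than that ~10 ms budget, the output buffer underruns and you hear exactly the kind of artifacts described above.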


You can change a single bit in RTX voice and it works on other cards.

Seems to pretty heavily imply that it was an entirely artificial requirement.

Also it's NVidia, all of the home grade cards are artificially limited by drivers.


I see people are responding about performance considerations, but the other aspect is support. It may just be that NVIDIA does not believe supporting RTX Voice on GTX cards is worth the time and resources. Because even if they put in a disclaimer about "We haven't tested this on every GTX card, so you may notice poor performance or bugs and we won't support that", they'll get plenty of people complaining to them about poor performance or bugs. It happens all the time, and it's often a big concern in choosing which products to support.

Yes, I agree that initially choosing not to support GTX cards was a money-based decision, but sometimes there are additional (somewhat more reasonable) factors that contribute to that money-based decision, which people often seem to leave out.


> Also it's NVidia, all of the home grade cards are artificially limited by drivers.

You know that all products with any electronics in them do this, right? It's not an Nvidia thing by any stretch of the imagination. Your phone and your computers all have software feature toggles for features meant for future hardware. Your OS has features disabled by design. Unless you're driving a 30 year old car, it's probably true that your car has features turned off that are meant for other models. Since you're on HN, there's a reasonable chance that the company you work for releases a product with "artificially limited" features. (Of course I have no idea what you do or who you work for, just making the point that hidden features are so common, the odds of me guessing right are quite good.)

There are good and legitimate reasons why products turn features off, especially when the features were designed for specific hardware, with specific specs in mind, and work best with specific hardware support. Since the fallback could be a drain on performance, battery, heat, noise, safety, and last but not least user experience, it's pretty understandable why some products have features disabled, especially when the features were developed after the product was released, right?


> Also it's NVidia, all of the home grade cards are artificially limited by drivers.

In some ways yes, but in the way this is usually meant (DP flops) this hasn't been true for many generations. GeForce chips simply only have one DP ALU for every 32 SP ALUs, while the HPC accelerators have one DP ALU for every two SP ALUs.
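
As a back-of-the-envelope illustration of what that ALU ratio means for throughput (the clock speed and core count below are hypothetical, purely to show the arithmetic, not the specs of any particular card):

```python
# Hypothetical GPU: 4096 FP32 ALUs at 1.5 GHz, each doing one FMA (2 FLOPs) per cycle.
fp32_alus = 4096
clock_hz = 1.5e9
fp32_tflops = fp32_alus * clock_hz * 2 / 1e12   # ~12.3 TFLOPS FP32

# GeForce-style die: 1 FP64 ALU per 32 FP32 ALUs.
fp64_tflops_geforce = fp32_tflops / 32          # ~0.38 TFLOPS FP64

# HPC-accelerator-style die: 1 FP64 ALU per 2 FP32 ALUs.
fp64_tflops_hpc = fp32_tflops / 2               # ~6.1 TFLOPS FP64

print(f"FP32: {fp32_tflops:.1f}, GeForce-style FP64: {fp64_tflops_geforce:.2f}, "
      f"HPC-style FP64: {fp64_tflops_hpc:.1f} TFLOPS")
```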


The parent is probably referring to the Quadro (CAD workstation) vs GeForce (gaming) split, not GeForce vs HPC. There's a history of limiting GeForce performance for CAD workflows in software, and of flashing the firmware of GeForce cards to unlock much better "professional" performance.


The flashing didn't increase performance; it allowed you to use Quadro-certified drivers, which are now available for GeForce too (minus the official certification and support, IIRC; I think they're called Creator drivers or something like that).

This also impacted only a tiny subset of CAD/professional imaging products, which were out of reach of consumers anyhow, often due to the massive PITA and cost of certifying your hardware and drivers for those products.

Quadro wasn't running faster in Blender, 3ds Max, or even Autodesk's CAD tools (in fact it was often slower due to lower clocks). The apps where it made a difference, once those drivers were finally released for GeForce and Titan cards, were only the likes of CATIA and Siemens NX, hardly consumer/prosumer software.

I'm sorry, but if you hold a seven-figure-a-year license for Siemens NX, spending $2000-3000 extra on each GPU every 3 years isn't going to be an issue. Not to mention that you aren't going to be using GeForce GPUs anyhow, as the drivers are still not certified (which the industry requires), and more likely still you'll be buying certified workstations from the likes of HP and Dell.


This is absolutely not true; gaming-focused GeForce cards have intentionally firmware-crippled 64-bit float performance, and the Quadro firmware flashes unlocked it.


Gaming GeForce cards don’t have FP64 silicon, and neither do the Quadros which use the same dies.


Not currently, but this was true in the 6xx series for instance.


No it wasn't. GK104, the biggest GeForce 600-series die, had only 1/16th-rate FP64 on both the Quadro and GeForce cards.

The Titan / Titan Black had 1/3-rate FP64, just like the Quadro K6000, which used the same die.


Seems a bit of a weak argument to be talking about something released 8 years earlier?


You can, but the issue here is the use case and performance: this is meant to be used by streamers, hence any performance impact on the game must be negligible.

The normal ALUs (aka CUDA cores) and the Tensor cores on RTX cards can run concurrently without blocking and without requiring context switching, and for the most part, outside of the register file / cache in that SM and GPC, they aren't competing for resources.

Sure, no model runs only on the Tensor cores, and it probably requires some generic ALU usage, but optimized models spend only a fraction of their execution time on those standard cores.

Optimizing their model to work on multiple generations of GPUs with different compute capabilities (if you look at CUDA compute capabilities, you have native and multi-instruction operations constantly being switched in and out) without a severe impact on performance, one that would lead to "OMG RTX VOICE MAKES YOUR GAME LAG!!!!!" posts on social media and streamers trashing the software and the brand, isn't easy.

So you launch it first on hardware where you can guarantee it will run well without adverse impact, then release it to a broader audience with a huge disclaimer, often to showcase just how much better your new hardware is, as in the case of DXR ray tracing, which now runs on essentially all DX12 GPUs from NVIDIA.

People in general are clueless about how GPUs actually work, and they tend to sensationalize everything (especially when it comes to Team Green vs Team Red). Just look at the nonsense that came out after Crytek demoed their "ray tracing" on Vega: people were basically saying RTX is a scam.

The reality is quite different: Crytek used a hybrid model. They weren't running GI via ray tracing; they were using SVOGI, and implemented very rudimentary ray tracing for reflections, with heavy limits on how many BVH instances you can have at the same time and at what range objects can fall into the BVH.

So yes, it can run, for example, on the Xbox One X, but at <30 fps, with low-quality SVOGI and up to 5 objects able to be reflected at any time via RT.

NVIDIA isn't limiting home-grade cards artificially; the FP64 units simply aren't there in the smaller dies. AMD used to do it during the days of GCN, as their big dies still had full FP64 ALUs.

The only feature NVIDIA is currently limiting on their GPUs is SR-IOV support, which, what do you know, is also disabled on AMD consumer cards and has been ever since SR-IOV support was introduced with Vega.

Even the "it's-definitely-not-the-Founders-Edition" Vega FE, which was quickly re-marketed at launch as a prosumer card and sold at a huge markup (compared to the Vega 56/64, which came later) despite its driver support being essentially killed within 6 months, didn't enable it.

Posts like this are why we can't have good stuff: people will always complain about why something doesn't work, and then dive into conspiracy theories when something does work only on newer hardware.


AFAIK NVIDIA also limits the number of simultaneous NVENC streams, and a patch to remove the restriction is available.

https://github.com/keylase/nvidia-patch


They allow ray tracing on older cards too, but it doesn't run well thanks to the lack of dedicated ray tracing hardware. This AI noise reduction probably runs a lot better on newer cards with tensor cores, whereas older cards could see this feature eating into game performance if you use it while playing.


iirc analysis of the utility indicated it was not using tensor features at all, which is why it was trivial to unlock on older cards.


Yeah, I patched it myself on day one; it was literally a flag in the installer. What colossally unpleasant behaviour.


Any tips for locating which flag in the installer is the one to flip the bit?


This. Could it be just as simple for the Broadcast app? Because I use my RX5600XT for games, and my older GTX1060 for streaming/recording/RTX Voice.


Wait, so should this app be able to work with something like the RTX 2070?


Is there anything wrong with this? You're only entitled to what you bought or were promised at the time of purchase (and security updates). If the manufacturer comes along later and improves your product there aren't any obligations to extend that generosity as far as technically possible.


"This can only work on new cards" or "we're only whitelisting new cards" aren't necessarily dick moves but saying "This can only work on new cards" when it's really "we're only whitelisting new cards" is.


You should see what Nvidia does for CAD software... They used to force you to use Quadro cards only (maybe 2x-3x the price of the consumer version) even though the GT/GTX series worked just as well. People figured out how to change a registry setting to make it work, until they finally caved and supported it "somewhat" (as in, if it breaks, it breaks; and this was maybe 5 years after the registry hacking). Nvidia has always been notorious for making people use the "good stuff" (expensive) even when it's not really required.


> Nvidia has always been notorious for making people use the "good stuff" (expensive) even though it's not really required.

Do you think that these Quadro features materialized out of thin air, or did they require engineering effort and additional silicon area?

If the latter, is it reasonable for Nvidia to expect to be compensated for that additional effort (which is borne by, say, two orders of magnitude lower volume)?

Additionally, would you be happier if Nvidia simply removed those features so that no home users would complain anymore (at the expense of Quadro users, who would lose the ability to use real features)?

And, using my standard go-to argument, are you similarly upset that software companies charge different amounts of money for different tiers of a software product (e.g. Microsoft Office Home vs Professional), even though these different tiers are only enabled by a license key? If not, why?


> are you similarly upset that software companies are charging different amounts of money for different tiers of a software product (e.g. Microsoft Office Home vs Professional), even though these different tiers are only enabled by a license key?

Is there anyone who isn't upset by that?


Why would you be upset by that?

By lowering the price for an entry version, customers who otherwise couldn’t afford the full version still get the option to use a feature reduced version.

(See how I framed that?)


> Why would you be upset by that?

The former Microsoft Excel Program Manager explains it well:

https://www.joelonsoftware.com/2004/12/15/camels-and-rubber-...

> this business about segmenting? It pisses the heck off of people.

> People want to feel they’re paying a fair price. They don’t want to think they’re paying extra just because they’re not clever enough to find the magic coupon code.

> God help you if an A-list blogger finds out that your premium printer is identical to the cheap printer, with the speed inhibitor turned off.

> So, while segmenting can be a useful tool to “capture consumer surplus,” it can have significant negative implications for the long term image of your product.

> Even assuming you’re willing to deal with a long-term erosion of customer goodwill caused by blatant price discrimination, segmentation is just not that easy to pull off. First of all, as soon as your customers find out you’re doing it, they’ll lie about who they are

> If your customers talk amongst themselves, they’re going to find out about the price you’re offering the other people, and you’ll find yourself forced to match the lowest prices for everyone.


If they only offered one tier, it would be Office Professional, at a price that made sense for most businesses, so they can recoup the cost of building and providing ongoing support for the product. This would absolutely price a huge number of people out from getting Office for personal use.

So, no, I don't think there are a lot of people upset that there exists an Office product they can afford.


Even if they offered an affordable ultimate Office package with everything, they'd recoup the development costs and even profit. This is not about "recouping costs", it's about market segmentation. It's about figuring out how much each group is willing to pay and then charging exactly that. If they could, they'd call customers and figure out their net worth before quoting them a price.

https://www.joelonsoftware.com/2004/12/15/camels-and-rubber-...

> this business about segmenting? It pisses the heck off of people.

> People want to feel they’re paying a fair price. They don’t want to think they’re paying extra just because they’re not clever enough to find the magic coupon code.

I don't know a single normal person who's ever bought Office anyway. Their real customers are corporations, since they have money and can actually be sued for copyright infringement if they use Office without licensing it. Microsoft should simply give it away to normal people and charge corporations for it.


> Even if they offered an affordable ultimate Office package with everything, they'd recoup the development costs and even profit.

Mind sharing the development & support cost numbers, and the sale numbers for the Office suite that you're basing that statement on? If not, maybe a Fermi calculation to show that it's a reasonable statement?


Why do you suppose that? Surely there are many, many more customers who can afford cheaper individual/student licenses.


This argument assumes that pricing is to be determined through some a priori philosophical consideration of what is "fair" or "reasonable" compensation for effort involved in creating supply, according to some subjective personal idea of what those terms mean.

Empirically, pricing is - always and everywhere - determined by the intersection of supply and demand, nothing more and nothing less. The labor theory of value and its relatives are empirically false.


Empirically, prices are set by the seller.


The seller can _ask_ for any price they like, not unilaterally "set" it.

They're only going to get that, and thus set the price, if there is also a buyer who agrees to pay that price.

In technical terms, I was using "set the price" to mean "discover the market-clearing price that allows a transaction to happen."


Right, but this attitude lets you dismiss any behavior that results in bad resource distribution as inevitable and immutable. That's not how economies work, by all observation. Empirically, supply and demand are insufficient to explain all the market behavior that's trivially observable, especially when you can easily manipulate both sides of that narrative with enough capital.

The labor theory of value certainly has issues but at the very least I would expect a critique to provide an alternative narrative of observable pricing pressures, such as scarcity and market controls. This type of reductive thinking doesn't have a clear end to my eyes.


Sure, but the seller is not guaranteed to get sales.

So there must be an agreement between the seller and the buyer on what the item is worth, and "worth" includes what alternatives both of them have available as well.


The big difference is that with hardware it feels wasteful. With software, you're not missing out on anything if they decide to lock certain features under a paywall. But if the hardware already exists and can perform a certain function and the only thing blocking it from doing that is a license key then doesn't that seem like a waste of good hardware? Software is unlimited but hardware isn't.


I personally think there's a pretty big difference between being compensated for their work and using software to limit where the hardware can be used (and I'm not saying Nvidia is the only company that does this; Tektronix is even worse for it). This differs from a purely software product like you mentioned, split between home vs. professional usage. When I buy a GPU, I expect to be able to use it as a... GPU, regardless of the software it connects to. Sure, I get it, there is some additional software work such as CUDA and driver stuff depending on the package, but at a basic level with something like DirectX, this should be more or less software-agnostic. Unfortunately our market is heading in that direction (e.g. Tesla and John Deere), where even "unlocking" the features via hardware means is a violation of the DMCA.


It's not that simple though. It's cheaper to engineer one chip that does everything then selectively disable certain features for lower end SKUs.

Going to the effort of making different hardware for the lower end SKUs that completely lacks these features would end up raising prices for consumers because it makes engineering and manufacturing more costly.

The other alternative is they just don't offer lower price SKUs and make everyone buy top-end parts.


The other choice would be to actually manufacture different product lines with different hardware, which would 1. increase costs, and 2. prevent the user from being able to upgrade via software, and have to buy completely new hardware. From a business, consumer, and environmental point of view, unlocking via software seems the superior solution.


You say that this is different than a software product, but you don’t explain why.

You didn't buy a GPU; you bought a consumer-class GPU that promised you the advertised consumer-class features.

Why do you expect it to give you professional class features from a higher pricing tier?


Sorry, complete noob here. How did Nvidia control whether CAD uses the GPU or not? Did the GPU know the type of application accessing it?


The NVidia drivers are aware of which application is running and apply different profiles for popular applications. But I think in this case it was more that the drivers locked out particular APIs (e.g. overlay-related) that were used primarily by CAD applications rather than outright locking out those applications.


As an Nvidia shareholder I totally support this; you're free not to buy their hardware.


I'm happy to hear that software locks make you richer.


or for running their GPUs in VMs...


Is that not a thing now?


It looks like they are still disabling it: https://youtu.be/AG_ZHi3tuyk?t=627


Which is completely user-hostile, as the original idea (back with certified Z800s, RHEL5 and FX3800s) of dual-OS workstations has not panned out.

Today it basically nags developers and technical users, who don't want to run proprietary software unsandboxed (but want CUDA...) and want a hassle-free experience, into buying Quadros. I can't think of any business case for that, because cracking down on the shady ML cloud vendors works out OK-ish for them right now, and at universities people largely won't notice...


With Ryzen's support for ECC, if AMD brought SR-IOV to its non-workstation Radeon cards, it would be another interesting feather in its cap. They do have product announcements coming up, but I doubt that's an important feature to most users.


I haven’t heard about this before and I’m curious - what software only allows Quadro cards?


Almost all CAD software prior to maybe 2016 was like this. If you ever run NX/Solidworks/Inventor, you will notice the GPU usage is 0% unless there is a Quadro GPU. The main reason is that graphics acceleration is disabled until you use a "proper" GPU. You can force graphics acceleration by turning on the correct registry key.


They also quietly disabled the use of their "Capture SDK" on Geforce cards in a driver update, before whitelisting Geforce Experience and presumably anyone else who would cough up the money to license it.

Developers without Quadro cards just updated their drivers one day and weren't able to use it anymore.

Same goes for their hypervisor checks. Just silently added in a driver update, with the reps only saying why after people started asking why it broke.


Due to having to keep an eye on my kids during virtual school, I'm stuck in the family room which is incredibly noisy with toys and furniture. I've played with the software for about 5 minutes now and think the virtual background functionality works incredibly well: https://imgur.com/a/0ByEYF9


Especially given that you have a bright background light, and that it even cut out the space between your headphones and your head. Pretty neat, and better than Zoom's silhouette guess.


Yeah, no issues with the shades drawn on the window, it's impressive.

I hadn't noticed the space between my head and headphones prior to this comment, great catch. I'll definitely be using it during work meetings now, I'm sold.


The Broadcast app offers three AI-powered features:

Noise Removal: remove background noise from your microphone feed – ...

Virtual Background: remove the background of your webcam feed and replace it with game footage, a replacement image, or even a subtle blur.

Auto Frame: zooms in on you and uses AI to track your head movements, keeping you at the center of the action even as you shift from side to side...

These features can be used beyond game broadcasting as well — from video conferencing at home with Zoom, to gaming with friends on Discord.
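
For a rough idea of what a virtual-background pipeline like the one above looks like, here is a hedged sketch using MediaPipe's selfie segmentation on the CPU as a stand-in (this is not NVIDIA's model; the replacement.jpg path, webcam index, and frame size assumptions are mine):

```python
import cv2
import mediapipe as mp
import numpy as np

BACKGROUND_PATH = "replacement.jpg"  # hypothetical replacement background image

cap = cv2.VideoCapture(0)            # default webcam
segmenter = mp.solutions.selfie_segmentation.SelfieSegmentation(model_selection=1)
background = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    if background is None:
        background = cv2.resize(cv2.imread(BACKGROUND_PATH),
                                (frame.shape[1], frame.shape[0]))

    # The model returns a per-pixel "person" probability map.
    result = segmenter.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    mask = result.segmentation_mask[..., np.newaxis]  # (H, W, 1) in [0, 1]

    # Blend: keep the person, swap everything else for the background image.
    composited = (mask * frame + (1 - mask) * background).astype(np.uint8)

    cv2.imshow("virtual background", composited)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

NVIDIA's GPU-accelerated version presumably uses a much heavier model, which is why the edges (hair, headphone gaps) hold up better than typical CPU approaches.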


I did a quick demo here: https://www.youtube.com/watch?v=aoa0aB1AXtI

Hardware: Sony A6000 Mirrorless camera; 2x Elgato Key Lights for illumination; GameCapture HD60 S; RTX 2080 Super; OBS for the video feed and recording.


I have a Sony A6000 with the Elgato too. Do you mind showing your setup, especially the camera placement? I'm curious how your eyes can look at the audience while you're in front of the PC.


By looking at the camera while recording, watch the first few seconds again.


For anyone not using an nVidia graphics card, I highly recommend https://krisp.ai/ for the voice cleanup part. Works wonders.


Do you know of any other companies offering a product like krisp? They seem to be one of the only players.


BabbleLabs is the closest but they got acquired by Cisco.

Google Meet also has similar tech: https://support.google.com/meet/answer/9919960?hl=en

Audatic was really cool but they got acquired by a hearing aid company.

Whisper.ai has very similar processing but they are also focused on hearing aids.


Admittedly, it's a young field; no wonder there are few players to date.


Interesting, I wonder if on a meta-note we're seeing an evolution of Nvidia's strategy. For a while it was hardware, then it was leveraging that hardware to create developer-facing APIs (CUDA being a driver for many people in ML to buy Nvidia in particular), and now it seems they may be trying to leverage that compute to create specialized, end-user-facing services.

Are there any more examples of this?

I wonder if Nvidia will try to move upmarket to own the best UI powered services (e.g. if you want a camera with a virtual background, it has to be nvidia or something similar)


I can't seem to find a single page that lists all of Nvidia's current vertical software stacks, but they are legion. Aerial(5G), Metropolis(smart cities), Drive(self driving), RAPIDS(big data?), Clara(genomics), IndeX(visualization), Isaac(robotics)... etc.. etc...

They basically have to move towards vertical software stacks requiring their HW. GPUs are becoming commoditized, with integrated GPUs from AMD and Intel becoming good enough to play modern games now. 8K (haha) may keep things interesting for a while, but eventually AMD/Intel will compete there as well.

I don't know about user-facing apps though. So far, most of what they've released (e.g. GauGAN) seems more like demos.


They are becoming the Apple of B2B compute. And Shield is their killer consumer app. Watch out for FAANNG


I don't think it's particularly interesting or novel - they're just creating a complete ecosystem to keep customers within so they keep buying their hardware. It's not exactly vendor lock-in, but it has that same vibe.


Just noting that the Turing series of cards and later (which isn't just RTX, but also the GTX 16-series and the T4) does have specialised hardware which helps with the video-tracking side of this.

They have hardware optical flow[1] which doesn't seem widely known, but would absolutely help with parts of this - especially the reframing.

[1] https://developer.nvidia.com/opticalflow-sdk
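
As a rough illustration of how optical flow can drive reframing, here is a sketch using OpenCV's CPU Farneback implementation as a stand-in for the hardware optical flow engine (this is not NVIDIA's Optical Flow SDK; the crop size, smoothing factor, and the assumption of a webcam feed of at least 1280x720 are mine):

```python
import cv2
import numpy as np

CROP_W, CROP_H = 640, 360   # assumed output crop size
SMOOTHING = 0.9             # exponential smoothing for the crop centre

cap = cv2.VideoCapture(0)                       # webcam feed
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
cx, cy = prev.shape[1] / 2, prev.shape[0] / 2   # start centred

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Dense optical flow between consecutive frames (CPU stand-in for the
    # hardware optical flow engine mentioned above).
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mag = np.linalg.norm(flow, axis=2)

    # Track the centre of motion (roughly where the subject is) and smooth it.
    if mag.sum() > 1e-3:
        ys, xs = np.mgrid[0:mag.shape[0], 0:mag.shape[1]]
        mx = (xs * mag).sum() / mag.sum()
        my = (ys * mag).sum() / mag.sum()
        cx = SMOOTHING * cx + (1 - SMOOTHING) * mx
        cy = SMOOTHING * cy + (1 - SMOOTHING) * my

    # Reframe: crop a window around the smoothed centre, clamped to the frame.
    x0 = int(np.clip(cx - CROP_W / 2, 0, frame.shape[1] - CROP_W))
    y0 = int(np.clip(cy - CROP_H / 2, 0, frame.shape[0] - CROP_H))
    cv2.imshow("auto-frame", frame[y0:y0 + CROP_H, x0:x0 + CROP_W])
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
    prev_gray = gray

cap.release()
cv2.destroyAllWindows()
```

Auto Frame presumably does something far more robust (face detection plus the hardware flow engine), but the crop-follows-motion idea is the same.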


Only a Windows version, meh.


I'm running build 460 of the NVIDIA drivers on Win 10 because I'm using the DirectML support for WSL 2. It appears that Broadcast doesn't work with this as it tells me "cannot start service" each time I try to launch the Broadcast app.

Has anyone else gotten this to work, and what build of the NVIDIA drivers are you using on Windows?


> I'm using the DirectML support for WSL 2

How's that going for you? In my tests (directly on Windows host), tensorflow-directml was about 6x slower than tensorflow-gpu (e.g. with CUDA/cuDNN).


I'm running it now on 456.38/RTX 2080.


Here is the video explaining it nicely with demos: https://www.youtube.com/watch?v=GRFjfGH87Dk


I have 2 questions:

1. As in the video on background removal, why was the chair not blurred? Generally in background removal only the person stays visible and everything else is blurred, so how is it working?

2. How does the background noise removal work? I mean, how does it remove background noise so smoothly, and how is it able to distinguish which sound is primary and which is secondary? I know it's AI, but I still want to understand the logic at a basic level. Thanks.


How mature is this software? I tried to use RTX Voice on an important call and it was not working at all, which made me late and left a bad taste about relying on these tools for serious use.


I've had no issues with it. The real problem was Windows 10's extremely frustrating and confusing audio engine.


I just updated my Quadro RTX 4000 to the latest available driver and it's only R450, when I try to install Broadcast it says I need R455 or newer.


Check for beta drivers. If I recall, Nvidia only cuts new "stable" Quadro drivers very rarely, putting them through very rigorous validation since they are pro-level hardware.


The latest driver was released 8/18/2020 and Broadcast's system requirements say they're supported...

There don't appear to be beta drivers.


Does anything similar to this exist for Mac?


Krisp - Mute background noise in any communication app

https://krisp.ai/


Or Linux?



For virtual background without green screen: mmhmm


Or noise reduction. I feel these features should be part of the conferencing apps themselves, rather than having to use multiple apps on the computer that feed into the conferencing app.


FWIW Google Meet has a built-in option to filter out non-speech noise.

It works pretty well.


And they have just added background blur.


The website version too? Can't find it


For me it's in the options where I set my mic and camera. But I'm on an enterprise account for work so it's possible it's not available on lower tiers.


This seems better than RTX Voice; I've had my ears blown out when someone uses it on Discord and it crashes.


Odds that this winds up with a Linux release eventually?

=(


I'd say the chance is roughly the same as the percentage of Nvidia customers running desktop Linux, I imagine. The three of you must be used to this by now though?


My RTX 2060 runs well on Pop!_OS/Ubuntu.

I've had pretty much nothing but good experiences running Nvidia on Linux the past 4-5 years, barring one strange issue with multiple monitors, a DisplayLink adapter, and trying to rotate a screen sideways.

I can see fiscally how it doesn't make a lot of sense for them -- but why write a platform-dependent implementation and not use code that's generic?

Is the problem the GUI? Build it with Qt or something


Ah, yet another Nvidia thing that would work everywhere but is blocked in software because of greed. Remember how ray tracing was supposed to only work properly on RTX, but when people got mad suddenly it wasn't only RTX cards? And now we're even seeing it on the old and slow PS4. Next they'd likely do something crazy like require unnecessary hardware in screens or something. Oh wait...


Noise reduction, background removal. Technologies that have been around for ages. But now apparently it's "with the power of AI" so it's newsworthy?


Machine learning for signal processing is no joke. It's an order of magnitude improvement. The approach is different, too. Instead of approaching the signal mathematically with Fourier transforms and such, you approach it as part of a pipeline where you extract high-level components such as pitch and timbre. The ML model learns how to do that.
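
To make that concrete, here is a minimal sketch of a spectral-masking denoiser of the kind described above (a toy illustration of the general approach, not NVIDIA's or vo.codes' model; the STFT sizes and the tiny untrained network are arbitrary):

```python
import torch
import torch.nn as nn

N_FFT, HOP = 512, 128          # assumed STFT parameters
N_BINS = N_FFT // 2 + 1        # 257 frequency bins

class MaskNet(nn.Module):
    """Tiny stand-in model: predicts a per-bin mask in [0, 1] for each frame."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(N_BINS, 256), nn.ReLU(),
            nn.Linear(256, N_BINS), nn.Sigmoid(),
        )

    def forward(self, mag):            # mag: (frames, bins)
        return self.net(mag)

def denoise(noisy: torch.Tensor, model: MaskNet) -> torch.Tensor:
    """noisy: 1-D waveform tensor. Returns the masked reconstruction."""
    window = torch.hann_window(N_FFT)
    spec = torch.stft(noisy, N_FFT, HOP, window=window, return_complex=True)
    mag = spec.abs().T                 # (frames, bins)
    mask = model(mag).T                # (bins, frames), values in [0, 1]
    cleaned = spec * mask              # suppress bins the model calls noise
    return torch.istft(cleaned, N_FFT, HOP, window=window, length=noisy.shape[-1])

if __name__ == "__main__":
    model = MaskNet()                  # untrained, so the output is just scaled input
    waveform = torch.randn(48_000)     # 1 s of fake audio at 48 kHz
    print(denoise(waveform, model).shape)   # torch.Size([48000])
```

A real model would be trained on pairs of noisy and clean speech so the predicted mask keeps the voice bins and zeroes out the noise bins.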

This field is going to change everything. Look what I've been able to do with some of the recent ML innovations: https://vo.codes


The next layer that would put the realism off the charts is ML that's able to replicate a given speaker's idiolect, not just their voice.

Mimicking people's idiolects is a hobby of mine, and I typed in phrases that Dr. Phil would say; the voice itself is good, but the delivery is off enough that I know it's definitely not him.

I'm sure it'll all get there one day.


IMO, this one's newsworthy because whatever they did works in click-boom fashion: no messing with gain or knobs or fiddling; there's a checkbox to remove noise, and a slider for how much you want to remove.

I'd been trying to get rid of the noise in my mic forever, and RTX Voice just worked.

(That being said, I also had to patch the installer, but like who cares? Someone else did the hard part of figuring it out already, and at that point in time, it was 'beta' software, as well.)


AI just means deep learning here. The past few years have seen great improvement from applying deep learning to a bunch of already-established domains, such as noise reduction and background removal.


It's newsworthy because it's actually good. It might even be exceptionally good for the average person with a cheap microphone.


But is it really? The background removal doesn't seem better than Skype's: even in their demo the hair is all messed up…


1. Given the use of the word microphone, I doubt they were talking about background removal.

2. How do you think Skype is doing it...? Here's a hint: the team has published a number of papers on convolutional neural networks.


Sorry, I should have specified that I was talking specifically about background noise cancellation for voice input. I haven't seen any other version of this tech in action, so I can't comment on how good it is.


What other software would you suggest for good-quality real-time background removal? With the ones I've tried I've been stuck with pretty bad artefacts.


I pinned a piece of green cloth behind me and used OBS with the virtual webcam plugin. Someone in a meeting commented that there are no artifacts on my feed. He thought it was because I have a shaved head, though.


That's not the same thing as pushing a button and having a perfect key without doing anything to your environment


So all I need to do is buy a green cloth, and move it around when I take Zoom calls in the bedroom, living room, or the office.

Lovely!


The deep learning based noise reduction is of significantly better quality than previous algorithms.


Depends on where this product ends up, I suppose. If it learns and improves its current performance with my admittedly noisy background, then the improvements to the technology would be pretty incredible.

I mean, isn't the whole point of the tech industry to iterate on old tech and make life better with each improvement?


And a Tesla is just a Power Wheels.



