rvlobato@lemmy.ml to Open Source@lemmy.ml · 1 year ago
AMD Quietly Funded A Drop-In CUDA Implementation Built On ROCm: It's Now Open-Source (www.phoronix.com)
cross-posted to: [email protected], [email protected]
mayooooo@beehaw.org · 1 year ago
A serious question: when will Nvidia stop selling their products and start charging rent? Like, $50 a month gets you a 4070; your hardware can be a 4090, but that's $100 a month. I give it a year.
poVoq@slrpnk.net · 1 year ago
It's more efficient to rent the same GPU to multiple people at the same time, and Nvidia is already doing that with GeForce Now.
Ludrol@szmer.info · 1 year ago
When AI and data center hardware stop being profitable.
umbrella@lemmy.ml · 1 year ago
Whenever the infrastructure is good enough, they can keep the hardware and stream your workload to you.