Carrier-grade NAT. For instance, on our local mobile phone network, thousands of handsets will have the same public IP address.
You can get soft silicone ear pickers with a built-in camera now, so you can see what you’re scooping.
How old is “older”?
I run the latest Debian on a 10-year-old MacBook Pro. Linux has given this laptop a second life as a lab machine - it’s still plenty fast enough, and it has a really nice screen (Retina) which Debian gets right out of the box with no tweaking. The only thing I needed to do when installing Debian was manually fetch the firmware for the WiFi hardware during the install (although Debian ships non-free firmware by default these days, it isn’t permitted to distribute all firmware, and the WiFi chip in this machine unfortunately happens to be one of the exceptions).
Never. We had a work lunch, and a few days later one of the guys said, “I just tested positive for covid - better test”. About 2 days later I was testing positive, but none of us in the household ever had any symptoms other than testing positive (about 4 days in, the LFT went bright red as soon as the liquid reached the test line). None of us ever had so much as a sniffle. The guy we got it off was really rough for a few days.
I think 30 fps (25 fps in PAL-land) became the standard because televisions were effectively 30 fps (NTSC) or 25 fps (PAL) due to interlacing. While an NTSC television redraws the screen 60 times per second, each redraw is one of two fields, so you only get 30 full frames per second. This was done so you could have decent resolution (525 lines for NTSC, 625 for PAL) while keeping the TV signal within reasonable RF bandwidth limits: each frame is sent as two fields, half of the picture in each field, on alternate scanlines.
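As a back-of-envelope restatement of that field/frame arithmetic (rounded rates; real NTSC is ~59.94 fields/s):

    # Interlaced TV: each full frame is sent as two fields
    ntsc_fields_per_second = 60   # rounded; real NTSC is ~59.94
    pal_fields_per_second = 50
    fields_per_frame = 2
    print(ntsc_fields_per_second / fields_per_frame)  # 30.0 full frames/s
    print(pal_fields_per_second / fields_per_frame)   # 25.0 full frames/s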
So there’s a lot of industry inertia to deal with, and 30 fps (or 25 fps where PAL was formerly the standard) ends up being the standard. For video it’s good enough (although 60/50 fps is still better - until fairly recently that would entail too much bandwidth, so sticking with the old NTSC or PAL frame rates made sense).
But for computers, no one really used interlaced displays because they’re awful for the kind of things computers usually show: the flicker with a static image in an interlaced screen mode is terrible. While some interlaced modes did exist, nearly everyone tried to avoid them - the resolution increase wasn’t worth the god-awful flicker. So you always had 60 Hz progressive scan on the old computer CRTs (or, in the case of original VGA, IIRC it was 70 Hz). To avoid tearing, animated content on a PC would use the vsync to stay synchronized with the CRT; this is easiest to do at the exact frequency of the CRT and gave very smooth animation, especially in fast-moving scenes. Even the old 8-bit systems ran at 60 (NTSC) or 50 (PAL) fps (although 1980s 8-bit systems were generally not doing full-screen animation - usually it was just animating parts of the screen).
So a game should always be able to hit at least 60 frames per second. If the computer or GPU is not powerful enough and the frame rate falls below 60 fps, the game can no longer use the vsync to stay synchronized with the monitor’s refresh, and you get judder and tearing.
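To make the frame-budget point concrete, here’s a minimal pacing sketch (render_frame() is a hypothetical stand-in; a real game blocks on the driver’s vsync via its graphics API rather than sleeping):

    import time

    REFRESH_HZ = 60
    FRAME_BUDGET = 1.0 / REFRESH_HZ  # ~16.7 ms per frame at 60 Hz

    def render_frame():
        pass  # hypothetical stand-in for the game's update + draw

    while True:
        start = time.monotonic()
        render_frame()
        # Sleep off whatever remains of the 1/60 s budget. If render_frame()
        # overruns the budget, the frame misses the refresh - that's the
        # judder described above.
        remaining = FRAME_BUDGET - (time.monotonic() - start)
        if remaining > 0:
            time.sleep(remaining)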
Virtual reality often demands more (I think the original Oculus Rift required 90 fps) and has various tricks to ensure video is always presented at 90 fps: if the game can’t keep up, frames get synthesized (see “asynchronous spacewarp”). But in VR, having to fall back on asynchronous spacewarp because you can’t hit the native frame rate is generally awful - it inevitably distorts some of the graphics and adds some pretty ugly artifacts.
Bikes don’t go very well in neutral, I’ve found.
A good curry the night before usually guarantees at least two.
Honda. The answer is Honda.
Why we won’t raise our kids in suburbia: https://www.youtube.com/watch?v=oHlpmxLTxpw
Renewables are already viable in the UK and make up an ever-increasing percentage of electricity generation. Additionally, the time when it’s windiest in the UK is also when electricity demand is at its highest.
Using coal for electricity in the UK is now rare: coal made up only 1.5% of UK electricity generation in 2022. Just ten years ago, it was nearly half.
You can use dd on another machine to make a bit-for-bit copy of the card before you first use it.
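The classic invocation is something like dd if=/dev/sdX of=card.img bs=4M status=progress (sdX is a placeholder - check lsblk first). If you’d rather script it, a rough Python equivalent as a sketch:

    import shutil

    SRC = "/dev/sdX"   # placeholder device node - verify with lsblk first!
    IMG = "card.img"

    # Byte-for-byte copy in 4 MiB chunks, like dd's bs=4M
    # (reading a raw block device needs root)
    with open(SRC, "rb") as src, open(IMG, "wb") as dst:
        shutil.copyfileobj(src, dst, length=4 * 1024 * 1024)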
Debian (a very conservative distro) switched to Wayland by default in Debian 10, if I’m not mistaken (we’re now on 12).
I didn’t notice the change until I tried to run a niche program that really needs X11. Unless you’re doing that kind of thing, you can probably just use Wayland. At least in Debian it’s really easy to switch between Wayland and X11 by selecting the session type when you log in.
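If you’re not sure which one you’re currently running, graphical logins normally export XDG_SESSION_TYPE; a quick check (assuming your display manager sets it, which the common ones do):

    import os

    # Usually "wayland" or "x11" in a graphical session
    print(os.environ.get("XDG_SESSION_TYPE", "unknown"))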
https://www.youtube.com/watch?v=iYWzMvlj2RQ
“I’m also very happy to point out that nVidia has been the worst […] so nVidia, ‘fuck you!’”
¡Me cago en Dios! (Spanish: “I shit on God!”)
Similar to “He’s one can short of a six pack”
Twice, and they were completely different experiences.
The first was gas at the dentist’s, for taking three teeth out because my mouth was overcrowded. I was kind of asleep: I could hear people’s voices in a really trippy, flanged way, and I could vaguely feel some tugging at my jaw (but no pain). The gas tasted awful.
The second was for an operation in hospital after an accident (6.5 hours of microsurgery). It was like jumping forward 7 hours in time: literally counting the seconds after the anaesthetic went in at night, then immediately waking up in broad daylight. It’s completely unlike deep sleep (where you’re still aware that time has passed).
Mostly the 80s and early 90s (I think the nodelist peaked in size around 1991 or 1992, at about 30,000 nodes - I could be wrong). It’s still going, by the way.
But it does help give an idea of who’s making the most reliable drives (both SSD and hard disk). No, it isn’t a guarantee, but it’s still useful information, especially when it’s not just a friend-of-a-friend anecdote but data gathered across tens of thousands of drives.
As a workaround, you could use OBS and its virtual camera: Discord streams what it thinks is a camera, while you compose whatever you want to share on your desktop through OBS.