I have another 2-3 years with my 1600.
Gaming, working (data processing, physical modelling).
The trick is to use a lower overhead OS than Windows.
I’m still using an i7-3630QM and an R5-1600.
They are both enough for what I do with them. Why would I upgrade?
I feel vindicated that Vista and 8 were my favorites as well.
It’s not worth the cost of ruining LEO, nor the environmental effects of the satellites burning up in the atmosphere.
I’m daily driving a 2013 laptop on Endeavour and it feels as fast as new stuff. Doing a lot of relatively heavy compute on it too.
Sponsors pay much more than views. So do patrons.
The true issue is discoverability in my opinion.
Where can I book a train to Europe?
Move it to an external hard drive with anything else you want to keep, then you’ll have access to it on any computer no matter the OS.
The only way he doesn’t eventually is if he dies, and he’ll have shown them that they don’t have to wear masks anymore and can go as hard as they want.
I so wish I could get my hands on an electric one to replace my 2006 TDI…
Fair enough
Mercurial is way better.
There, I said it.
Please, most people don’t know how to use a scientific calculator at all.
If I wanted to look at the world through a screen, I’d stay home and watch a documentary.
The camera they use will never have the acuity, color perception, and dynamic range that your eyes have. It probably doesn’t work very well in dark environments, and it’s definitely completely useless for stargazing.
Yeah, but the cost of low latency is thousands of satellites that burn up in the atmosphere, need to be continuously launched, are a catastrophe for optical and radio astronomy and crowd LEO, reducing available space and increasing collision risk. All for a barely scalable system.
It’s not worth it. If you want low latency, get a cable run or talk to a ground-based antenna.
A geosynchronous satellite makes much more sense for those use cases.
The issue with Starlink is the choice to be in LEO instead of using geostationary satellites. It lowers the latency, but it makes the whole project completely unsustainable.
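The latency trade-off above is easy to quantify: a rough back-of-the-envelope sketch (using approximate public altitude figures, not exact orbital parameters) shows why LEO is so much faster at the physics level, ignoring processing and ground-network delays.

```python
# Minimum speed-of-light round trip for the two orbit choices.
# Altitudes are rough public figures (assumptions, not exact values).
C_KM_S = 299_792  # speed of light in km/s

def min_round_trip_ms(altitude_km: float) -> float:
    """Ground -> satellite -> ground, straight up and back,
    ignoring routing, processing, and ground-network delays."""
    return 2 * altitude_km / C_KM_S * 1000

leo_ms = min_round_trip_ms(550)      # a typical Starlink shell altitude
geo_ms = min_round_trip_ms(35_786)   # geostationary altitude

print(f"LEO: ~{leo_ms:.1f} ms, GEO: ~{geo_ms:.0f} ms")
```

Real-world figures are higher on both sides, but the roughly 60x gap in the physical floor is what drives the LEO choice.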
My modelling is CPU-bound, as it’s a model written in Fortran by physicists (me included). The fact is that I wouldn’t get a 4x boost, and a model that runs overnight would still run overnight. When I actually need performance I use a 1,000-core compute cluster for multiple days, and that would never run on any consumer CPU anyway.
For the data processing, the real bottleneck is disk access and my scripting speed, so the CPU doesn’t really need to be amazing.
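A quick way to sanity-check the disk-vs-CPU claim is to time the two phases of a pipeline separately. This is a minimal sketch with stand-in workloads (`load` and `process` are hypothetical placeholders, not real pipeline code):

```python
import time

def timed(fn, *args):
    """Run fn and return (result, elapsed seconds)."""
    t0 = time.perf_counter()
    result = fn(*args)
    return result, time.perf_counter() - t0

# Stand-ins for a real pipeline (hypothetical names):
def load(n):           # imagine this reads raw files from disk
    return list(range(n))

def process(data):     # imagine this is the actual number crunching
    return sum(x * x for x in data)

data, load_s = timed(load, 100_000)
total, cpu_s = timed(process, data)
# If load_s dominates cpu_s, a faster CPU barely moves the
# end-to-end runtime -- Amdahl's law in practice.
```

Swap the stand-ins for real I/O and compute steps: when the I/O phase dominates, CPU upgrades buy little.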