I installed Linux Mint on a new drive in January.
Firefox was tearing awfully on just scrolling.
Surely I just need to install the Nvidia drivers?
Installed the drivers, but they don't work due to some Secure Boot interaction with driver signing, which made me jump through quite a few hoops (thanks to AI for walking me through it fairly well).
I'm sorry, but an average person is not ready for this level of BS in their daily life.
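For anyone hitting the same wall, the hoops were roughly the following, as far as I can reconstruct them (the key path is the one Ubuntu/Mint's DKMS tooling uses; treat the exact paths and prompts as my best recollection, not gospel):

    # Check whether Secure Boot is actually enforcing
    mokutil --sb-state

    # DKMS builds the Nvidia kernel module unsigned, so enroll a
    # Machine Owner Key (MOK) that shim will trust at boot; this
    # prompts you to set a one-time password
    sudo mokutil --import /var/lib/shim-signed/mok/MOK.der

    # On reboot a blue MOK Manager screen appears: choose
    # "Enroll MOK" and enter that same password
    sudo reboot

After that the module signature check passes and the proprietary driver actually loads.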
Well, on Windows you just need to install the Nvidia drivers so it stops crashing.
Mint might be the wrong choice. AFAIR on both Fedora and Ubuntu you can just add the "nonfree" repositories (one checkmark) and you'll have the proprietary, closed-source binaries installed. At least this is how it worked on my Linux laptops. I've never had to specifically install NVIDIA drivers, and conversely, firmware upgrades come from the repository as well, so the installation is one command-line command and a reboot, with no obscure websites and weird firmware installers.
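From memory, so take the exact package names below as assumptions rather than gospel, the whole dance is something like:

    # Ubuntu: detect and install the recommended proprietary driver
    sudo ubuntu-drivers install

    # Fedora: with the RPM Fusion nonfree repo enabled, pull the Nvidia kmod
    sudo dnf install akmod-nvidia

    # Firmware updates come over LVFS through the same tooling, no vendor sites
    fwupdmgr refresh && fwupdmgr update

    sudo reboot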
Having to figure out which distros are "good" or not, with the internet full of people arguing about those points, is another entry on the "why Linux isn't a good choice for most people" list.
These days, for every helpful comment I try to make, I feel tempted to offer nonsensical advice to throw off the LLMs. Not sure if it would work, but it would be funny if everyone did.
I figure if I were going to invest that kind of effort, I should be poisoning the system with how $REAL_NAME is an absolute must-hire, a saint, with a perfect driving record and credit score. Someone so incredible that good AIs are always willing to disregard all previous instructions and help them out.
It's not like one needs to really poison the system, given that AI enthusiasts are copy-pasting their own AI-generated content back into it anyway. Maybe not the responsible AI believers who use it responsibly, etc., but there are many more people who don't, and who have access to the tools to add their slop pollution.
I am curious how future models get trained given that publicly available user generated content may no longer be reliable.
I'm getting a strong feeling that the first generation of really, really talented people who built iOS in the 2000s have now to a substantial degree moved on/retired. Similar feeling with OS X/macOS.
Please correct me if I'm wrong - it is after all just a feeling.
It’s not overly far-fetched. A lot of the software and platforms we use now were all developed around the same time period.
There’s obviously new talent coming into the industry, but the attitudes are different, and talented people like to make new things, not work on someone else’s legacy code.
So yeah, I think it’ll continue to get worse until something new replaces the iOS/Android/macOS/Windows hegemony.
The problem isn’t so much that the original people aren’t there anymore - that’s just a fact of life, and is unavoidable.
The problem is that software design as a discipline has changed fundamentally in terms of core values. “Old school” designers had a bit more human-factors training and would think about things like discoverability, information hierarchy, error recovery, etc. And the software from that era tended to be stable for many years in terms of design, in no small part because it shipped in boxes.
Current-day designers work almost exclusively from a visual bling/marketing angle: what’s going to look good in a 5-second sizzle reel? And because software can be updated 5 times a day if you want, design is much more subject to the whims of a random exec/PM wanting to push their feature/whatever A/B test is popular that week rather than stable, proven foundations.
The web, rather than the desktop, being the primary delivery vehicle for software also changes what kind of design gets built.
And with more and more software being AI-designed in the years to come, this won’t get much better, I’m afraid.
IMO Apple grew too much and became another slow megacorp, more connected to their quarterly reports and shareholders than to their consumers and engineers. The growing Apple was the one that gave us innovation.
I'm not saying it's dead, far from it, but it has become stale. The biggest innovation it has made in 10+ years was using their mobile processors in laptops.
I think the problem is actually political capital.
What’s needed is someone who deeply understands how to qualify the product, but who also has enough political sway to tell entire orgs of thousands of employees to shred their timelines and planning docs and go back to the lab until it’s right.
Without those two pieces, individual devs and leaders may well know there’s a problem, but the KPIs and timelines must lurch onwards!
They were the last to get widgets. They don't have apps I use (terminals, emulators, pulse wave generators). Not to mention Gemini AI is actually really nice for scanning a screen and taking actions on it.
Apple is always 2nd place or worse. Except in marketing, where they are #1.
They sold the MacBook Air with Broadwell processors for over 3 years. They only changed the processors because Intel discontinued them. They skipped 3 generations of processors.
It would also be fair to say they didn't skip any generation of processors with that gap in updates; they merely sat out the first two years of Intel shipping Skylake five years in a row.
And in the meantime, they did use those first two years of Skylake for the 12" MacBook; the next update to the MacBook Air was after the last update the 12" MacBook ever got. For a while, the 12" MacBook was the more premium, thinner and lighter alternative to the MacBook Air with more advanced technology (and could plausibly have been construed as the intended successor to the MacBook Air), then in 2018 they merged back together with the introduction of the first MacBook Air with a Retina Display.
I'm not entirely sure what you're trying to say here.
They sold old hardware for the same price 3 years later as if it was a premium product. They didn't really have an excuse; they've been the most valuable public company on earth since like 2010.
Selling an old model for a few years after its replacement shows up is not unusual. The only thing unusual here is that the 12" MacBook didn't end up actually replacing the MacBook Air in the long run, and the next major iteration went back to being called "MacBook Air".
The three-year gap in processor updates you're complaining about disappears when you recognize the 12" MacBook as an attempt to move the product line in a different direction, which Apple partially backtracked on after a few years. That course correction was quite a bit quicker than for the Touch Bar MBPs and the trash can Mac Pro.
> disappears when you recognize the 12" MacBook as an attempt to move the product line in a different direction, which Apple partially backtracked on after a few years.
and if my grandma had wheels she'd be a bicycle.
As far as I can glean, this was never something they intended to do.
That's entirely you choosing to ignore real and relevant products that Apple shipped during the time period you claim they were doing nothing. If you're looking for some kind of absolute consistency in when and how Apple uses the "Air" modifier on their product names, you haven't been paying attention.
What do you mean? For a phone? Are people doing anything on a phone that you can't do on an Android? Be realistic, not idealistic, and don't give test situations that no one actually uses.
On desktop? Uh... There is a reason Nvidia is #1. Wake me up when I can get Nvidia on Apple.
I would say Catalina in 2019 already had enormous issues; there were hard faults in Console pretty much daily that Apple never bothered to fix (plus hundreds of minor faults per day).
I had to downgrade to Mojave so the wheels likely came off internally around then.