I think most of my complaints/frustrations can be summed up in one notion.
I’m accustomed to very literal (CNC/G-code) and very low-level/basic programming languages (PICAXE, Parallax STAMP).
While some of these are rudimentary in nature, that’s not the source of my frustration.
Python is very, shall we say, flexible. The reality is that it’s been made sloppy to assist the snowflake generation in having a programming language. There are many ways to do the same thing, some more “Pythonic” than others. To my mind, the “most Pythonic” option is the one, true option (everyone else is an apostate, so nyah). No, not because I’m a purist, as the snowflakes might contend, but because the path of least bullshit (observed, endured, or generated) is the only acceptable path.
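To make that concrete, here’s a minimal sketch (the variable names are mine, purely illustrative) of three ways to do the exact same thing in Python — loop over a list with indices. All three work; only the last is what people mean by “Pythonic”:

```python
items = ["spindle", "chuck", "tailstock"]

# C-style index loop: legal, but widely considered un-Pythonic.
for i in range(len(items)):
    print(i, items[i])

# Manual counter: also legal, also clumsy.
i = 0
for item in items:
    print(i, item)
    i += 1

# The idiomatic option: enumerate() does the bookkeeping for you.
for i, item in enumerate(items):
    print(i, item)
```

Three syntaxes, one result — which is exactly the flexibility being complained about.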
Amiga. PICAXE. Parallax STAMP. CNC machines designed/invented in the ’60s/’70s.
These all have (very) limited resources. You have to use methods of maximum cleverness and resourcefulness. You can’t afford to waste space, memory, or clock cycles. Before you write a single line of code, you’ve already got not enough of everything… You can’t be stupid or wasteful if you want your project to exist at all.
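Even in Python, that old frugality still pays off. A minimal sketch (the sizes are ballpark figures for 64-bit CPython, not guarantees): storing a million on/off flags as a list of ints versus a bytearray.

```python
import sys

n = 1_000_000

# A list: each slot holds a pointer to a full int object.
as_list = [0] * n

# A bytearray: one byte per flag, nothing more.
as_bytes = bytearray(n)

print(sys.getsizeof(as_list))   # roughly 8 MB of pointers alone
print(sys.getsizeof(as_bytes))  # roughly 1 MB
```

Same information, roughly an eightfold difference in footprint — the kind of choice that was mandatory on the old hardware and is now simply never made.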
The primary reason for code bloat (and bad performance) is that newer, snowflake-targeted languages don’t require this. Computing hardware has advanced by many orders of magnitude.
Just as snowflakes live the rest of their lives, inside the box of magic wires they assume entitlement to inexhaustible resources which can, and should, be squandered flippantly.
…and this they do, with deliberate abandon…
Which is why the computers of today are slower and less useful in spite of being vastly more powerful, with storage and RAM wildly in excess.
Want to #LearnToCode? Most “tutorials” have become backwards epeen brags from people who just barely know enough to speak on the subject. Documentation is sparse, inaccurate, sloppy, illiterate (sometimes due to the writer’s first language being non-English, which is entirely understandable), outdated, or usually, nonexistent. It’s very hard to learn from people who know next to nothing themselves, but put themselves up as experts just to brag…
The only saving grace is the IoT. Let me be clear, I hate the Internet of Things. But, tiny, embedded computers have forced a revival of actually knowing what the fuck you’re doing, kinda… If the Raspberry Pi hadn’t come along, I’d have likely become a voluntary neophyte.
The mess that the snowflake generation has made from using, but absolutely never understanding (often outright refusing to learn), technology is monumental. Imagine what else they’ll fuck up… The sky is the limit!
I learned, and very much prefer, the stringent rules and limited options of older technology. It forced intelligence and understanding. Retards simply couldn’t hack it (no pun/dad joke intended).
Much like the Internet at large…
Back in the 1990s, you had to have at least 3/4 of a brain to get on the Internet. Any form of socializing done on the Internet (including online dating) had an automatic filter that kept the trash out. Too stupid/trashy to Internet? Access Denied!
Apple reinvented their brand as a gateway to technology for the morbidly stupid. Android followed suit. The “Internet Appliance” of yesteryear became smartphones and tablets. We used to laugh at AOL and its users as the enclave we were glad wasn’t smart enough to escape AOL’s platform…
Thanks to Apple and, after that, Google, brainless trash have infiltrated, taken over, and spread like the pestilence that they are… Facebook. Twitter. Snapchat. Instagram. Reddit. Social media is cancer. A monument to useless degeneracy, humanity’s chronic race to the bottom…
“Platforms” that enable the lazy, stupid trash to do what they otherwise couldn’t… With those natural barriers torn down, the human race is doomed.