(whispers: the original iPhone was in many ways a superior user experience) https://vmst.io/@jalefkowit/113162083054623081
-
@inthehands I've never thought of it that way, but you're right. Ostensibly clean, but they hide the non-intuitive complexity. Plus, for the sake of appearing new (and selling devices), I think there's impetus to keep fiddling with it.
After I replied earlier, I was reminded of how consistent the Xbox user interface has been (after they shed the silly Windows 8 look). Probably for close to a decade now. I assume the PlayStation is similar.
-
@inthehands Have you ever tried using a Linux phone like the Librem 5?
I mean, it only asks for attention if I allow it to, and I'm okay with that. Still, it seems more like your concept of a tool for specialists.
-
@trochee
The “self-effacing” phrase is yours, and I love it.
-
P.S. If you’re wondering about my gripe about semantic drift above: the original meaning of “enshittification” was a situation where one middle player comes to control both the buyer and seller sides of a market in such a way that they can go all in on extracting value from the whole system without having to care about making things worse for everyone else. The traditional anti-trust view focuses on the seller side (monopoly/cartel); enshittification points to a semi-distinct •intermediary• pattern.
-
@joe @inthehands I don't disagree, but there's a positive side to this. It's like technical vocabularies in many disciplines: you can express more, and more exactly, in fewer words, but a layman will be confused.
Similarly, most consumers of smartphones are now domain experts in smartphones, so we get interfaces that depend on this knowledge to do more, at the expense of confusing those who are unfamiliar.
-
Brian Hawthorne replied to Paul Cantrell
@inthehands @darkuncle Well, a Gen 1 iPhone had no App Store and no way of installing software beyond the carefully coordinated set of Apple apps. Non-Apple software was limited to web apps. In retrospect, perhaps Steve Jobs was right.
-
@thejackimonster
I haven’t used one, no. I am interested in the PureOS effort. iPhones are in fact super good at not asking for attention unless allowed…if people undertake the considerable effort to actually use those features! But it’s not a default.
-
@gregtitus @joe
Yeah, it’s a real tradeoff. Spending so much of my time teaching programming to beginners has brought this tradeoff into sharp focus for me: I’m constantly forced to re-see things that had become so familiar to me as to be invisible. Half the time I’m in awe at the collective structure we’ve built, the other half in awe at the collective mess. For example, I don’t think I truly, deeply understood what was wrong with null/nil until I taught.
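To make the null/nil point concrete, here's a minimal sketch (in Swift, as the iPhone-adjacent language at hand; the User type is invented for illustration) of the trap a bare null sets, and how an explicit Optional type forces you to face it:

```swift
// A user record whose email may legitimately be absent.
struct User {
    let name: String
    let email: String?  // the "or maybe nothing" case is visible in the type
}

let user = User(name: "Ada", email: nil)

// The compiler refuses to treat `email` as a plain String;
// we must say what happens when it is nil.
if let email = user.email {
    print("Emailing \(email)")
} else {
    print("No email on file for \(user.name)")
}

// With a bare null (à la Java), the equivalent of user.email.lowercased()
// would type-check and then blow up at runtime. Here it won't even compile:
// user.email.lowercased()  // error: optional 'String?' must be unwrapped
```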
-
I just stuck a catchy name on your fundamentally new-to-us-in-this-timeline idea
Almost all our "personal technology" — from phones to "smart" houses and cars — draws attention to itself.
But it seems important to consider that it's both "personal" and "technology" that are self-aggrandizing.
1/2
-
Paul Cantrell replied to Brian Hawthorne
@bhawthorne @darkuncle
It’s a thought I hear regularly, and I do think there’s something to it — but looking at the history, it doesn’t seem to me that 3rd party apps are the line where things shifted. That “just pick it up and you can figure it out” phenomenon was alive until 2010 at least, in the heyday of third-party apps.
-
@inthehands @gregtitus it's also not always obvious when there's load-bearing "structure" hidden in the "mess"
-
The "impersonal" technology — "cloud" computing, for example — is carefully designed to direct attention elsewhere (& thereby hides its redistribution of externalities)
The personal tools that don't self-aggrandize don't get considered as "technology"
- the paper notebook
- the seatbelt
- the filing cabinet
-
@trochee
That’s an important insight!
-
@joe @gregtitus
Yes. Something we don’t tell students nearly enough is that all these things were made by actual people, and they were solving real problems. Sometimes even solving them well!
-
@inthehands
incentives for the phone
to be loyal to its owner
not its manufacturer
-
@inthehands @tehstu While I don't think strict skeuomorphism actually makes UX better, the (Jony Ive?) era at Apple that ripped the graphical design out and replaced it with minimalist flat color also ripped out a bunch of context about what was an interactive element and what was UI chrome, to the detriment of usability.
To make a UI usable it needs to do less and demand less state tracking by the user, especially when your brain doesn't have a 14-element working memory.
-
@inthehands I wonder how much the need to keep the dev team busy with visible "features," and the urge to be all things to all people, drives this MS Word level of feature creep away from a simple tool that does simple things. If there were a more competitive market with 6+ viable phone/tablet OS companies, you would have a better chance of finding an OS with an interface that fits your usage, as each fights for a different overlapping set of users with different needs. A duopoly can't produce that.
-
Lisa L. Spangenberg replied to Paul Cantrell
@inthehands In a related observation, UI changes to basic interactions mean I spend several weeks helping residents in my mom's senior community figure out how to do basic tasks that no longer work the way they did.
-
Paul Cantrell replied to Lisa L. Spangenberg
@medievalist
In my very strong opinion, software folks and designers of all stripes should give sustained and serious attention to the moment-to-moment experiences of beginners, the elderly, and other groups likely to experience usability friction — not just for the benefit of those people, which is reason enough, but because it will frequently improve the experience for everyone. Universal design ftw!