Apple’s Role in the Noösphere

Approximate Reading Time: 10 minutes

At the behest of a good friend, who asked me on Facebook what I thought of an article he had run across on the Web, I read the piece. I meant to read it twice, to help get past a significant number of potholes and bumps in the text, but I simply could not muster the initiative to go beyond that first reading, which had left my friend feeling, with regard to Mr. Kay’s narrative, that “he goes over my head a few times.” I felt, conversely, that my friend was being charmingly polite and self-effacing. As you will see, I can’t manage those same sentiments, authentic in him, and remain credible; in me the sincerity would be feigned.

Here is the link to the “Fast Company” Web page with the article in question. You can read it before or after reading what I have to say. Or, if you have sufficient self-regard, you can skip it altogether. If you have an overabundance of self-regard, it’s possible you’ll elect to stop reading me right here.

https://www.fastcompany.com/40435064/what-alan-kay-thinks-about-the-iphone-and-technology-now

Undoubtedly Alan Kay has always been a smart cookie. It’s not entirely clear, at least not from this Fast Company interview, that he is able to articulate intelligibly what goes on in that head of his. The interview has been filtered through the mindset of a typical Fast Company contributor, which is to say, one of a huge team of well-educated millennial ax-grinders. Whatever Kay actually said remains, likely irretrievable, in the silicon pathways of Brian Merchant’s digital recording device.

That’s the first problem with extracting anything of meaning, never mind of value, from this deliberate, cozy, but still reverential brush with the greatness of late 20th-century cybernetic pioneers. The second problem is that, despite the first, I think (though I cannot be sure) it is possible to extract some hints of motive from Mr. Kay’s various expostulations, colored though these may be by the mission of young Mr. Merchant, as evidenced in his selective contributions to the “conversation” documented here. It sounds like there’s more than a bit of the product designer manqué in Kay. Despite his generous assessment of Steve Jobs’s marketing genius, the deficiencies he delineates in the progress of the product concepts he envisioned with his collaborators more than fifty years ago seem to be deficiencies of marketing rather than failures in the evolution of the underlying technology, which he hardly touches on. Possibly that is because the lede here should have been not that Mr. Kay is unimpressed (the piece reads as a fragmentary memoir of his history of insufficient esteem for the accomplishments of Mr. Jobs, with whom Mr. Kay implies a collaborative bond) but that Mr. Kay would have loved to introduce products to the market with the same demonstrable, indeed monstrous, success as those Apple actually did present, going back to the iMac.

Further, and this is the third and possibly the biggest of the problems I have reading this feature story from “Fast Company”: it is not at all clear that errors in navigation, so to speak, for the great ship of Human Knowledge (with its fleet of support vessels, which carry the means not merely of furthering its course but of keeping it sailing the endless seas of the universe), at least since the advent of products that serve what I’ll call the market for consumer computing, are attributable to the products being offered so much as to the applications for which the market asserts its preferences. In short, it has never been my perception that Apple envisioned the design and production engineering of a product that would optimally enable spending hours in the wholly hermetic self-involvement of a game called “Candy Crush.” Along these lines, and more in an abstract sphere, Kay had occasion to allude to the great, if not culturally cataclysmic, aperçu of Professor Marshall McLuhan concerning the impact of certain specific mechanical technologies not only on human societies but on human nature. I think it’s unfortunate that Kay, I am sure unwittingly (but who knows?), perpetuates the perception that McLuhan was a philosopher (and possibly an evolutionary psychologist) when he was, in fact, mainly a literary qua cultural critic.

I can’t be sure of this, though, because, ironically (irony being, as Merchant and Kay make clear, the touchstone communicative mode of the zeitgeist), although Kay lavishes praise on the rhetorical skills of the likes of Neil Postman, and further back of Bertrand Russell (“that bastard”), who was capable of writing “like a dream,” Mr. Kay himself does not talk like a dream. All of this suggests, and this published interview punctuates the perceptible fact, that unlike them Mr. Kay is not capable of being clear, first and foremost, and thereby persuasive, especially of facts not otherwise palatable to the recipient of the argument. But then, this is a heavily edited and manipulated interview on the heels of a major product introduction by the world’s largest company in terms of market capitalization. And it appears in Fast Company, a tarted-up business magazine with what seems to be an inalterable mission. Its agent in this case, an aggressive journalist bent on positioning himself as the resident historian of the development and impact of the Apple iPhone, states his professional purpose on his LinkedIn profile as follows:

Today, he spends most of his time investigating the myriad ways humanity is attempting to survive itself.

Talk about high-minded, purposeful solipsism.

Instead of McLuhan, it seems as if Kay, and his self-appointed henchman Merchant, should have dug a bit more into the ideas of Teilhard de Chardin, and in particular his notion of the noösphere. It’s a concept that has been kicking around since the 1930s (though it’s hardly a popular, lively topic), and it thereby lends a certain estimable patina to the already comfortably burnished ideas that issued from the labs not only of Xerox, the company that never got over failing to become what Apple has proven itself to be, though it showed every promise of doing so; it simply could never get over the hump of conceptualizing and developing products that could be marketed and sold successfully to the mass consumer market… something that Sony, Apple, and, for a long while (until it lost its technological grip), Polaroid, among many others, proved themselves quite capable of doing. The same ideas issued as well from a great number of academic laboratories and whole departments in the applied and theoretical sciences.

The notion that there is a concurrent, coextensive, and (insofar as I can understand some complex and possibly arcane theory) commingled developmental human capacity keeping pace with, if not finally and now (should I say NOW!) exceeding, the excrescences of evolution as usually understood in terms of natural selection is, in short, not a new idea. That there is a superseding (what I will provocatively call) ontological development in the evolution of human epistemology (please IM me if that “human” is redundant, and I’m just sounding like a fool) remains to be proven, however. But a lot of people sure seem to want to think so. And a lot of very smart people are counting on the insinuation of certain largely 20th-century technologies into the gestation of some new kind of what I’ll call consciousness, which Kay here, very clumsily and slightly incoherently, calls “another level of thought.” Those technologies start with the Turing engine (in the form of the still barely modern digital computer) and continue through the accretive accumulation of a wide range of programming languages, including but not stopping with the so-called object-oriented ones, as well as mimetic architectures for computer engine design (with their tightly bound software|hardware manifestations), of which neural networks are the most prominent example in my mind. There is, possibly, some suggestion, particularly in keeping with the thinking of the theorists of noöspheric structure, that this presumably extranumerary level of thought is in fact a wholly new level of thought, somehow, again mysteriously and incomprehensibly (here), aided and abetted, if not stimulated, with some vague suggestion of insemination, by the great computing advances envisioned in Palo Alto and other places. That aiding, abetting, the, uh, stimulation, the, erm, insemination, which is to say the enabling of some new dawn of thinking, would occur if only we would let it. Except we are bent on watching, serially or with sporadic binging, entire seasons of the alleged comedy series “Bojack Horseman.” All that potential enlightenment down the omnivorous black hole of popular culture.

Having said all that, allow me to say just one more thing, briefly, because I am afraid I have already taken up too much of your time to leave you comfortable, even at the risk of seeming suddenly to change the subject. I’m not changing it. I’m just doing what every creative nonfiction writer in this day and age worth his or her rhetorical salt does: I am making it personal, because the mission of deconstructing and then deriding the suspect emissions of a noteworthy, brilliant computer scientist is always a dead end. Unlike Mr. Merchant, who by familiar conversational postures and ploys suggests he is Mr. Kay’s peer, I am by no means his peer (though, to play the age card, I am far closer chronologically than Mr. Merchant can ever be while Mr. Kay lives, and I thereby have my own memories of the very same periods of the development of computer products and the underlying science and engineering).

Nevertheless, I have no problem stating that I am not at all unimpressed by Apple’s latest product announcements, especially in light of the small lights that went off in my mind (premonitory LEDs, of a kind) as I watched the Apple Event on September 12, the extended product commercial wherein they announced the much anticipated new line of iPhones. First, I must offer the caveat that I am not an inveterate watcher of these fanboy events; I had never before watched one from beginning to end, as I did this one. Something told me to watch, and I can’t say what (nor do I wish to devote the time and emotional energy to figuring out what “told” me; I’ll just say I have a lot of faith in intuition).

After consciously noting and filtering out the tsunami of ejaculations (I am speaking entirely of rhetorical phenomena) from the mouths of the parade of Apple executives delegated to announce the products and their attendant features, ejaculations consisting essentially of the words “beautiful” and “magic,” two things struck me as particularly compelling. I don’t pretend to be an exhaustive reviewer of popular media, or even of the self-consciously nominal intellectual fare of which I am a significant consumer. But little attention was paid to two facts about the new products: one a functional capability of the newly announced Apple Watch Series 3, the other a facet of the underlying computer design that enables the new crown jewel of smartphones, the iPhone X.

First, we were told that in addition to the liberating capability of donning a watch that keeps us coupled, given an available signal of sufficient power, to the nation’s grid of cellular transmitters, the watch, with forthcoming software revisions, will be capable of monitoring cardiac arrhythmias. This is very big. It’s big, no doubt, as a significant potential advance in the diagnosis and prevention of debilitating, if not fatal, cardiac and cerebral anomalies. Without belaboring this (this is not the place, and I don’t have the time, even if you do), it can have a significant impact on ensuring well-being and greater healthy longevity for humans, and I would suggest, tantalizingly, that it has implications for how we will be able to think about the nature of mortality, and all the attendant epistemological matters. Talk about a new “level of thought.”
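
For the technically curious, here is what “monitoring cardiac arrhythmias” might look like reduced to its crudest software form: a deliberately naive sketch, in Python, of flagging unusually variable inter-beat intervals. I hasten to add that this is not Apple’s algorithm, which neither the keynote nor the interview discloses; the function name, the statistic, and the threshold are all hypothetical, have no clinical validity, and serve only to make the general idea concrete.

```python
# A naive, hypothetical sketch of rhythm-irregularity flagging from
# inter-beat intervals (in milliseconds). Not Apple's method; the statistic
# and the threshold are illustrative only and have no clinical validity.
from statistics import mean, stdev

def possibly_irregular(rr_intervals_ms, cv_threshold=0.15):
    """Flag a window of beats whose interval variability looks unusually high."""
    if len(rr_intervals_ms) < 10:
        return False  # too little data to say anything
    cv = stdev(rr_intervals_ms) / mean(rr_intervals_ms)  # coefficient of variation
    return cv > cv_threshold

# Example: a steady 60 bpm window versus an erratic one.
print(possibly_irregular([1000.0] * 30))                       # False
print(possibly_irregular([600.0, 1400.0, 800.0, 1200.0] * 8))  # True
```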

Second, and this could be even bigger, though I can’t say, because I don’t have the bona fides even to think about potential applications: the new iPhone X has, embedded in its Face ID engine, a computing advance with clear, proven, highly affordable manifestations, albeit one the vendor is hard-pressed to describe to an avid consumer public in any language other than the ridiculous word “magic.” It’s not magic, but it is incredibly powerful, and it will fit in anyone’s pocket. I am speaking of the architecture of the new A11 Bionic chip, with its neural engine, in the iPhone X. This was conceivable, but, if you will, unthinkable in a consumer product back in the 1980s. For perspective, the Macintosh, which Mr. Kay considered the first personal computer “worth criticizing,” was introduced in 1984; it was capable of facilitating, in what in hindsight was only the most primitive way, the graphical user interface, with the ability to “draw” on the screen of a cathode-ray tube… and, frankly, not much else worth noting, except the use of a new “input” device charmingly called a mouse. All of this had been envisioned by Mr. Kay and his cohort at Xerox PARC back in the 60s and early 70s, which is to say it took, let us say, 10 or 15 years to see it realized in a consumer product. It has, for practical purposes, taken 30 years for a true neural network architecture to see realization as a viable consumer product.
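
Since I have invoked the phrase “neural network architecture,” it may help to show, again in the crudest possible terms, what such a network computes in a face-matching pipeline: a capture is mapped to an embedding vector, and that vector is compared against an enrolled one. The sketch below, in Python, is emphatically not Apple’s Face ID design; every name, dimension, and threshold in it is hypothetical, chosen only to illustrate the pattern.

```python
# A toy sketch of the general pattern behind neural-network face matching.
# Nothing here is Apple's design; layer sizes, names, and the threshold are
# hypothetical, chosen only to make the pattern concrete.
import numpy as np

def embed(capture, layers):
    """Run a capture through a tiny feed-forward network to get an embedding."""
    x = np.asarray(capture, dtype=float).flatten()
    for w in layers:
        x = np.maximum(w @ x, 0.0)          # linear layer followed by ReLU
    return x / (np.linalg.norm(x) + 1e-12)  # normalize to unit length

def matches(enrolled, probe, threshold=0.9):
    """Accept when the two embeddings point in nearly the same direction."""
    return float(enrolled @ probe) >= threshold

# Usage, with random stand-ins for real weights and a real depth capture.
rng = np.random.default_rng(0)
layers = [rng.standard_normal((64, 32 * 32)), rng.standard_normal((16, 64))]
enrolled = embed(rng.standard_normal((32, 32)), layers)
probe = embed(rng.standard_normal((32, 32)), layers)
print(matches(enrolled, probe))
```

The point of the toy is the shape of the computation, not its fidelity: a dedicated neural engine exists, broadly speaking, to make that “map to an embedding and compare” style of work fast and power-efficient enough to run on a phone.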

I’ll just say, to conclude, that it’s too bad, noting only one major benefit of this advance, at least as Apple presented it. I mean aside from the vaguely engaging application of highly secure three-dimensional facial recognition to unlock the phone (not engaging enough to justify replacing my perfectly fine current iPhone 7 Plus, less than a year old, outmoded as its technology suddenly has become; I will just have to live with the humiliation). It’s really too bad, in fact, that Apple in their considerable wisdom (born of incredibly successful and undoubted marketing acumen; certainly Mr. Kay attests to it) chose to put enormous emphasis on what I can only describe as the colossally trivial ability to animate cartoon characters with a simulacrum of basic emotive expressions, and anthropomorphically at best.

You’d think, and I hope in a small way that Mr. Kay would concur (if this is not precisely what he was trying to say, and would have said without the interference of Mr. Merchant), that the world has enough smiling, grimacing, gesticulating cartoon panda bears.

I could add that once you have an iPhone X, unless you can use it to solve some significant, complex problem that has eluded very serious investigators and researchers for years (which I fully intend to do with mine), you should put down the phone, making sure it’s on its wireless Qi charging pad, and go play with a dog. Salutary for all aspects of the brain chemistry. But I won’t add that, because it would be snotty.


2 thoughts on “Apple’s Role in the Noösphere”

  1. Gets complicated, this social media business, doesn’t it, Geoff? Should I answer here, and if I do, why… it appears all of 13 people, at best, will see what I have to say. But if I answer here, and I did spend all that time saying what I had to say, why let it go to waste? I mean, the guy did decide to use the slim justification of having the crutch of a blog he’s been keeping for donkey’s years (well, OK, so it’s only since 2004), but I’ve been sending out my screeds, through the simple expedient of using that nifty bcc: feature that’s been on email from the beginning, to what were always probably only equivocal recipients, only two of whom ever had the temerity to say directly (and privately, thank God), “please take my name off your mailing list for this stuff… I love your thoughts and writing and all, but I just don’t have the time… my hedge fund is so demanding.” Remember Eudora? Remember when great software was free, because the taxpayers of the great state of Illinois, in this instance, and wholly unbeknownst to them (because who had enough loose cash to buy a computer, and what’s a modem anyway?), were bankrolling the hacking and screwing around of graduate students who simply ginned up what we now call applications, because that was the only model they had for doing what they wanted to do with these gizmos with the black screens and the glowing kelly-green characters that looked like a bad teletype on acid, and who, since it was paid for with public funds, made their handiwork available to all comers with a phone line and an Amiga?
    But let me let you, and the 12 other people here, in on something. I’m answering here only because what I had to say ran well in excess of 1000 words, but also because I have decided that I have to act on the pernicious (my word) influence of Facebook on my life, at least, and therefore, using some nice little apps I got cheap, I am severely self-limiting the time I spend on what is actually a long list of “time-suck” websites to no more than 30 minutes out of each 24 hours. So, I’m in and then I’m out. And it means that almost everything I say there is written off-line, which is to say, not within the confines of the little Facebook posting window, which, as you know with your advanced levels of sensitivity, good taste, and a sterling vocabulary, is incredibly limiting in terms of how one can spread one’s rhetorical and typographic wings and fly… expressively speaking. It’s not so much that the PTB (powers that be) at Facebook give a shit about how much you write. The biggest mistake of all sites that allow comments, but especially of those that thrive on what people say for free about places they dine, or slumber, or cavort, and which they gin up as informed criticism by experts (look! I got an expert badge! And all because I’ve shit on little restaurants in 14 states from here to New Mexico…), is that they limit you, without telling you, to 5000 characters. And then their editing engine, such as it is, hasn’t had a word of code changed since some grad student in Indiana wrote it back in 1978 to keep from flunking his required course in Pascal and getting thrown out, so it’s virtually impossible to cut out the now verboten excess verbiage. Which is why I use a keystroke logger and act as my own big brother (because no company that manages to stay in business writing decent keystroke-logging software can do it without positioning its products as a way of monitoring employee workstation behavior, to curtail lollygagging, porn-viewing, and writing nasty right-wing posts anonymously on Twitter).

    But really. To the matter at hand.

    Computer vendors have been the whipping boys of high-and-mighty, above-the-fray, displaced academics like Kay and all the rest (Berners-Lee, Papert, etc.), who were fugitives from the ivory tower or had a genuine vocation, not to be mocked, to be pedagogues and teachers, and who thereby saw all technology as a lever to raise the world of learning higher and higher onto the shoulders, Atlas-like, of the god of knowledge and enlightenment. So of course Apple, like every other seriously ambitious vendor of the next best thing (or, as I liked to call it, and even owned the domain name for a while, the best next thing), promised to give away computers, software, gizmos, gimcracks, whatever, to schools everywhere for free… Fat chance. The promise was of computers as not merely facilitators but enablers of learning skills. (Do we really need skills at learning? I thought we were born with those. What we really need are what the Ferengi would call tools for making the process of knowledge acquisition more productive.) As for me, since no one asked, and I do have a modest, if shrinking, vocation to teach, I always thought the most useful purpose of immersion in formal, education-saturated environments is to teach young ’uns to think, beyond their innate unformed ability (variable by individual and genome, not to mention the home, uh, situation) to intuit and grope around… I learned to think in school, but unfortunately not because anyone ginned up a strategy with that as an objective. I did it mainly by picking and choosing who among my teachers seemed best at expressing their particular way of looking at the world. And not one of them was the type who was constantly referring to an inexhaustible stack of 3×5 index cards, a small pile of which was always handy, along with the chalk.

    So of course Kay is faulting Apple. It’s like faulting Barney Olds for not making that first car capable of taking you where you want to go on a cupful of fuel, with no other involvement on your part than saying vaguely in the direction of the proto-dashboard, “Car, take me to the King of Prussia Mall, and park near the entrance to Neiman Marcus.” Apple doesn’t make educated minds. It makes contraptions for the self to engage with the other. Experience, the process of observing, undergoing the effects, and cogitating upon the impact on one’s perceptions, if not on one’s physical being in its several sensory manifestations, is what teaches anything, assuming there is something to be learned. We apparently (correct me if I’m wrong here) create superb drone pilots because we have raised at least two generations of young ’uns to develop those skills with progressively complex and immersive games seemingly bent on letting you rack up trophies for ever greater numbers of kills (or whatever they’re called). And you (or Alan Kay) are going to tell me that computers don’t teach you anything.

    I’m still not clear, frankly, about what the hell Kay’s hard-on is over Apple. And that’s after forgetting about the Fast Company piece and diving in, at your implied invitation, to the TED Talk. By the way, you do recall what Richard Saul Wurman (my hero back in the early 80s, though really more for his graphic design ideas and his notions about re-conceptualizing the process of guiding people around the modern megalopolis) felt he was ginning up with the TED conferences (besides an exclusive, and hyper-costly, geek version of Renaissance Weekend)? Do you remember or know the acronym? Technology, Entertainment, Design… You’d have thought that “E” would stand for something like the WWII Navy “E,” or for something noble, like Education… But nah.

    What we’re hearing from Alan Kay is the strangled frustration he’s harbored for at least 40 years, that is, since around the time of Apple’s incorporation.

    The next YouTube video that teed itself up after the Kay speech (and to which Kay had in fact, very indirectly, adverted) was a talk given about the same time to TED by Nick Negroponte, when he was still the golden boy of the MIT Media Lab, about the genesis of the $100 learning laptop. That’s an idea that’s been kicking around since the 80s (about the time the Macintosh was introduced, with a retail price of $3500). Pipe dreams, my friend. Pipe dreams.

  2. Allow me to be grateful for the appellation “good friend”, and chagrined that my confession of bewilderment provoked a charmed sympathy, H. My self-regard bids me remember that it is, more than anything, genuinely enjoyable and fulfilling to have the time, the opportunity, and the accomplice-in-speculation necessary to develop my own ideas “in concerto”.

    I fully agree with your evaluation of the medium, and of the probable limitations imposed by a filter whose shape and characteristics are available to us only through blind speculation. I would love to have access to the unedited original – in fact, the very capabilities of this multi-exabyte-producing wastrel we call the internet would seem perfectly aligned to proffer such availability. Why isn’t source material linked to, in furtherance of the embryonic noosphere? Further examination of Mr. Kay’s communication style might even lead to the conclusion that he has trouble communicating below his lofty plane – his TED talk in 2008 is tantalizingly inspiring – it introduces ideas that come – THIS – close to sparking a revolution in my mind, then slip my grasp, a storm of butterflies entrancing… dazzling… evanescent.

    As to what you parse as the second problem, I can see it as you describe: an event editorial in nature both recently (by Fast Company) and over time (in the memory of Mr. Kay, as events have proven friendly to Mr. Jobs’s genius). I’m speculating, based on additional information (the TED talk), that Kay actually does see a genuine lack in Apple products, but there isn’t enough “there” there to distinguish whether he regards that as a design flaw, an accident of history, or something else altogether.

    Onward to the noosphere! I’m an admirer of Postman, as Kay is, and regard Russell with all due respect, and McLuhan as an incisive observer and spectacularly effective communicator. Reading between the lines, I suspect you feel similarly, but can’t quite be sure. I’ve long been more a camp follower of Descartes than Chardin or Bergson.

    The noosphere strikes me as a useful teleological bucket, but like most buckets, I think it is prone to overuse, splashing, and leaks, and your comment to the effect that the idea is not new, but not yet proven either, elicits a sonorous echo in some craggy recess of my personal corner of said speculative sphere. May I inquire – do you think criticism of the idea that emergent consciousness is restrained by human fascination with mindless entertainment is qualitative or quantitative, H? Is mere participation in the action indicative of (or causally tied to) insufficient collective nous in the sphere? Or is it more that distraction per se deflects humanity from critical nous-mass? How many angels may dance, H? How many?

    You sound like almost as much of an Apple fanboy as I am. I’ve gone so far as to build my own “Hackintosh” three times now, in addition to the Palo Alto devices I use in pure factory form. I’ve got a 6S myself, and so am seriously considering upgrading, though not to the X but to the 8. As you note, the qualities distinguishing the X from even last year’s now hopelessly, piteously obsolete 7 series are trivial, and the difference between the X and the 8 is almost invisible to anyone who doesn’t require daily infusions of personal-technology ego stroking. I have no problem with snotty. The emergent noosphere could possibly benefit from a stricter diet if it is to be better than the soup that regularly emerges when I collocate sufficient water, leftovers, and heat energy.
