Apple’s Role in the Noösphere

Reading Time: 11 minutes

At the behest of a good friend, who asked me on Facebook what I thought of the following article on the Web, I read the article. I tried to read it twice, to help get past a significant number of potholes and bumps in the text, but I simply could not muster the initiative to get beyond that first reading. My friend came away feeling, with regard to Mr. Kay's narrative, that "he goes over my head a few times." I felt, conversely, that my friend was being charmingly polite and self-effacing. As you will see, I can't manage these otherwise authentic sentiments myself and remain credible; in me, the sincerity would be feigned.

Here is the link to the "Fast Company" Web page with the article in question. You can read it before or after reading what I have to say. Or, if you have sufficient self-regard, you can skip it altogether. If you have an overabundance of self-regard, it's possible you'll elect to stop reading me right here.

https://www.fastcompany.com/40435064/what-alan-kay-thinks-about-the-iphone-and-technology-now

Undoubtedly Alan Kay has always been a smart cookie. It's not entirely clear that he is able to articulate intelligibly and clearly what goes on in that head of his—not, at least, from this Fast Company interview. It's been filtered through the mindset of a typical Fast Company contributor, which is to say, one of a huge team of well-educated millennial ax-grinders. Whatever Kay actually said remains, likely irretrievable, in the silicon pathways of Brian Merchant's digital recording device.

That's the first problem with extracting anything of meaning, never mind of value, from this deliberate, cozy, but still reverential brush with the greatness of late 20th-century cybernetic science pioneers. The second problem is that, despite the first, I think, but cannot be sure, it's possible to extract some hints of motive in the various expostulations of Mr. Kay, though these may have been colored by the mission of young Mr. Merchant (as evidenced in his selective contributions to the "conversation" documented here). It sounds like there's more than a bit of the product designer manqué in Kay. Despite his generous assessment of Steve Jobs's marketing genius, it seems clear that the deficiencies he delineates in the progress of the product concepts he envisioned with his collaborators more than 50 years ago are more of a marketing nature than failures in the evolution of the underlying technology, which he hardly touches on. Possibly that is because the lede here should have been not that Mr. Kay is unimpressed—this seems to be a fragmentary memoir of his history of insufficient esteem for the accomplishments of Mr. Jobs, with whom Mr. Kay seems to imply a collaborative bond—but that Mr. Kay would have loved to introduce products to market with the same demonstrable, indeed monstrous, success as those that Apple actually did present so successfully, going back to the iMac.

Further, and this is the third and possibly the biggest of the problems I have reading this feature story from "Fast Company": it is not at all clear that errors in navigation, so to speak, for the great ship of Human Knowledge (with its fleet of support vessels, which carry the means not merely of furthering its course, but of how it will continue to sail the endless seas of the universe), at least since the advent of products that further what I'll call the market for consumer computing, are attributable to the products being offered so much as to the applications for which the market asserts its preferences. In short, it's never been my perception that Apple envisioned the design and production engineering of a product that would optimally enable spending hours in the wholly hermetic self-involvement of playing a game called "Candy Crush." Along these lines, and more in an abstract sphere, Kay had occasion to allude to the great, if not culturally cataclysmic, aperçu of Professor Marshall McLuhan concerning the impact of certain specific mechanical technologies not only on human societies, but on human nature. I think it's unfortunate that Kay, I am sure unwittingly and unintentionally (but who knows?), perpetuates the perception that McLuhan was a philosopher (and perhaps an evolutionary psychologist) when he was, in fact, mainly a literary qua cultural critic.

I can't be sure of this, though, because, ironically (which Merchant and Kay make clear is the touchstone communicative mode of the zeitgeist), although Kay lavishes praise on the rhetorical skills of such as Neil Postman, or, even further back, Bertrand Russell ("that bastard") being capable of writing "like a dream," Mr. Kay at least does not talk like a dream. All of this suggests, and this published interview punctuates the perceptible fact, that unlike them Mr. Kay is not capable of being clear, first and foremost, and thereby persuasive—especially of facts, it's suggested, not otherwise palatable to the recipient of the argument. But then, this is a heavily edited and manipulated interview on the heels of a major product introduction by the world's largest company in terms of market capitalization. And it appears in Fast Company—a tarted-up business magazine that has what seems to be an inalterable mission. Its agent, in this case, the aggressive journalist bent on positioning himself as the resident historian of the development and impact of the Apple iPhone, states his professional purpose (on his LinkedIn profile) as follows:

Today, he spends most of his time investigating the myriad ways humanity is attempting to survive itself.

Talk about high-minded, purposeful solipsism.

Instead of McLuhan, it seems as if Kay, and his self-appointed henchman Merchant, should have dug a bit more into the ideas of Teilhard de Chardin, and in particular that of the noösphere. It's a concept that has been kicking around (though it's hardly a lively popular topic) since the 1930s, and it thereby lends a certain estimable patina to the already comfortably burnished ideas that issued from the labs not only of Xerox (the company that never got over not becoming what Apple has proven itself to be, though it showed every promise of doing so; it simply could never get over the hump of being utterly incapable of conceptualizing and developing products that could be marketed and sold successfully to the mass consumer market—something that Sony, Apple, and, for a long while (until it lost its technological grip), Polaroid, among many others, had proven themselves able to do), but of a great number of academic laboratories and whole departments in the applied and theoretical sciences.

The notion that there is a concurrent, coextensive, and (insofar as I can understand some complex and possibly arcane theory) commingled developmental human capacity keeping pace with, if not finally and now (should I say NOW!) exceeding, the excrescences of evolution, usually understood in terms of natural selection, is, in short, not a new idea. That there is a superseding (what I will provocatively call) ontological development in the evolution of human epistemology—please IM me if that "human" is redundant, and I'm just sounding like a fool—remains to be proven, however. But a lot of people sure seem to want to think so. And a lot of very smart people are counting on the insinuation of certain largely 20th-century technologies (starting with the Turing engine—in the form of the still barely modern digital computer—and continuing through the accretive accumulation of a wide range of programming languages, including so-called object-oriented ones, but not stopping with them, as well as mimetic architectures for computer engine design, with their tightly bound software|hardware manifestations, of which neural networks are the most prominent example in my mind) in the gestation of some new kind of what I'll call consciousness, and which Kay here, very clumsily and slightly incoherently, calls "another level of thought." There is, possibly, some suggestion, and this would be particularly in keeping with the thinking of the theorists of noöspheric structure, that this presumably extranumerary level of thought is, in fact, a wholly new level of thought—somehow, again mysteriously and incomprehensibly (here), aided and abetted, if not stimulated, with some vague suggestion of insemination, by the great potential computing advances envisioned in Palo Alto, and other places. That the aiding, the abetting, the, uh, stimulation, the, erm, insemination—which is to say, the enabling of some new dawn of thinking—would occur if only we would let it.
Except we are bent on watching serially, or with sporadic bingeing, entire seasons of the alleged comedy series "BoJack Horseman." All that potential enlightenment down the omnivorous black hole of popular culture.

Having said all that, allow me, just briefly—because I am afraid I have already taken up too much of your time to leave you comfortable—to say something that may seem a sudden change of subject. It isn't. I'm just doing what every creative nonfiction writer worth his or her rhetorical salt does in this day and age: I am making it personal, because the mission of deconstructing and then deriding the suspect emissions of a noteworthy, brilliant computer scientist is always a dead end. Unlike Mr. Merchant, who by familiar conversational postures and ploys suggests he is, I am by no means Mr. Kay's peer (though, to play the age card, I am far closer chronologically than Mr. Merchant can ever be while Mr. Kay lives—and I thereby have my own memories of the very same periods of the development of computer products and the underlying science and engineering).

Nevertheless, I have no problem stating that I am not at all unimpressed by Apple's latest product announcements, especially in light of the small lights that went off in my mind (kind of premonitory LEDs) as I watched the Apple Event on September 12—the extended product commercial wherein they announced the much-anticipated new line of iPhones. First let me offer the caveat that I am not an inveterate watcher of these fanboy events; I had never watched one from beginning to end before, as I did this one. Something told me to watch, and I can't say what (nor do I wish to devote the time and emotional energy to figuring out what "told" me; I'll just say I have a lot of faith in intuition).

After consciously noting and filtering out the tsunami of ejaculations (I am speaking entirely of rhetorical phenomena) from the mouths of the parade of Apple executives delegated to announce the products and their attendant features, consisting essentially of the words "beautiful" and "magic," I realized that two things struck me as particularly compelling. I don't pretend to be an exhaustive reviewer of popular media, or even of the self-consciously nominal intellectual fare of which I am a significant consumer. But little attention was paid to two facts about the new products: one a functional capability of the newly announced Apple Watch Series 3, and the other a facet of the underlying enabling computer design of the new crown jewel of smartphones, the iPhone X.

First, we were told that in addition to the liberating capability of being able to don a watch that would leave us coupled, with an available signal of sufficient power, to the nation's grid of cellular transmitters, the watch, with forthcoming software revisions, will be capable of monitoring cardiac arrhythmias. This is very big. It's big, no doubt, in terms of a significant potential advance in diagnosis and prevention of debilitating, if not fatal, cardiac and cerebral anomalies. Without belaboring this (this is not the place, and I don't have the time, even if you do) this can have a significant impact on ensuring well-being and greater healthy longevity for humans, and I would suggest, tantalizingly, that this has implications for how we will be able to think about the nature of mortality, and all the attendant epistemological matters pertaining. Talk about a new "level of thought."

Second, and this could be even bigger, but I can't say, because I don't have the bona fides for even thinking about potential applications: the new iPhone X has, embedded in the Face ID engine of the product, a computing advance—with clear, proven, highly affordable manifestations—albeit one that, as a consumer product, the vendor is hard-pressed to describe to an avid public in any language other than the ridiculous word "magic." It's not magic, but it is incredibly powerful, and it will fit in anyone's pocket. I am speaking of the architecture of the new A11 Bionic chip, with its neural engine, in the iPhone X. This was conceivable, but, if you will, unthinkable in a consumer product back in the 1980s (for perspective, the Macintosh, which Mr. Kay considered the first personal computer "worth criticizing," was introduced in 1984; it was capable of facilitating, in what in hindsight was only the most primitive way, the graphical user interface, with the ability to "draw" on the screen of a cathode-ray tube… and, frankly, not much else worth noting, except the use of a new "input" device charmingly called a mouse—all of which were envisioned by Mr. Kay and his cohort at Xerox PARC, back in the '60s and early '70s, which is to say, it took, let us say, 10 or 15 years to see it realized in a consumer product). It has, for practical purposes, taken 30 years for a true neural network architecture to see realization as a viable product.

I'll just say, to conclude, that it's too bad—to note only one major misstep, at least as Apple presented things. I mean this aside from the vaguely engaging application of highly secure three-dimensional facial recognition to permit use of the phone (not engaging enough to justify replacing my perfectly fine current iPhone 7+, less than a year old, outmoded as its technology suddenly has become; I will just have to live with the humiliation). It's really too bad, in fact, that Apple in their considerable wisdom (borne of incredibly successful and undoubted marketing acumen—certainly Mr. Kay attests to it) chose to put enormous emphasis on what I can only describe as the colossally trivial ability to animate cartoon characters with a simulacrum of basic emotive expressions, and all that anthropomorphic at best.

You'd think—and I hope, in a small way, that Mr. Kay would concur, if this is not precisely what he was trying to say, and would have said without the interference of Mr. Merchant—that the world has enough smiling, grimacing, gesticulating cartoon panda bears.

I could add that, once you have an iPhone X, unless you can use it to solve some significant complex problem that has eluded very serious investigators and researchers for years (which I fully intend to do with mine), I would suggest you put the phone down, making sure it's on its wireless Qi charging pad, and go play with a dog. Salutary for all aspects of brain chemistry. But I won't add that, because it would be snotty.

Copyright © 2017 Howard Dinin

Street Photography 2016

Reading Time: 1 minute

I have a new photo book. It's a record of the street photography portfolio I assembled for review by LensCulture, the online photography magazine. It does not include every photo submitted, but it does include the black & white photos considered for submission. The submissions were selected from 20 years of shooting in this genre.

I invite you to preview it. It’s available for sale. I purposely set the price to an even dollar amount—the profit to me is less than 25 cents.

Enjoy.

Copyright © 2016 Howard Dinin

The Pro

Reading Time: 7 minutes
Sometimes the old ways are best.

Wednesday this week was a day I had been anticipating. November 11 was the announced date of the availability of a new Apple product, the iPad Pro, introduced in September to great fanfare (and my immediate optimistic enthusiasm) as a new generation of tablet. The tablet, especially the iPad, which is the market leader in a number of dimensions—price, popularity, performance—has been in decline by the usual economic measures, not the least of them being growth of sales and percentage of corporate revenue for Apple. Nevertheless, the Pro seemed to augur a new stage of development for this particular species of device: a professional tool, highly mobile, that could be used not only to enable but to facilitate creative innovation, from concept to execution, for a world more and more populated by expressive and informative media that are digital from the origin of an idea.

Introduced along with the significantly enlarged form and screen size of the tablet were two ancillary (or collateral, or auxiliary) input devices: a keyboard and a stylus. The screen of the new version of the iPad is more or less the same size as the screens of the smallest laptop products offered by Apple, the MacBook line of computers. The latter are still, more or less, the only growth category in the slowly withering category of personal computer devices that are not strictly mobile. All this is to say that the world at large is slowly becoming dominated by digital devices that fit in pockets, however large and capacious those pockets may have to be to accommodate some of the larger smartphones.

It was, and, for all I know, still is to be expected that new devices like the iPad Pro would straddle categories, as does the hybrid Surface Book, Microsoft's entry into what marketers like to call the "crossover" category of tablet qua notebook computer (the former ad man in me cannot overcome the impulse to recall that old joke, I think from SNL: "It's a floor wax! It's a dessert topping!"). It has a keyboard, but it also can be used strictly as a touchscreen device. It also has been designed (this is somewhat more true of the Apple product, but applies as well to the design ethos of the Microsoft product) to be that much more precise in its responsiveness and accuracy in the rendition of design elements: both type and graphic elements, photographic or manually drawn. Reinforcing the perception of the iPad Pro as a professional device is the Apple Pencil, the company's first venture into a discrete digital stylus offering (the doomed Apple Newton, in the '90s, had a stylus as a necessary adjunct to using it, but it was just a stick, not a connected digital instrument). The sainted Steve Jobs famously, allegedly, put the kibosh on the development of any such capability in association with the iPad concept by dictating, in an off-handed brief encyclical to the world, that a fingertip was stylus enough for anyone.

Nevertheless, in terms of technical specifications, it seemed to me that the Apple Pencil raised the product category of stylus to a new level of accomplishment for the engineers—or so it seemed back in September with the onstage product demo. It is responsive and significantly more precise: what it renders appears almost immediately and visibly, with the necessary synchrony, at the point where the tip of the instrument—about as sharp (or dull) as a slightly dull graphite pencil or, perhaps, a Bic ballpoint pen—meets the slick glass surface of the screen. Beyond that it falls short in exactly the same way all styli on digital touchscreen instruments do. It is a unique form of recording instrument and must be learned to an expert level to be exploited fully. I imagine the heuristics are similar to those of any new technology meant to analogize an existing, especially what I'll call a natural, form: playing a musical instrument, learning to perform arthroscopic surgery, performing that surgery using a robot across the operating theater with the patient separated from the surgeon, flying by wire, operating a rover on another planet. All of it is possible, and any of these examples doubtless provides a context for creating new forms of art (in the broadest sense). In the case of the Apple Pencil, or any digital stylus, what the thing itself does not do is replicate, in any way, the act of drawing on materials made of fibrous layers of something that at one time was growing—a tree, a reed, a young animal—flattened into a thin pliant sheet so as to create a surface receptive to microscopically colored media prepared in the form of a sharpened stick or paint or ink: in short, material that may, at once, lubricate, suspend, and release the colorant as the instrument is dragged across the surface of the substrate. I mean, of course, ink, or graphite, or chalk, or paint, or dye used to make a record of one's strokes.
Whether it’s paper in its myriad forms, or parchment, canvas, or other kinds of cloth, or, alternatively, wood or metal or the stone wall of a cave, and with the laborious technology required to make these surfaces receptive to the colored material, there’s “tooth” to the surface which variously impedes and permits the progress of the recording medium onto the tactile surface. But there’s no way polished plastic molded to a point pulled across a glass surface can mimic the sensory experience of these phenomena.

I’m not commenting on and surely not denigrating the genius of programmers who have found ways of digitally replicating the act, and the result of the action, of nearly every technique invented in the entire history of homo pictor, whether it’s spray painting, pencil drawing or creating a cyanotype photographic image. However, it’s the result that is replicated, or at worst mimicked, and not the techniques or means to create the artifact with the appearance of a specimen prepared in the canonical manner. I could lapse into a very long digression about the increasingly resonant meanings and applicability of Walter Benjamin’s essay, “The Work of Art in the Age of its Technological Reproducibility,” whose increasing relevance to the existential aspects of daily life as time progresses in Western countries has a crescendo-like impact on my sensibilities at least.

But I think I’ll stick with what I set out to say, which is to critique the new Apple product, and to articulate why I decided not to buy it. The latter phrase will alert only people who know me personally and fairly well, know me for the inveterate and, I’m sure, sometimes seemingly unquestioning, wholly credulous, perpetually positive fan boy that my money and its free flow into the coffers of Apple Inc. would have seemed to attest in the past. However, the last three product introductions, for me, have proven a bust: the MacBook laptop (in three simulacrums of other kinds of metal than the aluminum of which it is crafted, including two tones of gold), the Apple Watch, including the version that actually is made of gold and priced starting at north of 17 thousand dollars, and now the iPad Pro. My first criterion is whether conceptually, especially as depicted in Apple’s devilishly alluring product videos, but, more importantly, once I can put the object in my hands, it instills that acquisitive impulse, usually fashioned in the compelling form of a felt need, as opposed to the more likely, and more accurate, covetous want. Without that impulse, I can’t persuade myself, even with considerable powers of logic, sometimes used self-reflexively, but still sophistically, that I should part company with significant amounts of cash.

In the end, what impressed me, after a full half hour at the Apple Store, unmolested and uninterrupted, with the iPad Pro, the Apple Pencil, and the so-called Smart Keyboard, was that these were products perhaps as expressive as any, if not more so, of the Apple genius for slick design. But our household is filled with devices, none at the moment quite as new as those introduced this week, but still representative of the same corporate ethos: tablets and phones, and computers, and music players, and dozens of accessories, converters, adapters, and cables—and there are only two of us. And the iPad Pro is simply a very big iPad. Despite Apple's attempt to turn its size into a virtue (and for some, doubtless, a very large screen is exactly that), to me it's clunky, and it has passed over some undefined boundary in my mind for what is an acceptable mass to carry around in a purportedly portable device (I hate the term "mobile," and even more so as a noun; a wheelchair also is mobile—at the very least it must be conceded that it makes the person borne in it mobile when they would otherwise not be).

The Apple Pencil is now the premier exemplar of the category of digital stylus. Indeed, if Apple had lost its capability to trump all previous efforts by all other parties to dominate a category of digital device with design, relentless rigor, and simple beauty, I'd short the stock. The problem for me is that it is not that much superior to the half dozen or so other styluses I have acquired, each of them at the time of acquisition the "state of the art" one way or another (and each developed while Apple purportedly considered the category irrelevant, pace Steve Jobs, who thought we should all be satisfied with the digital styli we were born with at the ends of our palms). I wish I still had all the money invested in those styli. They do work, and some not very much worse than the Apple Pencil, as Apple has no monopoly on clever engineers.

No I’ll skip it, and the keyboard, and the stylus, and I’ll reserve those discretionary funds, which I also remind myself I am very fortunate to have to enhance my resources for being creative when the mood strikes. There’s a lens for one of my cameras that I’m looking at, and am kind of fond of.

Copyright © 2015 Howard Dinin

A Response to Paul Krugman on the Apple Watch

Reading Time: 10 minutes

This is a response, at the request of my friend Phil Mathews, to a blog entry in the New York Times by economist Paul Krugman, which appears here: http://hdin.in/1PAOPYk

First of all, I’m glad for the opportunity to opine about the Apple Watch publicly as it’s a solicitation rather than a personal impulse (the response to which, never mind the receptivity, is virtually impossible for me to gauge; as far as I can tell, I have about three fans, and those not consistently). I do have opinions about the device, which I’ve shared, in pure speculation, because it has not been available for viewing or handling by the hoi polloi, of which I am a decided fixture. But I’ve shared them privately. Just to give a context for whatever else I might have to say, I did agree with another friend here on Facebook that one of my first reactions to the announcement of an actual product, with photos and some cursory explanations as to functions and functionality, was, thank God, finally a gizmo from Apple I don’t want and, when you come down to it, I really don’t need.

I think it’s interesting that Krugman has a point of view about the Apple Watch, of course. However, I’m disappointed that he decides to take a personal perspective, instead of doing what he’s done so well in other regards so often—though not always—that is, to step to one side, figuratively speaking, and look at the phenonomenon of the Apple Watch and the category it represents as the trained scientist he is. More pointedly, it’s possible, in fact, that the Apple Watch will actually end up defining that category, as Apple is wont to do with emerging consumer product technology. They invent very little in that regard, the genre aready exists, i.e., a wearable multi-function computing device. In the same way the portable digital music player was defined by iPod, or a highly portable entertainment, consultative and reference device, with facilities for rudimentary record keeping, similar to both a laptop, for the size of the screen, and a smartphone, for its lightness and compactness by the iPad, of course, and so forth.

Rather, he has taken a tack perfectly legitimate in this world of media wherein anything goes, even in the name of news, analysis, and factual reporting of the truths derived from statistical data and double-blind experimentation on live subjects in actual conditions. If he wants to speak for himself, who's to stop him? As he says, what the heck?

He does, in the process, break a cardinal rule, as I have always understood it, in market research and analysis, even of a speculative sort, and that is, never to assume that you are yourself representative of even a tiny valid statistical segment of prospective markets.

In the end, I beg to differ with Mr. Krugman (disclosure: I too wear a fitness band, though I gather a different brand than his, and I have always been a small-time aficionado of the art of the horologist—that is, I love watches, and own several; in the past 50 years I'd guess rarely has a day gone by that I have not been wearing a watch, and for most of the past 20 years or so it's been the same watch, the acquisition of which was a purely personal attainment; it had been an object of desire for me for some time and, as it was at the time costly (to me), required extra-long deliberation before the ultimate purchase… though once I made it I never looked back, and I also never stopped looking at other fine specimens of the watchmaker's art—none of which I indulged in acquiring).

I think of the Apple Watch, still sight unseen except in dazzling, augmented images mainly on the Web, in the same way I think of the iPhone, as well as of the iPad, and that is, one way or another, they are computers that have been designed to a particular set of applications, in the broadest sense, and in a form that makes them suitable and adaptable to a particular set of highly specific computer programs, or apps as they’ve come to be called.

The first unfortunate observation Mr. Krugman makes is the one he asserts at the very beginning, setting the tone, but more importantly defining a polarity that I think is wholly factitious. He's made it up in terms of his own highly circumscribed needs and the uses to which he himself puts these devices to meet those needs.

I’ve gone out of my way to describe the phones and the tablets and even the watches (as well as the music players, and a whole variety of hybrid devices: phablets, lapbook/tablets) as computers, because that is, ultimately, the genus of each of these species of cybernetic creature. Alan Turing, the fathering genius of the age in which we find ourselves, posited in what he called “the universal machine,” or in plain terms of today, a computer (a word which originally meant, when applied to a device designed to a specific task, a machine to do calculations). What Turing meant, and what the whole industry spawned by his idea has set about to make actual—even to defining the epoch in which we conduct our daily business—was that such a machine or computer could use a calculating engine to perform almost any task, including a universe of tasks (like talking in real time to another person over extreme distances in a simulacrum of voices that are unmistakably those of the speakers) that seemingly have nothing to do with calculating numbers. It’s because all tasks can be understood, using the legerdemain of converting physical changes, of even the most minute dimensions, into sequences of numbers that, reinterpreted by a reverse process of conversion back to something resembling the original physical changes, to be mere sequences of coded symbols, called programs. Even the stuff of life, in something of a misnomer—as the real stuff of what we call life remains a mystery—DNA and RNA are understood best as sequences of replicable codes of a deceptively minimal number of constituents.

What I’m getting at, with all this beating around the bush, is that Mr. Krugman can use his fitness band, and presumably an Apple Watch or a competitive product (and I predict he’ll own one, probably sooner rather than later), any way he likes. I use my fitness band differently, mainly because I have a different set of personally important objectives to attain by doing so than he does, and I needn’t go into the details, as they’re irrelevant.

Further, and truly to get into the meat of the matter, he misses the boat entirely, in my opinion, because he fails to account for an indisputable set of phenomena that have emerged as more and more people use more and more smart devices. Most of us have a streak, wide or narrow, wherein two seemingly very human impulses are served.

It is important, in increasingly complex ways, for us to stay in touch with increasingly larger circles of individuals with whom we either share an affinity (even if it’s only an affinity for staying in touch with increasingly larger numbers of people) or can at least pretend to have an affinity, again if only on the strength of having formed a connection in the first place. And what we share in the actualization of that continuous connection is information, some of it, probably most of it, of a personal nature, and essentially trivial, banal, and, without using judgmental qualifiers such as these, most certainly quotidian. We tell one another, on a full-time basis, if not, indeed, 24/7, what we’re doing, what we’ve done, and what we plan to do, even so as to subsume all of our habits, including eating habits, sleeping habits, fitness habits, leisure pursuits, passive entertainments, and game-playing. Many people, doubtless, share even more intimate details of their emotional states, their loves, their hates, their fears; or else why would people keep doing it and yet express such outrage at the prospect of having all that information captured by the government?

Smart devices have made it easier and easier not only to track our own activities but, at least as importantly in a different context, to share the record of those activities with others, and we can count on the computational and analytical capabilities of these really amazingly powerful computers that fit, now, on our wrists (and there has been talk for years, met with varying degrees of horror and wonder, of embedding computer chips into our bodies, with nary a lump or a shock) to allow us to compare our “performance” and achievements with those of our cyber-families.

If anything, these devices are literally more intimate, contacting our skin, the largest organ of our bodies, on a continuous basis, and tapping into the wealth of data obtainable via that means of connection, even to more deeply embedded organs: recording, by ingenious means, respiration, perspiration, heartbeat, blood pressure, and, if not now, then no doubt imminently, fat-to-body-mass ratio, rate of caloric intake, rate of caloric consumption, etc. And I’m just listing somatic data (mainly because Krugman set the pace, so to speak). There’s also neurological and specific brain-wave activity somewhere in the future…

And no doubt, there are many of us for whom, as for Krugman, this is of some level of vital personal significance to know, if only for the sake of knowing, as a touchstone for maintaining honesty with oneself about how responsible one is being about keeping fit (as if that were all there is to it). I have to wonder, though: do we even need an aluminum watch costing a minimum of 350 dollars, assuming we are desirous of the status of the Apple Watch (a status it has apparently already begun to accrue, still two weeks before the first orders are fulfilled for the first customers), to help us be honest with ourselves?

Krugman mentions monitoring his personal fitness stats only once or twice a day. As for me, as long as it’s confession time, I rarely consult the gizmo at all. I did far more often when I first started using it, as it represented an indisputable, highly accurate frame of reference, a reality check. I don’t need a gadget to know I’ve pretty much done my duty by myself to get in some physical exercise sufficient to preserve whatever pitiful level of fitness I enjoy at the moment. Whatever its merits, or lack of them, to me, I share this information, about sleep habits, steps, exercise, etc., with no one except my wife, who has a more avid involvement, for her own legitimate reasons, with her own activities, and a legitimate fond conjugal concern for my state of health. I don’t compare my “performance” with norms established and maintained by the manufacturer of my fitness band. The last thing I would do is share any of this information with my friends. My universal motto in that regard, as regards all matters of social intercourse insofar as it’s constituted of the exchange of news about daily activities, physical or intellectual, is “It’s not a contest.” Even less than I am interested in the minutiae of my own behaviors, as measured by these wondrous gizmos, am I interested in how many steps my buddies have taken that day, or how long they spent on their rowers, treadmills, elliptical trainers, etc.

However, unlike Krugman, by inference from what he says in the Times, I don’t suppose in any way that I am a typical specimen, subject, or consumer. Very much the contrary. Contrary to his conclusion that “A smartphone is useful mainly because it lets you keep track of things; wearables will be useful mainly because they let things keep track of you,” I think that both are parts of some larger universal machine that allows the aggregation of data, instantly retrievable, automatically transmitted and shared, and rapidly analyzed for comparative, if not strictly competitive, purposes.

The chief complaint about the Apple Watch in preliminary reviews, which Apple allowed to be conducted by a selected band of “power users” and professional industry watchers, is that the functions of the iPhone, especially the tracking of, and notification about, one’s own agenda, schedule, and itinerary (the framework of a busy life for a particular tribe of people engaged in a particular set of occupations), which are no longer an annoyance as manifest on the phone, are an immense annoyance on the watch, because it not only makes small annoying sounds but actually buzzes, vibrates, tickles, pokes, and otherwise prods your epidermis in a way that is, by their almost universal account of it, distracting and, in the presence of others, invasive. I do not see all this as a sign of a different function for these devices in the Krugmanian formulation: “they let things keep track of you.”

As I already said, I think this is an utterly shallow misreading of the actual gestalt of increasingly personal cybernetic extensions of our conscious preoccupations. The initial complaints are merely a sign that the always elastic set of protocols and behaviors (what used to be called manners and etiquette) is due for another revision, like a new release of a major operating system. The iPhone, with its beeps, whistles, vibrations, and blinking and winking, was once thought to be a distraction and rudeness personified. An individual’s attachment to their iPhone, even in public, even in social scenarios involving as few as one other person or as many as a conference room full of others, has become the basis for a normative set of behaviors that people my age find at best amusing, and at worst painfully rude and off-putting.

I predict that in not too long a period of time (as the Apple Watch seems destined, indeed, to be the best next thing, and an expansion of the armamentarium of gadgetry with which large segments of the population will equip themselves), wrist consulting, and various otherwise comically impolite sound effects and reflexive behaviors (haptics are a new set of phenomena to which people will have to become acclimated), will become the newly revised norm, and in a couple of years we’ll all wonder why it was ever such a bother.

Krugman’s got it wrong, because, for once, he’s not looking at a big enough picture.
