So this time I simply want to tell a story. I want to tell how I came to work with computer networks, and how I've made that also part of who I am and what I want to write about.
I now exit Information Technology work after what has been for me the extravagant commitment of two consecutive sabbatical stretches. I'm talking about seven-year spans, each as long as I've stayed in any one full-contact relationship.
So, even though you will have to agree that I'm the consummate dilettante, objectively speaking I'm more a techie than anything else I've ever been. I call myself a "Microsoft drone" to the people I run into high up in the Company, just to be sure they know who's bringing in their honey.
You might say my heart isn't in it anymore. I just honestly don't find the technology all that cool, or even, truth be told, very useful the way it often gets deployed. I mean, sure it's useful, but as its price drops and the hype hyperbolizes, you do have to wonder whether some corner has been turned.
I'm not convinced, in the end, that the actual inputs to our economy go toward better living so much as toward bubbly froth. Which reminds me, of course, of how the recent sub-prime mortgage meltdown happened.
Technology feeds its own expansion, and will and can and must end only with our terminal boredom.
For sure the work that I would prefer to do could be done as well with pen and paper. But as the medium has changed, the message has changed too. And it's not all bad.
It ought to be the case that these words strike some sympathetic harmonic in you. Our connection could never have occurred in the other, past universes I've known.
The way I write cannot be identical to penned words prepared for publication. My audience can be as diminutive as only myself, and still it seems worth my while to write as if for you. My writing has been whittled down from publication form, through and past the once-formal epistolary manners. I blew right by, as we all did, the highly personal letters we might have written, now so thoroughly destroyed by e-mail.
And while I have learned to make personal contact by email, with those few people who have the patience actually to read, it is this new form which still draws my actual labors. The emails are so purely pleasurable, informed as though spontaneously by my feelings.
This would all be narcissistic if I cared what you thought of me, or if I thought there was something all that charming about the me whose being here provides the evidence for what and how I think. Sure I fear that narcissism, but I think the connection might also prove to be that much closer to personal. I like this medium fine.
It's true, I don't distinguish very well between my life's story and who I am, and I'm never certain which is writing whom. I strive for truth, and marvel at my own very good fortune. So blogging suits me, you know, almost as though I'd have had to invent it if it didn't already exist.
And sadly, my handwriting has atrophied now to where I can't even read it myself. Blame it on information technology, starting with the keyboard.
Effectively used by trained engineers, computers are clearly awesome. But it's also pretty clear that these effective users of computing power could never afford to do their work if it weren't for the subsidy from the rest of us, releasing our minor passions through our purchase decisions. Phenomena like the blogosphere get created as a byproduct. And maybe IT is important after all.
For me, between or among careers, I got my start with a need to master statistical calculations for the social sciences. My career path would have required geographic mobility, which my divorce stopped in its tracks. So naturally, I ended up going back to school.
I'd been intrigued early on by word-processing, by the excitement of virtualized commitments to textual experimentation. It felt quite liberating, up against the literal cutting and pasting I'd had to do, last minute of course, to graduate from college. And then the mouse!! I was transported. Sitting at a word-processor keyboard, I almost felt that my thoughts were flowing from the very world around me.
This was a proper "computer" and not one of those early "dedicated" word processors, which could do only that one thing the designers had in mind. I doubt anyone would dispute that it is its very universality which makes the computer interesting at all. It seems you can make it do almost anything done by any other machine in the house now! No wonder they have to subsidize the dedicated gaming machines so that they can sell you their lucrative cartridges.
I do have to tell you, though, that the very first true computer-based word processor I used had almost no memory, and so it had to store my texts right on the huge and actually quite floppy spinning disk. The next one, which I actually bought myself, had a "hard drive" whose capacity was equivalent, extravagantly, to some 100 of those floppy disks. The much higher RPM of the hard disk made the swapping of texts in and out as I scrolled a little quicker, and less noisy than the spinning up and down of the slowly turning floppy drive.
The actual code underneath the text up on my screen was also just plain text, using "control" characters almost exactly like the hypertext markup language underneath these web pages. HTML. There was no human-unreadable binary "code." That was all stored away, "compiled," in the program, and not listed right along with my typed words.
So, if you wanted to, and I often did, you could look under the covers to see the actual guts of what you were working on. That was especially useful when things went wrong, as they often do, and the screen got a little scrambled. Now I can't even remember the name of the program - I think it was called PC-Write - but it was free way before open source got fully defined, and got me all the way through graduate school.
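Just to give the flavor, here's a made-up miniature in Python of what that looked like under the covers; the particular control codes are my invention, not PC-Write's actual ones:

```python
# Formatting lived in-band as unprintable control bytes (these specific
# codes are hypothetical, chosen only to illustrate the idea).
raw = "A \x02bold\x02 claim and an \x1funderlined\x1f one."
print(repr(raw))                    # the "guts": control characters exposed
clean = "".join(ch for ch in raw if ch.isprintable())
print(clean)                        # what the screen was supposed to show
```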
But when I bumped against SPSS, the Statistical Package for the Social Sciences, which I was required to master in order to pass a course, my old machine was no longer adequate. I'd paid well over $5,000 for it back when that was real money, and when I was, not incidentally, earning an actual living wage as I've seldom done before or since.
So now, as a graduate student, and having to fund child support payments, I had to find a way to make my old PC work.
I could have used the "mainframe" for SPSS, but it was so darned cool to be able to load up my own computer with huge datasets from the World Bank or some other global agency. I could play with the numbers to my heart's content, way way into the night. I wanted my regressions fast. I wanted them completely within my control.
I have to admit that I was having fun. I discovered that I could buy a new "motherboard" cheap, and then I discovered that I had to buy a floating point co-processor to handle the statistical math, and then I discovered that I would need more memory.
I remember well that price - a whopping fifty bucks per megabyte after laborious bargain hunting - dirt cheap at the time. Now you can get more than a few thousand times that much for the same nominal money.
Meanwhile, I ended up with a kind of monster computer which, like the U.S.S. Constitution, has maybe one board in it from when it was first built. It still gets to be called the original ship because it has persisted as such through all the repairs and upgrades. I think my computer had the original power supply in the end. Maybe. And I never added up the total cost, although it sure felt cheap.
So I had a skill, and the university had a need, and I jumped ship again myself at some point, feeling very very tired of being quite so broke. Professors were moving around too, leaving me behind, and I just wasn't being turned on by my academic work anymore. I know what you're going to say, that it's not all about being turned on, and no that isn't why I left my wife.
But I started working in the newly created "distributed computing" infrastructure, to support the School of Education, among other graduate and professional schools. The University had just then gotten ether-netted, and was pretty near the cutting edge of the new and great big 'net. This was back in the day when there was no commercial use, either because no one had thought of it, or because it was a University franchise.
Computer support at the big U. had previously been premised on a centralized service which ran "big iron" mainframe computers whose disk drives, once upon a time, were about the size of a Model T.
Through various central computing services, some of us learned to hyper-link across continents using a text-based browser called Lynx, as I recall. It was pretty exciting. No pictures. Just text.
And the professors were starting to find uses for inexpensive PCs right on their desktops, especially once color and graphics became universal. And their own grant funding was paying for them, having nothing at all to do with central anything.
As a suggestive and possibly meaningful aside, I remember visiting my uncle Roger once, in his office at Syracuse University. I had no idea then, and have only a dim one now, of how important he has been in the development of the electronics from which computers, ultimately, descended.
I think he was more a radio-frequency than a digital guy, but he developed the method by which components of an electronic circuit, idealized in the schematic, can be brought into close and miniature proximity, as had to be done for integrated circuits, themselves descended from those cute transistor radios we were all so excited to have when I was a kid.
He'd gotten his start in crystal radios, tube-type amplifiers, and other things which geeky kids back then would play with. My own dad went into the law, which was the family tradition, but number two was liberated to do his own thing. In case you didn't guess, my older brother's a lawyer, and I'm pretty um, yeah.
Anyhow, real things don't behave the way the idealized ones do on the schematic. I'm pretty sure that with tubes and their high power brute force circuits having lots of real distance to separate the parts, you can safely ignore or guesstimate the interactions among the components. But as you miniaturize these things, and multiply them, they start to interact in sensitive ways far beyond the power of schematic designers to either represent or account for.
A kind of math would be required to compute those interactions. It can't be done on paper in a human lifetime, and no exact answer can be gotten from theory alone. You have to whittle down the interactions by high-speed computational machine to get your answer. Pretty much the way your mortgage gets calculated. So the older art of tuning became the high-speed calculus of miniaturization.
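For flavor, here's a toy sketch in Python of the general shape of such a computation; my cartoon of the idea, certainly not his actual method, with all the numbers made up:

```python
import numpy as np

# The continuous interactions among elements get reduced to a dense
# matrix equation Z @ i = v, hopeless by hand but routine by machine.
n = 4                                          # four interacting elements
rng = np.random.default_rng(0)
Z = rng.normal(size=(n, n)) + n * np.eye(n)    # mutual + self "impedances" (invented)
v = np.ones(n)                                 # the applied excitation
i = np.linalg.solve(Z, v)                      # the machine whittles out the answer
print(np.round(i, 3))
```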
Anyhow, my Uncle Roger was excited to show me the calculator on his desk, and marvel that it contained as much computing power as had the mainframe on which he'd developed his 'method of moments,' as I think it is called. He had an Apple II on order. You and I can't fathom his excitement.
Now that's what computers are really good for. But computers are also good for all sorts of things they weren't intended to be good for.
I still remember, as one of the early adopters of dialup from home to the University network, where my work increasingly was, learning how to squeeze pictures from plain text. Now that, if you ask me, was way cool!
It wasn't unrelated to how you could get Chinese written characters to reify from simple pairs of Western letters. You probably already understand that the basic set of characters used in my first word processor - the ASCII set - is roughly descended from what a typewriter is good for. Or a teletype machine. It's rationalized almost like the periodic table of the elements, and it is truly a beautiful thing when you grok it.
But there is a radically limited number of "characters" which can be represented by 7 bits of data: 128 in all. The first 32 were never meant to be "printable," and could be exploited as control characters to provide effectively many more combinations than a mere 7 bits could determine. In the case of my word processor, these could signal things like font or italics or spacing.
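If you want to see the layout for yourself, a few lines of Python will do it; nothing here but the standard character set:

```python
# 7 bits yields 128 codes; the first 32 (plus DEL at 127) don't print,
# which left them free to act as "control" characters - exactly the
# loophole my old word processor exploited.
control = [c for c in range(128) if c < 32 or c == 127]
printable = [c for c in range(128) if 32 <= c < 127]
print(len(control), "control codes, e.g.", control[:8])
print(len(printable), "printable codes, from", repr(chr(32)), "to", repr(chr(126)))
```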
You wouldn't want to use a Chinese mechanical typewriter, which includes trays and trays of Chinese characters organized in descending order of frequency, which is just what the QWERTY keyboard doesn't do (it was designed instead to avoid collisions among the flying hammers).
You can imagine the pressure to keep your writing within the top tray. And there was always a toolshop to create the ones you rarely use, or you could just handwrite them in. Who could ever afford the full 50,000 or so blocks of metal type which would be needed to compose the full set of distinct Chinese written characters?
So, early on, there had to be a way to combine two ASCII characters to come up with a code for how Chinese gets written. You could display Chinese easily enough by heading off these strings of letters with the same kind of control character my word processor used. These control characters could be taken as a kind of trigger, not to be taken literally and printed up on the screen. And then, provided you had a graphics-capable display, you could retrieve the Chinese character graphic from its trivially indexed store.
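One of the real legacy codecs still lurking in Python, HZ, works almost exactly as I've described, so a tiny demonstration is possible (my example; I couldn't tell you which scheme we were actually looking at back then):

```python
# HZ encoding wraps Chinese text in the escape pair "~{ ... ~}" (the
# trigger), inside which every TWO plain ASCII letters index one
# Chinese character in the GB2312 table.
text = "中文"                  # two Chinese characters
raw = text.encode("hz")        # b'~{VPND~}': 'VP' is one character, 'ND' the other
print(raw)
print(raw.decode("hz"))        # the letter-pairs resolve back into Chinese
```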
You could do the same with pictures. The idea, I think, is to imagine being able to transmit only text, just as in the days of teletype, so far before the fax machine. In the early days of on-line connections, we were transmitting only text as well.
Text was all that was supported on the various media for exchanging words, scholar to scholar. It was a matter of cost and efficiency, also descended from those early teletype machines which were all that could be planted on each end of the wire.
People will innovate. Just as much as you can squeeze Chinese written characters out of the ASCII code, so you can squeeze out full color pictures, which plenty of clever people seemed plenty motivated to do. You can tell them not to, and they will still build tunnels beneath your walls.
And so the pictures, which were composed of binary code, had to be sent as big long strings of text, chunked up according to the limits conventionally imposed on a single message, say, to be decoded once you'd put them all back together on the other end.
At first you'd have to strip off the leading and trailing padding, pasting these long meaningless messages into a single massive textual body. Eventually, still more clever people would devise ways - and generally distribute them freely - to make this happen auto-magically.
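The modern descendant of that ritual fits in a few lines of Python; base64 here standing in for the uuencode of those days, and the chunk size my arbitrary choice:

```python
import base64

picture = bytes(range(256))                      # stand-in for binary image data
text = base64.b64encode(picture).decode("ascii") # binary becomes printable text
chunks = [text[i:i + 60] for i in range(0, len(text), 60)]  # the "messages"
reassembled = "".join(chunks)                    # paste the chunks back together
assert base64.b64decode(reassembled) == picture  # the picture survives intact
print(len(chunks), "chunks,", len(text), "characters of text for", len(picture), "bytes")
```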
There was something indelibly exciting about that first picture I successfully decoded. A semi-famous Harley-Davidson from some post-modern virtual Hell's Angels club. The Denizens of Doom, I think they called themselves.
Sure there were other pictures I drew the curtains on. There's an almost addictive allure to that kind of reproduction: the conjuring forth from the ether of something made only of indecipherable-by-mere-humans machine-code letters.
Kind of makes you want to believe in teleportation. That there's nothing we can't encode. That no matter what the limitations at first envisioned, or the proper uses of the medium, clever people will figure out how to make it do new tricks. It very much does seem as though there might be no limit to what can come about, especially if the infrastructure is fundamentally open.
I remember, again back at the University, when students, especially foreign students without a lot of money, figured out how to send their voices across the Internet "cloud," to be reassembled on the other end. The chunking apart and back together was this time of very much more finely grained packets than those of my earlier rendered pictures, but it's not a fundamentally different process.
I remember how the University IT "suits" - the ones with perfect hair - were up in arms about the bandwidth - the precious bandwidth - and people were encouraged to consider this kind of exploitation of the open-ended possibilities of the technologies to be some kind of abuse.
That was way back when Al Gore was talking about how the government would have to build the "information superhighway." Back when he had perfect hair too, come to think of it. Now whole businesses are converting their phones to take advantage of the free transport of this cloud.
So, that's what got me started. I'd missed the boat in its season, as I always must and do. I'm quite literally surrounded by folks my age who got right into the proper geek pursuits and rode that wave right to its top. I'm guessing none of them really have to work any more.
Sad to say, I do. But I'm happy to take a moment out here to tell you all about myself. Aren't we all?
So anyhow, my first real IT responsibility was to manage the first Local Area Network - or LAN - for the Graduate School of Education, where I had been a student. The early central storage-and-distribution PCs, the "file servers," formed the core of Local Area Networking, and they were first developed by a company called Novell, still very much in business.
Novell developed the technology for file "redirection", which simply means that you can fool your desktop PC into thinking that files on the network - in the "ether" or over the wire - were actually on a floppy disk or your local hard disk.
Bill Gates's near-monopolistic Disk Operating System for desktops assigned letters to the floppy drive, as did nearly everyone else's, conventionally written with a colon after it, so: a:. There were often two floppies in the early days, one to store the Operating System itself, and a second to store the data you were working on or with. So, if there were a hard disk, it would typically be called c:.
What Novell did was to work within evolving standards for signal propagation over wires, the most versatile of which came to be called Ethernet, and then to develop some code so that your local PC could access "drives," say f: through z:, as if they were just a large collection of separate disks.
And then on the other end would be a file server which would keep track of whether your particular account was allowed to access a set of files, and if so, what you could do with or to them, and then provided the access was all good, you would see these just as if they were "locally" available.
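If it helps, here's a toy model of the trick in Python; my cartoon, obviously, and not Novell's code, with the server path invented for the example:

```python
# Drive-letter redirection in miniature: the system keeps a table mapping
# letters to "places," and some of those places are really a file server
# at the other end of the wire.
drive_table = {
    "a:": "local floppy drive",
    "c:": "local hard disk",
    "f:": r"\\EDUC_SERVER\SYS\HOME",   # hypothetical server volume
}

def resolve(path: str) -> str:
    drive = path[:2].lower()
    backing = drive_table.get(drive, "unknown drive")
    return f"{path} -> {backing}"

print(resolve(r"f:\thesis\chapter1.txt"))  # looks local, lives on the LAN
```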
I turned out to be really good at getting professors' desktop computers to be able to connect over the wire.
And on the file-server end, I had a pretty good knack for understanding the logic of permissions, file attributes, and even what might be wrong with the hardware if things weren't going right.
Early on, we got into a bit of a mess, as groups of people do, with all the technical staff construed as equals. It was a too-many-spoons-in-the-pot kind of messy stew, and so I elbowed others out as the guy who should and would take direct responsibility for keeping the network "up," which it tended often not to be when there were too many, um, chefs in the kitchen. Or strange things would happen, in ways at odds with the promises you'd made to professors about nobody else being able to see their stuff.
Now Novell, the company, had a training and certification program which was truly a marvel. You might almost say it put the University itself to shame with its exacting standards for competency, and its well thought-out teaching protocols.
I, of course, availed myself of none of that, exercising my perpetual prerogative as a very smart person to make of myself an exception to everyone else's need to have their understandings certified. Besides, it was really expensive, and the departments certainly weren't offering any funding. We were supposed to be a cheap, local, quick-and-dirty alternative to frustrating bureaucratic access to centralized services.
So when I pulled the ticket to head up the project to replace the Law School's file server, there was some perfectly understandable concern from the Centralized Services network engineers who had set it up in the first place. These folks had certifications up and down their sleeves - I think Centralized IT services had funding for that sort of thing, or maybe salaries were high enough to make self-funding worth the prospect of advancement.
First, there was the plain fact of competition for work. We were encroaching on what had been their territory. Next, there was genuine and honest concern that we would screw things up. Lawyers, in a cosmos of prima-donnish professors, tended to be even a little bit more so - after all they could easily point to viability outside the Ivory Tower - and so there might have been concern for our well being and even sanity as well.
The Law School network was a marvel. There were, I believe, well over 100 professor workstations, each running a version of the still fairly new Microsoft Windows 3.11 (if memory serves, which it might not, at least not so efficiently as the File Server I'm about to describe). These machines had no local hard drives, booting instead from a clever little device which enabled them to redirect, right during the boot-strap process, to the file server instead of to a local floppy or hard disk drive.
In those days, a graphical user interface, just like Windows sported, was still a pretty new thing, and it required a lot of computing power just to "paint" its screen. And it didn't always work smoothly, depending on how much tinkering might get done to each individual installation.
The goal here was to develop a highly standardized "desktop", centrally managed for ease of configuration, maintenance, and backups, but also to reduce the overall cost of the computing infrastructure.
These basic principles have remained pretty much the same from then until now, with debates still raging about standards and centralized vs. distributed repositories for data, configuration, security and the rest.
Somehow, and I'm really not sure how, we were allowed in the end to have our day in the spotlight. How well I remember the handoff. The old server was stuck in a small closet to which Central Services had the key. Now this machine, "containing" over 100 running desktops, had a total load of 16 MB of RAM, which your phone would laugh at today. Its disk space was far less than the smallest netbook now has, and its computing power was probably in line, also, with your phone's.
That's how well optimized the Novell Network Operating System was.
Very early one morning, since we were going to have to bring this server down to do some necessary maintenance along the way toward replacing it, my colleague and I were waiting for the Central Services folks to show up. We had a smallish window before the very demanding professors would arrive for work. The lock looked small. Way back when I was a bike mechanic, my buddy had explained to me the principles of lock picking. It was shockingly trivial, and, well, thank God nobody asked any questions.
So, we were off! Not only were we going to run the risk of leaving 100 workstations out of service, but we were performing an upgrade to the actual Network Operating System, so at some point, the changes we had drilled and practiced were going to be a one-way street, with no going back possible within reasonable time limits.
I'm pretty sure it took us an entire night, but I'm happy to say that we pulled it off (and I spent an entire sabbatical cycle together with my colleague in a, um, full-contact relationship, but you're not going to hear any more about that here!)
Now already by that time we in our little "node" were swimming against a mighty tide. Microsoft had just come out with its first viable network operating system, called Windows NT. There was a lot of grousing at first because techie types noticed that the network operating system, which was pretty expensive, was code-identical to the desktop operating system, which was much cheaper.
I guess there might have been a single line in what's called "the registry" to distinguish them, and as I recall, Microsoft made a good case that because it was intended to be used and supported differently they could and should charge differently for it.
We in our area were already grizzled network engineers and administrators by that time, having pulled off a fairly complicated migration to new hardware and a much more complex network operating system. We'd seen how efficiently it would run on how little hardware, plus we had gotten a flavor of the lively esprit de corps which comes along with a fully professional cadre of workers.
Microsoft was proposing something completely different. Their operating system was meant to be more intuitive to install and configure. This, naturally, caused concern rather than excitement, at least for me, because it was a pretty open invitation to let lots of chefs into my kitchen, and to allow for as many different combinations of creative configuring as experimental amateurs might like to come up with.
To compound things, Microsoft could in no way replicate the quite sizable pool of professionals who had become well seasoned in the Novell offerings, which meant they would have to play catch-up by a combination of financial and ease-of-entry lures into Microsoft's own version of the Certified Network Engineer.
Traditionally, becoming a Certified Engineer really meant something, and the exams were designed to weed out cramming and memorization of texts. You actually had to have some hands-on smarts before you could expect to get your nod.
But none of this was about to slow the already mighty Microsoft marketing machine. No matter that Windows NT was almost primitive in terms of networking protocol support, cobbling together its support for Novell's routable network "stack" (think ability to go from building to building in a campus, across routers which were located according to physical distance constraints for signal propagation).
No matter that when you mistook a server for a workstation, which it practically invited you to do, the load of the graphical interface would use up most of the actual computing resources. No matter that it could serve up files more slowly than a single elevator with a sleepy attendant could serve up a skyscraper full of workers.
Microsoft was selling a vision of a kind of seamless network ecology, made up of limitless nodes, all sharing code and processing in a fabric of interconnection, defined and held together by a near-limitless distributed network.
I remember the Central Services UNIX guys coming back from a trip to Redmond where they attended a briefing designed expressly for people like them. "We aren't in Kansas any more" was pretty much the sum of their report to the rest of us.
To some great extent, Microsoft has delivered on the vision they set out to sell. I really don't know what their strong-arm and soft-soap sales techniques actually were, but it was pretty clear that this tsunami was going to engulf everyone and everything in its way.
Getting most PCs sold to businesses to ship pre-loaded with the Microsoft OS pretty much meant that having the Windows Network Operating System at the core of your management planning was going to be the best way to leverage your investments in workstations.
Naturally, Microsoft was not motivated to help companies like Novell interact quite so effectively as Microsoft itself could with the Windows desktop ecology. Although, to be truthful, Novell did a pretty darned good job. But you don't sell to the engineers in the field, you sell to the decision making "suits" and they were all pretty much drinking Microsoft's water; even at a place like the university, where the world was also divided between suits and the rest of us. Well, that's just the laity who support the place. I'm not talking about the clergy here.
So, this tension between power and control at the center, and distribution of computing power throughout a kind of ecological grid has persisted throughout the entire span of the surprisingly recent IT revolution. It's become so much a part of our lives that it can also seem as if we've had these things forever.
Clearly there are plenty of people who could write up this history, and likely have, from a basis of considerable and deep expertise, which I don't have. I'm writing up my impressions as someone who has observed society from, perhaps, a rather broader perspective than that of those who might be completely enamored of or embedded in the technologies I describe.
At the same time, I've done enough hands on work, much of which I'm quite proud of, to feel that my sense is accurate in at least the broadest outlines.
So, what's the point of all this? Well, as I recall, the head of Novell when I was into it was none other than Eric Schmidt, now CEO of Google. Novell was the company that bought WordPerfect in an attempt to go head to head with Microsoft in the soup-to-nuts world of network design and application delivery. On the merits of the products it was not a bad wager.
Plenty of heavy users of word processing software were wildly enthusiastic about WordPerfect, especially up against Microsoft Word. It had more features, was better thought out, and was designed to appeal to its principal users, professional wordsmiths, rather than to the mass of occasional users like students and professors and lawyers who might want or need to do their own typing once secretaries became obsolete.
In any case, without a prayer of significant market share for a desktop operating system, and given Microsoft's marketing muscle, Novell's was a dream destined for failure. And I don't think, again, that Microsoft was especially helpful when WordPerfect turned the corner to the Windows graphical user interface. Lots of companies were hit with crash-and-hang performance, including Microsoft, but not all of them were equipped to make it through.
There is a good enough analogy between Novell's Network Operating System and WordPerfect on the one side, catering to professionals as their core target user group; and Microsoft's Windows and Office, which aimed to become ubiquitous, on the other.
So Google now, interestingly (at least to me), wants to turn your computer into a kind of dumb terminal again, pretty much like those law school desktops I used to support. It won't have to do much other than boot itself onto the network, which is all that the new Chrome OS will be designed to do.
Ideally, Chrome will be quite similar to the hypervisor OS "shim" which is used to host the latest "virtual machines" where everything about the computer Operating System is abstracted from any particular hardware configuration. The hypervisor shim emulates a universal hardware machine, and the operating system talks to this virtual hardware. The actual "machine" you count on to get your work done is a software file, and can be physically located, virtually, anywhere. It's pretty darned cool stuff.
For the Google vision, in place of the operating system will be a very "lightweight" Internet browser, also called Chrome, which can run applications, analogous to WordPerfect, or Excel, or what you will, which are actually hosted somewhere in what now almost universally gets called "the cloud." I'm pretty sure it was originally the Microsoft engineers who started calling it that.
Now we in the trenches of IT are pretty used to thinking of Microsoft as "evil". Even though we make our money, indirectly, from their monopoly, we all kind of wish there were more choices, and that we could specialize in the various stuff which might turn us on, or capture our enthusiasm because it works the way we think it should.
That will, of course, be different for different people, but that's the very point. So, Microsoft is "evil" because it pushed or crowded out all the competition, to the point where we all do the same thing almost the same way almost all the time. Whether we work in a bank or in a university, the design considerations are almost identical.
Yes, Apple also has a network operating system, but it's pretty hayfoot-strawfoot compared to where Microsoft's has evolved. And then there's Linux, another story entirely, which I will get to in a minute.
And to tell you the truth, Microsoft's ubiquity is a large part of why I just can't do it any more. Don't get me wrong, there are still plenty of decisions to make, and plenty of competing companies which ride on top of the basic Microsoft ecology. There are gargantuan ranges of pricing and features, and lots of ways to make a complete ass out of yourself with the wrong decisions.
But the big picture is pretty much shaping up as a contest between Microsoft's vision of a distributed ecology of computing power and Google's seemingly much more distributed version of accessibility.
I now really don't care what machine I use. In most instances, I prefer the Linux OS because it's actually less fussy to load and get going, and is a lot less fussy about the hardware it rides on. It's a little more fussy to use, but not enough to make a difference.
All my documents are somewhere in Google's cloud, as is this blog, as is my email. I'm reasonably confident that it will be there whenever I need it from whatever machine I'm stuck with. There are no features that I find missing, and I'm happy to leave the heavy duty spreadsheeting, or complexly formatted word-processing or databasing to and for users who want to fuss with those complicated and bloated locally installed and expensive programs.
But there's a pretty big catch to all of this. There's pretty much only one company big enough to index and replicate the entire Internet real-time. It's not just a matter of computing power, it's also the electrical power to run and cool the computing power as well as the extremely expensive bandwidth to keep in touch with the vast reaches and every corner of this World Wide Web.
Plus, when I sign on, it's not like I'm always getting my stuff from the same place on the same machine which serves it up to me across this "cloud". My own work is replicated throughout Google's complex infrastructure, such that if the particular machine I'm working on or through or with bites the dust, I won't even know it since my work will be rerouted to another copy.
Even so, it does go down from time to time. So these days some people like to have a backup email service if they use Gmail principally. And they might like to take a copy of their work down to their own PC, just in case Google is unavailable when they need it, or, God forbid, Google decides not to give them access because they violated some arcane section of the Google terms of service. Yeah, like anyone on the planet actually reads those things you click to agree to.
So, I want to know, who's good now, and who's evil? My Linux-loaded laptop seems to crash just about as often or as seldom as my Windows machines used to. Every single day I must consume bandwidth just to keep it up to date. To keep out the bad guys who I guess are always trying to horn their way in to catch a password and maybe steal my identity or my bank account.
I'll be happy when my computer really is a shim up into some cloud which is freely maintained on my behalf. But that's a pipe dream. Not the cloud, which is already real. The "on my behalf" part.
I don't kid myself that once people get wind of what I'm really saying here, the plug will be pulled on my saying it. There are no terms of service which can allow Catalytic Narratives to crystallize a web of lies into its inevitable thunking.
Meanwhile, I write freely and with plenty of abandon, confident, as you should be too, that these essentially evil empires will be overcome by the goodness of the people who are their workers.
One day, Google's money-printing apparatus will be overrun by the clear superiority of low power ways to organize our blessed Internet. These will depend not at all on massive replication and indexing of everything that's out there. Instead, machines will resolve actual human clicks into virtual spaces which represent connections among people and things and ideas and places.
Machines will be quieted down, and people who want to teach them to think will do so in universities where they belong - harnessing the surplus power which will otherwise clog our landfills.
Real people don't need or want machines to do their thinking for them. Real people aren't afraid to expose their identity in ways that make it impossible for someone else to impersonate them. Real people aren't terribly embarrassed by their most private fantasies.
Real people are connected by their actions and words and even clicks, and it is this kind of networking which can result in a kind of automatic indexing, by simple calculation of degrees of separation, of those people and words and things which crystallize our desires; our wants and our needs.
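Since I've just made a claim about calculation, here's the simplest possible sketch of it in Python; the connections are invented, and the "index" is nothing but degrees of separation computed outward from a starting point:

```python
from collections import deque

links = {                       # hypothetical connections among people and things
    "me": ["blog", "alice"],
    "alice": ["bob", "guitar"],
    "blog": ["reader"],
    "bob": [], "guitar": [], "reader": [],
}

def degrees_from(start):
    seen, queue = {start: 0}, deque([start])
    while queue:
        node = queue.popleft()
        for neighbor in links.get(node, []):
            if neighbor not in seen:
                seen[neighbor] = seen[node] + 1   # one step further out
                queue.append(neighbor)
    return seen

print(degrees_from("me"))   # {'me': 0, 'blog': 1, 'alice': 1, 'reader': 2, ...}
```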
So, that's what I'm working on, while you all (well, not you, but you know who I mean) continue to be fascinated by the fireworks displays of too much light and noise and power.
Peace out dudes.