Saturday Night Science: Life After Google

In his 1990 book Life after Television, George Gilder predicted that the personal computer, then mostly boxes that sat on desktops and worked in isolation from one another, would become more personal and mobile, and would be used more to communicate than to compute. In the 1994 revised edition of the book, he wrote: “The most common personal computer of the next decade will be a digital cellular phone with an IP address … connecting to thousands of databases of all kinds.” In contemporary speeches he expanded on the idea, saying, “it will be as portable as your watch and as personal as your wallet; it will recognize speech and navigate streets; it will collect your mail, your news, and your paycheck.” In 2000, he published Telecosm, where he forecast that the building out of a fibre optic communication infrastructure and the development of successive generations of spread spectrum digital mobile communication technologies would effectively cause the cost of communication bandwidth (the quantity of data which can be transmitted in a given time) to asymptotically approach zero, just as the ability to pack more and more transistors on microprocessor and memory chips was doing for computing.

Clearly, when George Gilder forecasts the future of computing, communication, and the industries and social phenomena that spring from them, it’s wise to pay attention. He’s not infallible: in 1990 he predicted that “in the world of networked computers, no one would have to see an advertisement he didn’t want to see”. Oh, well. The very difference between that happy vision and the advertisement-cluttered world we inhabit today, rife with bots, malware, scams, and serial large-scale security breaches which compromise the personal data of millions of people and expose them to identity theft and other forms of fraud, is the subject of this book: how we got here, and how technology is opening a path to move on to a better place.

The Internet was born with decentralisation as a central concept. Its U.S. government-funded precursor, ARPANET, was intended to research and demonstrate the technology of packet switching, in which dedicated communication lines from point to point (as in the telephone network) were replaced by switching packets, which can represent all kinds of data—text, voice, video, mail, cat pictures—from source to destination over shared high-speed data links. If the network had multiple paths from source to destination, failure of one data link would simply cause the network to reroute traffic onto a working path, and communication protocols would cause any packets lost in the failure to be automatically re-sent, preventing loss of data. The network might degrade and deliver data more slowly if links or switching hubs went down, but everything would still get through.

This was very attractive to military planners in the Cold War, who worried about a nuclear attack decapitating their command and control network by striking one or a few locations through which their communications funnelled. A distributed network, of which ARPANET was the prototype, would be immune to this kind of top-down attack because there was no top: it was made up of peers, spread all over the landscape, all able to switch data among themselves through a mesh of interconnecting links.

As the ARPANET grew into the Internet and expanded from a small community of military, government, university, and large company users into a mass audience in the 1990s, this fundamental architecture was preserved, but in practice the network bifurcated into a two-tier structure. The top tier consisted of the original ARPANET-like users, plus “Internet Service Providers” (ISPs), who had top-tier (“backbone”) connectivity and resold Internet access to their customers, most of whom initially connected via dial-up modems. Over time, these customers obtained higher bandwidth via cable television connections, satellite dishes, digital subscriber lines (DSL) over the wired telephone network, and, more recently, mobile devices such as cellular telephones and tablets.

The architecture of the Internet remained the same, but this evolution weakened its peer-to-peer structure. The approaching exhaustion of 32-bit Internet addresses (IPv4) and the slow deployment of its successor (IPv6) meant most small-scale Internet users did not have a permanent address at which others could contact them. In an attempt to shield users from the flawed security model and implementation of the software they ran, their Internet connections were increasingly placed behind firewalls and subjected to Network Address Translation (NAT), which made it impossible to establish peer-to-peer connections without a third-party intermediary (which, of course, subverts the design goal of decentralisation). While on the ARPANET and the original Internet every site was a peer of every other (subject only to the speed of their network connections and the computer power available to handle network traffic), the network population now became increasingly divided into producers or publishers (who made information available) and consumers (who used the network to access the publishers’ sites but did not publish themselves).

While in the mid-1990s it was easy (or as easy as anything was in that era) to set up your own Web server and publish anything you wished, now most small-scale users were forced to employ hosting services operated by the publishers to make their content available. Services such as AOL, Myspace, Blogger, Facebook, and YouTube were widely used by individuals and companies to host their content, while those wishing their own apparently independent Web presence moved to hosting providers who supplied, for a fee, the servers, storage, and Internet access used by the site.

All of this led to a centralisation of data on the Web, which was accelerated by the emergence of the high speed fibre optic links and massive computing power upon which Gilder had based his 1990 and 2000 forecasts. Both of these came with great economies of scale: it cost a company like Google or Amazon much less per unit of computing power or network bandwidth to build a large, industrial-scale data centre located where electrical power and cooling were inexpensive and linked to the Internet backbone by multiple fibre optic channels, than it cost an individual Internet user or small company with their own server on premises and a modest speed link to an ISP. Thus it became practical for these Goliaths of the Internet to suck up everybody’s data and resell their computing power and access at attractive prices.

As an example of the magnitude of the economies of scale we’re talking about, when I migrated the hosting of my Fourmilab.ch site from my own on-site servers and Internet connection to an Amazon Web Services data centre, my monthly bill for hosting the site dropped by a factor of fifty—not fifty percent, one fiftieth the cost, and you can bet Amazon’s making money on the deal.

This tremendous centralisation is the antithesis of the concept of ARPANET. Instead of a worldwide grid of redundant data links and data distributed everywhere, we have a modest number of huge data centres linked by fibre optic cables carrying traffic for millions of individuals and enterprises. A couple of submarines full of Trident D5s would probably suffice to reset the world, computer network-wise, to 1970.

As this concentration was occurring, the same companies who were building the data centres were offering more and more services to users of the Internet: search engines; hosting of blogs, images, audio, and video; E-mail services; social networks of all kinds; storage and collaborative working tools; high-resolution maps and imagery of the world; archives of data and research material; and a host of others. How was all of this to be paid for? Those giant data centres, after all, represent a capital investment of tens of billions of dollars, and their electricity bills are comparable to those of an aluminium smelter. Due to the architecture of the Internet (or, more precisely, the pieces missing from it), a fateful choice was made in the early days of the build-out of these services which now pervade our lives, and we’re all paying the price for it. So far, it has allowed the few companies in this data oligopoly to join the ranks of the largest, most profitable, and most highly valued enterprises in human history, but they may be built on a flawed business model and a foundation vulnerable to disruption by software and hardware technologies presently emerging.

The basic business model of what we might call the “consumer Internet” (as opposed to businesses who pay to host their Web presence, on-line stores, etc.) has, with few exceptions, evolved to be what the author calls the “Google model” (although it predates Google): give the product away and make money by afflicting its users with advertisements, which are increasingly targeted to them through intrusive tracking of their behaviour on the network. The fundamental flaws of this are apparent to anybody who uses the Internet: the constant clutter of advertisements, with pop-ups, pop-overs, auto-play video and audio, flashing banners, incessant requests to allow tracking “cookies” or irritating notifications, and the consequent arms race between ad blockers and means to circumvent them, with browser developers (at least those not employed by companies paid, directly or indirectly, by the advertisers) caught in the middle. There are even absurd Web sites which charge a subscription fee for “membership” and then bombard these paying customers with advertisements that insult their intelligence.

But there is a fundamental problem with “free”—it destroys the most important channel of communication between the vendor of a product or service and the customer: the price the customer is willing to pay. Deprived of this information, the vendor is in the same position as a factory manager in a centrally planned economy who has no idea how many of each item to make because his orders are handed down by a planning bureau equally clueless about what is needed in the absence of a price signal. In the end, you have freight cars of typewriter ribbons lined up on sidings while customers wait in line for hours in the hope of buying a new pair of shoes.

Further, when the user is not the customer (the one who pays), and especially when a “free” service verges on monopoly status like Google search, Gmail, Facebook, and Twitter, there is little incentive for providers to improve the user experience or be responsive to user requests and needs. Users are subjected to the endless torment of buggy “beta” releases, capricious change for the sake of change, and compromises in the user experience on behalf of the real customers—the advertisers. Once again, this mirrors the experience of centrally-planned economies where the market feedback from price is absent: to appreciate this, you need only compare consumer products from the 1970s and 1980s manufactured in the Soviet Union with those from Japan.

The fundamental flaw in Karl Marx’s economics was his belief that the industrial revolution of his time would produce such an abundance of goods that the problem would shift from “production amid scarcity” to “redistribution of abundance”. In the author’s view, the neo-Marxists of Silicon Valley see the exponentially growing technologies of computing and communication providing such abundance that they can give away its fruits in return for collecting and monetising information about their users (note, not “customers”: customers are those who pay for the information so collected). Once you grasp this, it’s easier to understand the politics of the barons of Silicon Valley.

The centralisation of data and information flow in these vast data silos creates another threat to which a distributed system is immune: censorship or manipulation of information flow, whether by a coercive government or by the ideologically-motivated management of the companies who provide these “free” services. We may never know who first said “The Internet treats censorship as damage and routes around it” (the quote has been attributed to numerous people, including two personal friends, so I’m not going there), but it’s profound: the original decentralised structure of the ARPANET/Internet is as robust against censorship as it is in the face of nuclear war. If one or more nodes on the network start to censor information or refuse to forward it on communication links they control, the network routing protocols simply assume those nodes are down and send data around them through other nodes and paths which do not censor it. On a network with a multitude of nodes and paths among them, owned by a large and diverse population of operators, it is extraordinarily difficult to shut down the flow of information from a given source or viewpoint; there will almost always be an alternative route that gets it there. (Cryptographic protocols and secure, verified identities can similarly prevent the alteration of information in transit, or the forging of information attributed to a different originator; I’ll discuss that later.) As with physical damage, top-down censorship does not work because there’s no top.
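
To see why, consider a toy example. Here is a minimal Python sketch (the five-node topology is invented for illustration) in which a censoring node is handled exactly like a dead one: the route search simply returns a path that avoids it.

    from collections import deque

    # Toy mesh: each node lists its directly connected peers.
    TOPOLOGY = {
        "A": ["B", "C"],
        "B": ["A", "D"],
        "C": ["A", "D"],
        "D": ["B", "C", "E"],
        "E": ["D"],
    }

    def route(src, dst, down=frozenset()):
        """Breadth-first search for a path from src to dst, treating
        nodes in `down` (failed or censoring) as simply absent."""
        queue = deque([[src]])
        seen = {src}
        while queue:
            path = queue.popleft()
            node = path[-1]
            if node == dst:
                return path
            for peer in TOPOLOGY[node]:
                if peer not in seen and peer not in down:
                    seen.add(peer)
                    queue.append(path + [peer])
        return None  # destination unreachable

    print(route("A", "E"))              # ['A', 'B', 'D', 'E']
    print(route("A", "E", down={"B"}))  # ['A', 'C', 'D', 'E'] -- routed around B

Whether B is smouldering rubble or merely refusing to forward traffic it dislikes makes no difference to the algorithm; that is the whole point.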

But with the current centralised Internet, the owners and operators of these data silos have enormous power to put their thumbs on the scale, tilting opinion in their favour and blocking speech they oppose. Google can push down the page rank of information sources of which they disapprove, so few users will find them. YouTube can “demonetise” videos because they dislike their content, cutting off their creators’ revenue stream overnight with no means of appeal, or they can outright ban creators from the platform and remove their existing content. Twitter routinely “shadow-bans” those with whom they disagree, causing their tweets to disappear into the void, and outright banishes those more vocal. Internet payment processors and crowd-funding sites enforce explicit ideological litmus tests on their users, and revoke long-standing commercial relationships over legal speech. One might restate the original observation about the Internet as “The centralised Internet treats censorship as an opportunity and says, ‘Isn’t it great!’ ” Today there’s a top, and those on top control the speech that flows through their data silos.

This pernicious centralisation and “free” funding by advertisement (which is fundamentally plundering users’ most precious possessions: their time and attention) were in large part the consequence of the Internet’s lacking three fundamental architectural layers: security, trust, and transactions. Let’s explore them.

Security. Essential to any useful communication system, security simply means that communications between parties on the network cannot be intercepted by third parties, modified en route, or otherwise manipulated (for example, by changing the order in which messages are received). The communication protocols of the Internet had no explicit security layer: security was expected to be implemented outside the protocols, by the applications which used them. On today’s Internet, security has been bolted on, largely through the Transport Layer Security (TLS) protocols (which, due to their history, go by a number of other commonly used names, and are most often encountered in the “https:” URLs by which users access Web sites). But because TLS was bolted on and “just grew” rather than being designed in from the bottom up, it has been the locus of numerous security flaws which put software that employs it at risk. Further, TLS is a tool which must be used by application designers with extreme care in order to deliver security to their users. Even if TLS were completely flawless, it is very easy to misuse it in an application and compromise users’ security.
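
As an illustration of how much care this takes, here is a minimal Python sketch using the standard library’s ssl module (the host name is only an example). The first part is the safe default; the last three lines are the classic misuse, which keeps the encryption running while silently discarding any assurance about who is on the other end:

    import socket
    import ssl

    # Correct use: the default context verifies the server's certificate
    # chain and checks that the certificate matches the requested host.
    ctx = ssl.create_default_context()
    with socket.create_connection(("example.com", 443)) as raw:
        with ctx.wrap_socket(raw, server_hostname="example.com") as tls:
            print(tls.version())                 # e.g. 'TLSv1.3'
            print(tls.getpeercert()["subject"])  # the verified identity

    # The classic misuse: two settings that look harmless but remove
    # all authentication, leaving encryption to an unverified endpoint.
    insecure = ssl.create_default_context()
    insecure.check_hostname = False
    insecure.verify_mode = ssl.CERT_NONE  # never do this in production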

Trust. As indispensable as security is knowing to whom you’re talking. For example, when you connect to your bank’s Web site, how do you know you’re actually talking to their server and not some criminal who has spoofed your computer’s domain name system responses to intercept your communications and who, the moment you enter your password, will be off and running to empty your bank accounts and make your life a living Hell? Once again, trust has been bolted on to the existing Internet, through a rickety system of “certificates” issued mostly by large companies for outrageous fees. And, as with anything centralised, it’s vulnerable: in 2016, one of the top-line certificate vendors was compromised, requiring myriad Web sites (including this one) to re-issue their security certificates.
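
The decentralised alternative discussed later is to publish public keys in a tamper-evident public record and verify signatures against them directly, with no certificate authority in the loop. Purely as a sketch of the mechanics (my illustration, not anything from the book), using the third-party Python cryptography package, with the “public ledger” reduced to a variable:

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # Bank side: generate a key pair once and publish the public half
    # (imagine it recorded in a tamper-evident public ledger).
    bank_key = Ed25519PrivateKey.generate()
    published_public_key = bank_key.public_key()

    # The bank signs a message with its private key.
    message = b"Your balance is 1,000 CHF"
    signature = bank_key.sign(message)

    # Client side: verify against the published key, not a CA chain.
    try:
        published_public_key.verify(signature, message)
        print("Message genuinely from the bank")
    except InvalidSignature:
        print("Forgery or tampering detected")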

Transactions. Business is all about transactions; if you aren’t doing transactions, you aren’t in business or, as Gilder puts it, “In business, the ability to conduct transactions is not optional. It is the way all economic learning and growth occur. If your product is ‘free,’ it is not a product, and you are not in business, even if you can extort money from so-called advertisers to fund it.” The present-day Internet has no transaction layer, even bolted on. Instead, we have more silos and bags hanging off the side of the Internet called PayPal, credit card processing companies, and the like, which try to put a Band-Aid over the suppurating wound which is the absence of a way to send money over the Internet in a secure, trusted, quick, efficient, and low-overhead manner.

The need for this was perceived long before ARPANET. In Project Xanadu, founded by Ted Nelson in 1960, rule 9 of the “original 17 rules” was, “Every document can contain a royalty mechanism at any desired degree of granularity to ensure payment on any portion accessed, including virtual copies (‘transclusions’) of all or part of the document.” While defined in terms of documents and quoting, this implied the existence of a micropayment system which would allow compensating authors and publishers for copies and quotations of their work with a granularity as small as one character, and could easily be extended to cover payments for products and services.

A micropayment system must be able to handle very small payments without crushing overhead, extremely quickly, and transparently (without the Japanese tea ceremony that buying something on-line involves today). As originally envisioned by Ted Nelson, as you read documents, their authors and publishers would be automatically paid for their content, including payments to the originators of material from others embedded within them. As long as the total price for the document was less than what I termed the user’s “threshold of paying”, this would be completely transparent (a user would set the threshold in the browser: if zero, they’d have to approve all payments). There would be no need for advertisements to support publication on a public hypertext network (although publishers would, of course, be free to adopt that model if they wished). If implemented in a decentralised way, like the ARPANET, there would be no central strangle point where censorship could be applied by cutting off the ability to receive payments.
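
No such mechanism exists in today’s browsers; purely to make the “threshold of paying” logic concrete, here is a hypothetical Python sketch (every name in it is invented for illustration):

    from decimal import Decimal

    class MicropaymentAgent:
        """Hypothetical browser-side agent: prices at or below the user's
        threshold are paid silently; anything above it (or everything,
        if the threshold is zero) requires explicit approval."""

        def __init__(self, threshold, approve, pay):
            self.threshold = Decimal(threshold)  # 0 = approve every payment
            self.approve = approve               # callback: ask the user
            self.pay = pay                       # callback: execute payment

        def on_content_priced(self, publisher, price):
            price = Decimal(price)
            if 0 < self.threshold and price <= self.threshold:
                self.pay(publisher, price)       # transparent, no interruption
                return True
            if self.approve(publisher, price):   # over threshold: ask first
                self.pay(publisher, price)
                return True
            return False                         # content not purchased

    agent = MicropaymentAgent(
        threshold="0.01",
        approve=lambda pub, p: input(f"Pay {p} to {pub}? [y/N] ") == "y",
        pay=lambda pub, p: print(f"paid {p} to {pub}"),
    )
    agent.on_content_priced("example-author", "0.0004")  # paid silently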

So, is it possible to remake the Internet, building in security, trust, and transactions as the foundation, and replace what the author calls the “Google system of the world” with one in which the data silos are seen as obsolete, control of users’ personal data and work returns to their hands, privacy is respected and the panopticon snooping of today is seen as a dark time we’ve put behind us, and the pervasive and growing censorship by plutocrat ideologues and slaver governments becomes impotent and obsolete? George Gilder responds “yes”, and in this book identifies technologies already existing and being deployed which can bring about this transformation.

At the heart of many of these technologies is the concept of a blockchain, an open, distributed ledger which records transactions or any other form of information in a permanent, public, and verifiable manner. Originally conceived as the transaction ledger for the Bitcoin cryptocurrency, it provided the first means of solving the double-spending problem (how do you keep people from spending a unit of electronic currency twice?) without the need for a central server or trusted authority, and hence without a potential choke point or vulnerability to attack or failure. Since the launch of Bitcoin in 2009, blockchain technology has become a major area of research, with banks and other large financial institutions, companies such as IBM, and major university research groups exploring applications with the goals of drastically reducing transaction costs, improving security, and hardening systems against single-point failure risks.
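
For readers who have never looked inside one, the hash-linking that makes such a ledger tamper-evident fits in a few lines of Python. This toy sketch (mine, not the book’s) omits consensus, proof-of-work, and signatures entirely; it shows only why altering any past record invalidates every later block:

    import hashlib
    import json

    def block_hash(block):
        # Hash the block's canonical JSON form.
        payload = json.dumps(block, sort_keys=True).encode()
        return hashlib.sha256(payload).hexdigest()

    def add_block(chain, transactions):
        prev = block_hash(chain[-1]) if chain else "0" * 64
        chain.append({"prev_hash": prev, "transactions": transactions})

    def verify(chain):
        """True if every block correctly commits to its predecessor."""
        return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
                   for i in range(1, len(chain)))

    chain = []
    add_block(chain, [{"from": "alice", "to": "bob", "amount": 5}])
    add_block(chain, [{"from": "bob", "to": "carol", "amount": 2}])
    print(verify(chain))                         # True
    chain[0]["transactions"][0]["amount"] = 500  # rewrite history...
    print(verify(chain))                         # False -- tampering detected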

Applied to the Internet, blockchain technology can provide security and trust (through the permanent publication of public keys which identify actors on the network), and a transaction layer able to efficiently and quickly execute micropayments without the overhead, clutter, friction, and security risks of existing payment systems. By necessity, present-day blockchain implementations are add-ons to the existing Internet, but as the technology matures and is verified and tested, it can move into the foundations of a successor system, based on the same lower-level protocols (and hence compatible with the installed base), but eventually supplanting the patched-together architecture of the Domain Name System, certificate authorities, and payment processors, all of which represent vulnerabilities of the present-day Internet and points at which censorship and control can be imposed. Technologies to watch in these areas are:

As the bandwidth available to users on the edge of the network increases through the deployment of fibre to the home and enterprise and via 5G mobile technology, the data transfer economy of scale of the great data silos will begin to erode. Early in the Roaring Twenties, the aggregate computing power and communication bandwidth on the edge of the network will equal and eventually dwarf that of the legacy data smelters of Google, Facebook, Twitter, and the rest. There will no longer be any need for users to entrust their data to these overbearing anachronisms and consent to multi-dozen-page “terms of service” or endure advertising just to see their own content or share it with others. You will be in possession of your own data, on your own server or on space for which you freely contract with others, with backup and other services contracted with any other provider on the network. If your server has extra capacity, you can turn it into money by joining the market for computing and storage capacity, just as you take advantage of these resources when required. All of this will be built on the new secure foundation, so you will retain complete control over who can see your data, no longer trusting weasel-worded promises made by amorphous entities with whom you have no real contract to guard your privacy and intellectual property rights. If you wish, you can be paid for your content, with remittances made automatically as people access it. More and more, you’ll make tiny payments for content which is no longer obstructed by advertising and chopped up to accommodate more clutter. And when outrage mobs of pink hairs and soybeards (each with their own pronoun) come howling to ban you from the Internet, they’ll find nobody to shriek at and the kill switch rusting away in a derelict data centre: your data will be in your own hands with access through myriad routes. Technologies moving in this direction include:

This book provides a breezy look at the present state of the Internet, how we got here (versus where we thought we were going in the 1990s), and how we might move beyond the present-day mess to something better if not blocked by the heavy hand of government regulation (the risk of freezing the present-day architecture in place by unleashing agencies like the U.S. Federal Communications Commission, which stifled innovation in broadcasting for six decades, to do the same to the Internet is discussed in detail). Although it’s way too early to see which of the many contending technologies will win out (and recall that the technically superior technology doesn’t always prevail), a survey of work in progress provides a sense of what they have in common and what the eventual result might look like.

There are many things to quibble about here. Gilder goes on at some length about how he believes artificial intelligence is all nonsense, that computers can never truly think or be conscious, and that creativity (new information in the Shannon sense) can only come from the human mind, with a lot of confused arguments from Gödel incompleteness, the Turing halting problem, and even the uncertainty principle of quantum mechanics. He really seems to believe in vitalism, that there is an élan vital which somehow infuses the biological substrate which no machine can embody. This strikes me as superstitious nonsense: a human brain is a structure composed of quarks and electrons arranged in a certain way which processes information, interacts with its environment, and is able to observe its own operation as well as external phenomena (which is all consciousness is about). Now, it may be that somehow quantum mechanics is involved in all of this, and that our existing computers, which are entirely deterministic and classical in their operation, cannot replicate this functionality, but if that’s so it simply means we’ll have to wait until quantum computing (which is already working in rudimentary form in the laboratory, and is just a different way of arranging the quarks and electrons in a system) develops further.

He argues that while Bitcoin can be an efficient and secure means of processing transactions, it is unsuitable as a replacement for volatile fiat money because, unlike gold, it has an absolute limit on its quantity, after which the supply will be capped. I don’t get it. It seems to me that this is a feature, not a bug. The supply of gold increases slowly as new gold is mined, and by pure coincidence the rate of increase in its supply has happened to approximate that of global economic growth. But still, the existing inventory of gold dwarfs new supply, so there isn’t much difference between a very slowly increasing supply and a static one. If you’re on a pure gold standard and economic growth is faster than the increase in the supply of gold, there will be gradual deflation because a given quantity of gold will buy more in the future. But so what? In a deflationary environment, interest rates will be low and it will be easy to fund new investment, since investors will receive money back which will be more valuable. With Bitcoin, once the entire supply is mined, supply will be static (actually, very slowly shrinking, as private keys are eventually lost, which is precisely like gold being consumed by industrial uses from which it is not reclaimed), but Bitcoin can be divided without limit (with minor and upward-compatible changes to the existing protocol). So it really doesn’t matter if, in the greater solar system economy of the year 8537, a single Bitcoin is sufficient to buy Jupiter: transactions will simply be done in yocto-satoshis or whatever. In fact, Bitcoin is better in this regard than gold, which cannot be subdivided below the unit of one atom.
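
The arithmetic is easy to check in Python (the sub-satoshi unit is, of course, hypothetical):

    # Today's protocol already divides each bitcoin into 10**8 satoshis.
    BTC_CAP = 21_000_000
    SATOSHIS_PER_BTC = 10**8
    print(f"{BTC_CAP * SATOSHIS_PER_BTC:,} total indivisible units")
    # 2,100,000,000,000,000

    # A (hypothetical) future protocol change to 10**24 sub-units per
    # satoshi ("yocto-satoshis") would split a single bitcoin into:
    print(f"{SATOSHIS_PER_BTC * 10**24:,} pieces")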

Gilder further argues, as he did in The Scandal of Money, that the proper dimensional unit for money is time, since that is the measure of what is required to create true wealth (as opposed to funny money created by governments or fantasy money “earned” in zero-sum speculation such as currency trading), and that existing cryptocurrencies do not meet this definition. I’ll take his word on the latter point (it’s his definition, after all), but his time theory of money is way too close to the Marxist labour theory of value to persuade me. That theory is trivially falsified by its prediction that more value is created in labour-intensive production of the same goods than by producing them in a more efficient manner. In fact, value, measured as profit, dramatically increases as the labour input to production is reduced. Over forty centuries of human history, the one thing in common among almost everything used for money (at least until our post-reality era) is scarcity: the supply is limited and it is difficult to increase it. The genius of Bitcoin and its underlying blockchain technology is that it solved the problem of how to make a digital good, which can be copied at zero cost, scarce, without requiring a central authority. That seems to meet the essential requirement to serve as money, regardless of how you define that term.

Gilder’s books have a good record for sketching the future of technology and identifying the trends which are contributing to it. He has been less successful picking winners and losers; I wouldn’t make investment decisions based on his evaluation of products and companies, but rather wait until the market sorts out those which will endure.

Gilder, George. Life after Google. Washington: Regnery Publishing, 2018. ISBN 978-1-62157-576-4.

Here is a talk by the author at the Blockstack Berlin 2018 conference which summarises the essentials of his thesis in just eleven minutes and ends with an exhortation to designers and builders of the new Internet to “tear down these walls” around the data centres which imprison our personal information.

This Uncommon Knowledge interview provides, in 48 minutes, a calmer and more in-depth exploration of why the Google world system must fail and what may replace it.

The Web, Decentralized?

I have developed a significant resentment of the monopolistic, leftist web companies and have asked why there aren’t alternatives. Well, now I have a question for those in the know on Ratburger: is there anything to Tim Berners-Lee’s assertion about his new startup Inrupt:

its mission is to turbocharge a broader movement afoot, among developers around the world, to decentralize the web and take back power from the forces that have profited from centralizing it. In other words, it’s game on for Facebook, Google, Amazon. For years now, Berners-Lee and other internet activists have been dreaming of a digital utopia where individuals control their own data and the internet remains free and open. But for Berners-Lee, the time for dreaming is over.

It is based on something called ‘Solid,’ a “decentralized web platform he and others at MIT have spent years building.”

Should I be excited? Shall I start limbering up my ‘personal digital communicator’ (aka my middle finger) to signal “you’re fired” to Apple, Amazon, Google (I don’t use any of the others)? I hope those with understanding of the big picture will jump in.


LifeSiteNewsDotCom

LifeSiteNews is a small anti-abortion activist group, pro-life journalism outlet, and news aggregator. It was launched in 1997 as a spinoff of Campaign Life Coalition. Both are based in Toronto. Unless you are a traditionalist Catholic or a pro-life culture warrior, you have probably never heard of them.

They have had a lot of excitement lately.   For a year they have been fighting for their life as an organization. They had become very dependent on their Facebook page as their primary way to communicate with their network of donors, most of whom are Catholic families making small-time contributions. Facebook has been waging war against them.

Facebook ghetto

In addition to filtering them out of searches and giving them the “shadow ban” treatment, Facebook has refused to run their ads:

One response that our team received as the reason for Facebook’s disapproval of our ads is equally concerning. The ad pertaining to this response simply showed an image of a pregnant mother holding a photo of her baby’s ultrasound…

I do see that the ad has a fetus and while it involves your ad text and topic, it may be viewed too strong for Facebook to allow to show.

Such viewpoint discrimination is a direct attack on our shared life and family values, and is greatly affecting our efforts to fundraise and spread our news.

Yes, a pregnant woman showing off the ultrasound picture of her baby is “too strong” for Facebook. That is a transparent excuse that says Facebook does not like advocacy for babies. Facebook is enforcing the Culture of Death.

They do this by decreeing that accurately reporting on the abortion industry and Planned Parenthood is “fake news.” Truth is irrelevant; what matters is the narrative.

Facebook recently admitted to combating “fake news” by developing a system that ranks users’ trustworthiness on a scale from 1 to 10. This is determined by users’ opinions rather than objective investigations!

This means that aggressively pro-choice and anti-family Facebook users can rank LifeSiteNews as “untrustworthy” with the simple click of a button – just because they dislike the facts that we publish.

Facebook has therefore made it ridiculously easy for our highly organized, well-financed (George Soros, etc) and hateful opponents to have LifeSiteNews wrongly categorized as “fake news” and our traffic suppressed according to Facebook’s “terms of agreement.” Truth does not matter according to this mob-mentality-serving process.  

Sex scandals

If you are wondering where it was that you recently saw their name, it was because they landed the biggest Catholic scoop of August. In the middle of the Catholic summer of distress over new sex scandals, Archbishop Viganò released a letter that said that Pope Francis and the rest of the Vatican were aware of Cardinal McCarrick’s habit of pressing young seminarians for sex, and also that he had covered for homosexual priests who preyed on teenage boys. Pope Francis had rehabilitated McCarrick in spite of this knowledge.

Archbishop Viganò gave his letter to two conservative Italian journalists whom he trusted. He also sent it to LifeSiteNews. Evidently that was the only English-language outlet he trusted.

Since then, other traditionalist Catholics have gone directly to LifeSiteNews with background and new developments on these scandals.

Search and you will not find

Facebook is not the only internet service that is hostile to pro-life advocates. Several news aggregators have the habit of demoting LifeSiteNews as well as other conservative outlets. So for the past few weeks we have seen searches that turn up dozens of articles and editorials citing LifeSiteNews, but unless you type “lifesitenews” in your search, you will not see their original reporting in the first four pages of results.

Allies

I am not a Catholic; I am a Lutheran, and the Church of Rome teaches that I am condemned to hell as a schismatic. Nevertheless I have several Catholic friends, and I find that traditionalist Catholics are my most trustworthy allies in the culture wars. I need strong Catholics to help rescue western civilization from the assaults of Satan.

Please consider giving a little support to LifeSiteNews, either with a few bucks, or by sharing their plight with your Catholic and pro-life friends.


Scam caller saga continues…

Well, the saga continues. I posted the message quoted below on several “who called me” sites, because another scammer got through NoMoRobo. Why did they get through? Well, this scammer, like many others, fakes the caller ID to show a somewhat local number, but the number they said to call was in Florida. So NoMoRobo did not recognize the (local) number as a scammer or robo-call (“telemonster”).

The message on my recorder said: “This message is intended for Jolene. I’m calling in regards to a pending matter that is being in the process of being reviewed today. I’m also calling to verify that we do have the correct address on file for this individual. To avoid any further proceedings at this time you have the right to contact the information Center. Should you wish to contact them the contact number listed as 561-223-6950 and you will have to reference your file number 16112.”

The message I posted on several web sites was: “Called and left this number to call back; asked for someone by a first name that I never heard of except in the Dolly Parton song ‘Jolene’, LOL. Took the number they called from and forwarded it to the number they gave; let them get a taste of their own medicine. If you have XFINITY, you can do this free of charge: forward scammer calls back to themselves. Hope it ties up their call center!”

I hope XFINITY customers who are plagued by these calls do the same.


How times have changed…

Below you will see two examples of portable storage media. So what’s next? Or what’s next that I can afford? Or will I need it?

(The 64 Gig thumb drive was in my pocket; it accidentally took a swim in the washing machine and survived a wash and two rinse cycles, not to mention the three spin cycles!)


Homepage Recommendations?

To some extent, I am a creature of habit. This tendency has not improved with age and – being that I am 74 – my formative years did not include computers. Thus, Gmail has been my Safari homepage for quite some time. Aside from my increasing desire to jettison all things Google, I am particularly desirous of finding a homepage which loads more quickly. Every time I open a new window or tab, I must wait for it to load (or stop it from loading with an additional click of the ‘x’ in the URL bar) before entering my next search. This has become annoying. I may be old, but I am still type-A (my wife refers to me as “his royal A-ness”).

I would much appreciate Ratburgher suggestions as to good homepages.


Family D.I.Y. Backup Solutions?

In my extended family, there are around a dozen computers which should be backed up, not counting an equal number of phones. Myself, I have an old Time Capsule and I do a weekly external drive bootable backup. Most of our computers are for personal matters, not work.

My son is an outlier. He is near completion of a Ph.D. in genetics and does some high-powered statistics whose processing often runs for hours. He has many large files of data on a one-year-old MacBook Pro. Loss of this would be catastrophic. He has an external drive for backup, but keeps it in his not-too-secure apartment in a ratty (in the negative sense) building. His chained bike was stolen recently from an inside hallway, and that event led to this entire inquiry.

It occurs to me – I like the idea of having my backup local – that the combined annual cost of online backup subscriptions for a dozen computers would quickly far exceed the cost of an online server set up as a personal cloud. For the cognoscenti among us: is this a worthwhile line of thought? Have you better suggestions? Anyone for hire to set it up for me? (Only half kidding.)


Game Review: Bioshock Infinite

I’ve been writing about virtual reality, simulated worlds, and the new forms of entertainment and education they will engender for thirty years.  Other than Kerbal Space Program, which is more of a simulated world sandbox than a game (there is no specific goal and no conflict other than with the laws of nature), I have not played video games since the age of Pac-Man.  As we approach the threshold of the Roaring Twenties, I thought it would be a good idea to check in and see the current state of the art and whether it justified the things people were saying about contemporary games being a new art form and interactive medium of fiction.

The game I chose to play was Bioshock Infinite, the latest in the Bioshock series (but an entirely different story line from the previous games).  This game, originally released for consoles in 2013 and ported to Linux in 2015, was rated very high by reviewers, with most rankings 9/10 or 10/10.  The budget for the game was not disclosed by its privately-held developers, but was said to be around US$ 100 million for development and a comparable sum for promotion, larger than many major motion pictures.  It has sold more than 11 million copies since its release.  I learned of the game from a detailed description by James Lileks on his blog.

This review will be presented in a very eccentric manner.  I will not describe the plot from an omniscient standpoint (for that, see the links above), but rather things as I encounter them.  These are my notes, written in the style of a software development log or system narrative.  I will post items in comments, one or more per day, time permitting, running behind my play-through of the game (providing a buffer for days I have too many other things to do).  In each entry, I will provide links to an on-line play-through which will give you more details and screen grabs.  There’s no point in making my own, since the job has already been done superbly.  There will, of course, as in any play-through, be major spoilers in the comments.  I will make no effort to avoid or mark them.  If you want to experience the game without any foreknowledge, don’t read the comments that follow.

I am playing the game on an Xubuntu Linux system under the Steam gaming environment.  I am using 1920×1080 screen resolution in “High” resolution rendering mode with “Normal” difficulty.  As I am interested more in exploring this virtual world, how it is rendered, and how a visitor interacts with it than testing my prowess against the game engine, I am exploring it with the aid of play-throughs prepared by people who have made it all the way to the finish.  These are guides, however, not cheats—there is substantial randomness in the game and you’re on your own when the shooting starts—it’s generally up to you to figure out how to defeat the assaults you’ll face as you progress through the game.

This is a “first-person shooter” game: you are the protagonist and have to reach your objectives by defeating foes—human, mechanical, and supernatural—with weapons, wits, and capabilities you acquire as you pursue your quest.  If you find this repellent, so be it—that’s the model for many games, and it’s the one adopted here.  Personally, I find most of the combat episodes tedious, although it’s fun learning tricks to defeat adversaries and deploying newly-acquired weapons and “vigors” (supernatural powers) against them.  What is the most fun is exploring the huge, magnificently-rendered world here.  The production values are equal to contemporary CGI movies, but you’re not stuck in your seat munching popcorn but able to explore it at will, looking at things from various perspectives and interacting with this world you’re discovering.  The game is, as far as I’ve played it when I’m writing this, beautiful, with superbly-rendered three-dimensional models; an airy, misty ambience; and a musical track, both vintage and original, which complements the story line.  There is explicit violence (although nothing which would go beyond a “PG” rating in the movies), but no obscenity or nudity (at least as far as I’ve played).

I should note that the credits screen includes the following acknowledgements:

This software product includes Autodesk® Beast™ software, © 2013 Autodesk, Inc., Autodesk® HumanIK® software, © 2013 Autodesk, Inc., Autodesk® Kynapse® software, © 2013 Autodesk, Inc., and Autodesk® Scaleform® software, © 2013 Autodesk, Inc.

I had no idea Autodesk was such a player in the game space, as well as in CGI movies.  You never know what the kids will do after they grow up and move out….  (Guys, you only need to use the “registered” or other marks on the first reference to the word.)

Remember that you can follow this post, receiving notifications for new comments without having your comment appear, by posting a comment consisting of just the word “follow”.

And here we go….


Persistent Bugger (revisited) or REVENGE

This is related to the persistent bugger post I made a little while ago.

OK, I did some looking around on YouTube regarding “telemonsters” or robo-callers. There was everything from actual conversations that people had with them, especially the hard-to-understand Indian scammers, to people with the technical know-how to initiate their own robo-callers to flood the telemonsters’ call centers, preventing them from making outgoing calls.

Continue reading “Persistent Bugger (revisited) or REVENGE”


Book Review “The Autodesk File”

A comment John made (#18) on a recent post by 10 cents (“Programming Question”) reminded me that I had reviewed one of John’s books. The review was posted a while back on the legacy site. As this is one of the most worthwhile books I have ever read, I thought it should be posted here.

A work of non-fiction is understood in a context. A great work actually articulates the context before anybody else gets it. A review of such a book may go seemingly far afield, if the book’s power can be construed to provoke and, indeed, license the inspired musings of its readers. Such is the case here, as “The Autodesk File”’s roots are deep in the intellectual, technological, economic, financial, and even spiritual soil of this, the spring garden of the information age.

When was the last time you couldn’t put down a book which had not a single murder, courtship, love or sex scene? OK, I’m not counting some ancillary trysts consisting of mergers and takeovers, which some might construe as sexy, or at least allude to being on the receiving end of a certain Anglo-Saxon gerund. This book contains no obscenities, save a rare mention of taurine spoor. That serves as a welcome reminder: important ideas and even emotions are amenable to description sans vulgarity.

Lest one think this a narrow commercial exposition, “The Autodesk File” is in the public domain in multiple formats. Neither is it a mere exposition of commerce. About half way through, amidst essays explaining the nature of businesses dealing in intellectual property (rather than capital-intensive equipment), the reader is treated to a short science fiction story whose theme is no less than a plausible tale of the origin of human life. Our bodily construction is, after all, prescribed in lines of code, albeit compressed into helixes wound around themselves then wrapped around histones. Like some of their software counterparts, they, too, must be unzipped before use.

Also punctuating this eclectic opus are quotes from Aristophanes. It is a tour de force, a truly awe-inspiring account of much more than the building and workings of one trailblazing company. It encapsulates the noblest of human aspirations, idealizations, creativity, ingenuity and critical self-examination; inescapable is the conclusion that voluntary cooperation and exchange of ideas, knowledge and capital is a great boon to the world at large. If a business is built to serve the needs of customers by creating products of the highest possible quality, greed is not a good; it is irrelevant. Also inescapable is the perhaps ironic conclusion that ongoing success requires continual vigilance, lest arrogance take hold. The fruition of critical self-examination can be seen in renewal of that same humility which was so essential in powering that first whiff of success.

Nonetheless, apart from arcane sections dealing with technical matters of computer hardware and programming (these, too, may be great for the cognoscenti; this writer simply knows too little), this book is a spellbinder. Readers may be surprised to be persuasively regaled with the fundamentals of various disciplines, including economics, finance, taxation, corporate law, engineering, computer science, thermodynamics, rocket science, quantum mechanics, cosmology and the nature of reality. That is, readers who don’t know John Walker. For those who do, none of this is surprising.

Have you ever had a million dollar idea? I have – lots of ‘em. Have I turned even one of those ideas into a product? Nope. Why not? Because I lacked the understanding, the talent, and the single-minded discipline to even get one idea off the ground. This book, edited by Ratburger’s own John Walker (himself author of most of the collected writings), is a chronicle of birth, growth, crises and maturation of Autodesk Inc., whose products helped unleash the creativity and productivity of millions of people. It did so beginning with a key insight: that the infant personal computer was a general tool and not a specific workstation. As a general tool, through the intelligent design of software, it would rapidly evolve in utility in virtually every field of endeavor, beginning with design. Design, in this line of thinking, is a logical first step down the path which aims, eventually, to capture all of reality in the box we call a computer. This stunning insight occurred while all the rest of us still went through our days typing on an IBM Selectric, without once even using the word “computer.” Way back then in 1980, virtually none of us thought about computers or any of the other words and things without which our lives today would be unimaginable. Historically speaking, 1980 happened yesterday.

An additional insight guided Autodesk’s ethos: that personal computers would grow exponentially in processing power and become useful by ordinary people (with no computer or programming skills) to undertake virtually any task. Autodesk’s first product,  AutoCAD, moved design from a small number of dedicated, expensive CAD workstations operated by highly-trained people, to desks virtually everywhere where drawing might be needed. In the process of “squeezing too much code into too small a box,” Autodesk did not compete with previous generations of single-purpose CAD workstations which cost 10 – 50X as much. Instead, it created and increased a market for CAD by the same orders of magnitude, by bringing this tool to the 98% of designers and draftsmen who could not afford dedicated CAD workstations.

In less than one year, this new company had a hit product. Time to rest on one’s laurels? How about after the IPO? Time to coast? Not quite. Going into the CAD business – and that is the business, as opposed to the software business (read the book to learn why) – is something like launching a rocket from Earth and hoping to land on a comet and send back data, except that the precise trajectory of the comet cannot be known, and its surface material and contours are completely unknown. The difficulties were perhaps not unlike those encountered by the ESA’s $1.8 billion Rosetta/Philae spacecraft, which did rendezvous with and land on comet 67P. Philae’s tether harpoons failed to fire, so the probe bounced and wound up in a permanently-shaded spot, preventing use of solar power (due to an unanticipated hard surface, the harpoons likely would not have worked anyway). Batteries enabled an estimated 80% overall mission success. AutoCAD’s launch, by contrast – with $59,000 in capital, mid-course hardware and software corrections, and a “landing” on users – remains successful to this day.

“The Autodesk File” attributes success to the company’s understanding that it represented what it coined “The New Technological Corporation.” This is an enterprise which does not conform to traditional capital-intensive business, as it can deploy intellectual, debt-free leverage. Such businesses embrace an unpredictable but essential element: “wild talent.” This talent is a necessary but not sufficient condition for success when it comes to creating software, which is unlike almost all prior businesses. Rather than capital, such entities require a peculiar kind of talent – one which grasps the present desires of a market, knows what is possible with present hardware, and correctly plots the trajectories of both the market and evolving hardware. I believe it to be objectively true that the editor is faithfully and humbly describing the truly awe-inspiring talent he, himself, brought to Autodesk. Other such individuals, like Jobs or Gates, are known in the early computer and software businesses. Few, however, have operated as willing members of an extended team with humility, dedication to excellence and human decency. If nothing else, “The Autodesk File” shows how this can be accomplished.

Attempts to find individuals with “wild talent” are most difficult, maybe impossible. “Wild talent” illustrates the essential difference between aggregate information, traditionally used by analysts to “value” companies which trade on public exchanges, and actual events which take place within any company. For instance, money spent on R&D is aggregate data which subsumes the activities of many employees of a given company. Whether it means the company will grow really depends on what individual employees accomplish. When it comes to software, the outcome will be notably different for R&D teams which play it safe versus ones which continually push the envelope of what may be remotely possible. Intellectual leverage is such that the cost of failure of 8 out of 10 ideas is far outweighed by success in only 1 or 2 of them. The presence of such loyal individuals is also a bulwark against hostile takeovers. You can lead a programmer to the R&D department, but you can’t make him plink – at least not in the way which is essential to success.

Perhaps most revealing about this unusual book is the ongoing critical self-examination engaged in by the primary author. These analyses were distilled into the form of internal company communications as essays and information letters.  At many points in the journey, the author is able to adumbrate the – sometimes previously un-articulable – principles which guided his often momentous insights. These usually arose in chaotic circumstances with incomplete information. The essential humility of this approach is demonstrated at various points in the book. Repeatedly, the author makes clear the importance of open communication and understanding of the roles of all the other parts of the company. A programmer, for example, must understand management’s plan, what customers want, how a product will be marketed and shipped, what competitors are doing, etc. Only then can a “wild talent” be effective.

 “The Autodesk File” is a much-needed reminder that human beings are still capable of doing awe-inspiring, creative and even noble things; that they can voluntarily collaborate and, working in their own self-interest, set off endless waves of non-zero sum games in their wakes. This is also a success story, then, a chain of decisions, clearly rooted in the philosophy of Classical Liberalism – in some of its untidy and altogether messy human details. Without aiming to, this story affirms the primacy and value of the individual, both as producer and consumer; it convincingly shows that communication – positive and negative feedback – between individual, voluntary buyers and sellers – is the essence of what a market is. This is in contrast to statist dirigisme, where aggregate data and arrogance rule, in derogation of the value of the individual. 

Diametrically opposed to today’s received collectivist wisdom, “The Autodesk File” shows how individuals create markets where none previously existed, to the betterment of all. From those roots emerge timeless operating principles:

1. Build the best products, period – with open architecture, so as to invite developers to customize and find as-yet-undreamed uses (an essential form of feedback for software companies), thereby further expanding markets.
2. Invite, quickly assess, and respond to this feedback from customers in the form of improved new releases.
3. Employ owners, not merely ‘investors’ – pay well for results, with ownership whenever possible.

This is a story which demonstrates the huge difference between owners, whose time preference is long, and investors focused only on the forecast for the next fiscal quarter. The tyranny of industry analysts, a form of economic lunacy where short time preference is brutally and pervasively enforced on behalf of “investors,” operates so as to threaten the short-term existence of sound public companies which actually attempt to pursue the best long-term business practices.

In a somewhat philosophic interview around the tenth anniversary of Autodesk, the author/editor describes the operation of a new “design cult” of engineering as a “form of creationism, which thinks its members are so omniscient that they have no need for market-driven evolution to perfect their efforts.” This view, coupled with the information letters, again displays an essential humility in the ethos of Autodesk. Management must lead toward explicit goals. Every part of the organization must understand and communicate with all others, particularly as it affects product development. This is not the typical hierarchical corporate ethos. Neither is it anarchy. Management must lead, but not without listening, understanding and explaining. 

It is difficult for this writer to refrain from drawing parallels to the author’s description of this “design cult” of engineering. Such an attitude is not surprising, given that we live in a society which increasingly and officially denies the existence of a supreme being, while at the same time acting – through a “cult” of increasingly centralized authoritarian government – as though it were omniscient and omnipotent; as though its policies have no unintended consequences; as though no cost is too high to accomplish its goals, whose only feedback is its own reverberating positive-feedback echo chamber. It is hard to know which cult is imitating which. In either case, the state-erected obstacles to starting and running a business, while not emphasized, are on display in this epic. This common ethos of the state and large corporations has inevitably given us today’s pernicious corporatism.

It may be that the most significant intellectual error of our time is the belief that society can be modeled and manipulated as well as physical reality now can be, thanks in large part to private companies like Autodesk. Unlike government, though, companies are forced to relearn their limits – i.e., lessons in humility are given, at least annually, and enforced as necessary by balance sheets and owners. The fear of going out of business would be a highly salutary fear for modern government to experience. Instead of a healthy humility, however, the state often displays antipathy toward private enterprise – ironically, the very source of its own financial power. The public relations nature of this attitude likely represents envy of private successes, virtue signaling in an effort to garner votes in the incessant lust for yet more power, or both.

God is traditionally described as a jealous God. Do you suppose that our deity/government has its own version of the Ten Commandments, the first of which explains its animus toward private enterprise? “Thou shalt have no other Gods before Me…” …otherwise put, “Trust me. I’m from the government.” “I’m here to protect you from those big, bad, corporations.”

Thus, as you can see, for this reader the story of Autodesk led to much contemplation of human nature and the whole spectrum of our interactions – both voluntary and coercive. It is an inspiring and epic tale of the utility and nobility of voluntary cooperation.

“The Autodesk File” is in the public domain. It is available in several downloadable versions. All formats are accessible here: http://www.fourmilab.ch/autofile/


HAL’s Legacy

On the occasion of the fiftieth anniversary of the release of 2001: A Space Odyssey, here is a SETI Institute talk by Dr David Stork on “HAL’s Legacy: 2001’s Computer as Dream and Reality”.  This was the title of a book he edited in 1998 comparing the technology envisioned in the film with the state of the art a few years before the year 2001.  In this lecture, he brings things up to date with progress toward achieving the capabilities of HAL in various domains in the ensuing twenty years.

We are now 549 days before the start of the Roaring Twenties.

