Google “responsible development”

I am not suggesting that you perform a Google search for “responsible development.” I just want to call attention to the demise of the “Responsible Development of AI Advisory Council” at Google. Just last week, Google announced the formation of the Advisory Council, intended to debate potential policy related to artificial intelligence, and began naming the people who would serve on it. One of those named was Kay Coles James.

The histrionics from Google staffers were immediate and intense.   Tantrums were thrown and a clamor of angry rhetoric consumed much energy and attention for a couple of days.   Google promptly caved, and today they announced the dissolution of the Advisory Council.   Evidently placating the SJW staffers at Google was much more important than any effort to get ahead of the plethora of ethical pitfalls that beset the development of artificial intelligence.

What prompted the outrage? Well, Ms. Kay Coles James is an African-American grandmother who has held a variety of jobs in government and education. She currently serves on the NASA Advisory Council. And, by the way, she is also the current President of the Heritage Foundation.

A petition with more than 2,000 signatories from within the company was published on Medium on Monday, with the title “Googlers Against Transphobia and Hate.”

…Meredith Whittaker, who leads Google’s Open Research Group, posted on a private Google listserv that, “I would disagree that their views are important to consider, when those views include erasing trans people, targeting immigrants, and denying climate change.”

Other ringleader employees at Google vociferously trashed the Heritage Foundation on a variety of charges: “anti-LGBTQ,” “anti-immigrant,” climate denial, and so on. When some employees said that the hoo-rah sounded intolerant, they were attacked with messages saying that there is no need to listen to such haters.

Both Daily Caller and Breitbart have the story.   Links are in the comments.

Tele-Monsters revisited.

Tele-Monsters, or those damned scam/spam/nuisance/marketing calls.

I’m sure a few Ratburgers remember my posts about them and my successes and failures in turning the tide against them.
Yep, “NoMoRoBo” is still the premier answer, but it has a drawback: every time a tele-monster calls and NoMoRoBo intercepts the call, we still get one ring. Sort of like “The Hunt for Red October”, but a ring instead of a ping.

Continue reading “Tele-Monsters revisited.”

“When did two plus two equal six?” (at 15:00)

I am watching this video of two former employees of Theranos, the company that was going to automate hundreds of blood tests from a single drop of blood. It was a great company, except for the fact that they lied, put people’s health at risk, and lost about $900 million of investors’ money.

The video is long, but the first part is enough to see some of what these then-recent college graduates went through. Tyler Shultz is the grandson of George Shultz, the Secretary of State who didn’t lose in 2016.

It impresses me when people have the moral courage to speak up. They get a lot of flak but in the end are vindicated.

A Perplexing Trend: Visually Complicated Menus

I have been noticing that menus of all kinds–from websites to restaurants–have become more complicated and thus more and more difficult to navigate. The trend toward clean and simple seems to be reversing.  Now, I would say most of the time when I go to a website, I am overwhelmed with visual tiles on the landing page,  plus information revealed only to the enthusiastic scroller, menus layered under other menus, and pages that do not deliver as promised. It can take several minutes of clicking around to figure out what to do next.

This now widespread tendency to present the customer with confusing arrays of choices, and to make it difficult to complete such simple actions as viewing a product sample, makes me wonder whether sprawling menus are not some kind of marketing strategy that increases sales. Non-profits are guilty of it–note the inscrutable internal workings of the College Board site–but most private companies are doing it, too. Just the other day my index finger got a big workout with the mouse merely trying to locate a demo for a tech product that the company presumably wanted to sell to interested schools. Also, our school’s online portfolio and PD credits system is not really something one could teach to a colleague. You simply make selections and click the mouse, because neither logic nor intuition helps with the opaque setup. You just keep boldly advancing, and somehow the work gets done.

Also, SurveyMonkey used to be so clean and navigable when we first signed up for it. Now that the subscription price has increased, the service is becoming turgid. There are multiple landing pages that don’t seem to offer me anything new or helpful. The function buttons don’t seem to parallel one another from one page to the next. The site designers are starting to fold up one of the toolbars on the left, leaving you to decipher their runes to locate the item you need. I can still use the program, despite my annoyance with the designers’ compulsion to fiddle with it, because I’ve been a customer for so long. But one day I received a highly agitated phone call from my sister, who had signed up to create a simple survey, couldn’t figure out how to change the question type, and as a result kept re-entering the question. At that point, she would have been happier collecting her data via snail mail.

Even fast-food restaurants have become infected with the need to set up confusing menus.  Lately at Subway, I can’t find the deal of the month on the 12-inch sandwich anymore.  Availability and pricing are divided up into meaningless categories like “signature” and “classics.” If you study the panoply of boards enough, you might light upon the information you are seeking.  Frugal tendencies may provide you with motivation enough to stand there scanning, for little reward.

I went to our town’s new Panera Bread, with lovely loaves arranged in a display window, and studied the menu in vain for mention of “bread.” I had to ask to find out that indeed, the soup was served with a bit of French baguette on the side. When I asked about the sandwich choices at Starbucks, which were fuzzy in ways I couldn’t put my finger on, the cashier explained that they had a tomato-mozzarella item that might not have been listed. Now that’s a head-scratcher. Why wouldn’t it be on the menu? To head off some local tomato shortage? Because the basil wasn’t fair trade?

Even at Wendy’s, I have a hard time locating the bargain items on the dense display. (Hint: they are grouped under the “4 for $4” section.) A local Asian restaurant seems to switch cooks and offerings, enough so that instead of changing the main menu board, new menus are simply hung in available space at the front. I can think of five sources of ordering information in that establishment, including a page handwritten on yellow notebook paper. The menu slog is worth it, though, since the Thai food there is delicious.

Has anyone else noticed the trend toward complicated menus?  Any thoughts as to why businesses are favoring this approach?

This Week’s Book Review – Stanley Marcus

I write a weekly book review for the Daily News of Galveston County. (It is not the biggest daily newspaper in Texas, but it is the oldest.) My review normally appears Wednesdays. When it appears, I post the review here on the following Sunday.

Book Review

‘Stanley Marcus’ highly entertaining and informative

By MARK LARDAS

Feb 5, 2019

“Stanley Marcus: The Relentless Reign of a Merchant Prince,” by Thomas E. Alexander, State House Press, 2018, 280 pages, $19.95

Neiman Marcus is Texas’ signature department store. It was the first place where Texas and high fashion converged. It remained the Texas arbiter of fashion throughout the 20th century.

“Stanley Marcus: The Relentless Reign of a Merchant Prince,” by Thomas E. Alexander, is a biography of the man who turned Neiman Marcus into the aristocrat of department stores.

Stanley Marcus did not found Neiman Marcus. His father and uncle did. They, along with Stanley’s aunt, made Neiman Marcus into Dallas’s leading store. Herbert Marcus’ salesmanship and insistence on customer satisfaction, Carrie Neiman’s (nee Marcus) fashion sense and Al Neiman’s shrewd management of expenses proved a perfect fit for a Dallas growing wealthy through then-new oil money. The new-money rich could go to Neiman Marcus and get dressed right without feeling condescended to.

Stanley Marcus became the prince inheriting this kingdom because he was Herbert’s oldest son (Al and Carrie had none). That was how family businesses ran back then. But, as in a fairy tale, he had a magic touch when it came to retailing luxury goods.

Alexander’s biography shows how Stanley Marcus transformed Neiman Marcus from Dallas’ leading department store into an American fashion icon. Alexander shows how in the 1930s Marcus managed to make Dallas a fashion center through a combination of fashion sense, marketing and exclusivity. Neiman Marcus was the first fashion store outside of New York City to advertise nationally, creating a national identity.

The book is told from an insider’s perspective. Alexander became Neiman Marcus’ sales promotion director in 1970. He worked directly with Stanley Marcus for decades, becoming close friends with Marcus. Alexander’s accounts of the store’s fashion “fortnights” (two- and later three-week marketing extravaganzas focusing on fashions of a country) are often personal recollections. He recounts the successes, failures and challenges met. A similar approach frames his accounts of the company’s expansion to other cities.

“Stanley Marcus: The Relentless Reign of a Merchant Prince” is a book praising a respected friend who has passed. It’s also a highly entertaining and informative look at a great store and the man most responsible for its greatness.

Mark Lardas, an engineer, freelance writer, amateur historian, and model-maker, lives in League City. His website is marklardas.com.

New Normal

Remember when President O said that his very weak economic recovery was “the new normal”? He said that the sort of economic recovery we had historically experienced after a recession was no longer possible.

He was full of stuff. He was reading Leftist economics and drinking his team’s Kool-aid.

Hillary campaigned on a continuation of Obama policy.

D.J. Trump campaigned on an “America first” platform, unconcerned about the unfortunate history of that phrase. His key slogan was “Make America Great Again,” which he pledged to do by rolling back regulations, ending anti-business policies, promoting jobs and deals, and cutting taxes. He ran a pro-America, pro-business campaign, a breath of fresh air after eight years of our first Anti-American President.

Business confidence began to rise on the morning after the election of 2016, and is still rising. Consumer confidence is catching up, despite an amazing level of fearmongering by Leftist mass media.

And the numbers are continuing to support President Trump. I saw the jobs report for January, and happened to click on an article at CNN Business, titled “Hiring Boomed in January.” Here is a line that caught my attention:

The continued hot pace of job growth is evidence that people who may have been sidelined by the Great Recession more than a decade ago are still coming off the sidelines. Labor force participation grew slightly, to its highest level since 2013.

We are still digging out of the hole we fell into in 2008. Team Obama had simply been digging it deeper. We are only now recovering.

Go Trump go.

MAGA

“Tele-Monster” update. (the saga continues….)

It almost makes me want to terminate my land line.

There is one persistent “tele-monster” that has called 15 times in the past two weeks. Last night I had enough. Granted, NoMoRobo does detect these scam calls and terminates them as soon as it can get the caller ID. Unfortunately, the caller ID is transmitted after the first ring, so I must endure many single rings from my land line phones. Tired of this, I set up my phone system through my provider so that when another call came in, and I knew it would, it would forward that call, and only that call, back to its point of origin.

(Click Continue reading below to see the screen capture.) Continue reading ““Tele-Monster” update. (the saga continues….)”

This Week’s Book Review – Smoke ‘Em if You Got ‘Em

I write a weekly book review for the Daily News of Galveston County. (It is not the biggest daily newspaper in Texas, but it is the oldest.) My review normally appears Wednesdays. When it appears, I post the review here on the following Sunday.

Book Review

‘Smoke ’em’ shows military’s role in masculine rite

By MARK LARDAS

Nov 27, 2018

“Smoke ‘em if You Got ‘em: The Rise and Fall of the Military Cigarette Ration,” by Joel R. Bius, Naval Institute Press, 2018, 328 pages, $39.95

Anyone serving in the U.S. military before 1980 remembers the cry opening every break: “Smoke ‘em if you got ‘em.” Almost everyone, from the lowest private to the most senior officer present, would light up a cigarette.

“Smoke ‘em if You Got ‘em: The Rise and Fall of the Military Cigarette Ration,” by Joel R. Bius, examines the link between the military and cigarette smoking. He shows how cigarette consumption and the military were connected.

In 1900 cigarettes were a surprisingly small fraction of tobacco consumption. Around 7 percent of all tobacco products were retailed in the form of cigarettes. Cigarette smoking was viewed as unmanly and un-American.

World War I changed that. Nicotine proved the American Expeditionary Force’s battlefield drug of choice. Tobacco calmed the nerves while increasing alertness. Smoking masked the battlefield’s stench. Although tobacco was known to be bad, its adverse effects were long-term. Meantime, there was a war to win. Organizations like the YMCA freely distributed cigarettes, the most convenient form of smoking tobacco, to our boys in the trenches.

The link stuck when the boys returned home. Cigarettes gained cachet as a man’s vice, linked with battlefield bravery. Bius follows the arc of cigarette consumption through the century’s middle years. Battlefield use of cigarettes in World War II sealed their image as a masculine activity. By then, the Army issued a cigarette ration and subsidized smokes at the PX. Use peaked in the years after World War II, when 80 percent of men smoked cigarettes.

Despite the 1964 Surgeon General’s warning and government efforts to cut tobacco use thereafter, cigarettes remained popular, even after the military eliminated the cigarette ration in 1972. It took the All-Volunteer Army to break the link between smoking and the military. Containing health care costs led the military to discourage tobacco use. That, in turn, broke smoking’s image as a masculine activity. Cigarette use plunged; today it is almost back to 1900 levels.

“Smoke ‘em if You Got ‘em” is a fascinating story about the rise and fall of a masculine rite of passage.

 Mark Lardas, an engineer, freelance writer, amateur historian, and model-maker, lives in League City. His website is marklardas.com.

A must see: The Ingraham Angle

IMHO, it ain’t only the Clintons… Desperate Obama wants credit for the Trump economy. (No surprise there; I never trusted that man. I took an oath to respect the office of the president, and I did and will respect the office, but I could never respect the man who held that office for those turbulent eight years.)

Continue reading “A must see: The Ingraham Angle”

Ion-powered aircraft flies with no moving parts

Ladies and gentlemen, we truly live in a wonderful age, an age of inventions never imagined by anyone before us (us being those of this time).

Continue reading “Ion-powered aircraft flies with no moving parts”

Tracking books

I didn’t want to step on John Walker’s post about citing books in posts, but the image of a barcode in conjunction with identifying books triggered some old memories.

One of the early projects (late ’60s, early ’70s) at my first “real” job was to help build a system that would identify paperback books by optical recognition of the cover.  My company had done several military pattern recognition projects, and we were approached by a New Jersey company to build two machines that would recognize paperback book covers.  The customer’s business model was based on the need to sort out book returns from retailers.  They did not trust the retailers to provide a legitimate count, so the existing system was to ship the books (by boat) to Puerto Rico, where cheap labor would do the counting.  The new company would strip the covers off the books and then run them through the machines we were building, which would provide a more timely accounting of the different titles returned.  The covers were stripped off both to make handling easier and because it was more expensive to get whole books back into circulation at a retailer who would sell them than to just print and ship new ones.  A side effect was that we could get all the paperbacks we wanted – without covers, of course.

In one of our first meetings to discuss the project, we engineers (obviously not marketing types) tried to un-sell the recognition project by suggesting they just put a barcode on the books.  We were told in no uncertain terms that “the American public will never put up with a barcode on a retail product.”  (I just checked, and the first barcodes were put on Wrigley gum in June of 1974, so they got a couple of years before the competition hit.)

One thing I learned from that project was that the problem as stated – “recognize paperback book covers” – was only part of the real business solution.  After we were done, my mentor left us and went up to New Jersey to work with the customer to tie the recognition system into an accounting and billing system.  There is always a bigger picture.

One other thing about the project was that the book cover scanner was to be built so that the covers were shredded as soon as they were recognized.  They didn’t trust the operators to not feed them through again to pad the count.

Standards II (revisited)

OK, the last post about standards drifted way off topic, or so it seemed to some. I tried to get a screen grab of an interview with the owner as seen on FOX News. Since I could not get a direct link to the clip, I grabbed it and reduced it in size to post. Unfortunately, the video clip is still too large, even after I reduced the resolution by 50%, so here is the audio from the clip. The video just included stock footage that many have seen before. The point is that he made the effort to exceed standards: deeper pilings, special windows, and accepting the fact that the first floor would be swept away.

Saturday Night Science: Life After Google

In his 1990 book Life after Television, George Gilder predicted that the personal computer, then mostly boxes that sat on desktops and worked in isolation from one another, would become more personal and mobile, and be used more to communicate than to compute. In the 1994 revised edition of the book, he wrote, “The most common personal computer of the next decade will be a digital cellular phone with an IP address … connecting to thousands of databases of all kinds.” In contemporary speeches he expanded on the idea, saying, “it will be as portable as your watch and as personal as your wallet; it will recognize speech and navigate streets; it will collect your mail, your news, and your paycheck.” In 2000, he published Telecosm, where he forecast that the building out of a fibre optic communication infrastructure and the development of successive generations of spread spectrum digital mobile communication technologies would effectively cause the cost of communication bandwidth (the quantity of data which can be transmitted in a given time) to asymptotically approach zero, just as the ability to pack more and more transistors on microprocessor and memory chips was doing for computing.

Clearly, when George Gilder forecasts the future of computing, communication, and the industries and social phenomena that spring from them, it’s wise to pay attention. He’s not infallible: in 1990 he predicted that “in the world of networked computers, no one would have to see an advertisement he didn’t want to see”. Oh, well. The very difference between that happy vision and the advertisement-cluttered world we inhabit today, rife with bots, malware, scams, and serial large-scale security breaches which compromise the personal data of millions of people and expose them to identity theft and other forms of fraud is the subject of this book: how we got here, and how technology is opening a path to move on to a better place.

The Internet was born with decentralisation as a central concept. Its U.S. government-funded precursor, ARPANET, was intended to research and demonstrate the technology of packet switching, in which dedicated communication lines from point to point (as in the telephone network) were replaced by switching packets, which can represent all kinds of data—text, voice, video, mail, cat pictures—from source to destination over shared high-speed data links. If the network had multiple paths from source to destination, failure of one data link would simply cause the network to reroute traffic onto a working path, and communication protocols would cause any packets lost in the failure to be automatically re-sent, preventing loss of data. The network might degrade and deliver data more slowly if links or switching hubs went down, but everything would still get through.
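To make the “routes around damage” idea concrete, here is a toy sketch of my own (not from the book): each node knows only its directly connected peers, and when a link fails the path-finding simply uses whatever other route the mesh still offers.

```python
from collections import deque

# A toy mesh: each node lists its directly connected peers.
links = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C", "E"},
    "E": {"D"},
}

def route(src, dst, failed=frozenset()):
    """Breadth-first search for a path, ignoring failed links."""
    queue = deque([[src]])
    seen = {src}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == dst:
            return path
        for peer in links[node]:
            link = frozenset((node, peer))
            if peer not in seen and link not in failed:
                seen.add(peer)
                queue.append(path + [peer])
    return None  # destination unreachable

print(route("A", "E"))                                  # e.g. ['A', 'B', 'D', 'E']
print(route("A", "E", failed={frozenset(("B", "D"))}))  # ['A', 'C', 'D', 'E']
```

With the B–D link marked failed, the traffic simply flows through C instead; only if every path is severed does delivery fail.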

This was very attractive to military planners in the Cold War, who worried about a nuclear attack decapitating their command and control network by striking one or a few locations through which their communications funnelled. A distributed network, of which ARPANET was the prototype, would be immune to this kind of top-down attack because there was no top: it was made up of peers, spread all over the landscape, all able to switch data among themselves through a mesh of interconnecting links.

As the ARPANET grew into the Internet and expanded from a small community of military, government, university, and large company users into a mass audience in the 1990s, this fundamental architecture was preserved, but in practice the network bifurcated into a two tier structure. The top tier consisted of the original ARPANET-like users, plus “Internet Service Providers” (ISPs), who had top-tier (“backbone”) connectivity, and then resold Internet access to their customers, who mostly initially connected via dial-up modems. Over time, these customers obtained higher bandwidth via cable television connections, satellite dishes, digital subscriber lines (DSL) over the wired telephone network, and, more recently, mobile devices such as cellular telephones and tablets.

The architecture of the Internet remained the same, but this evolution resulted in a weakening of its peer-to-peer structure. The approaching exhaustion of 32 bit Internet addresses (IPv4) and the slow deployment of its successor (IPv6) meant most small-scale Internet users did not have a permanent address where others could contact them. In an attempt to shield users from the flawed security model and implementation of the software they ran, their Internet connections were increasingly placed behind firewalls and subjected to Network Address Translation (NAT), which made it impossible to establish peer to peer connections without a third party intermediary (which, of course, subverts the design goal of decentralisation). While on the ARPANET and the original Internet every site was a peer of every other (subject only to the speed of their network connections and computer power available to handle network traffic), the network population now became increasingly divided into producers or publishers (who made information available), and consumers (who used the network to access the publishers’ sites but did not publish themselves).

While in the mid-1990s it was easy (or as easy as anything was in that era) to set up your own Web server and publish anything you wished, now most small-scale users were forced to employ hosting services operated by the publishers to make their content available. Services such as AOL, Myspace, Blogger, Facebook, and YouTube were widely used by individuals and companies to host their content, while those wishing their own apparently independent Web presence moved to hosting providers who supplied, for a fee, the servers, storage, and Internet access used by the site.

All of this led to a centralisation of data on the Web, which was accelerated by the emergence of the high speed fibre optic links and massive computing power upon which Gilder had based his 1990 and 2000 forecasts. Both of these came with great economies of scale: it cost a company like Google or Amazon much less per unit of computing power or network bandwidth to build a large, industrial-scale data centre located where electrical power and cooling were inexpensive and linked to the Internet backbone by multiple fibre optic channels, than it cost an individual Internet user or small company with their own server on premises and a modest speed link to an ISP. Thus it became practical for these Goliaths of the Internet to suck up everybody’s data and resell their computing power and access at attractive prices.

As an example of the magnitude of the economies of scale we’re talking about, when I migrated the hosting of my Fourmilab.ch site from my own on-site servers and Internet connection to an Amazon Web Services data centre, my monthly bill for hosting the site dropped by a factor of fifty—not fifty percent, one fiftieth the cost, and you can bet Amazon’s making money on the deal.

This tremendous centralisation is the antithesis of the concept of ARPANET. Instead of a worldwide grid of redundant data links and data distributed everywhere, we have a modest number of huge data centres linked by fibre optic cables carrying traffic for millions of individuals and enterprises. A couple of submarines full of Trident D5s would probably suffice to reset the world, computer network-wise, to 1970.

As this concentration was occurring, the same companies who were building the data centres were offering more and more services to users of the Internet: search engines; hosting of blogs, images, audio, and video; E-mail services; social networks of all kinds; storage and collaborative working tools; high-resolution maps and imagery of the world; archives of data and research material; and a host of others. How was all of this to be paid for? Those giant data centres, after all, represent a capital investment of tens of billions of dollars, and their electricity bills are comparable to those of an aluminium smelter. Due to the architecture of the Internet or, more precisely, missing pieces of the puzzle, a fateful choice was made in the early days of the build-out of these services which now pervade our lives, and we’re all paying the price for it. So far, it has allowed the few companies in this data oligopoly to join the ranks of the largest, most profitable, and most highly valued enterprises in human history, but they may be built on a flawed business model and foundation vulnerable to disruption by software and hardware technologies presently emerging.

The basic business model of what we might call the “consumer Internet” (as opposed to businesses who pay to host their Web presence, on-line stores, etc.) has, with few exceptions, evolved to be what the author calls the “Google model” (although it predates Google): give the product away and make money by afflicting its users with advertisements (which are increasingly targeted to them through information collected from the user’s behaviour on the network through intrusive tracking mechanisms). The fundamental flaws of this are apparent to anybody who uses the Internet: the constant clutter of advertisements, with pop-ups, pop-overs, auto-play video and audio, flashing banners, incessant requests to allow tracking “cookies” or irritating notifications, and the consequent arms race between ad blockers and means to circumvent them, with browser developers (at least those not employed by those paid by the advertisers, directly or indirectly) caught in the middle. There are even absurd Web sites which charge a subscription fee for “membership” and then bombard these paying customers with advertisements that insult their intelligence. But there is a fundamental problem with “free”—it destroys the most important channel of communication between the vendor of a product or service and the customer: the price the customer is willing to pay. Deprived of this information, the vendor is in the same position as a factory manager in a centrally planned economy who has no idea how many of each item to make because his orders are handed down by a planning bureau equally clueless about what is needed in the absence of a price signal. In the end, you have freight cars of typewriter ribbons lined up on sidings while customers wait in line for hours in the hope of buying a new pair of shoes. Further, when the user is not the customer (the one who pays), and especially when a “free” service verges on monopoly status like Google search, Gmail, Facebook, and Twitter, there is little incentive for providers to improve the user experience or be responsive to user requests and needs. Users are subjected to the endless torment of buggy “beta” releases, capricious change for the sake of change, and compromises in the user experience on behalf of the real customers—the advertisers. Once again, this mirrors the experience of centrally-planned economies where the market feedback from price is absent: to appreciate this, you need only compare consumer products from the 1970s and 1980s manufactured in the Soviet Union with those from Japan.

The fundamental flaw in Karl Marx’s economics was his belief that the industrial revolution of his time would produce such abundance of goods that the problem would shift from “production amid scarcity” to “redistribution of abundance”. In the author’s view, the neo-Marxists of Silicon Valley see the exponentially growing technologies of computing and communication providing such abundance that they can give away its fruits in return for collecting and monetising information collected about their users (note, not “customers”: customers are those who pay for the information so collected). Once you grasp this, it’s easier to understand the politics of the barons of Silicon Valley.

The centralisation of data and information flow in these vast data silos creates another threat to which a distributed system is immune: censorship or manipulation of information flow, whether by a coercive government or ideologically-motivated management of the companies who provide these “free” services. We may never know who first said “The Internet treats censorship as damage and routes around it” (the quote has been attributed to numerous people, including two personal friends, so I’m not going there), but it’s profound: the original decentralised structure of the ARPANET/Internet is as robust against censorship as it is in the face of nuclear war. If one or more nodes on the network start to censor information or refuse to forward it on communication links it controls, the network routing protocols simply assume that node is down and send data around it through other nodes and paths which do not censor it. On a network with a multitude of nodes and paths among them, owned by a large and diverse population of operators, it is extraordinarily difficult to shut down the flow of information from a given source or viewpoint; there will almost always be an alternative route that gets it there. (Cryptographic protocols and secure and verified identities can similarly avoid the alteration of information in transit or forging information and attributing it to a different originator; I’ll discuss that later.) As with physical damage, top-down censorship does not work because there’s no top.

But with the current centralised Internet, the owners and operators of these data silos have enormous power to put their thumbs on the scale, tilting opinion in their favour and blocking speech they oppose. Google can push down the page rank of information sources of which they disapprove, so few users will find them. YouTube can “demonetise” videos because they dislike their content, cutting off their creators’ revenue stream overnight with no means of appeal, or they can outright ban creators from the platform and remove their existing content. Twitter routinely “shadow-bans” those with whom they disagree, causing their tweets to disappear into the void, and outright banishes those more vocal. Internet payment processors and crowd funding sites enforce explicit ideological litmus tests on their users, and revoke long-standing commercial relationships over legal speech. One might restate the original observation about the Internet as “The centralised Internet treats censorship as an opportunity and says, ‘Isn’t it great!’ ” Today there’s a top, and those on top control the speech of everything that flows through their data silos.

This pernicious centralisation and “free” funding by advertisement (which is fundamentally plundering users’ most precious possessions: their time and attention) were in large part the consequence of the Internet’s lacking three fundamental architectural layers: security, trust, and transactions. Let’s explore them.

Security. Essential to any useful communication system, security simply means that communications between parties on the network cannot be intercepted by third parties, modified en route, or otherwise manipulated (for example, by changing the order in which messages are received). The communication protocols of the Internet, based on the OSI model, had no explicit security layer. It was expected to be implemented outside the model, across the layers of protocol. On today’s Internet, security has been bolted on, largely through the Transport Layer Security (TLS) protocols (which, due to history, have a number of other commonly used names, and are most often encountered in the “https:” URLs by which users access Web sites). But because TLS was bolted on rather than designed in from the bottom up, and because it “just grew”, it has been the locus of numerous security flaws which put software that employs it at risk. Further, TLS is a tool which must be used by application designers with extreme care in order to deliver security to their users. Even if TLS were completely flawless, it is very easy to misuse it in an application and compromise users’ security.
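To illustrate how much care a bolted-on layer like TLS demands of application programmers, here is a minimal Python sketch using the standard-library ssl module; the host name is just a placeholder, and the example is mine, not anything from the book. The default context verifies the server’s certificate chain and host name, and two innocent-looking lines are all it takes to throw that protection away.

```python
import socket
import ssl

hostname = "www.example.com"  # placeholder host, purely for illustration

# Correct: the default context verifies the certificate chain and hostname.
ctx = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as raw:
    with ctx.wrap_socket(raw, server_hostname=hostname) as tls:
        print(tls.version(), tls.getpeercert()["subject"])

# Easy to get wrong: these two lines silently disable authentication,
# leaving the connection encrypted but open to man-in-the-middle attacks.
bad = ssl.create_default_context()
bad.check_hostname = False
bad.verify_mode = ssl.CERT_NONE
```

Both configurations encrypt the traffic; only the first one knows who it is actually talking to.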

Trust. As indispensable as security is knowing to whom you’re talking. For example, when you connect to your bank’s Web site, how do you know you’re actually talking to their server and not some criminal whose computer has spoofed your computer’s domain name system server to intercept your communications and who, the moment you enter your password, will be off and running to empty your bank accounts and make your life a living Hell? Once again, trust has been bolted on to the existing Internet through a rickety system of “certificates” issued mostly by large companies for outrageous fees. And, as with anything centralised, it’s vulnerable: in 2016, one of the top-line certificate vendors was compromised, requiring myriad Web sites (including this one) to re-issue their security certificates.
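Whatever mechanism publishes the keys, whether a certificate authority or a blockchain entry, the primitive underneath any trust layer is the same: an identity is a public key, and a claim is accepted only if its signature verifies against that key. A minimal sketch, using the third-party Python cryptography package purely as an illustration of the idea:

```python
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# An actor's identity is just a key pair; the public half could be
# published anywhere tamper-evident (a certificate, a ledger entry...).
private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

message = b"this statement really comes from the holder of this key"
signature = private_key.sign(message)

try:
    public_key.verify(signature, message)           # genuine: no exception
    public_key.verify(signature, b"tampered text")  # forged: raises
except InvalidSignature:
    print("signature does not match -- reject the claim")
```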

Transactions. Business is all about transactions; if you aren’t doing transactions, you aren’t in business or, as Gilder puts it, “In business, the ability to conduct transactions is not optional. It is the way all economic learning and growth occur. If your product is ‘free,’ it is not a product, and you are not in business, even if you can extort money from so-called advertisers to fund it.” The present-day Internet has no transaction layer, even bolted on. Instead, we have more silos and bags hanging off the side of the Internet called PayPal, credit card processing companies, and the like, which try to put a Band-Aid over the suppurating wound which is the absence of a way to send money over the Internet in a secure, trusted, quick, efficient, and low-overhead manner. The need for this was perceived long before ARPANET. In Project Xanadu, founded by Ted Nelson in 1960, rule 9 of the “original 17 rules” was, “Every document can contain a royalty mechanism at any desired degree of granularity to ensure payment on any portion accessed, including virtual copies (‘transclusions’) of all or part of the document.” While defined in terms of documents and quoting, this implied the existence of a micropayment system which would allow compensating authors and publishers for copies and quotations of their work with a granularity as small as one character, and could easily be extended to cover payments for products and services. A micropayment system must be able to handle very small payments without crushing overhead, extremely quickly, and transparently (without the Japanese tea ceremony that buying something on-line involves today). As originally envisioned by Ted Nelson, as you read documents, their authors and publishers would be automatically paid for their content, including payments to the originators of material from others embedded within them. As long as the total price for the document was less than what I termed the user’s “threshold of paying”, this would be completely transparent (a user would set the threshold in the browser: if zero, they’d have to approve all payments). There would be no need for advertisements to support publication on a public hypertext network (although publishers would, of course, be free to adopt that model if they wished). If implemented in a decentralised way, like the ARPANET, there would be no central strangle point where censorship could be applied by cutting off the ability to receive payments.
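The “threshold of paying” logic is simple enough to sketch. The following is my own toy illustration of the browser-side bookkeeping, not a description of Xanadu or any existing system: charges below the user’s threshold accumulate silently, while anything above it requires explicit approval.

```python
from collections import defaultdict

class MicropaymentWallet:
    """Toy sketch of transparent, Nelson-style micropayments."""

    def __init__(self, threshold_cents, ask_user):
        self.threshold = threshold_cents   # charges below this are silent; 0 means ask for everything
        self.ask_user = ask_user           # callback for charges at or above the threshold
        self.ledger = defaultdict(float)   # payee -> cents owed

    def charge(self, payee, cents, description):
        if cents < self.threshold:
            approved = True                # transparent: below the threshold of paying
        else:
            approved = self.ask_user(payee, cents, description)
        if approved:
            self.ledger[payee] += cents
        return approved

wallet = MicropaymentWallet(threshold_cents=5,
                            ask_user=lambda payee, cents, desc: False)
wallet.charge("author@example", 0.03, "1,200 characters quoted")
wallet.charge("publisher@example", 2.00, "full article")
wallet.charge("filmstudio@example", 350.00, "feature film")  # asks the user, declined
print(dict(wallet.ledger))  # {'author@example': 0.03, 'publisher@example': 2.0}
```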

So, is it possible to remake the Internet, building in security, trust, and transactions as the foundation, and replace what the author calls the “Google system of the world” with one in which the data silos are seen as obsolete, control of users’ personal data and work returns to their hands, privacy is respected and the panopticon snooping of today is seen as a dark time we’ve put behind us, and the pervasive and growing censorship by plutocrat ideologues and slaver governments becomes impotent and obsolete? George Gilder responds “yes”, and in this book identifies technologies already existing and being deployed which can bring about this transformation.

At the heart of many of these technologies is the concept of a blockchain, an open, distributed ledger which records transactions or any other form of information in a permanent, public, and verifiable manner. Originally conceived as the transaction ledger for the Bitcoin cryptocurrency, it provided the first means of solving the double-spending problem (how do you keep people from spending a unit of electronic currency twice) without the need for a central server or trusted authority, and hence without a potential choke-point or vulnerability to attack or failure. Since the launch of Bitcoin in 2009, blockchain technology has become a major area of research, with banks and other large financial institutions, companies such as IBM, and major university research groups exploring applications with the goals of drastically reducing transaction costs, improving security, and hardening systems against single-point failure risks.
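Stripped of mining, networking, and consensus, the core data structure is just a hash-linked ledger in which rewriting any past entry invalidates every block after it. A bare-bones sketch of my own (standard library only, and nothing like a production implementation):

```python
import hashlib
import json

def block_hash(block):
    """Hash the block's contents, including the previous block's hash."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append(chain, transactions):
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"prev": prev, "transactions": transactions}
    block["hash"] = block_hash({"prev": prev, "transactions": transactions})
    chain.append(block)

def verify(chain):
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        if block["prev"] != expected_prev:
            return False
        if block["hash"] != block_hash({"prev": block["prev"],
                                        "transactions": block["transactions"]}):
            return False
    return True

chain = []
append(chain, [{"from": "alice", "to": "bob", "amount": 5}])
append(chain, [{"from": "bob", "to": "carol", "amount": 2}])
print(verify(chain))                           # True
chain[0]["transactions"][0]["amount"] = 500    # try to rewrite history
print(verify(chain))                           # False -- the tampering is detectable
```

Real blockchains add proof-of-work or another consensus rule so that no single party can simply recompute all the hashes after tampering; that is what makes the ledger effectively permanent without a central authority.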

Applied to the Internet, blockchain technology can provide security and trust (through the permanent publication of public keys which identify actors on the network), and a transaction layer able to efficiently and quickly execute micropayments without the overhead, clutter, friction, and security risks of existing payment systems. By necessity, present-day blockchain implementations are add-ons to the existing Internet, but as the technology matures and is verified and tested, it can move into the foundations of a successor system, based on the same lower-level protocols (and hence compatible with the installed base), but eventually supplanting the patched-together architecture of the Domain Name System, certificate authorities, and payment processors, all of which represent vulnerabilities of the present-day Internet and points at which censorship and control can be imposed. Technologies to watch in these areas are:

As the bandwidth available to users on the edge of the network increases through the deployment of fibre to the home and enterprise and via 5G mobile technology, the data transfer economy of scale of the great data silos will begin to erode. Early in the Roaring Twenties, the aggregate computing power and communication bandwidth on the edge of the network will equal and eventually dwarf that of the legacy data smelters of Google, Facebook, Twitter, and the rest. There will no longer be any need for users to entrust their data to these overbearing anachronisms and consent to multi-dozen page “terms of service” or endure advertising just to see their own content or share it with others. You will be in possession of your own data, on your own server or on space for which you freely contract with others, with backup and other services contracted with any other provider on the network. If your server has extra capacity, you can turn it into money by joining the market for computing and storage capacity, just as you take advantage of these resources when required. All of this will be built on the new secure foundation, so you will retain complete control over who can see your data, no longer trusting weasel-worded promises made by amorphous entities with whom you have no real contract to guard your privacy and intellectual property rights. If you wish, you can be paid for your content, with remittances made automatically as people access it. More and more, you’ll make tiny payments for content which is no longer obstructed by advertising and chopped up to accommodate more clutter. And when outrage mobs of pink hairs and soybeards (each with their own pronoun) come howling to ban you from the Internet, they’ll find nobody to shriek at and the kill switch rusting away in a derelict data centre: your data will be in your own hands with access through myriad routes. Technologies moving in this direction include:

This book provides a breezy look at the present state of the Internet, how we got here (versus where we thought we were going in the 1990s), and how we might transcend the present-day mess into something better if not blocked by the heavy hand of government regulation (the risk of freezing the present-day architecture in place by unleashing agencies like the U.S. Federal Communications Commission, which stifled innovation in broadcasting for six decades, to do the same to the Internet is discussed in detail). Although it’s way too early to see which of the many contending technologies will win out (and recall that the technically superior technology doesn’t always prevail), a survey of work in progress provides a sense for what they have in common and what the eventual result might look like.

There are many things to quibble about here. Gilder goes on at some length about how he believes artificial intelligence is all nonsense, that computers can never truly think or be conscious, and that creativity (new information in the Shannon sense) can only come from the human mind, with a lot of confused arguments from Gödel incompleteness, the Turing halting problem, and even the uncertainty principle of quantum mechanics. He really seems to believe in vitalism, that there is an élan vital which somehow infuses the biological substrate which no machine can embody. This strikes me as superstitious nonsense: a human brain is a structure composed of quarks and electrons arranged in a certain way which processes information, interacts with its environment, and is able to observe its own operation as well as external phenomena (which is all consciousness is about). Now, it may be that somehow quantum mechanics is involved in all of this, and that our existing computers, which are entirely deterministic and classical in their operation, cannot replicate this functionality, but if that’s so it simply means we’ll have to wait until quantum computing, which is already working in a rudimentary form in the laboratory, and is just a different way of arranging the quarks and electrons in a system, develops further.

He argues that while Bitcoin can be an efficient and secure means of processing transactions, it is unsuitable as a replacement for volatile fiat money because, unlike gold, the quantity of Bitcoin has an absolute limit, after which the supply will be capped. I don’t get it. It seems to me that this is a feature, not a bug. The supply of gold increases slowly as new gold is mined, and by pure coincidence the rate of increase in its supply has happened to approximate that of global economic growth. But still, the existing inventory of gold dwarfs new supply, so there isn’t much difference between a very slowly increasing supply and a static one. If you’re on a pure gold standard and economic growth is faster than the increase in the supply of gold, there will be gradual deflation because a given quantity of gold will buy more in the future. But so what? In a deflationary environment, interest rates will be low and it will be easy to fund new investment, since investors will receive money back which will be more valuable. With Bitcoin, once the entire supply is mined, supply will be static (actually, very slowly shrinking, as private keys are eventually lost, which is precisely like gold being consumed by industrial uses from which it is not reclaimed), but Bitcoin can be divided without limit (with minor and upward-compatible changes to the existing protocol). So, it really doesn’t matter if, in the greater solar system economy of the year 8537, a single Bitcoin is sufficient to buy Jupiter: transactions will simply be done in yocto-satoshis or whatever. In fact, Bitcoin is better in this regard than gold, which cannot be subdivided below the unit of one atom.
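The arithmetic behind the divisibility argument is easy to check; the sub-satoshi unit below is purely illustrative, not an existing protocol feature.

```python
BTC_CAP = 21_000_000          # protocol-capped supply of bitcoins
SATOSHI_PER_BTC = 10**8       # current smallest on-chain unit

total_satoshis = BTC_CAP * SATOSHI_PER_BTC
print(f"{total_satoshis:,}")  # 2,100,000,000,000,000 indivisible units today

# A protocol change could denominate in, say, 10**12 sub-units per satoshi,
# multiplying the number of spendable units without minting a single new
# bitcoin: scarcity and divisibility are independent properties.
print(f"{total_satoshis * 10**12:,}")
```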

Gilder further argues, as he did in The Scandal of Money, that the proper dimensional unit for money is time, since that is the measure of what is required to create true wealth (as opposed to funny money created by governments or fantasy money “earned” in zero-sum speculation such as currency trading), and that existing cryptocurrencies do not meet this definition. I’ll take his word on the latter point; it’s his definition, after all, but his time theory of money is way too close to the Marxist labour theory of value to persuade me. That theory is trivially falsified by its prediction that more value is created in labour-intensive production of the same goods than by producing them in a more efficient manner. In fact, value, measured as profit, dramatically increases as the labour input to production is reduced. Over forty centuries of human history, the one thing in common among almost everything used for money (at least until our post-reality era) is scarcity: the supply is limited and it is difficult to increase it. The genius of Bitcoin and its underlying blockchain technology is that it solved the problem of how to make a digital good, which can be copied at zero cost, scarce, without requiring a central authority. That seems to meet the essential requirement to serve as money, regardless of how you define that term.

Gilder’s books have a good record for sketching the future of technology and identifying the trends which are contributing to it. He has been less successful picking winners and losers; I wouldn’t make investment decisions based on his evaluation of products and companies, but rather wait until the market sorts out those which will endure.

Gilder, George. Life after Google. Washington: Regnery Publishing, 2018. ISBN 978-1-62157-576-4.

Here is a talk by the author at the Blockstack Berlin 2018 conference which summarises the essentials of his thesis in just eleven minutes and ends with an exhortation to designers and builders of the new Internet to “tear down these walls” around the data centres which imprison our personal information.

This Uncommon Knowledge interview provides, in 48 minutes, a calmer and more in-depth exploration of why the Google world system must fail and what may replace it.

TOTD 2018-09-27: The New 95

Peter Thiel created the 1517 Fund, named after the year when Martin Luther nailed his 95 Theses to the door of All Saints’ Church in Wittenberg (this story may be apocryphal, but the theses were printed and widely distributed in 1517 and 1518).  Thiel’s fund invests in ventures created by young entrepreneurs, most of them beneficiaries of Thiel Fellowships, who eschew or drop out of the higher education fraud to do something with their lives before their creativity is abraded away by the engine of conformity and mediocrity that present-day academia has become.

The goal of these ventures is to render impotent and obsolete the dysfunctional scam of higher education.  There is no richer target to be disrupted by technology and honesty than the bloated academia-corporate-government credential shakedown racket that has turned what could have been a productive generation into indentured debt slaves, paying off worthless degrees.

On the (approximate) 500th anniversary of Luther’s 95 Theses, the 1517 Fund published “The New 95”: blows against the fraudulent academic empire.  Here are a few choice excerpts.

15. Harvard could admit ten times as many students, but it doesn’t. It could open ten more campuses in different regions, but it doesn’t. Elite schools are afraid of diluting brand equity. They’re not in the education business. They’re in the luxury watch business.

19. In 1987, the year Stephen Trachtenberg became president of George Washington University, students paid $27,000 (in 2017 dollars) in tuition, room, and board. When he retired twenty years later, they paid more than double — close to $60,000. Trachtenberg made GW the most expensive school in the nation without improving education at all. The degree “serves as a trophy, a symbol,” he said. “I’m not embarrassed by what we did.” There are buildings on campus named after this guy.

28. The problem in schooling is not that we have invested too little, but that we get so little for so much.

36. There’s no iron law of economics that says tuition should go up — and only up — year after year. By many measures, universities are the same or worse at teaching students as they were in the early 1980s. But now, students are paying four times as much as they did then. Imagine paying more every year for tickets on an airline whose planes flew slower and crashed more frequently, but that spent its revenue on one hell of a nice terminal and lounge instead. Would you put that sticker on your car’s back window?

48. The people who give exams or evaluate essays and the people who teach should not be one and the same. Creating the best content for people to learn and creating a system to certify that people have achieved some level of mastery are two different problems. By fusing them into one, universities curtail freedom of thought and spark grade inflation. Critical thinking is currently mistaken for finding out what the professor wants to hear and saying it.

57. Professors should be better than snowmen. Snowstorms cancelling class tend to bring more joy to students than learning new ideas. What a strange service! Higher education, root canals, rectal exams, and schooling are the only services that consumers rejoice in having cancelled.

78. Every academic and scientific journal should be open and free to the public. It is much easier to check results for reproducibility with a billion eyes.

84. Too much of school is about proving that you can show up every day on time, work, and get along with the people around you.

Read the whole thing.
