Close Call: STS-93

Space Shuttle mission STS-93 in July 1999 launched the Chandra X-ray Observatory which, with its upper stage rocket, was the heaviest payload ever launched by a shuttle mission.  The first attempt to launch STS-93 was aborted seven seconds before liftoff, just a moment before the main engines were to start, due to a warning about hydrogen build-up which was subsequently determined to be erroneous.

Three days later, after an uneventful countdown, the mission launched on 1999-07-23.  Seconds after liftoff, problems were apparent.  The Master Alarm went off on board Columbia, and messages indicated a water pump failure and potassium hydroxide leak in fuel cell number one (of three).  A potassium hydroxide leak could potentially cause the fuel cell to explode, which would be bad.

Further, at Mission Control, a big red light came on indicating that hydraulic pressure in the right solid rocket booster had fallen to disastrously low levels.  Failure of the hydraulics, which were used to steer the booster, would cause the shuttle to veer off course and break up or impact the ground.  However, things appeared to proceed normally, so it was decided the low pressure indication must be due to faulty instrumentation, not a real problem.

Seconds later the primary engine control computer on the centre main engine failed, with the back-up taking over control, and the back-up control computer on the right engine failed.  Two of the shuttle’s three main engines were now running with no control redundancy.  Failure of their remaining computers would cause the engines to shut down, requiring a risky mission abort.

Simultaneous with all of this, one of the AC power buses dropped offline.  Loss of another bus would take down one of the main engines, so the crew were told to inhibit the automatic shutdown of the AC buses (essentially equivalent to hard-wiring around a fuse).

All of this happened in the first minute of flight.

As the ascent continued, anomalies were noted in the performance of the centre and right main engines.  The engines cut off a fraction of a second too early, leaving the shuttle short on velocity (but nothing that couldn’t be made up for with its orbital maneuvering engines).  There was an indication that the engines cut off due to depletion of liquid oxygen, which shouldn’t have happened since an adequate reserve is always loaded.

After engine shutdown and confirmation of orbital insertion had occurred, flight director John Shannon exclaimed on the Flight audio loop, “Yikes.  We don’t need another one of those.”  It was like one of the nefarious multiple-failure scenarios dreamed up by evil simulation supervisors to test flight controllers and crew in simulation runs, but this time for real.

STS-93 hydrogen leak in right engine

What nobody knew at the time was that at the moment of engine ignition a tiny gold plug, used to stopper defective liquid oxygen injection posts inside the right main engine, had come loose, shot through the nozzle, and impacted the wall of the nozzle extension with the velocity of a rifle bullet.  It had ripped open three adjacent cooling tubes in the extension.  Engineers estimated that loss of five tubes would be sufficient to create a burn-through which would result in a catastrophic failure and loss of the vehicle and crew.  In the picture at the right, you can see the hydrogen leak as a bright blue-green stripe in the rightmost engine bell.

Still, with only three tubes breached, the engine was leaking 1.6 kg of liquid hydrogen every second, and since the leak was downstream of the hydrogen flow sensors, there was no indication in telemetry that there was a problem.  Further, most of the telemetry for the engines was processed by the centre engine primary control computer, which was down due to the electrical problem.

The hydrogen leak caused the chamber pressure to fall, which the control computer countered by increasing oxygen flow.  This in turn caused the turbine temperature to increase toward the red-line at which the engine would throttle down to protect itself.

None of these details were known until Columbia returned after its successful mission and the engines were torn down for inspection.  The shuttle had dodged several bullets.  The defective liquid oxygen post could have failed or melted, destroying the engine (recall that that’s why it had been plugged shut in the first place).  The gold pin could have punctured five or more cooling tubes, leading to a burn-through and engine explosion.  Finally, the electrical anomaly, which was tracked down to a frayed wire shorting against a burr in a screw head, could easily have been more serious, disabling redundant engine control computers or other mission-critical systems.

Further, it was found that due to other, unrelated problems, Columbia was launched with 407 kg less liquid oxygen than planned.  Had this been slightly greater, the increased oxygen consumption due to the leak could have caused the engines to cut off earlier, possibly with a velocity shortfall too large to make up with the maneuvering engines.
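
To put those numbers in perspective, here is a back-of-the-envelope sketch in Python.  The leak rate and the 407 kg oxygen under-load are from the text above; the roughly 8.5 minute (510 second) main engine burn time is my assumption of a typical ascent, not a figure from this post:

        # Rough estimate of total hydrogen lost through the breached
        # cooling tubes over the ascent.  Leak rate (1.6 kg/s) is from
        # the post; the burn duration is an assumed typical value.
        leak_rate_kg_s = 1.6      # hydrogen leak through three tubes
        burn_time_s = 510         # assumed ~8.5 minute main engine burn

        leaked_lh2_kg = leak_rate_kg_s * burn_time_s
        print(f"Hydrogen leaked during ascent: ~{leaked_lh2_kg:.0f} kg")
        # ~816 kg, about twice the 407 kg liquid oxygen under-load,
        # which gives a feel for how little margin remained against
        # an even earlier engine cutoff.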

The mission was a success, and the Chandra X-ray Observatory is still in service today.  But the experience demonstrated just how thin the margins were in the Space Shuttle system, and how the tiniest components, a small gold pin or a burr on a screw head, could lead the vehicle and crew to the brink of disaster.

Here is a Scott Manley video with the story of STS-93.

Wayne Hale was the Mission Operations Director for STS-93.  In 2014, he recounted the story on his blog.

The following is the NASA TV coverage of the launch from ten minutes before launch to insertion into orbit.  The NASA Public Affairs Officer makes things seem much calmer than the voices he was hearing in his ear from Mission Control.

This is an audio recording of the Flight loop from Mission Control during the launch.  The “Yikes” remark is at the 10:20 point in the recording.

Perseid Meteor Shower 2018

Perseid meteor, 2015-08-13

Tonight (August 12–13, 2018 UTC) the Perseid meteor shower will peak.  This meteor shower occurs every year around August 12th as the Earth passes through the debris strewn along the orbit of comet Swift-Tuttle.  This is one of the most reliable and intense meteor showers and, in ideal conditions (clear, dark sky and dark-adapted eyes), you may see a meteor a minute.  (As with everything, Pareto is on the job—there are many more dim meteors than bright ones.)
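
To put a rough number on that skew, here is a small sketch using the standard meteor “population index” r, under which each magnitude step fainter multiplies the expected count by r.  The value r ≈ 2.2 is a commonly quoted figure for the Perseids and is an assumption here, not taken from this post:

        # Relative meteor counts by brightness, assuming population
        # index r = 2.2: each magnitude fainter means ~2.2x more meteors.
        r = 2.2
        for mag in range(-2, 5):   # -2 = bright fireball range, +4 = faint
            print(f"mag {mag:+d}: {r ** mag:7.1f}x the count at mag 0")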

This year, the Moon will not interfere with observation, so there should be a good show.  Here is information about observing the Perseids.

It couldn’t be easier.  Any time after around 23:00 local time (the later the better, as the “radiant”—the point in the sky from which the meteors appear to come—rises higher in the sky) go out to a place as far as you can find away from street lights or other interference and look up toward the northeast.  Allow time for your eyes to dark-adapt.  Once you can see the Milky Way, you should be able to see the dimmer meteors.

Sometimes you’ll be lucky and see a bright fireball which leaves a persistent trail that lasts for several seconds.  Don’t expect this, however: the last one I saw was in 2015, as pictured above.

I’ll not be watching for Perseids tonight.  After a perfectly clear day, around the end of astronomical twilight clouds rolled in and completely obscured the sky.  The peak of the Perseids is broad, however, so if it’s clear I’ll try to-morrow.

If you have clear skies tonight, go out and have a look.  You need no equipment other than the Mark I eyeball and, if in skeeter country, a splash of DEET.

Clear skies!

Book Review “The Autodesk File”

A comment John made (#18) on a recent post by 10 cents (“Programming Question”) reminded me I had reviewed one of John’s books. The review was posted a while back on the legacy site. As this is one of the most worthwhile books I have ever read, I thought it should be posted here.

A work of non-fiction is understood in a context. A great work actually articulates the context before anybody else gets it. A review of such a book may go seemingly far afield, if the book’s power can be construed to provoke and, indeed, license the inspired musings of its readers. Such is the case here, as “The Autodesk File”’s roots are deep in the intellectual, technological, economic, financial, and even spiritual soil of this, the spring garden of the information age.

When was the last time you couldn’t put down a book which had not a single murder, courtship, love or sex scene? OK, I’m not counting some ancillary trysts consisting of mergers and takeovers, which some might construe as sexy, or at least allude to being on the receiving end of a certain Anglo-Saxon gerund. This book contains no obscenities, save a rare mention of taurine spoor. That serves as a welcome reminder: important ideas and even emotions are amenable to description sans vulgarity.

Lest one think this a narrow commercial exposition, “The Autodesk File” is in the public domain in multiple formats. Neither is it a mere exposition of commerce. About half way through, amidst essays explaining the nature of businesses dealing in intellectual property (rather than capital-intensive equipment), the reader is treated to a short science fiction story whose theme is no less than a plausible tale of the origin of human life. Our bodily construction is, after all, prescribed in lines of code, albeit compressed into helixes wound around themselves then wrapped around histones. Like some of their software counterparts, they, too, must be unzipped before use.

Also punctuating this eclectic opus are quotes from Aristophanes. It is a tour de force, a truly awe-inspiring account of much more than the building and workings of one trailblazing company. It encapsulates the noblest of human aspirations, idealizations, creativity, ingenuity and critical self-examination; inescapable is the conclusion that voluntary cooperation and exchange of ideas, knowledge and capital is a great boon to the world at large. If a business is built to serve the needs of customers by creating products of the highest possible quality, greed is not a good; it is irrelevant. Also inescapable is the perhaps ironic conclusion that ongoing success requires continual vigilance, lest arrogance take hold. The fruition of critical self-examination can be seen in renewal of that same humility which was so essential in powering that first whiff of success.

Nonetheless, apart from arcane sections dealing with technical matters of computer hardware and programming (these, too, may be great for the cognoscenti; this writer simply knows too little), this book is a spellbinder. Readers may be surprised to be persuasively regaled with the fundamentals of various disciplines, including economics, finance, taxation, corporate law, engineering, computer science, thermodynamics, rocket science, quantum mechanics, cosmology and the nature of reality. That is, readers who don’t know John Walker. For those who do, none of this is surprising.

Have you ever had a million dollar idea? I have – lots of ‘em. Have I turned even one of those ideas into a product? Nope. Why not? Because I lacked the understanding, the talent, and the single-minded discipline to even get one idea off the ground. This book, edited by Ratburger’s own John Walker (himself author of most of the collected writings), is a chronicle of the birth, growth, crises and maturation of Autodesk Inc., whose products helped unleash the creativity and productivity of millions of people. It did so beginning with a key insight: that the infant personal computer was a general tool and not a specific workstation. As a general tool, through the intelligent design of software, it would rapidly evolve in utility in virtually every field of endeavor, beginning with design. Design, in this line of thinking, is a logical first step down the path which aims, eventually, to capture all of reality in the box we call a computer. This stunning insight occurred while all the rest of us still went through our days typing on an IBM Selectric, without once even using the word “computer.” Way back then in 1980, virtually none of us thought about computers or any of the other words and things without which our lives today would be unimaginable. Historically speaking, 1980 happened yesterday.

An additional insight guided Autodesk’s ethos: that personal computers would grow exponentially in processing power and become usable by ordinary people (with no computer or programming skills) to undertake virtually any task. Autodesk’s first product, AutoCAD, moved design from a small number of dedicated, expensive CAD workstations operated by highly-trained people to desks virtually everywhere drawing might be needed. In the process of “squeezing too much code into too small a box,” Autodesk did not compete with previous generations of single-purpose CAD workstations which cost 10 – 50X as much. Instead, it created and expanded a market for CAD by those same orders of magnitude, by bringing this tool to the 98% of designers and draftsmen who could not afford dedicated CAD workstations.

In less than one year, this new company had a hit product. Time to rest on one’s laurels? How about after the IPO? Time to coast? Not quite. Going into the CAD business – and that is the business, as opposed to the software business (read the book to learn why) – is something like launching a rocket from Earth and hoping to land on a comet and send back data, except that the precise trajectory of the comet cannot be known, and its surface material and contours are completely unknown. The difficulties were perhaps not unlike those encountered by the ESA’s $1.8 billion Rosetta/Philae spacecraft, which did rendezvous and land on comet 67P. Philae’s tether harpoons failed to fire, so the probe bounced and wound up in a permanently-shaded spot, preventing use of solar power (due to an unanticipated hard surface, the harpoons likely would not have worked anyway). Batteries enabled an estimated 80% overall mission success. AutoCAD’s launch – with $59,000 in capital and mid-course hardware and software corrections before “landing” on users – remains, by contrast, successful to this day.

“The Autodesk File” attributes success to the company’s understanding that it represented what it coined “The New Technological Corporation.” This is an enterprise which does not conform to traditional capital-intensive business, as it can deploy intellectual, debt-free leverage. Such businesses embrace an unpredictable but essential element: “wild talent.” This talent is a necessary but not sufficient condition for success when it comes to creating software, which is unlike most all prior businesses. Rather than capital, such entities require a peculiar kind of talent – one which grasps the present desires of a market, knows what is possible with present hardware and correctly plots the trajectories of both the market and evolving hardware. I believe it to be objectively true that the editor is faithfully and humbly describing the truly awe-inspiring talent he, himself, brought to Autodesk. Other such individuals, like Jobs or Gates, are known in the early computer and software businesses. Few, however, have operated as willing members of an extended team with humility, dedication to excellence and human decency. If nothing else, “The Autodesk File” shows how this can be accomplished.

Attempts to find individuals with “wild talent” are most difficult, maybe impossible. “Wild talent” illustrates the essential difference between aggregate information, traditionally used by analysts to “value” companies which trade on public exchanges, and actual events which take place within any company. For instance, money spent on R&D is aggregate data which subsumes the activities of many employees of a given company. Whether it means the company will grow really depends on what individual employees accomplish. When it comes to software, the outcome will be notably different for R&D teams which play it safe versus ones which continually push the envelope of what may be remotely possible. Intellectual leverage is such that the cost of failure of 8 out of 10 ideas is far outweighed by success in only 1 or 2 of them. The presence of such loyal individuals is also a bulwark against hostile takeovers. You can lead a programmer to the R&D department, but you can’t make him plink – at least not in the way which is essential to success.

Perhaps most revealing about this unusual book is the ongoing critical self-examination engaged in by the primary author. These analyses were distilled into the form of internal company communications as essays and information letters.  At many points in the journey, the author is able to adumbrate the – sometimes previously un-articulable – principles which guided his often momentous insights. These usually arose in chaotic circumstances with incomplete information. The essential humility of this approach is demonstrated at various points in the book. Repeatedly, the author makes clear the importance of open communication and understanding of the roles of all the other parts of the company. A programmer, for example, must understand management’s plan, what customers want, how a product will be marketed and shipped, what competitors are doing, etc. Only then can a “wild talent” be effective.

 “The Autodesk File” is a much-needed reminder that human beings are still capable of doing awe-inspiring, creative and even noble things; that they can voluntarily collaborate and, working in their own self-interest, set off endless waves of non-zero sum games in their wakes. This is also a success story, then, a chain of decisions, clearly rooted in the philosophy of Classical Liberalism – in some of its untidy and altogether messy human details. Without aiming to, this story affirms the primacy and value of the individual, both as producer and consumer; it convincingly shows that communication – positive and negative feedback – between individual, voluntary buyers and sellers – is the essence of what a market is. This is in contrast to statist dirigisme, where aggregate data and arrogance rule, in derogation of the value of the individual. 

Diametrically opposed to today’s received collectivist wisdom, “The Autodesk File” shows how individuals create markets where none previously existed, to the betterment of all. From those roots emerge timeless operating principles: 1. build the best products, period – with open architecture so as to invite developers to customize and find as yet undreamed uses (an essential form of feedback for software companies), thereby further expanding markets; 2. invite, quickly assess and respond to this feedback from customers in the form of improved new releases; 3. employ owners, not merely ‘investors’ – pay well for results – with ownership whenever possible. This is a story which demonstrates the huge difference between owners, whose time preference is long, and investors focused only on the forecast for the next fiscal quarter. The tyranny of industry analysts, a form of economic lunacy where short time preference is brutally and pervasively enforced on behalf of “investors,” operates so as to threaten the short-term existence of sound public companies which actually attempt to pursue the best long-term business practices.

In a somewhat philosophic interview around the tenth anniversary of Autodesk, the author/editor describes the operation of a new “design cult” of engineering as a “form of creationism, which thinks its members are so omniscient that they have no need for market-driven evolution to perfect their efforts.” This view, coupled with the information letters, again displays an essential humility in the ethos of Autodesk. Management must lead toward explicit goals. Every part of the organization must understand and communicate with all others, particularly as it affects product development. This is not the typical hierarchical corporate ethos. Neither is it anarchy. Management must lead, but not without listening, understanding and explaining. 

It is difficult for this writer to refrain from drawing parallels to the author’s description of this “design cult” of engineering. Such an attitude is not surprising, given that we live in a society which increasingly and officially denies the existence of a supreme being, while at the same time acting – through a “cult” of increasingly centralized authoritarian government – as though it were omniscient and omnipotent; as though its policies have no unintended consequences; as though no cost is too high to accomplish its goals, whose only feedback is its own reverberating positive-feedback echo chamber. It is hard to know which cult is imitating which. In either case, the state-erected obstacles to starting and running a business, while not emphasized, are on display in this epic. This common ethos of the state and large corporations has inevitably given us today’s pernicious corporatism.

It may be that the most significant intellectual error of our time is the belief that society can be modeled and manipulated as well as physical reality now can be, thanks in large part to private companies like Autodesk. Unlike government, though, companies are forced to relearn their limits – i.e., lessons in humility are given, at least annually, and enforced as necessary by balance sheets and owners. The fear of going out of business would be a highly salutary fear for modern government to experience. Instead of a healthy humility, however, the state often displays antipathy toward private enterprise – ironically, the very source of its own financial power. The public display of this attitude likely represents either envy of private successes and/or virtue signaling in an effort to garner votes in the incessant lust for yet more power.

God is traditionally described as a jealous God. Do you suppose that our deity/government has its own version of the Ten Commandments, the first of which explains its animus toward private enterprise? “Thou shalt have no other Gods before Me…” …otherwise put, “Trust me. I’m from the government.” “I’m here to protect you from those big, bad, corporations.”

Thus, as you can see, for this reader the story of Autodesk led to much contemplation of human nature and the whole spectrum of our interactions – both voluntary and coercive. It is an inspiring and epic tale of the utility and nobility of voluntary cooperation.

“The Autodesk File” is in the public domain. It is available in several downloadable versions. All formats are accessible here: http://www.fourmilab.ch/autofile/


Saturday Night Science: Losing the Nobel Prize

“Losing the Nobel Prize” by Brian Keating

Ever since the time of Galileo, the history of astronomy has been punctuated by a series of “great debates”—disputes between competing theories of the organisation of the universe which observation and experiment using available technology are not yet able to resolve one way or another. In Galileo’s time, the great debate was between the Ptolemaic model, which placed the Earth at the centre of the solar system (and universe), and the competing Copernican model, which had the planets all revolving around the Sun. Both models worked about as well in predicting astronomical phenomena such as eclipses and the motion of planets, and no observation made so far had been able to distinguish them.

Then, in 1610, Galileo turned his primitive telescope to the sky and observed the bright planets Venus and Jupiter. He found Venus to exhibit phases, just like the Moon, which changed over time. This would not happen in the Ptolemaic system, but is precisely what would be expected in the Copernican model—where Venus circled the Sun in an orbit inside that of Earth. Turning to Jupiter, he found it to be surrounded by four bright satellites (now called the Galilean moons) which orbited the giant planet. This further falsified Ptolemy’s model, in which the Earth was the sole source of attraction around which all celestial bodies revolved. Since anybody could build their own telescope and confirm these observations, this effectively resolved the first great debate in favour of the Copernican heliocentric model, although some hold-outs in positions of authority resisted its dethroning of the Earth as the centre of the universe.

This dethroning came to be called the “Copernican principle”, that Earth occupies no special place in the universe: it is one of a number of planets orbiting an ordinary star in a universe filled with a multitude of other stars. Indeed, when Galileo observed the star cluster we call the Pleiades, he saw myriad stars too dim to be visible to the unaided eye. Further, the bright stars were surrounded by a diffuse bluish glow. Applying the Copernican principle again, he argued that the glow was due to innumerably more stars too remote and dim for his telescope to resolve, and then generalised that the glow of the Milky Way was also composed of uncountably many stars. Not only had the Earth been demoted from the centre of the solar system, so had the Sun been dethroned to being just one of a host of stars possibly stretching to infinity.

But Galileo’s inference from observing the Pleiades was wrong. The glow that surrounds the bright stars is due to interstellar dust and gas which reflect light from the stars toward Earth. No matter how large or powerful the telescope you point toward such a reflection nebula, all you’ll ever see is a smooth glow. Driven by the desire to confirm his Copernican convictions, Galileo had been fooled by dust. He would not be the last.

William Herschel was an eminent musician and composer, but his passion was astronomy. He pioneered the large reflecting telescope, building more than sixty telescopes. In 1789, funded by a grant from King George III, Herschel completed a reflector with a mirror 1.26 metres in diameter, which remained the largest aperture telescope in existence for the next fifty years. In Herschel’s day, the great debate was about the Sun’s position among the surrounding stars. At the time, there was no way to determine the distance or absolute brightness of stars, but Herschel decided that he could compile a map of the galaxy (then considered to be the entire universe) by surveying the number of stars in different directions. Only if the Sun was at the centre of the galaxy would the counts be equal in all directions.

Aided by his sister Caroline, a talented astronomer herself, he eventually compiled a map which indicated the galaxy was in the shape of a disc, with the Sun at the centre. This seemed to refute the Copernican view that there was nothing special about the Sun’s position. Such was Herschel’s reputation that this finding, however puzzling, remained unchallenged until 1847 when Wilhelm Struve discovered that Herschel’s results had been rendered invalid by his failing to take into account the absorption and scattering of starlight by interstellar dust. Just as you can only see the same distance in all directions while within a patch of fog, regardless of the shape of the patch, Herschel’s survey could only see so far before extinction of light by dust cut off his view of stars. Later it was discovered that the Sun is far from the centre of the galaxy. Herschel had been fooled by dust.

In the 1920s, another great debate consumed astronomy. Was the Milky Way the entire universe, or were the “spiral nebulæ” other “island universes”, galaxies in their own right, peers of the Milky Way? With no way to measure distance and no telescopes able to resolve them into stars, many astronomers believed spiral nebulæ were nearby objects, perhaps other solar systems in the process of formation. The discovery of a Cepheid variable star in the nearby Andromeda “nebula” by Edwin Hubble in 1923 allowed settling this debate. Andromeda was much farther away than the most distant stars found in the Milky Way. It must, then, be a separate galaxy. Once again, demotion: the Milky Way was not the entire universe, but just one galaxy among a multitude.

But how far away were the galaxies? Hubble continued his search and measurements and found that the more distant the galaxy, the more rapidly it was receding from us. This meant the universe was expanding. Hubble was then able to calculate the age of the universe—the time when all of the galaxies must have been squeezed together into a single point. From his observations, he computed this age at two billion years. This was a major embarrassment: astrophysicists and geologists were confident in dating the Sun and Earth at around five billion years. It didn’t make any sense for them to be more than twice as old as the universe of which they were a part. Some years later, it was discovered that Hubble’s distance estimates were far understated because he failed to account for extinction of light from the stars he measured due to dust. The universe is now known to be seven times the age Hubble estimated. Hubble had been fooled by dust.
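
To see where the two-billion-year figure came from, note that to first order the age of an expanding universe is simply the reciprocal of the Hubble constant. Here is a minimal sketch of the arithmetic; the value of roughly 500 km/s/Mpc for Hubble’s dust-biased distance scale is the commonly cited historical figure, assumed here rather than taken from the book:

        # Naive expansion age: t = 1/H0 (ignoring deceleration).
        KM_PER_MPC = 3.086e19        # kilometres in one megaparsec
        SECONDS_PER_GYR = 3.156e16   # seconds in a billion years

        def expansion_age_gyr(h0_km_s_per_mpc):
            """Age of the universe in billions of years for a given H0."""
            h0_per_second = h0_km_s_per_mpc / KM_PER_MPC
            return 1.0 / h0_per_second / SECONDS_PER_GYR

        print(round(expansion_age_gyr(500), 1))  # Hubble's value: ~2.0 Gyr
        print(round(expansion_age_gyr(70), 1))   # modern value: ~14.0 Gyr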

By the 1950s, the expanding universe was generally accepted and the great debate was whether it had come into being in some cataclysmic event in the past (the “Big Bang”) or was eternal, with new matter spontaneously appearing to form new galaxies and stars as the existing ones receded from one another (the “Steady State” theory). Once again, there were no observational data to falsify either theory. The Steady State theory was attractive to many astronomers because it was the more “Copernican”—the universe would appear overall the same at any time in an infinite past and future, so our position in time is not privileged in any way, while in the Big Bang the distant past and future are very different than the conditions we observe today. (The rate of matter creation required by the Steady State theory was so low that no plausible laboratory experiment could detect it.)

The discovery of the cosmic background radiation in 1965 definitively settled the debate in favour of the Big Bang.  It was precisely what was expected if the early universe were much denser and hotter than conditions today, as predicted by the Big Bang.  The Steady State theory made no such prediction and, despite rear-guard actions by some of its defenders (invoking dust to explain the detected radiation!), was considered falsified by most researchers.

But the Big Bang was not without its own problems. In particular, in order to end up with anything like the universe we observe today, the initial conditions at the time of the Big Bang seemed to have been fantastically fine-tuned (for example, an infinitesimal change in the balance between the density and rate of expansion in the early universe would have caused the universe to quickly collapse into a black hole or disperse into the void without forming stars and galaxies). There was no physical reason to explain these fine-tuned values; you had to assume that’s just the way things happened to be, or that a Creator had set the dial with a precision of dozens of decimal places.

In 1979, the theory of inflation was proposed. Inflation held that in an instant after the Big Bang the size of the universe blew up exponentially so that all the observable universe today was, before inflation, the size of an elementary particle today. Thus, it’s no surprise that the universe we now observe appears so uniform. Inflation so neatly resolved the tensions between the Big Bang theory and observation that it (and refinements over the years) became widely accepted. But could inflation be observed? That is the ultimate test of a scientific theory.

There have been numerous cases in science where many years elapsed between a theory being proposed and definitive experimental evidence for it being found. After Galileo’s observations, the Copernican theory that the Earth orbits the Sun became widely accepted, but there was no direct evidence for the Earth’s motion with respect to the distant stars until the discovery of the aberration of light in 1727. Einstein’s theory of general relativity predicted gravitational radiation in 1915, but the phenomenon was not directly detected by experiment until a century later. Would inflation have to wait as long or longer?

Things didn’t look promising.  Almost everything we know about the universe comes from observations of electromagnetic radiation: light, radio waves, X-rays, etc., with a little bit more from particles (cosmic rays and neutrinos).  But the cosmic background radiation forms an impenetrable curtain behind which we cannot observe anything via the electromagnetic spectrum, and it dates from around 380,000 years after the Big Bang.  The era of inflation was believed to have ended 10⁻³² seconds after the Bang, considerably earlier.  The only “messenger” which could possibly have reached us from that era is gravitational radiation.  We’ve just recently become able to detect gravitational radiation from the most violent events in the universe, but no conceivable experiment would be able to detect this signal from the baby universe.

So is it hopeless? Well, not necessarily…. The cosmic background radiation is a snapshot of the universe as it existed 380,000 years after the Big Bang, and only a few years after it was first detected, it was realised that gravitational waves from the very early universe might have left subtle imprints upon the radiation we observe today. In particular, gravitational radiation creates a form of polarisation called B-modes which most other sources cannot create.

If it were possible to detect B-mode polarisation in the cosmic background radiation, it would be a direct detection of inflation. While the experiment would be demanding and eventually result in literally going to the end of the Earth, it would be strong evidence for the process which shaped the universe we inhabit and, in all likelihood, a ticket to Stockholm for those who made the discovery.

This was the quest on which the author embarked in the year 2000, resulting in the deployment of an instrument called BICEP1 (Background Imaging of Cosmic Extragalactic Polarization) in the Dark Sector Laboratory at the South Pole. Here is my picture of that laboratory in January 2013. The BICEP telescope is located in the foreground inside a conical shield which protects it against thermal radiation from the surrounding ice. In the background is the South Pole Telescope, a millimetre wave antenna which was not involved in this research.

BICEP2 and South Pole Telescope, 2013-01-09

BICEP1 was a prototype, intended to test the technologies to be used in the experiment. These included cooling the entire telescope (which was a modest aperture [26 cm] refractor, not unlike Galileo’s, but operating at millimetre wavelengths instead of visible light) to the temperature of interstellar space, with its detector cooled to just ¼ degree above absolute zero. In 2010 its successor, BICEP2, began observation at the South Pole, and continued its run into 2012. When I took the photo above, BICEP2 had recently concluded its observations.

On March 17th, 2014, the BICEP2 collaboration announced, at a press conference, the detection of B-mode polarisation in the region of the southern sky they had monitored. Note the swirling pattern of polarisation which is the signature of B-modes, as opposed to the starburst pattern of other kinds of polarisation.

Cosmic background radiation B-modes from BICEP2

But, not so fast, other researchers cautioned. The risk in doing “science by press release” is that the research is not subjected to peer review—criticism by other researchers in the field—before publication and further criticism in subsequent publications. The BICEP2 results went immediately to the front pages of major newspapers. Here was direct evidence of the birth cry of the universe and confirmation of a theory which some argued implied the existence of a multiverse—the latest Copernican demotion—the idea that our universe was just one of an ensemble, possibly infinite, of parallel universes in which every possibility was instantiated somewhere. Amid the frenzy, a few specialists in the field, including researchers on competing projects, raised the question, “What about the dust?” Dust again! As it happens, while gravitational radiation can induce B-mode polarisation, it isn’t the only thing which can do so. Our galaxy is filled with dust and magnetic fields which can cause those dust particles to align with them. Aligned dust particles cause polarised reflections which can mimic the B-mode signature of the gravitational radiation sought by BICEP2.

The BICEP2 team was well aware of this potential contamination problem. Unfortunately, their telescope was sensitive only to one wavelength, chosen to be the most sensitive to B-modes due to primordial gravitational radiation. It could not, however, distinguish a signal from that cause from one due to foreground dust. At the same time, however, the European Space Agency Planck spacecraft was collecting precision data on the cosmic background radiation in a variety of wavelengths, including one sensitive primarily to dust. Those data would have allowed the BICEP2 investigators to quantify the degree their signal was due to dust. But there was a problem: BICEP2 and Planck were direct competitors.

Planck had the data, but had not released them to other researchers.  However, the BICEP2 team discovered that a member of the Planck collaboration had shown a slide at a conference of unpublished Planck observations of dust.  A member of the BICEP2 team digitised an image of the slide, created a model from it, and concluded that dust contamination of the BICEP2 data would not be significant.  This was a highly dubious, if not explicitly unethical, move, but the result appeared to confirm measurements from earlier experiments and gave the team confidence in its results.

In September 2014, a preprint from the Planck collaboration (eventually published in 2016) showed that B-modes from foreground dust could account for all of the signal detected by BICEP2. In January 2015, the European Space Agency published an analysis of the Planck and BICEP2 observations which showed the entire BICEP2 detection was consistent with dust in the Milky Way. The epochal detection of inflation had been deflated. The BICEP2 researchers had been deceived by dust.

The author, a founder of the original BICEP project, was so close to a Nobel prize he was already trying to read the minds of the Nobel committee to divine who among the many members of the collaboration they would reward with the gold medal. Then it all went away, seemingly overnight, turned to dust. Some said that the entire episode had injured the public’s perception of science, but to me it seems an excellent example of science working precisely as intended. A result is placed before the public; others, with access to the same raw data are given an opportunity to critique them, setting forth their own raw data; and eventually researchers in the field decide whether the original results are correct. Yes, it would probably be better if all of this happened in musty library stacks of journals almost nobody reads before bursting out of the chest of mass media, but in an age where scientific research is funded by agencies spending money taken from hairdressers and cab drivers by coercive governments under implicit threat of violence, it is inevitable they will force researchers into the public arena to trumpet their “achievements”.

In parallel with the saga of BICEP2, the author discusses the Nobel Prizes and what he considers to be their dysfunction in today’s scientific research environment. I was surprised to learn that many of the curious restrictions on awards of the Nobel Prize were not, as I had heard and many believe, conditions of Alfred Nobel’s will. In fact, the conditions that the prize be shared no more than three ways, not be awarded posthumously, and not awarded to a group (with the exception of the Peace prize) appear nowhere in Nobel’s will, but were imposed later by the Nobel Foundation. Further, Nobel’s will explicitly states that the prizes shall be awarded to “those who, during the preceding year, shall have conferred the greatest benefit to mankind”. This constraint (emphasis mine) has been ignored since the inception of the prizes.

He decries the lack of “diversity” in Nobel laureates (by which he means, almost entirely, how few women have won prizes). While there have certainly been women who deserved prizes and didn’t win (Lise Meitner, Jocelyn Bell Burnell, and Vera Rubin are prime examples), there are many more men who didn’t make the three laureates cut-off (Freeman Dyson an obvious example for the 1965 Physics Nobel for quantum electrodynamics). The whole Nobel prize concept is capricious, and rewards only those who happen to be in the right place at the right time in the right field that the committee has decided deserves an award this year and are lucky enough not to die before the prize is awarded. To imagine it to be “fair” or representative of scientific merit is, in the estimation of this scribbler, in flying unicorn territory.

In all, this is a candid view of how science is done at the top of the field today, with all of the budget squabbles, maneuvering for recognition, rivalry among competing groups of researchers, balancing the desire to get things right with the compulsion to get there first, and the eye on that prize, given only to a few in a generation, which can change one’s life forever.

Personally, I can’t imagine being so fixated on winning a prize one has so little chance of gaining. It’s like being obsessed with winning the lottery—and about as likely.

In parallel with all of this is an autobiographical account of the career of a scientist with its ups and downs, which is both a cautionary tale and an inspiration to those who choose to pursue that difficult and intensely meritocratic career path.

I recommend this book on all three tracks: a story of scientific discovery, mis-interpretation, and self-correction, the dysfunction of the Nobel Prizes and how they might be remedied, and the candid story of a working scientist in today’s deeply corrupt coercively-funded research environment.

Keating, Brian. Losing the Nobel Prize. New York: W. W. Norton, 2018. ISBN 978-1-324-00091-4.

Here is a one hour talk by the author about the BICEP2 experience and the Nobel Prize.

This is the BICEP2 press conference on March 17, 2014, announcing the discovery of B-mode polarisation in the cosmic microwave background radiation.

What do you do after losing the Nobel prize?  In this April 2016 (much) more technical talk at the SETI Institute, Brian Keating describes post-BICEP2 research aimed at using the cosmic background radiation to explore other aspects of the early universe, including whether the universe has an inherent chirality (left- or right-handedness).  (The preview image for this video looks like it’s broken, but if you click the play button it plays correctly, at least for me.)

Code Is Speech

Liberator pistol (produced by additive manufacturing)

There is a fundamental principle at stake in the current controversy, ignorantly reported and heavily spun, over the recent U.S. State Department settlement with Defense Distributed and the Second Amendment Foundation, in which Defense Distributed essentially won the case it had been pursuing since 2015, clearing it to distribute design files for the manufacture of firearms and components which can be used to produce them via additive manufacturing (“3D printing”).

This principle is much simpler and more fundamental than the cloud of confusion and ignorance spread like squid ink by the slavers who prefer a disarmed and dependent population at their mercy.  It goes to the heart of free speech, and we’ve been here before.

The information required to produce an object via additive manufacturing is a computer file which gives instructions to the fabrication device to make the object.  This file can be expressed in text and looks something like this:

        solid cube_corner
          facet normal 0.0 -1.0 0.0
            outer loop
              vertex 0.0 0.0 0.0
              vertex 1.0 0.0 0.0
              vertex 0.0 0.0 1.0
            endloop
          endfacet
        endsolid

Now, this is hardly The Federalist Papers or Common Sense, but it is text which you could read on a soapbox on the corner to the bewilderment of everybody except for a few aspies furiously scribbling down the numbers to take back to their workshops, or publish in a newspaper or pamphlet.
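
To underscore that such a file is nothing more than text, here is a minimal sketch which reads an ASCII STL model with ordinary string handling and reports what it contains.  The file name is hypothetical; assume the fragment above has been saved as “cube_corner.stl”:

        # Minimal ASCII STL reader: counts facets and collects vertices.
        def read_stl(path):
            facets, vertices = 0, []
            with open(path) as f:
                for line in f:
                    words = line.split()
                    if not words:
                        continue
                    if words[0] == "facet":
                        facets += 1
                    elif words[0] == "vertex":
                        vertices.append(tuple(float(w) for w in words[1:4]))
            return facets, vertices

        facets, vertices = read_stl("cube_corner.stl")   # hypothetical file
        print(facets, "facet(s);", len(vertices), "vertices")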

A federal judge in the U.S. state of Washington has issued an order to block the distribution of computer files which the settlement permitted to be disseminated on 2018-08-01.  (Lest one confuse these judicial tyrants with those chronicled in the seventh book of the Bible, recall Jerry Pournelle’s reminder to mentally replace “judge” with “lawyer in a dress”.  This one was appointed by Bill Clinton.)

This is a fundamental attack on freedom of speech.  It asserts that computer files and their dissemination via electronic means are not protected speech, and that the design of an object can be restricted in the same way the physical object can.  These are ideas so stupid only an intellectual could believe them.

Now, I spent some years of my life building tools to create electronic designs and models of objects in the physical world.  This technology has become central to almost everything we do, from fabrication of microcircuits to automobiles to the creation of imaginary worlds for entertainment.  I am, as they say, invested in this.

This lawyer in a dress is saying that my speech, and your speech, in electronic form, distributed electronically, is not subject to the protections granted his, spoken in a courtroom or printed on paper.  He is saying that such speech can be regulated based upon its content, which the founders of the republic that pays his generous salary (extracted by implicit threat at gunpoint from hairdressers and cab drivers who only want to be left alone) rejected in the very first amendment to their Constitution.

As I said, we’ve been here before.  In the 1990s, when the fellow who appointed the lawyer in a dress was president and demonstrating by example his moral rectitude to the nation, the U.S. tried to make it a crime to disclose strong encryption, which would allow citizens to communicate without eavesdropping by the organs of U.S. state security.  Once again, they tried to declare computer code, in this case encryption algorithms and applications, “munitions” subject to export controls.  In the celebrated case of PGP, MIT Press published its source code as a book and challenged the U.S. government to prevent its publication.  The U.S. government backed down (but did not entirely abandon its stance), and encryption is now generally available.  (Want military-grade encryption entirely within your browser?   Here you go!)

We won then.  We must prevail now.  If the slavers win the argument that computer files are not subject to the protection of physical speech or print, then everything you publish here, or send in E-mail, or distribute yourself will be subject to the kind of prior restraint this lawyer in a dress is trying to impose on Defense Distributed.

This is where it all comes together—the most fundamental of freedoms—speech, self defence, and the autonomy of the individual against the coercive collectivist state.  If these things matter to you, consider joining Defense Distributed.

Disclosure:

Cody R. Wilson at Fourmilab, 2013-01-30

I provided some of the early developmental-phase funding of Defense Distributed.  To the right is a photo of Cody R. Wilson on his visit to Fourmilab in January of 2013.  I am the “patron” described (in not-so-complimentary terms) in his superb 2016 book Come and Take It.  In our conversation then, Cody persuaded me to get into Bitcoin.  That has repaid my support of Defense Distributed many, many times over.

This Week’s Book Review – Shale Boom: The Barnett Shale Play and Fort Worth

I write a weekly book review for the Daily News of Galveston County. (It is not the biggest daily newspaper in Texas, but it is the oldest.) My review normally appears Wednesdays. When it appears, I post the review here on the following Sunday.

Book Review

‘Shale Boom’ an even-handed look at fracking

By MARK LARDAS

July 24, 2018

“Shale Boom: The Barnett Shale Play and Fort Worth,” by Diana Davids Hinton, Texas Christian University Press, 2018, 192 pages, $30

Twenty years ago, the United States was running out of oil and gas. Fracking changed everything. Today, the United States is the world’s largest producer of petroleum products.

“Shale Boom: The Barnett Shale Play and Fort Worth,” by Diana Davids Hinton tells the history of a key part of that transformation. It examines how the Barnett Shale helped trigger the fracking revolution, and explores its consequences.

Hinton puts fracking in its historical context. It was not new. Some form of fracturing was done as early as the 1920s. This included injecting liquids into wells under high pressure — hydraulic fracturing. Hinton reveals what was new. The Barnett Shale is a large but narrow layer of oil-bearing rock beneath Fort Worth and the area west of it. With the fracking techniques of the 1980s and 1990s, its wells failed to yield economic levels of gas and oil.

George Mitchell owned lease rights in the area. Hinton shows how the Galveston-born Mitchell financed new fracking techniques. The new technology unlocked the Barnett Shale, producing unprecedented levels of natural gas. Directional drilling techniques developed during this century’s first decade multiplied yields.

It kick-started a shale gas boom around Fort Worth. Much of the best yield area was under Fort Worth, complicating things. What followed included some craziness of the type accompanying every oil boom. Hinton traces the action.

Hinton looks at the impact urban drilling had on both drillers and residents. She also examines the bust inevitably following a boom, the backlash against drilling, and the impact of environmental concerns fueled by fear of fracking.

Hinton is refreshingly even-handed. She looks at both the benefits and costs (societal and environmental as well as financial) of drilling and the hydrocarbon industry. She also explores both the benefits and excesses of environmental opposition to fracking. Hinton is unafraid to expose the follies and dodgy activities of individuals in both drilling and the environmental movement.

Hinton closes with an examination of the impacts of fracking — long and short term — around Fort Worth, and its global implications. “Shale Boom” is a fascinating and balanced look at what technology revolutions yield.

 Mark Lardas, an engineer, freelance writer, amateur historian, and model-maker, lives in League City. His website is marklardas.com.


Total Lunar Eclipse: Now that’s a dark one!

On the night of July 27, 2018 (as reckoned in Universal Time, which we use at Ratburger.org), the longest total lunar eclipse of the 21st century occurred.  As I am writing this, the total eclipse is just about to end.  The eclipse is not visible from the Western Hemisphere.

Total lunar eclipse: 2018-07-27, mid-eclipse

At mid-eclipse, the Moon had just risen above the trees on the southwest horizon at Fourmilab.  This is easily the darkest eclipse I have ever seen (I missed the super-dark one in the early 1960s, but I have seen many since then).  The Moon was easily observed, but no detail was obvious and only the edge of the disc was well-distinguished.  Auto-focus with a camera was impossible, as was manual focus since the image was too dark to see adequately in the DSLR finder—I had to “bracket” manual focus and hope I’d get it right on one of the shots.  Obtaining a useful image required an ISO setting of 1600 and an exposure time of 1/3 second at the f/5.6 maximum aperture of the zoom lens at 300 mm focal length.  I have processed the above in-camera JPEG image to approximate the visual impression of the eclipsed Moon (the camera got the intensity about right, but over-saturated the colours).
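
For a sense of just how dark that is, the exposure settings pin it down.  Here is a quick sketch using the standard exposure-value formula; the “Looney 11” full-Moon baseline is a photographer’s rule of thumb I am assuming for comparison, not a figure from this post:

        # Exposure value referenced to ISO 100:
        #   EV = log2(N^2 / t) - log2(ISO / 100)
        from math import log2

        def ev100(f_number, shutter_s, iso):
            return log2(f_number ** 2 / shutter_s) - log2(iso / 100)

        eclipse = ev100(5.6, 1 / 3, 1600)    # settings from this post
        full_moon = ev100(11, 1 / 100, 100)  # "Looney 11" rule of thumb
        print(round(eclipse, 1))             # ~2.6
        print(round(full_moon, 1))           # ~13.6
        print(round(2 ** (full_moon - eclipse)))  # ~2000x dimmer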

This eclipse was so long (totality around 103 minutes) because the Moon passed almost directly through the centre of the Earth’s shadow while it was, simultaneously, near apogee, causing it to appear small compared to the size of the shadow.  With the entire Moon as deep as possible within the shadow, the eclipse was correspondingly dark.  Other factors affecting the brightness of a lunar eclipse are the weather around the limb of the Earth during the eclipse, the presence of particulates from volcanic eruptions in the atmosphere, and possibly solar activity: these differ from eclipse to eclipse and cannot be reliably predicted.

By coincidence, Mars is at perihelion and in opposition at the same time as the eclipse.  This only happens every 25,000 years.

“Eugenics” by Another Name…

Replying to a tweet:

Everyone will stay opposed to ‘eugenics’… right up until the microsecond that they can use it to give their own kids an advantage in life.

Me:

We just re-brand it as “Pro-choice” – problem solved! Progressives on-board! (Some of that “lateral thinking” I’ve been hearing about.)

So, who all here has read Heinlein’s first (published) novel Beyond This Horizon? It skillfully explored the ethics of ‘eugenics’ and also a heavily armed, and thus extremely polite, society. Heinlein had the government run the (voluntary) eugenics program and distribute a Basic Income (just how topical to 2018 can a 1942 novel be?!)

My current take: Unless we do some kind of World-Treaty, eugenics arms race with the Red Chinese started approximately last month. We just don’t know it yet. And as Geoffrey Miller so pithily notes, no one is going to unilaterally disarm.


This Week’s Book Review – Blue Collar Space

I write a weekly book review for the Daily News of Galveston County. (It is not the biggest daily newspaper in Texas, but it is the oldest.) My review normally appears Wednesdays. When it appears, I post the review here on the following Sunday.

Book Review

Everyday jobs turn wondrous in ‘Blue Collar Space’

By MARK LARDAS

July 18, 2018

“Blue Collar Space,” by Martin Shoemaker, Old Town Books, 2018, 244 pages, $11.99

What will it be like when humans are living and working in space? Ordinary folk, like those who live down your street?

“Blue Collar Space,” by Martin Shoemaker offers one vision. It is a collection of short science fiction stories set on the moon, on Mars, and in Jupiter orbit.

The settings are exotic. The jobs are ordinary. EMTs, sanitation workers, teachers, doctors, factory workers and miners feature in these stories. A few stories fall into the category of space adventure. “Not Close Enough” deals with a first manned mission to Mars — sort of a first manned mission to Mars. The explorers from NASA, ESA, Roscosmos, JAXA, and space agencies from India, Australia and China are not allowed closer to Mars’ surface than Martian orbit. There is a sort of spy adventure in the short story “Black Orbit,” with smugglers and secret agents.

Yet most deal with life and work of an everyday sort; dirty jobs in a space setting. A rescue team is sent to assist crash survivors in “Scramble.” A young girl must find help for her injured father — on the surface of the moon — in “Father-Daughter Outing.” The complexities of running a sanitation system in a lunar city get explored in “The Night We Flushed the Old Town.” A children’s survival class instructor on Mars has to figure out how to fix things when something goes wrong in “Snack Break.” A moon prospector grapples with the discovery that starring in a moon-based kiddie show really is significant in “A Sense of Wonder.”

It is not dull. Shoemaker shows the adventure in doing things that on Earth are ordinary when they must be done in a hostile environment like space. Being on a spaceship, a space station, or the surface of the moon or Mars changes things. He writes with a crisp and engaging style that draws readers into the tale. The result is fascinating reading.

“Blue Collar Space” captures what life will really be like when we finally get off Earth and move into space. It will be commonplace, yet at the same time it will be wonder-filled.

 Mark Lardas, an engineer, freelance writer, amateur historian, and model-maker, lives in League City. His website is marklardas.com.

