What happened in 1914 that ended the golden age of microbiology?

I was reading a microbiology textbook and all it said was "The work that began with Pasteur started an explosion of discoveries in microbiology. The period from 1857 to 1914 has been appropriately named the Golden Age of Microbiology." I know that World War 1 began then, but the book doesn't go into detail about why the Golden Age actually stopped in 1914.

There doesn't need to be any particular event in 1914. The starting and ending dates of historical periods are, to varying extents, somewhat arbitrary. It is relatively easy to point at the peak of a movement. At which point its rise constitutes the start of an era, and by which point we can classify its decline as an end, is much more a matter of opinion.

In this case, microbiology experienced a flourish of discoveries in the late 19th century that established it as a modern scientific discipline. It seems that the number of major breakthroughs peaked around the 1880s or so. A brief survey seems to suggest that the pace of discoveries was in a relative decline by the early 20th century. This is not, of course, to say that important research wasn't being carried out, but the rhythm definitely seemed to have settled down. All golden ages have to end and it seems that microbiology's time had come.

However, although it seems apparent that the microbiological golden age had peaked, its decline is ultimately only a trend. There were no major breaks that could be chosen as a naturally obvious end date. That is, until the First World War shattered the Belle Époque. Given this context, the monumental calamity of the Great War seems like quite a logical cut-off point for the Golden Age moniker.

See the following tables of major microbiological discoveries for examples of the aforementioned decline. While this is by no means a comprehensive survey of the discipline, it does illustrate a broad general trend.

On the first list, note the 12-year gap between Shiga Kiyoshi, the discoverer of the cause of dysentery, and Paul Ehrlich, discoverer of the first real treatment for syphilis.

(Source: Rao, Dubasi Govardhana. Introduction to Biochemical Engineering. Tata McGraw-Hill Education, 2010.)

A similar pattern of slowing down can be observed in this next list, which deals exclusively with causative agents of microbial diseases.

Source: (Kokare, C. R., Pharmaceutical Microbiology Principles and Applications. 6th ed. Pune: Nirali Prakashan; 2008.)

Lastly, two lists on bacteriology and virology:

Source: (Nagoba, B. S., Microbiology for Nurses. 2nd ed.)

Not just microbiology: virtually all sciences went on hold during the war. Research plummeted. Many schools had trouble attracting students, who were being drafted into the war, and had to charge less tuition as demand for advanced education dropped. Also, unlike WW2, there were no deferments for students or scientists. If you were a 23-year-old biologist you were just as likely to get conscripted as a 23-year-old welder. Many promising scientists were killed in the trenches. One of the most brilliant physicists of the 20th century, Henry Moseley, was killed at 27 on the beaches of Gallipoli. Many other promising young scientists met the same fate.

It was the beginning of World War I, and the global disruption engendered by that war.

It wasn't just the "golden age of microbiology" but almost "the golden age of everything" that ended in 1914. It was succeeded by the "age of catastrophe," which most authorities agree began in 1914, though they use different ending dates. The two World Wars were hugely destructive of manpower and economies. To a lesser extent, this was true of the depressed 1930s. And even the relatively calm 1920s were a period of the "suspicion" and anti-intellectualism characteristic of the interwar period, which generally hindered scientific research. By the 1930s, some of Germany's best (Jewish) scientists were being summarily fired from their posts and forced to emigrate.

Within "microbiology" there were some exceptions to the rule. One example is virology, which got a relatively late start in the 1890s, and had arguably its most prolific era in the 1920s and 1930s. But that was a "special case," scientifically, that was an exception to the general trend in science and elsewhere.

The Plague of Athens Leading to the Fall of the Golden Age of Greece

Toxigenic fungi and mycotoxins entered human food supplies about the time when mankind first began to cultivate crops and to store them from one season to the next, perhaps 10,000 years ago. The storage of cereals probably initiated the transition by mankind from hunter-gatherer to cultivator, at the same time providing a vast new ecological niche for fungi pathogenic on grain crops or saprophytic on harvested grain, many of which produced mycotoxins. Grains have always been the major source of mycotoxins in the diet of man and his domestic animals. In the historical context, ergotism from Claviceps purpurea in rye has been known probably for more than 2000 years and caused the deaths of many thousands of people in Europe in the last millennium.

It is suggested that the outbreak of the 'plague' in Athens in the 5th century BCE was caused by moldy food containing immunosuppressive mycotoxins, including the irritant T-2 toxin produced by certain Fusarium micro-fungi. Attica, a small, hilly, coastal city-state, had to import its cereal grains from overseas. However, because of the Peloponnesian War and the occupation of its port, Piraeus, by enemy forces, scarcity of food may have forced the population of overcrowded Athens to consume any food, even when moldy. This happened in the Orenburg district of the USSR during World War II, when an outbreak of 'alimentary toxic aleukia' caused a great number of fatalities. The Plague of Athens was an epidemic that devastated the city-state of Athens in ancient Greece during the second year of the Peloponnesian War (430 BCE), when an Athenian victory still seemed within reach. It is believed to have entered Athens through Piraeus, the city's port and sole source of food and supplies. Much of the eastern Mediterranean also saw outbreaks of the disease, albeit with less impact. The plague returned twice more, in 429 BCE and in the winter of 427/426 BCE. Some 30 pathogens have been suggested as causing the plague.

Sparta and its allies, with the exception of Corinth, were almost exclusively land-based powers, able to summon large land armies that were very nearly unbeatable. Under the direction of Pericles, the Athenians pursued a policy of retreat within the city walls of Athens, relying on Athenian maritime supremacy for supply while the superior Athenian navy harassed Spartan troop movements. Unfortunately, the strategy also resulted in adding many people from the countryside to an already well-populated city, introducing severe crowding as well as resource shortages. Due to the close quarters and poor hygiene of that time, Athens became a breeding ground for disease, and many citizens died, including Pericles, his wife, and his sons Paralus and Xanthippus. In the history of epidemics, the 'Plague' of Athens is remarkable for its one-sided affliction and its influence on the ultimate outcome of a war.

In his History of the Peloponnesian War, the historian Thucydides, who was present, contracted the disease himself, and survived, describes the epidemic. He writes of a disease coming from Ethiopia and passing through Egypt and Libya into the Greek world:

a plague so severe and deadly that no one could recall anywhere its like, and physicians ignorant of its nature not only were helpless but themselves died the fastest, having had the most contact with the sick.

In overcrowded Athens, the disease killed an estimated 25% of the population. The sight of the burning funeral pyres of Athens caused the Spartans to withdraw their troops, as they were unwilling to risk contact with the diseased enemy. Many of Athens' infantry and expert seamen died, as did their general Pericles. After the death of Pericles, Athens was led by a succession of leaders Thucydides described as incompetent or weak. According to Thucydides, not until 415 BCE had Athens recovered sufficiently to mount a major offensive, the disastrous Sicilian Expedition.

Accounts of the Athenian plague graphically describe the social consequences of an epidemic. Thucydides' account clearly details the complete disappearance of social morals during the time of the plague. Thucydides states that people ceased fearing the law since they felt they were already living under a death sentence. Likewise, people started spending money indiscriminately. Many felt they would not live long enough to enjoy the fruits of wise investment, while some of the poor unexpectedly became wealthy by inheriting the property of their relatives. It is also recorded that people refused to behave honorably because most did not expect to live long enough to enjoy a good reputation for it.

Another reason for the lack of civilized behavior was the sheer contagiousness of the illness. Those who tended to the ill were most vulnerable to catching the disease. This meant that many people died alone because no one was willing to risk caring for them. The dead were heaped on top of each other, left to rot, or shoved into mass graves. Sometimes those carrying the dead would come across an already burning funeral pyre, dump a new body on it, and walk away. Others appropriated prepared pyres so as to have enough fuel to cremate their own dead. Those lucky enough to survive the plague developed an immunity and so became the main caretakers of those who later fell ill.
A mass grave and nearly 1,000 tombs, dated between 430 and 426 BCE, have been found just outside Athens' ancient Kerameikos cemetery. The mass grave was bordered by a low wall that seems to have protected the cemetery from a wetland. Excavated during 1994-95, the shaft-shaped grave may have contained a total of 240 individuals, at least ten of them children. Skeletons in the graves were randomly placed with no layers of soil between them.

Excavator Efi Baziotopoulou-Valavani, of the Third Ephoreia (Directorate) of Antiquities, reported that:

“[T]he mass grave did not have a monumental character. The offerings we found consisted of common, even cheap, burial vessels: black-finished ones, some small red-figured ones, as well as white lekythoi (oil flasks) of the second half of the 5th century BC. The bodies were placed in the pit within a day or two. These [factors] point to a mass burial in a state of panic, quite possibly due to a plague.”

The plague also caused religious uncertainty and doubt. Since the disease struck without regard to a person's piety toward the gods, people felt abandoned by the gods and there seemed to be no benefit to worshiping them. The temples themselves were sites of great misery, as refugees from the Athenian countryside had been forced to find accommodation in the temples. Soon the sacred buildings were filled with the dead and dying. The Athenians pointed to the plague as evidence that the gods favored Sparta, and this was supported by an oracle that Apollo himself (the god of disease and medicine) would fight for Sparta if they fought with all their might. An earlier oracle had warned that:

“A Dorian [Spartan] war will come, and bring a pestilence with it.”

Thucydides is skeptical of these conclusions and believes that people were simply being superstitious. He relies upon the prevailing medical theory of the day, Hippocratic theory, and strives to gather evidence through direct observation. He notes that:

All the birds and beasts that prey upon human bodies either abstained from touching them (though there were many lying unburied) or died after tasting them. In proof of this, it was noticed that birds of this kind actually disappeared; they were not about the bodies, or indeed to be seen at all.

Carrion-eating birds and animals thus disappeared, though Thucydides leaves it an open question whether they died after eating the corpses or refused to eat them and were driven away.

Historians have long tried to identify the disease behind the Plague of Athens. The disease has traditionally been considered an outbreak of the bubonic plague in its many forms, but reconsiderations of the reported symptoms and epidemiology have led scholars to advance alternative explanations. These include typhus, smallpox, toxigenic fungi, measles, and toxic shock syndrome. Based upon striking descriptive similarities with recent outbreaks in Africa, as well as the fact that the Athenian plague itself apparently came from Africa (as Thucydides recorded), Ebola or a related viral hemorrhagic fever has also been considered. Given the possibility that profiles of a known disease may have morphed over time, or that the plague was caused by a disease that no longer exists, the exact nature of the Athenian plague may never be known. In addition, crowding caused by the influx of refugees into the city led to inadequate food and water supplies and a probable proportionate increase in insects, lice, rats, and waste. These conditions would have encouraged more than one epidemic disease during the outbreak. However, advancing scientific technologies may reveal new clues. Unfortunately, DNA sequence-based identification is limited by the inability of some important pathogens to leave a “footprint” retrievable from archaeological remains after several millennia. The lack of a durable signature by RNA viruses means some etiologies, notably the hemorrhagic fever viruses, are not testable hypotheses using currently available scientific techniques.

What Happened to My Future: The Pivotal Years of 1914, 1929, and 2020

Walter G. Moss is a professor emeritus of history at Eastern Michigan University, a Contributing Editor of HNN, and author of An Age of Progress? Clashing Twentieth-Century Global Forces (2008).

This car carried Archduke Franz Ferdinand when he was assassinated.

Museum of Military History, Vienna. Photo Alexf CC BY-SA 3.0

As the years 1914, 1929, and 2020 began, many people thought their futures looked good. This optimistic feeling was captured well in the memoir of the Austrian Jewish writer Stefan Zweig:

When I attempt to find a simple formula for the period in which I grew up, prior to the First World War, I hope that I convey its fullness by calling it the Golden Age of Security. . . .

. . . The world offered itself to me like a fruit, beautiful and rich with promise, in that radiant summer [of 1914]. And I loved it for its present, and for its even greater future.

Then, . . . [on 28 June 1914], in Sarajevo, the shot was fired which in a single second shattered the world of security and creative reason in which we had been educated, grown up and been at home--shattered it like a hollow vessel of clay.

That assassin’s shot, which killed Austrian Archduke Franz Ferdinand, began a sequence of events that by August 4 entangled most of Europe’s powers in World War I (WWI). In 1917, the USA entered the conflict, and later that year the war was among the factors that brought the Communists under Lenin into power in Russia. By the time the war ended in November 1918, about 10 million fighting men had been killed--in France, a horrific 3 of every 10 between the ages of 18 and 28.

In addition to the military deaths, millions of civilians died because of the war. Even after it ended, many more millions of men and women continued to die in the chaos of the Russian Civil War and a continuing worldwide influenza pandemic, both partly due to WWI.

The Centers for Disease Control and Prevention (CDC) estimates those pandemic deaths at a minimum of “50 million worldwide with about 675,000 occurring in the United States,” where in the spring of 1918 the flu was first discovered among military personnel. The movement and dislocation of people caused by war aided its global spread. Like the war itself, the flu epidemic killed many young adults, those who before 1914 still had hopes for a brighter future.

By 1920 the world had changed greatly, especially in Europe, where so many young men lay dead. Old empires, like the Austro-Hungarian one, had disintegrated, and new nations and borders sprang up. In the USA, the Republican candidate Warren Harding was elected, seeking a return to “normalcy.” “Poise has been disturbed, and nerves have been racked,” he said. What America needed was “not heroics, but healing; not nostrums, but normalcy; not revolution, but restoration . . . not submergence in internationality, but sustainment in triumphant nationality.”

And throughout the 1920s, until the Great Depression began in October 1929, many people tried to reaffirm traditional values. Prohibition began in January 1920 and lasted until 1933. Three Republican presidents dominated the decade. Refusal to join the League of Nations and isolationism characterized our foreign policy, and in the famous Scopes “monkey trial” of 1925, a high school biology teacher was found guilty of teaching Darwinian evolution instead of the biblical account of creation. But rapidly developing technology like the radio, movies, the Ford Model T, and a constant stream of other new consumer products undercut old ways. In Europe Zweig noted that “all values were changed,” and “anything that gave hope of newer and greater thrills . . . found a tremendous market.” In the USA, despite Prohibition there were speakeasies that sold illegal liquor, and new dances that shocked traditionalists. In 1920, women across the USA for the first time were able to vote for a president. A U.S. college president expressed feelings shared by many conservative Americans when he said, “The low-cut gowns, the rolled hose, and the short skirts were born of the Devil.”

The cataclysmic events of 1914-1919--WWI, the Russian revolution and civil war, the influenza epidemic, the Paris Peace Conference and the treaties that resulted from it--greatly affected the young adults who survived. This was especially true for many European veterans who served in the war--notably Hitler and Mussolini (ages 25 and 31 respectively in August 1914)--but also for some who never fought, like the American writer F. Scott Fitzgerald (only 21 when he dropped out of Princeton to join the army in 1917).

In the 1920s he and some of his fellow writers like Ernest Hemingway became known as part of the “Lost Generation.” The latter author began his The Sun Also Rises (1926) with the epigraph “You are all a lost generation.” The term suggested that because of the horrific and often pointless deaths of millions, many of the young had lost faith in abstract ideals and values like patriotism and courage. They were “lost” in being aimless, lacking purpose and drive. In his novel of WWI, A Farewell to Arms (1929), Hemingway has his protagonist Frederick Henry say that “abstract words such as glory, honor, courage or hallow were obscene.”

Europe’s writers were even more affected by the war. Some, like the poet Wilfred Owen, died in battle. Others, as Robert Wohl indicated in The Generation of 1914, remained “under the spell of the experience they had undergone during the critical decade between 1910 and 1920. They were never able to free themselves from the conviction, given lasting shape by the war, that they were living through an apocalypse. . . . They had yet to absorb what it meant to live in a society characterized by persistent rather than occasional change. . . . They wavered uncertainly and unpredictably between a desire to spring forward into the future and a longing to return to the hierarchies and faith of the past.”

But the past could not be brought back. The war had changed too much. Unemployment and inflation troubled many European nations, as did left-right political clashes. Unsettled conditions helped Mussolini come to power in 1922 and Hitler to attempt a takeover in Munich in 1923. Although that attempt failed, the Great Depression later helped him become German chancellor in 1933.

Until that cataclysmic economic meltdown began in October 1929 in the United States, conditions looked rosy for many young Americans. At the beginning of 1928 Calvin Coolidge was still president, and in 1925 he had said “the chief business of the American people is business.” On January 1, 1929 The New York Times editorialized that 1928 had been a year “of unprecedented advance, of wonderful prosperity. . . . If there is any way of judging the future by the past, this new year will be one of felicitation and hopefulness.” Between 1921 and 1929, U.S. industrial production doubled. As compared with pre-war years, many more students were now going to college, and jobs were plentiful. In Middletown, their famous study of 1920s Muncie, Indiana, the Lynds indicated that while the state’s population increased only about 25 percent from 1890 to 1924, the number of those graduating from “the State University” increased almost 800 percent. In late 1929, unemployment (3.2%) and inflation (less than 1%) were still low.

But then the Depression began, and with it increased misery for many Americans, young and old alike. By 1933 the unemployment rate was 25 percent and, despite Franklin Roosevelt’s New Deal, it declined only to 17 percent by 1939. Salaries also decreased for many, like professors, who were still lucky enough to have a job.

In Germany the unemployment rate was even higher by January 1933, which bolstered the popularity of Hitler and his Nazi party. He became chancellor of Germany at the end of that month. For the next six years he increased German armaments and made increasing demands on other countries before beginning World War II in September 1939 by invading Poland.

Finally, we have 2020 and the coronavirus pandemic, which by 15 May had sickened more than 4,431,700 people and killed at least 303,000 people in at least 177 countries, including at least 86,700 deaths in the USA. How many more will get sick and die is anyone’s guess. Based on our past experience and available scientific knowledge, however, the number will be large--one reliable source in early May more than doubled its death forecast, mainly because so many states were beginning to ease social distancing and other requirements. The effects of the pandemic on numerous phases of our life are also likely to be significant. Wearing of face masks, social distancing, massive unemployment, educational and entertainment disruptions, closing of businesses, especially restaurants, an acceleration of online shopping, rethinking of the relationships between governmental bodies (at all levels) and individuals--all of these phenomena and more are now occurring. Like 1914 and 1929, 2020 will likely prove to be a pivotal year for many people, young and old, around the globe.

How pivotal will depend in major ways on the results of vital U.S. elections in November. President Trump, who will be seeking reelection, is in some ways like the three Republican presidents elected in the 1920s. Like Harding, he wishes for a return to normalcy, and Trump’s “make America great again” reminds us of Harding’s call for a “triumphant nationality.” Like Coolidge, he champions business interests before those of the needy--though his demeanor is hardly like that of “silent Cal.” And like Hoover, who started off his presidency with an expanding economy and low unemployment only then to plummet into the Depression, Trump has not heretofore handled the most serious crisis of his presidency (coronavirus) well.

If the Democratic nominee (presumably Joe Biden) is elected, the pandemic, plus four years of Trumpism, is likely to spur major changes just as the Depression and four years of Herbert Hoover helped produce FDR’s New Deal. Dealing with the coronavirus, plus reversing Trump’s environmental, tax, and medical policies, might just be a start.

Newspapers and journals are full of predictions about what our future might look like. But three of the wisest commentaries come from historians and contributors to the History News Network (HNN). Ed Simon, an HNN contributing editor, states that “what happens next is unclear for all of us,” and he quotes historian Rebecca Spang’s “The Revolution Is Under Way Already” (in The Atlantic): “everything is up for grabs.” Historian Steve Hochstadt, also on HNN, writes, “as for what that future will be like, at the moment I haven’t a clue.” But like Simon and Spang he asserts, “we, men and women, can influence how we come out of this chaos.” How we vote in November will dramatically impact that “how”--and all of our futures.

Zora Neale Hurston

Anthropologist and folklorist Zora Neale Hurston courted controversy through her involvement with a publication called FIRE!!

Helmed by white author and Harlem writers’ patron Carl Van Vechten, the magazine exoticized the lives of Harlem residents. Van Vechten’s previous fiction stirred up interest among whites to visit Harlem and take advantage of the cultural and nightlife there.

Though Van Vechten’s work was condemned by older luminaries like Du Bois, it was embraced by Hurston, Hughes and others.


The middle of the 20th century has often been described as a golden age of scientific advancement and miraculous medical breakthroughs, as well as an era in which the medical profession and the institutions it controlled enjoyed especially high public regard. Observers as varied as science writer and journalist Paul de Kruif, AMA spokesperson Morris Fishbein, filmmaker Hal Wallis, medical sociologist Eliot Freidson, and historian John Burnham have been intrigued by this golden era. They and the scores of others who have written about and analyzed different aspects of American health culture during this era did not always agree on the exact parameters of this age of miracles. Some would have it begin as early as the bacteriological revolution of the late 19th century. Others look to the flowering of scientific research and pharmaceutical development associated with World War I, and still others point to the changes in medical education and public health often associated with the Progressive Era. Similarly, there is a lack of agreement about a suitable end point for the Golden Age, although there is fairly widespread consensus that it was over by the late 1960s and early 1970s. The passage of Medicare and Medicaid legislation over the strenuous opposition of the medical profession, as well as the changing nature of the medical marketplace and the increasing chorus of scholarly, journalistic, and consumer-oriented critiques of medicine, signal the end of an era for most observers. Taking these variations into account, we have focused our attention on the three decades at the core of most accounts of the golden age: the 1930s, the 1940s, and the 1950s.

Differences over the exact chronological parameters of the golden age resonate with the wide variation this era’s observers give to the question: what made it a golden age? The varying answers to this question depend substantially on the kind of evidence the observer favored. The more demographically inclined point to the declining mortality rate for infectious disease or to the increase in the average American’s life span. Others more concerned with the issue of the medical practitioner’s high social status point to the consistently high ratings that physicians and scientists received from their fellow citizens in opinion polls. Or they emphasize the popular culture image of doctors, nurses and medical scientists found in novels and movies which often cast them in a heroic light. More economically minded analysts of the golden age have pointed to the slow but steady rise in physicians’ incomes and the expanding role of costly hospital care as evidence for their arguments. Another set of criteria used to judge this a golden age are accounts of this era’s medical wonders – sulfa, blood typing, the polio vaccine, penicillin, etc. For interested mid-twentieth century observers, there was no lack of evidence to support a characterization of their time as one of medical triumph and advancement in American health.

For many, this was an era in which no problem seemed unsolvable. There were no diseases that science could not ultimately defeat, no surgical procedure or medical product that could not be mastered, and no technology that could not be harnessed. And yet, this was a period in American health history rocked by some dramatic medical scandals, such as the 1937 sulfanilamide poisoning, which cost the lives of nearly a hundred citizens. This was also an era in which hospitals and the latest technology remained clustered in the larger urban areas, while many of the poorer and more rural parts of the country faced shortages in medical basics, including doctors and nurses. During the same years, cancer, heart disease, and a host of other chronic health conditions remained significant problems for an American health care system with its hospital-based, acute-care orientation. Rising costs of many kinds of medical care and the economic disruptions of a devastating depression and a world war further complicated the health lives of many Americans. How could the years of the Great Depression and the Second World War be part of a golden age in American health history?

In the following sections of our website you will find materials that will help you explore these questions and other aspects of American health culture in the 1930s, 1940s, and 1950s. Please feel free to indulge your own curiosity, but be aware that we have focused our attention on the following questions:

Stresemann and the Dawes Plan

It was at this moment of crisis that Gustav Stresemann was elected as chancellor in September 1923. Stresemann was a politician of the DVP, the German People’s Party. In 1923 he formed a coalition of the DVP, SPD, DDP and Centre Party and became chancellor.

To try to tackle the crisis gripping Germany, Stresemann followed a policy of ‘fulfillment’, whereby he aimed to improve international relations by attempting to fulfill the terms of the Treaty of Versailles. These improved relationships would then in turn help him to secure a reasonable revision of the treaty.

Following this policy, Stresemann made the unpopular decisions to start repaying the reparations and to order the striking workers of the Ruhr to return to work.

In April 1924, Stresemann’s policy of fulfillment paid off. An American economist named Charles Dawes was recruited to help to set a new, realistic, target for Germany’s reparations payments. This was called the Dawes Plan.

Under this plan, the reparations were reduced to 50 million marks a year for the next five years, and then 125 million marks a year after that. The plan also recommended that the German National Bank be reorganised and that Germany receive an international loan. This loan was for 800 million gold marks, financed primarily by America.
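Taking the payment schedule exactly as described above, the cumulative reparations over the first decade can be worked out in a few lines. This is a purely illustrative calculation using the text's own figures, not a figure stated in the text:

```python
# Dawes Plan schedule as stated above: 50 million marks a year for five
# years, then 125 million marks a year thereafter.
phase1_annual, phase1_years = 50, 5   # million marks
phase2_annual, phase2_years = 125, 5  # million marks, next five years

# Cumulative payments over the first ten years under that schedule.
first_decade_total = phase1_annual * phase1_years + phase2_annual * phase2_years
print(first_decade_total)  # 875 (million marks)
```

Notably, the 800-million-gold-mark international loan mentioned above would, by itself, nearly cover a full decade of payments at these rates.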

These measures eased the economic pressure on Germany, and relations with other countries began to improve and then stabilise.

This economic improvement, as well as improvements in foreign relations, led to the years between 1924 and 1929 becoming known as the ‘Golden Years’.

The 1870-1914 Gold Standard: The Most Perfect One Ever Created

The most perfect monetary system humans have yet created was the world gold standard system of the late 19th century, roughly 1870-1914. We don’t have to hypothesize too much about what a new world gold standard system could look like. We can just look at what has already been done.

Contrary to popular belief, people generally did not conduct commerce with gold coins. Yes, gold coins existed, but people mostly used paper banknotes and bank transfers, just as they do today. In 1910, gold coins comprised $591 million out of total currency (base money) of $3,149 million in the United States, or 18.7%. These gold coins were probably not used actively, and served more as a savings device, in a coffee can for example.

Silver coins were also used, but by then they had become token coins, just like our token coins today. By 1910, most countries in the world officially had “monometallic” monetary systems, with gold alone as the standard of currency value. This eliminated many of the difficulties of bimetallic systems, which had caused minor but chronic problems in the earlier 19th century.

Also contrary to popular belief, there was no “100% bullion reserve” system, in which each banknote was “backed” by an equivalent amount of gold bullion in a vault. In the United States in 1910, gold bullion reserve coverage was 42% of banknotes in circulation.
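The shares quoted above are simple quotients of the figures given in the text. As a quick check, a minimal sketch using only the 1910 U.S. numbers cited here:

```python
# U.S. figures for 1910, as quoted in the text (in $ millions).
gold_coins = 591        # gold coin in circulation
base_money = 3_149      # total currency (base money)

# Gold coin as a share of base money: about 18.8%
# (the text truncates this to 18.7%).
share = gold_coins / base_money
print(f"Gold coin share of base money: {share:.1%}")
```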

For other countries, we can refer to Monetary Policy Under the International Gold Standard: 1880-1914, by Arthur Bloomfield, published in 1959. Bloomfield provides references to major central bank balance sheets around the world. He summarizes various “reserve ratios,” but includes not only gold bullion but also foreign exchange reserves (i.e., bonds denominated in foreign gold-linked currencies). The “reserve ratios,” on this basis for 1910, were 46% in Britain, 54% in Germany, 60% in France, 41% in Belgium, 73% in the Netherlands, 68% in Denmark, 80% in Finland, 75% in Norway, 75% in Switzerland, 55% in Russia, and 62% in Austria-Hungary. Reserve ratios for gold bullion alone would be, naturally, less than these numbers.

A number of countries had variations on a “gold exchange standard,” which is to say, a currency board-like system linked to a gold-linked reserve currency (usually the British pound). This became more common in the 1920s, and especially during the Bretton Woods period, but it was in regular use pre-1914 as well. Bloomfield lists countries on some form of a “gold exchange standard,” including: Russia, Japan, Austria-Hungary, the Netherlands, most Scandinavian countries, Canada, South Africa, Australia, New Zealand, India, the Philippines, and “a number of other Asiatic and Latin American countries, whose currency systems operated analogously to modern currency boards.” The pre-1914 era was the age of empire, and many of these countries were formally or informally within one or another European empire. Their currency systems also ended up being subsidiary to the currency of the imperial seat.

Most of the leading European countries had some sort of central bank, upon the model of the Bank of England. The U.S. did not, opting for a “free-banking” system (although one dominated by U.S. Treasury-issued banknotes). The countries with central banks also mimicked the Bank of England’s typical operating procedures, which included continuous involvement in credit markets by way of “discount” lending (short-term collateralized lending). This was not at all necessary, but was an outgrowth of the Bank of England’s history as a profit-making commercial bank. Thus, central banks also, in the fashion of the Bank of England, often managed the base money supply by way of their lending policies, including the “discount rate.”

The world gold standard did not produce some sort of “balance” in the “balance of payments” – in other words, no current account deficit or surplus. There was no “price-specie-flow mechanism.” These so-called “balance of payments imbalances” are another word for “international capital flows,” and capital flowed freely in those days. With all countries basically using the same currency – gold as the standard of value – and with legal and regulatory foundations normalized by European imperial governance, international trade and investment were easy.

It was the first great age of globalization. Net foreign investment (“current account surplus”) was regularly above 6% of GDP for Britain, and climbed to an incredible 9% of GDP before World War I. From 1880 to 1914, British exports of goods and services averaged around 30% of GDP. (In 2011, it was 19.3%.) In 1914, 44% of global net foreign investment was coming from Britain. France accounted for 20%, Germany 13%.

This river of capital flowed mostly to emerging markets. The United States, which was something of an emerging market in those days, although one that was already surpassing its European forebears (much like China today), was a consistent capital-importer (“current account deficit”). Most British foreign capital went to Latin America; Africa accounted for much of the remainder.

Gross global foreign investment rose from an estimated 7% of GDP in 1870 to 18% in 1914. In 1938, it had fallen back to 5%, and stayed at low levels until the 1970s.

In 1870, the ratio of world trade to GDP was 10%, and rose to 21% in 1914. In 1938, it had fallen back to 9%.

This explosion of European capital translated into tremendous investment around the world. British-governed India had no railways in 1849. In 1880, India had 9,000 miles of track. In 1929, there were 41,000 miles of railroad in India, built by British engineers, British capital, and Indian labor. British-governed South Africa opened its first railroad in 1860. This grew to 12,000 miles of track, not including extensions into today’s Zimbabwe and elsewhere in Africa.

The arrangement was largely voluntary. There were no fiscal limitations or centralized governing bodies, such as the eurozone has today. The Bank of England served mostly as an example to imitate. Countries could opt out if they wished, and several did from time to time, although they usually tried to rejoin later. The countries that had rather loose allegiance to gold standard principles should be no surprise: Argentina, Brazil, Spain, Italy, Chile, and Greece, among others.

With monetary stability assured by the gold standard system, bond yields fell everywhere to very low levels. Yields on long-term government bonds were 3.00% in France (1902), 3.26% in the Netherlands (1900), 2.92% in Belgium (1900), and 3.46% in Germany (1900). Corporate bonds followed along: the yield on long-term high-grade railroad bonds in the United States was 3.18% in 1900. Unlike today, these rock-bottom yields were not obtained by every sort of central bank manipulation imaginable, but reflected the long history and expectation of monetary and macroeconomic stability that the gold standard system provided. They could continue at these low levels for decades, and often did: from 1821, when Britain returned to a gold standard after a floating-currency period during the Napoleonic Wars, to 1914, the average yield on government bonds of infinite (!) maturity in Britain was 3.14%.

During the 20th century, and now into the 21st, no central bank in the world has been able to match this performance. They are not even in the same galaxy. No world monetary arrangement has provided even a pale shadow of that era’s incredible successes.

We could create an updated version of the world gold standard system of the pre-1914 era. However, there isn’t really much need to change things very much. It worked fine, and would still be working today if not for World War I, and soon after, the rise of Keynesian notions that governments could manage their economies by jiggering the currency. This requires a floating currency, which is why we have floating currencies today.

Once we finally abandon these funny-money notions – probably because of their catastrophic failure – it will be very easy to create, once again, a superlative world gold standard system.

Novel Antimicrobials

The antibiotic treatment choices for already existing or emerging hard-to-treat multidrug-resistant bacterial infections are limited, resulting in high morbidity and mortality rates. Although there are some potential alternatives to antibiotic treatment, such as passive immunization (Keller and Stiehm, 2000) or phage therapy (Levin and Bull, 2004; Monk et al., 2010), the mainstream approach relies on the discovery and development of newer, more efficient antibiotics. The vast majority of antimicrobial classes in use today were isolated in the golden era of antibiotic discovery from a limited number of ecological niches and taxonomic groups, mainly from soil actinomycetes. Further exploration of this ecological niche, coupled with newer technologies such as cell-free assays and high-throughput screening, however, has not produced any novel drug classes in the past 20+ years. What approaches could be taken to uncover novel antimicrobial diversity that is potentially suitable for therapeutic applications?

Some possible approaches to tapping novel antimicrobial diversity are the exploration of ecological niches other than soil, such as the marine environment (Hughes and Fenical, 2010; Rahman et al., 2010); borrowing antimicrobial peptides and compounds from animals and plants (Hancock and Sahl, 2006); mimicking the natural lipopeptides of bacteria and fungi (Makovitzki et al., 2006); accessing the uncultivated portion of microbiota through the metagenomic approach (MacNeil et al., 2001); and, finally, the use of the completely synthetic route pioneered during the early years of the antibiotic era. The latter approach has become dominant in the search for drugs aimed at newly identified targets in the bacterial cell. Other strategies may include drugs engineered to possess dual target activities, such as the rifamycin–quinolone hybrid antibiotic CBR-2092 (Robertson et al., 2008).

8. In 1992, the chairman of Paramount Pictures spent a whopping $2.5 million on a film treatment that was a mere two pages long.

Sherry Lansing had just ascended to the top spot at Paramount when she made a splash by paying Joe Eszterhas (pictured above) the huge sum of money for just two pieces of paper. The story becomes less shocking when you learn that Eszterhas was the world’s most famous screenwriter in the ‘90s, earning $26 million for the scripts he sold during the decade.

Eszterhas turned the two-page treatment into a feature-length erotic thriller titled Jade, which was released in 1995 starring David Caruso and Linda Fiorentino. Jade didn’t fare nearly as well as Eszterhas’ previous erotic thriller, Basic Instinct. It was a major bomb, grossing barely one-fifth of its budget. Amusingly, despite the huge payday for the treatment, Jade earned Eszterhas a Worst Screenplay nomination at the Golden Raspberry Awards.

CuriousKC | A Brief History of KC Radio and the Broadcaster Who Started It All

Considered a “broadcast pioneer” in the heartland, Church got his start in radio broadcasting in 1913 when he was in college.

According to the University of Missouri-Kansas City library, Church made his first big professional move in 1914. That year, Church helped found an experimental wireless station called 9WU at Graceland College in Lamoni, Iowa.

“Church became one of the first to use radio to advertise merchandise,” according to UMKC’s Arthur B. Church KMBC Special Collections.

What’s more, his station also covered local, national and international news, which can still be heard in the UMKC’s Marr Sound Archive. The station’s broadcasts ranged from music programs to news coverage of Pearl Harbor.

Church’s career in radio quickly took off.

Four years after founding 9WU, Church enlisted in the U.S. Signal Corps during World War I. The Signal Corps, a separate branch of the Army, was “responsible for implementing and designing radio technology.” Church taught telegraphy in Leavenworth, Kansas, where he was stationed.

When the war ended, he created the Central Radio Station and a school of the same name, according to UMKC’s special collections.

In 1921, Central Radio Station began operating channel 9 AXJ broadcasts. After changing hands a few times, 9 AXJ became WPE in 1922, then KMBC and, later, KMBZ. Church’s influence in the radio world came during what was known as the Golden Age of Radio, between 1920 and the 1950s.

Church also was credited with starting the first broadcasting studio – AXJ – in the Midwest.

According to Leah Weinryb Grohsgal’s blog for the National Endowment for the Humanities, his station was seen as the “model for the industry.”
