Everything I NEED to know about Islam I learned on 11 September 2001. Everything. I have friends who believe that there is such a thing as “moderate Islam.” I won’t go so far as to say that they are stupid — after all, stupid people are not allowed the privilege of being my friends — but they are naive for believing in a fairy tale like “moderate Islam.” Claiming that there is such a thing is like claiming that there was once such a thing as “moderate National Socialism” — and it shouldn’t come as a surprise to anyone that the Nazis and the Moslems were best of friends back in the day….
Islam is NOT a religion, folks, regardless of what its adherents and sympathizers will tell you. It is a totalitarian social and political ideology wrapped in religious trappings.
A religion, by definition, must allow for freedom of conscience: to truly follow a religion a person must arrive at the decision to do so through internal conviction, not external coercion. Inherent in any genuine religious system is the right to refuse to believe in it — or in any other religion, for that matter.
An ideology, on the other hand, does not tolerate dissent or non-conformity: everyone within its reach must adhere to its doctrines and dogmas, regardless of whether or not they personally accept the validity of those doctrines and dogmas. Everyone outside of the ideology’s reach must be brought into it, as their existence outside of the ideology’s control represents a threat not only to the ideology’s control over those it dominates, but also to the very existence of the ideology itself. Those of you who have studied Islam will immediately recognize that this is identical to the “Realm of Peace (the Moslem world)/Realm of War (the rest of the world)” dichotomy in Islamic doctrine. Going hand-in-hand with this is the fundamental doctrine in Islam that the existence of anyone outside of the “Realm of Peace” will not be tolerated — either they must be brought into the “Realm of Peace,” i.e., convert to Islam, or they must be eliminated. No alternatives are allowed in either the Koran or the hadiths.
Which leads to the next point in understanding why Islam is NOT a religion: no genuine religion will espouse the idea of conversion by force. The existence of a God or gods is not something that can be “proven,” in the scientific use of the word, so belief in a God or gods is, again by definition, a matter of faith. The exercise of such faith is a conscious, deliberate, voluntary act, one that in essence states, however openly or privately one might do so, “I have come to this point; no one has brought me here or made me come here. I now believe what I believe because it is what I have chosen — it is the conclusion of my spiritual search, and that of no one else.” Islam, however, has at its heart the doctrine of “conversion through coercion” — the use of force, including physical violence and threats of execution, to compel non-Moslems to renounce their own beliefs and embrace Islam. This is not some modern aberration or perversion of Islam; the doctrine began with Mohammed himself in 622 AD. Again, this is the methodology of totalitarianism: conform or die. History has provided us with more than sufficient examples of this, whether they be the Third Reich, the Soviet Union, Maoist China, or any one of a dozen tinpot dictatorships that sprang up around the world in the 20th Century.
In the end, it comes down to this: the violence that we are seeing today — and that was done on 11 September 2001 — is not simply a part of the fabric of Islam, it IS the fabric of Islam. Burying our heads in the sand, hoping it will go away, will not solve the problem. The world was shown the solution by men like Martel, Graf Salm, von Roggendorf, and Sobieski. The choice we have now is whether we will profit from their example. The solution does not lie in continuing to pretend that Islam is something far different from what it actually is.
And that’s the way it is…
…because I’m Daniel Allen Butler, and you’re not.
I’m finding myself looking very closely at undertaking a radical shift in how I will be doing business post-2016. The business paradigms of my profession are undergoing an evolution so rapid that it is not entirely inaccurate to call it a “revolution,” and despite the hopefulness of some of the people participating in it, like all revolutions it’s going to be messy and there will be a lot of collateral damage: a lot of really bad work that should never be allowed to see the light of day is going to find its way into print, whether dead-tree or live-electron, while a lot of genuinely good work is irretrievably lost in the sheer mass of noise.
The sad truth of the great problem plaguing the publishing industry is NOT that six Big Companies control 85% of what gets into print. The problem is that those six companies have been taken over, lock, stock, and thesaurus, by bean-counters who are incapable of recognizing good writing and allowing it to be properly processed into good reading. Because bean-counters, by definition, deal in concrete concepts and are incapable of grasping anything as abstract as the creative process (numbers are numbers, period; there is no such thing as “creative bookkeeping,” which is merely a euphemism for bald-faced lying combined with outright theft), they rely on numbers, projections, trends, and formulae, rather than talent, to make publication and marketing decisions. The Big Six have, in short, become the slaves of conventional wisdom.
The dominant modes of thinking are all too readily apparent. One is “This sort of thing has sold in the past, so it will sell now.” This results in travesties such as Hillary Clinton’s latest door-stop. Her first book, It Takes a Village (the publisher accidentally left the word Idiot off the end of the title), for which she was paid a massive seven-figure advance, couldn’t be given away two weeks after its release back in 1996, but because she is a “name” it is automatically assumed (because this model has frequently worked in the past) that she will inevitably bring an audience with her whenever a new book under her name is published. The model actually failed the first time around, and yet because the conventional wisdom holds that “a name” equals “an audience,” the model was again followed, and the result was an embarrassment for the entire industry. (Whether Clinton was embarrassed or not – or is even capable of feeling embarrassment – is fodder for another discussion.) Unfortunately for authors, readers, and the industry itself, the massive corporate inertia (read “cowardice”) works against any wide recognition that the “name = audience” model is irretrievably broken.
The other prevalent model is the constant repetition of formula. Clive Cussler, who happens to be a personal friend, has not written an original story since 1981, when Night Probe! was published. His Dirk Pitt novels have become the literary equivalent of the James Bond series of movies, at least through Die Another Day: he hit on a formula with Deep Six and Cyclops that worked (that is, it was commercially successful), and has continually rehashed it through another fifteen sequels. Don’t mistake me here, I’m not taking a cheap shot at Clive – he’s good at what he does, and he keeps his readers satisfied with what he produces. The upshot, though, has been a spate of Dirk Pitt knock-offs, clones, and wannabes created by other, less talented writers, most of which should have been strangled at virtual birth. The flood of vampire novels that washed over the fiction market and nearly ruined the urban fantasy genre in the wake of Twilight’s publication in 2005 is a corollary to the formula premise: in this case a more-or-less stock character type and more-or-less stock plot situations are substituted for the specific lead character of a specific series. Twilight, indeed, should have been strangled while still in the conceptual cradle, even before it reached the status of a published novel. The downshot has been that far more people have experienced in one form or another the Twilight series and its consequences and aftereffects than have even heard of Jim Butcher, even though Butcher was first published five years before Stephenie Meyer, and his Harry Dresden series has done more for the urban fantasy genre than anything Meyer could ever have hoped to accomplish. In some ways, it could be argued that, despite the elements of formula in some of the Dresden novels, Butcher almost single-handedly saved urban fantasy from self-destruction at the hands of the horde of glittery, gloomy adolescent blood-suckers unleashed within it post-Meyer.
Which brings all of this to the salient question: how did these two models, “This sold in the past, it will sell now” and “If it’s formula, it will work and it will sell,” become the standards of the publishing industry? The answer is cowardice: cowardice on the part of the bean-counters who have taken over the majority of the industry, a takeover which was itself an act of cowardice and a manifestation of insecurity. But then, what should be expected of accountants, who are cowardly and insecure by nature (think about it; think about the nature of cowardice and insecurity)? Creativity, and its handmaiden, the liberty an individual allows himself or herself to recognize creativity, requires courage. Not the clichéd “think outside the box” courage, which, frankly, is so much bovine excrement, but rather the willingness to simply dispense with the concept of a box and recognize ability and talent when they are encountered – and the courage to sometimes be wrong. Bean-counters are raised from the cradle to believe that the worst thing that can happen to them is ever to make a mistake, to err in a sum, to miss a decimal point, as if such mistakes are irrevocable and bear the burden of eternal damnation. The unintended consequence of this is that the bean-counters’ fear of their own mistakes metastasizes into a pathological fear of any mistakes made by anyone, and so they methodically set about eradicating anyone in the companies and businesses which employ them who might also make a mistake, and in so committing an error perhaps create a situation which would reflect badly on said bean-counters. As long as the quarterly reports show a profit on the bottom line, all is well in the bean-counters’ world, and they will go to any length to ensure that result.
How this has undermined the publishing industry is simple: the bean-counter culture of near-paranoia leads them first to influence, then to take over, hiring decisions – in order to be protected from errors, they must ensure that they are surrounded by co-workers who will not make mistakes of their own. In doing so, they allow intelligent, visionary, and creative acquisition and managing editors to fall by the wayside, and instead stock the editorial departments and boards of the companies they dominate with “editors” who have not the slightest practical knowledge of or experience at actual editing, but who will blindly and blandly obey the rules of formula and what sold in the past. They create nice, safe, error-proof editorial environments where new talent is not recognized and rewarded, because doing so would entail a risk, and risks are something NEVER to be taken. They employ “editors” who are the masters and mistresses of MSWord, but who have no sense of style or of the meaning of the concept of economy of words. The good editors, those who will tell an author “You’re too verbose,” or “You’re too vague,” or “Your logic in this argument is flawed,” or “Your characters are stereotypes,” or “You simply aren’t making sense here,” have been driven out of the major publishing houses, for the simple reason that their willingness to engage an author in a way that raises his or her writing out of the bland and predictable up to the level of polished, accomplished, and *gasp* innovative and thought-provoking might cause them to decide to publish and promote a book which may not be as immediately profitable as those nice, safe, vanilla books of formula and predictability. (The folly of this safe, secure, “predictable” business model is more than abundantly demonstrated by the long-term success of two highly dissimilar works – J.R.R. Tolkien’s The Lord of the Rings and Ken Follett’s The Pillars of the Earth.)
Good authorship produces good manuscripts, but good editorship produces good books. (I say this with absolute assurance that it is a fundamental truth, given that I have nine books published through conventional, “mainstream” publishers, have experienced good editors and bad, and can flatly assert that the books of which I am most proud are those which received the best editorial work.) The great flaw in the “conventional, mainstream” publishing industry today, and in particular within those six Big Companies, is not their determination to tell us as readers what to read and as authors what to write, but their cowardice, their craven refusal to risk making mistakes – a refusal which amounts to admitting that they have sold the birthright of their industry for a quarterly mess of pottage.
Where will it end? When the large publishing conglomerates sufficiently marginalize themselves, as they are already doing, by abandoning the very people on whom not merely their success but their very existence depends: the readers. Revolutions are not usually regarded as “democratic” processes, and yet a good historian recognizes that all revolutions are truly democratic in nature, as people vote, not with their ballots, or even their bullets, but with their feet. The publishing revolution will be decided by readers voting figuratively with their feet by leaving behind the drivel, the pap, and the vanilla fluff offered by the mainstream industry. Whether or not the industry will recognize the exodus for what it is before it reaches the tipping point, beyond which the industry cannot draw back those departing readers, remains to be seen. If it doesn’t, I’ve already got my trainers on….
That’s the way it is…
because I’m Daniel Allen Butler, and you’re not….
So here I sit in a comfortable chair, laptop on the desk before me, a cup of hot coffee (black) to my right, a freshly-lit cigar to my left, and I’m going to tell you about the First World War.
Who the hell am I kidding?
All I have at my disposal are words, and there are some things for which words are pathetically, even tragically, inadequate….
I’m not going to go into the nearly unbelievable process by which a simple political assassination (of an Austrian archduke) initiated a chain of events that saw the whole continent of Europe go to war with itself in less than six weeks. If you’re really interested in the diplomatic squabbles, petty political posturing, intrigue, double-dealing, and dissembling, read my book The Burden of Guilt (Casemate, 2010) to learn everything you need to know about it. Once the first German soldier crossed the Belgian border in accordance with the Schlieffen Plan, all of that became irrelevant anyway. What I’m going to talk about here is the war itself.
Frankly, The Last Post should be playing right now in the background, and keep playing until you reach the end of this column. That or maybe Geoff Knorr’s arrangement of Holst’s I Vow to Thee, My Country. Nothing else seems appropriate.
The world had never seen anything like the clash of armies that met each other in the West and in the East in August 1914.
When the Germans decided that the strategic dicta of the Schlieffen Plan overrode the moral imperatives of Germany’s guarantee of Belgium’s neutrality and invaded that little country, they brought the British Empire into the war on the side of the Allies, turning what would have been just one more European war into a truly global conflict – a “world war.” The German General Staff was openly derisive of the six-division British Expeditionary Force, thinking it too small to be worth consideration – there is a story, quite possibly apocryphal, that Kaiser Wilhelm II had referred to it as a “contemptible little army,” giving rise to the nickname which the officers and rankers of the B.E.F. adopted as a badge of honor, “The Old Contemptibles.” German derision notwithstanding, those six divisions were, man for man, the finest troops Europe had ever seen or would ever see again. When they met the oncoming waves of feldgrau on August 22, they handed the advancing Germans setback after bloody setback for the next month, retreating only when their exposed flanks were threatened, their numbers slowly but irrevocably dwindling, as the supporting French armies, bleeding and demoralized, reeled from the shock, surprise, and sheer weight of the German assault.
While the French armies desperately sidestepped to the west in the hope of forming a line which would finally stop and then throw back the German invaders, the courage and tenacity of the B.E.F. inflicted fatal delays and diversions on the Germans’ unforgiving schedule for advance. It still appeared, though, that the Schlieffen Plan was about to hand Germany the crushing victory she was seeking, for the French had been overextended to the east in their thrust into Lorraine. Had the B.E.F. not taken up its position on the extreme French left, the Germans would have swept in behind the French Army, encircling it, exactly as the German General Staff had intended.
But when the German Army was within sight of Paris, a hastily assembled French army, not letting the time bought so dearly by the B.E.F. go to waste, launched a devastating, unexpected counterattack into the German right flank, while the tired Tommies turned about and lunged at their feldgrau-clad pursuers. These blows threw the now more-than-weary Germans back some forty miles, with the exhausted armies all finally coming to a halt on September 22.
In the East, much had been expected by the Allies of the Russian Army, while the Germans feared its immense resources in manpower. But the Russians possessed another quality that neither the French nor the Germans had counted on: loyalty. In August 1914, as the German First and Second Armies were lunging toward Paris, the French government, desperate for any succor, appealed to the Russian Imperial Staff to launch an attack into East Prussia that would cause the Germans to withdraw some of the divisions that were pressing so hard against the French forces. Tsar Nicholas and his senior officers, though well aware that the Russian mobilization was only half completed, felt honor-bound to come to their ally’s aid, and so ordered an immediate attack into East Prussia from Poland. The Russian commander, General Alexander Samsonov, protested that his armies weren’t ready–some of his units were still in transit from their depots, what artillery he did have was not yet adequately supplied with ammunition, and no operational planning had been done. Nonetheless, the Tsar insisted, and Samsonov obeyed.
At first the Russian forces made rather remarkable progress as they advanced steadily into East Prussia, brushing aside the light screen of defending units the Germans had deployed along the border. The advance continued unchecked for more than a week until Samsonov ran into the German Eighth Army, commanded by then-General Paul von Hindenburg. The result was the Battle of Tannenberg, an unmitigated disaster for the Russians.
Samsonov’s army, along with that of General Paul von Rennenkampf, which had been assigned to support Samsonov’s offensive, was encircled and almost completely destroyed. The cost to the Russians was staggering: nearly a half-million men were killed or captured. Samsonov did his best to extricate as many of his troops as he could; then, once he became aware of how hopeless the debacle was, he rode off into a small wood and blew his brains out. A huge hole had been torn in the Russian front, and had von Hindenburg had the troops available to exploit it, the magnitude of the disaster for the Russians would have been unimaginable. As it was, a crisis was created on the Eastern Front from which the Russian army would never wholly recover.
That von Hindenburg lacked the troops to properly exploit his victory was merely a matter of timing, however. Though it would ultimately end in disaster, Samsonov’s offensive had thrown a serious scare into the breasts of the German General Staff, who hastily packed two army corps from Belgium aboard a few dozen troop trains and rushed them to East Prussia, a process which all told took just under two weeks. When they arrived, though, the Battle of Tannenberg was over, the surviving Russians having stopped their headlong retreat and reformed a stable front. Yet in a curious way their arrival was still decisive, for those two corps would be absent from the Western Front when the French launched their counterattack at the Marne in early September. The German First and Second Armies, lacking the fresh reinforcements those two corps would have provided, fell back, leaving Paris in Allied hands and depriving Germany of her last chance for a decisive victory on land in the West.
By the end of 1914, after a series of sidesteps by the opposing armies called the “Race to the Sea” had ended, two thin, snake-like lines of opposing trenches, growing more and more elaborate with each passing week, had been dug from the Swiss border to the Channel, depriving each side of the opportunity to maneuver, as the armies began looking for a way to break the enemy’s lines. A slightly discordant note began to slip into the strains of Die Wacht am Rhein, or the Marseillaise, or “Tipperary” being sung as the soldaten, poilus, and Tommies left for the front, as it slowly dawned on the generals and politicians alike, and even more slowly on the general public, that something had gone terribly wrong in the calculations that had been made and the assurances given before the troops marched off to war.
It would all be over in six weeks, eight at the most, they had believed; the troops would be “home before the leaves fall.” But when the leaves fell, they only covered the fresh graves of the dead, or swirled into the newly dug graves of those still dying. Then the cry was that the war would be over before Christmas, but Christmas came and went and there was no end in sight, either of the war or of the casualty lists. Soon entire military traditions were being overthrown, as the realization began to dawn on the Germans and French that neither intellect nor élan was about to deliver a quick victory, and the British saw that their magnificent but tiny B.E.F., decimated in the first four months of combat, could never be resurrected.
What was developing, and what would become the lasting collective memories of the Great War, were methods of living and dying that could only find parallels in the darker passages of writers like Edgar Rice Burroughs, Jules Verne, or H. G. Wells. Open movement ceased as the troops literally went to ground, and the trenches, originally shallow and improvised, evolved into sophisticated systems of deep defensive positions, listening posts, dugouts, bunkers, and communication cuttings. Strategically, the Allies were faced with the task of forcibly ejecting the Germans from occupied Belgium and France. Tactically, it was a bloody and almost hopeless undertaking, as time and again the French and British armies surged forward against the waiting German defenses–and each time found some hellish new innovation that cut them down by the thousands. Machine guns proliferated behind huge entanglements of barbed wire, with barbs the size of a man’s thumb, the better to catch on uniforms, accouterments, and flesh, pinning the hapless victims long enough for the stuttering Spandau machine-guns to find them. Mine-throwers and mortars made their debut, as did flame-throwers and poison gas.
Soldiers learned that sounds were dangerous–the steady mechanical rattle of the Spandau or the Vickers; the “whoosh” and roar of the flammenwerfer; the unforgettable “click-clack, clack-click” of a round being chambered in a Lee-Enfield rifle; the short, sharp scraping of the primer cord being drawn from the handle of a potato-masher hand grenade. And always, always, always the shells–-howling, whistling, warbling, chugging like freight locomotives, whizzing like hornets, or whining like banshees. And there was always the sound a soldier never heard–the one that got him.
Even colors were perilous. There were red Very lights at night signaling corrections to artillery bombardments; green or yellow mists, the terrible tendrils of phosgene or mustard gas, snaking along the ground. White on a man’s gums or blue on his feet announced the presence of trench mouth or trench foot; black on a wounded soldier’s body declared that gangrene had already set in.
Mud, blood, gore, long days or weeks of boredom punctuated by hours of absolute terror, were the frame for the sights, sounds, and smells of men living a nightmare in which they and their comrades were shot, torn, gassed, pulverized, immolated, and obliterated in ways that human beings had never before suffered and endured. As gruesome as recounting such sights, sounds, and images may be, it is necessary, for there must be an awareness (not an “understanding”–only those who experienced those days can ever “understand” them) of what the survivors of the Great War endured in order to comprehend what happened when peace returned to Europe.
It was an article of faith that the enemy always suffered the worst. The French in particular refined their talent for self-deception, somehow formulating the absurd idea that for every two Frenchmen who died in action, three Germans had been killed–but the truth was that the slaughter was appalling for both sides. In the First Battle of Ypres, in October 1914, wave after wave of German infantry, many of them university students advancing arm-in-arm singing patriotic songs, was cut down by deadly accurate British rifle fire. One German division lost over 9,300 men dead out of a strength of 12,000–in a single morning. Later, when the Allies were on the offensive, time and again the Tommies and poilus would clamber over the top of their trenches after artillery bombardments that had lasted for hours, sometimes days, or even weeks, waiting for the second-hands of their officers’ watches to touch zero hour, when they would begin their methodical advance, the British to the sound of the officers’ whistles, the more romantic French to bugles blaring the Pas de Charge. They stepped out with dressed ranks and at precise intervals across the shell-torn mudscape that stretched between the opposing lines of trenches, known as No Man’s Land. The Germans, having weathered the barrage in the relative safety of their deep dugouts, would emerge to assume their prepared positions and bring down a withering hail of rifle, machine gun, and artillery fire on the advancing troops. At the Somme, one attack that advanced barely 700 yards took three weeks at a cost of nearly 30,000 lives. Even on days when the public communiqués read “All quiet on the Western Front” (the German version read “Im Westen nichts Neues”), nearly 5,000 men were becoming casualties, victims of sniper fire and random shelling. It was as if a small town were being methodically wiped off the face of the earth each day.
The British, who in some ways were becoming even more systematically cold-blooded than the Germans, referred to such losses as “normal wastage.”
In the post-war years, the commanding generals on both sides would be pilloried as mindless brutes who could conceive of no alternative but to feed endless masses of men into a vast killing machine, in the hope that the enemy would run out of troops first. In fact the generals, much maligned as incompetents though they are, and some of them deservedly so, really did not intend to slaughter the finest generation of young men their nations would ever produce. Certainly none of them ever enjoyed it, no matter what the slanderers might say in later years. The hard, painful truth was that they were unprepared–there was no way they could have been prepared–for the war they found themselves given the responsibility of fighting. For years it had been a tenet of military faith, and correctly so, that the day of the frontal assault was over–the American Civil War and the Franco-Prussian War had first demonstrated that, and the Russo-Japanese War of 1904–05 and the Balkan War of 1912 had only reinforced the lesson. Modern firepower made frontal assaults too costly, for infantry in even a hastily prepared defensive position could hold off several times their number of attacking troops, inflicting unacceptable losses in the process. So for decades the emphasis had been placed on conducting wars of maneuver, which gave an army the opportunity to turn a foe’s flank and achieve a decisive result in battle without having to resort to the terrible waste of lives of frontal attacks. What the generals never anticipated was a war where maneuver would be impossible, where there would be no flanks to turn, and where the dreadful frontal assaults were the only option remaining to them. The result was a slaughter the like of which had never been seen before and has never been seen since.
And yet, somehow, there is still a rose-tinged nimbus of romance that surrounds the Great War. It was the songs–“Keep the Home Fires Burning,” “Lili Marlene,” “Pack Up Your Troubles,” “Till We Meet Again,” and everyone’s favorite, “Tipperary.” It was the magnificently anachronistic traditions–French cuirassier regiments and squadrons of German uhlans looked as if they had just stepped out of a Philippoteaux painting of the Napoleonic Wars; the Germans wore their faintly absurd spiked Pickelhauben; in England, a tradition harking back to the Hundred Years’ War found newly commissioned subalterns visiting an armorer to have their swords sharpened before leaving for the front.
It was the grandeur of an age that was in fact its shroud. France and Germany had armies of conscripts, it is true, but conscription had been a national institution for generations–what made these conscripts conspicuous was how few tried to evade their responsibility. In Great Britain the situation was even more astonishing: not until 1916, when Britain would be compelled to field the largest army the Empire had ever mustered in order to carry out the Somme Offensive, did the British Army have to resort to a draft to fill its ranks. These young men, rightly called the flower of European youth, were the most idealistic the world would ever see, untainted by the cynicism and affected, postured disdain of later generations. Instead they steadfastly believed in Ein Volk, ein Kaiser, ein Reich, or Liberté, Égalité, Fraternité and Vive la République, or in fighting for King and Country.
What Europe was killing, no matter how willing the victims, was the vitality whose loss would leave later generations listless and disillusioned: the fire that had driven the Continent for a thousand years was being quenched forever. By the end of May 1916 nearly two million soldiers once clad in khaki, feldgrau, or horizon bleu were dead or missing on the Western Front, while another four million had been wounded. And the bloodiest years were yet to come….
July 1, 1916, the first day of the Somme Offensive, would forever be remembered as the Black Day of the British Army. At 7:30 a.m. one hundred twenty thousand Tommies went “over the top” against the German lines along the Somme. By nightfall, barely more than twelve hours later, half of them had become casualties, twenty thousand of them dead. The Somme attack had been launched in order to take pressure off the French Army, which was locked in a death struggle with the German Army around the fortress city of Verdun. In the ten months of that battle, each army would suffer more than 350,000 casualties in an area little more than ten miles square.
The Somme did not end on the first day; the battle continued with a fury that ebbed and flowed until November, eventually gaining a seven-mile advance, at a cost of 420,000 British soldiers killed, wounded, or missing; the German losses totaled nearly a quarter million men. But while the British Army’s hopes for a breakthrough had perished in mud and barbed wire, the strategic horizons for the German Army seemed bright with promise. While no one was winning the war on the Western Front, by all appearances Germany and her allies were winning it everywhere else.
The Central Powers (as Germany and Austria-Hungary styled themselves) began their string of strategic victories in the autumn of 1915, when Bulgaria joined the alliance and helped the Austrian armies overrun Serbia. Even before the Serbs collapsed, an attempt by the Allies to knock the Ottoman Empire out of the war by a coup de main had failed on the Gallipoli Peninsula at the Dardanelles: over the next two years the Turks would fight the British Empire to a strategic draw. The Kingdom of Rumania, sensing an opportunity to aggrandize itself at the expense of Austria-Hungary, declared war on the Central Powers in August of 1916; by December the Rumanians were suing for peace.
Italy, who had not earned and did not deserve the Great Power status she accorded herself, had ignored her alliance with Germany and Austria-Hungary in the crisis of the summer of 1914. The theme of the Italian Prime Minister, Antonio Salandra, during those crucial weeks was endless repetitions of “Compensation! Compensation!”–letting it be known that Italy was available to the highest bidder; there had been no takers. However, by the spring of 1915 the Italian government perceived an opportunity to exploit the Dual Monarchy’s preoccupation with the Serbs to the south and the Russians to the north, and declared war, hoping to annex Trentino and the Tyrol, and seize the Adriatic port of Trieste. Instead her attacks were stopped cold by the Austro-Hungarian forces, and a bitter stalemate identical to that of the Western Front imposed itself along the Isonzo River, where the Italians attacked and the Austrians repulsed them with bloody regularity.
To be sure, there were moments of alarm for the Central Powers. When a German U-boat sank the British passenger liner Lusitania in May 1915, the ensuing diplomatic confrontation with Washington DC left Berlin with the realization that sooner or later the United States would be added to the list of Germany’s foes. And the Brusilov Offensive, launched by the Russian Army in the summer of 1916, was perhaps the most skillfully executed operation of the war, and came perilously close to routing the Austro-Hungarian Army. Only swift German reinforcement stiffened the sagging morale of Franz Josef’s troops, and the Dual Monarchy fought on.
The Allies took the offensive again in 1917, and it would prove to be an annus horribilis. The overture came in April, along a sector of the Western Front known as the Chemin des Dames, where the French, who believed that they had discovered a tactical “formula” which would break the stalemate, opened their great attack. The abortive “Nivelle Offensive,” named after the French commanding officer who conceived and executed the plan, lasted three weeks, cost 187,000 casualties, and gained less than a mile. The French Army, already morally and physically debilitated by the charnel house of Verdun, had been pushed at last to the limits of human endurance. The ranks of the poilus erupted in mutiny. French soldiers would continue to defend la patrie but they would never again be asked to carry out great offensives.
Thus the burden fell once more on Great Britain. A carefully developed plan, strategically sound but betrayed by weather, geology, and–worst of all–politicians, resulted in the British Army’s collective Golgotha. Officially known as the Third Battle of Ypres, it would be forever immortalized as Passchendaele, after a village which stood squarely in the center of the British line of advance. Five months of fighting gave a gain of five miles and a casualty list 315,000 names long. It had been the Tommies’ supreme effort: they had no more to give. If France was morally and spiritually exhausted, Britain had become physically so: there were, quite literally, no more fit men to replenish the ranks of His Majesty’s regiments. Scraping the bottom of the barrel accomplished nothing, for there was nothing there: it is a matter of record that draft notices were sent to men who were maimed, or mentally ill, or even, in a few instances, already dead.
On the other side of the trenches, the human cost for Germany in each of these battles was as horrible as that of the Allies. Ultimately, German losses at Verdun, the Somme, and Passchendaele exceeded those of France and Great Britain, and for Germany, with a much smaller population than the Allies combined, replacements were harder to muster. Nor were the losses merely numerical: by the end of 1916 whatever qualitative superiority the German Soldaten had once possessed over the Allies’ soldiers had dissipated, and only the German Army’s technical superiority kept it ascendant. By the end of 1917, even that was being eroded.
And there was another force, its results less readily visible or immediately obvious, insidiously eroding German and Austrian strength. Neither nation was self-sufficient in food production, and before the war close to a third of Germany’s foodstuffs were imported from overseas. Now the Royal Navy’s blockade of Germany cut off altogether those sources of supply, and by the summer of 1916 shortages became increasingly frequent, while as 1916 turned into 1917, starvation began to loom over the civilian populations of the Central Powers. One way or another, 1918 would be the decisive year of the war.
In the east, battles equally costly in lives but even greater in scope than those of the Western Front would be fought between the Central Powers and Imperial Russia, bearing names like Tannenberg, the Masurian Lakes, Taganrog, Gorlice-Tarnow, Lake Naroch, and the Brusilov Offensive; only the immense distances of the Eastern Front prevented these battles from being individually decisive–it would be their cumulative effect which would play into the hands of the Bolsheviks and other revolutionaries who eventually brought down the Romanov dynasty and then toppled the Provisional Government. Eight decades of Soviet meddling and “revision” have left the reliability of the records of those years suspect, but at the very least it can be stated with certainty that close to four million soldiers, Russian, Austro-Hungarian, and German, died in battle between August 1914 and November 1917. How many more, mostly Russian, died of disease and malnutrition is impossible to calculate, although the total could easily be double the number of combat deaths.
Yet, if 1918 was to be the decisive year of the war, 1917 was the pivotal one, and in it the hinge of fate turned against the Central Powers. The moral affront of the resumption of unrestricted submarine warfare and the blundering attempt at “diplomacy” which became known as “the Zimmermann telegram” had pushed the forbearance of the American people and their government past the breaking point, and on April 6, the United States declared war on Germany. Almost a year would pass, however, before the weight of American manpower and industry would begin to make itself felt; when it did so, the Allies would be invincible. Germany’s only hope of victory now lay in forcing the French and British to come to terms before the American juggernaut materialized. The collapse of Imperial Russia released sixty German divisions to be redeployed in the West, where for the first time since the opening weeks of the war the Kaiser’s army would enjoy a numerical superiority over the Allies. It was an opportunity that the German General Staff did not intend to waste.
As 1918 dawned, it was clear to the political and military leadership on both sides that the armies of all the warring nations were approaching exhaustion–the question was who would falter first. Russia had already given up in December 1917: wracked by revolutions at home that lurched successive governments further and further to the left, and by dissent, defeatism, and disillusion within the army, she finally made a separate peace with Germany at Brest-Litovsk. The French Army had stood on the brink of self-destruction in 1917, and even now replacements marching to the front were heard counting the cadence “Une, deux, trois, merde!” and baaaa-ing like sheep being led to the slaughterhouse. Great Britain had already reached the limits of her manpower; now the German U-boats were taking a dreadful toll of the merchant shipping that was Britain’s lifeline: in April 1917 there was less than three weeks’ supply of food left in the country. The introduction of convoys by the Royal Navy that same month, long resisted by civilian shipping interests, narrowly averted a catastrophe.
Germany’s position was, on the whole, even worse, for the British blockade had been slowly starving the German Empire to death for more than three years. Meat and milk were all but non-existent; turnips had replaced potatoes as a dietary staple; flour for making bread often contained as much sawdust and chalk as wheat. With its new-found numerical superiority on the Western Front, the German Army High Command believed that it had the strength for one last great offensive, but after that there would be no hope of victory: Germany was as exhausted as her foes.
The result was a succession of hammer-blows thrown against the Allied lines, collectively known as the “Kaiserschlacht”–the Kaiser’s Battle–beginning in late March 1918, with a new offensive opening with each passing month until July. The first, landing on the British Army, was the most successful, for it had the advantages of superiority in manpower and matériel as well as strategic surprise; yet in the end it failed, for while the British Army fell back, the Tommies held their line, and dreadful losses were inflicted on the attacking German divisions. It would prove to be a decisive development, for each of the four German offensives which followed was launched with diminishing numbers and declining morale, while the Allies were marshaling their strength and regaining a measure of confidence as each following German offensive was contained and then repulsed.
By the end of July the Germans had lost more than 600,000 killed, wounded, and missing–irreplaceable losses–in this succession of offensives, with little to show for it: some territory had been gained, but no strategic breakthrough had been achieved, no decisive result attained, and the German Army was finally exhausted. At the same time, the Allies, infused with new strength in manpower, matériel, and morale as the American Army began to arrive in France in significant numbers, went over to the offensive. In August and September, their numerical superiority regained, the Allies attacked the German line relentlessly; moreover, by this time they were gaining a tactical and technological ascendancy as well. One by one, through the month of October and into November, Germany’s allies fell away, and finally, in the first week of November, revolution swept over the Reich, the monarchy was dissolved, and the German government, now a republic, asked the Allies for an armistice. At 11:00 a.m. on November 11, 1918, the fighting ceased.
So, what did it all mean? What was it all for?
I don’t know.
Every nation has its partisans, of almost every social and political bent, as well as its detractors. The “why” of the Great War will be debated until the end of human history without a consensus ever being achieved, of that I’m convinced. A nice thought, isn’t it? Ten million or more soldiers die (and probably an equal number of civilians, East and West) and humanity can’t even agree as to why….
But there is something else of which I am equally convinced. For every Siegfried Sassoon, who won medals for brave deeds done by other men, and then whined about how “unfair” was the war to a sensitive soul like himself, there were hundreds of thousands of men, better men, who never thought of themselves as exceptional and deserving of special treatment, but, instead, were proud to be “numbered among those who, at the call of King and Country, left all that was dear to them, endured hardness, faced danger…” and who, for far too many of them, “finally passed out of the sight of men by the path of duty and self-sacrifice.”
I don’t care what uniform they were wearing, whether it was khaki drab, horizon bleu, or feldgrau, or when or where they died. In 2014, the centennial of the first year of the Great War, they all deserve the same acknowledgement:
“They shall grow not old, as we that are left grow old:
Age shall not weary them, nor the years condemn.
At the going down of the sun, and in the morning, we will remember them.”
So now it begins.
The accusations of incompetence, ineptitude, and downright stupidity. The evasion of responsibility, real or perceived. The finger-pointing. The back-biting. The back-stabbing. The self-righteous posturing of those not involved in the decision-making process. The self-justifying whining of those who were. The weeks of absolutely absurd and pointless muppet-flailing which will be part of the national and regional dialogue for the next several news cycles.
I’m speaking, of course, of the aftermath of the Great Snowpocalypse of Atlanta, 2014 Edition.
Now, let me begin by staking out a base position. I was out in it. I know, I know, the sentence isn’t grammatically correct, but you get the idea. No, I didn’t spend the night in my car, stranded on a stretch of interstate or parkway because the whole road net had become paralyzed by a combination of winter weather conditions and a typical volume of Atlanta traffic (more on this in a moment). I didn’t have to abandon my vehicle a mere two miles from my home and trudge through the (minimal amount of) snow in temperatures in the mid-20s, all the while muttering about “Arctic-like conditions,” fearful that my fate would be reminiscent of the Scott Expedition. I didn’t sleep in a Waffle House or Home Depot–at least, not this time. No, my experience was spending two hours traveling approximately one-half mile in an effort to complete a trip that was totally unnecessary, saying “Bugger this!” (or words to that effect), doing a one-eighty and returning home. However, it was time well-spent, as it gave me an appreciation for the realities of life for those who did spend the night in their cars, walked home in the (minimal amount of) snow, or spent the night in some makeshift public shelter.
In the endless volume of punditry and pontificating that will be belched forth in the days and weeks to come, we will be bombarded with countless variations of “What [insert Government Official or Department of Your Choice here] should have done was….!” We will then be told how the schools weren’t closed in anticipation of the coming storm, how salt- and sand-trucks were not deployed and put to work before the first flake fell, and how poor was the coordination and cooperation of the various state, county, and municipal authorities in coping with the emergency.
An article by Rebecca Burns on Politico.com bewails “The Day We Lost Atlanta,” as if the region (“Atlanta” is not just a city, it’s a geographical notion as well) had experienced multiple strikes by nuclear weapons in the hundred-megaton range and been wiped off the face of the planet. The terms “Snowmageddon” and “Snowpocalypse” are making the rounds among the perpetually smug pseudo-hip to describe the afternoon, evening, and night of January 28, 2014 in and around Atlanta, Georgia. Professional navel-gazers are already creating narratives depicting the day as a Great Natural and National Disaster.
Atlanta still exists, its infrastructure (and no, we’re not going to talk about its flaws, that’s for another day) still functions, and as of today, January 30, 2014, the region is beginning to move with something approaching its normal dynamic. People who can work from home are doing so; people who can’t are, for the most part, staying home from jobs that really would have no point today, because the people who would need their services aren’t going out to avail themselves of them. Schools are closed, a prudent precaution because there’s still ice and snow on the majority of side roads and streets, and given the hills (the endless, endless hills!) here in Atlanta, doing otherwise would be a recipe for multiple tragedies. By the weekend, everything will be back to normal, for weal or woe. So, the bottom line here is that Atlanta got some snow, a bunch of people were inconvenienced, and within a few days all will be as it was before.
What really happened in the “Snowpocalypse”? I mean, what really happened? There was one death, a pedestrian struck by a car as he was walking down the middle of a street. (I won’t even deign to dignify that act of stupidity with further comment, other than to repeat the words of the Great American Philosopher Forrest Gump: “Stupid is as stupid does.”) There were a handful of injuries in minor traffic accidents. A lot of vehicular sheet metal met its doom. An unexpected volume of tire rubber was expended by motorists who had no clue as to how to drive their vehicles in slick road conditions. An inordinate amount of gasoline was consumed by cars and trucks idling for hours on end in the gridlock. And a world-record amount of muppet-flailing has ensued in the aftermath.
That’s it, folks. Atlanta, city and region, got shut down completely for one day, partially for two days. One fatality, a few injuries, a boon for accident attorneys and a big hit in the pocketbook for auto insurers. No buildings were destroyed, no one lost their homes, no one even lost power; in short, no permanent damage anywhere to anything (aside from a few bruised egos among those Southerners who imagined that they knew how to drive in the snow). And that translates into having “lost” Atlanta? Puh-leeze….
Of course, there will always be that Greek chorus of doom ready to chime in with “But what if…?” and “Oh, it could have been so much worse!” Which proves nothing. What might have been is exactly that – what might have been. Pointless conjecture remains pointless no matter how gruesome or titillating might be the details. Deal with the reality, people, not your fantasies.
The gridlock that occurred on the afternoon of January 28, 2014, happens to a slightly lesser degree every day in Atlanta, usually for three to four hours at a time, and happens in the same magnitude at least once a week, as accidents and incidents on the interstate, highways, and parkways create ripple effects that logjam traffic for hours. What happened that afternoon was notable not for its size or scope, but merely for its duration. As for gridlock, the few hours of daily gridlock that Atlanta experiences even in good weather isn’t a patch on the daily lockdown that is part of life in Los Angeles for drivers on the 405, the 605, the 10, the 110, the 210, and the 710. Having lived in West Los Angeles for eight years, I can attest to that personally.
In the aforementioned article, Ms. Burns bemoans the fact that the events of January 28, 2014 demonstrate that Atlanta’s infrastructure and governmental organization are both woefully unprepared to handle the stress of a major evacuation, and implies that there will be horrible consequences if such an event is ever actually required to be undertaken. Well, suck it up, Cupcake, get out of Atlanta more often, take off your parochial blinders, and look around you. NO major city in the United States is prepared for or possesses the infrastructure to handle an evacuation in the face of a large-scale natural disaster. Does anyone remember Hurricanes Katrina and Sandy, just by way of example? “Snowmageddon” was simply a demonstration of that wise old adage that “Crap happens” executed on a regional scale. Were preparations poor? Unquestionably. Were decisions badly made? Undoubtedly. Are there lessons to be learned, applicable to how to act and react in the face of another such impending storm (Atlanta seems to have a habit of getting hit by a major snowstorm every three to four years)? Unarguably. Will such lessons be learned? Unlikely. The politicians in Georgia, from the governor on down to the superintendent of the smallest school district in the Atlanta area, all acted and reacted as their kind always do – avoided making actual decisions until events either compelled them to do so or made the decisions for them. Then, naturally, those self-same politicians began to attempt to lead from the rear, while ordinary, real people rolled up their sleeves and coped. It’s too much to expect politicians to be anything other than what they are, so let’s give up that idea as a lost cause and move on to more constructive and productive thoughts. However, I digress….
Bottom line: “Snowmageddon, Atlanta 2014” was and will remain, deservedly so, a joke. A tempest in a teapot, which will be stirred endlessly in the days to come by the nattering nabobs of the chattering classes, an endless source of punditry and pontification by those who wish to appear wise, but who, in the end, can only produce “much sound and fury, signifying nothing.”
Along with the rest of Atlanta, save for that one unfortunate individual, I survived “Snowmageddon.” And I yawned through the whole thing….
Remember, I’m Daniel Allen Butler, and that’s the way it is….