Story of Iridium

It all started in 1985 with the vision and commitment of one of Motorola’s brightest engineers, Bary Bertiger, whose wife complained that she was unable to reach clients via her cell phone from the Bahamas. He envisioned a technology that would allow effortless communication to and from any corner of the earth. Bertiger submitted his idea to his superiors, who rejected the concept; it was no less than Robert Galvin, Motorola’s chairman at the time, who gave Bertiger approval to go ahead with the project. Bertiger gathered a star team of engineers and businessmen and set to work. His idea was to put up a network of low-orbiting satellites covering the entire earth, linked by mesh technology to route calls to and from any point in the world.

Work on the project started in 1987. While satellite phones were available in the 1980s, they had limited global coverage and, owing to the altitude at which the satellites orbited, the transmission delay made a conversation sound as though it were carried down a long tunnel. Motorola thought the world was ready for something better and proposed creating a massive network of satellites that would provide truly global coverage. These satellites would orbit at a much lower altitude than their competitors’, so the quality of transmissions would improve dramatically. The initial cost of the project was estimated at over A$7 billion and required putting 77 satellites into low earth orbit. Because of this number of satellites, the project was dubbed Iridium, after element 77 in Mendeleev’s periodic table.
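The altitude difference translates directly into voice latency. A minimal back-of-envelope sketch (illustrative figures, not from the article: ~35,786 km altitude for a geostationary satellite versus roughly 780 km for an Iridium-class low earth orbit):

```python
# Speed-of-light propagation delay, straight up to the satellite.
# Real call paths are longer (slant range, inter-satellite hops, processing),
# so these are lower bounds.
C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def one_way_delay_ms(altitude_km: float) -> float:
    """One-way propagation delay in milliseconds to a satellite overhead."""
    return altitude_km / C_KM_PER_S * 1000.0

geo_ms = one_way_delay_ms(35_786)  # geostationary orbit
leo_ms = one_way_delay_ms(780)     # Iridium-class low earth orbit

print(f"GEO: {geo_ms:.1f} ms one-way, LEO: {leo_ms:.1f} ms one-way")
```

An up-and-down hop through a geostationary satellite thus adds roughly a quarter of a second before the other party even hears you, which is the “long tunnel” effect; at Iridium’s altitude the propagation delay is negligible.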

However, cost-saving concerns led to a redesign of the project, reducing the number of satellites to just 66. The marketers kept the name Iridium rather than renaming it after element 66, dysprosium (from the Greek for “hard to get”).

The launch of the satellites started early in 1997 and was completed by 1998 (from launch pads in Kazakhstan, China and America).

Iridium services commenced in December 1998. “Iridium’s core identity was defined by its transcendence of national borders, a structure that is particularly post-Cold War,” Wired magazine gushed in its October 1998 cover story. “Iridium may well serve as a first model of the 21st-century corporation.”

Unfortunately, despite the brilliance of the technology and of the team behind its design and marketing, expectations had changed since 1987. People now expected their phones to be lightweight, usable inside buildings, and cheap to call from. Iridium phones were heavy (not suitable for carrying in a pocket) because they needed powerful batteries, they didn’t work inside buildings, and calls cost around A$10 per minute. In parallel, the demand anticipated by Iridium’s original creators was gradually being met by the advent of portable mobile phones. The first 1G cellular network had been launched by NTT in Japan in 1979, but the real dawn of the mobile phone came with the 2G systems of the 1990s, such as GSM, which are still ubiquitous around the world. By then the market for satellite phones was estimated at 2-3% of the mobile-phone market, and other companies (Globalstar, ICO and Ellipso) were chasing the same satellite customers.

Less than a year later, Wired News backtracked, saying, “After losing nearly US$1 billion in two disastrous quarters, the engineering marvel is in danger of becoming the Ford Edsel of the sky.”

With construction costs of A$7 billion, Iridium needed over one million subscribers to break even. By mid-1999 it had gained only 55,000 and was rapidly running out of money. In August 1999 Iridium went bankrupt, and subscribers found themselves without a dial tone. There were several attempts at selling Iridium, but no company could afford it.
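The break-even arithmetic can be sketched roughly. Beyond the A$7 billion cost and the subscriber counts, the figures below are assumptions for illustration (a ten-year cost-recovery horizon, ignoring operating costs, interest and launch insurance):

```python
# Hypothetical cost-recovery sketch; only the A$7 billion cost and the
# subscriber counts come from the article, the 10-year horizon is assumed.
construction_cost = 7e9      # A$, from the article
target_subs = 1_000_000      # break-even target, from the article
actual_subs = 55_000         # subscribers gained by mid-1999, from the article
horizon_months = 10 * 12     # assumed recovery period

def monthly_revenue_needed(subscribers: int) -> float:
    """A$ each subscriber must pay per month to recover construction cost."""
    return construction_cost / (subscribers * horizon_months)

print(f"At {target_subs:,} subs: A${monthly_revenue_needed(target_subs):.2f}/month")
print(f"At {actual_subs:,} subs: A${monthly_revenue_needed(actual_subs):.2f}/month")
```

Even at the one-million target, each subscriber would have had to spend around A$58 a month on construction recovery alone; at the actual 55,000 subscribers the figure exceeds A$1,000 a month, which makes the August 1999 bankruptcy unsurprising.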

By early 2000 the only employees left at Iridium were those employed to ‘de-orbit’ the satellites (a process estimated to take two years).

In 2000 the company was taken over by Iridium Satellite LLC (for A$35 million), under contract to the US Defence Department. In 2007, Iridium Satellite LLC announced that it would launch new satellites to attract partners, providing services beyond voice calling, such as a next-generation global positioning system, environmental monitoring and satellite photography, to become fully operational by 2016.

It currently has 280,000 subscribers and counting, despite the fact that the phone requires line-of-sight to a satellite and thus cannot work inside buildings.

Time will tell if its current incarnation is more successful than its first.

China: change of policy in the 15th century

The Greek name for the Chinese was Seres, from which the Latin word serica derives, meaning silk.

China has traditionally viewed itself as the centre of its world. The modern word for the country, Zhong guo (Central Realm), seems to say it all.

Writing materials such as rolls of silk date from the 2nd century BC, and paper from the 2nd century AD (Cai Lun, 105 AD). Printing, too, was a Chinese invention: fixed blocks were cut to print whole pages (Feng Dao, 932 AD), and movable type was introduced in the 11th century AD (Bi Sheng, 1041 AD). China was also the first to establish the enduring institution of the public service examination (founded under the Sui Dynasty in 605 and not abolished until 1905).

Additionally, Chinese advances in iron and steel manufacture were several hundred years ahead of Europe’s. Coal was being mined from the 8th century and used in furnaces producing high-quality iron and steel. The Chinese are also credited with the invention of the saddle and stirrup (5th century), the compass (possibly 20-100 AD), gunpowder (Taoist monks searching for an “elixir of immortality,” 9th century) and porcelain (under the Tang Dynasty, 7th century).

Maritime inventions credited to the Chinese also include the anchor, the drop-keel, the capstan, canvas and pivoting sails.

By medieval times, China had become the most intellectually sophisticated and technologically advanced country in the world.

Then came the year 1405, under the Ming dynasty (1368-1644). Fleets of hundreds of immense Chinese ships (28,000 people sailing on 300 vessels, a fleet whose size and grandeur would not be matched until World War I), headed by Admiral Zheng He, traversed from the China Sea past Sumatra to Ceylon, India, Arabia and East Africa. Seven epic Chinese naval expeditions from 1405 to 1433 explored the vast periphery of the Indian Ocean and brought it under the Chinese tributary system. Yet less than a century after this Chinese maritime high-water mark, it was a crime even to go to sea from China in a multi-masted ship.

The economic motive for these huge ventures may have been important, and many of the ships had large private cabins for merchants. But the chief aim was probably political: to enroll further states as tributaries and to mark the reemergence of the Chinese Empire following nearly a century of barbarian rule. The political character of Zheng He’s voyages indicates the primacy of the political elites. Despite their formidable and unprecedented strength, Zheng He’s voyages, unlike the European voyages of exploration later in the fifteenth century, were not intended to extend Chinese sovereignty overseas. The question, then, is how such a policy, one containing enormous potential for growth and prosperity, begun in 1405, could come to an abrupt halt and reversal by 1433.

There is no definite answer to that question. However, a few possible explanations have been postulated.

  1. A political power struggle between two factions of the Chinese Imperial court (the Confucian courtiers and the palace eunuchs), combined with an overwhelming demand for political centralization and unity.
  2. An Imperial decree decommissioned the great navy across the whole of China, the reasoning possibly being that renovation had turned into stagnation, and that science and philosophy were caught in a tight net of traditions smothering any attempt to venture something new. The decision became irreversible with the loss of the shipyards capable of turning out ships that would have proved the folly of that supposedly temporary decision.
  3. An internal court struggle between competing theories of the commercial and technological benefits of foreign trade and the social-purity benefits of isolationism. Isolationism won.
  4. By the 15th century the navy had become dependent on a meager set of overly fragile maritime missions and was thus vulnerable to relatively minor changes in the strategic situation. The completion of the Grand Canal, a more efficient and safer means of grain transport, became an important factor in the demise of the Chinese ocean-going navy.
  5. Maritime threats (piracy) were always considered secondary in China to continental, land-based threats; thus, in economically and politically difficult times during the Ming period (with the threat of a revival of Mongol power on the northern steppe), maritime solutions to national security (the navy) lost resources to continental solutions (the army).

Perhaps several of these factors, separately or in combination, caused the Chinese rejection of sea trade and sea power in the mid-15th century. We can never know for certain.

What we do know is that the traditionally ethnocentric and culturally refined Chinese have always been strategically far-sighted. It is no secret that the following maxim has been a byword not only of Chinese warfare but also of other strategic manoeuvres throughout the ages.

“Steal the beams, change the pillars” (from the “36 Stratagems”).

China is on the rise again. Let us see how far it goes this time.

A Museum Of Personal Failure

Americans, it turns out, are in the habit of establishing durable monuments and institutions commemorating not only their successes and achievements but also their failures.

Located in the Bucktown neighborhood, American Mini-Storage is one of Chicago’s best-kept secrets, but don’t expect it to stay that way for long. The self-storage facility houses what is arguably the nation’s most impressive collection of personal items accumulated during periods of failure.

A whole museum dedicated to failures. It is not by accident that many say we learn more from failures than from successes. Some think that personal failure “becomes not an indicator of personal inadequacy, but a sign that you are expanding your horizons and making progress.”

“There are 250 storage units here, and each one has a different pathetic story to tell,” said Carlos Garcia, one of several client-relations managers at American Mini-Storage. “They run the gamut—from libraries of unread college textbooks to abandoned bolts of canvas to half-restored antique chests of drawers. Each storage locker is like a window into a separate life of disappointment and inadequacy.”

American Mini-Storage opened on Armitage Street in the autumn of 1996. Despite being relatively new to the market, the facility’s managers have amassed an impressive collection, thanks to location, word of mouth, and generous contributions from anonymous donors.

It also seems that many people, while not at ease with their present and past failures, still prefer to store them somewhere, perhaps out of sight, where they can revisit them from time to time and be reminded.

“This is the Mueller space,” Garcia says. “It holds a crate of five partially written detective novels. And over here in the Sherman room, we have one of my favorite collections: the leftover inventory from a failed salad-dressing business. Oh, and take a look inside the Curtis collection. It boasts the decaying remains of an entire family’s failure, including a sixth-place intramural-tennis trophy, a moth-eaten gymnastics uniform, and a file cabinet jammed with overdraft bank notices.”

Gymnastics uniforms, overdraft bank notices, and everything in between. Imagine that this entire “wealth” of personal failures, spanning countless kinds of human endeavours, thoughts and ideas, is now open to the public. What would be the long-term impact of such an exhibition of failures? Would we learn something new? Or would we be reminded of our own well-hidden failures?

While the storage facility is by no means the only one of its kind, several factors have contributed to the breadth of its fascinating collection.

“Part of the reason for our success is that the neighborhood itself has been in drastic flux over the past 15 years,” Garcia said. “As a result of Bucktown’s gentrification, the Puerto Rican population has been displaced, followed by the artists and musicians, then the people on the first steps to their career. Everyone who has come and gone has needed a place to store painful reminders of the past. We are not just a storage facility, we are a repository for every imaginable setback a person can experience.”

Here is the entire article from The Onion.

Failures of the theory of Darwin (part 1)

The theory of evolution devised by Darwin is generally considered one of the most important intellectual achievements of the modern age. The theory allegedly put an end to the hitherto existing speculations purporting to explain the evolution of humanity and life on earth. In 1859, when On the Origin of Species was first published, it did not directly reference humans or make any claims about our common ancestry with other mammals. Ever since, with increasing knowledge in anthropology, genetics and biology, modern scientists have come to hold it not as a possible conjecture (a sound theory with many explanations of empirical data) but as a universal truth about human life on earth. Currently, two main versions of evolutionary theory exist: phyletic gradualism (uniform, gradual transformation) and punctuated equilibrium (long stasis punctuated by rapid change).

However, to date the theory has failed to exhaustively explain or address a number of open questions and issues:

1. Darwin, in The Descent of Man, considered it logical to extend the theory to cognition, treating human characteristics such as morality and the emotions as products of evolution, thereby introducing evolutionary psychology. Evolutionary psychology holds that human nature was designed by natural selection in the Pleistocene epoch and aims to apply evolutionary theory to the human mind. It proposes that the mind consists of cognitive modules that evolved in response to the selection pressures faced by our Stone Age ancestors. In recent research, authorities on the topic, Buller (in his book Adapting Minds) and Richardson (in his book Evolutionary Psychology as Maladapted Psychology), argue that neither the methodology nor the results of evolutionary psychology can be justified scientifically.

2. An apparent lack of “evolutionary” effect on bacteria (a new generation every 12 minutes to 24 hours) and fruit flies (a new generation every 9 days), despite a practically unlimited number of genetic mutations and variations. According to a recently introduced model, which suggests that body size and temperature combine to control the overall rate of evolution through their effects on metabolism (smaller organisms evolve faster and are more diverse than larger organisms), evolution should have had an even bigger effect on these organisms.

3. On rare and random occasions a mutation in DNA improves a creature’s ability to survive, so it is more likely to reproduce (natural selection). But it is widely known that very few human traits have been traced to a single gene (conditions such as the so-called Dracula Gene and Cheeseburger Gene). Modern science currently holds that even the simplest human traits, features and behavioural patterns have sophisticated underlying molecular and genetic mechanisms. It is therefore doubtful that natural selection could favour parts that did not have all their components in place, connected and regulated, because such parts would not work.

4. The Cambrian/Precambrian period does not support Darwinian evolution. No intermediate (transitional) forms have been found from this period, and there appear to be no fossil ancestors for complex invertebrates or fish.

5. The theory of evolution seems to be in violation of two fundamental laws: the second law of thermodynamics (things fall apart over time; they do not become more organised) and the law of biogenesis (living cells divide to make new cells, and fertilised eggs and seeds develop into animals and plants, but chemicals do not fall together and produce life).

To be continued some time soon...

Murphy’s law of failure

Murphy’s Law (“If anything can go wrong, it will”) was born in 1949 at North Base of Edwards Air Force Base.

It was named after Capt. Edward Murphy (born in 1918), an engineer working on Air Force Project MX981, which was designed to determine how much sudden deceleration a person can withstand in a crash; the subsequent tests were performed by medical doctor John P. Stapp, then an Air Force captain. Featured on the cover of Time magazine in the 1950s, Stapp became known as the “Fastest Man on Earth” for his G-force experiments, which involved the use of rocket sleds. He was a famous researcher who helped develop restraint systems, including automobile seatbelts.

Murphy was engaged in supporting this research using high-speed centrifuges to generate G-forces. One day, Murphy’s assistant wired the harness, and a trial was run using a chimpanzee. The sensors gave a zero reading; it became apparent that they had been installed incorrectly, with each sensor wired backwards. At this point a disgusted Murphy cursed the technician responsible and said, “If there is any way to do it wrong, he’ll find it.” Murphy had had the opportunity to calibrate and test the sensor installation before the test proper, but had declined, as he did not get along well with the project team.

The contractor’s project manager, who was present when Murphy uttered the phrase, kept a list of “laws” and added this one, which he called Murphy’s Law.

While the origins of the law are still debated, everyone agrees that Stapp played a critical role in popularizing Murphy’s Law. After the incident, he gave a press conference at which he said that the project’s good safety record was due to a firm belief in Murphy’s Law and in the necessity of trying to circumvent it. Aerospace manufacturers picked the phrase up and used it widely in their ads over the next few months, and soon it was being quoted in many news and magazine articles.

Murphy’s Law was born.

One dark evening in 1990, Murphy’s car ran out of gas. While hitchhiking to a gas station, facing traffic, he was struck from behind by a British tourist who was driving on the wrong side of the road.

A variation of the original Murphy law favored among hackers is a takeoff on the second law of thermodynamics: The perversity of the Universe tends towards a maximum.

P.S. Stapp had a paradox of his own, Stapp’s Ironical Paradox, which says, “The universal aptitude for ineptitude makes any human accomplishment an incredible miracle.”

Sources: Murphy’s Laws site; The Desert Wings, March 3, 1978; Annals of Improbable Research (AIR)

Crassus tries to take on Parthians

At the beginning of 54 BC, Marcus Licinius Crassus had just finished his year as joint consul with Pompey. Crassus was a greedy man who felt a great desire for new glory and even greater riches. He had seen no action since his defeat of Spartacus nearly 20 years earlier.

He turned to Parthia. Parthia was led by the Arsacid dynasty, which had reunited and ruled the Iranian plateau after defeating the Seleucids beginning in the late 3rd century BC, and it was Rome’s latest arch-enemy in the east. Parthia was heir to the ancient Persian civilisation and its wealth.

In 53 BC, Artavasdes, the Armenian king and a vassal of Rome, advised Crassus to take a route through Armenia and avoid the desert, but Crassus refused. The Parthian army that met him (half of the total force; the other half had been sent against the Armenians) was commanded by Surena and consisted entirely of cavalry, tasked to scout, delay and, if possible, destroy Crassus. The two armies clashed near the town of Carrhae. Though demoralised by the hot climate and the long march, Crassus’ troops heavily outnumbered the Parthians. The Parthian army included two types of cavalry: heavily armed and armoured cataphracts, and lightly armed but highly mobile mounted archers. For the Romans, who relied on heavy infantry, the Parthians were difficult to defeat, as both types of cavalry were much faster and more mobile than foot soldiers.

Furthermore, the Parthians used tactics unfamiliar to the Romans, such as the famous “Parthian shot”: firing arrows backwards at the gallop. Crassus, having never encountered such an army or such warfare before, was decisively defeated at the Battle of Carrhae. This was the beginning of a series of wars that would last almost three centuries.

After the defeat, Crassus was, according to legend, executed by having molten gold poured down his throat, a symbolic gesture mocking his greed. The Parthians, on the other hand, found it difficult to conquer the Roman eastern provinces completely.

While the Roman legions returned a few years later and stemmed the Parthians at the gates of Antioch (Syria), the folly of Crassus and the defeat of Rome’s renowned legions at Carrhae still lives in memory.

The first PDA: the case of the Apple Newton

This story is about the precursor of modern PDAs.

The Newton project was not originally intended to produce a personal digital assistant (PDA). The PDA category did not exist for most of the Newton’s genesis (though earlier devices such as the Psion Organiser and Sharp Wizard had the functionality to be considered PDAs), and the term “personal digital assistant” itself was coined relatively late in the development cycle, on 7 January 1992, by Apple’s then-CEO John Sculley, the driving force behind the project. The Newton was intended to be a complete reinvention of personal computing.

To clarify, the official name of Apple’s product was the MessagePad; Newton was really the name of the operating system. But Newton captured the public’s imagination, so that’s what the device was popularly called.

One of the original motivating factors for the design was known as the “Architect Scenario”, in which Newton’s designers imagined a residential architect working quickly with a client to sketch and interactively modify a simple two-dimensional home plan.

The end result, however, became a template for future PDAs. The initial version shipped with a variety of software to aid in personal data organization and management.

This included applications such as Notes, Names, and Dates, as well as a variety of productivity tools: a calculator (with metric and currency conversions), time-zone maps, and handwriting recognition, which worked even with the display rotated.

In 1993, before its release, Apple launched a marketing campaign for the Newton centred on its allegedly unprecedented handwriting recognition.

When it first appeared in shops, however, the Newton was a disappointment. It was big (not pocket-sized), pricey (about $700 for the first model and as much as $1,000 for later ones), novel (there was no market familiarity) and beset by software problems (notably, its handwriting recognition was fairly inaccurate and was skewered in the Doonesbury comic strip).

PDAs would remain a niche product until Palm, Inc.’s Palm Pilot (from a company led by ex-Apple employee Donna Dubinsky) emerged, shortly before the Newton was discontinued in 1998. The cheaper Palm Pilot, released in 1996, became a runaway success. It was smaller and thinner, sold at a lower cost, and offered excellent PC synchronization; its more robust handwriting-recognition system, Graffiti (which had first been available as a software package for the Newton), helped restore the viability of the PDA market after the Newton’s commercial failure.

Attempt to market America

In the wake of 9/11 and the subsequent invasion of Afghanistan, the image of America became a rallying cry for anti-American elements abroad. When the White House finally decided it was time to address the rising tide of anti-Americanism around the world, it didn’t look for a seasoned diplomat. Instead, in keeping with the Bush administration’s neoconservative philosophy of favoring the private over the public sector, it hired one of the then top brand managers in America.

From October 2001, as Undersecretary of State for Public Diplomacy and Public Affairs, Charlotte Beers’ assignment was not to improve relations with other countries but to overhaul America’s image abroad. A recipient of the prestigious “Legend in Leadership Award” from the Chief Executive Leadership Institute of the Yale School of Management, Beers had no previous diplomatic experience but had held the top job at both the J. Walter Thompson and Ogilvy & Mather ad agencies.

The appointment of a person inexperienced in diplomacy and state politics to this post understandably drew some criticism, but the then Secretary of State, Colin Powell, shrugged it off: “There is nothing wrong with getting somebody who knows how to sell something. We are selling a product. We need someone who can rebrand American foreign policy, rebrand diplomacy.” Beers saw the job in similar terms. “The whole idea of building a brand is to create a relationship between the product and its user,” she explained. “We’re going to have to communicate the intangible assets of the United States — things like our belief system and our values.”

From her point of view, the tattered international image of America was little more than a communication problem. In fact, the problem was just the opposite: America’s marketing of itself had been too effective. Schoolchildren could recite its claims to democracy, liberty and equal opportunity as readily as they could associate Nike with athletic prowess. And they expected the US to live up to its claims. Here lay the real problem: the results of economic and political decisions coming from Washington didn’t seem to correspond to the message and promises so staunchly promoted by American politicians. It was like a false advertisement, in which the promised qualities and the real qualities of the product differ. America’s problem was not with its brand, which could scarcely be stronger, but with its product.

In the corporate world, once a “brand identity” is settled upon by the head office, it is enforced with military precision throughout a company’s operations. The brand identity may be tailored to accommodate local language and cultural preferences, but its core features (vision, aesthetic, message) remain unchanged. At its core, branding is about rigorously controlled one-way messages, prevented from turning into a social dialogue.

Critics generally feel that America already demands too much “consistency and discipline” from other nations; that beneath its stated commitment to democracy and sovereignty, it is deeply intolerant of deviations from the economic model known as the “Washington Consensus”; and that, whether these policies (so beneficial to foreign investors) are enforced by the Washington-based IMF or through international trade agreements, the world is already too influenced by America’s brand of governance and by American brands.

There is another reason to be wary of mixing the logic of branding with the practice of governance. When companies try to impose global image consistency, they look like generic franchises. When governments do the same, they look authoritarian. It is no coincidence that the political leaders most preoccupied with branding themselves and their parties have also been allergic to democracy and diversity; think of Mao and Hitler. Historically, this has been the ugly flip side of politicians striving for consistency of brand: censored information, state-controlled media, re-education camps, the purging of dissidents, and so on.

Democracy can be described as a confluence of different ideas; it is characterised by diversity of means, approaches and ends. Beers’ task was therefore not only futile but dangerous: brand consistency and true human diversity are antithetical. One seeks sameness, the other celebrates difference; one fears all unscripted messages, the other embraces debate and dissent.

A little less than two years into her job, Beers stepped down. If anything, the prospects for improvement looked as gloomy as ever, for this was when Blair and Bush were putting the final touches on their next target: Iraq.

My first big failure

After a few stories about historic, technological and strategic failures, the time is perhaps ripe for a personal story – my story.

I am originally from Armenia.

A year before my final high school year, I was at a school where I actively participated in the “Applied Economics” program introduced by Junior Achievement, a program designed to acquaint high school kids with various aspects of micro- and macroeconomics. After completion of the program, there was a country-wide competition in applied economics, in which nearly 1,000 pre-selected participants from high schools across the country took part. I came 19th, and by virtue of finishing among the top thirty, I was offered a week-long, all-expenses-paid holiday at one of Armenia’s resorts. It was a week full of economics games, stock-exchange simulations, and constant interaction with the country’s most famous businessmen and bankers. That was when I decided I would become an economist. The next year, however, I was forced to change schools.

I wound up spending my final high school year at physics and mathematics school No. 1, the best and most prestigious high school in the country, specialising in physics and mathematics. This was 1996. The post-Soviet era had begun a few years earlier, but corruption and nepotism were ubiquitous. Demand for economists, lawyers and doctors had soared, and the universities reflected that tendency well. The only way to enter an economics department at any of the state universities at that time was to obtain maximum scores in three exams: English, Armenian and mathematics.

My family and friends advised me not to waste my choices and to apply to another department instead; my father especially insisted that I apply to the physics department, cherishing hopes that I would follow in his footsteps after graduation (he is a nuclear physicist). When the time came to apply for undergraduate studies, I was still completely in love with economics (especially macroeconomics) and without hesitation listed three of my four possible preferences in economics or related departments. I couldn’t imagine myself doing anything else. I was living in my own world and couldn’t care less about the advice and hints of those who loved me and had more life experience.

I made my decision, and no one could change my mind. I went to the exams. The examination committees were where large sums of money changed hands. “Give me this much and your kid is guaranteed that score” was the unwritten policy, yet it was so widely known and followed that it might as well have been legal. No one questioned it; no one demanded justice. The chain of corruption could be traced all the way to the top, including the Ministry of Education. With the exception of one subject, mathematics, the examination committees gave high scores only to those who paid or who had connections to committee members. I (and my parents) had neither enough money nor the connections.

Knowing these initial conditions well in advance, I still went for it. I passed all the exams but scored the maximum in only one: mathematics. For the other two, English and Armenian, my work was, predictably, scored lower than it deserved. Appeals to both committees produced no results. My total score across the three required exams fell 2 or 3 points (out of 20) short of the minimum pass score for all three departments I had applied to. This was one of the most painful times of my life. I felt doomed, not least because all young men over 16 who were not accepted to a university were automatically subject to conscription into the army. The Armenian army, however, is (still) a place many readily pay large sums of money to avoid: two years of wasted life, with no guarantee that a person’s physical and mental health will be intact after the service (indeed, quite a few return with various ailments and mental problems).

Luckily for me, however, the fourth (and last) option specified on my application form was the physics department. I had put it last for exactly this reason, hoping never to need it: if nothing else worked, I thought, I would at least have a high probability of doing physics instead of being drafted into the army. After failing at the first three options, I came to the fourth (cheers, Murphy): I was accepted to the physics department at Yerevan State University.

I forgot to mention that I could have gotten into the same physics department without any exams, through a simple interview, thanks to a special agreement between my (physics and mathematics) high school and the university.

What I could have achieved through an effortless 15-minute interview, I instead achieved by passing four exams over the course of a month (the three required plus the physics exam), spending money, a great deal of nerves, and countless unaccounted-for hours of effort.

Even once I started studying physics – no longer "eligible" for the army – I felt horrible. I began my university days with a gloomy face and dark spirits. I really didn't want to do physics. My future, as I saw and envisioned it then, lay anywhere but in physics. My perspective has changed since then.

Am I in physics now? No. But I did continue studying physics at the graduate and post-graduate levels, which, I later realized, benefited me more in terms of mentality and attitude than in actual knowledge.

The Vasa sinking

The Swedish flagship Vasa's first and final sailing in August 1628 left fine fodder for future management consultants – an all-purpose cautionary tale of an overbearing but technically clueless boss pushing through his pet project. King Gustavus II Adolphus, striving to make Sweden a superpower (in his bid to bring the Baltic fleet into the Thirty Years' War), had wanted four new warships built fast. Workmen were already laying the Vasa's keel when the king ordered its length extended. His seasoned master shipwright, fearing to challenge the famously hot-tempered king, went ahead. The shipwright then took ill, directed the project as best he could from his sickbed, and died before it was finished. His inexperienced assistant took over, and the king ordered a second gun deck, possibly spurred by false reports that rival Denmark was building a ship with double gun decks. The result was the most lavishly appointed and heavily armed warship of its day, but one too long and too tall for its beam and ballast – a matchless array of features on an unstable platform. When the standard stability test of the day – 30 sailors running from side to side trying to rock the boat – tilted the Vasa perilously, the test was canceled and the ship readied for launch.

Despite her obvious lack of stability in port, she was allowed to set sail, and she foundered a few minutes later upon first encountering a wind stronger than a breeze. She sank.

The Vasa was located again in the late 1950s, in a busy shipping lane just outside Stockholm harbour. She was salvaged with a largely intact hull on April 24, 1961. During the recovery, marine archaeologists found thousands of artifacts and the remains of at least a few dozen people in and around the hull. The artifacts and the ship itself have given historians invaluable insight into naval warfare, shipbuilding techniques, and everyday life in early 17th-century Sweden.

The ship is currently one of Sweden’s most popular tourist attractions.