Bitcoin: how many exist, how many are lost, and its quantum computing future

Let’s start by setting some context: just how much does it cost to verify one Bitcoin transaction? A report on Motherboard recently calculated that verifying one Bitcoin transaction consumes as much electricity as the daily consumption of 1.6 American households. The Bitcoin network may consume up to 14 gigawatts of electricity by 2020 (comparable to the electricity consumption of Denmark), with a low estimate of 0.5 GW.

There is much written about the theft of Bitcoin, as holders are exposed to cyber criminals, but there are also instances of people simply losing their coins. In cases of loss, it’s almost always impossible to recover the lost Bitcoins. They remain in the blockchain, like any other Bitcoin, but are inaccessible because it is infeasible to find the private keys that would allow them to be spent again.

Bitcoin can be lost or destroyed in several ways.

Sometimes, not only individuals but also experienced companies make big mistakes and lose their Bitcoins. For example, Bitomat lost the private keys to 17,000 of its customers’ Bitcoins. Parity lost $300m of cryptocurrency due to several bugs. And most recently, more than $500 million worth of digital coins were stolen from Coincheck.

Many Bitcoin losses also date from Bitcoin’s earliest days, when mining rewards were 50 Bitcoins a block and Bitcoin was trading at less than 1 cent. At that time, many didn’t care if they lost their private keys or simply forgot about them; one man famously threw away a hard drive containing 7,500 Bitcoins.

Let’s briefly analyse Bitcoin’s creation and the growth of its supply. The theoretical total number of Bitcoins is 21 million; hence, Bitcoin has a controlled supply. The Bitcoin protocol is designed so that new Bitcoins are created at a decreasing and predictable rate: every 210,000 blocks (roughly every four years), the number of new Bitcoins created per block is automatically halved, until issuance halts completely with a total of 21 million Bitcoins in existence.
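The halving schedule can be sketched in a few lines of Python (a minimal illustration, not the actual Bitcoin Core code; the constants follow the protocol’s 210,000-block halving interval and 50 BTC initial subsidy, with amounts tracked in satoshis as the protocol does):

```python
HALVING_INTERVAL = 210_000           # blocks between subsidy halvings
INITIAL_SUBSIDY = 50 * 100_000_000   # 50 BTC per block, in satoshis

def total_supply():
    """Sum the block rewards over every halving era; result in BTC."""
    subsidy, total = INITIAL_SUBSIDY, 0
    while subsidy > 0:
        total += HALVING_INTERVAL * subsidy
        subsidy //= 2                # integer halving, as in the protocol
    return total / 100_000_000

print(total_supply())
```

Because the subsidy is halved with integer arithmetic, the sum converges to just under 21 million coins rather than exactly 21 million.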

While the number of Bitcoins in existence will never exceed 21 million, the money supply of Bitcoin can exceed 21 million due to fractional-reserve banking.


Source: en.bitcoin.it

As of June 23, 2017, Bitcoin had reached a total circulation of 16.4 million coins, which is about 78% of the eventual total of 21 million Bitcoins.

Research by Chainalysis in 2017 showed that between 2.78 million and 3.79 million Bitcoins, or 17–23% of what has been mined to date, are already lost.


How much Bitcoin exactly has been lost? It’s a tough question, considering there is no definitive metric for finding the answer. A good estimate is around 25% of all Bitcoin, according to one analysis: that research concluded 30% of all coins mined at the time had been lost, which equates to roughly 25% when adjusted for the current amount in circulation. The adjustment is reasonable because the bulk of lost Bitcoins date from the very early days, and as Bitcoin’s value has risen, people have been losing their coins at a slower rate.

With the advent of quantum computers, the future of Bitcoin might be perilous. One researcher suggested that a quantum computer could calculate the private key from the public one in a minute or two. By learning all the private keys, someone would have access to all available Bitcoin. However, more extensive research suggests that in the short term the impact of quantum computers will be rather small for the mining, security and forking aspects of Bitcoin.

It’s possible that an arms race between quantum hackers and quantum Bitcoin developers will take place. One initiative has already tested the feasibility of a quantum-safe blockchain platform utilizing quantum key distribution across an urban fiber network.

The image below shows which encryption algorithms are vulnerable to quantum computing and which are secure.


Source: cryptomorrow.com

And while work is still ongoing, three quantum-secure methods have been proposed as alternative encryption methodologies for the quantum computing age: lattice-based cryptography, code-based cryptography and multivariate cryptography. IOTA already deploys a Winternitz One-Time Signature (OTS) scheme built on Lamport signatures, which is claimed to be resistant to quantum attacks provided the underlying hash functions are large enough.
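To make the hash-based approach concrete, here is a minimal sketch of a Lamport one-time signature scheme in Python (an illustration of the general idea only, not IOTA’s actual Winternitz implementation; the 256-bit digest size and SHA-256 hash are illustrative choices):

```python
import hashlib
import secrets

def H(data: bytes) -> bytes:
    """SHA-256: the one-way function the whole scheme rests on."""
    return hashlib.sha256(data).digest()

def keygen():
    # Private key: 256 pairs of random 32-byte secrets (one pair per digest bit).
    sk = [(secrets.token_bytes(32), secrets.token_bytes(32)) for _ in range(256)]
    # Public key: the hashes of those secrets.
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def _bits(msg: bytes):
    digest = int.from_bytes(H(msg), "big")
    return [(digest >> i) & 1 for i in range(256)]

def sign(msg: bytes, sk):
    # Reveal one secret from each pair, chosen by the message-digest bits.
    return [pair[b] for pair, b in zip(sk, _bits(msg))]

def verify(msg: bytes, sig, pk):
    # Hash each revealed secret and compare with the matching public-key entry.
    return all(H(s) == pair[b] for s, pair, b in zip(sig, pk, _bits(msg)))
```

Because signing reveals half of the private key, each key pair must be used only once, which is why such schemes are called one-time signatures; Winternitz signatures trade larger amounts of hashing for smaller signatures. The security rests only on the hash function, which is why these schemes are considered quantum-resistant.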

The no-cloning theorem would make it impossible to copy and distribute a decentralized ledger of qubits (quantum units of information). Because qubits can’t be copied or non-destructively read, they would act more like real coins, with no double-spending problem. Quantum Bitcoin miners might support the network by performing operations that amount to quantum error correction (which might replace current Proof-of-Work or Proof-of-Stake systems), since quantum entanglement would let all network participants simultaneously agree on a measurement result without a proof-of-work system.

And while we wait for a quantum-era Satoshi to rise, check out this theoretical account of how quantum computers might one day create Bitcoin, which also contains primers on quantum computers and Bitcoin mining.

P.S. Satoshi is estimated to be in possession of over one million coins.


 

Bitcoin, ICOs, Mississippi Bubble and crypto future

Bitcoin bubble

Bitcoin has risen 10x in value so far in 2017, the largest gain of all asset classes, prompting sceptics to declare it a classic speculative bubble that could burst, like the dotcom boom and the US sub-prime housing crash that triggered the global financial crisis. Stocks in the dotcom crash were worth $2.9tn before collapsing in 2000, whereas the market cap of bitcoin currently (as of 03.12.2017) stands at $185bn, which could signal there is more room for the bubble to grow.

 

Many financiers and corporate stars think there is both a bubble and a huge opportunity. One of the biggest Bitcoin bulls on Wall Street, Mike Novogratz, thinks cryptocurrencies are in a massive bubble (but anticipates Bitcoin reaching $40,000 by the end of 2018). Ironically (or not), he’s launching a $500 million fund, Galaxy Digital Assets Fund, to invest in them, signalling a growing acceptance of cryptocurrencies as legitimate investments. John McAfee has doubled down on his confidence in Bitcoin by stating his belief that it will be worth $1 million by the end of 2020.

 

Former Fed Chairman Alan Greenspan has said that “you have to really stretch your imagination to infer what the intrinsic value of bitcoin is,” calling the cryptocurrency a “bubble.” Even financial heavyweights such as CME, the world’s leading derivatives marketplace, are planning to tap into this gold rush by introducing bitcoin derivatives, which will let hedge funds into the market before the end of 2017.

 

The practical applications of cryptocurrencies for facilitating legal commerce appear hampered, at this juncture, by relatively expensive transaction fees and the skyrocketing energy costs associated with mining. On this note, Nobel Prize-winning economist Joseph Stiglitz thinks that Bitcoin “ought to be outlawed” because it doesn’t serve any socially useful function and yet consumes enormous resources.

Bitcoin mania has many parallels with Mississippi Bubble

Bitcoin’s boom has gone further than famous market manias of the past like the tulip craze or the South Sea Bubble, and has lasted longer than the dancing epidemic that struck 16th-century France or the dot-com bubble of 2000. Like many other such events, the South Sea Bubble was ultimately a scheme: no reasonable real-economy trade took place, but the company’s stock kept rising on promotion and the hopes of investors.

 

In my view, a more illustrative example, with many parallels to Bitcoin, is the Mississippi Bubble, which started in 1716. Not only was the Mississippi Bubble bigger than the South Sea Bubble, it was more speculative and, for a time, more successful. It completely wiped out the French government’s debt obligations, at the expense of those who fell under the sway of John Law’s economic innovations.

 

Its origins trace back to 1684, when the Compagnie du Mississippi (Mississippi Company) was chartered. In August 1717, the Scottish businessman and economist John Law acquired a controlling interest in the then-derelict Mississippi Company and renamed it the Compagnie d’Occident. The company’s initial goal was to trade and do business with the French colonies in North America, which included most of the Mississippi River drainage basin and the French colony of Louisiana. Law was granted a 25-year monopoly by the French government on trade with the West Indies and North America. In 1719, the company acquired many French trading companies and combined them into the Compagnie Perpetuelle des Indes (CPdI). In 1720, it acquired the Banque Royale, which had been founded by John Law himself in 1716 as the Banque Generale (forerunner of France’s first central bank).

 

Law then created speculative interest in CPdI. Reports of gold and silver mines discovered in these lands were skillfully spread. Law exaggerated the wealth of Louisiana with an effective marketing scheme, which led to wild speculation in the company’s shares in 1719. Law had promised Louis XV that he would extinguish the public debt. To keep his word, he required that shares in CPdI be paid for one-fourth in coin and three-fourths in billets d’Etat (public securities), which rapidly rose in value on account of the artificial demand created for them. The speculation was further fed by a huge increase in the money supply (printing more money to meet the growing demand), introduced by Law in order to ‘stimulate’ the economy; he was also Controller General of Finances, the equivalent of France’s finance minister.

 

CPdI’s shares traded around 300 at the end of 1718, but rose rapidly in 1719, reaching 1,000 by July 1719 and breaking 10,000 in November 1719, an increase of over 3,000% in less than one year. CPdI shares stayed at the 9,000 level until May 1720, when they fell to around 5,000. By the spring of 1720, more than 2 billion livres of banknotes had been issued, a near doubling of the money supply in less than a year. By then, Law’s system had exploded: the stock-market bubble burst, confidence in banknotes evaporated and the French currency collapsed. The company sought bankruptcy protection in 1721 and was reorganised and reopened for business in 1722. Law, however, was forced into exile in late 1720 and died in 1729. At its height, the capitalisation of CPdI was greater than either the GDP of France or all French government debt.

Why did Law fail? He was over-ambitious and over-hasty (like this Bitcoin pioneer?). He believed that France suffered from a dearth of money and an ossified financial system (Bitcoin enthusiasts claim it will revolutionize economies, and that countries like India are ideal for it) and that an increase in the money supply would boost economic activity (Bitcoin aims to implement a variant of Milton Friedman’s k-percent rule: a proposal to fix the annual growth rate of the money supply at a constant rate). He believed that printing and distributing more money would lower interest rates, enrich traders, and offer more employment to people. His conceptual flaw was his belief that money and financial assets were freely interchangeable, and that he could set the price of stocks and bonds in terms of money.

Law’s aim was to replace gold and silver with a paper currency (much as Bitcoiners want to democratise or replace fiat money and eliminate banks). This plan was forced upon the French public: Law decreed that all large financial transactions were to be conducted in banknotes. Holding bullion was declared illegal; even jewelry was confiscated. He recommended setting up a national bank (the Banque Generale, in 1716), which could issue notes to buy up the government’s debt and thus bring about a decline in the interest rate.

During both South Sea and Mississippi bubbles, speculation was rampant and all manner of initial stock offerings were being floated, including:

  • For settling the island of Blanco and Sal Tartagus
  • For the importation of Flanders Lace
  • For trading in hair
  • For breeding horses

Some of these made sense, but many more were absurd.

Economic value and price fluctuations of Bitcoin

Bitcoin is similar to other currencies and commodities such as gold, oil, potatoes or even tulips in that its intrinsic value is difficult – if not impossible – to separate from its price.

A currency has three main functions: store of value, means of exchange, and unit of account. Bitcoin’s volatility, seen when it fell 20% within minutes on November 29th 2017 before rebounding, makes it both a nerve-racking store of value and a poor means of exchange. A currency is also a unit of account for debt. For example, if you had financed your house with a Bitcoin mortgage, your debt would have risen 10x in 2017 while your salary, paid in dollars, would not have kept pace. Put another way, had Bitcoin been widely used, 2017 might have been massively deflationary.

But why has the price risen so fast? One justification for the existence of Bitcoin is that central banks, via quantitative easing (QE), are debasing fiat money and laying the path to hyperinflation. But this seems a very odd moment for that view to gain adherents. Inflation remains low and the Fed is pushing up interest rates and unwinding QE.

A more likely explanation is that as new and easier ways to trade in Bitcoin become available, more investors are willing to take the plunge. As the supply of Bitcoin is limited by design, that drives up the price.

There are governments standing behind currencies, and reliable currency markets for exchange. With commodities, investors at least hold something tangible at the end of the transaction. Bitcoin is more speculative because it is digital ephemera. That isn’t true of all investments: stockholders are entitled to a share of a company’s assets, earnings and dividends, the value of which can be estimated independently of the stock’s price. The same can be said of a bond’s payments of principal and interest.

This distinction between price and value is what allowed many observers to warn that internet stocks were absurdly priced in the late 1990s, or that mortgage bonds weren’t as safe as investors assumed during the housing bubble. A similar warning about Bitcoin isn’t possible.

What about Initial Coin Offerings (ICOs)? An ICO is (in almost all jurisdictions so far) an unregulated means of raising capital for a new venture, bypassing traditional fundraising methods. Afraid of missing out on the next big thing, people are willing to hand their money over no matter how thin the premise, very much as in the South Sea and Mississippi Bubbles. ICOs closely resemble penny-stock trading, with pump-and-dump schemes, thin disclosures and hot money pouring in and out.

ICOs, while an alternative financing scheme for startups, have not so far proved sustainable for business. Although more than 200 ICOs raised more than $3 billion in 2017, only 1 in 10 tokens remains in use after the ICO. And the killer app for Ethereum, the most popular public blockchain platform and host to a growing number of ICOs? The first breakout ecosystem, a game for trading digital kittens, launched and almost crashed the Ethereum network. This game alone consumes 15% of Ethereum traffic, and even then it is hard to play because it is so slow (thanks Markus for this info bite!).

So, overall, Bitcoin (and other cryptocurrencies) exist primarily for the benefit of those who buy, hold and use them while creating an explicit economic program of counter-economics. In other words, Bitcoin is not so much about money as about power.

How it all may end (or begin)

The South Sea Bubble ended when the English government enacted laws to stop the excessive offerings. The Mississippi Bubble ended when the French currency collapsed, the French government bought back all CPdI’s shares (ultimately writing off the debt via an early form of QE) and the instigators were cast out. The unregulated markets became regulated.

From a legal perspective, most likely the same thing will happen to cryptocurrencies and ICOs. China has temporarily banned cryptocurrency exchanges until regulations can be introduced. Singapore, Malaysia and other governments plan to introduce regulations by the end of 2017 or early 2018. Disregard, ignorance or flouting of regulatory and other government-imposed rules can be fatal for startups and big businesses alike.

From a technology perspective, a number of factors might bring it down, including hard forks, ledger and wallet hacking, and sheer limitations related to scaling, energy consumption and security. Moreover, many common claims about blockchain/Bitcoin, such as a blockchain being everlasting and indestructible, miners providing security, and anonymity being a universally good thing, are exaggerated, only sometimes true, or patently false.

From a business perspective, startups and companies raising money via ICOs can be vehicles for fraud (Goldman Sachs’ CEO claims Bitcoin is a suitable means for conducting fraud) and are thus subject to money-laundering, counter-terrorism and other relevant government legislation. From an investor’s perspective, shorting seems the most sure-fire way to invest profitably in cryptocurrencies.

During the dot-com craze, Warren Buffett was asked why he didn’t invest in technology. He famously answered that he didn’t understand tech stocks. But what he meant was that no one understood them, and he was right. Why else would anyone buy the NASDAQ 100 Index when its P/E ratio was more than 500x (a laughably low earnings yield of 0.2%), which is where it traded at the height of the bubble in March 2000?

It’s a social or anthropological phenomenon that’s reminiscent of how different tribes and cultures view the concept of money, from whale’s teeth to abstract social debts. How many other markets have spawned conceptual art about the slaying of a “bearwhale”?

Still, the overall excitement around Bitcoin shows that it has tapped into a speculative urge, one that isn’t looking to be reassured by dividends, business plans, cash flows, or use cases. Highlighting a big, round number like $10,000 only speaks to our emotional reaction to big, round numbers. But it doesn’t explain away the risk of this one day falling to the biggest, roundest number of all – zero.

Top 13 challenges AI is facing in 2017

AI and ML feed on data, and companies that center their business on the technology are growing a penchant for collecting user data, with or without consent, in order to make their services more targeted and efficient. Implementations of AI/ML already make it possible to impersonate people by imitating their handwriting, voice and conversation style, an unprecedented power that can come in handy in a number of dark scenarios. However, despite large amounts of previously collected data, early AI pilots have had trouble producing the dramatic results that technology enthusiasts predicted. For example, early efforts of companies developing chatbots for Facebook’s Messenger platform saw 70% failure rates in handling user requests.

One of the main challenges of AI goes beyond data: false positives. For example, a name-ranking algorithm ended up favoring white-sounding names, and advertising algorithms preferred to show high-paying job ads to male visitors.

Another challenge that caused much controversy in the past year was the “filter bubble” phenomenon that was seen in Facebook and other social media that tailored content to the biases and preferences of users, effectively shutting them out from other viewpoints and realities that were out there.

Additionally, as we give more control and decision-making power to AI algorithms, not only technological but also moral and philosophical considerations become important: for example, when a self-driving car has to choose between the life of a passenger and the life of a pedestrian.

To sum up, the following are the challenges that AI still faces, despite creating and processing increasing amounts of data and unprecedented amounts of other resources (people working on algorithms, CPUs, storage, better algorithms, etc.):

  1. Unsupervised Learning: Deep neural networks have afforded huge leaps in performance across a variety of image, sound and text problems. Most notably, in 2015 the application of RNNs to text problems (NLP, language translation, etc.) exploded. A major bottleneck is labeled data acquisition. It is known that humans learn about objects and navigation with relatively little labeled “training” data. How is this performed? How can it be efficiently implemented in machines?
  2. Select Induction Vs. Deduction Vs. Abduction Based Approach: Induction is almost always a default choice when it comes to building an AI model for data analysis. However, it – as well as deduction, abduction, transduction – has its limitations which need serious consideration.
  3. Model Building: TensorFlow has opened the door for conversations about building scalable ML platforms. There are plenty of companies working on data-science-in-the-cloud (H2O, Dato, MetaMind, …), but the question remains: what is the best way to build ML pipelines? This includes ETL, data storage and optimisation algorithms.
  4. Smart Search: How can deep learning create better vector spaces and algorithms than Tf-Idf? What are some better alternative candidates?
  5. Optimise Reinforcement Learning: As this approach avoids the problem of getting labelled data, the system must get data, learn from it and improve on its own. While AlphaGo used RL to win against the Go champion, RL isn’t without its own issues; there are good discussions of its limitations at a conceptual level as well as on more technical aspects.

  6. Build Domain Expertise: How to build and sustain domain knowledge in industries and for problems that involve reasoning over a complex body of knowledge (legal, financial, etc.), and then formulate a process by which machines can simulate an expert in the field.
  7. Grow Domain Knowledge: How can AI tackle problems, which involve extending a complex body of knowledge by suggesting new insights to the domain itself – for example new drugs to cure diseases?
  8. Complex Task Analyser and Planner: How can AI tackle complex tasks requiring data analysis, planning and execution? Many logistics and scheduling tasks can be done by current (non-AI) algorithms. A good example is the use of AI techniques in IoT for sparse datasets: AI techniques help here because there are large and complex datasets in which human beings cannot detect patterns but machines can do so easily.
  9. Better Communication: While the proliferation of smart chatbots and AI-powered communication tools has been a trend for several years, these tools are still far from smart and may at times fail to recognise even simple human language.
  10. Better Perception and Understanding: While Alibaba and Face++ create facial recognition software, visual perception and labelling remain generally problematic. There are a few striking examples, like the Russian face recognition app that is good enough to be considered a potential tool for oppressive regimes seeking to identify and crack down on dissidents. Another algorithm proved effective at peeking behind masked and blurred images.
  11. Anticipate Second-Order (and Higher) Consequences: AI and deep learning have improved computer vision, for example, to the point that autonomous vehicles (cars and trucks) are viable (Otto, Waymo). But what will their impact be on the economy and society? What’s scary is that as AI and related technologies advance, we may understand less and less about AI’s data analysis and decision-making processes. Starting in 2012, Google used LSTMs to power the speech recognition system in Android, and in December 2016 Microsoft reported that its system had reached a word error rate of 5.9%, a figure roughly equal to human performance for the first time in history. The goal-post continues to be moved rapidly: for example, loom.ai is building an avatar that can capture your personality. Preempting what’s to come, starting in the summer of 2018 the EU is considering requiring that companies be able to give users an explanation for decisions that their automated systems reach.
  12. Evolution of Expert Systems: Expert systems have been around for a long time. Much of the vision of expert systems could be implemented in AI/deep learning algorithms in the near future. The architecture of IBM Watson is an indicative example.
  13. Better Sentiment Analysis: Still catching up with, and far from surpassing, lexicon-based models, sentiment analysis remains a nascent and uncharted space for most AI applications. There are some small steps in this regard, though, including OpenAI’s use of an mLSTM methodology to conduct sentiment analysis of text. The main issue is that there are many conceptual and contextual rules (rooted in the particulars of culture, society, upbringing, etc. of individuals) that govern sentiment, and there are even more clues (possibly unlimited) that can convey these concepts.
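As a point of reference for item 4 above, the Tf-Idf baseline that smarter search methods are measured against can be sketched in a few lines (a toy illustration over pre-tokenised documents; the weighting variant shown, relative term frequency times log inverse document frequency, is one of several in common use):

```python
import math
from collections import Counter

def tf_idf(docs):
    """docs: list of token lists. Returns one {term: weight} dict per document."""
    n = len(docs)
    df = Counter()                      # document frequency of each term
    for doc in docs:
        df.update(set(doc))
    weights = []
    for doc in docs:
        tf = Counter(doc)               # raw term counts in this document
        weights.append({
            term: (count / len(doc)) * math.log(n / df[term])
            for term, count in tf.items()
        })
    return weights
```

A term appearing in every document gets weight zero, which is exactly the limitation the "Smart Search" challenge points at: Tf-Idf captures term rarity but nothing about meaning, so semantically related documents that share no vocabulary look completely dissimilar.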

Thoughts/comments?

101 and failures of Machine Learning

Nowadays, ‘artificial intelligence’ (AI) and ‘machine learning’ (ML) are cliches that people use to signal awareness of technological trends. Companies tout AI/ML as panaceas for their ills and as a competitive advantage over their peers. From flower recognition to the algorithm that won against the Go champion to big financial institutions (including the ETFs of the biggest hedge fund in the world), organisations are already in, or moving to, the AI/ML era.

However, as with any new technological breakthrough, discovery or invention, the path is laden with misconceptions, failures, political agendas, etc. Let’s start with an overview of the basic methodologies of ML, the foundation of AI.

101 and limitations of AI/ML

The fundamental goal of ML is to generalise beyond the specific examples/occurrences in the data. ML research focuses on experimental evaluation on actual data for realistic problems. An ML system’s (algorithm’s, program’s) performance is evaluated by training it on a set of training examples and measuring its accuracy at predicting novel test (or real-life) examples.
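That train-then-evaluate loop can be illustrated with a deliberately simple learner (a 1-nearest-neighbour classifier; the function names and toy data are invented for illustration, not drawn from any particular library):

```python
def nearest_neighbor(train, x):
    """train: list of (feature_vector, label) pairs.
    Predict the label of the training point closest to x (squared Euclidean distance)."""
    return min(train, key=lambda p: sum((a - b) ** 2 for a, b in zip(p[0], x)))[1]

def accuracy(train, test):
    """Fraction of held-out test examples the learner labels correctly."""
    return sum(nearest_neighbor(train, x) == y for x, y in test) / len(test)

# Train on a few labelled points, then measure generalisation on unseen ones.
train = [((0, 0), "cat"), ((1, 0), "cat"), ((10, 10), "dog"), ((9, 10), "dog")]
test = [((1, 1), "cat"), ((9, 9), "dog")]
print(accuracy(train, test))
```

The key point is that accuracy is measured on examples the learner never saw during training; high accuracy on the training set alone says nothing about generalisation.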

Most frequently used methods in ML are induction and deduction. Deduction goes from the general to the particular, and induction goes from the particular to the general. Deduction is to induction what probability is to statistics.

Let’s start with induction. The domino effect is perhaps the most famous instance of induction. Inductive reasoning consists in constructing axioms (hypotheses, theories) from the observation of the supposed consequences of these axioms. Induction alone is not that useful: the induction of a model (a piece of general knowledge) is interesting only if you can use it, i.e. if you can apply it to new situations by going somehow from the general to the particular. This is what scientists do: observing natural phenomena, they postulate the laws of Nature.

However, there is a problem with induction: it is impossible to prove that an inductive statement is correct. At most, one can empirically observe that the deductions that follow from the statement are not contradicted by experiments. But one can never be sure that no future observation will contradict the statement. Black Swan theory is the most famous illustration of this problem.

Deductive reasoning consists in combining logical statements (axioms, hypotheses, theorems) according to certain agreed-upon rules in order to obtain new statements. This is how mathematicians prove theorems from axioms. Proving a theorem is nothing but combining a small set of axioms according to certain rules. Of course, this does not mean proving a theorem is a simple task, but it could theoretically be automated.

A problem with deduction is exemplified by Gödel’s theorem, which states that for a rich enough set of axioms, one can produce statements that can be neither proved nor disproved.

Two other kinds of reasoning exist, abduction and analogy, and neither is frequently used in AI/ML, which may explain many current AI/ML failures and problems.

Like deduction, abduction relies on knowledge expressed through general rules. Like deduction, it goes from the general to the particular, but it does so in an unusual manner, since it infers causes from consequences: from “A implies B” and “B”, A can be inferred. For example, most of a doctor’s work is inferring diseases from symptoms, which is what abduction is about: “I know the general rule which states that flu implies fever. I’m observing fever, so there must be flu.” However, abduction is not able to build new general rules: induction must have been involved at some point to state that “flu implies fever”.
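The flu-and-fever example maps naturally onto a toy rule base (a deliberately naive sketch; the rules and symptom names are invented for illustration, and a real abductive system would rank competing explanations by plausibility rather than just enumerate them):

```python
# General rules, learned by induction at some earlier point: cause -> known effects.
RULES = {
    "flu": {"fever", "cough"},
    "allergy": {"sneezing", "itchy eyes"},
}

def abduce(observation):
    """Abduction: infer candidate causes whose known effects include the observation."""
    return sorted(cause for cause, effects in RULES.items()
                  if observation in effects)
```

Note the direction of inference: the rules go from cause to effect, but `abduce` runs them backwards, from an observed effect to its possible causes, which is exactly what makes abduction logically unsound yet practically indispensable.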

Lastly, analogy goes from the particular to the particular. The most basic form of analogy rests on the assumption that similar situations have similar properties. More complex analogy-based learning schemes, involving several situations and recombinations, can also be considered. Many lawyers use analogical reasoning to analyse new problems based on previous cases. Analogy completely bypasses model construction: instead of going from the particular to the general and then from the general to the particular, it goes directly from the particular to the particular.

Let’s next examine some conspicuous AI/ML failures from 2016, together with the AI/ML methodology that, in my view, was responsible for each:

Microsoft’s chatbot Tay utters racist, sexist, homophobic slurs (mimicking/analogising failure)

In an attempt to form relationships with younger customers, Microsoft launched an AI-powered chatbot called “Tay.ai” on Twitter in 2016. “Tay,” modelled around a teenage girl, morphed into a “Hitler-loving, feminist-bashing troll” within just a day of her debut online. Microsoft yanked Tay off the social media platform and announced it planned to make “adjustments” to its algorithm.

AI-judged beauty contest was racist (deduction failure)

In “The First International Beauty Contest Judged by Artificial Intelligence,” a robot panel judged faces based on “algorithms that can accurately evaluate the criteria linked to perception of human beauty and health.” But because the organisers failed to supply the AI/ML system with a diverse training set, the contest winners were all white.

Chinese facial recognition study predicted convicts but shows bias (induction/abduction failure)

Researchers in China published a study entitled “Automated Inference on Criminality using Face Images.” They “fed the faces of 1,856 people (half of which were convicted violent criminals) into a computer and set about analysing them.” The researchers concluded that there were some discriminating structural features for predicting criminality, such as lip curvature, eye inner corner distance, and the so-called nose-mouth angle. Many in the field questioned the results and the report’s ethical underpinnings.

Concluding remarks

The above examples must not discourage companies from incorporating AI/ML into their processes and products. Most AI/ML failures seem to stem from a band-aid, superficial way of embracing AI/ML. A better and more sustainable approach is to initiate a mix of projects generating both quick wins and long-term transformational products/services/processes. For quick wins, a company might focus on changing internal employee touchpoints, using recent advances in speech, vision and language understanding.

For long-term projects, a company might go beyond local/point optimisation, to rethinking business lines, products/services, end-to-end processes, which is the area in which companies are likely to see the greatest impact. Take Google. Google’s initial focus was on incorporating ML into a few of their products (spam detection in Gmail, Google Translate, etc), but now the company is using machine learning to replace entire sets of systems. Further, to increase organisational learning, the company is dispersing ML experts across product groups and training thousands of software engineers, across all Google products, in basic machine learning.

 

The rise and demise of GM’s Saturn

On a cold, wintry day in January 1985, the top man at GM, Roger B. Smith, unveiled ‘Saturn’, the first new brand to come out of GM in almost six decades. A stand-alone subsidiary of GM, Saturn had a promising birth and was touted as a ‘different’ type of car. With its own assembly plant, unique models and separate retailer network, Saturn operated independently from its parent company. It was a cut above the rest in using innovative technology and involving its employees in the decision-making process. Conceived as a fighter brand to take on the Japanese marques, the small car of superior quality was the product of strong principles, with a mission of being America’s answer to Japan’s challenge. It reaffirmed the strength of American technology, ingenuity and productivity by combining advanced technology with the latest approaches to management.

Though a revolutionary idea, Saturn wasn’t able to live up to the hype or the hopes of Roger Smith. The case of Saturn is definitely one for the books. Its marketing campaign fired up the public’s imagination and interest perfectly, while the product was a miserable failure. Everything the company did was a page out of the perfect-PR handbook. When the first lot of cars had faulty engine coolant, the company replaced the entire car instead of just the coolant, much to the customers’ delight.

Besides clever marketing, Saturn’s biggest assets were its passionate employees and customer-centric approach, which rewarded it with a quick victory. The victory was short-lived, however, as GM was reluctant to expand Saturn’s offerings for fear of cannibalizing the sales of its other divisions. In the existing models, Saturn’s engine had inferior motor mounts, the plastic dashboard panels gave the interior a cheap look, and even the plastic-polymer doors, the so-called unique feature, failed to fit properly. Overall, the car had neither an identity nor a USP. To make things worse, Roger Smith was on a spending spree, from throwing recall parties when vehicle problems were solved to hosting “homecoming” celebrations at plants. This saddled GM with high costs, increasing doubts about Saturn’s survival among GM’s leaders.

Disaster struck further when Saturn’s sub-compact prices failed to cover the huge costs gobbled up by a dedicated plant with massive operating costs. The fact that the plant churned out cars that barely shared any common parts with other GM brands did not help at all. To top it all, at a time when buyers were snapping up minivans and SUVs, Saturn’s offerings were limited to three small models for over a decade, thereby losing out on locking customers in. Just when GM was pondering scrapping the brand, the UAW visited one of Saturn’s production facilities with its international contract, only to be rejected by the workers. Inevitably, the company’s unique labor contract was dissolved and GM had no choice but to part with the brand, dividing production among other GM plants.

Automotive history has witnessed myriad failure stories of brands that were supposed to be world-class products but ended up biting the dust. One such underachiever was Vector, which sprouted from the aim of producing an American supercar but was doomed by cash-flow issues, mismanagement and a failure to keep its insane promises. Sterling, Rover’s disguise for the American market, was another lost car of the 80s which most people haven’t even heard of. Its promise of delivering “Japanese reliability and refinement with traditional British luxury and class” couldn’t save it from a continuous sales drop and competition from new Japanese rivals. A few other epic automotive experimental failures worth recalling here include Chrysler’s TC by Maserati, Subaru SVX, Jaguar X-Type, Lincoln Blackwood, GMC Envoy XUV, Chevrolet SSR, Chrysler Crossfire and the Dodge Durango Hybrid/Chrysler Aspen Hybrid. While some were design disasters, the others just couldn’t perform.

The automobile industry is governed by various factors, including the technological advancements of the time, economic conditions and fluctuations in consumer needs. The latest newcomers on the block are electric cars, which are set to revolutionize the entire industry. LeEco, a Chinese electronics company, is taking serious steps to target Tesla, investing $1.08 billion in developing its debut electric car. Tesla is the name that paved the way for the electric vehicle era. Whether LeSEE, LeEco’s concept sedan, can surpass Tesla’s performance and give it a run for its money, only time will tell. If successful, these electric cars could be the game changers of this century and usher in an electric future. If not, LeSEE will fade away and claim its place as a bittersweet memory on the industry’s list of flops.

Modern saga “The Fox and the Hedgehog”: generalists vs. specialists

About 2,700 years ago, Archilochus wrote that “The fox knows many things, but the hedgehog knows one big thing.” Taking that as a starting point, Isaiah Berlin’s 1953 essay “The Hedgehog and the Fox” contrasts hedgehogs that “relate everything to a single, central vision” with foxes who “pursue many ends connected … if at all, only in some de facto way.”

And so we have become a society of specialists, with the much-heralded “learn more about your function, acquire ‘expert’ status, and you’ll go further in your career” considered the corporate Holy Grail. But is it?

The modern corporation grew out of the Industrial Revolution (IR). The IR started in 1712 when an Englishman named Thomas Newcomen invented a steam-driven pump to pump water out of a mine, so that miners could spend their time mining coal rather than hauling buckets of water. That was the dawn of the IR. It was all about productivity: more coal per man-hour, then more steel per man-hour, more textiles per man-hour, and so on.

The largest impact of the IR was the “socialization” of labor. Prior to the IR, people were largely self-sufficient, but the IR brought an increased division of labor, and this division of labor brought specialisation, which brought increased productivity. This specialisation, though, decreased self-sufficiency: people became increasingly inter-dependent, and thus more socialised. With the division of labor, the individual needed only to know how to do a specific task and nothing more. Specialization also caused compartmentalization of responsibility and awareness. On a national level, it has allowed nations to become increasingly successful while their citizens become increasingly ignorant. Think of the average American: you can be totally wrong about almost everything in life, but as long as you know how to do one thing well you can be a success. In fact, in such a society increased specialization becomes advantageous, given the extreme competition; environments with more competition breed more specialists.

But is the formula that ushered humanity into the 20th century of rapid technological industrialisation and economic development still valid, or as impactful, in the 21st century as it was for the last 300 years? In our modern VUCA world, who (specialists or generalists) has a better chance of not only surviving but thriving?

According to a number of independent research papers, the employees most likely to come out on top at companies and become successful in the long term are generalists, and not just because of their innate ability to adapt to new workplaces, job descriptions or cultural shifts. For example, according to Carter Phipps (author of Evolutionaries), generalists (will) thrive in a culture where it’s becoming increasingly valuable to know “a little bit about a lot.” More than half of employees with specialist skills now consider their job to be mostly generalist, despite having been employed for their niche skills, according to another survey. Among the survey respondents, 60% thought their boss was a good generalist, and transferable skills, such as people skills and leadership, are often associated with more senior roles.

We’ve become a society that is data rich (from all the various specialisations, industries and technologies) and meaning poor. A rise in specialists in all areas (science, math, history, psychology) has left us with a huge amount of data/info/knowledge, but how valuable is it without context? Context in a data-rich world can only be provided by generalists, whose breadth of knowledge can serve as the link between various disciplines/contexts/frameworks.

A good generalist, David Christian, gave his 2011 TED talk “Big History,” covering the entire universe from the Big Bang to the present in 18 minutes, using principles of physics, chemistry, biology, information architecture and human psychology.

To conclude, it seems that specialisation is becoming less and less relevant due to 1) the increasing, interconnected and overlapping data and information that permeates all aspects of our lives, 2) the increasing VUCA-ness of the social, political and economic situations of individuals and nations, and 3) the need to envision and derive from a bigger context, or to connect several contexts/disciplines/frameworks. All three points seem to be better addressed by generalists.

Don’t Fail Your Business – Avoid the Most Obvious Traps

This is a guest blog by Eve Baxton.

Owning a professional business is a huge responsibility. It involves all the potential risks and traps imaginable. According to a 2009 survey by PricewaterhouseCoopers, more than half of enterprises suffered from economic crime. In most cases, small enterprises are the easiest prey for the potential traps around.

These dangers include employees or managers misrepresenting or manipulating financial information, customers misusing the enterprise’s borrowing for criminal activities and contractors producing false bills. Other schemes perpetrate fraud against the enterprise by electronic means: hacking, manipulating the telephone service and stealing customers’ confidential data, such as credit card numbers or usernames and passwords.

The ultimate task in protecting your business, to be taken care of as soon as possible, is identifying the sources of potential failure. That is not easy for a less experienced businessman. By considering some basic yet important things, however, the enterprise can be protected from potential frauds fairly easily.

According to the statistics, the most common reasons for falling into these traps are lack of experience, commitment and, most significantly, management. Let’s take a look at the potential traps that most enterprises, big or small, fall for.

 

  • The Innocent Employee Trap
    According to the facts, most of the time insiders (employees, managers, or company officials) are the main reason an enterprise falls and fails. These employees “innocently” steal the company’s assets and commit accounting fraud. Detecting such actions is vital; it requires, however, considerable methodology and commitment.
  • The Clever Customer Trap
    Another hazardous trap that many enterprises fall for is the clever customer trap. Many customers use fake identities or stolen credit cards, or file fake liability and injury claims to defraud the company. All these actions by “the sweet customers” only result in the enterprise’s money being taken away.
  • The Fake Return Scheme Trap
    The most common victims of this type of fraud are retailers. It is the clever customer’s favorite scam against any enterprise. Most of the time, the customer brings back used merchandise that was not bought from the same place and tries to return it for new merchandise under the shelter of a fake return scheme.
  • The Greedy Contractor Trap
    Most companies rely on outside service providers for their survival. Therefore, many enterprises fall for the greedy contractor trap. It is not uncommon for a contractor to bill more than the completed work is worth, or even to bill for work that has not yet been completed. To avoid such attempts, strong policies, terms and conditions should be presented to contractors before entrusting them with any task, leaving no room for such frauds.

 

Protecting Your Professional Businesses

After reviewing the potential traps, let’s take a look at the precautionary measures an enterprise can take to avoid unnecessary failures; these important methods can fraud-proof your business against its very own employees.

  1. The Method of Anonymous Employee Reporting

This is one of the best methods for detecting potential dishonesty within your business; compared to scheduled employee reporting, the violators’ opportunities to destroy any vital fraud traces are severely limited.

  2. The Method of Surprise Auditing

Along with regular, scheduled internal audits, surprise audits should also be performed in the enterprise, as they significantly increase the chances of detecting potential irregularities, leaving your company better protected.

  3. The Method of External Auditing

Along with the above-mentioned methods, external auditing is also a significant tool for detecting potential fraud; it should be held at regular intervals.

 

Business Insurance – The Ultimate Protection

Along with all the above methods, the most important step in stabilizing an enterprise is business insurance. It plays an important role in preventing failures related to undetected frauds. Professional business insurance protects the business from many unexpected frauds and enables recovery from what may seem a catastrophic loss. Even when insuring your business, however, you should consider several key factors; for your business insurance to be viable, it needs to offer equal protection for any business, regardless of size, and to provide protection from all sources of fraud and failure.

The first thing that comes to mind is protection from theft. Theft is the most common and uncontrolled source of asset loss, occurring in most enterprises both internally and externally, making insurance without theft protection almost useless. Protection from litigation is another important aspect; while unfounded lawsuits from wealthier competitors are likely to turn out in your favor, the costs incurred before you prove your innocence can cause your business to fail, so proper insurance has that covered. Protection from unwanted liabilities covers you when assets coming from your company are used for illegal activities, which is not a rare occurrence with internal frauds and thefts.

It’s your responsibility to properly implement the fraud-proofing method that best suits your enterprise’s needs. All it requires is some commitment and dedication to the business; no matter how malicious the fraudster’s aims are, you can stop your company’s failure before it’s too late!

What is traditional consultancy and why it mostly fails

What is consultancy and is there a value in it?

The truth is that only a comparatively small number of consulting projects seem to be successful. Either consultant recommendations fail at the implementation phase, or they don’t even survive reality checks. In the worst cases, they have disastrous consequences for the organization. As Ferdinand Piëch, the CEO of Volkswagen, famously said: “If you want to ruin a company, you only have to try fixing it with the help of external consultants.”

The story of Bag, Borrow or Steal, one of the first online sites allowing rental of expensive items (from several days to months), comes to mind. After the company secured venture capital, a consulting firm was brought in to take it to the next level. The firm recommended that, in order to attract a more high-profile customer, the company change its name – the very name that, in addition to being instantly recognizable and very descriptive of the nature of the business, was forever immortalized in the movie Sex and the City. The proposed new name was Avelle, thought by the consultants to sound more luxurious.

Following the consulting firm’s advice, the end result was lost sales, as loyal customers became confused by the name change and the brand equity built into the original name was lost as people wondered who the heck Avelle was. Top executives of BBS were fired and the company returned to its original name.

BBS is far from an exception, nor are the culprits only amateur consultancies. Take McKinsey, one of the best of the breed. Its advice led GE to lose $1 billion in 2007 and Swissair to bankruptcy. The list of McKinsey screw-ups is long. And if that is the best of the best, it is easy to imagine what other consultancy firms are up to.

Some estimate that about 80% of all consulting projects fail.

Usually, the traditional consultancy firms are called upon for the following reasons:

  1. Political leverage: CEOs that want or need to make an unpopular decision often bring in a consulting firm to help.
  2. Pool knowledge across functions: In large companies, cross-functional problem-solving rarely happens. Just getting different functions in a room typically unlocks creative problem solving.
  3. Pool knowledge across levels: Consultants interview, watch, and tag along with people down the organizational structure, often starting with customers and moving through sales and up. Top management of a company rarely does this. There are tremendous insights to be had by doing this.
  4. Deep focus on one problem.
  5. Consulting firms have access to way more data.
  6. The ability to structure a problem and to approach the task of identifying a solution in a methodical manner.
  7. A totally unbiased opinion on the topic.

The few studies that one can find identify several reasons for failure, which fall into four groups: personal characteristics of consultants, technical shortcomings of proposed solutions, problematic client–consultant relationships, and socio-political aspects of the client organization.

The underlying reasoning of how consultancy works, is valorized and is perceived follows what Heinz von Foerster called the “trivial machine” model.

The trivial-machine logic of traditional consultancy holds that consulting happens between two parties, consultant and client, brought together to work on a certain project, and that the consultant possesses more experience/knowledge/expertise and can more or less clearly identify a problem and propose a solution. The focus is on analyzing and bridging the gap between the consultant’s body of knowledge and skills and the requirements of the client. The assumptions here are that client information (especially related to weaknesses, problems or any issue requiring an external intervention) is readily available, comprehensive and understandable, and that the consultant can freely and efficiently access, understand and process this information.

In reality, however, it’s impossible for the client or even the consultancy to arrive at an “authentic” problem description.

The traditional consulting model is not only linear and simplistic but also unrealistic, as social factors (the social interconnection, interaction and environment in which the consultancy and the client operate) are largely ignored. These factors, essential and decisive in the make-or-break of a project, are considered in what von Foerster described as the “non-trivial machine model.” Another significant factor, related to “no one wants to rock the boat” psychology, is that consultants usually don’t feel comfortable telling the client’s top management that some processes/structures are not optimized or performing, and thus don’t tell them; nor does the client’s top management like having consultants tell them they are not doing well (in some cases consultants are dismissed after their recommendations implied any wrongdoing or suboptimal performance on the part of the client’s top management).
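Von Foerster’s distinction can be sketched as a toy in code (the names and behavior here are purely illustrative, not from the source): a trivial machine is a fixed input-output mapping, so the same input always yields the same output, while a non-trivial machine carries hidden internal state that each input also changes, so identical inputs can produce different responses. In the same way, the same consulting advice can perturb a client system on one occasion and bounce off it on another.

```python
# A trivial machine: a fixed input-output mapping.
# The same input always produces the same output.
def trivial_machine(advice: str) -> str:
    return f"implemented: {advice}"

# A non-trivial machine: the output depends on hidden internal state,
# and each input also changes that state, so the same advice can be
# rejected on one occasion and absorbed on another.
class NonTrivialMachine:
    def __init__(self) -> None:
        self.receptivity = 0  # hidden internal state

    def respond(self, advice: str) -> str:
        self.receptivity += 1
        if self.receptivity % 2 == 0:
            return f"absorbed: {advice}"
        return f"rejected: {advice}"

client = NonTrivialMachine()
print(trivial_machine("restructure"))  # same output every time
print(client.respond("restructure"))   # rejected: restructure
print(client.respond("restructure"))   # absorbed: restructure
```

The point of the sketch is that an outside observer who sees only inputs and outputs cannot predict the non-trivial machine’s behavior, which is exactly the position a consultant is in with respect to a client organization.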

Niklas Luhmann, the German sociologist famous for his social systems theory, picked up von Foerster’s theory of trivial and non-trivial machines and applied it to social systems. According to Luhmann, communication is the most important part of any society. All social relations are conceptualized as processes of communication: communications that connect to earlier communications and call forth further communications. The crucial point is that this communication process takes place relatively independently of the individual human beings involved. Although communication cannot be effected without the involvement of human beings, the particular development of the communication process is beyond their control.

For example, the same word “yes” might be understood as signaling a confirmation, a doubt or even a rejection (if interpreted as irony). Thus, the meaning of a message, i.e. the concrete communication, is not produced by the speaker but by the listener.

Applying Luhmann’s theory to the social context of consultant and client, we need to differentiate three systems: consultant, client and communication media. The theory suggests that any intended consulting intervention becomes impossible right from its beginning. Following Luhmann, the only reason for the employment of external consultants is the possibility that the client’s systems (internal/external processes, operations, information flows, etc.) get perturbed/“irritated”. However, not all (consultancy) interventions cause such a perturbation, as whether they do is decided only by the client system itself.

This model is more comprehensive, as it includes all the considerations of the linear/traditional consultancy logic plus social factors. When deciding whether or not to employ a consultancy, the only possibility for clients is to observe and judge which consulting firm has the most potential to perturb their systems. Finally, the evaluation stage becomes redundant, as there is no content or objective to evaluate, and nobody is able to evaluate the degree of perturbation.

All of the above does not imply that consultancy practice is futile and needs to be discarded. Even traditional consulting projects can and sometimes do have positive effects for clients, but usually in a different way than intended or planned. Rather than transfer some kind of knowledge, a consulting firm should cause (via the communication system) perturbations in client systems that trigger positive changes in their processes and structures, changes which otherwise might not have been achieved.

The prevalent trend and common thinking (among clients and consultants alike) is to attribute any success of a consultancy intervention to the consulting firm. As should be clear from the above, this is misleading and factually wrong. All change, as with humans, needs to come from within, whether triggered from without (a consultancy intervention) or from within.

There already exist “systemic consultants,” unsurprisingly mostly in the German-speaking world, including Königswieser & Network and OSB international. As can be seen from the client testimonials of the former and the project descriptions of the latter, the following activities are commonly practiced by systemic consultancy firms:

  • organizational development and transformation
  • restructuring and change management
  • strategic assessment and revision
  • systems diagnosis and analysis
  • etc.

The services above are not very different from what traditional consultancies offer. But then, they shouldn’t be: every organization has a strategy, objectives, and so on – nothing new under the sun.

Three points that are different between traditional and systemic consultancy practices are:

  1. Mutual ownership of the client’s project: traditional consultancies position themselves as experts in terms of knowledge/experience and usually have a more “dictatorial” and outsider approach of “what needs to be done,” whereas systemic consultants come in as an unbiased party that co-owns the client project, trying to add value with their tools/methodologies.
  2. Social and psychological factors fully weighed: the psychology of clients and their top management, the framework of social interaction and cultural factors are all heavily considered; subtleties related to message creation (by the consultancy), communication and perception (by clients) play a paramount role, as do the analysis of a (perceived) problem and the suggestion of possible solutions/improvements.
  3. Education, assistance and support: in most cases, systemic consultants act as educators, in contrast to traditional consultants, who often think that educating a client might diminish the chance of being employed by that client in the future; as is apparent from many client testimonials, systemic consultants offer assistance and support on vital issues such as drafting organizational strategies or HR incentive systems; systemic consultants also educate their own teams in the client context.

How admitting a failure can lead to success

Launched at the beginning of 2011 by Engineers Without Borders Canada, the Toronto-based Admitting Failure is intended to be “a collaboration between like-minded NGOs, governments, donors and those in the private sector,” in the site’s own words.

Those involved with charitable development groups can visit the site to submit their own stories of plans gone wrong, or they can browse through the stories submitted by others, rating and commenting upon them along the way. Either way, failures are bound to be exposed and lessons learned.

But why admit to or even share with others a failure?

Failure and success are two sides of the same coin called life, be it a human life or that of an organization. Failure is as natural to humans as it is to organizations consisting of humans. And just as all humans are resistant to failing (and even more so to admitting it), so are human organizations.

Admitting failure enriches and brings one closer to success, just as formulating and clearly understanding a problem brings one closer to its solution.

This is not just to make us feel self-satisfied or justified in front of our conscience. It is as real as you’ll get. A recent article on HBR, for example, shows how Domino’s Pizza – not just some small and insignificant company – after much public flogging and condemnation (a lot of it online), had its CEO Patrick Doyle admit failure on TV, and not just any failure, but the very essential point of having a rather inferior quality of pizza. This pain point served as a good lesson, and Domino’s quickly turned itself around and is again living its stellar time.

IBM did it back in the 80s under Louis Gerstner Jr.

GE did it under Jack Welch in the 90s.

Daimler and Ford did it in the 70s and 80s.

A route to success, whether personal or in business, lies in admitting a problem, failure or pain-point.

Why not you or your company? Start anew by submitting your past failures, mistakes and pain-points.

Attitudes of failing companies and Sisyphus

I came across a nice article on Digital Tonto about how companies fail. I previously posted about why smart people and companies do dumb things, eventually ushering in failure. The article, however, shares some interesting insights and examples of how companies, even very successful ones, commence their decline by following one of the “attitudes”/approaches below (comments are mine).

  1. Overconfidence: mostly driven by past success and self-confidence.
  2. Overvaluing Strategy and Undervaluing Process: companies that are obsessed by grand visions and strategies tend to underestimate the incremental changes and the process itself which are the drivers of success.
  3. Looking for Dragons to Slay: this quality is typical of a few very successful companies, such as Google, which, once at the summit of their success, go after other markets, products and companies.
  4. Disruptive Competition: which might or might not bring value to end users.
  5. The Lambda Response: instead of the problem at hand being solved, it is exacerbated by internal confusion, inefficiencies and panic.

Some of these attitudes leading to failure are among the famous Ten Commandments for business failure of Mr. Coke (Donald Keough, Coca-Cola’s long-time president).

The article offers a solution to embrace, a corporate equivalent of Sisyphus:

When company leaders are like Camus’ Sisyphus, they are most likely to be successful.  Companies who are focused at the task at hand, rather than building empires and seeking out the “Next Big Thing” are doing their shareholders the greatest service.  For a company to be profitable over the long term it has to perform and that can only happen if the organization is united in its purpose.

Constant focus on creating value for existing and potential customers, unity of purpose, corporate humility and perseverance are the generic vaccines for companies that are successful but not yet narcissistic or obsessed with self-grandeur.