Thursday, March 6, 2014

Specialization Nation

Are we losing the ability to empathize with one another?


            Before the Neolithic revolution, every human had pretty much the same job.  That is, everyone was either a hunter or a gatherer.  This lifestyle left little time to do anything other than look for food.  It is no surprise that we know little about this time, because there was no written language; there was no time to sit around writing, much less reading.  Then some chump realized that if you put seeds in the ground, food pops out, and it revolutionized the way we lived.  People could settle down, stop chasing buffaloes around, and instead put fences around them and call 'em cows.  Since there was more food to go around and we spent less time looking for it, some folks started doing things other than food production.  We got some new jobs, like potters who made us bowls and cups, and tanners who made us stylish leather jackets.  We got some not-so-great things too, like kings, and people who argued with each other called politicians.  The point is, we started specializing.

            Eventually we had people thinking up new ways to do stuff.  These inventors helped society learn how to build wagons, forge metal, and brew beer.  Many of these thinkers were brilliant beyond any one area of invention or science.  For example, the Greek genius Archimedes invented new ways of pumping water AND killing people.  Or take Leonardo da Vinci, who while creating masterpieces of art found time to sketch what some consider an early mechanical calculator, among other things.  We eventually got around to creating a new term for these men who seemed to know everything: we called them Renaissance men.  The Renaissance man was even upheld as an ideal to strive for.  Many of the American founders are considered Renaissance men because of their broad spectrum of knowledge.  Today, in our specialized world, the Renaissance man is considered a thing of the past.

            After all the inventions, wars, and civilizations, human society and life still hadn't changed much between the advent of agriculture and the American Revolution.  The majority of people still spent most of their time farming.  Horseback was still the fastest way to get around.  This all changed with the dawn of machinery.  Instead of the plow and the ox, people started using machines like tractors to farm the land.  This meant that farmers could grow more and tend larger farms.  It ultimately liberated the majority of humans from the fields and gave even more people time to do things other than produce food.  In today's industrialized world, less than five percent of people actually farm.

            With more people doing things other than farming, society had enough people to specialize even further.  Instead of tailors we had factory workers.  Eventually factory workers had very specific jobs, such as putting the "knick" on the "knack."  Even factories specialized into niches, so instead of a factory making a whole car, a factory might make only the seats for the car.  This idea of "hyper-specialization" even found its way into academia.  Instead of a natural philosopher we had a scientist.  Then instead of just a scientist we had specialized scientists like biologists.  Eventually we had specialized specialists like microbiologists.  Even education was carved up into segments.  Instead of one teacher, a child would have seven, and each would teach a different subject such as math, science, or history.  Since all production had turned into some mindless, hyper-specific job in a factory, teaching children practical skills fell out of favor.

            Now I am not saying the specialization of society is bad; in fact, I think the unprecedented standard of living we enjoy is a result of it.  But it is important to realize that life was not always like this.  We also must realize that the hyper-specialization of society is creating new problems that are going unrecognized.

            Modern people work a lot.  People spend most of their waking time at their jobs.  So it is not a stretch to say that your profession, what you do all day, shapes your perception of the world.  In the past, when the majority of people were farmers, almost everyone had basically the same worldview.  In the 21st century there are a growing number of different types of jobs, and with these new types of jobs come new perspectives on the world.  It is not hard to see that a professional football player wouldn't have a lot in common with a computer programmer.  The problem is that it is difficult to empathize with one another if we cannot relate to the struggles others go through from day to day.

            Let me give an example, from my own life, of how we lack empathy today.  Most of the jobs I have had involved working with the public.  I have worked at a gas station, a movie rental place, a sports store, and as a paramedic.  Working with the public can be frustrating because people can be hard to please, and everyone has different preferences.  Another problem is that a lot of people will take their frustrations out on the guy behind the cash register or the waiter at the restaurant.  Needless to say, working with the public is hard.  I have found that friends who have worked with the public are generally nicer to cashiers and waiters.  I have also found that people who are rude to waiters and cashiers have, more often than not, never worked a job where they served the public.

            Specialization in academia has created an environment of tension between professors of different specialties.  The humanities professors cannot stand the economics professors.  To the humanities professor, society is about emotions and connections between people.  To the economics professor, society is about cold, calculating self-interest.  Natural scientists collide with philosophers.  Natural scientists believe that if something cannot be observed or measured it cannot be believed, and they reject subjectivity.  The philosophers, on the other hand, point out that no observation or measurement is ever free of subjectivity, and question the foundations of verificationism, which itself cannot be verified.

            If we look to the political world, we see one that is becoming increasingly contentious.  Add the diverse perspectives created by specialization to religious and social stratification, and you end up with a whole lot of people who don't see eye to eye.

            So what can we do?  How do we fix this?  I really don't know.  I once believed the internet would become a platform where people came together to work out their differences.  It is becoming apparent, however, that this is not the case.  Online communities segregate into groups that isolate themselves from ideas and people that challenge their cognitive biases.  If anything, the web is increasing the cognitive stratification of society.  I also believe that through art like novels, movies, and video games we can expose people to the perspectives of others.  But with the commercialization of media, movies and games have become more about explosions and gore than story.  There are exceptions to this rule, though; I might expand on that in another blog sometime.

If we are ever to heal the growing fault lines in our society, it will be through learning to listen to others and understanding where they are coming from.

Tuesday, August 27, 2013

Too Big to Succeed

How big banks came to be and why they suffocate our economy.


The New Normal


     As the story goes, the "Great Recession" ended in 2009.  Since that time, the US has had an uncommonly high unemployment rate of around 8% and weak GDP growth.  This lackluster economic "recovery" is part of the "new normal" economy; some academics say this "new normal" is part of a "post-American exceptionalist" reality, a reality in which American families are expected to be satisfied with less.

     While average Americans are struggling to find jobs and managing to survive on less, the stock market is telling a different story.  Wall Street is seeing equity prices break records, with the Dow Jones Industrial Average hitting an unprecedented 15,000 points.

     This huge discrepancy between "Main Street" and "Wall Street" has many scratching their heads for an explanation of the economic paradox that exists in the "new normal" economy.

     Some explain the unemployment by taking a "Luddite route," claiming that joblessness should be blamed on new technologies created by the information revolution, which are eliminating clerical jobs from the economy.  Businesses, on this view, thrive on the savings created by new technology that replaces workers.  This is the argument most notably made by a journalist at the Economist.[1]  The problem with this theory is that it is based on a fallacy that has been disproven over and over again. (See: Luddite Fallacy)

     Another explanation is that outsourcing to emerging economies is eroding the industrial base of the US economy.  This claim is not supported by the facts: analysis by the Bureau of Labor Statistics predicts an increase in industrial jobs in the current decade.[2]  Not to mention that the US, like most Western economies, already shifted to a post-industrial economy back in the '80s.

     So if it's not technology or globalization, what other explanation could there be for the high US unemployment?  What has so fundamentally changed in the world or the US economy that has ushered in the era of the "new normal" and American mediocrity?

     Well, that answer can be summed up in two words: Financial Convergence.

Financial Convergence


     In the late 20th century, economic thinkers, lawmakers, and other policy makers took on the mentality that the US banking sector could benefit from broad liberalization.  Rules like competition laws were seen as outmoded and too restrictive.  Terms like "investment bank" and "commercial bank" were seen as arbitrary legal definitions handcuffing financial innovation.

     Part of this push for financial market liberalization was the passage of the Gramm-Leach-Bliley Act (or GLB).  Before GLB, a financial institution that took deposits from customers and made loans to small businesses and homebuyers was labeled a "commercial bank."  On the other end of the spectrum, a financial institution that had investors rather than depositors, and invested in securities (stocks and bonds) and commodities, was labeled an "investment bank."  These legal definitions were set by the Glass-Steagall Act (or GS) of 1933.  GLB repealed GS.  After GLB, investment banks and commercial banks became one and the same.  Investment banks started taking deposits, and commercial banks started investing in securities.

     Since investment banks had long been involved in the securities markets, they were highly leveraged and had access to more capital from the Fed due to its policies on reserve requirements.  Conversely, commercial banks, with smaller reserves, had depended largely on depositors' money and CDs (certificates of deposit).  Since investment banks were in a more advantageous position, they were able to buy out the smaller commercial banks.
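To see why reserve requirements matter so much here, a minimal sketch of the classic deposit multiplier helps: with a reserve ratio r, each dollar of reserves can support up to 1/r dollars of deposits and lending.  The dollar figures and ratios below are my own illustrative assumptions, not numbers from this post.

```python
# Toy deposit-multiplier sketch: a bank constrained to hold a fraction
# `reserve_ratio` of deposits as reserves can support at most
# reserves / reserve_ratio in total deposits.

def max_deposits(reserves: float, reserve_ratio: float) -> float:
    """Upper bound on deposits supportable by a given reserve base."""
    return reserves / reserve_ratio

# The same $10M reserve base under two hypothetical requirements:
tight = max_deposits(10_000_000, 0.20)   # 20% requirement -> $50M of deposits
loose = max_deposits(10_000_000, 0.05)   # 5% requirement -> $200M of deposits

print(f"20% requirement supports ${tight:,.0f}")
print(f" 5% requirement supports ${loose:,.0f}")
```

The point of the toy numbers: the institution facing the looser requirement can run four times the balance sheet on the same reserves, which is the leverage advantage described above.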

Graph of the consolidation of financial sector.


     It should not be surprising that some investments can be more profitable than others.  For example, giving a loan to a small local business can be risky and not very profitable; conversely, investing money in emerging-market equities can be just as risky, but exceedingly profitable.  Both of these activities are equally legitimate and have a place in the economy; however, they are dissimilar.

     The repeal of GS and the buyout of commercial banks led to preferences for certain financial activities over others.  Post-GLB, banks could take money from depositors and invest it in securities.  Before, a commercial bank's investments were limited to traditional credit activities like personal loans and small-business loans.  Since securities investment was more profitable than local credit activities, the latter was neglected in favor of the former.

     With the line between commercial banking and investment banking removed, a revolution occurred in financial instruments.  New instruments were created to fill the void left in traditional credit markets.  Banks "innovated" ways to make conventional lending more like the equities market.  Variable interest rates were used in place of fixed rates.  Mortgages were bundled together and sold.

     These activities allowed the banks to engage in widespread fraud.  Customers were bait-and-switched with teaser rates on variable APRs.  Mortgage bundles were packaged in ways that used math tricks to make them seem more sound than they actually were.
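To make the teaser-rate bait-and-switch concrete, here is a toy sketch using the standard fixed-rate amortization formula.  The loan size, term, and rates are my own illustrative assumptions, not figures from this post: a low introductory APR keeps the early payment comfortable, then the payment jumps when the rate resets.

```python
# Toy adjustable-rate mortgage: 2% teaser APR for 2 years, then a
# hypothetical reset to 7% on the remaining balance and term.

def monthly_payment(balance: float, annual_rate: float, months: int) -> float:
    """Standard amortization payment for a fixed-rate loan."""
    i = annual_rate / 12
    return balance * i / (1 - (1 + i) ** -months)

def balance_after(balance: float, annual_rate: float, months: int,
                  payments_made: int) -> float:
    """Remaining principal after `payments_made` monthly payments."""
    i = annual_rate / 12
    pay = monthly_payment(balance, annual_rate, months)
    for _ in range(payments_made):
        balance = balance * (1 + i) - pay
    return balance

loan, term = 200_000, 360      # $200k over 30 years (illustrative)
teaser, reset = 0.02, 0.07     # 2% teaser APR, resetting to 7%

intro_pay = monthly_payment(loan, teaser, term)
remaining = balance_after(loan, teaser, term, 24)          # reset at year 2
reset_pay = monthly_payment(remaining, reset, term - 24)

print(f"Teaser payment:      ${intro_pay:,.2f}/month")
print(f"Payment after reset: ${reset_pay:,.2f}/month")
```

With these made-up numbers the monthly payment rises by roughly three quarters at the reset, which is the shock many borrowers signed up for without realizing it.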

       In economics this type of market situation is referred to as information asymmetry.  It is a phenomenon that can exist in almost any market, and it occurs when one party to a transaction has more material information than the other.
     

     The Federal government's inaction in the face of this widespread fraud could be explained as ineptitude, collusion, or confusion.  Regardless, the inaction cannot be blamed on free-market dogma, because these fraudulent practices are the antithesis of market capitalism.  Capitalism is founded on consensual economic activity.  Fraud is the subversion of consent; it is a form of theft.

2008 to Today


     The recession of 2008 was, in a sense, a victory for capitalism.  Banks that had been using fraudulent tactics to make a profit went bankrupt.  The fraudulent tactics finally caught up to the big banks.


     Warning: this paragraph is purely speculation.  The 2008 crisis would have been the solution to bad banking had the government allowed the banks to fail and resolved to fight fraud in the future.  Insolvent banks' profitable assets would have been bought in the open market, and their toxic assets would have been bought as well, but at their actual value (next to nothing).  Some of these banks would have survived and been restructured, and some would even have come back stronger, with new leadership that might have tried to avoid past mistakes.

     What did happen, however, was completely opposite to this ideal.  A new phrase entered the American lexicon: "too big to fail."  The banking industry and its army of lobbyists set out to scare the hell out of the American Congress.  They told representatives that if their companies were allowed to fail, it would be Armageddon.  And if Armageddon occurred, voters would be mad, and congressmen might lose their jobs.

     The US Congress, possibly acting in fear, passed legislation to give the largest banks, which had been complicit in fraud, $1 trillion of public money.

     Some small things were done to try to prevent the situation of 2008 from occurring again.  One in particular was the Dodd-Frank Act, which promised to "end too big to fail."  The legislation does some good things, like creating a single council to coordinate the financial regulatory agencies.  Other than that, and some vague promises to regulate derivatives, the financial industry is more or less the same as it was before the 2008 crisis.  Financial convergence survived the market's attempt to kill it.

Stuck in a Rut


     What does financial convergence have to do with unemployment today?

     In the south we have a saying: "stuck in a rut."  According to the Cambridge dictionary "stuck in a rut" means: 


If a person, organization, etc. is in a rut, they have become too fixed in one particular type of job, activity, method, etc.

There is an economic theory called the "liquidity trap," which basically means "stuck in a rut."
     "Liquidity trap" theory was formulated by Keynesian economists.  To understand it you have to know a little about Keynes.  Keynes believed some controversial things.  First off, he believed that economies have a "liquidity preference."  In layman's terms, this means people love money, and they love it too much; people would rather keep money under a mattress than invest or spend it.  If you can see why this might be bad, then you can already understand Keynes's next controversial point: the "paradox of thrift."  Basically, the paradox of thrift says that when a lot of people save money, it is bad for the economy, because if you are saving money you are not buying goods and services; and if no one is buying goods and services, then people get fired and businesses go under.
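The paradox of thrift boils down to a few lines of arithmetic.  The numbers below are my own illustrative assumptions, not from this post: each dollar spent becomes someone else's income, a fraction of which gets spent again, so a one-time injection of spending ripples into injection / (1 - mpc) of total activity, where mpc is the share of each dollar of income that gets spent rather than saved.

```python
# Toy Keynesian spending multiplier: total activity is the sum of the
# geometric series injection * mpc**k for k = 0, 1, 2, ...
# which converges to injection / (1 - mpc) for 0 <= mpc < 1.

def total_spending(injection: float, mpc: float) -> float:
    """Total economic activity generated by a one-time spending injection."""
    return injection / (1 - mpc)

injection = 1_000  # an initial $1,000 of spending (illustrative)

free_spenders = total_spending(injection, 0.90)  # everyone saves 10%
thrifty       = total_spending(injection, 0.50)  # everyone saves 50%

print(f"Saving 10%: ${free_spenders:,.0f} of total activity")
print(f"Saving 50%: ${thrifty:,.0f} of total activity")
```

Raising the savings rate from 10% to 50% shrinks total activity to a fifth in this toy model, which is Keynes's point: individually prudent saving, done by everyone at once, collapses demand.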

     Now, the main way the US is currently trying to combat the recession and unemployment is through the Federal Reserve's "monetary policy."  The goal of monetary policy during a recession is to inject as much money into the economy as possible.  This is done by buying Treasury bonds off the market and lending money to banks at a "discount rate."  This strategy is commonly referred to as an "easy money" policy.  The techniques were refined by the economic school of "monetarism."

     Since 2008 the Fed has pursued this easy money policy.  Today the results appear to be mixed.  In some areas of the economy, like the stock market, things are good, yet elsewhere the results are less dramatic.

     On top of this, despite the Fed's easy money policy, many everyday people with good credit records are having a hard time getting access to capital to buy cars and homes.  While Main Street has no access to credit, many businesses, and especially banks, are flush with cash.  Even though banks have easy access to cheap credit, they invest only lightly, in low-risk assets like blue-chip stocks and AAA bonds.

Corporate Saving Rate vs. Unemployment (strong correlation) 


     The problem created by this phenomenon is that banks, using low-interest leverage from the Fed, are investing in established companies that are not in need of credit or capital, since they too are flush with cash.  On the other end of the business spectrum, small businesses cannot gain access to the credit and capital they need to expand.  Even more troubling, aspiring entrepreneurs are even less likely to get credit to start a small business.  This is a major problem because small businesses employ over half of the private-sector employees in the economy.[3]

     Knowing this information, it is easy to see why Wall Street is breaking profit and price records, while Main Street can't find a job or start a business. 

     American exceptionalism is not dead.  Americans can still innovate, they can still work hard, they can still take big risks and succeed.  This potential, however, is being suffocated by the lack of access to the crucial capital required to start a small business.  The dynamic potential of the American economy is eager to break loose and maintain America's lead in the global economy.  Unfortunately, today's "too big to succeed" banks are being showered with money by the Federal Reserve.  They are intoxicated by their huge influxes of capital, on which they sit as if it were a throne.  They are not going to invest in job creators, because of risk aversion and a lack of imagination.  These banks are currently leading the US into an era of mediocrity.  If we truly want to put Americans back to work and revive American exceptionalism, "too big to succeed" banks must be dismantled.  In the next few days I will list and detail some of the possible ways this can be achieved.