Nice piece, Mike. It feels like progress has stalled, or at least slowed, after 1970. Some have blamed this on a lack of energy abundance, owing to both the oil crisis and the degrowth movement that took hold at the time. Something I discussed at Risk & Progress.
I am not entirely sold on this idea. It's true that progress shifted from atoms to bits, and that the growth in energy consumption (and GDP growth rates, for that matter) slowed around the same time.
I could make the case, however, that this is because of the nature of the information revolution. We just don't need as much energy to run a computer as we did a washing machine. Further, because the fruits of the IT revolution are mostly intangible, they are much harder to quantify and account for when we calculate GDP growth.
Digital products have a way of “collapsing” categories of goods and services into fewer items, the way the smartphone evaporated scores of products.
On the other hand, there could be some truth to a slowdown in progress. We can't move people into cities or teach them to read twice; the low-hanging fruit may have been picked.
I agree about the energy thing. And I think you are dead right about how bits don’t require as much energy and so the information economy did not require more energy. And as I point out in the article, the early phase of the information economy rolled out lots of new demand-creating stuff.
What economies grow is utility. The rise of PCs in the 1980’s and 1990’s and the internet from the 1990’s to the early 2000’s was mind-blowingly expansive. You could feel your mind-power expanding.
And it produced new categories of demand. My wife and I were buying PCs for one to two grand every three or four years; we got cable, then monthly cell phone and internet service bills, plus software purchases. My household and millions of others had added an IT category to their spending that was on top of existing expenditures.
But since then we have bought PCs at longer intervals as the ones we had still serve our needs (my current PC is more than ten years old). I still buy laptops every 3-4 years, but they are pretty cheap nowadays. The new stuff, social media for example, is free. So there is no growth there.
I think this has been an organic part of how tech has evolved. Initially it created new demand, now it doesn’t. But that is not a problem. The real issue has been why hasn’t there been something else.
20 years ago, in my political cycles book, I wrote about how the information economy was only about 1/3 the size of the previous economy and that this was why growth was so anemic. The solution I suggested was to get more leading sectors. I suggested two: alternate carbon-free energy and health care. Global warming had been a thing for 15 years at that time and we were going to have to deal with it. And people are always interested in living longer and healthier lives.
These seemed like low-hanging fruit for leading sector development. But the big advances in alternate energy happened in China, and advances in health care are hopelessly stymied by politics.
And so in more recent years I have been looking at politics, economics and culture for insights, and my piece reflects that.
"The real issue has been why hasn’t there been something else."
What percent of the population could afford this something else if/when it comes about? Even the "collapse of categories of goods and services" hasn't necessarily made the replacements relatively cheaper as a proportion of income, or we'd be saving and investing a lot more than we have.
Initially only a few can afford the new product. But as sales rise, economies of scale bring down costs, the price falls, and sales rise further. The result is called the S-curve of development: https://mikealexander.substack.com/p/an-introduction-to-leading-sectors#:~:text=New%20products%20and%20technologies,1.%20The%20S%2Dcurve
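That S-curve dynamic can be sketched numerically with a logistic function: slow uptake at first, rapid growth once scale economies kick in, then saturation. A minimal illustration (all parameter values here are made up for illustration, not taken from the article):

```python
import math

def s_curve(t, k=1.0, t_mid=0.0, cap=100.0):
    """Logistic S-curve of adoption.
    cap   = saturation level (e.g. % of households),
    t_mid = the inflection year, where growth is fastest,
    k     = steepness of the takeoff."""
    return cap / (1.0 + math.exp(-k * (t - t_mid)))

# Adoption starts small, accelerates through the midpoint, then flattens:
adoption = [round(s_curve(t, k=0.8, t_mid=10), 1) for t in range(0, 21, 5)]
# years 0, 5, 10, 15, 20 -> roughly 0%, 2%, 50%, 98%, 100%
```

The falling-price, rising-volume feedback described above is what moves a product along this curve from the early flat segment into the steep middle.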
I guess my bigger question is whether the personal monetary costs of these new industries are factored into inflation calculations. Obviously the cost savings of not having to hire someone to do your laundry can help pay for a washing machine, but when you add something entirely new to the equation, does the "cost of living" proportionately increase?
Because I've seen other bloggers mention how much better paid the average person in the US is today compared to the 1970s (for instance), based solely on inflation. But I've also seen counter arguments that the average income has stagnated. Are these apples to apples comparisons? Does one factor in a rise in the base-level cost of living and the other doesn't?
Very well written! Nice chart too. We centralized the economy as well as the government in the 1970s/early 1980s, establishing private sector central planning in the economy and public sector central planning in government, and it's been downhill on a lot of things ever since. But people are steadfastly opposed to discussing undoing anything that was done in that era.
Hi. You're welcome! Well, here are two things, but it's far from limited to them:
One is the Bayh-Dole Act:
Since the 1970s, the pace of scientific advancement and industry innovation has arguably declined, and this trend can be significantly attributed to the centralization of research and development (R&D) enabled by policies like the Bayh-Dole Act. The Bayh-Dole Act allowed universities and private businesses to own patents on inventions developed through federally funded research. While intended to promote commercialization of research, this policy transformed large swaths of research universities' work into tax-sheltered appendages of private sector R&D. And then private companies increasingly outsourced their R&D to these research universities, which reduced the amount of independent research conducted by the private sector itself.
This shift meant that research universities, which picked up much of the applied research previously done by the private sector, saw a corresponding reduction in basic science research. Basic science, which is super important for long-term innovation, suffered as universities focused more on short-term, commercially motivated applied projects. Also, the Bayh-Dole Act's granting of patent control and exclusivities to universities and firms effectively turned taxpayer-funded government and university laboratories into extensions of private corporations. This created a system where certain firms, typically large and well-established ones, heavily benefited from subsidized R&D, leading to increased market concentration.
These concentrations have diminished the overall incentives for companies to engage in innovative research. The monopolistic control over patents and technologies has led to instances where firms suppress or delay the release of new technologies to protect their market dominance. This suppression stifles competition and further reduces the motivation for conducting new R&D. Centralized decision-making in research and investment, driven by these concentrated entities, has led to a homogenization of thinking and a centralization of research priorities and planning that truly is not very far off from what I saw of the Soviet research system when I researched it. Thus, the centralization fostered by the Bayh-Dole Act and similar policies has contributed to a decline in both the quality and quantity of scientific and industrial innovation since the 1970s.
This one I shouldn't mention, because there are others that are less controversial that I could put here instead, but since I restarted my kick on it last week, I will. I'd never heard about it until five years ago, and when I did, I thought it self-evidently dumb. During COVID, I decided to go very deep down some historical rabbit holes, stumbled into it in the 19th century, then decided to get it straight from the horse's mouth and spent over one hundred hours getting to know it, decade by decade, through reading contemporaneous to the times, across the 200 years of their existence (really hundreds of years more than that, if one counts the nation's predecessor, the thirteen colonies).
That is the elimination of capital flow inhibitors between the states. These inhibitors had existed, in different forms, varying over time, but always there, for every single day of the country's existence, until they were effectively mostly phased out between the late 1970s and mid-1980s and then finally, de jure, fully done away with by the late 1990s. If given the time, I assure you that a cogent and evidence-based argument can be made for the case that this policy has had literally the opposite effect of what most people -- I know I certainly did! -- thought it would/did.
Both of those, along with other things like the gutting of antitrust. Although I now know something I never did before I visited the 19th and early 20th centuries: capital flow inhibitors were always conceptualized as being, in large part, antitrust measures. Antitrust didn't begin with Brandeis and those guys' era; it had always existed in the USA, and states and even localities engaged in it and could again. The last 50 years have been a novel experiment.
That, along with several other things, some of which are also individually huge, have led to both private sector central planning and public sector central planning. A good, but far from comprehensive (and I don't endorse all of its arguments or takes) book on it is 'The Transformation of American Capitalism: From Competitive Market Structures to Centralized Private Sector Planning' by John Munkirs.
I hope you're off to a great start to your Saturday!
Food for thought here. Do you have any links that deal with the fallout of Bayh-Dole (with examples)?
My view is that it was only when executives had something else (stock buybacks, overpaying for defensive acquisitions*) to do with retained earnings besides investing them internally (in R&D, for example) that it made sense for businesses to pursue the sorts of actions you describe. Not only that, but they were rewarded (via options) for juicing stock prices with buybacks/acquisitions rather than investing internally. After all, pursuing these sorts of strategies does not cause a company to "win" on metrics other than financial ones. Under SP culture that's all that matters. Under SC, not so much.
My account gets the timing right. SC went away because the Democrats abandoned the New Deal political order that creates the environment that selected for SC culture, as I described here:
I watched this transition happen over my 33 year career. I started with Upjohn in 1988, at the tail end of "old Upjohn" as the old timers called it. Old Upjohn was the company managed under SC culture (I was first introduced to the stakeholder idea in a speech by our then CEO). Old Upjohn came to an end in ca. 1991 when they closed the print shop (this was the consensus dating of the operators and shop folks).
Four years later our new CEO, whom we called baldy (though not to his face) merged us with Pharmacia and moved HQ from Kalamazoo to London. Baldy left and Fred came in, moved HQ to NJ, near his home, and merged us with Monsanto. He did this to get Celebrex, a Searle product (Searle had been acquired by Monsanto). Fred then rearranged things joining Monsanto's biotech operations with our Bioprocess operations in Kalamazoo and Searle's operations in North Chicago. The rest of Monsanto (the core Ag chemical business) was spun off as Monsanto and did quite well as I recall.
Now that we had Celebrex, Pfizer wanted us, and they took us over in 2003. They took a meat axe and started hacking. First to go were Searle Discovery R&D in North Chicago and Upjohn's in Kalamazoo. A bit later, Pharmacia's R&D in Sweden. Then Parke-Davis R&D in Ann Arbor (which had given them Lipitor). I seem to recall they shuttered Warner-Lambert R&D sometime around then too. Pfizer was acting more like a financial firm, buying drugs and managing them as a portfolio of investments (they even used the term portfolio to describe their list of products). This is pure SP.
*Yahoo had the opportunity to buy Google for chump change and turned them down. As a result, after their spectacular success in the 1990's, they have become the poster child for what not to do. Facebook did not make that mistake when they picked up Instagram.
I agree with what you're saying. Where we may diverge, and if so it's possibly a very deeply fundamental divergence, is that I don't think the specific actions you refer to can be severed from either of my examples (along with much else), because they occurred within, and at least to the great extent they did because of, the broad and deep systemic changes; the things I mentioned (along with things I didn't) go part and parcel with each other in those regards.
In regards to the switch from New Deal SC to SP: there was no singular "New Deal Era." It evolved over time, underwent significant changes, and was throughout inconsistent and never complete relative to its modern conceptualization. The Old Republic didn't end overnight; it phased out and faded away. In some key aspects this was very front-loaded, but in some other key aspects (including in finance, which I'll touch on in the next paragraph) it actually mostly held throughout the entirety of the New Deal Era; most areas were in between, varying, and in some cases saw brief reversals. But by the 1970s, substantial political and economic centralization had occurred, building the environment which enabled the great centralizations of the 1970s, 1980s, and 1990s. This centralization was in part (but far from fully, as there were other big changes as well that would take too long to write about here) facilitated by the erosion of antitrust enforcement through horizontal mergers and other cartelization practices. States had significantly pulled back from antitrust or antitrust-like actions, transforming the political landscape. The two major political parties had shifted from decentralized mass-member organizations to centralized, managed entities. Universities rose as supreme epistemological authorities, which, coupled with their halo effect, enabled policies like the Bayh-Dole Act to pass. The media also underwent centralization and homogenization, contributing to a more uniform dissemination of information. The end phase of the New Deal Era in the 1970s was a very different economic and political landscape compared to its beginnings in the 1930s. These intertwined major structures and paradigms cannot be severed when considering the changes in behavior of major entities within our socio-economic system.
One of the ways in which the conceptualized (mythologized?) "New Deal Era" never existed was in its treatment of finance, especially in the first decades. The semi-populist, semi-politically decentralized, and semi-economically decentralized strictures of the Old Republic ACTUALLY HELD throughout the entirety of the "New Deal Era": first almost fully during the 1930s (with one huge loss, the initial moves toward the centralization of the Federal Reserve in 1935), and then they phased away but were still the dominant structures until the early 1970s, remaining in force until the late 1970s. This was one of the most important areas, particularly for the specific situation Mike is referring to. Everyday people and SMEs (populists) effectively defeated very big business, very big finance, centralizing technocrats, and the universities. Thus, the "New Deal Era," conceptualized as centrally directed, mass-standardized, and fully de-poeticized and rationalized in the academic macroeconomic sense, mostly never occurred. If you delve into the reasoning behind these overall structures, you'll find that risk management was secondary. The primary purposes were twofold: 1) preventing the financialization of the economy by keeping Big Finance from being its core engine, and 2) antitrust (it's important to note that these structures were around long before the term "antitrust" ever became a thing, which people should know if they want to research the very long history of antitrust in this country, which goes not just all the way back to its founding day but in fact long predates it in its predecessor thirteen colonies): preventing collusion among entities in our socio-economic system via finance-generated vectors such as interlocking directorates, co-ownership, ultimate owners/senior managers having overlapping personal portfolios, common big donors for institutions like research universities, and the symbiotic relationships and secondary forms and effects this generated.
Absent the undoing of these structures, much of what you referred to either would not have happened as much, would have been far less incentivized, or might not have happened at all. It was only after the old regulatory paradigm and the associated structures of banking and finance were undone that the intense financialization of our economy, which created many of the situations we're referring to, happened.
The changes in the patent law regime, of which the Bayh-Dole Act was a big part, significantly incentivized and enabled the diversion of money away from R&D by firms. While changes in corporate governance and executive incentives played a huge role, they are not mutually exclusive with the impacts of the patent law changes, the financialization of the economy, and other things that go part and parcel with each other. The act provided the legal and structural framework that allowed these corporate governance changes to take root. By allowing universities to patent and license federally funded research, the Bayh-Dole Act enabled companies to control patents and benefit from university research. This generated monopolistic and oligopolistic powers that reduced the need for firms to invest in their internal R&D. Instead, firms could license new technologies developed with public funds, which was less costly and more immediately beneficial than conducting their own research. This helped incentivize companies to pursue stock buybacks and acquisitions, leveraging the financial gains from patented technologies rather than reinvesting in R&D. Without the ability to control patents and benefit from university research, firms might not have diverted resources away from R&D to the same extent. So the Bayh-Dole Act played a big role in enabling and incentivizing the diversion of corporate funds from R&D to financial strategies that prioritized short-term gains.
In short, I'm referring to what I view to be a systemic paradigm change, and am arguing that no single action or set of particular specific actions can be singled out; it's rather that a whole new world had been constructed...
Also, regarding patents, I believe in compulsory licensing at a non-prohibitive universal rate. And all tech developed by the taxpayer should be uncontrolled (obvs not nuclear bombs or something), although I'm super sympathetic to the researchers, all of them, not just the leads but also the rest, including, if they're there, students and postdocs, getting a royalty or some other form of monetary compensation. But that is a long discussion; my main point in my initial reply was that these conversations don't *actually* happen.
I hope you and your wife have been having a pleasant weekend.
You are proposing a different theory for why things are the way they are. It should be amenable to cultural evolutionary analysis. I don’t really understand your model, but then you haven’t really worked out all the details. I can show you how I approached the problem. I started with Peter Turchin’s elite overproduction model, which explains political instability (like what happened yesterday) as a function of income inequality. So I built a cultural evolutionary model to explain inequality trends, as described below. In principle you can do the same with your theory.
I characterize eras like the New Deal Order or the Neoliberal Order that followed as defined by the type of culture they select for. Nothing more to it. I measure the culture type using income inequality as a proxy as shown in this figure.
I use the cultural framework because I found that a standard cultural evolution (CE) model worked to explain inequality trends. This figure shows inequality (the thing I am trying to explain) and the output of the CE model (the theoretical explainer) as the solid line.
The dotted line is the economic environment expressed in terms of the level of inequality that would exist if the culture was fully adapted. In actuality CE is slow-acting and culture only gradually approaches the dotted line. The dotted line itself is calculated as a simple linear function of things hypothesized to affect economic culture: tax rates (represented by top rate), labor power (represented by strike frequency) and interest rate (represented by Aaa bond rate). The model development is explained in this 2019 paper:
Note the inequality dataset I used then was different than the one available now. With the old data, interest rate did not add any explanatory power and I did not use it. With the current dataset, adding interest rate as a third explanatory factor improved the fit.
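The two-line setup described above can be sketched in a few lines of code: a "dotted line" target computed as a simple linear function of the three factors, and a "solid line" culture that only gradually relaxes toward it. Everything below (series, coefficients, the 0.4 adjustment rate) is invented purely to illustrate the structure; these are not the paper's data or fitted values.

```python
import numpy as np

# Hypothetical decadal series for the three explanatory factors:
years    = np.arange(1950, 2010, 10)                     # 1950..2000
top_tax  = np.array([91.0, 91.0, 70.0, 50.0, 39.6, 39.6])  # top marginal rate, %
strikes  = np.array([424., 222., 381., 187., 44., 39.])    # major work stoppages
aaa_rate = np.array([2.6, 4.4, 8.0, 11.9, 9.3, 7.6])       # Aaa bond yield, %

# Dotted line: "fully adapted" inequality as a linear function of the factors
# (illustrative signs: high taxes and labor power pull inequality down).
target = 0.6 - 0.003 * top_tax - 0.0003 * strikes + 0.005 * aaa_rate

# Solid line: cultural evolution is slow-acting, so realized culture only
# moves part of the way toward the target each period.
culture = np.empty_like(target)
culture[0] = target[0]
for i in range(1, len(target)):
    culture[i] = culture[i - 1] + 0.4 * (target[i] - culture[i - 1])
```

The point of the sketch is the lag: `culture` always sits between where it was and where the environment is pulling it, which is why the solid line trails the dotted one in the figure.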
The effect of my proposed orders/cultures also show up in stock market valuation patterns:
P/R is a valuation measure for the S&P500 that is very similar to Tobin’s q, except it is a lot easier to calculate. You can see that market peaks averaged around 1.3 but the one in the 1960’s (at the height of the New Deal Order) was unusually low, while the values during the current Neoliberal Order have been unusually high. This is an explicit effect of the financialization that comes with SP culture and its suppression under SC culture. I talk about that here:
Progress stalled because the institutions, academia included, have systemically eliminated and kneecapped all progress since the 70’s. Anyone who comes up with anything or calls out negative progress is blacklisted, disemployed and even murdered.
There are papers and products from before the 70s about using lasers to repair damaged tissue, thorium nuclear reactors and designer molecules to target cancers and viruses. They work and have seen a flourishing in China/Russia now that they’ve caught up their own facilities to construct these things.
However in the US the people who took hold around the 70s have made sure to maintain their crushing feudal blackmail powered hold to secure the wealth they steal to this day.
There’s no innovation problem, there’s a management problem.
Laser surgery is an established technology, been around for decades. Designer molecules are drugs, I worked in drug manufacturing R&D for 33 years. We worked on lots of new molecules and older molecules that we still made.
Uranium nuclear reactors are cheaper than thorium ones, and we didn't build many of those after the 1970's because, with the rise in interest rates in the early 1970's, nuclear power no longer looked attractive relative to coal. And once fracking came in and gas prices fell, there was no way nuclear (or coal) could compete with gas. New gas plants are way cheaper, way more efficient, and more tunable than new nuclear plants. Existing nuclear plants are cheap to run, so utilities kept what they had, and worked to replace coal with gas. Simple economics.
If we taxed long term capital gains at the same rate as income then stock buybacks wouldn’t be meaningfully different from dividends. This generalizes: you should prefer to own stocks that do buybacks and do NOT issue dividends because then you’ll pay less in taxes.
Also you’d get to choose which year to pay those taxes so you can retire much earlier.
For some reason even intelligent people misunderstand this point and so prefer stocks with dividends. But the math works out the same either way: they’re both extracting profits from a business.
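The equal-rates point is easy to check with toy numbers (invented for illustration; this ignores timing, state taxes, and cost basis details, and assumes dividends and realized long-term gains face the same rate):

```python
# A firm returns $10 per share of profit to a shareholder, two ways.
tax_rate = 0.20
payout = 10.0

# Dividend route: taxed in the year it is paid, no choice about when.
dividend_after_tax = payout * (1 - tax_rate)

# Buyback route: the same value accrues as share-price appreciation; the
# holder realizes it by selling and pays the same rate on the gain, but
# chooses WHEN to sell. Deferral is the only real difference.
buyback_after_tax = payout - tax_rate * payout

assert dividend_after_tax == buyback_after_tax == 8.0  # identical after-tax value
```

Under current US law, long-term gains are usually taxed at a lower rate than ordinary income, and the buyback holder also controls the timing, which is the advantage described above.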
AI has the potential to democratise several currently niche services: therapist, tutor, personal trainer, life coach, mentor, personal assistant, wardrobe and interior design consultant, and probably others.
Would you consider expanding the markets for any of these, to say a hundred times their current sizes, to be more of the same, or something new?
Of course this is hypothetical. The most likely use for AI, after targeting advertising, is as pretend girlfriends or boyfriends, as in the film "Her".
In my 2004 book I suggested alternate energy as a leading sector for the information economy. I also discussed how to structure health care as a leading sector. Health care has real potential. Every new category of treatment or drug creates new categories of demand. The problem with health care is that, the way it is structured, it doesn’t follow the normal S-curve of development. The solution I suggested was to split it into two tiers. The public tier would be handled by a national health insurance type program which provides subsidized treatment for standard treatments and drugs listed in the National Formulary. Providers have to accept the low government price, but they will get a lot of volume because most of the country will be in the program.
Then there would be a second tier that sells treatments and drugs that are not (yet) on the standard list or in the National Formulary. These are treatments/drugs the government has approved as safe and no less effective than the standard treatment, but that have not yet been added to the things covered under the public plan. This portion of the health care business would be private pay: individuals pay directly for services or purchase their own insurance.
Here is where new treatments would get rolled out first. Prices will be high, and providers will be incented to find ways to lower prices so as to gin up more sales. Thus, new treatments and drugs would follow the S-curve of development (see link). Eventually they will get the price down low enough for providers to petition the Health Authority to add their treatment/drug to the public program, accepting the much lower price, but much larger volume, this would entail.
I envision that AI could be a way to reduce the costs of treatments. An AI primary care “autodoc” could spend lots more time with you talking about your symptoms while 20 versions of it are talking to other patients at the same time, with one human doc on site. This one doc with her “AI helpers” might handle 20 patients in 20 minutes, a huge increase in productivity, reducing the cost of primary care. Ditto for other treatments.
Same thing for hospital care. The AI could talk with patients, answer questions, and provide reassurance for as long as they want, never getting tired or irritated, while the humans do the physical care without having to field all of that themselves.
You don’t need to have highly paid medical professionals doing things an AI or a patient care specialist can do. Physicians will stop doing routine tasks like colonoscopies and focus on the complex tasks like surgery, cardiac cath and so on.
I would classify all of that as "more of the same", since most people already use medical services. It's all bottom-of-the-cliff.
Using AI at the top of the cliff, stopping people getting sick in the first place, strikes me as a new thing.
Treatments for the illnesses of obesity (cardiovascular, digestive system, musculo-skeletal-joint, immune system) would be useful, yes, but not really anything new.
What Harry Dent calls basic innovations are those that create new categories of demand. Those that expand an existing category are called maturity innovations. Both create new demand and are good for growth. In fact the postwar boom is what Dent calls a maturity boom because much of the innovation was of the maturity type.
For example, broadcast TV is a maturity innovation built on radio. It ended up being a much bigger thing. Interestingly, mainframe computers were a startlingly new tech, but acted as a maturity innovation. They were used as business machines to replace an earlier generation of electromechanical calculators, cash registers and tabulators.
The PC was a basic innovation despite being a mere extension of the timeshared mainframe systems of the 1970's. What made it basic was that it created a new category of demand: home computing.
So you are right, the health care stuff is maturity innovations, even if it involves cutting edge tech like AI. But as I noted above with TV, these can be bigger growth promoters than the basic innovation.
I can't really talk about future basic innovations (neither can anyone else) because we don't have crystal balls. :)
I am not very familiar with AI applications. Is an AI life partner sort of like the "software secretary" that I envisioned in 1990's that managed your personal affairs like personal secretaries did for executives back in the day? Or are you talking about AI girlfriend/boyfriend sort of thing? Or am I way off?
"Life partner" was my euphemism for girlfriend/boyfriend. This from soranews24:
"Koi Suru AI is, essentially, a dating simulator. It’s framed like a dating app, and the gameplay consists of exchanging messages with a 22-year-old woman named Ai whose responses are generated using AI technology."
and
"Tapple isn’t a game developer. Instead, they’re the company that runs what they claim is Japan’s largest for-humans dating app, also called Tapple."
"According to the company’s internal statistics, Tapple has over 19 million members. Even still, the company says it’s noticed that younger people in particular have become less active in pursuing romance and marriage, and so while they’re not shutting the for-meeting-humans Tapple app down, they’ve added Koi Suru AI to their service lineup."
Tapple says it hopes that the AI app will eventually create desire for the real thing, but that seems like a jump. I think it's going to be an adequate substitute, especially in cultures oriented around work as the primary source of prestige. (Pretty much all of the developed world.)
---
A digital PA would be great for many professionals. Hospital doctors and nurses seem to spend as much time at the keyboard as they do on more directly medical tasks, for instance. The same with builders, field engineers and technicians, salespeople, and many more.
Nice piece Mike . It feels that progress has stalled, or slowed, after 1970. Some have blamed this on a lack of energy abundance, owing to both the oil crisis and degrowth movement that took hold at the time. Something I discussed at Risk & Progress.
I am not entirely sold on this idea. It's true, progress shifted from atoms to bits and the growth in the consumption of energy ( and GDP growth rates for that matter) all slowed around the same time.
I could make the case, however, that this is because of the nature of the information revolution. We just don't need as much energy to run a computer than we did a washing machine. Further, because the fruits of the IT revolution are mostly intangible, they are much harder to quantify and account for when we calculate GDP growth.
Digital products have a way of “collapsing” categories of goods and services into fewer items, like the smart phone evaporated scores of products.
On the other hand, there could be some truth to a slowdown in progress. We can't move people into cities, or teach them to read twice, the low hangling fruit may have been picked.
I agree about the energy thing. And I think you are dead right about how bits don’t require as much energy and so the information economy did not require more energy. And as I point out in the article, the early phase of the information economy rolled out lots of new demand-creating stuff.
What economies grow is utility. The rise of PCs in the 1980’s and 1990’s and the internet in the 1990’s to early 2000’s was mind blowingly expansive. You could feel your mind-power expanding.
And it produced new categories of demand. My wife and I were buying PCs for one to two grand every three or four years, we got cable, and then cell phone and internet service monthly bills, software purchases. My household and millions of others had added an IT category to their spending that was on top of existing expenditures.
But since then we have bought PCs at longer intervals because the ones we had still served our needs (my current PC is more than ten years old). I still buy laptops every 3-4 years, but they are pretty cheap nowadays. The new stuff, social media for example, is free. So there is no growth there.
I think this has been an organic part of how tech has evolved. Initially it created new demand; now it doesn’t. But that is not a problem. The real issue is why there hasn’t been something else.
20 years ago in my political cycles book I wrote about how the information economy was only about 1/3 the size of the previous economy, and that was why growth was so anemic. The solution I suggested was to get more leading sectors. I suggested two: alternate carbon-free energy and health care. Global warming had been a thing for 15 years at that time and we were going to have to deal with it. And people are always interested in living longer and healthier lives.
These seemed like low-hanging fruit for leading sector development. But the big advances in alternate energy happened in China, and advances in health care are hopelessly stymied by politics.
And so in more recent years I have been looking at politics, economics and culture for insights, and my piece reflects that.
"The real issue has been why hasn’t there been something else."
What percent of the population could afford this something else if/when it comes about? Even the "collapse of categories of goods and services" hasn't necessarily made the replacements relatively cheaper as a proportion of income, or we'd be saving and investing a lot more than we have.
Initially only a few can afford the new product. But as sales rise, economies of scale bring down costs, price falls, and sales rise further. The result is called the S-curve of development.
https://mikealexander.substack.com/p/an-introduction-to-leading-sectors#:~:text=New%20products%20and%20technologies,1.%20The%20S%2Dcurve
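For readers who want the shape concretely: the S-curve described above is typically modeled as a logistic function, where adoption crawls at first, accelerates as scale brings prices down, then saturates. This is a generic sketch with made-up parameters, not numbers from Mike's post.

```python
import math

def adoption(t, ceiling=100.0, midpoint=10.0, rate=0.5):
    """Percent of the eventual market reached by year t (logistic curve).

    ceiling  -- saturation level (here, 100% of the eventual market)
    midpoint -- year of fastest growth, where half the market is reached
    rate     -- steepness of the takeoff
    """
    return ceiling / (1.0 + math.exp(-rate * (t - midpoint)))

# Slow start, explosive middle, flat top:
for year in (0, 5, 10, 15, 20):
    print(f"year {year:2d}: {adoption(year):5.1f}% of eventual market")
```

The qualitative story in the comment maps onto the parameters: early on almost nobody can afford the product (the flat left tail), falling prices drive the steep middle, and saturation flattens the top.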
I guess my bigger question is whether the personal monetary costs of these new industries are factored into inflation calculations. Obviously the cost savings of not having to hire someone to do your laundry can help pay for a washing machine, but when you add something entirely new to the equation, does the "cost of living" proportionately increase?
Because I've seen other bloggers mention how much better paid the average person in the US is today compared to the 1970s (for instance), based solely on inflation. But I've also seen counter arguments that the average income has stagnated. Are these apples to apples comparisons? Does one factor in a rise in the base-level cost of living and the other doesn't?
Thanks for the response and posts.
Very well written! Nice chart too. We centralized the economy as well as the government in the 1970s/early 1980s, establishing private sector central planning in the economy and public sector central planning in government, and it's been downhill on a lot of things ever since. But people are steadfastly opposed to discussing undoing anything that was done in that era.
Thanks! I am not sure to what you are referring.
Hi. You're welcome! Well, here are two things, but it's far from limited to them:
One is the Bayh-Dole Act:
Since the 1970s, the pace of scientific advancement and industry innovation has arguably declined, and this trend can be significantly attributed to the centralization of research and development (R&D) enabled by policies like the Bayh-Dole Act. The Bayh-Dole Act allowed universities and private businesses to own patents on inventions developed through federally funded research. While intended to promote commercialization of research, this policy transformed large swaths of research universities' work into tax-sheltered appendages of private sector R&D. And then private companies increasingly outsourced their R&D to these research universities, which reduced the amount of independent research conducted by the private sector itself.
This shift meant that research universities, which picked up much of the applied research previously done by the private sector, saw a corresponding reduction in basic science research. Basic science, which is super important for long-term innovation, suffered as universities focused more on short-term, commercially motivated applied projects. Also, the Bayh-Dole Act's granting of patent control and exclusivities to universities and firms effectively turned taxpayer-funded government and university laboratories into extensions of private corporations. This created a system where certain firms, typically large and well-established ones, heavily benefited from subsidized R&D, leading to increased market concentration.
These concentrations have diminished the overall incentives for companies to engage in innovative research. The monopolistic control over patents and technologies has led to instances where firms suppress or delay the release of new technologies to protect their market dominance. This suppression stifles competition and further reduces the motivation for conducting new R&D. Centralized decision-making in research and investment, driven by these concentrated entities, has led to a homogenization of thinking and a centralization of research priorities and planning that truly is not very far off from what I saw about the Soviet research system when I researched it. Thus, the centralization fostered by the Bayh-Dole Act and similar policies has contributed to a decline in both the quality and quantity of scientific and industrial innovation since the 1970s.
This one I shouldn't mention because there are others, less controversial, that I could put here instead, but since I've restarted my kick on it last week, I will. I'd never heard about it until five years ago, and when I did, I thought it self-evidently dumb. During COVID, I decided to go very deep down some historical rabbit holes, stumbled into it in the 19th century, then decided to get it straight from the horse's mouth and spent over one hundred hours getting to know it, decade by decade, with reading contemporaneous to the times, over the 200 years (really hundreds of years longer than that, if one were to count the nation's predecessor, the thirteen colonies) of their existence.
That is the elimination of capital flow inhibitors between the states. These inhibitors had existed, in different forms, varying over time, but always there, for every single day of the country's existence, until they were effectively mostly phased out between the late 1970s and mid-1980s and then finally, de jure, fully done away with by the late 1990s. If given the time, I assure you that a cogent and evidence-based argument can be made for the case that this policy has had literally the opposite effect of what most people -- I know I certainly did! -- thought it would/did.
Both of those, along with other things like the gutting of antitrust. Although, I now know something I never did before until I had visited the 19th century and early 20th century, that capital flow inhibitors were always conceptualized as being in large part antitrust measures. Antitrust didn't begin with Brandeis and those guys' era; it had always existed in the USA, and states and even localities engaged in it and could again. The last 50 years have been a novel experiment.
That, along with several other things, some of which are also individually huge, have led to both private sector central planning and public sector central planning. A good, but far from comprehensive (and I don't endorse all of its arguments or takes) book on it is 'The Transformation of American Capitalism: From Competitive Market Structures to Centralized Private Sector Planning' by John Munkirs.
I hope you're off to a great start to your Saturday!
---Mike
Food for thought here. Do you have any links that deal with the fallout of Bayh-Dole (with examples)?
My view is it was only when executives had something else (stock buybacks, overpay for defensive acquisitions*) to do with retained earnings besides investing them internally (in R&D, for example) did it make sense for business to pursue the sorts of actions you describe. Not only that, but they were rewarded (via options) for juicing stock prices with buybacks/acquisitions rather than investing internally. After all, pursuing these sorts of strategies does not cause a company to "win" on metrics other than financial ones. Under SP culture that's all that matters. Under SC, not so much.
My account gets the timing right. SC went away because the Democrats abandoned the New Deal political order that creates the environment that selected for SC culture, as I described here:
https://mikealexander.substack.com/p/how-the-new-deal-order-fell
I watched this transition happen over my 33 year career. I started with Upjohn in 1988, at the tail end of "old Upjohn" as the old timers called it. Old Upjohn was the company managed under SC culture (I was first introduced to the stakeholder idea in a speech by our then CEO). Old Upjohn came to an end in ca. 1991 when they closed the print shop (this was the consensus dating of the operators and shop folks).
Four years later our new CEO, whom we called baldy (though not to his face) merged us with Pharmacia and moved HQ from Kalamazoo to London. Baldy left and Fred came in, moved HQ to NJ, near his home, and merged us with Monsanto. He did this to get Celebrex, a Searle product (Searle had been acquired by Monsanto). Fred then rearranged things joining Monsanto's biotech operations with our Bioprocess operations in Kalamazoo and Searle's operations in North Chicago. The rest of Monsanto (the core Ag chemical business) was spun off as Monsanto and did quite well as I recall.
Now that we had Celebrex, Pfizer wanted us, and they took us over in 2003. They took a meat axe and starting hacking. First to go was Searle Discovery R&D in North Chicago and Upjohn's in Kalamazoo. A bit later Pharmacia's R&D in Sweden. Then Parke-Davis R&D in Ann Arbor (which had given them Lipitor). I seem to recall they shuttered Warner-Lambert R&D sometime around then too. Pfizer was acting more like a financial firm, buying drugs and managing them as a portfolio of investments (they even used the term portfolio to describe their list of products). This is pure SP.
*Yahoo had the opportunity to buy Google for chump change and turned them down. As a result, after their spectacular success in the 1990's, they have become the poster child for what not to do. Facebook did not make that mistake when they picked up Instagram.
Hi Mike. Thanks for the super interesting reply!
I agree with what you're saying. Where we may diverge, and if so it's possible it would be a very deeply fundamental divergence, is on whether the specific actions you refer to can be severed from both of my examples (along with much else), because they occurred within, and at least to the great extent they did because of, the broad and deep systemic changes, and the things I mentioned (along with things I didn't) go part and parcel with each other in those regards.
In regard to the switch from New Deal SC to SP: there was no singular "New Deal Era," as it evolved over time, underwent significant changes, and was throughout inconsistent and never complete in its overall modern conceptualization. The Old Republic didn't end overnight; it phased out and faded away. In some key aspects this was very front-loaded, but in some other key aspects (including in finance, which I'll touch on in the next paragraph) it actually mostly held throughout the entirety of the New Deal Era; most areas were in between, varying, and in some cases had brief times of reversal. But by the 1970s, substantial political and economic centralization had occurred, building the environment which enabled the great centralizations of the 1970s, 1980s, and 1990s. This centralization was in part (but far from fully, as there were other big changes as well that would take too long to write about here) facilitated by the erosion of antitrust enforcement through horizontal mergers and other cartelization practices. States had significantly pulled back from antitrust or antitrust-like actions, transforming the political landscape. The two major political parties had shifted from decentralized mass member organizations to centralized, managed entities. Universities rose as supreme epistemological authorities, which, coupled with their halo effect, enabled policies like the Bayh-Dole Act to pass. The media also underwent centralization and homogenization, contributing to a more uniform dissemination of information. The end phase of the New Deal Era in the 1970s was a very different economic and political landscape compared to its beginnings in the 1930s. These intertwined major structures and paradigms cannot be severed when considering the changes in behavior of major entities within our socio-economic system.
One of the ways in which the conceptualized (mythologized?) "New Deal Era" never existed was in its treatment of finance, especially in the first decades. The semi-populist, semi-politically decentralized, and semi-economically decentralized strictures of the Old Republic ACTUALLY HELD throughout the entirety of the "New Deal Era," first almost fully during the 1930s (with one huge loss, the initial moves towards the centralization of the Federal Reserve in 1935), and then they phased away but were still the dominant structures until the early 1970s and remained in force until the late 1970s. This was one of the most important areas, particularly for the specific situation Mike is referring to. Everyday people and SMEs (populists) effectively defeated very big business, very big finance, centralizing technocrats, and the universities. Thus, the "New Deal Era," as conceptualized as centrally directed, mass-standardized, and fully de-poeticized and rationalized in the academic macroeconomic sense, mostly never occurred. If you delve into the reasoning behind these overall structures, you'll find that risk management was secondary. The primary purposes were twofold: 1) preventing the financialization of the economy by preventing Big Finance from being its core engine, and 2) for antitrust purposes (it's important to note that these structures were around long before the term "antitrust" ever became a thing, which people should know if they want to research the very long history of "antitrust" in this country, which goes not just all the way back to its founding day but in fact long predates it in its predecessor thirteen colonies): preventing collusion among entities in our socio-economic system via finance-generated vectors such as interlocking directorates, co-ownership, ultimate owners/senior managers having overlapping personal portfolios, common big donors for institutions like research universities, and the symbiotic relationships and secondary forms and effects this generated.
Had these structures not been undone, much of what you referred to either would not have happened as much, would have been far less incentivized, or may not have happened at all. It was only after the old regulatory paradigm and associated structures of banking and finance were undone that the intense financialization of our economy, which created many of the situations we're referring to, happened.
The changes in the patent law regime, of which the Bayh-Dole Act was a big part, significantly incentivized and enabled the diversion of money away from R&D by firms. While changes in corporate governance and executive incentives played a huge role, they are not mutually exclusive from the impacts of the patent law changes and the financialization of the economy and other things that go part and parcel with each other. The act provided the legal and structural framework that allowed these corporate governance changes to take root. By allowing universities to patent and license federally funded research, the Bayh-Dole Act enabled companies to control patents and benefit from university research. This generated monopolistic and oligopolistic powers that reduced the need for firms to invest in their internal R&D. Instead, firms could license new technologies developed with public funds, which was less costly and more immediately beneficial than conducting their own research. This helped to make it so that companies were incentivized to pursue stock buybacks and acquisitions, leveraging the financial gains from patented technologies rather than reinvesting in R&D. Without the ability to control patents and benefit from university research, firms might not have diverted resources away from R&D to the same extent. So the Bayh-Dole Act played a big role in enabling and incentivizing the diversion of corporate funds from R&D to financial strategies that prioritized short-term gains.
In short, I'm referring to what I view as a systemic paradigm change, and am arguing that no one action, or one set of particular specific actions, can be singled out; rather, a whole new world had been constructed...
Also, regarding patents, I believe in compulsory licensing and a non-prohibitive universal rate. And all tech developed by the taxpayer should be uncontrolled (obvs not nuclear bombs or something), although I'm super sympathetic to the researchers (all of them, not just the leads but also the rest, including, if they're there, students and postdocs) getting a royalty or some other form of monetary compensation. But that is a long discussion; my main point in my initial reply was that these conversations don't *actually* happen.
I hope you and your wife have been having a pleasant weekend.
Best,
Mike
You are proposing a different theory for why things are the way they are. It should be amenable to cultural evolutionary analysis. I don’t really understand your model, but then you haven’t really worked out all the details. I can show you how I approached the problem. I started with Peter Turchin’s elite proliferation model, which explains political instability (like what happened yesterday) as a function of income inequality. So I built a cultural evolutionary model to explain inequality trends, as described below. In principle you can do the same with your theory.
I characterize eras like the New Deal Order or the Neoliberal Order that followed as defined by the type of culture they select for. Nothing more to it. I measure the culture type using income inequality as a proxy as shown in this figure.
https://substack-post-media.s3.amazonaws.com/public/images/79339777-f010-4b96-ba48-a7fd8d048676_642x256.gif
I use the cultural framework because I found that a standard cultural evolution (CE) model worked to explain inequality trends. This figure shows inequality (the thing I am trying to explain) and the output of the CE model (the theoretical explainer) as the solid line.
https://mikealexander.substack.com/p/how-economic-culture-evolves#:~:text=for%20detailed%20development)-,.,Figure%204.%20Evolution%20of%20business%20culture,-The%20rise%20of
The dotted line is the economic environment expressed in terms of the level of inequality that would exist if the culture was fully adapted. In actuality CE is slow-acting and culture only gradually approaches the dotted line. The dotted line itself is calculated as a simple linear function of things hypothesized to affect economic culture: tax rates (represented by top rate), labor power (represented by strike frequency) and interest rate (represented by Aaa bond rate). The model development is explained in this 2019 paper:
https://escholarship.org/uc/item/9x36913k
Note the inequality dataset I used then was different than the one available now. With the old data, interest rate did not add any explanatory power and I did not use it. With the current data set, adding interest rate as a third explanatory factor improved the fit.
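If I'm reading the setup right, the skeleton of the model is small enough to sketch in a few lines. Everything below is my paraphrase of the description, with made-up coefficients and variable names; the real values come from fitting the data in the 2019 paper.

```python
def equilibrium_inequality(top_tax, strike_freq, aaa_rate,
                           b0=0.5, b1=-0.4, b2=-0.2, b3=0.3):
    """The 'dotted line': the inequality level if culture were fully
    adapted to the environment, as a simple linear function of the top
    tax rate, strike frequency, and Aaa bond rate. Coefficients here
    are illustrative placeholders, not fitted values."""
    return b0 + b1 * top_tax + b2 * strike_freq + b3 * aaa_rate

def evolve(current, target, speed=0.1):
    """Slow-acting cultural evolution: each year, actual inequality
    closes only a fraction of the gap to the equilibrium."""
    return current + speed * (target - current)

# A New Deal-like environment (high top tax, strong labor) pulls the
# equilibrium down; the actual level only gradually approaches it.
target = equilibrium_inequality(top_tax=0.9, strike_freq=0.3, aaa_rate=0.03)
level = 0.45
for _ in range(30):          # thirty years of gradual adaptation
    level = evolve(level, target)
```

The key qualitative behavior, a solid line lagging behind a dotted line, falls out of the `speed < 1` relaxation step: culture chases the environment but never instantly reaches it.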
The effect of my proposed orders/cultures also show up in stock market valuation patterns:
https://mikealexander.substack.com/p/how-anomalies-drove-my-social-science#:~:text=higher%20stock%20market,and%20housing%20valuation
P/R is a valuation measure for the S&P500 that is very similar to Tobin’s q, except it is a lot easier to calculate. You can see that market peaks averaged around 1.3 but the one in the 1960’s (at the height of the New Deal Order) was unusually low, while the values during the current Neoliberal Order have been unusually high. This is an explicit effect of the financialization that comes with SP culture and its suppression under SC culture. I talk about that here:
https://mikealexander.substack.com/p/how-sp-culture-produces-financial
Progress stalled because the institutions, academia included, have systemically eliminated and kneecapped all progress since the 70’s. Anyone who comes up with anything or calls out negative progress is blacklisted, disemployed and even murdered.
There are papers and products from before the 70s about using lasers to repair damaged tissue, thorium nuclear reactors and designer molecules to target cancers and viruses. They work and have seen a flourishing in China/Russia now that they’ve caught up their own facilities to construct these things.
However in the US the people who took hold around the 70s have made sure to maintain their crushing feudal blackmail powered hold to secure the wealth they steal to this day.
There’s no innovation problem, there’s a management problem.
Laser surgery is an established technology, been around for decades. Designer molecules are drugs, I worked in drug manufacturing R&D for 33 years. We worked on lots of new molecules and older molecules that we still made.
Uranium nuclear reactors are cheaper than thorium ones, and we didn't build many of those after the 1970's because with the rise in interest rates in the early 1970's nuclear power no longer looked attractive relative to coal. And once fracking came in and gas prices fell, there was no way nuclear (or coal) could compete with gas. New gas plants are way cheaper, way more efficient, and more tunable than new nuclear plants. Existing nuclear plants are cheap to run, so utilities kept what they had and worked to replace coal with gas. Simple economics.
So I am not sure what you are talking about.
Healthcare is an interesting area of growth. As science and technology advance, there are new options for diagnosis and treatment.
Yes it is.
If we taxed long term capital gains at the same rate as income then stock buybacks wouldn’t be meaningfully different from dividends. This generalizes: you should prefer to own stocks that do buybacks and do NOT issue dividends because then you’ll pay less in taxes.
Also you’d get to choose which year to pay those taxes so you can retire much earlier.
For some reason even intelligent people misunderstand this point and so prefer stocks with dividends. But the math works out the same either way: they’re both extracting profits from a business.
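The equivalence is easy to check with toy numbers (mine, purely illustrative, not from any real filing): a firm distributing the same cash via dividend or buyback leaves a shareholder with the same pre-tax total; only when and on what base the tax falls differs.

```python
# A firm worth $1,000 with 100 shares pays out $100 of profit, either as
# a dividend or as a buyback. A holder of 10 shares who wants the same
# cash either way ends up with the same pre-tax total.

firm_value = 1_000.0
shares = 100
payout = 100.0
price = firm_value / shares              # $10 per share
holding = 10                             # our shareholder's stake

# Case 1: dividend. Firm value falls by the payout; every share pays $1.
div_cash = payout * holding / shares                  # $10 in dividends
div_stake = holding * (firm_value - payout) / shares  # 10 shares at $9

# Case 2: buyback at $10/share retires 10 shares. To raise the same $10
# of cash, our shareholder sells 1 share and keeps 9.
bb_cash = 1 * price
bb_stake = (holding - 1) * (firm_value - payout) / (shares - 10)

# Same pre-tax outcome either way ($100 total):
assert abs((div_cash + div_stake) - (bb_cash + bb_stake)) < 1e-9
```

The dividend is taxed in full in the year it is paid; the buyback holder is taxed only on the capital gain embedded in the one share sold, in whatever year they choose to sell. That deferral is the advantage the comment describes, and it persists even when the statutory rates are equal.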
In America since 2003, qualified dividends are taxed at the capital gains rate.
Great article.
"2 Examples of success: Apple II, Mac, iPhone.
3 Examples of failure: Apple III, Lisa"
Both categories provide the all important learning from experience that a stock buyback doesn't.
"The reason why is that executives are cultural primates like the rest of us and are motivated to acquire prestige"
It was still a larger paycheck, too. Income is a motivator regardless of the magnitude.
AI has the potential to democratise several currently niche services: therapist, tutor, personal trainer, life coach, mentor, personal assistant, wardrobe and interior design consultant, and probably others.
Would you consider expanding the markets for any of these, to say a hundred times their current sizes, to be more of the same, or something new?
Of course this is hypothetical. The most likely use for AI, after targeting advertising, is as pretend girlfriends or boyfriends, as in the film "Her".
In my 2004 book I suggested alternate energy as a leading sector for the information economy. I also discussed how to structure health care as a leading sector. Health care has real potential. Every new category of treatment or drug creates new categories of demand. The problem with health care is that, the way it is structured, it doesn’t follow the normal S-curve of development. The solution I suggested was to split it into two tiers. The public tier would be handled by a national health insurance type program which provides subsidized treatment for standard treatments and drugs listed in the National Formulary. Providers have to accept the low government price, but they will get a lot of volume because most of the country will be in the program.
Then there would be a second tier that sells treatments and drugs that are not (yet) on the standard list or in the National Formulary. These are treatments/drugs the government has approved as safe and no less effective than the standard treatment, but that have not yet been added to the things covered under the public plan. This portion of the health care business would be private pay: individuals pay directly for services or purchase their own insurance.
Here is where new treatments would get rolled out first. Prices will be high and providers incented to find ways to lower prices so as to gin up more sales. Thus, new treatments and drugs would follow the S-curve of development (see link). Eventually they will get the price down low enough for providers to petition the Health Authority to add their treatment/drug to the public program, accepting the much lower price, but much larger volume, this would entail.
https://mikealexander.substack.com/p/an-introduction-to-leading-sectors#:~:text=New%20products%20and%20technologies,1.%20The%20S%2Dcurve
I envision that AI could be a way to reduce the costs of treatments. An AI primary care “autodoc” could spend lots more time with you talking about your symptoms while 20 versions of it are talking to other patients at the same time, with one human doc on site. This one doc with her “AI helpers” might handle 20 patients in 20 minutes, a huge increase in productivity, reducing the cost of primary care. Ditto for other treatments.
Same thing for hospital care. The AI could talk with patients, answer questions, and provide reassurance for as long as they want, never getting tired or irritated, while the humans do the physical care without having to handle all of that themselves.
You don’t need to have highly paid medical professionals doing things an AI or a patient care specialist can do. Physicians will stop doing routine tasks like colonoscopies and focus on the complex tasks like surgery, cardiac cath and so on.
I would classify all of that as "more of the same", since most people already use medical services. It's all bottom-of-the-cliff.
Using AI at the top of the cliff, stopping people getting sick in the first place, strikes me as a new thing.
Treatments for the illnesses of obesity (cardiovascular, digestive system, musculo-skeletal-joint, immune system) would be useful, yes, but not really anything new.
What Harry Dent calls basic innovations are those that create new categories of demand. Those that expand an existing category are called maturity innovations. Both create new demand and are good for growth. In fact the postwar boom is what Dent calls a maturity boom because much of the innovation was of the maturity type.
For example, broadcast TV is a maturity innovation built on radio. It ended up being a much bigger thing. Interestingly, mainframe computers were a startlingly new tech, but acted as a maturity innovation. They were used as business machines to replace an earlier generation of electromechanical calculators, cash registers and tabulators.
The PC was a basic innovation despite being a mere extension of the timeshared mainframe systems of the 1970's. What made it basic was that it created a new category of demand: home computing.
So you are right, the health care stuff is maturity innovations, even if it involves cutting edge tech like AI. But as I noted above with TV, these can be bigger growth promoters than the basic innovation.
I can't really talk about future basic innovations (neither can anyone else) because we don't have crystal balls. :)
How about the teaching, advising, and assisting roles that I mentioned? Do you see AI expanding the markets for those services?
AI life partners would be a basic innovation in Dent's taxonomy. The idea is being tried in Japan, apparently.
I am not very familiar with AI applications. Is an AI life partner sort of like the "software secretary" that I envisioned in 1990's that managed your personal affairs like personal secretaries did for executives back in the day? Or are you talking about AI girlfriend/boyfriend sort of thing? Or am I way off?
"Life partner" was my euphemism for girlfriend/boyfriend. This from soranews24:
"Koi Suru AI is, essentially, a dating simulator. It’s framed like a dating app, and the gameplay consists of exchanging messages with a 22-year-old woman named Ai whose responses are generated using AI technology."
and
"Tapple isn’t a game developer. Instead, they’re the company that runs what they claim is Japan’s largest for-humans dating app, also called Tapple."
"According to the company’s internal statistics, Tapple has over 19 million members. Even still, the company says it’s noticed that younger people in particular have become less active in pursuing romance and marriage, and so while they’re not shutting the for-meeting-humans Tapple app down, they’ve added Koi Suru AI to their service lineup."
https://soranews24.com/2024/01/18/japans-biggest-dating-app-says-young-people-not-interested-in-romance-creates-ai-girlfriend-app/
Tapple says it hopes that the AI app will eventually create desire for the real thing, but that seems like a jump. I think it's going to be an adequate substitute, especially in cultures oriented around work as the primary source of prestige. (Pretty much all of the developed world.)
---
A digital PA would be great for many professionals. Hospital doctors and nurses seem to spend as much time at the keyboard as they do on more directly medical tasks, for instance. The same with builders, field engineers and technicians, salespeople, and many more.