Computers Reduce Efficiency: Case Studies of the Solow Paradox





I’ve harped on this in my sidewinder and slide rule blergs, as well as older ones: very often, using a computard is a recipe for failure, and old-fashioned techniques are more efficient and useful. The sidewinder guys at China Lake actually called this out in their accounts of the place going to seed: computers and carpets ruined the place’s productivity and, most of all, its innovation. I’ve mentioned my own anecdotes of CAD failures, and the general failings of computers in a design cycle. This is an early realization of mine, from even before the internet existed as a distraction. In my first serious programming projects I had a lot of good experiences literally writing out the Fortran on a giant brown paper trash bag cut up and stapled into a scroll-like object, compared to people who would laboriously use WATCOM or Emacs or whatever people were using in those days as an IDE, looking through the toilet-paper tube of a 640×480 90s-era monitor. I attributed this to simply being able to look at the whole thing at a glance, but for all I know, writing it with a pencil and engaging my hand’s proprioceptors, or not looking at a hypnotically flickering screen, were the magic ingredients. I’m beginning to think the latter things are more important than people realize.


The RAND Corporation did a study of the failings of CAD in the design of British nuclear submarines. Before computard/CAD tools, people would use the old timey techniques of drafting in 2-D on a piece of paper and building scale models to see how pieces fit together. Literally laboriously using weird T-square tools and a pencil and building plaster models was faster than using advanced computard technology. Again, this is something I’ve actually experienced in building experimental physics gizmos. You can spend months on a design in Solidworks and make something impossible to fabricate which doesn’t line up with the rest of the system: I’ve seen it happen. A dude with a tape measure can fix it in a few hours if it’s a one-off; somehow these problems don’t come up with models and slide-rule-and-paper design. This was admitted in Parliament in their investigations of the cost overruns on the Astute class submarines. It boggles my mind that people still don’t realize this is a real problem. We get mindless repetitions that “software is eating everything” like some kind of mantra despite evidence to the contrary. Instead of studying the problem, it’s simply dismissed. Nobody trains in non-CAD drafting any more, so we can’t exactly go back to that.


Now, the RAND study did sort of elide the core problem by stating that American experts (who had been using CAD and run into many of these problems before) at Electric Boat helped unfuck the British program. They did not ask the basic question of whether or not CAD was mostly harmful; it may be that its use reduces productivity overall and that people would be better off using it only strategically. We’ll never actually know, because unless the Russians are still using old timey drafting methods, we don’t have a comparison class which isn’t time-censored (the Chinese would never think of this: using paper would be seen as losing face).


Another study, from back in 1989, looked at CAD design. The author uses the example of printed circuit design, something that has long since been given over to CAD entirely. Back in those days a lot of the designs had to be refactored by hand. He also notes the danger that future generations of designers might have atrophied skills which won’t enable them to do this. And he notes that CAD didn’t eliminate the job of the draftsman or increase his output; he just does it on a computer now.


For another example, Richard H. Franke studied an early adopter of now widespread computer technologies: the financial services industry. This is wholly remarkable because if any field should show an increase in productivity due to computer adoption, it would be financial services, but he pretty definitively proved that, up to 1987 anyway, the productivity of financial services went down due to the introduction of computers. Not by a little bit either: by a lot:


[Figure: note the traditional, non-computard plot he made; he probably got published two years earlier because of it.]
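For concreteness, “productivity” in studies like this is essentially output divided by labor input, and the decline Franke measured is a decline in the growth of that ratio after heavy computer investment. Here is a minimal sketch of that arithmetic; the numbers are made up for illustration and are not Franke’s data.

```python
# Illustrative only: made-up numbers, not Franke's data.
# Labor productivity is output divided by labor input; the "paradox" is the
# growth of that ratio stalling or reversing while computer investment soars.

def productivity(output, labor_hours):
    """Output per labor hour."""
    return output / labor_hours

def annual_growth(series):
    """Year-over-year growth rates of a series."""
    return [(b - a) / a for a, b in zip(series, series[1:])]

# Hypothetical sector: real output and hours worked, by year.
output = [100, 104, 108, 112, 114, 116]   # output keeps growing...
hours  = [100, 101, 103, 106, 109, 112]   # ...but hours (including new IT staff) grow nearly as fast

prod = [productivity(o, h) for o, h in zip(output, hours)]
print([round(g * 100, 2) for g in annual_growth(prod)])
# [2.97, 1.83, 0.77, -1.02, -0.97] -- per-hour productivity growth shrinks
# and turns negative even though raw output never falls.
```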


Note that in the same paper, he found the introduction of CAM (computer-aided manufacturing) was also associated with a similar productivity decline in manufacturing. You can sort of imagine why: computer equipment was expensive and people had to learn how to use it. But there are probably larger scale effects. I have small machine tools in my house; none of them are CAM tools. If I want a part, in most cases it’s dirt simple to get out the calipers, visualize it in my mind, and make the damn thing. At most I need to fool around with a ruler and a piece of graph paper. I can’t make everything this way, and there are a number of doodads I’d have a hard time with, where a $50,000 CAM mill that fills my entire machine shop would be able to do it. The CAM thing would handle the corner cases, but I’d spend months learning how to use it, spend tons of loot keeping it running (it’s much more complicated and prone to failure), and I’d spend all my time ministering to this monstrosity, learning to use whatever CAD tools are out there, and forgetting how to make precise cuts on my manual mill and lathe. The same story was probably true in FinServ: their routine tasks were made more complicated by ritual obeisances to the computer gods.
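To put rough numbers on that trade-off, here is a back-of-the-envelope break-even sketch. Every figure in it is an invented placeholder, not a measurement; the point is only that the fixed overheads (learning, upkeep, capital) swamp the per-part speedup at hobbyist volumes.

```python
# Back-of-the-envelope sketch of the CAM trade-off described above.
# All numbers are invented placeholders, not measurements.

manual_hours_per_part = 3.0      # calipers, graph paper, manual mill and lathe

cam_hours_per_part   = 1.0       # faster per part once the machine is running
cam_learning_hours   = 400.0     # months spent learning the machine and CAD toolchain
cam_upkeep_hours     = 100.0     # yearly ministering to the monstrosity
cam_capital_cost_hrs = 1000.0    # the $50,000 mill, expressed as hours of my time

def hours_spent(parts_per_year, years):
    manual = manual_hours_per_part * parts_per_year * years
    cam = (cam_capital_cost_hrs + cam_learning_hours
           + cam_upkeep_hours * years
           + cam_hours_per_part * parts_per_year * years)
    return manual, cam

for parts in (20, 100, 500):
    manual, cam = hours_spent(parts, years=5)
    winner = "manual" if manual < cam else "CAM"
    print(f"{parts:4d} parts/year over 5 years: manual={manual:6.0f}h  CAM={cam:6.0f}h  -> {winner}")
# At 20 or 100 parts a year the overhead never pays back; only at
# production volumes does the per-part speedup beat the fixed costs.
```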


Somewhat to my surprise, there are enough examples of this that economists have actually come up with a name for it: the Solow Paradox. Robert Solow is a 99-year-old MIT emeritus professor of economics who quipped in 1987 that “You can see the computer age everywhere but in the productivity statistics.” I loathe economists as a pack of witch doctors with linear regression models, but the effect is large enough that even they noticed. Everyone was relieved when GUIs and LANs came out in the mid-90s, and these technologies did seem to be associated with an increase in productivity in some sectors of the economy. This measurable increase basically stopped when people started wiring their computers up to the internet. It’s not like MS Word does anything different now that it didn’t do in 1995; it just requires more resources to run.


For brick and mortar retail one can understand how productivity increased in the 90s. It’s a hell of a lot easier using bar codes and a database back end to manage your inventory than whatever ad hoc filing cabinet systems people were using before. With inventory control you can optimize your supply chain and get further efficiencies. Buying it all from China also helped the firms involved (it didn’t do any good for the country’s manufacturing capabilities, of course, but that’s out of scope for economists). This process was already happening in the 80s, but computers were still running things like DOS with VAX and AS/400 backends, all of which required the ministrations of a large caste of IT professionals. Hooking everything up to a LAN with GUI front ends helped lower the IT head count, so the IT guys could go off and invent new businesses involving wasting time on the internet. Later you got some productivity growth from selling stuff online instead of in shops (which are a large cost center). BTW, this is my interpretation of why the Solow paradox came back in the 2000s: the 90s gains were one-time efficiencies, and once they were captured, the paradox returned. It’s the most obvious interpretation.
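As a sketch of why this beat the filing cabinets: the core of barcode inventory control is just a table keyed by barcode, a decrement on every register scan, and a reorder query. A minimal, hypothetical version (not any particular retailer’s schema) looks something like this.

```python
# Minimal sketch of barcode-driven inventory control: one table keyed by
# barcode, a decrement on each sale, and a query that produces the reorder
# list. Hypothetical schema, for illustration only.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("""
    CREATE TABLE inventory (
        barcode TEXT PRIMARY KEY,
        name TEXT,
        on_hand INTEGER,
        reorder_point INTEGER
    )
""")
db.executemany(
    "INSERT INTO inventory VALUES (?, ?, ?, ?)",
    [("012345678905", "widget", 12, 10),
     ("036000291452", "sprocket", 3, 5)],
)

def scan_sale(barcode, qty=1):
    # A register scan is just a decrement against the barcode key.
    db.execute("UPDATE inventory SET on_hand = on_hand - ? WHERE barcode = ?",
               (qty, barcode))

def reorder_list():
    # Supply-chain "optimization" starts with a one-line query.
    return db.execute(
        "SELECT name, on_hand, reorder_point FROM inventory "
        "WHERE on_hand <= reorder_point"
    ).fetchall()

scan_sale("012345678905")
print(reorder_list())   # [('sprocket', 3, 5)]
```

The filing-cabinet equivalent of that reorder query was somebody walking the shelves with a clipboard, which is most of the 90s retail productivity story in one line.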


[Figure: you can see here that the “paradox” of computers not being helpful is back.]


[Figure: earlier productivity growth figures, not shown on the current-year BLS website.]


Just to tangentially remind people: this is the promise of computing and of wistful fantasies like “AI”. You want to increase the productivity of a worker to the point where you can make do with fewer workers, outputting the same amount of product for cheaper. If you have a technology that doubles a worker’s productivity but requires another worker to minister to the technology, you haven’t increased your company’s productivity: you’ve decreased it, because you have the same output per worker plus the incurred cost of the technology. If you have a technology which only marginally increases a worker’s productivity but still requires another worker to minister to it, you have made productivity significantly lower. It is entirely possible to lower a worker’s productivity with a technology, as we saw with the British attempt to cut submarine design costs using CAD instead of T-squares, pieces of graph paper, and styrofoam models.
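To spell out that arithmetic, here is a toy calculation; the numbers are placeholders and the output units are arbitrary.

```python
# Toy version of the arithmetic above. Output units are arbitrary; the
# technology's running cost is expressed in the same output units.

def output_per_head(output_per_line_worker, line_workers, minister_workers, tech_cost=0.0):
    """Net output divided by total headcount, counting the people who
    minister to the technology and subtracting its running cost."""
    total_output = output_per_line_worker * line_workers
    headcount = line_workers + minister_workers
    return (total_output - tech_cost) / headcount

baseline = output_per_head(1.0, line_workers=1, minister_workers=0)
doubled  = output_per_head(2.0, line_workers=1, minister_workers=1, tech_cost=0.2)
marginal = output_per_head(1.1, line_workers=1, minister_workers=1, tech_cost=0.2)

print(round(baseline, 2))   # 1.0
print(round(doubled, 2))    # 0.9  -- doubling one worker but adding a minister is a net loss
print(round(marginal, 2))   # 0.45 -- a marginal gain plus a minister is much worse
```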


The economists, naturally, are lowering their own productivity by arguing about this; some of them claim it’s not real and that the productivity increases show up at some nebulous later date in an unspecified way that apparently can’t be measured. Some of them simply flip over the tables and insist the computers are good and we should find a new way to measure productivity that involves fucking around on a computer. This is despite the abundant evidence that productivity is slowing down or declining even as computard technology becomes ever more pervasive. They fiddle around with linear regression models of varying degrees of sophistication. They argue on the internet. What they don’t do is look for situations where the data show differences, in an attempt to understand the problem well enough to provide guidance or solutions. This is a microcosm of everything else: rather than solving a problem, they’re looking busy by furiously typing on their computards. Economists in the pre-computard era were more than capable of this sort of thing: Burton Klein made a good stab at looking at productivity improvements using pencil and graph paper.


One of the things that introducing a new technology does do: it redistributes resources and what people do on a daily basis. I don’t know if Avon Ladies are still a thing; but now there are Instagram whores shilling things. Database vendors and DBAs get money instead of filing cabinet manufacturers and filing clerks. Instead of filling out paper forms, people fill out computer forms. For another example, computard made decimalization a thing, and opened up market making to people all over the world, instead of a couple of knuckle dragging former football players in Chicongo and NYC who went to the same couple of high schools. Any individual who buys stonks now (and a lot more people do) will get a better price. Lots more guys like me get paid. Still, the total productivity has gone down. Is it better to pay a couple of dumb incumbents more money or more Ph.D. types less money and spend the difference on computers and stratum 0 NTP servers instead of coke and hookers?


Robotic automation may remove jobs from blue-collar workers and assign more jobs to white-collar workers and the pyramid-scheme institutions which certify white-collar workers. It would be hilarious if we automated all the manufacturing jobs with robotics and it lowered productivity: that actually seems to be the trend. The long-term consequence is that lower-IQ people have nothing remunerative to do, and higher-IQ people in these jobs don’t reproduce, because they educated themselves out of their fertility windows. That’s another issue nobody in economics wants to think about, but an Amish farmer would probably notice.


Mind you, I think robotics is worth investing R&D dollars in. All these “AI” goons fooling around with LLMs or larpy autonomous vehicle nonsense should be working on workaday stuff like depth estimation, affordance discovery and scene understanding, or other open problems in robotics. It’s the mindless application of current-year information technologies in areas where they are not suited or not helpful at all that I find disagreeable. We add computers to things not because it makes them better, but as a sort of religious ritual to propitiate the technological gods. The gods are not pleased with our sacrifices. We do them anyway, like the cargo cult guy with coconut earphones trying a different variety of coconut in hopes of getting a different answer.


The persistent presence of the Solow “paradox” ought to give us pause about how we develop and deploy new technologies. If I visit a company claiming to innovate, is there a computer on everyone’s desk? Does there need to be a computer there? What are people doing at their computers? Is it mission oriented, or are they just fucking around with a computer? I suspect banning computers in R&D facilities, except where absolutely necessary, would pay dividends. Banish them to special compute rooms and limit employee time there. Someone should try it; there’s nothing to lose: all R&D is a gamble, and at least you won’t waste time fiddling with computers.



