Steve Naidamast
The continuous stream of poorly produced software, and its misuse in daily life, is having an increasingly degenerative effect on the lives of those who use it, especially as younger generations adapt to that misuse through mobile computing devices.
A recent paper clinically demonstrates that the consistent use of mobile software has had a seriously degrading effect on simple curiosity, a vital requirement for intellectual development. Without curiosity, people do not pursue knowledge for its own sake or build on an existing foundation. Instead, the Internet becomes a substitute for the acquisition of knowledge: information is delivered in “sound-bite” form that may satisfy a specific need but does nothing to increase basic understanding of the topic.
Unfortunately, as technology has advanced over the past 25 years, its misuse has advanced just as rapidly. This has been an underlying factor in the technical outsourcing trend among US corporations, whereby new communication technologies convinced business leaders that cheaper development labor could be acquired abroad.
With these new Internet communication capabilities, corporate management teams dreamed up new ways to reduce the labor costs of American Information Technology professionals, exchanging them for cheaper labor on H-1B visas and for foreign outsourcing firms whose sole interest was feeding bodies into a growing money machine. Over time, this process alone gutted the hierarchical structure of software development that was critical to its quality.
Software development is as much a craft of love as it is a science. Without a sense of personal pride in the creation of what we call source code (the base code that is compiled and then provided to users as finished applications), one can never get to the science behind it; and the science alone simply cannot provide the inherent quality that is needed. Yet this vital relationship was ignored by corporate “professional management,” lured by new high-speed Internet communications, and the result was a major disconnect in how software was developed.
First, corporations saw no need to maintain their Business Analyst staffs (an often overlooked position) except for highly specialized situations. The vital function of communication between non-technical and technical staff was thus removed, or handed over to less expensive people in faraway places, which made communication among company members difficult and costly.
As software development environments advanced in capability to keep pace with Internet advancements, the Systems Analysts were the next to be phased out: the people who took Business Analyst requirements from the user communities and converted them into functional specifications for development teams. The perception that software could now be created faster with even less planning and design was the result of erroneous, and intentional, conclusions drawn by business managers to favor the corporate bottom line.
Finally, technical management shifted from US managers to managers from foreign countries, who had neither the interest in nor the understanding of how technical teams should be treated or how systems should be developed. With this influx of new foreign managers, the deadline became fanatically important, and these new management types were allowed to run roughshod over everything and everyone.
Developers and software engineers, under increasingly terrible pressure, were now solely responsible for implementing new work, bypassing many of the prerequisites of good planning and design. This initiated a new trend of lower-quality software.
One outgrowth of these pressures was the promotion of a new development paradigm that claimed to make building software possible under exactly such conditions. It was a false promise, but it caught on like wildfire.
In the mid-1990s the Chrysler Corporation initiated a plan to redesign its payroll system, a project called the “C3 Payroll System.” In an attempt to bring the project in under budget and on time, the development team “cooked up” a new paradigm called “Extreme Programming,” or “XP.” It seemed like a terrific concept, since it removed the planning and design stages from software development yet still promised the same level of quality in a shorter amount of time. In the fourth year of its use the development team promoted the new concept nationally, and corporations quickly saw another way to reduce costs by adopting this paradigm as the new way to develop software.
However, in the fifth year of the Chrysler project, development of the new system collapsed under the weight of the paradigm’s failure. Simply put, using “Extreme Programming” was like designing a new aircraft without all the necessary planning and then putting the prototype directly into production.
When news of the “Extreme Programming” failure filtered through the technical ranks, a new idea called “Agile Development” was proposed. It mimicked much of the first paradigm, with a few caveats that added some design standards back into the process. Despite this, the historical project failure rate remained at around 70%.
By the time the first Apple iPhone was introduced in 2007, the damage had been done to the US technical field, once the crown jewel of the American economy. Its basic infrastructure had been torn apart in the quest for greater profits, while unsustainable pressures were imposed on those doing the development. iPhone development, and later iPad development, added an entirely new diversion from quality software work, one that many younger technicians gravitated to while most often producing useless toy apps for such devices.
Mobile computing has insidious side effects. Constant and prolonged use of cell phones has been documented in five different medical reports as a promoter of brain cancer. In addition, use of smartphones and tablet computers has been clinically shown to re-wire users’ brains so that long-term concentration is severely impaired; once the damage is done, it has been found to be irreversible.
These effects have also been damaging to many software development projects, since younger generations of technical professionals and younger managers have simply lost the ability to plan complex systems, outside of very specific industries such as commercial aircraft manufacturing. (Even military aircraft design has suffered terribly over the years.) The result is that software development in many places has deteriorated into what we in the field used to call “guerrilla programming,” whereby everything is done on the fly without any proper foresight.
This ongoing deterioration in American software development, which now supports nearly every transaction in American daily life, is why this society finds that nothing relying on it works well. The recent introduction of the “Affordable Care Act” (“Obamacare”) enrollment system is just one glaring example of this failure. With over fifty consulting firms across the nation and an initial cost of $680,000,000, the sheer size of the project was a recipe for failure, producing the disastrous rollout experienced by everyone who attempted to apply for insurance through it.
As the author found recently, the New York State exchange is still unable to confirm Social Security numbers correctly (a fairly standardized process for such work).
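As an aside on why that check is “fairly standardized”: the Social Security Administration publishes the structural rules a nine-digit SSN must follow, so a basic plausibility test takes only a few lines of code. The Python sketch below is purely illustrative and assumes nothing about how the New York system actually works; the function name is invented here, and the rules shown are simply the published format constraints (no 000 or 666 area number, nothing in the 900s, no all-zero group or serial).

```python
# Purely illustrative sketch: a minimal structural check of the kind the author
# calls "fairly standardized." The rules below are the SSA's published format
# constraints; this is not the New York system's actual validation code.
import re

def ssn_is_plausible(ssn: str) -> bool:
    """Return True if a string is structurally plausible as a Social Security Number."""
    s = ssn.strip()
    if re.fullmatch(r"\d{3}-\d{2}-\d{4}", s):
        s = s.replace("-", "")            # normalize the dashed form
    if not re.fullmatch(r"\d{9}", s):
        return False                      # wrong shape entirely
    area, group, serial = s[:3], s[3:5], s[5:]
    if area in ("000", "666") or int(area) >= 900:
        return False                      # area numbers never issued
    if group == "00" or serial == "0000":
        return False                      # all-zero group or serial never issued
    return True

if __name__ == "__main__":
    for candidate in ["078-05-1120", "000-12-3456", "123456789", "12-345-6789"]:
        print(candidate, "->", ssn_is_plausible(candidate))
```

Of course, confirming a number, as opposed to merely checking its shape, also requires a lookup against the Social Security Administration’s own records, and that is the step the author reports still failing.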
The catastrophic damage done to the software industry is why nothing works properly. There are simply not enough people left in the profession with enough influence to provide the necessary quality to the technologies infesting our daily lives. And the effect is inherited, as less competent development staffs pass on their defective methodologies to new people entering US firms.
Those who had the intelligence to implement and promote new technologies within the context of their limitations have been shunted aside, forced from the technical professions, or have given up and left of their own accord, casualties of the pursuit of profit and personal agendas.
Steve Naidamast is a senior software engineer by profession and a military historian by avocation.
For an in-depth layman’s view of the complete lunacy in the current software development world, please see the article at the following link: http://stilldrinking.org/programming-sucks
Sometimes what you call misuse and poor design others will most likely deem “planned obsolescence,” version 3.0… Or, to make this fun, “ENGAGEMENT Architecture” from a marketing view. As one taps the options to make something go, one is boosting the stats for an app or a software program and showing off its ‘efficiency’ and ‘user engagement’; more variability may be added, which does nothing other than suck more time out of a person for the good of the aforementioned ideals.
I’ll see your “planned obsolescence, version 3.0” and raise you with my wife’s favorite movie title: “Final Destination 6,” which indicates a rather cavalier lack of interest in what the word “final” actually means. Why sell someone a needless and already obsolescent product only once when you can continue to sell them the same needless and already obsolescent product over and over again, ad infinitum, with only a minor change in the packaging or arrangement of iconic symbols substituting for no-longer-understood words?
Speaking of “the very latest” and “way cool” iconic phenomena: back in 1981, IBM announced their new Personal Computer, so I took my oldest son, then about seven, to see one at a local electronics store. Unfortunately, the store only had one of the things, and hadn’t even assembled it yet. So we only got to see the various empty slots in the motherboard where, supposedly, other hardware vendors would supply circuit cards to provide necessary functions like video and sound. The store salesman, seeing our disappointment, tried to steer us over to the Apple Macintosh computer on display nearby. My son immediately grabbed the mouse and began fooling around with it. I told him to put the thing down so as not to damage the merchandise. “Oh, he can’t hurt anything,” the salesman said. “It’s completely intuitive, so let him explore and discover what the Mac can do.” Against my better judgment, I told my son that he could go ahead and play with the thing.
So little Stuart took the mouse and moved the pointer over to an icon that looked like a bird’s feather (meant to represent a quill pen, which in turn was meant to represent the computer’s word-processing program). He dragged the little bird-feather icon over to what looked like a trash can icon and dropped it there. Then he clicked the mouse button and got a menu that asked if he wanted to empty the trash can. So he did, because he understood what you normally do with a trash can that has a bird feather in it. Then Stuart lost interest and we headed for the exit.
Just before we got to the door, the salesman brought another customer over to the Macintosh in order to demonstrate the computer’s word-processing program. Finding no icon for the program (or “app,” as the way-cool Apple people like to call such things), he turned to my son and demanded to know: “What did you do to the Mac?” I cut him off immediately. “Don’t blame my son for your useless computer,” I said. “You told him that he couldn’t hurt anything.”
I still buy inexpensive parts for generic IBM PC clones today, assembling the motherboards, power supplies, circuit cards, keyboards, and display monitors myself. Then I install the operating system and other software programs, which I can now download from the Internet free of charge. I taught my two sons how to do these things, too. As a result, they have no problem understanding computers or what makes them do what they do. On the other hand, I fear for a nation of people who think that little iconic bird-feathers actually write letters or documents.
Now my wife and I wonder when the movie “Final Destination 7” will open at a nearby theater where we won’t go to see it like we didn’t go to see the previous 6 versions of “finality.”
“Completely intuitive.” What an amusing story, Mike.
My first computer was an IBM PCjr. I think I bought it in 1985 for roughly $1,000, with 256K of RAM if I remember right. I used it mainly for word processing (using PFS:Write, if anyone remembers that “app”) and for playing computer chess.
Of course, computers really weren’t that fun until they could be networked together. Behold the Internet! The new Wild West, an often violent and vapid landscape. A funhouse reflection of humanity, I suppose.