Structure Magazine - Computers and SE Judgement article
- To: "SEA International List" <seaint(--nospam--at)seaint.org>
- Subject: Structure Magazine - Computers and SE Judgement article
- From: "Dennis S. Wish PE" <wish(--nospam--at)cwia.com>
- Date: Sat, 2 Jan 1999 02:08:59 -0800
I just finished Richard Parmalee's article entitled "Have We Let Computers Replace Structural Engineering Judgment?" and have some comments about portions of the article with which I do not entirely agree. Structure Magazine (Winter 1998) is distributed free of charge to members of SEA (NCSEA), CASE and SEI as part of their annual dues. If allowed, I will ask for permission to reproduce the article on the SEAINT Online page of our website.
In short, the article argues that engineers place implicit faith in the results of computer programs and do not spend enough time acquiring the experience necessary to judge whether those results are accurate. Although he does not condemn structural programs, Parmalee believes that engineers pay less attention to interpreting results than to understanding how structural materials behave. He further cautions that there are no real industry standards to protect the user of a program from "bugs" that can produce inaccurate results. Finally, he points out that the disclaimer shipped with each piece of software protects the developer and places sole responsibility on the user to verify accuracy.
These arguments are as old as the references he quotes. They are certainly valid concerns, but I believe they carry less weight as commercial software products evolve. For the most part, the results produced by computer software are more accurate than they have ever been. The problem is not so much one of mathematical error as of interpreting the methodologies used to create the algorithms.
In defense of computer programs: structural programs are, for the most part, created by engineers with additional skills in computer programming. I once wrote a complicated spreadsheet that designed the retrofit of a multi-story unreinforced masonry building. I learned more from the process of writing that program than from any other design method I have used. I gained an in-depth understanding of the code I was working with (theoretical and practical) because I had to understand the relationship of the formulas to the results they yielded. I tested this by designing a four-story building and comparing the program's results to the manual output. I further relied upon the scrutiny of plan check engineers at a time when this methodology was being intensely debated, prior to its publication in the UCBC. I was not just creating computer representations of formulas - instead, I had to understand the relationships among the numbers in order to create a path, or flowchart, that yielded the intended results. This is not typical, although it is highly desirable for any engineer who designs any type of structure. By contrast, there is no guarantee that an engineer doing the work by hand has gone through the process enough times to ensure that his submittal represents a correct interpretation of the code - something more likely to be true of the computer program.
Computer programs are tools - no more, no less. Tools are only as good as the user's understanding of their operation. This leads to the issue of liability.
Parmalee questions the liability issue inasmuch as the developer makes no claim of accuracy for his product. I am sympathetic to Parmalee here - however, such disclaimers are typical of all software on the market, and it is understandable that a program with millions of lines of code can contain a mistake that yields inaccurate results. In most commercially available software, a specific bug that yields catastrophic results is extremely rare and can usually be attributed to an isolated event within the logic of one routine. This is where intuition needs to be developed - and proper beta testing of software products, comparison with other software (at least two other programs), and justification against hand analysis all help to reduce the possibility of a catastrophic error occurring.
I also believe there are errors caused by the user's perception of how a program is supposed to work rather than how it is actually designed to evaluate data. This was a problem in the software that I created. Although the manual was very clear that the software accumulated pier areas from each level, it was not uncommon to find users who did this manually as well - thus compounding errors. This was not a problem with the software but one of interpretation by the user - albeit an incorrect one.
In this case the problem is not implicit faith in the software but a failure to learn the proper use of the new tool. In any case, a stronger foundation providing a more intuitive understanding of the results would have tipped the designer off to their inaccuracy. This can be accomplished through computer analysis as easily as through manual analysis.
Finally, I believe that Parmalee's article is not representative of today's technology, though it may have had value as little as five years ago. For the most part, it is my opinion that computer programs, like fine engineers, improve with age. I believe that we use computer-generated results to help establish a professional feel, or intuition, for how materials work. I also believe that most developers include sufficient documentation to alleviate these concerns by addressing specific issues. In a recent review of RISAFoot, just published in SEAINT Online, I noted many places within the software's help files where the author (Bruce Bates) took great pains to explain to the user how the software interprets the data used to design the footing.
Computer programs should not be expected to teach the user the methodology - this is the responsibility of education and apprenticeship. Computer programs are tools to be used to apply the knowledge of the methodology. Our professional intuition is required not only to check the accuracy of the software, but to identify where we have erred in the input of values - something that is just as easy (if not more so) in manual calculations.
No software developer is free from the possibility of litigation simply because of his disclaimer. Anyone with enough money can institute action against the developer for inaccuracies that exist in the software. I question whether this is a real concern, and I would challenge Parmalee to cite specific examples where recent software yielded results that caused structural failures. I am interested in whether these involved commercial or proprietary programs, and whether they were related to bugs or to an improper understanding of the software by the user.
Please note that these are my own opinions and not necessarily those of the publishers of SEAINT Online. I have addressed only commercially available software in this post; however, I do feel there is potential for problems arising from software created by companies for proprietary use. In that case there is little if any formal beta program, and even less chance of published documentation that allows a building official to follow design examples. However, I will save this for a separate discussion.
Dennis S. Wish PE