I don't know whether California exam practice has changed since you
participated or what. I have participated in scoring exams twice in the
past five years. In both cases, the problem author was on the scoring team
and in both cases, the first order of business was to review his/her
solution and develop a way of scoring that solution and alternate solutions
(a "grading plan").
Graders were encouraged to use their engineering judgement if a
non-standard solution was presented. An alternative solution was to be
judged on whether it was technically sound, not on whether it matched
the author's version of the solution.
Each problem was scored independently by two separate graders. BORPELS
staff reviewed the numerical scores given. If the scores did not agree
within a fairly small range, the problem was given to a third grader to
independently rate. For scoring purposes, the problems were subdivided
into units significantly finer than their separate parts (e.g., part a,
part b, etc.). Numerical errors were to be penalized only once. If a
further
part of the problem depended on an incorrect numerical answer, graders were
instructed to evaluate the problem beyond the error as if the erroneous
result were correct. In my experience, the grading plan gave far more
weight to ability to present reasonable ways to solve a given problem than
to numerical results.
Although, prior to my participation, I had heard rumors of the types of
situations you described, I was pleasantly surprised to find that
the instructions given were geared to allowing any and all reasonable
solution approaches. In my experience, very few problems went to the third
"referee" grader except with one individual on the grading panel. That
individual did not return after the lunch break. I don't know if BORPELS
asked him to leave or if he became frustrated that he didn't agree with
anyone else. In general, it was very easy to see if a candidate understood
the problem or not. Those who did understand often made numerical
errors, but if the method was correct, they seemed to get quite decent
scores.
I would argue that a single test is an imperfect way to judge an
engineer's capability. But, after participating in and reflecting on
the process, I believe it at least acts as a gross filter to eliminate
those that need
significantly more education and/or experience. Does passing the test
reflect competency? I think only in the most general way.
After one of the sessions, one author I talked with was deeply disturbed
that his problem was way too difficult and set the bar too high. He said he
learned that it was more important to test fundamentals than detailed
understanding of code provisions and that would impact any future problems
he might author. I don't know if BORPELS staff passed on this lesson to
future authors or not.
As for artificially high barriers toward passing, I came away with an
appreciation that the bar is set reasonably even though the number passing
may be low. The types of mistakes I observed would make me uncomfortable
about the competence of the group just below the bar if a higher number of
individuals were passed just to raise the numbers. The bar definitely
needs to be set with respect to the difficulty of the problems, because
it is extremely difficult to develop problems of uniform difficulty
from year to year.
BORPELS attempts to do this by having people who have recently passed
the exam take the new test to gauge its difficulty.
Bill Cain, S.E.
From: Charles Greenlaw [SMTP:cgreenlaw(--nospam--at)speedlink.com]
Sent: Thursday, February 24, 2000 2:25 PM
Subject: RE: California SE exam
California tests CE's using the NCEES CE Exam, supplemented by
state-written "special" exams in (supposedly) seismic principles and
engineering surveying principles. There have been recurring protests
that the term "principles" has not been honored, and that excessive
difficulty and sophistication have been inserted as an artificial
barrier to pass rates. Before the mid-1970's, California used a
home-written CE exam that contained required seismic problems and land
surveying problems. The pass rate ran around one-third. With use of
the NCEES CE exam, which did not contain seismic or LS problems, pass
rates ran above 80 percent. Both conditions were cured by introduction
of the special supplemental exams.
Passing an exam is a very good measure of the person's ability to
"pass" the exam that was taken. For the Calif SE Exam, in addition to
competency, that ability, more than anything else, turns on how well
one can replicate the solutions favored by unseen, unknown problem
writers and graders who have their own habitual perspectives. Working
among such people while acquiring the ordained experience is highly
advantageous. Knowing the equivalent of Calif SE prejudices, idioms,
accents, and mannerisms does count a lot toward the acceptability of
one's problem solutions.
Maverick ways, regardless of technical merit, are a disadvantage. For
example, despite exam instructions that clearly give equal standing to
both methods, ASD solutions in steel design are standard; LRFD
solutions foreign and unwelcome. That may be easing under pressure
since litigation began. Solutions notably more expert and
sophisticated than intended by the examiners are likely to be scored
poorly; if in multiple-choice format, a superior answer can easily be
a "wrong choice" and get zero credit. No appeal of multiple-choice
problems is permitted anymore.
Charles O. Greenlaw SE Sacramento CA