
RE: California SE exam

Charles-
I don't know whether California exam practice has changed since you
participated.  I have participated in scoring exams twice in the past five
years.  In both cases, the problem author was on the scoring team, and in
both cases the first order of business was to review his or her solution
and develop a way of scoring that solution and alternate solutions (a
"grading plan").

Graders were encouraged to use their engineering judgment if a non-standard
solution was presented.  An alternative solution was to be judged on
whether it was technically sound, not on whether it matched the author's
version of the solution.

Each problem was scored independently by two separate graders.  BORPELS
staff reviewed the numerical scores given.  If the scores did not agree
within a fairly small range, the problem was given to a third grader to
rate independently.  For scoring purposes, the problems were subdivided
significantly finer than their separate parts (e.g., part a, part b, etc.).
Numerical errors were to be penalized only once: if a later part of the
problem depended on an incorrect numerical answer, graders were instructed
to evaluate the work beyond the error as if the erroneous result were
correct.  In my experience, the grading plan gave far more weight to the
ability to present reasonable ways of solving a given problem than to
numerical results.

Although, prior to my participation, I had heard rumors of the types of
situations you describe, I was pleasantly surprised to find that the
instructions given were geared to allowing any and all reasonable solution
approaches.  In my experience, very few problems went to the third
"referee" grader, except with one individual on the grading panel.  That
individual did not return after the lunch break; I don't know whether
BORPELS asked him to leave or whether he became frustrated that he didn't
agree with anyone else.  In general, it was very easy to see whether a
candidate understood the problem.  Those who did understand often made
numerical errors, but if the method was correct, they seemed to get quite
decent scores.

I would argue that a single test is an imperfect way to judge an engineer's
capability.  But after participating in and reflecting on the process, I
believe it at least acts as a gross filter to eliminate those who need
significantly more education and/or experience.  Does passing the test
reflect competency?  I think only in the most general way.

After one of the sessions, one author I talked with was deeply disturbed
that his problem was way too difficult and had set the bar too high.  He
said he learned that it was more important to test fundamentals than
detailed understanding of code provisions, and that this would affect any
future problems he might author.  I don't know whether BORPELS staff passed
this lesson on to future authors.

As for artificially high barriers to passing, I came away with an
appreciation that the bar is set reasonably, even though the number passing
may be low.  The types of mistakes I observed would make me uncomfortable
about the competence of the group just below the bar if more individuals
were passed just to raise the numbers.  The bar definitely needs to be set
with respect to the difficulty of the problems, because it is extremely
difficult to develop problems of uniform difficulty from year to year.
BORPELS attempts to account for this by having people who have recently
passed the exam take the test to gauge its difficulty.

Regards,
Bill Cain, S.E.
Oakland  CA

	-----Original Message-----
	From:	Charles Greenlaw [SMTP:cgreenlaw(--nospam--at)speedlink.com]
	Sent:	Thursday, February 24, 2000 2:25 PM
	To:	seaint(--nospam--at)seaint.org
	Subject:	RE: California SE exam

	<snip>
	        California tests CE's using the NCEES CE Exam, supplemented by
	state-written "special" exams in (supposedly) seismic principles and
	engineering surveying principles. There have been recurring protests
	that the term "principles" has not been honored, and that excessive
	complexity and sophistication have been inserted as an artificial
	barrier to high pass rates. Before the mid-1970's, California used a
	home-written CE exam that contained required seismic problems and land
	surveying problems. The pass rate ran around one-third. With use of the
	NCEES CE exam, which did not bear seismic or LS problems, pass rates ran
	above 80 percent. Both "calamities" were cured by introduction of the
	special supplemental exam portions.

	<snip>

	        Passing an exam is a very good measure of the person's ability
	to "pass" the exam that was taken. For the Calif SE Exam, in addition to
	basic competency, that ability, more than anything else, turns on how
	well one can replicate the solutions favored by unseen, unknown problem
	writers and graders who have their own habitual perspectives. Working
	among those very people while acquiring the ordained experience is
	highly recommended. Knowing the equivalent of Calif SE prejudices,
	idioms, accents, and mannerisms does count a lot to the acceptability of
	one's problem solutions. Maverick ways, regardless of technical merit,
	are a disadvantage. For example, despite exam instructions that clearly
	give equal opportunity, ASD solutions in steel design are standard; LRFD
	solutions foreign and unwelcome. That may be easing under pressure since
	litigation exposed it. Solutions notably more expert and sophisticated
	than intended by the examiners are likely to be scored poorly; in
	multiple choice answer format a superior answer can easily be a "wrong
	choice" and get zero credit. No appeal of multiple-choice problems is
	permitted anymore.

	<snip>

	Charles O. Greenlaw  SE    Sacramento CA