Jeff Fergus - What is ABET accreditation and how materials programs can easily succeed at their next visit

[MUSIC]

Hello and welcome to another episode of Undercooled, the Materials Education Podcast.

Today we have a special guest.

We have my friend Jeff Fergus from Auburn

University who knows more about ABET than

anyone else in the

entire world that I know.

And so we're going to be

talking all about ABET today.

But first, Jeff, why don't you tell us a

little bit about yourself?

Sure.

So glad to be here.

So I'm originally from Illinois.

Went to University of Illinois and the

University of Pennsylvania, which is

where I first met Steve.

Did a post-doc at Notre Dame and have been at Auburn ever since.

I came here in '92, so I've been here for 31 years.

My research area is generally kind of

high temperature chemistry,

electrochemistry, batteries,

fuel cells, things like that.

But I've been in the dean's office for about 10 years, in two different associate dean roles.

The first was program assessment and graduate studies.

And now I moved over to undergraduate

studies and program assessment.

So that's a little bit about me.

Tim?

Awesome.

Well, with that background, of course,

the theme of today's show is ABET.

So can you first actually just tell us a

little bit about what is ABET?

There may be people in

the audience who don't know.

And then how are you

personally involved in the program?

Sure.

So ABET's a confederation of professional societies.

There are 35 professional societies that have basically come together to establish criteria for evaluating programs that prepare students in four areas: engineering, computing, engineering technology, and applied and natural sciences.

And so the purpose is to set criteria that programs can use to develop their programs so that their students are prepared and ready to enter the profession.

I've been involved with ABET as a

volunteer for about 20 years or so.

As a program evaluator, I

served on the commission.

There are four different commissions in

ABET in the different technical areas.

So engineering is the largest of those.

Engineering technology is another,

computing and applied

and natural sciences.

So I served on that commission.

I moved into the executive committee, and then into the officer chain.

So I was the chair of

the commission in 2019-20.

And since I've also worked on training.

So I'm a facilitator for the training

programs used to

train program evaluators.

And I'm now a lead facilitator for that.

I've been doing that since about 2011 or so, I think.

And currently I'm the chair of the

training committee of

the Accreditation Council.

So the Accreditation Council is a

committee that represents the four

different commissions.

Some of the things that ABET does are

specific to the different commissions and

some are common across

the four commissions.

And when things are common, then this

body is the one that kind of coordinates

so that those can be done.

And things can be

accomplished for all four commissions.

And so that council

has a training committee.

And that training committee is the body

that prepares the training,

organizes the training for,

mostly for program evaluators, but also

some of the team chair training that's

common among the commissions.

That's great.

And so ABET's been around, I believe,

since 1930 something.

Is that right?

It has gone by different names and reinvented itself multiple times.

Twice since I've been around: around 1998, when they started the outcomes-assessment-based approach, and then, I don't remember how many years later, when they redid the student outcomes pretty dramatically, going from (a) through (k) to 1 through 7.

So ABET is always asking programs to do

continuous improvement.

And I think ABET's been practicing

continuous

improvement for quite some time.

That said, why should, you know, this is

a podcast about materials education.

So our prime audience are the materials

programs in the United

States and around the globe.

I believe there's something like 125

materials programs, something plus or

minus five programs, which is a lot.

Why should materials

programs care about ABET?

Why are we involved with ABET?

So there's a first kind of a practical

reason that many entities look for ABET

accreditation when

they're looking for engineers.

In particular, if licensing is involved,

most of the jurisdictions for licensing

of engineers require a degree from an

ABET accredited program.

Now, many materials engineers don't need

that licensing, but there are some areas

where it is important.

For example, if you're going to go into

consulting and say failure analysis, it's

very important to have that professional

engineering license.

So that's kind of a practical reason.

And many companies or other entities,

when they're hiring, they'll put that as

a requirement for the degree or

preference, if not a requirement.

So those are kind of practical reasons, but it's really the process involved.

Although it sometimes may seem tedious, and it may not always be clear why parts of it are important, that process of evaluating your program and looking at what you can do better is going to help you improve the program and better prepare your students.

So I think when you really understand what's required by ABET, it's not as hard as people sometimes think.

And if you do it in the right way, it can

actually be a benefit and really help you

improve your program.

So I think it's really about, and that's

the reason I'm involved, if it were just

about checking a box for a licensure, I'm

not sure I would have spent

all the hours I have on it.

But when it's really about trying to

improve the preparation we're providing

for our students, that's really what I

think is important about it.

I completely agree. It's so valuable to

be able to say not just, hey, let's keep

doing what we've always been doing, but

to say, what can we do better so that our

students are more successful and more

prepared to make a

difference in the world?

Absolutely.

So on the receiving side of things, as an

educator at an ABET accredited

institution, I'm often interacting with

these criteria, the

accreditation criteria.

And so this makes me curious: where do the criteria come from, what are they specifically, and what are the criteria that ABET is using to accredit programs?

So maybe I'll start

with where they come from.

As I mentioned, ABET is a confederation

of professional societies.

And so it really is us.

It's all the different professional

societies have

representatives on the commission.

It's the commission that

decides what those criteria are.

It's the commission that creates the

teams that go out and evaluate programs.

And it's a commission that decides

whether or not they're compliant.

So it is people from our different

professional societies that determine

what those criteria should be.

And that's important to keep in mind when we complain that ABET requires this or that and we may not agree with it: ABET isn't some separate entity.

There is an office in Baltimore that has

30 or 40 staff members, but they're

really there just to manage the process.

The decisions are made by representatives

from all our professional societies.

Now, those different professional

societies have different

interests and priorities.

And so there is some

negotiation that has to go into that.

It has to apply to all these different

types of institutions, types of programs.

And so sometimes the language can be a little bit vague, because it's got to fit many different instances, which takes some judgment and some understanding to apply.

So that's kind of where it comes from.

In terms of what the criteria are, there are eight of what we call general criteria.

And those are the criteria

that apply to all programs.

So the way I think of them at the top,

you have what are called program

educational objectives.

And these are what the program expects

its graduates to be attaining

a few years after graduation.

So it's kind of your North Star.

What are you trying to do?

And the program needs to determine those

with involving its constituencies,

whoever they determine those to be, which

is typically employers and

alumni, people like that.

And I think that's one of the real benefits, one of the impacts, of the change that Steve mentioned that occurred around 2000.

And that is, it's forced programs to really engage more with their alumni and their employers to get feedback on what they should be doing.

So then below that,

it's how do you get there?

And the first step is what

are called the student outcomes.

These are the skills and knowledge that you expect your graduates to demonstrate before they graduate, so that they can then attain those objectives that you've identified.

Those, again, had to be determined and

agreed to by all

these different societies.

And one of the improvements that Steve was referring to actually has roots in our materials community.

The outcomes that were set back in probably the late 1990s had been around for a long time.

And Beth Judson, who was from the American Ceramic Society and was then on the executive committee, said, well, we ask our programs to always be improving, so we should be doing that, too.

So let's relook at these outcomes, and she kind of started that process back in, I think, about 2009 or 2010.

Sometime in that time frame.

And it finally took 10 years or so before they were changed; it took a long time to come to agreement as to what they should be.

I think they were an improvement.

So there is that effort to make it better, to adapt to these changes.

So that's the outcomes.

And then to support all that are

basically the things that the institution

or the program needs to do.

Students, what are your processes for

advising and making sure that students

meet all the graduation requirements?

What is the curriculum?

What is the content of

the courses they take?

What are the experiences they need to go through before they graduate?

Are your faculty qualified?

Do you have enough faculty?

Are the facilities adequate to support

those student outcomes?

Do you have the institutional support, financial and administrative, to support the program?

And then on top of all this, there's the requirement for continuous improvement.

So you have to have a process by which you're determining whether your students are meeting the outcomes that have been specified, and looking at other data to see how you are doing and what you can do to improve.

And you need to document that you're continuing that process.

Most of what the evaluators are looking for is related to processes.

And processes are important.

Sometimes people might say, well, that's kind of a bureaucratic thing, but processes are what keep things going.

If you don't have a process established,

then if people change, you might not keep

doing what you're doing.

But if there's a process that helps make sure people are doing things in a certain way, it keeps things consistent and doesn't let things fall through the cracks.

And that's really mostly what the team is looking for.

Do you have these processes? How are they

working? Are they documented?

Yeah, that's well said. I think that exactly captures what ABET tries to do.

And it's always distressing to me how many faculty and even many administrators misinterpret what ABET is trying to do.

I can't tell you how many meetings I've been in where someone says, oh, we can't teach that in our class because ABET won't let us.

And it's like, no, ABET doesn't say anything about what we actually teach.

There are no curricular mandates by ABET other than that it be an engineering course or a math and science course.

There's 30 credits of math and science, 45 credits of engineering. That's it.

And there are clear definitions in the

criteria that explain the difference

between those two words.

And that's it. They don't dictate the content, the pedagogy, the kinds of courses; that's totally up to a program, as long as the student outcomes are being assessed.

And they don't even have to be met.

Right. The criteria don't say the students have to meet the outcomes.

They only say that you have to measure the extent to which the graduates have achieved those outcomes.

And everything, the processes, everything is up to the programs. ABET does not dictate any of that.

And I think it's all encapsulated really

nicely with something.

Was it Chet Van Tyne who started saying the phrase, improve your program for yourselves first and worry about ABET later?

If you're a good program, meeting the ABET criteria should be trivial.

You just have to document things and have

a process where you actually look at it.

And the details are already being done

because you care about your students.

That's been my experience as well,

tangentially working with some of these

things is looking at the criteria,

looking at the

outcomes and really saying,

this is what I should be doing all along

anyway, right? You just want to make sure

that I'm actually doing a good job.

It doesn't really seem so onerous if, as you said, Steve, you're trying to make your program better for yourself and for your students; the ABET process will happen naturally as a consequence of that.

Yeah, and there's an analogy for that.

You know, if you give a question on a test that requires maybe a sentence or two to answer, the students who know the answer will answer it in a sentence or two.

And those that don't will fill every

little spot on the paper with words

hoping that you will find the answer in

what they've written.

And it's kind of the same thing you'll

see with a program that doesn't really

know what's required.

They just do a lot of stuff

hoping it's the right thing.

And if you understand what is really

required, as Steve said, it's really not

that hard to meet the criteria.

You have to document some things, you

have to do some things.

But if you're just interested in making your program better, it's not that much work if you know what work is required.

Part of this, and maybe you know the history better than I do, but EC2000, when ABET started the outcomes-assessment-based accreditation, which was a big shift for them at that time: I believe that was born out of ISO 9000.

Is that correct?

It sounds like it would be, to me.

I wasn't around at that time, so I'm not sure.

But I think that's likely. You haven't said this, but industry is pretty involved with ABET as well.

I first got in when our program was going to be evaluated in 1999, I think, under EC2000.

That was the Engineering Criteria 2000 project that changed the way ABET did things.

We agreed to do it then.

And lucky me, I got sucked into being the

one who had to do the self study.

And so that was my initiation.

And, you know, it was explained to us

that in factories, people always make

measurements to make sure the product

that comes out at the end of

the line is a quality product.

And so they had all

sorts of metrics involved.

And this was pretty much done

universally in all industry.

So they were saying, why don't

you do the same for education?

And while I think that's a great idea, and I think it's come a long way, I really do have to say that measuring a part in a factory is a lot easier than measuring student outcomes.

And I think that there has

to be some give by industry.

And I saw this on teams where the

industrial evaluators didn't quite get

that and they were always much pickier

than the academic evaluators.

Yeah, sometimes, although they also get the continuous improvement part more easily than some of the academics.

I mean, they get the idea that, yes, it's about the improvement.

Whereas, at least in the beginning stages, we as professors didn't quite get it; I think we've gotten better at it.

It took us a little while to understand that we already are assessing.

We give tests, we give homework, we have students do things that assess them; we just don't think of it that way.

So I think it would be useful if you could walk us through the whole ABET process, because for a lot of programs out there, you know, it's one of these things.

I've been doing it at my school for a long time and helping the dean, much like you've done.

And what's remarkable to me is that every

six years, the chairs are all completely

different and none of

them have experience.

And so it's reinventing the wheel every

single time you do it, which is why

process is so important.

So, in case there are programs out there that are listening to this and trying to figure out, oh my God, we have to do ABET, I have no idea how to do it: can you go through how the whole process begins every six years, so people understand it?

Sure.

So ABET comes at the invitation of the institution.

No program has to be accredited by ABET.

And so it's an invitation.

And that invitation is done through

what's called a request for evaluation.

That's done in the January preceding your onsite evaluation.

So if you were to be evaluated, say, it's too late for fall 2024, so you could be evaluated in fall 2025.

In that case, in January of 2025, you would submit a request for evaluation, which just lists the programs you want to have evaluated.

You provide one example transcript

because they need to see how the degree

is described on the transcript to make

sure the name is correct

and consistent and so forth.

So then you just wait.

And the next step is the team chair is

assigned to the evaluations.

That occurs typically around this time.

So they're in the process right now of

probably assigning team chairs.

The team chairs are commissioners, either

active commissioners or what

we call the team chair pool,

which are commissioners who have been off

the commission for a few years and are

still familiar enough that

they can still lead a team.

And so they're assigned typically in late

April, May, that time frame.

Once the team chair is assigned, then the

evaluation date is set.

And actually, for the last several years,

when you submit your request for

evaluation, you can at

that time request a date.

You can say this is the date when we want

to have our evaluation.

And in most cases,

that that is accomplished.

Some cases it may not work

out, but usually it does.

So once the team chair is established, they confirm the date, the date is set, and then the rest of the team is assigned.

The program evaluators are assigned by

the professional societies.

So in the case of materials programs, the

program evaluators are assigned by TMS.

The only exception would be ceramics programs, which are assigned by the American Ceramic Society.

And all the different disciplines will

assign teams to all

the different programs.

So that's what's happening in terms of preparing for the visit.

Now, in the meantime, the program needs to prepare a self-study report.

That's due July 1st of the summer before the visit.

Now, you should have already been working on that for a while. When we prepared for ours (we were visited about a year before last), we started essentially preparing about a year before the self-study was due.

Basically, after that previous academic year is completed, your processes and most of the self-study report aren't going to change very much at that point.

So you can really write the vast majority of the report.

For the last two visits, I coordinated that for our college, and I asked for drafts by January, even though it's not due until July.

And my argument was, well, if you submit it in January and you have a draft and we find an issue, you've got the spring to fix it and turn it from a problem into an example of continuous improvement.

Well, nobody disagreed with that. Some of the programs complied, some of them did not, but that's their choice.

So the preparation takes some time. That document basically describes how you comply with all the different criteria.

There are some bookkeeping-type things, like collecting resumes and collecting syllabi, which can be a bit time consuming.

So starting on that early is a good way to not have it all concentrated in June.

So now the team is set and the self-study report is submitted in July.

And then that team starts to work with

the institution to prepare for the visit.

The visits typically occur in the fall, starting at the earliest around the week after Labor Day.

The highest concentration is in October, and they go into November, with maybe a few in December.

And even before the visit, the team is encouraged to interact with the institution.

So the program evaluator will read the self-study and might have some questions.

If the PEV doesn't have questions, it probably means they're not really doing their job.

There's always something that's not clear, some additional information needed.

So if you get questions, don't worry about that.

In fact, I would think it's a good thing: you're finding out what issues might be there, and you have a chance to address them even before the visit in many cases.

Or if not, at least be prepared to

provide the evidence on site that you

might need to demonstrate compliance.

So that all occurs up until the visit.

The typical visit, the kind of standard, starts on a Sunday.

Sunday morning the team meets; in the afternoon they come on campus, look at facilities, and review materials, meaning the course materials and assessment materials you've collected.

That's changing now; a lot of that might be available in advance.

Reviewing those materials is the traditional purpose of that Sunday, so if some of them have already been evaluated, there's some discussion of how that day might look a little bit different.

And then most of the meetings are on Monday.

So Monday is just a day full of meetings with the program.

So the team consists of a team chair and then program evaluators, with one program evaluator for each program.

An exception might be if a program has a name that requires two sets of program criteria.

And one thing I forgot to mention when I was talking about criteria: in addition to those eight general criteria, there are what are called program criteria, and those are specific to the different disciplines.

So materials engineering and material

science and engineering has one set of

program criteria which are different from

aerospace engineering or

mechanical engineering.

And those program criteria describe either curricular topics or faculty qualifications that are needed for that discipline but aren't necessarily needed for a general engineering program.

So each program has one evaluator, or maybe two if it has a name like that.

Sometimes there are other members of the team, like observers who are part of the training and are just tagging along and observing, or observers from state boards; various types of observers can be part of the team but aren't part of the evaluation decision.

So your program evaluator will want to meet with the chair, with faculty, and with students.

And so you'll work with the program evaluator to schedule that day of meetings.

There's typically a luncheon where you're allowed to host the team, and you typically use that to have some alumni or maybe some star students meet with the PEV.

You always want to get your best students to have lunch with the PEV, because that gives a very good impression that you have these very bright students and successful alumni.

And then the visit goes until Tuesday. On Tuesday morning there typically aren't any meetings; it's meant for preparation of the report and for any pickup meetings that might be needed.

So the team prepares a report, and on Tuesday afternoon they'll read that report.

First they'll debrief the program chair just to talk through it, and then they have a formal reading with the president and provost, or really whoever the institution wants to have there: anything from just the president, provost, and the team, to inviting all the faculty, and everything in between.

And so at the end of the visit, the team will read this statement at the exit meeting.

You're not given a written draft statement. What you are given is a written program audit form, and that program audit form will have a description of any shortcomings that were identified.

So you have a written version of those

shortcomings to see and work on if there

are issues to be addressed and you can

immediately start working on those. You

don't have to wait for

the rest of the process.

So, for the rest of the process: first, you have seven days to say if there are any mistakes in what was read.

But that's really just errors of fact.

It's not "we disagree with this" or "we've done this additional work." It's just "that was wrong; you said we have 15 faculty and we really have 16," or something like that.

And then it goes through an editing process. And this editing is not just language; it's really about content and interpretation and judgment.

So, the first level it goes to: there are two levels of editors, editor 1 and editor 2.

The editor 1s are members of the executive committee of the commission. I think the EAC now has about 20 of those, 20 or 21, something like that.

The EAC will probably be evaluating 700 or 800 different programs, so it's a lot, and those are divided among those 20 people.

So they'll read it and come back. They may have some questions: why was this done this way? What did you really see here? You know, this doesn't look like what we typically do.

So what they're really looking for is

trying to make everything consistent

among all the different teams that are

going out because those couple hundred

teams are going to all different

institutions with different people.

Hundreds of different people all trying to make consistent decisions.

And that's a challenge.

We have training to try to do that.

But training is not

going to cover it all.

So you have this process with checks by people who have seen more, have more experience, and can help make things similar across different institutions.

Then the second level of editing is the officers.

And now in the EAC it's the officers plus one or two more.

They're looking at even more programs, they have more experience, and they may go back and forth with some discussion of what this should be.

And then finally it goes to what are called the adjuncts, and the adjuncts are part-time ABET staff.

The EAC has four of them.

All four of those have been chairs of the commission, and now they work part time for ABET. They don't make the decisions, but they provide guidance, and because they have even more experience, they'll ask questions and go back and forth.

So it goes through all of this process

before you finally have

what's called the draft statement.

This takes a couple of months typically.

That draft statement may look a little different than what was read.

Some of the shortcomings might have changed in level.

Some might have gone away.

So that's based on judgment and on what's typically done at different institutions.

Then that draft statement

comes back to the institution.

And at that point the

program has 30 days to respond.

That's what's called the 30-day response.

At that point you can say, well, we have changed this in this way, or provide additional information, or talk about what you're planning to do to address a shortcoming, because sometimes it's not ready yet.

But you still have to respond within 30 days.

Now, if there is information that is needed afterwards, you can request to submit what's called post-30-day information.

And that can typically be provided until

about May 15th or May 30th.

And it's at the discretion of the team

chair but any team chair is going to

accept anything that's

reasonable in a reasonable time.

So there'll be an agreement that we're

going to provide this information by May

15th or whatever the agreed upon day is.

That would be things like say you're

changing your assessment processes and

you want some assessment

data from the spring semester.

So obviously you may not have that when

you have to provide your response.

That's a perfectly legitimate request. Or capstone projects: say we were missing some aspect of the capstone experience.

So we say, this is what we've done, this is how we changed the course, and in May we will provide the actual reports so you can see that what we did was actually accomplished.

Then, once that's in, it goes through that whole same editing process again.

So the response is evaluated just like the initial report.

And that's the report

that goes to the commission.

The commission meets in July, typically about the second or third week of July.

So they have all these reports, and they're going to vote on whatever the recommendation is for accreditation.

In addition to all these other checks, there's what's called a consistency committee.

So that's a committee that's assigned.

They look at all of the reports and try

to look for anything that

might not be consistent.

Like, these two situations look kind of similar, but this one's a weakness and this one's a concern; and they'll ask the commission to take a look at it.

The commission evaluates these in panels, divided into smaller groups, so that each group has probably about 10 or 12 institutions to look at, which might be 40 or 50 programs.

So they're looking at a smaller subset of these hundreds of programs, and they'll discuss and make sure that, yes, this looks like the right thing, or not.

Then that goes to the commission.

The other thing is that all of the professional societies have representatives there.

And those professional societies will look at the reports from their programs.

So the TMS and American Ceramic Society group, we get together and we look at all the materials programs, and if something doesn't look right to us, we can raise an issue or go talk to the team chair and address it.

And then the commission will finally vote on some accreditation action.

Of those actions, the one you want is what's called next general review (NGR).

And that means you don't have to do anything in terms of formal reporting until your next review, which is six years later, or really five years by that point, because this is already the summer following your visit.

Then there's also interim actions which

can either be an interim

report or interim visit.

Those come two years later.

Which it is depends on the different shortcomings; whether it's a report or a visit kind of depends on whether it's something that can be documented in a report or needs to be observed.

Or there's another type of interim action, a show cause, which is more serious in that it could lead to loss of accreditation after that.

That's when you have a deficiency, as opposed to weaknesses, which would lead to just an interim report or visit.

And so that decision is made and then communicated to the program or the institution directly from ABET, typically in August, or maybe into September.

After the meeting they have to make sure everything's correct, and it takes a little bit of time to do that.

So it's a long process.

It's going on two years.

In addition to that, if it's the first time there's a program from an institution, there's what's called a readiness review, which occurs before that request for evaluation.

The purpose of that is to submit a kind of partial self-study, and it's really just to make sure that the program understands what they're getting into.

And then there's a recommendation from that, which says yes, it looks like you're ready to go, or no, we don't think you are.

But it's not binding; the institution can go ahead even if it's recommended not to try.

It's just a recommendation that we think you don't quite understand what you're up against, so maybe you should wait, or the program's not quite ready.

So that's the process.

It's almost a two-year process to get through.
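For readers who want the cycle Jeff describes laid out concretely, here is a minimal illustrative sketch in Python. The milestones and dates are the approximate ones mentioned in this conversation, not official ABET deadlines, and the function name is just for illustration.

```python
# Illustrative only: approximate milestones for an ABET review cycle as
# described in this conversation. Dates are typical, not official; always
# confirm the actual deadlines with ABET and your team chair.

def review_timeline(visit_year: int) -> list[str]:
    """Rough sequence of milestones for a fall on-site visit in visit_year."""
    return [
        f"January {visit_year}: submit the Request for Evaluation (with a sample transcript)",
        f"April-May {visit_year}: team chair assigned; visit date confirmed",
        f"July 1, {visit_year}: self-study report due (drafting it by January helps)",
        f"September-December {visit_year}: on-site visit (Sunday through Tuesday)",
        "Within 7 days of the visit: report any errors of fact in what was read",
        "A couple of months later: draft statement arrives; 30-day response due",
        f"Until about May 15, {visit_year + 1}: optional post-30-day information",
        f"Mid-July {visit_year + 1}: commission votes on the accreditation action",
        f"August-September {visit_year + 1}: official notification to the institution",
    ]

if __name__ == "__main__":
    for milestone in review_timeline(2025):
        print(milestone)
```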

Thank you.

And obviously the goal of a program is to

have no shortcomings at that final exit

meeting because that

means you have an NGR.

There's nothing to complain about.

I think it's really important for programs to think deeply about whether or not they're going to be compliant the summer before the visit, and to be really honest with the evaluator and accept what the evaluator tells them way back in the summer before the visit, because good evaluators will let programs know early if they think there might be a problem, especially these days when the resource room is all in Dropbox or Box.com or something like that.

So, you know, except for meeting with all the people and seeing the facilities, ninety-five percent of the visit should be done before the evaluator even gets there.

And that's important, because if a program gets early warning that they haven't been doing continuous improvement, haven't been documenting things, all of that, they can start in the fall term and make sure that they've done everything they need to do to demonstrate that they're now compliant.

So when the draft statement comes to them

they're in very very good shape to make

sure those shortcomings are removed.

And I can't emphasize that enough for any

program going through.

So even if you're in trouble, you know, you should do what Jeff suggested and have your self-study written in January before you're going to be evaluated the following fall, because that gives you the winter term or spring term to get ready.

And even if that doesn't work, even if you get shortcomings, you have the whole fall term.

But if you just sit there and wait and take it, and your evaluation happens to be in November, boy, you're in a tough spot.

And so don't let

yourself get in that spot.

That's that's my personal advice.

And I would say that most shortcomings can be resolved, and many are.

So if you do get a shortcoming at the visit, obviously it'd be nice if you can avoid that, as Steve describes, but even if there is still something there, you have time, and most of the shortcomings in my experience do get resolved, or at least reduced to a level where they don't affect the accreditation action.

So don't get discouraged

if you do get something.

There's time, and the goal is that everyone is compliant.

It's not the goal to see how many shortcomings we can ring up.

We want to confirm that the program is compliant, and if not, can we help them get there.

You know, that's the perfect lead to the next question I wanted to ask, which is: what advice do you have for programs to help them be more successful?

What can they do to learn about these processes?

Where can they go?

Is there training they can do?

And how can they make sure they have a successful visit?

To me the best way to learn is to become

a program evaluator.

You learn so much.

I mean, it's just like if you're writing a proposal to NSF: what do you do to prepare for that? You serve on a panel.

So you serve on a panel and you say, oh, now I see what kills a proposal, and now I see why that proposal stood out, and this is what made it stand out.

It's the same here.

And as I mentioned before, once you understand what's required, it's really not that hard.

Now, it is a commitment, because I would estimate it probably takes maybe a week of your time.

If you do one visit per year, which is kind of typical for a PEV, maybe a week of your time over the travel and the preparation, and it varies a lot.

It gets shorter as you get more experience; I can evaluate a self-study much faster than I could when I started.

But it is a commitment.

There's no cost to you; it covers all your costs. Of course, your time is a cost, but there are no out-of-pocket costs.

That's the best way I think.

But there are other opportunities.

The TMS training that we have for program evaluators, we allow other people to come just to learn.

That's how I got involved.

I was going to be in charge of the next review for my program, and TMS had the PEV training and said anyone can come.

So I went, thinking I might have to learn about this, and they said, well, do you want to be a PEV? I figured I'd learn more if I did that.

So I did, and I kept with it.

And I find it very rewarding to do.

But it is a bit of a commitment, so it's not for everybody, but that would be the best way.

But there are other ways to learn.

There is the ABET Symposium, as well; ABET has other training programs to help train people in understanding these criteria.

Within TMS, we have a workshop, or a symposium, every year.

It's been at the fall meeting, but this next year it's not going to be in the fall; we moved it to the annual meeting.

So it'll be at the 2025 TMS annual meeting.

It's called the Judson symposium.

It's named after Beth Judson who I

mentioned earlier was the one

who started the reevaluation

of the student outcomes, and she unfortunately was killed in a plane crash about a year after she started that.

So we named the

symposium in honor of her.

And in fact, I think one of you mentioned earlier about just doing the right thing and letting accreditation come along the way.

The original name of the symposium was "Continuous Improvement of Academic Programs," and then in parentheses, "and Satisfying ABET Along the Way."

We wanted to put ABET in there to get

people's attention

because ABET does get attention.

But we didn't want it to

be just this is about ABET.

It is but it's really about how do you

improve your program.

Yes you do need to look and see well what

does ABET require and

just make sure you have

those things.

But I think you'll find that if you're just trying to improve your program, the things you have to add are probably pretty minimal, and some of them might even help you do what you were doing better, particularly things around documentation.

So in TMS we have that opportunity.

I think other professional societies have

similar types of programs where they have

committees that are working on this and

trying to help their programs as well.

That's great. And to reinforce that even more, the Judson Symposium, which used to be run by the accreditation committee, has now joined forces with the education committee, and it's a joint meeting.

Tim and I are both on the education committee now, and that's where this whole idea for the podcast actually came from.

So it's very appropriate that we're talking to you, since you've been very involved in the Judson Symposium for a long time.

So you already mentioned that ABET is

continuously improving.

Where is it going now?

What's next for ABET?

Yeah so there I mean there are a number

of areas that are kind of hot topics.

Probably one of the biggest ones right now is DEI.

And I was happy that, when I was chair, we actually approved some changes to the criteria to incorporate DEI.

And ABET had been trying to do this for a while.

They first worked on kind of the overall statements of the organization, and those are all fine, but really where the impact is going to be is if it's in the criteria, where programs have to do something.

That's where I think we're going to really have impact.

So they asked all the commissions to try

to put DEI related

topics into the criteria.

This was 2019-20, they had asked for that, and I was chair at the time, so I kept after our criteria committee to make sure they kept working on this.

And they did, and they came up with proposals for changes in four different criteria.

I'm happy that we did this even though this was COVID, so we were also dealing with how to transition all of the visits to virtual.

It was a hectic year, but we did pass those.

And so I was really pleased with that, but it's been a while, because after the commission approves something, it has to be approved by another body called the Engineering Area Delegation.

So ABET used to have a board of like 55

people and a board of 55

people cannot function.

No absolutely not.

And the reason it was so big is that ABET has representatives from all these professional societies, and you have to give them representation, with a little bit different representation for the ones that have more programs, and so it was a big body.

And so they reorganized. They now have a small board of, I think, 13 people, but they still have the area delegations, which are for the different commissions, and the board of delegates, which is all of those together.

And that's where the representation comes in.

So TMS has a representative on the

engineering area delegation

and the board of delegates.

And that's the body

that has to then approve.

Well, they didn't want to approve it; they kind of kicked it down the road, talked about it for a while, and tried again, and finally, a year later, two of the four did get approved.

Of the eight criteria, five are what's called harmonized, which means they're the same for all four commissions.

So anything that changes in those has to be agreed on by all four commissions, which can be a challenge.

The other three are commission-specific, so the EAC could pass those.

Two of them passed. The other two passed the commission but then got stopped by the Accreditation Council, because none of the other commissions were ready.

So the two that got through were in criterion five and criterion six.

So criterion five is curriculum and

criterion six is faculty.

And so it took a couple of years to finally get them approved.

And they're still not fully adopted, but ABET has approved a pilot study, and we're going into the second year of that pilot study.

And so the idea there is that we'll go through the pilot study and kind of learn: how is this being validated, how are people responding, how are they saying they're compliant, how are we going to judge that, how are we going to train people to evaluate that?

So there are some issues that have to be worked out, and that's kind of what we're trying to do these couple of years.

So that's in process.

The essence of those additions: well, I should say the language mentions that you have to have DEI in the professional context, and to me that's the important part.

It's not just, oh, you need to go to a DEI seminar and learn about diversity, equity, and inclusion.

How is that important for engineering?

And obviously it is. I mean, to me, the two ways it stands out are: one, you have to work with different people, and you have to understand how to do that effectively.

And secondly, probably with even bigger impact, you're designing things, whether it's materials or products or processes, for a lot of different people, and if you don't understand and don't have perspectives from different people, your market share is going to be people just like you, right?

If you're going to be effective, even setting aside the social justice aspects of it, just talking about the business case for diversity, you're going to have a product that is more useful for more people.

And so that's, to me, the essence of what I think is appropriate to put in engineering criteria, because that is important to being a successful engineer.

And of course, for the faculty, it's about understanding those concepts, I would say, so you can support that education.

So that's going through this pilot study, and I'm hopeful that it will; you know, it may be tweaked a little bit because of what we saw in the first year of the pilot study.

In the first year it was optional, and any program could decide to do it or not. It had absolutely no bearing on the evaluation; it wasn't even evaluated by the team, it was evaluated by a separate group of people.

So there was no risk, and some programs did it. I think it ended up being maybe 40 institutions that said they would, and maybe 20-some that actually did, something on that order.

What they found was that the programs often just talked about what they did in DEI but didn't really address the criteria and how they were compliant with those words.

So in the second year they changed the prompt, with a kind of addendum to the self-study if you're going to participate, to try to make it more clear that you really have to demonstrate how you are doing what's described there.

And so I'm going to be working with a task group to look at that, and also to start thinking about what we are looking for in terms of the criteria and how we are going to train our teams to be able to look at that and interpret it.

So to me, that's the biggest thing on the horizon.

There's also DEI coming in other areas too. There's a proposal coming through in institutional support with some kind of general DEI-related language.

The kind of language that's in there now is the kind of thing that all institutions are doing just because of federal law, so I'm not sure that's going to be as impactful.

But I think in criterion five, where you're talking about that design experience, how you're preparing your students to consider equity and so forth when they're designing, that's going to make better engineers.

And if every program has to explain every six years how they're doing that, I think that'll really move the needle more than a lot of other things. We do need to do the other things, but I think this will really have impact.

Totally agree with you.

Are you at all concerned about the new politics of DEI and the anti-DEI legislation? And now, I believe, Congress is proposing to write a law specifically about accreditation bodies. So how is ABET responding to this?

ABET's very cognizant of that, and they're looking at it and looking at the laws.

They don't want to not do this. I know enough of the people in the leadership, and I think they are committed to doing this, and they're going to find a way to do it.

But they don't want to put a program in the position where, if they want to be accredited, they're going to lose state funding. They don't want to do that, so they'll look for a way to handle it.

In my experience, we had legislation passed here a couple of weeks ago, and when you first read it, it looks like we're in trouble. But fortunately, ours, as it relates to accreditation, does have a clause that says if you have to do something because of accreditation, you're excluded, so we're okay at this point.

But I think a lot of it is the words, and those words are triggering. In our area we're stopping, we're not using DEI as much; of course, I'm in Alabama, where it's a little bit different than Michigan.

So we try to describe the things without using the triggering words, because people don't really object to the things you might do; they object to those words, which carry some other connotation of what you're really talking about.

So some of it is going to be trying to describe things in a different way that accomplishes what we want to accomplish. For example, you talk about equitable design; well, you can talk about universal design and be doing kind of the same things.

How do you make this design usable by more people? Well, you're doing that for equity, but if you call it equity, that may trigger some people, so you might call it universal design.

You're also doing it for everybody: for people who are introverts, people who are extroverts, people from different socio-economic groups. It's everybody, it's not a particular group, and that's what keeps you out of trouble with the politics.

Absolutely. Yeah, something else I'm really happy to hear about with that approach is that it's about programs taking action, not just saying, I have this principle that I believe in.

I mean, it's nice to believe in principles, but if all you're doing is believing in a principle and not doing anything about it, it's kind of like, so what?

So the fact that you're asking, what are you doing to make your students more successful in being a part of this world that we're in together, I'm really happy to hear that.

Yeah, in fact, the legislation that was passed in Alabama talks about what they're really objecting to, which is forcing someone to believe something, which we don't do.

We teach: this is how you design things, and this is why it's good. But we don't say you have to believe in this principle, which is what they're kind of objecting to.

And some of the exclusions are things like, well, if you're supporting students, even if you're targeting a particular group, that's okay if it's just about supporting students, as long as you're not limiting participation based on being in a group, which we don't do.

So, yeah, I agree. And when you think about, well, what do you need to do to accomplish what you might describe as your DEI goals, what do you have to do to get there, you describe that, and that's not something you're going to get objections to.

So I think that's probably the way ABET is going to have to do it. They may have to change the language to not make it offensive, but still get programs to think about those things.

And it's already in there, I mean, with the teams, right? Working on diverse teams has been there for years, and that's important. And then the design part, maybe it is in there too. And communication to a range of audiences, that's there too.

Absolutely. Well, Jeff, this has been fantastic. Thank you so much. I want to mention that we're hosting the North American Materials...

...all the programs.

...outcomes assessment process, but some of

those programs just

thought it was done and they

forgot to also document the continuous

improvement of their

program, which is really the whole

point.

And it's okay; measuring, assessing the outcomes, is a really important thing to do, because it gives you great insight into whether or not you have a problem anywhere and need to address it.

And one thing I've learned is by using

all the data and having

a histogram, I can't help

but look at the little

tick marks in the tails.

Even though, yeah, we have, you know, 80%

of our students are beyond our threshold,

you see that tail and now I'm wondering,

well, what about them?

And because we actually have all the

data, the actual unique

names, it's all secret.

I can't see who they are, but we have it.

And now we have a new AI tool that knows

how to look into SQL databases.

Again, our AI tool is private.

It's completely FERPA compatible because

we don't give any

information to OpenAI or Microsoft

or Google.

We've licensed their tools, we pay for

it, so that we can work

on our curated dataset.

Why not work on that?

We have 6,000 students in our system now.

Maybe we can do small longitudinal

studies with, you know,

here are these people in the

tails.

How did they do in the other criteria?

How did they do the next

year they were assessed?

And so I think this ABET process,

although this is not

explicitly part of ABET, it's

allowing us to view our data in ways we

probably never

dreamed we'd be able to do.

And it might be one of the advantages of artificial intelligence, certainly these large language models, for figuring out what's going on.
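As a purely hypothetical sketch of the look-at-the-tails analysis Steve describes, here is what a small query against a local SQLite database might look like. The database file, the `outcome_scores` table, and its columns (anonymized student ID, outcome, term, score) are invented names for illustration only, not the actual system; as discussed above, the IDs are assumed to be de-identified for FERPA purposes.

```python
# Hypothetical sketch: list anonymized students in the low tail of an outcome
# distribution so a program can follow up in later assessment cycles.
# The database, table, and column names here are invented for illustration.
import sqlite3

THRESHOLD = 70.0  # example performance threshold for an outcome


def low_tail(db_path: str, outcome: str, term: str) -> list[tuple[str, float]]:
    """Return (anonymized_id, score) pairs below THRESHOLD for one outcome and term."""
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(
            "SELECT student_id, score FROM outcome_scores "
            "WHERE outcome = ? AND term = ? AND score < ? "
            "ORDER BY score",
            (outcome, term, THRESHOLD),
        ).fetchall()
    finally:
        conn.close()


if __name__ == "__main__":
    for student_id, score in low_tail("assessment.db", "SO-1", "2024-fall"):
        print(f"{student_id}: {score:.1f}")
```

The same kind of query, run term after term on the de-identified IDs, is what would make the small longitudinal studies mentioned above possible.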

So we're kind of excited about this, but

you still have to improve your program.

And we do that with lots of other ways.

Tim is creating a new math

course for all of our sophomores.

We had instituted a fifth math course in

our curriculum,

thinking it would help them

learn the math they

needed for the courses.

And the students all waited until their

last year, senior year

to take it, which defeated

the purpose.

So we reevaluated.

Now we're going to offer it as a

sophomore level course

and cover, you know, complex

variables or orthogonal series,

eigenvalue problems, all this

stuff that they don't really

get in their calculus sequence.

So these are all the things that good

programs will always do

to improve their program.

And it never ends.

We're constantly having to deal with it.

So anyway, any final

thoughts before we take off?

The only thing I guess I would say is,

you know, I know

people sometimes have issues

with specific things that ABET does, but

I think ideally the

idea that someone's coming

to look at your program and give you

feedback should be a positive thing.

And when we find that our program evaluators are not doing that, we either stop sending them or try to encourage them to follow that philosophy.

But that really is what it should be, what it's meant to be.

And I'm still hopeful that we're moving closer to that.

I hope we look at it that way and try to work towards it; that's what I would like to see.

All right.

Well, thank you very much.

And I think we'll sign off.

So see you later.

Excellent.

Thanks so much.

See you next time.

Okay.

Thank you.
