I met recently, along with Andy, Alice, and Kiem, with two reps from the Blackboard company. They were here to tell us about a Blackboard add-on product called, I think, "the assessment module." Herewith, some observations.
The product incorporates some of the functionality that most of us have seen recently in the CARP software. It allows folks at different levels of the instructional process -- from instructors up to deans and assessment staff -- to input, collate, tally, analyze, query, and report on all manner of information related to assessment. It has the advantage of using the same overall interface and design logic that we are familiar with from our use of Blackboard for classes; it is also "flexible" and can be integrated with BANNER.
The mere fact of investing in the software would probably send a positive signal to WASC that we are an institution that is taking assessment seriously. It would also greatly simplify the work of the office of institutional research by organizing assessment data in one place and one format. Nobody at the meeting was prepared to give actual numbers but it seems logical that it could save lots and lots of hours of work in that office (and probably in other offices that have to prepare materials for WASC).
Much of the labor saving derives from the fact that the system assumes that instructors will use it to collect and assess at least some of the work students do in their courses. At a minimum, the system allows students to submit papers, essays, etc. in electronic form, and the assessment group can then process these using rubrics we've developed so as to arrive at some measure of student achievement in our programs. In most cases we'd rise above that minimum: instructors would simply use the system itself to do the grading and feedback on papers and exams, and so this information would be "automatically" recorded and tallied up for use in assessment. This would make faculty life easier because we would not have to submit separate assessment information. Ideally, most, if not all, of the work we assign for evaluation in courses would be associated with a rubric in the system. Students would submit work electronically; we would have the student's work and the rubric open in side-by-side windows, rate the work on each measure, and add comments; the student would receive the feedback in electronic form; and the aggregate results for the class would be automatically recorded and forwarded "up the chain" to department heads, the office of assessment, etc. as appropriate.
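To picture what this amounts to computationally, here is a minimal sketch of the rate-against-a-rubric-then-aggregate step, in Python. Every name and number is invented for illustration -- this is my guess at the shape of the workflow, not Blackboard's actual implementation:

```python
# Minimal sketch of rubric-based grading and class-level aggregation.
# All names and scores are invented; this is not Blackboard's code.
from dataclasses import dataclass
from statistics import mean

@dataclass
class Evaluation:
    """One instructor's rating of one student's submission."""
    scores: dict   # rubric item -> score (e.g. on a 1-4 scale)
    comments: str = ""

def class_summary(evaluations):
    """The per-item averages that would be recorded automatically and
    forwarded 'up the chain' to department heads, assessment, etc."""
    items = evaluations[0].scores.keys()
    return {item: mean(e.scores[item] for e in evaluations) for item in items}

evals = [
    Evaluation({"thesis clarity": 3, "use of evidence": 4}, "Strong sources."),
    Evaluation({"thesis clarity": 2, "use of evidence": 3}, "Sharpen the claim."),
]
print(class_summary(evals))
# {'thesis clarity': 2.5, 'use of evidence': 3.5}
```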
MY TAKE-AWAY
After several hours listening to the Blackboard reps (sales and technical folks) here are a few observations:
- These folks do not understand:
  - how education happens in a liberal arts college
  - what is valuable for students
  - how departments work
  - how decisions get made
  - what value professors actually add to the mix
Instead, the software is designed to resonate with an auditor's fantasy of higher education, as it might be manifest in the mind of the CFO of a large, for-profit, online university.
- Feature after feature of the software is perfect for online correspondence courses as offered by, say, University of Phoenix.
- While the company representatives repeatedly touted the system's "flexibility," in fact it imposes dozens upon dozens of assumptions about teaching and learning on the process without any self-consciousness. The whole thing derives from a particular view of academic assessment (itself a refugee from peer review), and its purveyors appeared to have zero sense that its epistemological status was any different from, say, that of the law of gravity.
- Totally absent from their pitch was any sense at all that there was an educational problem that this product could help you solve.
- What it does address is the fact that institutions like Mills have been told "you must do something"; this is clearly a something, and spending a lot of money on it would be a great demonstration of institutional commitment.
- The company appears to have done zero assessment of the temporal impact of the processes the software would require. "Eventually, instructors would get really good at entering this stuff and so the time involved would drop over time..." "The information could be viewed and sliced in many different ways..." (by whom?) Is there a net gain in productivity? No idea. Is there a net positive for student learning? No idea. Will more parents want to pay our tuition because we use this system? No idea. What should instructors stop doing to make time to use this system? No idea.
- What they are selling is "a license and consulting." In order to figure out how to use the software and adapt it (remember, it's very flexible), you have to hire them as consultants. Remember, too, that these consultants, as far as I can tell, have very little fundamental appreciation for how a liberal arts college works. Either they will mislead us because they don't understand us, or we will pay for them to learn something about how a college like Mills works.
- The fact that, as potential customers, we were hard-pressed to come up with things we want to do that this product would make easier (usually at these demos users' imaginations get going and they start saying "hey, could I use it to do X?"), and that instead we sat there realizing that the software would make us do things, is telling.
SOFTWARE DESIGNED TO CONNECT THINGS UP
Two aspects of the software are key (from a software design point of view) -- "the hierarchy" and "links."
A core concept in the software design is "the hierarchy," by which they seemed to mean the managerial hierarchy that oversees the delivery of education. At the bottom of this structure are instructors and their students. Instructors implement courses, which sit at the next level above them -- overseen "by a department chair or dean" who might then be under another dean. Above this we have "the assessment operation" -- as the discussion went on, it seemed that this means some combination of Institutional Research and the Assessment Committee. Then above this you might have other levels of college administration, and above this, outside mandating bodies such as WASC. The genius of the software is that each institution can build in the hierarchy that is appropriate to itself. The data in the system -- the descriptions of goals and standards and such -- are carefully protected so that only the appropriate people at the appropriate level of the hierarchy can see them, change them, etc.
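To make that design concrete, here is a toy sketch of the hierarchy in Python. Every name in it is mine, invented for illustration -- this is not Blackboard's actual data model or API -- but it captures the scoping rule the reps described: each level sees its own data and everything below it, and nothing above.

```python
# Toy model of "the hierarchy" (invented names; not Blackboard's API).
# Rule being illustrated: a level can see itself and everything below it.

class Level:
    def __init__(self, name, parent=None):
        self.name, self.parent, self.children = name, parent, []
        if parent:
            parent.children.append(self)

    def visible(self):
        """Everything this level of the hierarchy is allowed to see."""
        nodes = [self]
        for child in self.children:
            nodes += child.visible()
        return nodes

wasc = Level("WASC")
assessment = Level("assessment operation", parent=wasc)
dean = Level("dean", parent=assessment)
course = Level("course (instructor + students)", parent=dean)

print([n.name for n in dean.visible()])
# ['dean', 'course (instructor + students)']
print([n.name for n in course.visible()])
# ['course (instructor + students)'] -- instructors see only their own level
```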
The other part of the system is the links. It allows you to build a rubric for, say, reading lab assignments, and for each item in the rubric to be linked back to course learning objectives, which are in turn linked back to program goals, and these back to institutional goals or to requirements set forth by external agencies. This means that when you evaluate 25 lab reports, the system automatically gets information on how well the institution is doing in its effort to inculcate a culture of experimentation AND it also automatically gets information about the fact that the institution is monitoring whether or not such learning is occurring. And all this simply by clicking on a radio button in a web-based report evaluation rubric!
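Again, to make this concrete: a minimal sketch of the links, with invented names (nothing here is Blackboard's real schema). Each rubric item points back to a course objective, which points to a program goal, which points to an institutional goal, so a single rating deposits a data point at every level of the chain.

```python
# Sketch of the "links" (invented names; not Blackboard's real schema).
# Each key points one level up the chain of goals.
links = {
    "rubric: interprets data correctly": "objective: reason from evidence",
    "objective: reason from evidence": "program goal: scientific literacy",
    "program goal: scientific literacy": "institutional goal: culture of experimentation",
}

def chain(item):
    """Follow the links upward from a rubric item to the institutional goal."""
    while item is not None:
        yield item
        item = links.get(item)

tallies = {}

def record(rubric_item, score):
    """One radio-button click lands a data point at every linked level."""
    for level in chain(rubric_item):
        tallies.setdefault(level, []).append(score)

# Evaluating one lab report on one rubric item:
record("rubric: interprets data correctly", 3)
print(tallies)
# The score shows up under the rubric item, the course objective, the
# program goal, AND the institutional goal -- all from one click.
```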
All of these things are, of course, changeable. In theory. In practice, the system allows for the creation of extremely high levels of opaque complexity. To ensure system integrity, new procedures will need to be invented so that faculty who want to make a change can confer with departmental colleagues, get the department head to make a request to the office of assessment, and then maybe the gen ed committee or the EPC has to get involved, etc. Or, even more likely, once stuff is in the system it just stays there until it causes a major problem.
The designers of the system seem totally oriented toward (1) the capacity to output what an entity like WASC wants and (2) changing the way instructors teach via a logic of "it's easier to join than fight" and "why duplicate your efforts?"
DISHONESTY AND ANTI-INTELLECTUALISM
A system like this is championed for its flexibility, but that flexibility exists only relative to how rigid it could be. Neither its designers nor its purveyors struck me as having even a hint of a nuanced view of what education is, how it happens, and how real educational organizations work. That's too bad, because these are not mysterious topics -- a lot of people DO know a lot about them. What the talk of "flexibility" represents is marketing-speak. A common complaint about course management software and student systems software is that it is inflexible and "doesn't fit how we have kept records before," and so the folks who write it add more options to mix and match the pieces (the presenters seemed to want to impress us by the fact that on a particular screen we could have two tabs or four: "you can set it up so it's exactly right for your process!"). But that's not really flexibility, that's customization. System software is, pretty much by definition, not flexible. It's especially true that system software almost never adapts to an organization; organizations adapt to system software. We've seen this plenty over the years when we're told "Banner can't do that" or "we need this change because of Banner."
A second moment of dishonesty happens because the designers and sales force have clearly bought into the ideology of the professor/instructor as problem. They have talked for so long to assessment aficionados and heads of assessment who get blowback from faculty that they "know" that individual professors don't like this stuff and that part of the challenge of their job is just to soft-pedal around that. They are not selling this stuff to instructors. They are selling it to the instructors' managers or über-managers, folks who themselves have uncritically bought into the idea of there being a crisis of accountability in higher education. The intellectual dishonesty lies in the fact that these folks are neither willing nor able to actually have a critical conversation about any of this. They simply think of people who do not swallow it hook, line, and sinker as "unsaved."
AN IMPORTANT ASIDE
Sociologically, what's interesting is that this is an example of the "for-profit" side of education rubbing up against the not-for-profit side. Blackboard and its competitors, as well as the folks who are on the hustings about assessment, are entrepreneurs. They don't live for assessment; they live off assessment. And we know that that's an arrangement that makes intellectually honest discussions hard to come by.