I looked at the vibrating beam experiment. The first thing I feel is important to say is that it is not easy to judge higher-education courseware in an area one knows nothing about, which was my situation in evaluating material from mechanical engineering. I cannot comment on accuracy, the appropriateness of the conceptual layout, and so on. I am nevertheless able to give some feedback, though it would be greatly enhanced by a panel of experts that includes a content specialist. My feedback is organized around the categories created by the class.
Of the class categories, content and design were the key ones I used to judge the software. On the whole, I think this piece of software has been well thought through. It is easily navigable and laid out in an appropriate, user-friendly way. I particularly liked the way the equations were placed in white boxes that stood out clearly against the gray background.
However, there were things that could be improved. First, under the "usability" category, a few links did not work, and at times there was some overkill: information was repeated too often in different forms (and sometimes in the same format) on the same page. Under the category we entitled "considering the audience," what was most lacking was the use of feedback mechanisms. There was no FAQ section, and the only feedback channel was an email address to which queries could be addressed. In my opinion, this is not sufficient for courseware.
More thought could also have been put into the interactivity of this courseware. Nevertheless, on the whole this was good courseware. The content was comprehensible to me and certainly not too complex or hard to follow. As such, it should be even clearer to mechanical engineers, who are the intended audience.