Jef's Page of Multi-Media Evaluation

a little piece of coursework for S97 Infosys 296a-3


This document is a summary of my thoughts as I applied the Multimedia Categories generated by our class to a particular piece of instructional courseware:
Vibrating Beam Experiment Instructional Courseware by Pamela A. Eibeck and Brandon Muramatsu.


GRADE - (this score is used to judge the actual courseware)

0 - extremely poor

5 - average

10 - extremely good


META-COMMENT - (in order to reduce repetition of commentary, I've added these brief meta-comment markers)

G - good
category makes perfect sense AND category seems applicable in this context
U - inapplicable
category makes perfect sense BUT category seems inapplicable in this context
KS - knowledge specific
category makes perfect sense AND category seems applicable in this context BUT any judgement in this category requires knowledge I lack, such as familiarity with the content area or familiarity with the demographics of the intended or actual audience (esp. what sort of computers/net access they are expected to have)
? - I don't understand
category does not make sense to me


Content                                 META-COMMENT 	GRADE



     Searchability                      G               7

     Scope                              G               ?

     Useful links                               

        internal                        G               9               

        external                        G               2       

     Accuracy                           KS              ?

     Uniqueness, originality            KS              ?

     Coherence                          G               8

     Carefully selected                 G               8               

     Quality of Content                 KS              ?

     Browse usability                   G               9

          Reference usability           G               9

     Realistic (non-fiction? fiction?)  U               



These categories seem to work, on the whole, although I'm not qualified to deal with content accuracy issues in relation to engineering information. Assuming that they got their facts straight (and I have no reason to believe they didn't), my overall impression of the content was very favorable.


Design                                  META-COMMENT 	GRADE



     Creativity                         G               4

     Response Time                      KS                               

     Realistic system requirements                      

          bandwidth                     KS               

          graphic capabilities          KS               

          memory                        KS              

     Help functions and                                         

         guides (printed and/or online) G               9               

     Adherence to 

        standards familiar to user      G               7

     Appropriateness to objectives      G               9

     Accessibility to all users 

        (consideration of impairments)  G               6

     Appropriateness 

        of modality to content          G               9

     Both underlying 

       structure and graphic layout     G               6

These general design issues are good, although I wonder whether system requirements might work better under "Considering the Audience". Response time could also migrate there. If students will be using brand-new desktops in on-campus computer labs, each with its own direct network connection (and if the images have been cached from the last user looking at the same pages), then the system requirements are very different from those of someone sitting at home with a 486 with 2 megs of memory and a 14.4 modem (like me). So, for these to be good criteria, we need to give the judges more information about the users (or a default type of user they can assume).


Navigation                         	META-COMMENT 	GRADE



          Skipability                   G               10              

          bookmarks                     G               0/10               

          linking                       G               9               

          mapping                       G               9

          customizability               G               0/10

Something that occurred to me with this section is that some multimedia packages, like this one, are built to run on top of an existing platform. Using Navigator or Explorer, we have bookmarks accessible to us, so their absence isn't really a problem. Using these bookmarks, we can customize our route through the content so that when we come back next time we don't have to follow the same path. While the designers of this particular program can't really take responsibility for the benefits that the web browser offers, we can at least figure out some way to acknowledge them for taking advantage of systems that were already in place.


     Organization                       META-COMMENT 	GRADE



          Interface                     G               8               

          Sequencing                    G               9

          Structuring                   G               9

          Chunking                      G               8

I suspect that these organizational issues are really inextricably bound up with content and pedagogy. Certain sequences, structures, chunking styles, and interfaces will work best with certain kinds of content and certain kinds of students. I'm wholly unaware of the specifics of such issues, so I can only speak from my experience with the courseware. I felt it was well organized, but in an official situation, I would hope that the judges might have some background in judging the educational/content-specific appropriateness of organization styles.


     Graphic Design                     META-COMMENT 	GRADE



          Representation options                                

              (pictorial, words, audio) G               8

          Clarity                       G               8

          Iconicness of icons           G               7

          Artistry                      G               5



These seem the fuzziest of the categories, but necessarily so. There is such a wide diversity of graphic design styles that vague terms like these should be used, so that we keep a sufficiently dynamic approach to innovation and "artistry".

     Engagement                         META-COMMENT 	GRADE





          interactivity                 G               3 (no online experiment)

          Aesthetics                    G               6

          appearance                    ?               

          keeps one coming back         U                               

          Interface                     G               7

          immersion                     G               4

I'm somewhat less satisfied with this group of criteria, because it seems too specific. I'm sure this is partially due to the fact that these terms haven't been defined, but it seems to me very medieval: very arbitrary, and not really comprehensive. Appearance should clearly belong to Graphic Design, and Keeps One Coming Back might fit better under longevity. With reference to the specific piece of courseware we looked at, I think our overall engagement with the material would depend greatly on our interest in the class it's associated with, and would have been greatly improved by the inclusion of an online interactive experiment.


Longevity/Usability                     META-COMMENT 	GRADE



     adapt to technological changes     G               ?               

     updating                           G               5 (have to change image maps)

     ease of installation               U               

     incorporating new info             KS               

     adapting to new instructors                

       and instructional environments 

       (customizability)                KS               

     Manipulability/Reusability                                 

          ability to decontextualize & 

          use MM elements separately	KS               

          Legal Issues 

     Mechanism to give the 

        author feedback                 G               8



Again, I think some of these categories are too specific, and some won't work in a wide variety of circumstances. Web-based multimedia is never installed, but should the authors of the project get points for avoiding the problem altogether? What about projects (like this one) that have been released in two forms - do we grade based on just one, or do we take the best of both, or the worst of both?


Objectives of MM Package                META-COMMENT 	GRADE



     Value Added                        KS               

     Context w/other content            KS               

     Integrity                          ?               

     Scope                              KS               

     Ability/age level                  KS               

     funness                            U               

     Packaging                          U               

          Is it clear and truthful 

             what user is getting?      U 

With the exception of "integrity", which I don't understand at all, these categories make a lot of sense to me. I think in order to judge them, one has to have some background in the way the same content was communicated before the advent of multimedia software, along with a lot of related issues. I don't know a lot about these things in relation to engineering education, but if I were judging a multimedia package related to something I do know, I would certainly want to use these categories to talk about it.


Considering the Audience                META-COMMENT 	GRADE



     Appropriateness to audience        KS               

          nationality                                  

          culture                                      

          age                                          

          gender...                                    

     Use of Appropriate Learning Styles KS               

     Instructional Design               KS               

     Feedback                           G              7

          FAQ                                         

          Timing                                       

          Quantity ...

With this last group, I felt as though there should only have been these four mid-level categories; rather than including all these sub-categories (nationality, culture, age, gender, FAQ, timing, etc.), we should list examples of ways that a multimedia package might be "appropriate to the audience" or provide a good feedback mechanism. I would find it obnoxious to have all the feedback mechanisms originally listed here, because they would get in the way of using the actual software. In fact, I think feedback might be reclassified under engagement, objectives, longevity, or design. And the rest of them can go under Objectives of MM Package.

And again, before I make any judgements about the appropriateness to the audience, I would want to find out more about exactly who is expected to use it, and who, in fact, is using it.