Written assessments can evaluate students' visualization skills but lack the interactive learning opportunities a computerized version affords through the inclusion of animation. This significant refinement allows the immediate correction of erroneous mental images by actively viewing correctly rotating objects. This experiential learning exercise can be implemented in any design curriculum requiring a method of assessing and improving students' visualization skills.
Animation -- A series of still pictures of an object which, when shown in rapid succession, create the illusion of movement.
Assessment -- A synonym for test or exam.
Competency -- (1) The highest level of mandated achievement/knowledge expected of graduating students of a FIDER-accredited interior design program. (2) The demonstrated successful application by those students of aesthetic concepts and technical information to resolve interior design problems.
Concept/Design Concept -- The ability to think abstractly, free from distracting details.
Computerized Version -- The vehicle of administering the assessment by interacting only with a computer.
Designers -- Interior designers.
Mental Design Language -- A specialized form of visualization skill used by architects and interior designers to address design concepts.
Mind's Eye -- The ability to internalize images.
Paper Version -- The vehicle for administering the assessment with paper.
Orthographic Projection -- The projection of a single view of an object in which the view is projected along lines perpendicular to both the view and the drawing surface.
Pictorial -- A two-dimensional drawing giving the illusion of a three-dimensional space to the viewer.
School Numbers --
Freshmen classes end in 1, e.g., 30001
Senior classes end in 2, e.g., 30002
3rd Angle Projection -- The standard drawing convention used in the United States and Canada which shows the relative positions of orthographic views (fig. 1), (Muller, Fausett, 1993).
Visualization Skills -- The ability to create, manipulate, and then verbally and/or graphically communicate mental images.
Success in the interior design profession is strongly predicted by one's ability to mentally visualize and manipulate objects, that is, by visualization skills (Ghiselli, 1966). Yet the many theories and tests that attempt to explain how people visualize images fail to reveal exactly how participants "see" objects in their mind's eye (Shepard, Cooper, 1982). Designers must develop the ability to form clear and focused mental images or they will face extreme difficulty communicating the intent of their designs, verbally or graphically, in a manner appropriate to their clients' needs (Samuels, Samuels, 1975). The ability to concisely communicate a highly complex and creative design solution therefore has at its creative core the visualization skills (internal imaging) that allow designers to mentally create, manipulate and communicate solutions effectively.
FIDER (Foundation for Interior Design Education Research) reinforces the importance of this vital skill in its Standards and Guidelines for the Accreditation of First Professional Degree Level Programs in Interior Design (FIDER, 1993). The FIDER accreditation standards mandate that graduating interior design students demonstrate competency in the core communicative areas of design process, oral and graphic presentation, and communication skills. Deficiencies in any of these categories compromise the quality of a student's visualization skills. Programs that rely on the traditional drafting techniques of mechanical drafting, blueprint reading and engineering drafting to train students in visualization skills may be inadvertently contributing to these deficiencies (Myers, 1958). Myers found that students who received training in the traditional drafting techniques scored no higher on visualization tests than students who received no training. While designers must master these traditional techniques, educators need to explore alternative teaching methods in order to maximize students' visualization skills.
Another complication in teaching these vital visualization skills is that individuals uniquely process and encode the information with which they construct their mental images. Mike and Nancy Samuels believe that some people see images like "snapshots" frozen in time:
"The human mind is a slide projector with an infinite number of slides in its library, an instant retrieval system and endlessly cross-referenced subject catalog. The inner images we show ourselves form our lives, whether as memories, fantasies, dreams or visions. Inner images supply the creative force in art ..." (Samuels, Samuels, 1975)
This image-based method of describing visualization skills fails to address the subtle descriptive contributions that language bestows on any object (Block, 1982). In a world dominated by the spoken word, designers unable to precisely articulate their visual images are at a severe disadvantage.
Other people "see" the same image entirely with words or phrases (Miller, 1985). Miller further explained that this verbal method of visualization takes into account that people not only "see" the physical object that is not there but can also understand the implied and abstract meanings associated with that image. He supported his hypothesis by asking subjects to list items in a scene they had previously observed. When asked to verbally report on the relative positions of the objects, they made serious mistakes. Also, when items in the scene were omitted from the verbal report, participants did not remember or experience a visibly vacant "hole" in their visual image when questioned about the missing objects. Lee Brookes of McMaster University reported a similar result in 1968: subjects recalled certain specified features of a geometric figure much faster when using a verbal response than when pointing to the alternative on a wall chart (Brookes, 1968).
Words alone can, however, describe many nuances of a design where graphical images fail, but they lack the ability to communicate precisely the same image to each member of the audience. Ideally, designers should create a "mental design language" uniquely theirs, capable of describing the requirements of any design to any client (Tye, 1984). By combining image-based and language-based visualization theories into a flexible method of communication, designers can avoid expensive and time-consuming errors by adapting their communication method to best explain the pertinent requirements of their design to a particular client. For example, two-dimensional graphic representations communicate the most useful information to other designers and craftsmen who understand basic drawing and building conventions, while laymen typically relate better to three-dimensional representations accompanied by verbal descriptions. Since scientists may never agree on a single method by which people internalize images, and since designers experience problems when they cannot express their mental images to clients in a coherent or appropriate manner, it is imperative that designers continuously sharpen their visualization skills so they can clearly communicate with a wide variety of clients. Successful communication therefore depends on both strong pictorial and verbal visualization skills to address all the subtleties involved in a design concept.
The objective of this research is to develop a computerized interactive visualization assessment that employs the element of animation, providing students two previously missing factors unattainable with a static paper assessment. First, an incorrect response will immediately reveal an animated image of the three-dimensional object rotating into the designated direction of view, allowing the subject an opportunity to correct mental images. Second, and more important, these "visually neutral" animations will permit subjects to process and encode information using their unique mental design language.
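The feedback mechanism described above can be sketched in outline: a correct answer simply advances, while an incorrect answer first plays an animation of the object rotating into the designated direction of view. The following Python sketch is purely illustrative; all names and data are hypothetical, and the actual assessment was built in Authorware Professional.

```python
# Illustrative sketch of the animated-feedback loop: a wrong answer
# triggers an animation of the correctly rotating object so the
# student can immediately correct an erroneous mental image.
# Function names and question data are hypothetical.

def play_rotation_animation(question_id):
    """Stand-in for displaying the rotating-object animation."""
    return f"rotating object for question {question_id}"

def respond(question_id, chosen, correct):
    """Score one multiple-choice response and decide on feedback."""
    if chosen == correct:
        return {"correct": True, "animation": None}
    # Wrong answer: show the correctly rotating object.
    return {"correct": False,
            "animation": play_rotation_animation(question_id)}
```

For example, `respond(3, "B", "C")` would report an incorrect answer and trigger the animation for question 3.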
This research is expected to provide an assessment instrument with which interior design programs might determine or strengthen students' visualization skills. Evaluating incoming freshmen would be invaluable in establishing a baseline for individual students and defining the competency level of an entering class. Through periodic assessments, educators would be able to compare successive results to determine whether a student is making satisfactory progress in acquiring the visualization skills required of professional designers.
An interactive computerized visualization assessment test will help interior design students improve their visualization skills.
The paper assessment was used to develop and validate assessment problems and to establish a benchmark for the evaluation of the computerized version. The paper version was developed as follows:
For interior design students, a series of architectural objects were deemed more appropriate than abstract objects. These objects were developed as a series of orthographic and isometric drawings.
Different sizes of the objects, formats (multiple choice, sketching, etc.) and assessment layouts were evaluated. A multiple-choice format was selected because of subjects' familiarity with it and because this type of test does not depend on other communication skills to indicate the correct answer.
MiniCad 5.0.1, a CAD/graphics/animations software package (Diehl Graphsoft, 1994), was used to complete all drawings for the paper assessment. The computer hardware used was a standard Apple Macintosh Quadra 650 computer with 8 megabytes of RAM and a 250 megabyte hard drive.
Only FIDER accredited interior design programs housed in art, home economics and/or architecture schools participated in the assessment. First semester freshmen and seniors were tested to encompass the greatest diversity of skills.
The assessment package was mailed to the participating schools and contained the following: a detailed set of instructions so that all participants would receive the same amount of instruction, sequentially numbered score sheets, the requested number of assessment copies and return envelopes.
When the completed paper assessments were received from each of the participating institutions, they were immediately scored. The scores were returned so each institution could promptly review its students' performance. After all the institutions returned their completed assessments, the data were compiled, analyzed and then forwarded to each institution.
Reported typographical and drawing errors were corrected using the comments supplied by the participating institutions. The edited paper assessment was used as the basis for the development of the computerized version. The paper assessment was extended to include animation using Authorware Professional, an interactive assessment development software program (Macromedia, 1993), and MiniCad 5.0.1, a CAD/graphics/animation software package (Diehl Graphsoft, 1994), on a standard Apple Macintosh Quadra 650 computer with 8 megabytes of RAM and a 250 megabyte hard drive.
The computerized assessment was then administered to a small group of students to validate the computer format. Data were compiled and compared to the paper assessment to see if similar results occurred. Comments were reviewed and corrections were made to the computerized assessment.
The Paper Assessment
Participants completed a multiple choice test that included three groups of problems:
Group One (18 questions) -- The subject was given plan, front elevation, side elevation and isometric views and asked to determine the correctly drawn image corresponding to the designated direction of view. **
Group Two (18 questions) -- The subject was provided an isometric view of an object and asked to identify the incorrectly drawn orthographic projection. **
Group Three (9 questions) -- The subject was presented a pictorial drawing of a room and asked to determine the viewer's location on the plan provided.**
** An area for sketching possible solutions is provided beside each question on the paper assessment.
For an example of the paper assessment mailed to schools, see Appendix C.
Errors and Omissions in the Paper Assessment
During validation of the paper assessment, a few schools reported several errors:
Question #3: The arrowhead indicating the designated direction of view (the solid black one) was in the wrong location. However, sufficient information was given, either in the provided two-dimensional views or the available options, to offset the incorrect placement of the arrow. The resulting score for that question was consistent with the rest of the assessment (Appendix A).
Question #15: The mistaken placement of the designated direction of view arrowhead resulted in the lowest percentage of correct responses of the entire assessment.
Question #31: Lines shown as solid (object) lines should have been dashed (hidden) lines.
Questions #19, 20, and 33: Hidden lines were omitted on one or more of the views. Question #19 presents an interesting problem: freshmen, due to a lack of knowledge, did not recognize that both Option A (Plan) and Option C (Right Elevation) contained drawing mistakes. They selected the most obviously incorrect drawing, the plan, which happened to be the keyed answer (fig. 2).
Example -- Question #19
Seniors possibly recognized both of these errors, resulting in one of the two questions on which the freshmen outscored the seniors (Appendix A). Comparing questions #19, 20, and 33 with the rest of the assessment showed that these simple types of errors had dramatic effects on the percentage of correct responses (Appendix A).
Question #34: The front and right elevation options were reversed, not following standard drawing conventions. Switching the elevations did not affect the freshmen's scores, while the seniors possibly recognized that the elevations were reversed and that one of the elevations contained an error. This combination resulted in the second question on which the freshmen scored higher than the seniors (Appendix A).
The Computer Assessment
Drawing mistakes found in the paper assessment were corrected. This corrected paper assessment then became the basis for the computerized assessment. The computerized version is similar to the paper version in both content and sequence except for the addition of animations created with MiniCad 5.0.1, a CAD/graphics/animation software package (Diehl Graphsoft, 1994; see Appendix D). To force subjects to use only their visualization skills, neither a sketching area on the computer screen nor sketch paper was provided with the computerized assessment. The computerized assessment was administered to freshmen and seniors from school 90001 to validate the computerized vehicle and identify errors in it.
After students entered the computer lab, verbal instructions were given about how to start the assessment. The importance of understanding the directions, as displayed on the computer screen, was stressed several times because once students started the assessment they could not return to the beginning screen to review the instructions. No further directions were given during the assessment. After completing the assessment, students filled out a short questionnaire requesting their scores and any comments about the computerized assessment (Appendix E). After reviewing these comments, small refinements were made to the assessment.
Errors and Omissions in the Computer Assessment
Group three questions were not included in the computerized assessment. Very high average scores by both the freshmen and seniors (fig. 4) show that this type of question did not accurately test visualization skills as defined within the confines of this research.
Problems occurred when "packaging" the assessment to send to other institutions: some animations developed "hiccups." At the end of an animation sequence, the image would suddenly jump from the last frame to the first frame and back to the last very quickly. This occurred randomly throughout the assessment. Reloading the assessment from the same disks produced these "hiccups" on the same or different questions, or solved the problem completely. Telephone calls to both the compression program's developer, Aladdin (Aladdin, 1992), and the developer of Authorware Professional (Macromedia, 1993) provided no answer to the problem, and no solution is currently available.
Freshmen Test Scores
Forty-two (42) percent of the five hundred ninety-six (596) freshmen paper assessment packets were completed and returned. Individual results for the two hundred fifty-two (252) freshmen who participated in the paper version ranged from a low score of 18 percent to a high of 98 percent, with an overall mean of 70.81 percent and a standard deviation of 17.56 (fig. 3, fig. 4). Wide ranges of scores within individual freshmen classes revealed the variety of skills and innate abilities of incoming freshmen. The narrow range of scores in school 20001, an architecture school, may be attributed to its rigid admission requirements.
The high overall mean for freshmen reflects the approximately 20-point difference between the average scores freshmen achieved in the first two groups of problems and those in the third group (fig. 4). High scores in group 3 may indicate that freshmen already possessed this type of visualization skill at the time of admission to an interior design program.
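The descriptive statistics reported above (low, high, mean, standard deviation) can be reproduced with Python's standard `statistics` module. The scores below are hypothetical stand-ins for illustration, not the study data.

```python
import statistics

# Hypothetical freshman score sample; the study itself reports a
# mean of 70.81 and a standard deviation of 17.56 for 252 freshmen.
scores = [18, 52, 60, 71, 75, 80, 88, 98]

low, high = min(scores), max(scores)
mean = statistics.mean(scores)   # arithmetic mean
sd = statistics.stdev(scores)    # sample standard deviation

print(f"range {low}-{high}, mean {mean:.2f}, s.d. {sd:.2f}")
```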
Average Assessment Scores per Group
Senior Assessment Scores
Sixty-seven (67) percent of the seniors responded from a total of two hundred forty-three (243) packages requested. The one hundred forty-nine (149) seniors' mean score was 84 percent, ranging from a low of 20 percent to a high of 100 percent, with a standard deviation of 14.56. Unlike the freshmen's, the seniors' scores fell within a very narrow range, indicating the common visualization skills seniors acquire as they approach the end of their education (fig. 5 and Appendix B).
Seniors Assessment Scores
Not clearly explained by the data is the relatively small (8 percentage points) difference between the freshmen's and seniors' average scores (fig. 6). Comparing the difference in average scores of freshmen and seniors in schools 10000, 40000 and 90000 indicates the acquisition of visualization skills, but the rest of the institutions showed little or statistically no improvement from the freshman to the senior year (fig. 6).
Overall Average Test Scores w/ 1 Standard Deviation
The apparent lack of improvement could possibly indicate that the assessment does not accurately measure visualization skills or that those institutions do not emphasize learning that type of visualization skill.
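Whether a freshman-to-senior difference is statistically meaningful can be checked with a two-sample t statistic. The sketch below uses Welch's formulation with hypothetical score samples; it is an illustration of the kind of comparison involved, not the analysis performed in this study.

```python
import statistics
from math import sqrt

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic (unequal variances assumed)."""
    mean_a, mean_b = statistics.mean(sample_a), statistics.mean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    std_err = sqrt(var_a / len(sample_a) + var_b / len(sample_b))
    return (mean_b - mean_a) / std_err

# Hypothetical freshman and senior score samples.
freshmen = [55, 62, 70, 71, 78, 83, 90]
seniors = [72, 80, 84, 86, 88, 92, 96]

t = welch_t(freshmen, seniors)
# With reasonably large samples, |t| well above about 2 suggests a
# real difference; a value near zero suggests no measurable gain.
print(f"t = {t:.2f}")
```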
Seniors consistently outscored their freshmen counterparts in each group of assessment questions (fig. 4). The difference between the average scores decreases as the amount of information required to mentally manipulate the object decreases, agreeing with Metzler and Shepard's 1971 experiments (Shepard, 1971). In group 3, the small difference between the average senior and freshman scores shows that this type of question did not accurately measure visualization skills. A third group of participants, non-designers, needs to be surveyed and compared with these results to validate this assumption.
According to the data, entrance requirements did not have an impact on freshmen's scores. Students surveyed in schools with low GPA entrance requirements performed as well as or better than those in schools with high GPA requirements (fig. 7). For example, freshmen in school 20001 posted the highest average freshman results but were not required to meet the highest minimum GPA for admission into their program. Also, the average score freshmen posted at one institution was not an indication of how well the seniors from the same institution scored (fig. 8). Large increases in average scores from the freshman to the senior year may indicate the strength of the overall program in developing visualization skills. The reason for the lower average scores of seniors than freshmen in school 80000 is not clear from the data.
Entrance Requirements & Average Freshmen Scores
Average Scores per School
Overall, freshmen improved a modest 8 percent on the computerized assessment compared to the paper assessment (fig. 9). The improvement is not as large as anticipated, but might have been greater had the computerized directions for the first group of questions been clearer. Greater development of visualization skills seems to have taken place at the freshman level than at the senior level, as indicated by the larger increase in scores by freshmen. Administering the assessment to every class would further pinpoint when students begin to acquire this vital skill.
Seniors showed a 1 percent drop in their scores (fig. 9). The nearly identical average senior scores on the paper and computerized assessments indicate that seniors already possess these visualization skills by this point in their educational careers.
Comparison of Scores: Paper Assessment vs. Computer Assessment
The paper assessment fulfilled its anticipated role as a way of developing and validating the assessment questions and then surveying a large group of students. It had the advantage of being easily sent to any location and required no specialized computer equipment for its administration.
Students who participated in the computerized assessment unanimously selected the computerized version as the more beneficial. Interest remained high throughout the computer assessment, with some students trying to beat the assessment as if it were a video game. Remarks students made while taking the computer assessment led to the conclusion that the addition of animation to the standard multiple-choice format provides a very positive learning experience. Typical student comments were:
"You could see your mistakes immediately"
"Animations helped correct your mistakes"
"Need a piece of paper to sketch on"
"Frustrating - got so many wrong"
"Learned as I went along"
"Fun - great exercise"
"Great way to take an exam"
"See mistakes immediately, animations great help"
"Got better as I went along"
"Saw which group I needed to work on"
See Appendix E for a full list of comments.
Increases in the computerized version's average scores indicate that the inclusion of animations has the potential to increase visualization skills. Using the computer assessment to gather a data sample similar in size to the paper assessment would help validate this assumption. Students without computer experience quickly adapted to the computer version and the basic skill of manipulating a mouse to select an answer. Although special equipment is needed to administer the computerized assessment, the increases in scores and the comments made by participants lead to the conclusion that the computerized version is the more beneficial when used in this context.
1. Eliminate the given three-dimensional view, thereby forcing the subject to use his/her visualization skills to completely visualize the object from the orthographic information provided in group one.
2. Evaluate the assessment with and without the animations to see if animations have an effect on learning visualization skills.
3. Provide sketch paper or an area in the computer assessment to see if computerized assessment scores improve when participants have the opportunity to sketch when trying to figure out an answer.
4. Give students additional help in weak areas of their visualization skills by identifying which problems are missed and then providing additional problems of the same difficulty.
5. Determine the correlation between the scores of incoming freshmen and their final success or failure in their interior design program by tracking them from admission into their program.
6. Develop flexibility in the assessment instrument to allow students with visualization deficiencies to use it as self-administered, individually paced instruction.
7. Revise the instructions and/or modify the instrument to permit the student to return to the directions and example. This modification may help students better adjust to using the computer, especially in group 1.
Shepard, R. N., & Cooper, L A. (1982). Mental images and their transformations. Cambridge: MIT Press.
The IBD Foundation/Lester Johnson Endowment Graduate Fellowship Research Grant made it possible to administer the paper version of the assessment to an anticipated 800 interior design students from eleven different FIDER accredited interior design programs.