ART 125A - Basic Design - Stage 5 - Mine Ternar

Assessment

Assessment Methods
  • Analysis of exam, quiz, or homework items linked to specific SLOs
  • Assignments based on rubrics (such as essays, projects, and performances)
  • Capstone projects or final summative assignments
Assessment Description

I. Final Exam Style Exit Survey: Building on what we learned from our previous SLO data analyses, I produced an enhanced Excel worksheet to analyze, in greater detail, the results of the summative final exam style exit survey we have been administering (with some improvements every semester) in all sections of Basic Design since 2011. Following our established pattern of revisiting the exit survey with data and analyses from previous semesters for discussion among all faculty teaching design, we improved the survey again, making further clarifications and adding necessary details. The improved exit survey was administered in all sections of Basic Design. Each student's score on every question and subject area was recorded for each section, and the results were compiled for the whole program. This data was then analyzed in detail with the new Excel worksheet. Both the highly detailed results and their summary analysis were recorded and distributed to design faculty for our ongoing SLO evaluations and discussions.

II. Direct Assessment of SLOs through faculty evaluation of creative design projects given with rubrics: With active feedback from design faculty in the program, I developed a standard rubric for assessing SLOs through their direct demonstration in students' design projects in the course. I also designed an electronic worksheet which supplements this rubric, for the calculation of student proficiency levels based on their individual project scores. This method was devised to ensure consistency in the way data was collected for compilation, and is the result of design faculty working together cooperatively, brainstorming on teaching priorities and the essentials of assessment. The rubric, and its electronic worksheet supplement for calculating student scores and their corresponding proficiency levels, were distributed to all design faculty. All faculty used these new tools in their evaluation of student projects completed in response to the course MLOs. For this semester, we decided to focus on MLO A: "Distinguish, explore, employ and organize design elements (line, shape, value, texture, color) to create unified space in two-dimensional compositions." The results gathered from each section were compiled for both individual section and program-level summary analyses.

III. Mid-term Exams & IV. Capstone Projects/Final Summative Assignments: Two sections of the course, which utilize Moodle in the delivery of instruction and quizzes, had quantitative data readily available for analysis of the mid-term and other quizzes, final exam projects, and students' total scores from the semester, addressing all course SLOs. For this semester, I chose to analyze student proficiency levels on the mid-term exam and the final capstone project in those two sections.
Learning Outcomes: ALL
Number of Sections: 9, 9, 2, 2
Number of Instructors: 8, 8, 1, 1
Number of Students: 133, 156, 45, 42
(The four values in each row correspond to assessment methods I-IV, respectively.)

Data Analysis

Data Shared With
  • Instructors of the same course (at CCSF)
  • Faculty and staff within our department
  • Summary data is also shared through this public website.
Data Sharing Methods
  • Face-to-face meetings
  • Email
  • Shared document files
Data Summary

I. Final Exam/Exit Survey Summary:

Total number of students who took the survey: 133
Total actual program enrollment at semester's end (when the survey was administered): 151
Student participation ratio for the program: 88.08%
Ratio of overall achievement for the program: 73.83%

Overall correct student replies to each question:

Q1 Elements of Design: 59.40%
Q2 Uses of line: 93.23%
Q3 Vocabulary of shape: 54.14%
Q4 Value (Definition): 96.99%
Q5 Compositional Balance: 81.20%
Q6 Visual texture (Definition): 75.94%
Q7 Spatial interaction: 81.95%
Q8 Positive/Negative Space: 69.92%
Q9 Pattern (Definition): 90.23%
Q10 Depth Illusion in 2D: 64.66%
Q11 Basic Color Wheel: 85.71%
Q12 Color theory (vocabulary): 70.68%
Q13 Complementary contrast: 47.37%
Q14 Color/Analogous Scheme: 57.89%
Q15 Compositional unity: 78.20%

A bar graph of the final exam style exit survey results can be viewed at the following URL:

https://sites.google.com/a/mail.ccsf.edu/mine-ternar/slos/art-125a-slo-graphs/Sp13_BarGraph_Art125ASurveyResults.png?attredirects=0
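As a cross-check on the summary figures above: the participation ratio is simply the number of surveyed students divided by program enrollment, and the overall achievement ratio works out to the mean of the fifteen per-question rates. The actual analysis lives in the Excel worksheet described under Assessment Description; the short Python sketch below only illustrates the same arithmetic using the published figures (the variable names are mine, not the worksheet's).

```python
# Minimal sketch of the exit-survey arithmetic (the real analysis is
# done in an Excel worksheet); figures are the published Spring 2013 ones.
per_question_correct = {                 # % of correct replies per question
    "Q1 Elements of Design": 59.40, "Q2 Uses of line": 93.23,
    "Q3 Vocabulary of shape": 54.14, "Q4 Value": 96.99,
    "Q5 Compositional Balance": 81.20, "Q6 Visual texture": 75.94,
    "Q7 Spatial interaction": 81.95, "Q8 Positive/Negative Space": 69.92,
    "Q9 Pattern": 90.23, "Q10 Depth Illusion in 2D": 64.66,
    "Q11 Basic Color Wheel": 85.71, "Q12 Color theory": 70.68,
    "Q13 Complementary contrast": 47.37, "Q14 Color/Analogous Scheme": 57.89,
    "Q15 Compositional unity": 78.20,
}

surveyed, enrolled = 133, 151
participation = 100 * surveyed / enrolled
achievement = sum(per_question_correct.values()) / len(per_question_correct)

print(f"Participation ratio: {participation:.2f}%")   # -> 88.08%
print(f"Overall achievement: {achievement:.2f}%")     # -> 73.83%
```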

II. Direct SLO Assessment/Evaluation of Student Design Projects with Rubrics: Individual student project scores, per the agreed-upon rubric, were entered on an electronic worksheet by all design faculty, and results were collected for a program-level review. Faculty are still in the process of adjusting to the use of a shared rubric in direct assessments. Per our specific program goals, we decided to look not only at the general proficiency levels, but also to break down our general proficiency category into two sub-categories: 1. Proficient Basic (an average passing score), and 2. Proficient Plus (a high score of 8-10 points in a maximum 10-point scoring system). Here is the program-level summary of the data collected from all sections teaching Basic Design, in relation to the direct assessment of MLO A: "Distinguish, explore the characteristics of, employ and organize design elements (line, shape, value, texture, color) to create unified space in two-dimensional compositions."

Total # of students enrolled in Basic Design at time of assessment: 160
Total # of students assessed: 156
Participation ratio: 97.50%
% of students Proficient (total of Proficient Basic and Proficient Plus categories): 83.33%
% of students in the Developing category: 7.69%
% of students in the No Evidence category: 8.97%

A bar graph of the assessment results from each section and for the program total can be viewed at: https://sites.google.com/a/mail.ccsf.edu/mine-ternar/slos/art-125a-slo-graphs/Sp13_BarGraph_Art125A_MLO.A.png?attredirects=0

You may also view the breakdown of the three proficiency levels (1. Proficient, 2. Developing, 3. No Evidence) for the program in this pie chart: https://sites.google.com/a/mail.ccsf.edu/mine-ternar/slos/art-125a-slo-graphs/Sp13_Art125A_StudentProficiencies_MLO_A.png?attredirects=0

A further breakdown of the data in the Proficient category gave us the following information. Among the students who were proficient (83.33% of all assessed), those who scored in the 8-10 point range on a scale of 10, whom we call our Proficient Plus group, comprised 77.69% (or 63% of all students assessed, across all categories).

You can view this further breakdown of the direct assessment results into two Proficient sub-categories (I. Proficient Plus, II. Proficient Basic) in the following pie chart: https://sites.google.com/a/mail.ccsf.edu/mine-ternar/slos/art-125a-slo-graphs/Sp13_Piechart_Art125A_ProficiencyBreakdown.png?attredirects=0
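To make the banding concrete, here is a minimal Python sketch of how proficiency categories can be computed from 10-point project scores, which is what the rubric's electronic worksheet supplement does in Excel. The 8-10 (Proficient Plus) and 6-7 (Proficient Basic) bands follow the definitions in this report; treating unsubmitted work as No Evidence and scored work below 6 points as Developing is an assumption made here for illustration, and the function names are hypothetical.

```python
# Sketch of the worksheet's banding logic. The 8-10 (Proficient Plus)
# and 6-7 (Proficient Basic) bands follow this report's definitions;
# mapping unsubmitted work (None) to No Evidence and scored work below
# 6 points to Developing is an assumption for illustration.

def proficiency_band(score):
    if score is None:                  # no work submitted
        return "No Evidence"
    if score >= 8:
        return "Proficient Plus"
    if score >= 6:
        return "Proficient Basic"
    return "Developing"

def band_percentages(scores):
    bands = [proficiency_band(s) for s in scores]
    return {band: 100 * bands.count(band) / len(bands)
            for band in ("Proficient Plus", "Proficient Basic",
                         "Developing", "No Evidence")}

# Hypothetical section scores, not the actual Spring 2013 data:
print(band_percentages([9, 8.5, 7, 6, 5, None, 10, 6.5]))
```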

III. Mid-Term Exam: Proficiency levels of students in 2 sections (005 & 831) utilizing Moodle for delivery of instruction:

The 45 students who actually took the exam were assessed.

6 students who did not take the exam and withdrew from the course were excluded from the analysis upon withdrawal. (Until they withdrew, those students were in the No Evidence category.)

The class average proficiency score (for all students who took the exam): 82.09

Summary Bar Graph of the Mid-Term Exam Results can be viewed at this link: https://docs.google.com/a/mail.ccsf.edu/viewer?a=v&pid=sites&srcid=bWFpbC5jY3NmLmVkdXxtaW5lLXRlcm5hcnxneDo3NjRlZTAwMzg3ODcyZDc4

IV. Students' Final Capstone Project Scores in the two sections which utilize Moodle for delivery of instruction:

37 students who actually submitted the final project were included in the assessment.

5 students who did not submit the project were excluded from the assessment. (Those students received a score of 0 and were in the No Evidence category.)

1 of the 5 students above took an Incomplete due to medical reasons and plans to complete the project during the summer and submit it in the Fall.

The remaining 4 were in the No Evidence category and failed the course.

The course average proficiency score of the 37 students who did submit the project: 84.84

A bar graph summary analysis of the final capstone project results can be viewed at this link: https://sites.google.com/a/mail.ccsf.edu/mine-ternar/slos/art-125a-slo-graphs/Sp13_Art125A_DataGraph_FinalProjScores.png?attredirects=0
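Note that the averages in sections III and IV above are taken only over students who actually took the exam or submitted the project; withdrawals and non-submissions (the No Evidence cases) are excluded before averaging. A minimal Python sketch of that convention, with hypothetical scores:

```python
# Hypothetical scores, not the real data: None marks a withdrawal or a
# non-submission (No Evidence), which is excluded before averaging.

def participant_average(scores):
    submitted = [s for s in scores if s is not None]
    return sum(submitted) / len(submitted)

print(participant_average([92, 78, None, 85, None, 88]))  # -> 85.75
```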
Analysis Summary

I. Exit Exam Style Survey: We have been achieving our goal of an overall program success ratio of over 70% (73.83% in Spring 2013). Most of our students who do not meet the 70% or higher rate of achievement are students who are struggling academically, some with learning disabilities. However, some of our students who do not do well in quiz-style assessments are able to demonstrate successful outcomes in project-based assessments, as can be seen in our direct assessment results.

Because students are accepted to art programs at four-year colleges, in large part, on the strength of their portfolios, direct assessment of outcomes as demonstrated in the actual work they produce in our classes is a very important measure of our students' proficiency levels.

II. Direct SLO Assessments: 97.50% of all students enrolled in the program at the time of our assessment were included. Since most of our students were scoring in the Proficient category in direct assessments, in order to obtain data that was more helpful to us when assessing SLOs through projects with rubrics, we decided to break down the general proficiency level into two subcategories: I. Proficient Plus (students scoring 8-10 points on a scale of 10, roughly the equivalent of a B or A letter grade), and II. Proficient Basic (students scoring 6-7 points on a scale of 10, roughly the equivalent of a passing grade of C). We also looked at the total of those two groups for our overall proficiency ratio, and then analyzed the Developing and No Evidence categories. This allowed us to see how certain projects were instrumental in achieving particular SLOs. The program overall is quite strong in this regard: 83.33% of the students scored in the Proficient range, of whom 77.69% (or 63% of all students assessed) scored quite high (8-10 points on a scale of 10, our Proficient Plus category). The results are very encouraging, as they demonstrate the success and effectiveness of our teaching as evidenced in the student learning outcomes for MLO A, which goes to the heart of Basic Design.

III. Mid-Term Exam Data for proficiency levels of students in the 2 sections utilizing Moodle for delivery of instruction (sections 005 & 831) indicate a very high achievement ratio for students who actually took the test: the course average is 82.09 out of 100. (The mid-term exam was quite comprehensive, not only testing students on their content knowledge but, in two of the questions, requiring students to critically articulate the design dynamics in examples of a traditional and a modern artwork. It addressed all the course SLOs, with the exception of those which require the hands-on implementation and manipulative skills gained in the course; SLOs relating to implementation in students' own creative work were measured through direct assessment of the student projects with rubrics.)

Of the 45 students who took the exam, only 4 scored below the proficiency level, all of whom were in the Developing category.

6 students who did not take the exam and withdrew from the class at mid-term were in the No Evidence category until they withdrew, and were excluded from the assessment data upon withdrawal.

This evidences a high average proficiency ratio for students who attend the class and participate in coursework and assessments as required.

4 students in the Developing category continued to have problems, but only 1 of those 4 failed the course at semester's end, as she continued to have attendance and participation problems and did not submit the required work.

Of the 4 in the Developing category, 1 had a diagnosed learning disability but, with regular attendance and participation, raised her score and passed the course.

Of the 4 in the Developing category, 1 student was repeating the course for the second time and continued to have serious problems with attendance and participation, barely scoring a passing grade at semester's end. (He appeared to have attention issues which created serious challenges in learning; he needed academic support, which he was not getting. The need to inquire about academic counseling and support was discussed with the student.)

Of the 4 in the Developing category, 1 student had a great deal of talent but was experiencing challenges due to serious attendance and participation issues, as well as overall problems with study skills. With a great deal of guidance, she improved her score and passed the class at a basic proficiency level, but did not reach her actual potential.

IV. Data from the scores of the 37 students who submitted the Final Capstone Project, in the two sections which utilize Moodle for delivery of instruction (005 & 831), show a very high course average score of 84.84. (The final capstone project scores involve not only an assessment of proficiency in the implementation of skills and knowledge gained in the course, as demonstrated in students' own creative works, but also the students' participation in the final critique, which requires a demonstrated ability to critically articulate the 2D dynamics in their own work and in the works of peers, and so targets all course SLOs.)

This calculation excludes the 5 students who did not submit the project and who are in the No Evidence category. 4 out of those 5 failed the course, but 1 was in the Incomplete category due to medical reasons; she plans on completing the work and submitting it in the Fall.

Of the 37 students who submitted the final project, only 2 were in the Developing category. The total semester score of 1 was not at the basic proficiency level, and that student failed the course; the other's total score was just at the border of passing, and that student did pass the course with a low grade. Both students had serious attendance and participation issues. The one who passed with a low grade was repeating the course for the second time.

Again, this demonstrates a very high achievement rate among students who submit their work. The final project and its critique synthesize the course SLOs, and thus its direct assessment results are a good indication of where students are at semester's end.
Next Steps Planned

Flex Day in August 2013: Design faculty will meet for an extended department meeting to discuss the SLO results from both the exit survey and the project assessments with the shared rubric. We will share the data in person, discuss our results and what they mean, and discuss the efficiency of compiling data electronically. (I will share the electronic form I have created to administer the exit survey online.) We have already revised our course outline twice in the last two years with data from our SLO assessments and feedback from our college-wide SLO reporting process. (Our course outline is therefore very current; it is approved by the C-ID program for California Community Colleges and has been given a C-ID designation, so it is recognized for articulation with any other CCC course with the same designation for course content credit.) Our next goal is to tie all learning activities and projects in all sections to an SLO assessment process per the most recent revisions made to our course MLOs. Traditionally, instructors have done this through the grading of their projects per the course objectives. How our grading can and should consistently align with our SLO assessment process, while we keep up with the pace of the changes taking place at our college and at community colleges at large, will be an important part of our discussions in our Flex Day meeting in August.

During the Fall 2013 semester: All design instructors will conduct their own project-based assessments with our shared rubric again, reporting to the area coordinator by the last day of teaching. Repeating the same direct SLO assessment process, evaluating outcomes through student projects with rubrics, will lead us to revise our project descriptions as needed, bringing greater specificity to grading and greater consistency to teaching among all sections, tying all grading to SLO assessments.

Fall 2013 and Spring 2014: In our area meetings, we will continue to discuss our direct assessment results, share our methodologies in presenting visual problems through the projects we assign, the sequence of our projects, student work examples and other relevant factors, to maintain the current strength of our program, and to continually improve and update our presentations and assessments to effectively address all current demands.

December 2013: We will administer the exit survey in all sections again. The exit survey gives a general idea of how SLOs are being achieved in relation to academic content, and we improve our teaching in the program through our discussions of the consistencies and variables we see in the final analysis.

January 2014: Fall 2013 assessments will be analyzed and summarized, and results will be distributed to all design faculty for our ongoing discussions of SLO evaluations and our teaching.
Learning Outcomes: ALL

Changes

Details

A new course outline, with revised SLOs, was submitted for the review of the Curriculum Committee and was approved. (The revisions were made, with feedback from our SLO assessments, to an already quite current course outline, one which had been updated only a year ago!)

A revised distance education addendum was submitted to and approved by the Curriculum Committee for the hybrid online section of the course.

Our revised course outline is recognized with a course identifier number in the California Community College (CCC) courses list and is approved for articulation with any other CCC course with the same designation for course content credit.

Changes in the updated outline were applied to teaching in all sections.

We continued with existing methods of assessment, such as the exit surveys, but we also focused on developing greater consistency in the way we directly evaluate our SLOs through our grading of student projects: we developed a new rubric to be used across all sections, together with an electronic worksheet I designed for greater efficiency in calculating student proficiency scores and faster data collection for program-level compilation.

Design faculty were exemplary in the way they communicated with one another, attended meetings to discuss SLOs, and implemented the many changes.
Learning Outcomes: ALL

Tentative Future Plans

Term: Fall 2013
Activities
  • Assessment (measurement) of outcomes
  • Analysis and discussion of assessment data and next steps
  • Implementation of planned changes and reassessment
More Details

1. Design Area meeting, as an extended Art Department meeting on Flex Day in August 2013, to review our assessment data, discuss the implementation of changes as needed, and plan steps for continuing assessments.

2. During the Fall 2013 semester, utilize the electronic worksheet and rubric developed in Spring 2013 for direct assessment of SLOs through students' project scores. Discuss project descriptions and possible revisions to them, as needed, for best results in student achievement and efficiency of our evaluation process.

3. Administer a final summative exam style exit survey again, to compare results later with those from previous semesters. Discuss with faculty the possibility of administering the exit survey online. (I have already generated an interactive online version of the survey using Google Forms; however, we need to discuss the pros and cons of giving the survey online in computer labs.)

SLO Details Storage Location

Additional Highlights

We have had excellent communication among design faculty, leading to very lively discussions on teaching methods, coverage of content, types of projects given and their evaluation, and effective and creative ways of accomplishing major learning outcomes and fostering student achievement. Design faculty worked very hard to keep up with the implementation of the changes we brought to our course outline, which we revised twice in the last couple of years with feedback from our SLO work. There has also been increased communication relating to teaching and assessment methods between fine art and graphic communications faculty, which has been very valuable: the two programs share goals when it comes to the coverage of principles of visual organization, yet the differences in applying the same principles in fine arts versus contemporary communication media or commercial arts lead to discourse that is wonderfully stimulating and inspiring for us all.

Overall, the college-wide sharing of the SLO reporting/review process that our first-ever SLO Coordinator and Dean of Instruction put in place through a public website this year has made dialogue and feedback between faculty in different disciplines much easier. I have benefited greatly from this year's process and devised more effective ways of assessing and reporting SLOs and improving outcomes further with feedback. In just the second semester of this college-wide online reporting and sharing, the SLO assessment process in our area has seen tremendous growth, because we have received excellent direction through the questions asked of us on this online form, and benefited from constant feedback and the viewing of examples, all of which have naturally resulted in addressing outcomes and assessments with greater breadth. In other words, I have found this to be not a mere data reporting tool, but a very effective interactive tool for sharing outcomes across disciplines, and for learning, growth, and transformation. It has also given me the opportunity to discover, and be better informed of, the amazing learning activities taking place at the college. I am extremely impressed by the herculean task that our first-ever SLO Coordinator and our Dean of Instruction accomplished college-wide, in less than a year's time, by devising this system, which serves as a built-in, ongoing quality control and improvement mechanism. In a very short period of time, the process has proven its effectiveness; in time, it will be transformative on an institutional scale. A very solid, transparent review and feedback system is in place. It is working, and has already achieved an amazing feat in just its first year!
