Thursday, September 30, 2010

How Can the Learning Resource Center Better Assist You and Your Students?

Guest Blogger: Patricia Fullmer

All of us at the LRC are interested in continuously improving our services and ensuring that we are effectively helping students and assisting professors. We would like to know your ideas about improving our services.

Tutoring, Persistence, and Retention


Several research studies provide evidence that tutoring can significantly help students earn higher GPAs, persist in their education, and remain enrolled. Rheinheimer et al. (2010) tracked 129 incoming Act 101 students at a public university in Pennsylvania and found that "…students who were tutored were 13.5 times more likely to graduate than students who were not tutored…" (p. 28). The total number of hours tutored significantly predicted cumulative GPA, credits earned toward graduation, and graduation itself. This recent study demonstrated that tutoring helps improve students’ academic performance, persistence, and retention.

The immediate positive feedback of an online tutoring system has been linked to an increase in metacognitive and cognitive skills (Saadawi et al., 2010). In addition, Hodges and White (2001) found that tutoring is a contributing factor in students' academic success, and Boylan, Bliss, and Bonham (1997) found that tutor training was significantly related (p < .05) to higher first-term GPA, higher cumulative GPA, and student retention. With this evidence in mind, the LRC's tutors, both professional and peer, are trained and certified through the International Tutoring Program Certification process of the College Reading and Learning Association.

Request for Your Response


We in the LRC would like to know how we can work more closely with faculty and address any problems faculty see. We also welcome your suggestions on how to get more students to use the LRC, so that we can be more effective in helping students persist in their education and graduate.

References:

Boylan, H., Bliss, L., and Bonham, B. (1997). Program components and their relationship to student performance. Journal of Developmental Education, 20(3).

Hodges, R. and White, W. (2001). Encouraging high-risk student participation in tutoring and supplemental instruction. Journal of Developmental Education, 24(3), 2-11.

Rheinheimer, D.C., Grace-Odeleye, B., Francois, G.E., and Kusorgbor, C. (2010). Tutoring: A support strategy for at-risk students. Learning Assistance Review, 15(1), 23-34.

Saadawi, G., Azevedo, R., Castine, M., Payne, V., Medvedeva, O., Tseytlin, E., Legowski, E., Jukic, D., and Crowley, R. (2010). Factors affecting the feeling-of-knowing in a medical intelligent tutoring system: The role of immediate feedback as a metacognitive scaffold. Advances in Health Sciences Education, 15, 9-30.

Sunday, September 26, 2010

Teaching Matters and So Does Assessment: How Not to Assess Student Learning

Guest Blogger: Safro Kwame

If the Middle States visiting team of 24th September 2010 has taught us anything, it may be that teaching matters and so does assessment of teaching! The team's report suggests the following to me:
(a) a need for an immediate change in our habits and our assessment of student learning,
(b) a need for faculty to take ownership of the assessment of teaching,
(c) a need for appropriate software to collect and analyze assessment data, and
(d) a need for an internal Middle States-type assessment committee that will do what the external one (from Middle States) has done, i.e., evaluate our assessment efforts and make appropriate recommendations.

QUESTION: What did you get from the Middle States visit of 24th September 2010?

SUGGESTION: Look at Middle States' standards and guidelines on assessment, indicate whether you agree with the visiting team's finding that we are not in compliance with Standard 14, and explain why you agree or disagree.

REFERENCES

1. See Faculty Meeting Minutes of 29th April 2008 for my original proposal for assessment

We should (1) stop doing what we have been doing about assessment or significantly improve upon it, and (2) immediately implement the Middle States evaluation team's suggestions and recommendations on assessment. Example for Consideration: Each instructor may, accordingly, design a simple test of student learning outcomes that could be electronically scored or graded and automatically processed and analyzed for program, department, school, and university characteristics and recommendations. Thus, in addition to submitting a grade sheet at the end of each semester, each instructor can turn in an assessment sheet or report (after grades have been submitted). – Safro Kwame, 4/29/08


2. See Faculty Meeting Minutes of 3rd February 2009 for my follow-up proposal for assessment

Put simply, Middle States wants faculty to (a) regularly assess at least some of the goals and objectives of courses and programs, apart from assessing the courses and students themselves, (b) share and discuss the results, and (c) implement changes resulting from the assessment and discussion.

An Example of One Type of Assessment:

1. Select 2 or 3 of your most important goals or objectives. Make sure they are (easily) measurable.
2. Set 2 or 3 questions specifically for each goal or objective.
3. Get students to answer the questions.
4. Find an easy (e.g., automatic or electronic) way to score the answers to the questions and analyze the results, for example by using assessment software such as ExamView or by getting IT to acquire and administer appropriate software.
5. Discuss the results with your colleagues and, preferably electronically, forward the results and recommendations (which may include changes) to your supervisor and/or central coordinating unit which could be IR, Chairperson, Dean, or VP.
6. Make appropriate changes, e.g. to your syllabus, examination or content or delivery of course, as a result of your assessment of learning goals and objectives.

Note: You need (a) software to create, score, analyze, forward and collate assessment, and (b) personnel to support or assist in creating, scoring, analyzing and processing assessment. Consult IT, IR and VP. – Safro Kwame, 2/3/09
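
To picture step 4 above, here is a minimal sketch, offered only as an illustration and not as part of either proposal: a short script that scores multiple-choice answers against a key and summarizes the results by learning objective. The file name, question IDs, and CSV layout are assumptions made for this example; in practice, packages such as ExamView, or the systems described in the news report below, would do this work.

<?php
// A hypothetical sketch, not an existing campus system: score multiple-choice
// answers against a key and summarize the results by learning objective.
// The file name, question IDs, and CSV layout are assumptions for illustration.

// Answer key: question ID => array(learning objective, correct answer).
$key = array(
    'Q1' => array('Objective 1', 'B'),
    'Q2' => array('Objective 1', 'D'),
    'Q3' => array('Objective 2', 'A'),
    'Q4' => array('Objective 2', 'C'),
);

$totals  = array();   // answers seen per objective
$correct = array();   // correct answers per objective

// responses.csv holds one row per answer: student_id,question_id,answer
$handle = fopen('responses.csv', 'r');
while (($row = fgetcsv($handle)) !== false) {
    list($student, $question, $answer) = $row;
    if (!isset($key[$question])) {
        continue;   // ignore rows for unknown question IDs
    }
    list($objective, $right) = $key[$question];
    $totals[$objective] = isset($totals[$objective]) ? $totals[$objective] + 1 : 1;
    if (strtoupper(trim($answer)) === $right) {
        $correct[$objective] = isset($correct[$objective]) ? $correct[$objective] + 1 : 1;
    }
}
fclose($handle);

// Report the percentage of correct answers for each objective.
foreach ($totals as $objective => $count) {
    $hits = isset($correct[$objective]) ? $correct[$objective] : 0;
    printf("%s: %.1f%% correct (%d answers)\n", $objective, 100 * $hits / $count, $count);
}
?>

The point is simply that once answers are collected electronically, the scoring and the per-objective summary can be automatic, leaving faculty time for the discussion and follow-up in steps 5 and 6.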


3. See News Report on the Need for Assessment Software:

"New Software Aids in Assessment," by Dan Carnevale, The Chronicle, Vol. 53, Issue 30, p. A37, 3/30/2007

Facing greater demands for accountability, colleges turn to technology to analyze mounds of data.

Richmond, Va. The last time Virginia Commonwealth University had to prepare for an accreditation review, officials here found themselves overwhelmed with data. The university's accreditor, the Southern Association of Colleges and Schools, was asking for more information than ever before about how much students were learning: grades, test scores, written evaluations, and other measures. Much of that information was scattered throughout the institution, kept in computer files and storage drawers.

So Jean M. Yerian, then the director of assessment, led the development of a computer program that would organize and analyze all the assessments of students being done on the campus. The computer program, dubbed Weave, not only helped the university satisfy its accreditors, but also appealed to other colleges, which wanted to use it to prepare for their own accreditation reviews. "We started out as solving our own problem and ended up developing something that can help others as well," says Ms. Yerian.

Last year Virginia Commonwealth spun off the project as an independent company called WeaveOnline. Ms. Yerian resigned her post at the university last month to become director of assessment management for the company, which has already attracted more than 40 colleges as clients.

Supply is slowly meeting the demand. Companies such as Blackboard, Desire2Learn, and Datatel have developed software that helps conduct institutional assessments. Other companies, such as Oracle and eCollege, have plans to jump into the game as well.

Caribbean University Selects Blackboard Outcomes System to Assess Student Learning

Dec 11, 2007. University Is First in Latin America to Implement Comprehensive Institutional Assessment to Meet Accreditation Standards.

PHILADELPHIA. During the annual conference of the Middle States Commission on Higher Education, Blackboard Inc., a leading provider of enterprise education software and services, announced that Caribbean University in Puerto Rico has selected the Blackboard Outcomes System(TM) to assess student learning across its system of four campuses, and plan for and measure continuous improvement in institutional effectiveness, to help continue to meet the rigorous accreditation standards set by the Commission.

WEAVEonline is a web-based assessment system that helps you to manage accreditation, assessment and quality improvement processes for your college or university.

The Blackboard Outcomes System helps institutions efficiently meet the demand for increased accountability and drive academic improvement with evidence-based decisions. The Blackboard Outcomes System makes planning and assessment easier and evidence-based.

The TrueOutcomes Assessment Manager is a complete, web-based solution that facilitates every aspect of Learning Outcomes Management from assigning, assessing, and tracking to analyzing and making evidence-based decisions to improve student learning outcomes and facilitate continuous improvement.

eLumen Achievement is an information system for managing a college's attention to student achievements, learning outcomes and education results. It is specifically designed to facilitate authentic assessment processes that are faculty-driven, student learning-centered, standards-based, and (now, with eLumen) system-supported.

Tk20 provides comprehensive outcomes assessment systems that let you collect all your data systematically, plan your assessments, compare them against specified outcomes/objectives, and generate detailed reports for compliance, analysis, and program improvement. A leader in assessment, Tk20 offers a complete set of tools for managing outcomes-based assessment and measurement of student learning as well as institutional activities such as program improvement, curriculum mapping, institutional effectiveness, and reporting.

Monday, September 20, 2010

Teaching Service, Learning Fun

Michelle Petrovsky, Guest Blogger

Our recent discussions of assessment seemed to give short shrift to an important teaching tactic: service learning. Relating classroom activities to conditions, events, and trends in the larger world enhances students’ interest in those activities. That in turn reinforces competencies and skills gained.

In my Web Programming class (CSC 201) in Spring 2009, service learning was at first absent. Students were lectured on, led through lab work in, and mentored regarding topics including:
HTML (the “native language” of web pages, which is built from components such as the BODY tag and the BGCOLOR attribute)

MySQL (a full-function database management system quite comparable to high-end packages like Oracle; widely used on servers that offer Web-based functions that require dynamic data, such as purchases)

PHP (one of the two programming languages, the other being Perl, almost universally used to provide interactivity between Web browsers and servers; such interactivity can't be provided by HTML alone). A short sketch of how these three pieces fit together appears below.
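
Here is that sketch: a minimal, hypothetical example rather than the class's actual project, in which a PHP page asks a MySQL database for a list of events and turns the rows into HTML. The connection details, table name, and column names are placeholders assumed only for illustration.

<?php
// A hypothetical sketch, not the CSC 201 project itself: build an HTML page
// from rows stored in MySQL. Host, credentials, database, table, and column
// names are placeholders.

$db = new mysqli('localhost', 'csc201', 'secret', 'campus_info');
if ($db->connect_error) {
    die('Could not connect to MySQL: ' . $db->connect_error);
}

// Ask MySQL for the rows the page needs; this is what makes the page dynamic.
$result = $db->query('SELECT title, event_date FROM events ORDER BY event_date');
if (!$result) {
    die('Query failed: ' . $db->error);
}

echo "<html><body bgcolor=\"#F5F5F5\" text=\"#435D36\">\n";
echo "<h1>Upcoming Events</h1>\n<ul>\n";
while ($row = $result->fetch_assoc()) {
    // htmlspecialchars() keeps stored text from being interpreted as markup.
    echo '<li>' . htmlspecialchars($row['event_date']) . ': '
        . htmlspecialchars($row['title']) . "</li>\n";
}
echo "</ul>\n</body></html>\n";

$result->free();
$db->close();
?>

HTML by itself describes a fixed page; the PHP in the middle is what lets the page change whenever the data in the MySQL table changes.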

Despite their all being upperclass computer science majors, and therefore having significantly more than a nodding acquaintance with programming concepts and practices, my students slogged. Writing lines like

<BODY TEXT="#435D36" BGCOLOR="#F5F5F5">

to define the background and text colors of a web page, rather than pointing and clicking in a program like Dreamweaver, is both a challenge and an effort, even for the very computer-literate.

Noting the slog and seeking some way to ameliorate it, I talked to the class about reworking their semester project to include a service-learning experience. At first skeptical, they quickly warmed to the idea. The group's first design decision? That the web site they would create, along with the MySQL database and PHP programming that might be needed to support it, should address topics my folks felt would be of interest to the entire LU student body.

Direct, indirect, and even outright subjective assessment tools indicated that connecting classroom activities to a larger context improved student performance. Grades on subsequent quizzes and exams were higher than those on the midterm. Projects began to be completed with fewer requests for assistance. Group work proceeded more smoothly, with less and less instructor monitoring needed. And I saw clearly that my students’ enjoyment of and enthusiasm for CSC 201 had increased. They were not only learning but also having fun doing so. The website they created is still available at

http://compsci.lincoln.edu/csci/csci.htm

Monday, September 13, 2010

Lincoln's Center for Teaching and Learning Enhancement: What Can We Offer?

Guest Blogger: Yvonne Hilton

The Center for Teaching and Learning Enhancement (CTLE) is working this year to make a remarkable impact in the area of faculty development at Lincoln University. As the director of this program, I see it as a resource providing workshops, seminars, and activities that help strengthen the pedagogical acumen of faculty. CTLE wants to provide opportunities to glean from the wealth of wisdom we have within our walls, as well as from knowledgeable professionals beyond our campus.

We (the CTLE Advisory Board) believe a good place to start is to hear from you. We want to know what your interests and challenges are as university faculty. To this end, we have developed a survey that we will ask you to complete at this week’s faculty meeting. Feedback from this survey will give us insight into the types of programs faculty want and need in order to be the best they can be.

CTLE is still very small, as it is in the beginning stages of its existence. Therefore, we ask for your cooperation, your understanding, and your patience as we grow and mature. In the meantime, please share with us some of the things you would like to see CTLE do this year. Perhaps you have taught at other institutions with similar programs; share the types of programs and services you experienced there. Tell us your thoughts and opinions so that we can work toward making this year beneficial for everyone.