
TLT Collaborative: Ongoing Projects and Initiatives

UNC Flashlight Statewide Training Workshop Survey [RESULTS]

Flashlight Use:

Briefly, what potential uses do you envision Flashlight having on your campus? Include information about the timeframes, if possible.

18 Responses

    I guess Flashlight might be of use to faculty who have never used the WWW and don't know how to program or do not want to use shareware.
     
    I will be working on a plan of action shortly but do not have this information right now.
     
    IT evaluation -- we have already used the Institutional Research SACS surveys
     
    Right now, we are in a beta test of a Distance Learning project for assessment. We are planning to beta test it on pilot projects to evaluate its usefulness and results before we integrate it across the entire campus.
     
    The Flashlight database was rather limited as presented to us at the workshop. There seemed to be a need for more items and more specificity. At some point in its development it may be of more use, but currently it is not useful for the types of evaluations that I conduct.
     
    It will give faculty the ability to create assessments with little knowledge of what type of questions to create. Once Flashlight has the ability to work with our student and faculty database, we can use it to assess many services all across campus. Without this ability, I think we will limit its use to course evaluations only.
     
    We will plan to pilot the assessment with a small number of selected on-line instructors first this semester. After the pilot, we will begin to identify those departments or individual teachers that are ready to add an on-line assessment and plan training for beginning of next year.
     
    End-of-course evaluations; formative and summative evaluation
     
    We hope to use it as a part of the evaluation tools used in the School of Education and in the proposed distance learning programs.
     
    Use in evaluating courses, particularly those incorporating the use of technology. Also, we would like to develop an instrument to use in evaluating the services of the Center once it is completely established and being implemented.
     
    We will offer a training session in March and possibly again in May. We want our faculty to consider using it as part of their course evaluation, maybe in the Fall.
     
    This program could be used to evaluate online courses for effectiveness. Instructional design training could be analyzed as well. There are many other applications, such as in traditional classrooms, but the first two are my view of my campus priorities. If we purchased a license, I would begin incorporating assessment into the current cycle of web course development, beginning in April 2001. However, we have not purchased the program so this is mostly moot for now.
     
    We began to use Flashlight last semester, and we will continue to use Flashlight Online to provide assessment opportunities for both traditional and online courses.
     
    Moderate potential.
     
    We are just now trying to decide how to best use Flashlight.
     
    Flashlight-based surveys could be used to evaluate system-wide initiatives from an administrative standpoint, perhaps. Without seeing the question bank, it's hard to be sure.
     
    1) Distance ed program assessment; 2) providing an intro to Flashlight as we deliver instruction on the integration of technology in teaching and learning; 3) using it as a model for the development of assessments for library instruction.
     
    Waiting for feedback from the Flashlight Administrator.
     

What concerns do you have about using Flashlight?

17 Responses

    I don't need it right now since I use other programs for the same purpose and have been for 5 years.
     
    Managing this tool, among the many others on campus, with the people resources we currently have.
     
    1. It does not allow you to make questions required. A user can easily submit the survey without selecting an answer option. 2. The exporting of information is not very user friendly. When exporting to a text file, the questions (column labels) are not easily identified without first having to go elsewhere for an explanation of what each column header represents.
     
    Want to see the Faculty assessment questions.
     
    The database is too limited, both in subject areas and in question items.
     
    No back end for adding users; no way to verify who took the survey; clunky interface.
     
    More of my concerns are about our infrastructure, not the product itself.
     
    The surveys reside on another campus's server.
     
    I need to spend more time working with it.
     
    None at this time.
     
    Flexibility
     
    Lack of research supporting the tools -- it is difficult to approach/encourage faculty without it.
     
    No manual! No online training modules.
     
    None at the moment.
     
    The questions are cross-referenced, yet it seems difficult to determine if questions are useful for a particular purpose.
     
    1) cost; 2) validity/reliability;
     
    Need an FAQ database and a list of the questions the other 16 UNC campuses have asked about Flashlight. More examples of the many uses of Flashlight on campus for students, administrators, staff, and faculty.
     

What improvements could make Flashlight a better product?

15 Responses

    I would expect the usual evaluation techniques found on products we now use, such as item-scale correlations, test validity checks, and a factor analysis program for question weighting. We do this even for questions in our large social science courses for EVERY exam.
     
    Can't say quite yet.
     
    For a faculty member giving a survey to his/her students, it's a good tool to use, and the integrity of the results is easily controlled by the use of user IDs and passwords. At a campus level, the integrity of survey results is much harder to control, since it would not be productive to give a username and password to everyone. There is no way to track an IP address to 'somewhat' ensure that people don't take the survey twice.
     
    Increase the number of question items and subject areas.
     
    see above
     
    A manual available in looseleaf format, so that questions created at a later date could be added to each section.
     
    I need to spend more time working with it.
     
    Don't know enough about the product to make any recommendations.
     
    More on how to customize.
     
    Research findings published in refereed journals
     
    A manual. Online training modules.
     
    A broader spectrum of surveys (beyond student satisfaction with classroom instruction).
     
    I am not sure at this point.
     
    Add the capability to create new questions from existing questions. The editing features are hard to use. Also, one should be able to go back to editing a "finished" survey. More sample surveys would be helpful.
     
    Need more training in the use of the product.
     


Copyright The University of North Carolina. All Rights Reserved.