Cover Sheet for Bids (All sections must be completed)

Name of JISC Initiative: Assessment and Feedback: Strand A
Name of Lead Institution: Bath Spa University
Name of Proposed Project: FASTECH: Feedback and Assessment for Students with Technology
Name(s) of Project Partner(s) (except commercial sector – see below): University of Winchester
This project involves one or more commercial sector partners: No
Name(s) of any commercial partner company: No commercial sector partners

Full Contact Details for Primary Contact:
Name: Professor Paul Hyland
Position: Head of Learning and Teaching
Email: p.hyland@bathspa.ac.uk
Tel: 01225 875564
Address: Bath Spa University, Corsham Court Centre, Corsham Court, Wiltshire, SN13 0BZ

Length of Project: 3 Years (Years 1 & 2 are JISC part-funded)
Project Start Date: September 2011
Project End Date: August 2014

Total Funding Requested from JISC: £190,449
Funding requested from JISC broken down across Academic Years (Aug-July):
August 2011 – July 2012: £95,221
August 2012 – July 2013: £95,228
Total Institutional Contributions: £236,678

Outline Project Description
FASTECH is designed to use readily available technologies to support the systemic enhancement of assessment and feedback strategies and practices at programme, school and institutional levels. A key aim of the project is to provide evidence of, and guidelines for, technological improvements and change processes that can be used to enhance the effectiveness and efficiency of assessment and feedback at these levels throughout the sector. Our core research activities will be conducted through collaborations involving senior managers, teachers, technical and administrative staff, students and external advisors. Throughout the project we will also work with colleagues in other institutions and organisations, drawing upon their experiences and expertise, and ensuring that the key findings and outputs of FASTECH are provided in ways that best address the particular needs of their disciplinary communities, institutions and learning environments. By addressing the challenges of improving practices through research and development at programme level, we will be able to provide findings that are sensitive to disciplinary needs and traditions, and closely attuned to teachers' and students' interests and experiences.
Our project builds upon a new body of knowledge about student and staff experiences of assessment and feedback, collected from over 22 degree programmes in 8 universities. From these baseline data we have identified many common and distinctive disciplinary challenges facing students and their teachers, and have worked effectively with course teams and senior managers to realise enhancements through systemic change processes. We are now aiming to improve student learning from assessment and feedback by transforming practices within 33 degree programmes across Bath Spa and Winchester universities, and embedding well-proven technology-supported pedagogies and quality improvements in these institutions.

I have looked at the example FOI form at Appendix A and included an FOI form in this bid: YES
I have read the Funding Call and associated Terms and Conditions of Grant at Appendix B: YES

1. Appropriateness and Fit to Programme Objectives and Overall Value to the JISC Community

1.1 Introduction
FASTECH will embed technology-supported improvements to assessment and feedback practices in 33 degree programmes across the range of educational provision at Bath Spa and Winchester universities.
This will directly affect the learning experiences of over 6,500 students; the practices of over 300 teachers; and the work of over 60 IT-support, administrative and other staff, including senior managers, within the institutions. By seeding and supporting change within whole programmes, we are able to work with the grain of teachers' subject interests and departmental loyalties, and the full course experiences of their students. And by demonstrating the effectiveness, efficiency and sustainability of technology-supported learning and teaching practices within many kinds of programmes, we are able to engineer systemic institutional quality enhancements.

1.2 Baseline Data and Scope of Research
To understand the effects of any changes made to assessment and feedback strategies and practices, it is essential to have high-quality baseline data. We have outstanding quantitative and qualitative data on all of the 13 degree programmes at the heart of FASTECH research. These data have been gathered through:
- Curriculum Audit: documenting the variety, volume, timing and sequencing of formative and summative assignments, and the forms and amounts of feedback given and received;
- Assessment Experience Questionnaire (AEQ): providing evidence of student experience on 10 scales, such as 'quantity of effort', 'goals and standards', and 'quantity and quality of feedback';
- Meetings with Teaching Teams: providing knowledge of teachers' views and problems;
- Student Focus Groups: providing insights into the nature of students' experiences, and how students believe these can be improved.
Early findings of this research are available on the 'Transforming the Experience of Students through Assessment' website (www.testa.ac.uk) and have been reported at national and international conferences. The tools and methods of this research are now familiar to our Programme Leaders and their teaching teams, and we will use and adapt these instruments (e.g., by concentrating on the effects of technology-based interventions in Student Focus Groups) throughout the Project. This will enable us to measure not only the effects of particular changes due to the use of technologies, but also to understand how these changes are related to, and affect, the assessment and feedback cultures and environments of programmes and departments. Our wider Project work (e.g., on the processes of embedding institutional change, improving staff training and development, and promoting community take-up) will provide further evidence to support the JISC community and HE sector.

1.3 Workload Tracking
Attention to 'Time-on-Task' has long been recognised as a vital principle of good practice (Berliner, 1984; Chickering & Gamson, 1987), as it exercises a powerful influence on staff and students' workloads and experiences. In times of austerity and higher learner expectations, it is also of fundamental importance to the work of managers and developers. Moreover, staff anxieties about increased workloads are widely recognised as a major obstacle to the adoption of technologies to enhance teaching and learning. Throughout the Project we will conduct research to track the total time that teachers and students spend on assessment and feedback practices each month, and the time spent on particular kinds of activity, such as providing/reading written feedback on essays, holding/attending face-to-face tutorials, and preparing/using screencast feedback.
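To illustrate the kind of monthly aggregation and baseline comparison that this tracking implies, a minimal sketch is given below. The file layout, column names and use of pandas are our own assumptions for illustration, not part of the Project's tooling.

```python
# Minimal sketch only: assumed columns are programme, role ('teacher'/'student'),
# month, activity, hours, and phase ('baseline' or 'post_change'); these names
# are illustrative assumptions, not the Project's actual chart format.
import pandas as pd

returns = pd.read_csv("time_on_task_returns.csv")

# Total hours reported per programme, role and month, split by activity type
# (e.g. written feedback, face-to-face tutorials, screencast feedback).
monthly = (returns
           .groupby(["programme", "role", "month", "activity"], as_index=False)["hours"]
           .sum())

# Mean monthly hours before and after technology adoption, and the change.
summary = (returns
           .groupby(["programme", "role", "activity", "phase"])["hours"]
           .mean()
           .unstack("phase"))
summary["change"] = summary["post_change"] - summary["baseline"]

print(monthly.head())
print(summary.sort_values("change").head())
```

A summary of this kind could then be set alongside the AEQ 'quantity of effort' scale and student-learning data, as described below.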
Data will be collected primarily through Time-on-Task charts (one for teachers, one for students), which will be emailed to all teachers and students each month. The charts will be trialled and baseline data collected in the first stage of the Project. As the Project develops, this and our other data will enable us to provide evidence of:
- the effects of introducing particular technological changes on staff and student workloads;
- how workload volumes and patterns change in the first and subsequent years of technology adoption;
- how students' time on assessment relates to other factors (e.g., the 'quantity of effort' scale in the AEQ);
- the relationships between effectiveness and efficiency in assessment and feedback practices;
- how managers and developers can plan and promote the greater use of technologies, based upon data about resourcing needs and benefits.

1.4 Assessment and Feedback Challenges Addressed
From our data on 22 degrees at 8 universities, and some of the key findings of educational research and advice in this field, it is clear that although there are significant variations in students' experiences of assessment and feedback according to their course of studies, there is also a strong common pattern of experience which can be improved through the use of readily available technologies targeted to meet the needs of particular disciplines, kinds of student, and learning environments. An outline of these common challenges, some of the technologies that we will deploy to enhance students' learning, and the benefits we expect is provided here.

Challenges
- Students do not report high levels of effort, and have difficulty in planning and distributing effort evenly across modules, due to poorly designed patterns of assessment.
- Students struggle to understand course documentation (e.g., of 'assessment criteria') and feedback, and to apply written feedback.
- Teachers spend large amounts of time crafting written feedback which isn't attended to.
- Students receive feedback too late to be of use to them.
- Peer review and feedback are patchily implemented and viewed as ancillary or 'second best'.
- Self-assessment is often optional and rarely used.
- Students do not fully understand assessment goals and standards, which often appear uneven and fragmented due to diverse assessment practices and marker variation in modular schemes.

Illustrative Innovations
- Frequent and regular tasks (such as blogging, chat forums, patch-working and portfolio activities) delivered online; authentic assessment tasks which require collaboration and file-sharing.
- Alternative modes of feedback implemented: e.g., dialogic and oral feedback, through the use of mp3 files and mobile learning devices; opportunities for students to respond to or request feedback on particular aspects of their learning are built into online modes and subject pedagogies.
- Teachers give more informal and quicker feedback in oral formats, either generically or one-to-one, through mobile technology; use of interactive Assessment Cover-sheets and online tutoring.
- Early-bird generic online feedback in Jing, mp3s or Vimeo clips, with illustrative material, helps students to learn from feedback; Personal Response Systems develop student and teacher understanding of learning gaps before assessments are submitted.
- Shared online forums where writing tasks are reviewed by teacher and student commentary; use of software such as PebblePad and PeerMark in Turnitin.
- Use of PebblePad and other PDP facilities to help students develop self-reflective skills.
- Use of tools (e.g., Jing screencasts, Co-genT and Grademark) to make standards more transparent through discussion and exemplification.

Benefits
- Students are required to work more evenly throughout and across modules; effective 'time on task' raises student engagement and leads to higher understanding and achievement.
- Changes to the transmission model of one-way feedback: creating more dialogue and engagement, and enabling students (individually and collectively) to participate in feedback processes; enhancement of feedback that feeds forward.
- Pedagogical relationships are often strengthened by the use of more personalised forms of feedback, delivered through technology; major efficiency gains are realised as teachers spend less time crafting written feedback that isn't attended to.
- Formative processes are strengthened as 'feedback' is shifted to the more useful concept of 'feed forward'; forums enable teachers to build up banks of exemplars of work with comments and discussion, for access 24/7.
- Students' self- and peer-review skills are strengthened, as is their social experience of learning; students actively engaged in assessment processes will develop abilities to manage their own learning and enhance peer-learning activities.
- More student reflection on their practices and achievements leads to deeper learning; more holistic learning results from more feedback at programme level.
- Students benefit from shared and well-articulated standards; standards can be transformed from obscure discourses into ones that students understand and use.

1.5 Barriers to Change
Drawing upon the work of Lindquist (1978), Draper and Nicol (2009) identify common barriers to institutional change. These are summarised below, each with our strategies to address it.

1. Major disciplinary differences in teaching and learning.
Strategies: Our programme-level approach acknowledges the diversity of disciplinary needs, traditions and 'signature pedagogies' (Shulman, 2005). We will work in partnership with subject teams to ensure that the technologies chosen are used to help realise benefits that teaching teams and students will prize.

2. The isolation of academics from the educational research literature.
Strategies: All subject teams have developed a good working knowledge of assessment and feedback concepts and literature through participation in TESTA. Staff development activities at subject, institutional and cross-institutional levels will raise knowledge of IT tools and applications, and support the development of knowledge through our communities of practice.

3. Weak linkages between local innovations and general strategies.
Strategies: Our project is directly aligned with our universities' Learning and Teaching and E-learning strategies and plans. These highlight the importance of improving assessment and feedback processes and policies at programme and institutional levels through the greater use of e-learning facilities.

4. Low levels of senior management 'buy-in' after funding is secured.
Strategies: Our Management Group will invite and expect PVCs and managers (e.g., of QA, staff development, IT services, e-learning) to attend selected meetings where institutional issues are at stake. Progress reports will be submitted for discussion and action at key university boards and committees. Heads of Faculty in both universities support our work and are keen to see it develop.

5. Little evidence of the benefits of innovation.
Strategies: The value of our approach (collecting empirical data at programme level and supporting staff and students in developing improvements that are best suited to their particular needs and environments) has been commended by all of our programme teams and their students. The benefits of innovations (from improving the quality of developmental feedback to increasing the efficiency of assessment and feedback processes) are therefore well understood by our students, teachers and senior managers.

6. Funding is diverted to support other developments already underway.
Strategies: Throughout our universities there is already widespread interest in the use of digital media to support assessment and feedback, though the most advanced usage is usually by individuals within particular modules. FASTECH builds upon the current roles of the Project Leaders and other staff (such as departmental 'E-learning Champions'), and will facilitate the development and scaling-up of small, free-standing and patchy innovations to deliver systemic changes at programme, faculty and institutional levels.

1.6 Project Benefits
Institutional Benefits (Bath Spa and Winchester)
- Development of technology-supported assessment and feedback practices to enhance the quality of teaching and learning, and student experiences, within 33 degree programmes. Target: at least 15 programmes across all Faculties in each institution.
- Development of expertise and capacity to facilitate the transfer of knowledge (based upon well-documented evidence of benefits for students, teachers and managers) about the application of technologies and effective change processes pertinent to all other degree programmes in the universities.
- Accelerated attainment of strategic goals for improving students' learning and assessment experiences, in accordance with key objectives of each institution's Learning and Teaching and E-learning strategies. Target: improved 'Assessment and Feedback' scores in annual NSS returns.
- Improvement of institutional policies and procedures for Quality Assurance (such as guidelines for the review, validation and evaluation of courses), staff development and technology support.
- Delivery of 2 conferences and 28 staff training-and-development workshops at departmental, institutional and cross-institutional levels. Target: 150 staff participating in each institution.
- Improved management of resources to support teaching and learning, founded upon a better understanding of student and staff workloads relating to assessment practices. (A quick survey to support this bid suggests that our teachers spend between 8 and 40% of their total teaching time on assessment and feedback activities; the average is 24%.) Target: 15% efficiency savings on staff 'assessment and feedback' time throughout each institution.
- Strengthening of the relationships between the universities through our Communities of Practice. Target: further collaborative initiatives and 1 externally-funded project.
- Increased opportunities for disciplinary teams to develop and disseminate pedagogical knowledge and expertise through activities with JISC and the HE community. Target: 8 teacher-researcher publications/conference presentations from each institution by completion of the Project.

Community Benefits
- Provision of a suite of digital media resources to enable teachers, managers and developers to learn about programme-level assessment and feedback practices, and how these may be enhanced through the use of readily available technologies.
- Online and face-to-face support for teachers and others to use the findings, tools and methods of FASTECH to review and improve their own practices and programmes.
- Opportunities for individuals and groups throughout the sector to share the challenges and findings of their work by contributing assets to the FASTECH website.
- Opportunities to contribute to online discussions on improving students' experiences of assessment and feedback within their disciplines and institutions, through participation in the national and international community of practice seeded by the work of TESTA and developed here.
- Access to a wide range of high-quality research findings within and across the disciplines, to support further research and development throughout the sector, and the ability of departments and institutions to address issues, such as NSS 'Assessment and Feedback' scores, through proven technologies.
- Access to the data, findings and tools of longitudinal research on the effects of technology-supported changes on staff and student workloads, to support the business case for change.

1.7 Summary of Outputs and Deliverables for the Community
Outputs and deliverables will include:
- 13 'thick descriptions' of the contexts, research and development processes and findings, challenges, technologies used, and benefits and effects of change within the degree programmes (Year 2), and 20 case studies of change in other degree programmes (Year 3);
- 13 student narratives (one from each programme) describing experiences of change from students' perspectives;
- 2 full accounts of the processes and outcomes of embedding institutional change within the partner universities;
- 1 manual for teachers and developers to help them apply the tools, methods and findings of the Project to enhance assessment and feedback practices within their own programmes;
- 1 guide for Quality Assurance, administrative and technical staff to help them improve QA procedures and IT support;
- 1 guide to the range of technologies used in FASTECH and the research findings relating directly to their use;
- 1 full account of the comparison of staff and student workloads (from Time-on-Task data) with quality-of-student-learning data, enabling modelling of 'Effectiveness and Efficiency through Technology';
- 4 progress and 2 evaluation reports.
These assets will make the most of the opportunities afforded by digital media to facilitate re-purposing; critical and creative interactions among key stakeholders (e.g., disciplinary communities and senior managers); and the contribution of open assets from individuals, groups and organisations throughout the sector. Other outputs will include 1 FASTECH conference, 12 national/international conference papers, and full participation in all JISC activities to promote and support the Assessment and Feedback Programme.

2. Quality of Proposal and Robustness of Work Plan
All Project staff, institutional resources and facilities will be in place to secure a strong start in September 2011.

2.1 Educational principles
The fundamental importance of degree programmes in framing the activities of teachers and the learning experiences of students is well known to students, teachers and researchers, and we know that each degree's assessment environment exercises a powerful influence upon what, how much and how effectively its students learn. Our Project is underpinned by key concepts from Gibbs and Simpson (2004) on the conditions of assessment which support student learning, and from Nicol and Macfarlane-Dick (2006) on the principles of good feedback.
It draws upon the work of Boud (2000), O'Donovan, Price and Rust (2008), and Nicol (2010) in its emphasis on dialogic approaches to improving feedback, and it is informed by a wide range of research on feedback and assessment and the uses of new technologies within and across the disciplines. The individual tools and methods of FASTECH research, such as the AEQ and focus groups, are well established, though our approach to research and development (working in partnership with teaching teams and students to engineer the systemic improvements that they would like to make to the assessment schemes and cultures of their programmes) is distinctive. From our work to date with staff and students on whole programmes, at several universities, we are confident that our approach is also proving to be effective.

2.2 Outline of Project Plan: Key Stages and Activities
For planning and monitoring purposes, FASTECH is described here in 5 stages, reflecting key opportunities and deadlines for development within the rhythms of the academic year. Project milestones will entail the attainment of all activities at each stage.

Stage 1: Set Up (September - December 2011)
- Confirm partnership agreements and staff roles and responsibilities; advise institutional senior managers, supporters and external advisors about their roles; appoint student reps for all 13 programmes.
- Design and launch of the FASTECH website to share Project resources, expertise, experiences, challenges and findings, with links to key sector materials to support the application of technologies and knowledge of good practices, a student 'help' and FAQ facility, and a blog with RSS feed. Test and launch the website with the FASTECH team, students and the external community.
- First FASTECH conference to review plans and build cross-institutional CoPs (all FASTECH team; 13 programme leaders and student reps; PVCs, institutional QA managers and IT/e-learning champions; external advisors; plus the JISC A&F programme and projects). Circulate findings for online discussions.
- Work with Programme Leaders, their teaching teams and students, using baseline data (already collected via TESTA) and training workshops (provided by FASTECH staff and externals) to address challenges and identify appropriate technologies. Plan technology trials within modules and across programmes. Collate, compare and share findings.
- Introduce students and staff to FASTECH Time-on-Task charts. Pilot online usage. Plan the design of an 'Effectiveness and Efficiency through Technology' model, with the help of the Business Departments.
- Meetings of the Project Management and Development groups to review progress; prepare Time-on-Task charts for teachers and students; start collection of Project evaluation data; address systemic challenges (e.g., of sustainability, diversity and disability); undertake community-building activities with external advisors and HEI contacts.
- Submission of Project Plan and Web Page (Sept) and Consortium Agreement and Project Website (Nov) to JISC. Progress Report for Stage 1.

Stage 2: Pilot Interventions and Formative Evaluations (January - August 2012)
- Programme Teams undertake technology trials with support from the E-learning Leader, Project Developers and Learning-with-Technology Specialists. Impact on students' learning is tracked through focus groups and online discussions. Impact on student and staff workloads is tracked through Time-on-Task charts. Teams report findings and programme development plans at inter-site meetings. Report from the Lead Researcher & Manager to the FASTECH Management Team.
- Staff Development workshops led by FASTECH leaders, developers and externals, to support key groups (e.g., teaching teams; QA, Registry, admin and IT staff; HoDs and senior managers).
- Quality Assurance and IT-support issues (e.g., systems integration) are identified and addressed by the Management Team with senior managers at BSU and UW.
- External conference presentations of FASTECH 'Work in Progress'.
- Community-building activities: e.g., visits to other HEIs to promote take-up of FASTECH.
- Review of the website through student, teacher and community surveys and analytics.
- Baseline and Progress reports, and Technical and Supporting Documentation, to JISC at the required dates.
- Progress Report for Stage 2.

Stage 3: Programme Change and Institutional Development (September 2012 - May 2013)
- Programme teams, supported by all FASTECH staff, apply technological changes to whole degree programmes. Impact on student learning experiences and outcomes is examined through quantitative and qualitative data collected from the Assessment Experience Questionnaire, focus groups, student performance, the programme assessment audit (TESTA tool) and Time-on-Task charts. Impact on teachers and on the managers, admin and IT supporters of student learning is collected through programme-team meetings, institutional meetings, and the Time-on-Task audit.
- The Lead Researcher & Manager provides analysis of findings against baseline data for each programme, and comparative analysis across the 13 programmes.
- Inter-site meetings and online discussions to review progress, and to identify and address challenges.
- Progress Report to the Management Team, key university committees, and JISC.
- Staff Development workshops at faculty, institutional and cross-institutional levels.
- Community-building activities: e.g., to capture and develop assets from other HEIs.
- External national conference papers delivered (e.g., for HEA, SEDA, ALT-C).
- Progress Report for Stage 3.

Stage 4: Production of Resources and Dissemination of Findings (June - August 2013)
- Teaching teams and students produce digital media resources: e.g., case studies of programme-level change (such as audio-visual materials on the effects of introducing technologies to enhance assessment and feedback practices), and student accounts of change processes and benefits. Assets are trialled with prospective users in the sector.
- FASTECH leaders and developers produce online guides and manuals to support teachers and educational developers in the selection and implementation of technologies; a guide for QA staff and supporters of student learning; and a guide to research findings on the technologies used. Materials tested with external advisors.
- First modelling of 'Effectiveness and Efficiency through Technology' based on staff-workload and student-learning data. Review and development of models with Finance staff.
- First Evaluation Report to JISC and University Boards.
- Second FASTECH conference (attendance as first conference, plus participants from the sector): keynotes, papers and networked discussions on all aspects of the Project and outcomes to date.
- Report to PVCs, HoDs and senior managers at BSU and UW on project outcomes (including findings on effectiveness and efficiency, and recommendations concerning QA procedures and plans).
- Dissemination events and activities in Schools and Departments (e.g., workshops and posters).
- Training of 26 members of the 13 programme teams for the peer-mentoring scheme. Trials of the peer-mentoring scheme to spread and embed technology-supported enhancements.
- Project assets uploaded to the website, with discussion forums and invitations to contribute materials.
- Progress Report for Stage 4.

Stage 5: Institutional Embedding and Community Benefit (September 2013 - August 2014)
- Heads of Learning and Teaching at BSU and UW (Hyland and El-Hakim) work with PVCs and senior managers, using FASTECH findings to review and revise institutional Learning and Teaching and E-learning strategies and operational plans.
- Workload tracking through Time-on-Task charts is repeated for the 13 programmes (for comparison with first-year-of-change data), and introduced for all new programmes.
- Completion of the 'Effectiveness and Efficiency through Technology' work. Dissemination of findings.
- Using the range of expertise acquired through FASTECH, at least 20 more programme teams at BSU and UW are supported (e.g., through normal staff development activities and the peer-mentoring scheme) in the employment of selected technologies to enhance assessment and feedback.
- Production of 20 new case studies, and support for students and others to contribute assets.
- Analysis of FASTECH changes against 'Assessment and Feedback' scores from NSS returns.
- External conference papers delivered (e.g., for SRHE, ISSoTL, and discipline-based events).
- Development of website assets with others across the sector.
- Sustainability plans reviewed and secured in the strategic plans of the universities.
- Final Evaluation Report to JISC and University Boards.

2.3 Leadership and Management
Management Group
Project Leadership will be provided by Prof. Hyland (Chair of the Management Group), who will be responsible for Project plans and outputs, evaluation, staffing, budgets, reporting, sustainability, external advisors, formal relationships with external stakeholders, and partnership arrangements. As Head of Learning and Teaching at BSU, working with the DVC (Academic) and senior managers, he will ensure that the work of the Project is embedded in the strategic thinking of the university and that its findings inform key decision-making processes. The Lead Researcher and Project Manager, Dr Jessop (Chair of the Development Group), will be responsible for the management of research and development activities, including the operational management of the Project Developers; agreements with Programme Leaders, teaching teams and students; staff development activities (including the peer-mentoring scheme); the analysis and presentation of research findings (including Time-on-Task data); and progress reports at each stage of the Project. The E-Learning Leader, Yassein El-Hakim, will have overall responsibility for the use of technologies within the Project, including management of the Learning-with-Technology Specialists; provision of IT training and support; design and management of the Project website; development and testing of assets to meet the needs of the JISC community and HE sector; and contributing to key-stage progress reports. As Director of Learning and Teaching at UW, he will also ensure that the work of the Project is well embedded in the university's strategic planning. Other members of the Group will include one External Advisor, Prof. Gibbs, who will have special responsibility for advising on community engagement, institutional embedding, and Project evaluation; two Programme Leaders; two Student Representatives; and occasional participants (such as PVCs, senior managers and external stakeholders) where appropriate.
Development Group
The Development Group, chaired by the Lead Researcher and Project Manager, will include the E-Learning Leader; the two Project Developers, Joelle Adams and Nicole McNab, who will work closely with the 13 programme teams, administrators and technical staff to ensure that research and development activities within the programmes are delivered and reported; the two Learning-with-Technology Specialists, Dr Breeze and Dr Lewis, who will provide technical advice and support for teaching teams, and support the creation of digital media assets; two Programme Leaders; two Student Representatives; and occasional participants drawn from university staff, external advisors, and external users.

2.4 Risk Analysis

Risk 1: Problems in the appointment of some staff. Likelihood: Low. Impact: High.
Mitigating actions: Some selected staff are on fractional contracts, but all are keen to take on FASTECH work. We have identified other high-quality staff, if needed.
Early-warning sign: Resignations or redeployments.

Risk 2: Programme Leaders are unwilling or reluctant to participate due to lack of time. Likelihood: Low. Impact: High.
Mitigating actions: Project Leaders and Developers will provide support for Programme Leaders. Programme Leaders may choose to delegate some responsibility to appropriate members of their teaching teams (such as E-learning Champions). Some team members will press their leaders to ensure full participation.
Early-warning sign: Poor responses to communications and requests for early meetings.

Risk 3: Teaching teams lack 'buy-in'. Likelihood: Low. Impact: High.
Mitigating actions: Teams are keen to improve assessment and feedback practices using technology, and have enjoyed working with us. Other teams are also keen, and we can collect baseline data on their programmes in Sept/Oct 2011. Many teachers will recognise this as an excellent opportunity for professional/career development. Some will aim to present findings at discipline/technology events.
Early-warning sign: Some staff express resistance to any perceived addition to workloads.

Risk 4: Difficulties with the application of technologies and systems integration. Likelihood: Mid. Impact: Mid.
Mitigating actions: Most technologies are already in use among some individuals or on particular modules at both universities. Our IT teams are familiar with these technologies. Learning-with-Technology Specialists and two of our External Advisors will provide support.
Early-warning sign: Slow progress of teams in implementing technology innovations.

Risk 5: Institutional commitments are lacking due to economic uncertainties. Likelihood: Mid. Impact: Mid.
Mitigating actions: University-funded staff have key institutional roles to enhance the student experience and deliver institutional strategies; improving the effectiveness and efficiency of feedback and assessment is essential to this work. All Project staff are committed to researching and publishing in this field. Any evidence of efficiency gains due to the use of common technologies will be well received.
Early-warning sign: Changes to institutional strategies and funding plans.

Risk 6: Some students are reluctant to use unfamiliar technologies. Likelihood: Mid. Impact: Mid.
Mitigating actions: Where technologies (such as Grademark and Jing) are currently being used in our assessment and feedback practices, most students report very positive experiences. Students will be active participants and contributors throughout our project, and we will provide training sessions and discussion forums to identify and address any difficulties.
Early-warning sign: Programme teams cannot identify student reps.

Risk 7: HEI and sector interest in using technology to enhance assessment and feedback declines. Likelihood: Low. Impact: Mid.
Mitigating actions: The Project team will build upon the national and international network of TESTA users. Project Leaders will present 'Work in Progress' within key communities. Website design will encourage external engagements and assets. Sector concern for assessment will be responsive to the student voice.
Early-warning sign: Lack of early interest in the Project signalled by slow responses to requests to review trials and assets; slow growth of the Project network.

Risk 8: The community does not make use of online resources. Likelihood: Mid. Impact: Mid.
Mitigating actions: A high priority of the Management Group throughout the Project will be to ensure that online assets are provided in ways that are easily transferable or adaptable for use by the key user groups, and that the evidence of benefits is clear to them. The Project team will provide advice services to support external users.
Early-warning sign: Monitoring of site usage through Google Analytics shows assets not being accessed.

Risk 9: Partnership problems. Likelihood: Low. Impact: Mid.
Mitigating actions: History of excellent working relations between key staff, and mutual benefits for the universities.
Early-warning sign: Lack of collegiality and good channels of communication.

2.5 Sustainability
Our universities recognise the many benefits of participation in the scheme (see section 1.6 'Project Benefits' above). A key benefit will be the enhancement of our students' learning experiences, for this is fundamental to our universities' missions and strategic objectives. By the close of the Project it is unlikely that we will have secured technology-enhanced assessment practices across the whole of our course provision, so there will be a strong case for sustaining the Project's work at least until all programmes (including postgraduate courses) have been enhanced. The building and further development of expertise in the use of technologies to improve and transform the pedagogies of our disciplines and programmes will also present us with many further opportunities for improving students' learning; for example, through more online tutoring to increase the flexibility and accessibility of our courses. Such developments will appeal to students, and it may be anticipated that students' continuing concern throughout the sector for the quality of their assessment and feedback experiences will be one of the most compelling reasons for sustaining our Project's work within and beyond our institutions. There may also be an excellent business case for maintaining and extending our investment in the Project: even small but widespread efficiency gains, sustained through the effective use of new technologies, will amount to significant savings that may be re-invested in the Project.

3. Engagement with the Community
We aim to build a strong community of practice through our Project, and will encourage all forms of participation and contribution, from attendance at technology-training workshops to the creation of digital media assets. So, wherever practicable, we will publicise our calendar of work as 'Open Activities and Opportunities'. We will also try to respond to all external requests for help and support, online or in person. We are confident that our work will help programme teams, departments and institutions to address important issues concerning assessment and feedback practices, and to seize opportunities to make improvements. We also recognise the need to work with the particular needs and interests of external stakeholders:

Students
As well as being a primary source of information and opinion, students have a right to be involved in discussions and decisions about assessment, and can exercise great influence as change agents.
FASTECH will work closely with students as active partners in all aspects of the Project: e.g., as focus group participants; student reps; website asset creators and developers; progress monitors and evaluators; and conference presenters. Public evidence of this work will encourage students from other universities, and independent learners, to participate through online discussions, the contribution of assets, and the use and sharing of student 'help' and peer-support facilities.

Teachers
We will engage with teachers within and beyond our institutions, primarily because we will be working to help them realise the practical benefits of using technology to improve their programmes. We will undertake 'Community-Building' (e.g., contributing to discipline-based events) throughout the Project.

Managers
FASTECH will address many concerns of managers: e.g., the need for empirical evidence about the impact of technology-based changes on the effectiveness and efficiency of teachers, and the need for knowledge about how and which technologies might best address issues arising from NSS returns.

Educational Researchers and Developers
From the outset, we will invite researchers and developers to join in our activities (e.g., by attending workshops, shadowing our researchers, reviewing assets), and will support groups and institutions who aim to adopt or develop our research. We will disseminate and promote our work at key conferences and through working with organisations such as SEDA and the HEA.

Administrators and Technology Staff
As with teachers, we will invite external staff (online and through professional associations) to join in our activities and online discussions, and to create web assets.

International Community
Several universities overseas (e.g., Utrecht and UNSW) are keen to undertake research and development work that follows our whole-programme approach to improving assessment and feedback. We will welcome this international enrichment of programme-based research, and will encourage its development, though we will not fund any costs of engagement.

The JISC Community
In addition to participating fully in all JISC dissemination, promotion and evaluation activities, we are keen to draw upon JISC specialists to maximise impact and outputs.

3.1 Evaluation
Responsibility for Evaluation will be held by the Project Leader, supported by the senior external advisor, Prof. Gibbs. Both have extensive experience of evaluating the outcomes of educational research and development projects in the UK and overseas. We will design our evaluation to pay close attention to JISC's 'Project Planning: evaluation plan' and infoNet's 'evalkit'. Key aims will be to assess the attainment of Project goals, and the impacts, benefits and value of our work to various stakeholders within and beyond the partner institutions. We already have substantial baseline data (section 1.2) on which to build, and we will monitor our progress through review and reporting processes throughout the Project (section 2.2). Thus at all stages of the Project we would expect to be able to provide a well-documented and externally verifiable account of impact and benefit to date.
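As a minimal illustration of how such before-and-after evidence might be assembled from AEQ returns (section 1.2), the sketch below compares baseline and post-change scale means per programme. The file name, column names, placeholder programme name and use of pandas/SciPy are our own assumptions for the sketch, not a commitment of the Project.

```python
# Minimal sketch only: assumed columns are programme, scale
# (e.g. 'quantity and quality of feedback'), score (per respondent),
# and phase ('baseline' or 'post_change'). Not project code.
import pandas as pd
from scipy import stats

aeq = pd.read_csv("aeq_responses.csv")

# Mean score per programme and AEQ scale, baseline vs post-change, with difference.
means = aeq.pivot_table(index=["programme", "scale"],
                        columns="phase", values="score", aggfunc="mean")
means["difference"] = means["post_change"] - means["baseline"]

# Illustrative significance check for one scale on one (placeholder) programme.
subset = aeq[(aeq["programme"] == "Example Programme") &
             (aeq["scale"] == "quantity and quality of feedback")]
baseline = subset.loc[subset["phase"] == "baseline", "score"]
post = subset.loc[subset["phase"] == "post_change", "score"]
t_stat, p_value = stats.ttest_ind(post, baseline, equal_var=False)

print(means.sort_values("difference", ascending=False).head())
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```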
We will gather evaluation data from multiple sources (e.g., students, teachers, managers, external stakeholders, website users and asset creators) using standard tools and methods (e.g., questionnaires, interviews, focus groups, peer review of assets, usage and web-server logs), replicating these activities at key stages of the Project to inform our progress and create a chronology of the salient moments, levers and activities of change. In keeping with the JISC Programme, the main focus of our investigations will be on the impact of technology-based change at programme and institutional levels, and the extent to which we have enabled the HE community to deliver quality improvements.

4. Budget
4.1 Budget justification
Staffing costs include: Project Leader, Paul Hyland (BSU-funded), 0.2 FTE for years 1 & 2, plus 0.1 for year 3; Lead Researcher & Project Manager, Tansy Jessop (UW), 0.3 FTE for years 1 & 2, plus 0.2 for year 3; E-learning Leader, Yassein El-Hakim (UW-funded), 0.2 FTE for years 1 & 2, plus 0.1 for year 3; Project Developers, Joelle Adams (BSU) & Nicole McNab (UW), each 0.35 FTE for years 1 & 2; Learning-with-Technology Specialists, Nicholas Breeze (BSU) & Bex Lewis (UW), each 0.2 FTE for years 1 & 2.
Dissemination and Evaluation costs include: External Conferences: 10 staff presentations at 1-day conferences (£200 each), and 2 staff x 2 residential conferences (£500 each); 2 FASTECH Conferences: one at each university, 2 x 30 FASTECH participants @ £50pp for travel and subsistence; Evaluation supported by Graham Gibbs (in costs below); Evaluation meetings and materials, £1,000 pa in years 1 & 2; Community Building: travel to promote use and development of FASTECH (e.g., through creation of additional assets) with other HEIs and HE providers, £1,000 in each of the 3 years.
Direct Costs include: Student Support for student groups and reps from 13 degree programmes (13 x £500 in years 1 & 2); Programme Team Meetings: a subsistence allowance for each programme (13 x £100 in years 1 & 2); Inter-site Travel and Meetings for the FASTECH team, students and other university staff (£2,500 pa in years 1 & 2); Technology Training and Staff Development Workshops for teachers and students (12 institutional workshops @ £200 each, and 2 cross-institutional workshops @ £500 each) in each of years 1 & 2; Laptops for project developers (2 x £500, BSU & UW funded); External Advisors: Graham Gibbs, 6 days pa in years 1 & 2, plus 3 days in year 3 (£600pd); Harvey Woolf, 3 days pa in years 1 & 2 (£250pd); Shane Sutherland, 3 days pa in years 1 & 2 (£500pd); Zak Mensah, 3 days pa in years 1 & 2 (£450pd); JISC Travel and Expenses, £1,800 for travel to 15 JISC meetings (5 pa @ £120 each).

4.2 Quantification of Benefits
Please see Sections 1.6 'Project Benefits' and 1.7 'Summary of Outputs and Deliverables' above.

4.3 Project Budget
(Columns: August 2011 – July 2012; August 2012 – July 2013; August 2013 – July 2014 (Strand A only); Total)

Directly Incurred Staff (Post, Grade, No. Hours & % FTE)
Total Directly Incurred Staff (A): £0; £0; £0; £0

Directly Incurred Non-Staff
Travel and expenses: £1,900; £1,900; £600; £4,400
Hardware/software: £1,000; £0; £0; £1,000
Dissemination: £3,960; £4,500; £1,000; £9,460
Evaluation: £4,600; £4,600; £1,800; £11,000
Programme Team Development: £3,400; £3,400; £0; £6,800
Student Input: £6,500; £6,500; £0; £13,000
Consultancy: £3,600; £3,600; £0; £7,200
Total Directly Incurred Non-Staff (B): £24,960; £24,500; £3,400; £52,860
Directly Incurred Total (C = A+B): £24,960; £24,500; £3,400; £52,860

Directly Allocated
Staff: £85,241; £84,018; £25,230; £194,489
Estates: £9,544; £9,544; £2,307; £21,395
Other: £0; £0; £0; £0
Directly Allocated Total (D): £94,785; £93,562; £27,537; £215,884

Indirect Costs (E): £71,112; £71,112; £16,159; £158,382
Total Project Cost (C+D+E): £190,857; £189,174; £47,096; £427,127
Amount Requested from JISC: £95,221; £95,228; £0; £190,449
Institutional Contributions: £95,636; £93,946; £47,096; £236,678

Percentage Contributions over the life of the project: JISC 45%; Partners 55%; Total 100%
No. of FTEs used to calculate indirect and estates charges, and staff included: 4.0 FTEs – PH (0.5), JA (0.7), NB (0.4), TJ (0.8), NM (0.7), BL (0.4), YEH (0.5)

5. Previous Experience of the Project Team
5.1 Internal Staff
Project Leader: Prof. Paul Hyland NTF is Head of Learning and Teaching at BSU, with responsibility for the design and attainment of learning and teaching strategies, staff development and pedagogical research. He has been a Head of School; a leader of national/international research and development projects and activities such as CETL and FDTL; and Director of History in the HEA's Subject Centre; and he is a founding member of ISSoTL. His pedagogical publications include 'Learning from Feedback on Assessment' (2000).
Lead Researcher & Project Manager: Dr Tansy Jessop is Senior Teaching Fellow at UW and leader of the TESTA project. She has published on virtual learning, assessment, the minority ethnic student experience, and learning spaces, and has presented papers at 40 conferences and staff development events in the last five years.
E-learning Leader: Yassein El-Hakim is Director of Learning and Teaching at UW, and a member of SEDA's Executive Committee. He is co-leader of TESTA, and institutional lead for JISC's Co-genT project and the associated Benefits Realisation work. His research interests are in the ways that technology can be used to enhance learning, teaching and assessment, and in HE leadership and the change process, the field of study for his EdD at Southampton University.
Project Developers: Joelle Adams is Student Achievement Co-ordinator and leader of the Writing and Learning Centre at BSU. Her teaching and research is in the field of academic writing, learning support, and e-learning. Nicole McNab is a Research Officer at UW. She has project-managed Co-genT BR and worked on several other external projects in the last two years. She has led institutional development projects using PebblePad, Turnitin, mobile learning devices and e-readers, and has extensive experience of helping staff to use new technologies to enhance teaching and learning.
Learning-with-Technology Specialists: Dr Nicholas Breeze is the Learning and Technology Support and Development Officer for the School of Education at BSU.
His teaching and publications are in the introduction and use of learning technology in higher education. Dr Bex Lewis is Blended Learning Fellow at UW. Specialising in the innovative uses of social media and digital literacy, she works with staff and students throughout the university. She also runs a consultancy service and has a contract with the University of Durham.

5.2 External Advisors
Prof. Graham Gibbs NTF, formerly Director of the Oxford Learning Institute at the University of Oxford; his work includes over 20 books and 400 articles on teaching, learning and assessment in higher education. Dr Harvey Woolf, formerly Head of Academic Standards at the University of Wolverhampton, is a specialist in Quality Assurance and assessment; he has been an institutional auditor and is a founder-member of the Student Assessment and Classification Working Group. Shane Sutherland, Development Director and Co-founder of Pebble Learning, has wide experience of working in and with universities to improve curriculum design, assessment and reflective learning. Zak Mensah, E-learning Officer for JISC Digital Media, hosted at the University of Bristol, is a specialist in supporting the creation of high-quality digital media resources.