The Electronic Journal of Information Systems Evaluation provides critical perspectives on topics relevant to Information Systems Evaluation, with an emphasis on the organisational and management implications
Journal Issue
Volume 8 Issue 3, ECITE 2005 Special / Nov 2005  pp143‑230

Editor: Dan Remenyi


Evaluating Success in Post‑Merger IS Integration: A Case Study  pp143‑150

Maria Alaranta


An Evaluation Framework for the Acceptance of Web‑Based Aptitude Tests  pp151‑158

Michael Amberg, Sonja Fischer, Manuela Schröder



Aptitude tests assess a person's suitability for studying at a specific school or university, or for working within a specific company. Owing to recent advances in technology, web‑based solutions are increasingly used to implement aptitude tests. Web‑based aptitude tests are well suited to fairly standardised test methods administered to a large number of users. As web‑based aptitude tests become more common, high user acceptance is important, especially since test results tend to be taken more seriously. Furthermore, the design of the test should be helpful and support its use. In this context, the aim of our research is to provide a framework for evaluating the user acceptance of web‑based aptitude tests. The research method is based on an exemplary web‑based aptitude test and comprises the following steps. First, we used the Dynamic Acceptance Model for the Re‑evaluation of Technologies (DART) as a basis for the adoption of web‑based aptitude tests. DART is an instrument designed for analysing and evaluating the user acceptance of innovative technologies, products or services. Based on a literature review and expert interviews, we identified the most important acceptance indicators. Next, we evaluated the defined acceptance indicators in a survey of test persons who completed one selected web‑based aptitude test. Afterwards, we analysed the reliability and validity of the developed evaluation framework. The results show that a detailed analysis of the influencing factors is generally possible using DART, and that this approach helps to define a balanced set of measurable acceptance indicators for evaluating user acceptance. Finally, we describe lessons learned and the ongoing process of measuring the acceptance of web‑based aptitude tests.


Keywords: evaluation framework, web-based aptitude test, user acceptance, DART approach


Why IT Continues to Matter: Reflections on the Strategic Value of IT  pp159‑168

Frank Bannister, Dan Remenyi


IS Evaluation in Practice  pp169‑178

Ann Brown


Citizen‑Centric Approach and Healthcare Management Based on the XML Web Services  pp179‑186

Mayumi Hori, Masakazu Ohashi, Shotaro Suzuki


Conducting Interdisciplinary Research: Evaluation of the ePrescription Pilot Scheme in Finland  pp187‑196

Hannele Hypponen, Pirkko Nykanen, Lauri Salmivalli


The Adoption of New Application Development Tools by IT Professionals from the Viewpoint of Organisational Learning  pp197‑206

Torsti Rantapuska


Evaluation of Information Technology Productivity and Productive Efficiency in Australia  pp207‑210

Wesley Shu, Simon Poon


Designing a Process‑Oriented Framework for IT Performance Management Systems  pp211‑220

Sertac Son, Tim Weitzel, Francois Laurent


Impact of the Quality of ERP Implementations on Business Value  pp221‑230

Oana Velcu
