Further development of the online survey system ZOFAR

Start of the project: 15-Aug-2015

ZOFAR is the online survey system developed in the service area between 2009 and 2015; it is tailored to the specific requirements of social science research. It is a Java Enterprise application that we operate under an open source licence (AGPL v3+). The system's operation takes into account that survey respondents use a range of different technologies and ensures (e.g. by avoiding JavaScript as far as possible) that all respondents experience the online survey in a similar way, enabling the collection of valid data. It supports a number of standard question types (open questions as well as single or multiple choice, selection matrices, semantic differentials and hybrids of the latter). The survey system can also present new question types, such as the user-friendly recording of long-term biographies at a monthly level or the collection of complex data via several linked questions on a single page of the form. It offers numerous visual and content-based variations in online survey design (a range of log-in procedures, optional display of the survey length, integration of corporate design, integration of foreign languages, and implementation of complex filtering processes).
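ZOFAR's internal filtering mechanism is not shown here, but the idea behind "complex filtering processes" (skip logic) can be sketched in a few lines of Python. All names and the data structure below are hypothetical, chosen only to illustrate the technique.

```python
# Illustrative sketch of survey skip logic ("filtering"): each question may
# carry a condition that decides, based on earlier answers, whether it is
# shown. Structure and names are invented, not taken from ZOFAR itself.

def next_visible_question(questions, answers):
    """Return the id of the first question whose filter condition holds."""
    for q in questions:
        condition = q.get("show_if")
        if condition is None or condition(answers):
            return q["id"]
    return None  # end of questionnaire reached

questions = [
    {"id": "q1"},  # e.g. "Are you currently enrolled as a student?"
    # q2 is only asked of respondents who answered "yes" to q1
    {"id": "q2", "show_if": lambda a: a.get("q1") == "yes"},
    {"id": "q3"},
]
```

Given the answer `{"q1": "no"}`, the remaining questions after `q1` resolve to `q3`, silently skipping the filtered question `q2`.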

The server architecture behind ZOFAR is based on a scalable server cluster with sufficient redundancy to remain operational and secure in the event of a failure. The cluster includes Apache HTTP servers, Apache Tomcat application servers and PostgreSQL database servers (including continuous archiving and a hot standby for point-in-time recovery and additional data security). The configuration of the Apache HTTP servers enables several surveys to be carried out in parallel, and for larger surveys the load can be distributed among several Tomcat application servers.
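A front-end/back-end split of this kind is commonly wired up with Apache's mod_proxy_balancer module. The fragment below is a hedged sketch of such a configuration; the hostnames, ports and paths are invented and are not ZOFAR's actual setup.

```apache
# Hypothetical load-balancer fragment: distribute one survey's traffic
# across two Tomcat application servers over AJP.
<Proxy "balancer://survey-cluster">
    BalancerMember "ajp://tomcat1.example.org:8009" route=node1
    BalancerMember "ajp://tomcat2.example.org:8009" route=node2
    ProxySet stickysession=JSESSIONID
</Proxy>
ProxyPass        "/survey" "balancer://survey-cluster/survey"
ProxyPassReverse "/survey" "balancer://survey-cluster/survey"
```

Sticky sessions (here via the Tomcat `JSESSIONID` cookie) keep each respondent on the same application server for the duration of the interview, so session state need not be shared across nodes.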

The further development of the survey system is guided by the demands of higher education research and science studies. The plan is to expand the technical functionality and improve the accessibility of online surveys, particularly with regard to mobile end devices, and to cooperate closely with our research data centre so that surveys are documented efficiently and data is made available quickly.


The Online Research service area offers advice and support to the project teams in higher education research and science studies in planning and implementing web-based surveys. If desired, empirical social research projects can make use of our services from an early stage, beginning with the sampling procedure. We then collaborate closely with the project to produce a questionnaire template, from which the online questionnaire is programmed in the Extensible Markup Language (XML). The online survey is then stress-tested and trialled in a range of scenarios with simulated participants. Our service repertoire includes implementing a range of log-in procedures, sending invitations and reminders, preparing various response rate statistics, supporting data processing, checking the plausibility of survey data, and weighting the sample. After the fieldwork is complete, we hand over a comprehensive data package to the project teams containing the survey data as well as, among other things, a code book, marginal distributions, field statistics and the XML file of the online survey. We are planning additional supporting services. Our staff are also available to advise you on methodological aspects of online surveys. The service area would like to promote discourse and the further development of online research by offering supplementary workshops.
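ZOFAR's real XML schema is not documented here, so the fragment below uses a hypothetical, heavily simplified questionnaire format purely to illustrate the general approach of programming questionnaires from an XML template; it is parsed with Python's standard library.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified questionnaire fragment -- not ZOFAR's real schema.
TEMPLATE = """
<questionnaire>
  <page id="p1">
    <question id="q1" type="singleChoice">
      <text>Are you currently enrolled at a university?</text>
      <option value="1">yes</option>
      <option value="2">no</option>
    </question>
  </page>
</questionnaire>
"""

def list_questions(xml_text):
    """Return (question id, question type, number of options) per question."""
    root = ET.fromstring(xml_text)
    return [
        (q.get("id"), q.get("type"), len(q.findall("option")))
        for q in root.iter("question")
    ]
```

Because the questionnaire is plain XML, the same template that drives the survey can later be handed over in the data package and used to document exactly what each respondent was shown.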


In higher education research and science studies, our surveys are typically addressed to young, well-educated individuals such as people with a higher education admission qualification, students, graduates, doctoral candidates and researchers. Most of them use computers and the internet intensively, and are thus easy to reach by an online survey. This method of data collection has also become increasingly important at DZHW in recent years. The long-term survey series are being successively adapted to this data collection method or expanded to become multi-modal by using an online survey to supplement personal, written or telephone interviews.

This expanded access to our target population and data collection modes raises, in our view, a range of methodological issues relating to contacting respondents, survey participation, response behaviour and data equivalence. We are following this process closely and would like to answer questions such as: to what extent can we use post or email to encourage respondents to participate in a scientific study? How can survey series that have been carried out in writing for many years be adapted to online survey processes? How must online surveys be designed so that respondents can answer them regardless of the end device used (such as a computer, tablet or smartphone)?

Our research interest is therefore not limited to the survey data itself. During an online survey, so-called 'paradata' can be collected simultaneously. These provide information about the respondents' response behaviour and about the end device used (computer/notebook, tablet or smartphone) and its configuration (such as the browser and plug-ins). Paradata enable us to determine how far differences in response behaviour can be attributed to, for example, the end device or a specific browser. They also help us to ensure that the online survey is presented to all participants in a similar form, as far as possible.
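One elementary paradata use, classifying the end device from the HTTP User-Agent header, can be sketched as below. The keyword rules are a deliberately crude illustration, not ZOFAR's actual classifier; production systems use far richer heuristics.

```python
# Sketch of a paradata-based device classifier using simple User-Agent
# keyword rules. Illustrative only: real-world User-Agent strings are messy
# (e.g. Android tablets often omit both "Tablet" and "Mobile").

def classify_device(user_agent):
    """Map a User-Agent string to a coarse device category."""
    ua = user_agent.lower()
    if "ipad" in ua or "tablet" in ua:
        return "tablet"
    if "mobile" in ua or "iphone" in ua or "android" in ua:
        return "smartphone"
    return "computer"
```

Cross-tabulating such a device category against response times or break-off rates is one way to check whether, for example, matrix questions behave differently on smartphones than on desktop computers.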

Last but not least, we realise that the development of our survey system confronts us with new technical challenges such as increasing the system’s stability and expanding the load limits. Our research efforts therefore also address the analysis of load distribution in view of increasingly complex future online surveys involving several hundred thousand respondents.
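The load analyses mentioned above can be approached with stress tests in which many simulated respondents act concurrently. The sketch below is a minimal, self-contained stand-in for such a test: it exercises concurrent submissions against an in-memory store and checks that none are lost. A real stress test would instead drive HTTP requests against the survey cluster; all names here are hypothetical.

```python
# Minimal load-test sketch: many simulated respondents submit answers
# concurrently; afterwards we verify that every submission was recorded.
import threading
from concurrent.futures import ThreadPoolExecutor

class AnswerStore:
    """Thread-safe stand-in for the survey database."""
    def __init__(self):
        self._lock = threading.Lock()
        self._rows = []

    def submit(self, respondent_id, answer):
        with self._lock:
            self._rows.append((respondent_id, answer))

    def count(self):
        with self._lock:
            return len(self._rows)

def simulate(n_respondents=500, workers=50):
    """Run n_respondents concurrent submissions; return rows recorded."""
    store = AnswerStore()
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for i in range(n_respondents):
            pool.submit(store.submit, i, "yes")
    # Leaving the with-block waits for all submissions to finish.
    return store.count()
```

Scaling `n_respondents` toward the "several hundred thousand" mentioned above, while measuring latency, is the basic pattern behind such load-distribution analyses.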

Selected Projects


Does the way how demanding questions are presented affect respondent’s answers? Experimental evidence from recent mixed-device surveys.

Schulze, A., Euler, T., Schwabe, U., Sudheimer, S., & Fiedler, I. (2021, September).
Does the way how demanding questions are presented affect respondent’s answers? Experimental evidence from recent mixed-device surveys. Poster presented at the General Online Research Conference, DGOF, Berlin.

Contact person

Andrea Schulze
+49 511 450670-438


Viktor Dick, Martin Konstantin, Christian Meisner, Parmida Zarei