
Learning of Construction Technology Appraisal in Higher Education: A Case Study Enquiry

Pages 77-97 | Published online: 15 Dec 2015

Abstract

Technology appraisal is a fundamental task in building and construction management. It is essential that university students today, i.e. construction professionals tomorrow, are armed with a solid and critical understanding of construction technology. However, technology appraisal appears to have been overlooked in relevant previous research. This paper addresses this knowledge gap by investigating undergraduate learning of construction technology appraisal in building disciplines at the University of Plymouth. The study was carried out using a combination of coursework assessment, a questionnaire survey and group interviews. Results suggest a general lack of criticality and critical analysis in students’ technology appraisal, which was highlighted as a key learning issue in building and construction education. The students found critical appraisal of, and proposing alternatives to, the technology more difficult than describing and explaining it. The students also perceived their knowledge and skills to be superior to those they actually demonstrated in their work. A number of strategies were developed for improving undergraduate learning of construction technology appraisal. The critical approach needs to be formally and strategically integrated into building curriculum design. Research-informed learning should be promoted, and students’ analytical and quantitative skills and practical engagement enhanced. Better communications between, and greater engagement of, the learning stakeholders are crucial. The findings should contribute to improving higher education learning of construction technology in the built environment.

Introduction

Technology appraisal is a fundamental strategy for building and construction management and can have a significant impact on the overall performance of an organisation or project. Although the choice of construction solutions has been simplified as a process of managing the resources and addressing the design (Bryan, 2005), construction technology appraisal effectively involves a wide range of decision criteria (see Pan et al., 2008; Riley and Cotgrave, 2009). The increasingly stringent regulations on sustainability (e.g. Atkinson et al., 2009) push outwards the boundary of construction technology assessment criteria, which were conventionally associated with quality, cost and time only. Both the theory and practice of technological innovation in construction have also evolved, from the early, simpler ‘technology-push’ and ‘market-pull’ models, through integrated forms (see Jones and Saad, 2003), to more complex systems integration and networking approaches such as techno-economic (see e.g. Tsoutsos and Stamboulis, 2005) and socio-technical (see e.g. Rohracher, 2001; Geels et al., 2008) models. All these factors together present a case for emphasising and renewing knowledge of construction technology appraisal in the built environment. University students today will form the backbone of the workforce that delivers a sustainable society tomorrow. Ensuring that students are armed with a solid and critical understanding of construction technology therefore appears fundamental to higher education learning in the area. However, very few previous studies appear to focus specifically on critical appraisal of construction technology in the context of higher education.

According to UCAS statistics, the number of applicants for building-related degree courses, including construction management, building surveying and quantity surveying, starting in autumn 2009 declined by 7.6% compared to the previous year, set against the background of a 7.8% overall rise in full-time applicants to UK universities and colleges (Construction Manager, 2009). There is also a general deficiency of quantitative and mathematical skills in engineering areas resulting from a gradual decrease in entry levels (e.g. Newman-Ford et al., 2007; Overton, 2003). Together, these factors imply a potential student skills shortage risk to higher education learning of building and construction in respect of critical analysis and technology appraisal. It is important to address this risk given the significance of construction technology to delivering a modern built environment. Good knowledge of, and skills at, appraising construction technology might also add value to students’ future careers in both practice and academia.

This paper reports on a study aimed at investigating current learning performance in construction technology appraisal and developing strategies for improving student knowledge and skills at critical analysis and appraisal. The study was carried out using a combination of carefully designed coursework, a questionnaire survey and follow-up group interviews with second-year undergraduate students in building disciplines at the University of Plymouth. The paper assesses the current performance of students in appraising construction technology, investigates the associated problems and explores the underlying reasons and how they can be addressed. The discussion leads to the development of strategies for improvement and the conclusion of the paper.

Critical Construction Technology Appraisal

The area of construction technology assessment or appraisal has been widely studied. Drawing on three project case studies, Wells (1993) assessed the success of appropriate building technologies in meeting client expectations in terms of time, cost, quality and broader economic implications. Kadir et al. (2006) compared construction performance in relation to labour productivity, structural cost, crew size and cycle time between conventional and industrialised building systems by surveying 100 residential projects in Malaysia. Many more recent studies have attempted to address technology assessment and selection by applying multi-criteria decision analysis methods. Wong and Li (2006) developed a conceptual model for selecting intelligent building systems, Doukas et al. (2009) provided an intelligent decision support model for assessing energy-saving building measures, and Pan et al. (2008) presented a decision support matrix for selecting build systems for housing construction. However, there appears to be a knowledge gap between construction technology research and relevant textbooks. Also, very little previous research has focused on critical construction technology appraisal in higher education learning.

Nevertheless, critical thinking and critical analysis have been studied in the context of general higher education and academic writing. Cottrell (2005) defined critical thinking as a complex process of deliberation which involves a wide range of skills and attitudes. Bowell and Kemp (2005) regarded it as a tool for argument analysis and for thinking clearly and rationally. As such, it is seen as a desirable key skill for graduates to demonstrate. Some argue that in today’s workplace critical thinking abilities are needed more than ever before (Sendag and Odabasi, 2009). However, these descriptions offer an oversimplified explanation: criticality is more than just a skill or a tool. If it is to be a desirable attribute for graduates to take into their professional careers, then it is as much about attitude and disposition (Hilsdon and Bitzer, 2007). Hilsdon et al. (2006) developed a critical thinking model underpinned by a ‘functional-narrative’ approach, with its structure based upon description, analysis and evaluation, in order to help students in ‘deconstructing and reconstructing a given problem, topic or knowledge claim’ (Hilsdon and Bitzer, 2007, p.1194).

The concept of critical construction technology appraisal in this study expanded the three key stages of the critical thinking model developed by Hilsdon et al. (2006), i.e. description, analysis and evaluation, by adding a fourth stage: proposing alternative construction technologies. This fourth stage required the provision of arguments on why the alternatives would be better or more appropriate than the existing technology. Together, these four aspects were intended to encourage critical, in-depth student learning.

This four-stage critical construction technology appraisal model was integrated into a coursework assignment, which was part of the learning in a second-year undergraduate construction technology module. The coursework required the use of critical thinking and analysis, which was to be demonstrated in all four aspects, i.e. description, explanation, critical appraisal and proposing alternatives (70% of the assessment in total). These four aspects, coupled with the oral presentation and written report (15% each), were used for assessing the student group coursework performance (Table 1). Up to 5% extra marks were made available to encourage research efforts.

Table 1 Criteria used for assessing construction technology appraisal
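For illustration, this weighting scheme can be expressed as a simple calculation. The sketch below is hypothetical: the paper does not specify how the 70% is split across the four appraisal aspects, so an equal split (17.5% each) is assumed, and all marks shown are invented.

```python
# Hypothetical sketch of the coursework weighting described above.
# The per-aspect split of the 70% is an assumption (equal, 17.5% each).

WEIGHTS = {
    "description": 0.175,
    "explanation": 0.175,
    "critical_appraisal": 0.175,
    "proposals": 0.175,
    "presentation": 0.15,
    "report": 0.15,
}
BONUS_CAP = 5.0  # up to 5% extra marks for research efforts

def overall_mark(marks, research_bonus=0.0):
    """Weighted overall mark (0-100) plus a capped research bonus."""
    base = sum(WEIGHTS[c] * marks[c] for c in WEIGHTS)
    return min(100.0, base + min(research_bonus, BONUS_CAP))

# Example: an invented group scoring close to the cohort means reported later
group = {"description": 60, "explanation": 58, "critical_appraisal": 55,
         "proposals": 52, "presentation": 57, "report": 56}
print(round(overall_mark(group, research_bonus=3.0), 1))  # -> 59.3
```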

The students, in groups of four or five, worked together to critically appraise a type of construction technology for a real-world project. Each group used a different project, drawing on a previous coursework exercise. The projects covered a range of building types, including leisure, education, offices, supermarket, residential, hotel, shopping centre and sports stadium. All the construction technologies to be appraised were proposed by the student groups and moderated by the lecturer to achieve a wide coverage of the construction elements of a building. The technologies covered roof structures, façades and cladding, structural frames, staircases, internal walls, canopies and building services. This moderation of the use of projects and technologies enabled in-depth learning within individual groups and mutual sharing between the groups during the coursework presentations.

Methodology

To achieve the research aims, a combination of coursework assessment, a questionnaire survey and group interviews was undertaken. Sixty-six second-year undergraduate students participated in the study, drawn from four courses: Building Surveying and the Environment (30), Construction Management and the Environment (12), Environmental Construction Surveying (18), and Architectural Technology and the Environment (6).

Coursework assessment was utilised as the first step to examine current student performance and investigate the problems. This step is important, as assessment has been regarded as the most critical influence on what and how well students learn (Gibbs and Simpson, 2002), whilst it is also considered among the most poorly understood and least effective aspects of higher education practice (Quality Assurance Agency for Higher Education, 2006). All 15 student groups (Table 2) were assessed against the clearly defined criteria (Table 1). A paper-based questionnaire survey was carried out following the coursework assessment. The survey examined the students’ perspectives on construction technology appraisal, explored the issues and the underlying reasons, and sought students’ recommendations for improvement. Given that the unit of study was a cohort of undergraduate students registered on the construction technology module, ‘convenience sampling’ (Bryman, 2004) was used. All 66 students were invited to participate in the survey on a voluntary basis, with ethics duly considered. The survey was carried out in two identical sessions to enable more students to participate, which together yielded a response rate of 92.4% (Table 2), high enough to justify the survey (Oppenheim, 1992).

Table 2 Participants in the study

An initial survey questionnaire was developed through the literature review of construction technology appraisal (e.g. Bryan, 2005; Pan et al., 2008; Riley and Cotgrave, 2009), drawing on the results of the coursework assessment. The questionnaire comprised a mix of qualitative and quantitative questions, with a methodical use of rating scales, Likert scales and open-ended questions (see Oppenheim, 1992; Bryman, 2004), in order to establish student knowledge and skills profiles, reveal the issues of concern and invite recommendations. The questionnaire was refined through discussions with learning support colleagues, and its validity and reliability were verified through a pilot survey of two colleagues who had previously taught the same module.

Follow-up semi-structured interviews were undertaken with the student groups to verify, and explore in more depth, the results obtained from the coursework assessment and questionnaire survey. All 15 student groups were invited, and 10 attended the interviews (Table 2). A list of open-ended questions arranged in a reasonably logical order (Morse and Richards, 2002) was refined through discussion with colleagues. These questions were provided to the interviewees about a week in advance, which gave them the comfort of knowing the questions beforehand and reasonable time to consider the issues. The questions were used in the interviews as a guide (Bryman, 2004). The interviews lasted between half an hour and an hour each. Notes were taken which, although not verbatim, added significant rich data to the questionnaire survey analysis. Despite some necessary involvement in discussion and clarification, the interviewer’s interference with students’ responses was minimised.

For quantitative data analysis, Microsoft Excel was used for storing, analysing and illustrating the data collected from the coursework assessment and questionnaire survey. SPSS 16 was also used for statistical analysis where necessary, involving a combined use of univariate and bivariate analysis methods (Bryman, 2004). Three measures of central tendency, i.e. the mean, median and mode, together with the standard deviation for dispersion, were used to avoid extreme scores unduly affecting the results (Christensen and Stoup, 1991). For bivariate analysis, according to the nature of the variables studied, Pearson’s r and Spearman’s rho (Bryman, 2004) were used to represent correlations between variables. For qualitative data analysis, the content analysis method (see Patton, 2002) was used, reflected in the process of establishing codes, identifying themes and generating patterns from the data.
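As an illustration of the univariate measures used (the study itself used Excel and SPSS 16, not Python), the minimal sketch below computes the three measures of central tendency and the sample standard deviation; the 15 group marks are invented placeholders, not the study data.

```python
# Minimal sketch of the univariate analysis described above.
# The 15 group marks below are invented placeholders.
from collections import Counter
import numpy as np

marks = np.array([35, 42, 48, 51, 54, 56, 56, 57, 58, 60, 62, 64, 66, 68, 71])

mean = marks.mean()
median = np.median(marks)
mode = Counter(marks.tolist()).most_common(1)[0][0]  # most frequent mark
sd = marks.std(ddof=1)  # sample standard deviation

print(f"mean={mean:.1f}, median={median:.0f}, mode={mode}, SD={sd:.1f}")
```

Reporting the median and mode alongside the mean, as the paper does, guards against a few extreme group marks distorting the picture of typical performance.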

Results and Analysis

Coursework assessment

The coursework assessment suggested that the student groups’ performance in appraising construction technology varied considerably (Figure 1). The overall marks ranged from 35 to 71 and averaged 56, with a standard deviation (SD) of 11.1.

Figure 1 Coursework marks of 15 student groups

At the individual criterion level, the mean marks for ‘proposing alternative technologies’ (52) and ‘critical appraisal’ (55) were lower than those for ‘description’ (60) and ‘explanation’ (58), which suggests that the students were better at describing and explaining construction technologies than at critiquing and appraising them. The criterion ‘proposals’ was associated with the highest SD (14.7), highlighting more inconsistent student performance in proposing alternative technologies than in description (11.4), explanation (13.2) and critical appraisal (12.5).

All the student group marks against the first four criteria (C1–C4) were strongly positively correlated with each other, with r values no less than 0.725, statistically significant at the p < 0.01 level (2-tailed) (Table 3). These statistics suggest that student groups which received better marks for ‘critical appraisal’ and ‘proposals’ also performed better at ‘description’ and ‘explanation’, and vice versa. This significant strong correlation also existed when the criteria ‘presentation’ (C5) and ‘report’ (C6) were taken into consideration (Table 3). However, the correlations between ‘research efforts’ (C7) and the other criteria were not consistent. The statistics suggest a weak, statistically non-significant, relationship between student research efforts and performance in the description and explanation of technology. However, student research efforts were modestly related to performance in ‘critical appraisal’ and ‘proposals’ (r = 0.56, p < 0.05), and strongly related to report writing (r = 0.83, p < 0.01) (Table 3). These statistics imply that the research efforts demonstrated in the coursework might not directly help with students’ description and explanation of construction technologies, but more likely enabled their critical thinking and proposal of alternatives, and much more clearly supported their academic reporting.

Table 3 Correlation between assessment marks against individual criterion
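A sketch of this kind of criterion-level analysis is given below. The 15 × 7 mark matrix is randomly generated rather than the study’s data, and the shared “ability” term is an assumption made purely so the placeholder data exhibit correlations; the sketch shows how pairwise Pearson’s r values with two-tailed p-values, flagged for significance as in Table 3, would be computed.

```python
# Hypothetical sketch of the pairwise Pearson correlation analysis
# behind Table 3. The marks are randomly generated placeholders.
import numpy as np
from scipy import stats

criteria = ["C1 description", "C2 explanation", "C3 critical appraisal",
            "C4 proposals", "C5 presentation", "C6 report", "C7 research"]

rng = np.random.default_rng(42)
ability = rng.normal(56, 11, size=(15, 1))  # assumed shared group ability
marks = np.clip(ability + rng.normal(0, 6, size=(15, 7)), 0, 100)

for i in range(len(criteria)):
    for j in range(i + 1, len(criteria)):
        r, p = stats.pearsonr(marks[:, i], marks[:, j])  # two-tailed p
        flag = "**" if p < 0.01 else ("*" if p < 0.05 else "")
        print(f"{criteria[i]} vs {criteria[j]}: r={r:.2f}{flag} (p={p:.3f})")
```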

The coursework assessment suggested that there were a number of common weaknesses in student learning of appraising construction technology.

  • Lack of quantitative support. Most of the group submissions provided reasonable qualitative arguments. However, most lacked quantitative support for those arguments.

  • Lack of comparative analysis. In proposing alternative construction technologies, all the group submissions provided information on individual technologies. However, there was a lack of comparative analysis between them.

  • Inconsistent understanding of ‘sustainability’. Sustainability was an important concept embedded in the coursework assignment. However, there was a generally superficial understanding of the concept, and the criteria used for measuring sustainability of construction technologies were inconsistent.

  • Lack of criticality. There was a pattern of ‘copy and paste’ in the coursework reports. Few groups justified the selection of the technology. The criteria used by the groups for ‘critical’ appraisal varied. Although this might be attributed to the selection of different technologies, the use of criteria should have reflected the learning from the lectures on the use of performance assessment criteria for construction technologies. There was also a lack of demonstration of research efforts in both evaluating existing technology and proposing alternatives.

  • Little evidence of engaging stakeholders other than students themselves. Most groups did not present evidence of engaging or interacting with the other stakeholders of the technology, e.g. builders, owners, designers or suppliers.

  • Improper referencing and academic writing. Although some group reports were written with a clear structure and appropriate references, there was generally a lack of proper referencing and of clear, justified argument building.

Questionnaire survey and group interviews

Student satisfaction and skills

More than half (55.7%) of the respondents were satisfied with their learning and performance in the construction technology appraisal coursework. Nearly a quarter (23%) took a neutral attitude and 13.1% were unsatisfied. Only a small number of students were either very satisfied (6.6%) or very unsatisfied (1.6%) (Figure 2).

Figure 2 Students’ satisfaction with their construction technology appraisal

More than half (52.5%) of the respondents took a neutral view of their current skills at appraising construction technology. Just over a third (34.4%) considered their skills good, whilst 13.1% perceived their skills as poor (Figure 3). No students graded themselves as very good or very poor.

Figure 3 Students’ self-assessment of skills at appraising construction technology

The measured student satisfaction with their construction technology appraisal and their self-assessment of skills were modestly correlated (Spearman’s rho = 0.384, statistically significant at the p < 0.01 level, 2-tailed). This correlation suggests that the more satisfied the students were with their performance in construction technology appraisal, the better they rated their critical appraisal skills, and vice versa.
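Because both satisfaction and self-assessed skill are ordinal five-point Likert responses, rank correlation rather than Pearson’s r is the appropriate measure here. The sketch below shows how such a Spearman’s rho would be computed; the responses are invented, not the survey data.

```python
# Hedged sketch of the satisfaction-vs-skills rank correlation.
# Both variables are 5-point Likert codes; all responses are invented.
from scipy import stats

# 1 = very unsatisfied / very poor ... 5 = very satisfied / very good
satisfaction = [4, 4, 3, 4, 2, 5, 3, 4, 3, 4, 2, 3, 4, 4, 3, 5, 3, 4, 2, 3]
skills       = [3, 4, 3, 3, 2, 4, 3, 4, 3, 3, 3, 3, 4, 3, 3, 4, 3, 3, 2, 3]

rho, p = stats.spearmanr(satisfaction, skills)  # two-tailed p-value
print(f"Spearman's rho = {rho:.3f}, two-tailed p = {p:.4f}")
```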

Students’ perspectives on critical construction technology appraisal

Drawing on the common student learning weaknesses reflected in the coursework, six statements about construction technology appraisal were provided in order to measure the students’ perspectives. Likert scales were used, from 1 for ‘strongly disagree’ to 5 for ‘strongly agree’ (Table 4).

Table 4 Students’ perspectives on appraising construction technology

The patterns of the frequencies of the measured student perspectives on the six statements were, in general, strongly correlated with each other (Spearman’s rho close to +/−1 in most cases), although some of these correlations were not statistically significant (p > 0.05) (Table 5).

Table 5 Non-parametric correlation between the frequencies of student cohort perspectives on appraising construction technology

These statistics suggest a general consistency between the perspectives of the students, as a cohort, on the statements about critical construction technology appraisal, and demonstrate a reasonable level of cohort knowledge consistent with the logic of the critical approach and proper academic writing.

When the responses to the six statements were analysed for the students as individuals, the results show only a modest correlation between the measurements of individual students’ perspectives (Table 6). These statistics suggest generally poor consistency between the perspectives of the students, as individuals, on critical construction technology appraisal.

Table 6 Non-parametric correlation between individual student perspectives on appraising construction technology

The statistical analysis above reveals two important implications. Firstly, the students, as individuals, demonstrated inconsistent understanding of critical appraisal in the context of construction technology learning: virtually no significant correlation was found in this study between the assessments of student satisfaction and skills and the measurements of student perspectives on the statements about construction technology appraisal. Secondly, the students, as a cohort, demonstrated a modest correlation between the assessments of their satisfaction and skills and the measurements of their perspectives on construction technology appraisal. The level of statistical significance associated with the second implication was higher than that associated with the first. This analysis supports the comments from the coursework assessment on student learning weaknesses. It also suggests that the students’ knowledge of critical appraisal as reflected in their survey responses was superior to that demonstrated in their coursework practice.

The students’ perspectives on the criteria for appraising construction technology were also measured using Likert scales (Figure 4).

Figure 4 Students’ perspectives on the criteria for appraising construction technology

1 ‘not important’, 2 ‘somewhat important’, 3 ‘neutral’, 4 ‘important’, 5 ‘very important’

The average importance ratings for all the criteria were above 3 (neutral), most rounding to 4 (important). This pattern of ratings reveals that the majority of the respondents considered most of the decision criteria important for appraising construction technology, implying student awareness of the importance of wide-ranging criteria. In contrast, however, the coursework assessment suggests a general lack of both breadth and depth in the student groups’ use of these criteria.

Issues affecting learning of appraising construction technology

In the questionnaire survey, most of the students raised issues which had affected their learning and coursework performance; over 90 issues were raised in total. These issues were confirmed and explained in the group interviews, and are summarised below:

  • Lack of understanding of the critical approach. This was reflected by a general lack of understanding of critical thinking and critical analysis, a relatively late introduction of the critical approach to the coursework and absence of previous formal teaching of the approach.

  • Difficulty with obtaining relevant information. This was reflected by a lack of information on the construction detail/technology chosen, a paucity of reputable references on technology appraisal, difficulty with finding alternative construction technologies, a lack of library resources or search skills to find the right information, no previous experience of researching information, and failure to gain access to the detail/technology for study.

  • Lack of briefing and/or understanding of the coursework. This was reflected by a lack of coursework briefing in lectures, insufficiently clear coursework briefing documents and failure to arrange meetings/tutorials with lecturers.

  • Poor or difficult individual learning management. This was reflected by difficulty with concentrating on the learning goals; juggling coursework with social, sports and personal activities and other learning commitments; software compatibility issues; weakness in reading and taking notes at lectures; malfunctioning PowerPoint presentations; a lack of research into, and understanding of, the technical aspects; and a paucity of self-directed study/reading.

  • Poor or insufficient group management. This was reflected by a lack of group meetings and discussions; improper understanding or interpretation of the coursework brief; individuals affecting overall group marks; difficulty with managing other coursework with similar hand-in dates (or, as claimed, too many hand-in dates together) and with integrating group members’ input into a concise report; insufficient background reading; and failure to respond to the marking criteria.

  • Poor or inconsistent presentation and writing skills. This was reflected by a lack of demonstration of in-depth understanding, poor role-play and presentation skills and improper academic writing.

In the interviews the students explored the issues further, commenting firstly that the lack of critical analysis was a cross-module problem: there was no formal lecture on critical thinking and analysis in the building programme. Secondly, group management was particularly prominent in group work, with arguments made for both grouping strategies, i.e. allocated groups and student self-grouping; nevertheless, a leading or coordinating role in the group was claimed to be crucial. Thirdly, greater student motivation would have helped communication with lecturers; some groups identified uncertainties and queries regarding their work but did not raise the perceived difficulties with the lecturers.

Recommendations for improvements

Most students provided recommendations for improvement in the questionnaires, yielding over 150 items. These recommendations were discussed and verified in the interviews. They are summarised in relation to the learning stakeholders: students, lecturers, university/library and external/industry.

The students recognised that they needed to improve their understanding of construction technologies, the concept of sustainability and the criteria for technology appraisal. Critical thinking and analytical skills should be developed for proposing and assessing alternative technological solutions. Skills at searching for relevant information and at using appraisal methodologies were also considered in need of enhancement. Furthermore, management skills were regarded as a significant area for improvement, targeting the effective organisation of both individual and group work and communications between group members and with lecturers. Finally, enhanced skills at presentation, academic writing and referencing were perceived by the students as particularly important to improving their learning performance.

Many of the students’ suggestions related to lecturers. First of all, lecturers were asked to further facilitate student technology learning, e.g. by using more actual examples or case studies of technology, explaining construction technology in class in more detail, demonstrating the use of tools and techniques for selecting and assessing building technology, and providing more practical teaching and learning, such as videos, site visits and models, to relate theory to practice more effectively. Secondly, the students suggested that lecturers provide the critical thinking lecture before the construction technology coursework or in Year 1, integrate critical appraisal into the lectures to a greater extent, and arrange more interactive sessions in class to allow practice in comparing technologies. Thirdly, lecturers were expected to provide more support to student learning, e.g. by uploading more lecture notes and alternative learning sources to the intranet for further learning, directing information searches, and advising on books or information sources early. The students clearly expressed an aspiration for more one-to-one tutorials and drop-in sessions, learning by analysing good examples of coursework, and formative feedback before final submission. Fourthly, some students wished that the coursework could be broken down into smaller tasks and that hand-in dates for other assignments could be better coordinated. Fifthly, a greater involvement of lecturers in the management of student individual and group work was suggested. This included further raising student awareness of the importance of the work, clearly specifying the role-play rules, engaging more to help with communications between student group members, and utilising peer review to ensure more balanced contributions by group members and more accurate marks for individuals.

A number of recommendations were also made in respect of the learning facilities provided by the university and library. It was suggested that each student on the course be allocated a module and/or personal tutor. Extra time should be allowed for coursework, as it is for exams, for students with dyslexia. More well-equipped group study rooms were needed to enable group work, and information on the buildings within the university should be made available. The students also commented that more library resources and greater student support in the library should be provided.

Finally, a range of recommendations concerned industry and bodies external to the university. The students reckoned that demonstration information and detailed performance data on construction technology from trade associations and professional bodies would be very useful. Greater engagement with industry to improve critical learning of technology was suggested by most of the students: more guest lectures by industry speakers should help students develop a better industry perspective, more project case studies should enable practical learning, and more site visits would facilitate illustrative learning of specific construction technologies.

Discussion

The three components of this study, i.e. the coursework assessment, questionnaire survey and group interviews, together contributed a reasonably clear picture of the students’ perspectives on, and coursework practice of, appraising construction technology. There was a general lack of criticality and critical analysis, which was highlighted as a key learning issue by the students in both the survey and the interview discussions. This result echoes the claim by Cottrell (2005) that many people find it difficult to order their thoughts in a logical, consistent and reasoned way, while many others who have the potential to develop more effective critical thinking are prevented from doing so for a variety of reasons apart from a lack of ability.

The detailed analysis of the students’ coursework performance suggests that they found critical appraisal and proposing alternatives more difficult than describing and explaining construction technology, which was also verified in the survey and interviews. This finding warns the teaching and learning stakeholders of a surface learning approach taken by students, associated with a lack of scholarly critiquing and know-how skills. The prominence of building scholarship in higher education has long been emphasised (see e.g. Jenkins, 2004), as it should help students develop practical thinking and in-depth learning. The paucity of criticality in appraising construction technology revealed in this paper reiterates the need for universities and lecturers to help enhance the scholarship of teaching and learning by making it ‘rigorous, visible and accountable’ (Cotton, 2006, p.12). Criticality and scholarship are particularly important to improving the effectiveness of teaching given the increasing focus on research-informed teaching in higher education (see e.g. Jenkins, 2004; Healey and Jenkins, 2008; Roberts, 2007). This approach has also been promoted at the University of Plymouth (2009) via its Teaching and Learning Strategy 2009–2012. However, despite the general promotion, practical guidance is yet to be developed for building disciplines. An effective application of this approach will need to take into account many issues, e.g. discipline and module specifics and the influence of industry and professional bodies, for which this paper has hopefully laid a foundation for further study.

The lack of critical analysis in students’ reports appeared partly attributable to a paucity of quantitative support for their arguments. Coupled with the less critical and scholarly approach was a weakness in, or unwillingness of, students to substantiate their appraisals with ‘hard’ data. This is believed to be related to the general deficiency of quantitative and mathematical skills in engineering areas resulting from a gradual decrease in entry levels (e.g. Newman-Ford et al., 2007; Overton, 2003). Attempts to address this skills shortage exist (e.g. Perkin et al., 2007), but how to improve students’ mathematical, and therefore quantitative analytical, skills in building is under-researched.

There was a discrepancy between the knowledge and skills that the students perceived themselves to have and what was effectively reflected in their coursework performance; the students perceived their knowledge and skills to be superior to those they demonstrated in their coursework. This seemed attributable to two possible reasons. One was that the students did not really understand in depth the critical approach and its use in construction technology appraisal. The other was that the students’ knowledge of critical thinking had improved by the time they responded to the survey and interviews, owing to their attendance at the lecture on critical thinking and their participation in this study. Although the reasons are uncertain, the improved student perspectives on critical technology appraisal reveal the significance of learning support and of learner engagement and participation in ensuring effective learning. Previous studies (e.g. Biggs, 2003; Race, 2001) have also claimed positive correlations between student engagement and learning.

The discussion of the results suggests a number of strategies (Table 7) which should help address the issues affecting student learning of appraising construction technology and implement the recommendations established in this paper.

Table 7 Key strategies for improving learning of construction technology appraisal

These strategies address the student weaknesses and issues with learning construction technology. They also reflect the key skill requirements for sustainable construction education summarised by Murray et al. (2006): technical/discipline-specific skills, e.g. collaborative working, presentation, management and leadership, teamwork, communication and data analysis, and generic skills, e.g. critical thinking, organisation, decision-making, partnership working and relationship skills. The strategies suggest that, to achieve critical and deep student learning of construction technology, a wide range of stakeholders should be engaged effectively, including not only lecturers and learners but also the university, industry, professional institutions and trade associations. Despite the grouping of the recommendations and strategies in this paper, they are in fact interrelated and involve multiple stakeholders. Rather than considering them in isolation, a holistic approach is needed, i.e. forming an effective learning stakeholder party. Such a party could take a form similar to the Accelerating Change in Built Environment Education (ACBEE) and Construction Knowledge Exchange (CKE) initiatives (Heesom et al., 2008), and would enable a wider take-up of critical thinking and deep learning in relation to construction technology in higher education. Professional accreditation of courses is certainly a useful mechanism for involving professional bodies in improving the practicality of building and construction higher education, and industry sponsorship is another effective means of engaging industry. However, previous research (e.g. Griffiths, 2004) argued that the ‘content coverage’ mentality generated by accreditation requirements seems to inhibit more research-led and research-oriented teaching, and warned that there is a tendency in built environment education to focus on imparting professional skills rather than on more critical inquiry into the process of knowledge creation. Project-/problem-based learning (see e.g. University of Nottingham, 2003) has become fairly common practice in building and construction education, promoting criticality, practicality and industry involvement in technology learning. It is therefore important to adapt the strategies developed in this paper to specific teaching and learning contexts in order to apply them effectively. Better communications between, and greater engagement of, the learning stakeholders are crucial.

Conclusions

This paper has investigated the perspectives and coursework practices of undergraduate students in relation to appraising construction technology. The results suggest a general lack of criticality and critical analysis in students’ technology appraisal. The students found critical appraisal of, and proposing alternatives to, the technology more difficult than describing and explaining it. However, the students perceived their knowledge and skills to be superior to those they actually demonstrated in their work. Significant issues were raised by the students, which were perceived to have affected their coursework performance and learning. These issues were centred on insufficient individual learning and group management, lack of critical thinking and analysis, paucity of presentation and academic writing skills and weak capability in research and practical engagement.

A wide range of recommendations was provided accordingly to help improve student learning, concerning not only students and lecturers but also the university and the wider industry context. The formation of a learning stakeholder party is recommended, as better communications between, and greater engagement of, the learning stakeholders are critical. Lecturers should review relevant teaching curricula and coursework activities, with the purpose of formally integrating the critical approach into the coursework brief, induction and assessment. The university should review the balance between the effectiveness of teaching and learning and other higher education commitments within specific disciplines. Students should engage proactively with their learning and manage resources effectively. These strategies should provide guidance for effective application of the recommendations in the paper; their implementation will, however, need to be evaluated in future research to enable continuous improvement of the effectiveness of teaching and the quality of learning in the area.

The paper also contributes to the debate on the effectiveness of using assessment to improve student learning by presenting a worked example of analysing and elucidating assessments and communicating feedback to the learner in a structured and integrated manner. The quantitative results may not fully satisfy quantitative sampling principles, given that the unit of analysis was a single institution. However, the study was in-depth, with a careful research design combining assessment, survey and interviews; its findings should therefore support a logic of replication and a strategic integration of the critical approach into the teaching curricula of building and construction disciplines in the built environment. Future research may explore construction technology learning at different institutions, which should contribute to further debate on construction technology appraisal in the wider community.

References

  • Atkinson C., Yates A. & Wyatt M. (2009). Sustainability in the built environment: An introduction to its definition and measurement. Watford: BRE.
  • Biggs J. (2003). Teaching for quality learning at university: What the student does. 2nd ed. Buckingham: SRHE & Open University Press.
  • Bowell T. & Kemp G. (2005). Critical thinking: A concise guide. 2nd ed. London: Routledge.
  • Bryan T. (2005). Construction technology: Analysis and choice. Oxford: Blackwell.
  • Bryman A. (2004). Social research methods. 2nd ed. Oxford: Oxford University Press.
  • Christensen L. B. & Stoup C. M. (1991). Introduction to statistics for the social and behavioural sciences. 2nd ed. Pacific Grove, CA: Brooks/Cole Publishing Company.
  • Construction Manager. (2009). Students start to shun building. Construction Manager Magazine, March, 6-7.
  • Cotton D. R. E. (2006). Using an institutional audit to enhance the scholarship of learning and teaching: A UK case study. MountainRise, 3 (2) URL: http://mountainrise.wcu.edu/index.php/MtnRise/issue/view/8
  • Cottrell S. (2005). Critical thinking skills: Developing effective analysis and argument. Basingstoke: Palgrave Macmillan.
  • Doukas H., Nychtis C. & Psarras J. (2009). Assessing energy-saving measures in buildings through an intelligent decision support model. Building and Environment, 44 (2), 290-298.
  • Geels F. W., Hekkert M. P. & Jacobsson S. (2008). The dynamics of sustainable innovation journeys. Technology Analysis & Strategic Management, 20 (5), 521-536.
  • Gibbs G. & Simpson C. (2002). How assessment influences student learning – a literature review. Milton Keynes: Centre for Higher Education Practice, Open University.
  • Griffiths R. (2004). Knowledge production and the research-teaching nexus: The case of the built environment disciplines. Studies in Higher Education, 29 (6), 709-726.
  • Healey M. & Jenkins A. (2008). Developing students as researchers. University College Union Magazine, October, 17-19.
  • Heesom D., Olomolaiye P., Felton A., Franklin R. & Oraifige A. (2008). Fostering deeper engagement between industry and higher education: Towards a Construction Knowledge Exchange approach. Journal for Education in the Built Environment, 3 (2), 33-45.
  • Hilsdon J., Sentito E., Magne P. & Crust G. (2006). Ideas, arguments and critical thinking, Learning Development Study Guide 8. Plymouth: University of Plymouth.
  • Hilsdon J. & Bitzer E. M. (2007). To become an asker of questions. A ‘functional-narrative’ model to assist students in preparing post-graduate research proposals. South African Journal of Higher Education, 21 (8), 1194-1206.
  • Jenkins A. (2004). A guide to the research evidence on teaching-research relations. Heslington: The Higher Education Academy.
  • Jones M. & Saad M. (2003). Managing innovation in construction. London: Thomas Telford.
  • Kadir M. R. A., Lee W. P., Jaafar M. S., Sapuan S. M. & Ali A. A. A. (2006). Construction performance comparison between conventional and industrialised building systems in Malaysia. Structural Survey, 24 (5), 412-424.
  • Morse J. M. & Richards L. (2002). Readme first for a user’s guide to qualitative methods. Thousand Oaks: Sage Publications.
  • Murray P., Goodhew S. & Turpin-Brooks S. (2006). Environmental sustainability: Sustainable construction education – a UK case study. International Journal of Environmental, Cultural, Economic and Social Sustainability, 2 (5), 9-22.
  • Newman-Ford L., Lloyd S. & Thomas S. (2007). Evaluating the performance of engineering undergraduates who entered without A-level mathematics via a specialist six-week ‘bridging technology’ programme. Engineering Education, 2 (2), 33-43.
  • Oppenheim A. N. (1992). Questionnaire design, interviewing and attitude measurement. (New ed.). London: Continuum.
  • Overton T. (2003). Key aspects of teaching and learning in experimental sciences and engineering. In: Fry H., Ketteridge S. & Marshall S. (Eds.). A handbook for teaching and learning in higher education: Enhancing academic practice. 2nd ed. London: Kogan Page, pp. 255-277.
  • Pan W., Gibb A. G. F. & Dainty A. R. J. (2008). A decision support matrix for build system selection in housing construction. International Journal for Housing Science and Its Applications, 32 (1), 61-79.
  • Patton M. Q. (2002). Qualitative research and evaluation methods. 3rd ed. Thousand Oaks: Sage.
  • Perkin G., Peli G. & Croft T. (2007). The Mathematics Learning Support Centre at Loughborough University: Staff and student perceptions of mathematical difficulties. Engineering Education, 2 (1), 47-58.
  • Quality Assurance Agency for Higher Education. (2006). Code of practice for the assurance of academic quality and standards in higher education. Section 6: Assessment of students. Mansfield: The Quality Assurance Agency for Higher Education (QAA).
  • Race P. (2001). The lecturer’s toolkit: A practical guide to learning, teaching and assessment. 2nd ed. London: Kogan Page.
  • Riley M. & Cotgrave A. (2009). Construction technology 2: Industrial and commercial buildings. 2nd ed. Basingstoke: Palgrave Macmillan.
  • Roberts A. (2007). The link between research and teaching in architecture. Journal for Education in the Built Environment, 2 (2), 3-20.
  • Rohracher H. (2001). Managing the technological transition to sustainable construction of buildings: A socio-technical perspective. Technology Analysis & Strategic Management, 13 (1), 137-150.
  • Sendag S. & Odabasi H. F. (2009). Effects of an online problem based learning course on content knowledge acquisition and critical thinking skills. Computers and Education, 53, 132-141.
  • Tsoutsos T. D. & Stamboulis Y. A. (2005). The sustainable diffusion of renewable energy technologies as an example of an innovation-focused policy. Technovation, 25 (7), 753-761.
  • University of Nottingham. (2003). A guide to learning engineering through projects. Nottingham: University of Nottingham.
  • University of Plymouth. (2009). Teaching and learning strategy 2009–2012. Plymouth: University of Plymouth.
  • Wells J. (1993). Appropriate building technologies: An appraisal based on case studies of building projects in Senegal and Kenya. Construction Management and Economics, 11 (3), 203-216.
  • Wong J. & Li H. (2006). Development of a conceptual model for the selection of intelligent building systems. Building and Environment, 41, 1106-1123.
