Purpose: Through the study of visualizations, virtual worlds, and information exchange, this research reveals the complex connections between technology and the work of design and construction. The authors apply a socio-technical view of technology and examine the ramifications this view has for the successful use of technology in design and construction.
Approach: This discussion paper reviews over a decade of research connecting three streams of work on architecture, engineering, and construction (AEC) teams as they grappled with adapting their practices to new technologies and the opportunities these technologies promised.
Findings: First, from studies of design and construction practices with Building Information Modeling and energy modeling, the authors show that, given the constructed nature of models and the loose coupling of project teams, team organizational practices need to mirror modeling requirements. Second, looking at distributed teams whose interaction is mediated by technology, the authors argue that virtual world visualizations enhance discovery, while distributed AEC teams also need more traditional forms of 2D abstraction, sketching, and gesture to support integrated design dialogue. Finally, in research on information exchange, the authors found that models and data have their own logic and structure and as such require creativity and ingenuity to exchange data across systems. Taken together, these streams of research suggest that process innovation is brought about by people developing new practices.
Originality: In this paper the authors argue that technology alone does not change practice. People who modify practices with and through technology create process innovation.
- Dossick, Carrie, Laura Osburn & Gina Neff. “Innovation Through Practice: The Messy Work of Making Technology Useful for Architecture, Engineering and Construction Teams.” Engineering, Construction and Architectural Management, doi: 10.1108/ECAM-12-2017-0272.
Machine learning and artificial intelligence make extraordinary discoveries possible, and autonomous systems are being rolled out in vital business and social settings including healthcare, policing, and education. Yet assumptions about the neutrality and objectivity of data may encode serious social and political bias into the results. High-profile examples show how these systems already incorporate human flaws, biases, and assumptions into their design, especially about women and their role in society. In this talk, Professor Gina Neff will show that explicitly thinking about gender in AI will help designers build AI systems that help humans make better, and fairer, decisions.
Professor Gina Neff, Oxford Internet Institute, talks about the importance of social scientists getting involved with big data and computational methods.