Questioning the status quo, and thinking about how we could do things differently, and more inclusively, is what drives me.
My job is to challenge the old ways of doing things and encourage colleagues to ask a few simple but important questions: do things have to be this way? Could we do them better?
I have first-hand experience that design doesn’t always account for the needs of all its end users, from changing my travel patterns to avoid dark routes at night to struggling to place a bag above the seat on a train.
My latest project - a study on the gender data gap and its impact on the built environment - explores why this is the case, and reveals just how easy it can be to slightly change our approach, for the benefit of everyone.
In the study, we consider how design standards and data models have been based on ‘reference man’ or, more recently, ‘reference person’ for over 45 years. Perhaps unsurprisingly, this ‘reference’ is a young white male of average proportions with a Western lifestyle. There are many ways in which this sidesteps diversity, but we focus on one of the most obvious: it entirely overlooks women. Not on purpose, but overlooked all the same.
Typical data on cycling, for example, tends to focus on peak travel times when more men are travelling. Women are more likely to travel outside of peak hours when roads are quieter. A design that improves the cycling experience for women (by using segregated and wider lanes) will also benefit existing cyclists. Considering data that reflects women, designs and advice will become more inclusive from the start.
We’ve painted a picture: using quantitative and qualitative data, we show the difference between typical experiences. Data gaps are not intentional; they are the result of a ‘one size fits all’ male-norm approach that leads to missing information.
A Future Ready approach helps us to recognise trends and consider any potential challenges now. Looking to the future, automation is becoming more important in all we do.
Gaps in our gender data risk being written into artificial intelligence, making those systems less intelligent for half the population. We must design out these inadvertent biases in our algorithms now. The promise of AI is to reduce our human biases, but it cannot deliver on that promise unless we re-examine our data and current standards to better suit everyone.
We’re starting with gender, but our findings are just the beginning. We know that by filling the gaps, we can design considerately for all ages, disabilities and sexual orientations - for everyone.
We all need to think deeply, and challenge solutions based on out-of-date data. Yes, we must meet our design standards and codes, but why not look beyond them? If we can change something for the better, why don’t we?