Mistrust makes technology toxic

Part 4 of a series on toxic technology. Read parts 1, 2 and 3.

Over the next few instalments in this series, I’m going to write about the causes of toxic technology. These causes are systemic: markets, management, culture, finance and human bias can all contribute to toxicity. My intention is to help people understand that toxic technology is currently inevitable – and that, without changes to these systems, it will continue to be.

The first cause I will be writing about is the absence of trust in delivering and sustaining technology.

Trust & technology

The best technology is delivered by high performing teams in organisations with a culture of collaboration, transparency, and sharing. Common to all these positive characteristics is trust. Trust is efficient and effective – it makes it easier to get stuff done.

With technology, it is easy for trust to be lost. Delivering and operating technology is unpredictable – it’s hard to set clear mutual expectations, and meet them. Teams struggle to anticipate the need to change the design of code and architecture, and operational incidents strike like lightning. The strengths and weaknesses of technology are counter-intuitive – it can be effortless to perform billions of complex calculations, but perhaps impossible to reliably identify an image as a cat.

Because trust can be lost so easily, mistrust is common in the technology industry. Command and control governance cultures are all too common, particularly within larger organisations, and they make mistrust endemic. In these cultures, processes and policies bake in mistrust by design.

Mistrust leads to toxic technology because teams and expectations become misaligned. It prevents organisations from working as a whole – resulting in siloed areas which compete with or undermine each other. Mistrust hides away risk – particularly long-term and systemic risks – because people don’t want to be blamed for bringing bad news. It creates the perfect conditions for the ‘boiling of frogs’ – the temperature is rising, but nobody is aware.

The patterns of mistrust are best understood by looking at the relationships that exist within and between organisations.

Mistrust between the powerful and the disempowered

In organisations there are the powerful, and the disempowered. Power imbalances can exist between managers and the managed, budget holders and the budgetless, auditors and the audited. Hierarchies, accountabilities and ‘checks and balances’ are necessary structures for the efficient running of organisations, but they create power dynamics which can be both useful and harmful. In all of these relationships, mutual trust is needed.

The powerful hold more responsibility to create mutual trust – there is more they can do and change. Unfortunately, it is common for the powerful to encourage the disempowered to ‘build trust’ with them. They seek proof that trust is warranted through evidence of action, delivery or impact. This ignores that gaining trust is mutual, and that the disempowered need trust in order to succeed. The language of mistrust by the powerful will be familiar to many: 

  • Overspending on budgets is because of wasteful teams, not underfunding
  • Missing a deadline comes from a lack of hard/smart work in the team, not an unreasonable promise made without understanding feasibility and capability
  • A lengthy estimate is because the team wishes to ‘gold plate’ delivery, not that expectations or assumptions are unreasonable

Trust is vital for good governance. Those governing must recognise that they hold powers, and therefore responsibility. Mistrust makes governance disciplinarian – those being governed are treated as unruly, and assumed to be failing if they don’t meet expectations. Mistrusting governance becomes command and control – attempting to control without empathising with those delivering, or providing them with support. This culture creates a barrier for the flow of information, and therefore truth. Bruce F. Webster describes this as ‘The Thermocline of Truth’:

In many large or even medium-sized IT projects, there exists a thermocline of truth, a line drawn across the organizational chart that represents a barrier to accurate information regarding the project’s progress. Those below this level tend to know how well the project is actually going; those above it tend to have a more optimistic (if unrealistic) view.

Bruce F. Webster, The Thermocline of Truth

The truth which is least likely to flow in these circumstances is bad news – information about the organisation’s unacceptable risks or failing delivery. This causes the scale of existing toxic technology to become hidden, or to be explained away as an ‘accepted risk’. It also adds to the problem, because failure in technology projects is revealed late. The panic to get delivery ‘over the line’ creates toxic technology at scale as shortcuts are taken.

Mistrust between managers and the managed

Seniority in organisations is typically defined by the management hierarchy. Command and control culture follows from attributing superiority to seniority. In some organisations and sectors, those at higher levels of seniority are treated as an elite, with rituals or identities which encourage a sense of superiority. In this culture, senior managers stop trusting those in junior roles, despite the fact that they often held these roles themselves.

Command and control managers make specific demands without listening. They seek outputs, not outcomes, from those they manage. These outputs are often time-bound, and have predetermined scope – leaving little room for dialogue around risk or feasibility. Outputs in technology describe the solution: “build a website” or “make a chat bot”. This contrasts with outcomes which describe the goal, such as “make it easier to access healthcare” or “increase the global market share of a product”. Command and control managers presume teams don’t work hard or smart enough – and don’t provide support to help them succeed. Sadly, most command and control managers are propagating the culture – they are commanded and controlled themselves, and pass this down.

Command and control management doesn’t work, because the future is unpredictable. Technology delivery particularly so. This means that demanding outputs has undesirable consequences:

Hiding the truth

Teams inevitably manipulate information to navigate low trust cultures – finding ways to avoid being held to account for not meeting unreasonable expectations. Instead of providing honest, insightful information, they ‘feed the beast’ – meeting the demand for evidence of how money is spent and time used with generic, copy-and-pasted responses. Every governing function is susceptible to increasing the burden of mistrust: finance, risk, security, productivity, commercial, legal. Corporate norms and so-called ‘best practice’ in these areas are mistrustful by default. But the combined information demands of corporate norms can be suffocating for technology teams.

Teams will often hide or downplay performance issues, or toxicity in technology. This is because they don’t trust their managers to be understanding, or don’t trust the corporate process being followed. This problem can scale up to entire organisations, if mistrust is endemic. If successive managers up a hierarchy avoid exposing uncomfortable truths, then risks can grow, unknown to leaders. Widespread mistrust can lead to a catastrophic accumulation of toxic technology before leaders become aware.

Burnout

When trying to achieve the unachievable, teams find themselves on a high-speed treadmill. They try to keep up with expectations to gain or maintain trust. Some people are fortunate to have the life circumstances to temporarily push beyond the boundaries that many might find uncomfortable. However, as organisations scale they must accommodate a healthy working environment for everyone. They must consider factors such as mental health, caring responsibilities and disability. They should support interests outside the workplace and encourage balance between work and life. There is a moral argument for this. But it is also to the long-term advantage of the organisation – leading to a happier, healthier and more engaged workforce. Burnout pace is a way of operating that starts to ignore these factors, and puts the short-term needs of the organisation above the health of its people.

Burnout makes retention of skills and knowledge challenging. Hiring is impacted by the reputation of the workplace. Some areas of the tech industry compensate by paying higher salaries, storing up problems for the future. If burnout pace continues for too long, quality will drop and risks will increase as team members become exhausted. It will impact the work done afterwards – operations, continuous improvement, or the next big feature or product. Most technology is intended for ongoing use and improvement, and will need a happy and healthy team to help make this happen.

Reckless risk taking

Teams trying to achieve unachievable outputs might resort to taking reckless risks, hoping they can get away with it. Risky shortcuts in technology typically happen in the less visible areas: user research, performance, accessibility and security. In some circumstances, intentionally taking on technical (or other forms of) ‘debt’ is the right thing to do. It can allow faster initial delivery, but there must be a strong likelihood of paying down the debt later. In low trust cultures, however, it is likely that a manager already wants the next output delivered even before the first is finished. This leads to a cycle of debt accumulation, and growing toxic technology.

Fear and anxiety

Teams can become fearful of the consequences of failing to deliver the output, and doubtful of their own abilities. Managerial gaslighting can result from mistrust: teams are told they are underperforming, but never given the trust they need to succeed. Command and control management can easily become bullying or harassment.

Mistrust between auditors and the audited

Regulators and auditors provide a ‘check and balance’ role in many industries and sectors. Whilst there is intentional separation of roles, trust is still important. A low-trust approach to auditing can sometimes work, but only if compliance with policy, regulation and standards is widespread. If compliance is consistently high, then auditors need to discover the minority who fall short – a suspicious mindset, to some extent, can work well.

If compliance is consistently low then this approach fails – auditors become a nuisance, pointing out problems that are already known, and hard to resolve. Auditors and regulators may even become counterproductive as they incentivise organisations to be opaque, and avoid being open about non-compliance.

Many areas of compliance and legislation in technology are subjective. They are also designed to address significant economic or social challenges. In this context, non-compliance is common. Compliance culture can range from disciplinarian to generally tolerant, but rarely seeks trust between the auditor and the audited. The opportunity to tackle a systemic problem is missed.

The worst areas of regulation result in busywork industries. These compliance industries produce frameworks, paperwork and elaborate processes which give the appearance of rigour and professionalism. Meanwhile, the spirit of technology regulation can be lost entirely.

The EU’s General Data Protection Regulation is a good example of this. Tangible change has happened as a result of its introduction, but most organisations remain highly non-compliant – toxic legacy technology being a contributor to this. Achieving the spirit of GDPR remains prohibitively expensive for many. But this contradiction is impossible for most organisations to discuss publicly, leading to an uneasy truce with regulating bodies. Long-term mistrust between regulators and the regulated risks legislation becoming redundant through lack of enforcement. The culture of opacity, busywork and ignorance is likely to reduce data protection and privacy. It will also contribute to the growth of toxic technology – non-compliant technology will be increasingly ignored, and people will be less willing to hold accountability for it.

The subjectivity of auditing technology presents another problem – good auditors need technology expertise. Often, those being audited don’t trust that the auditor has this expertise. An inexpert auditor can make mistakes in interpreting compliance, undermining their role as a ‘line of defence’ and fuelling widespread mistrust among those being audited.

Mistrust between the centre and the edge

Large organisations, and governments, have areas which are considered “at the centre” and areas that are correspondingly “at the edge”. The public sector has edges such as municipal and local government, and centres such as federal government. Corporations have similar layers: local offices, regional hubs and corporate headquarters.

These structures have important implications for technology. The technical architecture will be defined by which services are offered, and used, by areas of the organisation. It is most common to see ‘shared services’ offered (or mandated) from the centre. But, where trust is high, innovative shared services could be provided and used from any part of an organisation.

Theoretically, all centres and edges should share common goals. They should be aligned to the purpose of the whole organisation. In this perfect organisation, the most efficient technical architecture would be a harmonious set of shared services. Unfortunately, it is common for mutual mistrust to exist between centre and edge, or between edge and edge. Without addressing this, the organisation is destined to deliver duplicated or conflicting services. The greater the internal mistrust, the more unused, ineffective technology is likely to be created. Toxic technology will thrive in this mistrusting environment.

Sometimes these trust challenges can be huge – mirroring geopolitical tensions, or the resentment of corporate acquisition. It may even be better to break organisations apart than to find ways to build trust. But if the organisation intends to remain whole, leaders from both the centre and the edge hold a responsibility for establishing more trust between their teams.

Building trust

Understanding how mistrust can cause toxic technology is the first step towards preventing it. However, the positive case for building high performing teams who trust and respect each other is also important. I will explore this in future instalments.

As a starting point for the more positive case, I’ll share the principles for digital teams developed at the UK Ministry of Justice. I found these useful for describing the mutual, aspirational expectations between digital teams, and those who manage or govern them.

Continue reading the next instalment.

Sign up for future instalments

Sign up for email updates.


Credits

Vera Mehta (@vmehtadata) for her brilliant reviewing of my writing (and general awesomeness).