Mistrust makes technology toxic

Part 4 of a series on toxic technology. Read parts 1, 2 and 3.

Over the next few instalments in this series, I’m going to write about the causes of toxic technology. These causes are systemic: markets, management, culture, finance and human bias can all contribute to toxicity. My intention is to help people understand that toxic technology is currently inevitable – and without change to systems, it will continue to be.

The first cause I will be writing about is the absence of trust in delivering and sustaining technology.

Trust & technology

The best technology is delivered by high-performing teams in organisations with a culture of collaboration, transparency and sharing. Common to all these positive characteristics is trust. Trust is efficient and effective – it makes it easier to get stuff done.

With technology, it is easy for trust to be lost. Delivering and operating technology is unpredictable – it’s hard to set clear mutual expectations, and meet them. Teams struggle to anticipate the need to change the design of code and architecture, and operational incidents strike like lightning. The strengths and weaknesses of technology are counter-intuitive – it can be effortless to perform billions of complex calculations, but perhaps impossible to reliably identify an image as a cat.

Because trust can be lost so easily, mistrust is common in the technology industry. Command and control governance cultures are too common, particularly within larger organisations, and they make mistrust endemic. In these cultures, processes and policies bake in mistrust by design.

Mistrust leads to toxic technology because teams and expectations become misaligned. It prevents organisations from working as a whole – resulting in siloed areas which compete with or undermine each other. Mistrust hides away risk – particularly long-term and systemic risks – because people don’t want to be blamed for bringing bad news. It creates the perfect conditions for the ‘boiling of frogs’ – the temperature is rising, but nobody is aware.

The patterns of mistrust are best understood by looking at the relationships that exist within, and between, organisations.

Mistrust between the powerful and the disempowered

In organisations there are the powerful and the disempowered. Power imbalances can exist between managers and the managed, budget holders and the budgetless, auditors and the audited. Hierarchies, accountabilities and ‘checks and balances’ are necessary structures for the efficient running of organisations, but they create power dynamics which can be both useful and harmful. In all of these relationships, mutual trust is needed.

The powerful hold more responsibility for creating mutual trust – there is more they can do and change. Unfortunately, it is common for the powerful to instead encourage the disempowered to ‘build trust’ with them. They seek proof that trust is warranted through evidence of action, delivery or impact. This ignores that gaining trust is mutual, and that the disempowered need trust in order to succeed. The language of mistrust from the powerful will be familiar to many:

  • Overspending on budgets is because of wasteful teams, not underfunding
  • Missing a deadline comes from a lack of hard/smart work in the team, not an unreasonable promise made without understanding feasibility and capability
  • A lengthy estimate is because the team wishes to ‘gold plate’ delivery, not that expectations or assumptions are unreasonable

Trust is vital for good governance. Those governing must recognise that they hold power, and therefore responsibility. Mistrust makes governance disciplinarian – those being governed are treated as unruly, and assumed to be failing if they don’t meet expectations. Mistrusting governance becomes command and control – attempting to control without empathising with those delivering, or providing them with support. This culture creates a barrier to the flow of information, and therefore truth. Bruce F. Webster describes this as ‘The Thermocline of Truth’:

In many large or even medium-sized IT projects, there exists a thermocline of truth, a line drawn across the organizational chart that represents a barrier to accurate information regarding the project’s progress. Those below this level tend to know how well the project is actually going; those above it tend to have a more optimistic (if unrealistic) view.

Bruce F. Webster, The Thermocline of Truth

The truth least likely to flow in these circumstances is bad news – information about the organisation’s unacceptable risks or failing delivery. This causes the scale of existing toxic technology to be hidden, or explained away as an ‘accepted risk’. It also adds to the problem through the late revelation of failure in technology projects. The panic to get delivery ‘over the line’ creates toxic technology at scale as shortcuts are taken.

Mistrust between managers and the managed

Seniority in organisations is typically defined by the management hierarchy. Command and control culture follows from attributing superiority to seniority. In some organisations and sectors, those at higher levels of seniority are considered an elite, with rituals or identities which encourage a sense of superiority. In this culture, senior managers stop trusting those in junior roles – despite the fact that they often used to hold those roles themselves.

Command and control managers make specific demands without listening. They seek outputs, not outcomes, from those they manage. These outputs are often time-bound, and have predetermined scope – leaving little room for dialogue around risk or feasibility. Outputs in technology describe the solution: “build a website” or “make a chat bot”. This contrasts with outcomes which describe the goal, such as “make it easier to access healthcare” or “increase the global market share of a product”. Command and control managers presume teams don’t work hard or smart enough – and don’t provide support to help them succeed. Sadly, most command and control managers are propagating the culture – they are commanded and controlled themselves, and pass this down.

Command and control management doesn’t work because the future is unpredictable – and technology delivery particularly so. This means that demanding outputs has undesirable consequences:

Hiding the truth

Teams inevitably manipulate information to navigate low trust cultures – finding ways to avoid being held to account for not meeting unreasonable expectations. Instead of providing honest, insightful information, they ‘feed the beast’ – meeting the demand for evidence of how money is spent and time used with generic, copy-and-pasted responses. Every governing function is susceptible to increasing the burden of mistrust: finance, risk, security, productivity, commercial, legal. Corporate norms and so-called ‘best practice’ in these areas are mistrustful by default. But the combined information demands of corporate norms can be suffocating for technology teams.

Teams will often hide or downplay performance issues, or toxicity in technology. This is because they don’t trust their managers to be understanding, or don’t trust the corporate process being followed. Where mistrust is endemic, this problem can scale up to entire organisations. If successive managers up a hierarchy avoid exposing uncomfortable truths, then risks can grow, unknown to leaders. Widespread mistrust can lead to a catastrophic accumulation of toxic technology before leaders become aware.

Burnout

When trying to achieve the unachievable, teams find themselves on a high-speed treadmill, trying to keep up with expectations to gain or maintain trust. Some people are fortunate to have the life circumstances to temporarily push beyond boundaries that many would find uncomfortable. However, as organisations scale they must accommodate a healthy working environment for everyone. They must consider factors such as mental health, caring responsibilities and disability. They should support interests outside the workplace and encourage balance between work and life. There is a moral argument for this, but it is also to the long-term advantage of the organisation – leading to a happier, healthier and more engaged workforce. Burnout pace is a way of operating that ignores these factors, putting the short-term needs of the organisation above the health of its people.

Burnout makes retention of skills and knowledge challenging, and hiring suffers as the workplace’s reputation spreads. Some areas of the tech industry compensate by paying higher salaries, storing up problems for the future. If burnout pace continues for too long, quality will drop and risks will increase as team members become exhausted. It will impact the work that comes afterwards – operations, continuous improvement, or the next big feature or product. Most technology is intended for ongoing use and improvement, and will need a happy and healthy team to make this happen.

Reckless risk taking

Teams trying to achieve unachievable outputs might resort to taking reckless risks, hoping they can get away with it. Risky shortcuts in technology typically happen in the less visible areas: user research, performance, accessibility and security. In some circumstances, intentionally taking on technical debt – or other forms of ‘debt’ – is the right thing to do. It can allow faster initial delivery, but there must be a strong likelihood of paying down the debt later. In low trust cultures, however, it is likely that a manager already wants the next output delivered even before the first is finished. This leads to a cycle of debt accumulation, and growing toxic technology.

Fear and anxiety

Teams can become fearful of the consequences of failing to deliver the output, and doubtful of their own abilities. Managerial gaslighting can result from mistrust: teams are told they are underperforming, but are never given the trust they need to succeed. Command and control management can easily become bullying or harassment.

Mistrust between auditors and the audited

Regulators and auditors provide a ‘check and balance’ role in many industries and sectors. Whilst there is intentional separation of roles, trust is still important. A low-trust approach to auditing can sometimes work, but only if compliance with policy, regulation and standards is widespread. If compliance is consistently high, then auditors need to discover the minority who fall short – and a suspicious mindset can, to some extent, work well.

If compliance is consistently low, this approach fails – auditors become a nuisance, pointing out problems that are already known and hard to resolve. Auditors and regulators may even become counterproductive, as they incentivise organisations to be opaque and avoid being open about non-compliance.

Many areas of compliance and legislation in technology are subjective. They are also designed to address significant economic or social challenges. In this context, non-compliance is common. Compliance culture can vary between the disciplinarian and the generally tolerant, but rarely seeks trust between the auditor and the audited. The opportunity to tackle a systemic problem is missed.

The worst areas of regulation result in busywork industries. These compliance industries produce frameworks, paperwork and elaborate processes which give the appearance of rigour and professionalism. Meanwhile, the spirit of technology regulation can be lost entirely.

The EU’s General Data Protection Regulation is a good example of this. Tangible change has happened as a result of its introduction, but most organisations remain highly non-compliant – toxic legacy technology being a contributor. Achieving the spirit of GDPR remains prohibitively expensive for many. But this contradiction is impossible for most organisations to discuss publicly, leading to an uneasy truce with regulating bodies. Long-term mistrust between regulators and the regulated risks legislation becoming redundant through lack of enforcement. The culture of opacity, busywork and ignorance is likely to reduce data protection and privacy. It will also contribute to the growth of toxic technology – non-compliant technology will be increasingly ignored, and people will be less willing to hold accountability for it.

The subjectivity of auditing technology presents another problem – good auditors need technology expertise. Often, those being audited don’t trust that the auditor has access to this expertise. An inexpert auditor can make mistakes in interpreting compliance, and widespread mistrust from those being audited undermines the auditor’s role as a ‘line of defence’.

Mistrust between the centre and the edge

Large organisations, and governments, have areas which are considered “at the centre” and areas that are correspondingly “at the edge”. The public sector has edges such as municipal and local government, and centres such as federal government. Corporations have similar layers: local offices, regional hubs and corporate headquarters.

These structures have important implications for technology. The technical architecture will be defined by which services are offered, and used, by areas of the organisation. It is most common to see ‘shared services’ offered (or mandated) from the centre. But, where trust is high, innovative shared services could be provided and used from any part of an organisation.

Theoretically, all centres and edges should share common goals, aligned to the purpose of the whole organisation. In this perfect organisation, the most efficient technical architecture would be a harmonious set of shared services. Unfortunately, it is common for mutual mistrust to exist between centre and edge, or edge and edge. Without addressing this, the organisation is destined to deliver duplicated or conflicting services. The greater the internal mistrust, the more unused, ineffective technology is likely to be created. Toxic technology will thrive in this mistrusting environment.

Sometimes these trust challenges can be huge – mirroring geopolitical tensions, or the resentment of a corporate acquisition. It may even be better to break organisations apart than to find ways to build trust. But if the organisation intends to remain whole, leaders from both the centre and the edge hold a responsibility for establishing more trust between their teams.

Building trust

Understanding how mistrust can cause toxic technology is the first step to preventing it. However, the positive case – building high-performing teams who trust and respect each other – is also important. I will explore this in future instalments.

As a starting point for the more positive case, I’ll share the principles for digital teams developed at the UK Ministry of Justice. I found these useful for describing the mutual, aspirational expectations between digital teams, and those who manage or govern them.

Continue reading the next instalment.

Sign up for future instalments

Sign up for email updates below.


Credits

Vera Mehta (@vmehtadata) for her brilliant reviewing of my writing (and general awesomeness).

Toxic technology

Part 1 of a series on Toxic Technology.

In 2018 I wrote about toxic technology, a short post explaining the threat organisations face from the legacy technology they accumulate. To explain the idea in more detail, I wanted to write more. This series of blogposts will cover a range of topics which contribute to toxic technology – the way teams work, the strategies we use, core operational processes, and market incentives. Later in the series, I will write about how to avoid, manage and mitigate the risks of toxic technology. This post is the first of many instalments, so if you’re interested, please do sign up for more.

Toxic technology is eating the world

In 2011 Marc Andreessen suggested that software is eating the world. He described the phenomenon of new companies using internet-enabled business models to disrupt established markets.

“we are in the middle of a dramatic and broad technological and economic shift in which software companies are poised to take over large swathes of the economy… all of the technology required to transform industries through software finally works and can be widely delivered at global scale”

Marc Andreessen, 2011

At the time, over two billion people used the internet, up from an estimated 50 million in 2000. Andreessen predicted that within ten years “at least five billion people worldwide [will] own smartphones [with] instant access to the full power of the Internet”. A decade later, there are an estimated 5.11 billion mobile users and 4.39 billion internet users globally. The majority of the world’s population are now internet users.

In the decade since, this pattern has continued as companies such as Airbnb, Uber and Snap Inc. have disrupted markets. But a different pattern better characterises the most recent decade: not market disruption, but the accelerating use of digital technology in existing organisations. This has created an impact across sectors as diverse as government, finance, retail and transportation.

New digital technology has been a trigger for widespread change in the public sector. Governments across the world are now transforming how they work using internet-era methods. The US Digital Service, UK Government Digital Service and e-Estonia movements led the way, and many more are following. In the US and UK, these changes represent a rebirth after an era of outsourcing, where investment in technology was principally done through procurement. Now governments and public sector bodies are building technically skilled workforces, and producing large volumes of their own technology. There are currently over 900 national and local governments and agencies contributing code on GitHub.

Disruption has also occurred across the financial sector. Large technology companies such as Apple, Google and Tencent have disrupted consumer-facing payment services. Fintech companies like Stripe, Square and Ant Financial have created innovative and popular products. But, at the heart of the financial system, established banks remain the dominant force. To compete with market entrants, conglomerates have invested large sums in digital transformation: BNP Paribas invested $3 billion in 2017, HSBC $17 billion in 2018, and JP Morgan $10.8 billion in 2019. Challenger banks like Monzo and Starling represent a more direct challenge to established banks, but whilst their growth is rapid, they remain niche players in the global banking sector.

Small and medium-sized enterprises (SMEs) are the majority employer in most countries and sectors, and collectively make a significant contribution to the software produced globally. Typical software produced by SMEs supports routine administrative tasks – case management and customer records management systems, for example. SMEs also produce millions of websites – whilst many are constructed using templated tools, many also involve writing bespoke code.

Outside professionalised software development communities, people use general purpose tools to create software. They may not identify as software developers, yet they create abundant software. Excel formulas, Microsoft Access databases and customised ‘low code’ platforms are examples. Millions more create software online, editing the markup code of their websites. Rudimentary knowledge of HTML (Hypertext Markup Language) and CSS (Cascading Style Sheets) can help them go beyond standard templates. Content building platforms like WordPress and Shopify democratise software development. This long tail could perhaps be the largest software sector – the user-generated content of the software industry.

A crisis in the sustainability of software

Software is growing in every sector, and within organisations large and small. It is playing an increasingly vital economic and social role. But is it sustainable for software to keep eating the world? Do we have the resources to ensure all this software remains healthy and effective? Many patterns suggest this is not the case. There is a crisis in the sustainability of software.

When technology is not sustainable, basic cyber security and maintenance practices lapse. This causes organisations to experience data breaches with increasing frequency and at increasing scale – yet many of these attacks and accidents are preventable. High-visibility outages are becoming more common as a result of neglected technology. Systematic records of service outages are not kept, making trends hard to observe. But, as technology in most organisations is ageing, it is reasonable to assume the trend is worsening.

The European Union introduced the General Data Protection Regulation (GDPR) in 2018 to strengthen pre-existing privacy legislation. Whilst compliance remains subjective, it has become harder to argue that a large legacy technology estate is compliant. Data protection is now a bigger challenge for organisations – it requires more investment in modernisation, and the nurturing of a culture of maintenance.

It is now expected that digital services are accessible – designed inclusively to be simple and easy for everyone to use. Inclusivity affects us all, because needs can be permanent, temporary or circumstantial (e.g. deafness, an ear infection, a noisy environment). Some countries are beginning to legislate in this area, adding new legal responsibilities. Yet the ways in which existing services exclude are often trivial to identify. Away from mass consumer markets, niche software is often inaccessible for many – staff and specialist users must work around the flaws. Low quality niche software can be the daily working experience for administrative staff in large organisations, made worse when the design excludes them.

High-growth tech companies provide many of the services that consumers experience daily. When consumers order a taxi, order a takeaway or buy a book online, they use technology which has recently been renewed or replaced. High growth provides the abundant resources to make this possible. Established organisations rarely experience high growth, so accumulating technology becomes a maintenance burden. With limited investment in technology, organisations prioritise high profile services. Established airlines provide a good example of this: buying a flight online feels like a modern internet-era service, but a less-used service like changing your flight can be very challenging. Lowest priority of all, office administrators will often use ageing, low quality ‘back end’ systems.


Unsustainable technology inhibits the agility and stability of organisations, and will become a threat to their existence. Businesses will not be able to compete with the agility of younger, leaner organisations. The role of institutions will erode through lack of trust, with citizens opting for market alternatives. The importance of sustainability goes beyond the impact on organisations. If software is eating the institutions which form the structure of our societies, it must not cause them to fail. We must find ways to make digital technology sustainable over decades if these institutions, and public trust in them, are to endure.

Network and data centre energy consumption is already set to increase as a proportion of global energy consumption. If digital technology is not made sustainable, inefficiency will result in an avoidable accumulation of energy use. Sustainable digital technology is necessary if the internet revolution is to avoid becoming a key contributor to climate change.

Digital technology will not stop eating the world – the promise of automation is too great, and technology can have a positive, transformative effect on people’s lives. If it cannot and should not stop, it needs to become sustainable.

What is toxic technology?

Toxic technology describes the harmful characteristics caused by poor design or neglect. Poor design is common in an industry where outputs are often favoured over outcomes. Neglect is systemic, caused by short-termist cultures, processes and practices which inhibit sustainability.

Whilst the impacts of toxic technology are significant, examples of it are mundane, everyday, and recognisable to most. It is: the broken kiosk at the local museum, the ageing computer-on-wheels trolleyed around the hospital, the unpatched web server that led to the embarrassing data breach, or the strange green-on-black interface from the 1990s used by the back office staff at a big bank. Toxic technology is all around us, powering our banks, care homes, warehouses and submarines. It’s pervasive.

The following are typical toxic characteristics in technology. Each is challenging and subjective to measure – making toxicity hard to expose.

  • Insecure – unacceptable risks to breaches of confidentiality, loss of integrity or lack of availability
  • Unscalable – an inability to respond to change of scale, such as increased usage, number of users, or complexity of the domain
  • Unreliable – lacking durability, availability and predictability
  • Non-compliant – non-compliance with the law, standards or an organisation’s policies
  • Inaccessible – the design excludes users
  • Hard to support – cannot be maintained effectively and efficiently 
  • Hard to change – cannot be changed effectively and efficiently
  • Opaque – important information about the service cannot be obtained when needed
  • Overly expensive – the service isn’t value-for-money
  • Poorly understood – the service and its technology are not well understood within the organisation

Software in particular can turn toxic quickly – more so than physical technologies. Bridges can fail and buildings can decay, but the patterns of neglect are reasonably well understood, and occur over decades. Software decay is faster, less predictable and subject to more complex external factors. Cyber security vulnerabilities can emerge in any component part. Open-source communities may become unreliable. Commercial suppliers may go out of business, or stop working in your interests. Even doing the basics, like patches and upgrades, is challenging due to the norms of culture, practice and process. The software industry is not yet mature enough to match the risk-management rigour of civil engineering.

The term ‘toxic’ is intentionally evocative language, giving a sense of active harm worthy of attention. Terms like ‘legacy’, ‘technical risk’ and ‘technical debt’ are useful, but don’t convey urgency. For most organisations, toxic technology is a growing and ignored problem, so a change of language could help.

Systemic issues are the principal cause of toxic technology, not individuals or teams. This is important to recognise when using such a negative term. The assumption should be that historic creators and decision makers made decisions in good faith. Ageing technology accrues toxic characteristics which become more visible from a contemporary perspective. Historic code reveals the culture, language and decision making of its time. It should be valued as a form of communication from the past to the present – perhaps even aesthetically appreciated, like historic buildings. Toxicity is avoided through understanding that it can emerge over time from even the most thoughtfully designed technology.

Continue to part 2…

Credits

Nick Rowlands (@rowlando) for the idea to publish as a series of blogposts, reviews, and general encouragement to write more.

Steve Marshall (@SteveMarshall) and James Stewart (@jystewart) for their many second opinions on my writing.

Giles Turnbull (@gilest) for timely advice to improve my writing.

Sign up for future instalments

Sign up for email updates below.
