The false promise of technology

Part 7 of a series on toxic technology.

For all the wide-eyed futurology surrounding the potential of digital technology, it commonly fails to meet expectations. Start-ups fail considerably more often than they succeed. Digital ‘transformation’ initiatives routinely overrun and under-deliver. Many products and features fail the ultimate test – users just don’t find them useful. This isn’t surprising – building with digital technology is experimentation, trying something new.

In some respects, the optimism of digital is a virtue. It creates the conditions for the boldest reforms, and the biggest ideas. But where this optimism becomes a problem is when it comes with a disregard of pragmatic pessimism. Optimism is toxic when it leads to avoiding questions of feasibility, viability and the conditions for success.

Digital technology is a particularly dangerous place for this blinkered optimism. It’s an unpredictable medium, in an industry bursting with hype. If you’re willing to believe it, the world’s hardest problems can be solved with AI, or blockchains, or internets-of-things. If you do believe it, you’re almost certainly going to be disappointed.

In markets

Companies can tap into broader market optimism for the potential of hyped technologies. They typically do this by expressing an intent to use a technology, rather than by showing proven success with it. In November 2019 there was a flurry of articles reporting that HSBC would use blockchain for a ‘custody platform’ by March 2020.

“HSBC aims to shift $20 billion worth of assets to a new blockchain-based custody platform by March”

November 2019 Reuters news article

There is no corresponding press coverage from March 2020 to indicate this really happened. But the announcement had already served its purpose: generating positive market sentiment for HSBC. It’s unlikely blockchain was necessary to solve the problem, but its use turns a bland corporate IT initiative into a story about supposed innovation.

In governments

Governments frequently make assertions of future delivery. This helps cement strategy across huge bureaucracies, and encourages public engagement. It is also low risk to reputation, because governments are pretty effective at managing the message around their technology delivery failures.

A current example of a government promise on technology comes from the UK government’s National Data Strategy:

“We will drive data discoverability across government through developing an Integrated Data Platform for government, which will be a safe, secure and trusted infrastructure for government’s own data.”

UK government National Data Strategy, September 2020

This announcement sets false expectations:

  • That much of the UK government’s data can be put onto a singular ‘platform’.
  • That a named thing needs to exist called the “Integrated Data Platform”.
  • That such a thing will come into existence reasonably soon (perhaps within the next spending cycle in the UK government).
  • That, at some point, funding will be allocated to make this all happen.

The language may appear subtle, but published strategies start to shape the funding pots, and the multi-million-pound programmes that emerge. The complexity involved in solving government’s data challenges is immense, but a ‘single platform’ is appealingly simple to fund and easy to announce. The wider strategy does explore the complexity of data in government, but these more concrete, announceable things will have a more enduring impact.

In politics

As technology plays a greater role in society, we’re likely to see ever more technology-based promises from parties seeking votes. This from the 2019 UK Conservative Manifesto is an example:

“We will use new air traffic control technology to cut the time aircraft spend waiting to land”

2019 Conservative Manifesto

This promise of a real-world outcome, achieved by introducing new technology, is hugely appealing to many voters. But how this will be done, and the uncertainty involved, is not forthcoming. Which technology would be used, and why? When would aircraft waiting times reduce? And at what cost to taxpayers? Instead of a strategy, we’re asked to vote for a promise that cannot be kept, and the belief that a government will be competent enough to deliver on it. Whilst this pattern of trust in political promises isn’t new, the unpredictability of digital technology will increasingly expose the gap between intent and delivery.

Expectation vs reality

It’s hard to trust assertions about when and how digital technology can deliver outcomes. The internal complexity and interdependencies of digital technology mean even the most experienced professionals rarely get their estimates right. Most software developers can tell you about a time they spent days solving a problem they thought would take a few minutes.

Technology initiatives don’t so much overrun or under-deliver as collide with irrational expectations of certainty. CEOs, Ministers and managers crave this certainty to build a reputation as a person-of-their-word. Customers, workforces and electorates crave this certainty because it reduces anxiety around something they need, or want.

This mismatch of expectation and reality isn’t just a disappointment or a financial write-off. Tactics and fire-fighting must make up for failed plans – reducing the quality of digital technology. These are the ideal conditions for toxic technology to emerge – poorly designed, insecure and unstable technology which is harmful to the organisation and its users. Sometimes organisations work this way for years, reinforcing the culture of false promises. Inevitably it backfires spectacularly – as with the glitchy dystopia of Cyberpunk 2077, or the IT meltdown at TSB. If organisations don’t learn from consistent failure to meet expectations, then they normalise the accumulation of toxic technology.

The accumulation of broken promises

The tech industry narrative of success is dominated by the survivorship bias of a few commercial giants – each an example of bold predictions coming to pass (at least in their origin stories). Yet the majority of organisations are filled with toxic legacy and broken promises. The historic strategies of these organisations did not anticipate today’s glacial pace of change, system workarounds and disruptive incidents.

Inside established sectors, like government and traditional banking, these false promises repeat cycle after cycle. Somehow, the next Big Technology Transformation will solve all the problems, despite being funded, governed and promised in the same way as previous attempts.

The powerful – CEOs, government ministers, or those with the ability to gain media or community attention – have a unique role in how promises are made. There is pressure on them to make promises – relieving anxiety that something needed will be delivered, or creating a buzz around a future product. At worst, this is a tool of command and control management – creating unrealistic expectations in public to motivate teams in private. But even the most throw-away announcements can be exploited by command and control cultures. Leaders must take care when announcing new investments, plans or strategies – they communicate intent, but they also set expectations far beyond their control – and those expectations are not always reasonable.

Making better promises

We still need to make promises, and to do so with optimism. It’s essential to positive, forward-looking organisational cultures. Optimism is what encourages teams to take risks and try something new. Promises are how trust is built. 

But we all need to make better promises when it comes to digital technology.

Leaders should balance their optimism with pragmatic pessimism. They should ensure this is woven into the culture, practice and process they encourage. They must make sure they don’t consistently over-promise and under-deliver, even when the culture might reward it. They must be wary of where the burden of expectation will fall, as often it will fall on others.

Funders and those that govern should recognise that upfront large-scale approval for funding and headcount encourages false promises. They should work to reform accounting and governance based on greater trust within organisations, making them more incremental and iterative. They should listen to digital leaders and delivery teams as peers, putting aside their role in ‘holding to account’ to collaborate on strategy and reform.

Technologists should use their experience to counter naive optimism. They must remember what caught them by surprise – when the simple became complicated, and the complicated became simple. They must be aware of their own bias in setting expectations of delivery, particularly when estimates depend upon their expertise. Technologists must be skeptical of hype, even when technology is exciting and new.

Designers should value and understand their medium of delivery as this is where viability and feasibility becomes clearer. Like an architect appreciates wood, steel and construction techniques, a designer of digital technology must appreciate software, data and the craft of technologists. This can be done best in a multidisciplinary team – finding the right experts for the right design challenges.

Product managers should treat risk as a peer to value, as a core part of their discipline. This helps ground bold ideas in achievable reality – it ensures product managers have a more complete set of information with which to make decisions, and make promises.

Delivery professionals should embrace uncertainty. They shouldn’t accept the false certainty of specific, singular estimates (e.g. “complete by July next year”) – instead they should consider a range of potential outcomes, from the best case to the worst. They should be skeptical of estimates and conscious of bias. Delivery professionals should take a lead in exposing and challenging unreasonable expectations from leaders, and should hold to account those who consistently set poor expectations or provide inaccurate estimates.
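To make the range-based approach concrete, here is a minimal sketch of a Monte Carlo estimate. The workstreams and three-point estimates are invented for illustration – none of these numbers come from a real plan:

```python
import random

# Hypothetical three-point estimates per workstream, in weeks:
# (best case, most likely, worst case). All numbers are invented.
estimates = {
    "data migration": (4, 8, 20),
    "legacy system integration": (6, 12, 30),
    "security hardening": (2, 4, 10),
}

def simulate_totals(estimates, runs=10_000):
    """Monte Carlo: sample each workstream from a triangular
    distribution and sum the samples, once per simulated run."""
    totals = []
    for _ in range(runs):
        totals.append(sum(
            random.triangular(best, worst, likely)  # (low, high, mode)
            for best, likely, worst in estimates.values()
        ))
    return sorted(totals)

totals = simulate_totals(estimates)
p10 = totals[len(totals) // 10]       # optimistic end of the range
p90 = totals[9 * len(totals) // 10]   # pessimistic end of the range
print(f"80% of simulated outcomes fall between {p10:.0f} and {p90:.0f} weeks")
```

Reporting the spread, rather than a single date, invites a conversation about risk that “complete by July next year” shuts down.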

Sign up for the next instalment

The next part of this series will be published soon. Sign up for email updates below.


Culture eats technology for breakfast

Part 5 of a series on toxic technology.

Culture in organisations is what people value, and how they behave. Some organisational cultures are undesirable, and will cause and embed problems until the culture changes. Some of these problems impact technology – culture can reduce its quality, and lead to it becoming toxic. Poor cultures can encourage neglect and poor decision making, and prevent people from caring about what is important.

These cultures are unfortunately very common, and even become codified in policy, process and practice. The emergence of undesirable cultures is perhaps inevitable – people’s individual motivations, preferences and biases lead towards them. But they can be changed once observed. Through leadership and the momentum of shared purpose, people can change their behaviour, and start to value what matters.

Command and control

A command and control culture means teams aren’t trusted. It means the more senior someone is, the more they presume to know what’s best for the organisation. Teams are commanded to deliver outputs rather than outcomes.

Toxic technology will emerge naturally in command-and-control cultures, because feedback loops don’t work. When new information is discovered, and it conflicts with the highest-paid-person’s opinion, then it’s disregarded. Often, the most senior people never even see new information – throughout a command and control culture, people are incentivised to follow the plan, not do what’s best for the organisation. This prevents new information flowing up the hierarchy, where it can influence strategy.

The problem is that command and control is pervasive. Long-term detailed business plans, target operating models and annual budgets all default to command and control – and they exist almost everywhere. Managers must preserve their reputations by keeping to promises and commitments. When growing technology risk is exposed as new, contradictory information, it has to contend with these forces.

Command and control practitioners might point to under-performing or self-preserving teams to legitimise their methods. But if teams have psychological safety, they will be more open about their shortcomings in capability or experience – they will ask for the help they need. Command and control prevents this honest dialogue between managers and the managed.

Sometimes organisations must make major command and control decisions – to become competitive or to secure survival. Even in these circumstances, trusting in people and teams is a better foundation for the hardest change – such as redundancy. Reverting to pure command and control in these times could poison the culture – meaning that what survives is doomed to fail.

There are also circumstances where rigid standards-based controls can be used to exert influence at scale. This blunt instrument can be useful in making a big cultural shift, but its effectiveness is time-limited. Once compliance is high, continued enforcement simply becomes friction. If compliance remains low, the standard could be unachievable – leading to it being subverted or ignored. Consistent failure to meet standards is often a signal of teams burdened with technical debt. Punishing non-compliance motivates teams to hide – and therefore continue to accumulate – toxic technology. Crucially, however, even when a standard is rigidly enforced for a time, it should be enforced with trust-by-default – assuming everyone is doing their best to achieve it, but may need help.

Everything is awesome

In some cultures it can seem that everything is awesome (when it is not). When communication is dominated by an organisation’s success stories, failure becomes a dirty secret. Bad news is discouraged or even suppressed. These cultures can be very fun for the majority, but they are usually time-limited. Once the disparity between communicated success and observed failure becomes stark, more people will become cynical and disengaged. A counter-narrative will emerge amongst the disaffected.

Most organisations should tilt the narrative towards the positive – it helps make for a happy, optimistic working environment. This is particularly important for external communication – it helps with hiring, and pride for the current workforce. But, when everything is supposedly awesome, it can be challenging to raise awareness of risk. Toxic technology can be allowed to grow because talking about it doesn’t fit the organisation’s positive narrative. If organisations experience the consequences of toxic technology – such as a cyber attack or system outage – these cultures don’t handle crises well. They are likely to be under-prepared, and shocked at how this could happen.

An extreme version of this culture exists where routine failure is celebrated as success. Large-scale outsourced public sector IT since the 1990s has largely been a series of calamitous failures. Yet many embedded in the industry celebrate the same activity as unbridled success. This means the business case structures, delivery methodologies, organisation structures and funding models have remained largely unchanged despite routine, spectacular failure. This pattern has been perhaps one of the largest sustained contributions to the global mountain of toxic technology.

Shiny things

A powerful distraction from tackling legacy is human nature — we are attracted to the new. The technology industry is rife with the problem of under-valuing the old, and over-valuing the new.

Organisations focus on delivering new technology, often to the detriment of improving existing technology. We build new products. We digitise the analogue. But organisations perhaps don’t realise that software ages like fish – if we look away for too long, it begins to rot. Legacy accumulates quickly by following the fashions — JavaScript frameworks are a good example of this effect, where popularity rises for passing moments in the long history of web technology.

‘Not invented here’ syndrome is the tendency for technologists to craft a solution to a problem that has already been solved. This can be caused by the challenges of discovering existing solutions to problems, and knowing how much to trust them. But more significantly, creators of technology have the desire to create. As a former software engineer, I know that creating something completely new is emotionally rewarding – particularly when compared to the configuration and composition of existing technology.

Technology that has been around for decades, such as SQL databases, web frameworks and web servers, can be harnessed to solve a vast range of problems. So many problems can be solved by “putting strings into databases (and taking them back out again)”. Technologists must be very user-focused and goal-focused to choose the boring technology that works best. Unfortunately, early career progression as a technologist can depend upon exposure to a variety of technologies – incentivising trying out new things where possible.
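As a toy illustration of that boring-but-effective pattern, the sketch below puts strings into a database and takes them back out again using nothing but Python’s standard library. The table and values are invented for the example:

```python
import sqlite3

# Decades-old, boring, reliable: a SQL database.
conn = sqlite3.connect(":memory:")  # a file path would make it persistent
conn.execute("CREATE TABLE notes (id INTEGER PRIMARY KEY, body TEXT)")

# Putting strings into the database...
conn.execute("INSERT INTO notes (body) VALUES (?)", ("renew the TLS certificate",))
conn.commit()

# ...and taking them back out again.
for note_id, body in conn.execute("SELECT id, body FROM notes"):
    print(note_id, body)
conn.close()
```

No framework, no hype – and it is likely to still run, largely unchanged, in a decade.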

The pursuit of shiny things affects entire organisations when they prioritise reputation over sustainability. To some degree this is legitimate – reputation drives sales, inflates the share price and gets investment. An inflated reputation also helps gain momentum when transforming organisations from the inside. But reputation-over-sustainability doesn’t last – at some point there are consequences for neglect.

The love of shiny things runs deep. Design and systems thinking connote a sense of freedom often only possible by starting anew. Organisational, political and professional leaders are valued and rewarded for their output – measured through the creation of announceable things – not the outcomes they influence through teamwork. It takes a lot to resist these forces, and spend some time polishing up what used to be shiny.

Too many heroes

Organisations love a hero. They celebrate the achievements of individuals or teams who achieve impressive feats, against the odds. They might deliver something new in a heroically short timescale. They might avert catastrophe by working all hours to resolve a problem. A culture of everyday heroics has innocuous beginnings, with people receiving praise for going beyond the usual expectations of their role. But when entrenched, it makes the organisation fail to see systemic failure, or appropriately recognise systemic success. Rapid delivery might be the result of years of bold investment in platforms. Heroic ‘all-nighters’ by operations teams are often the result of systemic neglect of technology.

Organisations shouldn’t stop celebrating these ‘heroes’. But they must find balance. They should strive to celebrate systemic success or failure. This is hard, because systemic change often doesn’t have an event, or moment, with which to associate the fanfare. They should celebrate the teams, past and present, who made the success possible or averted the failure – not just the individuals who made the most visible impact. Celebration creates incentives – and an imbalance towards short-term, highly visible success stories. Incentives are needed to encourage longer-term impact, where individual credit is less likely.

Improve culture by observing it first

Identifying and cataloguing cultures is not scientific – it’s just too hard to characterise and categorise collective values and behaviours. But culture still has a powerful effect. What’s important is to find ways to spot undesirable cultures, talk about them, and improve them – before they do too much damage to the organisation, and its technology.

Continue reading part 6.

Sign up for future instalments

Sign up for email updates below.


Mistrust makes technology toxic

Part 4 of a series on toxic technology. Read parts 1, 2 and 3.

Over the next few instalments in this series, I’m going to write about the causes of toxic technology. These causes are systemic: markets, management, culture, finance and human bias can all contribute to toxicity. My intention is to help people understand that toxic technology is currently inevitable – and without change to systems, it will continue to be.

The first cause I will be writing about is the absence of trust in delivering and sustaining technology.

Trust & technology

The best technology is delivered by high performing teams in organisations with a culture of collaboration, transparency, and sharing. Common to all these positive characteristics is trust. Trust is efficient, and effective – it makes it easier to get stuff done.

With technology, it is easy for trust to be lost. Delivering and operating technology is unpredictable – it’s hard to set clear mutual expectations, and meet them. Teams struggle to anticipate the need to change the design of code and architecture, and operational incidents strike like lightning. The strengths and weaknesses of technology are counter-intuitive – it can be effortless to perform billions of complex calculations, but perhaps impossible to reliably identify an image as a cat.

Because trust can be lost so easily, mistrust is common in the technology industry. Command and control governance cultures are too common, particularly within larger organisations. They make mistrust endemic. In these cultures, processes and policies bake-in mistrust by design.

Mistrust leads to toxic technology because teams become misaligned with expectations. It prevents organisations working as a whole – resulting in siloed areas which compete with or undermine each other. Mistrust hides away risk – particularly long-term and systemic risks – because people don’t want to be blamed for bringing bad news. It creates the perfect conditions for the ‘boiling of frogs’ – the temperature is rising, but nobody is aware.

The patterns of mistrust are best understood by looking at the relationships that exist within, and between organisations.

Mistrust between the powerful and the disempowered

In organisations there are the powerful, and the disempowered. Power imbalances can exist between managers and the managed, budget holders and the budgetless, auditors and the audited. Hierarchies, accountabilities and ‘checks and balances’ are necessary structures for the efficient running of organisations, but they create power dynamics which can be both useful and harmful. In all of these relationships, mutual trust is needed.

The powerful hold more responsibility to create mutual trust – there is more they can do and change. Unfortunately, it is common for the powerful to encourage the disempowered to ‘build trust’ with them. They seek proof that trust is warranted through evidence of action, delivery or impact. This ignores that gaining trust is mutual, and that the disempowered need trust in order to succeed. The language of mistrust by the powerful will be familiar to many: 

  • Overspending on budgets is because of wasteful teams, not underfunding
  • Missing a deadline comes from a lack of hard/smart work in the team, not an unreasonable promise made without understanding feasibility and capability
  • A lengthy estimate is because the team wishes to ‘gold plate’ delivery, not that expectations or assumptions are unreasonable

Trust is vital for good governance. Those governing must recognise that they hold powers, and therefore responsibility. Mistrust makes governance disciplinarian – those being governed are treated as unruly, and assumed to be failing if they don’t meet expectations. Mistrusting governance becomes command and control – attempting to control without empathising with those delivering, or providing them with support. This culture creates a barrier for the flow of information, and therefore truth. Bruce F. Webster describes this as ‘The Thermocline of Truth’:

In many large or even medium-sized IT projects, there exists a thermocline of truth, a line drawn across the organizational chart that represents a barrier to accurate information regarding the project’s progress. Those below this level tend to know how well the project is actually going; those above it tend to have a more optimistic (if unrealistic) view.

Bruce F. Webster, The Thermocline of Truth

The truth which is least likely to flow in these circumstances is bad news – information about the organisation’s unacceptable risks or failing delivery. This causes the scale of existing toxic technology to become hidden, or to be explained away as an ‘accepted risk’. It also adds to the problem through the late revealing of failure in technology projects. The panic to get delivery ‘over the line’ causes toxic technology at scale as shortcuts are taken.

Mistrust between manager and the managed

Seniority in organisations is typically defined by the management hierarchy. Command and control culture follows from attributing superiority to seniority. In some organisations and sectors, higher levels of seniority are considered elites. They have rituals or identities which encourage a sense of superiority. In this culture, senior managers stop trusting those in junior roles, despite the fact they often used to hold these roles.

Command and control managers make specific demands without listening. They seek outputs, not outcomes, from those they manage. These outputs are often time-bound, and have predetermined scope – leaving little room for dialogue around risk or feasibility. Outputs in technology describe the solution: “build a website” or “make a chat bot”. This contrasts with outcomes which describe the goal, such as “make it easier to access healthcare” or “increase the global market share of a product”. Command and control managers presume teams don’t work hard or smart enough – and don’t provide support to help them succeed. Sadly, most command and control managers are propagating the culture – they are commanded and controlled themselves, and pass this down.

Command and control management doesn’t work, because the future is unpredictable. Technology delivery particularly so. This means that demanding outputs has undesirable consequences:

Hiding the truth

Teams inevitably manipulate information to navigate low trust cultures – finding ways to avoid being held to account for not meeting unreasonable expectations. Instead of providing honest, insightful information, they ‘feed the beast’ – meeting the demand for evidence of how money is spent and time used with generic, copy-and-pasted responses. Every governing function is susceptible to increasing the burden of mistrust: finance, risk, security, productivity, commercial, legal. Corporate norms and so-called ‘best practice’ in these areas are mistrustful by default. But the combined information demands of corporate norms can be suffocating for technology teams.

Teams will often hide or downplay performance issues, or toxicity in technology. This is because they don’t trust their managers to be understanding, or don’t trust the corporate process being followed. This problem can scale up to entire organisations, if mistrust is endemic. If successive managers up a hierarchy avoid exposing uncomfortable truths, then risks can grow, unknown to leaders. Widespread mistrust can lead to catastrophic accumulation of toxic technology before leaders become aware.

Burnout

When trying to achieve the unachievable, teams find themselves on a high-speed treadmill. They try to keep up with expectations to gain or maintain trust. Some people are fortunate to have the life circumstances to temporarily push beyond boundaries that many might find uncomfortable. However, as organisations scale they must accommodate a healthy working environment for everyone. They must consider factors such as mental health, caring responsibilities and disability. They should support interests outside the workplace and encourage balance between work and life. There is a moral argument for this. But it is also to the long-term advantage of the organisation – leading to a happier, healthier and more engaged workforce. Burnout pace is a way of operating that starts to ignore these factors, and puts the short-term needs of the organisation above the health of its people.

Burnout makes retention of skills and knowledge challenging. Hiring is impacted through the reputation of the workplace. Some areas of the tech industry compensate by paying higher salaries, storing up problems for the future. If burnout pace continues for too long, quality will drop and risks will increase as team members become exhausted. It will impact the work done afterwards – operations, continuous improvement, or the next big feature or product. Most technology is intended for ongoing use and improvement, and will need a happy and healthy team to help make this happen.

Reckless risk taking

Teams trying to achieve unachievable outputs might resort to taking reckless risks, hoping they can get away with it. Risky shortcuts in technology typically happen in the less visible areas: user research, performance, accessibility and security. In some circumstances, intentionally taking on technical (or other forms of) ‘debt’ is the right thing to do. It can allow faster initial delivery, but there must be a strong likelihood of paying down the debt later. In low trust cultures, however, it is likely that a manager already wants the next output delivered even before the first is finished. This leads to a cycle of debt accumulation, and growing toxic technology.

Fear and anxiety

Teams can become fearful of the consequences of failing to deliver the output, and become doubtful of their own abilities. Managerial gaslighting can result from mistrust, where teams are told they are under-performing but are never given the trust they need to succeed. Command and control management can easily become bullying or harassment.

Mistrust between auditors and the audited

Regulators and auditors provide a ‘check and balance’ role in many industries and sectors. Whilst there is intentional separation of roles, trust is still important. A low-trust approach to auditing can sometimes work, but only if compliance with policy, regulation and standards is widespread. If compliance is consistently high, then auditors need to discover the minority who fall short – a suspicious mindset, to some extent, can work well.

If compliance is consistently low then this approach fails – auditors become a nuisance, pointing out problems that are already known, and hard to resolve. Auditors and regulators may even become counterproductive as they incentivise organisations to be opaque, and avoid being open about non-compliance.

Many areas of compliance and legislation in technology are subjective. They are also designed to address significant economic or social challenges. In this context, non-compliance is common. Compliance culture can vary from disciplinarian to generally tolerant, but it rarely seeks trust between the auditor and the audited. The opportunity to tackle a systemic problem is missed.

The worst areas of regulation result in busywork industries. These compliance industries produce frameworks, paperwork and elaborate processes which give the appearance of rigour and professionalism. Meanwhile, the spirit of technology regulation can be lost entirely.

The EU’s General Data Protection Regulation is a good example of this. Tangible change has happened as a result of its introduction, but most organisations remain highly non-compliant – toxic legacy technology being a contributor to this. Achieving the spirit of GDPR remains prohibitively expensive for many. But this contradiction is impossible for most organisations to discuss publicly, leading to an uneasy truce with regulating bodies. Long-term mistrust between regulators and the regulated risks legislation becoming redundant through lack of enforcement. The culture of opacity, busywork and ignorance is likely to weaken data protection and privacy. It will also contribute to the growth of toxic technology – non-compliant technology will become increasingly ignored, and people less willing to hold accountability for it.

The subjectivity of auditing technology presents another problem – good auditors need technology expertise. Often, those being audited don’t trust that the auditor has this expertise. An inexpert auditor can make mistakes in interpreting compliance, undermining their role as a ‘line of defence’ and spreading mistrust amongst those being audited.

Mistrust between the centre and the edge

Large organisations, and governments, have areas which are considered “at the centre” and areas that are correspondingly “at the edge”. The public sector has edges, such as municipal and local government, and centres, such as federal government. Corporations have similar layers: local offices, regional hubs and corporate headquarters.

These structures have important implications for technology. The technical architecture will be defined by which services are offered, and used, by areas of the organisation. It is most common to see ‘shared services’ offered (or mandated) from the centre. But, where trust is high, innovative shared services could be provided and used from any part of an organisation.

Theoretically, all centres and edges should share common goals. They should be aligned to the purpose of the whole organisation. In this perfect organisation, the most efficient technical architecture would be a harmonious set of shared services. Unfortunately it is common for mutual distrust to exist between centre vs. edge, or edge vs. edge. Without addressing this, the organisation is destined to deliver duplicated or conflicting services. The greater the internal mistrust, the more unused, ineffective technology is likely to be created. Toxic technology will thrive in this mistrusting environment. 

Sometimes these trust challenges can be huge – mirroring geopolitical tensions, or the resentment of corporate acquisition. It may even be better to break organisations apart rather than find ways to build trust. But if the organisation intends to remain whole, leaders from both the centre and the edge hold a responsibility for establishing more trust between their teams.

Building trust

Understanding how mistrust can cause toxic technology is the first step to avoiding it. However, the positive case for building high performing teams who trust and respect each other is also important. I will explore this in future instalments.

As a starting point for the more positive case, I’ll share the principles for digital teams developed at the UK Ministry of Justice. I found these useful for describing the mutual, aspirational expectations between digital teams, and those who manage or govern them.

Continue reading the next instalment.

Sign up for future instalments

Sign up for email updates below.


Credits

Vera Mehta (@vmehtadata) for her brilliant reviewing of my writing (and general awesomeness).

Legacy technology: The good, the bad, and the toxic

Part 3 of a series on toxic technology. In part 1 I introduced toxic technology, and part 2 explained the symptoms.

The term ‘legacy’ when applied to technology has a dual meaning. Some define legacy technology by emphasising its risk. They use the term to refer to technology which, through an ageing process, has become outdated and hard to change. In this view, legacy technology has become toxic over time by accumulating risk.

Others emphasise the value in legacy technology such as the Legacy Code Rocks and Festival of Maintenance communities. They celebrate the usefulness, beauty and lessons to be learnt from past technology. This perspective is also a call-to-action – saying there should be more focus and celebration of maintenance.

Both these perspectives are needed. Legacy technology has both risk and value. Legacy remains a useful term for describing the technology inherited by organisations – combining the good, the bad, and the toxic.

Despite its dual meaning, the term legacy is used negatively more often than positively. In common usage, it is becoming a synonym for toxic technology – creating a sense that anything old is bad. Allowing legacy to become a synonym for toxicity is dangerous. It creates an unhealthy divide where new is good, and old is bad. When old is bad, this devalues the important role of maintenance and continuous improvement. When old is bad, it creates generational divides based on engineers’ familiarity with technology. When old is bad, organisations fail to learn the skills to sustain technology over decades – they become trapped in cycles of delivery booms, and toxic legacy busts.

Organisations need to be clear in their intent, and strategy, around complex areas like legacy technology. Because the term legacy is both positive and negative, it is hard to communicate intent. Legacy might need modernising, migrating, decommissioning or maintaining – but it’s hard to summarise what should be done in general. The idea of toxic technology, and toxic characteristics, is easier to use – it’s unambiguously negative. Toxicity must be minimised. If toxicity is growing, something must be changed. If toxicity is reducing, the strategy is working. If a new system is created, and it’s already full of security vulnerabilities and accessibility issues, then it is toxic. If a legacy system has been patched, upgraded, modernised and is keeping pace with changing user needs, it is not toxic. It doesn’t matter how much legacy an organisation has; it matters how toxic that legacy is.

The volume of toxic legacy, for many organisations, is a problem decades in the making — it cannot be solved quickly, or cheaply. The systemic causes of toxic technology have not been addressed, so the next generation’s toxic legacy is still being created today. It will take bold leadership to challenge these causes, which lie, often hidden, in our current ways of working – in finance, procurement, design and technology delivery.

Continue reading the next instalment.

Sign up for future instalments

Sign up for email updates below.


The symptoms of toxic technology

Part 2 of a series on toxic technology. In part one I introduced toxic technology, and defined what I mean by the term.

The characteristics of toxic technology can be invisible to people working in organisations. It is hard to directly observe the security flaws, tangled architecture or unreadable code without expertise. But the symptoms of toxic technology are common and visible.

Delivery

Toxic technology makes delivery slower, harder and riskier. This is caused by shortcomings in the less visible design of technology: the code and architecture. This internal design has a significant effect on the ease and risk of change. Some changes can be simple, like adding lego blocks to an already assembled structure. But some changes require the existing structure to be pulled apart, or new kinds of lego block to be fabricated. The most complex changes ripple out into connected systems. Toxicity makes even the simplest changes surprisingly complicated.

There’s no perfect internal design for technology – it’s a balancing act between preparing for the future vs. reacting to immediate needs. Preparing involves making technology more malleable and understandable (known as refactoring), and therefore ready for likely future changes. But teams can be wasteful if they over-prepare, investing in changes to anticipate something that never happens. When the pressure to deliver is high, teams refactor less. Failing to refactor over a long time will make technology brittle and hard to understand – change becomes harder and harder to achieve. If the pain of changing technology is too high, it will become perceived as ‘legacy’.
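As a small, hedged sketch of what refactoring means in practice (the function and formats are invented for the example), the behaviour below is identical in both versions, but the second is more malleable – a likely future change, adding a new format, no longer means editing control flow:

```python
# Before: each new report format means adding another branch here.
def export_report(values, fmt):
    if fmt == "csv":
        return ",".join(values)
    elif fmt == "tsv":
        return "\t".join(values)
    raise ValueError(f"unknown format: {fmt}")

# After refactoring: same behaviour, but new formats are added as data.
SEPARATORS = {"csv": ",", "tsv": "\t"}

def export_report_refactored(values, fmt):
    try:
        return SEPARATORS[fmt].join(values)
    except KeyError:
        raise ValueError(f"unknown format: {fmt}")
```

Under delivery pressure, it is exactly this kind of restructuring that gets skipped – and each skipped opportunity makes the next change a little harder.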

Where legacy technology exists, its impact on delivery can be hard to avoid. If legacy systems hold data, or process transactions which are important to the organisation, then they become bottlenecks for delivery. It can be very challenging to replace legacy systems, so it is common for new technology to layer on top of the old, accumulating over time. Teams must be wary of the infectious nature of toxic technology – when new technology integrates with legacy technology, some changes can only move at the pace of the slowest system.

Operations

The ease of changing technology is also important for its operation – it improves the ability to manage failure. Teams need to be able to release fixes with confidence, knowing they’re unlikely to cause side effects. To help detect and diagnose failure, operators sometimes need to add new ways to observe how the system is behaving. Toxic technology increases the risk of accidentally making things worse when trying to deal with an emergency.
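As an illustrative sketch of adding observability (the event names and fields are invented), structured log lines give operators something searchable when diagnosing an incident:

```python
import json
import logging
import time

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("payments")

def emit(event, **fields):
    """Emit one structured, machine-searchable log line."""
    log.info(json.dumps({"event": event, "ts": time.time(), **fields}))

# Hypothetical call sites, added so operators can see how the system behaves:
emit("payment.started", order_id="A123", amount_pence=4999)
emit("payment.retry", order_id="A123", attempt=2, reason="gateway timeout")
emit("payment.settled", order_id="A123", latency_ms=840)
```

In a brittle system, even adding a log line like this can feel risky – which is itself a symptom.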

Toxic technology makes systems brittle. If changes cause unwanted side effects, small incidents can grow into catastrophic failures. Legacy systems will often grow a reputation for stability, but this reputation is a result of infrequent change. When incidents do occur, it takes a long time to recover and the impact on the organisation and users is more severe. 

The organisation

A catastrophic IT outage is a big threat to organisations. It creates an existential threat to commercial organisations, and can cause the loss of trust in a public institution. In recent years, British Airways and TSB have experienced outages which impacted customers over several days. When outages last this long, compound effects are seen – customers will shop elsewhere, and public service users could see impacts on their lives. Modern technology companies, currently less burdened by toxic technology, have outages which usually last only a few minutes or hours. Sometimes the press coverage of these is high-profile, but compound effects are avoided as business returns to normal. However, in the public and banking sectors, the loss of critical systems for several days is becoming increasingly normal.

Communications around these kinds of persistent outages state, or imply, that there is a single root cause. The truth is that any major technology failure will have multiple complex causes: a succession of technical and organisational failures, alongside human error and bad luck. But organisations are unlikely to reveal these complex causes publicly, because it begins to expose the depth of toxic technology the organisation relies upon. Where they have accumulated toxic technology at scale, small failures can cascade into catastrophes.

Toxic technology is a cyber security risk. Neglect means that technology is not kept up-to-date, and vulnerabilities emerge. Pressure on delivery means good security practices are routinely deprioritised. Meanwhile, cyber crime is increasing year-by-year, in terms of both frequency and sophistication. Insufficient cyber security, caused by toxic technology, will someday result in the failure of a major corporation or institution.

Challenging new legislation like GDPR is beginning to take effect: in July 2019 British Airways faced a £183 million fine for a data breach. Their rival easyJet is being investigated for the unauthorised access of 9 million customers’ personal data. Most organisations being fined so far are established organisations with high volumes of toxic technology. Legislation of technology is expanding further – including accessibility, online platforms, and encryption – meaning toxic technology is also becoming a legal concern.

Where technology is subject to standards, the impact of toxic technology can be critical to doing business. Toxic technology is very likely to breach standards through poor security, or a failure to keep up with changing requirements. PCI DSS is a standard used by the payment cards industry to protect customers’ financial information. Failing to meet the standard can result in legal action, the cost of fraud, loss of revenue and a range of other negative impacts.

Cost

Toxic technology places a financial burden on organisations. It makes it harder to be strategic and innovative because resources drain away on tactical or emergency manoeuvres. Only the highest priority work can be done, because the cost and risk of change has risen so high.

Toxic technology can get stuck – becoming both too hard to change and too costly to replace. But even if remaining unchanged, costs increase because of a changing context. Most technology is connected with networks, platforms, APIs and ad-hoc user integrations. Whenever external systems change, there are costs to re-integrating the technology. New security vulnerabilities mean new investment in cyber defences. Changing supporting platforms can trigger expensive migrations. People costs increase as access to niche skills and knowledge becomes harder. Commercial contract renewals are renegotiated at higher prices to support ageing technology. And throughout this process of toxic technology growing in cost, modern replacements tend to become cheaper, but remain out of reach due to the cost of switching.

The simplest way for toxic technology to impact an organisation’s finances is by buying or building something that’s not needed. The technology industry is full of ‘silver bullet’ solutions to complex problems, particularly in domains like data warehousing and cyber security. Faced with a complex public health and economic challenge like Covid-19, governments across the world have responded by spending significant sums on apps. Whilst these apps sound intuitively useful, and are very announceable for political leaders, there’s no evidence to show they’re effective. If they’re not scrapped entirely, governments are left with complex sustainability and data privacy challenges.

Users

Pain and frustration are common experiences when using technology. But design is even more challenging when the medium is brittle and slow to change. Technology provides its most painful experiences inside large, slow-moving institutions, where toxicity has accumulated over decades. Inside these organisations, staff are mandated to use dire technology in order to go about their daily duties. In August 2019, Dr Dominic Pimenta described his experience as a junior doctor in the UK’s National Health Service.

His experiences are very typical for public servants and administrative staff in large, established organisations across the world.

It is common for users of toxic systems to improve their usability with a layer of spreadsheets and paper-based workarounds. Whilst this can optimise use for long-term users, it makes it harder for new users to learn. As new staff enter the workforce with the raised expectations of internet-era services, they will be less tolerant of technology which is frustrating and confusing to use. The impact of poor user experience is likely to hit certain groups more than others, with the effect of excluding those with permanent, temporary or situational disabilities.

Service outages, caused by neglect of technology, can erode trust with users, and prevent them from meeting their needs. The impact can range from the inconvenient to the life-threatening. Outages to services such as medical advice, security monitoring, housing, and access to money could have enormous impacts on the lives of users. 

Toxic technology is also more vulnerable to cyber attack, which can have a significant impact on users – such as the suicides following the Ashley Madison breach, or the leak of the HIV status of 14,000 individuals in Singapore.

Institutions such as governments or monopoly service providers present a big risk to their users if toxic technology accumulates. When the quality of the user experience diminishes, there is no choice of alternative, and users must suffer the consequences.

The symptoms are everywhere

These symptoms are common in larger, more stable organisations. But their causes are systemic, and so start-ups and high-growth tech companies are not immune. Without bold new approaches to building and sustaining technology, supported by changes to how we fund, staff and govern teams, the outcome is the same: the accumulation of complexity and toxicity.

These systemic causes, and ways to mitigate them are the subject of future instalments.

Continue reading part 3.

Sign up for future instalments

Sign up for email updates below.


Toxic technology

Part 1 of a series on Toxic Technology.

In 2018 I wrote about toxic technology, a short post explaining the threat organisations face from the legacy technology they accumulate. To explain the idea in more detail, I wanted to write more. This series of blogposts will cover a range of topics which contribute to toxic technology – the way teams work, the strategies we use, core operational processes, and market incentives. Later in the series, I will write about how to avoid, manage and mitigate the risks of toxic technology. This post is the first of many instalments, so if you’re interested, please do sign up for more.

Toxic technology is eating the world

In 2011 Marc Andreessen suggested that software is eating the world. He described the phenomenon of new companies using internet-enabled business models to disrupt established markets.

“we are in the middle of a dramatic and broad technological and economic shift in which software companies are poised to take over large swathes of the economy….all of the technology required to transform industries through software finally works and can be widely delivered at global scale”

Marc Andreessen, 2011

At the time he wrote this, in 2011, over two billion people used the internet, up from an estimated 50 million in 2000. Andreessen predicted that in the next ten years “at least five billion people worldwide [will] own smartphones [with] instant access to the full power of the Internet”. A decade later, there are an estimated 5.11 billion mobile users and 4.39 billion internet users globally. The majority of the world’s population are now internet users.

In the decade since, this pattern has continued as companies such as Airbnb, Uber and Snap Inc. have disrupted markets. But a different pattern better characterises the most recent decade: not market disruption, but the accelerating use of digital technology in existing organisations. This has created an impact across diverse sectors such as government, finance, retail and transportation.

New digital technology has been a trigger for widespread change in the public sector. Governments across the world are now transforming how they work using internet-era methods. The US Digital Service, UK Government Digital Service and e-Estonia movements led the way, and many more are following. In the US and UK, these changes represent a rebirth after an era of outsourcing, where investment in technology was principally done through procurement. Now governments and public sector bodies are building technically skilled workforces, and producing large volumes of their own technology. There are currently over 900 national and local governments and agencies contributing code on GitHub.

Disruption has also occurred across the financial sector. Large technology companies such as Apple, Google and Tencent have disrupted consumer-facing payment services. Fintech companies like Stripe, Square and Ant Financial have created innovative and popular products. But, at the heart of the financial system, established banks remain the dominant force. To compete with market entrants, conglomerates have invested large sums in digital transformation. BNP Paribas invested $3 billion in 2017, HSBC $17 billion in 2018, and JP Morgan $10.8 billion in 2019. Challenger banks like Monzo and Starling represent a more direct challenge to established banks, but whilst their growth is rapid, they remain niche players in the global banking sector.

Small to medium enterprises (SMEs) are the majority employer in most countries and sectors. These small organisations collectively make a significant contribution to the software produced globally. Typical software produced by SMEs includes systems to help with routine administrative tasks, such as case management and customer records management. SMEs also produce millions of websites – whilst many are constructed using templated tools, many also involve writing bespoke code.

Outside professionalised software development communities, people use general purpose tools to create software. They may not identify as software developers, yet they create abundant software. Excel formulas, Microsoft Access databases and customised ‘low code’ platforms are examples. Millions more create software online, editing the markup code of their websites. Rudimentary knowledge of HTML (Hypertext Markup Language) and CSS (Cascading Style Sheets) can help them go beyond standard templates. Content building platforms like WordPress and Shopify democratise software development. This long tail could perhaps be the largest software sector – the user-generated content of the software industry.

A crisis in the sustainability of software

Software is growing in every sector, and within organisations large and small. Software is playing an increasingly vital economic and social role. But is it sustainable for software to keep eating the world? Do we have the resources to ensure all this software remains healthy and effective? Many patterns suggest this is not the case. There is a crisis in the sustainability of software.

When technology is not sustainable, basic cyber security and maintenance practices lapse. This causes organisations to experience data breaches with increasing frequency and at increasing scale. But many of these attacks and accidents are preventable. High-visibility outages are becoming more common as a result of neglected technology. Systematic records on service outages are not kept, making trends hard to observe. But, as technology in most organisations is ageing, it is reasonable to assume the trend is worsening.

The European Union introduced the General Data Protection Regulation (GDPR) in 2018 to strengthen pre-existing privacy legislation. Whilst remaining subjective, it has become harder to argue the compliance of a large legacy technology estate. Data protection is now a bigger challenge for organisations – it requires more investment in modernisation, and nurturing a culture of maintenance.

It is now expected that digital services are accessible – designed inclusively so that they are simple and easy for all to use. Inclusivity affects everyone, covering permanent, temporary and situational needs (e.g. deafness, ear infections and noisy environments). Some countries are beginning to legislate in this area, adding new legal responsibilities. Yet the ways in which existing services exclude are often trivial to identify. Away from mass consumer markets, niche software is often inaccessible for many — staff and specialist users must work around the flaws. Low quality niche software can be the daily working experience for administrative staff in large organisations, made worse when the design excludes them.

High-growth tech companies provide many of the services that consumers experience daily. When consumers order a taxi or a takeaway, or buy a book online, they use technology which has recently been renewed or replaced. High growth provides the abundant resources to make this possible. Established organisations rarely experience high growth, so accumulated technology becomes a maintenance burden. With limited investment in technology, organisations prioritise high-profile services. Established airlines provide a good example of this. Buying a flight online feels like a modern internet-era service. A less-used service, like changing your flight, can be very challenging. Lowest priority of all, office administrators will often use ageing, low quality ‘back end’ systems.

Unsustainable technology inhibits the agility and stability of organisations. It will become a threat to their existence. Businesses will not be able to compete with the agility of younger, leaner organisations. The role of institutions will erode through lack of trust, with citizens opting for market alternatives. The importance of sustainability goes beyond the impact on organisations. If software is eating the institutions which form the structure of our societies, it must not cause them to fail. We must find ways to make digital technology sustainable over decades if these institutions, and public trust in them, are to endure.

Network and data centre energy consumption is already set to increase as a proportion of global energy consumption. If digital technology is not made sustainable, inefficiency will result in avoidable, accumulating energy use. Sustainable digital technology is necessary to avoid the internet revolution becoming a key contributor to climate change.

Digital technology will not stop eating the world – the promise of automation is too great, and technology can have a positive, transformative effect on people’s lives. If it cannot stop, and should not, then it needs to become sustainable.

What is toxic technology?

Toxic technology describes the harmful characteristics caused by poor design, or neglect. Poor design is common in an industry where outputs are often favoured over outcomes. Neglect is systemic – caused by short-termist cultures, processes and practices which inhibit sustainability.

Whilst the impacts of toxic technology are significant, examples of it are mundane, everyday, and recognisable to most. It is: the broken kiosk at the local museum, the ageing computer-on-wheels trolleyed around the hospital, the unpatched web server that led to the embarrassing data breach, or the strange green-on-black interface from the 1990s used by the back office staff at a big bank. Toxic technology is around us all, powering our banks, care homes, warehouses and submarines. It’s pervasive.

The following are typical toxic characteristics in technology. Each is challenging and subjective to measure – making toxicity hard to expose. A rough way to catalogue them is sketched after the list.

  • Insecure – unacceptable risks to breaches of confidentiality, loss of integrity or lack of availability
  • Unscalable – an inability to respond to change of scale, such as increased usage, number of users, or complexity of the domain
  • Unreliable – lacking durability, availability and predictability
  • Non-compliant – non-compliance with the law, standards or an organisation’s policies
  • Inaccessible – the design excludes users
  • Hard to support – cannot be maintained effectively and efficiently 
  • Hard to change – cannot be changed effectively and efficiently
  • Opaque – important information about the service cannot be obtained when needed
  • Overly expensive – the service isn’t value-for-money
  • Poorly understood – the service and its technology is poorly understood

Software in particular can turn toxic quickly – more so than physical technologies. Bridges can fail and buildings can decay, but the patterns of neglect are reasonably well understood, and occur over decades. Software decay is faster, less predictable and subject to more complex external factors. Cyber security vulnerabilities can emerge in any component part. Open-source communities may become unreliable. Commercial suppliers may go out of business, or stop working in your interests. Even doing the basics, like patches and upgrades, is challenging due to the norms of culture, practice and process. The software industry is not yet mature enough to match the risk-management rigour of civil engineering.

The term ‘toxic’ is intentionally evocative language to give a sense of active harm, worthy of attention. Terms like ‘legacy’, ‘technical risk’ and ‘technical debt’ are useful, but don’t give a sense of urgency.  For most organisations, toxic technology is a growing and ignored problem, so a change of language could help. 

Systemic issues are the principal cause of toxic technology, not individuals or teams. This is important to recognise when using the very negative term ‘toxic’. The assumption should be that historic creators and decision makers made decisions in good faith. Ageing technology accrues toxic characteristics which become more visible from a contemporary perspective. Historic code reveals the culture, language and decision making of its time. It should be valued as a form of communication from the past to the present – perhaps even aesthetically appreciated, like historic buildings. Toxicity is avoided through understanding that it can emerge over time from even the most thoughtfully designed technology.

Continue to part 2…

Credits

Nick Rowlands (@rowlando) for the idea to publish as a series of blogposts, reviews, and general encouragement to write more.

Steve Marshall (@SteveMarshall) and James Stewart (@jystewart) for their many second opinions on my writing.

Giles Turnbull (@gilest) for timely advice to improve my writing.

Sign up for future instalments

Sign up for email updates below.
