The ways we can go wrong when considering calls for institutional improvement
Spend some time in any institution and I bet you would notice at least one of these issues crop up. Sometimes people push back against a change too hard; other times they end up copying glitzy approaches without really thinking things through.
The issue is how institutions react to suggestions for change. One of the most important things to consider when leading an institution is how it deals with changes to itself. By institution I mean any collective: it could be your school, university, company, team, or even government. Depending on the nature of the institution, its current and historical context, and so on, it can be quick to move, fairly lethargic, or downright hostile to change.
The NIH ('Not Invented Here') syndrome is a disease - Linus Torvalds
NIH ('Not Invented Here') syndrome is the tendency of collectives to avoid using products, systems, or advice from an external source. The idea stems from the assumption that in-house resources would develop a better solution, since they have more knowledge of the organisation or team they are part of.
To be fair, this is not always a bad way of thinking. Joel Spolsky wrote an article about it two decades ago which is still relevant and goes into detail. Its main point: if something is core functionality, or genuinely differentiates what you offer in a way that provides a competitive advantage, then it's probably not a good idea to use an external solution for it. However, if it is not really core to what you do, then it's probably worth saving your time and effort and using an external solution instead.
There are also questions about how far down the stack you want ownership and control. Joel Spolsky mentioned the Excel team building their own C compiler. If you were working on a software project, would you do the same, even if performance and memory optimisation were critically important to you? The answer often depends on the resources at play (the size of the team, how much money and time you can invest in it) as well as the potential return on investment. Apple, for example, long used microchips from external vendors like Intel, but they see owning the performance sphere as truly core to them, have the resources to develop their own chips, and see a return on that investment in the form of competitive advantage. So now their lineup features their own chips.
Having said that though, in many cases NIH syndrome is a symptom of inertia and an aversion to systemic change.
I couldn't find a proper term for it, so I've made one up: Institutional Exceptionalism. In large organisations especially, you might encounter a feeling of resistance to change. Even if someone mentions something they could learn from others, often the response is that it won't work inside their institution.
I remember, almost a decade ago, running a workshop on web standards and best practices for a group of people in government. Most of them worked in the department responsible for that government's national websites. These websites were extremely poorly designed and seemed to use a lot of practices which were, even at that time, considered bad.
During my workshop, I demoed a few government websites from other countries, hoping it might inspire the group to raise their game and learn how they could design and develop something similar. Afterwards, a few of the engineers came up to me explaining how complicated their country's situation is (so many national languages), how much traffic they get every day (it wasn't that much, actually), how much of a resource crunch there is, and so on. They were convinced that their situation was so unique that no industry best practices could properly apply to them. I internally sighed and thought to myself, 'Ah, that explains the state of their websites for the last 15 years'.
During some other meetings (again, unfortunately, with government bureaucrats) I have come across the same devaluing of ideas. I remember one particular instance where a foreign diplomat was speaking at a meeting attended by people from private industry as well as bureaucrats from the home country. Afterwards, a bureaucrat from the home country completely trashed the foreign diplomat's ideas, saying simply that their country is different, so it's not a valid comparison. If we go by that logic, we can't learn anything from anyone, since nothing will ever be 100% the same between two institutions!
Some people tend to go on the defensive when hearing ideas from external sources, thinking it's some sort of attack on their own thought process and work. Sometimes people simply fall prey to the Dunning-Kruger effect, overvaluing their own expertise while undervaluing that of others.
I recently came across this article, where some bureaucrats in the US show the same behaviour of overvaluing their own capabilities and undervaluing others'. A telling quote from it is the following:
The worst phrase I keep hearing: apples to apples. The idea is that projects can’t really be compared, because such comparisons are apples to oranges, not apples to apples; if some American project is more expensive, it must be that the comparison is improper and the European or Asian project undercounted something. The idea that, to the contrary, sometimes it’s the American project that is easier, seems beyond nearly everyone who I’ve talked to. For example, most recent and under-construction American subways are under wide, straight streets with plenty of space for the construction of cut-and-cover station boxes, and therefore they should be cheaper than subways built in the constrained center of Barcelona or Stockholm or Milan, not more expensive.
Another aspect of such thinking (including NIH) is resting on past laurels. An institution might have a grand history of producing great work, but that doesn't necessarily mean it will keep doing so in the future.
Consequences of this type of thinking can range from a mere lack of innovation to dangerous and foolhardy policy. For example, take the university which brought 40,000+ students back to campus during COVID times based on a model created by two over-confident physicists, who said epidemiology was important but intellectually unchallenging. It turns out their model made flawed basic assumptions, and the result was a mass spread of COVID-19 cases across the university.
Having said that, there are instances of people going overboard in the other direction too.
Isomorphic Mimicry is a term from the field of biology dating back as far as the 19th century. It was used to describe one organism mimicking another to gain an evolutionary advantage. A well-known example is the Scarlet Kingsnake (which is non-venomous) mimicking the look of the Eastern Coral Snake (which is highly venomous). Another example is the Swallowtail Butterfly, which mimics the look of certain toxic butterflies.
Over the years, this term has made the jump beyond biology and now essentially means the tendency to look at some other institution's successes and replicate every aspect (or a large portion) of them - whether processes, systems, metrics, etc. - without really considering whether they could serve the context of your own institution.
It is meant to give the appearance of success or legitimacy, without necessarily solving the deeper issues involved in achieving success. You get the benefits of appearing like a successful entity without actually being one.
To be fair, that's not always a bad thing. Meiji-era Japan in the 1800s sent a number of people to various western countries to learn from their systems of governance as well as their commercial, artistic and scientific advances. They sent officers to France to study its courts, army (they also studied Prussia for its army) and police. They studied the navy and the postal system in Britain. In the United States, their officers studied the banking system and arts education. All this laid the foundation for modern-day Japan.
Vaccination of children has increased greatly over the last few decades and has, by and large, been a massive success. For the most part, this has relied on many countries simply mimicking each other's approach. The result has been a dramatic lowering of child deaths from vaccine-preventable diseases.
However, if I had to guess, I would say that significant successes rarely happen via absolute mimicry of another institution. You need to adapt things to better suit your home institution - learning from others while synthesising it with the nuances of your own institution to come up with something much better fitted to you. Along the way, while looking at others, you should repeatedly ask why a certain approach is taken, to truly learn the real reasons behind their rituals, tools and processes.
Knock-on effects of isomorphic mimicry
"I suppose it is tempting, if the only tool you have is a hammer, to treat everything as if it were a nail." - Abraham Maslow
Take software frameworks as an example. Over time you have a market demanding skills in the latest hot framework, and an industry of courses and books selling training on it. Eventually you end up with a crowd of developers who are trained in it and willing to use it, even in situations where it may not make the most sense. I have encountered plenty of framework-laden websites which were so trivial that I wondered why they needed a framework in the first place. I've seen people use certain build tools in needlessly overcomplicated ways too.
One of the knock-on effects of isomorphic mimicry at scale is a crowd of people nodding their heads about things which may not actually suit everyone. It can also lead to more disastrous and sinister effects. There are instances of people who were part of institutions with very exclusionary hiring criteria moving on to other institutions and replicating those hiring criteria there too. The result is that people associate good institutions with just the kind of people who pass those criteria - which, most of the time, puts minorities and economically struggling people at a disadvantage.
You'll find it everywhere
Once I started thinking about these things, I started experiencing a Baader-Meinhof phenomenon of sorts. I realised that many topics have people on both ends of the spectrum. Agile software development is an example. I've encountered people who absolutely never want to work in an agile environment and others who never want to work without it.
For people who are apprehensive of Agile but have never tried it themselves, a lot of it comes down to hearsay, fear of change, and feelings stemming from NIH syndrome and Institutional Exceptionalism (and sometimes even hubris). On the other hand, there has been large-scale adoption of Agile by companies that don't fully understand it and end up adopting it in name but not in spirit. They perform all the rituals and ceremonies associated with it, without really thinking about whether it applies to their organisation, or whether adaptations might be needed to better suit their institution.
Somewhere in between are institutions that look at it, apply context, make a plan to adapt it to their needs, and execute well on it. They are the ones who will have more success with Agile than others.
I think at some point every one of us experiences these biases at some scale or another - I'd reckon more frequently than we would like to admit. Next time we instinctively push back on something, or instantly think something is a great idea to adopt, it might be good to pause for a moment and truly ask why.