August 20, 2020

Podcast #4: What is Technology?

And why do we need technology assessment? An interview with Jim Thomas

In this episode, ETC Co-Executive Director Jim Thomas explains how ETC understands technology, "Mooney's Law," and the utility of technology assessments for social movements. The episode is hosted by Zahra Moloo.

Listen below or subscribe to our podcast via Apple, Spotify or Google.


Zahra Moloo: Jim, let's begin with what seems to be a very obvious question, which is what is technology?

Jim Thomas: I think often when people think about the word technology, they think of certain types of modern high-tech technologies, whether that's phones, computers, airplanes, biotechnology, or whatever. Obviously, technology is much more than that. We're really surrounded and saturated with technologies, many of which we don't even recognize.

I'm sitting in front of a window right now. The window is a technology for looking through a wall. I'm wearing clothes. Clothes are the technology for keeping people warm and sheltered. I'm in a house. A house is a technology for dwelling and protection from the elements, and so forth. Writing is a technology of communication across time and space. We can see many things around us are actually technologies.

What is a technology? I think a common definition of a technology is that it's a tool or an artifact that allows us to solve a problem or do something. I think it's more complicated than that actually. One of the important parts of the word technology is ‘techne,’ the same root that you get in ‘technique.’ In a technology, often what you have is a series of techniques that have been made actual, sometimes in a material object or in a system of organization, such that it does something in the world by itself with a certain level of autonomy.

If I can give an example: a slingshot. A slingshot throws a stone very far, but in some ways, it's a set of techniques where you take a stone and you pull it back, and then you launch it very fast. All of that is brought together in the technology of the slingshot. It does all that for you.

ZM: It's almost like it's also a set of processes embedded within something. The definition you are giving is very broad, from writing to slingshot to complex systems. We often think of technology as belonging to the so-called modern world, the Western world, but there are also other technologies.

We rarely hear people talk about indigenous technologies nor do we think of technologies as being universal. Can you talk a bit about that bias towards thinking about technology as coming from a specific place, a particular system of thought?

JT: When we think about technologies as these kinds of modern high-tech technologies, that's the effect of a particular culture and indeed an industry. We have technology industries that create that story of technology - that it's these high-tech, glitzy things that are sold to us regularly. Agriculture, for example, is full of technologies: the technology of the plow, the technology of the hoe, the technology of seed winnowing.

All of these are tools and techniques brought into material objects that allow agriculture to happen. Water technologies, too - I think about how many traditional cultures have ways of holding and storing water, of distributing water, of sanitizing water. There is an amazing technology where it's just silver bowls [holding water]: left there for a period of time, the water becomes clean. That's a technology. It's a traditional indigenous technology that's been around for many years.

Indigenous and traditional communities have many technologies. Technologies of writing - they say writing is one of those technologies that we don't even see as a technology these days, but it's very much a technology. It structures and it changes and it gives us power.

ZM: It is also very transformative - writing has transformed so much of the world. Or even if you think of things like cartography, or maps as a kind of technology.

There's also quite a mythical dimension around technology. It seems as if the very word makes people freeze up, and maybe defer when otherwise they'd be more critical. How do people project positive change onto technology or project authority onto scientists and technical experts?

JT: I think there's something that's happened very much in the last few centuries, which is the conflation of science and technology. Science is different systems of knowing, but when we say science these days, we often think of a very particular kind of science, the Cartesian scientific method. There's often been a very deliberate conflation of that with technology and engineering.

We hear about STEM - science, technology, engineering, mathematics - as being one thing, and they're not disaggregated. Just as a certain expertise and authority is lent to a certain kind of science, so we're told that technology carries that scientific authority and expertise, by which is meant high-tech science.

Philosophers of science, technology and society will talk about technoscience - the sort of technologies and science that build each other up - as opposed to the knowledge of communities, which is also science (science just means knowledge), or the indigenous, bottom-up technologies that are always being innovated but aren't in that specialized technoscience bubble.

ZM: At ETC Group, we often talk about technology as not inherently good or bad, but as political. You're touching on that by talking about this conflation of science and technology, and a particular definition of science. How can you explain this definition of technology as not good or bad, but as political?

JT: There's many ways to go into this. I think it's useful to start with that idea that a technology is really a set of techniques brought together. I think it's understood that techniques are political. The way in which you do something has built into it all kinds of power relations for example.

ZM: Can you give an example?

JT: Sure. I can give this in two ways. One is the composition of a technology. Take something like the computer - seemingly a politically neutral thing - which is brought together from a whole series of processes, whether that's the mining of rare earth materials, the large amounts of energy production that are required, the concentration of capital needed to design the chips, or the certain types of knowledge you require.

All of those processes that make possible, say, the personal computer on your desk, have a certain political cast to them. You require certain systems for that to happen. The philosopher of technology, Langdon Winner, talks about nuclear power as a very good example of this. In order for a nuclear reactor to be operated, you need to have systems of military production, you need to have systems of extracting uranium and all sorts of radioactive materials. That technology cannot exist without a certain political reality behind it.

That's one way in which a technology has an inherent politics built into it: how it's produced and how it's kept and how it's run has a politics. Another way is that the use of a technology has built into it certain power relations. One of the clearest examples of this is a technology like a gun.

Very clearly, if somebody is using a gun, the person who holds the gun and points it, whether at a living animal or a target, has power over the thing that it's pointing at. Now that's a very specific example, but it's built into the technology. The technology gives that power to the user. That's also true actually about something like a motor car. The person who holds the steering wheel has certain types of power over the other people in the car, over the people around the car.

That person using the technology is given certain power over other people around them. It's not just true for these two examples. As you begin to look at whether it's genetically modified seeds or whether it's certain types of surveillance technologies or whatever, you see these kind of relations of power that are physically built into our technological objects.

Sometimes those are intentional, as in a gun. Sometimes those are just the effect of assumptions of the people who designed them. This is something that disability campaigners really point to.

They point to how they cannot get onto a bus because they don't have the same set of abilities as many other people. That's not because the bus was trying to be exclusionary of disabled people, it was because there were a set of assumptions about what makes a human being that were built into the design of those technologies.

This is why often, technologies that are built by people who are white, or male, or have other dominant traits in society exclude others who are already marginalized. You build in those marginalizations because assumptions about what makes a human being or who the person that's using this gets built into the technology. In that way, the technology is deeply political because it exacerbates the differences between the marginalized and those who have power.

ZM: I was just going to quote Pat Mooney, the founder of ETC Group, who said, “A powerful technology introduced into an unjust society will always increase the gap between the powerless and the powerful.” It seems like we live in a world in which those power relationships are so well established that it seems inevitable that the well-financed technologies we see coming out of certain industries today would increase that gap.

JT: What you just said there is what at ETC we call Mooney's Law - the idea that a powerful technology introduced into an unjust society will increase the gap between the powerful and the powerless. Of course, the kicker there is: when did you last see a just society? This is an unjust society, and we're seeing ever-more powerful technologies.

There are certain questions that need to be asked when any new technology is being brought into society - it doesn't just drop on society from above, it's being developed and created. Whose interests is it likely to serve? Who is going to get to use that technology, and who's going to be excluded in the process? These are deeply political questions. That's why at ETC Group, we argue for technology assessment. As important as developing open and accessible technologies is being able to politically assess their impact.

ZM: A little bit further on, I will ask you more about technology assessment because it's a whole topic of how do you do technology assessment, what does it actually mean. Just going back to technology and technology development more broadly, would you say that technological development is inevitable?

JT: Yes, absolutely. People develop technologies all the time in the nature of working with their environment, growing food, living together, and finding new ways of being together. I think technological development happens all the time, has always happened all the time. But it's interesting to say what kind of technologies? There are technology theorists like Lewis Mumford who would say there are different strains of technology. There are democratic technologies and there are authoritarian technologies.

There are technologies that as they come into societies, lend power to authoritarian systems and others that democratize. The bicycle is a technology I think of as being very democratic. It's fairly easy to access, it's fairly easy to repair, it lets us live in ways that are more convivial, whereas you might say that the private motor car has more authoritarian tendencies. In order for it to operate, you need to separate space and certain people become more powerful.

You can start to see these different strains of technology. This is where it goes back to the question of traditional and indigenous technologies. Often, those technologies that get named ‘low-tech’ are really democratic technologies. They work well with more community-based societies.

ZM: Could you have a situation where a technology that's created as a democratic technology can then become used in a more authoritarian way or instrumentalized?

JT: Yes, probably. Technologies combine all the time. The example of genetically modified seeds might be that. You've got biotechnology companies taking technologies that have been developed over centuries by farmers - technologies that are open source, if you like, that have been shared and fitted to the land, to their own place, and to communities’ food systems - and adding additional elements that allow them to work with pesticides or fertilizers in ways that become more privately controlled, and then get used to control farmers and food systems.

So, yes, for sure. This speaks to why there needs to be a certain vigilance about the ongoing politics of technology. There need to be systems in place to watch what new technologies are emerging that may have disproportionate impacts on people's power and rights, on the environment, and so forth. We need to have a certain level of social surveillance of technologies.

ZM: ETC Group has been doing a lot of work on emerging technologies over the years, but those who are not so familiar with our work are often quite surprised to hear about some of the lesser-known technologies that we work on. Can you give some examples of new and emerging technologies that ETC Group is now paying attention to and monitoring?

JT: Sure. ETC Group has, for many years, been watching developments in biotechnology, genetic engineering, which has now become known as synthetic biology and gene editing. These are different ways of changing the genetic makeup of living things. What we're increasingly seeing is that technological platforms are coming together.

That is to say, in order to modify life at the scale of genes and so forth, you need to have a level of control over information - that's information technology, where we see artificial intelligence and developments there. You also need to have a level of control over physical matter. There you have developments in nanotechnology, where increasingly technologists are changing and rebuilding materials at the level of atoms.

This coming together of all of these technologies then creates all sorts of things. One thing that we're seeing, and have been seeing really for about 20 years, is an attempt to bring together these powerful high-tech platforms - nanotechnology, biotechnology, information technology, artificial intelligence, and cognitive technologies - into a common platform, from which you can get really quite unusual technologies.

This is now increasingly being called the Fourth Industrial Revolution by groups like the World Economic Forum. We're seeing strange hybrids of living genetically modified organisms behaving more like robots or being able to program with DNA and things like that.

ZM: Can you give an example of one of those? For instance, within synthetic biology? First of all, what is synthetic biology? Then what would be a technology that works in the way you've described?

JT: Synthetic biology is a buzzword for the next stage of genetic engineering. What it involves really is synthesizing the parts of life, like its genes and RNA, in order to make quite novel living organisms that are built from the bottom up. One of the ideas under synthetic biology is that ultimately, living organisms have a code, DNA, which you could reprogram in the same way you reprogram a computer.

Now, there's all sorts of problems with that idea, but that's the organizing metaphor. If you can design the DNA code on a computer such that the DNA code of a bacterium starts producing plastics or vanilla, then you can create little cells that become like factories - plastic factories, vanilla factories. This then becomes a new manufacturing paradigm. That's a lot of what's happening in the synthetic biology world.

Increasingly, in order to redesign these little cells, these little molecular factories, synthetic biologists are using artificial intelligence. They're taking massive amounts of genomic information - that is, the codes of DNA from many different organisms - feeding it all into large AI systems, and then asking those AI systems to design new living organisms: can you come up with the codes that will make a particular flavor, or a particular color dye, or something like that?

We're now seeing in this area that's known as bio intelligence that genetic engineering of organisms is not being done by people anymore. It's being done by large artificial intelligence systems and robotic genetic engineers. Bio intelligence companies are basically saying, "We'll ask the artificial intelligence to come up with the design of the DNA, we'll ask the robots to build it, and then we have a living organism that will make whatever we want."

Increasingly, human beings aren't very involved in that process. It's a lot of power, and there's a real fight between particularly China and the US to dominate bio intelligence, but it also raises some major safety questions. If you don't even know how you're making these organisms, if it's not understood by any human being, then when something goes wrong, it's going to be hard to clean it up.

ZM: On the other hand, people might argue that having that potential to create, let's say, synthetic substances, like vanilla, as you mentioned, might actually be cost effective. Also, given that we are using so many resources on the planet, given the environmental catastrophe, are there not benefits to having this technology?

JT: This is exactly why we need technology assessment, because you can make that claim and the companies do make that claim, that if you could make for example, synthetic vanilla, which they're making out of synthetic biology cells, then you're able to produce it more cheaply and so forth, but obviously, there's other questions. What does that do for vanilla farmers in places like Madagascar? In turn, what does that do to the environment that those farmers actually protect, which is forest environments?

You have to interrogate the implications of this technology beyond the claims of those who make it. That's something that anyone looking at technology has known for years. In fact, Plato - recounting Socrates, I think it was - wrote about how the inventor of a technology is not best placed to determine the impact of that technology. He was talking about the inventor of the technology of writing, and about how writing had actually had all these problematic effects.

That's even more true when you're writing DNA or you're rewriting our society through technological means, which is increasingly what's happening through artificial intelligence being applied to communications and other areas too.

As technological abilities drive production, drive how we have politics and discussions, we need to have the capacity to evaluate those technologies to see if they are authoritarian, if they have unfair impacts on people who are already marginalized, and to try and understand the built-in politics of those technologies.

ZM: That's what you mean when you say technology assessment. It is an evaluation of all the impacts of the technology, but also how it is created and who is behind it. Can you talk more about that process? How do you do technology assessment?

JT: Well, actually, people do technology assessment all the time. Most people have some level of assessing the technologies that they're interacting with. They're deciding, should my children have cell phones? Or should they have cell phones but only on these occasions? Should I be eating genetically modified food? Or do I feel comfortable eating pesticides?

People are continually assessing the technologies they come in touch with. There's a lot of very good, common-sense assessment that goes on all the time in society and between people. What we're missing, crazily enough, is that being brought into the governance of technologies. At the level of governing technologies, there is an assumption that technologies are neutral, that those who bring them to society always understand them and have the best interests at heart, and that there won't be unforeseen effects.

Sometimes there are rules that will mitigate some of the worst environmental effects, or the health and safety effects, but in terms of how technologies change power relations, push people out of jobs, de-skill people, or make some people more powerful than others, there are really no governance mechanisms in place to track that and to try to re-balance it in a fair way.

What we're saying is, we need those mechanisms upstream. As much as governments put money into trying to promote technologies as engines of the economy, they need to be putting effort into assessing technologies which may disrupt and unfairly impact large swathes of the population.

ZM: What does exist now at the level of governance in terms of technology assessment or evaluation or even just consideration of new technologies?

JT: Part of what's happened is that we have technology-specific regulations or governance frameworks around particular technologies, where society has run after those technologies and said, "Well, wait a minute, there are safety questions here, or this seems to unfairly impact workers." These frameworks operate technology by technology.

ZM: Can you give an example of one of those?

JT: Yes, global agreements on biosafety of genetically modified organisms, or the rules around pesticides and toxins in food. What's interesting is that we're in an era now where technology as an engine of the economy is so powerful - the most powerful actors in our economy are now technology companies - and they're driven by an ideology of disruption. It's seen as a good thing to disrupt society.

The ideology of Silicon Valley is 'move fast and break things': try to upturn things in order to get creative disruption that will make new sources of funds and money. The idea that disruption is a good thing can only be true if you're very well insulated against the shocks and the pain of social disruption. It's only true for the richest and most powerful people.

For most people, who are trying to deal with the effects of change and are already highly vulnerable, disruption is a threat. There's this perverse idea that we want disruptive technologies, that we want to upturn our societies continually and make money along the way. We should be able to say that disruptive technologies will always benefit a small number of people but are likely to have big downsides, and that we have to start from protecting those they are going to impact.

ZM: Can you talk a bit more about the issue of governance? Is there an example of a particular technology being discussed at the international level, in particular fora, where this need for technology assessment is coming up, or where groups are trying to push for technology assessment?

JT: Sure. I talked about synthetic biology earlier, and that's one area. Within synthetic biology, there's a technology called gene drives, and ETC Group has been working with other groups to draw attention to gene drives. Gene drives are a genetic engineering technology where an organism is genetically engineered with a new trait, but then that trait is driven through an entire population.

You're not just engineering the single organism that you change in the lab; it automatically passes on that change to all of its offspring, so that you engineer an entire population - it's population-scale engineering. There's been a lot of work and activity by civil society groups to bring the question of gene drives to the UN Convention on Biological Diversity and other places, to say that this has real impacts on people's rights.

This isn't just about safety - there are big safety and biosafety issues - but it's also about impacting the rights of farmers and of people into whose land gene drives are going to be introduced. Within the Convention on Biological Diversity, there's now a recognition that there needs to be horizon scanning of new technologies, and assessment of new technologies.

ZM: What is horizon scanning?

JT: Seeing what new technologies are coming so that when something like gene drives hits, it's not too late. You see it coming, there's a process of seeing the technology coming, putting it under an assessment framework and monitoring the effects afterwards, because a body may say, "This is perfectly fine," then new information comes up later. Monitoring is as important.

The Convention on Biological Diversity is actually discussing how to have a framework for horizon scanning, assessment and monitoring of synthetic biology, and even asking whether this should happen across all technologies that can affect biodiversity. Really, when you go to many of the drivers of biodiversity loss, they have a technological underpinning, even climate change. The massive amounts of carbon dioxide in the atmosphere now, you can trace straight back to the industrial revolution and the increase in cars and airplanes and so forth. Unless we're carefully monitoring our technologies - watching what's coming, assessing them, and keeping that monitoring going - we're going to have new drivers of the destruction of biodiversity, but also of the destruction of human rights.

ZM: With technology assessment, let's say if that process were to take place, if there was a horizon scanning of new technologies that are probably going to arrive and disrupt, but also an assessment of the benefits, the impacts, the people funding it, what results would we have from that process?

JT: Well, I think it's possible to do technology assessment in a way that doesn't transform anything, and that would be a problem. If it was literally just looking at the health effects, that would protect a few things. What's really important is that the assessment has to be driven from below. If you go back to Mooney's Law - the idea that when you introduce a powerful technology, it exacerbates the gap between the powerful and the powerless - you're only going to understand the impact if you involve those who are most marginalized, those who are probably going to be most affected. They're going to bring really the most useful knowledge. They're going to be able to see how this technology will play out in their food systems, in their health systems, in their systems of decision-making.

To have a meaningful technology assessment, you're going to have to empower indigenous peoples, local communities, farmers, fisher folk, and others to be able to assess and speak about what are really ultimately political changes and economic changes. What on the one hand seems like a very technical request - we want to be able to assess technologies - is actually quite a political request.

ZM: Is there any resistance to that? Are there groups of people who don't want technology assessment to be done in that way?

JT: Absolutely. You see it in the international negotiations where technology assessment is put forward, that some of the most powerful northern countries, like the US, Japan, Australia, would like to see that language disappear.

ZM: Finally, just going forward, for people who are listening, if they were interested in this topic or interested in getting involved or knowing more about the issue, where can they go?

JT: Many people are involved in campaigns and struggles and movements all around the world. I think even just beginning to ask the question, how do technologies affect the thing we're focused on? The struggles we're involved in, the campaigns we're involved in, how are these technological? How's the politics technological? That would be a useful start and to begin making these connections.

Obviously, at ETC Group, we try to put out information about new technologies and what's coming over the horizon. That's a place to start. I think bringing the politics of technology into our many different campaigns and struggles will, over time, become more and more important.

For more information about ETC Group, visit our website at


Please consider supporting ETC's unique research and advocacy with a tax-deductible donation. Donate here