Computers are getting smaller and smaller. But what if we had sensors the size of dust, that could float through the air undetected, talk to one another, gather information, and transmit that information back down to a central place? This is the concept behind smart dust, and it’s more plausible than you might think.
Guests:
- Amy Webb, quantitative futurist and founder of the Future Today Institute
- Faine Greenwood, journalist and drone expert
- Stacey Higginbotham, journalist, co-host of the Internet of Things podcast
Further Reading:
- Future Today Institute Tech Trends Report 2018
- Smart Dust Is Coming, Are You Ready?
- Two-photon direct laser writing of ultracompact multi-lens objectives
- 3D printing enables the smallest complex micro-objectives
- Smart Dust Is Coming: New Camera Is the Size of a Grain of Salt
- Smart Dust and Sensory Swarms and OpenWSN (Kris Pister talk at UC Berkeley)
- DTIC ADA464105: Dynamic Resource Allocation for a Sensor Network
Paper from Bombas advertisement: Spray-On Socks: Ethics, Agency, and the Design of Product-Service Systems by Damon Taylor
Flash Forward is produced by me, Rose Eveleth. The intro music is by Asura and the outro music is by Hussalonia. The episode art is by Matt Lubchansky.
If you want to suggest a future we should take on, send us a note on Twitter, Facebook or by email at info@flashforwardpod.com. We love hearing your ideas! And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool.
And if you want to support the show, there are a few ways you can do that too! Head to www.flashforwardpod.com/support for more about how to give. But if that’s not in the cards for you, you can head to iTunes and leave us a nice review or just tell your friends about us. Those things really do help.
That’s all for this future, come back next time and we’ll travel to a new one.
FULL TRANSCRIPT BELOW
▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹ ▹▹
Rose: Hello and welcome to Flash Forward! I’m Rose and I’m your host. Flash Forward is a show about the future. Every episode we take on a specific possible… or not so possible future scenario. We always start with a little field trip to the future, to check out what’s going on, and then we teleport back to today to talk to experts about how that world that we just heard might really go down. Got it? Great!
Before we go to the future this week I just want to say a quick thank you to the folks who are in the Flash Forward FB group! Facebook group, you ask? Yes, right now the group is open only to Patrons, so if you become a Patron you can get access. It’s for links and conversations and sharing dog photos, and all the other fun things people might want to do in a Facebook group. So if you want to join, and chat with fellow Flash Forward listeners, right now the only way to do that is by becoming a Patron. Full disclosure: I will probably open the group up to everybody eventually. But I’m not sure exactly when, so if you don’t think you can wait some indeterminate amount of time, you can go to patreon.com/flashforwardpod! For just one dollar an episode you can get access to this group, as well as other fun stuff that’s on the Patreon page.
Okay, let’s go to the future. This episode we’re starting in the year 2048.
***
[typing sounds]
Robot voice 1: Okay Cerebron, show me quadrant nine. Just the air quality sensors. Zoom in on that spot. Okay, add heat signature sensors.
[ding]
Robot voice 1: Add the cameras. Zoom in on square four. Enhance. Enhance. Send that image to Lem.
[typing]
[chime]
News Anchor: It’s part of the development of the ultimate evil genius tool, smart dust. Smart dust has sensors for light, heat, sound and chemicals. It communicates by radio with other specks of smart dust to form a distributed computing network. Smart dust can be used to track people in natural disasters, to gather environmental data – such as sniffing out pollution in difficult or distant environments, for all manner of scientific and surveillance purposes. Naturally, smart dust is solar powered.
[typing]
Robot voice 2: Cerebron show me quadrant nine, cameras only please.
[ding]
Robot voice 1: What’s that?
Robot voice 2: Oh, never mind, just an overheated car. Why do humans insist on driving those things?
Robot voice 1: They’re illogical.
Robot voice 2: Send a road sensor report on the O-220 over to transport, just in case.
[ding]
Robot voice 1: Have the round three sensors all come back?
Robot voice 2: Almost, waiting on the last 800 or so.
Robot voice 1: Not bad.
[ding]
[typing] [fade out]
Rose: So this episode is all about sensors. And specifically, really, really tiny sensors. As small as a grain of sand… or smaller, like, dust sized.
Amy Webb: You know, it’s hard to envision dust, because we don’t tend to see singular grains of dust, but at some point everybody’s seen the sunlight pouring in through a window, and you see little teeny tiny specks sort of floating around.
Rose: This is Amy Webb, she’s a futurist and the founder of the Future Today Institute. I first read about Smart Dust in something called the Tech Trends Report that the Future Today Institute puts out. They included smart dust in their reports for the first time in 2017, and it’s back again on the 2018 report, too.
Amy: We’re talking about tiny computers. And these computers are so small, they’re like the size of maybe a grain of sand, or even smaller than that. Which is why we compare them to dust. So these are teeny tiny computers that perform different types of functions, and operate as part of a network. And those tiny computers, you know, tell us information about something.
Rose: Smart dust is not actually a new concept. It shows up in Stanislaw Lem’s 1965 collection of short stories called The Cyberiad: Fables for the Cybernetic Age. And on the research side, the term “smart dust” was coined by a guy named Kristofer Pister, at UC Berkeley, back in 1999. And this technology became a really big fad for a couple of years, everybody was talking about smart dust and how it was going to change the world. Here’s Kris Pister talking about the excitement around smart dust at a lecture at Berkeley a couple of years ago.
Kris Pister: Everybody got really excited at that point. There was so much excitement about where all the technology could go. And a bunch of startups coming out of MIT and other places. I was one of the people that jumped off and started a company. And you know I did my best to get good celebrity endorsements.
Rose: On the screen is an image of Kris shaking hands with Arnold Schwarzenegger. And Arnold Schwarzenegger was involved because people were convinced that this was going to be the next big thing. Here’s a clip from a Discovery Channel segment about Smart Dust from around 2005.
Discovery Clip: Smart Dust on the tracks will monitor your commuter train, so you know if it’s running late. Potholes will be able to report themselves, and warn your car. And you’ll never have to wait for a radio traffic report again. Bridges will get a coating of smart dust particles that will warn us when they detect stress fractures, helping avoid deadly collapses. But smart dust will also allow buildings and streets to recognize you, and respond accordingly. Your workplace will know you. Smart dust at the entrance will boot up your computer. And smart dust embedded in the elevator doors will automatically ring your floor.
Kris: And the industry analysts were predicting this incredible acceleration. By 2007 the number of wireless sensor nodes was going to eclipse the number of cell phones sold on the planet, according to these guys. But it turned out that this didn’t happen.
Rose: In fact, between 2005 and now-ish, smart dust kind of faded away. Like… dust in the wind… you might say?
It turns out, actually making this whole system work — thousands and thousands of teeny tiny computers, networked together, the size of dust — is really, really hard.
Kris: The thing that was preventing people in 2005, when they were surveyed, was reliability. It turned out that wireless for sensors has been out there forever. People have been trying to do this literally since the time of Marconi. But it’s a real challenge, and getting the reliability; if you don’t get the reliability, people will rip the system out. They’d rather no data at all than have intermittent data.
Rose: In other words: it was really hard to get these tiny sensors to work reliably together, and when they didn’t work reliably together, nobody wanted to use them, and when nobody wants to use something the people with money in industry who would perhaps fund the making of such things say “goodbye!” and take their money elsewhere.
But of course, that doesn’t mean that researchers like Kris haven’t been working on this stuff. It just means that their work hasn’t been catching the eye of sexy television programs like Discovery Channel or the moneybags at places like Samsung or Intel.
Kris: Of course, what I set out to solve as an academic had absolutely nothing to do with what anybody in industry cared about.
Rose: So, slowly but surely since the early 2000s, when smart dust first caught people’s attention, things have changed. And a few years ago, some of those changes started to bubble up. But how do you know if those bubbles are… legit bubbles that burst and say “yes I am indeed something to pay attention to?” Or if those bubbles are like “pop: I’m here to seem cool and sexy but actually it’s highly unlikely that I’ll wind up materializing as a real technology!”
Amy: In my work as a futurist, it’s really important to distinguish between what is a trend and what’s trendy.
Rose: Amy is basically the master bubble interpreter. She has a whole method for listening to those bubbles and understanding what they’re saying.
Amy: The first part of that methodology really is trying to surface nodes of information, and to observe the relationships between those nodes. That is what I call a fringe sketch. And what’s magical about creating a fringe sketch that’s rooted in data, is that oftentimes we wind up seeing not just emerging trends, but really interesting plausible future uses for those trends.
Rose: And in the case of smart dust, she started seeing a few key bubbles popping that she thought were worth listening to.
Amy: The shrinking sizes of components, the availability of cloud-based compute power, the new code that makes it easier to deploy teeny tiny chips and have most of the processing happen remotely. All of these things combined, along with ever better tiny lenses and computational photography, create a constellation. And so not only are we able to see, in advance, the plausibility of smart dust emerge, but we’re also then able to see all of the next order implications.
Rose: So let’s walk through just a couple of those key things that mark the return of smart dust as a technology to watch, and why I’m dedicating a whole episode to this particular idea. First, there’s the ever shrinking size of things like cameras.
Amy: The University of Stuttgart has been leading the way in 3D printed lenses that are capable of taking extremely high resolution photos. And these lenses aren’t assembled, they’re printed out fully formed. And in some cases, some of the smart dust includes not just one lens, but like eight lenses, or even more that are embedded into a single tiny little device.
Rose: Plus, the cost of manufacturing sensors has gotten cheaper and cheaper and cheaper. In 2004, sensors cost about $1.30 to make, and Goldman Sachs currently estimates that by 2020 they will cost just 38 cents. On top of that, there has been a big shift in how and where information is stored and processed. For a long time, anything your computer needed to do — any kind of processing or software or thinking that it had to go through — had to happen on the computer itself.
Amy: The original architecture that is still used across many of our devices is what’s called the Von Neumann architecture.
Rose: That works for some things, but as we start to ask more and more of our devices — more colors, better video, smarter thinking, harder questions — that gets more and more challenging.
Amy: Our computers have come a long way, but the average person doesn’t have a supercomputer in their home.
Rose: So we’re starting to see this assumption that the so-called “compute” happens locally, on your device, change. Your phone, for example, might be connected to The Cloud. And if it is, some of the thinking that your phone does doesn’t happen on the phone itself, it happens in the cloud.
Amy: So there’s a couple of really interesting mobile photo apps that let you retouch and fix up the photos that you take with your phone. That’s done in the cloud. And the challenge with the cloud is you’ve got latency, you have bandwidth demands, you’ve got more and more people getting online, you’re potentially in areas where the connectivity isn’t great.
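To make the tradeoff Amy is describing a bit more concrete, here is a rough sketch, from a device’s point of view, of what “the compute happens in the cloud” looks like. The endpoint URL and function names are hypothetical, and this isn’t any particular app’s implementation: offloading keeps the device tiny and cheap, but every request pays a latency and bandwidth cost, and fails outright when connectivity is bad.

```python
# A rough, hypothetical sketch of local-versus-cloud processing for a tiny device.
# The endpoint URL is made up for illustration; the tradeoff is the point.
import requests

CLOUD_ENDPOINT = "https://example.com/retouch"  # hypothetical photo-processing service


def retouch_locally(photo_bytes: bytes) -> bytes:
    # Needs enough on-device compute and memory to run the model itself;
    # no network round trip, but the hardware has to carry the load.
    raise NotImplementedError("device too small to run the model")


def retouch_in_cloud(photo_bytes: bytes, timeout_s: float = 5.0) -> bytes:
    # The device stays tiny, but every photo pays for upload bandwidth and
    # latency, and the call fails when connectivity is poor.
    response = requests.post(CLOUD_ENDPOINT, data=photo_bytes, timeout=timeout_s)
    response.raise_for_status()
    return response.content


def retouch(photo_bytes: bytes) -> bytes:
    # Fall back to the cloud when the device can't do the work itself.
    try:
        return retouch_locally(photo_bytes)
    except NotImplementedError:
        return retouch_in_cloud(photo_bytes)
```

The same shape applies to smart dust: a dust-sized sensor can’t do much thinking on its own, so it has to ship its readings somewhere else, and the reliability of that connection becomes the whole ballgame.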
Rose: This is the same challenge that Kris talked about in the lecture that I played for you.
Kris: The thing that was preventing people in 2005, when they were surveyed, was reliability. If you don’t get the reliability, people will rip the system out. They’d rather no data at all than have intermittent data.
Rose: So there’s this inflection point coming — pretty soon, we’re going to have to figure out how to handle all this informational processing.
Amy: One of the things that’s happening right now is a debate around where the future of that compute is happening. Is it closer to the device? Is it the device itself? Is it the cloud? Are we moving toward many small cell technologies? This is like many, many more, potentially millions more, small cells that are deployed all over the place rather than just on the big towers.
Rose: If we do wind up with that future, with millions of tiny cells deployed all over that can help with the processing, then we might wind up in a world where smart dust can be reliable enough to be feasible. And a lot of people are pushing for a future with a ton of distributed high speed processors, not just because of smart dust, but for all sorts of applications.
Amy: One of the theories is that’s why we need 5G. Not so that you can stream ever more videos, but to support this enormous internet of things ecosystem that’s partially already here, and certainly coming.
Rose: So that’s one of the pieces. Another piece is that Amy was seeing investments and advances in areas like computational photography, machine learning, facial recognition algorithms; even the development of new, lighter polymers to make lighter sensors out of.
Amy: Folks that are making environmental sensors, for example. And some of the research there, again there’s just more money, there are more applications. And all of these things interconnect.
Rose: What all of this adds up to is a trend. An official Tech Trend. Smart dust is back, and it’s something that Amy thinks is worth paying attention to.
Amy: Now this does not mean that you’re going to walk outside and go to the store, and buy a scoop of smart dust. There’s no bulk Smart Dust Counter the way you would buy gummy worms. At least I hope not. But what it does mean is that this is viable now, and it affects and intersects with enough different industries that it’s worth paying serious attention to.
Rose: Now, smart dust doesn’t have to be cameras. It can be all kinds of different sensors that are connected to one another. Some of them might be cameras, some of them might be accelerometers, or pressure sensors, or light detectors, or sensors that look for specific chemicals. The combinations are kind of endless. And the applications are too. But let’s talk about a few of the applications that are, as Amy says, in the realm of plausible.
Amy: Which is to say, there is initial research, and there is data to support what we’re about to talk about.
Rose: The first one is medical imaging.
The human body, you might have noticed, is not see-through. In fact, one of the big challenges doctors have is trying to figure out ways to look at stuff that they want to see that is inside of us. Take cancer for example — sometimes a person can have tumors, but the doctors can’t figure out where the cancer originates.
Amy: This is actually what happened to my mother. My mom had neuroendocrine cancer. So it was kind of a very rare form of cancer, and they didn’t know where the primary site was. We did know where the tumors were. But without knowing what that primary site was, or what, in a more specific way, was going on all they really could do was give her treatments as though she had pancreatic cancer and lung cancer. And she hung on for three years. But they didn’t have anything else to treat her with.
Rose: With smart dust, you could have a patient ingest these sensors, and have them go looking for the signs of cancer, and trace the tumors and mutation back to the source.
Amy: That potentially could allow doctors to find that primary site, and then to come up with a treatment plan that makes more sense. Rather than just coming up with a blunt instrument.
Rose: This could also work for one of the hardest parts of the body for doctors to access: the brain.
Amy: It’s hard to map what the brain is doing while it’s functioning. And that’s been a challenge for the neuroscience community for a very long time.
Rose: With smart dust, researchers could send tiny specks into the brain to try and figure out what it’s doing in real time, locally, and have those dust motes send information back. Now, I want to be very clear here that this is not currently happening. I’m saying this because if you go to YouTube and you search “smart dust”, the vast majority of what you get, even on the first page of results, is conspiracy theory videos in which people claim that they have already been infected by smart dust deployed by the government to track them or control their brains. Smart dust like this does not really exist yet. There are still some technical hurdles for this technology before it could really be deployed like that, and we’ll get to those in a bit. The point here is please don’t believe everything you see on YouTube.
One of the other use cases for smart dust that Amy thinks is plausible, and even probable, is for environmental monitoring.
Amy: You can measure some very specific things that way, like particular contaminants and pollutants.
Rose: This could be for an oil spill, or a nuclear waste accident, or something more nefarious.
Amy: If there is an environmental catastrophe; we have an airborne chemical of some kind. Or we’ve got, God forbid, some kind of bio weapon. I think, in those cases, when we’re talking about using sensors, and needing to deploy sensors fast across a wide area, that makes sense.
Rose: And here is where smart dust can get kind of interesting — in theory, you could stick it in and on almost anything. So what are things that we’d like to track? Maybe animals, and where they go? Or the jet stream, and how it’s shifting? Or the waste that people generate?
Amy: Now here’s an interesting idea, and I’ve been thinking — the project I’m working on next week has to do with the future of micro plastics in the oceans. And I’ve been thinking a lot about fleeces and micro fibers, because I’m cold all the time and I wear fleeces all the time. And I didn’t realize how environmentally problematic fleece microfiber is. But that’s kind of interesting right? That’s an interesting use case of embedding that type of sensor, for a certain amount of time, into the micro fibers as a way to track where they actually go. It’s easy to see visuals of like the beaches of Thailand when you’ve got just awful numbers of soda bottles washing up. But it would be really interesting if we had smart dust embedded into many of our plastics to track that data, and see well where does this stuff actually wind up.
Rose: Anywhere you might want to know where something is, what it’s doing, where it’s going, what it’s made of, you can just stick some smart dust in there.
Faine Greenwood: So if you’re trying to monitor a rare species of antelope or something, or some kind of area, some kind of very jumpy, very hard to see critter, this could be incredibly valuable for that. I think about monitoring volcanoes, people are using drones for that. People use other cameras, but that could be a really great way to see what’s happening in a dangerous environment you might not want to necessarily be in. So those could be applications that I would not find particularly creepy.
Rose: This is Faine Greenwood, a journalist who specializes in drone imagery. And when we come back, we’re going to hear about the applications that she does find creepy. And why she would actually rather stick with her drones than use smart dust. Plus, what actually happens if we try to manage a trillion devices at once? Is that even possible? But first, a quick break.
[BREAK]
Okay so we’ve covered what smart dust is, and some of what it can do in the plausible future. But what about the further out future?
Personally, one of the first things I thought of when I started thinking about how this kind of smart dust could be used, was journalism. This probably isn’t a surprise I guess because… I am a journalist. But reading about smart dust cameras actually made me think of journalists who use drones to document protests and natural disasters and emergencies.
Faine: There are lots and lots of great examples of the use of drone technology that I think are really cool, or go beyond even the popular stereotype of some dude-bro in Silicon Valley flying a drone over himself doing sick ski tricks. Which… boring.
Rose: This is Faine Greenwood, you heard her a little bit just before the break, and she’s a journalist and researcher at the Harvard Humanitarian Initiative at the Harvard School of Public Health where she researches drones and humanitarian aid.
Faine: So one of my favorite examples is this Indonesian geographer called Irendra Radjawali, who has been doing some fascinating work with Dayak people in Borneo. He’s been working with them, and helping them use drones to document land abuses on their traditional lands. They’ve actually been able to win court cases in Indonesia with this drone imagery.
Rose: There are lots of examples of journalists and activists using drones to document protests and rallies. In November of 2016, drones captured footage of law enforcement agents spraying protestors with water cannons in freezing weather during the Standing Rock oil pipeline protests in North Dakota. That footage went viral, and the FAA responded by issuing a flight restriction over the protests, basically banning drones from flying and documenting anything more. A similar restriction on drones was placed over the airspace in Ferguson, Missouri in 2014, after protestors rallied against the murder of Michael Brown. Drones have also been used to document refugees landing in Greece, and in 2012 drone footage of a meat-packing plant in Dallas showed literal rivers of blood coming out of the facility, which resulted in a big criminal investigation of the company.
Faine: So, another great use for drone technology in humanitarian aid and disaster response is, of course, post-disaster mapping. The case that most people know about is in 2015, during the devastating Nepal earthquake, drones were used to basically map the damage, and figure out what had been damaged, the extent of what was needed, and the resources that were needed to help people there.
Rose: And drones provide some advantages over the other forms of aerial imagery out there.
Faine: Flying manned aircraft for photography is very expensive, crazy expensive by the hour in many cases. Satellite imagery can be used that way, but it can also be quite expensive to purchase those images. You have to have special skill sets to know how to even download it, or work with it, or manipulate it. And you can only take a satellite picture once a day.
Rose: So I thought that maybe the idea of smart dust, this tiny mesh network that someone could just throw up into the air, that could record in real time, might appeal to Faine. But actually, she’d rather stick with drones.
Faine: Not putting on my pragmatic hat for a second, I think that could be a very interesting technology. But also very creepy. I do find that much more creepy than a drone, in a lot of ways.
Rose: Anything that is invisible, and could watch you all the time, is obviously a privacy concern. To say the least.
Faine: It really would be a terrifying realisation of some of the stuff people are afraid that drones can do, that drones actually can’t do today. But like this kind of thing, if that existed that would strike me as a pretty legitimate problem.
Rose: People today are still somewhat skeptical of drones when they’re not being used for really specific things like search and rescue. There are tons of stories of people shooting down drones, or worried that someone is using a drone to spy on them. But in fact, at least when it comes to consumer drones, that’s kind of unlikely. Drones are not very sneaky. They’re pretty loud and pretty obvious. And in fact there is a lot of conversation about how to make drones even more understandable to us — some way for them to signal who they belong to. Some uniform of some kind.
And Faine says that in many cases, the ability to see the drone is actually really important.
Faine: I think in most cases, for reporters and activists certainly, you actually kind of want it to be obvious that you’re using a camera. Or that you are a journalist, and you are doing this. Because part of it is that you’re using the camera as both a way to document stuff, but also as a bit of a deterrent effect or a way to shame people out of doing things, because they know you’re watching.
Rose: Now, the so-called observer effect doesn’t always work. There are examples of cases in which governments have done more damage because they knew they were being watched, kind of like to say “hey, I know you’re looking and I don’t care.” And there are other cases where operating a drone, and being obvious about your recording, can put people at risk of serious physical harm. But for Faine, the visibility of drones is generally good. And the invisibility of a system like smart dust is… scary.
Faine: Yeah, I would find the dust concept cool on one level. I could see the benefit. But also I can see it as extraordinarily ethically problematic and as a way to really create that kind of frightening “everybody’s watching everybody at all times for God knows what purposes” reality that a lot of us fear, and social media is kind of bringing about. Yeah, that could really close the loop on it.
Rose: Earlier Faine said that she wasn’t putting on her pragmatic hat, but let’s put that hat on now, for a second. We’ve been talking about these tiny sensors almost like they’re magic. Like they just kind of appear out of nowhere, and we throw them in the air and then they work and then disappear as if they were never there before. Obviously, that’s not the case.
Stacey Higginbotham: They’re not magic. They have to have some kind of software and they have to have some way of talking back to an entity.
Rose: This is Stacey Higginbotham.
Stacey: I’m a journalist who does the Internet of Things podcast, and a website called Stacey on IoT. I cover the Internet of Things.
Rose: Now, you might have heard of the Internet of Things, but for those who aren’t familiar, let’s do a quick definition.
Stacey: It means a lot of things to different people. But, the way I look at it is a platform for innovation. Kind of like broadband, or the invention of semiconductors was. And what it’s doing is it’s combining the ubiquitous wireless connectivity, cheap cloud computing, and cheap sensors, into this whole way we can gather more granular data about the world around us and then start making decisions based on that data.
Rose: So this smart dust stuff is kind of related to the realm of the Internet of Things. There are these little devices that are connected to each other, and then ultimately, probably connected to the internet, somehow. And right now, there are a lot of devices in this network, connected wirelessly to one another, and gathering and transmitting data.
Stacey: The current guesstimate can range from about 8 billion devices that are already out there, all the way to about 20 billion by the end of 2020.
Rose: And some people think that, in fact, it won’t be 20 billion by the end of 2020, it will be a lot more.
Stacey: So the CEO of SoftBank, Masayoshi Son, has actually said that he thinks by 2020 we’re going to have a trillion sensors, or a trillion connected devices.
Rose: Wait a minute, you might be saying. That kind of prediction sounds… familiar….
Kris: And the industry analysts were predicting this incredible acceleration. By 2007 the number of wireless sensor nodes was going to eclipse the number of cell phones sold on the planet, according to these guys. But it turned out that this didn’t happen.
Rose: That is Kris Pister’s lecture at Berkeley from earlier in the episode. This is the kind of thing people love to predict — trillions of devices! They’ll show a graph that has that straight up line. Everything is going to advance exactly linearly, or even exponentially, and we’ll get trillions of them. And we have no idea if we’ll get trillions of devices.
Stacey: I don’t know if we’re actually going to hit a trillion by then. I don’t know if we could even count that high, in all honesty. But I do think we’re going to have a lot, a lot of sensors.
Rose: But even if we only wind up with a mere 20 billion devices, there are some big challenges to managing all of that stuff. Remember, these aren’t magical dust particles.
Stacey: We don’t even have a programming language that makes it easy to talk to, let’s say, a billion sensors, or even a hundred thousand sensors, in a way that is really scalable.
Rose: Not only will they need some kind of programming, they will also need power.
Stacey: It is impossible to change the batteries on a trillion sensors. So we’re going to have to come up with some sort of way to have them power themselves. And right now we have a bunch of energy harvesting technologies that are out there. So there’s obviously solar. We’d need to make that a little bit more efficient, because these things are typically tiny. Sometimes they’re underground, so that’s not going to work. There are energy harvesting chips that work based on temperature change, and others that work based on motion.
Rose: Maybe these sensors will be powered by our own sweat, or by the air, or by temperature differentials in waterways.
Stacey: What if the sensors had basic, kind of, mimicking mitochondria right? And they were able to generate their own power. It’s kind of far out there, but we’re going to need something like that.
Rose: And here is an interesting point that Amy raised that is kind of one of those ~~ everything is connected ~~ moments. One of the things that might wind up making tiny sensors like smart dust happen, is green energy. And not just because that energy will be used to power the sensors and computers. That is true. But this connection is a little bit weirder and deeper. So, right now a lot of big oil companies are worried about the future.
Amy: If it’s the case that we’re moving toward electric vehicles, that puts our big petroleum companies in a bit of a bind. The Exxon Mobils of the world.
Rose: But companies like Exxon Mobil don’t just make oil that goes into our cars, they also make other stuff.
Amy: They’re also producing the materials that become the plastics and the micro fibers that we drink our soda out of.
Rose: If smart dust is really going to take off, it’s going to require really, really light materials.
Amy: Smart dust isn’t going to be made out of wood. It’s too heavy, right? And it’s not going to be made out of metal, right? Probably because that’s also too heavy. But a new type of polymer, a new type of plastic; that makes a lot of sense.
Rose: So these are going to be made of polymers. And probably new polymers. Super light polymers. Polymers that might be made by oil companies.
Amy: It’s possible that a company like Exxon Mobil, who we do not work with, and I don’t know that they’re necessarily working on this, but a company like Exxon might be incentivized to create ever lighter polymers that would serve something like smart dust really well because they need to put their resources into something.
Rose: Smart Dust, brought to you by Chevron or BP, or whoever you choose. Deployed into the air, water, or even into your body to gather data. Sometimes people say that data is the new oil, and in this case it’s almost literal.
Stacey: A lot of people, they’re like, “oh my data is my oil. It’s going to be what differentiates me from everybody else.”
Rose: Precious, precious data that must be protected at all costs.
And this question of who actually owns that data collected from all these tiny, tiny sensors, is actually a huge one.
Stacey: Who do we want in control of that? I can own some of my health data, but you can also look at things like my voice timbre, or the pattern of blood flow in my face, and make health determinations. And once that’s in my place of work, for example, or at the DMV when I’m taking a driver’s test. Both of those are like, hey, maybe it would be really good to know if I were mentally declining at the DMV, and maybe you don’t give me a driver’s license. But what if I go into my place of work and they’re like, “Oh Stacey, I know you’re only forty-five, but judging by this, your performance is going to decline rapidly, so we’re going to let you go.” The implications of this are bonkers, and we’re not prepared for it. It’s going to be a whole new social paradigm. Kind of Black Mirror-ish, is how I’m thinking of it. But I’m hoping it’s not the case.
Rose: We’ve talked about how having all these things watching us could be super, super creepy and bad. And there’s another layer to that, too: how do you secure a trillion of anything? How do you make sure that the nodes aren’t lying, or infecting each other, or getting hacked, or repurposed, or turned against us?
Stacey: There’s a lot of people talking about using the blockchain as a way to establish trust between these sensors. So, the idea is you train sensors to think that they’re being lied to. Right? So any node in your network could be corrupt or a liar. And what you do, is you try to establish trust using the blockchain. So you say, “what has the sensor told me in the past? Did that prove true? Yes? No? OK it did. So have a trust point basically.”
Rose: That sounds great, right? A self regulating system! The problem is that for that to work, the sensors have to be way more powerful than they actually are.
Stacey: And then you’re back to square one.
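To make the trust-point idea a little more concrete, here is a minimal sketch of the per-sensor reputation bookkeeping Stacey describes. All of the names, scores, and thresholds are hypothetical; real blockchain-based proposals would record these outcomes on a shared ledger that the nodes verify together, rather than in one device’s memory. This just shows the shape of the idea.

```python
# A minimal, hypothetical sketch of trust scoring for sensor nodes.
# Scores and thresholds are made up; blockchain proposals would keep this
# history on a shared ledger instead of in a single dictionary.

class SensorTrust:
    def __init__(self, initial_trust: float = 0.5):
        self.scores: dict[str, float] = {}  # sensor_id -> trust score in [0, 1]
        self.initial_trust = initial_trust

    def record_outcome(self, sensor_id: str, reading_proved_true: bool, step: float = 0.1) -> None:
        """Nudge a sensor's score after checking one of its past readings against reality."""
        score = self.scores.get(sensor_id, self.initial_trust)
        score = score + step if reading_proved_true else score - step
        self.scores[sensor_id] = min(1.0, max(0.0, score))

    def is_trusted(self, sensor_id: str, threshold: float = 0.7) -> bool:
        """Only act on readings from sensors that have earned enough trust points."""
        return self.scores.get(sensor_id, self.initial_trust) >= threshold


trust = SensorTrust()
trust.record_outcome("node-42", reading_proved_true=True)
trust.record_outcome("node-42", reading_proved_true=True)
trust.record_outcome("node-99", reading_proved_true=False)
print(trust.is_trusted("node-42"))  # True: 0.5 + 0.1 + 0.1 reaches the threshold
print(trust.is_trusted("node-99"))  # False: 0.5 - 0.1 falls short
```

Even a toy version like this hints at Stacey’s objection: every node has to store history, check readings against reality, and do extra computation on top of its actual sensing, which is exactly the kind of work a dust-sized, battery-starved sensor doesn’t have the horsepower for.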
Rose: And at the end of these sensors’ lives, what happens to them? Unlike real dust, this stuff isn’t part of the ecosystem; it doesn’t get recycled through natural systems.
Stacey: So we aren’t even at the point where we have smart dust, or really sensors just being flung willy nilly into the air. But the United Nations has already found that people generated 44.7 million metric tons of e-waste in 2016, and they expect that to grow to 52.2 million metric tons by 2021. If we go from there, then we’ve got all these sensors that are going to die, and we’re going to make more and throw them out there. Which then becomes kind of an environmental nightmare if we start thinking about it in that way.
Rose: You could potentially design these things to be recyclable, but that would make them more expensive, which cuts down on the viability as a commercial project.
Stacey: It costs a lot more to design something for recyclability, because you have to use different types of glue and things like that, that you can use to pry up your metals more easily. So that’s, that’s going to be tough.
Rose: You could also try and program them to find their way home, like maybe with a flagellum or something? But that would add to the weight and energy requirements. Plus, besides not wanting to contribute to the e-waste problem, you’re also probably not going to want this stuff to stick around in your body for a really long time.
So, there are a lot of complicating factors around the technical side of this. How these things are made, how they’re powered, how they’re deployed and secured. But I think it’s also worth pausing for a second to ask: what is all this data actually for? The point of having all these sensors is to collect data, right? But is this data making us happier, or healthier, or better? Is it helping solve an actual problem? Or is it being collected so that one giant data company can sell it to another giant data company?
We’ve talked about this on the show before, but any good scientist can tell you that simply having data doesn’t really do much. You have to know what you’re looking for in that data. You have to have some kind of hypothesis or question that the data can answer.
Stacey: I think right now a lot of companies are kind of doing the work of old school scientists, where they’re like, “oh I’ve got a lot of information.” And they’re taking their data and putting it in these “data lakes,” and they’re like, “we’ll deal with it later.” But I think we need to move to… in this world where data is almost ubiquitous, we need to move to having hypotheses around our data, and then pulling that data and testing it.
Rose: The kinds of questions that we ask this data kind of dictate what this future looks like. As always, there are many ways to see this future.
Stacey: So, dystopian, cynical, current 2018 Stacey says this will probably be a really terrible future for everyone who’s not really rich, and doesn’t have societal advantages. The optimistic, I’ve-loved-technology-for-20-years-and-really-know-it-can-help-people Stacey says it has the potential to be amazing. We are embedding sensors in roadways that will detect accidents as they happen, even if it’s in a remote area and no one passes by. So, we’ll be able to tell that sort of thing, and send help right away. You can envision a future where we optimize traffic flow so we can consume less energy, or maybe so emergency response people can get through traffic faster. These are all things that are actually happening. We can consume less electricity, or optimize our homes and businesses to consume electricity within the cycles of the sun and wind so we don’t have to have coal-fired plants. This could be amazing. But it also could be, like I said earlier, incredibly dystopian if you think about how people could use that data for ill.
Rose: This is definitely one of those technologies where what you see in it depends on how much you trust the powers that be with more information. As you listeners probably know, I tend to be less optimistic than Stacey. For me, I’m just not that sure that I think that the people controlling our political and technical systems right now deserve more data about me. They certainly haven’t earned my trust, nor have they proven that they are interested in protecting my interests and safety. But if you think that more data in the hands of powerful people could change the world for the better, then smart dust might seem great to you!
And now for every futurist’s least favorite question: when will we see smart dust? Here’s Amy Webb again.
Amy: When does smart dust move into some kind of scalable production so that it can be deployed for various uses? We don’t have enough data right now to make any kind of accurate, actionable prediction. So, for that reason we don’t focus on exactly when. Instead, we focus on what we can see at the moment. And we know that we’re probably looking at not next year, or the year following, but also that 30 years is too far down the road. So that’s not a specific answer to your question, but that’s the way we roll as futurists.
Rose: Here’s a tip from me, unsolicited — except that you’re listening to this, so sort of solicited: this is the kind of answer that you want from a futurist. Do not trust anybody who is willing to give you an exact date for a future technology. Things are really complicated! For example, the US could make some rule banning smart dust, or the recent GDPR developments in Europe could make it impossible to deploy. Or, the West could cut off its relationship with China and that would basically end smart dust for a while.
Amy: If somebody decides that China is a national security threat, which could happen, we may find that our components are no longer manufactured there. But, at the present time, no other country is set up to mass produce all of the stuff we need to make our devices.
Rose: Or, the opposite could happen, something could speed up the development of smart dust.
Amy: If there was a proven medical use, if there was something that might really help the banks and financial institutions, if this became a real viable way to track crops as we are seeing increased changes in climate and extreme weather, that could be an accelerant.
Rose: In other words, it’s complicated, as usual. I think I say this at the end of every episode. Every technology is at the mercy of not just the research and science at hand, but also policy and public perception.
Amy: When exactly is the future? You know, the future is three seconds from now, 27 minutes from now, 115 years from now. So when exactly is the future of smart dust? Well, the future is right now because there’s plenty of research being done.
Rose: THE FUTURE IS NOW! I love it when guests say that.
[music up]
That’s all for this future. Flash Forward is produced by me, Rose Eveleth. The intro music is by Asura and the outro music is by Hussalonia. Special thanks to this week’s guests. You can learn more about how to be a futurist by reading Amy’s book The Signals Are Talking: Why Today’s Fringe Is Tomorrow’s Mainstream. You can learn more about Faine’s work at her website, http://faineg.com. And you can find Stacey’s work on the Internet of Things in her newsletter, which you can get at staceyoniot.com, and you can also hear more from Stacey on the Internet of Things podcast, which you can find wherever you are listening to this podcast! The episode art is, as always, by Matt Lubchansky.
If you want to suggest a future we should take on, send us a note on Twitter, Facebook or by email at info@flashforwardpod.com. We love hearing your ideas! And if you think you’ve spotted one of the little references I’ve hidden in the episode, email us there too. If you’re right, I’ll send you something cool.
And if you want to support the show, there are a few ways you can do that too! Head to www.flashforwardpod.com/support for more about how to give. But if that’s not in the cards for you, you can head to Apple Podcasts and leave us a nice review or just tell your friends about us. Those things really do help.