Session:The Internet of Things
- ELC 2012
- February 16, 2012
- Mike Anderson
- The PTR Group
- The Internet of Things
- here (linux foundation) and here (free-electrons)
Consumers increasingly want interoperability of their devices. They want to program their DVR via their mobile phone. They want their music available everywhere. They want their television to update social networking sites. But, as developers, how do we make this possible? This presentation will discuss the imminent "Internet of Things" and how we can extend connectivity to previously "dumb" devices like TVs, refrigerators, and other appliances, and how this connectivity is directly related to IPv6 support. The target audience for this presentation is platform developers looking to enable connectivity in a new class of intelligent appliances. This presentation is targeted at introductory-level developers with some understanding of the IP protocol stack.
Mike Anderson is currently CTO and Chief Scientist for The PTR Group, Inc. With over 33 years in the embedded and real-time computing industry, Mike works with a number of RTOS offerings. However, his focus over the past decade has been primarily embedded Linux and Android on a number of CPU architectures. As an instructor and consultant, Mike is a regular speaker at the Embedded Systems Conference and the Embedded Linux Conference, as well as other Linux-oriented conferences such as LinuxWorld, Ubuntu Live and the Real-Time Embedded Computing Conference series. Ongoing projects include several efforts focused on porting applications from RTOS offerings such as VxWorks and pSOS to real-time enhanced Linux platforms. Additional projects include Android bring-up and its use in non-phone applications, and Linux in high-performance computing platforms.
- Transcribed by
- Tim Bird,
- Verified by
- 1 - Test User,
(From the Linux Foundation version of the video)
0:00 - 1:00: v1 >> TIM BIRD: Good morning everyone. Hopefully you had a good night's rest and hopefully you had fun at the reception last night. I'd like to thank Intel and Yocto for providing that reception. It was a pretty neat venue. Hopefully if you were in the little lobby at the beginning you noticed that there was a whole other section of the museum to go walk in. I was worried about that. I didn't notice it myself for a while. But then you got in there and there was a lot of interesting stuff and hors d'oeuvres and..
But I just want to remind people about the demo session - the technical showcase that we're having tonight. You might want to get ready for that. We're having a key-signing party. I have gotten ready. I'm doing lazy-man's keysigning prep. You're supposed... It's nice if you have little slips of paper, but if you don't you can just write your key on the back of your badge and then have people take your picture or whatever.
Um. But let's get started with our sessions for today.
1:00 - 2:00: We're going to start off with what I think will be a very interesting session, a keynote address, by Mike Anderson. And Mike Anderson has been coming to the Embedded Linux Conference for years and years, and is, quite frankly, one of my favorite speakers. He's always got really interesting stuff to say. He's the Chief Scientist for The PTR Group, out of Washington D.C., and always working on interesting things, and always has real practical, hands-on information in his tutorials. And I look forward to hearing his comments on the Internet of Things.
Please join me in welcoming Mike.
>> MIKE ANDERSON: Thanks Tim. [slide 1 - Title]
OK. It even worked. Great job. [laughs]
I want to first of all thank ELC, Tim and the organizers for giving me the opportunity to get up in front of you and speak about the Internet of Things.
2:00 - 3:00: As we move along with the way technology is developing at this point, we have a very broad selection of opportunities for those of us in the embedded Linux community, and those of us who are interfacing to devices and understand what a device actually is. [slide 2 - What we will talk about]
This is what we'll be talking about, and this is all of the words in this presentation. I just want to get them all out of the way up front. So that now you've had an opportunity to see them, let's actually get into the material [slide 3 - Evolution of the Internet]
So, when we start with the Internet back in the 60's, it was originally conceived as a mechanism for doing away with the old Telex systems that were employed by the military. So in its original form, the ARPANet was designed as a mechanism for being able to send message traffic around between military bases.
3:00 - 4:00: Of course, because a lot of the research for the ARPANet was being done by universities, the concept of being able to send out an e-mail and say "Hey, everybody. I've got a party at my house this Saturday night. Why don't you all come over." really carried forward as these university researchers and grad students, etc., then left their respective institutions and got out into industry.
Now, we went beyond that, to the introduction of Netscape and Mosaic, where we suddenly had Web 1.0. And the concept behind Web 1.0 was simply that of a library. There's lots of information out here. We will go out into the library. We'll look around. We'll ask a few questions. We'll get some information. We'll bring that back and think about it.
Web 2.0 introduced the concept of, instead of simply being consumers of information, we became producers of information.
4:00 - 5:00: Today, for every minute that goes by, there are 8 hours of YouTube video being uploaded to Google.
That's a lot.
Now, as we become producers, whether it's blogs, whether it's video, whether it's whatever we're producing, putting it out there, we of course are asserting our individuality, our concept, our "person-ness" out into the network. And, of course, there are the downsides, where people will then grab a hold of that information and use it against us. We'll talk a little bit more about that later.
But what we're at, right now, is the cusp of a new transition in the Web - Web 3.0. With Web 3.0 we're talking about something called the "semantic Web". With the semantic Web there's actually way more machines on the Internet than there are people.
5:00 - 6:00: And in the semantic Web these machines start chatting with each other, sharing data. As they share data, we will hopefully be able to derive some wisdom from that. We'll see how that plays out here in the next few charts. [slide 3]
So what is the Internet of Things? Well, it can be defined as the point in time at which the number of objects on the Internet outnumber the number of people on the Internet. That actually happened about 2008/2009 timeframe. And, when that happened we suddenly started to realize that, hey, the approach that we're using right now for the Internet (IPv4), is probably not going to scale. So we have to do something about that. We'll get to that in a bit.
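The scaling argument here is easy to quantify. IPv4's 32-bit address space tops out near 4.3 billion addresses, already below the device counts cited in this talk, while IPv6's 128-bit space is effectively unbounded. A quick sketch (not from the talk) using Python's standard `ipaddress` module:

```python
import ipaddress

# IPv4 offers 2^32 addresses -- fewer than the ~5 billion
# devices estimated to be online around the time of this talk.
ipv4_total = 2 ** 32
print(ipv4_total)  # 4294967296

# IPv6 offers 2^128 addresses. Even against the projected
# 50 billion devices in 2020, the space is effectively endless.
ipv6_total = 2 ** 128
print(ipv6_total // 50_000_000_000)  # addresses available per device

# The standard library handles both families uniformly:
addr = ipaddress.ip_address("2001:db8::1")
print(addr.version)  # 6
```

Nothing here is specific to the talk; it just makes the "IPv4 is probably not going to scale" claim concrete.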
At the same time, when they start looking at the proliferation of devices on the Internet, in 2008/2009 we had about 2 billion devices on the Internet.
6:00 - 7:00: Today that's calculated to be about 5 billion devices on the Internet. By 2015 - 15 billion devices. By 2020 - 50 billion devices on the Internet. That means that each one of us will on average, by the year 2020, have 6 different Internet-connected devices on our person at all times. Now, some of you, certainly those of you from Asia, already have 2 or 3 cell phones with you. Now that tends to be more common in places like Japan, where they have a cell phone for the family, a cell phone for the business, a cell phone for whatever other things they may be up to. And, as we've seen the deployment of these technologies, we tend to see a lot of this happening over in Asia first.
7:00 - 8:00: And then it rolls into the west. We'll see a little bit more about that in a bit. [slide 4] Here is kind of a selection of connected things, courtesy of the latest CES (the Consumer Electronics Show). Here we see an internet-connected television. This is from LG. This happens to be running ARM rather than a more traditional sort of MIPS thing, where MIPS used to have a very strong hold in the set-top box business. But, the interesting thing about this television is that it is running Android, and not Google TV. They perceive that the market is actually people getting access to YouTube, Hulu, Vudu and Netflix, rather than using the Google Portal.
8:00 - 9:00: And of course, because it is Android they have the option of setting up their own private marketplace, so that you can download Angry Birds. [ahem]
Now here is a Samsung refrigerator with a 7-inch panel in it. The idea is, of course, that you will hook your refrigerator into the Internet, it will be able to download the latest weather, and you'll be able to use it as the replacement for the corkboard, on the old refrigerator.
Now, it also has the concept that you can potentially scan pictures, so instead of having all those kids pictures magnetically plastered to the refrigerator, we'll simply scan them and have a running slide show on the refrigerator - in case we really wanted to see those.
Now here's another situation: Kia has introduced the concept of an Android panel on the front dashboard.
9:00 - 10:00: Talk about distracted driving. It's like... I'm not sure exactly how well this is going to go over. But it certainly is... Their argument is that it's a heads-up display.
Down here in the corner, we have a wirelessly connected baby scale. Again, with some of these things you wonder "Why?" But I guess if you were a health care provider and you noticed that the babies from a particular region were all underweight, maybe that would be indicative of some problem in the water. Hmm. I'm not sure.
They did not invent Internet cows. However, they did introduce this concept of being able to instrument the cattle to track their health and status. It turns out that cattle produce about 200 megabytes worth of data every year, for each cow.
10:00 - 11:00: Of course I don't necessarily want to be the person who instruments the back end of the cow to see how things are coming out. But in any case the concept is certainly there.
And this one is kind of a scary one. You may not be able to tell from the back, but it says "Brainwave TV". In this particular example, I mean it's connected -- you actually wear this thing and think the television channel - and it changes the channel. You have to be very careful about what you're thinking. [laughter] Because it could change the TV to a channel that could be embarrassing. But, certainly, this is the kind of thing that we're starting to see. And all of these - with the exception of the cow - were introduced at CES this year.
[slide 5 - Connected Earth] But what we're really looking at here is that the Earth, as a system, produces gigabytes worth of data every single day.
11:00 - 12:00: We just haven't been able to listen to it. Well now, with the Internet of Things and the proliferation of sensors and technologies, we can start listening to the Earth.
We're used to thinking about seismic sensors - certainly out here in California. But what if we started thinking about the minerals, or the water in the ground? Could we have prevented a Dust Bowl situation? Had we instrumented the ground properly, could we have prevented that whole episode in American history from happening? If we had just simply been able to listen to the Earth effectively.
[slide 6 - Enablers] So what are the enablers for the Internet of Things? We have tagging things. Now here we see an RFID tag. And there's been a huge push for RFID in a lot of different technologies - a lot of different components.
12:00 - 13:00: We'll talk more about that in a bit.
We have sensing things. And we have now a situation where people are actually instrumenting themselves, for their own reasons. For instance in this particular case, this thing on the guy's arm is an air quality sensor. So this individual.. this particular product is targeted at someone who's asthmatic, and they want to know what the quality of the air is, and have it alert them on their phone, before they have an asthma attack. OK.
We have shrinking things. This is an RFID tag. It used to be this. Now it's that.
And we have thinking things. Now this is of course the proliferation of devices like AVRs, PIC24s and 32s, ARM Cortex-M0s, M3s, and of course now moving up the scale into all the Linux devices that we're actually out here deploying.
13:00 - 14:00: [slide 7 - RFID Everywhere]
So if we think about the tagging things first, we see the RFID piece of it - this is an RFID tag about the size of a grain of rice. This is the kind of thing that people typically implant in their animals - their cats, dogs, etc. So if they become lost they can simply be scanned by a professional - by the veterinarian - to figure out where that cat belongs, and the owner contacted. We see it here against a human hand, and then we see of course its new smaller cousin being produced. But what's an application for this kind of stuff? Well, here we see an RFID-enabled cat door. So that if the cat with the appropriate RFID comes up to the door it will open, and the cat can come through. But the run-of-the-mill stray cat will not be able to get through that door.
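The cat-door behavior described above is essentially an allow-list lookup on the tag's unique ID. A minimal sketch of that logic, where the tag ID and the `handle_tag` helper are invented for illustration (a real reader and door actuator would sit on either side of it):

```python
# Hypothetical sketch of the RFID cat-door logic described above.
# The tag ID below is made up; real implanted tags carry a unique
# numeric identifier that a reader module reports as a string.

ALLOWED_TAGS = {"985120031234567"}  # the resident cat's implanted tag ID

def handle_tag(tag_id: str) -> bool:
    """Return True (unlock) only for tags on the allow list."""
    return tag_id in ALLOWED_TAGS

# The resident cat gets in; an unknown tag (a stray) does not.
print(handle_tag("985120031234567"))  # True
print(handle_tag("900000000000000"))  # False
```

The whole "smart" behavior is a set-membership test; the interesting engineering is in the reader and the mechanical latch, not the software.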
14:00 - 15:00: Where is the big push for RFID coming? Well, it's actually coming from retailers. If you take a look at what's happening with retailers like Walmart. They're getting ready to switch from an inventory model to a consignment model. That is, they won't actually pay the producers of the goods for the goods until the goods are sold. This is going to be a huge change in the way business is conducted throughout the world.
So as we see RFID proliferation, we'll see it in milk cartons, we'll see it in our clothes, we'll see it just about everywhere. Now this all has a good side and a not-so-good side. We'll talk about the security bits in a minute.
[slide 8 - Instrumenting Things/People]
And of course "sensing things". 15:00 - 16:00: As we start instrumenting both things and people, we see that there's already a market for books on Getting Started with the Internet of Things.
Here we see an example of a human who has augmented their sensing with ultrasonic sensors and haptic feedback devices. So the closer they get to a wall, the more the haptic feedback vibrates. They can then sense their environment. This of course is a potential boon for the blind and other situations. However think about what that could be used for in terms of augmented reality, in terms of augmenting our sensors, in terms of enabling humans to be able to reach out beyond our simple surroundings to detect things out in the world.
And we see a couple of examples of this using a mesh technology. This happens to be built on top of Arduinos.
[slide 9 - Smart Dust, Nanoboats and MEMS]
16:00 - 17:00: Well as we start shrinking things, now we start getting into some really kind of bizarre applications of the Internet of Things.
We see here in this example - this is called a dust mote. And this is about 1 to 2 millimeters. It has a laser interferometer on it. It has a solar cell on it. It has mesh technology on it - mesh networking. And it has a Cortex-M0. In about 2 millimeters worth of material.
Now the research that's been done on things like smart dust is that if you wanted to instrument the crowd, you simply spread these things out on the floor and as people walk their shoes pick them up. And then they are now instrumented and you watch them move, through large venues. Of course, one could conceive of other applications for these, but we won't get into that.
Here we have a nanobot.
17:00 - 18:00: The nanobots have largely been focused in the medical world, where people are interested in using them to go in and cut out cancer cells and other such afflictions.
Personally I don't like the little bug-looking nanobot. I want mine to be sharks with laser beams on their heads.
But if we then take that to the next step: here is a Micro-Electro-Mechanical System (or MEMS). Now they have nano-electro-mechanical systems. This particular one is rather curious. What they did was they implanted a MEMS device into this moth in the pupal stage. When the moth came out of its pupa, it had a sensor in it. This sensor now enables them to cause the moth to flap its wings. They can actually steer the moth.
18:00 - 19:00: Put a camera on that, and send it out as a small UAV. Of course I wouldn't want to watch the camera output while I was flying. It would go a little like that [rocking]. But nonetheless, kind of an interesting concept for being able to instrument things in the real world and have them collect data.
[slide 10 - the DIKW Relationship]
So what are we really getting at here? Well, we're really talking about something called the DIKW relationship. In the DIKW relationship we have a sea of data. This sea of data is meaningless unless we can start putting some organization to it. Once we understand the relations inside the data, we can then start to derive information from that data. But information in and of itself is of little use to us unless we can see patterns. Once we understand the patterns, we can now derive knowledge from that ultimate data source.
19:00 - 20:00: Patterns are good. Knowledge is good. But what's really important to us is to understand the principles behind how those patterns work in the world. And from those principles, and that understanding, we derive wisdom.
So the whole concept behind a lot of the things that we talk about in the Internet of Things is really the concept of being able to take data, of which we have lots of it, and derive some wisdom, for the understanding of how the world works, as a result of that.
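The data-to-wisdom progression can be made concrete with a toy example (entirely invented, not from the talk): raw numbers are the data; relating them to something gives information; spotting the trend is knowledge; acting on the underlying principle is the wisdom step.

```python
# Toy illustration of the DIKW progression, using invented
# temperature readings taken once per hour.

# Data: an unorganized sea of numbers.
raw = [21.0, 21.5, 22.1, 28.9, 29.4, 30.2]

# Information: relate each number to its hour of day.
readings = list(enumerate(raw))  # [(0, 21.0), (1, 21.5), ...]

# Knowledge: find the pattern -- temperature rises through the day.
rising = all(b >= a for (_, a), (_, b) in zip(readings, readings[1:]))

# Wisdom (a principle derived from the pattern): act before the peak.
if rising:
    print("Pre-cool the building before the afternoon load peaks.")
```

Each layer is cheap on its own; the value claimed for the Internet of Things is in having enough instrumented data that the upper layers become possible at all.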
[slide 11 - Improving the Human Condition] So how can we go about improving the human condition with the Internet of Things?
Well, here we have a circumstance where if we think about the number of millions of gallons of fresh water that are lost every single day by municipalities through leaking pipes.
20:00 - 21:00: It's just, the infrastructure is old. We don't know where the leaks are. In D.C., we find them because a pipe bursts and now there's ice everywhere. But what if we had the ability to instrument the pipe? What if we had the ability to recognize a potential failure in the water system before it happens?
We could alert the department of public works. We could tell the police, "Hey. Redirect traffic around this intersection." We could get somebody out there to fix the leak, stop the pipe burst before it happens, and save millions of gallons of water. Especially if then we could have sensors that travel through the water system, collecting data about where the water system was bad, and then plan our work as proactive work rather than reactive when something actually fails.
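The proactive-maintenance idea above boils down to flagging anomalies in a sensor stream before the failure occurs. A minimal sketch, using a simple pressure-drop threshold on invented readings (the threshold, function name, and data are all made up; a real deployment would use far more robust statistics):

```python
# Hypothetical leak detector: flag any pipe segment whose pressure
# drop between consecutive sensors exceeds a threshold, so crews
# can be dispatched before the pipe actually bursts.

LEAK_DROP_PSI = 5.0  # invented threshold for this sketch

def find_suspect_segments(pressures_psi):
    """Return indices of segments with an abnormal pressure drop."""
    return [
        i
        for i, (up, down) in enumerate(zip(pressures_psi, pressures_psi[1:]))
        if up - down > LEAK_DROP_PSI
    ]

# Sensors along one main: the 9 psi drop between sensors 2 and 3
# suggests a leak in that segment.
readings = [80.0, 79.0, 78.5, 69.5, 69.0]
print(find_suspect_segments(readings))  # [2]
```

The output of something like this is exactly what would feed the dispatch chain described above: alert public works, reroute traffic, schedule the repair proactively.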
21:00 - 22:00: Other applications: If we had a sensor on someone that could detect a potential heart attack. And that sensor could then, through their smart phone alert the emergency personnel, dispatch emergency personnel, alert the doctor, alert the hospital - "Hey. This individual's having a heart attack." It gets them into the golden hour. The time at which, if you can get medical aid to someone in that first hour, chances are extremely high that you'll be able to save their lives.
Of course we have some more mundane applications of this technology as well. Let's say that I have an intelligent agent that looks at my calendar, and says "Well, he doesn't have any appointments until 10 o'clock. So I'll change the alarm clock to go off an hour later.
22:00 - 23:00: And, not only that, but I'll also start the coffee. And by looking outside I see that the temperature is below freezing." So, 5 minutes before I'm supposed to leave, it will automatically start the Mini, and start warming up the car so I don't have to go scrape the ice off the outside of it. That would be a handy thing. Myself, I know when I'm out there scraping ice, I really wish that my car would automatically do that for me.
But let's say we have our smart refrigerator - with the RFID tags on everything that we put inside the refrigerator. You have to be careful not to eat an RFID tag, I would guess. Of course then that raises the concept of smart toilet paper... Hmmm. Not sure about that. But in any case, what if I didn't know what I was going to have for dinner? I could ask my refrigerator, "What do I have in the refrigerator that is good for dinner?" It could then actually go out on the Internet, look for recipes that matched the things that I have in the refrigerator, and come back to me with some ideas for what I'll make when I get home.
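The recipe lookup the refrigerator would do is, at its simplest, set matching between the RFID-tagged inventory and each recipe's ingredient list. A toy sketch, with recipes and contents invented for illustration:

```python
# Toy sketch of the smart-refrigerator idea: suggest recipes whose
# ingredients are all on hand. Contents and recipes are invented.

fridge = {"eggs", "cheese", "spinach", "milk"}

recipes = {
    "omelet": {"eggs", "cheese", "spinach"},
    "pancakes": {"eggs", "milk", "flour"},  # flour isn't in the fridge
}

# A recipe is makeable if its ingredient set is a subset of the fridge.
makeable = [name for name, needs in recipes.items() if needs <= fridge]
print(makeable)  # ['omelet']
```

A real version would query recipes from the Internet, as described above, but the matching step stays this simple: subset tests against whatever the tags report.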
23:00 - 24:00: And then here we see one that is actually deployed at this point. This is a smart pill bottle that has an LED on it, and it is actually 3G-connected, and the light comes on to tell people when to take their medicine. And which medicine to take.
Of course, as we get older, we start having quite a collection of pills that we have to take, and based on their individual preferences, sometimes we take them in the morning, sometimes in the afternoon, sometimes several times during the day.
So it would be nice to have an LED that would just start blinking when it is time to take the pill. Of course, the potential for someone hacking into that, and then causing mischief also exists.
But, when we look across kind of a broader spectrum. If we take a look at what happens with the energy production grid today.
24:00 - 25:00:
25:00 - 26:00:
26:00 - 27:00:
27:00 - 28:00:
28:00 - 29:00:
29:00 - 30:00:
30:00 - 31:00:
31:00 - 32:00:
32:00 - 33:00:
33:00 - 34:00:
34:00 - 35:00:
35:00 - 36:00:
36:00 - 37:00:
37:00 - 38:00:
38:00 - 39:00:
39:00 - 40:00:
40:00 - 41:00:
41:00 - 42:00:
42:00 - 43:00:
43:00 - 44:00:
44:00 - 45:00:
45:00 - 46:00:
46:00 - 47:00:
47:00 - 48:00:
48:00 - 49:00:
49:00 - 50:00:
50:00 - 51:00:
51:00 - 52:00:
52:00 - 53:00:
53:00 - 54:00:
54:00 - 55:00:
55:00 - 56:00:
56:00 - 57:00:
57:00 - 58:00:
58:00 - 59:00:
59:00 - 60:00: