The Real Internet of Things: The Four Components of Information Architecture
These are published chapters from my book The Real Internet of Things, published on January 1st, 2017.
There are many different information technologies that will be invented and adopted in the coming decades, but I believe they will all fall into four primary categories.
As I spoke about in the realtime data chapter, knowledge of the current state of the world is extraordinarily empowering. It allows us to ask questions about the state of the world and adjust behavior as a result. The more realtime the better, and the more standardized and usable the format the better.
Now that we have the data available, we need to be able to get it to the algorithms that will perform work on it. The protocols will have to be not only standardized, but built to allow trillions of tiny queries and updates, since even one object's various state attributes could be changing tens, hundreds, or thousands of times per second.
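To make the "tiny queries and updates" idea concrete, here is a minimal sketch of what one compact state-update message might look like. Everything here is illustrative, not a real protocol: the field names (`id`, `attr`, `val`, `ts`) and the `state_update` helper are made up for this example. The point is that each message describes a single attribute change and stays as small as possible, since one object may emit many such updates per second.

```python
import json
import time

def state_update(object_id, attr, value):
    """Build one minimal state-update record (hypothetical format)."""
    return {
        "id": object_id,      # stable identifier for the object
        "attr": attr,         # which attribute changed
        "val": value,         # its new value
        "ts": time.time(),    # when the change was observed
    }

# One object, one attribute, one tiny message.
msg = state_update("thermostat-42", "temp_c", 21.5)
payload = json.dumps(msg, separators=(",", ":"))  # compact encoding
print(payload)
```

At scale, a binary encoding and a publish/subscribe transport would likely replace JSON-over-request/response, but the shape of the problem is the same: many small, standardized updates per object.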
Once we have this data, the focus turns to the algorithms that will do the analysis. As we talked about in the 'Businesses as Daemons' chapter, companies will largely compete as data analysis algorithms. Companies will largely have access to the same data; the question will be what you can do with that same data to gain a competitive advantage.
Finally we have the output step. We’ve captured the realtime data, we’ve moved it to where it’ll be analyzed in a standard and efficient way, some company has done their unique analysis on it, and now we’re going to display it to someone or something. That’s presentation, and it will be another opportunity for companies to differentiate.
Creating the ability to track and present realtime data about objects (and ultimately the world) is hard. That’s an engineering problem. The other engineering problem is creating the protocols that will allow us to constantly poll and update objects for their state changes, which will be trillions per second in any large set of objects (like a company, or a city).
Those are efficiency and scalability problems.
The algorithm and presentation steps are significantly more creativity- and innovation-based. They are ultimately what will differentiate competitors in a long-term business market.
There will be innovators solving the engineering problems as well, but it’s infrastructure. It’s the connective tissue that enables the competition in the spaces of algorithmic analysis and presentation of results.
There is also the option, of course, for the output of one algorithm to be sent on to one or many others.
1. Realtime data is collected from the world.
2. The data gets evaluated by algorithms.
3. The output of those algorithms gets presented in some useful way.
The collection and transfer of the realtime data are engineering problems, and the analysis and presentation are creative/innovative problems.
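The flow above can be sketched as a simple pipeline. This is a toy illustration with made-up stand-in functions (`collect`, `analyze`, `present` are not from the book); each stage is a trivial placeholder, and the transfer step is implicit in passing data between stages.

```python
def collect():
    """Stage 1: realtime data captured from the world (stubbed)."""
    return [{"id": "door-1", "open": True}, {"id": "door-2", "open": False}]

def analyze(readings):
    """Stage 3: an algorithm evaluates the data (here, count open doors)."""
    return sum(1 for r in readings if r["open"])

def present(result):
    """Stage 4: present the output in some useful way."""
    return f"{result} door(s) currently open"

# Stage 2 (transfer) is implicit in handing data from one stage to the next.
print(present(analyze(collect())))
```

In this framing, the collection and transfer stages are infrastructure anyone can share, while the bodies of `analyze` and `present` are where one company's version would differ from another's.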