These are published chapters from my book The Real Internet of Things, published on January 1st, 2017.
Perhaps the best way to think of a daemonized world is to imagine a fabric of interactive and interconnected nodes made up of every object.
Then imagine each of these nodes having instant access to the state of every other node in the fabric. Except it’s not just knowledge of state, but the ability to modify, adjust, and command those nodes based on needs, desires, and access.
The power of this should not be underestimated.
Unbelievable amounts of waste can be attributed to imprecise guesses about the state and nature of reality. How far is that city? Is that person married or single? How many people do I know within two city blocks? How many devices are plugged into power on this city block? How happy is this country compared to that one, and what are the factors for that difference?
These are ephemeral truths that have always been unavailable to us at any given moment, because a fact’s lifespan has always been shorter than its acquisition period. Stated differently, it took so much time and so many resources to learn this information (assuming we knew enough to ask the question correctly in the first place) that it was often stale by the time it was gathered, making the whole effort a waste.
When objects maintain their own precise, authoritative, and realtime state, and when that information is available to every other object in the world in a fraction of a second, our connection to reality changes dramatically.
At that point, sensors simply become inputs for realtime data streams published through daemonization: light, sound, heat, vibration, chemical composition, EM/RF energy, and so on. Cameras and microphones (light and sound sensors) will be the most powerful and prolific of these, and they’ll be transformed from hardware used by applications into sensors used to supply data to algorithms.
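As a thought experiment, the shift described above can be sketched in code: a daemonized sensor publishes timestamped state updates to any subscriber, rather than serving a single application. The sketch below is purely illustrative; the `LightSensor` class, the `SensorReading` schema, and all field names are invented for this example, not part of any real daemonization standard.

```python
import time
from dataclasses import dataclass, field, asdict

@dataclass
class SensorReading:
    """A single timestamped state update from a daemonized sensor."""
    sensor_id: str
    kind: str        # e.g. "light", "sound", "vibration"
    value: float
    timestamp: float = field(default_factory=time.time)

class LightSensor:
    """A hypothetical sensor that publishes state instead of serving one app."""
    def __init__(self, sensor_id: str):
        self.sensor_id = sensor_id
        self.subscribers = []   # analysis algorithms consuming this stream

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, lux: float):
        reading = SensorReading(self.sensor_id, "light", lux)
        for notify in self.subscribers:
            notify(asdict(reading))   # every consumer sees the same schema

# Usage: an analysis algorithm is just another subscriber to the stream.
seen = []
sensor = LightSensor("streetlamp-42")
sensor.subscribe(seen.append)
sensor.publish(812.0)
```

The design point is that the sensor has no idea who consumes its state; adding a new algorithm means adding a subscriber, not modifying the sensor.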
We learn about the world through analysis of data. We use machine learning and other types of Artificial Intelligence to look at data and give us answers to interesting questions, as well as to help us ask better ones.
This will be done through a hybrid of local and global resources as is required by the particular application. Certain real-time applications will require instant responses and won’t have time to go to the network and back, while others will not have such constraints.
I prefer the term “synthetic” over “artificial” intelligence, as its capabilities will end up being just as real as our own despite having non-biological origins.
But the problem will continue to be the collection and curation of that data. What daemonization provides is an infinite stream of data coming from the sources themselves, i.e., trillions of objects telling you their exact state at every moment—constantly.
Think about universal interfaces between data analysis algorithms and the sources of data they consume. Think about every object’s data being presented in a way that natively facilitates continuous data analysis at scale.
From the practical, “What percentage of moving vehicles in Colorado currently have more than two people in them?”, to the specific, “Which top three factors most caused unhappiness in the city of London within the last 24 hours?” The point is not any particular query, but rather that we’ll be able to ask nearly any question and almost instantly know the answer.
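The first query above could be sketched against such a uniform state schema: when every object reports its state in one shared shape, the same generic query code works across object types. This is a minimal sketch under assumed conventions; the record fields and the `count_where` helper are hypothetical, not drawn from any real system.

```python
# Every object's state as a flat, uniformly-shaped record (assumed schema).
fleet_state = [
    {"object": "vehicle", "id": "v1", "moving": True,  "occupants": 3},
    {"object": "vehicle", "id": "v2", "moving": True,  "occupants": 1},
    {"object": "vehicle", "id": "v3", "moving": False, "occupants": 4},
]

def count_where(records, predicate):
    """Generic query: works on any object type sharing the schema."""
    return sum(1 for r in records if predicate(r))

moving = count_where(fleet_state, lambda r: r["moving"])
multi = count_where(fleet_state, lambda r: r["moving"] and r["occupants"] > 2)
share = multi / moving  # fraction of moving vehicles with >2 occupants
```

Because the query is just a predicate over a shared record shape, the same `count_where` would answer questions about streetlamps or thermostats without new integration work.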
When combined with Synthetic Intelligence powered continuous analysis, this is not just game-changing—it’s civilization-changing.
And that’s just the reading part of daemon capabilities.
- The lack of realtime data about the world severely restricts our ability to learn about it.
- Universal Daemonization will provide us accurate realtime data about the objects in our world.
- This enables sensors (such as those for light and sound) to continuously feed realtime data into analysis algorithms.
- This will enable us to have a realtime view of the entire world, constrained only by how much we can evaluate at one time.