Sensor: a device that detects or measures a physical property and records, indicates, or otherwise responds to it.
We all know what a sensor is, right?
A sensor makes "sense" of a physical property — it turns something about the physical world into data upon which a system can act. Traditionally, sensors have filled well-defined, single-purpose roles: a thermostat, a pressure switch, a motion detector, an oxygen sensor, a knock detector, a smoke detector, a voltage arrestor. Measure one thing, and transmit a very simple message about that one thing. This thinking stems from several hundred years of physical device engineering and persists today in part because of the convenience of modular thinking in system design.
But this is changing. Fundamentally. Software is becoming the new sensor.
Consider your smartphone: it has sound and image capability, along with a multi-axis accelerometer, a 3-axis gyroscope, a magnetic compass, and sensors for air pressure, light levels, touch, you name it. The sensor suite on a current-generation smartphone would completely outclass many sensor packages flown by the US military not long ago. Some of these phone-based sensors still use dedicated hardware to reduce transduced data to information, but increasingly it's all done in software: acceleration, gyro, and other data are reduced to screen orientation, to "phone-to-ear" detection, and to navigational inputs.
Instead of the old-school sensor design, chips capable of capturing highly granular physical inputs at high sample rates feed software running in local memory on a local processor, reducing that data stream into the specific inputs needed for a variety of purposes by the OS and by apps. Even the radio components are becoming a software function.
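To make that reduction step concrete, here is a minimal sketch of how raw 3-axis accelerometer samples might be collapsed in software into a screen-orientation signal. The axis conventions and the 0.5 g threshold are illustrative assumptions, not any vendor's actual API:

```python
def orientation(ax: float, ay: float, az: float) -> str:
    """Reduce one gravity-vector sample (in units of g) to a coarse
    screen orientation. Thresholds and axis conventions are invented
    for illustration."""
    if ay > 0.5:
        return "portrait"
    if ay < -0.5:
        return "portrait-upside-down"
    if ax > 0.5:
        return "landscape-left"
    if ax < -0.5:
        return "landscape-right"
    return "flat"
```

Feed the same raw samples to a different reduction function and you get a different "sensor" from the same hardware.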
Because the decision engine no longer owns the transducer, the underlying data is also available in its raw form. This means a smartphone app can increasingly leverage the same sensor data to make its own decisions in ways never specifically intended by the hardware designer. Am I running, walking, or standing in line? Am I on the bus or in the car? How's my driving? How's my workout going? Is it getting dark out? Is the ambient crowd noise loud enough that I should turn up the volume? What's the gender and age of the speaker?
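The "running, walking, or standing" question is a nice miniature of the idea: the raw acceleration stream is already there, and a few lines of software turn it into a new sensor. This toy version classifies by the jitter of acceleration-magnitude samples; the thresholds are invented for illustration, and real activity classifiers are far more sophisticated:

```python
import statistics

def classify_activity(magnitudes: list[float]) -> str:
    """Classify a window of acceleration-magnitude samples (in g)
    by how much they jitter. Thresholds are illustrative guesses."""
    jitter = statistics.pstdev(magnitudes)
    if jitter < 0.05:
        return "standing"
    if jitter < 0.4:
        return "walking"
    return "running"
```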
A home security system based on similar thinking, with a microphone and a suitable microcontroller, can do much with a software-defined audio processing capability: a glass-break detector, a footstep detector, a heartbeat counter, a doorknob-rattle detector, a dog-bark or shout detector, a trip-and-fall sensor for grandma, an unauthorized teenager-party alarm, all of these sensors defined in software within the same flexible hardware box. No longer the traditional "one box, one answer" security sensor design.
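In its crudest form, one of those software-defined audio sensors can be as simple as watching for a sudden energy burst against the recent background level. The sketch below is a deliberately simplified stand-in for a glass-break trigger; the window length, trigger ratio, and smoothing factor are invented for illustration:

```python
def detect_burst(samples: list[float], window: int = 8, ratio: float = 5.0) -> list[int]:
    """Return sample indices where instantaneous energy jumps well above
    a trailing background estimate. A crude burst detector, not a real
    glass-break algorithm."""
    hits = []
    background = 1e-6  # small floor so the ratio test is defined at start-up
    for i, s in enumerate(samples):
        energy = s * s
        if i >= window and energy > ratio * background:
            hits.append(i)
        # exponential moving average of background energy
        background = 0.9 * background + 0.1 * energy
    return hits
```

Swap in a different analysis function and the same microphone becomes a footstep detector or a bark detector instead.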
Industrial IoT applications are numerous, and many are already in production. Dedicated physical thermocouples and vibration-limit switches are being replaced with digital temperature probes and accelerometers attached to embedded microcontrollers.
New software-defined sensing can now employ AI and predictive analytics (like ours) to intervene before a problem happens. We can now alert operators to a pending issue or needed maintenance with time for critical, high-value processes to be spooled down in a controlled, planned fashion – or resolved during the next scheduled downtime so no interruption is necessary at all. Manufacturers and insurers can be kept in the loop regarding equipment field issues and parts needs and can perform post-event forensics after critical failures.
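The "intervene before a problem happens" idea can be illustrated with a toy trend estimate: fit a line to recent readings and project when the trend will cross a limit, giving operators lead time. Real predictive analytics are far richer than this; the sketch below is only a least-squares illustration with hypothetical hourly temperature readings:

```python
def hours_to_limit(readings: list[float], limit: float):
    """Fit a least-squares line to hourly readings and estimate how many
    hours remain until the trend crosses `limit`. Returns None if the
    trend is flat or falling. A toy illustration, not a product algorithm."""
    n = len(readings)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(readings) / n
    denom = sum((x - mx) ** 2 for x in xs)
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, readings)) / denom
    if slope <= 0:
        return None
    return max(0.0, (limit - readings[-1]) / slope)
```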
Automotive sensing is also headed this way: decision-making functions are trending away from hard-wired, end-point transducers and toward onboard computers. Automakers know this places increasing flexibility and software-adaptive capability into the hands of system designers.
Modularity is increasingly moving from the physical layer to a network layer, in which modules are connected on a peer-to-peer network, exchanging packetized information. While this network layer begins as a digital substitute for individual electrical circuits, with increasing bandwidth capacity it can also provide the flexibility for devices to share underlying data as well as local yes/no decisions. This creates an unprecedented opportunity both to integrate information across modalities and to add brand new capability, ad hoc, in the form of software sensors.
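Sharing both the underlying data and the local yes/no decision over such a network might look, in miniature, like the packet below. The field names and JSON encoding are illustrative assumptions, not any real protocol:

```python
import json
import time

def make_packet(node_id: str, kind: str, value: float, decision: bool) -> str:
    """Bundle a raw reading and the module's local decision into one
    packet, so peers can consume either. Schema is hypothetical."""
    return json.dumps({
        "node": node_id,       # which module on the peer-to-peer network
        "kind": kind,          # what physical quantity was measured
        "value": value,        # the raw reading, for ad hoc software sensors
        "decision": decision,  # the module's own yes/no determination
        "ts": time.time(),     # timestamp for cross-modality integration
    })
```

A peer that only needs the yes/no can read `decision`; a new software sensor added later can mine `value` for purposes the module's designer never anticipated.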
This shift in thinking also opens the door to incorporating more complex, AI-based algorithms, rather than just simple condition thresholds. Sensor information can be integrated in ever more complex ways, and even the innocuous electrical panel circuit breaker is becoming a microcontroller-powered, software sensing device.
Tools like Reality AI are enabling machine learning smarts in these environments.
This only makes sense.