Rob Tiffany #Digital Podcast 13

#Web3 Explained

Dr. Gavin Wood coined the term “web3” back in 2014 because he thought the current Web and Internet were fundamentally broken, and he wanted to fix them with blockchain technologies like Ethereum. Control over identity, financial transactions, and personal data had become centralized and dominated by a handful of powerful web platforms. You can think of web3 as a “Power to the People” movement.

Daring Greatly in 2021

A small group of innovators spent 2021 in the Arena addressing Agriculture 4.0 issues including a rapidly growing population, the scarcity of natural resources, climate change, and food waste using 5G and Industrial Internet of Things technologies.

Let’s start by looking at the problems:

With Earth’s population nearing 10 billion by 2050, growers must double food production to keep up. In other words, growers must produce more food than has been produced since the beginning of farming.

It turns out there are a few headwinds that make this seemingly impossible task even more daunting.

Climate change drove an unprecedented heat wave during the summer of 2021 that wiped out or severely diminished many varieties of fruits and vegetables. Cherries, raspberries, blueberries, and blackberries shriveled up or burned, putting family farms at risk. Apples and wine grapes stopped growing, leading to a smaller harvest. Internal cellular damage to certain fruits and vegetables will carry over into next year’s growing season, leading to additional small harvests. Unrelenting floods drove damaging fungal growth. Warmer winters confuse budding plants, while late spring frosts can kill those that started growing too early.

Extreme heat and flooding leads to less food.

Most of the Western United States is experiencing extreme drought due to the lack of rainfall combined with evaporative heat, as the Colorado River struggles to deliver enough water to the 40 million people who depend on it. America’s two biggest reservoirs, Lake Mead and Lake Powell, are now at historically low water levels. Since agriculture uses 70% of all freshwater, we find ourselves in a desperate situation.

Reduced water leads to less food.

The Earth is losing almost 30 million acres of arable land each year due to a phenomenon known as desertification. In fact, we’ve lost 33% of all arable land over the last 40 years. Primary culprits include urbanization and deforestation, along with farming and ranching practices that lead to overcultivation, overcropping, and overgrazing. Soil is eroding and turning into lifeless dirt due to drought and poor farming practices such as tilling, which depletes soil nutrients.

Less land leads to less food.

Exceptionally dry forests combined with a variety of ignition sources have led to widespread fires. Sadly, the term “Fire Season” is now a thing. Smoke threatens people and livestock, while fire threatens agricultural lands.

Burned land leads to less food.

A workforce is required for planting, maintaining, and harvesting fruits and vegetables as well as operating and maintaining farm equipment. The current labor shortage is at a crisis level with growers losing crops and income as fresh produce is left rotting in the field because there aren’t enough workers for harvest.

A reduced workforce leads to less food.

Energy usage accounts for roughly 15% of total farm expenditures and comes from operating farm machinery, trucks, processing warehouses, tractors, irrigation pumps, HVAC, ATVs, crop dryers, packing houses, and cold storage. As costs for electricity, gasoline, propane, and diesel increase, grower operating margins decrease making it harder to stay in business.

High energy costs lead to less food.

Between 30 and 40 percent of food, about 1.3 billion tons, is wasted after harvest every year due to disruptions in the supply chain. This disruption creates gaps between production and distribution, leading to the loss of perishable items like eggs, milk, and produce in the face of unprecedented need at food banks. Improper temperatures at different parts of the cold chain lead to reduced quality or complete food loss. Lean, just-in-time supply chains based on buyer behavior leave no room for error.

Supply chain disruptions lead to less food.

Let’s look at the solution we worked on in 2021:

While many of you know about Industrial IoT, Digital Twins, and Industry 4.0, you may not have heard about Agriculture 4.0. I spent 2021 with partners Courtney Latta and Doug Boling building technology that implemented the same kind of IIoT capabilities used in a Smart Factory to facilitate sustainable Agriculture.

As a refresher, the Internet of Things uses devices, sensors, and connectivity to allow you to remotely know the state, performance, or health of an object in real-time. In our case, we needed apples, hops, soil, air, equipment, and water to talk to us and let us know how they’re doing. With that data and a little bit of analytics, we could help growers make more informed farming decisions to drive desired outcomes such as:

  • Reduced water usage (save money and environment)
  • Increased crop quality (make money)
  • Reduced energy usage (save money and environment)
  • Reduced fertilizer usage (save money and environment)
  • Increased crop yields (make money)
  • Reduced pesticide usage (save money and environment)
  • Reduced labor costs (save money)
  • Improved food traceability (reduce food waste)
  • Increased crop protection from frost, heat, or disease (make money)
  • Increased worker safety (employee well being)
  • Reduced herbicide usage (save money and environment)
  • Increased automation (save money)
  • Increased equipment uptime (save money and make money)

What did we do during the growing season?

We combined battery-powered devices and sensors measuring soil moisture, temperature, air quality, humidity, location, and more with 5G cellular connectivity and our portable Software as a Service (SaaS) platform that can run in the Cloud or at the Edge. Our goal was to validate the MVP of our product. During pilots and trials on large farms in Eastern Washington, we deployed devices and sensors throughout acres of hops and apples. At regular intervals, those devices wirelessly transmitted their telemetry data to our platform. Sometimes we analyzed the captured data with our analytics, and other times our platform routed the data to the grower’s analytics systems to derive insights.

What was the outcome?

In the end, Courtney, Doug, and I found our product/market fit and validated our product with happy customers who willingly signed letters of intent. Heck, we even won the “IoT Innovation of the Year for Agriculture” award from Compass Intelligence. I got to wear a cowboy hat and boots and spent the summer of 2021 amongst trees, crops, and soil. Obviously, we barely scratched the surface of the value the technology we built can provide. We also have no illusions that the precision agriculture we delivered can solve all the problems faced by growers and society at large. Battling climate change, drought, fires, desertification of soil, floods, and food waste is an “all hands on deck” calling for everyone. While we are only part of the solution, the satisfaction and fulfillment we felt this year is immeasurable, and we’re happy to play our part.

The Future of Agriculture is the Future of Humanity

Static Properties of a Digital Twin Model

Static properties enumerated by a #digitaltwin model represent data that typically doesn’t change.

Unlike telemetry properties that map 1:1 with a sensor or other data point, static properties contain values that typically don’t change.


Using a car as an example, static properties of a digital twin model could be things like the length of the car, the number of cylinders and displacement of the engine, as well as the volume of the trunk.

Static properties give you a more complete view of the physical twin and serve as reference data for analytics. Digital twin instances inherit these unchanging static properties from their corresponding digital twin model.
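To make the idea concrete, here is a minimal sketch of static properties for the car example above. The class and field names are illustrative assumptions, not the schema of any particular digital twin platform.

```python
from dataclasses import dataclass

# Hypothetical static properties for a car digital twin model.
# frozen=True reflects that static properties typically don't change.
@dataclass(frozen=True)
class CarModelStatics:
    length_mm: int              # overall length of the car
    engine_cylinders: int       # number of cylinders
    engine_displacement_l: float
    trunk_volume_l: float       # volume of room in the trunk

# Every instance derived from this model inherits the same values.
statics = CarModelStatics(
    length_mm=5890,
    engine_cylinders=6,
    engine_displacement_l=3.5,
    trunk_volume_l=1500.0,
)
print(statics.engine_cylinders)  # → 6
```

Because the dataclass is frozen, any attempt to reassign a field raises an error, which mirrors the "defined once, never changes" nature of static properties.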

More to Come

Follow along with me as I take you on a deep dive of all the elements that come together to make a digital twin. Click links below to catch up with other articles in the series:

IoT Coffee Talk: #83 – The 2021 Christmas Special!

Welcome to #IoT Coffee Talk #83 where we chat about #Digital #IIoT #Automation #DigitalTwins #Edge #Cloud #DigitalTransformation #5G #AI #Data #Industry40 & #Sustainability over a cup of coffee.

Grab a cup and settle in with some of the industry’s leading business minds and technology thought leaders for a lively, irreverent, and informative discussion about IoT in a totally unscripted, organic format.


It’s our Christmas special for 2021! We talk about Christmas logs that Catalonian children beat up to make them poop presents, the chip shortage, the problem with spectrum policy, and a bunch of other nonsense. It’s pretty good nonsense, so you will want to watch the whole thing… the whole thing!

Telemetry Properties of a Digital Twin Model

Telemetry represents the data flowing from a physical twin to a #digitaltwin along with the associated properties that define the data points.

Data Points

Telemetry properties are the dynamic properties of a digital twin model, containing values that can change often. For every data point sent from a sensor, tag, or other data source representing a physical twin, a corresponding telemetry property must be defined for the digital twin model. It starts with a human-readable, friendly name that aligns with the data point and makes sense to people and analytics, something like temperature or humidity. In the event that the data points or tags use something unintelligible like T1 or H2, you must also define an unfriendly name that will be translated to its friendly equivalent.

Next up, you must assign a data type and unit of measure to the telemetry property. The data type could be a string, a whole number like an integer, a Boolean (true/false), or a floating point number. The unit of measure could be acceleration or pounds per square inch (PSI) of air pressure in a car tire. Assigning data types and units of measure enables conditional logic operations to be performed.

All the telemetry property elements that comprise a digital twin model are inherited by appropriate digital twin instances and tell the software agents in your platform what to expect from incoming data. This facilitates pattern matching.
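The telemetry property elements described above can be sketched in code. This is an illustrative data structure under my own naming assumptions, not the definition format of any specific platform; it shows the friendly name, unfriendly name, data type, and unit of measure coming together, plus the raw-tag translation.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical telemetry property definition for a digital twin model.
@dataclass
class TelemetryProperty:
    friendly_name: str                      # e.g. "temperature"
    data_type: type                         # str, int, bool, or float
    unit: str                               # e.g. "celsius", "psi"
    unfriendly_name: Optional[str] = None   # raw tag like "T1" or "H2"

def to_friendly(props: list, raw_name: str) -> str:
    """Translate an unintelligible tag to its friendly equivalent."""
    for p in props:
        if raw_name in (p.friendly_name, p.unfriendly_name):
            return p.friendly_name
    raise KeyError(f"No telemetry property defined for {raw_name!r}")

model_props = [
    TelemetryProperty("temperature", float, "celsius", unfriendly_name="T1"),
    TelemetryProperty("humidity", float, "percent", unfriendly_name="H2"),
]
print(to_friendly(model_props, "T1"))  # → temperature
```

Raising an error for an undefined tag reflects the rule in the text: every incoming data point must have a corresponding telemetry property on the model.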

Data Format

Last but not least, the format that contains all the data points transmitted from the physical twin must be defined. Whether the data is streamed across as JSON, XML, Binary, CSV, Avro, Protobuf, or MessagePack, the platform ingesting the data must know how to parse it.
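As a small example of the ingestion step, here is a sketch that parses a JSON payload and translates raw tags to friendly names. The payload shape and tag names are assumptions for illustration; a real platform would pick the parser (JSON, XML, Avro, Protobuf, etc.) based on the defined data format.

```python
import json

# Hypothetical JSON payload streamed from the physical twin.
payload = '{"T1": 21.5, "H2": 48.0}'

# Mapping of unfriendly tags to friendly names, taken from the
# telemetry property definitions on the digital twin model.
raw_to_friendly = {"T1": "temperature", "H2": "humidity"}

# Parse the payload and translate each data point.
data_points = {
    raw_to_friendly[tag]: value
    for tag, value in json.loads(payload).items()
}
print(data_points)  # → {'temperature': 21.5, 'humidity': 48.0}
```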


Key Performance Indicators

For every telemetry property you define, there’s a good chance you know in advance what a good or proper data value should be. For instance, when you define the RightFrontTire telemetry property of your car with an integer data type and PSI unit of measure, you might know that 35 is the recommended pressure for your tire. You can therefore define key performance indicator (KPI) ranges for each of the properties. Green is good. Yellow is a warning. Red is dangerous. A range from 34 to 36 might be green, whereas a range from 31 to 33 or 37 to 39 might be yellow. Anything higher or lower than those ranges could be red.

The software agents in your platform will look at the incoming data points, compare those values to the KPIs defined for the corresponding telemetry properties, and fire the appropriate event for green, yellow, or red to deliver an insight or take an action. The use of KPIs tied to each telemetry property is optional and represents the simplest form of analytics. Those of you in manufacturing will note this is similar to defining thresholds and limits for machine operations.
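The tire-pressure KPI bands above translate directly into code. This is a minimal sketch of the evaluation logic a software agent might apply; the thresholds match the example in the text, while the function itself is my own illustration.

```python
# KPI evaluation for the RightFrontTire telemetry property (PSI).
# 35 PSI is the recommended pressure; bands follow the article's example.
def kpi_status(psi: int) -> str:
    if 34 <= psi <= 36:
        return "green"   # good
    if 31 <= psi <= 33 or 37 <= psi <= 39:
        return "yellow"  # warning
    return "red"         # dangerous

print(kpi_status(35))  # → green
print(kpi_status(38))  # → yellow
print(kpi_status(28))  # → red
```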

Prescriptive Analytics

If you choose to define KPIs for your digital twin model properties, you also have the option to define what should be done for respective green, yellow, and red events. This is called prescriptive analytics and clarifies one or more actions to take. Using the tire pressure example, no action is taken for a green event, whereas a yellow event would tell the driver of the car to add or remove a small amount of air from the tire. A dangerous red event would tell the driver to stop their car immediately and change the tire. Since you can define a list of prescriptive actions to take for each KPI event, an additional action to take for the red event might instruct the driver to call a tow truck if the car doesn’t have a spare tire.
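One way to picture prescriptive analytics is as a lookup from KPI events to lists of actions, following the tire example above. The dictionary shape and action strings are illustrative assumptions, not a prescribed platform format.

```python
# Hypothetical prescriptive actions keyed to KPI events.
# Red carries multiple actions, as described in the text.
PRESCRIPTIONS = {
    "green": [],  # no action taken for a green event
    "yellow": ["Add or remove a small amount of air from the tire."],
    "red": [
        "Stop the car immediately and change the tire.",
        "If the car has no spare tire, call a tow truck.",
    ],
}

def prescribe(event: str) -> list:
    """Return the list of prescriptive actions for a KPI event."""
    return PRESCRIPTIONS.get(event, [])

for action in prescribe("red"):
    print(action)
```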

More to Come

Follow along with me as I take you on a deep dive of all the elements that come together to make a digital twin. Click links below to catch up with other articles in the series:

Digital Twin Prototypes and Models

The #digitaltwin prototype is created before the physical twin and the digital twin model is used to define a type of entity or process.

What is a Prototype?

The digital twin prototype is typically comprised of engineering designs, processes, relevant analysis, and a visual representation to create the physical product or process. As you might imagine, it’s faster and cheaper to create and test a digital product than a physical one. Digital mistakes and iterations are definitely less costly or painful.

Models let you Define Once and Reuse Many

The digital twin model allows you to define all aspects of a type of entity just once, rather than defining it repeatedly for each individual entity in your IoT platform. The model includes baseline information such as a name, description, a variety of properties, one or more pictures or CAD models, and a version number, because the model may evolve over its lifetime. If you’re an object-oriented software developer, you can think of this as a base class comprised of one or more properties.

Unique, individual entities are referred to as digital twin instances, which I’ll cover in a later article. Each instance of a digital twin derived from a digital twin model will inherit its properties. You can best think of a digital twin model as a data definition, structured in a database or file. A composable digital twin model is created using a visual designer in an application or through a domain-specific programming language designed to create the proper data model. An example of a digital twin model might be a 2022 Ford F-150 with a specific set of features as properties. The thousands of actual 2022 Ford F-150s on the road that inherit the specific feature set from the model would be the digital twin instances.
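The base-class analogy can be sketched directly. The feature values below (engine, bed length, VIN) are made up for illustration; the point is that the model is defined once and every instance inherits its properties while adding its own identity.

```python
# Sketch of the model-as-base-class analogy from the text.
class FordF150Model:
    """Digital twin model: defined once, shared by all instances."""
    name = "2022 Ford F-150"
    version = 1
    engine = "3.5L V6"        # illustrative feature properties
    bed_length_ft = 6.5

class FordF150Instance(FordF150Model):
    """Digital twin instance: one physical truck on the road."""
    def __init__(self, vin: str):
        self.vin = vin        # unique to this individual entity

# The instance inherits the model's feature set automatically.
truck = FordF150Instance(vin="1FTFW1E50NFA00001")  # hypothetical VIN
print(truck.engine)  # → 3.5L V6
```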

More to Come

Follow along with me as I take you on a deep dive of all the elements that come together to make a digital twin. Click links below to catch up with other articles in the series:

Digital Twins Defined

What are #digitaltwins and where did they come from?

What are they?

A digital twin is a digital representation of a physical object, process, place, person, or system that sits at the intersection of connectivity, data, and analytics. At a high level, the digital twin concept is comprised of three parts: the physical space, the digital space, and the connection between the two. Digital twins are used in the design, simulation, and operational phases of a product or system lifecycle. Design is done digitally before physical creation occurs, which saves an organization money because it avoids costly, physical mistakes. Digital simulations are performed by feeding in test data to see how digital twins react. Think of this as a digital version of a wind tunnel to test the aerodynamics of a car.

Operationally, data populates the digital twin with the physical counterpart’s current state and behavior. This allows you to observe both the current and historical state of the physical twin. Further context is derived when additional data from a variety of sources is blended with the digital twin. Software agents compare incoming data from the physical twin with expected state values and KPIs of the digital twin to trigger further analysis and actions. Oftentimes you’ll hear that digital twins are a 3D CAD model or what you see when wearing AR or VR goggles. Keep in mind that those are actually a view of the digital twin’s data model combined with live data from the physical twin. The view of a digital twin could just as easily be an Excel spreadsheet or an exploration through the metaverse.

Where did they come from?

If you remember watching the movie Apollo 13, you saw how Tom Hanks, Kevin Bacon, and Bill Paxton were struggling to get home from the moon in their disabled spacecraft after an oxygen tank exploded. Yes, this happened in real life. You might also remember Gary Sinise troubleshooting the problem in a physical twin of the spacecraft at mission control in Houston. Luckily, through ingenuity and perseverance, the astronauts safely returned to Earth. The learnings from this event gave rise to the idea of creating a high-fidelity digital replica of spacecraft to make it easier to troubleshoot problems in the future. In the early 1990s, the concept of digital twins was anticipated by David Gelernter’s book, “Mirror Worlds.” In 2002, Dr. Michael Grieves introduced the idea of a “Doubleganger” as part of product lifecycle management (PLM) while he was at the University of Michigan. Dr. Grieves went on to say, “Industry 4.0 is only possible with the digital twin.” In 2010, the name “digital twin” finally stuck when John Vickers of NASA referred to it in an official roadmap report. Throughout the 2000s, I witnessed the slow rise of digital twins in manufacturing, showcased by companies like GE and at events like Hannover Messe.

Asset Avatars

In 2016, while serving as CTO at Hitachi Insight Group, it occurred to me that digital twin technology should be at the very heart of the industrial IoT platform we were creating. Collaborating with members of the product management and engineering teams, we created an industrial digital twin technology called “Asset Avatars” running within the Lumada platform. The Asset Avatars could model machines, subsystems, assembly lines, and entire factories. We literally brought machines and processes to life in a virtual world to enhance operational efficiency, provide early warning of problems, and ensure uptime of Hitachi assets such as bullet trains and wind turbines. Since then, I’ve been a huge proponent of this powerful technology.

More to Come

Follow along with me as I take you on a deep dive of all the elements that come together to make a digital twin. Click links below to catch up with other articles in the series:

IoT Coffee Talk: #82 – Log4j – The Great Hole

Welcome to #IoT Coffee Talk #82 where we chat about #Digital #IIoT #Automation #DigitalTwins #Edge #Cloud #DigitalTransformation #5G #AI #Data #Industry40 & #Sustainability over a cup of coffee.



In this installment, we talk about open source software. Is it more secure or just more transparent? Who do you trust? The closed software guys or the open software guys? What about SBOM (software bill of materials)? Can that help in preventing things like Log4J which opens a massive security hole in our digital universe? Get into the fray. It’s techno religious war time!

IoT Coffee Talk: #81 – The Cloud Episode

Welcome to #IoT Coffee Talk #81 where we chat about #Digital #IIoT #Automation #DigitalTwins #Edge #Cloud #DigitalTransformation #5G #AI #Data #Industry40 & #Sustainability over a cup of coffee.



In this installment, we break down the cloud and get real with some realities that are slowly but surely grinding down the hype. If you have an appetite for some raw, unhinged rants about cloud and all that is right and wrong about it, this is the IoT Coffee Talk episode for you!