Reto Meier and Colt McAnlis present Build Out (Episode 1)

A Garden that Cares for Itself

Designing an autonomous, learning smart garden

In the first episode of Build Out, Colt and Reto — tasked with designing the architecture for a “Smart Garden” — supplied two very different concepts that nevertheless featured many overlapping elements. Take a look at the video to see what they came up with, then continue reading to see how you can learn from their explorations to build your very own Smart Garden.

TL;DW: What they built

Both solutions aim to optimize plant care using sensors, weather forecasts, and machine learning. Watering and fertilizing routines for the plants are updated regularly to maximize growth, health, and fruit yield.

Colt’s solution is optimized for small-scale home farming, using a modified CNC machine to care for a fruit or vegetable patch. The drill bit is replaced with a liquid spout, UV light, and camera, while the cutting area is replaced with a plant bed that includes sensors to track moisture, nutrient levels, and weight.

Colt’s CNC machine-based automated grower

Reto’s solution extends a typical garden sprinkler / reticulation system into a distributed autonomous garden care system. A liquid / fertilizer spout and soil sensors are incorporated into each stand-alone device, each of which is installed alongside a plant in your garden — allowing each plant to receive a custom care regime.

Reto’s distributed garden care solution

How to build your own Smart Garden

The basic system architecture is based around the following common components:

One or more devices powered by a microprocessor / micro-controller.

An Android Things “hub” that connects the garden care device(s) to the cloud.

A cloud server component built around an App Engine Flexible environment that processes camera and sensor results, and determines the appropriate changes to the plant care instructions.

A machine learning implementation built around TensorFlow models to analyze camera images and optimize plant-care instructions.

User client(s) to monitor (and control) the garden behavior and status.

The basic underlying architecture consistent across both solutions.

Garden-side client architecture

The garden care devices — both the CNC rig and the individual plant-care sensor/sprinklers — are optimized for their specific role: monitoring sensors and controlling the actuators used to move the arm, trigger the water and fertilizer delivery mechanisms, and collect camera footage.

The specific hardware and software platforms you select for this component will be determined by your specific needs. Colt’s “home-made” CNC machine is driven by a Raspberry Pi, while Reto’s per-plant devices are built on a micro-controller like the ESP32.

In any case, you don’t want to connect the plant-adjacent hardware directly to the cloud. Reto’s solution includes multiple devices connected together in a Bluetooth mesh, while even Colt’s CNC system may include multiple CNC devices in each location — so we connect each device wirelessly to a “hub” built using Android Things.

Why Android Things? Android Things lets you build connected devices using the powerful development tools and APIs used to build Android apps within Android Studio. Reto and Colt are both experienced Android developers, so this is a big shortcut for them, but you can use any language / SDK supported by the underlying hardware platform.

The Garden Hub serves as the single point of contact between the garden device(s) and the cloud-based service, and in that role has two purposes. It:

collects and aggregates the readings from all available sensors and cameras, and uploads them to the cloud for analysis.

controls the behavior and operation of each garden device (usually based on the results of the cloud analysis.)

To control the operation of each garden device, each hub begins with a set of default garden care instructions, which are updated as needed. The default and current settings are all stored locally, ensuring full offline support so that losing connectivity won’t prevent the garden being watered.
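This default-then-override behavior is easy to sketch. The snippet below is a minimal illustration of a hub-side settings store — the class and field names (`watering_ml`, `interval_hours`) are hypothetical, not taken from either solution:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the hub's local care-instruction store.
# Field names like "watering_ml" are illustrative assumptions.

@dataclass
class CareInstructionStore:
    defaults: dict                                # shipped with the hub
    current: dict = field(default_factory=dict)   # last synced from the cloud

    def update_from_cloud(self, changes: dict) -> None:
        """Apply a (possibly partial) settings update from the server."""
        self.current.update(changes)

    def get(self, key: str):
        """Prefer the synced value; fall back to the default so the
        garden still gets watered if connectivity is lost before any
        update for this key has ever arrived."""
        return self.current.get(key, self.defaults[key])

store = CareInstructionStore(defaults={"watering_ml": 250, "interval_hours": 12})
store.update_from_cloud({"watering_ml": 180})   # cloud analysis reduced watering
print(store.get("watering_ml"), store.get("interval_hours"))   # → 180 12
```

In practice a synced database (see below) would persist `current` across reboots; the key design point is that reads never fail just because the cloud is unreachable.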

One solution for local offline support is the Firebase Realtime Database, which is well supported by Android Things and, in addition to offline support, can provide auto-synchronization with our server and client implementations.

Why Firebase Realtime Database? The data being stored within each hub is simple, non-relational, and relatively small. Firebase is a NoSQL database that can handle the volume of data we’re storing, takes care of data synchronization for us, and remains available if we go offline. This is handy as it means we don’t need to create multiple database storage and synchronization implementations across multiple systems.

The need to have a local database depends on the complexity of your planned garden setup. In the case of a single CNC machine, a database may be overkill — while having a device per plant necessitates a fairly robust database mechanism.

Choosing to include support for web- or app-based modification / control of your garden makes the use of an auto-synced database even more compelling.

Data collection & storage

All your sensor data, including camera imagery, should first be collected at the Garden Hub before being transferred to the cloud for analysis. The frequency at which you collect and upload this data will vary based on the size of your garden and the resolution of your analysis.
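The hub-side collection step might look something like the following sketch, which averages raw readings per sensor and metric into one compact batch per reporting interval — the tuple shape and sensor names are assumptions for illustration:

```python
from collections import defaultdict
from statistics import mean

# Illustrative hub-side aggregation: readings arrive as
# (sensor_id, metric, value) tuples; we average each metric per
# sensor so a single compact batch is uploaded per interval.

def aggregate(readings):
    buckets = defaultdict(list)
    for sensor_id, metric, value in readings:
        buckets[(sensor_id, metric)].append(value)
    return {key: round(mean(vals), 2) for key, vals in buckets.items()}

batch = aggregate([
    ("plant-7", "moisture", 0.31),
    ("plant-7", "moisture", 0.29),
    ("plant-7", "ph", 6.4),
])
print(batch)   # → {('plant-7', 'moisture'): 0.3, ('plant-7', 'ph'): 6.4}
```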

The camera imagery can all be uploaded easily to Google Cloud Storage (Object/File Serving & Storage), Google’s unified object storage offering.

There are multiple options for uploading and storing the sensor data. One particularly flexible approach is to use Cloud Pub/Sub (Distributed Real-time Messaging) to aggregate and publish the sensor data from each garden into a single durable stream, to which multiple cloud components can subscribe.

Why Cloud Pub/Sub? Pub/Sub provides low-latency, durable message delivery based on a many-to-many, asynchronous stream that decouples senders and receivers. That means we’re guaranteed that each message will be received, potentially by multiple subscribers within our server implementation (such as App Engine, BigQuery, and Dataflow.)

Using Pub/Sub allows us to decouple the uploading and processing steps. This is particularly powerful if multiple components in our cloud architecture need to consume the same data. It also allows us to use services such as Cloud Dataflow (Managed Data Processing) to modify the incoming stream, or perform real-time analytics, prior to processing by our analysis engine.
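A minimal publish path could look like the sketch below. The message fields, project ID, and topic name are assumptions for illustration; the actual publish call requires the google-cloud-pubsub client library and credentials, so it’s kept in a separate function here:

```python
import json
import time

# Sketch of publishing an aggregated sensor batch to Cloud Pub/Sub.
# The "garden_id" / "readings" schema and topic name are assumptions.

def build_message(garden_id: str, batch: dict, ts: float) -> bytes:
    """Pub/Sub payloads are bytes; serialize the batch as JSON."""
    return json.dumps({
        "garden_id": garden_id,
        "timestamp": ts,
        "readings": batch,
    }, sort_keys=True).encode("utf-8")

def publish(batch: dict) -> None:
    # Requires google-cloud-pubsub and application credentials.
    from google.cloud import pubsub_v1
    publisher = pubsub_v1.PublisherClient()
    topic = publisher.topic_path("my-garden-project", "sensor-readings")
    publisher.publish(topic, build_message("garden-1", batch, time.time()))

msg = build_message("garden-1", {"plant-7/moisture": 0.3}, 1500000000.0)
print(msg)
```

Because subscribers only see the serialized message, any consumer — the analysis engine, Dataflow, BigQuery ingestion — can be added later without touching the hub.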

More recently, Google released Cloud IoT Core in Beta. This fully managed service utilizes Pub/Sub, and is designed specifically to securely connect, manage, and ingest data from globally dispersed devices. IoT Core also enables you to re-configure those devices on-the-fly.

Server-side data analysis

Once the data is uploaded to your cloud, it’s ready for analysis to determine what changes — if any — need to be made to the existing garden-care routine. There are a number of options for building your analysis engine; here we’ll use App Engine Flex (Managed App Platform).

Why App Engine? App Engine is a platform-as-a-service, so we can focus on the code rather than infrastructure, and let Google handle scaling and reliability. We’re using Flex instead of Standard to support our use of the Google Cloud Tasks API. As a new project, there are no legacy or hybrid infrastructure requirements that might suggest using containers. Also, we both work at Google, so we know someone who can hook us up with a bunch of credits.

To initiate the analysis process, you can either ping an App Engine front end from the hub once the sensor / image uploads are complete, or subscribe directly to the Pub/Sub streams for incoming sensor data or changes to a GCS bucket.

The Google Cloud Tasks API is then used to initiate an App Engine Flex instance to perform the analysis.

Why not use a Google Cloud Function? A possible alternative to the Cloud Tasks API is triggering a Cloud Function based on either the GCS bucket change or the Pub/Sub stream. However, the max duration for a Cloud Function is 540 seconds, which may be insufficient for performing image analysis; the Cloud Tasks API supports tasks of up to 24 hours.
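Enqueuing an analysis task might be sketched as follows. The project, location, queue name, and the `/analyze` handler path are all assumptions; the `create_task` call requires the google-cloud-tasks client library and credentials:

```python
import json

# Hedged sketch of kicking off a long-running analysis task targeting
# an App Engine handler. Queue name and "/analyze" path are assumed.

def build_task_payload(garden_id: str, image_uris: list) -> bytes:
    return json.dumps({"garden_id": garden_id, "images": image_uris}).encode("utf-8")

def enqueue_analysis(garden_id: str, image_uris: list) -> None:
    # Requires google-cloud-tasks and application credentials.
    from google.cloud import tasks_v2
    client = tasks_v2.CloudTasksClient()
    parent = client.queue_path("my-garden-project", "us-central1", "analysis")
    client.create_task(parent=parent, task={
        "app_engine_http_request": {
            "http_method": tasks_v2.HttpMethod.POST,
            "relative_uri": "/analyze",
            "body": build_task_payload(garden_id, image_uris),
        }
    })

print(build_task_payload("garden-1", ["gs://garden-images/plant-7.jpg"]))
```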

Image analysis is used as a signal for determining plant health, so we need to train TensorFlow models to recognize everything from plant decay to bugs, fruit, and leaf size.

To simplify our use of TensorFlow, we can use the Google Cloud Machine Learning Engine, a managed, serverless solution for training TensorFlow models and serving them for prediction.
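However the model is hosted, its raw prediction scores still need to be turned into actionable health signals. A simple thresholding step might look like this — the labels and threshold value are illustrative assumptions, not the actual model’s outputs:

```python
# Illustrative post-processing of image-model prediction scores.
# Labels and the 0.6 confidence threshold are assumptions.

HEALTH_THRESHOLD = 0.6

def health_signals(scores: dict) -> list:
    """Return the labels the model is reasonably confident about,
    sorted for stable downstream processing."""
    return sorted(label for label, p in scores.items() if p >= HEALTH_THRESHOLD)

flags = health_signals({"leaf_decay": 0.82, "bugs": 0.15, "ripe_fruit": 0.7})
print(flags)   # → ['leaf_decay', 'ripe_fruit']
```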

To devise a care routine for each plant, we start with a Datastore (Distributed Hierarchical Key-value Storage) database that contains the baseline care instructions for a variety of plant species.

Why Datastore? There’s a large variety of database options we could use here, but the nature of the plant-care database — largely static, potentially very large, and without any relational constraints — is ideal for a NoSQL database, particularly one that is well supported by App Engine.

The baseline instructions are modified based on a combination of the image analysis results, plant sensor readings (soil moisture / pH, etc.), and environmental signals including local weather forecasts, seasons, and day / night cycles.
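As a concrete (and deliberately simplified) illustration of that modification step, the function below scales a baseline watering volume using soil moisture and the local rain forecast. The thresholds and scaling factors are made-up assumptions, not values from either solution:

```python
# Hypothetical sketch of adjusting a baseline care routine from
# sensor and weather signals. Thresholds and factors are illustrative.

def adjust_watering(baseline_ml: int, soil_moisture: float,
                    rain_forecast_mm: float) -> int:
    """Scale the baseline water volume down when the soil is already
    wet or rain is forecast; never schedule a negative amount."""
    amount = float(baseline_ml)
    if soil_moisture > 0.5:            # soil already moist: halve the dose
        amount *= 0.5
    amount -= rain_forecast_mm * 10    # let forecast rain do some of the work
    return max(0, round(amount))

print(adjust_watering(250, 0.6, 5.0))   # moist soil + rain forecast → 75
```

A real implementation would blend many more signals (pH, season, day / night cycle, image-analysis flags), but the shape is the same: baseline in, adjusted instruction out.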

Updating the garden care routine

Having completed your analysis and determined any required updates to your gardening routine, you need to update your Garden Hubs.

If you’re using the Firebase Realtime Database, you can record those changes server-side and let Firebase synchronize them to the hub. If not, you’ll need to implement your own sync mechanism to transfer and apply the updates.

All changes made to the plant care routines, and the health history of each plant, are sent to BigQuery (Managed Data Warehouse / Analytics), which acts as a pool of training data for TensorFlow — used to learn how to better care for each plant based on prior results.

Your BigQuery dataset is also connected to Data Studio (Visualize and Explore Data), which can be used to create real-time visualizations indicating overall garden health status and trends.

User website and app

At this stage your garden is completely autonomous; the client / garden owner has neither visibility into the system settings, nor the ability to manually modify its behavior.

There are a number of options for a client-side implementation, depending on the level of information and control you wish to make available, and the platforms you wish to support.

It’s likely you’ll want at least a web client, and native Android apps that allow users to observe the current plant care settings for their garden, and manually alter the schedules or trigger things like watering or fertilizing on demand.

Why Android? You’re going to build an iOS mobile app as well as Android, and the Firebase Realtime Database and Cloud Messaging both feature an iOS SDK — so when we say “Android” we really mean “Android and iOS”. [Ed: Don’t tell anyone we said this.]

If you used the Firebase Realtime Database to store the plant care settings, the web, Android, and iOS clients are automatically updated when those settings change. Similarly, providing functionality within those clients to modify and save settings to Firebase will see them updated on the server and the Garden Hub.

If you’re building native mobile apps, you can utilize Firebase Cloud Messaging to send real-time notifications for time-sensitive gardening alerts (such as detecting a repeated failure in watering, or a mechanical issue with the CNC arms.)

You may also choose to add peer-to-peer control to your mobile client, providing a way for users to modify the settings on the hub even if you have no Internet connectivity. Using Bluetooth you can connect to the local hub and modify the care instructions, turn on specific sprinklers, or otherwise control the garden devices.

Further features depend on your needs. You can use more TensorFlow models to recognize plants by taking a photograph, add support for manual tasks such as pruning or harvesting, or add voice control through Google Assistant Actions.