Tuesday, February 4, 2025

My experience with Vivid Machines in 2024


During the 2024 growing season I worked with Vivid Machines (https://www.vivid-machines.com/) using their XV3 camera and cloud platform to track apple crop load on a tree-level basis in two Massachusetts orchards, across eight different orchard blocks including five varieties. What follows is my summary of the process of scanning the orchard blocks with the XV3 camera, transmitting the data to the cloud, and how the results are displayed and interpreted. There was no particular objective here other than to casually evaluate and demonstrate the Vivid Machines platform during the 2024 growing season.

First, the hardware and set-up. We received the XV3 camera and associated mounting hardware from Vivid Machines (VM from here on). The camera has just a power button and is a robust, rugged-looking, somewhat heavy piece of equipment. In other words, it is built. We attached the camera to our John Deere Gator – it could be any ATV, tractor, etc. – using a rather floppy (but built) four-legged mount secured to the hood of the Gator with large suction cups. Getting the mount secured to the Gator and the camera attached and leveled on the mount was sometimes a bit of a coordination issue, but it generally worked OK. If you have watched Tales from the Loop, which I highly recommend – https://en.wikipedia.org/wiki/Tales_from_the_Loop – the whole rig kind of ended up looking like one of the robots therein. One does have to watch the suction cups, however, to make sure they do not loosen up while scanning orchard rows. The XV3 camera can also be mounted more securely to a single vehicle using VM's rigid mount, which I would recommend if it is only going to be used in one orchard and on one vehicle. Next year, maybe.

Once the XV3 camera is mounted securely, pushing a button powers it up. The XV3 camera runs on an internal battery that can either be fully charged with an AC adapter before use, or charged while operating using a 12-volt DC cigarette-lighter-style adapter. The camera has high-accuracy GPS (RTK?) and wi-fi built in, so it often took a minute or two to lock onto the GPS signal and start broadcasting wi-fi, which is then used to connect to the Vivid Camera Control app on a smartphone or tablet (iOS or Android). Camera Control is then used to control the XV3 camera while scanning orchard rows.

One more thing before you start scanning: the outline of each orchard block needs to be mapped, which is done in the Camera Control app by driving around the perimeter of the block and marking the corners so the platform knows where the block is, how big it is, etc. Seems to me there is some other information – tree spacing, variety, etc. – that needs to be entered into the VM Dashboard (more on the Dashboard later) while doing this, but that is a minor detail.

OK, once all of the above is in hand, you can start scanning with the XV3 camera. In the Camera Control app, select the block you are scanning, push start scan (in the app), and start driving down any or all of the rows you want to scan. Maximum speed is 5 mph, and slower is better; we tried to do about 2-3 mph for our demonstration scans. Driving and scanning as we did is ideally a two-person job, but it could be done by one operator once you get used to it. There are some nuances – camera positioning, etc. – but overall it was relatively easy. Oh wait, one more thing: it’s called Calibration, which requires you to stop scanning and measure some fruits (10 or so?) on a minimum of four trees (we typically did five). This is where having two people helps, although VM supplied a Bluetooth caliper so fruit measurements could go directly into the app. It worked, most of the time. It all goes into the Camera Control app, but it is a process that has to be followed for the most accurate result. It seemed a bit onerous at first, but once you get the hang of it, it typically took 10-15 minutes or so to scan 5 rows per block – on one side only, which, it seems to me, was the recommended procedure.

Once scans are complete – and you can do as many blocks as you want, each called a “session” – the camera is detached from the mount, brought inside, attached to an internet-connected router (cabling and power supplied), and powered up, and the data (scan images, etc.) are uploaded to the VM cloud-based Dashboard, which is built on Tableau (https://www.tableau.com/). It’s pretty much automatic, but it can take a couple of hours to upload a lot of image data, so it’s best to plug it in and let it run overnight. VM would also occasionally push software updates to the camera.

Once fully uploaded, the data is available (the next day, typically) and displayed in the Dashboard. TABS – Home, Map, Scan Summary, Fruitlet Thinning, and Yield Potential – serve as the main navigation aid. I will briefly attempt to describe what is displayed in each tab, but you need to know that within the tabs many choices – Variety, Farm Block, Date, Stage, etc. – can be filtered down to only the data you want to see. VM did supply some training and follow-up support in using the Dashboard, as well as with any problems or questions that arose when out scanning in the orchard. But I have to say it is not exactly for the timid or the techno-phobe. A hint to VM here might be to simplify the user interface a bit?

Here are the TABS within the VM Dashboard:

  • HOME – a Farm Map plus Global Statistics (all varieties, all scan dates, etc., but they can be filtered down to the ones you want to see). Figure 1: VM-Home.
  • MAP – produces a map, down to the tree level, of the scanned trees, including counted fruit and fruit size, along with a Histogram of Fruit Avg. Predicted Count. Figure 2: VM-Map.
  • SCAN SUMMARY – gives you Avg. Predicted Count (per tree) and Avg. Predicted Size, among various other scan statistics like Number of Trees scanned and Distance travelled. You can also Show Full Tree Data and Download it, which is basically the individual-tree raw scan data. It’s a lot to digest, but it’s all there, just waiting for you to dig into.
  • FRUITLET THINNING – gives you a predicted (beta) Fruit Set Estimate expressed as Avg. Fruit Likely to Stay on Tree. Three successive scan dates, starting once the fruitlets are in the 10 mm size range, are needed to get this Estimate. If it can be verified, which is a work in progress, this could be very useful by greatly increasing sample size when trying to use the fruitlet growth rate model to predict fruit set and help make chemical thinning decisions.
  • YIELD POTENTIAL – shows a Projected Fruit Size line chart (from the first scan through the last) as well as a Fruit Size Distribution bar chart, plus an Estimated Bin Count.
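For readers curious about the fruitlet growth rate model that the Fruit Set Estimate leans on, here is a minimal sketch of its core logic in Python. This is my own illustration, not VM's implementation; the function name, the example diameters, and the 50% threshold (the commonly cited rule of thumb) are assumptions for illustration only.

```python
# Minimal sketch of fruitlet growth rate model logic (illustrative,
# NOT VM's implementation). Rule of thumb: fruitlets growing slower
# than ~50% of the fastest-growing fruitlets are predicted to abscise.
def predicted_fruit_set(diam_first, diam_second, threshold=0.5):
    """Fraction of measured fruitlets predicted to persist.

    diam_first/diam_second: diameters (mm) of the same fruitlets, in
    the same order, measured on two dates a few days apart.
    """
    growth = [b - a for a, b in zip(diam_first, diam_second)]
    cutoff = threshold * max(growth)
    persisting = sum(1 for g in growth if g >= cutoff)
    return persisting / len(growth)

# Four hypothetical fruitlets measured on two dates:
# growth = [4.0, 3.0, 1.0, ~0.3] mm; cutoff = 2.0 mm; 2 of 4 persist
print(predicted_fruit_set([6.0, 6.1, 6.0, 6.2], [10.0, 9.1, 7.0, 6.5]))  # → 0.5
```

The appeal of a platform like VM here is sample size: instead of calipering 10 fruits on 5 trees, a scan could in principle feed thousands of fruitlet measurements into a calculation like this.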

Now, one can do a lot of data mining in any or all of the above, given how much tree-level data can be manipulated and displayed. Frankly, it’s a lot of information. As I mentioned, I’ve given VM feedback that perhaps they could make it simpler? What to do, what to do? But here is where I currently see “actionable” potential:

  • Blossom scanning – clusters, not individual flowers – starting at pink. Could be useful for precision pruning and variable-rate spraying of caustic bloom thinners.
  • Fruitlet Thinning, using the fruitlet growth rate model (in some way, shape, or form), if it can be verified with further ground truthing, would be very useful for decision-making when it comes to chemical thinning. I doubt it’s ready for prime time. Yet. The plan is to double down on it in 2025.
  • Season-long scans to predict block-level Yield Potential and Fruit Size Distribution are handy for growers trying to manage harvest labor, bin needs, and sales desks. They could also be used to direct hand-thinning needs in June. This is currently the most useful output for vertically integrated wholesale operations.

Although I mentioned we had no particular objective here other than to evaluate the VM platform, in the back of my mind I wanted to compare VM vs. Outfield maps. (I already reported, similarly to this, on Outfield here: https://jmcextman.blogspot.com/2024/11/fun-with-outfield-2024.html) I have not fully followed through with that – it might be the subject of another post – but I will say it’s kind of like comparing apples and oranges. Also, I need to compare the VM Fruitlet Thinning Fruit Set Estimate with some ground truth data we took using the Malusim and/or FruitGrowth apps. Finally, I did have some help from Liam Oulette, a CAFE summer scholar. I charged Liam with doing some VM ground truthing for his project by comparing tree-level VM fruit counts to actual tree counts by Liam (and me). As a side note, when two (or more) people count apples on the same tree, they never match (exactly). Sometimes they are quite different, particularly when there are lots of apples on the tree(s). That being said, VM counts and hand counts by Liam (and me) were not exactly the same. Should we be surprised? Probably not. Just as hand counts by two people differ, we saw that where there were lots of apples on a tree, there was more difference between the VM count and the hand count, but where there were fewer (and larger) apples, the VM count and hand count were pretty close. See Figure 3 for an example result. Although VM kind of suggested there might not be a rationale for exactly comparing hand counts to their VM counts? Huh? Below (Figure 4) I have also included some tree pics showing what VM was seeing, so maybe you can see why they – hand counts vs. VM counts – did not agree exactly? Maybe.
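As an aside, the kind of per-tree comparison Liam and I were doing boils down to a couple of simple statistics. The numbers below are made up for illustration (not our actual data), but they mimic the pattern we saw: bigger absolute and percent differences on heavily cropped trees.

```python
# Hand count vs. VM count agreement, per tree (illustrative numbers,
# not our actual data).
def count_agreement(hand, vm):
    """Mean absolute difference and mean percent difference."""
    diffs = [abs(h - v) for h, v in zip(hand, vm)]
    pcts = [d / h * 100 for d, h in zip(diffs, hand)]
    return sum(diffs) / len(diffs), sum(pcts) / len(pcts)

hand_counts = [220, 180, 95, 60, 45]  # heavier to lighter crop load
vm_counts = [190, 162, 90, 58, 44]
mean_diff, mean_pct = count_agreement(hand_counts, vm_counts)
print(round(mean_diff, 1), round(mean_pct, 1))
```

Nothing fancy, but dumping the Full Tree Data from the Scan Summary tab into a script like this would make the VM-vs.-hand-count question easy to quantify across a whole block.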

OK, that is a lot. In practice, it’s not all that complicated, and the potential is HUGE to use VM to better manage apple crop load. I think their technology is going to be evolutionary, and they are at the beginning. Consider yourself an early adopter if you choose to engage with them; their technology is probably best suited to larger, wholesale apple growers. I am thinking this is kind of like when the original iPhone came out and there was no App Store – you could only use what Steve Jobs chose to pre-install. That is where VM may be now, but you can see what might be coming?

Monday, January 27, 2025

Orchard Watch update...

Orchard Watch – aka “How is the Orchard Feeling Today?” – is our nickname for a set of weather “stations” at the UMass Orchard in Belchertown, MA. Orchard Watch consists of:

  • Two Onset Computer Corporation (https://www.onsetcomp.com/) RX 3000 “mother” stations that collect all the mote (see next) weather data as well as their own: air temperature, humidity, rainfall, wetness, solar radiation, wind speed and direction, and soil temperature and moisture content.
  • Seven “motes” that each measure the same set of variables as the mother stations.

For more, Orchard Watch has a dedicated website: https://orchardwatch.wordpress.com/

Some interesting facts, in addition to the above, about Orchard Watch:

  • Data from the motes is transmitted to the two mother stations by a proprietary wireless network called HOBOnet: https://www.onsetcomp.com/resources/product-overview-videos/hobonet-field-monitoring-system
  • The HOBOnet “field monitoring system” at the UMass Orchard is composed of nine locations divided into “North” and “South” mother stations, North having 3 motes and South having 4 motes (in addition to an in-canopy wetness sensor). Figure 1. The motes are separated (at the maximum) by 3,000 feet horizontally and 180 feet vertically. Thus, nine micro-climate sites.
  • Mother stations and motes log environmental (mostly “weather”) data every 5 minutes from every sensor – air temperature and humidity, wind speed/direction, rainfall, solar radiation, leaf wetness, soil temperature and moisture – and transmit the data (every 15 minutes) via cellular to an Onset website called Hobolink. On Hobolink, current conditions are displayed and charted, and the data is archived for later retrieval. Currently, this costs $700 a year ($350 for each mother station). Thanks to the Massachusetts Fruit Growers’ Association for paying the bill in 2024!
  • How much data is there? The math: there are 104 sensors, each logging a data point (air temperature, for example) every five minutes – that’s 104, times 12 (per hour), times 24 (per day), equaling nearly 30,000 data points per day! Think 30,000 spreadsheet cells per day! Times 365 days per year equals nearly 11 million data points (spreadsheet cells). That’s a lot!
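That back-of-the-envelope math is easy to sanity-check:

```python
# Sanity check on the Orchard Watch data volume arithmetic
sensors = 104
logs_per_hour = 60 // 5              # one data point every 5 minutes
points_per_day = sensors * logs_per_hour * 24
points_per_year = points_per_day * 365
print(points_per_day)   # → 29952 (nearly 30,000 per day)
print(points_per_year)  # → 10932480 (nearly 11 million per year)
```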

The original intent of Orchard Watch was to monitor microclimate, to get a feel for whether it makes any difference when used in orchard pest management models – for example, apple scab – that predict pest severity/incidence. In addition, there was some thought about a public-facing website, “How is the Orchard Feeling Today?” Neither has particularly come to realization. There is, however, a lot of data mining that could be done. I have pulled together three minimal examples for consideration.

Temperature

How much did the temperature differ within Orchard Watch? Turns out not that much (with one caveat). I pulled the temperature data for May 2024, charted in Figure 2. Consider, however, that I have found that on a clear, cool (cold?) night the temperature does indeed differ significantly. For example, most recently, on January 22, 2025, the low temperature recorded differed by 7 degrees F. (zero degrees at the higher elevation to -7 degrees F. at the lower elevation). Figure 3. And it’s not the first time that has happened: https://orchardwatch.wordpress.com/2020/10/18/cold-air-sinks-confirmed/ This is why – all other things being equal – orchards should be planted on higher ground vs. lower to manage frost/freeze risk. Thus you see orchards on hills (most of the time).

Wetness (leaf presumably)

This is interesting. I buried a wetness sensor in a tree canopy so I could compare it to one of the wetness sensors out in the “open.” (Did I mention that the mothers and motes are all out in open areas, not in the orchard blocks themselves?) The hypothesis was that the wetness sensor in the canopy would be “wetter.” Wrong. Figure 4. My guess is the canopy shields the wetness sensor from dew (or light rainfall), so it records less wetness vs. sensors out in the open that collect dew?

Apple scab

I did a quick comparison here of scab infection periods between the mother weather stations on NEWA – the motes are not on NEWA, so I could not run the scab infection period model with those – and also included another Onset station and a Rainwise station here at the UMass Orchard that are not part of Orchard Watch, all for the month of May 2024. Figure 5. Interestingly, the stations generally line up on the infection periods that would trigger fungicide sprays, with the exceptions of May 1 and May 23-24. No idea why – probably a result of some stations recording more wetting events/precipitation. I am not convinced that, overall, microclimate is going to make that big a difference in these models; however, it certainly could use further investigation?
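To show conceptually what such a model is doing, here is a toy version in Python. This is NOT the NEWA/Mills model – the wetness-hour thresholds below are invented for illustration – but it shows why two stations a few thousand feet apart can disagree: a slightly shorter wetting event at one station can fall just under a threshold.

```python
# Toy infection-period check (NOT the actual NEWA/Mills scab model;
# thresholds are invented for illustration). Real models relate leaf
# wetness duration and temperature to infection risk.
def infection_period(avg_temp_f, wet_hours):
    """Flag a wetting event as a potential scab infection period."""
    if avg_temp_f < 45:
        required = 20.0   # colder: more wetness hours needed
    elif avg_temp_f < 61:
        required = 12.0
    else:
        required = 9.0
    return wet_hours >= required

# Same rain event, two nearby stations, slightly different wetness:
print(infection_period(55.0, 11.5))  # → False (just under threshold)
print(infection_period(55.0, 12.5))  # → True
```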

So, I got a lot of data, it is out there, free for the taking, I just need to find someone with some big data mining/analysis skills to make any of it actionable. Any takers out there? Or does anyone really care?

Figure 1 - Overview of OrchardWatch 'Hobonet' wireless network
at the UMass Orchard, Belchertown, MA

Figure 2 - Average, maximum, and minimum monthly temperature (May 2024)
at nine OrchardWatch locations

Figure 3 - Minimum air temperature (degrees F.) on 22-January, 2025
at nine Orchard Watch locations

Figure 4 - Wetness (leaf) outside vs. inside canopy, May 2024 (week 4)

Figure 5 - Apple scab infection periods (as predicted by NEWA) for May 2024 at two Orchard Watch locations (OW-N and OW-S) and two other on-site weather stations (Belchertown and Belchertown-2)



Wednesday, January 22, 2025

It's cold out there...

But not too cold? How cold is too cold? It depends😀

Arctic high pressure invaded the country this week. Dangerous wind chills from the Midwest to the East coast, and a record snowstorm in New Orleans, LA, where they got 10 inches of snow. (That's more than we have had here all season.) Snow from Houston to the Florida panhandle. Meanwhile, here in New York and New England it was just plain cold. Nowhere near record, but flirting with -10 degrees F., where I might expect to see some damage to tender fruit (peach) buds. How cold did it get across Massachusetts NEWA weather stations, which are mostly on-farm? -17 to 16 degrees F., the former (Richmond) being in the Berkshires, the latter (Hyannis) down in the Cape Cod banana belt. See what I have to deal with?

Now, microclimate within a site, I know for a fact, had something to do with these temperature variations. For example, I have a set (too many – OrchardWatch) of weather stations at the UMass Orchard in Belchertown, MA, and the low temperature last night (this morning, January 22, 2025) there varied from zero to -7 degrees, high to low elevation, with about a 200-foot vertical height difference (and maybe a couple thousand feet horizontal). What do they say about real estate? Location, location, location?

It has been cold up to this point with no significant warm-up, so in most, if not all locations, we will have a peach crop. But check back with me in May...