During the 2024 growing season I worked with Vivid Machines (https://www.vivid-machines.com/), using their XV3 camera and cloud platform to track apple crop load on a tree-level basis across eight orchard blocks and five varieties in two Massachusetts orchards. What follows is my summary of the process of scanning the orchard blocks with the XV3 camera, transmitting the data to the cloud, and how the results are displayed and interpreted. There was no particular objective here other than to casually evaluate and demonstrate the Vivid Machines platform during the 2024 growing season.
First, the hardware and set-up. We received the XV3 camera and associated mounting hardware from Vivid Machines (VM from here on). The camera has just a power button and is a rather robust, rugged-looking, somewhat heavy piece of equipment. In other words, it is built. The camera was attached to our John Deere Gator – it could be any ATV, tractor, etc. – using a rather floppy (but built) four-legged mount that attached to the hood of the Gator with large suction cups. Getting the mount secured to the Gator and the camera attached and leveled on the mount was sometimes a bit of a coordination issue, but it generally worked OK. If you have watched Tales from the Loop, which I highly recommend – https://en.wikipedia.org/wiki/Tales_from_the_Loop – the rig kind of ended up looking like one of the robots therein. One has to watch the suction cups, however, to make sure they do not loosen up while scanning orchard rows. The XV3 camera can also be mounted more securely to a single vehicle using VM's rigid mount, which I would recommend if it will only be used in one orchard and mounted on one vehicle. Next year, maybe.
Once the XV3 camera is mounted securely, pushing a button powers it up. The camera runs on an internal battery that can either be fully charged with an AC adapter before use or charged with a 12-volt DC cigarette-lighter adapter while operating. It has high-accuracy GPS (RTK?) and wi-fi built in, so it often took a minute or two to lock onto the GPS signal and start broadcasting wi-fi, which is then used to connect to the Vivid Camera Control app on a smartphone or tablet (iOS or Android). Camera Control is then used to control the XV3 camera while scanning orchard rows.
One more thing before scanning: the outline of each orchard block needs to be mapped, which is done in the Camera Control app by driving around the perimeter of the block and marking the corners so the platform knows where the block is, how big it is, etc. Seems to me there is some other information – tree spacing, variety, etc. – that needs to be entered into the VM Dashboard (more on the Dashboard later) while doing this, but that is a minor detail.
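An aside for the curious: I have no idea what VM actually does with those corner points under the hood, but just to illustrate the kind of thing a mapped perimeter gives you, here is a rough back-of-the-envelope sketch – made-up coordinates and my own arithmetic, not VM's – of turning block corners into an approximate acreage:

```python
import math

# Hypothetical block corners mapped by driving the perimeter (lat, lon in degrees).
# These coordinates are made up for illustration only.
corners = [
    (42.3920, -72.3910),
    (42.3920, -72.3895),
    (42.3910, -72.3895),
    (42.3910, -72.3910),
]

def block_area_acres(latlon):
    """Approximate enclosed area: project to local meters, then apply the shoelace formula."""
    lat0 = sum(p[0] for p in latlon) / len(latlon)
    lon0 = sum(p[1] for p in latlon) / len(latlon)
    m_per_deg_lat = 111_320.0                                   # rough meters per degree latitude
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(lat0))    # shrinks with latitude
    xy = [((lon - lon0) * m_per_deg_lon, (lat - lat0) * m_per_deg_lat) for lat, lon in latlon]
    area_m2 = 0.0
    for (x1, y1), (x2, y2) in zip(xy, xy[1:] + xy[:1]):
        area_m2 += x1 * y2 - x2 * y1
    return abs(area_m2) / 2.0 / 4046.86                         # square meters -> acres

print(f"mapped block is roughly {block_area_acres(corners):.1f} acres")
```

Again, that is just the principle; the platform does all of this for you once the corners are marked.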
OK, once all of the above is in hand, you can start scanning with the XV3 camera. In the Camera Control app, select the block you are scanning, push Start Scan (in the app), and start driving down any or all of the rows you want to scan. Maximum speed is 5 mph, and slower is better; we tried to do about 2-3 mph for our demonstration scans. Driving and scanning as we did is ideally a two-person job, but could be done by one operator once you get used to it. There are some nuances – camera positioning, etc. – but overall it was relatively easy. Oh wait, one more thing: it's called Calibration, which requires you to stop scanning and measure some fruits (10 or so?) on a minimum of four trees (we typically did five). This is where having two people helps, although VM supplied a Bluetooth caliper so fruit measurements could go directly into the app. It worked, most of the time. It all goes into the Camera Control app, but it is a process that has to be followed for the most accurate result. It seemed a bit onerous at first, but once you get the hang of it, it typically took 10-15 minutes or so to scan five rows per block – on one side only, which, as I recall, was the recommended procedure.
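I do not know what the XV3 actually does with those caliper measurements internally, but the general idea behind a calibration step like this, I assume, is to fit a correction between what the camera thinks the fruit diameters are and what the caliper says they are. A minimal sketch of that idea, with made-up numbers (my illustration of the principle, not VM's method):

```python
import numpy as np

# Hypothetical calibration idea (NOT VM's actual method): fit a linear correction from
# camera-estimated fruit diameters to caliper-measured diameters on the calibration trees.
camera_mm  = np.array([8.9, 9.4, 10.1, 10.8, 11.5, 12.0, 12.6, 13.1, 13.9, 14.4])   # made up
caliper_mm = np.array([9.5, 9.9, 10.8, 11.4, 12.3, 12.7, 13.4, 13.8, 14.8, 15.2])   # made up

slope, intercept = np.polyfit(camera_mm, caliper_mm, 1)    # least-squares straight line
corrected = slope * camera_mm + intercept

print(f"correction: caliper diameter is roughly {slope:.2f} x camera + {intercept:.2f} mm")
print(f"mean absolute error after correction: {np.mean(np.abs(corrected - caliper_mm)):.2f} mm")
```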
Once scans are complete – and you can do as many blocks as you want, each called a "session" – the camera is detached from the mount, brought inside, attached to an internet-connected router (cabling and power supplied), and powered up, and the data (scan images, etc.) are uploaded to the VM cloud-based Dashboard, which is built on Tableau (https://www.tableau.com/). It's pretty much automatic, but it can take a couple of hours to upload a lot of image data, so it's best to plug it in and do it overnight. VM would also occasionally push software updates to the camera.
Once fully uploaded, the data is available (typically the next day) and displayed in the Dashboard. TABS – Home, Map, Scan Summary, Fruitlet Thinning, and Yield Potential – serve as the main navigation aid. I will briefly attempt to describe what is displayed in each tab, but you need to know that within the tabs many choices – Variety, Farm Block, Date, Stage, etc. – can be filtered so you see only the data you want. VM did supply some training and follow-up support in using the Dashboard, as well as with any problems or questions that arose when out scanning in the orchard. But I have to say it is not exactly for the timid or the techno-phobe. A hint to VM here might be to simplify the user interface a bit?
Here are the TABS within the VM Dashboard:
- HOME – displays a Farm Map and Global Statistics (all varieties, all scan dates, etc., but these can be filtered down to just the ones you want to see). Figure 1: VM-Home.
- MAP – produces a map, down to the tree level, of the scanned trees, including counted fruit and fruit size, along with a Histogram of Fruit Avg. Predicted Count. Figure 2: VM-Map.
- SCAN SUMMARY – gives you Avg. Predicted Count (per tree) and Avg. Predicted Size, among various other scan statistics like Number of Trees scanned and Distance travelled. You can also Show Full Tree Data and download it, which is basically the individual-tree raw scan data. It's a lot to digest, but it's all there, just waiting for you to dig into.
- FRUITLET THINNING – gives you a predicted (beta) Fruit Set Estimate expressed as Avg. Fruit Likely to Stay on Tree. Three successive scan dates, once the fruitlets are in the 10 mm size range, are needed to get this estimate. If it can be verified, which is a work in progress, this could be very useful by greatly increasing sample size when trying to use the fruitlet growth rate model to predict fruit set and help make chemical thinning decisions.
- YIELD POTENTIAL – shows a Projected Fruit Size line chart (beginning with the first scan and running through the last scan) as well as a Fruit Size Distribution bar chart, plus an Estimated Bin Count (see the rough arithmetic sketch after this list for what goes into a number like that).
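A note on that Estimated Bin Count: VM computes it however VM computes it, but the back-of-the-envelope version, just so you can see what goes into a number like that, looks something like this (every input below is made up, and the sphere-based fruit weight is my own rough approximation, not VM's formula):

```python
import math

# Back-of-the-envelope bin count (my own arithmetic, not VM's formula; all inputs made up).
block_acres           = 3.4
trees_per_acre        = 900     # e.g., a tall spindle planting
avg_fruit_per_tree    = 85      # from the scan's Avg. Predicted Count
avg_fruit_diameter_mm = 78      # from Projected Fruit Size near harvest
bin_capacity_lb       = 850     # a typical wooden apple bin, roughly 18-20 bushels

# Very rough fruit weight from diameter: treat the apple as a sphere, density ~0.8 g/cm^3.
radius_cm = avg_fruit_diameter_mm / 20
avg_fruit_wt_lb = (4 / 3) * math.pi * radius_cm**3 * 0.8 / 453.6

total_fruit = block_acres * trees_per_acre * avg_fruit_per_tree
total_lb = total_fruit * avg_fruit_wt_lb
print(f"average fruit weight is roughly {avg_fruit_wt_lb:.2f} lb")
print(f"estimated crop is roughly {total_lb:,.0f} lb, or about {total_lb / bin_capacity_lb:.0f} bins")
```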
Now, one can do a lot of data mining in any or all of the above in terms of what tree-level data are displayed and how they are manipulated. Frankly, it's a lot of information. As I mentioned, I've given feedback to VM that perhaps they could make it simpler? What to do, what to do? But here is where I currently see "actionable" potential:
- Blossom scanning (clusters, not individual flowers), starting at pink. Could be useful for precision pruning and variable-rate spraying of caustic bloom thinners.
- Fruitlet Thinning, using the fruitlet growth rate model (in some way, shape, or form), would, if it can be verified with further ground truthing, be very useful for decision-making when it comes to chemical thinning. I doubt it's ready for prime time yet. The plan is to double down on it in 2025. (A bare-bones sketch of the growth-rate logic follows this list.)
- Season-long scans to predict block-level Yield Potential and Fruit Size Distribution are handy for growers trying to manage harvest labor, bin needs, and sales desks. They could also be used to direct hand-thinning needs in June. This is currently the most useful output for vertically integrated wholesale operations.
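For anyone not familiar with the fruitlet growth rate model mentioned above, here is a bare-bones version of the logic. This is my own simplification, not VM's implementation; the usual rule of thumb is that fruitlets growing at less than about half the rate of the fastest-growing fruitlets between two measurement dates are predicted to drop:

```python
# Bare-bones fruitlet growth rate logic (my simplification, not VM's implementation).
# Rule of thumb: fruitlets growing at less than ~50% of the rate of the fastest-growing
# fruitlets between two measurement dates are predicted to abscise (drop).
diam_day0 = [6.1, 6.4, 5.8, 6.9, 6.0, 6.5, 5.9, 7.1]   # mm, a made-up sample of tagged fruitlets
diam_day3 = [7.8, 6.9, 6.0, 9.0, 6.3, 8.2, 6.1, 9.3]   # the same fruitlets 3-4 days later

rates = [later - earlier for earlier, later in zip(diam_day0, diam_day3)]

# Benchmark growth: the mean rate of the fastest-growing fruitlets (here, the top quarter).
fastest = sorted(rates, reverse=True)[: max(1, len(rates) // 4)]
benchmark = sum(fastest) / len(fastest)

persisting = sum(1 for r in rates if r >= 0.5 * benchmark)
print(f"predicted fruit set: {persisting}/{len(rates)} fruitlets "
      f"({100 * persisting / len(rates):.0f}%) likely to stay on the tree")
```

The appeal of having the camera do this is sample size: measuring a few dozen tagged fruitlets by hand is tedious, while a scan covers whole rows.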
Although I mentioned we had no particular objective here other than to evaluate the VM platform, in the back of my mind I wanted to compare VM vs. Outfield maps. (I already reported on Outfield, in a similar fashion, here: https://jmcextman.blogspot.com/2024/11/fun-with-outfield-2024.html.) I have not fully followed through with that; it might be the subject of another post. But I will say it's kind of like comparing apples and oranges. Also, I need to compare the VM Fruitlet Thinning Fruit Set Estimate with some ground-truth data we took using the Malusim and/or FruitGrowth apps.

Finally, I did have some help from Liam Oulette, who was a CAFE summer scholar. I charged Liam, as part of his project, with doing some VM ground truthing by comparing tree-level VM fruit counts to actual tree counts by Liam (and me). As a side note, when two (or more) people count apples on the same tree, they never match (exactly). Sometimes they are quite different, particularly when there are lots of apples on the tree(s). That being said, VM counts and hand counts by Liam (and me) were not exactly the same. Should we be surprised? Probably not. As with hand counts where two people differ, we saw that where there were lots of apples on a tree there was more difference between the VM count and the hand count, but where there were fewer (and larger) apples the VM count and hand count were pretty close. See Figure 3 for an example result. Although VM kind of suggested there might not be a rationale for exactly comparing hand counts to their VM counts? Huh? Below (Figure 4) I have also included some tree pics showing what VM was seeing, so maybe you can see why they – hand counts vs. VM counts – did not agree exactly. Maybe.
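For anyone who wants to do the same kind of ground truthing, the comparison itself is simple enough. Here is the sort of summary Liam and I were looking at, with made-up numbers standing in for our actual tree data:

```python
# Comparing machine counts to hand counts, tree by tree (numbers made up, not our actual data).
hand_counts    = [142, 88, 61, 173, 95, 54, 120, 77]    # apples counted by a person
machine_counts = [118, 84, 59, 139, 90, 57, 104, 74]    # VM predicted count for the same trees

diffs = [m - h for m, h in zip(machine_counts, hand_counts)]
mae  = sum(abs(d) for d in diffs) / len(diffs)
mape = 100 * sum(abs(d) / h for d, h in zip(diffs, hand_counts)) / len(hand_counts)

print(f"mean absolute error:   {mae:.1f} apples per tree")
print(f"mean absolute % error: {mape:.1f}%")
# The pattern we saw: the heavier the crop on a tree, the bigger the machine-vs-hand gap.
```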
OK, that is a lot. In practice, it's not all that complicated, and the potential is HUGE to use VM to better manage apple crop load. I think their technology is going to be evolutionary, and they are at the beginning. Consider yourself an early adopter if you choose to engage with them; their technology is probably best suited to larger, wholesale apple growers. I am thinking this is kind of like when the original iPhone came out and there was no App Store: you could only use what Steve Jobs chose to pre-install. That is where VM may be now, but you can see what might be coming?
Thanks to the Massachusetts Fruit Growers' Association for partial support of this demonstration research. Also: Precision Crop Load Management of Apples, USDA-NIFA-SCRI SREP 2020-51181-32197, 09/30/2019 – 08/31/2024.
Figure 1 - VM Home Tab as seen in their 'Dashboard'
Figure 2 - VM Map Tab as seen in their 'Dashboard'
Figure 3 - VM (machine) count vs. actual hand counts (Count 1 and Count 2 by two different people)
Figure 4 - Example images (Ambrosia) captured by the VM XV3 camera at the UMass Orchard