Abstract
A plant growth simulation can be characterized as a reconstructed visual representation of a plant or plant system. The phenotypic characteristics and plant structures are controlled by the scene environment and other contextual attributes. Considering the temporal dependencies and compounding effects of various factors on growth trajectories, we formulate a probabilistic approach to the simulation task by solving a frame synthesis and pattern recognition problem. We introduce a sequence-informed plant growth simulation framework (SI-PGS) that employs a conditional generative model to implicitly learn a distribution of possible plant representations within a dynamic scene from a fusion of low-dimensional temporal sensor and context data. Methods such as controlled latent sampling and recurrent output connections are used to improve coherence in the plant structures between frames of prediction. In this work, we demonstrate that SI-PGS is able to capture temporal dependencies and continuously generate realistic frames of plant growth.
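The abstract describes the SI-PGS architecture only at a high level. As a point of reference, the sketch below shows one way a sequence-conditioned generative frame predictor with controlled latent sampling and a recurrent output connection could be wired up in PyTorch. The module names, dimensions, and layer choices are illustrative assumptions for exposition only and do not reproduce the released SI-PGS implementation (see the repository linked in the Notes).

```python
# Illustrative sketch of a sequence-conditioned generative frame predictor.
# All names, dimensions, and layers are assumptions, not the authors' code.
import torch
import torch.nn as nn

class SequenceConditionedGenerator(nn.Module):
    def __init__(self, sensor_dim=8, cond_dim=64, latent_dim=32, frame_ch=3, frame_size=64):
        super().__init__()
        # GRU summarizes the low-dimensional temporal sensor/context sequence
        self.seq_encoder = nn.GRU(sensor_dim, cond_dim, batch_first=True)
        # The previously generated frame is encoded and fed back in
        # (a simple form of recurrent output connection for temporal coherence)
        self.prev_frame_encoder = nn.Sequential(
            nn.Conv2d(frame_ch, 16, 4, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, cond_dim),
        )
        # Decoder maps [latent, sequence condition, previous-frame summary] to a frame
        self.fc = nn.Linear(latent_dim + 2 * cond_dim, 256 * (frame_size // 16) ** 2)
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(256, 128, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, frame_ch, 4, stride=2, padding=1), nn.Sigmoid(),
        )
        self.latent_dim = latent_dim
        self.frame_size = frame_size

    def forward(self, sensor_seq, prev_frame, temperature=1.0):
        # sensor_seq: (B, T, sensor_dim); prev_frame: (B, C, H, W)
        _, h = self.seq_encoder(sensor_seq)
        cond = h.squeeze(0)
        prev = self.prev_frame_encoder(prev_frame)
        # "Controlled" latent sampling here is modeled simply as temperature-scaled
        # Gaussian noise; lower temperature yields more conservative frames.
        z = temperature * torch.randn(cond.size(0), self.latent_dim, device=cond.device)
        x = self.fc(torch.cat([z, cond, prev], dim=1))
        s = self.frame_size // 16
        return self.decoder(x.view(-1, 256, s, s))


if __name__ == "__main__":
    model = SequenceConditionedGenerator()
    sensors = torch.randn(2, 24, 8)     # 24 time steps of 8 environmental readings
    prev = torch.zeros(2, 3, 64, 64)    # blank previous frame for the first step
    frame = model(sensors, prev, temperature=0.7)
    print(frame.shape)                  # torch.Size([2, 3, 64, 64])
```

Rolling the generated frame back in as `prev_frame` at the next step is what gives the sequence its frame-to-frame coherence; the actual training objective (adversarial, variational, or otherwise) is left out of this sketch.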
Notes
1. Code and supplementary material can be found at: https://github.com/mohas95/Sequence-Informed-Plant-Growth-Simulation.
Acknowledgements
This study was partially funded by Gardyn and Mitacs (IT16220). We thank the Gardyn team for providing the vertical growth systems essential for building our datasets. Special thanks to Anita Parmar, Ollivier Dyens, and the Building 21 members for their engagement in creative discussions.
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Debbagh, M., Liu, Y., Zheng, Z., Jiang, X., Sun, S., Lefsrud, M. (2024). Generative Plant Growth Simulation from Sequence-Informed Environmental Conditions. In: Suen, C.Y., Krzyzak, A., Ravanelli, M., Trentin, E., Subakan, C., Nobile, N. (eds) Artificial Neural Networks in Pattern Recognition. ANNPR 2024. Lecture Notes in Computer Science, vol. 15154. Springer, Cham. https://doi.org/10.1007/978-3-031-71602-7_26
DOI: https://doi.org/10.1007/978-3-031-71602-7_26
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-71601-0
Online ISBN: 978-3-031-71602-7
eBook Packages: Computer Science (R0)