Insects are the most important global pollinators of crops and play a key role
in maintaining the sustainability of natural ecosystems. Insect pollination
monitoring and management are therefore essential for improving crop production
and food security. Computer vision facilitated pollinator monitoring can
intensify data collection over what is feasible using manual approaches. The
new data it generates may provide a detailed understanding of insect
distributions and facilitate fine-grained analysis sufficient to predict their
pollination efficacy and underpin precision pollination. Current computer
vision facilitated insect tracking in complex outdoor environments is
restricted in spatial coverage and often constrained to a single insect
species. This limits its relevance to agriculture. Therefore, in this article
we introduce a novel system to facilitate markerless data capture for insect
counting, insect motion tracking, behaviour analysis and pollination prediction
across large agricultural areas. Our system comprises Edge Computing
multi-point video recording and offline automated multi-species insect counting,
tracking and behavioural analysis. We implement and test our system on a
commercial berry farm to demonstrate its capabilities. Our system successfully
tracked four insect varieties, at nine monitoring stations within a
poly-tunnel, obtaining an F-score above 0.8 for each variety. The system
enabled calculation of key metrics to assess the relative pollination impact of
each insect variety. With this technological advancement, detailed, ongoing
data collection for precision pollination becomes achievable. This is important
to inform growers and apiarists managing crop pollination, as it allows
data-driven decisions to be made to improve food production and food security.
Contributing authors: don.amarathunga@monash.edu; asaduzzaman@monash.edu; adrian.dyer@rmit.edu.au; adrian.dyer@monash.edu; alan.dorin@monash.edu
Keywords: deep learning, camera trapping, honeybees, pollination, food security, insect tracking
Spatial Monitoring and Insect Behavioural Analysis Using Computer Vision for Precision Pollination
1 Introduction

Pollinators play a key role in world food production and ecosystem management.
Three out of four flowering plants (Food & Agriculture Organization of the United Nations, 2019) and 35% of agricultural land (FAO, 2018) require some degree of animal pollination.
The annual market value of pollinator contributions to global food production is estimated to be in the range of 235–577 billion USD (Potts et al., 2016).
Recently, climate change and other anthropogenic pressures have been implicated in declines in some pollinator populations (Schweiger et al., 2010; Vanbergen & Initiative, 2013), threatening global food security.
In many instances, pollinator population size is directly correlated with crop yield (Rollin & Garibaldi, 2019), although the efficiency of different pollinator populations varies between crops (MacInnis & Forrest, 2019).
Hence, improved understanding and management of pollinator communities is important to boost crop yield (Garibaldi, Requier, Rollin, & Andersson, 2017), and for the long-term viability of many farming projects (Garibaldi, Sáez, Aizen, Fijen, & Bartomeus, 2020).
This need strongly motivates the research presented here to describe the design and implementation of computer vision facilitated spatial monitoring and insect behavioural analysis for precision pollination.
Traditional methods of insect monitoring are straightforward to conduct but are time-consuming and labour intensive.
The use of human labour for traditional sampling may unintentionally bias results (Dennis et al., 2006; Simons & Chabris, 1999), increase processing lead times, reduce reproducibility, and inhibit or interfere with active pollination monitoring conducted simultaneously in different areas of a site.
Furthermore, conventional sampling methods lack functional precision – the capacity to model pollinator movements, motion paths and spatial distributions.
This restricts their value as a means to understand how insect behaviour affects pollination.
Automated and detailed pollination monitoring techniques with high functional precision are needed to allow continuous assessment of pollination levels.
Mechanised efforts to count insects have been attempted and improved over the last century, although it is only with improved technology and Artificial Intelligence that individual recognition in complex environments has started to emerge as a realistic proposition (Odemer, 2022).
In turn, this will facilitate the efficient management of pollinator resources as agriculture increasingly embraces data-driven, AI-enhanced technology (Abdel-Raziq, Palmer, Koenig, Molnar, & Petersen, 2021; Breeze et al., 2021; Howard, Nisal Ratnayake, Dyer, Garcia, & Dorin, 2021).
Improvement in sensor technology has enabled the use of inexpensive Internet of Things (IoT) devices, such as cameras and miniature insect-mounted sensors, for pollination monitoring.
Insect-mounted sensors allow movement tracking of tagged insects over large areas (Abdel-Raziq et al., 2021).
However, the technique is unsuitable for agriculture since tagging is laborious, it may increase insect stress or alter behaviour (Batsleer et al., 2020), and it is simply impractical on a large enough scale to be relevant in this context.
Camera-based pollination monitoring can overcome these drawbacks by tracking untagged insects using computer vision and deep learning (Howard et al., 2021; Ratnayake, Dyer, & Dorin, 2021a).
In this research, we introduce a novel computer vision system to facilitate pollination monitoring for large-scale agriculture.
Our system comprises Edge Computing multi-point remote capture of unmarked insect video footage, automated offline multi-species motion tracking, and insect counting and behavioural analysis.
We implemented and tested our methods on a commercial berry farm to
(i) track individual movements of multiple varieties of unmarked insects,
(ii) count insects,
(iii) monitor their flower visitation behaviour, and
(iv) analyse contributions of different species to pollination.
Along with this article we publish the monitoring software, a dataset of over 2000 insect tracks of four insect classes, and an annotated dataset of images from the four classes.
We believe that these will serve as a benchmark for future research in precision pollination, a new and important area of precision agriculture.
The remainder of the paper is organised as follows.
In Section 2 we present a brief overview of related work concerning computer vision for insect tracking in the wild.
Section 3 presents our new methods and their implementation.
In Section 4 we describe experiments to evaluate the performance of our approach and present the results of a pollination analysis to demonstrate our methods’ application.
In Section 5 we discuss the strengths and limitations of our approach and suggest future work.
Section 6 concludes the paper.
2 Related Work

Recently there has been an increase in the use of computer vision and deep learning in agriculture (Kamilaris & Prenafeta-Boldú, 2018; Odemer, 2022).
This has been prominent in land cover classification (Lu et al., 2017), fruit counting (Afonso et al., 2020), yield estimation (Koirala, Walsh, Wang, & McCarthy, 2019), weed detection (Su, Kong, Qiao, & Sukkarieh, 2021), beneficial and pest insect monitoring (Amarathunga, Grundy, Parry, & Dorin, 2021), and insect tracking and behavioural analysis (Høye et al., 2021).
Applications of insect tracking and behavioural analysis algorithms are usually confined to controlled environments such as laboratories (Branson, Robie, Bender, Perona, & Dickinson, 2009; Haalck, Mangan, Webb, & Risse, 2020; Pérez-Escudero, Vicente-Page, Hinz, Arganda, & De Polavieja, 2014; Walter & Couzin, 2021), and semi-controlled environments such as at beehive entrances (Campbell, Mummert, & Sukthankar, 2008; Magnier et al., 2019; Yang, Collins, & Beckerleg, 2018).
In these situations, image backgrounds and illumination under which insects are tracked vary only a little, simplifying automated detection and tracking tasks.
Pollination monitoring of crops however, may require tracking unmarked insects outdoors in uncontrolled environments subjected to vegetation movement caused by the wind, frequent illumination shifts, and movements of tracked and non-target animals.
These environmental changes, combined with the complexity of insect movement under such variable conditions, increases the difficulty of the tracking problem.
Recent studies attempted to address these issues through in-situ insect monitoring algorithms (Bjerge, Mann, & Høye, 2021; Bjerge, Nielsen, Sepstrup, Helsing-Nielsen, & Høye, 2021), but were limited in the spatiotemporal resolution required for efficient pollination monitoring.
To overcome the difficulties listed above, we previously presented a Hybrid Detection and Tracking (HyDaT) algorithm (Ratnayake, Dyer, & Dorin, 2021b) and a Polytrack algorithm (Ratnayake et al., 2021a) to track multiple unmarked insects in uncontrolled conditions.
HyDaT and Polytrack algorithms use a hybrid detection model consisting of a deep learning-based detection model (Bochkovskiy, Wang, & Liao, 2020; Redmon & Farhadi, 2017) and a foreground/background segmentation-based detection model (Zivkovic & Van Der Heijden, 2006).
This enables tracking unmarked and free-flying insects amidst the changes in the environment.
However, these earlier algorithms are limited to one species and one study location at a time.
These constraints limit our ability to gain a sophisticated understanding of agricultural pollination, since it is important to analyse the behaviour of multiple insect species that contribute simultaneously, in multiple locations, to overall pollination levels or deficiencies (Garibaldi et al., 2020; Rader et al., 2016).
Currently there is no computer vision facilitated system, or any other practical system, capable of achieving this goal.
In addition, no previous method can identify and classify insect pollination behaviour across large-scale industrial agricultural areas at a level of detail that permits sub-site-specific interventions to increase farm yield via improved pollination.
3 Methods and Implementation

In this section, we explain the methods and implementation of our insect and pollination monitoring system.
An overview of the proposed methodology is shown in Fig 1.
3.1 Multi-point remote video capture

Video footage of freely foraging, unmarked insects required for insect tracking and behavioural analysis was collected using edge computing-based remote camera trap devices built on the Raspberry Pi single-board computer.
Fig. 1: Overview of the proposed methodology

We used a Raspberry Pi 4 and Raspberry Pi camera v2 (Sony IMX219 8-megapixel sensor) because it is widely available and customisable, there is a wide range of plug-in sensors, and it is sufficiently low-cost for replication across a large area (Jolles, 2021).
Videos are recorded at 1920 × 1080 resolution at 30 fps.
The system is powered using a 20,000 mAh battery bank.
However, we do not process videos to track pollinators in situ since the Raspberry Pi is currently incapable of processing high quality videos in real-time, and our key goals required detection of insects.
Reducing the video resolution or the capture frame-rate to compensate for the lack of speed of the device is not currently feasible within the limitations imposed by pollinator insect speed and size.
Video recording units were distributed across nine data collection points in an experimental site (section 3.4 below) and were programmed to continuously record sets of footage clips of 10 minutes duration.
(Refer to code availability for the software used in the video recording unit.)
3.2 Automated multi-species insect tracking

We processed the videos captured remotely using an offline automated video processing algorithm.
Since food crops are usually grown in uncontrolled or semi-controlled environments subject to changes in illumination and foliage movement caused by wind and/or insect and human activity,
robust tracking of insects and flowers is essential for accurate pollination and insect behavioural analysis.
Here, we extended methods proposed in Ratnayake et al. (2021a, 2021b) to track multiple insect varieties simultaneously and to detail their interactions with flowers.
In the following sections we present the technical details of our methods.
At the start of processing each video sequence, our algorithm extracts the time and location at which the video was captured from the sequence’s embedded metadata.
Next, the video is processed to track movement of insects and their interactions with flowers.
Pilot research revealed that the position of each respective flower being recorded varies throughout a day due to wind and farm management activities, and in some cases flowers may physically move to track sunlight, termed heliotropism (Kevan, 1975; van der Kooi, Kevan, & Koski, 2019).
Therefore, it is essential to track flower position within the frame to reliably identify insect-flower interactions.
The positions of all visible flowers are first recorded at the start of a video sequence and updated at predefined, user-specified intervals (parameter values are provided with the source code).
A “predict and detect” approach is used to track flower movement.
The predicted next position of each flower is initially identical to its current position, since the magnitude of flower movement within a short interval (e.g., ≈100 seconds) is assumed to be small.
We then used the Hungarian algorithm (Kuhn, 1955) to associate the predicted position of each flower to a flower detection in order to form a continuous flower movement track.
If a flower is undetected in a given frame, the last detected position is carried forward.
If a detected flower cannot be assigned to any predictions it is considered to be a new flower.
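The association step described above can be sketched in a few lines. The snippet below is our own minimal illustration, not the published Polytrack implementation: the function name `update_flower_tracks` and the pixel gating distance `max_dist` are assumptions, and SciPy's solver stands in for the Hungarian algorithm (Kuhn, 1955).

```python
import numpy as np
from scipy.optimize import linear_sum_assignment  # Hungarian algorithm solver

def update_flower_tracks(tracks, detections, max_dist=50.0):
    """One 'predict and detect' update for flower tracks.

    tracks: list of (x, y) last-known flower positions; the prediction for
        each flower is simply its current position, since flower movement
        over a short interval is assumed to be small.
    detections: list of (x, y) flower centres detected at this update.
    max_dist: assumed gating distance in pixels (not from the paper).
    """
    if not tracks:
        return list(detections)        # every detection starts a new track
    if not detections:
        return list(tracks)            # undetected flowers keep their last position
    t = np.asarray(tracks, dtype=float)
    d = np.asarray(detections, dtype=float)
    cost = np.linalg.norm(t[:, None, :] - d[None, :, :], axis=2)
    rows, cols = linear_sum_assignment(cost)
    updated, assigned = list(tracks), set()
    for r, c in zip(rows, cols):
        if cost[r, c] <= max_dist:     # accept only plausible matches
            updated[r] = detections[c]
            assigned.add(c)
    # detections unassigned to any prediction are treated as new flowers
    updated.extend(pt for i, pt in enumerate(detections) if i not in assigned)
    return updated
```

An undetected flower simply retains its previous position, matching the carry-forward behaviour described in the text.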
At the end of a video sequence, the final positions of flowers and their respective tracks of interacting insects are saved for later pollination analysis and visualisation.
When an insect is first detected inside a video frame, the automated video processing algorithm identifies its species using the Polytrack deep learning model (Ratnayake et al., 2021a).
In addition, it saves a snapshot of the insect for (optional human) visual verification.
After detection and identification of an insect, the Polytrack algorithm tracks it through subsequent frames.
In each frame after the first detection of an insect, its position is compared with the position of recorded flowers to identify flower visits.
If an insect is detected inside the radius of a flower for more than 5 consecutive frames (at 30 fps this ensures it is not flying over the flower at typical foraging flight speeds (Spaethe, Tautz, & Chittka, 2001)), the spatial overlap is stored as a flower visit.
The radius of a flower is computed to include its dorsal area and an external boundary threshold.
This threshold is incorporated as some insects station themselves outside of a flower while accessing nectar or pollen.
Repeat visits to a flower that occur after an intermediate visit to another flower are recorded as flower re-visits.
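The visit and re-visit rules above can be expressed as a short reference routine. This is a sketch under stated assumptions, not the published code: `count_flower_visits` is a hypothetical helper, and a single effective `radius` per flower stands in for the dorsal area plus boundary threshold.

```python
import math

def count_flower_visits(track, flowers, radius, min_frames=5):
    """Derive flower visits from a per-frame insect track.

    track: list of (x, y) insect positions, one per video frame (30 fps).
    flowers: dict flower_id -> (x, y) centre; `radius` approximates the
        flower's dorsal area plus the external boundary threshold.
    A visit is logged only after `min_frames` consecutive frames inside a
    flower's radius, which rules out fly-overs at typical foraging speeds.
    Returns the ordered visit list and the number of re-visits (returns to
    a flower after an intermediate visit to another flower).
    """
    visits, current, consecutive = [], None, 0
    for x, y in track:
        inside = next(
            (fid for fid, (fx, fy) in flowers.items()
             if math.hypot(x - fx, y - fy) <= radius),
            None,
        )
        if inside is not None and inside == current:
            consecutive += 1
        else:
            current = inside
            consecutive = 1 if inside is not None else 0
        if consecutive == min_frames:  # threshold crossed exactly once per stay
            visits.append(current)
    revisits = sum(
        1 for i, f in enumerate(visits) if f in visits[:i] and visits[i - 1] != f
    )
    return visits, revisits
```

Note that a return to the same flower without an intermediate visit elsewhere is deliberately not counted as a re-visit, matching the definition above.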
When an insect exits the video frame, a file with data on camera location, time of capture and insect trajectories with flower visitation information is saved for behavioural analysis.
The software and recommended tracking parameter values are available with the source code.
3.3 Insect behaviour analysis
We analysed insect flower visiting behaviour using the extracted movement trajectories to infer likely pollination events.
This is appropriate since flowers have evolved structures that enable visiting insects to conduct pollen dispersal and transfer between floral reproductive organs for fertilisation of ovules by pollen (Real, 2012).
Metrics used to analyse flower visitation behaviour and pollination are presented below.
Let $S = \{s^1, s^2, \dots, s^{|S|}\}$ be the set of insect species (or varieties at any taxonomic level) and $F$ be the set of flowers in the experimental environment. Here, $s^i = \{s^i_1, s^i_2, \dots, s^i_{|s^i|}\}$ denotes the subset of insects in $S$ that belong to the $i$th species, and $s^i_j$ is the $j$th insect in $s^i$. $|\cdot|$ is the cardinality of a given set, e.g., $|S|$ is the number of species types and $|s^i|$ is the number of insects belonging to the $i$th species.

• Number of flowers visited by an insect species: the number of flowers visited by an insect species $s^i$ is defined as $FV(s^i)$, where $n^f_{s^i_j}$ is the number of times insect $s^i_j$ of species $s^i$ visited flower $f \in F$.

$$FV(s^i) = \sum_{j=1}^{|s^i|} \sum_{f \in F} n^f_{s^i_j} \qquad (1)$$

• Total number of visits to a flower $f$ from species $s^i$: defined as $VF(f, s^i)$.

$$VF(f, s^i) = \sum_{j=1}^{|s^i|} n^f_{s^i_j} \qquad (2)$$

• Total number of visits to a flower $f$: defined as $V(f)$.

$$V(f) = \sum_{i=1}^{|S|} \sum_{j=1}^{|s^i|} n^f_{s^i_j} \qquad (3)$$

• Number of flowers fertilised with visits from species $s^i$: defined as $N_{pol}(s^i)$, where $\hat{V}$ is the number of visits required for full fertilisation of a flower and $[\,\cdot\,]$ denotes the Iverson bracket (1 if the condition holds, 0 otherwise).

$$N_{pol}(s^i) = \sum_{f \in F} \big[\, VF(f, s^i) \geq \hat{V} \,\big] \qquad (4)$$

• Total number of fertilised flowers: the total number of fertilised flowers in a location is defined as $N_{pol}$.

$$N_{pol} = \sum_{f \in F} \Big[\, \sum_{i=1}^{|S|} VF(f, s^i) \geq \hat{V} \,\Big] \qquad (5)$$
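To make the metrics concrete, they can be computed directly from per-insect visit counts. The sketch below is our own illustration, not the published analysis code: the helper name `pollination_metrics` and the `(species, insect_id, flower_id)` input layout are assumptions.

```python
from collections import defaultdict

def pollination_metrics(visit_counts, v_hat=4):
    """Compute the flower-visitation metrics defined above.

    visit_counts: dict keyed by (species, insect_id, flower_id) with the
        number of visits n^f_{s^i_j}; this layout is our assumption, not
        the paper's data format.
    v_hat: visits required to fully fertilise a flower (V-hat); four
        visits are used for strawberry later in this study.
    Returns FV per species, V per flower, N_pol per species, total N_pol.
    """
    fv = defaultdict(int)   # FV(s^i), Eq. (1)
    vf = defaultdict(int)   # VF(f, s^i), Eq. (2)
    v = defaultdict(int)    # V(f), Eq. (3)
    for (species, _insect, flower), n in visit_counts.items():
        fv[species] += n
        vf[(flower, species)] += n
        v[flower] += n
    n_pol_species = defaultdict(int)  # N_pol(s^i), Eq. (4)
    for (flower, species), count in vf.items():
        if count >= v_hat:
            n_pol_species[species] += 1
    # N_pol, Eq. (5): flowers whose combined visits reach the threshold
    n_pol = sum(1 for count in v.values() if count >= v_hat)
    return dict(fv), dict(v), dict(n_pol_species), n_pol
```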
3.4 Implementation

We implemented the proposed spatial monitoring and insect behavioural analysis system on the commercial Sunny Ridge farm in Boneo, Victoria, Australia (lat. 38.420942° S, long. 144.890422° E) (Fig. 2a).
We installed remote video recording units over nine data collection points in strawberry polytunnels (Fig. 2 b).
These data collection points were selected to cover the edges and central regions of the polytunnels because previous studies indicated that edge effects might impact insect movement, foraging behaviour and numbers within polytunnels (Hall, Jones, Rocchetti, Wright, & Rader, 2020; Howard et al., 2021).
Videos were recorded for a period of 6 days (8th–17th March 2021) from 11:00 am to 4:00 pm (≈5 hours per day) to coincide with the key pollination period.
We monitored the behaviour of four key insect types, honeybees (Apis mellifera), Syrphidae (hover flies), Lepidoptera (moths and butterflies), and Vespidae (wasps) that actively forage on the farm (Fig. 3).
The YOLOv4 model was then trained on this dataset using TensorFlow (Abadi et al., 2016) with a learning rate of 0.001.
3.4.2 Processing videos
We processed the videos to extract insect tracks and insect-flower visiting behaviour using the methods described in Section 3.2.
Videos were processed on the MASSIVE high performance computing infrastructure (Goscinski et al., 2014) with an Intel Xeon Gold 6150 (2.70 GHz) CPU, 55 GB RAM, an NVIDIA Tesla P4 GPU and CentOS Linux (7).
3.4.3 Insect trajectory dataset preparation

We post-processed insect tracks extracted from the videos to remove false positive tracks and correct insect type identifications.
Insect type identification was performed on multiple still frames of each insect assigned to a motion track.
A further step was appended to this process to manually classify Hymenoptera into two separate classes, honeybees and Vespidae.
As reported above, these insects were initially treated as a single class in training the deep learning model due to the difficulty of clearly resolving morphological differences between them in flight at low video resolution and 30 fps.
4.1 Experimental evaluation

The automated video processing system employs a deep learning model to detect insects and flowers.
We created a custom dataset of 3073 images divided into four classes:
(i) honeybees/Vespidae (2231/371 instances),
(ii) Syrphidae (204 instances),
(iii) Lepidoptera (93 instances), and
(iv) strawberry flowers (14050 instances).
Honeybees and wasps were included in a single Hymenopteran class due to their physical similarities and the difficulty of automatically distinguishing between them using the low-quality video footage extracted from the basic cameras (discussed further below).
We evaluated the performance of our system for extracting the trajectory and flower visitation behaviour of four insect types (Fig. 3).
Experiments were conducted using a test dataset of 180,000 frames/100 minutes at 30 frames per second (comprised of 10 sequential videos of 10 minutes each).
These videos were randomly selected from the set of recordings unused in deep learning model training and captured from different polytunnel locations (Test video dataset is accessible from Data Availability).
Tracking accuracy was measured using precision, recall and F-score, computed as Precision = TruePositive/(TruePositive + FalsePositive), Recall = TruePositive/(TruePositive + FalseNegative) and F-score = 2 × Precision × Recall/(Precision + Recall), where TruePositive is the total number of correct detections in all frames, FalseNegative is the total number of undetected insects in frames, and FalsePositive is the total number of incorrectly detected insect positions.
Identity swaps (where a pair of insect’s identities are mistakenly swapped) in tracks were recorded as F alseP ositive.
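These frame-level counts map onto the standard formulas in a few lines; the helper below is our own sketch of the computation, not the published evaluation code.

```python
def detection_metrics(true_positive, false_positive, false_negative):
    """Precision, recall and F-score from frame-level detection counts.

    Identity swaps within a track are counted as false positives,
    following the evaluation protocol described above.
    """
    precision = true_positive / (true_positive + false_positive)
    recall = true_positive / (true_positive + false_negative)
    f_score = 2 * precision * recall / (precision + recall)
    return precision, recall, f_score
```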
The tracks and flower visits reported by our system were compared against human observations made from the videos for validation as we found no other existing monitoring system against which to compare our software.
When an insect appeared in the frame, the video was analysed frame by frame to record its flower visits.
An insect landing on the dorsal side of a flower was counted as a flower visitor.
Insects that appeared inside the frame of the video for less than 5 frames were ignored since at 30 fps this time is too brief to be likely to have any biological impact on pollination.
The insect behavioural analysis component of the algorithm accurately detected 97% of honeybee-flower interactions, and 3% of flower interactions were not recorded due to undetected flowers.
These lower values were due to the frames where the insect was undetected (see Discussion).
Tracking metrics for Lepidoptera were similar to those of Syrphidae: the algorithm detected and tracked 75% of Lepidopterans, with precision, recall and F-score values of 0.99, 0.71 and 0.81 respectively.
The complete trajectory dataset of insects and flowers is accessible from Data Availability.
Spatial monitoring and insect behavioural analysis can help growers quantify pollination across different farm areas.
We compared pollination levels across farm strawberry polytunnels using insect counts and the number of insect-flower interactions recorded at each location.
Spatial Monitoring and Insect Behavioural Analysis Using Computer Vision for Precision Pollination
Table 1: Results of the experimental evaluations for the test video dataset.
“Detections made” shows the number of insects/flowers detected by the algorithm compared against human observations.
“Tracklets generated” shows the total number of tracks generated for each insect variety.
“Visible frames” indicates the number of frames in which the insects/flowers were fully visible.
“Evaluation metrics” present the average precision, recall and F-score values for tracked insects.
“Flower visits” compares the total number of insect visits to flowers counted through human observations and automatically identified through the software for tracked insects.
We quantified this on the strawberry flowers by calculating the percentage of flowers that received visits from each insect type. We further analysed insect-flower visits to evaluate the pollination efficacy of insect types by calculating the proportion of flowers that received the minimum of four insect visits required for fertilisation.
We recorded relatively low Lepidopteran and Syrphidae counts in most areas of the farm (Fig. 5).
The contribution of these species towards achieving flower-visitor targets required for pollination was observed to be much lower than that of honeybees (Fig. 6).
This effect is evident by the low relative frequency with which these insects made successive visits to flowers to meet the four required for optimal fertilisation (Fig. 6).
For example, the highest frequency with which a non-honeybee pollinator met the four-visit threshold was for Lepidoptera at location 9, where less than 15% of flowers achieved this level of pollination; at all locations, honeybees significantly exceeded this level of pollination performance (Fig. 6).
When pollination across all locations is considered, over 68% of the recorded strawberry flowers received the minimum of four insect visits required for fertilisation, and 67% of flowers attained this threshold through honeybee visits alone.
This data thus reconfirms which insects seem, at least as far as the number of visits is concerned, to contribute the most towards pollination at the site.
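The per-insect-type threshold analysis above can be sketched as follows (the visit-log structure and values are hypothetical; the four-visit threshold follows Chagnon et al., 1989 and Garibaldi et al., 2020):

```python
MIN_VISITS = 4  # visits required for full strawberry flower fertilisation

def pollination_contribution(flower_visits, insect_type=None):
    """Percentage of flowers reaching MIN_VISITS, optionally for one insect type.

    `flower_visits` maps a flower id to a list of visiting insect types,
    one entry per recorded visit.
    """
    def count(visits):
        if insect_type is None:
            return len(visits)                         # visits by any insect
        return sum(1 for v in visits if v == insect_type)

    fertilised = sum(1 for visits in flower_visits.values()
                     if count(visits) >= MIN_VISITS)
    return 100.0 * fertilised / len(flower_visits)

# Hypothetical visit log for three flowers:
visits = {"f1": ["honeybee"] * 5,
          "f2": ["honeybee", "honeybee", "syrphidae", "vespidae", "honeybee"],
          "f3": ["lepidoptera", "honeybee"]}
overall = pollination_contribution(visits)                    # any insect type
honeybee_only = pollination_contribution(visits, "honeybee")  # honeybees alone
```

In this toy example, two of three flowers reach the threshold overall, but only one does so through honeybee visits alone, illustrating how the two percentages reported above can diverge.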
5 Discussion

In this study, a novel multi-point computer vision-based system is presented to facilitate digital spatial monitoring and insect behavioural analysis on large-scale farms.
Our system operates in real-world commercial agricultural environments (Fig. 2) to capture videos of insects, identify them (Fig. 3), and count the number of different varieties over large areas (Fig. 5).
Analysis of the insect behavioural data allows comparison of the contributions of different insect varieties to crop pollination (Fig. 5 and 6).
Here, we discuss the implications of our research for precision pollination.
5.1 Computer vision for insect tracking and behavioural analysis

Our methods remove the major constraints imposed by the limitations of human observers for horticultural pollination monitoring and the collection of high-resolution spatiotemporal data (Fig. 5) on insect behaviour.
The approach therefore also paves the way for computer vision and edge computing devices to identify insect species for other entomological and ethological applications.
The use of relatively inexpensive Raspberry Pi edge computing devices (Fig. 2) for remote recording provides a high degree of scalability and customisability (Aslanpour et al., 2021; O’Grady, Langton, & O’Hare, 2019) for insect monitoring.
However, the limited capabilities of these devices confine the size of recorded study areas (Fig. 2d) and offer only low frame rates and low-quality video.
It is likely that with the rapid improvement of camera technology, video quality and resolution will overcome current limitations and enhance the accuracy and efficiency of our methods.
Fig. 6: Contribution of different insect varieties towards strawberry pollination.
The bar chart shows the percentage of flowers visited by each insect type.
The dark grey portion shows the percentage of flowers with over four visits (the number required for strawberry flower fertilisation (Chagnon et al., 1989; Garibaldi et al., 2020)) from each insect type.
The red dashed line in the plots shows the total percentage of flowers with more than four visits at a location.
We applied our new methods to monitor insect pollination behaviour in strawberry crops.
Strawberry flowers bloom within a narrow vertical spatial range and are usually visible from above (Fig. 2d).
By contrast, other crops, such as tomatoes or raspberry, grow within complex three-dimensional structures of vines or canes, making overhead camera tracking of insects problematic.
Monitoring their behaviour in such three-dimensional crops will require camera placements at oblique angles.
Insect detection is an essential precursor to tracking and monitoring.
Our algorithm accurately detected honeybees and Vespidae but performed relatively poorly on Syrphidae (Table 1).
This is because of the relatively small pixel area covered by the insect with our setup (Syrphidae covers ≈ 40 ± 10 pixels compared to ≈ 1001 ± 475 pixels for a honeybee) (Fig. 3).
Future improvements in cameras and object detection technologies (Stojnić et al., 2021) will help here.
Our algorithm uses deep learning to detect and classify insects.
The results of experimental evaluation showed limitations in Lepidopteran detection and in detection of visually similar insects (i.e. honeybees, Syrphidae and Vespidae; Fig. 3 and Table 1).
Detection of Lepidopterans was challenging because they sometimes appear similar in shape to foliage and shadows in the environment.
Also, they rested stationary on flowers for extended periods, prompting the algorithm to classify them as part of the background.
Detection and classification of visually similar insects requires a deep learning model trained with large annotated datasets (Høye et al., 2021).
However, our dataset was unbalanced, since the number of instances in each class was influenced by the relative abundance of insects recorded at the site (Wang et al., 2016).
We propose that future research should use characteristics of insect behaviour, such as spatial signatures of insect movement, to improve species classification tasks (Kirkeby et al., 2021).
This will help overcome limitations associated with camera quality and deep learning datasets.
The video data we publish with this article offers a starting point for such solutions.
5.2 Spatial monitoring for precision pollination

Spatial monitoring and insect behavioural analysis can help growers understand the distribution of pollinators across a farm and their impact on pollination.
Strawberry flowers require at least four insect visits for full fertilisation (Chagnon et al., 1989; Garibaldi et al., 2020).
However, it is important to note that crop yield and visitation rates have been observed to have a non-linear relationship, where higher flower visitation rates can result in lower crop yield (Garibaldi et al., 2020; Rollin & Garibaldi, 2019).
Therefore, it is beneficial to maintain insect flower visits at an optimum value that depends on the crop type, pollinator species, and environmental conditions (Garibaldi et al., 2020).
Although different behaviours and morphologies make some insect species more effective pollinators of some flowers than others, we compared the contribution of different insect varieties to strawberry pollination using the number of insect flower visits as a proxy (Fig. 6).
However, an agricultural system driven by a single pollinator type may not be desirable.
Pollinator diversity and associated high flower-visitor richness have been shown to affect pollination and crop yield (Garibaldi et al., 2016).
Often the high abundance of a single pollinator species cannot be used as a substitute for species richness (Fijen et al., 2018; Garibaldi et al., 2016), as variations in behaviour and foraging inherent to different insect species may be important.
Compared to manual pollination monitoring, our methods provide high-resolution behavioural data classified by insect type.
Pollination monitoring helps understand the impact of climate change and other anthropogenic activities on insect populations (Settele, Bishop, & Potts, 2016).
Recently, climate change and other anthropogenic pressures, including intensive agriculture, have caused a decline in some pollinator populations (Hallmann et al., 2017; Outhwaite, McCann, & Newbold, 2022; Schweiger et al., 2010; Vanbergen & Initiative, 2013), threatening global food security and terrestrial ecosystem health.
The most impacted pollinator populations are native and wild insects that must compete for food with managed pollinators while coping with disease, pollution and habitat loss (Wood et al., 2020).
Digital pollination monitoring systems like that described here provide much-needed data for understanding the impacts of climate change on insect biodiversity, and can ultimately provide a sound basis for conservation.
6 Conclusions

In this paper, we presented a computer vision facilitated system for spatial monitoring and insect behavioural analysis to underpin agricultural precision pollination.
Our system comprises edge computing-based remote video capture; offline, automated, unmarked multi-species insect tracking; and insect behavioural analysis.
The system tracked four insect types with F-scores above 0.8 when implemented on a commercial strawberry farm.
Analysis of the spatial distribution of flower-visiting behaviour of different insect varieties across the farm allowed for the inference of flower fertilisation and the comparison of insects’ pollination contributions.
We determined that 67% of flowers met or exceeded the specified criteria for reliable pollination through honeybee visits.
However, alternative pollinators were less effective at our study site.
This advancement of computer vision, spatial monitoring and insect behavioural analysis, provides pollinator data to growers much more rapidly, broadly and deeply than manual observation.
Such rich sources of insect-flower interaction data potentially enable precision pollination and pollinator management for large-scale commercial agriculture.
Acknowledgments. The authors would like to thank Sunny Ridge Australia for the opportunity to conduct research at their farm.
Declarations

• Funding: Authors were supported by the Australian Research Council Discovery Projects grant DP160100161 and Monash-Bosch AgTech Launchpad primer grant.
• Ethics approval: Not applicable
• Consent to participate: Not applicable
• Consent for publication: Not applicable
• Availability of data and materials: The datasets generated during and/or analysed during the current study are available here.
• Author contributions: Conceptualization: Malika Nisal Ratnayake, Adrian G. Dyer, Alan Dorin; Data curation: Malika Nisal Ratnayake; Formal analysis: Malika Nisal Ratnayake; Funding acquisition: Adrian G. Dyer, Alan Dorin; Investigation: Malika Nisal Ratnayake, Don Chathurika Amarathunga, Asaduz Zaman; Methodology: Malika Nisal Ratnayake, Adrian G. Dyer, Alan Dorin; Project administration: Adrian G. Dyer, Alan Dorin; Resources: Adrian G. Dyer, Alan Dorin; Software: Malika Nisal Ratnayake; Supervision: Adrian G. Dyer, Alan Dorin; Validation: Malika Nisal Ratnayake,
Don Chathurika Amarathunga; Writing – original draft: Malika Nisal Ratnayake; Writing – review & editing: Malika Nisal Ratnayake, Don Chathurika Amarathunga, Asaduz Zaman, Adrian G. Dyer, Alan Dorin.
References

Abadi, M., Barham, P., Chen, J., Chen, Z., Davis, A., Dean, J., . . . Zheng, X. (2016). TensorFlow: A system for large-scale machine learning. Proceedings of the 12th USENIX Symposium on Operating Systems Design and Implementation, OSDI 2016 (pp. 265–283).

Goscinski, W.J., McIntosh, P., Felzmann, U.C., Maksimenko, A., Hall, C.J., Gureyev, T., . . . others (2014). The multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) high performance computing infrastructure: applications in neuroscience and neuroinformatics research.
Haalck, L., Mangan, M., Webb, B., Risse, B. (2020). Towards image-based animal tracking in natural environments using a freely moving camera. Journal of Neuroscience Methods, 330, 108455.
Hall, M.A., Jones, J., Rocchetti, M., Wright, D., Rader, R. (2020). Bee visitation and fruit quality in berries under protected cropping vary along the length of polytunnels. Journal of Economic Entomology, 113(3), 1337–1346.

Hallmann, C.A., Sorg, M., Jongejans, E., Siepel, H., Hofland, N., Schwan, H., . . . others (2017). More than 75 percent decline over 27 years in total flying insect biomass in protected areas. PLoS ONE, 12(10), e0185809.
Howard, S.R., Nisal Ratnayake, M., Dyer, A.G., Garcia, J.E., Dorin, A. (2021). Towards precision apiculture: Traditional and technological insect monitoring methods in strawberry and raspberry crop polytunnels tell different pollination stories.