
Earlham Institute: Phenomenal phenomics
Release date: 2017-12-21

Dr Ji Zhou and his group are developing hardware and software tools to help weave together the complete picture from genotype to phenotype, driving forward innovations in breeding and agriculture using computing and robotics.

Plant yields have been increasing steadily ever since the green revolution of the mid-20th Century, yet this has only increased the strain on our agricultural land by enabling a rapid and continuous worldwide population explosion.

Since the time Norman Borlaug helped prevent famine in India, China and Mexico, the world population has almost tripled, from 2.5 billion people in 1950 to over 7 billion people in 2017.

Technologies such as nutrient fertilisers and increased mechanisation have allowed us to triple our average cereal yield per hectare, in line with the growth in world population. However, over the last 20 years, year-on-year gains in yield have been plateauing.

Across the globe, we are witnessing a revolution in how crops are monitored and grown, to help ensure we can feed every one of the ten billion (or so) people on Earth come 2050.

As with tractors and fertilisers in the last century, the 21st Century is set to embrace aeroplanes, drones, smartphones, modern supercomputing and robotics.

An agricultural revolution: smart monitoring of crops.

We already have the capacity to feed every single person on the planet; we just need smarter ways of producing and distributing food. One way to make sure we have enough food for 10 billion people by 2050 is to make the most of the space available to us.

Smart agriculture requires smart, mobile systems, and thankfully we are better equipped than ever to put those systems in place. As cameras and sensors get smaller and cheaper, we are finding more ways of extracting useful information from them.

Recently, a transition in farming has seen some modern farmers move from the fields into high-rise vertical farm systems. This urban farming revolution has witnessed plants being grown in climate-controlled systems using hydroponics, aeroponics and aquaponics, with LED lighting controlling when plants sprout, put on mass and flower.

While vertical farming can only serve to supplement agriculture in the field, we can apply the principles of controlled crop monitoring in far more varied environments, using the latest in modern robotics and supercomputing.

Miniaturised hyperspectral sensors are now flying above crop fields, analysing them for signs of disease, drought and other environmental stress factors. Closer to the crops, advanced monitoring systems are feeding data to the cloud, informing farmers about the micro-environment in every single part of their fields.

Applied agricultural analysis.

The Plant Phenomics group at EI, led by Dr Ji Zhou, is working on just such systems.

Along with the independent agricultural giant G's Growers, which brings together growers from both the UK and Spain, as well as Syngenta, the group is helping to develop software solutions to support the continuous crop monitoring that is key to production.

Through combining analytical software development with aerial image capture from light aircraft and in-field images taken using smartphones and tablets, the team have already designed their first computerised growth model for iceberg lettuce.
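
The article does not describe the lettuce model itself, so as a rough, hypothetical illustration of what a computerised growth model can look like, the Python sketch below fits a logistic growth curve to made-up canopy-coverage measurements over a season; the data, parameters and predicted values are purely illustrative and not the team's actual model.

    # Illustrative sketch only: a generic logistic growth model fitted to
    # hypothetical canopy-coverage measurements. This is NOT the team's
    # actual iceberg lettuce model; data and parameters are made up.

    import numpy as np
    from scipy.optimize import curve_fit


    def logistic(t, k, t_mid, rate):
        """Logistic growth: carrying capacity k, midpoint t_mid, growth rate."""
        return k / (1.0 + np.exp(-rate * (t - t_mid)))


    # Hypothetical observations: days after planting vs. fraction of plot covered.
    days = np.array([5, 15, 25, 35, 45, 55, 65], dtype=float)
    coverage = np.array([0.02, 0.06, 0.15, 0.35, 0.60, 0.78, 0.85])

    params, _ = curve_fit(logistic, days, coverage, p0=[0.9, 35.0, 0.1])
    k, t_mid, rate = params

    # A fitted curve like this can be used to estimate growth stage and
    # harvest timing from new in-field measurements.
    print(f"fitted capacity={k:.2f}, midpoint day={t_mid:.1f}, rate={rate:.3f}")
    print(f"predicted coverage at day 72: {logistic(72.0, *params):.2f}")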

The group is particularly interested in working with farmers to provide user-friendly and highly accessible computerised solutions, which give key insights not only into the growth stages and health of crops, but also into irrigation timing and harvestable yield, for example.

In this way, farmers can gain a better understanding of how crops are performing each season, using real in-field data to more efficiently and effectively plan their infrastructure and optimise operations.

According to Dr Daniel Reynolds, “We believe the gap between agriculture and science is becoming less apparent with time. Previously, the cost of advanced agricultural technology and the level of knowledge required to operate such systems were significant barriers to its widespread adoption. 

“As devices such as smartphones and tablets become cheaper and more popular, it is now commonplace for breeders to be carrying a high-quality camera that is easy to use and connected to the internet.

“This exposure to and adoption of more general technology has enabled the agri-tech industry to become more savvy and build upon these existing and familiar technologies in order to monitor and analyse crops through image-based solutions.

“When low-cost, intuitive, and highly integratable systems are able to provide meaningful insights for farmers and growers, it is easy to appreciate their value in agriculture.”

More than meets the eye.

Where once a scrupulous human eye was the best tool to keep track of crop diseases, growth and marketable yield, now we are increasingly looking to technology both in the field and in the skies.

This is important for genomics research as well as agriculture; it’s all well and good knowing the sequence of x, y and z, but without any context, genetic elements are just that: a bunch of letters.

Therefore, along with the increasing need to analyse large sets of genomic data, we also have to step up our efforts to identify the traits caused by changes in DNA sequence or by the environmental conditions surrounding a crop plant.

This is where phenomics comes into its own.

Phenomics allows us to see the whole picture, literally. By detailing precisely what changes occur to a plant based on genetic mutations and environmental changes, we have a much greater understanding of biological mechanisms as a whole.

Just as we have had to step up our game, technologically speaking, to cope with the abundance of information encoded in genomes, we also have to drastically improve the way we capture and process images in order to keep pace.

There is far more going on in a plant than is obvious to the human eye, and the incredible scale on which we practise agriculture requires us to improve both our capacity for analysing plants and the computer algorithms that help us find meaning in all of the resulting information.

Here, again, the Plant Phenomics group at EI can play a role.

CropQuant.

Already, companies such as Agrosmart in Brazil are coming up with innovative and efficient systems to help monitor crops in the field, providing farmers with real-time information about what their plants require throughout the growing season, including when to irrigate, for example.

However, harnessing the benefits of the high-performance computing cluster at the Earlham Institute, the Plant Phenomics group is taking things one step further.

With CropQuant, the group is combining in-field monitoring of environmental conditions with automated data capture, which will allow farmers to analyse how their crops are growing throughout each growing season.

The technology utilises common and affordable tools, such as the Raspberry Pi, combining them with high-resolution digital sensors and RGB cameras which can capture high-quality, high-frequency data, including images.
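
As a rough sketch of what such a low-cost capture node could look like (and not the actual CropQuant software), the Python example below uses the standard picamera library to take timestamped still images at a fixed interval and log them alongside environmental readings; the read_environment() stub, file names and interval are assumptions for illustration.

    # Illustrative sketch only: a minimal Raspberry Pi time-lapse capture loop
    # of the kind an in-field monitoring node might run. Not the CropQuant code.

    import csv
    import os
    import time
    from datetime import datetime

    from picamera import PiCamera  # standard Raspberry Pi camera library


    def read_environment():
        """Placeholder for real sensor reads (e.g. temperature, humidity)."""
        return {"temperature_c": None, "humidity_pct": None}


    def run(capture_interval_s=1800, image_dir="images", log_path="environment.csv"):
        os.makedirs(image_dir, exist_ok=True)
        camera = PiCamera(resolution=(2592, 1944))  # full still resolution (v1 module)
        time.sleep(2)  # let exposure and white balance settle

        with open(log_path, "a", newline="") as log_file:
            writer = csv.writer(log_file)
            while True:
                stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
                camera.capture(f"{image_dir}/plot_{stamp}.jpg")

                env = read_environment()
                writer.writerow([stamp, env["temperature_c"], env["humidity_pct"]])
                log_file.flush()

                time.sleep(capture_interval_s)  # e.g. one image every 30 minutes


    if __name__ == "__main__":
        run()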

Images are processed using advanced computer vision analysis; the resulting data on traits such as crop height, greenness, growth rate and lodging risk are then combined with environmental information to generate models that can tell breeders and farmers just how their crops are being affected by conditions in the field at each and every stage of a season.
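
To give a hint of how one such image-derived trait can be computed (a generic approach, not the group's actual pipeline), the sketch below estimates canopy coverage and mean greenness from a single RGB image using the Excess Green index (ExG = 2G - R - B) and Otsu thresholding with OpenCV; the file name is hypothetical.

    # Illustrative sketch only: derive a simple "greenness"/canopy-coverage
    # trait from an RGB field image. Generic computer vision, not CropQuant's
    # actual analysis.

    import cv2
    import numpy as np


    def canopy_greenness(image_path):
        bgr = cv2.imread(image_path).astype(np.float32)
        b, g, r = cv2.split(bgr)

        # Excess Green index highlights vegetation against the soil background.
        exg = 2 * g - r - b

        # Normalise to 0-255 and threshold with Otsu to get a vegetation mask.
        exg_norm = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        _, mask = cv2.threshold(exg_norm, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

        coverage = float(np.count_nonzero(mask)) / mask.size           # fraction of canopy pixels
        mean_exg = float(exg[mask > 0].mean()) if mask.any() else 0.0  # average greenness of canopy
        return coverage, mean_exg


    if __name__ == "__main__":
        cov, greenness = canopy_greenness("plot_20170621_120000.jpg")
        print(f"canopy coverage: {cov:.2%}, mean ExG: {greenness:.1f}")

Tracked over a sequence of timestamped images, a measure like canopy coverage also yields a growth-rate curve that can be related back to the logged environmental conditions.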

CropQuant provides a robust and affordable way to glean invaluable data which can truly help to improve yields and better inform agriculture as a whole. As the group themselves point out, “manual crop monitoring is time-consuming, expensive and greatly subject to human error, the burden of which can be reduced significantly by the automated devices we are developing.”

What's next in store?

We asked Dr Reynolds to sum up by telling us what is next in store for these varied applications:

“As this is a work in progress, we plan to continue developing our crop analysis systems with a focus on improving accuracy, reliability, and scalability, in order to provide growers with increasingly valuable insights.

“Furthermore, we will continue to investigate other innovative and exciting technologies, such as AirSurf, an automated aerial imagery analysis application based on machine learning algorithms, and our lab-based germination quantification system SeedGerm, which can be incorporated into our systems to add further significant commercial value for the end user.”


Link to the original article:

http://www.earlham.ac.uk/articles/phenomenal-phenomics

 

