I started writing about the case for smarter tech in cattle and sheep breeding programmes a while ago, and quickly realised there was more than would fit in a single blog post. In my previous post I described how technology can help us increase the pace of genetic progress, by:
- Measuring what we can’t easily see, including a range of important traits such as efficiency, emissions and disease resistance; and
- Reducing errors in recording and transcription.
There are two further areas where smart technology can help – and already is:
Cut collection costs
How much does it cost to collect a phenotype for an animal?
A phenotype is the set of observable characteristics of an individual that result from both its genotype and its environment. When you collect sufficient phenotypic information about many individuals, you can adjust for environmental effects and predict the genetic merit of each animal. Collecting phenotypes involves measuring details such as calving ease, weight gain, progeny survival and other characteristics of economic value to farmers and consumers.
Current technology for DNA analysis tells us more about the genetic similarities and differences between animals (their genetic relatedness) than about their outright performance. There are relatively few simple gene interactions you can measure with a DNA test.
What DNA testing helps with is the challenge of recording which animals are related to each other, especially as pedigree recording gets harder with larger sets of animals. For the DNA test to be useful, it must be able to be statistically connected to animals that have had phenotypes measured.
As Dr Mike Coffey of SRUC wrote in 2011, “In the age of the Genotype, Phenotype is king!”
Phenotyping will continue to be required. Livestock breeders will need a pool of accurate and comprehensive phenotype observations to support both new and existing breeding objectives, connected to the growing pool of genotyped animals.
Measuring animals is time-consuming and expensive: observing and measuring their characteristics, linking these to the correct animal records, and transcribing the observations into computer systems is tedious work.
With the need for ongoing, and potentially more detailed phenotype recording, technology is our friend. From automated weighing and electronic identification to cameras, sensors, and deep learning, properly configured and managed technologies reduce the cost of data collection – especially for larger operations and at commercial scale.
We’ve helped several organisations organise and automate their phenotype recording, integrating electronic identification and a range of other technologies, and streaming that data back to central databases.
Better buying behaviour
What really drives buying decisions?
Ultimately livestock breeding is driven by demand. Livestock breeders focus their selection efforts on traits that drive economic performance such as:
- Birth weight and survival;
- Feed conversion efficiency and growth rates or milk production;
- Carcass characteristics or milk characteristics;
- Docility, resistance or resilience to disease;
and more. In the livestock industry, this is presented as the predicted benefit to future generations (typically termed EPDs or EBVs, depending on your species, breed, and country). These EBVs or EPDs are combined into economic indexes, where each trait is given a $ or £ weighting based on its impact in a typical farm system, and the results are usually delivered to buyers as tabular reports and graphs.
And what do buyers select on?
- In our experience, New Zealand dairy farmers make great use of the primary economic index – Breeding Worth or BW, combining that with decisions about breed and gestation length.
- In the sheep industry, breed and breeder are often the primary deciding factors. Once a breeder is selected, the numbers are used to buy the best animals one can afford.
- In the NZ and Australian beef industry, an expert tells me that for animals sold at auction, the closest purchase price correlation is with liveweight: in their opinion, animals are often valued on how big they are.
When sheep and beef farmers are presented with multiple EBVs and indexes, it can be overwhelming, and it is not surprising some farmers revert to assessing a ram visually, negating the value of genetics programmes.
I’m a great fan of Daniel Kahneman’s Thinking, Fast and Slow, where he points out the challenge we have with quantifying difficult or unknown measures.
“If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution.” – Daniel Kahneman, Thinking, Fast and Slow.
If understanding EBVs and indexes seems difficult, you’re likely to substitute an easier question – how good the ram looks, or how knowledgeable the breeder seems – without even realising you have done so.
How can technology help here? We can use technology to make ourselves smarter – or to make the questions simpler. For instance, there’s potential to create a data-driven system that analyses your current farm system and recommends which economic index or other measure you should use when choosing animals. Or perhaps a tool that, with a few simple selections, works out how much you should spend on rams or bulls, finds those at a sale that meet your needs, and returns a ranked short list to support your final decision.
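At its very simplest, such a buying tool would just rank the animals a buyer can afford by their chosen economic index. A minimal sketch of that core step – the catalogue structure, field names, index values and prices here are entirely hypothetical:

```python
from dataclasses import dataclass

@dataclass
class SaleAnimal:
    tag: str
    index_value: float   # chosen economic index (e.g. $ per breeding ewe)
    asking_price: float

def shortlist(animals, budget, top_n=5):
    """Return the top-ranked animals within the buyer's budget."""
    affordable = [a for a in animals if a.asking_price <= budget]
    ranked = sorted(affordable, key=lambda a: a.index_value, reverse=True)
    return ranked[:top_n]

# hypothetical sale catalogue
catalogue = [
    SaleAnimal("R101", 14.2, 1800),
    SaleAnimal("R102", 18.9, 3200),
    SaleAnimal("R103", 16.4, 2100),
]
print([a.tag for a in shortlist(catalogue, budget=2500)])  # → ['R103', 'R101']
```

A real tool would of course go further – weighting the index to the buyer’s own farm system, as suggested above – but even this simple substitution replaces “which ram looks best?” with “which affordable ram ranks best?”.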
Smarter technology for livestock buyers is a key area where the industry can make real progress on the rate of genetic gain, and a strong understanding of how people make decisions will be critical to its success.
Is bespoke development always the expensive option?
It is hard to think of a business today that does not rely on software to improve efficiency, ensure compliance, or reach customers. As businesses grow, the opportunity to encapsulate functions and processes within an information system increases. Indeed, your software may define your business success.
One of the first questions our customers ask is ‘what off the shelf solutions are there?’ Often there are candidates, and the next question is ‘will this off the shelf solution meet my needs and provide the cheapest option?’
Assessing potential candidates is not easy – in reality, it’s only once you are using the software that you can evaluate whether it really fits your requirements. Making the wrong choice causes disruption, demoralises staff, and makes you look bad. Assessing candidate software is not just ticking features off a checklist. It is about how the software will help grow your business: how it fits the way you do business, how usable it is, and how it will evolve to meet your changing needs.
There are times when it is also worth considering custom or “bespoke” software developed precisely for your needs, rather than what is available off the shelf.
Let’s look at four compelling reasons for bespoke software:
- The value of any software can be measured by uptake, whether by your customers or internally. While customer uptake is easier to measure, low internal uptake may look like staff being lazy or needing a lot of training – when in fact the software may simply be hard to use. The greatest chance you have of achieving a solution that exactly meets your needs is by building it that way.
Workflow – your software should reflect the efficient way you wish to manage your business. As Alan Cooper, the creator of Visual Basic, described, software should take away the pain. The worst case is an off the shelf solution that simply does not fit your workflow – and forces your business to needlessly change instead.
Functionality – build only what you need and make the experience satisfying to users: great software should be a delight to use. Feature-laden off the shelf solutions add no value if those features are not used. In fact, they reduce value, as users must navigate a mess of irrelevant user interface. As a subscriber to an off the shelf solution you are paying for all of those features, even if the cost is spread over many more subscribers.
- Software innovation can turn problems into unique solutions.
Software not only has the potential to increase the efficiency of your current business processes; it can analyse and solve challenges and connect people, turning problems into solutions unique to you and your customers. Indeed, businesses today have used bespoke software to create solutions that are the reason for their phenomenal growth – something they have not achieved with the off the shelf software their moderately performing competitors use.
- Your business needs and opportunities should drive your development roadmap.
In our software development business, we use agile processes and lean development to ensure you are buying the functionality that has the highest priority for your business growth. You are in control. Providers of off the shelf products must meet the needs of a diverse user group. If you can see the provider’s roadmap, you may be surprised just how long you will wait for your features; at worst, your highest-priority requirements may not be on the roadmap at all. In that case you will need the provider to undertake bespoke development – often at their convenience, and with little incentive for their price to be competitive.
- The very process of planning and undertaking bespoke development can have wider benefits, prompting a review of your processes and leading to innovation in the way you do business.
When we develop software, we use Design Thinking to understand the business goal and innovate with you – not just with the software but also the interaction with people, processes, hardware and data. As a result, your unique world view and people can unleash transformations well beyond the software.
This is not an exhaustive list of the benefits of bespoke development, but it focuses on four areas that are often overlooked or undervalued. In summary, bespoke development can be the best option where uptake is essential, where innovation can lead to business growth, and where a business wants a competitive solution.
My first job in livestock performance recording was with the Genetics Section, as it was called, at Ruakura Research Centre in New Zealand. I worked part time while studying at university, transferring research trial data off the government mainframe on reel-to-reel tape, and writing inbreeding coefficient calculation software.
The genetics section was based in an old converted house, where we sat around at large, wooden, public service desks, surrounded by high stacks of computer printouts, all painstakingly bound and labelled for future use. We were the leading edge of genetic improvement and livestock performance recording.
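The inbreeding coefficient software mentioned above implemented ideas that still apply unchanged today: Wright’s inbreeding coefficient falls directly out of the classic tabular method for the numerator relationship matrix. A minimal sketch – the pedigree format (animal → (sire, dam), parents always listed before offspring) is hypothetical:

```python
def relationship_matrix(pedigree):
    """Tabular method: pedigree maps animal -> (sire, dam), parents first."""
    ids = list(pedigree)
    idx = {a: i for i, a in enumerate(ids)}
    A = [[0.0] * len(ids) for _ in ids]
    for i, animal in enumerate(ids):
        s, d = pedigree[animal]
        si, di = idx.get(s), idx.get(d)  # None for unknown parents
        # relationship to each older animal: half via sire plus half via dam
        for j in range(i):
            a_j = 0.0
            if si is not None:
                a_j += 0.5 * A[j][si]
            if di is not None:
                a_j += 0.5 * A[j][di]
            A[i][j] = A[j][i] = a_j
        # diagonal: 1 plus half the relationship between the parents
        A[i][i] = 1.0 + (0.5 * A[si][di] if si is not None and di is not None else 0.0)
    return ids, A

def inbreeding(pedigree):
    """Wright's inbreeding coefficient: diagonal of A minus one."""
    ids, A = relationship_matrix(pedigree)
    return {animal: A[i][i] - 1.0 for i, animal in enumerate(ids)}

pedigree = {
    "S": (None, None), "D": (None, None),  # unrelated founders
    "A": ("S", "D"), "B": ("S", "D"),      # full siblings
    "X": ("A", "B"),                       # offspring of a full-sib mating
}
print(inbreeding(pedigree)["X"])  # → 0.25
```

The tape-and-mainframe workflow has gone, but the mathematics has not.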
That was nearly thirty years ago of course, and the face and capability of modern technology have radically changed. Interestingly, however, many practices in the livestock recording industries still reflect that past golden age, and it is only recently that the software tools and databases of – let’s be generous and say – 15 years ago have started to be refreshed.
In this, the first of two articles about technology in livestock breeding, I propose that we could make much more effective use of smart technologies to increase the rate of genetic progress and address commercially important, but hard to measure, animal characteristics. In my next post, I’ll examine how technology could reduce the cost of phenotype collection (I might even explain what a phenotype is), and encourage better use of improved genetics by commercial producers.
Measure what you can’t see
In our traditional performance breeding tools, we focused on things that farmers could readily measure: kilograms and counts – numbers of live progeny, and kilograms of liveweight, milk, and wool. The good news is that most of those production traits are heritable, and we’ve made good progress over the last 30+ years.
So how do you measure characteristics that are important in modern farming systems?
- Meat eating quality, so that consumers can repeatably have a great eating experience;
- Feed conversion efficiency, converting inputs into product more efficiently, reducing greenhouse gas emissions per unit of product, and making the farming system more profitable;
- For that matter, greenhouse gas emissions (where this is driven by livestock genetics rather than inoculation by a specific set of gut microorganisms);
- Urine nitrate concentration, and hence one key environmental impact of extensive livestock farming;
- Disease resistance and the response of animals to a variety of disease and parasite challenges;
- Behaviour of animals around people and other livestock, including how they handle stressful environments such as being moved; and
- Longevity, the ability of female animals to raise progeny season after season, reducing the substantial cost of replacement animals.
There are proxies for many of these measures of course. Breeding for growth rates or milk production has arguably improved greenhouse gas efficiency, for example, but in some breeding systems a change in the mature weight of animals has increased emissions. Progeny tests and laboratory measures have been used in key programmes, but they may not help us routinely identify the genetic outliers that will lead the next leap in genetic progress.
New measurement and sensing technologies offer real potential to help with these “hard to measure” areas of animal performance in the coming years. Accelerometer and microphone technologies can identify individual animal eating habits, heats and parturition (birth) dates. 3D and multispectral cameras tell us about carcass and meat product characteristics, and additional characteristics of milk. Increasingly, this data will be collected in-line or in near-real-time, providing a rich stream of data that could be analysed for many purposes.
The next generation of animal recording and genetic analysis systems must be built to handle this variety of real-time, streamed data – or at least the results of analysing it.
Fewer errors, more progress
A primary driver of any livestock recording and animal evaluation system is to enable breeders and commercial producers to make better decisions about the animals they use in breeding. Computers don’t select animals: people do. Where a producer chooses an animal because they like the look of its eyes, or its stance, or its colour, and ignores the potential impact of the animal on their herd, the results will be at best random, and often detrimental.
Formal breeding schemes with EBVs and indexes seek to inform better decisions about the breeding merit of animals, but EBVs can be limited by the information available:
- Errors in recording parentage and animal relationships;
- Incorrect allocation of records to the wrong animals;
- Transposition and recording errors when capturing data; and
- Failing to account for the impact of environmental effects such as the feeding and management regimes of groups of animals, the age of the mother, or whether an animal was reared as a single or twin.
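The last of these – environmental effects – is the one most amenable to simple illustration. As a toy example only (real evaluations fit these effects within a mixed model rather than by plain subtraction), expressing each record as a deviation from its contemporary (management) group mean strips out the environment the group shares; the animal identifiers, groups and weights below are invented:

```python
from collections import defaultdict

def group_deviations(records):
    """records: list of (animal_id, group_id, value) tuples."""
    totals = defaultdict(lambda: [0.0, 0])
    for _, group, value in records:
        totals[group][0] += value
        totals[group][1] += 1
    means = {g: s / n for g, (s, n) in totals.items()}
    # each animal's record expressed relative to its own management group
    return {animal: value - means[group] for animal, group, value in records}

weights = [
    ("A1", "mob1", 32.0), ("A2", "mob1", 36.0),  # well-fed mob
    ("A3", "mob2", 28.0), ("A4", "mob2", 30.0),  # harder hill country
]
print(group_deviations(weights))
# → {'A1': -2.0, 'A2': 2.0, 'A3': -1.0, 'A4': 1.0}
```

On raw weights, A2 beats A4 by 6 kg; adjusted for group, the gap is genetic signal of only 1 kg – which is exactly why mis-recorded management groups distort EBVs.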
Technology is playing a substantial role in improving the accuracy of EBVs, notably through genomic DNA analyses resolving the fraught process of parentage recording and contributing substantially more information earlier in each animal’s life-cycle. Better facilitation and handling of genomic data collection is well overdue in animal recording systems, and I’m pleased to see this being addressed.
In addition to genomics, electronic identification (EID) and automated recording systems can remove many identification and data capture errors, and the ability to feed this data seamlessly into modern evaluation systems, without manual data manipulation, will provide another leap forward.
Recording management groups properly has been a real limiting factor in many breeding programmes, and is one of the key hesitations in extending these to commercial producers. I believe that sensors that identify eating and movement behaviours, and location or proximity to other animals, will help us to automatically and transparently solve the problem of recording management groups and regimes. This will provide another substantial step forward in removing the noise of environmental effects.
Of course, more accurate EBVs are still only a piece of the puzzle. Helping producers to use this information effectively is another, and something I’ll address in my next post.
Rezare Systems is a bespoke software design and development company specialising in the agriculture sector. We have special expertise in building livestock recording and management systems, and tools for data collection and integration. Learn how Rezare Systems can assist your business.
Last week the UK Agriculture and Horticulture Development Board (AHDB) announced an industry consultation to develop a set of principles (code) to promote the sharing of farm data. Happily, we at Rezare UK have been awarded the contract to run this project based on our unique agridata expertise and our significant experience in developing a code in NZ.
Improving the flow of data from farms to other organisations is seen (rightly) by the AHDB as part of the productivity agenda for UK agriculture, but there remain significant barriers to getting data flowing in practice, mainly because of issues around trust and the interoperability of disparate sets of data.
While the code will go some way towards addressing issues of trust (and start to build alignment across industry on best practice when it comes to sharing and using farm data), other issues beyond the code itself will also need to be addressed, particularly the more technical aspects of exchanging and using the data.
Two really good examples of dealing with this have emerged in the past couple of years – DataLinker in NZ and Agrimetrics in the UK. These two approaches (the latter is one of four UK government agritech centres of excellence), while quite different in nature (and to a degree in objectives), are potentially very complementary.
DataLinker works on a model where no one party becomes the single repository and broker of farm data. Instead, data owners build APIs to standardised schemas – once only – so that permissioned third parties can access that data in a known way. The exchange of data between its owner and its user (“consumer”) is a bilateral relationship, with DataLinker providing the permissioning (tokens) and legal frameworks (templated agreements) to streamline and standardise the process.
DataLinker assumes that each potential system is in fact its own “locker” (store of data) with one or more types of data. Users of some sort already interact with those systems, so what DataLinker does is standardise the way of finding which systems have which types of data (the findable F in FAIR data sharing) and in which formats (the interoperable I in FAIR). It specifies the method by which organisations agree data access rules and users provide permission (together, the accessible A of FAIR), with the result that the data is reusable (the R in FAIR). DataLinker has been focused more on the farmer or user-facing sharing of data than for broad data access necessary for researchers for example (at least without organisations explicitly addressing this).
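To make the bilateral exchange concrete: a consuming system calls the data owner’s own API directly, presenting a permission token and asking for the standardised schema. The sketch below is purely illustrative – the URL, bearer-token flow and media type are assumptions for this example, not the actual DataLinker specification:

```python
import json
import urllib.request

def build_request(api_url, access_token):
    # the data owner's own API is called directly; no central hub holds the data
    return urllib.request.Request(
        api_url,
        headers={
            "Authorization": "Bearer " + access_token,  # token granted under a templated agreement
            "Accept": "application/ld+json",            # request the standardised linked-data schema
        },
    )

def fetch_permissioned_data(api_url, access_token):
    with urllib.request.urlopen(build_request(api_url, access_token)) as resp:
        return json.load(resp)
```

The important design point is visible even in this sketch: the consumer needs only the schema and a token, never a copy of the owner’s database.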
Agrimetrics in the UK employs the semantic web whereby publicly available data (published on the web) or private data made available under a licence agreement is organised according to a Resource Description Framework (RDF). Each data entity is described as a “triple” (subject-predicate-object) and in that way stored data becomes machine readable by being linked to other data entities. The data contributed is effectively “held” by Agrimetrics and then exposed through APIs (charged or free) under licence for third parties to use.
Agrimetrics is focused on big data, using the semantic web to tag or structure large datasets in public HTML documents (and other data) in a way that makes them machine recognisable and readable.
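The triple idea is easy to illustrate. The toy Python below mimics a triple store with plain tuples – the entity and predicate names are invented, and a real semantic-web system would use URIs for each term and a query language such as SPARQL:

```python
# subject-predicate-object "triples": each fact links two entities
triples = [
    ("farm:Greenhill", "hasField", "field:F12"),
    ("field:F12", "grows", "crop:WinterWheat"),
    ("crop:WinterWheat", "harvestedIn", "2017"),
]

def objects(subject, predicate, store):
    """Follow one link in the graph: all objects of (subject, predicate)."""
    return [o for s, p, o in store if s == subject and p == predicate]

# chaining links makes the data machine-navigable:
# what does the farm's field grow?
field = objects("farm:Greenhill", "hasField", triples)[0]
print(objects(field, "grows", triples))  # → ['crop:WinterWheat']
```

Because every fact is just another link, new datasets can be joined to existing ones without redesigning a schema – which is the appeal of the linked-data approach.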
In essence the two approaches can be differentiated thus:
- DataLinker is a network approach – a set of protocols and standards that allow myriad parties to exchange and share data in multiple bilateral (albeit mostly templated) arrangements through standardised APIs.
- Agrimetrics is a hub approach – data is shared to the Agrimetrics “centre”, where it is stored, manipulated and interpreted before being shared as a more user-friendly asset under licence through APIs.
In many ways Agrimetrics is the more comprehensive since it seeks not only to broker data exchange but also to add value to the data by linking it and manipulating it to meet a particular consumer’s need. It can handle structured or unstructured data. This is potentially very powerful as it allows a consumer of the data to draw on Agrimetrics’ technical know-how and capacity to do increasingly clever and machine-learning based activities with the data. In other words, Agrimetrics can offer a one-stop-shop for brokering and adding value to data.
However, there are also problems with the approach. It assumes a high degree of integrity and legal rigour being exercised by Agrimetrics since the data sharers are effectively “letting go” of their data to be stored and used by an organisation that is looking to commercialise it. And in the absence of private data holders being prepared to release data, Agrimetrics is only as good as the publicly available (web published) data.
DataLinker does not (and is not intended to) become involved in negotiating commercial deals to share data. Nor does it become involved in managing, manipulating or interpreting the data. It is largely a hands-off approach designed to facilitate the network, not control it. But the adoption of the standardised schemas means there is an IT burden on the data sharers – either in-house or outsourced – to build compliant APIs.
So is one approach likely to prevail? Most likely not and it’s actually preferable for the two to co-exist and complement each other. Here’s why:
- First, because culturally the DataLinker approach is more aligned to putting the interests of the farmer first, and right now farmer trust in how their data is controlled and used is becoming perhaps the biggest blocker to progress;
- Second, because it is unlikely the industry will want to have all its eggs in one basket;
- Third, because the horsepower in Agrimetrics is potentially a game changer in releasing real innovation based on farm data, and thus in demonstrating to farmers the value proposition of sharing their data (another missing piece of the sharing jigsaw);
- Fourth, because the DataLinker approach, through its JSON-LD APIs, means data can be “readied” for consumption in a semantic way, complementing Agrimetrics; and
- Fifth, because the semantic web is likely to be a long-term approach favoured particularly by the research community within the agrifood sector.
There are other concepts for farm data sharing that are being considered around the globe.
For example, Wageningen University in the Netherlands has proposed a Farm Data Train which effectively creates a number of data lockers (stores), all with the same API and approach to authorisation, which means their interfaces in effect align closely to what is proposed in DataLinker. At present this concept is focused more on plant breeding data but it could easily grow outwards.
So what’s my point? Well, as can be seen, there is more than one way to skin the proverbial cat. What’s important is for the sector to give these approaches space to breathe, so that there is greater opportunity for innovation to deliver against the productivity agenda. That will take collaboration and collaborative thinking, and in the coming months we shall discover how the UK agri sector wants to address these issues.
It’s a great time to be involved in agridata and better still that Rezare are in the thick of shaping the future.
Information is the life-blood of today’s businesses and will enable the transformations occurring in the agriculture and food business sector. Historically, information has only been exchanged between businesses at the transactional level (such as shipping notices and invoices), while richer data that could differentiate products, demonstrate environmental compliance, and optimise business value has remained isolated in silos.
DataLinker is designed to give agricultural businesses (farmers, processors, input suppliers and advisers) the ability to access and combine data from multiple sources in flexible, and timely ways, without requiring many hours of skilled technical resource to carry out data exports and imports.
Integrating and effectively sharing data looms large for many businesses, so companies are investing in their own development and infrastructure – and encountering the challenges: data standardisation, supporting different interfaces for each partner organisation, and the time taken to negotiate data access agreements. DataLinker addresses these issues.
What is DataLinker?
DataLinker is a framework for agriculture and food businesses that wish to interchange data. In many ways it is analogous to the GS1 frameworks used to interchange shipping notice and invoice data, or the Ag Gateway framework used in the grain supply space. DataLinker’s primary focus was to allow farmers to bring data from a variety of sources into the tools they use for decision making, but it can be equally beneficial to all companies in the sector.
DataLinker consists of four major components that work together:
- Data exchange specifications (“schemas”) that standardise sets of data using the Farm Data Standards and modern internet protocols (developed collaboratively with the input of member companies);
- A small central registry where companies can discover which organisations implement each specification and how these are accessed;
- Standardised contract terms that can be used to reduce negotiating time and legal costs in the majority of data interchanges; and
- Technical tools to support secure agreement of data access terms, approval of access, and (where necessary) farmer permission for individual farm data sets.
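To illustrate the registry component above: discovery amounts to asking which organisations implement a given schema, and at what endpoint. The registry structure, schema identifiers and organisation names below are invented for illustration only:

```python
# a hypothetical, simplified view of registry entries
registry = [
    {"org": "MeatCo",   "schema": "animal-movements-v1", "endpoint": "https://api.meatco.example/movements"},
    {"org": "DairyHub", "schema": "milk-production-v1",  "endpoint": "https://api.dairyhub.example/milk"},
    {"org": "FarmSoft", "schema": "animal-movements-v1", "endpoint": "https://api.farmsoft.example/moves"},
]

def providers_of(schema, registry):
    """Discover every organisation implementing a schema (the discovery step)."""
    return [(e["org"], e["endpoint"]) for e in registry if e["schema"] == schema]

print(providers_of("animal-movements-v1", registry))
```

Once a provider is discovered, the actual data exchange happens directly between the two parties – the registry never sees the data itself.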
DataLinker is not a database, nor a central communications hub through which all data might pass.
All of the DataLinker specifications and framework components are based on internet standards, and companies are responsible for implementing the specifications in their own IT systems, although support is provided.
How does the commercial model work?
DataLinker Limited has been incorporated as a separate entity to operate the DataLinker registry and support the collaborative development of standardised API specifications for areas where its users direct. The board of directors comprises representatives from Beef+Lamb NZ, DairyNZ, MPI, and an independent chair appointed by DataLinker’s subscribing members. DataLinker Limited operates effectively as a not-for-profit to encourage adoption and benefits for the agricultural community.
There are no transaction fees.
Members pay a joining fee of $6,000 NZD (waived for New Zealand organisations prior to 31 May 2018), and an annual subscription of either $3,500 (for organisations either providing or consuming data), or $4,500 (for heavier users both providing and consuming data). These fees are analogous to membership of a standards organisation, supporting the operation of the registry and collaborative maintenance of specifications.
If you’re in the food business (whether that’s retail, food service, processing, farming, or supply), consumers are asking questions about your supply chain.
Of course, they may not be asking you directly, and they may not be asking your retail or food service partner, but they are asking: on social media, on recommendation sites such as TripAdvisor and Yelp, even over drinks at their local.
Are you providing the information they need to be confident about the quality and safety of your product? Do you have a substantiated story around provenance, animal welfare and the environment?
Safeguards such as DNA testing lasagna are “bottom of the cliff” activities, an attempt to rebuild broken trust and arguably too limited and late in the supply chain.
Future product preference and even acceptance relies upon a supply chain that can show ethical practices: in how environmental impacts are managed, natural biodiversity is encouraged, animal welfare is maintained, anti-microbial resistance is avoided, and workers and communities are treated.
Activist groups and the power of social media mean that our response to these demands must be much more solid than a promise or a declaration form. We must have the systems and measures to back up our words – and to demonstrate as much to auditors and our supply-chain partners.
For those of us at the confluence of technology and agriculture, this means we must do more than just record activities and calculate gross margins. We must step up with tools that capture rich data in support of farming activities, and which actively encourage good decisions that improve both profitability and sustainability.
All this needs to be done with minimal additional effort by farmers and their staff, and aligned to real-world processes on farm.
I’ll be speaking at MobileTech 2017, the annual summit for technology innovations in the primary sector, reflecting on these challenges. I’ll summarise some of the work Rezare Systems is doing in this space, and suggest ways the industry could apply technology to the opportunity.
This article was first published at www.rezare.com/blog
Increasing numbers of farmers see technology as useful and important to their farming businesses, and farmers are looking to invest further in new technology over the coming years. Despite this, lingering concerns about data sharing, privacy and control remain.
According to the October 2016 Commonwealth Bank of Australia Agri-Insights Survey of 1600 Australian farmers, 70% of farmers believe that the digital technology available adds significant value to their businesses.
The Ag Data Survey published by the American Farm Bureau Federation (AFBF) also found that farmers are optimistic about technology, with 77% of farmers planning to invest in new technology for their farms in the next three years.
Farmers also see value in sharing and re-use of data, but privacy and control are the largest barriers to more widespread re-use.
The Agri-Insights Survey found that:
- 76% of farmers think that there is value in sharing on-farm production information with others;
- 58% of farmers currently share some on-farm production information with others; and
- Among farmers who don’t see value in data sharing, “privacy concerns” (28%) was the most common reason.
The New Zealand Office of the Privacy Commissioner surveyed New Zealanders about privacy and their attitudes to data sharing in April 2016. They noted that:
- 57% of respondents were open to sharing data if they could choose to opt out;
- 59% were open to sharing if there were strict controls on who could access data and how it was used; and
- 61% were open to sharing if the data was anonymised and they couldn’t be personally identified.
The US AFBF survey also highlighted some of these concerns in an agricultural context:
- Only 33% of farmers had signed contracts with their ag-tech provider. Another 39% knew of their provider’s policies but had not signed anything;
- When farmers were asked if they were aware of the ways in which an ag-tech provider might use their data, 78% of farmers answered “no”; and
- 77% of farmers were concerned about which entities can access their farm data and whether it could be used for regulatory purposes.
Not just farmers
Confidentiality and control can be barriers to companies too. After all, much of the data is about their activities, products, or equipment as well as the farm itself.
It’s not always clear how other parties will behave when sharing data. Organisations generally make reasonable and effective use of data and meet confidentiality expectations, but there is always a risk that they won’t. So companies sharing data are forced to negotiate “iron-clad” agreements, keeping the corporate lawyers busy and making any new data exchange the subject of long-winded negotiations.
As soon as you get into negotiations like this, costs rise. If one of the parties is a smaller player with less negotiating power (company or farmer), they may never be able to conclude a useful data access deal. The end result? A slower rate of innovation, benefits to the farmer and the overall supply chain that are never fully realised, and data sharing that becomes far more expensive than it needs to be.
Over the years, industry players have experimented with different ways to address these issues. Centralised industry-good databases and exchanges have been proposed, and these could be very effective. Unfortunately, concern about centralising large amounts of data, and the loss of control that this brings, has led players to hold back some or all of their data from such repositories.
Other groups have posited that all data should be in the exclusive control of the farmer, and have built exchanges or created open API standards on that basis. We applaud this, but it doesn’t always reflect the significant effort that companies and service providers invest in creating and curating some data sets. The end result is that some data sets are often held back from such exchanges.
A collaborative approach
The New Zealand primary industry has worked on several approaches to this problem in a collaboration between the red meat sector, the dairy sector, and the Ministry for Primary Industries.
The Farm Data Code of Practice is designed to encourage greater transparency between farmers and service providers or vendors about the data that is held, and the rights that each party has to the data. A straightforward accreditation process gives farmers confidence that organisations have “got their house in order” when it comes to terms and conditions and data policies.
The DataLinker protocol builds on the standardised, open API approach to sharing data, but with three key considerations:
- It provides a way for organisations to agree a Data Access Agreement without a protracted legal negotiation. Standard agreements are provided and encouraged, to reduce the overhead that all parties face in legal costs and time (that said, custom agreements are still possible where absolutely necessary).
- Accepting a Data Access Agreement doesn’t give the recipient “open slather” to the data; for most data sets, explicit farmer approval is also required, requested and confirmed by the farmer using standard web authorisation protocols. Farmers grant permission to access data that covers their business, and can also withdraw that authorisation.
- As an Open API approach is used rather than a central database or exchange, there is no “central service” that must be involved in each data transfer. This reduces the “attack surface” from a security perspective and enables organisations to retain control of the data they hold.
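The two-layer check described above – an organisation-level Data Access Agreement plus an explicit, revocable farmer authorisation – can be sketched in a few lines. This is an illustrative model only; the class and method names below are invented and are not part of the DataLinker specification.

```python
# Hypothetical sketch of a two-layer data access check: organisation-level
# agreement first, then explicit farmer authorisation for the dataset.

class DataRequestGate:
    """Grants data access only when both an organisation-level Data Access
    Agreement and an explicit farmer authorisation are in place."""

    def __init__(self):
        self.agreements = set()     # (provider, recipient) pairs with a signed agreement
        self.farmer_grants = set()  # (farmer, recipient, dataset) authorisations

    def sign_agreement(self, provider, recipient):
        self.agreements.add((provider, recipient))

    def grant(self, farmer, recipient, dataset):
        self.farmer_grants.add((farmer, recipient, dataset))

    def revoke(self, farmer, recipient, dataset):
        # Farmers can withdraw authorisation at any time
        self.farmer_grants.discard((farmer, recipient, dataset))

    def may_access(self, provider, recipient, farmer, dataset):
        # Layer 1: a Data Access Agreement between the organisations
        if (provider, recipient) not in self.agreements:
            return False
        # Layer 2: explicit, revocable farmer authorisation for this dataset
        return (farmer, recipient, dataset) in self.farmer_grants
```

In a real deployment the second layer would be implemented with standard web authorisation protocols rather than an in-memory set, but the decision logic is the same: the agreement alone is never sufficient.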
Organisations adopting the DataLinker protocols benefit in several ways:
- Farmers can see that the organisation is playing its part in maximising the use of information;
- Standardised APIs and Data Access Agreements reduce the time and money invested in negotiating and creating custom solutions for every interaction;
- Data Access Agreements mean that companies still retain the necessary control over high-value data sets, and are able to meet the privacy and confidentiality terms they have agreed with farmers; and
- Companies and farmers can efficiently use sets of data which otherwise might have been too expensive to collect, or required a level of farmer input which would have discouraged adoption.
Our hope is that this framework will help organisations and farmers to maximise use of farm information, reducing long-term costs and encouraging greater innovation.
What you can do about this:
- Want to see which New Zealand companies are accredited under the Farm Data Code of Practice? Check out www.farmdatacode.org.nz and drop an email to your key information providers to find out when they will be accredited.
- Interested in the DataLinker protocols and how they can be adopted by your business? You’ll find information at www.datalinker.org.
- Planning your strategy in this data space, or considering next steps? Talk to us – we’re happy to provide you with background and advice.
Can on-farm technologies and “big data” support food and fibre product attributes that consumers value?
Consumer research consistently highlights the product information shoppers most want to know, including:
- What ingredients are in the food or beverage product (64%);
- How a company treats animals used in its products (44%); and
- From where a company sources its ingredients (43%).
We call these informational aspects of a product “credence attributes”, meaning that they give credence to our decision to purchase (or not purchase) a product or service, but can’t be directly assessed from the product itself, either before purchase (on the basis of colour or feel) or after purchase (on the basis of taste, for instance).
Characteristics such as “organic”, “environmentally responsible”, “grass-fed”, and “naturally raised” relate to the story behind a product. A product may communicate these through advertising, packaging, and other ways of telling the product story.
But consumers are also looking for authenticity and integrity in their food and other products. There’s a consumer backlash when the product story on the pack is in conflict with other data sources – such as claims in news articles or secret video footage.
We’ve been exploring ways that feeds of data from on-farm technology could be used to support the product provenance and credence story – or at least signal to farmers and their supply chain partners where checks and improvements should be considered. Here are a couple of examples.
Monitoring carbon footprint
Carbon life-cycle assessments (LCAs) are used to understand the extent to which production, manufacture, and distribution of a product impacts on climate change through deforestation or release of greenhouse gases such as carbon dioxide, methane, and nitrous oxide. These assessments sometimes yield surprising results – for example, that shipping food products from the other side of the world can have a lower impact than growing them locally if the local environment is less hospitable.
Importantly, producing a life-cycle assessment creates a model – a series of equations and if-then logic that describes the calculation. We can use this model with appropriate local farm and supply chain data to understand how management decisions and activities, timing and stock or crop productivity impact on emissions.
Automated systems on farms that capture data about crop production, livestock weights and production, and farm activities can also deliver data for a custom life-cycle assessment. Benchmark that data across multiple farms and it becomes possible to identify patterns of complete versus missing data, to understand how climatic constraints change emissions, or to identify outliers that need closer examination.
A note of caution here: as we’ve learned from nutrient budgeting, farm systems can be varied and life-cycle assessment models are frequently based on the “typical”. An outlier result may indicate greater variation than the model can handle, rather than a more or less efficient farming system.
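As a minimal illustration of what such a model looks like – a few equations plus if-then logic – consider this toy emissions-intensity calculation. The emission factors and the feed-system adjustment are placeholders chosen for illustration, not published inventory values.

```python
# A toy life-cycle model in the spirit described above: equations plus
# if-then logic. Factors are placeholders, not published inventory values.

ENTERIC_CH4_KG_PER_HEAD_YEAR = 85.0  # placeholder enteric methane factor
CH4_TO_CO2E = 28.0                   # 100-year global warming potential of methane

def emissions_intensity(head_count, product_kg, feed_system):
    """Return kg CO2-equivalent per kg of product (toy calculation)."""
    enteric_ch4 = head_count * ENTERIC_CH4_KG_PER_HEAD_YEAR
    # If-then logic: assume a grain-supplemented system shifts enteric
    # emissions downward (an illustrative adjustment, not a measured one)
    if feed_system == "grain_supplemented":
        enteric_ch4 *= 0.9
    co2e = enteric_ch4 * CH4_TO_CO2E
    return co2e / product_kg
```

A real assessment would add nitrous oxide from fertiliser and excreta, embedded emissions in inputs, and processing and transport terms, but the structure – factors, conditional adjustments, division by output – is the same, which is why farm-captured data slots in so naturally.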
Demonstrating animal welfare
Animal welfare – the ability of animals to live healthy and natural lives – is another area of concern to consumers. Here too, metrics collected on-farm can be analysed automatically to demonstrate that good practices are followed.
In Europe where a premium is payable for “grass-fed” dairy in some regions, farmers are experimenting with the use of monitoring devices – smart tags and neck bands for example. These devices capture data that provide farmers with early warning of heats and potential animal health issues – raised temperatures, more or less movement, and reduced eating for example – but can also be analysed for patterns that only show up in outdoor grazing.
In other jurisdictions, veterinary product purchase, use, and reordering records can help to demonstrate compliance with animal health plans worked out between farmers and veterinarians, and hence demonstrate good welfare practices and appropriate use of medicines. Paper records have been used for this purpose for many years, but software technologies and automated data analysis can reduce the burden of data collection and the need for manual audits and analysis.
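An automated check of this kind might compare medicine-use records against the limits agreed in a health plan, flagging anything that warrants review. The function, data shapes, and product names below are invented for illustration; a real system would work from richer treatment records.

```python
# Hypothetical check of medicine-use records against an agreed animal
# health plan. Names and limits here are invented for illustration.

def compliance_flags(plan, treatments):
    """plan: {product: max_treatments_per_year}
    treatments: list of (product, count) records from purchase/use data.
    Returns human-readable flags for farmer and veterinarian review."""
    flags = []
    used = {}
    for product, count in treatments:
        if product not in plan:
            # Product used but never agreed in the health plan
            flags.append(f"{product}: not in the agreed health plan")
            continue
        used[product] = used.get(product, 0) + count
    for product, total in used.items():
        if total > plan[product]:
            # More treatments than the plan anticipated
            flags.append(f"{product}: {total} uses exceeds planned {plan[product]}")
    return flags
```

Note that a flag is a prompt for conversation, not a verdict: as the next paragraph argues, outliers and data errors mean these checks should inform review rather than determine compliance automatically.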
Some producers will find the thought of such automated systems invasive and potentially threatening. Certainly, given the potential for outliers, for good practices that just don’t quite fit the expected mould, and for technology glitches or human error, you couldn’t use these measures as legal baselines that determine “rights to farm”.
Nevertheless, application of technology and analytics such as these can help us as we seek to improve farming practice and improve the integrity of our food supply chains. A good starting point might be to apply these as tools for committed producer groups that are already aligned with supply of a premium product or market.
Every second business writer today seems to be talking about ag-tech, big data, and the internet of things. If you’re an agriculture sector organisation, or a company servicing or purchasing from primary producers, you might be forgiven for thinking you missed a day in the office and overnight farming has become a connected, automated, artificially-intelligent system.
Reality of course is still far from that utopia (or dystopia, depending on your point of view). What the writers are telling us is that there is a range of interesting technical possibilities that might have application in agriculture, and that some of the very early adopters, enthusiasts, and visionaries are trialling these. In some cases, they’ve used technologies in interesting experiments and learned useful things about their supply chain or farming system.
All well and good; but if you don’t consider yourself a leading edge visionary (or perhaps you don’t have the same appetite for risk) should you just ignore the hype, and wait until the technology matures?
For the pragmatists among us, who are more interested in achieving practical and strategic goals than experimenting, here are some areas where I think there is value in bringing technology into your business today.
Better business communication
We all know that good communication between people is what keeps the wheels of business oiled and turning. Technologies, systems, and business rules are no replacement for good people relationships.
Your customers, suppliers, and business partners see all of your interactions with them as part of that same interpersonal relationship. Are your reports or invoices late? Do they lack critical information your business partners need? Personal reassurances will go so far, but if you can’t achieve timely and data-rich information delivery that helps your partners, they may look elsewhere.
We’ve helped a number of our customers lift their communications with suppliers or customers. In some instances, we’ve delivered apps that help farmers access key pieces of information as they become available. In other cases, we’ve implemented reports and visualisations that are valued by business partners for the timely insight they provide. This isn’t just data: it’s business communication.
Understanding critical business metrics
Many businesses try to track too many metrics, and don’t always closely manage the key metrics that are leading indicators of success.
What drives your business? Primary production and processing businesses are often heavily influenced by factors such as weather and global market demand, but these factors are outside the control of most businesses and may have less impact on long-term profitability than we imagine.
It’s typically our response to outside factors, and our ability to continue to produce value despite them, that determines long-term success. Measures of productivity per unit of input, and effectiveness at delivering high-value products, are better indicators than raw dollar returns or kilograms of product shipped.
For some of our customers, this has meant improving alignment between their financial data and physical data records. For instance, benchmarking kilograms of product produced, farm working expenses and profit against the potential pasture or crop production for that season, allowing more effective comparison of improvements across seasons. For others it has meant focusing on the proportion of product meeting specification for high-value markets, regardless of whether the market actually delivered the desired price premium in that particular season.
Data integration can help bring these disparate data sets together for timely comparison, and in-field monitoring technologies or remote sensing can deliver the physical data needed to make sense of the product and financial outcomes.
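The kind of metrics described above can be sketched simply: output relative to the season’s potential, and the proportion of product meeting a premium specification, rather than raw totals. The function names and data shapes below are assumptions for illustration.

```python
# Sketch of productivity-per-unit-of-potential benchmarking. Figures
# and data shapes are illustrative assumptions.

def productivity_index(product_kg, potential_pasture_kg_dm):
    """kg of product per tonne of potential pasture dry matter, allowing
    fairer comparison across seasons with different growing conditions."""
    return product_kg / (potential_pasture_kg_dm / 1000.0)

def in_spec_rate(lots):
    """Proportion of product lots meeting the premium-market specification,
    tracked regardless of whether the premium was paid that season."""
    meeting = sum(1 for lot in lots if lot["meets_spec"])
    return meeting / len(lots)
```

The value of framing metrics this way is that a poor season no longer masks a genuine productivity gain, and a lucky price year no longer masks a quality problem.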
Responding to changing conditions
How do you assess and respond to the risk of changing weather and markets?
Studies of farmer responses to drought and similar challenges indicate that we tend to respond too late, and in a conservative, “piecemeal” fashion. We seem to bet that things will trend back to normal sooner rather than later, and under-do our response.
Monitoring tools such as climate stations and market data visualisations allow us to understand trends and risks early – before their impacts really start to bite. Of course, responding to these is still a challenge: will I reduce stock numbers only to see the weather change?
Mathematical models don’t yet give those definitive answers some futurists might lead you to expect, but they allow you to ask the “what if” questions, looking at potential decisions and impacts. These allow you to build a plan for your business and understand what your critical review and decision points need to be.
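A toy “what if” comparison illustrates the idea: weigh the expected margin of holding stock numbers against destocking early under an uncertain season. All probabilities and dollar figures here are invented for illustration.

```python
# Toy "what if" scenario comparison under an uncertain drought.
# All probabilities and margins are invented for illustration.

def expected_margin(p_drought, margin_if_drought, margin_if_rain):
    """Probability-weighted margin across the two season outcomes."""
    return p_drought * margin_if_drought + (1 - p_drought) * margin_if_rain

# Scenario A: hold stock - a good season pays well, but drought is costly
hold = expected_margin(0.4, -30000, 80000)
# Scenario B: destock early - smaller upside, but limited downside
destock = expected_margin(0.4, 10000, 45000)
```

The point is not the single expected-value number, but that changing the drought probability as monitoring data comes in shows where the decision flips, which is exactly the critical review point a business plan needs.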
Time to start learning
The pace of change in technology is only likely to accelerate in coming years, and the technology we use in farming in ten years may be very different from what is available now. It is tempting to just “wait and see” what evolves, but advanced agricultural businesses choose to embrace technologies that can deliver concrete benefits and which align with their goals. Consider ways that technology can help you achieve more effective communication, improved understanding of business metrics, and the ability to assess and respond to change.
What technologies are you embracing in your business, and why?