Person in field stares at the sky

Three steps to your agritech product vision

You’re creating the future: a product or service that does not yet exist. How do you ensure your team is on the same page? Do you tell them how they will build it? How it will work? Or do you paint the picture of how it will make people’s lives better?

Your investors and partners are the same. Your product vision is the core of your business vision, so you want key people to understand and love it. Still, formalising your product vision can seem like another chore in your busy schedule. You might even fear your idea will lose some of its magic in the harsh light of day.

A clear representation of your vision:

  • Helps you communicate the vision to your team, your partners, and potential investors. You can be confident that you’ve passed on the key elements and not missed anything out.
  • Inspires others! People who believe in and buy into your vision will go above and beyond the call of duty to make it a reality.

Here are three keys to get you started on composing your agricultural product vision.

Start with the users

Identify who your intended users are and are not. What problems will you solve for them? How much do those problems matter to your intended users? Are you tackling their problems or forcing your own solution?

The Einstellung Effect is a psychological term describing how we can fall into the trap of applying a familiar approach or solution to every problem. When all we have is a hammer, everything looks like a nail. It’s a great reminder to fall in love with problems, not our solutions. Your agritech product vision should describe the problems you want to solve.

Think big

The eventual picture is larger than your first release. Think about what your vision looks like over the next three, five, or even ten years.

At the same time, your first product release (or releases) need not fulfil the entire vision. Break the plan into bite-size chunks that you can achieve and learn from along the way.

Trace weak signals and trends

Take a view about the way we will solve problems in two to five years. You’re creating the future, so yesterday’s rules may not apply.

I’m not suggesting divorce from reality. Ensure you have time and space to read, research, and test. Seek to understand how people might work in the future, and how your product might contribute. If your product is a service (to some extent all are), how might it fit with the ways your users want to communicate?

Your product vision is not a product specification. It’s not an elevator pitch either. Whether it is a story or bullets in a slide deck, it’s the way you bring your team and partners with you on the journey. It should help them pull together and solve the problems that count.

As your agritech product vision evolves, there is one more thing to do: communicate relentlessly. Share your vision with your team and your partners. Evolve it based on what you learn.

Photo of person taking notes

Ag Innovations Bootcamp – All Coming Together

Things are coming together for the Ag Innovations Bootcamp (5-6 Dec, Mystery Creek, Hamilton).

We’ve put our heads together with NZ National Fieldays and our partners Amazon Web Services, with support from AGMARDT, to bring you some great keynote speakers:

Jenene Crossan – The brilliant digital entrepreneur behind some of New Zealand’s most recognisable digital brands, including nzgirl.co.nz and her latest project Flossie.

David Downs – A General Manager at New Zealand Trade and Enterprise (NZTE), he’s an ex-comedian, TV and radio actor, and author of No.8 Rewired and No.8 Recharged.

And now announcing facilitator and creative coach Jeremy Suisted. Jeremy is founder and director of Creativate, an innovation consultancy, and a passionate and driven communication expert who can turn any endeavour into a success.

Don’t miss out on these incredible speakers and this enlightening experience!

Ag Innovations Bootcamp

Ag Innovations Bootcamp – Early-bird Special

Do you have an idea for what could be the next great agricultural or ag-tech product or service? Looking for ways to reliably convert ideas and concepts into products or services that meet the real needs of the market and your business?

Ag Innovations Bootcamp (5-6 December 2018, Hamilton, New Zealand) is a two-day workshop for:

  • Ag-sector visionaries;
  • Product managers and owners; and
  • Business leaders looking to “change up” their business.

Hosted by NZ National Fieldays and Rezare Systems, with our partners Amazon Web Services and AGMARDT, Ag Innovations Bootcamp is packed with inspiring speakers, case studies from other product managers, hands-on learning and networking opportunities. You’ll learn techniques you can apply to build and validate business cases, understand user needs, and construct lightweight prototypes.

The early-bird offer for Ag Innovations Bootcamp runs until 9 November 2018.

For more information go to https://www.rezare.co.nz/bootcamp/

 

Product team (source: iStockPhoto)

5 tactics of an effective agritech product manager

Why are relatively few agritech products achieving adoption at scale when billions of dollars are being invested internationally every year? New start-ups appear almost weekly. And established companies are shifting from small innovations around the edges to major projects that sit at the heart of business plans.

Yet for all this activity, few product ideas seemingly “make it”.

We work with many agricultural companies – start-ups and established organisations – to help them develop smart digital products and services. In our experience, there’s a strong correlation between a business’s approach to product management and its success in developing meaningful products or services that get used. The choice of product manager, the scope and objectives of their role, and their level of skill and authority all drive the success (or otherwise) of product or service development.

The CEO or a Project Sponsor may define the overall business outcomes and vision, and project managers may be concerned with product budget and timeline, but the product manager is at once the “voice of the product” to the business, and simultaneously translates the “voice of the customer” to the development team (this part of the role is also called product owner). They decide the detailed problems the team will try to solve, the relative priority of those problems, and when the solution is complete enough to be put into the hands of customers.

Here are five tactics an agritech product manager can use to be more effective in their role:

1.      Allocate your attention

A product manager juggles many tasks. They must understand scope, prioritise effectively, and know how the team is delivering and what is planned. That’s incredibly hard to do if the product only gets a small time-slice of your attention.

You won’t be able to effectively manage your product by turning up for a fortnightly sprint planning session. Product managers need to be able to spend time with both stakeholders and the development team. They participate in customer interviews, review the product in showcases put on by the development team, and are deeply involved in product planning workshops.

2.      Build stakeholder relationships

New products and services are built at the intersection of customer or user desirability, business viability, and technical feasibility:

  • Does this product solve a real problem for its users, and can they readily get the benefits?
  • Are customers willing to pay for a solution, and is this solution sufficiently “better” that they will switch?
  • Does this product or service meet the objectives and fit the strategic direction of the company?
  • Is there a business model that makes sense for the company and which could be profitable?
  • Can it be built to operate as envisioned, at a cost the company can justify?

Effective product managers really understand the needs of their users and customers – their behaviours and the problems the product is trying to solve. They use observation and interviews to inform their opinions and seek data from the existing tools or products that customers use.

Product managers must also build trust with the business, effectively communicating how the direction and priorities chosen for the product meet the objectives of the business.

3.      Discover, don’t assume

It’s very tempting to build technology products and services the way corporate computer systems were developed in the past: envisage a solution, document it as a set of requirements, and set the development team to work. When the developers and testers are done, roll out the solution (or pass it through to sales and marketing).

Effective product managers know that detailed customer needs are emergent, and so too must be the solution to those needs. They make use of product discovery activities: carrying out interviews, running experiments, and building prototypes. They know that testing an idea by building a prototype and validating that thinking with real users may not only be an order of magnitude quicker and less expensive than building software, it avoids the huge waste of developing robust, performant, tested software that does not address the real problem.

Software and hardware will still need to be built, but continuously using discovery activities to understand and address customer problems reduces the risk of building a great solution to the wrong problems.

4.      Prioritise value

If customer and user problems and the solutions are emergent, how can you effectively manage the work of a development team (or teams)? How do you decide what gets released to customers, and when?

Effective product managers decide what discovery and development tasks are the highest priority to work on at any point in time. They pay great attention to the “product backlog” – the set of problems waiting to be worked on, ideas waiting to be tested, and validated ideas waiting to be turned into production software. They may visualise these using story maps, or as items on a Kanban board.

This is not “project management”, seeking to most efficiently have all the tasks completed on time and within budget. Rather, the product manager is making value-based decisions about which tasks or stories (feature sets) are the most valuable to do now, and which can be deferred (and might never be done if sufficient value can be delivered to customers and the business without them).

Product managers consider value on multiple scales:

  • Which stories deliver the most value to customers and end users?
  • Which stories help the company achieve its objectives (revenue, customer acquisition, or other outcomes)?
  • Which activities must be prioritised for the product development process to be successful (for instance, prioritising a discovery or validation activity that may change the overall shape of the product)?
  • Which essential dependencies must be built for more valuable stories to work?
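As a toy sketch of this kind of value-based ordering, a plain value-per-effort heuristic can be used – one of many possible weightings, and the item names and scores below are invented purely for illustration:

```python
def prioritise(backlog):
    """Order backlog items by value density (value / effort).

    A simple illustrative heuristic; real prioritisation also weighs
    dependencies and discovery risk, as described above.
    Items are dicts with 'name', 'value', and 'effort'.
    """
    return sorted(
        backlog,
        key=lambda item: item["value"] / item["effort"],
        reverse=True,
    )

backlog = [
    {"name": "export report", "value": 3, "effort": 5},
    {"name": "validate pricing idea", "value": 8, "effort": 2},
    {"name": "fix sync bug", "value": 5, "effort": 1},
]
ordered = prioritise(backlog)
# Highest value density first: fix sync bug, validate pricing idea, export report
```

In practice a product manager would revisit these scores continuously as discovery work changes their view of value.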

5.      Release early and often

This may be an agile mantra, but it remains valid. It’s tempting to hold off putting your product into customers’ hands until it is “complete”. This is especially the case for established companies that worry about reputational risk.

Delaying until the product or service is largely “complete” misses the opportunity to learn how customers choose to use your product or service. They may pay no attention to that wonderful feature you slaved over and be thrilled by other functions. You may discover that the value you expected just isn’t there, and that you need to “pivot” to a different approach. Far better to do this early than wait until the entire budget is spent.

For organisations worried about reputational risk, limited pilots are a useful tool, whether with a subset of staff or a small group of customers. Early adopters may not fully represent your eventual market, but carefully chosen, they can provide learning and become advocates to your broader market.

Learn More

We use a variety of tools and techniques to support Product Managers in their role, including discovery techniques and activities, dual-track agile (a team working on discovery and a team developing prioritised stories), and flexible scope contracts that focus on value delivery in a time frame rather than a fixed set of requirements. Talk to us to learn more.

If you’re a product manager in agriculture or agritech, consider attending the Ag Innovations Bootcamp – get inspiration and hands-on “how-to” experience of product development best practices.

 

Read about the Ag Innovations Bootcamp

Sounds like DEFRA’s been listening

Back in March I posted an article on LinkedIn arguing the case for future farm support to be channelled into technology solutions that can deliver productivity gains and better delivery of social and environmental goods.
 
Well, it seems the UK government is listening. Its publication of the Agriculture Bill last month, which will determine farming support for a post-Brexit UK (noting that there will be differences in the devolved administrations), caught my eye on three counts:
 
  • The phasing out of direct support payments
  • The introduction of funding for farmer-led R&D and collaboration on productivity innovation
  • A new Environmental Land Management (ELM) scheme
 
Of course the devil is in the detail, but on first glance (and at odds with some farming leaders) I like the look of what’s being proposed. Here’s why:
 
First, phasing out of direct support finally puts an end to the subsidy crutch that for too long has made British farming unproductive. We lag hopelessly behind many of our major competitors on this metric and while transitioning to a brave new world won’t be easy, it is vital to give farming the boot up the backside to become more innovative by necessity.
 
The fact that there may no longer be a requirement to farm to receive progressively reduced payments over seven years is a good thing. It gives farmers wishing to exit a dignified means of doing so, and might even start to make land occupation (rents or purchase) a little more reflective of economic viability – a good thing for innovating farmers and new entrants alike.
 
Second – and the one which in many ways I am most excited about – is the directing of funds towards farmer-led R&D and innovation. This is potentially game changing and totally in tune with a more technology-driven future for the sector. 
 
Back in March I noted the announcement of the Innovate UK Transforming Food Production fund of £90m as a welcome start, but really just a drop in the ocean. I really hope the government, through the Bill, is bold enough to provide significant budget for this farmer-led area and not just pay it lip service. There are some exciting initiatives we are involved in that fall squarely into what the government is driving at here. But this funding MUST encourage innovation that is focused on food production as well as other areas. As I wrote in the spring, more food is needed in the next 50 years than has been consumed in the entire history of humanity! It’s a big challenge that needs big thinking.
 
Third is the ELM scheme. For me there is also a huge technology role here. Delivery of public goods has to be measurable and we are now in the era of big (and small) data, machine learning and AI that could deliver real transformation in ways that can transparently demonstrate public value. The taxpayer should expect nothing less.
 
Moving away from direct support and into the territory of funding innovation and targeted activity is a sea change and something I believe to be a good thing. Ultimately, this approach is about the development of solutions which should, over time, stand on their own two feet. That’s what we are focused on and why so many of our clients come to us asking the question: “How will digital and data help us do the job better?” 
 
So, yes I understand why farm leaders are concerned. But this is not the time to cling onto the past. It is absolutely the time to tear up the rule book, imagine what the future should look like, and back truly innovative thinking and innovative farmers to get us there.  
Looking at sheep performance

Ways smarter tech can accelerate livestock genetic progress

I started writing about the case for smarter tech in cattle and sheep breeding programmes a while ago, and quickly realised there was more than would fit in a single blog post. In my previous post I described how technology can help us increase the pace of genetic progress, by:

  • Measuring what we can’t easily see, including a range of important traits such as efficiency, emissions and disease resistance; and
  • Reducing errors in recording and transcription.

There are two further areas where smart technology can and is helping:

Cut collection costs

How much does it cost to collect a phenotype for an animal?

A phenotype is the observable characteristics of an individual that result from both its genotype and its environment. When you collect sufficient phenotypic information about many individuals, you can adjust for environmental effects and predict the genetic merit of those animals. Collecting phenotypes involves measuring details such as calving ease, weight gains, progeny survival, and other characteristics of economic value to farmers and consumers.
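As a toy illustration of that adjustment – a simplified single-record model, with hypothetical mob names, weights, and an assumed heritability; real evaluations use BLUP across pedigree or genomic relationships – deviating each record from its contemporary-group mean and scaling by heritability gives a rough estimate of breeding merit:

```python
from collections import defaultdict

def estimate_merit(records, heritability):
    """Toy model: merit ~= h^2 * (phenotype - contemporary group mean).

    records: list of (animal_id, group_id, phenotype).
    Only the group-level environmental effect is removed here; real
    evaluations also use pedigree/genomic relationships between animals.
    """
    groups = defaultdict(list)
    for _, group, value in records:
        groups[group].append(value)
    means = {g: sum(vals) / len(vals) for g, vals in groups.items()}
    return {
        animal: heritability * (value - means[group])
        for animal, group, value in records
    }

# Hypothetical weaning weights (kg) for two management mobs
records = [
    ("A1", "mob1", 320.0),
    ("A2", "mob1", 300.0),
    ("A3", "mob2", 310.0),
    ("A4", "mob2", 330.0),
]
merit = estimate_merit(records, heritability=0.3)
# A1 is +10 kg against its mob mean of 310, so its estimate is 0.3 * 10 = 3.0
```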

Current DNA analysis technology tells us more about the genetic similarities and differences between animals (their genetic relatedness) than about their outright performance. There are relatively few simple gene interactions you can measure with a DNA test.

What DNA testing helps with is the challenge of recording which animals are related to each other, especially as pedigree recording gets harder with larger sets of animals. For the DNA test to be useful, it must be able to be statistically connected to animals that have had phenotypes measured.

As Dr Mike Coffey of SRUC wrote in 2011, “In the age of the Genotype, Phenotype is king!”

Phenotyping will continue to be required. Livestock breeders will need a pool of accurate and comprehensive phenotype observations to support both new and existing breeding objectives, connected to the growing pool of genotyped animals.

Measuring animals is time-consuming and expensive: the tedious task of observing and measuring characteristics of animals, linking these to the correct animal records, and transcribing the observations into computer systems.

With the need for ongoing, and potentially more detailed phenotype recording, technology is our friend. From automated weighing and electronic identification to cameras, sensors, and deep learning, properly configured and managed technologies reduce the cost of data collection – especially for larger operations and at commercial scale.

We’ve helped several organisations organise and automate their phenotype recording, integrating electronic identification and a range of other technologies, and streaming that data back to central databases.
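A minimal sketch of that kind of integration – the tag numbers and record structure below are assumptions for illustration; real integrations also carry timestamps, device IDs, and session metadata – pairs each EID read with its animal record and flags unknown tags for follow-up rather than guessing:

```python
def link_weights(reads, registry):
    """Attach automated weigh-crate reads to animal records.

    reads: list of (eid, weight_kg) tuples from the weighing indicator;
    registry: eid -> animal_id mapping from the central database.
    Unrecognised tags are returned separately for checking, rather than
    being transcribed against the wrong animal.
    """
    linked, unknown = {}, []
    for eid, weight in reads:
        if eid in registry:
            linked[registry[eid]] = weight
        else:
            unknown.append(eid)
    return linked, unknown

reads = [("982 000123", 312.5), ("982 000456", 298.0), ("982 999999", 305.0)]
registry = {"982 000123": "A1", "982 000456": "A2"}
linked, unknown = link_weights(reads, registry)
# A1 and A2 get their weights; the unmatched tag is queued for checking
```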

Better buying behaviour

What really drives buying decisions?

Ultimately livestock breeding is driven by demand.  Livestock breeders focus their selection efforts on traits that drive economic performance such as:

  • Fertility;
  • Birth weight and survival;
  • Feed conversion efficiency and growth rates or milk production;
  • Carcass characteristics or milk characteristics;
  • Docility, resistance or resilience to disease;

and more. In the livestock industry, this is presented as the predicted benefit to future generations (typically termed EPDs or EBVs, depending on your species, breed, and country). The EBVs or EPDs are combined into economic indexes, where each is given a $ or £ weighting based on its impact in a typical farm system, and are usually delivered to buyers as tabular reports and graphs.
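The weighting step itself is simple enough to sketch in a few lines – the trait names, EBV values, and $ weights below are purely illustrative and not drawn from any published index:

```python
def economic_index(ebvs, weights):
    """Combine trait EBVs into a single economic index.

    ebvs: trait -> EBV in trait units; weights: trait -> $ per unit,
    reflecting each trait's impact in a typical farm system.
    Traits without a weight in the index are ignored.
    """
    return sum(weights[trait] * value
               for trait, value in ebvs.items() if trait in weights)

ram_ebvs = {"weaning_wt_kg": 2.5, "fleece_wt_kg": 0.2, "survival_pct": 1.0}
dollar_weights = {"weaning_wt_kg": 3.0, "fleece_wt_kg": 5.0, "survival_pct": 1.5}
index = economic_index(ram_ebvs, dollar_weights)
# 2.5*3.0 + 0.2*5.0 + 1.0*1.5 = 10.0
```

The hard part, of course, is deriving defensible weights from farm-system economics, not the arithmetic.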

And what do buyers select on?

  • In our experience, New Zealand dairy farmers make great use of the primary economic index – Breeding Worth or BW, combining that with decisions about breed and gestation length.
  • In the sheep industry, breed and breeder decisions are often the primary decider. Once a breeder is selected, the numbers are used to buy the best animals one can afford.
  • In the NZ and Australian beef industry, an expert tells me that for animals sold at auction, the closest purchase price correlation is with liveweight: in their opinion, animals are often valued on how big they are.

When sheep and beef farmers are presented with multiple EBVs and indexes, it can be overwhelming, and it is not surprising some farmers revert to assessing a ram visually, negating the value of genetics programmes.

I’m a great fan of Daniel Kahneman’s Thinking, Fast and Slow, where he points out the challenge we have with quantifying difficult or unknown measures.

“If a satisfactory answer to a hard question is not found quickly, System 1 will find a related question that is easier and will answer it. I call the operation of answering one question in place of another substitution.” – Daniel Kahneman, Thinking, Fast and Slow.

If understanding EBVs and indexes seems difficult, you’re most likely to substitute an easier question – how good the ram looks, or how knowledgeable the breeder seems – without even realising you have done so.

How can technology help here? We can use technology to make ourselves smarter – or the questions simpler. For instance, there’s potential to create a data-driven system that analyses your current farm system and recommends which economic index or other measure you should use when choosing animals. Or perhaps a tool that with a few simple selections works out how much you should spend on rams or bulls, finds those at a sale that meet your needs, and returns a ranked short list that you can use in your final decisions.
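At its core, such a shortlisting tool might be as simple as filtering the sale catalogue to budget and ranking by the recommended index – the field names and figures below are assumptions for illustration:

```python
def shortlist(rams, index_of, budget, top_n=5):
    """Rank sale rams by index value, filtered to what's affordable.

    rams: catalogue entries (dicts) with at least 'id' and 'price';
    index_of: callable returning the recommended index value for a ram.
    """
    affordable = [r for r in rams if r["price"] <= budget]
    return sorted(affordable, key=index_of, reverse=True)[:top_n]

sale_rams = [
    {"id": "R1", "price": 1200, "index": 145},
    {"id": "R2", "price": 2500, "index": 160},  # best index, but over budget
    {"id": "R3", "price": 1800, "index": 152},
]
picks = shortlist(sale_rams, index_of=lambda r: r["index"], budget=2000)
# picks: R3 then R1; R2 is excluded by the budget filter
```

The real value would come from the step before this: choosing which index fits the buyer’s farm system.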

Smarter technology for livestock buyers is a key area where the industry can make real progress on the rate of genetic gain, and a strong understanding of how people make decisions will be critical to its success.

4 reasons why bespoke software development may be the right choice

Is bespoke development always the expensive option?

It is hard to think of a business today that does not rely on software to improve efficiency, ensure compliance, or reach customers. As businesses grow, the opportunity to encapsulate functions and processes within an information system increases. Indeed, your software may define your business success.

One of the first questions our customers ask is ‘what off the shelf solutions are there?’. Often there are candidates and the next question is ‘will this off the shelf solution meet my needs and provide the cheapest option?’

Assessing potential candidates is not easy – in reality, it’s only once you are using the software that you can evaluate whether it really fits your requirements. Making the wrong choice causes disruption, demoralises staff, and makes you look bad. Assessing candidate software is not just ticking features off a checklist. It is about how the software will help grow your business: how it fits the way you do business, how usable it is, and how it will evolve to meet your changing needs and, in so doing, create great solutions.

There are times when it is also worth considering custom or “bespoke” software developed precisely for your needs, rather than what is available off the shelf.

Let’s look at four compelling reasons for bespoke software:

  1. The value of any software can be measured by uptake, either by your customers or internally. While customer uptake is easier to measure, low internal uptake may look like staff being lazy or needing a lot of training – when in fact the software may simply be hard to use. The greatest chance you have of achieving a solution that exactly meets your needs is to build it that way.

Workflow – Your software should reflect the way you wish to run your business efficiently. As Alan Cooper, the creator of Visual Basic, described, software should take away the pain. The worst case is an off the shelf solution that simply does not fit your workflow – and forces your business to change needlessly instead.

Functionality – Build only what you need and make the experience satisfying to users – great software should be a delight to use. Feature-laden off the shelf solutions add no value if those features are not used. In fact, they reduce value as users navigate through a mess of irrelevant user interface. As a subscriber to an off the shelf solution, you are paying for all the features, even if that cost is spread over many more subscribers.

  2. Software innovation can turn problems into unique solutions.

Software not only has the potential to increase the efficiency of your current business processes; it can analyse and solve challenges and connect people, so that problems become unique solutions for you and your customers. Indeed, businesses today have used bespoke software to create solutions that are the reason for their phenomenal growth. They haven’t achieved this with the off the shelf software their moderately performing competitors are using.

  3. Your business needs and opportunities should drive your development roadmap.

In our software development business, we use agile processes and lean development to ensure you are buying the functionality that has the highest priority for your business growth. You are in control. Providers of off the shelf products must meet the needs of a diverse user group. If you can see a provider’s roadmap, you may be surprised just how long you will wait for your features; at worst, your highest-priority requirements may not be on the roadmap at all. In that case you will need the provider to undertake bespoke development – often when it is convenient to them, and with little incentive for their price to be competitive.

  4. The very process of planning and undertaking bespoke development can have wider benefits in reviewing your processes – leading to innovation in the way you do business.

When we develop software, we use Design Thinking to understand the business goal and innovate with you – not just with the software but also the interaction with people, processes, hardware and data. As a result, your unique world view and people can unleash transformations well beyond the software.

This is not an exhaustive list of the benefits of bespoke development, but it focusses on four areas that are often unaccounted for or undervalued. In summary, bespoke development can be the best option where uptake is essential, where innovation can lead to business growth, and where a business wants a competitive solution.

The case for smarter tech in cattle and sheep breeding

My first job in livestock performance recording was with the Genetics Section, as it was called, at Ruakura Research Centre in New Zealand. I worked part time while studying at university, transferring research trial data off the government mainframe on reel-to-reel tape, and writing inbreeding coefficient calculation software.

The genetics section was based in an old converted house, where we sat around at large, wooden, public service desks, surrounded by high stacks of computer printouts, all painstakingly bound and labelled for future use. We were the leading edge of genetic improvement and livestock performance recording.

That was nearly thirty years ago of course, and the face and capability of modern technology has radically changed. Interestingly however, many of the practices in livestock recording industries still reflect that past golden age, and it is only recently that the software tools and databases of – let’s be generous and say – 15 years ago have started to be refreshed.

In this, the first of two articles about technology in livestock breeding, I propose that we could make much more effective use of smart technologies to increase the rate of genetic progress and address commercially important, but hard to measure, animal characteristics. In my next post, I’ll examine how technology could reduce the cost of phenotype collection (I might even explain what a phenotype is), and encourage better use of improved genetics by commercial producers.

Measure what you can’t see

In our traditional performance breeding tools, we focused on things that farmers could readily measure: kilograms and counts. Numbers of live progeny, and kilograms of liveweight, milk, and wool. The good news: most of those production traits are heritable, and we’ve made good progress over the last 30+ years.

So how do you measure characteristics that are important in modern farming systems?

  • Meat eating quality, so that consumers can repeatably have a great eating experience;
  • Feed conversion efficiency, converting inputs into product more efficiently, reducing greenhouse gas emissions per unit of product, and making the farming system more profitable;
  • For that matter, greenhouse gas emissions (where this is driven by livestock genetics rather than inoculation by a specific set of gut microorganisms);
  • Urine nitrate concentration, and hence one key environmental impact of extensive livestock farming;
  • Disease resistance and the response of animals to a variety of disease and parasite challenges;
  • Behaviour of animals around people and other livestock, including how they handle stressful environments such as being moved; and
  • Longevity, the ability of female animals to raise progeny season after season, reducing the substantial cost of replacement animals.

There are proxies for many of these measures, of course. Breeding for growth rates or milk production has arguably improved greenhouse gas efficiency, for example, but in some breeding systems a change in the mature weight of animals has increased emissions. Progeny tests and laboratory measures have been used in key programmes, but they may not help us routinely identify the genetic outliers that will lead the next leap in genetic progress.

New measurement and sensing technologies offer real potential to help with these “hard to measure” areas of animal performance in the coming years. Accelerometer and microphone technologies can identify individual animal eating habits, heats and parturition (birth) dates. 3D and multispectral cameras tell us about carcass and meat product characteristics, and additional characteristics of milk. Increasingly, this data will be collected in-line or in near-real-time, providing a rich stream of data that could be analysed for many purposes.

The next generation of animal recording and genetic analysis systems must be built to handle this variety of real-time stream data – or at least the results of analysing it.

Fewer errors, more progress

A primary driver of any livestock recording and animal evaluation system is to enable breeders and commercial producers to make better decisions about the animals they use in breeding. Computers don’t select animals: people do. Where a producer chooses an animal because they like the look of its eyes, or its stance, or its colour, and ignores the potential impact of the animal on their herd, the results will be at best random, and often detrimental.

Formal breeding schemes with EBVs and indexes seek to inform better decisions about the breeding merit of animals, but EBVs can be limited by the information available:

  • Accuracy of recording parentage and animal relationships;
  • Incorrect allocation of records to the wrong animals;
  • Transposition and recording errors when capturing data; and
  • Failing to account for the impact of environmental effects such as the feeding and management regimes of groups of animals, the age of the mother, or whether an animal was reared as a single or twin.

Technology is playing a substantial role in improving the accuracy of EBVs, notably through genomic DNA analyses resolving the fraught process of parentage recording and contributing substantially more information earlier in each animal’s life-cycle. Better facilitation and handling of genomic data collection is well overdue in animal recording systems, and I’m pleased to see this being addressed.

In addition to genomics, electronic identification (EID) and automated recording systems can remove many identification and data capture errors, and the ability to feed this data seamlessly into modern evaluation systems, without having to manually manipulate it, will provide another leap forward.

Recording management groups properly has been a real limiting factor in many breeding programmes, and is one of the key hesitations in extending these to commercial producers. I believe that sensors that identify eating and movement behaviours, and location or proximity to other animals, will help us to automatically and transparently solve the problem of recording management groups and regimes. This will provide another substantial step forward in removing the noise of environmental effects.

Of course, more accurate EBVs are still only one piece of the puzzle. Helping producers to make effective use of this information is another, and something I'll address in my next post.

Rezare Systems is a bespoke software design and development company specialising in the agriculture sector. We have special expertise in building livestock recording and management systems, and tools for data collection and integration. Learn how Rezare Systems can assist your business.

More than one way to skin the data sharing cat

Last week the UK Agriculture and Horticulture Development Board (AHDB) announced an industry consultation to develop a set of principles (a code) to promote the sharing of farm data. Happily, we at Rezare UK have been awarded the contract to run this project, based on our unique agridata expertise and our significant experience in developing a similar code in NZ.

Improving the flow of data from farms to other organisations is seen (rightly) by the AHDB as part of the productivity agenda for UK agriculture, but there remain significant barriers to getting the data flowing in practice, mainly because of issues of trust and the interoperability of disparate data sets.

While the code will go some way towards addressing issues of trust (and start to build some alignment across industry on best practice when it comes to sharing and using farm data), other issues beyond the code itself will also need to be addressed, particularly the more technical aspects of exchanging and using the data.

Two really good examples of dealing with this have emerged in the past couple of years: DataLinker in NZ and Agrimetrics in the UK. These two approaches (the latter is one of four UK government agritech centres of excellence), while quite different in nature (and to a degree in objectives), are actually also potentially very complementary.

DataLinker works on a model where no one party becomes the single repository and broker of farm data. Instead, data owners build APIs to standardised schemas, and do this once only, so that permissioned third parties can access that data in a known way. The exchange of data between the owner and the user (the "consumer") is a bilateral relationship, where DataLinker provides the permissioning (tokens) and legal frameworks (templated agreements) to streamline and standardise the process.

DataLinker assumes that each potential system is in fact its own "locker" (store of data) with one or more types of data. Users of some sort already interact with those systems, so what DataLinker does is standardise the way of finding which systems have which types of data (the findable F in FAIR data sharing) and in which formats (the interoperable I in FAIR). It specifies the method by which organisations agree data access rules and users provide permission (together, the accessible A of FAIR), with the result that the data is reusable (the R in FAIR). DataLinker has been focused more on farmer or user-facing sharing of data than on the broad data access needed by, for example, researchers (at least without organisations explicitly addressing this).
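
A bilateral, token-permissioned exchange of this kind can be sketched in a few lines. This is only an illustration of the shape of such a request: the provider URL, endpoint path, and token value below are hypothetical, not real DataLinker identifiers.

```python
# Sketch of a consumer building a token-authenticated request to a data
# owner's standardised API, DataLinker-style. The endpoint and token are
# hypothetical; the request is built but deliberately not sent.
from urllib.request import Request

def build_data_request(endpoint: str, access_token: str) -> Request:
    """Construct a request for a permissioned, standardised data set."""
    return Request(
        endpoint,
        headers={
            # The bearer token embodies the farmer/organisation permission.
            "Authorization": f"Bearer {access_token}",
            # JSON-LD keeps the payload semantically self-describing.
            "Accept": "application/ld+json",
        },
    )

# Hypothetical endpoint, as if discovered via a central registry.
req = build_data_request(
    "https://provider.example.com/animals/weights",
    "token-from-permissioning",
)
print(req.get_header("Authorization"))  # Bearer token-from-permissioning
```

The key design point is that each consumer talks directly to each provider; the framework only standardises how the endpoint is found and how the permission token is agreed.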

Agrimetrics in the UK employs the semantic web, whereby publicly available data (published on the web) or private data made available under a licence agreement is organised according to the Resource Description Framework (RDF). Each data entity is described as a "triple" (subject-predicate-object), and in that way stored data becomes machine-readable by being linked to other data entities. The contributed data is effectively "held" by Agrimetrics and then exposed through APIs (charged or free) under licence for third parties to use.
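
To see why triples make data machine-readable, here is a minimal illustration using plain Python tuples rather than a real RDF library. The farm entities and predicates below are invented for the example.

```python
# RDF-style "triples": each fact is a (subject, predicate, object) tuple.
# Because objects can themselves be subjects of other triples, facts link
# together into a graph a machine can traverse.
triples = [
    ("cow:123", "hasBreed", "Friesian"),
    ("cow:123", "bornOn", "2017-08-14"),
    ("cow:123", "locatedAt", "farm:42"),
    ("farm:42", "inRegion", "Waikato"),
]

def query(subject=None, predicate=None, obj=None):
    """Return triples matching the pattern; None matches anything."""
    return [
        (s, p, o) for (s, p, o) in triples
        if subject in (None, s) and predicate in (None, p) and obj in (None, o)
    ]

# Everything known about cow:123:
print(query(subject="cow:123"))
# Which region is the cow in? Follow the links cow -> farm -> region.
farm = query(subject="cow:123", predicate="locatedAt")[0][2]
print(query(subject=farm, predicate="inRegion"))
```

Real RDF stores add shared vocabularies and a query language (SPARQL) on top, but the linking principle is exactly this.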

Agrimetrics is focused on big data, and uses the semantic web to tag or structure large datasets in public HTML documents (and other data) in a way that makes them machine-recognisable and readable.

In essence the two approaches can be differentiated thus:

  • DataLinker is a network approach – a set of protocols and standards that allow myriad parties to exchange and share data in multiple bilateral (albeit mostly templated) arrangements through standardised APIs.
  • Agrimetrics is a hub approach – where data is shared to the Agrimetrics "centre", where it is stored, manipulated and interpreted before being shared as a more user-friendly asset under licence through APIs.

In many ways Agrimetrics is the more comprehensive approach, since it seeks not only to broker data exchange but also to add value to the data by linking and manipulating it to meet a particular consumer's need. It can handle structured or unstructured data. This is potentially very powerful, as it allows a consumer of the data to draw on Agrimetrics' technical know-how and capacity to do increasingly clever and machine-learning-based activities with the data. In other words, Agrimetrics can offer a one-stop shop for brokering and adding value to data.

However, there are also problems with the approach. It assumes a high degree of integrity and legal rigour being exercised by Agrimetrics since the data sharers are effectively "letting go" of their data to be stored and used by an organisation that is looking to commercialise it. And in the absence of private data holders being prepared to release data, Agrimetrics is only as good as the publicly available (web published) data.

DataLinker does not (and is not intended to) become involved in negotiating commercial deals to share data. Nor does it become involved in managing, manipulating or interpreting the data. It is largely a hands-off approach, designed to facilitate the network, not control it. But the adoption of the standardised schemas means there is an IT burden on the data sharers – either in-house or outsourced – to build compliant APIs.
Understanding the DataLinker and Agrimetrics approaches

So is one approach likely to prevail? Most likely not and it’s actually preferable for the two to co-exist and complement each other. Here’s why:

  • First, because culturally the DataLinker approach is more aligned to putting the interests of the farmer first, and right now farmer trust in how their data is controlled and used is becoming almost the biggest blocker to progress;
  • Second, because it is unlikely industry will want to have all its eggs in one basket;
  • Third, because the horsepower in Agrimetrics is potentially a game-changer in terms of releasing real innovation based on farm data, and thus demonstrating the value proposition to farmers of sharing their data (another piece of the sharing jigsaw that is missing);
  • Fourth, because the DataLinker approach, through its JSON-LD APIs, means data can be "readied" for consumption in a semantic way, which would complement the success of Agrimetrics; and
  • Fifth, because the semantic web is likely to be a long-term approach favoured particularly by the research community within the agrifood sector.
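
JSON-LD is the bridge between the two worlds: an ordinary JSON document that semantic-web tools can also interpret as triples. Here is a sketch of the kind of payload such an API might return; the vocabulary URL and field names are hypothetical, not actual Farm Data Standards terms.

```python
# Sketch of a JSON-LD payload. The "@context" maps the plain field names
# below to terms in a shared (here, hypothetical) vocabulary, so ordinary
# JSON consumers and semantic-web consumers can both read the document.
import json

payload = {
    "@context": {
        "@vocab": "https://example.org/farm-data-standards/",
        "weight": {"@id": "liveWeightKg"},
    },
    "@id": "urn:animal:nz:123",
    "@type": "Animal",
    "weight": 412.5,
    "recordedAt": "2018-03-01T09:30:00Z",
}

# Serialise exactly as a DataLinker-style API response body would be.
doc = json.dumps(payload, indent=2)
print(doc)
```

A plain client just reads `weight` as a number; a semantic client expands it via the context into a linked-data statement about the animal, ready for a hub like Agrimetrics to combine with other sources.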

 

There are other concepts for farm data sharing that are being considered around the globe.

For example, Wageningen University in the Netherlands has proposed a Farm Data Train, which effectively creates a number of data lockers (stores), all with the same API and approach to authorisation, meaning their interfaces align closely with what is proposed in DataLinker. At present this concept is focused more on plant-breeding data, but it could easily grow outwards.

So what’s my point? Well, as can be seen, there is more than one way to skin the proverbial cat. What’s important is for the sector to provide space for these approaches to breathe, so that there is increased opportunity for innovation to deliver against the productivity agenda. That will need some collaboration and collaborative thinking, and in the coming months we shall discover how the UK agri sector wants to address these issues.

It’s a great time to be involved in agridata and better still that Rezare are in the thick of shaping the future.

Photo credit: Vadim_Key (iStock)

How DataLinker streamlines agricultural technology connections

Information is the life-blood of today’s businesses and will enable the transformations occurring in the agriculture and food business sector. Historically, information has only been exchanged between businesses at the transactional level (such as shipping notices and invoices), while richer data that could differentiate products, demonstrate environmental compliance, and optimise business value has remained isolated in silos.

DataLinker is designed to give agricultural businesses (farmers, processors, input suppliers and advisers) the ability to access and combine data from multiple sources in flexible and timely ways, without requiring many hours of skilled technical resource to carry out data exports and imports.

Integrating and effectively sharing data looms large for many businesses, so companies are investing in their own development and infrastructure, and encountering the same challenges: data standardisation, supporting different interfaces for each partner organisation, and the time taken to negotiate data access agreements. DataLinker addresses these issues.

What is DataLinker?

DataLinker is a framework for agriculture and food businesses who wish to interchange data. In many ways it is analogous to the GS1 frameworks used to interchange shipping notice and invoice data, or the AgGateway framework used in the grain supply space. DataLinker’s primary focus was to allow farmers to bring data from a variety of sources into the tools they use for decision making, but it can be equally beneficial to all companies in the sector.

DataLinker consists of four major components that work together:

  • Data exchange specifications (“schemas”) that standardise sets of data using the Farm Data Standards and modern internet protocols (developed collaboratively with the input of member companies);
  • A small central registry where companies can discover which organisations implement each specification and how these are accessed;
  • Standardised contract terms that can be used to reduce negotiating time and legal costs in the majority of data interchanges; and
  • Technical tools to support secure agreement of data access terms, approval of access, and (where necessary) farmer permission for individual farm data sets.

DataLinker is not a database, nor a central communications hub through which all data might pass.

All of the DataLinker specifications and framework components are based on internet standards, and companies are responsible for implementing the specifications in their own IT systems, although support is provided.

How does the commercial model work?

DataLinker Limited has been incorporated as a separate entity to operate the DataLinker registry and support the collaborative development of standardised API specifications in the areas its users direct. The board of directors comprises representatives from Beef+Lamb NZ, DairyNZ, MPI, and an independent chair appointed by DataLinker’s subscribing members. DataLinker Limited operates effectively as a not-for-profit, to encourage adoption and benefits for the agricultural community.

There are no transaction fees.

Members pay a joining fee of $6,000 NZD (waived for New Zealand organisations prior to 31 May 2018), and an annual subscription of either $3,500 (for organisations either providing or consuming data), or $4,500 (for heavier users both providing and consuming data). These fees are analogous to membership of a standards organisation, supporting the operation of the registry and collaborative maintenance of specifications.
