4 reasons why bespoke software development may be the right choice

Is bespoke development always the expensive option?

It is hard to think of a business today that does not rely on software to improve efficiency, ensure compliance or reach customers. As businesses grow, the opportunity to encapsulate functions and processes within an information system increases. Indeed, your software may define your business success.

One of the first questions our customers ask is ‘what off-the-shelf solutions are there?’. Often there are candidates, and the next question is ‘will this off-the-shelf solution meet my needs and provide the cheapest option?’

Assessing potential candidates is not easy – in reality, it’s only once you are using the software that you can evaluate whether it really fits your requirements. Making the wrong choice causes disruption, demoralises staff and reflects badly on you. Assessing candidate software is not just ticking features off a checklist. It is about how the software will help grow your business: how it fits the way you do business, how usable it is, and how it will evolve to meet your changing needs.

There are times when it is also worth considering custom or “bespoke” software developed precisely for your needs, rather than what is available off the shelf.

Let’s look at four compelling reasons for bespoke software.

  1. The value of any software can be measured by uptake, whether by your customers or internally. Customer uptake is easier to measure; low internal uptake may look like staff being lazy or needing a lot of training, when in fact the software is simply hard to use. The greatest chance you have of achieving a solution that exactly meets your needs is by building it that way.

Workflow – Your software should reflect the way you wish to run your business. As Alan Cooper, the creator of Visual Basic, described it, software should take away the pain. The worst case is an off-the-shelf solution that simply does not fit your workflow – and forces your business to change needlessly instead.

Functionality – Build only what you need and make the experience satisfying to users – great software should be a delight to use. Feature-laden off-the-shelf solutions add no value if those features are not used. In fact, they reduce value, as users must navigate a mess of irrelevant user interface. As a subscriber to an off-the-shelf solution you pay for every feature, even if that cost is spread over many more subscribers.

  2. Software innovation has proven to turn problems into unique solutions.

Software not only has the potential to increase the efficiency of your current business processes; it can analyse and solve challenges and connect people, turning problems into unique solutions for you and your customers. Indeed, businesses today have innovated with bespoke software to create solutions that are the reason for their phenomenal growth. They haven’t achieved this with the off-the-shelf software their moderately performing competitors are using.

  3. Your business needs and opportunities should drive your development roadmap.

In our software development business, we use agile processes and lean development to ensure you are buying functionality that has the highest priority for your business growth. You are in control. Providers of off-the-shelf products must meet the needs of a diverse user group. If you can see the provider’s roadmap, you may be surprised just how long you will wait for your features; at worst, your highest-priority requirements may not be on the roadmap at all. In that case you will need the provider to undertake bespoke development. Often they will do this only when it is convenient, and there is little incentive for their price to be competitive.

  4. The very process of planning and undertaking bespoke development can have wider benefits in reviewing your processes – leading to innovation in the way you do business.

When we develop software, we use Design Thinking to understand the business goal and innovate with you – not just with the software but also the interaction with people, processes, hardware and data. As a result, your unique world view and people can unleash transformations well beyond the software.

This is not an exhaustive list of the benefits of bespoke development, but it focuses on four areas that are often overlooked or undervalued. In summary, bespoke development can be the best option where uptake is essential, where innovation can lead to business growth, and where a business wants a competitive solution.

The case for smarter tech in cattle and sheep breeding

My first job in livestock performance recording was with the Genetics Section, as it was called, at Ruakura Research Centre in New Zealand. I worked part time while studying at university, transferring research trial data off the government mainframe on reel-to-reel tape, and writing inbreeding coefficient calculation software.

The genetics section was based in an old converted house, where we sat around at large, wooden, public service desks, surrounded by high stacks of computer printouts, all painstakingly bound and labelled for future use. We were the leading edge of genetic improvement and livestock performance recording.

That was nearly thirty years ago of course, and the face and capability of modern technology have radically changed. Interestingly, however, many practices in the livestock recording industries still reflect that golden age, and it is only recently that the software tools and databases of – let’s be generous and say – 15 years ago have started to be refreshed.

In this, the first of two articles about technology in livestock breeding, I propose that we could make much more effective use of smart technologies to increase the rate of genetic progress and address commercially important, but hard to measure, animal characteristics. In my next post, I’ll examine how technology could reduce the cost of phenotype collection (I might even explain what a phenotype is), and encourage better use of improved genetics by commercial producers.

Measure what you can’t see

In our traditional performance breeding tools, we focused on things that farmers could readily measure: kilograms and counts – numbers of live progeny, and kilograms of liveweight, milk and wool. The good news is that most of those production traits are heritable, and we’ve made good progress over the last 30+ years.

So how do you measure characteristics that are important in modern farming systems?

  • Meat eating quality, so that consumers can repeatably have a great eating experience;
  • Feed conversion efficiency, converting inputs into product more efficiently, reducing greenhouse gas emissions per unit of product, and making the farming system more profitable;
  • For that matter, greenhouse gas emissions (where this is driven by livestock genetics rather than inoculation by a specific set of gut microorganisms);
  • Urine nitrate concentration, and hence one key environmental impact of extensive livestock farming;
  • Disease resistance and the response of animals to a variety of disease and parasite challenges;
  • Behaviour of animals around people and other livestock, including how they handle stressful environments such as being moved; and
  • Longevity, the ability of female animals to raise progeny season after season, reducing the substantial cost of replacement animals.

There are proxies for many of these measures, of course. Breeding for growth rates or milk production has arguably improved greenhouse gas efficiency, for example, but in some breeding systems a change in the mature weight of animals has increased emissions. Progeny tests and laboratory measures have been used in key programmes, but they may not help us routinely identify the genetic outliers that will lead the next leap in genetic progress.

New measurement and sensing technologies offer real potential to help with these “hard to measure” areas of animal performance in the coming years. Accelerometer and microphone technologies can identify individual animal eating habits, heats and parturition (birth) dates. 3D and multispectral cameras tell us about carcass and meat product characteristics, and additional characteristics of milk. Increasingly, this data will be collected in-line or in near-real-time, providing a rich stream of data that could be analysed for many purposes.

The next generation of animal recording and genetic analysis systems must be built to handle this variety of real-time, streamed data – or at least the results of analysing it.
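As a rough illustration of what handling such a stream might involve, here is a minimal sketch that flags sustained bursts of accelerometer activity of the kind an eating- or movement-detection pipeline would look for. The window size and threshold are entirely invented for illustration:

```python
from collections import deque
from statistics import mean

def flag_activity(readings, window=5, threshold=1.2):
    """Flag windows of elevated accelerometer magnitude.

    `readings` is an iterable of (timestamp, magnitude) tuples;
    the window size and threshold are illustrative only.
    """
    buf = deque(maxlen=window)
    events = []
    for ts, mag in readings:
        buf.append(mag)
        # Once the window is full, a high rolling mean marks sustained activity
        if len(buf) == window and mean(buf) > threshold:
            events.append(ts)
    return events

# Simulated stream: quiet, then a burst of movement
stream = [(t, 1.0) for t in range(10)] + [(t, 2.0) for t in range(10, 20)]
print(flag_activity(stream))
```

In practice the interesting work is in classifying which bursts are grazing, rumination or parturition, but the streaming shape of the problem is the same.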

Fewer errors, more progress

A primary driver of any livestock recording and animal evaluation system is to enable breeders and commercial producers to make better decisions about the animals they use in breeding. Computers don’t select animals: people do. Where a producer chooses an animal because they like the look of its eyes, or its stance, or its colour, and ignores the potential impact of the animal on their herd, the results will be at best random, and often detrimental.

Formal breeding schemes with EBVs and indexes seek to inform better decisions about the breeding merit of animals, but EBVs can be limited by the information available:

  • Errors in recording parentage and animal relationships;
  • Allocation of records to the wrong animals;
  • Transposition and recording errors when capturing data; and
  • Failure to account for environmental effects such as the feeding and management regimes of groups of animals, the age of the mother, or whether an animal was reared as a single or twin.

Technology is playing a substantial role in improving the accuracy of EBVs, notably through genomic DNA analyses, which resolve the fraught process of parentage recording and contribute substantially more information earlier in each animal’s life cycle. Better facilitation and handling of genomic data collection is well overdue in animal recording systems, and I’m pleased to see this being addressed.
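One of the simplest genomic parentage checks is counting opposing homozygotes – loci where a calf and a candidate parent carry opposite homozygous genotypes, which (genotyping error aside) cannot occur in a true parent–offspring pair. A minimal sketch of that idea, with made-up genotypes and an illustrative error threshold:

```python
def opposing_homozygotes(calf, sire):
    """Count SNP loci where calf and candidate sire are opposite homozygotes.

    Genotypes are coded as the count of one allele (0, 1 or 2). A true
    parent-offspring pair should show (near) zero opposing homozygotes,
    apart from genotyping errors.
    """
    return sum(1 for c, s in zip(calf, sire)
               if (c == 0 and s == 2) or (c == 2 and s == 0))

def plausible_parent(calf, sire, max_conflicts=2):
    # The tolerance for genotyping error is an illustrative placeholder
    return opposing_homozygotes(calf, sire) <= max_conflicts

calf = [0, 1, 2, 2, 0, 1]
sire_a = [0, 2, 2, 1, 0, 0]   # compatible at every locus
sire_b = [2, 1, 0, 0, 2, 1]   # opposing homozygotes at several loci
print(plausible_parent(calf, sire_a), plausible_parent(calf, sire_b))
```

Real parentage assignment uses tens of thousands of SNPs and likelihood-based methods, but the exclusion logic above is the intuition behind them.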

In addition to genomics, electronic identification (EID) and automated recording systems can remove many identification and data capture errors, and the ability to feed this data seamlessly into modern evaluation systems, without manual manipulation, will provide another leap forward.

Recording management groups properly has been a real limiting factor in many breeding programmes, and is one of the key hesitations in extending these to commercial producers. I believe that sensors that identify eating and movement behaviours, and location or proximity to other animals, will help us to automatically and transparently solve the problem of recording management groups and regimes. This will provide another substantial step forward in removing the noise of environmental effects.

Of course, more accurate EBVs are still only one piece of the puzzle. Helping producers to use this information effectively is another, and something I’ll address in my next post.


Rezare Systems is a bespoke software design and development company specialising in the agriculture sector. We have special expertise in building livestock recording and management systems, and tools for data collection and integration. Learn how Rezare Systems can assist your business.

More than one way to skin the data sharing cat

Last week the UK Agriculture and Horticulture Development Board (AHDB) announced an industry consultation to develop a set of principles (code) to promote the sharing of farm data. Happily, we at Rezare UK have been awarded the contract to run this project based on our unique agridata expertise and our significant experience in developing a code in NZ.


Improving the flow of data from farms to other organisations is seen (rightly) by the AHDB as part of the productivity agenda for UK agriculture, but significant barriers remain to getting the data flowing in practice, mainly because of issues around trust and the interoperability of disparate sets of data.


While the code will go some way towards addressing issues of trust (and start to build alignment across the industry on best practice when it comes to sharing and using farm data), other issues beyond the code itself will also need to be addressed, particularly the more technical aspects of exchanging and using the data.


Two really good examples of dealing with this have emerged in the past couple of years – DataLinker in NZ and Agrimetrics in the UK. These two approaches (the latter is one of four UK government agritech centres of excellence), while quite different in nature (and to a degree in objectives), are actually potentially very complementary.


DataLinker works on a model where no one party becomes the single repository and broker of farm data. Instead, data owners build APIs to standardised schemas – once only – so that permissioned third parties can access that data in a known way. The exchange of data between its owner and its user (“consumer”) is a bilateral relationship; DataLinker provides the permissioning (tokens) and legal frameworks (templated agreements) to streamline and standardise the process.


DataLinker assumes that each potential system is in fact its own “locker” (store of data) with one or more types of data. Users of some sort already interact with those systems, so what DataLinker does is standardise the way of finding which systems have which types of data (the findable F in FAIR data sharing) and in which formats (the interoperable I in FAIR). It specifies the method by which organisations agree data access rules and users provide permission (together, the accessible A of FAIR), with the result that the data is reusable (the R in FAIR). DataLinker has been focused more on the farmer or user-facing sharing of data than for broad data access necessary for researchers for example (at least without organisations explicitly addressing this).


Agrimetrics in the UK employs the semantic web whereby publicly available data (published on the web) or private data made available under a licence agreement is organised according to a Resource Description Framework (RDF). Each data entity is described as a “triple” (subject-predicate-object) and in that way stored data becomes machine readable by being linked to other data entities. The data contributed is effectively “held” by Agrimetrics and then exposed through APIs (charged or free) under licence for third parties to use.
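The triple idea is easy to sketch: treat each fact as a (subject, predicate, object) tuple and match patterns against the collection. The entity names below are invented for illustration, not Agrimetrics identifiers:

```python
# Each fact is a (subject, predicate, object) triple; linking entities
# through shared identifiers is what makes the data machine-navigable.
triples = [
    ("farm:101", "hasField", "field:7"),
    ("field:7", "grows", "crop:wheat"),
    ("crop:wheat", "harvestYield", "8.2 t/ha"),
]

def query(triples, s=None, p=None, o=None):
    """Return triples matching the given pattern; None acts as a wildcard."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# What does field:7 grow?
print(query(triples, s="field:7", p="grows"))
```

Production RDF stores add formal vocabularies, URIs and the SPARQL query language, but the subject-predicate-object pattern is the whole foundation.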


Agrimetrics is focused on big data, and uses the semantic web to tag and structure large datasets in public HTML documents (and other data) in a way that makes them machine-recognisable and readable.


In essence the two approaches can be differentiated thus:

  • DataLinker is a network approach – a set of protocols and standards that allow myriad parties to exchange and share data in multiple bilateral (albeit mostly templated) arrangements through standardised APIs.
  • Agrimetrics is a hub approach – data is shared to the Agrimetrics “centre”, where it is stored, manipulated and interpreted before being shared as a more user-friendly asset under licence through APIs.


In many ways Agrimetrics is the more comprehensive since it seeks not only to broker data exchange but also to add value to the data by linking it and manipulating it to meet a particular consumer’s need. It can handle structured or unstructured data. This is potentially very powerful as it allows a consumer of the data to draw on Agrimetrics’ technical know-how and capacity to do increasingly clever and machine-learning based activities with the data. In other words, Agrimetrics can offer a one-stop-shop for brokering and adding value to data.


However, there are also problems with the approach. It assumes a high degree of integrity and legal rigour being exercised by Agrimetrics since the data sharers are effectively “letting go” of their data to be stored and used by an organisation that is looking to commercialise it. And in the absence of private data holders being prepared to release data, Agrimetrics is only as good as the publicly available (web published) data.


DataLinker does not (and is not intended to) become involved in negotiating commercial deals to share data. Nor does it become involved in managing, manipulating or interpreting the data. It is largely a hands-off approach designed to facilitate the network, not control it. But the adoption of standardised schemas means there is an IT burden on the data sharers – either in-house or outsourced – to build compliant APIs.

Understanding the DataLinker and Agrimetrics approaches

So is one approach likely to prevail? Most likely not and it’s actually preferable for the two to co-exist and complement each other. Here’s why:

  • First, because culturally the DataLinker approach is more aligned to putting the interests of the farmer first, and right now farmer trust in how their data is controlled and used is becoming almost the biggest blocker to progress;
  • Second, because the industry is unlikely to want all its eggs in one basket;
  • Third, because the horsepower in Agrimetrics is potentially a game changer in terms of releasing real innovation based on farm data, and thus demonstrating to farmers the value proposition of sharing their data (another missing piece of the sharing jigsaw);
  • Fourth, because the DataLinker approach, through its JSON-LD APIs, means data can be “readied” for consumption in a semantic way, which would complement the success of Agrimetrics; and
  • Fifth, because the semantic web is likely to be a long-term approach favoured particularly by the research community within the agrifood sector.


There are other concepts for farm data sharing that are being considered around the globe.


For example, Wageningen University in the Netherlands has proposed a Farm Data Train, which effectively creates a number of data lockers (stores), all with the same API and approach to authorisation, so that their interfaces in effect align closely to what is proposed in DataLinker. At present the concept is focused more on plant-breeding data, but it could easily grow outwards.


So what’s my point? Well, as can be seen, there is more than one way to skin the proverbial cat. What’s important is for the sector to give these approaches space to breathe, so that there is more opportunity for innovation to deliver against the productivity agenda. That will need some collaborative thinking, and in the coming months we shall discover how the UK agri sector wants to address these issues.


It’s a great time to be involved in agridata and better still that Rezare are in the thick of shaping the future.

Photo credit: Vadim_Key (iStock)

How DataLinker streamlines agricultural technology connections

Information is the life-blood of today’s businesses and will enable the transformations occurring in the agriculture and food business sector. Historically, information has only been exchanged between businesses at the transactional level (such as shipping notices and invoices), while richer data that could differentiate products, demonstrate environmental compliance, and optimise business value has remained isolated in silos.

DataLinker is designed to give agricultural businesses (farmers, processors, input suppliers and advisers) the ability to access and combine data from multiple sources in flexible, and timely ways, without requiring many hours of skilled technical resource to carry out data exports and imports.

Integrating and effectively sharing data looms large for many businesses, so companies are investing in their own development and infrastructure – and discovering the challenges: data standardisation, supporting different interfaces for each partner organisation, and the time taken to negotiate data access agreements. DataLinker addresses these issues.

What is DataLinker?

DataLinker is a framework for agriculture and food businesses that wish to interchange data. In many ways it is analogous to the GS1 frameworks used to interchange shipping notice and invoice data, or the AgGateway framework used in the grain supply space. DataLinker’s primary focus was to allow farmers to bring data from a variety of sources into the tools they use for decision making, but it can be equally beneficial to all companies in the sector.

DataLinker consists of four major components that work together:

  • Data exchange specifications (“schemas”) that standardise sets of data using the Farm Data Standards and modern internet protocols (developed collaboratively with the input of member companies);
  • A small central registry where companies can discover which organisations implement each specification and how these are accessed;
  • Standardised contract terms that can be used to reduce negotiating time and legal costs in the majority of data interchanges; and
  • Technical tools to support secure agreement of data access terms, approval of access, and (where necessary) farmer permission for individual farm data sets.
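The registry component above can be pictured as a simple lookup from schema to implementing organisations. The organisations, schema names and endpoints below are hypothetical examples, not real DataLinker entries:

```python
# Hypothetical registry entries: which organisations implement which
# data schema, and where their API lives. All names and URLs are
# illustrative placeholders.
registry = [
    {"org": "MilkCo", "schema": "animal-weights-v1",
     "endpoint": "https://api.milkco.example/weights"},
    {"org": "GrazeApp", "schema": "pasture-cover-v1",
     "endpoint": "https://api.grazeapp.example/cover"},
    {"org": "MilkCo", "schema": "milk-production-v1",
     "endpoint": "https://api.milkco.example/milk"},
]

def providers_of(registry, schema):
    """Discover which organisations implement a given data schema."""
    return [entry for entry in registry if entry["schema"] == schema]

print([entry["org"] for entry in providers_of(registry, "animal-weights-v1")])
```

The point of such a registry is discovery only: having found a provider, a consumer then negotiates access and calls the provider's own API directly.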

DataLinker is not a database, nor a central communications hub through which all data might pass.

All of the DataLinker specifications and framework components are based on internet standards, and companies are responsible for implementing the specifications in their own IT systems, although support is provided.

How does the commercial model work?

DataLinker Limited has been incorporated as a separate entity to operate the DataLinker registry and support the collaborative development of standardised API specifications in areas its users direct. The board of directors comprises representatives from Beef+Lamb NZ, DairyNZ and MPI, and an independent chair appointed by DataLinker’s subscribing members. DataLinker Limited operates effectively as a not-for-profit, to encourage adoption and benefit the agricultural community.

There are no transaction fees.

Members pay a joining fee of $6,000 NZD (waived for New Zealand organisations prior to 31 May 2018), and an annual subscription of either $3,500 (for organisations either providing or consuming data), or $4,500 (for heavier users both providing and consuming data). These fees are analogous to membership of a standards organisation, supporting the operation of the registry and collaborative maintenance of specifications.


Are you making authentic supply chain promises?

If you’re in the food business (whether that’s retail, food service, processing, farming, or supply), consumers are asking questions about your supply chain.

Of course, they may not be asking you directly, and they may not be asking your retail or food service partner, but they are asking: on social media, on recommendation sites such as TripAdvisor and Yelp, even over drinks at their local.

Are you providing the information they need to be confident about the quality and safety of your product? Do you have a substantiated story around provenance, animal welfare and the environment?

Safeguards such as DNA testing lasagna are “bottom of the cliff” activities, an attempt to rebuild broken trust and arguably too limited and late in the supply chain.

Future product preference and even acceptance relies upon a supply chain that can show ethical practices: in how environmental impacts are managed, natural biodiversity is encouraged, animal welfare is maintained, anti-microbial resistance is avoided, and workers and communities are treated.

Activist groups and the power of social media mean that our response to these demands must be much more solid than a promise or a declaration form. We must have the systems and measures to back up our words – and be able to demonstrate as much to auditors and our supply-chain partners.

For those of us at the confluence of technology and agriculture, this means we must do more than just record activities and calculate gross margins. We must step up with tools that capture rich data in support of farming activities, and which actively encourage good decisions that improve both profitability and sustainability.

All this needs to be done with minimal additional effort by farmers and their staff, and aligned to real-world processes on farm.

I’ll be speaking at MobileTech 2017, the annual summit for technology innovations in the primary sector, reflecting on these challenges. I’ll summarise some of the work Rezare Systems is doing in this space, and suggest ways the industry could apply technology to the opportunity.

This article was first published at www.rezare.com/blog  


Farmers love technology, fear misuse

Increasing numbers of farmers see technology as useful and important to their farming businesses, and farmers are looking to invest further in new technology over the coming years. Despite this, lingering concerns about data sharing, privacy and control remain.

According to the October 2016 Commonwealth Bank of Australia Agri-Insights Survey of 1600 Australian farmers, 70% of farmers believe that the digital technology available adds significant value to their businesses.

The Ag Data Survey published by the American Farm Bureau Federation (AFBF) also found that farmers are optimistic about technology, with 77% of farmers planning to invest in new technology for their farms in the next three years.

Farmers also see value in sharing and re-use of data, but privacy and control are the largest barriers to more widespread re-use.

The Agri-Insights Survey found that:

  • 76% of farmers think that there is value in sharing on-farm production information with others;
  • 58% of farmers currently share some on-farm production information with others; and
  • Of the farmers who don’t see value in data sharing, “privacy concerns” (28%) is the most common reason.

The New Zealand Office of the Privacy Commissioner surveyed New Zealanders about privacy and their attitudes to data sharing in April 2016. They noted that:

  • 57% of respondents were open to sharing data if they could choose to opt out;
  • 59% were open to sharing if there were strict controls on who could access data and how it was used; and
  • 61% were open to sharing if the data was anonymised and they couldn’t be personally identified.

The US AFBF survey also highlighted some of these concerns in an agricultural context:

  • Only 33% of farmers had signed contracts with their ag-tech provider. Another 39% knew of their provider’s policies but had not signed anything;
  • When farmers were asked if they were aware of the ways in which an ag-tech provider might use their data, 78% of farmers answered “no”; and
  • 77% of farmers were concerned about which entities can access their farm data and whether it could be used for regulatory purposes.

Not just farmers

Confidentiality and control can be barriers to companies too. After all, much of the data is about their activities, products, or equipment as well as the farm itself.

It’s not always clear how other parties will behave when sharing data. Organisations generally make reasonable and effective use of data and meet confidentiality expectations, but there is always a risk that they won’t. So companies sharing data are forced to negotiate “iron-clad” agreements, keeping the corporate lawyers busy and making any new data exchange the subject of long-winded negotiations.

As soon as you get into negotiations like this, costs rise. If one of the parties is a smaller player with less negotiating power (company or farmer), they may never be able to conclude a useful data access deal. The end result? A slower rate of innovation, the benefits of information to the farmer and overall supply chain are not fully realised, and sharing data becomes a much more expensive exercise than you would otherwise expect.

Over the years, industry players have experimented with different ways to address these issues. Centralised industry-good databases and exchanges have been proposed, and these could be very effective. Unfortunately, concern about centralising large amounts of data, and the loss of control that this brings, has led players to hold back some or all of their data from such repositories.

Other groups have posited that all data should be in the exclusive control of the farmer, and have built exchanges or created open API standards on that basis. We applaud this, but it doesn’t always reflect the significant effort that companies and service providers invest in creating and curating some data sets. The end result is that some data sets are often held back from such exchanges.

A collaborative approach

The New Zealand primary industry has worked on several approaches to this problem, in a collaboration between the red meat sector, the dairy sector, and the Ministry for Primary Industries.

The Farm Data Code of Practice is designed to encourage greater transparency between farmers and service providers or vendors about the data that is held, and the rights that each party has to the data. A straightforward accreditation process gives farmers confidence that organisations have “got their house in order” when it comes to terms and conditions and data policies.

The DataLinker protocol builds on the standardised, open API approach to sharing data, but with three key considerations:

  • It provides a way for organisations to agree a Data Access Agreement without a protracted legal negotiation. Standard agreements are provided and encouraged, to reduce the overhead that all parties face in legal costs and time (that said, custom agreements are still possible where absolutely necessary).
  • Accepting a Data Access Agreement doesn’t give the recipient “open slather” to the data; for most data sets, explicit farmer approval is also required, requested and confirmed by the farmer using standard web authorisation protocols. Farmers grant permission to access data that covers their business, and can also withdraw that authorisation.
  • As an Open API approach is used rather than a central database or exchange, there is no “central service” that must be involved in each data transfer. This reduces the “attack surface” from a security perspective and enables organisations to retain control of the data they hold.
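The farmer-permission step above might be sketched as follows. In a real deployment the decision would be carried in an OAuth-style access token issued when the farmer grants consent, rather than an in-memory dictionary, and all the names here are purely illustrative:

```python
# grants maps (data consumer, farm) -> the set of datasets the farmer
# has authorised that consumer to access.
grants = {}

def authorise(consumer, farm, dataset):
    """Record a farmer's explicit grant for one dataset."""
    grants.setdefault((consumer, farm), set()).add(dataset)

def revoke(consumer, farm, dataset):
    """Farmers can withdraw an authorisation at any time."""
    grants.get((consumer, farm), set()).discard(dataset)

def may_access(consumer, farm, dataset):
    """A signed Data Access Agreement alone is not enough: the farmer's
    explicit grant for this dataset must also be present."""
    return dataset in grants.get((consumer, farm), set())

authorise("AdvisoryCo", "farm:101", "milk-production")
print(may_access("AdvisoryCo", "farm:101", "milk-production"))  # after grant
revoke("AdvisoryCo", "farm:101", "milk-production")
print(may_access("AdvisoryCo", "farm:101", "milk-production"))  # after withdrawal
```

The design choice worth noting is that the check is per dataset and per consumer, so withdrawing one grant leaves every other data flow untouched.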

Organisations adopting the DataLinker protocols benefit in several ways:

  • Farmers see that they are playing their part in maximising the use of information;
  • Standardised APIs and Data Access Agreements reduce the time and money invested in negotiating and creating custom solutions for every interaction;
  • Data Access Agreements mean that companies still retain the necessary control over high-value data sets, and are able to meet the privacy and confidentiality terms they have agreed with farmers; and
  • Companies and farmers can efficiently use sets of data which otherwise might have been too expensive to collect, or required a level of farmer input which would have discouraged adoption.

Our hope is that this framework will help organisations and farmers to maximise use of farm information, reducing long-term costs and encouraging greater innovation.

What you can do about this:

  • Want to see which New Zealand companies are accredited under the Farm Data Code of Practice? Check out www.farmdatacode.org.nz and drop an email to your key information providers to find out when they will be accredited.
  • Interested in the DataLinker protocols and how they can be adopted by your business? You’ll find information at www.datalinker.org.
  • Planning your strategy in this data space, or considering next steps? Talk to us – we’re happy to provide you with background and advice.

How on-farm data and analysis can support credence attributes

Can on-farm technologies and “big data” support food and fibre product attributes that consumers value?

In a previous article I noted a Hartman Group study that suggested that consumers are interested in attributes other than just the look and price of a product, wanting to know:

  • What ingredients are in the food or beverage product (64%);
  • How a company treats animals used in its products (44%); and
  • From where a company sources its ingredients (43%).

We call these informational aspects of a product “credence attributes”, meaning that they give credence to our decision to purchase (or not purchase) a product or service, but can’t be directly assessed from the product itself, either before purchase (on the basis of colour or feel) or after purchase (on the basis of taste, for instance).

Characteristics such as “organic”, “environmentally responsible”, “grass-fed”, and “naturally raised” relate to the story behind a product. A product may communicate these through advertising, packaging, and other ways of telling the product story.

But consumers are also looking for authenticity and integrity in their food and other products. There’s a consumer backlash when the product story on the pack is in conflict with other data sources – such as claims in news articles or secret video footage.

We’ve been exploring ways that feeds of data from on-farm technology could be used to support the product provenance and credence story – or at least signal to farmers and their supply chain partners where checks and improvements should be considered. Here are a couple of examples.

Monitoring carbon footprint

Carbon life-cycle assessments (LCAs) are used to understand the extent to which production, manufacture, and distribution of a product impacts on climate change through deforestation or release of greenhouse gases such as carbon dioxide, methane, and nitrous oxide. We learn some interesting things from these, sometimes showing that shipping food products from the other side of the world can have a lower impact than growing products locally if the local environment is less hospitable.

Importantly, producing a life-cycle assessment creates a model – a series of equations and if-then logic that describes the calculation. We can use this model with appropriate local farm and supply chain data to understand how management decisions and activities, timing, and stock or crop productivity impact on emissions.

Automated systems on farms that capture data about crop production, livestock weights and production, and farm activities can also deliver data for a custom life-cycle assessment. Benchmark data across multiple farms and it becomes possible to identify the patterns of complete vs missing data, to understand how climatic constraints change emissions, or to identify outliers that need to be more closely examined.
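To make the idea concrete, here is a toy sketch of how such a model boils down to equations applied to farm records. The stock classes and per-head emission factors below are hypothetical placeholders for illustration, not published LCA coefficients.

```python
# Illustrative only: a life-cycle assessment model is essentially a set of
# equations and if-then rules applied to farm data. The emission factors
# below are hypothetical placeholders, not published LCA coefficients.

def enteric_methane_kg(stock_class: str, head: int, days: int) -> float:
    """Estimate enteric methane (kg CH4) for a mob over a period."""
    # Hypothetical per-head daily emission factors (kg CH4/head/day)
    factors = {"dairy_cow": 0.30, "beef_cattle": 0.18, "sheep": 0.025}
    if stock_class not in factors:
        raise ValueError(f"No emission factor for {stock_class!r}")
    return factors[stock_class] * head * days

# Aggregate across mobs as recorded by an automated farm system
mobs = [("dairy_cow", 400, 365), ("sheep", 1200, 365)]
total = sum(enteric_methane_kg(*mob) for mob in mobs)
print(f"Estimated enteric methane: {total:.0f} kg CH4/year")
```

A real assessment layers many more such equations (feed, fertiliser, fuel, land-use change) and conditional logic over the same kind of farm data feed.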

A note of caution here: as we’ve learned from nutrient budgeting, farm systems can be varied and life-cycle assessment models are frequently based on the “typical”. An outlier result may indicate greater variation than the model can handle, rather than a more or less efficient farming system.

Demonstrating animal welfare

Animal welfare and the ability to live a healthy and natural life is another area of concern to consumers. Here too, metrics collected on-farm can be the subject of automated analysis to demonstrate that good practices are followed.

In Europe where a premium is payable for “grass-fed” dairy in some regions, farmers are experimenting with the use of monitoring devices – smart tags and neck bands for example. These devices capture data that provide farmers with early warning of heats and potential animal health issues – raised temperatures, more or less movement, and reduced eating for example – but can also be analysed for patterns that only show up in outdoor grazing.

In other jurisdictions, veterinary product purchase, use, and reordering records can help to demonstrate compliance with animal health plans worked out between farmers and veterinarians, and hence demonstrate good welfare practices and appropriate use of medicines. Paper records have been used for this purpose for many years, but software technologies and automated data analysis can reduce the burden of data collection and the need for manual audits and analysis.

Practical application

Some producers will find the thought of such automated systems invasive and potentially threatening. Certainly, given the potential for outliers, for good practices that just don’t quite fit the expected mould, and for technology glitches or human error, you couldn’t use these measures as legal baselines that determine “rights to farm”.

Nevertheless, application of technology and analytics such as these can help us as we seek to improve farming practice and improve the integrity of our food supply chains. A good starting point might be to apply these as tools for committed producer groups that are already aligned with supply of a premium product or market.


This article was first published at http://www.rezare.co.nz/blog/. Contact us to learn how we apply software and models to agricultural data.

Why invest in tech for farming?

Every second business writer today seems to be talking about ag-tech, big data, and the internet of things. If you’re an agriculture sector organisation, or a company servicing or purchasing from primary producers, you might be forgiven for thinking you missed a day in the office and overnight farming has become a connected, automated, artificially-intelligent system.

Reality of course is still far from that utopia (or dystopia, depending on your point of view). What the writers are telling us is that there are a range of interesting technical possibilities that might have application in agriculture: and that some of the very early adopters, enthusiasts, and visionaries are trialling these. In some cases, they’ve used technologies in interesting experiments and learned useful things about their supply chain or farming system.

All well and good; but if you don’t consider yourself a leading edge visionary (or perhaps you don’t have the same appetite for risk) should you just ignore the hype, and wait until the technology matures?

For the pragmatists among us, who are more interested in achieving practical and strategic goals than in experimenting, here are some areas where I think there is value in leveraging technology in your business today.

Improving communication

We all know that good communication between people is what keeps the wheels of business oiled and turning. Technologies, systems, and business rules are no replacement for good people relationships.

Your customers, suppliers, and business partners see all of your interactions with them as part of that same interpersonal relationship. Are your reports or invoices late? Do they lack critical information your business partners need? Personal reassurances will go so far, but if you can’t achieve timely and data-rich information delivery that helps your partners, they may look elsewhere.

We’ve helped a number of our customers lift their communications with suppliers or customers. In some instances, we’ve delivered apps that help farmers access key pieces of information as they become available. In other cases, we’ve implemented reports and visualisations that are valued by business partners for the timely insight they provide. This isn’t just data: it’s business communication.

Understanding critical business metrics

Many businesses try to track too many metrics, and don’t always closely manage the key metrics that are leading indicators of success.

What drives your business? Primary production and processing businesses are often heavily influenced by factors such as weather and global market demand, but these factors are outside the control of most businesses and may have less impact on long-term profitability than we imagine.

It’s typically our response to outside factors, and our ability to continue to produce value despite them, that determines long-term success. Measures of productivity per unit of input, and effectiveness at delivering high-value products, are better indicators than raw dollar returns or kilograms of product shipped.

For some of our customers, this has meant improving alignment between their financial data and physical data records. For instance, benchmarking kilograms of product produced, farm working expenses and profit against the potential pasture or crop production for that season, allowing more effective comparison of improvements across seasons. For others it has meant focusing on the proportion of product meeting specification for high-value markets, regardless of whether the market actually delivered the desired price premium in that particular season.

Data integration can help bring these disparate data sets together for timely comparison, and in-field monitoring technologies or remote sensing can deliver the physical data needed to make sense of the product and financial outcomes.

Responding to changing conditions

How do you assess and respond to the risk of changing weather and markets?

Studies of farmer responses to drought and similar challenges indicate that we tend to respond too late, and in a conservative, “piecemeal” fashion. We seem to bet that things will trend back to normal sooner rather than later, and under-do our response.

Monitoring tools such as climate stations and market data visualisations allow us to understand trends and risks early – before their impacts really start to bite. Of course, responding to these is still a challenge: will I reduce stock numbers only to see the weather change?

Mathematical models don’t yet give those definitive answers some futurists might lead you to expect, but they allow you to ask the “what if” questions, looking at potential decisions and impacts. These allow you to build a plan for your business and understand what your critical review and decision points need to be.

Time to start learning

The pace of change in technology is only likely to accelerate in coming years, and the technology we use in farming in ten years may be very different from what is available now. It is tempting to just “wait and see” what evolves, but advanced agricultural businesses choose to embrace technologies that can deliver concrete benefits and which align with their goals. Consider ways that technology can help you achieve more effective communication, improved understanding of business metrics, and the ability to assess and respond to change.

What technologies are you embracing in your business, and why?

Is connectivity interrupting your agricultural vision?

The last few years (even the last few months) have seen a surge in organisations exciting us about what farmers will do with sensors and mobile devices. We’ll be able to collect data with drones flying above our fields and tiny devices scattered across our farms. We’ll collect data with smart ear-tags and boluses in livestock, and we’ll crunch it all with powerful cloud-based analytics, with smartphones providing a convenient way to tap into those analytics so that we can make decisions on the move.

That’s the promise.

Given time, much of this is achievable, though being savvy business people, farmers will only adopt technologies that deliver value or reduce risk. Even if the technology does provide significant value, there may also be a challenge in connectivity: how do all these devices connect together and to the cloud?

I was thinking about this just last week, when I was demonstrating livestock management software to a group of farmers. It was a wintery day with snow on the hills, and we were standing out in the sheep yards, protected from the sun (but not the wind) by the steel roof.

We had just demonstrated how trivially easy it was to capture information about animals and synchronise it onto a smartphone, and now we were going to show the same information synchronised to the cloud, through a web browser and a mobile data connection.

We pressed “refresh” and we waited – and waited.

Did I mention that I had plenty of time to think?

Of course, eventually the page loaded and the demonstration carried on, reasonably successfully too. The farmers of course took the opportunity to point out that their remote yards would have no coverage at all!

We face a variety of different connectivity challenges in rural environments:

  • Low bandwidth and high latency connections;
  • Connectivity black-spots and intermittent coverage; and
  • Areas with no connection at all.

Low bandwidth and high latency

In many cases, rural networks (fixed or mobile) provide significantly lower bandwidth than those in the cities, or they may have significantly higher latency (time to respond). For some applications, this doesn’t matter at all, but for others it may be a show stopper.

I watched a drone software manufacturer demonstrate the features of a very smart unmanned aerial vehicle (UAV). On-board software used the GPS to navigate the device over a flight plan we outlined using Google Maps. A multi-spectral camera captured high resolution images at 1cm per pixel, and the device transmitted the images to the cloud while it was still flying. That’s pretty incredible!

Transmitting images as they are captured is a great innovation. The intensive processing power required to stitch images together and analyse them is provided by a remote data centre and applied to data arriving from drones around the world. It also frees the operator from messing about with memory cards or thumb drives and processing software.

This works just great on the 4G mobile networks in the Salinas Valley, California. Not so much in rural Wales where the mobile networks drop back to GSM or EDGE, with significantly lower throughput. Still, there may be opportunities to address this. If the drone has sufficient memory, it could cache images until bandwidth improves, or when it returns to a connected base. Alternatively, flying at 400ft above the terrain may provide better coverage than we experience on the ground.

High latency rural connections provide a different challenge. We typically experience an increase in latency with satellite connections, as the signal travels to and from ground stations via a far-distant satellite. Satellite connections can still offer very high bandwidth (high data throughput), but have a measurable delay, as you’ll notice if you use a satellite-based phone service – “over”.
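The physics explains why that delay is unavoidable for geostationary satellites. A quick back-of-envelope calculation, assuming the ideal case of a satellite directly overhead and a signal at light speed:

```python
# Back-of-envelope: minimum one-way delay via a geostationary satellite.
# Ideal case assumed: satellite directly overhead, signal at light speed.
GEO_ALTITUDE_KM = 35_786         # geostationary orbit altitude above Earth
SPEED_OF_LIGHT_KM_S = 299_792

# Ground -> satellite -> ground: two legs per one-way message
one_way_s = 2 * GEO_ALTITUDE_KM / SPEED_OF_LIGHT_KM_S
round_trip_s = 2 * one_way_s

print(f"One-way latency: {one_way_s * 1000:.0f} ms")    # ~239 ms
print(f"Round trip:      {round_trip_s * 1000:.0f} ms")  # ~477 ms
```

Real-world latency is higher again once processing and routing are added, which is why a delay of a second or two in an interactive application such as a live auction is entirely plausible.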

This latency won’t affect most applications to any noticeable extent, but consider the case of a livestock auction happening at a remote location. Smart auction software allows internet bids to be synchronised with the bids happening on site, right down to subscribers hearing and seeing the auction in real time. A delay of one or two seconds in this case becomes significant, so auction planning and operation needs to take this into account.

Mobility “black spots”

Much of the rural countryside lacks mobile coverage. Hills, trees, and buildings can all affect local reception, especially as distance from a cell tower increases.

This limits our dependence on these networks for “life and death” situations. There’s a reason why Search and Rescue services encourage hikers and hunters to use Personal Locator Beacons (PLBs) rather than rely on mobile phones when in remote areas, and the same should apply to tools that are triggered when an ATV rolls on the farm – the vehicle could well be in a gully outside of cellular network coverage.

We should also plan on intermittent connectivity when developing apps for the farming sector. Requiring an internet connection to start or use an application will limit its use to favourable environments close to home where that coverage exists. An answer of course is smart synchronisation of relevant data, when devices come back into network coverage.

I was quite confident demonstrating our livestock management application remotely, for example, as the list of animals had already been stored on the mobile device and, most importantly, the application was designed to work disconnected and sync later. If there was no coverage, the new data I had captured would be sent to the server when a connection became available.
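The store-and-forward pattern behind that behaviour is straightforward. Here is a minimal sketch (not our actual implementation): records are always written to a local queue first, and a sync step drains the queue whenever a connection succeeds.

```python
import json
from collections import deque

class OfflineQueue:
    """Store-and-forward sketch: capture records locally, sync when online.

    `send` is any callable that transmits a payload and raises OSError on
    failure – a hypothetical stand-in for a real network transport.
    """

    def __init__(self, send):
        self._send = send
        self._pending = deque()

    def capture(self, record: dict):
        # Always store locally first: the app keeps working with no coverage
        self._pending.append(json.dumps(record))

    def sync(self) -> int:
        """Attempt to push pending records; stop at the first failure."""
        sent = 0
        while self._pending:
            payload = self._pending[0]
            try:
                self._send(payload)
            except OSError:
                break  # still offline; records stay queued for the next try
            self._pending.popleft()  # remove only after a confirmed send
            sent += 1
        return sent
```

The key design choice is that a record is only removed from the local queue after the send succeeds, so a dropped connection mid-sync never loses data.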

Bringing your own network

There are always spots with no connectivity: sheep yards in a valley between steep hills; installing nitrate sensors in a creek in a ravine; even equipment in the shadow of a large steel building. What options are there if we must install our technologies where there is no connection?

It turns out there are a range of options, but the need for technical savvy and the cost of equipment both rise in these cases.

  • Consumer networking such as Wi-Fi and Zigbee mesh networks operate at 2.4GHz, providing line-of-sight connectivity over relatively short distances. Trees, hills, and even buildings can block these signals. Wi-Fi’s real strength is that it is a relatively low-cost and easily deployed option. It’s less suitable where extreme battery life and low power is a requirement.
  • There are also specialist wireless networks. These often use lower frequency radio transmissions, operating at 433, 868, or 900-921 MHz. They require a larger antenna, but these frequencies can travel longer distances without increasing the power requirement, and may be less prone to blockage by trees. Some of these networks transmit point-to-point between sender and receiver, and others form a mesh communicating between multiple devices.
  • New standards are evolving for low-power, long-range wireless networks such as the LoRaWAN and SigFox networks. Telecommunications providers are starting to adopt these standards to provide specialist connectivity for “Internet of Things” devices and remote locations. Where these are used at low frequencies in rural networks, they may well provide better coverage for monitoring devices in previously poor locations.
  • Finally, we might consider using mobile phones to “collect and deliver” data from monitoring devices in places that have no coverage but are regularly visited or passed. A remote sensor with some memory and Bluetooth Low Energy capability might deliver summary information to your smartphone when you pass by. This is the same approach used by activity sensors worn on your arm – data is captured, then delivered to your phone at intervals.

So there are possible solutions even for those areas with currently poor or non-existent coverage – but many of these require some skills, choices or trade-offs, and often a higher investment cost, and this makes adoption of technology more challenging. Companies developing new products can’t cater for every possible network technology, and may end up integrating just a subset of connectivity options.

Our team at Rezare Systems often discusses these challenges and the evolving set of solutions, because these will enable some of the software solutions we “brain-storm”. We’re not telecommunication providers or hardware developers, but we are very interested in how connectivity influences adoption of new technology and our ability to capture and leverage data.

Have you had experience with connectivity issues in rural applications? Can you share advice for farmers and rural professionals about some of the suggestions above and their alternatives? Let us know your thoughts.

What do consumers know about your supply chain?

Consumers. A jaded and cynical bunch. I include myself in that statement.

Just last weekend, a lovely salesperson was extolling the praises of a new smoothie product (“would you like to try it sir, it’s packed with fruit”), while I was remembering comments from my children about the level of sugar in smoothies and trying to see what was on the ingredients panel.

Studies by the Hartman Group would suggest that consumers are interested in more than just what a product’s packaging looks like, instead wanting to know:

  • What ingredients are in the food or beverage product (64%);
  • How a company treats animals used in its products (44%); and
  • From where a company sources its ingredients (43%).

Of course, that’s not to say we are always completely logical and analytical. When I buy Bella Pane bread at our local farmers’ market, I don’t ask to see the ingredients list, or the best-before date, or ask when it was made. Probably Mike has already told me he got up at 3am to bake the day’s bread, but even if he hasn’t done so, I gain a level of confidence and trust from his local proximity, previous discussions, and the farmers’ market brand story.

That level of trust and confidence in product quality, source, and ingredients is what supports positioning of premium food products. A large North American corporate recently discovered that promising “Food with Integrity” was only a start, and those promises needed to be backed with processes and checks to maintain confidence in their products.

I’ve spent a while recently considering how the information we collect on farm can support the broader story about premium protein products. The Hartman Group research would tell us that consumers in the US are interested in:

  • Hormone free (52%);
  • Free of antibiotics (49%);
  • Free of artificial ingredients (48%);
  • GMO-free (41%); and
  • Organic (31%).

When it comes to animal welfare, consumers want to know that companies avoid inhumane treatment of animals – and while they may not know the details of what that means, the proportion of people who care is rising:

  • Other animals are not harmed in capture/raising (e.g. bycatch) (68%);
  • Animals are raised in as natural an environment as possible (65%);
  • Animals are not used for product safety testing (65%);
  • Animals are not given hormones or antibiotics (63%);
  • Company supports animal welfare causes/organisations (51%);
  • No animals at all used in products (45%); and
  • Animals fed only organic food (33%).

We know that products and processes that meet these criteria – and more importantly, have a compelling story in these areas – may command a premium in the market, and are in a position to build stronger, more defensible brands.

Consumers expect products and brands to live up to the brand story they are told. When lack of integrity in process or supply chain is exposed, consumers act angrily, as though we have been “tricked” (read Seth Godin’s “All Marketers are Liars” to learn more of how this works).

For that reason, any claims we make about our agricultural products having green origins or being “very pure indeed” need to be backed up by guides, processes and records that demonstrate our commitment to those brand values. Claims of greenness or purity are potentially for naught if we don’t have both safeguards and evidence in place.

Hence the importance of Farm Assurance or Good Agricultural Practice programmes, and the need for audits and for simple-to-use, on-farm record keeping tools that back up the story. We’re working on some of the latter with our partners. It’s hard work, because farmers are busy people with limited finances. In order for supply programmes to really deliver the benefits promised by the brand, I think we need to do two key things:

Link the activities to the brand story

Make sure everyone who has a role in the supply chain understands how their role contributes to the brand and to the consumer experience. Spell out how actions on farm impact the supply chain: safety, provenance, and in-market claims. Ensure staff know the risks to the business if product integrity fails.

Make it easier to comply than not

Most audit schemes today run on paper – recording pages in a paper book or filling in forms. For practical reasons, these are filled in at the farm office, and often updated just before the auditor arrives. We remove a substantial barrier if it is easy to capture information in the field rather than spending evenings in the office. Reusing information captured for farm assurance records to provide insights for farm management aligns goals and makes adoption more likely.

Your thoughts?

Consumer expectations have been changing over the last decade. Our supply chains and production systems are evolving to meet those expectations. This will require a greater commitment from us all to transparency and integrity, making sure what we do lines up with what we claim.

Do you manage a supply programme, or participate as a farmer, grower or processor? We’re interested in your thoughts. Drop me a note in the comments, or contact me directly.
